CN103251455B - Auxiliary image display and manipulation on a computer display in a medical robotic system
- Publication number: CN103251455B
- Application number: CN201310052673.7A
- Authority: CN (China)
- Prior art keywords: display, auxiliary images, anatomical structure, image, processor
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B34/37—Master-slave robots
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
- A61B1/044—Endoscopes combined with photographic or television appliances, for absorption imaging
- A61B1/313—Endoscopes for introduction through surgical openings, e.g. laparoscopes
- A61B18/12—Transferring non-mechanical energy to or from the body by heating, by passing a current through the tissue to be heated, e.g. high-frequency current
- A61B18/14—Probes or electrodes therefor
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/71—Manipulators operated by drive cable mechanisms
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
- A61B5/742—Details of notification to user, using visual displays
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/016—Input arrangements with force or tactile feedback as computer-generated output to the user
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0481—GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment
- G06F3/04817—GUI interaction techniques using icons
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04845—GUI techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0486—Drag-and-drop
- A61B18/1482—Probes or electrodes having a long rigid shaft for accessing the inner body transcutaneously in minimally invasive surgery, e.g. laparoscopy
- A61B2018/00577—Ablation
- A61B2018/00595—Cauterization
- A61B2018/00982—Combined with or comprising means for visual or photographic inspection inside the body, e.g. endoscopes
- A61B2018/00994—Combining two or more different kinds of non-mechanical energy, or combining one or more non-mechanical energies with ultrasound
- A61B2090/101—Stereotaxic radiosurgery
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/374—NMR or MRI
- A61B2090/378—Surgical systems with images on a monitor during operation, using ultrasound
- A61B2090/3782—Ultrasound transmitter or receiver in a catheter or minimally invasive instrument
- A61B34/30—Surgical robots
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B90/37—Surgical systems with images on a monitor during operation
- A61N7/022—Localised ultrasound hyperthermia, intracavitary
- G06F2203/014—Force feedback applied to GUI
- G06F2203/04804—Transparency, e.g. transparent or translucent windows
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Abstract
To assist a surgeon performing a medical procedure, auxiliary images showing interior details of the anatomical structure being treated are displayed on a computer display, where they can be manipulated by the surgeon to supplement a master image showing an external view of the anatomical structure. A master input device that controls a robotic arm in a first mode can be switched by the surgeon into a second mode, in which it serves instead as a mouse-like pointing device, to assist the surgeon in displaying and manipulating this auxiliary information.
Description
This application is a divisional of Chinese patent application No. 200680038944.1, filed October 19, 2006 and entitled "Auxiliary image display and manipulation on a computer display in a medical robotic system".
Cross-reference to related applications
This application claims priority to U.S. Provisional Application Ser. No. 60/728,450, filed October 20, 2005, which is incorporated herein by reference.
Technical field
The present invention relates generally to medical robotic systems, and more particularly to the display and manipulation of auxiliary images on a computer display screen of a medical robotic system.
Background art
Medical robotic systems, such as those used to perform minimally invasive surgical procedures, offer many benefits over traditional open-surgery techniques, including less pain, shorter hospital stays, a quicker return to normal activities, minimal scarring, reduced recovery time, and less damage to tissue. Consequently, demand for minimally invasive surgery performed with medical robotic systems is strong and growing.
One example of a medical robotic system is the da Vinci® Surgical System from Intuitive Surgical, Inc. of Sunnyvale, California. The da Vinci® system includes a surgeon's console, a patient-side cart, a high-performance three-dimensional display system, and Intuitive Surgical's proprietary EndoWrist™ articulating instruments. These instruments are modeled after the human wrist, so that when their motion is added to that of the robotic arm assembly holding the instrument, they permit at least a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery.
The da Vinci® surgeon's console has a high-resolution stereoscopic video display with two progressive-scan cathode ray tubes (CRTs). The system offers higher fidelity than polarization, shutter-glasses, or other stereo display techniques. Each eye views a separate CRT, presenting the left-eye or right-eye perspective through an objective lens and a series of mirrors. Throughout the operation the surgeon sits comfortably viewing this display, making it an ideal place for the surgeon to display and manipulate three-dimensional images of the surgical site.
In addition to the master image shown on the display screen, it is often desirable for the surgeon to be able to view auxiliary information simultaneously, to gain a better understanding of, or assistance with, the medical procedure being performed. The auxiliary information may be provided in various forms, such as text, bar graphs, two-dimensional picture-in-picture images, and two- or three-dimensional images that are registered with, and overlaid on, their corresponding master images.
Auxiliary images may be captured before or during surgery, using techniques such as ultrasonography, magnetic resonance imaging, computed axial tomography, and fluoroscopy, to provide interior details of the anatomical structure being treated. This information can then be used to supplement an external view of the anatomical structure, such as the view captured by a locally placed camera.
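Supplementing an external camera view with an interior (e.g. ultrasound) image requires mapping auxiliary-image coordinates into the camera's frame. A minimal sketch of such a mapping, using a rigid homogeneous transform, is shown below; the function names and the rotation-about-z model are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def rigid_transform(rotation_deg_z, translation):
    """Build a 4x4 homogeneous transform: rotate about z, then translate."""
    t = np.radians(rotation_deg_z)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(t), -np.sin(t)],
                 [np.sin(t),  np.cos(t)]]
    T[:3, 3] = translation
    return T

def register_points(points, T):
    """Map Nx3 auxiliary-image points into the camera (master) frame."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]

# Example: an auxiliary-image point rotated 90 degrees about z,
# then shifted by (10, 0, 5) into the camera frame.
T = rigid_transform(90.0, [10.0, 0.0, 5.0])
mapped = register_points(np.array([[1.0, 0.0, 0.0]]), T)
# (1,0,0) rotates to (0,1,0), then translates to (10,1,5)
```

In a real system the transform would come from calibrated kinematics or image-based registration rather than being specified by hand.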
Although there are many sources of auxiliary information and many ways to display it, improvements in the display and manipulation of auxiliary images remain useful for helping surgeons perform medical procedures better with medical robotic systems.
Summary of the invention
Accordingly, one object of aspects of the present invention is a method for displaying auxiliary information, including the effects of a therapeutic procedure, as an overlay on, or otherwise in association with, an image of the anatomical structure currently being treated by that procedure.
Another object of aspects of the present invention is a method for displaying a user-selected portion of a volume rendering of an auxiliary image of an anatomical structure, at a user-specified magnification factor, as an overlay registered with a master image of the anatomical structure on a computer display.
Another object of aspects of the present invention is a medical robotic system having a master input device that can be used to manually register images in the three-dimensional space of a computer display.
Another object of aspects of the present invention is a medical robotic system having a master input device that can be used to define a cross-section of a volume rendering of an anatomical structure in the three-dimensional space of a computer display.
Another object of aspects of the present invention is a medical robotic system having a master input device that can be used to selectively modify portions or details of a volume rendering of an anatomical structure in the three-dimensional space of a computer display.
Another object of aspects of the present invention is a medical robotic system having a master input device that can be used to change display parameters of a volume rendering of an anatomical structure shown on a computer display.
Another object of aspects of the present invention is a medical robotic system having a master input device that is switchable between an image-capture mode, in which the master input device controls the movement of an image-capture device, and an image-manipulation mode, in which the master input device controls the display and manipulation, on a computer display, of images captured by the image-capture device.
These and other objects are achieved by the various aspects of the present invention. Briefly, one aspect is a method of displaying, on a computer display, the effect of a therapeutic procedure applied to an anatomical structure by a therapeutic instrument, the method comprising: generating an auxiliary image indicating the effect of the therapeutic procedure being applied to the anatomical structure by the therapeutic instrument; and, during the therapeutic procedure, displaying on the computer display a master image of the anatomical structure overlaid by the auxiliary image.
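The overlay step in this aspect amounts to compositing the auxiliary "treatment effect" pixels onto the master image. A toy illustration using alpha blending over a treated-tissue mask follows; the blending scheme and all names are our own assumptions, not the patented method:

```python
import numpy as np

def blend_overlay(master, overlay, mask, alpha=0.5):
    """Alpha-blend auxiliary 'treatment effect' pixels onto the master
    image, only where the boolean mask marks treated tissue."""
    out = master.astype(float).copy()
    out[mask] = (1.0 - alpha) * out[mask] + alpha * overlay[mask].astype(float)
    return out

# A 2x2 master image, a uniform overlay, and one treated pixel.
master = np.full((2, 2), 100.0)
overlay = np.full((2, 2), 200.0)
mask = np.zeros((2, 2), dtype=bool)
mask[0, 0] = True
blended = blend_overlay(master, overlay, mask)
# blended[0, 0] is 150.0 (half master, half overlay); the rest stays 100.0
```

Restricting the blend to a mask keeps untreated tissue in the master image untouched, which matches the idea of showing only the procedure's effect as an overlay.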
Another aspect is a method for displaying a selected portion of an auxiliary image of an anatomical structure as an overlay on a master image of the anatomical structure on a computer display, the method comprising: associating a movable window with a pointing device, so that the movable window can be positioned on the computer display using the pointing device; registering the auxiliary image of the anatomical structure with the master image of the anatomical structure, so that they share the same position and orientation in a common coordinate system; and displaying the master image on the computer display while displaying, as an overlay within the movable window, the portion of the registered auxiliary image whose coordinates correspond to those of the movable window.
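The movable-window overlay can be mimicked, in a highly simplified 2D form, by copying the co-located region of the registered auxiliary image into the window's footprint. This is an illustrative sketch under assumed names, not the patented implementation:

```python
import numpy as np

def lens_window(master, aligned_aux, top_left, size):
    """Show the co-located region of the registered auxiliary image
    inside a movable window over the master image (a 'see-through' lens)."""
    y, x = top_left
    h, w = size
    out = master.copy()
    # Because the two images are registered, the same slice indexes
    # the same anatomy in both.
    out[y:y + h, x:x + w] = aligned_aux[y:y + h, x:x + w]
    return out

# A 4x4 master image of zeros; the auxiliary image is all ones.
master = np.zeros((4, 4), dtype=int)
aux = np.ones((4, 4), dtype=int)
composite = lens_window(master, aux, top_left=(1, 1), size=(2, 2))
# Only the 2x2 window at (1, 1) shows auxiliary pixels
```

As the pointing device moves the window, only `top_left` changes; the registration itself is computed once, which is what makes the window cheap to drag around.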
Another aspect is a medical robotic system comprising: an image-capture device for capturing images; a robotic arm holding the image-capture device; a computer display; a master input device adapted to be manipulated by a user in multiple degrees of freedom; and a processor configured to control movement of the image-capture device according to user manipulation of the master input device when the master input device is in an image-capture mode, and to control the display, on the computer display, of images derived from the captured images according to user manipulation of the master input device when the master input device is in an image-manipulation mode.
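The two-mode routing of the master input device can be caricatured as a small state machine that sends the same hand motion to different consumers depending on the current mode. The class and method names below are illustrative assumptions, not the patent's design:

```python
class MasterInputRouter:
    """Route master-input motion either to the camera-holding robotic arm
    (image-capture mode) or to on-screen image manipulation."""

    def __init__(self):
        self.mode = "image_capture"

    def toggle_mode(self):
        """Switch between the two modes, e.g. on a foot-pedal press."""
        self.mode = ("image_manipulate"
                     if self.mode == "image_capture" else "image_capture")

    def handle_motion(self, delta):
        """Dispatch one motion increment to the consumer for the current mode."""
        if self.mode == "image_capture":
            return ("move_capture_device", delta)
        return ("manipulate_image", delta)

router = MasterInputRouter()
first = router.handle_motion((1, 0, 0))   # goes to the robotic arm
router.toggle_mode()
second = router.handle_motion((1, 0, 0))  # now manipulates the displayed image
```

In the actual system the processor would also clutch the arm (freeze it) while in image-manipulation mode, so that switching modes never produces unintended instrument motion; that safety behavior is omitted here for brevity.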
Additional objects, features, and advantages of the various aspects of the present invention will become apparent from the following description of the preferred embodiments, taken in conjunction with the accompanying drawings.
Brief Description of the Drawings
Fig. 1 illustrates a top view of an operating room employing a medical robotic system utilizing aspects of the present invention.
Fig. 2 illustrates a block diagram of a medical robotic system utilizing aspects of the present invention.
Fig. 3 illustrates a laparoscopic ultrasound probe usable in a medical robotic system utilizing aspects of the present invention.
Fig. 4 illustrates a flow diagram of a method, utilizing aspects of the present invention, for displaying on a computer display the effect of a treatment procedure being applied to an anatomical structure by a therapeutic instrument.
Fig. 5 illustrates an external view of an anatomical structure into which a therapeutic instrument has been inserted for performing a treatment procedure.
Fig. 6 illustrates an internal view of an anatomical structure in which a discernible treatment effect is shown as captured by a treatment sensing device.
Fig. 7 illustrates a computer display screen displaying the effect of a treatment procedure on the anatomical structure being treated, as generated by a method utilizing aspects of the present invention.
Fig. 8 illustrates a flow diagram of a method, utilizing aspects of the present invention, for displaying a selected portion of an auxiliary image of an anatomical structure in a user-movable magnifying glass on a computer display.
Fig. 9 illustrates a flow diagram of a method, utilizing aspects of the present invention, for displaying an internal view of an anatomical structure in a manipulatable window at a specified magnification factor.
Fig. 10 illustrates an auxiliary image of an anatomical structure and concentric regions of the auxiliary image corresponding to different magnification factors, for display in a magnifying glass on a computer display by a method utilizing aspects of the present invention.
Fig. 11 illustrates a computer display screen on which a primary image of an anatomical structure and an overlaid portion of an auxiliary image of the anatomical structure are viewed in a magnifying lens, as displayed by a method utilizing aspects of the present invention.
Fig. 12 illustrates a flow diagram of a method, performed by a processor in a medical robotic system, for manipulating objects displayed on a computer display utilizing aspects of the present invention.
Detailed Description of the Invention
Fig. 1 illustrates, as an example, a top view of an operating room employing a medical robotic system. The medical robotic system in this case is a Minimally Invasive Robotic Surgery (MIRS) system 100, which includes a console (C) used by a surgeon (S) while performing a minimally invasive diagnostic or surgical procedure, with the assistance of one or more assistants (A), on a patient (P) lying on an operating table (O).
The console includes a master display 104 (also referred to herein as a "display screen" or "computer display") for displaying one or more images of the surgical site within the patient to the surgeon, as well as possibly other information. Also included are master input devices 107, 108 (also referred to herein as "master manipulators"), one or more foot pedals 105, 106, a microphone 103 for receiving voice commands from the surgeon, and a processor 102. The master input devices 107, 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or grippers. The processor 102 is preferably a personal computer that may be integrated into the console or otherwise connected to it in a conventional manner.
The surgeon performs a medical procedure using the MIRS system 100 by manipulating the master input devices 107, 108 so that the processor 102 causes their respectively associated slave arms 121, 122 to correspondingly manipulate their respectively attached and removably held surgical instruments 138, 139 (also referred to herein as "tools"), while the surgeon views three-dimensional (3D) images of the surgical site on the master display 104.
The tools 138, 139 are preferably Intuitive Surgical Inc.'s proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that, when combined with the motion of the robotic arm holding the tool, they allow a full six degrees of freedom of motion comparable to the natural motions of open surgery. Additional details on such tools may be found in commonly owned U.S. Patent No. 5,797,900, entitled "Wrist Mechanism for Surgical Instrument for Performing Minimally Invasive Surgery with Enhanced Dexterity and Sensitivity," which is incorporated herein by reference. At the operating end of each of the tools 138, 139 is a manipulatable end effector such as a clamp, grasper, scissors, stapler, blade, needle, needle holder, or energizable probe.
The master display 104 has a high-resolution stereoscopic video display with two progressive-scan cathode ray tubes (CRTs). The system offers higher fidelity than polarization, shutter-glasses, or other stereo display techniques. Each eye views a separate CRT presenting the left or right eye perspective through an objective lens and a series of mirrors. The surgeon sits comfortably and looks into this display throughout the procedure, making it an ideal location for displaying and manipulating 3D images of the operation to the surgeon.
A stereoscopic endoscope 140 provides right and left camera views to the processor 102 so that it may process the information according to programmed instructions and cause the resulting images to be displayed on the master display 104. A laparoscopic ultrasound (LUS) probe 150 provides two-dimensional (2D) ultrasound image slices of an anatomical structure to the processor 102 so that the processor 102 may generate a 3D ultrasound computer model or volume rendering of the anatomical structure.
Each of the tools 138, 139, as well as the endoscope 140 and the LUS probe 150, is preferably inserted into the patient through a cannula or trocar (not shown) or other tool guide so as to extend to the surgical site through a corresponding minimally invasive incision, such as incision 161. Each of the slave arms 121-124 includes a slave manipulator and a setup arm. The slave manipulators are robotically moved using motor-controlled joints (also referred to as "active joints") in order to manipulate and/or move their respectively held tools. The setup arms may be manually manipulated by releasing normally braked joints (also referred to as "setup joints") to horizontally and vertically position the slave arms 121-124 so that their respective tools may be inserted into the cannulas.
The number of surgical tools used at one time, and consequently the number of slave arms in the system 100, will generally depend on the medical procedure to be performed and space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, an assistant may remove the tool no longer being used from its slave arm and replace it with another tool, such as tool 131, from a tray (T) in the operating room.
Preferably, the master display 104 is positioned near the surgeon's hands so that it displays a projected image oriented such that the surgeon feels that he or she is actually looking directly down onto the surgical site. To that end, an image of the tools 138, 139 preferably appears to be located substantially where the surgeon's hands are, even though the point of observation (i.e., that of the endoscope 140 or the LUS probe 150) may not be the point of view of the displayed image.
In addition, the real-time image is preferably projected into a perspective image such that the surgeon can manipulate an end effector of a tool, 138 or 139, through its associated master input device 107 or 108, as if viewing the workspace in substantially true presence. By true presence, it is meant that the presentation of the image is a true perspective image simulating the viewpoint of an operator physically manipulating the tools. Thus, the processor 102 transforms the coordinates of the tools into a perceived position so that the perspective image is the image one would see if the endoscope 140 were looking directly at the tools from a surgeon's eye level during an open-cavity procedure.
The processor 102 performs various functions in the system 100. One important function it performs is to translate and transfer the mechanical motion of the master input devices 107 and 108, via control signals over bus 110, to their associated slave arms 121 and 122, so that the surgeon can effectively manipulate their respective tools 138 and 139. Another important function is to implement the various methods described herein in reference to Figs. 4-12.
Although described as a processor, it is to be understood that the processor 102 may in practice be implemented by any combination of hardware, software, and firmware. Likewise, its functions as described herein may be performed by one unit or divided up among different components, each of which may in turn be implemented by any combination of hardware, software, and firmware. When divided up among components, the components may be centralized in one location or distributed across the system 100 for distributed-processing purposes.
Prior to performing a medical procedure, the ultrasound images captured by the LUS probe 150, the right and left 2D camera images captured by the stereoscopic endoscope 140, and the end-effector positions and orientations determined using the kinematics of the slave arms 121-124 and their sensed joint positions, are calibrated and registered with each other.
The slave arms 123 and 124 may manipulate the endoscope 140 and the LUS probe 150 in the same manner as the slave arms 121 and 122 manipulate the tools 138 and 139. When there are only two master input devices in the system, however, such as the master input devices 107 and 108 in the system 100, in order for the surgeon to manually control movement of either the endoscope 140 or the LUS probe 150, the one to be manually controlled must be temporarily associated with one of the master input devices 107 and 108, while the tool previously associated with that master input device and its slave manipulator are locked in place.
Although not shown in this example, other sources of primary and auxiliary images of the anatomical structure may be included in the system 100, such as those commonly used for capturing ultrasound, magnetic resonance, computed axial tomography, and fluoroscopic images. Each of these imaging sources may be used pre-operatively, and intra-operatively where appropriate and practical.
Fig. 2 illustrates, as an example, a block diagram of the system 100. In this system there are two master input devices 107 and 108. The master input device 107 controls movement of either the tool 138 or the stereoscopic endoscope 140, depending on which mode its control switch mechanism 211 is in, and the master input device 108 controls movement of either the tool 139 or the LUS probe 150, depending on which mode its control switch mechanism 231 is in.
The control switch mechanisms 211 and 231 may be placed in either the first or the second mode by the surgeon using, for example: voice commands; switches physically placed on or near the master input devices 107 and 108; the foot pedals 105 and 106 on the console; or surgeon selection of appropriate icons or other graphical user interface selection means displayed on the master display 104 or an auxiliary display (not shown).
When the control switch mechanism 211 is placed in the first mode, it causes the master controller 202 to communicate with the slave controller 203 so that manipulation of the master input 107 by the surgeon results in corresponding movement of the tool 138 by the slave arm 121, while the endoscope 140 is locked in position. On the other hand, when the control switch mechanism 211 is placed in the second mode, it causes the master controller 202 to communicate with the slave controller 233 so that manipulation of the master input 107 by the surgeon results in corresponding movement of the endoscope 140 by the slave arm 123, while the tool 138 is locked in position.
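The mode-dependent routing performed by such a control switch mechanism can be sketched as a small dispatch model. This is purely an illustration of the idea, not the system's actual control software; the class, its fields, and the motion units are invented for the sketch:

```python
class ControlSwitch:
    """Toy model of a control switch mechanism (e.g., 211): mode 1 routes
    a master input device's motion to the tool slave arm, mode 2 routes
    it to the image capturing device's arm, locking whichever arm is not
    being driven."""

    def __init__(self, tool_arm, imaging_arm):
        self.tool_arm = tool_arm        # e.g., slave arm 121 holding tool 138
        self.imaging_arm = imaging_arm  # e.g., slave arm 123 holding endoscope 140
        self.mode = 1                   # first (normal) mode by default

    def route(self, master_motion):
        """Forward a master-input motion to the active arm; lock the other."""
        if self.mode == 1:
            self.imaging_arm["locked"] = True
            self.tool_arm["position"] += master_motion
        else:
            self.tool_arm["locked"] = True
            self.imaging_arm["position"] += master_motion

tool_arm = {"position": 0.0, "locked": False}
imaging_arm = {"position": 0.0, "locked": False}
switch = ControlSwitch(tool_arm, imaging_arm)

switch.route(2.0)   # mode 1: moves the tool arm, imaging arm locked
switch.mode = 2
switch.route(5.0)   # mode 2: moves the imaging arm, tool arm locked
```

The same dispatch shape applies to the control switch mechanism 231, with the LUS probe's arm in place of the endoscope's.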
Similarly, when the control switch mechanism 231 is placed in the first mode, it causes the master controller 222 to communicate with the slave controller 223 so that manipulation of the master input 108 by the surgeon results in corresponding movement of the tool 139 by the slave arm 122. In this case, however, the LUS probe 150 need not be locked in position; its movement may instead be guided by an auxiliary controller 242 according to instructions stored in a memory 240. The auxiliary controller 242 also provides haptic feedback to the surgeon through the master input 108 reflecting readings of a LUS probe force sensor 247. On the other hand, when the control switch mechanism 231 is placed in the second mode, it causes the master controller 222 to communicate with the slave controller 243 so that manipulation of the master input 108 by the surgeon results in corresponding movement of the LUS probe 150 by the slave arm 124, while the tool 139 is locked in position.
When a control switch mechanism is switched back to its first or normal mode, its associated master input device is preferably repositioned where it was before the switch. Alternatively, the master input device may remain in its current position while the kinematic relationship between the master input device and its associated tool slave arm is readjusted, so that no abrupt movement of the tool occurs once the control switch mechanism switches back to its first or normal mode. For additional details on control switching, see, for example, commonly owned U.S. Patent No. 6,659,939, entitled "Cooperative Minimally Invasive Telesurgical System," which is incorporated herein by reference.
A third control switch mechanism 241 is provided to allow its user to switch between an image capturing mode and an image manipulating mode while the control switch mechanism 231 is in its second mode (i.e., with the master input device 108 associated with the LUS probe 150). In its first or normal mode (i.e., the image capturing mode), the LUS probe 150 is normally controlled by the master input device 108 as described above. In its second mode (i.e., the image manipulating mode), the LUS probe 150 is no longer controlled by the master input device 108, freeing the master input device 108 to perform other tasks such as displaying and manipulating auxiliary images on the display screen, and in particular performing certain user-specified functions described herein. Note, however, that although the LUS probe 150 is not controllable by the master input device 108 while the control switch mechanism 241 is in its second mode, it may still be automatically rocked or otherwise moved under the control of the auxiliary controller 242 according to instructions stored in the memory 240, so that a 3D volume rendering of a proximate anatomical structure may be generated from a sequence of 2D ultrasound image slices captured by the LUS probe 150. For additional details on this and other programmed movements of the LUS probe 150, see commonly owned U.S. Patent Application No. 11/447,668, entitled "Laparoscopic Ultrasound Robotic Surgical System," filed June 6, 2006, which is incorporated herein by reference.
The auxiliary controller 242 also performs other functions related to the LUS probe 150 and the endoscope 140. It receives output from the LUS probe force sensor 247, which senses forces being exerted against the LUS probe 150, and feeds the force information back to the master input device 108 through the master controller 222, so that the surgeon may feel those forces even while not directly controlling movement of the LUS probe 150 at the time. Thus, potential injury to the patient is minimized, since the surgeon has the capability to immediately stop any movement of the LUS probe 150 as well as to take over manual control of its movement.
Another key function of the auxiliary controller 242 is to cause processed information from the endoscope 140 and the LUS probe 150 to be displayed on the master display 104 according to user-selected display options. Examples of such processing include: generating a 3D ultrasound image from the 2D ultrasound image slices received from the LUS probe 150 via an ultrasound processor 246; causing either a 3D or 2D ultrasound image corresponding to a selected position and orientation to be displayed in a picture-in-picture window of the master display 104; causing a 3D or 2D ultrasound image of the anatomical structure to overlay the camera-captured image of the anatomical structure being displayed on the master display 104; and performing the methods described below in reference to Figs. 4-12.
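The first of those processing steps — assembling a 3D ultrasound image from 2D slices — can be sketched under the simplifying assumption of parallel, evenly spaced slices (in the actual system each slice would be placed in the volume using the tracked probe pose; the function names here are invented for the sketch):

```python
def stack_slices(slices):
    """Stack parallel 2D ultrasound slices (each a list of rows of
    intensities) into a 3D volume indexed as volume[z][y][x]. Assumes all
    slices share the same in-plane dimensions and a uniform spacing."""
    rows, cols = len(slices[0]), len(slices[0][0])
    for s in slices:
        assert len(s) == rows and all(len(r) == cols for r in s)
    return [[row[:] for row in s] for s in slices]  # copy slices into the volume

def sample(volume, z, y, x):
    """Nearest-neighbour lookup of an intensity in the stacked volume."""
    return volume[round(z)][round(y)][round(x)]

# Three tiny 2x2 slices stand in for a sweep of ultrasound captures.
slices = [
    [[0, 0], [0, 0]],
    [[1, 2], [3, 4]],
    [[5, 6], [7, 8]],
]
vol = stack_slices(slices)
print(sample(vol, 1.2, 0, 1))  # -> 2 (nearest slice z=1, row 0, col 1)
```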
Although shown as separate entities, the master controllers 202 and 222, the slave controllers 203, 233, 223, and 243, and the auxiliary controller 242 are preferably implemented as software modules executed by the processor 102, as are certain mode-switching aspects of the control switch mechanisms 211, 231, and 241. The ultrasound processor 246 and a video processor 236, on the other hand, may be software modules, or separate boards or cards inserted into appropriate slots coupled to or otherwise integrated with the processor 102, that convert signals received from these image capturing devices into signals suitable for display on the master display 104 and/or for additional processing by the auxiliary controller 242 before being displayed on the master display 104.
Although this example assumes that each master input device is shared by only one pre-assigned tool slave robotic arm and one pre-assigned image capturing device robotic arm, alternative arrangements are also feasible and envisioned within the full scope of the present invention. For example, a different arrangement in which each of the master input devices may be selectively associated with any one of the tool and image capturing device robotic arms is also possible, and perhaps even preferable for maximum flexibility. Also, although the endoscope robotic arm is shown in this example as controlled by a single master input device, it may instead be controlled using both master input devices to give the sensation of "grabbing the image" and moving it to a different position or viewing angle. Further, although only an endoscope and a LUS probe are shown in this example, other image capturing devices, such as those used for capturing camera, ultrasound, magnetic resonance, computed axial tomography, and fluoroscopic images, are also fully contemplated within the system 100, although each of these need not be manipulatable by a master input device.
Fig. 3 illustrates a side view of one embodiment of the LUS probe 150. The LUS probe 150 is a dexterous tool preferably having two distal degrees of freedom. Opposing ends of drive rods or cables (not shown) are physically attached to a proximal end of a LUS sensor 301 and extend through an internal passage of an elongated shaft 312 to mechanically control pitch and yaw movement of the LUS sensor 301 using conventional push-pull type action.
The LUS sensor 301 captures 2D ultrasound slices of a proximate anatomical structure and transmits the information back to the processor 102 via a LUS cable 304. Although shown as extending outside of the elongated shaft 312, the LUS cable 304 may also extend within it. A clamshell sheath 321 encloses the elongated shaft 312 and the LUS cable 304 to provide a good seal passing through a cannula 331 (or trocar). Fiducial marks 302 and 322 are placed on the LUS sensor 301 and the sheath 321 for video-tracking purposes.
Fig. 4 illustrates, as an example, a flow diagram of a method for displaying the effect of a treatment procedure or therapy on a display screen. In 401, a primary image of an anatomical structure is captured by an image capturing device. As an example, Fig. 5 shows a primary image captured by the endoscope 140, which includes an anatomical structure 501 and a therapeutic instrument 511 partially inserted into the anatomical structure 501 for performing a treatment procedure at a treatment site within it. In other applications, the therapeutic instrument 511 may only need to touch, or merely come close to, the anatomical structure 501 in order to perform the treatment procedure.
The primary image may be captured either before or during the treatment procedure. A primary image captured before the procedure is referred to as a "pre-operative" image, and one captured during the procedure is referred to as an "intra-operative" image. When the primary image is a pre-operative image, it is generally not updated during the procedure, so the method typically uses only one primary image in that case. On the other hand, when the primary image is an intra-operative image, it is preferably updated during the procedure, so the method may use several primary images in that case.
Pre-operative images are typically captured using techniques such as ultrasonography, magnetic resonance imaging (MRI), or computed axial tomography (CAT). Intra-operative images may be captured at the surgical or treatment site by image capturing devices such as the stereoscopic endoscope 140 or the LUS probe 150, or they may be captured externally using techniques such as those used to capture the pre-operative images.
In 402 of Fig. 4, the therapeutic instrument is turned on, or otherwise activated or energized, so that it can apply its therapy to the anatomical structure within the patient. The instrument typically has a tip for applying therapeutic energy to abnormal tissue, such as diseased or damaged tissue. As an example of such a treatment procedure, radio frequency ablation (RFA) applies heat to a diseased tissue site, such as a tumor in an anatomical structure (e.g., the liver), using an RFA probe in order to destroy the diseased tissue. Examples of other procedures include high intensity focused ultrasound (HIFU) ablation. The therapeutic instrument may be attached as one of the tools 138 and 139 on the slave arms 121 and 122 so that the surgeon may move it to the treatment site and manipulate it there through the master/slave control system.
In 403, an auxiliary image indicating the effect of the treatment procedure on the anatomical structure is generated. The auxiliary image may be an actual image of the anatomical structure, provided by or derived from information captured by a sensing device capable of sensing the effect of the treatment procedure. Alternatively, the auxiliary image may be a computer model of the treatment effect, generated using an empirically derived or otherwise conventionally determined formula for that effect. In the latter case, the computer model is typically a volumetric shape determined from factors such as the geometry of the therapeutic instrument's tip, the heat or energy level being applied by the tip to the anatomical structure, and characteristics of the tissue surrounding the treatment site that is undergoing the procedure.
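One way such a volumetric computer model might be parameterized can be sketched as a sphere around the instrument tip whose size grows with delivered energy, scaled by a tissue-dependent factor. The formula and constants below are entirely illustrative stand-ins, not an empirically validated ablation model:

```python
import math

def ablation_radius(power_w, duration_s, tissue_factor=1.0):
    """Illustrative ablation-effect model: treat the necrotic region as a
    sphere whose volume is proportional to delivered energy (power x time),
    scaled by a tissue-dependent susceptibility factor."""
    energy_j = power_w * duration_s
    k = 1e-3 * tissue_factor      # cm^3 of necrosis per joule (made-up constant)
    volume_cm3 = k * energy_j
    # invert V = (4/3) * pi * r^3 for the sphere radius in cm
    return (3.0 * volume_cm3 / (4.0 * math.pi)) ** (1.0 / 3.0)

# The modeled lesion grows monotonically while the instrument stays energized:
r1 = ablation_radius(25.0, 30.0)    # 25 W applied for 30 s
r2 = ablation_radius(25.0, 120.0)   # same power, four times as long
print(round(r1, 3), round(r2, 3))
```

Re-evaluating such a model on each pass through 403 yields the growing volumetric shape to be overlaid in 405.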
As an example of an auxiliary image provided by or derived from information captured by a sensing device, Fig. 6 shows a 3D ultrasound image of an anatomical structure 601, conventionally derived from 2D ultrasound slices captured by the LUS probe 150. In this example, a depicted ablation volume 621 represents the effect of a treatment procedure in which a tip 613 of an RFA probe 612 is applied to a tumor site in the anatomical structure 601. In this case, growth of the ablation volume is observable because of the changing properties of the necrotic tissue that result from the heating of the tissue surrounding the tumor site.
In 404, the primary and auxiliary images are registered so as to have the same scale and to refer to the same position and orientation in a common reference frame. Such registration is well known. As an example, see commonly owned U.S. Patent No. 6,522,906, entitled "Devices and Methods for Presenting and Regulating Auxiliary Information on an Image Display of a Telesurgical System to Assist an Operator in Performing a Surgical Procedure," which is incorporated herein by reference.
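At its core, this registration maps auxiliary-image coordinates into the primary image's reference frame via a scale and a rigid transform. A minimal 2D sketch of applying such a similarity transform (the parameter values below are made up for illustration; a real system would estimate them from calibration and tracking data):

```python
import math

def register_point(p, scale, theta, translation):
    """Map a point from the auxiliary image frame into the primary image
    frame: scale, then rotate by theta (radians), then translate."""
    x, y = p
    c, s = math.cos(theta), math.sin(theta)
    xr = scale * (c * x - s * y) + translation[0]
    yr = scale * (s * x + c * y) + translation[1]
    return (xr, yr)

# Example: the auxiliary image is at half scale and rotated 90 degrees
# relative to the primary frame (illustrative parameters only).
p_aux = (10.0, 0.0)
p_primary = register_point(p_aux, scale=2.0, theta=math.pi / 2, translation=(5.0, 5.0))
print(p_primary)  # -> approximately (5.0, 25.0)
```

Applying the same transform to every point of the auxiliary image places corresponding structures at the same displayed size, position, and orientation as in the primary image.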
In 405, while the treatment procedure is being performed, the primary image is displayed on the display screen with the registered auxiliary image preferably overlaid onto it, so that corresponding structures and objects in the two images appear at the same size and in the same position and orientation on the display screen. In this way, the effect of the treatment procedure is shown as an overlay over the anatomical structure undergoing the procedure.
As an example, Fig. 7 shows an exemplary display screen 104 in which an auxiliary image, denoted for illustrative purposes by dotted lines, is overlaid onto the primary image of Fig. 5. When the auxiliary image is provided by or derived from information captured by a sensing device, the treatment effect 521, the therapeutic instrument 512, and the instrument tip 513 are all provided by or derived from the captured information. On the other hand, when the treatment effect 521 is generated as a computer model of a volumetric shape using an empirically determined formula, the therapeutic instrument 512 and the instrument tip 513 may be determined, at least in part, using conventional tool-tracking computations based on the joint positions of its manipulating slave arm.
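Determining the instrument tip's location from slave-arm joint positions is, at bottom, a forward-kinematics computation. A minimal planar two-link sketch illustrates the idea (the link lengths and joint angles are invented for the example, not those of any actual slave arm):

```python
import math

def tool_tip_position(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate joint angles along the chain
    and sum the link vectors to locate the instrument tip."""
    x = y = 0.0
    total_angle = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        total_angle += angle
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return (x, y)

# Two links of 0.3 m each; both joints at 0 rad put the tip straight out at 0.6 m.
tip = tool_tip_position([0.0, 0.0], [0.3, 0.3])
print(tip)  # -> (0.6, 0.0)
```

A real slave arm has more joints and works in 3D, but the tip pose used to place the modeled treatment effect follows from the sensed joint positions in the same accumulative fashion.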
In 406 of Fig. 4, the method then checks whether the therapeutic instrument has been turned off. If it has, the treatment procedure is considered to be over and the method ends. On the other hand, if the therapeutic instrument is still on, the method assumes the treatment procedure is still being performed and proceeds to 407 to determine whether a new primary image has been captured. If no new primary image has been captured, for example because the primary image is a pre-operative image, the method jumps back to 403 to update the auxiliary image, and it continues to loop through 403-407 until the treatment procedure is determined to be completed by detecting that the therapeutic instrument has been turned off. On the other hand, if a new primary image has been captured, for example because the primary image is an intra-operative image, the method updates the primary image in 408 before jumping back to 403 to update the auxiliary image, and it continues to loop through 403-408 until the treatment procedure is determined to be completed by detecting that the therapeutic instrument has been turned off.
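The decision flow of 403-408 can be sketched as a loop driven by the instrument's on/off state and the availability of new primary images. The callable names and bookkeeping below are invented for the illustration:

```python
def treatment_display_loop(instrument_on, new_primary_available, max_iters=100):
    """Sketch of the Fig. 4 loop. `instrument_on` and `new_primary_available`
    are callables polled once per pass; the loop counts auxiliary- and
    primary-image updates until the instrument is turned off."""
    aux_updates = primary_updates = 0
    for _ in range(max_iters):        # bound the sketch instead of `while True`
        aux_updates += 1              # 403: (re)generate the auxiliary image
        # 405: display primary image with auxiliary overlay (omitted here)
        if not instrument_on():       # 406: instrument off -> procedure done
            break
        if new_primary_available():   # 407: intra-operative image source?
            primary_updates += 1      # 408: refresh the primary image
    return aux_updates, primary_updates

# Simulate an instrument that stays on for three polls, with an
# intra-operative image source that always has a fresh frame.
polls = iter([True, True, True, False])
aux, prim = treatment_display_loop(lambda: next(polls), lambda: True)
print(aux, prim)  # -> 4 3
```

With a pre-operative primary image, the second callable would simply always return False, so only the auxiliary image is refreshed each pass.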
Fig. 8 illustrates, as an example, a flow diagram of a method for displaying, in a window defined as the lens area of a magnifying glass, an auxiliary image of an anatomical structure at a user-specified magnification as a registered overlay to a primary image of the anatomical structure, where the position and orientation of the displayed auxiliary image on the display screen are manipulatable by the user using an associated pointing device.
In 801, the method starts by associating the magnifying glass with a pointing device so that, as the pointing device is moved, the magnifying glass displayed on the display screen (in particular its lens, which may be thought of as a window) moves in a corresponding fashion. The association may be performed in a conventional manner, for example by "grabbing" the magnifying glass using the pointing device, or by having a cursor of the pointing device effectively turn into the magnifying glass. Since the display screen 104 is preferably a 3D display, the pointing device is correspondingly preferably a 3D pointing device with orientation-indicating capability.
In 802, the current primary and auxiliary images are made available for processing. The primary image in this example is captured by the endoscope 140, and the auxiliary image by the LUS probe 150. Other sources for both the primary and auxiliary images, however, are usable and contemplated in practicing the invention, including primary and auxiliary images captured from the same source. As an example of the latter case, a high-resolution camera may capture images at a resolution higher than that used for displaying images on the display screen. In this case, the high-resolution image captured by the camera may be processed as the auxiliary image, and a reduced-resolution version of it displayed on the display screen may be processed as the primary image.
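The single-source case can be sketched by simple decimation: the full-resolution capture serves as the auxiliary image, while a reduced copy serves as the displayed primary image. The crude every-kth-pixel reduction below is a stand-in for proper display scaling, purely for illustration:

```python
def downsample(image, factor):
    """Reduce an image (list of rows of pixel values) by keeping every
    `factor`-th pixel in both dimensions."""
    return [row[::factor] for row in image[::factor]]

# A 4x4 "high-resolution" capture acts as the auxiliary image...
auxiliary = [[r * 4 + c for c in range(4)] for r in range(4)]
# ...and its 2x decimation acts as the primary image shown on screen.
primary = downsample(auxiliary, 2)
print(primary)  # -> [[0, 2], [8, 10]]
```

The auxiliary image thus retains detail that the magnifying-glass overlay can reveal even though the on-screen primary image does not show it.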
In 803, a user-selectable magnification factor is read. The magnification factor may be selected by the user using, for example, a dial or thumbwheel control on the pointing device. Alternatively, it may be selected through a user-selectable item in a menu displayed on the display screen, or through any other conventional mechanism for user selection of a parameter value. If the user makes no selection, a default value, such as a magnification factor of 1.0, may be used.
In 804, the primary and auxiliary images are registered so as to have the same scale and to refer to the same position and orientation in a common reference frame, so that corresponding structures and objects in the two images have the same coordinates.
In 805, the primary image, such as a 3D view of the anatomical structure, is displayed on the display screen, and a portion of a 2D slice of the auxiliary image of the anatomical structure may be displayed as an overlay in the lens of the magnifying glass. The portion of the 2D slice in this case is defined by a window having a central point with the same position and orientation as the central point of the magnifying glass lens, and an area determined by the magnification factor, so that the portion of the 2D slice is enlarged or reduced to fit the lens of the magnifying glass. Since the position and orientation of the magnifying glass may be manipulated by the pointing device to any position in the 3D space of the display screen 104, including positions within the body of the anatomical structure, the 2D slice may correspond to any user-selected depth within the anatomical structure. Unlike a physical magnifying glass, the view is thus not restricted to only the exterior of the anatomical structure. For additional details on 805, see the description below in reference to Fig. 9.
In 806, the method then determines whether the magnifying-glass command has been turned off, for example by the user releasing the "grab" of the magnifying glass image, or otherwise disconnecting the association between the magnifying glass and the pointing device through some conventional switching means. If the association has been turned off, the method ends. If it has not, the method jumps back to 802 and continues to loop through 802-806 until a magnifying-glass-off command is detected. Note that on each pass through 802-806, updated versions of the primary and auxiliary images (if any) are processed using an updated value of the user-selectable magnification factor (if any). Thus, provided the method proceeds through the loop sufficiently fast, the user perceives no significant delay when adjusting the magnification factor, for example by rotating a dial or knob, while viewing the anatomical structure at a selected position and orientation of the magnifying glass.
FIG. 9 illustrates, as an example, a flow chart of a method for displaying an auxiliary image view of the anatomical structure, at a user-specified magnification factor, as an overlay over the primary image view of the anatomical structure within the lens of a user-movable magnifying glass. As previously explained, this method may be used to perform 805 of FIG. 8.
In 901, the current position and orientation of the center point of the magnifying glass lens are determined in the three-dimensional space of the display screen 104. In 902, a two-dimensional slice of the registered volumetric model of the auxiliary image is obtained from the perspective of that position and orientation, and a portion of the slice is obtained as bounded by an auxiliary view window preferably having a center point at the same position and orientation. The area of the auxiliary view window in this case is inversely proportional, according to the current magnification factor of the magnifying glass, to the area of the lens. In 903, the portion of the two-dimensional slice bounded by the auxiliary view window is magnified by the magnification factor so that it fits the lens area of the magnifying glass, and in 904 the master image of the anatomical structure is shown on the display screen with the magnified portion of the two-dimensional slice of the auxiliary image overlaid on the lens area of the magnifying glass shown on the display screen 104.
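One way to read 901-903 numerically: the window's area shrinks in inverse proportion to the magnification factor, and the windowed portion is scaled back up to the lens area. The sketch below illustrates that reading under stated simplifications (a square window approximating the circular lens, nearest-neighbour scaling); all names are assumptions, not the patent's implementation.

```python
import numpy as np

def magnify_slice(slice_2d, center, lens_radius_px, magnification):
    # 902: window radius from the inverse-area relation
    # (area is proportional to 1/magnification, so radius scales by 1/sqrt).
    win_r = int(round(lens_radius_px / np.sqrt(magnification)))
    cy, cx = center
    window = slice_2d[cy - win_r:cy + win_r, cx - win_r:cx + win_r]
    # 903: scale the windowed portion up to the lens area (nearest neighbour).
    idx = np.arange(2 * lens_radius_px) * window.shape[0] // (2 * lens_radius_px)
    return window[np.ix_(idx, idx)]
```

With magnification 1.0 the window equals the lens area and the slice passes through unchanged, matching the window 1021 case discussed below.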
As a visualization example of 901-904, a two-dimensional slice 1001 of the auxiliary image of the anatomical structure is shown in FIGS. 10-11, and two circular windows 1021 and 1022 over the slice are shown in FIG. 10. In this case, each window 1021, 1022 corresponds in shape to the lens 1121 of the magnifying glass 1120 and shares its center point, where the lens 1121 of the magnifying glass 1120 is shown on the display screen together with the master image of an external view 1101 of the anatomical structure, as shown in FIG. 11. In this example, the area of window 1021 equals the area of lens 1121, so window 1021 would be selected in 902 if the magnification factor is 1.0. The area of window 1022, on the other hand, is smaller than the area of lens 1121, so window 1022 would be selected in 902 if the magnification factor is greater than 1.0. Note that although the lens 1121 of the magnifying glass 1120 is depicted as circular, it may also have other shapes conventional for magnifying glasses, such as rectangular.
FIG. 12 illustrates, as an example, a flow chart of a method, performed by a processor of the medical robotic system, for manipulating an image object displayed on a computer display screen of the medical robotic system in response to corresponding manipulation of an associated master input device while the master input device is in an image manipulation mode.
For the purposes of this method, the medical robotic system includes: an image capture device for capturing images (such as the endoscope 140 or the LUS probe 150); a robotic arm holding the image capture device (such as the slave arms 123 and 124 holding the endoscope 140 and the LUS probe 150, respectively); a computer display screen (such as the display screen 104); a master input device (such as master input device 107 or 108) adapted to be manipulated by a user in multiple degrees of freedom of movement; and a processor (such as the auxiliary controller 242) configured to control movement of the image capture device according to user manipulation of the master input device when the master input device is in an image capture mode, and to control the display of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in the image manipulation mode.
In 1201, the processor detects that the master input device has been placed into the image manipulation mode by the user. One way to accomplish this uses a master clutch provided in the medical robotic system that supports disengaging the master input device from its associated robotic arm so that the master input device can be repositioned. When this mode is activated by some mechanism, such as the user depressing a button on the master input device, depressing a foot pedal, or using voice activation, the associated robotic arm is locked in place and a cursor (symbolically represented by an icon, such as an image of a hand) is presented to the user on the computer display screen. When the user leaves this mode, the cursor is hidden and, if necessary, control of the robotic arm may resume after its position has been readjusted.
In 1202, the processor determines whether a control input (such as one generated by depressing a button on a conventional mouse) has been activated by the user. The control input in this case may be activated by depressing a button provided on the master input device, or by other means such as squeezing a gripper or pincher arrangement provided on the master input device. For additional details on clutching and on gripper or pincher arrangements on master input devices, see, for example, commonly owned U.S. Pat. No. 6,659,939, entitled "Cooperative Minimally Invasive Telesurgical System," which is incorporated herein by reference. If the control input is not determined in 1202 to be "on" (i.e., activated), then the processor waits until it receives either an "on" indication or an indication that the image manipulation mode has been exited.
In 1203, after receiving an indication that the control input is "on", the processor checks whether the cursor is positioned on (or within a predetermined distance of) an object displayed on the computer display screen. If it is not, then in 1204 the processor causes a menu of user-selectable items or actions to be displayed on the computer display screen, and in 1205 the processor receives a menu selection made by the user and reacts to that selection.
Examples of user-selectable menu items include: a magnifying glass, a cut-plane, an eraser, and image registration. If the user selects the magnifying glass item, then an image of a magnifying glass is displayed on the computer display screen and the method described with reference to FIG. 8 is performed by the processor. When the user is finished with the magnifying glass function, the user may indicate exit of the function in any conventional manner, and the processor returns to 1202.
If the user selects the cut-plane item, then a plane (or a rectangular window of either fixed or user-adjustable size) is displayed on the computer display screen. The master input device may then be associated with the plane so that the user can position and orient the plane in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device. If the plane is manipulated and moved so as to intersect a volume rendering of the anatomical structure, it acts as a cut-plane defining a two-dimensional slice of the volume rendering at the intersection. Alternatively, the master input device may be associated with the volume rendering of the anatomical structure, which may then be manipulated and moved so as to intersect the displayed plane and thereby define the cut-plane. The association of the plane or the volume rendering with the pointing device may be performed in substantially the same manner as described for the magnifying glass in 801 of FIG. 8.
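Extracting the two-dimensional slice at the plane/volume intersection can be sketched as sampling the voxel volume along two in-plane axes. This is an illustrative sketch only (nearest-neighbour sampling, zero outside the volume); the function and parameter names are assumptions, not the patent's method.

```python
import numpy as np

def sample_cut_plane(volume, origin, u_axis, v_axis, size):
    # Sample a size-by-size slice of the voxel volume along the plane
    # spanned by unit vectors u_axis and v_axis through `origin`.
    slice_img = np.zeros((size, size), dtype=volume.dtype)
    for i in range(size):
        for j in range(size):
            p = origin + (i - size // 2) * u_axis + (j - size // 2) * v_axis
            idx = np.round(p).astype(int)
            # Out-of-volume samples stay zero (outside the rendering).
            if all(0 <= idx[k] < volume.shape[k] for k in range(3)):
                slice_img[i, j] = volume[tuple(idx)]
    return slice_img
```

Repositioning the plane with the pointing device simply changes `origin` and the two axis vectors before resampling.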
The two-dimensional slice may then be viewed in the plane itself, or in a separate window (such as a picture-in-picture) on the computer display screen. The user may also select the cut-plane item additional times to define additional two-dimensional slices of the volume rendering, each viewable concurrently in its own plane or picture-in-picture window on the computer display screen. So that unwanted cut-plane slices do not clutter the computer display screen, a conventional delete function is provided so that the user can selectively delete any of the cut-planes and their respective slices. When the user is finished with the cut-plane function, the user may indicate exit of the function in any conventional manner, and the processor returns to 1202.
If the user selects the eraser item, then an eraser is displayed on the computer display screen. The master input device is then associated with the eraser so that the user can position and orient the eraser in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device. The association of the pointing device with the eraser in this case may be performed in substantially the same manner as described for the magnifying glass in 801 of FIG. 8. If the eraser is manipulated and moved so as to intersect the volume rendering of the anatomical structure, then as the eraser passes through the volume rendering it erases the rendering either completely or partially. If partial erasing is selected by the user (or pre-programmed into the processor), then each time the eraser passes through the volume rendering, less detail of the anatomical structure is shown. Less detail in this case may refer to the coarseness/fineness of the rendering, or it may refer to peeling back levels of the three-dimensional volume rendering one at a time. All such erasing characteristics or options may be selected in a conventional manner. If the user inadvertently erases a portion of the volume rendering, a conventional undo feature is provided to allow the user to undo the erasure. When the user is finished with the eraser function, the user may indicate exit of the function in any conventional manner, and the processor returns to 1202.
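A single eraser pass can be sketched as attenuating voxels within the eraser's reach, so that repeated passes progressively remove detail. This is a minimal sketch under assumed conventions (spherical eraser, opacity attenuation as the "less detail" mechanism); the names and the `partial` parameter are illustrative, not from the patent.

```python
import numpy as np

def erase_pass(volume, center, radius, partial=0.5):
    # Attenuate voxel values within `radius` of the eraser center.
    # partial=0.0 would correspond to complete erasure in one pass.
    zz, yy, xx = np.indices(volume.shape)
    mask = ((zz - center[0]) ** 2 + (yy - center[1]) ** 2
            + (xx - center[2]) ** 2) <= radius ** 2
    volume[mask] = volume[mask] * partial
    return volume
```

An undo feature would simply keep a copy of the affected voxels before each pass and restore them on request.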
In addition to the eraser feature described above, other spatially localized modification features are also contemplated and considered to be within the full scope of the present invention, including selectively sharpening, brightening, or coloring a portion of the displayed image so as to enhance or highlight its visibility in a selected area. Each such spatially localized modification feature may be performed in substantially the same manner as described above for the eraser function.
If the user selects the image registration item, then the processor records the selection for future action, as described below with reference to 1212, before jumping back to 1202. Image registration in this case generally involves manually registering an auxiliary image of an object (such as the anatomical structure) with a corresponding master image of that object.
As an alternative to the menu approach described above, icons representing each of the selectable items described above may be displayed on the computer display screen upon entering the image manipulation mode and selected by the user clicking on them, after which the processor proceeds as described for selection of their corresponding menu items.
Now continuing with the method described with reference to FIG. 12, after receiving an indication in 1202 that the control input is on, and determining in 1203 that the cursor is positioned on or near an object (not an icon) displayed on the computer display screen, the processor preferably changes the iconic representation of the cursor from, for example, an open hand to a grasping hand, to indicate that the object has been "grabbed" and is ready to be moved or "dragged" by user manipulation of the master input device to another position and/or orientation in the three-dimensional space of the computer display screen.
In 1206, the processor determines whether the user has indicated that a display parameter of the selected object is to be adjusted, and if so, the processor performs the display adjustment in 1207. As an example, a dial on the master input device may be rotated by the user to indicate a display adjustment, with the display parameter associated with that dial adjusted for the selected object according to the amount of rotation of the dial. Alternatively, if the master input device has a gripper, the gripper may be rotated so as to function as a dial. Examples of display parameters that may be adjusted in this manner include the brightness, contrast, color, and level of detail (such as mesh coarseness/fineness, or voxel size and/or opacity) of the selected object displayed on the computer display screen.
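Mapping the dial rotation to a clamped parameter change can be sketched in one line. The gain and parameter range here are assumed values for illustration; the patent does not specify them.

```python
def adjust_parameter(value, rotation_deg, gain=0.005, lo=0.0, hi=1.0):
    # Scale the dial rotation by an assumed sensitivity (gain) and
    # clamp the adjusted value to the parameter's valid range [lo, hi].
    return min(hi, max(lo, value + gain * rotation_deg))
```

The same mapping applies whether the input comes from a physical dial or from a gripper rotated to act as one.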
The processor then proceeds to 1208, where it determines whether the cursor has been moved after "grabbing" the selected object following the affirmative determination in 1203. If the cursor has not moved, the processor jumps back to 1202, since the user may at this time only wish to adjust a display parameter of the selected object. On the other hand, if the cursor has moved after "grabbing" the selected object, then in 1209 the processor moves the selected object to the new cursor position. Since the cursor operates in the three-dimensional space of the computer display screen, it may indicate movement "into" the display screen by, for example, gradually reducing in size. When the three-dimensional nature of the computer display screen is realized using right and left two-dimensional views of the object, the disparity between corresponding points in the two views represents a depth value, and a reduction of the depth cue of the cursor icon in the right and left views may indicate that the cursor is moving "into" the display screen.
Optionally, in 1210, haptic feedback may be provided back to the master input device so that the user can sense a feedback force as the "grabbed" object is moved in 1209. As an example, by associating a virtual mass and inertia with the object, the user's interaction with the object can be fed back haptically, so that the user feels a feedback force upon contact with the object or when translating and rotating the object as it accelerates or decelerates. The haptic feedback performed in 1210 may be implemented only for certain types of objects and not for others, or it may be useful only in certain circumstances. Such haptic feedback may also be applied to the magnifying glass and/or to the plane used to define a cut-plane as described above. In those cases, however, the haptic feedback may be restricted so as to occur only after the magnifying glass or the plane has entered the anatomical structure of interest.
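One simple way to turn a virtual mass and inertia into a feedback force is a mass-damper model, F = m·a + b·v. The patent names virtual mass and inertia but does not specify the model, so this sketch and its parameter names are assumptions.

```python
def haptic_force(mass, damping, accel, velocity):
    # Feedback force per axis for a grabbed object under an assumed
    # virtual mass-damper model: F = m*a + b*v.
    return tuple(mass * a + damping * v for a, v in zip(accel, velocity))
```

The resulting force vector would be commanded back to the master input device's actuators so the user feels the object resist acceleration.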
In 1211, the processor determines whether the control input is still in an "on" state. If it is, the processor jumps back to 1208 to track and respond to cursor movement. On the other hand, if the control input has been turned off, for example, by the user releasing the button initially depressed to indicate that the control was "on", then in 1212 the processor performs the selected menu action.
For example, if the image registration item was selected by the user in response to the processor displaying the menu in 1204 (or, alternatively, by the user clicking an icon indicating it), then the object that has just been moved is registered with another image of that object currently displayed on the computer display screen, so that the two have the same coordinate and orientation values in a common reference frame (such as the coordinate frame of the computer display screen). This feature facilitates, for example, manual registration of an auxiliary image of the anatomical structure (such as one derived using the LUS probe 150) with a master image of the anatomical structure (such as one captured using the endoscope 140). After this initial registration, changes in the position and/or orientation of the corresponding object in the master image may be mirrored so as to cause corresponding changes to the selected object in the auxiliary image, thereby maintaining its position/orientation relative to the master image. When the user is finished with the image registration function, the processor returns to 1202.
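The mirroring step described above can be sketched with homogeneous transforms: capture the relative transform between the two images once at registration time, then reapply it whenever the master image's object moves. The 4x4-matrix convention and function names are assumptions for illustration.

```python
import numpy as np

def relative_at_registration(master_pose_0, aux_pose_0):
    # Captured once, when the user completes the manual registration.
    return np.linalg.inv(master_pose_0) @ aux_pose_0

def mirror_pose(master_pose, relative):
    # Keep the auxiliary object locked to the master image by
    # reapplying the fixed relative transform on every master update.
    return master_pose @ relative
```

Any subsequent translation or rotation of the master image's object then carries the auxiliary image along, preserving their relative position and orientation.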
Although the various aspects of the present invention have been described with respect to a preferred embodiment, it will be understood that the invention is entitled to full protection within the full scope of the appended claims.
Claims (12)
1. A method for displaying a user-selected portion of an auxiliary image of an anatomical structure on a display as an overlay over a master image of the anatomical structure, wherein the master image provides a view of an exterior of the anatomical structure captured by a first image capture device, and wherein the auxiliary image provides a view of an interior of the anatomical structure generated using information captured by a second image capture device, the method comprising:
associating, by using a processor, a movable window on the display with a pointing device so that the movable window is movable on the display according to movement of the pointing device;
registering, by using the processor, the auxiliary image with the master image so that the two have a same scale and refer to a same position and orientation in a common reference frame;
displaying, by using the processor, the master image on the display; and
displaying, by using the processor, a portion of the registered auxiliary image within the movable window on the display as an overlay over the master image.
2. The method according to claim 1, wherein the movable window is displayed on the display as a lens area of a magnifying glass.
3. The method according to claim 1, wherein the first image capture device is a stereoscopic endoscope, the second image capture device is an ultrasound probe, and the display is a stereo vision display.
4. The method according to claim 1, wherein displaying the portion of the auxiliary image within the movable window as an overlay over the master image comprises displaying, by using the processor, an image of a two-dimensional slice of the anatomical structure within the movable window as an overlay over the master image, wherein the two-dimensional slice of the anatomical structure corresponds to a user-selected depth within the anatomical structure.
5. The method according to claim 1, wherein displaying the portion of the auxiliary image within the movable window as an overlay over the master image comprises determining, by using the processor, a current position and orientation of a center point of the movable window on the display, and determining, by using the processor, a portion of the registered auxiliary image corresponding to the position and orientation of the center point of the movable window on the display, wherein an area of the portion of the registered auxiliary image is inversely proportional to an area of the movable window according to a magnification factor.
6. The method according to claim 5, further comprising:
receiving the magnification factor at the processor; and
applying, by using the processor, the magnification factor to the portion of the registered auxiliary image so that the portion of the registered auxiliary image fits the movable window and is displayed within the movable window as an overlay over the master image.
7. A medical system comprising:
a first image capture device arranged relative to an anatomical structure so as to capture a master image of an exterior of the anatomical structure;
a second image capture device arranged relative to the anatomical structure so as to capture information for generating an auxiliary image of an interior of the anatomical structure;
a pointing device;
a display; and
a processor configured to: associate a movable window on the display with the pointing device so that the movable window moves on the display according to movement of the pointing device; register the auxiliary image with the master image so that the two have a same scale and refer to a same position and orientation in a common reference frame; display the master image on the display; and display a portion of the registered auxiliary image within the movable window on the display as an overlay over the master image.
8. The medical system according to claim 7, wherein the processor is configured to display the movable window on the display as a lens area of a magnifying glass.
9. The medical system according to claim 7, wherein the first image capture device is a stereoscopic endoscope, the second image capture device is an ultrasound probe, and the display is a stereo vision display.
10. The medical system according to claim 7, wherein the processor is configured to display the portion of the auxiliary image within the movable window as an overlay over the master image by displaying an image of a two-dimensional slice of the anatomical structure within the movable window as an overlay over the master image, wherein the two-dimensional slice of the anatomical structure corresponds to a user-selected depth within the anatomical structure.
11. The medical system according to claim 7, wherein the processor is configured to display the portion of the auxiliary image within the movable window as an overlay over the master image by determining a current position and orientation of a center point of the movable window on the display, and by determining a portion of the registered auxiliary image corresponding to the position and orientation of the center point of the movable window on the display, wherein an area of the portion of the registered auxiliary image is inversely proportional to an area of the movable window according to a magnification factor.
12. The medical system according to claim 11, wherein the processor is configured to: receive the magnification factor at the processor; and apply the magnification factor to the portion of the registered auxiliary image so that the portion of the registered auxiliary image fits the movable window and is displayed within the movable window as an overlay over the master image.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US72845005P | 2005-10-20 | 2005-10-20 | |
US60/728,450 | 2005-10-20 | ||
CN2006800389441A CN101291635B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on a computer display in a medical robotic system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2006800389441A Division CN101291635B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on a computer display in a medical robotic system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103251455A CN103251455A (en) | 2013-08-21 |
CN103251455B true CN103251455B (en) | 2016-04-27 |
Family
ID=37744551
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310052693.4A Active CN103142309B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on computer display in medical robotic system |
CN201310052673.7A Active CN103251455B (en) | 2005-10-20 | 2006-10-19 | Assistant images display on computer display in medical robotic system and manipulation |
CN2006800389441A Active CN101291635B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on a computer display in a medical robotic system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310052693.4A Active CN103142309B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on computer display in medical robotic system |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2006800389441A Active CN101291635B (en) | 2005-10-20 | 2006-10-19 | Auxiliary image display and manipulation on a computer display in a medical robotic system |
Country Status (6)
Country | Link |
---|---|
US (4) | US20080033240A1 (en) |
EP (4) | EP1937176B1 (en) |
JP (4) | JP5322648B2 (en) |
KR (1) | KR101320379B1 (en) |
CN (3) | CN103142309B (en) |
WO (1) | WO2007047782A2 (en) |
Families Citing this family (187)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8944070B2 (en) | 1999-04-07 | 2015-02-03 | Intuitive Surgical Operations, Inc. | Non-force reflecting method for providing tool force information to a user of a telesurgical system |
US9517106B2 (en) | 1999-09-17 | 2016-12-13 | Intuitive Surgical Operations, Inc. | Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space |
US9155544B2 (en) * | 2002-03-20 | 2015-10-13 | P Tech, Llc | Robotic systems and methods |
US8971597B2 (en) * | 2005-05-16 | 2015-03-03 | Intuitive Surgical Operations, Inc. | Efficient vision and kinematic data fusion for robotic surgical instruments and other applications |
US9492240B2 (en) | 2009-06-16 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
US8073528B2 (en) | 2007-09-30 | 2011-12-06 | Intuitive Surgical Operations, Inc. | Tool tracking systems, methods and computer products for image guided surgery |
US9789608B2 (en) * | 2006-06-29 | 2017-10-17 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
US10555775B2 (en) | 2005-05-16 | 2020-02-11 | Intuitive Surgical Operations, Inc. | Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery |
CN103142309B (en) * | 2005-10-20 | 2015-06-17 | 直观外科手术操作公司 | Auxiliary image display and manipulation on computer display in medical robotic system |
US7907166B2 (en) * | 2005-12-30 | 2011-03-15 | Intuitive Surgical Operations, Inc. | Stereo telestration for robotic surgery |
CN104688327B (en) * | 2006-06-13 | 2017-06-09 | 直观外科手术操作公司 | Minimally invasive surgery system |
US9718190B2 (en) | 2006-06-29 | 2017-08-01 | Intuitive Surgical Operations, Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
US10008017B2 (en) | 2006-06-29 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Rendering tool information as graphic overlays on displayed images of tools |
US20090192523A1 (en) | 2006-06-29 | 2009-07-30 | Intuitive Surgical, Inc. | Synthetic representation of a surgical instrument |
US10258425B2 (en) | 2008-06-27 | 2019-04-16 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide |
US20100149183A1 (en) * | 2006-12-15 | 2010-06-17 | Loewke Kevin E | Image mosaicing systems and methods |
US11228753B1 (en) | 2006-12-28 | 2022-01-18 | Robert Edwin Douglas | Method and apparatus for performing stereoscopic zooming on a head display unit |
US10795457B2 (en) | 2006-12-28 | 2020-10-06 | D3D Technologies, Inc. | Interactive 3D cursor |
US11315307B1 (en) | 2006-12-28 | 2022-04-26 | Tipping Point Medical Images, Llc | Method and apparatus for performing rotating viewpoints using a head display unit |
US9980691B2 (en) * | 2006-12-28 | 2018-05-29 | David Byron Douglas | Method and apparatus for three dimensional viewing of images |
US11275242B1 (en) | 2006-12-28 | 2022-03-15 | Tipping Point Medical Images, Llc | Method and apparatus for performing stereoscopic rotation of a volume on a head display unit |
JP4916011B2 (en) * | 2007-03-20 | 2012-04-11 | 株式会社日立製作所 | Master / slave manipulator system |
JP5444209B2 (en) * | 2007-04-16 | 2014-03-19 | ニューロアーム サージカル リミテッド | Frame mapping and force feedback method, apparatus and system |
US9469034B2 (en) | 2007-06-13 | 2016-10-18 | Intuitive Surgical Operations, Inc. | Method and system for switching modes of a robotic system |
US9084623B2 (en) | 2009-08-15 | 2015-07-21 | Intuitive Surgical Operations, Inc. | Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide |
US8903546B2 (en) | 2009-08-15 | 2014-12-02 | Intuitive Surgical Operations, Inc. | Smooth control of an articulated instrument across areas with different work space conditions |
US9089256B2 (en) | 2008-06-27 | 2015-07-28 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US9138129B2 (en) | 2007-06-13 | 2015-09-22 | Intuitive Surgical Operations, Inc. | Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide |
US8620473B2 (en) * | 2007-06-13 | 2013-12-31 | Intuitive Surgical Operations, Inc. | Medical robotic system with coupled control modes |
US20090069804A1 (en) * | 2007-09-12 | 2009-03-12 | Jensen Jeffrey L | Apparatus for efficient power delivery |
US9050120B2 (en) * | 2007-09-30 | 2015-06-09 | Intuitive Surgical Operations, Inc. | Apparatus and method of user interface with alternate tool mode for robotic surgical tools |
US8042435B2 (en) * | 2007-11-26 | 2011-10-25 | Thompson Ray P | Special articulating tool holder |
US9123159B2 (en) * | 2007-11-30 | 2015-09-01 | Microsoft Technology Licensing, Llc | Interactive geo-positioning of imagery |
US8155479B2 (en) | 2008-03-28 | 2012-04-10 | Intuitive Surgical Operations Inc. | Automated panning and digital zooming for robotic surgical systems |
US8808164B2 (en) | 2008-03-28 | 2014-08-19 | Intuitive Surgical Operations, Inc. | Controlling a robotic surgical tool with a display monitor |
US8864652B2 (en) | 2008-06-27 | 2014-10-21 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip |
DE102008041867B4 (en) * | 2008-09-08 | 2015-09-10 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Medical workstation and operating device for manually moving a robot arm |
US8315720B2 (en) | 2008-09-26 | 2012-11-20 | Intuitive Surgical Operations, Inc. | Method for graphically providing continuous change of state directions to a user of a medical robotic system |
CN102224737B (en) * | 2008-11-24 | 2014-12-03 | 皇家飞利浦电子股份有限公司 | Combining 3D video and auxiliary data |
US8830224B2 (en) | 2008-12-31 | 2014-09-09 | Intuitive Surgical Operations, Inc. | Efficient 3-D telestration for local robotic proctoring |
US8184880B2 (en) | 2008-12-31 | 2012-05-22 | Intuitive Surgical Operations, Inc. | Robust sparse image matching for robotic surgery |
WO2010105237A2 (en) * | 2009-03-12 | 2010-09-16 | Health Research Inc. | Method and system for minimally-invasive surgery training |
US9155592B2 (en) * | 2009-06-16 | 2015-10-13 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
US9492927B2 (en) | 2009-08-15 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose |
US8918211B2 (en) | 2010-02-12 | 2014-12-23 | Intuitive Surgical Operations, Inc. | Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument |
KR101039108B1 (en) * | 2009-09-01 | 2011-06-07 | 한양대학교 산학협력단 | Medical robot system and Method for controlling the same |
KR101683057B1 (en) * | 2009-10-30 | 2016-12-07 | (주)미래컴퍼니 | Surgical robot system and motion restriction control method thereof |
KR101598774B1 (en) * | 2009-10-01 | 2016-03-02 | (주)미래컴퍼니 | Apparatus and Method for processing surgical image |
WO2011040769A2 (en) * | 2009-10-01 | 2011-04-07 | 주식회사 이턴 | Surgical image processing device, image-processing method, laparoscopic manipulation method, surgical robot system and an operation-limiting method therefor |
US8706184B2 (en) * | 2009-10-07 | 2014-04-22 | Intuitive Surgical Operations, Inc. | Methods and apparatus for displaying enhanced imaging data on a clinical image |
EP2493387A4 (en) | 2009-10-30 | 2017-07-19 | The Johns Hopkins University | Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions |
DE102010009295B4 (en) * | 2010-02-25 | 2019-02-21 | Siemens Healthcare Gmbh | Method for displaying a region to be examined and / or treated |
US9298260B2 (en) * | 2010-03-12 | 2016-03-29 | Broadcom Corporation | Tactile communication system with communications based on capabilities of a remote system |
US8675939B2 (en) | 2010-07-13 | 2014-03-18 | Stryker Leibinger Gmbh & Co. Kg | Registration of anatomical data sets |
US20140287393A1 (en) * | 2010-11-04 | 2014-09-25 | The Johns Hopkins University | System and method for the evaluation of or improvement of minimally invasive surgery skills |
US9486189B2 (en) | 2010-12-02 | 2016-11-08 | Hitachi Aloka Medical, Ltd. | Assembly for use with surgery system |
JP5717543B2 (en) | 2011-05-30 | 2015-05-13 | GE Medical Systems Global Technology Company, LLC | Ultrasonic diagnostic apparatus and control program therefor |
WO2013005862A1 (en) | 2011-07-07 | 2013-01-10 | Olympus Corporation | Medical master slave manipulator |
JP5892361B2 (en) * | 2011-08-02 | 2016-03-23 | Sony Corporation | Control device, control method, program, and robot control system |
WO2013018897A1 (en) | 2011-08-04 | 2013-02-07 | Olympus Corporation | Surgical implement and medical treatment manipulator |
JP6021353B2 (en) | 2011-08-04 | 2016-11-09 | Olympus Corporation | Surgery support device |
JP5931497B2 (en) | 2011-08-04 | 2016-06-08 | Olympus Corporation | Surgery support apparatus and assembly method thereof |
JP5936914B2 (en) | 2011-08-04 | 2016-06-22 | Olympus Corporation | Operation input device and manipulator system including the same |
JP6081061B2 (en) | 2011-08-04 | 2017-02-15 | Olympus Corporation | Surgery support device |
JP6000641B2 (en) | 2011-08-04 | 2016-10-05 | Olympus Corporation | Manipulator system |
JP5953058B2 (en) | 2011-08-04 | 2016-07-13 | Olympus Corporation | Surgery support device and method for attaching and detaching the same |
JP6021484B2 (en) | 2011-08-04 | 2016-11-09 | Olympus Corporation | Medical manipulator |
JP6009840B2 (en) | 2011-08-04 | 2016-10-19 | Olympus Corporation | Medical equipment |
EP2740434A4 (en) | 2011-08-04 | 2015-03-18 | Olympus Corp | Medical manipulator and method for controlling same |
JP6005950B2 (en) | 2011-08-04 | 2016-10-12 | Olympus Corporation | Surgery support apparatus and control method thereof |
US9519341B2 (en) | 2011-08-04 | 2016-12-13 | Olympus Corporation | Medical manipulator and surgical support apparatus |
JP5841451B2 (en) | 2011-08-04 | 2016-01-13 | Olympus Corporation | Surgical instrument and control method thereof |
KR101828453B1 (en) | 2011-12-09 | 2018-02-13 | Samsung Electronics Co., Ltd. | Medical robotic system and control method therefor |
US9956042B2 (en) | 2012-01-13 | 2018-05-01 | Vanderbilt University | Systems and methods for robot-assisted transurethral exploration and intervention |
JP6250566B2 (en) | 2012-02-15 | 2017-12-20 | Intuitive Surgical Operations, Inc. | User selection of robot system operation mode using operation to distinguish modes |
EP2650691A1 (en) | 2012-04-12 | 2013-10-16 | Koninklijke Philips N.V. | Coordinate transformation of graphical objects registered to a magnetic resonance image |
US9539726B2 (en) * | 2012-04-20 | 2017-01-10 | Vanderbilt University | Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots |
US9687303B2 (en) | 2012-04-20 | 2017-06-27 | Vanderbilt University | Dexterous wrists for surgical intervention |
US9549720B2 (en) | 2012-04-20 | 2017-01-24 | Vanderbilt University | Robotic device for establishing access channel |
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US20130314418A1 (en) * | 2012-05-24 | 2013-11-28 | Siemens Medical Solutions Usa, Inc. | System for Erasing Medical Image Features |
KR102167359B1 (en) | 2012-06-01 | 2020-10-19 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space |
US9642606B2 (en) | 2012-06-27 | 2017-05-09 | Camplex, Inc. | Surgical visualization system |
US9936863B2 (en) | 2012-06-27 | 2018-04-10 | Camplex, Inc. | Optical assembly providing a surgical microscope view for a surgical visualization system |
WO2014001948A2 (en) | 2012-06-28 | 2014-01-03 | Koninklijke Philips N.V. | C-arm trajectory planning for optimal image acquisition in endoscopic surgery |
JP6382802B2 (en) * | 2012-06-28 | 2018-08-29 | Koninklijke Philips N.V. | Improving blood vessel visualization using a robot-operated endoscope |
US9076227B2 (en) * | 2012-10-01 | 2015-07-07 | Mitsubishi Electric Research Laboratories, Inc. | 3D object tracking in multiple 2D sequences |
CN103054612B (en) * | 2012-12-10 | 2015-06-10 | Qisda (Suzhou) Co., Ltd. | Ultrasonic probe mouse and ultrasonoscope |
US9386908B2 (en) * | 2013-01-29 | 2016-07-12 | Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) | Navigation using a pre-acquired image |
US10507066B2 (en) | 2013-02-15 | 2019-12-17 | Intuitive Surgical Operations, Inc. | Providing information of tools by filtering image areas adjacent to or on displayed images of the tools |
AU2014233662B2 (en) | 2013-03-15 | 2019-01-17 | Sri International | Hyperdexterous surgical system |
EP4331519A2 (en) * | 2013-03-15 | 2024-03-06 | Medtronic Holding Company Sàrl | A system for treating tissue |
KR101563498B1 (en) | 2013-05-02 | 2015-10-27 | Samsung Medison Co., Ltd. | Ultrasound system and method for providing change information of target object |
EP2999414B1 (en) | 2013-05-21 | 2018-08-08 | Camplex, Inc. | Surgical visualization systems |
WO2015042483A2 (en) | 2013-09-20 | 2015-03-26 | Camplex, Inc. | Surgical visualization systems |
JP5781135B2 (en) * | 2013-09-27 | 2015-09-16 | FA System Engineering Co., Ltd. | 3D navigation video generation device |
JP5927348B2 (en) * | 2013-10-30 | 2016-06-01 | Olympus Corporation | Endoscope device |
JP6358463B2 (en) | 2013-11-13 | 2018-07-18 | Panasonic Intellectual Property Management Co., Ltd. | Master device for master-slave device, control method therefor, and master-slave device |
US10057590B2 (en) * | 2014-01-13 | 2018-08-21 | Mediatek Inc. | Method and apparatus using software engine and hardware engine collaborated with each other to achieve hybrid video encoding |
EP3096692B1 (en) | 2014-01-24 | 2023-06-14 | Koninklijke Philips N.V. | Virtual image with optical shape sensing device perspective |
US10083278B2 (en) * | 2014-02-12 | 2018-09-25 | Edda Technology, Inc. | Method and system for displaying a timing signal for surgical instrument insertion in surgical procedures |
EP3119329B1 (en) | 2014-03-17 | 2022-07-20 | Intuitive Surgical Operations, Inc. | Guided setup for teleoperated medical device |
WO2015142955A1 (en) | 2014-03-17 | 2015-09-24 | Intuitive Surgical Operations, Inc. | Automated structure with pre-established arm positions in a teleoperated medical system |
EP3119339B1 (en) | 2014-03-17 | 2019-08-28 | Intuitive Surgical Operations, Inc. | Systems and methods for offscreen indication of instruments in a teleoperational medical system |
US10555788B2 (en) | 2014-03-28 | 2020-02-11 | Intuitive Surgical Operations, Inc. | Surgical system with haptic feedback based upon quantitative three-dimensional imaging |
EP3122281B1 (en) | 2014-03-28 | 2022-07-20 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional imaging and 3d modeling of surgical implants |
US10334227B2 (en) | 2014-03-28 | 2019-06-25 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional imaging of surgical scenes from multiport perspectives |
JP6854237B2 (en) | 2014-03-28 | 2021-04-07 | Intuitive Surgical Operations, Inc. | Quantitative 3D visualization of instruments in the field of view |
KR102397254B1 (en) | 2014-03-28 | 2022-05-12 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Quantitative three-dimensional imaging of surgical scenes |
JP2015192697A (en) * | 2014-03-31 | 2015-11-05 | Sony Corporation | Control device and control method, and photographing control system |
JP6305810B2 (en) * | 2014-03-31 | 2018-04-04 | Canon Medical Systems Corporation | Medical diagnostic imaging equipment |
WO2015168066A1 (en) * | 2014-05-01 | 2015-11-05 | Endochoice, Inc. | System and method of scanning a body cavity using a multiple viewing elements endoscope |
CN105321202A (en) * | 2014-07-16 | 2016-02-10 | 南京普爱射线影像设备有限公司 | Medical two-dimensional image and 3D image display software system |
JP6246093B2 (en) | 2014-07-25 | 2017-12-13 | Olympus Corporation | Treatment instrument and treatment instrument system |
US9815206B2 (en) * | 2014-09-25 | 2017-11-14 | The Johns Hopkins University | Surgical system user interface using cooperatively-controlled robot |
US10786315B2 (en) * | 2014-11-13 | 2020-09-29 | Intuitive Surgical Operations, Inc. | Interaction between user-interface and master controller |
US10123846B2 (en) | 2014-11-13 | 2018-11-13 | Intuitive Surgical Operations, Inc. | User-interface control using master controller |
US10702353B2 (en) | 2014-12-05 | 2020-07-07 | Camplex, Inc. | Surgical visualizations systems and displays |
EP3277152A4 (en) | 2015-03-25 | 2018-12-26 | Camplex, Inc. | Surgical visualization systems and displays |
CN104887175A (en) * | 2015-06-03 | 2015-09-09 | Wannan Medical College | Virtual gastroscopy and diagnosis system |
WO2017055381A1 (en) * | 2015-09-29 | 2017-04-06 | Koninklijke Philips N.V. | Instrument controller for robotically assisted minimally invasive surgery |
EP3342365B1 (en) * | 2015-10-02 | 2023-10-11 | Sony Group Corporation | Medical control device, control method, program and medical control system |
WO2017091704A1 (en) | 2015-11-25 | 2017-06-01 | Camplex, Inc. | Surgical visualization systems and displays |
CN105376503B (en) * | 2015-12-14 | 2018-07-20 | Beijing Yiqianchuang Technology Co., Ltd. | Surgical image processing apparatus and method |
KR20230141937A (en) * | 2016-06-09 | 2023-10-10 | Intuitive Surgical Operations, Inc. | Computer-assisted teleoperated surgical system and method |
KR102607065B1 (en) | 2016-06-30 | 2023-11-29 | Intuitive Surgical Operations, Inc. | Graphical user interface for displaying guidance information in a plurality of modes during an image-guided procedure |
EP4238490A3 (en) | 2016-06-30 | 2023-11-01 | Intuitive Surgical Operations, Inc. | Graphical user interface for displaying guidance information during an image-guided procedure |
JP6918844B2 (en) * | 2016-07-14 | 2021-08-11 | Intuitive Surgical Operations, Inc. | Systems and methods for on-screen menus in remote-controlled medical systems |
WO2018034661A1 (en) * | 2016-08-18 | 2018-02-22 | Stryker European Holdings I, Llc | Method for visualizing a bone |
KR101715026B1 (en) * | 2016-08-26 | 2017-03-13 | Meere Company Inc. | Surgical robot system and motion restriction control method thereof |
CN109567902B (en) * | 2016-11-01 | 2022-04-08 | Bio-Medical Engineering (HK) Limited | Surgical robotic device and system for performing minimally invasive and transluminal endoscopic surgical actions |
CN110248583B (en) | 2016-12-02 | 2021-12-31 | Vanderbilt University | Steerable endoscope with continuum manipulator |
CN110290758A (en) * | 2017-02-14 | 2019-09-27 | Intuitive Surgical Operations, Inc. | Multi-dimensional visualization in computer-assisted teleoperated surgery |
US10839956B2 (en) * | 2017-03-03 | 2020-11-17 | University of Maryland Medical Center | Universal device and method to integrate diagnostic testing into treatment in real-time |
EP3612121A4 (en) | 2017-04-18 | 2021-04-07 | Intuitive Surgical Operations, Inc. | Graphical user interface for monitoring an image-guided procedure |
US10918455B2 (en) | 2017-05-08 | 2021-02-16 | Camplex, Inc. | Variable light source |
EP3649918B1 (en) * | 2017-07-03 | 2023-08-02 | FUJIFILM Corporation | Medical image processing device, endoscope device, diagnostic support device, medical service support device and report generation support device |
US11426507B2 (en) * | 2017-08-21 | 2022-08-30 | RELIGN Corporation | Arthroscopic devices and methods |
US11662270B2 (en) * | 2017-08-22 | 2023-05-30 | Intuitive Surgical Operations, Inc. | User-installable part installation detection techniques |
WO2019050886A1 (en) * | 2017-09-06 | 2019-03-14 | Covidien Lp | Systems, methods, and computer-readable media for providing stereoscopic visual perception notifications and/or recommendations during a robotic surgical procedure |
WO2019055701A1 (en) | 2017-09-13 | 2019-03-21 | Vanderbilt University | Continuum robots with multi-scale motion through equilibrium modulation |
US10835344B2 (en) * | 2017-10-17 | 2020-11-17 | Verily Life Sciences Llc | Display of preoperative and intraoperative images |
CN111356407A (en) * | 2017-10-20 | 2020-06-30 | 昆山华大智造云影医疗科技有限公司 | Ultrasonic detection device, ultrasonic control device, ultrasonic system and ultrasonic imaging method |
US11564703B2 (en) | 2017-10-30 | 2023-01-31 | Cilag Gmbh International | Surgical suturing instrument comprising a capture width which is larger than trocar diameter |
US11844579B2 (en) | 2017-12-28 | 2023-12-19 | Cilag Gmbh International | Adjustments based on airborne particle properties |
US11672605B2 (en) | 2017-12-28 | 2023-06-13 | Cilag Gmbh International | Sterile field interactive control displays |
US20190201139A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Communication arrangements for robot-assisted surgical platforms |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11896443B2 (en) * | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11298148B2 (en) | 2018-03-08 | 2022-04-12 | Cilag Gmbh International | Live time tissue classification using electrical parameters |
US11701162B2 (en) | 2018-03-08 | 2023-07-18 | Cilag Gmbh International | Smart blade application for reusable and disposable devices |
US11090047B2 (en) | 2018-03-28 | 2021-08-17 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
CN108836392B (en) * | 2018-03-30 | 2021-06-22 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Ultrasonic imaging method, device and equipment based on ultrasonic RF signal and storage medium |
WO2019222495A1 (en) | 2018-05-18 | 2019-11-21 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
CN109330697B (en) * | 2018-07-31 | 2023-09-22 | 深圳市精锋医疗科技股份有限公司 | Minimally invasive surgery slave operation equipment assembly and surgery robot |
US20200073526A1 (en) * | 2018-08-28 | 2020-03-05 | Johnson Controls Technology Company | Energy management system with draggable and non-draggable building component user interface elements |
EP3851024A4 (en) * | 2018-09-11 | 2021-11-10 | Sony Group Corporation | Medical observation system, medical observation device and medical observation method |
JP7427654B2 (en) | 2018-09-17 | 2024-02-05 | Auris Health, Inc. | Systems and methods for performing associated medical procedures |
EP3860426A4 (en) | 2018-10-02 | 2022-12-07 | Convergascent LLC | Endoscope with inertial measurement units and/or haptic input controls |
CN109498162B (en) * | 2018-12-20 | 2023-11-03 | 深圳市精锋医疗科技股份有限公司 | Main operation table for improving immersion sense and surgical robot |
WO2020167678A1 (en) * | 2019-02-12 | 2020-08-20 | Intuitive Surgical Operations, Inc. | Systems and methods for facilitating optimization of an imaging device viewpoint during an operating session of a computer-assisted operation system |
US11259807B2 (en) | 2019-02-19 | 2022-03-01 | Cilag Gmbh International | Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device |
EP3977406A1 (en) * | 2019-05-31 | 2022-04-06 | Intuitive Surgical Operations, Inc. | Composite medical imaging systems and methods |
KR102247545B1 (en) * | 2019-07-24 | 2021-05-03 | Kyungpook National University Industry-Academic Cooperation Foundation | Surgical location information providing method and device therefor |
US11903650B2 (en) | 2019-09-11 | 2024-02-20 | Ardeshir Rastinehad | Method for providing clinical support for surgical guidance during robotic surgery |
US11931119B1 (en) | 2019-11-15 | 2024-03-19 | Verily Life Sciences Llc | Integrating applications in a surgeon console user interface of a robotic surgical system |
US11918307B1 (en) * | 2019-11-15 | 2024-03-05 | Verily Life Sciences Llc | Integrating applications in a surgeon console user interface of a robotic surgical system |
JP2021091060A (en) * | 2019-12-12 | 2021-06-17 | Seiko Epson Corporation | Control method and robot system |
US20220087763A1 (en) * | 2020-09-23 | 2022-03-24 | Verb Surgical Inc. | Deep disengagement detection during telesurgery |
USD981425S1 (en) * | 2020-09-30 | 2023-03-21 | Karl Storz Se & Co. Kg | Display screen with graphical user interface |
CN114831738A (en) * | 2020-10-08 | 2022-08-02 | 深圳市精锋医疗科技股份有限公司 | Surgical robot, graphical control device thereof and graphical display method |
USD1022197S1 (en) | 2020-11-19 | 2024-04-09 | Auris Health, Inc. | Endoscope |
CN112957107B (en) * | 2021-02-19 | 2021-11-30 | 南昌华安众辉健康科技有限公司 | Pleuroperitoneal cavity surgical instrument with laparoscope |
US11844583B2 (en) | 2021-03-31 | 2023-12-19 | Moon Surgical Sas | Co-manipulation surgical system having an instrument centering mode for automatic scope movements |
US11819302B2 (en) | 2021-03-31 | 2023-11-21 | Moon Surgical Sas | Co-manipulation surgical system having user guided stage control |
US11832909B2 (en) | 2021-03-31 | 2023-12-05 | Moon Surgical Sas | Co-manipulation surgical system having actuatable setup joints |
US11812938B2 (en) | 2021-03-31 | 2023-11-14 | Moon Surgical Sas | Co-manipulation surgical system having a coupling mechanism removeably attachable to surgical instruments |
JP2024513204A (en) | 2021-03-31 | 2024-03-22 | Moon Surgical SAS | Cooperative surgical system for use with surgical instruments to perform laparoscopic surgery |
CN113925615A (en) * | 2021-10-26 | 2022-01-14 | 北京歌锐科技有限公司 | Minimally invasive surgery equipment and control method thereof |
US11717149B1 (en) | 2022-04-27 | 2023-08-08 | Maciej J. Kieturakis | Methods and systems for robotic single-port laparoscopic access |
CN115363751B (en) * | 2022-08-12 | 2023-05-16 | 华平祥晟(上海)医疗科技有限公司 | Intraoperative anatomical structure indication method |
US11986165B1 (en) | 2023-01-09 | 2024-05-21 | Moon Surgical Sas | Co-manipulation surgical system for use with surgical instruments for performing laparoscopic surgery while estimating hold force |
US11832910B1 (en) | 2023-01-09 | 2023-12-05 | Moon Surgical Sas | Co-manipulation surgical system having adaptive gravity compensation |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6241725B1 (en) * | 1993-12-15 | 2001-06-05 | Sherwood Services Ag | High frequency thermal ablation of cancerous tumors and functional targets with image data assistance |
CN101291635A (en) * | 2005-10-20 | 2008-10-22 | Intuitive Surgical, Inc. | Auxiliary image display and manipulation on a computer display in a medical robotic system |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5493595A (en) * | 1982-02-24 | 1996-02-20 | Schoolman Scientific Corp. | Stereoscopically displayed three dimensional medical imaging |
US5181514A (en) | 1991-05-21 | 1993-01-26 | Hewlett-Packard Company | Transducer positioning system |
US5279309A (en) * | 1991-06-13 | 1994-01-18 | International Business Machines Corporation | Signaling device and method for monitoring positions in a surgical operation |
US5417210A (en) * | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5182728A (en) * | 1991-06-28 | 1993-01-26 | Acoustic Imaging Technologies Corporation | Ultrasound imaging system and method |
US6963792B1 (en) * | 1992-01-21 | 2005-11-08 | Sri International | Surgical method |
US5361768A (en) * | 1992-06-30 | 1994-11-08 | Cardiovascular Imaging Systems, Inc. | Automated longitudinal position translator for ultrasonic imaging probes, and methods of using same |
US5397323A (en) * | 1992-10-30 | 1995-03-14 | International Business Machines Corporation | Remote center-of-motion robot for surgery |
US5788688A (en) * | 1992-11-05 | 1998-08-04 | Bauer Laboratories, Inc. | Surgeon's command and control |
EP0646263B1 (en) * | 1993-04-20 | 2000-05-31 | General Electric Company | Computer graphic and live video system for enhancing visualisation of body structures during surgery |
US5842473A (en) * | 1993-11-29 | 1998-12-01 | Life Imaging Systems | Three-dimensional imaging system |
US5765561A (en) * | 1994-10-07 | 1998-06-16 | Medical Media Systems | Video-based surgical targeting system |
JPH08111816A (en) * | 1994-10-11 | 1996-04-30 | Toshiba Corp | Medical image display device |
US5836880A (en) * | 1995-02-27 | 1998-11-17 | Micro Chemical, Inc. | Automated system for measuring internal tissue characteristics in feed animals |
US5797849A (en) * | 1995-03-28 | 1998-08-25 | Sonometrics Corporation | Method for carrying out a medical procedure using a three-dimensional tracking and imaging system |
US5817022A (en) * | 1995-03-28 | 1998-10-06 | Sonometrics Corporation | System for displaying a 2-D ultrasound image within a 3-D viewing environment |
JPH08275958A (en) * | 1995-04-07 | 1996-10-22 | Olympus Optical Co Ltd | Surgical manipulator device |
US5887121A (en) * | 1995-04-21 | 1999-03-23 | International Business Machines Corporation | Method of constrained Cartesian control of robotic mechanisms with active and passive joints |
US5551432A (en) * | 1995-06-19 | 1996-09-03 | New York Eye & Ear Infirmary | Scanning control system for ultrasound biomicroscopy |
US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
JPH09173352A (en) * | 1995-12-25 | 1997-07-08 | Toshiba Medical Eng Co Ltd | Medical navigation system |
US5797900A (en) | 1996-05-20 | 1998-08-25 | Intuitive Surgical, Inc. | Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity |
US6642836B1 (en) * | 1996-08-06 | 2003-11-04 | Computer Motion, Inc. | General purpose distributed operating room control system |
US5810008A (en) * | 1996-12-03 | 1998-09-22 | Isg Technologies Inc. | Apparatus and method for visualizing ultrasonic images |
US5853367A (en) * | 1997-03-17 | 1998-12-29 | General Electric Company | Task-interface and communications system and method for ultrasound imager control |
JPH10322629A (en) * | 1997-05-16 | 1998-12-04 | Canon Inc | Image pickup device, image pickup system and storage medium |
US6129670A (en) * | 1997-11-24 | 2000-10-10 | Burdette Medical Systems | Real time brachytherapy spatial registration and visualization system |
US5842993A (en) * | 1997-12-10 | 1998-12-01 | The Whitaker Corporation | Navigable ultrasonic imaging probe assembly |
JP3582348B2 (en) * | 1998-03-19 | 2004-10-27 | Hitachi, Ltd. | Surgical equipment |
US6950689B1 (en) * | 1998-08-03 | 2005-09-27 | Boston Scientific Scimed, Inc. | Dynamically alterable three-dimensional graphical model of a body region |
US6468265B1 (en) * | 1998-11-20 | 2002-10-22 | Intuitive Surgical, Inc. | Performing cardiac surgery without cardioplegia |
US6659939B2 (en) * | 1998-11-20 | 2003-12-09 | Intuitive Surgical, Inc. | Cooperative minimally invasive telesurgical system |
US6951535B2 (en) * | 2002-01-16 | 2005-10-04 | Intuitive Surgical, Inc. | Tele-medicine system that transmits an entire state of a subsystem |
US6799065B1 (en) * | 1998-12-08 | 2004-09-28 | Intuitive Surgical, Inc. | Image shifting apparatus and method for a telerobotic system |
US6522906B1 (en) * | 1998-12-08 | 2003-02-18 | Intuitive Surgical, Inc. | Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure |
US6602185B1 (en) * | 1999-02-18 | 2003-08-05 | Olympus Optical Co., Ltd. | Remote surgery support system |
JP2000300579A (en) * | 1999-04-26 | 2000-10-31 | Olympus Optical Co Ltd | Multifunctional manipulator |
US6312391B1 (en) * | 2000-02-16 | 2001-11-06 | Urologix, Inc. | Thermodynamic modeling of tissue treatment procedure |
US6599247B1 (en) * | 2000-07-07 | 2003-07-29 | University Of Pittsburgh | System and method for location-merging of real-time tomographic slice images with human vision |
US6862561B2 (en) * | 2001-05-29 | 2005-03-01 | Entelos, Inc. | Method and apparatus for computer modeling a joint |
US6887245B2 (en) * | 2001-06-11 | 2005-05-03 | Ge Medical Systems Global Technology Company, Llc | Surgical drill for use with a computer assisted surgery system |
US7831292B2 (en) * | 2002-03-06 | 2010-11-09 | Mako Surgical Corp. | Guidance system and method for surgical procedures with improved feedback |
WO2004014244A2 (en) * | 2002-08-13 | 2004-02-19 | Microbotics Corporation | Microsurgical robot system |
US20060161218A1 (en) * | 2003-11-26 | 2006-07-20 | Wicab, Inc. | Systems and methods for treating traumatic brain injury |
JP4377827B2 (en) * | 2004-03-30 | 2009-12-02 | Toshiba Corporation | Manipulator device |
US20080020362A1 (en) * | 2004-08-10 | 2008-01-24 | Cotin Stephane M | Methods and Apparatus for Simulation of Endovascular and Endoluminal Procedures |
JP2006055273A (en) * | 2004-08-18 | 2006-03-02 | Olympus Corp | Surgery support system |
US7396129B2 (en) * | 2004-11-22 | 2008-07-08 | Carestream Health, Inc. | Diagnostic system having gaze tracking |
JP2006320427A (en) * | 2005-05-17 | 2006-11-30 | Hitachi Medical Corp | Endoscopic operation support system |
JP2006321027A (en) * | 2005-05-20 | 2006-11-30 | Hitachi Ltd | Master slave type manipulator system and its operation input device |
JP4398405B2 (en) * | 2005-05-30 | 2010-01-13 | Aloka Co., Ltd. | Medical system |
CN101193603B (en) * | 2005-06-06 | 2010-11-03 | Intuitive Surgical, Inc. | Laparoscopic ultrasound robotic surgical system |
- 2006
  - 2006-10-19 CN CN201310052693.4A patent/CN103142309B/en active Active
  - 2006-10-19 WO PCT/US2006/040754 patent/WO2007047782A2/en active Application Filing
  - 2006-10-19 KR KR1020087006736A patent/KR101320379B1/en active IP Right Grant
  - 2006-10-19 CN CN201310052673.7A patent/CN103251455B/en active Active
  - 2006-10-19 EP EP06817132.1A patent/EP1937176B1/en active Active
  - 2006-10-19 EP EP16195634.7A patent/EP3162318B1/en active Active
  - 2006-10-19 CN CN2006800389441A patent/CN101291635B/en active Active
  - 2006-10-19 EP EP16195633.9A patent/EP3155998B1/en active Active
  - 2006-10-19 US US11/583,963 patent/US20080033240A1/en not_active Abandoned
  - 2006-10-19 JP JP2008536776A patent/JP5322648B2/en active Active
  - 2006-10-19 EP EP19164568.8A patent/EP3524202A1/en active Pending
- 2011
  - 2011-12-20 JP JP2011278962A patent/JP5467615B2/en active Active
  - 2011-12-20 JP JP2011278963A patent/JP5276706B2/en active Active
- 2013
  - 2013-05-01 JP JP2013096250A patent/JP5639223B2/en active Active
- 2016
  - 2016-04-27 US US15/139,682 patent/US20160235496A1/en not_active Abandoned
- 2019
  - 2019-09-09 US US16/564,734 patent/US11197731B2/en active Active
- 2021
  - 2021-11-18 US US17/530,166 patent/US20220071721A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2012061336A (en) | 2012-03-29 |
US11197731B2 (en) | 2021-12-14 |
CN103142309A (en) | 2013-06-12 |
JP5322648B2 (en) | 2013-10-23 |
EP3162318B1 (en) | 2019-10-16 |
KR101320379B1 (en) | 2013-10-22 |
EP3155998A1 (en) | 2017-04-19 |
CN101291635A (en) | 2008-10-22 |
JP2009512514A (en) | 2009-03-26 |
JP5467615B2 (en) | 2014-04-09 |
EP3162318A3 (en) | 2017-08-09 |
EP3162318A2 (en) | 2017-05-03 |
JP5639223B2 (en) | 2014-12-10 |
CN103251455A (en) | 2013-08-21 |
US20160235496A1 (en) | 2016-08-18 |
US20080033240A1 (en) | 2008-02-07 |
EP3524202A1 (en) | 2019-08-14 |
US20190388169A1 (en) | 2019-12-26 |
CN103142309B (en) | 2015-06-17 |
JP2012055752A (en) | 2012-03-22 |
KR20080068640A (en) | 2008-07-23 |
EP1937176A2 (en) | 2008-07-02 |
EP3155998B1 (en) | 2021-03-31 |
US20220071721A1 (en) | 2022-03-10 |
WO2007047782A3 (en) | 2007-09-13 |
JP5276706B2 (en) | 2013-08-28 |
WO2007047782A2 (en) | 2007-04-26 |
JP2013150873A (en) | 2013-08-08 |
CN101291635B (en) | 2013-03-27 |
EP1937176B1 (en) | 2019-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103251455B (en) | Auxiliary image display and manipulation on a computer display in a medical robotic system | |
JP6138227B2 (en) | Laparoscopic ultrasonic robotic surgical system | |
JP2009512514A5 (en) | ||
US20200268440A1 (en) | Automated ablation control systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |