US20120089014A1 - Method and apparatus for tracking in a medical procedure - Google Patents

Info

Publication number
US20120089014A1
US20120089014A1 (U.S. application Ser. No. 13/378,175)
Authority
US
United States
Prior art keywords
anatomy
orientation
image
medical device
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/378,175
Inventor
Joerg Sabczynski
Heinrich Schulz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to US13/378,175
Assigned to KONINKLIJKE PHILIPS ELECTRONICS N V reassignment KONINKLIJKE PHILIPS ELECTRONICS N V ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SABCZYNSKI, JOERG, SCHULZ, HEINRICH
Publication of US20120089014A1
Legal status: Abandoned

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/06Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/06Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
    • A61B5/065Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B5/067Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe using accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/52Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/267Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes
    • A61B1/2676Bronchoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • A61B5/0013Medical image data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/44Constructional features of apparatus for radiation diagnosis
    • A61B6/4429Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters

Definitions

  • the present application relates to the therapeutic arts, in particular to tracking for medical procedures, and will be described with particular reference thereto.
  • the medical device is delivered solely on the basis of this imaging data, and confirmation of the final position relative to the target may even require a second set of images to be acquired.
  • where cameras are utilized in the device for visually presenting the path of the device, it is unclear if the correct path is being followed, such as where the device has twisted during movement.
  • Bronchoscopy is a method to view the interior of the bronchi.
  • a flexible fiber optic device, the bronchoscope, a special kind of endoscope, is introduced through the mouth or nostril of the patient into the airway system. It allows the pulmonologist to see the inside of the trachea, the main bronchi, and the bigger of the small bronchi.
  • bronchoscopes have a working channel, through which small surgical instruments can be brought to the tip of the bronchoscope.
  • Lung lesions can be detected on CT scans.
  • a tissue sample must often be investigated. Although it is possible to take the tissue sample with a needle from the outside, this method has certain problems. With the help of a bronchoscope, it is possible to circumvent these problems.
  • Transbronchial endoscopic biopsy of lung lesions is a surgical technique to collect lung tissue via the bronchoscope. A small forceps or biopsy needle is used through the working channel to get lung tissue from behind the bronchial wall.
  • a method of tracking in a medical procedure can include receiving acceleration data from an accelerometer that is integrally connected to a medical device, where the acceleration data is received at a remote processor, and where the medical device is moved through an anatomy of a patient towards a target region; and determining an orientation of the medical device with respect to the anatomy based on the acceleration data.
  • a computer-readable storage medium can include computer-executable code stored therein, where the computer-executable code is configured to cause a computing device, in which the computer-readable storage medium is provided, to: receive orientation data from an orientation sensor that is integrally connected to a medical device, where the orientation data is received at a remote processor, and where the medical device is being moved through an anatomy of a patient towards a target region; determine an orientation of the medical device with respect to the anatomy based on the orientation data; capture real-time images of the anatomy using the medical device; and present the captured images and the orientation of the medical device with respect to the anatomy on a display device operably coupled to the processor.
  • an endoscope can include a body having a distal end and at least one channel formed therein, where the body is adapted for insertion through an anatomy to reach a target area; an accelerometer connected to the body and positioned in proximity to the distal end; an imaging device operably coupled with the body; and a light source operably coupled with the body, where the accelerometer is in communication with a remote processor for transmitting acceleration data thereto, where the imaging device is in communication with the remote processor for transmitting real-time images thereto, and where an orientation of the medical device with respect to the anatomy is determined by the processor based on the acceleration data.
  • the exemplary embodiments described herein can have a number of advantages over contemporary systems and processes, including accuracy of surgical device placement and reduction of procedure time by allowing the correct path of the medical device to be more quickly determined. Additionally, the system and method described herein can be utilized through retrofitting existing surgical devices. Still further advantages and benefits will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description.
  • FIG. 1 is a schematic illustration of a tracking system according to one exemplary embodiment for use in a medical procedure;
  • FIG. 2 is a schematic illustration of a surgical device for use with the tracking system of FIG. 1;
  • FIG. 3 is a schematic illustration of another surgical device for use with the tracking system of FIG. 1;
  • FIG. 4 is a schematic illustration of a patient with a target anatomy;
  • FIG. 5 is an image of a bronchus of a patient captured using the surgical device of FIG. 2 or 3;
  • FIG. 6 is a method that can be used by the system and devices of FIGS. 1-3 for performing tracking during a medical procedure;
  • FIG. 7 is a schematic illustration of signal flow between the surgical device and the work station.
  • the exemplary embodiments of the present disclosure are described with respect to a tracking system for a bronchoscope to be utilized during a procedure for a human. It should be understood by one of ordinary skill in the art that the exemplary embodiments of the present disclosure can be applied to, and utilized with, various types of medical or surgical devices (including other endoscopes or catheters), various types of procedures, and various portions of the body, whether human or animal.
  • the exemplary embodiments can also be used for tracking of a surgical device that utilizes other types of imaging in combination with or in place of a camera, such as ultrasound imaging from an ultrasound device positioned in the surgical device that enters the body.
  • the exemplary embodiments are described herein as using accelerometer tracking in combination with imaging.
  • the use of the method and system of the exemplary embodiments of the present disclosure can be adapted for application to other types of tracking in a target anatomy and can utilize other types of orientation sensing sensors including magnetometers.
  • a tracking system 100 which can have a surgical device 180 , such as a bronchoscope, with an accelerometer 185 connected thereto.
  • the accelerometer 185 can be positioned along or in proximity to the tip or distal end of the surgical device 180 . While the exemplary embodiment shows a single accelerometer 185 , the present disclosure contemplates the use of any number of accelerometers that can be in various configurations along the surgical device 180 .
  • the surgical device 180 can be utilized in a target anatomy 105 of a patient who can be supported by a support structure 170 .
  • the accelerometer 185 can be a measurement device capable of detecting acceleration of the tip of the surgical device 180 so that orientation information can be generated with respect to a current orientation of the tip.
  • Accelerometer 185 can be of various types, including piezoelectric, MEMS, thermal (submicrometre CMOS process), bulk micromachined capacitive, bulk micromachined piezoresistive, capacitive spring-mass based, electromechanical servo, null-balance, strain gauge, resonance, magnetic induction, optical, surface acoustic wave, DC response, modally tuned impact, seat pad, PIGA, and so forth.
  • 3-axis accelerometers can be utilized which measure not only the strength of the acceleration, but also its direction.
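Reducing a 3-axis sample to the strength and direction of the acceleration can be sketched as follows; the function name and the assumption that the tip is at rest (so the sample is dominated by gravity) are illustrative, not taken from the patent:

```python
import math

def gravity_direction(ax, ay, az):
    """Split one 3-axis accelerometer sample into its magnitude and a
    unit direction vector. With the device tip at rest, the sample is
    dominated by gravity, so the direction doubles as the local 'down'
    in the device frame. Units (e.g., m/s^2) follow the inputs."""
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag == 0.0:
        raise ValueError("zero-magnitude sample")
    return mag, (ax / mag, ay / mag, az / mag)
```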
  • the accelerometer 185 can be operably connected to a processor 120 that receives the orientation data therefrom.
  • the operable coupling can be through a hardwire, such as line 186 , and/or can be a wireless link between the accelerometer 185 and the processor 120 .
  • the orientation data can be raw data, such as a change in voltage, that is measured and transmitted to the processor 120 .
  • the accelerometer 185 can convert the raw data to direction information prior to transmission of the orientation data to the processor 120 .
  • System 100 depicts the orientation data being provided directly to the processor 120 .
  • the present disclosure contemplates the accelerometer 185 providing the orientation data to an orientation acquisition unit (not shown) which can process the data and then provide it to the processor 120.
  • tracking system 100 can be used with, or can include, an imaging modality 150 , such as a high resolution imaging modality, including an x-ray scanner 155 .
  • a high resolution image of the target anatomy 105 can be generated by the scanner 155 and stored in an image memory.
  • the image memory can be incorporated into processor 120 and/or can be a separate storage and/or processing device.
  • a C-arm x-ray scanning device 155 is shown in FIG. 1 for illustrative purposes, but the present disclosure contemplates the use of various imaging devices, including an open MRI, CT, and so forth.
  • the present disclosure contemplates the use of various imaging modalities, alone or in combination, including MRI, ultrasound, X-ray CT, and so forth.
  • the present disclosure also contemplates the imaging modality 150 being a separate system that is relied upon for gathering of images, including pre-operative and/or intra-operative images.
  • the surgical device 180 can include one or more channels 292 formed through a body 281 of the device (e.g., a bronchoscope), such as working channels for providing the clinician with access to the target anatomy and suction channels.
  • the body 281 can be made from various flexible materials.
  • the device 180 can include the accelerometer 185 positioned along or in proximity to the tip 290 of the device, including being embedded in a wall of the device or connected to the outside of the device.
  • the device 180 can also include a camera or imaging device 295 and a light source 297 .
  • the light source 297 can have a self-contained power source and/or can be connected to an external power source, such as through use of line 186 (in FIG. 1 ).
  • the light source 297 can be operably connected to the processor 120 for adjustment of the level of emitted light or other control to be exerted over the light source.
  • the tip of the surgical device 180 can be provided with light by way of fiber optics from an external light generating device.
  • the camera 295 can be operably connected to a processor 120 that receives the imaging data therefrom.
  • the operable coupling can be through a hardwire, such as the line 186 , and/or can be a wireless link between the camera 295 and the processor 120 .
  • the imaging data can be raw data that is captured by the camera 295 and transmitted directly to the processor 120 .
  • the camera 295 can convert the raw data to video information prior to transmission of the imaging data to the processor 120.
  • the processor 120 can present the imaging data as a video in real time so that the clinician can see the path that the surgical device 180 is traveling along.
  • the surgical device 180 can travel down through the trachea 410 and through the bronchi 420 in order to reach a tumor or other target area or region 430 .
  • the bifurcated structure of the bronchi requires that the clinician select among different paths as the surgical device 180 is being moved during the procedure.
  • the accelerometer 185 measures gravity only. Based on this measurement, it is possible for the processor 120 to determine the up-direction at the tip 290 of the device 180 and relate it to the image (e.g., a CT scan) of the device captured by imaging modality 150 . Since it is known how the patient is placed during the procedure, such as a bronchoscopy, it is possible to relate the bronchoscopy image to the CT scan. At a given bifurcation, it is possible to determine which branch to follow in order to reach the target using the acceleration data.
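Determining the up-direction at the tip from the gravity measurement can be sketched as follows, under the assumption (not stated in the patent) that the camera looks along the device z-axis with image x to the right and image y pointing down; the function name is illustrative:

```python
import math

def image_up_angle(gx, gy, gz):
    """Angle (radians) of the 'up' direction within the camera image
    plane. Gravity (gx, gy, gz) is measured in the device frame; 'up'
    is the opposite of gravity. The angle is measured from the image's
    own up direction (-y), positive clockwise."""
    ux, uy = -gx, -gy  # project the up-direction onto the image plane
    if ux == 0.0 and uy == 0.0:
        raise ValueError("gravity parallel to the view axis; roll undefined")
    return math.atan2(ux, -uy)
```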
  • bifurcations visible in the bronchoscopy image can be detected automatically by means of image processing performed by processor 120 . It can be further detected whether the bronchoscope 180 is moved into or out of the bronchi. Together with the information from the accelerometer 185 , this combined information can be used to detect the position of the bronchoscope in the bronchial tree. The combination of information from the accelerometer 185 and from image analysis performed by processor 120 facilitates navigation through to the target anatomy.
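The patent does not spell out the image processing for detecting bifurcations; one crude, illustrative approach is to count connected dark regions (airway lumens) in the camera frame and treat two or more as a bifurcation cue. A sketch with an assumed intensity threshold and 4-connectivity:

```python
def count_lumens(img, threshold=40):
    """Count connected dark regions (candidate airway lumens) in a 2-D
    grayscale image given as a list of rows of pixel intensities.
    Two or more dark regions in view is a crude cue that a bifurcation
    is visible. Threshold and 4-connectivity are illustrative choices."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = 0
    for sy in range(h):
        for sx in range(w):
            if img[sy][sx] < threshold and not seen[sy][sx]:
                regions += 1
                stack = [(sy, sx)]          # flood-fill this dark region
                seen[sy][sx] = True
                while stack:
                    y, x = stack.pop()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and not seen[ny][nx] and img[ny][nx] < threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return regions
```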
  • a bifurcation indicator can be presented to indicate the orientation of the bronchoscope with respect to the target anatomy. For example, an arrow or the like can be presented that shows which direction is up or which direction is down with respect to a vertical plane.
  • a computer-generated view can be rendered from the position and with the orientation of the actual bronchoscope, so-called "virtual bronchoscopy." This view can be shown side by side with the real image in order to allow for user orientation.
  • the planned path may be marked, e.g., with a cross.
  • Device 380 can include additional channels 392 that allow for positioning one or more of the accelerometer 385 , the camera 395 and the light source 397 at or near the tip 390 of the device. These components can be operably coupled to the processor 120 through use of a hardwire and/or wireless link. Once the device 380 reaches its target, one or more of these components can be removed through the channels 392 . For example, the accelerometer 385 can be slid through the channel 392 and positioned therein during movement of the device 380 . In one embodiment, existing bronchoscopes can be utilized with one or more of the components of device 180 .
  • the accelerometer can be positioned into the existing working channel or attached to the tip of the bronchoscope at the outside.
  • the optical system and lighting components can be fixed in the surgical device 180 .
  • the accelerometer 385 can be slid back out through the channel 392 so that the channel can be utilized for other purposes, such as a suction channel or a working channel. In this embodiment, fewer channels may thus be formed through the device 180 .
  • a method 600 of tracking a surgical device, such as a bronchoscope, can begin with obtaining an image (e.g., a CT image) of the target anatomy. The image can be a pre-operative image and/or intra-operative image.
  • the bronchoscope can be moved through the bronchi where the clinician is viewing the captured real-time video from the camera positioned in the bronchoscope.
  • the clinician may come upon a bifurcation in the path.
  • the correct path to proceed along can be determined using the orientation data received from the accelerometer in step 610 .
  • These steps can be repeated until the target is reached in step 612 .
  • the image can be adjusted so that the position of the bronchoscope and/or the orientation of the bronchoscope is shown therein, such as through using the acceleration data.
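Selecting the correct branch at a bifurcation from the orientation data can be sketched as follows; the angle conventions, the roll-only correction, and the function name are assumptions for illustration, not the patent's stated algorithm:

```python
import math

def choose_branch(branch_angles, planned_angle, roll):
    """Pick which visible branch to follow at a bifurcation.
    branch_angles: angles (radians) of the branch openings in the raw
    camera image; roll: camera roll derived from the accelerometer;
    planned_angle: direction of the planned branch in patient ('up is
    up') coordinates. Returns the index of the best-matching branch."""
    best, best_err = None, None
    for i, a in enumerate(branch_angles):
        corrected = a - roll  # rotate the image angle into the patient frame
        # Wrapped angular distance to the planned direction.
        err = abs(math.atan2(math.sin(corrected - planned_angle),
                             math.cos(corrected - planned_angle)))
        if best_err is None or err < best_err:
            best, best_err = i, err
    return best
```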
  • System 100 allows the data from the accelerometer 185 to be transferred to the processor 120 .
  • this data can be transmitted along a light guide bundle, which is being utilized for an optical camera operably coupled to the bronchoscope.
  • the processor 120 can receive the orientation data from the accelerometer as well as the bronchoscope image from the video processor for analysis.
  • the processor 120 can analyze and track which bifurcation of the bronchial tree is currently being seen.
  • the processor 120 can also be connected to the facility network in order to receive the pre-interventional CT scan and the corresponding path planning data.
  • the directional information calculated by the processor 120 can be transferred to the video-processor, where it is combined with the original bronchoscope image data, and then presented on the monitor 130 .
  • the signal flow can include acceleration data 750 from the accelerometer 185 to the processor 120; the bronchoscope imaging (e.g., real-time video) 725 from the camera 295 to a video-processor 721; and light 775 from a light source 797 to the light source 297 (connected to the bronchoscope).
  • the bronchoscope image being presented on the display 130 can be automatically rotated to depict the up-direction based on the orientation data from the accelerometer.
  • the image processing methods can be used to determine if the bronchoscope is moving in or out of the bronchi.
  • other types of orientation sensors can be utilized for capturing the orientation data.
  • a magnetometer can be used to determine the direction associated with the tip of the bronchoscope and the bifurcated paths based on use of an external magnetic field, including the earth's magnetic field and/or an artificial field.
  • System 100 can be used for bronchoscopic navigation, particularly transbronchial lung biopsies. The system 100 can also be used in other applications, such as a colonoscopy.
  • calibration of the direction can be performed, where the navigation of the bronchoscope starts by assuming that the patient is in a known position and orientation.
  • the pre-operative CT dataset is thus oriented according to the direction measured by the accelerometer.
  • the directions into both bifurcated bronchi are determined with the help of image analysis. These directions are compared to the expected direction, based on the accelerometer measurement and the assumed patient orientation, and the deviation in orientation is calculated.
  • the assumed patient orientation is corrected by this deviation and for the next bifurcation a better assumption on the patient orientation is used. This procedure can be repeated at the next bifurcation.
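One step of this calibration, reduced to a one-dimensional (roll-only) toy model, could look like the following; the names and the simplification to a single angle are illustrative only:

```python
import math

def update_assumed_roll(assumed_roll, branch_dir_image, branch_dir_planned):
    """One calibration step at a bifurcation. The branch direction found
    by image analysis (branch_dir_image, already referred to the
    accelerometer up-direction) is compared with the direction the plan
    predicts under the currently assumed patient roll; the deviation is
    folded back into the assumption, giving a better assumption for the
    next bifurcation. All angles in radians."""
    expected = branch_dir_planned + assumed_roll
    deviation = math.atan2(math.sin(branch_dir_image - expected),
                           math.cos(branch_dir_image - expected))
    return assumed_roll + deviation
```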
  • a preoperative CT of the lung can be obtained prior to the bronchoscopy.
  • This CT can be analyzed as follows: In the CT image, the position of the lesion of interest, e.g. a lung nodule or tumor, can be determined. This can be done manually by clicking at the right position in the right slice.
  • the bronchial tree can be extracted (segmented) from the CT image with the help of suitable image processing methods.
  • the path from the trachea into the bronchial tree to the lesion can be planned. This can be done manually, but automatic methods are also conceivable. Bifurcations along the path can be detected. With this planning step, there is enough information available for the intra-operative guidance as described above.
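One automatic planning method, sketched as a breadth-first search over the segmented bronchial tree; the adjacency-dict representation and the node names are assumptions for illustration:

```python
from collections import deque

def plan_path(tree, start, target):
    """Branch sequence from 'start' (trachea) to 'target' (the airway
    nearest the lesion) in a bronchial tree given as an adjacency dict
    {node: list of child branches}. Nodes on the path with more than
    one onward branch are the bifurcations the guidance must handle.
    Returns (path, bifurcations), or (None, []) if unreachable."""
    prev = {start: None}
    q = deque([start])
    while q:
        node = q.popleft()
        if node == target:
            break
        for nxt in tree.get(node, ()):
            if nxt not in prev:
                prev[nxt] = node
                q.append(nxt)
    if target not in prev:
        return None, []
    path = []
    node = target
    while node is not None:          # backtrack from target to start
        path.append(node)
        node = prev[node]
    path.reverse()
    bifurcations = [n for n in path[:-1] if len(tree.get(n, ())) > 1]
    return path, bifurcations
```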
  • the accelerometer can be supported by a magnetometer, which measures the direction of the magnetic field. This is not collinear with the gravitation field except at the magnetic poles.
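Combining the two sensors into a full orientation can be sketched with cross products; this is a standard construction rather than the patent's stated method, and it fails exactly where gravity and the magnetic field are collinear, i.e. near the magnetic poles, as the text notes:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    if n == 0.0:
        raise ValueError("degenerate vector")
    return (v[0] / n, v[1] / n, v[2] / n)

def device_axes(gravity, magnetic):
    """Orthonormal down/east/north directions expressed in the device
    frame, from a gravity direction (accelerometer) and a magnetic
    field sample (magnetometer). Raises if the two are collinear."""
    down = normalize(gravity)
    east = normalize(cross(down, magnetic))   # horizontal, perpendicular to m
    north = cross(east, down)                 # completes the right-handed frame
    return down, east, north
```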
  • the invention can be realized in hardware, software, or a combination of hardware and software.
  • the invention can be realized in a centralized fashion in one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
  • a typical combination of hardware and software can be a general purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • the invention can be embedded in a computer program product.
  • the computer program product can comprise a computer-readable storage medium in which is embedded a computer program comprising computer-executable code for directing a computing device or computer-based system to perform the various procedures, processes and methods described herein.
  • Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.

Abstract

A tracking system for a target anatomy of a patient can include a medical device having a body (281) having a distal end (290) and at least one channel (292) formed therein, where the body is adapted for insertion through an anatomy (105) to reach a target area (430); an accelerometer (185) connected to the body and positioned in proximity to the distal end; an imaging device (295) operably coupled with the body; and a light source (297) operably coupled with the body, where the accelerometer is in communication with a remote processor (120) for transmitting acceleration data thereto, where the imaging device is in communication with the remote processor for transmitting real-time images thereto, and where an orientation of the medical device with respect to the anatomy is determined by the processor based on the acceleration data.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. provisional application Ser. No. 61/221,138, filed Jun. 29, 2009, (Applicant's docket no. PH013137US1) which is incorporated herein by reference. Related application is Ser. No. 61/221,150, “Method and System for Position Determination,” filed Jun. 29, 2009, (Applicant's docket no. PH013333US1).
  • The present application relates to the therapeutic arts, in particular to tracking for medical procedures, and will be described with particular reference thereto.
  • Various techniques and systems have been proposed to improve the accuracy of instrument placement (e.g., catheter placement) into the body, such as placement based on measurements from 3D imaging formats. These imaging formats attempt to locate the entry device in relation to therapy-targeted areas, such as MRI-detected target tissue. These imaging formats generate imaging data that are used to determine the appropriate positioning of the device during treatment.
  • In many cases, the medical device is delivered solely on the basis of this imaging data, and confirmation of the final position relative to the target may even require a second set of images to be acquired. In some cases where cameras are utilized in the device for visually presenting the path of the device, it is unclear if the correct path is being followed, such as where the device has twisted during movement.
  • Bronchoscopy is a method to view the interior of the bronchi. A flexible fiber optic device, the bronchoscope, a special kind of endoscope, is introduced through the mouth or nostril of the patient into the airway system. It allows the pulmonologist to see the inside of the trachea, the main bronchi, and the larger of the small bronchi. Usually, bronchoscopes have a working channel, through which small surgical instruments can be brought to the tip of the bronchoscope.
  • Lung lesions can be detected on CT scans. In order to come to a reliable diagnosis, a tissue sample must often be investigated. Although it is possible to take the tissue sample with a needle from the outside, this method has certain problems. With the help of a bronchoscope, it is possible to circumvent these problems. Transbronchial endoscopic biopsy of lung lesions is a surgical technique to collect lung tissue via the bronchoscope. A small forceps or biopsy needle is used through the working channel to get lung tissue from behind the bronchial wall.
  • This Summary is provided to comply with 37 C.F.R. §1.73, which requires a summary of the invention briefly indicating the nature and substance of the invention. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
  • In accordance with one aspect of the exemplary embodiments, a method of tracking in a medical procedure can include receiving acceleration data from an accelerometer that is integrally connected to a medical device, where the acceleration data is received at a remote processor, and where the medical device is moved through an anatomy of a patient towards a target region; and determining an orientation of the medical device with respect to the anatomy based on the acceleration data.
  • In accordance with another aspect of the exemplary embodiments, a computer-readable storage medium can include computer-executable code stored therein, where the computer-executable code is configured to cause a computing device, in which the computer-readable storage medium is provided, to: receive orientation data from an orientation sensor that is integrally connected to a medical device, where the orientation data is received at a remote processor, and where the medical device is being moved through an anatomy of a patient towards a target region; determine an orientation of the medical device with respect to the anatomy based on the orientation data; capture real-time images of the anatomy using the medical device; and present the captured images and the orientation of the medical device with respect to the anatomy on a display device operably coupled to the processor.
  • In accordance with another aspect of the exemplary embodiments, an endoscope is provided that can include a body having a distal end and at least one channel formed therein, where the body is adapted for insertion through an anatomy to reach a target area; an accelerometer connected to the body and positioned in proximity to the distal end; an imaging device operably coupled with the body; and a light source operably coupled with the body, where the accelerometer is in communication with a remote processor for transmitting acceleration data thereto, where the imaging device is in communication with the remote processor for transmitting real-time images thereto, and where an orientation of the medical device with respect to the anatomy is determined by the processor based on the acceleration data.
  • The exemplary embodiments described herein can have a number of advantages over contemporary systems and processes, including improved accuracy of surgical device placement and reduced procedure time by allowing the correct path of the medical device to be determined more quickly. Additionally, the system and method described herein can be utilized by retrofitting existing surgical devices. Still further advantages and benefits will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description.
  • The above-described and other features and advantages of the present disclosure will be appreciated and understood by those skilled in the art from the following detailed description, drawings, and appended claims.
  • FIG. 1 is a schematic illustration of a tracking system according to one exemplary embodiment for use in a medical procedure;
  • FIG. 2 is a schematic illustration of a surgical device for use with the tracking system of FIG. 1;
  • FIG. 3 is a schematic illustration of another surgical device for use with the tracking system of FIG. 1;
  • FIG. 4 is a schematic illustration of a patient with a target anatomy; and
  • FIG. 5 is an image of a bronchus of a patient captured using the surgical device of FIG. 2 or 3;
  • FIG. 6 is a method that can be used by the system and devices of FIGS. 1-3 for performing tracking during a medical procedure; and
  • FIG. 7 is a schematic illustration of signal flow between the surgical device and the work station.
  • The exemplary embodiments of the present disclosure are described with respect to a tracking system for a bronchoscope to be utilized during a procedure for a human. It should be understood by one of ordinary skill in the art that the exemplary embodiments of the present disclosure can be applied to, and utilized with, various types of medical or surgical devices (including other endoscopes or catheters), various types of procedures, and various portions of the body, whether human or animal. The exemplary embodiments can also be used for tracking of a surgical device that utilizes other types of imaging in combination with or in place of a camera, such as ultrasound imaging from an ultrasound device positioned in the surgical device that enters the body. The exemplary embodiments are described herein as using accelerometer tracking in combination with imaging. The method and system of the exemplary embodiments of the present disclosure can be adapted for application to other types of tracking in a target anatomy and can utilize other types of orientation sensors, including magnetometers.
  • Referring to FIG. 1, a tracking system 100 is shown which can have a surgical device 180, such as a bronchoscope, with an accelerometer 185 connected thereto. The accelerometer 185 can be positioned along or in proximity to the tip or distal end of the surgical device 180. While the exemplary embodiment shows a single accelerometer 185, the present disclosure contemplates the use of any number of accelerometers that can be in various configurations along the surgical device 180. The surgical device 180 can be utilized in a target anatomy 105 of a patient who can be supported by a support structure 170.
  • The accelerometer 185 can be a measurement device capable of detecting acceleration of the tip of the surgical device 180 so that orientation information can be generated with respect to a current orientation of the tip. Accelerometer 185 can be of various types, including piezoelectric, MEMS, thermal (submicrometre CMOS process), bulk micromachined capacitive, bulk micromachined piezoresistive, capacitive spring mass base, electromechanical servo, null-balance, strain gauge, resonance, magnetic induction, optical, surface acoustic wave, DC response, modally tuned impact, seat pad, PIGA, and so forth. In one embodiment, 3-axis accelerometers can be utilized, which measure not only the magnitude of the acceleration but also its direction.
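As an illustration of how a static 3-axis reading constrains orientation, the sketch below recovers pitch and roll from the measured gravity vector. The function name and axis convention (z along the sensor normal, readings in units of g) are assumptions for illustration, not part of the application.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static 3-axis
    accelerometer reading (ax, ay, az): with the device at rest,
    the sensed specific force is gravity alone, so the tilt of the
    sensor axes relative to vertical can be recovered directly."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Sensor at rest with its z-axis pointing straight up: gravity reads
# approximately (0, 0, 1) g, giving zero pitch and zero roll.
pitch, roll = tilt_from_accel(0.0, 0.0, 1.0)
```

Note that yaw (rotation about the vertical) is unobservable from gravity alone, which is why the disclosure also contemplates magnetometers.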
  • The accelerometer 185 can be operably connected to a processor 120 that receives the orientation data therefrom. The operable coupling can be through a hardwire, such as line 186, and/or can be a wireless link between the accelerometer 185 and the processor 120. In one embodiment, the orientation data can be raw data, such as a change in voltage, that is measured and transmitted to the processor 120. In another embodiment, the accelerometer 185 can convert the raw data to direction information prior to transmission of the orientation data to the processor 120.
  • System 100 depicts the orientation data being provided directly to the processor 120. However, the present disclosure contemplates the accelerometer 185 providing the orientation data to an orientation acquisition unit (not shown) which can process the data and then provide it to the processor 120.
  • In one embodiment, tracking system 100 can be used with, or can include, an imaging modality 150, such as a high resolution imaging modality, including an x-ray scanner 155. For example, a high resolution image of the target anatomy 105 can be generated by the scanner 155 and stored in an image memory. The image memory can be incorporated into processor 120 and/or can be a separate storage and/or processing device. A C-arm x-ray scanning device 155 is shown in FIG. 1 for illustrative purposes, but the present disclosure contemplates the use of various imaging modalities, alone or in combination, including open MRI, X-ray CT, ultrasound, and so forth. The present disclosure also contemplates the imaging modality 150 being a separate system that is relied upon for gathering of images, including pre-operative and/or intra-operative images.
  • Referring additionally to FIG. 2, the surgical device 180 can include one or more channels 292 formed through a body 281 of the device (e.g., a bronchoscope), such as working channels for providing the clinician with access to the target anatomy and suction channels. The body 281 can be made from various flexible materials. The device 180 can include the accelerometer 185 positioned along or in proximity to the tip 290 of the device, including being embedded in a wall of the device or connected to the outside of the device. The device 180 can also include a camera or imaging device 295 and a light source 297. The light source 297 can have a self-contained power source and/or can be connected to an external power source, such as through use of line 186 (in FIG. 1). In one embodiment, the light source 297 can be operably connected to the processor 120 for adjustment of the level of emitted light or other control to be exerted over the light source. In another embodiment, the tip of the surgical device 180 can be provided with light by way of fiber optics from an external light generating device.
  • The camera 295 can be operably connected to a processor 120 that receives the imaging data therefrom. The operable coupling can be through a hardwire, such as the line 186, and/or can be a wireless link between the camera 295 and the processor 120. In one embodiment, the imaging data can be raw data that is captured by the camera 295 and transmitted directly to the processor 120. In another embodiment, the camera 295 can convert the raw data to video information prior to transmission of the imaging to the processor 120. The processor 120 can present the imaging data as a video in real time so that the clinician can see the path that the surgical device 180 is traveling along.
  • Referring additionally to FIGS. 4 and 5, the surgical device 180 can travel down through the trachea 410 and through the bronchi 420 in order to reach a tumor or other target area or region 430. As shown in FIG. 5, the bifurcated structure of the bronchi requires that the clinician select among different paths as the surgical device 180 is being moved during the procedure.
  • When the surgical device 180 is not moving, the accelerometer 185 measures gravity only. Based on this measurement, it is possible for the processor 120 to determine the up-direction at the tip 290 of the device 180 and relate it to the image (e.g., a CT scan) of the device captured by imaging modality 150. Since it is known how the patient is placed during the procedure, such as a bronchoscopy, it is possible to relate the bronchoscopy image to the CT scan. At a given bifurcation, it is possible to determine which branch to follow in order to reach the target using the acceleration data.
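One way to sketch the up-direction determination described above: with the scope at rest, project the measured gravity vector into the camera's image plane and take the opposite direction as the on-screen "up" arrow. The axis convention (image x to the right, image y downward) and function name are assumptions for illustration.

```python
import math

def up_arrow_angle(gx, gy):
    """Angle (radians, measured from the image +x axis) of the on-screen
    'up' arrow, given the gravity components (gx, gy) sensed along the
    camera's image axes (x to the right, y downward). 'Up' is simply the
    direction opposite the projected gravity vector."""
    return math.atan2(-gy, -gx)

# Gravity projecting straight down the image (+y): the up arrow points
# along -y, i.e. toward the top of the displayed frame.
angle = up_arrow_angle(0.0, 1.0)
```

Relating this image-frame up-direction to the CT coordinate frame (where the patient's pose is known) is what lets the processor decide which branch at a bifurcation corresponds to the planned path.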
  • In one embodiment, bifurcations visible in the bronchoscopy image can be detected automatically by means of image processing performed by processor 120. It can be further detected whether the bronchoscope 180 is moved into or out of the bronchi. Together with the information from the accelerometer 185, this combined information can be used to detect the position of the bronchoscope in the bronchial tree. The combination of information from the accelerometer 185 and from image analysis performed by processor 120 facilitates navigation through to the target anatomy.
  • In another embodiment, a bifurcation indicator can be presented to indicate the orientation of the bronchoscope with respect to the target anatomy. For example, an arrow or the like can be presented that shows which direction is up or which direction is down with respect to a vertical plane. In another embodiment, based on the CT scan, it is possible to render a computer-generated view from the position and with the orientation of the actual bronchoscope, the so-called "virtual bronchoscopy." This view can be shown side by side with the real image in order to allow for user orientation. In another embodiment, after determining the bifurcation in the video image by image analysis, the planned path may be marked, e.g., with a cross.
  • Referring additionally to FIG. 3, another surgical device 380 (e.g., a bronchoscope) is shown. Device 380 can include additional channels 392 that allow for positioning one or more of the accelerometer 385, the camera 395 and the light source 397 at or near the tip 390 of the device. These components can be operably coupled to the processor 120 through use of a hardwire and/or wireless link. Once the device 380 reaches its target, one or more of these components can be removed through the channels 392. For example, the accelerometer 385 can be slid through the channel 392 and positioned therein during movement of the device 380. In one embodiment, existing bronchoscopes can be utilized with one or more of the components of device 180. For example, the accelerometer can be positioned in the existing working channel or attached to the outside of the tip of the bronchoscope. In another embodiment, the optical system and lighting components can be fixed in the surgical device 180. Once the target area is reached, the accelerometer 385 can be slid back out through the channel 392 so that the channel can be utilized for other purposes, such as a suction channel or a working channel. In this embodiment, fewer channels may thus be formed through the device 180.
  • Referring to FIG. 6, a method 600 of tracking a surgical device, such as a bronchoscope, is shown. In step 602, an image (e.g., a CT image) of the target region, such as the bronchi, is obtained. The image can be a pre-operative image and/or intra-operative image. In step 604, the bronchoscope can be moved through the bronchi while the clinician views the captured real-time video from the camera positioned in the bronchoscope. In step 606, it is determined whether the target has been reached. In step 608, the clinician may come upon a bifurcation in the path. The correct path to proceed along can be determined using the orientation data received from the accelerometer in step 610. These steps can be repeated until the target is reached in step 612. The image can be adjusted so that the position of the bronchoscope and/or the orientation of the bronchoscope is shown therein, such as by using the acceleration data.
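The loop of method 600 can be illustrated with a toy simulation. The `SimulatedScope` class and the branch labels are hypothetical stand-ins, and the accelerometer-based branch selection of step 610 is abstracted into a pre-planned sequence of branch choices.

```python
class SimulatedScope:
    """Toy stand-in for a tracked bronchoscope (hypothetical API)."""

    def __init__(self, tree, start):
        self.tree = tree          # bifurcation -> {branch label: child branch}
        self.position = start

    def advance(self, branch):
        self.position = self.tree[self.position][branch]


def track_to_target(scope, planned_branches, target):
    """Follow the planned branch choice at each bifurcation until the
    target is reached (steps 604-612 of FIG. 6)."""
    for branch in planned_branches:
        if scope.position == target:
            break
        scope.advance(branch)
    return scope.position == target


# Two bifurcations: trachea -> right main bronchus -> right lower lobe.
tree = {"trachea": {"left": "LMB", "right": "RMB"},
        "RMB": {"up": "RUL", "down": "RLL"}}
scope = SimulatedScope(tree, "trachea")
reached = track_to_target(scope, ["right", "down"], "RLL")
```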
  • System 100 allows the data from the accelerometer 185 to be transferred to the processor 120. In one embodiment, this data can be transmitted along a light guide bundle, which is being utilized for an optical camera operably coupled to the bronchoscope. The processor 120 can receive the orientation data from the accelerometer as well as the bronchoscope image from the video processor for analysis. The processor 120 can analyze and track which bifurcation of the bronchial tree is currently being seen. In one embodiment, the processor 120 can also be connected to the facility network in order to receive the pre-interventional CT scan and the corresponding path planning data. The directional information calculated by the processor 120 can be transferred to the video-processor, where it is combined with the original bronchoscope image data, and then presented on the monitor 130.
  • Referring additionally to FIG. 7, signal flow between the device 180 and a work or base station 119 is depicted. The signal flow can include acceleration data 750 from the accelerometer 185 to the processor 120; the bronchoscope imaging (e.g., real-time video) 725 from the camera 295 to a video-processor 721; and light 775 from a light source 797 to the light source 297 connected to the bronchoscope.
  • In one embodiment, the bronchoscope image being presented on the display 130 can be automatically rotated to depict the up-direction based on the orientation data from the accelerometer. In another embodiment, the image processing methods can be used to determine if the bronchoscope is moving in or out of the bronchi. In yet another embodiment, other types of orientation sensors can be utilized for capturing the orientation data. For example, a magnetometer can be used to determine the direction associated with the tip of the bronchoscope and the bifurcated paths based on use of an external magnetic field, including the earth's magnetic field and/or an artificial field. System 100 can be used for bronchoscopic navigation, particularly transbronchial lung biopsies. The system 100 can also be used in other applications, such as a colonoscopy.
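The automatic rotation described here amounts to counter-rotating the displayed frame by the scope's measured roll. The following is a minimal sketch, assuming the roll angle has already been derived from the accelerometer and the frame is rotated about its centre; the function names are illustrative.

```python
import math

def auto_rotate_angle(roll):
    """Angle (radians) by which to rotate the displayed frame so the
    anatomical up-direction appears at the top of the monitor: the
    negative of the scope's measured roll about its own axis."""
    return -roll

def rotate_point(x, y, angle):
    """Rotate an image point about the frame centre (0, 0)."""
    c, s = math.cos(angle), math.sin(angle)
    return c * x - s * y, s * x + c * y

# Scope rolled 90 degrees clockwise: a feature seen on the image +x axis
# is carried back to the upright (+y) position by the correction.
x, y = rotate_point(1.0, 0.0, auto_rotate_angle(-math.pi / 2))
```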
  • In one embodiment, calibration of the direction (iterative refinement of the assumed patient position) can be performed, where the navigation of the bronchoscope starts by assuming that the patient is in a known position and orientation. The pre-operative CT dataset is thus oriented according to the direction measured by the accelerometer. At the first bifurcation, the directions into both bifurcated bronchi are determined with the help of image analysis. These directions are compared to the expected direction based on the accelerometer measurement and the assumed patient orientation, and the deviation in orientation is calculated. The assumed patient orientation is corrected by this deviation, so that a better assumption of the patient orientation is used for the next bifurcation. This procedure can be repeated at each subsequent bifurcation.
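Reduced to one dimension (a single rotation angle in degrees), the iterative refinement described above might look like the following hypothetical sketch, where each bifurcation contributes one (measured, expected) direction pair:

```python
def refine_orientation(assumed, observations):
    """Iteratively correct an assumed patient orientation (one angle,
    in degrees) from per-bifurcation deviations between the
    image-derived branch direction and the direction expected from the
    accelerometer plus the current assumption. A hypothetical 1-D
    reduction of the calibration scheme; real orientations are 3-D."""
    for measured, expected in observations:
        deviation = measured - expected
        assumed += deviation   # apply correction before the next bifurcation
    return assumed

# Patient actually rotated 5 degrees from the initial assumption (0):
# the first bifurcation reveals the full deviation, the second confirms
# that the corrected assumption now matches.
est = refine_orientation(0.0, [(5.0, 0.0), (5.0, 5.0)])
```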
  • A preoperative CT of the lung can be obtained prior to the bronchoscopy. This CT can be analyzed as follows: In the CT image, the position of the lesion of interest, e.g., a lung nodule or tumor, can be determined. This can be done manually by clicking at the correct position in the appropriate slice. The bronchial tree can be extracted (segmented) from the CT image with the help of suitable image processing methods. The path from the trachea through the bronchial tree to the lesion can be planned. This can be done manually, but automatic methods are also conceivable. Bifurcations along the path can be detected. With this planning step, there is enough information available for the intra-operative guidance as described above.
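The path-planning step can be sketched as a breadth-first search over the segmented bronchial tree, represented here as a hypothetical adjacency dictionary of branch labels; the interior nodes of the returned path are exactly the bifurcations to watch for during the intervention.

```python
from collections import deque

def plan_path(tree, root, lesion_branch):
    """Breadth-first search from the trachea to the branch containing
    the lesion, over a segmented bronchial tree given as an adjacency
    dict (parent branch -> list of child branches). Returns the ordered
    list of branches, or None if the lesion branch is unreachable."""
    queue = deque([[root]])
    while queue:
        path = queue.popleft()
        if path[-1] == lesion_branch:
            return path
        for child in tree.get(path[-1], []):
            queue.append(path + [child])
    return None

# Hypothetical tree: trachea, main bronchi, and right-lung subdivisions.
tree = {"trachea": ["LMB", "RMB"],
        "RMB": ["RUL", "RIL"],
        "RIL": ["RML", "RLL"]}
path = plan_path(tree, "trachea", "RLL")
```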
  • Where the bronchi, and thus also the bronchoscope tip, point directly down or directly up, there may be no usable directional information from the accelerometer. In such a case, it helps to reposition the patient in such a manner that the bronchoscope tip points in a direction with a horizontal component. Alternatively, the accelerometer can be supplemented by a magnetometer, which measures the direction of the magnetic field. The magnetic field is not collinear with the gravitational field except at the magnetic poles.
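A simple observability check captures this degenerate case: gravity constrains the scope's roll only when it has a component transverse to the scope axis. The function name and the threshold value below are assumptions for illustration.

```python
import math

def roll_is_observable(gx, gy, gz, min_transverse_fraction=0.1):
    """Return True if the gravity reading (gx, gy, gz), with z along the
    scope axis, retains enough of a transverse (x-y plane) component to
    determine roll. When the tip points almost straight up or down, the
    transverse component vanishes and a magnetometer (or repositioning
    the patient) is needed instead. Threshold is an assumed value."""
    return math.hypot(gx, gy) / math.hypot(gx, gy, gz) >= min_transverse_fraction

# Tip pointing straight down: all of gravity lies along the scope axis,
# so roll cannot be recovered from the accelerometer alone.
ok = roll_is_observable(0.0, 0.0, 1.0)
```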
  • The invention, including the steps of the methodologies described above, can be realized in hardware, software, or a combination of hardware and software. The invention can be realized in a centralized fashion in one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a general purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • The invention, including the steps of the methodologies described above, can be embedded in a computer program product. The computer program product can comprise a computer-readable storage medium in which is embedded a computer program comprising computer-executable code for directing a computing device or computer-based system to perform the various procedures, processes and methods described herein. Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
  • The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
  • Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description. Therefore, it is intended that the disclosure not be limited to the particular embodiment(s) disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims.
  • The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), which requires an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.

Claims (20)

1. A method of tracking in a medical procedure, the method comprising:
receiving acceleration data (750) from an accelerometer (185) that is integrally connected to a medical device (180), the acceleration data being received at a remote processor (120), the medical device being moved through an anatomy (105) of a patient towards a target region (430); and
determining an orientation of the medical device with respect to the anatomy based on the acceleration data.
2. The method of claim 1, further comprising:
obtaining an image of the anatomy;
calibrating the orientation with the image;
at a first bifurcation of the anatomy, determining directions into bifurcated bronchi using the image; and
comparing the determined directions to an expected direction based on the determined orientation.
3. The method of claim 1, further comprising capturing real-time images of the anatomy using the medical device (180) and displaying the captured images on a display device (130) operably coupled with the processor (120).
4. The method of claim 3, further comprising overlaying an orientation indicator with the captured real-time images of the anatomy (105), wherein the orientation indicator represents the orientation of the medical device (180) with respect to the anatomy.
5. The method of claim 1, further comprising:
capturing real-time images of the anatomy (105) using the medical device (180);
displaying the captured images on a display device (130) operably coupled with the processor (120); and
presenting the orientation on the display device, wherein the orientation is presented by rotating the captured images on the display device.
6. The method of claim 1, further comprising presenting the orientation on a display device (130) operably coupled to the processor (120).
7. The method of claim 1, further comprising detecting bifurcations in the anatomy using an image of the patient.
8. The method of claim 7, wherein the image of the patient is obtained using at least one of computed tomography, magnetic resonance imaging, and ultrasound imaging.
9. The method of claim 7, further comprising overlaying a current position of the medical device (180) on the image of the patient.
10. The method of claim 1, further comprising:
obtaining an image of the anatomy;
determining a target region in the image by selecting a corresponding slice of the image;
segmenting a portion of the image using image processing; and
planning a path from a trachea into a bronchial tree to the target region.
11. A computer-readable storage medium in which computer-executable code is stored, the computer-executable code configured to cause a computing device, in which the computer-readable storage medium is provided, to:
receive orientation data (750) from an orientation sensor (185) that is integrally connected to a medical device (180), the orientation data being received at a remote processor (120), the medical device being moved through an anatomy (105) of a patient towards a target region (430);
determine an orientation of the medical device with respect to the anatomy based on the orientation data;
capture real-time images of the anatomy using the medical device; and
present the captured images and the orientation of the medical device with respect to the anatomy on a display device (130) operably coupled to the processor.
12. The computer-readable storage medium of claim 11, wherein the orientation sensor (185) is one of an accelerometer and a magnetometer, and wherein the orientation data is acceleration data.
13. The computer-readable storage medium of claim 11, further comprising computer-executable code for causing the computing device to overlay an orientation indicator with the captured real-time images of the anatomy (105), wherein the orientation indicator represents the orientation of the medical device (180) with respect to the anatomy.
14. The computer-readable storage medium of claim 11, further comprising computer-executable code for causing the computing device to detect bifurcations in the anatomy using an image of the patient.
15. The computer-readable storage medium of claim 14, further comprising computer-executable code for causing the computing device to overlay a current position of the medical device (180) on the image of the patient.
16. An endoscope comprising:
a body (281) having a distal end (290) and at least one channel (292) formed therein, the body being adapted for insertion through an anatomy (105) to reach a target area (430);
an accelerometer (185) connected to the body and positioned in proximity to the distal end;
an imaging device (295) operably coupled with the body; and
a light source (297) operably coupled with the body,
wherein the accelerometer is in communication with a remote processor (120) for transmitting acceleration data thereto, wherein the imaging device is in communication with the remote processor for transmitting real-time images thereto, and wherein an orientation of the medical device with respect to the anatomy is determined by the processor based on the acceleration data.
17. The endoscope of claim 16, wherein the imaging device (295) and the light source (297) are positioned in proximity to the distal end (290) of the body (281).
18. The endoscope of claim 16, wherein the at least one channel (292) comprises a first channel and a second channel, the first channel being adapted for providing suction to the target area, the second channel being adapted for passing a surgical device therethrough.
19. The endoscope of claim 16, wherein the accelerometer (185) is in communication with the remote processor (120) through a wireless link.
20. The endoscope of claim 16, wherein the imaging device (295) is in communication with the remote processor (120) through a wireless link.
US13/378,175 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure Abandoned US20120089014A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/378,175 US20120089014A1 (en) 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US22113809P 2009-06-29 2009-06-29
PCT/IB2010/052176 WO2011001301A1 (en) 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure
US13/378,175 US20120089014A1 (en) 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/052176 A-371-Of-International WO2011001301A1 (en) 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/287,896 Division US10765308B2 (en) 2009-06-29 2016-10-07 Method and apparatus for tracking in a medical procedure

Publications (1)

Publication Number Publication Date
US20120089014A1 true US20120089014A1 (en) 2012-04-12

Family

ID=42668331

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/378,175 Abandoned US20120089014A1 (en) 2009-06-29 2010-05-17 Method and apparatus for tracking in a medical procedure
US15/287,896 Active 2032-12-05 US10765308B2 (en) 2009-06-29 2016-10-07 Method and apparatus for tracking in a medical procedure

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/287,896 Active 2032-12-05 US10765308B2 (en) 2009-06-29 2016-10-07 Method and apparatus for tracking in a medical procedure

Country Status (6)

Country Link
US (2) US20120089014A1 (en)
EP (1) EP2448512B1 (en)
JP (1) JP6200152B2 (en)
CN (1) CN102470014B (en)
RU (1) RU2544807C2 (en)
WO (1) WO2011001301A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3957242A1 (en) 2011-02-17 2022-02-23 Tyto Care Ltd. System, handheld diagnostics device and methods for performing an automatic and remote trained personnel guided non-invasive medical examination
WO2012111012A1 (en) 2011-02-17 2012-08-23 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
US9925009B2 (en) * 2013-03-15 2018-03-27 Covidien Lp Pathway planning system and method
US9639666B2 (en) 2013-03-15 2017-05-02 Covidien Lp Pathway planning system and method
EP2821024A1 (en) * 2013-07-01 2015-01-07 Advanced Osteotomy Tools - AOT AG Computer assisted surgery apparatus and method of cutting tissue
EP2821023A1 (en) * 2013-07-01 2015-01-07 Advanced Osteotomy Tools - AOT AG Planning cutting of human or animal bone tissue
KR101645392B1 (en) * 2014-08-13 2016-08-02 주식회사 고영테크놀러지 Tracking system and tracking method using the tracking system
US10163204B2 (en) * 2015-02-13 2018-12-25 St. Jude Medical International Holding S.À R.L. Tracking-based 3D model enhancement
US11646113B2 (en) * 2017-04-24 2023-05-09 Biosense Webster (Israel) Ltd. Systems and methods for determining magnetic location of wireless tools
US11166766B2 (en) * 2017-09-21 2021-11-09 DePuy Synthes Products, Inc. Surgical instrument mounted display system
US11382594B2 (en) * 2018-12-31 2022-07-12 General Electric Company Systems and methods for interventional radiology with remote processing
US11830274B2 (en) * 2019-01-11 2023-11-28 Infrared Integrated Systems Limited Detection and identification systems for humans or objects
CN110123453B * 2019-05-31 2021-07-23 东北大学 Surgical navigation system based on markerless augmented reality
CN110537982A * 2019-09-25 2019-12-06 重庆博仕康科技有限公司 Surgical navigation system for flexible and rigid endoscopes
CN111820955B * 2020-07-27 2021-09-10 南方科技大学 Portable intelligent pharyngeal swab collection device
US20240000513A1 (en) * 2021-01-25 2024-01-04 Smith & Nephew, Inc. Systems and methods for fusing arthroscopic video data
CN115145453B (en) * 2022-09-02 2022-12-16 北京唯迈医疗设备有限公司 Method, system and storage medium for adjusting display visual angle of medical image
CN115414120A (en) * 2022-11-07 2022-12-02 中南大学 Endoscope navigation system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060074289A1 (en) * 2004-08-26 2006-04-06 Doron Adler Wireless determination of endoscope orientation
US7233820B2 (en) * 2002-04-17 2007-06-19 Superdimension Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US20070270686A1 (en) * 2006-05-03 2007-11-22 Ritter Rogers C Apparatus and methods for using inertial sensing to navigate a medical device
US20080207997A1 (en) * 2007-01-31 2008-08-28 The Penn State Research Foundation Method and apparatus for continuous guidance of endoscopy
US20080292046A1 (en) * 2007-05-09 2008-11-27 Estelle Camus Bronchopulmonary medical services system and imaging method
US20090149740A1 (en) * 2007-12-11 2009-06-11 Siemens Aktiengesellschaft A medical intervention device
US8337397B2 (en) * 2009-03-26 2012-12-25 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994004938A1 (en) 1992-08-14 1994-03-03 British Telecommunications Public Limited Company Position location system
JP4624575B2 (en) * 2001-02-16 2011-02-02 オリンパス株式会社 Endoscope system
JP4776793B2 (en) * 2001-03-08 2011-09-21 オリンパス株式会社 Endoscope device
US7641609B2 (en) * 2002-07-31 2010-01-05 Olympus Corporation Endoscope device and navigation method for endoscope device
CA2567737A1 (en) * 2004-05-14 2005-11-24 Olympus Medical Systems Corp. Electronic endoscope
JP4695420B2 (en) 2004-09-27 2011-06-08 オリンパス株式会社 Bending control device
JP2006230906A (en) * 2005-02-28 2006-09-07 Toshiba Corp Medical diagnostic system and apparatus, and endoscope
JP4812418B2 (en) * 2005-12-06 2011-11-09 オリンパス株式会社 Endoscope device
JP2007319622A (en) * 2006-06-05 2007-12-13 Olympus Corp Endoscope system
US20080146941A1 (en) * 2006-12-13 2008-06-19 Ep Medsystems, Inc. Catheter Position Tracking for Intracardiac Catheters
JP4922107B2 (en) * 2007-09-03 2012-04-25 オリンパスメディカルシステムズ株式会社 Endoscope device
WO2011001300A1 (en) 2009-06-29 2011-01-06 Koninklijke Philips Electronics, N.V. Method and system for position determination

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Adler, US Pub. No. 2006/0074289 *
Camus, US Pub. No. 2008/0292046 *
Gilboa, US Pat. No. 7,233,820 *
Hoheisel, US Pub. No. 2009/0149740 *
Prisco, US Pat. No. 8,337,397 *
Ritter, US Pub. No. 2007/0270686 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9008757B2 (en) 2012-09-26 2015-04-14 Stryker Corporation Navigation system including optical and non-optical sensors
US9271804B2 (en) 2012-09-26 2016-03-01 Stryker Corporation Method for tracking objects using optical and non-optical sensors
US9687307B2 (en) 2012-09-26 2017-06-27 Stryker Corporation Navigation system and method for tracking objects using optical and non-optical sensors
US10575906B2 (en) 2012-09-26 2020-03-03 Stryker Corporation Navigation system and method for tracking objects using optical and non-optical sensors
US11529198B2 (en) 2012-09-26 2022-12-20 Stryker Corporation Optical and non-optical sensor tracking of objects for a robotic cutting system
US20140375784A1 (en) * 2013-06-21 2014-12-25 Omnivision Technologies, Inc. Image Sensor With Integrated Orientation Indicator
US11188285B2 (en) 2014-07-02 2021-11-30 Covidien Lp Intelligent display
WO2016004007A1 (en) 2014-07-02 2016-01-07 Covidien Lp Intelligent display
EP3164071A4 (en) * 2014-07-02 2018-04-04 Covidien LP Intelligent display
AU2015284290B2 (en) * 2014-07-02 2019-09-12 Covidien Lp Intelligent display
US11793389B2 (en) 2014-07-02 2023-10-24 Covidien Lp Intelligent display
WO2016064870A1 (en) * 2014-10-20 2016-04-28 Ohio State Innovation Foundation Intubation with audiovibratory guidance
US10610655B2 (en) 2014-10-20 2020-04-07 Ohio State Innovation Foundation Intubation with audiovibratory guidance
US10470724B2 (en) * 2015-04-13 2019-11-12 Precisionrad Llc Laser and accelerometer guided medical device
US20200197106A1 (en) * 2017-06-21 2020-06-25 Biosense Webster (Israel) Ltd. Registration with trajectory information with shape sensing
US11684251B2 (en) * 2019-03-01 2023-06-27 Covidien Ag Multifunctional visualization instrument with orientation control
EP3919019A1 (en) * 2020-06-03 2021-12-08 Covidien LP Surgical tool navigation using sensor fusion
WO2022144635A3 (en) * 2020-12-28 2022-08-11 Johnson & Johnson Surgical Vision, Inc. Highly bendable camera for eye surgery
CN116630534A (en) * 2023-05-06 2023-08-22 华中科技大学协和深圳医院 Airway management artificial intelligence decision-making system

Also Published As

Publication number Publication date
RU2012103004A (en) 2013-08-10
CN102470014A (en) 2012-05-23
JP6200152B2 (en) 2017-09-20
EP2448512A1 (en) 2012-05-09
US10765308B2 (en) 2020-09-08
WO2011001301A1 (en) 2011-01-06
CN102470014B (en) 2016-07-20
EP2448512B1 (en) 2021-10-27
JP2012531932A (en) 2012-12-13
RU2544807C2 (en) 2015-03-20
US20170020376A1 (en) 2017-01-26

Similar Documents

Publication Publication Date Title
US10765308B2 (en) Method and apparatus for tracking in a medical procedure
US9554729B2 (en) Catheterscope 3D guidance and interface system
JP7154832B2 (en) Improving registration by orbital information with shape estimation
EP2800534B1 (en) Position determining apparatus
CN108430373B (en) Apparatus and method for tracking the position of an endoscope within a patient
EP2123216B1 (en) Medical Device
JP5291619B2 (en) Coordinate system registration
KR20200007896A (en) Biopsy Devices and Systems
EP2430979B1 (en) Biopsy support system
US20120130171A1 (en) Endoscope guidance based on image matching
US20060184016A1 (en) Method and apparatus for guiding an instrument to a target in the lung
US20210378759A1 (en) Surgical tool navigation using sensor fusion
CN109620303B (en) Lung auxiliary diagnosis method and device
JP4334839B2 (en) Endoscope observation device
JP2018134197A (en) Medical procedure navigation system and method
US20230372024A1 (en) Synthetic position in space of an endoluminal instrument
JP2022505955A (en) Spatial alignment method for imaging equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINKLIJKE PHILIPS ELECTRONICS N V, NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SABCZYNSKI, JOERG;SCHULZ, HEINRICH;SIGNING DATES FROM 20100602 TO 20100603;REEL/FRAME:027385/0455

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION