EP1196089A2 - Apparatus and methods for medical diagnostic and for medical guided interventions and therapy - Google Patents

Apparatus and methods for medical diagnostic and for medical guided interventions and therapy

Info

Publication number
EP1196089A2
Authority
EP
European Patent Office
Prior art keywords
imaging device
medical imaging
medical
volume
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00914346A
Other languages
German (de)
French (fr)
Inventor
Victor Segalescu
Yoav Paltieli
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ultraguide Ltd
Original Assignee
Ultraguide Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ultraguide Ltd
Publication of EP1196089A2

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/899Combination of imaging systems with ancillary equipment
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/378Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Acoustics & Sound (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The apparatus and methods introduced in the present invention enable measurement of the relative position between different medical imaging devices (18)(152), and between the image planes and/or image volumes produced by them, by means of a position measuring system based on attachable position measuring components (28)(30). This facilitates image fusing when images of the same plane/volume are available from different medical imaging systems. Additionally/alternately, it enables positioning a second medical imaging device over a desired area/volume according to information received from the image produced by the first medical imaging device.

Description

APPARATUS AND METHODS FOR MEDICAL DIAGNOSTIC AND FOR
MEDICAL GUIDED INTERVENTIONS AND THERAPY
CROSS REFERENCES TO RELATED APPLICATIONS
This PCT Patent Application is related to and claims priority from commonly
owned U.S. Provisional Patent Application No. 60/127,267, filed on March 31, 1999,
entitled:
APPARATUS AND METHODS FOR MEDICAL DIAGNOSTIC AND FOR
MEDICAL GUIDED INTERVENTIONS USING MULTIPLE IMAGING
SYSTEMS.
This Provisional Patent Application is incorporated by reference in its entirety
herein.
FIELD OF THE INVENTION
The present invention relates to apparatus for performing medical diagnosis,
and to apparatus for planning and performing medical interventions or therapy
procedures. Particularly, the present invention is related to apparatus for performing
guided medical interventions or medical therapy procedures that employ multiple
medical imaging systems for viewing the target in a body or body volume during the
intervention.
BACKGROUND OF THE INVENTION
During recent years, fusing images from different medical imaging systems in
order to obtain a better diagnosis has become widely used. Additionally, the
cooperative operation of several medical imaging devices can reduce the amount of radiation to which the patient is subjected during diagnosis, therapy planning and medical procedures and interventions. While these facts have already been recognized, the
realization of such cooperative operation has been until now generally restricted to
medical imaging devices having a common mechanical platform.
Additionally, various directional therapy procedures (based on directing an
energy field towards a target in a body) are assisted by images of said body and target produced by medical imaging devices like CT, MR, ultrasound, etc.
SUMMARY OF THE INVENTION
During diagnosis, medical interventions or therapy procedures, it
may be necessary or helpful to operate several medical imaging devices simultaneously or sequentially. This is done according to necessity and compatibility between the devices, in
order to indicate the condition of the patient and/or designate a target in a body or
body volume. For example the same target can be viewed by ultrasound and by some
other medical imaging device such as a CT, X-Ray, or endoscope imaging device.
The present invention comprises methods and apparatus for combining two or
more medical imaging systems (such as an ultrasound and a CT) in medical diagnosis
and procedures without mechanical constraints between the position of the two or more
medical imaging devices. The present invention is particularly useful in guided medical interventions into a body or body volume, where a target is sought to be evaluated. The apparatus disclosed in the present invention comprises at least two medical
imaging devices for example, an ultrasound, CT, X-Ray, endoscope, at least a display,
a data processor, and a position measuring system comprising position controlling and
position measuring components. At least part of the position measuring components
are located at determined positions with respect to the medical imaging devices and
calibrated with respect to the image/beam produced by the imaging devices. The position measuring system enables the establishment of the relative positions between
the at least two medical imaging devices. The data processor, receiving the information
from the position measuring system, is also able to establish the position between the image planes/volumes produced by the at least two imaging devices and to use it for at
least one of the following: a) maneuvering one or more imaging devices in order to scan the
target of interest within the body according to information available from another medical
imaging device, or
b) facilitating image fusing when images of the same plane/volume are available from two or more imaging devices.
The term position measuring components defines any of the following group:
transmitter or receiver or reflector or transceiver or optical indicia or inertial sensor or
any combination of the above, suitable to be part of a position measuring system. This
position measuring system may be magnetic, acoustic, optic, inertial or a combination of the above.
The resultant apparatus enables free-hand manipulation of all or part of the
medical imaging devices used in the same intervention or diagnosis.
The apparatus and methods described in the present invention facilitate the combination of a number of medical imaging devices to perform various medical
diagnostic, therapy procedures and intervention tasks more safely and efficiently than those of the conventional systems. Particularly, the apparatus of the present invention
enables viewing of a target with one medical imaging device and guiding a medical
intervention tool or a medical therapeutic tool, or alternatively inserting a medical device when using a second medical imaging device. This can be particularly useful
when employing image guided medical intervention systems such as those introduced
by the assignees in commonly assigned U.S. Patent No. 5,647,373, entitled:
Articulated Needle Guide For Ultrasound Imaging And Method Of Using Same, and patent applications, PCT/IL96/00050 (WO 97/03609), entitled: Free-Hand Aiming Of A Needle Guide; PCT/IL98/00578, entitled: System And Method For Guiding The
Movements Of A Device To A Target Particularly For Medical Applications; and
PCT/IL98/00631, entitled: Calibration Method And Apparatus For Calibrating
Position Sensors On Scanning Transducers, all four of these documents incorporated
by reference in their entirety herein.
The apparatus described in the present invention may also reduce the amount of
radiation applied on a patient during diagnostic and medical interventions and/or
therapy, for example when using ultrasound and CT in the same intervention.
The present invention also comprises methods and apparatus for guiding a directional
therapy procedure without mechanical constraints between the position of the medical
imaging device or devices used to assist the therapy procedure and the directional
therapeutic device. The apparatus therefore, enables free-hand manipulation of part or
all of the medical imaging devices used to assist the therapy procedure and/or of the
therapeutic head. The term directional therapy procedure will define any procedure
during which an energy field is directed towards a target or area in the body of the
patient. This energy field can be ultrasonic or shockwaves (lithotripsy), or
electromagnetic (radiotherapy, laser, etc.) or a particle beam (a proton beam, for example).
The data processor receives the information from the position measuring system and
uses it for directing the therapy device head\beam towards a desired target in the body.
The method and apparatus are beneficial in that they have add-on capabilities,
can define dynamic architectures, and enable free-hand maneuvering of all or part of
the devices employed therein.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described by way of the accompanying
drawings, wherein like reference numerals and/or characters indicate corresponding or
like components. In the drawings:
FIG. 1a pictorially illustrates one form of a system constructed in
accordance with the present invention for cooperative operation of an ultrasound and a
computerized tomography (CT) apparatus;
FIG. 1b pictorially illustrates the relative position between the scanning beams
of the CT and ultrasound in FIG. 1a;
FIG. 2a is a vector diagram which pictorially illustrates the vectors used in
calculating the relative position between the scanning beams of the ultrasound and the
CT in FIG. 1a;
FIG. 2b is a block diagram illustrating the steps involved in calculating the
relative position between the scanning beams of the ultrasound and the CT in FIG. 1a;
FIG. 3 pictorially illustrates display functions enabled according to the present
invention in relation to FIG. 1a;
FIG. 4 is a simplified flowchart illustrating the steps of using in a cooperative
mode two medical scanning devices in accordance with the present invention;
FIG. 5a pictorially illustrates one possible position measuring system to be used
in accordance with the present invention;
FIG. 5b pictorially illustrates another possible position measuring system to be
used in accordance with the present invention; and FIG. 6 pictorially illustrates one form of a system constructed in
accordance with the present invention for cooperative operation of an ultrasound and an
X-Ray.
DETAILED DESCRIPTION OF THE EMBODIMENTS
FIG. 1a illustrates a first embodiment, exemplary of the present invention. The
first embodiment utilizes at least two compatible medical imaging devices that produce
medical images. Here, the medical imaging devices include an ultrasound apparatus 2,
and a computerized tomography (CT) apparatus 4, employed in a cooperative
operation. Alternately, the medical imaging devices could be the same or different
devices, or combinations thereof, provided they operate cooperatively with respect to
each other and are compatible. For example, these devices may be devices that
produce images by ultrasound, computerized tomography (CT), X-ray, endoscopy etc.
In the set-up illustrated in Fig. 1, for example, images produced by CT 4 of a
body 6 (or body volume) and of a target 8 in the body 6 (body volume) may be utilized to position ultrasound transducer 18 at a known position with respect to target
8, even if target 8 cannot be imaged as accurately by ultrasound. This is particularly
useful in CT assisted medical interventions, enabling the use of ultrasound images for
real-time monitoring of the invasive tool while using the CT images for primary
location of the target. Additionally, images produced by the two medical imaging
devices may be combined in real-time or off-line to produce the detail required in the
area of the image required. This may also be used for monitoring anatomic changes
due to breathing or internal movements between the CT images and the situation of the
body 6 at the time of the procedure. The system can be fitted to existing and deployed
medical imaging devices without major modifications or adaptations.
Ultrasound 2 and/or CT 4 are connected to a display 10 via an image processor
12 (optional) contained in data processor 14, for displaying on display 10 at least the images produced by ultrasound 2 and CT 4. Ultrasound 2 and/or CT 4 can also be
connected directly to display 10 via the necessary connections and hardware. Image
processor 12 can be part of the data processor 14 or can be connected to it.
Ultrasound 2 comprises a main unit 17 connected to a scanning head 18 further
referred to as the ultrasound transducer as is known in the art. CT 4 comprises a main
unit (CT computer) 20 connected to a scanning head 22, further referred to as a CT
scanning head. CT scanning head 22 includes X-ray emitter and detector(s) (not
shown) as is known in the art. The term scanning head will be used to define the
detector and/or emitter component of the medical imaging or scanning device, such as
the transducer of an ultrasound, or the X-ray emitter and detector(s) of a CT or an
X-ray or the CCD of an optical endoscope.
A position measuring system comprising at least a position sensing controller
26 and position measuring components (PMCS) 28, 30, 32 and 34 is used to measure
the relative position between ultrasound transducer 18 and the CT scanning head 22.
The term position measuring components will define any of the following group:
transmitter or receiver or reflector or transceiver or optical indicia or inertial sensor or
any combination of the above, suitable to be part of a magnetic (for example, in
accordance with the systems detailed in U.S. Patents Nos. 4,314,251, 4,054,881) or
acoustic (for example, in accordance with the system detailed in U.S. Patent No.
4,124,838) or optic (for example, in accordance with the system detailed in U.S.
Patent No. 4,649,504) or inertial (for example, IS900 manufactured by InterSense
Inc.) position measuring system or any combination of the above, all of the above listed U.S. Patents incorporated by reference herein. Also, position levers in accordance with U.S. Patent No. 5,647,373 and PCTs PCT/IL96/00050, PCT/IL98/00578 and PCT/IL98/00631, listed above, are also
suitable. Position sensing controller 26 can be part of the data processor 14.
In order to perform the task of measuring the relative position between
ultrasound transducer 18 and CT scanning head 22, position measuring component 28
is attached at a known and fixed position with respect to the ultrasound transducer 18.
The attachment can be either directly to the transducer 18 or by means of an extension
27. Position measuring component 28 is calibrated to ultrasound transducer 18 such
that the ultrasound beam 36 is at a known and fixed position with respect to position
measuring component 28. Such calibration can be achieved by operating according to PCT application PCT/IL98/00631.
Position measuring component 30 is attached at a known and fixed position
from CT scanning head (gantry) 22. The attachment can be either directly to the CT
scanning head (gantry) 22 or by means of an extension 29. The term position defines
location and/or orientation.
Position sensing controller 26 measures the relative position between position
measuring component 28 and position measuring component 30, enabling to calculate
the relative position between ultrasound transducer 18 and CT scanning head 22.
Position measuring component 30 is calibrated to scanning head 22 such that the CT
scanning beam 34 is at a known and fixed position with respect to position measuring component 30. Such calibration can be achieved by operating according to the
co-assigned PCT application PCT/IL98/00631. Reference is now made to FIG. 1b that shows a detailed view of ultrasound
transducer 18, CT scanning head 22 and position measuring components 28 and 30.
Items referred to in previous figures are numbered similarly and will not be further described.
Alternately/additionally, position measuring component 32 may be attached to
CT bed 15. The attachment can be either directly to the CT bed 15 or by means of an
extension 31. In this case position measuring component 32 is attached at a fixed
position with respect to a reference position of the CT scanning head 22 (for example
the default perpendicular position), and movements of the gantry 22 are compensated
according to information available from the CT computer 20.
An additional position measuring component 34 may be attached at a fixed
position from CT scanning head 22, attached to the ceiling of the CT room by arm 33.
In this case position measuring component 34 is attached at a fixed position with
respect to a reference position of the CT (for example the default position), and
movements of the gantry 22 are compensated according to information available from
the CT computer 20.
It is not necessary to use position measuring components 30, 32 and 34 together. Rather, it is sufficient to use at least one of them. In order to operate the apparatus
properly it is sufficient to implement only one of the above position measuring
components 30, 32 and 34 in combination with position measuring component 28.
Position measuring components 32 and 34, if used, are calibrated to CT scanning head
22 similar to the calibration of position measuring component 30, if used. Since position measuring component 28 is calibrated to transducer 18 and at
least one of the position measuring components 30, 32 and 34 is calibrated to CT
scanning head 22 it is possible to calculate the relative position between ultrasound
scanning beam 36 and CT scanning beam 38. This is calculated based on measuring
the relative position between position measuring component 28 and at least one of
position measuring components 30, 32 and 34, and based on the
calibration values defined above. The vector diagram of FIG. 2a and flowchart of FIG.
2b illustrate one possible algorithm to be used for calculating the relative position
between the beams of the two medical imaging devices.
Referring to the flowchart of FIG. 2b, block 40 shows the result of calibrating
position measuring component 28 to ultrasound transducer 18. Block 42 shows the
result of calibrating position measuring component 30 to CT scanning head 22.
Blocks 40 and 42 are generally performed off-line. Block 44 shows the measurement
of the relative position of position sensor 28 with respect to position sensor 30. Block
46 shows one possible set of equations (Equations 1 and 2 described below) for
calculating the relative position between ultrasound scanning beam 36 and CT scanning
beam 38. These equations are:
* [M]y&g * ( [ ]u-s Mό28)τ (Eq- D
~? US .J3 _ rM-, P_M_C_28 * rM1 P_M^C28 , T * ~1 P_M_C_30 u CT_S_B L1V1J US_S_B V L1V1J P_M_C_30 ) u CT S_B ^
20 (Eq. 2) r λ/fl P-M-C-28 * H P_ _C_28 ~ P M_C_28 ,
11V11 US_S_B I U P_M_C_30 " U US_S_B .
The indexes and parameters in the above equations are according to vector
diagram of FIG. 2a and flowchart of FIG. 2b. It is therefore possible to calculate the relative position of transducer 18 and ultrasound beam/image 36 with respect to a CT
image or CT set of images.
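The same chain of calibrations and measurements can also be written with 4x4 homogeneous transforms. The sketch below is only an illustration of the calculation expressed by Eq. 1 and Eq. 2, not code from the patent; the variable names, the NumPy representation and the placeholder values are assumptions.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Off-line calibrations (blocks 40 and 42):
#   T_pmc28_usbeam: ultrasound scanning beam 36 expressed in the frame of PMC 28
#   T_pmc30_ctbeam: CT scanning beam 38 expressed in the frame of PMC 30
# Run-time measurement (block 44):
#   T_pmc28_pmc30:  PMC 30 expressed in the frame of PMC 28
# Placeholder values are used here only so the example runs.
T_pmc28_usbeam = pose(np.eye(3), [0.00, 0.00, 0.05])
T_pmc30_ctbeam = pose(np.eye(3), [0.00, 0.10, 0.00])
T_pmc28_pmc30  = pose(np.eye(3), [0.30, 0.00, 0.20])

# Chain the transforms (CT beam -> PMC 30 -> PMC 28 -> ultrasound beam), giving the
# pose of the CT scanning beam expressed in ultrasound-beam coordinates: the rotation
# part corresponds to Eq. 1 and the translation part to Eq. 2.
T_usbeam_ctbeam = np.linalg.inv(T_pmc28_usbeam) @ T_pmc28_pmc30 @ T_pmc30_ctbeam

print(T_usbeam_ctbeam[:3, :3])   # relative orientation
print(T_usbeam_ctbeam[:3, 3])    # relative offset, here [0.3, 0.1, 0.15]
```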
Necessary correction due to moving the CT stretcher 16 outside the gantry can
be performed according to information available from the CT system (the displacement
of stretcher 16 can be measured with an accuracy of less than 1 mm, typically
0.25 mm to 0.5 mm). The movement of the bed stretcher 16 can be by a predetermined
value, bringing the desired target 8 or CT slice of interest to the same position from CT
scanning head (gantry) 22. Alternately, the CT stretcher 16 can be moved to any
desired position.
Necessary correction due to tilt of the scanning head of the CT 22 may be
performed according to information available from the CT system 4.
Necessary correction due to swivel of bed stretcher 16 can be performed
according to information available from the CT system.
Necessary correction due to using oblique or perpendicular CT images can be
performed according to information available from the CT system.
All the above mentioned correction values can be manually inputted to the data
processor 14 or transferred through communication links from CT main unit 20 or data
processor 14 can identify them automatically for example according to information
generally available in the CT image (video or DICOM form).
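As one possible way to fold such correction values into the calculation, the sketch below builds a compensation transform from a table displacement and a gantry tilt angle. It is a hedged illustration only: the function name and the axis conventions are assumptions, and in practice the values would come from the CT console or the image header as described above.

```python
import numpy as np

def ct_correction(table_displacement_mm, gantry_tilt_deg):
    """Correction transform for stretcher 16 travel and gantry 22 tilt, to be
    composed with the CT-beam pose. Axis conventions here are assumptions:
    the table moves along z and the gantry tilts about the x axis."""
    a = np.deg2rad(gantry_tilt_deg)
    tilt = np.array([[1.0, 0.0,        0.0,       0.0],
                     [0.0, np.cos(a), -np.sin(a), 0.0],
                     [0.0, np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0,        0.0,       1.0]])
    shift = np.eye(4)
    shift[2, 3] = table_displacement_mm   # e.g. read from the console or image header
    return tilt @ shift

# Example: stretcher moved 120 mm out of the gantry, gantry tilted by 5 degrees.
T_corr = ct_correction(120.0, 5.0)
print(T_corr[:3, 3])
```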
Similar or alternate algorithms can be implemented in connection to position
measuring components 32 and 34.
Referring to FIG. 3, the relative position of ultrasound image (scanning beam
36) with respect to a set of CT images is indicated to the operator on the apparatus display 10. The indications are in 2-D and/or 3-D fashion, for example in the form of
boxes 60, 62, 64, 66 and 68.
The amount of deviation of ultrasound scanning beam 36 from a reference CT
scanning image 38 can be displayed to the operator, for example in the form of angles and distances, as in box 60, and in the form of side-view illustration, box 62, and
top-view illustration, box 64. This enables the operator to first scan body 6 by CT,
identify at least one target 8 in body 6, and then position ultrasound transducer 18 such
as to view a desired CT slice/image.
Additionally, target 8 can be marked in a CT image and it is possible to
maneuver ultrasound transducer 18 such as to view the target 8. The indications can be
in the form of box 66 comprising arrows indicating how to maneuver transducer 18
and numbers indicating the distance of target 8 from ultrasound scanning beam 36. It is
then possible, for example, to guide an invasive tool towards target 8 based on CT images in combination with real-time imaging of the invasive tool by ultrasound.
Boxes 68 and 70 provide information regarding the position of transducer beam 36
with respect to a volume of CT images. Box 68 shows the CT slice (reconstruction)
aligned with the current position of transducer 18. Box 70 illustrates the relative
position of transducer 18 with respect to the scanned volume in a sagittal view.
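A minimal sketch of how the deviation values shown in box 60 could be computed from the relative pose established above is given below. The plane conventions (CT slices perpendicular to the CT z axis, the ultrasound plane normal along its local y axis) are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def deviation_from_ct_slice(T_ctbeam_usbeam, slice_z_mm):
    """Angle and out-of-plane distance of ultrasound beam 36 from a reference CT
    slice 38, as could be shown to the operator (box 60). T_ctbeam_usbeam is the
    4x4 pose of the ultrasound beam expressed in CT-beam coordinates."""
    R = T_ctbeam_usbeam[:3, :3]
    origin = T_ctbeam_usbeam[:3, 3]
    n_us = R @ np.array([0.0, 1.0, 0.0])     # assumed normal of the ultrasound plane
    n_ct = np.array([0.0, 0.0, 1.0])         # CT slices assumed perpendicular to z
    angle_deg = np.degrees(np.arccos(np.clip(abs(n_us @ n_ct), 0.0, 1.0)))
    distance_mm = origin[2] - slice_z_mm     # offset of the beam origin from the slice
    return angle_deg, distance_mm

angle, dist = deviation_from_ct_slice(np.eye(4), slice_z_mm=40.0)
print(f"tilt {angle:.1f} deg, offset {dist:.1f} mm")
```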
While Fig. 3 illustrates specific display modalities for enabling the user to
cooperatively use ultrasound 2 and CT 4 information, additional or alternate displays
may be used. In applications requiring very high accuracy it may be necessary to constrain
body 6 in order to avoid small movements between scanning body 6 by ultrasound 2
and scanning body 6 by CT 4. In most applications this requirement is not necessary.
Reference is now made to FIG. 4 which is a flow-chart illustration of the steps
required in order to operate ultrasound 2 and CT 4 in operative cooperation as
illustrated in Fig. 1 and in accordance with the present invention. At step 90, the
relative position between position measuring components (PMC) 28 and
30 is measured (FIG. 4). It is possible to use position measuring components 32 or 34 or both
instead of or in addition to position measuring component 30 (as explained above). At
step 92, the relative position between ultrasound scanning
plane/volume/image 36 and CT scanning plane/volume/image 38 is calculated. The calculation at step 92 is based on the calibration of position measuring components 28 and 30 (32, 34)
with respect to ultrasound transducer 18 and CT scanner 22 (step 94). The calculation
in step 94 may also be used in other medical procedures (step 96), which is optional but
preferred, for example in image guided interventions, such as described in commonly
assigned U.S. Patent No. 5,647,373. The calculation in step 92 may also be used in order to instruct the maneuvering of ultrasound transducer 18 into a required position.
Once the relative position between the scanning planes/volume is calculated the images
from CT system 4 and ultrasound system 2 may be correlated and/or fused in an optional,
but highly preferred step 98. This can be performed according to conventional image processing algorithms and techniques. By correlating or fusing these images (from the
ultrasound and CT scanning beams, respectively), the displayed ultrasound image may
be enhanced. The superimposed image/information resulting from step 98 may be optionally used to improve the calculation at step 92 in an iterative mode. The
superimposed image/information resulting from step 98 may also be used in other medical
procedures (step 96) or in order to instruct maneuvering of ultrasound transducer 18
(step 120).
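The fusion of step 98 can be pictured as resampling the CT volume on the plane of the live ultrasound image and blending the two. The sketch below uses nearest-neighbour resampling and simple alpha blending; these choices, the frame conventions and the parameter names are assumptions for illustration, and the patent itself only refers to conventional image processing algorithms and techniques.

```python
import numpy as np

def fuse_ct_into_us(us_image, ct_volume, voxel_size_mm, T_ct_us, us_pixel_mm, alpha=0.5):
    """Resample CT volume values on the ultrasound image plane, using the relative
    pose from step 92 (T_ct_us: ultrasound-beam coordinates into CT-beam coordinates),
    and blend them with the live ultrasound frame (step 98)."""
    rows, cols = us_image.shape
    u, v = np.meshgrid(np.arange(cols), np.arange(rows))
    # Pixel centres of ultrasound image 36 in its own beam coordinates (plane y = 0).
    pts_us = np.stack([u * us_pixel_mm,
                       np.zeros_like(u, dtype=float),
                       v * us_pixel_mm,
                       np.ones_like(u, dtype=float)], axis=-1)
    pts_ct = pts_us @ T_ct_us.T                              # into CT-beam coordinates
    idx = np.round(pts_ct[..., :3] / voxel_size_mm).astype(int)
    idx = np.clip(idx, 0, np.array(ct_volume.shape) - 1)     # nearest voxel, clamped
    ct_on_us_plane = ct_volume[idx[..., 0], idx[..., 1], idx[..., 2]]
    return alpha * us_image + (1.0 - alpha) * ct_on_us_plane
```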
The imaging options 100 are, non-exhaustively, listed as follows. The images from CT system 4 and ultrasound system 2 may be displayed individually (steps 102
and 104), the relative position between CT and ultrasound scanning beams may be
displayed (step 106, illustrated in Fig. 3) and the result of image fusing, at step 98 may
also be displayed (step 108). Target and image correlation information can also be available to the operator indicating for example internal anatomic movements as
explained herein below.
In addition to and interacting with display functions 100, there are various
ancillary functions such as optionally marking a target 8 (step 112) on the image
produced by one of the imaging devices as appearing on display 10. Then the position of target 8 may be calculated within the scanning beam of the medical imaging device,
(step 114). Additionally/alternately, a reference plane/volume may be marked or
signaled according to the image produced by one of the imaging devices (step 116). The
position of the reference plane/volume may then be calculated (step 118). The data in steps 114 and/or 118 may be optionally used in order to instruct the positioning of CT
scanning head 22 and ultrasound transducer 18 with respect to each other (step 120).
A specific implementation to be used in connection to the present invention
regards the use of image correlation in applications where internal organs may
significantly move between the time the CT scan was performed and the time of performing a procedure. Body volume 6 is scanned by the CT system 4 producing high
resolution image/ information. The operator defines at least one target on CT image as
displayed on display 10. The same target is then scanned with ultrasound transducer
18. The position of the ultrasound scanning beam\image 36 is determined (as detailed
above) with respect to position of the CT scanning beam 38. Additionally, by
comparing the relative position of target 8 in the ultrasound image with the position
calculated from the CT image, it is possible to monitor and compensate for internal
movements or for respiratory changes in body 6 between taking the CT scan and the
time of performing an intervention. For this implementation it is preferred (but not
necessary) to mark more than one target/points in the CT image and then locate them
by ultrasound. This enables calculation of internal anatomic displacements inside body 6,
between the images produced by the CT and the situation during the time of an
intervention. The operation defined above can also be performed automatically
provided enough distinctive targets/points/clusters may be found in the CT and
ultrasound images.
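One conventional way to turn several such matched target/point pairs into a displacement estimate is a rigid least-squares fit. The sketch below uses the Kabsch algorithm; this particular choice, and the assumption that both point sets are already expressed in the same coordinates via the position calculations above, are illustrative and not stated in the patent.

```python
import numpy as np

def internal_displacement(pts_ct, pts_us):
    """Estimate the internal anatomic displacement inside body 6 from targets/points
    marked in the CT images (pts_ct) and re-located by ultrasound (pts_us), both given
    in the same coordinates. A rigid least-squares fit (Kabsch algorithm) is used here
    as one possible choice."""
    pts_ct = np.asarray(pts_ct, dtype=float)
    pts_us = np.asarray(pts_us, dtype=float)
    c_ct, c_us = pts_ct.mean(axis=0), pts_us.mean(axis=0)
    H = (pts_ct - c_ct).T @ (pts_us - c_us)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = c_us - R @ c_ct
    return R, t            # rotation and shift of the marked targets since the CT scan

R, t = internal_displacement([[0, 0, 0], [50, 0, 0], [0, 50, 0]],
                             [[2, 1, 0], [52, 1, 0], [2, 51, 0]])
print(t)   # roughly [2, 1, 0] mm for this fabricated example
```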
Reference is now made to FIG. 5a that illustrates a magnetic position
measuring system to be used in accordance with an exemplary embodiment of the
present invention. Similar items in previous figures have similar numbers and will not
be further described.
In this exemplary embodiment position measuring component 28 is a receiver
28' being attached to ultrasound transducer 18 and position measuring component 30 is
a transmitter 30' being attached to CT scanning head 22 by an arm 80'. Transmitter
30' is transmitting AC or DC magnetic/electromagnetic signals to receiver 28'. The output of receiver 28' is transmitted by wire or wireless connections to position
sensing controller 26, enabling calculation of the relative position of receiver 28' with
respect to transmitter 30'. Alternately, position measuring component 28 could be a
transmitter and position measuring component 30 could be a receiver.
Reference is now made to FIG. 5b wherein an optical position measuring
system is employed in accordance with another exemplary embodiment of the present
invention. Similar items in previous figures have similar numbers and will not be
further described. A stereo vision charge coupled device (CCD) camera 84' is
positioned on an arm 86' at a first reference location. Position measuring component
28 includes a cluster of LED's 28" being attached to ultrasound transducer 18 by an
arm 88' and position measuring component 30 includes a cluster of LED's 30" being
attached to CT scanning head 22 by an arm 80''. The relative position of cluster of
and also the relative position of cluster of LED's 30" is measured with respect to CCD
camera 84' (first reference location). It is therefore possible to calculate from the
above measurements the relative position of cluster of LED's 30'' with respect to cluster of LED's 28'' and hence, the relative position between ultrasound scanning
beam 36 and CT scanning beam 38.
The above detailed position measuring systems, which enable measurement of the
relative position between ultrasound transducer scanning beam 36 and CT scanning
beam 38, can be optical, acoustic, magnetic or inertial, or a combination of the above. The relative position between position measuring components 28 and 30 can be
measured directly, for example, when one of the components is a receiver and the other one is a transmitter as illustrated in the exemplary embodiment in FIG. 5a.
Alternately, the relative position between position measuring components 28 and 30
can be calculated indirectly, for example, by measuring the position of each with
respect to a reference location as illustrated in the exemplary embodiment in FIG. 5b. When making these indirect calculations, a third position measuring component
84 (illustrated in FIG. 5b by a stereo Charge Coupled Device (CCD) 84') is in
operative communication with position measuring components 28 and 30; this CCD 84
is positioned at a first reference location. The first reference location may be fixed and
known. Alternately, the first reference position can be movable and unknown.
Optionally, the first reference position can be attached to the bed 24.
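For the indirect case, the relative pose follows from the two poses measured against the common reference, as in the small sketch below (the placeholder 4x4 matrices are assumptions so the example runs):

```python
import numpy as np

# Indirect measurement sketch (FIG. 5b): the stereo camera 84' at the first reference
# location reports the pose of LED cluster 28'' and of LED cluster 30'' in its own
# frame; the pose of one cluster relative to the other follows by composition.
T_cam_pmc28 = np.eye(4); T_cam_pmc28[:3, 3] = [0.10, 0.00, 0.80]
T_cam_pmc30 = np.eye(4); T_cam_pmc30[:3, 3] = [0.40, 0.05, 0.90]

# Relative pose of PMC 30 expressed in the frame of PMC 28, the same quantity that a
# direct transmitter/receiver pair (FIG. 5a) would measure.
T_pmc28_pmc30 = np.linalg.inv(T_cam_pmc28) @ T_cam_pmc30
print(T_pmc28_pmc30[:3, 3])    # [0.3, 0.05, 0.1]
```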
Position measuring components 28, 30 and 84 (if part of the system) may be
any of the following group: transmitter or receiver or reflector or transceiver or optical
indicia or any combination of the above. Position measuring components 28, 30 and 84 (if part of the system) may be part of a magnetic or acoustic or optic or inertial
position measuring system or a combination of the above.
Position sensing controller 26 may communicate with at least one or all of
position measuring components 28, 30 and 84 (if part of the system) by wired or
wireless links.
Reference is now made to FIG. 6, which illustrates an additional embodiment
of the present invention. Similar items to those shown in previous figures have similar
numbers and will not further be described. The second embodiment illustrates an
ultrasound 2, and an X-Ray 138 to be used in cooperative operation according to the present invention. The X-Ray imaging device comprises X-Ray main unit 140 and X-Ray scanning head 142 including emitter 142' and detector 142''. X-Ray scanning head 142 is mounted on a movable and adjustable arm 144. Bed 24 may or may not be
part of X-Ray imaging device 138.
Position measuring component 30 is attached at a known and fixed position from
X-Ray scanning head 142. Additionally, position measuring component 30 is calibrated
to the scanning head 142 such that it is at a known position from the scanning volume
of the X-Ray. Such calibrations can be achieved by operating according to patent
application PCT/IL98/00631. The relative position between ultrasound transducer
scanning beam 36 and the X-ray scanning volume 146 can be calculated as described
in the first embodiment based on direct measurement between position measuring
components 28 and 30.
An additional position measuring component 84 can be placed at a reference
location on an arm 86. This position measuring component 84 (if used) is in
cooperative communication with position measuring components 28 and 30. Thus, the
position of ultrasound transducer 18 and of the X-Ray scanning head 142 are measured
with respect to the reference position similar to the calculation described in relation to
FIG. 5b. Body 6 can be fixed so as to avoid movement during the procedure. X-Ray
scanning head 142 is positioned in order to view target 8 in the body 6 or body volume
at two different positions. The position of X-Ray scanning head 142 is measured with
respect to the reference position, thus enabling correlation of the two images into stereo
information in order to obtain a 3D image of the scanned body 6. Algorithms for
those skilled in the art. In one exemplary use of the present invention, the operator may indicate target 8
on the image received from the X-Ray 138 for example, by marking it with a mouse
on the display 10 (as described herein above with reference to Figs. 1-3). The relative
position of target 8 can be calculated with respect to the reference point. Ultrasound
transducer 18 is then applied to body volume 6 and its position is measured with
respect to the reference position. Thus, it is possible to calculate the position of
ultrasound scanning plane 36 with respect to the image volume received from X-Ray
138 and target 8.
Alternately, the body volume 6 is first imaged by ultrasound in order to
establish a desired reference plane/volume that includes a target 8. The operator selects
a reference plane, in accordance with the procedures detailed above. Alternately, the
operator indicates target 8 for example by marking it on the display 10, using conventional marking software, as described above. Data processor 14 stores the
position of target 8 or of reference plane with respect to the reference location.
Ultrasound transducer 18 is then removed and the position of X-Ray scanning head
142 is calculated with respect to reference plane or target 8. Thus, it is then possible to
position X-Ray scanning head 142 in an optimal way at two different positions so as
to view target 8 and afterwards produce a 3D image/information.
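A common way to carry out the stereo reconstruction mentioned above is to cast a ray from each of the two measured X-ray head positions through the marked target and intersect the rays. The simple two-ray model below is an assumption for illustration; the patent only notes that such algorithms are known to those skilled in the art.

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Recover the 3D position of target 8 from two X-ray views taken with scanning
    head 142 at two measured positions: each view defines a ray from the focal spot
    through the marked target, and the midpoint of the closest approach of the two
    rays is returned."""
    da, db = dir_a / np.linalg.norm(dir_a), dir_b / np.linalg.norm(dir_b)
    # Solve for the ray parameters minimising |(origin_a + s*da) - (origin_b + t*db)|.
    A = np.array([[da @ da, -da @ db],
                  [da @ db, -db @ db]])
    rhs = np.array([(origin_b - origin_a) @ da, (origin_b - origin_a) @ db])
    s, t = np.linalg.solve(A, rhs)
    return 0.5 * ((origin_a + s * da) + (origin_b + t * db))

p = triangulate(np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0]),
                np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 0.0]))
print(p)   # approximately [1, 1, 0]
```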
Additional modalities of employing in operative cooperation ultrasound 2 and
X-Ray 138 are similar to those described for the system illustrated in Figs. 1a-4 and
described above.
Reference is now made to FIGS. 7a and 7b, which illustrate an additional embodiment of the present invention. Similar items to those shown in previous figures have similar numbers and will not further be described. This embodiment
illustrates an ultrasound 2, and an optical endoscope 150 to be used in cooperative
operation according to the present invention. Said endoscope 150 comprises an
endoscope head 152 with a CCD 154 (not shown in the drawing) and optical apparatus
156 (not shown in the drawing). Endoscope head 152 can be rigid, or can be flexible. It is still possible to use a rigid endoscope head 152 with a mobile tip enabling
change of the angle of view. It is also possible to use an endoscope with a changing field of
view.
Position measuring component 30 is attached at a known position with respect to endoscope head 152 and calibrated to the endoscope image 158. Position measuring component 28 is attached at a known position with respect to transducer 18 and calibrated to transducer beam/image 36 as described above. If endoscope head 152 is rigid, position measuring component 30 may be attached internally or externally at any fixed position with respect to endoscope head 152. If endoscope head 152 is flexible or has a mobile tip, position measuring component 30 is positioned at the tip of head 152. An example of such a position measuring component is the magnetic sensor manufactured by Mednetix Inc. Alternately, position measuring component 30 can be a combination of two different functional sub-components. A first sub-component of position measuring component 30 measures the position of head 152 in a default situation (default bending or default tip position). A second sub-component of position measuring component 30 is attached to the flexible part of endoscope head 152 (and preferably also to the tip) and provides an indication of the bending or movement of the flexible part with respect to said default situation. An illustrative example of such a second sub-component is the fiber optic sensor manufactured by Measurand Inc. Still alternately, in the case of a rigid endoscope with a mobile tip, it may be possible to receive from the endoscope information regarding the deviation of the tip from the default situation.
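For illustration only, the combination of the two sub-components can be thought of as composing a default tip pose with a measured deviation. The minimal sketch below (Python/NumPy) reduces the deviation to a single bending angle about one tip axis; the function and variable names and the numeric values are assumptions introduced here, not a description of any particular sensor.

    import numpy as np

    def rot_x(angle_rad):
        """4x4 transform for a rotation about the x axis."""
        c, s = np.cos(angle_rad), np.sin(angle_rad)
        T = np.eye(4)
        T[1:3, 1:3] = [[c, -s], [s, c]]
        return T

    # First sub-component: pose of endoscope head 152 in the default situation,
    # reported in the coordinate frame of the position measuring system.
    T_world_from_default_tip = np.eye(4)       # placeholder measurement

    # Second sub-component (e.g. a bend sensor along the flexible part):
    # deviation of the tip from the default situation, here simplified to a
    # single bending angle.
    bend_deg = 12.0
    T_default_tip_from_tip = rot_x(np.deg2rad(bend_deg))

    # Current tip pose = default pose composed with the measured deviation.
    T_world_from_tip = T_world_from_default_tip @ T_default_tip_from_tip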
According to one aspect of the present invention, target 8 is viewed by ultrasound transducer 18, and endoscope head 152 is maneuvered to view target 8 based on guiding information received from data processor 14 (not shown in FIG. 7) and displayed on display 10 (not shown in FIG. 7). The guidance can be in accordance with the methods introduced by the assignees in the above-cited patent applications PCT/IL96/00050 or PCT/IL98/00578.
According to another aspect of the present invention, target 8 is viewed by transducer 18 and also by endoscope head 152. It is then possible to mark target 8 in the ultrasound image and calculate its position with respect to position measuring component 28. From the measured relative position between position measuring components 30 and 28, and based on the calibration of position measuring component 30 to the endoscope image 158, it is possible to calculate the 3D position of target 8 with respect to endoscope image 158. This makes it possible to guide an invasive tool towards target 8, based on endoscope image 158, from any desired angle according to the method and apparatus described in the above-cited patent applications PCT/IL96/00050 or PCT/IL98/00578.
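As an illustration of relating the calculated 3D position of target 8 to endoscope image 158, the following minimal sketch projects the target through an assumed pinhole camera model of the calibrated endoscope, yielding the image pixel at which a guidance overlay could be drawn; the intrinsic matrix K and all numeric values are placeholders, not parameters from the disclosure.

    import numpy as np

    # Assumed inputs (placeholders): target 8 expressed in the coordinate
    # frame of the calibrated endoscope camera, and a pinhole intrinsic
    # matrix K for endoscope image 158.
    p_target_cam = np.array([5.0, -3.0, 80.0])            # mm, in camera frame
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])

    # Perspective projection of the 3D target onto the endoscope image.
    uvw = K @ p_target_cam
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]

    # Distance of the target in front of the lens (useful as a depth cue).
    depth_mm = p_target_cam[2]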
According to another aspect of the present invention, it is possible to obtain depth information from the endoscope image by alternative methods, without the need to use transducer 18. According to one such method, a target is viewed by endoscope 150 while endoscope head 152 is moved through several positions along a straight line around the focused position, at least one position providing a focused image of the target. From the focusing and de-focusing of the target, and from the measured position of position measuring component 30, it is possible to obtain the depth of the target in the endoscope image using algorithms known as "depth from focus". According to another method, endoscope head 152 views a volume of body 6 comprising target 8 from at least two different positions, enabling 3D stereo imaging to be implemented. The implementation of the stereo imaging algorithm is based on knowing the relative position between the two or more positions of endoscope head 152. According to still another method, it is possible to obtain the depth of a target 8 by methods known to those skilled in the art as "depth from shading". Such methods are described in US Patent No. 4,714,319 and US Patent No. 4,695,130.
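Purely for illustration, a depth-from-focus style estimate can be sketched by scoring the sharpness of the target in each image of the sweep and reading off the tracked head position at which sharpness peaks. The sketch below (Python/NumPy) uses a simple gradient-variance focus measure; the images, positions and function names are placeholder assumptions, not the specific algorithms of the cited references.

    import numpy as np

    def focus_measure(image):
        """Simple sharpness metric: variance of the image gradient magnitude."""
        gy, gx = np.gradient(image.astype(float))
        return float(np.var(np.hypot(gx, gy)))

    # Assumed inputs: images of the target acquired while the tracked endoscope
    # head is stepped along a straight line, and the head position (mm along
    # that line) reported by position measuring component 30 for each image.
    images = [np.random.rand(64, 64) for _ in range(5)]     # placeholders
    positions_mm = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

    scores = np.array([focus_measure(im) for im in images])
    best = int(np.argmax(scores))

    # The head position at which the target is sharpest; combined with the
    # lens's fixed focal distance this yields the depth of the target.
    best_position_mm = positions_mm[best]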
All of the above-described methods can provide 3D information regarding the position of target 8 in the image produced by endoscope 150. It is therefore possible to attach an additional position measuring component to an invasive tool and guide it to target 8, assisted by endoscope imaging, in accordance with the guidance methods of the above-cited patent applications PCT/IL96/00050 or PCT/IL98/00578.
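As a final illustration of guiding a tracked invasive tool towards target 8, a simple guidance cue is the distance from the tool tip to the target and the angular deviation between the tool axis and the tip-to-target direction. The following minimal sketch assumes all quantities are already expressed in a common reference frame; the variable names and numeric values are placeholders introduced here.

    import numpy as np

    # Assumed inputs (illustrative placeholders): tracked tool-tip position,
    # tool axis direction, and the 3D position of target 8.
    tip = np.array([0.0, 0.0, 0.0])
    tool_axis = np.array([0.0, 0.1, 1.0])
    target = np.array([5.0, -2.0, 90.0])

    to_target = target - tip
    distance_mm = float(np.linalg.norm(to_target))

    # Angle between the tool axis and the tip-to-target direction: a simple
    # guidance cue that could be displayed to the operator.
    cos_a = np.dot(tool_axis, to_target) / (np.linalg.norm(tool_axis) * distance_mm)
    deviation_deg = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))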
While the invention has been described with respect to several preferred
embodiments, it will be appreciated that these are set forth merely for purposes of
example, and that many variations, modifications and applications of the invention may
be made. Accordingly, the scope of the invention is defined by the claims, which
follow.

Claims

1. A method enabling free of predefined mechanical constraints cooperative
operation of two or more medical imaging devices useful in medical diagnosis and/or
medical therapy planning and/or medical intervention planning and/or medical therapy
and/or medical intervention and/or medical procedures, the method comprising the
steps of:
imaging a body volume/plane with a first medical imaging device;
sensing the position of a second medical imaging device with respect to said first medical imaging device by means of a position measuring system comprising a position measuring controller and position measuring components;
scanning part or all of said body volume/plane with said second medical imaging device;
calculating the relative position of said second medical imaging device and/or the relative position of the scanning plane/volume produced by said second medical imaging device with respect to said first medical imaging device and/or with respect to the scanning plane/volume produced by said first medical imaging device; and
displaying on at least one display screen said calculation in a cooperative way to the medical operator.
2. The method according to claim 1 where said first and said second medical
imaging devices operate sequentially.
3. The method according to claim 1 where said first and said second medical
imaging devices operate simultaneously and/or intermittently.
4. The method of claim 1 where said first and said second medical imaging devices are each one of the group: X-Ray, CT, MRI, ultrasound or endoscope.
5. The method of claim 1 where the position measuring system is from the group:
magnetic, optic, acoustic, inertial, fiber optic or a combination of the above.
6. The method of claim 1 where the step of sensing the position of said second medical imaging device with respect to said first medical imaging device is performed by means of wired or wireless communication.
7. The method according to claim 1 where the position of the scanning
plane/volume/image produced by said second medical imaging device is calculated with respect to the scanning plane/volume/image produced by said first medical
imaging device.
8. The method according to claim 1 where the scanned plane/volume produced by
said first medical imaging device is correlated and/or fused by image processing tools
with the scanned plane/volume/image produced by said second medical imaging
device.
9. The method according to claim 1 further comprising the step of indicating to said
position sensing system the position of a target by marking said target on said at least
one display screen or by automatic recognition of the target in the image.
10. The method according to claim 1 further comprising the step of indicating to said position sensing system the position of a reference plane/volume by marking said
reference plane/volume on said at least one display screen.
11. The method according to any of claims 9 or 10, where the position of said second medical imaging device is calculated with respect to said target and/or said
reference plane/volume/image.
12. The method according to claim 1 where the position of the scanning plane/volume produced by said second medical imaging device is calculated with respect to said target and/or said reference plane/volume.
13. The method according to claim 1 and further comprising the step of indicating
on said at least one display screen the actual progressive motion of said second medical imaging device towards said target and/or reference plane/volume.
14. The method according to claim 1 and further comprising the step of indicating
on said at least one display screen the deviation of said second medical imaging device or scanning plane/volume produced by said second medical imaging device from said
target and/or from said reference plane/volume.
15. The method according to claim 1 and further comprising the step of adjusting the position of said second medical imaging device so as to cause it to include in its scan plane/volume/image said target or to cause its scan plane/volume to coincide with said reference plane/volume.
16. The method according to claim 1 and further comprising the step of correlating the calculated position of at least one target or the relative position between several targets/points in order to assess internal anatomic movements between different stages
of a medical procedure.
17. Apparatus enabling free of predefined mechanical constraints cooperative
operation of two or more medical imaging devices useful in medical diagnosis and/or
medical therapy planning and/or medical intervention planning and/or medical therapy
and/or medical intervention and/or medical procedures, the apparatus comprising:
one first medical imaging device;
one second medical imaging device;
a position measuring system comprising at least the following: a position sensing controller, at least one first position measuring component at a known position with respect to said first medical imaging device, and a second position measuring component at a known position with respect to said second medical imaging device; and
a data processor for receiving data from the position sensing controller for calculating the relative position of said second medical imaging device and/or the relative position of the scanning plane/volume produced by said second medical imaging device with respect to said first medical imaging device and/or with respect to the scanning plane/volume produced by said first medical imaging device, said data processor displaying on at least one display screen said calculation in a cooperative way to the medical operator.
18. Apparatus according to claim 17 where the at least one display screen is a single display screen.
19. Apparatus according to claim 17 where said first and said second medical
imaging device operate simultaneously and/or intermittently.
20. Apparatus of claim 17 where said first and said second medical imaging devices are each one of the group: X-Ray, CT, MRI, ultrasound or endoscope.
21. The apparatus of claim 17 where the position measuring system is from the
group: magnetic, optic, acoustic, inertial, fiber optic or a combination of the above.
22. Apparatus of claim 17 where the step of sensing the position of said second
medical imaging device with respect to said first medical imaging device is performed by means of wired or wireless communication.
23. Apparatus according to claim 17 where said at least one first position measuring component is attached onto said first medical imaging device.
24. Apparatus according to claim 17 where said at least one second position measuring component is attached onto said second medical imaging device.
25. Apparatus according to claim 17 where said at least one first position measuring component and said at least one second position measuring component work
in operative communication.
26. Apparatus according to claim 17 where said calculation is based on the direct
measurement of the relative position between said at least one first position measuring component and said at least one second position measuring component.
27. Apparatus according to claim 17 where said calculation is based on directly measuring the position of said at least one first position measuring component with respect to said at least one second position measuring component.
28. Apparatus according to claim 17 where said position measuring system additionally comprises at least one third reference position measuring component being
placed at a first reference location; said at least one third position measuring component being in operative communication with said at least one first position measuring component and said at least one second position measuring component and enabling calculation of the relative position between them.
29. Apparatus according to claim 17 where the scanned plane/volume produced by
said at least one first medical imaging device is correlated and/or fused by image processing tools with the scanned plane/volume produced by said at least one second
medical imaging device.
30. Apparatus according to claim 17 and further comprising the step of correlating the calculated position of at least one target or the relative position between several targets/points in order to assess internal anatomic movements between different stages
of a medical procedure.
31. Apparatus according to claim 17 further comprising the step of indicating to
said position sensing system the position of a target by marking said target on said at
least one display screen or by automatic recognition from the image.
32. Apparatus according to claim 17 further comprising the step of indicating to said position sensing system the position of a reference plane/volume by marking said reference plane/volume on said at least one display screen.
33. Apparatus according to any of claims 31-32, where the position of said at least one second medical imaging device is calculated with respect to said target and/or said reference plane/volume.
34. Apparatus according to claim 17 where the position of the scanning plane/volume produced by said at least one second medical imaging device is calculated with respect to said target and/or said reference plane/volume.
35. Apparatus according to claim 17 and further comprising the step of indicating on said at least one display screen the actual progressive motion of said at least one
second medical imaging device towards said target and/or reference plane/volume.
36. Apparatus according to claim 17 and further comprising the step of indicating on said at least one display screen the deviation of said at least one second medical imaging device or scanning plane/volume produced by said at least one second medical imaging device from said target and/or from said reference plane/volume.
37. Apparatus according to claim 17 and further comprising the step of correlating the calculated position of at least one target or the relative position between several targets/points in order to assess internal anatomic movements between different stages
of a medical procedure.
38. Apparatus enabling guidance of an invasive tool towards a target visible by endoscope means, in a free of predefined mechanical constraints cooperative operation, the apparatus comprising:
an endoscopic imaging device;
an invasive tool;
a position measuring system comprising at least the following: a position sensing controller, at least one first position measuring component at a known position with respect to said endoscopic imaging device, and a second position measuring component at a known position with respect to said invasive tool; and
a data processor for receiving data from the position sensing controller for calculating the relative position of said invasive tool with respect to the image(s) produced by said endoscopic imaging device, said data processor displaying on at least one display screen said calculation in a cooperative way to the medical operator.
EP00914346A 1999-03-31 2000-03-30 Apparatus and methods for medical diagnostic and for medical guided interventions and therapy Withdrawn EP1196089A2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12726799P 1999-03-31 1999-03-31
US127267P 1999-03-31
PCT/IL2000/000202 WO2000057767A2 (en) 1999-03-31 2000-03-30 Apparatus and methods for medical diagnostic and for medical guided interventions and therapy

Publications (1)

Publication Number Publication Date
EP1196089A2 true EP1196089A2 (en) 2002-04-17

Family

ID=22429204

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00914346A Withdrawn EP1196089A2 (en) 1999-03-31 2000-03-30 Apparatus and methods for medical diagnostic and for medical guided interventions and therapy

Country Status (4)

Country Link
EP (1) EP1196089A2 (en)
JP (1) JP2003527880A (en)
AU (1) AU3573900A (en)
WO (1) WO2000057767A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9082178B2 (en) 2009-07-31 2015-07-14 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9545242B2 (en) 2009-07-31 2017-01-17 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6582381B1 (en) * 2000-07-31 2003-06-24 Txsonics Ltd. Mechanical positioner for MRI guided ultrasound therapy system
AU2001292836A1 (en) * 2000-09-23 2002-04-02 The Board Of Trustees Of The Leland Stanford Junior University Endoscopic targeting method and system
FR2816825A1 (en) * 2000-11-22 2002-05-24 Pierre Roussouly Use of radiography as an aid to surgical navigation with radiation exposure minimized by use of a method for combining a single radiographic image of the operation area with means for determining surgical instrument position
CA2438005A1 (en) * 2001-02-07 2002-08-15 Synthes (U.S.A.) Device and method for intraoperative navigation
JP4056791B2 (en) * 2002-05-22 2008-03-05 策雄 米延 Fracture reduction guidance device
EP2460474B1 (en) * 2003-05-08 2015-12-16 Hitachi Medical Corporation Reference image display method for ultrasonography and ultrasonic diagnosis apparatus
JP4533638B2 (en) * 2004-01-30 2010-09-01 オリンパス株式会社 Virtual image display system
US8150495B2 (en) 2003-08-11 2012-04-03 Veran Medical Technologies, Inc. Bodily sealants and methods and apparatus for image-guided delivery of same
US7398116B2 (en) 2003-08-11 2008-07-08 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US7244234B2 (en) 2003-11-11 2007-07-17 Soma Development Llc Ultrasound guided probe device and method of using same
DE602004024580D1 (en) * 2003-12-22 2010-01-21 Koninkl Philips Electronics Nv SYSTEM FOR LEADING A MEDICAL INSTRUMENT IN THE BODY OF A PATIENT
JP2008504847A (en) * 2004-04-02 2008-02-21 シヴコ メディカル インスツルメンツ カンパニー インコーポレイテッド Support system for use when performing medical imaging of patients
JP4559113B2 (en) * 2004-05-12 2010-10-06 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Imaging plan creation method and X-ray CT apparatus
JP4625281B2 (en) * 2004-07-14 2011-02-02 アロカ株式会社 Medical diagnostic system
US7833221B2 (en) 2004-10-22 2010-11-16 Ethicon Endo-Surgery, Inc. System and method for treatment of tissue using the tissue as a fiducial
US10555775B2 (en) * 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US8147503B2 (en) 2007-09-30 2012-04-03 Intuitive Surgical Operations Inc. Methods of locating and tracking robotic instruments in robotic surgical systems
US8073528B2 (en) 2007-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Tool tracking systems, methods and computer products for image guided surgery
US8108072B2 (en) 2007-09-30 2012-01-31 Intuitive Surgical Operations, Inc. Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
WO2007033206A2 (en) 2005-09-13 2007-03-22 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US20070066881A1 (en) 2005-09-13 2007-03-22 Edwards Jerome R Apparatus and method for image guided accuracy verification
US20100063400A1 (en) * 2008-09-05 2010-03-11 Anne Lindsay Hall Method and apparatus for catheter guidance using a combination of ultrasound and x-ray imaging
WO2010144405A2 (en) 2009-06-08 2010-12-16 Surgivision, Inc. Mri-guided surgical systems with proximity alerts
EP2442718B1 (en) 2009-06-16 2018-04-25 MRI Interventions, Inc. Mri-guided devices and mri-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
DE102009048361A1 (en) * 2009-10-06 2011-04-07 Richard Wolf Gmbh Medical therapy device
US8496592B2 (en) 2009-10-09 2013-07-30 Stephen F. Ridley Clamp for a medical probe device
WO2012024686A2 (en) 2010-08-20 2012-02-23 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation
US8425425B2 (en) 2010-09-20 2013-04-23 M. Dexter Hagy Virtual image formation method for an ultrasound device
EP2651308B1 (en) 2010-12-14 2020-03-11 Hologic, Inc. System and method for fusing three dimensional image data from a plurality of different imaging systems for use in diagnostic imaging
JP5657467B2 (en) * 2011-05-13 2015-01-21 オリンパスメディカルシステムズ株式会社 Medical image display system
US9138165B2 (en) 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9474505B2 (en) * 2012-03-16 2016-10-25 Toshiba Medical Systems Corporation Patient-probe-operator tracking method and apparatus for ultrasound imaging systems
CN105377112B (en) * 2013-07-05 2017-08-18 奥林巴斯株式会社 Medical display device and endoscope surgery system
JP5675930B2 (en) * 2013-10-28 2015-02-25 株式会社東芝 X-ray diagnostic equipment
CN105979900B (en) * 2014-02-04 2020-06-26 皇家飞利浦有限公司 Visualization of depth and position of blood vessels and robot-guided visualization of blood vessel cross-sections
US20150305650A1 (en) 2014-04-23 2015-10-29 Mark Hunter Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US20150305612A1 (en) 2014-04-23 2015-10-29 Mark Hunter Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
WO2017157974A1 (en) * 2016-03-16 2017-09-21 Koninklijke Philips N.V. System for assisting in performing an interventional procedure
JP6974354B2 (en) 2016-05-27 2021-12-01 ホロジック, インコーポレイテッドHologic, Inc. Synchronized surface and internal tumor detection

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9025431D0 (en) * 1990-11-22 1991-01-09 Advanced Tech Lab Three dimensional ultrasonic imaging
US5662111A (en) * 1991-01-28 1997-09-02 Cosman; Eric R. Process of stereotactic optical navigation
US5829444A (en) * 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
EP0845959A4 (en) * 1995-07-16 1998-09-30 Ultra Guide Ltd Free-hand aiming of a needle guide

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0057767A3 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9082178B2 (en) 2009-07-31 2015-07-14 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9468422B2 (en) 2009-07-31 2016-10-18 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9545242B2 (en) 2009-07-31 2017-01-17 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9782151B2 (en) 2009-07-31 2017-10-10 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9955951B2 (en) 2009-07-31 2018-05-01 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10271822B2 (en) 2009-07-31 2019-04-30 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10278663B2 (en) 2009-07-31 2019-05-07 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10561403B2 (en) 2009-07-31 2020-02-18 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system

Also Published As

Publication number Publication date
AU3573900A (en) 2000-10-16
WO2000057767A3 (en) 2001-01-11
JP2003527880A (en) 2003-09-24
WO2000057767A2 (en) 2000-10-05

Similar Documents

Publication Publication Date Title
EP1196089A2 (en) Apparatus and methods for medical diagnostic and for medical guided interventions and therapy
JP5190510B2 (en) Multifunctional robotized platform for neurosurgery and position adjustment method
JP4758355B2 (en) System for guiding medical equipment into a patient's body
US6796943B2 (en) Ultrasonic medical system
CN108601628B (en) Navigation, tracking and guidance system for positioning a working instrument in a patient's body
US7076286B2 (en) Surgical microscope
EP0931516B1 (en) Surgical probe locating system for head use
US6996430B1 (en) Method and system for displaying cross-sectional images of a body
US7065393B2 (en) Apparatus, system and method of calibrating medical imaging systems
US6628977B2 (en) Method and system for visualizing an object
JP4470187B2 (en) Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method
WO2001001845A2 (en) Apparatus and methods for medical interventions
JP3707830B2 (en) Image display device for surgical support
JP2001061861A (en) System having image photographing means and medical work station
KR19990029038A (en) Free aiming of needle ceramic
US7278969B2 (en) Ultrasonic observation system
CN109313698B (en) Simultaneous surface and internal tumor detection
JP2008154833A (en) Ultrasonograph and report image preparation method
JP2010119576A (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic method
CN115211964A (en) Method, device, system and computer readable storage medium for performing operation by combining ultrasound and X-ray
JPH05305073A (en) Position detection display device for insertion tool
JP6287257B2 (en) Image forming apparatus and ultrasonic diagnostic apparatus
JP2003079616A (en) Detecting method of three-dimensional location of examination tool which is inserted in body region
EP3607885A1 (en) Ultrasonic diagnostic apparatus
JP3988183B2 (en) Magnetic resonance imaging system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20011030

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20041001