WO2010057315A1 - Apparatus and method for imaging a medical instrument - Google Patents

Apparatus and method for imaging a medical instrument

Info

Publication number
WO2010057315A1
Authority
WO
WIPO (PCT)
Prior art keywords
probes
ultrasound
medical instrument
image
needle
Application number
PCT/CA2009/001700
Other languages
English (en)
Inventor
Robert Rohling
Original Assignee
The University Of British Columbia
Application filed by The University Of British Columbia
Priority to US13/130,291 (published as US20110301451A1)
Publication of WO2010057315A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34 Trocars; Puncturing needles
    • A61B17/3403 Needle locating or guiding means
    • A61B2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting or locating foreign bodies or organic structures
    • A61B8/0841 Detecting or locating foreign bodies or organic structures for locating instruments
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4477 Using several separate ultrasound transducers or probes
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data

Definitions

  • This invention relates generally to medical imaging, and particularly to an apparatus and method for imaging a medical instrument while it is being inserted into a patient.
  • Some medical procedures require a needle or needle-like instrument to be inserted into a patient's body to reach a target.
  • These procedures include tissue biopsies, drug delivery, drainage of fluids, ablation for cancer treatment, and catheterization.
  • Some of these procedures can be done manually without any additional guidance other than the sense of feel and visualization of the surface of the body.
  • Other procedures are difficult to perform without additional guidance because the target is deep, the target is small, sense of feel is inadequate for recognizing when the needle's tip has reached the target, or there is a lack of visual landmarks on the body surface.
  • Providing the health care provider with an image of the interior of the body in the vicinity of the target would be beneficial. It would be particularly beneficial to provide real-time images of both the target and the needle as it progresses towards the target.
  • Epidural anaesthesia is administered to the majority of patients (>80% of women in labour) for pain relief during labour and delivery in North American hospitals.
  • Epidural anaesthesia involves the insertion of a needle into the epidural space in the spine.
  • The anatomy of the back and spine, in order of increasing depth from the skin, includes the skin and fat layers, the supraspinous and interspinous ligaments, the epidural space, the dura mater and the spinal cord. A doctor must insert the needle through these layers in order to reach the epidural space without over-inserting the needle and puncturing the thin dura mater surrounding the spinal cord.
  • The traditional procedure of epidural needle insertion will now be described.
  • The patient is seated with the doctor facing the patient's back.
  • The doctor chooses a puncture site between the vertebrae based on feeling the protruding spinous processes.
  • The doctor typically inserts the needle in a plane midline with the long axis of the spine.
  • A saline-filled syringe is attached to the needle so the doctor can apply pressure to the plunger of the syringe, as the needle is incrementally advanced toward the epidural space, and feel how easily saline is injected into the tissue.
  • The sense of feel is the main method for determining when the needle tip has reached the epidural space, because saline is injected much more easily into the epidural space than into the tissue encountered before it.
  • This method can result in failure rates of 6 to 20% depending on the experience and training of the health care provider.
  • Complications include inadvertent dura puncture resulting in loss of cerebral spinal fluid and headache, as well as nerve injury, paralysis and even death.
  • Image guidance during needle insertion would improve the accuracy of needle insertion by providing better feedback to the doctor of where the needle is located with respect to the anatomical structures including the target.
  • An ultrasound imaging and medical instrument guiding apparatus comprises: a first ultrasound probe configured to acquire a first volumetric dataset representing a 3-D image of a first volume; a second ultrasound probe configured to acquire a second volumetric dataset representing a 3-D image of a second volume; a mount to which the first and second probes are mounted; and a medical instrument guide.
  • The first and second probes are located on the mount such that the first and second volumes overlap to form an overlapping volume.
  • The medical instrument guide is positionable relative to the first and second ultrasound probes and configured to receive and guide a medical instrument along a propagation axis to a target such that the target and the propagation axis intersect the overlapping volume.
  • The apparatus can include a third ultrasound probe that is configured to acquire a 3-D image of a third volume, and which is mounted on the mount such that the first, second and third volumes overlap to form the overlapping volume.
  • Any or all of the probes can be mechanical 3-D probes or multidimensional probes.
  • The first and second probes can be curved and/or angled towards the propagation axis.
  • The mount can be a housing that houses the probes, and the medical instrument guide can be a closable channel that extends through the housing between the probes.
  • The mount can be a plate, and the medical instrument guide can be a closable channel that extends through the plate between the probes.
  • The mount can be a member, and the probes can be mounted to the member such that a space is provided between the probes for location of the medical instrument guide therein.
  • The medical instrument guide can be detachably mountable to the mount in one or more orientations.
  • The medical instrument guide can be remotely located relative to the probes and can comprise means for tracking the position of the medical instrument guide relative to the probes.
  • A system for acquiring and displaying ultrasound medical images comprises the above ultrasound imaging and instrument guiding apparatus and circuitry that is communicative with this apparatus to receive the first and second volumetric datasets therefrom.
  • The circuitry comprises a processor with a memory having programmed thereon steps and instructions for execution by the processor to: condition the first and second volumetric datasets; combine the first and second volumetric datasets and calculate the overlapping volume in the first and second volumetric datasets; perform one or both of ray-tracing and re-slicing to produce one or more 2-D images from the overlapping volume; and enhance one or more of the produced 2-D images.
  • The system also comprises a display device communicative with the circuitry to receive and display one or more of the produced 2-D images.
  • The memory can be further programmed to calculate an anticipated trajectory of the needle along the propagation axis, and overlay the calculated anticipated trajectory on one or more of the produced 2-D images.
  • Calculation of the overlapping volume can comprise spatial compounding.
  • Ray-tracing can be performed to produce a 2-D projection image of one of the first and second volumetric datasets, or of a combination of the first and second volumetric datasets.
  • Re-slicing can be performed on the first or second volumes, or on the calculated overlapping volume, to produce a cross-sectional plane image.
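The trajectory-overlay step described above can be sketched in a few lines. This is an illustrative sketch only: the image layout (rows = depth, columns = lateral position), the straight-line trajectory model and the function name are assumptions, not details from the specification.

```python
import numpy as np

def overlay_trajectory(image, entry_col, angle_deg=0.0, value=255):
    """Draw an anticipated straight-line needle trajectory onto a 2-D
    image (rows = depth, columns = lateral position). The needle is
    assumed to enter at the top row along the propagation axis,
    tilted by angle_deg from vertical."""
    out = image.copy()
    rows, cols = out.shape
    slope = np.tan(np.radians(angle_deg))  # lateral shift per row of depth
    for r in range(rows):
        c = int(round(entry_col + r * slope))
        if 0 <= c < cols:
            out[r, c] = value  # mark the trajectory pixel
    return out

# Overlay a vertical trajectory entering at column 50 of a blank image.
img = np.zeros((100, 100), dtype=np.uint8)
marked = overlay_trajectory(img, entry_col=50)
```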
  • A method of using the ultrasound imaging and needle guiding apparatus described above in an epidural anaesthetic procedure comprises placing the apparatus over a back of a patient such that the medical instrument guide is placed over a needle insertion point on the back, and emitting an ultrasound signal into the back and capturing images of the first and second volumes, wherein the images of the first and second volumes include a section of the patient's spine.
  • The target can be an epidural space in the patient, and each probe can be placed at a paramedian location with respect to the spine, particularly over the erector spinae muscles of the patient.
  • The method can further comprise inserting the needle through the medical instrument guide and along the propagation axis that intersects the target, such that the captured images include an image of the needle.
  • An ultrasound imaging and medical instrument guiding apparatus comprises: a first ultrasound probe configured to acquire a 2-D image of a first plane; a second ultrasound probe configured to acquire a 3-D image of a first volume; a mount on which the first and second probes are mounted; and a medical instrument guide.
  • The first and second probes are located on the mount such that the first volume intersects the first plane.
  • The medical instrument guide is positionable relative to the first and second ultrasound probes and is configured to receive and guide a medical instrument along a propagation axis to a target such that the target and the propagation axis intersect the first volume and the first plane.
  • Figure 1 is a schematic view of approximate locations of vertebrae, needle puncture point, and various imaging planes of a patient to be imaged by an ultrasound probe and subjected to an epidural anaesthesia procedure.
  • Figure 2 is a schematic back view of a dual-probe 3-D ultrasound imaging and needle guiding apparatus according to one embodiment of the invention and positioned to image the spine of the patient shown in Figure 1.
  • Figure 3 is a schematic top view of the ultrasound imaging and needle guiding apparatus and a cross-section of the patient's torso.
  • Figure 4 is a schematic back view of the ultrasound probes of the ultrasound imaging and needle guiding apparatus along with a representation of the spatial volume of the ultrasound image captured by each ultrasound probe.
  • Figure 5 is a schematic top view of the ultrasound probes of the ultrasound imaging and needle guiding apparatus along with a representation of the spatial volume of the ultrasound image captured by each ultrasound probe.
  • Figures 6(a)-(c) are schematic perspective views of the ultrasound probes of the imaging and needle guiding apparatus and a representation of the spatial volume of the ultrasound image captured by each ultrasound probe, wherein Figure 6(a) shows rays used in a ray-tracing imaging technique, Figure 6(b) shows a cross-sectional re-slice of overlapping volumes in a sagittal plane, and Figure 6(c) shows a sectional re-slice in a transverse plane.
  • Figure 7 is a schematic top view of the probes of the imaging and needle guiding apparatus and a needle, and a representation of ultrasound waves capturing an image of the needle.
  • Figure 8 is a schematic top view of a pair of curved ultrasound probes of the imaging and needle guiding apparatus according to another embodiment of the invention, along with a representation of the spatial volume of the ultrasound image captured by each ultrasound probe.
  • Figure 9 is a block diagram of an imaging system comprising the ultrasound imaging and needle guiding apparatus.
  • Figure 10 is a flow chart of a method for processing data from two 3-D images captured by the ultrasound imaging and needle guiding apparatus.
  • Figure 11 is a schematic back view of an ultrasound imaging and needle guiding apparatus having one 3-D ultrasound probe and one 2-D ultrasound probe according to another embodiment of the invention, along with a representation of the spatial volume and spatial area of the ultrasound image captured by each respective ultrasound probe.
  • Figure 12 is a schematic perspective view of an ultrasound imaging and needle guiding apparatus with three ultrasound probes according to another embodiment of the invention.
  • Figure 13 is a schematic view of a display device displaying multiple images captured by the ultrasound imaging and needle guiding apparatus.
  • Figure 14(a) is a schematic front view of a detachable medical instrument guide and Figure 14(b) is a schematic front view of the detachable medical instrument guide attached to an ultrasound imaging and needle guiding apparatus according to another embodiment.
  • Figure 15 is a schematic back view of an ultrasound imaging and needle guiding apparatus having a rod mount wherein the space between the ultrasound probes serves as a medical instrument guide according to yet another embodiment.
  • Ultrasound imaging is a technique for imaging the interior of the body with high frequency sound waves.
  • A standard ultrasound probe comprises a set of transducer elements emitting sound waves into the body. The sound waves reflect off tissue or bone in the body, and the reflected sound (echo) is detected by the same transducer elements. By calculating the time from emission to detection of the sound waves at each transducer and measuring the intensity of the reflected sound wave, an ultrasound image can be constructed that shows various anatomical features in the ultrasound probe's field of view.
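The pulse-echo principle described above (depth inferred from the round-trip time of the echo) reduces to a one-line calculation; the assumed speed of sound of 1540 m/s is the conventional value for soft tissue used in medical ultrasound, and the function name is illustrative.

```python
SPEED_OF_SOUND = 1540.0  # m/s, conventional value for soft tissue

def echo_depth(round_trip_time_s: float) -> float:
    """Depth of a reflector from the round-trip echo time. The wave
    travels to the reflector and back, so the one-way distance is
    half the total path length."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

# An echo received 65 microseconds after emission corresponds to a
# reflector roughly 5 cm deep.
depth_m = echo_depth(65e-6)
```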
  • Ultrasound scanning during a needle insertion procedure enables the observation of both the needle and the target on a real-time ultrasound display.
  • One advantage of such an ultrasound scanning-assisted needle insertion procedure is that the doctor can modify the path of needle insertion to correct the trajectory towards the target.
  • Embodiments of the invention described herein relate to an ultrasound imaging and needle guiding apparatus for guiding a needle to a target in a patient's body, such as the epidural space of the spine, and for acquiring real-time ultrasound images of the needle and target.
  • These described embodiments provide real-time or near real-time 3-D images of both the needle and the surrounding tissue and bone of the body using at least two ultrasound probes while the needle is being inserted through a medical instrument guide.
  • At least one of these probes is a 3-D ultrasound probe.
  • Some of the described embodiments include a method for using the ultrasound imaging and needle guiding apparatus and for processing acquired 3-D volumetric datasets from at least two ultrasound probes with intersecting scanning volumes for representation on a 2-D display.
  • An ultrasound imaging and needle guiding apparatus 200 enables the acquisition of simultaneous, or near simultaneous, images of anatomical features of a body 101 and of a needle 405 (as shown in Figure 5), ablation probe, catheter, guide wire or other medical instrument that is inserted into the body 101 and guided by the apparatus 200 towards a target 404.
  • The main components of the apparatus 200 are two 3-D ultrasound probes 201, 202, a mount 199 on which the probes 201, 202 are mounted, and a medical instrument guide 203 that in this embodiment is permanently affixed to the mount 199 but in other embodiments can be detachably mounted to the mount 199 or remotely located.
  • The ultrasound probes 201, 202 are spaced from each other and are positioned on the mount 199 to provide simultaneous or near-simultaneous 3-D imaging of a volume of interest in the body 101 and of the medical instrument 405 inserted into the volume of interest.
  • The mount 199 in this embodiment is a housing in which the probes 201, 202 are housed; alternatively, the mount 199 can be a rectangular mounting plate (not shown) to which the probes 201, 202 are mounted, or a rod or similar-shaped member to which the probes 201, 202 are mounted (as shown in Figure 15).
  • The apparatus 200 can be coupled to a data processing and display system 900, which includes circuitry 904, 905 for processing volumetric datasets representing the ultrasound images captured by the probes 201, 202, and a display device 909.
  • The 3-D ultrasound volumetric datasets obtained by the ultrasound probes 201, 202 can be processed and displayed as a single image or as multiple images of the volume of interest and the medical instrument 405 inserted into the volume of interest.
  • FIG. 1 shows the lower back of a patient's body 101; the vertebrae of the lower back are the thoracic vertebra T12 102, lumbar vertebrae L1 103, L2 104, L3 105, L4 106, L5 107 and the sacrum 108.
  • A preferred needle puncture site 111 is located between the third lumbar vertebra L3 105 and the fourth lumbar vertebra L4 106, in the midline M-M of the patient's spine and along a transverse plane T-T.
  • The apparatus 200 in this application is designed as a portable device that a health care provider can place on the back of the patient undergoing the epidural injection.
  • The apparatus 200 is positioned near the preferred puncture site 111 such that the health care provider may image the back and spine underneath the apparatus 200 and detect in the ultrasound image both the major anatomical features of interest and the tip and body of the needle 405 during the injection.
  • The probes 201, 202 are located on the mount 199 such that when the apparatus 200 is placed on the back of the patient with the medical instrument guide 203 directly above the preferred puncture site 111, the probes 201, 202 are located at positions 109, 110 directly above the erector spinae muscles; it is expected that the muscle tissue at these locations 109, 110 serves as a "window" that transmits ultrasound particularly well.
  • The probes 201, 202 can also be oriented on the mount 199 such that propagation of the sound waves from each probe 201, 202 is directed towards the spine.
  • Rails can be provided on the mount 199, and the probes 201, 202 can be slidably mounted on these rails to allow for adjustment of the probes' position to suit patients of different sizes.
  • The back of the mount 199 (i.e. the portion facing away from the body 101 during use) is provided with a hand grip that is shaped and sized to allow for easy single-handed gripping by the operator.
  • The back of the mount 199 can be further provided with finger grips shaped to accept the fingers of the operator.
  • The apparatus 200 is provided with an easy-to-grasp handle (not shown) so that the operator may hold the apparatus 200 with one hand comfortably against the patient's back during the procedure.
  • The handle may be a basket-type handle or a pistol-shaped grip protruding from the back of the mount 199.
  • The two 3-D probes 201, 202 emit sound waves into a 3-D volume that covers the part of the patient's spine underneath the apparatus 200, typically near the L3 and L4 vertebrae.
  • The received data from the reflected sound waves create a volumetric dataset (often abbreviated as a "volume") of the anatomy, unlike a 2-D ultrasound probe, which creates images of a cross-sectional plane.
  • The 3-D volume can be viewed by the operator in a number of ways, including a 3-D rendering on the 2-D display device 909 created by ray-casting or ray-tracing techniques adapted from the field of computer graphics, or by re-slicing the volume along a user-defined plane and displaying the cross-section of the volume.
  • The ability to view user-defined slices of the volume at any desired location and angle may be a way to alleviate the limitation of the two fixed planes in conventional biplane 2-D probes.
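Re-slicing the volume along a user-defined plane, as described above, can be sketched as follows. The parameterisation of the plane by an origin voxel and two in-plane direction vectors, and the nearest-neighbour sampling, are illustrative simplifications rather than details from the specification.

```python
import numpy as np

def reslice(volume, origin, u, v, shape):
    """Extract a 2-D cross-section from a 3-D volume along a plane
    defined by an origin voxel and two in-plane direction vectors u
    and v (in voxel units), using nearest-neighbour sampling."""
    origin, u, v = (np.asarray(x, dtype=float) for x in (origin, u, v))
    out = np.zeros(shape, dtype=volume.dtype)
    for i in range(shape[0]):
        for j in range(shape[1]):
            p = np.rint(origin + i * u + j * v).astype(int)
            # Keep only sample points that fall inside the volume.
            if all(0 <= p[k] < volume.shape[k] for k in range(3)):
                out[i, j] = volume[tuple(p)]
    return out

vol = np.arange(4 * 4 * 4).reshape(4, 4, 4)
# An axis-aligned slice at x = 2: origin (2, 0, 0), u along y, v along z.
sag = reslice(vol, (2, 0, 0), (0, 1, 0), (0, 0, 1), (4, 4))
```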
  • Real-time 3-D ultrasound imaging can be implemented by at least the following two methods:
  • A specialized 3-D probe is constructed by combining a 2-D probe with a motorized mechanism for rapidly moving the 2-D probe so that the 2-D image sweeps repeatedly through a volume of interest. Repeated sweeping is usually implemented in an oscillating manner where each oscillation produces a 3-D volume. The spatial relationship between the set of 2-D images from each oscillation is known because the probe motion is controlled, and the images are reconstructed into a 3-D Cartesian volume.
  • This device is referred to hereafter as a mechanical 3-D probe.
  • Alternatively, a specialized probe is created without a motorized mechanism, but instead uses a two-dimensional array of transducers to scan over a 3-D volume of interest.
  • The speed of volume acquisition is typically higher than that of mechanical probes, but the complexity of the probe increases and image quality can be inferior.
  • This probe is known as a multidimensional probe.
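The reconstruction step of the mechanical 3-D probe, in which 2-D frames acquired at known sweep angles are placed into a Cartesian volume, can be sketched as follows; the geometry (rotation of the image plane about the probe's lateral axis) and the nearest-neighbour placement are simplifying assumptions, not the patent's method.

```python
import numpy as np

def reconstruct_sweep(frames, angles_deg, vol_shape):
    """Scatter 2-D frames acquired at known sweep angles into a
    Cartesian volume. In each frame, rows index depth along the beam
    and columns index the lateral (x) position; the beam direction is
    tilted by the sweep angle about the lateral axis. Later frames
    overwrite earlier ones where they coincide."""
    vol = np.zeros(vol_shape)
    y0 = vol_shape[1] // 2  # sweep axis passes through the volume centre
    for frame, ang in zip(frames, angles_deg):
        th = np.radians(ang)
        rows, cols = frame.shape
        for r in range(rows):
            y = y0 + int(round(r * np.sin(th)))
            z = int(round(r * np.cos(th)))
            if 0 <= y < vol_shape[1] and 0 <= z < vol_shape[2]:
                vol[:cols, y, z] = frame[r]
    return vol

# Two frames, one at 0 degrees and one at 90 degrees of sweep.
frames = [np.ones((3, 2)), 2 * np.ones((3, 2))]
vol = reconstruct_sweep(frames, [0.0, 90.0], (2, 5, 5))
```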
  • The 3-D probes 201, 202 of the apparatus 200 can be mechanical 3-D probes or multidimensional 3-D probes as known in the art.
  • An example of a suitable mechanical 3-D probe is the RAB2-5 H46701M for the Voluson 730 ultrasound machine by General Electric Corporation (GE Healthcare, Chalfont St. Giles, United Kingdom).
  • An example of a suitable multidimensional probe is the X7-2 for the Philips iU22 ultrasound machine (Philips Healthcare, Andover, Massachusetts, USA). With such types of probes, the rapid creation of 3-D volumes allows multiple planes of the acquired volumes to be visualized in real time, thus overcoming some of the limitations of standard 2-D probes. These planes can be selected at any orientation and location within the volume through user control.
  • The medical instrument guide 203 in this embodiment is a channel which extends through the mount 199 and is sized to receive the epidural needle 405; although not shown, the channel can have a closable cover that extends along part or the entire length of the channel and which can be opened to allow access for cleaning, etc.
  • The medical instrument guide 203 is positioned between the 3-D probes 201, 202 and is used to constrain the path of the epidural needle 405 inserted during the injection procedure.
  • The axis A-A (see Figure 2) of the apparatus 200 is aligned approximately, to within 10 to 20 degrees measured about the axis of the medical instrument guide 203, with the midline axis M-M of the spine (see Figure 1) in the inferior-superior direction, while the axis B-B of the apparatus 200 is orthogonal to axis A-A and is aligned to extend to the left and right of the patient along the transverse axis T-T (see Figure 1).
  • The axis of the medical instrument guide 203 is aligned approximately with the axis C-C, which is the horizontal axis extending through the needle insertion point 111 (see Figure 3) and is directed towards the patient's back in the anterior-posterior direction.
  • The apparatus 200 can be provided with markings (not shown) representing axes A-A, B-B, and C-C to assist the operator in correctly positioning the apparatus 200 against the patient's back during use.
  • The apparatus 200 obtains volumetric datasets that are processed by the system 900 and displayed in multiple real-time views which assist the operator in guiding the medical instrument 405 to the target. Two of these views are the sagittal plane, which is the plane along axes M-M and C-C, and the transverse plane, which is the plane along axes T-T and C-C.
  • The medical instrument guide can be a bore, slot, aperture, hole or any guideway which serves to constrain the path of the needle 405 during the insertion procedure.
  • Alternatively, the medical instrument guide can be the space between the probes 201, 202, which are interconnected by a rod-shaped mount 199.
  • The medical instrument guide can either be permanently affixed to the apparatus 200 as shown in Figures 2-7, or be a separate component which can be detachably mounted to the mount 199 as shown in Figure 14; in this Figure, the guide 203 is a clip having three members pivotably connected about a pivot axis; the instrument guide 203 can be attached to a channel located between the probes 201, 202 and at the edge of the mount 199.
  • The detachable medical instrument guide can be designed to allow a particular trajectory to be chosen by mounting one of a series of medical instrument guides, each with a different orientation of the guideway.
  • The detachable medical instrument guide can also be disposable after a single use for ease of sterilization.
  • The probes 201, 202 are positioned and operated so that a portion of the volume 401 produced by probe 201 and a portion of the volume 402 produced by probe 202 overlap to form an overlapping portion 403 (shown in cross-hatched shading), which intersects the medical instrument guide 203 and at least part of the pathway of the needle 405 inserted through the guide 203.
  • The instrument guide (bore) 203 and the probe 201, 202 locations are positioned relative to each other so that the overlapping portion 403 covers a target 404, which represents the epidural space, and the part of the needle pathway leading up to the target 404.
  • The needle 405 is shown partly inserted into the medical instrument guide 203 in a direction that will intersect the target 404.
  • Figure 7 shows how the needle 405 can be detected through control of ultrasound pulse transmission and receive processing.
  • Probe 202 emits an ultrasound beam 701 which reflects off of the needle 405 and creates a reflection 702.
  • Reflection 702 travels away from the needle 405 deeper into the tissue, and also produces subsequent reflections 703 upon further interaction with tissue reflectors such as tissue boundaries.
  • The reflection 702 follows the law of specular reflection from beam 701, meaning that the angle of reflection from the needle 405 equals the angle of incidence of beam 701 to the needle 405. Given knowledge of the direction and timing of beam 701, and the use of the law of specular reflection, the direction of the reflection 702 and the subsequent reflection 703 can be determined.
  • The subsequent reflection 703 is measured by probe 201 (the reflection can also be measured by probe 202, but the signal will be weaker).
  • The measurements by the probe 201 include one or both of the magnitude and timing of the reflection 703. If the needle 405 is not inserted to a depth that produces an interaction with ultrasound beam 701, then no subsequent reflection 703 is produced. In this way, the measurement of the subsequent reflection 703 provides a measurement of the insertion depth of the needle 405 into the tissue. By producing beam 701 at different source locations and angles, and measuring the subsequent reflections 703 for each of the different beams 701, the depth of the needle insertion is robustly determined. The measured needle insertion depth is then displayed on the ultrasound display.
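The law of specular reflection invoked above has a standard vector form: for an incident beam direction d and a unit surface normal n, the reflected direction is d - 2(d·n)n. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def specular_reflection(d, n):
    """Reflect an incident beam direction d off a surface with normal
    n; the angle of reflection equals the angle of incidence."""
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)  # normalise the surface normal
    return d - 2.0 * np.dot(d, n) * n

# A beam travelling straight down, hitting a surface whose normal
# points straight up, is reflected straight back along its path.
r = specular_reflection([0.0, -1.0], [0.0, 1.0])
```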
  • Alternatively, the probe 202 measures the diffuse reflection of ultrasound beam 701 after the beam interacts with the needle 405 to measure the depth of needle insertion.
  • The diffuse reflection produces a wide continuous range of reflections 703 that are measured by probe 202 using one or both of the timing and magnitude of the reflections 703.
  • If the needle 405 is not inserted to a depth that produces an interaction with ultrasound beam 701, then no diffuse reflection is produced. This, in turn, produces a robust measurement of needle insertion depth.
  • An imaging system 900 incorporating the apparatus 200 processes and displays the images obtained by the apparatus 200.
  • The apparatus 200 is connected to a transmit/receive (T/R) switch 901.
  • The T/R switch 901 receives signals from a beam transmitter 902 and outputs signals to the two probes 201 and 202.
  • The T/R switch 901 also transmits signals from probes 201 and 202 to a beam receiver 903 that forms echo signals for processing.
  • Both the beam transmitter 902 and the beam receiver 903 are communicative with and controlled by a system controller 907.
  • The beam receiver 903 outputs echo signals (representing 3-D volume datasets) from both probes 201, 202 to a signal processor 904, which performs functions such as, but not limited to, digital filtering, contrast detection and enhancement, spectral analysis and B-mode processing; both the beam receiver 903 and the signal processor 904 are controlled by the system controller 907.
  • The signal processor 904 outputs the modified echo signals to a 3-D image rendering module 905, which converts the 3-D volume datasets into 2-D images using a method such as, but not limited to, re-slicing or ray-tracing.
  • The 3-D image rendering is performed according to instructions provided by the system controller 907, which can receive input from a user interface 908 to determine the methodology.
  • 2-D image data sets are transferred into an image memory 906 for access by the user interface 908, for display on a 2-D image display 909 such as a computer screen, and/or for long term storage on a storage device 910 such as a hard drive.
  • the image memory 906 communicates with the system controller 907 and the user interface 908 to access datasets and control filing.
  • the user interface 908 can receive commands from a user to control the operation of the system 900, how image data is processed and displayed on the 2-D image display 909, and to access/store images in the long- term image storage device 910.
  • the user interface 908 includes an interface program that may be integrated with the 2-D image display 909 and may include, but is not limited to, a pointing device such as a mouse or touch screen, a keyboard, or other input devices such as a microphone.
  • the system controller 907 communicates with user interface 908 to relay operational and display instructions and operational status.
  • the system controller 907 communicates with the 2-D image display 909 to synchronize the data stream.
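The data path described in the bullets above can be sketched as a simple two-stage pipeline. This is an illustrative sketch only, not part of the patent disclosure; the function and parameter names are invented for clarity:

```python
def process_echoes(raw_echoes, condition, render):
    """Illustrative data path: per-probe 3-D echo datasets are conditioned
    (the role of signal processor 904), rendered into 2-D images (the role
    of rendering module 905), and returned for the image memory /
    display stage.

    raw_echoes : list of per-probe datasets from the beam receiver
    condition  : signal-conditioning step (filtering, enhancement, ...)
    render     : converts a conditioned 3-D dataset into a 2-D image
    """
    conditioned = [condition(v) for v in raw_echoes]
    return [render(v) for v in conditioned]
```

With two probe datasets in, two displayable images come out, mirroring the dual-probe arrangement of the apparatus 200.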
  • a data processing method 1000 is carried out by the system 900 to manipulate the two 3-D image datasets 401, 402 that contain overlapping volumes acquired by the apparatus 200 to produce a 2-D sagittal plane image and a 2-D transverse plane image, which can be displayed on the display device 909.
  • the two image datasets 401, 402 are obtained from the apparatus 200 (step 1001) and transmitted via the T/R switch 901 and beam receiver 903 to the signal processor 904 for data conditioning (step 1002).
  • Data conditioning performed on the two 3-D image datasets 401 and 402 may include, but is not limited to: filtering, enhancement, thresholding, smoothing and feature extraction.
  • the signal processor 904 combines the datasets 401 and 402 using their known positions relative to each other and calculates the overlapping volume 403 (step 1003); the calculation of the overlapping 3-D volume 403 on the conditioned dataset may involve spatial compounding to improve image quality or other image processing steps.
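A minimal sketch of the spatial compounding mentioned at step 1003, assuming (as a simplification not stated in the patent) that the two datasets have already been resampled onto a common voxel grid and that a boolean mask marks the overlapping volume 403:

```python
import numpy as np

def spatially_compound(vol_a, vol_b, overlap):
    """Average two co-registered volumes inside the overlap mask.

    Outside the overlap this sketch simply keeps vol_a's values; in
    general each region outside the overlap is covered by only one probe,
    so the single available probe's data would be used there.
    """
    out = vol_a.astype(float).copy()
    out[overlap] = 0.5 * (vol_a[overlap] + vol_b[overlap])
    return out
```

Averaging the two probes' voxel values in the overlap reduces speckle and angle-dependent artifacts, which is the usual motivation for compounding.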
  • the combined 3-D image dataset is then transmitted to the image rendering device 905 for projection (step 1004) or cross-sectional (step 1007) image processing.
  • the image rendering module 905 can be stored as instructions on a computer readable medium and executed by a general purpose computing device.
  • suitable computer readable media include compact disk read-only memory (CD-ROM), random access memory (RAM), and hard drive disks.
  • ray-tracing is used to compute a projection of one of the 3-D datasets 401, 402, or a combination of the two datasets 401, 402, in the sagittal plane or on another image plane input by the user or selected automatically.
  • Ray-tracing is a popular method for realistically projecting a voxel-based volumetric dataset onto a 2-D image.
  • Ray-tracing involves projecting rays perpendicularly from every pixel in the plane of the 2-D image through the voxels of the volume and calculating for each pixel a value that represents the projection of the voxel values encountered along the corresponding ray.
  • the voxel values along the ray path are combined in a variety of ways to derive the pixel value in the projected 2-D image. Examples of how the projected value is calculated from the voxels along the ray path are as follows: (1) the minimum voxel value is chosen; (2) the maximum voxel value is chosen; (3) the average or sum of the voxel values is calculated; (4) the voxel values are weighted according to specific parameters controlling the rendering style, such as a modifying parameter based on a local gradient; (5) voxel values below a noise threshold are first removed and then the minimum voxel value is chosen; (6) voxel values below a noise threshold are first removed and then the maximum voxel value is chosen; (7) voxel values below a noise threshold are first removed and then the average or sum of the remaining voxel values is calculated; (8) voxel values below a noise threshold are first removed and the remainder are then weighted according to specific parameters controlling the rendering style.
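The projection rules above can be sketched compactly for the simple case where the rays run perpendicular to an axis-aligned image plane (an assumption made here for brevity; the patent's rays are perpendicular to whatever image plane is chosen):

```python
import numpy as np

def project(volume, mode="max", noise_floor=None):
    """Project a voxel volume onto a 2-D image, one ray per pixel,
    with the rays running along axis 0 (perpendicular to the image plane).

    mode selects how the voxels along each ray are combined: "min",
    "max", "mean" or "sum". If noise_floor is given, voxels below it
    are removed before combining (the thresholded variants above).
    """
    v = volume.astype(float)
    if noise_floor is not None:
        v = np.where(v < noise_floor, np.nan, v)  # drop sub-threshold voxels
    ops = {"min": np.nanmin, "max": np.nanmax,
           "mean": np.nanmean, "sum": np.nansum}
    return ops[mode](v, axis=0)
```

The "max" mode corresponds to the familiar maximum-intensity projection; the thresholded modes suppress background speckle before the reduction.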
  • Figure 6(a) shows how a 2-D image 601 in the sagittal plane can be produced by acquisition of the volumetric dataset ("volume") 401 from the probe 201, followed by projection of the data of volume 401 through the process of ray-tracing along the rays 602. In this way, the data represented by volume 401 is projected onto the image 601.
  • the process of ray tracing can take the form of averaging of all data encountered in the volume 401 from a single ray 602.
  • the process of ray tracing can also take the form of taking the maximum value of the data encountered along a single ray.
  • other forms of ray tracing can also be used, including methods that first extract the anatomical features of interest, such as the epidural space, and only project that data onto the image 601.
  • the ray-tracing can be performed on the combination of the data of volumes 401 and 402.
  • the projected image then undergoes image enhancement (step 1005), which may include, but is not limited to, filtering, enhancement, thresholding, smoothing and feature extraction, and results in the final image.
  • an anticipated needle trajectory can be superimposed onto the projection image.
  • the location of the overlaid trajectory is known and fixed relative to the probes 201 and 202, because it is determined by the physical location of the medical instrument guide 203 on the apparatus 200.
  • the enhanced image is then ready for display by display device 909, and/or storage on storage device 910.
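Because the guide is rigidly located relative to the probes, the overlay described above is a fixed straight line in image coordinates. A minimal sketch (the entry point and direction values are hypothetical inputs, not specified by the patent):

```python
import numpy as np

def overlay_trajectory(image, entry, direction, value=255):
    """Burn the expected straight-line needle trajectory into a copy of
    a 2-D image. entry and direction are expressed in image (row, col)
    coordinates and are fixed, since the guide is rigidly mounted
    relative to the probes."""
    img = image.copy()
    h, w = img.shape
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    for t in np.arange(0.0, np.hypot(h, w), 0.5):  # half-pixel steps
        r, c = np.rint(np.asarray(entry, float) + t * d).astype(int)
        if 0 <= r < h and 0 <= c < w:
            img[r, c] = value
    return img
```

In practice the overlay would be drawn as a separate graphics layer rather than written into the pixel data, but the geometry is the same.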
  • re-slicing (step 1007) is used to produce a 2-D slice of the 3-D image dataset at the transverse plane or sagittal plane (which are the planes that intersect the medical instrument guide for needle insertion).
  • This 2-D slice image is then processed at step 1008 for image enhancement which may include, but is not limited to, filtering, enhancement, thresholding, smoothing and feature extraction and results in the final 2-D sagittal cross-sectional plane image 603 or transverse cross-sectional plane image 604; like the sagittal plane projection image, an anticipated needle trajectory can be superimposed onto the cross-sectional plane image.
  • the 3-D volumetric datasets measured by the probes 201, 202 are processed by the system 900 and displayed as one or more of the cross-sectional or projection images on a display device 909.
  • the transverse plane image 604 is formed by combining the volumetric datasets 401, 402 and then re-slicing, using a method of data interpolation, the combined volumes in a plane that is transverse to the patient and intersects the trajectory 1302 of the medical instrument guide 203.
  • This transverse plane, in which image 604 is formed, can be the same plane as shown in Figure 3.
  • the needle 405 becomes visible in the image 604, and will lie along a graphic overlay 1302 of the expected needle trajectory. As the needle 405 is inserted deeper into the tissue, more and more of the needle becomes visible in the image 604. The operator aligns the needle trajectory 1302 with the target 404 so that subsequent insertion of the needle 405 into tissue reaches the target 404.
  • This image 604 is updated on the display device 909 as the ultrasound 3-D volumetric datasets 401 and 402 are created by probes 201 and 202. In this way, the apparatus 200 provides current images of the needle insertion procedure.
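The re-slicing with interpolation described above can be sketched for the simplest case, an axis-aligned slice at a fractional depth; a general oblique re-slice would interpolate along all three axes, but the principle is the same:

```python
import numpy as np

def reslice(volume, depth):
    """Extract a 2-D cross-section of a 3-D volume at a fractional
    index along axis 0, linearly interpolating between the two
    nearest stored slices."""
    z0 = int(np.floor(depth))
    z1 = min(z0 + 1, volume.shape[0] - 1)
    t = depth - z0
    return (1.0 - t) * volume[z0] + t * volume[z1]
```

Interpolating between slices lets the displayed cross-section track the guide's plane exactly, rather than snapping to the nearest acquired slice.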
  • the projected image 601 can also be shown on the monitor 909.
  • the projected image 601 is formed by projecting the 3-D ultrasound dataset onto a 2-D plane, such as the sagittal plane 601 depicted in Figure 6(a).
  • the projected image 601 also contains a graphic overlay 1302 of the expected needle trajectory that is also projected in the sagittal plane.
  • the projected image 601 also depicts the needle 405 and the target 404.
  • This image 601 is also updated as the ultrasound 3-D volumetric datasets 401 and 402 are created by probes 201 and 202.
  • In performing an epidural anaesthesia procedure on a patient using the apparatus 200, an operator holds the apparatus 200 with one hand gripping the handle and places the apparatus 200 against the patient's back so that the medical instrument guide 203 is directly over the needle insertion point 111. The operator then activates the apparatus 200 to cause ultrasound signals to be emitted by the probes 201, 202 and the consequent data to be collected and processed by the system 900 and displayed as 2-D images on the display device 909.
  • the two ultrasound probes 201, 202 may be operated alternately, one after the other, so that the sound fields do not interfere.
  • the operator aligns the displayed target (e.g. the epidural space) with the superimposed anticipated needle trajectory in the ultrasound image(s).
  • the operator can then insert the epidural needle 405 through the medical instrument guide 203.
  • the operator may then view in real time on the display device 909 a processed ultrasound image of the needle tip and needle body and the patient's back and spine, such as the two images of the sagittal and transverse planes as shown in Figure 13.
  • the operator may then determine, by viewing the relative motion of the needle tip with respect to the spinal anatomy, when the needle has reached the epidural space of the spine.
  • one advantage of this apparatus 200 is the ability to capture an image of the target, nearby anatomy, and needle trajectory for display in the same display device. Another advantage is the ability to acquire more than one image of the target, nearby anatomy and needle trajectory through the use of two or more 3-D ultrasound probes. Yet another advantage is the ability to use the optimal locations on the skin surface, also known as "windows", for viewing the spine with ultrasound. Yet another advantage is the ability to place the needle through the medical instrument guide 203 near the middle of the apparatus 200 so that the footprint of the apparatus 200 does not interfere with the puncture site of the needle 405. Yet another advantage is the ability to transmit ultrasound beams from one probe of the apparatus 200 and receive the resulting ultrasound echoes with another probe of the apparatus 200.
  • the probes 201 , 202 are curved.
  • the size and shape of the 3-D volumes 401 and 402 are determined by the curved shape of the probes 201 and 202.
  • This embodiment has the advantage of obtaining a wide field of view of the anatomy with a relatively small footprint of the probe due to the curvature of the face of the probe that produces a diverging set of beams that are used to form a 2-D image or 3-D volumetric dataset.
  • This embodiment also has the advantage of directing the ultrasound beams toward the needle 405 at an angle that is closer to perpendicular to the needle 405, resulting in a stronger echo from the needle 405 and a better depiction of the needle 405 in the 3-D volumetric dataset.
  • the probes 201, 202, whether flat or curved, can be further angled toward each other so that the beams intersect the needle at angles even closer to perpendicular.
  • the apparatus 200 may also be used for the purposes of tissue tracking for elastography.
  • Probe 201 emits an ultrasound beam which encounters a moving portion of tissue. The motion can be measured from the echo signals of that beam using elastography techniques. Similarly, the motion can be measured by a beam from probe 202.
  • Each probe 201 , 202 can measure different components of the tissue motion with different levels of accuracy depending on the orientation of the beam with respect to the tissue motion. Typically motion in the direction of the beam is most accurate.
  • the different components of motion can then be used in an elastography system to produce estimates of the tissue's mechanical properties. The use of three probes allows all three components of the motion to be estimated in 3-D space.
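Recovering the full 3-D motion from per-beam components is a small linear-algebra problem: each probe measures the projection of the displacement onto its beam direction, and with three (or more) non-coplanar beams the displacement can be solved by least squares. A sketch, with invented names:

```python
import numpy as np

def recover_motion(beam_dirs, along_beam):
    """Least-squares recovery of a 3-D tissue displacement from its
    measured components along each probe's beam direction.

    beam_dirs  : (n, 3) array of unit beam direction vectors, n >= 3
    along_beam : (n,) displacement components measured along each beam
    """
    A = np.asarray(beam_dirs, float)
    b = np.asarray(along_beam, float)
    motion, *_ = np.linalg.lstsq(A, b, rcond=None)
    return motion
```

This reflects the point made above that each probe measures the beam-aligned component most accurately; using more than three beams simply over-determines the system and the least-squares solution averages out noise.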
  • Figure 11 shows another embodiment of the invention wherein the apparatus comprises at least one 3-D probe 201 and at least one 2-D probe 204.
  • the 3-D ultrasound probe 201 creates a 3-D volumetric dataset 401 and the 2-D probe 204 creates a 2-D cross-sectional image 404.
  • the 2-D cross-sectional image is oriented in a direction that intersects the trajectory of the needle defined by the medical instrument guide 203.
  • the 2-D cross-sectional image also intersects the 3-D volumetric dataset 401. In this way there is still a region of overlap 403 (not emphasized in figure for clarity) between the volumetric dataset 401 and the image 404 that is suitable for spatial compounding.
  • the combination of a 3-D probe 201 and 2-D probe 204 is advantageous when faster image formation is desired, since the speed of creating a 2-D image 404 is often faster than the speed of creating a 3-D volumetric dataset 401.
  • Figure 12 shows another embodiment of the invention wherein the apparatus 200 comprises three ultrasound probes: probe 201 producing 3-D volumetric dataset 401, probe 202 producing 3-D volumetric dataset 402 and probe 209 producing 3-D volumetric dataset 409.
  • the expected needle trajectory defined by the medical instrument guide 203, passes through the overlapping portion 403.
  • when three or more probes are used to create 3-D volumetric datasets, three or more images can be formed and displayed on the monitor 909.
  • one or more of the 3-D volumetric datasets is replaced with a 2-D cross-sectional image, such as shown in Figure 11.
  • the sagittal cross-sectional image 603 is replaced by a sagittal projection image 601.
  • the needle guide 203 is not permanently or detachably mounted to the mount 199 and instead is a component of the apparatus 200 that is located remotely of the probes 201, 202 and mount 199.
  • Both the probes 201, 202 / mount 199 and the needle guide 203 are provided with a position tracking system that provides measurements of the needle location and orientation relative to the ultrasound probes.
  • the tracking system can be based on electromagnetic tracking of coils placed on both the needle 405 and the apparatus 200.
  • a tracking system can also be based on optical tracking of visible fiducials placed on both the needle 405 and the apparatus 200.
  • a tracking system can be based on a moveable needle guide connected to the apparatus 200 by one or more linkages with angle sensors on the linkage joints.
  • the expected needle trajectory can be calculated from the measured needle location and orientation.
  • This expected needle trajectory can be shown as a graphic overlay 1302 on any of the images 601 , 603 or 604.
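Computing the expected trajectory from the tracked pose is a straight extrapolation of the needle's orientation from its measured tip position. A sketch, assuming the tracking system already reports both quantities in the probe coordinate frame:

```python
import numpy as np

def expected_trajectory(tip, orientation, depths):
    """Sample points along the expected needle path from a tracked tip
    position and orientation vector, all expressed in the probe
    coordinate frame.

    tip         : (3,) tracked needle tip position
    orientation : (3,) needle direction vector (need not be unit length)
    depths      : distances along the path at which to sample points
    """
    d = np.asarray(orientation, float)
    d /= np.linalg.norm(d)
    return np.asarray(tip, float) + np.outer(np.asarray(depths, float), d)
```

The sampled points can then be mapped into the coordinates of any of the images 601, 603 or 604 to draw the graphic overlay 1302.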
  • the operator can position the needle guide 203 such that the propagation axis of the projected trajectory will fall within the overlapping volumes 401, 402 of the probes 201, 202 and thus be displayable on the display device 909.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Gynecology & Obstetrics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The invention relates to a medical instrument guidance and ultrasound imaging apparatus comprising: a first ultrasound probe configured to acquire a first volumetric dataset representing a three-dimensional image of a first volume; a second ultrasound probe configured to acquire a second volumetric dataset representing a three-dimensional image of a second volume; a mount on which the first and second probes are mounted; and a medical instrument guide. The first and second probes are positioned on the mount such that the first and second volumes overlap to form an overlapping volume. The medical instrument guide is positionable relative to the first and second ultrasound probes and is configured to receive and guide a medical instrument along a propagation axis toward a target such that the target and the propagation axis intersect the overlapping volume.
PCT/CA2009/001700 2008-11-24 2009-11-24 Appareil et procédé d'imagerie d'un instrument médical WO2010057315A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/130,291 US20110301451A1 (en) 2008-11-24 2009-11-24 Apparatus And Method For Imaging A Medical Instrument

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US19339008P 2008-11-24 2008-11-24
CA61/193,390 2008-11-24

Publications (1)

Publication Number Publication Date
WO2010057315A1 true WO2010057315A1 (fr) 2010-05-27

Family

ID=42197793

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2009/001700 WO2010057315A1 (fr) 2008-11-24 2009-11-24 Appareil et procédé d'imagerie d'un instrument médical

Country Status (2)

Country Link
US (1) US20110301451A1 (fr)
WO (1) WO2010057315A1 (fr)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130041250A1 (en) * 2011-08-09 2013-02-14 Ultrasonix Medical Corporation Methods and apparatus for locating arteries and veins using ultrasound
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8831310B2 (en) 2008-03-07 2014-09-09 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
EP2528509A4 (fr) * 2010-01-29 2018-03-14 University Of Virginia Patent Foundation Échographie pour localiser une anatomie ou le guidage de sonde
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US10134125B2 (en) 2013-02-28 2018-11-20 Rivanna Medical Llc Systems and methods for ultrasound imaging
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
EP3337389A4 (fr) * 2015-08-18 2019-04-10 The Penn State Research Foundation Dispositif de fixation intégré comprenant une source d'émission et de récupération ayant pour objectif la localisation de cibles et le verrouillage par rapport au point de fixation afin de faciliter l'intervention sur ladite cible
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10368834B2 (en) 2011-04-26 2019-08-06 University Of Virginia Patent Foundation Bone surface image reconstruction using ultrasound
US10548564B2 (en) 2015-02-26 2020-02-04 Rivanna Medical, LLC System and method for ultrasound imaging of regions containing bone structure
WO2020197714A1 (fr) * 2019-03-25 2020-10-01 Covidien Lp Systèmes de biopsie, dispositifs à ultrasons et leurs procédés d'utilisation
US10918413B2 (en) 2015-08-18 2021-02-16 The Penn State Research Foundation Bedside stereotactic ultrasound guidance device, system and method
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5491830B2 (ja) * 2009-01-20 2014-05-14 株式会社東芝 超音波診断装置、超音波画像処理装置、画像処理方法および画像表示方法
US8535337B2 (en) * 2010-04-26 2013-09-17 David Chang Pedicle screw insertion system and method
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
US10660667B2 (en) * 2013-03-13 2020-05-26 The University Of British Columbia Apparatus, system and method for imaging a medical instrument
CA2914370A1 (fr) * 2013-06-07 2014-12-11 Guardsman Scientific, Inc. Systemes et procedes permettant de fixer un dispositif peripherique a ultrasons
WO2015066280A1 (fr) * 2013-10-30 2015-05-07 Brigham And Women's Hospital, Inc. Dispositif de guidage de ventriculostomie
US9747709B2 (en) * 2014-12-19 2017-08-29 General Electric Company Method and apparatus for animate visualization of static 3-D data
CN104771192A (zh) * 2015-04-20 2015-07-15 无锡海斯凯尔医学技术有限公司 组织形态和弹性信息的处理方法和弹性检测设备
WO2018187658A2 (fr) * 2017-04-06 2018-10-11 Duke University Sonde ultrasonore interventionnelle
CN112702957A (zh) * 2018-05-31 2021-04-23 马特麦格拉斯设计有限公司 使用多阵列的医学成像方法
CN109009358B (zh) * 2018-09-19 2024-01-30 珠海医凯电子科技有限公司 无盲区无菌穿刺装置及其成像方法
US11398072B1 (en) * 2019-12-16 2022-07-26 Siemens Healthcare Gmbh Method of obtaining a set of values for a respective set of parameters for use in a physically based path tracing process and a method of rendering using a physically based path tracing process

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4387721A (en) * 1980-05-30 1983-06-14 Tokyo Shibaura Denki Kabushiki Kaisha Ultrasonic probe having means for receiving a puncturing cannula therethrough
US4475553A (en) * 1982-07-09 1984-10-09 Yokogawa Hokushin Electric Corporation Ultrasonic needle housing probe with continuous locator array
US4635644A (en) * 1984-07-24 1987-01-13 Hitachi Medical Corporation Ultrasonic applicator
WO2001064109A1 (fr) * 2000-02-28 2001-09-07 Wilk Ultrasound Of Canada, Inc. Dispositif medical a ultrasons et procede associe
US20020082518A1 (en) * 2000-12-22 2002-06-27 David Weiss Control systems for biopsy devices
GB2400176A (en) * 2003-03-29 2004-10-06 North Glasgow University Hospi Ultrasound probe with needle-guiding feature
US20040267121A1 (en) * 2003-06-12 2004-12-30 Sarvazyan Armen P. Device and method for biopsy guidance using a tactile breast imager
US20050101868A1 (en) * 2003-11-11 2005-05-12 Ridley Stephen F. Ultrasound guided probe device and method of using same
US20060182320A1 (en) * 2003-03-27 2006-08-17 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
WO2007110077A2 (fr) * 2006-03-24 2007-10-04 B-K Medical Aps Sonde a ultrasons
WO2007113705A1 (fr) * 2006-04-03 2007-10-11 Koninklijke Philips Electronics N. V. Détection des tissus qui entourent un objet inséré chez un patient

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2004311458B2 (en) * 2003-12-30 2011-05-12 Medicis Technologies Corporation Component ultrasound transducer
CN100556367C (zh) * 2005-08-11 2009-11-04 株式会社东芝 超声波诊断装置、超声波探针以及穿刺适配器
US20070167806A1 (en) * 2005-11-28 2007-07-19 Koninklijke Philips Electronics N.V. Multi-modality imaging and treatment


Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10127629B2 (en) 2006-08-02 2018-11-13 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8831310B2 (en) 2008-03-07 2014-09-09 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US10398513B2 (en) 2009-02-17 2019-09-03 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US10136951B2 (en) 2009-02-17 2018-11-27 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US9398936B2 (en) 2009-02-17 2016-07-26 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US11464575B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
EP2528509A4 (fr) * 2010-01-29 2018-03-14 University Of Virginia Patent Foundation Échographie pour localiser une anatomie ou le guidage de sonde
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US10368834B2 (en) 2011-04-26 2019-08-06 University Of Virginia Patent Foundation Bone surface image reconstruction using ultrasound
US20130041250A1 (en) * 2011-08-09 2013-02-14 Ultrasonix Medical Corporation Methods and apparatus for locating arteries and veins using ultrasound
US10134125B2 (en) 2013-02-28 2018-11-20 Rivanna Medical Llc Systems and methods for ultrasound imaging
US10679347B2 (en) 2013-02-28 2020-06-09 Rivanna Medical Llc Systems and methods for ultrasound imaging
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10820944B2 (en) 2014-10-02 2020-11-03 Inneroptic Technology, Inc. Affected region display based on a variance parameter associated with a medical device
US11684429B2 (en) 2014-10-02 2023-06-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10820946B2 (en) 2014-12-12 2020-11-03 Inneroptic Technology, Inc. Surgical guidance intersection display
US11534245B2 (en) 2014-12-12 2022-12-27 Inneroptic Technology, Inc. Surgical guidance intersection display
US10548564B2 (en) 2015-02-26 2020-02-04 Rivanna Medical, LLC System and method for ultrasound imaging of regions containing bone structure
US11103200B2 (en) 2015-07-22 2021-08-31 Inneroptic Technology, Inc. Medical device approaches
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
EP3337389A4 (fr) * 2015-08-18 2019-04-10 The Penn State Research Foundation Dispositif de fixation intégré comprenant une source d'émission et de récupération ayant pour objectif la localisation de cibles et le verrouillage par rapport au point de fixation afin de faciliter l'intervention sur ladite cible
US10918413B2 (en) 2015-08-18 2021-02-16 The Penn State Research Foundation Bedside stereotactic ultrasound guidance device, system and method
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US11179136B2 (en) 2016-02-17 2021-11-23 Inneroptic Technology, Inc. Loupe display
US10433814B2 (en) 2016-02-17 2019-10-08 Inneroptic Technology, Inc. Loupe display
US11369439B2 (en) 2016-10-27 2022-06-28 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10772686B2 (en) 2016-10-27 2020-09-15 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
WO2020197714A1 (fr) * 2019-03-25 2020-10-01 Covidien Lp Systèmes de biopsie, dispositifs à ultrasons et leurs procédés d'utilisation

Also Published As

Publication number Publication date
US20110301451A1 (en) 2011-12-08

Similar Documents

Publication Publication Date Title
US20110301451A1 (en) Apparatus And Method For Imaging A Medical Instrument
US8696582B2 (en) Apparatus and method for imaging a medical instrument
US10660667B2 (en) Apparatus, system and method for imaging a medical instrument
US20210142441A1 (en) Anatomically intelligent echochardiography for point-of-care
JP7277967B2 (ja) 超音波画像データの三次元撮像およびモデリング
EP2701607B1 (fr) Reconstruction de l'image d'une surface osseuse par ultrasons
EP2147636B1 (fr) Dispositif et procédé pour le guidage d'outils chirurgicaux par imagerie ultrasonique
US8556815B2 (en) Freehand ultrasound imaging systems and methods for guiding fine elongate instruments
US20140364726A1 (en) Systems and methods to identify interventional instruments
Beigi et al. Needle trajectory and tip localization in real-time 3-D ultrasound using a moving stylus
WO1996025881A1 (fr) Procede de guidage par ultrasons pour actes cliniques
CN109561875B (zh) 用于超声脊椎阴影特征检测及其成像的***和方法
CN108697403A (zh) 针状物跟踪换能器阵列方法和装置
EP3417791B1 (fr) Système et procédé d'analyse et de formation de procédure guidée par image
JP2008036248A (ja) 椎弓根プローブのナビゲーションシステムおよびナビゲーション方法
CN210019440U (zh) 一种用陀螺仪导航的活检穿刺针
CN116564149A (zh) 一种用于腰椎间孔穿刺的操作训练方法
Khamene et al. Local 3D reconstruction and augmented reality visualization of free-hand ultrasound for needle biopsy procedures
US20230090966A1 (en) Ultrasound-based imaging dual-array probe appartus and system
Chen et al. Recent Patents Review in Three Dimensional Ultrasound Imaging
Huang Development of a portable 3D ultrasound system for imaging and measurement of musculoskeletal body parts
Tran Instrumentation and ultrasound for epidural anesthesia

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09827095

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 13130291

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 09827095

Country of ref document: EP

Kind code of ref document: A1