EP3801245A1 - Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices - Google Patents

Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices

Info

Publication number
EP3801245A1
Authority
EP
European Patent Office
Prior art keywords
medical device
patient
ultrasound
alternative-reality headset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19815710.9A
Other languages
German (de)
English (en)
Other versions
EP3801245A4 (fr)
Inventor
Tyler L. DURFEE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bard Access Systems Inc
Original Assignee
Bard Access Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from US 16/209,601 (published as US20190167148A1)
Priority claimed from US 16/370,353 (published as US20190223757A1)
Application filed by Bard Access Systems Inc
Publication of EP3801245A1
Publication of EP3801245A4

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 19/00 Manipulating 3D models or images for computer graphics
          • G06T 2210/00 Indexing scheme for image generation or computer graphics
            • G06T 2210/41 Medical
          • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
            • G06T 2219/028 Multiple view windows (top-side-front-sagittal-orthogonal)
    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/0059 ... using light, e.g. diagnosis by transillumination, diascopy, fluorescence
              • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
            • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; combined pulse/heart-rate/blood-pressure determination; evaluating a cardiovascular condition not otherwise provided for; heart catheters for measuring blood pressure
              • A61B 5/02007 Evaluating blood vessel condition, e.g. elasticity, compliance
            • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; measuring using microwaves or radio waves
            • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
              • A61B 5/061 Determining position of a probe within the body employing means separate from the probe
                • A61B 5/062 ... using magnetic field
            • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
              • A61B 5/316 Modalities, i.e. specific diagnostic methods
                • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
            • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
              • A61B 5/6846 ... specially adapted to be brought in contact with an internal body part, i.e. invasive
                • A61B 5/6847 ... mounted on an invasive device
            • A61B 5/74 Details of notification to user or communication with user or patient; user input means
              • A61B 5/742 ... using visual displays
                • A61B 5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
          • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
            • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
              • A61B 8/0833 ... involving detecting or locating foreign bodies or organic structures
                • A61B 8/0841 ... for locating instruments
              • A61B 8/0883 ... for diagnosis of the heart
              • A61B 8/0891 ... for diagnosis of blood vessels
            • A61B 8/42 Details of probe positioning or probe attachment to the patient
              • A61B 8/4209 ... by using holders, e.g. positioning frames
            • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
              • A61B 8/461 Displaying means of special interest
                • A61B 8/462 ... characterised by constructional features of the display
                • A61B 8/466 ... adapted to display 3D data
            • A61B 8/48 Diagnostic techniques
              • A61B 8/488 ... involving Doppler signals
            • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
              • A61B 8/5215 ... involving processing of medical diagnostic data
                • A61B 8/5238 ... for combining image data of patient, e.g. merging several images from different acquisition modes into one image
                  • A61B 8/5246 ... combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
                    • A61B 8/5253 ... combining overlapping images, e.g. spatial compounding
          • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
            • A61B 2017/00017 Electrical control of surgical instruments
              • A61B 2017/00207 ... with hand gesture control or hand gesture recognition
              • A61B 2017/00216 ... with eye tracking or head position tracking control
            • A61B 17/00234 ... for minimally invasive surgery
              • A61B 2017/00238 Type of minimally invasive operation
                • A61B 2017/00243 ... cardiac
          • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
            • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
              • A61B 2034/2046 Tracking techniques
                • A61B 2034/2048 ... using an accelerometer or inertia sensor
                • A61B 2034/2051 Electromagnetic tracking systems
            • A61B 34/25 User interfaces for surgical systems
          • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
              • A61B 2090/309 ... using white LEDs
            • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
              • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
                • A61B 2090/365 ... augmented reality, i.e. correlating a live optical image with another image
              • A61B 90/37 Surgical systems with images on a monitor during operation
                • A61B 2090/372 Details of monitor hardware
                • A61B 2090/378 ... using ultrasound
            • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
              • A61B 2090/502 Headgear, e.g. helmet, spectacles

Definitions

  • a medical device-placing system including, in some embodiments, a medical-device tip-location sensor (“TLS”), an ultrasound probe, a console, and an alternative-reality headset.
  • the TLS is configured for placement on a chest of a patient.
  • the ultrasound probe is configured to emit ultrasound signals into the patient and receive echoed ultrasound signals from the patient by way of a piezoelectric sensor array.
  • the console has electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient.
  • the console is also configured to transform TLS signals from the TLS into location information for a medical device within the patient when the TLS is placed on the chest of the patient.
  • the alternative-reality headset includes a display screen coupled to a frame having electronic circuitry including memory and a processor.
  • the display screen is configured such that a wearer of the alternative-reality headset can see the patient through the display screen.
  • the display screen is configured to display over the patient a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments.
  • the ultrasound probe is configured with a pulsed-wave Doppler imaging mode.
  • the console is configured to capture ultrasound-imaging frames in accordance with the pulsed-wave Doppler imaging mode, stitch the ultrasound-imaging frames together with a stitching algorithm, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm.
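The patent names a stitching algorithm and an image-segmentation algorithm without specifying either. The following Python sketch is one illustrative reading of the capture-stitch-segment pipeline, with overlap averaging standing in for stitching and a brightness threshold standing in for segmentation; frame sizes, overlap, and threshold are all assumed values:

```python
import numpy as np

def stitch_frames(frames, overlap):
    """Stitch consecutive ultrasound-imaging frames along the scan axis,
    averaging the shared columns (a stand-in for a real stitching
    algorithm)."""
    stitched = frames[0].astype(float)
    for frame in frames[1:]:
        frame = frame.astype(float)
        # Blend the overlapping region, then append the new columns.
        stitched[:, -overlap:] = (stitched[:, -overlap:] + frame[:, :overlap]) / 2
        stitched = np.hstack([stitched, frame[:, overlap:]])
    return stitched

def segment_image(image, threshold):
    """Toy segmentation: mark pixels at or above the threshold as
    anatomy (1) and the rest as background (0)."""
    return (image >= threshold).astype(np.uint8)

frames = [np.full((4, 6), v) for v in (10, 30, 50)]
panorama = stitch_frames(frames, overlap=2)
mask = segment_image(panorama, threshold=25)
print(panorama.shape)  # (4, 14): 6 columns plus 4 new columns per extra frame
```

A production system would register frames by image features or probe tracking rather than a fixed overlap, but the data flow (frames in, stitched panorama, segment mask out) is the same.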
  • the console is configured to transform the ultrasound-image segments into the objects of virtual anatomy with a virtualization algorithm.
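The virtualization algorithm itself is not specified. A minimal stand-in is to lift a binary image segment into patient-space coordinates that a renderer could display as an object of virtual anatomy; the pixel spacing and origin below are assumed values:

```python
import numpy as np

def virtualize(segment_mask, spacing_mm=(0.3, 0.3), origin_mm=(0.0, 0.0)):
    """Toy 'virtualization algorithm': convert a binary ultrasound-image
    segment into a point cloud in millimetres for display as an object
    of virtual anatomy."""
    rows, cols = np.nonzero(segment_mask)
    return np.column_stack([origin_mm[0] + rows * spacing_mm[0],
                            origin_mm[1] + cols * spacing_mm[1]])

mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1  # a 2x2 "vessel" cross-section
cloud = virtualize(mask)
print(cloud.shape)  # (4, 2): four anatomy pixels, each an (x, y) point in mm
```

A real implementation would more likely produce a surface mesh (e.g. via marching cubes over stacked segments), but a point cloud shows the segment-to-geometry step in miniature.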
  • the console is configured to send both the virtual medical device and the objects of virtual anatomy to the alternative-reality headset for display over the patient.
  • the alternative-reality headset is configured to anchor the virtual medical device and the objects of virtual anatomy to the patient over which the virtual medical device and the objects of virtual anatomy are displayed.
  • the alternative-reality headset further includes one or more eye-tracking cameras coupled to the frame configured to capture eye movements of the wearer.
  • the processor of the alternative-reality headset is configured to process the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy, the virtual medical device, or both corresponding to the focus of the wearer.
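The eye-movement algorithm is described only by its effect: resolving the wearer's focus to an object for selection or enhancement. One simple reading is a nearest-object lookup around the gaze point; the object names and selection radius here are illustrative, not from the patent:

```python
import math

def focus_object(gaze_xy, objects, radius=0.05):
    """Map the wearer's gaze point (normalized screen coordinates) to the
    nearest virtual object within a selection radius; returns None when
    nothing is in focus."""
    best_name, best_dist = None, radius
    for name, (x, y) in objects.items():
        dist = math.hypot(gaze_xy[0] - x, gaze_xy[1] - y)
        if dist <= best_dist:
            best_name, best_dist = name, dist
    return best_name

objects = {"virtual_picc_tip": (0.50, 0.50), "superior_vena_cava": (0.80, 0.20)}
print(focus_object((0.51, 0.50), objects))  # virtual_picc_tip
print(focus_object((0.10, 0.10), objects))  # None
```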
  • the alternative-reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture gestures of the wearer.
  • the processor of the alternative-reality headset is configured to process the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • the alternative-reality headset further includes one or more microphones coupled to the frame configured to capture audio of the wearer.
  • the processor of the alternative-reality headset is configured to process the audio with an audio-command algorithm to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset.
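Both the gesture-command and audio-command algorithms reduce to the same shape: recognize an input, then look up the headset action it maps to, executing nothing for unrecognized input. The command tables below are hypothetical; the patent names the algorithms but not any specific commands:

```python
# Hypothetical command tables -- entries are illustrative only.
GESTURE_COMMANDS = {"pinch": "select_focused_object", "swipe_left": "previous_window"}
AUDIO_COMMANDS = {"freeze image": "freeze_ultrasound", "show ecg": "open_ecg_window"}

def dispatch(kind, recognized_input):
    """Look up the headset action for a recognized gesture or utterance;
    unrecognized input is ignored rather than executed."""
    table = GESTURE_COMMANDS if kind == "gesture" else AUDIO_COMMANDS
    return table.get(recognized_input.lower(), "ignored")

print(dispatch("gesture", "pinch"))       # select_focused_object
print(dispatch("audio", "Freeze Image"))  # freeze_ultrasound
print(dispatch("gesture", "wave"))        # ignored
```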
  • the TLS includes one or more magnetic sensors disposed in a housing.
  • the TLS signals are magnetic-sensor signals from the one or more magnetic sensors available to the console for transforming the magnetic-sensor signals into the location information for the medical device.
  • each magnetic sensor of the one or more magnetic sensors has a fixed spatial relationship to another magnetic sensor of the one or more magnetic sensors.
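The fixed spatial relationship between the sensors is what makes localization possible: with known sensor positions, the readings over-determine the tip position. The patent does not give the field model or solver; the sketch below assumes a simplified 1/r³ magnitude falloff and a brute-force grid search, with sensor geometry and magnet strength as illustrative values:

```python
import numpy as np

# Fixed spatial relationship between the magnetic sensors: known positions
# (in cm) within the TLS housing. Values are illustrative.
SENSORS = np.array([[0, 0, 0], [4, 0, 0], [0, 4, 0], [4, 4, 0]], dtype=float)

def field_magnitudes(tip, moment=1000.0):
    """Simplified dipole model: field magnitude falls off as 1/r^3."""
    r = np.linalg.norm(SENSORS - tip, axis=1)
    return moment / r**3

def locate_tip(readings):
    """Grid-search the tip position (below the housing) that best explains
    the magnetic-sensor readings, in the least-squares sense."""
    best, best_err = None, np.inf
    for x in np.linspace(0, 4, 21):
        for y in np.linspace(0, 4, 21):
            for z in np.linspace(1, 6, 26):
                err = np.sum((field_magnitudes(np.array([x, y, z])) - readings) ** 2)
                if err < best_err:
                    best, best_err = np.array([x, y, z]), err
    return best

true_tip = np.array([2.0, 2.0, 3.0])
estimate = locate_tip(field_magnitudes(true_tip))
print(estimate)  # close to [2. 2. 3.]
```

A real solver would use the full vector dipole field and an iterative optimizer (e.g. Levenberg-Marquardt) rather than a grid, but the inverse-problem structure is the same.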
  • the medical device is a magnetized medical device such as a peripherally inserted central catheter (“PICC”).
  • a medical device-placing system including, in some embodiments, a medical-device TLS, an ultrasound probe, a console, and an alternative-reality headset.
  • the TLS includes one or more magnetic sensors disposed in a housing configured for placement on a chest of a patient.
  • the ultrasound probe is configured to emit ultrasound signals into the patient and receive echoed ultrasound signals from the patient by way of a piezoelectric sensor array.
  • the console has electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient.
  • the console is also configured to transform magnetic-sensor signals from the one or more magnetic sensors of the TLS into location information for a magnetized medical device such as a PICC within the patient when the TLS is placed on the chest of the patient.
  • the alternative-reality headset includes a display screen coupled to a frame having electronic circuitry including memory and a processor. The display screen is configured such that a wearer of the alternative-reality headset can see the patient through the display screen. The display screen is configured to display over the patient an anchored virtual medical device in accordance with the location information for the medical device within anchored objects of virtual anatomy corresponding to the ultrasound-image segments.
  • the alternative-reality headset further includes one or more eye-tracking cameras coupled to the frame configured to capture eye movements of the wearer.
  • the processor of the alternative-reality headset is configured to process the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy, the virtual medical device, or both corresponding to the focus of the wearer.
  • the alternative-reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture gestures of the wearer.
  • the processor of the alternative-reality headset is configured to process the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • a wireless medical device-placing system including, in some embodiments, an ultrasound probe, a medical-device TLS, and an alternative-reality headset configured to wirelessly communicate with the ultrasound probe and the TLS.
  • the ultrasound probe is configured to emit ultrasound signals into a patient and receive echoed ultrasound signals from the patient by way of a piezoelectric sensor array.
  • the TLS is configured for placement on a chest of the patient.
  • the alternative-reality headset includes a frame and a display screen coupled to the frame through which a wearer of the alternative-reality headset can see an environment including the patient.
  • the frame has electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient, as well as transform TLS signals from the TLS into location information for a medical device within the patient when the TLS is placed on the chest of the patient.
  • the display screen is configured to display a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments. Alternatively or additionally, the display screen is configured to display one or more graphical- control-element windows including output corresponding to one or more processes of the medical device-placing system.
  • the alternative-reality headset is configured to capture ultrasound-imaging frames in accordance with an imaging mode of the ultrasound probe, stitch the ultrasound-imaging frames together with a stitching algorithm, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm.
  • the alternative-reality headset is configured to display the one or more windows including the output corresponding to the one or more processes of the medical device-placing system.
  • the one or more windows include an ultrasound window, and the output corresponding to the one or more processes of the medical device-placing system includes the ultrasound-imaging frames corresponding to ultrasound imaging with the ultrasound probe.
  • the alternative-reality headset is configured to transform the ultrasound-image segments into the objects of virtual anatomy with a virtualization algorithm and display both the virtual medical device and the objects of virtual anatomy over the environment.
  • the alternative-reality headset is configured to anchor the virtual medical device and the objects of virtual anatomy to a persistent location on the display screen, a persistent location in a reference frame of the wearer, or a persistent location in the environment.
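The three anchoring choices differ only in which reference frame holds the hologram's coordinates constant. The sketch below uses a translation-only head pose for brevity; function and variable names are illustrative:

```python
def draw_position(anchor, hologram, head, screen_slot):
    """Resolve where an anchored hologram is drawn for the three anchor
    choices: a persistent location on the display screen, a persistent
    location in the wearer's reference frame, or a persistent location
    in the environment."""
    if anchor == "screen":
        return screen_slot  # pinned to the display itself
    if anchor == "wearer":
        # Follows the head: fixed offset from the current head position.
        return tuple(h + s for h, s in zip(head, screen_slot))
    if anchor == "world":
        # Subtract head motion so the object stays put as the wearer moves.
        return tuple(p - h for p, h in zip(hologram, head))
    raise ValueError(f"unknown anchor: {anchor}")

hologram, slot = (5.0, 0.0, 2.0), (0.2, -0.1, 1.0)
for head in [(0, 0, 0), (1, 0, 0)]:
    print(draw_position("world", hologram, head, slot))
# World-anchored output shifts opposite the head motion, so the hologram
# appears fixed in the environment.
```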
  • the alternative-reality headset further includes one or more eye-tracking cameras coupled to the frame configured to capture eye movements of the wearer.
  • the processor of the alternative-reality headset is further configured to process the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy, the virtual medical device, or both corresponding to the focus of the wearer.
  • the alternative-reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture gestures of the wearer.
  • the processor of the alternative-reality headset is further configured to process the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • the alternative-reality headset further includes one or more microphones coupled to the frame configured to capture audio of the wearer.
  • the processor of the alternative-reality headset is further configured to process the audio with an audio-command algorithm to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • a medical device-placing system including, in some embodiments, an ultrasound probe, a medical-device TLS, a stylet, and a processing means configured for processing echoed ultrasound signals, TLS signals, and a set of electrocardiogram (“ECG”) signals.
  • the ultrasound probe is configured to emit ultrasound signals into a patient and receive the echoed ultrasound signals from the patient by way of a piezoelectric sensor array.
  • the TLS is configured for placement on a chest of the patient.
  • the stylet is configured for insertion into a lumen of a medical device.
  • the stylet includes an ECG electrode in a distal-end portion of the stylet configured to generate the set of ECG signals in response to electrical changes associated with depolarization and repolarization of a heart of the patient.
  • the processing means includes electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient, transform the TLS signals from the TLS into location information for the medical device within the patient when the TLS is placed on the chest of the patient, and transform the set of ECG signals into an ECG.
  • a wearable display screen through which a wearer thereof can see an environment including the patient is configured to display a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments. Alternatively or additionally, the display screen is configured to display one or more graphical-control-element windows including output corresponding to one or more processes of the medical device-placing system.
  • the processing means is configured to capture ultrasound-imaging frames in accordance with an imaging mode of the ultrasound probe, stitch the ultrasound-imaging frames together with a stitching algorithm, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm.
  • the display screen is configured to display the one or more windows including the output corresponding to the one or more processes of the medical device-placing system.
  • the one or more windows include an ultrasound window, and the output corresponding to the one or more processes of the medical device-placing system includes the ultrasound-imaging frames corresponding to ultrasound imaging with the ultrasound probe.
  • the one or more windows further include an ECG window, and the output corresponding to the one or more processes of the medical device-placing system further includes the ECG corresponding to electrocardiography with the stylet including the ECG electrode.
  • the medical device-placing system further includes a number of ECG-electrode pads configured to generate a corresponding number of sets of ECG signals in response to the electrical changes associated with the depolarization and the repolarization of the heart of the patient.
  • the processing means is further configured to transform the number of sets of ECG signals into a corresponding number of ECGs.
  • the output corresponding to the one or more processes of the medical device-placing system further includes the number of ECGs corresponding to electrocardiography with the number of ECG-electrode pads.
  • Each of the ECGs in the ECG window is configured for arrangement in the ECG window by the wearer of the display screen.
  • the processing means is configured to transform the ultrasound-image segments into the objects of virtual anatomy with a virtualization algorithm for display of both the virtual medical device and the objects of virtual anatomy over the environment.
  • the processing means is a console of the medical device-placing system, an alternative-reality headset of the medical device-placing system, or a combination of the console and the alternative-reality headset.
  • the alternative-reality headset includes a frame to which the display screen is coupled.
  • the stylet is configured to connect to the TLS through a sterile drape separating a sterile field including the stylet from a non-sterile field including the TLS.
  • the TLS is configured to wirelessly communicate with the alternative-reality headset or communicate with the console over a first wired connection to a first port of the console.
  • the ultrasound probe is configured to wirelessly communicate with the alternative-reality headset or communicate with the console over a second wired connection to a second port of the console.
  • the alternative-reality headset is configured to anchor the virtual medical device and the objects of virtual anatomy to a persistent location on the display screen, a persistent location in a reference frame of the wearer, or a persistent location in the environment.
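The three anchoring options above (a persistent location on the display screen, in the wearer's reference frame, or in the environment) can be sketched as simple rigid-transform compositions. The following Python fragment is purely illustrative; the function names, mode labels, and the 4x4-pose model are assumptions for exposition, not the disclosed implementation.

```python
import numpy as np

def pose(translation, yaw=0.0):
    """Build a 4x4 rigid transform from a translation and a yaw rotation."""
    c, s = np.cos(yaw), np.sin(yaw)
    m = np.eye(4)
    m[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    m[:3, 3] = translation
    return m

def render_pose(mode, anchor_world, head_world, offset):
    """Pose at which to render a virtual object for a given anchoring mode."""
    if mode == "screen":
        # Persistent location on the display screen: the full head pose times
        # a fixed offset, so the object moves and rotates with the headset.
        return head_world @ offset
    if mode == "wearer":
        # Persistent in the wearer's reference frame: follows head translation
        # but ignores head rotation.
        follow = np.eye(4)
        follow[:3, 3] = head_world[:3, 3]
        return follow @ offset
    # mode == "environment": anchored in the world (mixed reality); the
    # stored world pose is returned unchanged regardless of head motion.
    return anchor_world
```

Under this toy model, only the environment-anchored pose is invariant to head motion, which matches the mixed-reality behavior described later in the document.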
  • the display screen is configured to display one or more outlines around one or more corresponding components of the medical device-placing system, one or more virtual components over one or more corresponding components of the medical device-placing system, or a combination thereof.
  • the display screen is configured to display a TLS outline around the TLS under the sterile drape, a virtual TLS of the TLS anywhere in the environment over the sterile drape, or a combination thereof.
  • the medical device is a PICC and a desired location in the patient for the PICC is a superior vena cava proximate a sinoatrial node in a right atrium of the heart of the patient.
  • a distal-end portion of the virtual medical device indicates proximity to the desired location in the patient by way of a visual indicator as the medical device is advanced through a body of the patient.
  • an anatomy-visualizing system including, in some embodiments, an ultrasound-imaging system and an alternative-reality headset.
  • the ultrasound-imaging system includes an ultrasound probe and a console.
  • the ultrasound probe is configured to emit ultrasound signals into a patient and receive echoed ultrasound signals from the patient by way of a piezoelectric sensor array.
  • the console has electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient.
  • the alternative-reality headset includes a display screen coupled to a frame having electronic circuitry including memory and a processor.
  • the display screen is configured such that a wearer of the alternative-reality headset can see the patient through the display screen.
  • the display screen is configured to display objects of virtual anatomy over the patient corresponding to the ultrasound-image segments.
  • the ultrasound probe is configured with a pulsed-wave Doppler imaging mode.
  • the console is configured to capture ultrasound-imaging frames in accordance with the pulsed-wave Doppler imaging mode, stitch the ultrasound-imaging frames together with a stitching algorithm, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm.
  • the console is configured to transform the ultrasound- image segments into the objects of virtual anatomy with a virtualization algorithm.
  • the console is configured to send the objects of virtual anatomy to the alternative-reality headset for display over the patient.
  • the alternative-reality headset is configured to anchor the objects of virtual anatomy to the patient over which the objects of virtual anatomy are displayed.
  • the alternative-reality headset further includes one or more eye-tracking cameras coupled to the frame configured to capture eye movements of the wearer.
  • the processor of the alternative-reality headset is configured to process the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy corresponding to the focus of the wearer.
  • the alternative-reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture gestures of the wearer.
  • the processor of the alternative-reality headset is configured to process the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • the alternative-reality headset further includes one or more microphones coupled to the frame configured to capture audio of the wearer.
  • the processor of the alternative-reality headset is configured to process the audio with an audio-command algorithm to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset.
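One simple way an audio-command algorithm of this kind can work, once audio has been transcribed to text, is by matching the utterance against a table of command phrases. The sketch below is a hypothetical illustration; the command phrases, command identifiers, and longest-phrase matching rule are all assumptions and not part of the disclosure.

```python
# Hypothetical command table; phrases and command names are illustrative only.
COMMANDS = {
    "zoom in": "ZOOM_IN",
    "zoom out": "ZOOM_OUT",
    "anchor here": "ANCHOR",
    "show ecg": "SHOW_ECG_WINDOW",
}

def match_command(transcript):
    """Map a transcribed utterance to a headset command, or None.

    Normalizes case and strips punctuation, then returns the command for the
    longest command phrase contained in the utterance.
    """
    text = "".join(ch for ch in transcript.lower()
                   if ch.isalnum() or ch.isspace())
    best = None
    for phrase, command in COMMANDS.items():
        if phrase in text and (best is None or len(phrase) > best[0]):
            best = (len(phrase), command)
    return best[1] if best else None
```

A production recognizer would of course use a proper speech model and grammar; the point here is only that the recognized command, not the raw audio, is what the headset executes.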
  • a method of a medical device-placing system including, in some embodiments, emitting ultrasound signals into a patient and receiving echoed ultrasound signals from the patient by way of a piezoelectric sensor array of an ultrasound probe; transforming the echoed ultrasound signals with a console having electronic circuitry including memory and a processor to produce ultrasound-image segments corresponding to anatomical structures of the patient; transforming magnetic-sensor signals from one or more magnetic sensors disposed within a housing of a medical-device TLS placed on a chest of the patient with the console into location information for a magnetized medical device within the patient; and displaying over the patient on a see-through display screen of an alternative-reality headset having electronic circuitry including memory and a processor in a frame coupled to the display screen a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments.
  • the method further includes capturing in the memory of the console ultrasound-imaging frames in accordance with a pulsed-wave Doppler imaging mode of the ultrasound probe while emitting and receiving the ultrasound signals; stitching the ultrasound-imaging frames together with a stitching algorithm; and segmenting the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm.
  • the method further includes transforming the ultrasound-image segments into the objects of virtual anatomy with a virtualization algorithm; and sending both the virtual medical device and the objects of virtual anatomy to the alternative-reality headset for display over the patient.
  • the method further includes anchoring the virtual medical device and the objects of virtual anatomy to the patient over which the virtual medical device and the objects of virtual anatomy are displayed.
  • the method further includes capturing in the memory of the console eye movements of the wearer using one or more eye-tracking cameras coupled to the frame of the alternative-reality headset; and processing the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy corresponding to the focus of the wearer.
  • the method further includes capturing in the memory of the console gestures of the wearer using one or more patient-facing cameras coupled to the frame of the alternative-reality headset; and processing the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • the method further includes capturing in the memory of the console audio of the wearer using one or more microphones coupled to the frame of the alternative-reality headset; and processing the audio with an audio-command algorithm to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • a method of a medical device-placing system including, in some embodiments, emitting ultrasound signals into a patient and receiving echoed ultrasound signals from the patient by way of a piezoelectric sensor array of an ultrasound probe; transforming the echoed ultrasound signals with electronic circuitry in a frame of an alternative-reality headset including memory and a processor to produce ultrasound-image segments corresponding to anatomical structures of the patient; transforming magnetic-sensor signals from one or more magnetic sensors disposed within a housing of a medical-device TLS placed on a chest of the patient with the alternative-reality headset into location information for a magnetized medical device within the patient; and displaying over an environment including the patient on a see-through display screen of the alternative-reality headset for a wearer thereof a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments, and one or more graphical-control-element windows including output corresponding to one or more processes of the medical device-placing system.
  • the method further includes capturing in the memory of the alternative-reality headset eye movements of the wearer using one or more eye-tracking cameras coupled to the frame of the alternative-reality headset; and processing the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the virtual medical device, the objects of virtual anatomy, the one or more windows, or the output in the one or more windows corresponding to the focus of the wearer.
  • the method further includes capturing in the memory of the alternative-reality headset gestures of the wearer using one or more patient-facing cameras coupled to the frame of the alternative-reality headset; and processing the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset.
  • the method further includes enabling the wearer to anchor the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows to a persistent location on the display screen, a persistent location in a reference frame of a wearer of the alternative-reality headset, or a persistent location in the environment.
  • the method further includes enabling the wearer to transform the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows over the environment by way of translating, rotating, or resizing the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows.
  • FIG. 1 provides a block diagram for an anatomy-visualizing system in accordance with some embodiments.
  • FIG. 2 provides a block diagram for a medical device-locating system in accordance with some embodiments.
  • FIG. 3 provides a block diagram for a medical device-placing system in accordance with some embodiments.
  • FIG. 4 provides a block diagram for an ultrasound probe connected to a console of the anatomy-visualizing system in accordance with some embodiments.
  • FIG. 5 provides a block diagram for an alternative-reality headset of the anatomy-visualizing system in accordance with some embodiments.
  • FIG. 6A illustrates objects of virtual anatomy over a patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 6B illustrates a cross-sectioned enhancement of the objects of virtual anatomy over the patient as seen through the display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 7 provides a block diagram for a medical-device detector connected to a console of the medical device-locating system in accordance with some embodiments.
  • FIG. 8A provides a first medical-device detector in accordance with some embodiments.
  • FIG. 8B provides the first medical-device detector about a limb of a patient in accordance with some embodiments.
  • FIG. 9 provides a second medical-device detector about a limb of a patient in accordance with some embodiments.
  • FIG. 10 provides a block diagram for an ultrasound probe and a medical-device detector connected to a console of the medical device-placing system in accordance with some embodiments.
  • FIG. 11 A illustrates objects of virtual anatomy and a representation of a medical device over a patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 11B illustrates a zoomed-in enhancement of the objects of virtual anatomy and the representation of the medical device over the patient as seen through the display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 12 provides a block diagram for a medical device-placing system in accordance with some embodiments.
  • FIG. 13 provides a block diagram for an ultrasound probe and a tip-location sensor connected to a console of the medical device-placing system in accordance with some embodiments.
  • FIG. 14 illustrates objects of virtual anatomy and a representation of a medical device over a patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 15 illustrates the medical device-placing system including a stylet along with objects of virtual anatomy and a representation of a medical device over a patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 16 illustrates a block diagram for a wireless medical device-placing system without a console in accordance with some embodiments.
  • FIG. 17 illustrates the wireless medical device-placing system including a stylet along with a representation of a medical device within an object of virtual anatomy over a patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 18 illustrates a first representation of a first medical device within an object of virtual anatomy over a patient, a second representation of a second medical device over the patient, and windows including output of the medical device-placing system in an environment beside the patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 19 illustrates the first representation of the first medical device within the object of virtual anatomy over the patient, a third representation of the second medical device over the patient, and the windows including the output of the medical device-placing system in the environment beside the patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 20 illustrates the first representation of the first medical device within the object of virtual anatomy above the patient, the third representation of the second medical device above the patient, and the windows including the output of the medical device-placing system in the environment beside the patient as seen through a display screen of the alternative- reality headset in accordance with some embodiments.
  • FIG. 21 illustrates the first representation of the first medical device within the object of virtual anatomy in the environment away from the patient, the third representation of the second medical device in the environment away from the patient, and the windows including the output of the medical device-placing system in the environment away from the patient as seen through a display screen of the alternative-reality headset in accordance with some embodiments.
  • FIG. 22A provides a first view of a medical-device placing system in accordance with some embodiments.
  • FIG. 22B provides a second view of the medical-device placing system of FIG. 22A in accordance with some embodiments.
  • FIG. 22C provides a stylet for use with the medical-device placing system of FIGS. 22A and 22B in accordance with some embodiments.
  • a “proximal portion” or a “proximal end portion” of, for example, a medical device such as a catheter includes a portion of the catheter intended to be near a clinician when the catheter is used on a patient.
  • a “proximal length” of, for example, the catheter includes a length of the catheter intended to be near the clinician when the catheter is used on the patient.
  • A “proximal end” of, for example, the catheter includes an end of the catheter intended to be near the clinician when the catheter is used on the patient.
  • the proximal portion, the proximal end portion, or the proximal length of the catheter can include the proximal end of the catheter; however, the proximal portion, the proximal end portion, or the proximal length of the catheter need not include the proximal end of the catheter. That is, unless context suggests otherwise, the proximal portion, the proximal end portion, or the proximal length of the catheter is not a terminal portion or terminal length of the catheter.
  • a “distal portion” or a “distal end portion” of, for example, a medical device such as a catheter disclosed herein includes a portion of the catheter intended to be near or in a patient when the catheter is used on the patient.
  • a “distal length” of, for example, the catheter includes a length of the catheter intended to be near or in the patient when the catheter is used on the patient.
  • A “distal end” of, for example, the catheter includes an end of the catheter intended to be near or in the patient when the catheter is used on the patient.
  • the distal portion, the distal end portion, or the distal length of the catheter can include the distal end of the catheter; however, the distal portion, the distal end portion, or the distal length of the catheter need not include the distal end of the catheter. That is, unless context suggests otherwise, the distal portion, the distal end portion, or the distal length of the catheter is not a terminal portion or terminal length of the catheter.
  • alternative reality includes virtual reality, augmented reality, and mixed reality unless context suggests otherwise.
  • Virtual reality includes virtual content in a virtual setting, which setting can be a fantasy or a real-world simulation.
  • “Augmented reality” and “mixed reality” include virtual content in a real-world setting.
  • Augmented reality includes the virtual content in the real-world setting, but the virtual content is not necessarily anchored in the real-world setting.
  • the virtual content can be information overlying the real-world setting.
  • the information can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the information can change as a result of an experiencer of the augmented reality moving through the real-world setting - but the information remains overlying the real-world setting.
  • Mixed reality includes the virtual content anchored in every dimension of the real-world setting.
  • the virtual content can be a virtual object anchored in the real-world setting.
  • the virtual object can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the virtual object can change to accommodate the perspective of an experiencer of the mixed reality as the experiencer moves through the real-world setting.
  • the virtual object can also change in accordance with any interactions with the experiencer or another real-world or virtual agent.
  • Unless the virtual object is moved to another location in the real-world setting by the experiencer of the mixed reality, or some other real-world or virtual agent, the virtual object remains anchored in the real-world setting.
  • Mixed reality does not exclude the foregoing information overlying the real-world setting described in reference to augmented reality.
  • an ability to visualize anatomy such as the peripheral vasculature is needed.
  • an ability to visualize such anatomy in conjunction with medical devices such as guidewires and catheters is needed to finally make it possible to determine exactly where such medical devices are during placement thereof.
  • such abilities should not adversely affect patients or clinicians.
  • FIG. 1 provides a block diagram for an anatomy-visualizing system 100 in accordance with some embodiments.
  • FIG. 2 provides a block diagram for a medical device-locating system 200 in accordance with some embodiments.
  • FIG. 3 provides a block diagram for a medical device-placing system 300 in accordance with some embodiments.
  • FIG. 12 provides a block diagram for a medical device-placing system 1200 in accordance with some embodiments.
  • the anatomy-visualizing system 100 includes an ultrasound-imaging system 102 and an alternative-reality headset 130, wherein the ultrasound-imaging system 102 includes a console 110 and an ultrasound probe 120; the medical device-locating system 200 includes a console 210, a medical-device detector 240, and, optionally, the alternative-reality headset 130; and the medical device-placing system 300 includes a console 310, the ultrasound probe 120, the alternative-reality headset 130, and the medical-device detector 240.
  • the medical device-placing system 300 is a combination of at least some elements of the anatomy- visualizing system 100 and the medical device-locating system 200.
  • the medical device-placing system 1200 includes a console 1210, the ultrasound probe 120, and the alternative-reality headset 130. However, the medical device-placing system 1200 does not include the same medical device-locating system 200 as the medical device-placing system 300; instead, it includes a medical device-locating system having a medical-device tip-location sensor (“TLS”) 1240 in place of the medical-device detector 240.
  • the medical device-locating system of the medical device-placing system 1200 includes the console 1210, the alternative-reality headset 130, and the TLS 1240 as a medical-device detector.
  • the TLS 1240 is similar to that of TLS 50 of catheter-placement system 10 described in WO 2014/062728, which publication is incorporated by reference in its entirety into this application.
  • the medical device-placing system 1200 is a combination of at least some elements of the anatomy-visualizing system 100 and the catheter-placement system 10 of WO 2014/062728, particularly the TLS 50.
  • consoles 110, 210, 310, and 1210 need not be different consoles. That is, the consoles 110, 210, 310, and 1210 can be the same console.
  • that same console can be the console 310 of the medical device-placing system 300, wherein the console 310 is a combination of the console 110 of the anatomy-visualizing system 100 and the console 210 of the medical device-locating system 200.
  • components and functions of the console 110 described in reference to the anatomy-visualizing system 100 should be understood to apply to the anatomy-visualizing system 100, the medical device-placing system 300, or the medical device-placing system 1200.
  • components and functions of the console 210 described in reference to the medical device-locating system 200 should be understood to apply to the medical device-locating system 200, the medical device-placing system 300, or the medical device-placing system 1200.
  • the respective consoles 110, 210, 310, and 1210 are absent.
  • the alternative-reality headset 130 or another system component serves as the console or performs the functions (e.g., processing) thereof.
  • An example of such a medical device-placing system is medical device-placing system 1600 of FIG. 16.
  • FIG. 1 provides the block diagram for the anatomy-visualizing system 100.
  • the anatomy-visualizing system 100 includes the ultrasound-imaging system 102 and the alternative-reality headset 130, wherein the ultrasound-imaging system 102 includes the console 110 and the ultrasound probe 120.
  • FIG. 4 provides a block diagram for the ultrasound probe 120 connected to the console 110 of the anatomy-visualizing system 100 in accordance with some embodiments.
  • the console 110 has electronic circuitry including memory 412 and one or more processors 414 configured to transform echoed ultrasound signals from a patient with one or more algorithms 416 to produce ultrasound images and ultrasound-image segments therefrom corresponding to anatomical structures of the patient.
  • the console 110 is configured to capture in the memory 412 ultrasound-imaging frames (i.e., frame-by-frame ultrasound images) in accordance with a pulsed-wave Doppler imaging mode of the ultrasound probe 120, stitch the ultrasound-imaging frames together with a stitching algorithm of the one or more algorithms 416, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm of the one or more algorithms 416.
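The capture-stitch-segment pipeline described above can be illustrated in miniature. The Python sketch below is a toy stand-in under stated assumptions: frames are taken to overlap by a fixed, known number of columns, and segmentation is a bare intensity threshold. The actual stitching and image segmentation algorithms 416 are not disclosed at this level of detail, so nothing here should be read as the system's implementation.

```python
import numpy as np

def stitch_frames(frames, overlap):
    """Naive stitching: successive frames are assumed to overlap by a fixed
    number of columns; overlapping columns are averaged. A real stitcher
    would estimate the per-frame offset (e.g., by cross-correlation)."""
    stitched = frames[0].astype(float)
    for frame in frames[1:]:
        blended = (stitched[:, -overlap:] + frame[:, :overlap]) / 2.0
        stitched = np.hstack([stitched[:, :-overlap], blended,
                              frame[:, overlap:].astype(float)])
    return stitched

def segment(image, threshold):
    """Toy intensity-threshold segmentation standing in for the image
    segmentation algorithm: returns a binary mask of candidate anatomy."""
    return image >= threshold
```

The output mask plays the role of the "ultrasound-image segments" that downstream steps turn into objects of virtual anatomy.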
  • the console 110 is configured to transform the ultrasound-image segments into objects of virtual anatomy with a virtualization algorithm of the one or more algorithms 416.
  • the console 110 is configured to send the objects of virtual anatomy to the alternative-reality headset 130 for display over the patient by way of a wireless communications interface 418.
  • the console 110 and the electronic circuitry thereof including the memory 412 and the one or more processors 414 can also be configured to transform one or more sets of ECG signals with an ECG algorithm of the one or more algorithms 416 to correspondingly produce one or more ECGs.
  • one or more ECG electrodes such as one or more ECG-electrode pads are configured to generate the one or more sets of ECG signals in response to the electrical changes associated with the depolarization and the repolarization of a heart of the patient and provide the one or more sets of ECG signals to the console 110.
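Transforming a set of ECG signals into an ECG typically involves locating the R-peaks of the QRS complexes. The fragment below is a deliberately simple stand-in for the ECG algorithm of the one or more algorithms 416: a fixed-threshold local-maximum detector with a refractory period, plus a heart-rate estimate from the R-R intervals. Real detectors (e.g., Pan-Tompkins) filter the signal and adapt the threshold; the threshold and sampling rate here are illustrative assumptions.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold):
    """Find R-peak sample indices as local maxima above a fixed threshold,
    enforcing a 200 ms refractory period between detections."""
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if (ecg[i] >= threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from successive R-R intervals."""
    rr = np.diff(peaks) / fs            # seconds between beats
    return 60.0 / rr.mean()
```

In the system described here, the resulting beat locations and rate would be among the outputs shown in the ECG window of the display screen.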
  • the console 110 includes a number of components of the anatomy-visualizing system 100, and the console 110 can take any of a variety of forms to house the number of components.
  • the one or more processors 414 and the memory 412 (e.g., non-volatile memory such as electrically erasable programmable read-only memory [“EEPROM”]) of the console 110 are configured for controlling various functions of the anatomy-visualizing system 100 such as executing the one or more algorithms 416 during operation of the anatomy-visualizing system 100.
  • a digital controller or analog interface 420 is also included with the console 110, and the digital controller or analog interface 420 is in communication with the one or more processors 414 and other system components to govern interfacing among the probe 120, the alternative-reality headset 130, and other system components.
  • the console 110 further includes ports 422 for connection with additional, optional components such as the one or more ECG electrodes or optional components 424 including a printer, storage media, keyboard, etc.
  • the ports 422 can be universal serial bus (“USB”) ports, though other ports or a combination of ports can be used, as well as other interfaces or connections described herein.
  • a power connection 426 is included with the console 110 to enable operable connection to an external power supply 428.
  • An internal power supply 430 (e.g., a disposable or rechargeable battery) can also be employed, either with or exclusive of the external power supply 428.
  • Power management circuitry 432 is included with the digital controller or analog interface 420 of the console 110 to regulate power use and distribution.
  • a display 434 can be, for example, a liquid crystal display (“LCD”) integrated into the console 110 and used to display information to the clinician during a procedure.
  • the display 434 can be used to display an ultrasound image of a targeted internal body portion of the patient attained by the probe 120 or one or more ECGs.
  • the display 434 can be separate from the console 110 instead of integrated into the console 110; however, such a display is different than that of the alternative-reality headset 130.
  • the console 110 can further include a console button interface 436. In combination with control buttons on the probe 120, the console button interface 436 can be used by a clinician to immediately call up a desired mode on the display 434 for use by the clinician in the procedure.
  • the ultrasound probe 120 is configured to emit ultrasound signals into the patient and receive the echoed ultrasound signals from the patient by way of a piezoelectric sensor array 438.
  • the ultrasound probe 120 can be configured with a continuous wave or a pulsed-wave imaging mode.
  • the ultrasound probe 120 can be configured with the foregoing pulsed-wave Doppler imaging mode for emitting and receiving the ultrasound signals.
  • the probe 120 further includes a button-and-memory controller 440 for governing operation of the probe 120 and buttons thereof.
  • the button-and-memory controller 440 can include non-volatile memory such as EEPROM.
  • the button-and-memory controller 440 is in operable communication with a probe interface 442 of the console 110, which probe interface includes a piezoelectric input-output component 444 for interfacing with the piezoelectric sensor array 438 of the probe 120 and a button-and-memory input-output component 446 for interfacing with the button-and-memory controller 440 of the probe 120.
  • FIG. 5 provides a block diagram for the alternative-reality headset 130 of the anatomy-visualizing system 100 in accordance with some embodiments.
  • the alternative-reality headset 130, which can have a goggle-type or face shield-type form factor, includes a suitably configured display screen 512 and a window 514 thereover coupled to a frame 516 having electronic circuitry including memory 518 and one or more processors 520.
  • the display screen 512 is configured such that a wearer of the alternative-reality headset 130 can see an environment (e.g., operating room) including the patient through the display screen 512 in accordance with an opacity of the window 514, which opacity is adjustable with an opacity control 548.
  • the display screen 512 is configured to display objects of virtual anatomy over the environment such as over the patient, the objects of virtual anatomy corresponding to the ultrasound-image segments produced by the console 110 with the image segmentation algorithm. (See, for example, FIG. 6A, wherein the objects of virtual anatomy correspond to vasculature in a limb of the patient.)
• the alternative-reality headset 130 can be configured to three-dimensionally anchor the objects of virtual anatomy to the environment such as to the patient over which the objects of virtual anatomy are displayed, which allows the wearer of the alternative-reality headset 130 to see a true representation of the patient’s anatomy for one or more subsequent medical procedures (e.g., accessing a vessel and placing a medical device such as a guidewire or catheter in the vessel). Anchoring the objects of virtual anatomy to the environment or to the patient over which the objects of virtual anatomy are displayed is characteristic of mixed reality.
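Anchoring can be pictured as keeping a hologram's world coordinates fixed while recomputing only the world-to-view transform each frame from the headset pose. A simplified yaw-only sketch; all coordinates and function names are hypothetical, not taken from the disclosure:

```python
import math

def view_from_world(p_world, head_pos, head_yaw_rad):
    """Map a world-anchored point into headset (view) coordinates for a
    headset at head_pos with the given yaw. A world-anchored hologram keeps
    p_world fixed; only this view transform changes as the wearer moves."""
    dx = p_world[0] - head_pos[0]
    dz = p_world[2] - head_pos[2]
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    # Rotate the head-relative offset by the inverse head rotation.
    return (c * dx + s * dz, p_world[1] - head_pos[1], -s * dx + c * dz)

anchor = (0.0, 1.0, 2.0)  # virtual vessel anchored over the patient (illustrative)
v1 = view_from_world(anchor, (0.0, 0.0, 0.0), 0.0)
v2 = view_from_world(anchor, (0.5, 0.0, 0.0), math.radians(10))  # wearer moved
print(v1, v2)  # view-space position updates; the world anchor never moves
```

Real headsets refine the anchor continuously from tracking data, but the invariant is the same: the anchored geometry is stored in world space, never in view space.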
• the alternative-reality headset 130 can further include a perceptual user interface (“PUI”) configured to enable the wearer of the alternative-reality headset 130 to interact with the alternative-reality headset 130 without a physical input device such as a keyboard or mouse.
  • the PUI can have input devices including, but not limited to, one or more wearer-facing eye-tracking cameras 522, one or more patient-facing cameras 524, one or more microphones 526, or a combination thereof.
• At least one advantage of the PUI and the input devices thereof is that the clinician does not have to reach outside a sterile field to execute a command of the alternative-reality headset 130.
• the one or more eye-tracking cameras 522 can be coupled to the frame 516 and configured to capture eye movements of the wearer in a camera buffer 534 or the memory 518.
• the processor 520 of the alternative-reality headset 130 can be configured to process the eye movements with an eye-movement algorithm of one or more algorithms 528 to identify a focus of the wearer for selecting the objects of virtual anatomy or other representations (e.g., outlines of medical devices, virtual medical devices, etc.) corresponding to the focus of the wearer.
  • the focus of the wearer can be used by the PUI to select an object of virtual anatomy for enhancing the object of virtual anatomy by way of highlighting the object of virtual anatomy or increasing the contrast between the object of virtual anatomy and its environment.
  • the focus of the wearer can be used by the PUI to select an object of virtual anatomy for performing one or more other operations of the PUI such as zooming in on the object of virtual anatomy, providing a cross-section of the one or more objects of virtual anatomy, or the like. (See, for example, FIG.
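Gaze-based selection of this kind reduces to finding the displayed object whose center lies closest, in angle, to the wearer's gaze ray. A minimal sketch, assuming hypothetical object names and an arbitrary 5-degree selection threshold:

```python
import math

def select_by_gaze(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Return the id of the virtual-anatomy object closest to the gaze ray,
    or None if nothing falls within the angular threshold. `objects` maps
    ids to 3-D centers; all names and values here are illustrative."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    g = norm(gaze_dir)
    best, best_angle = None, max_angle_deg
    for oid, center in objects.items():
        to_obj = norm(tuple(c - o for c, o in zip(center, gaze_origin)))
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, to_obj))))
        angle = math.degrees(math.acos(dot))
        if angle < best_angle:
            best, best_angle = oid, angle
    return best

objects = {"basilic_vein": (0.0, 0.0, 1.0), "brachial_artery": (0.3, 0.0, 1.0)}
print(select_by_gaze((0, 0, 0), (0.01, 0.0, 1.0), objects))  # → basilic_vein
```

Once an object id is selected, the same result can drive highlighting, zooming, or a cross-section view as described above.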
  • the one or more patient-facing cameras 524 can be coupled to the frame 516 and configured to capture gestures of the wearer in a camera buffer 534 or the memory 518.
• the processor 520 of the alternative-reality headset 130 can be configured to process the gestures with a gesture-command algorithm of the one or more algorithms 528 to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset 130.
• the one or more microphones 526 can be coupled to the frame 516 and configured to capture audio of the wearer such as spoken commands.
• the processor 520 of the alternative-reality headset 130 can be configured to process the audio with an audio-command algorithm of the one or more algorithms 528 to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset 130.
  • the electronic circuitry includes the processor 520, a memory controller 530 in communication with the memory 518 (e.g., dynamic random-access memory [“DRAM”]), a camera interface 532, the camera buffer 534, a display driver 536, a display formatter 538, a timing generator 540, a display-out interface 542, and a display-in interface 544.
• the foregoing components of the electronic circuitry can be in communication with each other through the processor 520, dedicated lines of one or more buses, or a combination thereof.
• the camera interface 532 is configured to provide an interface to the one or more eye-tracking cameras 522 and the one or more patient-facing cameras 524, as well as store respective images received from the cameras 522, 524 in the camera buffer 534 or the memory 518.
  • Each camera of the one or more eye-tracking cameras 522 can be an infrared (“IR”) camera or a position-sensitive detector (“PSD”) configured to track eye-glint positions by way of IR reflections or eye glint-position data, respectively.
• the display driver 536 is configured to drive the display screen 512.
  • the display formatter 538 is configured to provide display-formatting information for the objects of virtual anatomy to the one or more processors 414 of the console 110 for formatting the objects of virtual anatomy for display on the display screen 512 over the environment such as over the patient.
  • the timing generator 540 is configured to provide timing data for the alternative-reality headset 130.
  • the display-out interface 542 includes a buffer for providing images from the one or more eye-tracking cameras 522 or the one or more patient-facing cameras 524 to the one or more processors 414 of the console 110.
  • the display-in interface 544 includes a buffer for receiving images such as the objects of virtual anatomy to be displayed on the display screen 512.
• the display-out and display-in interfaces 542, 544 are configured to communicate with the console 110 by way of a wireless communications interface 546.
  • the opacity control 548 is configured to change a degree of opacity of the window 514.
• Additional electronic circuitry includes a voltage regulator 550, an eye-tracking illumination driver 552, an audio digital-to-analog converter (“DAC”) and amplifier 554, a microphone preamplifier and audio analog-to-digital converter (“ADC”) 556, a temperature sensor interface 558, and a clock generator 560.
  • the voltage regulator 550 is configured to receive power from an internal power supply 562 (e.g., a battery) or an external power supply 564 through power connection 566.
  • the voltage regulator 550 is configured to provide the received power to the electronic circuitry of the alternative-reality headset 130.
• the eye-tracking illumination driver 552 is configured to control an eye-tracking illumination unit 568 by way of a drive current or voltage to operate about a predetermined wavelength or within a predetermined wavelength range.
  • the audio DAC and amplifier 554 is configured to provide audio data to earphones or speakers 570.
  • the microphone preamplifier and audio ADC 556 is configured to provide an interface for the one or more microphones 526.
  • the temperature sensor interface 558 is configured as an interface for a temperature sensor 572.
  • the alternative-reality headset 130 can include orientation sensors including a three-axis magnetometer 574, a three-axis gyroscope 576, and a three-axis accelerometer 578 configured to provide orientation-sensor data for determining an orientation of the alternative-reality headset 130 at any given time.
  • the alternative-reality headset 130 can include a global-positioning system (“GPS”) receiver 580 configured to receive GPS data (e.g., time and position information for one or more GPS satellites) for determining a location of the alternative-reality headset 130 at any given time.
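Orientation from the sensors above is commonly estimated by fusing the gyroscope (accurate short-term, but drifting) with the accelerometer's gravity reference. A complementary filter is one simple sketch of such fusion; the filter constant and axis conventions are assumptions, not details from the disclosure:

```python
import math

def complementary_pitch(pitch_prev_deg, gyro_rate_dps, accel, dt, alpha=0.98):
    """One update of a complementary filter: integrate the gyroscope for
    short-term accuracy and pull toward the accelerometer's gravity-derived
    pitch to cancel drift. `accel` is (ax, ay, az) in units of g."""
    ax, ay, az = accel
    pitch_accel = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    pitch_gyro = pitch_prev_deg + gyro_rate_dps * dt
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

# Headset held still and level: gyro reports no rotation, accelerometer
# reads pure gravity, and accumulated drift decays away.
pitch = 5.0  # degrees of accumulated drift
for _ in range(200):
    pitch = complementary_pitch(pitch, 0.0, (0.0, 0.0, 1.0), 0.01)
print(round(pitch, 2))  # ≈ 0.09
```

The magnetometer plays the analogous role for yaw, where gravity gives no reference; production headsets typically use a full quaternion filter rather than per-axis angles.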
  • FIG. 2 provides the block diagram for the medical device-locating system 200 in accordance with some embodiments.
  • the medical device-locating system 200 includes the console 210, the medical-device detector 240 including an array of magnetic sensors 242, and, optionally, the alternative-reality headset 130.
  • FIG. 7 provides a block diagram for the medical-device detector 240 connected to the console 210 of the medical device-locating system 200 in accordance with some embodiments.
• the console 210 has electronic circuitry including memory 712 and one or more processors 714 configured to transform magnetic-sensor signals from the array of magnetic sensors 242 with one or more algorithms 716 (e.g., a location-finding algorithm including, for example, triangulation) into location information for a magnetized medical device (e.g., a catheter including a magnetic element) within a limb of a patient when the medical-device detector 240 is placed about the limb of the patient.
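Location-finding from an array of magnetic sensors can be illustrated with a coarse least-squares search against a simplified point-source field model; the 1/r^3 falloff, the planar geometry, and all values below are illustrative assumptions rather than the disclosed algorithm:

```python
import math

def locate_magnet(sensors, readings, k=1.0):
    """Coarse grid search for a magnetized tip: find the (x, y) whose
    predicted field magnitudes (k / r**3 point-source falloff, a simplifying
    assumption) best match the sensor readings in least-squares terms."""
    def predicted(px, py):
        return [k / (math.hypot(px - sx, py - sy) ** 3) for sx, sy in sensors]
    best, best_err = None, float("inf")
    for ix in range(101):
        for iy in range(101):
            px, py = ix / 100.0, iy / 100.0
            err = sum((p - m) ** 2 for p, m in zip(predicted(px, py), readings))
            if err < best_err:
                best, best_err = (px, py), err
    return best

# Four sensors around a limb; synthesize readings for a tip at (0.40, 0.60).
sensors = [(-0.2, 0.0), (1.2, 0.0), (-0.2, 1.0), (1.2, 1.0)]
tip = (0.40, 0.60)
readings = [1.0 / (math.hypot(tip[0] - sx, tip[1] - sy) ** 3) for sx, sy in sensors]
print(locate_magnet(sensors, readings))  # ≈ (0.4, 0.6)
```

A practical implementation would use a full dipole model (orientation as well as position) and a gradient-based solver instead of a grid, but the residual-minimization structure is the same.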
• the console 210 includes a number of components of the medical device-locating system 200, and the console 210 can take any one of a variety of forms to house the number of components.
  • the one or more processors 714 and the memory 712 (e.g., non-volatile memory such as EEPROM) of the console 210 are configured for controlling various functions of the medical device-locating system 200 such as executing the one or more algorithms 716 during operation of the medical device-locating system 200.
• a digital controller or analog interface 720 is also included with the console 210, and the digital controller or analog interface 720 is in communication with the one or more processors 714 and other system components to govern interfacing among the medical-device detector 240, the alternative-reality headset 130, and other system components.
  • the console 210 can also be configured with a wireless communications interface 418 to send to the alternative-reality headset 130 location information, or a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for a magnetized medical device within a limb of a patient for display on the display screen 512 of the alternative-reality headset 130.
• (See, for example, FIGS. 11A and 11B, wherein the objects of virtual anatomy correspond to vasculature in the limb of the patient, and wherein a representation of a medical device such as a guidewire or catheter is being advanced therethrough.)
• the console 210 further includes ports 722 for connection with the medical-device detector 240 as well as additional, optional components such as a magnetic-field generator 740, a printer, storage media, keyboard, etc.
• the ports 722 can be USB ports, though other ports or a combination of ports can be used, as well as other interfaces or connections described herein.
  • a power connection 726 is included with the console 210 to enable operable connection to an external power supply 728.
• An internal power supply 730 (e.g., a disposable or rechargeable battery) can also be included with the console 210.
  • Power management circuitry 732 is included with the digital controller or analog interface 720 of the console 210 to regulate power use and distribution.
  • a display 734 can be, for example, an LCD integrated into the console 210 and used to display information to the clinician during a procedure.
  • the display 734 can be used to display location information, or depict a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for a magnetized medical device within a limb of a patient.
  • the display 734 can be separate from the console 210 instead of integrated into the console 210; however, such a display is different than that of the alternative-reality headset 130, which can also be configured to display location information (e.g., as a location-information overlay), or depict a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for a magnetized medical device within a limb of a patient.
  • the console 210 can further include a console button interface 736.
  • the console button interface 736 can be used by a clinician to immediately call up a desired mode (e.g., a mode with the magnetic-field generator 740, a mode without the magnetic-field generator 740, etc.) on the display 734 for use by the clinician in the procedure.
  • FIG. 8A provides a first medical-device detector 800 in accordance with some embodiments.
  • FIG. 8B provides the first medical-device detector 800 about a limb of a patient in accordance with some embodiments.
• FIG. 9 provides a second medical-device detector 900 about a limb of a patient in accordance with some embodiments.
  • each medical-device detector of the first medical-device detector 800 and the second medical-device detector 900 includes the array of magnetic sensors 242 embedded within a housing 810, 910 configured for placement about a limb (e.g., an arm or a leg) of a patient.
  • the console 210 is configured to transform magnetic-sensor signals from the array of magnetic sensors 242 with the one or more algorithms 716 (e.g., a location-finding algorithm) into location information, or the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for a magnetized medical device within the limb of the patient when the medical- device detector 800, 900 is placed about the limb of the patient.
  • the housing 810 of the first medical-device detector 800 is a rigid frame. Each magnetic sensor of the array of magnetic sensors 242 embedded within the frame has a fixed spatial relationship to another magnetic sensor.
  • the fixed spatial relationship is communicated to the console 210 upon connecting the first medical-device detector 800 to a port of the ports 722 of the console 210 or calling up one or more modes with the console button interface 736 of the console 210 for using the first medical-device detector 800 without the magnetic-field generator 740.
  • the console 210 is able to transform the magnetic-sensor signals from the array of magnetic sensors 242 into the location information, or the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for the magnetized medical device within the limb of the patient.
• the housing 810 of the first medical-device detector 800 can further include one or more light-emitting diodes (“LEDs”) or lasers embedded within the frame such as within a strut 812 of the frame.
• the one or more LEDs or lasers can be configured to illuminate the limb of the patient about which the first medical-device detector 800 is placed, or the one or more LEDs or lasers can be configured to illuminate just a portion of the limb of the patient.
  • the portion of the limb of the patient can be the portion under which a tip of the medical device is located within the limb of the patient. (See, for example, FIG.
  • the one or more LEDs or lasers can function as a real-world light-based pointing system for identifying a medical device’s location.
  • the light-based pointing system can be used in conjunction with the alternative-reality headset 130 for confirmation of a medical device’s location as the illumination provided by the light-based pointing system is visible through the see-through display screen 512 of the alternative-reality headset 130.
• the housing 910 of the second medical-device detector 900 is a drape.
  • Each magnetic sensor of the array of magnetic sensors 242 embedded within the drape has a variable spatial relationship to another magnetic sensor depending upon how the drape is placed about the limb of the patient.
  • the medical device-locating system 200 can further include the magnetic-field generator 740, which is configured to generate a magnetic field about the second medical-device detector 900 for determining the spatial relationship of one magnetic sensor of the array of magnetic sensors 242 to another magnetic sensor.
• Each magnetic sensor present in the array of magnetic sensors 242 is communicated to the console 210 upon connecting the second medical-device detector 900 to a port of the ports 722 of the console 210 or calling up one or more modes with the console button interface 736 of the console 210 for using the second medical-device detector 900 with the magnetic-field generator 740.
• the console 210 is configured to determine the spatial relationship of each magnetic sensor to another magnetic sensor from the magnetic-sensor signals produced by the array of magnetic sensors 242 while in the presence of the generated magnetic field. This is made possible in part by each magnetic sensor of the array of magnetic sensors 242 being in a unique magnetic environment with respect to at least the strength and orientation of the generated magnetic field.
  • the console 210 is able to transform the magnetic-sensor signals from the array of magnetic sensors 242 into the location information, or the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information, for the magnetized medical device within the limb of the patient.
• the determined spatial relationship of the array of magnetic sensors 242 can be periodically confirmed in the presence of a newly generated magnetic field, taking into account the medical device within the limb of the patient.
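Determining sensor positions from a known generated field can be reduced, in the simplest case, to inverting the field model for each sensor's distance from the generator. A sketch under an assumed 1/r^3 magnitude model; the model and values are illustrative only:

```python
def sensor_distance_from_field(reading, k=1.0):
    """Invert a simplified k / r**3 field-magnitude model from the
    magnetic-field generator to recover a drape-mounted sensor's distance
    from the generator. With distances from two or more generator poses,
    full sensor positions could be trilaterated; this shows only the
    single-distance inversion."""
    return (k / reading) ** (1.0 / 3.0)

# A sensor reading 0.008 (arbitrary units) sits 5 distance units away.
print(round(sensor_distance_from_field(0.008), 3))  # → 5.0
```

Repeating the inversion for every sensor in the drape, against one or more generator positions, yields the variable spatial relationships the console needs before it can triangulate the medical device itself.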
• FIG. 3 provides the block diagram for the medical device-placing system 300 in accordance with some embodiments.
• the medical device-placing system 300 can include the ultrasound probe 120 of the anatomy-visualizing system 100, the medical-device detector 240 including the array of magnetic sensors 242 of the medical device-locating system 200, the alternative-reality headset 130, and the console 310, which includes electronic circuitry like that of both console 110 and console 210.
  • FIG. 10 provides a block diagram for the ultrasound probe 120 and the medical- device detector 240 connected to the console 310 of the medical device-placing system 300 in accordance with some embodiments.
  • the console 310 has electronic circuitry including memory 1012 and one or more processors 1014. Like the console 110, the console 310 is configured to transform echoed ultrasound signals from a patient with one or more algorithms 1016 to produce ultrasound images and ultrasound-image segments therefrom corresponding to anatomical structures of the patient. The console 310 is configured to capture in the memory 1012 ultrasound-imaging frames in accordance with a pulsed-wave Doppler imaging mode of the ultrasound probe 120, stitch the ultrasound-imaging frames together with a stitching algorithm of the one or more algorithms 1016, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm of the one or more algorithms 1016.
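The stitching step can be thought of as registering overlapping frames by the shift that maximizes their correlation. A one-dimensional sketch; real ultrasound stitching operates on two- or three-dimensional frames, and everything here is illustrative:

```python
def best_offset(frame_a, frame_b, max_shift=10):
    """Estimate the translation between two overlapping ultrasound frames
    by maximizing the cross-correlation of their samples, a minimal
    stand-in for a stitching algorithm's registration step."""
    def score(shift):
        return sum(a * b for a, b in zip(frame_a[shift:], frame_b))
    return max(range(max_shift + 1), key=score)

# frame_b is frame_a shifted by 3 samples (simulated probe motion).
frame_a = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 0, 0]
frame_b = frame_a[3:] + [0, 0, 0]
print(best_offset(frame_a, frame_b))  # → 3
```

With the per-frame offsets known, successive frames can be composited into one extended image, which the segmentation algorithm then partitions into anatomical structures.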
  • the console 310 is configured to transform the ultrasound-image segments into objects of virtual anatomy with a virtualization algorithm of the one or more algorithms 1016. Also like the console 110, the console 310 and the electronic circuitry thereof including the memory 1012 and the one or more processors 1014 can also be configured to transform one or more sets of ECG signals with an ECG algorithm of the one or more algorithms 1016 to correspondingly produce one or more ECGs.
  • one or more ECG electrodes 1050 such as one or more ECG-electrode pads, the stylet 1752 (see FIG. 15), or a combination thereof are configured to generate the one or more sets of ECG signals in response to the electrical changes associated with the depolarization and the repolarization of a heart of the patient and provide the one or more sets of ECG signals to the console 310.
  • the console 310 is configured to transform magnetic- sensor signals from the array of magnetic sensors 242 with one or more algorithms 1016 (e.g., a location-finding algorithm) into location information for a magnetized medical device within a limb of the patient when the medical-device detector 240 is placed about the limb of the patient.
  • the console 310 is configured to send to the alternative-reality headset 130 by way of a wireless communications interface 1018 both the objects of virtual anatomy and a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) within the limb of the patient, in accordance with the location information, for display over the environment or the patient on the display screen 512 of the alternative-reality headset 130.
  • the alternative-reality headset 130 can be configured to anchor the objects of virtual anatomy and the representation of the medical device to the environment or the patient, which is characteristic of mixed reality.
• the console 310 includes a number of components of the medical device-placing system 300, and the console 310 can take any one of a variety of forms to house the number of components.
• the one or more processors 1014 and the memory 1012 (e.g., non-volatile memory such as EEPROM) of the console 310 are configured for controlling various functions of the medical device-placing system 300 such as executing the one or more algorithms 1016 during operation of the medical device-placing system 300.
• a digital controller or analog interface 1020 is also included with the console 310, and the digital controller or analog interface 1020 is in communication with the one or more processors 1014 and other system components to govern interfacing among the probe 120, the medical-device detector 240, the alternative-reality headset 130, and other system components.
• the console 310 further includes ports 1022 for connection with the medical-device detector 240 as well as additional, optional components such as the magnetic-field generator 740, the one or more ECG electrodes 1050 such as that of the stylet 1752 (see FIG. 15), or the optional components 424 (e.g., a printer, storage media, keyboard, etc.).
• the ports 1022 can be USB ports, though other ports or a combination of ports can be used, as well as other interfaces or connections described herein.
  • a power connection 1026 is included with the console 310 to enable operable connection to an external power supply 1028.
• An internal power supply 1030 (e.g., a disposable or rechargeable battery) can also be included with the console 310.
  • Power management circuitry 1032 is included with the digital controller or analog interface 1020 of the console 310 to regulate power use and distribution.
• the medical-device detector 240 is configured with one or more ports or complementary connectors for connection of the one or more ECG electrodes 1050 including that of the stylet 1752.
  • a display 1034 can be, for example, an LCD integrated into the console 310 and used to display information to the clinician during a procedure.
  • the display 1034 can be used to display an ultrasound image of a targeted internal body portion of the patient attained by the probe 120 or one or more ECGs, the location information for a medical device within a limb of the patient, or a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information for the medical device within the limb of the patient.
  • the display 1034 can be separate from the console 310 instead of integrated into the console 310; however, such a display is different than that of the alternative-reality headset 130, which can also be configured to display the objects of virtual anatomy and the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) within the limb of the patient.
  • the alternative-reality headset 130 can also be configured to display the objects of virtual anatomy and the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) within the limb of the patient.
  • the console 310 can further include a console button interface 1036.
  • the console button interface 1036 can be used by a clinician to immediately call up a desired ultrasound-imaging mode (e.g., a continuous wave imaging mode or a pulsed-wave imaging mode) on the display 1034 for use by the clinician in the procedure.
• the console button interface 1036 can be used by the clinician to immediately call up a desired medical device-locating mode (e.g., a mode with the magnetic-field generator 740, a mode without the magnetic-field generator 740, etc.) on the display 1034 for use by the clinician in the procedure.
  • FIG. 12 provides the block diagram for the medical device-placing system 1200 in accordance with some embodiments.
• the medical device-placing system 1200 can include the ultrasound probe 120 of the anatomy-visualizing system 100, the alternative-reality headset 130, and the console 1210, which includes electronic circuitry like that of the console 110.
• the medical device-placing system 1200 includes the TLS 1240, similar to the TLS 50 of the catheter-placement system 10 described in WO 2014/062728, which publication is incorporated by reference in its entirety into this application.
• FIG. 13 provides a block diagram for the ultrasound probe 120 and the TLS 1240 connected to the console 1210 of the medical device-placing system 1200 in accordance with some embodiments.
  • the console 1210 has electronic circuitry including memory 1312 and one or more processors 1314. Like the console 110, the console 1210 is configured to transform echoed ultrasound signals from a patient with one or more algorithms 1316 to produce ultrasound images and ultrasound-image segments therefrom corresponding to anatomical structures of the patient.
  • the console 1210 is configured to capture in the memory 1312 ultrasound-imaging frames in accordance with a pulsed-wave Doppler imaging mode of the ultrasound probe 120, stitch the ultrasound-imaging frames together with a stitching algorithm of the one or more algorithms 1316, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm of the one or more algorithms 1316.
  • the console 1210 is configured to transform the ultrasound-image segments into objects of virtual anatomy with a virtualization algorithm of the one or more algorithms 1316.
• the console 1210 and the electronic circuitry thereof including the memory 1312 and the one or more processors 1314 can also be configured to transform one or more sets of ECG signals with an ECG algorithm of the one or more algorithms 1316 to correspondingly produce one or more ECGs.
  • the one or more ECG electrodes 1050 such as one or more ECG-electrode pads, the stylet 1752 (see FIG. 15), or a combination thereof are configured to generate the one or more sets of ECG signals in response to the electrical changes associated with the depolarization and the repolarization of a heart of the patient and provide the one or more sets of ECG signals to the console 1210.
• the console 1210 is configured to transform TLS signals from the TLS 1240 into location information for a magnetized medical device (e.g., a PICC) within the patient.
  • the console 1210 is configured to send to the alternative-reality headset 130 by way of a wireless communications interface 1318 both the objects of virtual anatomy and a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) within the limb of the patient, in accordance with the location information, for display over the environment or the patient on the display screen 512 of the alternative-reality headset 130.
  • FIG. 14 illustrates the objects of virtual anatomy and a representation of a medical device over a patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
• FIG. 15 illustrates the medical device-placing system 1200 including the stylet 1752 along with the objects of virtual anatomy and the representation of the medical device over a patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
• the alternative-reality headset 130 can be configured to anchor the objects of virtual anatomy and the representation of the medical device to the environment or the patient, which is characteristic of mixed reality.
• with respect to a medical device such as a guidewire or a catheter 1460 (e.g., a PICC), the objects of virtual anatomy displayed on the display screen 512 can be restricted to the circulatory system of the patient using the Doppler imaging mode of the ultrasound probe 120 as shown by the vasculature of FIGS. 6A and 6B, and the guidewire or the catheter 1460 can be displayed on the display screen 512 within the vasculature of the patient.
  • the medical device-placing system 1200 can be configured to track and display advancement of the guidewire or the catheter 1460 to a desired location all the way up a superior vena cava (“SVC”) of the patient proximate at least a sinoatrial node in a right atrium of a heart of the patient, which includes displaying placement of a representation of a medical device such as the guidewire or the catheter 1460 in an object of virtual anatomy such as a virtual SVC as shown in FIG. 14.
  • a distal-end portion of the representation of the medical device can be configured to indicate proximity of the guidewire or the catheter 1460 to the desired location in the patient by way of a visual indicator (e.g., color gradient, flashing speed, glowing intensity, etc.) of the representation of the medical device.
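A distance-driven visual indicator such as the color gradient mentioned above can be sketched as a simple linear mapping from tip-to-target distance to color; the thresholds and the red-to-green ramp are assumptions for illustration:

```python
def tip_color(distance_mm, near_mm=10.0, far_mm=100.0):
    """Map the tracked tip's distance-to-target to an RGB color:
    red when far from the desired location, green when at it.
    Thresholds are illustrative, not from the disclosure."""
    t = max(0.0, min(1.0, (far_mm - distance_mm) / (far_mm - near_mm)))
    return (int(255 * (1 - t)), int(255 * t), 0)

print(tip_color(100.0), tip_color(10.0))  # → (255, 0, 0) (0, 255, 0)
```

The same scalar `t` could equally drive flashing speed or glow intensity, the other indicator styles the passage mentions.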
• because the console 1210 is configured with the ports 1322 as set forth below, the one or more ECG electrodes 1050, such as that of the stylet 1752, which includes an ECG electrode in a distal-end portion of the stylet 1752, can be connected to the console 1210.
  • the one or more ECG electrodes 1050 can provide one or more sets of ECG signals to the console 1210 when connected to the console 1210.
• the one or more sets of ECG signals can be used to animate the heart of the patient when displayed as an object of virtual anatomy on the display screen 512 of the alternative-reality headset 130.
  • Animating the heart of the patient includes animating a heartbeat of the heart as shown in FIG. 14.
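Animating the heartbeat from the ECG requires extracting beat times from the signal. A deliberately naive R-peak detector is sketched below; a real system would filter the ECG first (e.g., with a Pan-Tompkins-style pipeline), and the threshold and refractory values are illustrative:

```python
def r_peaks(ecg, fs_hz, thresh=0.6, refractory_s=0.25):
    """Naive R-peak detector: local maxima above a threshold, separated by
    a refractory period. The returned peak times (s) could drive the
    heartbeat animation of the virtual heart."""
    peaks, last = [], -refractory_s
    for i in range(1, len(ecg) - 1):
        t = i / fs_hz
        if (ecg[i] > thresh and ecg[i] >= ecg[i - 1] and ecg[i] >= ecg[i + 1]
                and t - last >= refractory_s):
            peaks.append(t)
            last = t
    return peaks

# Synthetic 1 Hz "heartbeat": a spike every second at 100 samples/s.
fs = 100
ecg = [1.0 if i % fs == 50 else 0.0 for i in range(3 * fs)]
print(r_peaks(ecg, fs))  # → [0.5, 1.5, 2.5]
```

Each detected peak time would then trigger one contraction cycle of the displayed virtual heart, keeping the animation synchronized to the patient's actual rhythm.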
• the console 1210 includes a number of components of the medical device-placing system 1200, and the console 1210 can take any one of a variety of forms to house the number of components.
• the one or more processors 1314 and the memory 1312 (e.g., non-volatile memory such as EEPROM) of the console 1210 are configured for controlling various functions of the medical device-placing system 1200 such as executing the one or more algorithms 1316 during operation of the medical device-placing system 1200.
  • a digital controller or analog interface 1320 is also included with the console 1210, and the digital controller or analog interface 1320 is in communication with the one or more processors 1314 and other system components to govern interfacing among the probe 120, the TLS 1240, the alternative-reality headset 130, and other system components.
  • the console 1210 further includes ports 1322 for connection with the TLS 1240 as well as additional, optional components such as the one or more ECG electrodes 1050 such as that of the stylet 1752 (see FIG. 15), or the optional components 424 (e.g., a printer, storage media, keyboard, etc.).
  • the ports 1322 can be USB ports, though other ports or a combination of ports can be used, as well as other interfaces or connections described herein.
  • a power connection 1326 is included with the console 1210 to enable operable connection to an external power supply 1328.
  • An internal power supply 1330 (e.g., a disposable or rechargeable battery) can also be employed with the console 1210.
  • Power management circuitry 1332 is included with the digital controller or analog interface 1320 of the console 1210 to regulate power use and distribution.
  • the TLS 1240 is configured with one or more ports or complementary connectors for connection of the one or more ECG electrodes 1050 including that of the stylet 1752.
  • the one or more complementary connectors of the TLS 1240 are similar to that of the TLS 50 of catheter-placement system 10 described in WO 2010/030820, which publication is incorporated by reference in its entirety into this application.
  • the TLS 1240 can be configured to connect to the stylet 1752 through a sterile drape separating a sterile field including the stylet 1752 from a non-sterile field including the TLS 1240.
  • a display 1334 can be, for example, an LCD integrated into the console 1210 and used to display information to the clinician during a procedure.
  • the display 1334 can be used to display an ultrasound image of a targeted internal body portion of the patient attained by the probe 120, one or more ECGs, the location information for a medical device within a limb of the patient, or a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) in accordance with the location information for the medical device within the limb of the patient.
  • the display 1334 can be separate from the console 1210 instead of integrated into the console 1210; however, such a display is different than that of the alternative-reality headset 130, which can also be configured to display the objects of virtual anatomy and the representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) within the limb of the patient.
  • the console 1210 can further include a console button interface 1336.
  • the console button interface 1336 can be used by a clinician to immediately call up a desired ultrasound-imaging mode (e.g., a continuous wave imaging mode or a pulsed-wave imaging mode) on the display 1334 for use by the clinician in the procedure.
  • the console button interface 1336 can be used by the clinician to immediately call up a desired medical device-locating mode on the display 1334 for use by the clinician in the procedure.
  • in some embodiments, the alternative-reality headset 130 can serve as the console.
  • the alternative-reality headset 130, in such embodiments, further includes the necessary electronic circuitry, algorithms, or the like set forth above to function as the console 110, 210, 310, or 1210.
  • FIG. 16 illustrates a block diagram for a wireless medical device-placing system 1600 in accordance with some embodiments.
  • FIG. 17 illustrates the wireless medical device-placing system 1600 including the stylet 1752 along with a representation of a medical device (e.g., PICC) within an object of virtual anatomy (e.g., SVC) over a patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
  • Reference numerals for components of the anatomy-visualizing system 100, the medical device-locating system 200, the medical device-placing system 300, or the medical device-placing system 1200 in common with the medical device-placing system 1600 are retained in the description set forth below for the medical device-placing system 1600 for direct reference to the description set forth above.
  • certain components (e.g., the ultrasound probe 120, the medical-device detector 240, the TLS 1240, etc.) further include a wireless communications interface and electronic circuitry in support of wireless communications with the alternative-reality headset 130.
  • the medical device-placing system 1600 can include the ultrasound probe 120 of the anatomy-visualizing system 100, the medical-device detector 240 of the medical device-locating system 200 or the TLS 1240 of the medical device-placing system 1200, and the alternative-reality headset 130 configured to wirelessly communicate with the ultrasound probe 120 and the medical-device detector 240 or the TLS 1240.
  • the medical device-placing system 1600 optionally further includes the one or more ECG electrodes 1050 including the stylet 1752.
  • the frame 516 of the alternative-reality headset 130 has electronic circuitry including memory and a processor (e.g., the memory 518 and the one or more processors 520) configured to transform echoed ultrasound signals from a patient with one or more algorithms 528 to produce ultrasound images and ultrasound-image segments therefrom corresponding to anatomical structures of the patient.
  • the alternative-reality headset 130 is configured to capture in the memory 518 ultrasound-imaging frames (i.e., frame-by-frame ultrasound images) in accordance with an imaging mode of the ultrasound probe 120, stitch the ultrasound-imaging frames together with a stitching algorithm of the one or more algorithms 528, and segment the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with an image segmentation algorithm of the one or more algorithms 528.
  • the alternative-reality headset 130 is configured to transform the ultrasound-image segments into objects of virtual anatomy with a virtualization algorithm of the one or more algorithms 528 for display on the display screen 512 over an environment including the patient.
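The capture-stitch-segment pipeline can be pictured with toy stand-ins. The two helpers below are assumptions for illustration only: stitching is reduced to concatenating frames along the probe's sweep direction, and segmentation to intensity thresholding, whereas the algorithms 528 would use feature registration and anatomical models.

```python
def stitch_frames(frames):
    """Combine successive ultrasound frames into one image by concatenating
    rows along the probe's sweep direction (stand-in for real stitching)."""
    stitched = []
    for frame in frames:        # each frame is a list of rows of intensities
        stitched.extend(frame)
    return stitched

def segment_image(image, threshold=0.5):
    """Label each pixel as anatomy (1) or background (0) by intensity -
    a placeholder for the image-segmentation algorithm."""
    return [[1 if px > threshold else 0 for px in row] for row in image]
```

The labeled segments would then feed the virtualization step that builds displayable objects of virtual anatomy.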
  • the alternative-reality headset 130 and the electronic circuitry thereof including the memory 518 and the one or more processors 520 can also be configured to transform one or more sets of ECG signals with an ECG algorithm of the one or more algorithms 528 to correspondingly produce one or more ECGs.
  • the one or more ECG electrodes 1050, such as one or more ECG-electrode pads or the stylet 1752 (see FIG. 17), are configured to generate the one or more sets of ECG signals in response to the electrical changes associated with the depolarization and the repolarization of a heart of the patient and provide the one or more sets of ECG signals to the alternative-reality headset 130 by way of the TLS 1240.
  • the electronic circuitry of the frame 516 including the memory 518 and the one or more processors 520 is configured to transform TLS signals (e.g., magnetic-sensor signals from the one or more magnetic sensors 1242 disposed in the housing of the TLS 1240 with a fixed spatial relationship) with one or more algorithms 528 (e.g., a location-finding algorithm) into location information for a magnetized medical device (e.g., a PICC) within the patient for display of a representation of the medical device (e.g., an outline of the medical device, a virtual medical device, etc.) on the display screen 512 of the alternative-reality headset 130 in accordance with the location information.
  • the representation of the medical device can be displayed on the display screen 512 within the objects of virtual anatomy in accordance with the location information for the medical device.
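One way to picture how magnetic-sensor signals become location information: assume each sensor reports a field magnitude falling off roughly as 1/r³ from the magnetized tip, and search a grid of candidate positions for the best fit. This is a deliberately simplified stand-in for the location-finding algorithm; the sensor layout, field model, and grid search are all illustrative assumptions.

```python
def locate_tip(sensors, readings, grid):
    """Estimate the tip position of a magnetized device from magnetic-sensor
    magnitudes, assuming a simple 1/r^3 dipole falloff (illustrative; a real
    TLS location-finding algorithm is considerably more involved).
    """
    def predicted(pos, sensor):
        # Euclidean distance from candidate position to sensor, floored
        # to avoid division by zero at a sensor location.
        r = sum((p - s) ** 2 for p, s in zip(pos, sensor)) ** 0.5
        return 1.0 / max(r, 1e-6) ** 3

    best, best_err = None, float("inf")
    for pos in grid:  # brute-force least-squares over candidate positions
        err = sum((predicted(pos, s) - m) ** 2
                  for s, m in zip(sensors, readings))
        if err < best_err:
            best, best_err = pos, err
    return best
```

The winning position would then place the representation of the medical device within the objects of virtual anatomy on the display screen.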
  • FIG. 18 illustrates a first representation 1860 of a first medical device within an object of virtual anatomy over a patient, a second representation 1840 of a second medical device over the patient, and windows 1870 including output of the medical device-placing system 1600 in an environment beside the patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
  • FIG. 19 illustrates the first representation 1860 of the first medical device within the object of virtual anatomy over the patient, a third representation 1842 of the second medical device over the patient, and the windows 1870 including the output of the medical device-placing system 1600 in the environment beside the patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
  • FIG. 20 illustrates the first representation 1860 of the first medical device within the object of virtual anatomy above the patient, the third representation 1842 of the second medical device above the patient, and the windows 1870 including the output of the medical device-placing system 1600 in the environment beside the patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
  • FIG. 21 illustrates the first representation 1860 of the first medical device within the object of virtual anatomy in the environment away from the patient, the third representation 1842 of the second medical device in the environment away from the patient, and the windows 1870 including the output of the medical device-placing system 1600 in the environment away from the patient as seen through the display screen 512 of the alternative-reality headset 130 in accordance with some embodiments.
  • the first representation 1860 of the first medical device is a virtual medical device corresponding to the catheter 1460 as seen through the display screen 512 of the alternative-reality headset 130, wherein the virtual medical device or virtual catheter is within an object of virtual anatomy such as a virtual SVC over the patient and anchored thereto.
  • the virtual catheter within the virtual SVC represents, in real-time, a location of the catheter 1460 in an SVC of the patient.
  • the second representation 1840 of the second medical device is an outline corresponding to the TLS 1240 as seen through the display screen 512 of the alternative-reality headset 130, wherein the outline around the TLS 1240 over a sterile drape 1802 and anchored thereto indicates the TLS 1240 under the sterile drape 1802 on a chest of the patient.
  • the third representation 1842 of the second medical device is a virtual medical device corresponding to the TLS 1240, wherein the virtual medical device or virtual TLS is over the sterile drape 1802 and anchored thereto for visualization of the TLS 1240 without compromising a sterile field defined by the sterile drape 1802.
  • the first representation 1860 of the first medical device or the virtual catheter as seen through the display screen 512 of the alternative-reality headset 130 remains within the object of virtual anatomy or the virtual SVC, but both the virtual catheter and the virtual SVC are anchored to the environment above the patient or a reference frame of a wearer of the alternative-reality headset 130 and temporarily located above the patient.
  • the third representation 1842 of the second medical device or the virtual TLS as seen through the display screen 512 of the alternative-reality headset 130 is anchored to the environment above the patient or a reference frame of the wearer of the alternative-reality headset 130 and temporarily located above the patient.
  • when any object of virtual anatomy or any of the medical-device representations 1840, 1842, and 1860 is anchored to the reference frame of the wearer of the alternative-reality headset 130, the object of virtual anatomy or the representation of the medical device can be temporarily located beside the patient (see FIGS. 18-20) or at some distance away from the patient (see FIG. 21) depending upon the wearer’s point of view.
  • the first representation 1860 of the first medical device or the virtual catheter as seen through the display screen 512 of the alternative-reality headset 130 remains within the object of virtual anatomy or the virtual SVC, but both the virtual catheter and the virtual SVC are anchored to the environment some distance away from the patient or a reference frame of the wearer of the alternative-reality headset 130 and temporarily located some distance away from the patient.
  • the third representation 1842 of the second medical device or the virtual TLS as seen through the display screen 512 of the alternative-reality headset 130 is anchored to the environment some distance away from the patient or a reference frame of the wearer of the alternative-reality headset 130 and temporarily located some distance away from the patient.
  • the windows 1870 are graphical-control-element windows as seen through the display screen 512 of the alternative-reality headset 130 including output of the medical device-placing system 1600.
  • the windows 1870 include, but are not limited to, an ultrasound window 1872, wherein the output of the ultrasound window 1872 corresponding to the one or more processes of the medical device-placing system 1600 includes ultrasound-imaging frames corresponding to ultrasound imaging with the ultrasound probe 120.
  • the windows 1870 also include, but are not limited to, an ECG window 1874, wherein the output of the ECG window 1874 corresponding to the one or more processes of the medical device-placing system 1600 includes one or more ECGs corresponding to electrocardiography with the one or more ECG electrodes 1050 including the stylet 1752 having the ECG electrode.
  • Each of the one or more ECGs in the ECG window 1874 is configured for arrangement in the ECG window by the wearer of the alternative-reality headset 130.
  • Each window of the windows 1870 in FIGS. 18-20 is either anchored to the environment beside the patient or a reference frame of the wearer of the alternative-reality headset 130. In FIG. 21, each window of the windows 1870 is either anchored to the environment at some distance away from the patient or a reference frame of the wearer of the alternative-reality headset 130.
  • when a window of the windows 1870 is anchored to the reference frame of the wearer of the alternative-reality headset 130, the window can be temporarily located beside the patient (see FIGS. 18-20) or at some distance away from the patient (see FIG. 21) depending upon the wearer’s point of view.
  • the alternative-reality headset 130 can be configured to three-dimensionally anchor the objects of virtual anatomy and the representations of the medical devices anywhere in the environment such as to the patient when displaying the objects of virtual anatomy and the representations of the medical devices on the display screen 512, which is a characteristic of mixed reality.
  • the alternative-reality headset 130 can also be configured to three-dimensionally anchor graphical control elements such as the windows 1870 to the environment such as beside the patient as shown in at least FIGS. 18 and 19.
  • the alternative-reality headset 130 is not limited in its configuration to anchor the objects of virtual anatomy, the representations of the medical devices, and the graphical control elements to the environment.
  • the alternative-reality headset 130 can be configured to independently anchor any virtual object (e.g., any object of virtual anatomy, any representation of a medical device, any graphical control element, etc.) to a persistent location on the display screen 512 (e.g., always at a bottom of the display screen 512 like surgical loupes), a persistent location in a reference frame of the wearer of the alternative-reality headset 130 (e.g., always off to a side of the wearer but visible with a glance to the side and accessible within an arm’s reach), or a persistent location in the environment such as over the patient.
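The three anchoring behaviors (screen-fixed, wearer-fixed, world-fixed) amount to resolving a virtual object's position in a different coordinate frame each render pass. A minimal sketch with positions only (rotation omitted; the mode names and tuple convention are illustrative assumptions, not from the patent):

```python
def rendered_position(anchor, obj_pos, headset_pos):
    """Resolve where a virtual object appears, given its anchor mode.

    'screen': fixed display coordinates (like surgical loupes);
    'wearer': fixed offset from the headset, so it moves with the wearer;
    'environment': fixed world coordinates, independent of the wearer.
    """
    if anchor == "screen":
        return obj_pos                       # display-space coordinates
    if anchor == "wearer":
        return tuple(h + o for h, o in zip(headset_pos, obj_pos))
    if anchor == "environment":
        return obj_pos                       # world coordinates, unchanged
    raise ValueError(f"unknown anchor mode: {anchor}")
```

A wearer-anchored window, for example, stays an arm's reach off to the side no matter where the wearer walks, because its position is re-derived from the headset pose every frame.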
  • FIGS. 22A and 22B provide different views of the medical device-placing system 1200 having a TLS 2240 in accordance with some embodiments.
  • FIG. 22C provides a stylet 2246 for use with the medical device-placing system 1200 having the TLS 2240 of FIGS. 22A and 22B in accordance with some embodiments.
  • the medical device-placing system 1200 can include the ultrasound probe 120 of the anatomy-visualizing system 100, the alternative-reality headset 130, and the console 1210, which includes electronic circuitry like that of the console 110.
  • the medical device-locating system of the medical device-placing system 1200 can alternatively include the TLS 2240.
  • the TLS 2240 is different than the TLS 1240 in that the TLS 2240 includes a bedside sensor grid 2242 and a sensor datum 2244 configured to be placed upon a chest of a patient.
  • the sensor grid 2242 includes an array of magnetic sensors embedded within a housing.
  • the sensor datum 2244 includes an electromagnetic coil.
  • the TLS 2240 is configured to detect a stylet 2246 including an electromagnetic coil 2248 disposed in a lumen of the stylet 2246, wherein the electromagnetic coil of the sensor datum 2244 and the electromagnetic coil 2248 of the stylet 2246 operate at different frequencies, different amplitudes, or both different frequencies and amplitudes.
  • the sensor datum 2244 having its electromagnetic coil is configured for placement on the chest of the patient, thereby providing a sensor datum for the TLS 2240.
  • the electromagnetic coil of the sensor datum 2244 is configured to operate at a different frequency or amplitude than the electromagnetic coil 2248 of the stylet 2246.
  • the different frequency or amplitude of the electromagnetic coil of the sensor datum 2244 can be multiplexed with that of the electromagnetic coil 2248 of the stylet 2246.
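Frequency multiplexing means a single sensed stream carries both coils' signatures, and each coil's contribution can be recovered by synchronous (lock-in) detection at its drive frequency. The sketch below is an illustrative assumption about how such demultiplexing could work, not the patent's method; the frequencies and amplitudes are arbitrary.

```python
import math

def coil_amplitudes(samples, fs_hz, freqs_hz):
    """Separate contributions of coils driven at different frequencies by
    synchronous (lock-in) detection: correlate the sampled signal with
    sine/cosine references at each coil's frequency and take the magnitude.
    """
    n = len(samples)
    amps = {}
    for f in freqs_hz:
        i_sum = sum(s * math.cos(2 * math.pi * f * k / fs_hz)
                    for k, s in enumerate(samples))
        q_sum = sum(s * math.sin(2 * math.pi * f * k / fs_hz)
                    for k, s in enumerate(samples))
        amps[f] = 2.0 * math.hypot(i_sum, q_sum) / n  # recovered amplitude
    return amps
```

With the datum coil and the stylet coil driven at distinct frequencies, one sensor reading thus yields a separate amplitude per coil, which is what lets both operate simultaneously.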
  • the medical device-placing system 1200 including the TLS 2240 is configured to operate with the sensor grid 2242 at a side of a bed in which the patient lies using the electromagnetic coil 2248 of the stylet 2246 as a transmitter coil.
  • the reverse of the foregoing configuration is also possible when the array of magnetic sensors of the sensor grid 2242 is an array of electromagnetic coils configured to function as transmitter coils and the electromagnetic coil 2248 of the stylet 2246 is a magnetic sensor.
  • the relative position of the electromagnetic coil 2248 of the stylet 2246 to the sensor datum 2244 can be tracked and displayed on the display screen 512 of the alternative-reality headset 130 in various places over the patient in 3-dimensional space.
  • Such tracking and displaying allows clinicians to map a venous route and overcome medical-device placement obstacles such as obstructions (e.g., lesions), incorrectly followed routes in a patient’s vasculature, heart valves, etc.
  • a depth measured from the sensor datum 2244 can provide real-time information relative to medical-device placement such as azygos vein placement or inferior vena cava placement. Arterial placements are also possible with the medical device-placing system 1200 including the TLS 2240.
  • a medical device with a small electromagnetic coil, such as the stylet 2246, can be tracked in neonatal or pediatric patients, as well as patients with a neck brace.
  • multiple electromagnetic coils, such as an array of electromagnetic coils in the alternative sensor grid 2242, can be used to track a tip of a medical device such as the stylet 2246 on tortuous paths.
  • Methods of the medical device-placing system 300, 1200, or 1600 incorporate methods of at least the anatomy-visualizing system 100 and the medical device-locating system 200, or the like, which methods are discernable by references to the anatomy-visualizing system 100, the medical device-locating system 200, or the components thereof (e.g., the ultrasound probe 120, the medical-device detector 240, the TLS 1240, etc.) below.
  • Methods of the medical device-placing system 300 or 1200 include emitting ultrasound signals into a patient (e.g., a limb of the patient) and receiving echoed ultrasound signals from the patient by way of the piezoelectric sensor array 438 of the ultrasound probe 120; transforming the echoed ultrasound signals with the console 310 or 1210 having the electronic circuitry including the memory 1012 or 1312, the one or more algorithms 1016 or 1316, and the one or more processors 1014 or 1314 to produce ultrasound-image segments corresponding to anatomical structures of the patient; inserting a magnetized medical device into the patient (e.g., the limb of the patient) and transforming magnetic-sensor signals from the array of magnetic sensors 242 embedded within the housing 810 or 910 placed about the patient or the one or more magnetic sensors 1242 disposed within the housing of the TLS 1240 placed on a chest of the patient with the one or more algorithms 1016 or 1316 (e.g., a location-finding algorithm) of the console 310 or 1210 into location information for the medical device within the patient; and displaying on the display screen 512 of the alternative-reality headset 130 a representation of the medical device in accordance with the location information within objects of virtual anatomy corresponding to the ultrasound-image segments.
  • Ultrasound imaging to produce the objects of virtual anatomy can be done at any time before inserting the medical device into the patient, and the objects of virtual anatomy can be stored for later use in the memory 1012 or 1312 of the console 310 or 1210 or a storage medium connected to a port of the console 310 or 1210.
  • the method can further include capturing in the memory 1012 or 1312 of the console 310 or 1210 ultrasound-imaging frames in accordance with the pulsed-wave Doppler imaging mode of the ultrasound probe 120 while emitting and receiving the ultrasound signals; stitching the ultrasound-imaging frames together with the stitching algorithm of the one or more algorithms 1016 or 1316; and segmenting the ultrasound-imaging frames or the stitched ultrasound-imaging frames into the ultrasound-image segments with the image segmentation algorithm of the one or more algorithms 1016 or 1316.
  • the method can further include transforming the ultrasound-image segments into the objects of virtual anatomy with the virtualization algorithm of the one or more algorithms 1016 or 1316; and sending both the representation of the medical device and the objects of virtual anatomy to the alternative-reality headset 130 for display over the patient.
  • the method can further include anchoring the representation of the medical device and the objects of virtual anatomy to the patient over which the virtual medical device and the objects of virtual anatomy are displayed.
  • the method can further include capturing in the memory 1012 or 1312 of the console 310 or 1210 eye movements of the wearer using the one or more eye-tracking cameras 522 coupled to the frame 516 of the alternative-reality headset 130; and processing the eye movements with the eye-movement algorithm of the one or more algorithms 528 to identify a focus of the wearer for selecting or enhancing the objects of virtual anatomy corresponding to the focus of the wearer.
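Identifying the wearer's focus from eye movements reduces, at its simplest, to mapping a gaze point to the nearest displayed object. The snippet below is a toy stand-in for the eye-movement algorithm; the object names and 2D screen coordinates are assumptions for illustration.

```python
def focus_object(gaze_point, objects):
    """Pick the object of virtual anatomy nearest the wearer's gaze point.

    `objects` maps a name to its on-screen (x, y) center; the names here
    are hypothetical, not taken from the source.
    """
    def dist2(p, q):
        # squared distance is enough for choosing the minimum
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(objects, key=lambda name: dist2(gaze_point, objects[name]))
```

The selected object could then be highlighted or enlarged, matching the "selecting or enhancing" behavior described above.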
  • the method can further include capturing in the memory 1012 or 1312 of the console 310 or 1210 gestures of the wearer using one or more patient-facing cameras 524 coupled to the frame 516 of the alternative-reality headset 130; and processing the gestures with the gesture-command algorithm of the one or more algorithms 528 to identify gesture- based commands issued by the wearer for execution thereof by the alternative-reality headset 130.
  • the method can further include capturing in the memory 1012 or 1312 of the console 310 or 1210 audio of the wearer using the one or more microphones 526 coupled to the frame 516 of the alternative-reality headset 130; and processing the audio with the audio command algorithm of the one or more algorithms 528 to identify audio-based commands issued by the wearer for execution thereof by the alternative-reality headset 130.
  • the method can further include generating a magnetic field with the magnetic-field generator 740; and determining a spatial relationship of each magnetic sensor of the array of magnetic sensors 242 to another magnetic sensor from the magnetic-sensor signals produced by the array of magnetic sensors 242 while in the presence of the generated magnetic field. Determining the spatial relationship of each magnetic sensor to another magnetic sensor in the array of magnetic sensors 242 is important when the array of magnetic sensors 242 is embedded within the housing 910 (e.g., a drape), for the magnetic sensors have a variable spatial relationship to each other depending upon how the housing 910 is placed about the limb of the patient.
  • Methods of the medical device-placing system 1600 include emitting ultrasound signals into a patient (e.g., a limb of the patient) and receiving echoed ultrasound signals from the patient by way of a piezoelectric sensor array 438 of an ultrasound probe 120; transforming the echoed ultrasound signals with electronic circuitry in the frame 516 of the alternative-reality headset 130 including the memory 518 and the one or more processors 520 to produce ultrasound-image segments corresponding to anatomical structures of the patient; transforming magnetic-sensor signals from one or more magnetic sensors 1242 disposed within the housing of the TLS 1240 placed on a chest of the patient with the alternative-reality headset 130 into location information for a magnetized medical device within the patient; and displaying over an environment including the patient on the see-through display screen 512 of the alternative-reality headset 130 for a wearer thereof a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments, as well as one or more graphical-control-element windows 1870 including output of one or more processes of the medical device-placing system 1600.
  • the method further includes capturing in the memory 518 of the alternative-reality headset 130 eye movements of the wearer using the one or more eye-tracking cameras 522 coupled to the frame 516 of the alternative-reality headset 130; and processing the eye movements with an eye-movement algorithm to identify a focus of the wearer for selecting or enhancing the virtual medical device, the objects of virtual anatomy, the one or more windows 1870, or the output in the one or more windows 1870 corresponding to the focus of the wearer.
  • the method further includes capturing in the memory 518 of the alternative- reality headset 130 gestures of the wearer using the one or more patient-facing cameras 524 coupled to the frame 516 of the alternative-reality headset 130; and processing the gestures with a gesture-command algorithm to identify gesture-based commands issued by the wearer for execution thereof by the alternative-reality headset 130.
  • the method further includes enabling the wearer to anchor the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows 1870 to a persistent location on the display screen 512, a persistent location in a reference frame of the wearer of the alternative-reality headset 130, or a persistent location in the environment.
  • the method further includes enabling the wearer to transform the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows 1870 over the environment by way of translating, rotating, or resizing the virtual medical device, any object of the objects of virtual anatomy, or any window of the one or more windows 1870.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Cardiology (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Robotics (AREA)
  • Vascular Medicine (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A medical device-placing system including a medical device tip-location sensor ("TLS") configured to be placed on a chest of a patient, an ultrasound probe, a console, and an alternative-reality headset. The ultrasound probe can be configured to emit ultrasound signals into the patient and receive echoed ultrasound signals from the patient. The console can be configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient, as well as to transform TLS signals from the TLS into location information for a medical device within the patient. The alternative-reality headset can include a display screen through which a wearer of the alternative-reality headset can see the patient. The display screen can be configured to display over the patient a virtual medical device in accordance with the location information for the medical device within objects of virtual anatomy corresponding to the ultrasound-image segments.
EP19815710.9A 2018-06-04 2019-06-03 Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices Pending EP3801245A4 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862680299P 2018-06-04 2018-06-04
US16/209,601 US20190167148A1 (en) 2017-12-04 2018-12-04 Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US16/370,353 US20190223757A1 (en) 2017-12-04 2019-03-29 Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
PCT/US2019/035271 WO2019236505A1 (fr) 2018-06-04 2019-06-03 Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices

Publications (2)

Publication Number Publication Date
EP3801245A1 (fr) 2021-04-14
EP3801245A4 EP3801245A4 (fr) 2022-03-02

Family

ID=68770987

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19815710.9A 2018-06-04 2019-06-03 Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices Pending EP3801245A4 (fr)

Country Status (2)

Country Link
EP (1) EP3801245A4 (fr)
WO (1) WO2019236505A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240164852A1 (en) * 2021-03-26 2024-05-23 C. R. Bard, Inc. Medical Device Projection System
US20240139432A1 (en) * 2022-10-27 2024-05-02 Becton, Dickinson And Company Vascular access system and method for continuous ultrasound monitoring and integrated sensor array
WO2024092340A1 (fr) * 2022-11-04 2024-05-10 Deep Breathe Inc. System and method for pneumothorax detection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1504713A1 * 2003-07-14 2005-02-09 Surgical Navigation Technologies, Inc. Navigation system for cardiac therapies
WO2016133644A1 * 2015-02-20 2016-08-25 Covidien Lp Operating room and surgical site awareness
US20180144550A1 (en) * 2016-11-23 2018-05-24 Simbionix Ltd. System and method for rendering complex data in a virtual reality or augmented reality environment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7835785B2 (en) * 2005-10-04 2010-11-16 Ascension Technology Corporation DC magnetic-based position and orientation monitoring system for tracking medical instruments
US9636031B2 (en) * 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
WO2016004302A1 * 2014-07-02 2016-01-07 Covidien Lp CT alignment
US10013808B2 (en) * 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
EP3720349A4 * 2017-12-04 2021-01-20 Bard Access Systems, Inc. Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2019236505A1 *

Also Published As

Publication number Publication date
CN112236077A (zh) 2021-01-15
EP3801245A4 (fr) 2022-03-02
WO2019236505A1 (fr) 2019-12-12

Similar Documents

Publication Publication Date Title
US20190307419A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US20190167148A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US10952795B2 (en) System and method for glass state view in real-time three-dimensional (3D) cardiac imaging
US10433761B2 (en) Methods for localizing medical instruments during cardiovascular medical procedures
EP1421913B1 Visual catheter navigation system for cardiac surgery
AU2014265090B2 (en) Tracking of catheter from insertion point to heart using impedance measurements
US10163204B2 (en) Tracking-based 3D model enhancement
US8401616B2 (en) Navigation system for cardiac therapies
EP2203124B1 Navigation system for cardiac therapies using image gating
JP2021505226A (ja) Systems and methods for supporting visualization during a procedure
WO2019236505A1 (fr) Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices
US20190223757A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
CN112236077B (zh) Systems and methods for visualizing anatomy, locating, or placing medical devices
US20240050061A1 (en) Spatially Aware Medical Device Configured for Performance of Insertion Pathway Approximation
US20230218272A1 (en) Controlling and visualizing rotation and deflection of a 4d ultrasound catheter having multiple shafts
WO2024157104A1 (fr) Tissue property localization for therapy delivery
WO2020106664A1 (fr) System and method for volumetric display of anatomy with periodic motion

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201203

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: A61B0005060000

Ipc: A61B0034000000

A4 Supplementary search report drawn up and despatched

Effective date: 20220128

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 34/20 20160101ALN20220124BHEP

Ipc: A61B 5/05 20210101ALN20220124BHEP

Ipc: A61B 90/50 20160101ALN20220124BHEP

Ipc: A61B 90/30 20160101ALN20220124BHEP

Ipc: A61B 17/00 20060101ALN20220124BHEP

Ipc: A61B 8/00 20060101ALI20220124BHEP

Ipc: A61B 5/06 20060101ALI20220124BHEP

Ipc: A61B 5/00 20060101ALI20220124BHEP

Ipc: A61B 90/00 20160101ALI20220124BHEP

Ipc: G06T 19/00 20110101ALI20220124BHEP

Ipc: A61B 8/08 20060101ALI20220124BHEP

Ipc: A61B 5/02 20060101ALI20220124BHEP

Ipc: A61B 34/00 20160101AFI20220124BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240410