WO2016176452A1 - In-device fusion of optical and inertial positional tracking of ultrasound probes - Google Patents

In-device fusion of optical and inertial positional tracking of ultrasound probes

Info

Publication number
WO2016176452A1
Authority
WO
WIPO (PCT)
Prior art keywords
processor
image data
subject
ultrasonic transducers
sensors
Prior art date
Application number
PCT/US2016/029784
Other languages
English (en)
Inventor
Ricardo Paulo DOS SANTOS MENDONCA
Patrik Nils Lundqvist
Rashid Ahmed Akbar Attar
Rajeev Jain
Padmapriya JAGANNATHAN
Original Assignee
Qualcomm Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/140,001 (published as US20160317122A1)
Application filed by Qualcomm Incorporated
Priority to EP16720692.9A (EP3288465B1)
Priority to CN201680024340.5A (CN108601578B)
Publication of WO2016176452A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: Determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254: Determining the position of the probe using sensors mounted on the probe
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4427: Device being portable or laptop-like
    • A61B 8/4477: Using several separate ultrasound transducers or probes
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis
    • A61B 8/5238: Combining image data of a patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5253: Combining overlapping images, e.g. spatial compounding
    • A61B 8/5276: Detection or reduction of artifacts due to motion

Definitions

  • This disclosure relates to an ultrasonography apparatus, and more particularly to techniques for improving the operability and functionality of the ultrasonography apparatus.
  • The ultrasonic imaging probe is a simple hand-held device that emits and receives acoustic signals.
  • The device is connected by an electrical cable to a console or rack of equipment that provides control signals and power to the probe, and that processes the acoustic signal data received by the probe and forwarded to the console to produce viewable images of an anatomical feature of interest.
  • One innovative aspect of the subject matter described in this disclosure relates to an apparatus for ultrasonography that includes one or more ultrasonic transducers, one or more inertial sensors, one or more optical sensors, and a processor communicatively coupled with the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • The processor is capable of estimating a position of the apparatus based on a combination of signals received from the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • Estimating the position of the apparatus may include processing ultrasound image data from the one or more ultrasonic transducers and determining the position based on the processed ultrasound image data.
  • The ultrasound image data may include a series of 2-D image frames, and the processed ultrasound image data may include a 3-D image.
  • The processor may be configured to adjust at least one of the 2-D image frames in view of the determined position at the time of obtaining the at least one of the 2-D images.
  • The ultrasound image data may include a series of 2-D image frames, and the processed ultrasound image data may include a 3-D image of a first volume.
  • The processor may be configured to determine, with regard to at least one of the 2-D image frames, whether the at least one of the 2-D image frames relates to the first volume or to a different volume.
  • The optical sensor may be optically coupled with one or more optical wireless communication (OWC) emitters of an indoor positioning system.
  • The processor may be configured to correct drift error accumulation of the inertial sensors using the combination of signals.
  • The processor may be configured to process image data acquired by one or both of the optical sensors and the ultrasonic transducers so as to select a plurality of landmarks.
  • The landmarks may include one or both of: (i) one or more points, edges or corners of ordinary surfaces, fixtures or objects of a room in which the apparatus is to be used to examine a subject; and (ii) one or more anatomical features of the subject, the anatomical features being selected from the group consisting of tissue surfaces, tissue boundaries and image texture of ordinary anatomical or pathological structures of the subject.
  • The processor may be configured to calculate the position of the apparatus with respect to the landmarks.
  • The processor may be configured to calculate a location of the subject or an anatomical feature of the subject.
  • The processor may be configured to fuse the combination of signals received from the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • The processor may be configured to process ultrasound image data from the ultrasonic transducer and make a determination of the position of the apparatus from the processed ultrasound image data. In some examples, the processor may be configured to use the determination to provide, to an operator of the apparatus, one or more of: navigational guidance for movement of the imaging probe, or notifications based on the determined position.
  • A method for ultrasonography includes collecting image data of an environment in which an ultrasonography apparatus is to be operated.
  • The ultrasonography apparatus includes one or more ultrasonic transducers, one or more inertial sensors, one or more optical sensors and a processor communicatively coupled with the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors, the ultrasonography apparatus being configured to perform noninvasive medical ultrasonography.
  • The method includes estimating, with the processor, a position of the apparatus using a combination of signals received from the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • The method includes fusing, with the processor, the combination of signals using one or more of visual inertial odometry (VIO) techniques, simultaneous localization and mapping (SLAM) techniques, image registration techniques, or any combination thereof.
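As one concrete illustration of the image-registration ingredient named above: phase correlation can estimate the in-plane translation between two consecutive 2-D frames. The patent does not prescribe this particular algorithm; the NumPy-only sketch below (function name and implementation are our own) is offered under that assumption.

```python
import numpy as np

def frame_shift(prev_frame: np.ndarray, curr_frame: np.ndarray):
    """Phase correlation: estimate (dy, dx) such that curr_frame is
    approximately prev_frame translated by (dy, dx)."""
    f_prev = np.fft.fft2(prev_frame)
    f_curr = np.fft.fft2(curr_frame)
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross = f_curr * np.conj(f_prev)
    cross /= np.abs(cross) + 1e-12  # avoid division by zero
    corr = np.fft.ifft2(cross).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    # Indices past the midpoint wrap around to negative shifts.
    shape = np.array(prev_frame.shape, dtype=float)
    peak[peak > shape / 2] -= shape[peak > shape / 2]
    return tuple(peak)

# Quick self-check with a synthetic pair of frames:
rng = np.random.default_rng(0)
prev = rng.random((128, 128))
curr = np.roll(prev, shift=(3, -5), axis=(0, 1))  # simulated probe motion
print(frame_shift(prev, curr))  # -> (3.0, -5.0)
```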
  • The image data may include outputs from one or both of the optical sensors and the ultrasonic transducers.
  • The processor may be configured to process the image data so as to select a plurality of landmarks.
  • The landmarks may include one or both of: (i) one or more points, edges or corners of ordinary surfaces, fixtures or objects of a room in which the apparatus is to be used to examine a subject; and (ii) one or more anatomical features of the subject, the anatomical features being selected from the group consisting of tissue surfaces, tissue boundaries and image texture of ordinary anatomical or pathological structures of the subject.
  • The processor may be configured to determine the position of the ultrasonic transducer with respect to the landmarks.
  • The method may include using the determined position to provide, to an operator of the apparatus, navigational guidance for movement of the imaging probe.
  • The software includes instructions for ultrasonography, the instructions causing an apparatus to (i) collect image data of an environment in which an ultrasonography apparatus is to be operated, the ultrasonography apparatus including one or more ultrasonic transducers, one or more inertial sensors, one or more optical sensors and a processor communicatively coupled with the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors, the ultrasonography apparatus being configured to perform noninvasive medical ultrasonography; and (ii) estimate, with the processor, a spatial position of the apparatus using a combination of signals received from the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • Figure 1 illustrates a hand-held ultrasonic imaging probe, according to an implementation.
  • Figure 2 illustrates an example of an environment in which the hand-held ultrasonic imaging probe may be operated, according to an implementation.
  • Figure 3 illustrates an example of a method for estimating a position of an ultrasonography apparatus, according to an implementation.
  • Figure 4 illustrates an example of a method for calibrating an inertial sensor of an ultrasonic imaging probe, according to another implementation.
  • Figure 5 illustrates an example of a data flow diagram according to an implementation.
  • Figure 6 illustrates an example of an environment in which the hand-held ultrasonic imaging probe may be operated, according to another implementation.
  • The systems, methods and devices of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
  • One innovative aspect of the subject matter described in this disclosure can be implemented in a portable ultrasonic imaging probe for medical ultrasonography.
  • The portable ultrasonic imaging probe may be hand-held.
  • The portable ultrasonic imaging probe may be included in or attached to an apparatus such as a robot, or may be or include a wearable device.
  • A sleeve, wearable by a human or robotic operator and/or by a patient or other subject of examination (hereinafter, "subject"), may contain one or more ultrasonic transducers, one or more inertial sensors, and/or one or more optical sensors.
  • The wearable device may contain one or more ultrasonic transducers communicatively coupled to a processor by way of a wired or wireless interface.
  • The processor may also be communicatively coupled to one or more inertial sensors of the wearable device and/or one or more optical sensors disposed within an environment in which the wearable device is operated.
  • The optical sensors may be configured to capture image data of the wearable device and provide it to the processor, which can use the image data to determine a location of the wearable device.
  • The ultrasonic transducers of the wearable device may capture ultrasound data and send it to the processor, which uses the data to generate an ultrasound volume and also to determine a precise location of the wearable device relative to the subject's body.
  • FIG. 1 illustrates a hand-held ultrasonic imaging probe, according to an implementation.
  • The apparatus 100 includes an ultrasonic transducer 110, an inertial sensor 120, an optical sensor 130 and a processor 140 communicatively coupled with the ultrasonic transducer 110, the inertial sensor 120, and the optical sensor 130.
  • The processor 140 may be configured to calibrate the inertial sensor 120 using outputs from the optical sensor 130.
  • The processor 140 may be configured to correct for accumulated drift errors of the inertial sensor 120.
  • The hand-held ultrasonic imaging probe may be configured to make a real-time determination of its spatial position with respect to an arbitrary coordinate system using a combination of optical and inertial sensors.
  • The terms "spatial position" and "position" refer to a spatial location (e.g., in terms of X, Y and Z coordinates) in combination with an angular orientation (e.g., roll, pitch and yaw angles), and may be referred to as a six degree of freedom (6-DoF) spatial position.
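To make the convention concrete, here is a small sketch of a 6-DoF pose object and its 4 x 4 homogeneous transform. This is our own illustration, not code from the patent; the Z-Y-X (yaw-pitch-roll) Euler ordering and radian units are assumptions.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """6-DoF spatial position: X/Y/Z location plus roll/pitch/yaw angles."""
    x: float
    y: float
    z: float
    roll: float   # rotation about X, radians
    pitch: float  # rotation about Y, radians
    yaw: float    # rotation about Z, radians

    def matrix(self) -> np.ndarray:
        """4x4 homogeneous transform, Z-Y-X (yaw-pitch-roll) convention."""
        cr, sr = np.cos(self.roll), np.sin(self.roll)
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cy, sy = np.cos(self.yaw), np.sin(self.yaw)
        rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
        ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
        rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
        t = np.eye(4)
        t[:3, :3] = rz @ ry @ rx  # compose rotations: yaw, then pitch, then roll
        t[:3, 3] = (self.x, self.y, self.z)
        return t
```

The matrix() form is what the volume-compounding sketch later in this document consumes.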
  • The term "optical sensor" refers to a device configured to optically detect visible, infrared and/or ultraviolet light and/or images thereof, and includes any kind of camera or photodetector.
  • Figure 2 illustrates an example of an environment in which the hand-held ultrasonic imaging probe may be operated, according to an implementation.
  • The apparatus 100 includes the processor 140 communicatively coupled with one or more optical sensors 130.
  • The processor 140 may be configured to collect image data of an environment (for example, an examining room) in which a subject is to be examined using the apparatus.
  • The processor 140 may be configured to process the acquired environmental image data so as to select a plurality of fixed "landmarks" in the vicinity of the probe.
  • These landmarks may include visually well-defined points, edges or corners of surfaces, fixtures, and/or objects of an ordinary room in which an operator wishes to perform an ultrasonic exam, such as corners 201a, 201b, 201c and 201d.
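The patent does not name a feature detector for picking such landmarks. A common choice is Shi-Tomasi corner detection, sketched below with OpenCV; the function name and parameter values are illustrative assumptions. Given surveyed 3-D positions for the selected landmarks, the probe's pose with respect to them could then be recovered with a perspective-n-point solver such as cv2.solvePnP.

```python
import cv2
import numpy as np

def select_landmarks(gray_image: np.ndarray, max_landmarks: int = 100) -> np.ndarray:
    """Pick visually well-defined corners to serve as fixed landmarks.

    gray_image: single-channel (grayscale) image of the room.
    Returns an (N, 2) array of (x, y) pixel coordinates, N <= max_landmarks.
    """
    corners = cv2.goodFeaturesToTrack(
        gray_image,
        maxCorners=max_landmarks,
        qualityLevel=0.01,  # keep corners within 1% of the strongest response
        minDistance=10,     # enforce spatial spread across the image
    )
    if corners is None:     # featureless scene: no landmarks found
        return np.empty((0, 2))
    return corners.reshape(-1, 2)
```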
  • The processor may be configured to calculate, in real time, the probe's X, Y and Z location as well as the probe's pitch, yaw, and roll orientation with respect to these landmarks. Moreover, the processor may be configured to calculate, in real time, the location of the subject or an anatomical feature of the subject.
  • As indicated above, the processor 140 may also be communicatively coupled with at least one inertial sensor 120.
  • The inertial sensor 120 may be configured to measure translational and rotational motion of the apparatus 100.
  • The inertial sensor 120 may be configured as or include an accelerometer, a gyroscope, a MEMS inertial sensor, etc.
  • The processor may be configured to estimate, in real time, the probe's spatial position notwithstanding that some or all of the landmarks 201 may be, from time to time, obscured from view of the optical sensors, and notwithstanding normal inertial sensor drift error accumulation.
  • The combination of optical sensor data and inertial sensor data will enable a reasonably accurate estimation of the probe's spatial position.
  • The estimation of the probe's position may be based on a combination of data from the inertial sensors and the optical sensors.
  • The estimation may be based on a prior position fix determined via the optical sensors, updated with current data from the inertial sensors; a sketch of this idea follows.
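Read literally, this update scheme is dead reckoning: integrate the inertial measurements forward from the last optical fix. The sketch below is our own simplification; it assumes gravity-compensated accelerations already expressed in the fixed frame and ignores orientation. It also shows why the optical fixes matter: acceleration errors are integrated twice, so drift grows quadratically between fixes.

```python
import numpy as np

def propagate_position(last_fix_pos, last_fix_vel, accel_samples, dt):
    """Dead-reckon from the most recent optical position fix.

    last_fix_pos, last_fix_vel: 3-vectors at the time of the optical fix.
    accel_samples: iterable of gravity-compensated 3-axis accelerations
    (m/s^2), sampled every dt seconds since that fix.
    """
    pos = np.array(last_fix_pos, dtype=float)
    vel = np.array(last_fix_vel, dtype=float)
    for a in accel_samples:
        vel += np.asarray(a, dtype=float) * dt  # first integration
        pos += vel * dt                         # second integration
    return pos, vel  # trusted only until the next optical fix
```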
  • The processor 140 may be configured to receive data inputs from the inertial sensor 120 and the optical sensor 130 and/or the ultrasonic transducer, and to use the received data inputs to determine the spatial position of the apparatus 100.
  • The processor may be configured to estimate a 6-DoF spatial position of the apparatus using a combination of outputs from two or more of the ultrasonic transducer 110, the inertial sensor 120 and the optical sensor 130.
  • The processor may be configured to correct drift error accumulation of the inertial sensor 120 using the combination of outputs.
  • The processor 140 may be further configured to process ultrasound image data from the ultrasonic transducer 110, using the determined spatial position of the apparatus 100.
  • A series of sequential 2-D image frames may be collated to form a 3-D image, after appropriate adjustment of each 2-D image in view of the respective spatial position of the apparatus 100 at the time of obtaining each respective 2-D image.
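A minimal sketch of that collation step, under assumptions the patent does not state (each frame is a planar slice with a known pixel size, each pose maps probe coordinates into volume coordinates, and nearest-voxel splatting with averaging is acceptable); the names and units are ours.

```python
import numpy as np

def accumulate_frame(volume, counts, frame, pose_matrix, pixel_size, voxel_size):
    """Splat one pose-adjusted 2-D frame into a running 3-D average.

    volume, counts: 3-D arrays of equal shape holding summed intensities and
    per-voxel sample counts. frame: 2-D ultrasound image. pose_matrix: 4x4
    probe-to-volume transform at the instant the frame was captured.
    """
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # The image plane in probe coordinates: x lateral, y depth, z = 0.
    pts = np.stack([xs.ravel() * pixel_size,
                    ys.ravel() * pixel_size,
                    np.zeros(h * w),
                    np.ones(h * w)])
    vox = (pose_matrix @ pts)[:3] / voxel_size   # to fractional voxel indices
    idx = np.rint(vox).astype(int)               # nearest-voxel splatting
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)[:, None]), axis=0)
    i, j, k = idx[:, inside]
    np.add.at(volume, (i, j, k), frame.ravel()[inside])  # collision-safe sums
    np.add.at(counts, (i, j, k), 1)
    # The compounded 3-D image is volume / np.maximum(counts, 1).
```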
  • The processor may be configured to process image data acquired by one or both of the optical sensor and the ultrasonic transducer so as to select a plurality of landmarks.
  • The landmarks may include points, edges or corners of ordinary surfaces, fixtures, and/or objects of a room in which the apparatus is to be used to examine a subject.
  • The landmarks may include one or more anatomical features of the subject, the anatomical features including one or more of tissue surfaces, tissue boundaries or image texture of ordinary anatomical or pathological structures of the subject.
  • The apparatus may also include one or more optical sensors that are directed towards the subject. Signals from these optical sensors may better allow the apparatus to track its position relative to the subject's body.
  • The apparatus may include one or more optical sensors directed towards the environment in which the apparatus is located, and one or more optical sensors directed towards the subject. This may better allow the apparatus to determine its position relative to the environment and also relative to the subject's body. As a result, even if the subject moves, ultrasound volume generation may be substantially unimpaired, because the apparatus is aware of its location with respect to the environment as well as with respect to the subject. Otherwise, if the subject moved and the apparatus knew only its position relative to the environment, the apparatus might inadvertently add ultrasound data to an incorrect ultrasound volume.
  • As a result, outputs of an ultrasonic scan performed by the probe may be processed, in light of the determined spatial position of the probe, to determine the relative position, in three-dimensional space, of each of a sequence of 2-D images.
  • The processor 140 may be configured to use the determined spatial position to provide, to an operator of the apparatus, navigational guidance for movement of the hand-held ultrasonic imaging probe.
  • Optical-only systems demand that a large number (often hundreds) of visually conspicuous features (such as points, corners, colored patches, or markers) be visible in the environment, and that such features can be reliably matched between subsequent frames.
  • Inertial sensors are operable in the absence of any external visual reference, but they quickly lose absolute accuracy as the tracked device moves.
  • Inertial sensors provide good relative positional accuracy over short periods of time during which landmarks may be obscured from the field of view of the optical sensors. This knowledge is used to accurately estimate, substantially continuously, the spatial position of the camera during an ultrasound scan. As a result, the need for a large number of specially configured conspicuous visual features in the environment of the ultrasound scan can be eliminated. Consequently, the ultrasonic imaging probe may be used to obtain real-time 3-D images even in environments that have not been equipped for ultrasound imaging. For example, the present disclosure contemplates that the ultrasonic imaging probe may be used in an ordinary room in which a subject may be examined, such as a doctor's office, an emergency room, or a subject's home.
  • The presently disclosed techniques bring many benefits to medical diagnosis and to the user experience of the ultrasound operator and subject.
  • The techniques enable production of accurate three-dimensional models of a subject's anatomy and pathological structures without the use of external devices and room preparation.
  • Field application of ultrasonic imaging outside a clinical setting may thereby be enabled.
  • Such 3-D models may be used in real time for a more accurate subject diagnosis or assessment, and may also be stored for future comparison against new two- or three-dimensional data.
  • Obtained ultrasound images may be overlaid on an optical image of the subject with the appropriate anatomical alignment.
  • Such an overlay may be displayed on a separate screen, or transmitted wirelessly or otherwise to a head-mounted display (HMD), which would overlay the ultrasound image against a live image of the subject.
  • A position of the HMD relative to the probe may be obtained, and images displayed by the HMD may be adjusted based on the HMD's position relative to the probe's position.
  • The HMD may include optical and/or inertial sensors from which its 6-DoF spatial position may be obtained. Based on the obtained 6-DoF spatial position, images displayed by the HMD may be changed accordingly. For example, as an operator wearing the HMD moves around a subject's body, displayed images of the ultrasonic volume may be observed from multiple angles.
  • The probe device may be a wearable sleeve with multiple ultrasonic transducers and optical and/or inertial sensors, communicatively coupled with the HMD, enabling an operator wearing the HMD to obtain a rich, three-dimensional view of a subject's anatomy or pathological structure.
  • The multiple ultrasonic transducers and optical and/or inertial sensors may be calibrated to determine, for example, their proximity to one another prior to and/or during examination of the subject.
  • Navigational guidance for moving the probe may be provided, with the objective of aiding the ultrasound operator in the task of placing the probe for optimal image acquisition. This may enable the use of ultrasound imaging by operators with less experience and training, thereby facilitating the adoption of ultrasound imaging technology.
  • The integration of optical with inertial measurements may include use of an extended Kalman filter (EKF), which would optimally combine measurements from each type of sensor into an overall coherent estimation of the probe's position and orientation.
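The patent names the EKF but gives no equations. The sketch below is a deliberately reduced, loosely-coupled position/velocity filter of our own design: IMU accelerations drive the prediction, and each optical fix corrects accumulated drift. Orientation and bias states are omitted, and with this linear model the EKF reduces to an ordinary Kalman filter; the noise values are placeholders, not figures from the patent.

```python
import numpy as np

class ProbePositionFilter:
    """Kalman filter over state [x, y, z, vx, vy, vz]."""

    def __init__(self, accel_noise=0.5, fix_noise=0.01):
        self.x = np.zeros(6)                  # state estimate
        self.P = np.eye(6)                    # state covariance
        self.accel_noise = accel_noise        # IMU noise, m/s^2 (assumed)
        self.R = np.eye(3) * fix_noise ** 2   # optical fix covariance (assumed)

    def predict(self, accel, dt):
        """Propagate with a constant-acceleration model over one IMU step."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        B = np.vstack([np.eye(3) * 0.5 * dt ** 2, np.eye(3) * dt])
        self.x = F @ self.x + B @ np.asarray(accel, dtype=float)
        self.P = F @ self.P @ F.T + B @ B.T * self.accel_noise ** 2

    def update_optical(self, fix):
        """Correct the state with a 3-D optical position fix."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = np.asarray(fix, dtype=float) - H @ self.x  # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```

In use, predict() would run at the IMU rate and update_optical() whenever a landmark-based fix arrives; between fixes the covariance P grows, mirroring the drift discussion above.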
  • Figure 3 illustrates an example of a method for estimating a position of an ultrasonography apparatus.
  • The ultrasonography apparatus may include one or more ultrasonic transducers, one or more inertial sensors, one or more optical sensors and a processor communicatively coupled with the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • Method 300 includes a block 310 for collecting, with the optical sensor and/or the ultrasonic transducer, image data of an environment in which the ultrasonic imaging probe is to be operated.
  • The method proceeds, at block 320, with estimating, using the processor, a position of the apparatus using a combination of signals received from the one or more ultrasonic transducers, the one or more inertial sensors and the one or more optical sensors.
  • The processor may use outputs from the optical sensor and/or the ultrasonic transducer to correct for accumulated drift errors of the inertial sensor.
  • Figure 4 illustrates an example of a method for calibrating an inertial sensor of a hand-held ultrasonic imaging probe, according to an implementation.
  • The imaging probe may include an ultrasonic transducer, an inertial sensor, an optical sensor and a processor communicatively coupled with the ultrasonic transducer, the inertial sensor and the optical sensor.
  • Method 400 includes a block 410 for collecting, with one or both of the optical sensor and the ultrasonic transducer, image data of an environment in which the ultrasonic imaging probe is to be operated.
  • The method proceeds, at block 420, with calibrating, using the processor, the inertial sensor using outputs from the optical sensor and/or the ultrasonic transducer.
  • The processor may use the outputs to correct for accumulated drift errors of the inertial sensor.
  • The method 400 may proceed at block 430 with combining, with the processor, outputs from the inertial sensor and from one or both of the optical sensor and the ultrasonic transducer.
  • The method 400 may proceed at block 440 with determining, with the processor, the spatial position of the ultrasound transducer using the combined outputs obtained at block 430.
  • The method 400 may proceed at block 450 with using the spatial position, determined at block 440, to provide navigational guidance for movement of the ultrasonic imaging probe. Navigational guidance may be provided to an operator using the ultrasonic imaging probe to perform noninvasive medical ultrasonography.
  • The processor may be configured to process ultrasound image data from the ultrasonic transducer and calibrate the estimated 6-DoF spatial position of the apparatus 100 using the processed ultrasound image data and the optical sensor image data.
  • Figure 5 illustrates an example of a data flow diagram, according to an implementation.
  • The processor 140 processes ultrasound image data 515, inertial sensor data 525, and optical sensor image data 535.
  • Outputs from each of the ultrasonic transducer 110, the inertial sensor 120, and the optical sensor 130 may be fused so as to obtain a more accurately calibrated estimation of the spatial position of the apparatus 100.
  • The processor may be configured to adjust one or more of the 2-D image frames in view of the estimated 6-DoF spatial position at the time of obtaining each respective 2-D image. For example, where the estimated 6-DoF spatial position at a time corresponding to a 2-D image frame (i) is different from the estimated 6-DoF spatial position at a time corresponding to a 2-D image frame (i+1), one or both of the respective 2-D image frames may be adjusted to compensate for the difference. As a result, a temporal series of 2-D images may be more accurately combined to compute 3-D image data 560.
  • The processor may be configured to make a determination whether or not an obtained 2-D image frame relates to a first volume under examination or to a different volume. For example, where an operator interrupts and then resumes use of the apparatus (e.g., by lifting it up from a first location and then setting it down at a second location), the operator may or may not intend that the first location and the second location be substantially identical.
  • The processor may be configured to determine, with regard to a newly received 2-D image frame, whether data from the 2-D image frame should be merged with previously received image frame data (because the first location and the second location are substantially identical) or not merged (because the first location and the second location are not substantially identical).
  • The processor may be configured to determine a difference between two or more 2-D image frames and compare the difference to a threshold to determine whether the images relate to approximately the same location.
  • The processor may be configured to compare the 6-DoF spatial positions, as well as operator settings of the ultrasound probe (e.g., frequency and gain, image depth, and signal processing filter parameters), associated with the two or more 2-D image frames to determine whether they should be associated with the same volume.
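A minimal sketch of such an association test, reusing the Pose6DoF sketch shown earlier; the numeric thresholds and the exact settings comparison are illustrative assumptions, as the patent gives no values.

```python
import numpy as np

def same_volume(pose_a, pose_b, settings_a, settings_b,
                max_translation=0.05, max_rotation=0.35):
    """Decide whether a new 2-D frame belongs to the current volume.

    pose_*: objects with x/y/z (meters) and roll/pitch/yaw (radians) fields.
    settings_*: dicts of operator settings (frequency, gain, depth, ...).
    """
    if settings_a != settings_b:
        return False  # changed probe settings suggest a new acquisition
    translation = np.linalg.norm([pose_a.x - pose_b.x,
                                  pose_a.y - pose_b.y,
                                  pose_a.z - pose_b.z])
    # Angle wrap-around is ignored here for brevity.
    rotation = max(abs(pose_a.roll - pose_b.roll),
                   abs(pose_a.pitch - pose_b.pitch),
                   abs(pose_a.yaw - pose_b.yaw))
    return translation <= max_translation and rotation <= max_rotation
```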
  • FIG. 6 illustrates an example of an environment in which the hand-held ultrasonic imaging probe may be operated, according to another implementation.
  • The apparatus 100 includes the processor 140 communicatively coupled with one or more optical sensors 130.
  • The processor 140 may be configured to collect image data of an environment (for example, an examining room) in which a subject is to be examined using the apparatus.
  • The examining room includes a plurality of optical emitters 601 configured for optical wireless communication (OWC).
  • The optical sensors 130 may be optically coupled so as to receive signals from the emitters 601, which may be configured as part of an indoor positioning system (IPS).
  • The optical emitters may be configured for visible light communication (VLC).
  • The optical emitters may alternatively be configured for communication in the infrared and/or ultraviolet light wavelengths.
  • The IPS may enable the processor to calculate, in real time, the probe's X, Y and Z location as well as the probe's pitch, yaw, and roll orientation with respect to the optical emitters 601.
  • The processor may be configured to calculate, in real time, the location of the subject or an anatomical feature of the subject, with or without use of an inertial sensor.
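The patent does not detail how the IPS turns emitter observations into coordinates. One conventional approach, sketched below under the assumption that a range estimate to each surveyed emitter is available (e.g., from signal strength or time of flight), is linearized least-squares multilateration. Note that this recovers location only; orientation would still come from the inertial and image data.

```python
import numpy as np

def multilaterate(emitter_positions, distances):
    """Least-squares 3-D position from ranges to known OWC emitters.

    emitter_positions: (N, 3) surveyed emitter locations, N >= 4 and not
    geometrically degenerate. distances: length-N array of range estimates.
    """
    p = np.asarray(emitter_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtract the first sphere equation from the others to linearize.
    A = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
         - d[1:] ** 2 + d[0] ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Quick self-check with ceiling-mounted emitters:
emitters = np.array([[0, 0, 3], [4, 0, 3], [0, 5, 3], [4, 5, 3.2]])
true_pos = np.array([1.0, 2.0, 1.0])
ranges = np.linalg.norm(emitters - true_pos, axis=1)
print(multilaterate(emitters, ranges))  # ~ [1. 2. 1.]
```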
  • The processor 140 may also be communicatively coupled with at least one inertial sensor 120.
  • The inertial sensor 120 may be configured to measure translational and rotational motion of the apparatus 100.
  • The processor may be configured to estimate, in real time, the probe's spatial position notwithstanding that some or all of the optical emitters 601 may be obscured from view of the optical sensors, and notwithstanding normal inertial sensor drift error accumulation.
  • Thus, a smart device for ultrasound imaging has been disclosed that is configured as an ultrasonic imaging probe including an inertial sensor and an optical sensor, where the processor is configured to calibrate the inertial sensor using outputs from the optical sensor.
  • A phrase referring to "at least one of" a list of items refers to any combination of those items, including single members.
  • For example, "at least one of: a, b, or c" is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
  • The hardware and data processing apparatus used to implement the various illustrative logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • A general purpose processor may be a microprocessor or any conventional processor, controller, microcontroller, or state machine.
  • A processor also may be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • The functions described may be implemented in hardware, digital electronic circuitry, computer software, or firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Implementations of the subject matter described in this specification also can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, data processing apparatus.
  • If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium, such as a non-transitory medium. The processes of a method or algorithm disclosed herein may be implemented in a processor-executable software module, which may reside on a computer-readable medium.
  • Computer-readable media include both computer storage media and communication media, including any medium that can be enabled to transfer a computer program from one place to another.
  • Storage media may be any available media that may be accessed by a computer.
  • Non-transitory media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer.
  • Also, any connection can be properly termed a computer-readable medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A noninvasive medical ultrasonography apparatus includes one or more ultrasonic transducers, one or more inertial sensors, one or more optical sensors, and a processor communicatively coupled with the ultrasonic transducers, the inertial sensors and the optical sensors. The processor is configured to estimate a position of the apparatus based on a combination of signals received from the ultrasonic transducers, the inertial sensors and the optical sensors.
PCT/US2016/029784 2015-04-28 2016-04-28 In-device fusion of optical and inertial positional tracking of ultrasound probes WO2016176452A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP16720692.9A EP3288465B1 (fr) 2015-04-28 2016-04-28 In-device fusion of optical and inertial positional tracking of ultrasound probes
CN201680024340.5A CN108601578B (zh) 2015-04-28 2016-04-28 In-device fusion of optical and inertial positional tracking of ultrasound probes

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201562153970P 2015-04-28 2015-04-28
US201562153974P 2015-04-28 2015-04-28
US201562153978P 2015-04-28 2015-04-28
US62/153,970 2015-04-28
US62/153,974 2015-04-28
US62/153,978 2015-04-28
US15/140,001 US20160317122A1 (en) 2015-04-28 2016-04-27 In-device fusion of optical and inertial positional tracking of ultrasound probes
US15/140,006 2016-04-27
US15/140,006 US20160317127A1 (en) 2015-04-28 2016-04-27 Smart device for ultrasound imaging
US15/140,001 2016-04-27

Publications (1)

Publication Number Publication Date
WO2016176452A1 (fr) 2016-11-03

Family

ID=55911132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/029784 WO2016176452A1 (fr) 2015-04-28 2016-04-28 In-device fusion of optical and inertial positional tracking of ultrasound probes

Country Status (1)

Country Link
WO (1) WO2016176452A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109223030A (zh) * 2017-07-11 2019-01-18 中慧医学成像有限公司 Palm-type three-dimensional ultrasound imaging system and method
WO2019179344A1 (fr) * 2018-03-20 2019-09-26 深圳大学 Three-dimensional ultrasound imaging method based on multi-sensor information fusion, device, and terminal machine-readable storage medium
WO2024073418A1 (fr) * 2022-09-26 2024-04-04 X Development Llc Calibration of multiple sensors of a handheld ultrasound system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6122538A (en) * 1997-01-16 2000-09-19 Acuson Corporation Motion--Monitoring method and system for medical devices
US20090306509A1 (en) * 2005-03-30 2009-12-10 Worcester Polytechnic Institute Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
US20100198068A1 (en) * 2007-02-16 2010-08-05 Johns Hopkins University Robust and accurate freehand 3d ultrasound
US20120253200A1 (en) * 2009-11-19 2012-10-04 The Johns Hopkins University Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
US20140046186A1 (en) * 2011-04-26 2014-02-13 University Of Virginia Patent Foundation Bone surface image reconstruction using ultrasound
US20140243671A1 (en) * 2013-02-28 2014-08-28 General Electric Company Ultrasound imaging system and method for drift compensation
WO2014134188A1 (fr) * 2013-02-28 2014-09-04 Rivanna Medical, LLC Systèmes et procédés d'imagerie par ultrasons

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MUHAMMAD SAADI ET AL: "Performance analysis of optical wireless communication system using pulse width modulation", ELECTRICAL ENGINEERING/ELECTRONICS, COMPUTER, TELECOMMUNICATIONS AND INFORMATION TECHNOLOGY (ECTI-CON), 2013 10TH INTERNATIONAL CONFERENCE ON, IEEE, 15 May 2013 (2013-05-15), pages 1 - 5, XP032436013, ISBN: 978-1-4799-0546-1, DOI: 10.1109/ECTICON.2013.6559627 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109223030A (zh) * 2017-07-11 2019-01-18 中慧医学成像有限公司 Palm-type three-dimensional ultrasound imaging system and method
CN109223030B (zh) * 2017-07-11 2022-02-18 中慧医学成像有限公司 Palm-type three-dimensional ultrasound imaging system and method
WO2019179344A1 (fr) * 2018-03-20 2019-09-26 深圳大学 Three-dimensional ultrasound imaging method based on multi-sensor information fusion, device, and terminal machine-readable storage medium
WO2024073418A1 (fr) * 2022-09-26 2024-04-04 X Development Llc Calibration of multiple sensors of a handheld ultrasound system

Similar Documents

Publication Publication Date Title
EP3288465B1 (fr) In-device fusion of optical and inertial positional tracking of ultrasound probes
EP3125809B1 (fr) Système chirurgical avec retour haptique basé sur l'imagerie tridimensionnelle quantitative
EP2185077B1 (fr) Systeme d'imagerie diagnostique ultrasonore et son procede de commande
US9504445B2 (en) Ultrasound imaging system and method for drift compensation
US20120271173A1 (en) Automatic ultrasonic scanning system and scanning method thereof
CN111035408B (zh) 用于超声探头定位反馈的增强的可视化的方法和***
EP3908190A1 (fr) Procédés et appareils pour la collecte de données ultrasonores
Suligoj et al. RobUSt–an autonomous robotic ultrasound system for medical imaging
JP7321836B2 (ja) 情報処理装置、検査システム及び情報処理方法
KR101116925B1 (ko) 초음파 영상을 정렬시키는 초음파 시스템 및 방법
AU2021265350A1 (en) A system for acquiring ultrasound images of internal body organs
US9437003B2 (en) Method, apparatus, and system for correcting medical image according to patient's pose variation
US10078906B2 (en) Device and method for image registration, and non-transitory recording medium
WO2016176452A1 (fr) In-device fusion of optical and inertial positional tracking of ultrasound probes
JP5677399B2 (ja) 情報処理装置、情報処理システム、情報処理方法、及びプログラム
Sun et al. Computer-guided ultrasound probe realignment by optical tracking
JP6338510B2 (ja) 情報処理装置、情報処理方法、情報処理システム、及びプログラム
CN111292248B (zh) 超声融合成像方法及超声融合导航***
US20210068781A1 (en) Ultrasonic imaging system
JP2013223625A (ja) 超音波画像解析装置および超音波画像解析方法
Octorina Dewi et al. Position tracking systems for ultrasound imaging: a survey
CN112397189A (zh) 一种医用导引装置及其使用方法
Tillapaugh et al. Indirect camera calibration for surgery tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16720692

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
REEP Request for entry into the european phase

Ref document number: 2016720692

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE