US8824756B2 - Image reconstruction incorporating organ motion - Google Patents


Info

Publication number
US8824756B2
Authority
US
United States
Prior art keywords
motion
deformation
organ
imaging modality
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US13/381,206
Other versions
US20120201428A1 (en)
Inventor
Sarang Joshi
Jacob Hinkle
Bill Salter
Tom Fletcher
Brian Wang
Current Assignee
University of Utah Research Foundation UURF
Original Assignee
University of Utah Research Foundation UURF
Priority date
Filing date
Publication date
Application filed by University of Utah Research Foundation (UURF)
Priority to US13/381,206
Assigned to THE UNIVERSITY OF UTAH. Assignors: FLETCHER, TOM; HINKLE, JACOB; JOSHI, SARANG; SALTER, BILL; WANG, BRIAN
Assigned to UNIVERSITY OF UTAH RESEARCH FOUNDATION. Assignor: THE UNIVERSITY OF UTAH
Publication of US20120201428A1
Application granted
Publication of US8824756B2
Status: Expired - Fee Related


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/003 - Reconstruction from projections, e.g. tomography
    • G06T 11/006 - Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 - Specific aspects of physiological measurement analysis
    • A61B 5/7285 - Specific aspects of physiological measurement analysis for synchronising or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
    • A61B 5/7289 - Retrospective gating, i.e. associating measured signals or images with a physiological event after the actual measurement or image acquisition, e.g. by simultaneously recording an additional physiological signal during the measurement or image acquisition
    • A61B 6/00 - Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/52 - Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211 - Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B 6/5229 - Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B 6/5247 - Devices using data or image processing specially adapted for radiation diagnosis combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 - Information retrieval of still image data
    • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 - Retrieval using metadata automatically derived from the content
    • G06F 17/30247
    • G06K 9/00442
    • G06T 2211/00 - Image generation
    • G06T 2211/40 - Computed tomography
    • G06T 2211/412 - Dynamic

Definitions

  • The disclosure relates generally to imaging and, more particularly, to reconstructing images which incorporate motion.
  • Imaging techniques have been developed for studying organs where motion is present.
  • Four-dimensional respiratory-correlated computed tomography (4D RCCT) has been widely used for studying organs where motion is associated with patient breathing.
  • The current standard practice for reconstructing images with respect to such organ motion is to use phase binned images.
  • The phase binning algorithm assumes that the patient has a periodic breathing pattern. When the patient's breathing is irregular, as represented in the graph of FIG. 1A, this assumption breaks down and significant image artifacts, like that indicated by the arrow of FIG. 1B, are introduced.
  • The patient passes through the scanner on an automated couch that pauses at regular intervals to collect data.
  • Slices are acquired repeatedly (e.g., 15-20 times). Each slice is acquired by collecting a series of projections at different angles. The slices are then reconstructed individually using filtered back-projection.
  • The speed of acquisition of each slice depends on the scanner and, for current generation multi-slice scanners, is generally on the order of 0.5 s.
  • The X-ray detection process used to acquire slices is subject to Poisson noise. However, at the X-ray tube currents typically used in clinical practice the signal is strong enough that the noise is approximately Gaussian.
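The adequacy of this Gaussian approximation at high photon counts can be checked numerically. The sketch below uses an illustrative mean count (not a value taken from the disclosure) and compares the standardized Poisson counts against Gaussian moments:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_count = 10_000  # illustrative photon count per detector bin, not from the patent

# Simulated Poisson-distributed detector counts
samples = rng.poisson(mean_count, size=200_000)

# For Poisson(lam): mean = lam and variance = lam, so standardize with
# the moments of the approximating Gaussian.
standardized = (samples - mean_count) / np.sqrt(mean_count)

# Poisson skewness is 1/sqrt(lam) (= 0.01 here), so the standardized
# counts are nearly symmetric, like a Gaussian; variance is close to 1.
skew = np.mean(standardized ** 3)
var = np.var(standardized)
```

At low tube currents (small mean counts) the skewness grows and the Gaussian model degrades, which is consistent with the disclosure restricting the approximation to clinically typical currents.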
  • The patient's breathing is monitored during acquisition using an external surrogate for internal organ motion.
  • The resulting breathing trace, a(t), is used to tag each acquired projection retrospectively with a breathing amplitude.
  • An example of a breathing trace monitoring system is the Real-time Position Management (RPM) system (Varian Oncology Systems, Palo Alto, Calif.), which uses a camera to track infrared-reflective markers attached to the patient's torso.
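Retrospective amplitude tagging amounts to sampling the surrogate trace a(t) at each projection's time stamp. A minimal sketch, assuming a densely sampled trace and linear interpolation (function and variable names here are hypothetical, not taken from the disclosure):

```python
import numpy as np

def tag_projections(trace_times, trace_amplitudes, projection_times):
    """Assign a breathing amplitude to each time-stamped projection by
    linearly interpolating the surrogate breathing trace a(t)."""
    return np.interp(projection_times, trace_times, trace_amplitudes)

# Synthetic, slightly irregular breathing trace sampled at 25 Hz
t = np.linspace(0.0, 20.0, 501)
a = np.sin(2 * np.pi * t / 4.0) * (1.0 + 0.1 * np.sin(0.5 * t))

# One projection set every 0.5 s (on the order of one slice acquisition)
projection_times = np.arange(0.25, 20.0, 0.5)
amplitudes = tag_projections(t, a, projection_times)
```

Each projection thereby carries an amplitude tag usable for amplitude-indexed reconstruction, without assuming the breathing is periodic.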
  • Modeled rigid 2D motion has been used during image acquisition to alleviate in-plane artifacts in fan-beam CT.
  • However, rigid 2D motion models are not valid for imaging of the torso, where respiratory-induced motion causes highly non-linear deformation with a significant component in the superior-inferior direction.
  • Another prior method reconstructs a full 4D time-indexed image using a B-spline motion model and a temporal smoothing condition.
  • Yet other B-spline-based methods require an artifact-free reference image (such as a breath-hold image) in addition to a 4D fan-beam or cone-beam scan. These approaches address difficulties caused by slowly-rotating cone-beam scanners.
  • However, the acquisition of an artifact-free reference image remains impractical for many radiotherapy patients.
  • Although the B-spline model guarantees smooth deformations, it cannot guarantee the diffeomorphic properties for large deformations, and it does not directly enforce local conservation of tissue volume.
  • In another approach, 3D images are reconstructed at arbitrary amplitudes by interpolating each slice from those collected at nearby amplitudes and then stacking them.
  • Two slices are used to interpolate a slice at the desired amplitude using an optical flow algorithm, so only 2D motion can be estimated.
  • A more recent approach has used a 4D cone-beam scan to estimate organ motion using an optical flow approach. The motion estimate is then used to correct for organ motion during subsequent 3D scans on the fly. This method may be useful in reducing artifacts in a 3D image, but the optical flow model, like the B-spline model, does not ensure diffeomorphic incompressible motion estimates.
  • In other prior work, an incompressible optical flow method was used for image registration. Some systems have used a spline-based model which penalizes tissue compression to perform incompressible image registration. Other researchers have studied incompressible fluid-based registration of liver computed tomography (CT). An incompressible fluid-based approach solves Poisson's equation via a multigrid method at each iteration. Another efficient Fourier method of incompressible projection applies a result from the continuous domain to discrete data without alteration, which does not accommodate the discrete nature of image data. Despite these efforts in image registration, the incompressible nature of internal organs has not previously been incorporated into the image reconstruction process.
  • Deformable image registration has been shown to be useful in tracking organ motion in artifact-free 4D RCCT images. Such methods may be used with either phase or amplitude binned images, but are challenged in the presence of binning artifacts. That is, deformable image registration is not well suited for use with 4D RCCT images which include binning artifacts.
  • The present invention is directed to systems and methods which implement an image reconstruction framework to incorporate complex motion of structure, such as complex deformation of an object or objects being imaged, in image reconstruction techniques.
  • Image reconstruction frameworks herein are operable with respect to a plurality of imaging modalities, such as CT, magnetic resonance (MR), positron emission tomography (PET), ultrasound, etc.
  • An image reconstruction framework according to embodiments of the invention utilizes information regarding the nature of motion, information regarding motion tracking, and imaging modality raw image information to jointly provide motion estimation and image reconstruction.
  • Embodiments operate with respect to anatomical images, including one or more organs or other structure (referred to herein generally as objects), of a patient who is being medically imaged.
  • The method may include the operation of collecting raw image data from a medical imaging device configured to obtain anatomical images of a patient's internal anatomy.
  • Motion tracking data can be obtained, wherein such motion tracking data can be related to organ motion as the anatomical images are obtained.
  • The motion tracking data may represent breathing motion as measured by a motion sensor associated with the patient. Additionally or alternatively, the motion tracking data may be measured from raw image data acquired from the medical imaging device.
  • A further operation comprises modeling organ deformation motion using the motion tracking data with correlated raw image data and data regarding the nature of motion of the organ (e.g., motion bounding data for the organ).
  • Motion tracking data can be used to indicate attributes of organ motion (e.g., motion timing) in order to facilitate modeling of a patient's internal organs and to obtain organ deformations by an estimation of anatomical deformation during image reconstruction.
  • Modeling of the organ deformations includes using data regarding the nature of motion of the organ to enforce conservation of local tissue volume during organ modeling. Further, data regarding the nature of motion of the organ may provide a diffeomorphic motion model of organ deformations for deformation motion modeling according to embodiments herein.
  • The organ deformation motion can be used in reconstructing anatomical images. For example, an estimated time-indexed 4D image can then be generated using the organ deformation motion, wherein the estimated time-indexed image is viewable by an end user.
  • The organ deformation motion estimation provided according to embodiments of the present invention can thus be used to reduce motion artifacts and/or increase the signal-to-noise ratio in images being reconstructed. Increased signal-to-noise ratio resulting from implementation of organ deformation motion estimation facilitates a decrease in the dose of imaging modality energy (e.g., X-ray energy, magnetic field energy, ultrasound energy, etc.) delivered to the patient for generating an image of desired quality.
  • FIG. 1A shows a graph of irregular breathing cycles of a patient for which imaging is provided
  • FIG. 1B shows an image reconstructed using a phase binning algorithm for image data collected during the irregular breathing cycles of FIG. 1A;
  • FIG. 2 shows an imaging system adapted according to an embodiment of the present invention
  • FIG. 3 shows an image reconstruction framework of an embodiment of the present invention as may be utilized in the imaging system of FIG. 2 ;
  • FIG. 4A shows a graph of motion tracking data as utilized in an exemplary embodiment providing image reconstruction with respect to a phantom
  • FIG. 4B shows the stationary phantom of the exemplary embodiment of FIG. 4A imaged with helical CT
  • FIG. 5 shows the 4D reconstructed images from a phase binned dataset (top row) and an amplitude binned dataset (bottom row) of the phantom of FIG. 4B ;
  • FIG. 6A shows the 4D reconstructed images generated according to the concepts herein using the same raw data as the phase and amplitude binned images in FIG. 5 ;
  • FIGS. 6B and 6C show plots of the posterior indicating the convergence of the method as applied in FIG. 6A;
  • FIG. 7A shows a single point in the phantom of FIG. 4B tracked for analyzing displacement versus motion tracking data as shown in FIGS. 7B and 7C ;
  • FIG. 7B shows a graph of motion tracking data as utilized in the analysis of FIG. 7C ;
  • FIG. 7C shows a plot of the estimated displacements versus the motion tracking data of FIG. 7B for the point indicated in FIG. 7A ;
  • FIG. 8 shows a comparison between phase binning reconstructed images (top row) and the 4D reconstructed images according to the concepts herein (bottom row) at peak-exhale, mid-range, and peak-inhale;
  • FIG. 9 shows 4D reconstructed images (top row) and log Jacobian determinant images (bottom row) for compressible flow reconstruction (left column) and with incompressibility constraint (right column).
  • Imaging system 200 comprises processor based system 210 operatively coupled to transducer 220 for use in collecting imaging data with respect to subject 201 .
  • Subject 201 may comprise a patient for which imaging of a portion of the patient's body (e.g., head, torso, leg, shoulder, etc.) is desired.
  • Imaging as provided by imaging system 200 may comprise reconstruction of images of internal body structure (e.g., organs, cavities, bones, etc.) using one or more imaging modalities (e.g., CT, MR, PET, ultrasound, etc.).
  • Processor based system 210 of embodiments comprises a processor (e.g., central processing unit (CPU), application specific integrated circuit (ASIC), etc.), memory (e.g., random access memory (RAM), read only memory (ROM), disk memory, optical memory, etc.), and appropriate input/output (I/O) apparatus (e.g., display, pointing device, keyboard, printer, speaker, microphone, etc.) operable under control of an instruction set (e.g., software, firmware, etc.) defining operation as described herein. Such operation may provide image reconstruction using one or more imaging modality corresponding to the configuration of transducer 220 .
  • Where a CT imaging modality is utilized, transducer 220 may comprise an X-ray source and detector configuration.
  • Where an MR imaging modality is utilized, transducer 220 may comprise a magnetic field generator and receiver configuration.
  • Where a PET imaging modality is utilized, transducer 220 may comprise a gamma ray source and detector configuration.
  • Where an ultrasound imaging modality is utilized, transducer 220 may comprise an ultrasonic sound source and receiver configuration.
  • Imaging system 200 of embodiments herein is adapted to utilize an image reconstruction framework to incorporate motion of structure, such as complex deformation of an object or objects being imaged, in an image reconstruction technique implemented thereby.
  • An image reconstruction framework as may be utilized by imaging system 200 is shown in FIG. 3 as image reconstruction framework 300 .
  • Image reconstruction framework 300 may be implemented, at least in part, as instructions operable upon the processor of processor based system 210 of imaging system 200 .
  • Image reconstruction framework 300 utilizes information regarding the nature of motion, information regarding motion tracking, and imaging modality raw image information to jointly provide motion estimation and image reconstruction.
  • Image reconstruction framework 300 of the illustrated embodiment includes nature of motion database 310, storing information regarding the nature of motion, and image reconstruction processing 330, providing image reconstruction processing as described herein.
  • Motion tracking data, structure/object data, and/or other imaging parameters useful in image reconstruction processing according to embodiments are provided to image reconstruction processing 330 by imaging parameter input 340.
  • Imaging modality raw image information useful in image reconstruction according to embodiments is provided to image reconstruction processing 330 from imaging modality 320 of processor based system 210.
  • Image reconstruction framework 300 of the illustrated embodiment may operate with respect to various imaging modalities, such as CT, magnetic resonance (MR), positron emission tomography (PET), ultrasound, etc., and thus imaging modality 320 may comprise one or more such imaging modalities.
  • Imaging modalities provide raw image information (e.g., raw image projection data) as is understood in the art.
  • Algorithms of image reconstruction processing 330 are preferably adapted to reconstruct images from the particular raw image information provided by an imaging modality being used.
  • Image reconstruction processing 330 of embodiments models structure motion jointly with the image at the time of image reconstruction.
  • Operation of image reconstruction processing 330 preferably accommodates complex deformation of structure, as opposed to rigid motion of structure.
  • For example, image reconstruction processing 330 may model the complex deformation that the anatomy undergoes when the subject (e.g., patient) is breathing, the subject's heart is beating, the subject's gastrointestinal tract is contracting, etc.
  • Such deformation may comprise incompressible organ deformation (e.g., movement of the liver associated with breathing), compressible organ deformation (e.g., movement of the heart associated with its beating), etc.
  • The image reconstruction framework of embodiments includes a deformation model which models the entire non-linear movement process and builds a movement estimation utilized in image reconstruction.
  • Nature of motion database 310 comprises structure motion parameters (e.g., object motion bounding data) for use in modeling one or more objects.
  • Structure motion parameters may comprise information regarding constraints on possible deformations of a medium, particular structure, or a particular object (e.g., limits on or bounds on movement that may be experienced by an object).
  • Such structure motion parameters may comprise boundaries (e.g., upper and/or lower limits) regarding velocity, changes in velocity, etc. for a medium, particular structure, etc.
  • Structure motion parameters may be derived or estimated using mathematical analysis techniques such as B-spline, polynomial-spline, elastic-spline, poly-affine, polynomial, wavelet based, and/or the like. It can be appreciated that different media, structures, objects, etc. may have different properties associated therewith (e.g., density, rigidity, fluidity, etc.) which affect the movement (e.g., deformation) thereof.
  • Using such structure motion parameters, image reconstruction processing 330 may effectively estimate movement of structure within an image being reconstructed. That is, although an exact model of the movement of structure may not be known, limits on how such movement occurs (e.g., how fast, how smooth, etc.) may be known, from which structure movement may be estimated herein.
  • Embodiments implement a maximum a posteriori (MAP) method for tracking structure motion that uses raw time-stamped image information to reconstruct the images and estimate structure deformations simultaneously.
  • The raw image information (e.g., raw projection data) provided by imaging modality 320 is preferably time-stamped or otherwise adapted for indexing with motion tracking data.
  • Such time-stamp indexing of raw projection data may be utilized with respect to motion tracking data (e.g., provided by imaging parameter input 340) related to structure motion as the raw image projections are obtained (e.g., organ motion as the subject is being imaged).
  • Operation of image reconstruction processing 330 of embodiments estimates motion of structure within the image using the aforementioned structure motion parameters.
  • The particular structure motion parameters utilized with respect to structure, an object, an image, etc. may be selected based upon structure data (e.g., provided by imaging parameter input 340). For example, particular structure being imaged, a particular object being imaged, a particular region of the anatomy, a particular medium, or a particular characteristic (e.g., rigid, fluid filled, gaseous, etc.) may be identified by the structure data for selection of one or more appropriate structure motion parameters.
  • The imaging parameters provided by imaging parameter input 340 may be provided by user input, collected by one or more sensors, derived from other data, and/or the like.
  • Image reconstruction processing 330 of embodiments implements an alternating optimization algorithm to find the solution to both problems (motion estimation and image reconstruction) at the same time, using the aforementioned information regarding the nature of motion.
  • Image reconstruction processing 330 uses equations derived from the Navier-Stokes equations to provide mechanical modeling of tissue which accommodates various different tissue properties, from rigid, incompressible tissue to very compressible, soft tissue.
  • The global structure of the equations may be determined a priori, and preferably the parameters utilized in solving the aforementioned problems are optimized during operation of the estimation algorithm (as represented by the feedback shown by the dotted line of FIG. 3).
  • The method eliminates artifacts, as it does not rely on a binning process, and increases signal-to-noise ratio (SNR) by using all of the collected data.
  • This technology also facilitates the incorporation of fundamental physical properties such as the conservation of local tissue volume during the estimation of the organ motion.
  • This method is accurate and robust against noise and irregular breathing for tracking organ motion and reducing artifacts.
  • An improvement in image quality is also demonstrated by application of the method to data from a real liver stereotactic body radiation therapy patient.
  • Although the B-spline model guarantees smooth deformations, it cannot guarantee the diffeomorphic properties for large deformations ensured by the method described herein, and it does not directly enforce local conservation of tissue volume.
  • A 4D cone-beam scan used to estimate organ motion via an optical flow approach may be useful in reducing artifacts in a 3D image, but the optical flow model, like the B-spline model, does not ensure diffeomorphic incompressible motion estimates.
  • Exemplary embodiments are provided below which give detail helpful in understanding the concepts herein.
  • The exemplary embodiments presented below are general examples as may be implemented using imaging system 200 of FIG. 2 and image reconstruction framework 300 of FIG. 3. It should be appreciated that the particular exemplary embodiments, and their associated detail, are intended to facilitate an understanding of the concepts of the present invention and do not present limitations regarding various embodiments thereof.
  • Although the exemplary embodiments presented herein are discussed with respect to application to signals recorded by breathing monitoring systems, such as spirometry or chest circumference tracking, other motion tracking systems may be utilized.
  • Likewise, although the examples address 4D RCCT of the liver, the methods may be applied to other organ motion, such as cardiac motion, using the ECG signal in place of a breathing monitor.
  • The 4D image reconstruction of embodiments estimates the time-indexed image I(t, x) that best represents the patient's anatomy during image acquisition.
  • The data likelihood is derived, and a prior model incorporating the physical constraints is preferably defined.
  • The 4D image that maximizes the posterior probability combining the data likelihood and the prior is estimated according to embodiments.
  • CT image acquisition, as may be utilized according to embodiments of the invention, is described by a projection operator, P_θ, which can represent fan-beam or cone-beam projections at angle θ.
  • The acquisition of a single projection p_i is subject to Gaussian noise of variance σ².
  • The data log-likelihood then becomes a sum of squared-error terms over the acquired projections.
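Written out under the Gaussian noise model above (a reconstruction from the surrounding definitions, since the equation itself does not survive in this text; t_i denotes the acquisition time of projection p_i):

```latex
\log L\bigl(\{p_i\} \mid I\bigr)
  = -\frac{1}{2\sigma^2} \sum_i \bigl\| P_{\theta_i}\, I(t_i, \cdot) - p_i \bigr\|^2
    + \mathrm{const.}
```

Maximizing this likelihood alone would ignore motion; the prior over deformations introduced next regularizes the problem.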
  • The organ motion may be assumed to be correlated with breathing amplitude, so the deformation may be indexed by amplitude alone. Under this assumption, the deformations take the form h(a(t), x). The velocity of a point in the patient's anatomy is then described by the ordinary differential equation d/dt h(a(t), x) = ȧ(t) v(a(t), h(a(t), x)), where v is the amplitude-indexed velocity field.
  • Under this model, the resulting estimates of patient anatomy are at all times diffeomorphic to one another. This ensures that organs do not tear or disappear during breathing.
  • The diffeomorphic deformations provide a one-to-one correspondence between points in images from different breathing amplitudes, enabling tracking of tissue trajectories.
  • Here L is a differential operator chosen to reflect physical tissue properties.
  • Although a homogeneous operator may be used, L can also be spatially-varying, reflecting the different material properties of the underlying anatomy.
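A common choice of such an operator in fluid-based registration, given here only as an illustrative example rather than the operator mandated by the disclosure, is the Navier-Stokes-derived operator

```latex
L = -\alpha \nabla^{2} - \beta \,\nabla(\nabla \cdot) + \gamma \,\mathrm{Id},
```

where α and β control viscosity-like smoothing of the velocity field, γ ensures invertibility of L, and Id is the identity operator.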
  • The 4D image reconstruction of embodiments can estimate the image and the deformations, parameterized by the velocity field, that maximize equation 6.
  • A MAP estimate that maximizes equation 6 may be obtained via an alternating iterative method which, at each iteration, updates the estimate of the deformation in a gradient ascent step and then updates the image using the associated Euler-Lagrange equation.
  • The continuous amplitude-indexed velocity field may be discretized by a set of equally-spaced amplitudes a_k, with spacing Δa, and associated velocities v_k.
  • h ⁇ ( a i , x ) h ⁇ ( a k , x ) + a i - a k ⁇ ⁇ ⁇ a ⁇ v k ⁇ ( h ⁇ ( a k , x ) ) ( 9 )
  • Higher order integration schemes, such as Runge-Kutta, may also be used in place of the simpler Euler method, if desired.
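The amplitude stepping of equation (9) can be sketched in one dimension as follows; `integrate_deformation` is a hypothetical helper (full Euler steps across whole amplitude bins plus one fractional step), not the disclosure's implementation:

```python
import numpy as np

def integrate_deformation(x, velocities, delta_a, a_target):
    """Euler integration of an amplitude-indexed velocity field (1D sketch).
    velocities[k] is the velocity field on amplitude bin
    [k*delta_a, (k+1)*delta_a); the deformation h is advanced bin by bin."""
    h = x.astype(float).copy()
    a = 0.0
    k = 0
    while a + delta_a <= a_target and k < len(velocities):
        # full Euler step across one amplitude bin: h <- h + delta_a * v_k(h)
        h = h + delta_a * np.interp(h, x, velocities[k])
        a += delta_a
        k += 1
    if k < len(velocities) and a < a_target:
        # fractional step, matching the interpolation of equation (9)
        frac = (a_target - a) / delta_a
        h = h + frac * delta_a * np.interp(h, x, velocities[k])
    return h

x = np.linspace(0.0, 1.0, 11)
v = [np.full_like(x, 0.1), np.full_like(x, -0.05)]  # two constant amplitude bins
h = integrate_deformation(x, v, delta_a=1.0, a_target=1.5)
```

With constant velocities the result is a uniform shift (one full step of +0.1 and a half step of −0.025); a Runge-Kutta variant would replace the per-bin Euler update with a higher-order one.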
  • b_i(k, x) = { 0, for a_i < a_k; ((a_i − a_k)/Δa) ∇I_k(x + ((a_i − a_k)/Δa) v_k(x)), for a_k ≤ a_i ≤ a_{k+1}; ∇I_k(x + v_k(x)), for a_i > a_{k+1} }   (12)
  • The Helmholtz-Hodge decomposition enables the implementation of the incompressibility constraint by simply projecting the unconstrained velocity fields onto the space of divergence-free vector fields at each iteration of the method.
  • Care must be taken as to which discrete divergence operator is used, since the projection operates in the Fourier domain.
  • The discrete Fourier transform of a central difference approximation to the derivative of a function f can be written as multiplication, in the frequency domain, by (i/Δx) sin(2πk/N), for frequency index k, N samples, and sample spacing Δx.
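The divergence-free projection can then be sketched with the FFT, using the central-difference symbol sin(2πk/N) discussed above (constant factors such as i/Δx cancel inside the projection). This is an illustrative sketch, not the disclosure's implementation:

```python
import numpy as np

def project_divergence_free(vx, vy):
    """Project a 2D periodic vector field onto its divergence-free part
    via the Helmholtz-Hodge decomposition in the Fourier domain, using
    the symbol of a discrete central difference as the derivative."""
    n = vx.shape[0]
    s = np.sin(2 * np.pi * np.arange(n) / n)   # central-difference symbol
    sx, sy = np.meshgrid(s, s, indexing="ij")
    Vx, Vy = np.fft.fft2(vx), np.fft.fft2(vy)
    s2 = sx**2 + sy**2
    s2[s2 == 0] = 1.0                          # modes the symbol annihilates pass through
    dot = sx * Vx + sy * Vy
    Wx = Vx - dot * sx / s2                    # subtract the curl-free (gradient) part
    Wy = Vy - dot * sy / s2
    return np.fft.ifft2(Wx).real, np.fft.ifft2(Wy).real

n = 32
ang = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(ang, ang, indexing="ij")
# gradient (curl-free) part plus a divergence-free part
vx = np.sin(X) * np.cos(Y) - np.sin(Y)
vy = np.cos(X) * np.sin(Y) + np.sin(X)
wx, wy = project_divergence_free(vx, vy)
```

On this test field the gradient component (sin X cos Y, cos X sin Y) is removed while the divergence-free component (−sin Y, sin X) passes through unchanged, and the central-difference divergence of the output vanishes to machine precision.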
  • At each iteration, the image estimate should also be updated.
  • The change of image estimate in turn alters the velocity gradients, leading to a joint estimation algorithm in which, at each iteration, the velocity fields are updated and then the image is recalculated.
  • A 4D reconstruction method for slice data is set forth in the pseudocode below.
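The pseudocode itself does not survive in this text; what follows is a sketch of the alternating scheme as described above (initialization, alternating velocity and image updates, optional incompressibility projection), not the verbatim pseudocode of the disclosure:

```
initialize v_k <- 0 for every amplitude bin k
initialize I <- average of all acquired data (motion ignored)
repeat until the posterior stops increasing:
    # motion update: gradient ascent on the log-posterior
    for each amplitude bin k:
        v_k <- v_k + step * (data-match gradient + regularization gradient from L)
        v_k <- projection of v_k onto divergence-free fields   # if incompressibility enforced
    # image update: solve the Euler-Lagrange equation for I
    I <- average of all slices/projections mapped through h(a_i, .) to the base amplitude
return I and the velocity fields {v_k}
```
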
  • The velocity fields are initialized to zero, so that the initial estimate of the base image is simply the result of averaging all of the data. This yields a quite blurry image that sharpens upon further iterations as the motion estimate improves.
  • In some embodiments, the organ deformation motion is modeled using multiple graphics processing units (GPUs) in communication with a computer memory storing the anatomical images.
  • Current computer graphics hardware may include hundreds of separate graphics channels or pipelines; each of these channels can be used in parallel in order to process and reconstruct the anatomical images being used.
  • The following example embodiment was performed using the CIRS anthropomorphic thorax phantom (CIRS Inc., Norfolk, Va.) and a GE Lightspeed RT scanner (GE Health Care, Waukesha, Wis.).
  • The phantom includes a simulated chest cavity with a 2 cm spherical object representing a tumor that is capable of moving in three dimensions.
  • A chest marker is also included in the phantom, which moves in a pattern synchronized to the tumor and allows simulation of a real patient 4D RCCT scan.
  • The scans used in this study were driven to simulate a breathing trace collected from a real patient.
  • The recorded RPM trace (motion tracking data) of the aforementioned experimental setup is shown in FIG. 4A, while the stationary spherical CIRS lung tumor phantom imaged with helical CT is shown in FIG. 4B.
  • The 4D phase binned dataset of the CIRS phantom generated by the GE Advance Workstation is shown in the top row of FIG. 5.
  • The top row of FIG. 5 shows phase binned images at three different phases for the high SNR data (left) and the low SNR (10% tube current) data (right). Binning artifacts, including mismatched slices in the phase binned data when compared with the image of the stationary phantom, can readily be seen in the reconstructed phase binned images of FIG. 5.
  • Shown in the bottom row of FIG. 5 are images from an amplitude binned dataset at peak-inhale, mid-range amplitude, and peak-exhale for both the high and low SNR data.
  • The reconstructed images from the amplitude binned dataset do not show signs of mismatched slices and more closely resemble the static phantom image, but suffer from missing data artifacts.
  • FIG. 6A shows the 4D reconstructed images generated according to the concepts herein using the same raw data as the phase and amplitude binned images in FIG. 5. Specifically, FIG. 6A shows 4D reconstructed images of the phantom at end-inhale, mid-range, and end-exhale amplitudes.
  • the top row shows the 4D reconstruction of the high SNR data, while the bottom row shows that of the low SNR data.
  • the reconstructed image as shown in FIG. 6A does not have any artifacts associated with either the phase binning or amplitude binning.
  • the similarity in images between the two 4D reconstructions shows the robustness of the image estimation to increasing noise.
  • FIGS. 6B and 6C show plots of the posterior indicating the convergence of the method as applied in FIG. 6A.
  • shown in FIG. 7C is a plot of the estimated displacements versus the RPM signal (motion tracking data) of FIG. 7B.
  • FIG. 8 provides a comparison between phase binning reconstructed images (top row) and the 4D reconstructed images according to the concepts herein (bottom row) at peak-exhale, mid-range, and peak-inhale.
  • slice mismatch artifacts are absent in the 4D reconstructed image as is readily apparent in FIG. 8 .
  • FIG. 9 shows 4D reconstructed images (top row) and log Jacobian determinant images (bottom row) for compressible flow reconstruction (left column) and with incompressibility constraint (right column). Negative log Jacobian values indicate local compression, while positive values indicate expansion.
  • the reconstructed images are extremely similar, while the Jacobian maps (bottom row) are quite different.
  • the compressible-flow estimate indicates compression and expansion of the top and bottom of the liver, while the incompressible reconstruction shows no local expansion or contraction.
  • although the liver is a blood-filled organ, physiologically it does not undergo any appreciable local changes in volume due to breathing. This exemplifies the usefulness of incorporating incompressibility into the reconstruction process.
  • medical imaging scanners that can be used with the present technology include but are not limited to: PET (Positron Emission Tomography) scanning, CT (computed tomography) scanning, combined PET/CT scanning, MRI (Magnetic Resonance Imaging) scanning, radio frequency (RF) scanning, millimeter wave scanning, holographic scanning, conebeam scanners, other nuclear medicine oriented types of scanning devices, and/or the like.
  • helical conebeam simply amounts to shifting the input image before projection, according to the z value at which the projection was obtained.
  • helical fanbeam CT can likewise be implemented by shifting the input.
  • the method introduced here, therefore, encompasses variations of fanbeam or conebeam CT acquisition.
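A minimal sketch of the shift-then-project idea for helical acquisition; `project` stands for any static fan-beam or cone-beam projection operator, the integer-voxel shift is a simplifying assumption (a real scanner needs sub-voxel interpolation), and the names are illustrative:

```python
import numpy as np

def helical_project(image, z_shift, project):
    """Model a helical acquisition by shifting the volume along z (the couch
    direction) before applying a static projection operator.  np.roll wraps
    around; real code would zero-pad instead."""
    shifted = np.roll(image, -z_shift, axis=0)  # bring the couch position to the origin
    return project(shifted)
```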
  • the projection image has value at each position equal to the integral of the image from the source position to the detector position.
  • the detector position can be determined in world coordinates from (u, θ) via (u, θ) ↦ (−d_p cos θ − u sin θ, −d_p sin θ + u cos θ).
  • a formula for the backprojection of p at (x, y, z) may be derived. Note that, in order not to accumulate values that are never projected, the backprojection should be limited to be zero in areas that are never integrated.
  • the furthest source position may be found and checked to determine whether it is within l_max of the point. This may be performed by checking whether (d_x + √(x² + y²))² + z² ≤ l_max².
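That bound check can be written directly; the function name is an illustrative placeholder:

```python
import math

def ever_integrated(x, y, z, d_x, l_max):
    """Return True if the point (x, y, z) is within l_max of the furthest
    source position, i.e. (d_x + sqrt(x**2 + y**2))**2 + z**2 <= l_max**2.
    Backprojection is forced to zero wherever this is False, so values never
    accumulate in regions that are never integrated."""
    return (d_x + math.hypot(x, y)) ** 2 + z ** 2 <= l_max ** 2
```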
  • the Jacobian of the ray point with respect to (u, ν, l) may be written

Dr(u, ν, l) = (u² + ν² + d²)^(−3/2) ·
[ l(ud cos θ − (ν² + d²) sin θ)    lν(d cos θ + u sin θ)    (−d cos θ − u sin θ)(u² + ν² + d²) ]
[ l(ud sin θ + (ν² + d²) cos θ)    lν(d sin θ − u cos θ)    (−d sin θ + u cos θ)(u² + ν² + d²) ]
[ −luν                             l(u² + d²)               ν(u² + ν² + d²)                    ]
  • for static gradient descent with an optimal step size, the noise may be modeled as Gaussian.
  • the functional J(I) = Σ_i ∫_u ∫_ν [P_{θ_i}{I}(u, ν) − p_i(u, ν)]² du dν  (43) is minimized with respect to the image I ∈ L²(ℝ³).
  • substituting a gradient step gives J(I_{k+1}) = Σ_i ∫_u ∫_ν [P_{θ_i}{I_k − ε ∇_I J(I_k)}(u, ν) − p_i(u, ν)]² du dν.  (44)
  • the derivative of this with respect to the step size ε is taken in order to optimize:
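Because each projection operator $P_{\theta_i}$ is linear, this one-dimensional optimization has a closed form. The derivation below is a sketch consistent with Eqs. (43)-(44), not text taken from the patent. Writing $g = \nabla_I J(I_k)$, $r_i = P_{\theta_i}\{I_k\} - p_i$, and $q_i = P_{\theta_i}\{g\}$:

```latex
J(I_k - \epsilon g) = \sum_i \iint \left[ r_i(u,\nu) - \epsilon\, q_i(u,\nu) \right]^2 \, du \, d\nu ,
```

which is quadratic in $\epsilon$, so setting $dJ/d\epsilon = 0$ gives the optimal step size

```latex
\epsilon^{*} = \frac{\sum_i \iint r_i(u,\nu)\, q_i(u,\nu)\, du\, d\nu}
                    {\sum_i \iint q_i(u,\nu)^2\, du\, d\nu} .
```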
  • a 4D reconstruction method for conebeam data is set forth in the pseudocode below. It should be appreciated that "streakdiff" is used in the pseudocode below for the velocities and the image update; this is where the optimization (and the GPU acceleration) comes in.
  • ⁇ d is a line or plane in R 3 .
  • the case of lines will be considered in the following example. However, it should be appreciated that planar k-space data may be handled simply as a collection of lines.
  • the inner product in L 2 (R) may be used to derive the adjoint operator.
  • a 4D reconstruction method for MRI data is set forth in the pseudocode below. “Streakdiff” is again used in the below pseudocode for the velocities and the image update.
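A one-dimensional discrete analogue of this k-space line-sampling operator and the adjoint obtained from the L² inner product might look as follows; the orthonormal FFT convention keeps the adjoint equal to a zero-filled inverse FFT, and all names are hypothetical:

```python
import numpy as np

def sample_kspace_line(x, idx):
    """Forward operator: Fourier transform the image and sample it at the
    k-space indices of one acquired line."""
    return np.fft.fft(x, norm="ortho")[idx]

def sample_kspace_line_adjoint(k, idx, n):
    """Adjoint operator: zero-fill the sampled line back into the full
    k-space grid, then apply the inverse (unitary) FFT."""
    full = np.zeros(n, dtype=complex)
    full[idx] = k
    return np.fft.ifft(full, norm="ortho")
```

The adjoint property ⟨Fx, y⟩ = ⟨x, Fᴴy⟩ can be checked numerically on any pair of vectors.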

Abstract

Systems and methods which implement an image reconstruction framework to incorporate complex motion of structure, such as complex deformation of an object or objects being imaged, in image reconstruction techniques are shown. For example, techniques are provided for tracking organ motion which use raw time-stamped data provided by any of a number of imaging modalities and simultaneously reconstruct images and estimate deformations in anatomy. Operation according to embodiments reduces artifacts, increases the signal-to-noise ratio (SNR), and facilitates reduced image modality energy dosing to patients. The technology also facilitates the incorporation of physical properties of organ motion, such as the conservation of local tissue volume. This approach is accurate and robust against noise and irregular breathing for tracking organ motion.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Patent Application No. 61/222,028 entitled “IMAGE RECONSTRUCTION INCORPORATING ORGAN MOTION,” filed Jun. 30, 2009, the disclosure of which is hereby incorporated herein by reference.
TECHNICAL FIELD
The disclosure relates generally to imaging and, more particularly, to reconstructing images which incorporate motion.
BACKGROUND OF THE INVENTION
Imaging techniques have been developed for studying organs where motion is present. For example, four-dimensional respiratory-correlated computed tomography (4D RCCT) has been widely used for studying organs where motion is associated with patient breathing. The current standard practice for reconstructing images with respect to such organ motion is to use phase binned images. However, the phase binning algorithm assumes that the patient has a periodic breathing pattern. When the patient's breathing is irregular as represented in the graph of FIG. 1A, this assumption breaks down and significant image artifacts like that indicated by the arrow of FIG. 1B are introduced.
A recent extensive study found that 90% of 4D RCCT patient images had at least one artifact. Amplitude binning algorithms have been developed as a way to alleviate these artifacts by assuming that the underlying anatomical configuration is correlated to the amplitude of the breathing signal.
For example, during a typical 4D RCCT fan-beam scan, the patient passes through the scanner on an automated couch that pauses at regular intervals to collect data. At each couch position slices are acquired repeatedly (e.g., 15-20 times). Each slice is acquired by collecting a series of projections at different angles. The slices are then reconstructed individually using filtered back-projection. The speed of acquisition of each slice is dependent on the scanner and for current generation multi-slice scanners is generally on the order of 0.5 s. The X-ray detection process used to acquire slices is subject to Poisson noise. However, at the X-ray tube currents typically used in clinical practice the signal is strong enough that the noise is approximately Gaussian. The patient's breathing is monitored during acquisition using an external surrogate for internal organ motion. The resulting breathing trace, a(t), is used to tag the acquired projection retrospectively with a breathing amplitude. An example of a breathing trace monitoring system is the Real-time Position Management (RPM) system (Varian Oncology Systems, Palo Alto, Calif.), which uses a camera to track infrared-reflective markers attached to the patient's torso.
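The stated Gaussian approximation of the Poisson detection noise at clinical tube currents can be checked numerically; the photon count below is an illustrative value, not one taken from the patent:

```python
import math

def poisson_pmf(k, lam):
    """Exact Poisson probability mass, computed in log space for stability."""
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

def gaussian_pdf(k, lam):
    """Gaussian density with the Poisson's mean and variance (both lam)."""
    return math.exp(-(k - lam) ** 2 / (2 * lam)) / math.sqrt(2 * math.pi * lam)

lam = 10_000  # high photon count, as at typical clinical X-ray tube currents
rel_err = abs(poisson_pmf(lam, lam) - gaussian_pdf(lam, lam)) / gaussian_pdf(lam, lam)
# rel_err is on the order of 1e-5: at this count the Gaussian model is excellent
```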
Although application of amplitude binning algorithms in the foregoing example may reduce binning artifacts, since data is not acquired at all breathing amplitudes the images often have some missing slices. Accordingly, image artifacts continue to be present in the reconstructed images.
Modeled rigid 2D motion has been used during image acquisition to alleviate in-plane artifacts in fan-beam CT. However, such rigid 2D motion models are not valid for imaging of the torso, where respiratory-induced motion causes highly non-linear deformation with a significant component in the superior-inferior direction.
Another prior method reconstructs a full 4D time-indexed image using a B-spline motion model and a temporal smoothing condition. Yet other B-spline-based methods require an artifact-free reference image (such as a breath-hold image) in addition to a 4D fan-beam or cone-beam scan. These approaches address difficulties caused by slowly-rotating cone-beam scanners. However, the acquisition of an artifact-free reference image remains impractical for many radiotherapy patients. While the B-spline model guarantees smooth deformations, it cannot guarantee the diffeomorphic properties for large deformations and it does not directly enforce local conservation of tissue volume.
In another previous method, 3D images are reconstructed at arbitrary amplitudes by interpolating each slice from those collected at nearby amplitudes and then stacking them. Two slices are used to interpolate a slice at the desired amplitude using an optical flow algorithm, so only 2D motion can be estimated. A more recent approach has used a 4D cone-beam scan to estimate organ motion using an optical flow approach. The motion estimate is then used to correct for organ motion during subsequent 3D scans on the fly. This method may be useful in reducing artifacts in a 3D image, but the optical flow model, like the B-spline model, does not ensure diffeomorphic incompressible motion estimates.
As early as 1991, an incompressible optical flow method for image registration was used. Some systems have used a spline-based model which penalizes tissue compression to perform incompressible image registration. Other researchers have studied incompressible fluid-based registration of liver computed tomography (CT). An incompressible fluid-based approach solves Poisson's equation via a multigrid method at each iteration. Another efficient Fourier method of incompressible projection applies a result from the continuous domain to discrete data without alteration, which does not accommodate the discrete nature of image data. Despite these efforts in image registration, the incompressible nature of internal organs has not previously been incorporated into the image reconstruction process.
Deformable image registration has been shown to be useful in tracking organ motion in artifact-free 4D RCCT images. Such methods may be used with either phase or amplitude binned images, but are challenged in the presence of binning artifacts. That is, deformable image registration is not well suited for use with respect to 4D RCCT images which include binning artifacts.
From the above, it can be appreciated that reconstruction of images where motion is present is problematic. In particular, motion artifacts remain present despite application of many of the available image reconstruction techniques. Moreover, none of the present image reconstruction techniques guarantee the diffeomorphic properties for large deformations or directly enforce local conservation of object volume.
BRIEF SUMMARY OF THE INVENTION
The present invention is directed to systems and methods which implement an image reconstruction framework to incorporate complex motion of structure, such as complex deformation of an object or objects being imaged, in image reconstruction techniques. Embodiments of image reconstruction frameworks herein are operable with respect to a plurality of image modalities, such as CT, magnetic resonance (MR), positron emission tomography (PET), ultrasound, etc. An image reconstruction framework according to embodiments of the invention utilizes information regarding the nature of motion, information regarding motion tracking, and imaging modality raw image information to jointly provide motion estimation and image reconstruction.
According to embodiments of the invention, systems and methods are provided for reconstructing anatomical images, including one or more organs or other structure (referred to herein generally as objects), of a patient who is being medically imaged. The method may include the operation of collecting raw image data from a medical imaging device configured to obtain anatomical images of a patient's internal anatomy. Motion tracking data can be obtained, wherein such motion tracking data can be related to organ motion as the anatomical images are obtained. In an exemplary embodiment, the motion tracking data may represent breathing motion as measured by a motion sensor associated with the patient. Additionally or alternatively, the motion tracking data may be measured from raw image data acquired from the medical imaging device.
Continuing with the foregoing example, further operation according to embodiments comprises modeling organ deformation motion using the motion tracking data with correlated raw image data and data regarding the nature of motion of the organ (e.g., motion bounding data for the organ). For example, when modeling organ deformation motion, motion tracking data can be used to indicate attributes of organ motion (e.g., motion timing) in order to facilitate modeling of a patient's internal organs and to obtain organ deformations by an estimation of anatomical deformation during image reconstruction. In one embodiment, modeling of the organ deformations includes using data regarding the nature of motion of the organ to enforce conservation of local tissue volume during organ modeling. Further, data regarding the nature of motion of the organ may provide a diffeomorphic motion model of organ deformations for deformation motion modeling according to embodiments herein.
The organ deformation motion can be used in reconstructing anatomical images. For example, an estimated time-indexed 4D image can then be generated using the organ deformation motion, wherein the estimated time-indexed image is viewable by an end user. The organ deformation motion estimation provided according to embodiments of the present invention can thus be used to reduce motion artifacts and/or increase the signal-to-noise ratio in images being reconstructed. Increased signal-to-noise ratio resulting from implementation of organ deformation motion estimation of embodiments facilitates a decrease in the dose of image modality energy (e.g., X-ray energy, magnetic field energy, ultrasound energy, etc.) delivered to the patient for generating an image of desired quality.
The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims. The novel features which are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
BRIEF DESCRIPTION OF THE DRAWING
For a more complete understanding of the present invention, reference is now made to the following descriptions taken in conjunction with the accompanying drawing, in which:
FIG. 1A shows a graph of irregular breathing cycles of a patient for which imaging is provided;
FIG. 1B shows an image reconstructed using a phase binning algorithm for image data collected during the irregular breathing cycles of FIG. 1A;
FIG. 2 shows an imaging system adapted according to an embodiment of the present invention;
FIG. 3 shows an image reconstruction framework of an embodiment of the present invention as may be utilized in the imaging system of FIG. 2;
FIG. 4A shows a graph of motion tracking data as utilized in an exemplary embodiment providing image reconstruction with respect to a phantom;
FIG. 4B shows the stationary phantom of the exemplary embodiment of FIG. 4A imaged with helical CT;
FIG. 5 shows the 4D reconstructed images from a phase binned dataset (top row) and an amplitude binned dataset (bottom row) of the phantom of FIG. 4B;
FIG. 6A shows the 4D reconstructed images generated according to the concepts herein using the same raw data as the phase and amplitude binned images in FIG. 5;
FIGS. 6B and 6C show plots of the posterior indicating the convergence of the embodiment of the concept as applied in FIG. 6A;
FIG. 7A shows a single point in the phantom of FIG. 4B tracked for analyzing displacement versus motion tracking data as shown in FIGS. 7B and 7C;
FIG. 7B shows a graph of motion tracking data as utilized in the analysis of FIG. 7C;
FIG. 7C shows a plot of the estimated displacements versus the motion tracking data of FIG. 7B for the point indicated in FIG. 7A;
FIG. 8 shows a comparison between phase binning reconstructed images (top row) and the 4D reconstructed images according to the concepts herein (bottom row) at peak-exhale, mid-range, and peak-inhale; and
FIG. 9 shows 4D reconstructed images (top row) and log Jacobian determinant images (bottom row) for compressible flow reconstruction (left column) and with incompressibility constraint (right column).
DETAILED DESCRIPTION OF THE INVENTION
Reference will now be made to the exemplary embodiments illustrated in the drawings, and specific language will be used herein to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Alterations and further modifications of the inventive features illustrated herein, and additional applications of the principles of the inventions as illustrated herein, which would occur to one skilled in the relevant art and having possession of this disclosure, are to be considered within the scope of the invention.
Directing attention to FIG. 2, an imaging system adapted according to an embodiment of the present invention is shown as imaging system 200. Imaging system 200 comprises processor based system 210 operatively coupled to transducer 220 for use in collecting imaging data with respect to subject 201. For example, subject 201 may comprise a patient for which imaging of a portion of the patient's body (e.g., head, torso, leg, shoulder, etc.) is desired. Such imaging as provided by imaging system 200 may comprise reconstruction of images of internal body structure (e.g., organs, cavities, bones, etc.) using one or more imaging modality (e.g., CT, MR, PET, ultrasound, etc.).
Processor based system 210 of embodiments comprises a processor (e.g., central processing unit (CPU), application specific integrated circuit (ASIC), etc.), memory (e.g., random access memory (RAM), read only memory (ROM), disk memory, optical memory, etc.), and appropriate input/output (I/O) apparatus (e.g., display, pointing device, keyboard, printer, speaker, microphone, etc.) operable under control of an instruction set (e.g., software, firmware, etc.) defining operation as described herein. Such operation may provide image reconstruction using one or more imaging modality corresponding to the configuration of transducer 220. For example, where a CT imaging modality is utilized transducer 220 may comprise an X-ray source and detector configuration, where a MR imaging modality is utilized transducer 220 may comprise a magnetic field generator and receiver configuration, where a PET imaging modality is utilized transducer 220 may comprise a gamma ray source and detector configuration, and where an ultrasound imaging modality is utilized transducer 220 may comprise an ultrasonic sound source and receiver configuration.
Imaging system 200 of embodiments herein is adapted to utilize an image reconstruction framework to incorporate motion of structure, such as complex deformation of an object or objects being imaged, in an image reconstruction technique implemented thereby. An image reconstruction framework as may be utilized by imaging system 200 is shown in FIG. 3 as image reconstruction framework 300. Image reconstruction framework 300 may be implemented, at least in part, as instructions operable upon the processor of processor based system 210 of imaging system 200.
According to embodiments of the invention, image reconstruction framework 300 utilizes information regarding the nature of motion, information regarding motion tracking, and imaging modality raw image information to jointly provide motion estimation and image reconstruction. Accordingly, image reconstruction framework 300 of the illustrated embodiment includes nature of motion database 310 storing information regarding the nature of motion and image reconstruction processing 330 providing image reconstruction processing as described herein. Motion tracking data, structure/object data, and/or other imaging parameters useful in image reconstruction processing according to embodiments is provided to image reconstruction processing 330 by imaging parameter input 340. Additionally, imaging modality raw image information useful in image reconstruction according to embodiments is provided to image reconstruction processing 330 from imaging modality 320 of processor based system 210.
Image reconstruction framework 300 of the illustrated embodiment may operate with respect to various image modalities, such as CT, magnetic resonance (MR), positron emission tomography (PET), ultrasound, etc., and thus imaging modality 320 may comprise one or more such imaging modality. Such image modalities provide raw image information (e.g., raw image projection data) as is understood in the art. The motion model (e.g., anatomical motion model used with respect to organ movement) does not change from modality to modality according to embodiments, thereby facilitating the utilization of an image reconstruction framework herein with any of a number of imaging modalities. However, algorithms of image reconstruction processing 330 are preferably adapted to reconstruct images from the particular raw image information provided by an imaging modality being used.
As will be more fully appreciated from the discussion which follows, image reconstruction processing 330 of embodiments models structure motion jointly with the image at the time of image reconstruction. Operation of image reconstruction processing 330 preferably accommodates complex deformation of structure, as opposed to rigid motion of structure. For example, image reconstruction processing 330 may model the complex deformation that the anatomy undergoes when the subject (e.g., patient) is breathing, the subject's heart is beating, the subject's gastrointestinal tract is contracting, etc. Such deformation may comprise incompressible organ deformation (e.g., movement of the liver associated with breathing), compressible organ deformation (e.g., movement of the heart associated with its beating), etc. Thus, the image reconstruction framework of embodiments includes a deformation model which models the entire non-linear movement process and builds a movement estimation utilized in image reconstruction.
In facilitating modeling such a non-linear movement process, nature of motion database 310 comprises structure motion parameters (e.g., object motion bounding data) for use in modeling one or more objects. For example, structure motion parameters may comprise information regarding constraints on possible deformations of a medium, particular structure or a particular object, etc. (e.g., limits on or bounds on movement that may be experienced by an object). Such structure motion parameters may comprise boundaries (e.g., upper and/or lower limits) regarding velocity, changes in velocity, etc. for a medium, particular structure, etc. Structure motion parameters may be derived or estimated using mathematical analysis techniques such as b-spline, polynomial-spline, elastic-spline, poly-affine, polynomial, wavelet-based, and/or the like. It can be appreciated that different media, structure, objects, etc., may have different properties associated therewith (e.g., density, rigidity, fluidity, etc.) which affect the movement (e.g., deformation) thereof. By utilizing appropriate structure motion parameters, as may be provided by nature of motion database 310, embodiments of image reconstruction processing 330 may effectively estimate movement of structure within an image being reconstructed. That is, although an exact model of the movement of structure may not be known, limits on how such movement occurs (e.g., how fast, how smooth, etc.) may be known from which structure movement may be estimated herein.
In embodiments of image reconstruction processing 330, a maximum a posteriori (MAP) method is provided for tracking structure motion that uses raw time-stamped image information to reconstruct the images and estimate structure deformations simultaneously. Accordingly, the raw image information (e.g., raw projection data) provided by imaging modality 320 is preferably time-stamped or otherwise adapted for indexing with motion tracking data. For example, time-stamp indexing of raw projection data may be utilized with respect to motion tracking data (e.g., provided by imaging parameter input 340) related to structure motion as the raw image projections are obtained (e.g., organ motion as the subject is being imaged). Having indexed the raw projection data to the motion tracking data, operation of image reconstruction processing 330 of embodiments estimates motion of structure within the image using the aforementioned structure motion parameters. The particular structure motion parameters utilized with respect to structure, an object, an image, etc. may be selected based upon structure data (e.g., provided by imaging parameter input 340). For example, particular structure being imaged, a particular object being imaged, a particular region of the anatomy, a particular media, a particular characteristic (e.g., rigid, fluid filled, gaseous, etc.) may be identified by the structure data for selection of one or more appropriate structure motion parameters. It should be appreciated that the imaging parameters provided by imaging parameter input 340 may be provided by user input, collected by one or more sensor, may be derived from other data, and/or the like.
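The time-stamp indexing described above amounts to evaluating the monitored breathing trace a(t) at each projection's acquisition time; a minimal sketch using linear interpolation, with all names illustrative:

```python
import numpy as np

def tag_projections(projection_times, trace_times, trace_amplitudes):
    """Retrospectively tag each time-stamped projection with a breathing
    amplitude by linearly interpolating the monitored trace a(t)."""
    return np.interp(projection_times, trace_times, trace_amplitudes)
```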
Neither the desired image nor the exact motion of the structure in the image is known at initiation of operation of image reconstruction processing 330 of embodiments. Nevertheless, image reconstruction processing 330 of embodiments implements an alternating optimization algorithm to find the solution to both of these problems at the same time, using the aforementioned information regarding the nature of motion. For example, using equations derived from Navier-Stokes equations, embodiments of image reconstruction processing 330 are able to provide mechanical modeling of tissue which accommodates various different tissue properties, from rigid, incompressible tissue to very compressible, soft tissue. It should be appreciated that the global structure of the equations may be determined a priori and preferably the parameters utilized in solving the aforementioned problems may be optimized during operation of the estimation algorithm (as represented by feedback shown by the dotted line of FIG. 3).
The method eliminates artifacts as it does not rely on a binning process and increases signal-to-noise ratio (SNR) by using all of the collected data. In the case of CT, the increased SNR provides the opportunity to reduce dose to the patient during scanning. This technology also facilitates the incorporation of fundamental physical properties such as the conservation of local tissue volume during the estimation of the organ motion. This method is accurate and robust against noise and irregular breathing for tracking organ motion and reducing artifacts. An improvement in image quality is also demonstrated by application of the method to data from a real liver stereotactic body radiation therapy patient.
As can be appreciated from the foregoing, previous attempts at reducing 4D RCCT motion artifacts do not offer all the advantages of the described method, which incorporates a fully diffeomorphic motion model into the reconstruction process. For instance, rigid 2D motion models are not valid for imaging of the torso, where respiratory-induced motion causes highly non-linear deformation with a significant component in the superior-inferior direction. B-spline-based methods require an artifact-free reference image (such as a breath-hold image) in addition to a 4D fan-beam or cone-beam scan. However, the acquisition of an artifact-free reference image is impractical for many radiotherapy patients. Moreover, while the B-spline model guarantees smooth deformations, it cannot guarantee the diffeomorphic properties for large deformations ensured by the method described herein and it does not directly enforce local conservation of tissue volume. A 4D cone-beam scan used to estimate organ motion using an optical flow approach may be useful in reducing artifacts in a 3D image, but the optical flow model, like the B-Spline model, does not ensure diffeomorphic incompressible motion estimates.
Having described the concepts of the invention broadly, exemplary embodiments will be provided below which provide detail helpful in understanding the concepts herein. The exemplary embodiments presented below are general examples as may be implemented using imaging system 200 of FIG. 2 and image reconstruction framework 300 of FIG. 3. It should be appreciated that the particular exemplary embodiments, and their associated detail, are intended to facilitate an understanding of the concepts of the present invention and do not present limitations regarding various embodiments thereof. For example, although the exemplary embodiments presented herein are discussed with respect to application to signals recorded by breathing monitoring systems, such as spirometry or chest circumference tracking, other motion tracking systems may be utilized. Likewise, although discussed below with respect to application to 4D RCCT of the liver, the methods may be applied to other organ motion, such as cardiac motion using the ECG signal in place of a breathing monitor.
The 4D image reconstruction of embodiments estimates the time-indexed image I(t, x) that best represents the patient's anatomy during image acquisition. In order to obtain a maximum a posteriori estimate of organ motion the data likelihood is derived and a prior model incorporating the physical constraints is preferably defined. The 4D image that maximizes the posterior probability combining the data likelihood and the prior is estimated according to embodiments.
CT image acquisition, as may be utilized according to embodiments of the invention, is described by a projection operator, Pθ, which can represent fan-beam or cone-beam projections at angle θ. At sufficiently high signal-to-noise ratio, the acquisition of a single projection pi is subject to Gaussian noise of variance σ2. The data log-likelihood then becomes
$$\mathcal{L}(\{p_i\}\mid I(t,x)) = -\frac{1}{2\sigma^2}\sum_i \int_S \big|P_{\theta_i}\{I(t_i,x)\}(s) - p_i(s)\big|^2\,ds, \qquad (1)$$
where the integration with respect to s is over a one or two-dimensional domain depending on the projection operator used.
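To make equation 1 concrete, the sketch below evaluates the Gaussian data log-likelihood with the detector integral replaced by a discrete sum over bins. The projection operator here is a toy parallel-beam stand-in that sums a 2D image along an axis; the function names `parallel_project` and `log_likelihood` are illustrative and not part of the described system.

```python
import numpy as np

def parallel_project(image, angle_index):
    # Toy stand-in for the operator P_theta: sum a 2D image along an axis.
    return image.sum(axis=angle_index % 2)

def log_likelihood(image, projections, sigma):
    # Eq. (1): L = -1/(2 sigma^2) * sum_i ||P_theta_i{I} - p_i||^2,
    # with the integral over s replaced by a sum over detector bins.
    total = 0.0
    for i, p_i in enumerate(projections):
        residual = parallel_project(image, i) - p_i
        total += np.sum(residual ** 2)
    return -total / (2.0 * sigma ** 2)
```

With noise-free data the residuals vanish and the log-likelihood attains its maximum value of zero; any mismatch makes it strictly negative.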
Due to the sparsity of the imaging data, it is desirable to constrain the estimation. Specifically, it can be assumed that no metabolic changes or local tissue density variations occur during acquisition and that the only dynamic process is the motion of the anatomy due to breathing. Under this assumption, the 4D image is described by a time-indexed deformation field and a single representative static 3D image, I0(x), as
$$I(t,x) = I_0\big(h(t,x)\big), \qquad (2)$$
where for each time t, h(t, .) is a volume-preserving diffeomorphic deformation capturing the respiratory-induced motion of the underlying anatomy.
The organ motion may be assumed to be correlated with breathing amplitude, so the deformation may be indexed by amplitude alone. Under this assumption, the deformations take the form h(a(t), x). The velocity of a point in the patient's anatomy is then described by the ordinary differential equation
$$\frac{\partial}{\partial t}\,h(a(t),x) = v\big(a(t),\,h(a(t),x)\big)\,\frac{da}{dt}, \qquad (3)$$
where v is indexed by amplitude and may be thought of as a velocity with respect to changes in amplitude rather than time. The deformation from zero amplitude to any other amplitude is given by the associated integral equation
$$h(a,x) = x + \int_0^a v\big(a',\,h(a',x)\big)\,da'. \qquad (4)$$
If the velocities are constrained to be smooth, the resulting estimates of patient anatomy are at all times diffeomorphic to one another. This ensures that organs do not tear or disappear during breathing. The diffeomorphic deformations provide a one-to-one correspondence between points in images from different breathing amplitudes, enabling tracking of tissue trajectories. Smoothness is enforced according to embodiments by introducing a prior on the velocities via a Sobolev norm $\|v\|_V^2$, defined by

$$\|v\|_V^2 = \langle v, v\rangle_V = \int_0^1 \int_{x\in\Omega} \big\|Lv(a,x)\big\|_{\mathbb{R}^3}^2\,dx\,da, \qquad (5)$$
where L is a differential operator chosen to reflect physical tissue properties. The operator L need not be homogeneous; it can be spatially varying, reflecting the different material properties of the underlying anatomy.
Deformations defined by the flow along smoothly-varying vector fields as described in equation 3 have been well studied. In particular, if the divergence of the velocity field is zero, the resulting deformation is guaranteed to preserve volume locally and have unit Jacobian determinant. This is a necessary constraint when modeling the breathing-induced motion of incompressible fluid-filled organs such as the liver. In fact, if L is the Laplacian operator and the velocities are constrained to be divergence-free, the velocities simulate Stokes flow of an incompressible viscous fluid.
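As a toy illustration of this property (not the described method itself), the sketch below Euler-integrates the divergence-free planar field v(x, y) = (−y, x) and estimates the Jacobian determinant of the resulting map by central differences; the determinant stays near unity, up to the Euler discretization error.

```python
import numpy as np

def flow(points, steps=200, dt=0.005):
    # Euler-integrate the divergence-free field v(x, y) = (-y, x).
    pts = np.asarray(points, dtype=float)
    for _ in range(steps):
        v = np.stack([-pts[:, 1], pts[:, 0]], axis=1)
        pts = pts + dt * v
    return pts

def jacobian_det(h, x, eps=1e-5):
    # Central-difference Jacobian determinant of the map h at point x.
    J = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = eps
        J[:, j] = (h(np.array([x + dx]))[0] - h(np.array([x - dx]))[0]) / (2 * eps)
    return np.linalg.det(J)
```

For this field each Euler step multiplies the Jacobian by det(I + dt·A) = 1 + dt², so the accumulated determinant drifts from unity only at first order in the step size; a Runge-Kutta integrator or smaller steps would shrink the drift further.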
With the data log-likelihood and the prior model described above, the log-posterior probability of observing the data becomes
$$\mathcal{L}(I_0, v \mid p_i) = -\|v(a)\|_V^2 - \frac{1}{2\sigma^2}\sum_i \int_S \big|P_{\theta_i}\{I_0\circ h(a_i,\cdot)\}(s) - p_i(s)\big|^2\,ds, \quad \text{subject to } \operatorname{div} v = 0. \qquad (6)$$
Having defined the posterior, the 4D image reconstruction of embodiments can estimate the image and deformations parameterized by the velocity field that maximize equation 6,
$$(\hat I_0, \hat v) = \underset{I_0,\,v}{\arg\max}\;\mathcal{L}(I_0, v \mid p_i) \quad \text{subject to } \operatorname{div} v = 0. \qquad (7)$$
A MAP estimate that maximizes equation 6 may be obtained via an alternating iterative method which, at each iteration, updates the estimate of the deformation in a gradient ascent step and then updates the image using the associated Euler-Lagrange equation. The continuous amplitude-indexed velocity field may be discretized by a set of equally-spaced amplitudes ak, with spacing Δa, and associated velocities υk. Note that this amplitude discretization is independent of the amplitudes at which data are acquired. The deformation from amplitude ak to ak+1 is approximated according to embodiments by the Euler integration of equation 4,
$$h(a_{k+1}, x) = h(a_k, x) + v_k\big(h(a_k, x)\big), \qquad (8)$$
and the deformation for an amplitude ai between ak and ak+1 is linearly interpolated according to embodiments as
$$h(a_i, x) = h(a_k, x) + \frac{a_i - a_k}{\Delta a}\,v_k\big(h(a_k, x)\big). \qquad (9)$$
Note that higher order integration schemes, such as Runge-Kutta, may also be used in place of the simpler Euler method, if desired.
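A minimal sketch of this amplitude discretization follows. Representing each v_k as a callable displacement field, and the name `deform` itself, are illustrative choices rather than the described implementation.

```python
import numpy as np

def deform(points, velocities, a_grid, a_target):
    # h(a_target, x): full Euler steps over the amplitude grid (eq. 8),
    # then a linearly interpolated fractional step (eq. 9).
    # velocities[k] maps current positions to the displacement for step k.
    h = np.asarray(points, dtype=float)
    delta_a = a_grid[1] - a_grid[0]
    k = 0
    while k + 1 < len(a_grid) and a_grid[k + 1] <= a_target:
        h = h + velocities[k](h)   # h(a_{k+1}, x) = h(a_k, x) + v_k(h(a_k, x))
        k += 1
    if k < len(velocities) and a_target > a_grid[k]:
        frac = (a_target - a_grid[k]) / delta_a
        h = h + frac * velocities[k](h)
    return h
```

For a constant displacement of 0.1 per amplitude bin, a target amplitude of 2.5 on the grid {0, 1, 2, 3} yields two full steps plus a half step, i.e. a total displacement of 0.25.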
A first variation of equation 6 with respect to υk under the inner product in equation 5 is given by
$$\delta_{v_k}\mathcal{L}(I_0, v_k \mid p_i) = -2v_k - \frac{1}{\sigma^2}\,(L^\dagger L)^{-1}\sum_i P_{\theta_i}^\dagger\big(P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - p_i\big)\,b_i(k,\cdot), \qquad (10)$$
where bi is the contribution to the variation due to a single projection and Pθi† is the adjoint of the projection operator, which acts by backprojecting the data discrepancy into the 3D volume. The adjoint operators for various imaging geometries have been well studied. For parallel-beam CT geometry, the adjoint projection operator is the familiar backprojection operator; conventional filtered backprojection CT slice reconstruction applies the adjoint operator after filtering the 1D data. Letting Ik(x) = I0 ∘ h(ak, x) denote the 3D reference image pushed forward to amplitude ak, the factors bi are given by
$$b_i(k,x) = \begin{cases} 0 & a_i \le a_k \\[4pt] \dfrac{a_i - a_k}{\Delta a}\,\nabla I_k\!\left(x + \dfrac{a_i - a_k}{\Delta a}\,v_k(x)\right) & a_k < a_i \le a_{k+1} \\[8pt] \big|D\big(h_{a_{k+1}} \circ h_{a_i}^{-1}\big)(x)\big|\;\nabla I_k\big(x + v_k(x)\big) & a_i > a_{k+1} \end{cases} \qquad (11)$$
If the deformations are constrained to be incompressible, implying that the Jacobian determinant is unity, this simplifies to
$$b_i(k,x) = \begin{cases} 0 & a_i \le a_k \\[4pt] \dfrac{a_i - a_k}{\Delta a}\,\nabla I_k\!\left(x + \dfrac{a_i - a_k}{\Delta a}\,v_k(x)\right) & a_k < a_i \le a_{k+1} \\[8pt] \nabla I_k\big(x + v_k(x)\big) & a_i > a_{k+1} \end{cases} \qquad (12)$$
An efficient computation of (L†L)−1 may be implemented in the Fourier domain, requiring only a matrix multiplication and Fourier transforms of υk at each iteration of the algorithm.
The Helmholtz-Hodge decomposition enables the implementation of the incompressibility constraint by simply projecting the unconstrained velocity fields onto the space of divergence-free vector fields at each iteration of the method. In order to implement the Helmholtz-Hodge decomposition of a time-varying velocity field efficiently, a discrete divergence operator that acts in the Fourier domain can be used. The discrete Fourier transform of a central-difference approximation to the derivative of a function f can be written as
$$\mathrm{DFT}\{\Delta_x f\}(\omega) = \mathrm{DFT}\left\{\frac{f(x+k_x) - f(x-k_x)}{2k_x}\right\}(\omega) = \frac{i}{k_x}\,\sin\!\left(\frac{2\pi\omega}{N}\right)\mathrm{DFT}\{f\}(\omega) \qquad (13)$$
In the Fourier domain the divergence of a vector field takes the following form:
$$\mathrm{DFT}\{\operatorname{div} v\}(\omega) = W(\omega)\cdot \mathrm{DFT}\{v\}(\omega), \qquad (14)$$
where
$$W(\omega) = i\begin{pmatrix} \dfrac{1}{k_x}\sin\dfrac{2\pi\omega_x}{N_x} \\[6pt] \dfrac{1}{k_y}\sin\dfrac{2\pi\omega_y}{N_y} \\[6pt] \dfrac{1}{k_z}\sin\dfrac{2\pi\omega_z}{N_z} \end{pmatrix} \qquad (15)$$
The foregoing allows the divergent component to be easily removed in Fourier space via the projection
$$\mathrm{DFT}\{v\}(\omega) \;\leftarrow\; \mathrm{DFT}\{v\}(\omega) - \frac{\overline{W(\omega)}\cdot \mathrm{DFT}\{v\}(\omega)}{\|W(\omega)\|_{\mathbb{C}^3}^2}\,W(\omega) \qquad (16)$$
Since the operator (L†L)−1 is implemented in the Fourier domain according to embodiments, there is little computational overhead in performing this projection at each iteration of the image reconstruction algorithm.
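A minimal NumPy sketch of this Fourier-domain projection follows, assuming unit grid spacing (k_x = k_y = k_z = 1) and periodic boundaries; W is the symbol of the central-difference divergence, and the function name is illustrative.

```python
import numpy as np

def project_divergence_free(v):
    # Remove the divergent component of a vector field v of shape
    # (3, Nx, Ny, Nz), per eq. (16). W is the Fourier symbol of the
    # central-difference derivative, i*sin(2*pi*omega/N), per axis.
    dims = v.shape[1:]
    vhat = np.fft.fftn(v, axes=(1, 2, 3))
    freqs = np.meshgrid(*[np.fft.fftfreq(n) for n in dims], indexing="ij")
    W = 1j * np.sin(2.0 * np.pi * np.array(freqs))
    W2 = np.sum(np.abs(W) ** 2, axis=0)
    W2[W2 == 0] = 1.0                    # leave the zero mode untouched
    coeff = np.sum(np.conj(W) * vhat, axis=0) / W2
    vhat = vhat - coeff * W
    return np.real(np.fft.ifftn(vhat, axes=(1, 2, 3)))
```

After projection, the central-difference divergence of the field vanishes to machine precision, and projecting a second time leaves the field unchanged.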
A first variation of equation 6 with respect to I0 is
$$\delta_{I_0}\mathcal{L}(I_0, v \mid p_i) = \frac{1}{\sigma^2}\sum_i \big|Dh^{-1}(a_i,\cdot)\big|\left(P_{\theta_i}^\dagger P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - P_{\theta_i}^\dagger\, p_i\right)\circ h^{-1}(a_i,\cdot) \qquad (17)$$
If organ motion is slow compared to single-slice acquisition time, individual slices can be reconstructed with minimal motion artifacts using filtered backprojection. In this case the 4D image is estimated from the reconstructed 2D slices Si(x, y). Note that this assumption holds reasonably well in the case of 4D RCCT. Under the slow motion assumption, the velocity field variation becomes
$$\delta_{v_k}\mathcal{L}(I_0, v_k \mid p_i) = -2v_k - \frac{1}{\sigma^2}\,(L^\dagger L)^{-1}\sum_i \big(I_0\circ h(a_i,\cdot) - S_i\big)\,b_i(k,\cdot) \qquad (18)$$
For slice data, Si, and an incompressible deformation estimate equation 17 may be solved by the mean of the deformed data,
$$\hat I_0(x) = \frac{1}{N}\sum_{i\,:\,[h^{-1}(a_i,x)]_z = z_i} S_i\circ h^{-1}(a_i, x), \qquad (19)$$
which is equivalent to solving the Euler-Lagrange equation for equation 6.
Note that as soon as the velocity field is updated, the image estimate should also be updated. The change of image estimate in turn alters the velocity gradients leading to a joint estimation algorithm in which, at each iteration, the velocity fields are updated and then the image is recalculated.
A 4D reconstruction method for slice data according to embodiments of the present invention is set forth in the pseudocode below. In the method of the pseudocode, the velocity fields are initialized to zero, so that the initial estimate of the base image is simply the result of averaging all of the data. This yields a quite blurry image that sharpens upon further iterations as the motion estimate improves.
Pseudocode for 4D reconstruction of slice data
I0 ← 0
for each k do
υk ← 0
end for
repeat
I0 ← (1/N) Σi Si ∘ h−1(ai, x)
for each k do
υk ← υk + εδυ k L(I0, υk)
end for
Perform divergence-free projection on each υk
until algorithm converges or maximum number of iterations reached
In certain embodiments, the organ deformation motion is modeled using multiple graphics processing units (GPUs) in communication with a computer memory storing the anatomical images. Since current computer graphics hardware may include hundreds of separate graphics channels or pipelines, each of these channels can be used in parallel in order to process and reconstruct the anatomical images being used.
In order to provide a more complete explanation of the exemplary method and system described in detail above, an example of applying the described technology will now be provided. While specific imaging machines, processing orders, data, and disease diagnosis will be described, the described systems and methods can use a wide variety of imaging machines and raw data. In addition, it should be appreciated that the methods herein can be used in analyzing any portion of the human anatomy and in diagnosing a large variety of disorders or diseases in the human body.
The following example embodiment was performed using the CIRS anthropomorphic thorax phantom (CIRS Inc., Norfolk, Va.) and a GE Lightspeed RT scanner (GE Health Care, Waukesha, Wis.). The phantom includes a simulated chest cavity with a 2 cm spherical object representing a tumor that is capable of moving in three dimensions. A chest marker is also included in the phantom which moves in a pattern synchronized to the tumor and allows simulation of a real patient 4D RCCT scan. The scans used in this study were driven to simulate a breathing trace collected from a real patient.
The recorded RPM trace (motion tracking data) of the aforementioned experimental setup is shown in FIG. 4A while the stationary spherical CIRS lung tumor phantom imaged with helical CT is shown in FIG. 4B. The 4D phase binned dataset of the CIRS phantom generated by the GE Advance Workstation is shown in the top row of FIG. 5. Specifically, the top row of FIG. 5 shows phase binned images at three different phases for the high SNR data (left) and the low SNR (10% tube current) data (right). Binning artifacts, including mismatched slices in the phase binned data when compared with the image of the stationary phantom, can readily be seen in the reconstructed phase binned images of FIG. 5. Shown in the bottom row of FIG. 5 are images from an amplitude binned dataset at peak-inhale, mid-range amplitude, and peak-exhale for both the high and low SNR data. The reconstructed images from the amplitude binned dataset do not show signs of mismatched slices and more closely resemble the static phantom image but suffer from missing data artifacts.
Because access to the raw projection data was not available in performing the experimentation of this example (i.e., the raw image projection data was not available from the GE Lightspeed RT scanner), the slow motion assumption described above was applied when using the CINE slice data along with the recorded RPM trace in the 4D reconstruction algorithm. In order to demonstrate robustness against noise, an initial scan was taken with an X-ray tube current of 250 mA and then repeated with a tube current of 25 mA. FIG. 6A shows the 4D reconstructed images generated according to the concepts herein using the same raw data as the phase and amplitude binned images in FIG. 5. Specifically, FIG. 6A shows 4D reconstructed images of the phantom at end-inhale, mid-range, and end-exhale amplitudes. The top row shows the 4D reconstruction of the high SNR data, while the bottom row shows that of the low SNR data. It should be appreciated that the reconstructed image as shown in FIG. 6A does not have any artifacts associated with either the phase binning or amplitude binning. Moreover, there is an increase in SNR in the 4D reconstructed images. For example, reconstructed 4D images from 25 mA data have higher signal-to-noise ratio (SNR=76.5) than binned images reconstructed using 250 mA data (SNR=53.9). The similarity in images between the two 4D reconstructions shows the robustness of the image estimation to increasing noise. FIGS. 6B and 6C show plots of the posterior indicating the convergence of the method.
To validate the estimated deformation model, a single point at the center of the phantom indicated by the cross hair in FIG. 7A was tracked by integrating the estimated velocity fields according to equation 4. The physical construction of the phantom dictates that the superior-inferior displacement is linearly correlated to the RPM signal. Shown in FIG. 7C is a plot of the estimated displacements versus the RPM signal (motion tracking data) of FIG. 7B. As can readily be appreciated from the graph of FIG. 7C, there is an excellent linear correlation (r=0.9988) between the estimated displacements and the RPM signal, thus validating the deformation estimation process.
Having described an example of applying the described technology using the CIRS anthropomorphic thorax phantom, an example of applying the described technology to a real patient is described below to further facilitate an understanding of the concepts herein. Specifically, a 4D reconstruction method as described herein was also applied to data collected from a real patient undergoing hypo-fractionated radiation therapy treatment of the liver. FIG. 8 provides a comparison between phase binning reconstructed images (top row) and the 4D reconstructed images according to the concepts herein (bottom row) at peak-exhale, mid-range, and peak-inhale. In addition to improving SNR, slice mismatch artifacts are absent in the 4D reconstructed image as is readily apparent in FIG. 8.
In this exemplary embodiment, the 4D reconstruction algorithm was run with and without the incompressibility constraint. Analysis of the incompressibility projection is shown in FIG. 9. Specifically, FIG. 9 shows 4D reconstructed images (top row) and log Jacobian determinant images (bottom row) for compressible flow reconstruction (left column) and with the incompressibility constraint (right column). Negative log Jacobian values indicate local compression, while positive values indicate expansion. The reconstructed images are extremely similar, while the Jacobian maps (bottom row) are quite different. In particular, it is seen that without the incompressibility constraint, the estimated motion indicates compression and expansion of the top and bottom of the liver, while the incompressible reconstruction shows no local expansion or contraction. This illustrates the fact that although the two methods produce very similar images, the motion estimates are quite different. Given that the liver is a blood-filled organ, it physiologically does not undergo any appreciable local changes in volume due to breathing. This exemplifies the usefulness of incorporating incompressibility into the reconstruction process.
While exemplary embodiments of the present technology have been described in terms of using certain types of medical scanners, the technology can be used with any type of medical scanner that is currently available or which may become available that can provide images of a patient's external or internal anatomy. Examples of medical imaging scanners that can be used with the present technology include but are not limited to: PET (Positron Emission Tomography) scanning, CT (computed tomography) scanning, combined PET/CT scanning, MRI (Magnetic Resonance Imaging) scanning, radio frequency (RF) scanning, millimeter wave scanning, holographic scanning, conebeam scanners, other nuclear medicine oriented types of scanning devices, and/or the like. The following discussion provides detail with respect to application of the concepts herein to several of the foregoing variations of medical scanners.
In providing 4D conebeam image reconstruction according to an embodiment of the invention, start with the log posterior
$$\mathcal{L}(I_0, v \mid p_i) = -\|v(a)\|_V^2 - \frac{1}{2\sigma^2}\sum_i \int_S \big|P_{\theta_i}\{I_0\circ h(a_i,\cdot)\}(s) - p_i(s)\big|^2\,ds, \qquad (20)$$
with or without the constraint div υ = 0. Note now that s ∈ R² and the projection operator is a conebeam projection.
A first variation of equation 20 with respect to υk is given by
$$\delta_{v_k}\mathcal{L}(I_0, v_k \mid p_i) = -2v_k - \frac{1}{\sigma^2}\,(L^\dagger L)^{-1}\sum_i \Big[P_{\theta_i}^\dagger\big(P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - p_i\big)\Big]\circ h_{a_i}^{-1}(a_k,\cdot)\; b_i(k,\cdot), \qquad (21)$$

where

$$b_i(k,x) = \begin{cases} 0 & a_i \le a_k \\[4pt] \dfrac{a_i - a_k}{\Delta a}\,\nabla I_k\!\left(x + \dfrac{a_i - a_k}{\Delta a}\,v_k(x)\right) & a_k < a_i \le a_{k+1} \\[8pt] \big|D\big(h_{a_{k+1}} \circ h_{a_i}^{-1}\big)(x)\big|\;\nabla I_k\big(x + v_k(x)\big) & a_i > a_{k+1} \end{cases} \qquad (22)$$
A first variation of equation 20 with respect to I0 is
$$\delta_{I_0}\mathcal{L}(I_0, v \mid p_i) = \frac{1}{\sigma^2}\sum_i \big|Dh^{-1}(a_i,\cdot)\big|\left(P_{\theta_i}^\dagger P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - P_{\theta_i}^\dagger\, p_i\right)\circ h^{-1}(a_i,\cdot). \qquad (23)$$
Notice that the image variation is computed by forming the (Pθi†Pθi{I0 ∘ h(ai,·)} − Pθi†pi) difference images and then splatting them back to a = 0 and summing. The difference image calculation is expensive and must be done for every projection. Note also that the velocity variations are computed by forming all difference images, adding them (splatted) to all past velocity field body forces, and then applying the Green's function to the summed body force.
The foregoing means both variations can be computed efficiently simultaneously. For each projection pi, the image is pushed forward to amplitude ai (cascaded interpolation). Then the difference image is obtained (CUDA projection/adjoint operation). This difference image is then pulled back to each amplitude discretization (splatting), where it is multiplied by the image gradient and added to the body force for that velocity field gradient. Finally, when it is pulled back to the base amplitude it is also added to the image gradient.
In deriving a formula for Pθ{I}(u,υ), assume that the source/detector rotate in a circular (as opposed to helical/spiral) trajectory about the center of the volume and that the X-ray source is located a distance dx from the center of rotation while the detector plane is at a distance dp. The projection angle θ is defined as the angle between the source position and the positive x axis.
It is important to recognize that helical conebeam simply amounts to shifting the input image before projection, according to the z value at which the projection was obtained. It should be appreciated that fanbeam CT acquisition is identical to conebeam acquisition except that the detector image has only one row of detectors at u=0. Thus, helical fanbeam CT can likewise be implemented by shifting the input. The method introduced here, therefore, encompasses variations of fanbeam or conebeam CT acquisition.
Assume the center of rotation is 0 ∈ R³ and that the detector rotates in the xy plane only. The source position depends on θ; call it sθ = (dx cos θ, dx sin θ, 0) ∈ R³. The position (index) in the projection image is denoted by (u, υ) ∈ R², with the point (u, υ) = 0 collinear with both sθ and 0.
The projection image has value at each position equal to the integral of the image from the source position to the detector position. The detector position can be determined in world coordinates from (u, υ) via
$$\omega_\theta(u, v) = \begin{pmatrix} -d_p\cos\theta - u\sin\theta \\ -d_p\sin\theta + u\cos\theta \\ v \end{pmatrix}$$
Now the projection is given by
$$P_\theta\{I\}(u,v) = \int_{l_{\min}}^{l_{\max}} \cos^3\gamma(u,v)\; I\!\left(s_\theta + l\,\frac{\omega_\theta(u,v) - s_\theta}{\|\omega_\theta(u,v) - s_\theta\|}\right)\,dl, \qquad (24)$$
where lmin and lmax are the minimum and maximum distances of the image domain to the source as it rotates, and γ(u, υ) is the angle between the line from the source to the plane's midpoint and the ray from the source to the point (u, υ). The cosine term comes from the 1/r² falloff of the radiation and from the tilting of the detector elements. Note that cos γ takes the convenient form
$$\cos\gamma(u,v) = \frac{d}{\sqrt{u^2 + v^2 + d^2}}$$
For simplicity the transformation rθ: R² × [lmin, lmax] → R³ is introduced, which is given by
$$r_\theta(u,v,l) = s_\theta + l\,\frac{\omega_\theta(u,v) - s_\theta}{\|\omega_\theta(u,v) - s_\theta\|} = \begin{pmatrix} d_x\cos\theta \\ d_x\sin\theta \\ 0 \end{pmatrix} + \frac{l}{\sqrt{u^2+v^2+d^2}}\begin{pmatrix} -d\cos\theta - u\sin\theta \\ -d\sin\theta + u\cos\theta \\ v \end{pmatrix} \qquad (25)$$
Now the projection has the simple form
$$P_\theta\{I\}(u,v) = \int_{l_{\min}}^{l_{\max}} \frac{d^3}{(u^2+v^2+d^2)^{3/2}}\; I\circ r_\theta(u,v,l)\,dl. \qquad (26)$$
For backprojection, a formula for Pθ†{p}(x, y, z) may be derived. Note that in order to avoid accumulating contributions that are never projected, the backprojection should be limited to be zero in areas that are never integrated. At each point (x, y, z) ∈ Ω, the furthest source position may be found and checked to determine whether it is within lmax of the point. This may be performed by checking whether $(d_x + \sqrt{x^2 + y^2})^2 + z^2 < l_{\max}^2$.
The Jacobian of the transformation rθ(u,υ,l) should be found. The derivatives are
$$\frac{\partial r_{\theta,x}}{\partial u} = \frac{-l\sin\theta\,(v^2+d^2)}{(u^2+v^2+d^2)^{3/2}} \qquad (27)$$
$$\frac{\partial r_{\theta,y}}{\partial u} = \frac{l\cos\theta\,(v^2+d^2)}{(u^2+v^2+d^2)^{3/2}} \qquad (28)$$
$$\frac{\partial r_{\theta,z}}{\partial u} = \frac{-l\,uv}{(u^2+v^2+d^2)^{3/2}} \qquad (29)$$
$$\frac{\partial r_{\theta,x}}{\partial v} = \frac{l\,uv\sin\theta}{(u^2+v^2+d^2)^{3/2}} \qquad (30)$$
$$\frac{\partial r_{\theta,y}}{\partial v} = \frac{-l\,uv\cos\theta}{(u^2+v^2+d^2)^{3/2}} \qquad (31)$$
$$\frac{\partial r_{\theta,z}}{\partial v} = \frac{l\,(u^2+d^2)}{(u^2+v^2+d^2)^{3/2}} \qquad (32)$$
$$\frac{\partial r_{\theta,x}}{\partial l} = \frac{(-d\cos\theta - u\sin\theta)\sqrt{u^2+v^2+d^2}}{(u^2+v^2+d^2)^{3/2}} \qquad (33)$$
$$\frac{\partial r_{\theta,y}}{\partial l} = \frac{(-d\sin\theta + u\cos\theta)\sqrt{u^2+v^2+d^2}}{(u^2+v^2+d^2)^{3/2}} \qquad (34)$$
$$\frac{\partial r_{\theta,z}}{\partial l} = \frac{v\sqrt{u^2+v^2+d^2}}{(u^2+v^2+d^2)^{3/2}} \qquad (35)$$
Putting these together, the Jacobian matrix is given by

$$Dr(u,v,l) = \frac{1}{(u^2+v^2+d^2)^{3/2}}\begin{pmatrix} -l\sin\theta\,(v^2+d^2) & l\cos\theta\,(v^2+d^2) & -l\,uv \\ l\,uv\sin\theta & -l\,uv\cos\theta & l\,(u^2+d^2) \\ (-d\cos\theta - u\sin\theta)\sqrt{u^2+v^2+d^2} & (-d\sin\theta + u\cos\theta)\sqrt{u^2+v^2+d^2} & v\sqrt{u^2+v^2+d^2} \end{pmatrix} \qquad (36)$$
Taking the determinant, it can be seen that

$$\det Dr(u,v,l) = \frac{-l^2 d^3}{(u^2+v^2+d^2)^{5/2}} \qquad (37)$$
Since an expression for Pθ{I}(u,υ) is available, the adjoint may be found by rearranging:
$$\big\langle P_\theta^\dagger\{p\}, I\big\rangle_{L^2(\mathbb{R}^3)} = \big\langle p, P_\theta\{I\}\big\rangle_{L^2(\mathbb{R}^2)} \qquad (38)$$
$$= \int_u\!\int_v p(u,v)\int_{l_{\min}}^{l_{\max}} \frac{d^3}{(u^2+v^2+d^2)^{3/2}}\, I\circ r_\theta(u,v,l)\,dl\,du\,dv \qquad (39)$$
$$= \int_u\!\int_v\!\int_{l_{\min}}^{l_{\max}} p(u,v)\,\frac{d^3}{(u^2+v^2+d^2)^{3/2}}\, I\circ r_\theta(u,v,l)\,\frac{|\det Dr_\theta(u,v,l)|}{|\det Dr_\theta(u,v,l)|}\,dl\,du\,dv \qquad (40)$$
$$= \int_x \frac{p\big(u(x),v(x)\big)}{\big|\det Dr_\theta\big(u(x),v(x),l(x)\big)\big|}\,\frac{d^3}{\big(u(x)^2+v(x)^2+d^2\big)^{3/2}}\, I(x)\,dx \qquad (41)$$
$$= \int_x p\big(u(x),v(x)\big)\,\frac{u(x)^2+v(x)^2+d^2}{l(x)^2}\, I(x)\,dx. \qquad (42)$$
The foregoing uses u(x), υ(x), l(x) to denote the components of rθ−1(x).
For static gradient descent with optimal step size the noise may be modeled as Gaussian. Thus, the following is to be minimized
$$J(I) = \sum_i \int_u\!\int_v \big[P_{\theta_i}\{I\}(u,v) - p_i(u,v)\big]^2\,du\,dv \qquad (43)$$
with respect to the image I ∈ L²(R³).
Supposing a current image estimate Ik is available and Ik+1 is to be computed, a first variation of J with respect to I at Ik is
$$\delta_I J(I_k) = 2\sum_i P_{\theta_i}^\dagger\big\{P_{\theta_i}\{I_k\} - p_i\big\}$$
Now the updated image is simply Ik+1 = Ik − α δIJ(Ik), where α is some step size. It is preferred that the optimal α be chosen. To do this, consider the updated cost function.
$$J(I_{k+1}) = \sum_i \int_u\!\int_v \big[P_{\theta_i}\{I_k - \alpha\,\delta_I J(I_k)\}(u,v) - p_i(u,v)\big]^2\,du\,dv \qquad (44)$$
The derivative of this with respect to α is taken in order to optimize:
$$\frac{\partial J(I_{k+1})}{\partial\alpha} = -\sum_i \int_u\!\int_v 2\big[P_{\theta_i}\{I_k - \alpha\,\delta_I J(I_k)\}(u,v) - p_i(u,v)\big]\, P_{\theta_i}\{\delta_I J(I_k)\}(u,v)\,du\,dv \qquad (45)$$
$$= -2\sum_i \big\langle P_{\theta_i}\{I_k\} - \alpha P_{\theta_i}\{\delta_I J(I_k)\} - p_i,\; P_{\theta_i}\{\delta_I J(I_k)\}\big\rangle_{L^2(\mathbb{R}^2)} \qquad (46)$$
$$= -2\sum_i \big\langle P_{\theta_i}\{I_k\} - p_i,\; P_{\theta_i}\{\delta_I J(I_k)\}\big\rangle_{L^2(\mathbb{R}^2)} + 2\alpha\sum_i \big\langle P_{\theta_i}\{\delta_I J(I_k)\},\; P_{\theta_i}\{\delta_I J(I_k)\}\big\rangle_{L^2(\mathbb{R}^2)} \qquad (47)$$
$$= -2\sum_i \big\langle P_{\theta_i}^\dagger\big(P_{\theta_i}\{I_k\} - p_i\big),\; \delta_I J(I_k)\big\rangle_{L^2(\mathbb{R}^3)} + 2\alpha\sum_i \big\langle P_{\theta_i}^\dagger P_{\theta_i}\{\delta_I J(I_k)\},\; \delta_I J(I_k)\big\rangle_{L^2(\mathbb{R}^3)} \qquad (48)$$
$$= -\big\|\delta_I J(I_k)\big\|^2_{L^2(\mathbb{R}^3)} + 2\alpha\sum_i \big\langle P_{\theta_i}^\dagger P_{\theta_i}\{\delta_I J(I_k)\},\; \delta_I J(I_k)\big\rangle_{L^2(\mathbb{R}^3)} \qquad (49)$$
Clearly this is zero when
$$\alpha = \frac{\big\|\delta_I J(I_k)\big\|^2_{L^2(\mathbb{R}^3)}}{2\sum_i \big\langle P_{\theta_i}^\dagger P_{\theta_i}\{\delta_I J(I_k)\},\; \delta_I J(I_k)\big\rangle_{L^2(\mathbb{R}^3)}} \qquad (50)$$
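For intuition, the optimal-step iteration of equations 44 through 50 can be sketched with small dense matrices standing in for the projection operators. This is an illustration only (real CT operators are applied matrix-free), and the function name `descend` is hypothetical.

```python
import numpy as np

def descend(P_list, p_list, x0, iters=500):
    # Steepest descent on J(I) = sum_i ||P_i I - p_i||^2 with the optimal
    # step size of eq. (50):
    #   alpha = ||g||^2 / (2 * sum_i <P_i^T P_i g, g>),  g = grad J.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = 2.0 * sum(P.T @ (P @ x - p) for P, p in zip(P_list, p_list))
        denom = 2.0 * sum(g @ (P.T @ (P @ g)) for P in P_list)
        if denom == 0.0:
            break                  # nothing left to minimize along g
        x = x - (g @ g) / denom * g
    return x
```

Because the step size is optimal for this quadratic objective, each iteration is an exact line search and the cost decreases monotonically.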
A 4D reconstruction method for conebeam data according to embodiments of the present invention is set forth in the pseudocode below. It should be appreciated that "streakdiff" is used in the below pseudocode for the velocities and the image update; this is where the optimization (and the GPU) come in.
Pseudocode for 4D reconstruction of conebeam data
I0←Feldkamp ({pi}i)
for each k do
 υk←0
end for
repeat
 for each projection i do
  Compute streakdiff[i] = Pθi†(Pθi{I0 ∘ h(ai, ·)} − pi)
 end for
 for each time step k do
  υk ← υk + ε δυ kL(I0k)
 end for
 I0 ← I0 − δI 0L(I0,υ)
 Perform divergence-free projection on each υk
until algorithm converges or maximum number of iterations reached
As another example, in providing 4D MRI image reconstruction according to an embodiment of the invention, start with the log posterior
$$\mathcal{L}(I_0, v \mid p_i) = -\|v(a)\|_V^2 - \frac{1}{2\sigma^2}\sum_i \int_S \big|P_{\theta_i}\{I_0\circ h(a_i,\cdot)\}(s) - p_i(s)\big|^2\,ds, \qquad (51)$$
with or without the constraint div υ = 0. Note now that s ∈ Ωd is a point in the data domain, which is some subspace of R³. If enough temporal resolution exists, each sample point in k-space can be treated as an individual projection. This may not be practical since the time between samples in the readout direction is typically a few milliseconds. More commonly, Ωd is a line or plane in R³. The case of lines will be considered in the following example. However, it should be appreciated that planar k-space data may be implemented easily as simply a collection of lines.
A first variation of equation 51 with respect to υk is given by
$$\delta_{v_k}\mathcal{L}(I_0, v_k \mid p_i) = -2v_k - \frac{1}{\sigma^2}\,(L^\dagger L)^{-1}\sum_i \Big[P_{\theta_i}^\dagger\big(P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - p_i\big)\Big]\circ h_{a_i}^{-1}(a_k,\cdot)\; b_i(k,\cdot), \qquad (52)$$

where

$$b_i(k,x) = \begin{cases} 0 & a_i \le a_k \\[4pt] \dfrac{a_i - a_k}{\Delta a}\,\nabla I_k\!\left(x + \dfrac{a_i - a_k}{\Delta a}\,v_k(x)\right) & a_k < a_i \le a_{k+1} \\[8pt] \big|D\big(h_{a_{k+1}} \circ h_{a_i}^{-1}\big)(x)\big|\;\nabla I_k\big(x + v_k(x)\big) & a_i > a_{k+1} \end{cases} \qquad (53)$$
A first variation of equation 51 with respect to I0 is
$$\delta_{I_0}\mathcal{L}(I_0, v \mid p_i) = \frac{1}{\sigma^2}\sum_i \big|Dh^{-1}(a_i,\cdot)\big|\left(P_{\theta_i}^\dagger P_{\theta_i}\{I_0\circ h(a_i,\cdot)\} - P_{\theta_i}^\dagger\, p_i\right)\circ h^{-1}(a_i,\cdot). \qquad (54)$$
Notice that the image variation is computed by forming the (Pθi†Pθi{I0 ∘ h(ai,·)} − Pθi†pi) difference images and then splatting them back to a = 0 and summing. The difference image calculation is expensive and must be done for every projection. Note also that the velocity variations are computed by forming all difference images, adding them (splatted) to all past velocity field body forces, and then applying the Green's function to the summed body force.
The foregoing means both variations can be computed efficiently simultaneously. For each projection pi, the image is pushed forward to amplitude ai (cascaded interpolation). Then the difference image is obtained (CUDA projection/adjoint operation). This difference image is then pulled back to each amplitude discretization (splatting), where it is multiplied by the image gradient and added to the body force for that velocity field gradient. Finally, when it is pulled back to the base amplitude it is also added to the image gradient.
In deriving a formula for Pi{I}(s), assume s ∈ Ωd, with Ωd some set of points in R³. Further assume it is a line in the x direction, as this is generally the readout direction for a standard gradient echo scan. The position of the line in the Fourier domain is given by (υi, ωi). In MRI the Fourier transform of the anatomy is measured directly, so that the projection operator takes a very simple form.
$$P_i\{I\}(u) = \mathcal{F}_{3D}\{I\}(u, v_i, \omega_i). \qquad (55)$$
Explicitly this is
$$P_i\{I\}(u) = \int_x\!\int_y\!\int_z I(x,y,z)\,e^{-2\pi i(ux + v_i y + \omega_i z)}\,dx\,dy\,dz \qquad (56)$$
For backprojection, in the case that Ωd is a line as we described previously, the inner product in L2(R) may be used to derive the adjoint operator.
$$\big\langle I, P_i^\dagger\{p\}\big\rangle_{L^2(\mathbb{R}^3)} = \big\langle P_i\{I\}, p\big\rangle_{L^2(\mathbb{R})} \qquad (57)$$
$$= \int_u P_i\{I\}^*(u)\, p(u)\,du \qquad (58)$$
$$= \int_u \int_x\!\int_y\!\int_z I(x,y,z)^*\,e^{2\pi i(ux + v_i y + \omega_i z)}\,dx\,dy\,dz\; p(u)\,du \qquad (59)$$
$$= \int_x\!\int_y\!\int_z I(x,y,z)^* \int_u e^{2\pi i(ux + v_i y + \omega_i z)}\, p(u)\,du\,dx\,dy\,dz \qquad (60)$$
$$= \Big\langle I,\; \int_u e^{2\pi i(ux + v_i y + \omega_i z)}\, p(u)\,du\Big\rangle_{L^2(\mathbb{R}^3)}. \qquad (61)$$
The formula for the adjoint operator may thus be seen as
$$P_i^\dagger\{p\}(x,y,z) = \int_u e^{2\pi i(ux + v_i y + \omega_i z)}\, p(u)\,du \qquad (62)$$
$$= e^{2\pi i(v_i y + \omega_i z)}\,\mathcal{F}_u^{-1}\{p(u)\}(x) \qquad (63)$$
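In a discrete setting, the forward operator of equation 55 and the adjoint of equations 62 and 63 reduce to FFT manipulations. The sketch below assumes NumPy's unnormalized DFT convention, under which the adjoint of the forward FFT is the inverse FFT scaled by the number of samples; the function names and index-based line selection are illustrative.

```python
import numpy as np

def forward(I, vi, wi):
    # P_i{I}(u): one k-space line of the 3D DFT of I (eq. 55), with the
    # readout along the x axis and (vi, wi) indexing the phase encodes.
    return np.fft.fftn(I)[:, vi, wi]

def adjoint(p, shape, vi, wi):
    # P_i^dagger{p}: embed the line in an otherwise-zero k-space array
    # and apply the inverse DFT, a discrete analogue of eqs. (62)-(63).
    K = np.zeros(shape, dtype=complex)
    K[:, vi, wi] = p
    # np.fft.ifftn divides by the sample count, so rescale to obtain the
    # true adjoint of the unnormalized forward DFT.
    return np.fft.ifftn(K) * np.prod(shape)
```

The pair satisfies the defining adjoint identity ⟨Pi{I}, p⟩ = ⟨I, Pi†{p}⟩ up to floating-point rounding.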
A 4D reconstruction method for MRI data according to embodiments of the present invention is set forth in the pseudocode below. “Streakdiff” is again used in the below pseudocode for the velocities and the image update.
Pseudocode for 4D reconstruction of MRI data
I0←Feldkamp ({pi}i)
for each k do
 υk←0
end for
repeat
 for each projection i do
  Compute streakdiff[i] = Pθi†(Pθi{I0 ∘ h(ai, ·)} − pi)
 end for
 for each time step k do
  υk ← υk + ε δυ kL(I0k)
 end for
 I0 ← I0 − δI 0L(I0,υ)
 Perform divergence-free projection on each υk
until algorithm converges or maximum number of iterations reached
It is to be understood that the above-referenced arrangements are only illustrative of the application for the principles of the present invention. Numerous modifications and alternative arrangements can be devised without departing from the spirit and scope of the present invention. While the present invention has been shown in the drawings and fully described above with particularity and detail in connection with what is presently deemed to be the most practical and preferred embodiment(s) of the invention, it will be apparent to those of ordinary skill in the art that numerous modifications can be made without departing from the principles and concepts of the invention as set forth herein.
Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present invention, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims (31)

What is claimed is:
1. A method comprising:
collecting raw image data for a subject being imaged, wherein the subject being imaged comprises at least one structure;
obtaining motion tracking data associated with the subject being imaged;
receiving estimated complex deformation motion parameters relevant to the at least one structure;
reconstructing an estimated image of the subject being imaged from the raw image data by modeling structure deformation motion of the at least one structure using the motion tracking data and the received complex deformation motion parameters; and
indexing the obtained motion tracking data with the collected raw image data,
wherein said reconstructing an estimated image includes utilizing the indexed data and complex deformation parameters to provide a 4D image reconstruction.
2. The method of claim 1, wherein the raw image data is provided as raw projection data from an imaging modality.
3. The method of claim 2, wherein the imaging modality comprises an imaging modality selected from the group consisting of: a positron emission tomography (PET) imaging modality; a computed tomography (CT) imaging modality; a combined PET/CT imaging modality; a magnetic resonance imaging (MRI) imaging modality; an ultrasound imaging modality; a radio frequency (RF) imaging modality; a millimeter wave imaging modality; a holographic imaging modality; and a nuclear medicine imaging modality.
4. The method of claim 1, wherein the complex deformation motion parameters comprise constraints on possible deformations of the at least one structure.
5. The method of claim 4, wherein the complex deformation motion parameters comprise boundaries regarding at least one of velocity and a change in velocity for the at least one structure.
6. The method of claim 1, wherein the reconstructing an estimated image comprises:
implementing a maximum a posteriori (MAP) technique using the raw image data to reconstruct images and estimate structure deformations simultaneously.
7. The method of claim 1, wherein the reconstructing an estimated image comprises:
implementing an alternating optimization algorithm to optimize the modeling structure deformation and to optimize the estimated image.
8. The method of claim 1, wherein the subject being imaged comprises a patient, the at least one structure comprises an anatomical structure of the patient, and the motion tracking data is associated with movement of the anatomical structure during a time in which the raw image data is collected.
9. The method of claim 1, wherein the estimating complex deformation motion parameters comprises structure motion parameters estimated using mathematical analysis techniques selected from the group consisting of: b-spline; polynomial-spline; elastic-spline; poly-affine; polynomial; and wavelet based.
10. A system comprising:
a database of structure motion parameters relevant to at least one structure;
a processor adapted to provide image reconstruction processing resulting in output of an estimated image of a subject being imaged from raw image data by modeling structure deformation motion of the at least one structure using the motion tracking data and structure motion parameters for the at least one structure obtained from the database,
wherein the obtained motion tracking data is indexed with the collected raw image data, and
wherein said reconstruction processing resulting in output of an estimated image includes the indexed data and structure motion parameters to provide a 4D image reconstruction.
11. The system of claim 10, further comprising:
an imaging modality operable to provide the raw image data for the subject being imaged.
12. The system of claim 11, wherein the imaging modality comprises an imaging modality selected from the group consisting of: a positron emission tomography (PET) imaging modality; a computed tomography (CT) imaging modality; a combined PET/CT imaging modality; a magnetic resonance imaging (MRI) imaging modality; an ultrasound imaging modality; a radio frequency (RF) imaging modality; a millimeter wave imaging modality; a holographic imaging modality; and a nuclear medicine imaging modality.
13. The system of claim 10, wherein the database, the processor, and the imaging modality are provided as part of an imaging system.
14. The system of claim 10, further comprising:
an imaging parameter input providing input of one or more imaging parameters to the processor for use in the image reconstruction processing.
15. The system of claim 14, wherein the one or more imaging parameters include motion tracking data used with the structure motion parameters for modeling structure deformation motion by the processor.
16. The system of claim 10, wherein the structure motion parameters comprise estimated structure motion bounding parameters.
17. A method for reconstructing anatomical images of a patient being medically imaged, the method comprising:
collecting raw image data from a medical imaging device configured to obtain anatomical images of a patient's internal anatomy;
receiving a motion tracking signal, wherein the motion tracking signal is related to organ motion as the anatomical images are obtained;
receiving characteristic complex deformation parameters from a nature of motion database relating to an organ being imaged;
modeling organ deformation motion for the organ being imaged using the motion tracking signal and the received characteristic complex deformation parameters;
reconstructing an estimated 4D image of the patient's internal anatomy using the modeled organ deformation motion; and
indexing the collected raw image data with the received motion tracking signal.
18. The method of claim 17, wherein the estimated 4D image is displayed for viewing by an end user.
19. The method of claim 17, wherein the organ deformation motion is used to reduce motion artifacts in the anatomical images being reconstructed.
20. The method of claim 17, wherein the organ deformation motion is used to increase a signal-to-noise ratio in the anatomical images being reconstructed.
21. The method of claim 17, wherein the organ deformation motion is used to decrease a dose of image modality energy delivered to the patient.
22. The method of claim 17, wherein modeling the organ deformation motion comprises:
using the motion tracking signal representing organ motion to model a patient's internal organs; and
obtaining organ deformations by an estimation of anatomical deformation during image reconstruction.
23. The method of claim 17, further comprising:
estimating structure motion parameters relevant to one or more organs of the patient's internal anatomy from which the raw image data is collected, wherein modeling the organ deformation motion includes using the structure motion parameters with the motion tracking signal to model organ deformation.
24. The method of claim 23, wherein the structure motion parameters comprise structure motion bounding parameters.
25. The method of claim 23, wherein modeling of the organ deformations further includes enforcing conservation of local tissue volume during organ modeling.
26. The method of claim 23, wherein the deformation motion is modeled using a diffeomorphic motion model of organ deformations.
27. The method of claim 23, wherein the deformation motion is modeled using movement velocities of the object motion bounding data.
28. The method of claim 17, wherein the organ deformation motion is modeled using multiple graphics processing units (GPU) in communication with a computer memory storing the anatomical images.
29. The method of claim 17, wherein the motion tracking signal represents breathing motion as measured by a motion sensor with the patient.
30. The method of claim 17, wherein the motion tracking signal is derived from the raw image data acquired from the medical imaging device.
31. The method of claim 17, wherein each iteration of computing the images updates the estimate of the deformation in a gradient ascent step and then updates the image using the associated Euler-Lagrange equation.
US13/381,206 2009-06-30 2010-06-30 Image reconstruction incorporating organ motion Expired - Fee Related US8824756B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/381,206 US8824756B2 (en) 2009-06-30 2010-06-30 Image reconstruction incorporating organ motion

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US22202809P 2009-06-30 2009-06-30
PCT/US2010/040582 WO2011002874A1 (en) 2009-06-30 2010-06-30 Image reconstruction incorporating organ motion
US13/381,206 US8824756B2 (en) 2009-06-30 2010-06-30 Image reconstruction incorporating organ motion

Publications (2)

Publication Number Publication Date
US20120201428A1 US20120201428A1 (en) 2012-08-09
US8824756B2 true US8824756B2 (en) 2014-09-02

Family

ID=43411424

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/381,206 Expired - Fee Related US8824756B2 (en) 2009-06-30 2010-06-30 Image reconstruction incorporating organ motion

Country Status (2)

Country Link
US (1) US8824756B2 (en)
WO (1) WO2011002874A1 (en)


Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2569411T3 (en) 2006-05-19 2016-05-10 The Queen's Medical Center Motion tracking system for adaptive real-time imaging and spectroscopy
US8811694B2 (en) * 2010-09-30 2014-08-19 University Of Utah Research Foundation Intrinsic detection of motion in segmented sequences
WO2012176114A1 (en) 2011-06-21 2012-12-27 Koninklijke Philips Electronics N.V. Respiratory motion determination apparatus
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
KR101982149B1 (en) * 2011-09-05 2019-05-27 Samsung Electronics Co., Ltd. Method and apparatus for creating medical image using partial medical image
US9002079B2 (en) * 2012-05-21 2015-04-07 General Electric Company Systems and methods for motion detecting for medical imaging
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
CN105392423B 2013-02-01 2018-08-17 Kineticor, Inc. Motion tracking system for real-time adaptive motion compensation in biomedical imaging
US20160256714A1 (en) * 2013-10-01 2016-09-08 Empire Technology Development Llc Visualization of beam trajectories in radiation therapy
CN103886568B * 2014-03-18 2017-06-06 Southern Medical University Registration-based lung 4D CT image super-resolution reconstruction method
CN106572810 2014-03-24 2017-04-19 Kineticor, Inc. Systems, methods, and devices for removing prospective motion correction from medical imaging scans
CN106714681 2014-07-23 2017-05-24 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
CN106572817B 2014-08-11 2020-01-31 Koninklijke Philips N.V. Prospective respiratory triggering with retrospective validation for 4D magnetic resonance imaging
CN104599299B 2014-12-24 2017-12-29 Shenyang Neusoft Medical Systems Co., Ltd. CT image reconstruction method and device
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
WO2018037005A1 (en) * 2016-08-22 2018-03-01 Koninklijke Philips N.V. Model regularized motion compensated medical image reconstruction
EP3768380A4 (en) * 2018-03-23 2021-12-22 The Regents Of The University Of California Apparatus and method for removing breathing motion artifacts in ct scans
US11500086B2 (en) * 2020-09-28 2022-11-15 Mitsubishi Electric Research Laboratories, Inc. System and method for tracking a deformation
CN113112486B * 2021-04-20 2022-11-29 Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences Tumor motion estimation method and device, terminal equipment and storage medium
WO2024011262A1 (en) * 2022-07-08 2024-01-11 The Regents Of The University Of California Systems and methods for calibrating computed tomography breathing amplitudes for image reconstruction

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020025017A1 (en) * 1999-06-17 2002-02-28 Stergios Stergiopoulos Method for tracing organ motion and removing artifacts for computed tomography imaging systems
US20030099391A1 (en) * 2001-11-29 2003-05-29 Ravi Bansal Automated lung nodule segmentation using dynamic progamming and EM based classifcation
US6894494B2 (en) 2002-11-25 2005-05-17 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Her Majesty's Canadian Government Method and device for correcting organ motion artifacts in MRI systems
US20050113671A1 (en) * 2003-11-26 2005-05-26 Salla Prathyusha K. Method and system for estimating three-dimensional respiratory motion
US20060133564A1 (en) * 2004-12-21 2006-06-22 David Langan Method and apparatus for correcting motion in image reconstruction
US20060140482A1 (en) 2003-06-18 2006-06-29 Thomas Koehler Motion compensated reconstruction technique
US7221728B2 (en) 2002-07-23 2007-05-22 General Electric Company Method and apparatus for correcting motion in image reconstruction
US20070183639A1 (en) * 2004-03-02 2007-08-09 Koninklijke Philips Electronics N.V. Motion compensation
US20080095422A1 (en) * 2006-10-18 2008-04-24 Suri Jasjit S Alignment method for registering medical images
US20080292163A1 (en) * 2007-05-24 2008-11-27 Dibella Edward V R Method and system for constrained reconstruction of imaging data using data reordering


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report and Written Opinion issued for PCT/US2010/040582, dated Sep. 1, 2010, 9 pages.

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9129431B2 (en) 2011-12-18 2015-09-08 National University Corporation Kyoto University Motion-tracking X-ray CT image processing method and motion-tracking X-ray CT image processing device
US20160213341A1 (en) * 2015-01-27 2016-07-28 Septimiu Edmund Salcudean Dynamic computed tomography imaging of elasticity
US10085703B2 (en) * 2015-01-27 2018-10-02 Septimiu Edmund Salcudean Dynamic computed tomography imaging of elasticity

Also Published As

Publication number Publication date
US20120201428A1 (en) 2012-08-09
WO2011002874A1 (en) 2011-01-06

Similar Documents

Publication Publication Date Title
US8824756B2 (en) Image reconstruction incorporating organ motion
Cai et al. Cine cone beam CT reconstruction using low-rank matrix factorization: algorithm and a proof-of-principle study
Dawood et al. Lung motion correction on respiratory gated 3-D PET/CT images
US9398855B2 (en) System and method for magnetic resonance imaging based respiratory motion correction for PET/MRI
CN109389655B (en) Reconstruction of time-varying data
KR20210020990A (en) System and method for lung-volume-gated X-ray imaging
US8565856B2 (en) Ultrasonic imager for motion measurement in multi-modality emission imaging
CN104395933B (en) Motion parameter estimation
Hinkle et al. 4D CT image reconstruction with diffeomorphic motion model
Wen et al. An accurate and effective FMM-based approach for freehand 3D ultrasound reconstruction
US8655040B2 (en) Integrated image registration and motion estimation for medical imaging applications
WO2003107275A2 (en) Physiological model based non-rigid image registration
Chan et al. Non-rigid event-by-event continuous respiratory motion compensated list-mode reconstruction for PET
CN108289651A (en) System for tracking the ultrasonic probe in body part
AU2012323829A1 (en) Heart imaging method
Christoffersen et al. Registration-based reconstruction of four-dimensional cone beam computed tomography
Rahni et al. A particle filter approach to respiratory motion estimation in nuclear medicine imaging
EP3188120B1 (en) Registering first image data of a first stream with second image data of a second stream
US11495346B2 (en) External device-enabled imaging support
CN110458779B (en) Method for acquiring correction information for attenuation correction of PET images of respiration or heart
Hinkle et al. 4D MAP image reconstruction incorporating organ motion
US10417793B2 (en) System and method for data-consistency preparation and image reconstruction
Vaughan et al. Hole filling with oriented sticks in ultrasound volume reconstruction
Shekhar et al. Medical image processing
Perperidis Spatio-temporal registration and modelling of the heart using cardiovascular MR imaging

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE UNIVERSITY OF UTAH, UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JOSHI, SARANG;HINKLE, JACOB;SALTER, BILL;AND OTHERS;SIGNING DATES FROM 20120413 TO 20120425;REEL/FRAME:028115/0908

Owner name: UNIVERSITY OF UTAH RESEARCH FOUNDATION, UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THE UNIVERSITY OF UTAH;REEL/FRAME:028116/0031

Effective date: 20120426

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551)

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20220902