WO2016018825A1 - Method and apparatus for producing a three-dimensional image - Google Patents


Publication number
WO2016018825A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional image
sensor
shadow
radiation
dimensional
Application number
PCT/US2015/042296
Other languages
French (fr)
Inventor
Jeremy HORST
Thomas GAL
Marcin Swiatek
Original Assignee
Oraviz, Inc.
Application filed by Oraviz, Inc. filed Critical Oraviz, Inc.
Publication of WO2016018825A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 Arrangements for detecting or locating foreign bodies
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/025 Tomosynthesis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4452 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being able to move relative to each other
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/51 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for dentistry
    • A61B6/512 Intraoral means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5205 Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A61B6/5241 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/14 Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
    • A61B90/16 Bite blocks
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/14 Special procedures for taking photographs; Apparatus therefor for taking photographs during medical operations
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/56 Accessories
    • G03B17/561 Support related camera accessories
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B42/00 Obtaining records using waves other than optical waves; Visualisation of such records by using optical means
    • G03B42/02 Obtaining records using waves other than optical waves; Visualisation of such records by using optical means using X-rays
    • G03B42/04 Holders for X-ray films
    • G03B42/042 Holders for X-ray films for dental applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/367 Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3966 Radiopaque markers visible in an X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30036 Dental; Teeth

Definitions

  • This disclosure generally relates to three-dimensional imaging. More particularly, this disclosure relates to systems and methods for combining two-dimensional images into a three-dimensional image.
  • fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image.
  • the reference objects have physical properties that allow effective normalization of images from diverse sources and positions.
  • some embodiments of the devices and methods may function with various types and manufacturers of radiation projectors and sensors. This may allow dentists to create 3D images using some of the existing 2D equipment already ubiquitous in dental practices. This will expand access to 3D imaging to patients for whom X-ray computed tomography machines are not available for reasons of financial or geographic convenience. Increased access to 3D imaging may in turn improve clinical outcomes for these patients.
  • a 3D image can be generated using small, mobile equipment that does not confine a person within a large machine. Some people, including children or those with claustrophobia or anxiety disorders, for example, may be afraid of large medical devices, including traditional CT machines. Some people have special healthcare needs that limit their ability to enter a CT machine.
  • the radiation source can be at nearly any angle and/or distance relative to the radiation sensor when capturing a 2D image.
  • the radiation source does not need to be mounted to a track or a rig, does not need to rotate about a fixed axis, and/or the angle of the radiation source relative to the radiation sensor does not need to be known at the time an image is captured.
  • some embodiments of the devices and methods do not require the fiducial markers to be glued or affixed to the object being imaged, but instead are fixed in a position relative to the radiation sensor. This may reduce patient discomfort during procedures.
  • some embodiments of the devices and methods herein do not require medical professionals and staff to receive significant additional training to operate.
  • some embodiments of the devices and methods disclosed herein may allow for the creation of 3D images from fewer images or scans than other 3D imaging equipment, and therefore with less radiation exposure.
  • the desired resolution of the resulting 3D image can be adjusted by increasing or decreasing the number of 2D images incorporated into the 3D image.
  • the resolution of particular anatomical features can also be adjusted by selecting particular capture angles and radiation exposures for each 2D image incorporated into the 3D image.
  • an apparatus for capturing radiation includes a support, a radiation sensor coupled to the support, and a fiducial marker held by the support at a set distance from the sensor.
  • a method for capturing a two-dimensional radiographic image includes inserting an apparatus of any of the embodiments described herein into a mouth and capturing the image.
  • the sensor, sensor holder, and/or the fiducial markers are held static with regard to the object being imaged.
  • an apparatus for holding a radiation sensor includes a support, a radiation sensor holder coupled to the support, and a fiducial marker held by the support at a set distance from the radiation sensor holder.
  • the fiducial marker is pre-aligned with respect to the sensor or sensor holder so that a location of the fiducial marker is known with respect to the sensor.
  • the set distance is less than 10 mm.
  • the fiducial marker is less than one cubic centimeter in volume.
  • a side of the sensor has less than four square inches of surface area. In some embodiments, a side of the sensor that can detect radiation has less than four square inches of surface area. In some embodiments, the sensor holder is configured to hold a sensor, wherein a radio sensitive side of the sensor has less than four, three, two, or one square inches of total surface area.
  • the apparatus includes a biting portion extending from the support.
  • the biting portion holds the sensor, sensor holder, and/or the fiducial markers in a static position relative to the object being imaged.
  • the fiducial marker includes a shape selected from the group consisting of a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc.
  • the fiducial marker comprises a radiopaque or a semi-radiopaque material.
  • the radiopaque or the semi-radiopaque material is selected from the group consisting of lead, steel, compounds of barium, barium sulfate, compounds of bismuth, plastic, and thermoplastic.
  • the support includes at least one, at least two, at least three, at least four, at least five, at least six, at least seven, at least eight, at least nine, or at least ten fiducial markers.
  • the apparatus includes a radiation source for emitting radiation.
  • the radiation source is an X-ray source.
  • the apparatus further comprises a sensor in the sensor holder.
  • a method of generating a three dimensional image by combining a plurality of two dimensional images includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, and combining the two dimensional images into a three dimensional image using the polar angle and azimuth angle of each two-dimensional image.
  • the method includes determining a position and radiodensity of each pixel on a two-dimensional image, mapping the densities of each pixel across a plurality of voxels, and creating the three dimensional image including radiodensity information.
  • analyzing the shadow includes determining a length of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the length of the shadow. In some embodiments, analyzing the shadow includes determining an angle of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the angle of the shadow.
  • calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder includes determining a shadow displacement for each of the plurality of two dimensional images, wherein determining a shadow displacement of each two-dimensional image includes determining a position of a center of a shadow in a two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder, and comparing a position of a center of a shadow in each image to the center of the shadow in the two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder.
  • a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor and/or sensor holder.
  • an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor and/or sensor holder.
  • the direction of the elliptical shadow indicates the azimuth angle of the radiation source relative to the sensor and/or sensor holder.
  • the length of the major axis of the elliptical shadow indicates a capture angle of the source relative to the sensor and/or sensor holder.
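The geometric relationship above can be sketched numerically. Under a parallel-beam approximation (an assumption; a real cone beam also magnifies the shadow with distance), the minor axis of a spherical marker's shadow stays close to the sphere's projected diameter while the major axis lengthens as 1/cos(θ), so the capture angle can be recovered from the axis ratio:

```python
import math

def capture_angle_from_ellipse(major_axis: float, minor_axis: float) -> float:
    """Estimate the polar capture angle (radians) from the axes of the
    elliptical shadow of a spherical fiducial marker.

    Sketch only: assumes a parallel-beam approximation, in which the
    minor axis equals the sphere's projected diameter while the major
    axis grows as 1/cos(theta)."""
    if major_axis < minor_axis:
        raise ValueError("major axis must be >= minor axis")
    return math.acos(minor_axis / major_axis)

# A circular shadow (equal axes) implies an orthogonal source.
print(capture_angle_from_ellipse(5.0, 5.0))  # 0.0
```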
  • a first two-dimensional image of the plurality of two dimensional images has a first capture angle and a second two dimensional image of the plurality of two dimensional images has a second capture angle, wherein the first capture angle and second capture angle are different.
  • the method includes displaying the three dimensional image on a display.
  • the method of generating an image includes obtaining two-dimensional images, determining, for each two-dimensional image, a relative position of a sensor and/or sensor holder and a radiation source used to capture the respective two-dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the relative position of the sensor and/or sensor holder and the radiation source used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
  • the method includes normalizing each two-dimensional image by analyzing a radiodensity gradient created by an object captured in each two-dimensional image.
  • the object can be captured in one or more of the plurality of images.
  • normalizing each two-dimensional image includes adjusting a parameter in each two-dimensional image.
  • the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the two-dimensional images.
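A minimal sketch of such a normalization, assuming intensities scaled to [0, 1]; the parameter names follow the text, but the exact order of operations is an assumption:

```python
import numpy as np

def normalize_image(img: np.ndarray, gamma: float = 1.0,
                    brightness: float = 0.0, contrast: float = 1.0) -> np.ndarray:
    """Illustrative normalization of a 2D radiograph: gamma correction,
    then a linear contrast/brightness adjustment, clipped back to [0, 1]."""
    out = np.clip(img, 0.0, 1.0) ** gamma
    out = contrast * out + brightness
    return np.clip(out, 0.0, 1.0)

img = np.array([[0.25, 0.5], [0.75, 1.0]])
print(normalize_image(img, gamma=2.0))
```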
  • the relative position includes a polar angle between a plane of the sensor and/or sensor holder and a direction of a beam emitted from the radiation source.
  • generating the three-dimensional image includes overlapping the three dimensional volumes.
  • generating the three-dimensional image includes identifying empty voxels.
  • generating the three-dimensional image includes estimating an intensity value of a non-empty voxel.
  • estimating the intensity value of the non-empty voxel includes averaging an array of potential values for the non-empty voxel.
  • estimating the intensity value of the non-empty voxel includes selecting the highest value from an array of potential values for the non-empty voxel. In some embodiments, generating the three-dimensional image includes iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value among one or more related voxels. In some embodiments, generating the three-dimensional image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone.
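The two estimation strategies named above (averaging the array of candidate values for a voxel, or selecting the highest) can be sketched as:

```python
def estimate_voxel_intensity(candidates, mode="average"):
    """Combine an array of potential intensity values for one non-empty
    voxel: either average them or take the highest. A voxel with no
    candidates is treated as empty (None)."""
    if not candidates:
        return None
    if mode == "average":
        return sum(candidates) / len(candidates)
    if mode == "max":
        return max(candidates)
    raise ValueError(f"unknown mode: {mode}")

print(estimate_voxel_intensity([0.25, 0.5, 0.75]))         # 0.5
print(estimate_voxel_intensity([0.25, 0.5, 0.75], "max"))  # 0.75
```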
  • the method includes applying an anti-aliasing algorithm to the three-dimensional image. In some embodiments, the method includes displaying the three dimensional image on a display.
  • the method includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
  • a non-transitory computer-readable storage medium includes computer-readable instructions, which when executed by one or more processors, causes the one or more processors to perform the method of any one of the embodiments described herein.
  • Figure 1 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Figure 2A shows the difference in shadows created on a radiation sensor by a spherical fiducial marker, in accordance with an embodiment.
  • Figure 2B shows an overhead view of the different shadows created by changing the polar angle θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment.
  • Figure 2C shows the shadows created by two metal bars.
  • Figure 2D shows the shadow created by two metal bars when one obscures the other from the beam of radiation.
  • Figure 3 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Figure 4A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 4B shows a front view of an exemplary device in accordance with an embodiment.
  • Figure 4C shows a side view of an exemplary device in accordance with an embodiment.
  • Figure 4D shows an isometric rear view of an exemplary device in accordance with an embodiment.
  • Figure 5A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 5B shows a cut-away front view of an exemplary device in accordance with an embodiment.
  • Figure 6 shows two radiographs produced by a fixed radiation sensor when imaging the same metal bars from different capture angles.
  • methods and devices utilize fiducial markers to determine the incident angle of a radiation source for each of a plurality of 2D images.
  • the fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image.
  • some of the embodiments may be agnostic to the system used to capture the 2D images. Such embodiments allow for the use of traditional 2D imaging technology, which reduces both the cost and space required to generate a 3D image. Further, current 2D imaging techniques provide an acceptable level of radiation; leveraging current 2D imaging techniques may allow a 3D image to be generated without exposing the subject to more radiation than needed to create a typical set of 2D images.
  • a fiducial marker can be understood to be an object placed in the field of view of an imaging system that provides a reference point on an image produced by the imaging system.
  • the fiducial marker appears as a shadow in the image produced, for use as a point of reference or a measure.
  • the position, size, and/or shape of the shadow created by a fiducial marker changes depending on the relative angle of the radiation source and the radiation sensor.
  • the fiducial markers allow for the relative position of one image to be correlated to the relative position of another image in 3D space.
  • the position of the radiation source can be defined relative to the radiation sensor using polar coordinates or spherical coordinates.
  • the spherical coordinates are defined as ρ, θ, and/or φ, where ρ is radial distance, θ is the polar angle measured from a fixed zenith orthogonal to a reference plane, e.g. a plane of the sensor, and φ is the azimuthal angle.
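The convention can be made concrete with a small helper that converts the polar and azimuthal angles into a unit direction vector from the sensor toward the source; the axis orientation (z along the sensor normal) is an assumption:

```python
import math

def source_direction(theta: float, phi: float):
    """Unit vector pointing from the sensor plane toward the radiation
    source: theta is the polar angle measured from the zenith (the
    normal to the sensor plane), phi is the azimuthal angle."""
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# theta = 0: the source is directly above the sensor (orthogonal).
print(source_direction(0.0, 0.0))  # (0.0, 0.0, 1.0)
```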
  • Figure 1 illustrates a method 100 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Method 100 includes obtaining a plurality of 2D images 101, wherein each 2D image includes a shadow from a fiducial marker.
  • obtaining a plurality of 2D images 101 may include inserting a device in accordance with any of the embodiments described herein into the mouth of a patient.
  • obtaining a plurality of 2D images 101 may include placing a device in accordance with any of the embodiments described herein next to an object to be imaged.
  • obtaining a plurality of 2D images 101 may include affixing a device in accordance with any of the embodiments described herein to an object to be imaged.
  • Method 100 also includes analyzing the shadow to determine shape characteristics of the shadow 102, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor for each 2D image 103, and combining the 2D images into a 3D image using the polar angle and azimuth angle of each 2D image 104.
  • the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.
  • FIG. 2A shows the difference in shadows created on a radiation sensor 201 by a spherical fiducial marker 202, in accordance with an embodiment.
  • a spherical fiducial marker 202 can create a circular shadow 204 when a radiation source 221 is positioned orthogonally to the radiation sensor 201.
  • 203 depicts the beam of radiation emitted by radiation source 221.
  • a radiation source 222 that is not orthogonal to the radiation sensor 201 can create an elliptical shadow 205.
  • 230 depicts the beam of radiation emitted by radiation source 222. This can occur because the plane of the radiation sensor 201 intersects the cone of the shadow in 3D space at an incline.
  • 206 depicts the distance between the center of the shadow created by radiation source 221 and the center of the shadow created by radiation source 222.
  • 207 depicts the distance between the sensor 201 and the fiducial marker 202.
  • FIG. 2B shows an overhead view of the different shadows created by changing the polar angle θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment.
  • when θ is 0, the radiation source is orthogonal to the radiation sensor. This creates a circular shadow 208 on the radiation sensor, which in this view is obscured by the fiducial marker 223.
  • Shadows 209, 210, and 211 depict shadows created by fiducial markers 224, 225, and 226, respectively, using radiation sources with progressively increasing θ. As θ increases, the shadow can become more elliptical.
  • shadows 209, 210, and 211 have progressively longer major axes because they are produced by radiation sources with progressively increasing θ and constant φ relative to the plane of the radiation sensor.
  • the azimuthal angle φ can be determined by the direction of the fiducial shadow. Shadows 212, 213, and 214 each represent a shadow created by fiducial markers 227, 228, and 229, respectively, using radiation sources with the same θ, but with different φ. Thus, in some embodiments, the angle of the major axis of an elliptical shadow relative to the frame of an image is determined by the azimuthal angle φ of the radiation source relative to the sensor.
  • the position of the center of the fiducial shadow can also be dependent on the spherical coordinates θ and φ of the radiation source relative to the radiation sensor.
  • a shadow created by a radiation source 221 orthogonal to the radiation sensor creates a circular shadow 204 directly below the fiducial marker.
  • the center of a shadow created by a radiation source 222 with different θ and/or φ coordinates creates a shadow with a center position displaced by a distance determined by θ and in a direction determined by φ.
  • the spherical coordinates of a radiation source relative to a radiation sensor used to create an image can be calculated by the shape, position, and size characteristics of fiducial shadows.
  • fiducial markers create shadows on each 2D image, wherein the shape, position, and/or size of the shadow are determined by the relative position of the radiation source and the radiation sensor (defined as the capture angle) when the image was captured.
  • some embodiments of the devices and methods herein allow for the generation of 3D images by determining the capture angle of each image from the image itself, and therefore without physically measuring the capture angle of the radiation source and the radiation sensor at the time each image is taken.
  • combining the 2D images into a 3D image 104 can include determining a position and radiodensity of each pixel on a 2D image, mapping the densities of each pixel across a plurality of voxels, and creating the 3D image.
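The mapping of pixel densities into voxels can be sketched as a minimal back-projection: each 2D image is smeared through a 3D volume along the source direction. The parallel-beam shear geometry and nearest-pixel shifts below are simplifying assumptions; a real implementation would resample and model cone-beam magnification:

```python
import numpy as np

def backproject(image: np.ndarray, theta: float, phi: float, depth: int) -> np.ndarray:
    """Smear a 2D radiograph through a 3D volume along the capture
    direction: each depth slice is the image offset by the ray
    displacement at that depth (nearest-pixel, no interpolation)."""
    h, w = image.shape
    vol = np.zeros((depth, h, w))
    for z in range(depth):
        dy = int(round(z * np.tan(theta) * np.sin(phi)))
        dx = int(round(z * np.tan(theta) * np.cos(phi)))
        src = image[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
        vol[z, max(0, dy):max(0, dy) + src.shape[0],
               max(0, dx):max(0, dx) + src.shape[1]] = src
    return vol

# An orthogonal capture (theta = 0) replicates the image at every depth.
print(backproject(np.ones((2, 2)), 0.0, 0.0, 3).shape)  # (3, 2, 2)
```

Volumes back-projected from several capture angles can then be correlated (e.g. summed or intersected) to localize radiodense structures in 3D.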
  • Analyzing each image to identify and characterize the shadow created by the fiducial marker 102 can include manually tracing the outline of a shadow created by the fiducial markers, or manually inputting the locations of the shadows.
  • software determines the location of shadows created by the fiducial markers.
  • the software can locate the shadow by blob detection methods.
  • blob detection methods detect regions in a digital image that differ in properties, such as brightness or color, compared to surrounding regions.
  • the blob detection method can be a difference of Gaussians method.
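A difference-of-Gaussians response can be computed by subtracting a strongly blurred copy of the image from a lightly blurred copy; this is a generic sketch of the technique, not the patent's implementation:

```python
import numpy as np

def gaussian_kernel(sigma: float, radius: int) -> np.ndarray:
    """Normalized 1D Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img: np.ndarray, sigma: float) -> np.ndarray:
    """Separable Gaussian blur via two 1D convolutions (edge-padded)."""
    k = gaussian_kernel(sigma, radius=int(3 * sigma) + 1)
    pad = len(k) // 2
    row = lambda r: np.convolve(np.pad(r, pad, mode="edge"), k, "valid")
    out = np.apply_along_axis(row, 1, img)
    return np.apply_along_axis(row, 0, out)

def difference_of_gaussians(img, low_sigma=1.0, high_sigma=2.0):
    """Blob response: a lightly blurred copy minus a heavily blurred one.
    Blob-like regions at scales between the two sigmas (such as a
    fiducial shadow) produce the strongest responses."""
    return blur(img, low_sigma) - blur(img, high_sigma)
```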
  • edges of shadows created by fiducial markers can be determined by any of the edge detection methods known in the art. Exemplary methods include Canny edge detection, morphological thinning, a wavelet transform, or a combination of any of these methods.
  • shape characteristics for the shadow created by the fiducial marker can be determined by a Hough transform method. In some embodiments, the shape characteristics determined by a Hough transform method include, for example, shadow dimensions, including the lengths of the major and minor axes.
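As a lightweight stand-in for the Hough transform named above (an alternative technique, not the one claimed), the axis lengths and orientation of an elliptical shadow can also be estimated from second-order image moments of a binary shadow mask:

```python
import numpy as np

def shadow_axes(mask: np.ndarray):
    """Estimate major/minor axis lengths and major-axis orientation of a
    shadow from a binary mask via the eigendecomposition of the pixel
    coordinate covariance. For a filled ellipse, each full axis length
    is about 4 standard deviations along its principal direction."""
    ys, xs = np.nonzero(mask)
    cov = np.cov(np.stack([xs, ys]).astype(float))
    evals, evecs = np.linalg.eigh(cov)            # ascending eigenvalues
    minor, major = 4 * np.sqrt(evals)             # approximate axis lengths
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # major-axis orientation
    return major, minor, angle
```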
  • the method includes determining p, ⁇ , and/or ⁇ for the position of the radiation source relative to the sensor by analyzing the size, shape and/or position characteristics of shadows created by the fiducial markers.
  • analyzing the shadow 102 can include determining a length of the shadow and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor from the length of the shadow.
  • Analyzing the shadow 102 can include determining an angle of the shadow and calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor from the angle of the shadow.
  • the azimuthal angle φ can be calculated from an angle of a major axis of an elliptical shadow relative to the frame of the image.
  • Analyzing the shadow 102 can include determining a shadow displacement for each of the plurality of 2D images, wherein determining a shadow displacement (e.g., shadow displacement 206 in Figure 2A) of each 2D image includes determining a position of a center of a shadow in a 2D image taken when the radiation source is orthogonal to the radiation sensor, and comparing a position of a center of a shadow in each image to the center of the shadow in the 2D image taken when the radiation source is orthogonal to the radiation sensor. Calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor can include analyzing the shadow displacement.
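One plausible implementation of the displacement analysis above, assuming a fiducial marker at a known height above the sensor plane and using the relation stated elsewhere in this disclosure (displacement divided by the marker-to-sensor distance equals tan θ). The function name and coordinate conventions are illustrative assumptions, not part of the disclosure.

```python
import math

def source_angles(shadow_center, ortho_center, marker_height):
    """Estimate the polar angle (theta) and azimuth angle (phi) of the
    radiation source from the displacement of a fiducial marker's shadow.

    shadow_center : (x, y) shadow center in the current image
    ortho_center  : (x, y) shadow center when the source is orthogonal
                    to the sensor
    marker_height : marker-to-sensor-plane distance, in the same units
                    as the image coordinates
    """
    dx = shadow_center[0] - ortho_center[0]
    dy = shadow_center[1] - ortho_center[1]
    displacement = math.hypot(dx, dy)
    # Similar triangles: displacement / marker_height = tan(theta).
    theta = math.degrees(math.atan2(displacement, marker_height))
    # The direction of the displacement encodes the azimuth angle.
    phi = math.degrees(math.atan2(dy, dx))
    return theta, phi
```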
  • the difference in major axis angles of shadows in two or more images created by the same fiducial marker can be used to calculate the relative difference in φ between the images (e.g. the angles of shadows 212, 213, and 214 in Figure 2B).
  • θ can be calculated from the length 206 of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in an image and the actual or expected position of a shadow from the same fiducial marker in an actual or theoretical image taken using a radiation source orthogonal to the radiation sensor.
  • the actual or theoretical positions are the x and y coordinates relative to the frame of the image sensor.
  • the relative difference in θ between two images can be calculated by the length of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in a first image and the position of a shadow from the same fiducial marker in a second image.
  • the length 206 divided by the distance 207 between the fiducial marker and the plane of the sensor equals tan(θ).
  • ρ, θ, and/or φ can be calculated from a combination of any or all of the methods described herein.
  • calculating φ of the radiation source relative to the radiation sensor includes calculating the angle of a line 206 drawn between the position of the center of the elliptical shadow from a fiducial marker in a first image with a first φ and the expected position of the center of a shadow from the same fiducial marker in an actual or theoretical second image wherein the radiation source is orthogonal to the sensor.
  • ρ can be calculated by the length of the minor axis of an elliptical shadow created by a spherical fiducial marker.
  • a radiation source positioned at a larger ρ will create an ellipse with a smaller minor axis than when radiating the same fiducial marker from a position with a smaller ρ relative to the radiation sensor.
  • converting shadow information into ρ, θ, and/or φ information involves a linear regression calculation using the shape characteristics.
  • precision and/or accuracy can be improved by analyzing shape characteristics for a plurality of shadows created by a plurality of fiducial markers in each of one or more images.
  • analyzing the shape characteristics for a plurality of shadows includes a regression analysis.
  • a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor (e.g. shadow 204 in Figure 2A).
  • an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor (e.g. shadow 205 in Figure 2A).
  • the direction of the elliptical shadow 205 indicates the azimuth angle of the radiation source relative to the sensor.
  • a first 2D image of the plurality of 2D images has a first capture angle
  • a second 2D image of the plurality of 2D images has a second capture angle
  • the plurality of images are captured from different angles using a radiation source rotated about an axis of rotation.
  • some or all of the images are captured with the radiation source having a different set of spherical coordinates relative to a reference plane, wherein the spherical coordinates are defined as ρ, θ, and φ, or a different combination of ρ, θ, and/or φ.
  • the reference plane is the plane of the sensor.
  • Figure 3 illustrates a method 300 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Method 300 includes obtaining a plurality of 2D images 301, determining a relative position of a sensor and a radiation source used to capture each 2D image 302, creating a 3D volume for each 2D image by projecting each 2D image in a direction of a relative position of the sensor and the radiation source used to capture the 2D image 303, and generating a 3D image by correlating the volumes associated with the 2D images 304.
  • the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.
  • the method 300 can include normalizing each 2D image by analyzing a radiodensity gradient created by an object captured in each 2D image.
  • the radiodensity gradient includes radiodensities equivalent to the radiodensity of features.
  • the radiodensity gradient is created by a plurality of fiducial markers with varying radiodensities.
  • the radiodensity gradient is created by fiducial markers with radiodensities equivalent to the radiodensity of one or more features.
  • the radiodensity gradient and/or fiducial markers with varying radiodensities help identify features in a 2D or 3D image.
  • normalizing each 2D image includes adjusting a parameter in each 2D image.
  • the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the 2D images. These parameters can vary between images due to differences in the distance between the radiation source and the radiation sensor, the amount of radiation generated by the radiation source, the focus of the beam of radiation, and variability between different radiation sources and sensors. In some embodiments, these adjustments are used to normalize each image so that the range of values on each image representing features are approximately consistent.
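A minimal sketch of this normalization, assuming two fiducial reference intensities are available in each image; a simple linear gain/offset stands in for the gamma, brightness, and contrast adjustments described above. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def normalize_image(img, measured, expected):
    """Linearly rescale pixel intensities so that the intensities
    measured at fiducial markers of known radiodensity map to their
    expected values, making images from different sources comparable.

    img      : 2D array of pixel intensities
    measured : intensities observed at two reference fiducials (low, high)
    expected : target intensities for the same fiducials (low, high)
    """
    m_lo, m_hi = measured
    e_lo, e_hi = expected
    gain = (e_hi - e_lo) / (m_hi - m_lo)   # contrast adjustment
    offset = e_lo - gain * m_lo            # brightness adjustment
    return gain * np.asarray(img, dtype=float) + offset
```

A full implementation might add a gamma (power-law) term; the linear form shows the idea of anchoring each image to the same fiducial reference values.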
  • Example anatomical features in dental applications include caries, dentin, carious dentin, enamel, carious enamel, cementum, carious cementum, bone, gum, and other tissues.
  • a radiation sensor used to obtain a plurality of images 301 will be exposed non-uniformly if the radiation sensor is not orthogonal to the radiation source in at least one of the plurality of images.
  • the radiation source is a cone beam source or emits radiation in the shape of a cone.
  • the radiation source emits parallel rays of radiation.
  • the radiation source includes a collimator.
  • a side of the radiation sensor further away from the radiation source will be exposed by less radiation than a side closer to the radiation source.
  • normalizing the 2D images includes adjusting each pixel for the distance to the radiation source.
  • differences in radiation exposure of each pixel caused by the differences in distance to the source can be used to determine ρ, θ, and/or φ of the radiation source relative to the radiation sensor 302. In some embodiments, differences in radiation exposure caused by distance to the radiation source can be used to normalize each 2D image.
  • the relative position includes a polar angle between a plane of the sensor and a direction of a beam emitted from the radiation source.
  • Step 303 can include creating, for each 2D image, a 3D volume by projecting each 2D image in a direction of the relative position of the sensor and the radiation source used to capture the respective 2D image.
  • each 2D image is projected from a boundary.
  • the projection creates a 3D space for each image where each pixel maps to a line of voxels, and each voxel along the line of voxels is assigned the same intensity value as the corresponding pixel for the image.
  • an intensity value represents the amount of radiation that reaches the sensor.
  • an intensity value has an inverse relationship to a radiopacity value, which can represent the amount of radiation obstructed from reaching the sensor.
  • the shape of each projection can be determined by a boundary and a direction of the projection.
  • an image taken using a radiation source orthogonal to the sensor generates a rectangular 3D space.
  • an image taken using a radiation source at any other angle generates a parallelepiped shaped 3D space, wherein one wall of the 3D space represents the boundary, and the parallelepiped extends from the sensor in a direction determined by the θ and φ calculated for that particular image.
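The projection step can be approximated by shearing a stack of copies of the 2D image, so that each pixel traces a line of voxels in the direction given by θ and φ, forming the parallelepiped described above. This is a simplified, nearest-voxel sketch (a real implementation would interpolate and could model cone-beam divergence); the function name is an assumption.

```python
import numpy as np

def backproject(image, theta_deg, phi_deg, depth):
    """Project a 2D image into a 3D volume along the source direction.

    Each pixel becomes a line of voxels carrying the pixel's intensity;
    for a non-orthogonal source the stack of slices is sheared by
    tan(theta) in the azimuth direction phi.
    """
    img = np.asarray(image, dtype=float)
    t = np.tan(np.radians(theta_deg))
    dx = np.cos(np.radians(phi_deg)) * t   # shear per slice, x direction
    dy = np.sin(np.radians(phi_deg)) * t   # shear per slice, y direction
    volume = np.empty((depth,) + img.shape)
    for z in range(depth):
        # Integer shear via roll; slice 0 sits at the sensor boundary.
        volume[z] = np.roll(np.roll(img, int(round(z * dy)), axis=0),
                            int(round(z * dx)), axis=1)
    return volume
```

With θ = 0 (orthogonal source) every slice equals the original image, i.e. the rectangular 3D space described above.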
  • each image is projected from a rear boundary of the volumetric model.
  • the rear boundary corresponds to the position of the radiation sensor.
  • the position of the radiation sensor does not change within the model for each projection.
  • the position of the radiation sensor may be held constant while the plurality of images are captured.
  • the position of the radiation source used to capture the 2D images can move along spherical coordinates defined as ρ, θ, and φ, where ρ is the radial distance, θ is the polar angle, and φ is the azimuthal angle.
  • each 2D image includes a series of pixels arranged in 2D space along x and y coordinates.
  • each pixel has an intensity value corresponding to the combined radiopacity of all radiopaque material between the radiation source and the radiation sensor.
  • Generating the 3D image 304 can include overlapping the 3D volumes.
  • the method includes generating a 3D image by correlating 3D volumes associated with the 2D images.
  • an image may comprise lines of voxels that intersect lines of voxels from images with different θ and φ values.
  • each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines that intersect within each voxel's boundaries.
  • Generating the 3D image 304 can include identifying empty voxels.
  • outlines are created around empty volumetric regions of the 3D image.
  • areas of an image, including, for example, a 2D image, that appear black or dark may not have a radiopaque object obstructing the radiation directed to that part of the sensor. In some embodiments, this indicates a clear path or relatively clear path between the source and the sensor.
  • any voxels that overlap a black line in a 2D image may be empty space regardless of the other intersecting line values corresponding to other 2D images.
  • the radiopacity or intensity values of those other intersecting lines can be attributed to the other voxels.
  • Generating the 3D image 304 can include estimating an intensity value of a nonempty voxel.
  • estimating an intensity value of a non-empty voxel includes selecting the highest intensity value from the array of intensity values for that voxel.
  • each pixel of a 2D image has an intensity value determined by the amount of radiation that the sensor detects.
  • each pixel in a 2D image represents the entire radiation that passes through the radiopaque material along a path in 3D space. The path follows a line from the radiation source through the radiopaque material to the sensor.
  • each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines projected from pixels in each of the plurality of 2D images that intersect within each voxel's boundaries.
  • darker areas of an image indicate higher intensity values, which can indicate less radiopaque material along the path of radiation that extends from the sensor to a point having the θ and φ calculated for that particular image. Therefore, in some embodiments, the method includes selecting the highest intensity value, which can represent the clearest path between the radiation source and the sensor.
  • the radiopacity associated with unselected values in the array of values assigned to a voxel may belong to other voxels along other paths of radiation with different θ and φ values that intersect that voxel. Therefore, in some embodiments, the smallest radiopacity value in the array of values is selected to represent that voxel, which represents the highest amount of radiation exposure.
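A minimal sketch of this selection rule: for each voxel, keep the highest intensity (lowest radiopacity) among the intersecting projections, and flag as empty any voxel crossed by a near-black path. The function name and the `empty_threshold` parameter are illustrative assumptions.

```python
import numpy as np

def fuse_volumes(volumes, empty_threshold):
    """Combine overlapping back-projected volumes.

    For each voxel, the highest intensity among the intersecting
    projection lines is selected (the clearest path).  A voxel is
    flagged empty if any projection shows near-black intensity there,
    indicating an unobstructed path between source and sensor.
    """
    stack = np.stack([np.asarray(v, dtype=float) for v in volumes])
    fused = stack.max(axis=0)
    empty = (stack >= empty_threshold).any(axis=0)
    return fused, empty
```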
  • Generating the 3D image 304 can include a Radon transform and/or a Penrose transform.
  • outlines are created around volumetric regions determined to have higher intensity values relative to other volumetric regions. In some embodiments, this process is repeated with progressively smaller intensity values.
  • estimating the intensity value of the non-empty voxel includes averaging the array of potential values for each voxel.
  • Generating the 3D image 304 can include iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value of the overlapping voxels.
  • a voxel relates to another voxel if a path of radiation intersects both voxels before reaching the same pixel on the radiation sensor in the same 2D image.
  • each voxel can relate to a different set of voxels for each image.
  • each value for each voxel can be cross-checked against all of the other voxels it relates to in order to ensure that radiopacity attributed to a voxel is consistent with each overlapping pixel.
  • each voxel's intensity value can be iteratively crosschecked against each of the paths of voxels determined to have less radiopaque material than other paths in order to distribute radiopacity to other voxels that intersect with other paths determined to have relatively more radiopaque material.
  • each voxel's intensity value can be iteratively crosschecked against one or more fiducial markers with varying radiopacity.
  • generating a 3D image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone. Additionally, in some embodiments, once a voxel is identified as dentin, enamel, or other feature, the value for that voxel can be set and remaining radiopacity redistributed among other voxels in lines associated with the identified voxel.
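The iterative cross-checking and redistribution described above is loosely in the spirit of algebraic reconstruction. The toy sketch below adjusts voxel radiopacities until their sums along each projection path match the measured pixel values; it is an illustration under simplifying assumptions (a known 0/1 path-incidence matrix), not the prescribed algorithm, and all names are hypothetical.

```python
import numpy as np

def redistribute(A, measured, n_iter=500, lr=0.1):
    """Iteratively adjust voxel radiopacities so that their sums along
    each projection path match the measured pixel values (a simplified
    ART/SIRT-style update).

    A        : (n_paths, n_voxels) 0/1 matrix; A[i, j] = 1 if path i
               intersects voxel j
    measured : radiopacity measured at the pixel ending each path
    """
    x = np.zeros(A.shape[1])
    row_counts = A.sum(axis=1)            # voxels crossed by each path
    for _ in range(n_iter):
        residual = measured - A @ x       # radiopacity not yet explained
        # Spread each path's residual evenly over the voxels it crosses.
        x += lr * (A.T @ (residual / row_counts))
        x = np.maximum(x, 0.0)            # radiopacity is non-negative
    return x
```

Once a voxel is identified (e.g. as enamel), its value could be fixed and excluded from subsequent updates, matching the redistribution idea above.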
  • it may be desirable to create a 3D image using the region of the projection of a 2D image that overlaps with a region of at least one other projection.
  • the 3D image can be corrected for radiopacity that only appears in a single projection.
  • the intensity values attributable to areas that only appear in one projection can be removed and the remaining voxel values adjusted accordingly.
  • the relative position is a position calculated as ρ, θ, and/or φ by any of the methods contained herein.
  • the 3D image is created by using the relative difference in ρ, θ, and/or φ between two or more images.
  • generating the 3D image 304 can include identifying whether the non-empty voxel includes one selected from the group consisting of: dentin, enamel, cavity, gum, and bone.
  • Generating the 3D image 304 can include applying an anti-aliasing algorithm to the 3D image.
  • a 3D contour is created from the 3D image.
  • Figure 4A represents an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 4B represents a front view of an exemplary device in accordance with an embodiment.
  • Figure 4C represents a side view of an exemplary device in accordance with an embodiment.
  • Figure 4D represents an isometric rear view of an exemplary device in accordance with an embodiment.
  • An exemplary embodiment includes a plurality of supports for a sensor 402, and one or more fiducial markers 403.
  • the fiducial markers are held in a known and/or fixed position and/or distance relative to the supports for the sensor 402.
  • the supports can be any size.
  • the supports have a width of between 0.1 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • the supports can be configured to hold a sensor of any size.
  • the plurality of supports for a sensor are configured to hold a sensor with a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • Some embodiments comprise a bite plate 404.
  • the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches.
  • a sensor can be coupled to the plurality of supports 402 in some embodiments.
  • Figure 5 illustrates an exemplary device in accordance with an embodiment.
  • Figure 5A represents an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 5B represents a cut-away front view of an exemplary device in accordance with an embodiment.
  • An exemplary embodiment includes a radiation sensor 501, and one or more fiducial markers 505.
  • the one or more fiducial markers are contained within supports 502 coupled to the radiation sensor.
  • the sensor can be any size.
  • the sensor can have a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • the fiducial markers are held in a known and/or fixed position and/or distance relative to the sensor 501.
  • Some embodiments comprise a bite plate 504.
  • the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches.
  • the sensor can be any type of radiation sensor.
  • the radiation sensor is an X-ray sensor adapted for use in capturing dental images.
  • the radiation sensor is a digital sensor.
  • the position of at least one of the one or more fiducial markers are fixed relative to the position of the sensor.
  • the fiducial marker obstructs or partially obstructs radiation emitted from the radiation source from reaching the sensor, thereby creating a shadow in the resulting image.
  • the one or more fiducial markers comprise 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, or more fiducial markers.
  • the one or more fiducial markers 405 are each aligned at a set distance and/or position from the sensor and from the other fiducial markers so that a location of each fiducial marker is known with respect to the sensor and the other fiducial markers.
  • the distance between one or more of the fiducial markers and the sensor is less than 10 mm, e.g.
  • one or more of the fiducial markers are in contact with the sensor, i.e. the one or more fiducial markers are immediately adjacent to the sensor.
  • the sensor includes one or more fiducial markers.
  • the one or more fiducial markers have a known size.
  • the size of the fiducial markers can be any size that creates a shadow detectable by the radiation sensor.
  • the size of the fiducial marker is the same size or between 1 and 100 times (e.g. about 1, 2, 3, 4, 5, 6, 7, 8, 10, 15, 20, 25, 30, 40, 50, 60, 70, 80, 90, or 100 times) the size of the resolution of a pixel in the radiation sensor.
  • the size is between 0.01 and 1 cubic centimeters in volume.
  • the fiducial marker has a diameter between about 0.1 mm and about 5.0 mm (e.g.
  • the shape of the fiducial markers can be any radiopaque mass that creates a shadow.
  • the shape of the one or more fiducial markers can be a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc.
  • the one or more fiducial markers comprise a plurality of shapes.
  • the fiducial marker is radiopaque. In some embodiments, the fiducial marker is partially radiopaque.
  • the position of the sensor relative to the object being imaged is held constant or relatively constant.
  • the location of objects in 3D space captured in 2D images can be determined by using the "buccal object rule," also known as the SLOB rule (Same Lingual; Opposite Buccal), demonstrated in Figure 6.
  • whether a first object is in front of or behind a second object from the perspective of the radiation source can be determined by analyzing two images captured with a radiation source with different θ and/or φ angles. If an object moves in the same direction as the source of the x-ray beam, it is behind (lingual) the other object. If the object moves in the opposite direction of the source, it is in front of (buccal) the other object.
  • Figure 6 depicts two different resulting radiographs (605 and 607) produced by a radiation sensor 601 resulting from imaging the same metal bars (602 and 603) using radiation sources from two different angles (604 and 606).
  • the radiation source 604 is orthogonal to the radiation sensor, and shadows from the metal bars 602 and 603 appear in the radiograph in almost the same relationship that they share in reality.
  • shadows from the metal bars 602 and 603 appear on the film in a distorted relationship in the resulting radiograph 607.
  • the object closer to the radiation source 603 will create a shadow at a greater distance than the object farther from the radiation source 602, and the relative shift of the shadow created by the object closer to the radiation source 603 will be greater than the relative shift of the shadow created by the object 602 closer to the radiation sensor.
  • the amount of displacement of a shadow created by an object can be correlated to the distance between the object and the sensor by comparing the displacement to the displacement of an object with a known distance to the sensor.
  • the object with a known distance to the sensor is a fiducial marker.
  • this comparison is a linear regression.
  • the relative size and direction an object shifts from one image to another relative to other objects captured in the same image can place the object in 3D space.
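The SLOB rule and the displacement-to-depth comparison above can be sketched as two small helpers, assuming shadow shifts are measured as signed displacements along a common axis; the names and conventions are illustrative assumptions.

```python
def slob_classify(object_shift, source_shift):
    """Apply the buccal object rule (SLOB): if an object's shadow moves
    in the same direction as the radiation source, the object is lingual
    (behind); if it moves opposite the source, it is buccal (in front)."""
    return "lingual" if object_shift * source_shift > 0 else "buccal"

def depth_from_displacement(obj_disp, fid_disp, fid_height):
    """Estimate an object's distance from the sensor by comparing its
    shadow displacement to that of a fiducial marker at a known height,
    since displacement scales linearly with distance from the sensor."""
    return obj_disp * fid_height / fid_disp
```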
  • the term "computer program product" may be used generally to refer to media such as memory storage devices or storage units. These, and other forms of computer-readable media, may be involved in storing one or more instructions for use by a processor to cause the processor to perform specified operations. Such instructions, generally referred to as "computer program code" (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system to perform the specified operations.
  • computer readable storage may be a non-transitory medium.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Neurosurgery (AREA)
  • Theoretical Computer Science (AREA)
  • Dentistry (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Devices and methods for generating a three-dimensional image by combining a plurality of two-dimensional images. An exemplary apparatus for capturing radiation comprises a support, a radiation sensor holder coupled to the support, and a fiducial marker held by the support and at a set distance from the sensor holder. The apparatus can further comprise a radiation sensor. The methods of generating a three dimensional image comprise combining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker.

Description

METHOD AND APPARATUS FOR PRODUCING A THREE-DIMENSIONAL
IMAGE
Cross-Reference to Related Applications
[0001] This application claims priority from U.S. Provisional Application No.
62/029,843, filed July 28, 2014, and U.S. Provisional Application No. 62/183,104, filed June 22, 2015, the content of each of which is incorporated by reference herein in its entirety.
Technical Field
[0002] This disclosure generally relates to three-dimensional imaging. More
particularly, this disclosure relates to systems and methods for combining two-dimensional images into a three-dimensional image.
Background
[0003] Traditional dental practices take multiple radiographic images of a patient's mouth when planning for common dental procedures. Dentists often take these images from multiple angles in order to capture different views of the area of interest. However, traditional two-dimensional x-rays and radiographic images fail to provide medical professionals with sufficient information for a variety of procedures, including dental extractions, dental implants, and root canals. The consequences include the inability to properly plan for procedures pre-operatively, improperly sealing or filling root canals, injuring nerves, damaging sinuses, improperly placing implants, and various other types of trauma from dental procedures.
[0004] The American Association of Periodontists recommends utilizing three-dimensional imaging as the standard of excellence in planning all dental implants. Three-dimensional ("3D") imaging, however, is not available in many dental practices because traditional imaging systems are expensive and occupy additional space in dental practices.
[0005] Furthermore, traditional 3D imaging techniques expose patients to high levels of radiation. By some estimates, a patient is over 400 times more likely to get radiation-induced cancer from X-ray computed tomography than from a typical series of dental X-rays. New devices and techniques are needed to increase the availability of 3D medical imaging that also reduce both cost and radiation exposure.
Summary
[0006] Provided are methods and devices that utilize fiducial markers to determine the incident angle of a radiation source for each of a plurality of two-dimensional ("2D") images. The fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image. In some embodiments, the reference objects have physical properties that allow effective normalization of images from diverse sources and positions.
[0007] In an exemplary advantage, some embodiments of the devices and methods may function with various types and manufacturers of radiation projectors and sensors. This may allow dentists to create 3D images using some of the existing 2D equipment already ubiquitous in dental practices. This will expand access to 3D imaging to patients for whom X-ray computed tomography machines are not available for reasons of financial or geographic convenience. Increased access to 3D imaging may in turn improve clinical outcomes for these patients.
[0008] In another exemplary advantage, a 3D image can be generated using small, mobile equipment that does not confine a person within a large machine. Some people, including children or those with claustrophobia or anxiety disorders, for example, may be afraid of large medical devices, including traditional CT machines. Some people have special healthcare needs that limit their ability to enter a CT machine. In another exemplary advantage, in some embodiments the radiation source can be at nearly any angle and/or distance relative to the radiation sensor when capturing a 2D image. In another exemplary advantage, in some embodiments the radiation source does not need to be mounted to a track or a rig, does not need to rotate about a fixed axis, and/or the angle of the radiation source relative to the radiation sensor does not need to be known at the time an image is captured.
[0009] In another exemplary advantage, some embodiments of the devices and methods do not require the fiducial markers to be glued or affixed to the object being imaged, but instead are fixed in a position relative to the radiation sensor. This may reduce patient discomfort during procedures.
[0010] In another exemplary advantage, some embodiments of the devices and methods herein do not require medical professionals and staff to receive significant additional training to operate. In another exemplary advantage, some embodiments of the devices and methods disclosed herein may allow for the creation of 3D images from fewer images or scans than other 3D imaging equipment, and therefore with less radiation exposure. In another exemplary advantage, the desired resolution of the resulting 3D image can be adjusted by increasing or decreasing the number of 2D images incorporated into the 3D image. In another exemplary advantage, the resolution of particular anatomical features can also be adjusted by selecting particular capture angles and radiation exposures for each 2D image incorporated into the 3D image.
[0011] In some embodiments, an apparatus for capturing radiation includes a support, a radiation sensor coupled to the support, and a fiducial marker held by the support at a set distance from the sensor. In some embodiments, a method for capturing a two-dimensional radiographic image includes inserting an apparatus of any of the embodiments described herein into a mouth and capturing the image. In some embodiments, the sensor, sensor holder, and/or the fiducial markers are held static with regard to the object being imaged.
[0012] In some embodiments, an apparatus for holding a radiation sensor includes a support, a radiation sensor holder coupled to the support, and a fiducial marker held by the support at a set distance from the radiation sensor holder.
[0013] In some embodiments, the fiducial marker is pre-aligned with respect to the sensor or sensor holder so that a location of the fiducial marker is known with respect to the sensor. In some embodiments, the set distance is less than 10 mm. In some embodiments, the fiducial marker is less than one cubic centimeter in volume.
[0014] In some embodiments, a side of the sensor has less than four square inches of surface area. In some embodiments, a side of the sensor that can detect radiation has less than four square inches of surface area. In some embodiments, the sensor holder is configured to hold a sensor, wherein a radio sensitive side of the sensor has less than four, three, two, or one square inches of total surface area.
[0015] In some embodiments, the apparatus includes a biting portion extending from the support. In some embodiments, the biting portion holds the sensor, sensor holder, and/or the fiducial markers in a static position relative to the object being imaged.
[0016] In some embodiments, the fiducial marker includes a shape selected from the group consisting of a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc. In some embodiments, the fiducial marker comprises a radiopaque or a semi-radiopaque material. In some embodiments, the radiopaque or the semi-radiopaque material is selected from the group consisting of lead, steel, compounds of barium, barium sulfate, compounds of bismuth, plastic, and thermoplastic. In some embodiments, the support includes at least one, at least two, at least three, at least four, at least five, at least six, at least seven, at least eight, at least nine, or at least ten fiducial markers.
[0017] In some embodiments, the apparatus includes a radiation source for emitting radiation. In some embodiments, the radiation source is an X-ray source. In some embodiments, the apparatus further comprises a sensor in the sensor holder.
[0018] In some embodiments, a method of generating a three dimensional image by combining a plurality of two dimensional images includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, and combining the two dimensional images into a three dimensional image using the polar angle and azimuth angle of each two-dimensional image.
[0019] In some embodiments, the method includes determining a position and
radiodensity of each pixel on a two-dimensional image, mapping the densities of each pixel across a plurality of voxels, and creating the three dimensional image including radiodensity information.
[0020] In some embodiments, analyzing the shadow includes determining a length of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the length of the shadow. In some embodiments, analyzing the shadow includes determining an angle of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the angle of the shadow.
[0021] In some embodiments, calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder includes determining a shadow displacement for each of the plurality of two dimensional images, wherein determining a shadow displacement of each two-dimensional image includes determining a position of a center of a shadow in a two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder, and comparing a position of a center of a shadow in each image to the center of the shadow in the two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder.
[0022] In some embodiments, a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor and/or sensor holder. In some embodiments, an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor and/or sensor holder. In some embodiments, the direction of the elliptical shadow indicates the azimuth angle of the radiation source relative to the sensor and/or sensor holder. In some embodiments, the length of the major axis of the elliptical shadow indicates a capture angle of the source relative to the sensor and/or sensor holder.
[0023] In some embodiments, a first two-dimensional image of the plurality of two dimensional images has a first capture angle and a second two dimensional image of the plurality of two dimensional images has a second capture angle, wherein the first capture angle and second capture angle are different. In some embodiments, the method includes displaying the three dimensional image on a display.
[0024] In some embodiments, the method of generating an image includes obtaining two-dimensional images, determining, for each two-dimensional image, a relative position of a sensor and/or sensor holder and a radiation source used to capture the respective two-dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the relative position of the sensor and/or sensor holder and the radiation source used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
[0025] In some embodiments, the method includes normalizing each two-dimensional image by analyzing a radiodensity gradient created by an object captured in each two-dimensional image. In some embodiments, the object can be captured in one or more of the plurality of images. In some embodiments, normalizing each two-dimensional image includes adjusting a parameter in each two-dimensional image. In some embodiments, the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the two-dimensional images.
[0026] In some embodiments, the relative position includes a polar angle between a plane of the sensor and/or sensor holder and a direction of a beam emitted from the radiation source. In some embodiments, generating the three-dimensional image includes overlapping the three dimensional volumes. In some embodiments, generating the three-dimensional image includes identifying empty voxels. In some embodiments, generating the three-dimensional image includes estimating an intensity value of a non-empty voxel. In some embodiments, estimating the intensity value of the non-empty voxel includes averaging an array of potential values for the non-empty voxel. In some embodiments, estimating the intensity value of the non-empty voxel includes selecting the highest value from an array of potential values for the non-empty voxel. In some embodiments, generating the three-dimensional image includes iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value among one or more related voxels. In some embodiments, generating the three-dimensional image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone.
[0027] In some embodiments, the method includes applying an anti-aliasing algorithm to the three-dimensional image. In some embodiments, the method includes displaying the three dimensional image on a display.
[0028] In some embodiments, the method includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
[0029] In some embodiments, a non-transitory computer-readable storage medium includes computer-readable instructions, which when executed by one or more processors, causes the one or more processors to perform the method of any one of the embodiments described herein.
Brief Description of the Drawings
[0030] Figure 1 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
[0031] Figure 2A shows the difference in shadows created on a radiation sensor by a spherical fiducial marker, in accordance with an embodiment.
[0032] Figure 2B shows an overhead view of the different shadows created by changing the polar angle Θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment.
[0033] Figure 2C shows the shadows created by two metal bars.
[0034] Figure 2D shows the shadow created by two metal bars when one obscures the other from the beam of radiation.

[0035] Figure 3 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
[0036] Figure 4A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
[0037] Figure 4B shows a front view of an exemplary device in accordance with an embodiment.
[0038] Figure 4C shows a side view of an exemplary device in accordance with an embodiment.
[0039] Figure 4D shows an isometric rear view of an exemplary device in accordance with an embodiment.
[0040] Figure 5A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
[0041] Figure 5B shows a cut-away front view of an exemplary device in accordance with an embodiment.
[0042] Figure 6 shows two radiographs produced by a fixed radiation sensor when imaging the same metal bars from different capture angles.
Detailed Description
[0043] In the following description of embodiments, reference is made to the
accompanying drawings which form a part hereof, and which show, by way of illustration, specific embodiments that can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the disclosed embodiments.
[0044] In some embodiments, methods and devices utilize fiducial markers to determine the incident angle of a radiation source for each of a plurality of 2D images. The fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image.
[0045] As an exemplary advantage, some of the embodiments may be agnostic to the system used to capture the 2D images. Such embodiments allow for the use of traditional 2D imaging technology, which reduces both the cost and space required to generate a 3D image. Further, current 2D imaging techniques provide an acceptable level of radiation; leveraging current 2D imaging techniques may allow a 3D image to be generated without exposing the subject to more radiation than needed to create a typical set of 2D images.

[0046] As used herein, a fiducial marker can be understood to be an object placed in the field of view of an imaging system that provides a reference point on an image produced by the imaging system. In some embodiments, the fiducial marker appears as a shadow in the image produced, for use as a point of reference or a measure. In some embodiments, the position, size, and/or shape of the shadow created by a fiducial marker changes depending on the relative angle of the radiation source and the radiation sensor. Thus, in some
embodiments the fiducial markers allow for the relative position of one image to be correlated to the relative position of another image in 3D space.
[0047] In some embodiments, the position of the radiation source can be defined relative to the radiation sensor using polar coordinates or spherical coordinates. In some
embodiments, the spherical coordinates are defined as p, φ, and/or Θ, where p is radial distance, Θ is the polar angle measured from a fixed zenith orthogonal to a reference plane, e.g. a plane of the sensor, and φ is the azimuthal angle.
[0048] Figure 1 illustrates a method 100 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment. Method 100 includes obtaining a plurality of 2D images 101, wherein each 2D image includes a shadow from a fiducial marker. In dental applications, obtaining a plurality of 2D images 101 may include inserting a device in accordance with any of the embodiments described herein into the mouth of a patient. In some applications, obtaining a plurality of 2D images 101 may include placing a device in accordance with any of the embodiments described herein next to an object to be imaged. In some applications, obtaining a plurality of 2D images 101 may include affixing a device in accordance with any of the embodiments described herein to an object to be imaged. In some embodiments, Method 100 also includes analyzing the shadow to determine shape characteristics of the shadow 102, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor for each 2D image 103, and combining the 2D images into a 3D image using the polar angle and azimuth angle of each 2D image 104.
[0049] In some embodiments, the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.
[0050] Figure 2A shows the difference in shadows created on a radiation sensor 201 by a spherical fiducial marker 202, in accordance with an embodiment. A spherical fiducial marker 202 can create a circular shadow 204 when a radiation source 221 is positioned orthogonally to the radiation sensor 201. 203 depicts the beam of radiation emitted by radiation source 221. A radiation source 222 that is not orthogonal to the radiation sensor 201 can create an elliptical shadow 205. 230 depicts the beam of radiation emitted by radiation source 222. This can occur because the plane of the radiation sensor 201 intersects the cone of the shadow in 3D space at an incline. 206 depicts the distance between the center of the shadow created by radiation source 221 and the center of the shadow created by radiation source 222. 207 depicts the distance between the sensor 201 and the fiducial marker 202.
[0051] Figure 2B shows an overhead view of the different shadows created by changing the polar angle Θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment. When Θ is 0, the radiation source is orthogonal to the radiation sensor. This creates a circular shadow 208 on the radiation sensor, which in this view is obscured by the fiducial marker 223. Shadows 209, 210, and 211 depict shadows created by fiducial markers 224, 225, and 226, respectively, using radiation sources with progressively increasing Θ. As Θ increases, the shadow can become more elliptical. Thus, shadows 209, 210, and 211 have progressively longer major axes because they are produced by radiation sources with progressively increasing Θ and constant φ relative to the plane of the radiation sensor.
[0052] The azimuthal angle φ can be determined by the direction of the fiducial shadow. Shadows 212, 213, and 214 each represent a shadow created by fiducial markers 227, 228, and 229, respectively, using radiation sources with the same Θ, but with different φ. Thus, in some embodiments, the angle of the major axis of an elliptical shadow relative to the frame of an image is determined by the azimuthal angle φ of the radiation source relative to the sensor.
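The relationships described above can be sketched in code. The following is an illustrative sketch only, not the patented method itself: it assumes a nearly parallel beam, so that the shadow of a spherical marker keeps a minor axis close to the sphere's diameter while its major axis stretches by roughly 1/cos(Θ), giving Θ ≈ arccos(minor/major), with φ read from the orientation of the major axis in the image frame. The function name is hypothetical.

```python
import math

def capture_angles_from_ellipse(major_axis, minor_axis, axis_angle_deg):
    """Estimate capture angles from a spherical marker's elliptical shadow.

    Parallel-beam approximation (illustrative only): the minor axis stays
    close to the sphere's diameter while the major axis stretches by about
    1/cos(theta), so theta ~ arccos(minor/major). The azimuth phi is taken
    directly from the orientation of the major axis in the image frame.
    """
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    theta = math.degrees(math.acos(ratio))   # polar angle, 0 = orthogonal
    phi = axis_angle_deg % 360.0             # azimuthal angle
    return theta, phi

# A circular shadow (equal axes) implies an orthogonal source (theta = 0).
print(capture_angles_from_ellipse(10.0, 10.0, 0.0))
# A 2:1 ellipse implies a polar angle of about 60 degrees.
print(capture_angles_from_ellipse(20.0, 10.0, 45.0))
```

With a point source at a finite distance, the shadow is a conic section rather than a simple stretched circle, so a practical implementation would refine this estimate, for example by the displacement method described below in paragraph [0060].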
[0053] In some embodiments, the position of the center of the fiducial shadow can also be dependent on the spherical coordinates Θ and φ of the radiation source relative to the radiation sensor. Returning to Figure 2A, a shadow created by a radiation source 221 orthogonal to the radiation sensor creates a circular shadow 204 directly below the fiducial marker. In some embodiments, the center of a shadow created by a radiation source with different Θ and/or φ coordinates 222 creates a shadow with a center position displaced by a distance determined by Θ and in a direction determined by φ. As will be described, in some embodiments the spherical coordinates of a radiation source relative to a radiation sensor used to create an image can be calculated by the shape, position, and size characteristics of fiducial shadows.

[0054] Therefore, in some embodiments, fiducial markers create shadows on each 2D image, wherein the shape, position, and/or size of the shadow are determined by the relative position of the radiation source and the radiation sensor (defined as the capture angle) when the image was captured. In another exemplary advantage, some embodiments of the devices and methods herein allow for the generation of 3D images by determining the capture angle of each image from the image itself, and therefore without physically measuring the capture angle of the radiation source and the radiation sensor at the time each image is taken.
[0055] Returning to Figure 1, combining the 2D images into a 3D image 104 can include determining a position and radiodensity of each pixel on a 2D image, mapping the densities of each pixel across a plurality of voxels, and creating the 3D image.
[0056] Analyzing each image to identify and characterize the shadow created by the fiducial marker 102 can include manually tracing the outline of a shadow created by the fiducial markers. Analyzing each image to identify and characterize the shadow created by the fiducial marker 102 can include manually inputting the location of the shadows.
[0057] In some embodiments, software determines the location of shadows created by the fiducial markers. In some embodiments, the software can locate the shadow by blob detection methods. In some embodiments, blob detection methods detect regions in a digital image that differ in properties, such as brightness or color, compared to surrounding regions. For example, the blob detection method can be a difference of Gaussians method. In some embodiments, edges of shadows created by fiducial markers can be determined by any of the edge detection methods known in the art. Exemplary methods include Canny edge detection, morphological thinning, a wavelet transform, or a combination of any of these methods. In some embodiments, shape characteristics for the shadow created by the fiducial marker can be determined by a Hough transform method. In some embodiments, the shape
characteristics determined by a Hough transform method include, for example, shadow dimensions, including the lengths of the major and minor axes.
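As a concrete sketch of extracting shape characteristics, a simple moments-based analysis can stand in for the blob-detection and Hough-transform methods named above. The sketch below (function name hypothetical, pure Python for clarity) computes the centroid, axis lengths, and major-axis orientation of a segmented shadow from its image moments; the axis lengths follow from the eigenvalues of the blob's covariance matrix.

```python
import math

def shadow_shape_characteristics(binary_image):
    """Estimate centroid, axis lengths, and orientation of a shadow blob.

    binary_image is a list of rows of 0/1 values, where 1 marks a shadow
    pixel. An image-moments sketch standing in for the blob detection and
    Hough-transform analysis described in the text.
    """
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(binary_image):
        for x, v in enumerate(row):
            if v:
                m00 += 1.0
                m10 += x
                m01 += y
    cx, cy = m10 / m00, m01 / m00
    mu20 = mu02 = mu11 = 0.0
    for y, row in enumerate(binary_image):
        for x, v in enumerate(row):
            if v:
                mu20 += (x - cx) ** 2
                mu02 += (y - cy) ** 2
                mu11 += (x - cx) * (y - cy)
    mu20, mu02, mu11 = mu20 / m00, mu02 / m00, mu11 / m00
    # Eigenvalues of the covariance matrix give the squared axis scales;
    # for a uniform ellipse the full axis length is 4 * sqrt(eigenvalue).
    common = math.sqrt(4.0 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam_max = (mu20 + mu02 + common) / 2.0
    lam_min = max(0.0, (mu20 + mu02 - common) / 2.0)
    major = 4.0 * math.sqrt(lam_max)
    minor = 4.0 * math.sqrt(lam_min)
    angle = 0.5 * math.degrees(math.atan2(2.0 * mu11, mu20 - mu02))
    return (cx, cy), major, minor, angle

# A wide horizontal bar yields a near-zero orientation and a longer major axis.
bar = [[1 if 2 <= x <= 10 and 2 <= y <= 4 else 0 for x in range(13)]
       for y in range(7)]
(cx, cy), major, minor, angle = shadow_shape_characteristics(bar)
```

A production implementation would more likely use library routines for contour extraction and ellipse fitting, but the quantities returned here (centroid, major and minor axes, orientation) are exactly the shape characteristics the method consumes.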
[0058] In some embodiments, the method includes determining p, φ, and/or Θ for the position of the radiation source relative to the sensor by analyzing the size, shape and/or position characteristics of shadows created by the fiducial markers. For example, analyzing the shadow 102 can include determining a length of the shadow and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor from the length of the shadow.

[0059] Analyzing the shadow 102 can include determining an angle of the shadow and calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor from the angle of the shadow. In some embodiments, φ can be calculated from an angle of a major axis of an elliptical shadow relative to the frame of the image.
[0060] Analyzing the shadow 102 can include determining a shadow displacement for each of the plurality of 2D images, wherein determining a shadow displacement (e.g., shadow displacement 206 in Figure 2A) of each 2D image includes determining a position of a center of a shadow in a 2D image taken when the radiation source is orthogonal to the radiation sensor, and comparing a position of a center of a shadow in each image to the center of the shadow in the 2D image taken when the radiation source is orthogonal to the radiation sensor. Calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor can include analyzing the shadow displacement. In some embodiments, the difference in major axis angles of shadows in two or more images created by the same fiducial marker can be used to calculate the relative difference in φ between the images (e.g. the angles of shadows 212, 213, and 214 in Figure 2B). In some embodiments, Θ can be calculated by the length 206 of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in an image and the actual or expected position of a shadow from the same fiducial marker in an actual or theoretical image taken using a radiation source orthogonal to the radiation sensor. In some embodiments, the actual or theoretical positions are the x and y coordinates relative to the frame of the image sensor. In some embodiments, the relative difference in Θ between two images can be calculated by the length of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in a first image and the position of a shadow from the same fiducial marker in a second image. In some embodiments, the length 206 divided by the distance 207 between the fiducial marker and the plane of the sensor equals tan(Θ).
In some embodiments, Θ can be calculated from a combination of any or all of the methods described herein.
[0061] In some embodiments, φ can be calculated from a combination of any or all of the methods described herein. In some embodiments, calculating φ of the radiation source relative to the radiation sensor includes calculating the angle of a line 206 drawn between the position of the center of the elliptical shadow from a fiducial marker in a first image with a first φ and the expected position of the center of a shadow from the same fiducial marker in an actual or theoretical second image wherein the radiation source is orthogonal to the sensor.

[0062] In some embodiments, p can be calculated by the length of the minor axis of an elliptical shadow created by a spherical fiducial marker. In some embodiments, a radiation source positioned at a larger p will create an ellipse with a smaller minor axis than when radiating the same fiducial marker from a position with a smaller p relative to the radiation sensor.
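The displacement-based calculation of paragraphs [0060] and [0061] reduces to two arctangents. The sketch below is illustrative only (function name hypothetical) and assumes the marker's height above the sensor plane (distance 207 in Figure 2A) is known: the polar angle follows from length/height = tan(Θ), and the azimuth from the direction of the displacement vector.

```python
import math

def angles_from_displacement(dx, dy, marker_height):
    """Estimate polar and azimuth angles from the shift of a marker's shadow.

    (dx, dy) is the displacement, in the sensor plane, of the shadow center
    from its position under an orthogonal source (206 in Figure 2A), and
    marker_height is the distance between the marker and the sensor plane
    (207 in Figure 2A). Uses length / height = tan(theta); an illustrative
    sketch that ignores the finite distance to the source.
    """
    displacement = math.hypot(dx, dy)
    theta = math.degrees(math.atan2(displacement, marker_height))  # polar
    phi = math.degrees(math.atan2(dy, dx)) % 360.0                 # azimuth
    return theta, phi

# A 5 mm shift for a marker 5 mm above the sensor implies theta = 45 degrees.
print(angles_from_displacement(3.0, 4.0, 5.0))
```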
[0063] In some embodiments, converting shadow information into p, φ, and/or Θ information involves a linear regression calculation using the shape characteristics. In some embodiments, precision and/or accuracy can be improved by analyzing shape characteristics for a plurality of shadows created by a plurality of fiducial markers in each of one or more images. In some embodiments, analyzing the shape characteristics for a plurality of shadows includes a regression analysis.
[0064] In some embodiments, a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor (e.g. shadow 204 in Figure 2A). In some embodiments, an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor (e.g. shadow 205 in Figure 2A). In some embodiments, the direction of the elliptical shadow 205 indicates the azimuth angle of the radiation source relative to the sensor.
[0065] In some embodiments, a first 2D image of the plurality of 2D images has a first capture angle, and a second 2D image of the plurality of 2D images has a second capture angle, wherein the first capture angle and the second capture angle are different. In some embodiments, the plurality of images are captured from different angles using a radiation source rotated about an axis of rotation. In some embodiments, some or all of the images are captured with the radiation source having a different set of spherical coordinates relative to a reference plane, wherein the spherical coordinates are defined as p, φ, and/or Θ, or a different combination of p, φ, and/or Θ. In some embodiments, the reference plane is the plane of the sensor.
[0066] Figure 3 illustrates a method 300 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment. Method 300 includes obtaining a plurality of 2D images 301, determining a relative position of a sensor and a radiation source used to capture each 2D image 302, creating a 3D volume for each 2D image by projecting each 2D image in a direction of a relative position of the sensor and the radiation source used to capture the 2D image 303, and generating a 3D image by correlating the volumes associated with the 2D images 304. In some embodiments, the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.

[0067] The method 300 can include normalizing each 2D image by analyzing a radiodensity gradient created by an object captured in each 2D image. In some embodiments, the radiodensity gradient includes radiodensities equivalent to the radiodensity of features. In some embodiments, the radiodensity gradient is created by a plurality of fiducial markers with varying radiodensities. In some embodiments, the radiodensity gradient is created by fiducial markers with radiodensities equivalent to the radiodensity of one or more features. In some embodiments, the radiodensity gradient and/or fiducial markers with varying radiodensities help identify features in a 2D or 3D image. In some embodiments,
normalizing each 2D image includes adjusting a parameter in each 2D image. In some embodiments, the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the 2D images. These parameters can vary between images due to differences in the distance between the radiation source and the radiation sensor, the amount of radiation generated by the radiation source, the focus of the beam of radiation, and variability between different radiation sources and sensors. In some embodiments, these adjustments are used to normalize each image so that the range of values on each image representing features are approximately consistent. Example anatomical features in dental applications include caries, dentin, carious dentin, enamel, carious enamel, cementum, carious cementum, bone, gum, and other tissues.
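The gamma, brightness, and contrast adjustments described above can be sketched as a single per-pixel mapping. The snippet below is a minimal illustration (function name and parameter defaults hypothetical); in practice the parameter values would be fitted so that the radiodensity gradient or fiducial markers read consistently across images.

```python
def normalize_image(image, gamma=1.0, brightness=0.0, contrast=1.0):
    """Normalize a grayscale 2D image with values in 0..255.

    An illustrative sketch of the gamma/brightness/contrast adjustment
    described above; the parameter values would in practice be chosen by
    comparing a radiodensity gradient captured in each image.
    """
    out = []
    for row in image:
        new_row = []
        for p in row:
            v = (p / 255.0) ** gamma           # gamma correction
            v = (v - 0.5) * contrast + 0.5     # contrast about mid-gray
            v = v + brightness                 # brightness offset
            new_row.append(min(255, max(0, round(v * 255))))
        out.append(new_row)
    return out

# Identity parameters leave the image unchanged.
print(normalize_image([[0, 128, 255]]))  # [[0, 128, 255]]
```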
[0068] In some embodiments, a radiation sensor used to obtain a plurality of images 301 will be exposed non-uniformly if the radiation sensor is not orthogonal to the radiation source in at least one of the plurality of images. In some embodiments, the radiation source is a cone beam source or emits radiation in the shape of a cone. In some embodiments, the radiation source emits parallel rays of radiation. In some embodiments, the radiation source includes a collimator. In some embodiments, a side of the radiation sensor further away from the radiation source will be exposed by less radiation than a side closer to the radiation source. In some embodiments, normalizing the 2D images includes adjusting each pixel for the distance to the radiation source. In some embodiments, differences in radiation exposure of each pixel caused by the differences in distance to the source can be used to determine p, φ, and/or Θ of the radiation source relative to the radiation sensor 302. In some embodiments, differences in radiation exposure caused by distance to the radiation source can be
determined by using a plurality of fiducial markers and/or radiopacity gradients with known radiopacities.

[0069] In some embodiments, the relative position includes a polar angle between a plane of the sensor and a direction of a beam emitted from the radiation source.
[0070] Step 303 can include creating, for each 2D image, a 3D volume by projecting each 2D image in a direction of the relative position of the sensor and the radiation source used to capture the respective 2D image. In some embodiments, each 2D image is projected from a boundary. In some embodiments, the projection creates a 3D space for each image where each pixel maps to a line of voxels, and each voxel along the line of voxels is assigned the same intensity value as the corresponding pixel for the image. In some embodiments, an intensity value represents the amount of radiation that reaches the sensor. In some embodiments, an intensity value has an inverse relationship to a radiopacity value, which can represent the amount of radiation obstructed from reaching the sensor.
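The projection step can be sketched as follows. This is an illustrative nearest-voxel approximation (function name hypothetical): each depth layer of the volume is the source image shifted by an offset that grows with depth according to (Θ, φ), so every pixel maps to a tilted line of voxels that all carry the pixel's intensity, and voxels whose line exits the image are left unset.

```python
import math

def project_image_to_volume(image, depth, theta_deg, phi_deg):
    """Project a 2D image into a 3D volume along the capture direction.

    Each pixel is smeared along a line of voxels tilted by (theta, phi), so
    every voxel on the line receives the pixel's intensity; voxels whose
    line leaves the image get None. A nearest-voxel sketch of the
    parallelepiped projection described above.
    """
    h, w = len(image), len(image[0])
    tan_t = math.tan(math.radians(theta_deg))
    dx = tan_t * math.cos(math.radians(phi_deg))  # per-layer x shift
    dy = tan_t * math.sin(math.radians(phi_deg))  # per-layer y shift
    volume = []
    for z in range(depth):
        sx, sy = int(round(z * dx)), int(round(z * dy))
        layer = [[image[y - sy][x - sx]
                  if 0 <= y - sy < h and 0 <= x - sx < w else None
                  for x in range(w)] for y in range(h)]
        volume.append(layer)
    return volume

# An orthogonal capture (theta = 0) repeats the image at every depth,
# i.e. the rectangular 3D space described in paragraph [0071].
vol = project_image_to_volume([[1, 2], [3, 4]], depth=3, theta_deg=0, phi_deg=0)
```

A nonzero Θ produces the sheared, parallelepiped-shaped space: each successive layer is offset further in the direction given by φ.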
[0071] Therefore, the shape of each projection can be determined by a boundary and a direction of the projection. For example, in some embodiments, an image taken using a radiation source orthogonal to the sensor generates a rectangular 3D space. In some embodiments, an image taken using a radiation source at any other angle generates a parallelepiped shaped 3D space, wherein one wall of the 3D space represents the boundary, and the parallelepiped extends from the sensor in a direction toward a point having φ and Θ calculated for that particular image.
[0072] In some embodiments, each image is projected from a rear boundary of the volumetric model. In some embodiments, the rear boundary corresponds to the position of the radiation sensor. In some embodiments, the position of the radiation sensor does not change within the model for each projection. In some embodiments, the position of the radiation sensor may be held constant while the plurality of images are captured. In some embodiments, the position of the radiation source used to capture the 2D images can move along spherical coordinates defined as p, φ, and/or Θ, where p is radial distance, Θ is the polar angle, and φ is the azimuthal angle.
[0073] Any appropriate digital geometry processing method can be used to create a 3D image from a plurality of 2D images. In some embodiments, each 2D image includes a series of pixels arranged in 2D space along x and y coordinates. In some embodiments, each pixel has an intensity value corresponding to the combined radiopacity of all radiopaque material between the radiation source and the radiation sensor.
[0074] Generating the 3D image 304 can include overlapping the 3D volumes. In some embodiments, the method includes generating a 3D image by correlating 3D volumes associated with the 2D images. In some embodiments, an image may comprise lines of voxels that intersect lines of voxels from images with different φ and Θ values. In some embodiments, each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines that intersect within each voxel's boundaries.
[0075] Generating the 3D image 304 can include identifying empty voxels. In some embodiments, outlines are created around empty volumetric regions of the 3D image. In an exemplary method, areas of an image, including, for example, a 2D image, that appear black or dark may not have a radiopaque object obstructing the radiation directed to that part of the sensor. In some embodiments, this indicates a clear path or relatively clear path between the source and the sensor. In some embodiments, any voxels that overlap a black line in a 2D image may be empty space regardless of the other intersecting line values corresponding to other 2D images. Thus, in some embodiments, the radiopacity or intensity values of those other intersecting lines can be attributed to the other voxels.
[0076] Generating the 3D image 304 can include estimating an intensity value of a non-empty voxel. In some embodiments, estimating an intensity value of a non-empty voxel includes selecting the highest intensity value from the array of intensity values for that voxel. As previously described, in some embodiments, each pixel of a 2D image has an intensity value determined by the amount of radiation that the sensor detects. In some embodiments, each pixel in a 2D image represents the entire radiation that passes through the radiopaque material along a path in 3D space. The path follows a line from the radiation source through the radiopaque material to the sensor. In some embodiments, each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines projected from pixels in each of the plurality of 2D images that intersect within each voxel's boundaries. In some embodiments, darker areas of an image indicate higher intensity values, which can indicate less radiopaque material along the path of radiation that extends from the sensor to a point having φ and Θ calculated for that particular image. Therefore, in some embodiments, the method includes selecting the highest intensity value, which can represent the clearest path between the radiation source and the sensor. In some embodiments, the radiopacity associated with unselected values in the array of values assigned to a voxel may belong to other voxels along other paths of radiation with different φ and Θ values that intersect that voxel. Therefore, in some embodiments, the smallest radiopacity value in the array of values is selected to represent that voxel, which represents the highest amount of radiation exposure. Generating the 3D image 304 can include a Radon transform and/or a Penrose transform.

[0077] In some embodiments, outlines are created around volumetric regions determined to have higher intensity values relative to other volumetric regions.
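The two voxel rules described in paragraphs [0075] and [0076], clearing any voxel crossed by a fully exposed ray and otherwise keeping the highest candidate intensity, can be sketched together. This is an illustrative sketch (function name hypothetical) that assumes 8-bit intensities where 255 denotes full exposure, i.e. a clear path to the sensor.

```python
EMPTY = 255  # full exposure: a fully exposed ("black") ray, a clear path

def combine_volumes(volumes):
    """Combine overlapping back-projected volumes into one 3D image.

    Each volume is a depth x height x width nested list of intensities (or
    None outside the projection). A voxel crossed by a fully exposed ray in
    any projection is treated as empty; otherwise it takes the highest
    intensity in its array of candidate values, i.e. the smallest
    radiopacity, as described above.
    """
    depth, h, w = len(volumes[0]), len(volumes[0][0]), len(volumes[0][0][0])
    out = []
    for z in range(depth):
        layer = []
        for y in range(h):
            row = []
            for x in range(w):
                values = [v[z][y][x] for v in volumes
                          if v[z][y][x] is not None]
                if not values or EMPTY in values:
                    row.append(EMPTY)        # empty space
                else:
                    row.append(max(values))  # clearest remaining path
            layer.append(row)
        out.append(layer)
    return out
```

The radiopacity excluded by the `max` selection is not discarded in the full method; paragraph [0079] describes redistributing it to the other voxels along the same rays.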
In some embodiments, this process is repeated with progressively smaller intensity values.
[0078] In some embodiments, estimating the intensity value of the non-empty voxel includes averaging the array of potential values for each voxel.
[0079] Generating the 3D image 304 can include iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value of the overlapping voxels. In some embodiments, a voxel relates to another voxel if a path of radiation intersects both voxels before reaching the same pixel on the radiation sensor in the same 2D image. Thus, in some embodiments having a 3D image created from a plurality of 2D images, each voxel can relate to a different set of voxels for each image. In some embodiments, each value for each voxel can be cross-checked against all of the other voxels it relates to in order to ensure that radiopacity attributed to a voxel is consistent with each overlapping pixel. In some embodiments, each voxel's intensity value can be iteratively cross-checked against each of the paths of voxels determined to have less radiopaque material than other paths in order to distribute radiopacity to other voxels that intersect with other paths determined to have relatively more radiopaque material. In some embodiments, each voxel's intensity value can be iteratively cross-checked against one or more fiducial markers with varying radiopacity. In some embodiments related to dental applications, generating a 3D image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone. Additionally, in some embodiments, once a voxel is identified as dentin, enamel, or another feature, the value for that voxel can be set and the remaining radiopacity redistributed among other voxels in lines associated with the identified voxel.
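One way to realize the iterative cross-checking described in paragraph [0079] is an additive update in the spirit of algebraic reconstruction techniques (ART): each radiation path's voxel sum is compared against the pixel total it must reproduce, and any discrepancy is spread across the path's voxels. The data layout and equal-split update rule below are illustrative assumptions, not details taken from the specification:

```python
import numpy as np

def redistribute(voxels, rays, measured, iterations=50):
    """Iteratively cross-check voxel intensities against the per-pixel
    totals they must reproduce, ART-style.

    Each entry of 'rays' is the list of voxel indices one radiation path
    intersects before reaching a pixel; the matching entry of 'measured'
    is that pixel's recorded total.  Any residual between the measured
    total and the current voxel sum is split evenly along the path.
    """
    x = np.array(voxels, dtype=float)
    for _ in range(iterations):
        for ray, total in zip(rays, measured):
            residual = total - x[ray].sum()
            x[ray] += residual / len(ray)  # spread the discrepancy evenly
    return x
```

With two voxels, a ray through voxel 0 measuring 1.0 and a ray through both voxels measuring 3.0, the iteration converges to intensities of 1.0 and 2.0, consistent with both paths.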
[0080] In some embodiments, it may be desirable to create a 3D image using the region of the projection of a 2D image that overlaps with a region of at least one other projection. In some embodiments, the 3D image can be corrected for radiopacity that only appears in a single projection. In some embodiments, the intensity values attributable to areas that only appear in one projection can be removed and the remaining voxel values adjusted
accordingly.
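The overlap-only correction described in paragraph [0080] could be sketched as a masking step, assuming each projection's footprint through the volume is available as a boolean coverage mask. The representation and function name are illustrative assumptions:

```python
import numpy as np

def restrict_to_overlap(values, coverage_masks):
    """Keep only voxels that fall inside the projected region of at least
    two 2D images; intensity that appears in a single projection is
    zeroed out.

    'values' is the voxel intensity array; 'coverage_masks' is a list of
    boolean arrays (same shape as 'values'), each marking the voxels one
    projection passes through.
    """
    coverage = np.sum(coverage_masks, axis=0)  # projections covering each voxel
    return np.where(coverage >= 2, values, 0.0)
```

In practice the retained voxel values would then be re-adjusted, as the text notes, since removed intensity must be redistributed among the remaining paths.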
[0081] In some embodiments, the relative position is a position calculated as p, φ, and/or Θ by any of the methods contained herein. In some embodiments, the 3D image is created by using the relative difference in p, φ, and/or Θ between two or more images.

[0082] In dental applications, generating the 3D image 304 can include identifying whether the non-empty voxel includes one selected from the group consisting of: dentin, enamel, cavity, gum, and bone.
[0083] Generating the 3D image 304 can include applying an anti-aliasing algorithm to the 3D image. In some embodiments, a 3D contour is created from the 3D image.
[0084] Figure 4A represents an isometric view of the front of an exemplary device in accordance with an embodiment. Figure 4B represents a front view of an exemplary device in accordance with an embodiment. Figure 4C represents a side view of an exemplary device in accordance with an embodiment. Figure 4D represents an isometric rear view of an exemplary device in accordance with an embodiment. An exemplary embodiment includes a plurality of supports for a sensor 402, and one or more fiducial markers 403. In some embodiments, the fiducial markers are held in a known and/or fixed position and/or distance relative to the supports for the sensor 402. The supports can be any size. In some embodiments, the supports have a width of between 0.1 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches. The supports can be configured to hold a sensor of any size. In some embodiments, including some embodiments for dental applications, the plurality of supports for a sensor are configured to hold a sensor with a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches. Some embodiments comprise a bite plate 404. In some embodiments, the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches. A sensor can be coupled to the plurality of supports 402 in some embodiments.
[0085] Figure 5 illustrates an exemplary device in accordance with an embodiment. Figure 5A represents an isometric view of the front of an exemplary device in accordance with an embodiment. Figure 5B represents a cut-away front view of an exemplary device in accordance with an embodiment. An exemplary embodiment includes a radiation sensor 501, and one or more fiducial markers 505. In some embodiments, the one or more fiducial markers are contained within supports 502 coupled to the radiation sensor. The sensor can be any size. In some embodiments, including some embodiments for dental applications, the sensor can have a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches. In some embodiments, the fiducial markers are held in a known and/or fixed position and/or distance relative to the sensor 501. Some embodiments comprise a bite plate 504. In some embodiments, the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches.
[0086] The sensor can be any type of radiation sensor. In some embodiments, the radiation sensor is an X-ray sensor adapted for use in capturing dental images. In some embodiments, the radiation sensor is a digital sensor.
[0087] In some embodiments, the position of at least one of the one or more fiducial markers is fixed relative to the position of the sensor. In some embodiments, the fiducial marker obstructs or partially obstructs radiation emitted from the radiation source from reaching the sensor, thereby creating a shadow in the resulting image.
[0088] In some embodiments, the one or more fiducial markers comprise 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, or more fiducial markers. In some embodiments, the one or more fiducial markers 405 are each aligned at a set distance and/or position from the sensor and from the other fiducial markers so that a location of each fiducial marker is known with respect to the sensor and the other fiducial markers. In some embodiments, the distance between one or more of the fiducial markers and the sensor is less than 10 mm, e.g., about 0.25 mm, about 0.5 mm, about 1 mm, about 1.5 mm, about 2 mm, about 3 mm, about 4 mm, about 5 mm, about 6 mm, about 7 mm, about 8 mm, about 9 mm, or about 10 mm, or any other distance within that range. In some embodiments, one or more of the fiducial markers are in contact with the sensor, i.e., the one or more fiducial markers are immediately adjacent to the sensor. In some embodiments, the sensor includes one or more fiducial markers.
[0089] In some embodiments, the one or more fiducial markers have a known size. The fiducial markers can be any size that creates a shadow detectable by the radiation sensor. In some embodiments, the size of the fiducial marker is the same size as, or between 1 and 100 times (e.g., about 1, 2, 3, 4, 5, 6, 7, 8, 10, 15, 20, 25, 30, 40, 50, 60, 70, 80, 90, or 100 times), the resolution of a pixel in the radiation sensor. In some embodiments, the size is between 0.01 and 1 cubic centimeters in volume. In some embodiments, the fiducial marker has a diameter between about 0.1 mm and about 5.0 mm (e.g., about 0.1 mm, 0.2 mm, 0.3 mm, 0.4 mm, 0.5 mm, 0.6 mm, 0.7 mm, 0.8 mm, 0.9 mm, 1.0 mm, 1.25 mm, 1.5 mm, 1.75 mm, 2.0 mm, 2.5 mm, 3.0 mm, 3.5 mm, 4.0 mm, 4.5 mm, or 5.0 mm).
[0090] The fiducial marker can be any radiopaque mass with a shape that creates a shadow. In some embodiments, the shape of the one or more fiducial markers can be a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc. In some embodiments, the one or more fiducial markers comprise a plurality of shapes. In some embodiments, the fiducial marker is radiopaque. In some embodiments, the fiducial marker is partially radiopaque.
[0091] In some embodiments, the position of the sensor relative to the object being imaged is held constant or relatively constant.
[0092] In some embodiments, the location of objects in 3D space captured in 2D images can be determined by using the "buccal object rule," also known as the SLOB rule (Same Lingual; Opposite Buccal), demonstrated in Figure 6. Briefly, in some embodiments, whether a first object is in front of or behind a second object from the perspective of the radiation source can be determined by analyzing two images captured with a radiation source at different Θ and/or φ angles. If an object moves in the same direction as the source of the x-ray beam, it is behind (lingual to) the other object. If the object moves in the opposite direction of the source, it is in front of (buccal to) the other object.
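A minimal sketch of the SLOB decision, assuming shadow and source movements between two exposures are encoded as signed horizontal shifts in the image plane (the sign convention and function name are illustrative assumptions):

```python
def slob(object_shift, source_shift):
    """Apply the buccal object rule (SLOB: Same Lingual, Opposite Buccal).

    Signs encode horizontal direction between two exposures, e.g. +1 for
    a rightward move and -1 for a leftward move.  A shadow that moves the
    same way as the radiation source belongs to an object lying lingual
    (behind) the reference object; an opposite move means buccal (in
    front).  This sign convention is an illustrative assumption.
    """
    if object_shift * source_shift > 0:
        return "lingual"        # same direction as the source: behind
    if object_shift * source_shift < 0:
        return "buccal"         # opposite direction: in front
    return "indeterminate"      # no relative movement observed
```

For example, a shadow that shifts right while the source also moved right would be classified lingual.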
[0093] Figure 6 depicts two different radiographs (605 and 607) produced by a radiation sensor 601 from imaging the same metal bars (602 and 603) using radiation sources at two different angles (604 and 606). In the first radiograph 605, the radiation source 604 is orthogonal to the radiation sensor, and shadows from the metal bars 602 and 603 appear in the radiograph in almost the same relationship that they share in reality. By shifting the radiation source 606 to one side and tilting it towards the objects, shadows from the metal bars 602 and 603 appear on the film in a distorted relationship in the resulting radiograph 607. The object 603 closer to the radiation source will cast its shadow at a greater distance than the object 602 farther from the radiation source, and the relative shift of the shadow created by the object 603 closer to the radiation source will be greater than the relative shift of the shadow created by the object 602 closer to the radiation sensor.
[0094] In some embodiments, the greater the distance between the object being imaged and the sensor, the greater the relative displacement between shadows of that object in two different images created using a radiation source with different Θ and/or φ angles. In some
embodiments, the amount of displacement of a shadow created by an object can be correlated to the distance between the object and the sensor by comparing the displacement to the displacement of an object with a known distance to the sensor. In some embodiments, the object with a known distance to the sensor is a fiducial marker. In some embodiments, this comparison is a linear regression. Thus, in some embodiments, the size and direction of an object's shift from one image to another, relative to other objects captured in the same images, can place the object in 3D space.

[0095] Although the above description has focused primarily on dental imaging, one of skill in the art will readily recognize that the features and advantages can be applied to other fields. Exemplary fields include radiologic imaging of non-dental anatomical features, including bone, and imaging of manufactured devices, including devices that are not easily amenable to track-based axial tomography.
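As one illustration of the displacement-to-distance correlation described in paragraph [0094] above, a least-squares line could be fit to the shadow displacements of fiducial markers at known sensor distances and then evaluated at an object's measured displacement. The least-squares form is one plausible reading of the "linear regression" mentioned in the text, and the names and data below are illustrative assumptions:

```python
import numpy as np

def distance_from_displacement(marker_displacements, marker_distances,
                               object_displacement):
    """Estimate an object's distance from the sensor.

    Fits a line mapping shadow displacement to sensor distance using
    fiducial markers whose distances are known, then evaluates the fit
    at the object's observed shadow displacement.
    """
    slope, intercept = np.polyfit(marker_displacements, marker_distances, 1)
    return slope * object_displacement + intercept
```

For markers displaced 1.0, 2.0, and 3.0 units at sensor distances 2.0, 4.0, and 6.0, an object whose shadow moved 2.5 units would be placed 5.0 units from the sensor.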
[0096] Although the disclosed embodiments have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosed embodiments as defined by the appended claims. It should be understood that the various embodiments have been presented by way of example only, and not by way of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the embodiments, which is done to aid in understanding the features and functionality that can be included in the disclosed embodiments. The disclosure is not restricted to the illustrated example architectures or configurations, but can be implemented using a variety of alternative architectures and configurations. Additionally, although the invention is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described. They can instead be applied, alone or in some combination, to one or more of the other embodiments of the invention, whether or not such embodiments are described, and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the invention should not be limited by any of the above-described exemplary embodiments.
[0097] Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term "including" should be read as meaning "including, without limitation" or the like; the term "example" is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; and adjectives such as "conventional," "traditional," "normal," "standard," "known," and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time. Instead, these terms should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, a group of items linked with the conjunction "and" should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as "and/or" unless expressly stated otherwise. Similarly, a group of items linked with the conjunction "or" should not be read as requiring mutual exclusivity among that group, but rather should also be read as "and/or" unless expressly stated otherwise. Furthermore, although items, elements, or components of the invention may be described or claimed in the singular, the plural is contemplated to be within the scope thereof unless limitation to the singular is explicitly stated. The presence of broadening words and phrases such as "one or more," "at least," "but not limited to," or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.
[0098] Functions described herein can be implemented by software, firmware, hardware, or any combination of these elements, whether explicitly identified or not. Additionally, memory or other storage may be employed.
[0099] In this document, the terms "computer program product," "computer-readable medium," and the like may be used generally to refer to media such as memory, storage devices, or storage units. These and other forms of computer-readable media may be involved in storing one or more instructions for use by a processor to cause the processor to perform specified operations. Such instructions, generally referred to as "computer program code" (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system to perform the specified operations. In this document, computer-readable storage may be a non-transitory medium.

Claims
1. An apparatus for capturing radiation comprising:
a support;
a radiation sensor holder coupled to the support; and
a fiducial marker held by the support and at a set distance from the sensor holder.
2. The apparatus of claim 1, wherein the fiducial marker is pre-aligned with respect to the sensor holder so that a location of the fiducial marker is known with respect to the sensor holder.
3. The apparatus of claims 1 or 2, wherein the set distance is less than 10 mm.
4. The apparatus of any of claims 1-3, wherein the fiducial marker is less than one cubic centimeter in volume.
5. The apparatus of any of claims 1-4, wherein the sensor holder is configured to hold a sensor, wherein a radio sensitive side of the sensor has less than four square inches of total surface area.
6. The apparatus of any of claims 1-5, further comprising a biting portion extending from the support.
7. The apparatus of any of claims 1-6, wherein the fiducial marker comprises a shape selected from the group consisting of a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc.
8. The apparatus of any of claims 1-7, wherein the fiducial marker comprises a radiopaque or a semi-radiopaque material.
9. The apparatus of any of claims 1-8, wherein the radiopaque or the semi- radiopaque material is selected from the group consisting of lead, steel, compounds of barium, barium sulfate, compounds of bismuth, plastic, and thermoplastic, and
wherein the support comprises at least one, at least two, at least three, at least four, at least five, at least six, at least seven, at least eight, at least nine, or at least ten fiducial markers.
10. The apparatus of any of claims 1-9, further comprising a radiation source for emitting radiation.
11. The apparatus of any of claims 1-10, further comprising a sensor in the sensor holder.
12. The sensor of claim 11, wherein a radio sensitive side of the sensor has less than four square inches of total surface area.
13. A method of generating a three dimensional image by combining a plurality of two dimensional images, comprising:
obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker;
analyzing the shadow to determine shape characteristics of the shadow;
calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor for each two dimensional image; and
combining the two dimensional images into a three dimensional image using the polar angle and azimuth angle of each two-dimensional image.
14. The method of claim 13, wherein combining the two dimensional images into a three dimensional image further comprises:
determining a position and radiodensity of each pixel on a two-dimensional image;
mapping the densities of each pixel across a plurality of voxels; and
creating the three dimensional image.
15. The method of claim 13 or 14, wherein analyzing the shadow comprises:
determining a length of the shadow; and
calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor from the length of the shadow.
16. The method of any of claims 13-15, wherein analyzing the shadow comprises:
determining an angle of the shadow; and
calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor from the angle of the shadow.
17. The method of any of claims 13-16, wherein calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor comprises determining a shadow displacement for each of the plurality of two dimensional images, wherein determining a shadow displacement of each two-dimensional image comprises
determining a position of a center of a shadow in a two-dimensional image taken when the radiation source is orthogonal to the radiation sensor; and
comparing a position of a center of a shadow in each image to the center of the shadow in the two-dimensional image taken when the radiation source is orthogonal to the radiation sensor.
18. The method of any of claims 13-17, further comprising determining the radiation source is orthogonal to the sensor by determining a circular shadow is created by a spherical fiducial marker.
19. The method of any of claims 13-18, further comprising determining the radiation source is not orthogonal to the sensor by determining an elliptical shadow is created by a spherical fiducial marker.
20. The method of claim 19, wherein the direction of the elliptical shadow indicates the azimuth angle of the radiation source relative to the sensor.
21. The method of claim 19 or 20, wherein the length of the major axis of the elliptical shadow indicates an angle of the source relative to the sensor.
22. The method of any of claims 13-21, wherein:
a first two-dimensional image of the plurality of two dimensional images has a first capture angle; and
a second two dimensional image of the plurality of two dimensional images has a second capture angle, wherein the first capture angle and second capture angle are different.
23. The method of any of claims 13-22, further comprising displaying the three dimensional image on a display.
24. A method of generating an image comprising:
obtaining two-dimensional images;
determining, for each two-dimensional image, a relative position of a sensor and a radiation source used to capture the respective two-dimensional image;
creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the relative position of the sensor and the radiation source used to capture the respective two-dimensional image; and
generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
25. The method of claim 24, wherein the method further comprises normalizing each two-dimensional image by analyzing a radiodensity gradient created by an object captured in each two-dimensional image.
26. The method of any of claims 24 and 25, wherein normalizing each two- dimensional image comprises adjusting a parameter in each two-dimensional image.
27. The method of claim 26, wherein the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the two-dimensional images.
28. The method of any of claims 24-27, wherein the relative position comprises a polar angle between a plane of the sensor and a direction of a beam emitted from the radiation source.
29. The method of any of claims 24-28, wherein generating the three-dimensional image comprises overlapping the three dimensional volumes.
30. The method of any of claims 24-29, wherein generating the three-dimensional image comprises identifying empty voxels.
31. The method of any of claims 24-30, wherein generating the three-dimensional image comprises estimating an intensity value of a non-empty voxel.
32. The method of claim 31, wherein estimating the intensity value of the nonempty voxel comprises averaging an array of potential values for the non-empty voxel.
33. The method of claim 31 or 32, wherein estimating the intensity value of the non-empty voxel comprises selecting the highest value from an array of potential values for the non-empty voxel.
34. The method of any of claims 31-33, wherein generating the three-dimensional image comprises iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value among one or more related voxels.
35. The method of any of claims 31-34, wherein generating the three-dimensional image comprises identifying whether the non-empty voxel comprises one selected from the group consisting of: dentin, enamel, cavity, gum, and bone.
36. The method of any of claims 24-35, further comprising applying an antialiasing algorithm to the three-dimensional image.
37. The method of any of claims 24-36, further comprising displaying the three dimensional image on a display.
38. A method of generating a three dimensional image by combining a plurality of two dimensional images, comprising:
obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker;
analyzing the shadow to determine shape characteristics of the shadow;
calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor for each two dimensional image;
creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the polar angle and azimuth angle of the radiation source relative to the radiation sensor used to capture the respective two- dimensional image; and
generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
39. A method for capturing an image comprising:
inserting the apparatus of any of claims 1-12; and
capturing the image.
40. A non-transitory computer-readable storage medium comprising computer- readable instructions, which when executed by one or more processors, causes the one or more processors to perform the method of any one of claims 13-39.
PCT/US2015/042296 2014-07-28 2015-07-27 Method and apparatus for producing a three-dimensional image WO2016018825A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462029843P 2014-07-28 2014-07-28
US62/029,843 2014-07-28
US201562183104P 2015-06-22 2015-06-22
US62/183,104 2015-06-22

Publications (1)

Publication Number Publication Date
WO2016018825A1 true WO2016018825A1 (en) 2016-02-04

Family

ID=55218220

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/042296 WO2016018825A1 (en) 2014-07-28 2015-07-27 Method and apparatus for producing a three-dimensional image

Country Status (1)

Country Link
WO (1) WO2016018825A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190231284A1 (en) * 2018-01-26 2019-08-01 Palodex Group Oy Portable bite part for determining an imaging area of a patient in panoramic, computed tomography, or cephalometric x-ray imaging
US20190231285A1 (en) * 2018-01-26 2019-08-01 Palodex Group Oy Portable bite part for correcting a motion of an object in panoramic, computed tomography, or cephalometric x-ray imaging

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020130274A1 (en) * 2001-03-15 2002-09-19 International Busines Machines Corporation Spatial phase locking with shaped electron beam lithography
US20040264648A1 (en) * 2003-06-25 2004-12-30 General Electric Company Method, apparatus, and medium for calibration of tomosynthesis system geometry using fiducial markers with non-determined position
US7014361B1 (en) * 2005-05-11 2006-03-21 Moshe Ein-Gal Adaptive rotator for gantry
US20090274272A1 (en) * 2005-11-09 2009-11-05 Dexela Limited Methods and apparatus for obtaining low-dose imaging
US20110255661A1 (en) * 2010-04-20 2011-10-20 Hans Schweizer Imaging fluoroscopy method and system using a navigation system marker device




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15826706; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15826706; Country of ref document: EP; Kind code of ref document: A1)