WO2016124554A1 - Object localization in projective x-ray images by geometric considerations - Google Patents


Info

Publication number: WO2016124554A1
Application number: PCT/EP2016/052099
Authority: WIPO (PCT)
Prior art keywords: ray, detector, image, silhouette, collimator
Other languages: French (fr)
Inventor: Mathias Schlueter
Applicant: Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V.

Classifications

    • A HUMAN NECESSITIES; A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/06 Diaphragms
    • A61B 6/547 Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
    • A61B 6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • A61B 6/5252 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data, removing objects from field of view, e.g. removing patient table from a CT image
    • A61B 6/5258 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
    • A61B 6/582 Calibration
    • G PHYSICS; G01 MEASURING; TESTING; G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 2223/40 Imaging (investigating materials by wave or particle radiation)

Definitions

  • the invention relates to a system for supporting X-ray imaging, to a method of supporting X-ray imaging, to a computer program element and to a computer readable medium.
  • objects like protectors and collimators are located in a path of the X-ray beam in order to reduce the patient dose.
  • Useful as those non-anatomical objects may be for achieving dose reductions for instance, they nevertheless "imprint" unwanted image information into the eventual 2D X-ray projection image.
  • In post-acquisition image processing tasks such as filtering, etc., an attempt is made to identify, in the image, structures that correspond to those radiation-absorbing objects to ensure better image quality. On occasion, these detection attempts fail or lead to incorrect identifications.
  • Another task is the detection of specific markers in the X-ray image for navigation or calibration purposes.
  • a system for supporting X-ray imaging comprising:
  • an input port for receiving, based on a first coordinate system, an input specification of i) a location of an X-ray source, and/or ii) a location and/or an orientation of an object and/or iii) a location and/or orientation of an X-ray detector, the object capable of interacting with X-ray radiation from an X-ray source;
  • a predictor module configured to predict, based on said input specification, at least a plurality of points of a silhouette of said object in an image plane of the X-ray detector
  • an output port configured to output an output specification of the plurality of silhouette points.
  • the proposed system allows addressing the situation where the object (or a plurality) is introduced into a path of the X-ray beam to achieve a collimation or other purpose.
  • the object is assumed to be radiation absorbing or reflecting. This then introduces a "footprint" or "shadow", that is a contrast, in the acquired image that does not relate to anatomical structures of the actual subject to be imaged.
  • the proposed system affords accurate pre-acquisition predicting of at least a part of said silhouette of this object in the image to be acquired.
  • the system operates largely, if not exclusively, on said object specification, that is, on parameters/coordinates that describe the position/orientation of the object relative to the imager's geometry (in particular to the X-ray source position and/or the detector position and/or inclination), to so predict the extent and/or course of the object's silhouette even before the image is acquired.
  • the specification of the object location is either in terms of internal object coordinates (that is, the coordinates are measured relative to a coordinate system located at the object) or the specification is in terms of "external" world coordinates.
  • the specification is such that the location and/or inclination relative to the detector and the X-ray source are derivable (possibly in conjunction with otherwise system-geometric constants such as the position of the X-ray source or the position/inclination of the detector or of the object).
  • the detector position/inclination is supplied. This may be advantageous if the object is fixed (and known to the system) and it is only the detector location/inclination that is variable.
  • the X-ray detector position is supplied, for instance when both object and detector are fixed (and known to the system).
  • the supplied geometry data at input port is such that a constellation, that is, the relative inclination/location between the three components object, detector and X-ray source (in particular the position of its focal spot) is specified or derivable.
  • the locations and/or inclinations are taken relative to one or more suitably chosen coordinate systems (such as "world” or global coordinate system).
  • post-acquisition identification schemes that attempt to identify the silhouette in the images themselves have been proposed in the past.
  • Post-acquisition image-based identification of the projection footprint of such objects can therefore be avoided altogether, or at least the predicted silhouette position can be used to furnish computational fiducials or prior knowledge to "guide" the computation of these post-acquisition image analyzer algorithms.
  • Image processing based detection/identification of those object footprints can be made more reliable if the projection of these objects is predicted by geometric considerations.
  • the proposed system can help identify image structures, in particular, but not necessarily, non-anatomical image structures.
  • Another use of the predicted silhouette points as envisaged herein is collimation control, in particular (but not necessarily) prior to the X-ray exposure.
  • a collimation monitor unit checks whether a collimation window as per the predicted silhouette points will or does extend outside the imaging plane.
  • this monitoring operation may be performed during X-ray operation but is preferably performed prior to the X-ray operation.
  • the predictor module operates to perform, based on said input specification, a transformation across a plurality of coordinate systems, including said first coordinate system.
  • location/inclination can be established in one embodiment by using said coordinate system transformations.
  • the object location/orientation specification is in terms of homogeneous coordinates. This affords implementation advantages.
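The implementation advantage can be made concrete: in homogeneous coordinates a rotation and a translation combine into a single 4 × 4 matrix, so chained coordinate transformations reduce to matrix products. A minimal Python/NumPy sketch (the helper names are illustrative, not taken from the patent):

```python
import numpy as np

def to_homogeneous(p):
    """Append a 1 to a 3D point to obtain homogeneous coordinates."""
    return np.append(np.asarray(p, dtype=float), 1.0)

def rigid_transform(rotation, translation):
    """Build a 4x4 homogeneous matrix from a 3x3 rotation and a 3-vector."""
    t = np.eye(4)
    t[:3, :3] = rotation
    t[:3, 3] = translation
    return t

# A pure translation applied to the origin lands on the translation vector.
shift = rigid_transform(np.eye(3), [10.0, 0.0, -5.0])
p = shift @ to_homogeneous([0.0, 0.0, 0.0])
print(p[:3])  # [10.  0. -5.]
```

Composing several such matrices by multiplication is exactly what makes the multi-coordinate-system transformations above a sequence of matrix products.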
  • At least one of said coordinate systems has its origin at a focal point of the X-ray source and/or the orientation of said coordinate system is defined by the orientation of the image plane of the detector.
  • an optical axis of the system is defined as a projection line from a/the focal point of the X-ray source onto an image plane of the detector and/or a focal length is defined as the length of said projection line.
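Under these definitions the focal length is simply the distance from the focal point to the detector plane along the plane normal. A small sketch, assuming the detector plane is given by a point on it and its normal vector (an assumed parameterization, not specified in the patent):

```python
import numpy as np

def optical_axis_and_focal_length(focal_point, plane_point, plane_normal):
    """Return the foot of the perpendicular from the focal point onto the
    detector plane (where the optical axis pierces it) and its distance
    from the focal point (the focal length)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = np.dot(np.asarray(plane_point, dtype=float) - focal_point, n)
    foot = np.asarray(focal_point, dtype=float) + d * n
    return foot, abs(d)

foot, f = optical_axis_and_focal_length(
    focal_point=[0.0, 0.0, 0.0],
    plane_point=[0.0, 0.0, 1000.0],   # detector plate 1 m from the source
    plane_normal=[0.0, 0.0, 1.0])
print(foot, f)  # [   0.    0. 1000.] 1000.0
```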
  • the object specification is acquired by a position sensor which comprises any one of the following: a plurality of optical cameras, a depth sensing camera, a position sensor of a positioning system.
  • at least one of said plurality of coordinate systems is a local coordinate system of the detector and/or the object.
  • the object is any one of, but is not limited to: a collimator or a part thereof, an X-ray protector device, a marker device or a calibration device.
  • the predicted silhouette points correspond to an aperture of the collimator.
  • an orientation of the object or of an object coordinate system is inclined relative to an orientation of the image plane of the detector.
  • an image processor configured to run an image processing algorithm to process an X-ray image, wherein the predicted silhouette points are used to initialize the algorithm or as constraints for said algorithm.
  • the object includes one or more blades of the collimator arranged between the X-ray source and the X-ray detector, the system including a collimation control unit configured to adjust, based on the specification of the plurality of silhouette points, the collimation aperture of said collimator.
  • the object includes one or more blades of a collimator arranged between the X-ray source and the X-ray detector, the system including a collimation monitor unit configured to output, based on the specification of the plurality of silhouette points, an indication whether an area defined by said plurality of silhouette points corresponds, at least in size and/or location, to a predefined collimation area.
  • Geometric information about the object and/or the X-ray detector is used in one embodiment for the prediction of a collimation field in an X-ray image.
  • a method based on projective geometry is given which describes how to calculate the projection silhouette of arbitrary object points onto the image plane.
  • This concept can be extended to the prediction of other specific objects like X-ray protectors, calibration spheres or anatomic or other markers if the location of this kind of objects is known from motion sensors or a positioning system or is otherwise provided. For instance, orthopedic
  • Fig. 1 shows an X-ray based imaging arrangement
  • Fig. 2 shows a schematic description of the imaging geometry in terms of three coordinate systems
  • Fig. 3 shows a flow-chart of a method of supporting X-ray imaging
  • Fig. 4 illustrates exemplary output of the proposed method as per Fig. 3.
  • With reference to Fig. 1, there is shown in an examination room an X-ray based imaging arrangement including an X-ray imager 100 and a system S for supporting X-ray imaging by imager 100.
  • the imager 100 comprises broadly an X-ray source such as an X-ray tube XR and a radiation sensitive detector D made up from detector pixels PX, usually but not necessarily arranged in a rectangular array forming an image plane.
  • the X-ray source XR is operable to emit an X-ray imaging beam and the X-ray detector D is configured to detect radiation as per said X-ray beam.
  • An inclination of the detector plate D relative to an axis (such as an axis of a world coordinate system; more details on this will be explained below at Fig. 2) and the X-ray source may be adjustable, as indicated for one degree of freedom by a rotation angle RD. Only one rotation angle RD is shown, but it is understood that two or three rotation angles may also be adjustable. The inclination of any one or both of the X-ray source and the X-ray detector may be adjustable.
  • the spatial position of any one or both of the X-ray source and X-ray detector may be adjustable in some or all spatial directions: e.g., height above ground and/or lateral and longitudinal position (relative to a vertical or horizontal reference line) may all be adjustable by operation of suitable mechanical arrangements/actuators. Embodiments with manual detector position/inclination adjustments are also envisaged.
  • X-ray source and detector D are carried in and by a frame or gantry (not shown).
  • Variants for the X-ray imager 100 envisaged herein are interventional X-ray imagers of the C- or U-arm type where the gantry has a "C" or transposed "U” shape.
  • the X-ray imaging system may be floor or ceiling mounted.
  • there is no mechanical connection between the X-ray detector D and the X-ray source. In such systems, the X-ray source is mounted in a movable module, whereas the X-ray detector is a portable plate module completely detached from the module where the X-ray source is mounted. The clinical user is at liberty to place the X-ray detector wherever they see fit.
  • Mobile X-ray imagers may be used for example in hospital wards or care homes to image elderly or frail patients for instance.
  • the imaging subject such as a human patient
  • a subject support (a table or similar; not shown)
  • the detector is deposited between the patient and a bed support, for instance if the patient is too frail to stand up, and the X-ray source with the imager is positioned above the patient whilst they lie in their bed on said mobile detector D.
  • the X-ray imaging system comprises a wall-mounted or freestanding gantry that holds the X-ray source and across the examination room there is a mounting that holds the detector plate.
  • This type of arrangement is used in chest X-ray imaging for instance. The patient "walks" into the examination region and resides there "standing" during image acquisition, after the position and/or inclination of the X-ray source and detector plate has been adjusted to ensure that the X-ray beam path can pass through the ROI, that is, in this exemplary case, the chest region of the patient.
  • an X-ray beam emanates from X-ray tube XR, passes through patient PAT at said region ROI, experiences attenuation by interaction with matter therein, and the so attenuated beam PR then strikes detector D's surface at a plurality of the detector cells or pixels PX.
  • Each pixel PX that receives X-ray radiation responds by issuing a corresponding electric signal.
  • the collection of said signals is then translated by a data acquisition system DAS into respective digital values representative of said attenuation.
  • the collection of the so registered digital values for each (X-)ray is then consolidated into an array of digital values forming an X-ray projection image for a given acquisition time and projection direction.
  • The detector array may be 2D (two-dimensional) as shown in Fig. 1, but 1D (one-dimensional) embodiments are also envisaged herein where there is only a single line of detector pixels PX which is moved during image acquisition.
  • the DAS digitized image signals may then be rendered for view on a monitor MT or may be passed on to an image post-processor IP for filtering or contrast enhancements or any other image processing purpose.
  • the so processed image IM may then be viewed on the monitor MT or may be stored for later reference in a database DB.
  • auxiliary objects B, that is, one or more objects B different from, and over and above, the actual subject to be imaged
  • relatively high radiation opacity means in particular a significantly higher opacity than at least an average opacity of the subject's tissue at the ROI.
  • One purpose of using such objects B in this manner is to at least locally/partially exclude X-ray radiation from patient regions which are not subject of investigation. This effectively reduces patient dose.
  • collimator COL comprises two pairs of blades B formed from sheets of lead or tungsten or other highly radiation-opaque material. One pair is arranged perpendicularly to the other, and the blades are individually addressable and movable by a respective collimator stepper motor (not shown) so as to restrict the X-ray beam in either or both of the two dimensions depending on their relative position.
  • Blocker blades B may be rotatable and/or shiftable in and out towards a center to form an aperture with a contour or (inner) silhouette formed by the four blades.
  • the silhouette or perimeter of the aperture in Fig 1 is of square or of rectangular shape but any geometrical shape is envisaged herein. In this way the X-ray beam's "cross section" can be shaped to match an expected two dimensional outline of the ROI.
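For such a four-blade collimator, the rectangular aperture silhouette follows directly from the blade edge positions. A hedged sketch with hypothetical blade-edge parameters (left/right/top/bottom edge coordinates in an object coordinate system, not parameters from the patent):

```python
def aperture_from_blades(left, right, top, bottom):
    """Return the four corner points of the rectangular aperture formed
    by two perpendicular pairs of collimator blade edges."""
    assert left < right and top < bottom, "blades would fully overlap"
    return [(left, top), (right, top), (right, bottom), (left, bottom)]

# Blade edges at +/-20 mm and +/-15 mm give a 40 x 30 mm aperture.
print(aperture_from_blades(-20.0, 20.0, -15.0, 15.0))
# [(-20.0, -15.0), (20.0, -15.0), (20.0, 15.0), (-20.0, 15.0)]
```

These corner points are the kind of object points whose projection silhouette the predictor module computes.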
  • a multi-leaf collimator comprising, instead of the four blades, a plurality (usually more than 4, e.g., 10, 20 to 30) of motor-movable slats or leaves arranged in opposing relationship.
  • the multi-leaf collimator allows forming more detailed or more complex curvilinear shapes.
  • Each collimator setting or configuration corresponds to a specific position of the blades B that form the collimator aperture having the silhouette or outline bounded by the blades.
  • An inclination of the blades relative to the detector plane D may be adjustable. This degree of freedom is indicated by way of example in Fig. 1 by rotation angle Rc for only one of the blades.
  • pixels PX shown either as clear or in hatchings.
  • pixels PX around the center of the detector D plate are those detector pixels that are to receive primary radiation when the radiation beam passes through the aperture formed by the blades B.
  • These pixels PX are shown in clear rendering.
  • blocked out pixels that do not or are not to receive radiation are shown by hatching. It will be appreciated that a complete blocking out of primary radiation may not be achievable in all circumstances because scatter (secondary) radiation may still be received at pixels that were meant to be shielded from radiation.
  • collimator context and collimator blades are just an exemplary embodiment for radiation blocking objects B occasionally introduced into the path of the X-ray beam.
  • Other embodiments for radiation blockers or artificial contrast inducing objects B used for similar or different purposes are image markers or X-ray protection devices, etc. as will be explained in more detail below.
  • the support system S as proposed herein includes a pixel position specifier module PSM that allows flagging individual pixel positions that can be expected to encode the blocker B shadow or footprint, or at least parts thereof. More particularly, the proposed pixel position specifier module PSM operates to identify pixel positions that are expected to encode at least a part of the projection silhouette or projection outline of the blocker B footprint.
  • the output of pixel position specifier module PSM can be supplied in any suitable format according to different embodiments and the choice of which will depend on given implementation requirements.
  • the silhouette pixel position specification may be output as a pixel mask (defined on the detector image plane) or as a number of discrete pixel coordinates defining a polygon in the detector image plane that circumscribes the object's silhouette or describes the border of the blocker or object footprint. Any of these formats are envisaged herein.
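As an illustration of converting between the two output formats, the following sketch rasterizes a silhouette polygon into a pixel mask on the detector image plane using the even-odd (ray-crossing) rule; the function name and the per-pixel-centre convention are assumptions for this example:

```python
import numpy as np

def polygon_to_mask(vertices, width, height):
    """Rasterize a silhouette polygon (pixel coordinates) into a boolean
    mask by testing each pixel centre with the even-odd rule."""
    mask = np.zeros((height, width), dtype=bool)
    n = len(vertices)
    for row in range(height):
        for col in range(width):
            x, y = col + 0.5, row + 0.5       # pixel centre
            inside = False
            for i in range(n):
                x1, y1 = vertices[i]
                x2, y2 = vertices[(i + 1) % n]
                if (y1 > y) != (y2 > y):       # edge crosses the scan line
                    x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x < x_cross:
                        inside = not inside
            mask[row, col] = inside
    return mask

# Square aperture silhouette on a small 8x8 pixel detector.
m = polygon_to_mask([(2, 2), (6, 2), (6, 6), (2, 6)], 8, 8)
print(int(m.sum()))  # 16
```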
  • Image processor IP then knows which pixels to leave out of the processing.
  • pixel position specifier module PSM operates solely on the geometry of the X-ray imager 100.
  • the identification of the blocker B footprints can be done prior to radiation exposure, that is, before the actual image acquisition.
  • a potential patient irradiation outside the detector imaging plane can be indicated by a suitable notification system (visual or audio) to the user to avoid unnecessary patient irradiation.
  • this is not to say that the proposed specifier module PSM cannot be used during the image acquisition, which is envisaged herein in alternative embodiments.
  • the pixel position specifier module PSM comprises an input port IN, a predictor module PGP, and an output port OUT.
  • At input port IN, object and/or imaging geometry information is received.
  • the position and/or orientation of the object/blocker B, as well as the position and orientation of the detector with respect to another coordinate system, are received. If the position/inclination of the detector is fixed, only the object position/inclination is received. Alternatively, if the position/inclination of the object is fixed (and known to the system, e.g. as a constant parameter), it is only the location/inclination of the detector that is received. Lastly, if object and detector are fixed, the position of the X-ray source is received.
  • a position and/or shape of at least parts of the blocker is received relative to a suitably chosen coordinate system, on which more below at Figures 2, 3.
  • the coordinates that describe the position and/or shape of the blocker or the relevant parts thereof as well as the position and orientation of the detector are then passed on to the predictor module PGP.
  • the predictor module PGP uses a geometric formalism from projective geometry which uses the pinhole camera model paradigm tailored to the setting of X-ray projection imaging.
  • Operation of predictor module PGP includes applying one, two or more (in particular three) coordinate transformations via suitable matrix multiplications to arrive at a pixel set specification that describes the predicted shadow B pixel positions in an image to be taken at the given imaging geometry setting ("imaging geometry" as used herein includes in particular X-ray tube XR position and/or X-ray detector D position/inclination) of the imager 100.
  • imaging geometry as used herein includes in particular X-ray tube XR position and/or X-ray detector D position/inclination
  • This specification of the pixel positions of the blocker B footprint (or of at least parts thereof) are supplied at output port OUT and may then be forwarded to the image processing module IP.
  • a logic module in the image processor IP then removes said shadow pixels (that is the image information carried by those pixel positions) from the total projection image received via the DAS and the image processing algorithm implemented by IP may then be restricted to operate only on the remaining pixel set (that is, the pixels other than those specified by the pixel position specifier module PSM) in the digital image as received via the DAS.
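The restriction to the remaining pixel set can be sketched as a boolean mask operation; the normalisation step below is only a stand-in for whatever algorithm IP actually implements, and the image values are illustrative:

```python
import numpy as np

image = np.arange(16.0).reshape(4, 4)   # projection image as received via DAS
shadow = np.zeros((4, 4), dtype=bool)
shadow[:, 0] = True                     # predicted blocker silhouette: first column

# Restrict a processing step (here: mean subtraction) to non-shadow pixels,
# leaving the shadow pixels untouched.
valid = image[~shadow]
processed = image.copy()
processed[~shadow] = valid - valid.mean()
print(processed[~shadow].mean())  # 0.0
```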
  • Length unit calibration allows establishing the real dimensions of image structures.
  • a calibration object B of spherical (with known diameter) or other suitable geometric shape of prior-known dimensions is positioned at or on the patient by gluing, sticking, or is otherwise affixed in spatial association with the patient to be imaged.
  • the footprint of the sphere B can be predicted by the geometric formalism and system proposed herein. Even perspective distortions caused by the detector plane not being aligned with the calibration object relative to the X-ray source (in particular not being perpendicular to the beam from the X-ray source through the center of the calibration object) can be accounted for in the proposed method.
  • the footprint of markers B placed at the patient can also be predicted in the X-ray images. Such markers are used for instance to indicate anatomical regions relevant for further diagnostic work.
  • the proposed system and method can also be used for collimation control prior to actual X-ray exposure. This means that a collimation window which, based on the predicted aperture silhouette points, would extend partly outside the imaging plane or which would turn out otherwise ill-positioned is automatically adjusted for by a suitable collimation controller CC that interacts with the collimator's actuator.
  • a closed feedback loop can be used to compare a sequence of silhouette positions and make the necessary adjustments until the correct collimation window is achieved.
  • the system includes a collimation monitor unit CMU configured to output, based on the specification of the plurality of silhouette points, an indication whether the area as defined by said plurality of silhouette points corresponds, at least in size and/or location, to a predefined collimation area. If there is no such correspondence because the collimator window would extend beyond the imaging plane of the detector, an indication is given to the user via a suitable (audio or visual) alert signal.
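A minimal version of such a correspondence check, here reduced to testing whether all predicted silhouette points lie on the imaging plane (the detector bounds and point format are assumptions for this sketch):

```python
def collimation_within_plane(silhouette_points, width, height):
    """True when every predicted silhouette point lies on the imaging
    plane; otherwise the collimation monitor unit would raise an alert."""
    return all(0 <= x < width and 0 <= y < height
               for x, y in silhouette_points)

ok = collimation_within_plane(
    [(10, 10), (90, 10), (90, 90), (10, 90)], 100, 100)
bad = collimation_within_plane(
    [(10, 10), (120, 10)], 100, 100)     # one point off the detector
print(ok, bad)  # True False
```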
  • this monitoring can also be applied to objects B other than collimator components, so that the user can assess, pre-acquisition, whether the object's projection footprint/silhouette can be expected to appear at the intended position/region of the image to be acquired. This could be indicated to the user graphically on screen MT, where a suitable graphic element is rendered that represents the borders of the image plane with a polygon overlaid thereon representing the predicted silhouette points. The user can then update detector and/or object position/inclination until the object silhouette falls within the intended region.
  • position sensor PS may be implemented as a system of optical cameras or a depth sense camera, or may be achieved by a position sensor that interfaces with control logic of actuators used to operate the positioning of the individual blades B in the collimator, for instance.
  • the sensor PS arranged on or at the blade may also be part of a positioning system as will be explained in more detail.
  • Embodiments where it is the user who manually supplies the coordinate specification of the blocker B are also envisaged.
  • the specification is preferably in 3D although embodiments for 2D coordinate specifications are also contemplated.
  • the user inputs via suitable user input means the points or positions of the blocker element B for which the user wishes a prediction to be computed.
  • the user input is supported by a graphical user interface (GUI) system.
  • GUI graphical user interface
  • CAD computer-aided design
  • PSM specifier module
  • the user may choose by touch screen or computer mouse or e-stylus or other suitable input means the individual positions which are then translated by an event handler into coordinates relative to a suitable coordinate system (as will be explained in more detail below) and these points are then passed onto the pixel position specifier module PSM.
  • the manually supplied points may be specified in 3D for instance.
  • the projection of the optical axis onto the detector plane can even be located outside the imaging plane x', y'.
  • the optical axis of the proposed system is implicitly (or indirectly) defined by the detector D position and inclination RD.
  • in projective X-ray imaging object points are described by an object coordinate system which has, in one embodiment, its own location and orientation.
  • An example is the coordinate system C of the collimator COL that describes the mutual position of the collimator blades B with respect to an internal reference point in the collimator as mentioned above.
  • Fig. 2 The geometric situation in projective X-ray imaging is illustrated in Fig. 2.
  • the origin of the detector coordinate system is located at the focal point of the X-ray tube.
  • the origin of the collimator coordinate system is typically also located at the focal point but the orientation can differ from the detector orientation.
  • the world coordinate system W can be located anywhere in the X-ray room, the origin of which being preferably conveniently chosen so as to take advantage of natural system symmetries for instance.
  • detector D is now "virtually" "split" up into two spatially and conceptually separate coordinate systems: that of the detector itself and, different therefrom, the image plane I, which is in general parallel to the x-y plane of the detector coordinate system and would ordinarily be placed at one corner or the center of the detector, but other choices for the origin are also envisaged. But, as shown in Fig. 2, the detector coordinate system D is now tied to the focal point of the X-ray source XR, away from where the detector is physically located.
  • this additional object coordinate system C that describes locally shape and orientation (that is, its inclination relative to the detector D) of the object B to be projected.
  • This additional object coordinate system C is advantageous (but not required in all embodiments) if the locations of object B points are determined by internal motion sensors, which is the case for a collimator where the position and inclination of its blades can be conveniently specified with reference to an object coordinate system having an internal reference point as its origin. More generally, this local object coordinate system C definition may also be used with benefit for objects B that are otherwise deformable, foldable etc.
  • no such internal object coordinate system is used and the object position/inclination is described with reference to the world coordinate system W.
  • definition via the world coordinate system may be useful for objects whose points are more or less "free" to vary in space. In such a situation, the object inclination/position may be determined for instance (but not necessarily in all embodiments) by some external positioning system, with said points being specified in world coordinates, and the coordinate system C is not necessary.
  • Fig. 2 is a geometrical illustration of projective X-ray imaging adapted to present X-ray imaging context.
  • the detector position is defined as the position of the detector D's center, but again other reference points such as corner points may also be used instead.
  • the detector position is denoted by d0W and d0D in world and detector coordinates, respectively.
  • the origin of the detector coordinate system is located in the focal point and that the collimator and detector coordinate system have different orientations.
  • the collimator and the detector are inclined with respect to the world coordinate system around the x-axis by an angle α and δ, respectively.
  • object and detector inclination are specified by angles α, δ around a world coordinate system W axis, for instance.
  • O c is an object B point, e.g. a collimator vertex.
  • the various coordinate systems as shown in Fig. 2 are: the world coordinate system W, the object coordinate system C, the detector coordinate system D and the image coordinate system I.
  • the related coordinate transformations (shown in more detail below at equations (3)-(5)) are denoted as (W←C) for the object- (e.g., collimator-)to-world coordinate transformation, (W←D) for the detector-to-world coordinate transformation, and (I←D) for the detector-to-image coordinate transformation, which implicitly includes instructions for the actual projection onto the image plane.
  • the first two 4 x 4 matrices each include a 3 x 3 rotation matrix (RC, RD) and a 3-dimensional translation vector (f0W, c0W) which describes the origin of the respective source coordinate system.
  • the origin is equal to the focal point f 0 w .
  • the third matrix is a 3 x 4 projection matrix; it projects points in the detector coordinate system onto the imaging plane.
  • in the matrix (I←D), px, py and sx, sy denote pixel size and number of pixels in the x and y directions in the detector imaging plane, respectively.
  • the origin of the image coordinate system I is located in one embodiment at the upper left corner of the imaging plane, where the x and y axes point along the respective edges of the imaging plane, for instance left and downwards, respectively. Again, this origin definition is but one embodiment, and other locations on the imaging plane may also be used instead.
  • the homogeneous 3-dimensional vector o can be transformed back to a 2-dimensional Euclidean image vector by the correspondence described above at (1) and (2).
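The Euclidean-to-homogeneous correspondence referred to at (1) and (2) can be sketched as follows. This is an illustrative aside, not part of the patent disclosure; the function names are chosen freely:

```python
import numpy as np

def to_homogeneous(p):
    """Append a 1 to a Euclidean vector (correspondence (1))."""
    return np.append(np.asarray(p, dtype=float), 1.0)

def to_euclidean(h):
    """Divide by the last component and drop it (correspondence (2))."""
    h = np.asarray(h, dtype=float)
    return h[:-1] / h[-1]

# A homogeneous image vector o = (ox, oy, w) maps back to 2D pixel coordinates:
o = np.array([640.0, 480.0, 2.0])
print(to_euclidean(o))  # -> [320. 240.]
```

Any nonzero scalar multiple of o yields the same Euclidean image point, which is why the projection matrices below may be chained freely before the final division.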
  • Examples for object points which are projected onto the image plane are vertices of the collimation field.
  • a prediction of the collimation field using the formalism described herein may be achieved by establishing geometric data as per Table 2 below:
  • the geometric data in the Table 2 above or similar can be queried from the X-ray imager 100 by interfacing with specific motion sensors that interface with actuators (such as stepper motors) configured to adjust the position/orientation of the object B (and hence the inclination relative to the detector D).
  • This may involve querying the electronics of the actuators for internal states (such as the number of rotations or steps executed) to establish object coordinates.
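As an illustrative aside (not part of the disclosure), converting an actuator's internal step count into an object coordinate might look as follows; the lead-screw pitch and steps-per-revolution constants are assumed values only:

```python
# Hypothetical sketch: derive a collimator blade position in the object
# coordinate system C from a stepper motor's internal step counter.
STEPS_PER_REV = 200     # full steps per motor revolution (assumed)
SCREW_PITCH_MM = 2.0    # blade travel per revolution in mm (assumed)

def blade_position_mm(step_count: int) -> float:
    """Convert an executed step count into blade travel along its axis."""
    return step_count * SCREW_PITCH_MM / STEPS_PER_REV

print(blade_position_mm(450))  # 450 steps -> 4.5 mm of blade travel
```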
  • the geometric data for position/orientation of the object B of interest and/or that of the detector are established by using sensors PS such as camera systems, for instance an optical camera or a depth-sensing camera.
  • the camera system acquires a "scout" image of the geometry of the imager so that the position of the object versus the detector is encoded in this image.
  • the (infrared or optical) scout image information can then be converted into Euclidean (or, as the case may be, any curvilinear such as cylindrical coordinates, depending on the geometry of the object-imager system under consideration) world coordinates if a plurality of optical cameras are used. These are then converted into projective coordinates as per (1), (2), and the so converted geometrical data can then be processed as per eqs. (3)-(8) above.
  • an electromagnetic positioning system with an operation principle not unlike that of "GPS"
  • one or more sensors are affixed to the object which is thereby trackable in an electromagnetic field created via a field generator. During the tracking, the respective coordinates of the tracked points are established preferably with respect to the world coordinate system W.
  • A result of the collimation field prediction using the proposed method is illustrated in Fig. 4.
  • while the data is illustrative of the collimator embodiment, it will be understood that the above can be readily adapted to objects B in the examination region other than collimator blades.
  • In Fig. 3 there is shown a flowchart for a method of supporting X-ray imaging as implemented by the predictor module PGP.
  • a coordinate description O c or specification of one or more points of the object B is received.
  • the specification may be generated automatically by a position control mechanism for positioning said object B, or may be furnished by an optical camera, or may be manually input by a user as described above by means of a suitable graphical user interface or similar. Although specifying each and every point of the object B may be advantageous in some embodiments, it is preferred herein that only a certain subset of points is specified (automatically or by the user), for example the corners of the object.
  • the coordinate specification of the object B with respect to the object or world coordinate system defines a position in space of the said object and/or an orientation of said object so that the position and orientation of the object in space relative to at least one (or both) of the detector D and the X-ray source is defined.
  • the object specification may include taking into account an inclination of the object relative to the X-ray source and/or the X-ray detector plane and/or the relative inclination between the X-ray source and the X-ray detector plane.
  • said object specification includes spatial position coordinates in terms of world coordinates and (if applicable) inclination angles of object B and/or the detector plane relative to the world coordinate system W as per Fig. 2, although other coordinate specifications are also envisaged herein.
  • a discretization of some sort will need to be employed to specify a discrete number of points suitably positioned so as to capture the shape of the object and/or its inclination.
  • only certain "salient" points such as corners of a polygon (or more general, points of a convex hull of the object) are received at step S310.
  • for a collimator, for instance, the plane of the blades (e.g., 4 blades) and the position of the vertices (for instance, 4 vertices) may be specified.
  • the received object specification of location/inclination of the object includes one or more points of the object which are to be projected onto the image plane to establish the silhouette points.
  • position and/or inclination of the detector D plane are received at step S310. This may be advantageous in situations where the object itself is essentially fixed and it is only the location/inclination of the detector within the examination region that is variable. Similarly, if object and detector are fixed, it is only the location of the (movable) X-ray source XR that is received at step S310.
  • the user supplies only the detector location/inclination and/or the object location/inclination.
  • the remaining imaging geometry is then automatically interrogated from the imager's control system, for instance through an imaging geometry data stream.
  • the geometric information received at step S310 is such that the relative position/inclination between object, detector and X-ray source is specified or can at least be derived from the supplied information. It will be understood that the specification of some or all three components, that is, the detector location/inclination, the object location/inclination and the X-ray source location, can be received from a user input, or the specification is supplied (automatically) by the system through the positioning or localization system PS as described above.
  • “Hybrid"-solutions are envisaged and were described above, where a part(s) of the specification is supplied by the user whereas other part(s) are supplied by the system.
  • the user supplies only the object position/inclination with the remaining geometry data (X-ray source, detector) being supplied by the system.
  • position/inclination of only the detector is supplied and again the system complements this by supplying the remaining data.
  • the user supplies only the detector location/inclination and the object location/inclination.
  • the coordinates are preferably homogeneous or projective coordinates, but the coordinates may be specified in any suitable coordinate system "language” such as Euclidean or curvilinear (spherical, cylindrical, polar) and may be transformed into homogeneous or projective coordinates before being passed on to the following silhouette predictor step S320.
  • an outline/silhouette of the object or certain part thereof in the image plane of the detector is then predicted.
  • the prediction is based on the received coordinate specification of the location and orientation/inclination of the object, in particular relative to the detector plane.
  • the prediction operation at step S320 includes populating the coordinate transformation matrices (as per equations (3-5) with the received geometric data as per step S310.
  • the focal point coordinate in world coordinates f0W and the detector position d0D in detector coordinates need to be computed and included in the matrix entries as per eqs. (3), (5).
  • the so populated coordinate transformation matrices are then multiplied as per (7a), (7b) to perform the coordinate transformation from the local object (e.g., collimator) coordinate system into the image coordinate system.
  • the prediction operation S320 is dynamically responsive, preferably in essentially real-time, to changes to the imager's and/or object position. If such a change occurs, some or all of the matrix entries in all or some of the above transformation equations are re-populated to update same and an updated set of new silhouette positions is output.
  • the imaging geometry data for population of above matrices is "simulation" data so the imager itself is not used at this stage. The user can thus tweak the geometry and object position/inclination settings until a satisfactory projection silhouette is achieved. Only then are the geometry data supplied to the imager to ready same for the actual image acquisition.
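As an illustrative aside, the matrix population and chaining described above might be sketched as follows. The rotation angles, translation vectors and projection parameters are assumed values, and the matrix forms are generic pinhole-model stand-ins rather than the patent's equations (3)-(7) verbatim:

```python
import numpy as np

def rot_x(angle_deg):
    """Homogeneous 4x4 rotation about the x-axis."""
    a = np.radians(angle_deg)
    R = np.eye(4)
    R[1:3, 1:3] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    return R

def rigid(R4, t):
    """Combine a 4x4 rotation with a translation t into one (W<-.) matrix."""
    M = R4.copy()
    M[:3, 3] = t
    return M

# Illustrative geometry: object frame C and detector frame D in world coords W.
W_from_C = rigid(rot_x(-29.0), [0.0, 115.0, -115.0])  # (W<-C)
W_from_D = rigid(rot_x(-2.0), [0.0, 115.0, -115.0])   # (W<-D), origin at focal point

# Assumed pinhole projection (I<-D): focal length f, pixel pitch p,
# principal point at the image centre (sx/2, sy/2).
f, p, sx, sy = 100.0, 0.15, 2000, 2000
I_from_D = np.array([[f / p, 0.0, sx / 2, 0.0],
                     [0.0, f / p, sy / 2, 0.0],
                     [0.0, 0.0, 1.0, 0.0]])

def project(point_C):
    """Chain (I<-D)(D<-W)(W<-C) to map an object point to image pixels."""
    D_from_W = np.linalg.inv(W_from_D)
    o = I_from_D @ D_from_W @ W_from_C @ np.append(point_C, 1.0)
    return o[:2] / o[2]  # homogeneous -> Euclidean pixel position

print(project([1.0, 1.0, 100.0]))  # one collimator vertex in object coordinates
```

When the imaging geometry changes, only the matrix entries are re-populated and `project` is re-evaluated, mirroring the dynamic update described above.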
  • silhouette may be an inner silhouette or an outer silhouette.
  • An example of an inner silhouette is shown in Fig. 1 where the silhouette is formed by at least the outlines of the aperture of the collimator. Other objects without holes in them are also envisaged where the silhouette will correspond to an outer silhouette.
  • the so identified silhouette image pixel positions Of are then output and made available for image processing or other clinical or non-clinical tasks.
  • the outputting step S330 includes, in one but not all embodiments, converting projective homogeneous coordinates back into the original coordinate language such as Euclidean, etc.
  • the output silhouette positions are forwarded to an image processor such as a filter or contrast enhancement module or a (eg, model-based) segmenter, or similar.
  • the so computed points are used to define border regions, by passing for instance spline curves through the points, to so define a region to be excluded from further image processing.
  • the silhouette positions serve as a priori knowledge for existing image processing algorithms, in particular of the iterative type, that are capable of inclusion of said prior knowledge.
  • the output silhouette positions are used, as mentioned above, for collimation control, check of the collimation window versus the image plane and/or detection of sundry calibration and/or marker objects.
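As an illustrative aside, using the predicted silhouette points to delimit a region for image processing might be sketched as follows; the silhouette vertices are assumed values and the even-odd (ray casting) test is a generic stand-in for the border definition mentioned above:

```python
# Sketch: use predicted silhouette vertices to mark pixels inside the
# predicted collimation aperture, so the shadow outside it can be
# excluded from further image processing.

def inside_polygon(x, y, poly):
    """Even-odd rule: count crossings of a rightward ray from (x, y)."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Predicted silhouette of a (slightly trapezoidal) collimation aperture:
silhouette = [(100, 100), (500, 120), (480, 400), (120, 380)]
print(inside_polygon(300, 250, silhouette))  # pixel inside the aperture -> True
print(inside_polygon(10, 10, silhouette))    # pixel in the shadow -> False
```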
  • Fig. 4 shows an exemplary output of the proposed module PM and/or method according to Fig. 3. More particularly, Fig. 4 is a non-limiting example of a collimation field projection onto a 2-dimensional X-ray image.
  • image pixels shown in grey in Fig. 4 are those that correspond to the shadow of the blocker element B, in this case the collimator blades.
  • the trapezoidal shape corresponds to the aperture of the collimator.
  • the trapezoidal shape of the aperture silhouette captured in the image as per Fig. 4 is a projective distortion that stems from an inclination between the detector plane and the collimator blades B within one axis. This inclination may be due to machine imperfections or may be necessary for special clinical examinations.
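As an illustrative aside, the projective distortion described above can be reproduced with a generic pinhole model: tilting a square aperture about one axis puts one pair of edges at different depths, so their projections have different lengths and the square becomes a trapezoid. All numbers are assumed for illustration only:

```python
import numpy as np

def pinhole(point, f=100.0):
    """Ideal pinhole projection: (x, y, z) -> f * (x/z, y/z)."""
    x, y, z = point
    return np.array([f * x / z, f * y / z])

def incline_x(point, angle_deg):
    """Rotate a point about the x-axis."""
    a = np.radians(angle_deg)
    x, y, z = point
    return np.array([x,
                     y * np.cos(a) - z * np.sin(a),
                     y * np.sin(a) + z * np.cos(a)])

# A square 20 x 20 aperture centred on the optical axis at depth z = 100,
# tilted by 29 degrees about the x-axis through its own centre:
centre = np.array([0.0, 0.0, 100.0])
corners = [np.array([cx, cy, 100.0]) for cx in (-10.0, 10.0) for cy in (-10.0, 10.0)]
tilted = [centre + incline_x(c - centre, 29.0) for c in corners]
projected = [pinhole(t) for t in tilted]

# The edge tilted towards the source projects wider than the edge tilted
# away from it, so the projected square becomes a trapezoid:
near_width = abs(projected[2][0] - projected[0][0])  # corners with cy = -10
far_width = abs(projected[3][0] - projected[1][0])   # corners with cy = +10
print(near_width, far_width)
```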
  • Fig 4 demonstrates reliability of applicant's proposed method and module PM.
  • the method correctly predicts the course of the silhouette even when X-ray source XR and detector D plane are inclined relative to each other.
  • Fig. 4A is a schematic representation of an image acquired by an X-ray imaging system with a wall-stand detector. The dark area in the center is the collimation field.
  • Fig. 4B shows an actual image corresponding to the schematic of Fig. 4A, with collimator and detector inclinations of -29° and -2°, respectively.
  • the collimator position c 0 w is equal to the focal point position f 0 w which is (0, 115, -115) T cm ("T" indicating matrix transposition).
  • the detector position d 0 w is (0, 180, 0) T cm.
  • the collimation size at 100 cm distance from the focal point is 23 cm x 20 cm. Again, these numbers are purely illustrative and in no way limiting.
  • polygon 410 indicates the result of the collimation field prediction as per the system and method explained above and polygon 405 is a manually determined reference, evidencing the accuracy of the prediction as per the proposed system and method.
  • the proposed system S and method can be used to predict a common silhouette of a complex object B made up from a system of sub-objects Bi, where the silhouette is the projective sum of the plurality of the individual sub-objects Bi making up object B.
  • This situation can arise, for instance, if collimation borders are not clearly visible in the image or if X-ray protectors are placed at the patient and interfere with the collimation field.
  • the proposed method and system allows correct identification of complex shadow patterns such as the combo-shadows formed by the superposition of the collimator blades and protectors given their coordinates as described above.
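As an illustrative aside, combining the shadows of several sub-objects Bi amounts to a pixel-wise union of their individual silhouette masks; the helper name and the tiny masks below are hypothetical:

```python
# Sketch: the combined shadow of several sub-objects Bi (e.g. collimator
# blades plus an X-ray protector) is the union of their shadow masks.
def union_of_masks(masks):
    """Pixel-wise OR of boolean shadow masks, one per sub-object."""
    out = [[False] * len(masks[0][0]) for _ in masks[0]]
    for m in masks:
        for r, row in enumerate(m):
            for c, v in enumerate(row):
                out[r][c] = out[r][c] or v
    return out

blades = [[True, True, False], [False, False, False]]
protector = [[False, False, False], [False, True, True]]
combined = union_of_masks([blades, protector])
print(combined)  # -> [[True, True, False], [False, True, True]]
```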
  • the proposed system may be used for the prediction of positions of the silhouettes in the X-ray image of dedicated markers affixed to the patient when being imaged. This allows for robust marker detection in the X-ray image when image processor IP operates on said images.
  • Other examples are calibration objects (e.g. spheres or other shapes of known dimension).
  • the proposed system or method can improve the robustness of image processing based on detection of those objects.
  • the proposed system or method can be implemented in X-ray fluoroscopy systems, for instance in order to predict collimation border footprints. This knowledge may then be used as an initial guess for image-processing-based detection of those collimation borders. For fluoroscopy systems, robust collimation detection has proved useful since otherwise the visual impression of a frame sequence may vary unnaturally strongly between successive image frames, thereby imparting undesirable flickering of image structures.
  • the pixel position specifier module PSM or the image processing module IP may be arranged as a software module or routine with suitable interfaces (such as input port IN and output port OUT) and may be run on a general purpose computing unit or a dedicated computing unit. For instance, pixel position specifier module PSM may be executed on a workstation or operator console of the imaging system 100.
  • the image pixel position specifier module PSM with some or all of its components may be resident on the executive agency (such as a general purpose computer, workstation or console) or may be accessed remotely/centrally by the executive agency via a suitable communication network in a distributed architecture.
  • the components may be implemented in any suitable programming language such as C++ or others.
  • the components of the pixel position specifier module PSM may be arranged as dedicated FPGAs (field-programmable gate array) or similar standalone chips.
  • a computer program or a computer program element is provided that is characterized by being adapted to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
  • the computer program element might therefore be stored on a computer unit, which might also be part of an embodiment of the present invention.
  • This computing unit may be adapted to perform or induce a performing of the steps of the method described above.
  • the computing unit can be adapted to operate automatically and/or to execute the orders of a user.
  • a computer program may be loaded into a working memory of a data processor.
  • the data processor may thus be equipped to carry out the method of the invention.
  • This exemplary embodiment of the invention covers both, a computer program that right from the beginning uses the invention and a computer program that by means of an up-date turns an existing program into a program that uses the invention.
  • the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
  • a computer readable medium such as a CD-ROM
  • the computer readable medium has a computer program element stored on it which computer program element is described by the preceding section.
  • a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
  • a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
  • a change in position/inclination of the object B can be brought about by changing either the object B inclination/position or the X-ray detector location/inclination, or both.
  • While both object B and detector D position/inclination may be adjustable, simpler embodiments are envisaged where either the object position/inclination or the detector position/inclination is fixed.


Abstract

A system and related method to predict a projection silhouette or contour of an object (B) residing in a path of an X-ray beam of an X-ray imaging apparatus (100). The prediction is based on geometric data describing the orientation and/or position of the object relative to the detector (D) and the X-Ray source (XR) of the X-ray imager (100).

Description

Object localization in projective x-ray images by geometric considerations
FIELD OF THE INVENTION
The invention relates to a system for supporting X-ray imaging, to a method of supporting X-ray imaging, to a computer program element and to a computer readable medium.
BACKGROUND OF THE INVENTION
In some medical X-ray imaging applications, objects like protectors and collimators are located in a path of the X-ray beam in order to reduce the patient dose. Useful as those non-anatomical objects may be for achieving dose reductions for instance, they nevertheless "imprint" unwanted image information into the eventual 2D X-ray projection image. Prior to performing post-acquisition image processing tasks (such as filtering, etc.) on X-ray images, an attempt is made to identify in the image structures that correspond to those radiation absorbing objects to ensure better image quality. On occasion, these detection attempts fail or lead to incorrect identifications. Another task is the detection of specific markers in the X-ray image for navigation or calibration purposes.
SUMMARY OF THE INVENTION
There may therefore be a need for an alternative system or method to support X-ray imaging.
The object of the present invention is solved by the subject matter of the independent claims, where further embodiments are incorporated in the dependent claims. It should be noted that the following described aspects of the invention equally apply to the method of supporting X-ray imaging, to the computer program element and to the computer readable medium.
According to a first aspect of the invention there is provided a system for supporting X-ray imaging, comprising:
an input port for receiving, based on a first coordinate system, an input specification of i) a location of an X-ray source, and/or ii) a location and/or an orientation of an object and/or iii) a location and/or orientation of an X-ray detector, the object capable of interacting with X-ray radiation from an X-ray source;
a predictor module configured to predict, based on said input specification, at least a plurality of points of a silhouette of said object in an image plane of the X-ray detector;
an output port configured to output an output specification of the plurality of silhouette points.
The proposed system allows addressing the situation where the object (or a plurality of objects) is introduced into a path of the X-ray beam to achieve a collimation or other purpose. The object is assumed to be radiation absorbing or reflecting. This then introduces a "footprint" or "shadow", that is, a contrast, in the acquired image that does not relate to anatomical structures of the actual subject to be imaged. The proposed system affords accurate pre-acquisition prediction of at least a part of said silhouette of this object in the image to be acquired. The system operates largely, if not exclusively, on said object specification, that is, on parameters/coordinates that describe the position/orientation of the object relative to the imager's geometry (in particular to the X-ray source position and/or the detector position and/or inclination) to so predict the extent and/or course of the object's silhouette even before the image is acquired. The specification of the object location is either in terms of internal object coordinates (that is, the coordinates are measured relative to a coordinate system located at the object) or the specification is in terms of "external" world coordinates. The specification is such that the location and/or inclination relative to the detector and the X-ray source are derivable (possibly in conjunction with otherwise system-geometric constants such as the position of the X-ray source or position/inclination of the detector or position/inclination of the object). Alternatively or additionally thereto, the detector position/inclination is supplied. This may be advantageous if the object is fixed (and known to the system) and it is only the detector location/inclination that is variable. In addition to or instead of all the above, the X-ray source position is supplied, for instance when both object and detector are fixed (and known to the system).
In general the supplied geometry data at input port is such that a constellation, that is, the relative inclination/location between the three components object, detector and X-ray source (in particular the position of its focal spot) is specified or derivable. The locations and/or inclinations are taken relative to one or more suitably chosen coordinate systems (such as "world" or global coordinate system).
In distinction to the proposed system, post-acquisition identification schemes that attempt to identify the silhouette in the images themselves have been proposed in the past. Post-acquisition image-based identification of the projection footprint of such objects can therefore be avoided altogether, or at least the predicted silhouette position can be used to furnish computational fiducials or prior knowledge to "guide" the computation of these post-acquisition image analyzer algorithms. Image processing based detection/identification of those object footprints can be made more reliable if the projection of these objects is predicted by geometric considerations. In other words, the proposed system can help identify image structures, in particular, but not necessarily, non-anatomical image structures.
Another use of the predicted silhouette points as envisaged herein is collimation control, in particular (but not necessarily) prior to the X-ray exposure. This includes automatic collimation adjustment by a collimation control unit when a size or location of a current collimation setting as per the silhouette points would or does lead to a collimation area/window which would extend or does extend outside an image plane of the detector.
In addition or instead of this auto-collimation the user can be provided with a feedback (visual or audio etc.) by a collimation monitor unit whether a collimation window as per the predicted silhouette points will or does extend outside the imaging plane. Again, this monitoring operation may be performed during X-ray operation but is preferably performed prior to the X-ray operation.
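As an illustrative aside, this monitoring operation reduces to a bounds test of the predicted silhouette points against the imaging plane; the function name and pixel values below are assumed for illustration only:

```python
# Sketch of the collimation-monitor check: warn if any predicted
# silhouette point falls outside the detector's imaging plane.
def collimation_within_image(silhouette_px, width_px, height_px):
    """True if every predicted silhouette pixel lies on the image plane."""
    return all(0 <= x < width_px and 0 <= y < height_px
               for x, y in silhouette_px)

silhouette = [(120, 80), (1900, 95), (1880, 1500), (140, 1480)]
print(collimation_within_image(silhouette, 2000, 2000))  # -> True
print(collimation_within_image(silhouette, 1024, 1024))  # -> False
```

A `False` result would trigger the visual or audio feedback, or the automatic collimation adjustment, described above.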
According to one embodiment, the predictor module operates to perform, based on said input specification, a transformation across a plurality of coordinate systems, including said first coordinate system. In particular, the above mentioned constellation between object location/inclination, the X-ray source location and the detector
location/inclination can be established in one embodiment by using said coordinate system transformations .
According to one embodiment, the object location/orientation specification is in terms of homogenous coordinates. This affords implementation advantages.
According to one embodiment, at least one of said coordinate systems has its origin at a focal point of the X-ray source and/or the orientation of said coordinate system is defined by the orientation of the image plane of the detector.
According to one embodiment, an optical axis of the system is defined as a projection line from a/the focal point of the X-ray source onto an image plane of the detector and/or a focal length is defined as the length of said projection line.
According to one embodiment, the object specification is acquired by a position sensor which comprises any one of the following: a plurality of optical cameras, a depth sensing camera, a position sensor of a positioning system. According to one embodiment, at least one of said plurality of coordinate systems is a local coordinate system of the detector and/or the object.
According to one embodiment, the object is any one of, but is not limited to: a collimator or a part thereof, an X-ray protector device, a marker device or a calibration device.
According to one embodiment, the predicted silhouette points correspond to an aperture of the collimator.
According to one embodiment, an orientation of the object or of an object coordinate system is inclined relative to an orientation of the image plane of the detector.
According to one embodiment, there is an image processor configured to run an image processing algorithm to process an X-ray image, wherein the predicted silhouette points are used to initialize the algorithm or as constraints for said algorithm.
According to one embodiment, the object includes one or more blades of the collimator arranged between the X-ray source and the X-ray detector, the system including a collimation control unit configured to adjust, based on the specification of the plurality of silhouette points, the collimation aperture of said collimator.
According to one embodiment, the object includes one or more blades of a collimator arranged between the X-ray source and the X-ray detector, the system including a collimation monitor unit configured to output, based on the specification of the plurality of silhouette points, an indication whether an area defined by said plurality of silhouette points corresponds, at least in size and/or location, to a predefined collimation area.
In sum, what is proposed herein is the prediction of projections of object shapes onto the detector plane by using geometry data. Geometric information about the object and/or the X-ray detector is used in one embodiment for the prediction of a collimation field in an X-ray image. Herein a method based on projective geometry is given which describes how to calculate the projection silhouette of arbitrary object points onto the image plane. More particularly, it is proposed to apply the traditional pin-hole-camera model to projective X-ray imaging. This allows for the prediction of object shapes in the projective X-ray image from the location of these shapes within the path of the X-ray beam. This concept can be extended to the prediction of other specific objects like X-ray protectors, calibration spheres or anatomic or other markers if the location of this kind of objects is known from motion sensors or a positioning system or is otherwise provided. For instance, orthopedic measurements can be simplified by ensuring more robust (automatic) identification/detection of silhouettes of markers or (other) calibration objects. Other uses of the predicted object shape/silhouette coordinates reside in collimation control or checking for correct collimator settings, etc.
BRIEF DESCRIPTION OF THE DRAWINGS
Exemplary embodiments of the invention will now be described with reference to the following drawings wherein:
Fig. 1 shows an X-ray based imaging arrangement;
Fig. 2 shows a schematic description of the imaging geometry in terms of three coordinate systems;
Fig. 3 shows a flow-chart of a method of supporting X-ray imaging;
Fig. 4 illustrates exemplary output of the proposed method as per Fig. 3.
DETAILED DESCRIPTION OF EMBODIMENTS
With reference to Fig. 1 there is shown in an examination room an X-ray based imaging arrangement including an X-ray imager 100 and a system S for supporting X-ray imaging by imager 100.
The imager 100 comprises broadly an X-ray source XR, such as an X-ray tube, and a radiation sensitive detector D made up from detector pixels PX, usually but not necessarily arranged in a rectangular array forming an image plane. The X-ray source XR is operable to emit an X-ray imaging beam and the X-ray detector D is configured to detect radiation as per said X-ray beam.
An inclination of the detector plate D relative to an axis (such as a world coordinate system axis - more details on this will be explained below at Fig. 2) and the X-ray source may be adjustable, as indicated for one degree of freedom by a rotation angle RD. Only one rotation angle RD is shown, but it is understood that two or three rotation angles may also be adjustable. The inclination of any one or both of X-ray source and X-ray detector may be adjustable. The spatial position of any one or both of the X-ray source and X-ray detector may be adjustable in some or all spatial directions: e.g., height above ground and/or lateral and longitudinal position (relative to a vertical or horizontal reference line) may all be adjustable by operation of suitable mechanical arrangements/actuators. Embodiments with manual detector position/inclination adjustments are also envisaged.
X-ray source and detector D are carried in and by a frame or gantry (not shown). Variants for the X-ray imager 100 envisaged herein are interventional X-ray imagers of the C- or U-arm type where the gantry has a "C" or transposed "U" shape. The X-ray imaging system may be floor or ceiling mounted. For some "high-end" diagnostic X-ray imaging systems and mobile X-ray imaging systems, which are also envisaged herein, there is no mechanical connection between the X-ray detector D and the X-ray source. In such systems, the X-ray source is mounted in a movable module, whereas the X-ray detector is a portable plate module completely detached from the module where the X-ray source is mounted. The clinical user is at liberty to place the X-ray detector wherever they see fit. Mobile X-ray imagers may be used for example in hospital wards or care homes to image elderly or frail patients for instance.
During an imaging session, the imaging subject, such as a human patient, is positioned on a subject support (a table or similar - not shown) in an examination region, which is broadly the space between the X-ray source XR and the detector D. The patient may lie, sit or stand in the examination region during the image acquisition. In the mobile X-ray imaging systems variant briefly introduced above, the detector is placed between the patient and the bed support, for instance if the patient is too frail to stand up, and the X-ray source of the mobile imager is then positioned above the patient whilst the patient lies in their bed on said mobile detector D.
In some embodiments, the X-ray imaging system comprises a wall-mounted or freestanding gantry that holds the X-ray source and across the examination room there is a mounting that holds the detector plate. This type of arrangement is used in chest X-ray imaging for instance. The patient "walks" into the examination region, resides there "standing" during image acquisition after the position and/or inclination of the X-ray source and detector plate has been adjusted to ensure that the X-ray beam path can pass through the ROI, that is, in this exemplary case, the chest region of the patient.
Broadly, in any of the above embodiments, during an image acquisition an X-ray beam emanates from X-ray tube XR, passes through patient PAT at said region ROI, experiences attenuation by interaction with matter therein, and the so attenuated beam PR then strikes detector D's surface at a plurality of the detector cells or pixels PX. Each pixel PX that receives X-ray radiation responds by issuing a corresponding electric signal. The collection of said signals is then translated by a data acquisition system DAS into respective digital values representative of said attenuation. The collection of the so registered digital values for each (X-)ray is then consolidated into an array of digital values forming an X-ray projection image for a given acquisition time and projection direction. The detector array may be 2D (two-dimensional) as shown in Fig 1, but 1D (one-dimensional) embodiments are also envisaged herein, where there is only a single line of detector pixels PX which is moved during image acquisition.
The DAS digitized image signals may then be rendered for view on a monitor MT or may be passed on to an image post-processor IP for filtering or contrast enhancements or any other image processing purpose. The so processed image IM may then be viewed on the monitor MT or may be stored for later reference in a database DB.
In some imaging scenarios, it may be necessary to introduce external, auxiliary objects B (that is, one or more objects B, different from and over and above the actual subject to be imaged) of relatively high radiation opacity into the X-ray beam PR or at least into the examination region to be flooded by the X-ray beam. "Relatively high radiation opacity" means in particular a significantly higher opacity than at least an average opacity of the subject's tissue at the ROI. One purpose of using such objects B in this manner is to at least locally/partially exclude X-ray radiation from patient regions which are not subject of investigation. This effectively reduces patient dose. An exemplary embodiment of such a radiation blocker element B is/are blade(s) B (also referred to as shutter(s)) of a radiation collimator COL. For instance, as schematically shown in Fig 1, collimator COL comprises two pairs of blades B formed from sheets of lead or tungsten or other highly radiation-opaque material. One pair is arranged perpendicularly to the other and the blades are individually addressable and movable by a respective collimator stepper motor (not shown) so as to restrict the X-ray beam in either or both of the two dimensions depending on their relative position. Blocker blades B may be rotatable and/or shiftable in and out towards a center to form an aperture with a contour or (inner) silhouette formed by the four blades. The silhouette or perimeter of the aperture in Fig 1 is of square or rectangular shape but any geometrical shape is envisaged herein. In this way the X-ray beam's "cross section" can be shaped to match an expected two dimensional outline of the ROI. The collimator
arrangement allows shaping the beam into square or rectangular forms in various sizes. In another embodiment a multi-leaf collimator is used comprising, instead of the four blades, a plurality (usually more than 4, e.g., 10, 20 or 30) of motor-movable slats or leaves arranged in opposing relationship. The multi-leaf collimator allows forming more detailed or more complex curvilinear shapes. Each collimator setting or configuration corresponds to a specific position of the blades B that form the collimator aperture having the silhouette or outline bounded by the blades. An inclination of the blades relative to the detector plane D may be adjustable. This degree of freedom is exemplarily indicated in Fig. 1 by rotation angle Rc for only one of the blades. Because of the high radiation opacity of blades B, the primary radiation beam incident on the blades B is blocked, whereas that part of the radiation beam that is directed at the aperture is not blocked, so it can pass through the collimator COL to irradiate the patient at a target volume or ROI. The collimator design as sketched in Fig. 1 is purely exemplary and illustrative and is not to be construed as limiting what is described herein.
The effect of the radiation blocker B in the context of collimation is illustrated in Fig. 1 by detector pixels PX shown either as clear or in hatchings. Strictly for the purposes of illustration and in no way limiting, pixels PX around the center of the detector D plate are those detector pixels that are to receive primary radiation when the radiation beam passes through the aperture formed by the blades B. These pixels PX are shown in clear rendering. In distinction from this situation, blocked out pixels that do not or are not to receive radiation are shown by hatching. It will be appreciated that a complete blocking out of primary radiation may not be achievable in all circumstances because scatter (secondary) radiation may still be received at pixels that were meant to be shielded from radiation. The collimator context and collimator blades are just an exemplary embodiment for radiation blocking objects B occasionally introduced into the path of the X-ray beam. Other embodiments for radiation blockers or artificial contrast inducing objects B used for similar or different purposes are image markers or X-ray protection devices, etc., as will be explained in more detail below.
A further, and rather undesirable effect of the presence of the one or more radiation blocking elements B in the radiation path is that the ultimate projection image obtained from image signals recorded at detector D now encodes both anatomical structure (the prime interest for imaging) and the "footprints" or "shadows" of the radiation blockers B, which have no diagnostic relevance. It will be understood that although reference is made to the elements B as "radiation blockers", said blockage may not necessarily be complete in all circumstances. Rather, what matters for present purposes is that the blocker B's presence causes a sufficiently pronounced radiation intensity reduction (and therefore an appreciable contrast in the projection image) that does not relate to anatomical structures.
Now, it has been discovered by applicant that in some image processing applications it would be desirable to know a priori which pixels encode anatomic information and which pixels encode the footprints of the radiation blocker B. For this purpose, the support system S as proposed herein includes a pixel position specifier module PSM that allows flagging up individual pixel positions that can be expected to encode the blocker B shadow or footprint or at least parts thereof. More particularly, the proposed pixel position specifier module PSM operates to identify pixel positions that are expected to encode at least a part of the projection silhouette or projection outline of the blocker B footprint. The output of pixel position specifier module PSM can be supplied in any suitable format according to different embodiments, the choice of which will depend on given implementation requirements. For instance, the silhouette pixel position specification may be output as a pixel mask (defined on the detector image plane) or as a number of discrete pixel coordinates defining a polygon in the detector image plane that circumscribes the object's silhouette or describes the border of the blocker or object footprint. Any of these formats are envisaged herein.
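The two output formats mentioned above are interchangeable: a polygon of silhouette points can be rasterized into a pixel mask. A minimal sketch of such a conversion is shown below, using the even-odd (ray casting) rule; the function name and the pixel-centre sampling convention are illustrative choices, not part of the described system.

```python
def silhouette_mask(polygon, width, height):
    """Rasterize a silhouette polygon (list of (x, y) vertices in image
    coordinates) into a boolean mask of size height x width, using the
    even-odd (ray casting) rule. Pixels inside the polygon are True."""
    mask = [[False] * width for _ in range(height)]
    n = len(polygon)
    for row in range(height):
        for col in range(width):
            x, y = col + 0.5, row + 0.5  # sample at the pixel centre
            inside = False
            for i in range(n):
                x1, y1 = polygon[i]
                x2, y2 = polygon[(i + 1) % n]
                # does a horizontal ray from (x, y) cross this edge?
                if (y1 > y) != (y2 > y):
                    x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x_cross > x:
                        inside = not inside
            mask[row][col] = inside
    return mask

# A square silhouette covering pixel rows/columns 2..5 of an 8 x 8 image
mask = silhouette_mask([(2, 2), (6, 2), (6, 6), (2, 6)], 8, 8)
```

An image processor receiving such a mask can then exclude the flagged pixels from subsequent optimization steps.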
Whatever the format, the so identified pixel positions are then forwarded to the image processor IP. Image processor IP then knows which pixels to leave out of
consideration when for instance running an optimization algorithm as is indeed used in many filtering or contrast enhancing algorithms. Inclusion, into the optimization, of pixel information that stems from the blocker B footprint may distort the outcome of such image optimizations, as undue "weight" would otherwise be given to image information of blocker shadow pixels that do not actually represent true structural (e.g., anatomic) contrast. By leaving out blocker B footprints from image processing considerations, a higher quality and fidelity of the processed image IM can be expected.
It is envisaged herein that pixel position specifier module PSM operates solely on the geometry of the X-ray imager 100. In other words, the identification of the blocker B footprints can be done prior to radiation exposure, that is, before the actual image acquisition. This has the advantage that potential patient irradiation outside the detector imaging plane can be indicated by a suitable notification system (visual or audio) to the user, so as to avoid unnecessary patient irradiation. Of course that is not to say that the proposed specifier module PSM cannot be used during the image acquisition, which is indeed envisaged herein in alternative embodiments.
Broadly, the pixel position specifier module PSM comprises an input port IN, a predictor module PGP, and an output port OUT. At input port IN, object and/or imaging geometry information is received. In one embodiment, a coordinate-system-based specification of the object/blocker B as well as the position and orientation of the detector with respect to another coordinate system is received. If the position/inclination of the detector is fixed, only the object position/inclination is received. Alternatively, if the position/inclination of the object is fixed (and known to the system, e.g. as a constant parameter), it is only the location/inclination of the detector that is received. Lastly, if object and detector are both fixed, the position of the X-ray source is received.
In particular, in one embodiment, a position and/or shape of at least parts of the blocker is received relative to a suitably chosen coordinate system, on which more below at Figs. 2, 3. The coordinates that describe the position and/or shape of the blocker or the relevant parts thereof, as well as the position and orientation of the detector, are then passed on to the predictor module PGP. The predictor module PGP uses a geometric formalism from projective geometry which uses the pinhole camera model paradigm tailored to the setting of X-ray projection imaging. Operation of predictor module PGP includes applying one, two or more (in particular three) coordinate transformations via suitable matrix multiplications to arrive at a pixel set specification that describes the predicted shadow B pixel positions in an image to be taken at the given imaging geometry setting ("imaging geometry" as used herein includes in particular X-ray tube XR position and/or X-ray detector D position/inclination) of the imager 100. This specification of the pixel positions of the blocker B footprint (or of at least parts thereof) is supplied at output port OUT and may then be forwarded to the image processing module IP. A logic module in the image processor IP then removes said shadow pixels (that is, the image information carried by those pixel positions) from the total projection image received via the DAS, and the image processing algorithm implemented by IP may then be restricted to operate only on the remaining pixel set (that is, the pixels other than those specified by the pixel position specifier module PSM) in the digital image as received via the DAS.
Another application envisaged herein is automatic length unit calibration within the patient plane. This is the plane where the patient resides during the imaging; typically this is the plane defined by the top of the examination table. This plane is different from the imaging plane, so objects will appear larger in the image plane than they are in the patient plane. Length unit calibration allows establishing the real dimensions of image structures. In this procedure, a calibration object B of spherical (with known diameter) or other suitable geometric shape of prior-known dimensions is positioned at or on the patient by gluing or sticking, or is otherwise affixed in spatial association with the patient to be imaged. If the position/inclination of the calibration object B (e.g., the sphere) is determined for instance via an electromagnetic positioning system or is otherwise provided, the footprint of the sphere B can be predicted by the geometric formalism and system proposed herein. Even perspective distortions caused by the detector plane not being aligned with the calibration object relative to the X-ray source (in particular not being perpendicular to the beam from the X-ray source through the center of the calibration object) can be accounted for in the proposed method.
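For the idealised case where the detector plane is perpendicular to the central ray, the length unit calibration reduces to a similar-triangles magnification correction. The following sketch illustrates this simplified case only (the function name and parameter names are illustrative); the full method described above also handles tilted-detector distortions.

```python
def true_length(image_length_mm, source_detector_mm, source_object_mm):
    """Simplified length-unit calibration: an object in the patient
    plane is magnified on the detector by SDD/SOD (source-detector
    distance over source-object distance), so its true size is the
    measured image size divided by that magnification factor."""
    magnification = source_detector_mm / source_object_mm
    return image_length_mm / magnification

# A 25 mm shadow of a calibration sphere at SOD = 800 mm and
# SDD = 1000 mm corresponds to a true diameter of 20 mm.
d = true_length(25.0, 1000.0, 800.0)
```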
Similarly, the footprints of markers B placed at the patient can also be predicted in the X-ray images. Such markers are used for instance to indicate anatomical regions relevant for further diagnostic work. Instead of or in addition to these detection tasks, the proposed system and method can also be used for collimation control prior to actual X-ray exposure. This means that a collimation window which, based on the predicted aperture silhouette points, would extend partly outside the imaging plane or which would turn out otherwise ill-positioned is automatically adjusted for by a suitable collimation controller CC that interacts with the collimator's actuator. A closed feedback loop can be used to compare a sequence of silhouette positions and make the necessary adjustments until the correct collimation window is achieved.
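The closed feedback loop mentioned above can be sketched as follows. The callables `predict_silhouette` and `move_blades`, the tolerance, and the simulated blade dynamics are all hypothetical stand-ins for imager-specific interfaces, not part of the described system.

```python
def adjust_collimation(predict_silhouette, move_blades, target_window,
                       tolerance_px=1.0, max_iterations=20):
    """Closed-loop collimation control sketch: repeatedly predict the
    aperture silhouette from the current blade configuration and nudge
    the blades until the predicted window matches the target."""
    for _ in range(max_iterations):
        predicted = predict_silhouette()          # list of (x, y) points
        errors = [(tx - px, ty - py)
                  for (px, py), (tx, ty) in zip(predicted, target_window)]
        if all(abs(ex) <= tolerance_px and abs(ey) <= tolerance_px
               for ex, ey in errors):
            return True                           # correct window achieved
        move_blades(errors)                       # apply a correction
    return False                                  # did not converge

# Simulated usage: blade silhouette points start offset from the target
# and are moved half the residual error per iteration.
_state = [(0.0, 0.0), (10.0, 0.0)]
def _predict():
    return list(_state)
def _move(errors):
    for i, (ex, ey) in enumerate(errors):
        x, y = _state[i]
        _state[i] = (x + 0.5 * ex, y + 0.5 * ey)
converged = adjust_collimation(_predict, _move, [(2.0, 1.0), (12.0, 1.0)])
```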
In an alternative embodiment, rather than automatically adjusting the collimator blades, the system includes a collimation monitor unit CMU configured to output, based on the specification of the plurality of silhouette points, an indication whether the area as defined by said plurality of silhouette points corresponds, at least in size and/or location, to a predefined collimation area. If there is no such correspondence, because the collimator window would extend beyond the imaging plane of the detector, an indication is given to the user via a suitable (audio or visual) alert signal. Of course, the same principles can be applied to objects B other than collimator components, so that the user can assess, pre-acquisition, whether the object's projection footprint/silhouette can be expected to appear at the intended position/region of the image to be acquired. This could be indicated to the user graphically on screen MT where a suitable graphic element is rendered that represents the borders of the image plane with a polygon overlaid thereon representing the predicted silhouette points. The user can then update detector and/or object position/inclination until the object
footprint/silhouette appears at or near the intended image region or at an acceptable perspective distortion, etc.
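The core check performed by such a collimation monitor unit is simple: do all predicted silhouette points fall inside the imaging plane? A minimal sketch (with hypothetical function and parameter names) is:

```python
def silhouette_within_image(points, width_px, height_px):
    """Pre-acquisition check: True when every predicted silhouette
    point (x, y), in image pixel coordinates, falls inside the
    detector imaging plane, so an alert can be raised before any
    exposure if the collimation window would extend beyond it."""
    return all(0.0 <= x < width_px and 0.0 <= y < height_px
               for x, y in points)

ok = silhouette_within_image([(10, 10), (500, 400)], 1024, 768)
bad = silhouette_within_image([(10, 10), (1100, 400)], 1024, 768)
```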
The position and/or orientation and/or shape specification of the blocker element B is supplied in one embodiment by a position or shape sensor PS. Various embodiments for PS are envisaged: for instance, position sensor PS may be implemented as a system of optical cameras or a depth sense camera, or may be achieved by a position sensor that interfaces with control logic of actuators used to operate the positioning of the individual blades B in the collimator, for instance. The sensor PS arranged on or at the blade may also be part of a positioning system, as will be explained in more detail below. Embodiments where it is the user who manually supplies the coordinate specification of the blocker B are also envisaged. The specification is preferably in 3D although embodiments for 2D coordinate specifications are also contemplated. For instance, the user inputs via suitable user input means the points or positions of the blocker element B for which the user wishes a prediction to be computed. In one embodiment, the user input is supported by a graphical user interface (GUI) system. In an exemplary embodiment, a CAD (computer-aided design) or other graphic/schematic representation of the blocker element B is displayed on monitor MT or on a separate dedicated screen. The user can then choose the positions or points of the blocker element B which are then forwarded to the specifier module PSM for processing, as explained above. In one embodiment, the user may choose by touch screen or computer mouse or e-stylus or other suitable input means the individual positions, which are then translated by an event handler into coordinates relative to a suitable coordinate system (as will be explained in more detail below) and these points are then passed on to the pixel position specifier module PSM. The manually supplied points may be specified in 3D for instance.
The geometric principles underlying operation of the pixel position specifier module PSM as proposed herein will now be explained in more detail. Applicant discovered that the application of the traditional pinhole camera model to projective X-ray imaging may be achieved by realizing the following analogies:
[Table 1 is reproduced as an image in the original document. It lists the analogies between the pinhole camera model and projective X-ray imaging, such as the camera center corresponding to the focal point of the X-ray tube XR and the camera imaging plane corresponding to the detector D imaging plane.]
Table 1
In projective X-ray imaging object points are located between the focal point of the X-ray tube XR and the detector image plane, whereas the object points in the pinhole camera model are located behind the imaging plane seen from the camera center. However, this difference does not affect the projection onto the image plane. Furthermore, in the camera model the optical axis directly defines the orientation of the camera, while in projective X-ray imaging the optical axis is a result of the detector orientation as per Table 1 above. As a consequence, in the proposed system, the position of the imaging plane with respect to the projection of the optical axis onto the detector plane, as well as the focal length, both depend on the detector position and inclination. For detector versus object tilts/inclinations, the projection of the optical axis onto the detector plane can even be located outside the imaging plane x', y'. The optical axis of the proposed system is implicitly (or indirectly) defined by the detector D position and inclination RD. As a further difference to the pinhole camera model, in projective X-ray imaging object points are described by an object coordinate system which has, in one embodiment, its own location and orientation. An example is the coordinate system C of the collimator COL that describes the mutual position of the collimator blades B with respect to an internal reference point in the collimator as mentioned above.
Therefore, when adapting the pinhole camera paradigm to present X-ray imaging purposes we first transform the object points into a "World" coordinate system (W) and then into the detector coordinate system (D). Finally the object point is projected from the detector coordinate system onto the x-y plane of the image coordinate system (I). The geometric situation in projective X-ray imaging is illustrated in Fig. 2. The origin of the detector coordinate system is located at the focal point of the X-ray tube. The origin of the collimator coordinate system is typically also located at the focal point but the orientation can differ from the detector orientation. The world coordinate system W can be located anywhere in the X-ray room, its origin preferably being conveniently chosen so as to take advantage of natural system symmetries for instance. To make the point again, the tailoring of the pinhole camera model to present purposes entails the counterintuitive effect that detector D is now "virtually" "split" up into two spatially and conceptually separate coordinate systems: that of the detector itself and, different therefrom, the image plane I, which is in general parallel to the x-y plane of the detector coordinate system and would ordinarily be placed at one corner or the center of the detector, but other choices for the origin are also envisaged. But, as shown in Fig. 2, the detector coordinate system D is now tied to the focal point of the X-ray source XR, away from where the detector is physically located. On top of this, there is also an optional, local, object coordinate system C that describes locally the shape and orientation (that is, its inclination relative to the detector D) of the object B to be projected.
This additional object coordinate system C is advantageous (but not required in all embodiments) if the locations of object B points are determined by internal motion sensors, which is the case for a collimator where the position and inclination of its blades can be conveniently specified with reference to an object coordinate system having an internal reference point as its origin. More generally, this local object coordinate system C definition may also be used with benefit for objects B that are otherwise deformable, foldable etc.
In other embodiments, no such internal object coordinate system is used and the object position/inclination is described with reference to the world coordinate system W. For instance, definition via the world coordinate system may be useful for objects whose points are more or less "free" to vary in space. In such a situation, the object inclination/position may be determined for instance (but not necessarily in all embodiments) by some external positioning system, with said points being specified in world coordinates; the object coordinate system C is then not necessary.
Fig. 2 is a geometrical illustration of the pinhole camera model adapted to the present X-ray imaging context. The detector position is defined as the position of the detector D's center, but again other reference points such as corner points may also be used instead. In world and detector coordinates the detector position is denoted by d_0^W and d_0^D, respectively. The focal distance and the focal point (of the X-ray source XR) are denoted by f = (d_0^D)_z and f_0^W, respectively. Note that the origin of the detector coordinate system is located in the focal point and that the collimator and detector coordinate systems have different orientations. Here the collimator and the detector are inclined with respect to the world coordinate system around the x-axis by an angle α and β, respectively. In other words, object and detector inclination are specified by angles α, β around a world coordinate system W axis for instance. In Fig 2, o^C is an object B point, e.g. a collimator vertex. The various coordinate systems as shown in Fig. 2 are: the world coordinate system W, object coordinate system C, detector coordinate system D and image coordinate system I. The related coordinate transformations (shown in more detail below at equations (3)-(5)) are denoted as (W ← C) for the object- (e.g., collimator)-to-world coordinate transformation, (W ← D) for the detector-to-world coordinate transformation, and (I ← D) for the detector-to-image coordinate transformation, which implicitly includes instructions for the actual projection onto the image plane.
The projections of object points onto the detector D plane are established herein by harnessing a projective geometry formalism similar to the one used in computer vision or camera calibration techniques. In this formalism, a Euclidean n-dimensional vector v is represented by an (n+1)-dimensional homogeneous vector ṽ in the projective space. The correspondence between these vectors is as follows:

v → ṽ = (v, 1)^T   (1)

ṽ → v = (ṽ_1/ṽ_{n+1}, ..., ṽ_n/ṽ_{n+1})^T   (2)

where the focal point of the X-ray source XR is the projective center.
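The correspondences (1) and (2) amount to appending a 1 to go into homogeneous coordinates and dividing by the last component to come back. A minimal sketch (function names are illustrative):

```python
def to_homogeneous(v):
    """Eq. (1): represent an n-dimensional Euclidean vector as an
    (n+1)-dimensional homogeneous vector by appending a 1."""
    return list(v) + [1.0]

def from_homogeneous(v):
    """Eq. (2): recover the Euclidean vector by dividing the first
    n components by the (n+1)-th component."""
    w = v[-1]
    return [c / w for c in v[:-1]]

p = to_homogeneous([3.0, 4.0, 5.0])    # -> [3.0, 4.0, 5.0, 1.0]
q = from_homogeneous([6.0, 8.0, 2.0])  # -> [3.0, 4.0]
```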
Within the projective space, the transformations between the various coordinate systems as per Fig 2 can be briefly described by homogeneous matrices:
(W ← C) = ( R_C      c_0^W )
          ( 0  0  0    1   )   (3)

(W ← D) = ( R_D      f_0^W )
          ( 0  0  0    1   )   (4)

(I ← D) = ( f/p_x    0      s_x/2 - (d_0^D)_x/p_x    0 )
          ( 0        f/p_y  s_y/2 - (d_0^D)_y/p_y    0 )
          ( 0        0      1                        0 )   (5)
The first two 4 × 4 matrices each include a 3 × 3 rotation matrix (R_C, R_D) and a 3-dimensional translation vector (c_0^W, f_0^W) which describes the origin of the respective source coordinate system. For the detector coordinate system the origin is equal to the focal point f_0^W. The third matrix is a 3 × 4 projection matrix. It projects points in the detector coordinate system onto the imaging plane. In matrix (I ← D), p_x, p_y and s_x, s_y denote pixel size and number of pixels in x and y direction in the detector imaging plane, respectively. The origin of the image coordinate system I is located in one embodiment at the upper left corner of the imaging plane, where the x and y axes point along the respective edges of the imaging plane, for instance left and downwards, respectively. Again, this origin definition is but one embodiment, and other locations on the imaging plane may also be used instead.
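The rigid-transform matrices of the form used for (W ← C) and (W ← D) can be sketched as follows; the rotation is taken about the x-axis, matching the inclination angles α, β of Fig. 2, and the numeric values in the usage example are illustrative.

```python
import numpy as np

def rotation_x(angle_rad):
    """3x3 rotation matrix about the x-axis, the axis around which the
    collimator and detector are inclined in Fig. 2."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])

def homogeneous_transform(rotation, translation):
    """4x4 homogeneous matrix of the (W <- C) / (W <- D) form: a 3x3
    rotation block, a 3-vector translation, and a (0 0 0 1) last row."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = translation
    return m

# (W <- D) for a detector inclined by beta = 10 degrees, with the focal
# point f_0^W (here an illustrative value) as the detector-system origin
w_from_d = homogeneous_transform(rotation_x(np.deg2rad(10.0)),
                                 [0.0, 0.0, 1200.0])
```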
The projection onto the detector plane requires the focal distance, which is the z-component of the detector position in the detector coordinate system. In homogeneous coordinates the detector position transforms as follows:

d_0^D = (W ← D)^(-1) d_0^W   (6)
Now, object B points o^C given in the collimator coordinate system C can be projected onto the x-y plane of the image coordinate system I by successive matrix application:

o^I = (I ← D)(W ← D)^(-1)(W ← C) o^C   (7a)

or

o^I = (I ← D)(W ← D)^(-1) o^W   (7b)

The coordinate transformation as per (7a) represents the situation where the local object B coordinate system is used, whereas eq (7b) represents the situation where the object B location/inclination/orientation is given relative to the world coordinate system and there is no local object coordinate system. Accordingly, matrix (W ← C) is required only when the object coordinate system is used.
As will be apparent from the above transformation equations (7a, b) in view of eqs (3)-(5), the coordinate specification of the object position/inclination is transformed via the transformations (7a), (7b) relative to the detector, with the leftmost matrix (I ← D) on the right-hand sides of (7a), (7b) implementing the actual projection onto the detector image plane. In yet other words, the transformations across the various coordinate systems do both: they relate object location/inclination to the detector and then effect the projection to obtain the projected silhouette points on the detector imaging plane as output for the input object points.
The homogeneous 3-dimensional vector o^I can be transformed back to a 2-dimensional Euclidean image vector by the correspondence described above at (1) and (2):

o^I → (o^I_1/o^I_3, o^I_2/o^I_3)^T   (8)
Examples for object points which are projected onto the image plane are vertices of the collimation field.
As an exemplary application, a prediction of the collimation field using the formalism described herein may be achieved by establishing geometric data as per Table 2 below:
[Table 2 is reproduced as an image in the original document. It lists the geometric data, such as positions and inclinations of the collimator blades, the detector and the X-ray source, required for the collimation field prediction.]
Table 2
According to one embodiment, the geometric data in Table 2 above or similar can be queried from the X-ray imager 100 by interfacing with specific motion sensors that interface with actuators (such as stepper motors) configured to adjust the position/orientation of the object B (and hence the inclination relative to the detector D). This may involve querying the electronics of the actuators for internal states (such as the number of rotations or steps executed) to establish object coordinates. In this case it may be convenient to use the previously mentioned local object coordinate system C. In other embodiments, the geometric data for position/orientation of the object B of interest and/or that of the detector are established by using sensors PS such as camera systems, for instance an optical camera or a depth-sensing camera. The camera system acquires a "scout" image of the geometry of the imager so that the position of the object versus the detector is encoded in this image. The (infrared or optical) scout image information can then be converted into Euclidean (or, as the case may be, any curvilinear such as cylindrical coordinates, depending on the geometry of the object-imager system under consideration) world coordinates if a plurality of optical cameras are used. These are then converted into projective coordinates as per (1), (2) and the so converted geometrical data can then be processed as per eqs (3)-(8) above. In another embodiment one may use an electromagnetic positioning system (with an operation principle not unlike that of "GPS") to track the position/inclination/orientation of the object B. In such an electromagnetic tracking or positioning system one or more sensors are affixed to the object, which is thereby trackable in an electromagnetic field created via a field generator. During the tracking, the respective coordinates of the tracked points are established preferably with respect to the world coordinate system W.
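Querying actuator internal states, as mentioned above, typically reduces to converting a step count into a coordinate in the local collimator coordinate system C. A hypothetical one-line conversion (the step pitch and home offset are illustrative, not specified in the original):

```python
def blade_position_mm(steps, mm_per_step=0.05, home_mm=0.0):
    """Hypothetical conversion of a collimator stepper-motor state
    (number of executed steps) into a blade edge coordinate in the
    local collimator coordinate system C."""
    return home_mm + steps * mm_per_step

edge = blade_position_mm(400)  # 400 steps of 0.05 mm = 20.0 mm
```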
It will be understood that all of the above described options to establish the object B position/inclination apply equally to the acquisition of the detector D position/inclination and are envisaged either instead of or in addition to the determination of the object B position/inclination. The same applies to the X-ray source position.
A result of the collimation field prediction using the proposed method is illustrated in Fig. 4. Although the data is illustrative for the collimator embodiment, it will be understood that the above can be readily adapted to objects B in the examination region other than collimator blades.
With reference to Fig. 3, there is shown a flowchart for a method of supporting X-ray imaging as implemented by the predictor module PGP.
At step S310, a coordinate description o^C or specification of one or more points of the object B is received. The specification may be generated automatically by a position control mechanism for positioning said object B, may be furnished by an optical camera, or may be manually input by a user as described above by means of a suitable graphical user interface or similar. Although specifying each and every point of the object B may be advantageous in some embodiments, it is preferred herein that only certain points are specified (automatically or by the user), for example corners of the object. The coordinate specification of the object B with respect to the object or world coordinate system defines a position in space of said object and/or an orientation of said object, so that the position and orientation of the object in space relative to at least one (or both) of the detector D and the X-ray source is defined. The object specification may include taking into account an inclination of the object relative to the X-ray source and/or the X-ray detector plane and/or the relative inclination between the X-ray source and the X-ray detector plane.
In one embodiment, said object specification includes spatial position coordinates in terms of world coordinates and (if applicable) inclination angles of the object B and/or the detector plane relative to the world coordinate system W as per Fig. 2, although other coordinate specifications are also envisaged herein.
As a matter of computational reality, a discretization of some sort will need to be employed to specify a discrete number of points suitably positioned so as to capture the shape of the object and/or its inclination. For instance, in some embodiments, only certain "salient" points such as the corners of a polygon (or, more generally, points of a convex hull of the object) are received at step S310. For instance, in the case of a rectangular collimation aperture, it may be sufficient to specify the plane of the blades (eg, 4) and/or the position of the vertices (for instance, 4 vertices) forming the aperture. For other object B geometries, fewer or more points need to be specified, which will depend on the complexity of the shape and/or its symmetry. For instance, for a circular shape B it may be sufficient to specify the plane of the circle and the center point if the diameter is known, or one may also have to specify the diameter if it is not known a-priori. Discrete points on the circle, chosen at a possibly user-adjustable step width, are then selected as the point specification for the object B's position and/or inclination. In sum, the received object specification of the location/inclination of the object includes one or more points of the object which are to be projected onto the image plane to establish the silhouette points.
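The discretization of a circular object outline at a user-adjustable step width, as described above, can be sketched as follows; this is an illustrative sketch only (the circle is assumed to lie in a plane of constant z of its local coordinate system, and the function name is an assumption):

```python
import math


def circle_points(center, radius, step_deg=10.0):
    """Sample discrete points on a circle lying in a plane z = const,
    at a possibly user-adjustable angular step width (in degrees)."""
    cx, cy, cz = center
    n = int(round(360.0 / step_deg))  # number of sample points
    return [(cx + radius * math.cos(2.0 * math.pi * k / n),
             cy + radius * math.sin(2.0 * math.pi * k / n),
             cz)
            for k in range(n)]
```

The resulting point list would then serve as the object specification received at step S310.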
Alternatively or in addition to the object points, position and/or inclination of the detector D plane are received at step S310. This may be advantageous in situations where the object itself is essentially fixed and it is only the location/inclination of the detector within the examination region that is variable. Similarly, if object and detector are fixed, it is only the location of the (movable) X-ray source XR that is received at step S310.
Preferably, the user supplies only the detector location/inclination and/or the object location/inclination. The remaining imaging geometry is then automatically interrogated from the imager's control system, for instance through an imaging geometry data stream. In general, the geometric information received at step S310 is such that the relative position/inclination between object, detector and X-ray source is specified or can at least be derived from the supplied information. It will be understood that the specification of some or all three components, that is, the detector location/inclination, the object location/inclination and the X-ray source location, can be received from a user input, or the specification is supplied (automatically) by the system through the positioning or localization system PS as described above. "Hybrid" solutions are envisaged and were described above, where part(s) of the specification is supplied by the user whereas other part(s) are supplied by the system. In one preferred embodiment, the user supplies only the object position/inclination, with the remaining geometry data (X-ray source, detector) being supplied by the system. In other embodiments, the position/inclination of only the detector is supplied and again the system complements this by supplying the remaining data. In yet other embodiments the user supplies only the detector location/inclination and the object location/inclination. There is also an embodiment where the location/inclination (the latter applying only to object and detector) of all three components is supplied by the system. The user then merely needs to place the object into the examination region, for instance, and the system takes care of the rest. In case the object is a collimator or other system component resident in the examination region, the user does not even have to place the object but merely requests a "silhouette predict" mode from an operator console, and the system outputs the silhouette points as described above. If there is more than one object/component of interest in the examination region, the system may prompt the user in an interrogation routine to select which object's silhouette he or she wishes to be predicted.
In the above step S310, the coordinates are preferably homogeneous or projective coordinates, but the coordinates may be specified in any suitable coordinate system "language" such as Euclidean or curvilinear (spherical, cylindrical, polar) and may be transformed into homogeneous or projective coordinates before being passed on to the following silhouette predictor step S320.
At step S320, an outline/silhouette of the object or a certain part thereof in the image plane of the detector is then predicted. The prediction is based on the received coordinate specification of the location and orientation/inclination of the object, in particular relative to the detector plane. The prediction operation at step S320 includes populating the coordinate transformation matrices (as per equations (3)-(5)) with the received geometric data as per step S310. In particular, the focal point coordinate f0W in world coordinates and the detector position d0D in detector coordinates need to be computed and included in the matrix entries as per eqs (3), (5). The so populated coordinate transformation matrices are then multiplied as per (7a), (7b) to perform the coordinate transformation from the local object (e.g. collimator) coordinates or world coordinate system to the image plane coordinates in the detector plane, so as to identify a set of pixel positions among the detector pixel positions that represent points of the object's silhouette. It will be understood that in one embodiment the prediction operation S320 is dynamically responsive, preferably in essentially real-time, to changes of the imager's and/or object's position. If such a change occurs, some or all of the matrix entries in all or some of the above transformation equations are re-populated to update same, and an updated set of new silhouette positions is output. In one embodiment, the imaging geometry data for the population of the above matrices is "simulation" data, so the imager itself is not used at this stage. The user can thus tweak the geometry and object position/inclination settings until a satisfactory projection silhouette is achieved. Only then are the geometry data supplied to the imager to ready same for the actual image acquisition.
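The overall chain of step S320 — transforming object points into a source-centred frame and centrally projecting them onto the detector pixel grid — can be sketched generically as follows. This is a hedged, generic pinhole-projection sketch under assumed conventions (detector frame with the focal point at the origin and the optical axis along z, principal point at the image centre); it is not a reproduction of the patent's matrices (3)-(8):

```python
import numpy as np


def project_to_pixels(points_w, T_d_from_w, f, px, py, sx, sy):
    """Project 3-D object points given in world coordinates onto the
    detector and return fractional pixel positions.

    T_d_from_w : 4x4 homogeneous transform from world coordinates into an
                 assumed source-centred frame (focal point at the origin,
                 optical axis along z, detector plane at distance f).
    f          : focal distance; px, py: pixel sizes; sx, sy: pixel counts.
    """
    pts = np.asarray(points_w, dtype=float)
    homog = np.c_[pts, np.ones(len(pts))]   # append homogeneous coordinate
    cam = (T_d_from_w @ homog.T).T          # points in the source-centred frame
    u = f * cam[:, 0] / cam[:, 2]           # central (perspective) projection
    v = f * cam[:, 1] / cam[:, 2]
    i = u / px + sx / 2.0                   # metric units -> pixel indices,
    j = v / py + sy / 2.0                   # principal point at image centre
    return np.stack([i, j], axis=1)
```

In the actual method, the matrix T_d_from_w would be the product of the populated transformations as per (7a), (7b), and the returned positions are the predicted silhouette pixel positions.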
The term "silhouette" (or outline or contour) as used herein may be an inner silhouette or an outer silhouette. An example of an inner silhouette is shown in Fig. 1 where the silhouette is formed by at least the outlines of the aperture of the collimator. Other objects without holes in them are also envisaged where the silhouette will correspond to an outer silhouette.
At step S330, the so identified silhouette image pixel positions Of are then output and made available for image processing or other clinical or non-clinical tasks. The outputting step S330 includes, in particular in one but not in all embodiments, converting the projective homogeneous coordinates back into the original coordinate language such as Euclidean, etc.
In one embodiment the output silhouette positions are forwarded to an image processor such as a filter or contrast enhancement module or a (eg, model-based) segmenter, or similar. This allows restricting operations of such an image processing module to only those points that do not stem from the object B shadow or footprint. For instance, in one embodiment, the so computed points are used to define border regions by passing, for instance, spline curves through the points so as to define a region to be excluded from consideration by the respective image processing algorithm. The silhouette positions serve as a-priori knowledge for existing image processing algorithms, in particular of the iterative type, that are capable of inclusion of said prior knowledge. In other embodiments, the output silhouette positions are used, as mentioned above, for collimation control, checking of the collimation window versus the image plane, and/or detection of sundry calibration and/or marker objects.
Fig. 4 shows an exemplary output of the proposed module PM and/or method according to Fig. 3. More particularly, Fig. 4 is a non-limiting example of a collimation field projection onto a 2-dimensional X-ray image. In other words, the image pixels shown in grey in Fig. 4 are those that correspond to the shadow of the blocker element B, in this case the collimator blades. The trapezoidal shape corresponds to the aperture of the collimator. The trapezoidal shape of the aperture silhouette captured in the image as per Fig. 4 is a projective distortion that stems from an inclination between the detector plane and the collimator blades B about one axis. This inclination may be due to machine imperfections or may be necessary for special clinical examinations. In either case, Fig. 4 demonstrates the reliability of applicant's proposed method and module PM. In particular, the method correctly predicts the course of the silhouette even when the X-ray source XR and the detector D plane are inclined relative to each other. Fig. 4A) is a schematic representation of an image acquired by an X-ray imaging system with a wall-stand detector. The dark area in the center is the collimation field representative of high X-ray dose. The dotted line indicates the result of the collimation field prediction as per the system and method explained above. The solid polygon is a manually determined reference. Fig. 4B) shows an actual image of the scene schematically shown in Fig. 4A), with collimator and detector inclination at -29° and -2°, respectively. The collimator position c0W is equal to the focal point position f0W, which is (0, 115, -115)T cm ("T" indicating matrix transposition). The detector position d0W is (0, 180, 0)T cm. The collimation size at 100 cm distance from the focal point is 23 cm x 20 cm. Again, these numbers are purely illustrative and in no way limiting. In Fig. 4B), polygon 410 indicates the result of the collimation field prediction as per the system and method explained above, and polygon 405 is a manually determined reference, evidencing the accuracy of the prediction as per the proposed system and method.
It will be appreciated from the above that the proposed system S and method can be used to predict a common silhouette of a complex object B made up from a system of sub-objects Bi, where the silhouette is the projective sum of the plurality of the individual sub-objects Bi making up object B. This situation can arise, for instance, if collimation borders are not clearly visible in the image or if X-ray protectors are placed on the patient and interfere with the collimation field. The proposed method and system allows correct identification of complex shadow patterns, such as the combined shadows formed by the superposition of the collimator blades and protectors, given their coordinates as described above.
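The "projective sum" of several sub-object shadows amounts to a per-pixel union of the individual silhouette regions; a simple illustrative sketch (assuming each sub-object's shadow is already available as an equally sized boolean grid) is:

```python
def combined_silhouette(masks):
    """Combine per-sub-object shadow masks into one common silhouette:
    a detector pixel belongs to the combined silhouette if it lies in
    the shadow of ANY sub-object Bi (logical OR over all masks)."""
    rows, cols = len(masks[0]), len(masks[0][0])
    return [[any(m[r][c] for m in masks) for c in range(cols)]
            for r in range(rows)]
```

For collimator blades plus X-ray protectors, each component would first be projected individually as per step S320 and the resulting masks then merged this way.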
Application of the proposed system or method to contexts other than collimation is also envisaged. For instance, the proposed system may be used for the prediction of the positions of the silhouettes in the X-ray image of dedicated markers affixed to the patient when being imaged. This allows for robust marker detection in the X-ray image when the image processor IP operates on said images. Other examples are calibration objects (e.g. spheres or other shapes of known dimension). In this context the proposed system or method can improve the robustness of image processing based on detection of those objects. The proposed system or method can be implemented in X-ray fluoroscopy systems, for instance in order to predict collimation border footprints. This knowledge may then be used as an initial guess for image processing based detection of those collimation borders. For fluoroscopy systems, robust collimation detection has proved useful since otherwise the visual impression of a frame sequence may vary unnaturally strongly between successive image frames, thereby imparting undesirable flickering of image structures.
Operation of the proposed system or method in the collimator context can be established by the following experimental procedure:
i. incline or tilt the detector D with respect to the imager's collimator, or tilt or incline the collimator with respect to the detector D,
ii. adjust the collimator to establish a trapezoidal collimation field within an X-ray image to be acquired,
iii. acquire a first X-ray image and check the detected trapezoidal collimation field,
iv. place an external collimator between the system collimator and the detector,
v. align the external collimator with the detector so as to be parallel to the detector plane,
vi. adjust the collimation field of the external collimator to be within the collimation field of the system collimator,
vii. acquire a second X-ray image,
viii. if the collimation field as per the second X-ray image does not differ significantly from the first trapezoidal collimation field, this fact may be taken as an indication of correct operation of the proposed system or method.
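The comparison in step viii ("does not differ significantly") could, purely as an illustration, be quantified by the relative size of the symmetric difference of the two detected collimation fields; the function, the pixel-set representation and the 2 % tolerance are assumptions, not part of the described procedure:

```python
def fields_agree(field_a, field_b, tolerance=0.02):
    """Return True if two detected collimation fields (given as sets of
    pixel coordinates) do not differ significantly, i.e. the fraction of
    pixels in their symmetric difference, relative to their union, stays
    below an illustrative tolerance."""
    a, b = set(field_a), set(field_b)
    union = a | b
    if not union:            # both fields empty: trivially identical
        return True
    return len(a ^ b) / len(union) <= tolerance
```

In practice the two fields would come from the detections in the first and second X-ray images of the procedure above.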
It will be understood that the above may also be applied to objects other than collimators. The pixel position specifier module PSM or the image processing module IP may be arranged as a software module or routine with suitable interfaces (such as input port IN and output port OUT) and may be run on a general purpose computing unit or a dedicated computing unit. For instance, the pixel position specifier module PSM may be executed on a workstation or operator console of the imaging system 100. The image pixel position specifier module PSM, with some or all of its components, may be resident on the executive agency (such as a general purpose computer, workstation or console) or may be accessed remotely/centrally by the executive agency via a suitable communication network in a distributed architecture. The components may be implemented in any suitable programming language such as C++ or others.
In one embodiment, the components of the pixel position specifier module PSM may be arranged as dedicated FPGAs (field-programmable gate arrays) or similar standalone chips.
In another exemplary embodiment of the present invention, a computer program or a computer program element is provided that is characterized by being adapted to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
The computer program element might therefore be stored on a computer unit, which might also be part of an embodiment of the present invention. This computing unit may be adapted to perform or induce a performing of the steps of the method described above.
Moreover, it may be adapted to operate the components of the above-described system. The computing unit can be adapted to operate automatically and/or to execute the orders of a user.
A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method of the invention.
This exemplary embodiment of the invention covers both a computer program that right from the beginning uses the invention and a computer program that by means of an update turns an existing program into a program that uses the invention.
Further on, the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
According to a further exemplary embodiment of the present invention, a computer readable medium, such as a CD-ROM, is presented wherein the computer readable medium has a computer program element stored on it which computer program element is described by the preceding section. A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
According to a further exemplary embodiment of the present invention, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application.
However, all features can be combined providing synergetic effects that are more than the simple summation of the features.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
It will be appreciated from the above that a relative change in position/inclination of the object B (e.g. collimator blade) can be brought about by changing either the object B inclination/position or the X-ray detector location/inclination, or both. In other words, although in general both the object B and detector D position/inclination are adjustable, simpler embodiments are envisaged where either the object position/inclination or the detector position/inclination is fixed.
Definitions:
W - World coordinate system
C - Collimator coordinate system
D - Detector coordinate system
I - Image coordinate system
(W←C) - Homogeneous transformation from collimator to world coordinates
(W←D) - Homogeneous transformation from detector to world coordinates
(I←D) - Homogeneous transformation from detector to image coordinates
d0 - Detector position (3-dimensional vector)
f0 - Focal point (3-dimensional vector)
c0 - Collimator position (3-dimensional vector)
f - Focal distance
o - Object point position (3-dimensional vector)
T - Transpose of a matrix or vector
Px - Pixel size in x-direction
Py - Pixel size in y-direction
Sx - Number of pixels in x-direction
Sy - Number of pixels in y-direction

CLAIMS:
1. A system for supporting X-ray imaging, comprising:
an input port (IN) for receiving, based on a first coordinate system and in terms of homogeneous coordinates, an input specification of i) a location of an X-ray source, and ii) a location and/or an orientation of an object (B) and iii) a location and/or orientation of an X-ray detector (D), the object capable of interacting with X-ray radiation from the X-ray source (XR);
a predictor module (PGP) configured to predict, based on said input specification and on at least one geometric transformation, at least a plurality of points of a silhouette of said object in an image plane of the X-ray detector (D);
an output port (OUT) configured to output, in terms of coordinates, an output specification of the plurality of silhouette points.
2. System of claim 1, wherein the predictor module (PGP) operates to perform based on said input specification the at least one geometric transformation across a plurality of coordinate systems, including said first coordinate system.
3. System of claim 2, wherein at least one of said coordinate systems has its origin at a focal point of the X-ray source and/or the orientation of said coordinate system is defined by the orientation of the image plane of the detector.
4. System of any one of claims 1-3, wherein an optical axis of the system is defined as a projection line from a or the focal point of the X-ray source (XR) onto an image plane of the detector (D) and/or a focal length is defined as the length of said projection line.
5. System of any one of claims 1-4, wherein the input specification is acquired by a position sensor (PS) which comprises any one of the following: a plurality of optical cameras, a depth sensing camera, a position sensor of a positioning system.
6. System of any one of claims 1-5, wherein at least one of said plurality of coordinate systems is a local coordinate system of the detector and/or the object.
7. System of any one of claims 1-6, wherein the object is any one of, but is not limited to: a collimator (COL) or a part thereof, an X-ray protector device, a marker device or a calibration device.
8. System of claim 7, wherein the predicted silhouette points correspond to an aperture of the collimator.
9. System of any one of claims 1-8, including an image processor (IP) configured to run an image processing algorithm to process an X-ray image, wherein the predicted silhouette points are used to initialize the algorithm or as constraints for said algorithm.
10. System of any one of claims 7 or 8, wherein the object (B) includes one or more blades of the collimator (COL) arranged between the X-ray source (XR) and the X-ray detector (D), the system including a collimation control unit (CC) configured to adjust, based on the specification of the plurality of silhouette points, the collimation aperture of said collimator.
11. System of any one of claims 7 or 8, wherein the object (B) includes one or more blades of a collimator (COL) arranged between the X-ray source (XR) and the X-ray detector (D), the system including a collimation monitor unit (CMU) configured to output, based on the specification of the plurality of silhouette points, an indication whether or not an area defined by said plurality of silhouette points extends outside a predefined collimation area.
12. Method of supporting X-ray imaging, comprising the steps of:
receiving (S310), based on a first coordinate system and in terms of homogeneous coordinates, an input specification of i) a location of an X-ray source, and ii) a location and/or an orientation of an object (B) and iii) a location and/or orientation of an X-ray detector (D), the object capable of interacting with X-ray radiation from an X-ray source (XR);
based on said input specification and on at least one geometric transformation, predicting (S320) at least a plurality of points of a silhouette of said object in an image plane of the X-ray detector; and
outputting (S330), in terms of coordinates, an output specification of the plurality of silhouette points.
13. A computer program element for controlling a system according to any one of claims 1-11, which, when being executed by a processing unit, is adapted to perform the method steps of claim 12.
14. A computer readable medium having stored thereon the program element of claim 13.
PCT/EP2016/052099 2015-02-03 2016-02-02 Object localization in projective x-ray images by geometric considerations WO2016124554A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15153619 2015-02-03
EP15153619.0 2015-02-03

Publications (1)

Publication Number Publication Date
WO2016124554A1 true WO2016124554A1 (en) 2016-08-11

Family ID: 52462155

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/052099 WO2016124554A1 (en) 2015-02-03 2016-02-02 Object localization in projective x-ray images by geometric considerations

Country Status (1)

Country Link
WO (1) WO2016124554A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963612A (en) * 1997-12-31 1999-10-05 Siemens Corporation Research, Inc. Apparatus for C-arm calibration for 3D reconstruction in an imaging system utilizing planar transformation
US20020122534A1 (en) * 2001-01-05 2002-09-05 Polkus Vincent S. Image cropping for asymmetrical imaging
US20030138078A1 (en) * 2002-01-18 2003-07-24 General Electric Company Crd Radiation imaging system and method of collimation
US20140205058A1 (en) * 2013-01-21 2014-07-24 Shimadzu Corporation Radiographic apparatus and an image processing method therefore
EP2767238A1 (en) * 2013-02-13 2014-08-20 Dental Imaging Technologies Corporation Automatic field-of-view size calculation constraint


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HU ZHANLI ET AL: "Region-of-interest reconstruction for a cone-beam dental CT with a circular trajectory", NUCLEAR INSTRUMENTS & METHODS IN PHYSICS RESEARCH. SECTION A: ACCELERATORS, SPECTROMETERS, DETECTORS, AND ASSOCIATED EQUIPMENT, ELSEVIER BV * NORTH-HOLLAND, NL, vol. 708, 18 January 2013 (2013-01-18), pages 39 - 45, XP028990607, ISSN: 0168-9002, DOI: 10.1016/J.NIMA.2013.01.003 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109922734A (en) * 2016-09-29 2019-06-21 皇家飞利浦有限公司 Imaging system with the limitation of dynamic beam size
EP3518767B1 (en) * 2016-09-29 2024-02-28 Koninklijke Philips N.V. Imaging system with dynamic beam size limitation
CN110831502A (en) * 2017-06-27 2020-02-21 皇家飞利浦有限公司 X-ray misuse protection
CN110831502B (en) * 2017-06-27 2024-01-16 皇家飞利浦有限公司 X-ray misuse protection
US10531850B2 (en) 2017-09-07 2020-01-14 General Electric Company Mobile X-ray imaging with detector docking within a spatially registered compartment
EP3870055A4 (en) * 2018-10-22 2022-08-17 ControlRad Inc. Control system for x-ray imaging system
CN109875589A (en) * 2019-01-29 2019-06-14 东软医疗***股份有限公司 A kind of measurement method and device of the error of centralization of vascular machine system
CN110507338A (en) * 2019-08-30 2019-11-29 东软医疗***股份有限公司 Localization method, device, equipment and Digital X-ray Radiotive system
CN110507338B (en) * 2019-08-30 2022-12-27 东软医疗***股份有限公司 Positioning method, device and equipment and digital X-ray photography system
CN115511831A (en) * 2022-09-27 2022-12-23 佳木斯大学 Data analysis processing system and method for tissue embryo pathological section
CN116311085A (en) * 2023-05-19 2023-06-23 杭州睿影科技有限公司 Image processing method, system, device and electronic equipment
CN116311085B (en) * 2023-05-19 2023-09-01 杭州睿影科技有限公司 Image processing method, system, device and electronic equipment


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16702539; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16702539; Country of ref document: EP; Kind code of ref document: A1)