WO2006092602A1 - 3D ultrasound imaging - Google Patents

3D ultrasound imaging

Info

Publication number
WO2006092602A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
image
parameters
data
body part
Prior art date
Application number
PCT/GB2006/000746
Other languages
English (en)
Inventor
David Hawkes
Philip Edwards
Dean Barratt
Graeme Penney
Original Assignee
Kings College London
Priority date
Filing date
Publication date
Priority claimed from GB0504173A external-priority patent/GB0504173D0/en
Application filed by Kings College London
Publication of WO2006092602A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0875 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of bone
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1127 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/505 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of bone
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/08 Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical

Definitions

  • The present invention relates to a computer-implemented method of improving the geometric accuracy of a 3D ultrasound image, and in particular to registering that image to a computer model of a body part.
  • Ultrasound imaging techniques can be used to generate 3D images of body parts.
  • Known techniques involve tracking the location and orientation of an ultrasound transducer while a patient is being scanned.
  • Information on the location and shape of the patient's body parts can be derived by combining the image data and the data from tracking the transducer.
  • Tracking techniques which can be used in relation to the transducer are well known.
  • Data relating to the locations of points on the surface or within a body part can be created by extracting points from ultrasound images and calculating their positions relative to a reference coordinate system.
  • The body part is then represented in the coordinate system by a sparsely sampled collection of points that are derived from the ultrasound scan data.
  • The surface points are then registered (or matched) to a computer-generated surface model of the same body part derived from pre-operative scan data (for example by CT, X-ray or magnetic resonance imaging techniques) using a standard point-to-surface matching algorithm.
  • The transformation which is calculated in this way provides an estimate of the physical-to-model transformation.
  • The surface point data can also be used to modify a generic computer model of the body part so that it more closely represents the shape and size of the body part of the particular patient who is being scanned.
  • Localisation errors can arise from ultrasonic effects which limit the accuracy of the physical-to-model transformation. Such errors can arise from assumptions made in ultrasound imaging techniques, for example concerning the speed at which sound travels through tissue, the path taken by the sound waves between the transmitter, the target tissue location and the receiver, and the effect of beam thickness. These errors can restrict the geometric accuracy with which images can be created, and limit the accuracy of ultrasound-based techniques when applied to procedures such as computer-aided orthopaedic surgery. They can also restrict the accuracy of registration of an image generated by ultrasound scanning with another image, for example from a CT scan or a preoperative model. Such registration is often a required step in the preparation of a surgical procedure in which a computer-based system is used for navigation.
  • The present invention provides a technique for automatically updating predetermined estimates of 3D ultrasound calibration parameters using patient-specific data, which can compensate for errors introduced by assumptions inherent in ultrasound image formation and conventional calibration techniques.
  • The technique of the invention involves including calibration parameters as free parameters in the numerical optimisation that forms the basis of a point-to-surface matching algorithm used to calculate a physical-to-model or other registration transformation.
  • The invention provides a computer-implemented method for improving the accuracy of a 3D ultrasound image of a body part, comprising: determining at least one ultrasound scaling parameter used to capture the ultrasound images; and using an ultrasound scaling parameter as a variable parameter during matching of the 3D ultrasound image to another 3D image or model of the body part.
  • The technique provided by the invention makes use of data on the location and orientation of a tracked object which is rigidly attached to the ultrasound probe during scanning of a body part, and the predetermined spatial relationship between this object and the ultrasound image.
  • The position and orientation of the ultrasound image relative to the tracked object is defined by a rigid-body transformation described by at least five, and preferably six, parameters; these provide six calibration parameters for the 3D ultrasound system. Two or three further calibration parameters are the voxel scaling parameters of the ultrasound image.
  • Location and orientation data will be provided in at least four degrees of freedom, preferably at least five degrees of freedom, especially six degrees of freedom.
  • This information can be used to calculate a transformation between a set of points (from an ultrasound image) and a surface (a preoperative model).
  • An algorithm which can be used for this is the Iterative Closest Point (ICP) algorithm.
  • Ultrasound scaling parameters are used in the point-to-surface matching step in addition to the location and orientation data.
  • The scaling parameters define the pixel (for a 2D ultrasound image) or voxel (for a 3D or volumetric ultrasound image) dimensions of an ultrasound image and are effectively calibration parameters for the system under consideration. They are used in combination with the parameters which define the rigid-body transformation between the ultrasound image and the tracked object attached to the ultrasound probe.
  • The invention involves optimising a combination of at least five rigid-body coordinates which define the rigid-body transformation between the ultrasound image and the representation of the body part, and at least one scaling parameter for the ultrasound system, corresponding to the direction of propagation of the ultrasound waves.
  • The invention involves optimising, in addition, at least one other scaling parameter for the ultrasound system.
  • The invention involves optimising the coordinates or parameters which define the location and orientation of the ultrasound probe in the reference coordinate system.
  • Another algorithm for matching points to a surface can be based on the techniques described by A Fitzgibbon in his paper "Robust registration of 2D and 3D point sets" in Proc BMVC 2001, pp 411-420 (2001).
  • The algorithm is modified when applied to the present invention in that any combination of ultrasound calibration parameters is allowed to change to provide an improved estimate of their true values.
  • The point-to-surface matching method involves estimating at least two ultrasound scaling parameters relating to the scaling of an ultrasound image in the lateral and axial directions, more preferably at least eight calibration parameters.
  • Two or three parameters which are estimated in the point-to-surface matching step relate to the ultrasound image scaling (for example in mm pixel⁻¹) in the lateral and axial directions, and preferably in the elevational direction for a volumetric 3D ultrasound image, relative to the ultrasound beam.
  • An estimate is made of the physical-to-model transformation which relates the location and orientation of the ultrasound image to the 3D image or to a model of the scanned body part.
  • The ultrasound calibration parameters are set to fixed values which are derived from a phantom calibration experiment.
  • The point-to-surface matching routine is run again but with the ultrasound calibration parameters allowed to vary, to provide final registration parameters and updated estimates of the ultrasound calibration parameters.
  • The point-to-surface matching routine can be run again using the registration parameters from the previous run as starting parameters for the matching routine.
  • The method can include carrying out a second estimate of the transformation after the first estimate.
  • The second estimate can use a reduced data set for the ultrasound image compared to the first estimate.
  • The data set for the ultrasound image can be reduced by removing data points which are sufficiently far from the fit of the ultrasound image arrived at from the first estimate. For example, the 10% most distant or outlying data points can be removed from the data set prior to the second estimate.
  • The second estimate can use the parameters obtained from the first estimate as starting values for the parameters during the second estimate.
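The 10% trim described above can be sketched as follows; the point set and distance values are hypothetical stand-ins for segmented US surface points and their point-to-surface distances after a first registration estimate:

```python
import numpy as np

# Hypothetical segmented US bone-surface points and their distances to the
# fitted model surface after a first registration estimate.
rng = np.random.default_rng(0)
points = rng.normal(size=(50, 3))
distances = np.abs(rng.normal(size=50))

# Discard the 10% most outlying points; keep the 90% closest to the surface.
keep = np.argsort(distances)[: int(0.9 * len(distances))]
trimmed = points[keep]
```

The trimmed set is then passed to the second matching run in place of the full point set.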
  • The invention also provides a data processing apparatus comprising a data processor and a memory storing computer program instructions which can cause the data processor to determine at least one ultrasound scaling parameter used to capture ultrasound images, and to use an ultrasound scaling parameter as a variable parameter during matching of a 3D ultrasound image to another 3D image or model of the body part.
  • Embodiments of the present invention employ various processes involving data stored in or transferred through one or more computer systems.
  • Embodiments of the present invention also relate to an apparatus for performing these operations.
  • This apparatus may be specially constructed for the required purposes, or it may be a general-purpose computer selectively activated or reconfigured by a computer program and/or data structure stored in the computer.
  • The processes presented herein are not inherently related to any particular computer or other apparatus.
  • Various general-purpose machines may be used with programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required method steps. A particular structure for a variety of these machines will appear from the description given below.
  • Embodiments of the present invention relate to computer-readable media or computer program products that include program instructions and/or data (including data structures) for performing various computer-implemented operations.
  • Examples of computer-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media; semiconductor memory devices; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM).
  • The data and program instructions of this invention may also be embodied on a carrier wave or other transport medium.
  • Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
  • Figure 1 shows a flow chart illustrating a computer-aided surgical procedure in which the invention can be used.
  • Figure 2 shows an ultrasound probe which can be used to capture ultrasound data used by the invention.
  • Figure 3 shows a schematic illustration of a surgical set-up which can be used to allow the invention to be carried out.
  • Figure 4 shows a schematic block diagram of apparatus including the probe shown in Figure 2 and the set-up shown in Figure 3 which can be used to carry out the invention.
  • Figure 5 shows a process flow chart illustrating a first embodiment of a registration method of the invention.
  • Figure 6 shows a process flow chart illustrating a second embodiment of the registration method of the invention.
  • Figure 7 shows a schematic block diagram of a data processing apparatus according to the invention for carrying out the registration method illustrated in Figures 5 and 6.
  • With reference to Figure 1, there is shown a flow chart illustrating a surgical procedure 100 in which the registration method of the present invention can be used.
  • the registration method will be described within the context of a surgical procedure being carried out on a femur but the invention is not limited to that particular application.
  • In a pre-operative initial step 102, images of the patient's body part, in this instance the femur, are captured by carrying out a CT scan of the femur.
  • the digitized CT scan data is then transferred, e.g. over a network, or otherwise made available to the computer of the computer aided surgery (CAS) system that will carry out the registration method.
  • Alternatively, an MR scan may be carried out instead of a CT scan, and indeed any imaging modality can be used that can generate a sufficiently accurate representation or model of the patient's body part for use in the subsequent CAS procedure.
  • The preoperative imaging step is optional, and in another embodiment a statistical shape model (SSM) approach is used to create the model of the patient's body part, the SSM being instantiated using data collected by ultrasound imaging intra-operatively.
  • The term "model" will be used herein to refer both to an instantiated SSM model of the patient's body part and to a virtual representation of the patient's body part derived from captured images obtained by whatever imaging method.
  • A marker bearing a plurality of infrared LEDs or reflective spheres, trackable by the tracking system of the CAS system, is attached to the patient's body part to allow the position of the patient's body part in the reference frame of the tracking system to be dynamically determined.
  • This trackable marker will be referred to herein as a Dynamic Reference Object or DRO.
  • This intra-operative step is not illustrated in Figure 1.
  • At step 104, an ultrasound ("US") image of the patient's body part is taken intra-operatively and the US scan data is processed and made available to the computer of the CAS system.
  • At step 106, the registration method of the invention is used to register the patient model in the reference frame of the tracking system of the CAS system.
  • The registered patient model can then be used to navigate various instruments, implants and other entities during a navigated surgical procedure 108.
  • The navigated surgical procedure may include various general operations, such as planning the surgical procedure or image-guided surgical steps, with which a person of ordinary skill in the art will be familiar. Subsequent steps have not been described in greater detail so as not to obscure the present invention.
  • Fig. 3 shows a schematic diagram illustrating the set-up used for the registration method, together with the rigid-body transformations (T) that relate the 3D coordinate systems of the various elements indicated.
  • T rigid-body transformations
  • The model of the patient's body part in its frame of reference is illustrated by element 2, and Figure 3 shows the ultrasound probe 12 being used to capture a US image 5 of the patient's femur 20.
  • The apparatus includes a standard clinical US scanner 10 (such as a Philips-ATL HDI-5000 as provided by Philips Medical Systems, Bothell, WA) with a high-frequency linear-array scan-probe 12 (such as an L12-5 probe with a 5-12 MHz broadband transducer) which are used to obtain the US images.
  • The scan-probe can be tracked by a suitable trackable marker 3, such as an array of LEDs, as shown in Figs. 2 and 3.
  • The 3D position of each LED can be measured by the optical localiser 1 of the tracking system 14, and these positions are used to calculate the position and orientation of the marker 3 relative to the dynamic reference object marker 4.
  • Suitable acquisition software, for example written in C++ using Visual Studio 6 (Microsoft Corp., USA), is used to synchronize US image capture with tracking measurements.
  • US images are captured using an analogue-to-digital converter 16 (such as a Canopus ADVC-100, as provided by Canopus UK, Berkshire, UK) connected between the composite video output of the US scanner 10 and the computer 18 of the CAS system.
  • Computer 18 includes or has access to a mass storage device 22 which can store various data required by the CAS system, including the US image data and the pre-operatively captured CT data or other data defining the model 2.
  • The scan-probe 12 can be covered with a plastic sheath containing a small quantity of US gel to maintain acoustic coupling.
  • The scan-probe should be swept across the skin surface slowly during step 104.
  • The rigid-body calibration transformation for the 3D US system can be determined using a point-target phantom as is known in the art and as described, for example, in "Accuracy of an Electromagnetic Three-Dimensional Ultrasound System for Carotid Artery Imaging", Ultrasound Med Biol, Vol. 27, pp. 1421-1425, 2001, which is incorporated herein by reference for all purposes.
  • Other calibration techniques can be used, such as those described in "A Review of Calibration Techniques for Freehand 3D Ultrasound Systems", Ultrasound Med Biol, Vol. 31, pp. 449-471, 2005, the disclosure of which is incorporated herein by reference.
  • The calibration procedure involves obtaining tracked images of a pinhead immersed in a water-glycerol solution from many different views. A plurality of images captured from different directions are used to calculate the calibration transformation. The coordinates of the pinhead in each image are identified manually using custom-written software, and the pixel scaling can be determined using electronic calliper measurements provided by the US scanner.
  • Tracked US images of the surface of the femur are acquired from many different views of the anterior and the posterior femoral shaft, the greater trochanter, the femoral neck, and the epicondyles.
  • US images can be acquired in 2D compounding mode ('SonoCTTM' with the level of compounding set to 'Survey') with a maximum penetration depth of 6cm and a single focal zone. Before images are acquired, the level of the focal zone can be adjusted to approximately correspond to the average depth of bone surface in order to maximise the image resolution at that depth.
  • In compounding mode, a number of images are formed by electronically steering the US beam through several preset angles. These images are then combined to produce a single compounded image. Compounding generally reduces speckle artefacts, but has also been found to improve the echo intensity from regions of a curved bone surface not perpendicular to the axial direction of the image.
  • Bone surfaces can be automatically, semi-automatically or manually segmented from the captured US images. Suitable software can be used to help segment bone surfaces. Using the software, points lying on the bone surface can be selected based on the maximum intensity of the edge corresponding to the bone-soft-tissue interface. A cubic spline can then be fitted to these points to approximate the surface within the image slice so that an arbitrary number of bone surface points can be generated for each image. An average of 10 points can be used per image. The segmented US data is then stored in storage device 22 for use during the registration process.
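The spline-based resampling of segmented bone-edge points described above might be sketched as follows; the edge coordinates and the 10-point sample count are illustrative, and SciPy's `CubicSpline` stands in for the unspecified segmentation software:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical bone-edge points selected at the maximum-intensity edge of the
# bone-soft-tissue interface: (lateral pixel column, axial pixel row).
cols = np.array([40.0, 80.0, 120.0, 160.0, 200.0])
rows = np.array([300.0, 290.0, 286.0, 291.0, 302.0])

# Fit a cubic spline through the selected points (axial row as a function of column).
spline = CubicSpline(cols, rows)

# Resample an arbitrary number of surface points along the spline (10 per image here).
sample_cols = np.linspace(cols[0], cols[-1], 10)
surface_points = np.column_stack([sample_cols, spline(sample_cols)])
```

The resampled `surface_points` would then be stored, per image, for use during registration.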
  • Method 40 corresponds generally to method step 106 of the method 100 illustrated by Figure 1.
  • US images of the patient's femur 20 are captured 42 using the ultrasound probe 12.
  • The ultrasound system has certain parameters which characterise its performance while the US images are captured, such as an assumed value for the speed of sound in soft tissue, which determines the pixel scaling of US images in the axial and lateral directions.
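As a concrete illustration of how the assumed speed of sound sets the axial pixel scaling: scanners convert echo time to depth assuming a nominal speed (commonly 1540 m/s in soft tissue), so if the true average speed differs, axial distances scale by the same ratio. A minimal sketch with hypothetical values:

```python
# The scanner converts echo time to depth assuming a nominal speed of sound.
# If the true average speed in the patient's tissue differs, the axial pixel
# scaling s_y is off by the ratio of the two speeds.
c_assumed = 1540.0   # m/s, typical scanner assumption
c_true = 1575.0      # m/s, hypothetical patient-specific average

s_y_nominal = 0.150  # mm/pixel, hypothetical value from phantom calibration
s_y_corrected = s_y_nominal * (c_true / c_assumed)
```

Allowing s_y to vary during matching, as the invention proposes, absorbs exactly this kind of discrepancy.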
  • The rigid-body transformation for the femur between the US images and the body model is estimated by optimising the six parameters (three translations and three rotations) that define that transformation.
  • During this first determination of the transformation, the six parameters (three translations and three rotations) defining the transformation T_PROBE←US between the US image and the probe, and the US axial and lateral scaling parameters, are set at the values determined from calibration of the probe.
  • The transformation between the US images and the bone model is then determined again, but using the values of the six parameters for the transformation determined by the preceding step as starting values and setting the eight US calibration parameters as free during the optimisation routine.
  • The output 48 of the second optimisation is then the six parameters defining the transformation, which can be used to perform registration, and the eight US parameters which define T_PROBE←US and the US image scaling.
  • Figure 6 shows a process flow chart illustrating data processing operations carried out by the computer 18 or data processing apparatus of the CAS system.
  • The process has access to data 52 defining bone points obtained from the segmented US images and also data 54 representing bone surfaces from the pre-segmented CT scan images.
  • The registration method 50 uses an implementation of an algorithm to register US-derived bone surface points to the CT-extracted bone surface.
  • A similar point-to-surface matching algorithm is described in "Robust Registration of 2D and 3D Point Sets", Proceedings of the British Machine Vision Conference, 2001, pp. 411-420.
  • The method uses a generalized non-linear optimisation scheme to directly find a parameter vector, Θ, that minimizes a sum-of-squares cost function, C, of the form

    C(Θ) = Σᵢ Σⱼ [ D( xᵢⱼ(Θ) ) ]²    (1)

    where D(·) is the distance from a point to the CT bone surface and xᵢⱼ(Θ) is the j-th bone surface point pᵢⱼ (in pixel coordinates) segmented from US image i, mapped into the CT model coordinate system:

    xᵢⱼ(Θ) = T_CT←DRO · T_(OPT←DRO,i)⁻¹ · T_(OPT←PROBE,i) · T_PROBE←US · T_SCALE · pᵢⱼ    (2)

  • Here Θ = [Θ_REG, Θ_CAL, s_x, s_y] contains 6 rotations, 6 translations, and 2 US scaling parameters: Θ_REG comprising the three rotations and three translations defining the registration, and Θ_CAL comprising the three rotations and three translations defining the calibration between the US probe image and the attached trackable marker.
  • The transformation matrices T_(OPT←DRO,i) and T_(OPT←PROBE,i) are calculated from measurements made by the optical tracking system localizer, and specify the 3D position and orientation of the DRO and US probe, respectively, relative to the coordinate system, or reference frame, of the tracking system.
  • The subscript i explicitly denotes the dependency on the image index.
  • The matrix T_PROBE←US in (2) is the rigid-body calibration matrix for the 3D US system.
  • T_SCALE is a four-by-four diagonal scaling matrix in which the diagonal elements are {s_x, s_y, 1, 1}, where s_x and s_y are scaling factors (in mm/pixel) for the lateral and axial directions of the US image, respectively, and is given by the matrix T_SCALE = diag(s_x, s_y, 1, 1).
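The composition of these matrices can be sketched as follows. All numeric values are hypothetical stand-ins: in practice T_(OPT←DRO,i) and T_(OPT←PROBE,i) come from the tracker, T_PROBE←US and T_SCALE from calibration, and T_CT←DRO is the registration transformation being sought.

```python
import numpy as np

def rigid(rz_deg, t):
    """4x4 homogeneous rigid-body transform: rotation about z, then translation."""
    a = np.radians(rz_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = t
    return T

# Hypothetical stand-in transforms for one US image i.
T_opt_dro = rigid(10.0, [5.0, 0.0, 0.0])      # DRO pose in the tracker frame
T_opt_probe = rigid(25.0, [40.0, 10.0, 2.0])  # probe pose in the tracker frame
T_probe_us = rigid(-5.0, [1.0, 2.0, 0.0])     # rigid-body calibration matrix
T_scale = np.diag([0.12, 0.15, 1.0, 1.0])     # s_x, s_y in mm/pixel
T_ct_dro = rigid(0.0, [0.0, 0.0, 0.0])        # registration sought by optimisation

# Map a US pixel point into the CT/model coordinate system, as in (2).
M = T_ct_dro @ np.linalg.inv(T_opt_dro) @ T_opt_probe @ T_probe_us @ T_scale
p_pix = np.array([200.0, 310.0, 0.0, 1.0])    # homogeneous; z = 0 for a 2D slice
p_ct = M @ p_pix
```

Each mapped point `p_ct` is then fed to the distance function D(·) in the cost C(Θ).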
  • The calibration parameters are determined preoperatively. These parameters are included in the optimization. With reasonable starting estimates of the values of the parameters in Θ, a simultaneous registration and calibration can be performed by minimizing C(Θ) as defined in (1).
  • C is minimized using an implementation of an iterative trust-region algorithm.
  • An implementation of the trust-region optimisation algorithm is provided, for example, in the Matlab Optimisation Toolbox (v. 3.0). This algorithm is a general non-linear optimization algorithm rather than one based on iterative computation of an intermediate closed-form solution, such as the Iterative Closest Point (ICP) algorithm.
  • Each parameter is scaled by the inverse of the magnitude of the corresponding column vector of the Jacobian, as described in "Rapid Calibration for 3-D Freehand Ultrasound", Ultrasound Med. Biol., Vol. 24, pp. 855-869, 1998, incorporated herein by reference for all purposes, in order to ensure a well-conditioned system of equations.
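The column-norm scaling can be illustrated as follows; the Jacobian values are hypothetical, chosen to show how rotation (radians) and translation (mm) columns can differ in magnitude by orders of magnitude:

```python
import numpy as np

# Hypothetical Jacobian of the residuals with respect to two parameters whose
# natural units differ greatly, giving an ill-conditioned system.
J = np.array([[120.0, 0.5],
              [ 95.0, 1.0],
              [110.0, 0.8]])

# Scale each parameter by the inverse of its Jacobian column norm, so that every
# column of the scaled Jacobian has unit magnitude.
col_norms = np.linalg.norm(J, axis=0)
J_scaled = J / col_norms
```

After optimisation in the scaled space, parameter updates are divided by `col_norms` to recover their natural units.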
  • US-point-to-CT-surface distances are found efficiently to sub-voxel precision using trilinear interpolation of a precomputed 3D Euclidean distance transform of the CT bone surface.
  • The binary image is created as follows.
  • The segmented CT image is resampled over a uniform grid with 1 mm spacing in all directions using linear interpolation.
  • The resampled image is then binarized, with the intensity of the interior region of the bone set to 1 and the intensity set to 0 elsewhere.
  • A Canny edge filter is then applied to each transverse slice to find voxels corresponding to the bone surface.
  • Distance transforms are computed using Matlab, and point-to-surface distances are computed to sub-voxel precision by linearly interpolating voxel intensities of the distance transform.
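A sketch of the precomputed distance transform and sub-voxel lookup, using SciPy functions in place of the Matlab routines mentioned above, and a toy planar "bone surface" at z = 10 on a 1 mm grid:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, map_coordinates

# Toy binary surface volume: surface voxels lie on the plane z = 10
# (a stand-in for the Canny-extracted CT bone surface).
vol = np.zeros((32, 32, 32), dtype=bool)
vol[:, :, 10] = True

# Euclidean distance transform: distance from every voxel to the nearest
# surface voxel (computed on the complement of the surface mask).
dist = distance_transform_edt(~vol)

# Point-to-surface distance at a non-integer position via trilinear interpolation.
pt = np.array([[5.0], [7.0], [12.5]])  # (x, y, z) in voxel coordinates
d = map_coordinates(dist, pt, order=1)[0]
```

Precomputing `dist` once makes each US-point-to-surface distance lookup a cheap interpolation inside the optimisation loop.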
  • The registration algorithm outlined above is adapted to execute as a 3-step algorithm as illustrated in Figure 6.
  • Initial estimates of the starting registration parameters can be used at the beginning of the first step 56 of the method.
  • The surgeon can use a marked pointer which is trackable by the tracking system to identify a number of anatomical points of the femur which are easy to identify, such as the medial and lateral epicondyles and the greater trochanter.
  • The centre of rotation of the femoral head can also be used; it can be determined by pivoting the femur about the femoral head while tracking the motion of the marker attached to the femur, as is understood in the art.
  • An initial registration of the model of the femur is then carried out by matching the corresponding anatomical features in the model of the femur, e.g. in the CT scan, with the captured anatomical positions, so as to provide starting values for the rigid-body transformation parameters used in the optimization carried out at step 56.
  • Palpable anatomical landmarks for the femur are the epicondyles and the greater trochanter; for the pelvis, the standard landmarks are the superior pubic rami and the anterior superior iliac spines.
  • Self-adhesive skin markers, which are visible in a preoperative image, can also be used for determining an approximate starting position.
  • T_CT←DRO can be estimated using 3 or more points determined in physical space using a pointer and identified in the model/image space. These two sets of corresponding points are then matched together using a standard closed-form solution (e.g. the Procrustes method). The resulting transformation is approximate, as there are errors associated with defining anatomical landmarks due to skin movement.
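The closed-form Procrustes solution can be sketched as follows via the standard SVD (Kabsch) construction; the landmark coordinates are hypothetical:

```python
import numpy as np

def procrustes_rigid(src, dst):
    """Closed-form least-squares rigid transform (R, t) mapping src onto dst,
    via the standard SVD (Kabsch) solution."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Hypothetical landmarks picked in the model/image space...
model_pts = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 10.0],
                      [10.0, 80.0, -5.0], [30.0, 40.0, 20.0]])
# ...and the same landmarks digitised with a tracked pointer in physical space
# (here synthesised by applying a known rotation and translation).
a = np.radians(30.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([12.0, -7.0, 3.0])
physical_pts = model_pts @ R_true.T + t_true

R, t = procrustes_rigid(model_pts, physical_pts)
```

The recovered (R, t) provides the approximate starting transformation; skin movement and landmark-picking error mean the result is only used to seed the optimisation.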
  • outlier points, i.e. those US points which have not been well fitted to the bone surface, are automatically removed by transforming the US points using the updated registration transformation defined by Θ_REG1 and removing the 10% of points furthest from the bone surface.
  • a second rigid-body registration is performed at step 60 using the remaining US points in the data set, with Θ set equal to Θ_REG1 as the starting registration parameters, to yield a second updated set of registration parameters, Θ_REG2.
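The 10% outlier-rejection rule described above amounts to sorting the points by their (post-registration) point-to-surface distance and keeping the closest 90%. A minimal sketch, with hypothetical names:

```python
import numpy as np

def discard_outliers(distances, us_points, fraction=0.10):
    """Remove the given fraction of US points furthest from the bone surface.

    distances: (N,) point-to-surface distances after the first registration.
    us_points: (N, 3) segmented US bone-surface points.
    Returns the retained points (by default, the closest 90%).
    """
    n_keep = int(np.ceil(len(us_points) * (1.0 - fraction)))
    keep = np.argsort(distances)[:n_keep]  # indices of the closest points
    return us_points[keep]
```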
  • [Θ_REG2, Θ_CAL0, s_x0, s_y0] as starting values
  • the algorithm is run again at step 62, this time including the calibration parameters as free parameters in the optimization. This results in Θ_REG, which provides the three translations and three rotations required to register the body part and the model of the body part.
  • all eight calibration parameters can be used as free parameters in the optimization. In other embodiments, fewer than all eight calibration parameters can be used as free parameters. For example, s_y (which is directly related to the average speed of sound in tissue) can be included as a single additional free parameter.
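The link between s_y and the speed of sound follows from the pulse-echo relation depth = c·t/2: if the true average speed of sound differs from the value the scanner assumes, the axial scale factor changes in proportion. A sketch of the pixel-to-probe scaling, assuming a scanner default of 1540 m/s (a common convention, used here only as an illustrative assumption):

```python
import numpy as np

ASSUMED_SPEED = 1540.0  # m/s, scanner's assumed average speed of sound

def pixel_to_probe(u, v, s_x, s_y):
    """Map US image pixel (u, v) into the probe plane using the
    lateral (s_x) and axial (s_y) calibration scale factors."""
    return np.array([s_x * u, s_y * v, 0.0])

def axial_scale_for_speed(s_y0, true_speed):
    """Since depth = c * t / 2, the axial scale s_y is directly
    proportional to the average speed of sound in the tissue."""
    return s_y0 * (true_speed / ASSUMED_SPEED)
```

Treating s_y as a free parameter therefore lets the registration compensate for the tissue-dependent speed of sound without any separate calibration step.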
  • the algorithm described above uses prior segmentation of bone surface points from US images. In order to maximize accuracy, manual point extraction can be used but incurs a large time overhead. However, other methods of obtaining accurate, automatic segmentation of the bone surface from US images can be used. A number of potential approaches to segmentation have been proposed, any of which could provide input data for the algorithm without modification.
  • a further issue is the treatment of outliers in the US point set.
  • outlier points are defined as the furthest 10% of points from the bone surface. Although this method is easy to implement, potentially useful, non-outlying points may be discarded. More sophisticated outlier removal schemes may be more appropriate, especially if the number of US points is reduced, for example, by adopting a clinical protocol where the number of US images is reduced to minimize the intraoperative scan time.
  • One approach would be to incorporate robust estimation by applying a Lorentzian or Huber function to the distance transform, as described in A. Fitzgibbon, "Robust Registration of 2D and 3D Point Sets", in Proc. BMVC, 2001, pp. 411-420, incorporated herein by reference for all purposes.
  • An alternative approach to the registration problem is an automatic, image-intensity-based method for bone registration using US data.
  • This approach has the advantage that the bone surface does not need to be explicitly segmented.
  • the concept of self-calibration explained above can be integrated into an intensity-based registration scheme as described in UK patent application no. 0504175.1 entitled Intensity Based Ultrasound Registration and UK patent application no. 0521825.0 entitled 3D Ultrasound Registration, both of which are incorporated herein by reference in their entirety for all purposes.
  • the method of the invention can be used in a method of planning a surgical procedure which includes the steps of providing a virtual model of the body part, the model having data associated with it representing at least a part of a planned surgical procedure to be carried out on a corresponding real body part of the patient; and morphing the virtual model of the body part using data derived from the patient's real body part thereby also adapting the part of the planned surgical procedure to reflect the anatomy of the patient's real body part.
  • a method is disclosed in UK patent application no. 0504172.8 entitled "Surgical Planning", bearing agent's reference P205478, and in a new International Patent Application entitled "Surgical Planning" filed with the present application under agent's reference P205596WO. Subject matter disclosed in the specifications of those applications is incorporated by reference herein in its entirety for all purposes.
  • FIG. 7 illustrates a typical computer system that, when appropriately configured or designed, can serve as the data processing apparatus or computer of the CAS system according to the invention.
  • the data processing apparatus or computer 500 includes any number of processors 502 (also referred to as central processing units, or CPUs) that are coupled to storage devices including primary storage 506 (typically a random access memory, or RAM) and primary storage 504 (typically a read-only memory, or ROM).
  • processors 502 may be of various types including microcontrollers and microprocessors such as programmable devices (e.g., CPLDs and FPGAs) and unprogrammable devices such as gate array ASICs or general purpose microprocessors.
  • primary storage 504 acts to transfer data and instructions uni-directionally to the CPU and primary storage 506 is used typically to transfer data and instructions in a bi-directional manner. Both of these primary storage devices may include any suitable computer-readable media such as those described above.
  • a mass storage device 508 is also coupled bi-directionally to CPU 502, provides additional data storage capacity, and may include any of the computer-readable media described above. Mass storage device 508 may be used to store programs, data and the like and is typically a secondary storage medium such as a hard disk. It will be appreciated that the information retained within the mass storage device 508 may, in appropriate cases, be incorporated in standard fashion as part of primary storage 506 as virtual memory.
  • a specific mass storage device such as a CD-ROM 514 may also pass data uni-directionally to the CPU.
  • CPU 502 is also coupled to an interface 510 that connects to one or more input/output devices such as video monitors, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, or other well-known input devices including, of course, other computers.
  • CPU 502 optionally may be coupled to an external device such as a database or a computer or telecommunications network using an external connection as shown generally at 512. With such a connection, it is contemplated that the CPU might receive information from the network, or might output information to the network in the course of performing the method steps described herein.
  • the present invention has a much broader range of applicability.
  • aspects of the present invention are not limited to any particular kind of orthopaedic procedure and can be applied to virtually any method in which US images are sufficient to allow the anatomical structure of interest to be distinguished.
  • the techniques of the present invention could be used to register non-rigid anatomical structures, i.e. non-bony structures, such as organs.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Rheumatology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Method, device and computer program for improving the geometric accuracy of a 3D ultrasound image of an anatomical region. At least one ultrasound scaling parameter used to capture the ultrasound images is determined. An ultrasound scaling parameter is used as a variable parameter when registering the 3D ultrasound image with another 3D image or model of the anatomical region.
PCT/GB2006/000746 2005-03-01 2006-03-01 Imagerie ultrasonore 3d WO2006092602A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB0504173A GB0504173D0 (en) 2005-03-01 2005-03-01 3D ultrasound imaging
GB0504173.6 2005-03-01
GB0521824.3 2005-10-26
GB0521824A GB0521824D0 (en) 2005-03-01 2005-10-26 3D ultrasound imaging

Publications (1)

Publication Number Publication Date
WO2006092602A1 true WO2006092602A1 (fr) 2006-09-08

Family

ID=36121310

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/000746 WO2006092602A1 (fr) 2005-03-01 2006-03-01 Imagerie ultrasonore 3d

Country Status (1)

Country Link
WO (1) WO2006092602A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITVI20080309A1 (it) * 2008-12-24 2009-03-25 Milano Politecnico Sistema e metodo per la scansione avanzata e la simulazione della deformazione di superfici.
EP2279695A1 (fr) * 2009-07-31 2011-02-02 Medison Co., Ltd. Étalonnage coordonné de capteurs dans un système à ultrasons
US20120123252A1 (en) * 2010-11-16 2012-05-17 Zebris Medical Gmbh Imaging apparatus for large area imaging of a body portion
EP2567660A3 (fr) * 2007-08-31 2013-05-01 Canon Kabushiki Kaisha Système d'imagerie diagnostique ultrasonore et son procédé de commande
US9545242B2 (en) 2009-07-31 2017-01-17 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
EP3305202A1 (fr) * 2016-10-06 2018-04-11 Biosense Webster (Israel), Ltd. Enregistrement pré-opératoire d'images anatomiques comprenant un système de détermination de position à l'aide d'ultrasons
US10105120B2 (en) 2014-02-11 2018-10-23 The University Of British Columbia Methods of, and apparatuses for, producing augmented images of a spine
US10510171B2 (en) 2016-11-29 2019-12-17 Biosense Webster (Israel) Ltd. Visualization of anatomical cavities

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BASS W A ET AL: "Surface-based registration of physical space with CT images using A-mode ultrasound localization of the skull", PROC. SPIE, vol. 3335, June 1998 (1998-06-01), pages 228 - 238, XP002376997 *
PRAGER R W ET AL: "Rapid calibration for 3-D freehand ultrasound", ULTRASOUND IN MEDICINE AND BIOLOGY, NEW YORK, NY, US, vol. 24, no. 6, July 1998 (1998-07-01), pages 855 - 869, XP004295314, ISSN: 0301-5629 *
SHEKHAR R ET AL: "MUTUAL INFORMATION-BASED RIGID AND NONRIGID REGISTRATION OF ULTRASOUND VOLUMES", IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 21, no. 1, January 2002 (2002-01-01), pages 9 - 22, XP001101417, ISSN: 0278-0062 *
ZITOVA B ET AL: "IMAGE REGISTRATION METHODS: A SURVEY", IMAGE AND VISION COMPUTING, GUILDFORD, GB, vol. 21, no. 11, October 2003 (2003-10-01), pages 977 - 1000, XP001189327, ISSN: 0262-8856 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2567660A3 (fr) * 2007-08-31 2013-05-01 Canon Kabushiki Kaisha Système d'imagerie diagnostique ultrasonore et son procédé de commande
US8756033B2 (en) 2007-08-31 2014-06-17 Canon Kabushiki Kaisha Ultrasonic diagnostic imaging system and control method thereof
WO2010073129A1 (fr) * 2008-12-24 2010-07-01 Politecnico Di Milano Système et procédé permettant une exploration avancée et une simulation de la déformation de surfaces
ITVI20080309A1 (it) * 2008-12-24 2009-03-25 Milano Politecnico Sistema e metodo per la scansione avanzata e la simulazione della deformazione di superfici.
US9545242B2 (en) 2009-07-31 2017-01-17 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9082178B2 (en) 2009-07-31 2015-07-14 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9468422B2 (en) 2009-07-31 2016-10-18 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
EP2279695A1 (fr) * 2009-07-31 2011-02-02 Medison Co., Ltd. Étalonnage coordonné de capteurs dans un système à ultrasons
US9782151B2 (en) 2009-07-31 2017-10-10 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US9955951B2 (en) 2009-07-31 2018-05-01 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10271822B2 (en) 2009-07-31 2019-04-30 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10278663B2 (en) 2009-07-31 2019-05-07 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US10561403B2 (en) 2009-07-31 2020-02-18 Samsung Medison Co., Ltd. Sensor coordinate calibration in an ultrasound system
US20120123252A1 (en) * 2010-11-16 2012-05-17 Zebris Medical Gmbh Imaging apparatus for large area imaging of a body portion
US10105120B2 (en) 2014-02-11 2018-10-23 The University Of British Columbia Methods of, and apparatuses for, producing augmented images of a spine
EP3305202A1 (fr) * 2016-10-06 2018-04-11 Biosense Webster (Israel), Ltd. Enregistrement pré-opératoire d'images anatomiques comprenant un système de détermination de position à l'aide d'ultrasons
US10510171B2 (en) 2016-11-29 2019-12-17 Biosense Webster (Israel) Ltd. Visualization of anatomical cavities

Similar Documents

Publication Publication Date Title
Barratt et al. Self-calibrating 3D-ultrasound-based bone registration for minimally invasive orthopedic surgery
AU2018316801B2 (en) Ultrasound bone registration with learning-based segmentation and sound speed calibration
EP3213682B1 (fr) Procédé et appareil de reconstruction tridimensionnelle d'une articulation à l'aide d'ultrasons
Gill et al. Biomechanically constrained groupwise ultrasound to CT registration of the lumbar spine
Penney et al. Cadaver validation of intensity-based ultrasound to CT registration
US20030011624A1 (en) Deformable transformations for interventional guidance
US20120155732A1 (en) CT Atlas of Musculoskeletal Anatomy to Guide Treatment of Sarcoma
US9052384B2 (en) System and method for calibration for image-guided surgery
WO2006092602A1 (fr) Imagerie ultrasonore 3d
Kowal et al. Automated bone contour detection in ultrasound B‐mode images for minimally invasive registration in computer‐assisted surgery—an in vitro evaluation
WO2006092594A2 (fr) Enregistrement a ultrasons en 3d
US11826112B2 (en) Method for registering articulated anatomical structures
Schumann State of the art of ultrasound-based registration in computer assisted orthopedic interventions
Foroughi et al. Localization of pelvic anatomical coordinate system using US/atlas registration for total hip replacement
Hacihaliloglu 3D ultrasound for orthopedic interventions
US20230368465A1 (en) Methods and apparatus for three-dimensional reconstruction
Talib et al. Information filtering for ultrasound-based real-time registration
Brendel et al. Bone registration with 3D CT and ultrasound data sets
US20240024035A1 (en) Preoperative imaging combined with intraoperative navigation before and after alteration of a surgical site to create a composite surgical three dimensional structural dataset
Peterhans et al. A method for frame-by-frame US to CT registration in a joint calibration and registration framework
US20140309476A1 (en) Ct atlas of musculoskeletal anatomy to guide treatment of sarcoma
Amin Ultrasound registration for surgical navigation
Krivonos et al. Computer assisted treatment of pelvis fractures
Hawkes et al. Measuring and modeling soft tissue deformation for image guided interventions
Krivonos et al. From planning of complex bone deformities correction to computer aided operation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

WWW Wipo information: withdrawn in national office

Country of ref document: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06709970

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 6709970

Country of ref document: EP