CN112215878B - X-ray image registration method based on SURF feature points

X-ray image registration method based on SURF feature points

Info

Publication number
CN112215878B
CN112215878B (application CN202011216382.3A)
Authority
CN
China
Prior art keywords
image
feature
points
point
feature point
Prior art date
Legal status
Active
Application number
CN202011216382.3A
Other languages
Chinese (zh)
Other versions
CN112215878A (en)
Inventor
邹海波
李伟
宋璐杰
吉震宇
王宇鸣
刘朝晖
张忠财
Current Assignee
Beijing University of Chemical Technology
China Japan Friendship Hospital
Original Assignee
Beijing University of Chemical Technology
China Japan Friendship Hospital
Priority date
Filing date
Publication date
Application filed by Beijing University of Chemical Technology and China Japan Friendship Hospital
Priority claimed from CN202011216382.3A
Publication of CN112215878A
Application granted
Publication of CN112215878B
Status: Active

Classifications

    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06T 7/13: Edge detection
    • G06T 2207/10116: X-ray image
    • G06T 2207/30012: Spine; Backbone

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an X-ray image registration method based on SURF feature points, a computer device and a computer-readable storage medium. The method comprises the following steps: selecting a region of interest of an X-ray image, and performing preprocessing and line feature extraction; selecting the corresponding region of interest of the X-ray image for registration, and performing preprocessing and line feature extraction; extracting feature points of each region of interest based on the SURF method; forming matched feature point pairs according to the feature descriptions of the feature points; randomly selecting 4 groups of matched feature point pairs and calculating a projection matrix; mapping according to the projection matrix and calculating the distance loss value after mapping; performing multiple rounds of calculation, retaining and outputting the minimum distance loss value and the corresponding projection matrix, and performing image registration according to the output projection matrix. The invention can realize multi-source image registration quickly and accurately.

Description

X-ray image registration method based on SURF feature points
Technical Field
The invention relates to the technical field of medical image processing, in particular to an X-ray image registration method based on SURF (Speeded-Up Robust Features) feature points, a computer device and a computer-readable storage medium.
Background
In the more than one hundred years since Röntgen discovered X-rays at the end of the nineteenth century, medical imaging techniques have progressed from analog to digital. At first, X-rays were applied only to imaging the human skeleton; the new technology of the 1920s and 1930s brought the rotary-anode X-ray tube, which markedly improved X-ray imaging. X-ray imaging showed unique advantages in the diagnosis of certain moving organs and gradually became an indispensable examination mode in clinical research. At the Fifteenth International Congress of Radiology, many experts and scholars affirmed digital X-ray imaging technology, and with the rapid development of imaging technology, the application of digital medical images has broad prospects. At present, the digital informatization of medical images is developing rapidly to provide fast, convenient and accurate diagnostic information for clinical treatment, and it will gradually become a new basic application platform for clinical research.
At present, in the diagnosis of a patient's condition from orthopedic images, when the cervical vertebrae, the spine or other injured areas are examined by X-ray imaging, the monitored source comprises X-ray orthopedic examination images at multiple angles and in different postures. It is therefore difficult to present clearly and completely all the detail information a doctor requires, which hinders further diagnosis and treatment of the patient. Multi-angle image registration for X-ray imaging is therefore urgently needed.
Disclosure of Invention
The present invention is directed to overcoming at least some of the above-mentioned drawbacks, and provides a method for registering X-ray images taken at different angles and in different postures based on image feature information.
In order to achieve the above object, the present invention provides an X-ray image registration method based on SURF feature points, which includes the steps of:
S1, selecting a region of interest of an X-ray image, marking it as the I1 image, and preprocessing the I1 image and extracting line features;
S2, selecting the corresponding region of interest in the X-ray image for registration, marking it as the I2 image, and preprocessing the I2 image and extracting line features;
S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method;
S4, matching each feature point taken from the I2 image against the feature points taken from the I1 image according to the feature descriptions of the feature points, to form matched feature point pairs; the feature description consistency of the two feature points in a matched pair does not exceed a set threshold;
S5, randomly selecting 4 groups of matched feature point pairs and calculating a projection matrix;
S6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image; the distance loss value is the mean square error, under the Euclidean distance, of the closest points between corresponding line features of the I1 image and the I2 image;
S7, comparing the distance loss value with the currently stored result, and keeping the smaller distance loss value and the corresponding projection matrix as the new stored result;
S8, judging whether to perform the next round of calculation; if so, returning to step S5; otherwise, outputting the distance loss value and projection matrix in the stored result;
S9, registering the I1 image and the I2 image according to the output projection matrix.
Preferably, in step S1, when preprocessing the I1 image, the I1 image is low-pass filtered and Canny edge detection is performed;
in step S2, when preprocessing the I2 image, the I2 image is low-pass filtered and Canny edge detection is performed.
Preferably, in step S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method comprises the following steps:
S3-1, constructing a Hessian matrix, whose expression is:

    H(x, σ) = [ Dxx(x, σ)  Dxy(x, σ) ]
              [ Dxy(x, σ)  Dyy(x, σ) ]

wherein x and y denote pixel coordinates, Dxx(x, σ) denotes the second derivative in the x direction, Dyy(x, σ) denotes the second derivative in the y direction, Dxy(x, σ) denotes the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ denotes the scale of the filter;
S3-2, constructing the scale space of the SURF method. The scale space consists of s groups of o layers; the filters are box filters approximating Gaussian second-order derivative filtering. The image size is kept consistent across the groups, while the template size of the filters increases gradually; images in the same group but different layers likewise keep the same size, while the scale-space factor of the filter increases gradually. The relations are:

    l = 2^(o+1) · (s+1) + 1
    L = 3 · l
    σ = 1.2 · L / 9

wherein l denotes the response length of the filter to positive and negative spots in the differential direction, L denotes the template size of the filter, o is the group index, s is the layer index, and σ is the scale of the filter;
S3-3, processing each pixel point of the image from which feature points are to be extracted with the Hessian matrix, and calculating the Hessian matrix discriminant of the current pixel point:

    H(x, σ) = Dxx(x, σ) · Dyy(x, σ) - (Dxy(x, σ))²

when the Hessian matrix discriminant attains a local maximum, the current pixel point is selected as an interest point;
Taylor expansion of the selected interest points in the continuous space gives:

    D(X) = D + (∂D/∂X)ᵀ · X + (1/2) · Xᵀ · (∂²D/∂X²) · X

where D(X) denotes the selected interest point information, X = (x, y, σ)ᵀ is the offset position between two points, Xᵀ denotes the transpose of X, (∂D/∂X)ᵀ denotes the transposed derivative of D, and D is a Gaussian difference function;
the extreme points in the continuous space are determined by setting the derivative to 0, giving:

    X̂ = -(∂²D/∂X²)⁻¹ · (∂D/∂X)

where X̂ denotes the offset position of the interpolation centre;
the Hessian matrix is solved for all the maximum points among the determined extreme points, and feature points are obtained by the screening condition

    Tr(H)² / Det(H) < (r+1)² / r

wherein r is a specified screening parameter, and Tr(H) and Det(H) respectively denote the trace and determinant of the Hessian matrix;
S3-4, scanning within the circular neighbourhood of each screened feature point with a step of 0.2 rad, counting the sum of the absolute values of the horizontal and vertical Haar wavelet responses of all points within a 60° sector, and taking the sector direction corresponding to the maximum as the main direction of the feature point;
S3-5, taking the feature point as the centre, extracting 4×4 rectangular region blocks along the main direction of the feature point, each block being 5×5 in size; for each rectangular region block, counting the Haar wavelet responses of the 25 pixel points in the horizontal and vertical directions relative to the main direction of the feature point; and finally assembling the statistics of all rectangular region blocks into the feature description of the feature point, which comprises, per block, the sum of the horizontal Haar wavelet responses, the sum of the vertical Haar wavelet responses, the sum of the absolute values of the horizontal responses and the sum of the absolute values of the vertical responses, the feature description being expressed in the form of a feature vector.
Preferably, in step S3-3, when feature points are obtained by the screening condition

    Tr(H)² / Det(H) < (r+1)² / r,

Det(H) = Dxx · Dyy - (0.9 · Dxy)².
Preferably, in step S4, when matching each feature point taken from the I2 image against the feature points taken from the I1 image, the feature-vector error value between each feature point taken from the I2 image and each feature point taken from the I1 image is calculated, and each feature point of the I2 image forms a group of preliminary matching feature point pairs with a feature point of the I1 image;
the feature-vector error value is taken as the feature description consistency, the feature-vector error values of all groups of preliminary matching feature point pairs are sorted from small to large, and each group whose feature-vector error value does not exceed the set threshold is determined to be a matched feature point pair.
Preferably, in step S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix comprises the following steps:
S5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group:

    [ xi' ]     [ h11  h12  h13 ] [ xi ]
    [ yi' ]  ∝  [ h21  h22  h23 ] [ yi ]
    [ 1   ]     [ h31  h32  h33 ] [ 1  ]

from which two explicit expressions for the projection matrix elements are derived:

    (h31·xi + h32·yi + h33) · xi' = h11·xi + h12·yi + h13
    (h31·xi + h32·yi + h33) · yi' = h21·xi + h22·yi + h23

wherein (xi, yi) is the feature point of the i-th matched pair taken from the I1 image, (xi', yi') is the feature point of the i-th matched pair taken from the I2 image, and hjk (j, k = 1, 2, 3) are the projection matrix elements;
S5-2, rewriting the projection matrix elements in vector form gives the linear system

    A · h = b,   h = (h11, h12, h13, h21, h22, h23, h31, h32)ᵀ

according to the linear characteristic of the matrix coefficients, h33 is kept constantly at 1, and the two explicit expressions for the projection matrix elements obtained from each group of matched feature point pairs are substituted to obtain the projection matrix.
Preferably, in step S8, when judging whether to perform the next round of calculation, the number of rounds already calculated is checked; if the preset number of rounds has not been reached, the process returns to step S5 for the next round and the round count is increased by 1.
Preferably, in step S8, when judging whether to perform the next round of calculation, the matched feature point pairs already calculated are examined; if every selection of 4 groups of matched feature point pairs has already been used to calculate a projection matrix, the distance loss value and projection matrix in the stored result are output.
The invention also provides a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the SURF feature point-based X-ray image registration method according to any one of the above items when executing the computer program.
The present invention also provides a computer readable storage medium having stored thereon a computer program which, when being executed by a processor, implements the steps of the SURF feature point-based X-ray image registration method according to any one of the above.
The technical scheme of the invention has the following advantages. The invention provides an X-ray image registration method based on SURF (Speeded-Up Robust Features) feature points, a computer device and a computer-readable storage medium. Line features and point feature information are extracted from X-ray images of the same scene at different angles and/or postures; a projection matrix for registration is calculated from the point feature information and then evaluated through the line features, realizing evaluation and updating of the projection matrix until the optimal projection matrix is determined, which achieves multi-source image registration with high precision and a high matching degree. The method can be applied to multiple types of target images acquired by X-ray imaging, such as the cervical vertebrae and the spine, and allows accurate, real-time detection, identification and registration.
Drawings
Fig. 1 is a schematic diagram illustrating steps of an X-ray image registration method based on SURF feature points according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a step of extracting feature points based on the SURF method in the embodiment of the present invention;
fig. 3 is a graph of a result obtained by performing registration by using an X-ray image registration method based on SURF feature points in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
As shown in fig. 1 to 3, an embodiment of the present invention provides a SURF feature point-based X-ray image registration method, including the following steps:
s1, selecting an interested area of an X-ray image and marking the interested area as I 1 Image, and pair I 1 And preprocessing the image and extracting line features.
For an X-ray image (usually the main image in the multi-source image, which can be referred to as image 1), after selecting the region of interest (ROI), a smaller image, i.e. I, can be obtained 1 Image, pair I 1 The image is preprocessed, so that the image definition can be improved, and the subsequent line feature extraction is facilitated.
When extracting line features, one or more line features may be selected according to the characteristics of the specific target in the ROI.
S2, selecting the corresponding region of interest in the X-ray image for registration, marking it as the I2 image, and preprocessing the I2 image and extracting line features.
The X-ray image for registration (which may be referred to as image 2) is an X-ray image of the same scene as the main image (image 1) but at a different angle and/or posture; its region of interest (the I2 image) is usually the same as or similar to the region of the I1 image used for registration. The preprocessing and line feature extraction applied to the I2 image should be the same as those applied to the I1 image.
In a preferred embodiment, when preprocessing the I1 image in step S1, the I1 image is low-pass filtered and Canny edge detection is performed. Correspondingly, when preprocessing the I2 image in step S2, the same low-pass filtering and Canny edge detection are applied to the I2 image.
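The preprocessing of this embodiment can be sketched as below. This is a minimal NumPy approximation with illustrative function names and thresholds: a separable Gaussian blur plus a thresholded gradient magnitude standing in for a full Canny detector (a real implementation would add non-maximum suppression and hysteresis, for example via OpenCV's Canny).

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """1-D Gaussian kernel normalised to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def low_pass(img, sigma=1.0):
    """Separable Gaussian low-pass filter with reflect padding."""
    r = max(1, int(3 * sigma))
    k = gaussian_kernel1d(sigma, r)
    pad = np.pad(img, r, mode="reflect")
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, rows)

def edge_map(img, thresh=0.2):
    """Thresholded central-difference gradient magnitude: a crude
    stand-in for the suppression and hysteresis steps of a real Canny."""
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max() if mag.max() > 0 else mag > thresh
```

Applied to a vertical step image, the edge map responds only near the step, which is the behaviour the line feature extraction relies on.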
S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method.
The SURF (Speeded-Up Robust Features) method is an improvement on the SIFT (Scale-Invariant Feature Transform) method, particularly in execution efficiency. Extracting the point feature information of the region of interest of an X-ray image with the SURF method makes real-time registration of the images possible.
S4, matching each feature point taken from the I2 image against the feature points taken from the I1 image according to the feature descriptions of the feature points, to form matched feature point pairs. In a matched feature point pair, the feature description consistency of the two feature points does not exceed the set threshold.
The feature description consistency of two matched feature points is the corresponding feature-vector error value between the feature point taken from the I2 image and the feature point taken from the I1 image.
S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix.
The projection matrix is the transformation matrix that maps the I1 image onto the I2 image.
S6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image.
The distance loss value is the mean square error, under the Euclidean distance, of the closest points between corresponding line features of the I1 image (mapped onto the I2 image) and the I2 image.
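The distance loss of step S6 can be sketched as below, assuming the line features are available as lists of sampled points (an assumption of this illustration; `line_loss` and its arguments are illustrative names):

```python
import numpy as np

def line_loss(H, line1_pts, line2_pts):
    """Map the I1 line-feature points through projection matrix H, then
    return the mean squared Euclidean distance from each mapped point to
    its closest point on the I2 line feature."""
    p1 = np.asarray(line1_pts, dtype=float)
    p2 = np.asarray(line2_pts, dtype=float)
    homog = np.hstack([p1, np.ones((len(p1), 1))])   # homogeneous coordinates
    mapped = homog @ np.asarray(H, dtype=float).T
    mapped = mapped[:, :2] / mapped[:, 2:3]          # back to Cartesian
    d = np.linalg.norm(mapped[:, None, :] - p2[None, :, :], axis=2)
    return float((d.min(axis=1) ** 2).mean())
```

A perfect projection matrix drives the loss to zero; any residual misalignment of the line features raises it.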
S7, comparing the distance loss value with the currently stored result, and keeping the smaller distance loss value and the corresponding projection matrix as the new stored result.
When comparing the distance loss value with the currently stored result: if the stored result is an empty set, the distance loss value obtained in the current round and the corresponding projection matrix are stored directly as the new stored result; if the stored result is not empty, the distance loss value of the current round is compared with the stored distance loss value, so that after multiple rounds of calculation the minimum distance loss value and its corresponding projection matrix are retained.
S8, judging whether to perform the next round of calculation; if so, returning to step S5; otherwise, outputting the distance loss value and projection matrix in the stored result.
By repeatedly calculating the projection matrix over many rounds, the projection matrix with the best registration effect is screened out iteratively, improving the accuracy and matching degree of the multi-source image registration.
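The iteration of steps S5 to S8 amounts to a RANSAC-style loop. A minimal sketch, with the fitting and scoring functions passed in as parameters (all names here are illustrative):

```python
import random
import numpy as np

def best_model(matched_pairs, fit, loss, rounds=100, seed=0):
    """Loop of steps S5-S8: repeatedly sample 4 matched feature point
    pairs, fit a projection matrix, score it, and keep the model with
    the minimum loss seen so far."""
    rng = random.Random(seed)
    best_loss, best = float("inf"), None
    for _ in range(rounds):
        sample = rng.sample(matched_pairs, 4)
        try:
            model = fit(sample)          # e.g. a homography from 4 pairs
        except np.linalg.LinAlgError:    # degenerate sample: skip this round
            continue
        value = loss(model)
        if value < best_loss:            # step S7: keep the smaller loss
            best_loss, best = value, model
    return best_loss, best
```

With `fit` a 4-pair homography solver and `loss` the line-feature distance loss, `best_model` returns the minimum-loss projection matrix after the preset number of rounds.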
S9, registering the I1 image and the I2 image according to the output projection matrix.
The invention extracts line features and point feature information from the two X-ray images respectively, calculates the projection matrix from the paired point feature information, evaluates the projection matrix through the line features, and selects the projection matrix with the best effect, realizing high-quality registration of multi-source X-ray images at different angles and/or postures. In addition, throughout the registration process, interference such as noise is removed by preprocessing, and only the line features and point feature information of the X-ray images are used in the subsequent calculation. The point feature information in particular is robust to targets of different types, scales and brightness, so robust registration can be achieved across a variety of scenes.
Preferably, as shown in fig. 2, extracting feature points from the I1 image and the I2 image respectively based on the SURF method in step S3 comprises the following steps:
S3-1, constructing a Hessian matrix.
The Hessian matrix of the image function f(x, y) is

    H = [ ∂²f/∂x²   ∂²f/∂x∂y ]
        [ ∂²f/∂x∂y  ∂²f/∂y²  ]

Filtering is required before constructing the Hessian matrix, after which its expression becomes:

    H(x, σ) = [ Dxx(x, σ)  Dxy(x, σ) ]
              [ Dxy(x, σ)  Dyy(x, σ) ]

wherein x and y denote pixel coordinates, Dxx(x, σ) denotes the second derivative in the x direction, Dyy(x, σ) the second derivative in the y direction, Dxy(x, σ) the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ the scale of the filter.
When the Hessian matrix discriminant H(x, σ) = Dxx(x, σ) · Dyy(x, σ) - (Dxy(x, σ))² attains a local maximum, the current pixel point can be judged to be brighter or darker than the points in its surrounding neighbourhood.
S3-2, constructing the scale space.
The scale space of the SURF method consists of s groups of o layers; the filters are box filters that use Gaussian second-order derivative filtering internally. The image size is kept consistent across the groups, while the template size of the filters increases gradually; images in the same group but different layers likewise keep the same size, while the scale-space factor of the filter increases gradually. The specific relations are:

    l = 2^(o+1) · (s+1) + 1
    L = 3 · l
    σ = 1.2 · L / 9

where l denotes the response length of the filter to positive and negative spots in the differential direction, L the template size of the filter, o the group index, s the layer index, and σ the scale of the filter.
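Read with o as the group (octave) index and s as the layer index, these relations reproduce the classic SURF template sizes (9, 15, 21, 27 in the first group; 15, 27, 39, 51 in the second). A small sketch, with illustrative names:

```python
def surf_filter_sizes(octave, layer):
    """Template size L and scale sigma of the SURF box filter for the
    given group (octave) and layer, following l = 2**(octave+1)*(layer+1)+1,
    L = 3*l and sigma = 1.2*L/9."""
    lobe = 2 ** (octave + 1) * (layer + 1) + 1   # response length l
    size = 3 * lobe                              # template size L
    sigma = 1.2 * size / 9.0                     # equivalent Gaussian scale
    return size, sigma
```

For example, the first filter of the first group has template size 9 and scale 1.2, which is the usual SURF starting point.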
S3-3, locating the feature points.
Each pixel point of the image from which feature points are to be extracted (i.e., the I1 image or the I2 image) is processed with the Hessian matrix, and the Hessian matrix discriminant of the current pixel point is calculated: H(x, σ) = Dxx(x, σ) · Dyy(x, σ) - (Dxy(x, σ))². When the discriminant attains a local maximum, the current pixel point is selected as an interest point (that is, an extreme point is found).
The Hessian matrix discriminant of each pixel point is compared with those of all its neighbouring points in the image domain (the same-size image) and the scale domain (the adjacent scale spaces). When the value of a pixel point is greater than (or less than) those of all the adjacent points, the pixel point is an extreme point. The detected pixel point is compared with the 8 pixel points in the 3×3 neighbourhood of its own image (image-domain neighbours) and the 18 pixel points in the 3×3 neighbourhoods of the layers above and below (scale-domain neighbours), 26 points in total. Points with edge responses in the comparison result can then be removed by the Hessian matrix discriminant.
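The 26-neighbour comparison can be sketched with a brute-force scan over a small scale-space stack (a toy illustration for clarity, not an efficient detector; names are illustrative):

```python
import numpy as np

def local_maxima_26(stack):
    """stack: array of shape (scales, H, W) holding Hessian discriminant
    responses. A voxel is kept when it is strictly greater than every
    other voxel in its 3x3x3 scale-space neighbourhood (26 neighbours)."""
    n_s, n_i, n_j = stack.shape
    points = []
    for k in range(1, n_s - 1):
        for i in range(1, n_i - 1):
            for j in range(1, n_j - 1):
                nb = stack[k - 1:k + 2, i - 1:i + 2, j - 1:j + 2]
                v = stack[k, i, j]
                # maximum of the block, attained only at the centre voxel
                if (v >= nb).all() and (nb == v).sum() == 1:
                    points.append((k, i, j))
    return points
```

A flat stack yields no maxima, while an isolated peak is reported at its (scale, row, column) position.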
Each selected interest point is Taylor-expanded in the continuous space:

    D(X) = D + (∂D/∂X)ᵀ · X + (1/2) · Xᵀ · (∂²D/∂X²) · X

where D(X) denotes the selected interest point information, X = (x, y, σ)ᵀ is the offset position between two points, Xᵀ denotes the transpose of X, (∂D/∂X)ᵀ denotes the transposed derivative of D, and D is a Gaussian difference function.
The extreme points among the interest points, namely the extreme points in the continuous space, are determined by setting the derivative to 0, giving:

    X̂ = -(∂²D/∂X²)⁻¹ · (∂D/∂X)

where X̂ denotes the offset position of the interpolation centre.
The Hessian matrix is solved for all the maximum points among the extreme points determined by setting the derivative to 0, and feature points are obtained by the screening condition

    Tr(H)² / Det(H) < (r+1)² / r

where r is a specified screening parameter, Tr(H) denotes the trace of the Hessian matrix, and Det(H) its determinant. The specific value of the screening parameter r can be set according to the actual situation.
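The screening condition can be checked per candidate point as below (a sketch with illustrative names): the inequality keeps blob-like responses, where both principal curvatures are comparable, and rejects edge responses, where one curvature dominates.

```python
def passes_edge_test(Dxx, Dyy, Dxy, r=10.0):
    """Keep a candidate point only when Tr(H)^2 / Det(H) < (r+1)^2 / r;
    Det(H) must also be positive for a genuine maximum."""
    trace = Dxx + Dyy
    det = Dxx * Dyy - Dxy ** 2
    return det > 0 and trace * trace / det < (r + 1.0) ** 2 / r
```

A balanced response (Dxx ≈ Dyy) passes, while a strongly elongated, edge-like response fails.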
Further, the Hessian matrix discriminant uses the Gaussian convolution of the original image. Since the Gaussian kernel follows a normal distribution, its coefficients decrease from the centre point outwards; to increase the operation speed, the SURF method uses box filters to approximate the Gaussian filters, and Dxy is multiplied by a weighting coefficient of 0.9 to balance the error caused by the box-filter approximation. That is, when feature points are obtained by the screening condition

    Tr(H)² / Det(H) < (r+1)² / r,

Det(H) = Dxx · Dyy - (0.9 · Dxy)².
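A minimal sketch of the weighted discriminant, using central finite differences in place of the box filters (an illustrative simplification; the SURF method itself evaluates Dxx, Dyy and Dxy with scaled box filters over an integral image):

```python
import numpy as np

def hessian_response(img):
    """Approximate Det(H) = Dxx*Dyy - (0.9*Dxy)**2 per pixel using
    central finite differences on the image."""
    Dxx = np.zeros_like(img)
    Dyy = np.zeros_like(img)
    Dxy = np.zeros_like(img)
    Dxx[:, 1:-1] = img[:, 2:] - 2.0 * img[:, 1:-1] + img[:, :-2]
    Dyy[1:-1, :] = img[2:, :] - 2.0 * img[1:-1, :] + img[:-2, :]
    Dxy[1:-1, 1:-1] = (img[2:, 2:] - img[2:, :-2]
                       - img[:-2, 2:] + img[:-2, :-2]) / 4.0
    return Dxx * Dyy - (0.9 * Dxy) ** 2
```

An isolated bright pixel produces a strong positive response at its own location, which is exactly the blob-like behaviour the discriminant is meant to detect.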
S3-4, determining the main direction of each feature point.
Scanning is performed within the circular neighbourhood of each screened feature point with a step of 0.2 rad; the sum of the absolute values of the horizontal and vertical Haar wavelet responses of all points within a 60° sector is counted, and the sector direction corresponding to the maximum is taken as the main direction of the feature point.
The circular neighbourhood of a feature point is the circular region centred on the feature point with radius 6σ, where σ is the scale of the filter (used by the image in which the feature point is located).
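The sector scan can be sketched as below, assuming the Haar responses dx, dy and their angles have already been computed for the sample points in the 6σ circular neighbourhood (an assumption of this illustration; names are illustrative):

```python
import numpy as np

def dominant_orientation(angles, dx, dy, step=0.2, window=np.pi / 3.0):
    """Slide a 60-degree (pi/3 rad) sector in 0.2 rad steps; the sector
    whose summed |dx| + |dy| Haar responses is largest gives the main
    direction (taken here as the sector centre)."""
    a = np.asarray(angles, dtype=float) % (2.0 * np.pi)
    dx = np.asarray(dx, dtype=float)
    dy = np.asarray(dy, dtype=float)
    best_val, best_dir = -1.0, 0.0
    for start in np.arange(0.0, 2.0 * np.pi, step):
        in_sector = (a - start) % (2.0 * np.pi) < window
        val = np.abs(dx[in_sector]).sum() + np.abs(dy[in_sector]).sum()
        if val > best_val:
            best_val, best_dir = val, start + window / 2.0
    return best_dir % (2.0 * np.pi)
```

Responses clustered around a common angle pull the main direction toward the sector that contains them.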
S3-5, generating the feature description of each feature point.
For each feature point whose main direction has been determined, taking the feature point as the centre, 4×4 rectangular region blocks are extracted along the main direction of the feature point, each block being 5×5 in size. For each rectangular region block, the Haar wavelet responses of the 25 pixel points are counted in the horizontal and vertical directions relative to the main direction of the feature point. Finally, the statistics of all rectangular region blocks are assembled into the feature description of the feature point, which comprises, per block, the sum of the horizontal Haar wavelet responses, the sum of the vertical Haar wavelet responses, the sum of the absolute values of the horizontal responses and the sum of the absolute values of the vertical responses, the feature description being expressed in the form of a feature vector.
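A sketch of the descriptor assembly, assuming the Haar responses have been sampled on a 20×20 grid around the feature point (the 4×4 blocks of 5×5 samples follow the text; the grid layout itself is an assumption of this illustration):

```python
import numpy as np

def surf_descriptor(dx, dy):
    """Build the feature vector from 20x20 grids of Haar responses dx, dy,
    already expressed relative to the main direction: split into 4x4
    blocks of 5x5 samples and collect, per block,
    (sum dx, sum dy, sum |dx|, sum |dy|) into a 64-dimensional vector."""
    dx = np.asarray(dx, dtype=float)
    dy = np.asarray(dy, dtype=float)
    desc = []
    for bi in range(4):
        for bj in range(4):
            bx = dx[5 * bi:5 * bi + 5, 5 * bj:5 * bj + 5]
            by = dy[5 * bi:5 * bi + 5, 5 * bj:5 * bj + 5]
            desc += [bx.sum(), by.sum(), np.abs(bx).sum(), np.abs(by).sum()]
    return np.array(desc)
```

The 16 blocks times 4 statistics give the 64-dimensional feature vector used for matching in step S4.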
Preferably, in step S4, it will be taken from I 2 Each feature point of the image and the feature points are taken from I 1 When matching each characteristic point of the image, calculating each selected from I 2 Characteristic points of the image and one from I 1 The error value of the feature vector between the feature points of the image is obtained from I 2 Feature points of the image and the point of interest taken from I 1 The feature points of the image constitute a set of preliminary matching feature point pairs. Other extracts from I are treated in the same manner 1 And (4) feature points of the image until all the feature points form a preliminary matching feature point pair.
The feature vector error value between the feature point taken from the I1 image and the feature point taken from the I2 image of a group of preliminary matching feature point pairs is taken as the feature description conformity of that group. The feature vector error values of all groups of preliminary matching feature point pairs are sorted from small to large, and each group of feature point pairs whose feature vector error value does not exceed a set threshold is determined as a matched feature point pair. The feature point pairs with the highest conformity are thereby screened out, which facilitates determining the optimal projection matrix in the subsequent calculation.
Preferably, in step S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix includes the following steps:
s5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group of matched feature point pairs:
        [x'_i]   [h11  h12  h13]   [x_i]
    s · [y'_i] = [h21  h22  h23] · [y_i]
        [ 1  ]   [h31  h32  h33]   [ 1 ]

where s = h31·x_i + h32·y_i + h33 is the projective scale factor;
two explicit expressions for the projection matrix elements are derived:
(h31·x_i + h32·y_i + h33) · x'_i = h11·x_i + h12·y_i + h13

(h31·x_i + h32·y_i + h33) · y'_i = h21·x_i + h22·y_i + h23

wherein (x_i, y_i) represents the feature point of the i-th group of matched feature point pairs taken from the I1 image, (x'_i, y'_i) represents the feature point of the i-th group taken from the I2 image, and h_jk (j, k = 1, 2, 3) represents the projection matrix elements;
s5-2, rewriting the elements of the projection matrix into a vector form to obtain the following relational expression:
    [x_i  y_i  1    0    0   0   -x'_i·x_i   -x'_i·y_i   -x'_i]
    [ 0    0   0   x_i  y_i  1   -y'_i·x_i   -y'_i·y_i   -y'_i] · (h11, h12, h13, h21, h22, h23, h31, h32, h33)^T = 0
According to the linear characteristic of the matrix coefficients, h33 is set to the constant value 1, which yields a unique transformation matrix; the transformation matrix thus has 8 degrees of freedom. Substituting the two explicit expressions about the projection matrix elements obtained from each group of matched feature point pairs (8 explicit expressions in total), the projection matrix is calculated.
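As an illustration (not part of the patent text), with h33 fixed at 1 the two explicit expressions from each of the 4 matched pairs form an 8 × 8 linear system for the remaining eight elements. A minimal numpy sketch, with all names hypothetical:

```python
import numpy as np

def homography_from_4_pairs(src, dst):
    """Solve steps S5-1/S5-2 exactly: build the 8x8 system A*h = b
    implied by the two explicit expressions per matched pair
    (with h33 = 1) and recover the 3x3 projection matrix."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        # (h31*x + h32*y + 1)*xp = h11*x + h12*y + h13, rearranged
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y]); b.append(yp)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)
```

Four point pairs in general position (no three collinear) make the system nonsingular, so the projection matrix is determined exactly.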
In some embodiments, in step S8, when determining whether to perform the next round of calculation, the number of completed calculation rounds may be checked; if it has not reached the preset number of rounds, the process returns to step S5 to perform the next round of calculation and the round count is incremented by 1.
In other embodiments, in step S8, when determining whether to perform the next round of calculation, the matched feature point pairs already used may be examined; if every combination of 4 groups of matched feature point pairs has already been used to calculate a projection matrix, the process does not return to step S5 for a next round, and the distance loss value and the projection matrix in the stored result are output.
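The S5-S8 loop (round-count variant) can be sketched as follows; `solve_homography` and `distance_loss` are hypothetical placeholders for steps S5 and S6, and the fixed round count stands in for the stopping test of step S8:

```python
import random

def register(pairs, distance_loss, solve_homography, rounds=500):
    """RANSAC-style loop over steps S5-S8: repeatedly sample 4 matched
    pairs, fit a projection matrix, score it, keep the best result."""
    best_loss, best_H = float("inf"), None
    for _ in range(rounds):
        sample = random.sample(pairs, 4)   # S5: random 4 matched pairs
        H = solve_homography(sample)       # S5: projection matrix
        loss = distance_loss(H)            # S6: mapped line-feature loss
        if loss < best_loss:               # S7: keep the smaller loss
            best_loss, best_H = loss, H
    return best_loss, best_H               # S8: output stored result
```

Because the distance loss of step S6 is evaluated on line features rather than on the matched points themselves, the loop can reject a sample of 4 pairs that matched well by descriptor but implies a poor global alignment.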
In particular, in some preferred embodiments of the present invention, there is further provided a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the SURF feature point-based X-ray image registration method in any one of the above embodiments when executing the computer program.
In other preferred embodiments of the present invention, a computer-readable storage medium is further provided, on which a computer program is stored, and the computer program is executed by a processor to implement the steps of the SURF feature point-based X-ray image registration method described in any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes in the method for implementing the above embodiments may be implemented by a computer program, which may be stored in a non-volatile computer-readable storage medium, and when executed, the computer program may include the processes in the above embodiments of the SURF feature point-based X-ray image registration method, and will not be described again here.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An X-ray image registration method based on SURF feature points is characterized by comprising the following steps:
s1, selecting a region of interest of an X-ray image, marking it as the I1 image, and performing preprocessing and line feature extraction on the I1 image;
s2, selecting the corresponding region of interest in the X-ray image to be registered, marking it as the I2 image, and performing preprocessing and line feature extraction on the I2 image;
s3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method;
s4, according to the feature descriptions of the feature points, matching each feature point taken from the I2 image with the feature points taken from the I1 image to form matched feature point pairs, wherein the feature description conformity of the two feature points in a matched feature point pair does not exceed a set threshold;
s5, randomly selecting 4 groups of matched characteristic point pairs, and calculating a projection matrix;
s6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image, wherein the distance loss value is the mean square error, under the Euclidean distance, of the closest points between the same line features of the I1 image and the I2 image;
s7, comparing the distance loss value with the current storage result, and reserving a smaller distance loss value and a corresponding projection matrix as a new storage result;
s8, judging whether to perform the next round of calculation, if so, returning to the step S5, otherwise, outputting the distance loss value and the projection matrix in the storage result;
s9, registering the I1 image and the I2 image according to the output projection matrix.
2. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in said step S1, when the I1 image is preprocessed, the I1 image is low-pass filtered and subjected to Canny edge detection;

in said step S2, when the I2 image is preprocessed, the I2 image is low-pass filtered and subjected to Canny edge detection.
3. The SURF feature point-based X-ray image registration method according to claim 1,
in said step S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method comprises the following steps:
s3-1, constructing a Hessian matrix, wherein the expression is as follows:
    H(x, σ) = [ D_xx(x, σ)   D_xy(x, σ) ]
              [ D_xy(x, σ)   D_yy(x, σ) ]

wherein x and y represent the coordinates of a pixel point, D_xx(x, σ) represents the second-order derivative in the x direction, D_yy(x, σ) represents the second-order derivative in the y direction, D_xy(x, σ) represents the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ represents the scale of the filter;
s3-2, constructing the scale space of the SURF method. The scale space consists of several groups, each containing several layers; the filter is a box filter that internally approximates Gaussian second-order derivative filtering. The images of different groups have the same size, while the template size of the filters used increases from group to group; within the same group, different layers use filters of gradually increasing template size and scale-space factor. The relations are:

    l = 2^(o+1) · (s + 1) + 1,    L = 3 · l,    σ = 1.2 · L / 9

wherein l represents the response length of the filter to positive and negative spots in the differential direction, L represents the template size of the filter, o is the group index, and s is the layer index within the group;
s3-3, processing each pixel point in the image from which feature points are to be extracted with the Hessian matrix, and calculating the Hessian matrix discriminant of the current pixel point:

    det(H(x, σ)) = D_xx(x, σ) · D_yy(x, σ) - (D_xy(x, σ))²

when the Hessian matrix discriminant attains a local maximum value, the current pixel point is selected as an interest point;
Taylor expansion is carried out on the selected interest points in continuous space, the expression being:

    D(X) = D + (∂D^T/∂X) · X + (1/2) · X^T · (∂²D/∂X²) · X

wherein D(X) represents the selected interest point information, X = (x, y, σ) is the offset position between two points, X^T denotes the transpose of X, D^T denotes the transpose of D, and D is the Gaussian difference function;
The extreme points in continuous space are determined by setting the derivative to 0, the derivation formula being:

    X̂ = -(∂²D/∂X²)^(-1) · (∂D/∂X)

wherein X̂ represents the offset position relative to the interpolation center;
The Hessian matrix is solved for all the maximum value points among the determined extreme points, and the feature points are obtained by screening with

    Tr(H)² / Det(H) < (r + 1)² / r

wherein r is a specified screening parameter, and Tr(H) and Det(H) respectively represent the trace and the determinant of the Hessian matrix;
s3-4, scanning in the circular neighborhood of each feature point obtained by screening with 0.2 rad as the step length, counting the sum of the absolute values of the horizontal and vertical Haar wavelet features of all sample points within a 60-degree sector range, and taking the sector direction corresponding to the maximum value as the main direction of the feature point;
and S3-5, with the feature point as the center, extracting a 4 × 4 grid of rectangular region blocks along the main direction of the feature point, the size of a single rectangular region block being 5 × 5 pixels; counting, for each rectangular region block, the Haar wavelet features of its 25 pixel points in the horizontal and vertical directions relative to the main direction of the feature point; and finally summarizing the counting results of all rectangular region blocks to obtain the feature description of the feature point, wherein the feature description comprises, for each block, the sum of the horizontal Haar wavelet feature values, the sum of the vertical Haar wavelet feature values, the sum of the absolute horizontal Haar wavelet feature values and the sum of the absolute vertical Haar wavelet feature values, and the feature description is expressed in the form of a feature vector.
4. The SURF feature point-based X-ray image registration method according to claim 3, wherein:
in said step S3-3, when the feature points are obtained by screening with

    Tr(H)² / Det(H) < (r + 1)² / r,

the determinant is calculated as Det(H) = D_xx · D_yy - (0.9 · D_xy)².
5. The SURF feature point-based X-ray image registration method according to claim 3, wherein:
in said step S4, when matching each feature point taken from the I2 image with the feature points taken from the I1 image, the feature vector error value between each feature point taken from the I2 image and each feature point taken from the I1 image is calculated, and the feature point taken from the I2 image together with the feature point taken from the I1 image that yields the smallest error value constitute a group of preliminary matching feature point pairs;

the feature vector error value is taken as the feature description conformity, the feature vector error values of all groups of preliminary matching feature point pairs are sorted from small to large, and each group of feature point pairs whose feature vector error value does not exceed the set threshold is determined as a matched feature point pair.
6. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in said step S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix includes the following steps:
s5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group of matched feature point pairs:
        [x'_i]   [h11  h12  h13]   [x_i]
    s · [y'_i] = [h21  h22  h23] · [y_i]
        [ 1  ]   [h31  h32  h33]   [ 1 ]

where s = h31·x_i + h32·y_i + h33 is the projective scale factor;
two explicit expressions for the projection matrix elements are derived:
(h31·x_i + h32·y_i + h33) · x'_i = h11·x_i + h12·y_i + h13

(h31·x_i + h32·y_i + h33) · y'_i = h21·x_i + h22·y_i + h23

wherein (x_i, y_i) represents the feature point of the i-th group of matched feature point pairs taken from the I1 image, (x'_i, y'_i) represents the feature point of the i-th group taken from the I2 image, and h_jk (j, k = 1, 2, 3) represents the projection matrix elements;
s5-2, rewriting the elements of the projection matrix into a vector form to obtain the following relational expression:
    [x_i  y_i  1    0    0   0   -x'_i·x_i   -x'_i·y_i   -x'_i]
    [ 0    0   0   x_i  y_i  1   -y'_i·x_i   -y'_i·y_i   -y'_i] · (h11, h12, h13, h21, h22, h23, h31, h32, h33)^T = 0
according to the linear characteristic of the matrix coefficients, h33 is set constantly to 1, and the two explicit expressions about the projection matrix elements obtained from each group of matched feature point pairs are substituted to calculate the projection matrix.
7. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in said step S8, when determining whether to perform the next round of calculation, the number of completed calculation rounds is checked; if it has not reached the preset number of rounds, the process returns to step S5 to perform the next round of calculation and the round count is incremented by 1.
8. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in said step S8, when determining whether to perform the next round of calculation, the matched feature point pairs already used in calculation are examined; if every combination of 4 groups of matched feature point pairs has already been used to calculate a projection matrix, the distance loss value and the projection matrix in the stored result are output.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program performs the steps of the SURF feature point-based X-ray image registration method according to any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the SURF feature point based X-ray image registration method according to any one of claims 1 to 8.
CN202011216382.3A 2020-11-04 2020-11-04 X-ray image registration method based on SURF feature points Active CN112215878B (en)


Publications (2)

Publication Number Publication Date
CN112215878A CN112215878A (en) 2021-01-12
CN112215878B true CN112215878B (en) 2023-03-24



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant