CN112215878A - X-ray image registration method based on SURF feature points - Google Patents


Publication number
CN112215878A
Authority
CN
China
Prior art keywords: image, feature, points, characteristic, point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011216382.3A
Other languages
Chinese (zh)
Other versions
CN112215878B (en)
Inventor
邹海波
李伟
宋璐杰
吉震宇
王宇鸣
刘朝晖
张忠财
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Chemical Technology
China Japan Friendship Hospital
Original Assignee
Beijing University of Chemical Technology
China Japan Friendship Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Chemical Technology, China Japan Friendship Hospital filed Critical Beijing University of Chemical Technology
Priority to CN202011216382.3A
Publication of CN112215878A
Application granted
Publication of CN112215878B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Image registration using feature-based methods
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10116 X-ray image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30008 Bone
    • G06T 2207/30012 Spine; Backbone


Abstract

The invention relates to an X-ray image registration method based on SURF feature points, computer equipment and a computer-readable storage medium. The method comprises the following steps: selecting a region of interest of an X-ray image and performing preprocessing and line feature extraction; selecting the corresponding region of interest of the X-ray image to be registered and performing preprocessing and line feature extraction; extracting feature points from each region of interest based on the SURF method; forming matched feature point pairs according to the feature descriptions of the feature points; randomly selecting 4 groups of matched feature point pairs and calculating a projection matrix; mapping according to the projection matrix and calculating the distance loss value after mapping; performing multiple rounds of calculation, retaining and outputting the minimum distance loss value and the corresponding projection matrix, and performing image registration according to the output projection matrix. The invention can realize multi-source image registration quickly and accurately.

Description

X-ray image registration method based on SURF feature points
Technical Field
The invention relates to the technical field of medical image processing, in particular to an X-ray image registration method based on SURF (speeded up robust features), computer equipment and a computer readable storage medium.
Background
In the more than one hundred years since Röntgen discovered X-rays at the end of the nineteenth century, medical imaging technology has progressed from analog to digital. At first, X-rays were applied only to imaging the human skeleton; with the advent of the rotating-anode X-ray tube in the new technological wave of the 1920s and 1930s, X-ray imaging improved markedly, showed unique advantages in the diagnosis of certain moving organs, and gradually became an indispensable examination modality in clinical practice. At the fifteenth International Congress of Radiology, many experts and scholars affirmed digital X-ray imaging technology, and with the rapid development of imaging technology, the application of digital medical images has broad development prospects. Today, the digital informatization of medical imaging is developing at high speed, providing fast, convenient and accurate diagnostic information for clinical treatment, and it will gradually become a new basic application platform for clinical research.
At present, in the diagnosis of a condition from orthopedic images, when the cervical vertebrae, the spine or other injured regions are examined by X-ray imaging, the monitored sources comprise orthopedic X-ray examination images at multiple angles and in different postures, so it is difficult to present clearly and completely all the detail information a doctor needs, which hinders further diagnosis and treatment of the patient. Multi-angle registration of X-ray images is therefore urgently needed.
Disclosure of Invention
The present invention aims to overcome at least some of the above drawbacks by providing a method, based on image feature information, for registering X-ray images taken at different angles and in different postures.
In order to achieve the above object, the present invention provides an X-ray image registration method based on SURF feature points, which includes the steps of:
S1, selecting a region of interest of an X-ray image, denoting it as the I1 image, and performing preprocessing and line feature extraction on the I1 image;
S2, selecting the corresponding region of interest of the X-ray image to be registered, denoting it as the I2 image, and performing preprocessing and line feature extraction on the I2 image;
S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method;
S4, matching each feature point taken from the I2 image against the feature points taken from the I1 image according to their feature descriptions, to form matched feature point pairs, wherein the feature description discrepancy of the two feature points in a matched pair does not exceed a set threshold;
S5, randomly selecting 4 groups of matched feature point pairs and calculating a projection matrix;
S6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image, wherein the distance loss value is the point-wise mean square error, under Euclidean distance, of the closest points between corresponding line features of the I1 image and the I2 image;
S7, comparing the distance loss value with the currently stored result, and retaining the smaller distance loss value and its corresponding projection matrix as the new stored result;
S8, judging whether to perform the next round of calculation; if so, returning to step S5, otherwise outputting the distance loss value and the projection matrix in the stored result;
S9, registering the I1 image and the I2 image according to the output projection matrix.
Preferably, in step S1, when preprocessing the I1 image, low-pass filtering and Canny edge detection are performed on the I1 image;
in step S2, when preprocessing the I2 image, low-pass filtering and Canny edge detection are performed on the I2 image.
Preferably, in step S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method includes the following steps:
S3-1, constructing a Hessian matrix, whose expression is:

H(x, σ) = | Dxx(x, σ)  Dxy(x, σ) |
          | Dxy(x, σ)  Dyy(x, σ) |

wherein x and y denote pixel coordinates, Dxx(x, σ) denotes the second-order derivative in the x direction, Dyy(x, σ) the second-order derivative in the y direction, Dxy(x, σ) the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ the scale of the filter;
S3-2, constructing the scale space of the SURF method, the scale space consisting of several groups of several layers; the filter is a box filter whose interior approximates Gaussian second-order differential filtering; images in different groups have the same size while the template size of the filter used increases progressively; images in the same group but different layers use filters of the same order of size while the scale-space factor of the filter increases progressively, according to the relations:

l = 2^(o+1) · (s+1) + 1,  L = 3 · l,  σ = 1.2 · L / 9

wherein l denotes the response length of the filter to positive and negative spots in the differential direction, L denotes the template size of the filter, s is the group index, and o is the layer index;
S3-3, processing each pixel of the image from which feature points are to be extracted with the Hessian matrix, and calculating the Hessian matrix discriminant of the current pixel:

H(x, σ) = Dxx(x, σ) · Dyy(x, σ) − (Dxy(x, σ))^2

when the Hessian matrix discriminant attains a local maximum, selecting the current pixel as an interest point;
and carrying out Taylor expansion of the selected interest points in continuous space, with the expression:

D(X) = D + (∂D/∂X)^T · X + (1/2) · X^T · (∂²D/∂X²) · X

wherein D(X) denotes the selected interest point information, X = (x, y, σ)^T is the offset between two points, X^T denotes the transpose of X, (∂D/∂X)^T denotes the transpose of ∂D/∂X, and D is the Gaussian difference function;
determining the extreme points in continuous space by setting the derivative to 0, the derivation formula being:

X̂ = −(∂²D/∂X²)^(−1) · (∂D/∂X)

wherein X̂ denotes the offset of the interpolation center;
solving the Hessian matrix for all maximum points among the determined extreme points, and screening the feature points by

Tr(H)² / Det(H) < (r + 1)² / r

wherein r is a specified screening parameter, and Tr(H) and Det(H) respectively denote the trace and determinant of the Hessian matrix;
S3-4, scanning the circular neighborhood of each feature point obtained by screening with a step of 0.2 rad, counting within each 60° sector the sums of the absolute values of the horizontal and vertical Haar wavelet responses of all points, and taking the sector direction corresponding to the maximum as the main direction of the feature point;
S3-5, for each feature point with a determined main direction, extracting, centered on the feature point, 4 × 4 rectangular region blocks along the main direction, each block of size 5 × 5; counting for each block the Haar wavelet responses of its 25 pixels in the horizontal and vertical directions relative to the main direction of the feature point; and finally collecting the statistics of all blocks to obtain the feature description of the feature point, the feature description comprising, for each block, the sum of the horizontal Haar wavelet responses, the sum of the vertical Haar wavelet responses, the sum of the absolute values of the horizontal responses and the sum of the absolute values of the vertical responses, the feature description being expressed in the form of a feature vector.
Preferably, in step S3-3, when screening the feature points by

Tr(H)² / Det(H) < (r + 1)² / r

Det(H) = Dxx · Dyy − (0.9 · Dxy)².
Preferably, in step S4, when matching each feature point taken from the I2 image against the feature points taken from the I1 image, the feature vector error value between each feature point taken from the I2 image and every feature point taken from the I1 image is calculated, and the feature point taken from the I2 image together with the feature point taken from the I1 image yielding the smallest error forms a group of preliminary matching feature point pairs;
taking the feature vector error value as the feature description discrepancy, the feature vector error values of all groups of preliminary matching feature point pairs are sorted in ascending order, and each group whose feature vector error value does not exceed a set threshold is determined to be a matched feature point pair.
Preferably, in step S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix includes the following steps:
S5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group:

(xi', yi', 1)^T ∝ H · (xi, yi, 1)^T

from which two explicit expressions in the projection matrix elements are derived:

(h31·xi + h32·yi + h33) · xi' = h11·xi + h12·yi + h13
(h31·xi + h32·yi + h33) · yi' = h21·xi + h22·yi + h23

wherein (xi, yi) denotes the feature point of the i-th matched pair taken from the I1 image, (xi', yi') the feature point of the i-th matched pair taken from the I2 image, and hjk (j, k = 1, 2, 3) the projection matrix elements;
S5-2, rewriting the projection matrix elements in vector form, h = (h11, h12, h13, h21, h22, h23, h31, h32, h33)^T, to obtain a linear system in h; according to the linear characteristic of the matrix coefficients, h33 is fixed at the constant value 1, and the two explicit expressions in the projection matrix elements obtained from each group of matched feature point pairs are substituted to obtain the projection matrix.
Preferably, in the step S8, when determining whether to perform the next round of calculation, the number of calculation rounds of the current round of calculation is checked, and if the number of calculation rounds does not reach the preset number of rounds, the process returns to the step S5 to perform the next round of calculation, and the number of calculation rounds is incremented by 1.
Preferably, in the step S8, when determining whether to perform the next round of calculation, the currently calculated matched feature point pair is determined, and if any 4 groups of matched feature point pairs have already been calculated as the projection matrix, the distance loss value and the projection matrix in the storage result are output.
The invention also provides a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the SURF feature point-based X-ray image registration method according to any one of the above items when executing the computer program.
The present invention also provides a computer readable storage medium having stored thereon a computer program which, when being executed by a processor, implements the steps of the SURF feature point-based X-ray image registration method according to any one of the above.
The technical solution of the invention has the following advantages: the invention provides an X-ray image registration method based on SURF (Speeded-Up Robust Features) feature points, computer equipment and a computer-readable storage medium. Line features and point feature information are respectively extracted from X-ray images of the same scene at different angles and/or postures; a projection matrix for registration is calculated from the point feature information and then evaluated via the line features, so that the projection matrix is evaluated and updated and the optimal projection matrix is finally determined, achieving multi-source image registration with high precision and high matching degree. The method can be applied to multiple types of target images acquired by X-ray imaging, such as the cervical vertebrae and the spine, with accurate and real-time detection, identification and registration.
Drawings
Fig. 1 is a schematic diagram illustrating steps of an X-ray image registration method based on SURF feature points according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a step of extracting feature points based on the SURF method in the embodiment of the present invention;
fig. 3 is a diagram of a result obtained by performing registration by using an X-ray image registration method based on SURF feature points in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
As shown in fig. 1 to 3, an embodiment of the present invention provides a SURF feature point-based X-ray image registration method, including the following steps:
S1, selecting a region of interest of an X-ray image, denoting it as the I1 image, and performing preprocessing and line feature extraction on the I1 image.
For an X-ray image (usually the main image among the multi-source images, referred to as image 1), selecting the region of interest (ROI) yields a smaller image, i.e. the I1 image; preprocessing the I1 image improves image clarity and facilitates the subsequent line feature extraction.
When extracting line features, one or more line features may be selected according to the characteristics of the specific target in the ROI.
S2, selecting the corresponding region of interest in the X-ray image to be registered, denoting it as the I2 image, and performing preprocessing and line feature extraction on the I2 image.
The X-ray image to be registered (referred to as image 2) is an X-ray image of the same scene as the main image (image 1) but at a different angle and/or posture; its region of interest (the I2 image) is usually the region that is the same as or similar to that of the I1 image. The preprocessing and line feature extraction applied to the I2 image should be identical to those applied to the I1 image.
In a preferred embodiment, in step S1, when preprocessing the I1 image, low-pass filtering and Canny edge detection are applied to the I1 image. Correspondingly, in step S2, when preprocessing the I2 image, the same low-pass filtering and Canny edge detection are applied to the I2 image.
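The low-pass filtering step can be sketched as follows. This is an illustrative numpy stand-in (the names `gaussian_kernel` and `low_pass` are our own, not from the patent); in practice a library routine such as OpenCV's `cv2.GaussianBlur` followed by `cv2.Canny` would typically be used for the low-pass filtering and Canny edge detection.

```python
import numpy as np

def gaussian_kernel(size: int, sigma: float) -> np.ndarray:
    """1-D Gaussian kernel, normalised to sum to 1."""
    ax = np.arange(size) - (size - 1) / 2.0
    k = np.exp(-ax**2 / (2.0 * sigma**2))
    return k / k.sum()

def low_pass(img: np.ndarray, size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Low-pass filter an image by separable Gaussian convolution,
    keeping the output the same shape as the input (edge padding)."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    # filter rows, then columns; 'valid' on the padded image gives 'same'
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)
```

Smoothing before edge detection suppresses the noise that would otherwise produce spurious edge (line feature) responses.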
S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method.
The SURF (Speeded-Up Robust Features) method is an improvement of the SIFT (Scale-Invariant Feature Transform) method, particularly in execution efficiency. Extracting the point feature information of the region of interest of an X-ray image by the SURF method enables real-time registration of the images.
S4, matching each feature point taken from the I2 image against the feature points taken from the I1 image according to their feature descriptions, to form matched feature point pairs. In a matched feature point pair, the feature description discrepancy of the two feature points does not exceed a set threshold.
The feature descriptions of two matched feature points, i.e. one taken from the I2 image and one taken from the I1 image, differ by the corresponding feature vector error value.
S5, randomly selecting 4 groups of matched feature point pairs and calculating the projection matrix.
The projection matrix is the transformation matrix that maps the I1 image onto the I2 image.
S6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image.
The distance loss value is the point-wise mean square error, under Euclidean distance, of the closest points between corresponding line features of the mapped I1 image and the I2 image.
S7, comparing the distance loss value with the currently stored result, and retaining the smaller distance loss value and its corresponding projection matrix as the new stored result.
When comparing the distance loss value with the currently stored result: if the stored result is empty, the distance loss value obtained in the current round and its projection matrix are stored directly as the new result; otherwise, the current distance loss value is compared with the stored one, so that after multiple rounds of calculation the minimum distance loss value and its corresponding projection matrix are retained as the stored result.
S8, judging whether to perform the next round of calculation; if so, returning to step S5, otherwise outputting the distance loss value and the projection matrix in the stored result.
By repeatedly calculating the projection matrix over many rounds, the projection matrix with the best registration effect is screened out iteratively, which improves the accuracy and matching degree of the multi-source image registration.
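Steps S5 to S8 amount to the following hypothesis-and-verify loop. This is a sketch: `estimate_h` stands for the projection-matrix solver of step S5 and `loss` for the line-feature distance loss of step S6; both names are ours.

```python
import random

def best_projection(matched_pairs, estimate_h, loss, rounds=500, seed=0):
    """Repeatedly sample 4 matched feature point pairs, compute a projection
    matrix, and keep the matrix with the smallest distance loss (S5-S8)."""
    rng = random.Random(seed)
    best_loss, best_h = float("inf"), None
    for _ in range(rounds):
        sample = rng.sample(matched_pairs, 4)   # step S5: 4 random pairs
        h = estimate_h(sample)                  # step S5: projection matrix
        current = loss(h)                       # step S6: distance loss
        if current < best_loss:                 # step S7: keep the smaller
            best_loss, best_h = current, h
    return best_loss, best_h                    # step S8: output after all rounds
```

This is the same sample-score-keep-best pattern as RANSAC, with the line-feature loss playing the role of the consensus score.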
S9, registering the I1 image and the I2 image according to the output projection matrix.
The invention extracts line features and point feature information from the two X-ray images respectively, calculates a projection matrix from the paired point feature information, evaluates the projection matrix via the line features, and selects the projection matrix with the best effect, achieving high-quality registration of multi-source X-ray images at different angles and/or postures. Moreover, throughout the registration process, interference such as noise is removed by preprocessing, and only the line features and point feature information of the X-ray images are used in the subsequent calculation; the point feature information in particular is robust to targets of different types, scales and brightness, so robust registration can be achieved across various scenes.
Preferably, as shown in fig. 2, in step S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method includes the following steps:
S3-1, constructing a Hessian matrix.

For a pixel (x, y) of image I, the Hessian matrix is

H(I(x, y)) = | ∂²I/∂x²   ∂²I/∂x∂y |
             | ∂²I/∂x∂y  ∂²I/∂y²  |

Filtering is needed before constructing the Hessian matrix, after which its expression becomes:

H(x, σ) = | Dxx(x, σ)  Dxy(x, σ) |
          | Dxy(x, σ)  Dyy(x, σ) |

wherein x and y denote pixel coordinates, Dxx(x, σ) denotes the second-order derivative in the x direction, Dyy(x, σ) the second-order derivative in the y direction, Dxy(x, σ) the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ the scale of the filter.

When the Hessian matrix discriminant H(x, σ) = Dxx(x, σ) · Dyy(x, σ) − (Dxy(x, σ))² attains a local maximum, the current pixel can be judged to be brighter or darker than the other points in its neighborhood.
S3-2, constructing the scale space.

The scale space of the SURF method consists of several groups of several layers; the filters are box filters whose interior approximates Gaussian second-order differential filtering; images in different groups have the same size while the template size of the filters used increases progressively; images in the same group but different layers use filters of the same order of size while the scale-space factor of the filter increases progressively. The specific relations are:

l = 2^(o+1) · (s+1) + 1,  L = 3 · l,  σ = 1.2 · L / 9

wherein l denotes the response length of the filter to positive and negative spots in the differential direction, L denotes the template size of the filter, s is the group index, o is the layer index, and σ is the scale of the filter.
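The relations above can be evaluated directly. The snippet follows the formulas as given (with s the group index and o the layer index, both starting from 0); the σ = 1.2 · L / 9 relation assumes, as in standard SURF, that the 9 × 9 base filter corresponds to σ = 1.2.

```python
def surf_filter_params(s: int, o: int):
    """Response length l, filter template size L and scale sigma for
    group index s and layer index o (both from 0), per step S3-2."""
    l = 2 ** (o + 1) * (s + 1) + 1   # l = 2^(o+1) * (s+1) + 1
    L = 3 * l                        # L = 3 * l
    sigma = 1.2 * L / 9.0            # base 9x9 filter ~ sigma = 1.2
    return l, L, sigma
```

For the first group and layer this reproduces the 9 × 9 base box filter of SURF.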
S3-3, locating the feature points.

Each pixel of the image from which feature points are to be extracted (i.e. the I1 image or the I2 image) is processed with the Hessian matrix, and the Hessian matrix discriminant of the current pixel is calculated: H(x, σ) = Dxx(x, σ) · Dyy(x, σ) − (Dxy(x, σ))². When the discriminant attains a local maximum, the current pixel is selected as an interest point (i.e. an extreme point among the pixels is found).

The Hessian matrix discriminant of each pixel is compared with those of all its neighbors in the image domain (the same-size image) and the scale domain (the adjacent scale spaces). A pixel is an extreme point when its value is greater than (or less than) those of all its neighbors: the pixel under examination is compared with the 8 pixels of its 3 × 3 neighborhood in its own layer (image-domain neighbors) and the 18 pixels of the 3 × 3 neighborhoods in the layers directly above and below (scale-domain neighbors), 26 points in total. Points with edge responses in the comparison result can then be removed via the Hessian matrix discriminant.
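The 26-point comparison can be sketched with numpy as below, assuming `responses` is a stack of Hessian-discriminant maps indexed (scale, row, column); the helper name `is_local_extremum` is our own.

```python
import numpy as np

def is_local_extremum(responses: np.ndarray, s: int, y: int, x: int) -> bool:
    """True if the value at (s, y, x) strictly exceeds its 26 neighbours:
    8 in the same layer plus 9 each in the layers above and below."""
    cube = responses[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2].astype(float).copy()
    centre = cube[1, 1, 1]
    cube[1, 1, 1] = -np.inf          # exclude the centre itself
    return centre > cube.max()
```

The symmetric "less than all neighbours" test for dark blobs follows by comparing against `cube.min()` instead.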
Each selected interest point is Taylor-expanded in continuous space, with the expression:

D(X) = D + (∂D/∂X)^T · X + (1/2) · X^T · (∂²D/∂X²) · X

wherein D(X) denotes the selected interest point information, X = (x, y, σ)^T is the offset between two points, X^T denotes the transpose of X, (∂D/∂X)^T denotes the transpose of ∂D/∂X, and D is the Gaussian difference function.
The extreme points among the interest points, i.e. the extreme points in continuous space, are determined by setting the derivative to 0; the derivation formula is:

X̂ = −(∂²D/∂X²)^(−1) · (∂D/∂X)

wherein X̂ denotes the offset of the interpolation center.
The Hessian matrix is solved for all maximum points among the extreme points determined by setting the derivative to 0, and the feature points are obtained by the screening condition

Tr(H)² / Det(H) < (r + 1)² / r

wherein r is a specified screening parameter, Tr(H) denotes the trace of the Hessian matrix, and Det(H) its determinant. The specific value of the screening parameter r can be set according to the actual situation.
Further, the Hessian matrix discriminant uses the Gaussian convolution of the original image. Since the Gaussian kernel follows a normal distribution, its coefficients decrease from the center point outward; to increase computation speed, the SURF method uses a box filter to approximate the Gaussian filter, and a weighting factor of 0.9 is applied to Dxy to balance the error introduced by this approximation. That is, in step S3-3, when screening the feature points by

Tr(H)² / Det(H) < (r + 1)² / r

Det(H) = Dxx · Dyy − (0.9 · Dxy)².
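As a sketch, the screening condition with the 0.9 weighting factor reads as follows; the function name and the default r = 10 are our own choices (the patent leaves r to be set per situation).

```python
def keep_feature_point(Dxx: float, Dyy: float, Dxy: float, r: float = 10.0) -> bool:
    """Screening of step S3-3: accept a maximum point only if
    Tr(H)^2 / Det(H) < (r+1)^2 / r, where Det(H) carries the 0.9 weight
    that compensates the box-filter approximation of Dxy."""
    det = Dxx * Dyy - (0.9 * Dxy) ** 2
    if det <= 0:
        return False                     # edge- or saddle-like response
    tr = Dxx + Dyy
    return tr * tr / det < (r + 1) ** 2 / r
```

The ratio Tr(H)²/Det(H) grows when the two principal curvatures differ strongly, which is exactly the edge-like case the screening rejects.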
S3-4, determining the main direction of each feature point.

The circular neighborhood of each feature point obtained by screening is scanned with a step of 0.2 rad; within each 60° sector, the sums of the absolute values of the horizontal and vertical Haar wavelet responses of all points are counted, and the sector direction corresponding to the maximum is taken as the main direction of the feature point.

The circular neighborhood of a feature point is the circle centered on the point with radius 6σ, where σ is the scale of the filter (used by the image in which the feature point lies).
S3-5, generating the feature descriptions of the feature points.

For each feature point with a determined main direction, 4 × 4 rectangular region blocks are extracted along the main direction, centered on the feature point, with each block of size 5 × 5. For each block, the Haar wavelet responses of its 25 pixels in the horizontal and vertical directions relative to the main direction of the feature point are counted; the statistics of all blocks are then collected to obtain the feature description of the feature point. The feature description of one feature point comprises, for each block, the sum of the horizontal Haar wavelet responses, the sum of the vertical Haar wavelet responses, the sum of the absolute values of the horizontal responses, and the sum of the absolute values of the vertical responses, and is expressed in the form of a feature vector.
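The dimensionality follows directly: 4 × 4 sub-regions × 4 statistics gives a 64-dimensional feature vector. A sketch (names are ours; `dx`/`dy` stand for the Haar responses along and across the main direction of each 5 × 5 block; the final normalisation is the usual SURF convention and is not stated in the patent text):

```python
import numpy as np

def surf_descriptor(dx_blocks, dy_blocks) -> np.ndarray:
    """Build the 64-d feature vector from the Haar responses of the
    16 sub-regions: per block, (sum dx, sum dy, sum |dx|, sum |dy|)."""
    parts = []
    for dx, dy in zip(dx_blocks, dy_blocks):
        parts.extend([dx.sum(), dy.sum(), np.abs(dx).sum(), np.abs(dy).sum()])
    vec = np.array(parts, dtype=float)
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec   # normalised for comparison
```

Normalising the vector makes the subsequent feature-vector error values comparable across feature points of different contrast.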
Preferably, in step S4, it will be taken from I2Each feature point of the image and the feature points are taken from I1When matching each characteristic point of the image, calculating each selected from I2Characteristic points of the image and one from I1The error value of the feature vector between the feature points of the image is obtained from I2Feature points of the image and the point of interest taken from I1The feature points of the image constitute a set of preliminary matching feature point pairs. Other extracts from I are treated in the same manner1And (4) feature points of the image until all the feature points form a preliminary matching feature point pair.
The feature vector error value between the feature point taken from the I1 image and the feature point taken from the I2 image is taken as the feature description consistency of a group of preliminary matching feature point pairs. The feature vector error values of all groups of preliminary matching feature point pairs are sorted from small to large, and each group of feature point pairs whose feature vector error value does not exceed a set threshold is determined as a matched feature point pair. Screening out the feature point pairs with the highest consistency in this way is beneficial to determining the optimal projection matrix through the subsequent calculation.
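A minimal sketch of this preliminary matching and thresholding step, assuming the feature descriptions are stored as rows of NumPy arrays and using the Euclidean norm as the feature vector error value (the text does not fix the exact error metric, so that choice is an assumption):

```python
import numpy as np

def match_descriptors(desc1, desc2, threshold=0.3):
    """For every descriptor in desc2, pair it with its minimum-error
    descriptor in desc1; keep pairs whose error does not exceed `threshold`,
    sorted by ascending error (best matches first)."""
    pairs = []
    for j, d2 in enumerate(desc2):
        errs = np.linalg.norm(desc1 - d2, axis=1)  # error to every desc1 row
        i = int(np.argmin(errs))
        pairs.append((i, j, float(errs[i])))
    pairs.sort(key=lambda p: p[2])  # sort error values from small to large
    return [(i, j) for i, j, e in pairs if e <= threshold]
```

The threshold value here is purely illustrative; in practice it would be tuned to the descriptor normalization in use.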
Preferably, in step S5, when 4 groups of matched pairs of feature points are randomly selected and the projection matrix is calculated, the method includes the following steps:
s5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group of matched feature point pairs:
s·[xi', yi', 1]^T = H·[xi, yi, 1]^T,  H = [[h11, h12, h13], [h21, h22, h23], [h31, h32, h33]],  where s is a non-zero scale factor
two explicit expressions for the projection matrix elements are derived:
(h31·xi + h32·yi + h33)·xi' = h11·xi + h12·yi + h13
(h31·xi + h32·yi + h33)·yi' = h21·xi + h22·yi + h23
where (xi, yi) denotes the feature point of the ith group of matched feature point pairs taken from the I1 image, (xi', yi') denotes the corresponding feature point taken from the I2 image, and hjk denotes a projection matrix element, j, k = 1, 2, 3;
s5-2, rewriting the projection matrix elements into a vector form to obtain the following relational expression:
[xi, yi, 1, 0, 0, 0, -xi'·xi, -xi'·yi, -xi'] · h = 0
[0, 0, 0, xi, yi, 1, -yi'·xi, -yi'·yi, -yi'] · h = 0,  h = (h11, h12, h13, h21, h22, h23, h31, h32, h33)^T
Since the projection matrix is defined only up to a scale factor, h33 is set to the constant value 1, so that the transformation matrix has 8 degrees of freedom and is unique. Substituting the two explicit expressions on the projection matrix elements obtained from each group of matched feature point pairs (8 explicit expressions in total), the projection matrix is calculated.
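With h33 fixed to 1, the two explicit expressions from each of the 4 matched pairs give an 8 × 8 linear system that can be solved directly. The following sketch (hypothetical helper name, NumPy assumed) builds two rows per matched pair from those expressions and solves for the remaining 8 elements:

```python
import numpy as np

def homography_from_4_pairs(src, dst):
    """Solve the 8x8 system produced by 4 matched pairs with h33 = 1;
    src and dst are 4x2 arrays of (x, y) and (x', y') points."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        # (h31 x + h32 y + 1) x' = h11 x + h12 y + h13
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y]); b.append(xp)
        # (h31 x + h32 y + 1) y' = h21 x + h22 y + h23
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y]); b.append(yp)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)
```

The system is solvable only when no three of the four points are collinear; degenerate samples make the matrix singular.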
In some embodiments, in step S8, when determining whether to perform the next round of calculation, the number of rounds completed so far can be checked; if the preset number of rounds has not been reached, the round counter is incremented by 1 and the method returns to step S5 for the next round of calculation.
In other embodiments, in step S8, when determining whether to perform the next round of calculation, the matched feature point pairs already used may be examined; if every combination of 4 groups of matched feature point pairs has already been used to calculate a projection matrix, the method does not return to step S5, and the distance loss value and the projection matrix in the stored result are output.
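Steps S5 to S8 together form a RANSAC-style search. The following self-contained sketch (hypothetical names, NumPy assumed, fixed-round stopping as in the first embodiment) samples 4 matched pairs per round, fits a projection matrix with h33 = 1, scores it with a caller-supplied loss function standing in for the line-feature distance loss of step S6, and keeps the matrix with the smallest loss as in step S7:

```python
import numpy as np

def _fit_h(src, dst):
    # 8x8 system with h33 = 1 (two equations per matched pair)
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y]); b.append(yp)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def best_homography(src_pts, dst_pts, loss_fn, rounds=100, seed=0):
    """Repeat: sample 4 pairs (S5), fit H, score it (S6), keep the
    smaller-loss result (S7), until the preset round count (S8)."""
    rng = np.random.default_rng(seed)
    best_loss, best_H = np.inf, None
    for _ in range(rounds):
        idx = rng.choice(len(src_pts), size=4, replace=False)
        try:
            H = _fit_h(src_pts[idx], dst_pts[idx])
        except np.linalg.LinAlgError:  # degenerate (e.g. collinear) sample
            continue
        loss = loss_fn(H)
        if loss < best_loss:           # retain the smaller distance loss
            best_loss, best_H = loss, H
    return best_loss, best_H
```

The actual distance loss in the method is the point mean square error between mapped line features, which this sketch abstracts behind loss_fn.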
In particular, in some preferred embodiments of the present invention, there is further provided a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the SURF feature point-based X-ray image registration method in any one of the above embodiments when executing the computer program.
In other preferred embodiments of the present invention, a computer-readable storage medium is further provided, on which a computer program is stored, and the computer program is executed by a processor to implement the steps of the SURF feature point-based X-ray image registration method described in any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments may be implemented by a computer program; the computer program may be stored in a non-volatile computer-readable storage medium, and when executed may include the processes of the above embodiments of the SURF feature point-based X-ray image registration method, which are not repeated here.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An X-ray image registration method based on SURF feature points is characterized by comprising the following steps:
S1, selecting a region of interest of the X-ray image, marking it as the I1 image, preprocessing the I1 image, and extracting line features;
S2, selecting the corresponding region of interest in the X-ray image to be registered, marking it as the I2 image, preprocessing the I2 image, and extracting line features;
S3, extracting feature points from the I1 image and the I2 image respectively based on the SURF method;
S4, according to the feature descriptions of the feature points, matching each feature point taken from the I2 image with the feature points taken from the I1 image to form matched feature point pairs, wherein the feature description consistency of the two feature points in a matched feature point pair does not exceed a set threshold;
s5, randomly selecting 4 groups of matched characteristic point pairs, and calculating a projection matrix;
S6, mapping the I1 image onto the I2 image according to the projection matrix, and calculating the distance loss value between the mapped I1 image and the I2 image, wherein the distance loss value is the point mean square error, under the Euclidean distance, of the closest points between the same line features of the I1 image and the I2 image;
s7, comparing the distance loss value with the current storage result, and reserving a smaller distance loss value and a corresponding projection matrix as a new storage result;
s8, judging whether to carry out the next round of calculation, if so, returning to the step S5, otherwise, outputting the distance loss value and the projection matrix in the storage result;
S9, registering the I1 image with the I2 image according to the output projection matrix.
2. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
when the I1 image is preprocessed in the step S1, the I1 image is low-pass filtered and Canny edge detection is performed;
when the I2 image is preprocessed in the step S2, the I2 image is low-pass filtered and Canny edge detection is performed.
3. The SURF feature point-based X-ray image registration method according to claim 1,
in the step S3, when feature points are extracted from the I1 image and the I2 image respectively based on the SURF method, the method comprises the following steps:
s3-1, constructing a Hessian matrix, wherein the expression is as follows:
H(x, σ) = | Dxx(x, σ)  Dxy(x, σ) |
          | Dxy(x, σ)  Dyy(x, σ) |
wherein x and y denote the coordinates of a pixel point, Dxx(x, σ) denotes the second-order derivative in the x direction, Dyy(x, σ) denotes the second-order derivative in the y direction, Dxy(x, σ) denotes the first-order partial derivative in the x direction followed by the first-order partial derivative in the y direction, and σ denotes the scale of the filter;
S3-2, constructing the scale space of the SURF method, wherein the scale space is composed of s groups of o layers; the filter is a box filter whose interior applies Gaussian second-order differential filtering; the images in different groups have the same size, but the template size of the filter used is gradually increased; images in different layers of the same group use filters of the same size, but the scale space factor of the filter is gradually increased; the relations are:
l = 2^(o+1)·(s+1) + 1,  L = 3·l,  σ = 1.2·L/9
wherein l denotes the response length of the filter to positive and negative spots in the differential direction, L denotes the template size of the filter, σ denotes the scale space factor, s is the group number, and o is the layer number;
s3-3, processing each pixel point in the image needing characteristic point extraction by using a Hessian matrix, and calculating the Hessian matrix discriminant of the current pixel point:
det(H(x, σ)) = Dxx(x, σ)·Dyy(x, σ) - (Dxy(x, σ))²
when the Hessian matrix discriminant obtains a local maximum value, selecting a current pixel point as an interest point;
and carrying out Taylor expansion on the selected interest points in a continuous space, wherein the expression is as follows:
D(X) = D + (∂D/∂X)^T · X + (1/2) · X^T · (∂²D/∂X²) · X
where D(X) denotes the selected interest point information, X = (x, y, σ)^T is the offset position between two points, X^T denotes the transpose of X, D^T denotes the transpose of D, and D is the Gaussian difference function;
determining extreme points in the continuous space by setting the derivative to 0, the derivation formula being:
X̂ = -(∂²D/∂X²)⁻¹ · (∂D/∂X)
where X̂ denotes the offset position relative to the interpolation center;
solving the Hessian matrix for all the maximum value points among the determined extreme points, and obtaining the feature points by screening with
Tr(H)² / Det(H) < (r + 1)² / r
where r is a designated screening parameter, and Tr(H) and Det(H) respectively denote the trace and determinant of the Hessian matrix;
S3-4, scanning in the circular neighborhood of each feature point obtained by screening with a step length of 0.2 rad, counting the sum of the horizontal and vertical Haar wavelet response absolute values of all sample points within a 60-degree sector, and taking the sector direction corresponding to the maximum value as the main direction of the feature point;
S3-5, for each feature point whose main direction has been determined, taking the feature point as the center, extracting 4 × 4 rectangular region blocks along the main direction of the feature point, the size of a single rectangular region block being 5 × 5 sample points; for each rectangular region block, counting the Haar wavelet responses of the 25 sample points in the horizontal and vertical directions relative to the main direction of the feature point; and finally collecting the statistics of all rectangular region blocks to obtain the feature description of the feature point, wherein the feature description comprises the sum of the horizontal Haar wavelet responses, the sum of the vertical Haar wavelet responses, the sum of the absolute values of the horizontal Haar wavelet responses, and the sum of the absolute values of the vertical Haar wavelet responses, expressed in the form of a feature vector.
4. The SURF feature point-based X-ray image registration method according to claim 3, wherein:
in the step S3-3, when the feature points are obtained by screening with
Tr(H)² / Det(H) < (r + 1)² / r,
the determinant is approximated as Det(H) = Dxx·Dyy - (0.9·Dxy)².
5. The SURF feature point-based X-ray image registration method according to claim 3, wherein:
in the step S4, when matching each feature point taken from the I2 image with the feature points taken from the I1 image, the feature vector error value between each feature point taken from the I2 image and every feature point taken from the I1 image is calculated, and the feature point taken from the I2 image, together with the feature point taken from the I1 image having the minimum feature vector error value, constitutes a group of preliminary matching feature point pairs;
and the feature vector error value is taken as the feature description consistency; the feature vector error values of all the groups of preliminary matching feature point pairs are sorted from small to large, and each group of feature point pairs whose feature vector error value does not exceed a set threshold is determined as a matched feature point pair.
6. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in step S5, when 4 groups of matched feature point pairs are randomly selected and the projection matrix is calculated, the method includes the following steps:
s5-1, randomly selecting 4 groups of matched feature point pairs, and establishing an equation for each group of matched feature point pairs:
s·[xi', yi', 1]^T = H·[xi, yi, 1]^T,  H = [[h11, h12, h13], [h21, h22, h23], [h31, h32, h33]],  where s is a non-zero scale factor
two explicit expressions for the projection matrix elements are derived:
(h31·xi + h32·yi + h33)·xi' = h11·xi + h12·yi + h13
(h31·xi + h32·yi + h33)·yi' = h21·xi + h22·yi + h23
where (xi, yi) denotes the feature point of the ith group of matched feature point pairs taken from the I1 image, (xi', yi') denotes the corresponding feature point taken from the I2 image, and hjk denotes a projection matrix element, j, k = 1, 2, 3;
s5-2, rewriting the projection matrix elements into a vector form to obtain the following relational expression:
[xi, yi, 1, 0, 0, 0, -xi'·xi, -xi'·yi, -xi'] · h = 0
[0, 0, 0, xi, yi, 1, -yi'·xi, -yi'·yi, -yi'] · h = 0,  h = (h11, h12, h13, h21, h22, h23, h31, h32, h33)^T
according to the linear characteristic of the matrix coefficients, h33 is set to the constant value 1, and the two explicit expressions on the projection matrix elements obtained from each group of matched feature point pairs are substituted to calculate the projection matrix.
7. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in step S8, when determining whether to perform the next round of calculation, the number of calculation rounds of the current round of calculation is checked, and if the number of calculation rounds does not reach the preset number of calculation rounds, the process returns to step S5 to perform the next round of calculation, and the number of calculation rounds is incremented by 1.
8. The SURF feature point-based X-ray image registration method according to claim 1, wherein:
in step S8, when determining whether to perform the next round of calculation, the currently calculated matched feature point pair is determined, and if any 4 groups of matched feature point pairs have already been calculated by the projection matrix, the distance loss value and the projection matrix in the stored result are output.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program implements the steps of the SURF feature point based X-ray image registration method of any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the SURF feature point-based X-ray image registration method according to any one of claims 1 to 8.
CN202011216382.3A 2020-11-04 2020-11-04 X-ray image registration method based on SURF feature points Active CN112215878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011216382.3A CN112215878B (en) 2020-11-04 2020-11-04 X-ray image registration method based on SURF feature points


Publications (2)

Publication Number Publication Date
CN112215878A true CN112215878A (en) 2021-01-12
CN112215878B CN112215878B (en) 2023-03-24

Family

ID=74058150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011216382.3A Active CN112215878B (en) 2020-11-04 2020-11-04 X-ray image registration method based on SURF feature points

Country Status (1)

Country Link
CN (1) CN112215878B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115841506A (en) * 2023-02-20 2023-03-24 广东省人民医院 Fluorescent molecule image processing method and system
CN116728420A (en) * 2023-08-11 2023-09-12 苏州安博医疗科技有限公司 Mechanical arm regulation and control method and system for spinal surgery

Citations (5)

Publication number Priority date Publication date Assignee Title
CN106683127A (en) * 2017-01-05 2017-05-17 南京觅踪电子科技有限公司 Multimode medical image registration method based on SURF algorithm
US20180357788A1 (en) * 2016-08-11 2018-12-13 Changzhou Campus of Hohai University UAV Inspection Method for Power Line Based on Human Visual System
CN109271996A (en) * 2018-08-21 2019-01-25 南京理工大学 FPC automatic image registration method based on SURF feature and Hash perception algorithm
CN109727239A (en) * 2018-12-27 2019-05-07 北京航天福道高技术股份有限公司 Based on SURF feature to the method for registering of inspection figure and reference map
CN111784675A (en) * 2020-07-01 2020-10-16 云南易见纹语科技有限公司 Method and device for processing article texture information, storage medium and electronic equipment


Non-Patent Citations (1)

Title
LIAN Lin: "Research on Automatic Registration Algorithms for Infrared and Visible-Light Remote Sensing Images", China Doctoral Dissertations Full-text Database, Information Science and Technology *


Also Published As

Publication number Publication date
CN112215878B (en) 2023-03-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant