CN112651951A - DCE-MRI-based breast cancer classification method - Google Patents
- Publication number
- CN112651951A (application CN202011620218.9A)
- Authority
- CN
- China
- Prior art keywords
- classifier
- dce
- mri
- breast cancer
- weight
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0012—Biomedical image inspection
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2411—Classification based on the proximity to a decision surface, e.g. support vector machines
- G06F18/24323—Tree-organised classifiers
- G06F18/25—Fusion techniques
- G06T5/90—Dynamic range modification of images or parts thereof
- G06V10/40—Extraction of image or video features
- G06T2207/10096—Dynamic contrast-enhanced magnetic resonance imaging [DCE-MRI]
- G06T2207/20081—Training; Learning
- G06T2207/30068—Mammography; Breast
- G06T2207/30096—Tumor; Lesion
Abstract
The invention relates to the technical field of magnetic resonance imaging, and in particular to a DCE-MRI-based breast cancer classification method comprising the following steps: training a plurality of strong classifiers on extracted kinetic-parameter features and texture features; linearly combining the strong classifiers by a weighted voting method to obtain a fusion classifier; and finally classifying breast DCE-MRI images with the fusion classifier to obtain their grade. The invention solves the problem of the low accuracy of existing classification methods.
Description
Technical Field
The invention relates to the technical field of magnetic resonance imaging, in particular to a breast cancer grading method based on DCE-MRI.
Background
DCE-MRI is of great significance for improving the clinical diagnosis of solid tumors and for evaluating the early response to anti-angiogenic chemotherapy. DCE-MRI provides a set of high-quality lesion images that contain useful information not only about the shape of a suspicious lesion but also about the dynamic behavior of the contrast agent within it.
The existing computer-aided diagnosis of breast cancer has the following problems:
(1) physicians judge mainly from experience, which introduces considerable subjectivity;
(2) a fundamental problem in acquiring DCE-MRI data is the trade-off between spatial and temporal resolution. Computing quantitative parameters requires the time rate of change of the contrast-agent concentration in the vessel, which changes very quickly, so high temporal resolution is needed. Quantitative parameters can only be obtained by introducing a suitable model; the classic Tofts two-compartment model is the most common pharmacokinetic model for DCE-MRI, but it needs high spatial resolution and signal-to-noise ratio and places high demands on the magnetic resonance scanner and sequence settings;
(3) the classification result directly depends on the quality of the classification algorithm, and the widely used SVM, random forest and Bayes classifiers each have certain limitations.
Disclosure of Invention
The invention provides a DCE-MRI-based breast cancer classification method that aims to solve the problem of the low accuracy of existing classification methods.
The technical scheme for solving this problem is as follows: a DCE-MRI based breast cancer grading method comprising the steps of:
S1: extracting sample data from at least one breast DCE-MRI image training sample, the sample data comprising kinetic-parameter features and texture features;
S2: constructing a fusion classifier
linearly combining a plurality of strong classifiers by a weighted voting method to obtain the fusion classifier, wherein each strong classifier is obtained by training on the sample data;
S3: classifying breast DCE-MRI images with the fusion classifier to obtain their grade.
Preferably, each strong classifier is obtained by applying a boosting ensemble algorithm to the corresponding base classifier.
Preferably, the base classifiers are an SVM classifier, a decision tree classifier and an ELM (extreme learning machine) classifier.
Preferably, a threshold δ is preset in the fusion classifier: when y > δ the decision result is class 1, and when y < δ the decision result is class 2, where y is the fused output of the plurality of strong classifiers.
Preferably, obtaining a strong classifier with the boosting ensemble algorithm specifically includes:
assigning initial weights to the sample data;
training a base classifier on the weighted sample data with a classification algorithm;
determining the weight of the base classifier and the sample weights from the base classifier's classification accuracy, using the sample weights as the training-sample weights of the next-stage base classifier, and iterating until the desired number of base classifiers is reached;
and voting among the base classifiers according to their weights to obtain the strong classifier.
Preferably, the kinetic-parameter features comprise semi-quantitative parameters and quantitative parameters.
Preferably, step S1 specifically includes:
obtaining semi-quantitative parameters from the signal intensity-time curve of the breast DCE-MRI image;
inputting the breast DCE-MRI image into a reference region model to obtain quantitative parameters;
extracting texture features from the lesion region of the breast DCE-MRI image with one of the Laws texture features, the gray-level co-occurrence matrix, the gray-level run-length matrix, the gray-level size-zone matrix and the gray-level difference matrix;
and screening the extracted kinetic parameters and texture features to obtain the sample data.
Preferably, abnormal sample data in the boosting ensemble algorithm are identified with a K-nearest-neighbor algorithm.
Compared with the prior art, the invention has the following beneficial effects: the invention adopts a boosting strategy and uses mutually overlapping sampling subsets obtained by bootstrap sampling. The strong classifiers are fused by voting with different weights to obtain a fusion classifier, which improves the accuracy of breast BI-RADS grading. The invention extracts comprehensive features, including kinetic-parameter features and texture features, from the spatio-temporal information of breast DCE-MRI images, which facilitates grading and improves diagnostic accuracy.
Drawings
FIG. 1 is a schematic flow diagram of the classification method of the present invention.
FIG. 2 is a flow chart illustrating the construction of a fusion classifier according to the present invention.
FIG. 3 is a schematic flow chart of obtaining a strong classifier by using a boosting integration algorithm according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings of the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention.
A DCE-MRI based breast cancer grading method comprising the steps of:
S1: extracting sample data from at least one breast DCE-MRI image training sample, the sample data comprising kinetic-parameter features and texture features;
S2: constructing a fusion classifier
linearly combining a plurality of strong classifiers by a weighted voting method to obtain the fusion classifier, wherein each strong classifier is obtained by training on the sample data;
S3: classifying breast DCE-MRI images with the fusion classifier to obtain their grade.
As a preferred embodiment of the invention, each strong classifier is obtained by applying a boosting ensemble algorithm to the corresponding base classifier.
As a preferred embodiment of the invention, the base classifier is an SVM classifier, a decision tree classifier or an ELM classifier.
As a preferred embodiment of the invention, a threshold δ is preset in the fusion classifier: when y > δ the decision result is class 1, and when y < δ the decision result is class 2, where y is the fused output of the plurality of strong classifiers.
As a preferred embodiment of the invention, obtaining a strong classifier with the boosting ensemble algorithm specifically includes:
assigning initial weights to the sample data;
training a base classifier on the weighted sample data with a classification algorithm;
determining the weight of the base classifier and the sample weights from the base classifier's classification accuracy, using the sample weights as the training-sample weights of the next-stage base classifier, and iterating until the desired number of base classifiers is reached;
and voting among the base classifiers according to their weights to obtain the strong classifier.
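The reweight-and-vote loop above is the classical AdaBoost recipe. A minimal sketch, assuming decision stumps as stand-ins for the patent's SVM/decision-tree/ELM base classifiers (the patent does not fix the weight-update formula; the `alpha = 0.5*ln((1-err)/err)` rule and the synthetic data below are standard AdaBoost assumptions, not taken from the text):

```python
import numpy as np

def train_boosted(X, y, n_rounds=30):
    """Boosting loop: reweight samples, accumulate weighted base classifiers.
    y must be in {-1, +1}; base classifiers here are axis-aligned decision stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)                       # initial, equal sample weights
    ensemble = []
    for _ in range(n_rounds):
        best = None
        for j in range(X.shape[1]):               # exhaustive stump search
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()      # weighted training error
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # classifier weight
        pred = sign * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)            # misclassified samples gain weight
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict_boosted(ensemble, X):
    score = sum(a * s * np.where(X[:, j] > t, 1, -1) for a, j, t, s in ensemble)
    return np.where(score >= 0, 1, -1)            # weighted vote of base classifiers

# illustrative data: labels determined by a diagonal boundary
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
ensemble = train_boosted(X, y)
train_acc = (predict_boosted(ensemble, X) == y).mean()
```

In the patent's setting the stump search would be replaced by training an SVM, decision tree or ELM on the weighted samples; the weight bookkeeping is unchanged.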
As a preferred embodiment of the invention, the kinetic-parameter features comprise semi-quantitative parameters and quantitative parameters.
As a preferred embodiment of the invention, step S1 specifically includes:
obtaining semi-quantitative parameters from the signal intensity-time curve of the breast DCE-MRI image;
inputting the breast DCE-MRI image into a reference region model to obtain quantitative parameters;
extracting texture features from the lesion region of the breast DCE-MRI image with one of the Laws texture features, the gray-level co-occurrence matrix, the gray-level run-length matrix, the gray-level size-zone matrix and the gray-level difference matrix;
and screening the extracted kinetic parameters and texture features to obtain the sample data.
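The semi-quantitative parameters read off the signal intensity-time curve are typically simple curve descriptors. The patent does not enumerate them; the sketch below uses common choices (peak enhancement, time to peak, wash-in/wash-out slopes, initial area under the curve), so treat the feature set as an assumption:

```python
import numpy as np

def semiquant_features(signal, t):
    """Common semi-quantitative descriptors of a DCE-MRI signal intensity-time curve.
    signal: intensity samples; t: acquisition times (same length, minutes)."""
    s0 = signal[0]
    enh = (signal - s0) / max(s0, 1e-9)            # relative enhancement
    i_peak = int(np.argmax(enh))
    peak, ttp = float(enh[i_peak]), float(t[i_peak])
    wash_in = peak / ttp if ttp > 0 else 0.0       # average uptake slope
    wash_out = (float(enh[-1]) - peak) / (float(t[-1]) - ttp) if t[-1] > ttp else 0.0
    iauc = float(np.sum((enh[1:] + enh[:-1]) / 2 * np.diff(t)))  # trapezoidal AUC
    return {"peak": peak, "ttp": ttp, "wash_in": wash_in,
            "wash_out": wash_out, "iauc": iauc}

# illustrative curve: enhancement peaks at t = 2 min, then washes out
f = semiquant_features(np.array([100., 150., 200., 190., 180.]),
                       np.array([0., 1., 2., 3., 4.]))
```

A negative `wash_out` (as here) corresponds to the wash-out curve type often associated with malignancy; a persistently rising curve gives a positive value.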
In a preferred embodiment of the present invention, the abnormal sample data in the boosting integration algorithm is determined by using a K-nearest neighbor algorithm.
Example 1: a DCE-MRI based breast cancer staging method comprising the steps of:
1. Preprocess a plurality of breast DCE-MRI images to obtain a training sample set.
2. Extract kinetic-parameter features and texture features from the training sample set.
21: Extract kinetic-parameter features, which comprise quantitative parameters and semi-quantitative parameters.
The invention adopts a reference region model (RRM) to extract quantitative parameters; pharmacokinetic estimates can be obtained without an arterial input function (AIF). The model can be expressed by the following equation (1):
In the model, C_tis(t) and C_RR(t) denote the time-varying contrast-agent concentration in the region of interest and in the reference region, respectively; parameters with the subscript RR describe the reference region and the others describe the region of interest. K^trans (min^-1) is the volume transfer constant of the contrast agent leaking from the vascular (plasma) space into the EES (extravascular extracellular space); k_ep (min^-1) is the rate constant at which the contrast agent returns from the EES to the vascular (plasma) space. There are three fitting parameters: K^trans, k_ep and k_ep,RR. The relative EES volume fraction can also be found from equation (2):
In the first step, all parameters are fitted with equation (6); the fitted parameter k_ep,RR is then held fixed and a second fit is performed, which reduces the variability of the fit. In the second fit, the parameters to be solved are K^trans and k_ep. All parameters are fitted with a nonlinear least-squares method.
The reference region model can also be fitted linearly; the fitting equation is shown in formula (3).
We solve the equation using a linear fit of the form A = Bx, where A = [C_tis(t_1), ..., C_tis(t_N)]^T.
The solution of the equation is given by equation (7), where the components x_1 and x_2 of x satisfy the relationship in equation (11), so that equation (8) can be rewritten as equation (12).
According to the first fit of equations (3) and (6), we compute k_ep,RR for all reference-region voxels, take the median, and substitute the median as a fixed value into equation (7); the parameters of the second fit are then K^trans and k_ep.
Referring to Table 1, the reference region model is implemented as follows:
a. Fit the voxels in all tissues of interest using either the nonlinear reference region model (equation (1)) or the linear reference region model (equation (3)).
b. Discard voxel estimates with negative parameters.
c. Take the median of k_ep,RR over all reference-region voxels.
d. Substitute this median as a fixed value into the nonlinear reference region model (equation (1)) or the linear reference region model (equation (7)) and perform the second fit.
Table 1: boosting fusion algorithm pseudo-code
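The linear fit A = Bx described above can be sketched with ordinary least squares. This follows the published linearized RRM form C_tis(t) = x1·C_RR(t) + x2·∫C_RR dτ + x3·∫C_tis dτ with x1 = K^trans_tis/K^trans_RR, x2 = x1·k_ep,RR and x3 = -k_ep; the synthetic curves and parameter values are illustrative assumptions, not data from the patent:

```python
import numpy as np

def fit_linear_rrm(c_tis, c_rr, t):
    """Solve A = Bx by least squares for the linearized reference region model."""
    def cumtrapz(c):  # running trapezoidal integral, starting at 0
        return np.concatenate(([0.0], np.cumsum((c[1:] + c[:-1]) / 2 * np.diff(t))))
    B = np.column_stack([c_rr, cumtrapz(c_rr), cumtrapz(c_tis)])
    x, *_ = np.linalg.lstsq(B, c_tis, rcond=None)
    ratio = x[0]                          # K^trans_tis / K^trans_RR
    return ratio, x[1] / ratio, -x[2]     # ratio, k_ep,RR, k_ep (tissue)

# synthetic demo: simulate a tissue curve with known parameters, then recover them
t = np.arange(0.0, 5.0, 0.002)                        # minutes, fine grid
c_rr = 5.0 * (np.exp(-0.5 * t) - np.exp(-4.0 * t))    # reference-region curve
R, kep_rr, kep = 1.8, 0.6, 1.2                        # assumed ground truth
c_tis = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):  # forward-Euler integration of the RRM differential form
    dcrr = (c_rr[i] - c_rr[i - 1]) / dt
    c_tis[i] = c_tis[i - 1] + dt * (R * dcrr + R * kep_rr * c_rr[i - 1]
                                    - kep * c_tis[i - 1])
ratio_hat, kep_rr_hat, kep_hat = fit_linear_rrm(c_tis, c_rr, t)
```

On clean data the three fitted components recover the assumed parameters closely; in practice the second, fixed-k_ep,RR fit described in the text would follow.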
22. Extract texture features
One of the Laws texture features, the gray-level co-occurrence matrix, the gray-level run-length matrix, the gray-level size-zone matrix and the gray-level difference matrix is used to extract texture features from the breast lesion region.
The Laws texture features determine texture attributes from the mean gray value, edges, spots, ripples and waves in the texture.
The gray-level co-occurrence matrix (GLCM) represents the texture of an image through statistics computed on the matrix. It is a common texture-extraction method: it computes the joint probability distribution of pairs of pixel values at distance D in the image, reflects texture through this conditional probability, describes the gray-level correlation of adjacent pixels, and quantitatively describes the direction, spacing and amplitude of gray-level variation.
The gray-level run-length matrix is another common means of analyzing local information and arrangement rules in an image, and is usually used together with the GLCM. It focuses on runs of identical gray values along one direction and obtains texture features by counting their distribution. Like the GLCM, it can scan along different directions, but its axes have a different meaning: the axes of the GLCM represent the gray levels of two points and express gray-level differences, whereas in the run-length matrix the vertical axis represents the pixel value, the horizontal axis represents the run length (expressing gray-level uniformity), and each entry counts how often a run of that value and length occurs.
The gray-level size-zone matrix quantizes areas of connected pixels in the image. It is arranged like the run-length matrix, but its quantization unit is a zone size rather than a run length: the run-length matrix records how often consecutive equal pixel values occur along a one-dimensional direction, while the size-zone matrix records how often connected zones of equal pixels occur over the two-dimensional image area.
The neighborhood gray-level difference matrix reflects the gray-level difference between a neighborhood and its central point.
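As an illustration of the co-occurrence idea described above, a minimal GLCM with a few Haralick-style statistics; the quantization level and the single (dy, dx) offset are arbitrary choices for the sketch, not values from the patent:

```python
import numpy as np

def glcm_features(img, levels=8, dy=0, dx=1):
    """Gray-level co-occurrence matrix for one offset (dy, dx) plus basic statistics."""
    mx = img.max()
    q = (img / mx * (levels - 1)).astype(int) if mx > 0 else np.zeros(img.shape, int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1   # count co-occurring pair
    glcm /= max(glcm.sum(), 1.0)                    # joint probability distribution
    i, j = np.indices(glcm.shape)
    return {
        "contrast": float(((i - j) ** 2 * glcm).sum()),       # local variation
        "energy": float((glcm ** 2).sum()),                   # uniformity
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
    }

# a perfectly uniform patch has zero contrast and maximal energy/homogeneity
flat = glcm_features(np.full((8, 8), 3.0))
```

In practice several offsets (directions and distances D) are computed and their statistics averaged or concatenated into the feature vector.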
3. Construct the fusion classifier
The invention uses three base classifiers in the fusion design. The fusion steps are shown in FIG. 2: the invention selects an SVM (support vector machine) classifier, a decision tree classifier and an ELM classifier as the base classifiers of the first layer of the boosting fusion.
The SVM finds the optimal classification plane by maximizing the classification margin; the decision tree splits mainly on information-entropy-based criteria; the ELM is a single-hidden-layer neural network that minimizes the squared classification error. Because the three classifiers follow different classification principles, they increase the diversity of the ensemble: the same sample data can yield different classification results, so a weighted decision over the three improved strong classifiers lets the results complement one another to some degree and improves classification accuracy.
During training, the boosting fusion algorithm continuously adjusts the distribution of the training subsets according to the results of the base classifier, making the learner focus on samples that are more error-prone. Multiple iterations yield several base classifiers, which are then combined into a strong classifier, improving on the accuracy of any single classifier. The boosting algorithm first assigns the same initial weight to the training samples; the weighted samples are used to obtain the optimal classifier through a classification algorithm; the classifier weight and the sample weights are determined from the classifier's accuracy; the sample weights become the training weights of the next-stage classifier; this is iterated until the required number of base classifiers is reached; finally, the base classifiers vote according to their weights to produce the classification of the boosting fusion classifier.
The construction of the fusion classifier comprises two parts: first, the classification accuracy of the individual SVM, decision tree and ELM classifiers is improved through the boosting fusion algorithm, yielding the decisions of a boosting-SVM classifier, a boosting-decision-tree classifier and a boosting-ELM classifier; these three are then fused by weighted voting to obtain the final classification result. Through the boosting algorithm, the classification accuracy of each base classifier is optimized.
The specific steps for constructing the boosting-SVM classifier are: assign initial weights to the sample data; train an SVM classifier on the weighted sample data with a classification algorithm; determine the SVM classifier weight and the sample weights from the SVM classifier's classification accuracy, using the sample weights as the training weights of the next-stage SVM classifier; iterate until the required number of SVM classifiers is reached; and have the SVM classifiers vote according to their weights to obtain the boosting-SVM classifier.
During iteration, a sample's weight depends on whether the current base classifier classifies it correctly. For a correctly classified sample, the weight is reduced appropriately. For a misclassified sample, it is first judged whether the sample is abnormal data: if so, its weight is reduced; otherwise its weight is increased appropriately, so that the classifier focuses on the misclassified normal samples. Abnormal data are judged with a K-nearest-neighbor rule: if the proportion of misclassified samples among a sample's K nearest neighbors is larger than the average misclassification rate over the nearest neighbors of all samples, the sample is judged to be abnormal.
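The K-nearest-neighbor abnormality check can be sketched as follows. The Euclidean metric, the value of K, and the toy data are illustrative assumptions; the "compare the neighborhood error rate to the global average" rule follows the text:

```python
import numpy as np

def flag_abnormal(X, y, pred, k=5):
    """A misclassified sample whose k nearest neighbors are misclassified more
    often than the average neighborhood error rate is treated as abnormal."""
    err = (pred != y).astype(float)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)               # a sample is not its own neighbor
    nn = np.argsort(d, axis=1)[:, :k]
    local_err = err[nn].mean(axis=1)          # neighborhood misclassification rate
    return (err == 1.0) & (local_err > local_err.mean())

# toy data: 5 clustered misclassified points, 15 well-separated correct ones
X = np.vstack([np.arange(5.0).reshape(-1, 1) * 0.01,
               10.0 + np.arange(15.0).reshape(-1, 1) * 0.01])
y = np.ones(20)
pred = np.concatenate([-np.ones(5), np.ones(15)])
flags = flag_abnormal(X, y, pred)
```

Samples flagged this way would have their boosting weights reduced instead of increased, preventing the ensemble from over-fitting label noise.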
Classifier fusion usually follows the majority rule: the classification results of the individual classifiers are obtained, and a label wins when it reaches more than half of the votes; for a two-class problem with three classifiers, the label chosen by two of them is the final classification. However, different classifiers can perform differently on different sample sets, and a plain average vote may fail to exploit the strengths of each classifier.
The boosting-decision-tree classifier and the boosting-ELM classifier are constructed in the same way as the boosting-SVM classifier.
For the three strong classifiers, weighted voting is used to determine their weights and combine them into the fusion classifier. The idea of weighted voting is to give the base classifiers different weights according to their performance, thereby influencing the final classification: a classifier with a good classification effect and high accuracy receives a larger weight, and vice versa. The weights are chosen mainly from the classification accuracy of the three strong classifiers on the training set, while also accounting for the class proportions: the weight is first distributed evenly over the classes, then divided equally among the samples of each class, and used to compute a weighted accuracy for each classifier, yielding the weighted accuracies C_1, C_2, C_3 of the three strong classifiers. These are then normalized to sum to 1 and used as the weights of the three classifiers, as shown in formula (13).
The weighted voting algorithm is shown in Table 2.
Table 2: weighted voting algorithm
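The normalize-and-vote step of formula (13) can be sketched as: take the (weighted) accuracies C_1, C_2, C_3 of the three strong classifiers, normalize them to sum to 1, and fuse by the sign of the weighted prediction sum (equivalently, compare y against the threshold δ, fixed at 0 here). The accuracy values and predictions below are illustrative:

```python
import numpy as np

def fuse_by_weighted_vote(preds, accs):
    """preds: (n_classifiers, n_samples) array in {-1, +1}; accs: per-classifier
    accuracies. Weights are the normalized accuracies; returns the fused labels."""
    w = np.asarray(accs, dtype=float)
    w = w / w.sum()                            # normalize so the weights sum to 1
    return np.where(w @ np.asarray(preds, dtype=float) >= 0, 1, -1)

preds = np.array([[ 1,  1, -1, -1],    # boosting-SVM decisions
                  [ 1, -1, -1,  1],    # boosting-decision-tree decisions
                  [-1,  1, -1, -1]])   # boosting-ELM decisions
fused = fuse_by_weighted_vote(preds, accs=[0.90, 0.60, 0.70])
```

Note how the most accurate classifier (weight 0.90/2.20 ≈ 0.41) can override the other two only when they disagree with each other; when both oppose it, their combined weight wins.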
4. Classify breast DCE-MRI images with the fusion classifier to obtain their grade.
The above description is only an embodiment of the present invention and is not intended to limit its scope; all equivalent structures or equivalent process transformations made using the contents of the specification and drawings, whether applied directly or indirectly in other related systems, are included in the scope of the present invention.
Claims (8)
1. A DCE-MRI based breast cancer grading method comprising the steps of:
S1: extracting sample data from at least one breast DCE-MRI image training sample, the sample data comprising kinetic-parameter features and texture features;
S2: constructing a fusion classifier
linearly combining a plurality of strong classifiers by a weighted voting method to obtain the fusion classifier, wherein each strong classifier is obtained by training on the sample data;
S3: classifying breast DCE-MRI images with the fusion classifier to obtain their grade.
2. The DCE-MRI-based breast cancer classification method according to claim 1, wherein each strong classifier is obtained by applying a boosting ensemble algorithm to the corresponding base classifier.
3. The DCE-MRI-based breast cancer classification method according to claim 2, wherein the base classifiers are an SVM classifier, a decision tree classifier and an ELM classifier.
4. The DCE-MRI-based breast cancer classification method according to claim 1, wherein a threshold δ is preset in the fusion classifier: when y > δ the decision result is class 1, and when y < δ the decision result is class 2, where y is the fused output of the plurality of strong classifiers.
5. The DCE-MRI-based breast cancer classification method according to claim 2, wherein obtaining the strong classifier with the boosting ensemble algorithm specifically comprises:
assigning initial weights to the sample data;
training a base classifier on the weighted sample data with a classification algorithm;
determining the weight of the base classifier and the sample weights from the base classifier's classification accuracy, using the sample weights as the training-sample weights of the next-stage base classifier, and iterating until the desired number of base classifiers is reached;
and voting among the base classifiers according to their weights to obtain the strong classifier.
6. The DCE-MRI based breast cancer classification method according to any one of claims 1 to 5, wherein the kinetic parameter features comprise semi-quantitative parameters and quantitative parameters.
7. The DCE-MRI based breast cancer classification method according to claim 6, wherein step S1 specifically comprises:
obtaining the semi-quantitative parameters from a signal intensity-time curve of the breast DCE-MRI image;
inputting the breast DCE-MRI image into a reference region model to obtain the quantitative parameters;
extracting the texture features from a lesion region of the breast DCE-MRI image by one of Laws texture features, the gray-level co-occurrence matrix, the gray-level run-length matrix, the gray-level size zone matrix and the gray-level difference matrix;
screening the extracted kinetic parameters and texture features to obtain the sample data.
8. The DCE-MRI based breast cancer classification method according to claim 5, wherein abnormal sample data in the boosting ensemble algorithm are identified with a K-nearest neighbor algorithm.
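The weighted-voting fusion and threshold decision of claims 1 and 4 can be sketched as follows. This is a minimal illustration, not the patented implementation: the classifier scores, weights, and the threshold value `delta` are all hypothetical, and the patent does not specify how the strong-classifier outputs are scaled.

```python
import numpy as np

def fuse_predictions(scores, weights, delta=0.5):
    """Weighted-voting fusion of several strong classifiers (illustrative).

    scores  : (n_classifiers, n_samples) array of per-classifier outputs
    weights : (n_classifiers,) array of classifier voting weights
    delta   : decision threshold -- fused output y > delta -> class 1, else class 2
    """
    scores = np.asarray(scores, dtype=float)
    weights = np.asarray(weights, dtype=float)
    y = weights @ scores / weights.sum()   # linear combination of classifier outputs
    return np.where(y > delta, 1, 2)       # class 1 above the threshold, class 2 otherwise

# Three strong classifiers voting on two samples (made-up scores and weights)
preds = fuse_predictions([[0.9, 0.2], [0.8, 0.4], [0.3, 0.1]],
                         weights=[0.5, 0.3, 0.2], delta=0.5)
```

Normalizing by the weight sum keeps the fused output on the same scale as the individual scores, so a single threshold δ remains meaningful as classifiers are added or removed.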
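The boosting loop of claim 5 (initial sample weights, per-round base-classifier weight from accuracy, sample re-weighting, weighted vote) can be sketched in AdaBoost style. This is illustrative only: the patent does not fix the exact weight-update rule or base learner, so 1-D threshold "stumps" and the standard AdaBoost update are used here as assumptions.

```python
import numpy as np

def train_strong_classifier(X, y, n_rounds=3):
    """AdaBoost-style sketch of the claim-5 boosting loop (illustrative).
    X: 1-D feature array; y: labels in {-1, +1}.
    Base classifiers are threshold stumps sign(polarity * (x - threshold))."""
    n = len(X)
    w = np.full(n, 1.0 / n)                        # initial sample weights
    base, alphas = [], []
    for _ in range(n_rounds):
        # pick the stump with the lowest weighted error under the current weights
        candidates = [(t, p) for t in X for p in (1.0, -1.0)]
        errs = [w[np.sign(p * (X - t) + 1e-12) != y].sum() for t, p in candidates]
        t, p = candidates[int(np.argmin(errs))]
        pred = np.sign(p * (X - t) + 1e-12)
        err = max(min(errs), 1e-12)
        alpha = 0.5 * np.log((1.0 - err) / err)    # base-classifier weight from accuracy
        w = w * np.exp(-alpha * y * pred)          # re-weight samples for the next round
        w = w / w.sum()
        base.append((t, p)); alphas.append(alpha)
    return base, alphas

def strong_predict(X, base, alphas):
    """Weighted vote of the base classifiers -> strong-classifier decision."""
    votes = sum(a * np.sign(p * (np.asarray(X, float) - t) + 1e-12)
                for (t, p), a in zip(base, alphas))
    return np.sign(votes)
```

Misclassified samples gain weight each round, so the next-stage base classifier concentrates on the examples its predecessors got wrong, which is the mechanism the claim's "sample weight is used as the weight of a training sample of the next-stage base classifier" describes.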
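Claim 7 derives semi-quantitative parameters from the signal intensity-time curve. A sketch of common curve descriptors (peak enhancement, time to peak, wash-in and wash-out slopes) is shown below; their exact definitions vary across the DCE-MRI literature and are not fixed by the patent, and the curve values are invented for illustration.

```python
import numpy as np

def semiquantitative_params(t, s):
    """Common semi-quantitative parameters of a DCE-MRI signal intensity-time
    curve (illustrative definitions).
    t : acquisition times; s : signal intensity at each time point."""
    t, s = np.asarray(t, float), np.asarray(s, float)
    s0 = s[0]                                    # baseline (pre-contrast) intensity
    i_peak = int(np.argmax(s))
    peak_enh = (s[i_peak] - s0) / s0             # relative peak enhancement
    ttp = t[i_peak] - t[0]                       # time to peak
    wash_in = (s[i_peak] - s0) / max(ttp, 1e-9)  # wash-in slope
    # wash-out slope from the peak to the last time point (0 if the peak is last)
    wash_out = ((s[-1] - s[i_peak]) / (t[-1] - t[i_peak])
                if i_peak < len(t) - 1 else 0.0)
    return {"peak_enhancement": peak_enh, "time_to_peak": ttp,
            "wash_in": wash_in, "wash_out": wash_out}

# Hypothetical curve: rapid enhancement followed by wash-out
params = semiquantitative_params([0, 60, 120, 180, 240],
                                 [100, 180, 260, 240, 220])
```

A negative wash-out slope (signal falling after the peak) is the curve shape conventionally associated with more suspicious lesions, which is why these descriptors are useful classifier features.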
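Claim 8 flags abnormal samples with a K-nearest-neighbor algorithm. One common kNN outlier rule, sketched below, scores each sample by its mean distance to its k nearest neighbors and flags the largest scores; the patent does not specify the exact variant, and the quantile cutoff here is an assumption.

```python
import numpy as np

def knn_outliers(X, k=3, quantile=0.95):
    """Flag abnormal samples by their mean distance to the k nearest
    neighbors (one common kNN outlier rule; illustrative only).
    Returns a boolean mask: True marks a suspected abnormal sample."""
    X = np.asarray(X, float)
    # pairwise Euclidean distances between all samples
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # a point is not its own neighbor
    knn_dist = np.sort(d, axis=1)[:, :k].mean(axis=1)
    return knn_dist > np.quantile(knn_dist, quantile)

# A tight cluster plus one far-away point: only the far point is flagged
X = [[0, 0], [0, 1], [1, 0], [1, 1], [10, 10]]
mask = knn_outliers(X, k=2, quantile=0.8)
```

Removing (or down-weighting) the flagged samples before the boosting loop prevents AdaBoost-style re-weighting from fixating on mislabeled or noisy training examples.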
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011620218.9A CN112651951A (en) | 2020-12-30 | 2020-12-30 | DCE-MRI-based breast cancer classification method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112651951A true CN112651951A (en) | 2021-04-13 |
Family
ID=75366649
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011620218.9A Pending CN112651951A (en) | 2020-12-30 | 2020-12-30 | DCE-MRI-based breast cancer classification method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112651951A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113421633A (en) * | 2021-06-25 | 2021-09-21 | 上海联影智能医疗科技有限公司 | Feature classification method, computer device, and storage medium |
CN117982106A (en) * | 2024-04-02 | 2024-05-07 | 天津市肿瘤医院(天津医科大学肿瘤医院) | MRI image-based breast cancer chemotherapy curative effect prediction system and method |
WO2024108396A1 (en) * | 2022-11-22 | 2024-05-30 | 深圳先进技术研究院 | Method and apparatus for measuring hemodynamic heterogeneity and computer device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102247144A (en) * | 2011-04-18 | 2011-11-23 | 大连理工大学 | Time intensity characteristic-based computer aided method for diagnosing benign and malignant breast lesions |
CN104715260A (en) * | 2015-03-05 | 2015-06-17 | 中南大学 | Multi-modal fusion image sorting method based on RLS-ELM |
CN109300121A (en) * | 2018-09-13 | 2019-02-01 | 华南理工大学 | A kind of construction method of cardiovascular disease diagnosis model, system and the diagnostic model |
CN109816010A (en) * | 2019-01-21 | 2019-05-28 | 北京工业大学 | A kind of CART increment study classification method based on selective ensemble for flight delay prediction |
CN110706218A (en) * | 2019-09-26 | 2020-01-17 | 四川大学 | Breast tumor positioning analysis method based on dynamic enhanced magnetic resonance imaging |
CN111028206A (en) * | 2019-11-21 | 2020-04-17 | 万达信息股份有限公司 | Prostate cancer automatic detection and classification system based on deep learning |
CN111667460A (en) * | 2020-04-30 | 2020-09-15 | 清华大学 | MRI image processing system, method, apparatus and medium |
CN111832563A (en) * | 2020-07-17 | 2020-10-27 | 江苏大学附属医院 | Intelligent breast tumor identification method based on ultrasonic image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||