CN108960341B - Brain network-oriented structural feature selection method - Google Patents

Brain network-oriented structural feature selection method

Info

Publication number
CN108960341B
CN108960341B CN201810818259.5A
Authority
CN
China
Prior art keywords
network
brain
feature
term
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810818259.5A
Other languages
Chinese (zh)
Other versions
CN108960341A (en)
Inventor
接标
王咪
卞维新
丁新涛
左开中
方群
罗永龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Normal University
Original Assignee
Anhui Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Normal University filed Critical Anhui Normal University
Priority to CN201810818259.5A priority Critical patent/CN108960341B/en
Publication of CN108960341A publication Critical patent/CN108960341A/en
Application granted granted Critical
Publication of CN108960341B publication Critical patent/CN108960341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211Selection of the most significant subset of features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

A structured feature selection method for brain networks is provided. For complex brain network data, local network measures are generally used as feature vectors for subsequent feature selection and classification, while the topological structure information inherent in the network is ignored, which degrades the performance of network analysis. To address this, the method comprises two regularization terms: one is a sparse regularization term containing an L1-norm regularizer, which ensures that only a small number of discriminative network features are selected; the other is a Laplacian regularization term, which preserves the overall distribution information of the brain network data and computes the similarity between network data with a graph kernel, thereby preserving the topological structure information of the brain network data. Experimental results on two real brain disease data sets show that, compared with existing methods, the proposed method achieves better performance on brain disease classification.

Description

Brain network-oriented structural feature selection method
Technical Field
The invention belongs to the field of machine learning and medical image analysis, and particularly relates to a brain network-oriented structural feature selection method.
Background
Modern magnetic resonance imaging (MRI) techniques, including functional MRI, provide a non-invasive way to explore the human brain and reveal mechanisms of brain structure and function that could not be uncovered before. Brain network analysis can depict interactions between brain regions at the connectivity level and has become a new research hotspot in medical image analysis and neuroimaging.
More recently, machine learning methods have been applied to the analysis and classification of brain networks. For example, researchers have exploited brain networks for early brain disease diagnosis and classification and achieved very good performance. In these studies, local measures of the brain (e.g., clustering coefficients) are typically extracted from the brain network as features for disease classification, and feature selection is used to filter out redundant and unimportant features, thereby improving classification performance. For example, Chen et al. used edge weights as features for classifying AD (Alzheimer's disease) and MCI (mild cognitive impairment). Wee et al. extracted clustering coefficients from functional brain networks as features for MCI classification. Zanin et al. used 16 network measures as features for classifying MCI versus normal controls. Since local measures only characterize the local structure of the network, the overall topology of the network is lost during classification, which may degrade classification performance.
In brain network analysis, the two most frequently used feature selection methods are the t-test method and the Lasso method. The t-test method first measures the discriminative power of each feature with a standard t-test, ranks the features accordingly, and finally selects the most discriminative subset. Previous studies have shown that the t-test method generally performs well on small samples. Unlike the t-test method, the Lasso method performs feature selection by minimizing an objective function, and studies have shown that it works well when there are many uncorrelated features but only a small number of samples. At present, most feature selection methods are designed for vector data and cannot be used directly on complex structured data such as brain network data.
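For illustration only, a minimal sketch of these two vector-based baselines is given below (Python with NumPy, SciPy and scikit-learn assumed; the function names, the number of kept features k and the Lasso parameter alpha are hypothetical choices, not part of the invention):

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import Lasso

def ttest_feature_selection(X, y, k):
    """Rank features by a two-sample t-test and keep the k most discriminative ones."""
    pos, neg = X[y == 1], X[y == -1]
    t_vals, _ = stats.ttest_ind(pos, neg, axis=0)
    return np.argsort(-np.abs(t_vals))[:k]          # indices with the largest |t| first

def lasso_feature_selection(X, y, alpha=0.1):
    """Keep the features whose Lasso coefficients are non-zero."""
    model = Lasso(alpha=alpha).fit(X, y)
    return np.flatnonzero(model.coef_)
```

Both operate on a plain feature matrix, which is exactly the limitation discussed here: neither takes the topology of the underlying brain networks into account.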
Feature selection can not only improve classifier performance but also help identify disease-sensitive biomarkers. Existing methods typically extract local measures (such as edge weights or clustering coefficients) from the network data as features and concatenate them into a long feature vector for subsequent feature selection and classification, so that useful network structure information (such as the overall topology of the network) is lost, which may degrade the final classification performance. To address this problem, a graph-kernel based structured feature selection method (referred to as gk-SFS) is proposed herein for feature selection on structured data. Unlike existing methods, the gk-SFS method preserves both the overall distribution information of the network data and the topological structure information of the networks themselves.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a brain network-oriented structural feature selection method. The proposed gk-SFS method first uses an L1-norm sparsity term to ensure that only a small number of discriminative features are selected. A Laplacian regularization term is further used to preserve the overall distribution information of the network data, where the similarity between network data is computed with a graph kernel so that the topological structure information of the network data itself is preserved. Finally, the Accelerated Proximal Gradient (APG) algorithm is used to optimize the proposed model.
In order to achieve the purpose, the invention adopts the following technical scheme:
a brain network-oriented structured feature selection method is characterized by comprising the following steps:
firstly, performing data preprocessing on a brain data set, and constructing a functional brain network by using a Pearson correlation coefficient;
step two, establishing the objective function of the brain network-oriented gk-SFS structural feature selection method;
step three, introducing a regularization term into the objective function to preserve the overall distribution information among samples;
step four, computing the similarity of the network data with a graph kernel so as to preserve the topological structure information of the network data;
and fifthly, optimizing the objective function with the accelerated proximal gradient algorithm.
In order to optimize the technical scheme, the specific measures adopted further comprise:
in the first step, a full-connection network graph with weights is constructed, and a given threshold value is used for converting a weight network into a binary network for depicting a topological structure; and then extracting local clustering coefficients from each brain region to be used as features for reducing feature dimensions, wherein the features from all the brain regions form a feature vector together.
In the second step, the feature matrix $X = [x_1, x_2, \ldots, x_N] \in \mathbb{R}^{N \times d}$ extracted from the training sample set is given, where $x_i$ denotes the feature vector of the $i$-th sample, $i = 1, \ldots, N$, $N$ is the number of training samples, and $d$ is the feature dimension;
let $Y = [y_1, y_2, \ldots, y_N] \in \mathbb{R}^{N}$ denote the label vector, where $y_i$ is the class label of the $i$-th sample; for a binary classification problem, $y_i \in \{+1, -1\}$;
The optimized objective function of the gk-SFS feature selection method is as follows:

$$\min_{w}\;\frac{1}{2}\|Y - Xw\|_2^2 + \lambda\|w\|_1 + \beta\,w^T X^T C X w$$

where the matrix $C$ is the Laplacian matrix, $w$ is the projection vector, and $\lambda$ and $\beta$ are two regularization parameters. The objective function contains three terms: the first term is the loss function, for which a squared loss is adopted; the second term is a sparse regularization term, for which the L1 norm is adopted to select discriminative features; and the third term is the Laplacian regularization term, which preserves the distribution information of the network data as a whole and the structure information of the networks.
In the third step, the following regularization term is introduced:

$$\frac{1}{2}\sum_{i,j=1}^{N} M_{ij}\,\bigl(g(x_i) - g(x_j)\bigr)^2 = w^T X^T C X w$$

where $g(x_i) = w^T x_i$ is a linear mapping function, $C = D - M$ is the Laplacian matrix, $M = [M_{ij}]$ is a measurement matrix that defines the similarity between samples, and $D$ is a diagonal matrix whose diagonal elements are $D_{jj} = \sum_{i=1}^{N} M_{ij}$, $j = 1, \ldots, N$.
In the fourth step, a graph kernel is used to define the similarity between two networks; that is, for any two networks $G_i$ and $G_j$, the similarity matrix $M$ is defined as follows:

$$M_{ij} = k(G_i, G_j)$$

where $k$ is a kernel function; the Weisfeiler-Lehman subtree kernel is used to construct the corresponding graph kernel.
The invention has the beneficial effects that: the overall structural information of the network data is preserved, the topological structure information of the network data itself is preserved, and experimental results on two real brain disease data sets (an attention deficit hyperactivity disorder data set and an Alzheimer's disease data set) show that, compared with existing methods, the proposed method achieves better performance on brain disease classification.
Drawings
FIG. 1 shows the classification accuracy as a function of the regularization parameters λ and β on the three classification tasks: FIG. 1a shows the ADHD vs. NC classification; FIG. 1b shows the lMCI vs. eMCI classification; FIG. 1c shows the eMCI vs. NC classification.
Detailed Description
The present invention will now be described in further detail with reference to the accompanying drawings.
The invention specifically adopts the following technical scheme:
given a training sample set X ═ X1,x2…,xN]∈RN*dWherein x isiA feature vector representing the ith sample (e.g., a feature vector formed by local measurements extracted from each network data), i ═ 1. Let Y be [ Y1,y2…,yN]∈RNRepresents a vector in which yiClass labels representing samples, classifying two classes of problems, i.e. yiE { +1, -1} (e.g.: 1 represents patient, -1 represents normal person).
In order to preserve the overall distribution information among samples, the following regularization term is introduced:

$$\frac{1}{2}\sum_{i,j=1}^{N} M_{ij}\,\bigl(g(x_i) - g(x_j)\bigr)^2 = w^T X^T C X w \qquad (1)$$

where $g(x_i) = w^T x_i$ is a linear mapping function, $C = D - M$ is the Laplacian matrix, $M = [M_{ij}]$ is a measurement matrix that defines the similarity between samples, and $D$ is a diagonal matrix whose diagonal elements are $D_{jj} = \sum_{i=1}^{N} M_{ij}$, $j = 1, \ldots, N$.
The regularization term in equation (1) preserves the graph structure information among samples in the original space.
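A small sketch of how the regularization term of equation (1) could be evaluated, assuming the symmetric similarity matrix M between samples has already been computed (for brain networks, via the graph kernel introduced next); NumPy is assumed and the function name is hypothetical:

```python
import numpy as np

def laplacian_regularizer(X, w, M):
    """Compute w^T X^T C X w with C = D - M and D_jj = sum_i M_ij."""
    D = np.diag(M.sum(axis=0))
    C = D - M                       # graph Laplacian of the sample-similarity graph
    g = X @ w                       # g(x_i) = w^T x_i for all samples at once
    return float(g @ C @ g)         # equals 0.5 * sum_ij M_ij * (g_i - g_j)^2 for symmetric M
```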
Further, in order to preserve the topological structure information of the network data itself, a graph kernel is used to directly define the (local and global) similarity between two network data; that is, for any two networks $G_i$ and $G_j$, the similarity matrix $M$ is defined as follows:

$$M_{ij} = k(G_i, G_j) \qquad (2)$$

where $k$ is a kernel function; in this study, the Weisfeiler-Lehman subtree kernel is used to construct the corresponding graph kernel:

Given two graphs $G_i$ and $G_j$, let $L_0$ denote the initial label set of $G_i$ and $G_j$ (if the nodes are unlabeled, the node degree is used as the node label), and let $L_m$ denote the label set of $G_i$ and $G_j$ at the $m$-th isomorphism test. The Weisfeiler-Lehman subtree kernel is then defined as follows:
$$k(G_i, G_j) = \langle \phi(G_i), \phi(G_j) \rangle \qquad (3)$$

$$\phi(G_i) = \bigl(\sigma_0(G_i, s_{01}), \ldots, \sigma_0(G_i, s_{0|L_0|}), \ldots, \sigma_h(G_i, s_{h1}), \ldots, \sigma_h(G_i, s_{h|L_h|})\bigr)$$

$$\phi(G_j) = \bigl(\sigma_0(G_j, s_{01}), \ldots, \sigma_0(G_j, s_{0|L_0|}), \ldots, \sigma_h(G_j, s_{h1}), \ldots, \sigma_h(G_j, s_{h|L_h|})\bigr)$$
where $h$ denotes the maximum number of iterations, $s_{mn}$ denotes the $n$-th label in the label set $L_m$, and $\sigma_m(G_i, s_{mn})$ and $\sigma_m(G_j, s_{mn})$ denote the number of occurrences of label $s_{mn}$ in graphs $G_i$ and $G_j$, respectively, with $n = 1, \ldots, |L_m|$.
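The following is an illustrative, simplified Weisfeiler-Lehman subtree kernel consistent with these definitions (NetworkX assumed; node degrees initialize the labels as described above; label compression is done by string concatenation for clarity rather than efficiency — a sketch, not the patented implementation):

```python
from collections import Counter
import networkx as nx

def _relabel(G, labels):
    """One Weisfeiler-Lehman iteration: new label = own label + sorted neighbour labels."""
    return {v: labels[v] + "|" + ",".join(sorted(labels[u] for u in G[v])) for v in G}

def wl_subtree_kernel(G1, G2, h=3):
    """k(G1, G2) = sum over m = 0..h of sum_s sigma_m(G1, s) * sigma_m(G2, s)."""
    lab1 = {v: str(G1.degree(v)) for v in G1}   # unlabeled nodes: use the degree as label
    lab2 = {v: str(G2.degree(v)) for v in G2}
    kernel = 0.0
    for _ in range(h + 1):
        c1, c2 = Counter(lab1.values()), Counter(lab2.values())
        kernel += sum(c1[s] * c2[s] for s in set(c1) | set(c2))
        lab1, lab2 = _relabel(G1, lab1), _relabel(G2, lab2)
    return kernel
```

The similarity matrix M in equation (2) would then be obtained by evaluating such a kernel for every pair of training networks.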
Finally, the objective function of the graph-kernel based structured feature selection method is defined as follows:

$$\min_{w}\;\frac{1}{2}\|Y - Xw\|_2^2 + \lambda\|w\|_1 + \beta\,w^T X^T C X w \qquad (4)$$

where the matrix $C$ is the Laplacian matrix constructed from the similarity matrix $M$ of equations (1) and (2), and $w$ is the projection vector. The objective function contains three terms: the first term is a squared loss function; the second term is a sparse regularization term, in which the L1 norm is used to produce a sparse solution in the feature space so that only the features corresponding to the non-zero elements of $w$ are retained; and the third term is the Laplacian regularization term, which preserves the spatial distribution information of the network data as a whole and the topological structure information of the network data themselves. $\lambda$ and $\beta$ are two regularization parameters that balance the relative contributions of the three terms; their values are determined by cross-validation on the training data.
The objective function defined by equation (4) is optimized with the widely used accelerated proximal gradient algorithm. The effectiveness of the method was demonstrated on two published fMRI datasets, namely the ADHD (Attention Deficit Hyperactivity Disorder) dataset and the ADNI (Alzheimer's Disease Neuroimaging Initiative) dataset.
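A minimal FISTA-style sketch of such an accelerated proximal gradient solver for equation (4) is given below (NumPy assumed; the fixed step size and iteration count are illustrative simplifications — in practice the step size would be set from the Lipschitz constant or by line search):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gk_sfs_apg(X, Y, C, lam, beta, step=1e-3, n_iter=500):
    """Minimize 0.5*||Y - Xw||^2 + lam*||w||_1 + beta*w^T X^T C X w."""
    d = X.shape[1]
    w, z, t = np.zeros(d), np.zeros(d), 1.0
    XtX, XtY, XtCX = X.T @ X, X.T @ Y, X.T @ C @ X
    for _ in range(n_iter):
        grad = XtX @ z - XtY + 2.0 * beta * (XtCX @ z)   # gradient of the smooth part
        w_new = soft_threshold(z - step * grad, step * lam)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)    # Nesterov momentum step
        w, t = w_new, t_new
    return w        # non-zero entries indicate the selected features
```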
The technical scheme of the invention is further explained in detail by combining the embodiment as follows:
one embodiment of the present invention, enumerates the evaluation of the effectiveness of the proposed method on two published fMRI datasets. Table 1 gives the characteristics of these data sets.
TABLE 1 statistical information of samples of two data sets
MMSE = Mini-Mental State Examination
For the ADHD dataset, the time-series data already preprocessed by the NYU (New York University) site were used; the preprocessing details are available at http://www.nitrc.org/plugins/mwiki/index.php/neurobureau:AthenaPipeline. In the preprocessed data, the brain is divided into 90 brain regions according to the AAL (Automated Anatomical Labeling) atlas, each brain region contains 172 time points, and the Pearson correlation coefficient is used to construct the functional brain network.
For the ADNI dataset, a standard preprocessing pipeline was used, including slice timing correction and head motion correction. Image preprocessing was done using SPM8 (Statistical Parametric Mapping software package, http://www.fil.ion.ucl.ac.uk/spm). For each sample, the first 10 fMRI volumes were discarded to ensure magnetization equilibrium. The remaining images were first corrected for inter-slice acquisition time delay and then for head motion to eliminate its effects. Since the ventricles and white matter (WM) regions contain relatively high noise, the blood-oxygen-level-dependent (BOLD) signal was extracted from gray matter (GM) to construct the functional connectivity network, and the GM tissue of each sample was further used to mask the corresponding fMRI images in order to eliminate the possible effects of WM and CSF. The first scan of the fMRI time series was registered to the T1-weighted image of the same sample, and the estimated transformation was applied to the other volumes of that sample. The corrected fMRI images were then registered to the same template space using the deformable registration method HAMMER and divided into 90 regions of interest (ROIs) using the AAL template. Finally, for each ROI, the mean fMRI time series over all its voxels was taken as the time series of that ROI. The Pearson correlation coefficient was again used to construct the functional brain networks.
Since the constructed network is a fully connected weighted graph, in order to characterize its topology, the weighted network of each sample is first thresholded with a given threshold and converted into a binary network. Then, to reduce the feature dimension, a local clustering coefficient is extracted from each brain region as a feature, following the literature, and the features from all brain regions of each sample are concatenated into a feature vector. Finally, feature selection is performed using the proposed gk-SFS method. In the classification step, the widely used support vector machine (SVM) technique is used for classification.
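Putting the pieces together, a hedged sketch of the selection-plus-classification step (scikit-learn assumed; gk_sfs_apg refers to the illustrative solver sketched earlier; the parameter values shown are placeholders that would be chosen by cross-validation as described above):

```python
import numpy as np
from sklearn.svm import SVC

def select_and_classify(X_train, y_train, X_test, y_test, C, lam=0.1, beta=0.1):
    """gk-SFS-style selection on the training data, then a linear SVM on the kept features."""
    w = gk_sfs_apg(X_train, y_train, C, lam, beta)       # illustrative solver sketched above
    selected = np.flatnonzero(np.abs(w) > 1e-6)          # indices of the selected features
    if selected.size == 0:                               # degenerate case: keep all features
        selected = np.arange(X_train.shape[1])
    clf = SVC(kernel="linear").fit(X_train[:, selected], y_train)
    return clf.score(X_test[:, selected], y_test), selected
```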
Tables 2, 3 and 4 summarize the experimental results of all methods on the three classification tasks, respectively. A method without feature selection (i.e., using all clustering-coefficient features extracted from the network) serves as the baseline for comparison. As can be seen from Tables 2-4, the proposed method outperforms the comparison methods in both classification accuracy and AUC on both datasets. Specifically, on the ADHD dataset, the proposed method obtained a classification accuracy of 63.0% and an AUC value of 0.66, whereas the best comparison method obtained a classification accuracy of 61.6% and an AUC value of 0.95; on the ADNI dataset, the proposed method yielded classification accuracies of 68.4% and 71.7% on the two classification tasks lMCI vs. eMCI and eMCI vs. NC, respectively, while the best results of the comparison methods were 61.2% and 67.1%, respectively. Furthermore, the AUC values obtained by the proposed method were 0.74 for both classification tasks, while the best results of the comparison methods were 0.63 and 0.69, respectively. These results show that the proposed method can preserve the overall distribution information of the network data and the topological structure information of the networks themselves, thereby inducing more discriminative features. In addition, compared with the baseline method, the feature selection methods (t-test, Lasso and gk-SFS) significantly improve the classification performance on brain networks, which suggests the importance of feature selection.
Table 2 ADHD vs. NC classification performance
Table 3 lMCI vs. eMCI classification performance
Table 4 eMCI vs. NC classification performance
As can be seen from FIG. 1, the classification performance is better for most settings with β > 0 than with β = 0, which indicates the importance of the Laplacian regularization term and suggests the effectiveness of the proposed method. In addition, when λ is fixed and β varies, the color changes smoothly (i.e., the classification performance changes little), indicating that the proposed method is robust to the β parameter. When β is fixed and λ varies, the color changes noticeably, suggesting that gk-SFS is sensitive to λ. This is reasonable, since λ controls the sparsity term and thus determines the number of selected features.
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above-mentioned embodiments, and all technical solutions belonging to the idea of the present invention belong to the protection scope of the present invention. It should be noted that modifications and embellishments within the scope of the invention may be made by those skilled in the art without departing from the principle of the invention.

Claims (4)

1. A brain network-oriented structured feature selection method is characterized by comprising the following steps:
firstly, performing data preprocessing on a brain data set, and constructing a functional brain network by using a Pearson correlation coefficient;
establishing the objective function of the brain network-oriented graph-kernel based structural feature selection method; in the second step, the feature matrix $X = [x_1, x_2, \ldots, x_N] \in \mathbb{R}^{N \times d}$ extracted from the training sample set is given, where $x_i$ denotes the feature vector of the $i$-th sample, $i = 1, \ldots, N$, $N$ is the number of training samples, and $d$ is the feature dimension;
let $Y = [y_1, y_2, \ldots, y_N] \in \mathbb{R}^{N}$ denote the label vector, where $y_i$ is the class label of the $i$-th sample; for a binary classification problem, $y_i \in \{+1, -1\}$;
the optimized objective function of the gk-SFS feature selection method is:

$$\min_{w}\;\frac{1}{2}\|Y - Xw\|_2^2 + \lambda\|w\|_1 + \beta\,w^T X^T C X w$$

wherein the matrix $C$ is the Laplacian matrix, $w$ is the projection vector, and $\lambda$ and $\beta$ are two regularization parameters; the objective function contains three terms: the first term is the loss function, for which a squared loss is adopted; the second term is a sparse regularization term, for which the L1 norm is adopted to select discriminative features; and the third term is the Laplacian regularization term, which preserves the distribution information of the network data as a whole and the structure information of the networks;
step three, introducing a regularization term into the objective function to preserve the overall distribution information among samples;
step four, computing the similarity of the network data with a graph kernel so as to preserve the topological structure information of the network data;
and fifthly, optimizing the objective function by using the accelerated proximal gradient algorithm.
2. The method for selecting structured features oriented to brain networks according to claim 1, wherein: in the first step, a fully connected weighted network graph is constructed, and a given threshold is used to convert the weighted network into a binary network that characterizes the topological structure; then, to reduce the feature dimension, a local clustering coefficient is extracted from each brain region as a feature, and the features from all brain regions are concatenated into a feature vector.
3. The method for selecting the structural features oriented to the brain network according to claim 2, wherein: in the third step, the following regularization term is introduced:

$$\frac{1}{2}\sum_{i,j=1}^{N} M_{ij}\,\bigl(g(x_i) - g(x_j)\bigr)^2 = w^T X^T C X w$$

wherein $g(x_i) = w^T x_i$ is a linear mapping function, $C = D - M$ is the Laplacian matrix, $M = [M_{ij}]$ is a measurement matrix that defines the similarity between samples, and $D$ is a diagonal matrix whose diagonal elements are $D_{jj} = \sum_{i=1}^{N} M_{ij}$, $j = 1, \ldots, N$.
4. A brain network-oriented structured feature selection method according to claim 3, wherein: in the fourth step, a graph kernel is used to define the similarity between two networks; that is, for any two networks $G_i$ and $G_j$, the similarity matrix $M$ is defined as follows:

$$M_{ij} = k(G_i, G_j)$$

where $k$ is a kernel function; the Weisfeiler-Lehman subtree kernel is used to construct the corresponding graph kernel.
CN201810818259.5A 2018-07-23 2018-07-23 Brain network-oriented structural feature selection method Active CN108960341B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810818259.5A CN108960341B (en) 2018-07-23 2018-07-23 Brain network-oriented structural feature selection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810818259.5A CN108960341B (en) 2018-07-23 2018-07-23 Brain network-oriented structural feature selection method

Publications (2)

Publication Number Publication Date
CN108960341A CN108960341A (en) 2018-12-07
CN108960341B true CN108960341B (en) 2022-03-01

Family

ID=64464101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810818259.5A Active CN108960341B (en) 2018-07-23 2018-07-23 Brain network-oriented structural feature selection method

Country Status (1)

Country Link
CN (1) CN108960341B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11443136B2 (en) * 2019-03-20 2022-09-13 Tata Consultancy Services Limited System and method for signal pre-processing based on data driven models and data dependent model transformation
CN113177604B (en) * 2021-05-14 2024-04-16 东北大学 High-dimensional data feature selection method based on improved L1 regularization and clustering
CN113283518B (en) * 2021-06-01 2024-02-13 常州大学 Multi-mode brain network feature selection method based on clustering
CN113516186B (en) * 2021-07-12 2024-01-30 聊城大学 Modularized feature selection method for brain disease classification
CN113920123B (en) * 2021-12-16 2022-03-15 中国科学院深圳先进技术研究院 Addictive brain network analysis method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8874432B2 (en) * 2010-04-28 2014-10-28 Nec Laboratories America, Inc. Systems and methods for semi-supervised relationship extraction

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127769A (en) * 2016-06-22 2016-11-16 南京航空航天大学 A kind of brain Forecasting Methodology in age connecting network based on brain
CN107133651A (en) * 2017-05-12 2017-09-05 太原理工大学 The functional magnetic resonance imaging data classification method of subgraph is differentiated based on super-network
CN107909117A (en) * 2017-09-26 2018-04-13 电子科技大学 A kind of sorting technique and device based on brain function network characterization to early late period mild cognitive impairment
CN107944490A (en) * 2017-11-22 2018-04-20 中南大学 A kind of image classification method based on half multi-modal fusion feature reduction frame
CN108229066A (en) * 2018-02-07 2018-06-29 北京航空航天大学 A kind of Parkinson's automatic identifying method based on multi-modal hyper linking brain network modelling

Also Published As

Publication number Publication date
CN108960341A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
CN108960341B (en) Brain network-oriented structural feature selection method
CN109409416B (en) Feature vector dimension reduction method, medical image identification method, device and storage medium
CN110298364B (en) Multi-threshold-value multi-task-based feature selection method for functional brain network
Mahapatra Analyzing training information from random forests for improved image segmentation
Ahirwar Study of techniques used for medical image segmentation and computation of statistical test for region classification of brain MRI
Doshi et al. Multi-atlas skull-stripping
Feng et al. Image segmentation and bias correction using local inhomogeneous iNtensity clustering (LINC): A region-based level set method
Ortiz et al. Two fully-unsupervised methods for MR brain image segmentation using SOM-based strategies
Song et al. A modified probabilistic neural network for partial volume segmentation in brain MR image
Kabade et al. Segmentation of brain tumour and its area calculation in brain MR images using K-mean clustering and fuzzy C-mean algorithm
EP2483863B1 (en) Method and apparatus for processing medical images
Van Assen et al. A 3-D active shape model driven by fuzzy inference: application to cardiac CT and MR
CN109492668B (en) MRI (magnetic resonance imaging) different-phase multimode image characterization method based on multi-channel convolutional neural network
CN113616184A (en) Brain network modeling and individual prediction method based on multi-mode magnetic resonance image
CN110211671B (en) Thresholding method based on weight distribution
CN112862834B (en) Image segmentation method based on visual salient region and active contour
Mahapatra Automatic cardiac segmentation using semantic information from random forests
JP2023517058A (en) Automatic detection of tumors based on image processing
CN105512670B (en) Divided based on KECA Feature Dimension Reduction and the HRCT peripheral nerve of cluster
Ramasamy et al. Segmentation of brain tumor using deep learning methods: a review
Christe et al. Improved hybrid segmentation of brain MRI tissue and tumor using statistical features
Ramana Alzheimer disease detection and classification on magnetic resonance imaging (MRI) brain images using improved expectation maximization (IEM) and convolutional neural network (CNN)
Hu et al. A GLCM embedded CNN strategy for computer-aided diagnosis in intracerebral hemorrhage
CN112927235B (en) Brain tumor image segmentation method based on multi-scale superpixel and nuclear low-rank representation
Kasim et al. Gaussian mixture model-expectation maximization algorithm for brain images

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant