CN112927215A - Automatic analysis method for digestive tract biopsy pathological section - Google Patents

Automatic analysis method for digestive tract biopsy pathological section Download PDF

Info

Publication number
CN112927215A
CN112927215A CN202110281259.8A CN202110281259A CN112927215A CN 112927215 A CN112927215 A CN 112927215A CN 202110281259 A CN202110281259 A CN 202110281259A CN 112927215 A CN112927215 A CN 112927215A
Authority
CN
China
Prior art keywords
sub
matrix
region
image
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110281259.8A
Other languages
Chinese (zh)
Inventor
姜志国
郑钰山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Macaudi Xiamen Medical Big Data Co ltd
Original Assignee
Macaudi Xiamen Medical Big Data Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Macaudi Xiamen Medical Big Data Co ltd filed Critical Macaudi Xiamen Medical Big Data Co ltd
Priority to CN202110281259.8A priority Critical patent/CN112927215A/en
Publication of CN112927215A publication Critical patent/CN112927215A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30028Colon; Small intestine
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30092Stomach; Gastric
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic analysis method for a biopsy pathological section of a digestive tract, which comprises the following steps: dividing the pathological section image into a plurality of sub-regions; based on the feature extraction model, feature extraction is carried out on each subregion one by one to obtain a feature matrix of each subregion; classifying each feature vector in each feature matrix based on an image classification model to obtain a corresponding lesion probability matrix; performing distance quantization on each eigenvector in each characteristic matrix based on a distance transformation algorithm to obtain a distance quantization characteristic matrix; cascading the feature matrix of each sub-region and the distance quantization feature matrix to obtain a fusion feature matrix; generating an organizational chart of each subregion based on the fusion characteristic matrix and the lesion probability matrix of each subregion; and carrying out classification prediction on each organizational structure diagram based on the graph convolution network model to generate a diagnosis result of each sub-region. The invention can screen the biopsy pathology, automatically output the diagnosis conclusion and assist the doctor to accurately and efficiently finish the work.

Description

Automatic analysis method for digestive tract biopsy pathological section
Technical Field
The invention relates to the technical field of medical image processing, in particular to an automatic analysis method for a biopsy pathological section of a digestive tract.
Background
The gastrointestinal tract tissue biopsy is an important means for screening cancers in the gastrointestinal tract system such as gastric cancer, intestinal cancer and the like, and a pathologist finishes screening by checking microscopic sections of biopsy tissues one by one. However, the gastrointestinal endoscopy of the digestive tract has a plurality of cases, the diagnosis task of a pathologist is heavy, and misdiagnosis is easily caused by long-time high-intensity work. With the rapid development of computer and microscopic imaging technology, digital pathological images can be acquired conveniently and rapidly, and a computer automatic analysis algorithm suitable for digital pathological full-section images becomes a research hotspot in the direction in recent years.
Because the resolution of the digital pathological image is far higher than that of the natural scene image, the whole pathological image is difficult to directly process by a computer vision algorithm. In order to classify the whole slice, the existing algorithm mostly adopts a full-slice image blocking mode, firstly, a local area is classified, then, on the basis of a local prediction result, a specific strategy is formulated to complete the classification of the full slice, and the following three common strategies are provided:
1) whole-slice classification strategy based on Majority voting (Majority voting)
The strategy regards the classification result of the image blocks contained in the slice as a vote for the full slice category, and directly outputs the category with the largest number of votes as the category of the slice. The method is simple and intuitive, and when the image blocks which are decisive for diagnosis in the slice occupy the number dominance, the correct result can be obtained. However, in pathological diagnosis, the image blocks that determine the full-slice diagnosis result sometimes occupy no dominance in number, even the number only occupies less than 1% of the total number of the image blocks, and in such a case, the majority voting strategy is difficult to give the correct full-slice classification result.
2) Full-slice classification strategy based on convolutional neural network
The strategy arranges the image block classification result or the image block features into a three-dimensional tensor according to the position relation in the full slice, then trains a Convolutional Neural Network (CNN) network by taking the three-dimensional tensor as a sample and the class of the slice as a label to realize the classification of the full slice, as shown in FIG. 1. The method can effectively relieve the problem of unequal number of the image blocks in the strategy 1), but the strategy is limited by a CNN model, has poor adaptability to the pixel resolution and the length-width ratio of a full slice, and is difficult to meet the requirement of practical application.
3) Classification strategy based on key image block sampling
After the classification of the image blocks is completed, the strategy samples the image blocks according to a certain rule (for example, selecting the image blocks with the classification confidence coefficient higher than a threshold value T) so as to reduce the number of the image blocks with smaller decision-making effect or side effect; and then, establishing a multi-instance learning and other set classification model by means of the image block set obtained by sampling, so as to realize the classification of the full slice. The method is compared with the strategy 2) to increase the adaptability to the slice size, but the absolute position information of the image blocks in the full slice and the relative position information between the image blocks are abandoned in the sampling process, so that the classification precision is reduced.
In order to solve the problems of the existing full-section classification strategy, how to provide a method which can effectively screen the pathological screening of the gastrointestinal biopsy and improve the classification precision and automatically analyze the pathological section of the gastrointestinal biopsy based on the automatic classification method of the digital pathological full-section image computer becomes a problem which needs to be solved by the technicians in the field urgently.
Disclosure of Invention
In view of the above, the present invention provides an automatic analysis method for pathological sections of gastrointestinal biopsy, which can be used for pathological screening of gastrointestinal biopsy, automatically output a diagnosis conclusion, and assist a doctor to complete work accurately and efficiently.
In order to achieve the purpose, the invention adopts the following technical scheme:
an automatic analysis method for pathological section of digestive tract biopsy comprises the following steps:
dividing the pathological section image into a plurality of sub-regions, and diagnosing each sub-region one by one;
determining a current diagnosis sub-area a, extracting the characteristics of the current diagnosis sub-area a by matching with a sliding window method based on a pre-constructed characteristic extraction model to obtain a characteristic matrix F(a)
The feature matrix F is subjected to image classification model based on pre-construction(a)The feature vectors in the step (a) are classified to obtain a lesion probability matrix P(a)
Feature matrix F based on distance transformation algorithm(a)Performing distance quantization on each eigenvector to obtain a distance quantization characteristic matrix H(a)(ii) a The feature matrix F(a)And the distance quantization feature matrix H(a)Cascading to obtain a fusion characteristic matrix
Figure BDA0002978520590000035
Based on the fusion feature matrix
Figure BDA0002978520590000036
And the lesion probability matrix P(a)Generating a histogramming map G of the current diagnostic sub-region a(a)
The organizational structure graph G of the current diagnosis sub-area a based on a pre-trained graph convolution network model(a)Carrying out classification prediction;
generating a diagnosis result c of the current diagnosis sub-region a based on the classification prediction result(a)
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the dividing the pathological section image into a plurality of sub-regions and diagnosing each of the sub-regions one by one includes:
converting the pathological section image into a gray image from an RGB three-channel image;
processing the gray level image by using a threshold segmentation method to obtain a binary template M of the tissue region;
performing closed operation on the binary template M of the tissue region to obtain a closed operation result;
carrying out connected region detection on the closed operation result, taking the external rectangle of each connected region as a boundary to intercept a rectangular region from the tissue region binary template M, and obtaining a subregion binary template M(a)
One by one aiming at each subregion binary template M(a)And (6) carrying out diagnosis.
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the feature matrix F(a)The expression method is as follows:
Figure BDA0002978520590000031
in the above formula, the first and second carbon atoms are,
Figure BDA0002978520590000032
the length of the window corresponding to the ith row and the jth column is dfThe feature vector of (2); [*]Represents rounding down; when the ith row and the jth column window do not contain the current diagnosis sub-region, the window is not subjected to feature extraction, and F is directly assignedij=0。
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the lesion probability matrix P is(a)The expression method is as follows:
Figure BDA0002978520590000033
in the above formula, C represents the number of lesion types involved in the automatic classification task;
Figure BDA0002978520590000034
representing the predicted probability of the image block in the ith row and jth column window over C lesion types.
Preferably, in the above method for automatically analyzing pathological sections of gastrointestinal biopsy, the distance quantization is performed on each feature vector in the feature matrix based on a distance transform algorithm, so as to obtain a distance quantization feature matrix; cascading the feature matrix and the distance quantization feature matrix to obtain a fusion feature matrix, wherein the fusion feature matrix comprises the following steps:
distance transformation method based feature vector F for solving any image block in current diagnosis sub-areaijIn the feature matrix F(a)Shortest coordinate distance d of medium distance 0 vector or boundaryij
Using the formula for dijAnd (3) carrying out distance conversion:
Figure BDA0002978520590000041
in the above formula, the first and second carbon atoms are,
Figure BDA0002978520590000042
representing the degree of the current image block close to the boundary of the tissue area, wherein tau represents a temperature coefficient, and is set according to the actual application effect, and tau is taken as 16;
using the following formula pair
Figure BDA0002978520590000043
Carrying out distance quantization coding;
Figure BDA0002978520590000044
Figure BDA0002978520590000045
in the above formula, dhRepresents the length of the quantization code; hijRepresenting distance quantization feature vectors; h isijkRepresents HijThe value of the kth element;
respectively converting each image block feature vector F in the current diagnosis subareaijSum distance quantization feature vector HijCascading to obtain the fusion characteristic vector of each image block
Figure BDA0002978520590000046
The length of d ═ df+dh
Fusion feature vector based on each image block
Figure BDA0002978520590000047
Constructing a fused feature matrix
Figure BDA0002978520590000048
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the fusion feature matrix is used as a basis for the analysis of the pathological section of gastrointestinal biopsy
Figure BDA00029785205900000411
And the lesion probability matrix P(a)Generating a histogramming map G of the current diagnostic sub-area a(a)The method comprises the following steps:
the fused feature matrix for the current diagnostic sub-region using the formula
Figure BDA00029785205900000412
Network sampling is carried out;
Figure BDA0002978520590000049
wherein, χgridRepresenting a set of network samples; s is a sampling step length which is determined according to the number N of image blocks contained in the current diagnosis sub-region and the upper limit N of the number of expected image blocks after grid samplingmaxCalculating to obtain; | represents the length of the collection;
obtaining a lesion probability matrix P of a current diagnosis subarea(a)Top N with highest probability of medium lesionconfEach image block and constructing a confidence coefficient sampling set; the expression for the confidence sample set is as follows:
Figure RE-GDA00030410122500000413
wherein, χconfRepresenting a set of confidence samples; α represents a confidence sampling threshold;
Figure BDA0002978520590000051
representing a pre-constructed image classification model;
collecting x' network samplesgridAnd confidence sample set χconfAnd performing union to obtain a set χ, wherein the representation method of the set χ is as follows:
Figure BDA0002978520590000052
wherein N isg| χ |, representing the length of the set χ;
constructing an adjacency matrix A by using the following formula;
Figure BDA0002978520590000053
Figure BDA0002978520590000054
Figure BDA0002978520590000055
apqrepresents any element in the adjacency matrix a;
Figure BDA0002978520590000056
representing fused feature vectors
Figure BDA0002978520590000057
And
Figure BDA0002978520590000058
in the original feature matrix
Figure BDA0002978520590000059
Euclidean distance in coordinate space; (i)p,jp) And (i)q,jq) To represent
Figure BDA00029785205900000510
And
Figure BDA00029785205900000511
in the original feature matrix
Figure BDA00029785205900000512
Coordinates of (5);
construction of the histogra gram G of the current diagnostic sub-region a(a),G(a)=(A,X)。
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the sampling step S is calculated as follows:
Figure BDA00029785205900000513
Figure BDA00029785205900000514
in the above formula, N represents the number of image blocks included in the current diagnostic sub-region; n is a radical ofmaxRepresenting the upper limit of the number of the expected image blocks after grid sampling;
the confidence sample threshold α is calculated as:
calculating the prediction probability P of the image block of the ith row and the jth column window of the sliding window on C lesion types by using the following formulaijAnd probability of lesion rij
Pij=[pij1,pij2,...,pijC];
rij=1-pij1
Probability of lesion for all image blocks obtained by sliding windowijGet the set r 'in descending order'1,r'2,...};
According to a pathological change probability set { r'1,r'2,.. } obtain a confidence sample threshold a,
Figure BDA0002978520590000061
preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the diagnosis result c of the current diagnosis sub-region a is obtained based on the classification prediction result(a)Then, the method further comprises the following steps: generating diagnosis results of other sub-regions one by one, and sorting and outputting the diagnosis results of each sub-region according to a preset rule; diagnosis result c of current diagnosis sub-area a(a)The diagnosis process comprises the following steps:
constructing a graph convolution network model based on DiffPool;
acquiring a training sample set, and preprocessing the training sample set by using a sliding window method; preprocessed training sample set
Figure BDA0002978520590000062
The expression method of (2) is as follows:
Figure BDA0002978520590000063
wherein G iskA tissue region diagram representing the k pathological section in the training sample set; lkRepresents GkA corresponding slice label;
training the graph convolution network model by utilizing the preprocessed training sample set;
classifying the organizational structure graph of the current diagnosis sub-region a based on the trained graph convolution network model, wherein the expression is as follows:
Figure BDA0002978520590000064
wherein z is(a)∈(0,1)CA probability that the org-chart G is divided into C categories;
Figure BDA0002978520590000065
representing the trained graph convolution network model;
generating a diagnosis result c of the current diagnosis sub-area a by using the following formula(a)
c(a)=argmax(z(a))。
10. Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the histogramming map of the current diagnosis sub-region a is classified and predicted based on a pre-trained map convolutional network model, and a diagnosis result c of the current diagnosis sub-region a is obtained(a)The method also comprises the following steps:
based on the lesion probability matrix P(a)Generating a heat map of each sub-region, correspondingly superposing the heat map on the surface of the original image of the corresponding sub-region to obtain a diagnosis region map; the process of generating the heat map comprises the following steps:
if c is(a)>1, extracting the lesion probability matrix P(a)C of (a)(a)One channel acts as a thermodynamic diagram if c(a)1, indicating that the corresponding sub-region is free of lesions, no thermodynamic diagram is output.
Preferably, in the above method for automatically analyzing pathological section of gastrointestinal biopsy, the method further includes:
sorting and outputting the diagnosis result and/or the diagnosis area map of each sub-area according to a preset rule; the diagnosis result of each sub-region is expressed by the following formula;
Figure BDA0002978520590000071
in the above formula, naRepresenting the number of sub-regions contained in the pathological section image;
the diagnosis result of the pathological section image is represented by the following formula:
Figure BDA0002978520590000072
Figure BDA0002978520590000073
Zcrow c representing Z;
Figure BDA0002978520590000074
the probability that the pathological section image belongs to each category is indicated.
According to the technical scheme, compared with the prior art, the invention discloses and provides an automatic analysis method for the pathologic section of the alimentary tract biopsy, which has the following beneficial effects:
the invention can classify the biopsy pathology of the digestive tract, and can provide the lesion area in a full-section image in a thermodynamic diagram mode to assist doctors in diagnosing the pathology.
The method balances the information quantity of the tissue areas with different sizes by combining the grid sampling and the confidence coefficient sampling, and has wider application range.
The invention introduces the tissue boundary quantization distance in the tissue region classification to represent the absolute position of the feature in the tissue, describes the space relative position before the feature by using the tissue structure diagram, and integrates the information by using the graph convolution network model to finally complete the classification of the tissue region. The introduction of absolute position and relative position information enables the method of the invention to significantly improve the classification accuracy of certain lesion types diagnosed by means of different tissue morphological distributions.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a block flow diagram of an automatic analysis method for pathological section of digestive tract biopsy according to the present invention;
FIG. 2 is a flow chart illustrating an application of the method for automatically analyzing pathological section of digestive tract biopsy according to the present invention;
FIG. 3 is a flow chart illustrating the creation of a full-section histology map in accordance with the present invention;
fig. 4 is a schematic diagram of a pathological section area, a doctor labeling result and a lesion area model of a certain pathological section in a training sample set provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1-3, the embodiment of the invention discloses a method for automatically analyzing pathological sections of digestive tract biopsy, which comprises the following steps:
s1, dividing the pathological section image into a plurality of sub-regions as shown in fig. 2b, and diagnosing each sub-region one by one;
s2, determining the current diagnosis sub-region a, extracting the features of the current diagnosis sub-region a (shown in figure 2 c) by matching with a sliding window method based on a pre-constructed feature extraction model, and obtaining a feature matrix F(a)As shown in fig. 2 d;
s3, feature matrix F based on pre-constructed image classification model(a)The feature vectors in the step (a) are classified to obtain a lesion probability matrix P(a)As shown in fig. 2 e;
s4, feature matrix F based on distance transformation algorithm(a)Performing distance quantization on each eigenvector to obtain a distance quantization characteristic matrix H(a)(ii) a As shown in fig. 2 f; the feature matrix F(a)Sum distance quantization feature matrix H(a)Cascading to obtain a fusion characteristic matrix
Figure BDA0002978520590000081
S5 based onFusion feature matrix
Figure BDA0002978520590000082
And the lesion probability matrix P(a)Generating a histogramming map G of the current diagnostic sub-area a(a)As shown in fig. 2 h;
s6, based on the graph convolution network model trained in advance, organizing the structure graph G of the current diagnosis sub-area a(a)Carrying out classification prediction;
s7, obtaining the diagnosis result c of the current diagnosis subarea a based on the classification prediction result(a)As shown in fig. 2 i.
And S8, repeatedly executing S2-S6, and generating diagnosis results of other sub-regions one by one.
And S9, sorting and outputting the diagnosis results of each sub-region according to a preset rule.
More advantageously, in order to assist the doctor in locating the lesion area quickly, the method further comprises:
s10, generating a heat map of each sub-region one by one based on the probability value of the lesion probability matrix of each sub-region, and correspondingly superimposing the heat map on the original image surface of the corresponding sub-region to obtain a diagnosis region map of each sub-region, as shown in fig. 2 j; the process of generating the heat map is as follows:
if c is(a)>1, extracting a lesion probability matrix P(a)C of (a)(a)One channel acts as a thermodynamic diagram if c(a)1, the current diagnosis subarea is not provided with a lesion, and a thermodynamic diagram is not output.
And sorting and outputting the diagnosis result and the diagnosis area map of the whole pathological section image according to a preset rule.
The above steps are described in detail below:
s1, dividing the pathological section image into a plurality of sub-regions, and diagnosing each sub-region one by one, wherein the method comprises the following steps:
in the case of biopsy pathology, where multiple layers of tissue are often placed side by side on the same glass slide (as shown in fig. 2 a), embodiments of the present invention require that all sub-regions be predicted one by one. The method specifically comprises the following steps:
s111, converting the pathological section image I into a gray image from an RGB three-channel image;
s112, processing the gray level image by using a threshold segmentation method (Dajin threshold segmentation algorithm) to obtain a binary template M of the tissue region, as shown in FIG. 2 b;
s113, performing closed operation on the binary template M of the tissue region to obtain a closed operation result;
s114, carrying out connected region detection on the closed operation result, and taking an external rectangle of each connected region as a boundary to intercept a rectangular region from the binary template M of the tissue region to obtain a sub-region binary template, as shown in FIG. 2c or 3 b; defining the binary template of the a-th sub-region as M(a)Defining the corresponding area in the original drawing I as I(a)As indicated by the rectangular box in fig. 2 a;
s115, one-by-one binary template M for each subregion(a)And (6) carrying out diagnosis.
The feature extraction model in S2 and the image classification model in S3 are constructed as follows:
1. slice labeling and data collection
As shown in FIG. 4, the method of the present invention depends on the labeling of the pathologist, and requires the pathologist to outline a typical lesion region in the original section map, and then converts the region outlined by the doctor into a lesion region template through a closed curve filling algorithm. Engagement
Figure BDA0002978520590000091
Representing an RGB three-channel pathological full-section image with pixel resolution of w x h under high microscopic magnification (such as resolution of 0.46um/pixel), marking K slices, and establishing a training data set
Figure BDA0002978520590000101
Wherein IkRepresenting the k slice image in the training set, EkRepresenting the generated lesion region template, its size and image IkThe same is used for recording the lesion category number, l, corresponding to each pixel point in the imagekA category number indicating the overall diagnosis of the noted slice.
At the same time, it is necessary to match the training set
Figure BDA00029785205900001017
The data in (1) is processed, specifically, the slice in the data is divided into image blocks in a sliding window mode, the window size is defined as t × t, and the sliding window step length is t/2. Order to
Figure BDA0002978520590000102
Representing the nth image block obtained by sliding window, the submatrix of the same area of the image block in the lesion area template is YnThe image block data set created by the sliding window is represented as
Figure BDA0002978520590000103
Wherein y isnValue of (a) is represented by YnAll values in (a) make most voting decisions.
2. Model building
Training set for image blocks
Figure BDA0002978520590000104
Any image block T in (1)nExtracting image features can be expressed as the formula:
Figure BDA0002978520590000105
wherein
Figure BDA0002978520590000106
A model for the extraction of the features is represented,
Figure BDA0002978520590000107
expressed as length dfThe feature vector of (2). The invention has no special limitation on the feature extraction model, and can select a digital image feature extraction method according to requirements, such as traditional image features of texture features, color histogram features, shape features, frequency domain transformation features and the like, or a feature extraction method based on machine learning of a self-coding network, a convolutional neural network and the like.
In the above image feature vector fnAnd establishing an image classification model on the basis, and classifying the lesion types of the image blocks. The expression of the image classification model is as follows:
Figure BDA0002978520590000108
wherein
Figure BDA0002978520590000109
Representing an image classification model, pn∈(0,1)CRepresenting the prediction probability, C representing the number of lesion types involved in the automatic classification task, pncRepresents the probability that the nth image block belongs to the class c and satisfies
Figure BDA00029785205900001010
For convenience of subsequent description, designating c as 1 denotes a normal region category, c>1 represents a lesion category. The invention classifies the image
Figure BDA00029785205900001011
Without special requirements, a support vector machine, a random forest or a classification model based on a neural network can be selected according to requirements.
In order to obtain higher automatic classification precision, the invention adopts a Convolutional Neural Network (CNN) to establish the above
Figure BDA00029785205900001012
And
Figure BDA00029785205900001013
CNN is an end-to-end machine learning model, and needs to use the above
Figure BDA00029785205900001014
Completing the training of CNN, and after the training is completed, using the last full-connection-Softmax structure in the CNN structure as a classification model
Figure BDA00029785205900001015
The entire network except the classification layer is used as a feature extraction model
Figure BDA00029785205900001016
Feature matrix F of the current diagnostic sub-region a in S2(a)The construction process comprises the following steps:
using the above feature extraction model
Figure BDA0002978520590000111
And (3) extracting the image characteristics of the current diagnosis subarea by matching with a sliding window method, wherein the window size is t multiplied by t, the window size is the same as the window size in the training set preprocessing process, and the sliding window step length is t. The full-slice feature matrix obtained by feature extraction is represented as
Figure BDA0002978520590000112
Wherein [. X]Indicating a rounding down. Matrix elements
Figure BDA0002978520590000113
That is, the feature corresponding to the ith row and jth column window, in order to avoid unnecessary calculation, when the ith row and jth column window does not contain an organization region (determined according to the organization region template shown in fig. 3 b), feature extraction is not performed on the window, and F is directly assignedij=0。
In S3, the lesion probability matrix P(a)The specific construction process comprises the following steps:
classifying the images into models
Figure BDA0002978520590000114
Act on F(a)Any one of FijObtaining the probability matrix of lesion probability of the current diagnosis subregion a
Figure BDA0002978520590000115
Wherein the matrix elements
Figure BDA0002978520590000116
Namely the image block in the ith row and jth column window is pre-positioned on C lesionsProbability of measurement, as shown in FIG. 3d
The distance transformation process of the current diagnosis sub-region a in S4 is as follows:
in order to describe the position of a local tissue area in a tissue block, the invention adds a tissue boundary distance quantization feature in an image block feature extraction node, and uses distance transformation to solve any feature FijIn the feature matrix F(a)Distance of shortest coordinate of vector or boundary of middle distance 0, using dijThe effect of the implementation is shown in fig. 3 e.
In order to make the distance coding more sensitive to the boundary parts of the tissue region (corresponding to the epithelial parts possible in tissue pathology), d is corrected using the following formulaijImplementing a transformation
Figure BDA0002978520590000117
In the above formula, the first and second carbon atoms are,
Figure BDA0002978520590000118
the degree of the current image block close to the boundary of the tissue area is represented, tau represents a temperature coefficient, the temperature coefficient is set according to the actual application effect, and tau is preferably 16 for the pathological section of the digestive tract biopsy;
in order to make better use of this feature for subsequent machine learning models, the following pairs are used
Figure BDA0002978520590000119
Distance quantization coding
Figure BDA00029785205900001110
Figure BDA00029785205900001111
Wherein d ishIndicating the length of the quantization code, HijValue of the k-th element in (1), HijRepresenting distance-quantized feature vectors, hijkTo representHijThe value of the kth element.
For convenience of description, the image block distance quantization coding process at the ith row and jth column window is expressed as:
Figure RE-GDA00030410122500001112
to avoid unnecessary calculation, when the ith row and jth column window does not contain an organization region (judged according to the organization region template shown in fig. 3 b), distance quantization is not performed on the window, and H is directly assignedij=0。
Respectively converting each image block feature vector F in the current diagnosis subareaijSum distance quantization feature vector HijCascading to obtain the fusion characteristic vector of each image block
Figure BDA0002978520590000121
The length of d ═ df+dh
Fusion feature vector based on each image block
Figure BDA0002978520590000122
Constructing a fused feature matrix
Figure BDA0002978520590000123
The specific process of generating the organizational chart in S5 is as follows:
considering that the size difference of the pathological image tissue area is large, the tissue area may contain hundreds of thousands of image blocks (windows) in an extreme case, and directly using all image block features to construct a graph (graph) can cause the scale of the graph to be too large, thereby affecting the calculation of a subsequent graph convolution network model. Therefore, the invention balances the number of image blocks used for the composition (graph constraint) of different tissue slices in a sampling mode.
1) First, grid sampling (grid sampling) is performed, and a fusion feature matrix of the current diagnostic sub-region is obtained by using the following formula
Figure BDA0002978520590000124
Network sampling is carried out;
Figure BDA0002978520590000125
wherein, χgridRepresenting a set of network samples; s is a sampling step length which is determined according to the number N of image blocks contained in the current diagnosis sub-region and the upper limit N of the number of expected image blocks after grid samplingmaxCalculating to obtain; | represents the length of the collection.
The calculation formula of the sampling step length S is as follows:
Figure BDA0002978520590000126
Figure BDA0002978520590000127
in the above formula, N represents the number of image blocks included in the current diagnostic sub-region; n is a radical ofmaxRepresenting the upper limit of the number of expected image blocks after grid sampling.
2)χgridOnly considering the reduction of the number of image blocks on a space structure, in order to ensure that image blocks which play a decisive role in classification in a slice can participate in composition, the embodiment of the invention additionally adopts confidence coefficient sampling to obtain a probability matrix P of the current diagnosis subregion lesion(a)Top N with highest probability of medium lesionconfAnd constructing a confidence coefficient sampling set.
For convenience of description, P will be mentionedijIs shown in detail as Pij=[pij1,pij2,...,pijC]Defining the probability of pathological changes of the ith row and the jth column of image blocks obtained by sliding window as rij=1-pij1Probability of lesions for all image blocks obtained for the sliding window { rijSorting from big to small to obtain a set of { r'1,r'2,.., defining confidence sample thresholds based thereon
Figure BDA0002978520590000131
Figure BDA0002978520590000132
According to the threshold value alpha, defining the confidence coefficient sampling result as:
Figure BDA0002978520590000133
wherein, χconfRepresenting a set of confidence samples; α represents a confidence sampling threshold;
Figure BDA0002978520590000134
representing a pre-constructed image classification model.
Collecting x' network samplesgridAnd confidence sample set χconfAnd performing union to obtain a set χ, wherein the representation method of the set χ is as follows:
Figure BDA0002978520590000135
wherein N isgAnd | χ |, which represents the length of the set χ.
The adjacency matrix a is defined as a matrix of,
Figure BDA0002978520590000136
wherein the element apqThe value rule is as follows:
Figure BDA0002978520590000137
Figure BDA0002978520590000138
apqrepresents any element in the adjacency matrix a;
Figure BDA0002978520590000139
representing fused feature vectors
Figure BDA00029785205900001310
And
Figure BDA00029785205900001311
in the original feature matrix
Figure BDA00029785205900001312
Euclidean distance in coordinate space; (i)p,jp) And (i)q,jq) To represent
Figure BDA00029785205900001313
And
Figure BDA00029785205900001314
in the original feature matrix
Figure BDA00029785205900001315
Coordinates of (2).
In conclusion, the process of constructing the organizational chart is completed and is marked as G(a)(a, X), as shown in fig. 3 (g).
For convenience of description, the feature matrix will be composed of
Figure BDA00029785205900001316
And defining the whole process of constructing the organization region structure diagram by the prediction probability matrix P as a diagram construction model
Figure BDA00029785205900001317
The composition process is described as
Figure BDA00029785205900001318
And (8) training and classifying and predicting the graph convolution network model in S6.
Building graph convolution network model for graph classification prediction
Figure BDA00029785205900001319
For mapping tissue regions GFor classification of lesion types, the present invention is preferably applied to pathological sections of digestive tract biopsies using the DiffPool model. Diagnosis result c of the current diagnosis sub-area a(a)The diagnosis process comprises the following steps:
constructing a graph convolution network model based on DiffPool;
acquiring a training sample set, and preprocessing the training sample set by using a sliding window method; preprocessed training sample set
Figure BDA0002978520590000141
The expression method of (2) is as follows:
Figure BDA0002978520590000142
wherein G iskA tissue region diagram representing the k pathological section in the training sample set; lkRepresents GkA corresponding slice label;
training the graph convolution network model by utilizing the preprocessed training sample set;
classifying the organizational structure diagram of the current diagnosis sub-area a based on a trained graph convolution network model, wherein the expression is as follows:
Figure BDA0002978520590000143
wherein z is(a)∈(0,1)CA probability that the org-chart G is divided into C categories;
Figure BDA0002978520590000144
representing the trained graph convolution network model;
generating a diagnosis result c of the current diagnosis sub-area a by using the following formula(a)
c(a)=argmax(z(a))。
The diagnosis results of the sub-regions are generated one by one according to S2-S6.
S10, outputting the diagnosis result of case
Prediction result z from screening all sub-regions (sometimes distributed in multiple full slices) of pathological section image(a)Arranged into a matrix
Figure BDA0002978520590000145
Wherein n isaRepresenting the number of the sub-regions contained in the case, the screening result of the whole case is defined as
Figure BDA0002978520590000146
Wherein
Figure BDA0002978520590000147
Wherein ZcLine c of Z.
Figure BDA0002978520590000148
I.e. the probability that the case belongs to each category, can be output to the doctor user according to the application requirements and certain rules.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An automatic analysis method for pathological sections of digestive tract biopsy is characterized by comprising the following steps:
dividing the pathological section image into a plurality of sub-regions, and diagnosing each sub-region one by one;
determining a current diagnosis sub-area a, extracting the characteristics of the current diagnosis sub-area a by matching with a sliding window method based on a pre-constructed characteristic extraction model to obtain a characteristic matrix F(a)
The feature matrix F is subjected to image classification model based on pre-construction(a)The feature vectors in the step (a) are classified to obtain a lesion probability matrix P(a)
Feature matrix F based on distance transformation algorithm(a)Performing distance quantization on each eigenvector to obtain a distance quantization characteristic matrix H(a)(ii) a The feature matrix F(a)And the distance quantization feature matrix H(a)Cascading to obtain a fusion characteristic matrix
Figure FDA0002978520580000011
Based on the fusion feature matrix
Figure FDA0002978520580000012
And the lesion probability matrix P(a)Generating a histogramming map G of the current diagnostic sub-area a(a)
The org-chart G of the current diagnostic sub-region a based on a pre-trained graph convolution network model(a)Carrying out classification prediction;
generating a diagnosis result c of the current diagnosis sub-region a based on the classification prediction result(a)
2. The method for automatically analyzing pathological section of alimentary canal biopsy according to claim 1, wherein dividing pathological section image into a plurality of sub-regions, and diagnosing each sub-region one by one, comprises:
converting the pathological section image into a gray image from an RGB three-channel image;
processing the gray level image by using a threshold segmentation method to obtain a binary template M of the tissue region;
performing closed operation on the binary template M of the tissue region to obtain a closed operation result;
carrying out connected region detection on the closed operation result, taking the external rectangle of each connected region as the boundary to intercept the rectangular region from the tissue region binary template M, and obtaining a subregion binary template M(a)
One by one aiming at each subregion binary template M(a)And (6) carrying out diagnosis.
3. The method for automatically analyzing pathological section of digestive tract biopsy according to claim 1, wherein the feature matrix F(a)The expression method is as follows:
Figure FDA0002978520580000013
in the above formula, the first and second carbon atoms are,
Figure FDA0002978520580000014
the length of the window corresponding to the ith row and the jth column is dfThe feature vector of (2); [*]Represents rounding down; when the ith row and the jth column window do not contain the current diagnosis sub-region, the window is not subjected to feature extraction, and F is directly assignedij=0。
4. The method as claimed in claim 3, wherein the lesion probability matrix P is a matrix of lesion probability(a)The expression method is as follows:
Figure FDA0002978520580000021
in the above formula, C represents the number of lesion types involved in the automatic classification task;
Figure FDA0002978520580000022
representing the predicted probability of the image block in the ith row and jth column window over C lesion types.
5. The method for automatically analyzing pathological section of digestive tract biopsy as claimed in claim 4, wherein the distance-based transformation algorithm performs distance quantization on each feature vector in the feature matrix to obtain a distance quantized feature matrix; cascading the feature matrix and the distance quantization feature matrix to obtain a fusion feature matrix, wherein the step of cascading the feature matrix and the distance quantization feature matrix comprises the following steps:
distance transformation method based feature vector F for solving any image block in current diagnosis sub-areaijIn the feature matrix F(a)Shortest coordinate distance d of medium distance 0 vector or boundaryij
Using the formula for dijAnd (3) carrying out distance conversion:
Figure FDA0002978520580000023
in the above formula, the first and second carbon atoms are,
Figure FDA0002978520580000024
representing the degree of the current image block close to the boundary of the tissue area, wherein tau represents a temperature coefficient, and is set according to the actual application effect, and tau is taken as 16;
using the following formula pair
Figure FDA0002978520580000025
Carrying out distance quantization coding;
Figure FDA0002978520580000026
Figure FDA0002978520580000027
in the above formula, dhRepresents the length of the quantization code; hijRepresenting distance quantization feature vectors; h isijkRepresents HijThe value of the kth element;
respectively converting each image block feature vector F in the current diagnosis subareaijSum distance quantization feature vector HijCascading to obtain the fusion characteristic vector of each image block
Figure FDA0002978520580000028
The length of d ═ df+dh
Fusion feature vector based on each image block
Figure FDA0002978520580000029
Constructing a fused feature matrix
Figure FDA00029785205800000210
6. The method of claim 5, wherein the fused feature matrix is based on
Figure FDA00029785205800000211
And the lesion probability matrix P(a)Generating a histogramming map G of the current diagnostic sub-area a(a)The method comprises the following steps:
the fused feature matrix for the current diagnostic sub-region using the formula
Figure FDA00029785205800000212
Network sampling is carried out;
Figure FDA0002978520580000031
wherein the content of the first and second substances,
Figure FDA00029785205800000324
representing a set of network samples; s is a sampling step length which is determined according to the number N of image blocks contained in the current diagnosis sub-area and the upper limit N of the number of expected image blocks after grid samplingmaxCalculating to obtain; | represents the length of the collection;
obtaining a lesion probability matrix P of a current diagnosis subarea(a)Top N with highest probability of medium lesionconfEach image block and a confidence coefficient sampling set are constructed; the expression for the confidence sample set is as follows:
Figure FDA0002978520580000032
wherein the content of the first and second substances,
Figure FDA0002978520580000033
representing a set of confidence samples; α represents a confidence sampling threshold;
Figure FDA0002978520580000034
representing a pre-constructed image classification model;
aggregating network samples
Figure FDA0002978520580000035
And confidence sample set
Figure FDA0002978520580000036
Performing union to obtain a set
Figure FDA0002978520580000037
Collection
Figure FDA0002978520580000038
The expression method of (2) is as follows:
Figure FDA0002978520580000039
wherein the content of the first and second substances,
Figure FDA00029785205800000310
showing platform
Figure FDA00029785205800000311
Length of (d);
constructing an adjacency matrix A by using the following formula;
Figure FDA00029785205800000312
Figure FDA00029785205800000313
Figure FDA00029785205800000314
apqrepresents any element in the adjacency matrix a;
Figure FDA00029785205800000315
representing fused feature vectors
Figure FDA00029785205800000316
And
Figure FDA00029785205800000317
in the original feature matrix
Figure FDA00029785205800000318
Euclidean distance in coordinate space; (i)p,jp) And (i)q,jq) To represent
Figure FDA00029785205800000319
And
Figure FDA00029785205800000320
in the original feature matrix
Figure FDA00029785205800000321
Coordinates of (5);
construction of the histogra gram G of the current diagnostic sub-region a(a),G(a)=(A,X)。
7. The method for automatically analyzing pathological section of digestive tract biopsy according to claim 6, wherein the sampling step S is calculated as follows:
Figure FDA00029785205800000322
Figure FDA00029785205800000323
in the above formula, N represents the number of image blocks included in the current diagnostic sub-region; n is a radical ofmaxRepresenting the upper limit of the number of the expected image blocks after grid sampling;
the confidence sample threshold α is calculated as:
calculating the prediction probability P of the image block of the ith row and the jth column window of the sliding window on C lesion types by using the following formulaijAnd probability of lesion rij
Pij=[pij1,pij2,...,pijC];
rij=1-pij1
Probability of lesion for all image blocks obtained by sliding windowijGet the set r 'in descending order'1,r′2,...};
According to a pathological change probability set { r'1,r′2,.. } obtain a confidence sample threshold a,
Figure FDA0002978520580000041
8. the method as claimed in claim 6, wherein the diagnosis result C of the current diagnosis sub-area a is obtained(a)The diagnosis process comprises the following steps:
constructing a graph convolution network model based on DiffPool;
acquiring a training sample set, and preprocessing the training sample set by using a sliding window method; preprocessed training sample set
Figure FDA0002978520580000042
The expression method of (2) is as follows:
Figure FDA0002978520580000043
wherein G iskA tissue region diagram representing the k pathological section in the training sample set; lkRepresents GkA corresponding slice label;
training the graph convolution network model by utilizing the preprocessed training sample set;
classifying the organizational chart of the current diagnosis sub-area a based on the trained graph convolution network model, wherein the expression is as follows:
Figure FDA0002978520580000044
wherein z is(a)∈(0,1)CA probability that the org-chart G is divided into C categories;
Figure FDA0002978520580000045
representing the trained graph convolution network model;
generation of the Current diagnostics Using the following equationDiagnosis result c of region a(a)
c(a)=argmax(z(a))。
9. The method as claimed in claim 6, wherein the map convolutional network model trained in advance is used to perform classification prediction on the histogramming map of the current diagnosis sub-region a and obtain the diagnosis result c of the current diagnosis sub-region a(a)The method also comprises the following steps:
generating a heat map of each sub-region one by one based on the probability value in the lesion probability matrix of each sub-region, and correspondingly overlapping the heat map to the surface of the original image of the corresponding sub-region to obtain a diagnosis region map of each sub-region; the process of generating the heat map comprises the following steps:
if c is(a)> 1, extracting the lesion probability matrix P(a)C of (a)(a)One channel acts as a thermodynamic diagram if c(a)1, indicating that the corresponding sub-region is free of lesions, no thermodynamic diagram is output.
10. The method of claim 9, further comprising:
sorting and outputting the diagnosis result and/or the diagnosis area map of each sub-area according to a preset rule; the diagnosis result of each sub-region is expressed by the following formula;
Figure FDA0002978520580000051
in the above formula, naRepresenting the number of sub-regions contained in the pathological section image;
the diagnosis result of the pathological section image is represented by the following formula:
Figure FDA0002978520580000052
Figure FDA0002978520580000053
Zcrow c representing Z;
Figure FDA0002978520580000054
the probability that the pathological section image belongs to each category is indicated.
CN202110281259.8A 2021-03-16 2021-03-16 Automatic analysis method for digestive tract biopsy pathological section Pending CN112927215A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110281259.8A CN112927215A (en) 2021-03-16 2021-03-16 Automatic analysis method for digestive tract biopsy pathological section

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110281259.8A CN112927215A (en) 2021-03-16 2021-03-16 Automatic analysis method for digestive tract biopsy pathological section

Publications (1)

Publication Number Publication Date
CN112927215A true CN112927215A (en) 2021-06-08

Family

ID=76175566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110281259.8A Pending CN112927215A (en) 2021-03-16 2021-03-16 Automatic analysis method for digestive tract biopsy pathological section

Country Status (1)

Country Link
CN (1) CN112927215A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140445A (en) * 2021-12-06 2022-03-04 上海派影医疗科技有限公司 Breast cancer pathological image identification method based on key attention area extraction
CN114359280A (en) * 2022-03-18 2022-04-15 武汉楚精灵医疗科技有限公司 Gastric mucosa image boundary quantification method, device, terminal and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140445A (en) * 2021-12-06 2022-03-04 上海派影医疗科技有限公司 Breast cancer pathological image identification method based on key attention area extraction
CN114140445B (en) * 2021-12-06 2022-10-28 上海派影医疗科技有限公司 Breast cancer pathological image identification method based on key attention area extraction
CN114359280A (en) * 2022-03-18 2022-04-15 武汉楚精灵医疗科技有限公司 Gastric mucosa image boundary quantification method, device, terminal and storage medium

Similar Documents

Publication Publication Date Title
CN111985536B (en) Based on weak supervised learning gastroscopic pathology image Classification method
CN109886179B (en) Image segmentation method and system of cervical cell smear based on Mask-RCNN
CN107665492B (en) Colorectal panoramic digital pathological image tissue segmentation method based on depth network
WO2021139258A1 (en) Image recognition based cell recognition and counting method and apparatus, and computer device
WO2020253629A1 (en) Detection model training method and apparatus, computer device, and storage medium
CN108830326B (en) Automatic segmentation method and device for MRI (magnetic resonance imaging) image
CN109389129B (en) Image processing method, electronic device and storage medium
CN110021425B (en) Comparison detector, construction method thereof and cervical cancer cell detection method
CN112101451B (en) Breast cancer tissue pathological type classification method based on generation of antagonism network screening image block
CN112380900A (en) Deep learning-based cervical fluid-based cell digital image classification method and system
Yang et al. Colon polyp detection and segmentation based on improved MRCNN
CN112733950A (en) Power equipment fault diagnosis method based on combination of image fusion and target detection
CN112085714B (en) Pulmonary nodule detection method, model training method, device, equipment and medium
CN108830149B (en) Target bacterium detection method and terminal equipment
CN108305253A (en) A kind of pathology full slice diagnostic method based on more multiplying power deep learnings
CN110796661B (en) Fungal microscopic image segmentation detection method and system based on convolutional neural network
CN110766670A (en) Mammary gland molybdenum target image tumor localization algorithm based on deep convolutional neural network
Bouchet et al. Intuitionistic fuzzy set and fuzzy mathematical morphology applied to color leukocytes segmentation
CN115909006B (en) Mammary tissue image classification method and system based on convolution transducer
WO2020066257A1 (en) Classification device, classification method, program, and information recording medium
CN112132166A (en) Intelligent analysis method, system and device for digital cytopathology image
CN112927215A (en) Automatic analysis method for digestive tract biopsy pathological section
CN113378792A (en) Weak supervision cervical cell image analysis method fusing global and local information
CN114445356A (en) Multi-resolution-based full-field pathological section image tumor rapid positioning method
CN116740435A (en) Breast cancer ultrasonic image classifying method based on multi-mode deep learning image group science

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination