CN107578405A - Automatic pulmonary nodule detection method based on a deep convolutional neural network - Google Patents

Automatic pulmonary nodule detection method based on a deep convolutional neural network

Info

Publication number
CN107578405A
CN107578405A
Authority
CN
China
Legal status
Pending
Application number
CN201710765376.5A
Other languages
Chinese (zh)
Inventor
王学磊
刘璟丹
王鹏
Current Assignee
Beijing Medical Wisdom Technology Co Ltd
Original Assignee
Beijing Medical Wisdom Technology Co Ltd
Application filed by Beijing Medical Wisdom Technology Co Ltd
Priority to CN201710765376.5A
Publication of CN107578405A


Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic pulmonary nodule detection method based on a deep convolutional neural network. A deep convolutional neural network is first trained on a training set. For a lung CT image to be examined, a sliding window is moved across the image, and each local lung CT patch extracted by the window is passed through the deep convolutional neural network to obtain an output value. The final score of each voxel is the average of all output values it receives during the sliding process; if this final score exceeds a preset threshold, the voxel is classified as a nodule voxel. Finally, connected-component analysis is performed on the nodule voxels in the lung CT image to obtain the detection result. The invention automatically detects the presence of nodules in a lung CT image; compared with existing detection methods, it achieves good technical performance without requiring lung segmentation or false-positive reduction, and features high detection accuracy and strong robustness.

Description

Automatic pulmonary nodule detection method based on a deep convolutional neural network
Technical field
The invention belongs to the technical field of lung CT image detection and screening, and more particularly relates to an automatic pulmonary nodule detection method based on a deep convolutional neural network.
Background technology
At present, lung cancer cases worldwide are increasing rapidly owing to long-term smoking, air pollution, and other causes. Lung cancer has among the highest incidence and mortality of all cancer types in the world. Data show that the average five-year survival rate for lung cancer is only 16% globally, while the five-year survival rate for early-stage (stage I) lung cancer reaches 65%; unfortunately, only 10% of patients discover the disease and receive corresponding treatment at an early stage. There is evidence that annual lung computed tomography (CT) screening of high-risk populations can reduce lung cancer mortality by 20%.
Pulmonary nodules are often associated with lung cancer. When a nodule is found in a lung CT image, the examinee should be alerted, take the finding seriously, and actively seek further examination and treatment. Early detection of pulmonary nodules greatly improves the chance of cure; detecting the presence of pulmonary nodules in CT images is therefore of great significance for the early screening of lung cancer.
In current clinical practice, pulmonary nodules are mostly detected by acquiring planar grayscale tomographic images of the whole lung with a CT scanner, i.e. CT images, after which trained medical staff manually screen the images for pulmonary nodules. This conventional approach is labor-intensive, time-consuming, and prone to errors and missed detections, and the screening result depends heavily on the individual professional skill of the medical staff.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by proposing an automatic pulmonary nodule detection method based on a deep convolutional neural network that improves the detection efficiency and accuracy of pulmonary nodule detection, thereby reducing the workload of medical staff and saving their working time.
To achieve the above object, the automatic pulmonary nodule detection method based on a deep convolutional neural network of the invention is characterized by comprising the following steps:
S1: Collect a number of local lung CT images of size 5 × 20 × 20 as a training set, containing both normal lung tissue and abnormal lung tissue containing nodules; CT images containing normal lung tissue are labeled '0', and CT images of abnormal lung tissue containing nodules are labeled '1';
All CT images in the training set are normalized to obtain normalized CT images; the normalization can be expressed by the following formula:
norm(x) = (x - x_min) / (x_max - x_min)
where x is a pixel value in the CT image, norm(x) is the normalized pixel value, x_min is the minimum pixel value among the pixels of the CT image, and x_max is the maximum pixel value;
S2: Construct a deep convolutional neural network:
Layer 1 is a convolutional layer (denoted C1), which convolves the input local lung CT image with 96 convolution kernels of size 3 × 9 × 9 at stride 1;
Layer 2 is a pooling layer (denoted M1), which applies non-overlapping 1 × 2 × 2 max pooling at stride 2 to the data that C1 outputs to M1;
Layer 3 is a convolutional layer (denoted C2), which convolves the data that M1 outputs to C2 with 256 convolution kernels of size 2 × 4 × 4 at stride 1;
Layer 4 is a pooling layer (denoted M2), which applies non-overlapping 2 × 2 × 2 max pooling at stride 2 to the data that C2 outputs to M2;
Layer 5 is a convolutional layer (denoted C3), which convolves the data that M2 outputs to C3 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 6 is a convolutional layer (denoted C4), which convolves the data that C3 outputs to C4 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 7 is a convolutional layer (denoted C5), which convolves the data that C4 outputs to C5 with 256 convolution kernels of size 1 × 3 × 3 at stride 1;
Layers C1, C2, C3, C4, and C5 use the rectified linear unit (ReLU) as the activation function:
ReLu(y) = (y + |y|) / 2
where y is the input of the activation function and ReLu(y) is its output.
Layer 8 is a pooling layer (denoted M3), which applies non-overlapping 1 × 2 × 2 max pooling at stride 2 to the data that C5 outputs to M3;
Layer 9 is a fully connected layer (denoted FC1) with 1096 neurons; each neuron is fully connected to the M3 output data and to 7 parameters, of which the first 3 values give the position of the input CT image (the receptive field of the deep convolutional neural network) relative to the whole CT image along its three axes, and the last 4 values carry DICOM image information, namely the slice thickness, the x-axis and y-axis spacings within a plane, and the frame position information;
Layer 10 is a fully connected layer (denoted FC2) with 1096 neurons, each fully connected to the FC1 output data; FC1 and FC2 use the tanh function as the activation function:
tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z))
where z is the input of the activation function and tanh(z) is its output;
Layer 11 is a fully connected layer (denoted FC3) with 2 neurons, each fully connected to the 1096-dimensional FC2 output data; the output of FC3 is a 2-dimensional vector giving respectively the probability of belonging to class 0 and the probability of belonging to class 1; FC3 uses the sigmoid function as the activation function:
sigmoid(p) = 1 / (1 + e^(-p))
where p is the input of the activation function and sigmoid(p) is its output;
S3: Take each local lung CT image of the pulmonary nodule training set obtained in step S1 as input and its corresponding class as the desired output, and train the deep convolutional neural network constructed in step S2 to obtain a trained deep convolutional neural network;
S4: For a lung CT image to be examined, slide a window of size 5 × 20 × 20 over the image, feed each local lung CT patch obtained by the window into the deep convolutional neural network trained in step S3 to obtain an output value, and take that output value as the score of every voxel in the patch; the final score of each voxel is the average of all the scores the voxel receives during the sliding process. If the final score of a voxel exceeds a preset voxel decision threshold H, the voxel is a nodule voxel, otherwise it is not. Connected-component analysis is then performed on the detected nodule voxels in the lung CT image to obtain the position and boundary of the pulmonary nodules.
The invention proposes an automatic pulmonary nodule detection method based on a deep convolutional neural network. A deep convolutional neural network is trained on a training set; for a lung CT image to be examined, a sliding window is moved across the image, and each local lung CT patch obtained by the window is passed through the network to obtain an output value. Each voxel takes the average of all output values it receives during the sliding process as its final score; if this final score exceeds a preset threshold, the voxel is a nodule voxel. Finally, connected-component analysis is performed on the nodule voxels in the lung CT image to obtain the detection result.
The invention automatically detects the presence of nodules in a lung CT image. Compared with existing detection methods, it achieves good technical performance without requiring lung segmentation or false-positive reduction, and features high detection accuracy and strong robustness.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the automatic pulmonary nodule detection method based on a deep convolutional neural network of the invention;
Fig. 2 is a schematic structural diagram of the deep convolutional neural network of the invention;
Fig. 3 shows example detection results for 4 pulmonary nodules in a lung CT image.
Detailed description of the embodiments
Embodiments of the invention are described below with reference to the accompanying drawings, so that those skilled in the art can better understand the invention. Note that, in the following description, detailed descriptions of well-known functions and designs are omitted where they would obscure the main content of the invention.
Fig. 1 is a schematic flowchart of the automatic pulmonary nodule detection method based on a deep convolutional neural network of the invention. As shown in Fig. 1, the specific steps of the method are as follows:
S101: Obtain the pulmonary nodule training set:
To detect candidate pulmonary nodules, a deep convolutional neural network for pulmonary nodule detection must be built and trained, which first requires constructing a training set. A number of local lung CT images of size 5 × 20 × 20 are collected as the training set, containing both normal lung tissue and abnormal lung tissue containing nodules; CT images containing normal lung tissue are labeled '0', and CT images of abnormal lung tissue containing nodules are labeled '1'. A local lung CT image is regarded as non-nodule, i.e. normal lung tissue, when its voxels are not part of a nodule; otherwise it is regarded as abnormal lung tissue.
All CT images in the training set are normalized to obtain normalized CT images. The normalization can be expressed by the following formula:
norm(x) = (x - x_min) / (x_max - x_min)
where x is a pixel value in the CT image, norm(x) is the normalized pixel value, x_min is the minimum pixel value among the pixels of the CT image, and x_max is the maximum pixel value.
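As an illustration, this min-max normalization can be sketched in a few lines of numpy (the function name `normalize_ct` is ours, not part of the patent):

```python
import numpy as np

def normalize_ct(image):
    """Min-max normalize a CT patch to [0, 1]:
    norm(x) = (x - x_min) / (x_max - x_min)."""
    image = image.astype(np.float64)
    x_min, x_max = image.min(), image.max()
    return (image - x_min) / (x_max - x_min)

# toy 2x2 patch standing in for a 5x20x20 CT patch
patch = np.array([[100.0, 300.0], [500.0, 700.0]])
normed = normalize_ct(patch)
print(normed.min(), normed.max())  # 0.0 1.0
```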
At present, lung CT images in the industry have size D × 512 × 512 with D ∈ [65, 764]. In the invention, the local lung CT images in the pulmonary nodule training set have size 5 × 20 × 20. This size was obtained experimentally as follows: the receptive field was first set to 1 × 76 × 76, 76 being the maximum pulmonary nodule diameter, and the deep convolutional neural network was trained on local lung CT images of this size; the receptive field was then reduced, and the accuracy of the trained deep convolutional neural network model was observed to improve. Next, the depth of the receptive field, i.e. the number of input lung slices, was increased, which also improved accuracy. Experiments showed that the model accuracy reached its optimum when the number of input slices was 5. A pulmonary nodule is a three-dimensional target, so a multi-slice receptive field allows better feature extraction from the nodule; the invention therefore sets a receptive field of size 5 × 20 × 20 as the input window of the deep convolutional neural network. Experiments also showed that a 5 × 20 × 20 receptive field contains at least one full nodule or at least part of a nodule.
With the pulmonary nodule training set obtained in step S101, the deep convolutional neural network learns not only information from the nodule itself but also information from the lung tissue surrounding the nodule. To preserve the original DICOM (Digital Imaging and Communications in Medicine) file information more effectively, the original CT images are not scaled during data processing.
S102: Construct the deep convolutional neural network:
To detect pulmonary nodules, a deep convolutional neural network must be constructed. Fig. 2 is a schematic structural diagram of the deep convolutional neural network of the invention, and Table 1 lists the structure of the network.
Table 1
According to Fig. 2 and Table 1, the structure of the deep convolutional neural network used in the invention is as follows:
Layer 1 is a convolutional layer (denoted C1), which convolves the input local lung CT image with 96 convolution kernels of size 3 × 9 × 9 at stride 1;
Layer 2 is a pooling layer (denoted M1), which applies non-overlapping 1 × 2 × 2 max pooling at stride 2 to the data that C1 outputs to M1;
Layer 3 is a convolutional layer (denoted C2), which convolves the data that M1 outputs to C2 with 256 convolution kernels of size 2 × 4 × 4 at stride 1;
Layer 4 is a pooling layer (denoted M2), which applies non-overlapping 2 × 2 × 2 max pooling at stride 2 to the data that C2 outputs to M2;
Layer 5 is a convolutional layer (denoted C3), which convolves the data that M2 outputs to C3 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 6 is a convolutional layer (denoted C4), which convolves the data that C3 outputs to C4 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 7 is a convolutional layer (denoted C5), which convolves the data that C4 outputs to C5 with 256 convolution kernels of size 1 × 3 × 3 at stride 1;
Layers C1, C2, C3, C4, and C5 use the rectified linear unit (ReLU) as the activation function:
ReLu(y) = (y + |y|) / 2
where y is the input of the activation function and ReLu(y) is its output.
Layer 8 is a pooling layer (denoted M3), which applies non-overlapping 1 × 2 × 2 max pooling at stride 2 to the data that C5 outputs to M3;
Layer 9 is a fully connected layer (denoted FC1) with 1096 neurons; each neuron is fully connected to the M3 output data and to 7 parameters, of which the first 3 values give the position of the input CT image (the receptive field of the deep convolutional neural network) relative to the whole CT image along its three axes, and the last 4 values carry DICOM image information, namely the slice thickness (in millimeters in this embodiment), the x-axis and y-axis spacings within a plane (in millimeters in this embodiment), and the frame position information;
Layer 10 is a fully connected layer (denoted FC2) with 1096 neurons, each fully connected to the FC1 output data; FC1 and FC2 use the tanh function as the activation function:
tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z))
where z is the input of the activation function and tanh(z) is its output;
Layer 11 is a fully connected layer (denoted FC3) with 2 neurons, each fully connected to the 1096-dimensional FC2 output data; the output of FC3 is a 2-dimensional vector giving respectively the probability of belonging to class 0 and the probability of belonging to class 1; FC3 uses the sigmoid function as the activation function:
sigmoid(p) = 1 / (1 + e^(-p))
where p is the input of the activation function and sigmoid(p) is its output.
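The three activation functions used above (ReLU in the convolutional layers, tanh in FC1 and FC2, sigmoid in FC3) can be checked with a few lines of numpy; a minimal sketch, with function names of our choosing:

```python
import numpy as np

def relu(y):
    # ReLu(y) = (y + |y|) / 2, equivalent to max(0, y)
    return 0.5 * (y + np.abs(y))

def tanh(z):
    # tanh(z) = (e^z - e^-z) / (e^z + e^-z)
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

def sigmoid(p):
    # sigmoid(p) = 1 / (1 + e^-p)
    return 1.0 / (1.0 + np.exp(-p))

y = np.array([-2.0, 0.0, 3.0])
print(relu(y))  # [0. 0. 3.]
print(sigmoid(np.array([0.0])))  # [0.5]
```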
S103: Train the deep convolutional neural network:
Each local lung CT image of the pulmonary nodule training set obtained in step S101 serves as input and its corresponding class as the desired output; the deep convolutional neural network constructed in step S102 is trained to obtain a trained deep convolutional neural network.
In this embodiment, training uses the back-propagation algorithm, the most common training method for deep neural networks; its details are not repeated here. Network parameters are solved with mini-batch gradient descent. The training set in this embodiment contains 814 lung CT images; training runs for 70 epochs, with 4000 groups of data per epoch and a group size of 128. The learning rate starts at 0.01 and is gradually reduced during training until it reaches 0.0005. The batch size is the number of input examples processed by the deep convolutional neural network in each iteration of the back-propagation algorithm. Since the network weights are updated at each training step, a batch size of 1 would reduce to pure stochastic gradient descent; the batch size should therefore be greater than 1 and smaller than the size of the training set.
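The mini-batch gradient descent scheme described above can be sketched on a toy problem. This illustrates only the optimization loop, not the patent's network: toy logistic-regression data stands in for the CNN and the CT patches, while the batch size of 128 and a learning rate decaying from 0.01 to 0.0005 follow the embodiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for (patch, label) pairs; the real input
# would be 5x20x20 CT patches fed to the CNN described above.
X = rng.normal(size=(512, 8))
w_true = rng.normal(size=8)
y = (X @ w_true > 0).astype(np.float64)

w = np.zeros(8)
epochs, batch_size = 70, 128
lr_start, lr_end = 0.01, 0.0005

for epoch in range(epochs):
    # learning rate decays from 0.01 down to 0.0005 over training
    lr = lr_start + (lr_end - lr_start) * epoch / (epochs - 1)
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        pred = 1.0 / (1.0 + np.exp(-(X[batch] @ w)))  # sigmoid output
        grad = X[batch].T @ (pred - y[batch]) / len(batch)
        w -= lr * grad  # mini-batch gradient descent step

acc = np.mean(((X @ w) > 0) == (y > 0.5))
print(acc)
```

With a batch size between 1 and the training-set size, each update averages the gradient over the batch, trading the noise of pure stochastic gradient descent against the cost of full-batch updates, as the paragraph above explains.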
S104: Pulmonary nodule detection:
For a lung CT image to be examined, a window of size 5 × 20 × 20 is slid over the image, i.e. the receptive field of the deep convolutional neural network slides across the whole CT image. Each local lung CT patch obtained by the window is fed into the deep convolutional neural network trained in step S103 to obtain an output value, which is taken as the score of every voxel in that patch. Because a voxel may fall into several different local lung CT patches as the window slides, and each patch yields one output value from the network, a voxel may have more than one score; the final score of each voxel is therefore the average of all the scores the voxel receives during the sliding process. If the final score of a voxel exceeds the preset voxel decision threshold H, the voxel is a nodule voxel, otherwise it is not. Connected-component analysis of the detected nodule voxels in the lung CT image then yields the position and boundary of the pulmonary nodules.
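The sliding-window scoring, per-voxel averaging, thresholding, and connected-component analysis of step S104 can be sketched in 2D as follows. This is a minimal illustration under our own assumptions: `score_patch` is a stand-in for the trained network, the image and window sizes are toy values, and all names are ours, not the patent's.

```python
import numpy as np
from collections import deque

def detect_nodules(image, score_patch, win=(4, 4), step=1, H=0.7):
    """Slide a window over `image`, score each patch, average the
    overlapping scores per voxel, threshold at H, and label the
    resulting mask by connected components."""
    h, w = image.shape
    total = np.zeros(image.shape)
    count = np.zeros(image.shape)
    for i in range(0, h - win[0] + 1, step):
        for j in range(0, w - win[1] + 1, step):
            s = score_patch(image[i:i + win[0], j:j + win[1]])
            total[i:i + win[0], j:j + win[1]] += s
            count[i:i + win[0], j:j + win[1]] += 1
    final = total / np.maximum(count, 1)  # average score per voxel
    mask = final > H                      # nodule voxels
    # 4-connected component labelling by breadth-first search
    labels = np.zeros(image.shape, dtype=int)
    next_label = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                next_label += 1
                labels[i, j] = next_label
                q = deque([(i, j)])
                while q:
                    a, b = q.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if (0 <= na < h and 0 <= nb < w and
                                mask[na, nb] and labels[na, nb] == 0):
                            labels[na, nb] = next_label
                            q.append((na, nb))
    return final, mask, labels

# A bright square plays the nodule; the mock scorer just reports
# the mean intensity of the patch.
img = np.zeros((12, 12))
img[3:7, 3:7] = 1.0
final, mask, labels = detect_nodules(img, lambda p: p.mean(), H=0.5)
print(labels.max())  # 1, i.e. one connected nodule region
```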
Clearly, the setting of the voxel decision threshold H has considerable influence on the final detection result. It can be set empirically; to obtain a more reasonable voxel decision threshold H in this embodiment, the following method is used:
An initial threshold a is chosen, and a number of lung CT images are collected with the pulmonary nodules in them annotated. Pulmonary nodule detection is then performed on these lung CT images according to the detection method of step S104, using the value a as the voxel decision threshold H, and the detection result is evaluated against the pre-annotated pulmonary nodules. If the preset index requirement is met, a is increased and detection is repeated until the requirement can no longer be met; the last value of a that met the preset index requirement is taken as the candidate threshold A. If the preset index requirement is not met initially, a is decreased and detection is repeated until the requirement is met, and the current value of a is taken as the candidate threshold A. If the candidate threshold A is greater than 0.7, the voxel decision threshold H may take any value in (0.7, A]; otherwise H = 0.7. The minimum value of 0.7 for the voxel decision threshold H is imposed to control the size of pulmonary nodules in the detection result and to avoid artificially and unnecessarily enlarging the extent of a pulmonary nodule.
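The threshold-selection procedure above can be sketched as a simple search. Here `meets_requirement` is a stand-in for running step S104 at a given threshold and checking the preset index requirement; the step size `delta` and all names are assumptions of ours.

```python
def select_threshold(meets_requirement, a=0.5, delta=0.05, max_iter=100):
    """Search for the candidate threshold A, then derive H with the
    0.7 lower bound described above."""
    if meets_requirement(a):
        # requirement met: raise a until it fails, keep the last good value
        for _ in range(max_iter):
            if meets_requirement(a + delta):
                a += delta
            else:
                break
        A = a
    else:
        # requirement not met: lower a until it is met
        for _ in range(max_iter):
            a -= delta
            if meets_requirement(a):
                break
        A = a
    # the patent allows any H in (0.7, A] when A > 0.7; here we
    # simply take H = A, and fall back to 0.7 otherwise
    H = A if A > 0.7 else 0.7
    return A, H

# toy evaluation: suppose the requirement holds for thresholds up to ~0.86
A, H = select_threshold(lambda a: a <= 0.86, a=0.5, delta=0.05)
print(round(A, 2), round(H, 2))  # 0.85 0.85
```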
To verify the technical effect of the invention, experimental verification was carried out on a batch of lung CT images. In this experiment, 814 lung CT images were used to train the deep convolutional neural network and 204 lung CT images were used for testing. Fig. 3 shows example detection results for 4 pulmonary nodules in a lung CT image. When a lung CT image contains no pulmonary nodule, its computed sensitivity is 0; therefore, only lung CT images containing at least one nodule are considered when analyzing pulmonary nodule detection performance, and normal lung CT images are excluded from the subsequent sensitivity statistics.
Several pulmonary nodule detection methods in common current use were selected for comparison with the invention. Comparison method 1 is the method described in A. Riccardi, T. S. Petkov, G. Ferri, M. Masotti, and R. Campanini, "Computer-aided detection of lung nodules via 3d fast radial transform, scale space representation, and zernike mip classification," Medical Physics, vol. 38, no. 4, pp. 1962–1971, 2011. Comparison method 2 is the method described in B. Golosio, G. L. Masala, A. Piccioli, P. Oliva, M. Carpinelli, R. Cataldo, P. Cerello, F. De Carlo, F. Falaschi, M. E. Fantacci et al., "A novel multithreshold method for nodule detection in lung ct," Medical Physics, vol. 36, no. 8, pp. 3607–3618, 2009. Comparison method 3 is the method described in T. Messay, R. C. Hardie, and S. K. Rogers, "A new computationally efficient cad system for pulmonary nodule detection in ct imagery," Medical Image Analysis, vol. 14, no. 3, pp. 390–406, 2010. Comparison method 4 is the method described in M. Tan, R. Deklerck, B. Jansen, M. Bister, and J. Cornelis, "A novel computer-aided lung nodule detection system for ct images," Medical Physics, vol. 38, no. 10, pp. 5630–5645, 2011.
Table 2 compares the technical performance of the invention with the 4 comparison methods, reporting the number of true positives, the sensitivity, and the number of false positives. In pulmonary nodule detection, setting the voxel decision threshold too low makes false-positive nodules easy to detect, while setting it too high leaves the sensitivity insufficient. Compared with the other methods, the invention still achieves relatively high sensitivity without lung segmentation and without additional exclusion of false-positive samples: the sensitivity reaches 78.9% at 20 false-positive nodules per scan and 71.2% at 10 false-positive nodules per scan, so the overall performance of the invention is satisfactory. Because the invention requires neither lung segmentation nor extra false-positive exclusion work, its implementation complexity is relatively low and its operating efficiency is high. If a lung segmentation algorithm and a false-positive exclusion algorithm were added, the accuracy could be improved further.
Table 2

Method                True positives       Sensitivity      False positives/scan
The invention         161/204 | 145/204    78.9% | 71.2%    20 | 10
Comparison method 1   83/117               71%              6.5
Comparison method 2   30/38                79%              4
Comparison method 3   118/143              82.7%            3
Comparison method 4   70/80                87.5%            4
The method proposed by the invention completes nodule detection through two cascaded stages, further improving the accuracy of the detected nodules. The fully automatic detection is highly accurate, making computer-aided detection feasible in the medical imaging field. It improves diagnostic accuracy while reducing the workload of medical staff, and thus has considerable practical significance and social value.
Although illustrative embodiments of the invention have been described above so that those skilled in the art may understand the invention, it should be clear that the invention is not limited to the scope of these embodiments. To those of ordinary skill in the art, various changes within the spirit and scope of the invention as defined and determined by the appended claims are apparent, and all innovations that make use of the inventive concept fall within the scope of protection.

Claims (2)

1. An automatic pulmonary nodule detection method based on a deep convolutional neural network, characterized by comprising the following steps:
S1: Collect a number of local lung CT images of size 5 × 20 × 20 as a training set, containing both normal lung tissue and abnormal lung tissue containing nodules; CT images containing normal lung tissue are labeled '0', and CT images of abnormal lung tissue containing nodules are labeled '1';
All CT images in the training set are normalized to obtain normalized CT images; the normalization can be expressed by the following formula:
norm(x) = (x - x_min) / (x_max - x_min)
where x is a pixel value in the CT image, norm(x) is the normalized pixel value, x_min is the minimum pixel value among the pixels of the CT image, and x_max is the maximum pixel value;
S2: Construct a deep convolutional neural network:
Layer 1 is a convolutional layer (denoted C1), which convolves the input local lung CT image with 96 convolution kernels of size 3 × 9 × 9 at stride 1;
Layer 2 is a pooling layer (denoted M1), which applies non-overlapping 1 × 2 × 2 max pooling at stride 2 to the data that C1 outputs to M1;
Layer 3 is a convolutional layer (denoted C2), which convolves the data that M1 outputs to C2 with 256 convolution kernels of size 2 × 4 × 4 at stride 1;
Layer 4 is a pooling layer (denoted M2), which applies non-overlapping 2 × 2 × 2 max pooling at stride 2 to the data that C2 outputs to M2;
Layer 5 is a convolutional layer (denoted C3), which convolves the data that M2 outputs to C3 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 6 is a convolutional layer (denoted C4), which convolves the data that C3 outputs to C4 with 384 convolution kernels of size 1 × 3 × 3 at stride 1;
Layer 7 is a convolutional layer (denoted C5), which convolves the data that C4 outputs to C5 with 256 convolution kernels of size 1 × 3 × 3 at stride 1;
Layers C1, C2, C3, C4, and C5 use the rectified linear unit (ReLU) as the activation function:
ReLu(y) = (y + |y|) / 2
where y is the input of the activation function and ReLu(y) is its output;
8th layer is pond layer (being denoted as M3 layers), the data of M3 layers is input to C5 layers using step-length as 2, non-overlapped does 1 × 2 × 2 Maximum pondization operation;
9th layer is full articulamentum (being denoted as FC1 layers), shares 1096 neurons, each neuron and M3 layers output data and 7 Parameter is connected entirely, and preceding 3 values in 7 parameters are schemed for the CT images of input depth convolutional neural networks relative to whole CT The receptive field positional information of picture and three axles, rear 4 values represent DICOM image informations, i.e. slice thickness, the x in a plane Axle, the distance between axles of y-axis, framing information;
The 10th layer is a fully connected layer (denoted FC2) with 1096 neurons in total, each fully connected to the output data of the FC1 layer. The FC1 and FC2 layers use the tanh function as the activation function:
tanh(z) = (e^z − e^(−z)) / (e^z + e^(−z))
where z denotes the input of the activation function and tanh(z) denotes its output;
The 11th layer is a fully connected layer (denoted FC3) with 2 neurons in total, each fully connected to the 1096-dimensional output data of the FC2 layer. The output of the FC3 layer is a 2-dimensional vector representing, respectively, the probability of belonging to class 0 and the probability of belonging to class 1. The FC3 layer uses the sigmoid function as the activation function:
sigmoid(p) = 1 / (1 + e^(−p))
where p denotes the input of the activation function and sigmoid(p) denotes its output;
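The three activation functions above are standard; as an illustrative cross-check (not part of the patent), they can be written directly from the formulas:

```python
import math

def relu(y):
    # ReLU(y) = (y + |y|) / 2, algebraically equal to max(0, y)
    return 0.5 * (y + abs(y))

def tanh(z):
    # tanh(z) = (e^z - e^-z) / (e^z + e^-z)
    return (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))

def sigmoid(p):
    # sigmoid(p) = 1 / (1 + e^-p)
    return 1.0 / (1.0 + math.exp(-p))
```

Note that ReLU passes positive inputs through unchanged and zeroes out negative ones, while tanh and sigmoid squash their inputs into (−1, 1) and (0, 1) respectively.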
S3: Using each lung local CT image in the pulmonary nodule training set obtained in step S1 as the input, and its corresponding class label as the desired output, train the deep convolutional neural network built in step S2 to obtain the trained deep convolutional neural network;
S4: For a lung CT image to be detected, slide a window of size 5 × 20 × 20 over the image. Feed each local lung CT patch obtained at every window position into the deep convolutional neural network trained in step S3 to obtain an output value, and assign that output value as the score of every voxel in the patch for this window position. The final score of each voxel is the average of all the scores that the voxel received during the sliding process. If a voxel's final score exceeds the preset voxel decision threshold H, it is a nodule voxel; otherwise it is not. Finally, connected-component analysis is performed on the nodule voxels detected in the lung CT image to obtain the positions and boundaries of the pulmonary nodules.
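The sliding-window scoring, per-voxel averaging, thresholding, and connected-component step of S4 can be sketched as follows. This is a minimal illustration, not the patent's implementation: `score_fn` stands in for the trained network, and the slide stride is an assumption, since the excerpt does not state it.

```python
import numpy as np
from scipy import ndimage

def detect_nodules(volume, score_fn, window=(5, 20, 20), stride=(1, 4, 4), H=0.7):
    """Slide a window over the CT volume, average per-voxel scores,
    threshold at H, and label connected components of nodule voxels."""
    score_sum = np.zeros(volume.shape, dtype=np.float64)
    counts = np.zeros(volume.shape, dtype=np.float64)
    d, h, w = window
    for z in range(0, volume.shape[0] - d + 1, stride[0]):
        for y in range(0, volume.shape[1] - h + 1, stride[1]):
            for x in range(0, volume.shape[2] - w + 1, stride[2]):
                patch = volume[z:z + d, y:y + h, x:x + w]
                s = score_fn(patch)  # network output for this window position
                # every voxel covered by the window receives this score
                score_sum[z:z + d, y:y + h, x:x + w] += s
                counts[z:z + d, y:y + h, x:x + w] += 1
    # final score = average of all scores a voxel received while sliding
    final = np.divide(score_sum, counts, out=np.zeros_like(score_sum), where=counts > 0)
    nodule_mask = final > H  # voxels above the decision threshold are nodule voxels
    labels, n_nodules = ndimage.label(nodule_mask)  # connected-component analysis
    return labels, n_nodules
```

Each connected component in `labels` then gives one candidate nodule's position and boundary.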
2. The pulmonary nodule automatic detection method according to claim 1, characterized in that the threshold H is set by the following method:
Take an initial threshold a, collect a number of lung CT images, and annotate the pulmonary nodules in them. Then run pulmonary nodule detection on these images according to the detection method of step S4, using a as the voxel decision threshold H, and evaluate the detection result against the pre-annotated pulmonary nodules. If the result meets the preset index requirement, increase a and detect again, repeating until the requirement can no longer be met; the last value of a that met the requirement is taken as the candidate threshold A. If the initial result does not meet the requirement, decrease a and detect again, repeating until the requirement is met; the current a is taken as the candidate threshold A. If A is greater than 0.7, the voxel decision threshold H takes a value in the range (0.7, A]; otherwise H = 0.7.
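The search described in this claim can be sketched as follows. The initial value a = 0.5, the step size 0.05, and the predicate `meets_requirement` are illustrative assumptions; the claim leaves all three unspecified.

```python
def calibrate_threshold(meets_requirement, a=0.5, step=0.05):
    """Find the candidate threshold A per claim 2, then derive H from it.

    meets_requirement(t) should run the step-S4 detection with voxel decision
    threshold t and return True if the result satisfies the preset index
    requirement on the pre-annotated lung CT images.
    """
    if meets_requirement(a):
        # Requirement met: raise a until it fails; the last passing value is A.
        while a + step < 1.0 and meets_requirement(round(a + step, 2)):
            a = round(a + step, 2)
    else:
        # Requirement not met: lower a until it is met; that value is A.
        while a - step > 0.0 and not meets_requirement(a):
            a = round(a - step, 2)
    A = a
    # If A > 0.7, H may be chosen in (0.7, A] (the upper end is taken here);
    # otherwise H falls back to 0.7.
    return A if A > 0.7 else 0.7
```

The rounding keeps the float steps on exact 0.05 increments so the stopping condition is not disturbed by accumulated floating-point error.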
CN201710765376.5A 2017-08-30 2017-08-30 A kind of pulmonary nodule automatic testing method based on depth convolutional neural networks Pending CN107578405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710765376.5A CN107578405A (en) 2017-08-30 2017-08-30 A kind of pulmonary nodule automatic testing method based on depth convolutional neural networks

Publications (1)

Publication Number Publication Date
CN107578405A true CN107578405A (en) 2018-01-12

Family

ID=61030081


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106372390A (en) * 2016-08-25 2017-02-01 姹ゅ钩 Deep convolutional neural network-based lung cancer preventing self-service health cloud service system
CN106504232A (en) * 2016-10-14 2017-03-15 北京网医智捷科技有限公司 A kind of pulmonary nodule automatic testing method based on 3D convolutional neural networks

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108389183A (en) * 2018-01-24 2018-08-10 上海交通大学 Pulmonary nodule detects neural network accelerator and its control method
CN108648179A (en) * 2018-04-17 2018-10-12 杭州依图医疗技术有限公司 A kind of method and device of analysis Lung neoplasm
CN108615237A (en) * 2018-05-08 2018-10-02 上海商汤智能科技有限公司 A kind of method for processing lung images and image processing equipment
CN108615237B (en) * 2018-05-08 2021-09-07 上海商汤智能科技有限公司 Lung image processing method and image processing equipment
CN108898595A (en) * 2018-06-27 2018-11-27 慧影医疗科技(北京)有限公司 A kind of construction method of thoracopathy detection model and application
CN108898595B (en) * 2018-06-27 2021-02-19 慧影医疗科技(北京)有限公司 Construction method and application of positioning model of focus region in chest image
CN108742679A (en) * 2018-06-29 2018-11-06 上海联影医疗科技有限公司 Nodule detection device and method
CN109871869A (en) * 2019-01-11 2019-06-11 五邑大学 A kind of Lung neoplasm classification method and its device
CN109871869B (en) * 2019-01-11 2023-03-21 五邑大学 Pulmonary nodule classification method and device
CN110175989A (en) * 2019-05-08 2019-08-27 常州市第二人民医院 Video data processing method and its device
CN112150429A (en) * 2020-09-18 2020-12-29 南京师范大学 Attention mechanism guided kidney CT image segmentation method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180112