CN111598868B - Lung ultrasonic image identification method and system - Google Patents


Info

Publication number
CN111598868B
Authority
CN
China
Prior art keywords
image
lung
normalized
images
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010409334.XA
Other languages
Chinese (zh)
Other versions
CN111598868A (en)
Inventor
朱瑞星
黄孟钦
刘西耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Shenzhi Information Technology Co ltd
Original Assignee
Shanghai Shenzhi Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Shenzhi Information Technology Co ltd filed Critical Shanghai Shenzhi Information Technology Co ltd
Priority to CN202010409334.XA priority Critical patent/CN111598868B/en
Publication of CN111598868A publication Critical patent/CN111598868A/en
Application granted granted Critical
Publication of CN111598868B publication Critical patent/CN111598868B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 70/00 ICT specially adapted for the handling or processing of medical references
    • G16H 70/60 ICT specially adapted for the handling or processing of medical references relating to pathologies
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30061 Lung

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a lung ultrasound image identification method and system. Acquired continuous lung ultrasound images are preprocessed, and two pre-trained neural network models analyze the preprocessed images to determine whether the lung exhibits a lung sliding sign, an A-line sign, a B-line sign, or a lung consolidation sign; the continuous ultrasound images are then classified and labeled accordingly. Identification of lung B-mode ultrasound images is thereby shifted from manual reading to automatic recognition, automatically assisting in screening for and judging lung disease conditions.

Description

Lung ultrasonic image identification method and system
Technical Field
The invention relates to the fields of medical image analysis and machine learning, and in particular to a lung ultrasound image identification method and system.
Background
Smoking, air pollution, viral infection, and similar causes can lead to obstructive pulmonary diseases such as emphysema, pneumoconiosis, tuberculosis, and asthma. Patients mostly present with dyspnea, chest tightness, shortness of breath, limb weakness, chronic cough, and expectoration, and over time the condition can progress to pulmonary heart disease, seriously affecting normal life.
B-mode ultrasound is a relatively reliable screening method for lung diseases, but analyzing and identifying B-mode ultrasound image features depends on the visual observation and diagnostic experience of physicians. In clinical practice, conventional analysis and classification of B-mode ultrasound image features remains a costly and difficult task: it can only be completed manually, at great expense of manpower and material resources, by a small number of highly experienced and well-trained physicians and experts, and manual diagnosis is highly subjective and poorly reproducible, so it is difficult to train an excellent B-mode ultrasound diagnostician in a short time. Especially in countries with large populations and heavy medical demand, hospitals in many remote areas face a huge shortage of B-mode ultrasound diagnosticians, while diagnosticians in large cities face heavy workloads. An overloaded workload is inevitably accompanied by an increased misdiagnosis rate, and slight negligence can have extremely serious consequences for patients and attending physicians.
Disclosure of Invention
In order to overcome the high difficulty, low efficiency, and low precision of the conventional ultrasound-image-based lung disease screening mode, the invention provides a fast, efficient, and precise deep-learning-based lung ultrasound image identification method and system, which automatically analyze lung ultrasound images and can effectively assist in screening for and judging lung disease conditions from ultrasound images.
A lung ultrasonic image identification method comprises the following steps:
s1, collecting continuous ultrasonic images of the lung of a detected person;
s2, preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged, and forming all the images to be identified into an image set to be detected;
s3, inputting the image set to be detected into a first classification model formed by pre-training to identify and obtain a first type of pathological feature result of the lung, and executing the step S5;
s4, inputting the image set to be detected into a second classification model formed by pre-training to identify and obtain a second pathological feature result of the lung, and executing the step S5;
s5, outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results;
step S3 and step S4 are performed simultaneously.
Further, in step S2, the preprocessing process specifically includes the following steps:
step S21, obtaining an effective area image in each frame of ultrasonic image of the continuous ultrasonic images;
s22, performing graying and normalization processing on each frame of effective area image to obtain a normalized grayscale image;
step S23, eliminating relatively static images in all the normalized gray level images, and reserving the normalized gray level images with specific frame numbers;
and S24, respectively scaling each reserved frame of normalized gray level image into a uniform size and forming an image to be identified to obtain an image set to be detected.
Further, in step S23, the process of rejecting the relatively still image specifically includes:
step S231, calculating a correlation coefficient between the normalized grayscale image of the current frame and the normalized grayscale image of the previous frame;
step S232, judging whether the correlation coefficient is larger than a preset value, and keeping the normalized gray level image of the current frame corresponding to the correlation coefficient which is not larger than the preset value;
step S233, comparing the actual number of frames of the retained normalized gradation image with the preset specific number of frames:
if the actual frame number is greater than the specific frame number, correspondingly reducing the preset value, and returning to the step S232;
if the actual frame number is equal to the specific frame number, executing step S24;
if the actual frame number is less than the specific frame number, the preset value is correspondingly increased, and the step S232 is returned to.
Further, in step S231, the correlation coefficient is calculated according to the following formula:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image;
the subscript y denotes the vertical coordinate of each pixel in the normalized grayscale image;
the subscript x' denotes the horizontal offset applied to a pixel's horizontal coordinate;
the subscript y' denotes the vertical offset applied to a pixel's vertical coordinate;
I_prev(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the previous frame;
I_cur(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the current frame;
R(x, y) denotes the correlation coefficient.
Further, in step S3, the first type of pathological feature result is a binary screening result characterizing the lung sliding sign or the non-lung-sliding sign.
Further, in step S4, one frame of the image to be identified in the image set to be detected is selected and sent to the second classification model for identification, and the output second type pathological feature result is a three-class screening result representing an A-line sign, a B-line sign, or a lung consolidation sign.
The lung ultrasonic image identification system is applied to the lung ultrasonic image identification method, and comprises the following steps:
the image acquisition module is used for acquiring continuous ultrasonic images of the lungs of the detected person;
the preprocessing module is connected with the image acquisition module and used for preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged and forming all the images to be identified into an image set to be detected;
the first analysis module is connected with the preprocessing module and used for inputting the image set to be detected into a first classification model formed by pre-training to identify and obtain a first type of pathological feature result of the lung;
the second analysis module is connected with the preprocessing module and used for inputting the image set to be detected into a second classification model formed by pre-training to identify and obtain a second type pathological feature result of the lung;
and the identification module is respectively connected with the first analysis module and the second analysis module and is used for outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results.
Further, the preprocessing module specifically includes:
the extraction unit is used for acquiring an effective area image in each frame of ultrasonic image of the continuous ultrasonic images;
the conversion unit is connected with the extraction unit and is used for respectively carrying out graying and normalization processing on each frame of effective area image to obtain a normalized grayscale image;
the selecting unit is connected with the converting unit and is used for rejecting relatively static images in all the normalized gray level images and reserving the normalized gray level images with specific frame numbers;
and the scaling unit is connected with the selection unit and used for scaling each reserved frame of normalized grayscale image into a uniform size and forming an image to be identified, so as to obtain an image set to be detected.
Further, the selecting unit includes:
a calculating unit for calculating a correlation coefficient between the normalized gray-scale image of the current frame and the normalized gray-scale image of the previous frame;
the retention unit is connected with the calculation unit and used for judging whether the correlation coefficient is greater than a preset value or not and retaining the normalized gray level image of the current frame corresponding to the correlation coefficient which is not greater than the preset value;
and the judging unit is connected with the retaining unit and is used for comparing the actual frame number of the retained normalized gray-scale image with the preset specific frame number: if the actual frame number is equal to the specific frame number, transmitting the normalized gray level image of the specific frame number to a scaling unit; if the actual frame number is greater than or less than the specific frame number, transmitting the comparison result to a setting unit;
the setting unit is respectively connected with the judging unit and the retention unit, is used for presetting the preset value, and makes the following settings according to the comparison result: if the comparison result is that the actual frame number is greater than the specific frame number, the preset value is correspondingly decreased; if the comparison result is that the actual frame number is less than the specific frame number, the preset value is correspondingly increased; it is also used for sending the preset value, the decreased preset value, and the increased preset value to the retention unit.
Further, the calculating unit is configured to calculate the correlation coefficient according to the following formula:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image;
the subscript y denotes the vertical coordinate of each pixel in the normalized grayscale image;
the subscript x' denotes the horizontal offset applied to a pixel's horizontal coordinate;
the subscript y' denotes the vertical offset applied to a pixel's vertical coordinate;
I_prev(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the previous frame;
I_cur(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the current frame;
R(x, y) denotes the correlation coefficient.
The beneficial technical effects of the invention are as follows:
the invention carries out early intelligent auxiliary screening on the lung diseases based on the ultrasonic images, and utilizes the convolutional neural network to extract the image characteristics so as to realize the judgment of the lung disease diseased condition. Compared with the prior art, the method has the technical advantages that:
1. the ultrasonic image is analyzed through the neural network, the lung B ultrasonic image is identified from manual identification, and the lung disease condition is automatically screened and judged in an auxiliary mode.
2. The method has the advantages that the effective area of the ultrasonic image is zoomed to a certain size, the relatively static ultrasonic image is removed, the result accuracy is guaranteed, and the training difficulty of the network is greatly reduced.
3. The pulmonary glide sign/non-pulmonary glide sign uses a neural network model, and the A line sign, the B line sign or the pulmonary compaction sign uses another neural network model, so that the parallel processing is realized, the efficiency is higher, and the result is faster. The missed diagnosis condition is improved, and the value of the B ultrasonic in the lung examination is further improved.
Deep learning not only has great clinical significance for accurate assessment of lesions, but also brings potential and hope in imaging assessment.
Drawings
Fig. 1-3 are flow diagrams of steps of various method embodiments of the present invention.
Fig. 4-6 are block schematic diagrams of various system embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The invention is further described with reference to the following drawings and specific examples, which are not intended to be limiting.
Referring to fig. 1-3, the present invention provides a method for identifying an ultrasound image of a lung, comprising the following steps:
s1, acquiring continuous ultrasonic images of the lung of a detected person.
In the present invention, the lung ultrasound BLUE protocol can be used; it is a rapid detection protocol (< 3 minutes) that includes examination of the (venous) blood vessels and can be used for the differential diagnosis of patients with acute respiratory failure. According to the BLUE protocol, an ultrasonic diagnostic apparatus acquires ultrasound images at the upper or lower BLUE point; the probe is held still during a single acquisition, and images are acquired continuously for 3-5 seconds.
And S2, preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged, and forming an image set to be detected by all the images to be identified.
And S3, inputting the image set to be detected into a first classification model formed by pre-training to identify a first type of pathological feature result of the lung, and executing the step S5.
That is, in step S3, the first type of pathological feature result is a binary screening result representing the lung sliding sign or the non-lung-sliding sign.
And S4, inputting the image set to be detected into a second classification model formed by pre-training to identify a second pathological feature result of the lung, and executing the step S5.
In step S4, one frame of the image to be identified in the image set to be detected is selected and sent into the second classification model for identification, and the output second type pathological feature result is a three-class screening result representing an A-line sign, a B-line sign, or a lung consolidation sign.
And S5, outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results.
Step S3 and step S4 are performed simultaneously.
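As an illustrative sketch only (the patent does not prescribe an implementation), the two classification steps can be dispatched concurrently with a thread pool; first_model and second_model stand for any callables wrapping the two pre-trained classifiers:

```python
from concurrent.futures import ThreadPoolExecutor

def classify_parallel(first_model, second_model, image_set, middle_frame):
    """Run step S3 and step S4 concurrently and return both results."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        fut_s3 = pool.submit(first_model, image_set)      # lung sliding / non-sliding
        fut_s4 = pool.submit(second_model, middle_frame)  # A-line / B-line / consolidation
        return fut_s3.result(), fut_s4.result()
```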
Further, in step S2, the preprocessing process specifically includes the following steps.
Step S21, an effective region image in each frame of ultrasound images of the consecutive ultrasound images is acquired.
An effective region of interest (ROI) is extracted from the single-frame ultrasound image; the ROI detection box can be set manually, and after it is extracted on the first ultrasound frame, subsequent ultrasound frames continue to use the same ROI detection box.
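A minimal sketch of reusing the manually set ROI box across the sequence, assuming each frame is a NumPy-style array; the (x, y, w, h) box layout is an illustrative choice, not from the patent:

```python
def crop_roi(frames, roi):
    """Apply the ROI detection box set on the first frame to every frame."""
    x, y, w, h = roi  # illustrative (x, y, width, height) layout
    return [frame[y:y + h, x:x + w] for frame in frames]
```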
And S22, performing graying and normalization processing on each frame of effective area image to obtain a normalized grayscale image.
Graying the single-frame effective area image by adopting the following formula:
Gray = 0.299·R + 0.587·G + 0.114·B, where R, G, and B are the red, green, and blue components, respectively.
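A sketch of the graying and normalization step under the stated formula, assuming 8-bit RGB input and normalization to [0, 1] (the patent does not spell out the normalization range):

```python
import numpy as np

def to_normalized_gray(rgb):
    """Gray = 0.299*R + 0.587*G + 0.114*B, then scale 8-bit values to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]  # assumes RGB channel order
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return (gray / 255.0).astype(np.float32)
```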
And step S23, eliminating relatively static images in all the normalized gray level images, and reserving the normalized gray level images with specific frame numbers.
And S24, respectively scaling each reserved frame of normalized gray level image into a uniform size and forming an image to be identified to obtain an image set to be detected.
Each reserved frame of the normalized grayscale image is scaled to an image to be identified of uniform size, such as 224 × 224; preferably, bilinear interpolation, cubic spline interpolation, Lanczos interpolation, or a similar method is used. All the images to be identified form an image set to be detected of size 224 × 224 × frames, where frames denotes the specific frame number.
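A sketch of the uniform scaling step using OpenCV; cv2.INTER_CUBIC or cv2.INTER_LANCZOS4 can be swapped in for the other interpolators named above:

```python
import cv2

def resize_frame(gray, size=224, interpolation=cv2.INTER_LINEAR):
    """Scale one retained normalized grayscale frame to size x size."""
    return cv2.resize(gray, (size, size), interpolation=interpolation)
```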
Further, in step S23, the process of rejecting the relatively still image specifically includes the following steps.
In step S231, a correlation coefficient between the normalized grayscale image of the current frame and the normalized grayscale image of the previous frame is calculated.
The correlation coefficient is calculated according to the following formula:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image; the subscript y denotes the vertical coordinate of each pixel; the subscripts x' and y' denote the horizontal and vertical offsets of a pixel's coordinates; I_prev(x, y) and I_cur(x, y) denote the gray values of the pixel at coordinate (x, y) in the normalized grayscale images of the previous and current frames, respectively; R(x, y) denotes the correlation coefficient.
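A sketch of the correlation coefficient for two aligned, same-size frames, i.e. the formula above evaluated at zero offset (this matches the normalized product correlation the description refers to):

```python
import numpy as np

def frame_correlation(prev, cur):
    """Normalized product correlation of two same-size grayscale frames."""
    num = float(np.sum(prev * cur))
    den = float(np.sqrt(np.sum(prev ** 2) * np.sum(cur ** 2)))
    return num / den if den > 0.0 else 0.0
```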
Step S232, determine whether the correlation coefficient is greater than a preset value, and keep the normalized gray-scale image of the current frame corresponding to the correlation coefficient not greater than the preset value.
In step S233, the actual number of frames of the retained normalized gradation image is compared with a preset specific number of frames.
If the actual frame number is greater than the specific frame number, the preset value is correspondingly decreased, and the step S232 is returned to.
If the actual frame number is equal to the specific frame number, executing the step S24;
if the actual frame number is less than the specific frame number, the preset value is correspondingly increased, and the step S232 is returned to.
When R(x, y) is greater than the preset value, for example R(x, y) > 0.95, the current-frame normalized grayscale image is considered strongly correlated with the previous-frame normalized grayscale image, i.e. a relatively static image, and it is removed. If R(x, y) is not greater than 0.95, the current-frame normalized grayscale image is not considered relatively static and is retained.
During identification by the subsequently used pre-trained neural network models, the number of input layers is set in advance and each input layer carries one frame of image data, so a specific number of frames is retained to match the number of input layers. Accordingly, in the acquisition process of step S1, enough frames must be collected to satisfy the models' input layer count: for example, if the number of input layers is set to 30, at least 30 consecutive ultrasound frames are acquired; with a 3-second acquisition, it is desirable to acquire more than 10 frames per second.
As one embodiment of retaining a specific number of frames, step S233 is performed after the above step S232: judge whether the number of retained normalized grayscale frames equals the specific frame number; if so, proceed to step S24; otherwise, adjust the preset value and execute step S232 again.
Specifically, if the number of normalized grayscale frames remaining after removal of the relatively static images is greater than the specific frame number, the preset value is decreased to form a new preset value and step S232 is executed again, i.e. whether R(x, y) is greater than the new preset value is judged, and the current frame's normalized grayscale image is removed when it is. The adjustment can be made stepwise, changing by a unit value of 0.005-0.05 each time, such as 0.01, 0.02, or 0.04; a stepwise scheme is not mandatory.
For example, with a specific frame number of 30 and a preset value of 0.95, if more than 30 normalized grayscale frames remain after step S232, the preset value is adjusted to 0.94 and step S232 is executed again, until exactly 30 normalized grayscale frames remain.
Specifically, if the number of normalized grayscale frames remaining after removal of the relatively static images is less than the specific frame number, the preset value is increased to form a new preset value and step S232 is executed again, i.e. whether R(x, y) is greater than the new preset value is judged, and the current frame's normalized grayscale image is removed when it is. The adjustment can be made stepwise, changing by a unit value of 0.005-0.05 each time, such as 0.01, 0.02, or 0.04.
For example, with a specific frame number of 30 and a preset value of 0.95, if fewer than 30 normalized grayscale frames remain after step S232, the preset value is adjusted to 0.96 and step S232 is executed again, until exactly 30 normalized grayscale frames remain.
Of course, it is also possible to set the initial preset value to 0.99, execute step S232 to remove the normalized grayscale images with R(x, y) > 0.99, and then lower the preset value step by step, repeating step S232 after each adjustment until the specific number of frames is retained. Each decrement can be a unit value of 0.005-0.05, such as 0.01, 0.02, or 0.04; with a unit value of 0.01, the preset value decreases as 0.99 → 0.98 → 0.97 → 0.96 → 0.95. The unit value can also be adjusted as appropriate.
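A sketch of the threshold-adjustment loop of steps S232-S233, reusing frame_correlation from the sketch above; keeping the first frame unconditionally and the max_iter guard against non-convergence are added assumptions, not patent text:

```python
def select_frames(frames, target=30, preset=0.95, step=0.01, max_iter=100):
    """Lower the preset value when too many frames survive, raise it when
    too few, until `target` frames remain (steps S232-S233)."""
    kept = list(frames)
    for _ in range(max_iter):
        kept = [frames[0]] + [cur for prev, cur in zip(frames, frames[1:])
                              if frame_correlation(prev, cur) <= preset]
        if len(kept) == target:
            break
        preset += -step if len(kept) > target else step
    return kept
```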
As a second embodiment of retaining a specific number of frames, when the number of retained normalized grayscale frames does not equal the specific frame number, methods other than threshold adjustment may also be used, such as removal or restoration at average intervals: frames in excess of the specific frame number are removed from the retained normalized grayscale images at even intervals, and when too few remain, frames are taken back from the removed normalized grayscale images at even intervals, as sketched after the following examples.
For example, with a specific frame number of 30 and 35 normalized grayscale frames currently retained, 5 frames need to be removed, so one frame is removed every seven frames.
For another example, with a specific frame number of 30, if 25 normalized grayscale frames are currently retained and 10 frames have been removed, 5 frames need to be restored, so one of every two frames is taken back from the removed normalized grayscale images.
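A sketch of this average-interval variant; restoring the temporal order of pulled-back frames is omitted for brevity:

```python
import numpy as np

def adjust_evenly(kept, removed, target=30):
    """Drop surplus frames from `kept` at even intervals, or pull frames
    back from `removed` at even intervals when too few remain."""
    if len(kept) > target:
        drop = set(np.linspace(0, len(kept) - 1, len(kept) - target, dtype=int))
        return [f for i, f in enumerate(kept) if i not in drop]
    if len(kept) < target and removed:
        pick = np.linspace(0, len(removed) - 1, target - len(kept), dtype=int)
        return kept + [removed[i] for i in pick]
    return kept
```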
The first classification model is a first neural network model: the image set to be detected is input into the pre-constructed first neural network model for identification, and a binary screening result representing the lung sliding sign or the non-lung-sliding sign is output.
Preferably, the invention uses a MobileNet V3 convolutional neural network as the first classification model. The 224 × 224 × frames image set to be detected serves as the input data: the first input layer of the MobileNet V3 network is modified to accept 224 × 224 × frames, and the number of output classes is set to 2, so the network identifies whether the image features of the image set are lung sliding features. The output of the MobileNet V3 network is a binary classification result (lung sliding sign / non-lung-sliding sign), according to which the acquired continuous ultrasound images can be classified and labeled.
frames can preferably be set to 30 or another suitable number. That is, the normalized product correlation algorithm removes relatively static inter-frame images while satisfying the MobileNet V3 network's requirement on the number of input frames; see steps S231-S233 above.
The structural parameters of the MobileNet V3-Small convolution neural network adopted by the invention are shown in Table 1.
Table 1: mobileNet V3-Small convolution neural network structure
(Table 1 appears as an image in the original patent; the layer-by-layer structure of the MobileNet V3-Small network is not reproduced here.)
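The patent does not name a framework; a minimal PyTorch/torchvision sketch of the described modifications — a 224 × 224 × frames input stack and a 2-class output head — might look like this:

```python
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v3_small

def build_first_model(frames=30, num_classes=2):
    """MobileNet V3-Small with a `frames`-channel stem and a 2-class head."""
    net = mobilenet_v3_small()
    stem = net.features[0][0]  # first Conv2d of the stem
    net.features[0][0] = nn.Conv2d(frames, stem.out_channels,
                                   kernel_size=stem.kernel_size,
                                   stride=stem.stride,
                                   padding=stem.padding,
                                   bias=False)
    net.classifier[3] = nn.Linear(net.classifier[3].in_features, num_classes)
    return net

# logits = build_first_model()(torch.randn(1, 30, 224, 224))  # shape [1, 2]
```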
The second classification model can be a second neural network model: a selected frame of the image to be identified from the image set to be detected is input into the pre-constructed second neural network model for identification, and a three-class identification result representing the A-line sign, the B-line sign, or the lung consolidation sign is output. Step S3 and step S4 are processed in parallel.
Preferably, the invention adopts a ResNet50 neural network model as the second classification model. Before the ResNet50 network detects image features in the preprocessed image set to be detected, one frame of the image to be identified is first extracted from the set; the middle frame is generally preferred, and only this single frame, i.e. 224 × 224 × 1, is used as the input layer of the ResNet50 network. The ResNet50 network then detects the image features of this frame, with the number of output classes modified to 3, to judge whether the features show an A-line sign, a B-line sign, or a lung consolidation sign; the second type pathological feature result is accordingly one of these three classes. The continuous ultrasound images can be classified and labeled according to this three-class identification result. The neural network structural parameters of ResNet50 are shown in Table 2.
Table 2: neural network structure parameters of ResNet50
(Table 2 appears as an image in the original patent; the layer-by-layer structure of the ResNet50 network is not reproduced here.)
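Under the same framework assumption, a sketch of the second classifier — a single-channel 224 × 224 input (the middle frame) and a 3-class head:

```python
import torch.nn as nn
from torchvision.models import resnet50

def build_second_model(num_classes=3):
    """ResNet50 with a 1-channel stem and a 3-class head
    (A-line / B-line / lung consolidation)."""
    net = resnet50()
    net.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net

def pick_middle_frame(stack):
    """(N, frames, H, W) -> (N, 1, H, W): the preferred middle frame."""
    mid = stack.shape[1] // 2
    return stack[:, mid:mid + 1]
```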
Referring to fig. 4-6, the present invention further provides a lung ultrasound image identification system, which applies the lung ultrasound image identification method described above and comprises:
the image acquisition module 1 is used for acquiring continuous ultrasonic images of the lungs of a detected person;
the preprocessing module 2 is connected with the image acquisition module 1 and is used for preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged and forming an image set to be detected by the images to be identified;
the first analysis module 3 is connected with the preprocessing module 2 and used for inputting the image set to be detected into a first classification model formed by pre-training to identify and obtain a first type of pathological feature result of the lung;
the second analysis module 4 is connected with the preprocessing module 2 and used for inputting the image set to be detected into a second classification model formed by pre-training to identify and obtain a second type pathological feature result of the lung;
and the identification module 5 is respectively connected with the first analysis module 3 and the second analysis module 4 and is used for outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results.
That is, the continuous ultrasound images are classified and labeled according to the identification results.
Further, the preprocessing module 2 specifically includes:
an extracting unit 21 configured to acquire an effective region image in each frame of ultrasound images of consecutive ultrasound images;
the conversion unit 22 is connected with the extraction unit 21 and is used for performing graying and normalization processing on each frame of effective area image to obtain a normalized grayscale image;
the selecting unit 23 is connected with the converting unit 22 and is used for eliminating relatively static images in all the normalized gray level images and reserving the normalized gray level images with specific frame numbers;
and the scaling unit 24 is connected with the selecting unit 23 and is used for scaling each reserved frame of the normalized grayscale image into a uniform size and forming the image to be identified, so as to obtain an image set to be detected.
Further, the selecting unit 23 includes: a calculating unit 231 for calculating a correlation coefficient between the normalized grayscale image of the current frame and the normalized grayscale image of the previous frame;
a retaining unit 232, connected to the calculating unit 231, for determining whether the correlation coefficient is greater than a preset value, and retaining the normalized grayscale image of the current frame corresponding to the correlation coefficient not greater than the preset value;
a judging unit 233, connected to the holding unit 232, for comparing the actual frame number of the held normalized grayscale image with a preset specific frame number: if the actual frame number is equal to the specific frame number, transmitting the normalized gray level image of the specific frame number to a scaling unit; if the actual frame number is larger than or smaller than the specific frame number, transmitting the comparison result to a setting unit;
a setting unit 234, respectively connected to the judging unit 233 and the retaining unit 232, for presetting the preset value and for making the following settings according to the comparison result: if the comparison result is that the actual frame number is greater than the specific frame number, the preset value is correspondingly decreased; if the comparison result is that the actual frame number is less than the specific frame number, the preset value is correspondingly increased; it is also used for sending the preset value, the decreased preset value, and the increased preset value to the retaining unit 232.
Further, the calculating unit is configured to calculate the correlation coefficient according to the following formula:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image;
the subscript y denotes the vertical coordinate of each pixel in the normalized grayscale image;
the subscript x' denotes the horizontal offset applied to a pixel's horizontal coordinate;
the subscript y' denotes the vertical offset applied to a pixel's vertical coordinate;
I_prev(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the previous frame;
I_cur(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the current frame;
R(x, y) denotes the correlation coefficient.
Here, the first type of pathological feature result is a binary screening result characterizing the lung sliding sign or the non-lung-sliding sign.
The second analysis module 4 is specifically configured to select one frame of the image to be identified in the image set to be detected and send it into the second classification model for identification; the output second type pathological feature result is a three-class screening result representing an A-line sign, a B-line sign, or a lung consolidation sign.
The invention performs early intelligent auxiliary screening of lung diseases based on ultrasound images and uses convolutional neural networks to extract image features in order to judge lung disease conditions. Compared with the prior art, it has the following technical advantages:
1. The ultrasound images are analyzed by neural networks, so identification of lung B-mode ultrasound images shifts from manual reading to automatic recognition, automatically assisting in screening for and judging lung disease conditions.
2. The effective area of the ultrasound image is scaled to a fixed size and relatively static ultrasound images are removed, which guarantees result accuracy and greatly reduces the difficulty of training the networks.
3. The lung sliding sign / non-lung-sliding sign uses one neural network model, while the A-line sign, B-line sign, or lung consolidation sign uses another, so the two are processed in parallel, yielding higher efficiency and faster results; missed diagnoses are reduced, further increasing the value of B-mode ultrasound in lung examination.
Deep learning not only has great clinical significance for the accurate assessment of lesions, but also shows great potential for imaging-based assessment.
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.

Claims (6)

1. A lung ultrasonic image identification method is characterized by comprising the following steps:
s1, collecting continuous ultrasonic images of the lung of a detected person;
s2, preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged, and forming all the images to be identified into an image set to be detected;
s3, inputting the image set to be detected into a first classification model formed by pre-training to identify and obtain a first type of pathological feature result of the lung, and executing the step S5;
s4, inputting the image set to be detected into a second classification model formed by pre-training to identify and obtain a second pathological feature result of the lung, and executing the step S5;
s5, outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results;
the step S3 and the step S4 are executed simultaneously;
in the step S2, the preprocessing process specifically includes the following steps:
step S21, obtaining an effective area image in each frame of ultrasonic image of the continuous ultrasonic images;
s22, performing graying and normalization processing on each frame of effective area image to obtain a normalized gray image;
s23, eliminating relatively static images in all the normalized gray level images, and reserving the normalized gray level images with specific frame numbers;
s24, respectively scaling each reserved frame of the normalized gray level image into a uniform size and forming the image to be identified to obtain an image set to be detected;
in step S23, the process of removing the relatively still image specifically includes:
step S231 of calculating a correlation coefficient between the normalized grayscale image of the current frame and the normalized grayscale image of the previous frame;
step S232, judging whether the correlation coefficient is larger than a preset value or not, and keeping the normalized gray-scale image of the current frame corresponding to the correlation coefficient which is not larger than the preset value;
step S233, comparing the actual number of frames of the normalized grayscale image that is retained with the preset specific number of frames:
if the actual frame number is greater than the specific frame number, correspondingly reducing the preset value, and returning to the step S232;
if the actual frame number is equal to the specific frame number, executing the step S24;
and if the actual frame number is less than the specific frame number, correspondingly increasing the preset value, and returning to the step S232.
2. The method for identifying ultrasound images of the lung as claimed in claim 1, wherein in step S231, the correlation coefficient is calculated according to the following formula:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image;
the subscript y denotes the vertical coordinate of each pixel in the normalized grayscale image;
the subscript x' denotes the horizontal offset applied to a pixel's horizontal coordinate;
the subscript y' denotes the vertical offset applied to a pixel's vertical coordinate;
I_prev(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the previous frame;
I_cur(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the current frame;
R(x, y) denotes the correlation coefficient.
3. The method for identifying ultrasound images of the lung as claimed in claim 1, wherein in the step S3, the first type of pathological feature result is a binary screening result characterizing the lung sliding sign or the non-lung-sliding sign.
4. The method for identifying ultrasound images of the lung as claimed in claim 1, wherein in the step S4, one frame of the image to be identified in the image set to be detected is selected and sent to the second classification model for identification, and the output second type of pathological feature result is a three-class screening result representing an A-line sign, a B-line sign, or a lung consolidation sign.
5. A lung ultrasound image identification system, which is applied to the lung ultrasound image identification method of claim 1, and comprises:
the image acquisition module is used for acquiring continuous ultrasonic images of the lungs of the detected person;
the preprocessing module is connected with the image acquisition module and is used for preprocessing the continuous ultrasonic images to obtain a plurality of frames of images to be identified which are sequentially arranged and forming all the images to be identified into an image set to be detected;
the first analysis module is connected with the preprocessing module and used for inputting the image set to be detected into a first classification model formed by pre-training to identify and obtain a first type of pathological feature result of the lung;
the second analysis module is connected with the preprocessing module and used for inputting the image set to be detected into a second classification model formed by pre-training to identify and obtain a second type of pathological feature result of the lung;
the identification module is respectively connected with the first analysis module and the second analysis module and used for outputting the first type of pathological feature result and the second type of pathological feature result as lung ultrasonic detection results;
the preprocessing module specifically comprises:
the extraction unit is used for acquiring an effective area image in each frame of ultrasonic image of the continuous ultrasonic images;
the conversion unit is connected with the extraction unit and is used for respectively carrying out graying and normalization processing on each frame of effective area image to obtain a normalized gray image;
the selecting unit is connected with the converting unit and is used for eliminating relatively static images in all the normalized gray level images and reserving the normalized gray level images with specific frame numbers;
the scaling unit is connected with the selection unit and used for scaling the reserved normalized gray level images of each frame into a uniform size and forming the image to be identified so as to obtain an image set to be detected;
the selecting unit comprises:
a calculation unit for calculating a correlation coefficient between the normalized grayscale image of a current frame and the normalized grayscale image of a previous frame;
the retaining unit is connected with the calculating unit and is used for judging whether the correlation coefficient is larger than a preset value and retaining the normalized grayscale image of the current frame corresponding to a correlation coefficient not larger than the preset value;
a judging unit, connected to the retaining unit, for comparing the actual frame number of the retained normalized grayscale image with the preset specific frame number: if the actual frame number is equal to the specific frame number, transmitting the normalized gray level image of the specific frame number to the scaling unit; if the actual frame number is greater than or less than the specific frame number, transmitting a comparison result to a setting unit;
the setting unit is respectively connected with the judging unit and the retaining unit, is used for presetting the preset value, and makes the following settings according to the comparison result: if the comparison result is that the actual frame number is greater than the specific frame number, the preset value is correspondingly decreased; if the comparison result is that the actual frame number is smaller than the specific frame number, the preset value is correspondingly increased; it is also used for sending the preset value, the decreased preset value, and the increased preset value to the retaining unit.
6. The system of claim 5, wherein the computing unit is configured to compute the correlation coefficient according to the following equation:
$$R(x,y)=\frac{\sum_{x',y'}\bigl(I_{\mathrm{prev}}(x',y')\cdot I_{\mathrm{cur}}(x+x',\,y+y')\bigr)}{\sqrt{\sum_{x',y'}I_{\mathrm{prev}}(x',y')^{2}\cdot\sum_{x',y'}I_{\mathrm{cur}}(x+x',\,y+y')^{2}}}$$
wherein the subscript x denotes the horizontal coordinate of each pixel in the normalized grayscale image;
the subscript y denotes the vertical coordinate of each pixel in the normalized grayscale image;
the subscript x' denotes the horizontal offset applied to a pixel's horizontal coordinate;
the subscript y' denotes the vertical offset applied to a pixel's vertical coordinate;
I_prev(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the previous frame;
I_cur(x, y) denotes the gray value of the pixel at coordinate (x, y) in the normalized grayscale image of the current frame;
R(x, y) denotes the correlation coefficient.
CN202010409334.XA 2020-05-14 2020-05-14 Lung ultrasonic image identification method and system Active CN111598868B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010409334.XA CN111598868B (en) 2020-05-14 2020-05-14 Lung ultrasonic image identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010409334.XA CN111598868B (en) 2020-05-14 2020-05-14 Lung ultrasonic image identification method and system

Publications (2)

Publication Number Publication Date
CN111598868A CN111598868A (en) 2020-08-28
CN111598868B true CN111598868B (en) 2022-12-30

Family

ID=72187397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010409334.XA Active CN111598868B (en) 2020-05-14 2020-05-14 Lung ultrasonic image identification method and system

Country Status (1)

Country Link
CN (1) CN111598868B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112861881A (en) * 2021-03-08 2021-05-28 太原理工大学 Honeycomb lung recognition method based on improved MobileNet model
CN113763353A (en) * 2021-09-06 2021-12-07 杭州类脑科技有限公司 Lung ultrasonic image detection system
CN114727117A (en) * 2022-03-04 2022-07-08 上海深至信息科技有限公司 Packing method of ultrasonic scanning video


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110461240A (en) * 2017-01-19 2019-11-15 纽约大学 System, method and computer accessible for ultrasonic analysis
CN110678933A (en) * 2017-03-28 2020-01-10 皇家飞利浦有限公司 Ultrasound clinical feature detection and association apparatus, systems, and methods
WO2020020809A1 (en) * 2018-07-26 2020-01-30 Koninklijke Philips N.V. Ultrasound system with an artificial neural network for guided liver imaging
CN109616195A (en) * 2018-11-28 2019-04-12 武汉大学人民医院(湖北省人民医院) The real-time assistant diagnosis system of mediastinum endoscopic ultrasonography image and method based on deep learning
CN109741312A (en) * 2018-12-28 2019-05-10 上海联影智能医疗科技有限公司 A kind of Lung neoplasm discrimination method, device, equipment and medium
CN110766051A (en) * 2019-09-20 2020-02-07 四川大学华西医院 Lung nodule morphological classification method based on neural network
CN110807495A (en) * 2019-11-08 2020-02-18 腾讯科技(深圳)有限公司 Multi-label classification method and device, electronic equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Jannis Born et al.; POCOVID-Net: Automatic Detection of COVID-19 from a New Lung Ultrasound Imaging Dataset (POCUS); https://arxiv.org/pdf/2004.12084v2.pdf; 2020-04-29; full text *
Sourabh Kulhare et al.; Ultrasound-Based Detection of Lung Abnormalities Using Single Shot Detection Convolutional Neural Networks; Lecture Notes in Computer Science; 2018-09-15; full text *
Wang Lantian; Research on a Disease Diagnosis Architecture Based on Multiple Classes of Medical Images; China Masters' Theses Full-text Database, Information Science and Technology; 2019-01-15; full text *

Also Published As

Publication number Publication date
CN111598868A (en) 2020-08-28

Similar Documents

Publication Publication Date Title
CN111598868B (en) Lung ultrasonic image identification method and system
JP6999812B2 (en) Bone age evaluation and height prediction model establishment method, its system and its prediction method
CN109543526B (en) True and false facial paralysis recognition system based on depth difference characteristics
US9801609B2 (en) Device and method for enhancing accuracy of recognizing fetus heart rate acceleration data
TWI684997B (en) Establishing method of bone age assessment and height prediction model, bone age assessment and height prediction system, and bone age assessment and height prediction method
CN110378232B (en) Improved test room examinee position rapid detection method of SSD dual-network
CN112819093B (en) Man-machine asynchronous identification method based on small data set and convolutional neural network
CN112734757A (en) Spine X-ray image cobb angle measuring method
US20200365271A1 (en) Method for predicting sleep apnea from neural networks
CN108511055A (en) Ventricular premature beat identifying system and method based on Multiple Classifier Fusion and diagnostic rule
CN111653356A (en) New coronary pneumonia screening method and new coronary pneumonia screening system based on deep learning
CN111161287A (en) Retinal vessel segmentation method based on symmetric bidirectional cascade network deep learning
CN107563364A (en) The discriminating conduct of the fingerprint true and false and fingerprint identification method based on sweat gland
CN115138059B (en) Pull-up standard counting method, pull-up standard counting system and storage medium of pull-up standard counting system
CN111345814B (en) Analysis method, device, equipment and storage medium for electrocardiosignal center beat
CN114358194A (en) Gesture tracking based detection method for abnormal limb behaviors of autism spectrum disorder
CN111754485A (en) Artificial intelligence ultrasonic auxiliary system for liver
CN113397485A (en) Scoliosis screening method based on deep learning
CN112862749A (en) Automatic identification method for bone age image after digital processing
CN114566282A (en) Treatment decision system based on echocardiogram detection report
CN115564741A (en) Device for quantifying esophageal and gastric fundus varices based on CT (computed tomography) images
CN111798408A (en) Endoscope interference image detection and grading system and method
CN111062936A (en) Quantitative index evaluation method for facial deformation diagnosis and treatment effect
CN113763353A (en) Lung ultrasonic image detection system
CN111341459A (en) Training method of classified deep neural network model and genetic disease detection method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201016

Address after: Room 5030, 5 / F, building e, 555 Dongchuan Road, Minhang District, Shanghai, 200241

Applicant after: Shanghai Shenzhi Information Technology Co.,Ltd.

Address before: Room 5030, 5 / F, building e, 555 Dongchuan Road, Minhang District, Shanghai, 200241

Applicant before: Shanghai Shenzhi Information Technology Co.,Ltd.

Applicant before: Shanghai Zhuxing Biotechnology Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant