CN114494157A - Automatic evaluation method for image quality of four-chamber heart ultrasonic section of fetal heart - Google Patents


Info

Publication number
CN114494157A
CN114494157A
Authority
CN
China
Prior art keywords
image
heart
region
chamber
valve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210013358.2A
Other languages
Chinese (zh)
Inventor
徐光柱
钱奕凡
刘蓉
王阳
周军
雷帮军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wanzhida Enterprise Management Co ltd
Original Assignee
China Three Gorges University CTGU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Three Gorges University CTGU filed Critical China Three Gorges University CTGU
Priority to CN202210013358.2A
Publication of CN114494157A
Legal status: Pending

Classifications

    • G06T 7/0012 — Biomedical image inspection
    • A61B 8/0883 — Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the heart
    • A61B 8/5207 — Ultrasonic diagnosis involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215 — Ultrasonic diagnosis involving processing of medical diagnostic data
    • A61B 8/523 — Generating planar views from image data in a user selectable plane not corresponding to the acquisition plane
    • G06N 3/045 — Neural networks; combinations of networks
    • G06N 3/08 — Neural networks; learning methods
    • G06T 5/30 — Image enhancement or restoration; erosion or dilatation, e.g. thinning
    • G06T 5/40 — Image enhancement or restoration using histogram techniques
    • G06T 7/11 — Region-based segmentation
    • G06T 7/12 — Edge-based segmentation
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/155 — Segmentation; edge detection involving morphological operators
    • G06T 7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 2207/10132 — Ultrasound image
    • G06T 2207/20036 — Morphological image processing
    • G06T 2207/20081 — Training; learning
    • G06T 2207/20084 — Artificial neural networks [ANN]
    • G06T 2207/30044 — Fetus; embryo
    • G06T 2207/30048 — Heart; cardiac

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Cardiology (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The method for automatically evaluating the image quality of the four-chamber heart ultrasonic section of the fetal heart comprises the following steps: acquiring a data set of fetal heart four-chamber ultrasonic section images and performing noise reduction; building and training a YOLOv5 target detection model, then inputting a test image into the trained model to obtain the position coordinates of the thoracic region and the four-chamber heart region; extracting four-chamber heart region images, segmenting them, and training a U2-Net image segmentation network model; extracting the four-chamber heart region from the section image to be evaluated and performing a mask operation; combining histogram correction with the OTSU algorithm to segment the masked four-chamber heart region image; and evaluating the quality of the fetal heart four-chamber ultrasonic section image according to preset quality evaluation rules. The method solves the problem that the prior art cannot give a stable, reliable and accurate score for the image quality of fetal heart four-chamber ultrasonic sections.

Description

Automatic evaluation method for image quality of four-chamber heart ultrasonic section of fetal heart
Technical Field
The invention relates to the technical field of ultrasonic image quality evaluation, in particular to an automatic evaluation method for the image quality of a four-chamber heart ultrasonic section of a fetal heart.
Background
Ultrasonic imaging has been widely applied to prenatal screening and diagnosis of fetal congenital heart disease owing to its advantages of being painless, non-invasive, free of ionizing radiation, and real-time. Compared with imaging modalities such as CT and MRI, however, ultrasound images contain more noise and are affected by fetal movement, and there are subjective differences in sonographers' operating technique and image interpretation. These factors keep the detection rate of fetal congenital heart disease relatively low: about 300,000 fetuses die of congenital heart disease during pregnancy or childbirth every year,
as reported in document [1]: Vullings R. Fetal Electrocardiography and Deep Learning for Prenatal Detection of Congenital Heart Diseases. 2019 Computing in Cardiology (CinC), 2019, pp. 1-4, doi:10.23919/CinC49843.2019.9005870. To effectively reduce the influence of subjective differences and image acquisition quality on the screening of congenital heart disease, quality control of fetal heart ultrasound images is very important, and its core is a section-image quality evaluation algorithm. Currently, ultrasound image quality evaluation is an essential link in prenatal diagnosis and is mainly performed manually by expert physicians. Manual evaluation has many disadvantages: it depends on the experience of the specialist, is susceptible to the specialist's subjective factors, and takes a great deal of time and effort. The four-chamber section is the most important and most basic section in fetal heart ultrasound imaging, and the one used most in congenital heart disease screening; automating its quality evaluation not only reduces the workload of evaluating experts but also offers a reference for extending automatic quality evaluation to other ultrasonic sections.
In recent years, deep learning techniques have been widely applied to the quality evaluation of fetal ultrasound images, with a revolutionary influence on existing evaluation practice. According to the current state of research, the quality evaluation of fetal ultrasound images can essentially be cast as a classification or target detection problem. In document [2]: Abdi A H, Luong C, Tsang T, et al. Automatic Quality Assessment of Echocardiograms Using Convolutional Neural Networks: Feasibility on the Apical Four-Chamber View [J]. IEEE Transactions on Medical Imaging, 2017, Abdi et al. use a deep convolutional neural network to grade apical four-chamber ultrasonic section images quantitatively from 0 to 5; but because the classification network cannot locate anatomical structures, it can only score the image as a whole, and therefore cannot give a stable, reliable and accurate score based on the clarity of the anatomical structures. Building on this, document [3]: Baumgartner C F, Kamnitsas K, et al. SonoNet: Real-Time Detection and Localisation of Fetal Standard Scan Planes in Freehand Ultrasound [J]. IEEE Transactions on Medical Imaging, 2017, can automatically detect 13 standard planes in two-dimensional fetal ultrasound images and, in a weakly supervised manner, localize anatomical structures such as the fetal four-chamber heart with bounding boxes; however, its localization precision is insufficient, so it likewise cannot give stable, reliable and accurate scores for the image quality of fetal heart four-chamber ultrasonic sections. With the continuous development of target detection networks, their detection precision and speed have improved greatly.
Document [10]: the technical scheme of the Chinese patent "Ultrasound image evaluation and screening method and device" (CN109191442B) uses the Faster R-CNN of document [4]: Ren S, He K, Girshick R, et al. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2017, 39(6): 1137-1149, to detect five anatomical structures in fetal head ultrasound images, and derives each image's evaluation score from whether the corresponding structures are detected; however, the scoring effect depends on a manually set empirical threshold. When this approach is applied to fetal heart four-chamber ultrasonic section images with a low threshold, other structures are falsely detected and the score becomes too high, as shown in fig. 1(a), fig. 1(b) and fig. 1(c).
Document [11]: the Chinese patent "A quality control method for ultrasonic section images of fetuses in mid and late pregnancy" (CN110464380A) uses YOLOv3 (Redmon J and Farhadi A. 2018. YOLOv3: An Incremental Improvement [EB/OL]. [2021-11-15]. https://arxiv.org/pdf/1804.02767.pdf) to detect the anatomical structures of multiple fetal ultrasonic section images; the overlap between anatomical structures is computed from the position coordinates of the key structures, falsely detected structures are filtered with an overlap association table, and the quality score of each filtered section image is finally obtained through a tissue-score mapping table. Document [12]: the Chinese patent "Ultrasonic section image quality control method, device and computer equipment" (CN112070119A) uses the detector of document [5]: T. Lin, P. Goyal, R. Girshick, K. He and P. Dollár, Focal Loss for Dense Object Detection, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 42, no. 2, pp. 318-327, 1 Feb. 2020, doi:10.1109/TPAMI.2018.2858826, to detect the anatomical structures of multiple fetal ultrasonic section images, decides whether each section is standard from rules on whether each structure appears and is clear, and generates a quality score for each image as a weighted combination of the weights of the relevant anatomical structures and the confidences of their detection boxes. However, the detection-box confidence of an anatomical structure may not fully match its clarity, as shown in fig. 2(a) and fig. 2(b): the valves in the four-chamber heart region in fig. 2(a) are barely visible yet receive a higher confidence than those in fig. 2(b), indicating that scoring by detection-box confidence is unreliable. Figs. 2(a) and 2(b) show four-chamber heart regions of different degrees of clarity: fig. 2(a) is a non-standard four-chamber heart region; fig. 2(b) is a standard four-chamber heart region.
Building on the prior art, document [13]: the Chinese patent "Multitask-based automatic quality control method for standard fetal sections in early pregnancy" (CN113393456A) proposes a multitask deep convolutional neural network for automatic quality control of standard early-pregnancy fetal sections. The network produces the section class and its confidence for each section image, together with the classes, coordinates and confidences of all anatomical structures in it; the section class and the structure classes of each ultrasonic section image are then fed into a trained SVM classification model, and whether the section is standard is judged by preset matching rules. The quality control result of this method is susceptible to the accuracy of the classification module: in particular, when the section is misclassified, the result is still a non-standard section even if the detection results of all anatomical structures on the section meet the requirements of the original standard section.
In summary, performing quality control of fetal ultrasonic section images with target detection and classification models alone is unreliable: on the one hand, a classification model cannot provide the specific position coordinates of the anatomical structures; on the other hand, the detection-box confidence of an anatomical structure does not fully match its clarity, so a target detection model is prone to false detections. The anatomical structures must therefore be analyzed further with image segmentation techniques to improve the reliability of quality control of fetal ultrasonic section images.
Disclosure of Invention
To give the quality evaluation of fetal heart four-chamber ultrasonic section images greater stability, reliability and interpretability, the invention provides an automatic quality evaluation method for such images. It uses deep learning to improve the robustness and application range of detection, while a traditional segmentation method combined with morphological processing and histogram correction gives the method good interpretability on top of an accurate score.
The technical scheme adopted by the invention is as follows:
the method for automatically evaluating the image quality of the four-chamber heart ultrasonic section of the fetal heart comprises the following steps:
step 1: acquiring a data set of a four-chamber heart ultrasonic section image of a fetal heart, and performing noise reduction treatment;
step 2: building a YOLOv5 target detection model and training, inputting a test image into the trained YOLOv5 target detection model to obtain position coordinates of a thoracic region and a four-chamber heart region, judging the position relation between the four chambers and the thoracic cavity in a four-chamber heart ultrasonic sectional image of the fetal heart, if the four-chamber heart region of the image is in the thoracic region, taking the image as a sectional image to be evaluated to perform the next operation, and if not, taking the image as an unqualified fetal heart four-chamber heart ultrasonic sectional image;
and step 3: extracting four-cavity heart region images, segmenting the four-cavity heart region images, and training U2-Net image segmentation network model;
and 4, step 4: extracting four-cavity center regions and performing mask operation on a section image to be evaluated;
and 5: combining the histogram correction with an OTSU algorithm, and segmenting the four-cavity heart region image after the mask operation;
step 6: calculating the ratio of the area of the valve together with the atrioventricular septum to the ventricular atrial region;
and 7: calculating the average brightness of the valve and the atrioventricular compartment area;
and 8: calculating the average gray scale of the ventricular and atrial regions;
and step 9: and according to a preset quality evaluation rule, performing quality evaluation on the four-chamber heart ultrasonic sectional image of the heart of the fetus.
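The gate in step 2 — accepting a section only when the detected four-chamber heart box lies inside the detected thorax box — reduces to a simple bounding-box containment test. Below is a minimal illustrative sketch in Python; the function name and the (x1, y1, x2, y2) corner format are assumptions (a common convention for YOLOv5-style detector outputs), not details from the patent.

```python
def box_inside(inner, outer):
    """True if bounding box `inner` lies fully within `outer`.

    Boxes use (x1, y1, x2, y2) corner format, a common convention for
    YOLOv5-style detector outputs (illustrative assumption).
    """
    ix1, iy1, ix2, iy2 = inner
    ox1, oy1, ox2, oy2 = outer
    return ox1 <= ix1 and oy1 <= iy1 and ix2 <= ox2 and iy2 <= oy2

# A section passes the step-2 check only when the four-chamber heart box
# is contained in the thoracic box; otherwise it is judged unqualified.
four_chamber = (120, 90, 260, 210)
thorax = (80, 50, 320, 260)
qualified = box_inside(four_chamber, thorax)
```

A stricter variant could additionally require a minimum intersection-over-union or margin, but the containment test above matches the position relation the step describes.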
The automatic evaluation method for the image quality of the four-chamber heart ultrasonic section of the fetal heart disclosed by the invention has the following technical effects:
1) The method proposes a section-image scoring strategy that combines the area ratio of the valve and atrioventricular septum to the ventricular and atrial regions with gray-level average factors, and constructs a comprehensive scoring formula, so that the evaluation results match human visual assessment. This solves the inability of the prior art to give a stable, reliable and accurate score for the image quality of fetal heart four-chamber ultrasonic sections.
2) To segment the heart region effectively, the method proposes a combined segmentation strategy: a deep convolutional neural network first locates the heart region; a deep semantic segmentation network then coarsely segments it; finally, morphological processing and histogram correction combined with the maximum between-class variance method (OTSU) achieve accurate segmentation of the valve and atrioventricular septum within the heart region. This effectively solves the difficult problem of accurately segmenting the heart valve, the atrioventricular septum and the ventricular and atrial regions in fetal heart four-chamber ultrasonic sections.
3) An image classification model cannot locate the anatomical structures in the evaluated section image and therefore cannot give a corresponding evaluation basis; a target detection model can give position coordinates of the anatomical structures, but because detection-box confidence does not fully match structure clarity, it produces false detections to some extent, making evaluation based on detection-box confidence unreliable. To address these problems, the invention proposes coarse localization of the key anatomical structures with a YOLOv5 target detection model, followed by fine segmentation of the four-chamber heart region with U2-Net combined with OTSU.
4) The brightness of the valve and atrioventricular septum region is almost the same as that of the region outside the ventricular/atrial margin, so the valve and atrioventricular septum cannot be separated by applying the OTSU algorithm directly; and when the ventricles and atria are bright, even masking the region outside the margin first and then applying OTSU does not segment the valve and ventricular/atrial regions well. The invention therefore proposes multiplying the eroded binary segmentation map with the grayed original image to fully filter out the region outside the ventricular/atrial margin, correcting the histogram of the multiplied image, and then segmenting the valve and atrioventricular septum region with OTSU.
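The histogram correction in effect 4) — dropping the gray-level-0 bin so that masked-out pixels do not bias the threshold — can be sketched with a plain NumPy implementation of Otsu's between-class-variance maximization. This is an illustrative reconstruction, not the patent's code:

```python
import numpy as np

def otsu_threshold(gray, ignore_zero=True):
    """Otsu threshold of an 8-bit image via between-class variance.

    With ignore_zero=True the histogram bin for gray level 0 is zeroed,
    mimicking the histogram correction after the mask operation: pixels
    blacked out by the mask no longer pull the threshold down.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    if ignore_zero:
        hist[0] = 0.0
    probs = hist / hist.sum()
    omega = np.cumsum(probs)                  # class-0 probability up to t
    mu = np.cumsum(probs * np.arange(256))    # cumulative mean up to t
    mu_t = mu[-1]                             # global mean
    denom = omega * (1.0 - omega)
    denom[denom == 0.0] = np.nan              # undefined where one class is empty
    sigma_b = (mu_t * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b))

# Toy masked image: 0 = masked background, 50 = chamber tissue, 200 = valve/septum
img = np.zeros((10, 10), dtype=np.uint8)
img[2:6, 2:6] = 50
img[6:9, 6:9] = 200
t = otsu_threshold(img)   # threshold lands between the two tissue classes
```

Applied to the masked four-chamber image, the zero-excluded histogram lets the threshold separate the bright valve/septum from the darker chamber tissue rather than from the masked background.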
5) The four-chamber heart region is the most important anatomical structure in the fetal heart four-chamber ultrasonic section image, and relying only on rough localization of the left and right ventricles and atria within it is insufficient for evaluating the section. The ventricles and atria are separated by the valve and the atrioventricular septum, so the area ratio of the valve and atrioventricular septum to the ventricular/atrial regions, the average brightness of the valve and atrioventricular septum, and the average gray level of the ventricles and atria are three important indexes for evaluating the quality of a four-chamber section image. The invention accordingly proposes a novel quality evaluation method that, when the area ratio, average brightness and average gray level all fall within preset threshold ranges, judges the quality of the four-chamber section image from the normalized, weighted values of these indexes, and gives a corresponding quality evaluation report according to the preset quality evaluation rules.
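The comprehensive score of effect 5) — normalize the three indexes, check them against preset threshold ranges, and combine them by a weighted sum — can be sketched as follows. All weights and ranges below are placeholder assumptions (the patent only states that such preset values exist), and the direction chosen for the third index (darker chambers score higher) is likewise an assumption:

```python
def four_chamber_score(area_ratio, septum_brightness, chamber_gray,
                       weights=(0.4, 0.3, 0.3),
                       ranges=((0.05, 0.35), (80.0, 255.0), (0.0, 120.0))):
    """Weighted comprehensive score for a four-chamber section image.

    area_ratio        : (valve + atrioventricular septum) area / chamber area
    septum_brightness : mean brightness of the valve and septum region
    chamber_gray      : mean gray level of the ventricular/atrial regions
    Returns None when any index falls outside its preset range, i.e. the
    section is judged non-standard.  Weights and ranges are illustrative.
    """
    values = (area_ratio, septum_brightness, chamber_gray)
    normed = []
    for value, (lo, hi) in zip(values, ranges):
        if not lo <= value <= hi:
            return None
        normed.append((value - lo) / (hi - lo))
    normed[2] = 1.0 - normed[2]   # assumption: darker chambers are better
    return sum(w * n for w, n in zip(weights, normed))

score = four_chamber_score(0.20, 150.0, 40.0)   # a standard-looking section
```

Min-max normalization keeps each index in [0, 1] so that one physically large quantity (brightness, 0-255) cannot dominate a small one (an area ratio) in the weighted sum.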
Drawings
FIG. 1(a) is a first image of a four-chamber heart region being falsely detected;
FIG. 1(b) is a second image of the four-chamber heart region being falsely detected;
fig. 1(c) is a third image in which the four-chamber heart region is erroneously detected.
FIG. 2(a) is a non-standard four-chamber heart region diagram;
fig. 2(b) is a standard four-chamber heart region diagram.
FIG. 3 is a flow chart of a quality evaluation method of the present invention.
FIG. 4 is a schematic diagram showing the position relationship between the four-chamber heart region and the chest region.
Fig. 5(a1) is an extracted four-chamber heart region image one;
fig. 5(a2) is an extracted four-chamber heart region image two;
fig. 5(a3) is an extracted four-chamber heart region image three;
fig. 5(a4) is an extracted four-chamber heart region image four.
FIG. 5(b1) is a segmentation map corresponding to the image of FIG. 5(a 1);
FIG. 5(b2) is a segmentation map corresponding to the image of FIG. 5(a 2);
FIG. 5(b3) is a segmentation map corresponding to the image of FIG. 5(a 3);
fig. 5(b4) is a corresponding segmentation map for the image of fig. 5(a 4).
FIG. 6(a) is an original four-chamber heart region image;
fig. 6(b) is a grayed four-chamber heart region image.
FIG. 7(a1) is segmentation map one output by U2-Net;
FIG. 7(a2) is segmentation map two output by U2-Net.
Fig. 7(b1) is a segmentation map one after OTSU processing;
fig. 7(b2) is a segmentation map two after OTSU processing.
FIG. 8(a1) is a first image obtained by masking directly without erosion;
fig. 8(a2) is a second image obtained by masking directly without erosion.
FIG. 8(b1) is a first image obtained by masking after erosion;
fig. 8(b2) is a second image obtained by masking after erosion.
FIG. 9(a) is a grayed four-chamber heart region image;
FIG. 9(b) is a four-chamber heart region image after the erosion and mask operations;
FIG. 9(c) is a corrected four-chamber heart region image;
FIG. 9(d) is the histogram corresponding to the grayed four-chamber heart region image;
FIG. 9(e) is the histogram corresponding to the four-chamber heart region image after the erosion and mask operations;
fig. 9(f) is the histogram corresponding to the corrected four-chamber heart region image.
FIG. 10(a) is the OTSU segmentation result of the grayed four-chamber heart region image;
fig. 10(b) is the OTSU segmentation result of the four-chamber heart region image after the erosion and mask operations.
Fig. 10(c) shows the OTSU segmentation result of the four-chamber cardiac region image after the histogram correction.
FIG. 11(a) is a segmentation map of the region within the ventricular/atrial margin after the erosion operation;
FIG. 11(b) is a segmented view of the valve and atrioventricular septal region in the four-chamber heart region.
FIG. 12 is a diagram of the mask operation on the valve and atrioventricular septum region.
FIG. 13 is a diagram of a process of subtraction of a ventricular atrial region.
FIG. 14 is a diagram of the masking operation of the ventricular atrial region.
FIG. 15(a) is a four-chamber heart region diagram with an excessively wide atrioventricular septum;
fig. 15(b) is a four-chamber heart region diagram with an unclear atrioventricular septum.
Detailed Description
The automatic quality evaluation method for fetal heart four-chamber ultrasonic section images first preprocesses the original three-channel section images. The preprocessed data sets are then used to train the YOLOv5 network of document [7]: Ultralytics. YOLOv5 [EB/OL]. [2021-11-15]. https://github.com/ultralytics/yolov5, and the U2-Net network of document [9]: Qin X B, Zhang Z C and Huang C Y. 2020. U2-Net: going deeper with nested U-structure for salient object detection [EB/OL]. [2021-11-15]. https://arxiv.org/pdf/2005.09007.pdf. The fetal heart four-chamber ultrasonic section image to be quality-controlled is input into the trained YOLOv5 model, and whether to continue the evaluation is decided from the detected position relation between the four-chamber heart and the thorax. The four-chamber heart region of the section image to be evaluated is then extracted and input into the trained U2-Net model, which segments the region within the ventricular/atrial margin. After appropriate erosion, the eroded binary segmentation map is multiplied with the original image, and the histogram of the masked image is then corrected, i.e., pixels with gray level 0 are not counted. The corrected image is segmented with the OTSU algorithm of document [8]: Otsu N. A Threshold Selection Method from Gray-Level Histograms [J]. IEEE Transactions on Systems, Man, and Cybernetics, 1979, 9(1): 62-66, yielding a segmentation map of the valve and atrioventricular septum region; a segmentation map of the ventricular/atrial regions is obtained from it by subtraction. Finally, the area ratio of the valve and atrioventricular septum to the ventricular/atrial regions is taken as the first parameter, the average brightness of the valve and atrioventricular septum as the second, and the average gray level of the ventricles and atria as the third; the three parameters are normalized and the score of the four-chamber heart region is computed as their weighted sum. When the area ratio, average brightness and average gray level are all within the preset threshold ranges, the four-chamber section image conforms to the standard and the score reflects its quality; a corresponding quality evaluation report is also given according to the preset quality evaluation rules.
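The erosion, mask multiplication, and segmentation-map subtraction in this pipeline can be illustrated on a toy array with plain NumPy. This is a self-contained sketch under stated assumptions: the arrays stand in for the U2-Net margin map, the grayed section, and the OTSU foreground, and a real implementation would use a library erosion routine.

```python
import numpy as np

def erode3x3(binary):
    """Binary erosion with a 3x3 square structuring element (zero padding)."""
    h, w = binary.shape
    p = np.pad(binary, 1, constant_values=0)
    out = np.ones_like(binary)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out = out & p[dy:dy + h, dx:dx + w]
    return out

# Toy stand-ins: `mask` plays the U2-Net ventricular/atrial-margin map,
# `gray` the grayed section, `valve` the OTSU foreground (all assumed data).
mask = np.zeros((8, 8), dtype=np.uint8)
mask[1:7, 1:7] = 1
gray = np.full((8, 8), 180, dtype=np.uint8)

eroded = erode3x3(mask)            # pull the segmentation margin inward
masked = gray * eroded             # zero out everything outside the margin
valve = np.zeros_like(eroded)
valve[3:5, 3:5] = 1                # pretend OTSU found the valve/septum here
chambers = eroded * (1 - valve)    # subtraction leaves the chamber map
```

The erosion step matters because U2-Net's margin is slightly loose: shrinking it before the multiplication keeps bright pixels just outside the ventricular/atrial edge from surviving into the histogram that OTSU thresholds.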
The method comprises the following steps:
step 1: acquiring a four-cavity heart ultrasonic section image data set of the fetal heart marked with a chest cavity and a four-cavity heart region, and performing median filtering and noise reduction treatment;
step 2: building a YOLOv5 target detection model and training, inputting a test image into the trained YOLOv5 target detection model to obtain position coordinates of a thoracic region and a four-chamber heart region, judging the position relation between the four chambers and the thoracic cavity in a four-chamber heart ultrasonic sectional image of the fetal heart, if the four-chamber heart region of the image is in the thoracic region, taking the image as a sectional image to be evaluated to perform the next operation, and if not, taking the image as an unqualified fetal heart four-chamber heart ultrasonic sectional image;
step 3: extracting the four-chamber heart region images, segmenting them, and training the U2-Net image segmentation network model;
step 4: extracting the four-chamber heart region from the sectional image to be evaluated and performing the mask operation;
step 5: combining histogram correction with the OTSU algorithm to segment the masked four-chamber heart region image;
step 6: calculating the ratio of the valve plus atrioventricular-septum area to the ventricular/atrial region area;
step 7: calculating the average brightness of the valve and atrioventricular-septum region;
step 8: calculating the average gray of the ventricular/atrial regions;
step 9: evaluating the quality of the fetal four-chamber cardiac ultrasound sectional image according to the preset quality evaluation rules.
The detailed flow from step 1 to step 9 is shown in FIG. 3.
In the step 1, a four-chamber heart ultrasonic section image data set of the fetal heart marked with the thoracic cavity and the four-chamber heart region is obtained, and median filtering and noise reduction processing is performed.
Specifically, the median filtering and noise reduction processing is to take a 3x3 neighborhood as a sliding window, traverse the whole image pixel matrix, sort the gray values of nine pixels in the image pixel matrix corresponding to the 3x3 neighborhood, and take the middle gray value as the gray value of the neighborhood center.
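As a minimal sketch, the 3x3 median filtering described above can be written as follows (border handling by edge replication is an assumption; the patent does not specify it):

```python
import numpy as np

def median_filter_3x3(img: np.ndarray) -> np.ndarray:
    """3x3 median filter as described: slide a 3x3 neighborhood over the
    image, sort the nine gray values, and take the middle one as the new
    center value. Borders use edge replication (an assumption)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out
```

Median filtering removes the isolated speckle impulses typical of ultrasound images while preserving edges better than mean filtering.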
The step 2 comprises the following steps:
Step 2.1: divide the data set obtained in step 1 into a training set, a validation set and a test set in a 5:2:3 ratio, and train the YOLOv5 target detection model with the divided data set.
Step 2.2: inputting the test image into a trained YOLOv5 target detection model, and calculating the coordinates of the upper left corner and the lower right corner of the four-chamber heart region and the thoracic region according to the output result of the model, wherein the specific calculation formula is as follows:
For each detected target, the YOLOv5 output format is: class number, normalized center abscissa, normalized center ordinate, normalized bounding-box width, normalized bounding-box height;
If the width of a fetal four-chamber cardiac ultrasound sectional image is Width and its height is Height, and the output result for the four-chamber heart region is 0, x, y, w, h, then the center-point coordinates (xo, yo), the bounding-box width wo and height ho, the upper-left corner coordinates (x1, y1) and the lower-right corner coordinates (x2, y2) are computed as
xo=x×Width,yo=y×Height (1);
wo=w×Width,ho=h×Height (2);
x1=xo-wo/2,y1=yo-ho/2 (3);
x2=x1+wo,y2=y1+ho (4);
Formulas (1)-(4) are, respectively, the calculation formulas for the four-chamber heart region's center-point coordinates, bounding-box width and height, upper-left corner coordinates and lower-right corner coordinates.
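The coordinate conversion of formulas (1)-(4) can be sketched as below; it assumes the standard YOLO reading in which the upper-left corner is the center minus half the box size, and the lower-right corner is the upper-left corner plus the box size:

```python
def yolo_to_corners(x, y, w, h, width, height):
    """Formulas (1)-(4): convert a normalized YOLOv5 detection
    (center x, center y, box width, box height) into pixel corners."""
    xo, yo = x * width, y * height        # (1) center point in pixels
    wo, ho = w * width, h * height        # (2) box size in pixels
    x1, y1 = xo - wo / 2, yo - ho / 2     # (3) upper-left corner
    x2, y2 = x1 + wo, y1 + ho             # (4) lower-right corner
    return (x1, y1), (x2, y2)
```

For an 800x600 image and the detection (0.5, 0.5, 0.5, 0.5), this yields corners (200, 150) and (600, 450).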
Similarly, if the output result for the thoracic region is 1, X, Y, W, H, then the center-point coordinates (Xo, Yo), the bounding-box width Wo and height Ho, the upper-left corner coordinates (X1, Y1) and the lower-right corner coordinates (X2, Y2) are computed as
Xo=X×Width,Yo=Y×Height (5);
Wo=W×Width,Ho=H×Height (6);
X1=Xo-Wo/2,Y1=Yo-Ho/2 (7);
X2=X1+Wo,Y2=Y1+Ho (8);
Formulas (5)-(8) are, respectively, the calculation formulas for the thoracic region's center-point coordinates, bounding-box width and height, upper-left corner coordinates and lower-right corner coordinates.
Step 2.3: examine the positional relation between the four-chamber heart and the thorax in the fetal four-chamber cardiac ultrasound sectional image. If the four-chamber heart region lies inside the thoracic region, as shown in fig. 4, it satisfies X1 < x1 < x2 < X2 and Y1 < y1 < y2 < Y2, and the image is taken as the sectional image to be evaluated; otherwise the image is judged to be an unqualified fetal four-chamber cardiac ultrasound sectional image.
The step 3 comprises the following steps:
Step 3.1: according to the position coordinates of the manually marked four-chamber-region bounding boxes, extract all four-chamber heart region images from the data set divided in step 2.1, and mark the inner ventricular/atrial edges of the four-chamber regions with an image segmentation marking tool; in the resulting segmentation maps, pixels inside the ventricular/atrial edge are all 1 and pixels outside it are all 0, as shown in figs. 5(a1)-5(a4) and 5(b1)-5(b4): figs. 5(a1)-5(a4) are the extracted four-chamber heart region images, and figs. 5(b1)-5(b4) are their corresponding segmentation maps.
Step 3.2: train the U2-Net segmentation model with the four-chamber heart regions and corresponding segmentation maps obtained in step 3.1.
The step 4 comprises the following steps:
Step 4.1: extract the four-chamber heart region from the fetal four-chamber cardiac ultrasound sectional image to be evaluated that passed the test of step 2.3, input it into the trained U2-Net segmentation model, and output the segmentation map corresponding to the four-chamber heart region.
Step 4.2: perform graying processing on the four-chamber heart region of step 4.1, as shown in figs. 6(a) and 6(b).
Step 4.3: binarize the segmentation map obtained in step 4.1 with OTSU; figs. 7(a1), 7(a2), 7(b1) and 7(b2) compare the segmentation map before and after the OTSU processing.
Step 4.4: in order to fully filter out the region outside the ventricular/atrial edge, erode the segmentation map obtained after the OTSU processing of step 4.3, then mask the grayed four-chamber heart region image of step 4.2 with the eroded segmentation map, i.e. multiply the two; figs. 8(a1), 8(a2), 8(b1) and 8(b2) compare the images before and after the erosion and masking operations.
The step 5 comprises the following steps:
Step 5.1: compute the histogram of the image masked in step 4.4, as shown in figs. 9(b) and 9(e);
Step 5.2: correct the image and its histogram from step 5.1, i.e. do not count pixels with gray level 0, as shown in figs. 9(c) and 9(f);
Step 5.3: segment the histogram-corrected four-chamber heart region image with OTSU, as shown in fig. 10(c).
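A sketch of the combination in step 5: OTSU computed on the corrected histogram, i.e. with the gray-level-0 bin (the masked-out background) zeroed before the between-class-variance search:

```python
import numpy as np

def otsu_threshold_nonzero(gray: np.ndarray) -> int:
    """OTSU on the corrected histogram of step 5.2: the gray-level-0
    bin (masked-out background) is dropped before searching for the
    threshold t that maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    hist[0] = 0.0                        # histogram correction
    total = hist.sum()
    if total == 0:
        return 0
    levels = np.arange(256, dtype=float)
    cum_w = np.cumsum(hist)              # pixel count at or below each level
    cum_m = np.cumsum(hist * levels)     # intensity mass at or below each level
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0, w1 = cum_w[t], total - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[t] / w0
        m1 = (cum_m[-1] - cum_m[t]) / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Without the correction, the large zero-valued background introduced by the mask would dominate the histogram and pull the threshold far too low.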
The step 6 comprises the following steps:
Step 6.1: count the number of pixels in the segmentation map of the eroded region inside the ventricular/atrial edge from step 4.4, i.e. the number of white pixels in fig. 11(a);
Step 6.2: count the number of pixels in the valve and atrioventricular-septum region segmented by OTSU in step 5.3, i.e. the number of white pixels in fig. 11(b);
Step 6.3: calculate the difference between the two counts to obtain the number of pixels of the ventricular/atrial region, and obtain through formula (9) the area ratio k' of the valve plus atrioventricular septum to the ventricular/atrial region in the fetal four-chamber cardiac ultrasound sectional image to be evaluated;
let the pixel matrix of the eroded segmentation map of the region inside the ventricular/atrial edge be A of size m×n, and the pixel matrix of the OTSU-segmented valve and atrioventricular-septum image be B of size m×n;
k' = (ΣΣ Bij) / (ΣΣ Aij - ΣΣ Bij)  (9);
in formula (9), the double sums run over i = 1..m and j = 1..n; Aij is the element in row i, column j of the pixel matrix of the eroded inner-edge segmentation map; Bij is the element in row i, column j of the pixel matrix of the OTSU-segmented valve and atrioventricular-septum image; m and n are the numbers of rows and columns of the pixel matrices.
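Formula (9) can be sketched directly on the two binary maps (A for the eroded inner-edge region, B for the valve and atrioventricular septum, with B a subset of A):

```python
import numpy as np

def area_ratio(A: np.ndarray, B: np.ndarray) -> float:
    """Formula (9): k' = sum(B) / (sum(A) - sum(B)), i.e. valve plus
    atrioventricular-septum pixels over ventricular/atrial pixels."""
    valve = B.sum()
    return valve / (A.sum() - valve)
```

With 0/1 matrices, the sums are exactly the pixel counts of steps 6.1 and 6.2.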
The step 7 comprises the following steps:
Step 7.1: multiply the segmented image of the valve and atrioventricular-septum region by the grayed original image, i.e. mask fig. 9(a) with fig. 10(c), as shown in fig. 12;
Step 7.2: calculate with formula (10) the average pixel value of the valve and atrioventricular-septum region, i.e. the ratio of the sum of pixel values of the masked image in fig. 12 to its number of non-zero pixels, and take this average as the average brightness of the valve and atrioventricular-septum region, denoted k'';
let the grayed original-image pixel matrix be C of size m×n;
k'' = (ΣΣ Bij·Cij) / (ΣΣ Bij)  (10), with the sums over i = 1..m and j = 1..n.
Step 8 comprises the following steps:
Step 8.1: subtract the valve and atrioventricular-septum segmentation map of step 5.3 from the eroded segmentation map of the region inside the ventricular/atrial edge of step 4.4 to obtain the segmentation map of the ventricular/atrial region, as shown in fig. 13;
Step 8.2: multiply the ventricular/atrial segmentation map of step 8.1 by the grayed original image, i.e. mask figs. 7(a1) and 7(a2) with the segmentation map obtained by the subtraction in fig. 13, as shown in fig. 14;
Step 8.3: calculate the average pixel value of the ventricular/atrial region, i.e. the ratio of the sum of pixel values of the masked image in fig. 14 to its number of non-zero pixels, and obtain the average gray k''' of the ventricular/atrial region with formula (11);
k''' = 255 - (ΣΣ (Aij - Bij)·Cij) / (ΣΣ (Aij - Bij))  (11), with the sums over i = 1..m and j = 1..n.
When k''' is larger, the ventricular/atrial region is darker; when k''' is smaller, the region is brighter.
In a high-quality fetal four-chamber sectional image, the average brightness of the ventricular/atrial regions inside the four-chamber heart must not exceed that of the valve region, i.e. the ventricular/atrial blood pools should be as dark as possible.
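Formulas (10) and (11) can be sketched as below; the 255-minus inversion in the second function follows the statement above that a larger k''' means a darker region (the source renders formula (11) only as an image, so this reading is an assumption):

```python
import numpy as np

def mean_brightness(C: np.ndarray, B: np.ndarray) -> float:
    """Formula (10): average brightness k'' of the valve and
    atrioventricular-septum region, i.e. the pixel sum of the masked
    image B*C divided by its number of non-zero pixels."""
    masked = C * B
    return masked.sum() / np.count_nonzero(masked)

def mean_gray_inverted(C: np.ndarray, A: np.ndarray, B: np.ndarray) -> float:
    """Formula (11) as read here (an assumption): the ventricular/atrial
    mask is A - B, and the mean gray is inverted (255 minus the mean) so
    that a larger k''' means a darker region."""
    masked = C * (A - B)
    return 255.0 - masked.sum() / np.count_nonzero(masked)
```

The inversion makes all three indexes point the same way: larger is better for a dark blood pool, just as for a bright, clear valve.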
Step 9 comprises the following steps:
Step 9.1: run steps 2.3-8.3 on the training and validation sets of step 2.1, and count the ranges of the three quality evaluation indexes in standard fetal four-chamber cardiac ultrasound sectional images, namely the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region, the average brightness of the valve and atrioventricular-septum region, and the average gray of the ventricular/atrial region; the ranges are 0.11-0.57, 44-143 and 189-248 respectively;
Step 9.2: compare the three quality evaluation indexes of the test set with the ranges set in step 9.1; when all three fall within the set ranges, the four-chamber heart region meets the standard; otherwise it does not and is rated D, as shown in figs. 15(a) and 15(b): fig. 15(a) is out of range because the atrioventricular septum is too large, and fig. 15(b) fails the standard because it is too small, so both are rated D.
Step 9.3: obtain with formula (12) the three normalized quality evaluation indexes, namely the area ratio K1, the average brightness K2 and the average gray K3;
K1 = (k' - k'min)/(k'max - k'min), K2 = (k'' - k''min)/(k''max - k''min), K3 = (k''' - k'''min)/(k'''max - k'''min)  (12);
in formula (12), k' is the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region in the fetal four-chamber cardiac ultrasound sectional image to be evaluated, and k'max and k'min are the maximum and minimum of this ratio over all images to be evaluated; k'' is the average brightness of the valve and atrioventricular-septum region, with k''max and k''min its maximum and minimum over all images to be evaluated; k''' is the average gray of the ventricular/atrial region, with k'''max and k'''min its maximum and minimum over all images to be evaluated.
Step 9.4: obtain the quality score of the sectional image by weighting the three normalized quality evaluation indexes with formula (13);
[formula (13) is rendered as an image in the source: the quality score F is a weighted sum of K1, K2 and K3, with a score penalty applied when K1 ≥ 0.66]
Since the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region must not be too large, an appropriate score penalty is applied when the normalized area ratio is 0.66 or greater.
Step 9.5: compare the quality of four-chamber sectional images by their quality scores, and give a corresponding quality evaluation report according to the preset quality evaluation rules.
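The normalization of formula (12) and the scoring of formula (13) can be sketched as follows. The actual weights and penalty value of formula (13) are rendered only as an image in the source, so the equal weights and the 0.2 penalty used below are illustrative assumptions, not the patent's values:

```python
def min_max_normalize(value, vmin, vmax):
    """Formula (12): min-max normalization of one quality index over
    all images to be evaluated."""
    return (value - vmin) / (vmax - vmin)

def quality_score(K1, K2, K3, weights=(1 / 3, 1 / 3, 1 / 3), penalty=0.2):
    """Sketch of formula (13): a weighted sum of the three normalized
    indexes, with a score penalty when the normalized area ratio K1
    reaches 0.66. The weights and penalty here are assumed values;
    the source renders the real formula as an image."""
    F = weights[0] * K1 + weights[1] * K2 + weights[2] * K3
    if K1 >= 0.66:
        F -= penalty
    return F
```

Under these assumed weights, an image with all three normalized indexes at 0.5 scores 0.5 and would fall in the A band of the preset rules.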
The evaluation rules referred to are as follows:
1. Area ratio:
when 0 ≤ K1 < 0.31, the area of the valve plus atrioventricular septum is small relative to the ventricular/atrial region;
when 0.31 ≤ K1 < 0.66, the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region is appropriate;
when 0.66 ≤ K1 ≤ 1, the area ratio is large;
2. Average brightness:
when 0 ≤ K2 < 0.25, the valve and atrioventricular septum are unclear;
when 0.25 ≤ K2 < 0.50, the valve and atrioventricular septum are not very clear;
when 0.50 ≤ K2 ≤ 1, the valve and atrioventricular septum are clear;
3. Average gray:
when 0 ≤ K3 < 0.35, the ventricular/atrial region is bright;
when 0.35 ≤ K3 < 0.70, the brightness of the ventricular/atrial region is moderate;
when 0.70 ≤ K3 ≤ 1, the ventricular/atrial region is dark;
4. Quality score:
when 0 ≤ F < 0.378, the rating is C;
when 0.378 ≤ F < 0.473, the rating is B;
when 0.473 ≤ F ≤ 0.864, the rating is A;
because a larger area ratio incurs a score penalty, the maximum quality score is less than 1.
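The quality-score rule above maps F to a grade; a sketch (images whose raw indexes fall outside the standard ranges are rated D before F is ever computed):

```python
def rate(F: float) -> str:
    """Map the quality score F to the preset grades:
    C for 0 <= F < 0.378, B for 0.378 <= F < 0.473,
    A for 0.473 <= F <= 0.864."""
    if F < 0.378:
        return "C"
    if F < 0.473:
        return "B"
    return "A"
```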
the ratings of the test sets were compared to the human eye rating results with the accuracy shown in table 1.
TABLE 1 Test-set evaluation accuracy

Rating     A       B       C       D
Accuracy   93.7%   90.3%   99.1%   99.5%
Although this quality evaluation method produces some misclassifications, its results broadly agree with subjective human evaluation, so the method has good application value.
Corresponding analyses of the evaluation results for some test-set images are given in table 2 below.
Table 2 analysis of evaluation results
The comparison experiments of tables 1 and 2 show that the sectional-image scoring strategy adopted by the invention, which combines the valve/atrioventricular-septum area ratio with gray-mean factors inside the heart region of the fetal four-chamber cardiac ultrasound sectional image, yields quality scores generally consistent with human-eye evaluation, and can therefore be used for automatic quality evaluation of fetal four-chamber cardiac ultrasound sectional images.

Claims (10)

1. The method for automatically evaluating the image quality of the four-chamber heart ultrasonic section of the fetal heart is characterized by comprising the following steps of:
step 1: acquiring a data set of a four-chamber heart ultrasonic section image of a fetal heart, and performing noise reduction treatment;
step 2: building a YOLOv5 target detection model and training, inputting a test image into the trained YOLOv5 target detection model to obtain position coordinates of a thoracic region and a four-chamber heart region, judging the position relation between the four chambers and the thoracic cavity in a four-chamber heart ultrasonic sectional image of the fetal heart, if the four-chamber heart region of the image is in the thoracic region, taking the image as a sectional image to be evaluated to perform the next operation, and if not, taking the image as an unqualified fetal heart four-chamber heart ultrasonic sectional image;
step 3: extracting the four-chamber heart region images, segmenting them, and training the U2-Net image segmentation network model;
step 4: extracting the four-chamber heart region from the sectional image to be evaluated and performing the mask operation;
step 5: combining histogram correction with the OTSU algorithm to segment the masked four-chamber heart region image;
step 6: calculating the ratio of the valve plus atrioventricular-septum area to the ventricular/atrial region area;
step 7: calculating the average brightness of the valve and atrioventricular-septum region;
step 8: calculating the average gray of the ventricular/atrial regions;
step 9: evaluating the quality of the fetal four-chamber cardiac ultrasound sectional image according to the preset quality evaluation rules.
2. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 1, wherein the method comprises the following steps: the step 2 comprises the following steps:
step 2.1: dividing the image data set obtained in the step 1 into a training set, a verification set and a test set according to a certain proportion, and training a Yolov5 target detection model by using the divided data set;
step 2.2: inputting the test image into a trained Yolov5 target detection model, and calculating the coordinates of the upper left corner and the lower right corner of the four-chamber heart region and the chest region according to the output result of the model, wherein the specific calculation formula is as follows:
if the width of a fetal four-chamber cardiac ultrasound sectional image is Width and its height is Height, and the output result for the four-chamber heart region is 0, x, y, w, h, then the center-point coordinates (xo, yo), the bounding-box width wo and height ho, the upper-left corner coordinates (x1, y1) and the lower-right corner coordinates (x2, y2) are computed as
xo=x×Width,yo=y×Height (1);
wo=w×Width,ho=h×Height (2);
x1=xo-wo/2,y1=yo-ho/2 (3);
x2=x1+wo,y2=y1+ho (4);
similarly, if the output result for the thoracic region is 1, X, Y, W, H, then the center-point coordinates (Xo, Yo), the bounding-box width Wo and height Ho, the upper-left corner coordinates (X1, Y1) and the lower-right corner coordinates (X2, Y2) are computed as
Xo=X×Width,Yo=Y×Height (5);
Wo=W×Width,Ho=H×Height (6);
X1=Xo-Wo/2,Y1=Yo-Ho/2 (7);
X2=X1+Wo,Y2=Y1+Ho (8);
step 2.3: examining the positional relation between the four-chamber heart and the thorax in the fetal four-chamber cardiac ultrasound sectional image; if the four-chamber heart region lies inside the thoracic region, satisfying X1 < x1 < x2 < X2 and Y1 < y1 < y2 < Y2, the image is taken as the sectional image to be evaluated for the next operation; otherwise the image is regarded as an unqualified fetal four-chamber cardiac ultrasound sectional image.
3. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 2, wherein: the step 3 comprises the following steps:
step 3.1: according to the position coordinates of the manually marked four-chamber-region bounding boxes, extracting all four-chamber heart region images from the data set divided in step 2.1, and marking the inner ventricular/atrial edges of the four-chamber regions with an image segmentation marking tool; in the resulting segmentation maps, pixels inside the ventricular/atrial edge are all 1 and pixels outside it are all 0;
step 3.2: training the U2-Net segmentation model with the four-chamber heart regions and corresponding segmentation maps obtained in step 3.1.
4. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 2, wherein: the step 4 comprises the following steps:
step 4.1: extracting the four-chamber heart region from the fetal four-chamber cardiac ultrasound sectional image to be evaluated that passed the test of step 2.3, inputting it into the trained U2-Net segmentation model, and outputting the segmentation map corresponding to the four-chamber heart region;
step 4.2: performing graying processing on the four-chamber heart region of step 4.1;
step 4.3: binarizing the segmentation map obtained in step 4.1 with OTSU;
step 4.4: in order to fully filter out the region outside the ventricular/atrial edge, eroding the segmentation map obtained after the OTSU processing of step 4.3, and masking the grayed four-chamber heart region image of step 4.2 with the eroded segmentation map, i.e. multiplying the two.
5. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 4, wherein: the step 5 comprises the following steps:
step 5.1: counting the histogram of the image subjected to the mask operation in the step 4.4;
step 5.2: correcting the image and the histogram thereof in the step 5.1, namely not counting the number of pixel points with the gray level of 0;
step 5.3: and (5) segmenting the four-cavity heart area image after the histogram correction by using the OTSU.
6. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 5, wherein: the step 6 comprises the following steps:
step 6.1: counting the number of pixels in the segmentation map of the eroded region inside the ventricular/atrial edge from step 4.4;
step 6.2: counting the number of pixels in the valve and atrioventricular-septum region segmented by OTSU in step 5.3;
step 6.3: calculating the difference between the two counts to obtain the number of pixels of the ventricular/atrial region, and obtaining through formula (9) the area ratio k' of the valve plus atrioventricular septum to the ventricular/atrial region in the fetal four-chamber cardiac ultrasound sectional image to be evaluated;
let the pixel matrix of the eroded segmentation map of the region inside the ventricular/atrial edge be A of size m×n, and the pixel matrix of the OTSU-segmented valve and atrioventricular-septum image be B of size m×n;
k' = (ΣΣ Bij) / (ΣΣ Aij - ΣΣ Bij)  (9), with the sums over i = 1..m and j = 1..n.
7. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 1, wherein: the step 7 comprises the following steps:
step 7.1: multiplying the segmented image of the valve and atrioventricular-septum region by the grayed original image;
step 7.2: calculating with formula (10) the average pixel value of the valve and atrioventricular-septum region, i.e. the ratio of the sum of pixel values of the masked image to its number of non-zero pixels, and taking this average as the average brightness of the valve and atrioventricular-septum region, denoted k'';
let the grayed original-image pixel matrix be C of size m×n;
k'' = (ΣΣ Bij·Cij) / (ΣΣ Bij)  (10), with the sums over i = 1..m and j = 1..n.
8. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 6, wherein: the step 8 comprises the following steps:
step 8.1: subtracting the valve and atrioventricular-septum segmentation map of step 5.3 from the eroded segmentation map of the region inside the ventricular/atrial edge of step 4.4 to obtain the segmentation map of the ventricular/atrial region;
step 8.2: multiplying the ventricular/atrial segmentation map of step 8.1 by the grayed original image;
step 8.3: calculating the average pixel value of the ventricular/atrial region, i.e. the ratio of the sum of pixel values of the masked image to its number of non-zero pixels, and obtaining the average gray k''' of the ventricular/atrial region with formula (11);
k''' = 255 - (ΣΣ (Aij - Bij)·Cij) / (ΣΣ (Aij - Bij))  (11), with the sums over i = 1..m and j = 1..n;
when k''' is larger, the ventricular/atrial region is darker; when k''' is smaller, the region is brighter.
9. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 8, wherein: the step 9 comprises the following steps:
step 9.1: running steps 2.3-8.3 on the training and validation sets of step 2.1, and counting the ranges of the three quality evaluation indexes in standard fetal four-chamber cardiac ultrasound sectional images, namely the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region, the average brightness of the valve and atrioventricular-septum region, and the average gray of the ventricular/atrial region; the ranges are 0.11-0.57, 44-143 and 189-248 respectively;
step 9.2: comparing the three quality evaluation indexes of the test set with the ranges set in step 9.1; when all three fall within the set ranges, the four-chamber heart region meets the standard; otherwise it is rated D;
step 9.3: obtaining with formula (12) the three normalized quality evaluation indexes, namely the area ratio K1, the average brightness K2 and the average gray K3;
K1 = (k' - k'min)/(k'max - k'min), K2 = (k'' - k''min)/(k''max - k''min), K3 = (k''' - k'''min)/(k'''max - k'''min)  (12);
step 9.4: obtaining the quality score of the sectional image by weighting the three normalized quality evaluation indexes with formula (13);
[formula (13) is rendered as an image in the source: the quality score F is a weighted sum of K1, K2 and K3, with a score penalty applied when K1 ≥ 0.66]
since the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region must not be too large, an appropriate score penalty is applied when the normalized area ratio is 0.66 or greater;
step 9.5: comparing the quality of four-chamber sectional images by their quality scores, and giving a corresponding quality evaluation report according to the preset quality evaluation rules.
10. The method for automatically evaluating the image quality of the four-chamber ultrasonic section of the fetal heart according to claim 9, wherein the quality evaluation rule is as follows:
① area ratio:
when 0 ≤ K1 < 0.31, the area of the valve plus atrioventricular septum is small relative to the ventricular/atrial region;
when 0.31 ≤ K1 < 0.66, the area ratio of the valve plus atrioventricular septum to the ventricular/atrial region is appropriate;
when 0.66 ≤ K1 ≤ 1, the area ratio is large;
② average brightness:
when 0 ≤ K2 < 0.25, the valve and atrioventricular septum are unclear;
when 0.25 ≤ K2 < 0.50, the valve and atrioventricular septum are not very clear;
when 0.50 ≤ K2 ≤ 1, the valve and atrioventricular septum are clear;
③ average gray:
when 0 ≤ K3 < 0.35, the ventricular/atrial region is bright;
when 0.35 ≤ K3 < 0.70, the brightness of the ventricular/atrial region is moderate;
when 0.70 ≤ K3 ≤ 1, the ventricular/atrial region is dark;
④ quality score:
when 0 ≤ F < 0.378, the rating is C;
when 0.378 ≤ F < 0.473, the rating is B;
when 0.473 ≤ F ≤ 0.864, the rating is A.
CN202210013358.2A 2022-01-06 2022-01-06 Automatic evaluation method for image quality of four-chamber heart ultrasonic section of fetal heart Pending CN114494157A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210013358.2A CN114494157A (en) 2022-01-06 2022-01-06 Automatic evaluation method for image quality of four-chamber heart ultrasonic section of fetal heart


Publications (1)

Publication Number Publication Date
CN114494157A true CN114494157A (en) 2022-05-13



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760851A (en) * 2023-01-06 2023-03-07 首都儿科研究所附属儿童医院 Ultrasonic image data processing method and system based on machine learning
CN115797296A (en) * 2022-12-05 2023-03-14 北京智影技术有限公司 Method and device for automatically measuring diaphragm thickness and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2705731A1 (en) * 2009-02-23 2010-08-23 Sunnybrook Health Sciences Centre Method for automatic segmentation of images
WO2017193251A1 (en) * 2016-05-09 2017-11-16 深圳迈瑞生物医疗电子股份有限公司 Method and system for recognizing region of interest profile in ultrasound image
CN110163825A (en) * 2019-05-23 2019-08-23 大连理工大学 A kind of denoising of human embryos cardiac ultrasound images and Enhancement Method
CN112348780A (en) * 2020-10-26 2021-02-09 首都医科大学附属北京安贞医院 Fetal heart measuring method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUANG ZHIBIAO: "Ultrasound image segmentation based on pixel clustering", Journal of Computer Applications (《计算机应用》), 10 February 2017 (2017-02-10), pages 569 - 573 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797296A (en) * 2022-12-05 2023-03-14 北京智影技术有限公司 Method and device for automatically measuring diaphragm thickness and storage medium
CN115797296B (en) * 2022-12-05 2023-09-05 北京智影技术有限公司 Automatic diaphragm thickness measuring method, device and storage medium
CN115760851A (en) * 2023-01-06 2023-03-07 首都儿科研究所附属儿童医院 Ultrasonic image data processing method and system based on machine learning
CN115760851B (en) * 2023-01-06 2023-05-09 首都儿科研究所附属儿童医院 Ultrasonic image data processing equipment, system and computer readable storage medium based on machine learning

Similar Documents

Publication Publication Date Title
US9256941B2 (en) Microcalcification detection and classification in radiographic images
EP1593094B1 (en) Image analysis for the purpose of assessing cancer
CN107798679B (en) Breast region segmentation and calcification detection method for mammary gland molybdenum target image
EP1793350B1 (en) Ultrasound imaging system and method for forming a 3D ultrasound image of a target object
EP1229493B1 (en) Multi-mode digital image processing method for detecting eyes
Wimmer et al. A generic probabilistic active shape model for organ segmentation
CN114494157A (en) Automatic evaluation method for image quality of four-chamber heart ultrasonic section of fetal heart
CN108537751B (en) Thyroid ultrasound image automatic segmentation method based on radial basis function neural network
CN108615239B (en) Tongue image segmentation method based on threshold technology and gray level projection
US20230005140A1 (en) Automated detection of tumors based on image processing
CN110120056A (en) Blood leucocyte dividing method based on self-adapting histogram threshold value and contour detecting
Susomboon et al. A hybrid approach for liver segmentation
Ghose et al. A random forest based classification approach to prostate segmentation in MRI
Yaqub et al. Automatic detection of local fetal brain structures in ultrasound images
CN108765427A (en) A kind of prostate image partition method
CN110211671B (en) Thresholding method based on weight distribution
CN116912255B (en) Follicular region segmentation method for ovarian tissue analysis
CN117237591A (en) Intelligent removal method for heart ultrasonic image artifacts
Chen et al. Automatic ovarian follicle quantification from 3D ultrasound data using global/local context with database guided segmentation
CN114972272A (en) Grad-CAM-based segmentation method for new coronary pneumonia lesions
CN101404062A (en) Automatic screening method for digital galactophore image based on decision tree
Panigrahi et al. An enhancement in automatic seed selection in breast cancer ultrasound images using texture features
CN111724356A (en) Image processing method and system for CT image pneumonia identification
CN104680529B (en) Arteria carotis inside and outside wall automatic division method based on shape prior and similarity constraint
US20230115927A1 (en) Systems and methods for plaque identification, plaque composition analysis, and plaque stability detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240710

Address after: 1003, Building A, Zhiyun Industrial Park, No. 13 Huaxing Road, Tongsheng Community, Dalang Street, Longhua District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Wanzhida Enterprise Management Co.,Ltd.

Country or region after: China

Address before: 443002 No. 8, University Road, Xiling District, Yichang, Hubei

Applicant before: CHINA THREE GORGES University

Country or region before: China