CN114693672A - Mammary gland molybdenum target image skin gland and nipple removing method based on image processing - Google Patents

Mammary gland molybdenum target image skin gland and nipple removing method based on image processing

Info

Publication number
CN114693672A
CN114693672A (application CN202210448093.9A)
Authority
CN
China
Prior art keywords
image
gland
skin
molybdenum target
nipple
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210448093.9A
Other languages
Chinese (zh)
Inventor
陈丰农
章梦达
刘元振
张娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202210448093.9A
Publication of CN114693672A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G06T 2207/20032 Median filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30068 Mammography; Breast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30096 Tumor; Lesion

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Health & Medical Sciences (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image-processing-based method for removing the skin gland and nipple from mammary gland molybdenum target images, which comprises the following steps: acquiring and preprocessing an original mammary gland molybdenum target image data set; binarizing each preprocessed image with the Otsu algorithm and denoising it to obtain a binarized mammary molybdenum target image; delineating the skin gland region of the preprocessed images and calculating the average skin gland thickness; determining the skin gland boundary of each binarized mammary gland molybdenum target image and obtaining the corresponding skin gland mask image by morphological operation; judging the type of each skin gland mask image, translating it accordingly, and fusing the interior of the boundary according to the translation difference to obtain a nipple mask image; inverting the color of the nipple mask image and normalizing it; and multiplying the normalized image by the corresponding preprocessed image to obtain the corresponding target mammary gland molybdenum target image. The method accurately removes the skin gland and nipple regions, avoids errors and over-cutting caused by human factors, and crops more efficiently than manual cutting.

Description

Mammary gland molybdenum target image skin gland and nipple removing method based on image processing
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a mammary gland molybdenum target image skin gland and nipple removing method based on image processing.
Background
Accurately quantifying the gland content in molybdenum target images is one of the important means for diagnosing early breast cancer and is of great significance for breast cancer diagnosis. In the 1970s, scholars proposed a correlation between the breast gland content in molybdenum target images and breast cancer, and it was later suggested that breast gland density can serve as one of the independent risk factors for breast cancer. However, because the gray levels of the skin gland and nipple regions in molybdenum target images are close to or exceed those of the gland region, the skin gland and nipple regions must be removed before the gland content of a molybdenum target image can be measured accurately. Accurately removing the skin gland and nipple regions in molybdenum target images is therefore of great significance for quantifying mammary gland content and the risk of suffering from breast cancer.
At present, the skin gland and nipple regions in molybdenum target images are removed mainly by manual cropping. Manual cropping is time-consuming and inefficient, and the molybdenum target image of the same patient may be cropped differently by different doctors owing to subjective factors. In particular, manual cropping of the skin gland distorts the original breast contour, which introduces errors when the gland proportion is subsequently quantified. In view of the large error of this approach in gland quantification, a method for removing the skin gland and nipple from mammary gland molybdenum target images based on image processing is provided.
Disclosure of Invention
The invention aims to solve the above problems and provides a method for removing the skin gland and nipple from mammary gland molybdenum target images based on image processing, which can accurately remove the skin gland and nipple regions, avoids errors and over-cutting caused by human factors, and has higher cropping efficiency than the traditional manual cutting method.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
the invention provides a mammary gland molybdenum target image skin gland and nipple removing method based on image processing, which comprises the following steps:
s1, acquiring a data set of original breast molybdenum target images and preprocessing, wherein the preprocessing comprises the steps of sequentially carrying out format conversion and shooting information label removal on each original breast molybdenum target image;
s2, binarizing the preprocessed image based on an Otsu algorithm, and denoising the binarized image by adopting median filtering to obtain a binarized mammary molybdenum target image;
s3, delineating the skin gland area of the preprocessed image by using Labelme software, counting the number of pixel points occupied by the skin gland area of the preprocessed image, and taking the number of the pixel points occupied by the average skin gland as the average thickness of the skin gland;
s4, determining the skin gland boundary of each binary breast molybdenum target image based on a Canny edge detection algorithm, and acquiring a corresponding skin gland mask image by using morphological operation according to the average thickness of the skin glands, wherein the skin gland area in the skin gland mask image is marked as white, the rest areas are marked as black, and the morphological operation adopts corrosion operation;
s5, judging the type of each skin gland mask image, if the image is a left breast image, translating a skin gland area in the skin gland mask image to the right by at least one-half image width, fusing the interior of a boundary according to translation difference, marking the fused area as white, marking the rest areas as black, obtaining a nipple mask image, if the image is a right breast image, translating the skin gland area in the skin gland mask image to the left by at least one-half image width, fusing the interior of the boundary according to translation difference, marking the fused area as white, marking the rest areas as black, and obtaining the nipple mask image;
s6, reversing the color of the nipple mask image and then carrying out normalization processing;
and S7, multiplying each nipple mask image after normalization processing by the corresponding preprocessed image to remove the skin gland and the nipple, and obtaining the corresponding target breast molybdenum target image.
Preferably, the original breast molybdenum target image data set comprises equal numbers of left and right breast images, and the numbers of head and tail position (craniocaudal, CC) images and lateral oblique position (mediolateral oblique, MLO) images in all images are equal.
Preferably, the format conversion converts the DICOM format to the PNG format.
Preferably, the shooting information tag removing operation is specifically as follows:
and extracting the maximum contour of the original mammary gland molybdenum target image based on a maximum contour detection algorithm, and multiplying the maximum contour with the original mammary gland molybdenum target image after format conversion to obtain an image without the shooting information label.
Preferably, the median filtering employs a 5 × 5 region.
Preferably, the kernel of the erosion (corrosion) operation is elliptical and 30 × 30 in size.
Preferably, the Canny operator of the Canny edge detection algorithm employs a weak boundary threshold of 100 and a strong boundary threshold of 200.
Compared with the prior art, the invention has the following beneficial effects:
1) According to the thickness and position characteristics of the skin gland and nipple regions in breast molybdenum target images, the method counts the number of pixels across the thickness of every skin gland in the existing data set to obtain the average skin gland thickness, and sets the kernel structure and size of the corresponding morphological operation according to that average thickness. The skin gland and nipple regions are thereby removed accurately, errors and over-cutting caused by human factors are avoided, and a more accurate basis is provided for calculating the gland content in molybdenum target images and estimating a patient's risk of breast cancer;
2) Compared with the traditional manual cropping method, the method crops far more efficiently. For example, with a data set of 500 image samples, the average time to remove the skin gland region of a single image is 300 milliseconds and the average time to remove a single nipple region is 325 milliseconds, whereas manually delineating a single skin gland region takes 2-3 minutes and manually delineating a single nipple region takes 0.5-1 minute. The efficiency of removing the skin gland and nipple regions is thus far higher than that of manual cropping, which greatly improves working efficiency.
Drawings
FIG. 1 is a flow chart of the method for removing skin glands and nipples from breast molybdenum target images according to the present invention;
FIG. 2 is a schematic diagram of the processing of the original breast molybdenum target image through steps S1 and S2;
FIG. 3 is a schematic view of the skin gland removal process of the present invention;
fig. 4 is a schematic view of the nipple removal process of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is to be noted that, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
As shown in fig. 1-4, a method for removing mammary gland molybdenum target image skin gland and nipple based on image processing includes the following steps:
S1, acquiring the original breast molybdenum target image data set and preprocessing it, where the preprocessing comprises sequentially performing format conversion and the shooting-information-label removal operation on each original breast molybdenum target image.
In one embodiment, the original breast molybdenum target image data set includes equal numbers of left and right breast images, and the numbers of head and tail position (craniocaudal, CC) images and lateral oblique position (mediolateral oblique, MLO) images in all images are equal. An original breast molybdenum target image may therefore be a CC view or an MLO view, and balancing the data set facilitates the subsequent median filtering and morphological processing.
In one embodiment, the format conversion converts the DICOM format to the PNG format: the pixel values are mapped to 0-255 and stored as a high-quality PNG molybdenum target image.
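As a non-limiting illustration, this conversion could be scripted roughly as in the sketch below, which assumes the pydicom and OpenCV libraries are available; the file names are hypothetical examples and not part of the invention.

```python
# Minimal sketch of the S1 format conversion (assumed libraries: pydicom, OpenCV, NumPy).
import cv2
import numpy as np
import pydicom

def dicom_to_png(dicom_path: str, png_path: str) -> np.ndarray:
    ds = pydicom.dcmread(dicom_path)
    arr = ds.pixel_array.astype(np.float32)
    # Map the raw pixel values linearly to the 0-255 range.
    arr = (arr - arr.min()) / max(float(arr.max() - arr.min()), 1e-6) * 255.0
    img = arr.astype(np.uint8)
    cv2.imwrite(png_path, img)  # stored as a PNG molybdenum target image
    return img

img = dicom_to_png("mammogram.dcm", "mammogram.png")  # hypothetical paths
```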
In an embodiment, the operation of removing the shooting information tag is specifically as follows:
and extracting the maximum contour of the original mammary gland molybdenum target image based on a maximum contour detection algorithm, and multiplying the maximum contour by the original mammary gland molybdenum target image after format conversion to obtain an image without shooting information labels (such as information of position, hospital, time and the like). And when the original breast molybdenum target image subjected to format conversion, namely the PNG molybdenum target image, uses a maximum contour detection algorithm, all the outer contours are searched and then sequenced, the extracted maximum contour area is a breast area, and the maximum contour is multiplied by the corresponding PNG molybdenum target image to obtain an image without the shooting information label.
S2, binarizing the preprocessed image based on the Otsu algorithm and denoising the binarized image with median filtering to obtain a binarized mammary molybdenum target image.
In one embodiment, the median filtering uses a 5 × 5 region. The Otsu algorithm (OTSU) binarizes the mammary gland region by automatically computing a segmentation threshold that separates the target from the background, and noise is then removed by median filtering over 5 × 5 regions of the image. As shown in fig. 2, from left to right are the original breast molybdenum target image (DICOM molybdenum target image), the format-converted original image (PNG molybdenum target image), the image with the shooting information labels removed (i.e., the preprocessed image), the Otsu binarized image, and the median-filtered image (i.e., the binarized breast molybdenum target image).
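A minimal sketch of this binarization and denoising step, assuming OpenCV, might look as follows; `pre` stands for the preprocessed grayscale image produced by step S1.

```python
# Sketch of step S2: Otsu binarization followed by 5 x 5 median filtering.
import cv2
import numpy as np

def binarize_and_denoise(pre: np.ndarray) -> np.ndarray:
    # Otsu automatically selects the threshold separating the breast (target) from the background.
    _, binary = cv2.threshold(pre, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 5 x 5 median filtering removes isolated noise in the binary image.
    return cv2.medianBlur(binary, 5)
```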
S3, delineating the skin gland region of each preprocessed image with Labelme software, counting the number of pixels occupied by the skin gland region in the preprocessed images, and taking the average number of pixels occupied by the skin gland as the average skin gland thickness.
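One possible way to turn the Labelme delineations into the average-thickness statistic is sketched below; it assumes the annotations were exported as binary mask images (white = skin gland) and that the per-row pixel count is treated as the local thickness, both of which are interpretations rather than details stated in the text.

```python
# Hypothetical sketch: average skin gland thickness (in pixels) from exported mask images.
import glob
import cv2
import numpy as np

def average_skin_gland_thickness(mask_dir: str) -> float:
    per_image_means = []
    for path in glob.glob(f"{mask_dir}/*.png"):  # hypothetical export location
        mask = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        # Count skin gland pixels in each image row; non-empty rows give the
        # local thickness of the skin gland band in that row.
        row_counts = np.count_nonzero(mask > 0, axis=1)
        row_counts = row_counts[row_counts > 0]
        if row_counts.size:
            per_image_means.append(float(row_counts.mean()))
    return float(np.mean(per_image_means)) if per_image_means else 0.0
```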
S4, determining the skin gland boundary of each binary breast molybdenum target image based on a Canny edge detection algorithm, and acquiring a corresponding skin gland mask image by using morphological operation according to the average thickness of the skin glands, wherein the skin gland area in the skin gland mask image is marked as white, the rest areas are marked as black, and the morphological operation adopts corrosion operation.
In one embodiment, the kernel of the erosion (corrosion) operation is elliptical and 30 × 30 in size. The kernel size is determined by the average skin gland thickness: the more pixels across the skin gland thickness, the larger the erosion kernel.
In one embodiment, the Canny operator of the Canny edge detection algorithm uses a weak boundary threshold of 100 and a strong boundary threshold of 200. With the weak boundary threshold set to 100 and the strong boundary threshold set to 200, the Canny operator searches the binarized breast molybdenum target image for the skin gland boundary, and the boundary coordinate points of the binarized image are recorded.
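The morphological step is described only at a high level; the sketch below is one plausible realization in which the binarized breast mask is eroded with the 30 × 30 elliptical kernel and subtracted from the original mask, leaving a band of roughly the average skin gland thickness along the Canny boundary. The subtraction is an interpretation of "acquiring the skin gland mask by morphological operation", not wording taken from the text.

```python
# Hedged sketch of step S4: Canny boundary plus erosion-based skin gland band.
import cv2
import numpy as np

def skin_gland_mask(binary: np.ndarray):
    # Skin gland boundary found with the stated Canny thresholds (weak 100, strong 200).
    boundary = cv2.Canny(binary, 100, 200)
    boundary_points = np.column_stack(np.nonzero(boundary))  # recorded boundary coordinates
    # 30 x 30 elliptical kernel, sized from the average skin gland thickness.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (30, 30))
    eroded = cv2.erode(binary, kernel)
    # Interpretation: the difference between the breast mask and its eroded version
    # is a white band along the boundary, i.e. the skin gland mask (black elsewhere).
    mask = cv2.subtract(binary, eroded)
    return mask, boundary_points
```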
As shown in fig. 3, from left to right, a median filter image (i.e., a binarized breast molybdenum target image), a boundary mask image (i.e., a skin gland boundary map determined by the Canny edge detection algorithm), a boundary map after corrosion (i.e., a skin gland mask image), a color inversion map, and a removed skin gland map are sequentially performed. It should be noted that, for the color inversion map and the removed skin gland map, when the skin gland region needs to be removed independently, the skin gland mask image needs to be color-inverted and then normalized, and then each normalized skin gland mask image is multiplied by the corresponding preprocessed image to remove the skin gland.
S5, judging the type of each skin gland mask image. If the image is a left breast image, the skin gland region in the skin gland mask image is translated to the right by at least one half of the image width, the interior of the boundary is fused according to the translation difference, the fused region is marked white and the remaining regions black, and the nipple mask image is obtained. If the image is a right breast image, the skin gland region in the skin gland mask image is translated to the left by at least one half of the image width and processed in the same way to obtain the nipple mask image.
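Because "fusing the interior of the boundary according to the translation difference" leaves room for interpretation, the sketch below shows only one plausible reading: the translated mask is united with the original one and, row by row, the span between the outermost white pixels is filled. The left/right-breast decision is assumed to be supplied externally.

```python
# Hedged sketch of step S5: nipple mask from the skin gland mask by translation and fusion.
import cv2
import numpy as np

def nipple_mask_from_skin_mask(skin_mask: np.ndarray, is_left_breast: bool) -> np.ndarray:
    h, w = skin_mask.shape                        # single-channel mask assumed
    dx = w // 2 if is_left_breast else -(w // 2)  # at least half the image width
    # Horizontal translation without wrap-around.
    M = np.float32([[1, 0, dx], [0, 1, 0]])
    shifted = cv2.warpAffine(skin_mask, M, (w, h), flags=cv2.INTER_NEAREST)
    union = cv2.bitwise_or(skin_mask, shifted)
    # "Fuse the interior according to the translation difference": interpreted here as
    # filling, per row, the span between the outermost white pixels of the union.
    fused = np.zeros_like(union)
    for y in range(h):
        cols = np.flatnonzero(union[y] > 0)
        if cols.size:
            fused[y, cols[0]:cols[-1] + 1] = 255
    return fused  # white = fused (skin gland and nipple) region, black elsewhere
```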
S6, the nipple mask image is color-inverted and then normalized. After inversion and normalization, the pixels of the white region have value 1 and correspond to the area to be kept, while the pixels of the black region have value 0 and correspond to the skin gland and nipple area, i.e., the area to be removed.
S7, multiplying each normalized nipple mask image by the corresponding preprocessed image to remove the skin gland and nipple, obtaining the corresponding target breast molybdenum target image.
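Steps S6 and S7 amount to an element-wise masking, which could be sketched as follows; `pre` is the preprocessed image from step S1 and `nipple_mask` is the white-on-black mask from step S5.

```python
# Minimal sketch of steps S6 and S7: invert, normalize, and multiply.
import numpy as np

def remove_skin_gland_and_nipple(pre: np.ndarray, nipple_mask: np.ndarray) -> np.ndarray:
    inverted = 255 - nipple_mask                    # kept region becomes white
    weights = inverted.astype(np.float32) / 255.0   # normalize: 1 = keep, 0 = remove
    return (pre.astype(np.float32) * weights).astype(np.uint8)
```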
As shown in fig. 4, the image is a left breast image, and a median filter image (i.e., a binarized breast molybdenum target image), a boundary mask image (i.e., a skin gland boundary image determined by a Canny edge detection algorithm), a boundary map after erosion (i.e., a skin gland mask image, not shown), a boundary translation map, a boundary fusion map (white is a fusion region), a color inversion map, and a skin gland and nipple removal map (i.e., a target breast molybdenum target image) are sequentially arranged in the order indicated by an arrow.
It should be noted that the operator can also adjust the processing according to actual requirements, for example choosing to remove only the nipple area or only the skin gland area, and performing the simple multiplication with the corresponding mask image.
According to the thickness and position characteristics of the skin gland and nipple regions in breast molybdenum target images, the method counts the number of pixels across the thickness of every skin gland in the existing data set to obtain the average skin gland thickness and sets the kernel structure and size of the corresponding morphological operation accordingly, so the skin gland and nipple regions are removed accurately, errors and over-cutting caused by human factors are avoided, and a more accurate basis is provided for calculating the gland content in molybdenum target images and estimating a patient's risk of breast cancer. Compared with the traditional manual cropping method, the method also crops far more efficiently: with a data set of 500 image samples, the average time to remove the skin gland region of a single image is 300 milliseconds and the average time to remove a single nipple region is 325 milliseconds, whereas manually delineating a single skin gland region takes 2-3 minutes and a single nipple region takes 0.5-1 minute. The efficiency of removing the skin gland and nipple regions is therefore far higher than that of manual cropping, which greatly improves working efficiency.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-described embodiments only express several specific and detailed implementations of the present application and should not be construed as limiting the scope of the claims. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (7)

1. A mammary gland molybdenum target image skin gland and nipple removing method based on image processing is characterized in that: the method for removing the mammary gland molybdenum target image skin gland and nipple based on image processing comprises the following steps:
s1, acquiring a data set of original breast molybdenum target images and preprocessing, wherein the preprocessing comprises the steps of sequentially carrying out format conversion and shooting information label removal on each original breast molybdenum target image;
s2, binarizing the preprocessed image based on an Otsu algorithm, and denoising the binarized image by adopting median filtering to obtain a binarized mammary molybdenum target image;
s3, delineating the skin gland area of the preprocessed image by using Labelme software, counting the number of pixel points occupied by the skin gland area of the preprocessed image, and taking the number of the pixel points occupied by the average skin gland as the average thickness of the skin gland;
s4, determining the skin gland boundary of each binary breast molybdenum target image based on a Canny edge detection algorithm, and acquiring a corresponding skin gland mask image by using morphological operation according to the average thickness of the skin glands, wherein the skin gland area in the skin gland mask image is marked as white, the rest areas are marked as black, and the morphological operation adopts corrosion operation;
s5, judging the type of each skin gland mask image, if the skin gland mask image is a left breast image, translating a skin gland area in the skin gland mask image to the right by at least one half of image width, fusing the interior of a boundary according to translation difference, marking the fused area as white, marking the rest areas as black, obtaining a nipple mask image, if the skin gland mask image is a right breast image, translating the skin gland area in the skin gland mask image to the left by at least one half of image width, fusing the interior of the boundary according to translation difference, marking the fused area as white, marking the rest areas as black, and obtaining the nipple mask image;
s6, reversing the color of the nipple mask image and then carrying out normalization processing;
and S7, multiplying each nipple mask image after normalization processing by the corresponding preprocessed image to remove the skin gland and the nipple, and obtaining a corresponding target breast molybdenum target image.
2. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the original mammary gland molybdenum target image data set comprises a left mammary gland image and a right mammary gland image which are equal in number, and the head and tail position images and the lateral oblique position images in all the images are equal in number.
3. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the format conversion converts the DICOM format to the PNG format.
4. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the shooting information tag removing operation is specifically as follows:
and extracting the maximum contour of the original mammary gland molybdenum target image based on a maximum contour detection algorithm, and multiplying the maximum contour with the original mammary gland molybdenum target image after format conversion to obtain an image without the shooting information label.
5. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the median filtering takes a 5 x 5 region.
6. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the kernel of the erosion (corrosion) operation is elliptical and 30 × 30 in size.
7. The image processing-based breast molybdenum target image skin gland and nipple removal method of claim 1, wherein: the Canny operator of the Canny edge detection algorithm adopts a weak boundary threshold value of 100 and a strong boundary threshold value of 200.
CN202210448093.9A 2022-04-26 2022-04-26 Mammary gland molybdenum target image skin gland and nipple removing method based on image processing Pending CN114693672A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210448093.9A CN114693672A (en) 2022-04-26 2022-04-26 Mammary gland molybdenum target image skin gland and nipple removing method based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210448093.9A CN114693672A (en) 2022-04-26 2022-04-26 Mammary gland molybdenum target image skin gland and nipple removing method based on image processing

Publications (1)

Publication Number Publication Date
CN114693672A true CN114693672A (en) 2022-07-01

Family

ID=82144981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210448093.9A Pending CN114693672A (en) 2022-04-26 2022-04-26 Mammary gland molybdenum target image skin gland and nipple removing method based on image processing

Country Status (1)

Country Link
CN (1) CN114693672A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760886A (en) * 2022-11-15 2023-03-07 中国平安财产保险股份有限公司 Plot partitioning method and device based on aerial view of unmanned aerial vehicle and related equipment
CN115760886B (en) * 2022-11-15 2024-04-05 中国平安财产保险股份有限公司 Land parcel dividing method and device based on unmanned aerial vehicle aerial view and related equipment
CN116433695A (en) * 2023-06-13 2023-07-14 天津市第五中心医院 Mammary gland region extraction method and system of mammary gland molybdenum target image
CN116433695B (en) * 2023-06-13 2023-08-22 天津市第五中心医院 Mammary gland region extraction method and system of mammary gland molybdenum target image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination