WO2021159767A1 - Medical image processing method, image processing method, and apparatus - Google Patents
- Publication number
- WO2021159767A1 (PCT/CN2020/126063; CN2020126063W)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS; G06—COMPUTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/11—Region-based segmentation
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/73—Deblurring; Sharpening
- G06T7/0012—Biomedical image inspection
- G06T7/136—Segmentation; Edge detection involving thresholding
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
- G06T7/90—Determination of colour characteristics
- G06T2207/10024—Color image
- G06T2207/20032—Median filtering
- G06T2207/20081—Training; Learning
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
- Y02A10/40—Controlling or monitoring, e.g. of flood or hurricane; Forecasting, e.g. risk assessment or mapping
Definitions
- This application relates to the field of artificial intelligence, specifically to image processing technology.
- WSI: whole slide image, i.e., a whole-field-of-view digital pathology slide.
- In the prior art, the main way to extract the pathological tissue area from a WSI image is to first downscale the WSI image to a certain scale and convert it into a grayscale image, then perform further image processing on the grayscale image, such as binarization and hole removal, and finally extract the pathological tissue area from the processed image.
- This application provides a medical image processing method, an image processing method, and an apparatus, which generate a difference image from the color information of different channels before the image is binarized. This makes effective use of the image's color information, so the pathological tissue area extracted based on the difference image is more accurate, which benefits subsequent image analysis.
- the first aspect of the present application provides a method for medical image processing, which is executed by a server, and includes:
- the medical image to be processed is a color image
- the medical image to be processed includes first image data, second image data, and third image data, which respectively correspond to color information under different attributes;
- Binarization processing is performed on the difference image to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- the second aspect of the present application provides an image processing method, which is executed by a server, and includes:
- first image to be processed is a color image
- the first image to be processed includes first image data, second image data, and third image data, which respectively correspond to color information in different channels;
- the foreground area of the binarized image corresponds to the pathological tissue area of the first image to be processed; the pathological tissue area is extracted from the first image to be processed according to the foreground area of the binarized image;
- a composite image is generated, where the pathological tissue area is located in the first layer, the second image to be processed is located in the second layer, and the first layer is overlaid on the second layer.
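The two-layer composition described above can be sketched as a mask-driven overlay. This is a hypothetical pure-Python illustration of the layering idea, not the patent's implementation; the pixel format (RGB tuples) and all names are ours:

```python
def composite(top, mask, bottom):
    """Overlay `top` onto `bottom` wherever `mask` is 1 (first layer over second)."""
    h, w = len(bottom), len(bottom[0])
    return [[top[y][x] if mask[y][x] == 1 else bottom[y][x]
             for x in range(w)] for y in range(h)]

tissue = [[(120, 40, 60), (0, 0, 0)]]          # first layer: extracted tissue area
mask   = [[1, 0]]                              # 1 marks pathological tissue pixels
canvas = [[(255, 255, 255), (255, 255, 255)]]  # second layer: second image to be processed
result = composite(tissue, mask, canvas)
# the tissue pixel shows through on the left; the canvas pixel remains on the right
```

Only pixels inside the extracted foreground replace the underlying layer, so the second image to be processed remains visible everywhere else.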
- a third aspect of the present application provides a medical image processing device, including:
- the acquisition module is used to acquire the medical image to be processed, where the medical image to be processed is a color image and includes first image data, second image data, and third image data, which respectively correspond to color information under different channels;
- a generating module for generating a difference image according to the first image data, the second image data, and the third image data
- the processing module is configured to perform binarization processing on the difference image to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- the generating module is specifically configured to generate a maximum value image and a minimum value image according to the first image data, the second image data, and the third image data included in the medical image to be processed;
- the generating module is specifically used to determine the maximum pixel value and the minimum pixel value from the first pixel value at the first pixel position in the first image data, the second pixel value at the second pixel position in the second image data, and the third pixel value at the third pixel position in the third image data;
- the maximum value image is obtained according to the maximum pixel value and the minimum value image according to the minimum pixel value, where the pixel value at the fourth pixel position in the maximum value image is the maximum pixel value, and the pixel value at the fifth pixel position in the minimum value image is the minimum pixel value;
- the first pixel position, the second pixel position, the third pixel position, the fourth pixel position, and the fifth pixel position all correspond to the position of the same pixel in the medical image to be processed;
- the generating module is specifically configured to determine the pixel difference value according to the pixel value of the fourth pixel position in the maximum value image and the pixel value of the fifth pixel position in the minimum value image;
- the difference image is obtained, where the pixel value at the sixth pixel position in the difference image is the pixel difference value, and the fourth pixel position, the fifth pixel position, and the sixth pixel position all correspond to the position of the same pixel in the medical image to be processed.
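The maximum value image, minimum value image, and difference image described above can be sketched in pure Python. This is a hypothetical illustration of the claim language, not the patent's reference implementation; the channel arrays, sample values, and function name are ours:

```python
def difference_image(ch1, ch2, ch3):
    """Per-pixel maximum image, minimum image, and their difference over three channels."""
    h, w = len(ch1), len(ch1[0])
    max_img = [[0] * w for _ in range(h)]
    min_img = [[0] * w for _ in range(h)]
    diff = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # the same pixel position is read in all three channels
            vals = (ch1[y][x], ch2[y][x], ch3[y][x])
            max_img[y][x] = max(vals)                   # maximum value image
            min_img[y][x] = min(vals)                   # minimum value image
            diff[y][x] = max_img[y][x] - min_img[y][x]  # pixel difference value
    return max_img, min_img, diff

# Near-gray background pixels give a small difference; strongly stained
# (colorful) tissue pixels give a large one.
r, g, b = [[250, 180]], [[248, 60]], [[252, 120]]
_, _, d = difference_image(r, g, b)
# d[0][0] == 4 (near-white background), d[0][1] == 120 (stained tissue)
```

This is why the difference image separates stained tissue from near-white slide background without discarding the color information a grayscale conversion would average away.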
- the generating module is specifically configured to generate the difference image to be processed according to the first image data, the second image data, and the third image data;
- Gaussian blur processing is performed on the difference image to be processed to obtain the difference image.
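The Gaussian blur step can be sketched with a fixed 3x3 kernel applied to the difference image to suppress isolated noisy pixels before thresholding. The kernel size, integer weights, and edge-replication policy are our assumptions; the patent only names Gaussian blur:

```python
KERNEL = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]  # 3x3 Gaussian weights, sum to 16

def gaussian_blur3(img):
    """Convolve a 2-D image with the 3x3 kernel, replicating edge pixels."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    # clamp coordinates so border pixels reuse their nearest neighbor
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx] * KERNEL[dy + 1][dx + 1]
            out[y][x] = acc // 16
    return out

flat = [[10] * 3 for _ in range(3)]
assert gaussian_blur3(flat) == flat  # a constant image passes through unchanged
```

A single bright pixel is spread into its neighborhood and attenuated, which keeps isolated noise from surviving the later binarization.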
- the medical image processing apparatus further includes a determining module
- the determination module is used to determine the binarization threshold according to the difference image
- the determining module is further configured to perform binarization processing on the difference image according to the binarization threshold to obtain a binarized image.
- the determining module is specifically configured to obtain N pixel values corresponding to N pixels according to the difference image, where the pixel values correspond one-to-one with the pixels and N is an integer greater than 1, and to determine the binarization threshold according to the N pixel values.
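The claims only state that a binarization threshold is determined from the N pixel values without fixing a rule. Otsu's method, which picks the threshold maximizing between-class variance over the histogram, is one common choice and is sketched here purely as an assumption:

```python
def otsu_threshold(pixels):
    """Return the 0-255 threshold maximizing between-class variance (Otsu)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixels at or below the candidate threshold
    sum0 = 0.0  # their accumulated intensity
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                    # mean of the low class
        m1 = (total_sum - sum0) / w1      # mean of the high class
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated clusters: the threshold lands on the low cluster's edge.
t = otsu_threshold([10] * 50 + [200] * 50)
```

Pixels of the difference image above the returned threshold become foreground (candidate tissue); the rest become background.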
- the generating module is specifically used to detect the background area in the binarized image using a flooding algorithm, where the background area includes a plurality of background pixels;
- median filtering is then performed on the hole-filled image to obtain a result image, where the foreground area of the result image corresponds to the pathological tissue area of the medical image to be processed.
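The flooding step and the hole-filled image it produces can be sketched as a breadth-first flood fill seeded at the image border: zero-valued pixels reachable from the border are true background, and any zero-valued region the fill cannot reach is a hole inside tissue, which is filled as foreground. The traversal order and 4-connectivity are our assumptions; the patent names only a flooding algorithm:

```python
from collections import deque

def fill_holes(binary):
    """Flood-fill background from the border of a 0/1 image; unreached 0-pixels become 1."""
    h, w = len(binary), len(binary[0])
    background = [[False] * w for _ in range(h)]
    queue = deque()
    # seed the fill with every background pixel touching the image border
    for y in range(h):
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) and binary[y][x] == 0:
                background[y][x] = True
                queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] == 0 \
                    and not background[ny][nx]:
                background[ny][nx] = True
                queue.append((ny, nx))
    # zero pixels never reached from the border are holes: promote to foreground
    return [[1 if binary[y][x] == 1 or not background[y][x] else 0
             for x in range(w)] for y in range(h)]
```

A ring of foreground pixels with an empty center thus becomes a solid region, while background connected to the border is preserved.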
- the processing module is specifically used to perform median filter processing on the hole-filled image to obtain a filtered image
- the boundary line of the foreground area in the filtered image is determined, where the boundary line includes M pixels, and M is an integer greater than 1;
- the boundary line is extended outward by K pixels to obtain the result image, where K is an integer greater than or equal to 1.
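The median filtering and outward boundary extension can be sketched as a 3x3 median filter followed by K rounds of 4-connected dilation. The window size and connectivity are our assumptions; the patent fixes neither:

```python
def median3(binary):
    """3x3 median filter on a 0/1 image (edge pixels replicated at the border)."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = sorted(
                binary[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # middle of the 9 sorted values
    return out

def dilate(binary, k):
    """Extend the foreground boundary outward by k pixels (4-connected growth)."""
    h, w = len(binary), len(binary[0])
    for _ in range(k):
        grown = [row[:] for row in binary]
        for y in range(h):
            for x in range(w):
                if binary[y][x] == 1:
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            grown[ny][nx] = 1
        binary = grown
    return binary
```

The median pass removes isolated salt-and-pepper pixels, and the dilation then grows the surviving foreground outward by K pixels so the extracted region does not clip the tissue edge.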
- the acquisition module is specifically used to acquire the original medical image
- if the medical sub-image includes a pathological tissue area, it is determined to be a medical image to be processed;
- otherwise, the medical sub-image is determined to be a background image, and the background image is removed.
- the image processing device further includes a training module
- the generating module is also used to generate the target positive sample image according to the image to be processed and the foreground area of the image to be processed, where the target positive sample image belongs to a positive sample set and each positive sample image contains a pathological tissue area;
- the obtaining module is also used to obtain a negative sample set, wherein the negative sample set includes at least one negative sample image, and each negative sample image does not include a pathological tissue area;
- the training module is used to train the image processing model based on the positive sample set and the negative sample set.
- a fourth aspect of the present application provides an image processing device, including:
- an acquisition module for acquiring a first image to be processed and a second image to be processed, where the first image to be processed is a color image and includes first image data, second image data, and third image data, which respectively correspond to color information under different channels;
- a generating module for generating a difference image according to the first image data, the second image data, and the third image data
- the processing module is used to perform binarization processing on the difference image to obtain a binarized image.
- the foreground area of the binarized image corresponds to the target object of the first image to be processed;
- the extraction module is used to extract the target object from the first image to be processed according to the foreground area of the binarized image
- the generating module is also used to generate a composite image according to the target object and the second image to be processed, where the target object is located in the first layer, the second image to be processed is located in the second layer, and the first layer is overlaid on the second layer.
- the fifth aspect of the present application provides a computer-readable storage medium, in which instructions are stored, which when run on a computer, cause the computer to execute the methods of the above-mentioned aspects.
- a method for medical image processing is provided.
- a color medical image to be processed can be obtained, and the medical image to be processed includes first image data, second image data, and third image data.
- the first image data, the second image data, and the third image data respectively correspond to the color information under different attributes.
- the difference image is generated according to the first image data, the second image data, and the third image data.
- the difference image is binarized to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- generating the difference image from the color information of different channels effectively utilizes the color information in the image.
- the pathological tissue area extracted based on the difference image is more accurate and has a positive impact on subsequent image analysis.
- FIG. 1 is a schematic diagram of an architecture of a medical image processing system in an embodiment of the application
- FIG. 2 is a schematic diagram of an embodiment of a method for medical image processing in an embodiment of the application
- FIG. 3 is a schematic diagram of an embodiment of a medical image to be processed in an embodiment of the application.
- FIG. 4 is a schematic diagram of an embodiment of a difference image in an embodiment of the application.
- FIG. 5 is a schematic diagram of an embodiment of a binarized image in an embodiment of the application.
- FIG. 6 is a schematic diagram of an embodiment of a result image in an embodiment of the application.
- FIG. 7 is a schematic diagram of another embodiment of the result image in the embodiment of the application.
- FIG. 8 is a schematic diagram of an embodiment of acquiring medical images to be processed in an embodiment of the application.
- FIG. 9 is a schematic flowchart of a method for medical image processing in an embodiment of the application.
- FIG. 10 is a schematic diagram of an embodiment of a result image in an embodiment of the application.
- FIG. 11 is a schematic diagram of an embodiment of an image processing method in an embodiment of the application.
- FIG. 12 is a schematic diagram of an embodiment of a medical image processing device in an embodiment of the application.
- FIG. 13 is a schematic diagram of an embodiment of an image processing device in an embodiment of the application.
- FIG. 14 is a schematic diagram of a server structure provided by an embodiment of the present application.
- the embodiments of the present application provide a medical image processing method, an image processing method, and an apparatus, which generate a difference image from the color information of different channels before the image is binarized, thereby effectively using the color information in the image; the pathological tissue area extracted based on the difference image is more accurate, which benefits subsequent image analysis.
- Image processing is a technology that can analyze images to achieve the desired results.
- Image processing generally refers to the processing of digital images, while digital images refer to a large two-dimensional array obtained by shooting with industrial cameras, video cameras, and scanners. The elements of this array are called pixels, and their values are called gray values.
- Image processing technology can help people understand the world more objectively and accurately.
- the human visual system can help humans obtain a large amount of information from the outside world. Images and graphics are the carriers of all visual information.
- image processing technologies may include, but are not limited to, image transformation, image coding and compression, image enhancement and restoration, image segmentation, image description, matting technology, and image classification.
- the image processing method provided in the present application can be applied to scenes in the medical field.
- medical images that can be processed include, but are not limited to, brain images, heart images, chest images, and cell images. Medical images may be affected by noise, field offset effects, partial volume effects, and tissue movement. Because of differences between individuals and the complexity of tissue structure, medical images are generally more blurred and less uniform than ordinary images.
- the medical image involved in this application is a color image, which can be a color ultrasound image, a whole slide image (WSI) for digital pathology, or a color digital image obtained from a microscope. Taking the WSI image as an example, the edge length of a WSI image is usually 10,000 to 100,000 pixels.
- For WSI images, it is often necessary to scale them or cut them into small-size images for further processing.
- During image processing, it is necessary to obtain the area containing pathological tissue slices and then perform pathological analysis on that area, such as quantitative analysis of nuclei, cell membranes, and cytoplasm, and analysis of tissue microvasculature.
- the medical image processing method of this application can obtain the medical image to be processed and generate a difference image according to the first image data, the second image data, and the third image data included in the medical image to be processed, where the three sets of image data respectively correspond to color information under different attributes; the difference image is further binarized to obtain a binarized image,
- the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- the difference image effectively utilizes the color information in the image, and the pathological tissue area extracted based on the difference image is more accurate, and has a positive impact on subsequent image analysis.
- image processing can also be applied to scenes in the field of remote sensing.
- high-resolution remote sensing images can be used in marine monitoring, land cover monitoring, marine pollution detection, and maritime rescue. Such images are characterized by rich detail, prominent geometric structure of ground objects, and complex target structure; for example, coastline objects in high-resolution remote sensing images cast complex shadows, vegetation coverage is large, and light and dark artificial facilities may be insufficiently processed, because high-resolution remote sensing images are more detailed and more complex than ordinary images.
- the vegetation can be extracted from the high-resolution remote sensing image to determine the corresponding area. Therefore, based on the characteristics of high-resolution remote sensing images, the image processing method of this application can generate a difference image based on the first image data, the second image data, and the third image data included in the first image to be processed.
- the first image to be processed is a color image
- the first image data, the second image data, and the third image data included in the first image to be processed respectively correspond to the color information in different channels
- the generated difference image is binarized to obtain a binarized image, where the foreground area of the binarized image corresponds to the target object of the first image to be processed, and the target object (such as a vegetation area) is then extracted from the first image to be processed according to the result image.
- the target object extracted based on the difference image is more accurate, and the details in the high-resolution remote sensing image can be captured more accurately, thereby improving the accuracy of high-resolution remote sensing image processing.
- FIG. 1 is a schematic diagram of an architecture of the medical image processing system in an embodiment of the application.
- the image processing system includes a server and terminal equipment.
- the medical image processing device can be deployed on a server or on a terminal device with higher computing power.
- the server obtains the medical image to be processed, and then the server generates a difference image according to the first image data, the second image data, and the third image data included in the medical image to be processed, Further binarization processing is performed on the difference image to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- the server can perform medical image analysis based on the pathological tissue area.
- the terminal device acquires the medical image to be processed and then generates a difference image according to the first image data, the second image data, and the third image data included in the medical image to be processed. Further, the difference image is binarized to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- the terminal device can perform medical image analysis based on the pathological tissue area.
- the server in FIG. 1 may be one server or a server cluster or cloud computing center composed of multiple servers, and the details are not limited here.
- the terminal device can be a tablet computer, a notebook computer, a palmtop computer, a mobile phone, a personal computer (PC), or the voice interaction device shown in Figure 1, or it can be a monitoring device, a face recognition device, etc., which is not limited here.
- Although only five terminal devices and one server are shown in FIG. 1, it should be understood that the example in FIG. 1 is only used to explain this solution, and the specific number of terminal devices and servers should be determined flexibly based on actual conditions.
- Artificial Intelligence is a theory, method, technology and application system that uses digital computers or machines controlled by digital computers to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
- artificial intelligence is a comprehensive technology of computer science, which attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a similar way to human intelligence.
- Artificial intelligence is to study the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
- Machine Learning is a multi-field interdisciplinary subject, involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory and other subjects. Specializing in the study of how computers simulate or realize human learning behaviors in order to acquire new knowledge or skills, and reorganize the existing knowledge structure to continuously improve its own performance.
- Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent. Its applications are in all fields of artificial intelligence.
- Machine learning and deep learning usually include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and teaching-based learning.
- CV: computer vision.
- Computer vision technology usually includes image processing, image recognition, image semantic understanding, image retrieval, optical character recognition (OCR), video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technology, virtual reality, augmented reality, and simultaneous localization and mapping, and also includes common biometric recognition technologies such as facial recognition and fingerprint recognition.
- FIG. 2 is a schematic diagram of an embodiment of the medical image processing method in an embodiment of this application. As shown in the figure, an embodiment of the method for processing medical images in an embodiment of the present application includes:
- the medical image to be processed is a color image
- the medical image to be processed includes first image data, second image data, and third image data, which respectively correspond to color information under different channels;
- the medical image processing device may obtain a color image to be processed medical image
- the to-be-processed medical image may include the first image data, the second image data, and the third image data, and the first image data, the second image data, and the third image data respectively correspond to color information in different channels.
- the medical image to be processed may be a medical image received by the medical image processing device through a wired network, or may be a medical image stored by the medical image processing device itself.
- the medical image to be processed may be an area captured from the WSI image, and the WSI image can be obtained by scanning a slice through a microscope, where the slice refers to a glass slide prepared after hematoxylin or other staining methods.
- the WSI image obtained after scanning the slice through a microscope is a color image.
- the image color mode of a color image includes, but is not limited to, red green blue (RGB) color mode, luminance-bandwidth chrominance (YUV) color mode, and hue-saturation-luminance (Hue-Saturation-Value, HSV) color mode, and color information can be expressed as pixel values under different channels, such as the pixel value of the R channel, the pixel value of the G channel, and the pixel value of the B channel.
- WSI image formats include but are not limited to file formats such as SVS and NDPI.
- the length and width of WSI images are usually in the range of tens of thousands of pixels, so the image size is relatively large, and directly processing the WSI image requires a large amount of memory. Therefore, it is necessary to crop the WSI image.
- the image with the largest resolution in the WSI image file is read as the image to be processed.
- this embodiment can capture the medical image to be processed on the reduced WSI image. The WSI image can be reduced by any factor, such as 20 times or 10 times, so that the length and width of the reduced WSI image are within the range of several thousand pixels. It should be understood that since the reduction factor is artificially defined, the specific factor should be flexibly determined in light of the actual situation.
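As a hedged illustration of the reduction step, the block-averaging sketch below shrinks an RGB array by an integer factor. The `downscale` helper is hypothetical; in practice a lower-resolution level of the WSI pyramid would be read directly through a slide-reading library rather than averaging the full-resolution image.

```python
import numpy as np

def downscale(image, factor):
    # Shrink an H x W x C image by an integer factor using block averaging.
    # (Hypothetical stand-in: real pipelines read a reduced pyramid level
    # of the WSI file instead of averaging the full-resolution image.)
    h = image.shape[0] // factor * factor
    w = image.shape[1] // factor * factor
    img = image[:h, :w].astype(float)
    blocks = img.reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# e.g. a 20x reduction turns a 40 x 60 array into a 2 x 3 array
small = downscale(np.full((40, 60, 3), 128.0), 20)
```

Any integer factor can be substituted, matching the flexible reduction factor described above.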
- FIG. 3 is a schematic diagram of an embodiment of the medical image to be processed in the embodiment of the application.
- the medical image to be processed includes the tissue area of the case, and no other gray-scale or pure-white background interferes with the medical image to be processed.
- the image color mode of the medical image to be processed is RGB as an example for description, since the first image data, the second image data, and the third image data included in the medical image to be processed respectively correspond to color information in different channels.
- the first image data can be the pixel value 200 corresponding to the R channel
- the second image data can be the pixel value 100 corresponding to the G channel
- the third image data can be the pixel value 60 corresponding to the B channel.
- alternatively, the first image data can be the pixel value 100 corresponding to the R channel, the second image data can be the pixel value 80 corresponding to the G channel, and the third image data can be the pixel value 40 corresponding to the B channel.
- HSV images or YUV images can be converted into RGB images before subsequent processing.
- the medical image processing device can be deployed on a server, or it can be deployed on a terminal device with higher computing power. In this embodiment, the medical image processing device is deployed on a server as an example for introduction.
- the medical image processing device can generate a difference image based on the first image data, the second image data, and the third image data. Specifically, the difference image appears as a grayscale image.
- FIG. 4 is a schematic diagram of an embodiment of the difference image in the embodiment of the application.
- the difference image can be the image including the pathological tissue area shown in the figure. Since pixel values are distinguished using the color information corresponding to different channels, the image color mode of the medical image to be processed is RGB as an example for description. If a region of the medical image to be processed is gray, its R, G, and B values are relatively similar; if a region is colored, the differences among its R, G, and B values are large, and the pathological tissue shows a large color difference.
- the medical image processing apparatus may perform binarization processing on the difference image generated in step 102 to obtain a binarized image.
- the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed.
- FIG. 5 is a schematic diagram of an embodiment of the binarized image in the embodiment of the application.
- the difference image shown in (A) in FIG. 5 is a grayscale image, so grayscale-based processing can be used. Adaptive binarization is used for foreground processing, that is, the difference image is binarized to obtain the binarized image shown in (B) in FIG. 5. In the binarized image, white is the foreground area including the pathological tissue area, and black is the background area not including the pathological tissue area.
- FIG. 6 is a schematic diagram of an embodiment of the foreground area in the embodiment of the application. As shown in the figure, in the binarized image shown in (A) in FIG. 6, white is the foreground area including the pathological tissue area, and black is the background area that does not include the pathological tissue area. Therefore, according to the binarized image, the foreground area corresponding to the medical image to be processed as shown in (B) in FIG. 6 can be generated.
- a method for medical image processing is provided.
- since the color information difference of gray pixels under different channels is small while the color information of color pixels under different channels differs considerably, the color information of different channels is used to generate the difference image before the image is binarized. This effectively uses the color information in the image, makes the pathological tissue area extracted based on the difference image more accurate, and has a positive impact on subsequent image analysis.
- generating a difference image according to the first image data, the second image data, and the third image data may include:
- the medical image processing device may generate a maximum value image and a minimum value image according to the first image data, the second image data, and the third image data included in the medical image to be processed, and finally generate a difference image based on the maximum value image and the minimum value image.
- the image color mode of the medical image to be processed is RGB as an example for description. Since the first image data, the second image data, and the third image data included in the medical image to be processed respectively correspond to color information in different channels, the color information is expressed as the pixel values corresponding to the R channel, G channel, and B channel. The maximum value among the R, G, and B channels is determined for each pixel, and the maximum value image is formed from these maxima. Similarly, the minimum value among the R, G, and B channels can be determined, and the minimum value image is formed from these minima. Each pixel in the maximum value image is then subtracted by the pixel at the corresponding position in the minimum value image to obtain the difference image.
- a method for generating a difference image is provided.
- a maximum value image and a minimum value image are generated according to the first image data, the second image data, and the third image data.
- because the color information in different channels differs, the color information of the medical image to be processed is captured more accurately, thereby improving the accuracy of difference image generation.
- generating the maximum value image and the minimum value image according to the first image data, the second image data, and the third image data may include:
- the maximum value image is obtained according to the maximum pixel value
- the minimum value image is obtained according to the minimum pixel value.
- the pixel value of the fourth pixel position in the maximum value image is the maximum pixel value
- the pixel value of the fifth pixel position in the minimum value image is the minimum pixel value
- the fourth pixel position and the fifth pixel position both correspond to the position of the same pixel in the medical image to be processed
- the difference image is generated, which can include:
- the difference image is obtained.
- the pixel value of the sixth pixel position in the difference image is the pixel difference value.
- the fourth pixel position, the fifth pixel position, and the sixth pixel position all correspond to the position of the same pixel in the medical image to be processed.
- the medical image processing device can determine the maximum pixel value and the minimum pixel value corresponding to the target pixel according to the first image data, the second image data, and the third image data included in the medical image to be processed, and then generate the maximum value image and the minimum value image according to the determined maximum and minimum pixel values. Finally, the minimum pixel value corresponding to the target pixel in the minimum value image is subtracted from the maximum pixel value corresponding to the target pixel in the maximum value image to obtain the difference pixel value corresponding to the target pixel in the difference image.
- the image color mode of the medical image to be processed is RGB as an example for description.
- the medical image to be processed including the first image data, the second image data, and the third image data
- each pixel of the medical image to be processed has corresponding image data in the R channel, G channel, and B channel.
- the image data of the pixel in the R channel is the first pixel value
- the image data in the G channel is the second pixel value.
- the image data on the B channel is the third pixel value. According to the first pixel value, the second pixel value and the third pixel value, the maximum pixel value and the minimum pixel value in the R channel, G channel and B channel can be determined.
- the maximum pixel value and minimum pixel value of the pixel position (x, y) can be calculated by the following formula:
- Imax(x,y) = Max[Ir(x,y), Ig(x,y), Ib(x,y)];
- Imin(x,y) = Min[Ir(x,y), Ig(x,y), Ib(x,y)];
- Imax(x,y) represents the maximum pixel value
- Imin(x,y) represents the minimum pixel value
- Ir(x,y) represents the first pixel value
- Ig(x,y) represents the second pixel value
- Ib(x,y) represents the third pixel value.
- the color information corresponding to different channels is used to distinguish the pixel values.
- the image color mode of the medical image to be processed is RGB as an example. If the medical image to be processed is gray, its R, G, and B values are relatively similar; if the medical image to be processed is colored, the differences among its R, G, and B values are large. The pathological tissue is colored, and the value of that color is the pixel value required in this embodiment.
- the foregoing formulas only take pixels of two-dimensional images as an example. In practical applications, the formulas are also applicable to calculating the maximum pixel value and minimum pixel value of multi-dimensional images, such as three-dimensional (3D) images and four-dimensional (4D) images.
- taking the image color mode of the medical image to be processed as RGB and the target pixel position as (x1, y1) as an example for description: the first pixel value Ir(x1, y1) at the target pixel position (x1, y1) is 100, the second pixel value Ig(x1, y1) at the target pixel position (x1, y1) is 200, and the third pixel value Ib(x1, y1) at the target pixel position (x1, y1) is 150.
- the maximum pixel value Imax(x1, y1) at the target pixel position (x1, y1) is the pixel value 200 corresponding to the second pixel value Ig(x1, y1).
- the minimum pixel value Imin(x1, y1) at the pixel point position (x1, y1) is the pixel value 100 corresponding to the first pixel value Ir(x1, y1).
- taking the target pixel position as (x2, y2) and the image color mode of the medical image to be processed as RGB as another example: the first pixel value Ir(x2, y2) at the target pixel position (x2, y2) is 30, the second pixel value Ig(x2, y2) is 80, and the third pixel value Ib(x2, y2) is 120. Therefore, the maximum pixel value Imax(x2, y2) at the target pixel position (x2, y2) is the pixel value 120 corresponding to the third pixel value Ib(x2, y2), and the minimum pixel value Imin(x2, y2) is the pixel value 30 corresponding to the first pixel value Ir(x2, y2).
- the image color mode of the medical image to be processed is RGB
- the medical image to be processed is a 3D image
- the target pixel position is (x3, y3, z3) as another example for description: the first pixel value Ir(x3, y3, z3) at the target pixel position (x3, y3, z3) is 200, the second pixel value Ig(x3, y3, z3) is 10, and the third pixel value Ib(x3, y3, z3) is 60. Therefore, the maximum pixel value Imax(x3, y3, z3) is the pixel value 200 corresponding to the first pixel value Ir(x3, y3, z3), and the minimum pixel value Imin(x3, y3, z3) is the pixel value 10 corresponding to the second pixel value Ig(x3, y3, z3).
- the minimum pixel value can be subtracted from the maximum pixel value corresponding to the target pixel position to obtain the difference pixel value corresponding to the target pixel position in the difference image.
- the difference pixel value can be calculated according to the maximum pixel value Imax(x, y) and the minimum pixel value Imin(x, y) through the following formula, assuming that the medical image to be processed includes 10,000 pixels:
- Idiff(x,y) = Imax(x,y)-Imin(x,y);
- Imax (x, y) represents the maximum pixel value
- Imin (x, y) represents the minimum pixel value
- Idiff (x, y) represents the difference pixel value at the (x, y) position.
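The Imax, Imin, and Idiff formulas can be sketched with per-channel numpy reductions. This is a minimal illustration using the example pixel values from this section, not the patent's implementation:

```python
import numpy as np

# A 1 x 2 RGB image holding the two example pixels from the text:
# (R, G, B) = (100, 200, 150) and (R, G, B) = (30, 80, 120).
rgb = np.array([[[100, 200, 150],
                 [ 30,  80, 120]]], dtype=np.int32)

i_max = rgb.max(axis=2)   # Imax(x, y): per-pixel maximum over R, G, B
i_min = rgb.min(axis=2)   # Imin(x, y): per-pixel minimum over R, G, B
i_diff = i_max - i_min    # Idiff(x, y) = Imax(x, y) - Imin(x, y)
```

For the first pixel this yields Idiff = 200 - 100 = 100, and for the second Idiff = 120 - 30 = 90, matching the worked examples in this section.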
- taking the target pixel position as (x1, y1) and the image color mode of the medical image to be processed as RGB as an example: the maximum pixel value Imax(x1, y1) at the target pixel position (x1, y1) is 200 and the minimum pixel value Imin(x1, y1) is 100. Subtracting the minimum pixel value Imin(x1, y1) from the maximum pixel value Imax(x1, y1), the difference pixel value corresponding to the target pixel position (x1, y1) is 100.
- similarly, the maximum pixel value Imax(x2, y2) at the target pixel position (x2, y2) is 120 and the minimum pixel value Imin(x2, y2) is 30. Subtracting the minimum pixel value Imin(x2, y2) from the maximum pixel value Imax(x2, y2), the difference pixel value corresponding to the target pixel position (x2, y2) is 90.
- the image color mode of the medical image to be processed is RGB
- the medical image to be processed is a 3D image
- the target pixel position is (x3, y3, z3) as another example for description.
- Imax(x,y,z) = Max[Ir(x,y,z), Ig(x,y,z), Ib(x,y,z)];
- Imin(x,y,z) = Min[Ir(x,y,z), Ig(x,y,z), Ib(x,y,z)];
- Idiff(x,y,z) = Imax(x,y,z)-Imin(x,y,z);
- the maximum pixel value Imax(x3, y3, z3) at the target pixel position (x3, y3, z3) is 200 and the minimum pixel value Imin(x3, y3, z3) is 10, so subtracting the minimum pixel value Imin(x3, y3, z3) from the maximum pixel value Imax(x3, y3, z3) gives a difference pixel value of 190.
- when the difference pixel value of the medical image to be processed is small, it indicates that the first pixel value, the second pixel value, and the third pixel value are relatively similar, suggesting that the medical image to be processed resembles a gray image; when the difference pixel value is large, the first pixel value, the second pixel value, and the third pixel value differ greatly, suggesting that the medical image to be processed resembles a color image. Since an image with a pathological tissue area is often colored, whether the medical image to be processed includes a pathological tissue area can be preliminarily determined according to the difference pixel value.
- a method for generating a maximum value image and a minimum value image is provided.
- the maximum pixel value and the minimum pixel value are determined from the pixel values of the target pixel corresponding to the first image data, the second image data, and the third image data. The maximum and minimum pixel values reflect the color information of the medical image to be processed to varying degrees, and the difference pixel value is obtained by subtracting the minimum pixel value from the maximum pixel value, so that the difference pixel value can accurately reflect the color information of the medical image to be processed, thereby improving the accuracy of difference image generation.
- generating a difference image according to the first image data, the second image data, and the third image data may further include:
- Gaussian blur processing is performed on the difference image to be processed to obtain the difference image.
- the medical image processing device can generate the difference image to be processed according to the first image data, the second image data, and the third image data included in the medical image to be processed, and then perform Gaussian blur processing on the difference image to be processed to obtain the difference image.
- blurring can be understood as taking, for each pixel of the difference image to be processed, the average value of its surrounding pixels. In this way, pixel values tend to be smooth, which on the difference image to be processed is equivalent to a blur effect in which each pixel loses its details.
- the algorithm used for blur in this embodiment is Gaussian Blur.
- Gaussian blur can use the normal distribution (Gaussian distribution) to process the difference image to be processed, so that the weighted average between pixels is more reasonable: the closer a pixel is, the greater its weight, and the farther a pixel is, the smaller its weight.
- the pixel point (x, y) is a two-dimensional pixel point, so the two-dimensional Gaussian function can be calculated by the following formula:
- G(x,y) = (1/(2πσ²))·e^(-(x²+y²)/(2σ²));
- (x, y) represents the pixel
- G(x, y) represents the two-dimensional Gaussian function of the pixel
- σ represents the standard deviation of the normal distribution.
- for example, with a standard deviation σ of 1.5, the weight corresponding to the pixel point (0, 0) is approximately 0.0707, the weight corresponding to the pixel point (-1, 1) is 0.0453, the weight corresponding to the pixel point (0, 1) is 0.0566, the weight corresponding to the pixel point (1, 1) is 0.0453, the weight corresponding to the pixel point (-1, 0) is 0.0566, the weight corresponding to the pixel point (1, 0) is 0.0566, the weight corresponding to the pixel point (-1, -1) is 0.0453, the weight corresponding to the pixel point (0, -1) is 0.0566, and the weight corresponding to the pixel point (1, -1) is 0.0453. The sum of the weights of these 9 points, namely the pixel point (0, 0) and its surrounding 8 pixels, is approximately equal to 0.479.
- the sum of the weights must be equal to 1, so the sum of the weights is normalized. That is, the 9 values of the weight matrix are each divided by the total weight 0.479 to obtain the normalized weight matrix: the normalized weight of the pixel point (0, 0) is 0.147, the normalized weight of the pixel point (-1, 1) is 0.0947, the normalized weight of the pixel point (0, 1) is 0.1183, the normalized weight of the pixel point (1, 1) is 0.0947, the normalized weight of the pixel point (-1, 0) is 0.1183, the normalized weight of the pixel point (1, 0) is 0.1183, the normalized weight of the pixel point (-1, -1) is 0.0947, the normalized weight of the pixel point (0, -1) is 0.1183, and the normalized weight of the pixel point (1, -1) is 0.0947. Since using a weight matrix whose weights sum to more than 1 would make the difference image brighter, and using one whose weights sum to less than 1 would make it darker, the normalized weight matrix makes the pathological tissue area in the difference image more accurate.
- Gaussian blur calculation can be performed on the pixel point.
- for example, the gray value corresponding to the pixel point (0, 0) in the weight matrix is 25, the gray value corresponding to the pixel point (-1, 1) is 14, the gray value corresponding to the pixel point (0, 1) is 15, the gray value corresponding to the pixel point (1, 1) is 16, the gray value corresponding to the pixel point (-1, 0) is 24, the gray value corresponding to the pixel point (1, 0) is 26, the gray value corresponding to the pixel point (-1, -1) is 34, the gray value corresponding to the pixel point (0, -1) is 35, and the gray value corresponding to the pixel point (1, -1) is 36.
- the gray value corresponding to each pixel is multiplied by the normalized weight corresponding to that pixel to get 9 values: the pixel point (0, 0) gives 3.69, the pixel point (-1, 1) gives 1.32, the pixel point (0, 1) gives 1.77, the pixel point (1, 1) gives 1.51, the pixel point (-1, 0) gives 2.83, the pixel point (1, 0) gives 3.07, the pixel point (-1, -1) gives 3.22, the pixel point (0, -1) gives 4.14, and the pixel point (1, -1) gives 3.41. These 9 values are then added up to obtain the Gaussian blur value of the pixel point (0, 0).
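The kernel construction and weighted average described above can be reproduced with a short numpy sketch. It assumes a 3 x 3 kernel with σ = 1.5, which matches the weights quoted in this example:

```python
import numpy as np

def gaussian_kernel(size=3, sigma=1.5):
    # Evaluate the 2-D Gaussian G(x, y) on the grid, then normalize so the
    # weights sum to 1 (otherwise the blurred image would brighten or darken).
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return g / g.sum()

kernel = gaussian_kernel()

# Gray values of the 3 x 3 neighborhood around the center pixel, as in the text
patch = np.array([[34.0, 35.0, 36.0],
                  [24.0, 25.0, 26.0],
                  [14.0, 15.0, 16.0]])
blurred_center = float((kernel * patch).sum())  # weighted average of the patch
```

By the symmetry of the kernel and this particular patch, the blurred center value comes out to 25, consistent with summing the nine rounded products listed above.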
- another method for generating a difference image is provided.
- Gaussian blur processing is performed on the generated difference image to be processed. Since Gaussian blur processing suppresses noise, the resulting difference image has better processing robustness, which improves the stability of the difference image.
- binarizing the difference image to obtain the binarized image may include:
- the medical image processing device can determine the binarization threshold according to the difference image. When the pixel value corresponding to a pixel in the difference image is greater than or equal to the binarization threshold, the pixel is determined as a foreground pixel of the binarized image; when the pixel value corresponding to a pixel in the difference image is less than the binarization threshold, the pixel is determined as a background pixel of the binarized image.
- binarizing the difference image turns the grayscale image into a binarized image whose values are 0 or 1. That is, binarization is achieved by setting a binarization threshold and transforming the difference image into a binarized image in which the foreground and background are represented by only two values (0 or 1), where the foreground value is 1 and the background value is 0. In practical applications, 0 corresponds to RGB values of all 0 and 1 corresponds to RGB values of all 255. When the binarized image obtained after binarization is further processed, because the geometric properties of a binarized image are only related to the positions of 0 and 1 and no longer involve the gray values of pixels, processing the binarized image becomes simple and image processing efficiency can be improved.
- methods of determining the binarization threshold can be divided into global thresholding and local thresholding. A global threshold uses a single threshold to divide the entire difference image. However, different difference images have different gray depths, and even within the same difference image, the light and dark distribution of different parts can differ. Therefore, this embodiment uses a dynamic threshold binarization method to determine the binarization threshold.
- that is, the binarization threshold is determined based on the difference image, and the pixel value corresponding to each pixel in the difference image is compared with the binarization threshold. When the pixel value corresponding to a pixel in the difference image is greater than or equal to the binarization threshold, the pixel is determined as a foreground pixel of the binarized image; when the pixel value corresponding to a pixel in the difference image is less than the binarization threshold, the pixel is determined as a background pixel of the binarized image.
- for example, if the pixel value corresponding to pixel A is greater than or equal to the binarization threshold, pixel A is determined as a foreground pixel of the binarized image, that is, its pixel value is 1 and it belongs to the foreground area; if the image is in RGB mode, it is displayed in white. If the pixel value corresponding to pixel B is less than the binarization threshold, pixel B is determined as a background pixel of the binarized image, that is, its pixel value is 0 and it belongs to the background area; if the image is in RGB mode, it is displayed in black.
- a method for obtaining a binarized image is provided.
- the binarized image is generated by the binarization process. Because the geometric nature of the binarized image does not involve the gray values of pixels, subsequent processing of the binarized image can be simplified, and the efficiency of generating the result image can be improved.
- determining the binarization threshold according to the difference image may include:
- obtaining N pixel values corresponding to N pixels according to the difference image, where the pixel values are in one-to-one correspondence with the pixels, and N is an integer greater than 1; determining a reference pixel value from the N pixel values, where the reference pixel value is the maximum of the N pixel values; and calculating the binarization threshold according to the reference pixel value and a preset ratio.
- the medical image processing device can obtain N pixel values corresponding to N pixel points according to the difference image, and the pixel values have a one-to-one correspondence with the pixel points, and then determine the reference from the N pixel values Pixel value, the reference pixel value is the maximum value of N pixel values, and finally the binarization threshold can be calculated according to the reference pixel value and the preset ratio, where N is an integer greater than 1.
- the binarization threshold in this embodiment is determined based on the difference image. Because the difference image is generated by subtracting the minimum value image from the maximum value image of the medical image to be processed, and the pixel values in the difference image correspond one-to-one with the pixels, the pixel values of multiple pixels in the difference image can be obtained; the maximum of these pixel values is then determined as the reference pixel value, and the binarization threshold is calculated according to the reference pixel value and the preset ratio.
- this embodiment uses a preset ratio of 10% as an example for description.
- the length and width of the reduced image of the WSI image is within a range of several thousand pixels.
- assuming the reduced image includes 100*100 pixels, the maximum value among the pixel values of these 10,000 pixels needs to be found. For example, if the maximum value is 150, then 150 is determined as the reference pixel value, and multiplying the reference pixel value 150 by the preset ratio of 10% gives a binarization threshold of 15.
- the preset ratio may also be a value corresponding to other percentages, and the specific preset ratio should be flexibly determined in combination with actual conditions.
- the binarization threshold can be flexibly determined by adjusting the preset ratio, which improves the accuracy and flexibility of the threshold and thereby improves the accuracy of binarized image generation.
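The dynamic thresholding rule (reference pixel value × preset ratio, here 10%) can be sketched as follows; the array contents are made-up illustration values, not data from the patent:

```python
import numpy as np

# Toy difference image; its maximum (the reference pixel value) is 150.
diff = np.array([[150, 12, 90],
                 [  5, 40,  3]], dtype=np.float64)

reference = diff.max()                         # reference pixel value: 150
threshold = reference * 0.10                   # preset ratio 10% -> threshold 15
binary = (diff >= threshold).astype(np.uint8)  # 1 = foreground, 0 = background
```

Changing the 0.10 factor adjusts the preset ratio, which is how the threshold stays flexible across slides of different gray depths.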
- the flooding algorithm is used to detect the background area in the binarized image, where the background area includes multiple background pixels;
- the medical image processing device may use a flooding algorithm to detect the background area in the binarized image, where the background area may include multiple background pixels. Based on the binarized image and its background area, the background pixels located in the foreground area of the binarized image are obtained, where the foreground area may include multiple foreground pixels. These background pixels in the foreground area are then changed to foreground pixels to obtain a hole-filled image, and finally median filter processing is performed on the hole-filled image to obtain the result image.
- the foreground area of the result image corresponds to the pathological tissue area of the medical image to be processed.
- FIG. 7 is a schematic diagram of another embodiment of the result image in the embodiment of the application.
- in the binarized image shown in (A) in FIG. 7, the white foreground area includes multiple background pixels. For example, the black dots framed in areas A1 to A5 are all composed of background pixels; these black dots are changed from background pixels to foreground pixels. Conversely, the white dots framed by areas A6 and A7 are composed of foreground pixels and are changed from foreground pixels to background pixels. This yields the hole-filled image shown in (B) in FIG. 7.
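One way to realize this hole-filling step is to flood-fill the background from the image border: any background pixel the flood cannot reach lies inside the foreground and is switched to a foreground pixel. The sketch below is a hedged illustration in plain Python/numpy using a four-neighbor flood, not the patent's exact procedure:

```python
import numpy as np
from collections import deque

def fill_holes(binary):
    # Flood-fill background (0) pixels reachable from the border; background
    # pixels that remain unreached are holes inside the foreground -> set to 1.
    h, w = binary.shape
    reachable = np.zeros((h, w), dtype=bool)
    queue = deque()
    for y in range(h):
        for x in range(w):
            on_border = y in (0, h - 1) or x in (0, w - 1)
            if on_border and binary[y, x] == 0:
                reachable[y, x] = True
                queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] == 0 \
                    and not reachable[ny, nx]:
                reachable[ny, nx] = True
                queue.append((ny, nx))
    filled = binary.copy()
    filled[(binary == 0) & ~reachable] = 1
    return filled

mask = np.array([[0, 0, 0, 0, 0],
                 [0, 1, 1, 1, 0],
                 [0, 1, 0, 1, 0],   # the 0 at the center is a hole
                 [0, 1, 1, 1, 0],
                 [0, 0, 0, 0, 0]], dtype=np.uint8)
filled = fill_holes(mask)
```

The center hole becomes foreground while the outer background, reachable from the border, stays untouched.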
- median filter processing is performed on the hole-filled image shown in FIG. 7(B), and morphological processing may be further performed, that is, the result image shown in FIG. 7(C) can be obtained.
- the purpose of filtering is to suppress the noise of the medical image to be processed while preserving the details of the hole-filled image as much as possible. Filtering improves the effectiveness and reliability of subsequent processing and analysis of the result image; eliminating the noise components in the hole-filled image is the filtering operation. The energy of the hole-filled image is mostly concentrated in the low and middle frequency bands of the amplitude spectrum.
- the median filter processing is a typical nonlinear filter, which is a nonlinear signal processing technology that can effectively suppress noise based on the sorting statistics theory.
- the median filter processing replaces the gray value of a pixel with the median of the gray values in its neighborhood, making the surrounding pixel values closer to the true value and eliminating isolated noise points.
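A minimal median-filter sketch (not the patent's implementation; edges are handled here by edge padding, which is one of several common choices):

```python
import numpy as np

def median_filter(image, size=3):
    # Replace each pixel with the median gray value of its size x size
    # neighborhood; isolated noise points are thereby eliminated.
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    out = np.empty_like(image)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.median(padded[y:y + size, x:x + size])
    return out

# A single bright noise point (99) in an otherwise uniform patch
noisy = np.array([[10, 10, 10],
                  [10, 99, 10],
                  [10, 10, 10]], dtype=np.uint8)
denoised = median_filter(noisy)
```

The isolated 99 is replaced by the neighborhood median 10, while uniform regions are left unchanged, which is the edge-preserving behavior described above.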
- a method for generating a result image is provided.
- after the background pixels in the foreground area are changed to foreground pixels, the obtained hole-filled image has better reliability, and the median filtering process can make the result image corresponding to the medical image to be processed clear with a good visual effect, without damaging characteristic information such as the contours and edges of the image.
- performing median filtering processing on the hole-filled image may include:
- obtaining the boundary line of the foreground area in the filtered image, where the boundary line includes M pixels, and M is an integer greater than 1;
- extending each of the M pixels outward by K pixels to obtain the result image, where K is an integer greater than or equal to 1.
- the medical image processing device performs median filter processing on the hole-filled image
- the filtered image may include the foreground area to be processed; the boundary line of the foreground area in the filtered image is obtained, where the boundary line includes M pixels; then, for each of the M pixels on the boundary line, K pixels are extended outward to obtain a result image, where M is an integer greater than 1 and K is an integer greater than or equal to 1.
- the median filter process can replace the gray value of a pixel with the median of the gray values in the pixel's neighborhood, so that the surrounding pixel values are close to the true value and isolated noise points are eliminated; while removing impulse noise and salt-and-pepper noise, the median filter yields a filtered image that preserves the edge details of the image.
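Extending the boundary outward by K pixels is equivalent to K iterations of four-neighbor dilation of the foreground mask; the sketch below makes that assumption explicit (one plausible reading of the step, not the patent's required implementation):

```python
import numpy as np

def extend_foreground(mask, k):
    """Grow a boolean foreground mask outward by k pixels using
    repeated four-neighbor dilation."""
    out = mask.copy()
    for _ in range(k):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]    # push foreground downward
        grown[:-1, :] |= out[1:, :]    # push foreground upward
        grown[:, 1:] |= out[:, :-1]    # push foreground rightward
        grown[:, :-1] |= out[:, 1:]    # push foreground leftward
        out = grown
    return out

mask = np.zeros((5, 5), dtype=bool)
mask[2, 2] = True                              # a single foreground pixel
print(int(extend_foreground(mask, 2).sum()))   # radius-2 diamond: 13 pixels
```

Each iteration adds one ring of pixels around the current foreground, so k iterations grow the boundary by exactly k pixels in the four-neighbor metric.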
- the flooding algorithm (Flood Fill) is used to fill the connected and similar color areas with different colors.
- the basic principle of the flooding algorithm is to start from a pixel point to extend the coloring to the surrounding pixels until the graphics boundary.
- the flooding algorithm requires three parameters: start node, target color, and replacement color.
- the flooding algorithm finds all nodes connected to the start node through paths of the target color and changes them to the replacement color.
- the flooding algorithm can be implemented in many ways, most of which explicitly or implicitly use a queue or stack data structure, for example, the four-neighbor flooding algorithm, the eight-neighbor flooding algorithm, the scanline fill algorithm (Scanline Fill), and large-scale behavior (Large-scale behavior) variants.
- the traditional four-neighbor flooding algorithm colors the pixel (x, y) and then colors the four points adjacent to it.
- the recursive implementation consumes more memory; if the area to be colored is very large, an overflow will occur. Therefore, a non-recursive four-neighbor flooding algorithm can be used.
- the eight-neighbor flooding algorithm colors the eight neighbors of a pixel: up, down, left, right, upper-left, upper-right, lower-left, and lower-right.
- the scanline fill algorithm uses fill lines to speed up the process: the pixels on one line are colored first, and the fill then expands upward and downward line by line until coloring is complete. Large-scale behaviors are data-centric or process-centric.
- the scanline fill algorithm is used in this embodiment. Taking a boundary line of the foreground area to be processed that includes 1000 pixels as an example, and extending K pixels outward from each pixel, assuming K is 2, then 2000 pixels are added as the foreground area in addition to the original 1000 pixels, and the result image is obtained. It should be understood that, in actual applications, the specific values of M and K should be determined flexibly in combination with actual conditions.
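A non-recursive four-neighbor flood fill, as described above, can be implemented with a queue; in hole filling it is typically seeded from the image border so that any background pixels it cannot reach are the holes inside the foreground (the seeding strategy here is an assumption for illustration):

```python
import numpy as np
from collections import deque

def flood_fill_4n(img, start, replacement):
    """Queue-based four-neighbor flood fill: recolor every pixel
    reachable from `start` through pixels of the start pixel's color."""
    h, w = img.shape
    target = img[start]
    if target == replacement:
        return img
    queue = deque([start])
    while queue:
        y, x = queue.popleft()
        if 0 <= y < h and 0 <= x < w and img[y, x] == target:
            img[y, x] = replacement
            queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return img

# A ring of foreground (1) encloses a background hole (0).
img = np.array([[0, 0, 0, 0, 0],
                [0, 1, 1, 1, 0],
                [0, 1, 0, 1, 0],
                [0, 1, 1, 1, 0],
                [0, 0, 0, 0, 0]], dtype=np.uint8)
flood_fill_4n(img, (0, 0), 2)        # mark the border-connected background
print(img[2, 2], img[0, 0])          # enclosed hole stays 0; border becomes 2
```

Any pixel still equal to 0 after the fill is a hole inside the foreground and can be flipped to a foreground pixel.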
- acquiring the medical image to be processed may include:
- if it is detected that the medical sub-image includes a pathological tissue area, it is determined to be the medical image to be processed;
- if it is detected that the medical sub-image does not include a pathological tissue area, the medical sub-image is determined as the background image, and the background image is removed.
- the medical image processing device may first obtain the original medical image, and then use a sliding window to extract the medical sub-image from the original medical image.
- if it is detected that the medical sub-image includes a pathological tissue area, it is determined to be the medical image to be processed;
- if it is detected that the medical sub-image does not include a pathological tissue area, the medical sub-image is determined as the background image, and the background image is removed.
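The sliding-window extraction and the keep/discard decision can be sketched as follows; the window size, stride, and the color-difference test used to detect tissue are illustrative assumptions, not the embodiment's exact detection method:

```python
import numpy as np

def sliding_window_tiles(image, win, stride):
    """Yield (y, x, tile) sub-images extracted from the original image."""
    h, w = image.shape[:2]
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            yield y, x, image[y:y + win, x:x + win]

def has_tissue(tile, diff_thresh=20, frac=0.05):
    # Heuristic (an assumption): a tile is kept if enough of its pixels
    # are strongly colored, i.e. the per-pixel max-min channel
    # difference exceeds a threshold.
    diff = tile.max(axis=-1).astype(int) - tile.min(axis=-1).astype(int)
    return (diff > diff_thresh).mean() > frac

# A gray (background-like) canvas with one colored (tissue-like) patch.
img = np.full((8, 8, 3), 200, dtype=np.uint8)
img[0:4, 0:4] = (180, 60, 120)                  # colored region
kept = [(y, x) for y, x, tile in sliding_window_tiles(img, 4, 4)
        if has_tissue(tile)]
print(kept)                                     # only the colored tile is kept
```

Tiles failing the test correspond to the background images that are removed, which avoids spending compute on empty regions of a whole-slide image.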
- the original medical image may be an image received by the medical image processing device through a wired network, or may also be an image stored by the medical image processing device itself.
- FIG. 8 is a schematic diagram of an embodiment of obtaining medical images to be processed in an embodiment of the application.
- the medical sub-image is extracted from the original medical image, where the area framed by B1 to B3 is the medical sub-image extracted from the original medical image, so that B1 can correspond to the medical sub-image shown in Figure 8 (B), B2 can correspond to the medical sub-image shown in Figure 8 (C), and B3 can correspond to the medical sub-image shown in Figure 8 (D).
- the medical sub-images shown in FIG. 8(B) and FIG. 8(C) include the pathological tissue area, so they can be determined as medical images to be processed; the medical sub-image shown in FIG. 8(D) does not include the pathological tissue area, so it can be determined as the background image, and the background image can be removed.
- a method for obtaining medical images to be processed is provided.
- the medical images to be processed are determined, so that the medical images to be processed include the pathological tissue regions.
- through the foregoing steps, the result image corresponding to the medical image to be processed can be obtained; the result image includes the pathological tissue area, which facilitates subsequent processing and analysis of the pathological tissue area in the result image.
- the medical sub-image that does not include the pathological tissue area is determined as the background image, and the background image is removed to reduce the resource occupancy rate.
- the medical image processing method can also include:
- a target positive sample image is generated according to the result image, where the target positive sample image is one positive sample image in the positive sample set, and each positive sample image contains a pathological tissue area;
- the negative sample set includes at least one negative sample image, and each negative sample image does not include a pathological tissue area;
- the image processing model is trained based on the positive sample set and the negative sample set.
- the medical image processing device may also generate a target positive sample image based on the result image.
- the target positive sample image is one positive sample image in the positive sample set, and each positive sample image contains the pathological tissue area.
- a negative sample set can also be obtained.
- the negative sample set includes at least one negative sample image, and each negative sample image does not contain the pathological tissue area.
- the image processing model can process the corresponding pathological tissue area based on a color medical image.
- a method for training an image processing model is provided.
- the image processing model is trained through a positive sample set whose images include pathological tissue regions and a negative sample set whose images do not include pathological tissue regions, which improves the accuracy and reliability of the image processing model, thereby improving the efficiency and accuracy of image processing.
- FIG. 1 is a schematic diagram of the image processing method, specifically:
- step S1 an original medical image is acquired
- step S2 based on the original medical image, a medical image to be processed is acquired;
- step S3 a difference image is generated according to the medical image to be processed
- step S4 binarization processing is performed on the difference image to obtain a binarized image
- step S5 a hole-filled image is obtained based on the binarized image
- step S6 median filter processing is performed on the hole-filled image to obtain a result image.
- step S1 the original medical image shown in Figure 9 (A) can be obtained, and then in step S2, a sliding window is used to extract medical sub-images from the original medical image shown in Figure 9 (A).
- if the medical sub-image includes a pathological tissue area, it is determined to be the medical image to be processed
- step S3 according to the first image data, the second image data, and the third image data included in the medical image to be processed, the maximum pixel value and the minimum pixel value corresponding to the target pixel are determined from the first pixel value, the second pixel value, and the third pixel value, thereby generating the maximum value image and the minimum value image; the difference image shown in (C) of FIG. 9 is then obtained according to the maximum value image and the minimum value image. Furthermore, in step S4, N pixel values corresponding to N pixels can be obtained according to the difference image shown in (C) of FIG. 9, where the pixel values have a one-to-one correspondence with the pixels.
- step S5 the flooding algorithm is used to detect the background area, which includes multiple background pixels, in the binarized image; the background pixels within the foreground area of the binarized image are then obtained according to the binarized image and the background area of the binarized image, and those background pixels are changed to foreground pixels, so that the hole-filled image shown in FIG. 9(E) can be obtained.
- step S6 median filter processing is performed on the hole-filled image to obtain a filtered image including the foreground area to be processed.
- the boundary line of the foreground area to be processed, which includes M pixels, is obtained; for each of the M pixels on the boundary line, K pixels are extended outward to obtain the result image shown in (F) in FIG. 9, where M is an integer greater than 1 and K is an integer greater than or equal to 1.
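Steps S3 through S5 can be condensed into a short NumPy sketch; the threshold rule (maximum difference times a preset ratio) follows the description above, while omitting Gaussian blur and median filtering is a simplification for illustration:

```python
import numpy as np
from collections import deque

def tissue_mask(rgb, preset_ratio=0.1):
    """Sketch of steps S3-S5: difference image from per-pixel max-min
    over channels, binarization against max-difference * preset ratio,
    then hole filling via a border-seeded flood fill."""
    diff = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    binary = diff > diff.max() * preset_ratio
    h, w = binary.shape
    outside = np.zeros_like(binary)
    # Seed the flood fill with every background pixel on the border.
    queue = deque((y, x) for y in range(h) for x in range(w)
                  if (y in (0, h - 1) or x in (0, w - 1)) and not binary[y, x])
    while queue:
        y, x = queue.popleft()
        if 0 <= y < h and 0 <= x < w and not binary[y, x] and not outside[y, x]:
            outside[y, x] = True
            queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    # Background pixels never reached are holes: flip them to foreground.
    return binary | ~outside

rgb = np.full((5, 5, 3), 128, dtype=np.uint8)   # gray background
rgb[1:4, 1:4] = (200, 40, 90)                   # colored tissue block...
rgb[2, 2] = 128                                 # ...with a gray hole inside
mask = tissue_mask(rgb)
print(bool(mask[2, 2]), bool(mask[0, 0]))       # hole filled; border stays background
```

The gray hole pixel is not reachable from the border, so it is flipped to foreground, mirroring the hole-filling step of the pipeline.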
- FIG. 10 is a schematic diagram of an embodiment of the result image in the embodiment of the application, as shown in (A) in FIG. 10.
- FIG. 10(B) shows a medical image to be processed with regular vertical streaks.
- the regular vertical streaks are generated when the scanner scans the glass slide; whether regular vertical streaks are generated depends on the scanning equipment.
- the result image as shown in (D) in FIG. 10 can be obtained.
- the black and white stripes can be generated by format conversion, or can be an unclear area generated by a scanner scanning a glass slide.
- black and white stripes are added, and then, through the medical image processing method provided in the embodiment of the present application, the result image shown in (F) in FIG. 10 can be obtained.
- the color information of different channels is used to generate a difference image. Because the difference in color information of gray pixels across different channels is small, while the difference in color information of color pixels across different channels is large, the color information in the various medical images to be processed in FIG. 10 can be effectively used.
- the pathological tissue area extracted based on the difference image is therefore more accurate, which has a positive influence on subsequent image analysis.
- FIG. 11 is a schematic diagram of an embodiment of the image processing method in the embodiment of this application.
- An embodiment of the processing method includes:
- a first image to be processed and a second image to be processed are acquired, wherein the first image to be processed is a color image, the first image to be processed includes first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information in different channels;
- the image processing apparatus may obtain a first image to be processed and a second image to be processed.
- the first image to be processed may include first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information in different channels.
- the first to-be-processed image and the second to-be-processed image may be images received by the image processing apparatus via a wired network, or may also be images stored by the image processing apparatus itself.
- the first to-be-processed image is similar to the to-be-processed medical image described in the foregoing step 101, and will not be repeated here.
- the color information specifically corresponding to the first image data, the second image data, and the third image data should be flexibly determined in combination with actual conditions.
- the image processing device can be deployed on a server, or it can be deployed on a terminal device with higher computing power. In this embodiment, the deployment of the image processing device on a server is taken as an example for introduction.
- the first image to be processed is a photo taken on a cloudy day
- the background of the photo is a cloudy day
- a red car is also included.
- the second image to be processed is a picture of the blue sky and the sea.
- the image processing apparatus may generate a difference image according to the first image data, the second image data, and the third image data included in the first image to be processed obtained in step 201.
- the difference image is a grayscale image.
- the method for generating a difference image in this embodiment is similar to the corresponding embodiment in FIG. 2 described above, and will not be repeated here.
- the difference image generated at this time can see the outline of the car.
- the image processing device may perform binarization processing on the difference image generated in step 202 to obtain a binarized image.
- an adaptive binarization method is used to perform foreground processing, that is, to perform binarization processing on the difference image, so as to obtain a binarized image.
- the method for generating the binarized image in this embodiment is similar to the corresponding embodiment in FIG. 2, and will not be repeated here.
- the binarized image generated at this time can accurately show the outline of the car.
- the image processing apparatus may extract the target object from the first image to be processed according to the foreground region generated in step 203.
- the target object may be a pathological tissue area.
- the target object may be a vegetation area.
- the target object may be a bicycle or a car.
- the image of the car can be cut out from the first image to be processed, that is, the image of the car is the target object.
- the image processing device sets the target object as the first layer and the second image to be processed as the second layer, and overlays the first layer on the second layer, thereby generating a composite image.
- the image of the car is overlaid on the blue sky and white cloud photo to form a composite image.
- the background of the car is no longer cloudy, but blue sky and white clouds.
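The layer overlay in this example reduces to masked selection: where the binary foreground mask marks the target object, take the first layer's pixels, otherwise the second layer's. The mask and colors below are illustrative stand-ins for the extracted car and the blue-sky photo:

```python
import numpy as np

def composite(target_layer, mask, background_layer):
    """Overlay the first layer (target object) on the second layer:
    foreground-mask pixels come from the target, the rest from the
    background."""
    return np.where(mask[..., None], target_layer, background_layer)

car = np.full((4, 4, 3), (200, 0, 0), dtype=np.uint8)    # red "car" layer
sky = np.full((4, 4, 3), (0, 120, 255), dtype=np.uint8)  # blue-sky layer
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                                    # extracted foreground
out = composite(car, mask, sky)
print(out[1, 1].tolist(), out[0, 0].tolist())  # [200, 0, 0] [0, 120, 255]
```

Broadcasting the 2-D mask over the channel axis applies the same foreground/background decision to all three channels of each pixel.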
- an image processing method is provided.
- the difference in color information of grayscale pixels across different channels is small, while the color information of color pixels differs considerably across channels. Therefore, before the image is binarized, the color information of different channels is used to generate the difference image, which effectively uses the color information in the image, and the target object extracted based on the difference image is more accurate.
- the layer where the target object is located covers the layer where the second image to be processed is located, and the generated composite image presents the target object accurately, thereby improving the accuracy of the composite image, which can have a positive impact on subsequent image analysis.
- FIG. 12 is a schematic diagram of an embodiment of the medical image processing device in an embodiment of the application.
- the medical image processing device 300 includes:
- the acquiring module 301 is used to acquire a medical image to be processed, where the medical image to be processed is a color image and includes first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information in different channels;
- the generating module 302 is configured to generate a difference image according to the first image data, the second image data, and the third image data;
- the processing module 303 is configured to perform binarization processing on the difference image to obtain a binarized image, where the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed;
- a method for medical image processing is provided.
- the difference in color information of gray pixels across different channels is small, while the color information of color pixels differs considerably across channels. Therefore, before the image is binarized, the color information of different channels is used to generate the difference image, thereby effectively using the color information in the image, and the pathological tissue area extracted based on the difference image is more accurate, which has a positive impact on subsequent image analysis.
- the generating module 302 is specifically configured to generate a maximum value image and a minimum value image according to the first image data, the second image data, and the third image data included in the medical image to be processed;
- a method for generating a difference image is provided.
- a maximum value image and a minimum value image are generated according to the first image data, the second image data, and the third image data.
- since the maximum value image and the minimum value image reflect the differences in color information across channels, the color information of the medical image to be processed is captured more accurately, thereby improving the accuracy of difference image generation.
- the generating module 302 is specifically configured to determine the maximum pixel value and the minimum pixel value according to the first pixel value at the first pixel position in the first image data, the second pixel value at the second pixel position in the second image data, and the third pixel value at the third pixel position in the third image data;
- obtain the maximum value image according to the maximum pixel value and obtain the minimum value image according to the minimum pixel value, where the pixel value at the fourth pixel position in the maximum value image is the maximum pixel value and the pixel value at the fifth pixel position in the minimum value image is the minimum pixel value,
- the first pixel position, the second pixel position, the third pixel position, the fourth pixel position, and the fifth pixel position all correspond to the position of the same pixel in the medical image to be processed;
- the generating module 302 is specifically configured to determine the pixel difference value according to the pixel value of the fourth pixel position in the maximum value image and the pixel value of the fifth pixel position in the minimum value image;
- the difference image is obtained.
- the pixel value of the sixth pixel position in the difference image is the pixel difference value.
- the fourth pixel position, the fifth pixel position, and the sixth pixel position all correspond to the position of the same pixel in the medical image to be processed.
- a method for generating a maximum value image is provided.
- the maximum pixel value and the minimum pixel value are determined from the pixel values of the first image data, the second image data, and the third image data corresponding to the target pixel. The maximum pixel value and the minimum pixel value reflect the color information of the medical image to be processed to varying degrees, and the difference pixel value is obtained by subtracting the minimum pixel value from the maximum pixel value, so that the difference pixel value can accurately reflect the color information of the medical image to be processed, improving the accuracy of difference image generation.
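The per-pixel computation can be shown on a two-pixel example: the gray pixel's channels are nearly equal, so its difference value is small, while the color pixel's channels diverge (the sample values are arbitrary):

```python
import numpy as np

rgb = np.array([[[180, 180, 180],     # gray pixel: equal channel values
                 [200,  40,  90]]],   # color pixel: channel values differ
               dtype=np.uint8)
max_img = rgb.max(axis=-1)            # per-pixel maximum over the channels
min_img = rgb.min(axis=-1)            # per-pixel minimum over the channels
diff = max_img.astype(int) - min_img.astype(int)
print(diff.tolist())                  # gray -> 0, color -> 160
```

Casting to int before subtracting avoids uint8 wrap-around; the resulting difference image is near zero in gray regions and large where tissue is stained.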
- the generating module 302 is specifically configured to generate the difference image to be processed according to the first image data, the second image data, and the third image data;
- Gaussian blur processing is performed on the difference image to be processed to obtain the difference image.
- another method of generating a difference image is provided.
- Gaussian blur processing is performed on the generated difference image to be processed. Since Gaussian blur processing can improve robustness, the resulting difference image has better robustness, which improves the stability of the difference image.
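A separable Gaussian blur over the difference image to be processed might look like the following; the kernel radius and sigma are illustrative assumptions, not values prescribed by the embodiment:

```python
import numpy as np

def gaussian_blur(img, sigma=1.0, radius=2):
    """Separable Gaussian blur on a 2-D image: one 1-D pass along
    rows, then one along columns, with edge-replicated padding."""
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()                      # normalize to sum 1
    padded = np.pad(img.astype(float), radius, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode='valid'),
                               1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode='valid'),
                               0, rows)

flat = np.full((5, 5), 10.0)          # a constant difference image
out = gaussian_blur(flat)
print(round(float(out[2, 2]), 6))     # a constant image is left unchanged
```

Because the kernel is normalized and the filter is separable, two 1-D convolutions reproduce the 2-D blur at lower cost, smoothing isolated spikes in the difference image.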
- the medical image processing apparatus 300 further includes a determining module 304;
- the determining module 304 is configured to determine the binarization threshold according to the difference image
- the determining module 304 is further configured to perform binarization processing on the difference image according to the binarization threshold to obtain a binarized image.
- a method for obtaining a binarized image is provided.
- the binarized image is generated through the binarization process. Because the geometric properties of a binarized image do not involve the gray values of pixels, subsequent processing of the binarized image can be made simple, so that the efficiency of generating the foreground area can be improved.
- the determining module 304 is specifically configured to obtain N pixel values corresponding to N pixels according to the difference image, where the pixel values and the pixel points have a one-to-one correspondence, and N is an integer greater than 1;
- a reference pixel value is determined from the N pixel values, where the reference pixel value is the maximum of the N pixel values, and the binarization threshold is determined according to the reference pixel value and a preset ratio.
- the binarization threshold can be flexibly determined by adjusting the preset ratio, which improves the accuracy and flexibility of the threshold, thereby improving the accuracy of binarized image generation.
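The threshold rule described here — the reference pixel value is the maximum of the N pixel values, scaled by a preset ratio — can be written directly; the 0.1 ratio is an arbitrary illustrative choice:

```python
import numpy as np

def binarize(diff_img, preset_ratio=0.1):
    """Binarize a difference image: the threshold is the reference
    pixel value (the maximum of the N pixel values) times the ratio."""
    reference = int(diff_img.max())          # reference pixel value
    threshold = reference * preset_ratio     # binarization threshold
    return (diff_img > threshold).astype(np.uint8)

diff = np.array([[0,  5, 160],
                 [0, 20,  80]], dtype=np.uint8)
print(binarize(diff).tolist())               # threshold 16 -> [[0, 0, 1], [0, 1, 1]]
```

Tying the threshold to the image's own maximum makes the binarization adapt to slides with different staining intensity, which is what the preset-ratio adjustment above enables.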
- the generating module 302 is specifically configured to use a flooding algorithm to detect a background area in a binarized image, where the background area includes a plurality of background pixels;
- Median filtering processing is performed on the hole-filled image to obtain a result image, and the foreground area of the result image corresponds to the pathological tissue area of the medical image to be processed.
- a method for generating a result image is provided.
- the background pixels in the foreground area are changed to foreground pixels, and the obtained hole-filled image has better reliability.
- the median filtering process can make the result image corresponding to the medical image to be processed clear, with a good visual effect, without damaging characteristic information such as the contours and edges of the image.
- the processing module 303 is specifically configured to perform median filtering processing on the hole-filled image to obtain a filtered image
- the boundary line of the foreground area in the filtered image is obtained, where the boundary line includes M pixels, and M is an integer greater than 1;
- for each of the M pixels on the boundary line, K pixels are extended outward to obtain the result image, where K is an integer greater than or equal to 1.
- the obtaining module 301 is specifically used to obtain original medical images
- if it is detected that the medical sub-image includes a pathological tissue area, it is determined to be the medical image to be processed;
- if it is detected that the medical sub-image does not include a pathological tissue area, the medical sub-image is determined as the background image, and the background image is removed.
- a method for obtaining medical images to be processed is provided.
- the medical images to be processed are determined, so that the medical images to be processed include the pathological tissue regions.
- through the foregoing steps, the result image corresponding to the medical image to be processed can be obtained; the result image includes the pathological tissue area, which facilitates subsequent processing and analysis of the pathological tissue area in the result image.
- the medical sub-image that does not include the pathological tissue area is determined as the background image, and the background image is removed to reduce the resource occupancy rate.
- the medical image processing apparatus further includes a training module 305;
- the generating module 302 is further configured to generate a target positive sample image according to the image to be processed and the foreground area of the image to be processed, wherein the target positive sample image belongs to a positive sample image in the positive sample set, and each positive sample image contains pathological tissue area;
- the obtaining module 301 is further configured to obtain a negative sample set, where the negative sample set includes at least one negative sample image, and each negative sample image does not include a pathological tissue area;
- the training module 305 is used to train the image processing model based on the positive sample set and the negative sample set.
- a method for training an image processing model is provided.
- the image processing model is trained through a positive sample set whose images include pathological tissue regions and a negative sample set whose images do not include pathological tissue regions, which improves the accuracy and reliability of the image processing model, thereby improving the efficiency and accuracy of image processing.
- FIG. 13 is a schematic diagram of an embodiment of the image processing device in an embodiment of the application.
- the image processing device 400 includes:
- the acquiring module 401 is used to acquire a first image to be processed and a second image to be processed, wherein the first image to be processed is a color image, and the first image to be processed includes first image data, second image data, and third image Data, and the first image data, the second image data, and the third image data respectively correspond to color information in different channels;
- the generating module 402 is configured to generate a difference image according to the first image data, the second image data, and the third image data;
- the processing module 403 is configured to perform binarization processing on the difference image to obtain a binarized image, and the foreground area of the binarized image corresponds to the pathological tissue area of the medical image to be processed;
- the extraction module 404 is configured to extract the target object from the first image to be processed according to the foreground area of the binarized image
- the generating module 402 is further configured to generate a composite image according to the target object and the second image to be processed, where the target object is located in the first layer, the second image to be processed is located in the second layer, and the first layer is overlaid on the second layer.
- an image processing method is provided.
- the difference in color information of grayscale pixels across different channels is small, while the color information of color pixels differs considerably across channels. Therefore, before the image is binarized, the color information of different channels is used to generate the difference image, which effectively uses the color information in the image, and the target object extracted based on the difference image is more accurate.
- the layer where the target object is located covers the layer where the second image to be processed is located, and the generated composite image presents the target object accurately, thereby improving the accuracy of the composite image, which can have a positive impact on subsequent image analysis.
- the server 500 may have relatively large differences due to different configurations or performance, and may include one or more central processing units (CPU) 522 (for example, one or more processors), memory 532, and one or more storage media 530 (for example, one or more mass storage devices) for storing application programs 542 or data 544.
- the memory 532 and the storage medium 530 may be short-term storage or persistent storage.
- the program stored in the storage medium 530 may include one or more modules (not shown in the figure), and each module may include a series of command operations on the server.
- the central processing unit 522 may be configured to communicate with the storage medium 530 and execute a series of instruction operations in the storage medium 530 on the server 500.
- the server 500 may also include one or more power supplies 525, one or more wired or wireless network interfaces 550, one or more input and output interfaces 558, and/or one or more operating systems 541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
- the steps performed by the server in the above embodiment may be based on the server structure shown in FIG. 14.
- the CPU 522 is used to execute the steps executed by the medical image processing apparatus in the embodiment corresponding to FIG. 2, and the CPU 522 is also used to execute the steps executed by the image processing apparatus in the embodiment corresponding to FIG. 1.
- the disclosed system, device, and method can be implemented in other ways.
- the device embodiments described above are merely illustrative. For example, the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
- the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
- the technical solution of the present application, in essence, or the part that contributes to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to make a computer device (which may be a personal computer, a server, a network device, etc.) execute all or part of the steps of the methods described in the various embodiments of the present application.
- the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.
Claims (15)
- 一种医学图像处理的方法,由服务器执行包括:获取待处理医学图像,所述待处理医学图像为彩色图像,且所述待处理医学图像包括第一图像数据、第二图像数据以及第三图像数据,且所述第一图像数据、所述第二图像数据以及所述第三图像数据分别对应于不同通道下的色彩信息;根据所述第一图像数据、所述第二图像数据以及所述第三图像数据,生成差值图像;对所述差值图像进行二值化处理,得到二值化图像,所述二值化图像的前景区域与所述待处理医学图像的病理组织区域对应。
- 根据权利要求1所述的方法,所述根据所述第一图像数据、所述第二图像数据以及所述第三图像数据,生成差值图像,包括:根据所述待处理医学图像中所包括的所述第一图像数据、所述第二图像数据以及所述第三图像数据,生成最大值图像和最小值图像;根据所述最大值图像以及所述最小值图像,生成所述差值图像。
- 根据权利要求2所述的方法,所述根据所述第一图像数据、所述第二图像数据以及所述第三图像数据,生成最大值图像和最小值图像,包括:根据所述第一图像数据中第一像素位置的第一像素值、所述第二图像数据中第二像素位置的第二像素值以及所述第三图像数据中第三像素位置的第三像素值,确定最大像素值和最小像素值;根据所述最大像素值获得最大值图像,根据所述最小像素值获得最小值图像,所述最大值图像中第四像素位置的像素值为所述最大像素值,所述最小值图像中第五像素位置的像素值为所述最小像素值,所述第一像素位置、所述第二像素位置、所述第三像素位置以及所述第四像素位置、所述第五像素位置均对应于所述待处理医学图像中同一个像素点的位置;所述根据所述最大值图像以及所述最小值图像,生成差值图像,包括:根据所述最大值图像中所述第四像素位置的像素值和所述最小值图像中所述第五像素位置的像素值确定像素差值;根据所述像素差值,获得差值图像,所述差值图像中第六像素位置的像素值为所述像素差值,所述第四像素位置、所述第五像素位置和所述第六像素位置均对应于所述待处理医学图像中同一个像素点的位置。
- The method according to claim 1, wherein the generating a difference image according to the first image data, the second image data, and the third image data comprises: generating a to-be-processed difference image according to the first image data, the second image data, and the third image data; and performing Gaussian blur processing on the to-be-processed difference image to obtain the difference image.
- The method according to claim 1, wherein the performing binarization processing on the difference image to obtain a binarized image comprises: determining a binarization threshold according to the difference image; and performing binarization processing on the difference image according to the binarization threshold to obtain the binarized image.
- The method according to claim 5, wherein the determining a binarization threshold according to the difference image comprises: obtaining, from the difference image, N pixel values corresponding to N pixels, wherein the pixel values are in one-to-one correspondence with the pixels and N is an integer greater than 1; determining a reference pixel value from the N pixel values, wherein the reference pixel value is the maximum of the N pixel values; and determining the binarization threshold according to the reference pixel value and a preset ratio.
- The method according to claim 1, further comprising: detecting a background region in the binarized image by using a flood fill algorithm, wherein the background region comprises a plurality of background pixels; obtaining, according to the binarized image and the background region in the binarized image, background pixels located within the foreground region of the binarized image, wherein the foreground region comprises a plurality of foreground pixels; changing the background pixels within the foreground region of the binarized image into foreground pixels to obtain a hole-filled image; and performing median filtering on the hole-filled image to obtain a result image, wherein a foreground region of the result image corresponds to the pathological tissue region of the medical image to be processed.
- The method according to claim 7, wherein the performing median filtering on the hole-filled image to obtain a result image comprises: performing median filtering on the hole-filled image to obtain a filtered image; obtaining a boundary line of a foreground region in the filtered image, wherein the boundary line comprises M pixels and M is an integer greater than 1; and extending each of the M pixels on the boundary line outward by K pixels to obtain the result image, wherein K is an integer greater than or equal to 1.
- The method according to claim 1, wherein the obtaining a medical image to be processed comprises: obtaining an original medical image; extracting a medical sub-image from the original medical image by using a sliding window; determining the medical sub-image as the medical image to be processed if a pathological tissue region is detected in the medical sub-image; and determining the medical sub-image as a background image and removing the background image if no pathological tissue region is detected in the medical sub-image.
- The method according to any one of claims 1 to 9, further comprising: generating a target positive sample image according to the medical image to be processed and the foreground region of the medical image to be processed, wherein the target positive sample image is one positive sample image in a positive sample set and each positive sample image contains a pathological tissue region; obtaining a negative sample set, wherein the negative sample set comprises at least one negative sample image and each negative sample image contains no pathological tissue region; and training an image processing model based on the positive sample set and the negative sample set.
- An image processing method, performed by a server, comprising: obtaining a first image to be processed and a second image to be processed, wherein the first image to be processed is a color image comprising first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information under different channels; generating a difference image according to the first image data, the second image data, and the third image data; performing binarization processing on the difference image to obtain a binarized image, wherein a foreground region of the binarized image corresponds to a region of a target object in the first image to be processed; extracting the target object from the first image to be processed according to the foreground region of the binarized image; and generating a composite image according to the target object and the second image to be processed, wherein the target object is located on a first layer, the second image to be processed is located on a second layer, and the first layer overlays the second layer.
- A medical image processing apparatus, comprising: an obtaining module, configured to obtain a medical image to be processed, wherein the medical image to be processed is a color image comprising first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information under different channels; a generating module, configured to generate a difference image according to the first image data, the second image data, and the third image data; and a processing module, configured to perform binarization processing on the difference image to obtain a binarized image, wherein a foreground region of the binarized image corresponds to a pathological tissue region of the medical image to be processed.
- An image processing apparatus, comprising: an obtaining module, configured to obtain a first image to be processed and a second image to be processed, wherein the first image to be processed is a color image comprising first image data, second image data, and third image data, and the first image data, the second image data, and the third image data respectively correspond to color information under different channels; a generating module, configured to generate a difference image according to the first image data, the second image data, and the third image data; a processing module, configured to perform binarization processing on the difference image to obtain a binarized image, wherein a foreground region of the binarized image corresponds to a target object in the first image to be processed; and an extracting module, configured to extract the target object from the first image to be processed according to the foreground region of the binarized image; wherein the generating module is further configured to generate a composite image according to the target object and the second image to be processed, the target object being located on a first layer, the second image to be processed being located on a second layer, and the first layer overlaying the second layer.
- A computer device, comprising a memory, a transceiver, a processor, and a bus system, wherein the memory is configured to store a program; the processor is configured to execute the program in the memory to implement the method according to any one of claims 1 to 10, or the method according to claim 11; and the bus system is configured to connect the memory and the processor so that the memory and the processor communicate with each other.
- A computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 10, or the method according to claim 11.
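A minimal sketch of the mask-extraction pipeline recited in claims 1 to 8, assuming an RGB input array and using NumPy/SciPy. The function name, the default parameter values (`ratio`, `blur_sigma`, `median_size`, `extend_px`), and the choice of `scipy.ndimage` are the editor's illustrative assumptions, not values or implementations taken from the application:

```python
import numpy as np
from scipy.ndimage import (binary_dilation, binary_fill_holes,
                           gaussian_filter, median_filter)


def tissue_mask(image, ratio=0.1, blur_sigma=2.0, median_size=5, extend_px=2):
    """Return a boolean foreground mask for an H x W x 3 color image.

    Steps mirror the claims: per-pixel channel max/min -> difference image
    -> Gaussian blur -> binarization at (max pixel value * preset ratio)
    -> hole filling -> median filtering -> outward boundary extension.
    """
    channels = image.astype(np.float64)
    max_img = channels.max(axis=2)   # maximum-value image (claims 2-3)
    min_img = channels.min(axis=2)   # minimum-value image (claims 2-3)
    diff = max_img - min_img         # difference image: near zero for gray/white background
    diff = gaussian_filter(diff, sigma=blur_sigma)      # claim 4: Gaussian blur
    threshold = diff.max() * ratio   # claims 5-6: reference (maximum) pixel value * preset ratio
    binary = diff > threshold        # binarized image; True marks foreground
    filled = binary_fill_holes(binary)                  # claim 7: enclosed background pixels become foreground
    smoothed = median_filter(filled, size=median_size)  # claim 7: median filtering
    return binary_dilation(smoothed, iterations=extend_px)  # claim 8: extend the boundary outward by K pixels
```

On a stained pathology tile the near-white background produces an almost-zero channel difference while stained tissue is strongly colored, so the ratio-based threshold separates the two; the preset ratio of claim 6 acts as a tuning parameter.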
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020227009792A KR20220050977A (ko) | 2020-02-10 | 2020-11-03 | Medical image processing method, image processing method, and device |
JP2022524010A JP2022553979A (ja) | 2020-02-10 | 2020-11-03 | Medical image processing method, image processing method, medical image processing device, image processing device, computer device, and program |
EP20919308.5A EP4002268A4 (en) | 2020-02-10 | 2020-11-03 | METHOD FOR PROCESSING MEDICAL IMAGES, IMAGE PROCESSING METHOD AND APPARATUS |
US17/685,847 US20220189017A1 (en) | 2020-02-10 | 2022-03-03 | Medical image processing method and apparatus, image processing method and apparatus, terminal and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010084678.8 | 2020-02-10 | ||
CN202010084678.8A CN111275696B (zh) | 2020-02-10 | 2020-02-10 | Medical image processing method, image processing method, and apparatus |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/685,847 Continuation US20220189017A1 (en) | 2020-02-10 | 2022-03-03 | Medical image processing method and apparatus, image processing method and apparatus, terminal and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021159767A1 true WO2021159767A1 (zh) | 2021-08-19 |
Family
ID=71000325
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/126063 WO2021159767A1 (zh) | 2020-02-10 | 2020-11-03 | Medical image processing method, image processing method, and apparatus |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220189017A1 (zh) |
EP (1) | EP4002268A4 (zh) |
JP (1) | JP2022553979A (zh) |
KR (1) | KR20220050977A (zh) |
CN (1) | CN111275696B (zh) |
WO (1) | WO2021159767A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118135340A (zh) * | 2024-05-06 | 2024-06-04 | Tianjin Cancer Hospital (Tianjin Medical University Cancer Institute and Hospital) | Lung image lesion pre-labeling method, system, and medium based on lung region segmentation |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111275696B (zh) * | 2020-02-10 | 2023-09-15 | Tencent Healthcare (Shenzhen) Co., Ltd. | Medical image processing method, image processing method, and apparatus |
CN112070708B (zh) | 2020-08-21 | 2024-03-08 | Hangzhou Ruiqi Software Co., Ltd. | Image processing method, image processing apparatus, electronic device, and storage medium |
CN112149509B (zh) * | 2020-08-25 | 2023-05-09 | Zhejiang Supcon Information Industry Co., Ltd. | Traffic light fault detection method fusing deep learning and image processing |
CN114979589B (zh) * | 2021-02-26 | 2024-02-06 | Shenzhen Yihua Computer Co., Ltd. | Image processing method and apparatus, electronic device, and medium |
CN113160974B (zh) * | 2021-04-16 | 2022-07-19 | Shanxi University | Mental disorder biotype discovery method based on hypergraph clustering |
CN113989304A (zh) * | 2021-11-10 | 2022-01-28 | Xinyi International Digital Medical System (Dalian) Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
CN115205156B (zh) * | 2022-07-27 | 2023-06-30 | Shanghai Wuqi Microelectronics Co., Ltd. | Distortion-free median filtering boundary padding method and apparatus, electronic device, and storage medium |
CN115934990B (zh) * | 2022-10-24 | 2023-05-12 | Beijing Shuhui Spatio-Temporal Information Technology Co., Ltd. | Remote sensing image recommendation method based on content understanding |
CN115830459B (zh) * | 2023-02-14 | 2023-05-12 | Shandong Provincial Territorial Spatial Ecological Restoration Center (Shandong Provincial Geological Disaster Prevention Technical Guidance Center, Shandong Provincial Land Reserve Center) | Neural-network-based method for detecting the degree of damage to mountain forest-grass life communities |
CN117252893B (zh) * | 2023-11-17 | 2024-02-23 | Kepuyun Medical Software (Shenzhen) Co., Ltd. | Segmentation processing method for breast cancer pathological images |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109461143A (zh) * | 2018-10-12 | 2019-03-12 | Shanghai United Imaging Healthcare Co., Ltd. | Image display method and apparatus, computer device, and storage medium |
CN110575178A (zh) * | 2019-09-10 | 2019-12-17 | Jia Ying | Integrated diagnosis and monitoring medical system with motion state determination, and determination method thereof |
CN110705425A (zh) * | 2019-09-25 | 2020-01-17 | Guangzhou Xisi Digital Technology Co., Ltd. | Multi-label tongue image classification learning method based on graph convolutional networks |
CN111275696A (zh) * | 2020-02-10 | 2020-06-12 | Tencent Technology (Shenzhen) Co., Ltd. | Medical image processing method, image processing method, and apparatus |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1153564A (zh) * | 1994-06-03 | 1997-07-02 | Neuromedical Systems, Inc. | Density-texture-based classification system and method |
JPH08279046A (ja) * | 1995-04-06 | 1996-10-22 | Mitsubishi Rayon Co Ltd | Pattern inspection apparatus |
US6704456B1 (en) * | 1999-09-02 | 2004-03-09 | Xerox Corporation | Automatic image segmentation in the presence of severe background bleeding |
US20050136509A1 (en) * | 2003-09-10 | 2005-06-23 | Bioimagene, Inc. | Method and system for quantitatively analyzing biological samples |
US20060036372A1 (en) * | 2004-03-18 | 2006-02-16 | Bulent Yener | Method and apparatus for tissue modeling |
WO2010011356A2 (en) * | 2008-07-25 | 2010-01-28 | Aureon Laboratories, Inc. | Systems and methods of treating, diagnosing and predicting the occurrence of a medical condition |
JP5295044B2 (ja) * | 2009-08-27 | 2013-09-18 | KDDI Corporation | Method and program for extracting a mask image, and method and program for constructing voxel data |
CN102360500B (zh) * | 2011-07-08 | 2013-06-12 | Xidian University | Remote sensing image change detection method based on Treelet curvelet-domain denoising |
JP5995215B2 (ja) * | 2012-05-14 | 2016-09-21 | Tokyo University of Science | Cancer cell region extraction apparatus, method, and program |
CN102982519B (zh) * | 2012-11-23 | 2015-04-01 | Nanjing University of Posts and Telecommunications | Foreground recognition, extraction, and stitching method for video images |
CN103325117B (zh) * | 2013-06-17 | 2016-08-10 | China National Petroleum Corporation | MATLAB-based core image processing method and system |
CN104036490B (zh) * | 2014-05-13 | 2017-03-29 | Chongqing University | Foreground segmentation method suitable for transmission over mobile communication networks |
CN106469267B (zh) * | 2015-08-20 | 2019-12-17 | Shenzhen Tencent Computer Systems Co., Ltd. | Verification code sample collection method and system |
CN105740844A (zh) * | 2016-03-02 | 2016-07-06 | Chengdu Yibite Automation Equipment Co., Ltd. | Insulator burst fault detection method based on image recognition technology |
CN106295645B (zh) * | 2016-08-17 | 2019-11-29 | NetPosa Technologies, Ltd. | License plate character recognition method and apparatus |
WO2018180386A1 (ja) * | 2017-03-30 | 2018-10-04 | National Institute of Advanced Industrial Science and Technology | Ultrasound image diagnosis support method and system |
CN107563373B (zh) * | 2017-07-28 | 2021-06-04 | EFY Intelligent Control (Tianjin) Tech Co., Ltd. | Active safety detection method for UAV landing areas based on stereo vision, and application thereof |
CN107609468B (zh) * | 2017-07-28 | 2021-11-16 | EFY Intelligent Control (Tianjin) Tech Co., Ltd. | Category-optimized aggregation analysis method for active safety detection of UAV landing areas, and application thereof |
CN107644429B (zh) * | 2017-09-30 | 2020-05-19 | Huazhong University of Science and Technology | Video segmentation method based on strongly target-constrained video saliency |
JP2018152095A (ja) * | 2018-04-19 | 2018-09-27 | Nikon Corporation | Image processing apparatus, imaging apparatus, and image processing program |
CN108924525B (zh) * | 2018-06-06 | 2021-07-06 | Ping An Technology (Shenzhen) Co., Ltd. | Image brightness adjustment method and apparatus, computer device, and storage medium |
CN109708813B (zh) * | 2018-06-28 | 2021-07-27 | Zhejiang Senlate HVAC Equipment Co., Ltd. | Real-time detection platform for radiator water leakage |
CN109784344B (zh) * | 2019-01-24 | 2020-09-29 | Central South University | Image non-target filtering method for ground plane marking recognition |
CN110675420B (zh) * | 2019-08-22 | 2023-03-24 | Huawei Technologies Co., Ltd. | Image processing method and electronic device |
CN110472616B (zh) * | 2019-08-22 | 2022-03-08 | Tencent Technology (Shenzhen) Co., Ltd. | Image recognition method and apparatus, computer device, and storage medium |
-
2020
- 2020-02-10 CN CN202010084678.8A patent/CN111275696B/zh active Active
- 2020-11-03 KR KR1020227009792A patent/KR20220050977A/ko unknown
- 2020-11-03 WO PCT/CN2020/126063 patent/WO2021159767A1/zh unknown
- 2020-11-03 EP EP20919308.5A patent/EP4002268A4/en active Pending
- 2020-11-03 JP JP2022524010A patent/JP2022553979A/ja active Pending
-
2022
- 2022-03-03 US US17/685,847 patent/US20220189017A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4002268A4 |
Also Published As
Publication number | Publication date |
---|---|
JP2022553979A (ja) | 2022-12-27 |
EP4002268A1 (en) | 2022-05-25 |
US20220189017A1 (en) | 2022-06-16 |
CN111275696B (zh) | 2023-09-15 |
EP4002268A4 (en) | 2022-12-21 |
CN111275696A (zh) | 2020-06-12 |
KR20220050977A (ko) | 2022-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021159767A1 (zh) | Medical image processing method, image processing method, and apparatus | |
CN110570353B (zh) | Single-image super-resolution reconstruction method using densely connected generative adversarial networks | |
US11830230B2 (en) | Living body detection method based on facial recognition, and electronic device and storage medium | |
CN108446617B (zh) | Fast face detection method resistant to profile-face interference | |
US20230214976A1 (en) | Image fusion method and apparatus and training method and apparatus for image fusion model | |
CN104050471B (zh) | Natural scene text detection method and system | |
WO2018145470A1 (zh) | Image detection method and apparatus | |
TW202014984A (zh) | Image processing method, electronic device, and storage medium | |
WO2017084204A1 (zh) | Method and system for tracking human skeleton points in a two-dimensional video stream | |
CN109685045B (zh) | Moving target video tracking method and system | |
CN109918971B (zh) | Method and apparatus for detecting the number of people in surveillance video | |
CN111160194B (zh) | Static gesture image recognition method based on multi-feature fusion | |
CN112561813B (zh) | Face image enhancement method and apparatus, electronic device, and storage medium | |
CN113762009B (zh) | Crowd counting method based on multi-scale feature fusion and a dual attention mechanism | |
Cai et al. | Perception preserving decolorization | |
CN113781421A (zh) | Underwater target recognition method, apparatus, and system | |
Zhang et al. | Salient target detection based on the combination of super-pixel and statistical saliency feature analysis for remote sensing images | |
CN115713469A (zh) | Underwater image enhancement method based on channel attention and a deformation generative adversarial network | |
CN108711160A (zh) | Target segmentation method based on an HSI enhancement model | |
CN113129214A (zh) | Super-resolution reconstruction method based on generative adversarial networks | |
CN112070041B (zh) | Living face detection method and apparatus based on a CNN deep learning model | |
Kour et al. | A review on image processing | |
Fang et al. | Detail maintained low-light video image enhancement algorithm | |
Honnutagi et al. | Underwater video enhancement using manta ray foraging lion optimization-based fusion convolutional neural network | |
Xu et al. | DANet-SMIW: An Improved Model for Island Waterline Segmentation Based on DANet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20919308 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020919308 Country of ref document: EP Effective date: 20220216 |
|
ENP | Entry into the national phase |
Ref document number: 20227009792 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2022524010 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |