CN116912122A - Method and device for restoring fat color in endoscopic imaging, storage medium, and electronic device


Info

Publication number
CN116912122A
Authority
CN
China
Prior art keywords
bleeding
target
target image
pixel
image
Prior art date
Legal status
Pending
Application number
CN202310875953.1A
Other languages
Chinese (zh)
Inventor
汪晓辉
刘恩毅
王金铭
何炎森
Current Assignee
Hangzhou Haikang Huiying Technology Co., Ltd.
Original Assignee
Hangzhou Haikang Huiying Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Hangzhou Haikang Huiying Technology Co., Ltd.
Priority to CN202310875953.1A
Publication of CN116912122A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10068: Endoscopic image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30004: Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a method for restoring fat color in endoscopic imaging, comprising the following steps: acquiring a target image obtained by endoscopic imaging, the target image including adipose tissue stained with blood; identifying the adipose tissue in the target image; estimating the bleeding-staining degree based on the image pixel information of the adipose tissue to obtain a bleeding degree estimated value for the target image; and performing color restoration on the adipose tissue based on the bleeding degree estimated value. With this method, the color of fat regions in endoscopic imaging can be restored.

Description

Method and device for restoring fat color in endoscopic imaging, storage medium, and electronic device
Technical Field
The present application relates to image processing technology, and in particular to a method and apparatus for restoring fat color in endoscopic imaging, a storage medium, and an electronic device.
Background
An endoscope is a commonly used medical instrument consisting of a bendable tube, a light source, and a set of lenses. In use, it enters the human body through a natural orifice or through a small surgical incision and is guided into the organ to be examined, allowing direct observation of changes in the relevant region. Image quality directly determines how useful an endoscope is and marks the development level of endoscopic technology.
White-light imaging systems are the most common in endoscopic surgery. Endoscopic images are dominated by red and yellow: red mostly comes from organs, blood, and similar tissue, while yellow mostly comes from fat, the urinary tract (pale yellow), and the like. During surgery, fat is easily stained by blood and turns red, making it indistinguishable from other tissue; this degrades visual perception and can interfere with the surgeon's operation.
Disclosure of Invention
The application provides a method and a device for restoring fat color in endoscopic imaging, a storage medium, and an electronic device, which can restore the color of fat regions.
In order to achieve the above purpose, the application adopts the following technical scheme:
a method for recovering fat color in endoscopic imaging, comprising:
acquiring a target image obtained by endoscopic imaging, the target image including adipose tissue stained with blood;
identifying the adipose tissue from the target image;
estimating the bleeding dyeing degree based on the image pixel information of the adipose tissue to obtain a bleeding degree estimated value of the target image;
and performing color recovery on the adipose tissue based on the bleeding level estimation value.
Preferably, the identifying the adipose tissue from the target image includes:
identifying the adipose tissue from the target image using a fat recognition model; or identifying the adipose tissue based on pixel information of the target image,
the fat recognition model is obtained by training a fat sample image, and the pixel information comprises color information.
Preferably, identifying the adipose tissue based on the pixel information of the target image includes:
and determining a target pixel for representing fat based on a matching result of the color information of each pixel in the target image and a calibrated color range, wherein the calibrated color range is used for representing the color range of the pixel of fat.
Preferably, the estimating of the bleeding-staining degree based on the image pixel information of the adipose tissue to obtain the bleeding degree estimated value of the target image includes:
acquiring a bleeding-staining level of a target pixel identified as the adipose tissue;
estimating based on the bleeding dyeing degree of each target pixel on the target image to obtain a bleeding degree estimated value of the target image; or estimating based on the bleeding dyeing degree of each target pixel on the auxiliary image to obtain a bleeding degree estimated value of the target image;
the auxiliary image is an image region located at the same position, and of the same size, as the target image on the frame preceding the frame in which the target image is located.
Preferably, obtaining the bleeding and staining level of the target pixel identified as the adipose tissue comprises:
for each of the target pixels, taking as the bleeding-staining level a weighted sum of the channel pixel values of the target pixel in a single space, wherein the weighting value of the red-related channel is larger than the weighting values of the other channels.
Preferably, the estimating based on the bleeding dyeing degree of each target pixel on the target image, and obtaining the bleeding degree estimated value of the target image includes:
selecting part or all of target pixels on the target image based on the bleeding and dyeing degree of each target pixel on the target image to form a calculation data set;
carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the calculation data set, and taking a calculation result as a bleeding degree estimated value of the target image;
and/or,
the estimating based on the bleeding dyeing degree of each target pixel on the auxiliary image, and obtaining the bleeding degree estimated value of the target image comprises the following steps:
Selecting part or all target pixels from all target pixels of the auxiliary image based on the bleeding and dyeing degree of each target pixel on the auxiliary image to form an auxiliary calculation data set;
and carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the auxiliary calculation data set, and taking a calculation result as a bleeding degree estimated value of the target image.
Preferably, the color recovery of the adipose tissue includes:
determining a color recovery intensity based on the bleeding level estimate of the target image;
processing color information of all target pixels of the target image, which are identified as adipose tissue, in a target space based on the color restoration intensity to obtain restored color information;
and determining pixel values of all target pixels in the original space of the target image by using the recovered color information.
Preferably, performing color restoration on the adipose tissue based on the bleeding level estimation value includes:
and determining color restoration intensity based on the current scene information and/or the level parameters input by the user, and performing color restoration on the adipose tissue.
A fat color restoration device in endoscopic imaging, comprising: a target image acquisition unit, an adipose tissue detection unit, a bleeding degree estimation unit and a color recovery unit;
The target image acquisition unit is used for acquiring a target image obtained through endoscopic imaging, wherein the target image comprises fat tissue stained by blood;
the adipose tissue detection unit is used for identifying the adipose tissue in the target image;
the bleeding degree estimation unit is used for estimating bleeding dyeing degree based on the image pixel information of the adipose tissue to obtain a bleeding degree estimation value of the target image;
the color recovery unit is used for performing color recovery on the adipose tissue based on the bleeding degree estimated value.
Preferably, in the adipose tissue detection unit, the identifying the adipose tissue from the target image includes:
identifying the adipose tissue from the target image using a fat recognition model; or identifying the adipose tissue based on pixel information of the target image,
the fat recognition model is obtained by training a fat sample image, and the pixel information comprises color information.
Preferably, in the adipose tissue detection unit, identifying the adipose tissue based on the pixel information of the target image includes:
And determining a target pixel for representing fat based on a matching result of the color information of each pixel in the target image and a calibrated color range, wherein the calibrated color range is used for representing the color range of the pixel of fat.
Preferably, in the bleeding level estimation unit, the estimating the bleeding level based on the image pixel information of the adipose tissue, and obtaining the bleeding level estimation value of the target image includes:
acquiring a bleeding-staining level of a target pixel identified as the adipose tissue;
estimating based on the bleeding dyeing degree of each target pixel on the target image to obtain a bleeding degree estimated value of the target image; or estimating based on the bleeding dyeing degree of each target pixel on the auxiliary image to obtain a bleeding degree estimated value of the target image;
the auxiliary image is an image region located at the same position, and of the same size, as the target image on the frame preceding the frame in which the target image is located.
Preferably, in the bleeding level estimation unit, acquiring the bleeding coloring level of the target pixel identified as the adipose tissue includes:
for each of the target pixels, taking as the bleeding-staining level a weighted sum of the channel pixel values of the target pixel in a single space, wherein the weighting value of the red-related channel is larger than the weighting values of the other channels.
Preferably, in the bleeding level estimation unit, the estimating based on the bleeding dyeing level of each of the target pixels on the target image, to obtain the bleeding level estimation value of the target image includes:
selecting part or all of target pixels on the target image based on the bleeding and dyeing degree of each target pixel on the target image to form a calculation data set;
carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the calculation data set, and taking a calculation result as a bleeding degree estimated value of the target image;
and/or,
the estimating based on the bleeding dyeing degree of each target pixel on the auxiliary image, and obtaining the bleeding degree estimated value of the target image comprises the following steps:
selecting part or all target pixels from all target pixels of the auxiliary image based on the bleeding and dyeing degree of each target pixel on the auxiliary image to form an auxiliary calculation data set;
And carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the auxiliary calculation data set, and taking a calculation result as a bleeding degree estimated value of the target image.
Preferably, in the color recovery unit, the performing color recovery on the adipose tissue includes:
determining a color recovery intensity based on the bleeding level estimate of the target image;
processing color information of all target pixels of the target image, which are identified as adipose tissue, in a target space based on the color restoration intensity to obtain restored color information;
and determining pixel values of all target pixels in the original space of the target image by using the recovered color information.
Preferably, in the color recovery unit, performing color recovery on the adipose tissue based on the bleeding degree estimation value includes:
and determining color restoration intensity based on the current scene information and/or the level parameters input by the user, and performing color restoration on the adipose tissue.
A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method for recovering fat color in endoscopic imaging of any one of the above.
An electronic device comprising at least a computer-readable storage medium and a processor;
the processor is configured to read executable instructions from the computer readable storage medium and execute the instructions to implement the method for recovering fat color in endoscopic imaging as described in any one of the above.
As can be seen from the above technical solution, in the present application a target image obtained by endoscopic imaging is first acquired and adipose tissue is identified in it; then the bleeding-staining degree is estimated based on the image pixel information of the adipose tissue to obtain a bleeding degree estimated value of the target image; finally, color recovery is performed on the adipose tissue according to that estimate. In this way, fat regions in the target image can be effectively detected and their color restored in a targeted manner, returning the fat to its original appearance.
Drawings
FIG. 1 is a schematic flow chart of a fat color restoration method in endoscopic imaging according to the present application;
FIG. 2 is a schematic flow chart of a method for recovering fat color according to an embodiment of the application;
FIG. 3 is a schematic view of the basic structure of a fat color restoration device for endoscopic imaging according to the present application;
Fig. 4 is a schematic diagram of a basic structure of an electronic device according to the present application.
Detailed Description
The present application will be described in further detail with reference to the accompanying drawings, in order to make the objects, technical means and advantages of the present application more apparent.
Fig. 1 is a schematic diagram of the basic architecture of the fat color restoration method in endoscopic imaging according to the present application. As shown in fig. 1, the method includes:
step 101, obtaining a target image obtained by endoscopic imaging.
The acquired target image includes adipose tissue stained with blood.
Endoscopic imaging yields a video composed of a sequence of continuous images; for the processing of the present application, the target image can be a single frame, or a partial image region within a single frame.
Step 102, identifying adipose tissue in the target image.
The adipose tissue may include target pixels for characterizing fat, i.e. pixels identified as belonging to the fat class. For convenience of description, a pixel identified as adipose tissue is referred to herein as a target pixel.
Adipose tissue can be identified with a deep neural network model. Specifically, the target image may be input into a fat recognition model, whose output identification information is the target pixel information identified as adipose tissue. The fat recognition model processes the input target image, extracts its features, performs recognition on the extracted features, and outputs the identification information, which includes the target pixels recognized as adipose tissue. The fat recognition model is pre-trained; specifically, a neural network model can be trained with pre-collected sample images to obtain the fat recognition model. The neural network model may be a Fast R-CNN model, a YOLO model, an SSD model, or the like, and the sample images are images containing fat regions, some of which may contain fat regions stained with blood.
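As an illustration of this model-based path, the following is a minimal inference sketch in Python. The file name "fat_model.pt", the input normalization, and the per-pixel probability output convention are illustrative assumptions, not details fixed by the application.

```python
# Hypothetical sketch: model-based identification of fat pixels.
# The checkpoint name, normalization, and (1, 1, H, W) output shape
# are assumptions for illustration only.
import numpy as np
import torch

def identify_fat_pixels(bgr_image: np.ndarray, model_path: str = "fat_model.pt") -> np.ndarray:
    model = torch.load(model_path, map_location="cpu")  # assumed full nn.Module checkpoint
    model.eval()
    # HWC uint8 -> NCHW float in [0, 1]
    x = torch.from_numpy(bgr_image).float().permute(2, 0, 1).unsqueeze(0) / 255.0
    with torch.no_grad():
        prob = torch.sigmoid(model(x))[0, 0]  # assumed per-pixel fat probability
    return prob.numpy() > 0.5  # boolean mask of target pixels
```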
Besides the neural network model, conventional classification methods can be used to identify adipose tissue. Specifically, adipose tissue may be identified based on pixel information in the target image: a target pixel for characterizing fat is determined based on the result of matching the color information of each pixel in the target image against a calibrated color range, where the calibrated color range is the color range of pixels characterizing fat. To achieve a better classification result, the pixel color information may include color information from several image format spaces, so that the judgment can be made more accurately across several dimensions. For example, an image obtained by endoscopic imaging mostly consists of yellow (fat-like regions) and red (blood vessels and viscera), so when identifying fat regions in such an image the separability of red and yellow matters most. In the LAB space of the image, the A value expresses the red-yellow distinction, while in the HSV space the color information is carried by the single H channel; on this basis, the pixel color information used for fat identification can include the A value of the LAB space and the H value of the HSV space, and numerous experiments show that this combination identifies adipose tissue conveniently and effectively. Additionally, the calibrated color range may be determined by cluster analysis of the pixel color information of pixels characterizing fat in pre-collected sample images, where the sample images contain fat regions, some of which may be stained with blood.
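The following is a minimal sketch of this traditional path, assuming OpenCV BGR input; the concrete H and A ranges are illustrative placeholders for the calibrated color range obtained by clustering sample images, not values disclosed in the application.

```python
# Sketch: fat-pixel mask from a calibrated color range over the H (HSV)
# and A (LAB) channels. The threshold ranges below are placeholders.
import cv2
import numpy as np

def fat_mask_by_color_range(bgr_image, h_range=(15, 35), a_range=(115, 135)):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)  # 8-bit H lies in [0, 179]
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)  # 8-bit A lies in [0, 255], 128 neutral
    h, a = hsv[..., 0], lab[..., 1]
    # a pixel counts as a target (fat) pixel only if both channels fall
    # inside the calibrated range
    return ((h >= h_range[0]) & (h <= h_range[1]) &
            (a >= a_range[0]) & (a <= a_range[1]))
```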
And step 103, estimating the bleeding dyeing degree based on the image pixel information of the adipose tissue, and obtaining a bleeding degree estimated value of the target image.
In an image obtained by endoscopic imaging, fat regions are stained red to varying degrees by bleeding, which affects the surgeon's judgment. In the present application, the bleeding degree is estimated from how strongly the fat regions in the image are stained red, providing a reference for the subsequent color recovery of the adipose tissue.
The process of specifically determining the bleeding degree evaluation value of the target image may include: first, the bleeding and staining degree of a target pixel identified as adipose tissue is acquired; then, the bleeding degree is estimated based on the bleeding dyeing degree of the target pixel, and a bleeding degree estimated value of the target image is obtained.
Optionally, the process of determining the bleeding-staining degree of a target pixel may include: for each target pixel, taking the weighted sum of the channel pixel values of that pixel in a single space as its bleeding-staining degree. The bleeding-staining degree chiefly measures how strongly the target pixel is stained red, so the weighting value of the red-related channel is set larger than those of the other channels, giving red the greatest influence on the result. The red-related channel may be, for example, the R value in RGB space, the A value in LAB space, or the V value in YUV space.
Alternatively, the estimation may be performed over the bleeding-staining degrees of all target pixels to obtain a bleeding degree estimated value for the whole target image. The specific processing may include: selecting part or all of the target pixels, based on their bleeding-staining degrees, to form a calculation data set; then performing a statistical calculation over the bleeding-staining degrees in the calculation data set and taking the result as the bleeding degree estimated value of the target image. Early experiments showed that within one target image (for example, one frame), the bleeding-staining degrees of the target pixels take many different values, and the extremely high or extremely low values disproportionately influence the estimate for the whole image; on this basis, part or all of the target pixels may optionally be selected to participate in the calculation. When only part of the target pixels participate, the target pixels whose bleeding-staining degree lies below the top n% and above the bottom n% may be used, where n may be a preset constant. Alternatively, when selecting part of the target pixels, n may be determined from the current scene information: for a small-cavity endoscope with heavier bleeding, such as in ear, nose, and throat procedures, the extreme values influence the estimate relatively strongly, so n can be taken somewhat higher; for a large-cavity endoscope with lighter bleeding, such as in thoracic or abdominal procedures, the influence of extreme values is smaller and n can be taken smaller.
After the calculation data set is determined, a statistical calculation is performed over the bleeding-staining degrees of the target pixels in it, and the statistical result serves as the bleeding degree estimated value of the target image. The statistics can be computed in various existing ways: for example, taking the mean of the calculation data set; or performing a cluster analysis on the calculation data set, finding the most central point as the feature point of the data set, and using the bleeding-staining degree of that feature point as the bleeding degree estimated value of the target image. When the cluster analysis searches for the central point, the distance measure includes but is not limited to Euclidean distance and Manhattan distance, and the calculation space includes but is not limited to RGB, LAB, XYZ, YUV, and the like.
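A minimal sketch of this trimmed statistic follows, assuming the bleeding-staining degrees of all target pixels have been gathered into a one-dimensional array; the choice of the mean and the default n are illustrative.

```python
# Sketch: drop the top and bottom n% of staining degrees, then average the rest.
import numpy as np

def bleeding_level_estimate(staining: np.ndarray, n: float = 10.0) -> float:
    lo, hi = np.percentile(staining, [n, 100.0 - n])
    kept = staining[(staining > lo) & (staining < hi)]  # exclude extremes
    if kept.size == 0:  # degenerate case (e.g. near-constant values): keep all
        kept = staining
    return float(kept.mean())
```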
As can be seen from the calculation above, computing the bleeding degree estimated value requires traversing the target pixels of the entire target image, which takes considerable time. In practice, however, the time interval between two adjacent frames is short, and the bleeding degree estimated value generally does not change much between them; the calculation can therefore alternatively use the frame preceding the one containing the target image. Specifically, the image region located at the same position, and of the same size, as the target image on the preceding frame is called the auxiliary image, and the bleeding-staining degrees of the auxiliary image's target pixels are used in place of those of the target image when computing the bleeding degree estimated value of the target image. The bleeding-staining degrees of the auxiliary image's target pixels can thus be computed in advance, while the previous frame is processed, so that when the target image is processed its bleeding degree estimated value can be computed immediately from already-available data, reducing computation latency and improving the user experience.
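A sketch of this previous-frame optimization follows, reusing bleeding_level_estimate from the sketch above; the class and method names are illustrative.

```python
# Sketch: cache the staining degrees computed while frame t-1 was processed,
# so the estimate for frame t needs no per-pixel traversal of frame t itself.
import numpy as np

class AuxiliaryStainingCache:
    def __init__(self):
        self._prev = None  # staining degrees of the previous frame's target pixels

    def estimate_for_current_frame(self, default: float = 0.0) -> float:
        if self._prev is None or self._prev.size == 0:
            return default  # first frame: no auxiliary image available yet
        return bleeding_level_estimate(self._prev)

    def update(self, staining: np.ndarray) -> None:
        # may run in parallel or in idle time, as long as it completes
        # before the next frame starts to be processed
        self._prev = staining
```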
In summary, the above-mentioned process of estimating the bleeding level estimation value of the target image by using the target image and the auxiliary image may be summarized as the manner of calculating the bleeding level estimation value of the target image:
acquiring a bleeding and staining degree of a target pixel identified as adipose tissue;
estimating based on the bleeding dyeing degree of each target pixel on the target image to obtain a bleeding degree estimated value of the target image; or estimating the bleeding dyeing degree of each target pixel on the auxiliary image to obtain the bleeding degree estimated value of the target image.
The process of estimating the bleeding degree estimated value of the target image based on the bleeding and dyeing degree of each target pixel on the target image may specifically include:
selecting part or all of target pixels from all target pixels on the target image based on the bleeding and dyeing degree of each target pixel on the target image to form a calculation data set;
and carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the calculation data set, and taking the calculation result as a bleeding degree estimated value of the target image.
The process of estimating the bleeding degree estimation value of the target image based on the bleeding and dyeing degree of each target pixel on the auxiliary image may specifically include:
Selecting part or all of target pixels in all of the target pixels of the auxiliary image based on the bleeding and staining degrees of the target pixels of the auxiliary image to form an auxiliary calculation data set;
and carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the auxiliary calculation data set, and taking the calculation result as a bleeding degree estimated value of the target image.
And 104, performing color recovery on the adipose tissue based on the bleeding degree estimated value of the target image.
After the bleeding degree estimated value of the whole target image is determined in step 103, color recovery of the adipose tissue characterizing fat can be performed to a corresponding extent based on that estimate. The present application provides an exemplary color recovery method, whose specific processing may include:
determining a color recovery intensity based on the bleeding level estimate of the target image;
processing color information of all target pixels of the target image, which are identified as adipose tissue, in a target space based on the color recovery intensity to obtain recovered color information;
determining pixel values of all target pixels in an original space of the target image by using the restored color information;
and outputting pixel values of all pixels of the target image in the original space.
In particular, the color recovery intensity may be determined from the bleeding degree estimated value of the target image alone, or further from current scene information and/or a level parameter input by the user. For example, for a small-cavity endoscope with heavier bleeding, such as in ear, nose, and throat procedures, the color recovery intensity may be set relatively high; for a large-cavity endoscope with lighter bleeding, such as in thoracic or abdominal procedures, it may be set relatively low. The user may also input a level parameter controlling the color recovery intensity according to personal needs; for example, a doctor may choose a higher level parameter to request stronger color recovery, or a lower one for weaker recovery, according to personal habits.
Then, the color information of all target pixels of the target image in the target space is processed based on the color recovery intensity to obtain the recovered color information. In general, different image spaces represent pixel values with different channel combinations. For the color recovery in this step, a target space is chosen to represent the color information of the adipose tissue, and the recovery is then carried out in that space according to the color recovery intensity. To simplify the recovery as much as possible and save processing resources, the color information of the target image can optionally be represented in the HSV space, where the color information of a target pixel is carried by the H component alone. During recovery, only the H value is processed to obtain the recovered H value, while the S and V values of the target pixel stay unchanged; the recovered H value together with the original S and V values is then converted from the HSV space back to the original space, yielding the recovered representation of the target pixel in the original space.
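A minimal end-to-end sketch of this H-only recovery follows, assuming OpenCV 8-bit HSV (H in [0, 179]) and a recovery strength normalized to [0, 1]; the target yellow hue and the linear pull toward it are illustrative stand-ins for the H-value processing, which the application specifies only in its embodiment.

```python
# Sketch: restore fat color by adjusting only the H channel of target pixels;
# S and V, and all non-target pixels, are left unchanged.
import cv2
import numpy as np

def recover_fat_color(bgr_image, fat_mask, strength: float):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).astype(np.float32)
    h = hsv[..., 0]
    target_h = 25.0  # assumed yellow hue for fat (placeholder, OpenCV scale)
    # pull H of target pixels toward yellow in proportion to the strength
    hsv[..., 0] = np.clip(np.where(fat_mask, h + strength * (target_h - h), h), 0, 179)
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```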
This ends the method flow shown in fig. 1. Through this series of processing, color recovery is achieved for all adipose tissue characterizing fat; the recovered adipose tissue is output together with the other features of the target image, producing a complete image with the fat regions restored, which effectively assists the doctor during endoscopic surgery.
The implementation of the fat color recovery method of the present application is described below through a specific example. In this embodiment, for convenience of description, the acquired target image is assumed to be a whole-frame RGB image, and the adipose tissue is described in terms of the target pixels characterizing fat. Fig. 2 is a flow chart of the fat color recovery method of this embodiment; as shown in fig. 2, the flow includes:
In step 201, a target image obtained by endoscopic imaging is acquired, and spatial conversion of the target image is performed.
In this embodiment, the target image is an RGB image. Because an RGB image represents color through three channels of data, handling the color information directly in the subsequent steps would be relatively complex. To simplify the processing, the RGB image is therefore first converted in this step into a space that represents color information with fewer channels, such as HSV, YUV, or YCbCr. This embodiment takes conversion into HSV space as an example. The conversion can be done in the existing manner; for example, the H value can be computed as:
H = 0, if max = min;
H = 60*(g - b)/(max - min) mod 360, if max = r;
H = 60*(b - r)/(max - min) + 120, if max = g;
H = 60*(r - g)/(max - min) + 240, if max = b    (1)
wherein r, g, and b are the pixel values of the three channels of the RGB image, and max and min are the maximum and minimum of the three channel values. The H value computed by formula (1) lies in [0, 360) (its physical unit is degrees); for convenience of later calculation it can be normalized to [0, 1), i.e. H_ori = H/360.
Because the S and V values of the HSV space may need no further processing in the subsequent steps, their conversion is not described here; they can be obtained with the existing conversion method.
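A per-pixel sketch of formula (1) together with the normalization H_ori = H/360 follows, assuming r, g, b in [0, 255].

```python
# Sketch: H value of formula (1), then normalized to [0, 1).
def h_ori(r: float, g: float, b: float) -> float:
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:
        h = 0.0  # achromatic pixel
    elif mx == r:
        h = (60.0 * (g - b) / (mx - mn)) % 360.0  # wraps negative values into [300, 360)
    elif mx == g:
        h = 60.0 * (b - r) / (mx - mn) + 120.0
    else:  # mx == b
        h = 60.0 * (r - g) / (mx - mn) + 240.0
    return h / 360.0  # H_ori
```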
Step 202, detecting all target pixels in the target image for characterizing fat.
As previously described, detection of the target pixel may be by way of a neural network, or by way of conventional cluster analysis.
Using the neural network approach, the whole frame can be input into a pre-trained fat recognition model, which outputs all recognized target pixels in the frame.
One way of performing target pixel detection in a conventional manner is given below:
In the target image, each pixel is compared against a calibrated color range to judge whether it is a target pixel for characterizing fat; specifically, the judgment can adopt the following formula (2):
Coe_Fat.determ = 1, if A in th_Fat1 and B in th_Fat2 and C in th_Fat3 and so on; otherwise Coe_Fat.determ = 0    (2)
wherein Coe_Fat.determ indicates whether the current pixel is a target pixel (1 means it is, 0 means it is not), A, B, C, and so on are the channel pixel values participating in the judgment, and th_Fat1, th_Fat2, th_Fat3, and so on are the per-channel threshold ranges of the calibrated color range. The calibrated color range can be obtained in advance by collecting sample images containing fat and performing cluster analysis on the color information of the pixels characterizing fat. Here the channel values A, B, C, and so on may come from different image spaces, and experiments show that using color information from several different image spaces effectively improves the recognition accuracy of target pixels. An example that jointly judges a pixel by the H value of the HSV space and the A value of the LAB space is given by formula (3):
Coe_Fat.determ = 1, if H_ori is in the calibrated H range and a_ori is in the calibrated A range; otherwise Coe_Fat.determ = 0    (3)
wherein H_ori is the normalized H value of the HSV space, a_ori is the A value of the LAB space, and Coe_Fat.determ indicates whether the current pixel is a target pixel (1 means it is, 0 means it is not). To complete the processing of formula (3), each pixel of the target image needs to be converted into the LAB space in advance and the corresponding A value computed. Experiments show that target pixels characterizing fat can be effectively identified through formula (3).
Through the processing of this step, all target pixels in the target image can be found.
Step 203, for each target pixel of the target image, determining its bleeding-staining degree.
In this embodiment, to speed up the fat color recovery, the bleeding degree estimated value of a target image is computed from the bleeding-staining degrees of the target pixels in the previous frame's auxiliary image. The processing in this step therefore prepares the data used to compute the bleeding degree estimated value of the next frame's target image.
Specifically, for each target pixel, determining the bleeding and staining level thereof may be performed according to formula (4):
L_blood.pixel = coe1*A + coe2*B + coe3*C + th1    (4)
wherein L_blood.pixel is the bleeding-staining degree of the target pixel; A, B, and C are the pixel values of the channels of a single image space, including but not limited to RGB, LAB, XYZ, YUV, and the like; coe1, coe2, and coe3 are the per-channel weighting values, with the weighting value of the red-related channel larger than those of the other channels; and th1 is a preset constant that can be set empirically.
Taking the LAB space as an example, formula (4) can be instantiated as the following formula (5):
L_blood.pixel = 0.2*L + 0.6*a + 0.1*b + 128    (5)
wherein L, a, and b are the pixel values of the target pixel in the three LAB channels; since the a channel is the one related to red, its weighting value is larger than those of the other two channels. Experiments show that the bleeding-staining degree computed by formula (5) effectively reflects how strongly the target pixel is stained, i.e. effectively characterizes the bleeding degree, and that applying it within the overall method yields a satisfactory color recovery effect.
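A vectorized sketch of formula (5) over a whole frame follows, assuming OpenCV 8-bit LAB channels; the th1 = 128 offset is kept verbatim from formula (5), so the channel encoding may need adaptation to the actual pipeline.

```python
# Sketch: per-pixel bleeding-staining degree map per formula (5),
# L_blood.pixel = 0.2*L + 0.6*a + 0.1*b + 128.
import cv2
import numpy as np

def staining_degree_map(bgr_image) -> np.ndarray:
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB).astype(np.float32)
    L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
    # the red-related a channel carries the dominant weight
    return 0.2 * L + 0.6 * a + 0.1 * b + 128.0
```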
Since the processing of step 203 prepares data for restoring the next frame, it may be performed in parallel with the subsequent steps or during idle time, as long as it completes before the next frame starts to be processed.
Step 204, based on the bleeding-staining degrees of the target pixels of the auxiliary image, selecting part of those target pixels to form a calculation data set.
As described above, in the present embodiment, in order to reduce the processing delay, the bleeding level estimation value of the target image is calculated using the bleeding dyeing level of the target pixel on the auxiliary image.
Specifically, among all target pixels of the auxiliary image, those whose bleeding-staining degree lies below the top n% and above the bottom n% are selected to form the calculation data set used to compute the bleeding degree estimated value of the target image. Here n may optionally be a preset constant set empirically (for example, n may be 10), or n may be determined from the current scene information.
In step 205, a statistical calculation is performed based on the bleeding and dyeing degree of each target pixel in the calculation data set, and the calculation result is used as a bleeding degree estimated value of the target image.
In the calculation data set, the mean of the bleeding-staining degrees of all target pixels is computed and taken as the bleeding degree estimated value of the target image. Alternatively, cluster analysis is performed on all target pixels of the calculation data set based on their bleeding-staining degrees; the most central element is found as the data-set feature point, and its bleeding-staining degree is taken as the bleeding degree estimated value of the target image.
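A sketch of the cluster-analysis variant follows, here with a single-cluster k-means whose center is matched back to the nearest actual sample; k = 1 and the Euclidean metric are the simplest illustrative choices among those the description permits.

```python
# Sketch: take the most central staining degree as the data-set feature point.
import numpy as np
from sklearn.cluster import KMeans

def feature_point_staining(staining: np.ndarray) -> float:
    x = staining.reshape(-1, 1).astype(np.float64)
    center = KMeans(n_clusters=1, n_init=10).fit(x).cluster_centers_[0, 0]
    # Euclidean distance in one dimension; pick the nearest real sample
    return float(staining[np.argmin(np.abs(staining - center))])
```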
Step 206, determining the color recovery intensity based on the bleeding level estimation value of the target image.
Most simply, the color recovery intensity may be determined based on an estimate of the bleeding level of the target image.
Alternatively, so that the fat color recovery method of the present application achieves a better recovery effect across images of different scenes and offers the user more choice, the color recovery intensity can optionally be determined not only from the bleeding degree estimated value of the target image but also from current scene information and/or a level parameter input by the user.
Specifically, in this embodiment the color recovery intensity level_yellowC is calculated using formula (6):
level_yellowC = (isp_params.Level_Yellow - L_blood) / coe1_ada - th1_ada    (6)
wherein L_blood is the bleeding degree estimated value calculated in step 205, and isp_params.Level_Yellow, coe1_ada, and th1_ada are either fixed preset constants or constants preset per recovery intensity gear.
In more detail, when the color recovery intensity is determined only from the bleeding degree estimated value, isp_params.Level_Yellow, coe1_ada, and th1_ada are fixed constants that can be set empirically. When the color recovery intensity is further determined from the current scene information and/or a level parameter input by the user, isp_params.Level_Yellow, coe1_ada, and th1_ada are constants preset for the different recovery intensity gears. The recovery intensity gear is determined from the current scene information and/or the user-input level parameter and may, for example, be high, medium, or low. For the high gear, isp_params.Level_Yellow may be taken as 100, coe1_ada (a module algorithm coefficient) as 50, and th1_ada (a lower-threshold coefficient) as 30; for the medium gear, isp_params.Level_Yellow may be 80, coe1_ada 40, and th1_ada 25; for the low gear, isp_params.Level_Yellow may be 60, coe1_ada 30, and th1_ada 20.
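A sketch of formula (6) using the gear constants quoted above follows; the dictionary layout and the non-negative floor on the result are illustrative assumptions.

```python
# Sketch: color recovery intensity per formula (6),
# level_yellowC = (isp_params.Level_Yellow - L_blood) / coe1_ada - th1_ada.
GEAR_CONSTANTS = {
    # gear: (Level_Yellow, coe1_ada, th1_ada), values quoted in the description
    "high":   (100.0, 50.0, 30.0),
    "medium": (80.0, 40.0, 25.0),
    "low":    (60.0, 30.0, 20.0),
}

def color_recovery_intensity(l_blood: float, gear: str = "medium") -> float:
    level_yellow, coe1_ada, th1_ada = GEAR_CONSTANTS[gear]
    return max((level_yellow - l_blood) / coe1_ada - th1_ada, 0.0)  # assumed floor at 0
```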
In step 207, color information of all target pixels of the target image in the target space is processed based on the color restoration intensity, so as to obtain restored color information.
In this embodiment, the recovered color information I_output in the target space can be determined according to formula (7):
I_output = coe1*I_in_1 + coe2*I_in_2 + ... + coen*I_in_n + th1    (7)
wherein, for a target pixel, I_in_1, I_in_2, ..., I_in_n are the pixel values of the channels related to color information in the target space; coe1, coe2, ..., coen are the weighting values corresponding to those channels and are related to the color recovery intensity level_yellowC; and th1 is a preset constant. For non-target pixels, the original pixel values are kept unchanged.
For ease of calculation, a specific manner of color information recovery is given below, taking the HSV space as the target space. All pixels of the target image were already converted into the HSV space in step 201, so the color recovery is performed directly in the HSV space in this step to determine the recovered pixel values.
Because only the H value in the HSV space carries the color information, only the H value needs to be recovered; the recovered H value H_output can be determined according to formula (8), wherein, for a target pixel, th_yellow is a preset constant that can be set empirically (for example, 0.1), and apart from the H value, the S and V values in the HSV space remain unchanged. For non-target pixels, the original HSV pixel values are kept unchanged.
At step 208, pixel values of all target pixels in the original space of the target image are determined using the recovered color information.
For all target pixels, spatial conversion is performed using the recovered color information together with the pixel values of the other channels of the target space, converting them back into the original space of the target image.
For non-target pixels, the target-space pixel values are likewise converted back to the original space. The conversion itself can be done in the existing manner.
This ends the method flow of this embodiment. As this specific implementation shows, traversing the target image identifies adipose tissue effectively, down to pixel-level precision; meanwhile, the bleeding degree estimation inside the endoscopic image cavity intelligently recognizes the current scene and adjusts the recovery intensity of the adipose tissue accordingly, so the image processing better matches the application scene, the processed image is smoother, and the restored fat color is closer to human visual perception.
The application also provides a device for recovering the fat color in the endoscopic imaging, and fig. 3 is a basic structural schematic diagram of the fat color recovering device. As shown in fig. 3, the apparatus includes: a target image acquisition unit, an adipose tissue detection unit, a bleeding degree estimation unit and a color recovery unit;
A target image acquisition unit for acquiring a target image obtained by endoscopic imaging, the target image including adipose tissue stained with blood;
an adipose tissue detection unit for identifying adipose tissue from the target image;
the bleeding degree estimation unit is used for estimating bleeding dyeing degree based on the image pixel information of the adipose tissue to obtain a bleeding degree estimated value of the target image;
and a color recovery unit for performing color recovery on the adipose tissue based on the bleeding degree estimation value.
Optionally, in the adipose tissue detection unit, the process of identifying adipose tissue from the target image may specifically include:
identifying the adipose tissue from the target image using the fat recognition model; or identifying the adipose tissue based on the pixel information of the target image,
the fat recognition model can be obtained by training a fat sample image, and the pixel information can comprise color information.
Optionally, the device further comprises a training unit for training the neural network model with pre-collected sample images to obtain the fat recognition model;
the neural network model may include a Fast R-CNN model, a YOLO model, or an SSD model, among others.
Optionally, in the adipose tissue detection unit, the process of identifying adipose tissue based on the pixel information of the target image may specifically include:
a target pixel for characterizing fat is determined based on a result of matching pixel color information of each pixel in the target image with a calibrated color range for characterizing a color range of the pixel of fat.
To more accurately identify adipose tissue, optionally, pixel color information may include color information of pixels in multiple image format spaces. The color information of the pixel in the multiple image format spaces can include an A value of an LAB space and an H value of an HSV space.
Alternatively, the calibration color range may be determined by performing a cluster analysis of pixel color information for pixels used to characterize fat in a pre-collected sample image.
Optionally, in the bleeding level estimation unit, the processing for estimating the bleeding level based on the image pixel information of the adipose tissue to obtain the bleeding level estimation value of the target image may specifically include:
acquiring a bleeding and staining degree of a target pixel identified as adipose tissue;
estimating based on the bleeding dyeing degree of each target pixel on the target image to obtain a bleeding degree estimated value of the target image; or estimating the bleeding dyeing degree of each target pixel on the auxiliary image to obtain a bleeding degree estimated value of the target image;
the auxiliary image is an image region located at the same position, and of the same size, as the target image on the frame preceding the frame in which the target image is located.
Alternatively, in the bleeding level estimation unit, the process of acquiring the bleeding coloring level of the target pixel identified as the adipose tissue may include:
for each target pixel, taking the weighted sum of the channel pixel values of the target pixel in a single space as its bleeding-staining level, wherein the weighting value of the red-related channel is larger than the weighting values of the other channels.
Optionally, in the bleeding level estimation unit, the process of estimating the bleeding level estimation value of the target image based on the bleeding dyeing level of each target pixel on the target image may specifically include:
selecting part or all of target pixels from all target pixels on the target image based on the bleeding and dyeing degree of each target pixel on the target image to form a calculation data set;
carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the calculation data set, and taking the calculation result as a bleeding degree estimated value of the target image;
and/or,
in the bleeding level estimation unit, the process of estimating the bleeding level estimation value of the target image based on the bleeding dyeing level of each target pixel on the auxiliary image may specifically include:
Selecting part or all target pixels from all target pixels of the auxiliary image based on the bleeding and dyeing degree of each target pixel on the auxiliary image to form an auxiliary calculation data set;
and carrying out statistical calculation based on the bleeding dyeing degree of each target pixel in the auxiliary calculation data set, and taking a calculation result as a bleeding degree estimated value of the target image.
Optionally, in the bleeding level estimation unit, selecting part or all of the target pixels may be further performed based on the current scene information.
Optionally, in the bleeding level estimation unit, the process of selecting a part of the target pixels may specifically include:
selecting a target pixel with bleeding dyeing degree less than the first n% and greater than the last n%; wherein n is determined from the current scene information.
Alternatively, the statistical calculation may specifically be: and calculating an average value, or performing cluster analysis to obtain a central point.
Optionally, in the color recovery unit, the processing for performing color recovery on the adipose tissue may specifically include:
determining a color recovery intensity based on the bleeding level estimate of the target image;
processing color information of all target pixels of the target image, which are identified as adipose tissue, in a target space based on the color recovery intensity to obtain recovered color information;
And determining pixel values of all target pixels in the original space of the target image by using the recovered color information.
Optionally, in the color recovery unit, the processing for performing color recovery on the adipose tissue based on the bleeding level estimation value may specifically include:
and determining color restoration intensity based on the current scene information and/or the level parameters input by the user, and performing color restoration on the adipose tissue.
Optionally, in the color recovery unit, the process of determining the color recovery intensity level_yellowC may specifically include:
calculating level_yellowC = (isp_params.Level_Yellow - L_blood) / coe1_ada - th1_ada, wherein the recovery intensity gear is determined based on current scene information and/or a level parameter input by the user, and isp_params.Level_Yellow, coe1_ada, and th1_ada are constants preset for the different recovery intensity gears.
Alternatively, in the color recovery unit, the recovery intensity gear may be determined based on current scene information and/or a level parameter entered by a user.
Optionally, in the color recovery unit, performing recovery processing on color information of all target pixels of the target image identified as adipose tissue in the target space may specifically include:
Acquiring a pixel value of each target pixel in an HSV space, and recovering the H value H_ori of each target pixel to obtain a recovered H value;
determining pixel values of all target pixels in an original space of the target image by using the restored color information specifically may include:
for each target pixel, converting the pixel value of the HSV space into the pixel value of the original space; in the pixel values of the HSV space, the S value and the V value are kept unchanged, and the H value is recovered.
The present application also provides a computer-readable storage medium storing instructions which, when executed by a processor, implement the steps of the fat color recovery method described above. In practice, the computer-readable medium may be included in the apparatus/device/system of the above embodiments, or it may exist separately without being incorporated into that apparatus/device/system.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, including, for example but not limited to: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing, without limiting the scope of the application. In the disclosed embodiments, the computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 4, specifically:
the electronic device may include a processor 401 of one or more processing cores, a memory 402 of one or more computer readable storage media, and a computer program stored on the memory and executable on the processor. The method of fat color recovery may be implemented when the program of the memory 402 is executed.
Specifically, in practical applications, the electronic device may further include a power supply 403, an input/output unit 404, and other components. Those skilled in the art will appreciate that the configuration shown in fig. 4 does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components. Wherein:
the processor 401 is the control center of the electronic device; it connects the various parts of the electronic device using various interfaces and lines, and performs the functions of the device and processes data by running or executing the software programs and/or modules stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the electronic device as a whole.
Memory 402 may be used to store software programs and modules, i.e., the computer-readable storage medium described above. The processor 401 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, application programs required for at least one function, and the like; the data storage area may store data created during use of the device, and the like. In addition, the memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 with access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components. The power supply 403 may be logically connected to the processor 401 through a power management system, so that charging, discharging, and power-consumption management are handled by the power management system. The power supply 403 may also include one or more of a direct-current or alternating-current power supply, a recharging system, a power-failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device may also include an input/output unit 404, which may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, or optical signal inputs related to user settings and function control. The input/output unit 404 may also be used to display information entered by or provided to the user, as well as various graphical user interfaces, which may be composed of graphics, text, icons, video, and any combination thereof.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit it; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (11)

1. A method for recovering fat color in endoscopic imaging, comprising:
acquiring a target image obtained by endoscopic imaging, the target image including adipose tissue stained with blood;
identifying the adipose tissue from the target image;
estimating a bleeding staining degree based on image pixel information of the adipose tissue to obtain a bleeding degree estimate of the target image;
and performing color recovery on the adipose tissue based on the bleeding degree estimate.
2. The method of claim 1, wherein the identifying the adipose tissue from the target image comprises:
identifying the adipose tissue from the target image using a fat recognition model; or identifying the adipose tissue based on pixel information of the target image,
wherein the fat recognition model is trained on fat sample images, and the pixel information comprises color information.
3. The method of claim 2, wherein identifying the adipose tissue based on pixel information of the target image comprises:
determining target pixels representing fat based on a result of matching the color information of each pixel in the target image against a calibrated color range, wherein the calibrated color range represents the color range of fat pixels.
4. The method of claim 1, wherein estimating the bleeding staining degree based on the image pixel information of the adipose tissue to obtain the bleeding degree estimate of the target image comprises:
acquiring a bleeding staining degree of each target pixel identified as the adipose tissue;
and estimating based on the bleeding staining degree of each target pixel on the target image to obtain the bleeding degree estimate of the target image, or estimating based on the bleeding staining degree of each target pixel on an auxiliary image to obtain the bleeding degree estimate of the target image,
wherein the auxiliary image is an image, in a frame preceding the frame of the target image, that is located at the same position and has the same size as the target image.
5. The method of claim 4, wherein acquiring the bleeding staining degree of each target pixel identified as the adipose tissue comprises:
for each target pixel, taking a weighted sum of the channel pixel values of the target pixel in a single space as the bleeding staining degree, wherein the weight of the red-related channel is greater than the weights of the other channels.
6. The method of claim 4, wherein estimating based on the bleeding staining degree of each target pixel on the target image to obtain the bleeding degree estimate of the target image comprises:
selecting some or all of the target pixels on the target image, based on the bleeding staining degree of each target pixel on the target image, to form a calculation data set;
and performing a statistical calculation based on the bleeding staining degree of each target pixel in the calculation data set, and taking the calculation result as the bleeding degree estimate of the target image;
and/or,
estimating based on the bleeding staining degree of each target pixel on the auxiliary image to obtain the bleeding degree estimate of the target image comprises:
selecting some or all of the target pixels of the auxiliary image, based on the bleeding staining degree of each target pixel on the auxiliary image, to form an auxiliary calculation data set;
and performing a statistical calculation based on the bleeding staining degree of each target pixel in the auxiliary calculation data set, and taking the calculation result as the bleeding degree estimate of the target image.
7. The method of claim 1, wherein the color recovery of the adipose tissue comprises:
determining a color recovery intensity based on the bleeding degree estimate of the target image;
processing, based on the color recovery intensity, the color information in a target space of all target pixels of the target image identified as adipose tissue, to obtain recovered color information;
and determining pixel values of all the target pixels in an original space of the target image by using the recovered color information.
8. The method of claim 7, wherein performing color recovery on the adipose tissue based on the bleeding degree estimate comprises:
determining the color recovery intensity based on current scene information and/or a level parameter input by the user, and performing color recovery on the adipose tissue accordingly.
9. A fat color restoration device for endoscopic imaging, comprising: a target image acquisition unit, an adipose tissue detection unit, a bleeding degree estimation unit, and a color recovery unit;
the target image acquisition unit is configured to acquire a target image obtained by endoscopic imaging, the target image including adipose tissue stained with blood;
the adipose tissue detection unit is configured to identify the adipose tissue from the target image;
the bleeding degree estimation unit is configured to estimate a bleeding staining degree based on image pixel information of the adipose tissue to obtain a bleeding degree estimate of the target image;
and the color recovery unit is configured to perform color recovery on the adipose tissue based on the bleeding degree estimate.
10. A computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the method for recovering fat color in endoscopic imaging according to any one of claims 1 to 8.
11. An electronic device comprising at least a computer-readable storage medium and a processor;
the processor is configured to read executable instructions from the computer-readable storage medium and execute the instructions to implement the method of recovering fat color in endoscopic imaging as defined in any one of claims 1 to 8.
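For readers mapping the claims onto an implementation, a minimal Python sketch of the identification and estimation steps recited in claims 3, 5 and 6; the calibrated color range, the channel weights, and the "most-stained fraction" statistic are illustrative assumptions, not values disclosed by the application:

```python
import numpy as np

def fat_mask_by_color(bgr: np.ndarray) -> np.ndarray:
    """Claim 3 sketch: match each pixel against a calibrated fat color range.
    The BGR bounds below are hypothetical; the application does not disclose them."""
    lower = np.array([120, 160, 180], dtype=np.uint8)  # B, G, R lower bounds
    upper = np.array([200, 230, 255], dtype=np.uint8)  # B, G, R upper bounds
    return np.all((bgr >= lower) & (bgr <= upper), axis=-1)

def bleeding_staining_degree(bgr: np.ndarray) -> np.ndarray:
    """Claim 5 sketch: per-pixel weighted sum of channel values in a single
    space, with the red-related channel weighted most heavily (weights assumed)."""
    b = bgr[..., 0].astype(np.float32)
    g = bgr[..., 1].astype(np.float32)
    r = bgr[..., 2].astype(np.float32)
    return 0.6 * r + 0.25 * g + 0.15 * b

def bleeding_degree_estimate(bgr: np.ndarray, fat_mask: np.ndarray,
                             top_fraction: float = 0.2) -> float:
    """Claim 6 sketch: select part of the target pixels by staining degree and
    take a statistic over them (mean of the most-stained fraction, an assumption)."""
    degrees = bleeding_staining_degree(bgr)[fat_mask]
    if degrees.size == 0:
        return 0.0
    k = max(1, int(top_fraction * degrees.size))
    return float(np.mean(np.sort(degrees)[-k:]))
```

In this reading, the returned estimate plays the role of l_blood in the recovery-intensity formula given in the description above.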
CN202310875953.1A 2023-07-17 2023-07-17 Method and device for repairing fat color of endoscope, storage medium and electronic equipment Pending CN116912122A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310875953.1A CN116912122A (en) 2023-07-17 2023-07-17 Method and device for repairing fat color of endoscope, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116912122A true CN116912122A (en) 2023-10-20

Family

ID=88359777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310875953.1A Pending CN116912122A (en) 2023-07-17 2023-07-17 Method and device for repairing fat color of endoscope, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116912122A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination