CN117575924A - Visible light and near infrared fluorescence image fusion method of unified model - Google Patents


Info

Publication number
CN117575924A
CN117575924A
Authority
CN
China
Prior art keywords
image
near infrared
fusion
color
infrared fluorescence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311492954.4A
Other languages
Chinese (zh)
Inventor
Chi Chongwei (迟崇巍)
He Kunshan (何坤山)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Digital Precision Medicine Co
Original Assignee
Beijing Digital Precision Medicine Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Digital Precision Medicine Co filed Critical Beijing Digital Precision Medicine Co
Priority to CN202311492954.4A priority Critical patent/CN117575924A/en
Publication of CN117575924A publication Critical patent/CN117575924A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/20: Image enhancement or restoration using local operators
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10048: Image acquisition modality: infrared image
    • G06T2207/10064: Image acquisition modality: fluorescence image
    • G06T2207/20032: Median filtering
    • G06T2207/20221: Image fusion; image merging
    • G06T2207/30004: Biomedical image processing


Abstract

The invention provides a unified-model method for fusing visible-light and near-infrared fluorescence images. The method comprises: registering synchronously acquired near-infrared fluorescence and visible-light images pixel by pixel; filtering and denoising the registered images; applying a gamma transform to the normalized near-infrared fluorescence image; pseudo-colorizing the gamma-transformed fluorescence image to generate a pseudo-color RGB image; transforming the denoised visible-light image and the pseudo-color RGB image into a preset color space; performing weighted fusion on the images in that color space; and inversely transforming the fused image from the preset color space back to the RGB color space to obtain the final fusion result. A single set of model parameters serves all fusion color modes, overcoming shortcomings of existing fusion methods such as loss of fused information, unnatural transitions in the fusion result, and incomplete decoupling of the brightness and color of the fused image.

Description

Visible light and near infrared fluorescence image fusion method of unified model
Technical Field
The invention relates to the technical field of medical imaging and image processing, and in particular to a unified-model method for fusing visible-light and near-infrared fluorescence images.
Background
Dynamic visualization of the intraoperative lesion region can provide accurate surgical guidance to the surgeon. Conventional detection techniques (e.g., CT, MRI) have difficulty providing real-time, clear localization of lesions during surgery. With the steady improvement of camera imaging and manufacturing technology, a color visible-light camera can meet a surgeon's intraoperative requirements for resolution, imaging quality, and frame rate, but a visible-light camera alone cannot acquire spectral information outside the visible band. With only a visible-light camera, therefore, a surgeon can localize lesions only by visual observation and experience. If pathological cells are labeled with a targeted fluorescent probe (such as indocyanine green), they emit near-infrared light in a specific band under an excitation light source. A near-infrared camera can collect and image this excited near-infrared emission, revealing details (such as polyps, tumors, and even blood vessels) that a visible-light camera cannot clearly distinguish. Near-infrared fluorescence molecular imaging offers high resolution, minimal invasiveness, and fast imaging, making it well suited to highlighting a probe-labeled lesion region in real time during surgery. A near-infrared camera alone, however, cannot convey the state of human organs under visible light (including color, shape, and surface texture).
If a visible-light camera and a near-infrared camera image multiple spectra synchronously, the natural appearance of an organ under visible illumination can be captured while the fluorescently labeled pathological region is accurately localized, presenting rich pathological detail.
The visible-light image and the near-infrared fluorescence image each carry characteristic information from their respective spectral ranges, and fusing them can help surgeons make accurate, rapid judgments. Fusion of the two images refers to pixel-level image fusion at the corresponding coordinate positions of the near-infrared fluorescence image and the visible-light image. Existing image fusion methods either fill a color channel of the visible-light image (e.g., the G channel of the RGB color space) directly with the gray values of the near-infrared fluorescence image, or simply linearly weight and merge the two images in the RGB color space. In terms of fusion quality, existing methods have the following disadvantages:
(1) Where the near-infrared fluorescence signal is strong, the detail textures of the visible-light image are easily covered by the fusion color, so the detail of the visible-light image and the intensity of the near-infrared fluorescence image cannot be presented at the same time.
(2) The boundary of the fusion region transitions unnaturally and shows obvious jagged artifacts. This is because most existing methods use hard threshold parameters in the fusion process, which are neither adaptive nor general.
(3) Brightness and color influence each other during fusion and cannot be given independent fusion strategies. This is because most existing methods work in the RGB color space and cannot decouple color from brightness.
(4) The fusion model is not unified. That is, the parameters of one fusion model cannot be applied across all fusion color modes. Moreover, most methods support only single-color fusion and are unsuited to layered pseudo-color fusion modes that present the intensity variation of the near-infrared fluorescence signal.
Disclosure of Invention
To solve the problems of existing fusion methods, the invention provides a unified-model method for fusing visible-light and near-infrared fluorescence images: a single set of model parameters serves all fusion color modes, overcoming shortcomings of existing methods such as loss of fused information, unnatural transitions in the fusion result, and incomplete decoupling of the brightness and color of the fused image.
The invention realizes the above purpose through the following technical scheme:
a method for fusing visible light and near infrared fluorescence images of a unified model comprises the following steps:
synchronously collecting a visible light image and a near infrared fluorescence image;
transforming the near infrared fluorescence image into the same coordinate system as the visible light image, and registering the near infrared fluorescence image and the visible light image pixel by pixel;
filtering and denoising the registered visible light image and the near infrared fluorescence image, and normalizing the filtered and denoised near infrared fluorescence image;
gamma conversion is carried out on the near infrared fluorescence image after normalization treatment;
pseudo-colorizing the gamma-converted near infrared fluorescent image based on the RGB color space to generate a pseudo-color RGB image, wherein the pseudo-colorizing model comprises a single-color mode and a layered pseudo-color fusion mode;
transforming the filtered and noise-reduced visible light image and the pseudo-color RGB image into a preset color space with decoupled brightness and color, so as to realize the separation of brightness and color channels;
carrying out weighted fusion on the image subjected to the transformation of the preset color space;
and inversely transforming the fused image from the preset color space to the RGB color space to obtain a final fusion result.
According to the unified-model fusion method provided by the invention, during filtering and denoising, a median filtering algorithm is applied to the registered visible-light and near-infrared fluorescence images: a 3×3 filter window is defined and the border region is handled by zero padding. The filtered and denoised near-infrared fluorescence image is denoted I_ir1.
According to the unified-model fusion method provided by the invention, the near-infrared fluorescence image I_ir1 is normalized to the [0, 1] range by formula (1):
I_ir2 = (I_ir1 − min(I_ir1)) / (max(I_ir1) − min(I_ir1))    (1)
where max(I_ir1) and min(I_ir1) are the maximum and minimum gray values of I_ir1, and I_ir2 is the normalized near-infrared fluorescence image.
According to the unified-model fusion method provided by the invention, in the gamma transform step the normalized near-infrared fluorescence image I_ir2 undergoes a gamma transform for nonlinear gray-level stretching, expressed as formula (2):
I_irn = (I_ir2)^γ,  I_ir3 = 255 × I_irn    (2)
where I_irn and I_ir3 are the gamma-transformed images with gray ranges [0, 1] and [0, 255] respectively, and γ is the computed gamma transform coefficient.
According to the unified-model fusion method provided by the invention, during pseudo-colorization, if the single-color fusion mode is set, a monochromatic image of the same size as the near-infrared fluorescence image I_ir3 is generated; if the layered pseudo-color fusion mode is set, the gray value of I_ir3 is used as an index to generate a pseudo-color RGB image whose colors track the gray-level intensity of I_ir3. This is expressed as formula (3):
pseudo-color RGB image = Lut(I_ir3)    (3)
When the single-color fusion mode is set, the Lut function in formula (3) is a constant function of the corresponding color; when the layered pseudo-color fusion mode is set, Lut is a 256-entry color look-up table.
According to the unified-model fusion method provided by the invention, the visible-light image and the pseudo-color RGB image are each transformed into the HSV color space, in which brightness and color are decoupled, giving their HSV representations as in formula (4):
V = Max
S = (Max − Min) / Max  (S = 0 when Max = 0)
H = 60 × (G − B) / (Max − Min),           if Max = R
H = 60 × (B − R) / (Max − Min) + 120,     if Max = G
H = 60 × (R − G) / (Max − Min) + 240,     if Max = B    (4)
where Max and Min are the maximum and minimum of the R, G, B channels of the image in the RGB color space, and H, S, V are the computed hue, saturation, and brightness channel values, respectively.
According to the unified-model fusion method provided by the invention, during weighted fusion the three H, S, V channels of the visible-light image and the three H, S, V channels of the pseudo-color RGB image are fused by weighting, channel by channel, yielding the H, S, V channel values of the fused image.
According to the unified-model fusion method provided by the invention, the fusion strategy is expressed as formula (5). In the fusion process embodied in formula (5), a sinusoidal (sine-function) dynamic fusion coefficient is set, which resolves the jagged boundaries and unnatural transitions of the fusion region.
According to the unified-model fusion method provided by the invention, in the inverse color-space transform the fused image is transformed from the HSV color space back to the RGB color space to obtain the final fusion result; the conversion is expressed as formula (6), whose output gives the values of the R, G, B channels of the final fused image.
According to the unified-model fusion method provided by the invention, the preset color space includes, but is not limited to, the HSV, YUV, HSL, and YCbCr color spaces.
It can be seen that the invention has the following beneficial effects over the prior art:
1. The invention provides a single unified model applicable to all fusion color modes; the fusion process needs no hard threshold parameters or specially tuned hyperparameters, so the model generalizes better.
2. The invention decouples the color and brightness of the images during fusion, so the brightness and color fusion schemes can be configured flexibly and independently, while preserving both the detail of the visible-light image and the intensity of the near-infrared fluorescence signal.
3. The invention designs a sinusoidal nonlinear dynamic fusion coefficient that eliminates the jagged fusion boundary at its root, making the fusion effect more realistic with natural boundary transitions.
4. Besides single-color fusion modes, the invention can present the intensity variation of the near-infrared fluorescence signal with a layered pseudo-color fusion mode.
The present invention also provides an electronic device including:
a memory storing computer executable instructions;
a processor configured to execute the computer-executable instructions,
wherein the computer executable instructions, when executed by the processor, implement the steps of the method for fusion of visible light and near infrared fluorescence images of the unified model of any one of the above.
The invention also provides a storage medium, wherein the storage medium is stored with a computer program, and the computer program is used for realizing the steps of the method for fusing the visible light and the near infrared fluorescence images of the unified model.
Thus, the invention also provides an electronic device and a storage medium for the unified-model visible-light and near-infrared fluorescence image fusion method, comprising one or more memories and one or more processors. The memory stores the program code, intermediate data generated while the program runs, model outputs, and model parameters; the processor supplies the processing resources for running the code and for training the model.
The invention is described in further detail below with reference to the drawings and the detailed description.
Drawings
FIG. 1 is a flow chart of an embodiment of a unified model visible and near infrared fluorescence image fusion method of the present invention.
Fig. 2 is a schematic diagram of a visible light image and a near infrared fluorescence image after synchronous acquisition and registration in an embodiment of a unified model of a method for fusing visible light and near infrared fluorescence images according to the present invention.
FIG. 3 is a schematic diagram of a unified model of visible and near infrared fluorescence image fusion method according to an embodiment of the invention with respect to the visible and near infrared fluorescence images after median filtering.
FIG. 4 is a schematic diagram of a unified model of a fusion method of visible and near infrared fluorescence images according to an embodiment of the invention, with respect to a monochromatic mode and a layered pseudo-color fusion mode.
Fig. 5 is a schematic diagram of a fusion result of two color modes in an embodiment of a method for fusing visible light and near infrared fluorescence images of a unified model according to the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Image registration (also called image matching or image correlation) can be defined as the spatial and gray-scale transformation between two images: a pixel coordinate X of one image is mapped to a coordinate X' in a new coordinate system and its pixel value is resampled. Registration requires that the two images partly depict the same target region; this is the fundamental condition for registration. Once the correspondence between the pixels of two images of the same scene target is determined, registration can be carried out with a suitable processing algorithm.
The gamma transform is mainly used for image correction: it corrects images whose gray levels are too high or too low, enhancing contrast.
The pseudo color process (pseudo color) refers to a process of assigning color values to gray values according to a certain criterion. Macroscopically, it is the conversion of a black and white image into a color image, or of a monochrome image into an image of a given color distribution. Since the resolution of the human eye for color is much higher than the resolution for gray, converting a gray image into a color representation can improve the discrimination of the image details.
Referring to fig. 1 to 5, the present invention provides a method for fusing visible light and near infrared fluorescence images of a unified model, comprising the steps of:
step S1, synchronously collecting a visible light image and a near infrared fluorescence image;
s2, converting the near infrared fluorescent image into the same coordinate system as the visible light image, and registering pixels between the near infrared fluorescent image and the visible light image, so as to realize pixel alignment of the visible light image and the near infrared fluorescent image;
s3, filtering and denoising the registered visible light image and the near infrared fluorescence image, and normalizing the filtered and denoised near infrared fluorescence image;
s4, gamma conversion is carried out on the near infrared fluorescence image after normalization processing;
s5, pseudo-colorizing the near infrared fluorescent image after gamma conversion based on an RGB color space to generate a pseudo-color RGB image, wherein a pseudo-colorizing model comprises a single-color mode and a layered pseudo-color fusion mode;
s6, converting the filtered and noise-reduced visible light image and the pseudo-color RGB image into a preset color space with decoupled brightness and color, and realizing the separation of brightness and color channels;
s7, carrying out weighted fusion on the image subjected to the transformation of the preset color space;
and S8, inversely transforming the fused image from a preset color space to an RGB color space to obtain a final fusion result.
In step S3, during filtering and denoising, a median filtering algorithm is applied to the registered visible-light and near-infrared fluorescence images: a 3×3 filter window is defined and the border region is handled by zero padding. The conventional 3×3 window used in this embodiment avoids the damage a large filter window would do to the near-infrared fluorescence image, while suppressing the interference of the camera's random noise with the subsequent fusion process.
In this embodiment, the filtered and denoised near-infrared fluorescence image is denoted I_ir1.
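The filtering step above can be sketched in NumPy. This is only an illustrative implementation (the patent gives no code): a 3×3 median filter with the zero-padded borders described in this embodiment; the function name is ours.

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter with zero padding at the borders (illustrative)."""
    padded = np.pad(img.astype(np.float64), 1, mode="constant", constant_values=0)
    h, w = img.shape
    # Stack the 9 shifted views covering each pixel's 3x3 neighborhood,
    # then take the median across them.
    windows = np.stack([padded[r:r + h, c:c + w]
                        for r in range(3) for c in range(3)])
    return np.median(windows, axis=0)
```

A lone impulse (e.g. a hot pixel) surrounded by uniform values is replaced by the neighborhood value, which is exactly the camera-noise suppression the embodiment relies on.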
Then the near-infrared fluorescence image I_ir1 is normalized to the [0, 1] range by formula (1):
I_ir2 = (I_ir1 − min(I_ir1)) / (max(I_ir1) − min(I_ir1))    (1)
where max(I_ir1) and min(I_ir1) are the maximum and minimum gray values of I_ir1, and I_ir2 is the normalized near-infrared fluorescence image.
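Formula (1) is plain min-max normalization and can be sketched as follows; the guard against a constant image (where max equals min) is an assumption we add, not something the patent text specifies.

```python
import numpy as np

def normalize(ir1):
    """Min-max normalization to [0, 1], as in formula (1)."""
    ir1 = ir1.astype(np.float64)
    lo, hi = ir1.min(), ir1.max()
    if hi == lo:                       # constant image: avoid division by zero
        return np.zeros_like(ir1)
    return (ir1 - lo) / (hi - lo)
```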
In step S4, the normalized near-infrared fluorescence image I_ir2 undergoes a gamma transform for nonlinear gray-level stretching, which gives a significant gain to images whose overall intensity is weak. It is expressed as formula (2):
I_irn = (I_ir2)^γ,  I_ir3 = 255 × I_irn    (2)
where I_irn and I_ir3 are the gamma-transformed images with gray ranges [0, 1] and [0, 255] respectively, and γ is the computed gamma transform coefficient.
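A minimal sketch of the gamma step of formula (2). How the patent computes γ is not specified in this text, so γ appears here as an ordinary parameter; a value below 1 boosts weak fluorescence, matching the gain effect described above.

```python
import numpy as np

def gamma_transform(ir2, gamma=0.6):
    """Formula (2): I_irn = I_ir2 ** gamma, then rescaled to [0, 255].

    gamma is a plain parameter here (the patent's computation of it
    is not given); gamma < 1 amplifies weak signals."""
    irn = np.clip(ir2, 0.0, 1.0) ** gamma
    return np.round(255.0 * irn)
```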
In step S5, during pseudo-colorization, if the single-color fusion mode is set, a monochromatic image (e.g. green, blue, magenta) of the same size as the near-infrared fluorescence image I_ir3 is generated; if the layered pseudo-color fusion mode is set, the gray value of I_ir3 is used as an index to generate a pseudo-color RGB image whose colors track the gray-level intensity of I_ir3. This is expressed as formula (3):
pseudo-color RGB image = Lut(I_ir3)    (3)
When the single-color fusion mode is set, the Lut function in formula (3) is a constant function of the corresponding color; when the layered pseudo-color fusion mode is set, Lut is a 256-entry color look-up table.
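The LUT indexing of formula (3) can be illustrated as follows. The green table stands in for a single-color mode and the ramped table for a layered mode; both tables are illustrative assumptions, since the patent's actual color tables are not reproduced in this text.

```python
import numpy as np

def pseudo_colorize(ir3, lut):
    """Formula (3): the 8-bit gray value of I_ir3 indexes a 256-entry
    RGB look-up table, giving an (H, W, 3) pseudo-color image."""
    lut = np.asarray(lut, dtype=np.uint8)      # shape (256, 3)
    return lut[ir3.astype(np.uint8)]

# Single-color (green) mode: every gray level maps to pure green.
green_lut = np.zeros((256, 3), dtype=np.uint8)
green_lut[:, 1] = 255

# An example layered table: color shifts with intensity (illustrative only).
layered_lut = np.zeros((256, 3), dtype=np.uint8)
layered_lut[:, 0] = np.arange(256)             # red grows with intensity
layered_lut[:, 2] = 255 - np.arange(256)       # blue fades with intensity
```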
The preset color space includes, but is not limited to, HSV, YUV, HSL, YCbCr and other color spaces. For convenience of description, the present embodiment is described in detail by taking HSV color space as an example, but is not intended to limit the applicable scope of the present invention.
In step S6, the visible-light image and the pseudo-color RGB image are each transformed into the HSV color space, in which brightness and color are decoupled, giving their HSV representations as in formula (4):
V = Max
S = (Max − Min) / Max  (S = 0 when Max = 0)
H = 60 × (G − B) / (Max − Min),           if Max = R
H = 60 × (B − R) / (Max − Min) + 120,     if Max = G
H = 60 × (R − G) / (Max − Min) + 240,     if Max = B    (4)
where Max and Min are the maximum and minimum of the R, G, B channels of the image in the RGB color space, and H, S, V are the computed hue, saturation, and brightness channel values, respectively.
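Formula (4) is the standard RGB-to-HSV conversion; a per-pixel sketch follows, with channel values assumed in [0, 1]. A real pipeline would vectorize this over the whole image or use an image library.

```python
def rgb_to_hsv(r, g, b):
    """Standard RGB -> HSV for one pixel, matching formula (4):
    V = Max, S = (Max - Min)/Max, H from the dominant channel."""
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx
    s = 0.0 if mx == 0 else (mx - mn) / mx
    if mx == mn:
        h = 0.0                                   # achromatic pixel
    elif mx == r:
        h = (60.0 * (g - b) / (mx - mn)) % 360.0
    elif mx == g:
        h = 60.0 * (b - r) / (mx - mn) + 120.0
    else:
        h = 60.0 * (r - g) / (mx - mn) + 240.0
    return h, s, v
```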
In step S7, the weighted fusion is carried out channel by channel over the three H, S, V channels of the visible-light image and the three H, S, V channels of the pseudo-color RGB image, yielding the H, S, V channel values of the fused image.
In this embodiment, one possible concrete fusion strategy is given, expressed as formula (5):
In the fusion process represented by formula (5), no hard threshold parameters or specially tuned hyperparameters need to be set, which makes the proposed fusion method more general; this embodiment further sets a sinusoidal (sine-function) dynamic fusion coefficient, thereby resolving the jagged boundaries and unnatural transitions of the fusion region.
In step S8, in the inverse color-space transform, the fused image is transformed from the HSV color space back to the RGB color space, giving the final fusion result; the conversion is expressed as formula (6), whose output gives the values of the R, G, B channels of the final fused image.
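Formula (6) plays the role of the standard HSV-to-RGB inverse conversion, which can be sketched per pixel as follows (h in [0, 360), s and v in [0, 1]):

```python
def hsv_to_rgb(h, s, v):
    """Standard HSV -> RGB inverse conversion for one pixel
    (the role of formula (6))."""
    i = int(h / 60.0) % 6                 # which 60-degree hue sector
    f = h / 60.0 - int(h / 60.0)          # position within the sector
    p = v * (1.0 - s)
    q = v * (1.0 - f * s)
    t = v * (1.0 - (1.0 - f) * s)
    # One (r, g, b) ordering per sector.
    return [(v, t, p), (q, v, p), (p, v, t),
            (p, q, v), (t, p, v), (v, p, q)][i]
```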
In practical application, the method provided by the embodiment specifically includes the following steps:
(1) And synchronously collecting a visible light image and a near infrared fluorescence image of the same target object.
(2) The near-infrared fluorescence image is transformed into the coordinate system of the visible-light image, achieving pixel-by-pixel registration of the near-infrared fluorescence image to the visible-light image. The synchronously acquired and registered visible-light and near-infrared fluorescence images are shown in fig. 2. Since this embodiment mainly discloses the fusion method, the implementation of the specific registration module is outside the scope of this disclosure.
(3) The registered visible-light and near-infrared fluorescence images are filtered and denoised by median filtering.
(4) The near-infrared fluorescence image is normalized and gamma-transformed using formulas (1) and (2);
(5) Pseudo-color processing is performed on the resulting gamma conversion result using equation (3). The embodiment provides two pseudo-color modes, namely a single-color mode and a layered pseudo-color fusion mode. As shown in fig. 4, fig. 4 shows the result of a specific function of each of the monochrome mode and the layered pseudo-color fusion mode.
(6) And (3) converting the visible light image and the near infrared fluorescence image obtained in the step (5) from an RGB space to an HSV space by using a formula (4) to realize separation of brightness and color channels.
(7) Based on the processing result of (6), image fusion is completed by using formula (5).
(8) The fusion result of (7) is inversely transformed into the RGB color space using formula (6). The fusion results corresponding to one single-color fusion mode and one layered pseudo-color fusion mode are shown in fig. 5.
In conclusion, this embodiment provides a unified model applicable to multiple fusion color modes; the fusion process needs no hard threshold parameters or specially tuned hyperparameters, so the model generalizes better. The embodiment decouples the color and brightness of the images during fusion, so the brightness and color fusion schemes can be set flexibly while preserving both the detail of the visible-light image and the intensity of the near-infrared fluorescence signal. The designed sinusoidal nonlinear dynamic fusion coefficient eliminates the jagged fusion boundary at its root, making the fusion effect more realistic with natural boundary transitions. Besides single-color fusion modes, the embodiment can also present the intensity variation of the near-infrared fluorescence signal with a layered pseudo-color fusion mode.
In one embodiment, an electronic device is provided, which may be a server. The electronic device includes a processor, a memory, and a network interface connected by a system bus. The processor of the electronic device provides computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for running the operating system and the computer programs in the non-volatile storage medium. The database of the electronic device stores data. The network interface of the electronic device communicates with external terminals through a network connection. The computer program, when executed by the processor, implements the unified-model visible-light and near-infrared fluorescence image fusion method.
It will be appreciated by those skilled in the art that the electronic device structure shown in this embodiment is merely a partial structure related to the present application and does not constitute a limitation of the electronic device to which the present application is applied, and that a specific electronic device may include more or fewer components than those shown in this embodiment, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, and the like. Volatile memory may include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in various forms, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM).
Further, the logic instructions in the above memory may be implemented in the form of software functional units and, when sold or used as a stand-alone product, stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It can be seen that this embodiment also provides an electronic device and a storage medium for the unified-model visible light and near infrared fluorescence image fusion method, comprising one or more memories and one or more processors. The memory stores the program code, the intermediate data generated while the program runs, the model's output results, and the model parameters; the processor provides the resources occupied by running the code, including the multiple processor resources occupied when training the model.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The above embodiments are only preferred embodiments of the present invention, and the scope of the present invention is not limited thereto, but any insubstantial changes and substitutions made by those skilled in the art on the basis of the present invention are intended to be within the scope of the present invention as claimed.

Claims (10)

1. The method for fusing the visible light and near infrared fluorescence images of the unified model is characterized by comprising the following steps of:
synchronously collecting a visible light image and a near infrared fluorescence image;
transforming the near infrared fluorescence image into the same coordinate system as the visible light image, and registering the near infrared fluorescence image and the visible light image pixel by pixel;
filtering and denoising the registered visible light image and the near infrared fluorescence image, and normalizing the filtered and denoised near infrared fluorescence image;
gamma conversion is carried out on the near infrared fluorescence image after normalization treatment;
pseudo-colorizing the gamma-converted near infrared fluorescence image based on the RGB color space to generate a pseudo-color RGB image, wherein the pseudo-colorizing modes comprise a single-color mode and a layered pseudo-color fusion mode;
transforming the filtered and noise-reduced visible light image and the pseudo-color RGB image into a preset color space with decoupled brightness and color, so as to realize the separation of brightness and color channels;
performing weighted fusion on the images transformed into the preset color space;
and inversely transforming the fused image from the preset color space to the RGB color space to obtain a final fusion result.
2. The method according to claim 1, characterized in that:
when filtering and denoising, a median filtering algorithm is used on the registered visible light image and near infrared fluorescence image, wherein a 3×3 filtering window is defined for the smoothing operation and edge regions are handled by zero padding; the filtered and denoised visible light image and near infrared fluorescence image are denoted I_vis and I_ir1, respectively.
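The filtering of this claim can be sketched as follows; the claim specifies only a median filter, a 3×3 window, and zero padding at the edges, so the use of SciPy's `median_filter` here is an illustrative assumption, not part of the claim:

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise(img: np.ndarray) -> np.ndarray:
    """3x3 median filtering; edge regions are handled by zero padding."""
    return median_filter(img, size=3, mode="constant", cval=0)

# A single impulse-noise pixel is removed by the median filter.
noisy = np.zeros((5, 5), dtype=np.uint8)
noisy[2, 2] = 255
clean = denoise(noisy)
```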
3. The method according to claim 2, characterized in that:
the near infrared fluorescence image I_ir1 is normalized to the [0, 1] range, the normalization being expressed as formula (1):
I_ir2 = (I_ir1 − min(I_ir1)) / (max(I_ir1) − min(I_ir1))    (1)
wherein max(I_ir1) and min(I_ir1) respectively represent the maximum and minimum gray values of I_ir1, and I_ir2 represents the normalized near infrared fluorescence image.
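Formula (1) is ordinary min-max normalization and can be sketched as:

```python
import numpy as np

def normalize(ir1: np.ndarray) -> np.ndarray:
    """Formula (1): map the gray values of I_ir1 linearly onto [0, 1]."""
    ir1 = ir1.astype(np.float64)
    lo, hi = ir1.min(), ir1.max()
    return (ir1 - lo) / (hi - lo)

ir2 = normalize(np.array([[0, 128], [64, 255]], dtype=np.uint8))
```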
4. A method according to claim 3, characterized in that:
during gamma conversion, the normalized near infrared fluorescence image I_ir2 is subjected to gamma conversion and nonlinear gray stretching, specifically expressed as formula (2):
I_irn = (I_ir2)^γ,  I_ir3 = round(255 · I_irn)    (2)
wherein I_irn and I_ir3 are the gamma-converted images with gray ranges [0, 1] and [0, 255] respectively, and γ is the calculated gamma conversion coefficient.
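A sketch of the gamma conversion, assuming the standard power-law form (the claim fixes only the output ranges of I_irn and I_ir3 and treats γ as a calculated coefficient):

```python
import numpy as np

def gamma_stretch(ir2: np.ndarray, gamma: float) -> np.ndarray:
    """Nonlinear gray stretching: I_irn = I_ir2 ** gamma in [0, 1],
    then rescaled to I_ir3 in [0, 255]."""
    ir_n = np.power(ir2, gamma)
    return np.round(255.0 * ir_n).astype(np.uint8)

# gamma < 1 brightens weak fluorescence signals.
ir3 = gamma_stretch(np.array([0.0, 0.25, 1.0]), gamma=0.5)
```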
5. The method according to claim 4, wherein:
when pseudo-coloring is performed: if the single-color fusion mode is set, the near infrared fluorescence image I_ir3 is used to generate a monochromatic image of the same size; if the layered pseudo-color fusion mode is set, the gray values of the near infrared fluorescence image I_ir3 are used as indices to generate a pseudo-color RGB image I_irc whose colors vary with the gray-scale intensity of I_ir3, specifically expressed as formula (3):
I_irc(x, y) = Lut(I_ir3(x, y))    (3)
when the single-color fusion mode is set, the Lut function in formula (3) is a constant function of the corresponding color; when the layered pseudo-color fusion mode is set, the Lut function in formula (3) is a 256-dimensional color mapping table.
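The look-up operation of formula (3) is plain array indexing; the green-ramp mapping table below is a hypothetical illustration, not the patent's actual Lut:

```python
import numpy as np

# Hypothetical 256-entry color mapping table: green ramps with intensity.
lut = np.zeros((256, 3), dtype=np.uint8)
lut[:, 1] = np.arange(256)

# Gray values of I_ir3 are used as indices into the table (formula (3)).
ir3 = np.array([[0, 128], [200, 255]], dtype=np.uint8)
pseudo_rgb = lut[ir3]  # shape (H, W, 3)
```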
6. The method according to claim 5, wherein:
in transforming the visible light image I_vis and the pseudo-color RGB image I_irc to the HSV color space, in which brightness and color are decoupled, the images I_vis_hsv and I_irc_hsv are respectively obtained, expressed as formula (4):
V = Max
S = (Max − Min) / Max  (S = 0 when Max = 0)
H = 60° × (G − B)/(Max − Min) if Max = R;  H = 60° × (2 + (B − R)/(Max − Min)) if Max = G;  H = 60° × (4 + (R − G)/(Max − Min)) if Max = B  (plus 360° if H < 0)    (4)
wherein Max and Min respectively represent the maximum and minimum values of the R, G, B channels of I_vis or I_irc in the RGB color space, and H, S, V represent the calculated hue, saturation, and brightness channel values, respectively.
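Formula (4) is the standard RGB-to-HSV decomposition; the Python standard library's `colorsys` performs the same conversion, with H, S, V all scaled to [0, 1]:

```python
import colorsys

# An orange pixel: R = 1.0, G = 0.5, B = 0.0 (channels in [0, 1]).
h, s, v = colorsys.rgb_to_hsv(1.0, 0.5, 0.0)
# V equals Max, S equals (Max - Min) / Max, and H (here 30 degrees,
# i.e. 1/12 of a turn) is computed from the dominant channel.
```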
7. The method according to claim 6, wherein:
in the weighted fusion, the H, S, V channels of I_vis_hsv and I_irc_hsv are respectively fused by weighting;
wherein H_vis, S_vis, V_vis respectively denote the H, S, V channel values of I_vis_hsv, and H_ir, S_ir, V_ir respectively denote the H, S, V channel values of I_irc_hsv;
the fused image is denoted I_f, and the values of its three channels are denoted H_f, S_f, V_f.
8. The method according to claim 7, wherein:
the fusion strategy of the three channels is expressed as formula (5);
in the fusion process embodied in formula (5), a sinusoidal triangular dynamic fusion coefficient based on the near infrared brightness channel V_ir is set to resolve the jagged boundaries and unnatural transitions of the fusion region.
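The exact expression of formula (5) is not reproduced here; purely as a labeled assumption, a sinusoidal dynamic coefficient driven by the fluorescence brightness V_ir might look like the following sketch (this illustrates the idea of a smooth, threshold-free weight and is not the patent's actual formula):

```python
import math

def fuse_brightness(v_vis: float, v_ir: float) -> float:
    """Hypothetical sinusoidal dynamic coefficient: the fusion weight w
    rises smoothly with the fluorescence brightness v_ir, so there is no
    hard threshold to produce jagged fusion boundaries."""
    w = math.sin(0.5 * math.pi * v_ir)  # smooth weight in [0, 1]
    return w * v_ir + (1.0 - w) * v_vis

# No fluorescence: the visible brightness passes through unchanged.
# Full fluorescence: the fluorescence brightness dominates.
```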
9. The method according to claim 8, wherein:
when performing the color space inverse transform, the fused image I_f is inverse transformed from the HSV color space back to the RGB color space to obtain the final fusion result I_fusion; the specific conversion is expressed as formula (6):
C = V_f · S_f,  X = C · (1 − |(H_f / 60°) mod 2 − 1|),  m = V_f − C
(R′, G′, B′) = (C, X, 0), (X, C, 0), (0, C, X), (0, X, C), (X, 0, C), (C, 0, X) for H_f in [0°, 60°), [60°, 120°), [120°, 180°), [180°, 240°), [240°, 300°), [300°, 360°) respectively
(R, G, B) = (R′ + m, G′ + m, B′ + m)    (6)
wherein the obtained R, G, B are the values of the three channels of the final fusion result I_fusion.
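The inverse transform of formula (6) is the standard HSV-to-RGB conversion; a sketch with the Python standard library:

```python
import colorsys

# A fused HSV pixel back to RGB (H, S, V all in [0, 1] for colorsys).
h_f, s_f, v_f = 1 / 12, 1.0, 1.0
r, g, b = colorsys.hsv_to_rgb(h_f, s_f, v_f)  # an orange RGB pixel
```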
10. The method according to any one of claims 1 to 9, wherein:
the preset color space includes, but is not limited to, HSV, YUV, HSL, and YCbCr.
CN202311492954.4A 2023-11-10 2023-11-10 Visible light and near infrared fluorescence image fusion method of unified model Pending CN117575924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311492954.4A CN117575924A (en) 2023-11-10 2023-11-10 Visible light and near infrared fluorescence image fusion method of unified model


Publications (1)

Publication Number Publication Date
CN117575924A true CN117575924A (en) 2024-02-20

Family

ID=89892743


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117934310A (en) * 2024-03-21 2024-04-26 苏州西默医疗科技有限公司 Vascular fluorescence image and RGB image fusion system based on deep learning



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination