WO2015025640A1 - Fluorescence Observation Apparatus - Google Patents
Fluorescence Observation Apparatus
- Publication number
- WO2015025640A1 (PCT/JP2014/068701)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- fluorescence
- unit
- light image
- return light
- Prior art date
Classifications
(A—Human Necessities; A61—Medical or Veterinary Science; Hygiene; A61B—Diagnosis; Surgery; Identification. G—Physics; G06—Computing; G06T—Image Data Processing or Generation, in General.)
- A61B1/000094 — Operational features of endoscopes characterised by electronic signal processing of image signals during use, extracting biological structures
- A61B1/0005 — Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B1/043 — Endoscopes combined with photographic or television appliances, for fluorescence imaging
- A61B1/045 — Control thereof (endoscopes combined with photographic or television appliances)
- A61B1/0655 — Control therefor (illuminating arrangements)
- A61B5/0071 — Measuring for diagnostic purposes using light, by measuring fluorescence emission
- G06T19/006 — Mixed reality
- G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/94 — Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
- A61B1/0638 — Illuminating arrangements providing two or more wavelengths
- A61B1/0646 — Illuminating arrangements with illumination filters
- G06T2207/10064 — Fluorescence image
- G06T2207/10068 — Endoscopic image
- G06T2207/20212 — Image combination
- G06T2207/20221 — Image fusion; image merging
Definitions
- the present invention relates to a fluorescence observation apparatus.
- The apparatus of Patent Document 1 has a problem in that it is difficult to ensure sufficient visibility of the fluorescent region in a superimposed image when the fluorescence intensity is low relative to a bright return-light image such as a white-light image.
- To address this, a method is known in which the fluorescence image and the illumination-light image are received and photoelectrically converted by a photoelectric conversion element to obtain analog signals representing the two images, and the analog signal representing the fluorescence image is amplified with a larger gain than the signal representing the illumination-light image (see, for example, Patent Document 2).
- However, the photoelectric conversion element and the AD conversion circuit that processes its analog output have a finite dynamic range. When the fluorescence intensity is low relative to a bright illumination-light image, it is therefore difficult to apply enough gain to the analog signal representing the fluorescence image to obtain sufficient visibility of the fluorescent region in the superimposed image.
- The present invention has been made in view of the above circumstances, and an object thereof is to provide a fluorescence observation apparatus capable of improving the visibility of a fluorescent region in an image in which a fluorescence image is superimposed on a white-light image.
- the present invention provides the following means.
- To this end, the present invention provides a fluorescence observation apparatus comprising: a light source that irradiates a subject with illumination light and excitation light; a return-light image generation unit that generates a return-light image by capturing the return light emitted from the subject under irradiation with the illumination light from the light source; a fluorescence image generation unit that generates a fluorescence image by capturing the fluorescence emitted from the subject under irradiation with the excitation light from the light source; a fluorescence detection unit that detects, from the generated fluorescence image, a fluorescent region having a gradation value equal to or greater than a predetermined gradation-value threshold; a return-light image adjustment unit that adjusts the gradation value of the return-light image; a superimposed image generation unit that generates a superimposed image by superimposing the fluorescence image on the gradation-adjusted return-light image; and a coefficient setting unit that sets the adjustment width of the return-light image adjustment unit based on the detection result of the fluorescence detection unit.
- When the fluorescence detection unit detects a fluorescent region, the coefficient setting unit sets the adjustment width so that the gradation value of the return-light image is reduced compared with the case where no fluorescent region is detected.
- According to the present invention, a return-light image showing the form of the subject under illumination light from the light source is generated by the return-light image generation unit, and a fluorescence image capturing the fluorescence from a specific region of the subject under excitation light from the light source is generated by the fluorescence image generation unit.
- the superimposed image generation unit generates a superimposed image in which the fluorescent image is superimposed on the return light image.
- the observer can observe the form of the subject and the fluorescent region in association with each other. Even in the region where the fluorescent region is superimposed, the superimposed image includes the information of the return light image, so that the form of the subject in the region corresponding to the fluorescent region can also be observed.
- In the normal case, when no fluorescent region exists in the fluorescence image, a superimposed image substantially equivalent to the return-light image is generated.
- When a fluorescent region is detected, the gradation value of the return-light image is adjusted by the return-light image adjustment unit, based on the adjustment width set by the coefficient setting unit, so that the return-light image becomes darker than normal, and the adjusted return-light image is used to generate the superimposed image. The resulting superimposed image emphasizes the brightness of the fluorescent region by reducing the overall brightness of the return-light image. In this way, the visibility of the fluorescent region can be improved by adjusting the brightness of the return-light image in the superimposed image only when necessary.
- In the above invention, the coefficient setting unit may set a coefficient greater than 0 and less than 1, and the return-light image adjustment unit may, when the fluorescence detection unit detects a fluorescent region, multiply the gradation value of the return-light image by the set coefficient. In this way, the gradation value of the return-light image can be adjusted by arithmetic processing alone, which simplifies the apparatus configuration.
- Alternatively, the apparatus may include an image sensor that captures the return light; the coefficient setting unit sets a reduction width for the exposure amount of the image sensor, and the return-light image adjustment unit may include an exposure control unit that, when a fluorescent region is detected, reduces the exposure amount of the image sensor according to the reduction width set by the coefficient setting unit.
- The exposure control unit may control at least one of the amount of return light incident on the image sensor, the exposure time of the image sensor, and the amount of illumination light applied to the subject.
- In this case, the return-light image generated by the return-light image generation unit can be used as-is for generating the superimposed image, which simplifies the arithmetic processing.
- The coefficient setting unit may reduce the reduction width of the gradation value of the return-light image as the gradation value of the fluorescent region detected by the fluorescence detection unit increases. When the gradation value of the fluorescent region is sufficiently large, sufficient visibility of the fluorescent region relative to the return-light image is already obtained; reducing the reduction width in this case prevents the return-light image from being darkened more than necessary and the image of the subject in the superimposed image from becoming unclear.
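As a rough illustration of this rule, the coefficient applied to the return-light image could be an increasing function of the fluorescent region's brightness, so that bright fluorescence leaves the return-light image nearly unchanged. The clamped linear ramp below and all of its constants are illustrative assumptions, not values taken from the patent.

```python
def set_coefficient(n, n_low=500.0, n_high=3000.0, alpha_min=0.3, alpha_max=1.0):
    """Map the representative gradation value n of the fluorescent region to a
    coefficient alpha: the brighter the fluorescence, the larger alpha, i.e.
    the smaller the reduction applied to the return-light image.

    The clamped linear form and all constants are illustrative assumptions.
    """
    if n <= n_low:
        return alpha_min   # dim fluorescence: darken the return-light image strongly
    if n >= n_high:
        return alpha_max   # bright fluorescence: little or no reduction needed
    t = (n - n_low) / (n_high - n_low)
    return alpha_min + t * (alpha_max - alpha_min)
```

Any monotonically increasing mapping would satisfy the description; the linear form is merely the simplest.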
- The coefficient setting unit may also change the adjustment width of the gradation value of the return-light image according to the color tone of the return-light image.
- When the contrast between the color of the subject in the return-light image and the display color of the fluorescent region in the superimposed image is high, sufficient visibility of the fluorescent region relative to the return-light image is obtained without strong adjustment.
- Reducing the adjustment width in that case prevents the gradation value of the return-light image from being reduced more than necessary and the image of the subject in the superimposed image from becoming unclear.
- The return-light image adjustment unit may reduce the gradation value of the return-light image only when the number of pixels constituting the fluorescent region detected by the fluorescence detection unit is equal to or greater than a predetermined pixel-count threshold. This prevents noise in the fluorescence image from being detected as a fluorescent region, and the gradation value of the return-light image from being needlessly reduced, when no fluorescence is actually emitted from the subject.
- According to the present invention, the visibility of the fluorescent region can be improved in an image in which the fluorescence image is superimposed on the white-light image.
- FIG. 1 is an overall configuration diagram of a fluorescence observation apparatus according to a first embodiment of the present invention. FIG. 2 is a flowchart explaining the image processing performed by the image processing unit of FIG. 1. FIG. 3 is an overall configuration diagram of a first modification of the fluorescence observation apparatus of FIG. 1. FIG. 4 is an overall configuration diagram of a second modification of the fluorescence observation apparatus of FIG. 1. FIG. 5 is an overall configuration diagram of a fluorescence observation apparatus according to a second embodiment of the present invention.
- FIG. 6 is a graph showing the coefficient α as a function of the representative value n of the gradation values of the fluorescent region, used in the coefficient setting unit of FIG. 5. FIG. 7 is a flowchart explaining the image processing performed by the image processing unit of FIG. 5.
- The fluorescence observation apparatus 100 according to this embodiment is an endoscope apparatus. As illustrated in FIG. 1, it comprises an elongated insertion portion 2 to be inserted into the body; a light source 3; an illumination unit 4 that irradiates excitation light and illumination light from the light source 3 toward the observation target (subject) X from the distal end of the insertion portion 2; an imaging unit 5, provided at the distal end of the insertion portion 2, that acquires image information S1 and S2 of the biological tissue constituting the observation target X; an image processing unit 6 that processes the image information S1 and S2 obtained by the imaging unit 5; and a monitor 7 that displays the image G3 processed by the image processing unit 6.
- The light source 3 includes a xenon lamp 31, a filter 32 that extracts the excitation light and the illumination light from the light emitted by the xenon lamp 31, and a coupling lens 33 that collects the excitation light and illumination light extracted by the filter 32.
- The filter 32 selectively transmits light in the 400 nm to 740 nm wavelength band, corresponding to the excitation light and the illumination light; in this embodiment, near-infrared light (for example, the 700 nm to 740 nm band) is used as the excitation light.
- Instead of the xenon lamp 31, another type of lamp light source or a semiconductor light source such as an LED may be used.
- the illumination unit 4 includes a light guide fiber 41 disposed over almost the entire length in the longitudinal direction of the insertion portion 2 and an illumination optical system 42 provided at the distal end of the insertion portion 2.
- the light guide fiber 41 guides the excitation light and illumination light collected by the coupling lens 33.
- the illumination optical system 42 diffuses the excitation light and the illumination light guided by the light guide fiber 41 and irradiates the observation target X facing the distal end of the insertion portion 2.
- The imaging unit 5 includes an objective lens 51 that collects light from the observation target X; a dichroic mirror 52 that, of the collected light, reflects the excitation light and the fluorescence and transmits the white light (wavelength band 400 nm to 700 nm; the return light), which has shorter wavelengths than the excitation light; two condensing lenses 53 and 54 that respectively focus the white light transmitted through the dichroic mirror 52 and the fluorescence reflected by it; an image sensor 55, such as a color CCD, that captures the white light focused by the condensing lens 53; and an image sensor 56, such as a high-sensitivity monochrome CCD, that captures the fluorescence focused by the condensing lens 54.
- Reference numeral 57 denotes an excitation-light cut filter that, of the light reflected by the dichroic mirror 52, selectively transmits the fluorescence (for example, the 760 nm to 850 nm wavelength band) and blocks the excitation light.
- The white-light image sensor 55 and the fluorescence image sensor 56 may be of the same type or of different types.
- The image processing unit 6 includes a white-light image generation unit (return-light image generation unit) 61 that generates a white-light image (return-light image) G1 from the white-light image information S1 acquired by the image sensor 55; a fluorescence image generation unit 62 that generates a fluorescence image G2 from the fluorescence image information S2 acquired by the image sensor 56; a fluorescence detection unit 63 that detects a fluorescent region F from the fluorescence image G2 generated by the fluorescence image generation unit 62; a coefficient setting unit 64 that sets a coefficient α for the gradation value of the white-light image G1 based on the detection result of the fluorescence detection unit 63; a white-light image adjustment unit 65 that generates an adjusted image G1′ by adjusting the gradation value of the white-light image G1 based on the coefficient α; and a superimposed image generation unit 66 that generates the superimposed image G3.
- the fluorescence detection unit 63 holds a predetermined threshold Th for the gradation value of the fluorescence image G2.
- The fluorescence detection unit 63 compares the gradation value of each pixel of the fluorescence image G2 input from the fluorescence image generation unit 62 with the threshold Th, and detects pixels having a gradation value equal to or greater than Th as the fluorescent region F.
- the fluorescence detection unit 63 outputs a signal S3 indicating whether or not the fluorescence region F has been detected to the coefficient setting unit 64.
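This thresholding step can be sketched as follows, assuming the fluorescence image is a simple row-major grid of gradation values; the function name and data layout are illustrative, not from the patent.

```python
def detect_fluorescent_region(fluorescence_image, threshold):
    """Return the pixel coordinates whose gradation value is at or above the
    threshold Th, together with a flag indicating whether any fluorescent
    region was detected (corresponding to the signal S3 sent to the
    coefficient setting unit 64).

    fluorescence_image is a list of rows of gradation values; a real
    implementation would operate on the sensor's image buffer.
    """
    region = [(y, x)
              for y, row in enumerate(fluorescence_image)
              for x, value in enumerate(row)
              if value >= threshold]
    detected = len(region) > 0
    return region, detected
```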
- The coefficient setting unit 64 holds two predetermined values α1 and α2 as the coefficient α, selects α1 or α2 according to the signal S3 received from the fluorescence detection unit 63, and outputs the selected value to the white-light image adjustment unit 65.
- The coefficient setting unit 64 selects α2 when it receives a signal indicating that the fluorescent region F has been detected; otherwise, it selects α1.
- The following Equation (1) represents the process, performed by the white-light image adjustment unit 65 and the superimposed image generation unit 66, of generating the superimposed image G3 from the white-light image G1 and the fluorescence image G2 (written out from the description, with the fluorescence assigned to the G component as in this embodiment):
- R′ = α·R, G′ = α·G + FL, B′ = α·B … (1)
- Here, R, G, and B are the gradation values of the red (R), green (G), and blue (B) components of each pixel of the white-light image G1; FL is the gradation value of each pixel of the fluorescence image G2; and R′, G′, and B′ are the gradation values of the R, G, and B components of each pixel of the superimposed image G3.
- The white-light image adjustment unit 65 and the superimposed image generation unit 66 perform the processing of Equation (1) for all pixels of the white-light image G1 and the fluorescence image G2.
- That is, the white-light image adjustment unit 65 multiplies the gradation value of each color component of each pixel of the white-light image G1 by the coefficient α set by the coefficient setting unit 64, thereby calculating the gradation values of each color component of each pixel of the adjusted image G1′.
- The superimposed image generation unit 66 adds the monochrome fluorescence image G2 to any one of the three color component images (that is, the red (R), green (G), and blue (B) component images) constituting the color-adjusted image G1′.
- The superimposed image generation unit 66 then reconstructs the color superimposed image G3 from the component image to which the fluorescence image G2 was added and the other two component images.
- When the coefficient α is α1, the white-light image G1 becomes the adjusted image G1′ as it is; that is, the superimposed image G3 is generated from the white-light image G1 with its original gradation values.
- When the coefficient α is α2, a white-light image G1 with reduced gradation values is generated as the adjusted image G1′, and the superimposed image G3 is generated from this adjusted image G1′.
- The superimposed image G3 generated according to Equation (1) is an image in which the red, green, or blue fluorescent region F is superimposed on the adjusted image G1′.
- Since biological tissue appears predominantly red, the display color of the fluorescent region F in the superimposed image G3 is preferably green, the complementary color of red. Therefore, in Equation (1), the matrix is set so that the fluorescent region F is assigned to the G component image of the adjusted image G1′.
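The per-pixel processing described above — scaling each color component of the white-light image by the coefficient α and adding the fluorescence gradation to the green component — can be sketched as follows. This is a minimal sketch: the 8-bit value range and the clipping to 255 are assumptions not stated in the patent.

```python
def superimpose_pixel(r, g, b, fl, alpha, max_value=255):
    """Apply the Equation (1) processing to one pixel: multiply the white-light
    components by alpha (yielding the adjusted image G1'), then add the
    fluorescence gradation FL to the green component, chosen because green is
    complementary to the reddish tissue. Clipping to max_value is an assumption.
    """
    r_out = min(alpha * r, max_value)
    g_out = min(alpha * g + fl, max_value)
    b_out = min(alpha * b, max_value)
    return r_out, g_out, b_out

def superimpose_image(white_image, fluorescence_image, alpha):
    """Apply the per-pixel operation over two same-sized images, where
    white_image holds (r, g, b) tuples and fluorescence_image holds FL values.
    """
    return [[superimpose_pixel(r, g, b, fl, alpha)
             for (r, g, b), fl in zip(w_row, f_row)]
            for w_row, f_row in zip(white_image, fluorescence_image)]
```

With α = α1 = 1 and FL ≈ 0 the output reduces to the unprocessed white-light image, matching the "normal case" behaviour described above.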
- Next, the operation of the fluorescence observation apparatus 100 configured as described above will be described.
- a fluorescent substance that accumulates in the lesion Y is previously administered to the observation target X.
- the insertion portion 2 is inserted into the body, and the distal end of the insertion portion 2 is disposed opposite to the observation target X.
- the light source 3 is operated to irradiate the observation target X with excitation light and illumination light from the distal end of the insertion portion 2.
- the fluorescent substance contained in the lesion Y is excited by excitation light, and fluorescence is emitted, and white light is reflected on the surface of the observation target X.
- a part of the fluorescence emitted from the observation target X and the reflected white light returns to the tip of the insertion portion 2 and is collected by the objective lens 51.
- FIG. 2 shows a flowchart for explaining image processing by the image processing unit 6.
- the white light image information S1 is input to the white light image generation unit 61 to generate the white light image G1
- the fluorescence image information S2 is input to the fluorescence image generation unit 62 to generate the fluorescence image G2.
- The white-light image G1 and the fluorescence image G2 are generated in step S1. Next, the fluorescence detection unit 63 determines whether the fluorescent region F exists in the fluorescence image G2 (step S2); when it is not detected (NO in step S2), the coefficient setting unit 64 sets α1 as the coefficient α (step S3).
- In this case, the white-light image adjustment unit 65 outputs the white-light image G1 as-is to the superimposed image generation unit 66 as the adjusted image G1′ (step S5), and the superimposed image generation unit 66 generates a superimposed image G3 in which the fluorescence image G2, which has almost no gradation values, is superimposed on the unprocessed white-light image G1 (step S6).
- The superimposed image G3 generated at this time is substantially equivalent to the raw white-light image G1.
- On the other hand, if the lesion Y is present in the field of view, the fluorescent region F is detected in step S2 (YES in step S2), so α2 (<α1) is set as the coefficient α in the coefficient setting unit 64 (step S4). The white-light image adjustment unit 65 then generates the adjusted image G1′ by applying gradation-value reduction to the white-light image G1 (step S5), and a superimposed image G3 in which the fluorescence image G2 with its original gradation values is superimposed on the adjusted image G1′ is generated (step S6).
- the superimposed image G3 generated at this time is an image in which a fluorescent region F that is relatively brighter than the white light image G1 is superimposed on the white light image G1 whose brightness is reduced as a whole. Therefore, the brightness of the fluorescent region F stands out in the superimposed image G3.
- the brightness of the white light image G1 included in the superimposed image G3 is switched depending on whether or not the fluorescent region F to which the observer pays attention exists in the visual field. That is, when the fluorescent region F does not exist, the normal clear white light image G1 is displayed on the monitor 7, so that the observer can clearly observe the form of the living tissue. On the other hand, when the fluorescent region F exists, the superimposed image G3 in which the brightness of the white light image G1 is reduced so that the brightness of the fluorescent region F stands out is displayed on the monitor 7. Therefore, the observer can easily visually recognize the fluorescent region F, and at the same time can observe the living tissue displayed behind the fluorescent region F sufficiently clearly.
- The coefficient setting unit 64 preferably sets the coefficient α in consideration of the number of pixels detected as the fluorescent region F. That is, the fluorescence detection unit 63 outputs the number of pixels having gradation values equal to or greater than the gradation-value threshold Th to the coefficient setting unit 64.
- The coefficient setting unit 64 selects α2 only when the number of pixels received from the fluorescence detection unit 63 is equal to or greater than a predetermined pixel-count threshold. Even when pixels at or above the gradation-value threshold Th are detected, the coefficient setting unit 64 selects α1 if their number is smaller than the pixel-count threshold.
- Noise having a gradation value equal to or greater than the predetermined gradation value threshold Th may occur in the fluorescence image G2.
- If such noise were treated as a fluorescent region, the brightness of the white light image G1 in the superimposed image G3 would change unnecessarily, which would be a nuisance to the observer. Accordingly, by reducing the gradation values of the white light image G1 only when the number of pixels exceeding the gradation value threshold Th is sufficiently large, it is possible to prevent the gradation values of the white light image G1 from being unnecessarily reduced in reaction to noise.
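The pixel-count gating can be sketched as follows; the numeric thresholds (Th = 200, a 10-pixel count threshold) and the coefficient values are hypothetical, chosen only to illustrate the rule.

```python
import numpy as np

def set_coefficient(fluor_img, th, pixel_count_th, alpha1=1.0, alpha2=0.6):
    """Select the reducing coefficient alpha2 only when the number of
    pixels at or above the gradation threshold Th reaches the pixel-count
    threshold; otherwise keep alpha1, so isolated noise pixels do not dim
    the white-light image."""
    n_pixels = int(np.count_nonzero(np.asarray(fluor_img) >= th))
    return alpha2 if n_pixels >= pixel_count_th else alpha1

noise_only = np.zeros((8, 8), dtype=np.uint8)
noise_only[3, 3] = 250                  # a single noisy pixel above Th
lesion = np.zeros((8, 8), dtype=np.uint8)
lesion[2:6, 2:6] = 250                  # a 16-pixel fluorescent region
```

A lone bright pixel leaves the coefficient at α1 (no dimming), while a contiguous bright region of sufficient size switches it to α2.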
- As shown in FIG. 3, the fluorescence observation apparatus 100 according to the first modification further includes a threshold setting unit 67 that sets the gradation value threshold Th of the fluorescence detection unit 63 based on the gradation values of the white light image G1.
- the threshold setting unit 67 receives the white light image G1 from the white light image generation unit 61, and calculates a representative value m of the gradation values of the white light image G1.
- the representative value m is, for example, an average value or a median value of gradation values of all pixels of the white light image G1.
- the threshold value setting unit 67 calculates the gradation value threshold value Th from the representative value m based on a predetermined function.
- the predetermined function is an increase function in which the gradation value threshold Th increases as the representative value m increases.
- The gradation values of the white light image G1 vary as a whole according to the observation distance between the tip of the insertion portion 2 and the observation target X: the smaller the observation distance, the greater the overall brightness of the white light image G1.
- Similarly, the gradation values of the fluorescence image G2 also vary as a whole according to the observation distance: even if the intensity of the fluorescence emitted from the observation target X is the same, the greater the observation distance, the smaller the gradation values of the fluorescent region F in the fluorescence image G2 become.
- Therefore, the magnitude of the observation distance can be estimated from the representative value m of the gradation values of the white light image G1, and a gradation value threshold Th that tracks the variation of the gradation values of the fluorescence image G2 caused by the variation of the observation distance can be set.
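The threshold-setting rule of the first modification can be sketched as below. The patent specifies only that Th is an increasing function of the representative value m; the linear form and its slope and offset here are assumptions for illustration.

```python
def gradation_threshold(m, slope=0.8, offset=20.0, max_tone=255.0):
    """Increasing linear map from the representative value m of the
    white-light image to the threshold Th: a brighter image (shorter
    observation distance) yields a higher Th, tracking the fluorescence
    image's gradation values. Slope and offset are illustrative only."""
    return min(slope * m + offset, max_tone)
```

A dim white light image (large observation distance) thus lowers Th so that weak fluorescence is still detected, while a bright image raises it.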
- As shown in FIG. 4, the fluorescence observation apparatus 100 according to the second modification of the present embodiment further includes a division unit 68 that divides the fluorescence image G2 by the white light image G1.
- The division unit 68 divides the gradation value of each pixel of the fluorescence image G2 input from the fluorescence image generation unit 62 by the gradation value of the corresponding pixel of the white light image G1 input from the white light image generation unit 61, thereby generating a fluorescence image with corrected gradation values (hereinafter, the corrected fluorescence image G2′). The division unit 68 then outputs the generated corrected fluorescence image G2′ to the fluorescence detection unit 63 and the superimposed image generation unit 66.
- the fluorescence detection unit 63 detects the fluorescence region F from the corrected fluorescence image G2 ′ instead of the fluorescence image G2.
- the superimposed image generation unit 66 generates a superimposed image G3 using the corrected fluorescent image G2 ′ instead of the fluorescent image G2. That is, the superimposed image generation unit 66 uses the gradation value of the corrected fluorescent image G2 ′ as FL in Expression (1).
- the gradation value of the white light image G1 varies depending on the observation angle between the distal end of the insertion portion 2 and the observation target X in addition to the observation distance described above.
- Likewise, the gradation values of the fluorescence image G2 vary according to the observation distance and the observation angle. Therefore, by dividing the fluorescence image G2 by the white light image G1, the gradation values of the fluorescence image G2 are normalized, and a corrected fluorescence image G2′ is obtained in which changes in gradation value due to differences in observation distance and observation angle are removed.
- By using the corrected fluorescence image G2′, it is possible to improve the detection accuracy of the fluorescent region F and to provide a highly reliable superimposed image G3.
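The division-based normalization of the second modification can be sketched as follows; the epsilon guard against division by zero and the output scaling are implementation assumptions, not part of the patent text.

```python
import numpy as np

def corrected_fluorescence(fluor, white, eps=1e-6, scale=255.0):
    """Divide each pixel of the fluorescence image G2 by the corresponding
    pixel of the white-light image G1 (division unit 68), cancelling the
    common dependence on observation distance and angle."""
    ratio = np.asarray(fluor, dtype=np.float64) / (np.asarray(white, dtype=np.float64) + eps)
    return np.clip(scale * ratio, 0.0, 255.0)

# The same fluorophore imaged near (bright) and far (dim): the raw gradation
# values differ by 4x, but the corrected values nearly coincide.
near = corrected_fluorescence([[200.0]], [[200.0]])
far = corrected_fluorescence([[50.0]], [[50.0]])
```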
- Next, a fluorescence observation apparatus 200 according to a second embodiment of the present invention will be described with reference to FIGS. 5 to 7.
- the configuration different from the first embodiment will be mainly described, and the configuration common to the first embodiment will be denoted by the same reference numerals and the description thereof will be omitted.
- the fluorescence observation apparatus 200 according to the present embodiment further includes a representative value calculation unit 69 that calculates a representative value n of the gradation values of the fluorescent region F detected by the fluorescence detection unit 63.
- the coefficient setting unit 64 is mainly different from the first embodiment in that the coefficient ⁇ is set according to the representative value n calculated by the representative value calculating unit 69.
- When pixels having a gradation value equal to or greater than the gradation value threshold Th are present, the fluorescence detection unit 63 extracts those pixels and outputs their gradation values S4 to the representative value calculation unit 69.
- the representative value calculation unit 69 calculates, for example, an average value as the representative value n of the gradation value S4 of the pixel input from the fluorescence detection unit 63, and outputs the calculated representative value n to the coefficient setting unit 64.
- The coefficient setting unit 64 calculates the value of the coefficient α from the representative value n calculated by the representative value calculation unit 69 based on a predetermined function, and outputs the calculated value to the white light image adjustment unit 65.
- The predetermined function is an increasing function in which the coefficient α increases as the representative value n increases, and is, for example, a linear function as shown in FIG.
- The representative value n takes a value in the range from the gradation value threshold Th up to the maximum value that the gradation values of the fluorescence image G2 can take, and the coefficient α takes a value in the range greater than 0 and at most 1.
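The mapping from the representative value n to the coefficient α can be sketched as a clamped linear interpolation over the stated ranges. The patent requires only an increasing function whose output stays in (0, 1]; the lower endpoint chosen here (α = 0.2 for the dimmest region) is a hypothetical value.

```python
def coefficient_from_brightness(n, th=50.0, max_tone=255.0,
                                alpha_min=0.2, alpha_max=1.0):
    """Increasing linear function of the representative value n, clamped
    so alpha stays in (0, 1]: a bright fluorescent region yields alpha
    near 1 (little dimming of G1); a dim region yields a small alpha
    (strong dimming). th and alpha_min are illustrative values."""
    t = (n - th) / (max_tone - th)
    t = min(max(t, 0.0), 1.0)
    return alpha_min + t * (alpha_max - alpha_min)
```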
- the processing when the fluorescent region F is detected in step S2 is different from that of the first embodiment.
- As shown in FIG. 7, when the fluorescent region F exists in the fluorescence image G2 (YES in step S2), the fluorescence detection unit 63 extracts the fluorescent region F (step S21).
- the representative value n of the extracted gradation values of the fluorescent region F is calculated by the representative value calculation unit 69 (step S22).
- Next, the coefficient setting unit 64 sets the coefficient α corresponding to the representative value n (step S23).
- The white light image adjustment unit 65 generates the adjusted image G1′ using the coefficient α set in step S23 (step S5).
- The coefficient α set in step S23 reflects the overall brightness of the fluorescent region F. Therefore, the reduction width by which the white light image adjustment unit 65 reduces the gradation values of the white light image G1 is adjusted according to the overall brightness of the fluorescent region F. That is, when the fluorescent region F is sufficiently bright as a whole, the coefficient α is closer to 1, and an adjusted image G1′ whose brightness is hardly reduced relative to the white light image G1 is generated. Conversely, when the fluorescent region F is dark as a whole, the coefficient α is smaller, and an adjusted image G1′ whose brightness is substantially reduced relative to the white light image G1 is generated.
- the brightness reduction width of the white light image G1 is set to an amount necessary and sufficient to obtain sufficient visibility of the fluorescent region F according to the brightness of the fluorescent region F.
- The coefficient setting unit 64 may output the calculated coefficient α to the monitor 7, and the monitor 7 may display the coefficient α.
- In this way, the observer can recognize how much the brightness of the adjusted image G1′ in the currently displayed superimposed image G3 is reduced compared with the normal white light image G1 (that is, when α = 1), and the diagnostic accuracy based on the superimposed image G3 can be improved.
- As shown in FIG. 8, the fluorescence observation apparatus 300 includes, in addition to the representative value calculation unit 69 described in the second embodiment, an aperture stop 8 provided in front of the image sensor 55 for the white light image G1 and an exposure amount control unit 70 that controls the aperture stop 8.
- The coefficient setting unit 64 calculates a coefficient α from the representative value n calculated by the representative value calculation unit 69 based on a predetermined function, and outputs the calculated coefficient α to the exposure amount control unit 70.
- The predetermined function is an increasing function in which the coefficient α increases as the representative value n increases.
- For example, the predetermined function is a linear function as shown in FIG.
- The coefficient α takes a value in the range greater than 0 and at most 1.
- The exposure amount control unit 70 controls the aperture diameter of the aperture stop 8 by transmitting a signal S5 designating the aperture diameter to the aperture stop 8, based on the detection result of the fluorescent region F by the fluorescence detection unit 63 and the coefficient α input from the coefficient setting unit 64. Specifically, at the normal time, when the fluorescent region F is not detected by the fluorescence detection unit 63, the exposure amount control unit 70 maintains the aperture diameter of the aperture stop 8 at a predetermined diameter φ. On the other hand, when the fluorescence detection unit 63 detects the fluorescent region F, the exposure amount control unit 70 controls the aperture stop 8 so that its aperture diameter becomes the diameter obtained by multiplying the normal diameter φ by the coefficient α. For example, when the coefficient α is 0.8, the aperture stop 8 is controlled so that the aperture diameter is 80% of the normal value. In this case, the exposure amount of white light on the image sensor 55 is reduced to about 80% of the normal time.
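The aperture control rule just described can be sketched as below; the diameter values used in the example are hypothetical.

```python
def aperture_diameter(region_detected, alpha, phi):
    """Control rule of the exposure amount control unit 70 in sketch form:
    hold the aperture stop 8 at the normal diameter phi when no fluorescent
    region is detected, otherwise use alpha * phi (the description states
    the white-light exposure then drops to roughly the same fraction of
    normal)."""
    return alpha * phi if region_detected else phi
```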
- the processing after determining the presence or absence of the fluorescent region F in step S2 is different from the first embodiment and the second embodiment.
- When the fluorescent region F is not detected in step S2 (NO in step S2), the aperture diameter of the aperture stop 8 is adjusted to the predetermined diameter φ (steps S31 and S36).
- the white light image G1 generated next is an image having normal brightness.
- When the fluorescent region F is detected in step S2 (YES in step S2), the fluorescent region F is extracted by the fluorescence detection unit 63 as in steps S21 to S23 of the second embodiment (step S32), the representative value n of the gradation values of the fluorescent region F is calculated by the representative value calculation unit 69 (step S33), and the coefficient α corresponding to the representative value n is set by the coefficient setting unit 64 (step S34). Then, according to the coefficient α, the aperture diameter of the aperture stop 8 is adjusted to be smaller than normal (steps S35 and S36). As a result, the white light image G1 generated next is an image with reduced brightness compared with the normal time.
- By stopping down only the aperture stop 8 positioned in front of the white light image sensor 55 while leaving the aperture (not shown) in front of the fluorescence image sensor 56 unchanged, only the amount of white light received can be reduced, so that the brightness of the white light image G1 is reduced relative to the fluorescence image G2.
- In the present embodiment, instead of reducing the gradation values of the white light image G1 by arithmetic processing as in the first and second embodiments, the exposure amount of the image sensor 55 is reduced, so that the brightness of the white light image G1 can be adjusted according to the presence or absence of the fluorescent region F. An effect similar to that of the first embodiment is thereby obtained.
- In addition, by adjusting the aperture diameter of the aperture stop 8 in accordance with the brightness of the fluorescent region F, the brightness reduction width of the white light image G1 can be set to an amount necessary and sufficient to obtain sufficient visibility of the fluorescent region F, as in the second embodiment. There is therefore an advantage that the reduction in the visibility of the white light image G1 in the superimposed image G3 can be kept to a minimum.
- Instead of using the aperture stop 8 to reduce the amount of light incident on the image sensor 55, the gradation values of the white light image G1 may be reduced by shortening the exposure time of the image sensor 55.
- the adjustment of the exposure time is performed, for example, by the exposure amount control unit 70 controlling the opening time of an electronic shutter (not shown) provided in the image sensor 55.
- In this way as well, the white light image generation unit 61 can generate a white light image G1 with gradation values reduced compared with the normal time.
- the fluorescence observation apparatus 400 according to the present embodiment is common to the third embodiment in that the gradation value of the white light image G1 is reduced by adjusting the exposure amount of the image sensor 55.
- The fluorescence observation apparatus 400 according to the present embodiment includes an aperture stop 81 in the light source 3 instead of in front of the image sensor 55, and a light amount control unit (exposure amount control unit, return light image adjustment unit) 71 that controls the aperture stop 81 to change the amount of illumination light radiated onto the observation target X.
- the light source 3 further includes another xenon lamp 311 and a beam splitter 34.
- the beam splitter 34 multiplexes the light emitted from the two xenon lamps 31 and 311 onto the optical axis incident on the light guide fiber 41.
- the xenon lamps 31 and 311 may be other types of lamp light sources or semiconductor light sources such as LEDs.
- the xenon lamps 31 and 311 may be of the same type or different types.
- the light source 3 includes a first filter 321 and a second filter 322 instead of the filter 32.
- the first filter 321 cuts out illumination light (for example, a wavelength band from 400 nm to 700 nm) from the light emitted from one xenon lamp 31.
- the second filter 322 cuts out excitation light (for example, a wavelength band of 700 nm to 740 nm) from the light emitted from the other xenon lamp 311.
- the aperture stop 81 is disposed between the first filter 321 and the beam splitter 34, and changes the amount of illumination light alone out of the light incident on the light guide fiber 41.
- the light quantity control unit 71 is the same as the exposure amount control unit 70 of the third embodiment except that the aperture stop 81 is controlled instead of the aperture stop 8.
- When the aperture diameter of the aperture stop 81 is reduced, the amount of illumination light radiated onto the observation target X and the amount of white light collected by the objective lens 51 are reduced, so that the exposure amount of the image sensor 55 is reduced.
- The operation and effects of the fluorescence observation apparatus 400 according to the present embodiment are the same as those of the fluorescence observation apparatus 300 of the third embodiment described above, except that it is the aperture stop 81 whose aperture diameter is adjusted in steps S31, S35, and S36 of FIG. 10; the description thereof is therefore omitted.
- The fluorescence observation apparatus 500 differs in that the coefficient setting unit 64 sets the coefficient α (0 < α < 1) in consideration of the color components of the white light image G1.
- the white light image generation unit 61 outputs the white light image G1 to the coefficient setting unit 64 in addition to the white light image adjustment unit 65.
- the coefficient setting unit 64 determines the color tone of the white light image G1 input from the white light image generation unit 61 prior to setting the coefficient ⁇ .
- Specifically, the coefficient setting unit 64 calculates an average value <G′> and an average value <R> from the white light image G1.
- The average value <G′> is the sum of the average value <G> of the G-component gradation values of the white light image G1 and the representative value n calculated by the representative value calculation unit 69.
- The average value <R> is the average of the R-component gradation values of the white light image G1.
- The predetermined function is an increasing function in which the coefficient α increases as the ratio Z increases.
- the predetermined function is a linear function as shown in FIG.
- the visibility of the fluorescent region F in the superimposed image G3 also depends on the contrast between the color of the living tissue X and the display color of the fluorescent region F.
- the ratio Z described above represents the contrast of the hue of the living tissue with respect to green, which is the display color of the fluorescent region F.
- Specifically, when the living tissue is strongly reddish, mainly due to blood, the ratio Z and the coefficient α become small, and the reduction width of the gradation values of the white light image G1 becomes large.
- Conversely, when the redness of the living tissue is weak, for example because its surface is covered with fat, the ratio Z and the coefficient α become large, and the reduction width of the gradation values of the white light image G1 becomes small.
- the color tone of the white light image G1 is also taken into consideration, and the brightness reduction range of the white light image G1 is set to an amount necessary and sufficient to obtain sufficient visibility of the fluorescent region F.
- the visibility of the fluorescent region F in the superimposed image G3 can be improved and the brightness reduction width of the white light image G1 can be minimized.
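The color-tone-dependent coefficient of this embodiment can be sketched as below. The ratio Z = (<G> + n)/<R> follows the description above; the RGB channel ordering and the particular increasing mapping from Z to α are assumptions for illustration.

```python
import numpy as np

def color_tone_coefficient(white_rgb, n):
    """Compute Z = (<G> + n) / <R> from the white-light image and map it
    to alpha via an increasing function clamped below 1 (the mapping is
    an illustrative assumption)."""
    mean_r = float(np.mean(white_rgb[..., 0]))   # assumes RGB channel order
    mean_g = float(np.mean(white_rgb[..., 1]))
    z = (mean_g + n) / mean_r
    return min(0.9 * z, 0.99)                    # keep alpha in (0, 1)

# Hypothetical 1x1 "images": strongly red tissue vs. fat-covered tissue.
reddish = np.array([[[200.0, 50.0, 50.0]]])
pale = np.array([[[100.0, 80.0, 80.0]]])
a_red = color_tone_coefficient(reddish, 50.0)   # small Z -> small alpha, strong dimming
a_pale = color_tone_coefficient(pale, 50.0)     # large Z -> alpha near 1, weak dimming
```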
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Optics & Photonics (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Signal Processing (AREA)
- Architecture (AREA)
- Endoscopes (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
Abstract
Description
The present invention provides a fluorescence observation apparatus comprising: a light source that irradiates a subject with illumination light and excitation light; a return light image generation unit that generates a return light image obtained by imaging return light emitted from the subject when the illumination light is radiated from the light source; a fluorescence image generation unit that generates a fluorescence image obtained by imaging fluorescence emitted from the subject when the excitation light is radiated from the light source; a fluorescence detection unit that detects, from the fluorescence image generated by the fluorescence image generation unit, a fluorescent region having a gradation value equal to or greater than a predetermined gradation value threshold; a return light image adjustment unit that adjusts the gradation values of the return light image; a superimposed image generation unit that generates a superimposed image in which the return light image whose gradation values have been adjusted by the return light image adjustment unit and the fluorescence image are superimposed; and a coefficient setting unit that sets, based on the detection result of the fluorescent region by the fluorescence detection unit, an adjustment width of the gradation values of the return light image by the return light image adjustment unit, wherein the coefficient setting unit sets the adjustment width such that, when the fluorescent region is detected by the fluorescence detection unit, the gradation values of the return light image are reduced compared with when the fluorescent region is not detected.
In this way, the gradation values of the return light image can be adjusted solely by arithmetic processing, which simplifies the configuration of the apparatus.
In this way, when a fluorescent region is detected in the fluorescence image, the exposure amount of the return light on the image sensor is reduced, so that the return light image generation unit generates a return light image whose gradation values are reduced compared with the normal time. The return light image generated by the return light image generation unit can then be used as-is to generate the superimposed image, which simplifies the arithmetic processing.
In this way, when the gradation values of the fluorescent region are sufficiently large, sufficient visibility of the fluorescent region is obtained against the return light image. In this case, therefore, by making the reduction width of the gradation values of the return light image small, it is possible to prevent the image of the subject in the superimposed image from becoming unclear through the gradation values of the return light image being reduced more than necessary.
In this way, when the contrast between the hue of the subject in the return light image and the display color of the fluorescent region in the superimposed image is high, sufficient visibility of the fluorescent region is obtained against the return light image. In this case, therefore, by making the reduction width of the gradation values of the return light image small, it is possible to prevent the image of the subject in the superimposed image from becoming unclear through the gradation values of the return light image being reduced more than necessary.
In this way, it is possible to prevent the gradation values of the return light image from being unnecessarily reduced by noise in the fluorescence image being detected as a fluorescent region even though no fluorescence is emitted from the subject.
A fluorescence observation apparatus 100 according to a first embodiment of the present invention will be described below with reference to FIGS. 1 to 4.
The fluorescence observation apparatus 100 according to the present embodiment is an endoscope apparatus and, as shown in FIG. 1, includes an elongated insertion portion 2 inserted into the body; a light source 3; an illumination unit 4 that radiates excitation light and illumination light from the light source 3 toward an observation target (subject) X from the tip of the insertion portion 2; an imaging unit 5 provided at the tip of the insertion portion 2 that acquires image information S1 and S2 of the living tissue that is the observation target X; an image processing unit 6 disposed on the base end side of the insertion portion 2 that processes the image information S1 and S2 acquired by the imaging unit 5; and a monitor 7 that displays the image G3 processed by the image processing unit 6.
The image sensor 55 for white light and the image sensor 56 for fluorescence may be of different types as described above, or may be of the same type.
To observe living tissue in the body (the observation target X) using the fluorescence observation apparatus 100 according to the present embodiment, a fluorescent substance that accumulates in the lesion Y is administered to the observation target X in advance. Then, the insertion portion 2 is inserted into the body, and the tip of the insertion portion 2 is placed facing the observation target X. Next, by operating the light source 3, excitation light and illumination light are radiated from the tip of the insertion portion 2 onto the observation target X.
In the image processing unit 6, the white light image information S1 is input to the white light image generation unit 61 to generate the white light image G1, and the fluorescence image information S2 is input to the fluorescence image generation unit 62 to generate the fluorescence image G2 (step S1). Next, the fluorescence detection unit 63 determines whether a fluorescent region F exists in the fluorescence image G2 (step S2).
(First Modification)
As shown in FIG. 3, the fluorescence observation apparatus 100 according to the first modification of the present embodiment further includes a threshold setting unit 67 that sets the gradation value threshold Th of the fluorescence detection unit 63 based on the gradation values of the white light image G1.
As shown in FIG. 4, the fluorescence observation apparatus 100 according to the second modification of the present embodiment further includes a division unit 68 that divides the fluorescence image G2 by the white light image G1.
The superimposed image generation unit 66 generates the superimposed image G3 using the corrected fluorescence image G2′ instead of the fluorescence image G2. That is, the superimposed image generation unit 66 uses the gradation values of the corrected fluorescence image G2′ as FL in Expression (1).
Next, a fluorescence observation apparatus 200 according to a second embodiment of the present invention will be described with reference to FIGS. 5 to 7. In the present embodiment, configurations different from the first embodiment will be mainly described; configurations common to the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted.
As shown in FIG. 5, the fluorescence observation apparatus 200 according to the present embodiment differs from the first embodiment mainly in that it further includes a representative value calculation unit 69 that calculates a representative value n of the gradation values of the fluorescent region F detected by the fluorescence detection unit 63, and in that the coefficient setting unit 64 sets the coefficient α according to the representative value n calculated by the representative value calculation unit 69.
The representative value calculation unit 69 calculates, for example, an average value as the representative value n of the gradation values S4 of the pixels input from the fluorescence detection unit 63, and outputs the calculated representative value n to the coefficient setting unit 64.
According to the fluorescence observation apparatus 200 of the present embodiment, the processing when the fluorescent region F is detected in step S2 differs from the first embodiment. As shown in FIG. 7, when the fluorescent region F exists in the fluorescence image G2 (YES in step S2), the fluorescence detection unit 63 extracts the fluorescent region F (step S21). Next, the representative value calculation unit 69 calculates the representative value n of the gradation values of the extracted fluorescent region F (step S22). Next, the coefficient setting unit 64 sets the coefficient α according to the representative value n (step S23). The white light image adjustment unit 65 generates the adjusted image G1′ using the coefficient α set in step S23 (step S5).
In this way, there is an advantage that the observer can recognize how much the brightness of the adjusted image G1′ in the superimposed image G3 currently displayed on the monitor 7 is reduced compared with the normal white light image G1 (that is, when α = 1), and the diagnostic accuracy based on the superimposed image G3 can be improved.
Next, a fluorescence observation apparatus 300 according to a third embodiment of the present invention will be described with reference to FIGS. 8 to 10. In the present embodiment, configurations different from the first and second embodiments will be mainly described; configurations common to the first and second embodiments are denoted by the same reference numerals, and descriptions thereof are omitted.
According to the fluorescence observation apparatus 300 of the present embodiment, the processing after determining the presence or absence of the fluorescent region F in step S2 differs from the first and second embodiments. As shown in FIG. 10, when the fluorescent region F is not detected in step S2 (NO in step S2), the aperture diameter of the aperture stop 8 is adjusted to the predetermined diameter φ (steps S31 and S36). As a result, the white light image G1 generated next is an image having normal brightness.
Next, a fluorescence observation apparatus 400 according to a fourth embodiment of the present invention will be described with reference to FIG. 11. In the present embodiment, configurations different from the first to third embodiments will be mainly described; configurations common to the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted.
Next, a fluorescence observation apparatus 500 according to a fifth embodiment of the present invention will be described with reference to FIGS. 12 and 13. In the present embodiment, configurations different from the first to fourth embodiments will be mainly described; configurations common to the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted.
In the present embodiment, the white light image generation unit 61 outputs the white light image G1 to the coefficient setting unit 64 in addition to the white light image adjustment unit 65.
Prior to setting the coefficient α, the coefficient setting unit 64 determines the color tone of the white light image G1 input from the white light image generation unit 61.
The ratio Z described above represents the contrast of the hue of the living tissue with respect to green, the display color of the fluorescent region F. Specifically, when the living tissue is strongly reddish, mainly due to blood, the ratio Z and the coefficient α become small, and the reduction width of the gradation values of the white light image G1 becomes large. Conversely, when the redness of the living tissue is weak, for example because its surface is covered with fat, the ratio Z and the coefficient α become large, and the reduction width of the gradation values of the white light image G1 becomes small.
2 Insertion portion
3 Light source
31, 311 Xenon lamp
32, 321, 322 Filter
33 Coupling lens
34 Beam splitter
41 Light guide fiber
42 Illumination optical system
51 Objective lens
52 Dichroic mirror
53, 54 Condenser lens
55, 56 Image sensor
57 Excitation light cut filter
4 Illumination unit
5 Imaging unit
6 Image processing unit
61 White light image generation unit (return light image generation unit)
62 Fluorescence image generation unit
63 Fluorescence detection unit
64 Coefficient setting unit
65 White light image adjustment unit (return light image adjustment unit)
66 Superimposed image generation unit
67 Threshold setting unit
68 Division unit
69 Representative value calculation unit
70 Exposure amount control unit (return light image adjustment unit)
71 Light amount control unit (exposure amount control unit, return light image adjustment unit)
7 Monitor
8, 81 Aperture stop
Claims (7)
- A fluorescence observation apparatus comprising: a light source that irradiates a subject with illumination light and excitation light; a return light image generation unit that generates a return light image obtained by imaging return light emitted from the subject when the illumination light is radiated from the light source; a fluorescence image generation unit that generates a fluorescence image obtained by imaging fluorescence emitted from the subject when the excitation light is radiated from the light source; a fluorescence detection unit that detects, from the fluorescence image generated by the fluorescence image generation unit, a fluorescent region having a gradation value equal to or greater than a predetermined gradation value threshold; a return light image adjustment unit that adjusts the gradation values of the return light image; a superimposed image generation unit that generates a superimposed image in which the return light image whose gradation values have been adjusted by the return light image adjustment unit and the fluorescence image are superimposed; and a coefficient setting unit that sets, based on the detection result of the fluorescent region by the fluorescence detection unit, an adjustment width of the gradation values of the return light image by the return light image adjustment unit, wherein the coefficient setting unit sets the adjustment width such that, when the fluorescent region is detected by the fluorescence detection unit, the gradation values of the return light image are reduced compared with when the fluorescent region is not detected.
- The fluorescence observation apparatus according to claim 1, wherein the coefficient setting unit sets a coefficient greater than 0 and less than 1, and the return light image adjustment unit multiplies the gradation values of the return light image by the coefficient set by the coefficient setting unit when the fluorescent region is detected by the fluorescence detection unit.
- The fluorescence observation apparatus according to claim 1, further comprising an image sensor that images the return light, wherein the coefficient setting unit sets a reduction width of the exposure amount of the return light on the image sensor, and the return light image adjustment unit comprises an exposure amount control unit that reduces the exposure amount of the image sensor according to the reduction width set by the coefficient setting unit when the fluorescent region is detected by the fluorescence detection unit.
- The fluorescence observation apparatus according to claim 3, wherein the exposure amount control unit controls at least one of the amount of the return light incident on the image sensor, the exposure time of the image sensor, and the amount of the illumination light radiated onto the subject.
- The fluorescence observation apparatus according to any one of claims 1 to 4, wherein the coefficient setting unit makes the reduction width of the gradation values of the return light image smaller as the gradation values of the fluorescent region detected by the fluorescence detection unit are larger.
- The fluorescence observation apparatus according to any one of claims 1 to 5, wherein the coefficient setting unit changes the adjustment width of the gradation values of the return light image according to the color tone of the return light image.
- The fluorescence observation apparatus according to any one of claims 1 to 6, wherein the return light image adjustment unit reduces the gradation values of the return light image only when the number of pixels constituting the fluorescent region detected by the fluorescence detection unit is equal to or greater than a predetermined pixel number threshold.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14838502.4A EP3037030A4 (en) | 2013-08-23 | 2014-07-14 | Fluorescence observation apparatus |
JP2015532761A JP6461797B2 (ja) | 2013-08-23 | 2014-07-14 | 蛍光観察装置 |
CN201480045985.8A CN105473049B (zh) | 2013-08-23 | 2014-07-14 | 荧光观察装置 |
US15/046,860 US9839359B2 (en) | 2013-08-23 | 2016-02-18 | Fluorescence observation apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013173111 | 2013-08-23 | ||
JP2013-173111 | 2013-08-23 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/046,860 Continuation US9839359B2 (en) | 2013-08-23 | 2016-02-18 | Fluorescence observation apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015025640A1 true WO2015025640A1 (ja) | 2015-02-26 |
Family
ID=52483422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/068701 WO2015025640A1 (ja) | 2013-08-23 | 2014-07-14 | 蛍光観察装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9839359B2 (ja) |
EP (1) | EP3037030A4 (ja) |
JP (1) | JP6461797B2 (ja) |
CN (1) | CN105473049B (ja) |
WO (1) | WO2015025640A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017148503A (ja) * | 2016-02-15 | 2017-08-31 | ライカ インストゥルメンツ (シンガポール) プライヴェット リミテッドLeica Instruments (Singapore) Pte. Ltd. | 擬似色を用いた顕微鏡又は内視鏡などの医用検査装置 |
JP2017221351A (ja) * | 2016-06-14 | 2017-12-21 | Hoya株式会社 | 電子内視鏡システム |
WO2018123613A1 (ja) * | 2016-12-28 | 2018-07-05 | ソニー株式会社 | 医療用画像処理装置、医療用画像処理方法、プログラム |
WO2018159288A1 (ja) * | 2017-02-28 | 2018-09-07 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
EP3272265A4 (en) * | 2015-03-17 | 2019-01-02 | Hamamatsu Photonics K.K. | Superimposed image creation apparatus and superimposed image creation method |
WO2019123827A1 (ja) * | 2017-12-19 | 2019-06-27 | オリンパス株式会社 | 内視鏡システム及び内視鏡プロセッサ |
JP2020524572A (ja) * | 2017-06-22 | 2020-08-20 | ヘルムホルツ ツェントラム ミュンヘン ドイチェス フォーシュングスツェントラム フュール ゲズントハイト ウント ウンヴェルト ゲーエムベーハーHelmholtz Zentrum Muenchen Deutsches Forschungszentrum Fuer Gesundheit Und Umwelt Gmbh | 内視鏡イメージングのためのシステム、および、画像を処理するための方法 |
JP6867533B1 (ja) * | 2020-05-20 | 2021-04-28 | 株式会社アルス | 光源装置 |
JP2021517847A (ja) * | 2018-04-25 | 2021-07-29 | 上海凱利泰医療科技股▲ふん▼有限公司Shanghai Kinetic Medical Co., Ltd | 画像処理システム、蛍光内視鏡照明撮像装置及び撮像方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11307430B2 (en) * | 2016-06-07 | 2022-04-19 | Karl Storz Se & Co. Kg | Optical device and method for providing improved depth of field and resolution modes |
US10805553B2 (en) * | 2017-04-14 | 2020-10-13 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method |
CN110709001B (zh) * | 2017-06-12 | 2022-09-23 | 索尼公司 | 医疗***、医疗设备和控制方法 |
CN109567935A (zh) * | 2018-12-07 | 2019-04-05 | 中聚科技股份有限公司 | 一种结合肿瘤细胞检测的激光治疗*** |
US11872355B2 (en) * | 2020-09-04 | 2024-01-16 | Covidien Lp | Medical device for detecting fluid parameters using fluorescent probes |
CN114098611B (zh) * | 2021-10-08 | 2022-09-13 | 武汉迈瑞医疗技术研究院有限公司 | 一种内窥镜***及其成像调节方法 |
EP4275590A1 (en) * | 2022-05-13 | 2023-11-15 | Leica Instruments (Singapore) Pte Ltd | Data processing device and computer-implemented method combining two images and an overlay color using a uniform color space |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07222712A (ja) * | 1994-02-10 | 1995-08-22 | Olympus Optical Co Ltd | 蛍光内視鏡装置 |
JP2003010101A (ja) | 2001-04-27 | 2003-01-14 | Fuji Photo Film Co Ltd | 内視鏡装置の撮像方法および装置 |
JP2005348902A (ja) * | 2004-06-09 | 2005-12-22 | Olympus Corp | 内視鏡装置 |
JP2007111328A (ja) * | 2005-10-21 | 2007-05-10 | Pentax Corp | 電子内視鏡装置 |
JP2009226065A (ja) * | 2008-03-24 | 2009-10-08 | Olympus Corp | カプセル型医療装置とその作動方法およびカプセル型医療装置システム |
JP2011110216A (ja) * | 2009-11-26 | 2011-06-09 | Olympus Corp | 蛍光観察装置 |
WO2011135992A1 (ja) | 2010-04-28 | 2011-11-03 | オリンパス株式会社 | 画像処理装置および蛍光観察装置 |
JP2012070935A (ja) * | 2010-09-28 | 2012-04-12 | Fujifilm Corp | 内視鏡画像表示装置 |
JP2013056001A (ja) * | 2011-09-07 | 2013-03-28 | Olympus Corp | 蛍光観察装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5749830A (en) * | 1993-12-03 | 1998-05-12 | Olympus Optical Co., Ltd. | Fluorescent endoscope apparatus |
JP3923595B2 (ja) | 1997-05-13 | 2007-06-06 | オリンパス株式会社 | 蛍光観察装置 |
JP6005918B2 (ja) * | 2011-09-08 | 2016-10-12 | オリンパス株式会社 | 蛍光観察装置 |
JP5993184B2 (ja) * | 2012-04-04 | 2016-09-14 | オリンパス株式会社 | 蛍光観察装置および蛍光観察装置の作動方法 |
JP6391562B2 (ja) * | 2013-03-29 | 2018-09-19 | オリンパス株式会社 | 蛍光観察装置 |
-
2014
- 2014-07-14 EP EP14838502.4A patent/EP3037030A4/en not_active Withdrawn
- 2014-07-14 CN CN201480045985.8A patent/CN105473049B/zh not_active Expired - Fee Related
- 2014-07-14 JP JP2015532761A patent/JP6461797B2/ja active Active
- 2014-07-14 WO PCT/JP2014/068701 patent/WO2015025640A1/ja active Application Filing
-
2016
- 2016-02-18 US US15/046,860 patent/US9839359B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3037030A4 |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3272265A4 (en) * | 2015-03-17 | 2019-01-02 | Hamamatsu Photonics K.K. | Superimposed image creation apparatus and superimposed image creation method |
US10223779B2 (en) | 2015-03-17 | 2019-03-05 | Hamamatsu Photonics K.K. | Superimposed image creation apparatus and superimposed image creation method |
JP2017148503A (ja) * | 2016-02-15 | 2017-08-31 | ライカ インストゥルメンツ (シンガポール) プライヴェット リミテッドLeica Instruments (Singapore) Pte. Ltd. | 擬似色を用いた顕微鏡又は内視鏡などの医用検査装置 |
JP2017221351A (ja) * | 2016-06-14 | 2017-12-21 | Hoya株式会社 | 電子内視鏡システム |
US11004197B2 (en) | 2016-12-28 | 2021-05-11 | Sony Corporation | Medical image processing apparatus, medical image processing method, and program |
WO2018123613A1 (ja) * | 2016-12-28 | 2018-07-05 | ソニー株式会社 | 医療用画像処理装置、医療用画像処理方法、プログラム |
WO2018159288A1 (ja) * | 2017-02-28 | 2018-09-07 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JPWO2018159288A1 (ja) * | 2017-02-28 | 2019-12-19 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP7127637B2 (ja) | 2017-02-28 | 2022-08-30 | ソニーグループ株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP2020524572A (ja) * | 2017-06-22 | 2020-08-20 | ヘルムホルツ ツェントラム ミュンヘン ドイチェス フォーシュングスツェントラム フュール ゲズントハイト ウント ウンヴェルト ゲーエムベーハーHelmholtz Zentrum Muenchen Deutsches Forschungszentrum Fuer Gesundheit Und Umwelt Gmbh | 内視鏡イメージングのためのシステム、および、画像を処理するための方法 |
WO2019123827A1 (ja) * | 2017-12-19 | 2019-06-27 | Olympus Corporation | Endoscope system and endoscope processor |
JPWO2019123827A1 (ja) * | 2017-12-19 | 2020-07-09 | Olympus Corporation | Endoscope system and endoscope processor |
US10951800B2 (en) | 2017-12-19 | 2021-03-16 | Olympus Corporation | Endoscope system and endoscope processor |
JP2021517847A (ja) * | 2018-04-25 | 2021-07-29 | Shanghai Kinetic Medical Co., Ltd. | Image processing system, fluorescence endoscope illumination imaging apparatus, and imaging method |
JP7140464B2 (ja) | 2018-04-25 | 2022-09-21 | Shanghai Kinetic Medical Co., Ltd. | Image processing system, fluorescence endoscope illumination imaging apparatus, and imaging method |
WO2021145019A1 (ja) * | 2020-05-20 | 2021-07-22 | 株式会社アルス | Light source device |
JP2021182978A (ja) * | 2020-05-20 | 2021-12-02 | 株式会社アルス | Light source device |
JP6867533B1 (ja) * | 2020-05-20 | 2021-04-28 | 株式会社アルス | Light source device |
Also Published As
Publication number | Publication date |
---|---|
US20160157722A1 (en) | 2016-06-09 |
CN105473049A (zh) | 2016-04-06 |
US9839359B2 (en) | 2017-12-12 |
EP3037030A1 (en) | 2016-06-29 |
CN105473049B (zh) | 2018-07-20 |
JPWO2015025640A1 (ja) | 2017-03-02 |
EP3037030A4 (en) | 2017-04-19 |
JP6461797B2 (ja) | 2019-01-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6461797B2 (ja) | Fluorescence observation apparatus | |
US9906739B2 (en) | Image pickup device and image pickup method | |
US9513221B2 (en) | Fluorescence observation apparatus | |
JP5814698B2 (ja) | Automatic exposure control device, control device, endoscope device, and operating method of endoscope device | |
US20200337540A1 (en) | Endoscope system | |
US9052286B2 (en) | Fluorescence endoscope apparatus | |
JP5356191B2 (ja) | Fluorescence observation apparatus | |
JP6391562B2 (ja) | Fluorescence observation apparatus | |
JP5993184B2 (ja) | Fluorescence observation apparatus and operating method of fluorescence observation apparatus | |
JP2007075198A (ja) | Electronic endoscope system | |
JP6072374B2 (ja) | Observation apparatus | |
WO2018211885A1 (ja) | Image acquisition system, control device, and image acquisition method | |
WO2011135992A1 (ja) | Image processing apparatus and fluorescence observation apparatus | |
US20150173595A1 (en) | Imaging apparatus | |
JP6203452B1 (ja) | Imaging system | |
JP4245787B2 (ja) | Fluorescence image acquisition method and apparatus | |
JP2001154232A (ja) | Photometry device | |
WO2023090044A1 (ja) | Processor for electronic endoscope and electronic endoscope system | |
US20230164287A1 (en) | Imaging method for imaging a scene and a system therefor | |
JP6335776B2 (ja) | Endoscope system | |
JP2019000148A (ja) | Endoscope system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 201480045985.8; Country of ref document: CN |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14838502; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2015532761; Country of ref document: JP; Kind code of ref document: A |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWE | Wipo information: entry into national phase | Ref document number: 2014838502; Country of ref document: EP |