WO2013031701A1 - Endoscope apparatus - Google Patents
- Publication number
- WO2013031701A1 (PCT/JP2012/071496)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- brightness
- illumination
- light
- illumination light
- band
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
- G02B23/2469—Illumination using optical fibres
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B2207/00—Coding scheme for general features or characteristics of optical elements and systems of subclass G02B, but not including elements and systems which would be classified in G02B6/00 and subgroups
- G02B2207/113—Fluorescence
Definitions
- the present invention relates to an endoscope apparatus suitable for narrowband light observation.
- a medical endoscope requires a light source device that illuminates the inside of the living body because the site to be observed is inside the living body. Illumination light generated by the light source device is irradiated to the observation target tissue from the distal end portion where the imaging unit is located through a light guide inserted through the insertion portion of the endoscope.
- as observation with an endoscope, normal light observation using visible light (white light observation: WLI) is widely performed.
- in white light observation, light from a white light source is transmitted through a rotary filter to sequentially irradiate tissue in a body cavity with illumination light of the three colors R, G, and B. The reflected-light images corresponding to the three colors R, G, and B are acquired in a time-division manner, and a color image for normal light observation is generated from these reflected-light images.
- WO 2010/131620 (hereinafter, Document 1) discloses a field-sequential imaging device for performing narrow-band light observation (NBI) as special light observation.
- Narrowband light observation focuses on the use of light that is strongly absorbed by blood and strongly reflected / scattered by the mucosal surface layer in order to observe blood vessels with high contrast.
- the invention of Document 1 is configured such that green narrow band light G and two blue narrow band lights B1 and B2 can be sequentially irradiated.
- narrowband light observation is performed using a narrowband light observation image created from a reflected light image (narrowband image) corresponding to the narrowband light G, B1, and B2.
- the composition ratio is determined based on the average brightness of the captured image.
- however, the average brightness of the captured image does not necessarily match the brightness perceived by a human observer.
- as a result, dimming control that matches the operator's perception is not achieved.
- the narrow band lights B1 and B2 are irradiated to the living tissue with a time lag. Therefore, when the images based on the narrow band lights B1 and B2 are synthesized, there is a problem that the synthesized image may be blurred.
- An endoscope apparatus according to one aspect of the present invention includes: an illumination means that, within a predetermined time, performs illumination with illumination light of a first band and performs illumination with illumination light of a second band at least a first number of times; an imaging means that outputs a first captured image based on the illumination with the illumination light of the first band and a second captured image based on the illumination with the illumination light of the second band; a brightness calculating means that calculates a first brightness by color conversion matrix processing using a first imaging signal based on the illumination with the illumination light of the first band and a second imaging signal based on a first predetermined one of the first number of illuminations, and calculates a second brightness using the first imaging signal and a second imaging signal based on an illumination other than the first predetermined one; and a synthesizing unit that multiplies the first and second imaging signals that are the basis of the second brightness by a coefficient based on the ratio between the difference between the first brightness and a target brightness and the second brightness, and then synthesizes them with the first and second imaging signals that are the basis of the first brightness.
- FIG. 1 is a block diagram showing an endoscope apparatus according to a first embodiment of the present invention; explanatory diagrams describe the brightness detection processing of the captured image under each illumination light in the brightness calculation processing unit 44 of FIG. 1, and graphs describe the weight that changes with the observation mode.
- a block diagram shows a second embodiment of the present invention, and a block diagram shows a general circuit that performs white balance adjustment.
- a timing chart shows the conversion from interlaced to progressive scanning, and an explanatory diagram describes the effect.
- FIG. 1 is a block diagram showing an endoscope apparatus according to a first embodiment of the present invention.
- an endoscope apparatus 1 includes an endoscope 2 for observing the inside of a living body as a subject, a light source device 3 that emits narrow-band illumination light for observing the inside of the living body, and an image processing device 4 that performs signal processing on the imaging signal captured under the narrow-band illumination light.
- the narrow band image generated by the image processing device 4 is supplied to the monitor 5.
- as the monitor 5, a normal color monitor can be adopted. That is, the monitor 5 has an RGB input terminal (not shown), and R, G, and B image signals are supplied to this terminal for color display.
- the endoscope 2 has a flexible insertion portion 21 with an outer diameter that allows insertion into a body cavity; a light guide fiber 26 made of quartz fiber or the like, which guides the light emitted from the light source device 3, is inserted through the insertion portion 21.
- One end of the light guide fiber 26 is connected to a connector 27 that is detachably connected to the light source device 3.
- the other end of the light guide fiber 26 is disposed in the vicinity of the illumination lens 23 provided at the distal end portion 22 at the distal end of the insertion portion 21.
- the connector 27 is connected to the light source device 3 and also to the image processing device 4 described later.
- the illumination light from the light source device 3 is guided to the distal end portion 22 of the insertion portion 21 by the light guide fiber 26, is diffused by the illumination lens 23, and is irradiated onto the subject.
- an objective lens 24 for forming an optical image of the subject from the return light of the subject, and a CCD (charge coupled device) 25 as an imaging element disposed at the imaging position of the objective lens 24, are provided at the distal end portion 22.
- the CCD 25, which constitutes the imaging means, is driven by a CCD drive circuit (not shown) provided in the image processing device 4; it images the subject, converts the captured optical image of the subject into a video signal, and outputs the video signal to the image processing device 4.
- the light source device 3 includes a light source 31 constituted by a xenon lamp or the like.
- the light source 31 emits light in a wavelength band close to white light.
- a narrow band filter 32, a rotary filter 33 and a diaphragm 34 are disposed on the irradiation light path of the light source 31.
- the narrow band filter 32 narrows the band of light emitted from the light source 31 and emits it to the rotary filter 33.
- the rotary filter 33 limits the band of light that has passed through the narrow band filter 32 to a wavelength band necessary for narrow band light observation.
- the diaphragm 34 adjusts the amount of light by limiting the amount of light that has passed through the rotary filter 33.
- the opening amount of the diaphragm 34 is controlled by a dimming control unit 49 described later.
- FIG. 2 is an explanatory diagram showing an example of the rotary filter 33.
- the rotary filter 33 has a disk shape, and three openings are provided at equal angles in the circumferential direction, and filters 33G, 33B1, and 33B2 are attached to the three openings, respectively.
- the filter 33G has a green (G) wavelength band as a transmission band
- the filters 33B1 and 33B2 have a blue (B) wavelength band as a transmission band.
- the filter 33G transmits narrow-band G illumination light of 530-550 nm centered at 540 nm, and the filter 33B1 transmits, for example, narrow-band B illumination light of 400-430 nm centered at 415 nm (hereinafter referred to as B1 illumination light).
- the filter 33B2 likewise transmits, for example, narrow-band B illumination light of 400-430 nm centered at 415 nm (hereinafter referred to as B2 illumination light).
- the B1 illumination light and the B2 illumination light transmitted through the filters 33B1 and 33B2 thus have the same wavelength band.
- the center of the rotary filter 33 is attached to a rotary shaft of a rotary motor (not shown) and is driven to rotate.
- An encoder (not shown) is attached to the rotation shaft of the rotation motor, and the rotation of the rotation motor, that is, the rotation of the rotary filter 33 can be detected by the encoder.
- the image processing device 4 described later controls the rotation of the rotation motor so that the rotation speed of the rotary filter 33 is constant.
- since the light source device 3 images the subject using narrow-band illumination light, the amount of illumination light tends to be insufficient compared with the broadband illumination light in ordinary wide use.
- moreover, owing to the optical transmission characteristics of the light guide fiber 26, transmission loss tends to be larger on the short-wavelength B side, so the amount of B illumination light emitted as illumination light from the illumination lens 23 of the distal end portion 22 tends to be small.
- two filters 33B1 and 33B2 having the same transmission characteristics are therefore arranged in the circumferential direction of the rotary filter 33; using these two filters, each rotation of the rotary filter 33 irradiates the same part of the observed subject with B illumination light twice, and imaging based on the return light of the B illumination light is performed twice.
- the rotation filter 33 is rotated once in a 1.5 frame period, and imaging with B illumination light is performed twice.
- the brightness of the captured image (B captured image) based on the B illumination light is improved by combining the two captured images.
- in the present embodiment, an example is described in which imaging with the G illumination light is performed once and imaging with the B illumination light is performed twice in the 1.5-frame period; however, the period and the number of imaging operations with each color of narrow-band light can be set as appropriate.
- the B1 captured image based on the return light of the narrow-band B1 illumination light and the B2 captured image based on the return light of the B2 illumination light are temporally shifted, so there is a concern that image quality may deteriorate when these images are combined. Therefore, in the present embodiment, no synthesis is performed when a captured image with sufficient brightness can be obtained from only one of the narrow-band B1 and B2 illumination lights. When sufficient brightness cannot be obtained from one alone, the captured image based on the other B illumination light is synthesized in accordance with the brightness. A captured image with sufficient brightness is thus obtained while suppressing image-quality degradation.
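The decision above can be sketched as a simple threshold test; this is a minimal illustration of the embodiment's behavior, and the function name is ours, not the patent's:

```python
def needs_synthesis(master_brightness: float, target_brightness: float) -> bool:
    """Return True only when the master image (G + B1) alone is too dark,
    so the temporally shifted B2 image must be blended in.
    A minimal sketch of the embodiment's decision, not the literal claim text."""
    return master_brightness < target_brightness
```

When the master image already meets the target, the B2 image is never blended, so the blur caused by the timing gap between B1 and B2 cannot appear.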
- the image processing device 4 obtains the brightness of the captured image by color conversion matrix processing, thereby performing brightness control according to the operator's sense.
- the image processing apparatus 4 has an analog processing unit 41.
- the analog processing unit 41 performs predetermined analog signal processing such as amplification processing on the video signal from the endoscope 2 and outputs it to the A / D converter 42.
- the A / D converter 42 converts the output of the analog processing unit 41 into a digital signal, and then outputs the digital signal to the digital processing unit 43.
- the CCD 25 of the endoscope 2 outputs a G captured image based on the return light of the G illumination light as a G signal, a B1 captured image based on the return light of the B1 illumination light as a B1 signal, and a B2 captured image based on the return light of the B2 illumination light as a B2 signal.
- the video signal from the endoscope 2 includes these G signal, B1 signal, and B2 signal.
- the synchronization control unit 40 stores the G signal, B1 signal, and B2 signal in the synchronization memory 40a, which holds them as the R, G, and B images. The synchronization memory 40a stores, for example, 30 frames each of the G, B1, and B2 signals. The G, B1, and B2 signals are then read from the synchronization memory 40a so that color shift is minimized.
- the brightness calculation processing unit 44 calculates the brightness of the captured image in order every 0.5 frames based on the G signal, the B1 signal, and the B2 signal before recording in the synchronization memory 40a.
- that is, the brightness calculation processing unit 44 obtains the brightness by performing the same matrix processing as that performed when displaying on the monitor 5.
- in the present embodiment, illumination by the G illumination light and the B1 illumination light is used as master illumination that is always used for imaging, and illumination by the G illumination light and the B2 illumination light is used as slave illumination that is employed supplementarily when the image is dark.
- to obtain the brightness of the captured image under the master illumination, the brightness calculation processing unit 44 obtains the luminance Y1 using matrix processing by the matrix processing unit 52 based on the G signal and the B1 signal; similarly, to obtain the brightness of the captured image under the slave illumination, it obtains the luminance Y2 using the matrix processing by the matrix processing unit 52 based on the G signal and the B2 signal.
- FIG. 3 is a block diagram showing a specific configuration of the brightness calculation processing unit 44.
- the average value calculation unit 50 calculates the /B1 signal, /G signal, and /B2 signal, which are the average values of the signals corresponding to the R, G, and B images, respectively.
- the R image brightness calculation unit 51R, the G image brightness calculation unit 51G, and the B image brightness calculation unit 51B obtain brightness by using the / B1 signal, the / G signal, and the / B2 signal, respectively.
- the B1 signal is used as the R image signal
- the G signal is used as the G image signal
- the B2 signal is used as the B image signal.
- the brightness calculation units 51R, 51G, and 51B hold the / B1 signal, the / G signal, and the / B2 signal, and then output them to the matrix processing unit 52 as Rf, Gf, and Bf.
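A sketch of what the average value calculation unit 50 computes, under the assumption that each "average" is a plain mean over the pixels of the corresponding captured image (the function name is illustrative):

```python
import numpy as np

def channel_averages(b1_img: np.ndarray, g_img: np.ndarray,
                     b2_img: np.ndarray) -> tuple[float, float, float]:
    """Compute the /B1, /G, and /B2 signals as per-image pixel means.
    These become the Rf, Gf, Bf inputs held by the brightness calculation
    units 51R, 51G, 51B and passed to the matrix processing unit 52."""
    return float(b1_img.mean()), float(g_img.mean()), float(b2_img.mean())
```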
- the matrix processing unit 52 performs color conversion of the input R, G, and B image signals by the matrix calculation of the following equation (1).
- the matrix processing unit 52 performs matrix processing on each of the master illumination and the slave illumination by the matrix calculation of the following equation (1).
- α, β, and γ are matrix coefficients.
- the outputs Rm, Gm, and Bm obtained by the matrix processing for the master illumination or by the matrix processing for the slave illumination are supplied to the luminance calculation unit 53, which obtains the luminance from these three outputs.
- the coefficient used here is the same as the corresponding matrix coefficient in equation (6) described later.
- Gt = (1 + a)G (a term of equation (5) described later).
- the brightness calculation processing unit 44 outputs the obtained luminances Y1 and Y2 and the difference ΔY1 to the dimming control unit 49 as brightness information.
- a matrix processing unit 46 which will be described later, generates RGB image signal components from a G captured image and a B captured image obtained by imaging using narrowband light by matrix processing (color conversion matrix processing).
- the matrix processing in the matrix processing unit 52 of the brightness calculation processing unit 44 is the same as the matrix processing in the matrix processing unit 46. That is, the matrix calculation by the brightness calculation processing unit 44 yields signals corresponding to the R, G, and B inputs of the monitor 5, so the luminance it obtains corresponds to the brightness of the image displayed on the monitor, that is, the brightness the surgeon perceives when observing the monitor 5.
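As a concrete sketch: the luminance used for dimming is obtained by first applying the same color-conversion matrix as the display path, then weighting the resulting monitor R, G, B values. The matrix coefficients and the Rec. 601 luminance weights below are illustrative assumptions; the patent's equations (1)-(3) are not reproduced in this text.

```python
import numpy as np

# Illustrative color-conversion matrix (rows map Rf, Gf, Bf -> Rm, Gm, Bm);
# the real coefficients from equation (1) are not given in this text.
COLOR_MATRIX = np.array([
    [0.0, 0.3, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.2, 0.8],
])

def display_luminance(rf: float, gf: float, bf: float) -> float:
    """Apply the display-path matrix, then a Rec.601-style luminance sum,
    approximating the brightness the operator sees on the monitor."""
    rm, gm, bm = COLOR_MATRIX @ np.array([rf, gf, bf])
    return 0.299 * rm + 0.587 * gm + 0.114 * bm
```

Y1 would be computed this way from the master signals (/B1, /G) and Y2 from the slave signals, so both luminances track perceived monitor brightness rather than raw sensor averages.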
- the coefficients of the above equations (2) and (3) in the brightness calculation processing unit 44 can be changed according to the color tone desired for the narrow-band light observation image displayed on the monitor 5.
- the dimming control unit 49 controls the diaphragm 34 based on the input brightness information so that the target brightness is obtained. For example, when the brightness of the captured image is equal to or higher than the target value, the dimming control unit 49 outputs a dimming signal that reduces the opening amount of the diaphragm 34; when it is less than the target value, it outputs a dimming signal that opens the diaphragm 34.
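The aperture feedback just described might be sketched as follows; the step size and the normalized [0, 1] aperture range are assumptions for illustration only:

```python
def dimming_step(brightness: float, target: float, aperture: float,
                 step: float = 0.05) -> float:
    """One iteration of the dimming control: close the diaphragm when the
    image is at or above the target brightness, open it when below.
    Aperture is normalized to [0, 1] (fully closed to fully open)."""
    if brightness >= target:
        return max(0.0, aperture - step)
    return min(1.0, aperture + step)
```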
- the composition processing unit 45 synthesizes the captured image under the master illumination and the captured image under the slave illumination based on the composition ratio a. That is, when the composition ratio a is 0, the composition processing unit 45 outputs to the matrix processing unit 46 the captured image under the master illumination from the digital processing unit 43, i.e., a signal using only the G signal and the B1 signal. When the composition ratio a is not 0, the composition processing unit 45 synthesizes the signal of the image captured under the master illumination with the signal of the image captured under the slave illumination, i.e., the signal obtained from the G signal and the B2 signal, according to the composition ratio a.
- note that the G illumination light is much brighter than the B illumination light, so even when the composition ratio a becomes 1, there is no excessive multiplication, and an increase in noise can be suppressed.
- the following equation (5) expresses the composite signal obtained with the composition ratio a.
- Rin, Gin, and Bin in Equation (5) indicate inputs of an R image, a G image, and a B image, respectively, and are B2, G, and B1, respectively, in narrowband light observation in the present embodiment.
- Rt, Gt, and Bt in the equation (5) indicate the output of the R image, G image, and B image of the composite signal.
- the output of the R image is B2, but the output of the R image supplied to the monitor 5 by matrix processing described later is substantially zero.
- the synthesis processing unit 45 obtains a synthesized signal by, for example, the calculation of equation (5) and outputs the obtained synthesized signal to the matrix processing unit 46.
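Equation (5) itself does not survive in this extraction. A plausible reconstruction, consistent with the inputs Rin = B2, Gin = G, Bin = B1, with the surviving fragment Gt = (1 + a)G, and with the statement that a = 0 yields the master image alone, is sketched below; treat the exact form as an assumption, not the patent's verbatim formula:

```python
def composite_signal(b2: float, g: float, b1: float, a: float):
    """Hypothetical form of equation (5): blend the slave image (B2 channel)
    into the master image (G, B1) with composition ratio a in [0, 1].
    With a = 0 the output is the master image only; with a = 1 the G channel
    is doubled rather than over-multiplied, matching the text's noise remark."""
    rt = a * b2          # Rin = B2; driven toward zero by the later matrix
    gt = (1.0 + a) * g   # surviving fragment: Gt = (1 + a)G
    bt = b1              # Bin = B1
    return rt, gt, bt
```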
- the matrix processing unit 46 obtains a signal corresponding to the RGB input of the monitor 5 by matrix processing.
- the following formula (6) shows an example of matrix processing by the matrix processing unit 46.
- α, β, and γ are matrix coefficients, and Rout, Gout, and Bout indicate the outputs of the R, G, and B images after the matrix processing.
- α, β, and γ can be changed according to the color tone desired for narrow-band light observation; they are set, for example, in the range 0.7 to 1.5 and selected from a plurality of candidates so as to be neither too large nor too small.
- the D / A converter 47 converts the output of the matrix processing unit 46 into an analog signal and outputs it to the monitor 5. That is, Rout, Gout, and Bout in the expression (6) are given to the monitor 5 as RGB inputs.
- the monitor 5 displays the captured image in color according to the input RGB input. In this way, narrow-band light observation can be performed on the display screen of the monitor 5.
- when using the endoscope apparatus 1, the surgeon connects the connector 27 of the endoscope 2 to the light source device 3 and the image processing device 4, thereby obtaining the connection state shown in FIG. 1.
- the surgeon operates a power switch (not shown) so that the light source device 3, the image processing device 4, and the monitor 5 are in an operating state, and performs an operation for narrowband light observation.
- the light emitted from the light source 31 is converted into narrow-band G illumination light, B1 illumination light, and B2 illumination light by the narrow-band filter 32 and the rotary filter 33, and after the brightness is adjusted by the diaphragm 34, the light is supplied to the endoscope 2.
- each illumination light is guided through the light guide fiber 26 and irradiated from the illumination lens 23 toward the subject sequentially and substantially continuously, for example with a period of 1/20 second.
- the CCD 25 captures an optical image by the return light from the part.
- the G signal, the B1 signal, and the B2 signal corresponding to the return lights of the G illumination light, the B1 illumination light, and the B2 illumination light are obtained by the photoelectric conversion of the CCD 25.
- Video signals including the G signal, the B1 signal, and the B2 signal are given from the endoscope 2 to the image processing device 4.
- the B1 signal and the B2 signal are obtained by imaging with the same exposure amount using illumination light of the same wavelength band; they are obtained under substantially the same conditions except for a short timing shift within one frame.
- the video signal input to the image processing device 4 is subjected to predetermined analog processing by the analog processing unit 41 and then converted to a digital signal by the A / D converter 42.
- the digital video signal from the A / D converter 42 is separated into a G signal, a B1 signal, and a B2 signal in the digital processing unit 43 and stored in the synchronization memory 40a.
- the brightness calculation processing unit 44 receives the G signal, B1 signal, and B2 signal read from the synchronization memory 40a, and calculates the luminance Y1 under the master illumination and the luminance Y2 under the slave illumination using the matrix processing of the matrix processing unit 52.
- the dimming control unit 49 obtains the difference ΔY1 between the target luminance Ys and the luminance Y1, and obtains the composition ratio a.
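The text only states that a is derived from the difference between the target luminance Ys and Y1 and from a ratio involving Y2. A minimal clamped form consistent with a = 0 whenever Y1 ≥ Ys is sketched here, as an assumption:

```python
def composition_ratio(y1: float, y2: float, ys: float) -> float:
    """Hypothetical composition ratio a: zero when the master luminance Y1
    already reaches the target Ys, otherwise the brightness shortfall
    divided by the slave luminance Y2, clamped to [0, 1]."""
    if y1 >= ys or y2 <= 0.0:
        return 0.0
    return min(1.0, (ys - y1) / y2)
```

With this form, synthesis only adds as much of the slave image as the brightness shortfall requires, which is the "minimum necessary synthesis" behavior described below.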
- the luminances Y1 and Y2 are calculated using matrix processing, and show the same brightness as the brightness when displayed on the monitor 5.
- the composition ratio a is supplied to the composition processing unit 45, and the composition processing unit 45 synthesizes the image captured by the slave illumination with the image captured by the master illumination by a ratio based on the composition ratio a.
- the synthesis processing unit 45 obtains a synthesized signal using the above equation (5).
- as shown in equation (5), when the composition ratio a is 0, that is, when the luminance Y1 is equal to or higher than the target luminance Ys, the captured image under the slave illumination is not synthesized. In this case, therefore, the composite image based on the composite signal is neither blurred nor degraded in image quality.
- conversely, when the luminance Y1 is lower than the target luminance Ys, synthesis is performed at a ratio corresponding to the composition ratio a, i.e., only the minimum synthesis necessary to obtain the required brightness, so degradation of the image quality of the composite image can be suppressed.
- the composite signal from the composite processing unit 45 is given to the matrix processing unit 46 and subjected to matrix processing, and R, G, and B image signals in the display system are obtained.
- the output of the matrix processing unit 46 is returned to an analog signal by the D / A converter 47 and then supplied to the monitor 5. In this way, a narrow-band light observation image with sufficient brightness and suppressed image quality deterioration is displayed on the display screen of the monitor 5.
- as described above, in the present embodiment, the same narrow-band illumination light is irradiated onto the same observed site multiple times within a predetermined period such as one frame period, and the resulting images are synthesized, thereby improving image brightness in narrow-band light observation. In this case, obtaining the brightness of the image using matrix processing makes it possible to detect a brightness corresponding to that of the image actually displayed on the monitor, enabling observation at the brightness desired by the operator.
- furthermore, the synthesis of captured images under the same narrow-band illumination light is controlled using a composition ratio corresponding to the detected image brightness, and only the minimum synthesis processing needed to obtain the set brightness is performed, so degradation of image quality can be suppressed.
- FIG. 4 is a block diagram showing a second embodiment of the present invention.
- in FIG. 4, the same components as those in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted.
- the present invention can be applied not only to narrow band observation but also to special light observation such as fluorescence observation.
- the composition ratio a is controlled in accordance with various observation modes and with the configuration of the light source device that realizes these observation modes. To respond to the configuration of the light source device, it is conceivable to make the determination by communicating with the light source device.
- the endoscope apparatus 100 differs from the first embodiment in that it employs an image processing apparatus 104 including a control unit 105 instead of the image processing apparatus 4, and a light source device 103 provided with a light source control unit 106 and a rotary filter 113 instead of the light source device 3.
- as the rotary filter 113, not only the rotary filter 33 of the first embodiment but also a rotary filter for special observation such as fluorescence observation can be adopted. For example, a rotary filter provided with one or two excitation light filters can be used, and a rotary filter provided with one or two narrow-band observation filters can also be used.
- the light source control unit 106 holds various information related to the light source device 103, for example information on the configuration of the rotary filter 113, and exchanges the held information with the control unit 105. Further, the light source control unit 106 is controlled by the control unit 105 to perform lighting control of the light source 31 and rotation control of the rotary filter 113. For example, when a rotary filter provided with two narrow-band observation filters or two excitation light filters is used as the rotary filter 113, the light source control unit 106 performs the same control as in the first embodiment.
- that is, the light source 31 and the rotary filter 113 are controlled so that the light transmitted through these filters is emitted at the timing of capturing the two-channel imaging signals.
- the control unit 105 controls the synthesis processing unit 45 and the dimming control unit 49 based on various information on the light source device 103 and information on the observation mode designated by the operation of the operator.
- the control unit 105 controls the light source control unit 106 according to the observation mode.
- the control unit 105 controls each unit so that the same operation as that of the first embodiment is performed.
- the control unit 105 may control the dimming control unit 49 so that, when the combination ratio a satisfies 0 ≤ a < 1, the dimming control unit 49 performs dimming control based on the comparison between the luminance Y1 and the target luminance Ys.
- the control unit 105 also controls each unit based on the configuration of the rotary filter 113, using information from the light source control unit 106. For example, when the rotary filter 113 has two excitation light filters, the control unit 105 sets the synthesis ratio a of the synthesis processing unit 45 to 1 regardless of the output of the brightness calculation processing unit 44. In this case, the control unit 105 may control the dimming control unit 49 so that dimming control is performed based on the comparison between the luminance (Y1 + Y2) and the target luminance Ys.
- taking into account that sufficient brightness cannot be obtained, the control unit 105 controls each part of the image processing apparatus 104 so as to stop operation in the fluorescence observation mode.
- the control unit 105 may display a message on the monitor 5 indicating that the operation in the designated observation mode is prohibited.
- control unit 105 sets the synthesis ratio a to zero.
- control unit 105 may set the synthesis ratio and dimming control according to the operator's settings.
- the composition ratio is controlled according to the type of the light source device, the observation mode, and so on; not only can the same effects as in the first embodiment be obtained, but optimal brightness control according to the light source device and the observation mode can also be performed.
- FIG. 5 is a block diagram showing an endoscope apparatus according to a third embodiment of the present invention.
- the dimming control unit 49 performs dimming control based on the comparison between the luminance Y1 and the target luminance Ys when the synthesis ratio a satisfies 0 ≤ a < 1, and may perform dimming control based on the comparison between the luminance Y1 + Y2 and the target luminance Ys when the synthesis ratio a is 1.
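- The dimming rule above can be sketched as follows (an illustrative Python sketch only; the function name and the sign convention of the returned error are assumptions, not part of the embodiment):

```python
def dimming_error(y1: float, y2: float, ys: float, a: float) -> float:
    """Return the luminance error driving the aperture control.

    Sketch of the rule described above: while the composition ratio a
    is below 1 (0 <= a < 1), only the master-illumination luminance Y1
    is compared with the target luminance Ys; once a reaches 1, the
    combined luminance Y1 + Y2 is used instead.
    """
    reference = y1 if a < 1.0 else y1 + y2
    # positive error -> open the aperture, negative -> close it (assumed convention)
    return ys - reference
```

For example, with Y1 = 0.4, Y2 = 0.3, and target Ys = 0.8, the error is 0.4 while a < 1 but shrinks to about 0.1 once a = 1, since the slave image then also contributes to the displayed brightness.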
- the composition processing unit 45 combines the captured image by the master illumination and the captured image by the slave illumination based on the composition ratio a, and outputs a composite captured image (composite signal). That is, the composition processing unit 45 combines the signal obtained from the captured image by the slave illumination, i.e., the G signal and the B2 signal multiplied by the composition ratio a, with the signal based on the captured image by the master illumination. When the composition ratio a is 0, the composition processing unit 45 outputs the captured image by the master illumination from the digital processing unit 43, that is, a signal using only the G signal and the B1 signal, to the matrix processing unit 46.
- the endoscope apparatus 1 thus has, for example, a first mode in which the composition ratio a is 0, a second mode in which the composition ratio a is 1, and a third mode in which the captured image by the slave illumination is multiplied by a composition ratio a (0 < a < 1) and combined with the captured image by the master illumination.
- the combination ratio a is described as being 0 or 1.
- the combination ratio a may be set to a value other than 0 or 1.
- three or more fixed synthesis ratios may be set as the synthesis ratio a, and the operation may be performed in four or more modes.
- assume, for example, that the composition ratio a in the first mode is 0 and the composition ratio a in the second mode is 1; the control of the aperture amount by the dimming control unit 49 can then differ markedly between the first mode and the second mode. That is, in this case, the amount of emitted light differs significantly between the first mode and the second mode, and the output level of the CCD 25 differs significantly as well. Therefore, if the brightness detection processing of the captured image based on each illumination light were the same in the first mode and the second mode, the brightness calculation processing unit 244 could obtain detection results that do not correspond to the actual brightness.
- the dimming control unit 49 performs dimming control based on the brightness of the captured image, whereas the brightness of the observation image displayed on the monitor 5 is based on the composite signal. For this reason, even when the brightness detection for dimming control determines that no halation has occurred, halation may occur in the observation image on the monitor 5. The same problem may also occur depending on the gain amount of the AGC circuit 48, which operates independently of the brightness calculation of the brightness calculation processing unit 244.
- therefore, the threshold value used for the brightness detection processing of the captured image based on each illumination light is changed between the first mode and the second mode, so that the brightness detection can be performed reliably even when the output level of the CCD 25 differs significantly.
- furthermore, so that brightness detection corresponding to the display can be performed reliably regardless of the composition ratio a or the gain of the AGC circuit 48, the threshold value used for the brightness detection processing of the captured image based on each illumination light is changed in accordance with the composition ratio a and the gain value of the AGC circuit 48.
- FIGS. 6 and 7 are explanatory diagrams for explaining the brightness detection processing of the captured image based on each illumination light in the brightness calculation processing unit 244 of FIG. 5.
- the brightness calculation processing unit 244 divides one screen into blocks having a predetermined number of pixels, and obtains the brightness of the screen. Note that the brightness calculation processing unit 244 calculates the brightness of the screen for each imaging signal based on each illumination light.
- FIG. 6 shows the effective pixel area 251 divided into 10 × 10 blocks 252 by the brightness calculation processing unit 244.
- Each block 252 includes a predetermined number of pixels in the horizontal and vertical directions.
- the brightness calculation processing unit 244 calculates the brightness for each block. For example, the brightness calculation processing unit 244 sets the average of the pixel values of the pixels included in each block as the brightness of each block (block brightness). Now, it is assumed that 100 blocks included in one screen are blocks B1 to B100, and the brightness of each block B1 to B100 is Bs1 to Bs100.
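- The per-block averaging described above can be sketched as follows (an illustrative Python sketch; the function name, the use of NumPy, and the trimming of edge pixels that do not fill a whole block are assumptions, not part of the embodiment):

```python
import numpy as np

def block_brightness(frame: np.ndarray, blocks: int = 10) -> np.ndarray:
    """Divide one screen into blocks x blocks regions and return the
    average pixel value of each region (the "block brightness"), as
    described for the brightness calculation processing unit."""
    h, w = frame.shape
    bh, bw = h // blocks, w // blocks
    trimmed = frame[:bh * blocks, :bw * blocks]     # drop edge remainders (assumption)
    view = trimmed.reshape(blocks, bh, blocks, bw)  # (row-block, y, col-block, x)
    return view.mean(axis=(1, 3)).ravel()           # e.g. 100 block brightnesses Bs1..Bs100
```

With the 10 × 10 division of FIG. 6, a single screen yields the 100 values Bs1 to Bs100 used in the subsequent sorting step.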
- FIG. 7A shows that the brightness Bs1 to Bs100 of each block B1 to B100 detected by the brightness calculation processing unit 244 is arranged in block order.
- the brightness calculation processing unit 244 arranges the brightness Bs1 to Bs100 in order of the magnitude of the values.
- for example, the brightness Bs10 of the block B10 is the brightest, followed in decreasing order by Bs20, Bs9, ..., and the block B91 has the darkest brightness Bs91.
- the brightness calculation processing unit 244 selects the region surrounded by the thick broken line in FIG. 7C, that is, the 2nd to 5th brightnesses and the 94th to 97th brightnesses in order of brightness. Then, the brightness calculation processing unit 244 sets the average value of the 2nd to 5th brightnesses in order of brightness as the representative value of the bright region in the screen (hereinafter referred to as the high-luminance detection value) Ash, and the average value of the 94th to 97th brightnesses in order of brightness as the representative value of the dark region in the screen (hereinafter referred to as the low-luminance detection value) Asd.
- the brightness calculation processing unit 244 obtains the brightness of the screen by adding a weight that varies depending on the mode to at least one of the high luminance detection value and the low luminance detection value.
- FIGS. 8 and 9 are graphs for explaining the weights that change depending on the mode.
- FIG. 8 shows the change in weight, with the ratio on the horizontal axis and the weight on the vertical axis.
- FIG. 9 shows changes in the threshold value with respect to the mode and the composition ratio, with the mode (composition ratio) on the horizontal axis and the threshold value on the vertical axis. FIGS. 8 and 9 serve to explain the weight given to the high-luminance detection value.
- the brightness calculation processing unit 244 is also provided with pixel values of all the pixels of each screen.
- the brightness calculation processing unit 244 obtains the ratio of pixels, among all the pixels, whose pixel value is equal to or greater than a threshold. As the threshold value, for example, a value used for determining whether halation has occurred in the pixel is set. In this case, the brightness calculation processing unit 244 calculates the ratio of pixels in which halation has occurred among all the pixels. In the example of FIG. 8, the higher the ratio of pixels whose pixel values are equal to or greater than the threshold (hereinafter referred to as high-luminance pixels), as when halation occurs, the larger the weight applied by the brightness calculation processing unit 244.
- the brightness calculation processing unit 244 multiplies the high-luminance detection value by the weight based on FIG. 8 and then adds the low-luminance detection value to obtain the brightness of the screen. Therefore, the higher the ratio of high-luminance pixels whose pixel value is equal to or greater than the threshold, as when halation occurs, the larger the weight by which the high-luminance detection value is multiplied, and a detection result indicating that the screen is brighter is obtained.
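- The selection of the high- and low-luminance detection values and the weighting just described can be sketched as follows (illustrative only; the embodiment defines the weight by the curve of FIG. 8, which is not reproduced here, so the linear weight and all names below are assumptions):

```python
import numpy as np

def screen_brightness(block_vals, pixels, threshold):
    """Sketch of the weighted screen brightness described above.

    block_vals : the 100 block brightnesses Bs1..Bs100
    pixels     : all pixel values of the screen
    threshold  : halation-judgement threshold for "high-luminance" pixels
    """
    order = np.sort(np.asarray(block_vals, float))[::-1]  # brightest first
    ash = order[1:5].mean()                               # 2nd..5th brightest -> Ash
    asd = order[93:97].mean()                             # 94th..97th -> Asd
    ratio = (np.asarray(pixels) >= threshold).mean()      # share of high-luminance pixels
    weight = 1.0 + ratio                                  # grows with the ratio (assumed linear)
    return weight * ash + asd
```

Because the 1st and 98th–100th values are excluded, isolated outlier blocks (e.g. a single specular highlight) do not dominate the representative values.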
- the threshold value for obtaining the ratio of high luminance pixels is changed according to the mode, the synthesis ratio, and the gain of the AGC circuit 48.
- the dimming control unit 49 decreases the aperture amount and increases the amount of light emitted from the light source device 3. Therefore, in this case, the output of the CCD 25 is at a high level.
- the dimming control unit 49 increases the aperture amount and decreases the amount of light emitted from the light source device 3. Therefore, in this case, the output of the CCD 25 is at a relatively low level.
- in the second mode, the threshold for obtaining the ratio of high-luminance pixels is set lower than in the first mode, taking into account that the captured images by the B1 and B2 illumination lights are combined.
- the brightness calculation processing unit 244 can appropriately determine the ratio of the high luminance pixels and perform accurate brightness detection regardless of the mode.
- the threshold value T′ is lowered as the composition ratio a increases.
- the brightness calculation processing unit 244 can appropriately determine the ratio of high luminance pixels and perform accurate brightness detection regardless of the combination ratio a.
- the threshold value T′ decreases as the gain g of the AGC circuit 48 increases.
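- The mode-, ratio-, and gain-dependent threshold T′ can be sketched as a monotonically decreasing function of the composition ratio a and the AGC gain g (the linear form and the coefficients below are purely illustrative assumptions; the embodiment defines T′ by the curve of FIG. 9):

```python
def high_luminance_threshold(base: float, a: float, gain: float) -> float:
    """Sketch: the threshold T' for counting high-luminance pixels is
    lowered as the composition ratio a (0..1) and the AGC gain g grow,
    since the composite/amplified signal reaches the display at a higher
    level than the raw CCD output suggests."""
    k_a, k_g = 0.3, 0.1   # assumed sensitivity coefficients (illustrative)
    return base * (1.0 - k_a * a) / (1.0 + k_g * (gain - 1.0))
```

With a = 0 and unity gain the base threshold is used unchanged; raising either a or g lowers T′, so halation-level pixels in the displayed composite image are still counted.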
- when the brightness calculation processing unit 244 has obtained the brightness of the captured image based on each illumination light for each screen in this way, the luminances Y1 and Y2 are calculated by the matrix calculation of expression (1) and the calculations of expressions (2) and (3) described above.
- the brightness calculation processing unit 244 obtains a ratio of pixels (hereinafter referred to as low luminance pixels) whose pixel value is equal to or less than a threshold value among all the pixels. Then, the brightness calculation processing unit 244 may change the threshold value for obtaining the ratio of the low luminance pixels according to the mode.
- when using the endoscope apparatus 1, the surgeon connects the connector 27 of the endoscope 2 to the light source apparatus 3 and the image processing apparatus 204, as shown in FIG. 5. Thereby, the connection state shown in FIG. 5 is obtained.
- the surgeon operates a power switch (not shown) to operate the light source device 3, the image processing device 204, and the monitor 5, and performs an operation for narrowband light observation.
- the light emitted from the light source 31 is converted into narrow-band G illumination light, B1 illumination light, and B2 illumination light by the narrow-band filter 32 and the rotary filter 33, and after the brightness is adjusted by the diaphragm 34, is supplied to the endoscope 2.
- Each illumination light is irradiated from the illumination lens 23 to the subject side through the light guide fiber 26 sequentially and substantially continuously, for example, in a period of 1/20 second.
- the CCD 25 captures an optical image by the return light from the part.
- the G signal, the B1 signal, and the B2 signal corresponding to the return lights of the G illumination light, the B1 illumination light, and the B2 illumination light are obtained by the photoelectric conversion of the CCD 25.
- Video signals including the G signal, the B1 signal, and the B2 signal are given from the endoscope 2 to the image processing device 204.
- the B1 signal and the B2 signal are obtained by imaging with the same exposure amount using illumination light in the same wavelength band, and are obtained under substantially the same conditions except for a short timing shift within one frame.
- the video signal input to the image processing device 204 is subjected to predetermined analog processing by the analog processing unit 41 and then converted to a digital signal by the A / D converter 42.
- the digital video signal from the A / D converter 42 is separated into a G signal, a B1 signal, and a B2 signal in the digital processing unit 43 and stored in the synchronization memory 40a.
- the brightness calculation processing unit 244 receives the G signal, the B1 signal, and the B2 signal read from the synchronization memory 40a, and obtains the brightness of the captured image based on each illumination light for each screen.
- the brightness calculation processing unit 244 calculates the brightness of each block for each screen of the captured image based on each illumination light.
- the brightness calculation processing unit 244 obtains the average of the relatively bright block brightnesses as the high-luminance detection value, and the average of the relatively dark block brightnesses as the low-luminance detection value.
- the brightness calculation processing unit 244 multiplies at least one of the high-luminance detection value and the low-luminance detection value by the weight obtained according to the mode, the synthesis ratio a, or the gain of the AGC circuit 48, and adds them to obtain the brightness of each screen of the captured image based on each illumination light.
- the brightness calculation processing unit 244 uses a relatively high value as a threshold value for obtaining the ratio of high-luminance pixels. Thereby, a high-intensity pixel can be detected with high accuracy.
- the brightness calculation processing unit 244 increases the weight by which the high-luminance detection value is multiplied as the ratio of high-luminance pixels increases.
- the brightness calculation processing unit 244 calculates the brightness of the screen by multiplying the high-luminance detection value by the calculated weight and then adding it to the low-luminance detection value.
- the brightness calculation processing unit 244 uses, for example, a relatively low value as the threshold for obtaining the ratio of high-luminance pixels. Thereby, high-luminance pixels can be detected with high accuracy.
- the brightness calculation processing unit 244 increases the weight by which the high-luminance detection value is multiplied as the ratio of high-luminance pixels increases.
- the brightness calculation processing unit 244 calculates the brightness of the screen by multiplying the high-luminance detection value by the calculated weight and then adding it to the low-luminance detection value.
- the threshold value is changed according to the composition ratio a.
- the brightness calculation processing unit 244 increases the weight by which the high-luminance detection value is multiplied as the ratio of high-luminance pixels increases.
- the brightness calculation processing unit 244 calculates the brightness of the screen by multiplying the high-luminance detection value by the calculated weight and then adding it to the low-luminance detection value.
- the brightness calculation processing unit 244 may change the threshold based on the gain of the AGC circuit 48. In this case as well, high luminance pixels can be detected with high accuracy.
- the brightness calculation processing unit 244 increases the weight by which the high-luminance detection value is multiplied as the ratio of high-luminance pixels increases.
- the brightness calculation processing unit 244 calculates the brightness of the screen by multiplying the high-luminance detection value by the calculated weight and then adding it to the low-luminance detection value.
- the threshold value for obtaining the ratio of the high luminance pixels changes according to the mode, the composition ratio a, or the gain of the AGC circuit 48, the brightness of the screen can be obtained with high accuracy regardless of the mode. Can do.
- the brightness calculation processing unit 244 uses the brightness obtained for each screen of the captured image based on each illumination light to calculate, for example by the matrix calculation of expression (1) and by expressions (2) and (3) above, the luminance Y1 due to the master illumination and the luminance Y2 due to the slave illumination. As described above, the luminances Y1 and Y2 are calculated using matrix processing and indicate the same brightness as when displayed on the monitor 5. Next, the dimming control unit 49 obtains the difference ΔY1 between the target luminance Ys and the luminance Y1, and determines the combination ratio a.
- the composition ratio a is supplied to the composition processing unit 45, and the composition processing unit 45 synthesizes the image captured by the slave illumination with the image captured by the master illumination by a ratio based on the composition ratio a.
- the synthesis processing unit 45 obtains a synthesized signal using the above equation (5).
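- Equation (5) itself is not reproduced in this excerpt; the following is a plausible sketch only, assuming the composite signal is the master-illumination image plus the slave-illumination image scaled by the composition ratio a:

```python
import numpy as np

def composite_b_signal(b1, b2, a):
    """Assumed form of the synthesis in equation (5): the slave-illumination
    image (B2 signal) is scaled by the composition ratio a and added to the
    master-illumination image (B1 signal). At a = 0 only B1 passes through,
    so no blur from combining two exposures can appear in the output."""
    return np.asarray(b1, float) + a * np.asarray(b2, float)
```

This matches the behavior described above: at a = 0 the slave image is not combined at all, and at a = 1 both exposures contribute fully, roughly doubling the signal level.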
- when the composition ratio a is 0, that is, when the luminance Y1 is equal to or higher than the target luminance Ys, the captured image by the slave illumination is not combined. Therefore, in this case, no blur occurs in the composite captured image based on the composite signal, and image quality does not deteriorate.
- the composite signal from the composite processing unit 45 is given to the matrix processing unit 46 and subjected to matrix processing, and R, G, and B image signals in the display system are obtained.
- the output of the matrix processing unit 46 is returned to an analog signal by the D / A converter 47 and then supplied to the monitor 5. In this way, a narrow-band light observation image with sufficient brightness and suppressed image quality deterioration is displayed on the display screen of the monitor 5.
- as described above, in the present embodiment, the same narrow-band illumination light is irradiated multiple times onto the same site to be observed within a predetermined period such as one frame period, and the captured images are combined, thereby improving the image brightness in narrow-band light observation. In this case, the brightness of the screen is obtained using a threshold value corresponding to the mode, the combination ratio, or the AGC gain, so the brightness of the screen can be obtained with high accuracy. Moreover, since matrix processing is used, it is possible to detect brightness corresponding to the brightness of the image actually displayed on the monitor, and observation at the desired brightness is possible. Further, unnecessary composition of captured images obtained with the same narrow-band illumination light is not performed, so that deterioration of image quality can be suppressed.
- although the example above changes the threshold for obtaining the ratio of high-luminance pixels according to the mode or the like, the threshold may also be changed according to the AGC gain.
- the threshold value used for determining the representative value of the block brightness can be changed according to the mode, the synthesis ratio, or the AGC gain.
- that is, whether the threshold relates to the output level of the image sensor in pixel units or in block units, the threshold value may be changed according to the mode, the combination ratio, or the AGC gain.
- the present invention can be similarly applied to fluorescence observation; when a rotary filter having two excitation light filters for performing fluorescence observation is used, the same synthesis process and brightness calculation process as in the above embodiment can be performed.
- FIG. 10 is a block diagram showing a fourth embodiment of the present invention.
- in FIG. 10, the same components as those of FIG. 5 are denoted by the same reference numerals, and description thereof is omitted. Even in an endoscope apparatus for performing narrow-band light observation, white balance adjustment is necessary so that the color displayed on the monitor is in a desired state, as in an endoscope apparatus performing normal light observation.
- the aperture amount may change significantly depending on the mode or the combination ratio a, and the color of the light emitted from the light source device 3 may then change due to the change in aperture amount. Therefore, white balance adjustment corresponding to the mode and the composition ratio a is necessary.
- FIG. 11 is a block diagram showing a general circuit for performing white balance adjustment.
- the R image signal Rch, the G image signal Gch, and the B image signal Bch are respectively supplied to and detected by the R image detection unit 71R, the G image detection unit 71G, and the B image detection unit 71B.
- B, G, and B signals are used as R, G, and B image signals.
- the multiplier 72 multiplies the G signal by a predetermined coefficient, for example, 1/2.
- the B signal from the R image detection unit 71R, the G signal from the multiplier 72, and the B signal from the B image detection unit 71B are supplied to the white balance adjustment unit 73, multiplied by predetermined gains, and subjected to white balance adjustment.
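- The gain calculation implied by FIG. 11 can be sketched as follows (hypothetical function and names; the choice of the halved G level as the common reference for the other channels is an assumption, not stated in the embodiment):

```python
def white_balance_gains(r_det: float, g_det: float, b_det: float):
    """Sketch of the adjustment of FIG. 11: the detected G level is first
    halved (multiplier 72), then per-channel gains are chosen so the three
    detected levels match a common reference, taken here to be the halved
    G level. With a neutral (white) test subject these gains make the
    displayed image neutral."""
    g_half = 0.5 * g_det                        # multiplier 72: G x 1/2
    return (g_half / r_det, 1.0, g_half / b_det)  # (R gain, G gain, B gain)
```

For detected levels R = 100, G = 200, B = 100, the halved G level equals the other two channels and all gains come out as 1.0.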
- although white balance adjustment in narrow-band light observation was shown, the same color balance adjustment can also be performed in fluorescence observation using a circuit similar to that of FIG. 11.
- in that case, G light is irradiated at the timing of obtaining the R image signal, the first excitation light is irradiated at the timing of obtaining the G image signal, and the second excitation light is irradiated at the timing of obtaining the B image signal. Color balance adjustment is performed by halving the detection result of the G signal based on the irradiation of the G light, and using the detection results as they are for the first and second fluorescence signals based on the first and second excitation lights.
- the amount of G illumination light emitted from the light source device 3 may be extremely large with respect to the B illumination light. In that case, the gains of the B1 and B2 signals need to be made extremely larger than the gain of the G signal; the effective level range of the B1 and B2 signals then becomes narrow, and the dynamic range is reduced.
- the dynamic range is prevented from being narrowed by controlling the amount of emitted light in each band according to the mode.
- This embodiment is different from the third embodiment in that an image processing device 260 is used instead of the image processing device 204.
- the image processing apparatus 260 differs from the image processing apparatus 204 of the third embodiment in that it employs a dimming control unit 61 instead of the dimming control unit 49 and adds an adjustment value memory 62 and a white balance processing unit 63.
- the dimming control unit 61 has the same function as that of the dimming control unit 49 of the third embodiment, and also has a function of controlling the white balance adjustment value acquisition process.
- the combination ratio a is different between the first mode and the second mode. For example, it is assumed that the composition ratio a in the first mode is 0 and the composition ratio a in the second mode is 1. In this case, in the first mode, a captured image using G illumination light and B1 illumination light is obtained.
- to obtain the white balance adjustment value corresponding to the first mode, the dimming control unit 61 increases the stop amount of the diaphragm 34 at the timing when the light from the light source 31 passes through the filter 33G of the rotary filter 33, and decreases the stop amount of the diaphragm 34 at the timing when the light passes through the filter 33B1.
- the dimming control unit 61 obtains the white balance adjustment value corresponding to the second mode.
- in this case, the aperture amount of the diaphragm 34 is set to be the same at both timings, for example.
- the white balance processing unit 63 obtains each white balance adjustment value for the first and second modes based on the output of the A / D converter 42 when acquiring the white balance adjustment value, and stores it in the adjustment value memory 62.
- the dimming control unit 61 corrects the white balance adjustment values for the first and second modes based on the aperture amount when acquiring the white balance adjustment values corresponding to the first and second modes, and adjusts the adjustment values. It is stored in the memory 62.
- control of the aperture amount when acquiring the white balance adjustment value may be determined based on the level of the captured image based on the G illumination light and the level of the captured image based on the B illumination light.
- the captured image based on the B illumination light is obtained by combining the captured image based on the B1 illumination light and the captured image based on the B2 illumination light according to the combination ratio a. Accordingly, the aperture amount at the time of obtaining the white balance adjustment value corresponding to the first and second modes may be set based on the combination ratio a in the first and second modes.
- although the dimming control unit 61 adjusts the light quantity by controlling the aperture amount, the quantity of light emitted from the light source 31 may be adjusted instead.
- the white balance adjustment value is determined for each mode.
- the dimming control unit 61 restricts the light amounts of the G illumination light and the B illumination light with the aperture amount based on the combination ratio a set in the first mode.
- the white balance processing unit 63 calculates a white balance adjustment value and outputs it to the adjustment value memory 62.
- the dimming control unit 61 reads the white balance adjustment value for the first mode stored in the adjustment value memory 62, corrects it according to the aperture amount, and stores it in the adjustment value memory 62.
- the dimming control unit 61 uses the aperture amount based on the combination ratio a set in the second mode to change the amounts of G illumination light and B illumination light. Restrict.
- the white balance processing unit 63 calculates a white balance adjustment value and outputs it to the adjustment value memory 62.
- the dimming control unit 61 reads the white balance adjustment value for the second mode stored in the adjustment value memory 62, corrects it according to the aperture amount, and stores it in the adjustment value memory 62.
- the white balance processing unit 63 reads the white balance adjustment value for the first or second mode stored in the adjustment value memory 62 according to the mode, and amplifies the imaging signal. Other operations are the same as those of the third embodiment.
- as described above, in the present embodiment, white balance adjustment corresponding to each mode is performed, and when obtaining the white balance adjustment value, the amount of illumination light in each band is limited in accordance with the combination ratio of each mode, so that the white balance adjustment value of each mode can be acquired while the dynamic range is ensured.
- for the third mode, the white balance adjustment value may be calculated by the above-described method for each synthesis ratio a.
- the endoscope is provided with custom switches, and photometry, contrast toggling, focus, and the like can be switched with a single switch. That is, various operation switches are provided on the front panel, the keyboard, and the like, and among them are custom switches to which arbitrary functions can be assigned.
- conventionally, a scope with the optical magnification function could be magnified only by a dedicated operation lever, but by assigning the function to a custom switch, the flexibility of the operation method can be improved.
- the two-focus switching can be assigned to the custom switch.
- for optical magnification and bifocal switching, it is preferable to display the resulting state after the operation.
- recognizability can be improved by reversing white and black or by using a large font so that the display is clearly distinguished from the display of the patient name and the like.
- for each function, multiple modes and levels may be toggled.
- photometry and contrast can be easily switched with a custom switch.
- there are three photometry modes: average, peak, and auto. If peak and auto are set in the menu in advance, they can be switched between at any time with a custom switch.
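The pre-selected subset of photometry modes cycled by a custom switch can be sketched as follows (illustrative Python; the mode names come from the text, everything else is assumed):

```python
from itertools import cycle

ALL_MODES = ("average", "peak", "auto")  # the three photometry modes named above

def make_toggle(enabled_modes):
    # The menu pre-selects a subset (e.g. peak and auto); each press of the
    # custom switch advances to the next enabled mode.
    it = cycle(enabled_modes)
    return lambda: next(it)

press = make_toggle(("peak", "auto"))
print(press(), press(), press())  # cycles peak -> auto -> peak
```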
- the function realized and the image processing performed may be changed by a single custom switch or function switch.
- with the function switch called "color", the operation changes the color enhancement level in the normal observation mode and the color tone mode in the NBI observation mode.
- the NBI color tone mode can be changed from 3 to 1, or from 1 to 2, with such a switch.
- the NBI color tone mode can be set to 1 in the esophagus and changed to 2 in the stomach, meeting the user's preference.
- the NBI color tone mode set in advance by the user may be used without setting the NBI color tone mode based on the scope ID. For example, after the user sets the NBI color tone mode to 2, the system is turned off, and when it is turned on again, the last used 2 may be called. Whether to use the scope ID may be set in the menu.
- the processor can be combined with multiple types of light sources.
- the observation modes that can be handled vary depending on the light source.
- the processor may have a communication unit corresponding to each light source.
- the processor may support two communication schemes through one connector. More specifically, it is preferable to use the plurality of pins in the connector selectively, and it is even better to include pins that can be shared.
- the communication standard of the processor may be changed corresponding to the light source, and the signal standard may be changed.
- the synchronization signal may be a composite synchronization signal or a vertical synchronization signal.
- the rotation position of the color filter described here can, in a broad sense, be regarded as the exposure timing for the CCD. That is, if the aperture adjuster combined with the color filter is set to an appropriate relative position in the rotation direction with respect to the color filter, the exposure timing for the CCD becomes correct. It is therefore important to maintain the position including these elements.
- the white balance can be operated by pressing a switch provided on the front panel.
- the hold function of the white balance switch has been described as depending on the setting, but it may also be switched automatically according to the type of endoscope used.
- when the type of endoscope is detected from the scope ID and the scope is found to be a surgical scope, the white balance switch hold may be turned off. This is because, in the surgical field, when the user who operates the switch differs from the user who holds the endoscope, the switch operation timing is shifted and it may be difficult to hold the switch down.
- when the endoscope is equipped with a CCD and a resistance element indicating the type of the CCD, the type of the CCD is determined from the resistance value, and the white balance switch behavior is then set accordingly.
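A hypothetical sketch of this discrimination step (illustrative Python; the divider resistor value, voltage thresholds, and CCD names are invented for the example, not taken from the patent): the scope-side resistor forms a voltage divider with a known processor-side resistor, and the measured voltage selects the CCD type and the white-balance-switch hold setting.

```python
VREF, R_FIXED = 5.0, 10_000.0  # assumed reference voltage and divider resistor

CCD_TABLE = [  # (min_volts, max_volts, ccd_type, wb_switch_hold) - illustrative
    (0.0, 1.5, "surgical-ccd", False),   # surgical scope: hold disabled
    (1.5, 3.5, "standard-ccd", True),
    (3.5, 5.0, "hires-ccd", True),
]

def divider_voltage(r_scope):
    # Voltage across the scope-side resistor in a simple series divider.
    return VREF * r_scope / (r_scope + R_FIXED)

def identify_ccd(r_scope):
    v = divider_voltage(r_scope)
    for lo, hi, ccd, hold in CCD_TABLE:
        if lo <= v < hi:
            return ccd, hold
    raise ValueError("unrecognized resistance")

print(identify_ccd(2_000.0))
```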
- the white balance has the effect of absorbing the manufacturing variation of the color filter of the light source device and improving the color reproducibility.
- when the CCD of the endoscope performs 4-line readout, white balance also has the effect of absorbing the variation of each channel and preventing variation between channels.
- FIG. 12A shows the case where the input frame rate to the field memory < the output frame rate
- FIG. 12B shows the case where the input frame rate to the field memory> the output frame rate.
- the circled number 1 in FIG. 12 indicates the period during which the A field image and the previous B field image are output after the composition processing
- the circled number 2 in FIG. 12 indicates the period during which the B field image and the previous A field image are output after the composition processing.
- when the frequency of the progressive output image is higher than that of the interlaced image, the images indicated by P5 and P6 are exactly the same. That is, in P2, P6, and so on, the frequency deviation is absorbed by outputting the same field twice.
- when the frequency of the progressive output image is lower than that of the interlaced image, an interlaced image that is not output is provided, as between P1 and P2, to absorb the frequency deviation.
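The field repeat/drop behavior described for P1-P6 can be modeled roughly as follows (illustrative Python; the actual circuit works on timing signals and a field memory, not lists, and the rates shown are invented for the example):

```python
def convert_field_sequence(fields, in_rate, out_rate):
    """Map an input field sequence to output instants by nearest-past index.

    When out_rate > in_rate some fields are taken out twice (as with P2, P6);
    when out_rate < in_rate some fields are never output (as between P1 and P2).
    """
    n_out = int(len(fields) * out_rate / in_rate)
    return [fields[min(int(i * in_rate / out_rate), len(fields) - 1)]
            for i in range(n_out)]

# Higher output rate: fields repeated.  Lower output rate: fields skipped.
print(convert_field_sequence(["A0", "B0", "A1", "B1"], in_rate=50, out_rate=75))
print(convert_field_sequence(["A0", "B0", "A1", "B1"], in_rate=60, out_rate=30))
```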
- each line of the current input field is output as-is as part of the output frame.
- for the output frames p1, p3, ..., the output of a median filter is used. That is, the signal having the second highest signal level among a0, b0, and a1 is found and output as p1.
- in some cases b0 is optimal for p1, while in others a0 or a1 is appropriate; this is a method that focuses on this property.
- the median filtering is performed for each RGB color and for each pixel. Taking the shaded pixel in FIG. 13 as an example, in the case of the R image, the signal having the second highest level among (a1, x), (b1, x), and (a2, x) of the R image is found and used as the output.
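The per-pixel, per-color "second highest of three" selection is simply the median; a minimal sketch (illustrative Python; line layout simplified, sample values invented):

```python
def median_of_three(a, b, c):
    # The "second highest signal level" among three samples is their median.
    return sorted((a, b, c))[1]

def interpolate_line(line_a0, line_b0, line_a1):
    # Per-pixel median over the three candidate samples (a0, b0, a1) for one
    # color channel, as described for output p1.
    return [median_of_three(a, b, c)
            for a, b, c in zip(line_a0, line_b0, line_a1)]

print(interpolate_line([10, 200], [50, 90], [30, 100]))
```

Taking the median rejects the one outlier among the temporally and spatially adjacent samples, which is why either b0 or a0/a1 is selected depending on which fits its neighbors.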
- a CCD is indispensable for the endoscope, and since there are various types of endoscopes, the circuit for driving the CCD may be provided either in the endoscope or in the video processor.
- when the CCD is driven by a circuit in the endoscope, what the circuit needs is supplied from the video processor: mainly power, a clock, and ground. There are also various types of CCDs, each requiring an appropriate power supply voltage and clock frequency.
- the endoscope is provided with a resistance element, and the CCD is discriminated by recognizing the voltage across it. Such an endoscope requires certain consideration when connected to a video processor.
- the present invention is not limited to these, and the signal line indicating the type of endoscope may be made longer. As an effect, if the power supply, the clock, and the ground are disconnected simultaneously when the endoscope is removed, the CCD driver circuit and circuits such as the A/D and AFE that require them can be kept stable. If the power source and the resistance unit are connected simultaneously when the endoscope is attached, an appropriate clock frequency can be prepared and sent to the endoscope.
- characters can be entered in the comment field and the like even while the endoscopic image is frozen and a still image is displayed. At this time, the patient ID cannot be input or changed. Further, regardless of whether the image is frozen, in order to change the color of the endoscopic image, a GUI may be displayed as a menu on the monitor screen so that a certain amount of red or blue can be adjusted.
Abstract
Description
FIG. 1 is a block diagram showing an endoscope apparatus according to the first embodiment of the present invention.
As shown in FIG. 1, the endoscope apparatus 1 includes an endoscope 2 for observing the inside of a living body as a subject, a light source device 3 that emits narrow-band illumination light for observing the inside of the living body, and an image processing device 4 that performs signal processing on an imaging signal captured under the narrow-band illumination light. The narrow-band image generated by the image processing device 4 is supplied to a monitor 5. An ordinary color monitor can be employed as the monitor 5. That is, the monitor 5 has RGB input terminals (not shown), and performs color display when the R, G, and B image signals are supplied to the RGB input terminals.
The rotary filter 33 has a disk shape, and three openings are provided at equal angles in the circumferential direction; filters 33G, 33B1, and 33B2 are attached to the three openings, respectively. The filter 33G uses the green (G) wavelength band as its transmission band, and the filters 33B1 and 33B2 use the blue (B) wavelength band as their transmission bands.
= 0.3(α·G) + 0.59(β·B1) + 0.11(γ·B1) … (2)
The luminance calculation unit 53 also obtains the luminance Y2 produced by the slave illumination, with the level of the B2 signal input to the luminance calculation unit 53 denoted as B2, for example by the following equation (3): Y2 = 0.3Rm + 0.59Gm + 0.11Bm
= 0.3(α·G) + 0.59(β·B2) + 0.11(γ·B2) … (3)
Here, for G in the above equations (2) and (3), G = Rm = αGf. α is the same as α in equation (6) described later. According to equation (5) described later, Gt = (1 + a)G, and when a = 1, Gt = 2G. Therefore, it is important to factor 1/2 into α.
The brightness calculation processing unit 44 outputs the obtained luminances Y1, Y2, and ΔY1 to the dimming control unit 49 as brightness information. The dimming control unit 49 obtains the difference (Ys − Y1) = ΔY1 between the target brightness (luminance) Ys and the luminance Y1. When ΔY1 ≤ 0, the target brightness is obtained by the master illumination alone, so the dimming control unit 49 sets the combination ratio (coefficient) a for the captured image by the slave illumination to 0. When ΔY1 > 0, the dimming control unit 49 obtains the combination ratio a by the following equation (4). The dimming control unit 49 outputs the obtained combination ratio a to the combining processing unit 45.
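A numeric sketch of equations (2) through (4) (illustrative Python; equation (4) itself is not reproduced in this excerpt, so a = max(0, Ys − Y1)/Y2 is an assumed reading of the claim language "ratio of the difference between the first brightness and a target brightness to the second brightness", and all signal values are invented):

```python
def luminance(r, g, b):
    # Weighted sum used in equations (2) and (3): Y = 0.3R + 0.59G + 0.11B
    return 0.3 * r + 0.59 * g + 0.11 * b

def combination_ratio(y1, y2, ys):
    # Assumed form of equation (4): zero when the master illumination alone
    # reaches the target, otherwise the shortfall divided by Y2.
    dy1 = ys - y1
    if dy1 <= 0:
        return 0.0
    return dy1 / y2

alpha, beta, gamma = 0.5, 1.0, 1.0   # illustrative coefficient values
G, B1, B2 = 100.0, 80.0, 80.0        # illustrative signal levels
y1 = luminance(alpha * G, beta * B1, gamma * B1)   # master illumination, eq. (2)
y2 = luminance(alpha * G, beta * B2, gamma * B2)   # slave illumination, eq. (3)
a = combination_ratio(y1, y2, ys=120.0)
print(y1, y2, round(a, 3))
```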
In narrow-band light observation, captured images based on the respective return lights of the narrow-band G illumination light and B illumination light are used. A matrix processing unit 46, described later, generates the signal components of an RGB image by matrix processing (color conversion matrix processing) from the G captured image and the B captured image obtained by imaging with the narrow-band light.
α, β, and γ can be changed according to the color tone desired for narrow-band light observation. So that they are neither too large nor too small, they are determined in advance, for example, in the range of 0.7 to 1.5, and selected from a plurality of candidates. Outside this range, noise increases or saturation becomes likely. Under this condition, only α is determined in the range of 0.35 to 0.75, taking the factor (1 + a) of equation (5) into account. Naturally, because of this consideration, there is no increase in noise or saturation, but the bit width must be extended so that the dynamic range is not lost.
The D/A converter 47 converts the output of the matrix processing unit 46 into an analog signal and outputs it to the monitor 5. That is, Rout, Gout, and Bout of equation (6) are given to the monitor 5 as RGB inputs. The monitor 5 displays the captured image in color according to the supplied RGB inputs. In this way, narrow-band light observation becomes possible on the display screen of the monitor 5.
FIG. 4 is a block diagram showing the second embodiment of the present invention. In FIG. 4, the same components as in FIG. 1 are given the same reference numerals, and their description is omitted.
FIG. 5 is a block diagram showing an endoscope apparatus according to the first embodiment of the present invention. In FIG. 5, the same components as in FIG. 1 are given the same reference numerals, and their description is omitted.
FIG. 10 is a block diagram showing the fourth embodiment of the present invention. In FIG. 10, the same components as in FIG. 5 are given the same reference numerals, and their description is omitted.
In an endoscope apparatus for narrow-band light observation, as in an endoscope apparatus for normal light observation, white balance adjustment is necessary so that the color tone displayed on the monitor is in the desired state. For example, in the third embodiment described above, the aperture amount may change significantly depending on the mode or on the combination ratio a. Then, the color of the light emitted from the light source device 3 may change with the change in the aperture amount. Therefore, white balance adjustment corresponding to the mode and the combination ratio a is necessary.
When the frequency of the progressive output image is higher than that of the interlaced image, the images indicated by P5 and P6 are exactly the same. That is, in P2, P6, and so on, the frequency deviation is absorbed by outputting the same field twice. Conversely, when the frequency of the progressive output image is lower than that of the interlaced image, an interlaced image that is not output is provided, as between P1 and P2, to absorb the frequency deviation.
This application is filed claiming priority based on Japanese Patent Application No. 2011-185127 and Japanese Patent Application No. 2011-185128, filed in Japan on August 26, 2011, and the above disclosure is incorporated into the specification, claims, and drawings of the present application.
Claims (10)
- illumination means for performing, within a predetermined time, illumination with illumination light of a first band and performing illumination with illumination light of a second band a first number of times, the first number being two or more;
imaging means for imaging a subject illuminated by the illumination means and outputting a first captured image based on the illumination with the illumination light of the first band and a second captured image based on the illumination with the illumination light of the second band;
brightness calculation means for calculating a first brightness by color conversion matrix processing using a first imaging signal based on the illumination with the illumination light of the first band and a second imaging signal based on a first predetermined one of the first number of illuminations, and for calculating a second brightness by color conversion matrix processing using the first imaging signal based on the illumination with the illumination light of the first band and a second imaging signal based on an illumination other than the first predetermined one of the first number of illuminations; and
combining means for multiplying the first and second imaging signals that are the basis of the second brightness by a coefficient based on the ratio of the difference between the first brightness and a target brightness to the second brightness, and then combining them with the first and second imaging signals that are the basis of the first brightness,
the endoscope apparatus being characterized by comprising the above. - The endoscope apparatus comprising dimming control means for performing dimming control of the illumination means based on the brightness calculated by the brightness calculation means,
wherein the combining means switches the coefficient by which the second imaging signal is multiplied between a first mode and a second mode, based on the first brightness and the second brightness calculated by the brightness calculation means, and
the brightness calculation and dimming control means classifies the output of the imaging means as high brightness or low brightness by comparing it with a threshold in pixel units or block units, obtains, using the classification result, the per-screen brightness of the captured images based on the illumination light of the first and second bands, obtains the brightness of the combined captured image using the obtained per-screen brightness, and changes the threshold between the first mode and the second mode,
the endoscope apparatus according to claim 1 being characterized by the above. - The endoscope apparatus wherein the brightness calculation and dimming control means receives the first and second captured images from the imaging means screen by screen, divides each screen into a plurality of blocks, obtains the brightness of each divided block, obtains a high-brightness detection value and a low-brightness detection value within the screen based on the obtained per-block brightness, obtains the per-screen brightness by weighted addition of the high-brightness detection value and the low-brightness detection value, and acquires the weights used for the weighted addition from the classification result,
the endoscope apparatus according to claim 2 being characterized by the above. - The endoscope apparatus wherein the brightness calculation and dimming control means acquires the weight by which the high-brightness detection value is multiplied by setting, as the threshold, a value for detecting high-brightness pixels,
the endoscope apparatus according to claim 3 being characterized by the above. - The endoscope apparatus wherein the first number is two, and
the brightness calculation means calculates the first brightness by color conversion matrix processing using the first imaging signal and the second imaging signal based on the first of the first number of illuminations, and calculates the second brightness by color conversion matrix processing using the first imaging signal and the second imaging signal based on the second of the first number of illuminations,
the endoscope apparatus according to claim 3 being characterized by the above. - The endoscope apparatus comprising a matrix processing unit that performs color conversion processing corresponding to the input of a display system that performs display based on the first and second imaging signals,
wherein the combining by the combining means is performed at a stage before or after the matrix processing unit, or simultaneously with the color conversion processing of the matrix processing unit,
the endoscope apparatus according to claim 5 being characterized by the above. - The endoscope apparatus wherein the first band is a green band and the second band is a blue band,
the endoscope apparatus according to claim 6 being characterized by the above. - The endoscope apparatus comprising: color balance adjustment means for performing a first color balance adjustment corresponding to the first mode and a second color balance adjustment corresponding to the second mode; and
color balance value acquisition means that, when acquiring a first color balance adjustment value used in the first color balance adjustment, adjusts the amount of illumination light of the first band and the amount of illumination light of the second band of the illumination means based on the coefficient in the first mode, and, when acquiring a second color balance adjustment value used in the second color balance adjustment, adjusts the amount of illumination light of the first band and the amount of illumination light of the second band of the illumination means based on the coefficient in the second mode,
the endoscope apparatus according to claim 2 being characterized by comprising the above. - The endoscope apparatus comprising dimming control means for performing dimming control of the illumination means based on the brightness calculated by the brightness calculation means,
wherein the brightness calculation and dimming control means classifies the output of the imaging means as high brightness or low brightness by comparing it with a threshold in pixel units or block units, obtains, using the classification result, the per-screen brightness of the captured images based on the illumination light of the first and second bands, obtains the brightness of the combined captured image using the obtained per-screen brightness, and changes the threshold according to the coefficient used for the combining,
the endoscope apparatus according to claim 1 being characterized by the above. - The endoscope apparatus comprising: dimming control means for performing dimming control of the illumination means based on the brightness calculated by the brightness calculation means; and
an AGC circuit that applies gain to the first and second captured images based on the output of the imaging means,
wherein the brightness calculation and dimming control means classifies the output of the imaging means as high brightness or low brightness by comparing it with a threshold in pixel units or block units, obtains, using the classification result, the per-screen brightness of the captured images based on the illumination light of the first and second bands, obtains the brightness of the combined captured image using the obtained per-screen brightness, and changes the threshold according to the gain of the AGC circuit,
the endoscope apparatus according to claim 1 being characterized by the above.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013514467A JP5326065B2 (ja) | 2011-08-26 | 2012-08-24 | 内視鏡装置 |
CN201280013720.0A CN103429136B (zh) | 2011-08-26 | 2012-08-24 | 内窥镜装置 |
EP12827919.7A EP2671498B1 (en) | 2011-08-26 | 2012-08-24 | Endoscope device |
US13/856,748 US20130286175A1 (en) | 2011-08-26 | 2013-04-04 | Endoscope apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011185127 | 2011-08-26 | ||
JP2011-185128 | 2011-08-26 | ||
JP2011185128 | 2011-08-26 | ||
JP2011-185127 | 2011-08-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/856,748 Continuation US20130286175A1 (en) | 2011-08-26 | 2013-04-04 | Endoscope apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013031701A1 true WO2013031701A1 (ja) | 2013-03-07 |
Family
ID=47756191
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/071496 WO2013031701A1 (ja) | 2011-08-26 | 2012-08-24 | 内視鏡装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130286175A1 (ja) |
EP (1) | EP2671498B1 (ja) |
JP (1) | JP5326065B2 (ja) |
CN (1) | CN103429136B (ja) |
WO (1) | WO2013031701A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014156253A1 (ja) * | 2013-03-25 | 2014-10-02 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
WO2015186397A1 (ja) * | 2014-06-05 | 2015-12-10 | オリンパスメディカルシステムズ株式会社 | 処理装置、内視鏡システム、内視鏡装置、画像処理方法および画像処理プログラム |
JP2017511176A (ja) * | 2014-03-17 | 2017-04-20 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | 非白色光の一般的な照明装置を含む手術システム |
JP2018158152A (ja) * | 2018-07-05 | 2018-10-11 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、内視鏡システムの作動方法、及びプロセッサ装置の作動方法 |
US10779716B2 (en) | 2017-04-27 | 2020-09-22 | Olympus Corporation | Light source system, light source control method, first light source apparatus, and endoscope system |
WO2021095417A1 (ja) * | 2019-11-12 | 2021-05-20 | 富士フイルム株式会社 | 内視鏡システム |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6008812B2 (ja) * | 2013-09-27 | 2016-10-19 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP5932748B2 (ja) * | 2013-09-27 | 2016-06-08 | 富士フイルム株式会社 | 内視鏡システム |
JP5927348B2 (ja) * | 2013-10-30 | 2016-06-01 | オリンパス株式会社 | 内視鏡装置 |
WO2015083684A1 (ja) * | 2013-12-06 | 2015-06-11 | オリンパス株式会社 | 撮像装置、撮像装置の作動方法 |
WO2015190443A1 (ja) * | 2014-06-13 | 2015-12-17 | オリンパス株式会社 | 固体撮像装置及び撮像方法 |
CN107113405B (zh) * | 2015-01-20 | 2019-01-15 | 奥林巴斯株式会社 | 图像处理装置、图像处理装置的工作方法、记录介质和内窥镜装置 |
JP6829926B2 (ja) * | 2015-11-11 | 2021-02-17 | Hoya株式会社 | 内視鏡装置 |
JP6242558B2 (ja) * | 2015-12-24 | 2017-12-06 | オリンパス株式会社 | 撮像システム及び画像処理装置 |
DE112017005511T5 (de) * | 2016-11-01 | 2019-09-12 | Olympus Corporation | Bildverarbeitungsvorrichtung |
JP6810812B2 (ja) * | 2017-09-13 | 2021-01-06 | オリンパス株式会社 | 内視鏡装置、内視鏡装置の作動方法及びプログラム |
WO2020100184A1 (ja) * | 2018-11-12 | 2020-05-22 | オリンパス株式会社 | 内視鏡用光源装置、内視鏡装置及び内視鏡用光源装置の作動方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010131620A1 (ja) | 2009-05-14 | 2010-11-18 | オリンパスメディカルシステムズ株式会社 | 撮像装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5408263A (en) * | 1992-06-16 | 1995-04-18 | Olympus Optical Co., Ltd. | Electronic endoscope apparatus |
JP4270634B2 (ja) * | 1999-03-18 | 2009-06-03 | オリンパス株式会社 | 内視鏡装置 |
WO2002007588A1 (fr) * | 2000-07-21 | 2002-01-31 | Olympus Optical Co., Ltd. | Endoscope |
US6826424B1 (en) * | 2000-12-19 | 2004-11-30 | Haishan Zeng | Methods and apparatus for fluorescence and reflectance imaging and spectroscopy and for contemporaneous measurements of electromagnetic radiation with multiple measuring devices |
JP4009626B2 (ja) * | 2004-08-30 | 2007-11-21 | オリンパス株式会社 | 内視鏡用映像信号処理装置 |
JP5191090B2 (ja) * | 2005-07-15 | 2013-04-24 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
JP5081720B2 (ja) * | 2008-05-22 | 2012-11-28 | 富士フイルム株式会社 | 蛍光内視鏡装置および励起光ユニット |
JP5460506B2 (ja) * | 2009-09-24 | 2014-04-02 | 富士フイルム株式会社 | 内視鏡装置の作動方法及び内視鏡装置 |
- 2012-08-24 WO PCT/JP2012/071496 patent/WO2013031701A1/ja active Application Filing
- 2012-08-24 EP EP12827919.7A patent/EP2671498B1/en active Active
- 2012-08-24 CN CN201280013720.0A patent/CN103429136B/zh active Active
- 2012-08-24 JP JP2013514467A patent/JP5326065B2/ja active Active
- 2013-04-04 US US13/856,748 patent/US20130286175A1/en not_active Abandoned
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010131620A1 (ja) | 2009-05-14 | 2010-11-18 | オリンパスメディカルシステムズ株式会社 | 撮像装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2671498A4 |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9545193B2 (en) | 2013-03-25 | 2017-01-17 | Olympus Corporation | Endoscope apparatus |
WO2014156253A1 (ja) * | 2013-03-25 | 2014-10-02 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
US10932649B2 (en) | 2014-03-17 | 2021-03-02 | Intuitive Surgical Operations, Inc. | Surgical system including a non-white light general illuminator |
JP2017511176A (ja) * | 2014-03-17 | 2017-04-20 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | 非白色光の一般的な照明装置を含む手術システム |
US10506914B2 (en) | 2014-03-17 | 2019-12-17 | Intuitive Surgical Operations, Inc. | Surgical system including a non-white light general illuminator |
US11759093B2 (en) | 2014-03-17 | 2023-09-19 | Intuitive Surgical Operations, Inc. | Surgical system including a non-white light general illuminator |
CN106455942A (zh) * | 2014-06-05 | 2017-02-22 | 奥林巴斯株式会社 | 处理装置、内窥镜***、内窥镜装置、图像处理方法以及图像处理程序 |
US10535134B2 (en) | 2014-06-05 | 2020-01-14 | Olympus Corporation | Processing apparatus, endoscope system, endoscope apparatus, method for operating image processing apparatus, and computer-readable recording medium |
WO2015186397A1 (ja) * | 2014-06-05 | 2015-12-10 | オリンパスメディカルシステムズ株式会社 | 処理装置、内視鏡システム、内視鏡装置、画像処理方法および画像処理プログラム |
US10779716B2 (en) | 2017-04-27 | 2020-09-22 | Olympus Corporation | Light source system, light source control method, first light source apparatus, and endoscope system |
JP2018158152A (ja) * | 2018-07-05 | 2018-10-11 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、内視鏡システムの作動方法、及びプロセッサ装置の作動方法 |
WO2021095417A1 (ja) * | 2019-11-12 | 2021-05-20 | 富士フイルム株式会社 | 内視鏡システム |
JPWO2021095417A1 (ja) * | 2019-11-12 | 2021-05-20 | ||
JP7273988B2 (ja) | 2019-11-12 | 2023-05-15 | 富士フイルム株式会社 | 内視鏡システム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013031701A1 (ja) | 2015-03-23 |
JP5326065B2 (ja) | 2013-10-30 |
CN103429136A (zh) | 2013-12-04 |
EP2671498A1 (en) | 2013-12-11 |
EP2671498B1 (en) | 2017-10-18 |
US20130286175A1 (en) | 2013-10-31 |
EP2671498A4 (en) | 2015-01-07 |
CN103429136B (zh) | 2015-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5326065B2 (ja) | 内視鏡装置 | |
JP6461739B2 (ja) | 画像処理装置及び内視鏡システム並びに画像処理装置の作動方法 | |
JP5259882B2 (ja) | 撮像装置 | |
KR101184841B1 (ko) | 내시경 장치 및 그 신호 처리 방법 | |
US10335014B2 (en) | Endoscope system, processor device, and method for operating endoscope system | |
JP5355846B2 (ja) | 内視鏡用画像処理装置 | |
JP5654511B2 (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法 | |
JP5308815B2 (ja) | 生体観測システム | |
WO2016121556A1 (ja) | 内視鏡用のプロセッサ装置、及びその作動方法、並びに制御プログラム | |
JP2007075198A (ja) | 電子内視鏡システム | |
US9414739B2 (en) | Imaging apparatus for controlling fluorescence imaging in divided imaging surface | |
CN110868908B (zh) | 医用图像处理装置及内窥镜***以及医用图像处理装置的工作方法 | |
JP2014061152A (ja) | 内視鏡システム及び内視鏡用光源装置並びに内視鏡画像の作成方法 | |
JPWO2019087557A1 (ja) | 内視鏡システム | |
JP5747362B2 (ja) | 内視鏡装置 | |
JP6392486B1 (ja) | 内視鏡システム | |
JP2014050594A (ja) | 内視鏡システム及び内視鏡画像の取得方法 | |
JP2009066147A (ja) | 生体観測装置 | |
JP2012125462A (ja) | 画像処理装置 | |
JP5856943B2 (ja) | 撮像システム | |
JP2005124755A (ja) | 内視鏡用画像処理装置 | |
JP2014050458A (ja) | 内視鏡用プローブ装置及び内視鏡システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013514467 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12827919 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2012827919 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012827919 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |