WO2016084608A1 - Endoscope system, endoscope system operation method, and program - Google Patents
Endoscope system, endoscope system operation method, and program
- Publication number: WO2016084608A1 (PCT/JP2015/081823)
- Authority: WO (WIPO, PCT)
Classifications
- A61B1/000095 — Operational features of endoscopes characterised by electronic signal processing of image signals during use of the endoscope, for image enhancement
- A61B1/00016 — Operational features of endoscopes characterised by signal transmission using wireless means
- A61B1/00045 — Operational features of endoscopes provided with output arrangements; display arrangement
- A61B1/0005 — Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B1/00096 — Constructional details of the endoscope body; optical elements at the distal tip of the insertion part
- A61B1/043 — Endoscopes combined with photographic or television appliances, for fluorescence imaging
- A61B1/0638 — Endoscopes with illuminating arrangements providing two or more wavelengths
- A61B1/0661 — Endoscope light sources
- A61B1/3132 — Endoscopes for introducing through surgical openings, for laparoscopy
- G02B23/2484 — Instruments for viewing the inside of hollow bodies; arrangements in relation to a camera or imaging device
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/70 — Denoising; smoothing
- G06T5/73 — Deblurring; sharpening
- G06T7/0012 — Image analysis; biomedical image inspection
- H04N23/60 — Control of cameras or camera modules
- H04N23/80 — Camera processing pipelines; components thereof
- H04N23/81 — Camera processing pipelines for suppressing or minimising disturbance in the image signal generation
- H04N23/555 — Constructional details for picking up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
- H04N25/60 — Noise processing in solid-state image sensors, e.g. detecting, correcting, reducing or removing noise
- G06T2207/10024 — Color image
- G06T2207/10048 — Infrared image
- G06T2207/10068 — Endoscopic image
- G06T2207/20016 — Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; pyramid transform
- G06T2207/20021 — Dividing image into blocks, subimages or windows
- G06T2207/20182 — Noise reduction or smoothing in the temporal domain; spatio-temporal filtering
- G06T2207/20216 — Image averaging
- G06T2207/20221 — Image fusion; image merging
- G06T2207/30032 — Colon polyp
- G06T2207/30096 — Tumor; lesion
Definitions
- The present technology relates to an endoscope system, an operation method of the endoscope system, and a program, and in particular to an endoscope system, operation method, and program capable of balancing reduction of the processing load against improvement of resolution in the image-quality enhancement processing applied to images captured by the endoscope apparatus of the system.
- In imaging with an endoscope system, in addition to general white light, special light such as narrow-band light and near-infrared light is used, for example for IR (infrared) observation and PDD (photodynamic diagnosis) observation.
- Imaging in such an endoscope system thus includes normal imaging using white light as illumination light and special-light imaging using narrow-band or near-infrared light. Although the input signals differ in amount of light and noise characteristics depending on which illumination is used, the same image processing has conventionally been applied regardless, so the processing was not necessarily suited to the characteristics of each image signal and was in some cases wasteful.
- The present technology has been made in view of this situation, and in particular makes it possible to balance the mutually conflicting goals of reducing the processing load of image processing and improving image quality, according to imaging conditions related to brightness such as the type of illumination light and the lens aperture.
- An endoscope system according to one aspect of the present technology includes: an endoscope apparatus in which an objective lens is provided at the distal end of a rigid insertion portion inserted into a body cavity; an imaging unit that captures the optical image condensed by the objective lens and input from the endoscope apparatus, and outputs it as an image signal; a low-frequency component extraction unit that extracts from the image signal a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts from the image signal a high-frequency image consisting of high-frequency components; an image-quality improvement unit that performs image-quality enhancement processing on the low-frequency image; and an output unit that outputs, as an output image, at least the enhanced low-frequency image and that, according to condition information at the time of imaging by the imaging unit, outputs as the output image an image signal in which the pixel value of each pixel of the high-frequency image is added to the pixel value of the corresponding pixel of the enhanced low-frequency image.
- The condition information may be information indicating whether or not a condition under which the ratio of noise components in the image signal captured by the imaging unit is high holds.
- The information indicating whether the ratio of noise components in the captured image signal is high may include information indicating the type of light emitted by the light source device that supplies illumination light when the imaging unit captures an image, information on the aperture of the objective lens of the endoscope apparatus, and information on the aperture of a relay lens provided between the imaging unit and the objective lens.
- The system may further include a reduction unit that reduces the low-frequency image at a predetermined reduction rate and an enlargement unit that enlarges the result at an enlargement rate corresponding to that reduction rate; the image-quality improvement unit can then perform the enhancement processing on the reduced low-frequency image. The reduction unit may also reduce the low-frequency image at a reduction rate according to the condition information, in which case the enlargement unit enlarges the enhanced reduced low-frequency image at the corresponding enlargement rate.
- The image-quality enhancement processing may include spatial-direction noise removal, temporal-direction noise removal, color correction, and band enhancement.
- The high-frequency component extraction unit may stop extracting the high-frequency image when the condition information indicates a condition under which the ratio of noise components in the image signal captured by the imaging unit is high.
- An operation method of the endoscope system according to one aspect of the present technology includes: capturing the optical image condensed by the objective lens provided at the distal end of a rigid insertion portion inserted into a body cavity, input from the endoscope apparatus, and outputting it as an image signal; extracting from the image signal a low-frequency image consisting of low-frequency components; extracting from the image signal a high-frequency image consisting of high-frequency components; performing image-quality enhancement processing on the low-frequency image; and outputting, as an output image, at least the enhanced low-frequency image, with, according to condition information at the time of imaging, an image signal in which the pixel value of each pixel of the high-frequency image is added to the pixel value of the corresponding pixel of the enhanced low-frequency image being output as the output image.
- A program according to one aspect of the present technology causes a computer to function as: an imaging unit that captures the optical image condensed by the objective lens of an endoscope apparatus in which the objective lens is provided at the distal end of a rigid insertion portion inserted into a body cavity, and outputs it as an image signal; a low-frequency component extraction unit that extracts from the image signal a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts from the image signal a high-frequency image consisting of high-frequency components; an image-quality improvement unit that performs image-quality enhancement processing on the low-frequency image; and an output unit that outputs, as an output image, at least the enhanced low-frequency image and that, according to condition information at the time of imaging by the imaging unit, outputs as the output image an image signal in which the pixel value of each pixel of the high-frequency image is added to the pixel value of the corresponding pixel of the enhanced low-frequency image.
- In one aspect of the present technology, the optical image condensed by the objective lens provided at the distal end of a rigid insertion portion inserted into a body cavity, input from the endoscope apparatus, is captured and output as an image signal; a low-frequency image consisting of low-frequency components and a high-frequency image consisting of high-frequency components are extracted from the image signal; image-quality enhancement processing is performed on the low-frequency image; and at least the enhanced low-frequency image is output as an output image, with, according to the condition information at the time of imaging, an image signal obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of the corresponding pixel of the enhanced low-frequency image being output as the output image.
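The claimed dataflow can be sketched end to end for a single grayscale frame. This is a minimal illustration only: the box blur standing in for the low-pass filter, the decimation/pixel-repetition pair used for reduction and enlargement, and the reuse of the blur as the "image quality" step are all assumptions, since the patent does not specify concrete filters.

```python
import numpy as np

def lpf(img):
    """Tiny 3x3 box blur standing in for the unspecified low-pass filter."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    return sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

def process_frame(frame, noisy_conditions, scale=2):
    """One pass of the claimed method for a single frame."""
    low = lpf(frame)                        # low-frequency component extraction
    high = frame.astype(np.float64) - low   # high-frequency component extraction
    reduced = low[::scale, ::scale]         # reduction at a predetermined rate
    enhanced = lpf(reduced)                 # stand-in image-quality processing
    restored = np.repeat(np.repeat(enhanced, scale, 0), scale, 1)  # enlarge back
    if noisy_conditions:
        return restored                     # output the low-frequency image only
    return restored + high                  # add the high-frequency image back
```

Under bright conditions the output differs from the dark-condition output by exactly the high-frequency image, which is the behavior the output unit of the claims describes.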
- Each component of the endoscope system may be an independent apparatus, or may be a block that implements the corresponding function within the endoscope system.
- According to one aspect of the present technology, the conflicting goals of reducing the processing load of the image-quality enhancement processing and increasing resolution can be adjusted in a balanced manner.
- FIG. 9 is a diagram illustrating a configuration example of a second embodiment of the image processing apparatus. FIG. 10 is a flowchart illustrating image processing by the image processing apparatus of FIG. 9.
- FIG. 11 is a diagram illustrating a configuration example of a general-purpose personal computer.
- FIG. 1 is a diagram illustrating an outline of an endoscope system to which the present technology is applied.
- This endoscope system is used in laparoscopic surgery, which in recent years has been performed in the medical field in place of conventional laparotomy.
- In laparoscopic surgery, instead of cutting open the abdominal wall 1 as has conventionally been done, opening devices called trocars 2 are attached to the abdominal wall 1.
- A laparoscope (hereinafter also referred to as an endoscope apparatus or endoscope) 11 and a treatment tool 3 are inserted into the body through holes provided in the trocars 2. While viewing, in real time, the image of the affected part (such as a tumor) 4 captured by the endoscope apparatus 11, treatment such as excising the affected part 4 with the treatment tool 3 is performed.
- In such laparoscopic surgery, the head portion 24 is held by an operator, an assistant, a scopist, a robot, or the like.
- the endoscope system 10 includes an endoscope device 11, an image processing device 12, and a display device 13.
- The endoscope apparatus 11 and the image processing apparatus 12 may be connected via a cable or wirelessly. The image processing apparatus 12 may also be located away from the operating room and connected via a network such as a LAN or the Internet. The same applies to the connection between the image processing apparatus 12 and the display device 13.
- the endoscope apparatus 11 includes a straight bar-shaped lens barrel portion 21 and a head portion 24.
- The lens barrel portion 21, also referred to as an optical viewing tube or rigid tube, has a length of about several tens of centimeters. An objective lens 22 is provided at the end inserted into the body, and the other end is connected to the head portion 24. An optical lens portion 23 forming a relay optical system is provided inside the lens barrel portion 21. Note that the shape of the lens barrel portion 21 is not limited to a straight bar.
- Lens barrels are roughly classified into direct-view scopes, in which the barrel axis and the optical axis coincide as shown in FIG. 2, and oblique-view scopes, in which the barrel axis and the optical axis form a predetermined angle; the lens barrel portion 21 of FIG. 2 is an example of a direct-view scope.
- the head unit 24 incorporates an imaging unit 25.
- the imaging unit 25 has an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and converts an optical image of the affected part input from the lens barrel part 21 into an image signal at a predetermined frame rate.
- A light source device 14 is connected to the endoscope apparatus 11 and supplies the light required for imaging, with which the affected part 4 is irradiated. The light source device 14 can emit light of various wavelengths and, in addition to normal light, can emit special light that makes the affected part 4 particularly easy to identify. The imaging unit 25 can therefore capture image signals under special light as well as under normal light.
- The optical image of the affected part 4 condensed by the objective lens 22 enters the imaging unit 25 of the head portion 24 via the optical lens portion 23, is converted by the imaging unit 25 into an image signal at a predetermined frame rate, and is output to the image processing apparatus 12 at the subsequent stage.
- The head portion 24 supplies information such as the type of light emitted by the light source device 14, the aperture of the objective lens 22, and the aperture of the optical lens portion 23 to the image processing apparatus 12 as condition information. Alternatively, the condition information may be entered in advance by the user through an operation unit (not shown).
- The condition information may also be recognized by the image processing apparatus 12 itself by analyzing the captured image signal.
- In the following description, it is assumed that the condition information is input to the image processing apparatus 12 by one of these methods.
- The information on the type of light may also be supplied directly from the light source device 14 to the image processing apparatus 12.
- FIG. 3 shows another configuration example of the endoscope apparatus 11.
- As shown in FIG. 3, the imaging unit 25 may be disposed immediately after the objective lens 22, and the optical lens portion 23 inside the lens barrel portion 21 may be omitted.
- The image processing apparatus 12 includes a low-frequency extraction unit 51, a high-frequency extraction unit 52, a reduction unit 53, a noise removal unit 54, a color correction unit 55, an enlargement unit 56, a low/high-frequency synthesis unit 57, a structure enhancement unit 58, and an electronic zoom unit 59.
- The low-frequency extraction unit 51 extracts the low-frequency components of the input image and outputs them to the high-frequency extraction unit 52 and the reduction unit 53. More specifically, the low-frequency extraction unit 51 includes, for example, an LPF (low-pass filter), extracts the low-frequency components of the input image, and outputs them as a low-frequency image to the high-frequency extraction unit 52 and the reduction unit 53.
- The high-frequency extraction unit 52 extracts the high-frequency component of each pixel of the input image and outputs it to the low/high-frequency synthesis unit 57. More specifically, the high-frequency extraction unit 52 extracts the high-frequency component by subtracting, for each pixel of the input image, the low-frequency component supplied from the low-frequency extraction unit 51, and outputs the result as a high-frequency image to the low/high-frequency synthesis unit 57.
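The per-pixel subtraction performed by units 51 and 52 can be sketched as follows; the box filter is an assumed stand-in for the LPF, which the text does not specify further. Note that the two bands sum back exactly to the input, which is what allows full resolution to be restored later by simple addition.

```python
import numpy as np

def low_pass(img, k=5):
    """Box low-pass filter; an assumed stand-in for the LPF of unit 51."""
    pad = k // 2
    p = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def split_bands(img):
    """Return (low, high): high is the input minus the LPF output per pixel."""
    low = low_pass(img)
    return low, img.astype(np.float64) - low
```

For a perfectly flat image the high-frequency band is zero, since the LPF output equals the input.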
- When the condition information indicates that the ratio of noise components in the captured image signal is high, the high-frequency extraction unit 52 stops the extraction.
- The reduction unit 53 reduces the image signal consisting of low-frequency components into a low-resolution image and outputs it to the noise removal unit 54 as a reduced image. More specifically, the reduction unit 53 lowers the resolution by thinning out the pixels of the low-frequency image signal, for example at predetermined intervals.
- The noise removal unit 54 performs noise removal processing on the reduced image. More specifically, the noise removal unit 54 applies, for example, 2D NR (two-dimensional noise reduction) and 3D NR (three-dimensional noise reduction) processing to the reduced image and outputs the denoised reduced image to the color correction unit 55.
- The two-dimensional noise removal here is so-called spatial-direction noise removal using the signal within a single reduced image, while the three-dimensional noise removal is so-called temporal-direction noise removal using multiple images along the time direction.
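As an illustration of the two kinds of noise removal, the spatial (2D NR) step can be sketched as smoothing within one frame, and the temporal (3D NR) step as recursive averaging across frames. The box kernel and the blend weight `alpha` are hypothetical parameters; the patent does not state what filters unit 54 actually uses.

```python
import numpy as np

def spatial_nr(frame, k=3):
    """2D NR sketch: smoothing within a single frame (assumed box kernel)."""
    pad = k // 2
    p = np.pad(frame.astype(np.float64), pad, mode="edge")
    out = np.zeros(frame.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return out / (k * k)

class TemporalNR:
    """3D NR sketch: recursive averaging over frames in the time direction."""
    def __init__(self, alpha=0.25):
        self.alpha = alpha   # weight of the current frame (assumed parameter)
        self.state = None

    def apply(self, frame):
        frame = frame.astype(np.float64)
        if self.state is None:
            self.state = frame
        else:
            # Blend the new frame into the running average of past frames.
            self.state = self.alpha * frame + (1 - self.alpha) * self.state
        return self.state
```

The recursive form needs no frame buffer beyond one accumulated image, which keeps the cost of the temporal step low on the already-reduced image.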
- the color correction unit 55 performs color correction on the reduced image subjected to the noise removal process, and outputs the reduced image to the enlargement unit 56.
- The enlargement unit 56 enlarges the processed reduced image at an enlargement rate corresponding to the reduction rate used by the reduction unit 53, converting it back to the same size as the input image, and outputs the resulting low-frequency image to the low/high-frequency synthesis unit 57.
- The low/high-frequency synthesis unit 57, based on the condition information specifying the brightness of imaging, such as the type of light source of the light source device 14 and the aperture sizes of the objective lens 22 and the optical lens portion 23 in the endoscope apparatus 11, controls its built-in adder 57a (FIG. 6) to output to the structure enhancement unit 58, as the image-quality-enhanced image signal, either the sum of the high-frequency and low-frequency image signals or the low-frequency image signal alone.
- When the condition information includes any of the following: information indicating that the light emitted by the light source device 14 is special light, information indicating that the aperture size of the objective lens 22 is smaller than a predetermined size, or information indicating that the aperture size of the optical lens portion 23 is smaller than a predetermined size, it indicates that imaging is relatively dark. In such a case the high-frequency components of the input image contain many noise components, so the low/high-frequency synthesis unit 57 stops the adder 57a (FIG. 6) and outputs, as the enhancement result, the low-frequency image consisting only of the enhanced low-frequency components.
- since the high-frequency component contains many noise components in this case, outputting only the low-frequency image makes it possible to output an image with an excellent signal-to-noise ratio (S/N), albeit at a lower resolution.
- conversely, if the condition information includes all of the following: information indicating that the light emitted from the light source device 14 is normal light consisting of white light; information indicating that the aperture size of the objective lens 22 is larger than a predetermined size; and information indicating that the aperture size of the optical lens unit 23 is larger than a predetermined size, it indicates that the brightness related to imaging is relatively bright. In such a case, the noise component is relatively small even in the high-frequency component. Therefore, the low/high frequency combining unit 57 controls the adding unit 57a (FIG. 6) to add, for each pixel, the high-frequency component to the quality-improved low-frequency component, and outputs the result as the image quality improvement processing result.
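The switching behavior of the low/high frequency combining unit 57 described above can be sketched as follows. This is an illustrative Python/NumPy sketch, not the patent's implementation; the function name and the boolean brightness flag standing in for the condition information are assumptions.

```python
import numpy as np

def combine_bands(low_freq: np.ndarray, high_freq: np.ndarray,
                  is_bright: bool) -> np.ndarray:
    """Sketch of the low/high frequency combining unit 57.

    When imaging is relatively bright, the adder (57a) is active and the
    high-frequency component is added back per pixel, restoring resolution.
    When imaging is relatively dark, the adder is stopped and only the
    denoised low-frequency image is output, favoring S/N over resolution.
    """
    if is_bright:
        return low_freq + high_freq  # adding unit 57a active
    return low_freq                  # adding unit 57a stopped

# Example: a flat low-frequency image plus some high-frequency detail.
low = np.full((4, 4), 100.0)
high = np.zeros((4, 4))
high[1, 1] = 20.0  # an edge-like detail

bright_out = combine_bands(low, high, is_bright=True)   # detail preserved
dark_out = combine_bands(low, high, is_bright=False)    # detail discarded
```

The design choice mirrors the trade-off in the text: resolution is only restored when the high band is trustworthy.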
- the structure enhancement unit 58 performs structure enhancement processing on the image signal with high image quality, and outputs the image signal to the electronic zoom unit 59.
- the electronic zoom unit 59 electronically enlarges the structure-enhanced image signal supplied from the structure emphasizing unit 58 to an appropriate size and outputs it to the display device 13.
- image processing by the image processing apparatus 12 of FIG. 4 will be described with reference to the flowchart of FIG. 5.
- it is assumed here that the image processing apparatus 12 is supplied with condition information indicating whether the light source device 14 is emitting normal light for normal imaging or emitting special light for special light imaging. It is further assumed that information on the apertures of the objective lens 22 and the optical lens unit 23 in the endoscope apparatus 11 is also supplied to the image processing apparatus 12 as condition information.
- in step S11, the low-frequency extraction unit 51 applies low-frequency component extraction processing by an LPF to each pixel of the input image to extract the low-frequency component, and outputs it to the high-frequency extraction unit 52 and the reduction unit 53.
- in step S12, the high-frequency extraction unit 52 determines whether the condition information indicates a relatively bright imaging state, for example that the light source device 14 is performing normal light imaging, or that the objective lens 22 and the optical lens unit 23 in use have apertures larger than a predetermined size. If the condition information indicates a relatively bright imaging state, the process proceeds to step S13.
- in step S13, the high-frequency extraction unit 52 subtracts the low-frequency component from each pixel of the input image to extract the high-frequency component, and outputs it to the low/high frequency combining unit 57.
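Steps S11 and S13 amount to a band split: a low-pass filter produces the low-frequency image, and a per-pixel subtraction yields the high-frequency image. A minimal NumPy sketch follows; the 3x3 box kernel and edge-replication padding are assumptions, since the patent only specifies an LPF.

```python
import numpy as np

def box_lpf(img: np.ndarray) -> np.ndarray:
    """3x3 box low-pass filter with edge replication (stand-in for the LPF
    of the low-frequency extraction unit 51)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def split_bands(img: np.ndarray):
    """Step S11: extract the low band; step S13: high band by subtraction."""
    low = box_lpf(img)
    high = img - low  # high-frequency extraction unit 52
    return low, high

img = np.arange(16, dtype=float).reshape(4, 4)
low, high = split_bands(img)
```

Because the high band is defined by subtraction, the two bands always reconstruct the input exactly, which is what allows the combining unit to restore full resolution later.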
- if, in step S12, the condition information indicates a relatively dark imaging state, the process in step S13 is skipped. In this case, as will be described later, the subsequent stage does not require a high-frequency component, so no image signal composed of the high-frequency component is extracted. As a result, the processing load of extracting the high-frequency component can be reduced.
- in step S14, the reduction unit 53 reduces the image size of the image signal composed of the low-frequency component of the input image, and outputs it to the noise removal unit 54 as a reduced image composed of the low-frequency component.
- in step S15, the noise removal unit 54 performs noise removal processing on the reduced image composed of the low-frequency component and outputs it to the color correction unit 55.
- in step S16, the color correction unit 55 performs color correction on the noise-removed reduced image composed of the low-frequency component, and outputs it to the enlargement unit 56.
- in step S17, the enlargement unit 56 enlarges the noise-removed reduced image at an enlargement rate corresponding to the reduction rate used by the reduction unit 53, returning it to the same image size as the input image, and outputs it to the low/high frequency combining unit 57.
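Steps S14 through S17, reduce, denoise, then enlarge back at the corresponding rate, can be sketched as below. Block-average downscaling, a 3x3 mean as the noise filter, and nearest-neighbour upscaling are all illustrative assumptions; the patent does not fix these particular algorithms.

```python
import numpy as np

def reduce_image(img: np.ndarray, rate: int) -> np.ndarray:
    """Step S14: shrink by an integer rate using block averaging."""
    h, w = img.shape
    img = img[:h - h % rate, :w - w % rate]  # crop to a multiple of rate
    hc, wc = img.shape
    return img.reshape(hc // rate, rate, wc // rate, rate).mean(axis=(1, 3))

def denoise(img: np.ndarray) -> np.ndarray:
    """Step S15 stand-in: simple 3x3 mean smoothing on the reduced image."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def enlarge_image(img: np.ndarray, rate: int) -> np.ndarray:
    """Step S17: nearest-neighbour upscale at the corresponding rate."""
    return np.repeat(np.repeat(img, rate, axis=0), rate, axis=1)

noisy = np.random.default_rng(0).normal(100.0, 5.0, (8, 8))
rate = 2
restored = enlarge_image(denoise(reduce_image(noisy, rate)), rate)
```

The point of the design is that the expensive per-pixel work (denoising, color correction) runs on the small image, which is why the processing load drops roughly by the square of the reduction rate.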
- in step S18, the low/high frequency combining unit 57 determines, based on the condition information, whether the imaging state is relatively bright. If the condition information indicates a relatively bright imaging state, the process proceeds to step S19.
- in step S19, the low/high frequency combining unit 57 adds and combines the noise-removed low-frequency component image signal (low-frequency image) supplied from the enlargement unit 56 and the high-frequency component image signal (high-frequency image) supplied from the high-frequency extraction unit 52, and outputs the result to the structure enhancement unit 58 as the quality-improved image (output image). That is, as shown in the upper part of FIG. 6, the low/high frequency combining unit 57 controls the internal adding unit 57a to add the noise-removed low-frequency component image signal (low-frequency image) supplied from the enlargement unit 56 and the high-frequency component image signal (high-frequency image) supplied from the high-frequency extraction unit 52. The low/high frequency combining unit 57 then outputs the image signal resulting from the addition to the structure enhancement unit 58 as a high-resolution, high-quality image signal (output image).
- if it is determined in step S18, based on the condition information, that the imaging state is relatively dark, the process proceeds to step S20.
- in step S20, the low/high frequency combining unit 57 outputs only the noise-removed image signal composed of the low-frequency component (low-frequency image) supplied from the enlargement unit 56, as it is, to the structure enhancement unit 58 as the output image. That is, as shown in the lower part of FIG. 6, the low/high frequency combining unit 57 stops the operation of the built-in adding unit 57a and outputs only the noise-removed low-frequency component image signal (low-frequency image) supplied from the enlargement unit 56 to the structure enhancement unit 58 as the quality-improved image signal (output image).
- in step S21, the structure enhancement unit 58 performs structure enhancement processing on the image signal supplied from the low/high frequency combining unit 57 and outputs it to the electronic zoom unit 59.
- in step S22, the electronic zoom unit 59 converts the structure-enhanced image signal to a resolution suitable for the display device 13, and outputs it to the display device 13 for display.
- through the above processing, when the ratio of the noise signal in the high-frequency component of the captured image signal is low, an output image consisting of both the low-frequency and high-frequency components is produced. When the ratio of the noise signal in the high-frequency component is high, the high-frequency component is not added to the low-frequency component, and the quality-improved low-frequency component image is output as the output image as it is.
- in FIG. 7, the resolution and S/N ratio of an image with the high-frequency component added are shown on the left side, and the resolution and S/N ratio of an image without the high-frequency component added are shown on the right side.
- for normal imaging, as shown in the upper left image of FIG. 8, the resolution can be made higher than in the upper right image of FIG. 8, and a clearer image can be provided. For special light imaging, in which the high-frequency component is noisy, improving the S/N ratio as shown in the lower right image of FIG. 8, compared with the lower left image of FIG. 8, makes it possible to improve the sensitivity and provide an image that is easier to recognize.
- in FIG. 8, an image with the high-frequency component added is shown on the left side, and an image without the high-frequency component added is shown on the right side. The upper row shows images from normal imaging, and the lower row shows images from special light imaging.
- furthermore, since the image is reduced before noise removal and color correction and then enlarged back to the size of the input image, the processing load of each of the noise removal and color correction processes can be reduced.
- FIG. 9 shows a configuration example of the second embodiment of the image processing apparatus 12, in which the reduction size is changed according to imaging conditions such as the input image and the brightness at the time of imaging.
- the image processing apparatus 12 in FIG. 9 differs from the image processing apparatus 12 in FIG. 4 in that a reduction unit 81 and an enlargement unit 82 are provided instead of the reduction unit 53 and the enlargement unit 56.
- the reduction unit 81 has the same basic configuration as the reduction unit 53, but changes the reduction rate according to the imaging conditions when reducing the input image. That is, a darker image, with more noise components and a lower S/N ratio, is assigned a higher reduction rate, and the reduction rate is lowered for brighter conditions closer to normal imaging.
- the enlargement unit 82 enlarges the image signal at an enlargement rate corresponding to the reduction rate of the reduction unit 81 according to the imaging conditions, and outputs the image signal to the low / high frequency synthesis unit 57.
- an image reduced at a reduction rate according to the imaging condition is subjected to noise removal and color correction, and then enlarged and output to the low and high frequency synthesis unit 57.
- Appropriate processing according to the noise level of the image can be performed, and the processing load can be appropriately reduced.
- in step S44, the reduction unit 81, based on the condition information, lowers the reduction rate for conditions closer to normal imaging so that the image is reduced to a size closer to that of the original image, while an image whose high-frequency component contains many noise components, as in special light imaging, is reduced to a smaller size.
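The condition-dependent rate selection in the reduction unit 81 can be sketched as a simple mapping. The concrete rate values below are illustrative assumptions; the patent only states that darker, noisier conditions get a stronger reduction.

```python
def reduction_rate(special_light: bool, small_aperture: bool) -> int:
    """Sketch of reduction unit 81's rate choice: conditions closer to
    normal imaging get a lower rate (size closer to the original), while
    dark, special-light conditions get a higher rate (smaller image,
    cheaper denoising, better S/N payoff)."""
    if not special_light and not small_aperture:
        return 2   # bright, near-normal imaging: mild reduction
    if special_light and small_aperture:
        return 8   # darkest case: strongest reduction
    return 4       # intermediate brightness

# The enlargement unit 82 would then enlarge by the same rate to restore
# the input image size.
```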
- in step S47, the enlargement unit 82 enlarges the image at an enlargement rate corresponding to the reduction rate used by the reduction unit 81, based on the condition information, and outputs it to the low/high frequency combining unit 57.
- in steps S48 to S52, low/high frequency synthesis, structure enhancement, and electronic zoom processing are performed.
- through the processing described above, as in the first embodiment, the resolution and S/N ratio are balanced according to the imaging conditions, and the processing load can be reduced appropriately according to the imaging conditions.
- <Example executed by software> The series of processes described above can be executed by hardware, but can also be executed by software.
- when the series of processes is executed by software, the program constituting the software is installed from a recording medium into a computer built into dedicated hardware, or into a computer capable of executing various functions by installing various programs, for example a general-purpose personal computer.
- FIG. 11 shows a configuration example of a general-purpose personal computer.
- This personal computer incorporates a CPU (Central Processing Unit) 1001.
- An input / output interface 1005 is connected to the CPU 1001 via a bus 1004.
- a ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
- the input/output interface 1005 is connected to an input unit 1006 including input devices such as a keyboard and a mouse with which a user inputs operation commands, an output unit 1007 that outputs a processing operation screen and images of processing results to a display device, a storage unit 1008 including a hard disk drive that stores programs and various data, and a communication unit 1009 including a LAN (Local Area Network) adapter and the like that executes communication processing via a network, typically the Internet.
- also connected is a drive 1010 that reads and writes data from and to a removable medium 1011 such as a magnetic disk (including flexible disks), an optical disc (including CD-ROM (Compact Disc-Read Only Memory) and DVD (Digital Versatile Disc)), a magneto-optical disk (including MD (Mini Disc)), or a semiconductor memory.
- the CPU 1001 executes various processes according to a program stored in the ROM 1002, or a program read from a removable medium 1011 such as a magnetic disk, optical disc, magneto-optical disk, or semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003.
- the RAM 1003 also appropriately stores data necessary for the CPU 1001 to execute various processes.
- the CPU 1001 performs the series of processes described above by, for example, loading the program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executing it.
- the program executed by the computer (CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 1008 via the input / output interface 1005 by attaching the removable medium 1011 to the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
- the program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
- in this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules in one housing, are both systems.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- furthermore, when a plurality of processes are included in one step, the plurality of processes included in that one step can be executed by one apparatus or shared and executed by a plurality of apparatuses.
- note that the present technology can also have the following configurations.
- (1) An endoscope system including: an endoscope apparatus in which an objective lens is provided at the distal end of a rigid insertion portion to be inserted into a body cavity; an imaging unit that captures an optical image input from the endoscope apparatus and condensed by the objective lens, and outputs it as an image signal; a low-frequency component extraction unit that extracts, from the image signal, a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts, from the image signal, a high-frequency image consisting of high-frequency components; an image quality improvement processing unit that applies image quality improvement processing to the low-frequency image; and an output unit that outputs, as an output image, at least the low-frequency image whose quality has been improved by the image quality improvement processing unit, wherein the output unit outputs, as the output image and according to condition information at the time of imaging in the imaging unit, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
- (2) The endoscope system according to (1), wherein the condition information is information indicating whether or not the conditions are such that the proportion of noise components in the image signal captured by the imaging unit becomes high.
- (3) The endoscope system according to (2), wherein the information indicating whether or not the conditions are such that the proportion of noise components in the image signal captured by the imaging unit becomes high includes information indicating the type of light emitted by a light source device that emits illumination light when the imaging unit captures an image, as well as information on the aperture of the objective lens in the endoscope apparatus and on the aperture of a relay lens provided between the imaging unit and the objective lens.
- (4) The endoscope system according to any one of (1) to (3), further including: a reduction unit that reduces the low-frequency image at a predetermined reduction rate; and an enlargement unit that enlarges an image at an enlargement rate corresponding to the reduction rate, wherein the image quality improvement processing unit applies the image quality improvement processing to the reduced low-frequency image.
- (5) The endoscope system according to (4), wherein the reduction unit reduces the low-frequency image at a reduction rate according to the condition information, and the enlargement unit enlarges the quality-improved reduced low-frequency image at an enlargement rate corresponding to the reduction rate according to the condition information.
- (6) The endoscope system according to any one of (1) to (5), wherein the image quality improvement processing includes spatial-direction noise removal, temporal-direction noise removal, color correction, and band enhancement processing.
- (7) The endoscope system according to any one of (1) to (6), wherein, according to the condition information at the time of imaging in the imaging unit, the high-frequency component extraction unit stops extraction of the high-frequency image from the image signal when the conditions are such that the proportion of noise components in the captured image signal becomes high.
- (8) An operation method of an endoscope system, including: capturing an optical image input from an endoscope apparatus, in which an objective lens is provided at the distal end of a rigid insertion portion to be inserted into a body cavity, and condensed by the objective lens, and outputting it as an image signal; extracting, from the image signal, a low-frequency image consisting of low-frequency components; extracting, from the image signal, a high-frequency image consisting of high-frequency components; applying image quality improvement processing to the low-frequency image; outputting, as an output image, at least the quality-improved low-frequency image; and outputting, as the output image and according to condition information at the time of imaging, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
- (9) A program causing a computer to function as: an imaging unit that captures an optical image input from an endoscope apparatus, in which an objective lens is provided at the distal end of a rigid insertion portion to be inserted into a body cavity, and condensed by the objective lens, and outputs it as an image signal; a low-frequency component extraction unit that extracts, from the image signal, a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts, from the image signal, a high-frequency image consisting of high-frequency components; an image quality improvement processing unit that applies image quality improvement processing to the low-frequency image; and an output unit that outputs, as an output image, at least the quality-improved low-frequency image, wherein the output unit outputs, as the output image and according to condition information at the time of imaging in the imaging unit, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
Abstract
Description
Claims (9)
- An endoscope system including: an endoscope in which an objective lens is provided at the distal end of an insertion portion to be inserted into a body cavity; an imaging unit that captures an optical image input from the endoscope and condensed by the objective lens, and outputs it as an image signal; a low-frequency component extraction unit that extracts, from the image signal, a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts, from the image signal, a high-frequency image consisting of high-frequency components; an image quality improvement processing unit that applies image quality improvement processing to the low-frequency image; and an output unit that outputs, as an output image, at least the low-frequency image whose quality has been improved by the image quality improvement processing unit, wherein the output unit outputs, as the output image and according to condition information at the time of imaging in the imaging unit, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
- The endoscope system according to claim 1, wherein the condition information is information indicating whether or not the conditions are such that the proportion of noise components in the image signal captured by the imaging unit becomes high.
- The endoscope system according to claim 2, wherein the information indicating whether or not the conditions are such that the proportion of noise components in the image signal captured by the imaging unit becomes high includes information indicating the type of light emitted by a light source device that emits illumination light when the imaging unit captures an image, as well as information on the aperture of the objective lens in the endoscope apparatus and on the aperture of a relay lens provided between the imaging unit and the objective lens.
- The endoscope system according to claim 1, further including: a reduction unit that reduces the low-frequency image at a predetermined reduction rate; and an enlargement unit that enlarges an image at an enlargement rate corresponding to the reduction rate, wherein the image quality improvement processing unit applies the image quality improvement processing to the reduced low-frequency image.
- The endoscope system according to claim 4, wherein the reduction unit reduces the low-frequency image at a reduction rate according to the condition information, and the enlargement unit enlarges the quality-improved reduced low-frequency image at an enlargement rate corresponding to the reduction rate according to the condition information.
- The endoscope system according to claim 1, wherein the image quality improvement processing includes spatial-direction noise removal, temporal-direction noise removal, color correction, and band enhancement processing.
- The endoscope system according to claim 1, wherein, according to condition information at the time of imaging in the imaging unit, the high-frequency component extraction unit stops extraction of the high-frequency image from the image signal when the conditions are such that the proportion of noise components in the image signal captured by the imaging unit becomes high.
- An operation method of an endoscope system, including: capturing an optical image input from an endoscope, in which an objective lens is provided at the distal end of an insertion portion to be inserted into a body cavity, and condensed by the objective lens, and outputting it as an image signal; extracting, from the image signal, a low-frequency image consisting of low-frequency components; extracting, from the image signal, a high-frequency image consisting of high-frequency components; applying image quality improvement processing to the low-frequency image; outputting, as an output image, at least the quality-improved low-frequency image; and outputting, as the output image and according to condition information at the time of imaging, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
- A program causing a computer to function as: an imaging unit that captures an optical image input from an endoscope, in which an objective lens is provided at the distal end of an insertion portion to be inserted into a body cavity, and condensed by the objective lens, and outputs it as an image signal; a low-frequency component extraction unit that extracts, from the image signal, a low-frequency image consisting of low-frequency components; a high-frequency component extraction unit that extracts, from the image signal, a high-frequency image consisting of high-frequency components; an image quality improvement processing unit that applies image quality improvement processing to the low-frequency image; and an output unit that outputs, as an output image, at least the quality-improved low-frequency image, wherein the output unit outputs, as the output image and according to condition information at the time of imaging in the imaging unit, an image signal consisting of pixel values obtained by adding the pixel value of each pixel of the high-frequency image to the pixel value of each pixel of the quality-improved low-frequency image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/112,234 US9986890B2 (en) | 2014-11-25 | 2015-11-12 | Endoscope system, operation method for endoscope system, and program for balancing conflicting effects in endoscopic imaging |
JP2016537591A JP6020950B1 (ja) | 2014-11-25 | 2015-11-12 | 内視鏡システム、および内視鏡システムの作動方法、並びにプログラム |
CN201580004855.4A CN105916430B (zh) | 2014-11-25 | 2015-11-12 | 内窥镜***以及内窥镜***的操作方法 |
US15/985,322 US10799087B2 (en) | 2014-11-25 | 2018-05-21 | Endoscope system, operation method for endoscope system, and program for balancing conflicting effects in endoscopic imaging |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014237702 | 2014-11-25 | ||
JP2014-237702 | 2014-11-25 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/112,234 A-371-Of-International US9986890B2 (en) | 2014-11-25 | 2015-11-12 | Endoscope system, operation method for endoscope system, and program for balancing conflicting effects in endoscopic imaging |
US15/985,322 Continuation US10799087B2 (en) | 2014-11-25 | 2018-05-21 | Endoscope system, operation method for endoscope system, and program for balancing conflicting effects in endoscopic imaging |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016084608A1 true WO2016084608A1 (ja) | 2016-06-02 |
Family
ID=56074177
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/081823 WO2016084608A1 (ja) | 2014-11-25 | 2015-11-12 | 内視鏡システム、および内視鏡システムの動作方法、並びにプログラム |
Country Status (4)
Country | Link |
---|---|
US (2) | US9986890B2 (ja) |
JP (2) | JP6020950B1 (ja) |
CN (1) | CN105916430B (ja) |
WO (1) | WO2016084608A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018148973A (ja) * | 2017-03-10 | 2018-09-27 | 株式会社Jvcケンウッド | 手術システム、手術器具、トロカール及び判定方法 |
WO2021079723A1 (ja) * | 2019-10-21 | 2021-04-29 | ソニー株式会社 | 画像処理装置、画像処理方法および内視鏡システム |
JP7406892B2 (ja) | 2019-03-11 | 2023-12-28 | キヤノン株式会社 | 医用画像処理装置、医用画像処理方法及びプログラム |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6631331B2 (ja) * | 2016-03-10 | 2020-01-15 | 富士通株式会社 | 生体認証装置、生体認証方法および生体認証プログラム |
JP6897679B2 (ja) * | 2016-06-28 | 2021-07-07 | ソニーグループ株式会社 | 撮像装置、撮像方法、プログラム |
EP3590415A4 (en) * | 2017-03-03 | 2020-03-18 | Fujifilm Corporation | ENDOSCOPE SYSTEM, PROCESSOR DEVICE, AND OPERATION METHOD OF THE ENDOSCOPE SYSTEM |
WO2020059098A1 (ja) * | 2018-09-20 | 2020-03-26 | オリンパス株式会社 | 画像処理装置 |
US11625825B2 (en) | 2019-01-30 | 2023-04-11 | Covidien Lp | Method for displaying tumor location within endoscopic images |
CN109949281A (zh) * | 2019-03-11 | 2019-06-28 | 哈尔滨工业大学(威海) | 一种胃镜图像质量检测方法及装置 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009100936A (ja) * | 2007-10-23 | 2009-05-14 | Fujinon Corp | 画像処理装置 |
JP2011193983A (ja) * | 2010-03-18 | 2011-10-06 | Olympus Corp | 内視鏡システム、撮像装置及び制御方法 |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4888639A (en) * | 1987-05-22 | 1989-12-19 | Olympous Optical Co., Ltd. | Endoscope apparatus having integrated disconnectable light transmitting and image signal transmitting cord |
JP2643965B2 (ja) * | 1988-01-19 | 1997-08-25 | オリンパス光学工業株式会社 | 内視鏡装置 |
JP4370008B2 (ja) * | 1998-11-17 | 2009-11-25 | オリンパス株式会社 | 内視鏡画像処理装置 |
JP4534756B2 (ja) * | 2004-12-22 | 2010-09-01 | ソニー株式会社 | 画像処理装置、画像処理方法、撮像装置、プログラム、及び記録媒体 |
TWI286032B (en) * | 2005-07-05 | 2007-08-21 | Ali Corp | Image enhancement system |
JP5355846B2 (ja) * | 2006-05-08 | 2013-11-27 | オリンパスメディカルシステムズ株式会社 | 内視鏡用画像処理装置 |
JP2008015741A (ja) * | 2006-07-05 | 2008-01-24 | Konica Minolta Holdings Inc | 画像処理装置、画像処理方法及びこれを用いた撮像装置 |
US7949182B2 (en) * | 2007-01-25 | 2011-05-24 | Hewlett-Packard Development Company, L.P. | Combining differently exposed images of the same object |
JP5076755B2 (ja) * | 2007-09-07 | 2012-11-21 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにコンピュータ・プログラム |
JP2009253760A (ja) * | 2008-04-08 | 2009-10-29 | Olympus Imaging Corp | 画像処理装置、画像処理方法及び電子機器 |
KR101493694B1 (ko) * | 2008-08-01 | 2015-02-16 | 삼성전자주식회사 | 이미지 처리장치, 이미지 처리방법 및 처리방법을실행시키기 위한 프로그램을 저장한 기록매체 |
KR101538655B1 (ko) * | 2008-11-21 | 2015-07-22 | 삼성전자주식회사 | 이미지 처리장치, 이미지 처리방법 및 처리방법을 실행시키기 위한 프로그램을 저장한 기록매체 |
JP5493760B2 (ja) * | 2009-01-19 | 2014-05-14 | 株式会社ニコン | 画像処理装置およびデジタルカメラ |
US8456541B2 (en) * | 2009-02-18 | 2013-06-04 | Olympus Corporation | Image processing apparatus and image processing program |
US8724928B2 (en) * | 2009-08-31 | 2014-05-13 | Intellectual Ventures Fund 83 Llc | Using captured high and low resolution images |
KR101618298B1 (ko) * | 2009-10-23 | 2016-05-09 | 삼성전자주식회사 | 고감도 영상 생성 장치 및 방법 |
JP4973719B2 (ja) * | 2009-11-11 | 2012-07-11 | カシオ計算機株式会社 | 撮像装置、撮像方法、及び撮像プログラム |
JP5609080B2 (ja) * | 2009-11-30 | 2014-10-22 | 富士通株式会社 | 画像処理装置、画像表示装置、画像処理プログラム及び画像処理方法 |
JP5640370B2 (ja) * | 2009-12-18 | 2014-12-17 | ソニー株式会社 | 画像処理装置,画像処理方法及び撮像装置 |
JP5541914B2 (ja) * | 2009-12-28 | 2014-07-09 | オリンパス株式会社 | 画像処理装置、電子機器、プログラム及び内視鏡装置の作動方法 |
CN102754443B (zh) * | 2010-02-12 | 2014-11-12 | 佳能株式会社 | 图像处理设备和图像处理方法 |
JP5366855B2 (ja) * | 2010-02-16 | 2013-12-11 | 富士フイルム株式会社 | 画像処理方法及び装置並びにプログラム |
JP2011234342A (ja) * | 2010-04-08 | 2011-11-17 | Canon Inc | 画像処理装置及びその制御方法 |
JP2011218090A (ja) * | 2010-04-14 | 2011-11-04 | Olympus Corp | 画像処理装置、内視鏡システム及びプログラム |
JP4991907B2 (ja) * | 2010-05-11 | 2012-08-08 | キヤノン株式会社 | 画像処理装置、および、画像処理装置の制御方法 |
JP2012075545A (ja) | 2010-09-30 | 2012-04-19 | Fujifilm Corp | 内視鏡システムおよび内視鏡の較正方法 |
JP5335017B2 (ja) * | 2011-02-24 | 2013-11-06 | 富士フイルム株式会社 | 内視鏡装置 |
KR101828411B1 (ko) * | 2011-09-21 | 2018-02-13 | 삼성전자주식회사 | 영상 처리 방법 및 영상 처리 장치 |
JP5816511B2 (ja) * | 2011-10-04 | 2015-11-18 | オリンパス株式会社 | 画像処理装置、内視鏡装置及び画像処理装置の作動方法 |
JP2014002635A (ja) * | 2012-06-20 | 2014-01-09 | Sony Corp | 画像処理装置、撮像装置、画像処理方法およびプログラム |
JP2014021928A (ja) * | 2012-07-23 | 2014-02-03 | Canon Inc | 画像処理装置、画像処理方法およびプログラム |
US9639915B2 (en) * | 2012-08-08 | 2017-05-02 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
US9106813B2 (en) * | 2013-02-27 | 2015-08-11 | Samsung Electronics Co., Ltd. | Noise suppression devices and image capturing apparatuses having the same |
CN104125442A (zh) * | 2013-04-26 | 2014-10-29 | 索尼公司 | 图像处理方法、装置以及电子设备 |
JP2014230176A (ja) * | 2013-05-23 | 2014-12-08 | ソニー株式会社 | 画像信号処理装置、画像信号処理方法、撮像装置および画像表示方法 |
-
2015
- 2015-11-12 CN CN201580004855.4A patent/CN105916430B/zh active Active
- 2015-11-12 WO PCT/JP2015/081823 patent/WO2016084608A1/ja active Application Filing
- 2015-11-12 JP JP2016537591A patent/JP6020950B1/ja active Active
- 2015-11-12 US US15/112,234 patent/US9986890B2/en active Active
-
2016
- 2016-09-30 JP JP2016193685A patent/JP6794747B2/ja active Active
-
2018
- 2018-05-21 US US15/985,322 patent/US10799087B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009100936A (ja) * | 2007-10-23 | 2009-05-14 | Fujinon Corp | 画像処理装置 |
JP2011193983A (ja) * | 2010-03-18 | 2011-10-06 | Olympus Corp | 内視鏡システム、撮像装置及び制御方法 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018148973A (ja) * | 2017-03-10 | 2018-09-27 | 株式会社Jvcケンウッド | 手術システム、手術器具、トロカール及び判定方法 |
JP7406892B2 (ja) | 2019-03-11 | 2023-12-28 | キヤノン株式会社 | 医用画像処理装置、医用画像処理方法及びプログラム |
WO2021079723A1 (ja) * | 2019-10-21 | 2021-04-29 | ソニー株式会社 | 画像処理装置、画像処理方法および内視鏡システム |
Also Published As
Publication number | Publication date |
---|---|
CN105916430A (zh) | 2016-08-31 |
CN105916430B (zh) | 2019-04-23 |
US20170251901A1 (en) | 2017-09-07 |
JP2017000839A (ja) | 2017-01-05 |
US9986890B2 (en) | 2018-06-05 |
US10799087B2 (en) | 2020-10-13 |
JPWO2016084608A1 (ja) | 2017-04-27 |
JP6794747B2 (ja) | 2020-12-02 |
JP6020950B1 (ja) | 2016-11-02 |
US20180263462A1 (en) | 2018-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6020950B1 (ja) | 内視鏡システム、および内視鏡システムの作動方法、並びにプログラム | |
JP5870264B2 (ja) | 撮像装置、撮像方法、プログラム、および集積回路 | |
US9858666B2 (en) | Medical skin examination device and method for processing and enhancing an image of a skin lesion | |
JP5814698B2 (ja) | 自動露光制御装置、制御装置、内視鏡装置及び内視鏡装置の作動方法 | |
JP5562808B2 (ja) | 内視鏡装置及びプログラム | |
JP2007124088A (ja) | 画像撮影装置 | |
JP2008165312A (ja) | 画像処理装置及び画像処理方法 | |
JP2009284001A (ja) | 画像処理装置、撮像装置及び画像処理方法 | |
JP2016519592A (ja) | ノイズ認識のエッジ強調 | |
US20170046836A1 (en) | Real-time endoscopic image enhancement | |
JP2023120364A (ja) | 光レベル適応フィルタ及び方法 | |
KR101538655B1 (ko) | 이미지 처리장치, 이미지 처리방법 및 처리방법을 실행시키기 위한 프로그램을 저장한 기록매체 | |
JP2017098863A (ja) | 情報処理装置、および情報処理方法、並びにプログラム | |
JP5963990B2 (ja) | 医療用システム、その画像処理設定方法、及び画像処理装置 | |
WO2017175452A1 (ja) | 画像処理装置、撮像装置、および画像処理方法、並びにプログラム | |
JP4089675B2 (ja) | 撮像装置 | |
US7822247B2 (en) | Endoscope processor, computer program product, endoscope system, and endoscope image playback apparatus | |
JP2005182232A (ja) | 輝度補正装置および輝度補正方法 | |
WO2017082091A1 (ja) | 手術システム、手術用制御方法、およびプログラム | |
JP5653082B2 (ja) | 電子内視鏡用の動画像強調処理システムおよび同システムの作動方法 | |
JP2009258284A (ja) | 画像処理装置、撮像装置、画像処理方法およびプログラム | |
WO2021079723A1 (ja) | 画像処理装置、画像処理方法および内視鏡システム | |
JP6344608B2 (ja) | 画像処理装置、画像処理方法、プログラム、及び、手術システム | |
JP7174064B2 (ja) | 画像信号処理装置、画像信号処理方法、プログラム | |
JP2023073635A (ja) | 電子内視鏡用プロセッサ及び電子内視鏡システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016537591 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15863817 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15112234 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15863817 Country of ref document: EP Kind code of ref document: A1 |