WO2022264688A1 - Medical Image Processing Apparatus, Endoscope System, and Method of Operating a Medical Image Processing Apparatus - Google Patents
- Publication number: WO2022264688A1 (application PCT/JP2022/018434)
- Authority: WIPO (PCT)
Classifications
- A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes
- A61B 1/0005 — Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B 1/000094 — Electronic signal processing of image signals during use of the endoscope, extracting biological structures
- A61B 1/000095 — Electronic signal processing of image signals during use of the endoscope, for image enhancement
- A61B 1/000096 — Electronic signal processing of image signals during use of the endoscope, using artificial intelligence
- G06T 11/203 — Drawing of straight lines or curves
- G06T 5/70 — Denoising; smoothing
- G06T 7/11 — Region-based segmentation
- G06T 2207/10016 — Video; image sequence
- G06T 2207/10068 — Endoscopic image
- G06T 2207/20092 — Interactive image processing based on input by user
- G06T 2207/20104 — Interactive definition of region of interest [ROI]
- G06T 2207/30096 — Tumor; lesion
- G06T 2210/41 — Medical
Definitions
- the present invention relates to a medical image processing apparatus, an endoscope system, and a method of operating a medical image processing apparatus.
- CAD: Computer-Aided Diagnosis
- Patent Document 1 discloses a medical image processing apparatus that, when indicating the extent of a lesion obtained by CAD on an endoscopic image, graphically notifies the attention area without obstructing observation of the boundary between the attention area and a non-attention area.
- in endoscopic submucosal dissection (ESD) or endoscopic mucosal resection (EMR), the boundary line of the lesion (demarcation line) must be identified in order to set the ablation range.
- ESD endoscopic submucosal dissection
- EMR endoscopic mucosal resection
- An object of the present invention is to provide a medical image processing apparatus, an endoscope system, and a method of operating the medical image processing apparatus that provide information on the boundary between the attention area and the non-attention area with higher accuracy.
- a medical image processing apparatus comprises a processor.
- the processor acquires an endoscopic image of a subject captured by an endoscope, sets, in a still image of the endoscopic image, a boundary line indicating the boundary between an attention area and a non-attention area of the subject, generates a boundary line display image in which the set boundary line is shown on the still image, and performs control to display a moving image of the endoscopic image and the boundary line display image on a display device;
- the boundary line displayed in the boundary line display image is updated and redisplayed each time a boundary line is set.
- the processor preferably detects and sets the boundary line based on the still image.
- the display device includes a first display device and a second display device, and the processor preferably controls display of the still image and/or the boundary line display image on the first display device and/or on the second display device, which is provided by a small terminal connected to the medical image processing apparatus.
- when a still image is displayed, the processor preferably sets the boundary line based on a drawing generated by the user on the displayed still image.
- the drawing is a positive point generated in the region of interest of the still image based on the user's judgment.
- the drawing is preferably a negative point generated in the non-attention area of the still image by the user's judgment.
- the processor controls the display of the still image on the second display device, and the drawing is a drawing generated on the still image displayed on the second display device.
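Such positive and negative points can seed a simple boundary estimate. The sketch below is a hypothetical illustration only, not the patent's actual algorithm: each pixel takes the label of its nearest seed point, and the boundary is the set of pixels adjacent to a differently labeled pixel.

```python
# Hypothetical sketch: derive a boundary from user-placed positive points
# (inside the lesion) and negative points (outside). Each pixel is labeled
# by its nearest seed point; the boundary is any pixel whose 4-neighbor
# carries the other label.

def nearest_label(x, y, pos_pts, neg_pts):
    """Label a pixel by its nearest seed point (1 = lesion, 0 = background)."""
    def d2(p):
        return (p[0] - x) ** 2 + (p[1] - y) ** 2
    return 1 if min(d2(p) for p in pos_pts) <= min(d2(p) for p in neg_pts) else 0

def boundary_pixels(w, h, pos_pts, neg_pts):
    labels = {(x, y): nearest_label(x, y, pos_pts, neg_pts)
              for x in range(w) for y in range(h)}
    border = set()
    for (x, y), lab in labels.items():
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb in labels and labels[nb] != lab:
                border.add((x, y))
                break
    return border

# Example: one positive point at the center, negative points near two corners.
border = boundary_pixels(9, 9, pos_pts=[(4, 4)], neg_pts=[(0, 0), (8, 8)])
```

A real system would refine this with image evidence (edges, texture) rather than distance alone, but the seed-point idea is the same.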
- the processor sets, as a new boundary line, a boundary line obtained by correcting the boundary line displayed in the boundary line display image.
- the processor preferably controls the display of the border display image on the second display device.
- the processor displays the moving image on the main screen of the first display device and displays the boundary line display image on the sub-screen of the first display device.
- the processor preferably performs control to display the still image on the sub-screen of the first display device.
- the processor displays, in the moving image, a boundary line corresponding to the boundary line displayed in the boundary line display image.
- the processor controls whether or not to display the boundary line in the moving image based on the user's instruction or the endoscopic image.
- the processor preferably finishes updating the boundary line based on the user's instruction or the endoscopic image.
- the still image is preferably obtained in the same examination as the moving image, or obtained in a different examination than the moving image.
- the endoscope system of the present invention includes an endoscope for photographing a subject, a display device, and a medical image processing device.
- the display device preferably includes a first display device and a second display device.
- a method of operating a medical image processing apparatus includes: a step of acquiring an endoscopic image of a subject captured by an endoscope; a step of setting, in a still image of the endoscopic image, a boundary line indicating the boundary between an attention area and a non-attention area of the subject; a step of generating a boundary line display image in which the set boundary line is shown on the still image; and a step of performing control to display a moving image of the endoscopic image and the boundary line display image on a display device, wherein the boundary line displayed in the boundary line display image is updated and displayed each time the boundary line is set.
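The claimed sequence of steps can be sketched in Python; the class and method names below are invented for this sketch and do not appear in the patent.

```python
# Schematic sketch of the operating method: acquire an endoscopic still
# image, set a boundary line on it, generate a boundary line display image,
# and refresh that image every time the line is (re)set.

class BoundaryLineProcessor:
    def __init__(self):
        self.boundary = None        # current boundary line (list of points)
        self.display_image = None   # still image with the boundary drawn in

    def set_boundary(self, still_image, boundary_points):
        """Each call replaces the boundary and refreshes the display image."""
        self.boundary = list(boundary_points)
        self.display_image = self.render(still_image)
        return self.display_image

    def render(self, still_image):
        # Placeholder: a real implementation would draw the line into the image.
        return {"image": still_image, "boundary": self.boundary}

proc = BoundaryLineProcessor()
first = proc.set_boundary("still_001", [(1, 1), (2, 2)])
second = proc.set_boundary("still_001", [(1, 1), (2, 3)])  # corrected line
```

The key point mirrored here is the update rule: the display image is regenerated on every boundary-setting call, so the viewer always sees the latest line.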
- according to the present invention, it is possible to provide information on the boundary between the attention area and the non-attention area with higher accuracy.
- FIG. 1 is an external view of an endoscope system.
- An explanatory diagram of the four colors of LEDs included in the light source unit.
- A graph showing the spectra of violet light V, blue light B, green light G, and red light R.
- A block diagram showing the functions of the boundary line processing unit.
- An image diagram of a display containing a still image.
- An image diagram of a touch panel including a home screen.
- An image diagram of a touch panel including thumbnails.
- An image diagram of a touch panel including a selected still image.
- An image diagram of a display containing selected still images.
- A block diagram showing the functions of the boundary line generation unit.
- An explanatory diagram of the functions of a learning model.
- An image diagram of a touch panel including a DL setting button.
- An image diagram of a touch panel including a positive point registration button.
- An image diagram of a touch panel including a negative point registration button.
- An image diagram of a touch panel including a generated boundary line.
- An image diagram of a touch panel including a boundary line display image.
- An image diagram of a touch panel including a correction button.
- An image diagram of a touch panel including a boundary line marked with vertices.
- An image diagram of a touch panel including moved vertices and the boundary line.
- A block diagram showing the functions of the boundary line correction unit.
- An explanatory diagram of the display of the degree of irregularity.
- An image diagram of a display including a boundary line display image.
- An image diagram of a display in the DL setting mode.
- An image diagram of a display including thumbnails.
- A flowchart explaining the flow of boundary line setting by the medical image processing apparatus.
- An explanatory diagram of a case where the medical image processing device is included in a diagnosis support device.
- An explanatory diagram of a case where the medical image processing device is included in a medical service support device.
- the endoscope system 10 includes an endoscope 12, a light source device 13, a processor device 14, a display 15 serving as a first display device, a keyboard 16, and a tablet 17, a small terminal serving as a second display device.
- the second display device is preferably a touch panel. Note that the term "display device" covers both the first display device and the second display device without distinguishing between them.
- the endoscope 12 is optically connected to the light source device 13 and electrically connected to the processor device 14 .
- the processor device 14 has a function as a medical image processing device.
- the tablet 17 is connected to the processor device 14 wirelessly or by wire.
- the medical image is an endoscopic image.
- An endoscopic image is an image obtained by photographing an observation target of an endoscope, which is a subject, with an endoscope.
- the processor device 14 has the function of a medical image processing device, but the device that performs the function of the medical image processing device may be a separate device from the processor device 14 .
- various connections are not limited to wired connections, but may be wireless connections, or may be connected via a network. Therefore, the functions of the medical image processing apparatus may be performed by an external device connected via a network.
- the endoscope 12 includes an insertion section 12a to be inserted into the body of a subject containing the observation target, an operation section 12b provided at the proximal end of the insertion section 12a, and a bending portion 12c and a distal end portion 12d on the distal end side of the insertion section 12a.
- the bending portion 12c is bent by operating the angle knob 12e (see FIG. 3) of the operation portion 12b.
- the distal end portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
- a forceps channel (not shown) for inserting a treatment tool or the like is provided from the insertion portion 12a to the distal end portion 12d.
- the treatment instrument is inserted into the forceps channel from the forceps port 12h. Air supply, water supply, or suction is also performed from the forceps port 12h.
- the operation unit 12b has a zoom operation unit 12f for changing the imaging magnification, a mode switching switch 12g for switching observation modes, and a freeze switch 12i for acquiring a still image.
- the observation mode switching operation, zoom operation, and still image acquisition operation can be instructed not only with the mode switching switch 12g, the zoom operation section 12f, and the freeze switch 12i, but also with the keyboard 16, a foot switch (not shown), or the like.
- the endoscope system 10 has a normal observation mode and a special observation mode.
- a normal image is displayed on the display 15, which is an endoscopic image with natural colors obtained by imaging the observation target using white light as the illumination light.
- a special image is displayed on the display 15, which is an endoscopic image of an observation target captured by emitting illumination light having a specific spectrum different from white light.
- An observation support mode can be added to each of the normal observation mode and the special observation mode.
- in the observation support mode, a function is performed to display, on a display device, a moving image of the endoscopic image and a boundary line display image, which is a still image showing a boundary line that indicates the boundary between an attention area and a non-attention area.
- the boundary line 18 is a line indicating the boundary between a lesion area 18a, which is the attention area, and a non-lesion area 18b, which is the non-attention area. It is usually a closed curve, and in ESD or EMR it is important to identify the boundary line 18 accurately in order to set the ablation line or range. In the drawings, the lesion area 18a is shaded.
- a normal image or a special image is used as the endoscopic image used in the observation support mode.
- as an observation mode, a multi-observation mode or the like, in which normal images and special images are automatically switched and acquired, may be provided.
- An observation support mode can be added to the multi-observation mode, and even when the observation support mode is added, the normal image and the special image can be automatically switched and used.
- the processor device 14 is electrically connected to the display 15 and keyboard 16 .
- the display 15 displays a moving image of an endoscopic image acquired during an examination, a still image 19, a boundary line display image to be described later, and/or various kinds of information.
- the keyboard 16 functions as a user interface that receives input operations such as function settings.
- the processor device 14 may be connected to an external storage (not shown) for storing images, image information, and the like.
- the light source device 13 emits illumination light to irradiate an observation target, and includes a light source unit 20 and a light source processor 21 that controls the light source unit 20 .
- the light source unit 20 is composed of, for example, a semiconductor light source such as a multicolor LED (Light Emitting Diode), a combination of a laser diode and a phosphor, or a xenon lamp or halogen light source.
- the light source unit 20 also includes an optical filter and the like for adjusting the wavelength band of light emitted by the LED or the like.
- the light source processor 21 controls the amount of illumination light by turning on/off each LED or the like and adjusting the driving current or driving voltage of each LED or the like.
- the light source processor 21 also controls the wavelength band of the illumination light by changing the optical filter or the like.
- the light source unit 20 includes a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and R-LED (Red Light Emitting Diode) 20d.
- V-LED: Violet Light Emitting Diode
- B-LED: Blue Light Emitting Diode
- G-LED: Green Light Emitting Diode
- R-LED: Red Light Emitting Diode
- the V-LED 20a generates violet light V with a central wavelength of 410 ⁇ 10 nm and a wavelength range of 380-420 nm.
- the B-LED 20b generates blue light B with a central wavelength of 450 ⁇ 10 nm and a wavelength range of 420-500 nm.
- the G-LED 20c generates green light G with a wavelength range of 480-600 nm.
- the R-LED 20d emits red light R with a central wavelength of 620-630 nm and a wavelength range of 600-650 nm.
- the light source processor 21 controls the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d. In the normal observation mode, the light source processor 21 controls the LEDs 20a-20d so as to emit normal light in which the light intensity ratio among violet light V, blue light B, green light G, and red light R is Vc:Bc:Gc:Rc.
- when set to the special observation mode, the light source processor 21 changes, for example, the combination of light intensity ratios among violet light V, blue light B, green light G, and red light R to emit illumination light having a specific spectrum.
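The ratio-based control above can be illustrated with a small helper. The ratio values below are hypothetical placeholders, as the patent does not give concrete Vc:Bc:Gc:Rc numbers.

```python
# Hedged sketch of light-intensity-ratio control: given a target V:B:G:R
# ratio and an overall brightness, compute per-LED drive levels whose sum
# equals the requested total output.

def led_drive_levels(ratio, total_brightness):
    """Scale the V/B/G/R ratio so the summed output equals total_brightness."""
    s = sum(ratio.values())
    return {color: total_brightness * r / s for color, r in ratio.items()}

# Hypothetical Vc:Bc:Gc:Rc ratio for the normal observation mode.
normal_mode = {"V": 1.0, "B": 2.0, "G": 3.0, "R": 2.0}
levels = led_drive_levels(normal_mode, total_brightness=100.0)
```

Switching observation modes then amounts to swapping in a different ratio dictionary while the same scaling logic enforces the overall brightness.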
- the light emitted by each of the LEDs 20a to 20d enters the light guide 41 via an optical path coupling section (not shown) composed of mirrors, lenses, and the like.
- the light guide 41 is built in the endoscope 12 and the universal cord (the cord connecting the endoscope 12, the light source device 13 and the processor device 14).
- the light guide 41 propagates the light from the optical path coupling portion to the distal end portion 12 d of the endoscope 12 .
- the distal end portion 12d of the endoscope 12 is provided with an illumination optical system 30a and an imaging optical system 30b.
- the illumination optical system 30 a has an illumination lens 42 , and the illumination light propagated by the light guide 41 is applied to the observation target via the illumination lens 42 .
- the imaging optical system 30 b has an objective lens 43 , a zoom lens 44 and an imaging sensor 45 .
- Various kinds of light such as reflected light, scattered light, and fluorescent light from the observation target enter the imaging sensor 45 via the objective lens 43 and the zoom lens 44 .
- an image of the observation target is formed on the imaging sensor 45 .
- the zoom lens 44 can be freely moved between the telephoto end and the wide end by operating the zoom operation section 12f to enlarge or reduce the observation target imaged on the imaging sensor 45.
- the imaging sensor 45 is a color imaging sensor in which each pixel is provided with one of R (red), G (green), or B (blue) color filters, and it outputs image signals of the three RGB colors. As the imaging sensor 45, a CCD (Charge Coupled Device) imaging sensor or a CMOS (Complementary Metal-Oxide Semiconductor) imaging sensor can be used. Further, instead of the imaging sensor 45 provided with primary color filters, a complementary color imaging sensor provided with C (cyan), M (magenta), Y (yellow), and G (green) complementary color filters may be used. When a complementary color imaging sensor is used, image signals of the four CMYG colors are output.
- CCD Charge Coupled Device
- CMOS Complementary Metal-Oxide Semiconductor
- by converting the CMYG four-color image signals into RGB three-color image signals through complementary-to-primary color conversion, RGB image signals similar to those of the imaging sensor 45 can be obtained.
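A crude illustration of complementary-to-primary conversion follows. A real device applies a calibrated color conversion matrix, so the simple formula here is only a stand-in, not the patent's method.

```python
# Toy complementary-to-primary conversion for normalized CMYG values in
# the range 0..1. R and B are taken as the complements of C and Y; the
# green estimate blends the M complement with the direct G sample. These
# formulas are illustrative assumptions, not a calibrated pipeline.

def cmyg_to_rgb(c, m, y, g):
    clamp = lambda v: max(0.0, min(1.0, v))
    r = clamp(1.0 - c)
    gr = clamp(((1.0 - m) + g) / 2.0)
    b = clamp(1.0 - y)
    return r, gr, b
```

For example, a pure-red patch (no cyan, full magenta/yellow, full G-filter response is debatable, so treat the numbers as inputs only) maps to a red-dominant RGB triple.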
- a monochrome sensor without a color filter may be used instead of the imaging sensor 45.
- the imaging sensor 45 is driven and controlled by an imaging control unit (not shown).
- the central control unit 58 controls the light emission of the light source unit 20 through the light source processor 21 in synchronization with the imaging control unit, thereby controlling the imaging of the observation target illuminated with normal light.
- the B pixels of the imaging sensor 45 output the Bc image signals
- the G pixels output the Gc image signals
- the R pixels output the Rc image signals.
- a CDS/AGC (Correlated Double Sampling/Automatic Gain Control) circuit 46 performs correlated double sampling (CDS) and automatic gain control (AGC) on analog image signals obtained from the imaging sensor 45 .
- the image signal that has passed through the CDS/AGC circuit 46 is converted into a digital image signal by an A/D (Analog/Digital) converter 47 .
- the digital image signal after A/D conversion is input to the processor device 14 .
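The CDS and AGC steps of this analog front end can be modeled digitally as follows. This is a toy illustration of the two operations' intent, not the circuit's actual behavior.

```python
# Toy model of the analog front end: correlated double sampling (CDS)
# subtracts each pixel's reset level from its signal level to remove
# per-pixel offset noise, and automatic gain control (AGC) scales the
# result toward a target mean level before A/D conversion.

def cds(signal_levels, reset_levels):
    """Correlated double sampling: remove the per-pixel reset offset."""
    return [s - r for s, r in zip(signal_levels, reset_levels)]

def agc(samples, target_mean):
    """Automatic gain control: scale samples so their mean hits the target."""
    mean = sum(samples) / len(samples)
    gain = target_mean / mean if mean else 1.0
    return [gain * s for s in samples]

pixels = cds([110, 130, 150], [10, 10, 10])   # offsets removed
leveled = agc(pixels, target_mean=60.0)       # brightness normalized
```

In the real signal chain both steps happen in analog circuitry per readout, but the subtract-then-scale structure is the same.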
- in the processor device 14, a program in a program memory is run by the central control unit 58, which is constituted by an image processor or the like serving as a first processor, thereby realizing the functions of the image acquisition unit 51, the DSP (Digital Signal Processor) 52, the noise reduction unit 53, the memory 54, the image processing unit 55, the display control unit 56, the video signal generation unit 57, and the central control unit 58.
- the central control unit 58 receives information from the endoscope 12 and the light source device 13 and, based on the received information, controls the endoscope 12 or the light source device 13 in addition to controlling each unit of the processor device 14. It also receives information such as instructions from the keyboard 16.
- the image acquisition unit 51 acquires a digital image signal of an endoscopic image input from the endoscope 12.
- the image acquisition unit 51 acquires, for each frame, an image signal obtained by photographing an observation target illuminated by each illumination light.
- the image acquisition unit 51 may acquire an endoscope image obtained by photographing an observation target illuminated with predetermined illumination lights having different spectra.
- the acquired image signal is sent to the DSP 52.
- the DSP 52 performs digital signal processing such as color correction processing on the received image signal.
- the noise reduction unit 53 performs noise reduction processing using, for example, a moving average method, a median filter method, or the like on the image signal that has been subjected to color correction processing or the like by the DSP 52 .
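The two noise-reduction options named above can be sketched on a 1-D signal for brevity; an actual image pipeline would slide a 2-D window over the pixels.

```python
# Sketch of the two noise-reduction methods mentioned for the noise
# reduction unit: a moving average (linear smoothing) and a median filter
# (robust to impulse noise), each with an edge-clamped window.

def moving_average(signal, k=3):
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def median_filter(signal, k=3):
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = sorted(signal[max(0, i - half): i + half + 1])
        out.append(window[len(window) // 2])
    return out

noisy = [10, 10, 90, 10, 10]      # a single impulse-noise spike
smoothed = median_filter(noisy)   # the median suppresses the spike
```

The contrast is the usual one: the moving average spreads the spike into its neighbors, while the median filter removes it outright, which is why median filtering is popular for salt-and-pepper-style sensor noise.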
- the noise-reduced image signal is stored in the memory 54 .
- the image processing unit 55 acquires the noise-reduced image signal from the memory 54, and subjects it to signal processing such as color conversion processing, color enhancement processing, and structure enhancement processing as necessary to generate a color endoscopic image showing the observation target.
- the image processing section 55 includes a normal image processing section 61 , a special image processing section 62 and a boundary line processing section 63 .
- the normal image processing unit 61 performs normal-image processing, such as color conversion processing, color enhancement processing, and structure enhancement processing, on the input one-frame noise-reduced image signal for the normal image.
- An image signal that has undergone image processing for a normal image is input to the display control unit 56 .
- the special image processing unit 62 performs special-image processing, such as color conversion processing, color enhancement processing, and structure enhancement processing, on the input one-frame noise-reduced image signal for the special image.
- the image signal subjected to the image processing for the special image is input to the display control section 56 as the special image.
- the endoscopic image generated by the image processing unit 55 is a normal image when the observation mode is the normal observation mode, and is a special image when the observation mode is the special observation mode.
- the content of structure enhancement processing differs depending on the observation mode.
- the image processing unit 55 In the normal observation mode, the image processing unit 55 generates a normal image by performing the various signal processing described above so that the observation target becomes a natural color.
- In the special observation mode, the image processing unit 55 generates a special image by performing the various signal processing described above, for example, to emphasize the blood vessels of the observation target.
- the display control unit 56 receives the endoscopic image generated by the image processing unit 55 and performs control for displaying it on the display 15 according to the control of the central control unit 58 .
- the endoscopic image controlled for display by the display control unit 56 is converted by the video signal generation unit 57 into a video signal for display on the display 15 and sent to the display 15.
- the display 15 displays the endoscopic image sent from the video signal generator 57 under the control of the display controller 56 .
- the boundary line processing unit 63 functions in the observation support mode. Therefore, the boundary line processing section 63 operates together with the normal image processing section 61 or the special image processing section 62, depending on the observation mode.
- the boundary line processing unit 63 acquires the endoscopic image from the memory 54 and, in the still image 19 of the endoscopic image, sets the boundary line 18 indicating the boundary between the attention area and the non-attention area in the subject.
- the boundary line processing unit 63 creates a boundary line display image by displaying the set boundary line 18 on the still image 19, and performs control to display the boundary line display image and the moving image of the endoscopic image on a display device such as the display 15.
- the boundary line 18 displayed in the boundary line display image is updated and displayed each time the boundary line 18 is set.
- the boundary line processing unit 63 includes a still image storage unit 71, a target image setting unit 72, a boundary line generation unit 73, a boundary line correction unit 74, a boundary line setting unit 75, and a boundary line display unit 76.
- the still image storage unit 71 stores the still image 19 for which the boundary line is to be set.
- the target image setting unit 72 sets the selected still image, which is the target still image 19 for setting the boundary line.
- the boundary line generator 73 generates a boundary line for the selected still image.
- the boundary line correction unit 74 corrects the generated boundary line 18 when necessary.
- the boundary line setting unit 75 sets the generated or modified boundary line and creates a boundary line display image.
- the boundary line display unit 76 displays the boundary line display image on a display device such as the display 15 or the like.
- The user operates the mode changeover switch 12g (see FIG. 2) of the endoscope operation unit 12b or the like to add the observation support mode during observation.
- A boundary line display image is created by setting a boundary line on the acquired still image 19 and can be displayed at a predetermined position on a display device such as the display 15.
- The still image 19 can be one acquired in the examination being performed at that time or one acquired in a past examination.
- An image acquired in the examination being performed at that time is acquired in the same examination as the moving image, and the user selects it from the still images 19 acquired in that examination.
- Still images 19 acquired in past examinations can be used by calling the still images 19 stored in the still image storage unit 71 .
- A still image 19 acquired in a past examination can be, for example, a still image 19 having a region of interest, such as a site or lesion similar to the observation target in the current examination, or a past still image 19 of the same site as the patient's site under examination.
- One or a plurality of still images 19 may be displayed on the display 15 and the selected still image may be chosen from the displayed still images 19, or one or a plurality of still images may be displayed on the touch panel 91 of the tablet 17 and the selected still image may be chosen from those displayed still images 19.
- the still image 19 or the like displayed on the touch panel 91 of the tablet 17 can also be displayed on the display 15 via the processor unit 14, and the images displayed by both can be synchronized.
- the still image 19 is obtained by the user operating the freeze switch 12i (see FIG. 2).
- the acquired still image 19 is stored in the still image storage unit 71 .
- The still images 19 stored in the still image storage unit 71 are displayed in a temporary display still image area 81 of the display 15; for example, the three most recently acquired still images 19 are arranged in order of shooting time.
- When a new still image 19 is acquired, the still image 19 with the oldest shooting time is deleted, and the newly acquired still image 19 is displayed instead.
- the display 15 includes a live moving image area 82 that displays a moving image 82a of an endoscopic image, and a fixed display still image area 83 that displays a selected still image or a boundary line display image.
- the selected still image is the image that sets the boundary line 18 .
- the target image setting unit 72 sets the still image 19 selected from the still images 19 stored in the still image storage unit 71 as the selected still image.
- Methods for selecting and setting the still image 19 include a method of selecting it with a cursor or the like in the temporary display still image area 81 displayed on the display 15, and a method of setting it using the tablet 17.
- A preferable method can be chosen depending on the situation, such as whether a person other than the endoscope operator is available to operate the device.
- the tablet 17 is used to set the selected still image.
- the home screen displayed on the touch panel 91 of the tablet 17 includes an image selection button 92a, a DL (Demarcation line) setting button 92b, and a reflect to processor button 92c. It also has a selected still image area 96 for displaying a selected still image.
- the image selection button 92a is a button for selecting a selected still image.
- the DL setting button 92b is a button for setting the boundary line 18.
- The reflect to processor button 92c is a button for sending the set boundary line 18 to the processor unit 14 and displaying it on the display 15.
- By pressing the image selection button 92a on the home screen of the touch panel 91 of the tablet 17, a predetermined number of thumbnails 93 of the acquired still images 19 are displayed in order.
- A check box 94 or other symbol may be attached to each thumbnail 93.
- One of the thumbnails 93 of the still image 19 is selected by touching one check box 94 with the touch pen 98 or the like. After that, by pressing the decision button 95, the still image 19 of the selected thumbnail 93 can be selected as the selected still image.
- the screen of the tablet 17 returns to the home screen, and the selected still image 19 is displayed as the selected still image 97 in the selected still image area 96.
- the processor unit 14 continues to display the selected still image 97 in the fixed display still image area 83 of the display 15. Therefore, the display 15 displays the current endoscope moving image 82a in the live moving image area 82, continuously displays the selected still image 97 in the fixed display still image area 83, and displays the three most recently acquired still images 19 in the temporary display still image area 81 while updating them.
- the boundary line generator 73 generates the boundary line 18 based on the selected still image 97 , that is, the still image 19 displayed in the fixed display still image area 83 of the display 15 .
- As methods of generating the boundary line 18, an automatic method of detecting and generating the boundary line 18 based on the selected still image 97 and a manual method in which the user generates the boundary line 18 by drawing on the selected still image 97 are adopted.
- the boundary generation unit 73 includes a boundary detection unit 101 , a drawing detection unit 102 , and a positive point/negative point analysis unit 103 .
- the boundary detection unit 101 detects the boundary 18 of the selected still image 97 based on the selected still image 97 and sets the boundary 18 .
- the drawing detection unit 102 detects the drawn boundary line 18 and sets the boundary line 18 .
- the positive point/negative point analysis unit 103 analyzes drawings of positive points, which the user draws in the attention area of the selected still image 97 based on the user's judgment, and/or negative points, which the user draws in the non-attention area of the selected still image 97 based on the user's judgment, detects the boundary line 18, and sets the detected boundary line 18.
- the boundary detection unit 101 automatically detects a boundary by calculation based on the selected still image 97 when boundary information is not associated with the selected still image 97 . If boundary line information is associated with the still image 19 of a past examination, the boundary line detection unit 101 reads the boundary line information. In this embodiment, since the selected still image 97 is based on the still image 19 acquired during the examination, the boundary line is detected from the selected still image 97 .
- As the boundary line detection method, a method using image processing, a method using a learning model obtained by machine learning, or the like can be used; any method may be adopted as long as the boundary line in the selected still image can be detected.
- MV: microvascular architecture
- MS: microsurface structure
- In the method using image processing, the gland duct structure and/or blood vessel structure of the observation target is extracted based on the selected still image 97, and discontinuous points are calculated using their density distributions and/or shape distributions.
- a discontinuity point can be calculated by edge detection or the like.
- a closed curve is generated by connecting the calculated discontinuous points. Since the boundary line 18 is the boundary between a lesion and a non-lesion, this closed curve can be used as the boundary line 18 .
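- The discontinuity-based closed-curve generation described above can be sketched as follows. This is an illustrative Python sketch, not the actual implementation: the gradient-threshold edge test, the threshold value, and the angular sort are assumptions standing in for whatever edge detection and curve connection the boundary line detection unit 101 actually uses.

```python
import math

def find_discontinuities(img, thresh):
    """Return (row, col) points where the local gradient magnitude
    exceeds `thresh` -- a crude stand-in for the discontinuity
    (edge) detection described above."""
    h, w = len(img), len(img[0])
    pts = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = img[r][c + 1] - img[r][c - 1]
            gy = img[r + 1][c] - img[r - 1][c]
            if math.hypot(gx, gy) > thresh:
                pts.append((r, c))
    return pts

def close_curve(pts):
    """Connect discontinuity points into a closed curve by sorting
    them by angle around their centroid (adequate for roughly
    convex lesion outlines)."""
    cr = sum(p[0] for p in pts) / len(pts)
    cc = sum(p[1] for p in pts) / len(pts)
    ring = sorted(pts, key=lambda p: math.atan2(p[0] - cr, p[1] - cc))
    return ring + ring[:1]  # repeat the first point to close the loop

# Synthetic "density map": a bright square lesion on a dark background.
img = [[1.0 if 3 <= r <= 6 and 3 <= c <= 6 else 0.0 for c in range(10)]
       for r in range(10)]
boundary = close_curve(find_discontinuities(img, 0.5))
```

The closed curve returned here plays the role of the boundary line 18 between lesion and non-lesion.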
- a learning model 111 that outputs a boundary line 18 when a still image 19 is input can be created and used. Since the selected still image 97 is the still image 19 , the learning model 111 outputs the boundary line 18 in the selected still image 97 when the selected still image 97 is input.
- the learning model 111 can be based on supervised learning, unsupervised learning, or the like.
- a learning model 111 based on supervised learning is generated by learning using still images for learning in which information about the boundary line 18 is associated with the still image 19 .
- the information about the boundary line 18 also includes that associated with the still image 19 not including the boundary line 18 .
- A test is performed using a still image 19 with a known boundary line 18, and various adjustments, such as of parameters, are made.
- the learning model 111 is generated by further adjusting parameters and the like so that the boundary line 18 is output correctly when the still image 19 with the unknown boundary line 18 is input.
- Machine learning techniques such as clustering can be used in the learning model 111 based on unsupervised learning.
- the learning model 111 is preferably a neural network model. Further, since the learning model 111 detects the boundary line 18 based on the still image 19, it is preferably a convolutional neural network. Therefore, learning model 111 preferably has a layered structure with an output layer that outputs boundary line 18 and at least one intermediate layer. A deep learning model is also preferred, as it may result in better detection results.
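- To illustrate why a convolutional structure suits this task, the following Python sketch shows the core operation of a convolutional layer applied to a small synthetic image. The Laplacian-like kernel is hand-picked here purely for illustration (a trained network would learn its weights); this is not the learning model 111 itself.

```python
def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation, the core operation of a
    convolutional layer such as those in a CNN."""
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(img[r + i][c + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(len(img[0]) - kw + 1)]
            for r in range(len(img) - kh + 1)]

# A Laplacian-like kernel responds only at intensity discontinuities,
# i.e. at the kind of lesion/non-lesion border the boundary line 18
# traces; uniform regions produce zero response.
laplacian = [[0, 1, 0],
             [1, -4, 1],
             [0, 1, 0]]
img = [[1 if 2 <= r <= 5 and 2 <= c <= 5 else 0 for c in range(8)]
       for r in range(8)]
response = conv2d(img, laplacian)
```

Stacking such layers with learned kernels, plus an output layer that emits the boundary line 18, gives the layered structure described above.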
- the drawing detection unit 102 generates the boundary line 18 by drawing on the selected still image 97 by the user. Drawing is preferably performed on the tablet 17 having the touch panel 91 .
- the selected still image 97 is displayed in the selected still image area 96 on the home screen displayed on the touch panel 91 of the tablet 17 (see FIG. 10).
- As the drawing method, there is, for example, a method in which the user draws on the selected still image 97 displayed on the tablet 17.
- the drawings can be line drawings, stipple drawings, graphics, or the like.
- A line drawing drawn in the region of interest, based on the user's visual observation of the still image 19, can be used as the boundary line 18.
- a method may be adopted in which the positive points and/or negative points drawn by the user on the selected still image 97 displayed on the tablet 17 are analyzed by the positive point/negative point analysis unit 103 to generate the boundary line 18.
- a positive point is a point determined by the user by viewing the still image 19 and drawn in the attention area.
- a negative point is a point determined by the user's visual observation of the still image 19 and drawn in the non-attention area.
- the touch panel 91 of the tablet 17 is provided with a DL setting button 92b for setting the boundary line 18.
- By pressing the DL setting button 92b, a positive point selection button 122, a negative point selection button 123, a DL generation button 124, a correction button 125, a DL setting button 126, and a return button 122c are displayed.
- the positive point selection button 122 is a button for registering positive points on the selected still image 97.
- the negative point selection button 123 is a button for registering negative points on the selected still image 97 .
- the DL generation button 124 is a button for generating the boundary line 18 based on the registered positive points and/or negative points.
- a correction button 125 is a button for correcting the generated boundary line 18 .
- the DL setting button 126 is a button for sending the generated boundary line 18 to the processor unit 14 and synchronously updating the selected still image 97 displayed on the display 15 .
- the return button 122 c is a button for returning to the home screen of the tablet 17 . It should be noted that the image selection button 92a may be disabled when the DL setting button 92b is pressed to prevent erroneous operations.
- pressing the positive point selection button 122 displays a positive point registration button 122a, a delete button 122b, and a return button 122c.
- the positive point registration button 122 a is a button for registering positive points drawn on the selected still image 97 .
- the delete button 122b is a button for deleting registered positive points.
- the return button 122c is a button for returning to the screen one step before.
- By pressing the positive point selection button 122 and touching the selected still image 97 with a finger, the touch pen 98, or the like, a positive point 127 can be drawn at the touched location. After drawing, this drawing is registered as a positive point 127 by pressing the positive point registration button 122a. Here, if the return button 122c is pressed, the screen returns to the screen for selecting either the positive point 127 or the negative point (see FIG. 14).
- Negative points are drawn by operating in the same way.
- By pressing the negative point selection button 123, a negative point registration button 123a is displayed.
- By touching the selected still image 97, a negative point 128 can be drawn at the touched location. After drawing, this drawing is registered as a negative point 128 by pressing the negative point registration button 123a.
- the positive point/negative point analysis unit 103 generates the boundary line 18 by analyzing the positive points 127 and/or the negative points 128 . For example, between positive points 127 and negative points 128, boundary line 18 is generated. Therefore, it is preferable to draw a plurality of positive points 127 and negative points 128 . This is because there is a high possibility that the positive point/negative point analysis unit 103 will generate a more accurate boundary line 18 .
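- One plausible realization of the analysis by the positive point/negative point analysis unit 103 is to place boundary samples midway between each positive point and its nearest negative point, since the true border must lie between a point inside the attention area and one outside it. The following Python sketch illustrates this idea; the nearest-neighbour pairing rule is an assumption for illustration, not the method claimed here.

```python
import math

def boundary_from_points(positives, negatives):
    """For each positive point, find its nearest negative point and
    take the midpoint as a boundary sample: the lesion border lies
    somewhere between a point inside the attention area and a point
    outside it."""
    samples = []
    for p in positives:
        q = min(negatives, key=lambda n: math.dist(p, n))
        samples.append(((p[0] + q[0]) / 2, (p[1] + q[1]) / 2))
    return samples

# Positive points drawn inside the lesion, negative points outside.
pos = [(5, 5), (5, 6), (6, 5)]
neg = [(1, 5), (5, 10), (10, 5)]
border = boundary_from_points(pos, neg)
```

Drawing more positive and negative points yields more boundary samples, which is consistent with the observation above that a plurality of points tends to produce a more accurate boundary line 18.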
- High accuracy of the boundary line 18 includes showing the boundary between a lesion and a non-lesion in the observation target more correctly, even in response to changes over time.
- The selected still image 97 displayed on the touch panel 91 can be moved, enlarged, or reduced by an on-screen operation such as dragging or pinching, so that the user can easily determine the positive points 127 and/or the negative points 128 in the selected still image 97.
- the user can determine the selected still image 97 in detail by enlarging it, etc., and draw a plurality of positive points 127 and/or negative points 128 or finely draw them, thereby generating a more accurate boundary line 18 .
- A biopsy may be performed on a portion of the region of interest, and positive points 127 and/or negative points 128 may be entered based on the location and results of the biopsy.
- the user presses the DL (Demarcation line) generation button 124 .
- the boundary line 18 generated by the positive point/negative point analysis unit 103 is displayed on the selected still image 97 .
- the boundary line setting unit 75 sets the generated boundary line 18 .
- the user presses the DL setting button 126 when the user considers the generated boundary line 18 to be appropriate.
- the boundary line 18 is set on the selected still image 97 .
- the display of the positive points 127 and the negative points 128 disappears.
- When the work of setting the boundary line 18 is completed, the user presses the return button 122c to return to the home screen of the tablet 17.
- the selected still image area 96 of the home screen displays a boundary line display image 129 in which the boundary line 18 set in the selected still image 97 is displayed.
- the boundary line display image 129 is an image in which the boundary line 18 is displayed on the still image 19 .
- the boundary line correction unit 74 corrects the set boundary line 18 and sets the corrected boundary line 18 again. If the user thinks that the generated boundary line 18 is not suitable, the boundary line 18 can be modified to generate a suitable boundary line 18.
- Correction includes a method of manually correcting the generated boundary line 18, a method of correcting based on the still image 19, a method of correcting based on the boundary line display image 129 acquired in the past, and the like.
- Manual correction methods include a correction method by manually moving the generated boundary line 18 and a correction method by enlarging, reducing, or rotating the boundary line 18 .
- As a method of correcting based on the still image 19, a method can be used in which the user designates a degree of atypicality determined from, for example, differences or feature amounts related to the color, shape, or surface mucosal structure of the observation target in the selected still image 97, which is the target still image 19 for generating the boundary line 18.
- The boundary line 18 can be manually modified by pressing the manual button 131.
- The boundary line 18 can be enlarged by pressing the enlarge button.
- The boundary line 18 can be reduced by pressing the reduce button.
- a vertex 141 is displayed on the boundary line 18 .
- the boundary line 18 can be moved by dragging the vertex 141 with a finger or a touch pen 98 or the like.
- pressing the return button 134 returns to the previous screen.
- the boundary line correction unit 74 includes an atypicality determination unit 142.
- the atypicality determination unit 142 determines the degree of atypicality based on the still image 19.
- the degree of atypicality is determined based on the selected still image 97, which is the still image 19 from which the boundary line 18 is generated.
- the determined degree of atypicality is represented by a numerical value. Therefore, in the selected still image 97, regions can be divided according to the degree of atypicality determined by the atypicality determination unit 142.
- A region with high disease severity may have a more reddish mucosal color than a region with low severity.
- the locally injected site has a shape that swells more than the surrounding area.
- When the atypicality determination unit 142 determines the degree of atypicality from shape, a portion with a shape that swells more than its surroundings is given a high degree of atypicality. The larger the bulging shape, the higher the degree of atypicality assigned; the smaller the bulging shape, the lower the degree assigned.
- the user can determine to which range the boundary line 18 is to be modified by specifying the numerical value or numerical range of the degree of atypicality.
- the degree of atypicality is divided into five stages from 1 to 5, depending on the numerical range.
- As shown in FIG. 23(A), in the boundary line display image 129, the boundary line 18 is displayed in the region of interest, and a region 151 with atypicality degree 1 and a region 152 with atypicality degree 3 exist around the boundary line 18 and are displayed as temporary lines according to the user's instruction.
- As shown in FIG. 23(B), when the user designates atypicality degree 3, the boundary line 18 is corrected to the region of atypicality degree 3.
- By using the degree of atypicality, the user can easily and accurately correct the boundary line 18 to the desired one.
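- The five-stage degree of atypicality and the selection of the region for a designated stage can be illustrated with the following Python sketch. The score thresholds and the score map are invented for illustration; how the atypicality determination unit 142 actually computes its numerical values is not specified at this level of detail.

```python
def atypicality_stage(score, thresholds=(0.2, 0.4, 0.6, 0.8)):
    """Map a continuous atypicality score in [0, 1] to one of the
    five stages 1-5 described above (thresholds are assumptions)."""
    for stage, t in enumerate(thresholds, start=1):
        if score < t:
            return stage
    return 5

def region_at_stage(score_map, stage):
    """Pixels whose stage is at least `stage`: the candidate region
    the boundary line 18 is corrected to when the user designates
    that stage."""
    return [(r, c)
            for r, row in enumerate(score_map)
            for c, s in enumerate(row)
            if atypicality_stage(s) >= stage]

# Per-pixel atypicality scores for a tiny illustrative image.
score_map = [[0.1, 0.1, 0.1],
             [0.1, 0.5, 0.9],
             [0.1, 0.9, 0.9]]
region = region_at_stage(score_map, 3)
```

Designating a lower stage yields a larger region (boundary with a margin); a higher stage shrinks the region toward the most atypical tissue, matching the two display preferences mentioned later.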
- In the method using the degree of atypicality, the following method can also be adopted.
- When a boundary line display image 129 acquired in the past has a boundary line 18 generated without using the degree of atypicality, such as a boundary line 18 manually drawn by the user on the still image 19, the boundary line 18 is first automatically generated for the past boundary line display image 129 and its degree of atypicality is determined. It is then examined what numerical value of the degree of atypicality the automatically generated boundary line 18 corresponds to. A boundary line 18 is then generated for the newly acquired still image 19 based on the degree of atypicality thus examined.
- Correction methods based on a boundary line display image 129 acquired in the past include a method of reading the past boundary line display image 129, aligning it with the still image 19 acquired in the current examination, and reflecting its boundary line 18 on the still image 19; and a method of reading a still image 19 associated with positive points and/or negative points acquired in the past, aligning it with the still image 19 acquired in the current examination, and reflecting the positive points and/or negative points on the still image 19.
- When a preferable boundary line 18 has been generated in the past for the same observation target, the desired boundary line 18 can be obtained accurately and easily by correcting it using the information of the past boundary line 18.
- the boundary line 18 correction method as described above may be applied when the boundary line 18 is generated.
- the boundary line 18 can be corrected easily and quickly with a high degree of freedom. Depending on the use of the boundary line 18 or the user's preference, there are cases where it is desired to display the boundary line 18 with a margin from the lesion, or to display the boundary line 18 just above the lesion. Since the generated boundary line 18 can be freely and easily modified on the spot, the boundary line 18 can be generated to meet various needs, which is preferable.
- the drawing is preferably smoothed.
- Smoothing processing is processing for making a drawing smooth.
- By smoothing, even a boundary line 18 drawn by hand can be made into a smooth boundary line 18.
- As a specific method, the coordinates or feature amounts of the boundary line 18 drawn by hand can be averaged in whole or in part.
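- The whole-or-part coordinate averaging mentioned above can be sketched as a moving average over the vertices of a closed hand-drawn curve. The window size and the wrap-around treatment of the endpoints are assumptions made for this Python illustration.

```python
def smooth_closed_curve(points, window=3):
    """Moving-average smoothing of a hand-drawn closed boundary:
    each vertex is replaced by the mean of its `window` neighbours;
    indices wrap around because the curve is closed."""
    n = len(points)
    half = window // 2
    out = []
    for i in range(n):
        xs = [points[(i + k) % n][0] for k in range(-half, half + 1)]
        ys = [points[(i + k) % n][1] for k in range(-half, half + 1)]
        out.append((sum(xs) / window, sum(ys) / window))
    return out

# A jagged hand-drawn outline; smoothing rounds off the spikes.
rough = [(0, 0), (2, 1), (4, 0), (5, 2), (4, 4), (2, 5), (0, 4), (-1, 2)]
smooth = smooth_closed_curve(rough)
```

A larger window gives a smoother but less faithful boundary line 18; applying the average only to part of the vertex list corresponds to smoothing the drawing "in part".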
- the user presses the DL setting button 126 if the generated boundary line 18 is considered appropriate.
- When the work of setting the boundary line 18 is completed, the user presses the return button 122c to return to the home screen of the tablet 17.
- a boundary line display image 129 is displayed in the selected still image area 96 on the home screen of the tablet (see FIG. 19). In this case, the boundary line display image 129 is transmitted to the processor unit 14 by pressing the reflect to processor button 92c.
- the boundary line display unit 76 displays the boundary line display image 129 on the display 15 .
- the boundary line display image 129 displayed in the selected still image area 96 of the tablet 17 is displayed in the fixed display still image area 83 of the display 15 in synchronization with the display of the tablet 17.
- the display 15 preferably has a main screen and a sub-screen.
- It is preferable that the live moving image area 82, which is the main screen, displays the moving image 82a of the endoscopic image under examination, and that the fixed display still image area 83, which is a sub-screen, displays the boundary line display image 129.
- two or more sub-screens are provided, and the boundary line display image 129 is displayed in the fixed display still image area 83, which is one of the sub-screens, and the temporary display still image area 81 (see FIG. 11) is displayed in the other sub-screen.
- In this way, the still image 19 suitable for displaying the boundary line 18 can be selected easily and quickly, and after selection of the still image 19, the boundary line display image 129 displaying the boundary line 18 can be displayed during the examination.
- the display device is not limited to the display 15, nor is the number limited to one or two. Depending on the situation, the number of display devices, screens to be displayed, or the like can be set as appropriate.
- A boundary line 18 corresponding to the boundary line 18 displayed in the boundary line display image 129 may also be displayed in the moving image 82a.
- each frame of the moving image 82a is aligned with the boundary line display image 129, and then the boundary line 18 is superimposed and displayed.
- a frame is an endoscopic image obtained by one imaging.
- the moving image 82a is, for example, 60 fps (frames per second).
- When the boundary line display image 129 is an image obtained by photographing the observation target in a range larger than the frame of the moving image 82a, that is, when the frame of the moving image 82a is included in the boundary line display image 129, the frame of the moving image 82a may be superimposed and displayed on the boundary line display image 129.
- When the boundary line display image 129 is an image obtained by photographing the observation target in a range smaller than the frame of the moving image 82a, that is, when the boundary line display image 129 is included in the frame of the moving image 82a, the boundary line display image 129 may be superimposed and displayed on the frame of the moving image 82a.
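- The superimposition after alignment can be sketched as follows, assuming for illustration that alignment reduces to a simple translation offset between the still-image and frame coordinate systems; real alignment of endoscopic images would be more involved (rotation, scale, deformation). All names here are illustrative.

```python
def align_and_overlay(boundary, offset, frame_size):
    """Translate boundary-line points from still-image coordinates
    into the coordinates of the current video frame, keeping only
    the points that fall inside the frame. `offset` is the (dy, dx)
    shift assumed to come from a separate alignment step."""
    dy, dx = offset
    h, w = frame_size
    shifted = [(y + dy, x + dx) for y, x in boundary]
    return [(y, x) for y, x in shifted if 0 <= y < h and 0 <= x < w]

# Boundary line 18 set on the still image 19; the alignment step is
# assumed to have estimated that the frame is shifted by (10, -5).
boundary = [(20, 20), (20, 30), (30, 30), (30, 20)]
visible = align_and_overlay(boundary, (10, -5), (64, 64))
```

Repeating this per frame (e.g. 60 times per second at 60 fps) keeps the displayed boundary line 18 registered to the moving observation target.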
- whether or not to display the boundary line 18 on the moving image 82a may be controlled based on the user's instruction or the endoscopic image.
- the display of the boundary line 18 may be controlled based on the endoscopic image. For example, when it is determined that the endoscope is moving by analyzing the endoscope image, control is performed so that the boundary line 18 is not automatically displayed, and detailed observation is made without the endoscope moving. If there is a region of interest in the observation target, control can be performed such that the boundary line 18 is automatically displayed. This can be beneficial because the border 18 is automatically displayed without user prompting.
- In this way, the properly set boundary line 18 is displayed on the moving image 82a of the endoscopy. This is a useful aid when a physician needs to determine the boundary line 18 during diagnosis, ESD, EMR, or the like.
- the boundary line 18 displayed in the boundary line display image 129 is updated and displayed each time the boundary line 18 is set.
- the setting is performed not only when the boundary line 18 is generated, but also when the boundary line 18 is modified. Therefore, the medical image processing apparatus can newly set the boundary line 18 obtained by correcting the boundary line 18 displayed in the boundary line display image 129 as the boundary line 18 .
- the medical image processing apparatus generates the boundary line display image 129 that displays the boundary line 18 set in the still image 19, displays the boundary line display image 129 on the display 15 together with the moving image 82a of the endoscopic image, which is a live moving image, and updates the displayed boundary line 18 each time the boundary line 18 is set. Therefore, the boundary line 18 can be generated and displayed with high accuracy. Further, since the boundary line 18 is set on the still image 19, even when the boundary line 18 is automatically generated and set, the troublesome problem of the boundary line 18 changing from frame to frame can be suppressed. In addition, since the still image 19 on which the boundary line 18 is displayed can be selected at any time, the boundary line 18 can be set on a still image 19 appropriate to the scene. Moreover, since setting is performed each time the boundary line 18 is generated or modified, and the boundary line 18 in the boundary line display image 129 is updated each time it is set, the displayed boundary line 18 is kept up to date.
- When the boundary line 18 is generated without using the tablet 17, it can be done as follows. During the examination, the still image 19 is acquired, and on the display 15, the moving image 82a of the endoscopic image under examination is displayed in the live moving image area 82 (see FIG. 11). As shown in FIG. 25, an instruction to generate a boundary line 18 is given from the keyboard 16, and a DL setting mode is set in which a boundary line setting screen 161 is displayed on the display 15.
- the acquired still images 19 are displayed as thumbnails.
- a still image 19 with a boundary line 18 set is selected from the thumbnails using an arrow key or the like on the keyboard 16 .
- the selected still image 19 is displayed in the fixed display still image area 83 .
- When the DL setting button 163 is pressed, the boundary line 18 is automatically generated.
- To correct the generated boundary line 18, the DL correction button 165 is pressed. Correction is performed in the same manner as described above.
- the boundary line 18 is set, and the boundary line display image 129 that is the still image 19 with the boundary line 18 set is continuously displayed in the fixed display still image area 83 .
- By pressing the return button 166, the DL setting mode is terminated and the home screen is displayed.
- the generation, modification, setting, etc. of the boundary line 18 can be performed accurately and easily.
- boundary line 18 is updated and displayed each time the boundary line 18 is set, updating of the boundary line 18 may be terminated based on the user's instruction or an endoscopic image. If no further update is required, such as when the boundary line 18 is fixed or when the display of the boundary line 18 is no longer necessary, the update can be terminated. As a result, it is possible to easily prevent the boundary line 18 from being continuously updated when updating is no longer necessary, thereby reducing the user's trouble.
- A series of flows of endoscopic image processing of this embodiment by the medical image processing apparatus will be described along the flowchart. Endoscopy is started and a still image 19 is acquired (step ST110).
- the acquired still image 19 is transmitted to the tablet 17 each time it is acquired (step ST120).
- A thumbnail 93 of the still image 19 is displayed on the tablet 17 (step ST140).
- the still image 19 for which the boundary line 18 is to be set is selected from the thumbnails 93 (step ST150).
- the selected still image 19 is displayed in the fixed display still image area 83 of the display 15 and displayed in the selected still image area 96 of the tablet 17 (step ST160).
- the DL setting button 92b is pressed to start setting the boundary line 18 (step ST170).
- the positive point selection button 122 is pressed to draw the positive point 127 on the selected still image 97, and the positive point registration button 122a is pressed to register the positive point 127 (step ST180).
- the negative point selection button 123 is pressed to draw a negative point on the selected still image 97, and the negative point registration button 123a is pressed to register the negative point 128 (step ST190).
- pressing the DL generation button 124 generates the boundary line 18 on the selected still image 97 (step ST200).
- the correction button 125 is pressed to correct the generated boundary line 18 (step ST210).
- the reflect-to-processor button 92c is pressed to display the boundary line display image 129 in the fixed display still image area 83 of the display 15 (step ST220).
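The flow above can be summarized as an ordered pipeline of step IDs (ST130 is not described in this excerpt and is omitted). The step descriptions and the runner function are illustrative paraphrases, not text from the patent.

```python
# Ordered sketch of the ST110-ST220 flow described above.
STEPS = [
    ("ST110", "acquire still image during endoscopy"),
    ("ST120", "transmit still image to tablet"),
    ("ST140", "display thumbnail 93 of still image on tablet"),
    ("ST150", "select still image for boundary line setting"),
    ("ST160", "display selected still image on display 15 and tablet 17"),
    ("ST170", "press DL setting button 92b to start boundary setting"),
    ("ST180", "draw and register positive points 127"),
    ("ST190", "draw and register negative points 128"),
    ("ST200", "press DL generation button 124 to generate boundary line 18"),
    ("ST210", "press correction button 125 to correct boundary line 18"),
    ("ST220", "press reflect-to-processor button 92c to show image 129"),
]


def run_flow(steps):
    """Walk the steps in order, returning the list of completed step IDs."""
    return [step_id for step_id, _description in steps]
```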
- In the above embodiment, the present invention is applied to the case of processing endoscopic images, but the present invention can also be applied to image processing systems and the like.
- part or all of the image processing unit 55 and/or the central control unit 58 of the endoscope system 10 can be provided in a diagnostic support device 610 that acquires images captured by the endoscope 12 directly from the endoscope system 10 or indirectly from PACS (Picture Archiving and Communication Systems) 22.
- part or all of the medical image processing section device 640, which is a device that performs the functions of the medical image processing device portion of the endoscope system 10, can be provided in a diagnostic support device 610 that acquires images captured by the endoscope 12 directly from the endoscope system 10 or indirectly from PACS (Picture Archiving and Communication Systems) 22.
- various inspection apparatuses including the endoscope system 10, such as a first inspection apparatus 621, a second inspection apparatus 622, ..., and an N-th inspection apparatus 623, can be connected to a medical service support device 630 via a network 626.
- the medical service support device 630 can be provided with part or all of the image processing unit 55 and/or the central control unit 58 of the endoscope system 10, or with part or all of the medical image processing section device 640.
- the hardware structure of the processing units that execute various processes, such as the central control unit (not shown), is the following various processors.
- The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; a dedicated electric circuit, which is a processor having a circuit configuration specially designed to execute various processes; and the like.
- One processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA).
- In addition, a plurality of processing units may be configured by one processor. As a first example of configuring a plurality of processing units with one processor, there is a form in which one processor is configured by a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor functions as a plurality of processing units. As a second example, there is a form of using a processor that realizes the functions of the entire system including a plurality of processing units with a single IC (Integrated Circuit) chip, as typified by a system on chip (SoC).
- the various processing units are configured using one or more of the above various processors as a hardware structure.
- the hardware structure of these various processors is, more specifically, an electric circuit in the form of a combination of circuit elements such as semiconductor elements.
- 10 endoscope system
- 12 endoscope
- 12a insertion portion
- 12b operation portion
- 12c bending portion
- 12d tip portion
- 12e angle knob
- 12f zoom operation portion
- 12g mode switching switch
- 12h forceps port
- 12i freeze switch
- 13 light source device
- 14 processor device
- 15 display
- 16 keyboard
- 17 tablet
- 18 boundary line
- 18a lesion area
- 18b non-lesion area
- 19 still image
- 20 light source unit
- 20a V-LED
- 20b B-LED
- 20c G-LED
- 20d R-LED
- 21 light source processor
- 22 PACS
- 30a illumination optical system
- 30b imaging optical system
- 41 light guide
- 42 illumination lens
- 43 objective lens
- 44 zoom lens
- 45 imaging sensor
- 46 CDS/AGC circuit
- 47 A/D converter
- 51 image acquisition unit
- 52 DSP
- 53 noise reduction unit
- 54 memory
- 55 image processing unit
- 56 display control unit
- 57 video signal generation unit
- 58 central control unit
- 61 normal image processing unit
- 62 special image processing unit
- 63 boundary line processing unit
- 71 still image storage unit
- 72 target image setting unit
- 73 boundary line generation unit
- 74 boundary line correction unit
- 75 boundary line setting unit
- 76 boundary line display unit
- 81 temporary display still image area
- 82 live moving image area
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Endoscopes (AREA)
Abstract
Description
12 endoscope
12a insertion portion
12b operation portion
12c bending portion
12d distal end portion
12e angle knob
12f zoom operation portion
12g mode switching switch
12h forceps port
12i freeze switch
13 light source device
14 processor device
15 display
16 keyboard
17 tablet
18 boundary line
18a lesion area
18b non-lesion area
19 still image
20 light source unit
20a V-LED
20b B-LED
20c G-LED
20d R-LED
21 light source processor
22 PACS
30a illumination optical system
30b imaging optical system
41 light guide
42 illumination lens
43 objective lens
44 zoom lens
45 imaging sensor
46 CDS/AGC circuit
47 A/D converter
51 image acquisition unit
52 DSP
53 noise reduction unit
54 memory
55 image processing unit
56 display control unit
57 video signal generation unit
58 central control unit
61 normal image processing unit
62 special image processing unit
63 boundary line processing unit
71 still image storage unit
72 target image setting unit
73 boundary line generation unit
74 boundary line correction unit
75 boundary line setting unit
76 boundary line display unit
81 temporary display still image area
82 live moving image area
82a moving image
83 fixed display still image area
91 touch panel
92a, 162 image selection button
92b, 126, 163 DL setting button
92c reflect-to-processor button
93 thumbnail
94 check box
95, 164 enter button
96 selected still image area
97 selected still image
98 touch pen
101 boundary line detection unit
102 drawing detection unit
103 positive point/negative point analysis unit
111 learning model
122 positive point selection button
122a positive point registration button
122b delete button
122c, 134, 166 return button
123 negative point selection button
123a negative point registration button
124 DL generation button
125 correction button
127 positive point
128 negative point
129 boundary line display image
131 manual button
132 enlarge button
133 reduce button
141 vertex
142 atypia degree determination unit
151 region of atypia degree 1
152 region of atypia degree 3
161 boundary line setting screen
165 DL correction button
610 diagnosis support device
621 first inspection apparatus
622 second inspection apparatus
623 N-th inspection apparatus
626 network
630 medical service support device
640 medical image processing section device
ST110 to ST220 steps
Claims (19)
- A medical image processing device comprising a processor, wherein the processor acquires an endoscopic image obtained by imaging a subject with an endoscope, sets, on a still image of the endoscopic image, a boundary line indicating the boundary between a region of interest and a non-region of interest in the subject, generates a boundary line display image in which the set boundary line is displayed on the still image, performs control to display a moving image of the endoscopic image and the boundary line display image on a display device, and updates and displays the boundary line displayed in the boundary line display image each time the boundary line is set.
- The medical image processing device according to claim 1, wherein the processor detects and sets the boundary line based on the still image.
- The medical image processing device according to claim 1 or 2, wherein the display device includes a first display device and a second display device, and the processor performs control to display the still image and/or the boundary line display image on the first display device and/or on the second display device provided in a small terminal connected to the medical image processing device.
- The medical image processing device according to claim 3, wherein, when displaying the still image, the processor sets the boundary line based on a drawing generated by a user on the displayed still image.
- The medical image processing device according to claim 4, wherein the drawing has been subjected to smoothing processing.
- The medical image processing device according to claim 4, wherein the drawing is a positive point generated in the region of interest of the still image according to the user's judgment.
- The medical image processing device according to claim 4 or 6, wherein the drawing is a negative point generated in the non-region of interest of the still image according to the user's judgment.
- The medical image processing device according to any one of claims 4 to 7, wherein the processor performs control to display the still image on the second display device, and the drawing is the drawing generated on the still image displayed on the second display device.
- The medical image processing device according to any one of claims 1 to 8, wherein the processor newly sets, as the boundary line, the boundary line obtained by correcting the boundary line displayed in the boundary line display image.
- The medical image processing device according to any one of claims 3 to 9, wherein the processor performs control to display the boundary line display image on the second display device.
- The medical image processing device according to any one of claims 3 to 10, wherein the processor performs control to display the moving image on a main screen of the first display device and to display the boundary line display image on a sub screen of the first display device.
- The medical image processing device according to any one of claims 3 to 11, wherein the processor performs control to display the still image on a sub screen of the first display device.
- The medical image processing device according to any one of claims 1 to 12, wherein the processor displays the boundary line on the moving image in correspondence with the boundary line displayed in the boundary line display image.
- The medical image processing device according to claim 13, wherein the processor controls whether or not to display the boundary line on the moving image based on a user's instruction or on the endoscopic image.
- The medical image processing device according to any one of claims 1 to 14, wherein the processor terminates updating of the boundary line based on a user's instruction or on the endoscopic image.
- The medical image processing device according to any one of claims 1 to 15, wherein the still image is acquired in the same examination as the moving image or in an examination different from that of the moving image.
- An endoscope system comprising: an endoscope that images the subject; the display device; and the medical image processing device according to any one of claims 1 to 16.
- The endoscope system according to claim 17, wherein the display device includes a first display device and a second display device.
- An operation method of a medical image processing device, comprising: a step of acquiring an endoscopic image obtained by imaging a subject with an endoscope; a step of setting, on a still image of the endoscopic image, a boundary line indicating the boundary between a region of interest and a non-region of interest in the subject; a step of generating a boundary line display image in which the set boundary line is displayed on the still image; and a step of performing control to display a moving image of the endoscopic image and the boundary line display image on a display device, wherein the boundary line displayed in the boundary line display image is updated and displayed each time the boundary line is set.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280042699.0A CN117500426A (zh) | 2021-06-16 | 2022-04-21 | 医疗图像处理装置、内窥镜***及医疗图像处理装置的工作方法 |
EP22824683.1A EP4356813A1 (en) | 2021-06-16 | 2022-04-21 | Medical image processing device, endoscope system, and operation method for medical image processing device |
JP2023529653A JPWO2022264688A1 (ja) | 2021-06-16 | 2022-04-21 | |
US18/537,762 US20240108198A1 (en) | 2021-06-16 | 2023-12-12 | Medical image processing device, endoscope system, and operation method of medical image processing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-100526 | 2021-06-16 | ||
JP2021100526 | 2021-06-16 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/537,762 Continuation US20240108198A1 (en) | 2021-06-16 | 2023-12-12 | Medical image processing device, endoscope system, and operation method of medical image processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022264688A1 true WO2022264688A1 (ja) | 2022-12-22 |
Family
ID=84527092
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/018434 WO2022264688A1 (ja) | 2021-06-16 | 2022-04-21 | 医療画像処理装置、内視鏡システム、及び医療画像処理装置の作動方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240108198A1 (ja) |
EP (1) | EP4356813A1 (ja) |
JP (1) | JPWO2022264688A1 (ja) |
CN (1) | CN117500426A (ja) |
WO (1) | WO2022264688A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07262371A (ja) * | 1994-03-17 | 1995-10-13 | Fujitsu Ltd | 分散型画像処理装置 |
JP2001137206A (ja) * | 1999-09-21 | 2001-05-22 | Biosense Inc | 心臓の腔の状態を心内膜的に検査するための方法と装置 |
WO2020075254A1 (ja) | 2018-10-11 | 2020-04-16 | オリンパス株式会社 | 内視鏡システム及び表示画像生成方法 |
-
2022
- 2022-04-21 JP JP2023529653A patent/JPWO2022264688A1/ja active Pending
- 2022-04-21 WO PCT/JP2022/018434 patent/WO2022264688A1/ja active Application Filing
- 2022-04-21 EP EP22824683.1A patent/EP4356813A1/en active Pending
- 2022-04-21 CN CN202280042699.0A patent/CN117500426A/zh active Pending
-
2023
- 2023-12-12 US US18/537,762 patent/US20240108198A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07262371A (ja) * | 1994-03-17 | 1995-10-13 | Fujitsu Ltd | 分散型画像処理装置 |
JP2001137206A (ja) * | 1999-09-21 | 2001-05-22 | Biosense Inc | 心臓の腔の状態を心内膜的に検査するための方法と装置 |
WO2020075254A1 (ja) | 2018-10-11 | 2020-04-16 | オリンパス株式会社 | 内視鏡システム及び表示画像生成方法 |
Also Published As
Publication number | Publication date |
---|---|
CN117500426A (zh) | 2024-02-02 |
JPWO2022264688A1 (ja) | 2022-12-22 |
US20240108198A1 (en) | 2024-04-04 |
EP4356813A1 (en) | 2024-04-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22824683 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023529653 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280042699.0 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022824683 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022824683 Country of ref document: EP Effective date: 20240116 |