US20200126223A1 - Endoscope diagnosis support system, storage medium, and endoscope diagnosis support method - Google Patents
- Publication number
- US20200126223A1
- Authority
- US
- United States
- Prior art keywords
- image
- endoscope
- detection
- diagnosis support
- support system
- Prior art date
- Legal status
- Pending
Classifications
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/045—Control thereof (under A61B1/04, endoscopes combined with photographic or television appliances)
- A61B1/05—Endoscopes combined with photographic or television appliances, characterised by the image sensor, e.g. camera, being in the distal end portion
- A61B1/0655—Control therefor (under A61B1/06, endoscopes with illuminating arrangements)
- A61B1/0676—Endoscope light sources at distal tip of an endoscope
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G06T7/0012—Biomedical image inspection
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
- G06T2207/10068—Endoscopic image
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30096—Tumor; Lesion
- G06T2207/30204—Marker
Definitions
- the present invention relates to an endoscope diagnosis support system, a storage medium, and an endoscope diagnosis support method.
- Japanese Patent Application Laid-Open Publication No. 2004-159739 discloses an image processing apparatus that performs image processing on a medical image obtained by an X-ray CT apparatus, a magnetic resonance photographing apparatus, an ultrasound diagnosis apparatus, an X-ray photographing apparatus, or the like, and adds a mark to a part that is suspected to be a lesion to be displayed such that diagnosis support can be performed.
- An endoscope diagnosis support system includes a processor.
- the processor performs detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and generates a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
- a non-transitory storage medium stores a computer-readable program.
- the program causes a computer to execute code for performing detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and code for generating a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
- An endoscope diagnosis support method includes performing detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and generating a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
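The claimed method amounts to a two-step pipeline: detect an anomaly candidate area from the endoscope image, then generate a display image with an indicator arranged in its periphery. The following Python sketch is illustrative only; the function names, the threshold-based placeholder detector, and the dictionary-based display structure are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DetectionResult:
    detected: bool
    position: Optional[Tuple[int, int]] = None  # (x, y) of the candidate area
    size: Optional[Tuple[int, int]] = None      # (width, height)

def detect_anomaly(endoscope_image):
    """Placeholder detector: report a candidate where a pixel exceeds a threshold."""
    for y, row in enumerate(endoscope_image):
        for x, value in enumerate(row):
            if value > 200:
                return DetectionResult(True, (x, y), (1, 1))
    return DetectionResult(False)

def build_display(endoscope_image, result):
    """Arrange the endoscope image and, on detection, an indicator in the periphery."""
    display = {"image": endoscope_image, "indicator": None}
    if result.detected:
        display["indicator"] = "periphery-mark"
    return display
```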
- FIG. 1 is a block diagram illustrating a configuration example of an endoscope diagnosis support system according to a first embodiment of the present invention
- FIG. 2 is a diagram illustrating a configuration example of a display image of a display unit of the endoscope diagnosis support system according to the first embodiment of the present invention
- FIG. 3 is a flowchart illustrating an example of display image generation processing of the endoscope diagnosis support system according to the first embodiment of the present invention
- FIG. 4 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first embodiment of the present invention
- FIG. 5 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first embodiment of the present invention
- FIG. 6 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a first modification of the first embodiment of the present invention
- FIG. 7 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first modification of the first embodiment of the present invention.
- FIG. 8 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a second modification of the first embodiment of the present invention.
- FIG. 9 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a third modification of the first embodiment of the present invention.
- FIG. 10 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a fourth modification of the first embodiment of the present invention.
- FIG. 11 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system according to a fifth modification of the first embodiment of the present invention.
- FIG. 12 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the fifth modification of the first embodiment of the present invention.
- FIG. 13 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system according to a second embodiment of the present invention.
- FIG. 14 illustrates a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the second embodiment of the present invention.
- FIG. 1 is a block diagram illustrating a configuration example of an endoscope diagnosis support system 1 according to a first embodiment of the present invention.
- illustration of a signal line that connects an operation unit X to a control unit 32 for setting an observation mode is omitted.
- the endoscope diagnosis support system 1 includes a light source drive unit 11 , an endoscope 21 , a video processor 31 , a display unit 41 , and the operation unit X.
- the light source drive unit 11 is connected to the endoscope 21 and the video processor 31 .
- the endoscope 21 and the operation unit X are connected to the video processor 31 .
- the video processor 31 is connected to the display unit 41 .
- the light source drive unit 11 is a circuit configured to drive an illumination portion 23 disposed in a distal end portion of an insertion portion 22 of the endoscope 21 .
- the light source drive unit 11 is connected to the control unit 32 in the video processor 31 and the illumination portion 23 in the endoscope 21 .
- the light source drive unit 11 emits illumination light from the illumination portion 23 to a subject under control of the control unit 32 .
- the light source drive unit 11 emits normal light and narrow band light from the illumination portion 23 in accordance with the observation mode. More specifically, when the observation mode is a normal light mode, the light source drive unit 11 emits the normal light from the illumination portion 23 , and when the observation mode is a narrow band light observation mode, the light source drive unit 11 emits the narrow band light from the illumination portion 23 .
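The mode-dependent light selection described above is a simple mapping. A minimal sketch, with the mode names taken from the text and the returned strings assumed:

```python
def select_light(observation_mode):
    """Choose the emitted light in accordance with the observation mode."""
    if observation_mode == "normal":
        return "normal light"
    if observation_mode == "narrow band":
        return "narrow band light"
    raise ValueError(f"unsupported observation mode: {observation_mode}")
```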
- the endoscope 21 is configured such that image pickup of an inside of the subject can be performed.
- the endoscope 21 includes the insertion portion 22 , the illumination portion 23 , and an image pickup portion 24 .
- the insertion portion 22 is formed to be elongated so as to be able to be inserted into the subject.
- the insertion portion 22 includes a conduit such as a treatment instrument insertion conduit that is not illustrated in the drawing.
- the insertion portion 22 can cause a treatment instrument that is not illustrated in the drawing which is allowed to be inserted into the treatment instrument insertion conduit to protrude from the distal end portion thereof.
- the illumination portion 23 is disposed in the distal end portion of the insertion portion 22 and emits the illumination light to the subject under control of the light source drive unit 11 .
- the image pickup portion 24 is disposed in the distal end portion of the insertion portion 22 , performs image pickup of the subject to which the illumination light is emitted, and outputs an image pickup signal to the video processor 31 .
- the video processor 31 performs control on the endoscope 21 , generates an endoscope image A based on the image pickup signal inputted from the endoscope 21 , and generates a display image B based on the endoscope image A.
- the video processor 31 includes the control unit 32 , an anomaly detection unit 33 , and an image generation unit 34 .
- the control unit 32 is a circuit configured to control the respective units in the endoscope diagnosis support system 1 .
- the control unit 32 performs image processing such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, or enlargement/reduction adjustment based on the image pickup signal inputted from the endoscope 21 , for example, to generate the endoscope image A, and outputs the endoscope image A to the anomaly detection unit 33 and the image generation unit 34 .
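Two of the named image-processing steps (gain adjustment and gamma correction) can be illustrated as a short chain. The gain and gamma values below are assumptions; the patent names the operations but not their parameters.

```python
def apply_gain(pixels, gain):
    """Scale 8-bit pixel values, clamping to the 0-255 range."""
    return [min(255, int(p * gain)) for p in pixels]

def apply_gamma(pixels, gamma):
    """Standard gamma correction on 8-bit values."""
    return [int(255 * (p / 255) ** (1.0 / gamma)) for p in pixels]

def generate_endoscope_image(raw_pixels, gain=1.2, gamma=2.2):
    """Sketch of the chain that turns the image pickup signal into image A."""
    return apply_gamma(apply_gain(raw_pixels, gain), gamma)
```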
- the control unit 32 transmits a control signal to the light source drive unit 11 and drives the illumination portion 23 in accordance with the observation mode.
- the observation mode is set by an instruction input of a user via the operation unit X.
- the control unit 32 may also adjust a light emitting amount of the illumination portion 23 in accordance with a luminance of the endoscope image A.
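The luminance-based adjustment of the light emitting amount could be realized as a simple feedback step. This is a sketch only; the target luminance of 128 and the step size are assumptions.

```python
def adjust_light(current_amount, mean_luminance, target=128, step=0.1):
    """Nudge the emitting amount (0.0-1.0) toward a target image luminance."""
    if mean_luminance < target:
        return min(1.0, current_amount + step)
    if mean_luminance > target:
        return max(0.0, current_amount - step)
    return current_amount
```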
- the endoscope image A may be either a moving image or a still image.
- the anomaly detection unit 33 is a circuit configured to perform detection of an anomaly candidate area L that is an area corresponding to a candidate of an anomaly such as a lesion based on the endoscope image A.
- the anomaly detection unit 33 is connected to the image generation unit 34 .
- When the anomaly candidate area L is not detected, the anomaly detection unit 33 outputs a detection result indicating non-detection of the anomaly candidate area L to the image generation unit 34 .
- When the anomaly candidate area L is detected, the anomaly detection unit 33 outputs a detection result indicating a detection position and a size of the anomaly candidate area L to the image generation unit 34 .
- the anomaly detection unit 33 performs the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 and outputs the detection result.
- the anomaly candidate area L is a lesion candidate area.
- the anomaly detection unit 33 is configured by a computing apparatus using an artificial intelligence technology such as machine learning.
- the anomaly detection unit 33 is configured by a computing apparatus that learns extraction of a feature value by a deep learning technology.
- the anomaly detection unit 33 performs predetermined computation adjusted by the learning with respect to the endoscope image A inputted from the image pickup portion 24 , and outputs a feature value indicating non-detection of the anomaly candidate area L or a feature value indicating the detection position and the size of the anomaly candidate area L to the image generation unit 34 as the detection result.
- the anomaly detection unit 33 is configured by the computing apparatus using the artificial intelligence technology, but may also be configured by a computing apparatus that does not use the artificial intelligence technology.
- the anomaly detection unit 33 may be configured to perform extraction of a contour from a change amount between mutually adjacent pixels, and perform the extraction of the feature value by matching between the contour and model information of the anomaly candidate area L which is previously stored in the control unit 32 .
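The non-AI alternative described here, extracting a contour from the change amount between mutually adjacent pixels and matching it against stored model information, can be sketched in one dimension. The threshold and the agreement-ratio score are assumptions.

```python
def contour_mask(row, threshold=50):
    """Mark positions where the change amount between adjacent pixels is large."""
    return [abs(b - a) >= threshold for a, b in zip(row, row[1:])]

def match_score(mask, model_mask):
    """Fraction of positions where the extracted contour agrees with the model."""
    hits = sum(1 for m, t in zip(mask, model_mask) if m == t)
    return hits / len(model_mask)
```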
- the image generation unit 34 is a circuit configured to generate the display image B.
- the image generation unit 34 performs generation of the display image B based on the endoscope image A inputted from the control unit 32 , the detection result inputted from the anomaly detection unit 33 , and an instruction signal inputted from the operation unit X.
- the image generation unit 34 switches a detection position image D 1 in a main area B 1 from a non-display state to the display state in accordance with the instruction signal.
- the display unit 41 is configured such that the display image B inputted from the image generation unit 34 can be displayed on a display screen.
- the display unit 41 is, for example, a monitor including a rectangular display screen.
- the operation unit X is configured such that instruction input can be performed by a user operation.
- the operation unit X is connected to the image generation unit 34 .
- the operation unit X includes a foot switch Xa, a keyboard Xb, a tablet Xc, a voice input apparatus Xd, and a scope switch Xe.
- the term "operation unit X" refers collectively to the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe, whether all or only part of them are illustrated.
- the foot switch Xa, the keyboard Xb, the tablet Xc, and the voice input apparatus Xd are connected to the video processor 31 in a wired or wireless manner.
- a stepping operation on a pedal by a foot of the user can be performed by the foot switch Xa.
- a pressing operation on a predetermined key by a hand or finger of the user can be performed by the keyboard Xb.
- a touch operation on a touch panel by the hand or finger of the user can be performed by the tablet Xc.
- An operation based on voice of the user can be performed by the voice input apparatus Xd.
- in the voice input apparatus Xd, the voice of the user is inputted, and a predetermined utterance instructing the display state or the non-display state is detected from the inputted voice.
- the scope switch Xe is attached to the endoscope 21 , and the operation by the hand or finger of the user can be performed.
- When the instruction input for instructing the display state is performed by the hand or finger, the foot, or the voice of the user, the operation unit X outputs the instruction signal for instructing the display state to the image generation unit 34 .
- When the instruction input for instructing the non-display state is performed by the hand or finger, the foot, or the voice of the user, the operation unit X outputs the instruction signal for instructing the non-display state to the image generation unit 34 . In other words, the operation unit X outputs the instruction signal in accordance with the operation of the user.
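The behaviour of the operation unit X, in which any of the five input devices can emit an instruction signal for the display or non-display state, can be summarized as follows. The device names and the signal structure are illustrative.

```python
# The five input devices named in the text; signal field names are assumptions.
DEVICES = {"foot_switch", "keyboard", "tablet", "voice", "scope_switch"}

def instruction_signal(device, wants_display):
    """Build the instruction signal sent to the image generation unit."""
    if device not in DEVICES:
        raise ValueError(f"unknown operation device: {device}")
    return {"device": device,
            "state": "display" if wants_display else "non-display"}
```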
- a configuration of the display image B is described.
- FIG. 2 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the first embodiment of the present invention.
- entire shapes of endoscope images A 1 and A 2 are octagonal, and a lumen in a living body is schematically represented by curved lines.
- the display image B is a rectangular image and includes the main area B 1 and a sub area B 2 that are divided in a longitudinal direction.
- a dashed-dotted line in FIG. 2 is a virtual line indicating a boundary between the main area B 1 and the sub area B 2 .
- the main area B 1 is an area in which an endoscope image A 1 is displayed.
- the main area B 1 is set to be wider than the sub area B 2 such that visibility of the endoscope image A 1 can be improved.
- the endoscope image A 1 is displayed to have a size larger than the endoscope image A 2 in the main area B 1 .
- the sub area B 2 is an area where the detection position of the anomaly candidate area L is displayed.
- the sub area B 2 is arranged so as to be adjacent to the main area B 1 .
- the endoscope image A 2 , on which a detection position image D 2 for indicating a detection position is superposed, is arranged in the sub area B 2 ( FIG. 4 ).
- FIG. 3 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system 1 according to the first embodiment of the present invention.
- FIG. 4 and FIG. 5 are diagrams illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the first embodiment of the present invention.
- the anomaly detection unit 33 performs predetermined computation and outputs the detection result to the image generation unit 34 .
- the image generation unit 34 adjusts the size of the endoscope image A inputted from the control unit 32 , arranges the endoscope image A 1 in the main area B 1 , and arranges the endoscope image A 2 in the sub area B 2 .
- the image generation unit 34 also sets the detection position image D 2 in the display state based on the detection result such that a position corresponding to the detection position of the anomaly candidate area L in the sub area B 2 is indicated.
- the detection position image D 2 is a rectangular frame image, but another image may also be adopted.
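The rectangular frame described for the detection position image D 2 can be drawn as a one-pixel border on a pixel grid. A minimal sketch, assuming the detection result supplies a top-left coordinate and a width and height:

```python
def draw_frame(grid, x, y, w, h, value=255):
    """Draw the border of a w-by-h rectangle with top-left corner (x, y)."""
    for i in range(x, x + w):       # top and bottom edges
        grid[y][i] = value
        grid[y + h - 1][i] = value
    for j in range(y, y + h):       # left and right edges
        grid[j][x] = value
        grid[j][x + w - 1] = value
    return grid
```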
- When the instruction input is performed by the user, the detection position image D 1 in the main area B 1 is set in the display state (S 14 ). As illustrated in FIG. 5 , the image generation unit 34 sets the detection position image D 1 for indicating the detection position in the display state such that a position corresponding to the detection position of the anomaly candidate area L in the main area B 1 is indicated.
- the detection position image D 1 is a rectangular frame image, but another image may also be adopted. In other words, the detection position image D 1 arranged in the main area B 1 is a rectangular frame image.
- the processes S 11 to S 14 constitute the display image generation processing according to the first embodiment.
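The processes S 11 to S 14 described above can be summarized in a short sketch: the detection position image D 2 in the sub area is shown whenever a candidate is detected, while D 1 in the main area additionally requires the user's instruction input. The field names are illustrative.

```python
def generate_display_image(detection, instruction_display_on):
    """Sketch of the display image generation processing (S11-S14)."""
    display = {
        "main_area": {"image": "A1", "D1_visible": False},
        "sub_area": {"image": "A2", "D2_visible": False},
    }
    if detection is not None:                          # candidate detected?
        display["sub_area"]["D2_visible"] = True       # show D2 in the sub area
        if instruction_display_on:                     # user instruction input?
            display["main_area"]["D1_visible"] = True  # show D1 in the main area
    return display
```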
- the display image B generated by the image generation unit 34 is divided into the main area B 1 and the sub area B 2 that is smaller than the main area B 1 ; the endoscope image A 1 is arranged in the main area B 1 , and the anomaly detection image indicating detection of the anomaly candidate area L is arranged in the periphery portion of the main area B 1 in accordance with the detection result.
- the image generation unit 34 arranges the detection position image D 2 for indicating the detection position such that the position corresponding to the detection position of the anomaly candidate area L in the sub area B 2 is indicated in accordance with the detection result.
- the image generation unit 34 arranges the detection position image D 1 such that the position corresponding to the detection position in the main area B 1 is indicated.
- the anomaly detection unit 33 performs the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 to output the detection result, and the image generation unit 34 generates the display image B which is divided into the main area B 1 and the sub area B 2 that is smaller than the main area B 1 and in which the endoscope image A 1 is arranged in the main area B 1 , and the anomaly detection image indicating detection of the anomaly candidate area L is arranged in the periphery portion of the main area B 1 in accordance with the detection result.
- the detection position image D 1 of the anomaly candidate area L in the main area B 1 is set in the non-display state until the user performs the instruction input, so that the user's attention to the endoscope image A 1 is not disturbed.
- the anomaly candidate area L corresponding to the candidate of the anomaly such as the lesion can be indicated in a manner that the user's attention to the endoscope image A 1 is not disturbed, and the diagnosis based on the endoscope 21 can be supported.
- the detection mark Ma is displayed in the lower right portion in the main area B 1 , but a detection mark Mb may be displayed in four corners of the endoscope image A 1 .
- FIG. 6 and FIG. 7 are diagrams illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a first modification of the first embodiment of the present invention. According to the present modification, descriptions of same components as those according to other embodiments and modifications are omitted.
- the image generation unit 34 sets the detection mark Mb in the main area B 1 in the display state, and sets the detection position image D 2 in the sub area B 2 in the display state.
- the detection marks Mb are arranged in the periphery portions of the main area B 1 and also outside of the endoscope image A 1 along tapered portions of the four corners of the endoscope image A 1 .
- the detection mark Mb is a strip-like image having a predetermined thickness. Note that in the example of FIG. 6 , the detection mark Mb is arranged in all of the four corners of the endoscope image A 1 , but may be configured to be arranged in at least part of the four corners instead of all of the four corners.
- the detection position image D 1 in the main area B 1 is set in the display state.
- the detection mark Mb can be displayed such that the user can more easily notice it.
- in the example described above, the strip-like detection mark Mb is displayed, but a detection mark Mc of a triangular image may be displayed instead.
- the detection mark Mc is arranged in the periphery portions in the main area B 1 and also outside of the endoscope image A 1 along the tapered portions of the four corners of the endoscope image A 1 .
- the detection mark Mc is a triangular image marked out by a predetermined color. Note that in the example of FIG. 8 , the detection mark Mc is arranged in all of the four corners of the endoscope image A 1 , but may be configured to be arranged in at least part of the four corners instead of all of the four corners.
- FIG. 9 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a third modification of the first embodiment of the present invention. According to the present modification, descriptions of same components as those according to other embodiments and modifications are omitted.
- a detection mark Md is an L-shaped image that is arranged in the four corners of the endoscope image A 1 d and bent along the corners. Note that the detection mark Md may be displayed in a flashing manner so as to be conspicuous.
- the detection mark Md can be displayed in the four corners of the endoscope image A 1 d , the entire shape of which is quadrangular, such that the user can more easily notice it.
- FIG. 10 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a fourth modification of the first embodiment of the present invention. According to the present modification, descriptions of same components as those according to other embodiments and modifications are omitted.
- An endoscope image A 1 e in which upper and lower portions are linear and both side portions on the left and right are curved is arranged in the main area B 1 , and an endoscope image A 2 e obtained by reducing the endoscope image A 1 e is arranged in the sub area B 2 .
- a detection mark Me is a triangular image that is arranged in the four corners in the endoscope image A 1 e and is shaped with one side curved.
- the instruction input for setting the detection position image D 1 in the display state is performed by an operation on the operation unit X, but a movement of a user's eyes may be detected to set the detection position image D 1 in the display state.
- FIG. 11 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system 1 according to a fifth modification of the first embodiment of the present invention.
- FIG. 12 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the fifth modification of the first embodiment of the present invention.
- the endoscope diagnosis support system 1 includes a camera Z (dashed-two dotted line in FIG. 1 ).
- the camera Z is attached to a rim or the like of the display unit 41 such that the movement of the user's eyes which observe the endoscope image A 1 can be detected, for example.
- the camera Z is connected to the image generation unit 34 .
- the camera Z performs image pickup of the user's eyes and outputs an image of the user's eyes to the image generation unit 34 .
- the predetermined movement may be a movement where, for example, the user finds an anomaly in the endoscope image A by visual observation, and a line of sight of the user is directed to the detection position of the anomaly candidate area L for a predetermined period of time.
- the predetermined movement may also be a movement where the user notices the detection mark Ma, and the line of sight of the user shifts from the main area B 1 to the entirety of the display image B.
- the predetermined movement may also be a movement where the line of sight of the user is directed to a previously set predetermined position in the display image B.
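- The eye-movement conditions above all reduce to testing gaze samples against a target position. A minimal dwell-test sketch, assuming gaze points arrive as (x, y) samples at a fixed rate (the function name and sample-count threshold are assumptions), is:

```python
def gaze_triggers_display(gaze_points, target, radius, min_dwell):
    """Return True when consecutive gaze samples stay within `radius`
    of `target` for at least `min_dwell` samples (a dwell condition)."""
    dwell = 0
    for (gx, gy) in gaze_points:
        if (gx - target[0]) ** 2 + (gy - target[1]) ** 2 <= radius ** 2:
            dwell += 1
            if dwell >= min_dwell:
                return True
        else:
            dwell = 0  # the gaze left the target; restart the dwell count
    return False
```

Under the assumed fixed sampling rate, the "predetermined period of time" maps to `min_dwell` samples.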
- the detection position image D 1 in the main area B 1 is set in the display state (S 24 ).
- the detection position image D 1 is an arrow image displayed in a vicinity of the anomaly candidate area L in the main area B 1 .
- the detection position image D 1 arranged in the main area B 1 is an arrow image.
- the image generation unit 34 sets the detection position image D 1 in the non-display state based on the instruction input by the operation unit X, the detection of the predetermined movement of the eyes by the camera Z, or the non-detection of the anomaly candidate area L.
- the image generation unit 34 sets the detection position image D 1 in the main area B 1 in the non-display state such that the user's attention to the endoscope image A 1 is not disturbed when the observation mode is switched from a normal observation mode to a narrow band light mode.
- a treatment instrument detection portion T 1 (dashed-two dotted line in FIG. 1 ) configured to detect a predetermined treatment instrument from the endoscope image A is included in the anomaly detection unit 33 , and when the predetermined treatment instrument is detected from the endoscope image A, the detection position image D 1 in the main area B 1 is set in the non-display state such that the user's attention is not disturbed.
- the detection position image D 1 can be switched to either the display state or the non-display state by the predetermined display switching condition, which saves the user the trouble of operating the operation unit X.
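- The switching logic described above (an explicit instruction input, an eye-movement trigger, or non-detection of the anomaly candidate area) can be sketched as a small state holder. The class name and the string-valued instructions are assumptions, not the disclosed implementation:

```python
class DetectionOverlayState:
    """Tracks whether the detection position image D1 is displayed,
    toggled by a user instruction, an eye-movement trigger,
    or loss of the anomaly candidate area."""

    def __init__(self):
        self.visible = False

    def update(self, anomaly_detected, user_instruction=None, eye_trigger=False):
        if not anomaly_detected:
            self.visible = False          # non-detection forces non-display
        elif user_instruction == "show" or eye_trigger:
            self.visible = True
        elif user_instruction == "hide":
            self.visible = False
        return self.visible
```

Calling `update()` once per frame keeps the overlay state consistent with whichever switching condition fired most recently.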
- the sub area B 2 displays the detection position of the anomaly candidate area L, but may display an enlarged image E of the anomaly candidate area L.
- FIG. 13 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system 1 according to a second embodiment of the present invention.
- FIG. 14 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the second embodiment of the present invention. According to the present embodiment, descriptions of same components as those according to other embodiments and modifications are omitted.
- the detection position image D 1 in the main area B 1 is set in the display state, and the enlarged image E is displayed in the sub area B 2 (S 34 ). As illustrated in FIG. 14 , the image generation unit 34 sets the detection position image D 1 in the main area B 1 in the display state. The image generation unit 34 also displays the enlarged image E at a predetermined enlargement rate in the sub area B 2 .
- the image generation unit 34 arranges the enlarged image E obtained by enlarging the anomaly candidate area L in the sub area B 2 .
- the processes S 31 to S 34 constitute the display image generation processing according to the second embodiment.
- the enlarged image E can be displayed in the sub area B 2 by the instruction input of the user, and visibility of the anomaly candidate area L is improved.
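- Producing the enlarged image E amounts to cropping the anomaly candidate area and scaling it by the enlargement rate. A minimal nearest-neighbor sketch on a row-list image, assuming an integer rate (function name and data layout are assumptions), is:

```python
def enlarge_region(image, x, y, w, h, rate):
    """Crop the w-by-h region at (x, y) from `image` (a list of pixel rows)
    and enlarge it by integer factor `rate` using nearest-neighbor scaling."""
    crop = [row[x:x + w] for row in image[y:y + h]]
    out = []
    for row in crop:
        scaled = [px for px in row for _ in range(rate)]   # widen each row
        out.extend(list(scaled) for _ in range(rate))      # repeat each row
    return out
```

In practice the enlargement would be done by the video processor's scaler; this sketch only shows the geometry of the operation.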
- the anomaly candidate area L corresponding to the candidate of the anomaly such as the lesion can be indicated in a manner that the user's attention to the endoscope image A 1 is not disturbed, and the diagnosis based on the endoscope 21 can be supported.
- the detection marks Ma, Mb, Mc, Md, and Me and the detection position images D 1 and D 2 may have the same color such that they are easy for the user to see.
- when the detection position image D 1 is arranged in the main area B 1 , the image generation unit 34 also arranges the detection position image D 2 in the sub area B 2 ; however, the display image B may be generated such that the detection position image D 2 is not arranged in the sub area B 2 even when the detection position image D 1 is arranged in the main area B 1 .
- the operation unit X is configured by all of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe, but may be configured by part of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe.
- the operation unit X includes at least any one of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe.
- the detection mark Ma is displayed in the lower right portion in the main area B 1 , but may be displayed in an upper right portion, an upper left portion, or a lower left portion in the main area B 1 .
- the predetermined display switching condition is whether or not the image of the user's eyes indicates the predetermined movement. Alternatively, a timer T 2 may be included (dashed-two dotted line in FIG. 1 ); in this case, the anomaly detection unit 33 detects an anomaly type of the anomaly candidate area L, the timer T 2 measures a predetermined period of time in accordance with the anomaly type after the anomaly candidate area L is detected, and the predetermined display switching condition is whether or not the predetermined period of time has elapsed after the anomaly candidate area L is detected.
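- The timer-based switching condition can be sketched as follows. The anomaly-type table, the delay values, and all names are purely illustrative assumptions; the disclosure only states that the period depends on the anomaly type:

```python
# Hypothetical per-type display delays, in seconds (values are assumptions).
DISPLAY_DELAY_BY_TYPE = {"polyp": 2.0, "inflammation": 3.5}

def switching_condition_met(anomaly_type, detected_at, now, default_delay=2.0):
    """Return True once the per-type delay has elapsed since detection,
    i.e. the timer-based display switching condition holds."""
    delay = DISPLAY_DELAY_BY_TYPE.get(anomaly_type, default_delay)
    return (now - detected_at) >= delay
```

A monotonic clock would supply `detected_at` and `now` in a real system, so that wall-clock adjustments cannot disturb the measured period.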
- the predetermined enlargement rate according to the second embodiment is previously set, but may be configured to change in accordance with the size of the anomaly candidate area L.
- the respective “units” in the present specification are conceptual components corresponding to the respective functions of the embodiments and do not necessarily correspond to particular hardware or software routines on a one-to-one basis. Therefore, in the present specification, the embodiments are described on the assumption of virtual circuit blocks (units) having the respective functions of the embodiments.
- an execution order may be changed, a plurality of steps may be executed at the same time, or the steps may be executed in a different order each time.
- all or part of the respective steps in the respective procedure according to the present embodiment may be realized by hardware.
- the video processor 31 may include a central processing unit (CPU) 51 and a memory 52 and execute an endoscope diagnosis support program 53 stored in the memory 52 to realize the functions of the anomaly detection unit 33 and the image generation unit 34 (dashed-two dotted line in FIG. 1 ).
- the memory 52 stores instructions of the endoscope diagnosis support program 53 for realizing the functions of the anomaly detection unit 33 and the image generation unit 34 (dashed-two dotted line in FIG. 1 ).
- the endoscope diagnosis support program 53 causes a computer to execute code for performing the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 to output the detection result, and code for generating the display image B which is divided into the main area B 1 and the sub area B 2 that is smaller than the main area B 1 and in which the endoscope image A 1 is arranged in the main area B 1 , and the anomaly detection image indicating the detection of the anomaly candidate area L is arranged in the periphery portion of the main area B 1 in accordance with the detection result.
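- The two pieces of code described above (detection, then display generation) can be sketched as two functions. The thresholding detector is a stand-in assumption for illustration only, not the disclosed detection method:

```python
def detect_anomaly(endoscope_image):
    """Stand-in for the detection code: returns a detection result,
    here simply flagging the first pixel above a brightness threshold."""
    for y, row in enumerate(endoscope_image):
        for x, px in enumerate(row):
            if px > 200:                       # threshold is an assumption
                return {"detected": True, "position": (x, y)}
    return {"detected": False, "position": None}

def generate_display_image(endoscope_image, result):
    """Stand-in for the display-generation code: pairs the main-area image
    with a flag controlling the anomaly detection mark in the periphery."""
    return {"main": endoscope_image, "mark_visible": result["detected"]}
```

The stored program would run these two steps per frame: detect, then compose the display image from the detection result.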
Description
- This application is a continuation application of PCT/JP2017/016961 filed on Apr. 28, 2017, the entire contents of which are incorporated herein by this reference.
- The present invention relates to an endoscope diagnosis support system, a storage medium, and an endoscope diagnosis support method.
- Up to now, a technology has been proposed in which image processing on a medical image is performed, and the medical image is displayed with a mark added to a part that matches a previously specified condition. For example, Japanese Patent Application Laid-Open Publication No. 2004-159739 discloses an image processing apparatus that performs image processing on a medical image obtained by an X-ray CT apparatus, a magnetic resonance photographing apparatus, an ultrasound diagnosis apparatus, an X-ray photographing apparatus, or the like, and adds a mark to a part that is suspected to be a lesion to be displayed such that diagnosis support can be performed.
- An endoscope diagnosis support system according to an embodiment includes a processor. The processor performs detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and generates a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
- A non-transitory storage medium according to an embodiment stores a computer-readable program. The program causes a computer to execute code for performing detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and code for generating a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
- An endoscope diagnosis support method according to an embodiment includes performing detection of an anomaly candidate area from an endoscope image obtained by performing image pickup of an inside of a subject to obtain a detection result, and generating a display image in which an indicator indicating detection of the anomaly candidate area is arranged in a periphery portion of the endoscope image in accordance with the detection result.
- FIG. 1 is a block diagram illustrating a configuration example of an endoscope diagnosis support system according to a first embodiment of the present invention;
- FIG. 2 is a diagram illustrating a configuration example of a display image of a display unit of the endoscope diagnosis support system according to the first embodiment of the present invention;
- FIG. 3 is a flowchart illustrating an example of display image generation processing of the endoscope diagnosis support system according to the first embodiment of the present invention;
- FIG. 4 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first embodiment of the present invention;
- FIG. 5 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first embodiment of the present invention;
- FIG. 6 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a first modification of the first embodiment of the present invention;
- FIG. 7 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the first modification of the first embodiment of the present invention;
- FIG. 8 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a second modification of the first embodiment of the present invention;
- FIG. 9 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a third modification of the first embodiment of the present invention;
- FIG. 10 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to a fourth modification of the first embodiment of the present invention;
- FIG. 11 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system according to a fifth modification of the first embodiment of the present invention;
- FIG. 12 is a diagram illustrating a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the fifth modification of the first embodiment of the present invention;
- FIG. 13 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system according to a second embodiment of the present invention; and
- FIG. 14 illustrates a configuration example of the display image of the display unit of the endoscope diagnosis support system according to the second embodiment of the present invention.
- Hereinafter, embodiments of the present invention are described with reference to the drawings.
-
FIG. 1 is a block diagram illustrating a configuration example of an endoscope diagnosis support system 1 according to a first embodiment of the present invention. In FIG. 1 , illustration of a signal line that connects an operation unit X to a control unit 32 for setting an observation mode is omitted.
- The endoscope diagnosis support system 1 includes a light source drive unit 11 , an endoscope 21 , a video processor 31 , a display unit 41 , and the operation unit X. The light source drive unit 11 is connected to the endoscope 21 and the video processor 31 . The endoscope 21 and the operation unit X are connected to the video processor 31 . The video processor 31 is connected to the display unit 41 .
- The light source drive unit 11 is a circuit configured to drive an illumination portion 23 disposed in a distal end portion of an insertion portion 22 of the endoscope 21 . The light source drive unit 11 is connected to the control unit 32 in the video processor 31 and the illumination portion 23 in the endoscope 21 . The light source drive unit 11 emits illumination light from the illumination portion 23 to a subject under control of the control unit 32 . The light source drive unit 11 emits normal light or narrow band light from the illumination portion 23 in accordance with the observation mode. More specifically, when the observation mode is a normal light observation mode, the light source drive unit 11 emits the normal light from the illumination portion 23 , and when the observation mode is a narrow band light observation mode, the light source drive unit 11 emits the narrow band light from the illumination portion 23 .
- The endoscope 21 is configured such that image pickup of an inside of the subject can be performed. The endoscope 21 includes the insertion portion 22 , the illumination portion 23 , and an image pickup portion 24 .
- The insertion portion 22 is formed to be elongated so as to be insertable into the subject. The insertion portion 22 includes a conduit such as a treatment instrument insertion conduit, not illustrated in the drawing. The insertion portion 22 can cause a treatment instrument, not illustrated in the drawing, which is inserted into the treatment instrument insertion conduit, to protrude from the distal end portion of the insertion portion 22 .
- The illumination portion 23 is disposed in the distal end portion of the insertion portion 22 and emits the illumination light to the subject under control of the light source drive unit 11 .
- The image pickup portion 24 is disposed in the distal end portion of the insertion portion 22 , performs image pickup of the subject to which the illumination light is emitted, and outputs an image pickup signal to the video processor 31 .
- The video processor 31 performs control of the endoscope 21 , generates an endoscope image A based on the image pickup signal inputted from the endoscope 21 , and generates a display image B based on the endoscope image A. The video processor 31 includes the control unit 32 , an anomaly detection unit 33 , and an image generation unit 34 .
- The control unit 32 is a circuit configured to control the respective units in the endoscope diagnosis support system 1 . The control unit 32 performs image processing such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, or enlargement/reduction adjustment on the image pickup signal inputted from the endoscope 21 to generate the endoscope image A, and outputs the endoscope image A to the anomaly detection unit 33 and the image generation unit 34 . The control unit 32 transmits a control signal to the light source drive unit 11 and drives the illumination portion 23 in accordance with the observation mode. The observation mode is set by an instruction input of a user via the operation unit X. The control unit 32 may also adjust a light emitting amount of the illumination portion 23 in accordance with a luminance of the endoscope image A. The endoscope image A may be either a moving image or a still image.
- The anomaly detection unit 33 is a circuit configured to perform detection of an anomaly candidate area L, which is an area corresponding to a candidate of an anomaly such as a lesion, based on the endoscope image A. The anomaly detection unit 33 is connected to the image generation unit 34 . When the anomaly candidate area L is not detected, the anomaly detection unit 33 outputs a detection result indicating non-detection of the anomaly candidate area L to the image generation unit 34 . When the anomaly candidate area L is detected, the anomaly detection unit 33 outputs a detection result indicating a detection position and a size of the anomaly candidate area L to the image generation unit 34 . In other words, the anomaly detection unit 33 performs the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 and outputs the detection result. The anomaly candidate area L is a lesion candidate area.
- For example, the anomaly detection unit 33 is configured by a computing apparatus using an artificial intelligence technology such as machine learning. More specifically, the anomaly detection unit 33 is configured by a computing apparatus that learns extraction of a feature value by a deep learning technology. The anomaly detection unit 33 performs predetermined computation, adjusted by the learning, on the endoscope image A inputted from the image pickup portion 24 , and outputs a feature value indicating non-detection of the anomaly candidate area L, or a feature value indicating the detection position and the size of the anomaly candidate area L, to the image generation unit 34 as the detection result.
- Note that the anomaly detection unit 33 need not be configured by a computing apparatus using the artificial intelligence technology. For example, the anomaly detection unit 33 may be configured to perform extraction of a contour from a change amount between mutually adjacent pixels, and perform the extraction of the feature value by matching between the contour and model information of the anomaly candidate area L which is previously stored in the control unit 32 .
- The image generation unit 34 is a circuit configured to generate the display image B. The image generation unit 34 generates the display image B based on the endoscope image A inputted from the control unit 32 , the detection result inputted from the anomaly detection unit 33 , and an instruction signal inputted from the operation unit X. The image generation unit 34 switches a detection position image D 1 in a main area B 1 from a non-display state to a display state in accordance with the instruction signal.
- The display unit 41 is configured such that the display image B inputted from the image generation unit 34 can be displayed on a display screen. The display unit 41 is, for example, a monitor including a rectangular display screen.
- The operation unit X is configured such that instruction input can be performed by a user operation. The operation unit X is connected to the image generation unit 34 . The operation unit X includes a foot switch Xa, a keyboard Xb, a tablet Xc, a voice input apparatus Xd, and a scope switch Xe. Hereinafter, the operation unit X is mentioned when the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe are wholly or partly referred to.
- The foot switch Xa, the keyboard Xb, the tablet Xc, and the voice input apparatus Xd are connected to the video processor 31 in a wired or wireless manner. A stepping operation on a pedal by a foot of the user can be performed by the foot switch Xa. A pressing operation on a predetermined key by a hand or finger of the user can be performed by the keyboard Xb. A touch operation on a touch panel by the hand or finger of the user can be performed by the tablet Xc. An operation based on voice of the user can be performed by the voice input apparatus Xd. In the voice input apparatus Xd, the voice of the user is inputted, and predetermined voice for instructing the display state or the non-display state is detected from the inputted voice. The scope switch Xe is attached to the endoscope 21 , and the operation by the hand or finger of the user can be performed.
- When the instruction input for instructing the display state is performed by the hand or finger, the foot, or the voice of the user, the operation unit X outputs the instruction signal for instructing the display state to the image generation unit 34 . When the instruction input for instructing the non-display state is performed by the hand or finger, the foot, or the voice of the user, the operation unit X outputs the instruction signal for instructing the non-display state to the image generation unit 34 . In other words, the operation unit X outputs the instruction signal in accordance with the operation of the user.
- A configuration of the display image B is described.
-
FIG. 2 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the first embodiment of the present invention. In the example of FIG. 2 , entire shapes of endoscope images A 1 and A 2 are octagonal, and a lumen in a living body is schematically represented by curved lines.
- The display image B is a rectangular image and includes the main area B 1 and a sub area B 2 that are divided in a longitudinal direction. A dashed-dotted line in FIG. 2 is a virtual line indicating a boundary between the main area B 1 and the sub area B 2 .
- The main area B 1 is an area in which an endoscope image A 1 is displayed. The main area B 1 is set to be wider than the sub area B 2 such that visibility of the endoscope image A 1 can be improved. The endoscope image A 1 is displayed to have a size larger than the endoscope image A 2 in the main area B 1 .
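- The division of the display image B into the wider main area B 1 and the adjacent sub area B 2 can be sketched as a simple geometric split. This is an illustration only; the 75:25 ratio and the function name are assumptions:

```python
def split_display(width, height, main_ratio=0.75):
    """Split a rectangular display of width x height into a main area B1
    and a narrower sub area B2, side by side along the width."""
    main_w = int(width * main_ratio)
    main_area = (0, 0, main_w, height)              # x, y, w, h
    sub_area = (main_w, 0, width - main_w, height)  # adjacent to B1
    return main_area, sub_area
```

The endoscope image A 1 would then be composited into the first rectangle and the reduced image A 2 into the second.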
- The sub area B2 is an area where the detection position of the anomaly candidate area L is displayed. The sub area B2 is arranged so as to be adjacent to the main area B1. The endoscope image A2 for superposing a detection position image D2 for indicating a detection position is arranged in the sub area B2 (
FIG. 4 ). - Subsequently, display image generation processing of the
image generation unit 34 is described. -
FIG. 3 is a flowchart illustrating an example of the display image generation processing of the endoscopediagnosis support system 1 according to the first embodiment of the present invention.FIG. 4 andFIG. 5 are diagrams illustrating a configuration example of the display image B of thedisplay unit 41 of the endoscopediagnosis support system 1 according to the first embodiment of the present invention. - When the
insertion portion 22 is inserted to perform image pickup of the subject, theimage pickup portion 24 outputs the image pickup signal to thecontrol unit 32. Thecontrol unit 32 performs the image processing such as the gain adjustment, the white balance adjustment, the gamma correction, the contour enhancement correction, or the enlargement/reduction adjustment based on the image pickup signal and outputs the endoscope image A to theanomaly detection unit 33 and theimage generation unit 34. - The
anomaly detection unit 33 performs predetermined computation and outputs the detection result to theimage generation unit 34. - The
image generation unit 34 adjusts the size of the endoscope image A inputted from thecontrol unit 32, arranges the endoscope image A1 in the main area B1, and arranges the endoscope image A2 in the sub area B2. - It is determined whether or not the anomaly candidate area L is detected (S11). When the
image generation unit 34 determines that the detection result indicating non-detection of the anomaly candidate area L is inputted from the anomaly detection unit 33 (S11: NO), the process repeats S11. On the other hand, as illustrated inFIG. 4 , when theimage generation unit 34 determines that the detection result indicating the detection position and the size of the anomaly candidate area L is inputted (S11: YES), the process proceeds to S12. - A detection mark Ma in the main area B1 and the detection position image D2 in the sub area B2 are set in the display state (S12). The
image generation unit 34 sets the detection mark Ma corresponding to an anomaly detection image indicating detection of the anomaly candidate area L in the display state in a lower right portion in the main area B1 and also outside of the endoscope image A1. In other words, the detection mark Ma is arranged in a periphery portion of the main area B1 and also in a vicinity of the endoscope image A1. In the example ofFIG. 4 , the detection mark Ma is an image imitating a flag, but another image may also be adopted. - The
image generation unit 34 also sets the detection position image D2 in the display state based on the detection result such that a position corresponding to the detection position of the anomaly candidate area L in the sub area B2 is indicated. In the example ofFIG. 4 , the detection position image D2 is a rectangular frame image, but another image may also be adopted. - It is determined whether or not an input of the instruction signal exists (S13). When the
image generation unit 34 determines that the input of the instruction signal for instructing the display state does not exist, the process returns to S11. On the other hand, when theimage generation unit 34 determines that the input of the instruction signal for instructing the display state exists, the process proceeds to S14. - The detection position image D1 in the main area B1 is set in the display state (S14). As illustrated in
FIG. 5 , theimage generation unit 34 sets the detection position image D1 for indicating the detection position in the display state such that a position corresponding to the detection position of the anomaly candidate area L in the main area B1 is indicated. In the example ofFIG. 5 , the detection position image D1 is a rectangular frame image, but another image may also be adopted. In other words, the detection position image D1 arranged in the main area B1 is a rectangular frame image. - When the instruction input for instructing the non-display state by the operation unit X exists or the anomaly candidate area L is not detected, the
image generation unit 34 sets the detection position image D1 in the non-display state. - The processes S11 to S14 constitute the display image generation processing according to the first embodiment.
- In other words, the
image generation unit 34 is divided into the main area B1 and the sub area B2 that is smaller than the main area B1, the endoscope image A1 is arranged in the main area B1, and the display image B in which the anomaly detection image indicating detection of the anomaly candidate area L is arranged in the periphery portion of the main area B1 is generated in accordance with the detection result. Theimage generation unit 34 arranges the detection position image D2 for indicating the detection position such that the position corresponding to the detection position of the anomaly candidate area L in the sub area B2 is indicated in accordance with the detection result. After the display image B in which the anomaly detection image is arranged in the main area B1 is generated, theimage generation unit 34 arranges the detection position image D1 such that the position corresponding to the detection position in the main area B1 is indicated. - In other words, according to an endoscope diagnosis support method, the
anomaly detection unit 33 performs the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 to output the detection result, and the image generation unit 34 generates the display image B which is divided into the main area B1 and the sub area B2 that is smaller than the main area B1 and in which the endoscope image A1 is arranged in the main area B1, and the anomaly detection image indicating detection of the anomaly candidate area L is arranged in the periphery portion of the main area B1 in accordance with the detection result. - According to this, in the endoscope
diagnosis support system 1, the detection position image D1 for the anomaly candidate area L in the main area B1 is set in the non-display state until the user performs the instruction input, so the user's attention to the endoscope image A1 is not disturbed. - According to the above-described first embodiment, in the endoscope
diagnosis support system 1, the anomaly candidate area L corresponding to the candidate of the anomaly such as the lesion can be indicated in a manner that the user's attention to the endoscope image A1 is not disturbed, and the diagnosis based on the endoscope 21 can be supported. - According to the first embodiment, the detection mark Ma is displayed in the lower right portion in the main area B1, but a detection mark Mb may be displayed in the four corners of the endoscope image A1.
-
FIG. 6 and FIG. 7 are diagrams illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a first modification of the first embodiment of the present invention. According to the present modification, descriptions of the same components as those according to other embodiments and modifications are omitted. - As illustrated in
FIG. 6, when the detection result indicating the detection position and the size of the anomaly candidate area L is inputted from the anomaly detection unit 33, the image generation unit 34 sets the detection mark Mb in the main area B1 in the display state, and sets the detection position image D2 in the sub area B2 in the display state. - In the example of
FIG. 6, the detection marks Mb are arranged in the periphery portions of the main area B1 and also outside of the endoscope image A1 along the tapered portions of the four corners of the endoscope image A1. The detection mark Mb is a strip-like image having a predetermined thickness. Note that in the example of FIG. 6, the detection mark Mb is arranged in all of the four corners of the endoscope image A1, but may be configured to be arranged in at least part of the four corners instead of all of the four corners. - As illustrated in
FIG. 7 , when the user performs the instruction input of the display state by the operation unit X, the detection position image D1 in the main area B1 is set in the display state. - In other words, the
image generation unit 34 arranges the detection mark Mb in the four corners of the endoscope image A1. - According to this, in the endoscope
diagnosis support system 1, the detection mark Mb can be displayed such that the user can more easily notice it. - According to the first modification of the first embodiment, the detection mark Mb of the strip-like image is displayed, but a detection mark Mc of a triangular image may be displayed.
-
FIG. 8 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a second modification of the first embodiment of the present invention. According to the present modification, descriptions of the same components as those according to other embodiments and modifications are omitted. - In the example of
FIG. 8, the detection mark Mc is arranged in the periphery portions in the main area B1 and also outside of the endoscope image A1 along the tapered portions of the four corners of the endoscope image A1. The detection mark Mc is a triangular image marked out by a predetermined color. Note that in the example of FIG. 8, the detection mark Mc is arranged in all of the four corners of the endoscope image A1, but may be configured to be arranged in at least part of the four corners instead of all of the four corners. - According to this, in the endoscope
diagnosis support system 1, the detection mark Mc can be displayed such that the user can more easily notice it. - According to the first embodiment and the first and second modifications of the first embodiment, the entire shapes of the endoscope images A1 and A2 are octagonal, but may be other than an octagon.
-
FIG. 9 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a third modification of the first embodiment of the present invention. According to the present modification, descriptions of the same components as those according to other embodiments and modifications are omitted. -
- In the example of
FIG. 9, a detection mark Md is an L-shaped image that is arranged in the four corners of the endoscope image A1 d and is bent along the four corners. Note that the detection mark Md may be displayed in a flashing manner so as to be conspicuous. - According to this, in the endoscope
diagnosis support system 1, the detection mark Md can be displayed in the four corners of the endoscope image A1 d, the entire shape of which is quadrangular, such that the user can more easily notice it. - According to the third modification of the first embodiment, the entire shapes of the endoscope images A1 d and A2 d are quadrangular, but mutually facing sides may be curved.
-
FIG. 10 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to a fourth modification of the first embodiment of the present invention. According to the present modification, descriptions of the same components as those according to other embodiments and modifications are omitted. -
- In the example of
FIG. 10, a detection mark Me is a triangular image that is arranged in the four corners of the endoscope image A1 e and has one side shaped to be curved. - According to this, in the endoscope
diagnosis support system 1, the detection mark Me can be displayed in the four corners of the endoscope image A1 e in which the mutually facing sides are curved such that the user can more easily notice it. - According to the first embodiment and the first to fourth modifications of the first embodiment, the instruction input for setting the detection position image D1 in the display state is performed by an operation on the operation unit X, but a movement of a user's eyes may be detected to set the detection position image D1 in the display state.
-
FIG. 11 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system 1 according to a fifth modification of the first embodiment of the present invention. FIG. 12 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the fifth modification of the first embodiment of the present invention. - According to the present modification, descriptions of the same components as those according to other embodiments and modifications are omitted.
- According to the present modification, the endoscope
diagnosis support system 1 includes a camera Z (dashed-two dotted line in FIG. 1). - The camera Z is attached to a rim or the like of the
display unit 41 such that the movement of the user's eyes which observe the endoscope image A1 can be detected, for example. The camera Z is connected to the image generation unit 34. The camera Z performs image pickup of the user's eyes and outputs an image of the user's eyes to the image generation unit 34.
- Since S21 and S22 are same as S11 and S12, descriptions thereof are omitted.
- It is determined whether or not the user's eyes perform a predetermined movement (S23). When the
image generation unit 34 determines that the user's eyes do not perform the predetermined movement, the process returns to S21. On the other hand, when the image generation unit 34 determines that the user's eyes perform the predetermined movement, the process proceeds to S24.
- The detection position image D1 in the main area B1 is set in the display state (S24). The detection position image D1 is an arrow image displayed in a vicinity of the anomaly candidate area L in the main area B1. The detection position image D1 arranged in the main area B1 is an arrow image.
- The
image generation unit 34 sets the detection position image D1 in the non-display state based on the instruction input by the operation unit X, the detection of the predetermined movement of the eyes by the camera Z, or the non-detection of the anomaly candidate area L. - The
image generation unit 34 sets the detection position image D1 in the main area B1 in the non-display state such that the user's attention to the endoscope image A1 is not disturbed when the observation mode is switched from a normal observation mode to a narrow band light mode. - A treatment instrument detection portion T1 (dashed-two dotted line in
FIG. 1) configured to detect a predetermined treatment instrument from the endoscope image A is included in the anomaly detection unit 33, and when the predetermined treatment instrument is detected from the endoscope image A, the detection position image D1 in the main area B1 is set in the non-display state such that the user's attention is not disturbed. - In other words, when the
image generation unit 34 determines that a predetermined display switching condition is satisfied, the image generation unit 34 switches the detection position image D1 in the main area B1 to either the display state or the non-display state. The predetermined display switching condition is whether or not the image of the user's eyes which is inputted from the camera Z indicates the predetermined movement. The predetermined display switching condition is also whether or not the observation mode is the narrow band light observation mode. The predetermined display switching condition is also whether or not the predetermined treatment instrument is detected by the treatment instrument detection portion T1.
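The display switching conditions listed above can be combined into one state-transition function. The priority ordering, where the non-display conditions override the eye-movement condition, is an assumption made for this sketch; the patent does not state an ordering, and the names are illustrative.

```python
def next_d1_state(current_display: bool,
                  eye_movement_detected: bool,
                  narrow_band_mode: bool,
                  instrument_detected: bool) -> bool:
    """Illustrative combination of the predetermined display switching
    conditions for the detection position image D1 in the main area B1."""
    # Narrow band light observation and treatment instrument detection
    # force the non-display state so the user's attention is not disturbed.
    if narrow_band_mode or instrument_detected:
        return False
    # The predetermined eye movement switches D1 into the display state.
    if eye_movement_detected:
        return True
    # Otherwise the current state is kept.
    return current_display
```

A timer-based condition (the timer T2 mentioned later) could be added as one more boolean input without changing the structure.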
- According to this, in the endoscope
diagnosis support system 1, the detection position image D1 can be switched to either the display state or the non-display state by the predetermined display switching condition, which saves the user the trouble of operating the operation unit X.
-
FIG. 13 is a flowchart illustrating an example of the display image generation processing of the endoscope diagnosis support system 1 according to a second embodiment of the present invention. FIG. 14 is a diagram illustrating a configuration example of the display image B of the display unit 41 of the endoscope diagnosis support system 1 according to the second embodiment of the present invention. According to the present embodiment, descriptions of the same components as those according to other embodiments and modifications are omitted. - An operation of the endoscope
diagnosis support system 1 according to the second embodiment is described. - Since S31 to S33 are same as S11 to S13, descriptions thereof are omitted.
- The detection position image D1 in the main area B1 is set in the display state, and the enlarged image E is displayed in the sub area B2 (S34). As illustrated in
FIG. 14, the image generation unit 34 sets the detection position image D1 in the main area B1 in the display state. The image generation unit 34 also displays the enlarged image E at a predetermined enlargement rate in the sub area B2. - In other words, the
image generation unit 34 arranges the enlarged image E obtained by enlarging the anomaly candidate area L in the sub area B2. - The processes S31 to S34 constitute the display image generation processing according to the second embodiment.
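Producing the enlarged image E amounts to cropping the anomaly candidate area and scaling it up. A minimal nearest-neighbour sketch follows; it is a stand-in for the video processor's scaler, not the patent's implementation, and the function name and bounding-box convention (x, y, w, h) are assumptions.

```python
def enlarge_region(image, bbox, rate: int = 2):
    """Crop the anomaly candidate area given by bbox = (x, y, w, h)
    from a 2-D list of pixels and enlarge it by an integer rate using
    nearest-neighbour pixel duplication."""
    x, y, w, h = bbox
    # Crop the region of interest row by row.
    crop = [row[x:x + w] for row in image[y:y + h]]
    # Duplicate every row and every pixel `rate` times.
    return [[px for px in row for _ in range(rate)]
            for row in crop for _ in range(rate)]
```

A real system would use a proper interpolating scaler; the point here is only the crop-then-scale structure behind the enlarged image E in the sub area B2.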
- According to this, in the endoscope
diagnosis support system 1, the enlarged image E can be displayed in the sub area B2 in response to the user's instruction input, and visibility of the anomaly candidate area L is improved. - According to the second embodiment described above, in the endoscope
diagnosis support system 1, the anomaly candidate area L corresponding to the candidate of the anomaly such as the lesion can be indicated in a manner that the user's attention to the endoscope image A1 is not disturbed, and the diagnosis based on the endoscope 21 can be supported.
- Note that according to the embodiments and the modifications, the
image generation unit 34 may generate the display image B such that the detection position images D1 and D2 are not displayed outside of the endoscope images A1, A1 d, A1 e, A2, A2 d, and A2 e. - Note that according to the embodiments and the modifications, when the detection position image D1 is arranged in the main area B1, the
image generation unit 34 also arranges the detection position image D2 in the sub area B2, but when the detection position image D1 is arranged in the main area B1, the display image B may be generated such that the detection position image D2 is not arranged in the sub area B2. - Note that according to the embodiments and the modifications, the operation unit X is configured by all of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe, but may be configured by part of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe. In other words, the operation unit X includes at least any one of the foot switch Xa, the keyboard Xb, the tablet Xc, the voice input apparatus Xd, and the scope switch Xe.
- Note that according to the first embodiment, the detection mark Ma is displayed in the lower right portion in the main area B1, but may be displayed in an upper right portion, an upper left portion, or a lower left portion in the main area B1.
- Note that according to the fifth modification of the first embodiment, the predetermined display switching condition is whether or not the image of the user's eyes indicates the predetermined movement, but a timer T2 may be included (dashed-two dotted line in
FIG. 1), the anomaly detection unit 33 can detect an anomaly type in the anomaly candidate area L, the timer T2 can measure a predetermined period of time in accordance with the anomaly type after the anomaly candidate area L is detected, and the predetermined display switching condition may be whether or not the predetermined period of time elapses after the anomaly candidate area L is detected.
- The respective “units” in the present specification are conceptual components corresponding to the respective functions of the embodiments and do not necessarily correspond to particular hardware or software routines on a one-on-one basis. Therefore, in the present specification, the embodiments are described while virtual circuit blocks (units) including the respective functions of the embodiments are supposed. With regard to the respective steps in the respective procedure according to the present embodiment, unless contrary to the nature thereof, an execution order may be changed, a plurality of steps may be executed at the same time, or the execution be performed in a different order for each execution. Furthermore, all or part of the respective steps in the respective procedure according to the present embodiment may be realized by hardware.
- For example, the
video processor 31 may include a central processing unit (CPU) 51 and a memory 52 and execute an endoscope diagnosis support program 53 stored in the memory 52 to realize the functions of the anomaly detection unit 33 and the image generation unit 34 (dashed-two dotted line in FIG. 1). In other words, the endoscope diagnosis support program 53 causes a computer to execute code for performing the detection of the anomaly candidate area L from the endoscope image A obtained by performing image pickup of the inside of the subject by the image pickup portion 24 to output the detection result, and code for generating the display image B which is divided into the main area B1 and the sub area B2 that is smaller than the main area B1 and in which the endoscope image A1 is arranged in the main area B1, and the anomaly detection image indicating the detection of the anomaly candidate area L is arranged in the periphery portion of the main area B1 in accordance with the detection result. - The present invention is not limited to the above-mentioned embodiments, and various modifications, alterations, and the like can be made in a range without departing from the gist of the present invention.
Claims (21)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/016961 WO2018198327A1 (en) | 2017-04-28 | 2017-04-28 | Endoscope diagnosis assist system, endoscope diagnosis assist program, and endoscope diagnosis assist method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/016961 Continuation WO2018198327A1 (en) | 2017-04-28 | 2017-04-28 | Endoscope diagnosis assist system, endoscope diagnosis assist program, and endoscope diagnosis assist method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200126223A1 true US20200126223A1 (en) | 2020-04-23 |
Family
ID=63918852
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/665,040 Pending US20200126223A1 (en) | 2017-04-28 | 2019-10-28 | Endoscope diagnosis support system, storage medium, and endoscope diagnosis support method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20200126223A1 (en) |
JP (1) | JP6751815B2 (en) |
CN (1) | CN110868907B (en) |
WO (1) | WO2018198327A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11576563B2 (en) | 2016-11-28 | 2023-02-14 | Adaptivendo Llc | Endoscope with separable, disposable shaft |
US11918176B2 (en) | 2019-03-08 | 2024-03-05 | Fujifilm Corporation | Medical image processing apparatus, processor device, endoscope system, medical image processing method, and program |
USD1018844S1 (en) | 2020-01-09 | 2024-03-19 | Adaptivendo Llc | Endoscope handle |
US11931200B2 (en) | 2020-06-25 | 2024-03-19 | Fujifilm Healthcare Corporation | Ultrasound diagnostic apparatus and diagnosis assisting method |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7123173B2 (en) * | 2018-12-28 | 2022-08-22 | オリンパス株式会社 | Endoscope image processing device, endoscope image processing method and program |
WO2020170809A1 (en) * | 2019-02-19 | 2020-08-27 | 富士フイルム株式会社 | Medical image processing device, endoscope system, and medical image processing method |
WO2021033303A1 (en) * | 2019-08-22 | 2021-02-25 | Hoya株式会社 | Training data generation method, learned model, and information processing device |
WO2021095446A1 (en) * | 2019-11-11 | 2021-05-20 | 富士フイルム株式会社 | Information display system and information display method |
CN111064934A (en) * | 2019-12-30 | 2020-04-24 | 元力(天津)科技有限公司 | Medical image processing system and method |
JP7260060B2 (en) * | 2020-03-30 | 2023-04-18 | 日本電気株式会社 | Information processing device, display method, and program |
WO2024004524A1 (en) * | 2022-06-29 | 2024-01-04 | 富士フイルム株式会社 | Diagnosis assistance device, ultrasound endoscope, diagnosis assistance method, and program |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120026308A1 (en) * | 2010-07-29 | 2012-02-02 | Careview Communications, Inc | System and method for using a video monitoring system to prevent and manage decubitus ulcers in patients |
EP2517614A1 (en) * | 2010-02-05 | 2012-10-31 | Olympus Corporation | Image processing device, endoscope system, program and image processing method |
US20120321161A1 (en) * | 2011-06-17 | 2012-12-20 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, image pickup system, and program |
US20150078615A1 (en) * | 2013-09-18 | 2015-03-19 | Cerner Innovation, Inc. | Marking and tracking an area of interest during endoscopy |
US20180098690A1 (en) * | 2015-06-11 | 2018-04-12 | Olympus Corporation | Endoscope apparatus and method for operating endoscope apparatus |
WO2018230074A1 (en) * | 2017-06-14 | 2018-12-20 | オリンパス株式会社 | System for assisting observation of endoscope image |
US20190385018A1 (en) * | 2018-06-13 | 2019-12-19 | Casmo Artificial Intelligence-Al Limited | Systems and methods for training generative adversarial networks and use of trained generative adversarial networks |
US20210000327A1 (en) * | 2018-01-26 | 2021-01-07 | Olympus Corporation | Endoscopic image processing apparatus, endoscopic image processing method, and recording medium |
US20210357109A1 (en) * | 2020-05-15 | 2021-11-18 | Digits Financial, Inc. | System and method for detecting and resizing a window for improved content delivery |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5482049A (en) * | 1994-03-16 | 1996-01-09 | Siemens Medical Systems, Inc. | Programmable electronic blood pressure monitoring labels |
US6371908B1 (en) * | 1998-05-01 | 2002-04-16 | Asahi Kogaku Kogyo Kabushiki Kaisha | Video endoscopic apparatus for fluorescent diagnosis |
WO2005009206A2 (en) * | 2003-06-25 | 2005-02-03 | Besson Guy M | Dynamic multi-spectral imaging system |
JP4578817B2 (en) * | 2004-02-06 | 2010-11-10 | オリンパス株式会社 | Surgical lesion identification system |
CN101073097B (en) * | 2004-04-15 | 2011-05-18 | 美国医软科技公司 | Multiple volume exploration system and method |
JP4477451B2 (en) * | 2004-08-23 | 2010-06-09 | オリンパス株式会社 | Image display device, image display method, and image display program |
US8423123B2 (en) * | 2005-09-30 | 2013-04-16 | Given Imaging Ltd. | System and method for in-vivo feature detection |
JP5005981B2 (en) * | 2006-08-03 | 2012-08-22 | オリンパスメディカルシステムズ株式会社 | Image display device |
KR100936456B1 (en) * | 2006-12-07 | 2010-01-13 | 주식회사 메디슨 | Ultrasound system |
CN101686799B (en) * | 2007-07-12 | 2012-08-22 | 奥林巴斯医疗株式会社 | Image processing device, and its operating method |
US20100329520A2 (en) * | 2007-11-08 | 2010-12-30 | Olympus Medical Systems Corp. | Method and System for Correlating Image and Tissue Characteristic Data |
KR100868339B1 (en) * | 2007-11-15 | 2008-11-12 | 주식회사 인트로메딕 | Method for displaying the medical image and system and method for providing captured image by the medical image |
JP2010172673A (en) * | 2009-02-02 | 2010-08-12 | Fujifilm Corp | Endoscope system, processor for endoscope, and endoscopy aiding method |
US8120301B2 (en) * | 2009-03-09 | 2012-02-21 | Intuitive Surgical Operations, Inc. | Ergonomic surgeon control console in robotic surgical systems |
CN102215911A (en) * | 2009-08-19 | 2011-10-12 | 奥林巴斯医疗株式会社 | Medical system and medical control method |
JP2011255006A (en) * | 2010-06-09 | 2011-12-22 | Olympus Corp | Image processor, endoscopic device, program and image processing method |
US20130096457A1 (en) * | 2011-10-18 | 2013-04-18 | Qscope, LLC | Oral scope system with image sensor and method for visual examination of oral cavity and upper airway |
TWI548394B (en) * | 2012-09-24 | 2016-09-11 | 榮晶生物科技股份有限公司 | Image detecting apparatus and image detecting method |
CN104114077B (en) * | 2012-10-18 | 2016-07-20 | 奥林巴斯株式会社 | Image processing apparatus and image processing method |
KR102043133B1 (en) * | 2012-11-16 | 2019-11-12 | 삼성전자주식회사 | Computer-aided diagnosis supporting apparatus and method |
CN103622662A (en) * | 2013-11-05 | 2014-03-12 | 乐陵市信诺医疗器械有限公司 | High-definition colonoscopy system |
JP6344039B2 (en) * | 2014-04-28 | 2018-06-20 | 富士通株式会社 | Image display device, image display method, and program |
US10216762B2 (en) * | 2014-06-04 | 2019-02-26 | Panasonic Corporation | Control method and non-transitory computer-readable recording medium for comparing medical images |
KR101728045B1 (en) * | 2015-05-26 | 2017-04-18 | 삼성전자주식회사 | Medical image display apparatus and method for providing user interface thereof |
WO2017073338A1 (en) * | 2015-10-26 | 2017-05-04 | オリンパス株式会社 | Endoscope image processing device |
WO2017073337A1 (en) * | 2015-10-27 | 2017-05-04 | オリンパス株式会社 | Endoscope device |
JP6246431B2 (en) * | 2015-11-10 | 2017-12-13 | オリンパス株式会社 | Endoscope device |
-
2017
- 2017-04-28 WO PCT/JP2017/016961 patent/WO2018198327A1/en active Application Filing
- 2017-04-28 CN CN201780092399.2A patent/CN110868907B/en active Active
- 2017-04-28 JP JP2019515030A patent/JP6751815B2/en active Active
-
2019
- 2019-10-28 US US16/665,040 patent/US20200126223A1/en active Pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2517614A1 (en) * | 2010-02-05 | 2012-10-31 | Olympus Corporation | Image processing device, endoscope system, program and image processing method |
US20120026308A1 (en) * | 2010-07-29 | 2012-02-02 | Careview Communications, Inc | System and method for using a video monitoring system to prevent and manage decubitus ulcers in patients |
US20120321161A1 (en) * | 2011-06-17 | 2012-12-20 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, image pickup system, and program |
US20150078615A1 (en) * | 2013-09-18 | 2015-03-19 | Cerner Innovation, Inc. | Marking and tracking an area of interest during endoscopy |
US20160133014A1 (en) * | 2013-09-18 | 2016-05-12 | Cerner Innovation, Inc. | Marking And Tracking An Area Of Interest During Endoscopy |
US20180098690A1 (en) * | 2015-06-11 | 2018-04-12 | Olympus Corporation | Endoscope apparatus and method for operating endoscope apparatus |
WO2018230074A1 (en) * | 2017-06-14 | 2018-12-20 | オリンパス株式会社 | System for assisting observation of endoscope image |
US20210000327A1 (en) * | 2018-01-26 | 2021-01-07 | Olympus Corporation | Endoscopic image processing apparatus, endoscopic image processing method, and recording medium |
US20190385018A1 (en) * | 2018-06-13 | 2019-12-19 | Casmo Artificial Intelligence-Al Limited | Systems and methods for training generative adversarial networks and use of trained generative adversarial networks |
US20210357109A1 (en) * | 2020-05-15 | 2021-11-18 | Digits Financial, Inc. | System and method for detecting and resizing a window for improved content delivery |
Also Published As
Publication number | Publication date |
---|---|
CN110868907A (en) | 2020-03-06 |
JPWO2018198327A1 (en) | 2020-03-12 |
JP6751815B2 (en) | 2020-09-09 |
CN110868907B (en) | 2022-05-17 |
WO2018198327A1 (en) | 2018-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200126223A1 (en) | Endoscope diagnosis support system, storage medium, and endoscope diagnosis support method | |
CN108135457B (en) | Endoscope image processing device | |
JP6246431B2 (en) | Endoscope device | |
US10694933B2 (en) | Image processing apparatus and image processing method for image display including determining position of superimposed zoomed image | |
US11602263B2 (en) | Insertion system, method and computer-readable storage medium for displaying attention state information over plurality of times | |
US8251890B2 (en) | Endoscope insertion shape analysis system and biological observation system | |
WO2017073337A1 (en) | Endoscope device | |
US20210000327A1 (en) | Endoscopic image processing apparatus, endoscopic image processing method, and recording medium | |
US11025835B2 (en) | Imaging device, endoscope apparatus, and method for operating imaging device | |
US11457876B2 (en) | Diagnosis assisting apparatus, storage medium, and diagnosis assisting method for displaying diagnosis assisting information in a region and an endoscopic image in another region | |
US11061470B2 (en) | Eye tracking device and eye tracking method | |
US10918265B2 (en) | Image processing apparatus for endoscope and endoscope system | |
JP5698068B2 (en) | Image processing apparatus, image display system, radiographic image capturing system, image processing program, and image processing method | |
WO2020165978A1 (en) | Image recording device, image recording method, and image recording program | |
US20220414880A1 (en) | Medical system, information processing method, and computer-readable medium | |
US20220215539A1 (en) | Composite medical imaging systems and methods | |
US20240087113A1 (en) | Recording Medium, Learning Model Generation Method, and Support Apparatus | |
CN107633478B (en) | Image processing apparatus, image processing method, and computer readable medium | |
CN114830638A (en) | System and method for telestration with spatial memory | |
JP5932188B1 (en) | Video processor for endoscope and endoscope system having the same | |
US20220346632A1 (en) | Image processing apparatus, image processing method, and non-transitory storage medium storing computer program | |
WO2023112499A1 (en) | Endoscopic image observation assistance device and endoscope system | |
JP7215504B2 (en) | Treatment support device, treatment support method, and program | |
US20240065527A1 (en) | Medical support device, endoscope, medical support method, and program | |
WO2024018713A1 (en) | Image processing device, display device, endoscope device, image processing method, image processing program, trained model, trained model generation method, and trained model generation program | |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED