WO2023228659A1 - Image processing device and endoscope system - Google Patents

Image processing device and endoscope system

Info

Publication number
WO2023228659A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
recognized
information
display
Prior art date
Application number
PCT/JP2023/016078
Other languages
French (fr)
Japanese (ja)
Inventor
Masaaki Oosake (大酒 正明)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2023228659A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof

Definitions

  • the present invention relates to an image processing device and an endoscope system, and in particular to an image processing device that processes a plurality of images taken in chronological order by an endoscope, and an endoscope system equipped with the image processing device.
  • One embodiment of the technology of the present disclosure provides an image processing device and an endoscope system that make it possible to grasp the status of observation performed with an endoscope.
  • (1) An image processing device that processes a plurality of images taken in time series by an endoscope, the device comprising a processor, wherein the processor acquires the plurality of images, processes the images to recognize, from among a plurality of regions of interest inside the body, a region of interest appearing in an image, and, when a first region of interest among the plurality of regions of interest is recognized from a first image of the plurality of images and a second region of interest among the plurality of regions of interest is recognized from a second image later in the time series, causes a display device to display information indicating that an area between the first region of interest and the second region of interest has been observed.
  • the processor causes the display device to display, as first information, information on a plurality of specific regions of interest selected from the plurality of regions of interest inside the body, and causes the display device to display, as second information, information indicating that the area between the first region of interest and the second region of interest has been observed. The image processing device according to (1).
  • the first information includes a schema diagram of the organ to be observed, and is composed of information in which markers are placed at the position of each region of interest on the schema diagram.
  • the processor calculates an observation evaluation value based on the images between the first image and the second image in the time series, and the second information includes information on the evaluation value. The image processing device according to any one of (1) to (9).
  • the processor extracts mutually similar images from among the images between the first image and the second image, excludes one of each pair of extracted mutually similar images, and calculates the evaluation value (see the sketch following this list).
  • when the first region of interest is recognized from a third image that is later in the time series than the second image, and the second region of interest is recognized from a fourth image that is later in the time series than the third image, the processor updates the evaluation value and updates the display of the second information. The image processing device according to any one of (10) to (15).
  • the processor displays the plurality of images on the display device in chronological order, accepts selection of the first image from among the images displayed on the display device, and accepts selection of the second image from among the images displayed on the display device after the first image. The image processing device according to any one of (1) to (16).
  • when the processor recognizes a region of interest from an image, the processor further determines whether the image in which the region of interest has been recognized satisfies a second criterion, and confirms the recognition of the region of interest if the second criterion is satisfied. The image processing device according to any one of (1) to (19).
  • the processor records a history of the display of the second information, and when newly displaying the second information, simultaneously displays the second information displayed in the past based on the history. The image processing device according to (22).
  • the processor may terminate the display of the second information when the first region of interest is recognized from a third image that is later in the time series than the second image.
  • An endoscope system comprising an endoscope, a display device, and the image processing device according to any one of (1) to (26) for processing images taken by the endoscope.
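  • The following is a minimal sketch of the similar-image exclusion referred to above. The similarity measure (grayscale-histogram correlation), the threshold, and all function names are illustrative assumptions; the publication does not specify how similarity is judged.

```python
import numpy as np

def normalized_histogram(img: np.ndarray, bins: int = 32) -> np.ndarray:
    # Grayscale intensity histogram, normalized to sum to 1.
    h, _ = np.histogram(img, bins=bins, range=(0, 255))
    return h / max(h.sum(), 1)

def are_similar(a: np.ndarray, b: np.ndarray, threshold: float = 0.98) -> bool:
    # Hypothetical criterion: histogram correlation above a threshold.
    corr = float(np.corrcoef(normalized_histogram(a), normalized_histogram(b))[0, 1])
    return corr >= threshold

def exclude_similar(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Keep one image from each run of mutually similar frames."""
    kept: list[np.ndarray] = []
    for frame in frames:
        if not kept or not are_similar(kept[-1], frame):
            kept.append(frame)
    return kept
```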
  • Brief description of the drawings:
  FIG. 1: Block diagram showing an example of the system configuration of an endoscope system
  FIG. 2: Block diagram of the main functions of the processor device
  FIG. 3: Block diagram showing an example of the hardware configuration of an image processing device
  FIG. 4: Block diagram of the main functions of the image processing device
  FIG. 5: Block diagram of the main functions of the image processing unit
  FIG. 6: Diagram showing an example of a screen display of a display device
  FIG. 7: Diagram showing an example of display transition of the observation status display window
  FIG. 8: Flowchart showing the processing procedure when displaying information indicating observation status on a display device
  FIG. 9: Block diagram of the main functions of the observation situation determination unit of the image processing device
  FIG. 10: Conceptual diagram of imaging area determination and evaluation value calculation performed by the observation situation determination unit
  FIG. 11: Diagram showing an example of the display of the observation status display window
  FIG. 12: Diagram showing another example of display of information indicating observation status
  FIG. 13: Diagram showing an example of display transition of the observation status display window
  • In the following, a case in which the present invention is applied to an endoscope system for observing (including observation for the purpose of examination) the upper digestive tract in the body, particularly the stomach, will be described as an example.
  • the stomach is an example of an organ to be observed, particularly a hollow organ.
  • FIG. 1 is a block diagram showing an example of the system configuration of an endoscope system.
  • the endoscope system 1 of this embodiment includes an endoscope 10, a light source device 20, a processor device 30, an input device 40, a display device 50, an image processing device 100, and the like.
  • the endoscope 10 is connected to a light source device 20 and a processor device 30.
  • the light source device 20, the input device 40, and the image processing device 100 are connected to the processor device 30.
  • Display device 50 is connected to image processing device 100.
  • the endoscope system 1 of this embodiment is configured as a system capable of observation using special light (special light observation) in addition to normal observation using white light (white light observation).
  • Special light observation includes narrowband light observation.
  • Narrow band optical observation includes BLI observation (Blue laser imaging observation), NBI observation (Narrow band imaging observation), LCI observation (Linked Color Imaging observation), and the like. Note that special light observation itself is a well-known technique, so detailed explanation thereof will be omitted.
  • the endoscope 10 of this embodiment is an electronic endoscope (flexible endoscope), particularly an electronic endoscope for upper digestive organs.
  • An electronic endoscope includes an operation section, an insertion section, a connection section, and the like, and photographs a subject using an image sensor built into the tip of the insertion section.
  • As the image sensor, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Device) image sensor, or the like is used.
  • The image sensor has a predetermined color filter array (for example, a Bayer array).
  • the operation section includes an angle knob, an air/water supply button, a suction button, a mode switching button, a release button, a forceps port, and the like.
  • the mode switching button is a button for switching observation modes. For example, switching is performed between a mode for white light observation, a mode for LCI observation, and a mode for BLI observation.
  • the release button is a button that instructs to take a still image. Note that since the endoscope itself is well known, detailed explanation thereof will be omitted.
  • the endoscope 10 is connected to a light source device 20 and a processor device 30 via a connection part.
  • the light source device 20 generates illumination light to be supplied to the endoscope 10.
  • the endoscope system 1 of this embodiment is configured as a system capable of special light observation in addition to normal white light observation. Therefore, the light source device 20 has a function of generating light (for example, narrowband light) compatible with special light observation in addition to normal white light. Note that, as mentioned above, since special light observation itself is a well-known technique, a description of the generation of the illumination light will be omitted. Switching of the light source type is performed using, for example, a mode switching button provided on the operation section of the endoscope 10.
  • the processor device 30 centrally controls the operation of the entire endoscope system.
  • the processor device 30 includes a processor, a main storage device, an auxiliary storage device, an input/output interface, etc. as its hardware configuration.
  • FIG. 2 is a block diagram of the main functions of the processor device.
  • the processor device 30 has functions such as an endoscope control section 31, a light source control section 32, an image processing section 33, an input control section 34, and an output control section 35. Each function is realized by the processor executing a predetermined program.
  • the auxiliary storage device stores various programs executed by the processor and various data necessary for control and the like.
  • the endoscope control unit 31 controls the endoscope 10.
  • Control of the endoscope 10 includes drive control of an image sensor, control of air and water supply, control of suction, and the like.
  • the light source control unit 32 controls the light source device 20.
  • Control of the light source device 20 includes light emission control of the light source, switching control of light source types, and the like.
  • the image processing unit 33 performs various signal processing on the signal (image signal) output from the image sensor of the endoscope 10 to generate an image.
  • the input control unit 34 performs a process of accepting operation inputs from the input device 40 and the operation unit of the endoscope 10, and inputs of various information.
  • the output control unit 35 controls the output of information from the processor device 30 to the image processing device 100.
  • the information output from the processor device 30 to the image processing device 100 includes images taken with the endoscope (endoscopic images), information input via the input device 40, various operation information, and the like.
  • the various operation information includes operation information of the input device 40 as well as operation information of the operation section of the endoscope 10.
  • the operation information includes a still image shooting instruction. As described above, a still image shooting instruction is performed using a release button provided on the operation section of the endoscope 10. In addition, the still image shooting instruction may be given via a foot switch, an audio input device, a touch panel, or the like. Images taken in chronological order by the endoscope are sequentially output to the image processing device 100.
  • the input device 40 constitutes a user interface in the endoscope system 1 together with the display device 50.
  • the input device 40 includes, for example, a keyboard, a mouse, a foot switch, and the like.
  • the input device 40 can be configured to include a touch panel, a voice input device, a line of sight input device, and the like.
  • the display device 50 is used not only to display endoscopic images but also to display various information.
  • the display device 50 is configured with, for example, a liquid crystal display (LCD), an organic electroluminescence display (OELD), or the like.
  • the display device 50 can also be configured with a projector, a head-mounted display, or the like.
  • the image processing device 100 processes images output from the processor device 30. Specifically, the image processing device 100 performs a process of sequentially displaying images sequentially output from the processor device 30 on the display device 50. Further, the image processing device 100 sequentially processes images sequentially output from the processor device 30, and performs a process of detecting a lesion from the images. Further, processing for differentiating the detected lesion is performed. The image processing device 100 also performs processing for displaying the detection results and discrimination results of the lesion on the display device 50. Further, the image processing device 100 performs a process of photographing and recording a still image and/or a moving image in response to an instruction from a user.
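  • As a rough illustration of the per-frame flow just described (sequential display, lesion detection, and discrimination), a hedged sketch follows; every function name is a hypothetical stand-in for the corresponding unit, not an API defined by the publication.

```python
from typing import Callable

def process_frame(frame,
                  detect: Callable,        # stands in for lesion detection (113A)
                  discriminate: Callable,  # stands in for discrimination (113B)
                  show: Callable) -> list: # stands in for display control (115)
    detections = detect(frame)
    for lesion in detections:
        # attach a discrimination result (e.g. "NEOPLASTIC" / "HYPERPLASTIC")
        lesion["diagnosis"] = discriminate(frame, lesion)
    show(frame, detections)  # draw the frame and overlay detection results
    return detections
```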
  • the image processing apparatus 100 performs a process of recognizing the part of an organ shown in the photographed still image. Further, the image processing device 100 performs a process of displaying information indicating the observation status of the organ by the endoscope 10 on the display device 50 based on the recognition result of the part of the organ.
  • the plurality of parts of the organ that are the recognition targets are an example of the plurality of regions of interest inside the body.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of the image processing device.
  • the image processing device 100 is composed of a so-called computer, and its hardware configuration includes a processor 101, a main memory 102, an auxiliary storage 103, an input/output interface 104, and the like.
  • the image processing device 100 is connected to the processor device 30 and the display device 50 via an input/output interface 104.
  • the auxiliary storage device 103 includes, for example, a hard disk drive (HDD), a flash memory device such as an SSD (Solid State Drive), and the like.
  • the auxiliary storage device 103 stores programs executed by the processor 101 and various data necessary for control and the like.
  • images (still images and moving images) taken with the endoscope are recorded in the auxiliary storage device 103.
  • the detection result of the lesion, the discrimination result, the recognition result of the part, the judgment result of the observation situation, etc. are also recorded in the auxiliary storage device 103.
  • FIG. 4 is a block diagram of the main functions of the image processing device.
  • the image processing device 100 has functions such as an image acquisition section 111, a command acquisition section 112, an image processing section 113, a recording control section 114, and a display control section 115.
  • the functions of each part are realized by the processor 101 executing a predetermined program (image processing program).
  • the image acquisition unit 111 performs a process of sequentially acquiring images sequentially output from the processor device 30. As described above, the processor device 30 sequentially outputs images taken in chronological order by the endoscope 10. Therefore, the image acquisition unit 111 acquires images taken in chronological order by the endoscope 10 in chronological order.
  • the command acquisition unit 112 acquires command information.
  • the command information includes information on still image shooting instructions.
  • the image processing unit 113 processes the image acquired by the image acquisition unit 111.
  • FIG. 5 shows the main functional blocks of the image processing section.
  • the image processing unit 113 of the image processing apparatus 100 of this embodiment has functions such as a lesion detection unit 113A, a discrimination unit 113B, a site recognition unit 113C, and an observation situation determination unit 113D.
  • the lesion detection unit 113A processes the input image and performs processing to detect a lesion (for example, a polyp) appearing in the image. Lesions include areas that are certain to be lesions, areas that may be lesions (benign tumors, dysplasia, etc.), and areas having certain characteristics (redness, etc.) that may be directly or indirectly related to a lesion.
  • the lesion detection unit 113A is composed of a trained model that has been trained to detect a lesion from an image. Detection of a lesion using a learned model itself is a well-known technique, so a detailed explanation thereof will be omitted. As an example, it is configured with a model using a convolutional neural network (CNN). Note that detection of a lesion can include determining the type of the detected lesion.
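  • The publication states only that the lesion detection unit is a trained CNN model; the sketch below uses a torchvision Faster R-CNN purely for illustration (untrained weights, a single lesion class), and the score threshold is an assumption.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Two classes: background + "lesion". Weights are left untrained here.
model = fasterrcnn_resnet50_fpn(weights=None, num_classes=2)
model.eval()

def detect_lesions(frame: torch.Tensor, score_threshold: float = 0.5):
    """frame: float tensor of shape (3, H, W), values in [0, 1]."""
    with torch.no_grad():
        output = model([frame])[0]
    keep = output["scores"] >= score_threshold
    # Boxes correspond to the bounding box B drawn around each detected lesion.
    return output["boxes"][keep], output["scores"][keep]
```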
  • the discrimination unit 113B performs discrimination processing on the lesion detected by the lesion detection unit 113A. As an example, in the present embodiment, a process is performed to distinguish whether a lesion such as a polyp detected by the lesion detection unit 113A is neoplastic (NEOPLASTIC) or non-neoplastic (HYPERPLASTIC).
  • the discrimination unit 113B is composed of a trained model that has been trained to discriminate a lesion from an image.
  • the part recognition unit 113C processes the input image and performs processing to recognize the part of the organ shown in the image.
  • the region recognition unit 113C performs processing to recognize the region of the stomach.
  • the region to be recognized is, for example, a region that is an anatomical landmark.
  • For example, it recognizes landmark parts of the stomach such as the cardia, fornix, greater curvature, gastric angle, antrum, and pylorus. If a body part cannot be recognized from the input image, the body part recognition unit 113C outputs "unrecognizable" as the recognition result. Furthermore, if the recognized part is a part other than the parts set as recognition targets, the part recognition unit 113C outputs "Other".
  • the part recognition unit 113C is composed of a trained model that has been trained to recognize parts of organs from images. In this embodiment, it is configured with a trained model that has been trained to recognize the region of the stomach from an image. As an example, in this embodiment, the part recognition unit 113C is configured with a CNN.
  • body part recognition processing is performed on the photographed still image. Therefore, the photographed still image is input to the body part recognition unit 113C.
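  • A minimal sketch of the part recognition just described, including the "unrecognizable" and "Other" outputs. The ResNet backbone and the confidence threshold used as the fallback criterion are illustrative assumptions, not details given in the publication.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

PARTS = ["cardia", "fornix", "greater curvature", "gastric angle",
         "antrum", "pylorus", "Other"]

# Untrained backbone standing in for the trained model of unit 113C.
model = resnet18(weights=None, num_classes=len(PARTS))
model.eval()

def recognize_part(still_image: torch.Tensor, min_confidence: float = 0.6) -> str:
    """still_image: float tensor of shape (1, 3, H, W)."""
    with torch.no_grad():
        probs = F.softmax(model(still_image), dim=1)[0]
    confidence, index = probs.max(dim=0)
    if confidence.item() < min_confidence:
        return "unrecognizable"   # no part could be recognized
    return PARTS[index.item()]    # may be "Other" for non-target parts
```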
  • the observation status determination unit 113D determines the observation status of the organ using the endoscope 10 based on the recognition result of the organ site by the site recognition unit 113C. Specifically, the observation situation is determined by the following procedure. When a still image is photographed and the photographed region is recognized by the region recognition section 113C, information on the recognized region (information on the region recognition result) is added to the observation situation determination section 113D. The observation situation determination unit 113D acquires information on the recognized part and compares it with information on the previously recognized part. If the region recognized this time is different from the region recognized last time, the observation situation determination unit 113D determines that the area of the organ between the region recognized last time and the region recognized this time has been observed.
  • For example, if the region recognized last time is the "greater curvature of the gastric body" and the region recognized this time is the "antrum", it is determined that the area between the greater curvature of the gastric body and the antrum has been observed.
  • the observation situation determination unit 113D determines the region of the observed organ by comparing it with the previously recognized region. Since the determination is made by comparison with the previously recognized region, the observation situation determination unit 113D retains at least information on the previously recognized region.
  • the part recognized last time is an example of the first region of interest (first part), and the part recognized this time is an example of the second region of interest (second part).
  • the image in which the first region of interest was recognized is an example of the first image, and the image in which the second region of interest was recognized is an example of the second image.
  • When a part is newly recognized thereafter, the part recognized this time (second region of interest) becomes, in relation to the newly recognized part, the previously recognized part (first region of interest). In this way, the observed areas are determined one after another by comparison with the previous recognition result.
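  • The comparison logic described above can be sketched as follows (a minimal illustration; the class and method names are assumptions):

```python
class ObservationTracker:
    """Determines observed segments by comparing each newly recognized part
    with the previously recognized one (cf. unit 113D)."""

    def __init__(self) -> None:
        self.previous_part: str | None = None
        self.observed_segments: list[tuple[str, str]] = []

    def on_part_recognized(self, part: str) -> tuple[str, str] | None:
        segment = None
        if self.previous_part is not None and part != self.previous_part:
            # e.g. ("greater curvature of the gastric body", "antrum")
            segment = (self.previous_part, part)
            self.observed_segments.append(segment)
        # the current part becomes the "previous" part for the next comparison
        self.previous_part = part
        return segment
```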
  • the recording control unit 114 performs a process of photographing a still image and recording it in the auxiliary storage device 103 in response to a still image photographing instruction.
  • As the still image, the image displayed on the display device 50 at the time when the still image shooting instruction is received is recorded. This allows the user to record a desired image as a still image during observation.
  • the recording control unit 114 acquires an image of a frame displayed on the display device 50 in response to a still image shooting instruction, and records it in the auxiliary storage device 103.
  • the recording control unit 114 performs a process of recording recognition result information (recognized body part information) in the auxiliary storage device 103 in association with the still image.
  • the association method is not particularly limited. It is only necessary to record the still image in a format that allows the correspondence between the still image and the information on the recognition result of the body part based on the still image to be understood. Therefore, for example, the association between the two may be managed using a separately generated management file. Further, for example, information on the recognition result of the body part may be recorded as additional information (so-called meta information) of the still image.
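  • As one concrete realization of the association left open above, the recognition result could be written to a sidecar management file next to the still image; the file layout below is entirely an assumption for illustration.

```python
import json
from pathlib import Path

def record_recognition_result(image_path: Path, recognized_part: str,
                              frame_index: int) -> None:
    # Sidecar file (e.g. still_0001.json next to still_0001.png) holding the
    # correspondence between the still image and its part recognition result.
    meta = {
        "image": image_path.name,
        "recognized_part": recognized_part,
        "frame_index": frame_index,
    }
    image_path.with_suffix(".json").write_text(json.dumps(meta, indent=2))
```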
  • the recording control section 114 performs a process of recording information on the determination result in the auxiliary storage device 103. Information on the determination results is recorded in chronological order. This allows the history of observed areas to be recorded in chronological order.
  • the display control unit 115 controls the display of the display device 50.
  • the display control unit 115 performs a process of sequentially displaying images sequentially output from the processor device 30 on the display device 50.
  • the processor device 30 sequentially outputs images taken in chronological order by the endoscope 10 in chronological order. Therefore, images taken in chronological order by the endoscope 10 are displayed on the display device 50 in chronological order.
  • the display control unit 115 performs a process of displaying the observation status of the organ by the endoscope 10 on the display device 50.
  • FIG. 6 is a diagram showing an example of the screen display of the display device.
  • the figure shows an example in which the display device 50 is a so-called wide monitor (a monitor with a horizontally long screen).
  • a main display area A1 and a sub display area A2 are set on the screen 52.
  • the main display area A1 and the sub display area A2 are set by dividing the screen 52 into two in the horizontal direction.
  • the large area on the left side of FIG. 6 is the main display area A1, and the small area on the right side is the sub display area A2.
  • the main display area A1 is an area that mainly displays images taken by the endoscope 10.
  • An observation image display area A1a is set in the main display area A1, and an image Im photographed by the endoscope 10 is displayed in the observation image display area A1a in real time.
  • the observation image display area A1a is constituted by an area in the shape of a circle with the top and bottom cut out.
  • When the lesion detection support function is turned on, the lesion detection result is displayed superimposed on the image Im.
  • the detection result of the lesion is displayed in a form in which the detected lesion is surrounded by a frame (so-called bounding box) B.
  • When the lesion detection unit 113A determines the type of the lesion, information on the determined type of the lesion is displayed instead of, or in addition to, the information indicating the position of the lesion.
  • Information on the type of lesion is displayed, for example, near the detected lesion. Information on the type of lesion can also be displayed in the sub-display area A2.
  • a strip-shaped information display area A1b is set below the observation image display area A1a.
  • the information display area A1b is used to display various information. For example, when the discrimination support function is turned on, the discrimination result is displayed in the information display area A1b.
  • FIG. 6 shows an example where the differential diagnosis result is "neoplastic".
  • the sub-display area A2 is an area that displays various information related to observation. For example, information Ip about the subject (patient) and still images Is taken during observation are displayed. The still images Is are displayed, for example, in the order in which they were photographed, from the top to the bottom of the screen 52. If the number of still images exceeds the number that can be displayed, the oldest still images Is are deleted and the display is switched. Alternatively, the display size of each image is reduced so that all images are displayed.
  • When the observation situation determination function is turned on, information indicating the observation situation is displayed in the sub-display area A2.
  • information indicating the observation situation is displayed in a predetermined observation situation display window W.
  • the observation status display window W is composed of a rectangular display area, and is displayed at a predetermined position in the sub-display area A2.
  • FIG. 6 shows an example in which the observation status display window W is displayed near the lower end of the sub-display area A2.
  • the observation status display window W displays information on the body part recognized by the body part recognition unit 113C as information indicating the observation status.
  • Information on the recognized body parts is displayed in respective prescribed body part display frames Fl1, Fl2, . . . .
  • the body part display frames Fl1, Fl2, . . . are added each time a new body part is recognized.
  • FIG. 7 is a diagram showing an example of the display transition of the observation status display window.
  • FIG. 7(A) shows an example of the initial display of the observation status display window W. That is, it shows an example of the display of the observation status display window W in a case where recognition of a body part has not been performed even once since the start of observation (in a case where still image photography has not been performed). In this case, the observation status display window W is left blank. In other words, nothing is displayed.
  • FIG. 7(B) shows an example of the display of the observation status display window W when a body part is recognized for the first time. That is, it shows an example of the display of the observation status display window W when a still image is captured for the first time and a body part is recognized from the captured image. As shown in the figure, a body part display frame Fl1 is displayed at a predetermined position (near the top end) in the window, and information about the recognized body part is displayed in the body part display frame Fl1.
  • FIG. 7(B) shows an example in which a "greater curvature of the body of the stomach" is recognized.
  • FIG. 7(C) shows an example of the display of the observation status display window W when a new part is recognized. That is, it shows an example of the display of the observation status display window W when still image photography is further performed and a body part is recognized from the photographed image. As shown in the figure, a new part display frame Fl2 is added below the previously displayed part display frame Fl1, and information about the newly recognized part is displayed in the newly added part display frame Fl2.
  • FIG. 7(C) shows an example in which "antrum" is newly recognized.
  • the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2 are connected by an arrow Ar1.
  • the arrow Ar1 is displayed in the direction from the previously displayed body part display frame Fl1 to the newly added body part display frame Fl2.
  • This arrow Ar1 indicates that the endoscope 10 has moved from the region displayed in the region display frame Fl1 to the region displayed in the region display frame Fl2. That is, it is shown that the area between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2 has been observed. It is also shown that the observation was made from the part displayed in the part display frame Fl1 to the part displayed in the part display frame Fl2. That is, the direction of observation movement is indicated.
  • the endoscope 10 is moved from the "greater curvature of the stomach body" toward the "antrum,” and the region of the stomach in between is observed. It shows.
  • FIG. 7(D) shows an example of the display of the observation status display window W when a region is further recognized. That is, it shows an example of the display of the observation status display window W when still image photography is further performed and a body part is recognized from the photographed image. As shown in the figure, a new part display frame Fl3 is added below the previously displayed part display frame Fl2, and information about the newly recognized part is displayed in the newly added part display frame Fl3.
  • FIG. 7(D) shows an example where a "gastric angle" is newly recognized.
  • the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3 are connected by an arrow Ar2.
  • the arrow Ar2 is displayed in the direction from the previously displayed body part display frame Fl2 to the newly added body part display frame Fl3.
  • This arrow Ar2 indicates that the endoscope 10 has moved from the region displayed in the region display frame Fl2 to the region displayed in the region display frame Fl3. That is, it is shown that the area between the part displayed in part display frame Fl2 and the part displayed in part display frame Fl3 has been observed. Further, it is shown that the part displayed in the part display frame Fl3 was observed from the part displayed in the part display frame Fl2. That is, the direction of observation movement is indicated.
  • This shows that the endoscope 10 has been moved from the "antrum" to the "gastric angle," and that the region of the stomach between them has been observed.
  • body part display frames Fl1, Fl2, . . . are additionally displayed within the observation status display window W.
  • Information about the newly recognized body part is then displayed in the newly added body part display frame.
  • the previously displayed body part display frame and the newly added body part display frame are connected with an arrow. Thereby, the observed part, observed area, and observation direction can be grasped from the display on the observation status display window W.
  • up to four body part display frames can be displayed in the observation status display window W. If more than four parts are recognized, the display size of the observation status display window W is enlarged. More specifically, it is extended and expanded upward. Thereby, information on newly recognized parts can be displayed sequentially. In addition, the size of the observation status display window W may not be changed, and only information about the most recently recognized part may be displayed. In other words, a configuration may be adopted in which only information on the most recently recognized n parts is displayed without changing the number n of parts displayed in the part display frame. Alternatively, the display range can be changed by scrolling without changing the size of the observation status display window W.
  • the observation situation display window W is displayed in priority over other displays in the sub-display area A2. That is, if the display overlaps with other information, it is displayed at the top.
  • the arrows Ar1, Ar2, ... connecting the part display frames Fl1, Fl2, ... are an example of information indicating that the area between the first region of interest (first part) and the second region of interest (second part) has been observed.
  • the image processing device 100 causes the display device 50 to display images captured by the endoscope 10 in real time.
  • the image processing device 100 causes the display device 50 to display various types of support information.
  • information on the detection result of the lesion, information on the discrimination result of the lesion, and information indicating the observation status are displayed as the support information.
  • Each piece of support information is displayed when the corresponding function is turned on. For example, when the lesion detection support function is turned on, the detected lesion is displayed surrounded by a frame as the lesion detection result. Further, when the discrimination support function is turned on, the discrimination result is displayed in the discrimination result display area A3. Furthermore, when the observation status display function is turned on, an observation status display window W is displayed on the screen 52 as information indicating the observation status.
  • the image processing device 100 records the image of the frame being displayed on the display device 50 as a still image in response to a still image shooting instruction from the user.
  • When the lesion detection support function is turned on, information about the lesion detected from the photographed still image is recorded in association with the photographed still image.
  • When the discrimination support function is turned on, information on the discrimination result of the lesion is recorded in association with the photographed still image.
  • When the observation status display function is turned on, information on the recognized body part and information on the observation status determination result are recorded in association with the photographed still image.
  • These on/off operations may also be configured to be performed using the operating section of the processor device 30 or the operating section of the endoscope 10.
  • FIG. 8 is a flowchart illustrating the processing procedure for displaying information indicating observation conditions on a display device.
  • information indicating the observation situation is displayed on the display device 50 when the observation situation display function is turned on.
  • the processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S1).
  • the observation status display window W is displayed at a predetermined position (sub-display area A2) on the screen 52 (see FIG. 6).
  • the initial display of the observation status display window W is blank (see FIG. 7(A)).
  • the processor 101 determines whether still images are to be taken (step S2).
  • When a still image is taken, body part recognition processing is performed on the photographed still image (step S3). That is, a process is performed to recognize the part of the organ (in this embodiment, the part of the stomach) shown in the photographed still image.
  • the processor 101 determines whether or not the body part has been recognized (step S4).
  • When the body part has been recognized, the processor 101 performs processing to determine the observation situation (step S5). That is, the presence or absence of a previously recognized part is determined, and if there is a previously recognized part, it is determined that the area between the previously recognized part and the currently recognized part has been observed.
  • the processor 101 updates the display of the observation situation display window W based on the region recognition result and the observation situation determination result (step S6).
  • When a part is recognized for the first time, a body part display frame is displayed in the observation status display window W, and information about the recognized body part is displayed in the body part display frame (see FIG. 7(B)).
  • When a part is recognized for the second or subsequent time, a body part display frame is added to the observation status display window W. Then, information on the recognized body part is displayed in the newly added body part display frame (see FIGS. 7(C) and (D)).
  • In addition, the part display frame of the previously recognized part and the part display frame of the currently recognized part are connected with an arrow, and information indicating that the area of the organ between the previously recognized part and the currently recognized part has been observed is displayed (see FIGS. 7(C) and (D)).
  • the processor 101 determines whether the observation has ended (step S7). Note that in the case where it is determined in step S2 that no still image has been taken, and in the case where it is determined in step S4 that the body part has not been recognized, it is similarly determined whether the observation has ended. When it is determined that the observation has ended, the process ends. If it is determined that the observation has not been completed, it is determined whether still images are to be taken (step S2).
  • the process also ends when the observation status display function is turned off. In this case, the observation status display window W is deleted from the screen 52.
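  • The flow of FIG. 8 (steps S1 to S7) can be sketched as follows; the helper callables are hypothetical stand-ins for the units described above, not names used in the publication.

```python
def observation_status_loop(window, take_still_image, recognize_part,
                            tracker, observation_ended) -> None:
    window.show_blank()                             # S1: initial (blank) display
    while not observation_ended():                  # S7: end-of-observation check
        still = take_still_image()                  # S2: still image taken?
        if still is None:
            continue
        part = recognize_part(still)                # S3: body part recognition
        if part == "unrecognizable":                # S4: part recognized?
            continue
        segment = tracker.on_part_recognized(part)  # S5: observation determination
        window.update(part, segment)                # S6: update window W
```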
  • As described above, each time a still image is taken, the part of the organ shown in the taken still image is recognized. Information about the recognized region is then displayed on the display device 50. Thereby, the observed part can be easily grasped.
  • the information on the previously recognized part and the information on the newly recognized part are displayed connected with an arrow. Thereby, it is possible to easily grasp the observed region in the organ to be observed.
  • the user can grasp the observed area from the display of the observation status display window W. However, it cannot be determined whether the area determined to have been observed has been correctly observed (photographed). If the observation is not performed correctly, there is a risk that the lesion cannot be detected accurately when automatically detecting the lesion from the image. Therefore, it is more preferable to not only be able to grasp the observed area but also to be able to confirm whether the area determined to have been observed has been correctly observed.
  • an endoscope system having a function of evaluating the observation state will be described.
  • the basic configuration of the system is substantially the same as the endoscope system 1 of the first embodiment. Therefore, in the following, only the main differences from the endoscope system 1 of the first embodiment, particularly the differences in the image processing apparatus 100, will be described.
  • the image processing apparatus 100 further has a function of evaluating observation of organs using the endoscope 10. Evaluation of the observation is performed in the observation situation determination section 113D.
  • FIG. 9 is a block diagram of the main functions of the observation situation determination section of the image processing device.
  • the observation situation determination unit 113D has the functions of an observation area determination unit 113D1, a photography evaluation unit 113D2, and an evaluation value calculation unit 113D3. These functions are realized by the processor 101 executing a predetermined program.
  • the observation area determination unit 113D1 determines the observed area (observation area) based on the part recognition result by the part recognition unit 113C. Specifically, the observation area is determined by the following procedure. When a still image is photographed and the photographed region is recognized by the region recognition section 113C, information on the recognized region (information on the region recognition result) is added to the observation region determination section 113D1. The observation area determination unit 113D1 acquires information on the recognized region and compares it with information on the previously recognized region. If the region recognized this time is different from the region recognized last time, the observation region determination unit 113D1 determines that the region of the organ between the region recognized last time and the region recognized this time is the observed region. judge.
  • the imaging evaluation unit 113D2 evaluates images taken by the endoscope 10.
  • images are evaluated from the perspective of image recognition. That is, as described above, in the endoscope system 1 of the present embodiment, automatic detection of a lesion, etc. is performed by image recognition, so the captured image is evaluated from the viewpoint of performing image recognition.
  • an image is evaluated based on its defocus state and motion-blur state.
  • the defocus state and motion-blur state of an image can be evaluated, for example, as the sharpness of the image. Sharpness is one of the indicators representing the clarity of an image.
  • the shooting evaluation unit 113D2 calculates the sharpness of each image, determines that images whose calculated sharpness is below a threshold are NG images (defocused or blurred images), and determines that images exceeding the threshold are OK images (clear images).
  • a known method can be used to calculate the sharpness.
  • As the method for evaluating an image, a known method for quantitatively evaluating the defocus state and/or blur state of an image can be adopted.
  • the image sharpness threshold is an example of the first criterion.
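  • A minimal sketch of this first criterion follows. The variance-of-Laplacian measure is a common sharpness proxy chosen here for illustration; the publication fixes neither the formula nor the threshold value.

```python
import cv2
import numpy as np

def is_ok_image(frame_bgr: np.ndarray, threshold: float = 100.0) -> bool:
    """Classify a frame as OK (clear) or NG (defocused/blurred)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Variance of the Laplacian: low values indicate a defocused/blurred image.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness > threshold  # above the threshold: OK image
```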
  • the evaluation value calculation unit 113D3 calculates an observation evaluation value (score) based on the image evaluation result.
  • the observation evaluation value is an index that quantitatively indicates how accurately the area determined to have been observed by the observation area determination unit 113D1 has been observed. As an example, in the present embodiment, it is calculated as the percentage of OK images taken in the area determined to have been observed by the observation area determination unit 113D1. Specifically, the ratio of OK images among the images taken in the area determined to have been observed is calculated. A video is shot in the area determined to have been observed. Therefore, the evaluation value is calculated as the ratio of OK images among the images of each frame making up the moving image. For example, assume that the total number of frames of moving images shot in the area determined to have been observed is 100, of which the number of frames determined to be OK images is 82. In this case, the evaluation value is 82[%].
  • the evaluation value may be calculated sequentially, or may be calculated all at once after the observed area is determined. In the case of sequential calculation, the evaluation value is updated every time an evaluation result of a photographed image is obtained.
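  • The sequential calculation described above reduces to a running ratio; a minimal sketch (names are assumptions):

```python
class EvaluationValue:
    """Ratio of OK images among the frames shot in the observed area,
    updated every time a new frame evaluation result arrives."""

    def __init__(self) -> None:
        self.total_frames = 0
        self.ok_frames = 0

    def update(self, frame_is_ok: bool) -> float:
        self.total_frames += 1
        self.ok_frames += int(frame_is_ok)
        return self.percent

    @property
    def percent(self) -> float:
        if self.total_frames == 0:
            return 0.0
        return 100.0 * self.ok_frames / self.total_frames

# Example from the text: 82 OK frames out of 100 gives 82 [%].
```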
  • FIG. 10 is a conceptual diagram of the determination of the imaging area and the calculation of the evaluation value performed by the observation situation determination section.
  • reference numeral I indicates images I taken in chronological order by the endoscope 10. This image corresponds to an image of each frame of a moving image captured by the endoscope 10.
  • the observed region is determined based on the regions recognized from the photographed still images. That is, the area between a part (first region of interest) recognized from a still image taken first (first image) and a part (second region of interest) recognized from a still image taken after that (second image) is determined to be the observed area.
  • the photographic evaluation unit 113D2 evaluates each image photographed between a still image photographed first (first image) and a still image photographed after that (second image).
  • the evaluation value calculation unit 113D3 calculates, as the evaluation value, the proportion of OK images among the images taken between the still image taken first (first image) and the still image taken after that (second image).
  • FIG. 11 is a diagram showing an example of the display of the observation status display window. 11A to 11C show changes in the display of the observation status display window W over time.
  • FIG. 11(A) shows an example of the display of the observation status display window W when a region is recognized for the first time.
  • a body part display frame Fl1 is displayed at a predetermined position (near the top end) in the window, and information about the recognized body part is displayed in the body part display frame Fl1.
  • FIG. 11(A) shows an example in which a "greater curvature of the body of the stomach" is recognized.
  • FIG. 11(B) shows an example of the display of the observation status display window W when a new part is recognized.
  • As shown in FIG. 11(B), a new part display frame Fl2 is added below the previously displayed part display frame Fl1, and information about the newly recognized part is displayed in the newly added part display frame Fl2.
  • the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2 are connected by an arrow Ar1.
  • Furthermore, an evaluation value display frame Sc1 is newly displayed between the previously displayed body part display frame Fl1 and the newly added body part display frame Fl2, and the evaluation value is displayed in the newly displayed evaluation value display frame Sc1.
  • This evaluation value is an evaluation value calculated for the observation performed between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2.
  • FIG. 11(B) shows an example in which the evaluation value calculated for the observation performed between the "greater curvature of the gastric body" and the "antrum" is 82%.
  • the evaluation value display frame Sc1 is displayed adjacent to the arrow Ar1.
  • FIG. 11C shows an example of the display of the observation status display window W when a region is further recognized.
  • As shown in FIG. 11(C), a new part display frame Fl3 is added below the previously displayed part display frame Fl2, and information about the newly recognized part is displayed in the newly added part display frame Fl3.
  • the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3 are connected by an arrow Ar2.
  • Furthermore, an evaluation value display frame Sc2 is newly displayed between the previously displayed body part display frame Fl2 and the newly added body part display frame Fl3, and the evaluation value is displayed in the newly displayed evaluation value display frame Sc2.
  • This evaluation value is an evaluation value calculated for the observation performed between the region displayed in the region display frame Fl2 and the region displayed in the region display frame Fl3.
  • FIG. 11(C) shows an example in which the evaluation value calculated for the observation performed between the "antrum" and the "gastric angle" is 99%.
  • In this way, the evaluation value display frames Sc1, Sc2, ... are displayed between the body part display frames Fl1, Fl2, ... displayed on the observation status display window W, and the observation evaluation value calculated between the parts is displayed in each of the evaluation value display frames Sc1, Sc2, .... Thereby, it is possible to easily check the observation status (whether or not the images were correctly captured) between the parts displayed in the respective part display frames Fl1, Fl2, ....
  • In this way, an evaluation of the observation is shown for each area that has been observed. Thereby, it can be easily determined whether the observed area has been correctly observed (photographed).
  • FIG. 12 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 12 shows an example of a case where an observation situation is shown using a schema diagram of an organ to be observed.
  • the observation status display window W displays a schema diagram (picture of the organ) Sh of the organ to be observed.
  • FIG. 12 shows an example in which the stomach is the observation target. Therefore, in this case, a schema diagram of the stomach is displayed.
  • predetermined marks M1, M2, . . . are displayed on the schema diagram Sh.
  • Marks M1, M2, . . . are displayed at positions corresponding to the recognized parts. Therefore, for example, if the cardia of the stomach is recognized, a mark is displayed at the position of the cardia on the schema diagram.
  • the observed parts can be recognized from these marks M1, M2, . . . .
  • FIG. 12 shows an example of displaying circular marks M1, M2, . . . .
  • arrows Ar1, Ar2, ... connecting the marks in the order in which they are displayed are displayed on the schema diagram Sh.
  • Arrows Ar1, Ar2, . . . are displayed from the mark displayed first to the mark displayed later. Therefore, for example, when marks are displayed in the order of mark M1, mark M2, and mark M3, an arrow Ar1 connecting mark M1 and mark M2 is displayed from mark M1 toward mark M2. Further, an arrow Ar2 connecting the mark M2 and the mark M3 is displayed from the mark M2 toward the mark M3.
  • arrows Ar1, Ar2, ... are examples of information indicating that a region between the first region of interest (first part) and the second region of interest (second part) has been observed.
  • FIG. 12 shows an example in which the greater curvature of the gastric body, the antrum, and the gastric angle are recognized in that order.
  • the mark M1 is displayed on the schema diagram at a position indicating the greater curvature of the body of the stomach.
  • the mark M2 is displayed on the schema diagram at a position indicating the antrum.
  • the mark M3 is displayed on the schema diagram at a position indicating the stomach angle.
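  • The schema display of FIG. 12 amounts to a list of marks in recognition order plus arrows between consecutive marks; a minimal sketch follows, in which the (x, y) schema coordinates are invented purely for illustration.

```python
# Hypothetical positions of recognizable parts on the schema diagram Sh.
SCHEMA_POSITIONS: dict[str, tuple[int, int]] = {
    "greater curvature of the gastric body": (120, 180),
    "antrum": (200, 220),
    "gastric angle": (170, 200),
}

marks: list[str] = []  # parts in the order their marks M1, M2, ... appeared

def on_part_recognized(part: str) -> None:
    if part in SCHEMA_POSITIONS and (not marks or marks[-1] != part):
        marks.append(part)  # place a mark at SCHEMA_POSITIONS[part]

def arrows() -> list[tuple[str, str]]:
    """Arrows Ar1, Ar2, ...: from each earlier mark toward the next one."""
    return list(zip(marks, marks[1:]))
```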
  • FIG. 13 is a diagram showing an example of the display transition of the observation status display window W.
  • FIG. 13(A) shows an example of the initial display of the observation status display window W. In this case, only the schema diagram Sh is displayed in the observation status display window W.
  • FIG. 13(B) shows an example of the display of the observation status display window W when a region is recognized for the first time.
  • a mark M1 is displayed at the position of the part corresponding to the recognized part.
  • FIG. 13(B) shows an example in which the "greater curvature of the body of the stomach" is recognized. In this case, a mark M1 is displayed at a position corresponding to the greater curvature of the body of the stomach on the schema Sh.
  • FIG. 13(C) shows an example of the display of the observation status display window W when a new part is recognized.
  • a new mark M2 is displayed at the position of the part corresponding to the newly recognized part.
  • FIG. 13(C) shows an example in which the "vestibular region" is newly recognized.
  • a mark M2 is displayed on the schema diagram Sh at a position corresponding to the antrum.
  • the previously displayed mark M1 and the newly displayed mark M2 are connected by an arrow Ar1.
  • the arrow Ar1 is displayed from the previously displayed mark M1 toward the newly displayed mark M2.
  • This arrow Ar1 indicates that the endoscope 10 has moved from the region indicated by the mark M1 toward the region indicated by the mark M2. That is, it is shown that the region between the region indicated by mark M1 and the region indicated by mark M2 has been observed.
  • This shows that the endoscope 10 has been moved from the "greater curvature of the body of the stomach" indicated by mark M1 toward the "antrum" indicated by mark M2, and that the region of the stomach in between has been observed.
  • FIG. 13(D) shows an example of the display of the observation status display window W when a region is further recognized. As shown in FIG. 13(D), a new mark M3 is displayed at the position of the part corresponding to the newly recognized part. FIG. 13(D) shows an example where the "angular part of the stomach" is newly recognized. In this case, a mark M3 is displayed at a position corresponding to the angle of the stomach on the schema Sh.
  • the previously displayed mark M2 and the newly displayed mark M3 are connected by an arrow Ar2.
  • the arrow Ar2 is displayed from the previously displayed mark M2 to the newly displayed mark M3.
  • This arrow Ar2 indicates that the endoscope 10 has moved from the region indicated by the mark M2 toward the region indicated by the mark M3. That is, it is shown that the region between the region indicated by mark M2 and the region indicated by mark M3 has been observed.
  • the endoscope 10 is moved from the "antrum" indicated by mark M2 to the "angular region of the stomach" indicated by mark M3, and the region of the stomach in between is observed. It shows.
  • As described above, marks M1, M2, ... are displayed on the schema diagram Sh corresponding to the recognized parts. Thereby, the observed parts can be easily understood. Moreover, each time a new mark M1, M2, ... is displayed, arrows Ar1, Ar2, ... connecting the marks are displayed. Thereby, the observed area and observation direction can be easily grasped.
  • In the example described above, the marks are connected by arrows, but they may be connected by line segments. When arrows are used, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the arrows (including the case of line segments) Ar1, Ar2, . . . can be configured to be displayed only for a certain period of time.
  • For example, an arrangement can be made in which an arrow is displayed at the same time as the mark of a newly recognized region, and after time T has elapsed from the start of the arrow display, only the arrow is erased.
  • Time T is a preset time.
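  • as one possible way to realize this timed arrow display, the window state can be modeled as in the following minimal Python sketch; the names (ObservationWindow, erase_after_s, etc.) are hypothetical, and this is an illustration rather than the embodiment's actual implementation:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Arrow:
    src: str                                  # mark of the previously recognized part
    dst: str                                  # mark of the newly recognized part
    shown_at: float = field(default_factory=time.monotonic)

class ObservationWindow:
    """Holds the marks M1, M2, ... and the arrows Ar1, Ar2, ... on the schema."""

    def __init__(self, erase_after_s: float = 10.0):
        self.erase_after_s = erase_after_s    # time T (preset)
        self.marks: list[str] = []            # recognized parts, in recognition order
        self.arrows: list[Arrow] = []         # kept as history so arrows can be redisplayed

    def on_part_recognized(self, part: str) -> None:
        # A mark is placed for the recognized part; if a previous mark exists,
        # an arrow is drawn from the previous mark toward the new one.
        if self.marks:
            self.arrows.append(Arrow(src=self.marks[-1], dst=part))
        self.marks.append(part)

    def visible_arrows(self) -> list[Arrow]:
        # Only the arrows are erased after time T; the marks themselves stay.
        now = time.monotonic()
        return [a for a in self.arrows if now - a.shown_at < self.erase_after_s]
```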
  • the end of the observation can be determined, for example, when the same part as the first recognized part is recognized again.
  • alternatively, it can be determined that the observation has ended when a specific motion is detected.
  • for example, when a movement of removing the endoscope from the organ to be observed is detected, it can be determined that the observation has ended.
  • the action of removing the endoscope from the organ to be observed can be detected, for example, by detecting a specific region or organ from the image.
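  • a minimal sketch of these end-of-observation rules (hypothetical function; it assumes the withdrawal motion is detected elsewhere, e.g., by detecting a specific region or organ in the image):

```python
def observation_ended(recognized: list, withdrawal_detected: bool) -> bool:
    # Rule (a): the same part as the first recognized part is recognized again.
    revisited_start = len(recognized) >= 2 and recognized[-1] == recognized[0]
    # Rule (b): a motion of removing the endoscope from the organ was detected.
    return revisited_start or withdrawal_detected
```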
  • when the processor 101 has a function of redisplaying the arrows, the processor 101 records the arrow display history in the main storage device 102 or the auxiliary storage device 103. When an arrow is to be redisplayed, the recorded information is referred to.
  • the shapes of the marks M1, M2, . . . are circles, but the shapes of the marks are not particularly limited.
  • FIG. 14 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 14 shows an example in which evaluation values are further displayed in the display format shown in FIG. 13.
  • the evaluation value calculated for each area is displayed adjacent to the arrow Ar1, Ar2, ... indicating that area. More specifically, evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed areas, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • FIG. 14 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark M1 (greater curvature of the body of the stomach) and the region indicated by mark M2 (antrum) is 82[%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark M2 (antrum) and the region indicated by mark M3 (angular region of the stomach) is 99[%].
  • the evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the respective parts, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • for example, the evaluation values may be erased after time T has elapsed from the start of the display of the arrows and/or the evaluation values, so that only the arrows remain displayed.
  • Time T is a preset time.
  • FIG. 15 is a diagram illustrating another example of displaying information indicating observation status. Note that FIG. 15 shows an example of the display (initial display) of the observation status display window W before body part recognition.
  • This example is an advantageous display form particularly when the region to be observed (or the region to be photographed as a still image) is determined in advance.
  • regions to be observed are displayed on the schema diagram Sh using predetermined marks Ma, Mb, . . . .
  • FIG. 15 shows an example in which circles are used as the marks Ma, Mb, . . . .
  • the shapes and colors of the marks Ma, Mb, . . . are not particularly limited.
  • Each mark Ma, Mb, . . . is displayed at a position corresponding to a region determined as a region to be observed.
  • FIG. 15 shows an example where there are five parts to be observed. The five parts shown in FIG. 15 are the "cardia", the "fornix", the "greater curvature of the body of the stomach", the "angular region of the stomach", and the "antrum".
  • the mark Ma indicates the “cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angular region of the stomach”
  • the mark Me indicates the "antrum.”
  • marks Ma, Mb, . . . are examples of marks representing regions of interest.
  • the schema diagram Sh in which marks Ma, Mb, . . . are arranged in a predetermined layout is an example of the first information.
  • the parts determined in advance as parts to be observed (or parts for which a still image should be photographed) are an example of a plurality of specific attention regions selected from a plurality of attention regions inside the body.
  • FIG. 16 is a diagram showing an example of the display of the observation status display window after body part recognition.
  • the display form of the mark of the recognized body part changes.
  • the color (including transparency) of the mark of the recognized part changes.
  • FIG. 16 shows an example in which the inside of the circle indicating the mark changes from transparent to chromatic or achromatic.
  • FIG. 16 shows an example in which a region indicated by a mark Mc (greater curvature of the gastric body), a region indicated by a mark Md (angular region of the stomach), and a region indicated by a mark Me (antrum) are recognized.
  • the display form of the mark before the corresponding part is recognized is an example of the first form.
  • the display form of the mark after the corresponding region is recognized is an example of the second form.
  • FIG. 16 shows an example where the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Me (antrum), and the region indicated by the mark Md (angular region of the stomach) are recognized in this order.
  • an arrow Ar1 connecting the mark Mc and the mark Me is displayed from the mark Mc toward the mark Me.
  • an arrow Ar2 connecting the mark Me and the mark Md is displayed from the mark Me toward the mark Md.
  • the arrows Ar1, Ar2, ... are examples of information indicating that a region between a first region of interest (first part) and a second region of interest (second part) has been observed, and are an example of the second information.
  • the arrows Ar1, Ar2, ... are also an example of information that associates the marks (the first attention area and the second attention area) in the schema diagram Sh on which the marks Ma, Mb, ... are displayed (the first information).
  • FIG. 17 is a diagram showing an example of the display transition of the observation status display window W.
  • FIG. 17(A) shows an example of the initial display of the observation status display window W.
  • a schema diagram Sh is displayed in the observation status display window W, and marks Ma, Mb, . . . indicating parts to be observed are displayed on the schema diagram Sh.
  • Each mark Ma, Mb, . . . is displayed corresponding to the position of the part to be observed.
  • each mark Ma, Mb, . . . is displayed in a predetermined color (first form). In this example, the inside of the circle indicating the mark is displayed transparently.
  • FIG. 17(B) shows an example of the display of the observation status display window W when the region to be observed is recognized for the first time.
  • the display form of the mark Mc of the recognized part changes (switches to the second form).
  • the inside of the circle indicating the mark changes from transparent to chromatic or achromatic.
  • FIG. 17(B) shows an example in which the "greater curvature of the body of the stomach" is recognized.
  • FIG. 17(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 17(C), the color (display form) of the mark Me of the newly recognized part changes. FIG. 17(C) shows an example in which the "antrum" is newly recognized.
  • an arrow Ar1 is displayed that connects the mark Mc of the previously recognized part and the mark Me of the newly displayed part.
  • the arrow Ar1 is displayed from the mark Mc of the previously recognized part to the mark Me of the newly recognized part.
  • This arrow Ar1 indicates that the region of the organ between the site indicated by the mark Mc and the site indicated by the mark Me has been observed.
  • the example shown in FIG. 17C shows that the region of the stomach between the "greater curvature of the body of the stomach" indicated by the mark Mc and the "antrum” indicated by the mark Me has been observed.
  • FIG. 17(D) shows an example of the display of the observation status display window W when a region is further recognized from among the regions to be observed. As shown in FIG. 17(D), the color (display format) of the mark Md of the newly recognized region changes. FIG. 17(D) shows an example where the "angular part of the stomach" is newly recognized.
  • an arrow Ar2 is displayed that connects the previously displayed mark Me and the newly displayed mark Md.
  • the arrow Ar2 is displayed from the previously displayed mark Me to the newly displayed mark Md.
  • This arrow Ar2 indicates that the region of the stomach between the region indicated by the mark Me and the region indicated by the mark Md has been observed.
  • that is, it is shown that the endoscope 10 was moved from the "antrum" indicated by the mark Me toward the "angular region of the stomach" indicated by the mark Md, and that the region of the stomach between them was observed.
  • the parts to be observed are indicated on the schema diagram Sh by the marks Ma, Mb, . . ., and when observed (recognized), the display form of the marks Ma, Mb, . . . is switched. Thereby, the observed region can be easily understood. Furthermore, each time a region is recognized, arrows Ar1, Ar2, . . . are displayed that connect the mark of the region recognized first and the mark of the region recognized subsequently. Thereby, the observed area and observation direction can be easily grasped.
  • in this example, the marks are connected by arrows, but they may instead be connected by line segments. When arrows are used, however, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the arrows (including the case of line segments) Ar1, Ar2, . . . can be configured to be displayed only for a certain period of time.
  • an arrangement can be made in which an arrow is displayed at the same time as the display form of the mark of a newly recognized part is switched, and only the arrow is erased after time T has elapsed from the start of the arrow display.
  • Time T is a preset time. Note that when the arrow display is to be erased after a certain period of time has elapsed, it is preferable to enable it to be displayed again if necessary.
  • the shapes of the marks Ma, Mb, . . . are circles, but the shapes of the marks are not particularly limited.
  • the color of the mark is changed when the corresponding part is recognized, but the mode of switching is not limited to this.
  • the shape of the mark may be changed or the mark may be made to blink.
  • FIG. 18 is a diagram illustrating another example of displaying information indicating observation status.
  • FIG. 18 shows an example in which evaluation values are further displayed in the display formats shown in FIGS. 15 and 16.
  • the evaluation value calculated for each area is displayed adjacent to the arrow Ar1, Ar2, ... indicating that area. More specifically, evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed areas, and the evaluation values are displayed in the evaluation value display frames Sc1, Sc2, ....
  • FIG. 18 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark Mc (greater curvature of the gastric body) and the region indicated by mark Me (antrum) is 82[%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark Me (antrum) and the region indicated by mark Md (angular region of the stomach) is 99[%].
  • the evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the respective parts, and the evaluation values are displayed within the evaluation value display frames Sc1, Sc2, ....
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • for example, the evaluation values may be erased after time T has elapsed from the start of the display of the arrows and/or the evaluation values, so that only the arrows remain displayed.
  • Time T is a preset time. Note that when the display of the arrow and/or the evaluation value is to be erased after a certain period of time has elapsed, it is preferable that the arrow and/or the evaluation value be able to be displayed again if necessary.
  • FIG. 19 is a diagram illustrating another example of displaying information indicating observation status. Note that FIG. 19 shows an example of the display (initial display) of the observation status display window W before body part recognition.
  • This example is also an advantageous display form when the region to be observed (or the region to be photographed as a still image) is determined in advance.
  • marks Ma, Mb, . . . indicating regions to be observed are displayed in a predetermined layout in the observation status display window W in advance.
  • FIG. 19 shows an example where there are five parts to be observed.
  • the five parts shown in FIG. 19 are the "cardia", the "fornix", the "greater curvature of the body of the stomach", the "gastric angle", and the "antrum".
  • the mark Ma indicates the “cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angular region of the stomach”
  • the mark Me indicates the "antrum.”
  • each mark Ma, Mb, . . . is represented by a circle, and the name of the corresponding part is written inside the circle.
  • marks Ma, Mb, . . . indicating each part are arranged at the vertices of a pentagon.
  • marks Ma, Mb, . . . are examples of marks representing regions of interest.
  • the schema diagram Sh in which marks Ma, Mb, . . . are arranged in a predetermined layout is an example of the first information.
  • the site determined in advance as a site to be observed (or a site for which a still image should be photographed) is an example of a plurality of specific attention regions selected from a plurality of attention regions inside the body.
  • FIG. 20 is a diagram showing an example of the display of the observation status display window after body part recognition.
  • FIG. 20 shows an example in which the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Md (angular region of the stomach), and the region indicated by the mark Me (antrum) are recognized.
  • the display form of the mark before the corresponding part is recognized is an example of the first form.
  • the display form of the mark after the corresponding region is recognized is an example of the second form.
  • FIG. 20 shows an example where the region indicated by the mark Mc (greater curvature of the gastric body), the region indicated by the mark Me (antrum), and the region indicated by the mark Md (angular region of the stomach) are recognized in this order.
  • an arrow Ar1 connecting the mark Mc and the mark Me is displayed from the mark Mc toward the mark Me.
  • an arrow Ar2 connecting the mark Me and the mark Md is displayed from the mark Me toward the mark Md.
  • when evaluation values are calculated for the areas, the calculated evaluation values are displayed between the marks indicating the respective parts. Specifically, evaluation value display frames Sc1, Sc2, ... are displayed above the arrows Ar1, Ar2, ..., and the evaluation values are displayed in the evaluation value display frames Sc1, Sc2, ....
  • FIG. 20 shows an example in which the evaluation value of the observation of the region of the stomach between the region indicated by mark Mc (greater curvature of the gastric body) and the region indicated by mark Me (antrum) is 82[%], and the evaluation value of the observation of the region of the stomach between the region indicated by mark Me (antrum) and the region indicated by mark Md (angular region of the stomach) is 99[%].
  • the arrow and the evaluation value are an example of information indicating that a region between the first region of interest (first part) and the second region of interest (second part) has been observed, and are an example of the second information. Further, the arrow and the evaluation value are examples of information that associates the marks (the first attention area and the second attention area) in the diagram in which the marks are displayed (the first information).
  • the observed region can be easily recognized by the mark. Further, by displaying arrows, it is possible to easily understand the observed area and observation direction. Furthermore, by displaying the evaluation value, it is possible to grasp the observation status (photography status) of the area that has been determined to have been observed.
  • in this example, the marks are connected by arrows, but they may instead be connected by line segments. When arrows are used, however, the observation direction (the direction in which the endoscope 10 was moved) can also be confirmed.
  • the shapes of the marks Ma, Mb, . . . are circles, but the shapes of the marks are not particularly limited.
  • FIG. 21 is a diagram showing another example of the mark.
  • FIG. 21 shows an example of displaying marks Ma, Mb, . . . indicating each part using icons.
  • each icon is constructed using a schema diagram of the organ to be observed, with a dot displayed at the position of the part that the icon indicates.
  • the mark Ma indicates the "cardia”
  • the mark Mb indicates the “fornix”
  • the mark Mc indicates the "greater curvature of the body of the stomach”
  • the mark Md indicates the "angle of the stomach”
  • the mark Me indicates the "antrum."
  • the color of the mark is changed when the corresponding part is recognized, but the mode of switching is not limited to this.
  • the shape of the mark may be changed or the mark may be made to blink.
  • the arrow and/or the evaluation value can be configured to be displayed only for a certain period of time.
  • for example, the evaluation values may be erased after time T has elapsed from the start of the display of the arrows and/or the evaluation values, so that only the arrows remain displayed.
  • Time T is a preset time. Note that when the display of the arrow and/or the evaluation value is to be erased after a certain period of time has elapsed, it is preferable that the arrow and/or the evaluation value be able to be displayed again if necessary.
  • in the above example, the parts to be observed are determined in advance, but the parts that can be recognized by the image processing device 100 may instead be indicated by marks.
  • in this case, all parts that can be recognized by the image processing device 100 are displayed (the marks of all recognizable parts are displayed in a predetermined layout).
  • FIG. 22 is a diagram showing an example of highlighted display of evaluation values.
  • FIG. 22 shows an example in which the evaluation value of the observation of the region between the region indicated by the mark Me (antrum) and the region indicated by the mark Md (gastric angle) is less than or equal to the threshold value, while the evaluation value of the observation of the region between the region indicated by the mark Mc (greater curvature of the gastric body) and the region indicated by the mark Me (antrum) exceeds the threshold value.
  • in this case, the evaluation value display frame of the area whose evaluation value is less than or equal to the threshold value is highlighted. In FIG. 22, the display size of that evaluation value display frame is enlarged compared to the normal display (the display used when the threshold value is exceeded), and the frame is displayed in a different color.
  • the form of highlighted display is not limited to this.
  • for example, the evaluation value display frame can be made to blink, or its shape can be changed, to highlight it.
  • the degree of emphasis may also be changed in stages according to the evaluation value. That is, the lower the evaluation value, the stronger the emphasis with which the frame is displayed.
  • the evaluation value is highlighted, but arrows (including the case where they are displayed as line segments) may also be highlighted in the same way.
  • an area where the evaluation value is less than or equal to a threshold can be highlighted by changing the thickness of the arrow line, changing the color of the arrow, or blinking the arrow.
  • the evaluation value can be configured to be displayed only when it is equal to or less than a threshold value. That is, if the calculated evaluation value exceeds the threshold value, it is assumed that the observation is being performed correctly, so only the arrow is displayed without separately displaying the evaluation value. On the other hand, if the calculated evaluation value is less than or equal to the threshold value, there is a high possibility that the observation is not being performed correctly, so the evaluation value is displayed on the screen to prompt the user with a warning. In this case, the arrow may be highlighted to prompt a warning without displaying the evaluation value.
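  • a minimal sketch of this threshold-dependent presentation policy (hypothetical names; the 90% threshold is an arbitrary example, not a value defined by the embodiment):

```python
def segment_presentation(evaluation: float, threshold: float = 90.0) -> dict:
    if evaluation > threshold:
        # Observation assumed correct: draw only the arrow, hide the value.
        return {"arrow": "normal", "show_value": False}
    # Possibly insufficient observation: show the value (or emphasize the
    # arrow instead), grading the emphasis by how far the value falls short.
    emphasis = min(1.0, (threshold - evaluation) / threshold)
    return {"arrow": "highlighted", "show_value": True, "emphasis": emphasis}
```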
  • FIG. 23 is a diagram illustrating an example of highlighting an arrow.
  • FIG. 23 shows an example in which the evaluation value of the observation of the region between the region indicated by the mark Mc (greater curvature of the gastric body) and the region indicated by the mark Me (antrum) is less than or equal to the threshold value, while the evaluation value of the observation of the region between the region indicated by the mark Me (antrum) and the region indicated by the mark Md (gastric angle) exceeds the threshold value.
  • the arrow Ar1 in the area where the evaluation value is less than or equal to the threshold value is highlighted.
  • FIG. 23 shows an example of highlighting the arrow by changing its line thickness and color.
  • the highlighting can also be performed using the marks. For example, in the example illustrated in FIG. 23, the mark Mc and the mark Me at both ends of the area whose evaluation value is less than or equal to the threshold value can be highlighted.
  • the marks Mc and Me can be highlighted by changing their color, size, shape, or blinking.
  • the configuration may be such that, for example, a designation of a region for which the evaluation value is to be recalculated is accepted from the user, and the evaluation value is recalculated.
  • alternatively, the evaluation value may be recalculated automatically, using the detection of a specific operation as a trigger. For example, when an already-recognized part is recognized again, the evaluation value of the area determined to have been recognized is recalculated using that part as a starting point.
  • a process for automatically recalculating an evaluation value when a recognized part is recognized again will be described.
  • FIG. 24 is a flowchart showing the procedure for automatically recalculating the evaluation value when a recognized part is recognized again.
  • here, it is assumed that the parts to be observed are determined in advance and are indicated by marks on the schema diagram displayed in the observation status display window W (see FIGS. 15 to 18).
  • when a part is recognized, the display form of its mark is switched. Further, an arrow is displayed between the mark of the newly recognized part and the mark of the previously recognized part, and an evaluation value is displayed adjacent to the arrow.
  • the processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S11).
  • next, the processor 101 determines whether a still image has been photographed (step S12). When a still image is photographed, part recognition processing is performed on the photographed still image (step S13).
  • the processor 101 then determines whether or not a part has been recognized (step S14). When a part is recognized, the processor 101 determines whether the recognized part is an already-recognized part (step S15). That is, it is determined whether the currently recognized part is a part that has already been recognized before.
  • if the part is not an already-recognized part, the processor 101 performs processing to determine the observation status (step S16). That is, the presence or absence of a previously recognized part is determined, and if there is a previously recognized part, it is determined that the area between the previously recognized part and the currently recognized part has been observed. Thereafter, the processor 101 updates the display of the observation status display window W based on the part recognition result and the observation status determination result (step S17).
  • specifically, the display form of the mark of the recognized part is switched. If there is a previously recognized part, an arrow is further displayed between the mark of the newly recognized part and the mark of the previously recognized part, and an evaluation value is displayed adjacent to the arrow.
  • the processor 101 then determines whether the observation has ended (step S18). Note that when it is determined in step S12 that no still image has been photographed, and when it is determined in step S14 that no part has been recognized, it is likewise determined whether the observation has ended. When it is determined that the observation has ended, the process ends. If it is determined that the observation has not ended, the process returns to the determination of whether a still image has been photographed (step S12).
  • on the other hand, if it is determined in step S15 that the recognized part is an already-recognized part, the processor 101 performs a process of canceling the determination of "recognized" for the area determined to have been recognized, starting from that part (step S19). A sketch of this loop is shown below.
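  • the following minimal Python sketch mirrors the loop of FIG. 24; it is an illustration only, and ui, recognizer, and judge are hypothetical collaborators rather than components defined in this embodiment:

```python
def run_observation(ui, recognizer, judge):
    """Loop corresponding to steps S11 to S19 of FIG. 24 (illustrative only)."""
    ui.show_window()                                   # S11: display window W
    recognized: list = []
    while True:
        still = ui.poll_still_image()                  # S12: still image photographed?
        if still is not None:
            part = recognizer.recognize(still)         # S13: part recognition
            if part is not None:                       # S14: part recognized?
                if part in recognized:                 # S15: already recognized?
                    # S19: cancel "recognized" for the area starting from this
                    # part (the end-point part also becomes unrecognized).
                    recognized = judge.cancel_from(part, recognized)
                else:
                    judge.update(recognized, part)     # S16: judge observation status
                    recognized.append(part)
                ui.refresh()                           # S17: update window W
        if judge.observation_ended():                  # S18: observation ended?
            break
```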
  • FIG. 25 is a conceptual diagram of the process of canceling the "recognized" determination.
  • FIG. 25 shows an example in which the part indicated by the mark Me (antrum) is recognized again. In this case, the area between the mark Mc (greater curvature of the gastric body) and the mark Me (antrum), and the area between the mark Me (antrum) and the mark Md (gastric angle), have each been determined to be recognized areas.
  • when the part of the mark Me (antrum) is recognized again, since it is an already-recognized part, the determination of "recognized" is canceled for the area determined to have been recognized starting from the antrum. That is, the determination that the area between the mark Me (antrum) and the mark Md (gastric angle) has been "recognized" is canceled.
  • at this time, the determination of "recognized" is also canceled for the part that defines the end point of the area. In the example shown in FIG. 25, the determination that the part of the mark Md (gastric angle) has been "recognized" is canceled.
  • after performing the process of canceling the "recognized" determination for the corresponding area, the processor 101 updates the display of the observation status display window W (step S17). In this case, the arrow and the evaluation value are deleted for the corresponding area. In the above example, the arrow and the evaluation value displayed for the area between the mark Me (antrum) and the mark Md (gastric angle) are erased. Further, the display of the mark Md is switched back to the unrecognized form.
  • thereafter, the processor 101 determines whether the observation has ended (step S18). If it is determined that the observation has not ended, it is determined whether a still image has been photographed (step S12). Then, when a still image is photographed and a part is newly recognized from it, the observation status determination process is performed for the newly recognized part (step S16). For example, in the example shown in FIG. 25, if the part of the mark Me (antrum) is recognized again and then the part of the mark Md (gastric angle) is recognized, the area between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is determined to be a recognized area, and an evaluation value is calculated for that area. The display of the observation status display window W is then updated based on the part recognition result and the observation status determination result (step S17).
  • in this way, the evaluation value of an area for which an evaluation value has already been calculated (an area determined to have been observed) can easily be recalculated simply by performing an operation of re-recognizing the part that serves as the starting point of that area.
  • in the above example, when the part of the mark Me (antrum) is recognized and then the part of the mark Md (gastric angle) is recognized, the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is determined to be the observed region.
  • when the part of the mark Me (antrum) is recognized again, the determination that the area between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) has been "observed" is canceled.
  • when the part of the mark Md (gastric angle) is then recognized again, the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is again determined to be the observed region.
  • in this case, the part of the mark Me (antrum) is an example of the first region of interest, and the part of the mark Md (gastric angle) is an example of the second region of interest.
  • the image (still image) in which the part of the mark Me (antrum) is first recognized is an example of the first image, and the image (still image) in which the part of the mark Md (gastric angle) is first recognized is an example of the second image.
  • the image (still image) in which the part of the mark Me (antrum) is recognized for the second time is an example of the third image, and the image (still image) in which the part of the mark Md (gastric angle) is recognized for the second time is an example of the fourth image.
  • Evaluation method: In the above embodiment, the image is evaluated based on its out-of-focus state and its motion-blur state, but the method for evaluating the image is not limited to this. In addition to, or in place of, the out-of-focus state and/or the motion-blur state, a configuration may be adopted in which the image is evaluated from the viewpoint of its brightness (exposure).
  • the criteria for evaluating the observation may also be changed for each area. For example, when observing the stomach, the evaluation criteria for the observation of the area between the greater curvature of the gastric body and the antrum may differ from the evaluation criteria for the observation of the area between the antrum and the gastric angle. Thereby, evaluation criteria suitable for the observation of each area can be set, and each area can be evaluated appropriately.
  • the observation can also be evaluated from the viewpoint of the time or the speed at which an area was observed. For example, the time during which an area was observed can be measured and determined to be OK if it exceeds a preset reference time, and NG if it is less than or equal to the reference time.
  • likewise, the speed at which an area was observed can be measured and determined to be OK if it is less than or equal to a preset reference speed, and NG if it exceeds the reference speed. More preferably, the reference time and the reference speed are set for each area. The OK/NG evaluation based on the time or speed at which an area was observed is another example of the evaluation value.
  • measuring the time or the speed at which an area was observed is substantially the same as measuring the number of images captured in that area. That is, since the time can be calculated from the number of images, and the speed can in turn be calculated from the time, the observation can be evaluated based on the number of images. For example, if the number of images taken in an area exceeds a preset reference number, the observation can be evaluated as OK, and if it is less than or equal to the reference number, as NG. It is more preferable to set the reference number for each area.
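  • a minimal count-based sketch of this evaluation (the frame rate and the reference time are arbitrary example values, not values defined by the embodiment):

```python
FPS = 30.0  # assumed frame rate of the endoscope

def evaluate_by_frame_count(num_frames: int, reference_time_s: float) -> str:
    # The observation time of an area follows from the number of frames
    # captured there, so the frame count can stand in for time (and, given
    # the distance between the two parts, for speed as well).
    observed_time_s = num_frames / FPS
    return "OK" if observed_time_s > reference_time_s else "NG"
```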
  • when the observation is evaluated from a plurality of viewpoints, each evaluation value can be displayed individually. Further, an overall evaluation value that totals the evaluations from the respective viewpoints may be calculated and displayed. For example, the overall evaluation can be set to "OK" when the evaluation values of all items (viewpoints) exceed or meet their standards, and to "NG" when even one evaluation value falls below or fails to meet its standard.
  • Evaluation target: In the embodiment described above, the images of all frames taken between parts are evaluated, but only specific images can be evaluated instead. For example, images extracted at regular frame intervals can be evaluated.
  • Modification example 1 of the evaluation value calculation method: In the above embodiment, the percentage of OK images is calculated as the evaluation value, but the percentage of NG images can also be calculated as the evaluation value.
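  • as a minimal sketch (hypothetical names; it assumes the OK/NG counts for the area are already available), both variants can be computed as follows:

```python
def evaluation_value(num_ok: int, num_total: int, as_ng_ratio: bool = False) -> float:
    """Percentage of OK images or, per modification example 1, of NG images."""
    if num_total == 0:
        return 0.0
    ok_percent = 100.0 * num_ok / num_total
    return 100.0 - ok_percent if as_ng_ratio else ok_percent
```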
  • FIG. 26 is a block diagram of an observation situation determination unit that has a function of calculating an evaluation value by excluding similar images.
  • the observation situation determination unit 113D of this example has the functions of a similar image detection unit 113D4 in addition to the observation area determination unit 113D1, the imaging evaluation unit 113D2, and the evaluation value calculation unit 113D3.
  • the functions of the observation area determination section 113D1, the imaging evaluation section 113D2, and the evaluation value calculation section 113D3 are the same as those of the observation situation determination section 113D of the image processing apparatus 100 of the above embodiment. Therefore, here, the functions of the similar image detection section 113D4 will be explained.
  • the similar image detection unit 113D4 acquires images taken in chronological order by the endoscope 10, and detects images that are similar to each other. Specifically, an image similar to a previously photographed image (a photographed image) is detected.
  • as for the detection method, known techniques can be employed. For example, a method of detecting similar images using image correlation can be adopted.
  • the detection target is a group of images taken in a region determined to have been observed. Therefore, each time a part is recognized, the detection target is reset.
  • the similar image detection unit 113D4 sequentially processes the acquired images and passes only images that are dissimilar to already photographed images to the imaging evaluation unit 113D2. This makes it possible to exclude images similar to already photographed images from the evaluation targets (evaluation value calculation targets).
  • in this way, when calculating the evaluation value, the configuration may be such that images similar to already photographed images are excluded from the evaluation value calculation targets.
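  • a minimal sketch of such filtering, assuming grayscale frames of equal size and a simple zero-mean normalized correlation as the similarity measure (the embodiment only states that known techniques such as image correlation can be used):

```python
import numpy as np

def is_similar(img_a: np.ndarray, img_b: np.ndarray, threshold: float = 0.95) -> bool:
    # Zero-mean normalized correlation between two equally sized grayscale frames.
    a = (img_a - img_a.mean()) / (img_a.std() + 1e-8)
    b = (img_b - img_b.mean()) / (img_b.std() + 1e-8)
    return float((a * b).mean()) > threshold

def frames_to_evaluate(frames: list) -> list:
    kept: list = []
    for f in frames:
        # Only frames dissimilar to every frame already kept are passed on,
        # so near-duplicate frames do not inflate the evaluation value.
        # (The kept list would be reset each time a part is recognized.)
        if all(not is_similar(f, k) for k in kept):
            kept.append(f)
    return kept
```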
  • the evaluation value can also be calculated in real time. In this case, after a part is recognized, the evaluation values are calculated sequentially.
  • FIG. 27 is a conceptual diagram when calculating evaluation values in real time.
  • the symbol IOK indicates an image that has been evaluated as an OK image.
  • the symbol ING indicates an image that has been evaluated as an NG image.
  • FIG. 27 shows a state in which the 20th frame image is taken after the body part is recognized.
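  • a minimal sketch of such a sequential update (hypothetical names; the running value corresponds to what is shown in the evaluation value display area described below):

```python
class RunningEvaluation:
    """Evaluation value updated frame by frame after a part is recognized."""

    def __init__(self):
        self.num_ok = 0
        self.num_total = 0

    def add_frame(self, frame_is_ok: bool) -> float:
        self.num_total += 1
        self.num_ok += int(frame_is_ok)
        return 100.0 * self.num_ok / self.num_total   # value displayed in real time

    def reset(self) -> None:
        # Called, e.g., when the segment is closed by recognizing the next part.
        self.num_ok = self.num_total = 0
```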
  • FIG. 28 is a diagram illustrating an example of displaying evaluation values in real time.
  • FIG. 28 shows an example in which evaluation values calculated in real time are displayed on the observation status display window W.
  • an evaluation value display area SR is set within the observation status display window W, and an evaluation value is displayed in the evaluation value display area SR.
  • the evaluation value display area SR is set as a margin area when displaying the schema diagram Sh.
  • FIG. 28 shows an example in which a rectangular evaluation value display area SR is set in the upper right corner of the observation status display window W.
  • FIG. 28 shows an example in which a region indicated by a mark Me (antrum) is recognized after a region indicated by a mark Mc (greater curvature of the body of the stomach).
  • the region between the region indicated by the mark Mc (greater curvature of the body of the stomach) and the region indicated by the mark Me (antrum) is the observed region.
  • after the region indicated by the mark Me (antrum) is recognized, the evaluation value calculated based on the images taken thereafter is displayed in the evaluation value display area SR in real time.
  • on the other hand, for the area determined to have been observed, an evaluation value is displayed near the arrow indicating that area. The evaluation value displayed at that position is the final (determined) evaluation value.
  • FIG. 29 is a diagram showing an example of the display transition of the observation status display window.
  • FIG. 29 shows an example where the region to be observed is determined in advance.
  • FIG. 29(A) shows an example of the initial display of the observation status display window W.
  • a schema diagram Sh is displayed in the observation status display window W, and marks Ma, Mb, . . . indicating parts to be observed are displayed on the schema diagram Sh.
  • FIG. 29(B) shows an example of the display of the observation status display window W when the region to be observed is recognized for the first time.
  • the display form of the mark Mc of the recognized part changes.
  • FIG. 29(B) shows an example in which the "greater curvature of the body of the stomach" is recognized.
  • the evaluation value calculated based on the image taken after recognition is displayed in the evaluation value display area SR in real time.
  • FIG. 29(B) shows an example in which the evaluation value calculated from the images taken after recognizing the region indicated by the mark Mc (greater curvature of the body of the stomach) is 100[%] at the time of display.
  • FIG. 29(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 29(C), the color (display form) of the mark Me of the newly recognized part changes. FIG. 29(C) shows an example in which the "antrum" is newly recognized.
  • an arrow Ar1 is displayed that connects the mark Mc of the previously recognized part and the mark Me of the newly displayed part. Further, an evaluation value display frame Sc1 is displayed adjacent to the arrow Ar1, and the determined evaluation value is displayed.
  • the example shown in FIG. 29(C) shows that the region of the stomach between the "greater curvature of the body of the stomach" indicated by the mark Mc and the "antrum” indicated by the mark Me has been observed.
  • FIG. 29(C) shows an example in which the evaluation value calculated from the images taken after recognizing the region indicated by the mark Me (antrum) is 98% at the time of display.
  • FIG. 29(D) shows an example of the display of the observation status display window W when a region is further recognized from among the regions to be observed. As shown in FIG. 29(D), the color (display format) of the mark Md of the newly recognized region changes. FIG. 29(D) shows an example where the "angular part of the stomach" is newly recognized.
  • an arrow Ar2 is displayed that connects the previously displayed mark Me and the newly displayed mark Md.
  • an evaluation value display frame Sc2 is displayed adjacent to the arrow Ar2, and the determined evaluation value is displayed.
  • that is, it is shown that the endoscope 10 was moved from the "antrum" indicated by the mark Me toward the "angular region of the stomach" indicated by the mark Md, and that the region of the stomach between them was observed.
  • as shown in FIG. 29(D), when a part is recognized, the evaluation value calculated based on the images taken after the recognition is displayed in the evaluation value display area SR in real time.
  • FIG. 29(D) shows an example in which the evaluation value calculated from the image taken after recognizing the region indicated by the mark Md (angular part of the stomach) is 95% at the time of display.
  • the evaluation value calculated based on the image taken after body part recognition is displayed in the evaluation value display area SR in real time. This allows the user to grasp the observation situation in real time.
  • in the above example, the case where the evaluation value calculated in real time is displayed in the observation status display window W was described, but the display location of the evaluation value calculated in real time is not limited to this. It may be displayed in a location other than the observation status display window W. Also, when it is displayed in the observation status display window W, the display position is not particularly limited. For example, it may be displayed near the mark indicating the part recognized immediately before (at a position where the relationship with that part can be understood).
  • as for resetting the real-time calculation, for example, a method of resetting when the same part as the part recognized immediately before is recognized again can be adopted. In addition, a configuration may be adopted in which the reset can be instructed by operating a foot switch or the like.
  • Image subjected to part recognition processing: In the above embodiment, a part is recognized from a photographed still image, but the image subjected to the part recognition processing is not limited to this.
  • a configuration may be adopted in which body part recognition is performed on an image arbitrarily selected by the user.
  • the image selection may be performed using, for example, a foot switch, and the part may be recognized from the image displayed in the observation image display area A1a at the time the foot switch is pressed.
  • a configuration may be adopted in which images are selected using a switch provided in the operation section of the endoscope 10, a voice input device, or the like.
  • the recognition of the body part can be configured to be performed for all captured images (images of all frames).
  • the processor 101 processes each of the images acquired in chronological order and recognizes the body part shown in each image.
  • by performing the recognition processing on all captured images, parts can be recognized automatically.
  • in this case, when the part recognized from the image changes, it is determined that the region between the two parts has been observed. For example, in the stomach, if the antrum is recognized after the greater curvature of the gastric body is recognized, it is determined that the region between the greater curvature of the gastric body and the antrum has been observed. Thereafter, when the gastric angle is further recognized, it is determined that the region between the antrum and the gastric angle has been observed. In this way, each time the part recognized from the image changes, it is determined that the region between the previously recognized part and the newly recognized part has been observed. A sketch of this transition detection follows.
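  • a minimal sketch of this transition detection over a per-frame recognition stream (hypothetical names and part labels):

```python
def observed_segments(per_frame_parts: list) -> list:
    """per_frame_parts: the part recognized in each frame, or None."""
    segments = []
    last = None
    for part in per_frame_parts:
        if part is None:
            continue                       # frame in which no part was recognized
        if last is not None and part != last:
            segments.append((last, part))  # the region between last and part
        last = part
    return segments

# e.g. observed_segments(["greater_curvature"] * 5 + ["antrum"] * 4 + ["angle"] * 3)
# -> [("greater_curvature", "antrum"), ("antrum", "angle")]
```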
  • although the parts of the organ to be observed can be classified into multiple types, it is not necessarily required that all parts be recognizable. It is only necessary to be able to recognize the specific parts determined as objects to be recognized.
  • as the objects to be recognized (regions of interest), parts that serve as anatomical landmarks, parts that must be observed from the viewpoint of the examination, parts that must be photographed as still images, and the like are selected.
  • characteristic organs or body parts and regions can also be selected as recognition targets.
  • an area selected by the user (for example, an area serving as a landmark, a lesion, an inflamed area, etc.) can also be set as a recognition target. These areas can be set, for example, from images taken in the past.
  • in the part recognition processing, the imaging state may also be evaluated. That is, whether the image to be processed has been captured appropriately may also be evaluated, and the recognition of a part is confirmed only when it is determined that the image has been captured appropriately. Therefore, even if a part can be recognized from an image, if it is determined that the image was not captured appropriately, the part is treated as not having been recognized.
  • whether or not an image has been appropriately photographed is determined from the viewpoints of, for example, out-of-focus blur of the image, motion blur of the image, brightness of the image, quality of the composition, dirt on the image (reflection of dirt on the observation window), and the like.
  • regarding out-of-focus blur and motion blur, an image without either (including cases where they are within an acceptable range) is determined to be an appropriately photographed image.
  • regarding the brightness of an image, an image photographed with appropriate brightness (appropriate exposure) is determined to be an appropriately photographed image.
  • the quality of the composition is determined, for example, from the viewpoint of whether the region to be recognized is photographed with a predetermined composition (for example, a structure placed in the center of the screen).
  • regarding dirt on the image, for example, an image without cloudiness (a clear image) is determined to be an appropriately photographed image.
  • whether or not an image has been appropriately photographed can be determined in a composite manner from multiple viewpoints. In that case, the requirements can be determined for each part. For example, a configuration is possible in which, for all parts, it is determined whether the image was taken appropriately from the viewpoints of out-of-focus blur, motion blur, brightness, and dirt, while for a specific part the quality of the composition is additionally determined.
  • a configuration may be adopted in which a determination device is used to determine whether or not the image has been appropriately photographed.
  • the determiner can be configured with a learned model, for example.
  • when the determination is made from multiple viewpoints, separate determiners are prepared. For example, when determining whether an image has been appropriately photographed from the viewpoints of out-of-focus blur, motion blur, brightness, quality of the composition, dirt, and the like, a determiner for the presence or absence of out-of-focus blur, a determiner for the presence or absence of motion blur, a determiner for whether the brightness is appropriate, a determiner for the quality of the composition, a determiner for the presence or absence of dirt, and so on are prepared separately.
  • the criteria set for determining whether an image has been appropriately captured are an example of the second criteria. Further, the determination criteria defined from the viewpoints of out-of-focus blur, motion blur, brightness, quality of the composition, dirt, and the like are examples of the contents of the second criteria.
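  • a minimal sketch of gating the recognition on such criteria; the check names and the per-part table are hypothetical examples, not requirements defined by the embodiment:

```python
BASE_CHECKS = ("no_defocus", "no_motion_blur", "brightness_ok", "no_smudge")
EXTRA_CHECKS = {"gastric_angle": ("composition_ok",)}   # per-part extra requirements

def recognition_confirmed(part: str, passed_checks: set) -> bool:
    # The recognition of the part is confirmed only when the image also
    # satisfies the "appropriately photographed" criteria (second criteria);
    # otherwise the part is treated as not having been recognized.
    required = BASE_CHECKS + EXTRA_CHECKS.get(part, ())
    return all(check in passed_checks for check in required)
```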
  • the organ to be observed is not particularly limited, and regions other than organs can also be observed. That is, the present invention can be applied generally to observation of the inside of the body using an endoscope.
  • FIG. 30 is a diagram showing an example of the display of the observation status display window when observing the large intestine.
  • FIG. 30 shows an example where the region to be observed is determined in advance.
  • as shown in FIG. 30, a schema diagram Sh of the large intestine to be observed is displayed in the observation status display window W, and predetermined marks MA, MB, ... (circles in the example shown in FIG. 30) are displayed on the schema diagram Sh to indicate the parts of the large intestine to be observed (or the parts at which still images are to be taken).
  • FIG. 30 shows an example where there are four parts to be observed.
  • the four parts shown in FIG. 30 are the "ileocecum", the "hepatic flexure (right colic flexure)", the "splenic flexure (left colic flexure)", and the "anus".
  • the hepatic flexure is the transition between the ascending colon and the transverse colon.
  • the splenic flexure is the transition between the transverse colon and the descending colon.
  • the mark MA indicates the "ileocecum"
  • the mark MB indicates the "hepatic flexure"
  • the mark MC indicates the "splenic flexure"
  • the mark MD indicates the "anus."
  • FIG. 30 shows an example in which the parts of the large intestine are recognized in the order of the part indicated by the mark MA (ileocecum), the part indicated by the mark MB (hepatic flexure), and the part indicated by the mark MC (splenic flexure).
  • in this case, the region between the ileocecum and the hepatic flexure (ascending colon) and the region between the hepatic flexure and the splenic flexure (transverse colon) are determined to be recognized regions.
  • the display form of the mark of the recognized part changes (for example, the color changes).
  • arrows Ar1, Ar2, . . . and evaluation values are displayed in the recognized areas.
  • Arrows Ar1, Ar2, . . . are displayed pointing in the observation direction.
  • the evaluation values are displayed in evaluation value display frames Sc1, Sc2, . . . displayed adjacent to arrows Ar1, Ar2, .
  • the results of the observation status determination process can be recorded in association with information on still images and/or videos taken during the observation, information on lesions detected during the observation, information on the results of classification performed during the observation, and the like.
  • the information to be recorded can include (1) information on the order in which the parts were recognized, (2) when the parts were recognized from still images, information on the still image in which each part was recognized, (3) information on the evaluation value calculated for each area (the area between recognized parts), (4) the imaging time and/or the number of frames in each area, and (5) the overall imaging time and/or the overall number of frames.
  • the total imaging time is the imaging time from recognizing the first part to recognizing the last part.
  • the total number of frames is the total number of frames from recognizing the first part to recognizing the last part.
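  • as an illustration, the recorded items (1) to (5) above could be grouped into a structure such as the following (hypothetical names; the embodiment does not prescribe a data format):

```python
from dataclasses import dataclass, field

@dataclass
class ObservationRecord:
    part_order: list = field(default_factory=list)      # (1) order of recognition
    still_images: dict = field(default_factory=dict)    # (2) part -> still image id
    evaluations: dict = field(default_factory=dict)     # (3) (part_a, part_b) -> value
    frame_counts: dict = field(default_factory=dict)    # (4) (part_a, part_b) -> frames
    total_frames: int = 0                               # (5) first to last recognition
```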
  • the information on the determination result of the observation situation may be configured to be recorded in an external storage device in addition to or in place of the storage device (auxiliary storage device 103) provided in the image processing device 100.
  • for example, the determination result information can be sent to a system that manages endoscopy results (an endoscopy information management system or the like) and recorded in that system or in a data server connected to it (an endoscopy data server or the like).
  • when observation (including examination) is performed with an endoscope, a report showing the observation results is usually created. Therefore, when the observation status determination process is performed during the observation, it is preferable to also write (record) the information on the results of the determination process in the report.
  • report creation is performed using a device that supports report creation (report creation support device).
  • the report creation support device acquires information necessary for creating a report from an image processing device or the like, and the information acquired by the report creation support device can include information on the determination result of the observation situation. Thereby, information on the determination result of the observation situation can be included in the information described (recorded) in the report, and the information can be automatically input.
  • Processors: The functions of the image processing device are realized by various types of processors.
  • the various types of processors include a CPU and/or a GPU (Graphics Processing Unit), which are general-purpose processors that execute programs to function as various processing units; programmable logic devices (PLDs), such as FPGAs (Field Programmable Gate Arrays), whose circuit configuration can be changed after manufacturing; and dedicated electric circuits, such as ASICs (Application Specific Integrated Circuits), which are processors having circuit configurations designed exclusively for executing specific processing.
  • a program is synonymous with software.
  • One processing unit may be composed of one of these various processors, or may be composed of two or more processors of the same type or different types.
  • one processing unit may be configured by a plurality of FPGAs or a combination of a CPU and an FPGA.
  • further, a plurality of processing units may be configured with one processor. As a first example of configuring a plurality of processing units with one processor, there is a form in which one processor is configured by a combination of one or more CPUs and software, as typified by computers used for clients and servers, and this processor functions as the plurality of processing units. As a second example, there is a form in which a processor that implements the functions of an entire system including the plurality of processing units on a single IC (Integrated Circuit) chip is used, as typified by a System on Chip (SoC).

Abstract

Provided is an image processing device and an endoscope system with which it is possible to ascertain an observation status by using an endoscope. This image processing device processes a plurality of images captured in time series by an endoscope, and comprises a processor. The processor acquires a plurality of images. The processor processes the images and recognizes a region-of-interest appearing in the images from among a plurality of regions-of-interest inside a body. When a first region-of-interest among the plurality of regions-of-interest is recognized from a first image among the plurality of images, and a second region-of-interest among the plurality of regions-of-interest is recognized from a second image after the first image in chronological order, the processor causes a display device to display information indicating that a region between the first region-of-interest and the second region-of-interest has been observed.

Description

Image processing device and endoscope system
 The present invention relates to an image processing device and an endoscope system, and more particularly to an image processing device that processes a plurality of images taken in chronological order by an endoscope, and an endoscope system including the image processing device.
 As a technique for supporting observation with an endoscope, a technique is known that automatically recognizes and reports a region of interest such as a lesion from an image taken with the endoscope (for example, Patent Documents 1 and 2). In particular, in recent years, advances in artificial intelligence (AI) technology have made highly accurate image recognition possible.
 International Publication No. 2020/110214
 International Publication No. 2020/152758
 However, no matter how good the image recognition is, lesions or the like existing in locations that have not been photographed cannot be detected.
 One embodiment of the technology of the present disclosure provides an image processing device and an endoscope system with which the observation status with an endoscope can be grasped.
 (1)内視鏡で時系列に撮影された複数の画像を処理する画像処理装置であって、プロセッサを備え、プロセッサは、複数の画像を取得し、画像を処理して、体内の複数の注目領域の中から画像に写っている注目領域を認識し、複数の画像のうちの第1画像から複数の注目領域のうちの第1注目領域が認識され、かつ、第1画像よりも時系列順で後の第2画像から複数の注目領域のうちの第2注目領域が認識された場合に、第1注目領域と第2注目領域との間の領域が観察されたことを示す情報を表示装置に表示させる、画像処理装置。 (1) An image processing device that processes multiple images taken in time series with an endoscope, and includes a processor, and the processor acquires multiple images, processes the images, and processes multiple images inside the body. A region of interest in an image is recognized from among the regions of interest, and the first region of interest of the plurality of regions of interest is recognized from the first image of the plurality of images, and the region of interest is recognized from the first image of the plurality of images, and When a second region of interest among the plurality of regions of interest is recognized from a second image later in the sequence, information indicating that an area between the first region of interest and the second region of interest has been observed is displayed. An image processing device that displays images on the device.
(2) The image processing device according to (1), wherein the processor causes the display device to display, as first information, information on a plurality of specific regions of interest selected from among the plurality of regions of interest inside the body, and causes the display device to display, as second information, the information indicating that the region between the first region of interest and the second region of interest has been observed.
(3) The image processing device according to (2), wherein the second information is composed of information that associates the first region of interest with the second region of interest in the first information.
(4) The image processing device according to (2) or (3), wherein the second information includes information indicating the direction of movement of the observation.
(5) The image processing device according to (3), wherein the first information is composed of information in which markers representing the regions of interest are arranged in a specific layout, and the second information is composed of information that associates the markers with one another.
(6) The image processing device according to (5), wherein the first information includes a schema diagram of the organ to be observed and is composed of information in which a marker is placed at the position of each region of interest on the schema diagram.
(7) The image processing device according to (5) or (6), wherein the second information is composed of line segments connecting the markers.
(8) The image processing device according to (7), wherein the line segments include arrows indicating the direction of movement of the observation.
(9) The image processing device according to any one of (5) to (8), wherein the processor switches the display form of the marker of a recognized region of interest from a first form to a second form.
(10) The image processing device according to any one of (2) to (9), wherein the processor calculates an evaluation value of the observation based on the images between the first image and the second image in chronological order, and the second information includes information on the evaluation value.
(11) The image processing device according to (10), wherein the evaluation value is calculated based on the number of images that satisfy a first criterion and/or the number of images between the first image and the second image.
(12) The image processing device according to (11), wherein the first criterion is set based on at least one of a defocus state of the image, a motion-blur state of the image, and the brightness of the image.
(13) The image processing device according to any one of (10) to (12), wherein the evaluation value is calculated as the proportion of images that satisfy the first criterion or the proportion of images that do not satisfy the first criterion.
(14) The image processing device according to (13), wherein the processor extracts mutually similar images from among the images between the first image and the second image, excludes one of each pair of extracted mutually similar images, and calculates the evaluation value.
(15) The image processing device according to any one of (10) to (14), wherein the processor displays the second information in a form corresponding to the evaluation value.
(16) The image processing device according to any one of (10) to (15), wherein the processor updates the evaluation value and updates the display of the second information when the first region of interest is recognized from a third image that is later than the second image in chronological order and the second region of interest is recognized from a fourth image that is later than the third image in chronological order.
(17) The image processing device according to any one of (1) to (16), wherein the processor causes the display device to display the plurality of images in chronological order, accepts selection of the first image from among the images displayed on the display device, and accepts selection of the second image from among the images displayed on the display device after the first image.
(18) The image processing device according to (17), wherein the processor records the images selected as the first image and the second image as still images.
(19) The image processing device according to any one of (1) to (18), wherein the processor processes each of the plurality of images and recognizes the region of interest appearing in each image.
(20) The image processing device according to any one of (1) to (19), wherein, when the processor recognizes a region of interest from an image, the processor further determines whether the image in which the region of interest has been recognized satisfies a second criterion, and confirms the recognition of the region of interest when the second criterion is satisfied.
(21) The image processing device according to (20), wherein the content of the second criterion is set for each region of interest.
(22) The image processing device according to any one of (2) to (8), wherein the processor ends the display of the second information after a time T has elapsed from the start of the display of the second information.
(23) The image processing device according to (22), wherein the processor records a history of the display of the second information and, when displaying new second information, simultaneously displays second information displayed in the past, based on the history.
(24) The image processing device according to (23), wherein the processor displays the second information displayed in the past, based on the history, when the observation ends and/or when display of the history is instructed.
(25) The image processing device according to any one of (2) to (8), wherein the processor ends the display of the second information when the first region of interest is recognized from a third image that is later than the second image in chronological order.
(26) The image processing device according to any one of (1) to (25), wherein the plurality of regions of interest are a plurality of parts of an organ to be observed.
(27) An endoscope system comprising: an endoscope; a display device; and the image processing device according to any one of (1) to (26), which processes images captured by the endoscope.
According to the present invention, the status of observation with an endoscope can be grasped.
Block diagram showing an example of the system configuration of the endoscope system
Block diagram of the main functions of the processor device
Block diagram showing an example of the hardware configuration of the image processing device
Block diagram of the main functions of the image processing device
Block diagram of the main functions of the image processing unit
Diagram showing an example of the screen display of the display device
Diagram showing an example of the display transition of the observation status display window
Flowchart showing the processing procedure for displaying information indicating the observation status on the display device
Block diagram of the main functions of the observation status determination unit of the image processing device
Conceptual diagram of the determination of the imaged region and the calculation of the evaluation value performed by the observation status determination unit
Diagram showing an example of the display of the observation status display window
Diagram showing another example of the display of information indicating the observation status
Diagram showing an example of the display transition of the observation status display window W
Diagram showing another example of the display of information indicating the observation status
Diagram showing another example of the display of information indicating the observation status
Diagram showing an example of the display of the observation status display window after part recognition
Diagram showing an example of the display transition of the observation status display window W
Diagram showing another example of the display of information indicating the observation status
Diagram showing another example of the display of information indicating the observation status
Diagram showing an example of the display of the observation status display window after part recognition
Diagram showing another example of the mark
Diagram showing an example of highlighting of the evaluation value
Diagram showing an example of highlighting an arrow
Flowchart showing the processing procedure for automatically recalculating the evaluation value when an already recognized part is recognized again
Conceptual diagram of the processing for canceling the "recognized" determination
Block diagram of an observation status determination unit having a function of calculating the evaluation value while excluding similar images
Conceptual diagram of calculating the evaluation value in real time
Diagram showing an example of displaying the evaluation value in real time
Diagram showing an example of the display transition of the observation status display window
Diagram showing an example of the display of the observation status display window when observing the large intestine
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
[First embodiment]
Here, a case will be described as an example in which the present invention is applied to an endoscope system for observing (including observation for the purpose of examination) the upper digestive tract inside the body, particularly the stomach. The stomach is an example of an organ to be observed, in particular a hollow organ.
[Endoscope system configuration]
FIG. 1 is a block diagram showing an example of the system configuration of the endoscope system.
As shown in FIG. 1, the endoscope system 1 of this embodiment includes an endoscope 10, a light source device 20, a processor device 30, an input device 40, a display device 50, an image processing device 100, and the like. The endoscope 10 is connected to the light source device 20 and the processor device 30. The light source device 20, the input device 40, and the image processing device 100 are connected to the processor device 30. The display device 50 is connected to the image processing device 100.
The endoscope system 1 of this embodiment is configured as a system capable of observation using special light (special light observation) in addition to normal white light observation (white light observation). Special light observation includes narrow-band light observation. Narrow-band light observation includes BLI observation (Blue Laser Imaging observation), NBI observation (Narrow Band Imaging observation), LCI observation (Linked Color Imaging observation), and the like. Since special light observation itself is a well-known technique, a detailed explanation thereof will be omitted.
[Endoscope]
The endoscope 10 of this embodiment is an electronic endoscope (flexible endoscope), particularly an electronic endoscope for the upper digestive tract. An electronic endoscope includes an operation section, an insertion section, a connection section, and the like, and photographs a subject using an image sensor built into the tip of the insertion section. As the image sensor, an image sensor having a predetermined color filter array (for example, a Bayer array), such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor, is used. The operation section includes an angle knob, an air/water supply button, a suction button, a mode switching button, a release button, a forceps port, and the like. The mode switching button is a button for switching observation modes. For example, switching is performed between a mode for white light observation, a mode for LCI observation, and a mode for BLI observation. The release button is a button for instructing the capture of a still image. Since the endoscope itself is well known, a detailed explanation thereof will be omitted. The endoscope 10 is connected to the light source device 20 and the processor device 30 via the connection section.
[Light source device]
The light source device 20 generates illumination light to be supplied to the endoscope 10. As described above, the endoscope system 1 of this embodiment is configured as a system capable of special light observation in addition to normal white light observation. For this reason, the light source device 20 has a function of generating light compatible with special light observation (for example, narrow-band light) in addition to normal white light. As mentioned above, since special light observation itself is a well-known technique, a description of the generation of its illumination light will be omitted. The light source type is switched using, for example, the mode switching button provided on the operation section of the endoscope 10.
[Processor device]
The processor device 30 centrally controls the operation of the entire endoscope system. The processor device 30 includes a processor, a main storage device, an auxiliary storage device, an input/output interface, and the like as its hardware configuration.
FIG. 2 is a block diagram of the main functions of the processor device.
As shown in FIG. 2, the processor device 30 has functions such as an endoscope control unit 31, a light source control unit 32, an image processing unit 33, an input control unit 34, and an output control unit 35. Each function is realized by the processor executing a predetermined program. The auxiliary storage device stores various programs executed by the processor and various data necessary for control and the like.
The endoscope control unit 31 controls the endoscope 10. Control of the endoscope 10 includes drive control of the image sensor, control of air and water supply, control of suction, and the like.
The light source control unit 32 controls the light source device 20. Control of the light source device 20 includes light emission control of the light source, switching control of the light source type, and the like.
The image processing unit 33 performs various kinds of signal processing on the signal (image signal) output from the image sensor of the endoscope 10 to generate an image.
The input control unit 34 performs processing for accepting operation inputs from the input device 40 and the operation section of the endoscope 10, as well as inputs of various information.
The output control unit 35 controls the output of information from the processor device 30 to the image processing device 100. The information output from the processor device 30 to the image processing device 100 includes images captured by the endoscope (endoscopic images), information input via the input device 40, various kinds of operation information, and the like. The operation information includes operation information of the input device 40 as well as operation information of the operation section of the endoscope 10. The operation information includes still image capture instructions. As described above, a still image capture instruction is issued using the release button provided on the operation section of the endoscope 10. Alternatively, the still image capture instruction may be issued via a foot switch, a voice input device, a touch panel, or the like. The images captured in chronological order by the endoscope are sequentially output to the image processing device 100.
[Input device]
The input device 40, together with the display device 50, constitutes the user interface of the endoscope system 1. The input device 40 includes, for example, a keyboard, a mouse, a foot switch, and the like. In addition, the input device 40 can be configured to include a touch panel, a voice input device, a line-of-sight input device, and the like.
[Display device]
The display device 50 is used not only to display endoscopic images but also to display various information. The display device 50 is configured with, for example, a liquid crystal display (LCD), an organic electroluminescence display (OELD), or the like. In addition, the display device 50 can also be configured with a projector, a head-mounted display, or the like.
[Image processing device]
The image processing device 100 processes the images output from the processor device 30. Specifically, the image processing device 100 performs processing for sequentially displaying, on the display device 50, the images sequentially output from the processor device 30. The image processing device 100 also sequentially processes the images sequentially output from the processor device 30, and performs processing for detecting a lesion from the images as well as processing for discriminating the detected lesion. The image processing device 100 further performs processing for displaying the detection results and discrimination results of the lesion on the display device 50. In addition, the image processing device 100 performs processing for capturing and recording still images and/or moving images in response to instructions from the user. Furthermore, when a still image is captured, the image processing device 100 performs processing for recognizing the part of the organ shown in the captured still image, and performs processing for displaying, on the display device 50, information indicating the status of observation of the organ by the endoscope 10 based on the recognition result of the part of the organ. In this embodiment, the plurality of parts of the organ that are the recognition targets are an example of the plurality of regions of interest inside the body.
FIG. 3 is a block diagram showing an example of the hardware configuration of the image processing device.
The image processing device 100 is configured as a so-called computer, and its hardware configuration includes a processor 101, a main storage device (main memory) 102, an auxiliary storage device (storage) 103, an input/output interface 104, and the like. The image processing device 100 is connected to the processor device 30 and the display device 50 via the input/output interface 104. The auxiliary storage device 103 is composed of, for example, a hard disk drive (HDD), a flash memory including an SSD (Solid State Drive), or the like. The auxiliary storage device 103 stores the programs executed by the processor 101 and various data necessary for control and the like. Images (still images and moving images) captured with the endoscope are also recorded in the auxiliary storage device 103, as are lesion detection results, discrimination results, part recognition results, observation status determination results, and the like.
FIG. 4 is a block diagram of the main functions of the image processing device.
As shown in FIG. 4, the image processing device 100 has functions such as an image acquisition unit 111, a command acquisition unit 112, an image processing unit 113, a recording control unit 114, and a display control unit 115. The function of each unit is realized by the processor 101 executing a predetermined program (image processing program).
The image acquisition unit 111 performs processing for sequentially acquiring the images sequentially output from the processor device 30. As described above, the processor device 30 sequentially outputs the images captured in chronological order by the endoscope 10. The image acquisition unit 111 therefore acquires the images captured in chronological order by the endoscope 10 in chronological order.
The command acquisition unit 112 acquires command information. The command information includes information on still image capture instructions.
The image processing unit 113 processes the images acquired by the image acquisition unit 111. FIG. 5 is a block diagram of the main functions of the image processing unit. As shown in FIG. 5, the image processing unit 113 of the image processing device 100 of this embodiment has functions such as a lesion detection unit 113A, a discrimination unit 113B, a part recognition unit 113C, and an observation status determination unit 113D.
The lesion detection unit 113A processes an input image and performs processing for detecting a lesion (for example, a polyp) appearing in the image. Lesions include portions that are definitively lesions, portions that may be lesions (such as benign tumors or dysplasia), and portions having features that may be directly or indirectly related to a lesion (such as redness). The lesion detection unit 113A is composed of a trained model that has been trained to detect lesions from images. Since detection of a lesion using a trained model is itself a well-known technique, a detailed explanation thereof will be omitted. As an example, it is configured as a model using a convolutional neural network (CNN). Note that detection of a lesion can include determination of the type of the detected lesion.
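As a rough illustration of this kind of detector, the sketch below assumes a torchvision-style detection model fine-tuned for lesions; the weight file name and the score threshold are hypothetical, and the publication does not specify any particular architecture.

```python
# A sketch, not the publication's implementation: CNN-based lesion
# detection with a torchvision Faster R-CNN (weights are hypothetical).
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(num_classes=2)
# model.load_state_dict(torch.load("lesion_detector.pth"))  # hypothetical weights
model.eval()

def detect_lesions(frame, score_threshold=0.5):
    """frame: float tensor of shape (3, H, W) scaled to [0, 1]."""
    with torch.no_grad():
        output = model([frame])[0]  # dict with "boxes", "labels", "scores"
    keep = output["scores"] >= score_threshold
    # The kept boxes can be drawn as bounding boxes around detected lesions.
    return output["boxes"][keep], output["scores"][keep]
```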
The discrimination unit 113B performs discrimination processing on the lesion detected by the lesion detection unit 113A. As an example, in this embodiment, a lesion such as a polyp detected by the lesion detection unit 113A is discriminated as neoplastic (NEOPLASTIC) or non-neoplastic (HYPERPLASTIC). The discrimination unit 113B is composed of a trained model that has been trained to discriminate lesions from images.
The part recognition unit 113C processes an input image and performs processing for recognizing the part of the organ shown in the image. In this embodiment, since the stomach is the observation target, the part recognition unit 113C performs processing for recognizing parts of the stomach. The parts to be recognized are, for example, parts regarded as anatomical landmarks. As an example, in this embodiment, landmark parts of the stomach such as the cardia, the fornix, the greater curvature of the gastric body, the gastric angle, the antrum, and the pylorus are recognized. If no part can be recognized from the input image, the part recognition unit 113C outputs "unrecognizable" as the recognition result. If the recognized part is a part other than the parts set as recognition targets, the part recognition unit 113C outputs "other".
The part recognition unit 113C is composed of a trained model that has been trained to recognize parts of an organ from images. In this embodiment, it is composed of a trained model that has been trained to recognize parts of the stomach from images. As an example, in this embodiment, the part recognition unit 113C is configured with a CNN.
Note that, as described above, in this embodiment, part recognition processing is performed on captured still images. Therefore, the captured still image is input to the part recognition unit 113C.
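A minimal sketch of this classification step follows, assuming a CNN classifier over the landmark classes listed above plus an "other" class; the class ordering, the confidence threshold, and all names are illustrative assumptions.

```python
# Sketch of the part-recognition step; `classifier` is a hypothetical
# trained CNN returning one logit per class.
import torch

PARTS = ["cardia", "fornix", "greater curvature",
         "gastric angle", "antrum", "pylorus", "other"]

def recognize_part(classifier, still_image, threshold=0.7):
    """Return a landmark name, "other", or "unrecognizable"."""
    with torch.no_grad():
        logits = classifier(still_image.unsqueeze(0))  # shape (1, len(PARTS))
        probs = torch.softmax(logits, dim=1)[0]
    confidence, index = probs.max(dim=0)
    if confidence.item() < threshold:
        return "unrecognizable"   # no part could be recognized from the image
    return PARTS[int(index)]      # may be "other" for non-target parts
```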
The observation status determination unit 113D determines the status of observation of the organ by the endoscope 10 based on the part recognition results of the part recognition unit 113C. Specifically, the observation status is determined by the following procedure. When a still image is captured and the captured part is recognized by the part recognition unit 113C, information on the recognized part (information on the part recognition result) is supplied to the observation status determination unit 113D. The observation status determination unit 113D acquires the information on the recognized part and compares it with the information on the previously recognized part. If the part recognized this time differs from the previously recognized part, the observation status determination unit 113D determines that the region of the organ between the previously recognized part and the currently recognized part has been observed. For example, if the previously recognized part (the part recognized from the image obtained by the previous still-image capture) is the "greater curvature" of the gastric body and the currently recognized part (the part recognized from the image obtained by the current still-image capture) is the "antrum", it is determined that the region between the greater curvature of the gastric body and the antrum has been observed.
In this way, the observation status determination unit 113D determines the observed region of the organ by comparison with the previously recognized part. Since the determination is made by comparison with the previously recognized part, the observation status determination unit 113D retains at least the information on the previously recognized part.
In this embodiment, the previously recognized part is an example of the first region of interest (first part), and the currently recognized part is an example of the second region of interest (second part). The image from which the first region of interest was recognized (the image obtained by the previous still-image capture) is an example of the first image, and the image from which the second region of interest was recognized (the image obtained by the current still-image capture) is an example of the second image.
Note that, when a new still image is captured and part recognition is performed, the part recognized this time (second region of interest) becomes the previously recognized part (first region of interest) in relation to the newly recognized part. In this way, observed regions are determined one after another by comparison with the previous recognition result.
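The state kept by this determination can be sketched as follows (the class and method names are assumptions): only the most recently recognized part needs to be retained, and each new recognition is compared against it.

```python
# Sketch of the comparison performed by the observation status
# determination unit: the current part becomes the "previous" part
# for the next determination.
class ObservationStatusJudge:
    def __init__(self):
        self.previous_part = None  # at least the last recognized part is kept

    def update(self, recognized_part):
        """Return the (first_part, second_part) span judged observed, or None."""
        span = None
        if self.previous_part is not None and recognized_part != self.previous_part:
            span = (self.previous_part, recognized_part)
        self.previous_part = recognized_part
        return span
```

For example, feeding "greater curvature" and then "antrum" into update() would return the span ("greater curvature", "antrum").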
The recording control unit 114 performs processing for capturing a still image and recording it in the auxiliary storage device 103 in response to a still image capture instruction. As the still image, the image displayed on the display device 50 at the time the still image capture instruction is received is recorded. This allows the user to record a desired image as a still image during observation. The recording control unit 114 acquires the image of the frame displayed on the display device 50 in response to the still image capture instruction and records it in the auxiliary storage device 103.
When part recognition processing has been performed in the part recognition unit 113C, the recording control unit 114 also performs processing for recording the recognition result information (information on the recognized part) in the auxiliary storage device 103 in association with the still image. The method of association is not particularly limited. It is sufficient that the recording is in a format from which the correspondence between a still image and the information on the part recognition result based on that still image can be understood. For example, the correspondence between the two may be managed by a separately generated management file. Alternatively, the information on the part recognition result may be recorded as supplementary information (so-called meta information) of the still image.
Further, when the observation status has been determined by the observation status determination unit 113D, the recording control unit 114 performs processing for recording information on the determination result in the auxiliary storage device 103. The determination result information is recorded in chronological order. This makes it possible to record the history of observed regions in chronological order.
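One concrete way to realize the separately generated management file mentioned above would be an append-only log; the JSON layout below is purely an assumption for illustration, not the publication's format.

```python
# Sketch of a chronological recognition/observation log kept alongside
# the recorded still images; the field names are hypothetical.
import json
import time

def record_result(history_path, still_image_path, part, observed_span):
    entry = {
        "image": still_image_path,       # the recorded still image
        "recognized_part": part,         # part recognition result
        "observed_span": observed_span,  # e.g. ["greater curvature", "antrum"]
        "timestamp": time.time(),        # keeps the history in time order
    }
    with open(history_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")  # one entry per line, append-only
```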
The display control unit 115 controls the display of the display device 50. The display control unit 115 performs processing for sequentially displaying, on the display device 50, the images sequentially output from the processor device 30. As described above, the processor device 30 sequentially outputs, in chronological order, the images captured in chronological order by the endoscope 10. Therefore, the images captured in chronological order by the endoscope 10 are displayed on the display device 50 in chronological order. The display control unit 115 also performs processing for displaying the status of observation of the organ by the endoscope 10 on the display device 50.
FIG. 6 is a diagram showing an example of the screen display of the display device. The figure shows an example in which the display device 50 is a so-called wide monitor (a monitor with a horizontally long screen).
As shown in FIG. 6, a main display area A1 and a sub display area A2 are set on the screen 52. The main display area A1 and the sub display area A2 are set by dividing the screen 52 into two in the horizontal direction. The large area on the left side of FIG. 6 is the main display area A1, and the small area on the right side is the sub display area A2.
The main display area A1 is an area that mainly displays images captured by the endoscope 10. An observation image display area A1a is set in the main display area A1, and the image Im captured by the endoscope 10 is displayed in the observation image display area A1a in real time. As an example, the observation image display area A1a is an area shaped like a circle with its top and bottom cut off.
When the lesion detection support function is turned on, the lesion detection result is displayed superimposed on the image Im. The detection result is displayed in a form in which the detected lesion is surrounded by a frame (a so-called bounding box) B. When the lesion detection unit 113A determines the type of a lesion, the information on the determined type of the lesion is displayed instead of, or in addition to, the information indicating the position of the lesion. The information on the type of the lesion is displayed, for example, near the detected lesion. The information on the type of the lesion can also be displayed in the sub display area A2.
In the main display area A1, a strip-shaped information display area A1b is set below the observation image display area A1a. The information display area A1b is used to display various information. For example, when the discrimination support function is turned on, the discrimination result is displayed in the information display area A1b. FIG. 6 shows an example in which the discrimination result is "NEOPLASTIC".
The sub display area A2 is an area that displays various information related to the observation. For example, information Ip on the subject (patient) and still images Is captured during observation are displayed. The still images Is are displayed, for example, in the order in which they were captured, from the top of the screen 52 downward. If still images are captured beyond the number that can be displayed, the still images Is are deleted in order from the oldest and the display is switched. Alternatively, the display size of each image is reduced so that all the images are displayed.
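The thumbnail behavior described above (dropping the oldest still image once the displayable count is exceeded) amounts to a fixed-capacity queue; the sketch below is an illustration only, and the capacity value is an assumption.

```python
# Sketch of the still-image strip in the sub display area: a bounded
# queue that discards the oldest thumbnail when full.
from collections import deque

class ThumbnailStrip:
    def __init__(self, capacity=6):          # capacity is an assumed value
        self.thumbnails = deque(maxlen=capacity)

    def add(self, still_image):
        self.thumbnails.append(still_image)  # oldest entry is dropped if full

    def visible(self):
        return list(self.thumbnails)         # displayed top to bottom, oldest first
```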
When the observation status determination function is turned on, information indicating the observation status is displayed in the sub display area A2. As shown in FIG. 6, the information indicating the observation status is displayed in a predetermined observation status display window W. The observation status display window W is composed of a rectangular display area and is displayed at a predetermined position in the sub display area A2. FIG. 6 shows an example in which the observation status display window W is displayed near the lower end of the sub display area A2. In the observation status display window W, information on the parts recognized by the part recognition unit 113C is displayed as information indicating the observation status. The information on the recognized parts is displayed in prescribed part display frames Fl1, Fl2, and so on. A part display frame Fl1, Fl2, ... is added each time a new part is recognized.
FIG. 7 is a diagram showing an example of the display transition of the observation status display window.
FIG. 7(A) shows an example of the initial display of the observation status display window W, that is, the display of the observation status display window W when no part has yet been recognized since the start of observation (when no still image has been captured). In this case, the observation status display window W is blank; nothing is displayed.
FIG. 7(B) shows an example of the display of the observation status display window W when a part is recognized for the first time, that is, when a still image is captured for the first time and a part is recognized from the captured image. As shown in the figure, a part display frame Fl1 is displayed at a predetermined position (near the upper end) in the window, and the information on the recognized part is displayed in the part display frame Fl1. FIG. 7(B) shows an example in which the "greater curvature" of the gastric body is recognized.
FIG. 7(C) shows an example of the display of the observation status display window W when a new part is recognized, that is, when a further still image is captured and a part is recognized from the captured image. As shown in the figure, a part display frame Fl2 is newly added below the previously displayed part display frame Fl1, and the information on the newly recognized part is displayed in the newly added part display frame Fl2. FIG. 7(C) shows an example in which the "antrum" is newly recognized.
As shown in FIG. 7(C), when the new part is recognized, the previously displayed part display frame Fl1 and the newly added part display frame Fl2 are connected by an arrow Ar1. The arrow Ar1 is displayed in the direction from the previously displayed part display frame Fl1 toward the newly added part display frame Fl2. This arrow Ar1 indicates that the endoscope 10 has moved from the part displayed in the part display frame Fl1 toward the part displayed in the part display frame Fl2. That is, it indicates that the region between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2 has been observed, and that observation proceeded from the part displayed in the part display frame Fl1 toward the part displayed in the part display frame Fl2. In other words, the direction of movement of the observation is indicated. In the example shown in FIG. 7(C), the endoscope 10 has moved from the "greater curvature" toward the "antrum", and the region of the stomach between them has been observed.
FIG. 7(D) shows an example of the display of the observation status display window W when a further part is recognized, that is, when yet another still image is captured and a part is recognized from the captured image. As shown in the figure, a part display frame Fl3 is newly added below the previously displayed part display frame Fl2, and the information on the newly recognized part is displayed in the newly added part display frame Fl3. FIG. 7(D) shows an example in which the "gastric angle" is newly recognized.
As shown in FIG. 7(D), when the new part is recognized, the previously displayed part display frame Fl2 and the newly added part display frame Fl3 are connected by an arrow Ar2. The arrow Ar2 is displayed in the direction from the previously displayed part display frame Fl2 toward the newly added part display frame Fl3. This arrow Ar2 indicates that the endoscope 10 has moved from the part displayed in the part display frame Fl2 toward the part displayed in the part display frame Fl3. That is, it indicates that the region between the part displayed in the part display frame Fl2 and the part displayed in the part display frame Fl3 has been observed, and that observation proceeded from the part displayed in the part display frame Fl2 toward the part displayed in the part display frame Fl3. In other words, the direction of movement of the observation is indicated. In the example shown in FIG. 7(D), the endoscope 10 has moved from the "antrum" toward the "gastric angle", and the region of the stomach between them has been observed.
In this way, each time a new part is recognized, a part display frame Fl1, Fl2, ... is additionally displayed in the observation status display window W, and the information on the newly recognized part is displayed in the newly added part display frame. In addition, the previously displayed part display frame and the newly added part display frame are connected by an arrow. This makes it possible to grasp the observed parts, the observed regions, and the observation direction from the display of the observation status display window W.
In the examples shown in FIGS. 6 and 7, up to four part display frames can be displayed in the observation status display window W. When more than four parts are recognized, the display size of the observation status display window W is enlarged; more specifically, it is extended upward. This allows information on newly recognized parts to be displayed one after another. Alternatively, without changing the size of the observation status display window W, only the information on the most recently recognized parts may be displayed. That is, the number n of displayed part display frames may be kept unchanged and only the information on the n most recently recognized parts may be displayed. It is also possible to adopt a configuration in which the display range can be changed by scrolling without changing the size of the observation status display window W. A sketch of this window behavior is given below.
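The following is a minimal sketch of the window state behind FIGS. 6 and 7; rendering is omitted, printing stands in for drawing, and all names are assumptions. Each recognized part appends a frame, and an arrow is implied from the previous frame to the new one.

```python
# Sketch of the observation status display window W: part display frames
# in recognition order, connected by arrows.
class ObservationWindow:
    def __init__(self, max_frames=4):
        self.frames = []              # part display frames Fl1, Fl2, ...
        self.max_frames = max_frames  # frames visible without enlarging

    def add_part(self, part_name):
        if self.frames:
            # Arrow from the previous frame to the new one shows the
            # direction in which observation moved.
            print(f"{self.frames[-1]} -> {part_name}")
        self.frames.append(part_name)

    def visible_frames(self):
        # One variation noted above: show only the most recent n parts.
        return self.frames[-self.max_frames:]
```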
As shown in FIG. 6, the observation status display window W is displayed in the sub display area A2 with priority over other displays. That is, when its display overlaps other information, it is displayed on top.
In this embodiment, the arrows Ar1, Ar2, ... connecting the part display frames Fl1, Fl2, ... are an example of the information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed.
[Operation of endoscope system]
As a basic operation of the endoscope system 1, the image processing device 100 causes the display device 50 to display the images captured by the endoscope 10 in real time.
In addition to this basic operation, the image processing device 100 causes the display device 50 to display various types of support information. In this embodiment, information on the detection result of a lesion, information on the discrimination result of the lesion, and information indicating the observation status are displayed as the support information. Each piece of support information is displayed when the corresponding function is turned on. For example, when the lesion detection support function is turned on, the detected lesion is displayed surrounded by a frame as the lesion detection result. When the discrimination support function is turned on, the discrimination result is displayed in the discrimination result display area A3. Furthermore, when the observation status display function is turned on, the observation status display window W is displayed on the screen 52 as information indicating the observation status.
Further, the image processing device 100 records the image of the frame being displayed on the display device 50 as a still image in response to a still image capture instruction from the user. When the lesion detection support function is turned on, information on the lesion detected from the captured still image is recorded in association with the captured still image. When the discrimination support function is turned on, information on the discrimination result of that lesion is recorded in association with the captured still image. Furthermore, when the observation status display function is turned on, information on the recognized part and information on the observation status determination result are recorded in association with the captured still image.
The various support functions are turned on and off, for example, on a settings screen. Alternatively, this may be done using the operation section of the processor device 30 or the operation section of the endoscope 10.
[Display of information indicating observation status]
FIG. 8 is a flowchart showing the processing procedure for displaying information indicating the observation status on the display device.
 上記のように、観察状況を示す情報は、観察状況の表示機能がONされている場合に表示装置50に表示される。 As described above, information indicating the observation situation is displayed on the display device 50 when the observation situation display function is turned on.
When the observation status display function is turned ON, the processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S1). The observation status display window W is displayed at a predetermined position (sub display area A2) on the screen 52 (see FIG. 6). The initial display of the observation status display window W is blank (see FIG. 7(A)).
When observation starts, the processor 101 determines whether a still image has been captured (step S2). When a still image is captured, part recognition processing is performed on the captured still image (step S3). That is, processing is performed to recognize the part of the organ (in this embodiment, a part of the stomach) shown in the captured still image.
After the recognition processing, the processor 101 determines whether a part has been recognized (step S4). When a part has been recognized, the processor 101 performs processing to determine the observation status (step S5). That is, it determines whether there is a previously recognized part, and if there is, it determines that the region between the previously recognized part and the currently recognized part has been observed.
Thereafter, the processor 101 updates the display of the observation status display window W based on the part recognition result and the observation status determination result (step S6). When a part is recognized for the first time, a part display frame is displayed in the observation status display window W, and information on the recognized part is displayed in that part display frame (see FIG. 7(B)). When a previously recognized part exists, a new part display frame is added to the observation status display window W, and information on the newly recognized part is displayed in the added frame (see FIGS. 7(C) and 7(D)). In addition, the part display frame of the previously recognized part and that of the currently recognized part are connected by an arrow, displaying information indicating that the region of the organ between the two parts has been observed (see FIGS. 7(C) and 7(D)).
After updating the display of the observation status display window W, the processor 101 determines whether the observation has ended (step S7). The same determination is made when it is determined in step S2 that no still image has been captured, and when it is determined in step S4 that no part has been recognized. When it is determined that the observation has ended, the processing ends; otherwise, the processing returns to the determination of whether a still image has been captured (step S2).
Note that the processing also ends when the observation status display function is turned OFF. In this case, the observation status display window W is erased from the screen 52.
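As a non-limiting illustration, the flow of steps S1 to S7 in FIG. 8 can be sketched in Python as follows. The window class and the recognizer stub here are hypothetical stand-ins for the components described above, not the actual implementation.

class ObservationWindow:
    """Stand-in for the observation status display window W (step S1: starts blank)."""
    def __init__(self):
        self.parts = []    # part display frames Fl1, Fl2, ...
        self.arrows = []   # arrows between consecutively recognized parts

    def add_part(self, part):
        if self.parts:
            # step S6: connect the previously recognized part to the new one
            self.arrows.append((self.parts[-1], part))
        self.parts.append(part)

def recognize_part(still_image):
    """Hypothetical recognizer; the device uses an image recognition model (step S3)."""
    return still_image.get("part")  # None when no part is recognized (step S4)

def observation_loop(events):
    window = ObservationWindow()                    # step S1
    for still_image, observation_ended in events:
        if still_image is not None:                 # step S2: still image captured?
            part = recognize_part(still_image)      # step S3
            if part is not None:                    # steps S4 to S6
                window.add_part(part)
        if observation_ended:                       # step S7
            break
    return window

w = observation_loop([({"part": "greater curvature"}, False),
                      ({"part": "antrum"}, False),
                      ({"part": "gastric angle"}, True)])
print(w.parts)   # ['greater curvature', 'antrum', 'gastric angle']
print(w.arrows)  # [('greater curvature', 'antrum'), ('antrum', 'gastric angle')]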
As described above, according to the endoscope system 1 and the image processing device 100 of the present embodiment, each time a still image is captured, the part of the organ shown in the captured still image is recognized, and information on the recognized part is displayed on the display device 50. This makes it easy to grasp which parts have been observed.
In addition, the information on the previously recognized part and the information on the newly recognized part are displayed connected by an arrow. This makes it easy to grasp the observed region of the organ under observation.
[Second embodiment]
According to the endoscope system 1 of the first embodiment described above, the user can grasp the observed region from the display of the observation status display window W. However, the user cannot judge whether the region determined to have been observed was observed (imaged) correctly. If the observation is not performed correctly, lesions may not be detected accurately when they are automatically detected from the images. It is therefore more preferable to be able not only to grasp the observed region but also to confirm whether the region determined to have been observed was observed correctly.
In this embodiment, an endoscope system having a function of evaluating the observation state will be described. The basic configuration of the system is substantially the same as that of the endoscope system 1 of the first embodiment. Therefore, only the main differences from the endoscope system 1 of the first embodiment, in particular the differences in the image processing device 100, will be described below.
[Image processing device]
The image processing device 100 of the present embodiment further has a function of evaluating the observation of an organ with the endoscope 10. The evaluation of the observation is performed in the observation status determination unit 113D.
FIG. 9 is a block diagram of the main functions of the observation status determination unit of the image processing device.
As shown in FIG. 9, the observation status determination unit 113D has the functions of an observation region determination unit 113D1, an imaging evaluation unit 113D2, and an evaluation value calculation unit 113D3. These functions are realized by the processor 101 executing a predetermined program.
The observation region determination unit 113D1 determines the observed region (observation region) based on the part recognition result of the part recognition unit 113C. Specifically, the observation region is determined by the following procedure. When a still image is captured and the imaged part is recognized by the part recognition unit 113C, information on the recognized part (information on the part recognition result) is supplied to the observation region determination unit 113D1. The observation region determination unit 113D1 acquires the information on the recognized part and compares it with the information on the previously recognized part. If the currently recognized part differs from the previously recognized part, the observation region determination unit 113D1 determines that the region of the organ between the previously recognized part and the currently recognized part is the observed region.
The imaging evaluation unit 113D2 evaluates the images captured by the endoscope 10, in particular from the viewpoint of image recognition. That is, as described above, in the endoscope system 1 of the present embodiment, lesions are automatically detected by image recognition, so the captured images are evaluated from the viewpoint of performing image recognition. As an example, in the present embodiment, an image is evaluated based on its blur (defocus) and shake (motion blur), which can be evaluated, for example, as the sharpness of the image. Sharpness is one index representing the clarity of an image. The imaging evaluation unit 113D2 calculates the sharpness of each image, determines an image whose calculated sharpness is at or below a threshold to be an NG image (a blurred or shaken image), and determines an image whose sharpness exceeds the threshold to be an OK image (a clear image). A known method can be used to calculate the sharpness. Alternatively, any known method for quantitatively evaluating the blur and/or shake of an image can be adopted. In the present embodiment, the sharpness threshold is an example of the first criterion.
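As a non-limiting illustration, one known sharpness measure that could serve here is the variance of the Laplacian; the measure and the threshold value in the following Python sketch are illustrative assumptions, since the text only requires that a known method be used.

import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a 4-neighbour Laplacian of a grayscale image (illustrative measure)."""
    g = gray.astype(np.float64)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def is_ok_image(gray: np.ndarray, threshold: float = 100.0) -> bool:
    """True for an OK (clear) image, False for an NG (blurred or shaken) image.
    The threshold plays the role of the first criterion; its value would be
    tuned on actual endoscope footage rather than fixed at 100.0."""
    return sharpness(gray) > threshold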
The evaluation value calculation unit 113D3 calculates an observation evaluation value (score) based on the image evaluation results. The observation evaluation value is an index that quantitatively indicates how correctly the region determined to have been observed by the observation region determination unit 113D1 was actually observed. As an example, in the present embodiment, it is calculated as the proportion of OK images captured in the region determined to have been observed. Since a moving image is captured in the region determined to have been observed, the evaluation value is calculated as the proportion of OK images among the frames constituting that moving image. For example, suppose the moving image captured in the region determined to have been observed has 100 frames in total, of which 82 frames are determined to be OK images. In this case, the evaluation value is 82 [%].
The evaluation value may be calculated sequentially, or may be calculated collectively after the observed region is determined. In the case of sequential calculation, the evaluation value is updated each time an evaluation result of a captured image is obtained.
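As a non-limiting illustration, the evaluation value can be computed as in the following Python sketch; the per-frame check is passed in as a predicate (for example, the is_ok_image() sketched above).

def observation_score(frames, is_ok) -> float:
    """Percentage of OK frames among the frames captured in the observed section."""
    frames = list(frames)
    if not frames:
        return 0.0
    return 100.0 * sum(1 for f in frames if is_ok(f)) / len(frames)

# With 100 frames of which 82 pass the check, the score is 82.0 [%],
# matching the worked example above.
print(observation_score(range(100), lambda i: i < 82))  # 82.0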
FIG. 10 is a conceptual diagram of the determination of the observed region and the calculation of the evaluation value performed by the observation status determination unit.
In FIG. 10, reference numeral I denotes the images captured in chronological order by the endoscope 10. These images correspond to the frames of the moving image captured by the endoscope 10.
As described above, the observed region is determined based on the parts recognized from the captured still images. That is, the region between the part (first region of interest) recognized from the still image captured first (first image) and the part (second region of interest) recognized from the still image captured thereafter (second image) is determined to be the observed region.
Accordingly, the imaging evaluation is performed based on the images captured between the still image captured first (first image) and the still image captured thereafter (second image). The imaging evaluation unit 113D2 evaluates each of the images captured between the first image and the second image, and the evaluation value calculation unit 113D3 calculates the evaluation value as the proportion of OK images among those images.
FIG. 11 is a diagram showing an example of the display of the observation status display window. FIGS. 11(A) to 11(C) show changes in the display of the observation status display window W over time.
FIG. 11(A) shows an example of the display of the observation status display window W when a part is recognized for the first time. As shown in FIG. 11(A), a part display frame Fl1 is displayed at a predetermined position (near the top edge) in the window, and information on the recognized part is displayed in the part display frame Fl1. FIG. 11(A) shows an example in which the greater curvature of the gastric body is recognized.
FIG. 11(B) shows an example of the display of the observation status display window W when a new part is recognized. As shown in FIG. 11(B), a part display frame Fl2 is newly added below the previously displayed part display frame Fl1, and information on the newly recognized part is displayed in the newly added part display frame Fl2. Also, as shown in FIG. 11(B), the previously displayed part display frame Fl1 and the newly added part display frame Fl2 are connected by an arrow Ar1. Furthermore, as shown in FIG. 11(B), an evaluation value display frame Sc1 is newly displayed between the part display frames Fl1 and Fl2, and the evaluation value is displayed in the evaluation value display frame Sc1. This evaluation value is the one calculated for the observation performed between the part displayed in the part display frame Fl1 and the part displayed in the part display frame Fl2. FIG. 11(B) shows an example in which the evaluation value calculated for the observation performed between the greater curvature of the gastric body and the antrum is 82 [%]. The evaluation value display frame Sc1 is displayed adjacent to the arrow Ar1.
FIG. 11(C) shows an example of the display of the observation status display window W when a further part is recognized. As shown in FIG. 11(C), a part display frame Fl3 is newly added below the previously displayed part display frame Fl2, and information on the newly recognized part is displayed in the newly added part display frame Fl3. Also, as shown in FIG. 11(C), when the new part is recognized, the previously displayed part display frame Fl2 and the newly added part display frame Fl3 are connected by an arrow Ar2. Furthermore, as shown in FIG. 11(C), an evaluation value display frame Sc2 is newly displayed between the part display frames Fl2 and Fl3, and the evaluation value is displayed in the evaluation value display frame Sc2. This evaluation value is the one calculated for the observation performed between the part displayed in the part display frame Fl2 and the part displayed in the part display frame Fl3. FIG. 11(C) shows an example in which the evaluation value calculated for the observation performed between the antrum and the gastric angle is 99 [%].
In this way, the evaluation value display frames Sc1, Sc2, ... are displayed between the part display frames Fl1, Fl2, ... displayed in the observation status display window W, and the observation evaluation values calculated between the parts are displayed in those frames. This makes it easy to confirm the observation state (whether the imaging was performed correctly) between the parts displayed in the part display frames Fl1, Fl2, ...
In the present embodiment, the evaluation values displayed in the evaluation value display frames Sc1, Sc2, ... are another example of information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed. Accordingly, a configuration in which only the evaluation values are displayed, without displaying the arrows Ar1, Ar2, ..., is also possible.
As described above, according to the endoscope system 1 and the image processing device 100 of the present embodiment, an evaluation of the observation (imaging) is presented for the region determined to have been observed. This makes it easy to judge whether the region determined to have been observed was observed (imaged) correctly.
[Modified examples]
[Modified examples of the display of information indicating the observation status]
[1] Display modification example (1)
FIG. 12 is a diagram showing another example of the display of information indicating the observation status.
FIG. 12 shows an example in which the observation status is shown using a schema diagram of the organ under observation.
As shown in FIG. 12, a schema diagram (pictorial representation of the organ) Sh of the organ under observation is displayed in the observation status display window W. FIG. 12 shows an example in which the stomach is the observation target; in this case, a schema diagram of the stomach is displayed.
When a still image is captured and a part is recognized from the captured still image, predetermined marks M1, M2, ... are displayed on the schema diagram Sh at the positions corresponding to the recognized parts. For example, when the cardia of the stomach is recognized, a mark is displayed at the position of the cardia on the schema diagram. The observed parts can thus be recognized from the marks M1, M2, ... FIG. 12 shows an example in which circular marks M1, M2, ... are displayed.
When the second and subsequent marks M2, M3, ... are displayed, arrows Ar1, Ar2, ... connecting the marks in the order in which they were displayed are shown on the schema diagram Sh, each pointing from the mark displayed earlier toward the mark displayed later. For example, when the marks are displayed in the order of the marks M1, M2, and M3, an arrow Ar1 connecting the marks M1 and M2 is displayed pointing from the mark M1 toward the mark M2, and an arrow Ar2 connecting the marks M2 and M3 is displayed pointing from the mark M2 toward the mark M3.
Displaying the arrows Ar1, Ar2, ... indicates that the regions between the parts of the marks M1, M2, ... connected by the arrows have been observed. In this example, therefore, the arrows Ar1, Ar2, ... are an example of information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed.
Note that FIG. 12 shows an example in which the greater curvature of the gastric body, the antrum, and the gastric angle are recognized in that order. The mark M1 is displayed at the position indicating the greater curvature of the gastric body on the schema diagram, the mark M2 at the position indicating the antrum, and the mark M3 at the position indicating the gastric angle.
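As a non-limiting illustration, the marker and arrow bookkeeping of FIG. 12 can be sketched in Python as follows; the coordinate table mapping parts to positions on the schema diagram is a hypothetical assumption.

SCHEMA_POS = {  # illustrative (x, y) positions on the schema image
    "greater curvature": (120, 180),
    "antrum": (200, 220),
    "gastric angle": (170, 160),
}

marks = []   # marks M1, M2, ... in recognition order
arrows = []  # arrows Ar1, Ar2, ... between consecutive marks

def on_part_recognized(part: str) -> None:
    pos = SCHEMA_POS[part]
    if marks:
        arrows.append((marks[-1], pos))  # from the earlier mark toward the new one
    marks.append(pos)

for p in ("greater curvature", "antrum", "gastric angle"):
    on_part_recognized(p)
print(arrows)  # [((120, 180), (200, 220)), ((200, 220), (170, 160))]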
FIG. 13 is a diagram showing an example of the transition of the display of the observation status display window W.
FIG. 13(A) shows an example of the initial display of the observation status display window W. In this case, only the schema diagram Sh is displayed in the observation status display window W.
FIG. 13(B) shows an example of the display of the observation status display window W when a part is recognized for the first time. As shown in FIG. 13(B), the mark M1 is displayed at the position corresponding to the recognized part. FIG. 13(B) shows an example in which the greater curvature of the gastric body is recognized. In this case, the mark M1 is displayed at the position corresponding to the greater curvature of the gastric body on the schema diagram Sh.
FIG. 13(C) shows an example of the display of the observation status display window W when a new part is recognized. As shown in FIG. 13(C), a new mark M2 is displayed at the position corresponding to the newly recognized part. FIG. 13(C) shows an example in which the antrum is newly recognized. In this case, the mark M2 is displayed at the position corresponding to the antrum on the schema diagram Sh.
Also, as shown in FIG. 13(C), the previously displayed mark M1 and the newly displayed mark M2 are connected by an arrow Ar1 pointing from the mark M1 toward the mark M2. This arrow Ar1 indicates that the endoscope 10 has moved from the part indicated by the mark M1 toward the part indicated by the mark M2, that is, that the region between the part indicated by the mark M1 and the part indicated by the mark M2 has been observed. In the example shown in FIG. 13(C), the endoscope 10 has moved from the greater curvature of the gastric body indicated by the mark M1 toward the antrum indicated by the mark M2, and the region of the stomach between them has been observed.
FIG. 13(D) shows an example of the display of the observation status display window W when a further part is recognized. As shown in FIG. 13(D), a new mark M3 is displayed at the position corresponding to the newly recognized part. FIG. 13(D) shows an example in which the gastric angle is newly recognized. In this case, the mark M3 is displayed at the position corresponding to the gastric angle on the schema diagram Sh.
Also, as shown in FIG. 13(D), the previously displayed mark M2 and the newly displayed mark M3 are connected by an arrow Ar2 pointing from the mark M2 toward the mark M3. This arrow Ar2 indicates that the endoscope 10 has moved from the part indicated by the mark M2 toward the part indicated by the mark M3, that is, that the region between the part indicated by the mark M2 and the part indicated by the mark M3 has been observed. In the example shown in FIG. 13(D), the endoscope 10 has moved from the antrum indicated by the mark M2 toward the gastric angle indicated by the mark M3, and the region of the stomach between them has been observed.
In this way, each time a part is recognized, the marks M1, M2, ... are displayed on the schema diagram Sh corresponding to the recognized parts, which makes it easy to grasp the observed parts. Also, each time the marks M1, M2, ... are displayed, the arrows Ar1, Ar2, ... connecting the earlier mark to the later mark are displayed, which makes it easy to grasp the observed region and the observation direction.
In this example, the marks are connected by arrows, but they may be connected by line segments. Connecting them with arrows allows the observation direction (the direction in which the endoscope 10 was moved) to be confirmed.
The arrows (or line segments) Ar1, Ar2, ... may also be displayed only for a fixed period. For example, an arrow may be displayed at the same time as the mark of a newly recognized part, and only the arrow may be erased after a time T has elapsed from the start of its display. The time T is a preset time.
When the arrows Ar1, Ar2, ... are erased after a fixed period, it is preferable that they can be redisplayed as necessary, that is, that the arrows Ar1, Ar2, ... displayed in the past can be displayed again. For example, they may be redisplayed in response to a redisplay (history display) instruction from the user, or automatically redisplayed after the observation ends. The end of the observation may be determined automatically as well as by a user instruction. For example, the end of the observation can be determined when a specific part is recognized from an image, such as when the same part as the first recognized part is recognized again. Alternatively, the end of the observation can be determined when a specific motion is detected, such as the motion of withdrawing the endoscope from the organ under observation. This motion can be detected, for example, by detecting a specific part or organ from the images.
When the function of redisplaying the arrows is provided, the processor 101 records the arrow display history in the main storage device 102 or the auxiliary storage device 103, and refers to the recorded information when redisplaying the arrows.
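As a non-limiting illustration, the timed display and the recorded history can be sketched in Python as follows; the in-memory list stands in for the record kept in the main storage device 102 or the auxiliary storage device 103.

import time

T = 5.0               # preset display time in seconds (illustrative value)
arrow_history = []    # display history recorded for later redisplay

def show_arrow(arrow, now=None):
    arrow_history.append({"arrow": arrow,
                          "shown_at": time.time() if now is None else now})

def visible_arrows(now=None):
    """Arrows still within their display time T."""
    t = time.time() if now is None else now
    return [e["arrow"] for e in arrow_history if t - e["shown_at"] < T]

def redisplay_all():
    """Redisplay all past arrows from the recorded history (e.g. on user request)."""
    return [e["arrow"] for e in arrow_history]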
In this example, the marks M1, M2, ... are circular, but the shape and the like of the marks are not particularly limited.
[2] Display modification example (2)
FIG. 14 is a diagram showing another example of the display of information indicating the observation status. FIG. 14 shows an example in which evaluation values are additionally displayed in the display form shown in FIG. 13.
As shown in FIG. 14, the evaluation value calculated for each region is displayed adjacent to the arrow Ar1, Ar2, ... indicating that region. More specifically, the evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed regions, and the evaluation values are displayed in those frames.
The example shown in FIG. 14 is a case where the evaluation value of the observation of the region of the stomach between the part indicated by the mark M1 (the greater curvature of the gastric body) and the part indicated by the mark M2 (the antrum) is 82 [%], and that of the region between the part indicated by the mark M2 (the antrum) and the part indicated by the mark M3 (the gastric angle) is 99 [%].
In this example, the evaluation values are displayed together with the arrows Ar1, Ar2, ..., but only the evaluation values may be displayed. In that case, for example, the evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the parts, and the evaluation values are displayed in those frames.
The arrows and/or the evaluation values may also be displayed only for a fixed period. For example, only the arrows may be erased after a time T has elapsed from the start of the display of the arrows and/or the evaluation values. The time T is a preset time.
When the display of the arrows and/or the evaluation values is erased after a fixed period, it is preferable that they can be redisplayed as necessary.
[3] Display modification example (3)
FIG. 15 is a diagram showing another example of the display of information indicating the observation status. FIG. 15 shows an example of the display (initial display) of the observation status display window W before part recognition.
This example is a display form that is particularly advantageous when the parts to be observed (or the parts of which still images are to be captured) are determined in advance.
As shown in FIG. 15, the parts to be observed (or the parts of which still images are to be captured) are displayed in advance on the schema diagram Sh using predetermined marks Ma, Mb, ... FIG. 15 shows an example in which circles are used as the marks Ma, Mb, ...; the shape and color of the marks are not particularly limited. Each of the marks Ma, Mb, ... is displayed at the position corresponding to a part determined as a part to be observed. FIG. 15 shows an example in which there are five parts to be observed: the cardia, the fornix, the greater curvature of the gastric body, the gastric angle, and the antrum. The mark Ma indicates the cardia, the mark Mb the fornix, the mark Mc the greater curvature of the gastric body, the mark Md the gastric angle, and the mark Me the antrum.
In this example, the marks Ma, Mb, ... are an example of markers representing regions of interest. The schema diagram Sh on which the marks Ma, Mb, ... are arranged in a predetermined layout is an example of the first information. Furthermore, the parts determined in advance as the parts to be observed (or the parts of which still images are to be captured) are an example of a plurality of specific regions of interest selected from a plurality of regions of interest within the body.
FIG. 16 is a diagram showing an example of the display of the observation status display window after part recognition.
As shown in FIG. 16, when a still image is captured and a part is recognized, the display form of the mark of the recognized part is switched. For example, the color (including transparency) of the mark of the recognized part changes. FIG. 16 shows an example in which the inside of the circle representing the mark switches from transparent to a chromatic or achromatic color, and shows a case in which the part indicated by the mark Mc (the greater curvature of the gastric body), the part indicated by the mark Md (the gastric angle), and the part indicated by the mark Me (the antrum) are recognized. In this example, the display form of a mark before the corresponding part is recognized (the form before the color change) is an example of the first form, and the display form after the corresponding part is recognized (the form after the color change) is an example of the second form.
Also, as shown in FIG. 16, when the second and subsequent parts are recognized, arrows Ar1, Ar2, ... connecting the mark of the previously recognized part and the mark of the subsequently recognized part are displayed. Each of the arrows Ar1, Ar2, ... points from the mark of the part recognized earlier toward the mark of the part recognized later. FIG. 16 shows an example in which the part indicated by the mark Mc (the greater curvature of the gastric body), the part indicated by the mark Me (the antrum), and the part indicated by the mark Md (the gastric angle) are recognized in that order. In this case, an arrow Ar1 connecting the marks Mc and Me is displayed pointing from the mark Mc toward the mark Me, and an arrow Ar2 connecting the marks Me and Md is displayed pointing from the mark Me toward the mark Md.
Displaying the arrows Ar1, Ar2, ... indicates that the regions between the parts of the marks Ma, Mb, ... connected by the arrows have been observed. The user can confirm the observation status from these arrows Ar1, Ar2, ...
In this example, the arrows Ar1, Ar2, ... are an example of information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed, and are also an example of the second information. The arrows Ar1, Ar2, ... are also an example of information that associates the marks Ma, Mb, ... (the first region of interest and the second region of interest) on the schema diagram Sh on which the marks are displayed (the first information).
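As a non-limiting illustration, the switching between the first form and the second form for the predefined marks can be sketched in Python as follows; the two form labels stand in for the color change described above.

marks = {part: "first form" for part in
         ("cardia", "fornix", "greater curvature", "gastric angle", "antrum")}
order = []   # parts in the order they were recognized
arrows = []  # arrows between consecutively recognized marks

def on_recognized(part: str) -> None:
    marks[part] = "second form"            # e.g. the circle is filled with color
    if order:
        arrows.append((order[-1], part))   # earlier mark -> newly recognized mark
    order.append(part)

for p in ("greater curvature", "antrum", "gastric angle"):
    on_recognized(p)
print(marks["cardia"], marks["antrum"])  # first form second form
print(arrows)  # [('greater curvature', 'antrum'), ('antrum', 'gastric angle')]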
FIG. 17 is a diagram showing an example of the transition of the display of the observation status display window W.
FIG. 17(A) shows an example of the initial display of the observation status display window W. As shown in FIG. 17(A), the schema diagram Sh is displayed in the observation status display window W, and the marks Ma, Mb, ... indicating the parts to be observed are displayed on the schema diagram Sh. Each of the marks Ma, Mb, ... is displayed corresponding to the position of the part to be observed, in a predetermined color (the first form). In this example, the inside of the circle representing each mark is displayed transparent.
FIG. 17(B) shows an example of the display of the observation status display window W when one of the parts to be observed is recognized for the first time. As shown in FIG. 17(B), the display form of the mark Mc of the recognized part is switched (to the second form). In this example, the inside of the circle representing the mark switches from transparent to a chromatic or achromatic color. FIG. 17(B) shows an example in which the greater curvature of the gastric body is recognized.
FIG. 17(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 17(C), the color (display form) of the mark Me of the newly recognized part is switched. FIG. 17(C) shows an example in which the antrum is newly recognized.
Also, as shown in FIG. 17(C), an arrow Ar1 connecting the mark Mc of the previously recognized part and the mark Me of the newly recognized part is displayed, pointing from the mark Mc toward the mark Me. This arrow Ar1 indicates that the region of the organ between the part indicated by the mark Mc and the part indicated by the mark Me has been observed. The example shown in FIG. 17(C) shows that the region of the stomach between the greater curvature of the gastric body indicated by the mark Mc and the antrum indicated by the mark Me has been observed.
FIG. 17(D) shows an example of the display of the observation status display window W when a further part is recognized from among the parts to be observed. As shown in FIG. 17(D), the color (display form) of the mark Md of the newly recognized part is switched. FIG. 17(D) shows an example in which the gastric angle is newly recognized.
Also, as shown in FIG. 17(D), an arrow Ar2 connecting the previously displayed mark Me and the newly displayed mark Md is displayed, pointing from the mark Me toward the mark Md. This arrow Ar2 indicates that the region of the stomach between the part indicated by the mark Me and the part indicated by the mark Md has been observed. In the example shown in FIG. 17(D), the endoscope 10 has moved from the antrum indicated by the mark Me toward the gastric angle indicated by the mark Md, and the region of the stomach between them has been observed.
In this way, the parts to be observed are indicated on the schema diagram Sh by the marks Ma, Mb, ..., and when a part is observed (recognized), the display form of its mark is switched. This makes it easy to grasp the observed parts. Also, each time a part is recognized, the arrows Ar1, Ar2, ... connecting the mark of the previously recognized part and the mark of the subsequently recognized part are displayed, which makes it easy to grasp the observed region and the observation direction.
In this example, the marks are connected by arrows, but they may be connected by line segments. Connecting them with arrows allows the observation direction (the direction in which the endoscope 10 was moved) to be confirmed.
The arrows (or line segments) Ar1, Ar2, ... may also be displayed only for a fixed period. For example, an arrow may be displayed at the same time as the display form of the mark of a newly recognized part is switched, and only the arrow may be erased after a time T has elapsed from the start of its display. The time T is a preset time. When the arrows are erased after a fixed period, it is preferable that they can be redisplayed as necessary.
Also, in this example, the marks Ma, Mb, ... are circular, but the shape and the like of the marks are not particularly limited.
In this example, the color of a mark is changed when the corresponding part is recognized, but the manner of switching is not limited to this. For example, when the corresponding part is recognized, the shape of the mark may be changed, or the mark may be made to blink.
In this example, the case where the parts to be observed are determined in advance has been described, but the parts recognizable by the image processing device 100 may be indicated by marks instead. In this case, all the parts recognizable by the image processing device 100 are displayed on the schema diagram.
[4] Display modification example (4)
FIG. 18 is a diagram showing another example of the display of information indicating the observation status. FIG. 18 shows an example in which evaluation values are additionally displayed in the display form shown in FIGS. 15 and 16.
As shown in FIG. 18, the evaluation value calculated for each region is displayed adjacent to the arrow Ar1, Ar2, ... indicating that region. More specifically, the evaluation value display frames Sc1, Sc2, ... are displayed adjacent to the arrows Ar1, Ar2, ... indicating the observed regions, and the evaluation values are displayed in those frames.
The example shown in FIG. 18 is a case where the evaluation value of the observation of the region of the stomach between the part indicated by the mark Mc (the greater curvature of the gastric body) and the part indicated by the mark Me (the antrum) is 82 [%], and that of the region between the part indicated by the mark Me (the antrum) and the part indicated by the mark Md (the gastric angle) is 99 [%].
In this example, the evaluation values are displayed together with the arrows Ar1, Ar2, ..., but only the evaluation values may be displayed. In that case, for example, the evaluation value display frames Sc1, Sc2, ... are displayed between the marks indicating the parts, and the evaluation values are displayed in those frames.
The arrows and/or the evaluation values may also be displayed only for a fixed period. For example, only the arrows may be erased after a time T has elapsed from the start of the display of the arrows and/or the evaluation values. The time T is a preset time. When the display of the arrows and/or the evaluation values is erased after a fixed period, it is preferable that they can be redisplayed as necessary.
[5] Display modification example (5)
FIG. 19 is a diagram showing another example of the display of information indicating the observation status. FIG. 19 shows an example of the display (initial display) of the observation status display window W before part recognition.
This example is also a display form that is advantageous when the parts to be observed (or the parts of which still images are to be captured) are determined in advance.
As shown in FIG. 19, the marks Ma, Mb, ... indicating the parts to be observed (or the parts of which still images are to be captured) are displayed in advance in a predetermined layout in the observation status display window W. FIG. 19 shows an example in which there are five parts to be observed: the cardia, the fornix, the greater curvature of the gastric body, the gastric angle, and the antrum. The mark Ma indicates the cardia, the mark Mb the fornix, the mark Mc the greater curvature of the gastric body, the mark Md the gastric angle, and the mark Me the antrum. In the example shown in FIG. 19, each of the marks Ma, Mb, ... is represented by a circle with the name of the corresponding part written inside, and the marks are arranged at the vertices of a pentagon.
In this example, the marks Ma, Mb, ... are an example of markers representing regions of interest. The figure in which the marks Ma, Mb, ... are arranged in a predetermined layout is an example of the first information. Furthermore, the parts determined in advance as the parts to be observed (or the parts of which still images are to be captured) are an example of a plurality of specific regions of interest selected from a plurality of regions of interest within the body.
FIG. 20 is a diagram showing an example of the display of the observation status display window after part recognition.
As shown in FIG. 20, when a still image is captured and a part is recognized, the display form of the mark of the recognized part is switched. For example, the color of the mark of the recognized part is inverted. FIG. 20 shows an example in which the part indicated by the mark Mc (the greater curvature of the gastric body), the part indicated by the mark Md (the gastric angle), and the part indicated by the mark Me (the antrum) are recognized. In this example, the display form of a mark before the corresponding part is recognized (the form before the color change) is an example of the first form, and the display form after the corresponding part is recognized (the form after the color change) is an example of the second form.
Also, as shown in FIG. 20, when the second and subsequent parts are recognized, arrows Ar1, Ar2, ... connecting the mark of the previously recognized part and the mark of the subsequently recognized part are displayed, each pointing from the mark of the part recognized earlier toward the mark of the part recognized later. FIG. 20 shows an example in which the part indicated by the mark Mc (the greater curvature of the gastric body), the part indicated by the mark Me (the antrum), and the part indicated by the mark Md (the gastric angle) are recognized in that order. In this case, an arrow Ar1 connecting the marks Mc and Me is displayed pointing from the mark Mc toward the mark Me, and an arrow Ar2 connecting the marks Me and Md is displayed pointing from the mark Me toward the mark Md.
Displaying the arrows Ar1, Ar2, ... indicates that the regions between the parts of the marks Ma, Mb, ... connected by the arrows have been observed. The user can confirm the observation status from these arrows Ar1, Ar2, ...
Also, as shown in FIG. 20, when evaluation values are calculated between the parts, the calculated evaluation values are displayed between the marks indicating the parts. Specifically, the evaluation value display frames Sc1, Sc2, ... are displayed on the arrows Ar1, Ar2, ..., and the evaluation values are displayed in those frames.
The example shown in FIG. 20 is a case where the evaluation value of the observation of the region of the stomach between the part indicated by the mark Mc (the greater curvature of the gastric body) and the part indicated by the mark Me (the antrum) is 82 [%], and that of the region between the part indicated by the mark Me (the antrum) and the part indicated by the mark Md (the gastric angle) is 99 [%].
In this example, the arrows and the evaluation values are an example of information indicating that the region between the first region of interest (first part) and the second region of interest (second part) has been observed, and are also an example of the second information. The arrows and the evaluation values are also an example of information that associates the marks (the first region of interest and the second region of interest) in the figure in which the marks are displayed (the first information).
As in modifications (3) and (4) above, in this example as well, the marks make it easy to grasp the observed parts, the displayed arrows make it easy to grasp the observed region and the observation direction, and the displayed evaluation values make it possible to grasp the observation (imaging) state of the regions determined to have been observed.
In this example, the marks are connected by arrows, but they may be connected by line segments. Connecting them with arrows allows the observation direction (the direction in which the endoscope 10 was moved) to be confirmed.
Also, in this example, the marks Ma, Mb, ... are circular, but the shape and the like of the marks are not particularly limited.
FIG. 21 is a diagram showing another example of the marks. FIG. 21 shows an example in which icons are used as the marks Ma, Mb, ... indicating the parts. Each icon is composed of a schema diagram of the organ under observation with a dot displayed at the position of the part that the icon indicates. In FIG. 21, the mark Ma indicates the cardia, the mark Mb the fornix, the mark Mc the greater curvature of the gastric body, the mark Md the gastric angle, and the mark Me the antrum.
Also, in this example, the color of a mark is changed when the corresponding part is recognized, but the manner of switching is not limited to this. For example, when the corresponding part is recognized, the shape of the mark may be changed, or the mark may be made to blink.
The arrows and/or the evaluation values may also be displayed only for a fixed period. For example, only the arrows may be erased after a time T has elapsed from the start of the display of the arrows and/or the evaluation values. The time T is a preset time. When the display of the arrows and/or the evaluation values is erased after a fixed period, it is preferable that they can be redisplayed as necessary.
In this example, the case where the parts to be observed are determined in advance has been described, but the parts recognizable by the image processing device 100 may be indicated by marks instead. In this case, all the parts recognizable by the image processing device 100 are displayed (the marks of all the recognizable parts are displayed in a predetermined layout).
[6] Other display modifications
When evaluation values are calculated between recognized parts and a calculated evaluation value is at or below a threshold, the evaluation value may be displayed in a form different from the usual one, for example by highlighting it.
FIG. 22 is a diagram showing an example of highlighted display of an evaluation value.
FIG. 22 shows a case where the evaluation value for observation of the region between the part indicated by the mark Me (antrum) and the part indicated by the mark Md (gastric angle) is at or below the threshold, while the evaluation value for observation of the region between the part indicated by the mark Mc (greater curvature of the gastric body) and the part indicated by the mark Me (antrum) exceeds the threshold.
As shown in FIG. 22, an evaluation value at or below the threshold is highlighted. In FIG. 22, the evaluation value display frame is shown enlarged relative to the normal case (when the threshold is exceeded) and in a different color. The form of highlighting is not limited to this; for example, the evaluation value display frame may be made to blink or its shape may be changed. When the color is changed, it is preferable to choose a color that readily draws attention: for example, an achromatic color (for example, gray) when the threshold is exceeded and a chromatic color (for example, red) when the value is at or below the threshold.
Highlighting a calculated evaluation value that is at or below the threshold in this way can prompt the user to redo the observation.
When highlighting is used, the degree of emphasis may be changed in stages according to the evaluation value; that is, the lower the evaluation value, the stronger the emphasis with which it is displayed.
In the above example the evaluation value is highlighted, but the arrows (including the case where they are displayed as line segments) may be highlighted in the same way. For example, an arrow for a region whose evaluation value is at or below the threshold can be emphasized by changing the thickness of its line, changing its color, or making it blink.
The evaluation value may also be displayed only when it is at or below the threshold. That is, when the calculated evaluation value exceeds the threshold, the observation is assumed to have been performed correctly, so only the arrow is displayed and the evaluation value itself is not. When the calculated evaluation value is at or below the threshold, there is a high possibility that the observation was not performed correctly, so the evaluation value is displayed on the screen to warn the user. In this case, the arrow may instead be highlighted to issue the warning, without displaying the evaluation value. FIG. 23 is a diagram showing an example of highlighting an arrow. FIG. 23 shows a case where the evaluation value for observation of the region between the part indicated by the mark Mc (greater curvature of the gastric body) and the part indicated by the mark Me (antrum) is at or below the threshold, while the evaluation value for observation of the region between the part indicated by the mark Me (antrum) and the part indicated by the mark Md (gastric angle) exceeds the threshold. As shown in FIG. 23, the arrow Ar1 for the region whose evaluation value is at or below the threshold is highlighted, in this example by changing the thickness and color of the arrow line.
The highlighting may also be performed using the marks. For example, in the case shown in FIG. 23, when the evaluation value for observation of the region between the part indicated by the mark Mc (greater curvature of the gastric body) and the part indicated by the mark Me (antrum) is at or below the threshold, the marks Mc and Me are highlighted, for example by changing their color, size, or shape, or by making them blink.
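For illustration only, the following is a minimal Python sketch of the display selection just described: a display form is chosen from an evaluation value and a threshold, with graded emphasis for lower values. All names (DisplayStyle, choose_style) and the specific grading rule are assumptions of this sketch, not prescriptions of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class DisplayStyle:
    show_value: bool   # whether the evaluation value itself is displayed
    color: str         # arrow/frame color
    line_width: int    # arrow line width
    blink: bool        # blink to draw attention

def choose_style(score: float, threshold: float, only_warnings: bool = False) -> DisplayStyle:
    """Pick a display form for one observed region.

    score         -- evaluation value of the region (e.g. proportion of OK images)
    threshold     -- preset acceptance threshold
    only_warnings -- if True, show the value only when it is at or below the threshold
    """
    if score > threshold:
        # Observation assumed correct: unobtrusive display (achromatic color).
        return DisplayStyle(show_value=not only_warnings, color="gray",
                            line_width=1, blink=False)
    # At or below the threshold: emphasized display (chromatic color).
    # Graded emphasis: the lower the score, the thicker the line.
    width = 3 if score > threshold * 0.5 else 5
    return DisplayStyle(show_value=True, color="red", line_width=width, blink=True)
```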
[Redoing an observation]
When a region for which an evaluation value has been calculated (a region judged to have been observed) is observed again, it is preferable to recalculate the evaluation value, and when the evaluation value is recalculated, it is preferable to replace the previously calculated evaluation value with the newly calculated one.
When recalculating an evaluation value, for example, the device may accept from the user a designation of the region whose evaluation value is to be recalculated. Alternatively, the evaluation value may be recalculated automatically, triggered by detection of a specific operation: for example, when an already-recognized part is recognized again, the evaluation values of the regions judged recognized are recalculated with that part as the starting point. The following describes the processing for automatically recalculating the evaluation value when an already-recognized part is recognized again.
FIG. 24 is a flowchart showing the processing procedure for automatically recalculating the evaluation value when an already-recognized part is recognized again.
Here, the description assumes that the parts to be observed (or the parts of which still images are to be captured) are indicated in advance by marks on the schema diagram displayed in the observation status display window W (see FIGS. 15 to 18). In this case, when a part is recognized, the display form of its mark is switched, an arrow is displayed between it and the mark of the previously recognized part, and an evaluation value is displayed adjacent to that arrow.
The processor 101 of the image processing device 100 displays the observation status display window W on the screen 52 of the display device 50 (step S11).
When observation starts, the processor 101 determines whether a still image has been captured (step S12). When a still image has been captured, part recognition processing is performed on the captured still image (step S13).
After the recognition processing, the processor 101 determines whether a part has been recognized (step S14). When a part has been recognized, the processor 101 determines whether the recognized part is an already-recognized part (step S15), that is, whether the part recognized this time has already been recognized before.
When the recognized part is not an already-recognized part, the processor 101 performs processing to determine the observation status (step S16). That is, it determines whether any already-recognized part exists, and if one does, it judges that the region between the previously recognized part and the part recognized this time has been observed. The processor 101 then updates the display of the observation status display window W based on the part recognition result and the observation status determination result (step S17). When a part is recognized for the first time, the display form of its mark is switched. When an already-recognized part exists, the display form of the mark of the newly recognized part is switched, an arrow is displayed between it and the mark of the previously recognized part, and an evaluation value is displayed adjacent to that arrow.
After updating the display of the observation status display window W, the processor 101 determines whether the observation has ended (step S18). The same determination is made when it is determined in step S12 that no still image has been captured and when it is determined in step S14 that no part has been recognized. When it is determined that the observation has ended, the processing ends; when it is determined that the observation has not ended, the processor again determines whether a still image has been captured (step S12).
When it is determined in step S15 that the recognized part is an already-recognized part, the processor 101 performs processing to cancel the "recognized" determination for the regions judged recognized, with the recognized part as the starting point (step S19).
FIG. 25 is a conceptual diagram of the processing for canceling the "recognized" determination. FIG. 25 shows a case where, after the parts of the stomach are recognized in the order of the part of the mark Mc (greater curvature of the gastric body), the part of the mark Me (antrum), and the part of the mark Md (gastric angle), the part of the mark Me (antrum) is recognized again. In this case, the region between the part of the mark Mc (greater curvature of the gastric body) and the part of the mark Me (antrum), and the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle), have been judged to be recognized regions. When the part of the mark Me (antrum) is recognized after the part of the mark Md (gastric angle), the part of the mark Me (antrum) is an already-recognized part, so the "recognized" determination is canceled for the regions judged recognized with the part of the mark Me (antrum) as the starting point. That is, the "recognized" determination for the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is canceled.
When the "recognized" determination is canceled for a region judged recognized, the determination is also canceled for the part that defines the end point of that region. In the example shown in FIG. 25, for instance, the "recognized" determination for the part of the mark Md (gastric angle) is canceled.
After canceling the "recognized" determination for the relevant regions, the processor 101 updates the display of the observation status display window W (step S17). In this case, the arrow and evaluation value for each relevant region are erased. In the above example, the arrow and evaluation value displayed for the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) are erased, and the display of the mark Md is switched back to the unrecognized form.
After updating the display of the observation status display window W, the processor 101 determines whether the observation has ended (step S18). When it determines that the observation has not ended, it determines whether a still image has been captured (step S12). When a still image is captured and a new part is recognized from it, the observation status determination processing is performed with respect to the newly recognized part (step S16). In the example shown in FIG. 25, for instance, when the part of the mark Me (antrum) is recognized again and the part of the mark Md (gastric angle) is then recognized, the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is judged to be a recognized region, and an evaluation value is calculated for that region. The display of the observation status display window W is then updated based on the part recognition result and the observation status determination result (step S17).
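A minimal Python sketch of this flow (steps S12 to S19) is given below. The recognizer, evaluator, and window objects, and all of their method names, are hypothetical stand-ins introduced for illustration and are not part of the embodiment.

```python
recognized_order = []   # parts recognized so far, in order
region_scores = {}      # (start_part, end_part) -> confirmed evaluation value

def on_still_image(image, recognizer, evaluator, window):
    part = recognizer.recognize(image)             # step S13
    if part is None:                               # step S14: no part recognized
        return
    if part in recognized_order:                   # step S15: already recognized
        # Step S19: cancel "recognized" for regions starting from this part.
        i = recognized_order.index(part)
        for a, b in zip(recognized_order[i:], recognized_order[i + 1:]):
            region_scores.pop((a, b), None)
            window.clear_arrow_and_score(a, b)     # erase arrow and value
            window.mark_unrecognized(b)            # end-point part reverts too
        del recognized_order[i + 1:]
        evaluator.restart_from(part)               # recalculate from this part
    else:                                          # step S16: new part
        if recognized_order:
            prev = recognized_order[-1]
            region_scores[(prev, part)] = evaluator.finalize(prev, part)
            window.draw_arrow_and_score(prev, part, region_scores[(prev, part)])
        recognized_order.append(part)
        window.mark_recognized(part)               # step S17: update display
```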
Thus, according to this example, for a region whose evaluation value has been calculated (a region judged observed), the evaluation value can be recalculated simply by performing an operation that causes the part serving as the starting point to be recognized again.
In the example shown in FIG. 25, suppose the parts are recognized in the order (1) the part of the mark Mc (greater curvature of the gastric body), (2) the part of the mark Me (antrum), (3) the part of the mark Md (gastric angle), (4) the part of the mark Me (antrum), and (5) the part of the mark Md (gastric angle). When the part of the mark Me (antrum) is recognized at (2), the region between the part of the mark Mc (greater curvature of the gastric body) and the part of the mark Me (antrum) is judged to be an observed region. When the part of the mark Md (gastric angle) is then recognized, the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is judged to be an observed region. When the part of the mark Me (antrum) is recognized again at (4), the "observed" determination for the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is canceled. When the part of the mark Md (gastric angle) is then recognized again at (5), the region between the part of the mark Me (antrum) and the part of the mark Md (gastric angle) is again judged to be an observed region. In this example, the part of the mark Me (antrum) is an example of the first part, and the part of the mark Md (gastric angle) is an example of the second part. The image (still image) from which the part of the mark Me (antrum) is first recognized is an example of the first image, and the image (still image) from which the part of the mark Md (gastric angle) is first recognized is an example of the second image. The image (still image) from which the part of the mark Me (antrum) is recognized the second time is an example of the third image, and the image (still image) from which the part of the mark Md (gastric angle) is recognized the second time is an example of the fourth image.
[Modifications of the observation evaluation method, etc.]
[1] Evaluation method
In the above embodiment, each image is evaluated from its defocus-blur and motion-blur states, but the method of evaluating images is not limited to this. In addition to, or instead of, the defocus-blur and/or motion-blur states, images may be evaluated from the viewpoint of image brightness (exposure).
The criteria used to evaluate the observation may also be changed for each region. For example, when observing the stomach, the evaluation criterion used for the region between the greater curvature of the gastric body and the antrum may differ from the criterion used for the region between the antrum and the gastric angle. This makes it possible to set an evaluation criterion suited to observing each region and thus to evaluate each region appropriately.
The observation may also be evaluated from the viewpoint of the time taken, or the speed, with which a region was observed. For example, the time spent observing a region can be measured and judged OK when it exceeds a preset reference time and NG when it is at or below the reference time. Alternatively, the speed at which the region was observed can be measured and judged OK when it is at or below a preset reference speed and NG when it exceeds the reference speed. The reference time and reference speed are more preferably set for each region. An OK/NG evaluation based on the time or speed of observing a region is another example of the evaluation value.
Since images are captured at a predetermined frame rate, measuring the time or speed with which a region was observed is substantially the same as counting the number of images captured in that region: the time can be calculated from the number of images, and the speed from the time, so the observation can be evaluated based on the image count. For example, the observation can be evaluated as OK when the number of images captured in a region exceeds a preset reference count and NG when it is at or below the reference count. The reference count, too, is more preferably set for each region.
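The following minimal Python sketch illustrates this frame-count (equivalently, time) based OK/NG judgment. The frame rate and the per-region reference times are illustrative assumptions, not values from the embodiment.

```python
FRAME_RATE = 30.0  # frames per second (assumed)

def evaluate_region_by_frames(frame_count: int, reference_time_s: float) -> str:
    """OK if the region was observed longer than its reference time.

    Because frames arrive at a fixed rate, the observation time is
    frame_count / FRAME_RATE, so comparing frame counts is equivalent
    to comparing observation times.
    """
    observed_time_s = frame_count / FRAME_RATE
    return "OK" if observed_time_s > reference_time_s else "NG"

# Per-region reference times (hypothetical values).
reference_times = {
    ("greater curvature", "antrum"): 8.0,
    ("antrum", "gastric angle"): 5.0,
}
# 300 frames = 10 s of observation > 8 s reference -> "OK"
print(evaluate_region_by_frames(300, reference_times[("greater curvature", "antrum")]))
```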
When the observation is evaluated from a plurality of viewpoints, each evaluation value can be displayed individually, and an overall evaluation value combining the evaluations from the individual viewpoints can additionally be calculated and displayed. The overall evaluation can, for example, be "OK" when the evaluation values for all items (viewpoints) exceed or satisfy their criteria, and "NG" when even one evaluation value falls below or fails to satisfy its criterion.
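As a sketch of this overall evaluation, the following Python snippet combines several per-viewpoint scores; the viewpoint names and criterion values are assumptions made for illustration.

```python
def overall_evaluation(scores: dict, criteria: dict) -> str:
    """Return "OK" only if every individual evaluation meets its criterion."""
    all_ok = all(scores[name] >= criteria[name] for name in criteria)
    return "OK" if all_ok else "NG"

# Hypothetical per-viewpoint scores and criteria.
scores   = {"sharpness_ratio": 0.95, "exposure_ratio": 0.90, "observation_time_s": 6.0}
criteria = {"sharpness_ratio": 0.80, "exposure_ratio": 0.80, "observation_time_s": 5.0}
print(overall_evaluation(scores, criteria))  # "OK" (every score meets its criterion)
```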
[2] Evaluation targets
In the above embodiment, the images of all frames captured between parts are evaluated, but specific images may instead be made the evaluation targets; for example, images extracted at fixed frame intervals can be evaluated.
[3] Modifications of the evaluation value calculation method
(1) Modification 1 of the evaluation value calculation method
In the above embodiment, the proportion of OK images is calculated as the evaluation value, but the proportion of NG images may be calculated instead. When the proportion of NG images is used as the evaluation value, a lower evaluation value means a better result.
When the proportion of OK images or of NG images is calculated as the evaluation value and the group of images to be evaluated (the plurality of images captured in a region) contains images that are similar to one another (including identical images), it is preferable to exclude one of each such pair when calculating the evaluation value. For example, when an image similar to a previously captured image is captured, the later image is excluded from the evaluation value calculation.
FIG. 26 is a block diagram of an observation status determination unit having a function of calculating the evaluation value while excluding similar images.
As shown in FIG. 26, the observation status determination unit 113D of this example has, in addition to the observation area determination unit 113D1, the imaging evaluation unit 113D2, and the evaluation value calculation unit 113D3, the function of a similar image detection unit 113D4. The functions of the observation area determination unit 113D1, the imaging evaluation unit 113D2, and the evaluation value calculation unit 113D3 are the same as in the observation status determination unit 113D of the image processing device 100 of the above embodiment, so only the function of the similar image detection unit 113D4 is described here.
The similar image detection unit 113D4 acquires the images captured in time series by the endoscope 10 and detects images that are similar to one another; specifically, it detects images similar to previously captured images. A known technique can be used for the detection, for example a method that detects similar images using image correlation. The detection targets are the group of images captured in a region judged observed, so the detection targets are reset each time a part is recognized.
The similar image detection unit 113D4 processes the acquired images in sequence and passes to the imaging evaluation unit 113D2 only those images that are dissimilar to the already captured images. Images similar to already captured images can thereby be excluded from the evaluation targets (the targets of evaluation value calculation).
When the movement of the endoscope 10 is stopped, for example, similar images are captured in succession. Excluding images similar to already captured images from the evaluation targets, as in this example, allows the observation of the relevant region to be evaluated more accurately.
Alternatively, the exclusion may be applied at evaluation value calculation time; that is, images similar to already captured images may be excluded from the targets of the evaluation value calculation.
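The following is a minimal Python sketch of such a similarity filter, using normalized cross-correlation as one known correlation-based similarity measure. The class name, the threshold value, and the assumption of equal-shape grayscale float arrays are all illustrative choices, not details of the embodiment.

```python
import numpy as np

class SimilarImageFilter:
    """Pass on only images dissimilar to those already kept for evaluation."""

    def __init__(self, threshold: float = 0.95):
        self.threshold = threshold
        self.kept = []  # images already passed to the evaluation stage

    def reset(self):
        """Called each time a new part is recognized (detection target resets)."""
        self.kept = []

    def accept(self, image: np.ndarray) -> bool:
        """Return True if the image is dissimilar to all kept images.

        Assumes equal-shape grayscale float arrays.
        """
        for prev in self.kept:
            a = (image - image.mean()).ravel()
            b = (prev - prev.mean()).ravel()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            if denom > 0 and float(a @ b) / denom >= self.threshold:
                return False  # too similar: exclude from evaluation
        self.kept.append(image)
        return True
```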
(2) Modification 2 of the evaluation value calculation method
The evaluation value can also be calculated in real time. In that case, after a part is recognized, the evaluation value is calculated successively.
FIG. 27 is a conceptual diagram of calculating the evaluation value in real time.
In FIG. 27, the symbol IOK denotes an image evaluated as an OK image, and the symbol ING denotes an image evaluated as an NG image.
FIG. 27 shows the state in which the 20th frame has been captured after a part was recognized. The evaluation value S is calculated as the ratio of the cumulative number Nok of OK images captured after part recognition to the cumulative number Nall of images captured after part recognition (S = Nok / Nall). Accordingly, if two NG images have occurred by the time the 20th frame is captured, the evaluation value is S = 18/20 = 90 [%]. The evaluation value can thus be calculated in real time.
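A minimal Python sketch of this incremental calculation is shown below; the class name and the OK/NG assignment in the example run are illustrative.

```python
class RunningScore:
    """Real-time evaluation value S = Nok / Nall over frames since the last part."""

    def __init__(self):
        self.n_all = 0
        self.n_ok = 0

    def update(self, ok: bool) -> float:
        """Fold in one frame captured after part recognition; return S in percent."""
        self.n_all += 1
        self.n_ok += int(ok)
        return 100.0 * self.n_ok / self.n_all

score = RunningScore()
# 20 frames, 2 of them NG -> S = 18/20 = 90 %
for i in range(20):
    s = score.update(ok=(i not in (4, 11)))  # frames 5 and 12 assumed NG
print(f"{s:.0f}%")  # prints "90%"
```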
When the evaluation value is calculated in real time, the calculation result is preferably displayed on the display device 50 in real time.
FIG. 28 is a diagram showing an example of displaying the evaluation value in real time.
FIG. 28 shows an example in which the evaluation value calculated in real time is displayed in the observation status display window W. As shown in FIG. 28, an evaluation value display area SR is set within the observation status display window W, and the evaluation value is displayed in that area. The evaluation value display area SR is set in a margin area left when the schema diagram Sh is displayed; FIG. 28 shows an example in which a rectangular evaluation value display area SR is set in the upper right corner of the observation status display window W.
FIG. 28 also shows the case where the part indicated by the mark Me (antrum) is recognized after the part indicated by the mark Mc (greater curvature of the gastric body). In this case, the region between the part indicated by the mark Mc (greater curvature of the gastric body) and the part indicated by the mark Me (antrum) becomes an observed region. The evaluation value based on the images observed after the part indicated by the mark Me (antrum) was recognized is then displayed in the evaluation value display area SR. Because it is not yet determined to which region the images observed after recognizing the part indicated by the mark Me (antrum) belong, the evaluation value is shown in the evaluation value display area SR; once the observed region is determined, the evaluation value is displayed near the arrow indicating that region, and the value displayed at that point is the confirmed evaluation value.
FIG. 29 is a diagram showing an example of transitions of the display in the observation status display window, for the case where the parts to be observed are determined in advance.
FIG. 29(A) shows an example of the initial display of the observation status display window W. As shown in FIG. 29(A), the schema diagram Sh is displayed in the observation status display window W, and marks Ma, Mb, ... indicating the parts to be observed are displayed on the schema diagram Sh.
FIG. 29(B) shows an example of the display of the observation status display window W when a part to be observed is recognized for the first time. As shown in FIG. 29(B), the display form of the mark Mc of the recognized part is switched; FIG. 29(B) shows the case where the greater curvature of the gastric body is recognized. As shown in FIG. 29(B), once the part is recognized, the evaluation value calculated based on the images captured after recognition is displayed in the evaluation value display area SR in real time. FIG. 29(B) shows the case where the evaluation value calculated from the images captured after recognizing the part indicated by the mark Mc (greater curvature of the gastric body) is 100 [%] at the time of display.
FIG. 29(C) shows an example of the display of the observation status display window W when a new part is recognized from among the parts to be observed. As shown in FIG. 29(C), the color (display form) of the mark Me of the newly recognized part is switched; FIG. 29(C) shows the case where the antrum is newly recognized.
As shown in FIG. 29(C), an arrow Ar1 connecting the mark Mc of the previously recognized part and the mark Me of the newly recognized part is also displayed, and an evaluation value display frame Sc1 showing the confirmed evaluation value is displayed adjacent to the arrow Ar1. The example shown in FIG. 29(C) indicates that the region of the stomach between the greater curvature of the gastric body indicated by the mark Mc and the antrum indicated by the mark Me has been observed.
Further, as shown in FIG. 29(C), once the part is recognized, the evaluation value calculated based on the images captured after recognition is displayed in the evaluation value display area SR in real time. FIG. 29(C) shows the case where the evaluation value calculated from the images captured after recognizing the part indicated by the mark Me (antrum) is 98 [%] at the time of display.
FIG. 29(D) shows an example of the display of the observation status display window W when a further part is recognized from among the parts to be observed. As shown in FIG. 29(D), the color (display form) of the mark Md of the newly recognized part is switched; FIG. 29(D) shows the case where the gastric angle is newly recognized.
As shown in FIG. 29(D), an arrow Ar2 connecting the previously displayed mark Me and the newly displayed mark Md is also shown, and an evaluation value display frame Sc2 showing the confirmed evaluation value is displayed adjacent to the arrow Ar2. The example shown in FIG. 29(D) indicates that the endoscope 10 was moved from the antrum indicated by the mark Me toward the gastric angle indicated by the mark Md and that the region of the stomach between them was observed.
Further, as shown in FIG. 29(D), once the part is recognized, the evaluation value calculated based on the images captured after recognition is displayed in the evaluation value display area SR in real time. FIG. 29(D) shows the case where the evaluation value calculated from the images captured after recognizing the part indicated by the mark Md (gastric angle) is 95 [%] at the time of display.
In this way, the evaluation value calculated based on the images captured after part recognition is displayed in the evaluation value display area SR in real time, allowing the user to grasp the observation status in real time.
In the examples shown in FIGS. 27 and 28, the evaluation value calculated in real time is displayed in the observation status display window W, but the location where it is displayed is not limited to this; it may be displayed somewhere other than the observation status display window W. When it is displayed in the observation status display window W, its display position is likewise not particularly limited; for example, it may be displayed near the mark indicating the most recently recognized part (at a position close enough for the association with that part to be apparent).
When the evaluation value is displayed in real time as in this example, it is preferable that the evaluation value calculation can be reset as necessary. The reset may, for example, be triggered when the same part as the most recently recognized part is recognized again, or the reset may be instructed by operating a foot switch or the like.
[Part recognition]
[1] Images subjected to recognition processing
In the above embodiment, parts are recognized from captured still images, but the images subjected to part recognition processing are not limited to these. For example, part recognition may be performed on an image arbitrarily selected by the user. The selection can be made with, for example, a foot switch, in which case part recognition is performed on the image displayed in the observation image display area A1a at the moment the foot switch is pressed. Alternatively, the image may be selected by means such as a switch provided on the operation section of the endoscope 10 or voice input.
Part recognition may also be performed on all captured images (the images of all frames). In that case, the processor 101 processes the images acquired in time-series order and recognizes the part appearing in each image. Recognizing parts from all captured images makes fully automatic part recognition possible. In this case, when one part of the organ under observation is recognized and another part is recognized afterwards, the region between the two is judged to have been observed. For example, in the stomach, when the antrum is recognized after the greater curvature of the gastric body, the region between the greater curvature of the gastric body and the antrum is judged observed; when the gastric angle is subsequently recognized, the region between the antrum and the gastric angle is judged observed. Thus, each time the part recognized from the images changes, the region between the new part and the previously recognized part is judged to have been observed.
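A minimal Python sketch of this frame-by-frame mode is shown below: whenever the recognized part changes, the region between the previous part and the new one is reported as observed. recognize_part and on_region_observed are hypothetical stand-ins for the recognizer and for whatever handles the "observed" judgment.

```python
def watch_stream(frames, recognize_part, on_region_observed):
    """Recognize a part in every frame; report each part-to-part transition."""
    previous = None
    for frame in frames:
        part = recognize_part(frame)  # may return None for unrecognized frames
        if part is None or part == previous:
            continue                  # no part, or same part as before
        if previous is not None:
            # e.g. previous = "greater curvature", part = "antrum":
            # the region between them is judged to have been observed.
            on_region_observed(previous, part)
        previous = part
```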
When the parts of the organ under observation can be classified into a plurality of types, it is not necessarily required that all parts be recognizable; it suffices that only the specific parts designated as recognition targets can be recognized. The recognition targets (regions of interest) are selected from parts serving as anatomical landmarks, parts that must always be observed from the viewpoint of an examination, parts of which still images must always be captured, and so on. Besides parts classifiable as organs or body sites, characteristic organ or body portions and regions may also be selected as recognition targets, and regions selected by the user (for example, landmark regions), lesions, inflamed regions, and the like can also be set as recognition targets. These regions can be set, for example, from previously captured images.
When parts are recognized automatically, it is not necessarily required that all captured images be subjected to recognition processing; for example, images acquired at predetermined frame intervals may be processed.
[2] Recognition processing
When recognizing a part, the capture state may also be evaluated as part of the recognition processing; that is, whether the image being processed was captured appropriately may also be evaluated. In that case, the recognition of the part is confirmed only when the image is judged to have been captured appropriately. Accordingly, even when a part can be recognized from an image, the part is treated as not recognized if the image is judged not to have been captured appropriately.
Whether an image was captured appropriately is judged from viewpoints such as defocus blur, motion blur, image brightness, quality of composition, and image smudging (reflection of dirt on the observation window). Regarding defocus and motion blur, an image free of blur (including blur within an acceptable range) is judged appropriately captured. Regarding brightness, an image captured with proper brightness (proper exposure) is judged appropriately captured. The quality of composition is judged, for example, from whether the part to be recognized is captured in a predetermined composition (for example, positioned at the center of the screen). Regarding smudging, an image without clouding (a clear image) is judged appropriately captured.
Whether an image was captured appropriately can be judged compositely from a plurality of viewpoints, and in that case the requirements can be set for each part. For example, for all parts the image may be judged from the viewpoints of defocus blur, motion blur, brightness, and smudging, while for specific parts the quality of composition may additionally be judged.
Whether an image was captured appropriately can be judged using, for example, judgment devices, each of which can be configured as a trained model. When the judgment is made from a plurality of viewpoints, a separate judgment device is prepared for each viewpoint. For example, when judging from the viewpoints of defocus blur, motion blur, brightness, composition, and smudging, individual judgment devices are prepared: one that judges the presence or absence of defocus blur, one that judges the presence or absence of motion blur, one that judges whether the brightness is appropriate, one that judges the quality of the composition, one that judges the presence or absence of smudging, and so on.
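The following Python sketch shows the gating structure described above: a part is confirmed only when every applicable quality check passes. The check functions are placeholders standing in for the trained judgment models, and the set of parts requiring a composition check is an assumption of the sketch.

```python
from typing import Callable, Dict, Optional, Set

# Placeholder checkers standing in for trained judgment models; each returns
# True when the image passes that viewpoint's criterion.
QUALITY_CHECKS: Dict[str, Callable] = {
    "defocus":  lambda img: True,   # placeholder: no defocus blur
    "motion":   lambda img: True,   # placeholder: no motion blur
    "exposure": lambda img: True,   # placeholder: proper brightness
    "smudge":   lambda img: True,   # placeholder: no lens smudge
}

# Parts that additionally require a composition check (hypothetical choice).
COMPOSITION_REQUIRED: Set[str] = {"antrum"}

def confirm_part(image, candidate_part: str, composition_ok: Callable) -> Optional[str]:
    """Return the part only if the image was captured appropriately."""
    if not all(check(image) for check in QUALITY_CHECKS.values()):
        return None  # treated as "part not recognized"
    if candidate_part in COMPOSITION_REQUIRED and not composition_ok(image):
        return None
    return candidate_part
```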
The criterion set for judging whether an image was captured appropriately is an example of the second criterion, and the judgment criteria set from the viewpoints of defocus blur, motion blur, brightness, composition, smudging, and so on are examples of the content of the second criterion.
[Observation targets]
The above embodiment has been described using observation of the stomach as an example, but the organ to be observed is not particularly limited, and body structures other than the internal organs can also be made observation targets. That is, the present invention is applicable whenever the inside of the body is observed using an endoscope.
FIG. 30 is a diagram showing an example of the display of the observation status display window when the large intestine is observed.
FIG. 30 shows the case where the parts to be observed are determined in advance. As shown in FIG. 30, a schema diagram Sh of the large intestine under observation is displayed in the observation status display window W, and predetermined marks MA, MB, ... (circles in the example of FIG. 30) are displayed on the schema diagram Sh to indicate the parts of the large intestine to be observed (or of which still images are to be captured). FIG. 30 shows the case of four parts to be observed: the ileocecum, the hepatic flexure (right colic flexure), the splenic flexure (left colic flexure), and the anus. The hepatic flexure is the transition between the ascending colon and the transverse colon; the splenic flexure is the transition between the transverse colon and the descending colon. The mark MA indicates the ileocecum, the mark MB the hepatic flexure, the mark MC the splenic flexure, and the mark MD the anus.
FIG. 30 shows the case where the parts of the large intestine are recognized in the order of the part indicated by the mark MA (ileocecum), the part indicated by the mark MB (hepatic flexure), and the part indicated by the mark MC (splenic flexure). In this case, the region between the part indicated by the mark MA (ileocecum) and the part indicated by the mark MB (hepatic flexure) (the ascending colon), and the region between the part indicated by the mark MB (hepatic flexure) and the part indicated by the mark MC (splenic flexure) (the transverse colon), are judged to be recognized regions.
The display form of the mark of a recognized part is switched (for example, its color changes). Arrows Ar1, Ar2, ... and evaluation values are displayed for the recognized regions; the arrows Ar1, Ar2, ... point in the observation direction, and the evaluation values are displayed in evaluation value display frames Sc1, Sc2, ... shown adjacent to the arrows Ar1, Ar2, ....
The above embodiments have been described for observation with a so-called flexible endoscope, but the present invention is also applicable to observation with a so-called rigid endoscope.
[Recording the observation status determination results]
As described above, the results of the observation status determination processing can be recorded in association with information such as still images and/or moving images captured during observation, information on lesions and the like detected during observation, and information on the results of discrimination performed during observation.
The recorded information can include (1) the order in which the parts were recognized, (2) when parts were recognized from captured still images, the still images from which each part was recognized, (3) the evaluation value calculated for each region (each region between recognized parts), (4) the capture time and/or number of frames for each region, and (5) the total capture time and/or total number of frames. The total capture time is the capture time from the recognition of the first part to the recognition of the last part, and the total number of frames is the total count of frames over the same span.
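As an illustration, the following Python sketch defines one possible record structure for the items (1) to (5) listed above; the field names and types are assumptions, as the embodiment does not prescribe a format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RegionRecord:
    start_part: str
    end_part: str
    evaluation_value: float   # (3) evaluation value for the region
    frame_count: int          # (4) frames captured in the region
    duration_s: float         # (4) capture time in the region

@dataclass
class ObservationRecord:
    parts_in_order: list = field(default_factory=list)   # (1) recognition order
    still_images: dict = field(default_factory=dict)     # (2) part -> image reference
    regions: list = field(default_factory=list)          # (3)(4) RegionRecord entries
    total_duration_s: Optional[float] = None             # (5) first to last part
    total_frame_count: Optional[int] = None              # (5) first to last part
```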
The information on the observation status determination results may also be recorded in an external storage device, in addition to or instead of the storage device (the auxiliary storage device 103) provided in the image processing device 100. For example, the result information may be transmitted to a system that manages endoscopic examination results and the like (an endoscope information management system or the like) and recorded in that system or in a data server connected to it (an endoscopic examination data server or the like).
Usually, when observation (including examination) with an endoscope is performed, a report presenting the observation results is created. Accordingly, when observation status determination processing is performed during observation, it is preferable that the information on the results of that processing also be entered (recorded) in the report.
In recent years, reports have been created using devices that support report creation (report creation support devices). A report creation support device acquires the information needed to create a report from the image processing device or the like, and the information it acquires can include the information on the observation status determination results. The information entered (recorded) in the report can thereby include the observation status determination results, and that information can be entered automatically.
[Hardware configuration]
The functions of the endoscopic image processing device are realized by various processors. The various processors include a CPU and/or a GPU (Graphics Processing Unit), which are general-purpose processors that execute programs and function as various processing units; programmable logic devices (PLDs), such as FPGAs (Field Programmable Gate Arrays), whose circuit configuration can be changed after manufacture; and dedicated electric circuits, such as ASICs (Application Specific Integrated Circuits), which are processors having a circuit configuration designed exclusively for executing specific processing. A program is synonymous with software.
One processing unit may be configured by one of these various processors, or by two or more processors of the same or different types; for example, one processing unit may be configured by a plurality of FPGAs or by a combination of a CPU and an FPGA. A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, one processor may be configured by a combination of one or more CPUs and software, as typified by computers used for clients and servers, with this processor functioning as the plurality of processing units. As a second example, a processor that realizes the functions of the entire system including the plurality of processing units on a single IC (Integrated Circuit) chip may be used, as typified by a system on chip (SoC). In this way, the various processing units are configured, as a hardware structure, using one or more of the various processors described above.
[Others]
The modifications described above can be used in combination as appropriate.
1 Endoscope system
10 Endoscope
20 Light source device
30 Processor device
31 Endoscope control unit
32 Light source control unit
33 Image processing unit
34 Input control unit
35 Output control unit
40 Input device
50 Display device
52 Screen
100 Image processing device
101 Processor
102 Main storage device
103 Auxiliary storage device
104 Input/output interface
111 Image acquisition unit
112 Command acquisition unit
113 Image processing unit
113A Lesion detection unit
113B Discrimination unit
113C Part recognition unit
113D Observation status determination unit
113D1 Observation area determination unit
113D2 Imaging evaluation unit
113D3 Evaluation value calculation unit
113D4 Similar image detection unit
114 Recording control unit
115 Display control unit
A1 Main display area
A1a Observation image display area
A1b Information display area
A2 Sub display area
A3 Discrimination result display area
Ar1, Ar2 Arrows
Fl1, Fl2, Fl3 Part display frames
I Image
IOK Image (OK image)
ING Image (NG image)
Im Image
Ip Subject information
Is Still image
M1, M2, … Marks
Ma, Mb, … Marks
MA, MB, … Marks
SR Evaluation value display area
Sc1, Sc2, … Evaluation value display frames
Sh Schema diagram
W Observation status display window
S1 to S7 Steps of the processing for displaying information indicating the observation status on the display device
S11 to S19 Steps of the processing for automatically recalculating the evaluation value when an already-recognized part is recognized again

Claims (27)

  1.  An image processing device that processes a plurality of images captured in time series by an endoscope, the image processing device comprising:
     a processor,
     wherein the processor:
     acquires the plurality of images;
     processes the images to recognize, from among a plurality of regions of interest within the body, the region of interest appearing in each image; and
     in a case where a first region of interest among the plurality of regions of interest is recognized from a first image of the plurality of images and a second region of interest among the plurality of regions of interest is recognized from a second image that is later in time-series order than the first image, causes a display device to display information indicating that a region between the first region of interest and the second region of interest has been observed.
  2.  The image processing device according to claim 1,
     wherein the processor:
     causes the display device to display, as first information, information on a plurality of specific regions of interest selected from among the plurality of regions of interest within the body; and
     causes the display device to display, as second information, the information indicating that the region between the first region of interest and the second region of interest has been observed.
  3.  The image processing device according to claim 2,
     wherein the second information is composed of information that associates the first region of interest with the second region of interest in the first information.
  4.  The image processing device according to claim 3,
     wherein the second information includes information indicating a movement direction of observation.
  5.  The image processing device according to claim 3,
     wherein the first information is composed of information in which markers representing the regions of interest are arranged in a specific layout, and
     the second information is composed of information that associates the markers with each other.
  6.  The image processing device according to claim 5,
     wherein the first information includes a schema diagram of an organ to be observed and is composed of information in which the markers are placed at the positions of the respective regions of interest on the schema diagram.
  7.  The image processing device according to claim 6,
     wherein the second information is composed of a line segment connecting the markers.
  8.  The image processing device according to claim 7,
     wherein the line segment includes an arrow indicating the movement direction of observation.
  9.  The image processing device according to any one of claims 5 to 8,
     wherein the processor switches the display form of the marker of the recognized region of interest from a first form to a second form.
  10.  The image processing device according to any one of claims 2 to 8,
     wherein the processor calculates an evaluation value of the observation based on the images between the first image and the second image in time-series order, and
     the second information includes information on the evaluation value.
  11.  The image processing device according to claim 10,
     wherein the evaluation value is calculated based on the number of images that satisfy a first criterion and/or the number of images between the first image and the second image.
  12.  The image processing device according to claim 11,
     wherein the first criterion is set based on at least one of a defocus state of an image, a motion-blur state of an image, and a brightness of an image.
  13.  The image processing device according to claim 10,
     wherein the evaluation value is calculated as the proportion of images that satisfy a first criterion or the proportion of images that do not satisfy the first criterion.
  14.  The image processing device according to claim 13,
     wherein the processor:
     extracts mutually similar images from among the images between the first image and the second image; and
     calculates the evaluation value with one of the extracted mutually similar images excluded.
  15.  The image processing device according to claim 10,
     wherein the processor displays the second information in a form corresponding to the evaluation value.
  16.  The image processing device according to claim 10,
     wherein, in a case where the first region of interest is recognized from a third image that is later in time-series order than the second image and the second region of interest is recognized from a fourth image that is later in time-series order than the third image, the processor updates the evaluation value and updates the display of the second information.
  17.  The image processing device according to any one of claims 1 to 8,
     wherein the processor:
     causes the display device to display the plurality of images in time-series order;
     accepts selection of the first image from among the images displayed on the display device; and
     accepts selection of the second image from among the images displayed on the display device after the first image.
  18.  The image processing device according to claim 17,
     wherein the processor records the images selected as the first image and the second image as still images.
  19.  The image processing device according to any one of claims 1 to 8,
     wherein the processor processes each of the plurality of images to recognize the region of interest appearing in each of the images.
  20.  The image processing device according to any one of claims 1 to 8,
     wherein, in a case where the processor recognizes the region of interest from an image, the processor further determines whether the image in which the region of interest is recognized satisfies a second criterion, and confirms the recognition of the region of interest in a case where the second criterion is satisfied.
  21.  The image processing device according to claim 20,
     wherein the content of the second criterion is set for each region of interest.
  22.  The image processing device according to any one of claims 2 to 8,
     wherein the processor ends the display of the second information after a time T has elapsed from the start of the display of the second information.
  23.  The image processing device according to claim 22,
     wherein the processor:
     records a history of the display of the second information; and
     when newly displaying the second information, simultaneously displays the second information displayed in the past, based on the history.
  24.  The image processing device according to claim 23,
     wherein the processor displays the second information displayed in the past, based on the history, when the observation ends and/or when display of the history is instructed.
  25.  The image processing device according to any one of claims 2 to 8,
     wherein the processor ends the display of the second information in a case where the first region of interest is recognized from a third image that is later in time-series order than the second image.
  26.  The image processing device according to any one of claims 1 to 8,
     wherein the plurality of regions of interest are a plurality of sites of an organ to be observed.
  27.  An endoscope system comprising:
     an endoscope;
     a display device; and
     the image processing device according to any one of claims 1 to 8, which processes images captured by the endoscope.
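
 [Illustrative sketches (editorial)]

The sketches below are not part of the application; they restate, in Python, several of the mechanisms recited in the claims above as one possible reading. All function names, class names, thresholds, and coordinates in them are invented for illustration.

A minimal sketch of the claim 1 logic, assuming a recognizer that returns a region ID (or None) for each frame and a display callback:

    # Hypothetical names: recognize_region() stands in for the recognition
    # processing and show_observed() for the display control of claim 1.
    def monitor_observation(images, recognize_region, show_observed):
        # The images are assumed to arrive in chronological order.
        first_region = None
        for image in images:
            region = recognize_region(image)  # region ID or None
            if region is None:
                continue
            if first_region is None:
                first_region = region  # first region of interest recognized
            elif region != first_region:
                # A different region recognized in a later image: the span
                # between the two regions is treated as observed.
                show_observed(first_region, region)
                first_region = region  # keep tracking from the new region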
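A sketch of the marker display of claims 5 to 8, assuming matplotlib for rendering; the schema layout and the region names are invented:

    import matplotlib.pyplot as plt

    MARKERS = {  # region ID -> position on the schema diagram (invented)
        "cardia": (0.2, 0.8),
        "body": (0.5, 0.5),
        "antrum": (0.8, 0.2),
    }

    def draw_schema(first, second, ax):
        # First information: markers placed at each region's position (claim 6)
        for name, (x, y) in MARKERS.items():
            ax.plot(x, y, "o", markersize=12)
            ax.annotate(name, (x, y), textcoords="offset points", xytext=(8, 8))
        # Second information: a line segment with an arrow indicating the
        # movement direction of observation (claims 7 and 8)
        ax.annotate("", xy=MARKERS[second], xytext=MARKERS[first],
                    arrowprops=dict(arrowstyle="->", lw=2))

    fig, ax = plt.subplots()
    draw_schema("cardia", "body", ax)
    plt.show()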
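A sketch of the evaluation value of claims 10 to 13, computed as the percentage of intermediate frames that satisfy the first criterion; the brightness range and the gradient-based sharpness proxy with its threshold are assumptions, not taken from the application:

    import numpy as np

    def meets_first_criterion(frame, min_brightness=40, max_brightness=220,
                              min_sharpness=5.0):
        gray = frame.mean(axis=2) if frame.ndim == 3 else frame
        brightness = gray.mean()
        gy, gx = np.gradient(gray)           # mean gradient magnitude as a
        sharpness = np.hypot(gx, gy).mean()  # crude defocus/motion-blur proxy
        return (min_brightness <= brightness <= max_brightness
                and sharpness >= min_sharpness)

    def evaluation_value(frames_between):
        # Percentage of the frames between the first image and the second
        # image that satisfy the first criterion (claim 13)
        if not frames_between:
            return 0.0
        ok = sum(meets_first_criterion(f) for f in frames_between)
        return 100.0 * ok / len(frames_between)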
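A sketch of the similar-image exclusion of claim 14; the histogram-difference similarity measure and its threshold are assumptions:

    import numpy as np

    def intensity_histogram(frame, bins=32):
        h, _ = np.histogram(frame, bins=bins, range=(0, 255), density=True)
        return h

    def deduplicate(frames, threshold=0.02):
        kept_frames, kept_hists = [], []
        for frame in frames:
            h = intensity_histogram(frame)
            # Exclude one of each pair of mutually similar images
            if any(np.abs(h - kh).mean() < threshold for kh in kept_hists):
                continue
            kept_frames.append(frame)
            kept_hists.append(h)
        return kept_frames

Under this reading, the evaluation value of claim 13 would be computed over deduplicate(frames_between) rather than over the raw frame list.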
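A sketch of the display-timing and history behavior of claims 22 to 24, assuming a simple polling loop; the value of T and the class design are invented:

    import time

    T_SECONDS = 5.0  # display duration T (arbitrary illustrative value)

    class SecondInfoDisplay:
        def __init__(self):
            self.history = []   # every (first_region, second_region) shown
            self.active = None  # currently displayed pair, if any
            self.shown_at = 0.0

        def show(self, first_region, second_region):
            # Record the display history (claim 23) and start the display
            self.history.append((first_region, second_region))
            self.active = (first_region, second_region)
            self.shown_at = time.monotonic()

        def tick(self):
            # End the display a time T after it started (claim 22)
            if self.active and time.monotonic() - self.shown_at >= T_SECONDS:
                self.active = None

        def past_displays(self):
            # Returned for redisplay at the end of observation or on
            # instruction (claim 24)
            return list(self.history)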
PCT/JP2023/016078 2022-05-24 2023-04-24 Image processing device and endoscope system WO2023228659A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-084588 2022-05-24
JP2022084588 2022-05-24

Publications (1)

Publication Number Publication Date
WO2023228659A1 true WO2023228659A1 (en) 2023-11-30

Family

ID=88918973

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/016078 WO2023228659A1 (en) 2022-05-24 2023-04-24 Image processing device and endoscope system

Country Status (1)

Country Link
WO (1) WO2023228659A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005218584A (en) * 2004-02-04 2005-08-18 Olympus Corp Display processor of image information and its display processing method and display processing program
WO2010103868A1 (en) * 2009-03-11 2010-09-16 オリンパスメディカルシステムズ株式会社 Image processing system, external device therefor, and image processing method therefor
JP2014083289A (en) * 2012-10-25 2014-05-12 Olympus Corp Insertion system, insertion support device, and insertion support method and program

Similar Documents

Publication Publication Date Title
US8423123B2 (en) System and method for in-vivo feature detection
US9805469B2 (en) Marking and tracking an area of interest during endoscopy
JP4629143B2 (en) System for detecting contents in vivo
CN113573654A (en) AI system for detecting and determining lesion size
WO2023103467A1 (en) Image processing method, apparatus and device
US20090131746A1 (en) Capsule endoscope system and method of processing image data thereof
CN111275041B (en) Endoscope image display method and device, computer equipment and storage medium
CN113543694B (en) Medical image processing device, processor device, endoscope system, medical image processing method, and recording medium
WO2006100808A1 (en) Capsule endoscope image display controller
JPWO2019230302A1 (en) Learning data collection device, training data collection method and program, learning system, trained model, and endoscopic image processing device
Pogorelov et al. Deep learning and handcrafted feature based approaches for automatic detection of angiectasia
KR100751160B1 (en) Medical image recording system
JP2020156903A (en) Processor for endoscopes, information processing unit, program, information processing method and learning model generation method
JPWO2020184257A1 (en) Medical image processing equipment and methods
WO2023228659A1 (en) Image processing device and endoscope system
JP7127779B2 (en) Diagnostic support system and diagnostic support program
JP6840263B2 (en) Endoscope system and program
US20220361739A1 (en) Image processing apparatus, image processing method, and endoscope apparatus
WO2022080141A1 (en) Endoscopic imaging device, method, and program
US20220202284A1 (en) Endoscope processor, training device, information processing method, training method and program
CN116724334A (en) Computer program, learning model generation method, and operation support device
US20240148235A1 (en) Information processing apparatus, information processing method, endoscope system, and report creation support device
WO2023112499A1 (en) Endoscopic image observation assistance device and endoscope system
US20240136034A1 (en) Information processing apparatus, information processing method, endoscope system, and report creation support device
WO2023067922A1 (en) Endoscope image processing device, endoscope image processing method, and endoscope system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23811537

Country of ref document: EP

Kind code of ref document: A1