WO2023053334A1 - Processing system and information processing method - Google Patents


Info

Publication number
WO2023053334A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
approach angle
processing unit
treatment
processing
Prior art date
Application number
PCT/JP2021/036108
Other languages
French (fr)
Japanese (ja)
Inventor
紘介 甕
哲寛 山田
晋平 宮原
秀範 橋本
晃佑 野川
咲 石澤
一郎 小田
哲 野中
Original Assignee
オリンパス株式会社 (Olympus Corporation)
国立研究開発法人国立がん研究センター (National Cancer Center Japan)
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation) and 国立研究開発法人国立がん研究センター (National Cancer Center Japan)
Priority to PCT/JP2021/036108
Publication of WO2023053334A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine

Definitions

  • The present invention relates to a processing system, an information processing method, and the like.
  • Patent Literature 1 discloses a method of evaluating a doctor's skill using motion data of a medical robot.
  • The approach angle, which is the angle of the endoscope with respect to the tissue to be treated, has been found to be an effective parameter for judging a doctor's skill because it is directly linked to safety.
  • Conventional methods such as that of Patent Literature 1 do not take such circumstances into account and are therefore not sufficient for evaluating a doctor's skill.
  • One aspect of the present disclosure relates to a processing system including an acquisition unit that acquires approach angle information of an insertion portion of an endoscope and energization history information related to the energization history of a treatment instrument, a processing unit that evaluates, based on the approach angle information and the energization history information, the skill of a user who operates the endoscope, and an output processing unit that outputs skill evaluation information that is the result of the skill evaluation.
  • Another aspect of the present disclosure relates to an information processing method that acquires approach angle information of an insertion portion of an endoscope and energization history information related to the energization history of a treatment instrument, performs skill evaluation of a user who operates the endoscope based on the approach angle information and the energization history information, and outputs skill evaluation information as the result of the skill evaluation.
  • FIGS. 1(A) to 1(F) are diagrams for explaining the positional relationship between an endoscope insertion portion and the tissue to be treated, with examples of captured images.
  • A diagram for explaining a configuration example of the processing system.
  • A diagram for explaining an example of the appearance of an endoscope system.
  • A diagram for explaining a configuration example of the distal end portion of the insertion section.
  • A diagram for explaining a configuration example of a system including the processing system.
  • A diagram for explaining an example of skill evaluation information.
  • A diagram for explaining an example of advice information.
  • A diagram for explaining an example of log information of approach angle information.
  • A diagram for explaining another example of log information of approach angle information.
  • A diagram for explaining an example of doctor information.
  • A diagram for explaining an example of patient information.
  • FIGS. 13(A) and 13(B) are diagrams for explaining the approach angle.
  • A flowchart for explaining an example of processing for outputting approach angle information.
  • A diagram for explaining a neural network.
  • A diagram for explaining an example of inputs and outputs of the neural network.
  • A flowchart for explaining a processing example of learning processing.
  • A flowchart for explaining a processing example of skill evaluation processing, which is inference processing.
  • A diagram for explaining an example of clustering results in an n-dimensional feature amount space.
  • FIGS. 20(A) to 20(F) are diagrams for explaining an example of the positional relationship between the endoscope insertion portion and the tissue to be treated, and captured images, in a modified example.
  • A diagram for explaining a configuration example of the processing system in the modified example.
  • A diagram for explaining an example of the appearance of the endoscope system in the modified example.
  • A diagram for explaining a configuration example of the endoscope system in the modified example.
  • A diagram for explaining a configuration example of the distal end portion of the insertion section in the modified example.
  • FIGS. 25(A) and 25(B) are explanatory diagrams of the approach angle in the modified example.
  • A flowchart for explaining processing in the processing system in the modified example.
  • FIGS. 27(A) and 27(B) are examples of screens displaying relative changes in the approach angle in the modified example.
  • An explanatory diagram of classification processing of case data in the modified example.
  • A flowchart for explaining other processing of the processing system in the modified example.
  • A diagram for explaining an example of measurement results of pressing pressure in each stage of treatment in another modified example.
  • A diagram for explaining a configuration example of the processing system in another modified example.
  • A diagram for explaining an example of the appearance of the endoscope system in another modified example.
  • A diagram for explaining a configuration example of the endoscope system in another modified example.
  • A diagram for explaining a configuration example of the distal end portion of the insertion section in another modified example.
  • A diagram for explaining a configuration example of a system including the processing system in another modified example.
  • FIGS. 39(A) and 39(B) are diagrams for explaining an example of a pressing pressure measuring method in another modified example.
  • A diagram for explaining an example of a skill evaluation sheet in another modified example.
  • A diagram for explaining an example of advice information in another modified example.
  • A diagram for explaining an example of log information of pressing pressure information and air supply/suction information in another modified example.
  • Another diagram for explaining an example of log information of pressing pressure information and air supply/suction information in another modified example.
  • EMR stands for endoscopic mucosal resection.
  • ESD stands for endoscopic submucosal dissection.
  • ESD includes multiple stages, such as marking, local injection, incision, and dissection.
  • a hemostasis step may also be included.
  • a stage can also be called a step.
  • Marking is the step of marking the area to be excised around the lesion.
  • Local injection is the step of injecting a drug into the submucosa.
  • Incision is the step of cutting the mucosa around the lesion with a knife to surround the marking.
  • Dissection is a step of dissecting the lesion from the living body using a dedicated knife or snare.
  • Harvesting is the step of retrieving the excised lesion.
  • Hemostasis is the step of stopping bleeding on the body surface after resection.
  • each treatment included in ESD will be mainly described as an example of treatment according to this embodiment, but the technique of this embodiment can be extended to other treatments for a living body such as EMR.
  • The flexible endoscope is long and pliable from the distal end portion 11 of the insertion section 310b to the operation section 310a, as will be described later with reference to FIG. 3, for example. Therefore, the tactile sensation and the amount of force produced when the distal end portion 11 contacts the living body are hardly transmitted to the operator.
  • The information available to the operator is mainly the image captured by the imaging system provided in the distal end portion 11. That is, the operator can confirm only the range and angle visible on the screen. Furthermore, captured images often lack three-dimensional information.
  • FIGS. 1A to 1F are diagrams illustrating the positional relationship between the distal end portion 11 of the insertion section 310b and the tissue to be treated, and captured images at that time.
  • OB11 in FIGS. 1(A) to 1(F) is a lesion, and the treatment target tissue is OB11 or its surrounding tissue.
  • The line indicated by E11 shows the boundary between the submucosal layer and the muscle layer.
  • the angle of the endoscope with respect to the treatment target tissue is referred to as the approach angle.
  • Although the treatment target tissue here is, in a narrow sense, the lesion to be excised by treatment, it may also be normal tissue around the lesion.
  • Defining the approach angle with the peripheral portion as the treatment target tissue may make the angle between the living body and the treatment instrument easier to understand. Specific processing for obtaining the approach angle will be described later with reference to FIGS. 13 and 14.
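  • The patent defers the concrete computation to FIGS. 13 and 14. As a hedged sketch, if the distal-end axis of the insertion portion and the normal of the treatment target tissue are known as 3-D vectors (an assumption; in practice they would come from sensor information), the approach angle relative to the tissue surface could be computed as follows. The function name and vector representation are illustrative, not the patent's.

```python
import math

def approach_angle_deg(scope_axis, tissue_normal):
    """Angle (degrees) between the endoscope's distal-end axis and the
    treatment-target tissue surface, derived from the surface normal.

    Hypothetical sketch: both directions are assumed to be available as
    3-D vectors. 0 degrees means the scope axis is parallel to the
    tissue surface; 90 degrees means a perpendicular approach.
    """
    dot = sum(a * n for a, n in zip(scope_axis, tissue_normal))
    na = math.sqrt(sum(a * a for a in scope_axis))
    nn = math.sqrt(sum(n * n for n in tissue_normal))
    # Fold the sign so the result does not depend on normal orientation,
    # then take the complement of the angle to the normal.
    cos_to_normal = max(-1.0, min(1.0, abs(dot) / (na * nn)))
    return 90.0 - math.degrees(math.acos(cos_to_normal))
```

For example, a scope axis lying in the tissue plane gives 0 degrees, while an axis along the normal gives 90 degrees.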
  • FIG. 1(A) shows a state in which the treatment instrument 360 is protruded from the distal end portion 11 of the insertion portion 310b and approach to the treatment target tissue is started.
  • FIG. 1(B) represents a captured image in the state shown in FIG. 1(A).
  • At this point, the treatment tool 360 and the tissue to be treated are not yet close enough to contact each other. Therefore, compared to the state during treatment described later with reference to FIGS. 1(C) to 1(F), the operator can more easily estimate the relative relationship between the insertion portion 310b and the tissue to be treated from the captured image.
  • The treatment instrument 360 will be described later.
  • the insertion section 310 b includes an objective optical system 311 .
  • The objective optical system 311 forms a subject image from the light reflected by the subject.
  • FIG. 1(C) represents a state in which an incision is being performed.
  • FIG. 1(D) represents a captured image during incision.
  • In this state, the distal end portion 11 is hidden under the tissue raised by the incision. Therefore, as shown in FIG. 1(D), the tissue covers the entire screen of the captured image, and it is difficult for the operator to estimate the approach angle from the captured image.
  • FIG. 1(E) shows a state in which the approach angle changes greatly during incision and becomes a dangerous angle that increases bleeding.
  • FIG. 1(F) shows the captured image at that time.
  • the distal end portion 11 maintains a state of getting under the raised tissue.
  • The captured image changes little from the state of FIG. 1(D), and it is not easy for the operator to notice from the captured image that the approach angle has changed. For example, if the treatment instrument 360 penetrates deeply into the tissue and the amount of bleeding increases, the operator can recognize the unfavorable state from the captured image, but it is difficult to predict this before the bleeding occurs.
  • In practice, the operator, who is an endoscopist, performs procedures based on his or her own empirical rules, without sufficient information from images, tactile sensation, and the like.
  • Experienced doctors know from experience that, even with a flexible endoscope, the insertion portion 310b can be controlled by imagining the actual position and posture of the endoscope with respect to the lesion while constantly correcting for changes in operability.
  • the change in the approach angle during treatment is smaller for experienced doctors than for novice doctors.
  • However, experienced doctors themselves cannot express in words how and when to operate so as to keep the approach angle small. In other words, the know-how for controlling the approach angle cannot easily be passed on to trainee doctors in an objective form.
  • the transition of the approach angle in treatment is "tacit knowledge".
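  • One way to make this tacit knowledge measurable is a simple variability statistic over a logged approach-angle series. The function below is a hedged sketch, not the patent's actual metric: it returns the mean and standard deviation of the series, where a smaller deviation matches the expert behavior described above.

```python
import math

def angle_stability(angles_deg):
    """Summarize how much the approach angle fluctuates during treatment.

    Returns (mean, standard deviation) of a logged angle series in
    degrees. Illustrative metric only; the patent's skill evaluation is
    described later.
    """
    n = len(angles_deg)
    mean = sum(angles_deg) / n
    variance = sum((a - mean) ** 2 for a in angles_deg) / n
    return mean, math.sqrt(variance)

# Invented example logs: small fluctuation vs. large fluctuation.
expert_log = [22, 24, 23, 22, 24]
trainee_log = [10, 45, 20, 60, 15]
```

Comparing the two invented logs, the second series yields a much larger deviation, consistent with the expert/trainee contrast above.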
  • "Expert doctor" refers to a doctor with high treatment skill.
  • "Trainee doctor" refers to a doctor with lower treatment skill than the expert doctor.
  • Whether a treatment-related skill is high or low is evaluated in consideration of information on the course after the treatment as well as information on the treatment actions themselves.
  • Information on the course after treatment includes, for example, a low incidence of complications, a short postoperative hospital stay, and the like.
  • the predetermined case is, for example, a case including a lesion having a remarkably complicated shape, or a case relating to a lesion site such as the vault of the stomach where it is difficult to reduce the approach angle.
  • a high-frequency device is a device that is used to excise and cauterize a target tissue by applying a high-frequency current.
  • High frequency devices include high frequency snares and high frequency knives.
  • Energization means that high-frequency current is supplied from the power supply to the high-frequency device, and the energization state can be determined based on the control signal of the power supply. Hereinafter, energization of a high-frequency device may be referred to simply as energization.
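  • If the power supply's control signal is available as a sampled boolean series (an assumed representation; the patent only states that the energization state can be determined from the control signal), the energization history could be extracted as in this sketch. The function name and sampling step `dt` are illustrative.

```python
def energization_history(control_signal, dt=0.1):
    """Extract energization periods from a sampled power-supply control
    signal (True while high-frequency current is being supplied).

    Returns a list of (start_time, duration) pairs, covering both the
    timing and the period of energization mentioned above. Hypothetical
    sketch, not the patent's interface.
    """
    periods = []
    start = None
    for i, on in enumerate(control_signal):
        if on and start is None:
            start = i * dt                           # rising edge
        elif not on and start is not None:
            periods.append((start, i * dt - start))  # falling edge
            start = None
    if start is not None:                            # still on at the end
        periods.append((start, len(control_signal) * dt - start))
    return periods
```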
  • FIG. 2 is a diagram showing the configuration of the processing system 100 according to this embodiment.
  • the processing system 100 includes an acquisition unit 110 , a processing unit 120 and an output processing unit 130 .
  • the processing system 100 is not limited to the configuration of FIG. 2, and various modifications such as omitting some of these components or adding other components are possible.
  • the acquisition unit 110 acquires the approach angle information of the insertion portion of the endoscope and the energization history information regarding the energization history of the treatment instrument 360 from the endoscope system 300 described later with reference to FIGS. 3 and 4 .
  • the acquisition unit 110 can also be said to be a communication interface that acquires energization history information regarding the energization history of the treatment instrument 360 .
  • The energization history includes, for example, the timing of energization, and may also include the energization period, the magnitude of the energization output, or the like, or any combination of these. A specific acquisition method will be described later.
  • the acquisition unit 110 can be realized by, for example, a communication chip for information acquisition, a processor or a control circuit that controls the communication chip, or the like.
  • the processing unit 120 evaluates the skill of the user who has operated the endoscope system 300 based on the approach angle information and the energization history information.
  • the processing executed by the processing unit 120 is classification processing such as clustering. Details of the skill evaluation will be described later.
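  • As a minimal sketch of such classification processing (not the patent's actual algorithm; the feature layout, labels, and centroid values below are invented for illustration), an operator's feature vector can be assigned the label of the nearest cluster centroid learned from past cases:

```python
def evaluate_skill(features, centroids):
    """Nearest-centroid classification of an operator's feature vector.

    `centroids` maps a skill label to the centroid of a cluster learned
    from past cases; the label of the nearest centroid is returned.
    Minimal stand-in for the classification processing mentioned above.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(features, centroids[label]))

# Invented feature layout: (mean approach angle during energization,
# approach-angle standard deviation, total energization time in seconds).
centroids = {
    "expert":  (22.0, 1.0, 30.0),
    "trainee": (35.0, 15.0, 80.0),
}
```

A real system would learn such centroids by clustering case data; here they are fixed purely to show the classification step.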
  • When processing using a trained model is performed, the processing system 100 includes a storage unit (not shown) that stores the trained model generated by machine learning.
  • the storage unit here serves as a work area for the processing unit 120 and the like, and its function can be realized by a semiconductor memory, a register, a magnetic storage device, or the like.
  • the processing unit 120 reads a learned model from the storage unit and operates according to instructions from the learned model, thereby performing an inference process of outputting a user's skill evaluation result.
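  • One concrete form such a learned model could take is a small fully connected network. The sketch below shows only the inference step of mapping a feature vector to output scores; the layer shapes, weights, and the ReLU/linear split are assumptions for illustration, not taken from the patent.

```python
def mlp_forward(x, weights):
    """Minimal forward pass of a small fully connected network.

    `weights` is a list of (matrix, bias) layer pairs; hidden layers use
    ReLU, and the final layer's values are returned as-is (e.g. as
    skill-evaluation scores). Illustrative sketch only.
    """
    for li, (w, b) in enumerate(weights):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(w, b)]
        if li < len(weights) - 1:            # ReLU on hidden layers only
            x = [max(0.0, v) for v in x]
    return x
```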
  • the processing unit 120 is configured with the following hardware.
  • the hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals.
  • the hardware may consist of one or more circuit devices or one or more circuit elements mounted on a circuit board.
  • the one or more circuit devices are, for example, ICs (Integrated Circuits), FPGAs (field-programmable gate arrays), or the like.
  • the one or more circuit elements are, for example, resistors, capacitors, and the like.
  • Processing unit 120 may be realized by the following processors.
  • Processing system 100 includes a memory that stores information and a processor that operates on the information stored in the memory.
  • the memory here may be the storage unit described above, or may be a different memory.
  • the information is, for example, programs and various data.
  • a processor includes hardware.
  • Various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a DSP (Digital Signal Processor) can be used as the processor.
  • the memory may be a semiconductor memory such as SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory), a register, or a magnetic storage device such as HDD (Hard Disk Drive).
  • it may be an optical storage device such as an optical disc device.
  • the memory stores computer-readable instructions, and the instructions are executed by the processor to implement the functions of the processing unit 120 as processes.
  • the instruction here may be an instruction set that constitutes a program, or an instruction that instructs a hardware circuit of a processor to perform an operation.
  • All or part of the processing unit 120 can be realized by cloud computing, and each process described later can be performed on cloud computing.
  • processing unit 120 of this embodiment may be implemented as a module of a program that runs on a processor.
  • the processing unit 120 is implemented as a processing module that performs skill evaluation based on approach angle information and energization history information.
  • the program that implements the processing performed by the processing unit 120 of this embodiment can be stored, for example, in an information storage device that is a computer-readable medium.
  • the information storage device can be implemented by, for example, an optical disc, memory card, HDD, semiconductor memory, or the like.
  • a semiconductor memory is, for example, a ROM.
  • the processing unit 120 performs various processes of this embodiment based on programs stored in the information storage device. That is, the information storage device stores a program for causing the computer to function as the processing unit 120 .
  • a computer is a device that includes an input device, a processing unit, a storage unit, and an output unit.
  • the program according to the present embodiment is a program for causing a computer to execute each step described later with reference to FIG. 18 and the like.
  • the output processing unit 130 performs processing for outputting skill evaluation information that is the result of skill evaluation by the processing unit 120 .
  • the processing system 100 may include a display unit (not shown), and the output processing unit 130 may perform processing for displaying skill evaluation information on the display unit.
  • the processing system 100 may be connected to the endoscope system 300 via a network.
  • the output processing unit 130 may be a communication device or a communication chip that transmits skill evaluation information via a network.
  • The device that outputs the skill evaluation information is not limited to the endoscope system 300; it may be a PC (Personal Computer) capable of communicating with the processing system 100, or a mobile terminal device such as a smartphone or tablet terminal.
  • As described above, the processing system 100 of this embodiment includes the acquisition unit 110, the processing unit 120, and the output processing unit 130.
  • The acquisition unit 110 acquires approach angle information of the insertion portion of the endoscope and energization history information related to the energization history of the treatment instrument 360.
  • the processing unit 120 performs skill evaluation of the user operating the endoscope based on the approach angle information and the energization history information.
  • the output processing unit 130 also outputs skill evaluation information that is the result of skill evaluation.
  • the processing performed by the processing system 100 of this embodiment may be implemented as an information processing method.
  • In the information processing method, approach angle information of the insertion portion of the endoscope and energization history information related to the energization history of the treatment instrument 360 are acquired, skill evaluation of the user operating the endoscope is performed based on the approach angle information and the energization history information, and skill evaluation information, which is the result of the skill evaluation, is output.
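  • The steps above can be summarized as a four-step pipeline. In this sketch the four callables are hypothetical injection points standing in for the acquisition unit, the processing unit, and the output processing unit; the decomposition is an assumption, not the patent's implementation.

```python
def information_processing_method(acquire_angle, acquire_energization,
                                  evaluate, output):
    """Skeleton of the information processing method described above.

    1) acquire approach-angle information,
    2) acquire energization-history information,
    3) evaluate the operator's skill from both,
    4) output the skill evaluation information.
    """
    angle_info = acquire_angle()
    energization_info = acquire_energization()
    skill_evaluation_info = evaluate(angle_info, energization_info)
    output(skill_evaluation_info)
    return skill_evaluation_info
```

In use, `evaluate` would wrap the classification or trained-model processing, and `output` could write to a display or network interface.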
  • According to the above configuration, the user's skill is evaluated based on both the approach angle information and the energization history information, so the operator's skill can be evaluated with high accuracy.
  • FIG. 3 is a diagram showing a configuration example of the endoscope system 300.
  • the endoscope system 300 includes a scope section 310 , a processing device 330 , a display section 340 and a light source device 350 .
  • An operator uses the endoscope system 300 to perform an endoscopy on a patient.
  • The configuration of the endoscope system 300 is not limited to that shown in FIG. 3, and various modifications such as omitting some components or adding other components are possible. In FIG. 3, illustration of a suction device 370, an air/water supply device 380, and the like, which will be described later with reference to FIG. 4, is omitted.
  • FIG. 3 shows an example in which the processing device 330 is one device connected to the scope section 310 via the connector 310d, but it is not limited to this.
  • part or all of the configuration of the processing device 330 may be constructed by other information processing devices such as a PC or a server system that can be connected via a network.
  • The processing device 330 may be implemented by cloud computing.
  • the scope section 310 has an operation section 310a, a flexible insertion section 310b, and a universal cable 310c including signal lines and the like.
  • the scope section 310 is a tubular insertion device that inserts a tubular insertion section 310b into a body cavity.
  • a connector 310d is provided at the tip of the universal cable 310c.
  • The scope unit 310 is detachably connected to the light source device 350 and the processing device 330 by the connector 310d. Furthermore, as will be described later with reference to FIG. 4, a light guide 315 is inserted through the universal cable 310c, and illumination light from the light source device 350 is guided through the light guide 315 and emitted from the tip of the insertion portion.
  • the insertion portion 310b has a distal end portion 11, a bendable bending portion 12, and a flexible portion 13 from the distal end to the proximal end of the insertion portion 310b.
  • the insertion portion 310b is inserted into the subject.
  • the distal end portion 11 of the insertion portion 310b is the distal end portion of the scope portion 310 and is a hard distal end rigid portion.
  • An objective optical system 311 and an imaging element 312, which will be described later, are provided at the distal end portion 11, for example.
  • the bending portion 12 can be bent in a desired direction according to the operation of the bending operation member provided on the operation portion 310a.
  • the bending operation member includes, for example, a horizontal bending operation knob and a vertical bending operation knob.
  • the operation portion 310a is provided with various operation buttons such as a release button and an air/water supply button.
  • the processing device 330 is a video processor that performs predetermined image processing on the received imaging signal and generates a captured image.
  • a video signal of the generated captured image is output from the processing device 330 to the display unit 340, and the captured image is displayed on the display unit 340 in real time.
  • the configuration of the processing device 330 will be described later.
  • the display unit 340 is, for example, a liquid crystal display or an EL (Electro-Luminescence) display.
  • the light source device 350 is a light source device capable of emitting white light for normal observation mode.
  • the light source device 350 may be capable of selectively emitting white light for normal observation mode and special light such as narrow band light.
  • FIG. 4 is a diagram for explaining the configuration of each part of the endoscope system 300. Note that in FIG. 4, part of the configuration of the scope unit 310 is omitted and simplified.
  • the light source device 350 includes a light source 352 that emits illumination light.
  • the light source 352 may be a xenon light source, an LED (light emitting diode), or a laser light source. Also, the light source 352 may be another light source, and the light emission method is not limited.
  • the insertion section 310b includes the above-described objective optical system 311, imaging element 312, illumination lens 314, light guide 315, suction tube 317, and air/water supply tube 319.
  • the light guide 315 guides illumination light from the light source 352 to the distal end of the insertion portion 310b.
  • the illumination lens 314 irradiates the subject with the illumination light guided by the light guide 315 .
  • the imaging element 312 receives light from the subject via the objective optical system 311 .
  • the imaging element 312 may be a monochrome sensor or an element with color filters.
  • the color filter may be a well-known Bayer filter, a complementary color filter, or other filters.
  • Complementary color filters are filters that include cyan, magenta, and yellow color filters.
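  • Under the idealized assumption that the complementary filters respond as Cy = G + B, Mg = R + B, and Ye = R + G (real complementary-mosaic sensors typically also include green pixels and require calibrated conversion matrices), primary-color values can be recovered by a simple inversion:

```python
def cmy_to_rgb(cy, mg, ye):
    """Invert idealized complementary-filter responses to primaries.

    Assumes the textbook relations Cy = G + B, Mg = R + B, Ye = R + G.
    Illustrative only; not the processing actually performed by the
    endoscope system described here.
    """
    r = (mg + ye - cy) / 2
    g = (ye + cy - mg) / 2
    b = (cy + mg - ye) / 2
    return r, g, b
```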
  • In predetermined cases, the suction device 370 is activated to suck liquid or the like through the suction tube 317.
  • Predetermined cases include, for example, a case where gastric juice or the like interferes with diagnosis and a case where water is collected after treatment is completed, but other cases are also possible.
  • The suction device 370 includes a suction pump, a recovery tank, and the like (not shown). The suction device 370 is connected to the control unit 332 described later; when a suction button (not shown) or the like is pressed, liquid or the like is collected into the recovery tank through the opening 316 and the suction tube 317.
  • The opening 316 also serves as the opening from which a treatment instrument 360, described later, protrudes. In FIG. 4, illustration of the treatment instrument 360 and the tube housing the treatment instrument 360 is omitted.
  • In specific cases, the air/water supply device 380 is activated to supply gas or water through the air/water supply pipe 319.
  • Specific cases include, for example, a case where it is desired to wash away residue around the lesion and a case where it is desired to inflate the area around the lesion from the inside, but other cases are also possible.
  • The air/water supply device 380 includes a pump, a gas cylinder, a water supply tank, and the like (not shown).
  • The air/water supply device 380 is connected to the control unit 332 described later, and when an air supply button or a water supply button (not shown) is pressed, gas or liquid is ejected from the nozzle 318 through the air/water supply pipe 319.
  • Although one air/water supply pipe 319 is schematically illustrated in FIG. 4, a gas pipe and a liquid pipe may be arranged side by side and joined before the nozzle 318.
  • the processing device 330 performs image processing and control of the entire system.
  • the processing device 330 includes a pre-processing section 331 , a control section 332 , a storage section 333 , a detection processing section 335 and a post-processing section 336 .
  • The preprocessing unit 331 performs A/D conversion, which converts analog signals sequentially output from the imaging element 312 into digital images, and various correction processes on the image data after A/D conversion. Note that an A/D conversion circuit may be provided in the imaging element 312, in which case the A/D conversion in the preprocessing unit 331 may be omitted.
  • the correction processing here includes, for example, color matrix correction processing, structure enhancement processing, noise reduction processing, AGC (automatic gain control), and the like.
  • the preprocessing unit 331 may also perform other correction processing such as white balance processing.
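  • As an illustration of the correction processing listed above, a color matrix correction followed by a simple gain (standing in here for AGC) can be applied per pixel. The matrix values, gain handling, and 8-bit clipping below are assumptions for the sketch; the actual preprocessing unit 331 operates on whole frames.

```python
def correct_pixel(rgb, color_matrix, gain=1.0):
    """Apply a 3x3 color matrix correction and a gain to one RGB pixel,
    clipping to the 8-bit range. Illustrative sketch of two of the
    correction steps named above (color matrix correction, AGC)."""
    corrected = [sum(m * c for m, c in zip(row, rgb)) for row in color_matrix]
    return [max(0, min(255, round(v * gain))) for v in corrected]

# Identity matrix: no color mixing, so only the gain and clipping act.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```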
  • the preprocessing unit 331 outputs the processed image to the detection processing unit 335 as an input image.
  • the pre-processing unit 331 also outputs the processed image to the post-processing unit 336 as a display image.
  • the detection processing unit 335 performs detection processing for detecting a region of interest such as a lesion from the input image.
  • Note that the detection processing for the region of interest is not essential, and the detection processing unit 335 can be omitted.
  • the post-processing unit 336 performs post-processing based on the outputs of the pre-processing unit 331 and the detection processing unit 335 and outputs the post-processed image to the display unit 340 .
  • the post-processing unit 336 may add the detection result of the detection processing unit 335 to the display image and display the added image.
  • The user, who is the operator, treats the lesion area in the living body while viewing the image displayed on the display unit 340.
  • the treatment here is, for example, treatment for resecting lesions such as EMR and ESD described above.
  • the control unit 332 is connected to the imaging element 312, the preprocessing unit 331, the detection processing unit 335, the postprocessing unit 336, and the light source 352, and controls each unit.
  • the acquisition unit 110 acquires input data and energization history information, which will be described later, based on control information from the control unit 332, for example.
  • the acquisition unit 110 also acquires approach angle information, which will be described later, based on sensor information from a motion sensor provided in the insertion unit 310b, for example.
  • the processing unit 120 performs skill evaluation using the approach angle information and the energization history information.
  • the output processing unit 130 outputs skill evaluation information to the display unit 340 and external devices connected to the endoscope system 300 .
  • FIG. 5 is a diagram illustrating the configuration of the distal end portion 11 of the insertion portion 310b.
  • the distal end portion 11 has a substantially circular cross-sectional shape, and is provided with an objective optical system 311 and an illumination lens 314 as described above with reference to FIG.
  • the insertion portion 310b is provided with a channel, which is a cavity connecting from the operation portion 310a to the opening portion 316 of the distal end portion 11.
  • the opening 316 here is an opening for a treatment tool 360 called a forceps opening.
  • although FIG. 5 illustrates the configuration of the distal end portion 11 having two systems of illumination lenses 314, one objective optical system 311, one opening 316, and one nozzle 318, the specific configuration can be modified in various ways.
  • the treatment instrument 360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife.
  • High frequency knives include needle knives, IT knives, hook knives, and the like.
  • a needle knife is used for ESD marking.
  • An IT knife is used for the incision.
  • a high-frequency snare or high-frequency knife is used for peeling.
  • the treatment instrument 360 may also include other instruments such as injection needles, forceps, and clips.
  • An injection needle is used for local injection of ESD. Forceps or clips are used to stop bleeding.
  • FIG. 6 is a diagram showing a configuration example of a system including the processing system 100. As shown in FIG. 6, the system includes multiple endoscope systems 300 and the processing system 100.
  • the processing system 100 is a server system connected to each of the endoscope systems 300 via a network.
  • the server system here may be a server provided in a private network such as an intranet, or a server provided in a public communication network such as the Internet.
  • the processing system 100 may be configured by one server device, or may include a plurality of server devices.
  • the processing system 100 may include a database server that collects approach angle information and energization history information from a plurality of endoscope systems 300, and a processing server that performs skill evaluation.
  • the database server may also collect other information, such as physician information, patient information, etc., as described below.
  • the processing system 100 may perform skill evaluation based on machine learning, as described later.
  • the processing system 100 may include a learning server that generates a trained model by performing machine learning using data collected by a database server as learning data.
  • the processing server performs skill evaluation based on the trained model generated by the learning server.
  • when the processing system 100 can be connected to a plurality of endoscope systems 300, data can be collected efficiently. For example, since it is easy to increase the amount of learning data used for machine learning, the accuracy of skill evaluation can be improved.
  • after the operator performs a prescribed treatment and a prescribed period of time has elapsed since the treatment, the skill evaluation sheet 400 shown in FIG. 7 is output as skill evaluation information to a prescribed display unit.
  • the predetermined period is, for example, the period until the patient who is the target of the treatment is discharged from the hospital.
  • the predetermined display unit is, for example, the display unit 340 described above, but may be a display unit of an external device connected to the endoscope system 300 .
  • the predetermined treatment is, for example, the ESD including multiple steps as described above, but may be other treatments including multiple steps. In other words, treatment with the treatment tool 360 includes multiple stages.
  • the skill evaluation sheet 400 includes, for example, a doctor information icon 410 and a case sheet 420.
  • the case sheet 420 displays, for example, the result of comprehensive evaluation and the result of skill evaluation for each stage of treatment as a breakdown of the comprehensive evaluation.
  • the output processing unit 130 outputs skill evaluation information at each stage of the plurality of stages.
  • the skill can be evaluated by subdividing it for each stage, so that the accuracy of the skill evaluation can be further improved.
  • skill evaluation is performed for the stages of marking, local injection, incision, and peeling, but as shown in FIG. 7, skill evaluation may be performed for five stages including hemostasis.
  • the number of treatment stages to be evaluated is not limited to five; other stages may be added, or some of the stages may be omitted.
  • the multiple steps include at least two of a marking step, a local injection step, an incision step, and an ablation step.
  • the evaluation result of each stage is displayed by a marking evaluation icon 440, a local injection evaluation icon 442, an incision evaluation icon 444, a peeling evaluation icon 446, and a hemostasis evaluation icon 448, for example as one of the ranks A, B, C, or D.
  • these evaluation results may be displayed in a radar chart format. By doing so, it is possible to two-dimensionally display the superiority or inferiority of the skill of the operator, so that the skill evaluation can be visually and easily grasped.
  • A is the highest evaluation rank equivalent to that of an expert
  • D is the lowest evaluation rank.
  • the display format of the skill evaluation is not limited to the radar chart; it may be realized in the form of a bar graph, a line graph, or the like, and various modifications are possible.
  • advice information may be output for each stage.
  • the advice information is specifically advice information regarding at least one of the approach angle information and the energization history information.
  • the output processing unit 130 may output advice information regarding at least one of the approach angle information and the energization history information.
  • when the marking evaluation icon 440 is selected on the display screen, a marking advice display 470 is displayed; when the local injection evaluation icon 442 is selected, a local injection advice display 472 is displayed; and when the incision evaluation icon 444 is selected, an incision advice display 474 is displayed. In each case, advice regarding the approach angle information and the energization history information is displayed.
  • a peeling advice display and a hemostatic advice display are also displayed, but they are omitted in FIG.
  • the display method of the advice is not limited to the method shown in FIG. 8, and various modifications such as displaying the advice on another screen are possible.
  • although illustration is omitted, when the comprehensive evaluation icon 430 is selected, advice regarding the comprehensive evaluation may be displayed. As a result, highly accurate skill evaluation can be performed, and specific information can be provided to the operator.
  • the advice information includes, for example, difference information obtained by comparing the approach angle information and energization history information of the operator to be evaluated with the corresponding data of an expert.
  • the output processing unit 130 displays, as advice information, the difference from the expert data regarding at least one of the approach angle information and the energization history information.
  • the advice information may include other information.
  • in the marking advice display 470, a confirmation that an evaluation result equivalent to that of an expert was obtained in the marking stage may be displayed.
  • the reason for the evaluation result may be displayed, or advice may be displayed to prompt the operator to refer to log information, which will be described later, as the reason for the evaluation.
  • skill evaluation is performed for each stage of treatment according to the above-described A to D, but skill evaluation may be performed by further subdividing the period of each stage.
  • a change in the approach angle during the period of energization in each treatment may be subject to skill evaluation.
  • the processing unit 120 may perform skill evaluation based on approach angle information during the energization period of the treatment instrument 360. It is empirically known that treatments performed by expert doctors and trainee doctors differ in the amount of change in the approach angle during the energization period, so the approach angle information during this period is highly informative for skill evaluation.
  • advice may be displayed regarding the approach angle information during the energization period.
  • alternatively, the change in the approach angle during the period preceding the energization period in each treatment may be subject to skill evaluation.
  • the processing unit 120 may perform skill evaluation based on the approach angle information during the period prior to the energization period of the treatment instrument 360 .
  • although the time required for an operation by an expert doctor is generally shorter than that by a novice doctor, it is known that the difference is especially significant in the time required for position adjustment before treatment. In other words, the quality of the setup before treatment is strongly related to the operator's evaluation.
  • the behavior of the approach angle before energization can be used to obtain information about the quality of the setup before treatment, and skill evaluation can be performed based on this information, so the accuracy of skill evaluation can be further improved.
  • advice may be displayed regarding the approach angle information during the period prior to the energization period.
  • the skill evaluation information of this embodiment includes log information of approach angle information.
  • individual case sheets 420 include log data icons 450 .
  • the log data icon 450 is selected, the log information of the approach angle information shown in FIG. 9 is displayed.
  • the output processing unit 130 outputs log information of approach angle information.
  • highly accurate skill evaluation can be performed, and more specific information such as log information of approach angle information can be provided to the operator.
  • FIG. 9 is a diagram illustrating an example of log information of approach angle information.
  • the horizontal axis of FIG. 9 represents time, and the vertical axis represents relative change in approach angle.
  • t1 in FIG. 9 is the timing when treatment is first started. For example, it is the timing at which the protrusion or the like of the treatment instrument 360 is detected for the first time after the insertion portion 310b is inserted.
  • although energization is not necessarily required for the treatment here, it may be included.
  • FIG. 9 it is assumed that energization is not included in the treatment started at t1, but energization is included in the treatments started at t2 and t3. The same applies to FIG. 10 described later.
  • the processing unit 120 obtains the approach angle at t1 as the reference angle.
  • the relative change at t1 is set to 0 degrees, and thereafter the relative change is acquired in time series with the approach angle at t1 as a reference.
  • t2 represents the timing when the treatment is restarted, and resetting the reference angle calibrates the relative angle at t2 to 0 degrees.
  • t3 is also the timing at which the treatment is restarted, and the relative angle at t3 is calibrated to 0 degrees.
  • the approach angle information here is information regarding a relative change in the approach angle with respect to the reference angle when the approach angle at the timing corresponding to the start of treatment is used as the reference angle.
  • by acquiring approach angle information in the form of the amount of change in the approach angle during the period in which the treatment is performed, the operator's skill can be evaluated with higher accuracy.
  • although the approach angle is calibrated to 0 degrees each time treatment starts in FIG. 9 and in FIG. 10 described later, the approach angle may instead be measured as an absolute value. An example of processing for obtaining the approach angle and creating log information will be described later with reference to FIGS. 13 and 14.
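  • the logging described above can be sketched in software as follows. This is a minimal illustrative sketch, not the embodiment's implementation: the function name and the (time, angle) sample layout are assumptions, and the reference angle is simply reset whenever a treatment start is detected (as at t1, t2, and t3).

```python
def build_approach_log(samples, treatment_starts):
    """Build a log of relative approach-angle changes.

    samples: list of (time, absolute_angle_deg) pairs in time order.
    treatment_starts: set of times at which a treatment (re)starts;
    at each such time the current angle becomes the new reference,
    so the relative change is calibrated back to 0 degrees.
    """
    log = []
    reference = None
    for t, angle in samples:
        if t in treatment_starts or reference is None:
            reference = angle  # reset reference angle at each treatment start
        log.append((t, angle - reference))  # relative change vs. reference
    return log

# Example: treatment starts at t=0 and restarts at t=3
log = build_approach_log(
    [(0, 30.0), (1, 32.0), (2, 35.0), (3, 40.0), (4, 38.0)],
    treatment_starts={0, 3},
)
# the relative change returns to 0.0 at each treatment start (t=0 and t=3)
```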
  • log information with an allowable approach angle range added to the above-described relative change in approach angle may be displayed.
  • alert information may be displayed when the relative change in approach angle falls outside the allowable approach angle range.
  • the processing unit 120 obtains an allowable approach angle range and outputs alert information when the angle representing the relative change deviates from that range; the log display shown in FIG. 10 can thereby be realized.
  • the displayed alert information may be text information or image information such as icons.
  • the vertical and horizontal axes in FIG. 10 are the same as those in FIG. 9, and the timings t1 to t3 at which the treatment is determined to start are also the same as in FIG. 9.
  • in FIG. 10, the allowable approach angle range is greater than θ1 and less than θ2.
  • the processing unit 120 determines that the relative change is out of the approach angle range when the relative change in the approach angle is less than or equal to ⁇ 1 or greater than or equal to ⁇ 2. By doing so, the log information can be displayed in more detail.
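  • the alert condition just described can be sketched as follows. This is an illustrative sketch only (function names and the time-stamped sample layout are assumptions): since the allowable range is open (greater than θ1 and less than θ2), a relative change less than or equal to θ1 or greater than or equal to θ2 is treated as out of range, and contiguous out-of-range samples form alert intervals such as t4 to t5 and t6 to t7.

```python
def out_of_range(relative_change, theta1, theta2):
    """Allowable range is theta1 < change < theta2, so a change
    <= theta1 or >= theta2 triggers an alert."""
    return relative_change <= theta1 or relative_change >= theta2

def alert_intervals(log, theta1, theta2):
    """Return (start, end) time intervals during which alert
    information should be output, from (time, change) samples."""
    intervals, start = [], None
    for t, change in log:
        if out_of_range(change, theta1, theta2):
            if start is None:
                start = t  # alert interval begins
        elif start is not None:
            intervals.append((start, t))  # back inside the range
            start = None
    if start is not None:
        intervals.append((start, log[-1][0]))  # still out at end of log
    return intervals
```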
  • the log information of the approach angle information may be displayed in real time on the display unit 340 or the like during treatment of the patient.
  • the alert information described above may be notified in real time at the timing when the relative change moves from within the approach angle range to outside the approach angle range.
  • the notification here may be notification by sound, vibration, or the like, in addition to notification by the display unit 340 or the like.
  • the output processing unit 130 may continue to output alert information between t4 and t5 and between t6 and t7 in FIG. 10. In this way, the operator who is performing the treatment can be immediately notified of the abnormality, so that trouble can be prevented.
  • the skill evaluation information of this embodiment may further include doctor information.
  • for example, by selecting the doctor information icon 410 on the skill evaluation sheet 400, detailed doctor information is displayed. Specifically, the doctor information includes information on individual cases in addition to physical characteristics and a surgical record. The physical characteristics include, for example, height, weight, and hand size.
  • the surgical record includes information such as the number of experienced cases and cumulative surgical time.
  • the case information also includes the time required for the surgery, the degree of difficulty of the surgery, the number of times guidance was received, the content of the guidance, and the like. Note that information specifying a school may be included in the doctor information. As described above, the pros and cons of endoscopic procedures depend on the doctor's experience and tacit knowledge of operations.
  • the doctor information may include name, sex, date of birth, registration information, and date of registration, and these items may be linked to a predetermined database. Further, the doctor information may include academic achievements such as academic conference activities. It is not necessary to display all of the information shown above; for example, the information may be stored in a predetermined storage area in a list format, and only part of it may be displayed on the skill evaluation sheet 400. By including the doctor information in the skill evaluation information in this way, the accuracy of the skill evaluation can be improved.
  • the skill evaluation information of this embodiment may further include patient information.
  • the patient information includes the patient's own information, lesion information, and postoperative status information.
  • the patient's own information includes, for example, name, age, sex, etc., and may include information on whether or not the patient uses an anticoagulant. This information is useful for judging the degree of difficulty of surgery, since the use of anticoagulants makes bleeding more likely.
  • the patient's own information may also include treatment history information. This information is also useful information because, for example, re-exfoliation treatment becomes difficult due to fibrosis or the like at a site that has been subjected to ESD treatment in the past.
  • the lesion information includes site information, tissue characterization information, and bleeding information, and may further include subdivided information as shown in FIG.
  • information on the state after surgery includes information on the amount of bleeding, the incidence of complications, and the number of days of hospitalization.
  • for example, assume that there are a plurality of operators for whom the marking evaluation icon 440, local injection evaluation icon 442, incision evaluation icon 444, peeling evaluation icon 446, and hemostasis evaluation icon 448 on the skill evaluation sheet 400 of FIG. 7 display the same results. Further assume that, when the patient information of these operators is compared, there is a large difference in the incidence of postoperative complications and in the number of days of hospitalization. In this case, the comprehensive evaluation icon 430 on the skill evaluation sheet 400 of a predetermined operator may be displayed as "A+" and that of a specific operator as "A-", so that skill evaluation is performed in more detail.
  • a predetermined operator is an operator whose incidence of complications after surgery is lower than the average value and whose number of days required for hospitalization is less than the average value.
  • a specific operator is an operator who has a postoperative incidence of complications higher than the average value and the number of days required for hospitalization is higher than the average value. It should be noted that even for an operator with a comprehensive evaluation of B, C, etc., a more detailed evaluation may be similarly performed. By including patient information in the evaluation skill information in this way, it is possible to improve the accuracy of skill evaluation.
  • FIG. 13A is a diagram illustrating an example of approach angle information in this embodiment.
  • the approach angle of the present embodiment is information representing the relative angle between the tissue to be treated and the distal end portion 11 of the insertion section 310b; more specifically, it is the angle θ between the straight line L11 along the axis of the insertion section 310b and the plane P11 on the tissue to be treated. That is, it is one of the interior angles of the triangle defined by the perpendicular drawn from the tip 11 to the plane P11, the straight line L11, and the plane P11.
  • the axis of the insertion portion 310b is an axis representing the longitudinal direction of the insertion portion 310b, and is, for example, a straight line passing through the center of the substantially cylindrical insertion portion 310b or a straight line parallel thereto.
  • "parallel" includes substantially parallel.
  • the approach angle can also be said to be information indicating how much the axis of the insertion section 310b lies or stands with respect to the plane P11 representing the tissue to be treated.
  • the approach angle is, for example, a numerical value between 0 degrees and 90 degrees. In this definition, even if the axis rotates around the normal to the plane P11, the approach angle does not change. That is, the approach angle may not include information specifying the direction of approach. However, the approach angle is not limited to this, and the range of values is not limited to 0 degrees or more and 90 degrees or less.
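  • the definition above can be made concrete as follows. If the axis of the insertion section 310b is represented by a direction vector and the plane P11 by its normal vector, the line-plane angle is asin of the normalized dot product, which yields a value in [0, 90] degrees that is unchanged by rotation about the normal, as stated above. This is an illustrative sketch only; the vector representations and function name are assumptions.

```python
import math

def approach_angle_deg(axis_dir, plane_normal):
    """Angle between a line (the insertion-section axis direction) and a
    plane given by its normal: theta = asin(|axis . n| / (|axis| |n|)).
    The absolute value keeps the result in [0, 90] degrees and makes it
    independent of rotation about the plane normal."""
    dot = sum(a * n for a, n in zip(axis_dir, plane_normal))
    na = math.sqrt(sum(a * a for a in axis_dir))
    nn = math.sqrt(sum(n * n for n in plane_normal))
    return math.degrees(math.asin(min(1.0, abs(dot) / (na * nn))))
```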
  • the endoscope system 300 can include a motion sensor provided at the distal end 11 of the insertion section 310b.
  • the motion sensor is, for example, a 6-axis sensor including a 3-axis acceleration sensor and a 3-axis angular velocity sensor.
  • the acceleration sensor is a sensor that detects translational acceleration on each of the XYZ axes.
  • the angular velocity sensor is a sensor that detects angular velocity around each of the XYZ axes.
  • the processing unit 120 obtains the position and orientation of the distal end portion 11 at each timing by accumulating the displacement and rotation amounts output by the motion sensor, using a reference position/posture as the starting point. This makes it possible to express the direction of the straight line L11 at each timing in a given coordinate system.
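  • the accumulation of displacement and rotation described above is a form of dead reckoning. The following is a deliberately simplified planar sketch, not the full 6-axis computation (a real implementation would use rotation matrices or quaternions and gravity compensation); the sample layout and names are illustrative assumptions.

```python
import math

def integrate_motion(samples, dt):
    """Planar dead-reckoning sketch: accumulate yaw rate into heading and
    forward acceleration into velocity and position, all relative to a
    reference pose (position at the origin, heading 0)."""
    x = y = vx = vy = 0.0
    heading = 0.0  # radians, relative to the reference posture
    for ax_body, omega in samples:  # (forward acceleration, yaw rate)
        heading += omega * dt                 # accumulate rotation
        ax = ax_body * math.cos(heading)      # rotate acceleration into
        ay = ax_body * math.sin(heading)      # the reference frame
        vx += ax * dt
        vy += ay * dt
        x += vx * dt                          # accumulate displacement
        y += vy * dt
    return (x, y), heading
```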
  • the plane P11 can be estimated, for example, by obtaining the distance to the treatment target tissue based on the captured image.
  • the endoscope system 300 may include multiple imaging systems in the distal end portion 11 .
  • the processing unit 120 obtains the distance to the subject imaged on the image by performing stereo matching processing based on parallax images imaged by a plurality of imaging systems at different positions. Stereo matching is a well-known technique, and detailed description thereof will be omitted. In this way, the three-dimensional shape of the subject can be estimated. For example, since the processing unit 120 can specify the coordinates of each point of the subject in the camera coordinate system, it is possible to obtain the plane P11 including the treatment target tissue using the camera coordinate system.
  • by acquiring the output of the motion sensor and the captured parallax images, the processing unit 120 can express the straight line L11 and the plane P11 in a common coordinate system and compute the approach angle, which is the angle between them.
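  • at the core of the stereo matching mentioned above is the standard pinhole relation between disparity and depth, Z = f·B/d. The following one-function sketch assumes rectified cameras; the parameter names and units are illustrative assumptions, not values from the embodiment.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic rectified-stereo relation Z = f * B / d.
    focal_px: focal length in pixels; baseline_mm: distance between the
    two imaging systems; disparity_px: horizontal pixel shift found by
    matching the same subject point in the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```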
  • the plane P11 may be a plane perpendicular to the normal vector at a given point of the tissue to be treated, or may be a plane approximating the surface shape of the tissue to be treated.
  • the approach angle in the present embodiment may be any information representing the relationship between the distal end portion 11 of the insertion section 310b and the tissue to be treated, and the approach angle may be determined by methods other than those described above.
  • the processing unit 120 may perform both calculation of the position and orientation of the tip portion 11 based on the motion sensor and estimation of the subject shape based on the parallax image at each timing.
  • for example, the processing unit 120 obtains the plane P11 at a position from which the subject can be broadly overviewed, and continues to use the information on the plane P11 during treatment. In this way, even when it is difficult to obtain information on the treatment target tissue from the image, information on the plane P11 can be appropriately obtained.
  • the processing unit 120 obtains the straight line L11 based on the motion sensor at each timing, and calculates the approach angle using the obtained straight line L11 and the obtained plane P11.
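  • one common way to obtain a plane approximating the surface of the treatment target tissue from reconstructed 3D points is a least-squares fit. The sketch below fits z = a·x + b·y + c via the normal equations with a small Gaussian elimination; it assumes the surface is not vertical in the chosen coordinates, and the function name is an illustrative assumption.

```python
def fit_plane(points):
    """Fit z = a*x + b*y + c to 3D points by ordinary least squares.
    Returns (a, b, c); the plane normal is then (a, b, -1) up to scale."""
    # Build the normal equations: M = A^T A, v = A^T z, rows A = [x, y, 1]
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * z
    # Solve M * coef = v by Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 3):
                M[r][c] -= f * M[i][c]
            v[r] -= f * v[i]
    coef = [0.0] * 3
    for i in reversed(range(3)):
        coef[i] = (v[i] - sum(M[i][j] * coef[j] for j in range(i + 1, 3))) / M[i][i]
    return tuple(coef)
```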
  • endoscope system 300 may include a magnetic sensor provided at distal end 11 .
  • a magnetic sensor includes two cylindrical coils whose center axes are perpendicular to each other.
  • the endoscope system 300 also includes a magnetic field generator (not shown) as a peripheral device. The magnetic sensor detects the position and orientation of the distal end portion 11 by detecting the magnetic field generated by the magnetic field generator.
  • the measurement method on the treatment target tissue side is not limited to the method using parallax images.
  • the processing unit 120 may measure the tissue to be treated by measuring the distance to the subject using a TOF (Time Of Flight) method or a structured light method.
  • the TOF method is a method of measuring the time it takes for a reflected wave of light to reach an image sensor.
  • the structured light method is a method of projecting a plurality of patterns of light onto an object and determining the distance from how each pattern of light appears. For example, there is known a phase shift method of obtaining a phase shift by projecting a pattern whose brightness changes with a sine wave. Since these techniques for estimating the three-dimensional shape of the subject are well known, detailed description thereof will be omitted.
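  • for reference, the TOF principle above reduces to halving the round-trip travel time at the speed of light, d = c·t/2. A one-line sketch (the function name is an illustrative assumption):

```python
def tof_distance_m(round_trip_seconds, c=299_792_458.0):
    """TOF ranging: light travels to the subject and back,
    so the one-way distance is c * t / 2."""
    return c * round_trip_seconds / 2.0
```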
  • the processing unit 120 may also calculate the position of the treatment target tissue in the three-dimensional space by associating a plurality of feature points in a plurality of different captured images.
  • the positions of feature points can be calculated from image information using methods such as SLAM (Simultaneous Localization and Mapping) and SfM (Structure from Motion).
  • the processing unit 120 obtains information of the tissue to be treated by applying a bundle adjustment that optimizes the intrinsic parameters, the extrinsic parameters and the global coordinate point cloud from the image using a non-linear least squares method.
  • specifically, the processing unit 120 performs perspective projection transformation on the world coordinate points of the plurality of extracted feature points using each estimated parameter, and updates each parameter and each world coordinate point so that the reprojection error is minimized.
  • methods such as SfM are publicly known, further detailed description thereof will be omitted. Note that these methods can estimate not only the three-dimensional position of the subject but also the position and orientation of the camera. Therefore, a technique such as SfM may be used to estimate the position and orientation of the distal end portion 11 .
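  • the quantity minimized in the bundle adjustment described above is the reprojection error. The sketch below evaluates it for a pinhole camera fixed at the origin with identity pose, which is a simplification (real bundle adjustment also varies camera poses); all names and parameters are illustrative assumptions.

```python
def reprojection_error(world_pts, observed_px, fx, fy, cx, cy):
    """Sum of squared reprojection errors for a pinhole camera at the
    origin: each world point (X, Y, Z) is perspective-projected and
    compared against its observed pixel (u, v). Bundle adjustment
    minimizes this quantity over parameters and world points."""
    err = 0.0
    for (X, Y, Z), (u, v) in zip(world_pts, observed_px):
        u_proj = fx * X / Z + cx  # perspective projection, x axis
        v_proj = fy * Y / Z + cy  # perspective projection, y axis
        err += (u - u_proj) ** 2 + (v - v_proj) ** 2
    return err
```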
  • the measurement of the tissue to be treated is not limited to using the captured image captured using the endoscope system 300 .
  • a patient's CT (Computed Tomography) image or MRI (Magnetic Resonance Imaging) image may be acquired before treatment using the endoscope system 300.
  • from CT images and MRI images, it is possible to estimate the shape of the periphery of the tissue to be treated and the shape of the organs traversed to reach it.
  • the estimated shape of the organ is the shape at the time of capturing the MR image or the CT image, and the shape of the organ during treatment changes due to various factors.
  • the processing unit 120 acquires information related to the factor and corrects the shape of the organ based on the information, thereby estimating the shape of the organ while the insertion portion 310b is being inserted.
  • the processing unit 120 calculates information of the treatment target tissue, for example, the plane P11, based on the corrected shape of the organ.
  • the information related to the organ shape change factor is, for example, the patient's body position, the pressure inside the lumen, the direction of gravity, the insertion shape of the insertion portion 310b, and the like.
  • the patient's body position may be obtained from the amount of movement of the movable bed, or may be input by the user.
  • the air pressure in the lumen may be estimated from the amount of air supplied or the amount of suction, or may be obtained by providing an air pressure sensor in the insertion section 310b.
  • the direction of gravity can be detected from a motion sensor.
  • the insertion shape of the insertion section 310b may be detected by providing motion sensors or magnetic sensors at a plurality of locations of the insertion section 310b, or may be estimated based on the history of forward/backward operations and bending operations of the insertion section 310b.
  • the approach angle may be an angle formed by a straight line L11, which is the axis of the insertion portion 310b, and a straight line L12, which represents the direction of gravity.
  • the approach angle does not directly correspond to the angle between the tissue to be treated and the distal end portion 11 .
  • the position of the patient is fixed to some extent, and for example, the left lateral decubitus position is used. Therefore, if the position of the treatment target tissue in the organ is known, the relationship between the plane P11 and the direction of gravity is known. Also, even when the body posture is changed, if information representing the body posture can be acquired each time, the relationship between the plane P11 and the direction of gravity is known.
  • the processing unit 120 may obtain the approach angle based on the axis and the direction of gravity.
  • the direction of gravity can be determined using, for example, the motion sensor described above.
  • the processing unit 120 may use the obtained approach angle as it is, or may calculate the angle between the straight line L11 and the plane P11 based on the relationship between the direction of gravity and the plane P11, as described above.
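  • the gravity-based variant reduces to the angle between two vectors, computed from their dot product. The sketch below is illustrative only: the default gravity vector is an assumption, and in practice the gravity direction would come from the motion sensor.

```python
import math

def angle_to_gravity_deg(axis_dir, gravity_dir=(0.0, 0.0, -1.0)):
    """Angle between the insertion-section axis L11 and the gravity
    direction L12: theta = acos(a . g / (|a| |g|)), in degrees."""
    dot = sum(a * g for a, g in zip(axis_dir, gravity_dir))
    na = math.sqrt(sum(a * a for a in axis_dir))
    ng = math.sqrt(sum(g * g for g in gravity_dir))
    # clamp for floating-point safety before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * ng)))))
```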
  • the processing unit 120 detects whether or not treatment has been started on the treatment target tissue (step S401). For example, the processing unit 120 performs processing for detecting the start of treatment based on the usage information of the peripheral device.
  • the processing unit 120 corresponds to, for example, the processing device 330 described above, but may be expanded to include the scope unit 310, the display unit 340, the light source device 350, and the like.
  • the peripheral device corresponds to, for example, the treatment tool 360 for performing treatment, but may be expanded to include a suction device 370, an air/water supply device 380, a power supply device for supplying power to the treatment tool 360, and the like.
  • the processing unit 120 acquires information about the protruding state of the treatment instrument 360 as usage information.
  • the insertion portion 310b is provided with a channel that extends from the operation portion 310a to the opening 316 of the distal end portion 11, and the operator performs treatment by inserting the treatment instrument 360 through the channel.
  • the protruding state of the treatment instrument 360 is information indicating whether or not the distal end of the treatment instrument 360 protrudes from the opening 316 of the distal end portion 11, or the amount of projection.
  • for example, the processing unit 120 acquires usage information regarding the protruding state by determining whether or not the treatment tool 360 appears in the captured image.
  • since the treatment instrument 360 protrudes from the opening 316 of the distal end portion 11 in the axial direction of the insertion portion 310b, the captured image includes an image of the protruding treatment instrument 360.
  • the treatment instrument 360 has lower saturation than living tissue, and its shape is known from its design. Therefore, the processing unit 120 can detect the treatment tool region in the image by performing image processing on the captured image, such as saturation determination processing and matching processing using a reference shape. When the treatment tool 360 is present in the image, the processing unit 120 determines that the treatment tool 360 is protruding and that treatment has started. Moreover, when treatment is not performed, there is no need to protrude the treatment instrument 360, and an unnecessary protrusion may rather harm the living body.
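The saturation-based detection can be sketched as follows; the saturation threshold, the area ratio, and the function names are hypothetical illustration values, and a practical detector would add the matching step against the tool's known reference shape.

```python
import numpy as np

def tool_pixel_mask(rgb, sat_threshold=0.25):
    """Rough low-saturation mask: treatment-tool pixels (grey/metal) have
    lower saturation than living tissue. rgb is an HxWx3 array in [0, 1]."""
    mx = rgb.max(axis=2)
    mn = rgb.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    return sat < sat_threshold

def tool_is_protruding(rgb, area_ratio=0.02):
    """Declare the tool present (treatment starting) when enough
    low-saturation pixels appear in the captured image."""
    return tool_pixel_mask(rgb).mean() > area_ratio
```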
  • the processing unit 120 may determine the protruding state of the treatment instrument 360 based on the output of a sensor provided in the opening 316 corresponding to the forceps port.
  • the processing unit 120 acquires information regarding the energization state of the high-frequency device as usage information. Based on the control signal, the processing unit 120 determines that treatment has started when high-frequency current is being supplied to the high-frequency device.
  • a high-frequency device cannot perform resection or cauterization by simply protruding, and requires supply of high-frequency current. That is, when high-frequency current is supplied to the high-frequency device, there is a higher probability that treatment will be started immediately. Therefore, by using the energized state of the high-frequency device, it is possible to accurately determine the start of treatment.
  • the processing unit 120 performs processing for setting the approach angle at the timing when it is determined that the treatment has started as the reference angle (step S402). For example, the processing unit 120 acquires the sensor information of the motion sensor and the captured image at the timing when it is determined that the treatment tool 360 has protruded, obtains the straight line L11, which is the axis of the insertion portion 310b, and the plane P11 of the tissue to be treated, and then sets the angle formed by the straight line L11 and the plane P11 as the reference angle of the approach angle.
  • until Yes is determined in step S401, the calculation of the relative change in the approach angle is not performed, because the procedure waits until treatment is started. That is, during a period in which no treatment is performed, a graph parallel to the time axis continues to be drawn in the log information of the approach angle information shown in FIGS. 9 and 10.
  • the “approach angle at the timing when the treatment instrument 360 protrudes” in the present embodiment represents the approach angle calculated with the detection of protrusion as a trigger; the determination timing of the protrusion, the acquisition timing of the sensor information, and the acquisition timing of the captured image do not have to match exactly.
  • the processing unit 120 may set the approach angle at the timing corresponding to the energization start timing of the high-frequency device as the reference angle. As described above, by setting the approach angle at the start of treatment as the reference angle, it is possible to appropriately obtain the relative change in the approach angle during one treatment.
  • after setting the reference angle, the processing unit 120 obtains the relative change in the approach angle (step S403). For example, the processing unit 120 acquires sensor information from the motion sensor and obtains information representing the straight line L11 based on the sensor information. The processing unit 120 obtains the angle formed by the straight line L11 and the plane P11 obtained in step S402 as the approach angle at that time. Further, the processing unit 120 calculates the difference between the obtained approach angle and the reference angle set in step S402, and regards the difference as the relative change in the approach angle.
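The reference-angle latching of steps S402 and S403 can be sketched as a small state holder; the class and method names are illustrative.

```python
class ApproachAngleTracker:
    """Latch the approach angle at treatment start as the reference (step
    S402) and report subsequent angles relative to it (step S403)."""

    def __init__(self):
        self.reference = None

    def on_treatment_start(self, angle_deg):
        self.reference = angle_deg         # step S402: set the reference angle

    def relative_change(self, angle_deg):
        if self.reference is None:
            return 0.0                     # before treatment: the log stays flat
        return angle_deg - self.reference  # step S403: difference from reference
```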
  • the output processing unit 130 performs processing for outputting the relative change obtained in step S403 (step S404). For example, the output processing unit 130 performs display processing for displaying the angle corresponding to the relative change in real time at a timing after the timing at which it is determined that the treatment has started.
  • the timing at which it is determined that the treatment has started is the timing at which Yes is determined in step S401, and the subsequent timing is the timing in step S404.
  • log information related to approach angle information can be obtained by the methods shown in FIGS.
  • approach angle information can be acquired at desired timing using the motion sensor described above by a known method.
  • the flexible endoscope is flexible and long, and does not have the rigidity of a robot arm. Therefore, approach angle information cannot be obtained merely from information such as an input signal to the operation portion 310a.
  • as for the energization of the high-frequency device, it is possible to obtain information such as the energization start time, the energization end time, and the number of energizations based on, for example, the input signal to the high-frequency device.
  • energization history information can be acquired as digital data at desired timing by a known technique.
  • the approach angle information and the energization history information can be acquired at the same timing.
  • the same here includes substantially the same.
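Deriving energization history from a logged on/off control signal, as described above, can be sketched as follows; the sample format (time stamp, on/off flag) is an assumption for illustration.

```python
def energization_history(samples):
    """Build energization history from time-stamped on/off samples of the
    high-frequency device's control signal: one (start, end) pair per
    energization period; the length of the result is the number of
    energizations."""
    periods, start = [], None
    for t, on in samples:
        if on and start is None:
            start = t                      # energization start time
        elif not on and start is not None:
            periods.append((start, t))     # energization end time
            start = None
    if start is not None:                  # signal still on at the last sample
        periods.append((start, samples[-1][0]))
    return periods
```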
  • a process of changing the color of the area corresponding to the energized period (the dotted-line sections in FIGS. 9 and 10) may be added. By doing so, the log information can be displayed more clearly and intuitively.
  • Machine learning using the neural network NN1 will be described below, but the method of the present embodiment is not limited to this.
  • machine learning using other models such as an SVM (support vector machine) may be performed, or machine learning using techniques developed from these techniques may be performed.
  • FIG. 15 is a schematic diagram explaining the neural network NN1.
  • the neural network NN1 has an input layer to which data is input, an intermediate layer that performs operations based on the output from the input layer, and an output layer that outputs data based on the output from the intermediate layer.
  • FIG. 15 illustrates a network with two intermediate layers, but the number of intermediate layers may be one, or three or more. Also, the number of nodes included in each layer is not limited to the example in FIG. 15, and various modifications are possible. Considering the accuracy, it is desirable to use deep learning using a multi-layered neural network NN1 for learning in this embodiment.
  • the term “multilayer” as used herein means four or more layers in a narrow sense.
  • the nodes contained in a given layer are combined with the nodes of adjacent layers.
  • a weighting factor is set for each connection.
  • Each node multiplies the output of the preceding node by the weighting factor, and obtains the sum of the multiplication results. Further, each node adds a bias to the total value and applies an activation function to the addition result to obtain the output of that node.
  • the output of the neural network NN1 is obtained.
  • Various functions such as a sigmoid function and a ReLU function are known as activation functions, and these functions can be widely applied in this embodiment.
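The per-node computation described above (weighted sum of the preceding layer's outputs, bias, activation) can be sketched for a tiny network; the layer sizes and weight values are arbitrary illustration values.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def dense_layer(x, W, b, activation=relu):
    """One layer of the network: each node multiplies the preceding layer's
    outputs by its weighting factors, sums them, adds a bias, and applies
    the activation function."""
    return activation(W @ x + b)

# A tiny 2-3-1 network illustrating the forward calculation.
x = np.array([1.0, 2.0])
W1 = np.array([[0.5, -0.2],
               [0.1,  0.3],
               [-0.4, 0.6]])
h = dense_layer(x, W1, np.zeros(3))                      # intermediate layer
y = dense_layer(h, np.array([[1.0, 1.0, 1.0]]),
                np.zeros(1), activation=lambda z: z)     # linear output layer
```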
  • Learning in the neural network NN1 is a process of determining appropriate weighting coefficients.
  • the weighting factor here includes the bias.
  • An example in which processing for generating a trained model is performed in a learning device will be described below.
  • the learning device may be, for example, a learning server included in the processing system 100 as described above, or may be a device provided outside the processing system 100 .
  • the learning device inputs the input data of the learning data to the neural network NN1, and obtains the output by performing forward calculations using the weighting coefficients at that time.
  • the learning device calculates an error function based on the output and the correct label in the learning data. Then, the weighting coefficients are updated so as to reduce the error function.
  • an error backpropagation method can be used to update the weighting coefficients from the output layer toward the input layer.
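The idea of updating the weighting coefficients so as to reduce the error function can be sketched for a single linear layer (a full implementation would propagate the gradient from the output layer toward the input layer through every layer); the learning rate and data are illustrative.

```python
import numpy as np

def backprop_step(W, x, target, lr=0.1):
    """One weight update for a single linear layer: run the forward
    calculation, measure the error, and move the weights in the
    direction that reduces the error function."""
    y = W @ x                        # forward calculation
    err = y - target                 # output-layer error
    grad_W = np.outer(err, x)        # gradient of the error w.r.t. the weights
    return W - lr * grad_W, 0.5 * float(err @ err)

W = np.zeros((1, 2))
x = np.array([1.0, 2.0])
target = np.array([1.0])
losses = []
for _ in range(50):
    W, loss = backprop_step(W, x, target)
    losses.append(loss)
```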
  • the neural network NN1 may be a CNN (Convolutional Neural Network), an RNN (Recurrent Neural Network), or other models.
  • even when these models are used, the processing procedure is the same as described above. That is, the learning device inputs the input data of the learning data to the model and obtains the output by performing forward calculation according to the model configuration using the weighting coefficients at that time.
  • An error function is calculated based on the output and the correct label, and the weighting coefficients are updated so as to reduce the error function.
  • the error backpropagation method can also be used when updating the weighting coefficients of CNN or the like.
  • inputs to the neural network NN1 are, for example, approach angle information and energization history information.
  • approach angle information and energization history information are acquired in one surgery by a given operator.
  • the input of the neural network NN1 is time series data, but may be a statistic calculated based on the time series data.
  • the output of the neural network NN1 is, for example, information representing the rank when the skill of the user to be evaluated is ranked in M stages.
  • M is an integer of 2 or more.
  • rank I is higher in skill than rank I+1.
  • I is an integer of 1 or more and less than M. That is, rank 1 represents the highest skill, and rank M represents the lowest skill.
  • the output layer of neural network NN1 has M nodes.
  • the first node is information representing the likelihood that the skill of the user corresponding to the input data belongs to category 1.
  • the second to M-th nodes are information representing the probabilities that the input data belongs to category 2 to category M, respectively.
  • the M outputs form a set of probability data that sums to one.
  • Category 1 to category M are categories corresponding to rank 1 to rank M, respectively.
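The M-output probability set can be sketched with a softmax output layer; the logit values and M = 3 are illustrative.

```python
import numpy as np

def softmax(logits):
    """Convert the output layer's M values into a probability set that sums
    to one: one likelihood per skill category (category 1 ... category M)."""
    z = np.exp(logits - np.max(logits))   # shift for numerical stability
    return z / z.sum()

# M = 3 ranks; the logit values are arbitrary illustration values.
probs = softmax(np.array([2.0, 1.0, 0.1]))
predicted_rank = int(np.argmax(probs)) + 1   # rank 1 = highest skill
```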
  • the learning device may collect approach angle information and energization history information acquired when a large number of operators perform treatments using flexible endoscopes, together with metadata for each case.
  • the metadata here is, for example, the patient information described above, but may also include doctor information.
  • when the collected data is associated with predetermined patient information, the learning device assigns a correct label of rank A as the evaluation.
  • the predetermined patient information here is, for example, patient information in which the incidence of postoperative complications is lower than the average value and the number of days required for hospitalization is shorter than the average value.
  • when the collected data is associated with specific patient information, a correct label of rank D is assigned as the evaluation.
  • the specific patient information here means information on a patient whose incidence of complications after surgery is significantly higher than the average value and whose number of days required for hospitalization is significantly longer than the average value.
  • the learning device identifies one of the M levels of the operator's skill based on the patient information, which is metadata.
  • the skilled doctor may manually evaluate the skill of each trainee for each case and input the evaluation result into the learning device.
  • a trained model for when the postoperative course is good and a trained model for when the postoperative course is not good may be prepared.
  • the processing unit 120 selects between the trained model for a favorable postoperative course and the trained model for an unfavorable postoperative course.
  • FIG. 17 is a flowchart explaining the learning process of the neural network NN1.
  • the learning device acquires approach angle information for learning and energization history information for learning.
  • the processing of steps S101 and S102 corresponds to, for example, processing by the learning server to read out a set of approach angle information and energization history information from a large amount of data accumulated in the database server.
  • the distinction between the approach angle information and the approach angle information for learning merely reflects the difference between data used in the inference stage for skill evaluation and data used in the learning stage; the data themselves are similar. Also, data used as approach angle information for inference at a given timing may be used as approach angle information for learning at a subsequent timing. The same applies to the energization history information and the energization history information for learning.
  • the learning device acquires the correct label associated with the data read out in step S101.
  • the correct label is, for example, the result of evaluating the skill of the user who has operated the endoscope in M stages, as described above.
  • in step S103, the learning device performs processing for obtaining an error function as the learning processing. Specifically, the learning device inputs the approach angle information and the energization history information to the neural network NN1 and performs forward calculations based on the input and the weighting coefficients at that time. Then, the learning device obtains an error function by comparing the calculation result with the correct label. For example, if the correct label is rank 1, the learning device obtains the error function by setting the correct value of the first node corresponding to category 1 to 1 and the correct values of the second to M-th nodes corresponding to categories 2 to M to 0. Furthermore, in step S103, the learning device performs processing to update the weighting coefficients so as to reduce the error function; the error backpropagation method or the like can be used for this processing, as described above. The processing of steps S101 to S103 corresponds to one learning process based on one piece of learning data.
  • the learning device determines whether or not to end the learning process.
  • the learning device may hold a part of a large amount of learning data as evaluation data.
  • the evaluation data is data for confirming the accuracy of the learning result, and is data that is not used for updating the weighting coefficients.
  • the learning device ends the learning process when the accuracy rate of the estimation process using the evaluation data exceeds a predetermined threshold.
  • if No in step S104, the process returns to step S101 to continue the learning process based on the next piece of learning data. If Yes in step S104, the learning process is terminated.
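The loop of FIG. 17, including the stop condition based on held-out evaluation data, can be sketched as follows; the toy model standing in for NN1 is purely illustrative.

```python
class MajorityModel:
    """Toy stand-in for NN1 used only to make the sketch runnable:
    it 'learns' the most frequent correct label."""
    def __init__(self):
        self.counts = {}

    def update(self, inputs, label):          # step S103 stand-in
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, inputs):
        return max(self.counts, key=self.counts.get) if self.counts else None

def train(model, learning_data, eval_data, accuracy_threshold=0.9, max_epochs=100):
    """Sketch of FIG. 17: repeat steps S101-S103 over the learning data, then
    check step S104 using evaluation data that is never used to update the
    weighting coefficients."""
    for epoch in range(max_epochs):
        for inputs, correct_label in learning_data:   # S101/S102: data + label
            model.update(inputs, correct_label)       # S103: learning processing
        accuracy = sum(model.predict(x) == y for x, y in eval_data) / len(eval_data)
        if accuracy > accuracy_threshold:             # S104: Yes -> end learning
            return epoch
    return max_epochs
```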
  • the learning device transmits the generated learned model information to the processing system 100 .
  • the trained model is stored in a storage unit (not shown) included in the processing system 100 and read by the processing unit 120.
  • Various techniques such as batch learning and mini-batch learning are known in machine learning, and these can be widely applied in the present embodiment.
  • machine learning is supervised learning.
  • the method of this embodiment is not limited to this, and unsupervised learning may be performed.
  • in unsupervised learning, a classification process is performed that classifies a large number of inputs into M categories based on the similarity of feature amounts derived from the input approach angle information and energization history information.
  • the learning device ranks each of the M categories. For example, a category containing a lot of data on experienced doctors is ranked high, and a category containing a lot of data on trainee doctors is ranked low. Whether each piece of data is the data of a skilled doctor or of a trainee doctor can be determined based on the aforementioned doctor information, patient information, and the like. Various modifications can be made to the detailed processing; for example, the learning data may be ranked in M stages in advance, and the learning device may then rank the M categories based on the average value or total value of the ranks of the data included in each category. Even when unsupervised learning is performed, it is possible to generate a trained model that evaluates the user's skill in M stages based on the input, as in the case of supervised learning.
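The category-ranking step described above (ranking each of the M categories by the average pre-assigned rank of its member data) can be sketched as follows; the input format is an assumption for illustration.

```python
import numpy as np

def rank_categories(category_ids, member_ranks, M):
    """Rank the M categories produced by unsupervised classification using
    the average pre-assigned M-stage rank of each category's member data:
    a category rich in skilled-doctor data (small ranks) comes first."""
    means = []
    for c in range(M):
        members = [r for cid, r in zip(category_ids, member_ranks) if cid == c]
        means.append(float(np.mean(members)) if members else float("inf"))
    # rank 1 is the highest skill, so the smallest mean rank comes first
    return sorted(range(M), key=lambda c: means[c])
```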
  • FIG. 18 is a flowchart illustrating a processing example of processing for outputting skill evaluation information.
  • the acquisition unit 110 acquires approach angle information for skill evaluation (step S201), and acquires energization history information (step S202).
  • the processing unit 120 performs inference processing based on the learned model (step S203).
  • the processing unit 120 inputs the approach angle information and the energization history information to the learned model, and obtains the M outputs by performing forward calculations according to the learned weighting coefficients.
  • the processing unit 120 obtains the user's skill evaluation information based on the output. For example, the processing unit 120 evaluates the user's skill in M stages based on the data with the largest value among the M outputs.
  • the processing unit 120 acquires a learned model obtained by performing machine learning that classifies the approach angle information for learning and the energization history information for learning into M categories (M is an integer equal to or greater than 2), and performs skill evaluation based on the approach angle information and the energization history information.
  • the trained model may be generated based on supervised learning or unsupervised learning.
  • the output processing unit 130 outputs skill evaluation information, which is the skill evaluation result (step S204).
  • the skill evaluation information here is, for example, information specifying which of the M evaluation results applies for each of the marking stage evaluation, local injection stage evaluation, incision stage evaluation, peeling stage evaluation, hemostasis stage evaluation, and comprehensive evaluation shown in the figure.
  • in this case, the storage unit of the processing system 100 stores a separate trained model for each evaluation, and the processing of FIG. 18 is performed using the trained model that is the target of skill evaluation. Further, when performing skill evaluation limited to the above-described energization period, a process of extracting data consisting of the energization period and the approach angle information corresponding to that period is performed based on, for example, the created log information, and the processing of FIG. 18 is performed using that data and the trained model corresponding to the stage to which the energization period belongs. The same applies to the case of outputting skill evaluation limited to the period before the energization period, and to the case of adding the advice information shown in FIG.
  • the processing unit 120 of the processing system 100 evaluates the operator's skill by operating according to the learned model.
  • Calculations in the processing unit 120 according to the trained model, that is, calculations for outputting output data based on input data, may be performed by software or by hardware.
  • the sum-of-products operation and the like executed in each node in FIG. 16 may be executed by software.
  • the above calculations may be performed by a circuit device such as an FPGA.
  • the above operations may be performed by a combination of software and hardware.
  • a trained model includes an inference algorithm and weighting factors used in the inference algorithm.
  • An inference algorithm is an algorithm that performs forward calculations and the like based on input data.
  • both the inference algorithm and the weighting coefficient are stored in the storage unit, and the processing unit 120 may perform the inference processing by software by reading out the inference algorithm and the weighting coefficient.
  • the inference algorithm may be implemented by FPGA or the like, and the storage unit may store the weighting coefficients.
  • an inference algorithm including weighting factors may be implemented by an FPGA or the like.
  • the storage unit that stores the information of the trained model is, for example, the built-in memory of the FPGA.
  • N-dimensional feature amount space: the processing unit 120 may obtain an N-dimensional feature amount (N is an integer of 2 or more) based on the approach angle information, the energization history information, and the learned model.
  • the learning device may perform machine learning to classify a plurality of pieces of approach angle information for learning and energization history information for learning into M categories in the same manner as the processing described above with reference to FIGS. 15 and 16 .
  • the acquisition unit 110 acquires approach angle information and energization history information, which are targets of skill evaluation (steps S201 and S202).
  • the processing unit 120 inputs the approach angle information and the energization history information to the learned model, and performs forward calculations according to the learned weighting coefficients.
  • the processing unit 120 obtains the data in the intermediate layer as an N-dimensional feature amount.
  • the neural network NN1 has first to Q-th intermediate layers (Q is an integer of 2 or more), and the values in the J-th intermediate layer having N nodes (J is an integer of 1 or more and Q or less) are used as the N-dimensional feature amount.
  • an N-dimensional feature amount may be obtained by combining outputs from multiple intermediate layers.
  • FIG. 19 is an example of an N-dimensional feature amount space.
  • the horizontal axis represents the feature amount A1 among the N-dimensional feature amounts, and the vertical axis represents the feature amount B1 different from the feature amount A1.
  • although N = 2 here, N may be 3 or more.
  • the processing unit 120 performs skill evaluation based on the position, in the feature amount space, of the N-dimensional feature amount obtained by inputting the approach angle information and the energization history information to be evaluated into the learned model, and on the distances from the centroid positions of the M categories in that feature amount space.
  • the position of the center of gravity here is information obtained based on the positions of a plurality of points included in each category, and is, for example, an average value of a plurality of coordinate values.
  • the centroid position of each category is known at the stage when learning is completed.
  • that is, the processing unit 120 obtains an N-dimensional feature amount (N is an integer equal to or greater than 2) based on the approach angle information, the energization history information, and the learned model, and performs skill evaluation based on the distances between the obtained N-dimensional feature amount and the centroid positions of the M categories.
  • the distance here is, for example, the Euclidean distance, but other distances such as the Mahalanobis distance may be used.
  • the processing unit 120 obtains, among the first to M-th categories, the category having the smallest distance from the N-dimensional feature amount obtained by the forward calculation, and determines that the data to be evaluated belongs to that category.
  • the processing unit 120 determines the evaluation to be the above-mentioned evaluation A when the distance from the centroid position of C11 is the minimum, evaluation B when the distance from the centroid position of C12 is the minimum, and evaluation C when the distance from the centroid position of C13 is the minimum.
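The centroid-distance decision can be sketched as follows; the cluster coordinates and evaluation labels are hypothetical stand-ins for C11, C12, and C13 in FIG. 19.

```python
import numpy as np

def centroid(points):
    """Centroid position of a category: the mean of the feature-space
    positions of the points the category contains."""
    return np.asarray(points, dtype=float).mean(axis=0)

def evaluate(feature, category_points, labels):
    """Assign the data under evaluation to the category whose centroid is
    nearest (Euclidean distance here; the Mahalanobis distance is an
    alternative) and return that category's evaluation label."""
    centroids = [centroid(p) for p in category_points]
    dists = [np.linalg.norm(np.asarray(feature, dtype=float) - c) for c in centroids]
    return labels[int(np.argmin(dists))]

# Hypothetical clusters standing in for C11, C12, C13.
c11 = [[0.0, 0.0], [0.2, 0.1]]
c12 = [[5.0, 5.0], [5.2, 4.8]]
c13 = [[1.0, 8.0], [0.8, 8.2]]
```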
  • the feature amount A1 and the feature amount B1 in FIG. 19 are parameters extracted based on the approach angle information and the energization history information; they are therefore parameters different from the approach angle information and the energization history information themselves.
  • the feature amount A1 may correspond to the approach angle information itself
  • the feature amount B1 may correspond to the energization history information itself.
  • the processing unit 120 may perform skill evaluation based on the distance in the feature amount space defined by the feature amount A1, which is the first feature amount corresponding to the approach angle information, and the feature amount B1, which is the second feature amount corresponding to the energization history information. For example, C11 in FIG. 19 is a category with a short energization time and a small relative change in the approach angle, so it is determined to correspond to evaluation A described above. C12 is a category with a long energization time and a large relative change in the approach angle, so it is determined to correspond to evaluation B. Further, C13 has a shorter energization time than C12, but the relative change in the approach angle is larger and the risk is higher than C12, so it is determined to correspond to evaluation C. By doing so, skill evaluation using the approach angle information and the energization history information can be performed more appropriately, enabling more accurate skill evaluation.
  • in step S204, the output processing unit 130 outputs skill evaluation information that is the result of the skill evaluation.
  • an N-dimensional feature amount may be extracted by performing principal component analysis on inputs based on approach angle information and energization history information. Since the method of performing principal component analysis is well known, detailed description thereof will be omitted. A method of performing principal component analysis using machine learning is also known, and machine learning can be applied in that case as well. The processing after N-dimensional feature quantity extraction is the same as the above example.
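The principal component analysis route to an N-dimensional feature amount can be sketched as follows; the input matrix shape is an assumption for illustration.

```python
import numpy as np

def pca_features(X, n_components=2):
    """Principal component analysis sketch: centre the input matrix built
    from approach angle information and energization history information,
    then project onto the directions of largest variance to obtain an
    N-dimensional feature amount (N = n_components)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)                 # ascending eigenvalues
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return Xc @ top
```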
  • the skill evaluation method is not limited to the above.
  • the processing unit 120 may perform skill evaluation based on the distance between the plot point corresponding to the user to be evaluated and the plot point corresponding to the second user different from the user.
  • the second user here is, for example, an instructor, and the user to be evaluated is a user who receives guidance from the instructor. In this way, an index indicating how close the skill of the user to be evaluated is to the skill of the instructor can be output as the skill evaluation information.
  • skill evaluation information which is the result of skill evaluation of the user operating the endoscope, is output based on the approach angle information and the energization history information.
  • the information to be output is not limited to this, and notification information regarding the relative change in the approach angle may be output.
  • an embodiment relating to a processing system 2100 that outputs notification information regarding a relative change in approach angle will be described.
  • each treatment included in ESD will be mainly described, but the method of this modified example can be extended to other treatments for a living body.
  • the tissue is cut while high-frequency current is supplied to the high-frequency device.
  • the approach angle in treatment is directly linked to safety.
  • the captured image captured by the imaging system of the endoscope becomes a close-up image. Therefore, it is difficult for the operator to recognize changes in the angle of the distal end of the endoscope from the captured image.
  • the treatment is performed by burrowing under the tissue. Therefore, it is not easy for the operator to estimate the position and orientation of the endoscope even when viewing the captured image.
  • OB21 in FIGS. 20(A) to 20(F) is a lesion.
  • the tissue to be treated is OB21 or its surrounding tissue.
  • the line indicated by E21 is the line indicating the boundary between the submucosal layer and the muscle layer.
  • FIG. 20A shows a state in which the treatment instrument 2360 is protruded from the distal end portion 2011 of the insertion section 2310b and approach to the treatment target tissue is started.
  • FIG. 20(B) represents a captured image in the state shown in FIG. 20(A).
  • in this state, the treatment tool 2360 and the tissue to be treated are not close enough to contact each other. Therefore, compared with the state during treatment described later, it is relatively easy for the operator to estimate the relative relationship between the insertion portion 2310b and the tissue to be treated based on the captured image.
  • FIG. 20(C) represents a state in which incision is being performed
  • FIG. 20(D) represents a captured image during incision.
  • the tip 2011 is hidden under the tissue raised by the incision. Therefore, as shown in FIG. 20(D), the captured image is in a state where the tissue covers the entire screen, and it is difficult for the operator to estimate the approach angle from the captured image.
  • FIG. 20(E) shows a state in which the approach angle changes greatly during incision and becomes a dangerous angle that increases bleeding
  • FIG. 20(F) shows the captured image at that time.
  • the distal end portion 2011 maintains a state of getting under the raised tissue.
  • the captured image changes little from the state of FIG. 20(D), and it is not easy for the operator to notice that the approach angle has changed from the captured image. For example, when the treatment tool 2360 penetrates deeply into the tissue and the amount of bleeding increases, the operator can grasp from the captured image that the operator is in a dangerous state, but it is difficult to predict the danger before the bleeding occurs.
  • since the endoscope is a flexible endoscope having a flexible section 2013, the region from the distal end of the insertion section to the operation section at hand is flexible and long. As a result, tactile sensation and force are hardly conveyed to the operator.
  • conventionally, the operator, who is an endoscopist, performs the procedure based on his or her own empirical rules in the absence of sufficient information regarding images, sensation, and force. For example, the operator performs treatment while imagining the actual position and posture of the endoscope with respect to the lesion. For this reason, even experienced doctors cannot express in words how and when to operate; in other words, the transition of the approach angle in treatment has conventionally been "tacit knowledge." It is therefore desirable to perceive danger at the stage before bleeding occurs, but such a response is difficult unless the operator has a great deal of experience.
  • if the operator does not notice a change in the approach angle, the treatment instrument 2360 may come into contact with the tissue in an unintended manner, which is very dangerous. Therefore, even if the change in the approach angle itself is small, the risk is high when the operator does not notice it. Conversely, even if the change in the approach angle is large, the degree of risk is relatively small when the operator is aware of it, because the operator can be expected to adjust how the distal end portion 2011 is moved.
  • FIG. 21 is a diagram showing a configuration example of a processing system 2100 according to this modification.
  • the processing system 2100 includes a processing section 2110 that obtains the approach angle of the insertion section 2310b of the endoscope system 2300, and an output processing section 2120.
  • the processing unit 2110 sets the approach angle at the timing when it is determined that the treatment using the endoscope system 2300 is started on the living body as the reference angle, and obtains the relative change of the approach angle with respect to the reference angle. Then, the output processing unit 2120 performs output processing of notification information regarding the relative change.
  • the configuration of the processing system 2100 is not limited to that shown in FIG. 21, and various modifications such as adding other components are possible.
  • this modification obtains the relative change in the approach angle with the start of treatment as the reference.
  • a relative change is numerical data representing, for example, a change angle.
  • it is assumed that the operator positions the tissue to be treated in a bird's-eye view, protrudes the treatment instrument 2360 from the distal end portion 2011 as shown in FIG. 20A, and then starts approaching the tissue to be treated.
  • Positioning represents a step of determining the position and posture of the insertion section 2310b when approaching. At the positioning stage, the distance between the distal end portion 2011 and the tissue to be treated is greater than after the start of the approach.
  • the processing unit 2110 of the processing system 2100 is configured with the following hardware.
  • the hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals.
  • the hardware may consist of one or more circuit devices or one or more circuit elements mounted on a circuit board.
  • the one or more circuit devices are for example ICs, FPGAs or the like.
  • the one or more circuit elements are, for example, resistors, capacitors, and the like.
  • processing unit 2110 may be realized by the following processor.
  • Processing system 2100 includes a memory (not shown) that stores information and a processor that operates based on the information stored in the memory.
  • the information is, for example, programs and various data.
  • a processor includes hardware.
  • Various processors such as CPU, GPU, and DSP can be used as the processor.
  • the memory may be a semiconductor memory such as an SRAM or a DRAM, a register, a magnetic storage device such as an HDD, or an optical storage device such as an optical disc device.
  • the memory stores computer-readable instructions, and the functions of the processing unit 2110 are realized as processes when the instructions are executed by the processor.
  • the instruction here may be an instruction set that constitutes a program, or an instruction that instructs a hardware circuit of a processor to perform an operation. Further, all or part of each part of the processing unit 2110 can be realized by cloud computing, and each process described later can be performed on cloud computing.
  • processing unit 2110 may be implemented as a module of a program that runs on a processor.
  • processing section 2110 is implemented as a processing module that obtains relative changes in the approach angle of the insertion section 2310b.
  • the program that implements the processing performed by the processing unit 2110 can be stored in, for example, an information storage device that is a computer-readable medium.
  • the information storage device can be implemented by, for example, an optical disc, memory card, HDD, semiconductor memory, or the like.
  • a semiconductor memory is, for example, a ROM.
  • the processing unit 2110 performs various processes based on programs stored in the information storage device. That is, the information storage device stores a program for causing the computer to function as the processing unit 2110 .
  • a computer is a device that includes an input device, a processing unit, a storage unit, and an output unit.
  • the program according to this modification is a program for causing a computer to execute each step described later with reference to FIGS. 26 and 32.
  • the output processing unit 2120 performs processing for outputting notification information regarding the relative change in the approach angle.
  • the output process is, for example, a process of displaying a screen containing notification information on the display unit.
  • the display unit here is, for example, the display unit 2340 of the endoscope system 2300 described later with reference to FIGS. 22 and 23 .
  • the display unit may be included in the processing system 2100 or may be included in another device capable of communicating with the processing system 2100 .
  • the output processing unit 2120 may be a display interface that directly controls the display unit.
  • the output processing unit 2120 may be a communication interface for transmitting the display image itself or information for generating the display image to a device having a display unit.
  • the output processing unit 2120 may also include a control processor, control circuit, or the like that controls the display interface or the communication interface.
  • the output processing unit 2120 may output sound or light.
  • the output processing unit 2120 may be implemented by a light emitting unit such as an LED, a speaker, or the like, or may include a control circuit or the like for controlling these.
  • processing system 2100 may be included in an endoscope system 2300 that will be described later with reference to FIGS. 22 and 23.
  • for example, the processing system 2100 is included in the processing device 2330 of the endoscope system 2300.
  • the processing system 2100 may be provided as a device separate from the endoscope system 2300 .
  • the processing system 2100 may be realized by a PC connected to the processing device 2330, or by a server system connected via a network.
  • the network here may be a private network or a public communication network such as the Internet.
  • the network may be wired or wireless.
  • Processing system 2100 may also be implemented by distributed processing of a plurality of devices.
  • the processing system 2100 may be implemented by two or more of the processing device 2330, PC, and server system.
  • FIG. 22 is a diagram showing the configuration of an endoscope system 2300.
  • the endoscope system 2300 includes a scope section 2310 , a processing device 2330 , a display section 2340 and a light source device 2350 .
  • the operator uses the endoscope system 2300 to perform an endoscopic examination of the patient.
  • the configuration of the endoscope system 2300 is not limited to that shown in FIG. 22, and various modifications such as omitting some components or adding other components are possible.
  • the endoscope system 2300 is a flexible endoscope used for diagnosis of digestive organs, for example.
  • FIG. 22 shows an example in which the processing device 2330 is one device connected to the scope section 2310 via the connector 2310d, but it is not limited to this.
  • part or all of the configuration of the processing device 2330 may be constructed by other information processing devices such as a PC or a server system that can be connected via a network.
  • the processing device 2330 may be implemented by cloud computing.
  • the scope section 2310 has an operation section 2310a, a flexible insertion section 2310b, and a universal cable 2310c including signal lines and the like.
  • the scope section 2310 is a tubular insertion device that inserts a tubular insertion section 2310b into a body cavity.
  • a connector 2310d is provided at the tip of the universal cable 2310c.
  • the scope unit 2310 is detachably connected to the light source device 2350 and the processing device 2330 by the connector 2310d. Furthermore, as will be described later with reference to FIG. 23, a light guide 2315 is inserted through the universal cable 2310c, and illumination light from the light source device 2350 is guided through the light guide 2315 and emitted from the distal end of the insertion portion.
  • the insertion portion 2310b has a distal end portion 2011, a bendable bending portion 2012, and a flexible portion 2013 from the distal end to the proximal end of the insertion portion 2310b.
  • the insertion portion 2310b is inserted into the subject.
  • the distal end portion 2011 of the insertion portion 2310b is the distal end of the scope section 2310 and is a rigid distal-end portion.
  • An objective optical system 2311 and an imaging element 2312, which will be described later, are provided at the distal end portion 2011, for example.
  • the bending portion 2012 can bend in a desired direction according to the operation of the bending operation member provided on the operation portion 2310a.
  • the bending operation member includes, for example, a horizontal bending operation knob and a vertical bending operation knob.
  • the operation portion 2310a may be provided with various operation buttons such as a release button and an air/water supply button.
  • the processing device 2330 is a video processor that performs predetermined image processing on the received imaging signal and generates a captured image.
  • a video signal of the generated captured image is output from the processing device 2330 to the display unit 2340 , and the live captured image is displayed on the display unit 2340 .
  • the configuration of the processing device 2330 will be described later.
  • the display unit 2340 is, for example, a liquid crystal display, an EL display, or the like.
  • a light source device 2350 is a light source device capable of emitting white light for normal observation mode.
  • the light source device 2350 may be capable of selectively emitting white light for normal observation mode and special light such as narrow band light.
  • FIG. 23 is a diagram for explaining the configuration of each part of the endoscope system 2300.
  • the light source device 2350 includes a light source 2352 that emits illumination light.
  • the light source 2352 may be a xenon light source, an LED, or a laser light source. Also, the light source 2352 may be another light source, and the light emission method is not limited.
  • the insertion portion 2310b includes an objective optical system 2311, an imaging device 2312, an illumination lens 2314, and a light guide 2315.
  • the light guide 2315 guides illumination light from the light source 2352 to the tip of the insertion portion 2310b.
  • the illumination lens 2314 irradiates the subject with the illumination light guided by the light guide 2315 .
  • the objective optical system 2311 forms a subject image from light reflected by the subject.
  • the imaging element 2312 receives light from the subject via the objective optical system 2311 .
  • the imaging element 2312 may be a monochrome sensor or an element with color filters.
  • the color filter may be a well-known Bayer filter, a complementary color filter, or other filters.
  • Complementary color filters are filters that include cyan, magenta, and yellow color filters.
  • the processing device 2330 performs image processing and controls the entire system.
  • the processing device 2330 includes a preprocessing section 2331 , a control section 2332 , a storage section 2333 , a detection processing section 2335 and a postprocessing section 2336 .
  • the preprocessing unit 2331 performs A/D conversion for converting analog signals sequentially output from the image sensor 2312 into digital images, and various correction processes for image data after A/D conversion. Note that an A/D conversion circuit may be provided in the image sensor 2312 and the A/D conversion in the preprocessing section 2331 may be omitted.
  • the correction processing here includes, for example, color matrix correction processing, structure enhancement processing, noise reduction processing, AGC, and the like.
  • the preprocessing unit 2331 may also perform other correction processing such as white balance processing.
  • the preprocessing unit 2331 outputs the processed image to the detection processing unit 2335 as an input image.
  • the pre-processing unit 2331 also outputs the processed image to the post-processing unit 2336 as a display image.
  • the detection processing unit 2335 performs detection processing for detecting a region of interest such as a lesion from the input image.
  • the attention area detection processing is not essential, and the detection processing unit 2335 can be omitted.
  • a post-processing unit 2336 performs post-processing based on the outputs of the pre-processing unit 2331 and the detection processing unit 2335 and outputs the post-processed image to the display unit 2340 .
  • the post-processing unit 2336 may add the detection result of the detection processing unit 2335 to the display image and display the image after addition.
  • the control unit 2332 is connected to the imaging device 2312, the preprocessing unit 2331, the detection processing unit 2335, the postprocessing unit 2336, and the light source 2352, and controls each unit.
  • a processing unit 2110 and an output processing unit 2120 are added to the configuration of FIG.
  • the processing unit 2110 obtains the relative change in the approach angle using a method described later.
  • the output processing unit 2120 displays notification information regarding the relative change.
  • the display unit 2340 displays a display screen including a display image output from the post-processing unit 2336 and notification information output from the output processing unit 2120, for example.
  • the output processing unit 2120 may be realized by the post-processing unit 2336 .
  • FIG. 24 is a diagram illustrating the configuration of the distal end portion 2011 of the insertion portion 2310b.
  • the distal end portion 2011 has a substantially circular cross-sectional shape, and is provided with an objective optical system 2311 and an illumination lens 2314 as described above with reference to FIG.
  • the insertion portion 2310b is provided with a channel, which is a cavity, connecting from the operation portion 2310a to the opening portion 2316 of the distal end portion 2011.
  • the opening 2316 here is an opening for a treatment tool 2360 called a forceps opening. However, the opening 2316 may be used for air supply or suction.
  • FIG. 24 illustrates the configuration of the distal end portion 2011 having two illumination lenses 2314, one objective optical system 2311, and one opening 2316, but the specific configuration can be modified in various ways.
  • the processing performed by the processing system 2100 may be implemented as an information processing method.
  • the information processing method is an information processing method relating to treatment using an endoscope: the approach angle of the insertion portion 2310b of the endoscope at the timing when it is determined that treatment using the endoscope has started on the living body is set as the reference angle; the relative change in the approach angle with respect to the reference angle is obtained at a timing after the reference angle is set; and processing for outputting notification information regarding the relative change is performed.
  • FIG. 25A is a diagram illustrating an example of approach angles in this modified example.
  • the approach angle is information representing the relative angle between the treatment target tissue and the distal end portion 2011 of the insertion section 2310b.
  • the approach angle is the angle between the straight line L21, which is the axis of the insertion section 2310b, and the plane P21 on the treatment target tissue.
  • the axis of the insertion portion 2310b is an axis representing the longitudinal direction of the insertion portion 2310b, and is, for example, a straight line passing through the center of the substantially cylindrical insertion portion 2310b, or a straight line parallel (including substantially parallel) thereto.
  • a treatment instrument 2360 is protruded from an opening 2316 provided at the distal end portion 2011 of the insertion section 2310b, and ESD or the like is performed by the treatment instrument 2360.
  • therefore, the axis of the insertion portion 2310b is, more specifically, the axis of the distal end portion 2011 of the insertion portion 2310b.
  • for example, the approach angle is one of the internal angles of the triangle defined by a perpendicular line drawn from the distal end portion 2011 to the plane P21, the straight line L21, and the plane P21.
  • the approach angle here is information representing how steeply the insertion section 2310b approaches, i.e., whether its axis lies flat against or stands up relative to the plane P21 representing the tissue to be treated.
  • the approach angle is, for example, a numerical value between 0 degrees and 90 degrees. In this definition, even if the axis rotates about the normal to the plane P21, the approach angle does not change. That is, the approach angle may not include information specifying the direction of approach.
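The line-plane relationship described above can be expressed numerically. The following Python fragment is an illustrative sketch, not part of the disclosed embodiment; the function name and the representation of the axis L21 and the normal of plane P21 as 3-D vectors are assumptions. It computes the angle between a line and a plane, which falls within [0, 90] degrees and is unchanged by rotation of the axis about the plane normal, matching the definition above:

```python
import math

def approach_angle_deg(axis_dir, plane_normal):
    """Angle between the scope axis (straight line L21) and the tissue
    plane P21, in degrees within [0, 90]. The line-plane angle equals
    asin(|d . n| / (|d| |n|)), i.e. 90 degrees minus the angle between
    the line direction d and the plane normal n."""
    dot = abs(sum(a * b for a, b in zip(axis_dir, plane_normal)))
    norm_d = math.sqrt(sum(a * a for a in axis_dir))
    norm_n = math.sqrt(sum(n * n for n in plane_normal))
    # Clamp for floating-point safety before taking the arcsine.
    return math.degrees(math.asin(min(1.0, dot / (norm_d * norm_n))))
```

For example, an axis parallel to the plane yields 0 degrees, and an axis along the plane normal yields 90 degrees, so the value does not encode the direction of approach, only its steepness.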
  • the approach angle is not limited to this, and the range of values is not limited to 0 degrees or more and 90 degrees or less.
  • the endoscope system 2300 includes a motion sensor provided at the distal end 2011 of the insertion section 2310b.
  • the motion sensor is, for example, a 6-axis sensor including a 3-axis acceleration sensor and a 3-axis angular velocity sensor.
  • the acceleration sensor is a sensor that detects translational acceleration on each of the XYZ axes.
  • the angular velocity sensor is a sensor that detects angular velocity around each of the XYZ axes.
  • the processing unit 2110 accumulates the displacement and rotation amounts detected by the motion sensor, using a reference position/posture as the reference, to obtain the position and posture of the distal end portion 2011 at each timing. This makes it possible to express the direction of the straight line L21 at each timing in a given coordinate system.
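The accumulation of displacement and rotation from a reference position/posture is dead reckoning. The planar sketch below is an illustrative assumption (the actual system would integrate 3-axis acceleration and 3-axis angular velocity in three dimensions, but the accumulation principle is the same):

```python
import math

def integrate_pose(samples, dt, x=0.0, y=0.0, heading=0.0):
    """Dead-reckon a planar pose relative to a reference pose.
    Each sample is (forward_velocity, angular_velocity); dt is the
    sampling interval. Rotation is accumulated first, then the
    displacement is accumulated along the current heading."""
    for v, w in samples:
        heading += w * dt                # accumulate rotation
        x += v * math.cos(heading) * dt  # accumulate displacement
        y += v * math.sin(heading) * dt
    return x, y, heading
```

Because errors also accumulate, a practical implementation periodically re-references the pose, e.g. at the start of treatment as described in this modification.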
  • the plane P21 can be estimated, for example, by obtaining the distance to the treatment target tissue based on the captured image.
  • endoscope system 2300 may include multiple imaging systems in distal portion 2011 .
  • the processing unit 2110 obtains the distance to the subject imaged on the image by performing stereo matching processing based on parallax images imaged by a plurality of imaging systems at different positions. Stereo matching is a well-known technique, and detailed description thereof will be omitted. In this way, the three-dimensional shape of the subject can be estimated. For example, since the processing unit 2110 can specify the coordinates of each point of the subject in the camera coordinate system, it is possible to obtain the plane P21 including the treatment target tissue using the camera coordinate system.
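Once stereo matching has yielded a disparity for a matched feature, the distance follows from the standard rectified-stereo relation Z = f * B / d. A minimal sketch (the function name and parameter units are illustrative assumptions):

```python
def stereo_depth(focal_px, baseline_mm, disparity_px):
    """Depth of a matched point from a rectified stereo pair:
    Z = f * B / d, where f is the focal length in pixels, B the
    baseline between the two imaging systems, and d the horizontal
    shift of the matched feature between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_mm / disparity_px
```

Applying this to every matched pixel gives the point cloud from which the plane P21 containing the treatment target tissue can be fitted.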
  • by acquiring the output of the motion sensor and the captured parallax images, the processing unit 2110 can express the straight line L21 and the plane P21 in a common coordinate system and compute the approach angle, which is the angle between them.
  • the plane P21 may be a plane orthogonal to the normal vector at a given point of the treatment target tissue, or may be a plane approximating the surface shape of the treatment target tissue.
  • the approach angle in this modification may be any information that represents the angular relationship between the distal end portion 2011 of the insertion section 2310b and the tissue to be treated, and the approach angle may be determined by methods other than those described above.
  • the processing unit 2110 may perform both calculation of the position and orientation of the distal end portion 2011 based on the motion sensor and estimation of the subject shape based on the parallax image at each timing.
  • for example, the processing unit 2110 obtains the plane P21 at a position that gives a relatively broad, bird's-eye view of the subject, and continues to use that information on the plane P21 during treatment. In this way, even if it is difficult to obtain information on the treatment target tissue from the image, information on the plane P21 can be appropriately obtained.
  • the processing unit 2110 obtains the straight line L21 based on the motion sensor at each timing, and calculates the approach angle using the obtained straight line L21 and the obtained plane P21.
  • endoscope system 2300 may include a magnetic sensor provided on distal end 2011 .
  • a magnetic sensor includes two cylindrical coils whose center axes are perpendicular to each other.
  • the endoscope system 2300 also includes a magnetic field generator (not shown) as a peripheral device. The magnetic sensor detects the position and orientation of the distal end portion 2011 by detecting the magnetic field generated by the magnetic field generator.
  • the measurement method on the treatment target tissue side is not limited to the method using parallax images.
  • the processing unit 2110 may measure the treatment target tissue side by measuring the distance to the subject using the TOF method or the structured light method.
  • the TOF method is a method of measuring the time it takes for a reflected wave of light to reach an image sensor.
  • the structured light method is a method of projecting a plurality of patterns of light onto an object and determining the distance from how each pattern of light appears. For example, there is known a phase shift method of obtaining a phase shift by projecting a pattern whose brightness changes with a sine wave. Since these techniques for estimating the three-dimensional shape of the subject are well known, detailed description thereof will be omitted.
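For the TOF method mentioned above, the distance follows directly from the round-trip time of light. A minimal sketch (the function name and units are illustrative assumptions):

```python
def tof_distance_mm(round_trip_s):
    """Distance from a time-of-flight measurement. The light travels
    to the subject and back, so distance = c * t / 2."""
    C_MM_PER_S = 299_792_458_000.0  # speed of light in mm/s
    return C_MM_PER_S * round_trip_s / 2.0
```

A round trip of one nanosecond thus corresponds to roughly 150 mm, which illustrates the timing resolution such sensors require at endoscopic working distances.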
  • the processing unit 2110 may also calculate the position of the treatment target tissue in the three-dimensional space by associating a plurality of feature points in a plurality of different captured images.
  • the positions of feature points can be calculated from image information using methods such as SLAM and SfM.
  • the processing unit 2110 obtains information of the tissue to be treated by applying a bundle adjustment that optimizes the intrinsic parameters, the extrinsic parameters and the global coordinate point cloud from the image using a non-linear least squares method.
  • specifically, the processing unit 2110 performs perspective projection transformation on the world coordinate points of the extracted feature points using each estimated parameter, and optimizes each parameter and each world coordinate point so that the reprojection error is minimized.
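The reprojection error that bundle adjustment minimizes can be illustrated with a simple pinhole model. The sketch below is an illustrative assumption: the rotation and translation between world and camera frames are omitted for brevity, so each 3-D point is projected directly through the intrinsic parameters (fx, fy, cx, cy):

```python
def reprojection_error(points_3d, points_2d, fx, fy, cx, cy):
    """Mean squared distance between observed 2-D feature positions
    and the perspective projections of their estimated 3-D points.
    Bundle adjustment minimizes this quantity over the camera
    parameters and the world coordinate point cloud."""
    err = 0.0
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        u_proj = fx * X / Z + cx  # pinhole perspective projection
        v_proj = fy * Y / Z + cy
        err += (u - u_proj) ** 2 + (v - v_proj) ** 2
    return err / len(points_3d)
```

In the actual non-linear least-squares optimization, this error is evaluated per camera pose and iteratively reduced by adjusting both the parameters and the point cloud.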
  • the measurement of the tissue to be treated is not limited to using the captured image captured using the endoscope system 2300 .
  • CT or MRI images of the patient may be acquired prior to treatment using endoscopic system 2300 .
  • the processing unit 2110 acquires information related to factors that change the organ shape and corrects the organ shape obtained from those images based on that information, thereby estimating the shape of the organ while the insertion section 2310b is inserted.
  • the processing unit 2110 calculates information of the treatment target tissue, for example, the plane P21, based on the corrected shape of the organ.
  • the information related to the organ shape change factor is, for example, information such as the patient's posture, the pressure inside the lumen, the direction of gravity, and the insertion shape of the insertion portion 2310b.
  • the patient's body position may be obtained from the amount of movement of the movable bed, or may be input by the user.
  • the air pressure in the lumen may be estimated from the air supply amount or the suction amount, or may be obtained by providing an air pressure sensor in the insertion section 2310b.
  • the direction of gravity can be detected from a motion sensor.
  • the insertion shape of the insertion section 2310b may be detected by providing motion sensors or magnetic sensors at a plurality of locations on the insertion section 2310b, or may be estimated based on the history of advance/retract operations and bending operations of the insertion section 2310b.
  • the approach angle may be an angle formed by a straight line L21, which is the axis of the insertion portion 2310b, and a straight line L22, which represents the direction of gravity.
  • in this case, the approach angle does not directly correspond to the angle between the tissue to be treated and the distal end portion 2011.
  • the position of the patient is fixed to some extent, and for example, the left lateral decubitus position is used. Therefore, if the position of the treatment target tissue in the organ is known, the relationship between the plane P21 and the direction of gravity is known. Also, even when the body posture is changed, if the information representing the body posture can be acquired each time, the relationship between the plane P21 and the direction of gravity is known.
  • the processing unit 2110 may obtain the approach angle based on the axis and the direction of gravity.
  • the direction of gravity can be determined using, for example, the motion sensor described above.
  • the processing unit 2110 may use the obtained approach angle as it is, or may calculate the angle between the straight line L21 and the plane P21 based on the relationship between the direction of gravity and the plane P21, in the same manner as in FIG. 25(A).
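The angle between the axis (straight line L21) and the direction of gravity (straight line L22) reduces to the angle between two vectors. A minimal sketch (the function name and vector representation are illustrative assumptions):

```python
import math

def axis_gravity_angle_deg(axis_dir, gravity_dir):
    """Angle between the insertion-section axis (L21) and the gravity
    direction (L22), in degrees within [0, 90]. The absolute value of
    the dot product ignores which way along each line we measure."""
    dot = abs(sum(a * g for a, g in zip(axis_dir, gravity_dir)))
    na = math.sqrt(sum(a * a for a in axis_dir))
    ng = math.sqrt(sum(g * g for g in gravity_dir))
    # Clamp for floating-point safety before taking the arccosine.
    return math.degrees(math.acos(min(1.0, dot / (na * ng))))
```

When the relation between the plane P21 and gravity is known, for example when gravity is normal to the plane, the line-plane angle is simply 90 degrees minus this value.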
  • the technique of this modification is not limited to this.
  • the information about the angle of view of the imaging system composed of the objective optical system 2311 and the imaging device 2312 may be used.
  • the information about the angle of view is a region representing the imaging range of the imaging system, and is a pyramid whose apex is the reference point of the imaging system.
  • the pyramid here is, for example, a quadrangular pyramid, and the quadrangular base corresponds to the angle of view.
  • the information about the angle of view may be information representing numerical values such as a horizontal angle of view, a vertical angle of view, and a diagonal angle of view.
  • the approach angle in this modified example may be the angle formed by the bottom surface of the square pyramid, which is a right pyramid, and the plane P21 representing the tissue to be treated.
  • in this case, the approach angle may include information such as a vector defining the direction of the line of intersection of the two planes.
  • the approach angle may be information representing the angle between the plane P21 and a straight line connecting the apex of the quadrangular pyramid and an arbitrary point on the bottom surface.
  • the processing system 2100 can periodically acquire information from the endoscope system 2300 .
  • the processing system 2100 acquires captured images captured by the endoscope system 2300 and sensor information of the motion sensor. Also, as described below, the processing system 2100 may periodically acquire control information for peripheral devices.
  • FIG. 26 is a flowchart for explaining the processing of the processing system 2100.
  • the processing unit 2110 detects whether or not the treatment for the treatment target tissue has been started. For example, the processing unit 2110 performs processing for detecting the start of treatment based on the usage information of the peripheral device.
  • a peripheral device here is a device provided accompanying the endoscope system 2300 .
  • the main body of the endoscope system 2300 includes components such as the scope unit 2310 and the processing device 2330 that are essential for observing the inside of the living body.
  • the main unit may include the display unit 2340 and the light source device 2350 .
  • the peripheral device is not an essential component for imaging itself, but includes, for example, a treatment tool 2360 for performing treatment and a power supply device for supplying power to the treatment tool 2360, which is a high-frequency device.
  • Peripheral equipment may also include a device having a pump or the like for supplying or sucking air. Since the peripheral device is used when performing the treatment, it is possible to appropriately detect the start of the treatment by using the usage information of the peripheral device.
  • the processing unit 2110 may acquire information about the protruding state of the treatment instrument 2360 as usage information.
  • the insertion section 2310b is provided with a channel that connects from the operator's hand portion (for example, the operation section 2310a) to the opening 2316 of the distal end section 2011 .
  • the operator performs treatment by inserting the treatment instrument 2360 through the channel.
  • the protruding state of the treatment instrument 2360 is information indicating whether or not the distal end of the treatment instrument 2360 protrudes from the opening 2316 of the distal end portion 2011, or the amount of projection.
  • the processing unit 2110 acquires usage information regarding the protruding state by determining whether or not the treatment tool 2360 is captured in the captured image.
  • the treatment instrument 2360 protrudes from the opening 2316 of the distal end portion 2011 in the axial direction of the insertion portion 2310b; therefore, when the treatment instrument 2360 protrudes, the captured image includes an image of the protruding treatment instrument 2360.
  • the treatment instrument 2360 has lower saturation than living tissue, and its shape is also known from its design. Therefore, the processing unit 2110 can detect the treatment tool region in the image by performing image processing such as saturation determination processing and matching processing using the reference shape on the captured image. For example, when the treatment tool 2360 exists in the image, the processing unit 2110 determines that the treatment tool 2360 protrudes and the treatment has started. Further, the processing section 2110 may determine the protruding state of the treatment instrument 2360 based on the output of the sensor provided in the opening 2316 corresponding to the forceps port.
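The saturation-based part of this detection can be sketched as follows. This is an illustrative assumption (a practical detector would combine it with the matching processing using a reference shape mentioned above, and the thresholds are arbitrary): pixels with low HSV saturation and sufficient brightness are counted as treatment-tool candidates, since the tool is less saturated than living tissue.

```python
def detect_tool_pixels(image_rgb, sat_thresh=0.25, val_thresh=0.3):
    """Count candidate treatment-tool pixels in a nested list of
    (r, g, b) tuples with components in [0, 1]. A pixel qualifies
    when its HSV saturation is low and it is bright enough."""
    count = 0
    for row in image_rgb:
        for r, g, b in row:
            mx, mn = max(r, g, b), min(r, g, b)
            sat = 0.0 if mx == 0 else (mx - mn) / mx  # HSV saturation
            if sat < sat_thresh and mx > val_thresh:
                count += 1
    return count
```

Comparing the count against an area threshold would give the binary "treatment tool present" decision used to detect the start of treatment.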
  • the treatment instrument 2360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife.
  • High frequency knives include needle knives, IT knives, hook knives, and the like.
  • a needle knife is used for ESD marking.
  • An IT knife is used for the incision.
  • a high-frequency snare or high-frequency knife is used for peeling.
  • the treatment instrument 2360 may also include other instruments such as injection needles, forceps, and clips.
  • an injection needle is used for local injection in ESD. Forceps or clips are used to stop bleeding.
  • the processing unit 2110 may acquire information about the energization state of the high-frequency device as usage information.
  • a high-frequency device is a device used to excise and cauterize a target tissue by applying a high-frequency current.
  • High frequency devices include high frequency snares and high frequency knives.
  • the energized state is information indicating whether or not a high-frequency current is being supplied from the power supply to the high-frequency device, and can be determined based on the control signal of the power supply. Based on the control signal, the processing unit 2110 determines that treatment has started when high-frequency current is being supplied to the high-frequency device.
  • a high-frequency device cannot perform resection or cauterization simply by protruding, and requires the supply of high-frequency current. That is, when high-frequency current is supplied to the high-frequency device, it can be said that there is a higher probability that the treatment will be started immediately. Therefore, by using the energized state of the high-frequency device, it is possible to accurately determine the start of treatment.
• When the start of treatment is not detected in step S2101, the processing unit 2110 waits for a predetermined period without performing the processes from step S2102 onward, and then executes the process of step S2101 again.
• That is, the calculation of the relative change in the approach angle and the output of the notification information need not be performed until the start of treatment is detected.
• In step S2102, the processing unit 2110 sets the approach angle at the timing when it is determined that treatment has started as the reference angle. For example, the processing unit 2110 acquires the sensor information of the motion sensor and the captured image at the timing when it is determined that the treatment instrument 2360 has protruded, obtains the straight line L21 that is the axis of the insertion section 2310b and the plane P21 corresponding to the tissue to be treated, and then sets the angle formed by the straight line L21 and the plane P21 as the reference angle of the approach angle.
• Note that the "approach angle at the timing when the treatment instrument 2360 protrudes" in this modification represents the approach angle calculated with the detection of protrusion as a trigger; the protrusion determination timing, the sensor information acquisition timing, and the captured image acquisition timing do not have to match exactly.
• Alternatively, the processing unit 2110 may set the approach angle at the timing corresponding to the energization start timing of the high-frequency device as the reference angle. As described above, by setting the approach angle at the start of treatment as the reference angle, the relative change in the approach angle during one treatment can be obtained appropriately.
• After setting the reference angle, the processing unit 2110 obtains the relative change in the approach angle in step S2103. For example, the processing unit 2110 acquires sensor information from the motion sensor and obtains information representing the straight line L21 based on that sensor information. The processing unit 2110 then obtains the angle formed by the straight line L21 and the plane P21 obtained in step S2102 as the approach angle at that time. Further, the processing unit 2110 calculates the difference between the obtained approach angle and the reference angle set in step S2102, and regards this difference as the relative change in the approach angle.
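The geometry of steps S2102 and S2103 can be sketched as follows, assuming the line L21 is given as a direction vector and the plane P21 as a normal vector; the class name and interface are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def approach_angle_deg(axis_dir, plane_normal):
    """Angle (degrees) between the insertion-section axis (straight line L21)
    and the treatment-target plane P21: 0 deg = parallel to the plane,
    90 deg = perpendicular (head-on) approach."""
    d = np.asarray(axis_dir, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    s = abs(d @ n) / (np.linalg.norm(d) * np.linalg.norm(n))
    return float(np.degrees(np.arcsin(np.clip(s, 0.0, 1.0))))

class RelativeAngleTracker:
    """Holds the reference angle set at treatment start (step S2102) and
    returns the relative change for later samples (step S2103)."""
    def __init__(self, reference_deg):
        self.reference_deg = reference_deg

    def relative_change(self, current_deg):
        return current_deg - self.reference_deg
```

For example, if the reference angle at treatment start is 30 degrees and a later sample is 60 degrees, the relative change is +30 degrees, matching the on-screen value described below.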
• In step S2104, the output processing unit 2120 performs processing for outputting the relative change obtained in step S2103.
• For example, the output processing unit 2120 performs display processing that displays the angle corresponding to the relative change in real time at timings after the timing at which it is determined that treatment has started.
• Here, the timing at which it is determined that treatment has started is the timing at which Yes is determined in step S2101, and the subsequent display is performed at the timing of step S2104.
  • FIGS. 27(A) and 27(B) are examples of display screens when displaying the angle corresponding to the relative change in real time.
• For example, the output processing unit 2120 may graphically display the numerical value corresponding to the relative change.
• FIG. 27(A) represents a state in which the relative change is 0 degrees, and FIG. 27(B) represents a state in which the relative change is +30 degrees.
• Here, +30 degrees means that the current approach angle is 30 degrees higher than the reference angle. In this way, how the approach angle changes during treatment can be presented to the operator quickly and easily.
• Note that the screen displaying the relative change in real time is not limited to this example; the numerical value itself may be displayed.
• In step S2105, the processing unit 2110 determines whether the start of treatment has been detected again.
• The specific processing is the same as in step S2101; the processing unit 2110 determines whether the protrusion of the treatment instrument 2360 or the energization of the high-frequency device has been detected.
• Step S2105 is determined as No while the treatment started in step S2101 continues. That is, if the treatment instrument 2360 remains protruded, or if the high-frequency device remains energized, the processing unit 2110 determines No in step S2105.
• One treatment by the operator is considered to be executed by repeating a series of procedures: (a) protruding the treatment instrument, (b) starting energization, (c) performing the specific treatment on the living body, (d) ending energization, and (e) retracting the treatment instrument. If the treatment instrument 2360 is not a high-frequency device, (b) and (d) can be omitted. After one treatment ends, for example, the positioning described above is performed before the next treatment starts, but that is omitted here.
• ESD includes a plurality of steps such as incision and peeling, and each step is realized by performing procedures (a) to (e) once or multiple times.
• When (a) or (b) is performed, Yes is determined in step S2101, and the process of step S2102 is executed.
• During (c), steps S2103 and S2104 are repeatedly executed.
• When (d) and (e) are performed and then (a) or (b) is performed again for the next procedure, the determination result of step S2105 becomes Yes. That is, the determination processing in steps S2101 and S2105 detects that the treatment instrument 2360 has transitioned from a non-protruding state to a protruding state, or that the high-frequency device has transitioned from a non-energized state to an energized state.
• When Yes is determined in step S2105, the process returns to step S2102, and the processing unit 2110 sets the approach angle at the timing when it is determined that treatment has started again as the reference angle. The same applies thereafter: during (c), the reset reference angle is used to calculate the relative change in the approach angle (step S2103) and output it (step S2104).
• In this way, each time the start of treatment is detected again, the reference angle is reset (step S2102).
  • FIG. 28 is a diagram explaining temporal changes in relative changes obtained by the method of this modification.
• The horizontal axis of FIG. 28 represents time, and the vertical axis represents the relative change in the approach angle.
• t1 in FIG. 28 corresponds to, for example, the timing at which the treatment instrument 2360 is first protruded or energized after insertion of the insertion section 2310b (Yes in step S2101).
• The processing unit 2110 obtains the approach angle at t1 as the reference angle (step S2102). After t1, the calculation of the approach angle and the calculation and output of the relative change are performed at each timing (steps S2103 and S2104).
• Therefore, the relative change at t1 is 0 degrees, and thereafter the relative change is acquired in time series with the approach angle at t1 as the reference.
• t2 represents a timing at which treatment was restarted. That is, (d) and (e) above were performed between t1 and t2, and then (a) was performed at t2.
• The processing unit 2110 obtains the approach angle at t2 as the new reference angle (step S2102). As shown in FIG. 28, resetting the reference angle resets the relative change at t2 to 0 degrees.
• t3 likewise represents a timing at which treatment was restarted, and resetting the reference angle resets the relative change at t3 to 0 degrees.
• The output processing unit 2120 starts the output processing of notification information, triggered by the setting of the reference angle in the processing unit 2110.
• Also, the processing unit 2110 resets the reference angle when it determines that treatment has started again. In this way, even when treatment of the living body is repeated multiple times in one surgery, an appropriate relative change can be output for each treatment.
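The loop of steps S2101 to S2105, including the reference-angle reset on each restart of treatment, can be sketched as a small state machine; the class name and interface are illustrative assumptions.

```python
class ApproachAngleMonitor:
    """State machine mirroring steps S2101-S2105: the reference angle is
    (re)set each time the start of treatment is detected, so the reported
    relative change is always measured within the current treatment."""
    def __init__(self):
        self.reference = None
        self._was_active = False

    def update(self, tool_active: bool, approach_angle: float):
        """tool_active: treatment tool protruding / high-frequency device energized.
        Returns the relative change, or None while no treatment is ongoing."""
        started = tool_active and not self._was_active  # rising edge (steps S2101/S2105)
        self._was_active = tool_active
        if started:
            self.reference = approach_angle             # step S2102: (re)set reference
        if not tool_active:
            return None                                 # no output until treatment starts
        return approach_angle - self.reference          # steps S2103/S2104
```

As in FIG. 28, each restart of treatment resets the reported relative change to 0 degrees.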
• As described above, the relative change in the approach angle may be displayed in real time during treatment of the patient. In this way, an operator performing treatment can be warned against dangerous operations.
• However, the notification by the processing system 2100 is not limited to this.
• For example, the processing unit 2110 may acquire an allowable approach angle range and output alert information as the notification information when the angle representing the relative change deviates from that approach angle range.
• A skilled doctor can perform treatment safely by suppressing the relative change in the approach angle during treatment.
• In contrast, novice doctors may cause problems such as bleeding due to variations in the approach angle. Therefore, by setting an allowable approach angle range in advance, it is possible to determine whether the relative change in the approach angle is safe.
  • FIG. 29 is a diagram explaining the relationship between the relative change in approach angle and the allowable approach angle range.
• The vertical axis and horizontal axis in FIG. 29 are the same as in FIG. 28.
• The timings t1 to t3 at which it is determined that treatment has started are also the same as in FIG. 28.
• In the example of FIG. 29, the allowable approach angle range is greater than θ1 and less than θ2.
• The processing unit 2110 determines that the relative change is out of the approach angle range when the relative change in the approach angle is θ1 or less, or θ2 or more.
• In the example of FIG. 29, the output processing unit 2120 outputs alert information at timings t4 and t6.
• The displayed information may be text information, image information such as icons, or other information.
• Alternatively, the alert information may be output using a light-emitting unit or a speaker.
• The alert information may be notified at the timing when the relative change moves from within the approach angle range to outside it, or may be notified continuously while the relative change remains outside the range.
• In the latter case, the output processing unit 2120 continues to output alert information between t4 and t5 and between t6 and t7 in FIG. 29.
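The out-of-range determination, and the two notification modes (alert on entry into the out-of-range state, or continuous alert while out of range), can be sketched as follows; the function names are illustrative assumptions.

```python
def check_alert(relative_change, theta1, theta2):
    """Alert determination: the allowable approach angle range is (theta1,
    theta2) exclusive, so values of theta1 or less, or theta2 or more,
    are out of range and trigger alert information."""
    return relative_change <= theta1 or relative_change >= theta2

def alert_timings(series, theta1, theta2, continuous=False):
    """Given a time series of relative changes, return the indices at which
    alert information is output: only on entry into the out-of-range state
    (default), or continuously while out of range."""
    out = [check_alert(r, theta1, theta2) for r in series]
    if continuous:
        return [i for i, o in enumerate(out) if o]
    return [i for i, o in enumerate(out) if o and (i == 0 or not out[i - 1])]
```

With the continuous mode, every sample between an excursion's start and its return into the range is flagged, corresponding to the intervals t4 to t5 and t6 to t7 in FIG. 29.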
• The processing system 2100 may also include a database DB that stores relative changes in the approach angle during treatment.
• In that case, the processing unit 2110 acquires an approach angle range set based on the database DB.
  • FIG. 30 is a diagram explaining a system according to this modification.
• In FIG. 30, the database DB is, for example, a database server connected to the endoscope system 2300 via a network.
• Note that the processing system 2100 of this modification may be included in the endoscope system 2300, and the database DB may likewise be included in the endoscope system 2300.
• The processing system 2100 obtains the relative change in the approach angle based on the information acquired from the endoscope system 2300 and stores the obtained relative change in the database DB.
• The storage of relative changes in the database DB may be performed in real time, or collectively after the insertion section 2310b is removed.
• For example, time-series relative changes are obtained from insertion to removal of the insertion section 2310b, and, for example, their average value is stored in the database DB.
• Alternatively, the time-series relative changes themselves may be stored in the database DB.
• The database DB may contain expert data and non-expert data.
  • Expert data is information representing relative changes in the approach angle when treatment is performed by an expert doctor.
  • Non-expert data is information that represents the relative change in approach angle when a procedure is performed by a novice doctor.
• The approach angle range in this modification is set based on at least the expert data.
• For example, by analyzing the expert data, the processing system 2100 obtains information such as that many expert doctors keep the change in the approach angle during treatment to a few degrees or less.
• The processing unit 2110 sets the approach angle range based on the obtained information. For example, the approach angle range is determined based on the maximum relative change in the expert data after excluding outliers.
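One way to realize "the maximum relative change after excluding outliers" is a percentile-based trim, as sketched below; the percentile cutoff and the symmetric-range convention are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def approach_angle_range(expert_changes, outlier_pct=5.0):
    """Derive an allowable range from expert-case relative changes: values
    beyond the given percentile tails are treated as outliers, and the
    remaining maximum excursion defines a symmetric (theta1, theta2)."""
    a = np.asarray(expert_changes, dtype=float)
    lo = np.percentile(a, outlier_pct)
    hi = np.percentile(a, 100.0 - outlier_pct)
    trimmed = a[(a >= lo) & (a <= hi)]
    m = float(np.max(np.abs(trimmed)))
    return -m, m
```

A single anomalous case (for example, one large excursion in otherwise tight expert data) then does not widen the range handed to the alert determination.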
• Whether data stored in the database DB is expert data or non-expert data may be determined based on the doctor's skill level or on information specifying the course of treatment. For example, when the endoscope system 2300 transmits information for determining the approach angle to the processing system 2100, skill level information representing the doctor's skill level and progress information representing the course of treatment may be added as metadata.
• The skill level information is, specifically, case-count information representing the number of times the operator has performed the target treatment.
• The progress information is information representing the amount of bleeding, the incidence of complications, the number of days of hospitalization, and the like.
• Based on this information, the processing system 2100 determines whether the target data is expert data, i.e., data of a skilled doctor, or non-expert data, i.e., data of a trainee doctor.
• Alternatively, whether data is expert data may be determined based on the movement trajectory of the treatment instrument. It is believed that as skill improves, movements become more controlled and the procedure can be accomplished with fewer movements. Therefore, the smaller the total number of nodes in the movement trajectory of the treatment instrument accumulated in the operation log information, the higher the skill of the corresponding operator is determined to be.
• In this way, the processing system 2100 analyzes the accumulated information and feeds the analysis result back to the endoscope system 2300 as an approach angle range, thereby enabling more accurate support in subsequent cases.
• An increase in the amount of accumulated information is expected to improve support accuracy. For example, as the amount of collected data increases, support closer in feel to guidance by a skilled doctor can be realized.
• Note that the information fed back here is not limited to the approach angle range, and may include the skill evaluation and report information described above.
• In setting the approach angle range, a classification process may be performed using a plurality of feature amounts including the relative change in the approach angle.
  • FIG. 31 is a diagram explaining the classification process.
• The processing unit 2110 of the processing system 2100 acquires time-series data representing the relative change in the approach angle from the endoscope system 2300, and obtains a first feature amount and a second feature amount based on the time-series data.
• The first feature amount is a feature amount related to the relative change in the approach angle, and the second feature amount is a feature amount related to time.
• The vertical axis of FIG. 31 is the first feature amount, and the horizontal axis is the second feature amount.
• In FIG. 31, the first feature amount is denoted as feature amount A2, and the second feature amount as feature amount B2.
• For example, the processing unit 2110 divides the time-series data into a plurality of sections.
• Each section may be a section in which the relative change in the approach angle monotonically increases or decreases.
• Alternatively, each section may be a section from when the relative change departs from 0 until it returns to 0 again.
• The processing unit 2110 uses the absolute difference between the maximum and minimum values of the relative change in each section as the first feature amount, and the time length of the section as the second feature amount. As shown in FIG. 28, data corresponding to one case is assumed to include a plurality of sections, so that a plurality of sets of first and second feature amounts are acquired.
• When selecting representative data for one case, the processing unit 2110 may employ the section with the largest first feature amount, or the section with the largest value of (first feature amount / second feature amount).
• Alternatively, the processing unit 2110 may use the average of the first feature amounts and the average of the second feature amounts as the data corresponding to one case. Outputting a plurality of data points for one case is also not precluded.
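The section splitting and per-section feature extraction described above can be sketched as follows, using monotonic runs as the sections; the function name and the simple sign-flip boundary rule are illustrative assumptions.

```python
def section_features(times, changes):
    """Split a relative-change time series into monotonic sections and return
    one (first, second) feature pair per section: first = absolute difference
    between the section's maximum and minimum relative change, second = the
    section's time length."""
    # boundaries fall where the sign of the increment flips
    bounds = [0]
    for i in range(1, len(changes) - 1):
        if (changes[i] - changes[i - 1]) * (changes[i + 1] - changes[i]) < 0:
            bounds.append(i)
    bounds.append(len(changes) - 1)
    feats = []
    for a, b in zip(bounds, bounds[1:]):
        seg = changes[a:b + 1]
        feats.append((max(seg) - min(seg), times[b] - times[a]))
    return feats
```

Each returned pair corresponds to one point plotted in the feature amount space of FIG. 31 (before the per-case aggregation described above).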
• The first feature amount and the second feature amount serve as information representing, when the approach angle changed significantly, how long that change took.
• Even when the approach angle changes significantly, if the change occurs slowly, the degree of risk is relatively small. That is, the first feature amount and the second feature amount are information serving as an index for determining the degree of treatment risk in consideration of time.
• The processing unit 2110 performs classification processing in a feature amount space including the first feature amount and the second feature amount. Specifically, one case is plotted as one point in the feature amount space.
• F1 in FIG. 31 is the first category, and F2 is the second category.
• The processing unit 2110 determines which of F1 and F2 is expert data and which is non-expert data. Metadata such as the skill level information and progress information described above is used for this determination. For example, by acquiring the metadata of each data point in the first category, the processing unit 2110 can determine whether each case included in the first category was performed by a skilled doctor or a novice doctor. The same applies to the second category. The processing unit 2110 then determines that the data included in whichever of the two categories contains more data from skilled doctors is the expert data. In the example of FIG. 31, the processing unit 2110 determines the data included in F1 to be expert data and the data included in F2 to be non-expert data.
• However, the technique for identifying expert data from a plurality of case data is not limited to the above, and various modifications are possible.
• The processing unit 2110 sets the allowable approach angle range based on the classification result. For example, the processing unit 2110 analyzes the tendency of the relative change in the expert data in the same manner as in the above example, and sets the approach angle range based on the analysis result.
• The processing unit 2110 may also determine whether to output alert information based on information including time. For example, the processing unit 2110 may obtain an allowable amount of relative change per unit time based on the first feature amount and the second feature amount described above. The output processing unit 2120 outputs alert information when the relative change is outside the approach angle range, or when the slope of the relative change exceeds the allowable amount of relative change per unit time.
• Note that the database DB may be provided outside the processing system 2100.
• In this case, the processing system 2100 communicates with the database DB via a network, for example, and sets the approach angle range by receiving the information accumulated in the database DB.
• An example in which the approach angle range is determined by the processing unit 2110 of the processing system 2100 performing analysis processing has been described above, but the processing unit 2110 may instead acquire an approach angle range determined by an external device.
• The processing unit 2110 may acquire a first approach angle range as the approach angle range under a first treatment condition, and may acquire a second approach angle range different from the first approach angle range under a second treatment condition different from the first treatment condition.
• For example, a treatment condition is information that identifies the organ in which the lesion to be treated exists.
• The processing unit 2110 sets an approach angle range for the stomach when the organ to be treated is the stomach, and an approach angle range for the large intestine when the organ to be treated is the large intestine.
• The target organ may also be another organ such as the esophagus.
• The processing unit 2110 may also set the approach angle range according to the site within the organ. For example, even for the same stomach, the processing unit 2110 sets an approach angle range for the lesser curvature when the treatment target site is the lesser curvature, and an approach angle range for the greater curvature when the treatment target site is the greater curvature. Likewise, when the large intestine is targeted, different approach angle ranges are set according to target sites such as the rectum, ascending colon, transverse colon, and descending colon.
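The per-condition selection of an approach angle range can be sketched as a lookup keyed by organ and site; every key and angle value in the table below is a hypothetical placeholder, not a value from the disclosure, and in practice the ranges would come from the per-organ database analysis described here.

```python
# Hypothetical per-condition table: keys and angle ranges are illustrative only.
APPROACH_ANGLE_RANGES = {
    ("stomach", "lesser_curvature"): (-10.0, 10.0),
    ("stomach", "greater_curvature"): (-15.0, 15.0),
    ("large_intestine", "rectum"): (-12.0, 12.0),
    ("large_intestine", "ascending_colon"): (-8.0, 8.0),
}

def select_range(organ, site, default=(-10.0, 10.0)):
    """Choose the approach angle range used for the alert determination from
    the organ/site currently being treated; fall back to a default range when
    no condition-specific range has been derived from the database."""
    return APPROACH_ANGLE_RANGES.get((organ, site), default)
```

The same pattern extends to the other treatment conditions described below (lesion type, treatment step, or operator skill) by widening the lookup key.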
• In this case, organ information specifying the organ or site to be treated may be added to the data as metadata.
• For example, the database DB stores a data set in which the approach angle information, skill level information, progress information, and organ information are associated with each other.
• Alternatively, the database DB may comprise a plurality of databases, one for each organ or site.
• For example, the database DB may include a stomach database that stores case data on the stomach and a large intestine database that stores case data on the large intestine.
• The processing unit 2110 of the processing system 2100 acquires an approach angle range for each organ by performing analysis processing for each organ or site.
• For example, the processing unit 2110 acquires the approach angle range for treatment of the stomach based on the expert data included in the stomach database.
• The analysis processing is the same as in the above example; it may be processing for obtaining the tendency of the expert doctors' approach angles, or processing that includes classification processing. The same applies to other organs.
• Similarly, the processing unit 2110 acquires the approach angle range for treatment of the large intestine based on the expert data included in the large intestine database. The same applies when one organ is subdivided into two or more sites: the approach angle range for each site is obtained by analysis processing based on the data for that site.
• Then, based on the organ or site currently being treated, the processing system 2100 determines which of the plurality of approach angle ranges is to be used for the alert information output determination.
• In this way, the approach angle range can be set according to the target organ or site. Since the shape of the organ and the form of the lesion differ from organ to organ, in some cases it is easy to maintain the approach angle, while in others the approach angle varies to some extent even for an experienced doctor. By performing the processing for each organ and each site, the operator can be supported based on an appropriate approach angle range.
• Treatment conditions are not limited to organs or sites.
• For example, the content of treatment may vary depending on the type of lesion. Therefore, the processing unit 2110 may change the approach angle range according to the lesion type.
• In this case, the database DB stores data to which metadata specifying the lesion type is added, and the processing unit 2110 performs analysis processing for each lesion type.
• The treatment condition may also be information specifying a step in a treatment method that includes a plurality of steps, such as ESD.
• For example, the processing unit 2110 changes the approach angle range according to the step, such as marking, local injection, incision, peeling, or hemostasis.
• In this case, the database DB stores data to which metadata specifying the step of the treatment method is added, and the processing unit 2110 performs analysis processing for each step.
• The treatment condition may also be information specifying the operator's skill evaluation result.
• The operator's skill evaluation result is determined based on, for example, the operator's skill level information, the progress information of cases handled in the past, the relative change in the approach angle, and the like.
• For example, the processing unit 2110 may set a narrow approach angle range for an operator with a low skill evaluation and a wide approach angle range for an operator with a high skill evaluation. In this way, alerts are issued more readily to novice doctors, making it possible to give advance notice of danger, while alerts are issued less readily to expert doctors, suppressing annoyance.
• The output processing unit 2120 of this modification may stop the output processing of the notification information when a given stop condition is satisfied after the output processing has started.
• When the stop condition is satisfied, the processing unit 2110 may also perform the reference angle resetting process.
• For example, the processing unit 2110 determines that the stop condition is satisfied in cases such as the following: (1) it is determined that air supply, water supply, or suction has been performed; (2) it is determined that the distance between the living body and the insertion section 2310b is greater than or equal to a given distance threshold; (3) it is determined that the pressing pressure of the insertion section 2310b is greater than or equal to a given pressure threshold; (4) it is determined that the treatment instrument 2360 has disappeared from the imaging screen; (5) a given time has elapsed since it was determined that treatment started; (6) it is determined that the distal end 2011 of the insertion section 2310b of the endoscope wobbles greatly. Not all of the above stop conditions need be used; some may be omitted, and other stop conditions may be added. By doing so, unnecessary output processing can be suppressed.
• Regarding (1), the processing unit 2110 detects air supply and suction based on, for example, the control signal of the pump that performs the air supply or suction.
• Regarding (2), when the distance between the living body and the insertion section 2310b is large, the captured image is a bird's-eye view of the tissue to be treated. In that case, the operator can estimate the position and orientation of the distal end 2011 from the captured image, so notification by the processing system 2100 is of little significance.
• Also, when the distance has increased, the treatment may have been temporarily completed, and the operator may be performing positioning or observing the progress of the treatment. From this point of view as well, there is little need to output the relative change in the approach angle.
• The distance to the treatment target tissue can be obtained by various methods such as stereo matching, as described above.
• Regarding (3), the endoscope system 2300 includes, for example, a pressure sensor provided in the insertion section 2310b.
• The pressure sensor can be realized by a piezoelectric element such as a MEMS (Micro Electro Mechanical Systems) element.
• The processing unit 2110 detects the pressing pressure by acquiring sensor information from the pressure sensor.
• Regarding (4), when the treatment instrument 2360 disappears from the captured image, it is considered that the treatment instrument 2360 has been retracted as in (e) above. In this case, it is highly probable that one treatment has been completed, so there is little need to output the relative change in the approach angle. Detection of the treatment instrument 2360 based on the captured image can be realized by image processing using saturation or the like, as described above. Further, as described above, the processing unit 2110 may determine that the stop condition is satisfied when the supply of power to the high-frequency device is stopped.
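The stop-condition check over observations like those above can be sketched as follows; every key name and threshold in this sketch is a hypothetical assumption for illustration, and, as noted above, individual conditions may be omitted or added.

```python
def stop_condition_met(state: dict) -> bool:
    """Evaluate stop conditions (1)-(6) from a dict of current observations.
    Any single satisfied condition stops the notification output."""
    checks = [
        state.get("air_water_suction_active", False),                                        # (1)
        state.get("tissue_distance_mm", 0.0) >= state.get("distance_thresh_mm", 30.0),       # (2)
        state.get("pressing_pressure_kpa", 0.0) >= state.get("pressure_thresh_kpa", 20.0),   # (3)
        not state.get("tool_visible", True),                                                 # (4)
        state.get("elapsed_since_start_s", 0.0) >= state.get("time_limit_s", 300.0),         # (5)
        state.get("tip_wobble_deg_s", 0.0) >= state.get("wobble_thresh", 45.0),              # (6)
    ]
    return any(checks)
```

Returning True corresponds to Yes in step S2205 of FIG. 32, after which the system waits for the next treatment start.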
• Regarding (5), the processing system 2100 may include a clock unit that outputs clock information.
• Alternatively, the processing system 2100 may obtain the clock information via a network.
• The processing unit 2110 measures the time elapsed from the start of energization based on the clock information.
• FIG. 32 is a flowchart explaining the processing when a stop condition is used. Steps S2201 to S2204 are the same as steps S2101 to S2104 described above. Specifically, when the start of treatment is detected, the reference angle of the approach angle is set, and the relative change is calculated and output.
• In step S2205, the processing unit 2110 determines whether or not the stop condition is satisfied.
• The stop conditions and the determination methods are as described above. If the stop condition is not satisfied (No in step S2205), the process returns to step S2203. That is, the calculation and output of the relative change in the approach angle are continued using the current reference angle.
• If the stop condition is satisfied (Yes in step S2205), the process returns to step S2201. That is, the processing system 2100 periodically executes the process of step S2201, without calculating or outputting the relative change in the approach angle, until the start of treatment is detected again.
• When the processing system 2100 determines that treatment has started (Yes in step S2201), it resets the reference angle (step S2202) and then calculates the relative change and outputs notification information (steps S2203 and S2204).
• In the above, skill evaluation information is output based on the approach angle information and the energization history information, but the information used is not limited to these; skill evaluation information may also be output based on pressing pressure information and air supply/suction information.
• Below, an embodiment relating to a processing system 3100 that outputs skill evaluation information based on pressing pressure information and air supply/suction information will be described.
• In the following, each treatment included in ESD is mainly described, but the method of this modification can be extended to other treatments of a living body.
• The quality of endoscopic procedures depends largely on the doctor's experience and "tacit knowledge" of operation.
• Since the portion from the distal end 3011 of the insertion section 3310b to the operation section 3310a is flexible and long, the feel and force at the distal end 3011 when it comes into contact with the living body are hardly transmitted to the operator.
• The information that the operator can acquire is mainly the image captured by the imaging system provided in the distal end 3011. That is, what the operator can confirm is only the range and angle visible on the screen. Furthermore, captured images often lack three-dimensional information.
• Endoscopists therefore perform procedures based on their own empirical rules, so the time required for treatment varies greatly depending on the operator. For example, after the endoscope reaches a lesion in the lumen, the position of the distal end 3011 must be adjusted, and it has been found that such adjustment takes novice doctors longer than skilled doctors.
• Hereinafter, adjustment of the position of the distal end 3011 may be referred to simply as position adjustment.
• A skilled doctor here means a doctor with high treatment skill, and a trainee doctor means a doctor with lower treatment skill than a skilled doctor.
• Whether treatment skill is high or low is evaluated in consideration of not only information on the treatment actions themselves but also information on the course after treatment.
• Information on the course after treatment includes, for example, a low incidence of complications and a short postoperative hospital stay.
• In order to improve the skill of a trainee doctor, it is preferable to be able to advise the trainee doctor on, or pass on, the technical knowledge for performing position adjustment in a short time.
• However, the portion from the distal end 3011 of the insertion section 3310b to the operation section 3310a is flexible and long, and the target tissue side expands and contracts due to the above-described air supply and suction and, further, peristaltic movement, so the operability of the endoscope is constantly changing. Therefore, the operational input and the operational output of the endoscope generally do not match.
• The relationship shown in FIG. 33 is only a rough estimate, and the details have not been sufficiently clarified.
• The pressing pressure is determined not only by how the distal end 3011 or the like of the endoscope is pressed against the target site, but also depends on, for example, the tension of the organ. This is because if the tension of the organ is low, the inner wall of the organ may be soft, and conversely, if the tension of the organ is high, the inner wall may be hard. Furthermore, the tension of the organ is affected by its peristaltic movement and also by internal pressure fluctuations of the organ due to the above-mentioned air supply and suction. Therefore, the pressing pressure information is closely related to the air supply/suction information.
  • FIG. 34 is a diagram showing the configuration of a processing system 3100 according to this modification.
  • the processing system 3100 includes an acquisition unit 3110 , a processing unit 3120 and an output processing unit 3130 .
  • the processing system 3100 is not limited to the configuration of FIG. 34, and various modifications such as omitting some of these components or adding other components are possible.
  • the acquisition unit 3110 acquires the pressing pressure information of the insertion portion of the endoscope and the air supply/suction information regarding air supply and suction from the endoscope system 3300 described later with reference to FIGS. 35 and 36 .
  • the acquisition unit 3110 can also be said to be a communication interface that acquires pressing pressure information and air supply/suction information. A specific acquisition method will be described later.
  • the acquisition unit 3110 can be realized by, for example, a communication chip for information acquisition, a processor or a control circuit that controls the communication chip, or the like. Note that the endoscope according to this modification is assumed to be a flexible endoscope.
  • the processing unit 3120 evaluates the skill of the user who has operated the endoscope system 3300 based on the pressing pressure information and the air supply/suction information.
  • the processing executed by the processing unit 3120 is, for example, classification processing such as clustering. Details of the skill evaluation will be described later.
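As a concrete illustration of the classification processing mentioned above, the sketch below assigns a feature vector derived from pressing pressure and air supply/suction logs to the nearer of two cluster centroids. The feature definitions, centroid values, and labels are hypothetical assumptions; in practice such centroids would be learned from collected log data.

```python
# Hypothetical sketch: nearest-centroid classification of an operator's
# feature vector (mean pressing pressure, pressure fluctuation amplitude,
# number of air-supply/suction operations). All values are illustrative.

def euclidean(a, b):
    # Euclidean distance between two equal-length feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Centroids assumed to have been obtained by clustering expert/trainee logs.
CENTROIDS = {
    "skilled": (12.0, 1.5, 8.0),   # small amplitude, frequent fine adjustments
    "trainee": (15.0, 6.0, 2.0),   # large amplitude, few adjustments
}

def classify(features):
    # Return the label of the closest centroid.
    return min(CENTROIDS, key=lambda label: euclidean(features, CENTROIDS[label]))

print(classify((12.5, 2.0, 7.0)))  # -> skilled
```

In a deployed system the same idea would typically be realized with a trained model as described below, but the nearest-centroid form keeps the clustering intuition visible.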
  • when processing using a trained model is performed, the processing system 3100 includes a storage unit (not shown) that stores the trained model generated by machine learning.
  • the storage unit here serves as a work area for the processing unit 3120 and the like, and its function can be realized by a semiconductor memory, a register, a magnetic storage device, or the like.
  • the processing unit 3120 reads the learned model from the storage unit and operates according to instructions from the learned model, thereby performing inference processing for outputting the user's skill evaluation result.
  • the processing unit 3120 is configured with the following hardware.
  • the hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals.
  • the hardware can consist of one or more circuit devices or one or more circuit elements mounted on a circuit board.
  • the one or more circuit devices are for example ICs, FPGAs or the like.
  • the one or more circuit elements are, for example, resistors, capacitors, and the like.
  • processing unit 3120 may be realized by the following processors.
  • Processing system 3100 includes a memory that stores information and a processor that operates on the information stored in the memory.
  • the memory here may be the storage unit described above, or may be a different memory.
  • the information is, for example, programs and various data.
  • a processor includes hardware.
  • Various processors such as CPU, GPU, and DSP can be used as the processor.
  • the memory may be a semiconductor memory such as an SRAM or a DRAM, a register, a magnetic storage device such as an HDD, or an optical storage device such as an optical disk device.
  • the memory stores computer-readable instructions, and the instructions are executed by the processor to implement the functions of the processing unit 3120 as processes.
  • the instruction here may be an instruction set that constitutes a program, or an instruction that instructs a hardware circuit of a processor to perform an operation. Furthermore, all or part of each part of the processing unit 3120 can be realized by cloud computing, and each process described later can be performed on cloud computing.
  • processing unit 3120 may be implemented as a module of a program that runs on a processor.
  • the processing unit 3120 is implemented as a processing module that performs skill evaluation based on pressing pressure information and air supply/suction information.
  • the program that implements the processing performed by the processing unit 3120 can be stored in, for example, an information storage device that is a computer-readable medium.
  • the information storage device can be implemented by, for example, an optical disc, memory card, HDD, semiconductor memory, or the like.
  • a semiconductor memory is, for example, a ROM.
  • the processing unit 3120 performs various processes based on programs stored in the information storage device. That is, the information storage device stores a program for causing the computer to function as the processing unit 3120 .
  • a computer is a device that includes an input device, a processing unit, a storage unit, and an output unit.
  • the program according to this modification is a program for causing a computer to execute each step described later with reference to FIG. 51 and the like.
  • the output processing unit 3130 performs processing for outputting skill evaluation information that is the result of skill evaluation by the processing unit 3120 .
  • the processing system 3100 may include a display unit (not shown), and the output processing unit 3130 may perform processing for displaying skill evaluation information on the display unit.
  • the processing system 3100 may be connected to the endoscope system 3300 via a network, as will be described later using FIG.
  • the output processing unit 3130 may be a communication device or communication chip that transmits skill evaluation information via a network. Note that the device that outputs the skill evaluation information is not limited to the endoscope system 3300, and may be a PC that can communicate with the processing system 3100, or a mobile terminal device such as a smart phone or a tablet terminal.
  • the processing system 3100 of this modified example includes an acquisition unit 3110 , a processing unit 3120 and an output processing unit 3130 .
  • the acquisition unit 3110 acquires pressing pressure information of the insertion section in treatment using an endoscope, which is a flexible scope, and air supply/suction information regarding air supply and suction.
  • the processing unit 3120 also evaluates the skill of the user operating the endoscope based on the pressing pressure information and the air supply/suction information.
  • Output processing unit 3130 also outputs skill evaluation information that is the result of skill evaluation.
  • the processing performed by the processing system 3100 of this modified example may be implemented as an information processing method.
  • the information processing method acquires pressing pressure information of an insertion portion in a procedure using an endoscope, which is a flexible scope, and air supply/suction information regarding air supply and suction, performs skill evaluation of the user who operates the endoscope based on the pressing pressure information and the air supply/suction information, and outputs skill evaluation information, which is the result of the skill evaluation.
  • the user's skill can be evaluated based on both the pressing pressure information and the air supply/suction information, so the operator's skill can be evaluated with high accuracy.
  • FIG. 35 is a diagram showing the configuration of an endoscope system 3300.
  • the endoscope system 3300 includes a scope section 3310 , a processing device 3330 , a display section 3340 and a light source device 3350 .
  • the operator uses the endoscope system 3300 to perform an endoscopic examination of the patient.
  • the configuration of the endoscope system 3300 is not limited to that shown in FIG. 35, and various modifications such as omitting some components or adding other components are possible.
  • in FIG. 35, illustration of the suction device 3370, the air/water supply device 3380, and the like, which will be described later with reference to FIG. 36, is omitted.
  • FIG. 35 shows an example in which the processing device 3330 is one device connected to the scope section 3310 via the connector 3310d, but it is not limited to this.
  • part or all of the processing device 3330 may be configured by other information processing devices such as a PC or a server system that can be connected via a network.
  • the processing device 3330 may be implemented by cloud computing.
  • the scope section 3310 has an operation section 3310a, a flexible insertion section 3310b, and a universal cable 3310c including signal lines and the like.
  • the scope section 3310 is a tubular insertion device that inserts a tubular insertion section 3310b into a body cavity.
  • a connector 3310d is provided at the tip of the universal cable 3310c.
  • the scope unit 3310 is detachably connected to the light source device 3350 and the processing device 3330 by the connector 3310d. Furthermore, as will be described later with reference to FIG. 36, a light guide 3315 is inserted through the universal cable 3310c, and illumination light from the light source device 3350 is guided by the light guide 3315 and emitted from the tip of the insertion portion.
  • the insertion portion 3310b has a distal end portion 3011, a bendable bending portion 3012, and a flexible portion 3013 in order from the distal end to the proximal end. The insertion portion 3310b is inserted into the subject.
  • a distal end portion 3011 of the insertion portion 3310b is a distal end portion of the scope portion 3310 and is a hard distal end rigid portion.
  • An objective optical system 3311 and an imaging element 3312, which will be described later, are provided at the distal end portion 3011, for example.
  • the bending portion 3012 can bend in a desired direction according to the operation of the bending operation member provided in the operation portion 3310a.
  • the bending operation member includes, for example, a horizontal bending operation knob and a vertical bending operation knob.
  • the operation portion 3310a is provided with various operation buttons such as a release button and an air/water supply button.
  • the processing device 3330 is a video processor that performs predetermined image processing on the received imaging signal and generates a captured image.
  • a video signal of the generated captured image is output from the processing device 3330 to the display unit 3340, and the captured image is displayed on the display unit 3340 in real time.
  • the configuration of the processing device 3330 will be described later.
  • the display unit 3340 is, for example, a liquid crystal display, an EL display, or the like.
  • a light source device 3350 is a light source device capable of emitting white light for normal observation mode.
  • the light source device 3350 may be capable of selectively emitting white light for normal observation mode and special light such as narrow band light.
  • FIG. 36 is a diagram explaining the configuration of each part of the endoscope system 3300.
  • a light source device 3350 includes a light source 3352 that emits illumination light.
  • the light source 3352 may be a xenon light source, an LED, or a laser light source. Also, the light source 3352 may be another light source, and the light emission method is not limited.
  • the insertion section 3310b includes an objective optical system 3311, an imaging element 3312, an illumination lens 3314, a light guide 3315, a suction tube 3317, and an air/water supply tube 3319.
  • the light guide 3315 guides illumination light from the light source 3352 to the tip of the insertion portion 3310b.
  • the illumination lens 3314 irradiates the subject with the illumination light guided by the light guide 3315 .
  • the objective optical system 3311 forms a subject image from the light reflected by the subject.
  • the imaging element 3312 receives light from the subject via the objective optical system 3311 .
  • the imaging element 3312 may be a monochrome sensor or an element with color filters.
  • the color filter may be a well-known Bayer filter, a complementary color filter, or other filters.
  • Complementary color filters are filters that include cyan, magenta, and yellow color filters.
  • the suction tube 3317 suctions liquid and the like by activating the suction device 3370 in predetermined cases.
  • Predetermined cases include, for example, a case where gastric juice or the like interferes with diagnosis, a case where water is collected after treatment is completed, and the like, but other cases are also possible.
  • the suction device 3370 is realized by including a suction pump, a recovery tank, and the like (not shown). The suction device 3370 is connected to the control unit 3332 described later, and when a suction button (not shown) or the like is pressed, the liquid or the like is collected into the recovery tank through the opening 3316 and the suction tube 3317.
  • the opening 3316 also serves as an opening from which a treatment instrument 3360, which will be described later, protrudes. Also, in FIG. 36, illustration of the treatment instrument 3360 and a tube housing the treatment instrument 3360 is omitted.
  • the air/water supply pipe 3319 supplies air or water by activating the air/water supply device 3380 in specific cases.
  • the specific case is, for example, a case where it is desired to wash the residue around the lesion or a case where it is desired to expand the surrounding area around the lesion from the inside, but other cases may also be used.
  • the air/water supply device 3380 is realized by including a pump, gas cylinder, water tank, etc. (not shown).
  • the air/water supply device 3380 is connected to the control unit 3332 described later, and when an air supply button or a water supply button (not shown) is pressed, gas or liquid is ejected from the nozzle 3318 through the air/water supply pipe 3319.
  • in FIG. 36, one air/water supply pipe 3319 is schematically illustrated, but a pipe for gas and a pipe for liquid may be arranged in parallel and joined in front of the nozzle 3318.
  • the processing device 3330 performs image processing and control of the entire system.
  • the processing device 3330 includes a pre-processing section 3331 , a control section 3332 , a storage section 3333 , a detection processing section 3335 and a post-processing section 3336 .
  • a preprocessing unit 3331 performs A/D conversion for converting analog signals sequentially output from the image sensor 3312 into digital images, and various correction processes for image data after A/D conversion. Note that an A/D conversion circuit may be provided in the image sensor 3312 and the A/D conversion in the preprocessing section 3331 may be omitted.
  • the correction processing here includes, for example, color matrix correction processing, structure enhancement processing, noise reduction processing, AGC, and the like.
  • the preprocessing unit 3331 may also perform other correction processing such as white balance processing.
  • the preprocessing unit 3331 outputs the processed image to the detection processing unit 3335 as an input image.
  • the pre-processing unit 3331 also outputs the processed image to the post-processing unit 3336 as a display image.
  • the detection processing unit 3335 performs detection processing for detecting a region of interest such as a lesion from the input image.
  • the attention area detection processing is not essential, and the detection processing unit 3335 can be omitted.
  • a post-processing unit 3336 performs post-processing based on the outputs of the pre-processing unit 3331 and the detection processing unit 3335 and outputs the post-processed image to the display unit 3340 .
  • the post-processing unit 3336 may add the detection result of the detection processing unit 3335 to the display image and display the added image.
  • a user who is an operator treats a lesion area in the living body while viewing an image displayed on the display unit 3340 .
  • the treatment here is, for example, treatment for resecting lesions such as EMR and ESD described above.
  • the control unit 3332 is connected to the imaging element 3312, the preprocessing unit 3331, the detection processing unit 3335, the postprocessing unit 3336, and the light source 3352, and controls each unit.
  • the acquisition unit 3110 acquires input data and air supply/suction information, which will be described later, based on control information from the control unit 3332, for example.
  • the acquisition unit 3110 also acquires pressing pressure information, which will be described later, based on sensor information from a motion sensor provided in the insertion unit 3310b, for example.
  • the processing unit 3120 performs skill evaluation using the pressing pressure information and the air supply/suction information.
  • the output processing unit 3130 outputs skill evaluation information to the display unit 3340 and external devices connected to the endoscope system 3300 .
  • FIG. 37 is a diagram illustrating the configuration of the distal end portion 3011 of the insertion portion 3310b.
  • the distal end portion 3011 has a substantially circular cross-sectional shape, and is provided with an objective optical system 3311 and an illumination lens 3314 as described above with reference to FIG.
  • the insertion portion 3310b is provided with a channel, which is a cavity, connecting from the operation portion 3310a to the opening portion 3316 of the distal end portion 3011.
  • the opening 3316 here is an opening for a treatment tool 3360 called a forceps opening.
  • FIG. 37 illustrates the configuration of the distal end portion 3011 having two illumination lenses 3314, one objective optical system 3311, one opening 3316, and one nozzle 3318, but various modifications of the specific configuration are possible.
  • the treatment instrument 3360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife.
  • High frequency knives include needle knives, IT knives, hook knives, and the like.
  • a needle knife is used for ESD marking.
  • a high frequency knife is used for the incision.
  • a high-frequency snare or high-frequency knife is used for peeling.
  • the treatment instrument 3360 may also include other instruments such as injection needles, forceps, and clips.
  • An injection needle is used for local injection of ESD. Forceps or clips are used to stop bleeding.
  • FIG. 38 is a diagram showing a configuration example of a system including the processing system 3100. As shown in FIG. 38, the system includes a plurality of endoscope systems 3300 and the processing system 3100.
  • the processing system 3100 is a server system connected to each of the plurality of endoscope systems 3300 via a network.
  • the server system here may be a server provided in a private network such as an intranet, or a server provided in a public communication network such as the Internet.
  • the processing system 3100 may be configured by one server device, or may include a plurality of server devices.
  • the processing system 3100 may include a database server that collects pressing pressure information and air supply/suction information from a plurality of endoscope systems 3300, and a processing server that performs skill evaluation.
  • the database server may also collect other information, such as physician information, patient information, etc., as described below.
  • the processing system 3100 may perform skill evaluation based on machine learning, as described later.
  • the processing system 3100 may include a learning server that generates a trained model by performing machine learning using data collected by the database server as learning data.
  • the processing server performs skill evaluation based on the trained model generated by the learning server.
  • when the processing system 3100 can connect with a plurality of endoscope systems 3300, data can be collected efficiently. For example, since it is easy to increase the amount of learning data used for machine learning, the accuracy of the skill evaluation can be improved.
  • the pressing pressure is the force with which the tip of the endoscope or the like presses the target site or the like.
  • the tip or the like is the tip portion 3011 as shown in FIG. 39(A).
  • the pressing pressure is not limited to this, and may be the force with which the protruding treatment instrument 3360 pushes the target site or the like, or the force with which a cap (not shown) presses the target site. Note that the cap protects the distal end portion 3011. Further, as shown in FIG. 39(B), the pressing pressure is not limited to that of the distal end portion 3011, and may be the force with which the bending portion 3012 presses the target site or the like.
  • when the bending portion 3012 is pressed against the site, the movement of the portion indicated by G in FIG. 39(B) is restricted. This stabilizes the movement of the distal end portion 3011, thereby facilitating position adjustment.
  • the target site or the like refers to the target organ or the like, but it does not matter whether it is a lesion site or a normal site.
  • the endoscope system 3300 includes a pressure sensor (not shown) provided at the distal end portion 3011 of the insertion portion 3310b to obtain the pressing pressure.
  • the pressure detection method of the pressure sensor here is, for example, a strain gauge circuit type pressure detection method, but may be another method such as a capacitance type pressure detection method.
  • the pressing pressure may be obtained by a method of obtaining the movement amount of the pressed target part based on the captured image and estimating the contact pressure from the movement amount. Furthermore, the pressing pressure may be obtained by combining this estimation method with the above-described method using a pressure sensor. For example, if no pressure is detected by the pressure sensor included in the distal end portion 3011 but the organ appears to be bent from the captured image, it can be estimated that the bending portion 3012 is pressing against it. Since the method of estimating the contact pressure from the captured image is well known, the description thereof will be omitted.
  • by transmitting the pressing pressure data measured by the methods described above to the control unit 3332 or the like, the acquisition unit 3110 can acquire the pressing pressure information. Note that specific processing for obtaining the pressing pressure information will be described later with reference to FIGS. 47, 50, 51, and the like. The same applies to the acquisition of the air supply/suction information.
  • the air supply/suction information can be obtained based on the time for which the suction button or air supply button (not shown) is pressed and the flow rate set in the suction device 3370 or the air/water supply device 3380 described above.
  • a flow meter (not shown) may be provided at a predetermined position, and the air supply/suction information may be obtained from the flow meter based on the flow rate data.
  • the predetermined position is, for example, the suction pipe 3317 or the air/water supply pipe 3319 , but may be inside the suction device 3370 or the air/water supply device 3380 .
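The button-press-time and set-flow-rate method described above amounts to a simple product of duration and flow rate; a minimal sketch, where the function name, units, and values are illustrative assumptions:

```python
# Hypothetical sketch: estimate the supplied or suctioned volume from the
# duration for which the button was pressed and the flow rate configured on
# the suction device or air/water supply device. Units are assumed (ml, s).

def estimated_volume_ml(press_seconds, flow_rate_ml_per_s):
    # Volume = press duration x configured flow rate.
    return press_seconds * flow_rate_ml_per_s

# e.g. air supply button held for 2.5 s at a set flow rate of 20 ml/s
print(estimated_volume_ml(2.5, 20.0))  # -> 50.0
```

A flow meter, when present, would replace this estimate with measured flow data.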
  • the pressing pressure depends on the tension of the organ, so it is thought that information on the tension of the organ can be obtained by measuring the air pressure near the target area.
  • however, since the tension of an organ also depends on other factors, it is difficult to estimate the tension of the organ from air pressure information alone. Other factors include, for example, the softness of the organ and the diffusion of supplied gas into other organs.
  • a pressure sensor or the like is necessary to obtain pressing pressure information
  • a flow meter or the like is necessary to obtain air supply/suction information.
  • the operator performs a predetermined treatment, and after a predetermined period has elapsed since the treatment, the skill evaluation sheet 3400 shown in FIG. 40 is output to a predetermined display unit as skill evaluation information.
  • a predetermined period is, for example, the period until the patient who is the target of the treatment is discharged from the hospital.
  • the predetermined display unit is, for example, the display unit 3340 described above, but may be a display unit of an external device connected to the endoscope system 3300 .
  • the predetermined treatment is, for example, the ESD including multiple steps as described above, but may be other treatments including multiple steps. In other words, treatment with the endoscopic treatment tool 3360 includes multiple steps.
  • the skill evaluation sheet 3400 includes, for example, a doctor information icon 3410 and a case sheet 3420.
  • the case sheet 3420 for example, the result of comprehensive evaluation and the result of skill evaluation for each stage of treatment are displayed as a breakdown of the comprehensive evaluation.
  • the output processing unit 3130 outputs skill evaluation information at each stage of a plurality of stages.
  • the skill can be evaluated by subdividing it for each stage, so that the accuracy of the skill evaluation can be further improved. For example, skill evaluation is performed for the stages of marking, local injection, incision, and peeling, but as shown in FIG. 40, skill evaluation may be performed for five stages including hemostasis.
  • the number of treatment stages to be evaluated is not limited to five; other stages may be added, or the stages may be reduced as long as two or more remain.
  • the multiple steps include at least two of a marking step, a local injection step, an incision step, and an ablation step.
  • the evaluation results of each stage are displayed by a marking evaluation icon 3440, a local injection evaluation icon 3442, an incision evaluation icon 3444, a peeling evaluation icon 3446, and a hemostasis evaluation icon 3448, each indicating a grade of A, B, C, or D.
  • these evaluation results may be displayed in a radar chart format.
  • the display format of the skill evaluation is not limited to the radar chart, and may be realized in the form of a bar graph, a line graph, or the like, and various modifications are possible. Furthermore, the display format of the skill evaluation may be changed.
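For illustration, the four-level A to D display described above could be produced by mapping a numeric per-stage score to a letter grade. The 0-100 score scale and the thresholds below are hypothetical assumptions, not values from this disclosure.

```python
# Hypothetical sketch: map a per-stage score (e.g. similarity to expert log
# data on a 0-100 scale) to the four-level grade shown on the case sheet.

def grade(score):
    # Thresholds are illustrative assumptions.
    if score >= 90:
        return "A"
    if score >= 70:
        return "B"
    if score >= 50:
        return "C"
    return "D"

# Example per-stage scores for the five stages shown in FIG. 40 (hypothetical).
stages = {"marking": 92, "local injection": 75, "incision": 64,
          "peeling": 55, "hemostasis": 40}
print({stage: grade(s) for stage, s in stages.items()})
```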
  • advice information may be output for each stage.
  • the advice information is, specifically, advice information regarding at least one of the pressing pressure information and the air supply/suction information.
  • the output processing unit 3130 may output advice information regarding at least one of the pressing pressure information and the air supply/suction information.
  • as shown in FIG. 41, when the peeling evaluation icon 3446 is selected on the display screen, a peeling advice display 3476 is displayed, and when the hemostasis evaluation icon 3448 is selected, a hemostasis advice display 3478 is displayed; in each case, advice regarding the pressing pressure information and the air supply/suction information is displayed.
  • a marking advice display, a local injection advice display, and an incision advice display are also displayed, but they are omitted in FIG.
  • the method of displaying advice is not limited to the method shown in FIG. 41, and various modifications such as displaying on a separate screen are possible. Also, although illustration is omitted, when the comprehensive evaluation icon 3430 is selected, advice regarding the comprehensive evaluation may be displayed. As a result, highly accurate skill evaluation can be performed, and specific information can be provided to the operator.
  • for the pressing pressure information and the air supply/suction information, the data of the operator to be evaluated is compared with the data of a skilled doctor, and the difference information is included in the displayed advice information.
  • pressing pressure information, air supply/suction information, and the like acquired by an operation performed by a skilled doctor may be referred to as expert data.
  • the output processing unit 3130 displays, as advice information, the difference from the expert data regarding at least one of the pressing pressure information and the air supply/suction information.
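A minimal sketch of how such difference-based advice might be generated from summary statistics of the pressing pressure logs; the function name, units, wording, and the 20% tolerance are hypothetical assumptions.

```python
# Hypothetical sketch: build advice text from the difference between the
# operator's pressing pressure fluctuation amplitude and expert data.

def pressure_advice(operator_amplitude, expert_amplitude, tolerance=0.2):
    # Positive diff means the operator's pressure swings more than the expert's.
    diff = operator_amplitude - expert_amplitude
    if diff > expert_amplitude * tolerance:
        return (f"Pressing pressure fluctuation is {diff:.1f} units larger than "
                "expert data; use air supply/suction to damp the swings.")
    return "Pressing pressure fluctuation is comparable to expert data."

print(pressure_advice(6.0, 1.5))
```

The same comparison could be made per stage and for the air supply/suction counts, feeding the per-stage advice displays described above.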
  • the advice information may include other information.
  • for the marking stage, for example, it may be confirmed that an evaluation result equivalent to that of a skilled doctor was obtained, the reason for the evaluation result may be displayed, or advice prompting the operator to refer to log information, which will be described later, may be displayed.
  • FIG. 42 is a diagram schematically showing an example of log information of pressing pressure information and air supply/suction information in an operation performed by a skilled doctor.
  • FIG. 43 is a diagram schematically showing an example of log information of pressing pressure information and air supply/suction information in an operation performed by a trainee doctor. FIGS. 42 and 43 schematically show the relationship between the pressing pressure and air supply/suction, and the horizontal axis does not indicate a specific length of time.
  • the skill evaluation information of this modified example includes log information of pressing pressure information and air supply/suction information.
  • individual case sheets 3420 include log data icons 3450 .
  • when the log data icon 3450 is selected, log information of the pressing pressure information shown in FIG. 41 is displayed.
  • the output processing unit 3130 outputs log information about pressing pressure information and air supply/suction information.
  • highly accurate skill evaluation can be performed, and more specific information such as log information about pressing pressure information and air supply/suction information can be provided to the operator.
  • here, "during treatment" refers to a period in which a high-frequency current is supplied to the treatment instrument 3360 or the like, without specifying particular steps such as marking, local injection, incision, and peeling.
  • a high-frequency device is a device that is used to excise and cauterize a target tissue by applying a high-frequency current.
  • High frequency devices include high frequency snares and high frequency knives.
  • Energization means that high-frequency current is supplied from the power supply to the high-frequency device, and the energization state can be determined based on the control signal of the power supply.
  • the period during treatment is the period of treatment by the treatment tool 3360 of the endoscope.
  • the pressing pressure depends on the tension of the organ caused by the peristaltic movement of the organ. Therefore, when the operator performs no operation after pressing the endoscope against the target site with a predetermined force, the waveform of the measured pressing pressure becomes a periodic waveform that changes with the peristaltic movement of the organ. It is difficult to adjust the position while the pressing pressure keeps fluctuating periodically. In the log information of an operation performed by a skilled doctor, the amplitude of the pressing pressure waveform is small because of detailed air supply and suction work in the pre-treatment stage.
  • a skilled doctor utilizes a predetermined property and presses the air supply button when the pressing pressure tends to decrease, thereby stopping the fall of the pressing pressure and keeping the fluctuation amplitude small. Conversely, when the pressing pressure tends to rise, pressing the suction button stops the rise of the pressing pressure and keeps the fluctuation amplitude small. As a result, the skilled doctor can quickly stabilize the pressing pressure, so that the position adjustment can be completed in a short period of time.
  • The predetermined property is that when the internal pressure of the organ rises, the organ expands and hardens and the pressing pressure increases, and conversely, when the internal pressure of the organ decreases, the pressing pressure decreases. Although this property does not hold in every situation, it is assumed here that the pressing pressure can be controlled by exploiting it.
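The air supply/suction adjustment described in the preceding bullets can be sketched as a simple decision rule. The following Python fragment is illustrative only (the function name, dead-band threshold, and return values are assumptions; the patent describes the behavior of a skilled doctor, not an algorithm):

```python
def stabilization_action(pressure_history, threshold=0.1):
    """Suggest an air-supply/suction action from the recent pressing-pressure trend.

    Hypothetical helper: a falling trend is countered by air supply (raising
    the organ's internal pressure), a rising trend by suction, mirroring the
    predetermined property described above. `threshold` is an assumed dead
    band in which no action is taken.
    """
    if len(pressure_history) < 2:
        return "none"
    trend = pressure_history[-1] - pressure_history[-2]
    if trend < -threshold:
        return "air_supply"   # pressure falling -> expand organ to stop the drop
    if trend > threshold:
        return "suction"      # pressure rising -> deflate organ to stop the rise
    return "none"             # within dead band: leave as is
```

For example, `stabilization_action([1.2, 1.0])` returns `"air_supply"`, matching the skilled doctor's response to a falling pressure.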
  • The trainee doctor, preoccupied with manipulating the endoscope to adjust the position, did not notice that the pressing pressure was rising, and did not perform air supply or suction.
  • When the pressing pressure is not stable, position adjustment takes time.
  • FIG. 42 is a schematic diagram, and the length of the horizontal axis during treatment is not the same as the length before and after treatment.
  • When the novice doctor starts the treatment while the pressing pressure is still rising, he or she rushes to perform abrupt suction during the treatment, causing a sudden drop in the pressing pressure; air is then supplied in haste in response, and the pressing pressure rises sharply again. As a result, the measured pressing pressure fluctuates greatly.
  • skill evaluation may be performed based on pressing pressure information and air supply/suction information during a treatment period using the treatment tool 3360 of the endoscope.
  • the processing unit 3120 may perform skill evaluation based on the pressing pressure information and the air supply/suction information during the treatment period using the treatment tool 3360 of the endoscope.
  • the skill can be evaluated in a period of high importance, so that the accuracy of skill evaluation can be increased.
  • Although illustration is omitted, adding images showing skill evaluations A to D for the treatment period to the skill evaluation sheet 3400 realizes skill evaluation for that period.
  • the method is not limited to this method, and for example, skill evaluation may be performed by displaying advice during the period of treatment on the exfoliation advice display 3476 in FIG. 41 .
  • skill evaluation may be performed only during the period prior to the treatment period.
  • the processing unit 3120 may perform skill evaluation based on the pressing pressure information and the air supply/suction information during the period prior to the treatment period by the treatment tool 3360 of the endoscope.
  • Information for grasping the quality of the pre-treatment setup can be obtained from the behavior of the pressing pressure information and the air supply/suction information before treatment, and performing skill evaluation based on this information can raise the accuracy of the skill evaluation.
  • advice may be displayed regarding the pressing pressure information and the air supply/suction information in the period before the treatment.
  • The post-treatment period is the preparation period for the next step, or for the next treatment in the same step, so the behavior of the pressing pressure and of air supply and suction after treatment is the same as before treatment. In the pressing-pressure behavior of the expert doctor, the amplitude of fluctuation is kept small by fine air supply and suction, whereas the pressing-pressure behavior of the novice doctor shows a large fluctuation range.
  • a safety range SA may be set and displayed with respect to the behavior of the measured pressing pressure.
  • the processing unit 3120 acquires the permissible pressing pressure range information and adds processing for displaying the safe range SA, thereby realizing the display of the log shown in FIG. 42 .
  • Skill evaluation may be performed for arbitrary intervals according to the pressing pressure measurement results. For example, in period H1 of FIG. 44 the measured pressing pressure is outside the safe range SA throughout, so it is evaluated as very unfavorable from the viewpoint of safety.
  • In period H2, part of the pressing pressure measurement results is outside the safe range SA, so it is evaluated as somewhat unfavorable from the viewpoint of safety.
  • In period H3, the pressing pressure measurement results are within the safe range SA throughout, so a favorable evaluation is given from the viewpoint of safety.
  • Although illustration is omitted, a specification may be adopted in which an image showing these evaluations is displayed.
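The period-by-period grading against the safe range SA described for H1 to H3 can be sketched as follows (a minimal sketch; the function name and grade strings are illustrative, not from the patent):

```python
def evaluate_period(pressures, sa_low, sa_high):
    """Grade one evaluation period against the safe range SA.

    Hypothetical grading, following the description of periods H1-H3:
    entirely outside SA -> "very unfavorable", partly outside ->
    "somewhat unfavorable", entirely inside -> "favorable".
    """
    inside = [sa_low <= p <= sa_high for p in pressures]
    if not any(inside):
        return "very unfavorable"    # H1 case: outside SA throughout
    if all(inside):
        return "favorable"           # H3 case: inside SA throughout
    return "somewhat unfavorable"    # H2 case: partly outside SA
```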
  • It is necessary to set the pressing pressure safety range SA in advance; in other words, it is necessary to know what range of pressing pressure is permissible for performing treatment safely. For example, if there is data from a similar case treated by a skilled doctor in the past, the permissible pressing pressure range can be set based on that data.
  • The log information of the pressing pressure information and the air supply/suction information may be displayed in real time, for example on the display unit 3340, during surgery. Furthermore, a notification may be issued in real time at the timing when the pressing pressure measurement result moves from within the safe range SA to outside it.
  • the notification here may be notification by sound, vibration, or the like, in addition to notification by the display unit 3340 or the like. In this way, the operator who is performing the treatment can be immediately notified of the abnormality, so that trouble can be prevented.
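The real-time notification at the timing when the measurement leaves the safe range SA can be sketched as follows (`notify` stands in for the display unit 3340, sound, or vibration output; all names are assumptions):

```python
def notify_on_exit(pressure_stream, sa_low, sa_high, notify):
    """Call `notify` at each timing where the measurement moves from inside
    the safe range SA to outside it, so the operator can be alerted
    immediately. `notify` receives the sample index and the value."""
    was_inside = True
    for t, p in enumerate(pressure_stream):
        inside = sa_low <= p <= sa_high
        if was_inside and not inside:
            notify(t, p)   # pressure just left the safe range
        was_inside = inside
```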
  • the skill evaluation information of this modified example may further include doctor information.
  • For example, selecting the doctor information icon 3410 on the skill evaluation sheet 3400 displays detailed doctor information. Specifically, it includes information on individual cases in addition to physical characteristics and surgical records; the physical characteristics include height, weight, hand size, and the like.
  • the surgical record includes information such as the number of experienced cases and cumulative surgical time.
  • The case information also includes the time required for the surgery, the degree of difficulty of the surgery, the number of times guidance was received, the content of that guidance, and the like. Information specifying a school of practice may also be included in the doctor information. As described above, the quality of endoscopic procedures depends on the doctor's experience and tacit knowledge of the operations.
  • The doctor information may include name, sex, date of birth, registration information, and date of registration, and these items may be linked to a predetermined database. The doctor information may further include academic achievements such as academic conference activities. It is not necessary to display all of the information described above; for example, the information may be stored in a predetermined storage area in a list format as shown in the figure, and only part of it may be displayed on the skill evaluation sheet 3400. Including the doctor information in the skill evaluation information in this way makes it possible to improve the accuracy of the skill evaluation.
  • the skill evaluation information of this modified example may further include patient information.
  • The patient information includes the patient's own information, lesion information, and postoperative status information.
  • the patient's own information includes, for example, name, age, sex, etc., and may include information on whether or not the patient uses an anticoagulant. This information is useful for judging the degree of difficulty of surgery, since the use of anticoagulants makes bleeding more likely.
  • the patient's own information may also include treatment history information. This information is also useful information because, for example, re-exfoliation treatment becomes difficult due to fibrosis or the like at a site that has been subjected to ESD treatment in the past.
  • The lesion information includes site information, tissue characterization information, and bleeding information, and may further include subdivided information as shown in the figure.
  • information on the state after surgery includes information on the amount of bleeding, the incidence of complications, and the number of days of hospitalization. For example, if the incidence of complications after surgery is low or the number of days required for hospitalization is small, the skill of the surgeon in charge is evaluated as high. By including patient information in the evaluation skill information in this way, it is possible to improve the accuracy of skill evaluation.
  • Assume that, on the skill evaluation sheets 3400 of multiple operators, the marking evaluation icon 3440, local injection evaluation icon 3442, incision evaluation icon 3444, peeling evaluation icon 3446, and hemostasis evaluation icon 3448 all show the same evaluation, for example "A". Further assume that, when the patient information of these operators is compared, there is a large difference in the incidence of postoperative complications and in the number of days required for hospitalization. In this case, the comprehensive evaluation icon 3430 on the skill evaluation sheet 3400 of a predetermined operator may be displayed as "A+" and that of a specific operator as "A-", so that skill evaluation is performed in more detail.
  • a predetermined operator is an operator whose incidence of complications after surgery is lower than the average value and whose number of days required for hospitalization is less than the average value.
  • A specific operator is an operator whose postoperative incidence of complications is higher than the average value and whose number of days required for hospitalization is longer than the average value. Even for operators with a comprehensive evaluation of B, C, and so on, a similarly detailed evaluation may be performed. Including patient information in the skill evaluation information in this way makes it possible to improve the accuracy of the skill evaluation.
  • the processing unit 3120 determines whether the endoscope system 3300 is operating (step S3501).
  • the processing unit 3120 corresponds to, for example, the processing device 3330 described above, but may be expanded to include a scope unit 3310, a display unit 3340, a light source device 3350, and the like.
  • the peripheral device corresponds to, for example, a treatment tool 3360 for performing treatment, but may be expanded to include a suction device 3370, an air/water supply device 3380, a power supply device for supplying power to the treatment tool 3360, and the like.
  • Alternatively, the processing from step S3502 onward in FIG. 47, described later, may be started at the same time the endoscope system 3300 is activated.
  • In step S3502, pressing pressure information acquisition processing is performed.
  • In step S3503, air supply/suction information acquisition processing is performed.
  • Next, it is determined whether the operation of the endoscope system 3300 has ended (step S3504). While the endoscope system 3300 is operating (NO in step S3504), steps S3502 and S3503 are repeatedly executed; in other words, the pressing pressure information and the air supply/suction information are acquired while the endoscope system 3300 operates. Although illustration is omitted, a process of associating timing information with the pressing pressure information and the air supply/suction information may be added.
  • the timing information includes, for example, information indicating the timing of each stage, and may further include timing information such as before treatment, during treatment, and after treatment.
  • The timing information can be obtained based on the usage information of the peripheral devices. For example, by acquiring information about the energization state of the high-frequency device as usage information, the processing unit 3120 can determine, based on the control signal, that the treatment has started when high-frequency current is supplied to the high-frequency device.
  • log information output processing (step S3505) is performed.
  • The output process includes, for example, a process of plotting the acquired pressing pressure information, air supply/suction information, and the like against the time axis, and a process of associating each stage of the treatment with the time axis based on the timing information described above. It may also include a process of identifying, from the plotted result, locations that cause the evaluation to be low, and various other modifications are possible. In this way, the log information can be displayed on a predetermined display unit.
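The acquisition loop of steps S3502 to S3505 can be sketched as follows. The `system` and `logger` interfaces are hypothetical stand-ins for the endoscope system 3300, its peripheral devices, and the display unit; the patent defines no such API:

```python
import time

def acquisition_loop(system, logger):
    """Sketch of the flow of FIG. 47 (steps S3502-S3505).

    `system` is assumed to expose the measurement and peripheral-device
    state; `logger` stands in for the log information output process."""
    log = []
    while system.is_operating():                     # S3504 loop condition
        pressure = system.read_pressing_pressure()   # S3502
        air_suction = system.read_air_suction()      # S3503
        # Associate timing information with each sample; the treatment
        # phase is inferred from the high-frequency device's energization.
        phase = "during" if system.hf_energized() else "outside"
        log.append((time.time(), pressure, air_suction, phase))
    logger.output(log)                               # S3505: log output
```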
  • Machine learning using the neural network NN2 will be described below, but the technique of this modification is not limited to this.
  • machine learning using other models such as SVM may be performed, or machine learning using techniques developed from these techniques may be performed.
  • FIG. 48 is a schematic diagram explaining the neural network NN2.
  • the neural network NN2 has an input layer to which data is input, an intermediate layer that performs operations based on the output from the input layer, and an output layer that outputs data based on the output from the intermediate layer.
  • FIG. 48 illustrates a network with two intermediate layers, but the number of intermediate layers may be one, or three or more. Also, the number of nodes included in each layer is not limited to the example shown in FIG. 48, and various modifications are possible. Considering the accuracy, it is desirable to use deep learning using a multi-layered neural network NN2 for learning in this modified example.
  • the term “multilayer” as used herein means four or more layers in a narrow sense.
  • The nodes contained in a given layer are connected to the nodes of the adjacent layers.
  • a weighting factor is set for each connection.
  • Each node multiplies the output of each preceding node by the corresponding weighting factor and obtains the sum of the products; it then adds a bias to that sum and applies an activation function to the result to obtain its own output.
  • By repeating this processing from the input layer toward the output layer, the output of the neural network NN2 is obtained.
  • Various functions such as a sigmoid function and a ReLU function are known as activation functions, and these can be widely applied in this modified example.
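The per-node computation described above (weighted sum of the previous layer's outputs, plus a bias, passed through an activation function) can be expressed as a forward calculation. This is an illustrative sketch; ReLU is chosen as the activation, but a sigmoid would fit the description equally well:

```python
def forward(layers, x):
    """Minimal forward calculation matching the description of NN2.

    `layers` is a list of (weights, biases) pairs, one per connection
    between adjacent layers; all values are illustrative. Each node
    computes relu(sum(w_i * x_i) + bias)."""
    relu = lambda v: max(0.0, v)
    for weights, biases in layers:
        x = [relu(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x
```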
  • Learning in the neural network NN2 is a process of determining appropriate weighting coefficients.
  • the weighting factor here includes the bias.
  • An example in which processing for generating a trained model is performed in a learning device will be described below.
  • the learning device may be, for example, a learning server included in the processing system 3100 as described above, or may be a device provided outside the processing system 3100 .
  • the learning device inputs the input data of the learning data to the neural network NN2 and obtains the output by performing forward calculations using the weighting coefficients at that time.
  • the learning device calculates an error function based on the output and the correct label in the learning data. Then, the weighting coefficients are updated so as to reduce the error function.
  • an error backpropagation method can be used to update the weighting coefficients from the output layer toward the input layer.
  • the neural network NN2 may be CNN, RNN, or other models.
  • Even in that case, the processing procedure is the same as described above: the learning device inputs the input data of the learning data to the model and obtains the output by performing a forward calculation, according to the model configuration, using the weighting coefficients at that time.
  • An error function is calculated based on the output and the correct label, and the weighting coefficients are updated so as to reduce the error function.
  • the error backpropagation method can also be used when updating the weighting coefficients of CNN or the like.
  • the input to the neural network NN2 is, for example, pressing pressure information and air supply/suction information.
  • the input of the neural network NN2 is time series data, but may be a statistic calculated based on the time series data.
  • the output of the neural network NN2 is, for example, information representing the rank when the skill of the user to be evaluated is ranked in M stages.
  • M is an integer of 2 or more.
  • Rank I is higher in skill than rank I+1, where I is an integer of 1 or more and less than M; that is, rank 1 represents the highest skill and rank M the lowest.
  • the output layer of neural network NN2 has M nodes.
  • The output of the first node is information representing the probability that the skill of the user corresponding to the input data belongs to category 1.
  • Likewise, the outputs of the second to M-th nodes are information representing the probabilities that the input data belongs to categories 2 to M, respectively.
  • the M outputs are sets of probability data that sum to one.
  • Category 1 to category M are categories corresponding to rank 1 to rank M, respectively.
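Assuming a softmax output layer (an assumption: the patent only states that the M outputs are probability data summing to one), the conversion from output-node values to per-rank probabilities could look like:

```python
import math

def rank_probabilities(logits):
    """Convert M output-node values into probability data that sums to 1,
    as described for the output layer of NN2. Softmax is assumed; the
    max is subtracted for numerical stability."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]
```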
  • The learning device collects the pressing pressure information and air supply/suction information acquired when a large number of operators performed treatments using flexible endoscopes, and may associate metadata with the collected data.
  • the metadata here is, for example, the patient information described above, but may also include doctor information.
  • For data associated with predetermined patient information, the learning device assigns a correct label of rank A as the evaluation.
  • The predetermined patient information here is, for example, patient information in which the incidence of postoperative complications is lower than the average value and the number of days required for hospitalization is shorter than the average value.
  • For data associated with specific patient information, a correct label of rank D is assigned as the evaluation.
  • The specific patient information here is information on a patient whose incidence of postoperative complications is significantly higher than the average value and whose number of days required for hospitalization is significantly longer than the average value.
  • In other words, the learning device assigns the operator's skill to one of M levels based on the patient information, which is metadata.
  • the skilled doctor may manually evaluate the skill of each trainee for each case and input the evaluation result into the learning device.
  • a trained model for when the postoperative course is good and a trained model for when the postoperative course is not good may be prepared.
  • In that case, the processing unit 3120 selectively uses the trained model for when the postoperative course is good and the trained model for when it is not.
  • FIG. 50 is a flowchart explaining the learning process of the neural network NN2.
  • the learning device acquires pressing pressure information for learning and air supply/suction information for learning.
  • the process of step S3101 corresponds to, for example, a process in which the learning server reads out a set of pressing pressure information and air supply/suction information from a large amount of data accumulated in the database server.
  • The distinction between pressing pressure information and pressing pressure information for learning merely reflects whether the data is used in the inference stage of skill evaluation or in the learning stage; the data itself is similar. Data used as pressing pressure information for inference at a given timing may also be used as pressing pressure information for learning at a subsequent timing. The same applies to the air supply/suction information and the air supply/suction information for learning.
  • In step S3102, the learning device acquires the correct label associated with the data read out in step S3101.
  • the correct label is, for example, the result of evaluating the skill of the user who has operated the endoscope in M stages, as described above.
  • In step S3103, the learning device performs processing for obtaining an error function. Specifically, the learning device inputs the pressing pressure information and the air supply/suction information to the neural network NN2 and performs a forward calculation based on the input and the weighting coefficients at that time. The learning device then obtains the error function by comparing the calculation result with the correct label. For example, if the correct label is rank 1, the learning device takes the correct value of the first node, corresponding to category 1, to be 1 and the correct values of the second to M-th nodes, corresponding to categories 2 to M, to be 0, and obtains the error function. Furthermore, in step S3103, the learning device updates the weighting coefficients so as to reduce the error function; the error backpropagation method or the like can be used for this processing, as described above. The processing of steps S3101 to S3103 corresponds to one learning step based on one piece of learning data.
  • the learning device determines whether or not to end the learning process.
  • the learning device may hold a part of a large amount of learning data as evaluation data.
  • the evaluation data is data for confirming the accuracy of the learning result, and is data that is not used for updating the weighting coefficients.
  • the learning device ends the learning process when the accuracy rate of the estimation process using the evaluation data exceeds a predetermined threshold.
  • If No in step S3104, the process returns to step S3101 and the learning process continues with the next learning data; if Yes in step S3104, the learning process is terminated.
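One pass of steps S3101 to S3103 (forward calculation, error function against the correct label, backpropagation update of the weighting coefficients including the bias) can be sketched on a single-layer softmax model standing in for NN2. This is a simplifying assumption for illustration; NN2 itself is multi-layer:

```python
import math

def train_step(w, b, x, label, lr=0.1):
    """One learning step of the loop S3101-S3103 on a single-layer softmax
    model (a stand-in for NN2): forward calculation, cross-entropy error
    against the one-hot correct label, and a gradient update of the
    weighting coefficients (including the bias). Returns the error value."""
    # Forward calculation with the current weighting coefficients.
    logits = [sum(wi * xi for wi, xi in zip(row, x)) + bi
              for row, bi in zip(w, b)]
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    probs = [e / sum(exps) for e in exps]
    # Error function: cross-entropy against the one-hot correct label.
    loss = -math.log(probs[label])
    # Backpropagation for softmax + cross-entropy: gradient = probs - one_hot.
    for k in range(len(w)):
        grad = probs[k] - (1.0 if k == label else 0.0)
        b[k] -= lr * grad
        for j in range(len(x)):
            w[k][j] -= lr * grad * x[j]
    return loss
```

Repeating `train_step` over the learning data, and stopping when accuracy on held-out evaluation data is sufficient, corresponds to the loop closed by step S3104.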
  • the learning device transmits the generated learned model information to the processing system 3100 .
  • the trained model is stored in a storage unit (not shown) included in processing system 3100 and read by processing unit 3120 .
  • Various techniques such as batch learning and mini-batch learning are known in machine learning, and these can be widely applied to this modified example.
  • In the case of unsupervised learning, the learning device ranks each of the M categories: for example, a category containing much data from experienced doctors is ranked high, and a category containing much data from trainee doctors is ranked low. Whether given data comes from a skilled doctor or a trainee can be determined from the aforementioned doctor information, patient information, and the like. Various modifications of the detailed processing are possible; for example, the learning data may be ranked in M steps in advance, and the learning device may rank the M categories based on the average or total rank of the data included in each category. Even with unsupervised learning, it is thus possible to generate a trained model that evaluates the user's skill in M stages based on the input, as in supervised learning.
  • FIG. 51 is a flowchart for explaining the skill evaluation process.
  • the acquisition unit 3110 acquires pressing pressure information for skill evaluation (step S3201), and acquires air supply/suction information (step S3202).
  • the processing unit 3120 performs inference processing based on the learned model (step S3203).
  • Specifically, the processing unit 3120 inputs the pressing pressure information and the air supply/suction information to the trained model and performs a forward calculation according to the learned weighting coefficients to obtain the output of the trained model.
  • the processing unit 3120 obtains the user's skill evaluation information based on the output. For example, the processing unit 3120 evaluates the user's skill in M stages based on the data with the largest value among the M outputs.
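Picking the rank from the M outputs, as just described, reduces to an argmax over the output nodes (a sketch; the 1-based rank convention follows the description above, with rank 1 the highest skill):

```python
def evaluate_skill(model_output):
    """Return the M-stage evaluation from the M output values: the node
    with the largest value determines the category, and categories
    correspond one-to-one to ranks 1..M."""
    best = max(range(len(model_output)), key=lambda i: model_output[i])
    return best + 1   # 1-based rank as in the description
```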
  • In other words, the processing unit 3120 performs the skill evaluation based on a trained model acquired by machine learning that classifies the pressing pressure information for learning and the air supply/suction information for learning into M (M is an integer of 2 or more) categories.
  • the trained model may be generated based on supervised learning or unsupervised learning.
  • the output processing unit 3130 outputs skill evaluation information, which is the skill evaluation result (step S3204).
  • The skill evaluation information here is, for example, information specifying which of the M evaluation results applies for each of the marking stage evaluation, local injection stage evaluation, incision stage evaluation, peeling stage evaluation, hemostasis stage evaluation, and comprehensive evaluation shown in the figure.
  • In this case, the storage unit of the processing system 3100 stores a separate trained model for each evaluation, and the processing of FIG. 51 is performed using the trained model corresponding to the skill to be evaluated.
  • When skill evaluation is limited to the treatment period, the processing of FIG. 51 is performed using the data and the trained model corresponding to the stage to which the treatment period belongs; the same applies when skill evaluation is limited to the period before the treatment period, and when the advice information described above is added.
  • the processing unit 3120 of the processing system 3100 evaluates the operator's skill by operating according to the learned model.
  • Calculations in the processing unit 3120 according to the trained model, that is, calculations for outputting output data based on input data, may be performed by software or by hardware.
  • the sum-of-products operation and the like executed at each node in FIG. 48 may be executed by software.
  • the above calculations may be performed by a circuit device such as an FPGA.
  • the above operations may be performed by a combination of software and hardware.
  • a trained model includes an inference algorithm and weighting factors used in the inference algorithm.
  • An inference algorithm is an algorithm that performs forward calculations and the like based on input data.
  • both the inference algorithm and the weighting coefficient are stored in the storage unit, and the processing unit 3120 may perform the inference processing by software by reading out the inference algorithm and the weighting coefficient.
  • the inference algorithm may be implemented by FPGA or the like, and the storage unit may store the weighting coefficients.
  • an inference algorithm including weighting factors may be implemented by an FPGA or the like.
  • the storage unit that stores the information of the trained model is, for example, the built-in memory of the FPGA.
  • the processing unit 3120 may obtain an N-dimensional (N is an integer equal to or greater than 2) feature amount based on the pressing pressure information, the air supply/suction information, and the learned model.
  • The learning device may perform machine learning that classifies a plurality of pieces of pressing pressure information for learning and air supply/suction information for learning into M categories, in the same manner as the processing described above.
  • the acquisition unit 3110 acquires pressing pressure information and air supply/suction information that are subject to skill evaluation (steps S3201 and S3202).
  • the processing unit 3120 inputs the pressing pressure information and the air supply/suction information to the learned model, and performs forward calculations according to the learned weighting coefficients.
  • the processing unit 3120 obtains the data in the intermediate layer as an N-dimensional feature amount.
  • the neural network NN2 has the first to Q-th intermediate layers
  • the value in the J-th intermediate layer having N nodes is the N-dimensional feature amount.
  • Q is an integer of 2 or more
  • J is an integer of 1 or more and Q or less.
  • an N-dimensional feature amount may be obtained by combining outputs from multiple intermediate layers.
  • FIG. 52 is an example of an N-dimensional feature amount space.
  • the horizontal axis represents the feature amount A3 among the N-dimensional feature amounts, and the vertical axis represents the feature amount B3 different from the feature amount A3.
  • Although N = 2 here, N may be 3 or more.
  • By inputting a set of pressing pressure information and air supply/suction information into the trained model, the values of the first to N-th feature amounts are obtained; that is, the set is plotted as one point in the N-dimensional feature amount space.
  • The N-dimensional feature amount extracted by machine learning is a feature amount for classifying the input, consisting of pressing pressure information and air supply/suction information, into M categories; therefore, as shown in the figure, the plotted points form clusters corresponding to the categories.
  • The processing unit 3120 performs skill evaluation based on the distance between the position, in the feature amount space, of the N-dimensional feature amount obtained by inputting the pressing pressure information and air supply/suction information that are the targets of skill evaluation into the trained model, and the centroid position, in the feature amount space, of one or more of the M categories.
  • the position of the center of gravity here is information obtained based on the positions of a plurality of points included in each category, and is, for example, an average value of a plurality of coordinate values.
  • the centroid position of each category is known at the stage when learning is completed.
  • In other words, the processing unit 3120 obtains an N-dimensional (N is an integer of 2 or more) feature amount based on the pressing pressure information, the air supply/suction information, and the trained model, and performs skill evaluation based on the distance between the obtained N-dimensional feature amount and the centroid positions of the M categories.
  • the distance is, for example, the Euclidean distance, but other distances such as the Mahalanobis distance may be used.
  • Specifically, the processing unit 3120 finds, among the first to M-th categories, the category whose centroid is closest to the N-dimensional feature amount obtained by the forward calculation, and determines that the data to be evaluated belongs to this category.
  • For example, the processing unit 3120 determines the evaluation to be A when the distance from the centroid position of C31 is smallest, B when the distance from the centroid position of C32 is smallest, and C when the distance from the centroid position of C33 is smallest.
  • The feature amount A3 and the feature amount B3 in FIG. 52 are parameters extracted based on the pressing pressure information and the air supply/suction information; this does not preclude the feature amount A3 from corresponding to the pressing pressure information itself and the feature amount B3 to the air supply/suction information itself.
  • The processing unit 3120 may perform skill evaluation based on distances in the feature amount space defined by the feature amount A3, the first feature amount corresponding to the pressing pressure information, and the feature amount B3, the second feature amount corresponding to the air supply/suction information. For example, C31 in FIG. 52 is a category in which the relative change in pressing pressure is very small, so it is determined to be evaluation A.
  • C32 is a category with a large degree of air supply and suction, but its relative change in pressing pressure is smaller than that of C33, so it is determined to be evaluation B.
  • C33 has a smaller degree of air supply and suction than C32 but a larger relative change in pressing pressure, so it is determined to be evaluation C.
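The nearest-centroid decision over C31 to C33 can be sketched as follows (Euclidean distance is used, as in the description; the Mahalanobis distance could be substituted; category indices and centroid values are illustrative):

```python
import math

def nearest_category(feature, centroids):
    """Nearest-centroid decision: compute the Euclidean distance from the
    N-dimensional feature amount to each category's centroid position and
    return the index of the closest category (e.g. 0 -> C31/evaluation A,
    1 -> C32/evaluation B, 2 -> C33/evaluation C in the FIG. 52 example)."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(range(len(centroids)), key=lambda i: dist(feature, centroids[i]))
```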
  • in step S3204, the output processing unit 3130 outputs skill evaluation information, which is the result of the skill evaluation.
  • alternatively, an N-dimensional feature amount may be extracted by performing principal component analysis on inputs based on the pressing pressure information and the air supply/suction information. Since methods of performing principal component analysis are well known, a detailed description is omitted. Methods of performing principal component analysis using machine learning are also known, and machine learning may be applied in that case as well.
  • the processing after the extraction of the N-dimensional feature amount is the same as in the example above.
  • the skill evaluation method is not limited to the above.
  • the processing unit 3120 may perform the skill evaluation based on the distance between the plot point corresponding to the user to be evaluated and the plot point corresponding to a second user different from that user.
  • the second user here is, for example, an instructor, and the user to be evaluated is a user who receives guidance from that instructor. In this way, an index indicating how close the skill of the user to be evaluated is to the skill of the instructor can be output as the skill evaluation information.
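As a non-authoritative sketch of the centroid-distance classification described in the bullets above, the following Python fragment assigns an evaluation (A, B, or C, corresponding to categories C31 to C33) to a two-dimensional feature vector by finding the nearest category centroid. The feature values and function names are illustrative assumptions, not part of the disclosure; Euclidean distance is used here, though, as noted above, another distance such as the Mahalanobis distance could be substituted.

```python
import numpy as np

# Hypothetical 2-D feature vectors (feature amount A3, feature amount B3)
# extracted from pressing pressure information and air supply/suction
# information. Labels follow the embodiment: C31 -> "A", C32 -> "B", C33 -> "C".
training_features = {
    "A": np.array([[0.10, 0.20], [0.15, 0.25], [0.12, 0.18]]),
    "B": np.array([[0.50, 0.90], [0.55, 0.85], [0.48, 0.95]]),
    "C": np.array([[0.80, 0.40], [0.85, 0.35], [0.78, 0.45]]),
}

# Center of gravity (centroid) of each category in the feature amount space.
centroids = {label: pts.mean(axis=0) for label, pts in training_features.items()}

def evaluate_skill(feature: np.ndarray) -> str:
    """Return the label of the category whose centroid is nearest
    (in Euclidean distance) to the feature vector under evaluation."""
    return min(centroids, key=lambda label: np.linalg.norm(feature - centroids[label]))

print(evaluate_skill(np.array([0.5, 0.88])))  # prints "B"
```

The same `np.linalg.norm` distance can also compare the plot point of a user under evaluation with that of an instructor, as in the variation of the skill evaluation method described above.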
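The principal component analysis step mentioned above can likewise be sketched with plain NumPy. The input matrix, its dimensions, and the number of components below are illustrative assumptions; in the embodiment the inputs would be built from the pressing pressure information and the air supply/suction information.

```python
import numpy as np

# Rows: observations built from pressing pressure and air supply/suction
# information; columns: raw input dimensions (values are illustrative).
X = np.array([
    [0.10, 0.20, 0.30],
    [0.15, 0.22, 0.33],
    [0.50, 0.90, 0.10],
    [0.55, 0.85, 0.12],
])

def pca_features(X: np.ndarray, n_components: int) -> np.ndarray:
    """Project X onto its first n_components principal axes."""
    centered = X - X.mean(axis=0)
    # Singular value decomposition of the centered data; the rows of Vt
    # are the principal axes, ordered by decreasing explained variance.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_components].T

features = pca_features(X, n_components=2)  # N-dimensional feature amounts (N = 2)
print(features.shape)  # prints (4, 2)
```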
  • Illumination lens; 315, 2315, 3315 … Light guide; 316, 3316 … Opening; 317, 3317 … Suction pipe; 318, 3318 … Nozzle; 319, 3319 … Air/water pipe; 330, 2330, 3330 … Treatment device; 331, 2331, 3331 … Pretreatment part; 332, 2332, 3332 … Control part; 333, 2333, 3333 … Storage part; 335, 2335, 3335 … Detection processing part; 336, 2336, 3336 … Post-processing part; 340, 2340, 3340 … Display part; 350, 2350, 3350 … Light source device; 352, 2352, 3352 … Light source; 360, 2360, 3360 … Treatment instrument; 370, 3370 … Suction device; 380, 3380 … Air supply and water supply device; 400, 3400 … Skill evaluation sheet; 410, 3410 … Doctor information icon; 420, 3420 … Case sheet; 430, 3430 … Comprehensive evaluation icon; 440, 3440 … Marking evaluation icon; 442, 3442 … Local injection evaluation icon; 444, 3444 … Incision evaluation icon; 446, 3446 … Peeling evaluation icon; 448, 3448 … Hemostasis evaluation icon; 450, 3450 … Data log icon; 460, 3460 … Patient information icon; 470 … Marking advice display; 472 … Local injection advice display; 474 … Incision advice display; 3476 … Peeling advice display; 3478 … Hemostasis advice display; DB … Database; NN1, NN2 … Neural network

Abstract

This processing system (100) includes an acquisition unit (110), a processing unit (120), and an output processing unit (130). The acquisition unit (110) acquires approach angle information for an insertion unit (310b) of an endoscope and energization history information regarding the energization history of a treatment tool (360). The processing unit (120) performs a skill evaluation of a user operating the endoscope on the basis of the approach angle information and the energization history information. The output processing unit (130) outputs skill evaluation information, which is the result of the skill evaluation.

Description

Processing system and information processing method
The present invention relates to a processing system, an information processing method, and the like.
Conventionally, methods of performing treatment on a living body using an endoscope system are widely known. The quality of an endoscopic procedure depends largely on the doctor's experience and tacit operational knowledge. For this reason, many efforts have been made to evaluate the skills of doctors. For example, Patent Literature 1 discloses a method of evaluating a doctor's skill using motion data of a medical robot.
JP 2012-521568 A
The approach angle, which is the angle of the endoscope with respect to the tissue to be treated, has been found to be an effective parameter for judging a doctor's skill because it is directly linked to safety. However, it is difficult to judge the quality of the treatment as a whole by sensing the approach angle alone; in some cases it can be judged only in combination with, for example, the energization history information of a high-frequency device. Conventional methods such as Patent Literature 1 do not take such circumstances into account and are not sufficient for evaluating a doctor's skill.
One aspect of the present disclosure relates to a processing system including: an acquisition unit that acquires approach angle information of an insertion portion of an endoscope and energization history information related to the energization history of a treatment instrument; a processing unit that performs a skill evaluation of a user operating the endoscope based on the approach angle information and the energization history information; and an output processing unit that outputs skill evaluation information that is the result of the skill evaluation.
Another aspect of the present disclosure relates to an information processing method including: acquiring approach angle information of an insertion portion of an endoscope and energization history information related to the energization history of a treatment instrument; performing a skill evaluation of a user operating the endoscope based on the approach angle information and the energization history information; and outputting skill evaluation information that is the result of the skill evaluation.
FIGS. 1(A) to 1(F) are diagrams for explaining the positional relationship between an endoscope insertion portion and a tissue to be treated, and examples of captured images.
FIG. 2 is a diagram for explaining a configuration example of a processing system.
FIG. 3 is a diagram for explaining an appearance example of an endoscope system.
FIG. 4 is a diagram for explaining a configuration example of an endoscope system.
FIG. 5 is a diagram for explaining a configuration example of the distal end portion of an insertion section.
FIG. 6 is a diagram for explaining a configuration example of a system including the processing system.
FIG. 7 is a diagram for explaining an example of skill evaluation information.
FIG. 8 is a diagram for explaining an example of advice information.
FIG. 9 is a diagram for explaining an example of log information of approach angle information.
FIG. 10 is a diagram for explaining another example of log information of approach angle information.
FIG. 11 is a diagram for explaining an example of doctor information.
FIG. 12 is a diagram for explaining an example of patient information.
FIGS. 13(A) and 13(B) are diagrams for explaining an approach angle.
FIG. 14 is a flowchart for explaining a processing example of outputting approach angle information.
FIG. 15 is a diagram for explaining a neural network.
FIG. 16 is a diagram for explaining an example of input and output of a neural network.
FIG. 17 is a flowchart for explaining a processing example of learning processing.
FIG. 18 is a flowchart for explaining a processing example of skill evaluation processing, which is inference processing.
FIG. 19 is a diagram for explaining an example of clustering results in an n-dimensional feature amount space.
FIGS. 20(A) to 20(F) are diagrams for explaining the positional relationship between an endoscope insertion portion and a tissue to be treated, and examples of captured images, in a modified example.
FIG. 21 is a diagram for explaining a configuration example of a processing system in a modified example.
FIG. 22 is a diagram for explaining an appearance example of an endoscope system in a modified example.
FIG. 23 is a diagram for explaining a configuration example of an endoscope system in a modified example.
FIG. 24 is a diagram for explaining a configuration example of the distal end portion of an insertion section in a modified example.
FIGS. 25(A) and 25(B) are explanatory diagrams of an approach angle in a modified example.
FIG. 26 is a flowchart for explaining processing in a processing system in a modified example.
FIGS. 27(A) and 27(B) are examples of screens displaying relative changes in an approach angle in a modified example.
FIG. 28 is a diagram for explaining the time change of the relative change in an approach angle in a modified example.
FIG. 29 is an explanatory diagram of an allowable approach angle range in a modified example.
FIG. 30 is a diagram for explaining a configuration example of a system including a database in a modified example.
FIG. 31 is an explanatory diagram of classification processing of case data in a modified example.
FIG. 32 is a flowchart for explaining other processing of a processing system in a modified example.
FIG. 33 is a diagram for explaining an example of measurement results of pressing pressure in each stage of treatment in another modified example.
FIG. 34 is a diagram for explaining a configuration example of a processing system in another modified example.
FIG. 35 is a diagram for explaining an appearance example of an endoscope system in another modified example.
FIG. 36 is a diagram for explaining a configuration example of an endoscope system in another modified example.
FIG. 37 is a diagram for explaining a configuration example of the distal end portion of an insertion section in another modified example.
FIG. 38 is a diagram for explaining a configuration example of a system including a processing system in another modified example.
FIGS. 39(A) and 39(B) are diagrams for explaining an example of a method of measuring pressing pressure in another modified example.
FIG. 40 is a diagram for explaining an example of a skill evaluation sheet in another modified example.
FIG. 41 is a diagram for explaining an example of advice information in another modified example.
FIG. 42 is a diagram for explaining an example of log information of pressing pressure information and air supply/suction information in another modified example.
FIG. 43 is another diagram for explaining an example of log information of pressing pressure information and air supply/suction information in another modified example.
FIG. 44 is another diagram for explaining an example of log information of pressing pressure information and air supply/suction information in another modified example.
FIG. 45 is a diagram for explaining an example of doctor information in another modified example.
FIG. 46 is a diagram for explaining an example of patient information in another modified example.
FIG. 47 is a flowchart for explaining a processing example of outputting log information.
FIG. 48 is a diagram for explaining a neural network in another modified example.
FIG. 49 is a diagram for explaining an example of input and output of a neural network in another modified example.
FIG. 50 is a flowchart for explaining a processing example of learning processing in another modified example.
FIG. 51 is a flowchart for explaining a processing example of skill evaluation processing, which is inference processing, in another modified example.
FIG. 52 is a diagram for explaining an example of clustering results in an N-dimensional feature amount space in another modified example.
The present embodiment will be described below. Note that the embodiment described below does not unduly limit the content described in the claims. Moreover, not all of the configurations described in the present embodiment are necessarily essential constituent elements of the present disclosure.
1. System Configuration Example

Conventionally, methods of performing treatment on a living body using a flexible endoscope having a flexible portion are widely known. For example, endoscopic mucosal resection (EMR) and endoscopic submucosal dissection (ESD) are known as techniques for removing lesions. EMR is a method in which a metal loop called a snare is hooked onto a lesion and a high-frequency current is applied to cut it off. ESD is a method that, using dedicated treatment tools, can excise lesions over a wider area than EMR.
ESD includes multiple stages, such as marking, local injection, incision, and dissection, and may further include a hemostasis stage. A stage can also be called a step. Marking is the step of placing marks around the lesion to indicate the area to be excised. Local injection is the step of injecting a drug into the submucosal layer. Incision is the step of cutting the mucosa around the lesion with a knife so as to surround the marking. Dissection is the step of separating the lesion from the living body using a dedicated knife or snare. Retrieval is the step of collecting the excised lesion. Hemostasis is the step of stopping bleeding on the surface of the living body after resection. Hereinafter, each treatment included in ESD will be mainly described as an example of treatment according to the present embodiment, but the technique of the present embodiment can be extended to other treatments on a living body, such as EMR.
Incidentally, in a flexible endoscope, the portion from the distal end portion 11 of the insertion section 310b to the operation section 310a is flexible and long, as will be described later with reference to FIG. 3, for example. Therefore, the feel and force caused by the distal end portion 11 contacting the living body are hardly transmitted to the operator. The information available to the operator is mainly the image captured by the imaging system provided in the distal end portion 11. That is, what the operator can confirm is only the range and angle visible on the screen. Moreover, captured images often do not contain three-dimensional information.
Specifically, consider whether the angle of the endoscope with respect to the tissue to be treated is in a safe state, referring to FIGS. 1(A) to 1(F). FIGS. 1(A) to 1(F) are diagrams illustrating the positional relationship between the distal end portion 11 of the insertion section 310b and the tissue to be treated, and the captured images at those times. OB11 in FIGS. 1(A) to 1(F) is a lesion, and the treatment target tissue is OB11 or its surrounding tissue. In FIGS. 1(D) and 1(F), the line indicated by E11 shows the boundary between the submucosal layer and the muscle layer. Hereinafter, the angle of the endoscope with respect to the treatment target tissue is referred to as the approach angle. The treatment target tissue here represents, in a narrow sense, the lesion to be excised by the treatment, but it may also be normal tissue around the lesion. For example, when local injection is performed, the lesion is expected to be elevated, so defining the approach angle with the surrounding portion as the treatment target tissue may make the angle between the living body and the treatment instrument easier to understand. Specific processing for obtaining the approach angle will be described later with reference to FIGS. 13 and 14.
For example, FIG. 1(A) shows a state in which the treatment instrument 360 protrudes from the distal end portion 11 of the insertion section 310b and the approach to the treatment target tissue has started. FIG. 1(B) shows the captured image in the state shown in FIG. 1(A). In the example of FIG. 1(A), the treatment instrument 360 and the treatment target tissue are not close enough to be in contact. Therefore, compared with the states during treatment described later with reference to FIGS. 1(C) to 1(F), it is easy for the operator to estimate the relative relationship between the insertion section 310b and the treatment target tissue based on the captured image. The treatment instrument 360 will be described later with reference to FIG. 5. The insertion section 310b includes an objective optical system 311, which forms an image of the light reflected from the subject as a subject image.
FIG. 1(C) shows a state in which an incision is being performed, and FIG. 1(D) shows the captured image during the incision. As shown in FIG. 1(C), the distal end portion 11 has slipped under the tissue raised by the incision. Therefore, as shown in FIG. 1(D), the tissue covers the entire screen in the captured image, and it is difficult for the operator to estimate the approach angle from the captured image.
FIG. 1(E) shows a state in which the approach angle has changed greatly during the incision and has become a dangerous angle at which bleeding increases, and FIG. 1(F) shows the captured image at that time. As shown in FIG. 1(E), the distal end portion 11 remains slipped under the raised tissue. As shown in FIG. 1(F), the captured image has changed little from the state of FIG. 1(D), and it is not easy for the operator to notice from the captured image that the approach angle has changed. For example, if the amount of bleeding increases because the treatment instrument 360 has penetrated deeply into the tissue, the operator can grasp from the captured image that an unfavorable state has occurred, but it is difficult to predict this before the bleeding occurs.
In this way, the operator, an endoscopist, performs the procedure based on empirical rules, even though none of the image, feel, or force provides sufficient information in practice. In fact, it is known empirically that, even with a flexible endoscope, an expert doctor can control the insertion section 310b while constantly correcting changes in operability by mentally supplementing the actual position and posture of the endoscope with respect to the lesion. Specifically, it is known that changes in the approach angle during treatment are smaller for expert doctors than for trainee doctors. However, even expert doctors themselves cannot put into words when and how to operate so that the approach angle can be kept small. In other words, the know-how for controlling the approach angle cannot be taught to trainee doctors in objective terms and cannot easily be passed on. Put differently, the transition of the approach angle during treatment has conventionally been "tacit knowledge." An expert doctor here is a doctor with high treatment skill, and a trainee doctor is a doctor whose treatment skill is lower than that of an expert doctor. Treatment skill is evaluated as high or low in consideration not only of information on the treatment actions but also of information after the course of the treatment, such as a low incidence of complications and a short postoperative hospital stay.
Therefore, there is a demand for visualizing and quantifying the operator's skill. If objective skill evaluation becomes possible, it will become possible, for example, to facilitate the improvement of operators' skills and to optimize the allocation of human resources in hospitals. For example, in Patent Literature 1, data on a surgical task performed by a user is collected, and the user's clinical skill can be quantified by comparing the collected data with other data on the same surgical task.
However, it is difficult to judge the quality of the treatment as a whole from individual parameters alone. For example, whether a smaller or larger approach angle is preferable depends on the case and cannot be stated in general. Also, for example, it may become possible to judge the quality of a treatment by combining approach angle information with the energization history of a high-frequency device. For example, an expert doctor is evaluated as giving due consideration to safety by keeping the energization time short and keeping changes in the approach angle small during the energization period. On the other hand, in certain cases, a large change in the approach angle or a long energization time does not necessarily lead to a low evaluation. Such cases include, for example, cases involving lesions with extremely complicated shapes, and cases involving lesion sites where it is difficult to keep the approach angle small, such as the fornix of the stomach. Thus, judging the quality of a treatment requires finding, for each case, a combination of a plurality of parameters and the optimum range of that combination. Conventional methods such as Patent Literature 1 do not take such circumstances into account and are not sufficient for evaluating the operator's skill. A high-frequency device is a device used to excise or cauterize target tissue by applying a high-frequency current, and includes high-frequency snares and high-frequency knives. Energization means that a high-frequency current is supplied to the high-frequency device from a power supply device, and the energization state can be determined based on a control signal of the power supply device. Hereinafter, energization of the high-frequency device may be referred to simply as energization.
FIG. 2 is a diagram showing the configuration of the processing system 100 according to the present embodiment. The processing system 100 includes an acquisition unit 110, a processing unit 120, and an output processing unit 130. However, the processing system 100 is not limited to the configuration of FIG. 2, and various modifications are possible, such as omitting some of these components or adding other components.
The acquisition unit 110 acquires, from the endoscope system 300 described later with reference to FIGS. 3 and 4, approach angle information of the insertion section of the endoscope and energization history information regarding the energization history of the treatment instrument 360. The acquisition unit 110 can also be regarded as, for example, a communication interface that acquires the energization history information of the treatment instrument 360. The energization history is, for example, the timing of energization, but may also be the energization period, the magnitude of the energization output, or any combination of these. A specific acquisition method will be described later. The acquisition unit 110 can be realized by, for example, a communication chip for acquiring information, and a processor or control circuit that controls the communication chip.
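As a minimal illustration of what the energization history information described above might contain, the following sketch models each energization as a record of timing, period, and output magnitude, and derives a simple aggregate from the history. The type and field names are hypothetical and are not specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class EnergizationEvent:
    """One energization of the high-frequency device (hypothetical record)."""
    start_time: float    # energization timing, in seconds from procedure start
    duration: float      # energization period, in seconds
    output_watts: float  # magnitude of the energization output

def total_energized_time(history: list[EnergizationEvent]) -> float:
    """Total time the high-frequency device was energized."""
    return sum(event.duration for event in history)

# Illustrative history derived, e.g., from the power supply device's
# control signal.
history = [
    EnergizationEvent(start_time=12.0, duration=1.5, output_watts=40.0),
    EnergizationEvent(start_time=20.4, duration=0.8, output_watts=40.0),
]
print(round(total_energized_time(history), 3))  # prints 2.3
```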
The processing unit 120 evaluates the skill of the user who operated the endoscope system 300 based on the approach angle information and the energization history information. The processing executed by the processing unit 120 is classification processing such as clustering. Details of the skill evaluation will be described later.
When processing using a trained model is performed, the processing system 100 includes a storage unit (not shown) that stores a trained model generated by machine learning. The storage unit here serves as a work area for the processing unit 120 and the like, and its function can be realized by a semiconductor memory, a register, a magnetic storage device, or the like. The processing unit 120 reads the trained model from the storage unit and operates according to instructions from the trained model, thereby performing inference processing that outputs the user's skill evaluation result.
The processing unit 120 is configured by the following hardware. The hardware can include at least one of a circuit that processes digital signals and a circuit that processes analog signals. For example, the hardware can be configured by one or more circuit devices mounted on a circuit board, or by one or more circuit elements. The one or more circuit devices are, for example, ICs (Integrated Circuits) or FPGAs (Field-Programmable Gate Arrays). The one or more circuit elements are, for example, resistors and capacitors.
The processing unit 120 may also be realized by the following processor. The processing system 100 includes a memory that stores information and a processor that operates based on the information stored in the memory. The memory here may be the storage unit described above or a different memory. The information includes, for example, programs and various data. The processor includes hardware. Various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a DSP (Digital Signal Processor) can be used. The memory may be a semiconductor memory such as an SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory), a register, a magnetic storage device such as an HDD (Hard Disk Drive), or an optical storage device such as an optical disc device. For example, the memory stores computer-readable instructions, and the functions of the processing unit 120 are realized as processing when the instructions are executed by the processor. The instructions here may be instructions of an instruction set constituting a program, or instructions that direct the operation of the hardware circuits of the processor. Furthermore, all or part of the processing unit 120 may be realized by cloud computing, and each process described later may be performed on cloud computing.
 The processing unit 120 of this embodiment may also be implemented as a module of a program that runs on a processor. For example, the processing unit 120 is implemented as a processing module that performs skill evaluation based on approach angle information and energization history information.
 The program that implements the processing performed by the processing unit 120 of this embodiment can be stored, for example, in an information storage device, which is a computer-readable medium. The information storage device can be implemented by, for example, an optical disc, a memory card, an HDD, or a semiconductor memory. The semiconductor memory is, for example, a ROM. The processing unit 120 performs the various processes of this embodiment based on the program stored in the information storage device. That is, the information storage device stores a program for causing a computer to function as the processing unit 120. The computer is a device that includes an input device, a processing unit, a storage unit, and an output unit. Specifically, the program according to this embodiment causes a computer to execute each step described later with reference to FIG. 18 and other figures.
 The output processing unit 130 performs processing for outputting skill evaluation information, which is the result of the skill evaluation by the processing unit 120. For example, the processing system 100 may include a display unit (not shown), and the output processing unit 130 may perform processing for displaying the skill evaluation information on that display unit. Alternatively, as will be described later with reference to FIG. 6, the processing system 100 may be connected to the endoscope system 300 via a network, and the output processing unit 130 may be a communication device or a communication chip that transmits the skill evaluation information via the network. Note that the device to which the skill evaluation information is output is not limited to the endoscope system 300; it may be a PC (Personal Computer) capable of communicating with the processing system 100, or a portable terminal device such as a smartphone or tablet terminal.
 As described above, the processing system 100 of this embodiment includes the acquisition unit 110, the processing unit 120, and the output processing unit 130. The acquisition unit 110 acquires approach angle information of the insertion section of the endoscope and energization history information related to the energization history of the treatment tool 360. The processing unit 120 performs skill evaluation of the user operating the endoscope based on the approach angle information and the energization history information. The output processing unit 130 outputs skill evaluation information, which is the result of the skill evaluation.
 The processing performed by the processing system 100 of this embodiment may also be implemented as an information processing method. The information processing method acquires approach angle information of the insertion section of the endoscope and energization history information related to the energization history of the treatment tool 360, performs skill evaluation of the user operating the endoscope based on the approach angle information and the energization history information, and outputs skill evaluation information that is the result of the skill evaluation.
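 The acquire–evaluate–output flow described above can be sketched as follows. This is only a minimal illustration under stated assumptions, not the disclosed implementation: the class and method names (`ProcessingSystem`, `evaluate_skill`), the log formats, and the angle-variation rule are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SkillEvaluation:
    rank: str      # "A" (expert-equivalent) through "D"
    details: str

class ProcessingSystem:
    """Minimal sketch: acquisition unit 110 -> processing unit 120 -> output processing unit 130."""

    def acquire(self, angle_log, energization_log):
        # Acquisition unit 110: approach angle info and energization history info
        self.angle_log = angle_log                # [(time_s, angle_deg), ...]
        self.energization_log = energization_log  # [(t_start, t_end), ...]

    def evaluate_skill(self) -> SkillEvaluation:
        # Processing unit 120: placeholder rule based on total approach-angle
        # variation (a real system could instead apply a trained model).
        variation = sum(abs(a2 - a1) for (_, a1), (_, a2)
                        in zip(self.angle_log, self.angle_log[1:]))
        rank = "A" if variation < 10 else "C"
        return SkillEvaluation(rank, f"total angle variation: {variation:.1f} deg")

    def output(self, evaluation: SkillEvaluation) -> str:
        # Output processing unit 130: format for a display or network client
        return f"skill rank {evaluation.rank} ({evaluation.details})"

system = ProcessingSystem()
system.acquire([(0.0, 30.0), (1.0, 31.0), (2.0, 29.5)], [(0.5, 1.5)])
result = system.output(system.evaluate_skill())
```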
 According to the method of this embodiment, the user's skill can be evaluated based on both the approach angle information and the energization history information, so the skill of the operator can be evaluated with high accuracy.
 FIG. 3 is a diagram showing a configuration example of the endoscope system 300. The endoscope system 300 includes a scope section 310, a processing device 330, a display section 340, and a light source device 350. An operator performs an endoscopic examination of a patient using the endoscope system 300. However, the configuration of the endoscope system 300 is not limited to that shown in FIG. 3, and various modifications are possible, such as omitting some components or adding other components. Note that FIG. 3 omits illustration of the suction device 370, the air/water supply device 380, and the like, which will be described later with reference to FIG. 4.
 FIG. 3 shows an example in which the processing device 330 is a single device connected to the scope section 310 via the connector 310d, but the configuration is not limited to this. For example, part or all of the configuration of the processing device 330 may be constructed by another information processing device, such as a PC or a server system, that can be connected via a network. For example, the processing device 330 may be realized by cloud computing.
 The scope section 310 has an operation section 310a, a flexible insertion section 310b, and a universal cable 310c including signal lines and the like. The scope section 310 is a tubular insertion device whose tubular insertion section 310b is inserted into a body cavity. A connector 310d is provided at the tip of the universal cable 310c. The scope section 310 is detachably connected to the light source device 350 and the processing device 330 by the connector 310d. Furthermore, as will be described later with reference to FIG. 4, a light guide 315 is inserted through the universal cable 310c, and the scope section 310 emits illumination light from the light source device 350 through the light guide 315 from the distal end of the insertion section 310b.
 For example, the insertion section 310b has, from its distal end toward its proximal end, a distal end portion 11, a bendable bending portion 12, and a flexible portion 13. The insertion section 310b is inserted into the subject. The distal end portion 11 of the insertion section 310b is the distal end of the scope section 310 and is a rigid distal end portion. The objective optical system 311 and the imaging element 312, which will be described later, are provided in the distal end portion 11, for example.
 The bending portion 12 can be bent in a desired direction in response to operation of a bending operation member provided on the operation section 310a. The bending operation member includes, for example, a left/right bending operation knob and an up/down bending operation knob. In addition to the bending operation member, the operation section 310a is provided with various operation buttons such as a release button and an air/water supply button.
 The processing device 330 is a video processor that performs predetermined image processing on the received imaging signal and generates a captured image. A video signal of the generated captured image is output from the processing device 330 to the display section 340, and the captured image is displayed on the display section 340 in real time. The configuration of the processing device 330 will be described later. The display section 340 is, for example, a liquid crystal display or an EL (Electro-Luminescence) display.
 The light source device 350 is a light source device capable of emitting white light for a normal observation mode. Note that the light source device 350 may be capable of selectively emitting white light for the normal observation mode and special light such as narrow-band light.
 FIG. 4 is a diagram illustrating the configuration of each part of the endoscope system 300. Note that in FIG. 4, part of the configuration of the scope section 310 is omitted or simplified.
 The light source device 350 includes a light source 352 that emits illumination light. The light source 352 may be a xenon light source, an LED (light emitting diode), or a laser light source. The light source 352 may also be another type of light source, and the light emission method is not limited.
 The insertion section 310b includes the above-described objective optical system 311, the imaging element 312, an illumination lens 314, the light guide 315, a suction pipe 317, and an air/water supply pipe 319. The light guide 315 guides the illumination light from the light source 352 to the distal end of the insertion section 310b. The illumination lens 314 irradiates the subject with the illumination light guided by the light guide 315.
 The imaging element 312 receives light from the subject via the objective optical system 311. The imaging element 312 may be a monochrome sensor or an element provided with color filters. The color filters may be well-known Bayer filters, complementary color filters, or other filters. Complementary color filters are filters that include cyan, magenta, and yellow color filters.
 In a predetermined case, the suction pipe 317 activates the suction device 370 to suck liquid or the like. Predetermined cases include, for example, a case where gastric juice or the like interferes with diagnosis, or a case where the supplied water described later is collected when treatment is completed, but other cases are also possible. The suction device 370 is realized by including a suction pump, a collection tank, and the like (not shown). The suction device 370 is also connected to the control unit 332 described later, and when a suction button (not shown) is pressed, for example, liquid or the like is collected into the above-described collection tank through the opening 316 and the suction pipe 317. Note that, in this embodiment, the opening 316 also serves as the opening from which the treatment tool 360, described later, protrudes. In FIG. 4, illustration of the treatment tool 360 and the tube that houses the treatment tool 360 is omitted.
 In a specific case, the air/water supply pipe 319 activates the air/water supply device 380 to supply air or water. Specific cases include, for example, a case where it is desired to wash away residue near a lesion or a case where it is desired to expand the area around a lesion from the inside, but other cases are also possible. The air/water supply device 380 is realized by including a pump, a gas cylinder, a water supply tank, and the like (not shown). The air/water supply device 380 is also connected to the control unit 332 described later, and when an air supply button or a water supply button (not shown) is pressed, for example, gas or liquid is ejected from the nozzle 318 through the air/water supply pipe 319. Although FIG. 4 schematically shows the air/water supply pipe 319 as a single pipe, a pipe for gas and a pipe for liquid may be arranged side by side and joined just before the nozzle 318.
 The processing device 330 performs image processing and controls the entire system. The processing device 330 includes a preprocessing unit 331, a control unit 332, a storage unit 333, a detection processing unit 335, and a postprocessing unit 336.
 The preprocessing unit 331 performs A/D conversion, which converts the analog signals sequentially output from the imaging element 312 into digital images, and various correction processes on the image data after A/D conversion. Note that an A/D conversion circuit may be provided in the imaging element 312, in which case the A/D conversion in the preprocessing unit 331 may be omitted. The correction processes here include, for example, color matrix correction, structure enhancement, noise reduction, and AGC (automatic gain control). The preprocessing unit 331 may also perform other correction processes such as white balance processing. The preprocessing unit 331 outputs the processed image to the detection processing unit 335 as an input image, and also outputs the processed image to the postprocessing unit 336 as a display image.
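 As one illustration of the correction pipeline described above (not the disclosed implementation), a preprocessing step could chain gain control and a simple noise-reduction filter. The function names, the mean-based gain rule, and the 3-tap averaging filter are assumptions made only for this sketch.

```python
def apply_agc(pixels, target_mean=128.0):
    """Automatic gain control: scale pixels so their mean approaches target_mean."""
    mean = sum(pixels) / len(pixels)
    gain = target_mean / mean if mean > 0 else 1.0
    return [min(255.0, p * gain) for p in pixels]

def reduce_noise(pixels):
    """Simple 3-tap moving-average noise reduction (edge pixels passed through)."""
    out = list(pixels)
    for i in range(1, len(pixels) - 1):
        out[i] = (pixels[i - 1] + pixels[i] + pixels[i + 1]) / 3.0
    return out

def preprocess(raw_line):
    # A/D conversion is assumed already done; chain the correction steps.
    return reduce_noise(apply_agc(raw_line))

line = preprocess([60.0, 64.0, 62.0, 70.0])
```

A real pipeline would operate on full 2-D frames and add color matrix correction and structure enhancement as further stages in the same chain.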
 The detection processing unit 335 performs detection processing for detecting a region of interest, such as a lesion, from the input image. However, in this embodiment the detection processing for a region of interest is not essential, and the detection processing unit 335 can be omitted.
 The postprocessing unit 336 performs postprocessing based on the outputs of the preprocessing unit 331 and the detection processing unit 335, and outputs the postprocessed image to the display section 340. For example, the postprocessing unit 336 may add the detection result from the detection processing unit 335 to the display image and display the resulting image. The user, who is the operator, treats the lesion area in the living body while viewing the image displayed on the display section 340. The treatment here is, for example, a treatment for resecting a lesion, such as the EMR or ESD described above.
 The control unit 332 is connected to the imaging element 312, the preprocessing unit 331, the detection processing unit 335, the postprocessing unit 336, and the light source 352, and controls each of them.
 For example, when the processing system 100 is included in the processing device 330, the acquisition unit 110, the processing unit 120, and the output processing unit 130 are added to the configuration of FIG. 4. The acquisition unit 110 acquires input data and the energization history information described later based on, for example, control information from the control unit 332. The acquisition unit 110 also acquires the approach angle information described later based on, for example, sensor information from a motion sensor provided in the insertion section 310b. The processing unit 120 performs skill evaluation using the approach angle information and the energization history information. The output processing unit 130 outputs the skill evaluation information to the display section 340 or to an external device connected to the endoscope system 300.
 FIG. 5 is a diagram illustrating the configuration of the distal end portion 11 of the insertion section 310b. As shown in FIG. 5, the cross-sectional shape of the distal end portion 11 is substantially circular, and the objective optical system 311 and the illumination lens 314 are provided as described above with reference to FIG. 4. The insertion section 310b is also provided with a channel, a cavity running from the operation section 310a to the opening 316 in the distal end portion 11. The opening 316 here is an opening for the treatment tool 360, known as a forceps opening.
 As shown in FIG. 5, the operator inserts the treatment tool 360 through the channel and causes the distal end of the treatment tool 360 to protrude from the opening 316 to treat the target tissue. Although FIG. 5 illustrates a configuration of the distal end portion 11 with two illumination lenses 314, one objective optical system 311, one opening 316, and one nozzle 318, various modifications of the specific configuration are possible.
 The treatment tool 360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife. High-frequency knives include needle knives, IT knives, hook knives, and the like. For example, a needle knife is used for marking in ESD, and an IT knife is used for incision. A high-frequency snare or high-frequency knife is used for dissection. The treatment tool 360 may also include other instruments such as injection needles, forceps, and clips. An injection needle is used for local injection in ESD, and forceps or clips are used for hemostasis.
 The processing system 100 may also be provided separately from the endoscope system 300. FIG. 6 is a diagram showing a configuration example of a system including the processing system 100. As shown in FIG. 6, the system includes a plurality of endoscope systems 300 and the processing system 100.
 For example, the processing system 100 is a server system connected to each of the plurality of endoscope systems 300 via a network. The server system here may be a server provided on a private network such as an intranet, or a server provided on a public communication network such as the Internet. The processing system 100 may be configured by a single server device or may include a plurality of server devices. For example, the processing system 100 may include a database server that collects approach angle information and energization history information from the plurality of endoscope systems 300, and a processing server that performs skill evaluation. The database server may also collect other information, such as doctor information and patient information, as described later.
 The processing system 100 may also perform skill evaluation based on machine learning, as described later. For example, the processing system 100 may include a learning server that generates a trained model by performing machine learning using the data collected by the database server as training data. The processing server performs skill evaluation based on the trained model generated by the learning server.
 As shown in FIG. 6, when the processing system 100 can be connected to a plurality of endoscope systems 300, data can be collected efficiently. For example, since it becomes easy to increase the amount of training data used for machine learning, the accuracy of skill evaluation can be made higher.
2. Skill Evaluation Information
 Next, the skill evaluation information will be described in detail with reference to FIGS. 7 to 12. The operator performs a predetermined treatment, and after a predetermined period has elapsed since the treatment, the skill evaluation sheet 400 shown in FIG. 7 is output as skill evaluation information to a predetermined display unit. In this way, output of the skill evaluation information by the output processing unit 130 is realized. Note that the predetermined period is, for example, the period until the patient who underwent the treatment is discharged from the hospital. The predetermined display unit is, for example, the display section 340 described above, but may be a display unit of an external device connected to the endoscope system 300. The predetermined treatment is, for example, the ESD described above, which includes a plurality of stages, but it may be another treatment including a plurality of stages. In other words, treatment with the treatment tool 360 includes a plurality of stages.
 The skill evaluation sheet 400 includes, for example, a doctor information icon 410 and a case sheet 420. The case sheet 420 displays, for example, the result of a comprehensive evaluation and, as its breakdown, the results of skill evaluation for each stage of the treatment. In other words, the output processing unit 130 outputs skill evaluation information for each of the plurality of stages. This allows skills to be evaluated stage by stage, so the accuracy of the skill evaluation can be made higher. For example, skill evaluation is performed for the marking, local injection, incision, and dissection stages, but as shown in FIG. 7, skill evaluation may also be performed for five stages that additionally include hemostasis. The number of treatment stages to be evaluated is not limited to five; other stages may be added, or the number may be reduced as long as it is two or more. In other words, the plurality of stages includes at least two of a marking stage, a local injection stage, an incision stage, and a dissection stage. This allows treatments such as ESD to be classified more finely for skill evaluation. More specifically, the evaluation result for each stage is displayed by a marking evaluation icon 440, a local injection evaluation icon 442, an incision evaluation icon 444, a dissection evaluation icon 446, and a hemostasis evaluation icon 448, each of which is displayed as one of A, B, C, and D. Furthermore, these evaluation results may be displayed in a radar chart format. In this way, the relative skill of the operator can be displayed two-dimensionally, so the skill evaluation can be grasped visually and easily. Here, A is the highest rank, indicating performance equivalent to that of an expert, and D is the lowest rank, but the level of evaluation can be expressed in various other ways, such as a numerical score. The display format of the skill evaluation is not limited to a radar chart; it may be realized as a bar graph, a line graph, or the like, and various modifications are possible. Furthermore, the display format of the skill evaluation may be made changeable.
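 One way to realize the per-stage A to D ranking described above could be to map a numeric per-stage score to a rank letter. This is only an illustrative sketch: the thresholds, the 0-100 score scale, and the names `rank_stage` and `build_case_sheet` are assumptions not taken from the disclosure.

```python
STAGES = ["marking", "local_injection", "incision", "dissection", "hemostasis"]

def rank_stage(score: float) -> str:
    """Map a 0-100 per-stage score to a rank from A (expert-equivalent) to D."""
    thresholds = [(90.0, "A"), (75.0, "B"), (60.0, "C")]
    for limit, rank in thresholds:
        if score >= limit:
            return rank
    return "D"

def build_case_sheet(scores: dict) -> dict:
    """Produce the per-stage ranks shown on the case sheet (420)."""
    return {stage: rank_stage(scores.get(stage, 0.0)) for stage in STAGES}

sheet = build_case_sheet({"marking": 95.0, "local_injection": 72.0,
                          "incision": 61.0, "dissection": 50.0,
                          "hemostasis": 88.0})
```

The same per-stage dictionary could then feed a radar chart, a bar graph, or any other display format the sheet supports.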
 Advice information may also be output for each stage. The advice information is, specifically, information giving advice regarding at least one of the approach angle information and the energization history information. In other words, the output processing unit 130 may output advice information regarding at least one of the approach angle information and the energization history information. For example, as shown in FIG. 8, selecting the marking evaluation icon 440 on the display screen displays a marking advice display 470, selecting the local injection evaluation icon 442 displays a local injection advice display 472, and selecting the incision evaluation icon 444 displays an incision advice display 474, showing advice regarding the approach angle information and energization history information. Similarly, a dissection advice display and a hemostasis advice display are also shown, but they are omitted in FIG. 8. The method of displaying the advice is not limited to that shown in FIG. 8; various modifications are possible, such as displaying it on a separate screen. Although not illustrated, selecting the comprehensive evaluation icon 430 may display advice regarding the comprehensive evaluation. This enables highly accurate skill evaluation and provides the operator with specific information.
 For example, as shown in the local injection advice display 472 in FIG. 8, the advice information compares the data of the operator being evaluated with that of an expert regarding the approach angle information and the energization history information, and displays advice that includes the resulting difference information. In other words, the output processing unit 130 displays, as advice information, the difference from expert data regarding at least one of the approach angle information and the energization history information. This enables highly accurate skill evaluation and provides the operator with more specific information, namely the difference from expert data. The advice information may also include other information. For example, as shown in the marking advice display 470, it may confirm that an evaluation result equivalent to that of an expert was obtained in the marking stage. As shown in the incision advice display 474, the reason for the evaluation result may be displayed, or advice prompting the operator to refer to the log information, described later, as the basis for the evaluation may be displayed.
 In FIGS. 7 and 8, skill evaluation by the above-described ranks A to D is performed for each stage of the treatment, but the period of each stage may be further subdivided for skill evaluation. For example, the change in the approach angle within the period during which energization is performed in each treatment may be the subject of skill evaluation. In other words, the processing unit 120 may perform skill evaluation based on the approach angle information during the energization period of the treatment tool 360. It is empirically known that treatments performed by expert doctors and those performed by trainee doctors differ in the amount of change in the approach angle during the energization period, so the approach angle information during the energization period is highly important. This allows skill to be evaluated based on highly important information, so the accuracy of the skill evaluation can be made higher. As shown in the incision advice display 474 in FIG. 8, advice regarding the approach angle information during the energization period may also be displayed.
Also, for example, the change in the approach angle may be made the subject of skill evaluation limited to the period before energization is performed in each treatment. In other words, the processing unit 120 may perform skill evaluation based on the approach angle information in the period preceding the energization period of the treatment instrument 360. The time an expert physician requires for an operation is shorter than that of a trainee physician, and it is known that the time required for position adjustment before treatment in particular differs markedly between experts and trainees. In other words, the quality of the preparation before treatment strongly influences the operator's evaluation. The behavior of the approach angle before energization therefore yields information on the quality of the pre-treatment preparation, and since skill evaluation can be performed based on that information, the accuracy of the skill evaluation can be further increased. Also, as shown in the incision advice display 474 in FIG. 8, advice regarding the approach angle information in the period before the energization period may be displayed.
Next, the log information of the approach angle information will be described. The skill evaluation information of the present embodiment includes log information of the approach angle information. Specifically, in FIG. 7, each individual case sheet 420 includes a log data icon 450. When the log data icon 450 is selected, the log information of the approach angle information shown in FIG. 9 is displayed. In this way, output of the log information of the approach angle information by the output processing unit 130 is realized. In other words, the output processing unit 130 outputs the log information of the approach angle information. This enables highly accurate skill evaluation while also providing the operator with more concrete information, namely the log information of the approach angle information.
FIG. 9 is a diagram illustrating an example of the log information of the approach angle information. The horizontal axis of FIG. 9 represents time, and the vertical axis represents the relative change in the approach angle. In FIG. 9, t1 is the timing at which treatment is first started, for example the timing at which protrusion of the treatment instrument 360 or the like is first detected after insertion of the insertion section 310b. Energization is not necessarily required at this point, but may be included. In FIG. 9, it is assumed that the treatment started at t1 does not include energization, while the treatments started at t2 and t3 do include energization; the same applies to FIG. 10 described later. The processing unit 120 obtains the approach angle at t1 as the reference angle. At each timing after t1, the approach angle is calculated, and the relative change is calculated and output. Specifically, as shown in FIG. 9, the relative change at t1 is set to 0 degrees, and thereafter the relative change is acquired in time series with the approach angle at t1 as the reference. t2 represents the timing at which treatment is started again, and by resetting the reference angle, the relative angle at t2 is calibrated to 0 degrees. Like t2, t3 is also a timing at which treatment is started again, and the relative angle at t3 is calibrated to 0 degrees. Thus, the approach angle information here is information on the relative change of the approach angle with respect to a reference angle, the reference angle being the approach angle at the timing corresponding to the start of treatment. In this way, the approach angle information can be obtained in the form of the amount of change in the approach angle during the period in which the treatment was performed, so the operator's skill can be evaluated with higher accuracy. Although the approach angle is calibrated to 0 degrees each time a treatment starts in FIG. 9 and in FIG. 10 described later, this is not limiting, and the approach angle may instead be measured as an absolute value. How the approach angle is obtained, and a processing example for creating the log information, will be described later with reference to FIGS. 13 and 14.
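The recalibration behavior described above, in which the reference angle is reset so that the relative change returns to 0 degrees at each treatment-start timing such as t1, t2, and t3, can be sketched as follows. This is a minimal illustration, not the embodiment's implementation; the function name and data layout are assumptions.

```python
from typing import List, Tuple


def relative_angle_log(samples: List[Tuple[float, float]],
                       treatment_starts: List[float]) -> List[Tuple[float, float]]:
    """Convert absolute approach angles into relative changes.

    samples: (time, absolute_angle_deg) pairs in time order.
    treatment_starts: timings (e.g. t1, t2, t3) at which a treatment
    begins; at each one the reference angle is reset, so the logged
    relative change is calibrated back to 0 degrees.
    """
    log = []
    starts = sorted(treatment_starts)
    reference = None
    idx = 0
    for t, angle in samples:
        # Re-calibrate whenever a treatment-start timing is passed.
        while idx < len(starts) and t >= starts[idx]:
            reference = angle
            idx += 1
        if reference is None:
            continue  # before the first treatment: nothing to log
        log.append((t, angle - reference))
    return log
```

With samples measured at t = 0..3 and treatments starting at t = 0 and t = 2, the logged value drops back to 0 at t = 2, matching the behavior shown in FIG. 9.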
Also, for example, log information in which an allowable approach angle range is added to the relative change of the approach angle described above may be displayed. Furthermore, as shown in FIG. 10, alert information may be displayed when the relative change of the approach angle falls outside the allowable approach angle range. For example, the processing unit 120 can realize the log display shown in FIG. 10 by adding processing that acquires the allowable approach angle range and outputs alert information when the angle representing the relative change deviates from that range. The displayed alert information may be text information, or image information such as an icon. The vertical and horizontal axes in FIG. 10 are the same as in FIG. 9, and the timings t1 to t3 at which treatment is determined to have started are also the same as in FIG. 9. In FIG. 10, the range greater than θ1 and less than θ2 is the allowable approach angle range, where θ1 &lt; 0 degrees &lt; θ2. The processing unit 120 determines that the relative change is outside the approach angle range when the relative change of the approach angle is θ1 or less, or θ2 or more. In this way, the log information can be presented in more detail.
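The range check described for FIG. 10, which flags the periods (such as t4 to t5 and t6 to t7) during which the relative change is θ1 or less or θ2 or more, could be implemented along the following lines. This is an illustrative sketch; the function name is an assumption.

```python
def out_of_range_intervals(log, theta1, theta2):
    """Return (start, end) intervals during which the relative change
    left the allowable range (theta1, theta2), i.e. was <= theta1 or
    >= theta2, corresponding to the alert periods in FIG. 10.

    log: (time, relative_change_deg) pairs in time order.
    """
    intervals = []
    start = None
    for t, rel in log:
        outside = rel <= theta1 or rel >= theta2
        if outside and start is None:
            start = t            # alert begins
        elif not outside and start is not None:
            intervals.append((start, t))  # alert ends
            start = None
    if start is not None:
        intervals.append((start, log[-1][0]))  # still outside at the end
    return intervals
```

Each returned interval is a period over which the output processing unit could keep the alert information displayed.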
Note that, in order to perform such notification, the approach angle range must be set in advance; that is, it must be known how large an approach angle range is permissible for the treatment to be performed safely. For example, if data exists from similar cases treated by expert physicians in the past, the allowable approach angle range can be set based on that data.
The log information of the approach angle information may also be displayed in real time on the display unit 340 or the like during treatment of the patient. Furthermore, the alert information described above may be reported in real time at the timing when the relative change moves from inside the approach angle range to outside it. The notification here may be made by the display unit 340 or the like, or by sound, vibration, or the like. Furthermore, the output processing unit 130 may, for example, continue outputting the alert information between t4 and t5 and between t6 and t7 in FIG. 10. In this way, the operator performing the treatment can be notified of an abnormality immediately, so trouble can be prevented before it occurs.
Returning to FIG. 7, the description of the skill evaluation sheet 400 continues. The skill evaluation information of the present embodiment may further include doctor information. For example, by selecting the doctor information icon 410 on the skill evaluation sheet 400, the details of the doctor information are displayed. Specifically, these include physical characteristics, surgical record, and information on individual cases. More specifically, the physical characteristics include height, weight, hand size, and the like. The surgical record includes information such as the number of cases experienced and the cumulative operating time. The case information includes the time required for the operation, the difficulty of the operation, the number of times guidance was received, the content of that guidance, and the like. The doctor information may also include information specifying the physician's school. As described above, the quality of endoscopic treatment depends on the physician's experience and tacit operational knowledge, so different instructors are expected to teach substantially different content, that is, different techniques. Therefore, even for the same type of case, the technique performed differs depending on the school, and taking the difference in school into account makes it possible to maintain the accuracy of the skill evaluation. The doctor information may also include name, sex, date of birth, registration information, and date of registration, and this information may be linked to a predetermined database. The doctor information may further include achievements such as academic conference activities. It is not necessary to display all of the information listed above; for example, the information may be stored in a predetermined storage area in a list format as shown in FIG. 11, and only the necessary items may be selected and displayed on the skill evaluation sheet 400. By including the doctor information in the skill evaluation information in this way, the accuracy of the skill evaluation can be increased.
The skill evaluation information of the present embodiment may further include patient information. For example, by selecting the patient information icon 460 on the skill evaluation sheet 400, the details of the patient information are displayed. The patient information includes information on the patient, information on the lesion, and information on the postoperative condition. The information on the patient includes, for example, name, age, and sex, and may also include information on whether an anticoagulant is used. Because the use of anticoagulants makes bleeding more likely, this information is useful in judging the difficulty of the operation. The information on the patient may also include treatment history. This is likewise useful information because, for example, a site that has previously undergone ESD treatment becomes difficult to dissect again due to fibrosis or the like. The lesion information includes site information, tissue characterization information, and bleeding information, and may further include subdivided information as shown in FIG. 12. The information on the postoperative condition includes information on the amount of bleeding, the incidence of complications, and the number of days of hospitalization.
As a specific example using the patient information, suppose there are multiple operators for whom the marking evaluation icon 440, local injection evaluation icon 442, incision evaluation icon 444, dissection evaluation icon 446, and hemostasis evaluation icon 448 on the skill evaluation sheet 400 of FIG. 7 all display "A". Suppose further that a comparison of the patient information of these operators reveals large differences in the postoperative complication rate and the number of days of hospitalization. In this case, the skill evaluation may be refined by displaying "A+" in the comprehensive evaluation icon 430 of the skill evaluation sheet 400 of a given operator and "A-" in the comprehensive evaluation icon 430 of the skill evaluation sheet 400 of a particular operator. Here, the given operator is one whose postoperative complication rate is lower than average and whose hospitalization period is shorter than average, while the particular operator is one whose postoperative complication rate is higher than average and whose hospitalization period is longer than average. A similarly refined evaluation may also be applied to operators whose comprehensive evaluation is B, C, and so on. By including the patient information in the skill evaluation information in this way, the accuracy of the skill evaluation can be increased.
3. Various Processing
 Next, processing related to the technique of the present embodiment will be described. First, the processing for obtaining the approach angle will be described with reference to FIGS. 13(A), 13(B), and 14. FIG. 13(A) is a diagram illustrating an example of the approach angle information in the present embodiment. As described above, the approach angle of the present embodiment is information representing the relative angle between the tissue to be treated and the distal end portion 11 of the insertion section 310b; more specifically, it is the angle θ between the straight line L11, which is the axis of the insertion section 310b, and the plane P11 on the tissue to be treated. That is, it is one of the interior angles of the triangle defined by the perpendicular dropped from the distal end portion 11 to the plane P11, the straight line L11, and the plane P11. The axis of the insertion section 310b is an axis representing the longitudinal direction of the insertion section 310b, for example a straight line passing through the center of the substantially cylindrical insertion section 310b, or a straight line parallel to it. In the present embodiment, "parallel" includes substantially parallel. The approach angle can also be described as information indicating how much the axis of the insertion section 310b lies flat against, or stands up from, the plane P11 representing the tissue to be treated. The approach angle is, for example, a value between 0 and 90 degrees. Under this definition, the magnitude of the approach angle does not change even if the axis rotates about the normal to the plane P11; that is, the approach angle need not include information specifying the direction of approach. However, the approach angle is not limited to this definition, and the range of values is not limited to 0 to 90 degrees.
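As an illustration of the geometry just described, the angle θ between the axis line L11 and the plane P11 can be computed from a direction vector of L11 and a normal vector of P11: the line-plane angle is the complement of the line-normal angle, which reduces to an arcsine of the normalized dot product. The function below is a sketch under that assumption, not the embodiment's implementation.

```python
import math


def approach_angle_deg(axis_dir, plane_normal):
    """Angle between the insertion-section axis L11 and plane P11.

    axis_dir: direction vector of L11; plane_normal: normal of P11.
    Taking the absolute value of the dot product keeps the result in
    [0, 90] degrees, so the angle is the same regardless of which way
    the axis or the normal points.
    """
    dot = sum(a * n for a, n in zip(axis_dir, plane_normal))
    na = math.sqrt(sum(a * a for a in axis_dir))
    nn = math.sqrt(sum(n * n for n in plane_normal))
    return math.degrees(math.asin(abs(dot) / (na * nn)))
```

An axis lying in the plane gives 0 degrees, an axis along the plane's normal gives 90 degrees, and rotating the axis about the normal leaves the result unchanged, consistent with the definition above.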
Various configurations are conceivable for obtaining the approach angle. For example, the endoscope system 300 can include a motion sensor provided at the distal end portion 11 of the insertion section 310b. The motion sensor is, for example, a six-axis sensor including a three-axis acceleration sensor and a three-axis angular velocity sensor. For example, if the three axes of a given sensor coordinate system are the X, Y, and Z axes, the acceleration sensor detects translational acceleration along each of the X, Y, and Z axes, and the angular velocity sensor detects angular velocity about each of those axes.
By using the motion sensor, the position and orientation of the distal end portion 11 can be obtained. For example, the displacement and rotation amount of the distal end portion 11 are obtained by integrating the outputs of the acceleration sensor and the angular velocity sensor. To determine the position and orientation from a motion sensor, which is an inertial sensor, a given reference position must be set as a boundary condition. For example, when a reference position and orientation are defined in a given reference coordinate system fixed in three-dimensional space, the processing unit 120 obtains the position and orientation of the distal end portion 11 at each timing by accumulating, relative to that reference, the displacement and rotation amounts obtained from the sensor output. This makes it possible to express the direction of the straight line L11 at each timing in a given coordinate system.
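As one sketch of the accumulation step described above, the orientation of the distal end portion 11 can be dead-reckoned by integrating the three-axis angular velocity, applying an incremental rotation (Rodrigues' formula) per sample; accumulating the accelerometer output for displacement would follow the same pattern. All names here are illustrative assumptions, and a practical system would also handle sensor bias and drift.

```python
import numpy as np


def integrate_gyro(R0, omegas, dt):
    """Dead-reckon the tip orientation from 3-axis gyro samples.

    R0: initial 3x3 rotation matrix (the boundary-condition reference
    orientation). omegas: sequence of angular-velocity vectors [rad/s]
    in the sensor frame. dt: sample interval [s].
    """
    R = R0.copy()
    for w in omegas:
        theta = np.linalg.norm(w) * dt  # rotation angle for this step
        if theta < 1e-12:
            continue
        k = np.array([[0, -w[2], w[1]],
                      [w[2], 0, -w[0]],
                      [-w[1], w[0], 0]]) / np.linalg.norm(w)
        # Rodrigues' formula for the incremental rotation exp(theta*k)
        dR = np.eye(3) + np.sin(theta) * k + (1 - np.cos(theta)) * (k @ k)
        R = R @ dR
    return R
```

The row vectors of the accumulated R give the tip's axes in the reference frame, from which the direction of the straight line L11 at each timing can be read off.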
The plane P11 can be estimated, for example, by obtaining the distance to the tissue to be treated based on a captured image. For example, the endoscope system 300 may include multiple imaging systems in the distal end portion 11. The processing unit 120 obtains the distance to the subject captured in the image by performing stereo matching processing based on parallax images captured by the multiple imaging systems at different positions. Since stereo matching is a well-known technique, a detailed description is omitted. In this way, the three-dimensional shape of the subject can be estimated. For example, since the processing unit 120 can identify the coordinates of each point of the subject in the camera coordinate system, it can obtain the plane P11 containing the tissue to be treated using that camera coordinate system. Because the relationship between the orientation of the distal end portion 11 and the camera coordinate system is known by design, the relationship between the orientation of the distal end portion 11 and the plane P11 can be obtained. For example, the position and orientation of the distal end portion 11 and the three-dimensional shape of the subject can be expressed in a common reference coordinate system. In other words, by acquiring the output of the motion sensor and the captured images serving as parallax images, the processing unit 120 can express the straight line L11 and the plane P11 in an arbitrary coordinate system and calculate the approach angle, that is, the angle between the straight line L11 and the plane P11.
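Once stereo matching yields 3D points on the tissue to be treated, the plane P11 can be obtained, for example, by a least-squares fit. One common approach, sketched below under that assumption (it is not the embodiment's stated implementation), takes the singular vector of the centered point set with the smallest singular value as the plane normal.

```python
import numpy as np


def fit_plane(points):
    """Fit a plane to 3D points (e.g. reconstructed by stereo matching).

    Returns (centroid, unit_normal). The normal is the right singular
    vector associated with the smallest singular value of the centered
    point matrix, i.e. the direction of least variance.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

The returned normal can then be paired with the axis direction of L11 to evaluate the approach angle in a common coordinate system.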
The plane P11 may be a plane orthogonal to the normal vector at a given point of the tissue to be treated, or a plane approximating the surface shape of the tissue to be treated. More generally, the approach angle in the present embodiment may be any information representing the relationship between the distal end portion 11 of the insertion section 310b and the tissue to be treated, and the approach angle may be obtained based on the above normal vector or similar information instead of the plane P11.
When obtaining the time-series approach angle, the processing unit 120 may, at each timing, both calculate the position and orientation of the distal end portion 11 based on the motion sensor and estimate the subject shape based on the parallax images. However, considering that it is not easy to obtain the subject shape from images such as those in FIGS. 1(D) and 1(F), the processing unit 120 may continue to use, at later timings, the information on the plane P11 acquired at a given timing. For example, as shown in FIGS. 1(A) and 1(B), the processing unit 120 obtains the plane P11 at a position that gives a relatively wide overview of the subject, and continues to use that plane P11 during the treatment. In this way, the information on the plane P11 can be acquired appropriately even when it is difficult to obtain information on the tissue to be treated from the image. After acquiring the information on the plane P11, the processing unit 120 obtains the straight line L11 based on the motion sensor at each timing, and calculates the approach angle using the obtained straight line L11 and the already acquired plane P11.
The method of obtaining the position and orientation of the distal end portion 11 of the insertion section 310b is not limited to one using a motion sensor. For example, the endoscope system 300 may include a magnetic sensor provided at the distal end portion 11. The magnetic sensor includes, for example, two cylindrical coils whose central axes are orthogonal to each other. The endoscope system 300 also includes, as a peripheral device, a magnetic field generator (not shown). The magnetic sensor detects the position and orientation of the distal end portion 11 by detecting the magnetic field generated by the magnetic field generator.
The measurement method on the treatment target tissue side is also not limited to one using parallax images. For example, the processing unit 120 may measure the treatment target tissue side by measuring the distance to the subject using a TOF (Time Of Flight) method or a structured light method. The TOF method measures the time it takes for a reflected light wave to reach the image sensor. The structured light method projects multiple light patterns onto the subject and determines the distance from how each pattern appears; for example, the phase shift method is known, in which a pattern whose brightness varies sinusoidally is projected and the phase shift is obtained. Since these techniques for estimating the three-dimensional shape of a subject are well known, a detailed description is omitted.
The processing unit 120 may also calculate the position of the tissue to be treated in three-dimensional space by associating multiple feature points across multiple different captured images. The positions of the feature points can be calculated from the image information using techniques such as SLAM (Simultaneous Localization and Mapping) or SfM (Structure from Motion). For example, the processing unit 120 obtains the information of the tissue to be treated by applying bundle adjustment, which optimizes the intrinsic parameters, extrinsic parameters, and world coordinate point cloud from the images using a nonlinear least-squares method. Using each estimated parameter, the processing unit 120 applies a perspective projection transformation to the world coordinates of the extracted feature points, and obtains each parameter and each world coordinate point so that the reprojection error is minimized. Since techniques such as SfM are well known, a further detailed description is omitted. Note that these techniques can estimate not only the three-dimensional position of the subject but also the position and orientation of the camera, so a technique such as SfM may also be used to estimate the position and orientation of the distal end portion 11.
The measurement on the treatment target tissue side is also not limited to one using images captured by the endoscope system 300. For example, a CT (Computed Tomography) image or MRI (Magnetic Resonance Imaging) image of the patient may be acquired before treatment with the endoscope system 300. Using CT and MRI images, the shape of the organs around the tissue to be treated, and along the path leading to it, can be estimated. However, the estimated organ shape is the shape at the time the MRI or CT image was captured, and the organ shape during treatment changes due to various factors. The processing unit 120 therefore acquires information related to those factors and corrects the organ shape based on that information, thereby estimating the organ shape while the insertion section 310b is inserted. Based on the corrected organ shape, the processing unit 120 calculates the information of the tissue to be treated, for example the plane P11.
The information related to factors that change the organ shape is, for example, information on the patient's body position, the air pressure in the lumen, the direction of gravity, the insertion shape of the insertion section 310b, and the like. The patient's body position may be obtained from the drive amount of a movable bed, or may be input by the user. The air pressure in the lumen may be estimated from the air supply amount or suction amount, or may be acquired by providing an air pressure sensor in the insertion section 310b. The direction of gravity can be detected from the motion sensor. The insertion shape of the insertion section 310b may be detected by providing motion sensors or magnetic sensors at multiple locations on the insertion section 310b, or may be estimated based on the history of advance/retract and bending operations of the insertion section 310b.
As shown in FIG. 13(B), the approach angle may instead be the angle formed by the straight line L11, which is the axis of the insertion section 310b, and the straight line L12, which represents the direction of gravity. When the direction of gravity is used, the approach angle is not directly the angle between the tissue to be treated and the distal end portion 11. However, in ESD of the stomach, for example, the patient's body position is fixed to some extent, with the left lateral decubitus position typically used. Therefore, if the position of the tissue to be treated within the organ is known, the relationship between the plane P11 and the direction of gravity is known. Even when the body position is changed, the relationship between the plane P11 and the direction of gravity remains known as long as information representing the body position can be acquired each time.
 よって処理部120は、軸線と重力方向に基づいて、アプローチ角度を求めてもよい。重力方向は、例えば上述したモーションセンサを用いて求めることが可能である。なお処理部120は、求めたアプローチ角度をそのまま用いてもよいし、重力方向と平面P11との関係に基づいて、図13(A)と同様に直線L11と平面P11の角度を演算してもよい。 Therefore, the processing unit 120 may obtain the approach angle based on the axis and the direction of gravity. The direction of gravity can be determined using, for example, the motion sensor described above. The processing unit 120 may use the obtained approach angle as it is, or may calculate the angle between the straight line L11 and the plane P11 based on the relationship between the direction of gravity and the plane P11, in the same manner as in FIG. 13A.
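As a minimal illustration of the angle calculation described above (a sketch only, not part of the claimed embodiment; the vectors and function names are assumptions), the angle between the axis L11 of the insertion section and the gravity direction L12 reported by a motion sensor can be obtained from the dot product of the two direction vectors:

```python
import math

def angle_between(v1, v2):
    # Angle in degrees between two 3D direction vectors via the dot product.
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to [-1, 1] to guard against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

# Illustrative values: axis L11 of the insertion section (from the motion
# sensor) and the gravity direction L12.
axis = (0.0, 0.0, 1.0)
gravity = (0.0, -1.0, 0.0)
approach_angle = angle_between(axis, gravity)  # 90 degrees for these vectors
```

If the relationship between the gravity direction and the plane P11 is known, the same routine can then be reused to convert this angle into the angle between the line L11 and the plane P11.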
 次に、図14のフローチャートを用いて、ログ情報を出力するための処理例を説明する。先ず処理部120は、処置対象組織に対する処置が開始されたか否かを検出する(ステップS401)。例えば処理部120は、周辺機器の使用情報に基づいて、処置の開始を検出する処理を行う。ここで処理部120とは、例えば前述の処理装置330に対応するが、スコープ部310、表示部340、光源装置350等まで拡張してもよい。周辺機器は例えば処置を行うための処置具360に対応するが、吸引装置370、送気送水装置380、処置具360に電力を供給する電源装置等まで拡張してもよい。 Next, an example of processing for outputting log information will be described using the flowchart of FIG. First, the processing unit 120 detects whether or not treatment has been started on the treatment target tissue (step S401). For example, the processing unit 120 performs processing for detecting the start of treatment based on the usage information of the peripheral device. Here, the processing unit 120 corresponds to, for example, the processing device 330 described above, but may be expanded to include the scope unit 310, the display unit 340, the light source device 350, and the like. The peripheral device corresponds to, for example, the treatment tool 360 for performing treatment, but may be expanded to include a suction device 370, an air/water supply device 380, a power supply device for supplying power to the treatment tool 360, and the like.
 例えば、処理部120は、処置具360の突出状態に関する情報を使用情報として取得する。図5で前述したように、挿入部310bには、操作部310aから先端部11の開口部316までつながるチャンネルが設けられているので、術者は、当該チャンネルに処置具360を挿通することによって処置を行う。処置具360の突出状態とは、先端部11の開口部316から処置具360の先端が突出したか否か、或いは、その突出量を表す情報である。そして、処理部120は、撮像画像に処置具360が撮像されたか否かを判定することによって、突出状態に関する使用情報を取得する。図1(A)等に示したように、処置具360は先端部11の開口部316から挿入部310bの軸方向に突出するため、図1(B)等に示したように、撮像画像には突出した状態の処置具360が撮像される。処置具360は、生体組織に比べて彩度が低く、その形状も設計から既知である。 For example, the processing unit 120 acquires information about the protruding state of the treatment instrument 360 as usage information. As described above with reference to FIG. 5, the insertion section 310b is provided with a channel that extends from the operation section 310a to the opening 316 of the distal end portion 11, and the operator performs treatment by inserting the treatment instrument 360 through this channel. The protruding state of the treatment instrument 360 is information indicating whether or not the distal end of the treatment instrument 360 protrudes from the opening 316 of the distal end portion 11, or the amount of protrusion. The processing unit 120 then acquires the usage information on the protruding state by determining whether or not the treatment instrument 360 appears in the captured image. As shown in FIG. 1A and the like, the treatment instrument 360 protrudes from the opening 316 of the distal end portion 11 in the axial direction of the insertion section 310b, so, as shown in FIG. 1B and the like, the protruding treatment instrument 360 appears in the captured image. The treatment instrument 360 has lower saturation than living tissue, and its shape is also known from its design.
Therefore, the processing unit 120 can detect the treatment tool region in the image by performing image processing on the captured image, such as saturation determination processing or matching processing using a reference shape. For example, when the treatment tool 360 is present in the image, the processing unit 120 determines that the treatment tool 360 is protruding and that treatment has started. Moreover, when no treatment is performed, there is no need to protrude the treatment instrument 360, and protruding it may rather injure the living body. That is, when the operator protrudes the treatment instrument 360, it can be estimated that treatment will start promptly. Therefore, by using the protruded state of the treatment instrument 360, the start of treatment can be determined accurately. Note that the processing unit 120 may determine the protruding state of the treatment instrument 360 based on the output of a sensor provided in the opening 316 corresponding to the forceps port.
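The saturation determination described above can be sketched as follows. This is an illustrative approximation only: a real implementation would operate on HSV-converted camera frames and would likely combine the saturation mask with shape matching, and the threshold values and function names here are assumptions.

```python
def detect_tool_region(hsv_pixels, sat_threshold=0.25, area_fraction=0.05):
    # hsv_pixels: iterable of (h, s, v) tuples with s and v in [0, 1].
    # The treatment tool is assumed to appear as a low-saturation region,
    # in contrast to the more strongly colored living tissue.
    pixels = list(hsv_pixels)
    low_sat = sum(1 for (_, s, _) in pixels if s < sat_threshold)
    # Judge the tool as present when enough of the frame is low-saturation.
    return low_sat / len(pixels) >= area_fraction

# Illustrative frame: 10% of pixels are nearly gray (the tool),
# the rest are saturated tissue colors.
frame = [(0.1, 0.05, 0.9)] * 10 + [(0.0, 0.8, 0.5)] * 90
tool_present = detect_tool_region(frame)  # True for this frame
```

A positive result would then serve as the trigger for judging that treatment has started, as in step S401.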
 また、処理部120は、高周波デバイスの通電状態に関する情報を使用情報として取得する。処理部120は、当該制御信号に基づいて、高周波デバイスに高周波電流が供給されている場合に、処置が開始されたと判定する。高周波デバイスは、単に突出させただけでは切除や焼灼を行うことができず、高周波電流の供給が必要となる。即ち、高周波デバイスに高周波電流が供給された場合、速やかに処置が開始される蓋然性がより高い。よって高周波デバイスの通電状態を用いることによって、処置の開始を精度よく判定できる。 The processing unit 120 also acquires information on the energization state of the high-frequency device as usage information. Based on the corresponding control signal, the processing unit 120 determines that treatment has started when high-frequency current is being supplied to the high-frequency device. A high-frequency device cannot perform resection or cauterization simply by being protruded; it requires the supply of high-frequency current. That is, when high-frequency current is supplied to the high-frequency device, the probability that treatment will start promptly is even higher. Therefore, by using the energization state of the high-frequency device, the start of treatment can be determined accurately.
 処理部120は、処置対象組織に対する処置が開始された場合(ステップS401でYES)、処置が開始されたと判定されたタイミングにおけるアプローチ角度を、基準角度に設定する処理を行う(ステップS402)。例えば、処理部120は、処置具360が突出したと判定されたタイミングにおいてモーションセンサのセンサ情報及び撮像画像を取得し、挿入部310bの軸線である直線L11と、処置対象組織に関する平面P11を求める処理を行い、直線L11と平面P11のなす角度を、アプローチ角度の基準角度に設定する。なお、処置対象組織に対する処置が開始されない(ステップS401でNo)場合、開始されるまで待機するため、アプローチ角度の相対変化の算出は行われない。つまり、処置が行われない期間は、図9や図10に示すアプローチ角度情報のログ情報において、時間軸に平行なグラフを描き続ける。 When treatment on the treatment target tissue has started (YES in step S401), the processing unit 120 performs processing for setting the approach angle at the timing when it is determined that treatment has started as the reference angle (step S402). For example, the processing unit 120 acquires the sensor information of the motion sensor and the captured image at the timing when it is determined that the treatment instrument 360 has protruded, performs processing for obtaining the straight line L11, which is the axis of the insertion section 310b, and the plane P11 relating to the treatment target tissue, and sets the angle formed by the straight line L11 and the plane P11 as the reference angle for the approach angle. Note that if treatment on the treatment target tissue is not started (No in step S401), the system waits until it starts, and the relative change in the approach angle is not calculated. In other words, during a period in which no treatment is performed, the log information of the approach angle information shown in FIGS. 9 and 10 continues to draw a graph parallel to the time axis.
 なお、本実施形態における「処置具360が突出したタイミングにおけるアプローチ角度」とは、突出検出をトリガーとして算出されたアプローチ角度を表すものであって、突出の判定タイミング、センサ情報の取得タイミング、撮像画像の取得タイミングは厳密に一致する必要はない。また、上述したように、処理部120は、高周波デバイスの通電開始タイミングに対応するタイミングのアプローチ角度を基準角度に設定してもよい。以上のように、処置開始時のアプローチ角度を基準角度に設定することによって、1回の処置のなかでのアプローチ角度の相対変化を適切に求めることが可能になる。 Note that the "approach angle at the timing when the treatment instrument 360 protrudes" in the present embodiment represents the approach angle calculated with the detection of protrusion as a trigger; the timing of the protrusion determination, the acquisition timing of the sensor information, and the acquisition timing of the captured image need not coincide exactly. Further, as described above, the processing unit 120 may set, as the reference angle, the approach angle at the timing corresponding to the energization start timing of the high-frequency device. As described above, by setting the approach angle at the start of treatment as the reference angle, it becomes possible to appropriately obtain the relative change in the approach angle during one treatment.
 基準角度が設定された後、処理部120は、アプローチ角度の相対変化を求める(ステップS403)。例えば処理部120は、モーションセンサのセンサ情報を取得し、当該センサ情報に基づいて、直線L11を表す情報を求める。処理部120は直線L11と、ステップS402で求めた平面P11とのなす角度を、そのときのアプローチ角度として求める。さらに処理部120は、求めたアプローチ角度と、ステップS402で求めた基準角度の差分を算出し、当該差分をアプローチ角度の相対変化とする。 After setting the reference angle, the processing unit 120 obtains the relative change in the approach angle (step S403). For example, the processing unit 120 acquires sensor information from a motion sensor, and obtains information representing the straight line L11 based on the sensor information. The processing unit 120 obtains the angle formed by the straight line L11 and the plane P11 obtained in step S402 as the approach angle at that time. Further, the processing unit 120 calculates the difference between the obtained approach angle and the reference angle obtained in step S402, and regards the difference as the relative change in the approach angle.
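The computation of steps S402 and S403 can be sketched as follows, assuming the plane P11 is represented by its normal vector; the vectors, function names, and values are illustrative assumptions, not part of the embodiment.

```python
import math

def line_plane_angle(direction, plane_normal):
    # Angle in degrees between a line and a plane: 90 degrees minus the
    # angle between the line direction and the plane normal.
    dot = abs(sum(a * b for a, b in zip(direction, plane_normal)))
    nd = math.sqrt(sum(a * a for a in direction))
    nn = math.sqrt(sum(b * b for b in plane_normal))
    cos_to_normal = max(-1.0, min(1.0, dot / (nd * nn)))
    return 90.0 - math.degrees(math.acos(cos_to_normal))

plane_normal = (0.0, 0.0, 1.0)                  # normal of plane P11

# Step S402: the angle at treatment start becomes the reference angle.
reference = line_plane_angle((0.0, 1.0, 1.0), plane_normal)   # 45 degrees

# Step S403: at a later time, the current angle minus the reference
# gives the relative change of the approach angle.
current = line_plane_angle((0.0, 0.0, 1.0), plane_normal)     # 90 degrees
relative_change = current - reference                          # 45 degrees
```

The relative change, rather than the absolute angle, is what is logged and output in step S404.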
 そして、出力処理部130は、ステップS403で求められた相対変化を出力する処理を行う(ステップS404)。例えば出力処理部130は、処置が開始されたと判定したタイミングよりも後のタイミングにおいて、相対変化に対応する角度をリアルタイムに表示する表示処理を行う。処置が開始されたと判定したタイミングとはステップS401でYesと判定したタイミングであり、その後のタイミングとはステップS404のタイミングである。 Then, the output processing unit 130 performs processing for outputting the relative change obtained in step S403 (step S404). For example, the output processing unit 130 performs display processing for displaying the angle corresponding to the relative change in real time at a timing after the timing at which it is determined that treatment has started. The timing at which it is determined that treatment has started is the timing at which Yes is determined in step S401, and the subsequent timing is the timing of step S404.
 このように、図13、図14に示す手法により、アプローチ角度情報に関するログ情報を取得することができる。また、公知の手法により、前述のモーションセンサを用いて、所望のタイミングにてアプローチ角度情報を取得することができる。軟性内視鏡は前述のように軟性かつ長大であり、ロボットアームのように剛性が無い為、入力の内容と出力結果には対応関係が無い。つまり、操作部310aへの入力信号等の情報に基づいてもアプローチ角度情報を得ることはできない。一方、高周波デバイスへの通電は入力と出力が対応していることから、例えば高周波デバイスに対する入力信号に基づいて、通電開始時刻や通電終了時刻の情報や通電を行った回数等の情報を得ることができる。また、さらに公知の手法により、所望のタイミングで通電に関する情報、すなわち通電履歴情報をデジタルデータとして取得することができる。これにより、アプローチ角度情報と通電履歴情報を同一のタイミングで取得することができるので、ログ情報を組み合わせることや、後述のニューラルネットワークNN1にアプローチ角度情報の時系列データや通電履歴情報の時系列データを入力することが実現できる。なお、ここでの同一は略同一を含む。なお、図14のフローチャートには図示を省略しているが、通電が行われた期間に対応するエリア(図9及び図10の点線区間)の色等を変更する処理を追加してもよい。このようにすることで、ログ情報をより視覚的かつ明確に示すことができる。 In this way, log information on the approach angle information can be obtained by the methods shown in FIGS. 13 and 14. In addition, the approach angle information can be acquired at a desired timing by a known method using the motion sensor described above. As described above, a flexible endoscope is flexible and long and, unlike a robot arm, has no rigidity, so there is no correspondence between the content of an input and the output result. In other words, the approach angle information cannot be obtained from information such as input signals to the operation section 310a. On the other hand, since the input and output of the energization of the high-frequency device correspond to each other, information such as the energization start time, the energization end time, and the number of energizations can be obtained based on, for example, the input signal to the high-frequency device. Furthermore, information on energization, that is, energization history information, can be acquired as digital data at a desired timing by a known method. As a result, the approach angle information and the energization history information can be acquired at the same timing, which makes it possible to combine the log information and to input time-series data of the approach angle information and time-series data of the energization history information to the neural network NN1 described later. The term "same" here includes substantially the same. Although not shown in the flowchart of FIG. 14, a process of changing the color or the like of the area corresponding to the energized period (the dotted-line sections in FIGS. 9 and 10) may be added. By doing so, the log information can be presented more visually and clearly.
 次に、機械学習の概要について説明する。以下では、ニューラルネットワークNN1を用いた機械学習について説明するが、本実施形態の手法はこれに限定されない。本実施形態においては、例えばSVM(support vector machine)等の他のモデルを用いた機械学習が行われてもよいし、これらの手法を発展させた手法を用いた機械学習が行われてもよい。 Next, an overview of machine learning will be described. Machine learning using the neural network NN1 will be described below, but the method of the present embodiment is not limited to this. In the present embodiment, machine learning using another model such as an SVM (support vector machine) may be performed, or machine learning using a technique developed from these techniques may be performed.
 図15は、ニューラルネットワークNN1を説明する模式図である。ニューラルネットワークNN1は、データが入力される入力層と、入力層からの出力に基づいて演算を行う中間層と、中間層からの出力に基づいてデータを出力する出力層を有する。図15においては、中間層が2層であるネットワークを例示するが、中間層は1層であってもよいし、3層以上であってもよい。また各層に含まれるノードの数は図15の例に限定されず、種々の変形実施が可能である。なお精度を考慮すれば、本実施形態の学習は多層のニューラルネットワークNN1を用いたディープラーニングを用いることが望ましい。ここでの多層とは、狭義には4層以上である。 FIG. 15 is a schematic diagram explaining the neural network NN1. The neural network NN1 has an input layer to which data is input, an intermediate layer that performs operations based on the output from the input layer, and an output layer that outputs data based on the output from the intermediate layer. FIG. 15 illustrates a network with two intermediate layers, but the number of intermediate layers may be one, or three or more. Also, the number of nodes included in each layer is not limited to the example in FIG. 15, and various modifications are possible. Considering the accuracy, it is desirable to use deep learning using a multi-layered neural network NN1 for learning in this embodiment. The term “multilayer” as used herein means four or more layers in a narrow sense.
 図15に示すように、所与の層に含まれるノードは、隣接する層のノードと結合される。各結合には重み付け係数が設定されている。各ノードは、前段のノードの出力と重み付け係数を乗算し、乗算結果の合計値を求める。さらに各ノードは、合計値に対してバイアスを加算し、加算結果に活性化関数を適用することによって当該ノードの出力を求める。この処理を、入力層から出力層へ向けて順次実行することによって、ニューラルネットワークNN1の出力が求められる。なお活性化関数としては、シグモイド関数やReLU関数等の種々の関数が知られており、本実施形態ではそれらを広く適用可能である。 As shown in FIG. 15, the nodes included in a given layer are connected to the nodes of the adjacent layers. A weighting coefficient is set for each connection. Each node multiplies the outputs of the preceding nodes by the weighting coefficients and obtains the sum of the multiplication results. Furthermore, each node adds a bias to the sum and applies an activation function to the result to obtain the output of that node. By executing this processing sequentially from the input layer to the output layer, the output of the neural network NN1 is obtained. Various functions such as the sigmoid function and the ReLU function are known as activation functions, and they are widely applicable in the present embodiment.
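The per-node computation described above (weighted sum, bias, activation, repeated layer by layer) can be sketched as follows; the sigmoid activation and the tiny network are illustrative choices, not a prescription of the embodiment.

```python
import math

def sigmoid(x):
    # One of the activation functions mentioned above.
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # One layer: each node multiplies the preceding outputs by its
    # weighting coefficients, sums them, adds a bias, and applies
    # the activation function.
    return [sigmoid(sum(w * x for w, x in zip(node_w, inputs)) + b)
            for node_w, b in zip(weights, biases)]

def forward(inputs, layers):
    # Execute the processing sequentially from input layer to output layer.
    out = inputs
    for weights, biases in layers:
        out = layer_forward(out, weights, biases)
    return out
```

For example, `layer_forward([1.0], [[0.0]], [0.0])` yields `[0.5]`, since the weighted sum plus bias is 0 and the sigmoid of 0 is 0.5.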
 ニューラルネットワークNN1における学習は、適切な重み付け係数を決定する処理である。ここでの重み付け係数は、バイアスを含む。以下、学習済モデルを生成する処理が学習装置において行われる例を示す。学習装置とは、例えば上述したように処理システム100に含まれる学習サーバであってもよいし、処理システム100の外部に設けられる装置であってもよい。 Learning in the neural network NN1 is a process of determining appropriate weighting coefficients. The weighting factor here includes the bias. An example in which processing for generating a trained model is performed in a learning device will be described below. The learning device may be, for example, a learning server included in the processing system 100 as described above, or may be a device provided outside the processing system 100 .
 学習装置は、学習データのうちの入力データをニューラルネットワークNN1に入力し、そのときの重み付け係数を用いた順方向の演算を行うことによって出力を求める。学習装置は、当該出力と、学習データのうちの正解ラベルとに基づいて、誤差関数を演算する。そして誤差関数を小さくするように、重み付け係数を更新する。重み付け係数の更新では、例えば出力層から入力層に向かって重み付け係数を更新していく誤差逆伝播法を利用可能である。 The learning device inputs the input data of the learning data to the neural network NN1, and obtains the output by performing forward calculations using the weighting coefficients at that time. The learning device calculates an error function based on the output and the correct label in the learning data. Then, the weighting coefficients are updated so as to reduce the error function. For updating the weighting coefficients, for example, an error backpropagation method can be used to update the weighting coefficients from the output layer toward the input layer.
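The update of the weighting coefficients toward a smaller error function can be illustrated on a single linear node with a squared-error function. This is a deliberately minimal stand-in for full backpropagation; the learning rate and values are arbitrary assumptions.

```python
def train_step(w, b, x, target, lr=0.1):
    # One update of a single node y = w*x + b toward the correct label,
    # reducing the squared error E = (y - target)^2 by gradient descent.
    # The gradient flows backward from the output, as in backpropagation.
    y = w * x + b
    grad = 2.0 * (y - target)   # dE/dy
    w -= lr * grad * x          # dE/dw = dE/dy * x
    b -= lr * grad              # dE/db = dE/dy
    return w, b

w, b = 0.0, 0.0
for _ in range(200):
    w, b = train_step(w, b, x=1.0, target=1.0)
# After repeated updates, w*1.0 + b approaches the target 1.0
```

Repeating such updates over many pieces of learning data corresponds to the learning process that determines appropriate weighting coefficients.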
 なおニューラルネットワークNN1には種々の構成のモデルが知られており、本実施形態ではそれらを広く適用可能である。例えばニューラルネットワークNN1は、CNN(Convolutional Neural Network)であってもよいし、RNN(Recurrent Neural Network)であってもよいし、他のモデルであってもよい。CNN等を用いる場合も、処理の手順は図15と同様である。即ち、学習装置は、学習データのうちの入力データをモデルに入力し、そのときの重み付け係数を用いてモデル構成に従った順方向演算を行うことによって出力を求める。当該出力と、正解ラベルとに基づいて誤差関数が算出され、当該誤差関数を小さくするように、重み付け係数の更新が行われる。CNN等の重み付け係数を更新する際にも、例えば誤差逆伝播法を利用可能である。 Models of various configurations are known for the neural network NN1, and they are widely applicable in the present embodiment. For example, the neural network NN1 may be a CNN (Convolutional Neural Network), an RNN (Recurrent Neural Network), or another model. When a CNN or the like is used, the processing procedure is the same as in FIG. 15. That is, the learning device inputs the input data of the learning data to the model and obtains the output by performing forward calculation according to the model configuration using the weighting coefficients at that time. An error function is calculated based on the output and the correct label, and the weighting coefficients are updated so as to reduce the error function. The error backpropagation method, for example, can also be used when updating the weighting coefficients of a CNN or the like.
 次に、図16を用いて、本実施形態の手法におけるニューラルネットワークNN1の入力と出力の関係を説明する。図16に示すように、ニューラルネットワークNN1の入力は、例えばアプローチ角度情報と、通電履歴情報である。例えば、所与の術者による1回の手術において、時系列のアプローチ角度情報と、通電履歴情報が取得される。ニューラルネットワークNN1の入力は、時系列データであるが、時系列データに基づいて演算される統計量であってもよい。 Next, using FIG. 16, the relationship between the input and output of the neural network NN1 in the method of this embodiment will be described. As shown in FIG. 16, inputs to the neural network NN1 are, for example, approach angle information and energization history information. For example, time-series approach angle information and energization history information are acquired in one surgery by a given operator. The input of the neural network NN1 is time series data, but may be a statistic calculated based on the time series data.
 ニューラルネットワークNN1の出力は、例えば評価対象となるユーザのスキルを、M段階でランク付けした際のランクを表す情報である。Mは2以上の整数である。以下、ランクIは、ランクI+1に比べてスキルが高いものとする。Iは、1以上M未満の整数である。即ち、ランク1は最もスキルが高いことを表し、ランクMが最もスキルが低いことを表す。 The output of the neural network NN1 is, for example, information representing the rank when the skill of the user to be evaluated is ranked in M levels. M is an integer of 2 or more. Hereinafter, it is assumed that rank I is higher in skill than rank I+1. I is an integer of 1 or more and less than M. That is, rank 1 represents the highest skill, and rank M represents the lowest skill.
 例えばニューラルネットワークNN1の出力層はM個のノードを有する。第1ノードは、入力となったデータに対応するユーザのスキルがカテゴリ1に属する確からしさを表す情報である。第2ノード~第Mノードも同様であり、各ノードはそれぞれ、入力となったデータがカテゴリ2~カテゴリMに属する確からしさを表す情報である。例えば、出力層が公知のソフトマックス層である場合、M個の出力は、合計が1となる確率データの集合である。カテゴリ1~カテゴリMは、それぞれランク1~ランクMに対応するカテゴリである。 For example, the output layer of the neural network NN1 has M nodes. The output of the first node is information representing the likelihood that the skill of the user corresponding to the input data belongs to category 1. The same applies to the second to M-th nodes: the output of each node is information representing the likelihood that the input data belongs to categories 2 to M, respectively. For example, when the output layer is a known softmax layer, the M outputs are a set of probability data whose sum is 1. Categories 1 to M correspond to ranks 1 to M, respectively.
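A softmax layer as described, whose M outputs sum to 1, can be sketched as follows (a generic illustration; the logit values are arbitrary):

```python
import math

def softmax(logits):
    # Converts M raw node outputs into probabilities that sum to 1.
    m = max(logits)                        # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Three categories (M = 3): the largest raw output receives the
# largest probability, and the probabilities sum to 1.
probs = softmax([2.0, 1.0, 0.1])
```

The category with the largest probability then indicates the estimated rank of the user's skill.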
 学習段階では、学習装置は、多数の術者がそれぞれ軟性内視鏡を用いて処置を行った際に取得されたアプローチ角度情報と通電履歴情報を収集するが、さらにメタデータを保持してもよい。ここでのメタデータは、例えば前述の患者情報であるが、さらに医師情報を加えてもよい。例えば、学習装置は、所定の患者情報を有する症例に対してアプローチ角度情報及び通電履歴情報が入力された場合、評価としてランクAの正解ラベルを付与する。ここでの所定の患者情報とは、例えば前述の手術後の偶発症発生率が平均値より低く、かつ入院に要した日数が平均値より少ない患者情報である。一方、特定の患者情報を有する症例に対してアプローチ角度情報及び通電履歴情報が入力された場合、評価としてランクDの正解ラベルを付与する。ここでの特定の患者情報とは、手術後の偶発症発生率が平均値より著しく高く、かつ入院に要した日数が平均値より著しく高い患者情報である。このように、学習装置は、メタデータである患者情報に基づいて術者のスキルがM段階のうちいずれかを特定する。また、学習段階では、1症例ごとに、熟練医が手動で各修練医のスキルを評価して、評価結果を学習装置に入力してもよい。また、例えば、術後の経過が良好な場合の学習済モデルと、術後の経過が良好ではない場合の学習済モデルを用意してもよい。処理部120は、前述の患者情報をもとに、術後の経過が良好な場合の学習済モデルと、術後の経過が良好ではない場合の学習済モデルを選択する。そして、選択された学習済モデルにアプローチ角度情報と通電履歴情報を入力することで、スキル評価を行う。また、例えば、術後の経過が良好な場合の学習済モデルにのみ「A+」の評価結果が出力される仕様にしてもよい。 In the learning stage, the learning device collects approach angle information and energization history information acquired when many operators each performed treatment using a flexible endoscope, and may further retain metadata. The metadata here is, for example, the patient information described above, but doctor information may also be added. For example, when approach angle information and energization history information are input for a case having predetermined patient information, the learning device assigns a correct label of rank A as the evaluation. The predetermined patient information here is, for example, patient information in which the incidence of postoperative complications is lower than the average value and the number of days of hospitalization is shorter than the average value. On the other hand, when approach angle information and energization history information are input for a case having specific patient information, a correct label of rank D is assigned as the evaluation. The specific patient information here is patient information in which the incidence of postoperative complications is significantly higher than the average value and the number of days of hospitalization is significantly longer than the average value. In this way, the learning device identifies which of the M levels the operator's skill belongs to based on the patient information, which is metadata. In the learning stage, a skilled doctor may also manually evaluate the skill of each trainee for each case and input the evaluation results into the learning device. Further, for example, a trained model for cases with a favorable postoperative course and a trained model for cases with an unfavorable postoperative course may be prepared. Based on the patient information described above, the processing unit 120 selects either the trained model for a favorable postoperative course or the trained model for an unfavorable postoperative course. Skill evaluation is then performed by inputting the approach angle information and the energization history information into the selected trained model. Further, for example, a specification may be adopted in which an evaluation result of "A+" is output only by the trained model for cases with a favorable postoperative course.
 図17は、ニューラルネットワークNN1の学習処理を説明するフローチャートである。まずステップS101とステップS102において、学習装置は、学習用アプローチ角度情報と、学習用通電履歴情報を取得する。ステップS101とステップS102の処理は、例えば学習サーバが、データベースサーバに蓄積された多数のデータから、1組のアプローチ角度情報及び通電履歴情報を読み出す処理に相当する。 FIG. 17 is a flowchart explaining the learning process of the neural network NN1. First, in steps S101 and S102, the learning device acquires approach angle information for learning and energization history information for learning. The processing of steps S101 and S102 corresponds to, for example, processing by the learning server to read out a set of approach angle information and energization history information from a large amount of data accumulated in the database server.
 なお、アプローチ角度情報と学習用アプローチ角度情報とは、学習段階で用いられるデータであるか、スキル評価を行う推論段階で用いられるデータであるかの違いを表すものであり、具体的なデータ形式は同様である。また、所与のタイミングにおいて推論用のアプローチ角度情報として用いられたデータが、それ以降のタイミングにおいて学習用アプローチ角度情報として用いられてもよい。通電履歴情報と学習用通電履歴情報についても同様である。 Note that the distinction between approach angle information and approach angle information for learning merely indicates whether the data is used in the learning stage or in the inference stage where skill evaluation is performed; the specific data format is the same. Also, data used as approach angle information for inference at a given timing may be used as approach angle information for learning at a later timing. The same applies to the energization history information and the energization history information for learning.
 またステップS101とステップS102において、学習装置は、ステップS101で読み出したデータに対応付けられた正解ラベルを取得する。正解ラベルは、例えば上述したように、内視鏡操作を行ったユーザのスキルをM段階で評価した結果である。 Also, in steps S101 and S102, the learning device acquires the correct label associated with the data read out in step S101. The correct label is, for example, the result of evaluating the skill of the user who has operated the endoscope in M stages, as described above.
 ステップS103において、学習装置は、学習処理として誤差関数を求める処理を行う。具体的には、学習装置は、アプローチ角度情報及び通電履歴情報をニューラルネットワークNN1に入力する。学習装置は、入力と、その際の重み付け係数に基づいて順方向の演算を行う。そして学習装置は、演算結果と、正解ラベルの比較処理に基づいて誤差関数を求める。例えば、正解ラベルがランク1であった場合、学習装置は、カテゴリ1に対応する第1ノードの正解値が1であり、カテゴリ2~カテゴリMに対応する第2ノード~第Mノードの正解値が0であるものとして誤差関数を求める。さらにステップS103において、学習装置は、誤差関数を小さくするように重み付け係数を更新する処理を行う。この処理は、上述したように誤差逆伝播法等を利用可能である。ステップS101~S103の処理が、1つの学習データに基づく1回の学習処理に対応する。 In step S103, the learning device performs processing for obtaining an error function as the learning processing. Specifically, the learning device inputs the approach angle information and the energization history information to the neural network NN1. The learning device performs forward calculations based on the input and the weighting coefficients at that time. The learning device then obtains the error function based on comparison processing between the calculation result and the correct label. For example, when the correct label is rank 1, the learning device obtains the error function assuming that the correct value of the first node corresponding to category 1 is 1 and the correct values of the second to M-th nodes corresponding to categories 2 to M are 0. Further, in step S103, the learning device performs processing to update the weighting coefficients so as to reduce the error function. For this processing, the error backpropagation method or the like can be used as described above. The processing of steps S101 to S103 corresponds to one learning process based on one piece of learning data.
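The error function for a one-hot correct label (correct node target 1, all other nodes' targets 0) reduces, in the common cross-entropy form, to the negative log probability of the correct category. A minimal sketch, with illustrative probability values:

```python
import math

def cross_entropy(probs, correct_index):
    # probs: the M softmax outputs; correct_index: the category whose
    # target value is 1 (all others have target 0), so only the
    # probability of the correct category contributes to the error.
    return -math.log(probs[correct_index])

# Rank 1 (category 1, index 0) is the correct label.
loss_good = cross_entropy([0.9, 0.05, 0.05], 0)   # confident, correct
loss_bad = cross_entropy([0.1, 0.45, 0.45], 0)    # mostly wrong
# loss_good < loss_bad: updating the weights to reduce this error
# pushes the correct node's output toward 1.
```

The squared-error form mentioned implicitly in the document would work the same way; cross-entropy is simply the usual pairing with a softmax output layer.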
 ステップS104において、学習装置は学習処理を終了するか否かを判定する。例えば学習装置は、多数の学習データの一部を評価データとして保持していてもよい。評価データは、学習結果の精度を確認するためのデータであり、重み付け係数の更新には使用されないデータである。学習装置は、評価データを用いた推定処理の正解率が所定閾値を超えた場合に、学習処理を終了する。 In step S104, the learning device determines whether or not to end the learning process. For example, the learning device may hold a part of a large amount of learning data as evaluation data. The evaluation data is data for confirming the accuracy of the learning result, and is data that is not used for updating the weighting coefficients. The learning device ends the learning process when the accuracy rate of the estimation process using the evaluation data exceeds a predetermined threshold.
 ステップS104でNoの場合、ステップS101に戻り、次の学習データに基づく学習処理が継続される。ステップS104でYesの場合、学習処理が終了される。学習装置は、生成した学習済モデルの情報を処理システム100に送信する。例えば、学習済モデルは処理システム100に含まれる不図示の記憶部に記憶され、処理部120によって読み出される。なお、機械学習においてはバッチ学習、ミニバッチ学習等の種々の手法が知られており、本実施形態ではこれらを広く適用可能である。 If No in step S104, the process returns to step S101 to continue the learning process based on the next learning data. If Yes in step S104, the learning process is terminated. The learning device transmits the generated learned model information to the processing system 100 . For example, the trained model is stored in a storage unit (not shown) included in the processing system 100 and read by the processing unit 120 . Various techniques such as batch learning and mini-batch learning are known in machine learning, and these can be widely applied in the present embodiment.
 なお、以上では機械学習が教師あり学習である例について説明した。ただし、本実施形態の手法はこれに限定されず、教師無し学習が行われてもよい。例えば、上述したように、ニューラルネットワークNN1の出力層のノード数をM個とした場合、教師無し学習では入力であるアプローチ角度情報と通電履歴情報から導出される特徴量の類似度合いに基づいて、多数の入力をM個のカテゴリに分類する分類処理が行われる。 In the above, an example in which the machine learning is supervised learning has been described. However, the method of the present embodiment is not limited to this, and unsupervised learning may be performed. For example, as described above, when the number of nodes in the output layer of the neural network NN1 is M, in unsupervised learning, classification processing is performed that classifies many inputs into M categories based on the degree of similarity of feature amounts derived from the input approach angle information and energization history information.
 学習装置は、M個のカテゴリの各カテゴリにランク付けを行う。例えば、熟練医のデータが多く含まれるカテゴリのランクが高く、修練医のデータが多く含まれるカテゴリのランクが低く判定される。各データが熟練医のデータであるか修練医のデータであるかは、前述の医者情報や患者情報等に基づいて、判定が可能である。ただし、詳細な処理については種々の変形実施が可能である。例えば、あらかじめ学習用のデータに対して、M段階のランク付けが行われており、学習装置は、各カテゴリに含まれるデータのランクの平均値や合計値等に基づいて、M個のカテゴリのランク付けを行ってもよい。教師無し学習を行う場合であっても、教師あり学習の例と同様に、入力に基づいて、ユーザのスキルをM段階で評価する学習済モデルを生成することが可能である。 The learning device ranks each of the M categories. For example, a category containing much data from skilled doctors is ranked high, and a category containing much data from trainee doctors is ranked low. Whether each piece of data comes from a skilled doctor or a trainee doctor can be determined based on the aforementioned doctor information, patient information, and the like. However, various modifications are possible in the detailed processing. For example, the learning data may be ranked in M levels in advance, and the learning device may rank the M categories based on the average value, total value, or the like of the ranks of the data included in each category. Even when unsupervised learning is performed, it is possible to generate a trained model that evaluates the user's skill in M levels based on the input, as in the supervised learning example.
 図18は、スキル評価情報を出力する処理についての処理例を説明するフローチャートである。先ず取得部110は、スキル評価の対象となるアプローチ角度情報を取得し(ステップS201)、通電履歴情報を取得する(ステップS202)。その後、処理部120は学習済モデルに基づく推論処理を行う(ステップS203)。図16に示した例であれば、処理部120は、アプローチ角度情報と通電履歴情報を学習済モデルに入力し、学習済みの重み付け係数に従った順方向の演算を行うことによって、M個の出力を取得する。処理部120は、当該出力に基づいて、ユーザのスキル評価情報を求める。例えば処理部120は、M個の出力のうち、最も値が大きいデータに基づいて、ユーザのスキルをM段階で評価する。言い換えれば、処理部120は、学習用アプローチ角度情報及び学習用通電履歴情報を、M(Mは2以上の整数)個のカテゴリに分類する機械学習を行うことによって取得された学習済モデルと、アプローチ角度情報と通電履歴情報とに基づいて、スキル評価を行う。上述したように、学習済モデルは、教師あり学習に基づいて生成されてもよいし、教師無し学習に基づいて生成されてもよい。 FIG. 18 is a flowchart illustrating a processing example for outputting skill evaluation information. First, the acquisition unit 110 acquires the approach angle information to be used for skill evaluation (step S201) and acquires the energization history information (step S202). The processing unit 120 then performs inference processing based on the trained model (step S203). In the example shown in FIG. 16, the processing unit 120 inputs the approach angle information and the energization history information to the trained model and obtains M outputs by performing forward calculations according to the learned weighting coefficients. The processing unit 120 obtains the user's skill evaluation information based on those outputs. For example, the processing unit 120 evaluates the user's skill in M levels based on the output with the largest value among the M outputs. In other words, the processing unit 120 performs skill evaluation based on the approach angle information and the energization history information, using a trained model obtained by performing machine learning that classifies approach angle information for learning and energization history information for learning into M (M is an integer of 2 or more) categories. As described above, the trained model may be generated based on supervised learning or unsupervised learning.
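Selecting the rank from the output with the largest value, as described for step S203, can be sketched as follows; the output values and function name are illustrative.

```python
def skill_rank(model_outputs):
    # model_outputs: the M outputs of the trained model, the likelihoods
    # of categories 1..M. The estimated rank is the category whose
    # output value is the largest (rank 1 = highest skill).
    best = max(range(len(model_outputs)), key=lambda i: model_outputs[i])
    return best + 1  # categories/ranks are 1-indexed

# Illustrative outputs for M = 3 categories:
rank_a = skill_rank([0.7, 0.2, 0.1])  # rank 1: highest skill
rank_b = skill_rank([0.1, 0.1, 0.8])  # rank 3: lowest skill
```

The resulting rank is what the output processing unit 130 would then format as skill evaluation information in step S204.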
 その後、出力処理部130は、スキル評価の結果であるスキル評価情報を出力する(ステップS204)。ここでのスキル評価情報とは、例えば、図7のマーキング段階の評価、局注段階の評価、切開段階の評価、剥離段階の評価、止血段階の評価及び総合評価の組み合わせからなるM通りの評価結果のいずれであるかを特定する情報である。 Thereafter, the output processing unit 130 outputs skill evaluation information, which is the result of the skill evaluation (step S204). The skill evaluation information here is, for example, information specifying which of M evaluation results applies, where the M results consist of combinations of the marking stage evaluation, local injection stage evaluation, incision stage evaluation, dissection stage evaluation, hemostasis stage evaluation, and overall evaluation shown in FIG. 7.
Since the skill evaluation information includes both the comprehensive evaluation and the evaluation for each stage of the treatment, the storage unit of the processing system 100 stores a separate trained model for each evaluation, and the processing of FIG. 18 is performed using the trained model corresponding to the skill evaluation in question. When the skill evaluation is limited to the above-described energization period, for example, a process is performed that extracts, from the created log information, data consisting of the energization period and the approach angle information corresponding to that period; the processing of FIG. 18 is then performed using that data and the trained model corresponding to the stage to which the energization period belongs. The same applies when the skill evaluation is limited to the period before the energization period. When the advice information shown in FIG. 8 is added to the skill evaluation, for example, fixed phrases corresponding to skill evaluation results are stored in the storage unit in advance, and the processing of FIG. 18 is extended with a process that selects the fixed phrase corresponding to the obtained skill evaluation result and a process that appends the selected phrase to that result. To display advice information indicating a difference from expert data, for example, the expert data is held as metadata, and processes are added that infer the difference based on that metadata and create suitable advice information from the inference result and the related fixed phrases. Note that obtaining expert data in advance requires collecting data through surgery performed by an expert on similar cases. Expert data acquired with one endoscope system 300 can also be transferred to another endoscope system 300.
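The fixed-phrase selection described above can be sketched as a simple lookup. This is only an illustrative sketch, not the disclosed implementation; the grade labels and the phrase texts are hypothetical placeholders, not values from the application.

```python
# Hypothetical sketch: map a skill-evaluation result to a pre-stored
# fixed phrase (advice text) and append it to the evaluation output.
# Grades and phrases below are illustrative placeholders.

ADVICE_PHRASES = {
    "A": "Approach angle was stable throughout the energization period.",
    "B": "Try to reduce approach-angle variation while the device is energized.",
    "C": "Large approach-angle changes during energization; review positioning.",
}

def attach_advice(evaluation: str) -> dict:
    """Return the evaluation result together with its stored fixed phrase."""
    return {
        "evaluation": evaluation,
        "advice": ADVICE_PHRASES.get(evaluation, "No advice registered."),
    }

result = attach_advice("B")
```

In an actual system the phrases would be read from the storage unit rather than hard-coded, but the selection step is the same dictionary-style lookup keyed on the evaluation result.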
In this way, both the approach angle information and the energization history information are taken into account and machine learning is used, so the operator's skill can be evaluated with high accuracy.
As described above, the processing unit 120 of the processing system 100 evaluates the operator's skill by operating according to the trained model. The computation in the processing unit 120 according to the trained model, that is, the computation for producing output data from input data, may be executed by software or by hardware. In other words, the sum-of-products operations and the like executed at each node in FIG. 16 may be executed in software. Alternatively, the computation may be executed by a circuit device such as an FPGA, or by a combination of software and hardware. The operation of the processing unit 120 in accordance with instructions from the trained model can thus be realized in various ways. For example, the trained model includes an inference algorithm and the weighting coefficients used in that algorithm, where the inference algorithm is an algorithm that performs the forward computation and the like based on the input data. In one configuration, both the inference algorithm and the weighting coefficients are stored in the storage unit, and the processing unit 120 performs the inference processing in software by reading them out. Alternatively, the inference algorithm may be implemented by an FPGA or the like while the storage unit stores the weighting coefficients, or an inference algorithm that includes the weighting coefficients may be implemented by an FPGA or the like; in the latter case, the storage unit that stores the information of the trained model is, for example, the built-in memory of the FPGA.
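The software path described above, reading out stored weighting coefficients and running the forward sum-of-products computation node by node, can be sketched in a few lines of NumPy. The layer sizes, the ReLU activation, and the random weights are illustrative assumptions; in practice the coefficients would be the learned ones read from the storage unit.

```python
import numpy as np

def forward(x, weights, biases):
    """Forward computation: at each node, a sum of products of the previous
    layer's outputs and the weighting coefficients, then an activation
    (ReLU here, an assumption), with a softmax over the output categories."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, W @ h + b)        # hidden layers
    logits = weights[-1] @ h + biases[-1]     # output layer
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
# Toy sizes: 6 input values derived from approach-angle and energization
# information, one hidden layer of 8 nodes, M = 4 output categories
# (matching the A-D evaluations of Fig. 7).
weights = [rng.normal(size=(8, 6)), rng.normal(size=(4, 8))]
biases = [np.zeros(8), np.zeros(4)]
probs = forward(rng.normal(size=6), weights, biases)
```

The output is a probability over the M categories; the evaluation with the highest probability would be reported as the skill evaluation result.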
4. N-Dimensional Feature Space
In the processing of FIG. 18, the processing unit 120 may obtain an N-dimensional feature (N being an integer of 2 or more) based on the approach angle information, the energization history information, and the trained model. For example, the learning device may perform machine learning that classifies a plurality of pieces of training approach angle information and training energization history information into M categories, in the same manner as the processing described above with reference to FIGS. 15 and 16.
The flow of processing in the processing system 100 is the same as in FIG. 18. First, the acquisition unit 110 acquires the approach angle information and the energization history information to be used for skill evaluation (steps S201 and S202). In step S203, as before, the processing unit 120 inputs the approach angle information and the energization history information to the trained model and performs the forward computation according to the learned weighting coefficients. In doing so, the processing unit 120 obtains the data in an intermediate layer as the N-dimensional feature. For example, if the neural network NN1 has first through Q-th intermediate layers, the values in a J-th intermediate layer having N nodes are taken as the N-dimensional feature, where Q is an integer of 2 or more and J is an integer from 1 to Q. For example, with J = Q, the intermediate layer closest to the output layer has N nodes, and the output of each node becomes one component of the feature. Alternatively, the N-dimensional feature may be obtained by combining the outputs of a plurality of intermediate layers.
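Capturing the J-th intermediate layer's outputs as the N-dimensional feature can be sketched as follows. The toy network sizes, the ReLU activation, and the random weights are assumptions for illustration only.

```python
import numpy as np

def forward_with_features(x, weights, biases, j):
    """Run the forward computation and also return the outputs of the
    j-th intermediate layer as the N-dimensional feature vector."""
    h = x
    hidden = []
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, W @ h + b)
        hidden.append(h)
    logits = weights[-1] @ h + biases[-1]
    return logits, hidden[j]

rng = np.random.default_rng(1)
# Two intermediate layers (Q = 2); the layer nearest the output (J = Q)
# has N = 2 nodes, matching the two-dimensional example of Fig. 19.
weights = [rng.normal(size=(8, 6)),
           rng.normal(size=(2, 8)),
           rng.normal(size=(4, 2))]
biases = [np.zeros(8), np.zeros(2), np.zeros(4)]
logits, feature = forward_with_features(rng.normal(size=6),
                                        weights, biases, j=-1)
```

Here `feature` is the point that gets plotted in the N-dimensional feature space of FIG. 19, while `logits` corresponds to the output layer used for classification.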
FIG. 19 shows an example of an N-dimensional feature space. The horizontal axis represents feature A1 among the N-dimensional features, and the vertical axis represents feature B1, which differs from A1. Here N = 2, but N may be 3 or more. By inputting the approach angle information and the energization history information, the values of the first through N-th features are obtained; that is, one set of approach angle information and energization history information is plotted as one point in the N-dimensional feature space. As shown in FIG. 19, the N-dimensional feature extracted by machine learning is a feature for classifying the input, consisting of the approach angle information and the energization history information, into M categories. The result of clustering based on distance in the N-dimensional feature space therefore yields categories representing the user's skill; in other words, the user's skill evaluation can be classified into M classes according to the position of the point given by the N-dimensional feature obtained from the input. For example, when the stage-by-stage skill evaluation shown in FIG. 7 and the like is performed, C11 in FIG. 19 represents the category of evaluation A, C12 represents the category of evaluation B, and C13 represents the category of evaluation C, and the total number of such categories is M. In the example of FIG. 7, the evaluations can be classified into A through D, so M = 4.
The processing unit 120 performs the skill evaluation based on the distance between the position, in the feature space, of the N-dimensional feature obtained by inputting the approach angle information and energization history information under evaluation into the trained model, and the centroid positions, in the feature space, of one or more of the M categories. The centroid position here is information obtained from the positions of the points included in each category, for example the average of their coordinate values, and the centroid of each category is known once learning is complete. In other words, the processing unit 120 obtains an N-dimensional feature (N being an integer of 2 or more) based on the approach angle information, the energization history information, and the trained model, and performs the skill evaluation based on the distances between the obtained N-dimensional feature and the centroids of the M categories. The distance here is, for example, the Euclidean distance, but another distance such as the Mahalanobis distance may be used.
For example, the processing unit 120 finds, among the first through M-th categories, the category whose centroid is closest to the N-dimensional feature obtained by the forward computation, and determines that the data under evaluation belongs to that category. In the example of FIG. 19, the processing unit 120 determines evaluation A when the distance to the centroid of C11 is smallest, evaluation B when the distance to the centroid of C12 is smallest, and evaluation C when the distance to the centroid of C13 is smallest.
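The nearest-centroid decision just described reduces to a few lines of NumPy. The centroid coordinates below are hypothetical placeholders for the C11, C12, and C13 clusters of FIG. 19, and Euclidean distance is used (the disclosure also allows, e.g., the Mahalanobis distance).

```python
import numpy as np

def nearest_category(feature, centroids, labels):
    """Return the evaluation whose category centroid is closest
    (Euclidean distance) to the N-dimensional feature point."""
    d = np.linalg.norm(centroids - feature, axis=1)
    return labels[int(np.argmin(d))], d

# Hypothetical centroids of categories C11 (evaluation A), C12 (B),
# and C13 (C) in the two-dimensional feature space of Fig. 19.
centroids = np.array([[1.0, 1.0],
                      [4.0, 4.0],
                      [1.0, 5.0]])
labels = ["A", "B", "C"]

grade, dists = nearest_category(np.array([1.2, 0.8]), centroids, labels)
```

A point near C11's centroid is thus judged evaluation A; replacing `np.linalg.norm` with a Mahalanobis distance would only change the distance function, not the decision rule.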
As described above, features A1 and B1 in FIG. 19 are parameters extracted based on the approach angle information and the energization history information, and are thus parameters distinct from the approach angle information and the energization history information themselves; nevertheless, nothing prevents feature A1 from corresponding to the approach angle information itself and feature B1 from corresponding to the energization history information itself. In other words, the processing unit 120 may perform the skill evaluation based on distance in a feature space defined by feature A1, a first feature corresponding to the approach angle information, and feature B1, a second feature corresponding to the energization history information. For example, C11 in FIG. 19 is a category with a short energization time and a small relative change in approach angle, and is therefore judged evaluation A as described above. C12 is a category with a long energization time and a large relative change in approach angle, and is therefore judged evaluation B. C13 has a shorter energization time than C12, but its relative change in approach angle is larger and its risk higher than C12, so it is judged evaluation C. In this way, the skill evaluation makes more appropriate use of the approach angle information and the energization history information, enabling a more accurate skill evaluation.
The processing after the user evaluation is the same as in FIG. 18: in step S204, the output processing unit 130 outputs the skill evaluation information, which is the result of the skill evaluation.
The above has described an example in which the intermediate-layer data obtained during clustering serves as the N-dimensional feature, but the method of this embodiment is not limited to this. For example, the N-dimensional feature may be extracted by performing principal component analysis on the input based on the approach angle information and the energization history information. Since methods for principal component analysis are well known, a detailed description is omitted; methods that perform principal component analysis using machine learning are also known, and machine learning is applicable in that case as well. The processing after extraction of the N-dimensional feature is the same as in the example above.
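The principal-component-analysis alternative mentioned above can be sketched with plain NumPy: center the data, take the eigenvectors of the covariance matrix, and project onto the top N of them. The input matrix here is random placeholder data standing in for approach-angle and energization measurements.

```python
import numpy as np

def pca_features(X, n):
    """Project each sample onto the top-n principal components,
    yielding an n-dimensional feature per sample."""
    Xc = X - X.mean(axis=0)                    # center the data
    cov = np.cov(Xc, rowvar=False)             # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues ascending
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n]]
    return Xc @ top

rng = np.random.default_rng(2)
# Rows: one treatment each; columns: placeholder measurements derived
# from approach-angle and energization-history information.
X = rng.normal(size=(20, 6))
features = pca_features(X, n=2)
```

Each row of `features` is then plotted and clustered in the N-dimensional feature space exactly as in the intermediate-layer example.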
When N-dimensional features are used, the skill evaluation method is not limited to the above. For example, the processing unit 120 may perform the skill evaluation based on the distance between the plotted point corresponding to the user under evaluation and the plotted point corresponding to a second user different from that user. The second user here is, for example, an instructor, and the user under evaluation is a user receiving guidance from that instructor. In this way, an index representing how close the evaluated user's skill is to the instructor's skill can be output as the skill evaluation information.
In treatment using an endoscope, multiple approaches are possible even for the same lesion at the same site. Which approach is considered suitable depends on the user, so different instructors may regard different treatment content as correct; in other words, multiple expert physicians form different schools with respect to a given procedure. In this respect, using information representing the similarity to a specific user as the skill evaluation information, as described above, makes it possible to evaluate the target user's skill appropriately. For example, the skill of a user belonging to a given school is judged against an expert physician of the same school.
5. Modification
As described above, according to the method of this embodiment, skill evaluation information, the result of evaluating the skill of the user operating the endoscope, is output based on the approach angle information and the energization history information. However, the output information is not limited to this; notification information concerning the relative change in the approach angle may be output instead. A modification is described below concerning a processing system 2100 that outputs notification information on the relative change in approach angle. Although this modification is described mainly in terms of the individual treatments included in ESD, its method can be extended to other treatments performed on a living body.
Conventionally, when performing treatment on a living body, methods are known that support a diagnostician or the like in grasping the structure of a luminal structure by estimating the relative positional relationship between the endoscope tip and the luminal structure. However, when a specific treatment such as resection of a lesion is considered, it has become clear that what matters is not the angle at each individual timing but whether the change in the endoscope's angle with respect to the above-described tissue to be treated remains within a safe range.
For example, in the incision and dissection steps of ESD, tissue is cut while a high-frequency current is supplied to the high-frequency device, so if the approach angle described above is not appropriate, tissue may be resected excessively or bleeding may increase; that is, the approach angle during treatment is directly linked to safety. However, while an incision, dissection, or the like is being performed, the image captured by the imaging system of the endoscope is a close-up image, making it difficult for the operator to track from the captured image what angle the endoscope tip has assumed. In particular, in the dissection step of ESD and the like, the treatment is performed by burrowing under the tissue, so even if the operator views the captured image, it is not easy to estimate the position and orientation of the endoscope. This is illustrated below with FIGS. 20(A) through 20(F), which show the positional relationship between the distal end portion 2011 of the insertion section 2310b and the tissue to be treated, together with the captured image at each point.
OB21 in FIGS. 20(A) to 20(F) is a lesion, and the tissue to be treated is OB21 or its surrounding tissue. In FIGS. 20(D) and 20(F), the line indicated by E21 is the boundary between the submucosal layer and the muscle layer. For example, FIG. 20(A) shows a state in which the treatment instrument 2360 protrudes from the distal end portion 2011 of the insertion section 2310b and the approach to the tissue to be treated has begun, and FIG. 20(B) shows the captured image in the state of FIG. 20(A). In the example of FIG. 20(A), the treatment instrument 2360 has not come close enough to contact the tissue to be treated, so compared with the mid-treatment states described later with reference to FIGS. 20(C) to 20(F), it is easy for the operator to estimate the relative relationship between the insertion section 2310b and the tissue to be treated from the captured image.
FIG. 20(C) shows an incision in progress, and FIG. 20(D) shows the captured image during the incision. As shown in FIG. 20(C), the distal end portion 2011 has slipped under the tissue lifted by the incision. Consequently, as shown in FIG. 20(D), that tissue covers the entire captured image, and it is difficult for the operator to estimate the approach angle from the image.
FIG. 20(E) shows a state in which the approach angle has changed greatly during the incision, reaching a dangerous angle at which bleeding increases, and FIG. 20(F) shows the captured image at that time. As shown in FIG. 20(E), the distal end portion 2011 remains tucked under the lifted tissue. As shown in FIG. 20(F), the captured image has changed little from the state of FIG. 20(D), and it is not easy for the operator to notice from the image that the approach angle has changed. For example, if the treatment instrument 2360 penetrates deep into the tissue and the amount of bleeding increases, the operator can recognize the dangerous state from the captured image, but predicting the danger before the bleeding occurs is difficult.
Furthermore, as described later with reference to FIG. 22, when the endoscope is a flexible endoscope having a flexible section 2013, the span from the tip of the insertion section to the hand-held operation section is flexible and long, so tactile sensation and applied force are hardly conveyed to the operator.
In practice, then, with none of image, tactile sensation, or force providing sufficient information, the operator, an endoscopist, performs the procedure based on personal rules of thumb; for example, the operator carries out the treatment while mentally supplementing the actual position and orientation of the endoscope relative to the lesion. As a result, even expert physicians themselves cannot articulate well when and how the endoscope should be operated; in other words, the transition of the approach angle during treatment has conventionally been tacit knowledge. It is therefore desirable to sense danger at the stage before bleeding occurs, but such a response is difficult for any but a highly experienced operator.
The larger the angle of the distal end portion 2011 with respect to the tissue to be treated, the more easily the treatment instrument 2360 penetrates deep into the tissue, which tends to be dangerous. However, depending on the type of organ and the position and shape of the lesion within it, a shallow-angle approach may be difficult in the first place. Simply acquiring the angle of the endoscope with respect to the tissue is therefore not enough to support the operator appropriately.
Moreover, if the approach angle changes without the operator noticing, in other words, if the actual approach angle diverges from the approach angle the operator imagines, the treatment instrument 2360 will contact the tissue in an unintended manner, which is extremely dangerous. Even if the approach angle itself is small, the risk is therefore high when the operator does not notice a change in it. Conversely, even if the approach angle is large, the risk is relatively small if the operator is aware of it, since the operator can then be expected to take that into account in moving the distal end portion 2011.
As described above, acquiring a single approach angle at a given timing is not sufficient to support the operator, yet no processing system has been proposed that supports the operator based on the relative change in approach angle. This modification therefore describes a processing system 2100 that sets a reference for the approach angle and outputs the change in the approach angle with respect to that reference.
FIG. 21 shows a configuration example of the processing system 2100 according to this modification. The processing system 2100 includes a processing unit 2110 that obtains the approach angle of the insertion section 2310b of the endoscope system 2300, and an output processing unit 2120. The processing unit 2110 sets, as a reference angle, the approach angle at the timing at which it determines that treatment of the living body using the endoscope system 2300 has started, and obtains the relative change of the approach angle with respect to that reference angle. The output processing unit 2120 then performs output processing of notification information concerning the relative change. The configuration of the processing system 2100 is not limited to FIG. 21, however, and various modifications such as adding other components are possible.
Whereas the conventional methods simply acquire the angle of the endoscope, this modification obtains the relative change in the approach angle with the start of treatment as the reference. The relative change is, for example, numerical data representing the change in angle. By outputting notification information concerning the relative change in the approach angle, it becomes possible to appropriately support the operator's treatment, to evaluate the user's skill in the treatment, and so on.
For example, as described later with reference to FIGS. 27(A) and 27(B), displaying the relative change in the approach angle in real time can support the operator's treatment. The operator is assumed to perform positioning while viewing the tissue to be treated from above and, after protruding the treatment instrument 2360 from the distal end portion 2011 as shown in FIG. 20(A), to begin the approach to the tissue. Positioning refers to the step of determining the position and orientation of the insertion section 2310b for the approach. At the positioning stage, the distance between the distal end portion 2011 and the tissue to be treated is larger than after the approach begins, so even a trainee physician can easily achieve the desired approach angle at the start of treatment, that is, when the treatment instrument 2360 is protruded and the approach begins. By issuing notifications that suppress the relative change in the approach angle from the start of treatment, large fluctuations of the approach angle during treatment can be suppressed even when the operator is a trainee. Since the method of this modification can predict danger before bleeding occurs, it can support the operator in preventing problems such as bleeding before they arise.
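The core computation of this modification, fixing the approach angle at treatment start as the reference and tracking the relative change for notification, might be sketched as follows. The sampling scheme and the warning threshold are illustrative assumptions, not values specified in the disclosure.

```python
def relative_angle_changes(angles, warn_threshold=15.0):
    """Given approach angles (degrees) sampled over a treatment, take the
    first sample (treatment start) as the reference angle, and return the
    relative change at each timing plus a warning flag for changes whose
    magnitude exceeds the (hypothetical) threshold."""
    reference = angles[0]                      # basis: angle at treatment start
    changes = [a - reference for a in angles]
    warnings = [abs(c) > warn_threshold for c in changes]
    return changes, warnings

# Placeholder angle samples after treatment start (degrees).
changes, warnings = relative_angle_changes([30.0, 32.0, 41.0, 48.0])
```

In a real system the `changes` sequence would feed the output processing unit 2120 for display in real time, and a `True` warning flag would trigger the notification that the approach angle has drifted from its reference.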
The processing unit 2110 of the processing system 2100 is configured by the following hardware. The hardware can include at least one of a circuit that processes digital signals and a circuit that processes analog signals; for example, it can consist of one or more circuit devices mounted on a circuit board, or one or more circuit elements. The one or more circuit devices are, for example, ICs or FPGAs, and the one or more circuit elements are, for example, resistors or capacitors.
The processing unit 2110 may also be realized by the following processor. The processing system 2100 includes a memory, not shown, that stores information, and a processor that operates based on the information stored in the memory; the information is, for example, programs and various data. The processor includes hardware, and various processors such as a CPU, GPU, or DSP can be used. The memory may be a semiconductor memory such as SRAM or DRAM, a register, a magnetic storage device such as an HDD, or an optical storage device such as an optical disk device. For example, the memory stores computer-readable instructions, and the functions of the processing unit 2110 are realized as processes when the processor executes those instructions. An instruction here may be an instruction of the instruction set constituting a program, or an instruction that directs the operation of the processor's hardware circuit. Furthermore, all or part of the processing unit 2110 can be realized by cloud computing, and each process described later can be performed on cloud computing.
The processing unit 2110 may also be realized as a module of a program running on a processor; for example, the processing unit 2110 is realized as a processing module that obtains the relative change in the approach angle of the insertion section 2310b.
A program that implements the processing performed by the processing unit 2110 can be stored in an information storage device, which is a computer-readable medium. The information storage device can be realized by, for example, an optical disk, a memory card, an HDD, or a semiconductor memory such as a ROM. The processing unit 2110 performs various processes based on the program stored in the information storage device; that is, the information storage device stores a program for causing a computer to function as the processing unit 2110. A computer is a device comprising an input device, a processing unit, a storage unit, and an output unit. Specifically, the program according to this modification is a program for causing a computer to execute the steps described later with reference to FIGS. 26 and 32.
The output processing unit 2120 performs processing for outputting notification information regarding the relative change in the approach angle. The output processing is, for example, processing that displays a screen containing the notification information on a display unit. The display unit here is, for example, the display unit 2340 of the endoscope system 2300 described later with reference to FIGS. 22 and 23. Alternatively, the display unit may be included in the processing system 2100, or in another device capable of communicating with the processing system 2100. The output processing unit 2120 may be a display interface that directly controls the display unit, or it may be a communication interface for transmitting the display image itself, or information for generating the display image, to a device having a display unit. The output processing unit 2120 may also include a control processor, a control circuit, or the like that controls the display interface or the communication interface.
The output processing unit 2120 may also produce output using sound or light. For example, the output processing unit 2120 may be implemented by a light-emitting unit such as an LED, a speaker, or the like, and may include a control circuit or the like for controlling them.
Note that the processing system 2100 may be included in the endoscope system 2300 described later with reference to FIGS. 22 and 23. For example, the processing system 2100 is included in the processing device 2330 of the endoscope system 2300. Alternatively, the processing system 2100 may be provided as a device separate from the endoscope system 2300. For example, the processing system 2100 may be realized by a PC connected to the processing device 2330, or by a server system connected via a network. The network here may be a private network or a public communication network such as the Internet, and may be either wired or wireless. The processing system 2100 may also be realized by distributed processing across a plurality of devices; for example, it may be realized by two or more of the processing device 2330, a PC, and a server system.
FIG. 22 is a diagram showing the configuration of the endoscope system 2300. The endoscope system 2300 includes a scope section 2310, a processing device 2330, a display unit 2340, and a light source device 2350. An operator performs an endoscopic examination of a patient using the endoscope system 2300. However, the configuration of the endoscope system 2300 is not limited to that shown in FIG. 22, and various modifications, such as omitting some components or adding other components, are possible. The endoscope system 2300 is, for example, a flexible endoscope used for diagnosis of digestive organs.
FIG. 22 shows an example in which the processing device 2330 is a single device connected to the scope section 2310 via the connector 2310d, but the configuration is not limited to this. For example, part or all of the processing device 2330 may be implemented by another information processing device, such as a PC or a server system, connectable via a network. For example, the processing device 2330 may be realized by cloud computing.
The scope section 2310 has an operation section 2310a, a flexible insertion section 2310b, and a universal cable 2310c containing signal lines and the like. The scope section 2310 is a tubular insertion device that inserts the tubular insertion section 2310b into a body cavity. A connector 2310d is provided at the end of the universal cable 2310c, and the scope section 2310 is detachably connected to the light source device 2350 and the processing device 2330 by the connector 2310d. Furthermore, as will be described later with reference to FIG. 23, a light guide 2315 runs through the universal cable 2310c, and the scope section 2310 emits illumination light from the light source device 2350 through the light guide 2315 out of the distal end of the insertion section 2310b.
For example, the insertion section 2310b has, from its distal end toward its proximal end, a distal end portion 2011, a bendable bending portion 2012, and a flexible portion 2013. The insertion section 2310b is inserted into the subject. The distal end portion 2011 of the insertion section 2310b is the distal end of the scope section 2310 and is a rigid distal-end portion. The objective optical system 2311 and the image sensor 2312, described later, are provided, for example, in the distal end portion 2011.
The bending portion 2012 can bend in a desired direction in response to operation of a bending operation member provided on the operation section 2310a. The bending operation member includes, for example, a left-right bending operation knob and an up-down bending operation knob. In addition to the bending operation member, the operation section 2310a may be provided with various operation buttons such as a release button and an air/water supply button.
The processing device 2330 is a video processor that performs predetermined image processing on the received imaging signal to generate a captured image. A video signal of the generated captured image is output from the processing device 2330 to the display unit 2340, and the live captured image is displayed on the display unit 2340. The configuration of the processing device 2330 will be described later. The display unit 2340 is, for example, a liquid crystal display or an EL display.
The light source device 2350 is a light source device capable of emitting white light for a normal observation mode. The light source device 2350 may also be capable of selectively emitting white light for the normal observation mode and special light such as narrow-band light.
FIG. 23 is a diagram explaining the configuration of each part of the endoscope system 2300. In FIG. 23, part of the configuration of the scope section 2310 is omitted or simplified.
The light source device 2350 includes a light source 2352 that emits illumination light. The light source 2352 may be a xenon light source, an LED, or a laser light source. The light source 2352 may also be another type of light source, and the light emission method is not limited.
The insertion section 2310b includes an objective optical system 2311, an image sensor 2312, an illumination lens 2314, and a light guide 2315. The light guide 2315 guides illumination light from the light source 2352 to the distal end of the insertion section 2310b. The illumination lens 2314 irradiates the subject with the illumination light guided by the light guide 2315. The objective optical system 2311 forms an image of the subject from light reflected by the subject.
The image sensor 2312 receives light from the subject via the objective optical system 2311. The image sensor 2312 may be a monochrome sensor or an element provided with color filters. The color filters may be the widely known Bayer filters, complementary-color filters, or other filters. Complementary-color filters are filters that include cyan, magenta, and yellow color filters.
The processing device 2330 performs image processing and control of the entire system. The processing device 2330 includes a preprocessing unit 2331, a control unit 2332, a storage unit 2333, a detection processing unit 2335, and a postprocessing unit 2336.
The preprocessing unit 2331 performs A/D conversion, which converts the analog signals sequentially output from the image sensor 2312 into a digital image, and various correction processes on the image data after A/D conversion. Note that an A/D conversion circuit may be provided in the image sensor 2312, in which case the A/D conversion in the preprocessing unit 2331 is omitted. The correction processes here include, for example, color matrix correction, structure enhancement, noise reduction, and AGC. The preprocessing unit 2331 may also perform other correction processes such as white balance processing. The preprocessing unit 2331 outputs the processed image to the detection processing unit 2335 as an input image, and also outputs the processed image to the postprocessing unit 2336 as a display image.
The detection processing unit 2335 performs detection processing to detect a region of interest, such as a lesion, from the input image. However, in this modification the detection of a region of interest is not essential, and the detection processing unit 2335 can be omitted.
The postprocessing unit 2336 performs postprocessing based on the outputs of the preprocessing unit 2331 and the detection processing unit 2335, and outputs the postprocessed image to the display unit 2340. For example, the postprocessing unit 2336 may add the detection result from the detection processing unit 2335 to the display image and display the resulting image.
The control unit 2332 is connected to the image sensor 2312, the preprocessing unit 2331, the detection processing unit 2335, the postprocessing unit 2336, and the light source 2352, and controls each of them.
For example, when the processing system 2100 is included in the processing device 2330, the processing unit 2110 and the output processing unit 2120 are added to the configuration of FIG. 23. The processing unit 2110 obtains the relative change in the approach angle using a method described later. The output processing unit 2120 displays notification information regarding the relative change. The display unit 2340 displays, for example, a display screen that includes the display image output from the postprocessing unit 2336 and the notification information output from the output processing unit 2120. The output processing unit 2120 may also be realized by the postprocessing unit 2336.
FIG. 24 is a diagram explaining the configuration of the distal end portion 2011 of the insertion section 2310b. As shown in FIG. 24, the cross-sectional shape of the distal end portion 2011 is substantially circular, and the objective optical system 2311 and the illumination lens 2314 are provided as described above with reference to FIG. 23. The insertion section 2310b is also provided with a channel, a cavity running from the operation section 2310a to an opening 2316 in the distal end portion 2011. The opening 2316 here is an opening for a treatment tool 2360, a so-called forceps port. However, the opening 2316 may also be used for air supply or suction.
As shown in FIG. 24, the operator inserts the treatment tool 2360 through the channel and protrudes the distal end of the treatment tool 2360 from the opening 2316 to perform treatment on the treatment target tissue. Although FIG. 24 illustrates a distal end portion 2011 having two illumination lenses 2314, one objective optical system 2311, and one opening 2316, the specific configuration can be modified in various ways.
The processing performed by the processing system 2100 may also be implemented as an information processing method. The information processing method is a method relating to treatment using an endoscope: the approach angle of the insertion section 2310b of the endoscope, at the timing when it is determined that treatment of the living body using the endoscope has started, is set as a reference angle; at a timing after the timing at which the treatment is determined to have started, the relative change of the approach angle with respect to the reference angle is obtained; and notification information regarding the relative change is output.
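The baseline-and-relative-change flow described above can be sketched in code as follows. This is an illustrative sketch, not the claimed implementation; the class name, method names, and sample angle values are all hypothetical.

```python
class ApproachAngleMonitor:
    """Latches a reference (baseline) approach angle when treatment is
    judged to have started, then reports later samples relative to it."""

    def __init__(self):
        self.baseline_deg = None  # reference angle, set at treatment start

    def on_treatment_start(self, angle_deg):
        # Set the approach angle at treatment start as the reference angle.
        self.baseline_deg = angle_deg

    def relative_change(self, angle_deg):
        # Relative change of the current approach angle vs. the baseline.
        if self.baseline_deg is None:
            raise RuntimeError("treatment start not yet detected")
        return angle_deg - self.baseline_deg


monitor = ApproachAngleMonitor()
monitor.on_treatment_start(35.0)      # baseline latched at treatment start
print(monitor.relative_change(42.5))  # -> 7.5
```

The notification step would then compare this relative change against some threshold before alerting the operator.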
The approach angle is explained in detail below. FIG. 25(A) is a diagram illustrating an example of the approach angle in this modification. As described above, the approach angle is information representing the relative angle between the treatment target tissue and the distal end portion 2011 of the insertion section 2310b.
As shown in FIG. 25(A), the approach angle is the angle formed between the straight line L21, which is the axis of the insertion section 2310b, and a plane P21 on the treatment target tissue. The axis of the insertion section 2310b is an axis representing the longitudinal direction of the insertion section 2310b, for example a straight line passing through the center of the substantially cylindrical insertion section 2310b, or a straight line parallel (including substantially parallel) to it. This modification assumes a scene in which the treatment tool 2360 is protruded from the opening 2316 provided in the distal end portion 2011 of the insertion section 2310b, and ESD or the like is performed with the treatment tool 2360. Therefore, the axis of the insertion section 2310b is, specifically, the axis at the distal end portion 2011 of the insertion section 2310b.
Specifically, as indicated by θ in FIG. 25(A), the approach angle is one interior angle of the triangle defined by the perpendicular dropped from the distal end portion 2011 to the plane P21, the straight line L21, and the plane P21. In other words, the approach angle here is information representing the depth of the approach, i.e., whether the axis of the insertion section 2310b lies flat or stands upright with respect to the plane P21 representing the treatment target tissue. The approach angle is, for example, a value between 0 and 90 degrees. Under this definition, even if the axis rotates about the normal to the plane P21, the magnitude of the approach angle does not change; that is, the approach angle need not include information specifying the direction of the approach. However, the approach angle is not limited to this definition, and its range of values is not limited to 0 to 90 degrees.
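Under this definition, the approach angle can be computed from a direction vector of the axis (line L21) and a normal vector of the plane P21, since the angle between a line and a plane is the complement of the angle between the line and the plane's normal. A minimal sketch, assuming both vectors are already expressed in a common coordinate system (the function name is hypothetical):

```python
import math

def approach_angle_deg(axis_dir, plane_normal):
    """Angle in [0, 90] degrees between a line with direction axis_dir
    (line L21) and a plane with normal plane_normal (plane P21):
    sin(theta) = |axis . normal| / (|axis| * |normal|)."""
    dot = sum(a * n for a, n in zip(axis_dir, plane_normal))
    norm_a = math.sqrt(sum(a * a for a in axis_dir))
    norm_n = math.sqrt(sum(n * n for n in plane_normal))
    s = min(1.0, abs(dot) / (norm_a * norm_n))  # clamp rounding error
    return math.degrees(math.asin(s))

# Axis parallel to the tissue plane -> 0 degrees ("lying flat");
# axis along the normal -> 90 degrees ("standing upright").
print(approach_angle_deg((1, 0, 0), (0, 0, 1)))  # -> 0.0
print(approach_angle_deg((0, 0, 1), (0, 0, 1)))  # -> 90.0
```

Because the axis direction enters only through |axis · normal|, rotating the axis about the plane's normal leaves the result unchanged, matching the direction-agnostic definition above.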
Various configurations are conceivable for obtaining the approach angle. For example, the endoscope system 2300 includes a motion sensor provided in the distal end portion 2011 of the insertion section 2310b. The motion sensor is, for example, a six-axis sensor including a three-axis acceleration sensor and a three-axis angular velocity sensor. For example, when the three axes of a given sensor coordinate system are the X, Y, and Z axes, the acceleration sensor detects translational acceleration along each of the X, Y, and Z axes, and the angular velocity sensor detects angular velocity about each of those axes.
By using the motion sensor, the position and orientation of the distal end portion 2011 can be obtained. For example, the displacement and rotation of the distal end portion 2011 are obtained by integrating the outputs of the acceleration sensor and the angular velocity sensor. Note that in order to determine position and orientation from a motion sensor, which is an inertial sensor, a given reference position must be set as a boundary condition. For example, when a reference position and orientation are defined in a given reference coordinate system fixed in three-dimensional space, the processing unit 2110 obtains the position and orientation of the distal end portion 2011 at each timing by accumulating, relative to that reference, the displacement and rotation obtained from the sensor output. This makes it possible to express the direction of the straight line L21 at each timing in a given coordinate system.
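Dead reckoning from the angular-velocity output can be illustrated in a single axis: each sample, multiplied by the sampling interval, is accumulated onto a reference orientation. This is a heavily simplified sketch (a real six-axis implementation would integrate a quaternion or rotation matrix and fuse the acceleration output); the function name and values are hypothetical.

```python
def integrate_yaw(gyro_samples, dt, yaw0=0.0):
    """One-axis dead reckoning: accumulate angular-velocity samples
    (rad/s) over sampling interval dt onto a reference orientation yaw0.
    Illustrates how rotation is accumulated relative to a reference
    orientation (the boundary condition mentioned above)."""
    yaw = yaw0
    for w in gyro_samples:
        yaw += w * dt  # rectangle-rule integration of angular velocity
    return yaw

# 100 samples of 0.1 rad/s, 10 ms apart -> about 0.1 rad of rotation
print(integrate_yaw([0.1] * 100, 0.01))
```

Integration drift accumulates over time, which is one reason the plane P21 may be measured once and reused, as described below.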
The plane P21 can be estimated, for example, by obtaining the distance to the treatment target tissue based on captured images. For example, the endoscope system 2300 may include a plurality of imaging systems in the distal end portion 2011. The processing unit 2110 obtains the distance to the subject captured in the image by performing stereo matching on parallax images captured by the imaging systems at different positions. Since stereo matching is a well-known technique, a detailed description is omitted. In this way, the three-dimensional shape of the subject can be estimated. For example, since the processing unit 2110 can determine the coordinates of each point of the subject in the camera coordinate system, the plane P21 containing the treatment target tissue can be obtained using that coordinate system. Because the relationship between the orientation of the distal end portion 2011 and the camera coordinate system is known by design, the relationship between the orientation of the distal end portion 2011 and the plane P21 can be obtained. For example, the position and orientation of the distal end portion 2011 and the three-dimensional shape of the subject can also be expressed in a common reference coordinate system. In other words, by acquiring the output of the motion sensor and the captured images constituting the parallax images, the processing unit 2110 can express the straight line L21 and the plane P21 in an arbitrary coordinate system, and can compute the approach angle, which is the angle between the straight line L21 and the plane P21.
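The depth recovered by stereo matching follows the standard relation z = f·B/d, where f is the focal length in pixels, B is the baseline between the two imaging systems, and d is the disparity. A minimal sketch with hypothetical parameter values (the document does not specify the imaging geometry):

```python
def disparity_to_depth(disparity_px, focal_px, baseline_mm):
    """Depth from stereo disparity: z = f * B / d.
    focal_px: focal length in pixels; baseline_mm: distance between the
    two imaging systems; disparity_px: pixel shift between the views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# f = 500 px, B = 2 mm, d = 10 px -> a surface point 100 mm away
print(disparity_to_depth(10, 500, 2.0))  # -> 100.0
```

Applying this per matched pixel yields the 3-D point cloud from which the plane P21 can be fitted.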
The plane P21 may be a plane orthogonal to the normal vector at a given point of the treatment target tissue, or a plane approximating the surface shape of the treatment target tissue. Moreover, the approach angle in this modification may be any information representing the relationship between the distal end portion 2011 of the insertion section 2310b and the treatment target tissue, and the approach angle may be obtained based on the above normal vector, or similar information, instead of the plane P21.
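A plane such as P21 can be determined from measured surface points; the minimal case uses three non-collinear points, whose two edge vectors give the normal by a cross product. A sketch under that assumption (a real system would least-squares-fit many depth samples; the function name is hypothetical):

```python
def plane_from_points(p0, p1, p2):
    """Plane through three non-collinear 3-D points, returned as
    (normal, d) for the plane equation normal . x = d."""
    u = tuple(b - a for a, b in zip(p0, p1))  # edge vector p0 -> p1
    v = tuple(b - a for a, b in zip(p0, p2))  # edge vector p0 -> p2
    normal = (u[1] * v[2] - u[2] * v[1],      # cross product u x v
              u[2] * v[0] - u[0] * v[2],
              u[0] * v[1] - u[1] * v[0])
    d = sum(n * c for n, c in zip(normal, p0))
    return normal, d

# Three points on the plane z = 5 -> normal along z, d = 5
print(plane_from_points((0, 0, 5), (1, 0, 5), (0, 1, 5)))  # -> ((0, 0, 1), 5)
```

The returned normal is exactly the normal-vector form of the plane mentioned above, so either representation can feed the approach-angle computation.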
When obtaining the approach angle as a time series, the processing unit 2110 may, at each timing, both compute the position and orientation of the distal end portion 2011 based on the motion sensor and estimate the subject shape based on the parallax images. However, given that it is not easy to obtain the subject shape from images such as those in FIGS. 20(D) and 20(F), the processing unit 2110 may continue to use, at later timings, the information on the plane P21 acquired at a given timing. For example, as shown in FIGS. 20(A) and 20(B), the processing unit 2110 obtains the plane P21 at a position from which the subject can be viewed relatively broadly, and continues to use the information on that plane P21 during the treatment. In this way, the information on the plane P21 can be appropriately acquired even when it is difficult to obtain information on the treatment target tissue from the image. After acquiring the information on the plane P21, the processing unit 2110 obtains the straight line L21 based on the motion sensor at each timing, and computes the approach angle using the obtained straight line L21 and the already-acquired plane P21.
The method of obtaining the position and orientation of the distal end portion 2011 of the insertion section 2310b is not limited to one using a motion sensor. For example, the endoscope system 2300 may include a magnetic sensor provided in the distal end portion 2011. The magnetic sensor includes, for example, two cylindrical coils whose central axes are orthogonal to each other. The endoscope system 2300 also includes a magnetic field generator (not shown) as a peripheral device. The magnetic sensor detects the position and orientation of the distal end portion 2011 by detecting the magnetic field generated by the magnetic field generator.
The measurement method on the treatment-target-tissue side is also not limited to one using parallax images. For example, the processing unit 2110 may measure the treatment-target-tissue side by measuring the distance to the subject using a TOF method or a structured-light method. The TOF method measures the time it takes for a reflected light wave to reach the image sensor. The structured-light method projects a plurality of light patterns onto the subject and obtains the distance from how each pattern appears. For example, the phase-shift method, which obtains a phase shift by projecting a pattern whose brightness varies sinusoidally, is known. Since these techniques for estimating the three-dimensional shape of a subject are well known, a detailed description is omitted.
The processing unit 2110 may also calculate the position of the treatment target tissue in three-dimensional space by associating a plurality of feature points across a plurality of different captured images. The positions of the feature points can be calculated from the image information using techniques such as SLAM or SfM. For example, the processing unit 2110 obtains the information on the treatment target tissue by applying bundle adjustment, which optimizes the intrinsic parameters, the extrinsic parameters, and the world-coordinate point cloud from the images using a nonlinear least-squares method. The processing unit 2110 then applies a perspective projection transformation to the world coordinates of the extracted feature points using the estimated parameters, and obtains the parameters and world-coordinate point cloud that minimize the reprojection error. Since techniques such as SfM are well known, a more detailed description is omitted. Note that these techniques can estimate not only the three-dimensional position of the subject but also the position and orientation of the camera, so a technique such as SfM may also be used to estimate the position and orientation of the distal end portion 2011.
The measurement on the treatment-target-tissue side is also not limited to one using images captured by the endoscope system 2300. For example, CT images or MRI images of the patient may be acquired before treatment with the endoscope system 2300. By using CT and MRI images, the shape of the area around the treatment target tissue, and of the organs along the way to it, can be estimated. However, the estimated organ shape is the shape at the time the MRI or CT image was captured, and the organ shape during treatment changes due to various factors. The processing unit 2110 therefore acquires information related to those factors and corrects the organ shape based on that information, thereby estimating the organ shape while the insertion section 2310b is inserted. Based on the corrected organ shape, the processing unit 2110 computes information on the treatment target tissue, for example the plane P21 described above.
The information related to factors that change the organ shape is, for example, information such as the patient's body position, the air pressure inside the lumen, the direction of gravity, and the insertion shape of the insertion section 2310b. The patient's body position may be obtained from the drive amount of a movable bed, or input by the user. The air pressure inside the lumen may be estimated from the air supply and suction amounts, or acquired by providing an air pressure sensor in the insertion section 2310b. The direction of gravity can be detected with a motion sensor. The insertion shape of the insertion section 2310b may be detected by providing motion sensors or magnetic sensors at a plurality of locations along the insertion section 2310b, or estimated based on the history of advance/retract and bending operations of the insertion section 2310b.
As shown in FIG. 25(B), the approach angle may also be the angle formed between the straight line L21, which is the axis of the insertion section 2310b, and a straight line L22 representing the direction of gravity. When the direction of gravity is used, the approach angle is not directly the angle between the treatment target tissue and the distal end portion 2011. However, in ESD of the stomach, for example, the patient's body position is fixed to some extent; the left lateral decubitus position, for example, is used. Therefore, if the position of the treatment target tissue within the organ is known, the relationship between the plane P21 and the direction of gravity is known. Even when the body position is changed, the relationship between the plane P21 and the direction of gravity remains known as long as information representing the body position can be acquired each time.
 よって処理部2110は、軸線と重力方向に基づいて、アプローチ角度を求めてもよい。重力方向は、例えば上述したモーションセンサを用いて求めることが可能である。なお処理部2110は、求めたアプローチ角度をそのまま用いてもよいし、重力方向と平面P21との関係に基づいて、図25(A)と同様に直線L21と平面P21の角度を演算してもよい。 Therefore, the processing unit 2110 may obtain the approach angle based on the axis and the direction of gravity. The direction of gravity can be obtained using, for example, the motion sensor described above. Note that the processing unit 2110 may use the obtained approach angle as it is, or may calculate the angle between the straight line L21 and the plane P21 based on the relationship between the direction of gravity and the plane P21, in the same manner as in FIG. 25(A).
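The angle computations described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the vector inputs (a direction vector for the axis line L21, a normal vector for the plane P21, and a direction vector for gravity L22) are assumed representations of the sensor-derived quantities.

```python
import numpy as np

def approach_angle_to_plane(axis_dir, plane_normal):
    """Angle (degrees) between a line direction and a plane.

    The angle between a line and a plane is the complement of the
    angle between the line and the plane's normal vector.
    """
    axis_dir = np.asarray(axis_dir, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    cos_to_normal = abs(axis_dir @ plane_normal) / (
        np.linalg.norm(axis_dir) * np.linalg.norm(plane_normal))
    return 90.0 - np.degrees(np.arccos(np.clip(cos_to_normal, -1.0, 1.0)))

def angle_to_gravity(axis_dir, gravity_dir):
    """Angle (degrees) between the axis line L21 and the gravity line L22."""
    axis_dir = np.asarray(axis_dir, dtype=float)
    gravity_dir = np.asarray(gravity_dir, dtype=float)
    cos_a = (axis_dir @ gravity_dir) / (
        np.linalg.norm(axis_dir) * np.linalg.norm(gravity_dir))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

For example, an axis parallel to the plane normal yields a 90-degree approach angle, and an axis lying in the plane yields 0 degrees.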
 また、以上ではアプローチ角度の演算に挿入部2310bの軸線を用いる例を説明したが、この変形例の手法はこれに限定されない。例えば、軸線に代えて、対物光学系2311及び撮像素子2312からなる撮像系の画角に関する情報が用いられてもよい。例えば、画角に関する情報とは、撮像系による撮像範囲を表す領域であって、撮像系の基準点を頂点とする錐体である。ここでの錐体は例えば四角錐であり、底面である四角形が画角に対応する。或いは、画角に関する情報とは、水平画角、垂直画角、対角画角等の数値を表す情報であってもよい。 Also, although the example in which the axis of the insertion portion 2310b is used to calculate the approach angle has been described above, the technique of this modification is not limited to this. For example, instead of the axis line, information about the angle of view of the imaging system composed of the objective optical system 2311 and the imaging device 2312 may be used. For example, the information about the angle of view is an area representing the imaging range of the imaging system, and is a cone whose apex is the reference point of the imaging system. The pyramid here is, for example, a quadrangular pyramid, and the quadrangular base corresponds to the angle of view. Alternatively, the information about the angle of view may be information representing numerical values such as a horizontal angle of view, a vertical angle of view, and a diagonal angle of view.
 この変形例のアプローチ角度は、直錐である上記四角錐を考えた場合の底面と、処置対象組織を表す平面P21とのなす角度であってもよい。例えば、アプローチ角度は、2つの面の交線の方向を規定するベクトルである。或いはアプローチ角度は、四角錐の頂点と底面の任意の点とを結ぶ直線と、平面P21との角度を表す情報であってもよい。 The approach angle in this modified example may be the angle formed by the bottom surface of the square pyramid, which is a right pyramid, and the plane P21 representing the tissue to be treated. For example, the approach angle is a vector that defines the direction of the line of intersection of two planes. Alternatively, the approach angle may be information representing the angle between the plane P21 and a straight line connecting the apex of the quadrangular pyramid and an arbitrary point on the bottom surface.
 次に処理システム2100における処理の流れについて具体的に説明する。なお処理システム2100は、内視鏡システム2300からの情報を定期的に取得可能である。例えば処理システム2100は、内視鏡システム2300によって撮像された撮像画像や、モーションセンサのセンサ情報を取得する。また後述するように、処理システム2100は、周辺装置の制御情報を定期的に取得してもよい。 Next, the flow of processing in the processing system 2100 will be specifically described. Note that the processing system 2100 can periodically acquire information from the endoscope system 2300. For example, the processing system 2100 acquires captured images captured by the endoscope system 2300 and sensor information of the motion sensor. Also, as described later, the processing system 2100 may periodically acquire control information of peripheral devices.
 図26は、処理システム2100の処理を説明するフローチャートである。この処理が開始されると、まずステップS2101において、処理部2110は、処置対象組織に対する処置が開始されたか否かを検出する。例えば処理部2110は、周辺機器の使用情報に基づいて、処置の開始を検出する処理を行う。ここでの周辺機器とは、内視鏡システム2300に付随して設けられる機器である。具体的には、スコープ部2310、処理装置2330のように生体内の観察に必須な構成が内視鏡システム2300の本体部に相当する。また本体部は、表示部2340や光源装置2350を含んでもよい。これに対して、周辺機器は撮像そのものに必須の構成ではなく、例えば処置を行うための処置具2360や、高周波デバイスである処置具2360に電力を供給するための電源装置を含む。また周辺機器には、送気、吸引を行うためのポンプ等を有する装置が含まれてもよい。処置を行う際には周辺機器が用いられるため、周辺機器の使用情報を用いることによって、処置の開始を適切に検出することが可能になる。 FIG. 26 is a flowchart for explaining the processing of the processing system 2100. When this processing starts, first, in step S2101, the processing unit 2110 detects whether or not treatment on the treatment target tissue has started. For example, the processing unit 2110 performs processing for detecting the start of treatment based on usage information of peripheral devices. A peripheral device here is a device provided to accompany the endoscope system 2300. Specifically, components essential for observing the inside of the living body, such as the scope section 2310 and the processing device 2330, correspond to the main body of the endoscope system 2300. The main body may also include the display section 2340 and the light source device 2350. In contrast, peripheral devices are not components essential for imaging itself; they include, for example, the treatment instrument 2360 for performing treatment and a power supply device for supplying power to the treatment instrument 2360 when it is a high-frequency device. Peripheral devices may also include a device having a pump or the like for air supply and suction. Since peripheral devices are used when performing treatment, the start of treatment can be appropriately detected by using their usage information.
 具体的には、処理部2110は、処置具2360の突出状態に関する情報を使用情報として取得してもよい。例えば図24を用いて上述したように、挿入部2310bには、術者の手元部分(例えば操作部2310a)から先端部2011の開口部2316までつながるチャンネルが設けられる。術者は、当該チャンネルに処置具2360を挿通することによって処置を行う。処置具2360の突出状態とは、先端部2011の開口部2316から処置具2360の先端が突出したか否か、或いは、その突出量を表す情報である。 Specifically, the processing unit 2110 may acquire information regarding the protruding state of the treatment instrument 2360 as the usage information. For example, as described above with reference to FIG. 24, the insertion section 2310b is provided with a channel that runs from the operator's hand-side portion (for example, the operation section 2310a) to the opening 2316 of the distal end section 2011. The operator performs treatment by inserting the treatment instrument 2360 through this channel. The protruding state of the treatment instrument 2360 is information indicating whether or not the distal end of the treatment instrument 2360 protrudes from the opening 2316 of the distal end section 2011, or the amount of protrusion.
 例えば処理部2110は、撮像画像に処置具2360が撮像されたか否かを判定することによって、突出状態に関する使用情報を取得する。図20(A)等に示したように、処置具2360は先端部2011の開口部2316から挿入部2310bの軸方向に突出するため、図20(B)等に示したように、撮像画像には突出した状態の処置具2360が撮像される。処置具2360は、生体組織に比べて彩度が低く、その形状も設計から既知である。よって処理部2110は、撮像画像に対して、彩度判定処理や、基準形状を用いたマッチング処理等の画像処理を行うことによって、画像中の処置具領域を検出できる。処理部2110は、例えば画像中に処置具2360が存在する場合に、処置具2360が突出しており、処置が開始されたと判定する。また処理部2110は、鉗子口に対応する開口部2316に設けられるセンサの出力に基づいて、処置具2360の突出状態を判定してもよい。 For example, the processing unit 2110 acquires the usage information regarding the protruding state by determining whether or not the treatment instrument 2360 appears in the captured image. As shown in FIG. 20(A) and the like, the treatment instrument 2360 protrudes from the opening 2316 of the distal end portion 2011 in the axial direction of the insertion section 2310b; therefore, as shown in FIG. 20(B) and the like, the protruding treatment instrument 2360 appears in the captured image. The treatment instrument 2360 has lower saturation than living tissue, and its shape is known from its design. Therefore, the processing unit 2110 can detect the treatment instrument region in the image by performing image processing on the captured image, such as saturation determination processing or matching processing using a reference shape. For example, when the treatment instrument 2360 is present in the image, the processing unit 2110 determines that the treatment instrument 2360 is protruding and that treatment has started. The processing unit 2110 may also determine the protruding state of the treatment instrument 2360 based on the output of a sensor provided at the opening 2316 corresponding to the forceps port.
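The saturation-based detection described above might be sketched as follows. The thresholds and the RGB-based saturation computation are illustrative assumptions for this sketch, not values given in the disclosure.

```python
import numpy as np

SAT_THRESHOLD = 0.25   # assumed: pixels below this saturation are tool-like
AREA_THRESHOLD = 0.02  # assumed: minimum fraction of the frame for a detection

def tool_present(rgb):
    """Rough check for a low-saturation (tool-like) region in an RGB frame.

    rgb: float array in [0, 1], shape (H, W, 3).
    Saturation is computed as in the HSV model: (max - min) / max.
    """
    mx = rgb.max(axis=2)
    mn = rgb.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    low_sat = sat < SAT_THRESHOLD
    return bool(low_sat.mean() > AREA_THRESHOLD)
```

A real implementation would additionally apply the matching processing with a reference shape mentioned above; this sketch only shows the saturation criterion.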
 ここでの処置具2360は、生体に対する処置を行うための器具であり、例えば高周波スネアや高周波ナイフを含む。高周波ナイフは、ニードルナイフ、ITナイフ、フックナイフ等を含む。例えばESDのマーキングには、ニードルナイフが用いられる。切開にはITナイフが用いられる。剥離には高周波スネアや高周波ナイフが用いられる。また処置具2360は、注射針、鉗子、クリップ等の他の器具を含んでもよい。ESDの局注には注射針が用いられる。止血には鉗子やクリップが用いられる。 The treatment instrument 2360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife. High frequency knives include needle knives, IT knives, hook knives, and the like. For example, a needle knife is used for ESD marking. An IT knife is used for the incision. A high-frequency snare or high-frequency knife is used for peeling. The treatment instrument 2360 may also include other instruments such as injection needles, forceps, and clips. An injection needle is used for local injection of ESD. Forceps or clips are used to stop bleeding.
 処置を行わない場合、処置具2360を突出させる必要はないし、突出させることでかえって生体を傷つけるおそれがある。即ち、術者が処置具2360を突出させた場合、速やかに処置が開始されると推定できる。よって処置具2360の突出状態を用いることによって、処置の開始を精度よく判定できる。 When treatment is not performed, there is no need to protrude the treatment instrument 2360, and protruding it may instead injure the living body. That is, when the operator protrudes the treatment instrument 2360, it can be estimated that treatment will start promptly. Therefore, by using the protruding state of the treatment instrument 2360, the start of treatment can be determined accurately.
 また処理部2110は、高周波デバイスの通電状態に関する情報を使用情報として取得してもよい。高周波デバイスとは、高周波電流が印加されることによって、対象組織を切除、焼灼するために用いられるデバイスである。高周波デバイスは、高周波スネアや高周波ナイフを含む。通電状態とは、高周波デバイスに対して、電源装置から高周波電流が供給されているか否かを表す情報であり、電源装置の制御信号に基づいて判定可能である。処理部2110は、当該制御信号に基づいて、高周波デバイスに高周波電流が供給されている場合に、処置が開始されたと判定する。 Also, the processing unit 2110 may acquire information about the energization state of the high-frequency device as usage information. A high-frequency device is a device used to excise and cauterize a target tissue by applying a high-frequency current. High frequency devices include high frequency snares and high frequency knives. The energized state is information indicating whether or not a high-frequency current is being supplied from the power supply to the high-frequency device, and can be determined based on the control signal of the power supply. Based on the control signal, the processing unit 2110 determines that treatment has started when high-frequency current is being supplied to the high-frequency device.
 高周波デバイスは、単に突出させただけでは切除や焼灼を行うことができず、高周波電流の供給が必要となる。即ち、高周波デバイスに高周波電流が供給された場合、速やかに処置が開始される蓋然性がより高いといえる。よって高周波デバイスの通電状態を用いることによって、処置の開始を精度よく判定できる。 A high-frequency device cannot perform resection or cauterization simply by being protruded; it requires the supply of high-frequency current. That is, when high-frequency current is supplied to the high-frequency device, the probability that treatment will start promptly is even higher. Therefore, by using the energization state of the high-frequency device, the start of treatment can be determined accurately.
 ステップS2101でNoの場合、処理部2110はステップS2102以降の処理を行わずに、所定時間待機する。そして処理部2110は、再度、ステップS2101の処理を実行する。換言すれば、この変形例の手法では、処置の開始が検出されるまでは、アプローチ角度の相対変化の算出や報知情報の出力が行われなくてもよい。 If No in step S2101, the processing unit 2110 waits for a predetermined period of time without performing the processes after step S2102. Then, the processing unit 2110 executes the process of step S2101 again. In other words, in the technique of this modified example, the calculation of the relative change in the approach angle and the output of the notification information may not be performed until the start of treatment is detected.
 一方、ステップS2101でYesの場合、ステップS2102において、処理部2110は、処置が開始されたと判定されたタイミングにおけるアプローチ角度を、基準角度に設定する処理を行う。例えば、処理部2110は、処置具2360が突出したと判定されたタイミングにおいてモーションセンサのセンサ情報及び撮像画像を取得し、挿入部2310bの軸線である直線L21と、処置対象組織に関する平面P21を求める処理を行い、直線L21と平面P21のなす角度を、アプローチ角度の基準角度に設定する。 On the other hand, if Yes in step S2101, in step S2102, the processing unit 2110 performs processing to set the approach angle at the timing when it is determined that treatment has started as the reference angle. For example, the processing unit 2110 acquires the sensor information of the motion sensor and the captured image at the timing when it is determined that the treatment instrument 2360 has protruded, performs processing to obtain the straight line L21, which is the axis of the insertion section 2310b, and the plane P21 relating to the tissue to be treated, and sets the angle formed by the straight line L21 and the plane P21 as the reference angle of the approach angle.
 なお、この変形例における「処置具2360が突出したタイミングにおけるアプローチ角度」とは、突出検出をトリガーとして算出されたアプローチ角度を表すものであって、突出の判定タイミング、センサ情報の取得タイミング、撮像画像の取得タイミングは厳密に一致する必要はない。 Note that the "approach angle at the timing when the treatment instrument 2360 protrudes" in this modification represents the approach angle calculated with the detection of protrusion as a trigger; the timing of the protrusion determination, the timing of sensor information acquisition, and the timing of captured-image acquisition need not exactly coincide.
 また上述したように、処理部2110は、高周波デバイスの通電開始タイミングに対応するタイミングのアプローチ角度を基準角度に設定してもよい。以上のように、処置開始時のアプローチ角度を基準角度に設定することによって、1回の処置のなかでのアプローチ角度の相対変化を適切に求めることが可能になる。 Also, as described above, the processing unit 2110 may set the approach angle at the timing corresponding to the energization start timing of the high-frequency device as the reference angle. As described above, by setting the approach angle at the start of treatment as the reference angle, it is possible to appropriately obtain the relative change in the approach angle during one treatment.
 基準角度が設定された後、処理部2110は、ステップS2103においてアプローチ角度の相対変化を求める。例えば処理部2110は、モーションセンサのセンサ情報を取得し、当該センサ情報に基づいて、直線L21を表す情報を求める。処理部2110は直線L21と、ステップS2102で求めた平面P21とのなす角度を、そのときのアプローチ角度として求める。さらに処理部2110は、求めたアプローチ角度と、ステップS2102で求めた基準角度の差分を算出し、当該差分をアプローチ角度の相対変化とする。 After setting the reference angle, the processing unit 2110 obtains the relative change in the approach angle in step S2103. For example, the processing unit 2110 obtains sensor information from a motion sensor, and obtains information representing the straight line L21 based on the sensor information. The processing unit 2110 obtains the angle formed by the straight line L21 and the plane P21 obtained in step S2102 as the approach angle at that time. Further, the processing unit 2110 calculates the difference between the obtained approach angle and the reference angle obtained in step S2102, and regards the difference as the relative change in the approach angle.
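Steps S2102 and S2103 can be sketched as a small helper that stores the reference angle and reports later samples as differences from it; the class name and interface are hypothetical.

```python
class ApproachAngleTracker:
    """Tracks the relative change of the approach angle against a reference.

    Mirrors steps S2102-S2103: the angle measured when treatment starts
    becomes the reference, and later samples are reported as differences
    from that reference (in degrees).
    """
    def __init__(self):
        self.reference = None

    def set_reference(self, angle_deg):      # step S2102
        self.reference = angle_deg

    def relative_change(self, angle_deg):    # step S2103
        if self.reference is None:
            raise RuntimeError("reference angle not set")
        return angle_deg - self.reference
```

When treatment is detected again (step S2105, Yes), calling `set_reference` once more re-calibrates the relative change to 0 degrees, as in FIG. 28.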
 ステップS2104において、出力処理部2120は、ステップS2103で求められた相対変化を出力する処理を行う。例えば出力処理部2120は、処置が開始されたと判定したタイミングよりも後のタイミングにおいて、相対変化に対応する角度をリアルタイムに表示する表示処理を行う。処置が開始されたと判定したタイミングとはステップS2101でYesと判定したタイミングであり、その後のタイミングとはステップS2104のタイミングである。 In step S2104, the output processing unit 2120 performs processing for outputting the relative change obtained in step S2103. For example, the output processing unit 2120 performs display processing for displaying the angle corresponding to the relative change in real time at a timing after the timing at which it is determined that the treatment has started. The timing at which it is determined that the treatment has started is the timing at which Yes is determined in step S2101, and the subsequent timing is the timing in step S2104.
 図27(A)、図27(B)は、相対変化に対応する角度をリアルタイムに表示する際の表示画面の例である。例えば、出力処理部2120は、相対変化に対応する数値を、図形を用いて表示してもよい。例えば、図27(A)は、相対変化が0度の状態を表し、図27(B)は相対変化が+30度の状態を表す。例えば+30度とは、基準角度に比べて、現在のアプローチ角度が30度増加した状態である。このようにすれば、処置中のアプローチ角度がどのように推移しているかを、速やかに、且つ、わかりやすい態様で術者に提示することが可能になる。ただし、相対変化をリアルタイムに表示する画面はこれに限定されず、数値そのものを表示してもよい。 FIGS. 27(A) and 27(B) are examples of display screens when displaying the angle corresponding to the relative change in real time. For example, the output processing unit 2120 may display numerical values corresponding to relative changes using graphics. For example, FIG. 27(A) represents a state in which the relative change is 0 degrees, and FIG. 27(B) represents a state in which the relative change is +30 degrees. For example, +30 degrees means that the current approach angle is 30 degrees higher than the reference angle. In this way, it is possible to quickly and easily present to the operator how the approach angle changes during treatment. However, the screen that displays the relative change in real time is not limited to this, and the numerical value itself may be displayed.
 ステップS2104の処理後、ステップS2105において、処理部2110は、処置の開始が再度検出されたか否かを判定する。具体的な処理はステップS2101と同様であり、処理部2110は、処置具2360の突出、又は、高周波デバイスの通電が検出されたか否かを判定する。なお、ステップS2105の処理は、ステップS2101で開始された処置が継続されている場合はNoと判定される。即ち、処置具2360が突出し続けている状態、又は、高周波デバイスの通電状態が維持されている場合、処理部2110はステップS2105でNoと判定する。 After the processing of step S2104, in step S2105, the processing unit 2110 determines whether or not the start of treatment has been detected again. The specific processing is the same as in step S2101: the processing unit 2110 determines whether or not protrusion of the treatment instrument 2360 or energization of the high-frequency device has been detected. Note that the determination in step S2105 is No while the treatment started in step S2101 continues. That is, if the treatment instrument 2360 remains protruded, or if the energized state of the high-frequency device is maintained, the processing unit 2110 determines No in step S2105.
 具体的に説明する。術者による1回の処置は、(a)処置具の突出、(b)通電、(c)生体に対する具体的な処置、(d)通電終了、(e)処置具の収納、という手順を複数回繰り返すことによって実行されると考えられる。処置具2360が高周波デバイスでない場合、(b)や(d)は省略可能である。1回の処置の終了後、次の処置の開始までに、例えば上述したポジショニングが行われるが、ここでは省略する。例えばESDは、切開や剥離等の複数のステップを含むが、各ステップが(a)~(e)の手順を1回又は複数回行うことによって実現される。 This will be described specifically. A single treatment by the operator is considered to be executed by repeating, multiple times, the sequence of (a) protruding the treatment instrument, (b) energization, (c) the specific treatment on the living body, (d) ending energization, and (e) retracting the treatment instrument. If the treatment instrument 2360 is not a high-frequency device, (b) and (d) can be omitted. Between the end of one treatment and the start of the next, for example, the positioning described above is performed, but this is omitted here. For example, ESD includes a plurality of steps such as incision and dissection, and each step is realized by performing the sequence (a) to (e) once or multiple times.
 図26に示す処理では、まず(a)又は(b)によってステップS2101でYESと判定され、ステップS2102の処理が実行される。その後の(c)の実行中では、ステップS2105でNoと判定されるため、ステップS2103、S2104の処理が繰り返し実行される。また(d)及び(e)が行われたことを条件に、ステップS2105の判定結果がYESとなる可能性が出てくる。具体的には、(d)及び(e)の実行後、再度、(a)又は(b)が実行された場合に、処理部2110はステップS2105でYesと判定する。換言すれば、ステップS2101及びS2105の処理は、処置具2360が突出していない状態から突出した状態に移行したこと、又は、高周波デバイスが通電していない状態から通電した状態に移行したことを検出する処理である。 In the process shown in FIG. 26, first, YES is determined in step S2101 by (a) or (b), and the process of step S2102 is executed. During execution of (c) after that, since it is determined as No in step S2105, the processes of steps S2103 and S2104 are repeatedly executed. Also, on condition that (d) and (e) are performed, there is a possibility that the determination result of step S2105 will be YES. Specifically, when (a) or (b) is executed again after executing (d) and (e), the processing unit 2110 determines Yes in step S2105. In other words, the processing of steps S2101 and S2105 detects that the treatment instrument 2360 has transitioned from a non-projecting state to a projected state, or that the high-frequency device has transitioned from a non-energized state to an energized state. processing.
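The transition detection performed in steps S2101 and S2105 can be sketched as follows, assuming a per-frame boolean signal (True while the treatment instrument protrudes or the high-frequency device is energized) as input; the function name and input representation are assumptions for illustration.

```python
def detect_treatment_starts(tool_out_flags):
    """Return indices where a treatment start is detected.

    A start corresponds to a False -> True transition of the flag,
    i.e. the instrument moving from a retracted state to a protruded
    state (or the device from de-energized to energized), as in
    steps S2101 and S2105.
    """
    starts = []
    prev = False
    for i, flag in enumerate(tool_out_flags):
        if flag and not prev:
            starts.append(i)
        prev = flag
    return starts
```

Each detected index would trigger the (re)setting of the reference angle in step S2102.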
 ステップS2105でYesと判定された場合、ステップS2102において、処理部2110は、処置が開始されたと判定されたタイミングにおけるアプローチ角度を、基準角度に設定する処理を行う。これ以降も同様であり、(c)の間は再設定された基準角度を用いて、アプローチ角度の相対変化の演算(ステップS2103)、及び出力(ステップS2104)が行われる。(d)及び(e)の後に再度(a)が実行されると(ステップS2105でYes)、基準角度が再設定される(ステップS2102)。 If it is determined as Yes in step S2105, in step S2102, the processing unit 2110 performs processing to set the approach angle at the timing when it is determined that the treatment has started, as the reference angle. The same applies thereafter, and during (c), the reset reference angle is used to calculate the relative change in the approach angle (step S2103) and output (step S2104). When (a) is executed again after (d) and (e) (Yes in step S2105), the reference angle is reset (step S2102).
 以上のように、この変形例の手法は、例えば上記(a)~(e)を1サイクルとして、サイクルごとに基準角度の設定及び相対変化の出力が行われる。 As described above, in the method of this modified example, for example, the above (a) to (e) are set as one cycle, and the reference angle is set and the relative change is output for each cycle.
 図28は、この変形例の手法によって取得される相対変化の時間変化を説明する図である。図28の横軸は時間を表し、縦軸はアプローチ角度の相対変化を表す。図28のt1が例えば挿入部2310bの挿入後、初めて処置具2360の突出又は通電が検出されたタイミングに対応する(ステップS2101でYes)。処理部2110は、t1でのアプローチ角度を基準角度として求める(ステップS2102)。t1の後、各タイミングにおいてアプローチ角度の算出と、相対変化の算出及び出力が行われる(ステップS2103、S2104)。例えば図28に示すように、t1での相対変化が0度に設定され、これ以降はt1でのアプローチ角度を基準として、相対変化が時系列的に取得される。 FIG. 28 is a diagram explaining the temporal change of the relative change obtained by the technique of this modification. The horizontal axis of FIG. 28 represents time, and the vertical axis represents the relative change in the approach angle. For example, t1 in FIG. 28 corresponds to the timing at which protrusion or energization of the treatment instrument 2360 is first detected after insertion of the insertion section 2310b (Yes in step S2101). The processing unit 2110 obtains the approach angle at t1 as the reference angle (step S2102). After t1, calculation of the approach angle and calculation and output of the relative change are performed at each timing (steps S2103 and S2104). For example, as shown in FIG. 28, the relative change at t1 is set to 0 degrees, and thereafter the relative change is acquired in time series with the approach angle at t1 as the reference.
 またt2は、処置が再度開始されたタイミングを表す。即ち、t1とt2の間に上記(d)及び(e)が行われ、その後、t2において(a)が実行された。処理部2110は、t2でのアプローチ角度を基準角度として求める(ステップS2102)。図28に示すように、基準角度の再設定によって、t2での相対角度が0度にキャリブレーションされる。t3もt2と同様に、処置が再度開始されたタイミングを表す。図28に示すように、基準角度の再設定によって、t3での相対角度が0度にキャリブレーションされる。 Also, t2 represents the timing at which the treatment was restarted. That is, the above (d) and (e) were performed between t1 and t2, and then (a) was performed at t2. The processing unit 2110 obtains the approach angle at t2 as the reference angle (step S2102). As shown in FIG. 28, resetting the reference angle calibrates the relative angle at t2 to 0 degrees. Similarly to t2, t3 also represents the timing at which the treatment was restarted. As shown in FIG. 28, resetting the reference angle calibrates the relative angle at t3 to 0 degrees.
 図26のフローチャートに示したように、出力処理部2120は、処理部2110において基準角度が設定されたことをトリガーとして、報知情報の出力処理を開始する。そして処理部2110は、出力処理部2120による出力処理の開始後、再度、処置が開始されたと判定した場合に、基準角度の再設定処理を行う。このようにすれば、1回の手術の中で生体に対する処置が複数回繰り返される場合であっても、各処置において適切な相対変化を出力することが可能になる。 As shown in the flowchart of FIG. 26, the output processing unit 2120 starts the output processing of the notification information, triggered by the setting of the reference angle in the processing unit 2110. Then, when the processing unit 2110 determines that treatment has started again after the output processing by the output processing unit 2120 has started, it performs the resetting processing of the reference angle. In this way, even when treatment on the living body is repeated multiple times in one surgery, an appropriate relative change can be output for each treatment.
 なお、以上で説明したように、アプローチ角度の相対変化は、患者に対する処置中にリアルタイムに表示されてもよい。このようにすれば、処置を行っている術者に対して、危険な操作を行わないように注意を促すことが可能になる。ただし、処理システム2100による報知はこれに限定されない。 As described above, the relative change in approach angle may be displayed in real time during treatment of the patient. In this way, it is possible to warn the operator who is performing the treatment not to perform dangerous operations. However, the notification by the processing system 2100 is not limited to this.
 例えば処理部2110は、許容可能なアプローチ角度範囲を取得し、相対変化を表す角度がアプローチ角度範囲から外れた場合、アラート情報を報知情報として出力する処理を行ってもよい。上述したように、熟練医は処置中のアプローチ角度の相対変化を抑制することによって、安全に処置を行うことが可能である。一方、修練医はアプローチ角度がばらつくため、出血等の問題が生じるおそれがある。よって、あらかじめ許容可能なアプローチ角度範囲を設定しておくことによって、アプローチ角度の相対変化が安全なものか否かを判定することが可能である。 For example, the processing unit 2110 may acquire an allowable approach angle range, and perform processing for outputting alert information as notification information when the angle representing the relative change deviates from the approach angle range. As described above, a skilled doctor can safely perform treatment by suppressing relative changes in the approach angle during treatment. On the other hand, novice doctors may have problems such as bleeding due to variations in approach angles. Therefore, by setting an allowable approach angle range in advance, it is possible to determine whether or not the relative change in the approach angle is safe.
 図29は、アプローチ角度の相対変化と、許容可能なアプローチ角度範囲の関係を説明する図である。図29の縦軸、横軸は図28と同様である。また、処置開始と判定されたタイミングであるt1~t3についても図28と同様である。図29において、θ1より大きくθ2未満の範囲が許容可能なアプローチ角度範囲である。図29の例ではθ1<0度<θ2である。処理部2110は、アプローチ角度の相対変化がθ1以下である場合、又は、θ2以上である場合に、相対変化がアプローチ角度範囲から外れたと判定する。 FIG. 29 is a diagram explaining the relationship between the relative change in the approach angle and the allowable approach angle range. The vertical axis and horizontal axis of FIG. 29 are the same as in FIG. 28. The timings t1 to t3 at which it is determined that treatment has started are also the same as in FIG. 28. In FIG. 29, the range greater than θ1 and less than θ2 is the allowable approach angle range. In the example of FIG. 29, θ1 < 0 degrees < θ2. The processing unit 2110 determines that the relative change has deviated from the approach angle range when the relative change in the approach angle is θ1 or less, or θ2 or more.
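The range determination described for FIG. 29 can be sketched as follows; the concrete bound values θ1 and θ2 are placeholders, since the disclosure does not fix them.

```python
THETA1 = -20.0  # assumed lower bound of the allowable range (degrees)
THETA2 = +20.0  # assumed upper bound of the allowable range (degrees)

def out_of_range(relative_change_deg, theta1=THETA1, theta2=THETA2):
    """True when the relative change leaves the open interval (theta1, theta2).

    Matches the determination described for FIG. 29: the range is exceeded
    when the change is theta1 or less, or theta2 or more.
    """
    return relative_change_deg <= theta1 or relative_change_deg >= theta2
```

When this returns True, the output processing unit would output the alert information as the notification information.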
 例えば出力処理部2120は、t4やt6のタイミングにおいて、アラート情報を出力する。表示される情報は、テキスト情報であってもよいし、アイコン等の画像情報であってもよいし、他の情報であってもよい。またアラート情報は、発光部やスピーカーを用いて出力されてもよい。またアラート情報は、相対変化がアプローチ角度範囲内からアプローチ角度範囲外へ移行したタイミングで報知されてもよいし、アプローチ角度範囲外である間は継続して報知されてもよい。例えば出力処理部2120は、図29において、t4とt5の間、t6とt7の間でアラート情報の出力を継続してもよい。 For example, the output processing unit 2120 outputs alert information at timings t4 and t6. The displayed information may be text information, image information such as icons, or other information. Also, the alert information may be output using a light emitting unit or a speaker. The alert information may be notified at the timing when the relative change moves from within the approach angle range to outside the approach angle range, or may be continuously notified while outside the approach angle range. For example, the output processing unit 2120 may continue to output alert information between t4 and t5 and between t6 and t7 in FIG.
 このような報知を行うためには、アプローチ角度範囲をあらかじめ設定しておく必要がある。より具体的には、安全に処置を行うためにはどの程度のアプローチ角度範囲が許容されるかが既知でなければならない。 In order to perform such notification, it is necessary to set the approach angle range in advance. More specifically, it must be known how much approach angle range is permissible for safe treatment.
 例えば処理システム2100は、処置におけるアプローチ角度の相対変化を蓄積したデータベースDBを含んでもよい。処理部2110は、データベースDBに基づいて設定されたアプローチ角度範囲を取得する。 For example, the processing system 2100 may include a database DB that stores relative changes in approach angles during treatment. The processing unit 2110 acquires the approach angle range set based on the database DB.
 図30は、この変形例に係るシステムを説明する図である。例えばデータベースDBは、内視鏡システム2300とネットワーク等を介して接続されるデータベースサーバである。ただし、この変形例の処理システム2100は内視鏡システム2300に含まれてもよく、データベースDBが内視鏡システム2300に含まれることも妨げられない。処理システム2100は、内視鏡システム2300から取得した情報に基づいてアプローチ角度の相対変化を求め、求めた相対変化をデータベースDBに記憶する。なお、相対変化のデータベースDBへの記憶は、リアルタイムに行われてもよいし、挿入部2310bを抜去した後にまとめて行われてもよい。図28に示すように、挿入部2310bの挿入から抜去の間に、時系列の相対変化が求められるが、例えばその平均値がデータベースDBに記憶される。ただし、時系列の相対変化そのものがデータベースDBに記憶されてもよい。 FIG. 30 is a diagram explaining a system according to this modification. For example, the database DB is a database server connected to the endoscope system 2300 via a network or the like. However, the processing system 2100 of this modification may be included in the endoscope system 2300, and the inclusion of the database DB in the endoscope system 2300 is not prevented. The processing system 2100 obtains the relative change in the approach angle based on the information acquired from the endoscope system 2300, and stores the obtained relative change in the database DB. Note that the storage of the relative changes in the database DB may be performed in real time, or may be performed collectively after the insertion portion 2310b is removed. As shown in FIG. 28, time-series relative changes are obtained from insertion to removal of the insertion portion 2310b, and the average value thereof, for example, is stored in the database DB. However, the time-series relative change itself may be stored in the database DB.
 データベースDBは、エキスパートデータと、非エキスパートデータとを含んでもよい。エキスパートデータとは、熟練医による処置が行われた際のアプローチ角度の相対変化を表す情報である。非エキスパートデータとは、修練医による処置が行われた際のアプローチ角度の相対変化を表す情報である。この変形例におけるアプローチ角度範囲は、少なくともエキスパートデータに基づいて設定される。 The database DB may contain expert data and non-expert data. Expert data is information representing relative changes in the approach angle when treatment is performed by an expert doctor. Non-expert data is information that represents the relative change in approach angle when a procedure is performed by a novice doctor. The approach angle range in this modification is set based on at least expert data.
 例えば処理システム2100は、エキスパートデータに基づいて、多くの熟練医は処置中の角度変化が何度以下である、といった情報を求める。処理部2110は、求めた情報に基づいて、アプローチ角度範囲を設定する。例えば、アプローチ角度範囲は、エキスパートデータのうち、外れ値を除いた場合の相対変化の最大値に基づいて決定される。 For example, based on the expert data, the processing system 2100 obtains information indicating, for example, the number of degrees within which most expert doctors keep the angle change during treatment. The processing unit 2110 sets the approach angle range based on the obtained information. For example, the approach angle range is determined based on the maximum value of the relative change in the expert data after excluding outliers.
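One possible sketch of deriving the range from expert data follows. The outlier rule (a z-score cut) and the symmetric bounds are illustrative assumptions; the disclosure only states that outliers are excluded and the maximum relative change is used.

```python
import numpy as np

def allowable_range_from_experts(expert_changes, z_cut=3.0):
    """Derive an allowable range (theta1, theta2) from expert data.

    expert_changes: per-procedure relative-change extremes (degrees) taken
    from the expert portion of the database. Values beyond z_cut standard
    deviations from the mean are discarded as outliers, then the largest
    remaining magnitude sets symmetric bounds.
    """
    x = np.asarray(expert_changes, dtype=float)
    mu, sigma = x.mean(), x.std()
    if sigma > 0:
        x = x[np.abs(x - mu) <= z_cut * sigma]
    limit = float(np.abs(x).max())
    return -limit, limit
```

The resulting bounds would then be fed back to the endoscope system as the approach angle range used for the alert determination.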
 データベースDBに記憶されるデータがエキスパートであるか、非エキスパートデータであるかは、医師の熟練度、又は、処置の経過を特定する情報に基づいて決定されてもよい。例えば、内視鏡システム2300は、処理システム2100にアプローチ角度を求めるための情報を送信する際に、医師の熟練度を表す熟練度情報や経過を表す経過情報をメタデータとして付与してもよい。熟練度情報は、具体的には対象となる処置を実行した回数を表す症例数情報である。経過情報は、出血量、偶発症発生率、入院日数等を表す情報である。処理システム2100は、当該メタデータに基づいて、対象のデータが熟練医のデータであるエキスパートデータであるか、修練医のデータである非エキスパートデータであるかを決定する。 Whether the data stored in the database DB is expert data or non-expert data may be determined based on the doctor's skill level or on information specifying the course of the treatment. For example, when transmitting the information for obtaining the approach angle to the processing system 2100, the endoscope system 2300 may attach, as metadata, skill level information representing the doctor's skill level and progress information representing the course. The skill level information is, specifically, case count information representing the number of times the target treatment has been performed. The progress information is information representing the amount of bleeding, the incidence of complications, the number of days of hospitalization, and the like. Based on the metadata, the processing system 2100 determines whether the target data is expert data, which is data of a skilled doctor, or non-expert data, which is data of a trainee doctor.
 或いは、処置具の移動軌跡に基づいて、エキスパートデータであるか否かが判定されてもよい。技能が向上するほど動きが統制されて、より少ない動きで処置を遂行できると考えられる。よって操作ログ情報に蓄積されている処置具の移動軌跡におけるノード総数が少ないほど、当該データに対応する術者のスキルが高いと判定される。 Alternatively, whether or not the data is expert data may be determined based on the movement trajectory of the treatment instrument. It is believed that as skill improves, movements become more controlled and procedures can be accomplished with fewer movements. Therefore, it is determined that the smaller the total number of nodes in the movement trajectory of the treatment instrument accumulated in the operation log information, the higher the skill of the operator corresponding to the data.
 このようにすれば、データベースDBには、種々の症例から収集されたアプローチ角度の相対変化の情報が蓄積される。処理システム2100が当該情報の分析処理を行い、分析結果をアプローチ角度範囲として内視鏡システム2300にフィードバックすることによって、以降の症例において、より精度の高いサポートを行うことが可能になる。特に、蓄積される情報が増えることによって、サポート精度向上が期待される。例えば、収集データが増えるほど、熟練医による指導の感覚に近いサポートを実現できる。なお、ここでフィードバックされる情報は、アプローチ角度範囲に限定されず、上述したスキル評価やレポート情報を含んでもよい。 In this way, information on relative changes in approach angles collected from various cases is accumulated in the database DB. The processing system 2100 analyzes the information and feeds back the analysis result to the endoscope system 2300 as an approach angle range, thereby enabling more accurate support in subsequent cases. In particular, an increase in the amount of accumulated information is expected to improve support accuracy. For example, as the amount of collected data increases, it is possible to realize support that is closer to the feeling of guidance by a skilled doctor. The information fed back here is not limited to the approach angle range, and may include the skill evaluation and report information described above.
 なお、アプローチ角度範囲を設定する際に、アプローチ角度の相対変化を含む複数の特徴量を用いた分類処理が行われてもよい。 It should be noted that when setting the approach angle range, a classification process may be performed using a plurality of feature amounts including relative changes in the approach angle.
 FIG. 31 is a diagram explaining the classification process. For example, the processing unit 2110 of the processing system 2100 acquires time-series data representing the relative change in the approach angle from the endoscope system 2300 and, based on that time-series data, obtains a plurality of feature amounts including a first feature amount and a second feature amount. An example with two feature amounts is described below, but the number of feature amounts can be extended to three or more. The first feature amount relates to the relative change in the approach angle, and the second feature amount relates to time. In FIG. 31, the vertical axis is the first feature amount, denoted feature amount A2, and the horizontal axis is the second feature amount, denoted feature amount B2.
 For example, the processing unit 2110 divides the time-series data into a plurality of sections. Each section may be a section in which the relative change in the approach angle monotonically increases or decreases. Alternatively, each section may run from the point at which the relative change departs from 0 until it returns to 0 again. The processing unit 2110 uses the length of time of each section as the first feature amount and the absolute difference between the maximum and minimum of the relative change as the second feature amount. As shown in FIG. 28, the data corresponding to one case includes a plurality of sections, so a plurality of pairs of first and second feature amounts are expected to be acquired. The processing unit 2110 may, for example, adopt the pair with the largest second feature amount, or the pair with the largest value of (second feature amount/first feature amount). Alternatively, the processing unit 2110 may use the average of the first feature amounts and the average of the second feature amounts as the data corresponding to one case. Outputting a plurality of data points for one case is also not precluded.
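The segmentation and feature extraction described above can be sketched as follows. This is an illustrative sketch, not the embodiment's implementation; the function name, the list-based data shapes, and the specific zero-crossing segmentation rule (one of the two section definitions mentioned above) are assumptions:

```python
def extract_features(times, rel_changes):
    """Split a time series of relative angle changes into sections that
    end when the relative change returns to 0, then compute per section:
    first feature  = section duration,
    second feature = absolute max-min swing of the relative change."""
    sections = []
    start = 0
    for i in range(1, len(rel_changes)):
        # close a section when the relative change comes back to 0
        if rel_changes[i] == 0 and rel_changes[i - 1] != 0:
            sections.append((start, i))
            start = i
    if start < len(rel_changes) - 1:
        sections.append((start, len(rel_changes) - 1))

    features = []
    for s, e in sections:
        first = times[e] - times[s]           # duration of the section
        vals = rel_changes[s:e + 1]
        second = abs(max(vals) - min(vals))   # size of the angle swing
        features.append((first, second))
    return features
```

A representative pair for one case could then be chosen, for example, as the pair maximizing the second feature or the ratio (second feature/first feature), as described above.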
 In this case, the first and second feature amounts represent, when the approach angle changes significantly, how quickly that change occurred. In resection or the like using the treatment instrument 2360, the risk is high when the angle of the distal end portion 2011 changes abruptly. For example, even if the amount of change is large, the risk is relatively small if the change occurs slowly over time. That is, the first and second feature amounts serve as an index for judging the degree of treatment risk with time taken into account.
 The processing unit 2110 performs the classification process in a feature amount space spanned by the first and second feature amounts. Specifically, each case is plotted as one point in the feature space. The processing unit 2110 classifies the data into a plurality of categories using a known clustering method such as k-means. For example, when classifying into expert data and non-expert data, k=2, but the number of categories is not limited to this.
 In FIG. 31, F1 is the first category and F2 is the second category. The processing unit 2110 determines which of F1 and F2 is expert data and which is non-expert data. Metadata such as the proficiency information and progress information described above is used for this determination. For example, by acquiring the metadata of each piece of data in the first category, the processing unit 2110 can determine whether the cases in that category were handled by skilled doctors or by trainee doctors; the same applies to the second category. The processing unit 2110 then judges the data in the category containing more data from skilled doctors to be the expert data. In the example of FIG. 31, the processing unit 2110 judges the data in F1 to be expert data and the data in F2 to be non-expert data. However, the method of identifying expert data from a plurality of case data is not limited to the above, and various modifications are possible.
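The two-step procedure above, clustering in the two-dimensional feature space and then labeling the cluster with the majority of skilled-doctor cases as expert data, can be sketched as follows. This is a self-contained sketch with a minimal hand-written 2-means (any standard k-means implementation could be substituted); all names and data shapes are assumptions:

```python
import random


def dist2(a, b):
    """Squared Euclidean distance between two 2-D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2


def kmeans2(points, iters=20, seed=0):
    """Minimal 2-means clustering over 2-D feature points.
    Returns a cluster label (0 or 1) for each point."""
    rng = random.Random(seed)
    centers = rng.sample(points, 2)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        labels = [0 if dist2(p, centers[0]) <= dist2(p, centers[1]) else 1
                  for p in points]
        # update step: move each center to the mean of its members
        for k in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == k]
            if members:
                centers[k] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels


def expert_cluster(labels, is_skilled):
    """Pick the cluster containing more skilled-doctor cases,
    using per-case metadata as the vote."""
    votes = {0: 0, 1: 0}
    for lab, skilled in zip(labels, is_skilled):
        votes[lab] += 1 if skilled else 0
    return 0 if votes[0] >= votes[1] else 1
```

The data in the cluster returned by `expert_cluster` would then correspond to F1 in FIG. 31, and the remaining data to F2.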
 The processing unit 2110 sets the allowable approach angle range based on the classification result. For example, as in the example above, the processing unit 2110 analyzes the trend of the relative change in the expert data and sets the approach angle range based on the analysis result.
 As described above, however, time is useful for judging the degree of risk. The processing unit 2110 may therefore determine whether to output alert information based on information that includes time. For example, the processing unit 2110 may obtain an allowable relative change per unit time based on the first and second feature amounts described above. The output processing unit 2120 outputs alert information when the relative change falls outside the approach angle range, or when the slope of the relative change exceeds the allowable relative change per unit time.
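The two alert triggers just described, an out-of-range relative change and an excessive rate of change, can be sketched as a single predicate. The function name, argument shapes, and numeric values below are illustrative assumptions:

```python
def should_alert(prev_change, curr_change, dt, angle_range, max_rate):
    """Return True when alert information should be output:
    either the relative change leaves the allowed approach angle range,
    or its slope exceeds the allowable change per unit time."""
    lo, hi = angle_range
    out_of_range = not (lo <= curr_change <= hi)
    too_fast = abs(curr_change - prev_change) / dt > max_rate
    return out_of_range or too_fast
```

For example, a change of 8 degrees within 0.1 s would trigger an alert under a 10 deg/s limit even if 8 degrees is still inside the allowed range.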
 Although an example in which the database DB is included in the processing system 2100 has been described above, the method of this modification is not limited to this. For example, the database DB may be provided outside the processing system 2100. The processing system 2100 can communicate with the database DB via a network, for example, and sets the approach angle range by receiving the information accumulated in the database DB. Likewise, although an example in which the processing unit 2110 of the processing system 2100 obtains the approach angle range by performing the analysis process has been described above, the processing unit 2110 may instead acquire an approach angle range obtained by an external device.
 The approach angle range used for outputting alert information may be changeable depending on the situation. The processing unit 2110 may acquire a first approach angle range as the approach angle range under a first treatment condition, and acquire a second approach angle range, different from the first, under a second treatment condition different from the first treatment condition.
 In this way, the conditions for outputting alert information can be changed according to the treatment conditions, making it possible to support the operator's treatment more appropriately.
 A treatment condition is, for example, information identifying the organ in which the lesion to be treated exists. For example, the processing unit 2110 sets an approach angle range for the stomach when the organ to be treated is the stomach, and an approach angle range for the large intestine when the organ to be treated is the large intestine. The target organ may also be another organ such as the esophagus.
 The processing unit 2110 may also set the approach angle range according to the part of the organ. For example, even within the stomach, the processing unit 2110 sets an approach angle range for the lesser curvature when the site to be treated is the lesser curvature, and an approach angle range for the greater curvature when the site to be treated is the greater curvature. Similarly, when the large intestine is targeted, different approach angle ranges are set according to the target site, such as the rectum, ascending colon, transverse colon, or descending colon.
 For example, when the endoscope system 2300 transmits the information used to determine the approach angle to the processing system 2100, it may attach, as metadata, organ information identifying the organ or site to be treated. The database DB stores data sets in which the approach angle information, proficiency information, progress information, and organ information are associated with one another. Alternatively, the database DB may comprise a plurality of databases, one for each organ or site; for example, it may include a stomach database storing case data on the stomach and a large intestine database storing case data on the large intestine.
 The processing unit 2110 of the processing system 2100 acquires an approach angle range for each organ by performing the analysis process for each organ or site. For example, the processing unit 2110 acquires the approach angle range for treating the stomach based on the expert data in the stomach database. The analysis process is the same as in the example above; it may be a process of obtaining the trend of skilled doctors' approach angles, or a process including the classification process. The same applies to other organs: for example, the processing unit 2110 acquires the approach angle range for treating the large intestine based on the expert data in the large intestine database. The same also applies when one organ is subdivided into two or more sites, in which case the approach angle range for each site is obtained by an analysis process based on per-site data.
 The processing system 2100 performs a process of determining, from the plurality of approach angle ranges, the approach angle range to be used for the alert output determination, based on the organ and site currently being treated. In this way, the approach angle range can be set according to the target organ or site. Since organ shape and lesion characteristics differ between organs, the approach angle may be easy to maintain in some cases, while in others even a skilled doctor's approach angle varies to some extent. Performing the process per organ and per site makes it possible to support the operator based on an appropriate approach angle range.
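The per-condition selection described above amounts to a lookup keyed by the treatment condition. The sketch below is illustrative only; the organ/site keys and all numeric ranges are invented placeholder values, not values disclosed by the embodiment:

```python
# hypothetical ranges, as would be produced by per-organ/per-site analysis
APPROACH_RANGES = {
    ("stomach", "lesser_curvature"): (-5.0, 5.0),
    ("stomach", "greater_curvature"): (-8.0, 8.0),
    ("colon", "rectum"): (-6.0, 6.0),
}
DEFAULT_RANGE = (-10.0, 10.0)  # fallback when no per-site analysis exists


def select_range(organ, site):
    """Pick the approach angle range for the organ/site being treated."""
    return APPROACH_RANGES.get((organ, site), DEFAULT_RANGE)
```

The same lookup structure extends naturally to the other treatment conditions mentioned below, such as lesion type, treatment step, or operator skill evaluation, by widening the key.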
 The treatment conditions are not limited to organs or sites. For example, even for the same organ, the content of treatment may vary depending on the type of lesion. The processing unit 2110 may therefore change the approach angle range according to the lesion type. In this case, the database DB stores data to which metadata identifying the lesion type is attached, and the processing unit 2110 performs the analysis process for each lesion type.
 The treatment condition may also be information identifying a step of a treatment method comprising a plurality of steps, such as ESD. For example, the processing unit 2110 changes the approach angle range according to steps such as marking, local injection, incision, dissection, and hemostasis. In this case, the database DB stores data to which metadata identifying the treatment step is attached, and the processing unit 2110 performs the analysis process for each step.
 The treatment condition may also be information identifying the operator's skill evaluation result. The operator's skill evaluation result is determined based on, for example, the operator's proficiency information, the progress information of cases handled in the past, the relative change in the approach angle, and the like. For example, the processing unit 2110 may set a narrow approach angle range for an operator with a low skill evaluation and a wide approach angle range for an operator with a high skill evaluation. In this way, alerts are issued more readily to trainee doctors, so that danger can be signaled in advance, while alerts are issued less readily to skilled doctors, suppressing annoyance.
 In the process described above with reference to FIG. 26, once the start of treatment has been detected, output of the relative change in the approach angle continues even during a period in which one treatment has ended and positioning is being performed. In the example of FIG. 28, the end of energization in (d) above, or the retraction of the treatment instrument 2360 in (e), occurs before t2, yet output of the relative change continues until t2. Besides (d) and (e), other situations are conceivable in which there is little need to output the notification information, but output of the relative change continues in those situations as well.
 In contrast, the output processing unit 2120 of this modification may, after starting the output process for the notification information, stop that output process when a given stop condition is satisfied. When the processing unit 2110 determines that treatment has started again after the output processing unit 2120 has stopped the output process, the processing unit 2110 may reset the reference angle.
 In this way, the output process can be stopped under appropriate conditions, and the reference angle can be reset when the relative change is to be obtained after the stop. When the probability of performing treatment is low, such as during positioning, there is little point in outputting the relative change in the approach angle, and output processing in such situations can be suppressed. The output of unnecessary alert information can also be suppressed. Furthermore, when the calculation accuracy of the relative change in the approach angle would otherwise degrade, the output of low-accuracy information can be suppressed.
 Specifically, the processing unit 2110 determines that the stop condition is satisfied when at least one of the following conditions is met: (1) air supply, water supply, or suction has been performed; (2) the distance between the living body and the insertion portion 2310b is determined to be greater than or equal to a given distance threshold; (3) the pressing pressure at the insertion portion 2310b is determined to be greater than or equal to a given pressure threshold; (4) the treatment instrument 2360 is determined to have disappeared from the imaging screen; (5) a given time has elapsed since treatment was determined to have started; or (6) the distal end portion 2011 of the insertion portion 2310b of the endoscope is determined to be wobbling greatly. Not all of these stop conditions need to be used, and some may be omitted; other stop conditions may also be added. In this way, output processing of low necessity can be suppressed.
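The "at least one of" combination of stop conditions (1) to (6) can be sketched as follows. The dictionary keys and default thresholds are illustrative assumptions; in the embodiment these observations would come from the pump control signal, stereo matching, the pressure sensor, image processing, and timekeeping, as described below:

```python
def stop_condition_met(state):
    """Return True if any configured stop condition (1)-(6) holds.
    `state` is a dict of current observations; keys are illustrative."""
    checks = (
        state.get("air_water_or_suction", False),                           # (1)
        state.get("tissue_distance", 0.0)
            >= state.get("distance_threshold", 50.0),                       # (2)
        state.get("pressing_pressure", 0.0)
            >= state.get("pressure_threshold", 5.0),                        # (3)
        state.get("tool_lost_from_image", False),                           # (4)
        state.get("elapsed_since_start", 0.0)
            >= state.get("time_limit", 180.0),                              # (5)
        state.get("tip_wobble_large", False),                               # (6)
    )
    return any(checks)
```

Omitting a condition, as the text permits, corresponds to simply leaving its key out of `state` (or out of `checks`).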
 When air is supplied, the organ inflates and the tissue to be treated moves, so the plane P21 described above also changes. The calculation accuracy of the approach angle therefore decreases, so it is meaningful to stop outputting the relative change. With suction, the organ contracts, which likewise lowers the calculation accuracy of the approach angle, so stopping the output of the relative change is again meaningful. The processing unit 2110 detects air supply and suction based on, for example, the control signal of the pump that performs the air supply or suction.
 When the distal end portion 2011 of the insertion portion 2310b is moved away from the tissue to be treated, the captured image becomes a bird's-eye view of that tissue. The operator can then estimate the position and orientation of the distal end portion 2011 from the captured image, so notification by the processing system 2100 is of little significance. Moreover, an increased distance means that the treatment has been temporarily completed, and the operator may be performing positioning or observing the progress of the treatment; from this point of view as well, there is little need to output the relative change in the approach angle. The distance to the tissue to be treated can be obtained by various methods such as stereo matching, as described above.
 When the pressing pressure at the insertion portion 2310b is greater than or equal to the given pressure threshold, the shape of the organ changes under the pressing force. In this case as well, the calculation accuracy of the approach angle decreases, so it is preferable to stop outputting the relative change. For example, the endoscope system 2300 includes a pressure sensor provided in the insertion portion 2310b. The pressure sensor can be realized by a piezoelectric element such as a MEMS (Micro Electro Mechanical Systems) device. The processing unit 2110 detects the pressing pressure by acquiring sensor information from the pressure sensor.
 When the treatment instrument 2360 disappears from the captured image, it is considered that the retraction of the treatment instrument 2360 in (e) above has been performed. In this case, it is highly probable that one treatment has ended, so there is little need to output the relative change in the approach angle. Detection of the treatment instrument 2360 from the captured image can be realized by image processing using saturation or the like, as described above. Also, as described above, the processing unit 2110 may determine that the stop condition is satisfied when energization of the high-frequency device stops.
 When a given time has elapsed since the start of energization, it is highly probable that the treatment has ended, since continuing to energize a high-frequency device for a long time is problematic from the standpoint of safety. The given time here is, for example, on the order of several minutes. The processing system 2100 may include, for example, a timekeeping unit that outputs timekeeping information, or it may acquire the timekeeping information via a network. The processing unit 2110 measures the time from the start of energization based on the timekeeping information.
 FIG. 32 is a flowchart explaining the process when stop conditions are used. Steps S2201 to S2204 are the same as steps S2101 to S2104 in FIG. 26. Specifically, when the start of treatment is detected, the reference angle for the approach angle is set, and the relative change is calculated and output.
 Next, in step S2205, the processing unit 2110 determines whether the stop condition is satisfied. The stop conditions and determination methods are as described above. If the stop condition is not satisfied (No in step S2205), the process returns to step S2202; that is, calculation and output of the relative change in the approach angle continue using the current reference angle.
 On the other hand, if the stop condition is satisfied (Yes in step S2205), the process returns to step S2201. That is, the processing system 2100 periodically executes the process of step S2201, without calculating or outputting the relative change in the approach angle, until the start of treatment is detected. When the processing system 2100 determines that treatment has started (Yes in step S2201), it resets the reference angle (step S2202) and then calculates the relative change and outputs the notification information (steps S2203 and S2204).
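The control flow of FIG. 32 can be sketched as an event-driven loop. This is a simplified trace, assuming a scripted event sequence in place of real detections; the event tuple format and function name are illustrative, and the stop-condition and start detections are taken as given inputs:

```python
def monitoring_loop(events):
    """Trace the FIG. 32 flow over a scripted event sequence.
    Each event is ("start", angle) for a detected treatment start,
    ("angle", angle) for a new approach angle sample, or
    ("stop", None) when a stop condition is satisfied."""
    log = []
    reference = None            # no reference while waiting at S2201
    for kind, value in events:
        if kind == "start":
            reference = value   # S2202: (re)set the reference angle
            log.append(("ref", value))
        elif kind == "angle" and reference is not None:
            # S2203/S2204: compute and output the relative change
            log.append(("rel", value - reference))
        elif kind == "stop":
            reference = None    # back to S2201; suppress further output
            log.append(("stopped", None))
    return log
```

Note that angle samples arriving after a stop and before the next start produce no output, matching the behavior described above.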
6. Other Modifications
 According to the method of the present embodiment, the skill evaluation information is output based on the approach angle information and the energization history information, but it is not limited to these; for example, the skill evaluation information may be output based on the pressing pressure information and air supply/suction information described later. Hereinafter, as a modification, an embodiment of a processing system 3100 that outputs skill evaluation information based on pressing pressure information and air supply/suction information will be described. This modification mainly describes the individual treatments included in ESD, but its method can be extended to other treatments performed on a living body.
 The quality of an endoscopic procedure depends largely on the doctor's experience and "tacit knowledge" of operation. For example, in a flexible endoscope, as will be described later with reference to FIG. 35, the section from the distal end portion 3011 of the insertion portion 3310b to the operation portion 3310a is flexible and long, so the feel and force generated when the distal end portion 3011 contacts the living body are hardly transmitted to the operator. The information available to the operator is mainly the image captured by the imaging system provided in the distal end portion 3011; that is, the operator can confirm only the range and angle visible on the screen. Furthermore, captured images often lack three-dimensional information.
 Six degrees of freedom are required to assume an arbitrary position and orientation, but flexible endoscopes generally have only four. The operator is therefore required to have the skill to control the positional relationship between the target lesion and the endoscope by contracting and expanding the lumen to compensate for the missing degrees of freedom. Contracting and expanding the lumen requires controlling its internal pressure through air supply and suction. Hereinafter, air supply and suction may be referred to collectively as air supply/suction.
 Under these circumstances, operators who are endoscopists perform procedures based on their own empirical rules, so the time required for a treatment varies greatly between operators. For example, after the endoscope reaches a lesion in the lumen, the position of the distal end portion 3011 must be adjusted, and it has become clear that trainee doctors take longer for this adjustment than skilled doctors. Hereinafter, adjustment of the position of the distal end portion 3011 may be referred to simply as position adjustment. A skilled doctor here means a doctor with high treatment skill, and a trainee doctor means a doctor whose treatment skill is lower than that of a skilled doctor. Treatment skill is evaluated in consideration not only of information on the treatment itself but also of information on the course after the treatment, such as a low incidence of complications or a short postoperative hospital stay.
 To raise the skill of trainee doctors, it is desirable that skilled doctors be able to advise trainees on, or pass down, the technical knowledge needed to adjust the position in a short time. However, as described above, the section of a flexible endoscope from the distal end portion 3011 of the insertion portion 3310b to the operation portion 3310a is flexible and long, and the target tissue expands and contracts with the aforementioned air supply/suction and fluctuates constantly due to peristalsis, so the operability of the endoscope is always changing. As a result, the operational input and the operational output of the endoscope generally do not match. What can be grasped objectively and clearly during a treatment is therefore the content of the input operations, but even if those operations were recorded and a trainee could consult the record, this would not improve the trainee's skill. A skilled doctor can perform a treatment while constantly correcting for changes in operability, but cannot put into words the knowledge of when and how it is preferable to operate. In other words, the knowledge needed to adjust the position in a short time cannot currently be taught to trainees in objective terms; conventionally, adjustment of the position of the distal end portion 3011 during treatment has been "tacit knowledge."
 Therefore, there is a demand for visualizing and quantifying the operator's skill in flexible endoscopic treatment. If objective skill evaluation becomes possible, it will be easier for operators to improve their skills, and hospitals will be able to optimize staffing, among other benefits. It is thus necessary to find out which parameters are needed to visualize and quantify the operator's skill.
 近年、図39等で後述する押し付け圧力は、先端部3011のポジションの調整と相関の高いパラメータであることが分かってきている。具体的には、熟練医によるポジションの調整が早い処置において、押し付け圧力が早く安定することが分かってきている。 In recent years, it has been found that the pressing pressure, which will be described later with reference to FIG. 39 and other figures, is a parameter highly correlated with the adjustment of the position of the distal end portion 3011. Specifically, it has been found that in treatments in which a skilled doctor adjusts the position quickly, the pressing pressure stabilizes quickly.
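The observation that an expert's pressing pressure settles quickly can be turned into a simple quantitative measure. The following is a minimal sketch, assuming a uniformly sampled pressure trace and an illustrative tolerance band; neither the sample values nor the tolerance come from the embodiment.

```python
# Sketch: estimating how quickly a pressing-pressure signal stabilizes.
# The traces and the tolerance are illustrative assumptions.

def stabilization_index(pressure, tolerance):
    """Return the first sample index from which every later sample
    stays within +/- tolerance of the final settled value."""
    settled = pressure[-1]
    idx = 0
    for i in range(len(pressure) - 1, -1, -1):
        if abs(pressure[i] - settled) > tolerance:
            idx = i + 1
            break
    return idx

# An expert-like trace settles early; a trainee-like trace oscillates longer.
expert = [0.8, 0.4, 0.31, 0.30, 0.30, 0.30, 0.30, 0.30]
trainee = [0.8, 0.1, 0.6, 0.2, 0.5, 0.32, 0.30, 0.30]
print(stabilization_index(expert, 0.05))
print(stabilization_index(trainee, 0.05))
```

A smaller index means the pressure settled earlier, matching the behavior described above for skilled doctors.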
 このため、押し付け圧力は、前述の暗黙知を解明するパラメータとして期待されている。その一例として、熟練医が処置の段階ごとに押し付け圧力をどのように制御しているかについて注目されている。例えば、前述のマーキング段階、局注段階、切開段階、剥離段階を通じて押し付け圧力の挙動は、図33のような概形になると予想されている。マーキング段階においては、先端部3011は組織に接触することが少ないことから押し付け圧力は低く測定される。また、局注段階においては、液の注入によって臓器側が変形する。さらに局注は複数回行われるため、押し付け圧力の上昇及び降下が見られる。また、切開段階においては、図37で後述する処置具3360と組織の接触度合いにより組織への通電量を調節することから、先端部3011を強く押し付けるため、押し付け圧力は高く測定される。剥離段階においても、同様に、処置具3360と組織の接触度合いにより組織への通電量を調節するため、押し付け圧力は高く測定される。なお、図33はあくまでも概形の予想であり、詳細は十分解明されていない。 For this reason, the pressing pressure is expected to serve as a parameter for elucidating the aforementioned tacit knowledge. As one example, attention has been paid to how a skilled doctor controls the pressing pressure at each stage of treatment. For example, through the marking, local-injection, incision, and peeling stages described above, the behavior of the pressing pressure is expected to take the general shape shown in FIG. 33. In the marking stage, the distal end portion 3011 rarely contacts the tissue, so the pressing pressure is measured to be low. In the local-injection stage, the organ side is deformed by the injection of liquid; furthermore, since local injection is performed multiple times, the pressing pressure rises and falls. In the incision stage, the amount of current applied to the tissue is adjusted according to the degree of contact between the treatment instrument 3360 (described later with reference to FIG. 37) and the tissue, so the distal end portion 3011 is pressed firmly and the pressing pressure is measured to be high. Similarly, in the peeling stage, the pressing pressure is measured to be high because the amount of current applied to the tissue is adjusted according to the degree of contact between the treatment instrument 3360 and the tissue. Note that FIG. 33 is only an expected general shape, and the details have not been sufficiently clarified.
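The stage-by-stage behavior predicted above (low in marking, rising and falling in local injection, high in incision and peeling) can be summarized per stage, for example as a mean and a range. The stage labels and sample values below are hypothetical; the embodiment does not fix concrete numbers.

```python
# Sketch: summarizing pressing pressure per treatment stage.
# Stage boundaries and pressure values are hypothetical.

def stage_features(samples):
    """samples: list of (stage_name, pressure) tuples.
    Returns {stage: (mean, max - min)} so a rise-and-fall pattern, as in
    the local-injection stage, shows up as a large range."""
    by_stage = {}
    for stage, p in samples:
        by_stage.setdefault(stage, []).append(p)
    return {s: (sum(v) / len(v), max(v) - min(v)) for s, v in by_stage.items()}

samples = (
    [("marking", p) for p in (0.1, 0.12, 0.1)] +
    [("local_injection", p) for p in (0.2, 0.6, 0.15, 0.55)] +
    [("incision", p) for p in (0.8, 0.85, 0.9)]
)
features = stage_features(samples)
print(features["marking"])          # low mean, small range
print(features["local_injection"])  # large range (rise and fall)
print(features["incision"])         # high mean
```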
 このように、押し付け圧力のデータをより詳細に解析することにより、術者のスキルを把握することや、術者のスキルを上げる知見を得られることが期待される。例えばユーザが行なう外科的作業に対するデータを収集し、収集したデータと、同じ外科的作業に対する他のデータとを比較することによってユーザの臨床技能を定量化する手法が知られている。そのため、これらの手法を適用して押し付け圧力のデータを収集等することで、術者のスキルを定量化できるようにも思われる。 In this way, by analyzing the pressing pressure data in more detail, it is expected that we will be able to grasp the skill of the operator and gain knowledge to improve the skill of the operator. For example, techniques are known for quantifying a user's clinical skill by collecting data for a surgical task performed by the user and comparing the collected data to other data for the same surgical task. Therefore, it seems possible to quantify the skill of the operator by applying these methods and collecting pressing pressure data.
 しかし、押し付け圧力のような個々のパラメータを単独に解析しても、ポジションの調整の良し悪しを判断することは難しい。押し付け圧力は、内視鏡の先端部3011等を対象部位等に押し当てることだけでは決まらず、例えば、臓器の張り具合にも依存する。臓器の張り具合が弱いと臓器の内壁は柔らかく、逆に、臓器の張り具合が強いと臓器の内壁が固くなる場合が有るからである。さらに、臓器の張り具合は、当該臓器の蠕動運動に影響されるが、前述の送気吸引に伴う臓器の内圧変動にも影響される。そのため、押し付け圧力に関する情報は、送気吸引に関する情報と密接な関係にある。このように、ポジションの調整の良し悪しは、複数のパラメータをもとに総合的な判断が求められる。従来の手法では、このような事情までは考慮されておらず、術者のスキルを評価するには十分ではない。そこで、この変形例では、押し付け圧力情報及び送気吸引情報に基づいて、術者のスキルを評価する処理システム3100について説明する。 However, it is difficult to judge whether the position adjustment is good or bad by analyzing individual parameters such as pressing pressure alone. The pressing pressure is determined not only by pressing the distal end portion 3011 or the like of the endoscope against the target site or the like, but also depends on the tension of the organ, for example. This is because if the tension of the organ is weak, the inner wall of the organ may be soft, and conversely, if the tension of the organ is high, the inner wall of the organ may become hard. Furthermore, the tension of the organ is affected by the peristaltic movement of the organ, and is also affected by the internal pressure fluctuation of the organ due to the above-mentioned air supply and suction. Therefore, the information on pressing pressure is closely related to the information on air supply and suction. In this way, whether the position adjustment is good or bad requires a comprehensive judgment based on a plurality of parameters. Conventional methods do not take such circumstances into account, and are not sufficient to evaluate the skill of the operator. Therefore, in this modified example, a processing system 3100 that evaluates the operator's skill based on pressing pressure information and air supply/suction information will be described.
 図34は、この変形例に係る処理システム3100の構成を示す図である。処理システム3100は、取得部3110と、処理部3120と、出力処理部3130を含む。ただし処理システム3100は図34の構成に限定されず、これらの一部の構成要素を省略したり、他の構成要素を追加したりする等の種々の変形実施が可能である。 FIG. 34 is a diagram showing the configuration of a processing system 3100 according to this modification. The processing system 3100 includes an acquisition unit 3110, a processing unit 3120, and an output processing unit 3130. However, the processing system 3100 is not limited to the configuration of FIG. 34, and various modifications such as omitting some of these components or adding other components are possible.
 取得部3110は、図35、図36で後述する内視鏡システム3300から、内視鏡の挿入部の押し付け圧力情報と、送気及び吸引に関する送気吸引情報を取得する。例えば取得部3110は押し付け圧力情報及び送気吸引情報を取得する通信インターフェースと言うこともできる。なお、具体的な取得方法については後述する。また、取得部3110は、例えば情報取得用の通信チップ、当該通信チップを制御するプロセッサ又は制御回路等によって実現が可能である。なお、この変形例にかかる内視鏡は軟性鏡であるものとする。 The acquisition unit 3110 acquires the pressing pressure information of the insertion portion of the endoscope and the air supply/suction information regarding air supply and suction from the endoscope system 3300 described later with reference to FIGS. 35 and 36 . For example, the acquisition unit 3110 can also be said to be a communication interface that acquires pressing pressure information and air supply/suction information. A specific acquisition method will be described later. Also, the acquisition unit 3110 can be realized by, for example, a communication chip for information acquisition, a processor or a control circuit that controls the communication chip, or the like. Note that the endoscope according to this modification is assumed to be a flexible endoscope.
 処理部3120は、押し付け圧力情報及び送気吸引情報に基づいて、内視鏡システム3300の操作を行ったユーザのスキル評価を行う。処理部3120が実行する処理は、例えばクラスタリング等の分類処理である。なお、スキル評価の詳細は後述する。 The processing unit 3120 evaluates the skill of the user who has operated the endoscope system 3300 based on the pressing pressure information and the air supply/suction information. The processing executed by the processing unit 3120 is, for example, classification processing such as clustering. Details of the skill evaluation will be described later.
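As one possible form of the classification processing mentioned above, the sketch below runs a tiny 2-means clustering on two-dimensional feature vectors. The choice of features (a pressure stabilization time and an air-supply operation count) and the use of plain k-means are illustrative assumptions, not the embodiment's fixed design.

```python
# Sketch: clustering feature vectors derived from pressing-pressure and
# air-supply/suction signals into skill groups. Features are assumptions.
import math

def kmeans2(points, iters=10):
    """Minimal 2-means: returns (centroids, labels)."""
    c = [points[0], points[-1]]          # simple initialization
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            d = [math.dist(p, cj) for cj in c]
            labels[i] = d.index(min(d))
        for j in range(2):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                c[j] = tuple(sum(x) / len(members) for x in zip(*members))
    return c, labels

# (stabilization time [s], air-supply count) -- hypothetical features.
features = [(2.0, 3), (2.5, 4), (9.0, 12), (10.0, 11), (2.2, 3)]
_, labels = kmeans2(features)
print(labels)
```

Expert-like cases fall into one cluster and trainee-like cases into the other, giving a coarse skill grouping without labeled data.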
 学習済モデルを用いる処理が行われる場合、処理システム3100は、機械学習によって生成された学習済モデルを記憶する不図示の記憶部を含む。ここでの記憶部は、処理部3120等のワーク領域となるもので、その機能は半導体メモリ、レジスタ、磁気記憶装置などにより実現できる。処理部3120は、記憶部から学習済モデルを読み出し、当該学習済モデルからの指示に従って動作することによって、ユーザのスキル評価結果を出力する推論処理を行う。 When processing using a trained model is performed, the processing system 3100 includes a storage unit (not shown) that stores a trained model generated by machine learning. The storage unit here serves as a work area for the processing unit 3120 and the like, and its function can be realized by a semiconductor memory, a register, a magnetic storage device, or the like. The processing unit 3120 reads the learned model from the storage unit and operates according to instructions from the learned model, thereby performing inference processing for outputting the user's skill evaluation result.
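The inference step can be sketched as follows, with a hypothetical logistic-regression-style stand-in for the stored trained model; the embodiment does not specify the model type, the feature set, or the parameter values.

```python
# Sketch: reading a stored trained model and running inference to output
# a skill-evaluation result. The model contents are assumed for illustration.
import math

STORED_MODEL = {                  # stands in for the storage unit's contents
    "weights": [-0.9, -0.3],      # per-feature weights (assumed)
    "bias": 4.0,
}

def evaluate_skill(model, features):
    """features: e.g. (pressure stabilization time, air-supply count).
    Returns a score in (0, 1); higher is treated as more expert-like."""
    z = model["bias"] + sum(w * x for w, x in zip(model["weights"], features))
    return 1.0 / (1.0 + math.exp(-z))

print(evaluate_skill(STORED_MODEL, (2.0, 3)))    # quick stabilization
print(evaluate_skill(STORED_MODEL, (10.0, 12)))  # slow stabilization
```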
 なお、処理部3120は、下記のハードウェアにより構成される。ハードウェアは、デジタル信号を処理する回路及びアナログ信号を処理する回路の少なくとも一方を含むことができる。例えば、ハードウェアは、回路基板に実装された1又は複数の回路装置や、1又は複数の回路素子で構成することができる。1又は複数の回路装置は例えばIC、FPGA等である。1又は複数の回路素子は例えば抵抗、キャパシター等である。 It should be noted that the processing unit 3120 is configured with the following hardware. The hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals. For example, the hardware can consist of one or more circuit devices or one or more circuit elements mounted on a circuit board. The one or more circuit devices are for example ICs, FPGAs or the like. The one or more circuit elements are, for example, resistors, capacitors, and the like.
 また処理部3120は、下記のプロセッサにより実現されてもよい。処理システム3100は、情報を記憶するメモリと、メモリに記憶された情報に基づいて動作するプロセッサと、を含む。ここでのメモリは、上記の記憶部であってもよいし、異なるメモリであってもよい。情報は、例えばプログラムと各種のデータ等である。プロセッサは、ハードウェアを含む。プロセッサは、CPU、GPU、DSP等、各種のプロセッサを用いることが可能である。メモリは、SRAM、DRAMなどの半導体メモリであってもよいし、レジスタであってもよいし、HDD等の磁気記憶装置であってもよいし、光学ディスク装置等の光学式記憶装置であってもよい。例えば、メモリはコンピュータにより読み取り可能な命令を格納しており、当該命令がプロセッサにより実行されることで、処理部3120の機能が処理として実現されることになる。ここでの命令は、プログラムを構成する命令セットの命令でもよいし、プロセッサのハードウェア回路に対して動作を指示する命令であってもよい。さらに、処理部3120の各部の全部または一部をクラウドコンピューティングで実現し、後述する各処理をクラウドコンピューティング上で行うこともできる。 Also, the processing unit 3120 may be realized by the following processor. The processing system 3100 includes a memory that stores information and a processor that operates based on the information stored in the memory. The memory here may be the storage unit described above, or may be a different memory. The information includes, for example, programs and various kinds of data. The processor includes hardware. Various processors such as a CPU, GPU, and DSP can be used. The memory may be a semiconductor memory such as an SRAM or DRAM, a register, a magnetic storage device such as an HDD, or an optical storage device such as an optical disc device. For example, the memory stores computer-readable instructions, and the functions of the processing unit 3120 are realized as processes by the processor executing those instructions. The instructions here may be instructions of an instruction set constituting a program, or instructions that direct the operation of the processor's hardware circuits. Furthermore, all or part of each section of the processing unit 3120 may be realized by cloud computing, and each process described later may be performed on cloud computing.
 また、処理部3120は、プロセッサ上で動作するプログラムのモジュールとして実現されてもよい。例えば、処理部3120は、押し付け圧力情報と送気吸引情報に基づいてスキル評価を行う処理モジュールとして実現される。 Also, the processing unit 3120 may be implemented as a module of a program that runs on a processor. For example, the processing unit 3120 is implemented as a processing module that performs skill evaluation based on pressing pressure information and air supply/suction information.
 また、処理部3120が行う処理を実現するプログラムは、例えばコンピュータによって読み取り可能な媒体である情報記憶装置に格納できる。情報記憶装置は、例えば光ディスク、メモリカード、HDD、或いは半導体メモリなどによって実現できる。半導体メモリは例えばROMである。処理部3120は、情報記憶装置に格納されるプログラムに基づいての種々の処理を行う。即ち情報記憶装置は、処理部3120としてコンピュータを機能させるためのプログラムを記憶する。コンピュータは、入力装置、処理部、記憶部、出力部を備える装置である。具体的にはこの変形例に係るプログラムは、図51等を用いて後述する各ステップを、コンピュータに実行させるためのプログラムである。 Also, the program that implements the processing performed by the processing unit 3120 can be stored in, for example, an information storage device that is a computer-readable medium. The information storage device can be implemented by, for example, an optical disc, memory card, HDD, semiconductor memory, or the like. A semiconductor memory is, for example, a ROM. The processing unit 3120 performs various processes based on programs stored in the information storage device. That is, the information storage device stores a program for causing the computer to function as the processing unit 3120 . A computer is a device that includes an input device, a processing unit, a storage unit, and an output unit. Specifically, the program according to this modification is a program for causing a computer to execute each step described later with reference to FIG. 51 and the like.
 出力処理部3130は、処理部3120によるスキル評価の結果であるスキル評価情報を出力する処理を行う。例えば、処理システム3100は不図示の表示部を含み、出力処理部3130は、スキル評価情報を当該表示部に表示する処理を行ってもよい。或いは、図38を用いて後述するように、処理システム3100はネットワークを介して、内視鏡システム3300に接続されてもよい。出力処理部3130は、ネットワークを介してスキル評価情報を送信する通信デバイスや通信チップであってもよい。なおスキル評価情報が出力される機器は内視鏡システム3300に限定されず、処理システム3100と通信可能なPCであってもよいし、スマートフォンやタブレット端末等の携帯端末装置であってもよい。 The output processing unit 3130 performs processing for outputting skill evaluation information that is the result of skill evaluation by the processing unit 3120 . For example, the processing system 3100 may include a display unit (not shown), and the output processing unit 3130 may perform processing for displaying skill evaluation information on the display unit. Alternatively, the processing system 3100 may be connected to the endoscope system 3300 via a network, as will be described later using FIG. The output processing unit 3130 may be a communication device or communication chip that transmits skill evaluation information via a network. Note that the device that outputs the skill evaluation information is not limited to the endoscope system 3300, and may be a PC that can communicate with the processing system 3100, or a mobile terminal device such as a smart phone or a tablet terminal.
 このように、この変形例の処理システム3100は、取得部3110と、処理部3120と、出力処理部3130を含む。また、取得部3110は、軟性鏡である内視鏡を用いた処置における挿入部の押し付け圧力情報と、送気及び吸引に関する送気吸引情報を取得する。また、処理部3120は、押し付け圧力情報及び送気吸引情報に基づいて、内視鏡を操作するユーザのスキル評価を行う。また、出力処理部3130は、スキル評価の結果であるスキル評価情報を出力する。 Thus, the processing system 3100 of this modification includes the acquisition unit 3110, the processing unit 3120, and the output processing unit 3130. The acquisition unit 3110 acquires pressing pressure information of the insertion section in a treatment using an endoscope, which is a flexible scope, and air supply/suction information regarding air supply and suction. The processing unit 3120 performs skill evaluation of the user operating the endoscope based on the pressing pressure information and the air supply/suction information. The output processing unit 3130 outputs skill evaluation information that is the result of the skill evaluation.
 なおこの変形例の処理システム3100が行う処理は、情報処理方法として実現されてもよい。情報処理方法は、軟性鏡である内視鏡を用いた処置における挿入部の押し付け圧力情報と、送気及び吸引に関する送気吸引情報を取得し、押し付け圧力情報及び送気吸引情報に基づいて、内視鏡を操作するユーザのスキル評価を行い、スキル評価の結果であるスキル評価情報を出力する。 Note that the processing performed by the processing system 3100 of this modification may be implemented as an information processing method. The information processing method acquires pressing pressure information of the insertion section in a treatment using an endoscope, which is a flexible scope, and air supply/suction information regarding air supply and suction; performs skill evaluation of the user operating the endoscope based on the pressing pressure information and the air supply/suction information; and outputs skill evaluation information that is the result of the skill evaluation.
 この変形例の手法によれば、押し付け圧力情報及び送気吸引情報の両方に基づいてユーザのスキル評価を行うことができることから、術者のスキルについて、精度を高く評価することができる。 According to the method of this modified example, the user's skill can be evaluated based on both the pressing pressure information and the air supply/suction information, so the operator's skill can be evaluated with high accuracy.
 図35は、内視鏡システム3300の構成を示す図である。内視鏡システム3300は、スコープ部3310と、処理装置3330と、表示部3340と、光源装置3350とを含む。術者は、内視鏡システム3300を用いて患者の内視鏡検査を行う。ただし、内視鏡システム3300の構成は図35に限定されず、一部の構成要素を省略したり、他の構成要素を追加したりする等の種々の変形実施が可能である。なお、図35においては、図36で後述する吸引装置3370と送気送水装置3380等の図示は省略している。 FIG. 35 is a diagram showing the configuration of an endoscope system 3300. The endoscope system 3300 includes a scope section 3310, a processing device 3330, a display section 3340, and a light source device 3350. The operator uses the endoscope system 3300 to perform an endoscopic examination of a patient. However, the configuration of the endoscope system 3300 is not limited to that shown in FIG. 35, and various modifications such as omitting some components or adding other components are possible. In FIG. 35, illustration of the suction device 3370, the air/water supply device 3380, and the like, which will be described later with reference to FIG. 36, is omitted.
 また図35においては、処理装置3330が、コネクタ3310dによってスコープ部3310と接続される1つの装置である例を示したがこれには限定されない。例えば、処理装置3330の一部又は全部の構成は、ネットワークを介して接続可能なPCやサーバシステム等の他の情報処理装置によって構築されてもよい。例えば、処理装置3330はクラウドコンピューティングによって実現されてもよい。 Also, FIG. 35 shows an example in which the processing device 3330 is one device connected to the scope section 3310 via the connector 3310d, but it is not limited to this. For example, part or all of the processing device 3330 may be configured by other information processing devices such as a PC or a server system that can be connected via a network. For example, the processing unit 3330 may be implemented by cloud computing.
 スコープ部3310は、操作部3310aと、可撓性を有する挿入部3310bと、信号線などを含むユニバーサルケーブル3310cとを有する。スコープ部3310は、管状の挿入部3310bを体腔内に挿入する管状挿入装置である。ユニバーサルケーブル3310cの先端にはコネクタ3310dが設けられる。スコープ部3310は、コネクタ3310dによって、光源装置3350及び処理装置3330と着脱可能に接続される。さらに、図36を用いて後述するように、ユニバーサルケーブル3310c内には、ライトガイド3315が挿通されており、スコープ部3310は、光源装置3350からの照明光を、ライトガイド3315を通して挿入部3310bの先端から出射する。 The scope section 3310 has an operation section 3310a, a flexible insertion section 3310b, and a universal cable 3310c including signal lines and the like. The scope section 3310 is a tubular insertion device that inserts the tubular insertion section 3310b into a body cavity. A connector 3310d is provided at the end of the universal cable 3310c. The scope section 3310 is detachably connected to the light source device 3350 and the processing device 3330 by the connector 3310d. Furthermore, as will be described later with reference to FIG. 36, a light guide 3315 is inserted through the universal cable 3310c, and the scope section 3310 emits illumination light from the light source device 3350 through the light guide 3315 from the distal end of the insertion section 3310b.
 例えば挿入部3310bは、挿入部3310bの先端から基端に向かって、先端部3011と、湾曲可能な湾曲部3012と、軟性部3013とを有している。挿入部3310bは、被写体に挿入される。挿入部3310bの先端部3011は、スコープ部3310の先端部であり、硬い先端硬質部である。後述する対物光学系3311や撮像素子3312は、例えば先端部3011に設けられる。 For example, the insertion portion 3310b has a distal end portion 3011, a bendable bending portion 3012, and a flexible portion 3013 from the distal end to the proximal end of the insertion portion 3310b. Insert portion 3310b is inserted into the subject. A distal end portion 3011 of the insertion portion 3310b is a distal end portion of the scope portion 3310 and is a hard distal end rigid portion. An objective optical system 3311 and an imaging element 3312, which will be described later, are provided at the distal end portion 3011, for example.
 湾曲部3012は、操作部3310aに設けられた湾曲操作部材に対する操作に応じて、所望の方向に湾曲可能である。湾曲操作部材は、例えば左右湾曲操作ノブ及び上下湾曲操作ノブを含む。また操作部3310aには、湾曲操作部材の他にも、レリーズボタン、送気送水ボタン等の各種操作ボタンが設けられている。 The bending portion 3012 can bend in a desired direction according to the operation of the bending operation member provided in the operation portion 3310a. The bending operation member includes, for example, a horizontal bending operation knob and a vertical bending operation knob. In addition to the bending operation member, the operation portion 3310a is provided with various operation buttons such as a release button and an air/water supply button.
 処理装置3330は、受信した撮像信号に対して所定の画像処理を行い、撮像画像を生成するビデオプロセッサである。生成された撮像画像の映像信号は、処理装置3330から表示部3340へ出力され、撮像画像が、表示部3340上にリアルタイムに表示される。処理装置3330の構成については後述する。表示部3340は、例えば液晶ディスプレイやELディスプレイ等である。 The processing device 3330 is a video processor that performs predetermined image processing on the received imaging signal and generates a captured image. A video signal of the generated captured image is output from the processing device 3330 to the display unit 3340, and the captured image is displayed on the display unit 3340 in real time. The configuration of the processing device 3330 will be described later. The display unit 3340 is, for example, a liquid crystal display, an EL display, or the like.
 光源装置3350は、通常観察モード用の白色光を出射可能な光源装置である。なお、光源装置3350は、通常観察モード用の白色光と、狭帯域光等の特殊光とを選択的に出射可能であってもよい。 A light source device 3350 is a light source device capable of emitting white light for normal observation mode. The light source device 3350 may be capable of selectively emitting white light for normal observation mode and special light such as narrow band light.
 図36は、内視鏡システム3300の各部の構成を説明する図である。なお図36では、スコープ部3310の一部の構成を省略、簡略化している。 FIG. 36 is a diagram explaining the configuration of each part of the endoscope system 3300. Note that in FIG. 36, part of the configuration of the scope section 3310 is omitted or simplified.
 光源装置3350は、照明光を発光する光源3352を含む。光源3352は、キセノン光源であってもよいし、LEDであってもよいし、レーザー光源であってもよい。また光源3352は他の光源であってもよく、発光方式は限定されない。 A light source device 3350 includes a light source 3352 that emits illumination light. The light source 3352 may be a xenon light source, an LED, or a laser light source. Also, the light source 3352 may be another light source, and the light emission method is not limited.
 挿入部3310bは、対物光学系3311、撮像素子3312、照明レンズ3314、ライトガイド3315、吸引管3317、送気送水管3319を含む。ライトガイド3315は、光源3352からの照明光を、挿入部3310bの先端まで導光する。照明レンズ3314は、ライトガイド3315によって導光された照明光を被写体に照射する。対物光学系3311は、被写体から反射した反射光を、被写体像として結像する。 The insertion section 3310b includes an objective optical system 3311, an imaging element 3312, an illumination lens 3314, a light guide 3315, a suction tube 3317, and an air/water supply tube 3319. The light guide 3315 guides illumination light from the light source 3352 to the tip of the insertion portion 3310b. The illumination lens 3314 irradiates the subject with the illumination light guided by the light guide 3315 . The objective optical system 3311 forms a subject image by reflecting light reflected from the subject.
 撮像素子3312は、対物光学系3311を経由した被写体からの光を受光する。撮像素子3312はモノクロセンサであってもよいし、カラーフィルタを備えた素子であってもよい。カラーフィルタは、広く知られたベイヤフィルタであってもよいし、補色フィルタであってもよいし、他のフィルタであってもよい。補色フィルタとは、シアン、マゼンタ及びイエローの各色フィルタを含むフィルタである。 The imaging element 3312 receives light from the subject via the objective optical system 3311 . The imaging element 3312 may be a monochrome sensor or an element with color filters. The color filter may be a well-known Bayer filter, a complementary color filter, or other filters. Complementary color filters are filters that include cyan, magenta, and yellow color filters.
 吸引管3317は、所定の場合に、吸引装置3370を起動させ、液体等を吸引する。所定の場合とは、例えば、胃液等が診断の妨げになる場合や、後述する送水を処置終了に伴い回収する場合等であるが、他の場合であってもよい。なお、吸引装置3370は、不図示の吸引ポンプや回収タンク等を含むことにより実現される。また、吸引装置3370は、後述の制御部3332と接続され、不図示の吸引ボタンを押下すること等により、開口部3316と吸引管3317を通じて、液体等が、前述の回収タンクに回収される。なお、この変形例においては、開口部3316は、後述の処置具3360が突出するときの開口部を兼ねている。また、図36においては、処置具3360及び処置具3360を収納する管等の図示は省略している。 The suction tube 3317 activates the suction device 3370 in a predetermined case to suction liquid and the like. Predetermined cases include, for example, a case where gastric juice or the like interferes with diagnosis, a case where water is collected after treatment is completed, and the like, but other cases are also possible. The suction device 3370 is realized by including a suction pump, recovery tank, and the like (not shown). Also, the suction device 3370 is connected to the control unit 3332 described later, and by pressing a suction button (not shown) or the like, the liquid or the like is collected in the collection tank described above through the opening 3316 and the suction pipe 3317 . In this modified example, the opening 3316 also serves as an opening from which a treatment instrument 3360, which will be described later, protrudes. Also, in FIG. 36, illustration of the treatment instrument 3360 and a tube housing the treatment instrument 3360 is omitted.
 送気送水管3319は、特定の場合に、送気送水装置3380を起動させ、送気又は送水を行う。特定の場合とは、例えば、病変付近の残渣を洗浄したい場合や、病変付近の周辺を内側から拡張したい場合等であるが、他の場合であってもよい。なお、送気送水装置3380は、不図示のポンプ、ガスボンベ、送水タンク等を含むことにより実現される。また、送気送水装置3380は、後述の制御部3332と接続され、不図示の送気ボタンや送水ボタンを押下すること等により、送気送水管3319を通じてノズル3318から、気体又は液体が噴出される。なお、図36では、送気送水管3319を模式的に1本で図示しているが、気体が通る管と液体が通る管を並設し、ノズル3318の手前でそれぞれの管を結合してもよい。 The air/water supply pipe 3319 activates the air/water supply device 3380 to supply air or water in specific cases. Specific cases include, for example, a case where it is desired to wash away residue near a lesion or a case where it is desired to expand the area around a lesion from the inside, but other cases are also possible. The air/water supply device 3380 is realized by including a pump, a gas cylinder, a water supply tank, and the like (not shown). The air/water supply device 3380 is connected to the control section 3332 described later, and when an air supply button or a water supply button (not shown) is pressed, gas or liquid is ejected from the nozzle 3318 through the air/water supply pipe 3319. Note that although FIG. 36 schematically shows the air/water supply pipe 3319 as a single pipe, a pipe through which gas passes and a pipe through which liquid passes may be arranged side by side and joined just before the nozzle 3318.
 処理装置3330は、画像処理やシステム全体の制御を行う。処理装置3330は、前処理部3331、制御部3332、記憶部3333、検出処理部3335、後処理部3336を含む。 The processing device 3330 performs image processing and control of the entire system. The processing device 3330 includes a pre-processing section 3331 , a control section 3332 , a storage section 3333 , a detection processing section 3335 and a post-processing section 3336 .
 前処理部3331は、撮像素子3312から順次出力されるアナログ信号をデジタルの画像に変換するA/D変換と、A/D変換後の画像データに対する各種補正処理を行う。なお、撮像素子3312にA/D変換回路が設けられ、前処理部3331におけるA/D変換が省略されてもよい。ここでの補正処理とは、例えばカラーマトリクス補正処理、構造強調処理、ノイズ低減処理、AGC等を含む。また前処理部3331は、ホワイトバランス処理等の他の補正処理を行ってもよい。前処理部3331は、処理後の画像を入力画像として検出処理部3335に出力する。また前処理部3331は、処理後の画像を表示画像として、後処理部3336に出力する。 A preprocessing unit 3331 performs A/D conversion for converting analog signals sequentially output from the image sensor 3312 into digital images, and various correction processes for image data after A/D conversion. Note that an A/D conversion circuit may be provided in the image sensor 3312 and the A/D conversion in the preprocessing section 3331 may be omitted. The correction processing here includes, for example, color matrix correction processing, structure enhancement processing, noise reduction processing, AGC, and the like. The preprocessing unit 3331 may also perform other correction processing such as white balance processing. The preprocessing unit 3331 outputs the processed image to the detection processing unit 3335 as an input image. The pre-processing unit 3331 also outputs the processed image to the post-processing unit 3336 as a display image.
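As a rough illustration of the kind of correction processing listed above, the sketch below applies an AGC-like digital gain and a 3-tap moving-average noise reduction to a single scanline. The target level and tap count are assumptions, and color-matrix correction and structure enhancement are omitted.

```python
# Sketch: a simplified correction chain after A/D conversion --
# AGC-style gain followed by a small noise-reduction filter.
# Target level and filter width are illustrative assumptions.

def apply_agc(line, target_mean=128.0):
    """Scale pixel values so the line's mean approaches target_mean."""
    mean = sum(line) / len(line)
    gain = target_mean / mean if mean else 1.0
    return [min(255.0, v * gain) for v in line]

def denoise(line):
    """3-tap moving average; edge pixels are passed through."""
    out = list(line)
    for i in range(1, len(line) - 1):
        out[i] = (line[i - 1] + line[i] + line[i + 1]) / 3.0
    return out

line = [60, 64, 200, 62, 61]           # one raw scanline with an impulse
corrected = denoise(apply_agc(line))
print(corrected)
```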
 検出処理部3335は、入力画像から病変等の注目領域を検出する検出処理を行う。ただしこの変形例では、注目領域の検出処理は必須ではなく、検出処理部3335は省略が可能である。 The detection processing unit 3335 performs detection processing for detecting a region of interest such as a lesion from the input image. However, in this modification, the attention area detection processing is not essential, and the detection processing unit 3335 can be omitted.
 後処理部3336は、前処理部3331、検出処理部3335の出力に基づく後処理を行い、後処理後の画像を表示部3340に出力する。例えば後処理部3336は、表示画像に対して、検出処理部3335における検出結果を付加し、付加後の画像を表示する処理を行ってもよい。術者であるユーザは、表示部3340に表示される画像を見ながら、生体内の病変領域に対する処置を行う。ここでの処置は、例えば前述のEMRやESD等の病変を切除するための処置である。 A post-processing unit 3336 performs post-processing based on the outputs of the pre-processing unit 3331 and the detection processing unit 3335 and outputs the post-processed image to the display unit 3340 . For example, the post-processing unit 3336 may add the detection result of the detection processing unit 3335 to the display image and display the added image. A user who is an operator treats a lesion area in the living body while viewing an image displayed on the display unit 3340 . The treatment here is, for example, treatment for resecting lesions such as EMR and ESD described above.
 制御部3332は、撮像素子3312、前処理部3331、検出処理部3335、後処理部3336、光源3352と互いに接続され、各部を制御する。 The control unit 3332 is connected to the imaging element 3312, the preprocessing unit 3331, the detection processing unit 3335, the postprocessing unit 3336, and the light source 3352, and controls each unit.
 例えば処理システム3100が処理装置3330に含まれる場合、図36の構成に取得部3110、処理部3120及び出力処理部3130が追加される。取得部3110は、例えば制御部3332の制御情報に基づいて入力データや後述する送気吸引情報を取得する。また取得部3110は、例えば挿入部3310bに設けられるモーションセンサのセンサ情報に基づいて後述する押し付け圧力情報を取得する。処理部3120は、押し付け圧力情報と送気吸引情報を用いてスキル評価を行う。出力処理部3130は、表示部3340や、内視鏡システム3300と接続される外部機器にスキル評価情報を出力する。 For example, if the processing system 3100 is included in the processing device 3330, an acquisition unit 3110, a processing unit 3120, and an output processing unit 3130 are added to the configuration of FIG. The acquisition unit 3110 acquires input data and air supply/suction information, which will be described later, based on control information from the control unit 3332, for example. The acquisition unit 3110 also acquires pressing pressure information, which will be described later, based on sensor information from a motion sensor provided in the insertion unit 3310b, for example. The processing unit 3120 performs skill evaluation using the pressing pressure information and the air supply/suction information. The output processing unit 3130 outputs skill evaluation information to the display unit 3340 and external devices connected to the endoscope system 3300 .
 図37は、挿入部3310bの先端部3011の構成を説明する図である。図37に示すように、先端部3011の断面形状は略円形であり、図36を用いて上述したように、対物光学系3311及び照明レンズ3314が設けられる。また、挿入部3310bには操作部3310aから先端部3011の開口部3316までつながる空洞であるチャンネルが設けられる。ここでの開口部3316は、いわゆる鉗子口と呼ばれる処置具3360用の開口である。 FIG. 37 is a diagram illustrating the configuration of the distal end portion 3011 of the insertion portion 3310b. As shown in FIG. 37, the distal end portion 3011 has a substantially circular cross-sectional shape and is provided with the objective optical system 3311 and the illumination lens 3314 as described above with reference to FIG. 36. The insertion portion 3310b is also provided with a channel, which is a cavity connecting the operation portion 3310a to the opening 3316 of the distal end portion 3011. The opening 3316 here is an opening for the treatment instrument 3360, a so-called forceps opening.
 図37に示すように、術者は、処置具3360を当該チャンネルに挿通し、開口部3316から処置具3360の先端部分を突出させることによって、処置対象組織に対する処置を行う。なお、図37では、2系統の照明レンズ3314、1つの対物光学系3311、1つの開口部3316、1つのノズル3318を有する先端部3011の構成を例示したが、具体的な構成は種々の変形実施が可能である。 As shown in FIG. 37 , the operator inserts the treatment instrument 3360 through the channel and protrudes the distal end portion of the treatment instrument 3360 from the opening 3316 to treat the treatment target tissue. Note that FIG. 37 illustrates the configuration of the distal end portion 3011 having two illumination lenses 3314, one objective optical system 3311, one opening 3316, and one nozzle 3318, but the specific configuration can be modified in various ways. Implementation is possible.
 なお、ここでの処置具3360は、生体に対する処置を行うための器具であり、例えば高周波スネアや高周波ナイフを含む。高周波ナイフは、ニードルナイフ、ITナイフ、フックナイフ等を含む。例えばESDのマーキングには、ニードルナイフが用いられる。切開には高周波ナイフが用いられる。剥離には高周波スネアや高周波ナイフが用いられる。また処置具3360は、注射針、鉗子、クリップ等の他の器具を含んでもよい。ESDの局注には注射針が用いられる。止血には鉗子やクリップが用いられる。 The treatment instrument 3360 here is an instrument for treating a living body, and includes, for example, a high-frequency snare and a high-frequency knife. High frequency knives include needle knives, IT knives, hook knives, and the like. For example, a needle knife is used for ESD marking. A high frequency knife is used for the incision. A high-frequency snare or high-frequency knife is used for peeling. The treatment instrument 3360 may also include other instruments such as injection needles, forceps, and clips. An injection needle is used for local injection of ESD. Forceps or clips are used to stop bleeding.
 また、処理システム3100は、内視鏡システム3300とは別体として設けられてもよい。図38は、処理システム3100を含むシステムの構成例を示す図である。図38に示すように、システムは、複数の内視鏡システム3300と、処理システム3100を含む。 Also, the processing system 3100 may be provided separately from the endoscope system 3300. FIG. 38 is a diagram showing a configuration example of a system including the processing system 3100. As shown in FIG. 38, the system includes a plurality of endoscope systems 3300 and the processing system 3100.
 例えば処理システム3100は、複数の内視鏡システム3300のそれぞれと、ネットワークを介して接続されるサーバシステムである。ここでのサーバシステムは、イントラネット等のプライベートネットワークに設けられるサーバであってもよいし、インターネット等の公衆通信網に設けられるサーバであってもよい。また処理システム3100は、1つのサーバ装置によって構成されてもよいし、複数のサーバ装置を含んでもよい。例えば処理システム3100は、複数の内視鏡システム3300から、押し付け圧力情報と送気吸引情報を収集するデータベースサーバと、スキル評価を行う処理サーバを含んでもよい。データベースサーバは、例えば後述するように、医師情報、患者情報等の他の情報を収集してもよい。 For example, the processing system 3100 is a server system connected to each of the plurality of endoscope systems 3300 via a network. The server system here may be a server provided in a private network such as an intranet, or a server provided in a public communication network such as the Internet. Also, the processing system 3100 may be configured by one server device, or may include a plurality of server devices. For example, the processing system 3100 may include a database server that collects pressing pressure information and air supply/suction information from a plurality of endoscope systems 3300, and a processing server that performs skill evaluation. The database server may also collect other information, such as physician information, patient information, etc., as described below.
 また、処理システム3100は、後述するように、機械学習に基づいてスキル評価を行ってもよい。例えば処理システム3100は、データベースサーバが収集したデータを学習データとする機械学習を行うことによって、学習済モデルを生成する学習サーバを含んでもよい。処理サーバは、学習サーバによって生成された学習済モデルに基づいて、スキル評価を行う。 Also, the processing system 3100 may perform skill evaluation based on machine learning, as described later. For example, the processing system 3100 may include a learning server that generates a trained model by performing machine learning using data collected by the database server as learning data. The processing server performs skill evaluation based on the trained model generated by the learning server.
 図38に示したように、処理システム3100が複数の内視鏡システム3300と接続可能である場合、効率的にデータを収集することが可能である。例えば機械学習に用いる学習データの量を増やすことが容易であるため、スキル評価精度をより高くすることが可能である。 As shown in FIG. 38, when the processing system 3100 can connect with a plurality of endoscope systems 3300, it is possible to efficiently collect data. For example, since it is easy to increase the amount of learning data used for machine learning, it is possible to improve the accuracy of skill evaluation.
 次に、押し付け圧力情報と送気吸引情報を求める具体的な方法について説明する。図39(A)及び図39(B)は、押し付け圧力について説明する図である。押し付け圧力とは、内視鏡の先端等が対象部位等を押す力である。ここでの先端等とは、図39(A)のように、先端部3011である。ただし、これに限らず、押し付け圧力を、突出した処置具3360が対象部位等を押す力としてもよいし、不図示のキャップが対象部位を押す力としてもよい。なお、キャップは、先端部3011を保護する。また、図39(B)に示すように、押し付け圧力を、先端部3011に限らず、湾曲部3012が対象部位等を押す力としてもよい。例えば、湾曲部3012を押し付けるようにすると、図39(B)のGに示した箇所の動きが規制される。これにより、先端部3011の動きが安定するため、ポジションの調整が容易になる。このような状況においては、押し付け圧力を、湾曲部3012が対象部位等を押す力とすることは便宜である。また、対象部位等とは、対象臓器等を指すが病変部位であるか、正常部位であるかは問わない。 Next, a specific method of obtaining the pressing pressure information and the air supply/suction information will be described. FIGS. 39(A) and 39(B) are diagrams for explaining the pressing pressure. The pressing pressure is the force with which the tip of the endoscope or the like presses the target site or the like. Here, the tip or the like is the tip portion 3011, as shown in FIG. 39(A). However, the pressing pressure is not limited to this, and may be the force with which the protruding treatment instrument 3360 presses the target site or the like, or the force with which a cap (not shown) presses the target site. Note that the cap protects the tip portion 3011. Further, as shown in FIG. 39(B), the pressing pressure is not limited to the tip portion 3011, and may be the force with which the bending portion 3012 presses the target site or the like. For example, when the bending portion 3012 is pressed against the target site, the movement of the portion indicated by G in FIG. 39(B) is restricted. This stabilizes the movement of the tip portion 3011, thereby facilitating position adjustment. In such a situation, it is convenient to define the pressing pressure as the force with which the bending portion 3012 presses the target site or the like. The target site or the like refers to the target organ or the like, regardless of whether it is a lesion site or a normal site.
 押し付け圧力を求めるための構成については種々の手法が考えられる。例えば内視鏡システム3300は、挿入部3310bの先端部3011に設けられる不図示の圧力センサを含むことで、押し付け圧力を求めることが実現できる。なお、ここでの圧力センサの圧力検出方式は、例えば、歪みゲージ回路型圧力検出方式であるが、静電容量型圧力検出方式等、他の方式であってもよい。 Various methods are conceivable for the configuration for obtaining the pressing pressure. For example, the endoscope system 3300 includes a pressure sensor (not shown) provided at the distal end portion 3011 of the insertion portion 3310b to obtain the pressing pressure. The pressure detection method of the pressure sensor here is, for example, a strain gauge circuit type pressure detection method, but may be another method such as a capacitance type pressure detection method.
 また、例えば、撮像画像に基づいて押し付けられた対象部位の移動量を求め、当該移動量から接触圧力を推定する手法によって、押し付け圧力を求めてもよい。さらに、この推定手法と、前述の圧力センサを用いる手法とを組み合わせても押し付け圧力を求めてもよい。例えば、先端部3011に含まれる圧力センサからは圧力が検出されていないが、撮像画像から臓器がたわんでいるように見えるのであれば、湾曲部3012によって押し付けられていることが推定できる。なお、撮像画像から接触圧力を推定する手法は公知のため、説明は省略する。 Also, for example, the pressing pressure may be obtained by a method of obtaining the movement amount of the pressed target part based on the captured image and estimating the contact pressure from the movement amount. Furthermore, the pressing pressure may be obtained by combining this estimation method with the above-described method using a pressure sensor. For example, if no pressure is detected by the pressure sensor included in the distal end portion 3011 but the organ appears to be bent from the captured image, it can be estimated that the bending portion 3012 is pressing against it. Since the method of estimating the contact pressure from the captured image is well known, the description thereof will be omitted.
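The combination described above can be sketched as follows. This is an illustrative assumption, not the patented algorithm: the function name, the image-based deflection input, and the linear `kpa_per_mm` conversion are hypothetical stand-ins for the sensor interface and the known image-based estimation method.

```python
# Hypothetical sketch: combine a tip pressure-sensor reading with an
# image-based estimate so that pressing by the bending portion 3012 (which
# the tip sensor cannot feel) is still detected. Values are illustrative.

def estimate_pressing_pressure(sensor_kpa: float, image_deflection_mm: float,
                               kpa_per_mm: float = 0.8) -> float:
    """Return an estimated pressing pressure in kPa.

    sensor_kpa: value from the tip pressure sensor (0.0 if no contact felt).
    image_deflection_mm: organ-surface displacement estimated from the
        captured image (0.0 if the organ does not appear deformed).
    kpa_per_mm: assumed linear conversion from displacement to pressure.
    """
    image_kpa = image_deflection_mm * kpa_per_mm
    # If the tip sensor reads nothing but the image shows the organ bending,
    # the bending portion is presumed to be pressing; use the image estimate.
    if sensor_kpa == 0.0 and image_kpa > 0.0:
        return image_kpa
    # Otherwise take the larger of the direct and image-based estimates.
    return max(sensor_kpa, image_kpa)
```

As in the text, a zero sensor reading combined with visible organ deflection is interpreted as pressing by the bending portion rather than the tip.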
 以上に説明した手法によって測定された押し付け圧力情報のデータを制御部3332に送信すること等により、取得部3110が押し付け圧力情報を取得することを実現することができる。なお、押し付け圧力情報の取得についての具体的な処理等については図47、図50、図51等で後述する。送気吸引情報の取得を用いた処理についても同様である。 By transmitting the pressing pressure information data measured by the method described above to the control unit 3332, etc., it is possible for the acquiring unit 3110 to acquire the pressing pressure information. Note that specific processing for obtaining pressing pressure information will be described later with reference to FIGS. 47, 50, 51, and the like. The same applies to processing using the acquisition of air supply/suction information.
 次に、送気吸引情報について説明する。送気吸引情報を求めるための構成については種々の手法が考えられる。例えば、不図示の吸引ボタンや送気ボタン等を押した時間と、前述の吸引装置3370や送気送水装置3380にて設定されている設定流量に基づいて、送気吸引情報を求めることができる。また、不図示の流量計を、所定の位置に設けて、当該流量計から流量データに基づいて送気吸引情報を求めることもできる。所定の位置とは、例えば、吸引管3317や送気送水管3319であるが、吸引装置3370や送気送水装置3380の内部でもよい。測定された流量をデータに変換し、制御部3332に送信すること等により、取得部3110が送気吸引情報を取得することを実現することができる。 Next, the air supply/suction information will be described. Various configurations are conceivable for obtaining the air supply/suction information. For example, the air supply/suction information can be obtained from the time during which a suction button or air supply button (not shown) was pressed and the set flow rate configured in the suction device 3370 or the air/water supply device 3380 described above. Alternatively, a flow meter (not shown) may be provided at a predetermined position, and the air supply/suction information may be obtained based on flow rate data from the flow meter. The predetermined position is, for example, the suction pipe 3317 or the air/water supply pipe 3319, but may be inside the suction device 3370 or the air/water supply device 3380. By converting the measured flow rate into data and transmitting it to the control unit 3332 or the like, the acquisition unit 3110 can acquire the air supply/suction information.
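The first method above, deriving air supply/suction information from button-press durations and the configured flow rates, amounts to a simple duration-times-flow-rate computation. A minimal sketch follows; the event format and unit choices are assumptions for illustration.

```python
# Sketch: derive a net insufflated gas volume from how long each button was
# held and the flow rates set on the suction device 3370 / air-water supply
# device 3380. Event format (kind, seconds) is an illustrative assumption.

def net_insufflated_volume_ml(events, supply_flow_ml_s, suction_flow_ml_s):
    """events: list of (kind, seconds), kind is 'supply' or 'suction'.
    Returns the net volume of gas delivered into the organ (ml)."""
    net = 0.0
    for kind, seconds in events:
        if kind == "supply":
            net += seconds * supply_flow_ml_s   # air supplied
        elif kind == "suction":
            net -= seconds * suction_flow_ml_s  # gas removed
        else:
            raise ValueError(f"unknown event kind: {kind}")
    return net
```

For example, 2 s of supply at 50 ml/s followed by 1 s of suction at 30 ml/s yields a net of 70 ml.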
 なお、前述の通り、押し付け圧力は臓器の張り具合に依存するので、対象部位付近の気圧を測定することで、臓器の張り具合に関する情報が得られるとも思われる。しかし、臓器の張り具合は、他の要因にも依存するので、気圧の情報から臓器の張り具合を見積もることは困難である。他の要因とは、例えば、臓器の柔らかさや、送気した気体が他の臓器内部へ拡散すること等である。 In addition, as mentioned above, the pressing pressure depends on the tension of the organ, so it is thought that information on the tension of the organ can be obtained by measuring the air pressure near the target area. However, since the tightness of an organ depends on other factors, it is difficult to estimate the tightness of an organ from atmospheric pressure information. Other factors include, for example, the softness of organs and the diffusion of supplied gas into other organs.
 以上のことから、押し付け圧力情報を得るために圧力センサ等が必要で、送気吸引情報を得るために流量計等が必要だが、これらは所望のタイミングにデジタルデータとして取得できることは公知であるため、押し付け圧力情報と送気吸引情報を同一のタイミングで取得することで、後述するログ情報を作成することや、後述のニューラルネットワークNN2に押し付け圧力情報の時系列データや送気吸引情報の時系列データを入力することが実現できる。なお、ここでの同一は略同一を含む。 From the above, a pressure sensor or the like is needed to obtain the pressing pressure information, and a flow meter or the like is needed to obtain the air supply/suction information; since it is well known that these values can be acquired as digital data at any desired timing, acquiring the pressing pressure information and the air supply/suction information at the same timing makes it possible to create the log information described later and to input time-series data of the pressing pressure information and of the air supply/suction information to the neural network NN2 described later. Note that "the same" here includes substantially the same.
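The simultaneous acquisition described above can be sketched as sampling both sources at shared timestamps. `read_pressure` and `read_flow` are hypothetical stand-ins for the sensor and flow-meter interfaces, and the record layout is an assumption, not the patented data structure.

```python
# Illustrative sketch: sample the pressure sensor and the flow meter at the
# same timestamps so the two time series line up, both for the log display
# and as time-series input for the neural network NN2 described later.

from dataclasses import dataclass

@dataclass
class LogSample:
    t_ms: int            # shared timestamp (ms)
    pressure_kpa: float  # pressing pressure at t_ms
    flow_ml_s: float     # positive = air supply, negative = suction

def build_log(timestamps_ms, read_pressure, read_flow):
    """Collect pressure and flow readings at identical timestamps."""
    return [LogSample(t, read_pressure(t), read_flow(t)) for t in timestamps_ms]
```

Because each record carries one shared timestamp, the pressure series and the supply/suction series stay aligned without any later resampling.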
 次に、図40、図41、図42、図43、図44、図45及び図46を用いて、スキル評価情報について詳細に説明する。術者は所定の処置を行い、当該処置から所定の期間が経過した後、スキル評価情報として図40に示すスキル評価シート3400が、所定の表示部に出力される。これにより、出力処理部3130によるスキル評価情報の出力が実現される。なお、所定の期間とは、例えば、当該処置の対象となった患者が退院するまでの期間等である。また、所定の表示部とは、例えば、前述の表示部3340であるが、内視鏡システム3300と接続される外部機器の表示部でもよい。また、所定の処置は、例えば前述した、複数の段階を含むESDであるが、複数の段階を含む他の処置であってもよい。言い換えれば、内視鏡の処置具3360による処置は複数の段階を含む。 Next, the skill evaluation information will be described in detail using FIGS. 40, 41, 42, 43, 44, 45, and 46. The operator performs a predetermined treatment, and after a predetermined period has passed since the treatment, a skill evaluation sheet 3400 shown in FIG. 40 is output to a predetermined display unit as the skill evaluation information. In this way, the output of the skill evaluation information by the output processing unit 3130 is realized. The predetermined period is, for example, the period until the patient who underwent the treatment is discharged from the hospital. The predetermined display unit is, for example, the display unit 3340 described above, but may be the display unit of an external device connected to the endoscope system 3300. The predetermined treatment is, for example, the ESD including multiple stages as described above, but may be another treatment including multiple stages. In other words, the treatment with the endoscopic treatment tool 3360 includes multiple stages.
 スキル評価シート3400は、例えば、医師情報アイコン3410と症例シート3420とを含む。症例シート3420において、例えば総合評価の結果と、当該総合評価の内訳として処置の各段階のスキル評価の結果が表示される。言い換えれば、出力処理部3130は、複数の段階の各段階におけるスキル評価情報を出力する。これにより、段階ごとに細分化してスキルを評価できるため、スキル評価の精度をより高くすることができる。例えば、マーキング、局注、切開、剥離の段階についてスキル評価を行うが、図40に示すように、さらに止血を含めた5つの段階についてスキル評価を行ってもよい。また、評価の対象となる処置の段階は5つに限られず、他の段階を追加してもよいし、2つ以上であれば減らしてもよい。言い換えれば、複数の段階は、マーキング段階、局注段階、切開段階及び剥離段階のうち少なくとも2つを含む。これにより、ESD等の処置をより細かく分類してスキル評価を行うことができる。より具体的には、それぞれの段階の評価結果について、マーキング評価アイコン3440、局注評価アイコン3442、切開評価アイコン3444、剥離評価アイコン3446、止血評価アイコン3448で表示し、それぞれのアイコンをA、B、C、Dのいずれかで表示する。さらに、これらの評価結果をレーダーチャート形式で表示してもよい。このようにすることで、術者のスキルの優劣を2次元的に表示できるので、スキル評価を視覚的かつ容易に把握することができる。なお、ここではAが最も評価が高く熟練者と同等の旨のランクであり、Dが最も評価が低いランクとするが、評価の高低については点数で表示する等、様々な方法で実現できる。また、スキル評価の表示形式はレーダーチャートに限らず、棒グラフや折れ線グラフ等の形式で実現してもよく、種々の変形実施が可能である。さらに、スキル評価の表示形式を変更可能な仕様にしてもよい。 The skill evaluation sheet 3400 includes, for example, a doctor information icon 3410 and a case sheet 3420. The case sheet 3420 displays, for example, the result of a comprehensive evaluation and, as its breakdown, the results of the skill evaluation for each stage of the treatment. In other words, the output processing unit 3130 outputs skill evaluation information for each of the multiple stages. Since the skill can thus be evaluated stage by stage, the accuracy of the skill evaluation can be further improved. For example, skill evaluation is performed for the marking, local injection, incision, and peeling stages, but as shown in FIG. 40, skill evaluation may be performed for five stages that further include hemostasis. The number of treatment stages to be evaluated is not limited to five; other stages may be added, or the number may be reduced as long as it is two or more. In other words, the multiple stages include at least two of a marking stage, a local injection stage, an incision stage, and a peeling stage. This makes it possible to classify a treatment such as ESD more finely for skill evaluation. More specifically, the evaluation result of each stage is displayed with a marking evaluation icon 3440, a local injection evaluation icon 3442, an incision evaluation icon 3444, a peeling evaluation icon 3446, and a hemostasis evaluation icon 3448, each of which is displayed as A, B, C, or D. Furthermore, these evaluation results may be displayed in a radar chart format. In this way, the relative strengths of the operator's skills can be displayed two-dimensionally, so the skill evaluation can be grasped visually and easily. Here, A is the highest rank, indicating a level equivalent to an expert, and D is the lowest rank; the level of the evaluation can also be expressed in various other ways, for example as a numerical score. The display format of the skill evaluation is not limited to a radar chart and may be realized as a bar graph, a line graph, or the like, and various modifications are possible. Furthermore, the display format of the skill evaluation may be made changeable.
 また、各段階について、さらにアドバイス情報を出力してもよい。アドバイス情報は、具体的には、押し付け圧力情報及び送気吸引情報の少なくとも一方に関するアドバイスの情報である。言い換えれば、出力処理部3130は、押し付け圧力情報及び送気吸引情報の少なくとも一方に関するアドバイス情報を出力してもよい。例えば、図41に示すように、表示画面上で剥離評価アイコン3446を選択すると剥離アドバイス表示3476が表示され、止血評価アイコン3448を選択すると止血アドバイス表示3478が表示され、押し付け圧力情報や送気吸引情報に関するアドバイスが表示される。なお、同様に、マーキングアドバイス表示、局注アドバイス表示、切開アドバイス表示も表示されるが、図41では省略する。なお、アドバイスの表示方法は図41に示す方法に限られず、例えば、別画面に表示する等、様々な変形実施が可能である。また、図示は省略するが、総合評価アイコン3430を選択すると総合評価に関するアドバイスを表示してもよい。これにより、精度の高いスキル評価をすることができるとともに、術者に対して具体的な情報を提供することができる。 Further, advice information may be output for each stage. Specifically, the advice information is advice regarding at least one of the pressing pressure information and the air supply/suction information. In other words, the output processing unit 3130 may output advice information regarding at least one of the pressing pressure information and the air supply/suction information. For example, as shown in FIG. 41, selecting the peeling evaluation icon 3446 on the display screen displays a peeling advice display 3476, and selecting the hemostasis evaluation icon 3448 displays a hemostasis advice display 3478, each showing advice regarding the pressing pressure information and the air supply/suction information. Similarly, a marking advice display, a local injection advice display, and an incision advice display are also shown, but they are omitted in FIG. 41. The method of displaying advice is not limited to the method shown in FIG. 41; various modifications are possible, such as displaying the advice on a separate screen. Although not illustrated, selecting the comprehensive evaluation icon 3430 may display advice regarding the comprehensive evaluation. This enables highly accurate skill evaluation and provides concrete information to the operator.
 アドバイス情報は、例えば、図41の剥離アドバイス表示3476に示すように、押し付け圧力情報や送気吸引情報について、評価対象の術者のデータと熟練者のデータと比較し、その差異情報を含むアドバイスを表示する。以降、熟練医による手術によって取得された押し付け圧力情報、送気吸引情報等をエキスパートデータと言うことがある。言い換えれば、出力処理部3130は、押し付け圧力情報及び送気吸引情報の少なくとも一方に関するエキスパートデータに対する差異をアドバイス情報として表示する。これにより、精度の高いスキル評価をすることができるとともに、術者に対して、エキスパートデータとの差異という、より具体的な情報を提供することができる。なお、アドバイス情報は、他の情報を含んでもよい。例えば、マーキング段階にて熟練者と同等の旨の評価結果が得られた旨を確認的に表示してもよいし、評価結果の理由を表示してもよいし、後述するログ情報を術者に参照させる旨のアドバイスを表示してもよい。 As shown in the peeling advice display 3476 in FIG. 41, for example, the advice information compares the evaluated operator's data with an expert's data for the pressing pressure information and the air supply/suction information, and displays advice including the difference information. Hereinafter, pressing pressure information, air supply/suction information, and the like acquired from operations performed by skilled doctors may be referred to as expert data. In other words, the output processing unit 3130 displays, as advice information, the difference from the expert data regarding at least one of the pressing pressure information and the air supply/suction information. This enables highly accurate skill evaluation and provides the operator with more concrete information, namely the difference from the expert data. The advice information may include other information. For example, it may be displayed, for confirmation, that an evaluation result equivalent to an expert was obtained in the marking stage, the reason for the evaluation result may be displayed, or advice prompting the operator to refer to the log information described later may be displayed.
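The comparison behind such advice can be sketched as follows. The summary statistics chosen here (mean level and peak-to-peak swing of the pressing-pressure series) are illustrative assumptions, not the patented comparison method.

```python
# Minimal sketch: compare an operator's pressing-pressure time series with
# expert data for the same stage and report the difference information that
# an advice display could present. Statistics are illustrative choices.

def compare_with_expert(operator_series, expert_series):
    """Return difference information between operator and expert data."""
    def mean(xs):
        return sum(xs) / len(xs)
    def swing(xs):
        return max(xs) - min(xs)  # peak-to-peak amplitude
    return {
        "mean_diff": mean(operator_series) - mean(expert_series),
        "swing_diff": swing(operator_series) - swing(expert_series),
    }
```

A positive `swing_diff`, for instance, would indicate that the operator's pressing pressure fluctuated more widely than the expert's, which matches the kind of difference the log comparison in FIGS. 42 and 43 illustrates.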
 次に、図42~図44を用いて、ログ情報の例について説明する。図42は熟練医が行った手術において押し付け圧力情報と、送気吸引情報のログ情報の例を模式的に示した図である。一方、図43は修練医が行った手術において押し付け圧力情報と、送気吸引情報のログ情報の例を模式的に示した図である。なお、図42、図43は押し付け圧力と送気吸引の関係を模式的に示したものであり、縦軸の大きさが具体的な力の大きさを示すものではなく、横軸の長さが具体的な時間の長さを示すものではない。 Next, examples of the log information will be described with reference to FIGS. 42 to 44. FIG. 42 is a diagram schematically showing an example of log information of pressing pressure information and air supply/suction information in an operation performed by a skilled doctor. FIG. 43 is a diagram schematically showing an example of log information of pressing pressure information and air supply/suction information in an operation performed by a trainee doctor. Note that FIGS. 42 and 43 schematically show the relationship between pressing pressure and air supply/suction: the magnitude on the vertical axis does not indicate a specific magnitude of force, and the length of the horizontal axis does not indicate a specific length of time.
 この変形例のスキル評価情報は、押し付け圧力情報及び送気吸引情報のログ情報を含む。具体的には、図40において、個々の症例シート3420は、ログデータアイコン3450を含む。そして、ログデータアイコン3450を選択すると、図42に示す押し付け圧力情報のログ情報が表示される。これにより、出力処理部3130による押し付け圧力情報及び送気吸引情報のログ情報の出力を実現することができる。言い換えれば、出力処理部3130は、押し付け圧力情報及び送気吸引情報についてのログ情報を出力する。これにより、精度の高いスキル評価をすることができるとともに、術者に対して、押し付け圧力情報及び送気吸引情報についてのログ情報という、より具体的な情報を提供することができる。 The skill evaluation information of this modification includes log information of the pressing pressure information and the air supply/suction information. Specifically, in FIG. 40, each case sheet 3420 includes a log data icon 3450. When the log data icon 3450 is selected, the log information of the pressing pressure information shown in FIG. 42 is displayed. In this way, the output of the log information of the pressing pressure information and the air supply/suction information by the output processing unit 3130 is realized. In other words, the output processing unit 3130 outputs log information about the pressing pressure information and the air supply/suction information. This enables highly accurate skill evaluation and provides the operator with more concrete information, namely the log information about the pressing pressure information and the air supply/suction information.
 図42と図43を比較しながら、熟練医のログ情報と修練医のログ情報の相違点について、処置前、処置中及び処置後に分けて例示する。なお、ここでいう処置とは、前述の処置具3360等に高周波デバイスから通電を行っている期間であるものとし、マーキング、局注、切開、剥離等の具体的な段階については問わない。また、高周波デバイスとは、高周波電流が印加されることによって、対象組織を切除、焼灼するために用いられるデバイスである。高周波デバイスは、高周波スネアや高周波ナイフを含む。通電とは、高周波デバイスに対して、電源装置から高周波電流が供給されていることであり、通電状態は電源装置の制御信号に基づいて判定可能である。言い換えれば、処置中の期間とは、内視鏡の処置具3360による処置期間である。 While comparing FIGS. 42 and 43, differences between the log information of a skilled doctor and that of a trainee doctor will be illustrated separately for before, during, and after treatment. The term "treatment" here refers to a period during which a high-frequency device such as the above-described treatment instrument 3360 is being energized, regardless of the specific stage such as marking, local injection, incision, or peeling. A high-frequency device is a device used to excise or cauterize target tissue by application of a high-frequency current. High-frequency devices include high-frequency snares and high-frequency knives. Energization means that a high-frequency current is supplied from a power supply device to the high-frequency device, and the energization state can be determined based on a control signal of the power supply device. In other words, the period during treatment is the period of treatment by the treatment tool 3360 of the endoscope.
 押し付け圧力は、前述のように、臓器の蠕動運動に伴う臓器の張り具合に依存している。そのため、術者が内視鏡を対象部位に所定の力で押し当てた後に何ら操作していない場合、押し付け圧力の測定結果の波形は、臓器の蠕動運動によって、後述の図44に示すような周期的な波形になる。押し付け圧力が周期的に変動したままでは、ポジションの調整を行うことは難しい。熟練医が行う手術におけるログ情報は、処置前の段階で、送気吸引の作業が細かく行われ、押し付け圧力は、振れ幅の少ない波形となる。熟練医は、所定の性質を利用し、押し付け圧力が下降傾向にあるときは、送気ボタンを押すことで、押し付け圧力の下降を止めて振れ幅を小さくするように調整している。逆に、押し付け圧力が上昇傾向にあるときは、吸引ボタンを押すことで、押し付け圧力の上昇を止めて振れ幅を小さくするように調整している。これにより、熟練医は押し付け圧力を早期に安定させることができるので、短時間でポジションの調整を終えることができる。なお、所定の性質とは、臓器の内圧が上昇すると臓器が張ることに伴い、臓器が硬くなり、押し付け圧力が高くなり、逆に臓器の内圧が低下すると、押し付け圧力が低くなるという性質である。また、所定の性質はどの状況においても必ず当てはまるとは限らないが、ここでは当該所定の性質によって押し付け圧力がコントロールできるものとする。 As described above, the pressing pressure depends on the tension of the organ caused by its peristaltic movement. Therefore, if the operator presses the endoscope against the target site with a certain force and then performs no further operation, the waveform of the measured pressing pressure becomes periodic due to the peristaltic movement of the organ, as shown in FIG. 44 described later. It is difficult to adjust the position while the pressing pressure keeps fluctuating periodically. In the log information of an operation performed by a skilled doctor, air supply and suction are performed finely in the pre-treatment stage, so the pressing-pressure waveform has a small amplitude. The skilled doctor makes use of a certain property: when the pressing pressure tends to fall, pressing the air supply button stops the fall and keeps the amplitude small; conversely, when the pressing pressure tends to rise, pressing the suction button stops the rise and keeps the amplitude small. As a result, the skilled doctor can stabilize the pressing pressure early and finish the position adjustment in a short time. The property in question is that when the internal pressure of the organ rises, the organ becomes taut and stiff and the pressing pressure increases, and conversely, when the internal pressure of the organ falls, the pressing pressure decreases. Although this property does not necessarily hold in every situation, it is assumed here that the pressing pressure can be controlled by using it.
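The expert's adjustment rule described above can be sketched as a simple decision on the pressure trend: supply air when the pressing pressure is falling, apply suction when it is rising. The trend threshold is an illustrative value, not a figure from the disclosure.

```python
# Sketch of the skilled doctor's rule: stop a falling pressing pressure by
# supplying air, stop a rising one by suction, otherwise hold. The 0.2 kPa
# threshold is an illustrative assumption.

def recommend_action(prev_kpa: float, curr_kpa: float,
                     threshold_kpa: float = 0.2) -> str:
    delta = curr_kpa - prev_kpa
    if delta < -threshold_kpa:
        return "supply"    # pressure falling -> press the air supply button
    if delta > threshold_kpa:
        return "suction"   # pressure rising -> press the suction button
    return "hold"          # amplitude is already small
```

Applying this rule sample by sample reproduces the small-amplitude waveform attributed to the skilled doctor, on the assumption stated in the text that insufflation raises and suction lowers the pressing pressure.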
 一方、修練医は、ポジションの調整のための内視鏡の操作に気を取られ、押し付け圧力が上昇等していることに気がつかず、送気吸引を行っていない。そして、押し付け圧力が安定していないため、ポジションの調整に時間を要する。 On the other hand, the trainee doctor was preoccupied with manipulating the endoscope to adjust the position, and did not notice that the pressing pressure was rising, and did not perform air supply and suction. In addition, since the pressing pressure is not stable, it takes time to adjust the position.
 次に、図42と図43の処置中における違いについて説明する。熟練医は、処置の段取りとしてのポジションの調整を安定して行っているため、処置中に送気吸引を行うことなく、短時間で処置を完了させることができる。なお、図42は模式的に示した図であり、処置中における横軸の長さは、処置前や処置後における長さと同等であるわけではない。一方、修練医は、前述のように、押し付け圧力が上昇したまま処置を開始しているため、処置中に慌てて急激に吸引を行おうとするため、急激な押し付け圧力の低下が起きる。そして、急激な低下に対して慌てて送気を行うことから、押し付け圧力が急激に上昇する。これにより、押し付け圧力の測定値が大きく変動する。また、このような状況であることから、修練医の処置中に要する時間は、熟練医より長くなる。このように、処置中の期間における押し付け圧力情報と送気吸引情報を追跡すると、術者のスキルに関係する知見を色々と発見できる可能性があるため、処置中の期間は、注目に値する。 Next, the difference during treatment between FIGS. 42 and 43 will be described. Since the skilled doctor is stably adjusting the position as a treatment setup, the treatment can be completed in a short period of time without performing air supply and suction during the treatment. Note that FIG. 42 is a schematic diagram, and the length of the horizontal axis during treatment is not the same as the length before and after treatment. On the other hand, as described above, since the novice doctor starts the treatment while the pressing pressure is still rising, he/she rushes to perform abrupt suction during the treatment, resulting in a sudden decrease in the pressing pressure. Then, the pressing pressure rises sharply because air is supplied in haste in response to the sudden drop. As a result, the measured values of the pressing pressure fluctuate greatly. Also, because of this situation, the novice doctor will spend more time during the procedure than the experienced doctor. In this way, by tracking the pressing pressure information and air supply/suction information during the period during the treatment, it is possible to discover various findings related to the skill of the operator, so the period during the treatment is worthy of attention.
 そこで、所定の期間に限定してスキル評価を行うようにしてもよい。具体的には、前述のように、内視鏡の処置具3360による処置期間における押し付け圧力情報及び送気吸引情報に基づいてスキル評価を行ってもよい。言い換えれば、処理部3120は、内視鏡の処置具3360による処置期間における押し付け圧力情報及び送気吸引情報に基づいてスキル評価を行ってもよい。これにより、重要度の高い期間においてスキルを評価できるため、スキル評価の精度をより高くすることができる。具体的には、図示は省略するが、スキル評価シート3400に、処置中の期間についてさらにA~Dによるスキル評価を示す画像を追加することで、処置中の期間についてのスキル評価を実現できる。ただし、この方法に限られず、例えば、図41の剥離アドバイス表示3476に、処置中の期間におけるアドバイスを表示することで、スキル評価を行っていることにしてもよい。 Therefore, the skill evaluation may be limited to a predetermined period. Specifically, as described above, the skill evaluation may be performed based on the pressing pressure information and the air supply/suction information during the treatment period by the treatment tool 3360 of the endoscope. In other words, the processing unit 3120 may perform the skill evaluation based on the pressing pressure information and the air supply/suction information during the treatment period by the treatment tool 3360 of the endoscope. Since the skill can thereby be evaluated in a period of high importance, the accuracy of the skill evaluation can be increased. Specifically, although not illustrated, the skill evaluation for the period during treatment can be realized by adding to the skill evaluation sheet 3400 an image showing a further A-to-D skill evaluation for that period. However, the method is not limited to this; for example, the skill evaluation may be performed by displaying advice for the period during treatment on the peeling advice display 3476 in FIG. 41.
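Restricting the evaluation to the treatment period reduces, in data terms, to filtering the time series by the energization state, which the text says can be judged from the power-supply control signal. A minimal sketch, with an assumed sample format:

```python
# Sketch: keep only the pressing-pressure samples recorded while the
# high-frequency device was energized, as judged from the power supply's
# control signal. The (t_ms, pressure, energized) tuple layout is assumed.

def treatment_period_samples(samples):
    """samples: list of (t_ms, pressure_kpa, energized: bool).
    Returns the pressure values recorded during energization only."""
    return [p for (_, p, energized) in samples if energized]
```

The same filter with the condition inverted would select the pre-treatment samples used by the variant described next.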
 また、各段階において、処置を行う期間より前の期間に限定してスキル評価を行ってもよい。言い換えれば、処理部3120は、内視鏡の処置具3360による処置期間よりも前の期間における押し付け圧力情報及び送気吸引情報に基づいてスキル評価を行ってもよい。これにより、処置前の押し付け圧力情報及び送気吸引情報の挙動から、処置前の段取りの良し悪しを把握する情報が得られ、当該情報をもとにスキル評価ができることから、スキル評価の精度をより高くすることができる。また、図41の剥離アドバイス表示3476に示すように、処置前の期間における押し付け圧力情報及び送気吸引情報についてアドバイスを表示してもよい。 In addition, at each stage, the skill evaluation may be limited to the period preceding the treatment period. In other words, the processing unit 3120 may perform the skill evaluation based on the pressing pressure information and the air supply/suction information in the period before the treatment period by the treatment tool 3360 of the endoscope. From the behavior of the pressing pressure information and the air supply/suction information before treatment, information for judging the quality of the pre-treatment setup can be obtained, and since the skill evaluation can be performed based on this information, the accuracy of the skill evaluation can be made higher. Also, as shown in the peeling advice display 3476 in FIG. 41, advice on the pressing pressure information and the air supply/suction information in the pre-treatment period may be displayed.
 次に、処置後における違いについて説明する。処置後の期間は、次の段階または同じ段階における次の処置へ備える準備期間である。そのため、処置後の押し付け圧力、送気吸引の挙動は、処置前と同様である。そのため、熟練医による押し付け圧力の挙動は、送気吸引を細かく行うことで、振れ幅が小さい。一方、修練医による押し付け圧力の挙動は、振れ幅が大きい。 Next, the differences after treatment will be described. The post-treatment period is a preparation period for the next stage or for the next treatment in the same stage. Therefore, the behavior of the pressing pressure and the air supply/suction after treatment is the same as before treatment. Accordingly, in the pressing-pressure behavior of a skilled doctor, the amplitude is kept small because the doctor performs air supply and suction finely, whereas the pressing-pressure behavior of a trainee doctor shows a large amplitude.
 なお、ログ情報の表示方法の例は、図42、図43に限られない。例えば、図44に示すように、測定された押し付け圧力の挙動に対して、安全範囲SAを設定して表示できるようにしてもよい。例えば、処理部3120は、許容可能な押し付け圧力範囲情報を取得し、安全範囲SAを表示する処理を追加することで、図44に示すログの表示を実現することができる。さらに、押し付け圧力の測定結果に応じて任意の期間ごとにスキル評価をできるようにしてもよい。例えば、図44の期間H1においては、測定された押し付け圧力が、期間H1全体を通して安全範囲SAを外れていることから、安全性の観点から非常に好ましくない旨の評価を行う。また、期間H2においては押し付け圧力の測定結果の一部が、安全範囲SAを外れていることから、安全性の観点からあまり好ましくない旨の評価を行う。また、期間H3においては、押し付け圧力の測定結果が、期間H3全体を通して安全範囲SA内に収まっていることから、安全性の観点から好ましい評価を行う。なお、図示は省略しているが、これらの評価を示す画像を表示する仕様にしてもよい。 The way of displaying the log information is not limited to the examples of FIGS. 42 and 43. For example, as shown in FIG. 44, a safety range SA may be set and displayed for the behavior of the measured pressing pressure. For example, the processing unit 3120 can realize the log display shown in FIG. 44 by acquiring information on the permissible pressing pressure range and adding processing for displaying the safety range SA. Furthermore, the skill evaluation may be performed for arbitrary periods according to the pressing pressure measurement results. For example, in period H1 in FIG. 44, the measured pressing pressure is outside the safety range SA throughout the period, so the evaluation is that this is highly undesirable from a safety standpoint. In period H2, part of the pressing pressure measurement results are outside the safety range SA, so the evaluation is that this is somewhat undesirable from a safety standpoint. In period H3, the pressing pressure measurement results remain within the safety range SA throughout the period, so a favorable evaluation is made from a safety standpoint. Although not illustrated, images showing these evaluations may also be displayed.
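The three-way period evaluation illustrated with periods H1 to H3 can be sketched directly: a period is favorable if every measurement stays inside the safety range SA, somewhat undesirable if only part of it leaves the range, and highly undesirable if the whole period lies outside. The range bounds and grade labels here are illustrative assumptions.

```python
# Sketch of the per-period grading against the safety range SA described
# for FIG. 44. Bounds and labels are illustrative.

def grade_period(pressures, sa_low: float, sa_high: float) -> str:
    inside = [sa_low <= p <= sa_high for p in pressures]
    if all(inside):
        return "favorable"          # like period H3: always within SA
    if not any(inside):
        return "very unfavorable"   # like period H1: outside SA throughout
    return "somewhat unfavorable"   # like period H2: partly outside SA
```

Running this over consecutive windows of the pressing-pressure log yields one safety grade per period, which could then be shown as an image alongside the log.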
 なお、図44に示すスキル評価を行うためには、押し付け圧力の安全範囲SAをあらかじめ設定しておく必要がある。つまり、安全に処置を行うためにはどの程度の押し付け圧力範囲が許容されるかが既知でなければならない。例えば、過去に熟練医が同様の症例を処置したデータが有れば、当該データをもとに許容可能な押し付け圧力範囲を設定することができる。 In addition, in order to perform the skill evaluation shown in FIG. 44, it is necessary to set the pressing pressure safety range SA in advance. In other words, it is necessary to know how much pressing pressure range is permissible in order to perform treatment safely. For example, if there is data of a similar case treated by a skilled doctor in the past, the permissible pressing pressure range can be set based on the data.
 なお、押し付け圧力情報及び送気吸引情報のログ情報は、手術中において、例えば表示部3340等にリアルタイムに表示されてもよい。さらに、押し付け圧力の測定結果が安全範囲SAの範囲内から安全範囲SAの範囲外へ移行したタイミングでリアルタイムに報知してもよい。ここでの報知は、表示部3340等による報知の他に、音又は振動等による報知であってもよい。このようにすれば、処置を行っている術者に対して、即座に異常を報知することができるので、トラブルを未然に防止することができる。 Note that the log information of the pressing pressure information and the air supply/suction information may be displayed in real time, for example, on the display unit 3340 during surgery. Furthermore, it may be notified in real time at the timing when the pressing pressure measurement result moves from within the safe range SA to outside the safe range SA. The notification here may be notification by sound, vibration, or the like, in addition to notification by the display unit 3340 or the like. In this way, the operator who is performing the treatment can be immediately notified of the abnormality, so that trouble can be prevented.
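The real-time notification described above fires at the moment the measurement leaves the safety range SA, not on every out-of-range sample. A minimal sketch; the `notify` hook is a hypothetical stand-in for the display, sound, or vibration output.

```python
# Sketch: alert exactly when the measured pressing pressure transitions from
# inside the safety range SA to outside it. notify is a caller-supplied hook
# (e.g. drawing on the display unit 3340, a sound, or a vibration).

def monitor(pressures, sa_low, sa_high, notify):
    was_inside = True
    for i, p in enumerate(pressures):
        inside = sa_low <= p <= sa_high
        if was_inside and not inside:
            notify(i, p)  # report the sample index and offending value
        was_inside = inside
```

Because only the inside-to-outside transition triggers `notify`, the operator is alerted once per excursion instead of continuously while the pressure stays out of range.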
 図40に戻り、スキル評価シート3400の説明を続ける。この変形例のスキル評価情報は、さらに医師情報を含んでもよい。例えば、スキル評価シート3400の医師情報アイコン3410を選択することで、医師情報の詳細が表示される。具体的には、身体的特徴、手術実績の他、個々の症例の情報等である。より具体的には、身体的特徴には、身長、体重、手の大きさ等を含む。また、手術実績は、経験症例数や累積手術時間等の情報を含む。また、症例情報は、手術に要した時間、手術の難易度、指導を受けた回数、当該指導の内容等を含む。なお、医師情報に流派を特定する情報を含めてもよい。前述の通り、内視鏡における処置の良し悪しは、医師の経験値や操作上の暗黙知によるものであるため、指導者が異なれば、指導内容即ち手技も大きく異なると考えられる。そのため、同じ症例を対象とする場合であっても、流派に応じて実行する手技が異なることから、流派の違いを考慮することで、スキル評価の精度を維持することができる。また、医師情報に、氏名、性別、生年月日、登録情報、登録年月日を含めてもよく、さらに、これらの情報は所定のデータベースにリンクさせてもよい。また、医師情報には学会活動等の業績を含めてもよい。また、以上に示した情報は、全てを表示する必要は無く、例えば、図45に示すような一覧形式で所定の記憶領域に記憶しておき、さらに必要な情報のみを選択してスキル評価シート3400に表示してもよい。このように、医師情報を評価スキル情報に含めることによって、スキル評価の精度を高くすることができる。 Returning to FIG. 40, the explanation of the skill evaluation sheet 3400 continues. The skill evaluation information of this modified example may further include doctor information. For example, by selecting the doctor information icon 3410 on the skill evaluation sheet 3400, detailed doctor information is displayed. Specifically, it includes information on individual cases in addition to physical characteristics and surgical records. More specifically, physical characteristics include height, weight, hand size, and the like. In addition, the surgical record includes information such as the number of experienced cases and cumulative surgical time. The case information also includes the time required for the surgery, the degree of difficulty of the surgery, the number of times guidance was received, the content of the guidance, and the like. Note that information specifying a school may be included in the doctor information. As described above, the pros and cons of endoscopic procedures depend on the doctor's experience and tacit knowledge of operations. Therefore, it is believed that different instructors will provide different instructions, that is, techniques. 
Therefore, even when the same case is targeted, the procedure to be performed differs depending on the school, so considering the difference in school makes it possible to maintain the accuracy of the skill evaluation. Further, the doctor information may include name, sex, date of birth, registration information, and date of registration, and these items may be linked to a predetermined database. Further, the doctor information may include academic achievements such as academic conference activities. It is not necessary to display all of the information described above; for example, the information may be stored in a predetermined storage area in a list format as shown in FIG. 45, and only the necessary information may be selected and displayed on the skill evaluation sheet 3400. By including the doctor information in the skill evaluation information in this way, it is possible to improve the accuracy of the skill evaluation.
 この変形例のスキル評価情報は、さらに患者情報を含んでもよい。例えば、スキル評価シート3400の患者情報アイコン3460を選択することで、患者情報の詳細が表示される。患者情報とは、患者自身の情報のほか、病変の情報や手術後の状態の情報を含む。患者自身の情報は、例えば、氏名、年齢、性別等を含むが、抗凝固剤使用有無の情報を含んでもよい。当該情報は、抗凝固剤を使用すると出血しやすくなることから、手術の難易度の判断に当たり有益な情報となる。また、患者自身の情報には、治療歴の情報を含んでもよい。当該情報は、例えば過去にESDの処置を受けた箇所は線維化等により再度の剥離処置が難しくなることから、同様に、有益な情報となる。また、病変の情報は、部位の情報、組織性状の情報、出血の情報を含むが、さらに、図46に示すように細分化した情報を含んでもよい。また、手術後の状態の情報は、出血量、偶発症発生率、入院日数の情報を含む。例えば、手術後の偶発症発生率が低い、または、入院に要した日数が少ないのであれば、担当した術者のスキルは高いと評価される。 The skill evaluation information of this modified example may further include patient information. For example, by selecting the patient information icon 3460 of the skill evaluation sheet 3400, detailed patient information is displayed. Patient information includes the patient's own information as well as lesion information and postoperative status information. The patient's own information includes, for example, name, age, sex, etc., and may include information on whether or not the patient uses an anticoagulant. This information is useful for judging the degree of difficulty of surgery, since the use of anticoagulants makes bleeding more likely. The patient's own information may also include treatment history information. This information is likewise useful because, for example, re-exfoliation treatment becomes difficult due to fibrosis or the like at a site that has undergone ESD treatment in the past. The lesion information includes site information, tissue characterization information, and bleeding information, and may further include subdivided information as shown in FIG. 46. In addition, information on the state after surgery includes information on the amount of bleeding, the incidence of complications, and the number of days of hospitalization. For example, if the incidence of complications after surgery is low or the number of days required for hospitalization is small, the skill of the surgeon in charge is evaluated as high.
By including the patient information in the skill evaluation information in this way, it is possible to improve the accuracy of the skill evaluation.
 患者情報を用いた具体例として、図40のスキル評価シート3400の、マーキング評価アイコン3440、局注評価アイコン3442、切開評価アイコン3444、剥離評価アイコン3446及び止血評価アイコン3448の全てが「A」で表示されている術者が複数名いたとする。さらに、これらの術者同士の患者情報を比較すると、手術後の偶発症発生率及び入院に要した日数に大きな差が有ったとする。この場合、所定の術者のスキル評価シート3400の総合評価アイコン3430は「A+」と表示し、特定の術者のスキル評価シート3400の総合評価アイコン3430は「A-」と表示することにより、より細かくスキル評価を行ってもよい。なお、所定の術者とは、手術後の偶発症発生率が平均値より低く、かつ入院に要した日数が平均値より少ない術者である。また、特定の術者とは、手術後の偶発症発生率が平均値より高く、かつ入院に要した日数が平均値より高い術者である。なお、総合評価がBやC等の術者に対しても、同様に、より細分化した評価をしてもよい。 As a specific example using patient information, assume that there are multiple operators for whom the marking evaluation icon 3440, local injection evaluation icon 3442, incision evaluation icon 3444, peeling evaluation icon 3446, and hemostasis evaluation icon 3448 on the skill evaluation sheet 3400 of FIG. 40 are all displayed as "A". Further, assume that when the patient information of these operators is compared, there is a large difference in the incidence of postoperative complications and the number of days required for hospitalization. In this case, the comprehensive evaluation icon 3430 of the skill evaluation sheet 3400 of a predetermined operator may be displayed as "A+", and the comprehensive evaluation icon 3430 of the skill evaluation sheet 3400 of a specific operator may be displayed as "A-", thereby performing the skill evaluation in finer detail. Here, a predetermined operator is an operator whose incidence of postoperative complications is lower than the average value and whose number of days required for hospitalization is less than the average value. A specific operator is an operator whose incidence of postoperative complications is higher than the average value and whose number of days required for hospitalization is greater than the average value. It should be noted that a similarly refined evaluation may also be performed for operators with a comprehensive evaluation of B, C, or the like.
By including the patient information in the skill evaluation information in this way, it is possible to improve the accuracy of the skill evaluation.
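The grade refinement described above (appending "+" or "-" to a shared grade according to whether the operator's postoperative complication rate and hospitalization days fall below or above their averages) can be sketched as follows; the function name, argument names, and the handling of mixed cases are illustrative assumptions:

```python
def refine_grade(grade, complication_rate, hospital_days, avg_rate, avg_days):
    """Refine a shared grade using patient outcome information.

    '+' when both the postoperative complication rate and the number of
    hospitalization days are below average; '-' when both are above.
    Mixed cases keep the original grade in this sketch.
    """
    if complication_rate < avg_rate and hospital_days < avg_days:
        return grade + "+"
    if complication_rate > avg_rate and hospital_days > avg_days:
        return grade + "-"
    return grade
```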
 次に、この変形例の手法に関する処理について説明する。先ず、図47のフローチャートを用いて、ログ情報を出力するための処理例を説明する。先ず処理部3120は、内視鏡システム3300が動作しているか否かを判定(ステップS3501)する。ここで処理部3120とは、例えば前述の処理装置3330に対応するが、スコープ部3310、表示部3340、光源装置3350等まで拡張してもよい。周辺機器は例えば処置を行うための処置具3360に対応するが、吸引装置3370、送気送水装置3380、処置具3360に電力を供給する電源装置等まで拡張してもよい。なお、内視鏡システム3300を起動させるとともに、図47の後述するステップS3502以降の処理が開始される処理例にしてもよい。 Next, the processing related to the technique of this modified example will be explained. First, an example of processing for outputting log information will be described with reference to the flowchart of FIG. 47. First, the processing unit 3120 determines whether the endoscope system 3300 is operating (step S3501). Here, the processing unit 3120 corresponds to, for example, the processing device 3330 described above, but may be expanded to include the scope unit 3310, the display unit 3340, the light source device 3350, and the like. The peripheral device corresponds to, for example, a treatment tool 3360 for performing treatment, but may be expanded to include a suction device 3370, an air/water supply device 3380, a power supply device for supplying power to the treatment tool 3360, and the like. Note that the processing from step S3502 onward in FIG. 47, described later, may be started upon activation of the endoscope system 3300.
 そして、取得部3110は、押し付け圧力情報取得処理(ステップS3502)及び送気吸引情報取得処理(ステップS3503)を行う。そして、処理部3120は、内視鏡システム3300の動作が終了するか否かを判定する(ステップS3504)。内視鏡システム3300が動作している間(ステップS3504でNO)、ステップS3502とステップS3503を繰り返し実行し続ける。つまり、内視鏡システム3300が動作している間ずっと、押し付け圧力情報及び送気吸引情報の取得が行われる。なお、図示は省略するが、押し付け圧力情報及び送気吸引情報に、タイミング情報を関連付ける処理を追加してもよい。タイミング情報とは、例えば、各段階のタイミングを示す情報を含むが、さらに、処置前、処置中、処置後等のタイミングの情報を含んでもよい。タイミング情報は、周辺機器の使用情報をもとに取得することができる。例えば、高周波デバイスの通電状態に関する情報を使用情報として取得することで、処理部3120は、当該制御信号に基づいて、高周波デバイスに高周波電流が供給されている場合に、処置が開始されたと判定することができる。 Then, the acquisition unit 3110 performs pressing pressure information acquisition processing (step S3502) and air supply/suction information acquisition processing (step S3503). Then, the processing unit 3120 determines whether or not the operation of the endoscope system 3300 ends (step S3504). While the endoscope system 3300 is operating (NO in step S3504), steps S3502 and S3503 are repeatedly executed. In other words, while the endoscope system 3300 is operating, the pressing pressure information and the air supply/suction information are acquired. Although illustration is omitted, a process of associating timing information with the pressing pressure information and the air supply/suction information may be added. The timing information includes, for example, information indicating the timing of each stage, and may further include timing information such as before treatment, during treatment, and after treatment. The timing information can be obtained based on the usage information of the peripheral device. For example, by acquiring information about the energization state of the high-frequency device as usage information, the processing unit 3120 can determine, based on the corresponding control signal, that the treatment has started when high-frequency current is supplied to the high-frequency device.
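The acquisition loop above (repeating steps S3502 and S3503 while the system operates, and tagging each sample using the high-frequency device's energization state as timing information) can be sketched as follows. All callables are illustrative stand-ins for device I/O, not the actual interfaces of the endoscope system:

```python
import time

def acquire_log(read_pressure, read_air, hf_energized, stop, poll_s=0.0):
    """Repeat S3502/S3503 until the system stops (S3504: YES).

    Each sample is tagged 'treatment' while the high-frequency device is
    energized, 'other' otherwise, approximating the timing information
    described above. `read_pressure`, `read_air`, `hf_energized`, and
    `stop` are hypothetical callables.
    """
    log = []
    t = 0
    while not stop():                          # step S3504
        log.append({
            "t": t,
            "pressure": read_pressure(),       # step S3502
            "air_suction": read_air(),         # step S3503
            "phase": "treatment" if hf_energized() else "other",
        })
        t += 1
        if poll_s:
            time.sleep(poll_s)                 # optional polling interval
    return log
```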
 その後、内視鏡システム3300の動作が終了したら(ステップS3504でYES)、ログ情報の出力処理(ステップS3505)を行う。出力処理は、例えば、取得した押し付け圧力情報及び送気吸引情報等を、時間軸に対してプロットする処理や、前述のタイミング情報に基づいて処置の各段階と時間軸とを対応づける処理を含むが、当該プロットの結果から評価が低くなる原因となる箇所を特定する処理等を含んでもよく、種々の変形実施が可能である。このようにすることで、所定の表示部にログ情報を表示することができる。 After that, when the operation of the endoscope system 3300 is completed (YES in step S3504), log information output processing (step S3505) is performed. The output process includes, for example, a process of plotting the acquired pressing pressure information, air supply/suction information, etc. against the time axis, and a process of associating each stage of the treatment with the time axis based on the timing information described above. However, it may include a process of specifying a location that causes the evaluation to be low from the result of the plot, and various modifications are possible. By doing so, the log information can be displayed on a predetermined display unit.
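One possible form of the output processing of step S3505, associating each treatment stage with the time axis, is sketched below. The record layout (`"t"`, `"phase"`, `"pressure"` keys) and the per-stage summary fields are illustrative assumptions:

```python
from collections import defaultdict

def summarize_log(log):
    """Group log samples by stage and report each stage's time span and
    peak pressing pressure, as one concrete realization of the log
    output processing described above."""
    spans = defaultdict(lambda: {"start": None, "end": None, "peak": 0.0})
    for s in log:
        d = spans[s["phase"]]
        if d["start"] is None:
            d["start"] = s["t"]
        d["end"] = s["t"]
        d["peak"] = max(d["peak"], s["pressure"])
    return dict(spans)
```

A plotting step (pressure against the time axis, with stage boundaries overlaid) could then be layered on top of this summary.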
 次に、機械学習の概要について説明する。以下では、ニューラルネットワークNN2を用いた機械学習について説明するが、この変形例の手法はこれに限定されない。この変形例においては、例えばSVM等の他のモデルを用いた機械学習が行われてもよいし、これらの手法を発展させた手法を用いた機械学習が行われてもよい。 Next, I will explain the outline of machine learning. Machine learning using the neural network NN2 will be described below, but the technique of this modification is not limited to this. In this modification, for example, machine learning using other models such as SVM may be performed, or machine learning using techniques developed from these techniques may be performed.
 図48は、ニューラルネットワークNN2を説明する模式図である。ニューラルネットワークNN2は、データが入力される入力層と、入力層からの出力に基づいて演算を行う中間層と、中間層からの出力に基づいてデータを出力する出力層を有する。図48においては、中間層が2層であるネットワークを例示するが、中間層は1層であってもよいし、3層以上であってもよい。また各層に含まれるノードの数は図48の例に限定されず、種々の変形実施が可能である。なお精度を考慮すれば、この変形例の学習は多層のニューラルネットワークNN2を用いたディープラーニングを用いることが望ましい。ここでの多層とは、狭義には4層以上である。 FIG. 48 is a schematic diagram explaining the neural network NN2. The neural network NN2 has an input layer to which data is input, an intermediate layer that performs operations based on the output from the input layer, and an output layer that outputs data based on the output from the intermediate layer. FIG. 48 illustrates a network with two intermediate layers, but the number of intermediate layers may be one, or three or more. Also, the number of nodes included in each layer is not limited to the example shown in FIG. 48, and various modifications are possible. Considering the accuracy, it is desirable to use deep learning using a multi-layered neural network NN2 for learning in this modified example. The term “multilayer” as used herein means four or more layers in a narrow sense.
 図48に示すように、所与の層に含まれるノードは、隣接する層のノードと結合される。各結合には重み付け係数が設定されている。各ノードは、前段のノードの出力と重み付け係数を乗算し、乗算結果の合計値を求める。さらに各ノードは、合計値に対してバイアスを加算し、加算結果に活性化関数を適用することによって当該ノードの出力を求める。この処理を、入力層から出力層へ向けて順次実行することによって、ニューラルネットワークNN2の出力が求められる。なお活性化関数としては、シグモイド関数やReLU関数等の種々の関数が知られており、この変形例ではそれらを広く適用可能である。 As shown in FIG. 48, the nodes included in a given layer are connected to the nodes of the adjacent layers. A weighting coefficient is set for each connection. Each node multiplies the outputs of the preceding nodes by the weighting coefficients and obtains the sum of the multiplication results. Further, each node adds a bias to the total value and applies an activation function to the addition result to obtain the output of that node. By sequentially executing this processing from the input layer to the output layer, the output of the neural network NN2 is obtained. Various functions such as the sigmoid function and the ReLU function are known as activation functions, and they are widely applicable in this modified example.
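The per-node computation just described (multiply the previous layer's outputs by the weighting coefficients, sum, add a bias, apply an activation function) amounts to a forward pass, sketched below. The ReLU choice and the layer shapes are illustrative; sigmoid or other activations could be substituted:

```python
import numpy as np

def forward(x, layers):
    """Forward pass from the input layer toward the output layer.

    `layers` is a list of (W, b) pairs, one per layer. Each layer computes
    W @ a + b (weighted sum plus bias) followed by ReLU; no activation is
    applied on the final layer in this sketch.
    """
    a = np.asarray(x, dtype=float)
    for i, (W, b) in enumerate(layers):
        z = W @ a + b
        a = np.maximum(z, 0.0) if i < len(layers) - 1 else z
    return a
```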
 ニューラルネットワークNN2における学習は、適切な重み付け係数を決定する処理である。ここでの重み付け係数は、バイアスを含む。以下、学習済モデルを生成する処理が学習装置において行われる例を示す。学習装置とは、例えば上述したように処理システム3100に含まれる学習サーバであってもよいし、処理システム3100の外部に設けられる装置であってもよい。 Learning in the neural network NN2 is a process of determining appropriate weighting coefficients. The weighting factor here includes the bias. An example in which processing for generating a trained model is performed in a learning device will be described below. The learning device may be, for example, a learning server included in the processing system 3100 as described above, or may be a device provided outside the processing system 3100 .
 学習装置は、学習データのうちの入力データをニューラルネットワークNN2に入力し、そのときの重み付け係数を用いた順方向の演算を行うことによって出力を求める。学習装置は、当該出力と、学習データのうちの正解ラベルとに基づいて、誤差関数を演算する。そして誤差関数を小さくするように、重み付け係数を更新する。重み付け係数の更新では、例えば出力層から入力層に向かって重み付け係数を更新していく誤差逆伝播法を利用可能である。 The learning device inputs the input data of the learning data to the neural network NN2 and obtains the output by performing forward calculations using the weighting coefficients at that time. The learning device calculates an error function based on the output and the correct label in the learning data. Then, the weighting coefficients are updated so as to reduce the error function. For updating the weighting coefficients, for example, an error backpropagation method can be used to update the weighting coefficients from the output layer toward the input layer.
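The learning step just described, a forward calculation with the current weighting coefficients, an error function against the correct label, and an update by error backpropagation from the output layer toward the input layer, can be sketched for a small two-layer network. Cross-entropy loss, ReLU, softmax, and the learning rate are illustrative assumptions, not specifics of the disclosure:

```python
import numpy as np

def train_step(x, y_onehot, W1, b1, W2, b2, lr=0.1):
    """One supervised update on a single sample; returns updated weights and loss."""
    # forward pass with the current weighting coefficients
    z1 = W1 @ x + b1
    h = np.maximum(z1, 0.0)                   # ReLU hidden layer
    z2 = W2 @ h + b2
    p = np.exp(z2 - z2.max()); p /= p.sum()   # softmax output
    loss = -float(np.log(p[y_onehot.argmax()] + 1e-12))  # error function
    # backpropagation: output layer toward input layer
    dz2 = p - y_onehot
    dW2 = np.outer(dz2, h); db2 = dz2
    dz1 = (W2.T @ dz2) * (z1 > 0)
    dW1 = np.outer(dz1, x); db1 = dz1
    # gradient step to reduce the error function
    return (W1 - lr * dW1, b1 - lr * db1,
            W2 - lr * dW2, b2 - lr * db2, loss)
```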
 なおニューラルネットワークNN2には種々の構成のモデルが知られており、この変形例ではそれらを広く適用可能である。例えばニューラルネットワークNN2は、CNNであってもよいし、RNNであってもよいし、他のモデルであってもよい。CNN等を用いる場合も、処理の手順は図48と同様である。即ち、学習装置は、学習データのうちの入力データをモデルに入力し、そのときの重み付け係数を用いてモデル構成に従った順方向演算を行うことによって出力を求める。当該出力と、正解ラベルとに基づいて誤差関数が算出され、当該誤差関数を小さくするように、重み付け係数の更新が行われる。CNN等の重み付け係数を更新する際にも、例えば誤差逆伝播法を利用可能である。 Various configurations of models are known for the neural network NN2, and they are widely applicable in this modified example. For example, the neural network NN2 may be a CNN, an RNN, or another model. When a CNN or the like is used, the processing procedure is the same as in FIG. 48. That is, the learning device inputs the input data of the learning data to the model and obtains the output by performing forward calculation according to the model configuration using the weighting coefficients at that time. An error function is calculated based on the output and the correct label, and the weighting coefficients are updated so as to reduce the error function. The error backpropagation method, for example, can also be used when updating the weighting coefficients of a CNN or the like.
 次に、図49を用いて、この変形例の手法におけるニューラルネットワークNN2の入力と出力の関係を説明する。図49に示すように、ニューラルネットワークNN2の入力は、例えば押し付け圧力情報と、送気吸引情報である。例えば、前述の通り、所与の術者による1回の手術において、時系列の押し付け圧力情報と、送気吸引情報が取得される。ニューラルネットワークNN2の入力は、時系列データであるが、時系列データに基づいて演算される統計量であってもよい。 Next, using FIG. 49, the relationship between the input and output of the neural network NN2 in the technique of this modification will be described. As shown in FIG. 49, the input to the neural network NN2 is, for example, pressing pressure information and air supply/suction information. For example, as described above, time-series pressing pressure information and air supply/suction information are obtained in a single surgery performed by a given operator. The input of the neural network NN2 is time series data, but may be a statistic calculated based on the time series data.
 ニューラルネットワークNN2の出力は、例えば評価対象となるユーザのスキルを、M段階でランク付けした際のランクを表す情報である。Mは2以上の整数である。以下、ランクIは、ランクI+1に比べてスキルが高いものとする。Iは、1以上M未満の整数である。即ち、ランク1は最もスキルが高いことを表し、ランクMが最もスキルが低いことを表す。 The output of the neural network NN2 is, for example, information representing the rank when the skill of the user to be evaluated is ranked in M stages. M is an integer of 2 or more. Hereinafter, it is assumed that rank I is higher in skill than rank I+1. I is an integer of 1 or more and less than M; That is, rank 1 represents the highest skill, and rank M represents the lowest skill.
 例えばニューラルネットワークNN2の出力層はM個のノードを有する。第1ノードは、入力となったデータに対応するユーザのスキルがカテゴリ1に属する確からしさを表す情報である。第2ノード~第Mノードも同様であり、各ノードはそれぞれ、入力となったデータがカテゴリ2~カテゴリMに属する確からしさを表す情報である。例えば、出力層が公知のソフトマックス層である場合、M個の出力は、合計が1となる確率データの集合である。カテゴリ1~カテゴリMは、それぞれランク1~ランクMに対応するカテゴリである。 For example, the output layer of neural network NN2 has M nodes. The first node is information representing the likelihood that the skill of the user corresponding to the input data belongs to category 1. The same is true for the second node to the Mth node, and each node is information representing the probability that the input data belongs to category 2 to category M, respectively. For example, if the output layer is a known softmax layer, the M outputs are sets of probability data that sum to one. Category 1 to category M are categories corresponding to rank 1 to rank M, respectively.
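The softmax output layer mentioned above, where the M node values become probability data summing to 1 and node I gives the likelihood of category I (rank I), can be sketched as:

```python
import numpy as np

def softmax(z):
    """Convert M raw output-node values into probabilities summing to 1."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()
```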
 学習段階では、学習装置は、多数の術者がそれぞれ軟性内視鏡を用いて処置を行った際に取得された押し付け圧力情報と送気吸引情報を収集するが、さらにメタデータを保持してもよい。ここでのメタデータは、例えば前述の患者情報であるが、さらに医師情報を加えてもよい。例えば、学習装置は、所定の患者情報を有する症例に対して押し付け圧力情報及び送気吸引情報が入力された場合、評価としてランクAの正解ラベルを付与する。ここでの所定の患者情報とは、例えば前述の手術後の偶発症発生率が平均値より低く、かつ入院に要した日数が平均値より少ない患者情報である。一方、特定の患者情報を有する症例に対して押し付け圧力情報及び送気吸引情報が入力された場合、評価としてランクDの正解ラベルを付与する。 In the learning stage, the learning device collects the pressing pressure information and the air supply/suction information acquired when a large number of operators each performed treatments using flexible endoscopes, and may further hold metadata. The metadata here is, for example, the patient information described above, but doctor information may also be added. For example, when pressing pressure information and air supply/suction information are input for a case having predetermined patient information, the learning device assigns a correct label of rank A as an evaluation. The predetermined patient information here is, for example, patient information in which the incidence of postoperative complications is lower than the average value and the number of days required for hospitalization is shorter than the average value. On the other hand, when pressing pressure information and air supply/suction information are input for a case having specific patient information, a correct label of rank D is assigned as an evaluation.
The specific patient information here means information on a patient whose incidence of complications after surgery is significantly higher than the average value and the number of days required for hospitalization is significantly higher than the average value. In this way, the learning device identifies one of the M levels of the operator's skill based on the patient information, which is metadata. In the learning stage, the skilled doctor may manually evaluate the skill of each trainee for each case and input the evaluation result into the learning device. Also, for example, a trained model for when the postoperative course is good and a trained model for when the postoperative course is not good may be prepared. Based on the patient information described above, the processing unit 3120 selects a trained model when the postoperative progress is favorable and a trained model when the postoperative progress is not favorable. Then, by inputting pressing pressure information and air supply/suction information to the selected learned model, skill evaluation is performed. Further, for example, a specification may be adopted in which an evaluation result of "A+" is output only to a trained model when the postoperative progress is favorable.
 図50は、ニューラルネットワークNN2の学習処理を説明するフローチャートである。まずステップS3101において、学習装置は、学習用押し付け圧力情報と、学習用送気吸引情報を取得する。ステップS3101の処理は、例えば学習サーバが、データベースサーバに蓄積された多数のデータから、1組の押し付け圧力情報及び送気吸引情報を読み出す処理に相当する。 FIG. 50 is a flowchart explaining the learning process of the neural network NN2. First, in step S3101, the learning device acquires pressing pressure information for learning and air supply/suction information for learning. The process of step S3101 corresponds to, for example, a process in which the learning server reads out a set of pressing pressure information and air supply/suction information from a large amount of data accumulated in the database server.
 なお、押し付け圧力情報と学習用押し付け圧力情報とは、学習段階で用いられるデータであるか、スキル評価を行う推論段階で用いられるデータであるかの違いを表すものであり、具体的なデータ形式は同様である。また、所与のタイミングにおいて推論用の押し付け圧力情報として用いられたデータが、それ以降のタイミングにおいて学習用押し付け圧力情報として用いられてもよい。送気吸引情報と学習用送気吸引情報についても同様である。 Note that the pressing pressure information and the learning pressing pressure information differ only in whether the data is used in the learning stage or in the inference stage for skill evaluation; the specific data format is the same. Also, data used as pressing pressure information for inference at a given timing may be used as learning pressing pressure information at subsequent timings. The same applies to the air supply/suction information and the learning air supply/suction information.
 またステップS3102において、学習装置は、ステップS3101で読み出したデータに対応付けられた正解ラベルを取得する。正解ラベルは、例えば上述したように、内視鏡操作を行ったユーザのスキルをM段階で評価した結果である。 Also, in step S3102, the learning device acquires the correct label associated with the data read out in step S3101. The correct label is, for example, the result of evaluating the skill of the user who has operated the endoscope in M stages, as described above.
 ステップS3103において、学習装置は、誤差関数を求める処理を行う。具体的には、学習装置は、押し付け圧力情報及び送気吸引情報をニューラルネットワークNN2に入力する。学習装置は、入力と、その際の重み付け係数に基づいて順方向の演算を行う。そして学習装置は、演算結果と、正解ラベルの比較処理に基づいて誤差関数を求める。例えば、正解ラベルがランク1であった場合、学習装置は、カテゴリ1に対応する第1ノードの正解値が1であり、カテゴリ2~カテゴリMに対応する第2ノード~第Mノードの正解値が0であるものとして誤差関数を求める。さらにステップS3103において、学習装置は、誤差関数を小さくするように重み付け係数を更新する処理を行う。この処理は、上述したように誤差逆伝播法等を利用可能である。ステップS3101~ステップS3103の処理が、1つの学習データに基づく1回の学習処理に対応する。 In step S3103, the learning device performs processing for obtaining the error function. Specifically, the learning device inputs the pressing pressure information and the air supply/suction information to the neural network NN2. The learning device performs forward calculations based on the input and the weighting coefficients at that time. Then, the learning device obtains the error function based on a comparison between the calculation result and the correct label. For example, if the correct label is rank 1, the learning device assumes that the correct value of the first node corresponding to category 1 is 1 and that the correct values of the second to M-th nodes corresponding to categories 2 to M are 0, and obtains the error function accordingly. Furthermore, in step S3103, the learning device updates the weighting coefficients so as to reduce the error function. For this processing, the error backpropagation method or the like can be used as described above. The processing of steps S3101 to S3103 corresponds to one learning process based on one piece of learning data.
 ステップS3104において、学習装置は学習処理を終了するか否かを判定する。例えば学習装置は、多数の学習データの一部を評価データとして保持していてもよい。評価データは、学習結果の精度を確認するためのデータであり、重み付け係数の更新には使用されないデータである。学習装置は、評価データを用いた推定処理の正解率が所定閾値を超えた場合に、学習処理を終了する。 In step S3104, the learning device determines whether or not to end the learning process. For example, the learning device may hold a part of a large amount of learning data as evaluation data. The evaluation data is data for confirming the accuracy of the learning result, and is data that is not used for updating the weighting coefficients. The learning device ends the learning process when the accuracy rate of the estimation process using the evaluation data exceeds a predetermined threshold.
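The termination check of step S3104, running the current model on held-out evaluation data that is never used for weight updates and stopping once the accuracy rate exceeds a predetermined threshold, can be sketched as follows; the callables and the threshold value are illustrative assumptions:

```python
def should_stop(model_predict, eval_data, threshold=0.9):
    """Return True when accuracy on the evaluation data exceeds the threshold.

    `eval_data` is a list of (input, correct_label) pairs held out from
    training; `model_predict` is a hypothetical inference callable.
    """
    correct = sum(1 for x, label in eval_data if model_predict(x) == label)
    return correct / len(eval_data) > threshold
```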
 ステップS3104でNoの場合、ステップS3101に戻り、次の学習データに基づく学習処理が継続される。ステップS3104でYesの場合、学習処理が終了される。学習装置は、生成した学習済モデルの情報を処理システム3100に送信する。例えば、学習済モデルは処理システム3100に含まれる不図示の記憶部に記憶され、処理部3120によって読み出される。なお、機械学習においてはバッチ学習、ミニバッチ学習等の種々の手法が知られており、この変形例ではこれらを広く適用可能である。 If No in step S3104, the process returns to step S3101 to continue the learning process based on the next learning data. If Yes in step S3104, the learning process is terminated. The learning device transmits the generated learned model information to the processing system 3100 . For example, the trained model is stored in a storage unit (not shown) included in processing system 3100 and read by processing unit 3120 . Various techniques such as batch learning and mini-batch learning are known in machine learning, and these can be widely applied to this modified example.
 なお、以上では機械学習が教師あり学習である例について説明した。ただし、この変形例の手法はこれに限定されず、教師無し学習が行われてもよい。例えば、上述したように、ニューラルネットワークNN2の出力層のノード数をM個とした場合、教師無し学習では入力である押し付け圧力情報と送気吸引情報から導出される特徴量の類似度合いに基づいて、多数の入力をM個のカテゴリに分類する分類処理が行われる。 In the above, an example in which the machine learning is supervised learning has been described. However, the technique of this modified example is not limited to this, and unsupervised learning may be performed. For example, as described above, when the number of nodes in the output layer of the neural network NN2 is M, unsupervised learning performs a classification process that classifies a large number of inputs into M categories based on the degree of similarity of feature amounts derived from the input pressing pressure information and air supply/suction information.
 学習装置は、M個のカテゴリの各カテゴリにランク付けを行う。例えば、熟練医のデータが多く含まれるカテゴリのランクが高く、修練医のデータが多く含まれるカテゴリのランクが低く判定される。各データが熟練医のデータであるか修練医のデータであるかは、前述の医師情報や患者情報等に基づいて、判定が可能である。ただし、詳細な処理については種々の変形実施が可能である。例えば、あらかじめ学習用のデータに対して、M段階のランク付けが行われており、学習装置は、各カテゴリに含まれるデータのランクの平均値や合計値等に基づいて、M個のカテゴリのランク付けを行ってもよい。教師無し学習を行う場合であっても、教師あり学習の例と同様に、入力に基づいて、ユーザのスキルをM段階で評価する学習済モデルを生成することが可能である。 The learning device ranks each of the M categories. For example, a category containing a lot of data from skilled doctors is ranked high, and a category containing a lot of data from trainee doctors is ranked low. Whether each piece of data comes from a skilled doctor or a trainee doctor can be determined based on the aforementioned doctor information, patient information, and the like. However, various modifications can be made to the detailed processing. For example, the learning data may be ranked in M stages in advance, and the learning device may rank the M categories based on the average value, total value, or the like of the ranks of the data included in each category. Even when unsupervised learning is performed, it is possible, as in the supervised learning example, to generate a trained model that evaluates the user's skill in M stages based on the input.
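The category-ranking idea above (ordering the M unsupervised categories by the average of the pre-assigned ranks of the data each contains, so that categories dominated by skilled doctors come first) can be sketched as follows; the dictionary-based input format is an illustrative assumption:

```python
def rank_categories(cluster_ranks):
    """Order M categories from rank 1 (most expert-like) to rank M.

    `cluster_ranks` maps a category id to the list of pre-assigned ranks
    of the samples it contains; a lower average rank means the category
    holds more expert data, so it is ranked higher.
    """
    means = {c: sum(r) / len(r) for c, r in cluster_ranks.items()}
    return sorted(means, key=means.get)
```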
 図51は、スキル評価を行う処理を説明するフローチャートである。先ず取得部3110は、スキル評価の対象となる押し付け圧力情報を取得し(ステップS3201)、送気吸引情報を取得する(ステップS3202)。その後、処理部3120は学習済モデルに基づく推論処理を行う(ステップS3203)。図48に示した例であれば、処理部3120は、押し付け圧力情報と送気吸引情報を学習済モデルに入力し、学習済みの重み付け係数に従った順方向の演算を行うことによって、M個の出力を取得する。処理部3120は、当該出力に基づいて、ユーザのスキル評価情報を求める。例えば処理部3120は、M個の出力のうち、最も値が大きいデータに基づいて、ユーザのスキルをM段階で評価する。言い換えれば、処理部3120は、学習用押し付け圧力情報及び学習用送気吸引情報を、M(Mは2以上の整数)個のカテゴリに分類する機械学習を行うことによって取得された学習済モデルと、押し付け圧力情報及び送気吸引情報とに基づいて、スキル評価を行う。上述したように、学習済モデルは、教師あり学習に基づいて生成されてもよいし、教師無し学習に基づいて生成されてもよい。 FIG. 51 is a flowchart explaining the skill evaluation process. First, the acquisition unit 3110 acquires the pressing pressure information subject to skill evaluation (step S3201) and acquires the air supply/suction information (step S3202). After that, the processing unit 3120 performs inference processing based on the trained model (step S3203). In the example shown in FIG. 48, the processing unit 3120 inputs the pressing pressure information and the air supply/suction information to the trained model and performs forward calculations according to the learned weighting coefficients, thereby obtaining M outputs. The processing unit 3120 obtains the user's skill evaluation information based on these outputs. For example, the processing unit 3120 evaluates the user's skill in M stages based on the data with the largest value among the M outputs. In other words, the processing unit 3120 performs the skill evaluation based on the pressing pressure information and the air supply/suction information, together with a trained model acquired by machine learning that classifies learning pressing pressure information and learning air supply/suction information into M (M is an integer of 2 or more) categories. As described above, the trained model may be generated based on supervised learning or unsupervised learning.
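The inference of step S3203, feeding the two inputs to the trained model and taking the category with the largest of the M outputs as the evaluated rank, can be sketched as follows. `model` is an illustrative callable returning M probability-like values, not the actual trained model of the disclosure:

```python
import numpy as np

def evaluate_skill(pressure, air_suction, model):
    """Return the evaluated rank (1 = highest skill) from the M outputs.

    The category whose output value is largest is taken as the result,
    matching the 'largest value among the M outputs' rule above.
    """
    outputs = np.asarray(model(pressure, air_suction))
    return int(outputs.argmax()) + 1  # categories 1..M map to ranks 1..M
```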
 その後、出力処理部3130は、スキル評価の結果であるスキル評価情報を出力する(ステップS3204)。ここでのスキル評価情報とは、例えば、図40のマーキング段階の評価、局注段階の評価、切開段階の評価、剥離段階の評価、止血段階の評価及び総合評価の組み合わせからなるM通りの評価結果のいずれであるかを特定する情報である。 After that, the output processing unit 3130 outputs the skill evaluation information, which is the result of the skill evaluation (step S3204). The skill evaluation information here is, for example, information specifying which of M evaluation results applies, each result consisting of a combination of the marking stage evaluation, local injection stage evaluation, incision stage evaluation, stripping stage evaluation, hemostasis stage evaluation, and comprehensive evaluation of FIG. 40.
 なお、スキル評価情報は、総合評価の情報と処置の各段階における評価の情報を含むため、処理システム3100の記憶部には、それぞれの評価に応じて別々の学習済モデルが記憶されている。そして、それぞれのスキル評価の対象となる学習済モデルを用いて、図51の処理が行われる。また、前述した処置期間に限定してスキル評価を行う場合は、例えば、作成されたログ情報をもとに、処置期間及び処置期間に対応する押し付け圧力情報及び送気吸引情報からなるデータを抽出する処理を行う。そして、当該データと、当該処置期間が属する段階に対応する学習済モデルを用いて、図51の処理が行われる。処置期間より前の期間に限定してスキル評価を行う場合も同様である。また、図41に示したアドバイス情報をスキル評価に追加する場合は、例えば、スキル評価の結果に対応する定型文を予め記憶部に記憶させておく。そして、図51の処理に、得られたスキル評価結果に対応する定型文を選択する処理と、当該選択された定型文をスキル評価結果に追加する処理を行う。前述のエキスパートデータとの差異を示すアドバイス情報を表示したい場合は、例えば、エキスパートデータを予め取得し、当該エキスパートデータをメタデータとして保持し、当該メタデータをもとに当該差異を推論する処理と、当該推論による結果の情報と、関連する定型文とに基づいて、適したアドバイス情報を作成する処理が追加される。なお、エキスパートデータを予め取得するには、同様の症例について熟練者による手術を通じてデータを得る必要がある。また、一の内視鏡システム3300で取得したエキスパートデータを、他の内視鏡システム3300に移植することも可能である。 Since the skill evaluation information includes comprehensive evaluation information and evaluation information at each stage of treatment, the storage unit of the processing system 3100 stores separate trained models according to each evaluation. Then, the processing of FIG. 51 is performed using the learned models to be evaluated for each skill. In addition, when the skill evaluation is limited to the treatment period described above, for example, based on the created log information, data consisting of the treatment period and the pressing pressure information and air supply/suction information corresponding to the treatment period are extracted. process. Then, the processing of FIG. 51 is performed using the data and the learned model corresponding to the stage to which the treatment period belongs. The same is true when skill evaluation is limited to the period before the treatment period. Further, when adding the advice information shown in FIG. 41 to the skill evaluation, for example, fixed phrases corresponding to the result of the skill evaluation are stored in advance in the storage unit. Then, in the process of FIG. 
51, a process of selecting a fixed phrase corresponding to the obtained skill evaluation result and a process of adding the selected fixed phrase to the skill evaluation result are performed. To display advice information indicating a difference from the aforementioned expert data, for example, the expert data is acquired in advance and held as metadata, and a process of inferring the difference based on that metadata and a process of creating suitable advice information based on the inference result and the related fixed phrases are added. Note that acquiring expert data in advance requires obtaining data through surgery performed by an expert on a similar case. It is also possible to transfer expert data acquired by one endoscope system 3300 to another endoscope system 3300.
 これにより、押し付け圧力情報及び送気吸引情報の両方を考慮しつつ、さらに機械学習を用いることから、術者のスキルについて、精度を高く評価することができる。 As a result, since both the pressing pressure information and the air supply/suction information are taken into account and machine learning is additionally used, the operator's skill can be evaluated with high accuracy.
 以上で説明したように、処理システム3100の処理部3120は、学習済モデルに従って動作することによって、術者のスキル評価を行う。学習済モデルに従った処理部3120における演算、即ち、入力データに基づいて出力データを出力するための演算は、ソフトウェアによって実行されてもよいし、ハードウェアによって実行されてもよい。換言すれば、図48の各ノードにおいて実行される積和演算等は、ソフトウェア的に実行されてもよい。或いは上記演算は、FPGA等の回路装置によって実行されてもよい。また、上記演算は、ソフトウェアとハードウェアの組み合わせによって実行されてもよい。このように、学習済モデルからの指令に従った処理部3120の動作は、種々の態様によって実現可能である。例えば学習済モデルは、推論アルゴリズムと、当該推論アルゴリズムにおいて用いられる重み付け係数とを含む。推論アルゴリズムとは、入力データに基づいて、順方向の演算等を行うアルゴリズムである。この場合、推論アルゴリズムと重み付け係数の両方が記憶部に記憶され、処理部3120は、当該推論アルゴリズムと重み付け係数を読み出すことによってソフトウェア的に推論処理を行ってもよい。 As described above, the processing unit 3120 of the processing system 3100 evaluates the operator's skill by operating according to the trained model. Calculations in the processing unit 3120 according to the trained model, that is, calculations for outputting output data based on input data, may be executed by software or by hardware. In other words, the sum-of-products operations and the like executed at each node in FIG. 48 may be executed by software. Alternatively, the above calculations may be executed by a circuit device such as an FPGA. The above calculations may also be executed by a combination of software and hardware. In this way, the operation of the processing unit 3120 according to commands from the trained model can be realized in various ways. For example, the trained model includes an inference algorithm and weighting coefficients used in that inference algorithm. An inference algorithm is an algorithm that performs forward calculations and the like based on input data. In this case, both the inference algorithm and the weighting coefficients are stored in the storage unit, and the processing unit 3120 may perform the inference processing in software by reading out the inference algorithm and the weighting coefficients.
Alternatively, the inference algorithm may be implemented by FPGA or the like, and the storage unit may store the weighting coefficients. Alternatively, an inference algorithm including weighting factors may be implemented by an FPGA or the like. In this case, the storage unit that stores the information of the trained model is, for example, the built-in memory of the FPGA.
 In the processing of FIG. 51, the processing unit 3120 may also obtain an N-dimensional feature amount (N being an integer of 2 or more) based on the pressing pressure information, the air supply/suction information, and the trained model. For example, the learning device may perform machine learning that classifies a plurality of pieces of training pressing pressure information and training air supply/suction information into M categories, in the same manner as the processing described above with reference to FIGS. 48 and 49.
 The flow of processing in the processing system 3100 is the same as in FIG. 51. First, the acquisition unit 3110 acquires the pressing pressure information and the air supply/suction information to be subjected to skill evaluation (steps S3201 and S3202). Then, in step S3203, the processing unit 3120 likewise inputs the pressing pressure information and the air supply/suction information into the trained model and performs forward computation in accordance with the learned weighting coefficients. At this time, the processing unit 3120 obtains the data in an intermediate layer as an N-dimensional feature amount. For example, when the neural network NN2 has first through Q-th intermediate layers, the values in a J-th intermediate layer having N nodes serve as the N-dimensional feature amount, where Q is an integer of 2 or more and J is an integer from 1 to Q. For example, when J = Q, the intermediate layer closest to the output layer has N nodes and the output of each node becomes a feature amount. Alternatively, the N-dimensional feature amount may be obtained by combining the outputs of a plurality of intermediate layers.
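As a rough illustration of reading out an intermediate layer as an N-dimensional feature amount, the following sketch builds a tiny fully connected network in which the last hidden layer (J = Q) has N = 2 nodes whose activations serve as the feature amount. All layer sizes, weights, input lengths, and the M = 4 output categories are illustrative assumptions, not values taken from the embodiment.

```python
import numpy as np

# Hypothetical sketch: a small fully connected network whose last hidden
# layer (J = Q, N = 2 nodes) is read out as the N-dimensional feature amount.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Input: pressing pressure samples + air supply/suction samples (16 values, assumed)
W1 = rng.normal(size=(16, 10))   # first intermediate layer: 10 nodes
W2 = rng.normal(size=(10, 2))    # J-th (= Q-th) intermediate layer: N = 2 nodes
W3 = rng.normal(size=(2, 4))     # output layer: M = 4 skill categories

def forward(x):
    h1 = relu(x @ W1)
    h2 = relu(h1 @ W2)           # N-dimensional feature amount (N = 2)
    logits = h2 @ W3
    return h2, logits

x = rng.normal(size=(16,))       # one set of pressure + suction inputs
feature, logits = forward(x)
print(feature.shape)             # (2,): one point in the feature amount space
```

In the embodiment the weighting coefficients would come from the trained model; only the readout of the hidden layer as a feature vector is the point of the sketch.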
 FIG. 52 is an example of the N-dimensional feature amount space. The horizontal axis represents a feature amount A3 among the N-dimensional feature amounts, and the vertical axis represents a feature amount B3 different from A3. Here N = 2, but N may be 3 or more. By inputting the pressing pressure information and the air supply/suction information, the values of the first through N-th feature amounts are obtained; that is, one set of pressing pressure information and air supply/suction information is plotted as one point in the N-dimensional feature amount space. As shown in FIG. 52, the N-dimensional feature amount extracted based on machine learning is a feature amount for classifying the input, which consists of the pressing pressure information and the air supply/suction information, into M categories. Accordingly, the result of clustering based on distance in the N-dimensional feature amount space yields the categories representing the user's skill; that is, the user's skill evaluation can be classified in M ways according to the position of the point given by the N-dimensional feature amount obtained from the input. For example, when skill evaluation is performed in the grades shown in FIG. 40 and the like, C31 in FIG. 52 represents the category of evaluation A, C32 the category of evaluation B, and C33 the category of evaluation C, and the total number of such categories is M. In the example of FIG. 40, the evaluations can be classified into A through D, so M = 4.
 The processing unit 3120 performs skill evaluation based on the distance between the position, in the feature amount space, of the N-dimensional feature amount obtained by inputting the pressing pressure information and air supply/suction information to be evaluated into the trained model, and the centroid position, in the feature amount space, of one or more of the M categories. The centroid position here is information obtained from the positions of the plurality of points included in each category, for example the average of their coordinate values, and the centroid position of each category is known once learning is complete. In other words, the processing unit 3120 obtains an N-dimensional feature amount (N being an integer of 2 or more) based on the pressing pressure information, the air supply/suction information, and the trained model, and performs skill evaluation based on the distances between the obtained N-dimensional feature amount and the centroids of the M categories. The distance here is, for example, the Euclidean distance, but another distance such as the Mahalanobis distance may be used.
 For example, the processing unit 3120 determines, among the first through M-th categories, the category whose centroid is closest to the N-dimensional feature amount obtained by the forward computation, and determines that the data to be evaluated belongs to that category. In the example of FIG. 52, the processing unit 3120 determines the aforementioned evaluation A when the distance to the centroid of C31 is smallest, the aforementioned evaluation B when the distance to the centroid of C32 is smallest, and the aforementioned evaluation C when the distance to the centroid of C33 is smallest.
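The nearest-centroid decision described above can be sketched as follows. The centroid coordinates for the categories C31 through C33 and the evaluated point are made-up values; in the embodiment they would be determined in the trained model's feature amount space once learning is complete.

```python
import numpy as np

# Hypothetical sketch of the nearest-centroid classification described above.
centroids = {
    "A": np.array([0.2, 0.1]),   # C31: category of evaluation A (illustrative)
    "B": np.array([0.8, 0.9]),   # C32: category of evaluation B (illustrative)
    "C": np.array([0.9, 0.2]),   # C33: category of evaluation C (illustrative)
}

def classify(feature, centroids):
    """Return the category whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda k: np.linalg.norm(feature - centroids[k]))

point = np.array([0.25, 0.15])     # N-dimensional feature of the evaluated data
print(classify(point, centroids))  # "A": closest to the C31 centroid
```

Replacing `np.linalg.norm` with a covariance-weighted distance would give the Mahalanobis variant mentioned above.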
 As described above, the feature amounts A3 and B3 in FIG. 52 are parameters extracted based on the pressing pressure information and the air supply/suction information, and are therefore parameters different from the pressing pressure information and air supply/suction information themselves; however, this does not preclude making the feature amount A3 correspond to the pressing pressure information itself and the feature amount B3 correspond to the air supply/suction information itself. In other words, the processing unit 3120 may perform skill evaluation based on a distance in a feature amount space defined by the feature amount A3, which is a first feature amount corresponding to the pressing pressure information, and the feature amount B3, which is a second feature amount corresponding to the air supply/suction information. For example, C31 in FIG. 52 is determined to be evaluation A because it is a category in which the relative change in pressing pressure is very small. C32 is determined to be evaluation B because, although its degree of air supply/suction is large, its relative change in pressing pressure is smaller than that of C33. C33 is determined to be evaluation C because, although its degree of air supply/suction is smaller than that of C32, its relative change in pressing pressure is larger than that of C32. In this way, skill evaluation can make more appropriate use of the pressing pressure information and air supply/suction information, enabling more accurate skill evaluation.
 The processing after the user evaluation is the same as in FIG. 51: in step S3204, the output processing unit 3130 outputs the skill evaluation information that is the result of the skill evaluation.
 The above has described an example in which the intermediate-layer data obtained when clustering is performed serves as the N-dimensional feature amount. However, the technique of this modification is not limited to this. For example, the N-dimensional feature amount may be extracted by performing principal component analysis on the input based on the pressing pressure information and the air supply/suction information. Since methods of performing principal component analysis are well known, a detailed description is omitted; methods of performing principal component analysis using machine learning are also known, and machine learning is applicable in that case as well. The processing after the N-dimensional feature amount has been extracted is the same as in the example above.
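As a sketch of the principal component analysis alternative, the following reduces placeholder input vectors to an N = 2 dimensional feature amount via the covariance eigendecomposition. The data, sample count, and dimensions are illustrative assumptions only, not values from the embodiment.

```python
import numpy as np

# Hypothetical sketch: PCA-based extraction of an N-dimensional feature
# amount (here N = 2) from input vectors built from pressing pressure and
# air supply/suction information. The input data is random placeholder data.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))          # 50 procedures, 8 raw measurements each

Xc = X - X.mean(axis=0)               # center each measurement
cov = np.cov(Xc, rowvar=False)        # 8x8 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]     # sort components by explained variance
components = eigvecs[:, order[:2]]    # keep the top N = 2 principal axes

features = Xc @ components            # N-dimensional feature amounts
print(features.shape)                 # (50, 2): one 2-D point per procedure
```

After this projection, the centroid-distance evaluation proceeds exactly as in the clustering example above.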
 When the N-dimensional feature amount is used, the skill evaluation technique is not limited to the above. For example, the processing unit 3120 may perform skill evaluation based on the distance between the plotted point corresponding to the user to be evaluated and the plotted point corresponding to a second user different from that user. The second user here is, for example, an instructor, and the user to be evaluated is a user receiving guidance from that instructor. In this way, an index indicating how close the skill of the user to be evaluated is to the skill of the instructor can be output as the skill evaluation information.
 In treatment using an endoscope, a plurality of approaches are conceivable even for the same lesion at the same site. Since which approach is considered suitable depends on the user, what different instructors regard as good treatment may differ; in other words, a plurality of expert physicians form different schools with respect to a given treatment. In this regard, by using information representing the degree of similarity to a specific user as the skill evaluation information as described above, the skill of the target user can be evaluated appropriately. For example, the skill of a user belonging to a given school is judged with an expert physician of the same school as the reference.
 Although the present embodiment and its modifications have been described above, the present disclosure is not limited to each embodiment or modification as it stands; at the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the disclosure. A plurality of constituent elements disclosed in the embodiments and modifications described above can also be combined as appropriate; for example, some constituent elements may be deleted from the full set of constituent elements described in each embodiment or modification, and constituent elements described in different embodiments or modifications may be combined as appropriate. In this manner, various modifications and applications are possible without departing from the gist of the present disclosure. In addition, a term that appears at least once in the specification or drawings together with a different term having a broader or equivalent meaning can be replaced with that different term anywhere in the specification or drawings.
 11, 2011, 3011: distal end portion; 12, 2012, 3012: bending portion; 13, 2013, 3013: flexible portion; 100, 2100, 3100: processing system; 110, 3110: acquisition unit; 120, 2110, 3120: processing unit; 130, 2120, 3130: output processing unit; 300, 2300, 3300: endoscope system; 310, 2310, 3310: scope unit; 310a, 2310a, 3310a: operation unit; 310b, 2310b, 3310b: insertion portion; 310c, 2310c, 3310c: universal cable; 310d, 2310d, 3310d: connector; 311, 2311, 3311: objective optical system; 312, 2312, 3312: imaging element; 314, 2314, 3314: illumination lens; 315, 2315, 3315: light guide; 316, 3316: opening; 317, 3317: suction pipe; 318, 3318: nozzle; 319, 3319: air/water supply pipe; 330, 2330, 3330: processing device; 331, 2331, 3331: preprocessing unit; 332, 2332, 3332: control unit; 333, 2333, 3333: storage unit; 335, 2335, 3335: detection processing unit; 336, 2336, 3336: post-processing unit; 340, 2340, 3340: display unit; 350, 2350, 3350: light source device; 352, 2352, 3352: light source; 360, 2360, 3360: treatment instrument; 370, 3370: suction device; 380, 3380: air/water supply device; 400, 3400: skill evaluation sheet; 410, 3410: doctor information icon; 420, 3420: case sheet; 430, 3430: comprehensive evaluation icon; 440, 3440: marking evaluation icon; 442, 3442: local injection evaluation icon; 444, 3444: incision evaluation icon; 446, 3446: dissection evaluation icon; 448, 3448: hemostasis evaluation icon; 450, 3450: data log icon; 460, 3460: patient information icon; 470: marking advice display; 472: local injection advice display; 474: incision advice display; 3476: dissection advice display; 3478: hemostasis advice display; DB: database; NN1, NN2: neural network

Claims (13)

  1.  A processing system comprising:
     an acquisition unit that acquires approach angle information of an insertion portion of an endoscope and energization history information relating to an energization history of a treatment instrument;
     a processing unit that performs skill evaluation of a user operating the endoscope based on the approach angle information and the energization history information; and
     an output processing unit that outputs skill evaluation information that is a result of the skill evaluation.
  2.  The processing system according to claim 1, wherein
     the treatment with the treatment instrument includes a plurality of stages, and
     the output processing unit outputs the skill evaluation information for each of the plurality of stages.
  3.  The processing system according to claim 2, wherein
     the plurality of stages include at least two of a marking stage, a local injection stage, an incision stage, and a dissection stage.
  4.  The processing system according to claim 1, wherein
     the output processing unit outputs log information of the approach angle information.
  5.  The processing system according to claim 1, wherein
     the output processing unit outputs advice information relating to at least one of the approach angle information and the energization history information.
  6.  The processing system according to claim 5, wherein
     the output processing unit displays, as the advice information, a difference from expert data relating to at least one of the approach angle information and the energization history information.
  7.  The processing system according to claim 1, wherein
     the processing unit performs the skill evaluation based on the approach angle information and the energization history information, and on a trained model obtained by performing machine learning that classifies training approach angle information and training energization history information into M categories, M being an integer of 2 or more.
  8.  The processing system according to claim 7, wherein
     the processing unit obtains an N-dimensional feature amount, N being an integer of 2 or more, based on the approach angle information, the energization history information, and the trained model, and performs the skill evaluation based on distances between the obtained N-dimensional feature amount and centroids of the M categories.
  9.  The processing system according to claim 1, wherein
     the processing unit performs the skill evaluation based on a distance in a feature amount space defined by a first feature amount corresponding to the approach angle information and a second feature amount corresponding to the energization history information.
  10.  The processing system according to claim 1, wherein
     the processing unit performs the skill evaluation based on the approach angle information during an energization period of the treatment instrument.
  11.  The processing system according to claim 1, wherein
     the processing unit performs the skill evaluation based on the approach angle information during a period prior to an energization period of the treatment instrument.
  12.  The processing system according to claim 1, wherein
     the approach angle information is information relating to a relative change of an approach angle with respect to a reference angle, the reference angle being the approach angle at a timing corresponding to a start of treatment.
  13.  An information processing method comprising:
     acquiring approach angle information of an insertion portion of an endoscope and energization history information relating to an energization history of a treatment instrument;
     performing skill evaluation of a user operating the endoscope based on the approach angle information and the energization history information; and
     outputting skill evaluation information that is a result of the skill evaluation.
PCT/JP2021/036108 2021-09-30 2021-09-30 Processing system and information processing method WO2023053334A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/036108 WO2023053334A1 (en) 2021-09-30 2021-09-30 Processing system and information processing method


Publications (1)

Publication Number Publication Date
WO2023053334A1 true WO2023053334A1 (en) 2023-04-06

Family

ID=85781595

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/036108 WO2023053334A1 (en) 2021-09-30 2021-09-30 Processing system and information processing method

Country Status (1)

Country Link
WO (1) WO2023053334A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017094084A (en) * 2015-11-17 2017-06-01 コヴィディエン リミテッド パートナーシップ Systems and methods for ultrasound image-guided ablation antenna placement
JP2019165270A (en) * 2016-08-03 2019-09-26 シャープ株式会社 Video image output system, video image output method, and control apparatus
JP2020106844A (en) * 2013-12-20 2020-07-09 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Simulator system for medical procedure training



Legal Events

Date Code Title Description

121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21959375; Country of ref document: EP; Kind code of ref document: A1)