WO2020203034A1 - Endoscopic system - Google Patents

Endoscopic system

Info

Publication number
WO2020203034A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
unit
information
image
output
Prior art date
Application number
PCT/JP2020/009616
Other languages
French (fr)
Japanese (ja)
Inventor
青野 進
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Publication of WO2020203034A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • the present invention relates to an endoscope system that presents various information obtained from endoscopic image data, each detection unit, each constituent unit, and the like as information when performing endoscopic observation or treatment under endoscopic observation.
  • an endoscope system including an endoscope having an elongated tube-shaped insertion portion is widely used in, for example, the medical field and the industrial field.
  • the medical endoscope system used in the medical field is configured so that, for example, the insertion portion can be inserted into the body cavity of a living body to observe the inside of an organ or the like, and various treatments can be applied to a target organ or the like using a predetermined treatment tool as necessary.
  • in the industrial endoscope system, the insertion portion is inserted into equipment such as a jet engine or factory piping, or into a mechanical device, so that the condition of scratches, corrosion, and the like inside the equipment or mechanical device can be observed and inspected.
  • from endoscopic image data, various information (for example, shape, hue, saturation, brightness, frequency characteristics, distance, etc.) can be obtained.
  • various information during endoscopic observation (for example, position information of the tip of the endoscope, time information, etc.) can also be obtained.
  • various information regarding the state of each constituent unit (for example, the amount of light emitted by the light source device, wavelength information, the output amount of the energy treatment device, output time information, etc.) can also be obtained.
  • the endoscope system disclosed in the above-mentioned republished patent WO2016/151888 and the like determines the degree of progress of a treatment, such as PDT (photodynamic therapy), in which therapeutic light is irradiated.
  • PDT photodynamic therapy
  • the endoscope system disclosed in Japanese Patent Publication No. 2005-237641 and the like detects the amount of illumination light so that the lamp and other components that supply the illumination light of the light source device can be maintained more appropriately.
  • the system disclosed in the above-mentioned Japanese Patent Publication No. 5-285099 is an X-ray imaging system, which includes a detection device for detecting the residual X-ray dose of X-rays emitted from the X-ray irradiation device.
  • the endoscopic system disclosed by the above-mentioned republished patent WO2016 / 151888 uses the luminance value information among the information acquired from the endoscopic image data.
  • the endoscope system disclosed in Japanese Patent Publication No. 2005-237641 uses light quantity information as information acquired from the light source device.
  • the system disclosed in the Japanese Patent Publication No. 5-285099 and the like uses X-ray dose information as information acquired by the detection device.
  • the present invention has been made in view of the above points, and an object of the present invention is to provide an endoscope system that can immediately present, in real time using a display device, various information acquired during endoscopic observation or treatment under endoscopic observation, and that enables efficient and reliable observation and treatment.
  • the endoscope system of one aspect of the present invention includes: an imaging unit that forms an optical image of a subject to generate image data; a time measurement unit that acquires time information corresponding to the image data; an external device information acquisition unit that acquires output energy amount information and output time information output from an energy treatment device used simultaneously in an examination using the imaging unit; a data integration unit that outputs integrated data by associating the time information for the image data with the information acquired by the external device information acquisition unit; and a recording unit that records the integrated data output from the data integration unit as one integrated endoscope image group based on the time information.
  • according to the present invention, various information acquired during endoscopic observation or treatment under endoscopic observation can be immediately presented in real time using a display device, so that an endoscope system enabling efficient and reliable observation and treatment can be provided.
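The integration described in this aspect (image data tagged with time information, merged with energy treatment device output, and recorded as one time-ordered group) can be sketched in a few lines of code. This is an illustrative reading of the claim, not an implementation from the patent; all class names, field names, and the 50 ms matching window are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Frame:
    """One image from the imaging unit, with its time information."""
    t_ms: int
    image_data: bytes

@dataclass
class EnergyInfo:
    """Output energy amount and output time from the energy treatment device."""
    t_ms: int
    output_mj: float

@dataclass
class IntegratedRecord:
    """Integrated data: image data associated with external device information."""
    t_ms: int
    image_data: bytes
    energy: Optional[EnergyInfo]

def integrate(frames: List[Frame], energy_infos: List[EnergyInfo],
              window_ms: int = 50) -> List[IntegratedRecord]:
    """Associate each frame with energy info recorded close in time,
    then order everything by time information for recording."""
    records = []
    for f in sorted(frames, key=lambda x: x.t_ms):
        match = next((e for e in energy_infos
                      if abs(e.t_ms - f.t_ms) <= window_ms), None)
        records.append(IntegratedRecord(f.t_ms, f.image_data, match))
    return records

# frames arrive out of order; only the 100 ms frame has nearby energy output
records = integrate([Frame(100, b"img1"), Frame(33, b"img0")],
                    [EnergyInfo(95, 1.2)])
```

The sorted, time-associated list stands in for the "integrated endoscope image group" the recording unit would store.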
  • Block configuration diagram showing a main configuration of the endoscope system of the first embodiment of the present invention
  • Block configuration diagram showing a main configuration of the endoscope system of the second embodiment of the present invention
  • FIG. 1 is a block configuration diagram showing a main configuration in the endoscope system according to the first embodiment of the present invention.
  • the endoscope system 1 of the first embodiment of the present invention is configured to include an endoscope 10, a light source device 20, a processor 30, an analysis processing device 40, a monitor 50 as a display device, an energy treatment device 60, and the like.
  • the endoscope 10 has an elongated tube-shaped insertion portion; the insertion portion is inserted into, for example, a body cavity of a living body to observe and inspect the inside of an organ or the like, and various treatments can be performed on a target organ or the like using a predetermined treatment tool as necessary. An imaging unit 11 is therefore provided inside the tip of the insertion portion of the endoscope 10.
  • the image pickup unit 11 is composed of an image pickup element (imager) 11a, an image pickup optical system 11b, and the like.
  • the image pickup device (imager) 11a includes a photoelectric conversion element that outputs image data by receiving an optical image imaged by the image pickup optical system 11b and performing a photoelectric conversion process.
  • the imaging optical system 11b is composed of a plurality of or a single optical lens for forming an optical image of a subject.
  • the imaging unit 11 is a constituent unit that forms an optical image of a subject to generate and acquire image data; since the same configuration as that applied to conventional endoscopes is applied, detailed description of the imaging unit 11 itself will be omitted.
  • a configuration example having only one imaging unit 11 is shown, but the present embodiment is not limited to this configuration; for example, two or more imaging units 11 may be provided.
  • with two or more imaging units 11, image data capable of forming a stereo image (3D image) can be generated.
  • distance information and the like can then be acquired based on the two or more sets of image data acquired by the two or more imaging units 11.
  • a light guide fiber 12 is inserted and arranged in the insertion portion of the endoscope 10.
  • the light guide fiber 12 is provided between the illumination optical system 13 provided on the tip end surface of the insertion portion of the endoscope 10 and the connector portion (not shown) of the light source device 20.
  • the light guide fiber 12 serves to transmit the luminous flux emitted from the light source device 20 to the illumination optical system 13 on the front end surface of the insertion portion of the endoscope 10.
  • the light emitted from the illumination optical system 13 is directed toward the observation target site 101 of the subject 100 such as a patient, and illuminates the vicinity of the observation target site 101.
  • the light source device 20 is a device for supplying illumination light to the endoscope 10.
  • the light source device 20 is composed of, for example, a white light light source 21, an excitation light light source 22, a splitter 23, a condenser lens 24, and the like.
  • the white light light source 21 is a light source that emits white light. Specifically, for example, a light-emitting diode (LED) or a xenon lamp is applied to the white light light source 21. When light-emitting diodes are used, for example, B (blue), G (green), and R (red) diodes may be combined to generate white light.
  • the excitation light light source 22 is, for example, a light source that emits excitation light (light of a specific wavelength).
  • the splitter 23 is a constituent unit having a dichroic mirror surface that reflects substantially 100% of light at or above a specific wavelength and transmits substantially 100% of normal light.
  • the splitter 23 reflects, with a reflectance of approximately 100%, the excitation light (light of a specific wavelength) emitted from the excitation light light source 22 and light having wavelengths at or above that specific wavelength, and transmits, with a transmittance of approximately 100%, the white light emitted from the white light light source 21. The combined light is then emitted toward the condenser lens 24.
  • the condensing lens 24 is an optical lens that condenses the light from each light source (21, 22) and emits it toward the end surface of the light guide fiber 12 provided in the connector portion of the light source device 20.
  • as for the configuration of the light source device 20 itself, the same configuration as the conventional one is applied, and further description thereof will be omitted. In the light source device 20 of the present embodiment, the white light light source 21 and the excitation light light source 22 are illustrated, but the types of light sources are not limited to these, and light sources that generate other types of light may also be adopted.
  • the processor 30 receives the output signal from the imaging unit 11 of the endoscope 10 to generate image data, and acquires various information based on that output signal. The processor 30 also acquires detection log data (time-series data) based on the acquired information and the time information corresponding to each item of information, and associates and synthesizes the image data and the detection log data. The processor 30 then performs display control for displaying, in predetermined areas on the display screen of the monitor 50, an image based on the generated image log composite data (integrated data), information based on the various information data, and the like.
  • the processor 30 includes, for example, an image reading unit 31, a time measurement unit 31a, a normal optical image generation unit 32, a detection unit 33 (parameter detection unit), a detection log data acquisition unit 34, an output unit 35, an image/log data synthesis unit 36 (data integration unit, image synthesis unit), a display control unit 37, a recording unit 38, an energy output detection unit 39 (external device information acquisition unit), and the like.
  • the image reading unit 31 is a circuit or software program that reads an image signal output from the image sensor 11a of the image pickup unit 11.
  • the time measurement unit 31a is a circuit or software program that measures a predetermined time and outputs the acquired time information. Specifically, for example, the time measurement unit 31a measures the time from the start of reading the image signal by the image reading unit 31 and outputs the acquired time information to the detection log data acquisition unit 34. Further, the time measuring unit 31a measures the energy output time by the energy output unit 61 and outputs the acquired time information to the energy output detecting unit 39.
  • the time measurement unit 31a corresponds to, for example, an internal clock circuit called a real-time clock (RTC) or the like.
  • the time information obtained by the time measuring unit 31a is used as time information related to, for example, date and time information associated with the image data, various numerical data detected by the detection unit 33, detection log data, and the like.
  • the normal optical image generation unit 32 is a circuit or software program that generates image data by receiving an image signal output from the image sensor 11a of the image pickup unit 11.
  • the image data generated by the normal light image generation unit 32 is image data generated based on an image signal acquired when the subject is illuminated with normal light (white light).
  • the detection unit 33 is a parameter detection unit including a circuit or a software program for detecting predetermined numerical data based on an image signal or the like read by the image reading unit 31.
  • the predetermined numerical data detected by the detection unit 33 includes, for example, shape information on the image; hue, saturation, and brightness information in the image; frequency characteristics; brightness level; specific wavelengths on the image; light quantity information of the light source unit; wavelength information of the light emitted by the light source; and so on.
  • the detection unit 33 is not limited to numerical data based on image signals; it also detects numerical data based on output signals from various other sensors, such as a gyro sensor, a GPS (Global Positioning System) sensor, a temperature sensor, a pressure sensor, an acceleration sensor, and the like.
  • one imaging unit 11 is provided, but in addition to this configuration, for example, an endoscope 10 having two or more imaging units 11 can be considered. With this configuration, it is also possible to acquire numerical data related to distance information as information data acquired based on each image data output from two or more imaging units 11.
  • the detection log data acquisition unit 34 is a circuit or software program that acquires the numerical data acquired by the detection unit 33 as log data. Therefore, the detection log data acquisition unit 34 acquires the time information corresponding to each numerical data acquired by the detection unit 33 from the time measurement unit 31a.
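The role of the detection log data acquisition unit, pairing each numerical value from the detection unit with time information from the time measurement unit, might be sketched as follows. All names and the millisecond time base are assumptions for illustration, not taken from the patent.

```python
import time

class DetectionLogAcquirer:
    """Sketch of a detection log data acquisition unit: each numerical
    value from the detection unit is stored together with elapsed time
    measured from the start of image reading (hypothetical design)."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock      # stands in for the time measurement unit
        self._t0 = clock()       # time origin: start of reading
        self.log = []

    def record(self, name, value):
        """Append one time-tagged detection entry to the log."""
        elapsed_ms = (self._clock() - self._t0) * 1000.0
        entry = {"t_ms": elapsed_ms, "name": name, "value": value}
        self.log.append(entry)
        return entry

# deterministic fake clock (seconds) so the example is reproducible
fake_t = iter([0.0, 0.010, 0.025])
acq = DetectionLogAcquirer(clock=lambda: next(fake_t))
acq.record("brightness_level", 182)
acq.record("hue_mean", 0.41)
```

Each entry then carries the time information needed later for association with image data.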
  • the output unit 35 is a circuit or software program that outputs the detection log data acquired by the detection log data acquisition unit 34 and the output data (image log composite data, integrated data) from the image/log data synthesis unit 36 to the external analysis processing device 40.
  • the image/log data synthesis unit 36 is a data integration unit consisting of a circuit or software program that performs a synthesis process associating the image data generated by the normal optical image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34, and generates and outputs image log composite data (integrated data).
  • the image / log data synthesizing unit 36 is also an image synthesizing unit that performs an image synthesizing process for displaying the images and information included in the image log synthesizing data (integrated data) on the display screen.
  • the display control unit 37 is a circuit or software program that controls display when displaying an image, information, or the like in a predetermined form on the display screen of the monitor 50.
  • the display control performed by the display control unit 37 includes display control in various display forms, such as displaying an image based on the image data generated by the normal optical image generation unit 32 in a predetermined area on the display screen of the monitor 50, displaying information based on the detection log data acquired by the detection log data acquisition unit 34 in a predetermined area on the display screen of the monitor 50, and superimposing an information display on an image display.
  • the recording unit 38 is a unit including a recording medium such as a semiconductor memory for recording the output data from the image/log data synthesis unit 36 and the detection log data acquisition unit 34, and a circuit or software program for driving the recording medium.
  • the recording unit 38 records the image log composite data (integrated data) output from the image/log data synthesis unit 36 (data integration unit) as one integrated endoscope image group based on the time information (for example, a plurality of still image data collected in chronological order as a set of moving image data).
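The recording step, collecting a plurality of still images into one chronologically ordered endoscope image group, reduces to sorting the composite records by their time information. A minimal sketch, with a hypothetical record layout:

```python
def to_image_group(composite_records):
    """Sort image log composite records by time information so that a
    plurality of still images forms one chronological image group,
    like the frames of a moving image (illustrative layout)."""
    return sorted(composite_records, key=lambda r: r["t_ms"])

group = to_image_group([
    {"t_ms": 66, "image": "frame2", "log": {"brightness": 180}},
    {"t_ms": 0,  "image": "frame0", "log": {"brightness": 175}},
    {"t_ms": 33, "image": "frame1", "log": {"brightness": 178}},
])
```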
  • the present invention is not limited to this embodiment, and the recording unit 38 may be provided outside.
  • for example, by using the recording unit 43 of the analysis processing device 40, which is an external device, the recording unit 38 in the processor 30 can be omitted.
  • the energy output detection unit 39 is an external device information acquisition unit consisting of a circuit or software program that receives the various information data output from the energy output unit 61 of the energy treatment device 60 and the time information from the time measurement unit 31a, detects various information regarding the output energy (output amount, output intensity, output time, etc.), and generates log data.
  • the processor 30 has various constituent units other than those described above, but since they are not directly related to the present invention, detailed description thereof will be omitted.
  • the analysis processing device 40 is a circuit or software program that receives and records output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30, performs predetermined analysis and determination, and generates a control signal for the energy treatment device 60 based on the analysis and determination results.
  • the analysis processing device 40 includes a detection data analysis determination unit 41 (data analysis unit), a detection data determination reference input unit 42, a recording unit 43, an energy treatment device control unit 44, and the like.
  • the detection data analysis determination unit 41 is a data analysis unit consisting of a circuit or software program that receives output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30 and performs predetermined analysis processing and determination processing.
  • specifically, the detection data analysis determination unit 41 performs analysis processing and determination processing on the various detection data input from the output unit 35, such as whether the data is within a defined numerical range with respect to the detection data determination reference value input from the detection data determination reference input unit 42, is at or above a specified value, or is at or below a specified value.
  • the analysis determination result data of the detection data analysis determination unit 41 is sent to the energy treatment device control unit 44 in addition to the monitor 50 and the recording unit 43.
  • the detection data determination reference input unit 42 is a circuit or software program that holds a plurality of data such as determination reference values preset for the detection data, and outputs a predetermined detection data determination reference value or the like to the detection data analysis determination unit 41 at a predetermined timing.
  • the data such as the detection data determination reference value is reference data used in the analysis process and the determination process in the detection data analysis determination unit 41.
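The determination against a reference value (within the defined range, at or above it, or at or below it) can be sketched as a simple comparison; the dictionary layout of the reference value is an assumption for illustration:

```python
def judge(value, ref):
    """Determine whether a detection value lies within the defined
    numerical range of a detection data determination reference
    (hypothetical dict layout with lower/upper limits)."""
    lo, hi = ref["lower"], ref["upper"]
    if value < lo:
        return "below_range"
    if value > hi:
        return "above_range"
    return "within_range"

ref = {"lower": 100, "upper": 200}   # illustrative reference values
```

The analysis determination result ("within_range", "above_range", "below_range") is what would be recorded and forwarded to the energy treatment device control unit.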
  • the recording unit 43 includes a recording medium such as a semiconductor memory and a circuit or software program that drives the recording medium; it receives and records output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30, information data related to the analysis determination results output from the detection data analysis determination unit 41, and the like.
  • the energy treatment device control unit 44 is a circuit or software program for controlling the energy treatment device 60 based on the analysis determination result data by the detection data analysis determination unit 41.
  • the analysis processing device 40 also has various constituent units other than those described above, but since they are not directly related to the present invention, detailed description thereof will be omitted.
  • the monitor 50 is a display device that is controlled by the display control unit 37 of the processor 30 and that, based on the input image data and various information data, can display the image and various information in an appropriate predetermined form on its display screen.
  • the monitor 50 includes, for example, a display panel such as a liquid crystal display (LCD) or an organic electroluminescence display (OEL), its drive circuit, a software program, and the like.
  • LCD liquid crystal display
  • OEL organic electroluminescence display
  • the energy treatment device 60 is a treatment device used when performing a predetermined treatment under endoscopic observation.
  • the energy treatment device 60 is a treatment device that outputs energy (laser light or the like) to treat a predetermined affected portion of a subject (patient). Therefore, the energy treatment device 60 is configured to include an energy output unit 61 and the like.
  • the energy output unit 61 is a mechanism, circuit, or software program for outputting energy (laser light, etc.) for treatment.
  • the energy output unit 61 is controlled by the energy treatment device control unit 44 of the analysis processing device 40.
  • the energy treatment device 60 also has various constituent units other than those described above. However, since the constituent units other than those described above are not directly related to the present invention, detailed description thereof will be omitted.
  • the operation described here assumes, for example, a case in which the observation target site 101 in the body cavity of the subject 100 is observed using the endoscope 10 and a predetermined treatment is performed using the energy treatment device 60 under that endoscopic observation.
  • first, the user inserts the insertion portion of the endoscope 10 into the body cavity of the subject 100 and places the imaging unit 11, provided at the tip of the insertion portion of the endoscope 10, in the vicinity of the desired observation target site 101. During this time, the imaging operation by the imaging unit 11 of the endoscope 10 is performed continuously. The image signal acquired by the imaging unit 11 is read by the image reading unit 31 of the processor 30 and transmitted to the normal optical image generation unit 32 and the detection unit 33.
  • the normal optical image generation unit 32 generates predetermined image data.
  • the detection unit 33 detects various information data as numerical data. Further, the time measuring unit 31a detects the time information for each image signal acquired by the image reading unit 31.
  • the information data detected by the detection unit 33 includes, for example, information about shapes on the image.
  • with this shape information, for example, blood vessels, ureters, nerves, etc. on an image can be detected and recognized.
  • the information data detected by the detection unit 33 also includes, for example, information on hue, saturation, brightness, etc. in the image. With this information, for example, bleeding points and blood flow on an image can be detected and recognized.
  • the information data detected by the detection unit 33 also includes, for example, information regarding frequency characteristics on the image.
  • with this information, shapes on an image, specifically, for example, organs or polyps, can be detected and recognized.
  • the information data detected by the detection unit 33 also includes, for example, information regarding the brightness level and the like. With this information, it is possible, for example, to detect near distances on an image, to detect whether the tip is inside or outside the body, and to detect and recognize objects such as gauze.
  • the information data detected by the detection unit 33 there is information obtained by receiving outputs from various sensors (not shown). Examples of these various sensors include a gyro sensor provided at the tip of the endoscope 10 for acquiring and clearly indicating the position of the tip of the endoscope 10.
  • the detection unit 33 detects the tip position of the endoscope 10 based on the output information from the gyro sensor.
  • the position information data as the information data detected by the detection unit 33 in this way is transmitted to the detection log data acquisition unit 34.
  • numerical data related to distance information can also be acquired as information data based on the image data output from the two or more imaging units 11.
  • the information data related to this distance information is numerical data such as the distance from the tip surface of the endoscope 10 to the observation target portion 101.
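The patent does not specify how distance is computed from two imaging units; a standard approach is parallel-stereo triangulation, where the distance Z to a point follows Z = f * B / d (f: focal length in pixels, B: baseline between the two imaging units, d: disparity). A sketch with purely illustrative numbers:

```python
def stereo_distance_mm(focal_px, baseline_mm, disparity_px):
    """Classic parallel-stereo triangulation: Z = f * B / d.
    focal_px: focal length in pixels; baseline_mm: spacing between the
    two imaging units; disparity_px: horizontal shift of the same point
    between the two images. All values here are illustrative only."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# e.g. f = 500 px, baseline = 4 mm, disparity = 40 px
z = stereo_distance_mm(focal_px=500.0, baseline_mm=4.0, disparity_px=40.0)
```

Such a computation would yield the numerical distance from the tip surface of the endoscope to the observation target.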
  • the image data generated by the normal optical image generation unit 32 is transmitted to the image / log data synthesis unit 36.
  • various information data (numerical data) detected by the detection unit 33 is transmitted to the detection log data acquisition unit 34.
  • the detection log data acquisition unit 34 acquires detection log data based on various information data detected by the detection unit 33 and time information from the time measurement unit 31a.
  • the detection log data acquired here is transmitted to the image / log data synthesis unit 36, the output unit 35, and the recording unit 38.
  • the image / log data synthesis unit 36 generates image log synthesis data by associating the image data generated by the normal optical image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34.
  • the image log composite data generated here is transmitted to the output unit 35 and the recording unit 38.
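The association performed by the image/log data synthesis unit, attaching to each frame the detection log entry closest in time, could be sketched like this; the field names are hypothetical:

```python
import bisect

def synthesize(image_data, log_entries):
    """Sketch of an image/log data synthesis step: attach to each image
    the detection log entry whose time information is nearest.
    log_entries are assumed sorted ascending by t_ms."""
    times = [e["t_ms"] for e in log_entries]
    composites = []
    for img in image_data:
        i = bisect.bisect_left(times, img["t_ms"])
        # consider the neighbours on each side and keep the closer one
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - img["t_ms"]))
        composites.append({**img, "log": log_entries[j]})
    return composites

out = synthesize(
    [{"t_ms": 40, "frame": "f1"}],
    [{"t_ms": 0, "brightness": 170}, {"t_ms": 33, "brightness": 176},
     {"t_ms": 66, "brightness": 181}],
)
```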
  • the output unit 35 outputs the detection log data, the image log composite data, and the like to the external analysis processing device 40.
  • the data such as the detection log data and the image log synthesis data are recorded in the recording unit 43 of the analysis processing device 40.
  • the detection data analysis determination unit 41 of the analysis processing device 40 executes predetermined analysis processing and determination processing on the detection log data from the output unit 35, based on the detection data determination reference value from the detection data determination reference input unit 42. That is, the detection data analysis determination unit 41 analyzes and determines whether the detection log data input from the output unit 35 is within a predetermined numerical range with respect to the detection data determination reference value input from the detection data determination reference input unit 42, or is outside that range.
  • the analysis determination result data of the detection data analysis determination unit 41 is transmitted to the recording unit 43 and recorded on the recording medium of the recording unit 43. Further, the analysis determination result data of the detection data analysis determination unit 41 is also transmitted to the energy treatment device control unit 44.
  • the energy output unit 61 transmits information data (energy output information data) regarding the energy treatment output to the energy output detection unit 39 of the processor 30.
  • the energy output information data is numerical data such as an output energy amount and an output energy fluctuation amount.
  • the time measurement unit 31a measures the time based on the energy treatment output information data from the energy output unit 61, and transmits the measurement result to the energy output detection unit 39 as time information regarding the energy treatment output.
  • the energy output detection unit 39 acquires information data such as the output time of the energy treatment output based on the energy treatment output information data from the energy output unit 61 and the time information from the time measurement unit 31a. This information data is transmitted to the detection log data acquisition unit 34.
  • the detection log data acquisition unit 34 generates log data (energy output log data) related to energy output based on the energy output information data from the energy output detection unit 39 and the time information from the time measurement unit 31a.
  • the energy output log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
  • the analysis determination result data of the detection data analysis determination unit 41 is transmitted to the energy treatment device control unit 44.
  • the energy treatment device control unit 44 controls the energy output value from the energy output unit 61 so as to be within a predetermined numerical range based on the analysis determination result data.
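Controlling the energy output value so that it stays within a predetermined numerical range is, at its simplest, a clamp against the limits indicated by the analysis determination result. A sketch with a hypothetical result structure:

```python
def control_energy_output(requested_mj, analysis_result):
    """Sketch of an energy treatment device control step: keep the
    energy output value within the numerical range indicated by the
    analysis determination result (hypothetical structure)."""
    lo = analysis_result["allowed_min_mj"]
    hi = analysis_result["allowed_max_mj"]
    return max(lo, min(hi, requested_mj))   # clamp into [lo, hi]

result = {"allowed_min_mj": 0.5, "allowed_max_mj": 2.0}  # illustrative
```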
  • the analysis determination result data of the detection data analysis determination unit 41 is also transmitted to the monitor 50 as described above.
  • output data (detection log data, image log synthesis data, energy output log data, etc.) from the output unit 35 of the processor 30 is also transmitted to the monitor 50.
  • the monitor 50 is controlled by the display control unit 37 to display the output data from the output unit 35.
  • the display control unit 37 controls, for example, to display the detection log data in a form of being superimposed on the displayed image based on the image data.
  • the display control unit 37 controls so as to display, for example, an endoscopic image based on the image data (image log composite data) in a predetermined area (the area indicated by reference numeral 50a in FIG. 1) of the display screen of the monitor 50.
  • the display control unit 37 controls so as to display information based on, for example, the detection log data, the energy output log data, etc. (for example, character information) in a predetermined area (the area indicated by reference numeral 50b in FIG. 1) of the display screen of the monitor 50.
  • the display control unit 37 controls so as to display information based on, for example, the analysis determination result data (for example, character information) in a predetermined area (the area indicated by reference numeral 50c in FIG. 1) of the display screen of the monitor 50.
  • various information according to the analysis determination result (for example, assist information regarding the treatment, warning notifications, etc.) may be displayed.
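The region assignments described for the monitor 50 (areas 50a to 50c in FIG. 1) amount to a simple routing table. A sketch, with the data-kind labels invented for illustration:

```python
def route_to_region(kind):
    """Map an output-data kind to its display region on the monitor 50 (per FIG. 1)."""
    regions = {
        "endoscopic_image": "50a",   # image log composite data
        "detection_log": "50b",      # character information from detection logs
        "energy_output_log": "50b",  # character information from energy logs
        "analysis_result": "50c",    # assist information, warning notifications
    }
    return regions.get(kind)
```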
  • using the time information associated with each information data, the time required for surgery using the treatment device (the treatment time required for each procedure step, the total time for the entire surgery, etc.) can be detected and analyzed.
  • by detecting the order of the procedure steps and also referring to the step transition times, it can be analyzed and determined whether the transitions between procedure steps are executed smoothly.
  • the cooperation between the surgeon and the assistant can be determined.
  • the operator's forceps and treatment operations are detected, the tip position of the endoscope 10 in the body cavity is detected, and each operation timing, time, and positional relationship of the surgeon and the scopist are detected. Therefore, it is possible to determine the cooperation between the surgeon and the scopist.
  • the position of each trocar can be detected.
  • the image based on the image data is displayed in the predetermined area 50a of the monitor 50. Therefore, whether the observation target or the desired portion is clearly captured can be detected simply by checking the display screen of the monitor 50. For example, it is possible to determine whether or not an observation target or a desired part (for example, a target organ or the desired part thereof) is shown in the image.
  • the tip of the forceps and its position can be confirmed on the display. It is desirable that the observation target and the desired portion are, for example, near the center of the display area 50a of the monitor 50.
  • the bleeding site, organ damage site, etc. can be detected from the display image of the monitor 50.
  • when the forceps and retractors held by the assistant are detected and their positions are found in, for example, the peripheral area of the display area 50a of the monitor 50, it can be determined that a wide surgical field is secured.
  • since continuously acquired image data and log data (time-series data) associated with time information are available, the forceps operation time and operation method can be detected, for example, by tracking changes in the forceps position in time series. This makes it possible to determine whether or not the correct operation has been performed. In addition, it is possible to detect whether the correct treatment is being performed by detecting the contact time between the forceps and the organ tissue and changes in the forceps position.
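The time-series analysis described above, tracking forceps position changes to estimate operation time, might be sketched as accumulating the intervals in which the tracked tip moves. The movement threshold, units, and coordinate representation are assumptions:

```python
import math

def forceps_active_time(track, move_thresh=2.0):
    """Sum the time intervals (ms) in which the forceps tip moved more than move_thresh.

    track: list of (t_ms, x, y) positions obtained from consecutive image frames.
    """
    active = 0
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        if math.hypot(x1 - x0, y1 - y0) > move_thresh:
            active += t1 - t0
    return active
```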
  • the type of treatment tool or the like can be detected, so it is possible to determine whether or not an appropriate treatment tool has been selected. At the same time, by detecting, for example, the energy output time and output timing, it is possible to determine whether or not the tool is being used properly.
  • information such as the shape, color, tissue running, and layer type of the peeling layer can be acquired from the image data, so that the peeling layer can be detected.
  • the blood vessel itself can be detected from the image data, the information of the light source device 20, etc.; the blood vessel clip and its position can be detected, and the placement time of the blood vessel clip can be detected. Furthermore, the treatment time and treatment position of the blood vessel dissection treatment can be detected.
  • the suture time can be detected by the position and movement of the needle or thread.
  • the suturing time can be detected by detecting the positional relationship between the needle and the organ tissue and its change.
  • various information data acquired from the endoscopic image data and various sensors are acquired as numerical data, and the various information data are associated with one another.
  • the associated data is configured to be sequentially output and recorded as predetermined log data (time-series data).
  • the above-mentioned predetermined log data is analyzed to determine whether it is within or outside the specified numerical range with respect to a predetermined reference value specified in advance.
  • the analysis determination result is recorded and displayed as information associated with the endoscopic image displayed on the monitor 50.
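The range determination described above, comparing each logged value against a reference value specified in advance, might be sketched as follows; the reference value, tolerance, and field names are placeholders:

```python
def analyze_log(log, ref, tol):
    """Label each (t_ms, value) sample as in or out of the range ref +/- tol."""
    return [
        {"t_ms": t, "value": v,
         "result": "in_range" if abs(v - ref) <= tol else "out_of_range"}
        for t, v in log
    ]
```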
  • with the endoscope system 1 of the present embodiment, various information obtained when performing endoscopic observation or performing treatment under endoscopic observation can be displayed in a predetermined display form together with the corresponding endoscopic image. As a result, the user can acquire the corresponding related information data in real time while the observation image is displayed. Therefore, the user can efficiently and reliably perform endoscopic observation and treatment under endoscopic observation.
  • FIG. 2 is a block configuration diagram showing a main configuration in the endoscope system according to the second embodiment of the present invention.
  • the endoscope system 1A of the present embodiment basically has substantially the same configuration as the endoscope system 1 of the first embodiment described above.
  • information data from the light source device 20 is acquired as numerical data and associated with time information.
  • in addition to outputting this log data (time-series data), this embodiment differs in that the log data is analyzed to determine whether it is within or outside the specified numerical range with respect to a predetermined reference value specified in advance.
  • the endoscope system 1A of the present embodiment includes an emitted light amount detection unit 39A (external device information acquisition unit) in the processor 30 and a light source control unit 45 in the analysis processing device 40.
  • the emitted light amount detection unit 39A of the processor 30 is an external device information acquisition unit consisting of a circuit or software program that receives the various information data output from each light source (white light source 21, excitation light source 22) of the light source device 20 and the time information from the time measurement unit 31a, detects various information related to the light sources, and generates log data. Accordingly, the time measurement unit 31a further measures the light output time of the light source device 20 and outputs the acquired time information to the emitted light amount detection unit 39A.
  • the information data acquired by the emitted light amount detection unit 39A is numerical data such as, for example, the emitted light amount information, emitted light intensity information, and emitted light output time output from each light source (white light source 21, excitation light source 22) of the light source device 20.
  • the information data can be used to detect whether the tip of the endoscope 10 is inside or outside the body cavity.
  • the information data can be used to determine an imaged object, for example, to detect whether or not there is gauze.
  • the information data acquired by the emitted light amount detection unit 39A also includes, for example, a specific wavelength (narrow-band wavelength) of the emitted light output from each light source (white light source 21, excitation light source 22) of the light source device 20.
  • the information data acquired by the emitted light amount detection unit 39A is transmitted to the detection log data acquisition unit 34. Then, in the detection log data acquisition unit 34, log data (emitted light log data) related to the emitted light is generated based on the information data from the emitted light amount detection unit 39A and the time information from the time measurement unit 31a.
  • the emitted light log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
  • the light source control unit 45 of the analysis processing device 40 is a circuit or software program for controlling the light source device 20 based on the analysis determination result data by the detection data analysis determination unit 41. Therefore, the analysis determination result data of the detection data analysis determination unit 41 is also sent to the light source control unit 45.
  • the light source control unit 45 controls the light amount value from the light source device 20 to be within a predetermined numerical range based on the analysis determination result data of the detection data analysis determination unit 41.
  • in this way, the light source device 20 can be appropriately controlled by acquiring information data from the light source device 20 in addition to the energy treatment device 60.
  • the present invention is not limited to the above-described embodiment, and it goes without saying that various modifications and applications can be carried out within a range that does not deviate from the gist of the invention.
  • the above-described embodiment includes inventions at various stages, and various inventions can be extracted by appropriately combining the plurality of disclosed constituent requirements. For example, even if some constituent requirements are deleted from all the constituent requirements shown in the above embodiment, as long as the problem to be solved by the invention can still be solved and the effect of the invention is still obtained, the configuration from which those constituent requirements have been deleted can be extracted as an invention.
  • components across different embodiments may be combined as appropriate.
  • the present invention is not limited by any particular embodiment thereof except as limited by the accompanying claims.
  • the present invention can be applied not only to an endoscope control device in the medical field but also to an endoscope control device in the industrial field.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The purpose of the present invention is to provide an endoscopic system that enables efficient and assured observation and treatment by expeditiously presenting a variety of information acquired during endoscopic observation or during treatment under endoscopic observation. For that purpose, this endoscopic system is provided with: an imaging unit 11 which generates image data by forming an optical image of a test object; a timing measurement unit 31a which acquires timing information corresponding to the image data; an external equipment information acquisition unit 39 which acquires output energy quantity information and output timing information that are outputted from an energy treatment apparatus 60 being simultaneously used in a checkup in which the imaging unit is used; a data integration unit 36 which outputs integrated data by associating the image data with the timing information and the information acquired by the external equipment information acquisition unit; and a recording unit 38, 43 which, on the basis of the timing information, records, as an integrated endoscopic image group, the integrated data outputted from the data integration unit.

Description

Endoscope system
 The present invention relates to an endoscope system that presents various information obtained from endoscopic image data, detection units, constituent units, and the like as information when performing endoscopic observation or treatment under endoscopic observation.
 Conventionally, endoscope systems including an endoscope having an elongated tube-shaped insertion portion have been widely used in, for example, the medical and industrial fields. A medical endoscope system used in the medical field is configured so that, for example, the insertion portion can be inserted into the body cavity of a living body to observe the inside of an organ or the like, and various treatments can be applied to the target organ or the like by using a predetermined treatment tool as necessary. An industrial endoscope system used in the industrial field is configured so that, for example, the insertion portion can be inserted into a device or machine such as a jet engine or factory piping to observe and inspect conditions such as scratches and corrosion inside the device or machine.
 In this type of endoscope system, various information regarding the observation target (for example, information on shape, hue, saturation, brightness, frequency characteristics, distance, etc.) can be acquired from the endoscopic image data acquired by the imaging unit of the endoscope.
 In addition, various information during endoscopic observation (for example, position information of the endoscope tip, time information, etc.) can be acquired from the various detection devices provided in the endoscope system.
 Further, from the various constituent units of the endoscope system, various information regarding the state of each constituent unit (for example, the emitted light amount and wavelength information of the light source device, the output amount and output time information of the energy treatment device, etc.) can be acquired.
 These various types of information are important when performing endoscopic observation using an endoscope system or performing treatment under endoscopic observation.
 Therefore, for conventional endoscope systems, various proposals have been made regarding the use of these various types of information during operation, for example in Republished Patent WO2016/151888, Japanese Patent Publication No. 2005-237641, and Japanese Patent Publication No. H5-285099.
 The endoscope system disclosed in Republished Patent WO2016/151888 makes it possible to accurately confirm the degree of progress of the effect of irradiation with therapeutic light, such as PDT (photodynamic therapy), by extracting the brightness value of the therapeutic light irradiation area and the brightness value of the area outside the irradiation area and calculating the ratio of the two brightness values.
 The endoscope system disclosed in Japanese Patent Publication No. 2005-237641 performs more appropriate maintenance of the lamp or the like that supplies the illumination light of the light source device by detecting the amount of illumination light.
 The system disclosed in Japanese Patent Publication No. H5-285099 is an X-ray imaging system that includes a detection device for detecting the residual X-ray dose of X-rays emitted from an X-ray irradiation device, and issues a danger notification when the X-ray dose detected by the detection device is equal to or greater than a specified amount, thereby preventing adverse effects of the X-ray irradiation dose or the residual X-ray dose on the human body.
 However, the endoscope system disclosed in Republished Patent WO2016/151888 uses brightness value information among the information acquired from the endoscopic image data. The endoscope system disclosed in Japanese Patent Publication No. 2005-237641 uses light amount information acquired from the light source device. And the system disclosed in Japanese Patent Publication No. H5-285099 uses X-ray dose information acquired by the detection device.
 Thus, the above-mentioned Republished Patent WO2016/151888, Japanese Patent Publication No. 2005-237641, and Japanese Patent Publication No. H5-285099 make no mention of information regarding the treatment performed together with endoscopic observation.
 However, when performing treatment under endoscopic observation using an endoscope system, there is a demand to refer to various types of information.
 The present invention has been made in view of the above points, and its object is to provide an endoscope system that immediately presents, in real time on a display device, the various information acquired during endoscopic observation or during treatment under endoscopic observation, so that efficient and reliable observation and treatment can be performed.
 To achieve the above object, an endoscope system according to one aspect of the present invention includes: an imaging unit that forms an optical image of a subject to generate image data; a time measurement unit that acquires time information corresponding to the image data; an external device information acquisition unit that acquires output energy amount information and output time information output from an energy treatment device used simultaneously in an examination using the imaging unit; a data integration unit that outputs integrated data by associating the image data with the time information and the information acquired by the external device information acquisition unit; and a recording unit that records the integrated data output from the data integration unit as one integrated endoscopic image group based on the time information.
 According to the present invention, it is possible to provide an endoscope system that immediately presents, in real time on a display device, the various information acquired during endoscopic observation or during treatment under endoscopic observation, so that efficient and reliable observation and treatment can be performed.
 FIG. 1 is a block configuration diagram showing the main configuration of the endoscope system according to the first embodiment of the present invention. FIG. 2 is a block configuration diagram showing the main configuration of the endoscope system according to the second embodiment of the present invention.
 Hereinafter, the present invention will be described with reference to the illustrated embodiments. Each drawing used in the following description is schematic, and in order to show each component at a size recognizable on the drawing, the dimensional relationships and scales of the members may differ from component to component. Therefore, the present invention is not limited to the illustrated forms with respect to the quantities of the components, the shapes of the components, the size ratios of the components, the relative positional relationships of the components, and the like described in each drawing.
 [First Embodiment]
 FIG. 1 is a block configuration diagram showing the main configuration of the endoscope system according to the first embodiment of the present invention.
 The endoscope system 1 of the first embodiment of the present invention includes an endoscope 10, a light source device 20, a processor 30, an analysis processing device 40, a monitor 50 serving as a display device, an energy treatment device 60, and the like.
 The endoscope 10 has an elongated tube-shaped insertion portion, which is inserted into, for example, the body cavity of a living body so that the inside of an organ or the like can be observed and inspected, and various treatments can be applied to the target organ or the like using a predetermined treatment tool as necessary. For this purpose, an imaging unit 11 is provided inside the tip of the insertion portion of the endoscope 10.
 The imaging unit 11 is composed of an image sensor (imager) 11a, an imaging optical system 11b, and the like. The image sensor 11a is a photoelectric conversion element that receives the optical image formed by the imaging optical system 11b and outputs image data by performing photoelectric conversion. The imaging optical system 11b is composed of one or more optical lenses that form an optical image of the subject.
 The imaging unit 11 is a constituent unit that forms an optical image of the subject to generate and acquire image data, and the same unit as that used in a conventional endoscope 10 is applied. Therefore, the configuration of the imaging unit 11 itself is the same as the conventional one, and its detailed description is omitted.
 Although the present embodiment illustrates a configuration with only one imaging unit 11, the configuration is not limited to this. For example, two or more imaging units 11 may be provided to acquire image data from which a stereo image (3D image) can be formed. With such a configuration, distance information and the like can be acquired based on the two or more sets of image data acquired by the two or more imaging units 11.
 A light guide fiber 12 is inserted through the insertion portion of the endoscope 10. The light guide fiber 12 runs from the illumination optical system 13 provided on the tip surface of the insertion portion of the endoscope 10 to the connector portion (not shown) for the light source device 20. The light guide fiber 12 thus serves to transmit the light flux emitted from the light source device 20 to the illumination optical system 13 on the tip surface of the insertion portion of the endoscope 10. The light emitted from the illumination optical system 13 is directed toward the observation target site 101 of the subject 100, such as a patient, and illuminates the vicinity of the observation target site 101.
 The configuration of the endoscope 10 itself is also the same as the conventional one, and its detailed description is omitted.
 The light source device 20 is a device for supplying illumination light to the endoscope 10. The light source device 20 is composed of, for example, a white light source 21, an excitation light source 22, a splitter 23, a condenser lens 24, and the like.
 The white light source 21 is a light source that emits white light; specifically, for example, a light emitting diode (LED) or a xenon lamp is applied. When light emitting diodes are used, a type that combines B, G, and R light to generate white light may also be used. The excitation light source 22 is a light source that emits excitation light (light of a specific wavelength).
 The splitter 23 is a constituent unit having a dichroic mirror surface that reflects 100% of light at or above a specific wavelength and transmits 100% of normal light. With this configuration, the splitter 23 reflects the excitation light (light of the specific wavelength) emitted from the excitation light source 22 and light with wavelengths at or above that specific wavelength at a reflectance of approximately 100%, and transmits the white light emitted from the white light source 21 at a transmittance of approximately 100%. The combined light is then emitted toward the condenser lens 24.
 The condenser lens 24 is an optical lens that condenses the light from each light source (21, 22) and emits it toward the end face of the light guide fiber 12 provided at the connector portion of the light source device 20.
 The configuration of the light source device 20 itself is also the same as the conventional one, and further description is omitted. Although the light source device 20 of the present embodiment illustrates the white light source 21 and the excitation light source 22, the types of light sources are not limited to these, and light sources that generate different types of light may be adopted.
 The processor 30 receives the output signal from the imaging unit 11 of the endoscope 10 to generate image data, and acquires various information based on the output signal. The processor 30 also acquires detection log data (time-series data) based on the various acquired information and the time information corresponding to each piece of information. Further, the processor 30 associates and synthesizes the image data and the detection log data. The processor 30 then performs display control for displaying an image based on the generated image log composite data (integrated data), information based on the various information data, and the like in predetermined areas of the display screen of the monitor 50.
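The association step performed by the image/log data synthesis unit 36, pairing image data with detection log data via time information, could be sketched as a nearest-in-time join. The field names are invented for illustration; the actual synthesis format is not disclosed:

```python
def synthesize(image_frames, log_entries):
    """Attach to each image frame the log entry nearest to it in time."""
    combined = []
    for frame in image_frames:
        nearest = min(log_entries, key=lambda e: abs(e["t_ms"] - frame["t_ms"]))
        combined.append({**frame, "log": nearest})
    return combined
```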
 このプロセッサ30は、例えば、画像読出部31と、時間計測部31aと、通常光画像生成部32と、検知部33(パラメータ検出部)と、検知ログデータ取得部34と、出力部35と、画像・ログデータ合成部36(データ統合部,画像合成部)と、表示制御部37と、記録部38と、エネルギー出力検知部39(外部機器情報取得部)などを有して構成されている。 The processor 30 includes, for example, an image reading unit 31, a time measuring unit 31a, a normal optical image generation unit 32, a detection unit 33 (parameter detection unit), a detection log data acquisition unit 34, and an output unit 35. It includes an image / log data synthesis unit 36 (data integration unit, image composition unit), a display control unit 37, a recording unit 38, an energy output detection unit 39 (external device information acquisition unit), and the like. ..
 画像読出部31は、撮像部11の撮像素子11aから出力される画像信号を読み出す回路若しくはソフトウエアプログラムである。 The image reading unit 31 is a circuit or software program that reads an image signal output from the image sensor 11a of the image pickup unit 11.
 時間計測部31aは、所定の時間計測を行って、取得した時間情報を出力する回路若しくはソフトウエアプログラムである。時間計測部31aは、具体的には例えば、画像読出部31による画像信号の読み出し開始からの時間を計測し、取得した時間情報を検知ログデータ取得部34へと出力する。また、時間計測部31aは、エネルギー出力部61によるエネルギー出力時間を計測し、取得した時間情報をエネルギー出力検知部39へと出力する。 The time measurement unit 31a is a circuit or software program that measures a predetermined time and outputs the acquired time information. Specifically, for example, the time measurement unit 31a measures the time from the start of reading the image signal by the image reading unit 31 and outputs the acquired time information to the detection log data acquisition unit 34. Further, the time measuring unit 31a measures the energy output time by the energy output unit 61 and outputs the acquired time information to the energy output detecting unit 39.
 この時間計測部31aは、例えばリアルタイムクロック(Real-Time Clock;RTC)などと呼ばれる内部時計回路が相当する。この時間計測部31aによって得られる時間情報は、例えば画像データに付随させる日時情報や、検知部33によって検出される各種の数値データや検知ログデータなどに関連する時間情報として利用される。 The time measurement unit 31a corresponds to, for example, an internal clock circuit called a real-time clock (RTC) or the like. The time information obtained by the time measuring unit 31a is used as time information related to, for example, date and time information associated with the image data, various numerical data detected by the detection unit 33, detection log data, and the like.
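As a non-authoritative illustration of the role described above, the time measurement unit 31a can be sketched as a clock that stamps each detected value with the elapsed time since read-out began; the patent does not specify an implementation, and all names below are invented:

```python
import time
from dataclasses import dataclass

@dataclass
class TimedSample:
    """A detected value paired with the elapsed time since read-out began."""
    elapsed_s: float
    value: float

class TimeMeasurementUnit:
    """Minimal sketch of the time measurement unit 31a: record a start
    instant once, then stamp every subsequent sample with the time elapsed
    since that instant."""
    def __init__(self) -> None:
        self._start = time.monotonic()  # monotonic clock stands in for the RTC

    def stamp(self, value: float) -> TimedSample:
        return TimedSample(elapsed_s=time.monotonic() - self._start, value=value)
```

A monotonic clock is used rather than wall-clock time so that elapsed values can never run backwards, which matters when the stamps are later used to order log data.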
 The normal-light image generation unit 32 is a circuit or software program that generates image data from the image signal output by the image sensor 11a of the imaging unit 11. The image data generated by the normal-light image generation unit 32 is generated from the image signal acquired while the subject is illuminated with normal light (white light).
 The detection unit 33 is a parameter detection unit consisting of a circuit or software program for detecting predetermined numerical data based on the image signal read by the image reading unit 31 and other inputs. The predetermined numerical data detected by the detection unit 33 includes, for example, shape information in the image, hue/saturation/brightness information in the image, frequency characteristics, brightness level, specific wavelengths in the image, the amount of light emitted by the light source unit, and the wavelength of the light emitted by the light source unit. The detection unit 33 detects numerical data based not only on the image signal but also on output signals from various other sensors, such as a gyro sensor, a GPS (Global Positioning System) receiver, a temperature sensor, a pressure sensor, and an acceleration sensor.
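For example, one of the simplest parameters named above, hue/saturation/brightness information, could be reduced to numerical data roughly as follows. This is a sketch only; the patent gives no algorithm, and the function name and pixel format are assumptions:

```python
import colorsys

def mean_hsv(pixels):
    """Hypothetical stand-in for one job of the detection unit 33: reduce an
    RGB frame, given as a list of (r, g, b) tuples in 0-255, to mean hue,
    saturation and value, the kind of numerical data later logged against
    time information."""
    n = len(pixels)
    h = s = v = 0.0
    for r, g, b in pixels:
        hh, ss, vv = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h += hh
        s += ss
        v += vv
    return (h / n, s / n, v / n)
```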
 Although the present embodiment uses a single imaging unit 11, other configurations are also conceivable, for example an endoscope 10 provided with two or more imaging units 11. In that configuration, numerical data on distance information can also be acquired as information data derived from the image data output by the two or more imaging units 11.
 The detection log data acquisition unit 34 is a circuit or software program that acquires the numerical data obtained by the detection unit 33 as log data. For this purpose, the detection log data acquisition unit 34 acquires the time information corresponding to each numerical value obtained by the detection unit 33 from the time measurement unit 31a.
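A minimal sketch of this pairing of detected values with time information, with invented class and method names, might look like:

```python
class DetectionLog:
    """Sketch of the detection log data acquisition unit 34: pair each
    detected value with its time information to build time-series log data."""
    def __init__(self) -> None:
        self.records = []  # list of (elapsed_s, parameter_name, value)

    def append(self, elapsed_s, name, value):
        self.records.append((elapsed_s, name, value))

    def series(self, name):
        """Return the time series of one parameter, in acquisition order."""
        return [(t, v) for t, n, v in self.records if n == name]
```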
 The output unit 35 is a circuit or software program that outputs the detection log data acquired by the detection log data acquisition unit 34, the output data from the image/log data synthesis unit 36 (image-log composite data, integrated data), and the like to the external analysis processing device 40.
 The image/log data synthesis unit 36 is a data integration unit consisting of a circuit or software program that performs synthesis processing to associate the image data generated by the normal-light image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34, and generates and outputs image-log composite data (integrated data). The image/log data synthesis unit 36 is also an image composition unit that performs image composition processing for displaying the image and information contained in the image-log composite data (integrated data) on the display screen.
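The association of image data with detection log data is not specified in detail; one plausible sketch matches each timestamped frame with the log records close to it in time. All names and the tolerance value are assumptions:

```python
def integrate(frames, log_records, tolerance_s=0.05):
    """Hedged sketch of the image/log synthesis. `frames` is a list of
    (t, image_id) tuples and `log_records` a list of (t, name, value)
    tuples; each frame is bundled with the records whose time lies within
    `tolerance_s` of the frame time."""
    integrated = []
    for t_frame, image_id in frames:
        matched = [r for r in log_records if abs(r[0] - t_frame) <= tolerance_s]
        integrated.append({"t": t_frame, "image": image_id, "log": matched})
    return integrated
```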
 The display control unit 37 is a circuit or software program that performs display control when displaying images, information, and the like in a predetermined form on the display screen of the monitor 50. The display control processing performed by the display control unit 37 includes display control in various forms, for example displaying an image based on the image data generated by the normal-light image generation unit 32 in a predetermined area of the display screen of the monitor 50, displaying information based on the detection log data acquired by the detection log data acquisition unit 34 in a predetermined area of the display screen of the monitor 50, and superimposing the information display on the image display.
 The recording unit 38 is a structural unit consisting of a recording medium, such as a semiconductor memory, that records the output data from the image/log data synthesis unit 36 and the detection log data acquisition unit 34, together with a circuit or software program that drives the recording medium.
 Here, the recording unit 38 records the image-log composite data (integrated data) output from the image/log data synthesis unit 36 (data integration unit) as a single integrated endoscopic image group organized by time information (for example, a single piece of moving image data formed by arranging a plurality of still image data in chronological order).
 Although this embodiment illustrates an example in which the recording unit 38 is provided inside the processor 30, the configuration is not limited to this form, and the recording unit may be provided externally. For example, by using the recording unit 43 of the analysis processing device 40, which is an external device, the recording unit 38 in the processor 30 can be omitted.
 The energy output detection unit 39 is an external device information acquisition unit consisting of a circuit or software program that receives the various information data output from the energy output unit 61 of the energy treatment device 60 together with the time information from the time measurement unit 31a, detects various kinds of information about the output energy (output amount, output intensity, output time, and so on), and generates log data.
 The processor 30 has various structural units other than those described above, but since those units are not directly related to the present invention, their detailed description is omitted.
 The analysis processing device 40 is a circuit or software program that receives the output data from the output unit 35 of the processor 30 (detection log data, image-log composite data, and so on), records it, performs predetermined analysis and judgment, and, based on the analysis and judgment results, generates a control signal for the energy treatment device 60.
 The analysis processing device 40 includes a detection data analysis/judgment unit 41 (data analysis unit), a detection data judgment reference input unit 42, a recording unit 43, an energy treatment device control unit 44, and the like.
 The detection data analysis/judgment unit 41 is a data analysis unit consisting of a circuit or software program that receives the output data from the output unit 35 of the processor 30 (detection log data, image-log composite data, and so on) and performs predetermined analysis and judgment processing.
 The detection data analysis/judgment unit 41 performs analysis and judgment processing to determine, for example, whether each item of detection data input from the output unit 35 lies within a specified numerical range relative to the detection data judgment reference values input from the detection data judgment reference input unit 42, is at or above a specified value, or is at or below a specified value. The analysis/judgment result data of the detection data analysis/judgment unit 41 is sent to the energy treatment device control unit 44 as well as to the monitor 50 and the recording unit 43.
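The range check described here can be sketched as follows; the return labels and parameter names are invented, since the patent only states that values are compared against reference values:

```python
def judge(value, lower=None, upper=None):
    """Sketch of the judgment step in the analysis/judgment unit 41: check a
    detected value against reference bounds. Either bound may be omitted,
    covering the "at or above" / "at or below" cases."""
    if upper is not None and value > upper:
        return "above_upper"
    if lower is not None and value < lower:
        return "below_lower"
    return "within_range"
```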
 The detection data judgment reference input unit 42 is a circuit or software program that holds a plurality of data, such as judgment reference values preset for the detection data, and outputs a given detection data judgment reference value to the detection data analysis/judgment unit 41 at a given timing. Data such as the detection data judgment reference values serve as reference data used in the analysis and judgment processing of the detection data analysis/judgment unit 41.
 The recording unit 43 consists of a recording medium, such as a semiconductor memory, that receives and records the output data from the output unit 35 of the processor 30 (detection log data, image-log composite data, and so on) and the information data on the analysis/judgment results output from the detection data analysis/judgment unit 41, together with a circuit or software program that drives the recording medium.
 The energy treatment device control unit 44 is a circuit or software program for controlling the energy treatment device 60 based on the analysis/judgment result data from the detection data analysis/judgment unit 41.
 The analysis processing device 40 has various structural units other than those described above, but since those units are not directly related to the present invention, their detailed description is omitted.
 The monitor 50 is a display device that is controlled by the display control unit 37 of the processor 30 and that, based on the input image data and various information data, displays images and various information on its display screen in an appropriate predetermined form so that they can be viewed. The monitor 50 includes a display panel, for example a liquid crystal display (LCD) or an organic electro-luminescence (OEL) display, together with its drive circuit, software program, and the like.
 The energy treatment device 60 is a treatment device used when performing predetermined treatment under endoscopic observation. Here, the energy treatment device 60 outputs energy (such as laser light) to treat a given affected area of the subject (patient). To that end, the energy treatment device 60 includes an energy output unit 61 and the like.
 The energy output unit 61 is a mechanism and circuit or software program for outputting the energy (such as laser light) used for treatment. The energy output unit 61 is controlled by the energy treatment device control unit 44 of the analysis processing device 40.
 The energy treatment device 60 also has various structural units other than those described above. However, since those units are not directly related to the present invention, their detailed description is omitted.
 The operation of the endoscope system 1 of the present embodiment, configured as described above, is broadly as follows.
 The operation described here assumes a case in which, for example, the endoscope 10 is used to observe the observation target site 101 in the body cavity of the subject 100, and a predetermined treatment is performed with the energy treatment device 60 under endoscopic observation.
 First, the user inserts the insertion portion of the endoscope 10 into the body cavity of the subject 100 and positions the imaging unit 11, provided at the tip of the insertion portion, near the desired observation target site 101. Meanwhile, the imaging operation of the imaging unit 11 of the endoscope 10 continues. The image signal acquired by the imaging unit 11 at this time is read by the image reading unit 31 of the processor 30 and transmitted to the normal-light image generation unit 32 and the detection unit 33.
 The normal-light image generation unit 32 generates predetermined image data. The detection unit 33 detects various information data as numerical data. The time measurement unit 31a also detects the time information for each image signal acquired by the image reading unit 31.
 The information data detected by the detection unit 33 includes, for example, information about shapes in the image. Using this shape information, for example, blood vessels, ureters, and nerves in the image can be detected and recognized.
 The information data detected by the detection unit 33 also includes, for example, information about hue, saturation, and brightness in the image. Using this information, for example, bleeding points and blood flow in the image can be detected and recognized.
 The information data detected by the detection unit 33 further includes, for example, information about frequency characteristics of the image. Using this information, shapes in the image, specifically organs, polyps, and the like, can be detected and recognized.
 Further, the information data detected by the detection unit 33 includes, for example, information about the brightness level. Using this information, it is possible, for example, to judge near versus far distance in the image, to judge whether the endoscope is inside or outside the body, and to detect and recognize objects such as gauze.
 In addition, the information data detected by the detection unit 33 includes information obtained from the outputs of various sensors (not shown). These sensors include, for example, a gyro sensor provided at the tip of the endoscope 10 for acquiring and indicating information about the tip position of the endoscope 10. The detection unit 33 detects the tip position of the endoscope 10 based on the output information from the gyro sensor. The position information data thus detected by the detection unit 33 is transmitted to the detection log data acquisition unit 34.
 When an endoscope with two or more imaging units 11 is employed, numerical data on distance information can also be acquired as information data derived from the image data output from the two or more imaging units 11. The information data on this distance information is numerical data such as the distance from the tip surface of the endoscope 10 to the observation target site 101.
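The patent does not say how distance would be computed from two imaging units; the textbook stereo relation, depth = focal length × baseline / disparity, is one plausible basis, sketched here with invented parameter names:

```python
def stereo_distance(disparity_px, baseline_mm, focal_px):
    """Hedged sketch of deriving distance from two imaging units via the
    standard stereo relation: distance (mm) = focal length (px) *
    baseline (mm) / disparity (px). Assumes rectified images; this is an
    illustration, not the patent's method."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```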
 The image data generated by the normal-light image generation unit 32 is transmitted to the image/log data synthesis unit 36. The various information data (numerical data) detected by the detection unit 33 are transmitted to the detection log data acquisition unit 34.
 The detection log data acquisition unit 34 acquires detection log data based on the various information data detected by the detection unit 33 and the time information from the time measurement unit 31a. The detection log data acquired here is transmitted to the image/log data synthesis unit 36, the output unit 35, and the recording unit 38.
 The image/log data synthesis unit 36 associates the image data generated by the normal-light image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34 to generate image-log composite data. The image-log composite data generated here is transmitted to the output unit 35 and the recording unit 38.
 The output unit 35 outputs the detection log data, the image-log composite data, and the like to the external analysis processing device 40. These data, such as the detection log data and the image-log composite data, are thereby recorded in the recording unit 43 of the analysis processing device 40.
 The detection data analysis/judgment unit 41 of the analysis processing device 40 executes predetermined analysis and judgment processing on the detection log data from the output unit 35, based on the detection data judgment reference values from the detection data judgment reference input unit 42. That is, the detection data analysis/judgment unit 41 analyzes and judges whether the detection log data input from the output unit 35 falls inside or outside the specified numerical range relative to the detection data judgment reference values input from the detection data judgment reference input unit 42. The analysis/judgment result data of the detection data analysis/judgment unit 41 is transmitted to the recording unit 43 and recorded on its recording medium. The analysis/judgment result data of the detection data analysis/judgment unit 41 is also transmitted to the energy treatment device control unit 44.
 On the other hand, when the energy treatment device 60 is operated, information data about the energy treatment output (energy output information data) is transmitted from the energy output unit 61 to the energy output detection unit 39 of the processor 30. The energy output information data is numerical data such as the output energy amount and the output energy fluctuation amount.
 The time measurement unit 31a also measures time based on the energy treatment output information data from the energy output unit 61 and transmits the measurement result to the energy output detection unit 39 as time information about the energy treatment output.
 The energy output detection unit 39 acquires information data, such as the output duration of the energy treatment output, based on the energy treatment output information data from the energy output unit 61 and the time information from the time measurement unit 31a. This information data is transmitted to the detection log data acquisition unit 34.
 The detection log data acquisition unit 34 generates log data on the energy output (energy output log data) based on the energy output information data from the energy output detection unit 39 and the time information from the time measurement unit 31a. The energy output log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
 As described above, the analysis/judgment result data of the detection data analysis/judgment unit 41 is transmitted to the energy treatment device control unit 44. The energy treatment device control unit 44 controls the energy output value of the energy output unit 61, based on the analysis/judgment result data, so that it stays within the specified numerical range.
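The control law is not specified; as a deliberately simple sketch, the control unit 44 could nudge the commanded output back toward the specified range whenever the analysis result reports it outside. All names and the step size are assumptions:

```python
def regulate_output(current_output, lower, upper, step=1.0):
    """Illustrative sketch, not the patent's actual control law: return the
    next commanded energy output, moved one step back toward the specified
    [lower, upper] range when the current output lies outside it."""
    if current_output > upper:
        return current_output - step
    if current_output < lower:
        return current_output + step
    return current_output
```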
 On the other hand, as described above, the analysis/judgment result data of the detection data analysis/judgment unit 41 is also transmitted to the monitor 50. The output data from the output unit 35 of the processor 30 (detection log data, image-log composite data, energy output log data, and so on) is likewise transmitted to the monitor 50. Under the control of the display control unit 37, the monitor 50 displays the output data from the output unit 35. In this case, the display control unit 37 performs control, for example, to display the detection log data superimposed on the image displayed based on the image data.
 The display control unit 37 also performs control to display, for example, an endoscopic image based on the image data (image-log composite data) in a predetermined area within the display screen of the monitor 50 (the area indicated by reference numeral 50a in FIG. 1).
 Similarly, the display control unit 37 performs control to display information based on, for example, the detection log data and the energy output log data (for example, text information) in a predetermined area within the display screen of the monitor 50 (the area indicated by reference numeral 50b in FIG. 1).
 Similarly, the display control unit 37 performs control to display information based on, for example, the analysis/judgment result data (for example, text information) in a predetermined area within the display screen of the monitor 50 (the area indicated by reference numeral 50c in FIG. 1).
 In this case, in addition to the information based on the analysis/judgment result data, various information corresponding to the analysis/judgment results may also be displayed, for example assist information about the treatment or warning notifications.
 The form in which image data, information data, and the like are displayed on the monitor 50 is not limited to the illustrated example; various other forms are conceivable.
 Various analyses and judgments can be performed based on the various information data acquired by the endoscope system 1 of the present embodiment.
 Specifically, for example, from the time information associated with each item of information data, the time taken for surgery using the treatment device (the treatment time for each procedure step, the total time of the entire surgery, and so on) can be detected and analyzed.
 From the image data and the information data acquired from the various sensors, for example, the order of the procedure steps can be detected and, also referring to the step transition times, it can be analyzed and judged whether the transitions between procedure steps are being executed smoothly.
 By detecting the forceps held by the surgeon and assistant and their treatment operations during surgery, and detecting the timing, duration, and positional relationship of the operations performed by each of them, a judgment about the coordination between the surgeon and the assistant can be made.
 By detecting the forceps held by the surgeon and the treatment operations, detecting the tip position of the endoscope 10 in the body cavity, and detecting the timing, duration, and positional relationship of each action performed by the surgeon and the scopist, a judgment about the coordination between the surgeon and the scopist can be made.
 When a plurality of trocars are used, the position of each trocar can be detected.
 また、画像データに基づく画像は、モニタ50の所定の領域50aに表示される。したがって、モニタ50の表示画面を確認するのみで、観察対象とする対象物や所望の部位が明確に捉えられているかを検知できる。例えば、観察対象物や所望の部位(例えば対象臓器,その所望の部位など)が表示されているかの判定を画像によって行うことができる。また、観察対象物に対して処置を行っているときに、その紺子先端や、その位置などを表示によって確認できる。観察対象物や所望の部位は、モニタ50の表示領域50aの例えば略中央近傍にあるのが望ましい。 Further, the image based on the image data is displayed in the predetermined area 50a of the monitor 50. Therefore, it is possible to detect whether the object to be observed or the desired portion is clearly captured only by checking the display screen of the monitor 50. For example, it is possible to determine whether or not an observation target object or a desired part (for example, a target organ, the desired part thereof, etc.) is displayed by an image. In addition, when the object to be observed is being treated, the tip of the Konko and its position can be confirmed by the display. It is desirable that the object to be observed and the desired portion are, for example, near the center of the display area 50a of the monitor 50.
 また、モニタ50の表示画像によって、出血部位,臓器損傷部位などを検知できる。 In addition, the bleeding site, organ damage site, etc. can be detected from the display image of the monitor 50.
 また、助手の持つ鉗子やリトラクターなどを検知し、それらの位置が、モニタ50の表示領域50aの例えば周辺領域にあることを検知すると、広い術野が確保されていることが判定できる。 Further, when the forceps and retractors held by the assistant are detected and their positions are detected in, for example, the peripheral area of the display area 50a of the monitor 50, it can be determined that a wide surgical field is secured.
 また、リトラクトされる臓器自体を検知し、それらが、モニタ50の表示領域50aの例えば周辺領域にあることを検知することで、広い術野を判定できる。 Further, by detecting the retracted organs themselves and detecting that they are in the peripheral area of the display area 50a of the monitor 50, a wide surgical field can be determined.
 また、形状情報などによって、紺子先端形状や鉗子種類を検知し、識別することで、使用紺子として適切なものが選択されているか否かの判定をおこなうことができる。 In addition, by detecting and identifying the shape of the tip of the konko and the type of forceps based on the shape information, it is possible to determine whether or not an appropriate konko to be used is selected.
 また、連続的に取得される画像データと、時間情報が関連付けられたログデータ(時系列データ)を取得できるので、例えば紺子の位置の変化を時系列で追うことで、紺子の操作時間や操作方法を検知できる。これにより、正しい操作が成されているかどうかの判定ができる。また、紺子と臓器組織との接触時間や鉗子位置の変化を検知して、正しい処置が行われているかどうかの判定ができる。 In addition, since image data that is continuously acquired and log data (time-series data) associated with time information can be acquired, for example, by tracking changes in the position of Konko in time series, the operation time of Konko can be obtained. And the operation method can be detected. This makes it possible to determine whether or not the correct operation has been performed. In addition, it is possible to detect whether the correct treatment is performed by detecting the contact time between the Konko and the organ tissue and the change in the forceps position.
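 As a rough illustration of this idea, the sketch below derives an operation duration from a timestamped position log by measuring how long the tracked tip keeps moving between samples. The record layout, sampling rate, and motion threshold are illustrative assumptions, not details taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PositionSample:
    t: float   # time information (seconds)
    x: float   # detected tip position (image coordinates)
    y: float

def operation_time(log, min_move=1.0):
    """Total time during which the tracked tip moved more than
    min_move pixels between consecutive log samples."""
    total = 0.0
    for prev, cur in zip(log, log[1:]):
        moved = ((cur.x - prev.x) ** 2 + (cur.y - prev.y) ** 2) ** 0.5
        if moved > min_move:
            total += cur.t - prev.t
    return total

log = [PositionSample(0.0, 10, 10), PositionSample(1.0, 30, 10),
       PositionSample(2.0, 30, 10), PositionSample(3.0, 30, 40)]
print(operation_time(log))  # motion in two of the three intervals -> 2.0
```

 The same time-series comparison could, under the same assumptions, be applied to contact duration by substituting a tissue-contact flag for the motion test.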
 Further, by acquiring information data from an external device such as the energy treatment device 60, the type of the treatment tool or the like can be detected. It can therefore be determined whether an appropriate treatment tool has been selected. At the same time, by detecting, for example, the energy output time and output timing, it can be determined whether the tool is being used properly.
 When a predetermined treatment is performed, information such as the shape, color, tissue running, and layer type of the dissection layer can be acquired from the image data, so that the dissection layer can be detected.
 When treating a blood vessel, the blood vessel itself can be detected from the image data, information from the light source device 20, and the like; in addition, a vessel clip can be detected, its position can be detected, and the placement time of the vessel clip can be detected. Furthermore, the treatment time and treatment position of the vessel dissection procedure can be detected.
 When a suturing procedure is performed, the suturing time can be detected from the position and movement of the needle or thread. The suturing time and the like can also be detected by detecting the positional relationship between the needle and the organ tissue and changes therein.
 As described above, according to the first embodiment, the various information data acquired from the endoscopic image data and the various sensors are acquired as numerical data, and by associating these various information data with the corresponding time information, they are sequentially output and recorded as predetermined log data (time-series data).
 The predetermined log data is analyzed and judged, for example as to whether it falls within or outside a prescribed numerical range relative to a predetermined reference value, and the analysis and judgment result is recorded and displayed as information associated with the endoscopic image shown on the monitor 50.
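 A minimal sketch of this analysis and judgment step might look as follows; the record fields, parameter names, and reference ranges are illustrative assumptions rather than values taken from the disclosure.

```python
REFERENCE_RANGES = {
    # parameter name -> (lower limit, upper limit); assumed values
    "brightness": (60.0, 180.0),
    "energy_output_w": (0.0, 50.0),
}

def judge(log_entry):
    """Return per-parameter judgments for one log record: True if the
    logged value lies within the prescribed numerical range, else False."""
    results = {}
    for name, value in log_entry.items():
        if name == "t":          # the time field is metadata, not judged
            continue
        lo, hi = REFERENCE_RANGES[name]
        results[name] = lo <= value <= hi
    return results

entry = {"t": 12.5, "brightness": 200.0, "energy_output_w": 35.0}
print(judge(entry))  # brightness is out of range
```

 The per-record result could then be stored alongside the log entry and overlaid on the displayed endoscopic image, as the text describes.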
 With such a configuration, in the endoscope system 1 of the present embodiment, various pieces of information obtained when performing endoscopic observation, or when performing treatment under endoscopic observation, can be displayed in a predetermined display form together with the corresponding endoscopic image. The user can thereby obtain the corresponding related information data in real time while the observation image is displayed, and can therefore perform endoscopic observation and treatment under endoscopic observation efficiently and reliably.
 [Second Embodiment]
 FIG. 2 is a block configuration diagram showing the main configuration of the endoscope system according to the second embodiment of the present invention.
 The endoscope system 1A of the present embodiment basically has substantially the same configuration as the endoscope system 1 of the first embodiment described above. It differs in that, in addition to the configuration of the endoscope system 1 of the first embodiment, information data from the light source device 20 is acquired as numerical data, log data (time-series data) associating that data with time information is output, and the log data is analyzed and judged, for example as to whether it falls within or outside a prescribed numerical range relative to a predetermined reference value.
 To that end, the endoscope system 1A of the present embodiment includes, in addition to the configuration of the endoscope system 1 of the first embodiment described above, an emitted light amount detection unit 39A (external device information acquisition unit) in the processor 30 and a light source control unit 45 in the analysis processing device 40.
 The emitted light amount detection unit 39A of the processor 30 is an external device information acquisition unit consisting of a circuit or a software program that receives the various information data output from each light source (white light source 21, excitation light source 22) of the light source device 20 and the time information from the time measurement unit 31a, detects various information related to the light source output, and generates log data. For this purpose, the time measurement unit 31a further measures the light output time of the light source device 20 and outputs the acquired time information to the emitted light amount detection unit 39A.
 The information data acquired by the emitted light amount detection unit 39A is numerical data such as emitted light amount information, emitted light intensity information, and emitted light output time output from each light source (white light source 21, excitation light source 22) of the light source device 20.
 From these information data, it can be determined whether the distance to the observation target is far or near. The same information data also makes it possible to detect whether the tip of the endoscope 10 is inside or outside the body cavity. In addition, the same information data can be used to judge the imaged object, for example to detect whether gauze is present.
 The information data acquired by the emitted light amount detection unit 39A also includes, for example, numerical data on specific wavelengths of the light emitted from each light source (white light source 21, excitation light source 22) of the light source device 20 (narrow-band wavelength light, near-infrared light, long-wavelength light, and the like). By identifying the type of emitted light from this information data on the emitted light wavelength, blood flow, regions, lesions, tumors, and the like can be detected.
 The information data acquired by the emitted light amount detection unit 39A is transmitted to the detection log data acquisition unit 34. The detection log data acquisition unit 34 then generates log data relating to the emitted light (emitted light log data) based on the information data from the emitted light amount detection unit 39A and the time information from the time measurement unit 31a. The emitted light log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
 The light source control unit 45 of the analysis processing device 40 is a circuit or a software program for controlling the light source device 20 based on the analysis and judgment result data from the detection data analysis judgment unit 41. For this purpose, the analysis and judgment result data of the detection data analysis judgment unit 41 is also sent to the light source control unit 45.
 Based on the analysis and judgment result data of the detection data analysis judgment unit 41, the light source control unit 45 thereby performs control such that the light amount value from the light source device 20 falls within the prescribed numerical range.
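 One simple way to realize such control is to nudge the commanded light amount back toward the prescribed range whenever the judged value drifts outside it; the range limits and the adjustment step below are illustrative assumptions, not parameters of the disclosed light source control unit 45.

```python
LIGHT_RANGE = (40.0, 160.0)   # prescribed numerical range (assumed units)

def next_light_command(current, measured, step=5.0):
    """Return the next light-amount command: step the output down when the
    measured value exceeds the range, up when it falls below, else hold."""
    lo, hi = LIGHT_RANGE
    if measured > hi:
        return current - step
    if measured < lo:
        return current + step
    return current

print(next_light_command(100.0, 170.0))  # too bright -> 95.0
print(next_light_command(100.0, 30.0))   # too dim   -> 105.0
print(next_light_command(100.0, 90.0))   # in range  -> 100.0
```

 A fixed-step adjustment like this is only one possible policy; a proportional controller driven by the same judgment result would fit the described architecture equally well.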
 The other configurations are substantially the same as those of the first embodiment described above.
 In the endoscope system 1A of the present embodiment configured as described above, by acquiring information data from the light source device 20 in addition to the energy treatment device 60, the light source device 20 can also be controlled appropriately.
 The present invention is not limited to the above-described embodiments, and it goes without saying that various modifications and applications can be made without departing from the gist of the invention. Furthermore, the above embodiments include inventions at various stages, and various inventions can be extracted by appropriately combining the plurality of disclosed constituent elements. For example, even if some constituent elements are deleted from all the constituent elements shown in one of the above embodiments, as long as the problem to be solved by the invention can still be solved and the effect of the invention can still be obtained, the configuration from which those constituent elements have been deleted can be extracted as an invention. Furthermore, constituent elements across different embodiments may be combined as appropriate. The present invention is not restricted by any particular embodiment thereof except as limited by the appended claims.
 This application is filed claiming priority on the basis of Japanese Patent Application No. 2019-071581, filed in Japan on April 3, 2019. The contents disclosed in the above basic application are incorporated in the specification, claims, and drawings of the present application.
 The present invention can be applied not only to endoscope control devices in the medical field but also to endoscope control devices in the industrial field.

Claims (5)

  1.  An endoscope system comprising:
     an imaging unit that forms an optical image of a subject and generates image data;
     a time measurement unit that acquires time information corresponding to the image data;
     an external device information acquisition unit that acquires output energy amount information and output time information output from an energy treatment device used simultaneously in an examination using the imaging unit;
     a data integration unit that outputs integrated data by associating the time information and the information acquired by the external device information acquisition unit with the image data; and
     a recording unit that records the integrated data output from the data integration unit as one integrated endoscopic image group based on the time information.
  2.  The endoscope system according to claim 1, further comprising a parameter detection unit that detects predetermined numerical data based on the image data acquired by the imaging unit,
     wherein the parameter detection unit detects at least luminance information in an image displayed based on the image data.
  3.  The endoscope system according to claim 1, further comprising an image composition unit that performs image composition processing for displaying the image and information included in the integrated data on a display screen.
  4.  The endoscope system according to claim 1, further comprising a data analysis unit that performs a predetermined analysis on the integrated data and outputs a judgment result against a predetermined judgment criterion.
  5.  The endoscope system according to claim 2, further comprising a light source device that irradiates the subject with light,
     wherein the parameter detection unit detects numerical data relating to the amount of light emitted by the light source device or wavelength information of the light emitted by the light source device.
PCT/JP2020/009616 2019-04-03 2020-03-06 Endoscopic system WO2020203034A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-071581 2019-04-03
JP2019071581A JP2020168208A (en) 2019-04-03 2019-04-03 Endoscope system

Publications (1)

Publication Number Publication Date
WO2020203034A1 true WO2020203034A1 (en) 2020-10-08

Family

ID=72668700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/009616 WO2020203034A1 (en) 2019-04-03 2020-03-06 Endoscopic system

Country Status (2)

Country Link
JP (1) JP2020168208A (en)
WO (1) WO2020203034A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005287832A (en) * 2004-03-31 2005-10-20 Olympus Corp Heat treatment apparatus
JP2006230490A (en) * 2005-02-22 2006-09-07 Olympus Medical Systems Corp Apparatus for specimen inspection and inspection system
WO2011004801A1 (en) * 2009-07-06 2011-01-13 富士フイルム株式会社 Lighting device for endoscope, and endoscope device
JP2017513645A (en) * 2014-04-28 2017-06-01 カーディオフォーカス,インコーポレーテッド System and method for visualizing tissue using an ICG dye composition during an ablation procedure
JP2018504154A (en) * 2014-12-03 2018-02-15 カーディオフォーカス,インコーポレーテッド System and method for visual confirmation of pulmonary vein isolation during ablation procedures
WO2018216276A1 (en) * 2017-05-22 2018-11-29 ソニー株式会社 Observation system and light source control apparatus


Also Published As

Publication number Publication date
JP2020168208A (en) 2020-10-15

Similar Documents

Publication Publication Date Title
JP6905274B2 (en) Devices, systems, and methods for mapping tissue oxygenation
JP5810248B2 (en) Endoscope system
JP5642619B2 (en) Medical device system and method of operating medical device system
KR101647022B1 (en) Apparatus and method for capturing medical image
JP5930531B2 (en) Imaging apparatus and imaging method
JP6581984B2 (en) Endoscope system
JP2001299676A (en) Method and system for detecting sentinel lymph node
JP4190917B2 (en) Endoscope device
US10413619B2 (en) Imaging device
JP4202671B2 (en) Standardized image generation method and apparatus
WO2014155783A1 (en) Endoscopic system
US20210186594A1 (en) Heat invasion observation apparatus, endoscope system, heat invasion observation system, and heat invasion observation method
US20100076304A1 (en) Invisible light irradiation apparatus and method for controlling invisible light irradiation apparatus
JP7328432B2 (en) medical control device, medical observation system, control device and observation system
JP2008043383A (en) Fluorescence observation endoscope instrument
WO2020039931A1 (en) Endoscopic system and medical image processing system
WO2020203034A1 (en) Endoscopic system
JP2005342434A (en) Infrared observation system and specifying method of lesion by infrared observation system
JP2006043002A (en) Endoscopic observing apparatus, and endoscopic observing method
WO2021024314A1 (en) Medical treatment assistance device and medical treatment assistance method
US10537225B2 (en) Marking method and resecting method
KR20150026325A (en) Image acquisition and projection apparatus which enable simultaneous implementation of visible optical image and invisible fluorescence image
JP2013094173A (en) Observation system, marking device, observation device and endoscope diagnostic system
KR102311982B1 (en) Endoscope apparatus with reciprocating filter unit
US20230248209A1 (en) Assistant device, endoscopic system, assistant method, and computer-readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20782539

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20782539

Country of ref document: EP

Kind code of ref document: A1