WO2015114906A1 - Système d'imagerie et dispositif d'imagerie - Google Patents

Système d'imagerie et dispositif d'imagerie Download PDF

Info

Publication number
WO2015114906A1
WO2015114906A1 (PCT/JP2014/079977, JP2014079977W)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
timing
signal
illumination
image
Prior art date
Application number
PCT/JP2014/079977
Other languages
English (en)
Japanese (ja)
Inventor
紗依里 齋藤
秀範 橋本
田中 靖洋
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to JP2015536694A (JPWO2015114906A1)
Publication of WO2015114906A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B 1/0655 Control therefor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2407 Optical details
    • G02B 23/2461 Illumination
    • G02B 23/2469 Illumination using optical fibres
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/50 Control of the SSIS exposure
    • H04N 25/53 Control of the integration time
    • H04N 25/531 Control of the integration time by controlling rolling shutters in CMOS SSIS

Definitions

  • the present invention relates to an imaging device including an imaging element having a plurality of pixels, and to an imaging system that acquires an imaging signal captured by that imaging device.
  • an endoscope system is used to observe an organ of a subject such as a patient.
  • An endoscope system includes, for example, an endoscope having an elongated, flexible insertion portion that is inserted into a body cavity of the subject and an imaging element provided at its distal end, and a processing device that is connected to the proximal end side of the insertion portion via a cable, performs image processing of an in-vivo image in accordance with the imaging signal captured by the imaging element, and displays the in-vivo image on a display unit or the like.
  • the imaging element captures an in-vivo image.
  • the insertion portion performs signal processing such as A/D conversion on the video signal picked up by the image sensor, and outputs the processed video signal to the processing device.
  • a user such as a doctor observes the organ of the subject based on the in-vivo image displayed by the processing device.
  • a CMOS (Complementary Metal Oxide Semiconductor) sensor may be used as the imaging element.
  • the CMOS sensor generates image data by, for example, a rolling shutter method in which reading is performed while shifting the timing for each line.
  • as an endoscope system using such a CMOS sensor, a technique for obtaining illumination light using a semiconductor light source such as an LED (Light Emitting Diode) has been disclosed (for example, see Patent Document 1).
  • in Patent Document 1, since the control device controls both the readout timing of the image sensor and the light emission timing of the semiconductor light source, a deviation may arise between the two timings if the parameters for each timing do not match.
  • moreover, because the parameters related to the readout timing differ depending on the specifications of the image sensor, controlling all sensors with uniform timing parameters would limit the usable image sensors to those that can operate with those uniform parameters.
  • furthermore, the image sensor has individual variations in characteristics, so the parameters related to its readout timing need to be set individually.
  • the present invention has been made in view of the above, and an object of the present invention is to provide an imaging system and an imaging apparatus that can eliminate a deviation between the readout timing of the imaging element and the light emission timing of the light source.
  • in order to solve the above problems, an imaging system according to the present invention includes: an illumination unit that emits illumination light; an illumination control unit that controls emission of the illumination light by the illumination unit; an imaging element including a light receiving unit provided with a plurality of pixels that photoelectrically convert received light to generate electrical signals, a reading unit that reads out the electrical signals generated by the plurality of pixels as image information, an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the reading unit, and a timing information generation unit that generates timing information related to the illumination timing of the illumination unit according to the readout operation of the reading unit; and a timing control unit that acquires the timing information generated by the timing information generation unit and controls the emission of the illumination light by the illumination control unit based on the acquired timing information.
  • the timing information includes a delay time, which is the processing time from when the imaging element receives the synchronization signal until the reading unit starts reading.
  • the imaging device includes a superimposing unit that superimposes a timing signal including the timing information on an image signal including the image information output from the reading unit.
  • the superimposing unit superimposes the timing information on an information area other than the effective pixel area among the information included in the image signal.
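A sketch of this superposition, under the assumption that the image signal is transmitted line by line and that one blanking (non-effective) line can carry the timing bytes; the layout below is hypothetical, not the one defined in this publication:

```python
def superimpose_timing(frame_lines, timing_bytes, blank_line_index=0):
    """Embed timing information into a non-effective line of the frame.

    frame_lines: list of bytearrays, one per transmitted line; the
    effective pixel lines are left untouched, and the timing bytes
    overwrite the start of the designated blanking line.
    """
    frame_lines[blank_line_index][:len(timing_bytes)] = timing_bytes
    return frame_lines

# Example: a 3-line frame whose first line is blanking data.
frame = [bytearray(8) for _ in range(3)]
superimpose_timing(frame, b"\x12\x34")
```

Because the timing bytes ride in an area the image processor does not display, the effective pixel data reaches the display path unchanged.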
  • the delay time is the time from a rising or falling edge of the vertical synchronization signal, among the input synchronization signals, to the start of reading by the reading unit.
  • the timing information includes a time from a read start timing to a read end timing in one frame.
  • the timing information includes information indicating a timing at which exposure to the pixel is started by electronic shutter control.
  • the illumination control unit controls the illumination unit so that each illumination light of red light (R), green light (G), and blue light (B) is emitted sequentially at a predetermined timing.
  • the timing information generation unit generates the timing information during white balance adjustment.
  • the imaging system according to the present invention further includes: a signal processing unit that performs signal processing on the electrical signal output from the reading unit; a first clock generation unit that generates a clock signal serving as a reference for the operation timing of the imaging element and the illumination unit; an image processing unit that performs predetermined image processing on the image information output from the imaging element; and a second clock generation unit that generates a clock signal serving as a reference for the operation timing of the image processing unit.
  • An imaging apparatus according to the present invention captures an in-vivo image of a subject illuminated by illumination light emitted from a light source device, and includes: a sensor unit having a light receiving unit provided with a plurality of pixels that photoelectrically convert received light to generate electrical signals; a reading unit that reads out the electrical signals generated by the plurality of pixels as image information; an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the reading unit; and a timing information generation unit that generates timing information related to the illumination timing of the light source device according to the readout operation of the reading unit.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the endoscope system according to the first embodiment of the present invention.
  • FIG. 3 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
  • FIG. 4 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
  • FIG. 5 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the modification of the first embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
  • FIG. 7A is a diagram for explaining an example of a data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
  • FIG. 7B is a diagram for explaining an example of a data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
  • FIG. 8 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the second modification of the second embodiment of the present invention.
  • FIG. 9 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing with the endoscope system according to the third modification of the second embodiment of the present invention.
  • FIG. 10 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing with the endoscope system according to the fourth modification of the second embodiment of the present invention.
  • FIG. 11 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the sixth modification of the second embodiment of the present invention.
  • FIG. 12 is a block diagram illustrating a schematic configuration of the endoscope system according to the third embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the endoscope system according to the first embodiment of the present invention.
  • An endoscope system 1 shown in FIGS. 1 and 2 includes: an endoscope 2 that captures an in-vivo image of the subject by inserting its distal end portion into a body cavity; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processing device 4 (control device) that performs predetermined image processing on the in-vivo image captured by the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1; and a display device 5 that displays the in-vivo image on which the processing device 4 has performed image processing.
  • the endoscope 2 includes an insertion portion 21 having an elongated shape having flexibility, an operation portion 22 that is connected to a proximal end side of the insertion portion 21 and receives input of various operation signals, and an insertion portion from the operation portion 22. And a universal cord 23 that includes various cables that extend in a direction different from the direction in which 21 extends and connect to the light source device 3 and the processing device 4.
  • the insertion portion 21 includes a distal end portion 24 containing an imaging element 244 (imaging device) in which pixels that receive light and generate signals by photoelectric conversion are arranged two-dimensionally, a bending portion 25 that is bendable and formed of a plurality of bending pieces, and a long flexible tube portion 26 having flexibility that is connected to the proximal end side of the bending portion 25.
  • the distal end portion 24 includes a light guide 241 that is configured using glass fiber or the like and forms a light guide path for the light emitted from the light source device 3, an illumination lens 242 provided at the distal end of the light guide 241, an optical system 243 for condensing light, and an image sensor 244 that is provided at the image forming position of the optical system 243, receives the light condensed by the optical system 243, photoelectrically converts it into an electrical signal, and performs predetermined signal processing.
  • the optical system 243 is configured by using one or a plurality of lenses, and has an optical zoom function for changing the angle of view and a focus function for changing the focus.
  • the image sensor 244 includes: a sensor unit 244a that photoelectrically converts light from the optical system 243 to generate an electrical signal (imaging signal); an analog front end unit 244b (hereinafter "AFE unit 244b") that performs noise removal and A/D conversion on the electrical signal output from the sensor unit 244a; a P/S conversion unit 244c that performs parallel/serial conversion on the digital signal (image signal) output from the AFE unit 244b and transmits the converted signal to the outside; a timing generator 244d that generates timing pulses for the various signal processing in the sensor unit 244a, the AFE unit 244b, and the P/S conversion unit 244c; and an imaging control unit 244e that controls the operation of the image sensor 244.
  • the image sensor 244 is a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • the sensor unit 244a includes a light receiving unit 244f in which a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and an amplifier that amplifies that charge, are arranged two-dimensionally and photoelectrically convert light from the optical system 243 to generate electrical signals (imaging signals), and a reading unit 244g that sequentially reads out, as image information, the electrical signals generated by the pixels arbitrarily set as readout targets among the plurality of pixels of the light receiving unit 244f.
  • the reading unit 244g sequentially reads out electrical signals (imaging signals) for each horizontal line from the light receiving unit 244f and outputs them to the AFE unit 244b.
  • the image sensor 244 (CMOS sensor) according to the first embodiment generates an electrical signal by a rolling shutter system in which exposure or reading is performed at different timings for each horizontal line. Further, the image sensor 244 outputs image information for each line (line data unit described later) to the processing device 4.
  • the AFE unit 244b includes a CDS (Correlated Double Sampling) unit 244h that reduces the noise component included in the electrical signal and adjusts the amplification factor to maintain a constant output level, an A/D conversion unit 244i that performs A/D conversion on the electrical signal output via the CDS unit 244h, and a correction unit 244j that corrects the electrical signal digitally converted by the A/D conversion unit 244i.
  • the CDS unit 244h performs noise reduction using, for example, a correlated double sampling method.
  • the correction unit 244j performs correction of pixel defects and color correction such as tone correction (γ correction) of the RGB video signal.
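The tone (γ) correction mentioned here can be sketched for a single 8-bit channel value; γ = 2.2 is a common display value, used below only as an illustrative assumption:

```python
def gamma_correct(value, gamma=2.2, max_value=255):
    """Apply tone (gamma) correction to one channel value.

    The input is normalized to [0, 1], raised to the power 1/gamma,
    and rescaled; mid-tones are brightened for display.
    """
    normalized = value / max_value
    return round((normalized ** (1.0 / gamma)) * max_value)
```

The endpoints 0 and 255 map to themselves, while intermediate values are lifted according to the curve.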
  • the P/S conversion unit 244c performs parallel/serial conversion on the digital signal (image signal) output from the AFE unit 244b and transmits it to the outside; before the parallel/serial conversion, it may apply processing such as N-bit/M-bit encoding (N < M; hereinafter, "bit" is abbreviated "b") and synchronization signal superposition to the electrical signal output from the AFE unit 244b.
  • the P / S conversion unit 244c performs 8b / 10b encoding processing based on the stored conversion table to convert the 8b electrical signal into the 10b electrical signal.
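The table-based conversion can be illustrated schematically. Note that real 8b/10b encoding also tracks running disparity and uses a full 256-entry table; the entries below are placeholders for illustration only, not actual code words from any stored table:

```python
# Placeholder conversion table mapping an 8-bit input to a 10-bit
# code word. A real 8b/10b table has 256 entries per disparity
# state; these values are illustrative only.
CONVERSION_TABLE = {
    0x00: 0b1001110100,
    0x01: 0b0111010100,
    0xFF: 0b1010110001,
}

def encode_8b10b(byte_value):
    """Convert an 8b value to its 10b code word via table lookup."""
    code = CONVERSION_TABLE[byte_value]
    assert code.bit_length() <= 10  # every code word fits in 10 bits
    return code
```

The extra two bits per symbol are what allow DC balance and clock recovery on the serial link.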
  • the imaging control unit 244e controls various operations of the distal end portion 24 according to the setting data received from the processing device 4. For example, the imaging control unit 244e outputs a readout signal to the readout unit 244g, and controls an output mode of an electrical signal output from each pixel in units of pixels.
  • the imaging control unit 244e includes a timing information generation unit 2441 that generates timing information related to the timing at which the reading unit 244g starts reading.
  • the imaging control unit 244e outputs the timing information generated by the timing information generation unit 2441 to the processing device 4.
  • the timing information is information related to the illumination timing by the illumination unit 31 according to the readout operation by the readout unit 244g.
  • the imaging control unit 244e is configured using a CPU (Central Processing Unit), a register that records various programs, and the like.
  • the operation portion 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and horizontal directions, a treatment instrument insertion portion 222 through which a treatment instrument such as biopsy forceps, an electric knife, or an inspection probe is inserted into the body cavity of the subject, and a plurality of switches 223 that are operation input units for inputting operation instruction signals for peripheral devices such as air supply means, water supply means, and screen display control.
  • the treatment tool inserted from the treatment tool insertion portion 222 is exposed from the opening (not shown) via the treatment tool channel (not shown) of the distal end portion 24.
  • the universal cord 23 includes at least a light guide 241 and a collective cable 245 in which one or a plurality of signal lines are collected.
  • the collective cable 245 includes a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image signals, a signal line for transmitting a driving timing signal for driving the image sensor 244, and a signal line for transmitting the timing information related to the timing at which the reading unit 244g starts reading.
  • the light source device 3 includes an illumination unit 31 and an illumination control unit 32.
  • the illumination unit 31 sequentially switches and emits a plurality of illumination lights having different wavelength bands to the subject (subject) under the control of the illumination control unit 32.
  • the illumination unit 31 includes a light source unit 31a, a light source driver 31b, a rotary filter 31c, a drive unit 31d, and a drive driver 31e.
  • the light source unit 31a is configured using a white LED and one or a plurality of lenses, and emits white light to the rotary filter 31c under the control of the light source driver 31b.
  • the white light generated by the light source unit 31a is emitted from the tip of the tip part 24 toward the subject via the rotary filter 31c and the light guide 241.
  • the light source driver 31b causes the light source unit 31a to emit white light by supplying a current to the light source unit 31a under the control of the illumination control unit 32.
  • the rotary filter 31c is disposed on the optical path of white light emitted from the light source unit 31a and rotates to transmit only light in a predetermined wavelength band among the white light emitted from the light source unit 31a.
  • the rotary filter 31c includes a red filter 311, a green filter 312 and a blue filter 313 that transmit light having respective wavelength bands of red light (R), green light (G), and blue light (B).
  • the rotary filter 31c sequentially transmits light in the red, green, and blue wavelength bands (for example, red: 600 nm to 700 nm, green: 500 nm to 600 nm, blue: 400 nm to 500 nm) by rotating.
  • by rotating, the rotary filter 31c converts the white light (W illumination) emitted from the light source unit 31a into narrow-band red light (R illumination), green light (G illumination), or blue light (B illumination), which can be emitted sequentially (frame-sequential method).
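The frame-sequential switching can be sketched as a simple mapping from frame index to the transmitted band, using the wavelength ranges given above:

```python
# Frame-sequential illumination: the rotary filter cycles R -> G -> B.
# Wavelength ranges (nm) follow the text: red 600-700, green 500-600,
# blue 400-500.
BANDS = [("R", 600, 700), ("G", 500, 600), ("B", 400, 500)]

def illumination_for_frame(frame_index):
    """Return the (color, min_nm, max_nm) transmitted for a frame."""
    return BANDS[frame_index % len(BANDS)]
```

Combining three consecutive frames, one per band, then yields a full-color image.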
  • the drive unit 31d is configured using a stepping motor, a DC motor, or the like, and rotates the rotary filter 31c.
  • the drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
  • the illumination control unit 32 causes the illumination unit 31 to emit first illumination light (for example, R illumination) during a first period (for example, Texposure in FIG. 3) in which all lines to be read in a certain frame are in the blanking period, to emit second illumination light having a wavelength band different from the first (for example, G illumination) during the second period, which is the blanking period of the next frame, and, after the second period ends, to emit third illumination light (for example, B illumination) during the third period, which is the blanking period of the frame after that; this series of processes is executed repeatedly.
  • alternatively, the light source unit 31a may be composed of a red LED, a green LED, and a blue LED, and the light source driver 31b may cause red, green, or blue light to be emitted sequentially by supplying current to each LED.
  • the white, red, green, and blue LEDs may also emit light simultaneously, or the subject may be irradiated with white light using a discharge lamp such as a xenon lamp to acquire images.
  • the processing device 4 includes an S/P conversion unit 401, an image processing unit 402, a brightness detection unit 403, a light control unit 404, a synchronization signal generation unit 405, an input unit 406, a recording unit 407, a control unit 408, and a reference clock generation unit 409.
  • the S / P conversion unit 401 performs serial / parallel conversion on the image information (electric signal) input from the image sensor 244 and outputs the converted image information to the image processing unit 402.
  • the image information includes an imaging signal, a correction parameter for correcting the imaging element 244, and the like.
  • the image processing unit 402 generates an in-vivo image displayed by the display device 5 based on the image information input from the S / P conversion unit 401.
  • the image processing unit 402 performs predetermined image processing on the image information to generate an in-vivo image.
  • examples of the image processing include synchronization processing, optical black subtraction processing, white balance adjustment processing, color matrix calculation processing, gamma correction processing, color reproduction processing, edge enhancement processing, composition processing for combining a plurality of image data, and format conversion processing.
  • the image processing unit 402 outputs the image information input from the S / P conversion unit 401 to the control unit 408 or the brightness detection unit 403.
  • the brightness detection unit 403 detects the brightness level corresponding to each pixel from the RGB image information input from the image processing unit 402, records the detected brightness level in an internal memory, and outputs it to the control unit 408.
  • the brightness detection unit 403 calculates a gain adjustment value based on the detected brightness level, and outputs the gain adjustment value to the image processing unit 402.
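One way such a gain adjustment value could be derived, sketched under the assumption of a simple target-brightness ratio clamped to an amplifier range; the target level and gain limits are hypothetical, not specified in this publication:

```python
def gain_adjustment(detected_level, target_level, min_gain=0.5, max_gain=4.0):
    """Compute a multiplicative gain that drives the detected
    brightness toward the target level, clamped to the amplifier's
    supported range."""
    gain = target_level / max(detected_level, 1)  # avoid divide-by-zero
    return max(min_gain, min(max_gain, gain))
```

A dark frame thus yields a gain above 1, a bright frame a gain below 1, with saturation at the clamp limits.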
  • the light control unit 404 sets the amount of light to be generated by the light source device 3, the light emission timing, and the like based on the light irradiation amount calculated by the brightness detection unit 403, and outputs a dimming signal including these settings to the light source device 3.
  • the synchronization signal generation unit 405 generates a synchronization signal including at least a vertical synchronization signal, and transmits it both to the timing generator 244d via a predetermined signal line included in the collective cable 245 and to the image processing unit 402 inside the processing device 4.
  • the input unit 406 receives input of various signals such as an operation instruction signal that instructs the operation of the endoscope system 1.
  • the recording unit 407 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory).
  • the recording unit 407 records various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1.
  • the recording unit 407 records the identification information of the processing device 4.
  • the identification information includes unique information (ID) of the processing device 4, year type, specification information, transmission method, transmission rate, and the like.
  • the control unit 408 is configured using a CPU or the like, and performs drive control of each component including the imaging device 244 and the light source device 3, input / output control of information with respect to each component, and the like.
  • the control unit 408 transmits setting data for imaging control (for example, the pixels to be read) recorded in the recording unit 407 to the imaging control unit 244e via a predetermined signal line included in the collective cable 245.
  • the control unit 408 functions as a timing control unit that generates a drive signal for driving the light source based on timing information including the exposure timing and readout timing of each line of the image sensor 244 and outputs the drive signal to the light source device 3.
  • the reference clock generation unit 409 generates a clock signal that is a reference for the operation of each component of the endoscope system 1 and supplies the generated clock signal to each component of the endoscope system 1.
  • the display device 5 receives and displays the in-vivo image corresponding to the in-vivo image information generated by the processing device 4 via the video cable.
  • the display device 5 is configured using liquid crystal or organic EL (Electro Luminescence).
  • FIGS. 3 and 4 are diagrams for explaining the exposure and readout timing of the image sensor when photographing with the endoscope system 1.
  • as shown in FIG. 3, the reading unit 244g reads out the electrical signals of the first to n-th lines (one frame) while shifting the timing for each horizontal line, and this readout period alternates with a blanking period during which the subject is illuminated with illumination light, whereby the in-vivo image of the subject is acquired.
  • in FIG. 3, the read start time of the first line is denoted Tread, the read end time of the n-th line is Tread-end, the period in which one line is read is Tread-length(line), the readout period of one frame is Tread-length(Frame), and the period during which the illumination light irradiates all lines from the first to the n-th line is Texposure.
  • the read start time Tread of the first line corresponds to a delay period (expressed as a time, a number of clocks, or a number of lines and clocks) from when the vertical synchronization signal rises until the reading unit 244g starts reading.
  • alternatively, the read start time Tread of the first line may be a delay period from when the vertical synchronization signal falls until the reading unit 244g starts reading.
  • the reading unit 244g sequentially reads out an electrical signal (imaging signal) for each horizontal line from the light receiving unit 244f and outputs it to the AFE unit 244b.
  • at the readout start time Tread, operation switches from the blanking period to the readout period.
  • the delay time from the rise of the vertical synchronization signal to the start of reading varies depending on the endoscope 2 to be used and the drive mode. Specifically, the delay time varies depending on the individual difference (processing ability) of the image sensor 244 employed by the endoscope 2. Even if the same image sensor 244 is used, the delay time calculation method varies depending on the drive mode, and therefore the delay time itself also varies depending on the drive mode.
  • the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Therefore, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
  • the control unit 408 controls the illumination control unit 32 based on the acquired timing information; therefore, even when the parameters relating to the timings of imaging and illumination differ owing to differences in readout start timing depending on the specifications of the imaging element and the drive mode, the illumination light can be irradiated during the period T exposure in which the illumination light reaches all lines from the first line to the n-th line. As a result, it is possible to acquire an image in which each line is uniformly irradiated with illumination light.
  • the image sensor 244 outputs the start timing (start time T read) at which the reading unit 244g starts reading to the processing device 4 via the predetermined signal line as timing information. Since the control unit 408 controls the illumination control unit 32 based on the acquired timing information, the deviation between the readout timing of the image sensor and the light emission timing of the light source can be eliminated even when the readout start timing by the readout unit 244g of the image sensor 244 differs. Thereby, an in-vivo image with uniform brightness and without color mixture can be obtained regardless of the specifications of the image sensor and the drive mode.
  • the light emission timing of the light source can be controlled in accordance with the readout timing of the image sensor regardless of the specifications of the image sensor and the drive mode. Therefore, even if a CMOS sensor is used that can not only read out all pixels but also read out pixels in a specific range or change the readout order of the physical pixel arrangement, the imaging process can be performed without a deviation occurring between the readout timing of the image sensor and the light emission timing of the light source.
  • the timing information is not limited to the start time T read; for example, a high-level signal may be output during the readout period, and a low-level signal may be output during the period when reading is not performed.
  • the control unit 408 controls the illumination timing based on the change in the level of this signal.
  • any information can be applied as the timing information as long as it allows the illumination unit 31 to set the illumination timing in accordance with the readout operation by the readout unit 244g.
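Where the timing information takes the form of such a level signal, the illumination control reduces to reacting to its edges: light on when the level falls (readout finished, blanking begins), light off when it rises (readout begins). A hedged sketch, with a hypothetical function name and sample signal:

```python
# Hypothetical sketch: derive illumination on/off commands from a
# readout-active level signal (1 = reading, 0 = blanking), lighting
# the source only while no line is being read out.

def illumination_commands(readout_level):
    """Yield (sample_index, command) pairs at each edge of the signal."""
    commands = []
    previous = readout_level[0]
    for i, level in enumerate(readout_level[1:], start=1):
        if previous == 1 and level == 0:
            commands.append((i, "light_on"))    # readout ended -> blanking
        elif previous == 0 and level == 1:
            commands.append((i, "light_off"))   # readout started
        previous = level
    return commands

signal = [1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 0, 0]
print(illumination_commands(signal))
# [(3, 'light_on'), (7, 'light_off'), (10, 'light_on')]
```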
  • FIG. 5 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1 according to the modification of the first embodiment.
  • in the first embodiment, image information is generated by the rolling shutter method, in which exposure and readout are performed at different timings for each line. In this modification, image information is generated by the global shutter method, in which exposure and readout are performed simultaneously on all lines.
  • the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Therefore, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
  • the control unit 408 controls the illumination control unit 32 based on the acquired timing information, so that the illumination light can be irradiated during the period T exposure in which the illumination light reaches all lines from the first line to the n-th line, regardless of differences in readout start timing depending on the specifications of the image sensor and the drive mode.
  • FIG. 6 is a block diagram showing a schematic configuration of the endoscope system 1a according to the second embodiment of the present invention.
  • the same reference symbols are attached to components identical to those described above.
  • timing information is output from the image sensor 244 to the processing device 4 via a predetermined signal line.
  • in the second embodiment, the timing information is superimposed on the image signal from the image sensor 244 and output to the processing device 4.
  • the endoscope system 1a includes a superimposition unit 244k that performs signal processing between the P / S conversion unit 244c and the correction unit 244j.
  • the superimposing unit 244k superimposes timing information on the image signal subjected to the correction processing by the correcting unit 244j.
  • FIGS. 7A and 7B are diagrams for explaining an example of the data structure of the image signal output by the endoscope 2 according to the second embodiment of the present invention.
  • FIG. 7A is a diagram illustrating frame data 6 corresponding to pixel data of one frame.
  • FIG. 7B is a diagram illustrating line data 7 corresponding to pixel data of one line (line L in FIG. 7A).
  • frame data 6 of one frame (one image) acquired by the image sensor 244 is divided into a start code area 61, a header area 62, an image data area 63, a footer area 64, and an end code area 65.
  • the start code area 61 and the end code area 65 are assigned control codes indicating the beginning and the end of the line data 7, respectively.
  • the header area 62 stores additional information relating to the information stored in the image data area 63 (for example, a line number, a frame number, and an error correction code paired with the header information).
  • in the image data area 63, information corresponding to the electrical signal (imaging signal) generated by the imaging element 244 (sensor unit 244a) is stored.
  • the image data area 63 includes an effective pixel area 631 that is the image data of the effective pixels, a horizontal blanking area 632 that is a margin area provided at the head of each horizontal line of the effective pixel area 631, a first vertical blanking area 633 set on the upper end side of the effective pixel area 631 and the horizontal blanking area 632, and a second vertical blanking area 634 set on the lower end side of the effective pixel area 631 and the horizontal blanking area 632.
  • in the footer area 64, an error correction code paired with the information stored in the image data area 63 is assigned.
  • the line data 7 corresponding to the pixel data of one line is provided with a start code 71 at the head and a header 72, image data 73, a footer 74, and an end code 75 in order.
  • FIG. 7B shows the line data of the line L, which lies in the first vertical blanking area 633 of the image data area 63; therefore, blanking data is given to the area of the image data 73. The timing information D is superimposed on this blanking data. That is, in the second embodiment, the superimposing unit 244k performs a superimposition process that stores the timing information D in the first vertical blanking region 633, whereby the image signal on which the timing signal including the timing information D is superimposed is output from the image sensor 244 to the processing device 4.
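The packetization of one line of data 7 and the superimposition of the timing information D on the blanking data can be sketched as follows. The start/end codes, field widths, and checksum here are hypothetical placeholders chosen only to make the example runnable; the actual codes and widths are defined by the sensor's output format, which is not given here:

```python
# Illustrative packetization of one line of data 7: start code, header,
# image data (blanking data for a first-vertical-blanking line, with the
# timing information D superimposed), footer, end code. All codes and
# widths are hypothetical placeholders, not values from the patent.

START_CODE, END_CODE = b"\xff\x00", b"\x00\xff"

def build_line(line_no, payload, timing_info=None):
    header = line_no.to_bytes(2, "big")
    if timing_info is not None:           # superimpose D on blanking data
        payload = timing_info.to_bytes(4, "big") + payload[4:]
    footer = (sum(payload) & 0xFFFF).to_bytes(2, "big")  # toy checksum
    return START_CODE + header + payload + footer + END_CODE

blanking = bytes(16)                      # blanking data of line L
packet = build_line(0, blanking, timing_info=14500)
# Receiver side: extract D back out of the image-data field.
d = int.from_bytes(packet[4:8], "big")
print(d)  # 14500
```

The point of the structure is that D rides inside data the receiver already parses, so no extra signal line is needed to convey it.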
  • the control unit 408 controls the illumination control unit 32 based on the timing information extracted from the image signal through signal processing by the image processing unit 402. As a result, the illumination light can be irradiated during the period T exposure in which the illumination light reaches all lines from the first line to the n-th line, regardless of differences in readout start timing depending on the specifications of the image sensor and the drive mode.
  • the image sensor 244 superimposes, using the superimposing unit 244k, the start timing (start time T read) at which the reading unit 244g starts reading on the image signal as timing information, and outputs the image signal to the processing device 4.
  • since the control unit 408 controls the illumination control unit 32 based on the acquired timing information, the deviation between the readout timing of the image sensor and the light emission timing of the light source can be eliminated even when the readout start timing by the readout unit 244g of the image sensor 244 differs. Thereby, an in-vivo image with uniform brightness and without color mixture can be obtained regardless of the specifications of the image sensor and the drive mode.
  • since the timing information is superimposed on the image signal and output to the processing device 4, the timing information can be output to the processing device 4 without increasing the number of signal lines. Thereby, an increase in the diameter of the aggregate cable 245 due to additional signal lines can be prevented.
  • the timing information is given to the first vertical blanking area.
  • the timing information may be given to the second vertical blanking area or to the horizontal blanking area.
  • it may be given to the header area, or may be given to the surplus data area for adjusting the timing of the effective pixel area.
  • the timing information has been described as being stored in the image data area 63.
  • the timing information is stored in an arbitrary position (data) of the line data 7.
  • for example, it may be stored in the header of the line data having the first vertical blanking region 633 (for example, the header 72 of the first line), and it can be provided at an arbitrary position in the header 72.
  • alternatively, a position at which arbitrary data (for example, data representing the beginning of a frame) takes a predetermined value may be specified, and the timing information may be stored at that specified position.
  • FIG. 8 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the second modification of the second embodiment.
  • the timing for starting reading (start time T read for the first line) is described as timing information.
  • in the second modification, timing information is stored in each header of all the horizontal lines. This timing information includes information on the clock at which reading starts in the line following the line to which the timing information is added.
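A minimal sketch of this second modification, attaching to each line header the clock at which readout of the following line starts (all names and clock values are hypothetical):

```python
# Sketch of the second modification: every horizontal line's header
# carries the clock count at which readout of the *next* line starts.
# Clock values are hypothetical placeholders.

def attach_next_line_clocks(t_read, t_line_clocks, n_lines):
    """Return a list of per-line header dicts for one frame."""
    headers = []
    for i in range(n_lines):
        headers.append({
            "line": i + 1,
            # clock at which the following line starts reading
            "next_read_start_clock": t_read + (i + 1) * t_line_clocks,
        })
    return headers

hdrs = attach_next_line_clocks(t_read=100, t_line_clocks=30, n_lines=3)
print(hdrs[0]["next_read_start_clock"], hdrs[2]["next_read_start_clock"])
# 130 190
```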
  • FIG. 9 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the third modification of the second embodiment.
  • in the third modification, the start timing at which reading of the first line starts (start time T read) and the end timing at which reading of the n-th line ends (end time T read-end) may be used together as the timing information.
  • FIG. 10 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the fourth modification of the second embodiment.
  • in the above modification, the start timing for starting reading of the first line (start time T read of the first line) and the end timing for ending reading of the n-th line (end time T read-end of the n-th line) are used as the timing information; however, as in the fourth modification, the period from the start of reading the first line to the end of reading the n-th line (the readout time corresponding to one frame, T read-length) may be used as the timing information.
  • the charge accumulation period of the image sensor is controlled by controlling the lighting period of the light source without using the electronic shutter
  • the charge accumulation period of the photodiodes of the sensor unit 244a (the blanking period described above) is the period from the m-th read timing to the (m+1)-th read start timing (the light source lighting period); therefore, the light source control timing of the illumination control unit 32 can be determined by acquiring the timing information as in the first and second embodiments.
  • the charge accumulation period of the photodiode is a period from the timing of the electronic shutter after the m-th reading to the m + 1-th reading start timing.
  • in the period before the electronic shutter operates, the read data is not affected by the incident light. For this reason, by outputting the timing information (information relating to the accumulation period) to the processing device 4 with the charge accumulation start timing of the electronic shutter treated as the read start timing, illumination light can be emitted, for example, in the above-described period T exposure, and uniform illumination can be achieved.
  • FIG. 11 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the sixth modification of the second embodiment.
  • in the above description, the aim is to grasp the readout period or charge accumulation time of the entire effective pixel area. In the sixth modification, only the readout target pixels (for example, the region R in FIG. 11) are subjected to the timing information, so that the period during which pixels not to be read are exposed can be ignored and the readout target pixels can be uniformly irradiated with illumination light.
  • the read start timing of the first line and the read end timing of the nth line are used as timing information.
  • alternatively, the read start timing and the read end timing of each line may be used as the timing information.
  • timing information is given to the header of each line data.
  • FIG. 12 is a block diagram illustrating a schematic configuration of the endoscope system 1b according to the third embodiment of the present invention.
  • the same reference symbols are attached to components identical to those described above.
  • the processing device 4 is described as controlling the entire endoscope system, but the endoscope 2 may be configured to control the entire endoscope system.
  • the imaging element 244 of the endoscope 2 described above further includes a reference clock generation unit 244l. Further, the endoscope system 1b does not include the above-described synchronization signal generation unit 405.
  • a reference clock generation unit 244l is provided in the image sensor 244, and the read start timing and the illumination light irradiation timing are controlled by the clock generated by the reference clock generation unit 244l. In other words, in the endoscope system 1b, the timing between the reading by the reading unit 244g and the emission of the illumination light by the illumination control unit 32 is controlled based on the clock generated by the reference clock generation unit 244l.
  • the light source device 3 operates based on the clock generated by the reference clock generation unit 244l, while the reference clock generation unit 409 generates the clock for operating the internal components of the processing device 4, such as the image processing unit 402.
  • since the read start timing and the illumination light irradiation timing are controlled by the clock generated by the reference clock generation unit 244l, the read start timing and the illumination light irradiation timing can be matched with high accuracy even if there is a frequency deviation between the clock generated by the reference clock generation unit 244l of the image sensor 244 and the clock generated by the reference clock generation unit 409 of the processing device 4.
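The benefit of deriving both timings from the single reference clock 244l can be illustrated with a toy drift calculation (all numbers hypothetical): two free-running oscillators whose frequencies differ by a few tens of ppm accumulate a timing offset frame after frame, whereas a shared clock accumulates none.

```python
# Toy illustration of why the third embodiment drives both the readout
# timing and the illumination timing from the sensor's own reference
# clock: two oscillators with a small frequency deviation drift apart,
# while a single shared clock cannot.

def drift_after(frames, frame_clocks, ppm_error):
    """Offset (in clocks) between two oscillators after `frames` frames."""
    return frames * frame_clocks * ppm_error / 1_000_000

# With separate sensor/processor clocks differing by 50 ppm, the timing
# references diverge by 1500 clocks over 30 frames of 1e6 clocks each:
print(drift_after(30, 1_000_000, 50))  # 1500.0
# With one shared reference clock (244l) the deviation is zero:
print(drift_after(30, 1_000_000, 0))   # 0.0
```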
  • the reference clock generation unit 244l is not limited to being provided inside the imaging device, but may be provided anywhere in the endoscope 2.
  • the read start time T read as timing information has been described as a delay period from when the vertical synchronization signal rises to when the read unit 244g starts reading.
  • any data in the header or horizontal blanking area in the image signal may be used as the reference timing, and the delay time from the reference timing to the read start time may be used as the timing information. If the reference timing is clear, any position (data) can be applied as the reference timing.
  • from the viewpoint of acquiring the image of the effective pixel region, the readout start timing of the first line is preferably that of the first horizontal line among the lines having the effective pixel region.
  • the timing information may be acquired at the time of calibration of the endoscope 2, such as during white balance adjustment. By acquiring the timing information at the time of white balance adjustment and aligning the read start timing of the reading unit 244g with the illumination light irradiation timing of the illumination control unit 32 in advance, imaging can be performed with illumination light irradiated at the optimal timing without acquiring timing information and calculating the delay time during observation, and the processing performed by the endoscope 2 or the processing device 4 during observation can be reduced.
  • in the above embodiments, the imaging control unit 244e has been described as including the timing information generation unit 2441; however, a timing information generation unit may be provided separately in the imaging element 244, and the timing information may be output from that unit to the processing device 4.
  • in the second embodiment, the superimposing unit 244k is provided in the image sensor 244, and the data on which the timing information is superimposed by the superimposing unit 244k is output to the processing device 4; however, the AFE unit 244b, for example the correction unit 244j, may perform the superimposition processing.
  • in the above embodiments, the control unit 408 of the processing device 4 has been described as controlling the driving of the light source device 3 based on the acquired timing information; however, the light source device 3 may have a control unit, and this control unit may acquire the timing information and drive the light source based on it.
  • the imaging system and the imaging apparatus according to the present invention are useful for eliminating the deviation between the readout timing of the imaging device and the light emission timing of the light source, regardless of the external environment.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The present invention relates to an imaging system comprising: an imaging element (244) that has a sensor unit (244a) including a light receiving unit (244f) provided with a plurality of pixels and a reading unit (244g) for reading out, as image information, an electrical signal generated by the plurality of pixels, an imaging control unit (244) for generating a readout signal on the basis of an input synchronization signal and outputting the readout signal to the reading unit (244g), and a timing information generation unit (2441) for generating timing information relating to the illumination timing by an illumination unit (31) in accordance with the readout operations by the reading unit; a light source device (3) for emitting illumination light; and a control unit (408) that acquires the timing information generated by the timing information generation unit (2441) and controls the emission of illumination light by an illumination control unit (32) on the basis of the acquired timing information.
PCT/JP2014/079977 2014-01-29 2014-11-12 Système d'imagerie et dispositif d'imagerie WO2015114906A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015536694A JPWO2015114906A1 (ja) 2014-01-29 2014-11-12 撮像システムおよび撮像装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-014689 2014-01-29
JP2014014689 2014-01-29

Publications (1)

Publication Number Publication Date
WO2015114906A1 true WO2015114906A1 (fr) 2015-08-06

Family

ID=53756503

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/079977 WO2015114906A1 (fr) 2014-01-29 2014-11-12 Système d'imagerie et dispositif d'imagerie

Country Status (2)

Country Link
JP (1) JPWO2015114906A1 (fr)
WO (1) WO2015114906A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017221491A1 (fr) * 2016-06-23 2017-12-28 ソニー株式会社 Dispositif, système et procédé de commande
WO2018043428A1 (fr) * 2016-09-05 2018-03-08 オリンパス株式会社 Endoscope et système d'endoscope
JP2018175871A (ja) * 2017-04-14 2018-11-15 キヤノンメディカルシステムズ株式会社 撮像装置及び撮像装置の制御プログラム
WO2018220801A1 (fr) * 2017-06-01 2018-12-06 オリンパス株式会社 Dispositif d'endoscope
JPWO2018025457A1 (ja) * 2016-08-01 2019-05-30 ソニー株式会社 制御装置、制御システム、および制御方法
CN115428431A (zh) * 2020-04-02 2022-12-02 株式会社小糸制作所 门控照相机、车辆用感测***、车辆用灯具

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007018098A1 (fr) * 2005-08-05 2007-02-15 Olympus Medical Systems Corp. Unité électroluminescente
JP2009136447A (ja) * 2007-12-05 2009-06-25 Hoya Corp 光源制御システム、シャッタ制御システム。内視鏡プロセッサ、および内視鏡システム
JP2011030985A (ja) * 2009-08-06 2011-02-17 Hoya Corp 内視鏡システム、および内視鏡
JP2012034934A (ja) * 2010-08-10 2012-02-23 Hoya Corp 電子内視鏡用プロセッサ
JP2013000452A (ja) * 2011-06-20 2013-01-07 Olympus Corp 電子内視鏡装置
JP2013172904A (ja) * 2012-02-27 2013-09-05 Olympus Medical Systems Corp 撮像装置および撮像システム
WO2013175908A1 (fr) * 2012-05-25 2013-11-28 オリンパスメディカルシステムズ株式会社 Système d'imagerie

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4187486B2 (ja) * 2002-08-29 2008-11-26 フジノン株式会社 電子内視鏡装置
JP2008229208A (ja) * 2007-03-23 2008-10-02 Hoya Corp 電子内視鏡システムの電子スコープ

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007018098A1 (fr) * 2005-08-05 2007-02-15 Olympus Medical Systems Corp. Unité électroluminescente
JP2009136447A (ja) * 2007-12-05 2009-06-25 Hoya Corp 光源制御システム、シャッタ制御システム。内視鏡プロセッサ、および内視鏡システム
JP2011030985A (ja) * 2009-08-06 2011-02-17 Hoya Corp 内視鏡システム、および内視鏡
JP2012034934A (ja) * 2010-08-10 2012-02-23 Hoya Corp 電子内視鏡用プロセッサ
JP2013000452A (ja) * 2011-06-20 2013-01-07 Olympus Corp 電子内視鏡装置
JP2013172904A (ja) * 2012-02-27 2013-09-05 Olympus Medical Systems Corp 撮像装置および撮像システム
WO2013175908A1 (fr) * 2012-05-25 2013-11-28 オリンパスメディカルシステムズ株式会社 Système d'imagerie

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017221491A1 (fr) * 2016-06-23 2017-12-28 ソニー株式会社 Dispositif, système et procédé de commande
JPWO2018025457A1 (ja) * 2016-08-01 2019-05-30 ソニー株式会社 制御装置、制御システム、および制御方法
EP3492037A4 (fr) * 2016-08-01 2019-07-24 Sony Corporation Dispositif, système et procédé de commande
JP7070412B2 (ja) 2016-08-01 2022-05-18 ソニーグループ株式会社 制御装置、制御システム、および制御方法
WO2018043428A1 (fr) * 2016-09-05 2018-03-08 オリンパス株式会社 Endoscope et système d'endoscope
JP6360988B1 (ja) * 2016-09-05 2018-07-18 オリンパス株式会社 内視鏡および内視鏡システム
US10653304B2 (en) 2016-09-05 2020-05-19 Olympus Corporation Endoscope and endoscope system
JP2018175871A (ja) * 2017-04-14 2018-11-15 キヤノンメディカルシステムズ株式会社 撮像装置及び撮像装置の制御プログラム
JP7116580B2 (ja) 2017-04-14 2022-08-10 キヤノン株式会社 撮像装置、撮像装置を制御するための方法及びプログラム
WO2018220801A1 (fr) * 2017-06-01 2018-12-06 オリンパス株式会社 Dispositif d'endoscope
CN115428431A (zh) * 2020-04-02 2022-12-02 株式会社小糸制作所 门控照相机、车辆用感测***、车辆用灯具

Also Published As

Publication number Publication date
JPWO2015114906A1 (ja) 2017-03-23

Similar Documents

Publication Publication Date Title
JP5452785B1 (ja) 撮像システム
US9844312B2 (en) Endoscope system for suppressing decrease of frame rate without changing clock rate of reading
WO2015114906A1 (fr) Système d'imagerie et dispositif d'imagerie
WO2014002732A1 (fr) Dispositif d'imagerie et système d'imagerie
JP5467182B1 (ja) 撮像システム
JP5927370B1 (ja) 撮像装置および処理装置
WO2014171332A1 (fr) Dispositif de capture et dispositif de traitement d'image
WO2013128764A1 (fr) Dispositif médical
WO2016104386A1 (fr) Gradateur, système d'imagerie, procédé permettant de faire fonctionner un gradateur, et programme de fonctionnement pour gradateur
JP5847368B1 (ja) 撮像装置および内視鏡装置
JP5926980B2 (ja) 撮像装置および撮像システム
WO2020178962A1 (fr) Système d'endoscope et dispositif de traitement d'image
JP6137892B2 (ja) 撮像システム
JP6945660B2 (ja) 撮像システムおよび処理装置
JP5815162B2 (ja) 撮像装置
JP5885617B2 (ja) 撮像システム
JP5932191B1 (ja) 伝送システムおよび処理装置
JP7224963B2 (ja) 医療用制御装置及び医療用観察システム
JP6172738B2 (ja) 電子内視鏡および電子内視鏡システム
JP2013172765A (ja) 内視鏡システム
JP2016025509A (ja) 撮像システムおよび内視鏡

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2015536694

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14880470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14880470

Country of ref document: EP

Kind code of ref document: A1