EP4309358A1 - An imaging system, method and computer program product for an imaging device

An imaging system, method and computer program product for an imaging device

Info

Publication number
EP4309358A1
Authority
EP
European Patent Office
Prior art keywords
light
polarization
image
imaging device
unit
Prior art date
Legal status
Pending
Application number
EP22712261.1A
Other languages
German (de)
French (fr)
Inventor
Paul Springer
Thimo Emmerich
Zoltan Facius
Current Assignee
Sony Group Corp
Sony Europe BV
Original Assignee
Sony Group Corp
Sony Europe BV
Priority date
Filing date
Publication date
Application filed by Sony Group Corp, Sony Europe BV filed Critical Sony Group Corp
Publication of EP4309358A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00064 Constructional details of the endoscope body
    • A61B1/00071 Insertion part of the endoscope body
    • A61B1/0008 Insertion part of the endoscope body characterised by distal tip features
    • A61B1/00096 Optical elements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 Optical arrangements
    • A61B1/00186 Optical arrangements with imaging filters
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2407 Optical details
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2476 Non-optical details, e.g. housings, mountings, supports
    • G02B23/2484 Arrangements in relation to a camera or imaging device
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/30 Polarising elements
    • G02B5/3025 Polarisers, i.e. arrangements capable of producing a definite output polarisation state from an unpolarised input state
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals

Definitions

  • the present invention relates to an imaging system, method and computer program product for an imaging device.
  • Imaging devices are used in a wide variety of situations in order to obtain images of an environment or scene.
  • the optical performance of the imaging device is a key factor in the quality of images and/or the amount of information regarding the environment or scene which can be obtained.
  • imaging devices are limited in optical performance in many ways.
  • a particular problem is the limited depth of field at which high spatial frequencies can be obtained in order to capture a sufficiently sharp image.
  • a lower aperture number corresponds to a wider aperture size, which further narrows the depth of field.
  • an imaging system for an imaging device comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • a method of imaging for an imaging device comprising: receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to perform a method of imaging for an imaging device
  • the method comprising: controlling image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state to receive light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and controlling the image sensor circuitry to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
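The single-readout behaviour described in the claims above can be sketched as follows. This is a minimal illustration that assumes a 2×2 polarization mosaic of four orientations (0°, 45°, 90°, 135°), as found in commercial polarization image sensors; the actual arrangement and number of image sensing regions in the disclosure may differ:

```python
import numpy as np

def demultiplex_polarization(raw, pattern=((0, 45), (135, 90))):
    """Split a polarization-mosaic sensor frame into one sub-image per
    polarization orientation. `pattern` gives the assumed 2x2 tiling of
    orientations in degrees; the layout is illustrative only."""
    subimages = {}
    for dy in range(2):
        for dx in range(2):
            angle = pattern[dy][dx]
            subimages[angle] = raw[dy::2, dx::2]
    return subimages

# One readout at a single instance of time yields four images at once,
# one per polarization state.
raw = np.arange(16).reshape(4, 4)
subs = demultiplex_polarization(raw)
print(sorted(subs.keys()))   # [0, 45, 90, 135]
print(subs[0].shape)         # (2, 2)
```

Each sub-image sees the scene through a different polarization state, so they can be processed jointly (e.g. for the focal-plane shifts discussed later) without any sequential capture.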
  • the depth of field which an imaging device can achieve, for a given aperture number, is extended, which enables a sharper in-focus image to be obtained for an extended range of object distances within the scene. Furthermore, the increase in the depth of field can be achieved without increasing the form factor of the imaging device.
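The conventional trade-off addressed here can be seen numerically with the standard thin-lens depth-of-field approximation. The focal length, subject distance and the 5 µm circle of confusion below are illustrative assumptions, not values from the disclosure:

```python
def depth_of_field(f_mm, N, s_mm, c_mm=0.005):
    """Approximate thin-lens depth of field in mm.
    f_mm: focal length, N: aperture number, s_mm: subject distance,
    c_mm: circle of confusion (sensor-dependent; 5 um assumed)."""
    H = f_mm ** 2 / (N * c_mm) + f_mm                  # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)   # near limit
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return far - near

# Lower aperture number (wider aperture) -> shallower depth of field.
wide = depth_of_field(f_mm=4.0, N=2.0, s_mm=50.0)
narrow = depth_of_field(f_mm=4.0, N=8.0, s_mm=50.0)
```

With these numbers the f/2 depth of field is only a few millimetres, versus roughly four times that at f/8, which is the gap the polarization-based focal-plane shifts aim to close without stopping down the aperture.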
  • the present disclosure is not particularly limited to these advantageous technical effects, there may be others as will be apparent to the skilled person when reading the disclosure.
  • Figure 1 is a view depicting an example of a schematic configuration of an endoscopic surgery system to which the technology according to an embodiment of the present disclosure can be applied;
  • Figure 2 is a block diagram depicting an example of a functional configuration of the camera head and the CCU depicted in Figure 1;
  • Figure 3 illustrates an imaging system for an imaging device in accordance with embodiments of the disclosure
  • Figure 4A illustrates a polarization unit in accordance with embodiments of the disclosure
  • Figure 4B illustrates an example configuration of the polarization unit in accordance with embodiments of the disclosure
  • Figure 4C illustrates an example configuration of the polarization unit in accordance with embodiments of the disclosure
  • Figure 5 illustrates image sensor circuitry in accordance with embodiments of the disclosure
  • Figure 6A illustrates an example focal plane shift in accordance with embodiments of the disclosure
  • Figure 6B illustrates an example focal plane shift in accordance with embodiments of the disclosure
  • Figure 6C illustrates an example of a focus shift in the image plane in accordance with embodiments of the disclosure
  • Figure 6D illustrates an example of a focus shift in accordance with embodiments of the disclosure
  • Figure 6E illustrates an example of a focus shift in accordance with embodiments of the disclosure
  • Figure 7 illustrates a device in accordance with embodiments of the disclosure.
  • Figure 8 illustrates a method of imaging for an imaging device in accordance with embodiments of the disclosure.
  • the technology according to an embodiment of the present disclosure can be applied to various products.
  • the technology according to an embodiment of the present disclosure may be applied to an endoscopic surgery system, surgical microscopy or medical imaging device, or other kind of industrial endoscopy in, say, pipe or tube laying or fault finding.
  • Figure 1 is a view depicting an example of a schematic configuration of an endoscopic surgery system 5000 to which the technology according to an embodiment of the present disclosure can be applied.
  • a state is illustrated in which a surgeon (medical doctor) 5067 is using the endoscopic surgery system 5000 to perform surgery for a patient 5071 on a patient bed 5069.
  • the endoscopic surgery system 5000 includes an endoscope 5001, other surgical tools 5017, a supporting arm apparatus 5027 which supports the endoscope 5001 thereon, and a cart 5037 on which various apparatus for endoscopic surgery are mounted.
  • trocars 5025a to 5025d are used to puncture the abdominal wall.
  • a lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into body lumens of the patient 5071 through the trocars 5025a to 5025d.
  • a pneumoperitoneum tube 5019, an energy treatment tool 5021 and forceps 5023 are inserted into body lumens of the patient 5071.
  • the energy treatment tool 5021 is a treatment tool for performing incision and peeling of a tissue, sealing of a blood vessel or the like by high frequency current or ultrasonic vibration.
  • the surgical tools 5017 depicted are mere examples; as the surgical tools 5017, various surgical tools which are generally used in endoscopic surgery such as, for example, a pair of tweezers or a retractor may be used.
  • An image of a surgical region in a body lumen of the patient 5071 imaged by the endoscope 5001 is displayed on a display apparatus 5041.
  • the surgeon 5067 would use the energy treatment tool 5021 or the forceps 5023 while watching the image of the surgical region displayed on the display apparatus 5041 in real time to perform such treatment as, for example, resection of an affected area.
  • the pneumoperitoneum tube 5019, the energy treatment tool 5021 and the forceps 5023 are supported by the surgeon 5067, an assistant or the like during surgery.
  • the supporting arm apparatus 5027 includes an arm unit 5031 extending from a base unit 5029.
  • the arm unit 5031 includes joint portions 5033a, 5033b and 5033c and links 5035a and 5035b and is driven under the control of an arm controlling apparatus 5045.
  • the endoscope 5001 is supported by the arm unit 5031 such that the position and the posture of the endoscope 5001 are controlled. Consequently, stable fixation in position of the endoscope 5001 can be implemented.
  • the endoscope 5001 includes the lens barrel 5003 which has a region of a predetermined length from a distal end thereof to be inserted into a body lumen of the patient 5071, and a camera head 5005 connected to a proximal end of the lens barrel 5003.
  • the endoscope 5001 is depicted as a rigid endoscope (a so-called hard mirror) having the lens barrel 5003 of the rigid type.
  • the endoscope 5001 may otherwise be configured as a flexible endoscope (a so-called soft mirror) having the lens barrel 5003 of the flexible type.
  • the lens barrel 5003 has, at a distal end thereof, an opening in which an objective lens is fitted.
  • a light source apparatus 5043 is connected to the endoscope 5001 such that light generated by the light source apparatus 5043 is introduced to a distal end of the lens barrel by a light guide extending in the inside of the lens barrel 5003 and is irradiated toward an observation target in a body lumen of the patient 5071 through the objective lens.
  • the endoscope 5001 may be a direct view mirror or may be a perspective view mirror or a side view mirror.
  • An optical system and an image pickup element are provided in the inside of the camera head 5005 such that reflected light (observation light) from an observation target is condensed on the image pickup element by the optical system.
  • the observation light is photo-electrically converted by the image pickup element to generate an electric signal corresponding to the observation light, namely, an image signal corresponding to an observation image.
  • the image signal is transmitted as RAW data to a CCU 5039.
  • the camera head 5005 has a function incorporated therein for suitably driving the optical system of the camera head 5005 to adjust the magnification and the focal distance.
  • a plurality of image pickup elements may be provided on the camera head 5005.
  • a plurality of relay optical systems are provided in the inside of the lens barrel 5003 in order to guide observation light to each of the plurality of image pickup elements.
  • the CCU 5039 includes a central processing unit (CPU), a graphics processing unit (GPU) or the like and integrally controls operation of the endoscope 5001 and the display apparatus 5041.
  • the CCU 5039 performs, for an image signal received from the camera head 5005, various image processes for displaying an image based on the image signal such as, for example, a development process (demosaic process).
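The development (demosaic) process the CCU performs can be sketched in its simplest form. Real pipelines interpolate missing colour samples at full resolution; the half-resolution subsampling below, assuming an RGGB Bayer layout, only keeps the idea visible:

```python
import numpy as np

def demosaic_nearest(raw):
    """Minimal development (demosaic) sketch for an RGGB Bayer mosaic:
    each 2x2 cell becomes one RGB pixel (half resolution), with the two
    green samples averaged. Illustrative only, not the CCU's algorithm."""
    r = raw[0::2, 0::2]                                     # red sites
    g = (raw[0::2, 1::2].astype(float) + raw[1::2, 0::2]) / 2.0  # two greens
    b = raw[1::2, 1::2]                                     # blue sites
    return np.dstack([r, g, b])

raw = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)
rgb = demosaic_nearest(raw)
print(rgb.shape)   # (1, 1, 3)
```

The single 2×2 cell above yields one RGB pixel of (10, 25, 40): red and blue pass through, the two green samples are averaged.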
  • the CCU 5039 provides the image signal for which the image processes have been performed to the display apparatus 5041.
  • the CCU 5039 transmits a control signal to the camera head 5005 to control driving of the camera head 5005.
  • the control signal may include information relating to an image pickup condition such as a magnification or a focal distance.
  • the display apparatus 5041 displays an image based on an image signal for which the image processes have been performed by the CCU 5039 under the control of the CCU 5039. If the endoscope 5001 is ready for imaging of a high resolution such as 4K (horizontal pixel number 3840 × vertical pixel number 2160), 8K (horizontal pixel number 7680 × vertical pixel number 4320) or the like and/or ready for 3D display, then a display apparatus by which corresponding display of the high resolution and/or 3D display are possible may be used as the display apparatus 5041.
  • if the display apparatus used as the display apparatus 5041 has a size of not less than 55 inches, then a more immersive experience can be obtained.
  • a plurality of display apparatus 5041 having different resolutions and/or different sizes may be provided in accordance with purposes.
  • the light source apparatus 5043 includes a light source such as, for example, a light emitting diode (LED) and supplies irradiation light for imaging of a surgical region to the endoscope 5001.
  • the arm controlling apparatus 5045 includes a processor such as, for example, a CPU and operates in accordance with a predetermined program to control driving of the arm unit 5031 of the supporting arm apparatus 5027 in accordance with a predetermined controlling method.
  • An inputting apparatus 5047 is an input interface for the endoscopic surgery system 5000.
  • a user can perform inputting of various kinds of information or instruction inputting to the endoscopic surgery system 5000 through the inputting apparatus 5047.
  • the user would input various kinds of information relating to surgery such as physical information of a patient, information regarding a surgical procedure of the surgery and so forth through the inputting apparatus 5047.
  • the user would input, for example, an instruction to drive the arm unit 5031, an instruction to change an image pickup condition (type of irradiation light, magnification, focal distance or the like) by the endoscope 5001, an instruction to drive the energy treatment tool 5021 or the like through the inputting apparatus 5047.
  • the type of the inputting apparatus 5047 is not limited and may be that of any one of various known inputting apparatus.
  • the inputting apparatus 5047 for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057 and/or a lever or the like may be applied.
  • a touch panel is used as the inputting apparatus 5047, it may be provided on the display face of the display apparatus 5041.
  • the inputting apparatus 5047 is a device to be mounted on a user such as, for example, a glasses type wearable device or a head mounted display (HMD), and various kinds of inputting are performed in response to a gesture or a line of sight of the user detected by any of the devices mentioned.
  • the inputting apparatus 5047 includes a camera which can detect a motion of a user, and various kinds of inputting are performed in response to a gesture or a line of sight of a user detected from a video imaged by the camera.
  • the inputting apparatus 5047 includes a microphone which can collect the voice of a user, and various kinds of inputting are performed by voice collected by the microphone.
  • the inputting apparatus 5047 By configuring the inputting apparatus 5047 such that various kinds of information can be inputted in a contactless fashion in this manner, especially a user who belongs to a clean area (for example, the surgeon 5067) can operate an apparatus belonging to an unclean area in a contactless fashion. Further, since the user can operate an apparatus without releasing a possessed surgical tool from its hand, the convenience to the user is improved.
  • a treatment tool controlling apparatus 5049 controls driving of the energy treatment tool 5021 for cautery or incision of a tissue, sealing of a blood vessel or the like.
  • a pneumoperitoneum apparatus 5051 feeds gas into a body lumen of the patient 5071 through the pneumoperitoneum tube 5019 to inflate the body lumen in order to secure the field of view of the endoscope 5001 and secure the working space for the surgeon.
  • a recorder 5053 is an apparatus capable of recording various kinds of information relating to surgery.
  • a printer 5055 is an apparatus capable of printing various kinds of information relating to surgery in various forms such as a text, an image or a graph.
  • the supporting arm apparatus 5027 includes the base unit 5029 serving as a base, and the arm unit 5031 extending from the base unit 5029.
  • the arm unit 5031 includes the plurality of joint portions 5033a, 5033b and 5033c and the plurality of links 5035a and 5035b connected to each other by the joint portion 5033b.
  • Figure 1 for simplified illustration, the configuration of the arm unit 5031 is depicted in a simplified form.
  • the shape, number and arrangement of the joint portions 5033a to 5033c and the links 5035a and 5035b and the direction and so forth of axes of rotation of the joint portions 5033a to 5033c can be set suitably such that the arm unit 5031 has a desired degree of freedom.
  • the arm unit 5031 may preferably be configured such that it has not less than 6 degrees of freedom. This makes it possible to move the endoscope 5001 freely within the movable range of the arm unit 5031. Consequently, it becomes possible to insert the lens barrel 5003 of the endoscope 5001 from a desired direction into a body lumen of the patient 5071.
  • An actuator is provided in each of the joint portions 5033a to 5033c, and the joint portions 5033a to 5033c are configured such that they are rotatable around predetermined axes of rotation thereof by driving of the respective actuators.
  • the driving of the actuators is controlled by the arm controlling apparatus 5045 to control the rotational angle of each of the joint portions 5033a to 5033c thereby to control driving of the arm unit 5031. Consequently, control of the position and the posture of the endoscope 5001 can be implemented.
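How controlling each joint's rotational angle determines the end position (e.g. the endoscope tip) can be illustrated with a planar two-link forward-kinematics sketch; the actual arm 5031 has more joints, three-dimensional rotation axes, and link lengths not given in the disclosure:

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Planar forward kinematics: accumulate each joint's rotation and
    walk along each link to find the end-effector position. Two links
    keep the idea simple; the real arm is three-dimensional."""
    x = y = 0.0
    theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                 # joint rotations accumulate
        x += length * math.cos(theta)  # advance along the link
        y += length * math.sin(theta)
    return x, y

# 0.3 m and 0.2 m links; first joint at +90 deg, second at -90 deg.
x, y = forward_kinematics([0.3, 0.2], [math.pi / 2, -math.pi / 2])
```

Changing either joint angle moves the tip along a different arc, which is why coordinated control of all joint angles is what fixes the endoscope's position and posture.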
  • the arm controlling apparatus 5045 can control driving of the arm unit 5031 by various known controlling methods such as force control or position control.
  • the arm unit 5031 may be controlled suitably by the arm controlling apparatus 5045 in response to the operation input to control the position and the posture of the endoscope 5001.
  • the endoscope 5001 at the distal end of the arm unit 5031 is moved from an arbitrary position to a different arbitrary position by the control just described, the endoscope 5001 can be supported fixedly at the position after the movement.
  • the arm unit 5031 may be operated in a master-slave fashion. In this case, the arm unit 5031 may be remotely controlled by the user through the inputting apparatus 5047 which is placed at a place remote from the surgery room.
  • the arm controlling apparatus 5045 may perform power-assisted control to drive the actuators of the joint portions 5033a to 5033c such that the arm unit 5031 may receive external force by the user and move smoothly following the external force.
  • This makes it possible, when the user directly touches and moves the arm unit 5031, to move the arm unit 5031 with comparatively weak force. Accordingly, it becomes possible for the user to move the endoscope 5001 more intuitively by a simpler and easier operation, and the convenience to the user can be improved.
  • the endoscope 5001 is supported by a medical doctor called a scopist.
  • the position of the endoscope 5001 can be fixed more certainly without hands, and therefore, an image of a surgical region can be obtained stably and surgery can be performed smoothly.
  • the arm controlling apparatus 5045 may not necessarily be provided on the cart 5037. Further, the arm controlling apparatus 5045 may not necessarily be a single apparatus. For example, the arm controlling apparatus 5045 may be provided in each of the joint portions 5033a to 5033c of the arm unit 5031 of the supporting arm apparatus 5027 such that the plurality of arm controlling apparatus 5045 cooperate with each other to implement driving control of the arm unit 5031.
  • the light source apparatus 5043 supplies irradiation light upon imaging of a surgical region to the endoscope 5001.
  • the light source apparatus 5043 includes a white light source which includes, for example, an LED, a laser light source or a combination of them.
  • a white light source includes a combination of red, green, and blue (RGB) laser light sources, since the output intensity and the output timing can be controlled with a high degree of accuracy for each colour (each wavelength), adjustment of the white balance of a picked up image can be performed by the light source apparatus 5043.
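The white-balance adjustment can be illustrated with a simple gray-world sketch: scale the red and blue channels so their averages match green. With RGB laser sources, gains computed this way could equally be realised at the source as per-colour output intensities. The channel averages below are illustrative numbers, not measurements:

```python
def white_balance_gains(mean_r, mean_g, mean_b):
    """Gray-world white balance sketch: return per-channel gains that
    scale R and B so their averages match G (G is the reference)."""
    return mean_g / mean_r, 1.0, mean_g / mean_b

# Illustrative channel averages from a picked-up frame.
gr, gg, gb = white_balance_gains(80.0, 100.0, 125.0)
print(gr, gg, gb)   # 1.25 1.0 0.8
```

Applying the same correction at the laser outputs rather than in post-processing is what the high-accuracy per-wavelength intensity control of the RGB laser sources makes possible.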
  • by controlling driving of the image pickup element in synchronism with time-divisional irradiation from the R, G and B laser light sources, images corresponding to the R, G and B colours can be picked up time-divisionally. According to the method just described, a colour image can be obtained even if a colour filter is not provided for the image pickup element.
  • driving of the light source apparatus 5043 may be controlled such that the intensity of light to be outputted is changed for each predetermined time.
  • driving of the image pickup element of the camera head 5005 in synchronism with the timing of the change of the intensity of light to acquire images time-divisionally and synthesizing the images, an image of a high dynamic range free from underexposed blocked up shadows and overexposed highlights can be created.
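The synthesis step above can be sketched as a weighted merge of exposure-normalized frames. The triangular mid-tone weighting used here is a common choice in HDR merging, not one specified by the disclosure, and the pixel values are illustrative:

```python
import numpy as np

def merge_hdr(frames, exposure_times):
    """Sketch of high-dynamic-range synthesis from frames acquired
    time-divisionally under changing light intensity: scale each frame
    to a common radiance domain (divide by exposure) and average,
    down-weighting saturated and near-black pixels."""
    acc = np.zeros(frames[0].shape, dtype=float)
    wsum = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times):
        f = frame.astype(float)
        w = 1.0 - np.abs(f / 255.0 - 0.5) * 2.0   # trust mid-tones most
        acc += w * (f / t)
        wsum += w
    return acc / np.maximum(wsum, 1e-9)

short = np.array([[100.0]])   # unit exposure
long_ = np.array([[200.0]])   # double exposure of the same scene point
radiance = merge_hdr([short, long_], [1.0, 2.0])
```

Both frames here describe the same scene radiance (100 per unit exposure), so the merge recovers it; in practice each frame contributes detail in the tonal range where it is neither blocked up nor blown out.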
  • the light source apparatus 5043 may be configured to supply light of a predetermined wavelength band ready for special light observation.
  • This may include, but not be limited to laser light such as that provided by a Vertical Cavity surface laser or any kind of laser light.
  • the light may be InfraRed (IR) light.
  • in special light observation, for example, by utilizing the wavelength dependency of absorption of light in a body tissue, light of a narrower band in comparison with irradiation light upon ordinary observation (namely, white light) is irradiated, whereby narrow band light observation (narrow band imaging) of imaging a predetermined tissue such as a blood vessel of a superficial portion of the mucous membrane in a high contrast is performed.
  • fluorescent observation for obtaining an image from fluorescent light generated by irradiation of excitation light may be performed.
  • fluorescent observation it is possible to perform observation of fluorescent light from a body tissue by irradiating excitation light on the body tissue (autofluorescence observation) or to obtain a fluorescent light image by locally injecting a reagent such as indocyanine green (ICG) into a body tissue and irradiating excitation light corresponding to a fluorescent light wavelength of the reagent upon the body tissue.
  • the light source apparatus 5043 can be configured to supply such narrow-band light and/or excitation light suitable for special light observation as described above.
  • the light source may also apply a heat pattern to an area.
  • the light source apparatus 5043 is, in embodiments, one or more Vertical Cavity Surface-Emitting Lasers (VCSELs) which can produce light in the visible part of the electromagnetic spectrum, and some produce light in the Infra-Red part of the electromagnetic spectrum. In this respect, the light source apparatus 5043 may also act as a visible light source illuminating the area.
  • the one or more VCSELs may be single wavelength narrowband VCSELs, where each VCSEL varies in emission spectral frequency.
  • one or more of the VCSELs may be a Micro Electro Mechanical system (MEMs) type VCSEL whose wavelength emission may be altered over a specific range.
  • the wavelength may alter over the range 550nm to 650nm or 600nm to 650nm.
  • the shape of the VCSEL may vary such as a square or circular shape and may be positioned at one or varying positions in the endoscope 5001.
  • the light source apparatus 5043 may illuminate one or more areas. This may be achieved by selectively switching the VCSELs on or by performing a raster scan of the area using a Micro Electro Mechanical system (MEMs).
  • the purpose of the light source apparatus 5043 is to perform Spatial Light Modulation (SLM) on the light over the area. This will be explained in more detail later.
  • while the light source apparatus 5043 may be positioned in the cart, the disclosure is not so limited. In particular, the light source apparatus may be positioned in the camera head 5005.
  • FIG. 2 is a block diagram depicting an example of a functional configuration of the camera head 5005 and the CCU 5039 depicted in Figure 1.
  • the camera head 5005 has, as functions thereof, a lens unit 5007, an image pickup unit 5009, a driving unit 5011, a communication unit 5013 and a camera head controlling unit 5015.
  • the CCU 5039 has, as functions thereof, a communication unit 5059, an image processing unit 5061 and a control unit 5063.
  • the camera head 5005 and the CCU 5039 are connected to be bidirectionally communicable to each other by a transmission cable 5065.
  • the lens unit 5007 is an optical system provided at a connecting location of the camera head 5005 to the lens barrel 5003. Observation light taken in from a distal end of the lens barrel 5003 is introduced into the camera head 5005 and enters the lens unit 5007.
  • the lens unit 5007 includes a combination of a plurality of lenses including a zoom lens and a focusing lens.
  • the lens unit 5007 has optical properties adjusted such that the observation light is condensed on a light receiving face of the image pickup element of the image pickup unit 5009.
  • the zoom lens and the focusing lens are configured such that the positions thereof on their optical axis are movable for adjustment of the magnification and the focal point of a picked up image.
  • the image pickup unit 5009 includes an image pickup element and disposed at a succeeding stage to the lens unit 5007. Observation light having passed through the lens unit 5007 is condensed on the light receiving face of the image pickup element, and an image signal corresponding to the observation image is generated by photoelectric conversion of the image pickup element. The image signal generated by the image pickup unit 5009 is provided to the communication unit 5013.
  • as the image pickup element included in the image pickup unit 5009, an image sensor, for example, of the complementary metal oxide semiconductor (CMOS) type is used which has a Bayer array and is capable of picking up an image in colour.
  • an image pickup element may be used which is ready, for example, for imaging of an image of a high resolution equal to or not less than 4K. If an image of a surgical region is obtained in a high resolution, then the surgeon 5067 can comprehend a state of the surgical region in enhanced details and can proceed with the surgery more smoothly.
  • the image pickup unit 5009 may be configured such that it has a pair of image pickup elements for acquiring image signals for the right eye and the left eye compatible with 3D display. Where 3D display is applied, the surgeon 5067 can comprehend the depth of a living body tissue in the surgical region more accurately. It is to be noted that, if the image pickup unit 5009 is configured as that of the multi-plate type, then a plurality of systems of lens units 5007 are provided corresponding to the individual image pickup elements of the image pickup unit 5009.
  • the image pickup unit 5009 may not necessarily be provided on the camera head 5005.
  • the image pickup unit 5009 may be provided just behind the objective lens in the inside of the lens barrel 5003.
  • the driving unit 5011 includes an actuator and moves the zoom lens and the focusing lens of the lens unit 5007 by a predetermined distance along the optical axis under the control of the camera head controlling unit 5015. Consequently, the magnification and the focal point of a picked up image by the image pickup unit 5009 can be adjusted suitably.
  • the communication unit 5013 includes a communication apparatus for transmitting and receiving various kinds of information to and from the CCU 5039.
  • the communication unit 5013 transmits an image signal acquired from the image pickup unit 5009 as RAW data to the CCU 5039 through the transmission cable 5065.
  • the image signal is preferably transmitted by optical communication. This is because, upon surgery, the surgeon 5067 performs surgery while observing the state of an affected area through a picked up image, and so it is demanded that a moving image of the surgical region be displayed as close to real time as possible in order to achieve surgery with a higher degree of safety and certainty.
  • a photoelectric conversion module for converting an electric signal into an optical signal is provided in the communication unit 5013. After the image signal is converted into an optical signal by the photoelectric conversion module, it is transmitted to the CCU 5039 through the transmission cable 5065.
  • the communication unit 5013 receives a control signal for controlling driving of the camera head 5005 from the CCU 5039.
  • the control signal includes information relating to image pickup conditions such as, for example, information that a frame rate of a picked up image is designated, information that an exposure value upon image picking up is designated and/or information that a magnification and a focal point of a picked up image are designated.
  • the communication unit 5013 provides the received control signal to the camera head controlling unit 5015.
  • the control signal from the CCU 5039 may be transmitted by optical communication.
  • a photoelectric conversion module for converting an optical signal into an electric signal is provided in the communication unit 5013. After the control signal is converted into an electric signal by the photoelectric conversion module, it is provided to the camera head controlling unit 5015.
  • the image pickup conditions such as the frame rate, exposure value, magnification or focal point are set automatically by the control unit 5063 of the CCU 5039 on the basis of an acquired image signal.
  • an auto exposure (AE) function, an auto focus (AF) function and an auto white balance (AWB) function are incorporated in the endoscope 5001.
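As a rough illustration of how an AE function might set exposure automatically on the basis of an acquired image signal, the sketch below nudges an exposure value towards a target mean luminance. The target value, gain and log2 update rule are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def auto_exposure_step(image, target_mean=0.18, current_ev=0.0, gain=0.5):
    """One iteration of a simple mean-luminance auto-exposure loop.

    `target_mean`, `gain` and the log2 (stop-based) update rule are
    illustrative assumptions for this sketch.
    """
    mean_luma = float(np.mean(image))  # image assumed normalised to [0, 1]
    # Move the exposure value towards the target in log2 (stop) units.
    error_stops = np.log2(target_mean / max(mean_luma, 1e-6))
    return current_ev + gain * error_stops

# An under-exposed frame (mean 0.09 vs. target 0.18) asks for half of the
# one-stop error at gain 0.5, i.e. roughly +0.5 EV.
frame = np.full((4, 4), 0.09)
new_ev = auto_exposure_step(frame)
```

A real AE loop would run this repeatedly, clamping the exposure value to the sensor's valid range.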
  • the camera head controlling unit 5015 controls driving of the camera head 5005 on the basis of a control signal from the CCU 5039 received through the communication unit 5013.
  • the camera head controlling unit 5015 controls driving of the image pickup element of the image pickup unit 5009 on the basis of information that a frame rate of a picked up image is designated and/or information that an exposure value upon image picking up is designated.
  • the camera head controlling unit 5015 controls the driving unit 5011 to suitably move the zoom lens and the focus lens of the lens unit 5007 on the basis of information that a magnification and a focal point of a picked up image are designated.
  • the camera head controlling unit 5015 may further include a function for storing information for identifying the lens barrel 5003 and/or the camera head 5005.
  • the camera head 5005 can be provided with resistance to an autoclave sterilization process.
  • the communication unit 5059 includes a communication apparatus for transmitting and receiving various kinds of information to and from the camera head 5005.
  • the communication unit 5059 receives an image signal transmitted thereto from the camera head 5005 through the transmission cable 5065. Thereupon, the image signal may be transmitted preferably by optical communication as described above.
  • the communication unit 5059 includes a photoelectric conversion module for converting an optical signal into an electric signal.
  • the communication unit 5059 provides the image signal after conversion into an electric signal to the image processing unit 5061.
  • the communication unit 5059 transmits, to the camera head 5005, a control signal for controlling driving of the camera head 5005.
  • the control signal may also be transmitted by optical communication.
  • the image processing unit 5061 performs various image processes for an image signal in the form of RAW data transmitted thereto from the camera head 5005.
  • the image processes include various known signal processes such as, for example, a development process, an image quality improving process (a bandwidth enhancement process, a super-resolution process, a noise reduction (NR) process and/or an image stabilization process) and/or an enlargement process (electronic zooming process).
  • the image processing unit 5061 performs a detection process for an image signal in order to perform AE, AF and AWB.
  • the image processing unit 5061 includes a processor such as a CPU or a GPU, and when the processor operates in accordance with a predetermined program, the image processes and the detection process described above can be performed. It is to be noted that, where the image processing unit 5061 includes a plurality of GPUs, the image processing unit 5061 suitably divides information relating to an image signal such that image processes are performed in parallel by the plurality of GPUs.
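The division of an image signal so that image processes are performed in parallel can be sketched as follows. The strip-wise split, thread pool and placeholder smoothing filter are illustrative assumptions; the actual image processing unit 5061 divides work across GPUs and applies its own signal processes.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def denoise_strip(strip):
    # Placeholder "noise reduction": a 1-2-1 vertical smoothing kernel,
    # standing in for the real per-strip image process.
    padded = np.pad(strip, ((1, 1), (0, 0)), mode="edge")
    return (padded[:-2] + 2 * padded[1:-1] + padded[2:]) / 4.0

def process_parallel(image, n_workers=4):
    """Split the image into horizontal strips and process them in parallel."""
    strips = np.array_split(image, n_workers, axis=0)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        processed = list(pool.map(denoise_strip, strips))
    return np.vstack(processed)
```

Note that strip-independent filtering introduces small seams at strip boundaries; a production pipeline would overlap the strips or exchange halo rows.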
  • the control unit 5063 performs various kinds of control relating to image picking up of a surgical region by the endoscope 5001 and display of the picked up image. For example, the control unit 5063 generates a control signal for controlling driving of the camera head 5005. Thereupon, if image pickup conditions are inputted by the user, then the control unit 5063 generates a control signal on the basis of the input by the user.
  • the control unit 5063 suitably calculates an optimum exposure value, focal distance and white balance in response to a result of a detection process by the image processing unit 5061 and generates a control signal.
  • control unit 5063 controls the display apparatus 5041 to display an image of a surgical region on the basis of an image signal for which image processes have been performed by the image processing unit 5061. Thereupon, the control unit 5063 recognizes various objects in the surgical region image using various image recognition technologies. For example, the control unit 5063 can recognize a surgical tool such as forceps, a particular living body region, bleeding, mist when the energy treatment tool 5021 is used and so forth by detecting the shape, colour and so forth of edges of the objects included in the surgical region image.
  • the control unit 5063 causes, when it controls the display apparatus 5041 to display a surgical region image, various kinds of surgery supporting information to be displayed in an overlapping manner with an image of the surgical region using a result of the recognition. Where surgery supporting information is displayed in an overlapping manner and presented to the surgeon 5067, the surgeon 5067 can proceed with the surgery more safely and with greater certainty.
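The overlapping display of surgery supporting information can be sketched as a simple alpha-compositing step. The function, mask convention and `alpha` value below are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def overlay_support_info(frame, overlay, mask, alpha=0.6):
    """Alpha-blend an annotation layer over the surgical region image.

    `mask` is 1 where the overlay should be drawn and 0 elsewhere; `alpha`
    controls how strongly the annotation covers the underlying image.
    """
    mask3 = mask[..., None].astype(float)  # (H, W, 1) broadcast over RGB
    return frame * (1 - alpha * mask3) + overlay * (alpha * mask3)

frame = np.zeros((2, 2, 3))            # toy surgical image (black)
annotations = np.ones((2, 2, 3))       # white annotation layer
mask = np.array([[1, 0], [0, 0]])      # draw only in the top-left pixel
composited = overlay_support_info(frame, annotations, mask, alpha=0.5)
```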
  • the transmission cable 5065 which connects the camera head 5005 and the CCU 5039 to each other is an electric signal cable ready for communication of an electric signal, an optical fibre ready for optical communication or a composite cable ready for both of electrical and optical communication.
  • the communication between the camera head 5005 and the CCU 5039 may be performed otherwise by wireless communication.
  • Where the communication between the camera head 5005 and the CCU 5039 is performed by wireless communication, there is no necessity to lay the transmission cable 5065 in the surgery room. Therefore, a situation in which movement of medical staff in the surgery room is disturbed by the transmission cable 5065 can be eliminated.
  • the endoscopic surgery system 5000 to which the technology according to an embodiment of the present disclosure can be applied has been described above. It is to be noted here that, although the endoscopic surgery system 5000 has been described as an example, the system to which the technology according to an embodiment of the present disclosure can be applied is not limited to the example.
  • the technology according to an embodiment of the present disclosure may be applied to a soft endoscopic system for inspection or a microscopic surgery system. Indeed, the technology may be applied to a surgical microscope for conducting neurosurgery or the like.
  • the technology according to an embodiment of the present disclosure can be applied suitably to the CCU 5039 from among the components described hereinabove.
  • the technology according to an embodiment of the present disclosure is applied to an endoscopy system, surgical microscopy or medical imaging.
  • blood flow in veins, arteries and capillaries may be identified.
  • objects may be identified and the material of those objects may be established. This reduces the risk to the patient’s safety during operations.
  • an imaging system which can address a number of problems related to the optical performance of imaging devices (such as the medical imaging device described with reference to Figures 1 and 2 of the present disclosure).
  • an imaging system which enables an image (or images) with extended depth of field to be obtained is desired.
  • Certain imaging systems can be used in order to extend the depth of field of an imaging device. For example, a degree of extension of the depth of field of an image can be achieved through wave-front manipulation and subsequent inverse filtering.
  • a drawback with these systems for extending the depth of field is a reduced level of in-focus sharpness of the image which is obtained. That is, the sharpness level of the sharpest regions within the image is often reduced by the application of this process to extend the depth of field.
  • polarization parameters can influence the point spread function (PSF) of an optical system.
  • an imaging system for an imaging device is provided in accordance with embodiments of the disclosure.
  • the imaging system 1000 comprises image sensor circuitry 1002 and a polarization unit 1004.
  • the imaging system 1000 is illustrated as part of an imaging device (the imaging device including an imaging lens system 1006).
  • the imaging system 1000 is arranged along the optical axis 1010 of the medical imaging device.
  • an optional light source 1008 is illustrated as part of the imaging system 1000.
  • the polarization unit 1004 is arranged along the optical axis 1010 of the medical imaging device between the imaging lens system 1006 and the image sensor circuitry 1002. That is, the path of the light collected by the imaging lens system 1006 of the imaging device is intercepted by the polarization unit 1004 before that light reaches the image sensor circuitry 1002 of the imaging system 1000. Accordingly, the image sensor circuitry 1002 is configured to receive light from the polarization unit 1004.
  • the imaging lens system 1006 of the medical imaging device is thus the lens or lens system which is used in order to collect light from the scene.
  • the imaging lens system is not particularly limited, and will vary depending on the imaging device to which the imaging system 1000 is applied.
  • the imaging device to which the imaging system 1000 is applied is a medical imaging device.
  • the present disclosure may also be applied to other types of imaging devices (such as industrial imaging devices or the like).
  • the type of medical imaging device, in which the imaging system 1000 is arranged is not particularly limited. That is, the medical imaging device may be any medical imaging device, such as a microscopic medical imaging device, laparoscopic medical imaging device, endoscopic medical imaging device or the like.
  • an imaging system comprising image sensor circuitry which is configured to measure, at a first instance of time, a plurality of different polarization orientations of light in combination with a polarization unit which is configured to introduce a polarization dependent phase difference to in-coming wave-fronts of light from a scene is particularly advantageous as it enables an extended depth of field of the image to be obtained (compared to an optical system of the same optical specification) without requiring an increase in the size (e.g. form factor) of the imaging device.
  • a plurality of images of the scene can be obtained, each having a different in-focus object distance (corresponding to each of the different polarization states provided by the polarization unit 1004).
  • polarization information regarding the scene is not lost. Therefore, the ability to perform polarization image analysis is maintained. That is, polarization based image analysis may still be performed on the image data, as information regarding all the polarization states provided by the polarization unit 1004 is maintained. This is particularly advantageous, therefore, over a system whereby input light is modulated and the separate polarization states of light are sequentially read out by image sensor circuitry.
  • the polarization unit 1004 of the imaging system 1000 is configured to receive incident light of a first polarization and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident.
  • the polarization unit 1004 will now be described in more detail with reference to Figures 4A to 4C of the present disclosure.
  • FIG. 4A illustrates a polarization unit in accordance with embodiments of the disclosure.
  • the polarization unit 1004 is configured to receive light of a first polarization state (e.g. circular and/or elliptical polarization) and provide light of a plurality of different polarization states (e.g. a plurality of linear polarization states) depending on the region of the polarization unit upon which the incident light is incident.
  • Polarization unit 1004 is then configured to provide light of different polarization states 3002a, 3002b, 3002c and 3002d depending on the region of the polarization unit upon which the incident light is incident. That is, in this example, each of the polarization states of light 3002a, 3002b, 3002c and 3002d are different forms of linear polarized light which are provided (e.g. output or transmitted) by the polarization unit 1004.
  • the polarization of light which is provided by the polarization unit 1004 in response to receiving the incident light 3000 varies across the surface of the polarization unit 1004.
  • the polarization unit 1004 may be comprised of a number of regions or segments, with each region or segment being configured to provide light of a given polarization state. That is, a first region of the polarization unit 1004 may provide light of polarization state 3002a, while a second region of the polarization unit 1004 may provide light of polarization state 3002b.
  • the polarization unit 1004 is not particularly limited to providing the type and/or number of polarization states which are illustrated in this specific example. In fact, any number of polarization states can be provided by the polarization unit 1004, insofar as there are at least two polarization states (such that a phase shift can be introduced between the respective polarization states).
  • Figures 4B and 4C illustrate an example configuration of the polarization unit 1004 in accordance with embodiments of the disclosure.
  • the polarization unit 1004 is arranged in a number of distinct segments 4000, 4002, 4004 and 4006. That is, in this specific example, the polarization unit 1004 is arranged as a segmented polarization unit, whereby the polarization state of light which is provided by the polarization unit 1004 varies in accordance with both the segment in which the light is incident and the annular distance of that light from the centre of the polarization unit.
  • the polarization unit is configured to provide light of a vertical polarization state when that light is incident upon the region 4006a of segment 4006. In contrast, within segment 4006, a horizontal polarization state of light is provided when the incident light is incident upon the region 4006b of segment 4006.
  • the regions of the polarization unit which provide light of different polarization states are segmented regions of the polarization unit 1004.
  • An alternative example configuration of the polarization unit 1004 is illustrated by Figure 4C of the present disclosure.
  • the polarization state of light which is provided by the polarization unit 1004 varies only in accordance with the radial distance of the light from the centre of the polarization unit.
  • the polarization unit 1004 is segmented into a number of annular zones or regions 4008, 4010, 4012 and 4014.
  • the polarization state of light which is provided by the polarization unit 1004 depends only upon the annular zone or region 4008, 4010, 4012 and 4014 upon which that light was incident. For example, light which is received by the polarization unit 1004 anywhere within the annular zone or region 4014 will be provided, by the polarization unit 1004, with a horizontal polarization state.
  • the regions of the polarization unit which provide light of different polarization state are annular regions of the polarization unit 1004.
  • the polarization unit 1004 is not particularly limited to these specific configurations illustrated in Figure 4B and Figure 4C of the present disclosure. In fact, any such configuration of the polarization unit 1004 may be provided as required, insofar as the polarization unit 1004 is configured to provide light of different polarization state depending upon the region of the polarization unit upon which the incident light is incident. This introduces a phase shift between the different polarizations of light.
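The region-dependent behaviour of a polarization unit such as the annular configuration of Figure 4C can be sketched as a lookup from the incident ray position to the output polarization state. The zone radii and angles below are hypothetical, chosen only to illustrate the mapping.

```python
import numpy as np

# Hypothetical mapping from annular zone to output linear polarization angle
# (degrees). The radii and angles are illustrative, not taken from the patent;
# the zones loosely mirror regions 4008..4014 of Figure 4C.
ZONE_RADII = [0.25, 0.5, 0.75, 1.0]     # outer radius of each annular zone
ZONE_ANGLES = [0.0, 45.0, 90.0, 135.0]  # polarization provided per zone

def polarization_at(x, y):
    """Return the linear polarization angle provided for a ray hitting (x, y)
    on a unit-radius annular polarization unit, or None outside the pupil."""
    r = np.hypot(x, y)
    for outer, angle in zip(ZONE_RADII, ZONE_ANGLES):
        if r <= outer:
            return angle
    return None  # ray misses the pupil entirely

# A ray near the centre falls in the innermost zone; a ray near the rim falls
# in the outermost zone and receives a different polarization state.
```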
  • the polarization unit 1004 may comprise a quarter wave zone plate (or the like).
  • a quarter wave zone plate introduces a relative phase shift of 90 degrees between the orthogonal wave-fronts passing through the quarter wave zone plate, when that plate is illuminated by light.
  • the quarter wave zone plate will convert an incoming circularly polarized wave-front into a linearly polarized wave-front.
  • the materials used in the construction of the quarter wave plate may vary in accordance with the situation to which the embodiments of the disclosure are applied.
  • By introducing a number of zones, or regions, across the quarter wave plate, the plate will thus convert an incoming circularly polarized wave-front into a number of linearly polarized wave-fronts in accordance with the region on which the incoming wave-front is incident.
  • the quarter wave zone plate is a segmented quarter wave plate, where each segment has a different optical axis orientation.
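The conversion of a circularly polarized input into zone-dependent linearly polarized output by a segmented quarter wave plate can be illustrated with standard Jones calculus. The fast-axis angles and the sign convention for the circular input are illustrative choices for this sketch.

```python
import numpy as np

def qwp_jones(theta):
    """Jones matrix of an ideal quarter-wave plate with fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.array([[1, 0], [0, 1j]])  # pi/2 retardance between the axes
    return rot @ retarder @ rot.T

# Circularly polarized input (one common sign convention for the Jones vector).
circ = np.array([1.0, -1.0j]) / np.sqrt(2.0)

# Each segment of the zone plate has its own fast-axis orientation; the same
# circular input leaves each segment linearly polarized, at an angle that
# depends on the segment's axis orientation.
outs = [qwp_jones(theta) @ circ for theta in (0.0, np.pi / 4)]
```

For a fully polarized beam, linear polarization corresponds to a vanishing circular Stokes component S3, which can be checked directly on the output Jones vectors.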
  • the quarter wave zone plate is one example of a polarization unit 1004 which can be used in accordance with embodiments of the disclosure.
  • any polarization unit which is configured to convert incident light of a first polarization state into light of a plurality of polarization states may be used in accordance with embodiments of the disclosure.
  • through use of the polarization unit 1004, light from the scene (captured by the imaging lens system 1006 of the medical imaging device) is turned into polarized light of different state/orientation (such as different linear polarization states) depending on the light beam position in pupil space (which dictates upon which segment of the polarization unit the light from the scene is incident).
  • the light incident on the polarization unit comprises light of a predetermined first polarization state (such as circular or elliptically polarized light), such that the polarization unit 1004 may provide light of a plurality of polarization states (such as a plurality of linearly polarized light) from this incident light.
  • the reflected light from the scene (captured by the imaging lens system 1006 of the imaging device) will be deformed into light of a first polarization state (e.g. elliptical or circularly polarized light). That is, in certain situations, the reflected light from the scene may already be in a desired first polarization state.
  • the input polarization state of light is the polarization state of the light which is incident upon the polarization unit 1004.
  • the light in the scene may be provided by a light source located within the surgical environment.
  • the illumination source may be internal or external to the medical imaging device itself.
  • an optional light source 1008 is illustrated as being part of the imaging system 1000.
  • the optional light source 1008 may be configured to provide light of a desired first polarization state as input light of the scene.
  • the light in the scene may be provided by an ambient light source.
  • the ambient light source is of a desired first polarization state (such as a circular and/or elliptical pre-polarization).
  • an additional light source may also be provided in order to illuminate the scene with a desired first polarization of light (such as circular and/or elliptical pre-polarized light).
  • this additional light source may be either internal or external to the imaging system 1000.
  • instead of being provided as part of the imaging system 1000 (as optionally illustrated in Figure 3 of the present disclosure), the light source may be provided as an independent light source device.
  • the light source may, optionally, be provided as part of the imaging device itself.
  • the optional light source 1008 may be configured to provide light of a first pre-configured polarization state (e.g. light of a circular/elliptical polarization).
  • the imaging system 1000 further comprises image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state.
  • the image sensor circuitry may be any image sensor circuitry, such as a CCD sensor, a CMOS sensor or the like. In fact, any image capture device may be used as the image sensor circuitry in accordance with embodiments of the disclosure. However, image sensor circuitry according to embodiments of the disclosure is configured with a sensor sub-pixel structure such that a plurality of image sensing regions are provided (each of these regions being sensitive to light of a certain polarization state).
  • the image sensor circuitry may be a CMOS based sensor with a 2D pixel array (such as the Sony product named IMX250) with four different linear polarization filters aligned in a 2x2 cluster at the pixel level.
  • Figure 5 illustrates an example configuration of a region of image sensor circuitry 1002 in accordance with embodiments of the disclosure.
  • the image sensor circuitry 1002 is configured with a sub-pixel structure such that four sub-pixel regions 2000, 2002, 2004 and 2006 are provided.
  • Each of these regions of the image sensor circuitry 1002 is configured to be sensitive to light of a different polarization state. That is, in this example, the region 2000 is configured to be sensitive to light of a horizontal polarization state, while the region 2006 is configured to be sensitive to light of a vertical polarization state.
  • each of the regions 2002 and 2004 is configured to be sensitive to light of a different diagonal polarization orientation.
  • the image sensor circuitry of the present disclosure is not particularly limited to the above described sub-pixel structure.
  • the image sensor circuitry illustrated in Figure 5 of the present disclosure has four analyser orientations (corresponding to the four different polarization orientations/states of light).
  • the image sensor circuitry 1002 according to embodiments of the disclosure therefore filters the light from the polarization unit in accordance with its polarization direction.
  • the single image sensor circuitry can produce four output images (each image corresponding to a specific polarization state of light provided by the polarization unit 1004).
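Extracting the four output images from a single polarization-mosaic frame can be sketched as strided slicing of the 2x2 pixel cluster. The particular (90, 45 / 135, 0) degree layout used below is an assumption for illustration; the actual layout of a given sensor must be taken from its datasheet.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a polarization-mosaic frame into four per-orientation images.

    Assumes a repeating 2x2 cluster laid out as (90, 45 / 135, 0) degrees,
    similar to four-directional wire-grid sensors; each returned image has a
    quarter of the full pixel count.
    """
    return {
        90:  raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        0:   raw[1::2, 1::2],
    }

raw = np.arange(16).reshape(4, 4)        # toy 4x4 mosaic frame
images = split_polarization_mosaic(raw)  # four 2x2 quarter-resolution images
```

Each of the four images corresponds to one analyser orientation, i.e. to one of the polarization states provided by the polarization unit 1004.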
  • the image sensor circuitry of the present disclosure may comprise a two-dimensional solid-state image capture device, whereby a polarization member is disposed at a light incident side of the sub-pixel regions constituting each pixel area of the image sensor.
  • the polarization member thus shields, or restricts, the polarization state of light which can be incident upon each of the respective sub-pixel regions.
  • the image sensor circuitry of the present disclosure may be provided by a sensor such as the Sony IMX 250 sensor (either RGB or Monochrome version).
  • This sensor is a four-directional pixel-wise polarization CMOS image sensor using an air-gap wire grid on 2.5 μm back-illuminated pixels.
  • the present disclosure is not particularly limited in this regard, and other sensors with the above described configuration may be used as required.
  • the sub-pixel regions of the image sensor 1002 are co-aligned with the regions of the polarization unit 1004.
  • the image sensor circuitry is configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • the image sensor circuitry may, therefore, be configured to output, at a first instance of time, image data from sub-pixel region 2006 (being image data formed of vertically polarized light) as a first image.
  • the image sensor circuitry may be configured to output image data from sub-pixel regions 2000, 2002, and 2004 as independent image data having horizontal and diagonal polarization states respectively.
  • the image sensor circuitry of the present disclosure may be configured to produce, at the first instance of time, individual image data for each of the image polarization states of light received from the polarization unit respectively.
  • Figure 6A illustrates an example situation to which the imaging system 1000 may be applied.
  • the imaging lens system 1006 of the medical imaging device, the polarization unit 1004 and the image sensor circuitry 1002 are shown.
  • Input light (being reflected light from the surgical scene) of either elliptical and/or circular polarization is received by the imaging lens system 1006 of the medical imaging device.
  • Light from this imaging lens system 1006 of the medical imaging device is intercepted by the polarization unit 1004 of the imaging system 1000.
  • the polarization unit 1004 is arranged at the pupil position of the imaging lens system 1006 of the medical imaging device.
  • the polarization unit 1004 provides light of a different polarization state depending on the light beam position of the incident light in pupil space.
  • the change of polarization state of the incident light introduces a phase shift between the respective polarization states of the light from the scene.
  • This is illustrated with reference to Figure 6B of the present disclosure.
  • light of each of the respective polarization states provided by the polarization unit 1004 has a focal plane which is axially shifted along the optical axis of the medical imaging device. That is, in this example, light of the vertical polarization state has a focus plane axial shift of Δf along the optical axis of the medical imaging device compared to light of the horizontal polarization state.
  • light from an object with the same object distance will have a different focal plane along optical axis of the imaging device for each different polarization state of light provided by the polarization unit 1004.
  • the amount of focal shift Δf between the different polarization states is dependent on the focal length, magnification and aperture size of the optical system.
  • ‘in focus’ light for each of the different polarization states of light will correspond to light received from an object with a different object distance in the scene. That is, in focus light for the vertical polarization state of light may originate from an object in the scene with a first object distance X, while in focus light for a different polarization state of light (e.g. a horizontal polarization state of light) will originate from an object in the scene with a second object distance Y.
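The relationship between an axially shifted sensor-side focal plane and the corresponding in-focus object distance can be illustrated with the thin-lens equation. All numbers below (focal length, image distances, the Δf of 0.1 mm) are illustrative assumptions, not values from this disclosure.

```python
# Thin-lens sketch: for a given sensor-plane (image) distance, find which
# object distance is in focus, using 1/f = 1/d_o + 1/d_i.

def object_distance_in_focus(f_mm, image_distance_mm):
    """Invert the thin-lens equation to recover the in-focus object distance."""
    return 1.0 / (1.0 / f_mm - 1.0 / image_distance_mm)

f = 10.0              # focal length in mm (assumed)
d_i_horizontal = 10.5 # effective sensor plane for one polarization state
d_i_vertical = 10.6   # axially shifted by Δf = 0.1 mm for the other state

X = object_distance_in_focus(f, d_i_vertical)    # nearer in-focus object
Y = object_distance_in_focus(f, d_i_horizontal)  # farther in-focus object
```

A larger effective image distance brings a nearer object into focus, so the two polarization states image two different object distances X < Y sharply at the same instant.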
  • Figure 6C of the present disclosure illustrates an example focus shift in the image plane in accordance with embodiments of the present disclosure.
  • two paths of light 6000P1 and 6000P2 (corresponding to two distinct polarizations of the light) are shown.
  • the light 6000P1 originates at the same object position in the object plane as the light 6000P2 in this example (i.e. being the same distance from the lens 1006).
  • an axial shift along the optical image axis of the medical imaging device occurs between the light of the first polarization and the second polarization.
  • Figure 6D of the present disclosure illustrates an example focus shift in accordance with embodiments of the disclosure.
  • When the light from these different object distances reaches the segmented retarder zone plate, an axial shift along the optical image axis of the medical imaging device occurs between the light of the different polarization states.
  • ‘in focus’ light for each of the different polarization states of light will correspond to light received from an object with a different object distance in the scene.
  • in focus light for the vertical polarization state of light may originate from an object in the scene with a first object distance X, while in focus light for a different polarization state of light (e.g. a horizontal polarization state of light) will originate from an object in the scene with a second object distance Y.
  • Figure 6E of the present disclosure illustrates an example focus shift in accordance with embodiments of the disclosure.
  • the input light is either elliptically or circularly polarized in this example.
  • in this example, light originates from an object or from a plurality of objects at the same object distance from the lens.
  • an axial shift along the optical image axis of the medical imaging device occurs between the light of the first polarization and the second polarization. Therefore, light of different polarization states will have a different in-focus position at the sensor plane.
  • by independently imaging each of the polarization states of the scene with the image sensor circuitry 1002, in focus images can be obtained for a range of object distances within the image scene at the same instance of time. That is, for each polarization state provided by the polarization unit 1004, the image sensor circuitry 1002 is configured to produce an individual image of the scene at that instance of time. Furthermore, because the focal position of the images corresponding to different polarization states will be different (owing to the shift of focal position), different regions of each of these respective images (corresponding to the different polarization states, with different in-focus object distances) will be in focus.
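The independent per-polarization imaging described above can be sketched in code. This assumes, purely for illustration, a sensor whose image sensing regions form a repeating 2×2 subpixel pattern of 0°, 45°, 90° and 135° polarization filters; the disclosure does not fix the number, angles or layout of the regions:

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw frame from an (assumed) polarization mosaic sensor into
    four sub-images, one per polarization state. Illustrative 2x2 pattern:
        [0 deg,  45 deg]
        [90 deg, 135 deg]
    Each sub-image has half the resolution of the raw frame, and all four
    correspond to the same instance of time.
    """
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        90:  raw[1::2, 0::2],
        135: raw[1::2, 1::2],
    }

# Toy raw frame in which each subpixel value encodes its polarizer angle
raw = np.array([[0, 45, 0, 45],
                [90, 135, 90, 135],
                [0, 45, 0, 45],
                [90, 135, 90, 135]])
subs = split_polarization_mosaic(raw)
```

Each returned sub-image is then an independently focused view of the scene, since each polarization state has a different focal position.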
  • the subsequent array of in-focus images (each with a different in-focus distance), corresponding to each of the polarization states of light provided by the polarization unit 1004 and independently detected by the image sensor circuitry 1002, can be produced without any increase in the size (e.g. form factor) of the medical imaging device.
  • a plurality of different focus positions (planes) can be generated using the imaging system 1000 of the present disclosure. Therefore, the imaging system 1000 of the present disclosure provides an increase in the depth of field which can be obtained by the imaging device.
  • the processing applied to the images output by the imaging system 1000 in accordance with embodiments of the disclosure is not particularly limited, and will vary depending on the situation to which the embodiments of the disclosure are applied.
  • the image data of each of the polarization states may be analysed individually.
  • the image data may be stored in a storage unit.
  • any type of image processing may be applied to the image data as required.
  • the image sensor circuitry (or indeed, in certain examples, processor circuitry which receives the image data from the image sensor circuitry (either directly, or from an intermediate storage)) is configured to perform a series of processing steps in order to produce an image, for the first instance of time, having an extended depth of field.
  • processing is herein described with reference to a single instance of time. However, it will be appreciated that these steps may, alternatively, be applied, respectively and in sequence, to images obtained at subsequent instances of time. In this case, a sequence of images, at respective instances of time, having an extended depth of field is obtained.
  • each individual image data obtained at the first instance of time, corresponding to each individual polarization state provided by the polarization unit 1004, is partitioned into regions.
  • the size of these regions is not particularly limited, and will vary in accordance with processing requirements of the system. However, it will be appreciated that, for a given instance of time, the size of these regions will be the same across the images of the different polarization states.
  • the sharpness level of each region for each image is determined. That is, the sharpest image of the plurality of image data (corresponding to the different polarization states, each having a different focal position) obtained at the first instance of time is determined for each region of the image data.
  • any appropriate method known in the art may be used to determine the sharpness level of each region in each of the images (e.g. a modulation transfer function may be applied to the image data in order to determine the sharpness level of each region).
  • a sharpness transfer function is then applied to the image data in order to transfer image detail from the image of each region having the highest sharpness level to the corresponding regions of the images of other polarization states (having a lower sharpness level for that region).
  • the sharpness transport function enables the production of a reconstructed image having extended depth of field. It shall be understood that transporting the sharpness level from the polarization image (or image data) having the highest sharpness level to the complementary polarization images (or image data) having a lower sharpness level for each region of the image data may be carried out using any suitable sharpness transport technique such as those known in the art.
  • the sharpness gradient (with respect to image texture) will be analysed for each of the sub-pixel types of the image sensor circuitry 1002, with the high frequency information being transferred to subpixel types with lower sharpness gradient. This enables the re-sharpening of objects for different object distances in a single image.
  • resultant image data having an extended depth of field can be produced in accordance with embodiments of the disclosure.
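The partitioning, per-region sharpness ranking and sharpness transfer steps described above can be sketched as a simple tile-based fusion. The sharpness metric (mean squared difference of neighbouring pixels) and the tile size are illustrative choices; the disclosure explicitly leaves the exact sharpness measure and transport technique open:

```python
import numpy as np

def fuse_extended_dof(images, tile=16):
    """Fuse per-polarization images (equal shape, different focal positions)
    into a single extended-depth-of-field image. For each tile, the image
    whose tile has the highest sharpness score supplies the output pixels.
    """
    h, w = images[0].shape
    out = np.empty_like(images[0])
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            best_region, best_score = None, -1.0
            for img in images:
                region = img[y:y + tile, x:x + tile].astype(float)
                # Illustrative sharpness metric: energy of pixel differences
                score = (np.diff(region, axis=0) ** 2).mean() \
                      + (np.diff(region, axis=1) ** 2).mean()
                if score > best_score:
                    best_region, best_score = region, score
            out[y:y + tile, x:x + tile] = best_region
    return out

# Synthetic check: each input image is sharp in one half, flat in the other
cb = (np.indices((32, 32)).sum(axis=0) % 2) * 255.0          # checkerboard texture
img1 = np.full((32, 32), 128.0); img1[:, :16] = cb[:, :16]   # sharp on the left
img2 = np.full((32, 32), 128.0); img2[:, 16:] = cb[:, 16:]   # sharp on the right
fused = fuse_extended_dof([img1, img2], tile=16)
```

A full implementation along the lines of the description would transfer only the high frequency detail into the other polarization images rather than copying whole tiles, but the region-wise selection of the sharpest source is the same.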
  • referring to Figure 7, an apparatus 7000 according to embodiments of the disclosure is shown.
  • the apparatus 7000 is a computer device such as a personal computer or a terminal connected to a server. Indeed, in embodiments, the apparatus may also be a server.
  • the apparatus 7000 is controlled using a microprocessor or other processing circuitry 7002.
  • the apparatus 7000 may be a portable computing device such as a mobile phone, laptop computer or tablet computing device.
  • the processing circuitry 7002 may be a microprocessor carrying out computer instructions or may be an Application Specific Integrated Circuit.
  • the computer instructions are stored on storage medium 7004 which may be a magnetically readable medium, optically readable medium or solid state type circuitry.
  • the storage medium 7004 may be integrated into the apparatus 7000 or may be separate to the apparatus 7000 and connected thereto using either a wired or wireless connection.
  • the computer instructions may be embodied as computer software that contains computer readable code which, when loaded onto the processor circuitry 7002, configures the processor circuitry 7002 to perform a method according to embodiments of the disclosure (such as a method of imaging for a medical imaging device as illustrated with reference to Figure 8 of the present disclosure).
  • an optional user input device 7006 is shown connected to the processing circuitry 7002.
  • the user input device 7006 may be a touch screen or may be a mouse or stylus type input device.
  • the user input device 7006 may also be a keyboard or any combination of these devices.
  • a network connection 7008 may optionally be coupled to the processor circuitry 7002.
  • the network connection 7008 may be a connection to a Local Area Network or a Wide Area Network such as the Internet or a Virtual Private Network or the like.
  • the network connection 7008 may be connected to a server allowing the processor circuitry 7002 to communicate with another apparatus in order to obtain or provide relevant data.
  • the network connection 7008 may be behind a firewall or some other form of network security.
  • also shown coupled to the processing circuitry 7002 is a display device 7010.
  • the display device 7010, although shown integrated into the apparatus 7000, may alternatively be separate to the apparatus 7000 and may be a monitor or some kind of device allowing the user to visualise the operation of the system.
  • the display device 7010 may be a printer, projector or some other device allowing relevant information generated by the apparatus 7000 to be viewed by the user or by a third party.
  • Figure 8 illustrates a method of imaging for an imaging device in accordance with embodiments of the disclosure.
  • the imaging device may comprise the imaging system described with reference to Figure 3 of the present disclosure.
  • the method begins at step S8000 and proceeds to step S8002.
  • the method comprises receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident.
  • the method then proceeds to step S8004.
  • in step S8004, the method comprises outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • the method then proceeds to, and ends with, step S8006.
  • the method of imaging for an imaging device in accordance with embodiments of the disclosure is not particularly limited to the method illustrated in Figure 8 of the present disclosure.
  • the method according to embodiments of the present disclosure may, instead of proceeding to method step S8006, comprise returning to method step S8000 or S8002.
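Under the assumption that the receive and output steps can be treated as callable stages, the flow of Figure 8, including the optional return to an earlier step instead of ending at S8006, might be sketched as follows. `capture_frame` is a hypothetical stand-in for the sensor readout through the polarization unit, not an element named in the disclosure:

```python
def run_imaging_method(capture_frame, frames=1):
    """Sketch of the Figure 8 method flow. Setting frames > 1 models
    returning to step S8002 for subsequent instances of time rather
    than ending immediately at S8006."""
    outputs = []                         # S8000: start
    for t in range(frames):
        image_data = capture_frame(t)    # S8002: receive light via the
                                         # polarization unit at time t
        outputs.append(image_data)       # S8004: output per-region image data
    return outputs                       # S8006: end

frames_out = run_imaging_method(lambda t: {"time": t, "regions": 4}, frames=2)
```

Here each element of the result corresponds to the image data output at one instance of time, matching the sequence-of-images case described above.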
  • Imaging system for an imaging device comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • each of the plurality of image sensing regions is configured to be sensitive to a different polarization state of light.
  • the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are annular regions of the polarization unit.
  • the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are segmented regions of the polarization unit.
  • processing circuitry configured to analyse a portion of the image data corresponding to each of the respective image sensing regions to identify an image sensing region with a highest sharpness gradient for that portion of the image data.
  • processing circuitry is further configured to transfer, for the portion of the image data, high frequency information from the image sensing region with the highest sharpness gradient to image data from a different image sensing region.
  • imaging device is a laparoscopic imaging device or an endoscopic imaging device.
  • imaging device is a medical imaging device or an industrial imaging device.
  • the imaging system according to any preceding Clause further comprising a light source configured to emit light of a predetermined polarization.
  • Method of imaging for an imaging device comprising: receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out a method of imaging for an imaging device, the method comprising: controlling image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state to receive light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and controlling the image sensor circuitry to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
  • embodiments of the disclosure have been described in relation to an imaging system for a medical imaging device, it will be appreciated that the claimed invention is not limited to medical imaging (or medical imaging devices), and could, instead, be used in any imaging situation.
  • the imaging system according to embodiments of the disclosure could be employed to effect in an industrial imaging device such as an industrial endoscopic device.
  • embodiments of the disclosure could be used in architectural endoscopy, whereby a scale version of a new building or complex can be correctly viewed from the perspective of a person walking through the architectural creation, improving the visualisation, design and construction of proposed buildings.
  • Embodiments of the disclosure could be used for internal visualisation of works of engineering.
  • an imaging device according to embodiments of the disclosure could be used to view the interior of underground pipe systems, such as water pipes, in order to locate leaks or generally survey the structure.
  • An imaging device according to embodiments of the disclosure could also be used for quality control and internal inspection of other mechanical systems such as turbines and engine components.
  • embodiments of the disclosure could be used in the security and surveillance industry.
  • an imaging device according to embodiments of the disclosure could be used to conduct surveillance in an area where the presence of a person is restricted, such as in an enclosed area or a very tight space.
  • an imaging system may be applied to the imaging device in order to capture high resolution images with an extended depth of field. It will be appreciated that the above are merely examples of possible industrial applications of an imaging system according to embodiments of the disclosure, and many further applications of the imaging device are possible, as would be apparent to the skilled person when reading the disclosure.
  • Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors.
  • the elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors.

Abstract

An imaging system for an imaging device is provided by embodiments of the disclosure, the imaging system comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.

Description

AN IMAGING SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR AN IMAGING DEVICE
BACKGROUND
Field of the Disclosure
The present invention relates to an imaging system, method and computer program product for an imaging device.
Description of the Related Art
The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in the background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention.
Imaging devices are used in a wide variety of situations in order to obtain images of an environment or scene. The optical performance of the imaging device is a key factor in the quality of images and/or the amount of information regarding the environment or scene which can be obtained.
However, imaging devices are limited in optical performance in many ways. A particular problem is the limited depth of field at which high spatial frequencies can be obtained in order to capture a sufficiently sharp image. Specifically, there is often a trade-off between the resolution of the image which can be obtained and the corresponding depth of field of that image. That is, higher aperture numbers (smaller aperture size) provide a wider depth of field, yet reduce the amount of light which can be collected at sensor level (leading to noisy, low contrast and low resolution images). In contrast, a lower aperture number (wider aperture size) enables an increased amount of light to be collected at the sensor level, yet reduces the depth of field which can be obtained.
These problems are often exacerbated in endoscopic and/or laparoscopic imaging devices. These imaging devices are limited in size (e.g. limited to small aperture diameters and/or form factors). Hence, in order to maintain a sufficiently high resolution image, the pixel sensor size is generally reduced. This, in turn, demands a low aperture number (or increase in physical aperture size) and a high effort on optical design to transfer high spatial frequencies.
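This trade-off can be made concrete with the standard approximate depth-of-field formula DOF ≈ 2u²Nc/f², valid when the object distance u is much greater than the focal length f. The numbers below are illustrative endoscope-scale values, not figures from the disclosure:

```python
def depth_of_field(f, N, c, u):
    """Approximate total depth of field, 2*u^2*N*c / f^2, where f is the
    focal length, N the aperture number, c the circle of confusion and
    u the object distance (all lengths in mm, valid for u >> f)."""
    return 2 * u**2 * N * c / f**2

f, c, u = 4.0, 0.003, 50.0                        # illustrative values
dof_wide = depth_of_field(f, N=2.0, c=c, u=u)     # wide aperture, more light
dof_narrow = depth_of_field(f, N=8.0, c=c, u=u)   # narrow aperture, less light
# dof_narrow > dof_wide: stopping down widens the depth of field, at the
# cost of the light collected at the sensor
```

This illustrates the constraint that the disclosed imaging system addresses: extending the depth of field without having to stop the aperture down.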
It is an aim of the present disclosure to address these issues.
SUMMARY
In a first aspect of the present disclosure, an imaging system for an imaging device is provided, the imaging system comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
In a second aspect of the present disclosure, a method of imaging for an imaging device is provided, the method comprising: receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
In a third aspect of the present disclosure, a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to perform a method of imaging for an imaging device is provided, the method comprising: controlling image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state to receive light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and controlling the image sensor circuitry to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
According to these aspects of the present disclosure, the depth of field which an imaging device can achieve, for a given aperture number, is extended, which enables a sharper in-focus image to be obtained for an extended range of object distances within the scene. Furthermore, the increase in the depth of field can be achieved without increasing the form factor of the imaging device. Of course, the present disclosure is not particularly limited to these advantageous technical effects; there may be others, as will be apparent to the skilled person when reading the disclosure.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 is a view depicting an example of a schematic configuration of an endoscopic surgery system to which the technology according to an embodiment of the present disclosure can be applied;
Figure 2 is a block diagram depicting an example of a functional configuration of the camera head and the CCU depicted in Figure 1;
Figure 3 illustrates an imaging system for an imaging device in accordance with embodiments of the disclosure;
Figure 4A illustrates a polarization unit in accordance with embodiments of the disclosure;
Figure 4B illustrates an example configuration of the polarization unit in accordance with embodiments of the disclosure;
Figure 4C illustrates an example configuration of the polarization unit in accordance with embodiments of the disclosure;
Figure 5 illustrates image sensor circuitry in accordance with embodiments of the disclosure;
Figure 6A illustrates an example focal plane shift in accordance with embodiments of the disclosure;
Figure 6B illustrates an example focal plane shift in accordance with embodiments of the disclosure;
Figure 6C illustrates an example of a focus shift in the image plane in accordance with embodiments of the disclosure;
Figure 6D illustrates an example of a focus shift in accordance with embodiments of the disclosure;
Figure 6E illustrates an example of a focus shift in accordance with embodiments of the disclosure;
Figure 7 illustrates a device in accordance with embodiments of the disclosure; and
Figure 8 illustrates a method of imaging for an imaging device in accordance with embodiments of the disclosure.
DESCRIPTION OF THE EMBODIMENTS
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
«Application»
The technology according to an embodiment of the present disclosure can be applied to various products. For example, the technology according to an embodiment of the present disclosure may be applied to an endoscopic surgery system, surgical microscopy or medical imaging device, or other kinds of industrial endoscopy in, say, pipe or tube laying or fault finding.
Figure 1 is a view depicting an example of a schematic configuration of an endoscopic surgery system 5000 to which the technology according to an embodiment of the present disclosure can be applied. In Figure 1, a state is illustrated in which a surgeon (medical doctor) 5067 is using the endoscopic surgery system 5000 to perform surgery for a patient 5071 on a patient bed 5069. As depicted, the endoscopic surgery system 5000 includes an endoscope 5001, other surgical tools 5017, a supporting arm apparatus 5027 which supports the endoscope 5001 thereon, and a cart 5037 on which various apparatus for endoscopic surgery are mounted.
In endoscopic surgery, in place of incision of the abdominal wall to perform laparotomy, a plurality of tubular aperture devices called trocars 5025a to 5025d are used to puncture the abdominal wall. Then, a lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into body lumens of the patient 5071 through the trocars 5025a to 5025d. In the example depicted, as the other surgical tools 5017, a pneumoperitoneum tube 5019, an energy treatment tool 5021 and forceps 5023 are inserted into body lumens of the patient 5071. Further, the energy treatment tool 5021 is a treatment tool for performing incision and peeling of a tissue, sealing of a blood vessel or the like by high frequency current or ultrasonic vibration. However, the surgical tools 5017 depicted are mere examples, and various surgical tools which are generally used in endoscopic surgery, such as, for example, a pair of tweezers or a retractor, may be used as the surgical tools 5017.
An image of a surgical region in a body lumen of the patient 5071 imaged by the endoscope 5001 is displayed on a display apparatus 5041. The surgeon 5067 would use the energy treatment tool 5021 or the forceps 5023 while watching the image of the surgical region displayed on the display apparatus 5041 in real time to perform such treatment as, for example, resection of an affected area. It is to be noted that, though not depicted, the pneumoperitoneum tube 5019, the energy treatment tool 5021 and the forceps 5023 are supported by the surgeon 5067, an assistant or the like during surgery.
(Supporting Arm Apparatus)
The supporting arm apparatus 5027 includes an arm unit 5031 extending from a base unit 5029. In the example depicted, the arm unit 5031 includes joint portions 5033a, 5033b and 5033c and links 5035a and 5035b and is driven under the control of an arm controlling apparatus 5045. The endoscope 5001 is supported by the arm unit 5031 such that the position and the posture of the endoscope 5001 are controlled. Consequently, stable fixation in position of the endoscope 5001 can be implemented.
(Endoscope)
The endoscope 5001 includes the lens barrel 5003 which has a region of a predetermined length from a distal end thereof to be inserted into a body lumen of the patient 5071, and a camera head 5005 connected to a proximal end of the lens barrel 5003. In the example depicted, the endoscope 5001 is configured as a hard mirror having the lens barrel 5003 of the hard type. However, the endoscope 5001 may otherwise be configured as a soft mirror having the lens barrel 5003 of the soft type.
The lens barrel 5003 has, at a distal end thereof, an opening in which an objective lens is fitted. A light source apparatus 5043 is connected to the endoscope 5001 such that light generated by the light source apparatus 5043 is introduced to a distal end of the lens barrel by a light guide extending in the inside of the lens barrel 5003 and is irradiated toward an observation target in a body lumen of the patient 5071 through the objective lens. It is to be noted that the endoscope 5001 may be a direct view mirror or may be a perspective view mirror or a side view mirror.
An optical system and an image pickup element are provided in the inside of the camera head 5005 such that reflected light (observation light) from an observation target is condensed on the image pickup element by the optical system. The observation light is photo-electrically converted by the image pickup element to generate an electric signal corresponding to the observation light, namely, an image signal corresponding to an observation image. The image signal is transmitted as RAW data to a CCU 5039. It is to be noted that the camera head 5005 has a function incorporated therein for suitably driving the optical system of the camera head 5005 to adjust the magnification and the focal distance.
It is to be noted that, in order to establish compatibility with, for example, a stereoscopic vision (three dimensional (3D) display), a plurality of image pickup elements may be provided on the camera head 5005. In this case, a plurality of relay optical systems are provided in the inside of the lens barrel 5003 in order to guide observation light to each of the plurality of image pickup elements.
(Various Apparatus Incorporated in Cart)
The CCU 5039 includes a central processing unit (CPU), a graphics processing unit (GPU) or the like and integrally controls operation of the endoscope 5001 and the display apparatus 5041. In particular, the CCU 5039 performs, for an image signal received from the camera head 5005, various image processes for displaying an image based on the image signal such as, for example, a development process (demosaic process). The CCU 5039 provides the image signal for which the image processes have been performed to the display apparatus 5041. Further, the CCU 5039 transmits a control signal to the camera head 5005 to control driving of the camera head 5005. The control signal may include information relating to an image pickup condition such as a magnification or a focal distance.
The display apparatus 5041 displays an image based on an image signal for which the image processes have been performed by the CCU 5039 under the control of the CCU 5039. If the endoscope 5001 is ready for imaging of a high resolution such as 4K (horizontal pixel number 3840 × vertical pixel number 2160), 8K (horizontal pixel number 7680 × vertical pixel number 4320) or the like and/or ready for 3D display, then a display apparatus by which corresponding display of the high resolution and/or 3D display are possible may be used as the display apparatus 5041. Where the apparatus is ready for imaging of a high resolution such as 4K or 8K, if the display apparatus used as the display apparatus 5041 has a size of not less than 55 inches, then a more immersive experience can be obtained. Further, a plurality of display apparatus 5041 having different resolutions and/or different sizes may be provided in accordance with purposes.
The light source apparatus 5043 includes a light source such as, for example, a light emitting diode (LED) and supplies irradiation light for imaging of a surgical region to the endoscope 5001.
The arm controlling apparatus 5045 includes a processor such as, for example, a CPU and operates in accordance with a predetermined program to control driving of the arm unit 5031 of the supporting arm apparatus 5027 in accordance with a predetermined controlling method.
An inputting apparatus 5047 is an input interface for the endoscopic surgery system 5000. A user can perform inputting of various kinds of information or instruction inputting to the endoscopic surgery system 5000 through the inputting apparatus 5047. For example, the user would input various kinds of information relating to surgery such as physical information of a patient, information regarding a surgical procedure of the surgery and so forth through the inputting apparatus 5047. Further, the user would input, for example, an instruction to drive the arm unit 5031, an instruction to change an image pickup condition (type of irradiation light, magnification, focal distance or the like) by the endoscope 5001, an instruction to drive the energy treatment tool 5021 or the like through the inputting apparatus 5047.
The type of the inputting apparatus 5047 is not limited and may be that of any one of various known inputting apparatus. As the inputting apparatus 5047, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057 and/or a lever or the like may be applied. Where a touch panel is used as the inputting apparatus 5047, it may be provided on the display face of the display apparatus 5041.
Otherwise, the inputting apparatus 5047 is a device to be mounted on a user such as, for example, a glasses type wearable device or a head mounted display (HMD), and various kinds of inputting are performed in response to a gesture or a line of sight of the user detected by any of the devices mentioned. Further, the inputting apparatus 5047 includes a camera which can detect a motion of a user, and various kinds of inputting are performed in response to a gesture or a line of sight of a user detected from a video imaged by the camera. Further, the inputting apparatus 5047 includes a microphone which can collect the voice of a user, and various kinds of inputting are performed by voice collected by the microphone. By configuring the inputting apparatus 5047 such that various kinds of information can be inputted in a contactless fashion in this manner, especially a user who belongs to a clean area (for example, the surgeon 5067) can operate an apparatus belonging to an unclean area in a contactless fashion. Further, since the user can operate an apparatus without releasing a possessed surgical tool from its hand, the convenience to the user is improved.
A treatment tool controlling apparatus 5049 controls driving of the energy treatment tool 5021 for cautery or incision of a tissue, sealing of a blood vessel or the like. A pneumoperitoneum apparatus 5051 feeds gas into a body lumen of the patient 5071 through the pneumoperitoneum tube 5019 to inflate the body lumen in order to secure the field of view of the endoscope 5001 and secure the working space for the surgeon. A recorder 5053 is an apparatus capable of recording various kinds of information relating to surgery. A printer 5055 is an apparatus capable of printing various kinds of information relating to surgery in various forms such as a text, an image or a graph.
In the following, especially a characteristic configuration of the endoscopic surgery system 5000 is described in more detail.
(Supporting Arm Apparatus)
The supporting arm apparatus 5027 includes the base unit 5029 serving as a base, and the arm unit 5031 extending from the base unit 5029. In the example depicted, the arm unit 5031 includes the plurality of joint portions 5033a, 5033b and 5033c and the plurality of links 5035a and 5035b connected to each other by the joint portion 5033b. In Figure 1, for simplified illustration, the configuration of the arm unit 5031 is depicted in a simplified form. Actually, the shape, number and arrangement of the joint portions 5033a to 5033c and the links 5035a and 5035b and the direction and so forth of axes of rotation of the joint portions 5033a to 5033c can be set suitably such that the arm unit 5031 has a desired degree of freedom. For example, the arm unit 5031 may preferably be configured such that it has at least 6 degrees of freedom. This makes it possible to move the endoscope 5001 freely within the movable range of the arm unit 5031. Consequently, it becomes possible to insert the lens barrel 5003 of the endoscope 5001 from a desired direction into a body lumen of the patient 5071.
An actuator is provided in each of the joint portions 5033a to 5033c, and the joint portions 5033a to 5033c are configured such that they are rotatable around predetermined axes of rotation thereof by driving of the respective actuators. The driving of the actuators is controlled by the arm controlling apparatus 5045 to control the rotational angle of each of the joint portions 5033a to 5033c thereby to control driving of the arm unit 5031. Consequently, control of the position and the posture of the endoscope 5001 can be implemented. Thereupon, the arm controlling apparatus 5045 can control driving of the arm unit 5031 by various known controlling methods such as force control or position control.
For example, if the surgeon 5067 suitably performs operation inputting through the inputting apparatus 5047 (including the foot switch 5057), then driving of the arm unit 5031 may be controlled suitably by the arm controlling apparatus 5045 in response to the operation input to control the position and the posture of the endoscope 5001. After the endoscope 5001 at the distal end of the arm unit 5031 is moved from an arbitrary position to a different arbitrary position by the control just described, the endoscope 5001 can be supported fixedly at the position after the movement. It is to be noted that the arm unit 5031 may be operated in a master-slave fashion. In this case, the arm unit 5031 may be remotely controlled by the user through the inputting apparatus 5047 which is placed at a place remote from the surgery room.
Further, where force control is applied, the arm controlling apparatus 5045 may perform power-assisted control to drive the actuators of the joint portions 5033a to 5033c such that the arm unit 5031 may receive external force by the user and move smoothly following the external force. This makes it possible to move the arm unit 5031 with comparatively weak force when the user directly touches and moves it. Accordingly, it becomes possible for the user to move the endoscope 5001 more intuitively by a simpler and easier operation, and the convenience to the user can be improved.
Here, generally in endoscopic surgery, the endoscope 5001 is supported by a medical doctor called a scopist. In contrast, where the supporting arm apparatus 5027 is used, the position of the endoscope 5001 can be fixed more certainly without hands, and therefore, an image of a surgical region can be obtained stably and surgery can be performed smoothly.
It is to be noted that the arm controlling apparatus 5045 may not necessarily be provided on the cart 5037. Further, the arm controlling apparatus 5045 may not necessarily be a single apparatus. For example, the arm controlling apparatus 5045 may be provided in each of the joint portions 5033a to 5033c of the arm unit 5031 of the supporting arm apparatus 5027 such that the plurality of arm controlling apparatus 5045 cooperate with each other to implement driving control of the arm unit 5031.
(Light Source Apparatus)
The light source apparatus 5043 supplies irradiation light upon imaging of a surgical region to the endoscope 5001. The light source apparatus 5043 includes a white light source which includes, for example, an LED, a laser light source or a combination of them. In this case, where a white light source includes a combination of red, green, and blue (RGB) laser light sources, since the output intensity and the output timing can be controlled with a high degree of accuracy for each colour (each wavelength), adjustment of the white balance of a picked up image can be performed by the light source apparatus 5043. Further, in this case, if laser beams from the respective RGB laser light sources are irradiated time-divisionally on an observation target and driving of the image pickup elements of the camera head 5005 is controlled in synchronism with the irradiation timings, then images individually corresponding to the R, G and B colours can be picked up time-divisionally. According to the method just described, a colour image can be obtained even if a colour filter is not provided for the image pickup element.
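The time-divisional colour pickup described above can be sketched as follows. This is a minimal illustrative model only, not the disclosed implementation; the function name and the tiny example frames are assumptions introduced for illustration:

```python
import numpy as np

def synthesize_colour(frame_r, frame_g, frame_b):
    """Stack three monochrome frames, each captured while only the R, G or B
    laser source illuminated the scene, into one colour image.
    No colour filter array (and hence no demosaic process) is needed."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)

# Hypothetical 2x2 sensor read-outs for the three illumination time slots.
r = np.array([[10, 20], [30, 40]], dtype=np.uint8)
g = np.array([[50, 60], [70, 80]], dtype=np.uint8)
b = np.array([[90, 100], [110, 120]], dtype=np.uint8)

rgb = synthesize_colour(r, g, b)
assert rgb.shape == (2, 2, 3)
assert rgb[0, 0].tolist() == [10, 50, 90]
```

Because each channel is captured by the full sensor resolution in its own time slot, no spatial resolution is sacrificed to a Bayer mosaic, at the cost of temporal resolution.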
Further, driving of the light source apparatus 5043 may be controlled such that the intensity of light to be outputted is changed for each predetermined time. By controlling driving of the image pickup element of the camera head 5005 in synchronism with the timing of the change of the intensity of light to acquire images time-divisionally and synthesizing the images, an image of a high dynamic range free from underexposed blocked up shadows and overexposed highlights can be created.
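A minimal sketch of synthesizing such intensity-alternating frames into a high dynamic range image. The two illumination levels, the assumed gain ratio of 4 between them, the saturation threshold and the function name are all illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def fuse_hdr(frame_low, frame_high, saturation=250):
    """Merge a frame captured under reduced illumination intensity with one
    captured under full intensity: where the bright frame is saturated
    (overexposed highlights), fall back on the scaled dim frame; elsewhere
    keep the bright frame, which preserves shadow detail."""
    low = frame_low.astype(np.float64)
    high = frame_high.astype(np.float64)
    gain = 4.0  # assumed known ratio between the two illumination intensities
    return np.where(high >= saturation, low * gain, high)

high = np.array([[255, 100]], dtype=np.uint8)  # first pixel blown out
low = np.array([[80, 25]], dtype=np.uint8)
out = fuse_hdr(low, high)
assert out[0, 0] == 320.0  # highlight detail recovered from the dim frame
assert out[0, 1] == 100.0  # well-exposed pixel kept from the bright frame
```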
Further, the light source apparatus 5043 may be configured to supply light of a predetermined wavelength band ready for special light observation. This may include, but is not limited to, laser light such as that provided by a vertical cavity surface-emitting laser, or any other kind of laser light. Alternatively or additionally, the light may be InfraRed (IR) light. In special light observation, for example, by utilizing the wavelength dependency of absorption of light in a body tissue to irradiate light of a narrower band in comparison with irradiation light upon ordinary observation (namely, white light), narrow band light observation (narrow band imaging) of imaging a predetermined tissue such as a blood vessel of a superficial portion of the mucous membrane or the like in a high contrast is performed. Alternatively, in special light observation, fluorescent observation for obtaining an image from fluorescent light generated by irradiation of excitation light may be performed. In fluorescent observation, it is possible to perform observation of fluorescent light from a body tissue by irradiating excitation light on the body tissue (autofluorescence observation) or to obtain a fluorescent light image by locally injecting a reagent such as indocyanine green (ICG) into a body tissue and irradiating excitation light corresponding to a fluorescent light wavelength of the reagent upon the body tissue. The light source apparatus 5043 can be configured to supply such narrow-band light and/or excitation light suitable for special light observation as described above. The light source may also apply a heat pattern to an area. This heat pattern will be explained later with reference to Figures 3A-C.
In this respect, the light source apparatus 5043 may also act as a visible light source illuminating the area. The light source apparatus 5043 is, in embodiments, one or more Vertical Cavity Surface-Emitting Lasers (VCSELs) which can produce light in the visible part of the electromagnetic spectrum, and some produce light in the Infra-Red part of the electromagnetic spectrum. The one or more VCSELs may be single wavelength narrowband VCSELs, where each VCSEL varies in emission spectral frequency. Alternatively, or additionally, one or more of the VCSELs may be a Micro Electro Mechanical system (MEMs) type VCSEL whose wavelength emission may be altered over a specific range. In embodiments of the disclosure, the wavelength may alter over the range 550nm to 650nm or 600nm to 650nm. The shape of the VCSEL may vary such as a square or circular shape and may be positioned at one or varying positions in the endoscope 5001.
The light source apparatus 5043 may illuminate one or more areas. This may be achieved by selectively switching the VCSELs on or by performing a raster scan of the area using a Micro Electro Mechanical system (MEMs). The purpose of the light source apparatus 5043 is to perform Spatial Light Modulation (SLM) on the light over the area. This will be explained in more detail later.
It should be noted that although the foregoing describes the light source apparatus 5043 as being positioned in the cart, the disclosure is not so limited. In particular, the light source apparatus may be positioned in the camera head 5005.
(Camera Head and CCU)
Functions of the camera head 5005 of the endoscope 5001 and the CCU 5039 are described in more detail with reference to Figure 2. Figure 2 is a block diagram depicting an example of a functional configuration of the camera head 5005 and the CCU 5039 depicted in Figure 1. Referring to Figure 2, the camera head 5005 has, as functions thereof, a lens unit 5007, an image pickup unit 5009, a driving unit 5011, a communication unit 5013 and a camera head controlling unit 5015. Further, the CCU 5039 has, as functions thereof, a communication unit 5059, an image processing unit 5061 and a control unit 5063. The camera head 5005 and the CCU 5039 are connected to be bidirectionally communicable to each other by a transmission cable 5065.
First, a functional configuration of the camera head 5005 is described. The lens unit 5007 is an optical system provided at a connecting location of the camera head 5005 to the lens barrel 5003. Observation light taken in from a distal end of the lens barrel 5003 is introduced into the camera head 5005 and enters the lens unit 5007. The lens unit 5007 includes a combination of a plurality of lenses including a zoom lens and a focusing lens. The lens unit 5007 has optical properties adjusted such that the observation light is condensed on a light receiving face of the image pickup element of the image pickup unit 5009.
Further, the zoom lens and the focusing lens are configured such that the positions thereof on their optical axis are movable for adjustment of the magnification and the focal point of a picked up image.
The image pickup unit 5009 includes an image pickup element and is disposed at a succeeding stage to the lens unit 5007. Observation light having passed through the lens unit 5007 is condensed on the light receiving face of the image pickup element, and an image signal corresponding to the observation image is generated by photoelectric conversion of the image pickup element. The image signal generated by the image pickup unit 5009 is provided to the communication unit 5013.
As the image pickup element which is included by the image pickup unit 5009, an image sensor, for example, of the complementary metal oxide semiconductor (CMOS) type is used which has a Bayer array and is capable of picking up an image in colour. It is to be noted that, as the image pickup element, an image pickup element may be used which is ready, for example, for imaging of an image of a high resolution equal to or not less than 4K. If an image of a surgical region is obtained in a high resolution, then the surgeon 5067 can comprehend a state of the surgical region in enhanced details and can proceed with the surgery more smoothly.
Further, the image pickup unit 5009 may include a pair of image pickup elements for acquiring image signals for the right eye and the left eye compatible with 3D display. Where 3D display is applied, the surgeon 5067 can comprehend the depth of a living body tissue in the surgical region more accurately. It is to be noted that, if the image pickup unit 5009 is configured as that of the multi-plate type, then a plurality of systems of lens units 5007 are provided corresponding to the individual image pickup elements of the image pickup unit 5009.
The image pickup unit 5009 may not necessarily be provided on the camera head 5005. For example, the image pickup unit 5009 may be provided just behind the objective lens in the inside of the lens barrel 5003. The driving unit 5011 includes an actuator and moves the zoom lens and the focusing lens of the lens unit 5007 by a predetermined distance along the optical axis under the control of the camera head controlling unit 5015. Consequently, the magnification and the focal point of a picked up image by the image pickup unit 5009 can be adjusted suitably.
The communication unit 5013 includes a communication apparatus for transmitting and receiving various kinds of information to and from the CCU 5039. The communication unit 5013 transmits an image signal acquired from the image pickup unit 5009 as RAW data to the CCU 5039 through the transmission cable 5065. Thereupon, in order to display a picked up image of a surgical region in low latency, preferably the image signal is transmitted by optical communication. This is because, during surgery, the surgeon 5067 performs surgery while observing the state of an affected area through a picked up image, and it is therefore demanded that a moving image of the surgical region be displayed on a real time basis as far as possible in order to achieve surgery with a higher degree of safety and certainty. Where optical communication is applied, a photoelectric conversion module for converting an electric signal into an optical signal is provided in the communication unit 5013. After the image signal is converted into an optical signal by the photoelectric conversion module, it is transmitted to the CCU 5039 through the transmission cable 5065.
Further, the communication unit 5013 receives a control signal for controlling driving of the camera head 5005 from the CCU 5039. The control signal includes information relating to image pickup conditions such as, for example, information that a frame rate of a picked up image is designated, information that an exposure value upon image picking up is designated and/or information that a magnification and a focal point of a picked up image are designated. The communication unit 5013 provides the received control signal to the camera head controlling unit 5015. It is to be noted that also the control signal from the CCU 5039 may be transmitted by optical communication. In this case, a photoelectric conversion module for converting an optical signal into an electric signal is provided in the communication unit 5013. After the control signal is converted into an electric signal by the photoelectric conversion module, it is provided to the camera head controlling unit 5015.
It is to be noted that the image pickup conditions such as the frame rate, exposure value, magnification or focal point are set automatically by the control unit 5063 of the CCU 5039 on the basis of an acquired image signal. In other words, an auto exposure (AE) function, an auto focus (AF) function and an auto white balance (AWB) function are incorporated in the endoscope 5001.
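As an illustrative sketch of what such automatic adjustment might compute from an acquired image signal, a grey-world white balance followed by a global exposure gain is shown below. The function name, the grey-world heuristic and the target mean are assumptions made for illustration, not the detection process actually performed by the CCU 5039:

```python
import numpy as np

def auto_adjust(rgb, target_mean=128.0):
    """Simplified AWB + AE stand-in: equalize the per-channel means
    (grey-world white balance), then apply one global gain so the overall
    image mean reaches the target exposure level."""
    img = rgb.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    grey = channel_means.mean()
    img *= grey / channel_means        # AWB: balance the colour channels
    img *= target_mean / img.mean()    # AE: global exposure gain
    return np.clip(img, 0, 255)

img = np.full((4, 4, 3), [100.0, 200.0, 50.0])  # green-tinted, hypothetical
out = auto_adjust(img)
assert np.allclose(out.reshape(-1, 3).mean(axis=0), 128.0)
```

In the real system the computed exposure value, focal distance and white balance would be fed back to the camera head 5005 as a control signal rather than applied purely in post-processing.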
The camera head controlling unit 5015 controls driving of the camera head 5005 on the basis of a control signal from the CCU 5039 received through the communication unit 5013. For example, the camera head controlling unit 5015 controls driving of the image pickup element of the image pickup unit 5009 on the basis of information that a frame rate of a picked up image is designated and/or information that an exposure value upon image picking up is designated. Further, for example, the camera head controlling unit 5015 controls the driving unit 5011 to suitably move the zoom lens and the focus lens of the lens unit 5007 on the basis of information that a magnification and a focal point of a picked up image are designated. The camera head controlling unit 5015 may further include a function for storing information for identifying the lens barrel 5003 and/or the camera head 5005.
It is to be noted that, by disposing the components such as the lens unit 5007 and the image pickup unit 5009 in a sealed structure having high airtightness and waterproofness, the camera head 5005 can be provided with resistance to an autoclave sterilization process.
Now, a functional configuration of the CCU 5039 is described. The communication unit 5059 includes a communication apparatus for transmitting and receiving various kinds of information to and from the camera head 5005. The communication unit 5059 receives an image signal transmitted thereto from the camera head 5005 through the transmission cable 5065. Thereupon, the image signal may be transmitted preferably by optical communication as described above. In this case, for the compatibility with optical communication, the communication unit 5059 includes a photoelectric conversion module for converting an optical signal into an electric signal. The communication unit 5059 provides the image signal after conversion into an electric signal to the image processing unit 5061.
Further, the communication unit 5059 transmits, to the camera head 5005, a control signal for controlling driving of the camera head 5005. The control signal may also be transmitted by optical communication.
The image processing unit 5061 performs various image processes for an image signal in the form of RAW data transmitted thereto from the camera head 5005. The image processes include various known signal processes such as, for example, a development process, an image quality improving process (a bandwidth enhancement process, a super-resolution process, a noise reduction (NR) process and/or an image stabilization process) and/or an enlargement process (electronic zooming process). Further, the image processing unit 5061 performs a detection process for an image signal in order to perform AE, AF and AWB.
The image processing unit 5061 includes a processor such as a CPU or a GPU, and when the processor operates in accordance with a predetermined program, the image processes and the detection process described above can be performed. It is to be noted that, where the image processing unit 5061 includes a plurality of GPUs, the image processing unit 5061 suitably divides information relating to an image signal such that image processes are performed in parallel by the plurality of GPUs.
The control unit 5063 performs various kinds of control relating to image picking up of a surgical region by the endoscope 5001 and display of the picked up image. For example, the control unit 5063 generates a control signal for controlling driving of the camera head 5005. Thereupon, if image pickup conditions are inputted by the user, then the control unit 5063 generates a control signal on the basis of the input by the user. Alternatively, where the endoscope 5001 has an AE function, an AF function and an AWB function incorporated therein, the control unit 5063 suitably calculates an optimum exposure value, focal distance and white balance in response to a result of a detection process by the image processing unit 5061 and generates a control signal. Further, the control unit 5063 controls the display apparatus 5041 to display an image of a surgical region on the basis of an image signal for which image processes have been performed by the image processing unit 5061. Thereupon, the control unit 5063 recognizes various objects in the surgical region image using various image recognition technologies. For example, the control unit 5063 can recognize a surgical tool such as forceps, a particular living body region, bleeding, mist when the energy treatment tool 5021 is used and so forth by detecting the shape, colour and so forth of edges of the objects included in the surgical region image. The control unit 5063 causes, when it controls the display apparatus 5041 to display a surgical region image, various kinds of surgery supporting information to be displayed in an overlapping manner with an image of the surgical region using a result of the recognition. Where surgery supporting information is displayed in an overlapping manner and presented to the surgeon 5067, the surgeon 5067 can proceed with the surgery with more safety and certainty.
The transmission cable 5065 which connects the camera head 5005 and the CCU 5039 to each other is an electric signal cable ready for communication of an electric signal, an optical fibre ready for optical communication or a composite cable ready for both of electrical and optical communication.
Here, while, in the example depicted, communication is performed by wired communication using the transmission cable 5065, the communication between the camera head 5005 and the CCU 5039 may be performed otherwise by wireless communication. Where the communication between the camera head 5005 and the CCU 5039 is performed by wireless communication, there is no necessity to lay the transmission cable 5065 in the surgery room. Therefore, such a situation that movement of medical staff in the surgery room is disturbed by the transmission cable 5065 can be eliminated.
An example of the endoscopic surgery system 5000 to which the technology according to an embodiment of the present disclosure can be applied has been described above. It is to be noted here that, although the endoscopic surgery system 5000 has been described as an example, the system to which the technology according to an embodiment of the present disclosure can be applied is not limited to the example. For example, the technology according to an embodiment of the present disclosure may be applied to a soft endoscopic system for inspection or a microscopic surgery system. Indeed, the technology may be applied to a surgical microscope for conducting neurosurgery or the like.
The technology according to an embodiment of the present disclosure can be applied suitably to the CCU 5039 from among the components described hereinabove. Specifically, the technology according to an embodiment of the present disclosure is applied to an endoscopy system, surgical microscopy or medical imaging. By applying the technology according to an embodiment of the present disclosure to these areas, blood flow in veins, arteries and capillaries may be identified. Further, objects may be identified and the material of those objects may be established. This reduces the risk to the patient’s safety during operations.
Furthermore, it will be appreciated that the technology of the present disclosure may be applied more generally to any kind of imaging device (including industrial imaging devices, for example). As noted above, it is desired that an imaging system is provided which can address a number of problems related to the optical performance of imaging devices (such as the medical imaging device described with reference to Figures 1 and 2 of the present disclosure). In particular, an imaging system which enables an image (or images) with extended depth of field to be obtained is desired.
Certain imaging systems can be used in order to extend the depth of field of an imaging device. For example, a degree of extension of the depth of field of an image can be achieved through wave front manipulation and subsequent inverse filtering. However, a drawback with these systems for extending the depth of field (using wave front manipulation and an inverse filter) is a reduced level of in-focus sharpness of the image which is obtained. That is, the sharpness level of the sharpest regions within the image is often reduced by the application of this process to extend the depth of field.
It is known that polarization parameters can influence the point spread function (PSF) of an optical system. By changing polarization phase and amplitude it is thus possible to manipulate the PSF of an optical system. In fact, by introducing a phase shift between light of different polarizations, it is possible to vary the in-focus position for different polarizations of light from an object or scene. In other words, by introducing a phase shift between light of different polarizations, the different polarizations of light can, then, be focused to different focal positions.
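The effect of such a phase shift on the PSF can be illustrated with a simple scalar Fourier-optics model: a quadratic (defocus-like) phase is applied across a circular pupil, and the PSF is computed as the squared magnitude of the pupil's Fourier transform. The grid size, aperture radius and phase profile below are illustrative assumptions, not parameters of the disclosed optical system:

```python
import numpy as np

def defocus_psf(phase_shift_waves, n=64, aperture_radius=0.6):
    """PSF of a circular pupil carrying a quadratic defocus phase.
    A polarization-dependent phase offset of this kind shifts the in-focus
    plane seen by that polarization state."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x**2 + y**2
    pupil = (r2 <= aperture_radius**2).astype(np.complex128)
    pupil *= np.exp(1j * 2 * np.pi * phase_shift_waves * r2)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    return psf / psf.sum()

psf_in_focus = defocus_psf(0.0)
psf_defocused = defocus_psf(1.0)
# Defocus spreads energy away from the PSF peak (lower Strehl ratio).
assert psf_defocused.max() < psf_in_focus.max()
```

Giving each polarization state its own `phase_shift_waves` in this model yields a distinct PSF, and hence a distinct in-focus object distance, per polarization.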
Accordingly, an imaging system for an imaging device is provided in accordance with embodiments of the disclosure.
<Imaging System>
An imaging system for an imaging device according to embodiments of the present disclosure is illustrated in Figure 3.
The imaging system 1000 comprises image sensor circuitry 1002 and a polarization unit 1004. In the example of Figure 3, the imaging system 1000 is illustrated as part of an imaging device (the imaging device including an imaging lens system 1006). The imaging system 1000 is arranged along the optical axis 1010 of the medical imaging device. In Figure 3, an optional light source 1008 is illustrated as part of the imaging system 1000.
In the imaging system 1000, the polarization unit 1004 is arranged along the optical axis 1010 of the medical imaging device between the imaging lens system 1006 and the image sensor circuitry 1002. That is, the path of the light collected by the imaging lens system 1006 of the imaging device is intercepted by the polarization unit 1004 before that light reaches the image sensor circuitry 1002 of the imaging system 1000. Accordingly, the image sensor circuitry 1002 is configured to receive light from the polarization unit 1004. The imaging lens system 1006 of the medical imaging device is thus the lens or lens system which is used in order to collect light from the scene. The imaging lens system is not particularly limited, and will vary depending on the imaging device to which the imaging system 1000 is applied.
In this example, the imaging device to which the imaging system 1000 is applied is a medical imaging device. However, it will be appreciated that the present disclosure may also be applied to other types of imaging devices (such as industrial imaging devices or the like). Furthermore, the type of medical imaging device, in which the imaging system 1000 is arranged, is not particularly limited. That is, the medical imaging device may be any medical imaging device, such as a microscopic medical imaging device, laparoscopic medical imaging device, endoscopic medical imaging device or the like.
The inventors have realised that providing an imaging system comprising image sensor circuitry which is configured to measure, at a first instance of time, a plurality of different polarization orientations of light, in combination with a polarization unit which is configured to introduce a polarization dependent phase difference to in-coming wave-fronts of light from a scene, is particularly advantageous. It enables an extended depth of field of the image to be obtained (compared to an optical system of the same optical specification) without requiring an increase in the size (e.g. form factor) of the imaging device. Specifically, by separating the incoming pre-polarized light wave-front into a plurality of different polarization states (each with an individual phase shift), a plurality of images of the scene can be obtained, each having a different in-focus object distance (corresponding to each of the different polarization states provided by the polarization unit 1004).
Moreover, as the different polarizations of light are measured by the imaging circuitry 1002 at substantially the same time (e.g. a first instance of time, being the time of image read out by the image sensor circuitry) polarization information regarding the scene is not lost. Therefore, the ability to perform polarization image analysis is maintained. That is, polarization based image analysis may still be performed on the image data, as information regarding all the polarization states provided by the polarization unit 1004 is maintained. This is particularly advantageous, therefore, over a system whereby input light is modulated and the separate polarization states of light are sequentially read out by image sensor circuitry.
Hence, according to embodiments of the disclosure, the polarization unit 1004 of the imaging system 1000 is configured to receive incident light of a first polarization and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident. The polarization unit 1004 will now be described in more detail with reference to Figures 4A to 4C of the present disclosure.
<Polarization Unit>
Figure 4A illustrates a polarization unit in accordance with embodiments of the disclosure. The polarization unit 1004 is configured to receive light of a first polarization state (e.g. circular and/or elliptical polarization) and provide light of a plurality of different polarization states (e.g. a plurality of linear polarization states) depending on the region of the polarization unit upon which the incident light is incident.
Consider the example polarization unit illustrated in Figure 4A. In this example, light 3000 of a first polarization (e.g. circular polarization) is received by the polarization unit 1004. Polarization unit 1004 is then configured to provide light of different polarization states 3002a, 3002b, 3002c and 3002d depending on the region of the polarization unit upon which the incident light is incident. That is, in this example, each of the polarization states of light 3002a, 3002b, 3002c and 3002d are different forms of linear polarized light which are provided (e.g. output or transmitted) by the polarization unit 1004.
The polarization of light which is provided by the polarization unit 1004 in response to receiving the incident light 3000 varies across the surface of the polarization unit 1004. In other words, the polarization unit 1004 may be comprised of a number of regions or segments, with each region or segment being configured to provide light of a given polarization state. That is, a first region of the polarization unit 1004 may provide light of polarization state 3002a, while a second region of the polarization unit 1004 may provide light of polarization state 3002b.
Of course, the polarization unit 1004 is not particularly limited to providing the type and/or number of polarization states which are illustrated in this specific example. In fact, any number of polarization states can be provided by the polarization unit 1004, insofar as there are at least two polarization states (such that a phase shift can be introduced between the respective polarization states).
Figures 4B and 4C illustrate an example configuration of the polarization unit 1004 in accordance with embodiments of the disclosure.
Regarding Figure 4B, a detailed configuration of the polarization unit 1004 is shown, with the polarization unit 1004 being arranged in a number of distinct segments 4000, 4002, 4004 and 4006. That is, in this specific example, the polarization unit 1004 is arranged as a segmented polarization unit, whereby the polarization state of light which is provided by the polarization unit 1004 varies in accordance with both the segment upon which the light is incident and the radial distance of that light from the centre of the polarization unit. In this specific example, the polarization unit is configured to provide light of a vertical polarization state when that light is incident upon the region 4006a of segment 4006. In contrast, within segment 4006, a horizontal polarization state of light is provided when the incident light is incident upon the region 4006b of segment 4006.
As such, for this example configuration of the polarization unit, the regions of the polarization unit which provide light of different polarization states are segmented regions of the polarization unit 1004. An alternative example configuration of the polarization unit 1004 is illustrated by Figure 4C of the present disclosure.
In the specific example of Figure 4C the polarization state of light which is provided by the polarization unit 1004 varies only in accordance with the radial distance of the light from the centre of the polarization unit. Specifically, the polarization unit 1004 is segmented into a number of annular zones or regions 4008, 4010, 4012 and 4014. The polarization state of light which is provided by the polarization unit 1004 depends only upon the annular zone or region 4008, 4010, 4012 and 4014 upon which that light was incident. For example, light which is received by the polarization unit 1004 anywhere within the annular zone or region 4014 will be provided, by the polarization unit 1004, with a horizontal polarization state.
In contrast, light which is received by the polarization unit 1004 anywhere within the annular zone or region 4012 will be provided, by the polarization unit 1004, with a vertical polarization state.
As such, for this example configuration of the polarization unit, the regions of the polarization unit which provide light of different polarization state are annular regions of the polarization unit 1004.
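The mapping from a point on such an annular polarization unit to the polarization state it provides can be sketched in code. This is an illustrative model only: the zone radii and polarization angles below are assumed values chosen for the sketch, not values taken from the disclosure.

```python
import math

# Hypothetical annular zones: outer radius of each zone (in mm) and the
# linear polarization angle (in degrees) that zone provides. These numbers
# are assumptions for illustration.
ZONE_RADII = [1.0, 2.0, 3.0, 4.0]
ZONE_ANGLES_DEG = [0.0, 90.0, 45.0, 135.0]

def polarization_angle(x, y):
    """Return the linear polarization angle (degrees) provided for light
    incident at (x, y), measured from the centre of the polarization unit,
    or None if the light falls outside the unit's aperture."""
    r = math.hypot(x, y)
    for outer_radius, angle in zip(ZONE_RADII, ZONE_ANGLES_DEG):
        if r <= outer_radius:
            return angle
    return None

# e.g. light at radius 1.5 mm falls in the second annular zone -> 90 degrees
```

The same structure, with the zone test replaced by a test on the angular segment, would model the segmented configuration of Figure 4B.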
Again, it will be appreciated that the polarization unit 1004 is not particularly limited to these specific configurations illustrated in Figure 4B and Figure 4C of the present disclosure. In fact, any such configuration of the polarization unit 1004 may be provided as required, insofar as the polarization unit 1004 is configured to provide light of different polarization state depending upon the region of the polarization unit upon which the incident light is incident. This introduces a phase shift between the different polarizations of light.
In certain examples of the disclosure, the polarization unit 1004 may comprise a quarter wave zone plate (or the like). A quarter wave zone plate introduces a relative phase shift of 90 degrees between the orthogonal wave-fronts passing through the quarter wave zone plate, when that plate is illuminated by light. Thus, the quarter wave zone plate will convert an incoming circularly polarized wave-front into a linearly polarized wave-front. The materials used in the construction of the quarter wave plate may vary in accordance with the situation to which the embodiments of the disclosure are applied.
By introducing a number of zones, or regions, across the quarter wave plate, the plate will thus convert an incoming circularly polarized wave-front into a number of linearly polarized wave-fronts in accordance with the region on which the incoming wave-front is incident. In other words, the quarter wave zone plate is a segmented quarter wave plate, where each segment has a different optical axis orientation.
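The action of a quarter-wave plate on circularly polarized light can be illustrated with standard Jones calculus. This is textbook optics included only to illustrate the segment-dependent conversion described above; it is not an implementation of the disclosed unit, and the fast-axis angles used are arbitrary examples.

```python
import numpy as np

def quarter_wave_plate(theta):
    """Jones matrix of a quarter-wave plate with its fast axis at angle
    theta (radians): rotate into the plate frame, apply a 90-degree relative
    retardance between the axes, rotate back."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.array([[1, 0], [0, 1j]])  # 90-degree relative phase shift
    return rot @ retarder @ rot.T

# Circularly polarized input wave-front (Jones vector)
circular = np.array([1, 1j]) / np.sqrt(2)

out = quarter_wave_plate(0.0) @ circular
# for theta = 0 the output is (1, -1)/sqrt(2): linearly polarized at -45 degrees
```

Each segment of the zone plate corresponds to a different `theta`, so each segment converts the same circular input into a differently oriented linear polarization.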
Of course, it will be appreciated that the quarter wave zone plate is one example of a polarization unit 1004 which can be used in accordance with embodiments of the disclosure. However, any polarization unit which is configured to convert incident light of a first polarization state into light of a plurality of polarization states may be used in accordance with embodiments of the disclosure. As such, through use of the polarization unit 1004, light from the scene (captured by the imaging lens system 1006 of the medical imaging device) is turned into polarized light of different state/orientation (such as different linear polarization states) depending on the light beam position in pupil space (which dictates upon which segment of the polarization unit the light from the scene is incident). Owing to the phase shift introduced by the change in polarization state of the light, light of different polarization states accumulates at the image plane (e.g. the image sensor circuitry) with a different focal plane (e.g. each polarization state will have a different focal position for a given object distance).
Accordingly, different regions of the scene (at different object distances) will be in-focus at the image plane corresponding to different polarization states.
<Light Source>
It is required that the light incident on the polarization unit comprises light of a predetermined first polarization state (such as circular or elliptically polarized light), such that the polarization unit 1004 may provide light of a plurality of polarization states (such as a plurality of linearly polarized light) from this incident light.
It will be appreciated that, in certain situations to which the embodiments of the disclosure may be applied, the reflected light from the scene (captured by the imaging lens system 1006 of the imaging device) will be deformed into light of a first polarization state (e.g. elliptical or circularly polarized light). That is, in certain situations, the reflected light from the scene may already be in a desired first polarization state.
However, in other situations, it may be desired to control the input polarization state of light (being the light which is incident upon the polarization unit 1004) through use of a light source. That is, in certain example situations, such as that of laparoscopic surgery, the light within the scene must be provided by a light source located within the surgical environment. The illumination source may be internal or external to the medical imaging device itself. Hence, in Figure 3, an optional light source 1008 is illustrated as being part of the imaging system 1000. The optional light source 1008 may be configured to provide light of a desired first polarization state as input light of the scene.
In other situations, the light in the scene may be provided by an ambient light source. In these examples, no further illumination of the scene is required, provided the ambient light source is of a desired first polarization state (such as a circular and/or elliptical pre-polarization). If the ambient light source is not of a desired first polarization state, then an additional light source may also be provided in order to illuminate the scene with a desired first polarization of light (such as circular and/or elliptical pre-polarized light). Again, this additional light source may be either internal or external to the imaging system 1000. For example, instead of being provided as part of the imaging system 1000 (as optionally illustrated in Figure 3 of the present disclosure) the light source may, instead, be provided as an independent light source device. Alternatively, the light source may, optionally, be provided as part of the imaging device itself.
Hence, if required, the optional light source 1008 may be configured to provide light of a first pre-configured polarization state (e.g. light of a circular/elliptical polarization).
Accordingly, it will be appreciated that, provided the incident light is of a desired first polarization state, no further modulation or adaptation of the input light will be required.
<Image Sensor Circuitry>
As illustrated in Figure 3 of the present disclosure, the imaging system 1000 further comprises image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state.
The image sensor circuitry may be any image sensor circuitry, such as a CCD sensor, a CMOS sensor or the like. In fact, any image capture device may be used as the image sensor circuitry in accordance with embodiments of the disclosure. However, image sensor circuitry according to embodiments of the disclosure is configured with a sensor sub-pixel structure such that a plurality of image sensing regions are provided (each of these regions being sensitive to light of a certain polarization state).
Indeed, in examples, the image sensor circuitry may be a CMOS based sensor with a 2D pixel array (such as the Sony product named IMX250) with four different linear polarization filters aligned in a 2x2 cluster at the pixel level.
Figure 5 illustrates an example configuration of a region of image sensor circuitry 1002 in accordance with embodiments of the disclosure.
In this example, the image sensor circuitry 1002 is configured with a sub-pixel structure such that four sub-pixel regions 2000, 2002, 2004 and 2006 are provided. Each of these regions of the image sensor circuitry 1002 is configured to be sensitive to light of a different polarization state. That is, in this example, the region 2000 is configured to be sensitive to light of a horizontal polarization state, while the region 2006 is configured to be sensitive to light of a vertical polarization state. Likewise, each of the regions 2004 and 2002 is configured to be sensitive to light of a different diagonal polarization orientation.
Of course, while only four sub-pixel regions 2000, 2002, 2004 and 2006 are illustrated in Figure 5, it will be appreciated that the sub-pixel structure illustrated in Figure 5 may be repeated across the full area of the image sensor circuitry 1002. That is, it will be appreciated that the image sensor circuitry of the present disclosure is not particularly limited to the above described sub-pixel structure. As such, the image sensor circuitry illustrated in Figure 5 of the present disclosure has four analyser orientations (corresponding to the four different polarization orientations/states of light). The image sensor circuitry 1002 according to embodiments of the disclosure therefore filters the light from the polarization unit in accordance with its polarization direction.
In other words, owing to the four sub-pixel sensing regions, the single image sensor circuitry can produce four output images (each image corresponding to a specific polarization state of light provided by the polarization unit 1004).
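The separation of such a mosaic read-out into four per-orientation images can be sketched as follows. The 2x2 layout assumed here (0°, 45°, 135°, 90°) and the function name are illustrative assumptions; the actual sub-pixel pattern of a given sensor may differ.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw 2D read-out (H, W), with H and W even, into four
    quarter-resolution images, one per analyser orientation, by taking
    every second pixel starting from each position of the 2x2 cluster."""
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        90:  raw[1::2, 1::2],
    }

raw = np.arange(16).reshape(4, 4)  # toy 4x4 read-out
images = split_polarization_mosaic(raw)
# each of the four images has shape (2, 2)
```

Each of the four resulting images corresponds to one polarization state provided by the polarization unit 1004, and therefore to one in-focus object distance.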
In certain examples, the image sensor circuitry of the present disclosure may comprise a two-dimensional solid-state image capture device, whereby a polarization member is disposed at a light incident side of the sub-pixel regions constituting each pixel area of the image sensor. The polarization member thus shields, or restricts, the polarization state of light which can be incident upon each of the respective sub-pixel regions.
In certain examples, the image sensor circuitry of the present disclosure may be provided by a sensor such as the Sony IMX250 sensor (either the RGB or Monochrome version). This sensor is a four-directional pixel-wise polarization CMOS image sensor using an air-gap wire grid on 2.5 µm back-illuminated pixels. However, the present disclosure is not particularly limited in this regard, and other sensors with the above described configuration may be used as required.
Moreover, in some situations, it may be advantageous that the sub-pixel regions of the image sensor 1002 are co-aligned with the regions of the polarization unit 1004.
As such, according to embodiments of the disclosure, the image sensor circuitry is configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
Returning to Figure 5 of the present disclosure, the image sensor circuitry may, therefore, be configured in order to output, at a first instance of time, image data from sub-pixel region 2006 (being an image data formed of vertically polarized light) as a first image. Likewise, at the same first instance of time, the image sensor circuitry may be configured to output image data from sub-pixel regions 2000, 2002, and 2004 as independent image data having horizontal and diagonal polarization states respectively. In this manner, the image sensor circuitry of the present disclosure may be configured to produce, at the first instance of time, individual image data for each of the image polarization states of light received from the polarization unit respectively.
In this manner, individual image data for each of the respective polarization states (with their respective in focus position) can be obtained by the imaging system 1000 at the first instance of time.
<Example>
Figure 6A illustrates an example situation to which the imaging system 1000 may be applied.
In this example, the imaging lens system 1006 of the medical imaging device, the polarization unit 1004 and the image sensor circuity 1002 are shown.
Input light (being reflected light from the surgical scene) of either elliptical and/or circular polarization is received by the imaging lens system 1006 of the medical imaging device. Light from this imaging lens system 1006 of the medical imaging device is intercepted by the polarization unit 1004 of the imaging system 1000. In this example, the polarization unit 1004 is arranged at the pupil position of the imaging lens system 1006 of the medical imaging device. The polarization unit 1004 provides light of a different polarization state depending on the light beam position of the incident light in pupil space.
As previously described, the change of polarization state of the incident light introduces a phase shift between the respective polarization states of the light from the scene. This is illustrated with reference to Figure 6B of the present disclosure. In this Figure, it can be seen that light of each of the respective polarization states provided by the polarization unit 1004 has a focal plane which is axially shifted along the optical axis of the medical imaging device. That is, in this example, light of the vertical polarization state has a focal plane axial shift of Δf along the optical axis of the medical imaging device compared to light of the horizontal polarization state. As such, light from an object with the same object distance will have a different focal plane along the optical axis of the imaging device for each different polarization state of light provided by the polarization unit 1004. Of course, the amount of focal shift Δf between the different polarization states is dependent on the focal length, magnification and aperture size of the optical system.
Alternatively (as illustrated in Figure 6A) it will be appreciated that ‘in focus’ light for each of the different polarization states of light will correspond to light received from an object with a different object distance in the scene. That is, in focus light for the vertical polarization state of light may originate from an object in the scene with a first object distance X, while in focus light for a different polarization state of light (e.g. a horizontal polarization state of light) will originate from an object in the scene with a second object distance Y.
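The relationship between a shift of the focal plane and the corresponding change of in-focus object distance can be illustrated with the standard thin-lens equation 1/f = 1/d_obj + 1/d_img. The numerical values below (focal length, sensor distance, focal-plane shift) are assumptions chosen only for illustration and do not come from the disclosure.

```python
def in_focus_object_distance(f, d_img):
    """Object distance that is imaged in focus at image distance d_img,
    for a thin lens of focal length f (all values in the same units)."""
    return 1.0 / (1.0 / f - 1.0 / d_img)

f = 50.0        # assumed focal length (mm)
d_img = 52.0    # nominal distance from lens to sensor plane (mm)
delta_f = 0.5   # assumed polarization-induced focal-plane shift (mm)

x = in_focus_object_distance(f, d_img)            # in-focus distance, state 1
y = in_focus_object_distance(f, d_img + delta_f)  # in-focus distance, state 2
# x = 1300.0 mm and y = 1050.0 mm for these assumed values: the state whose
# focal plane sits further behind the lens is in focus for nearer objects
```

This illustrates why two polarization states with focal planes separated by Δf yield in-focus images for two different object distances X and Y at the same instance of time.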
Figure 6C of the present disclosure illustrates an example focus shift in the image plane in accordance with embodiments of the present disclosure. In this alternative example, two paths of light 6000P1 and 6000P2 (corresponding to two distinct polarizations of the light) are shown. The light 6000P1 originates at the same object position in the object plane as the light 6000P2 in this example (i.e. being the same distance from the lens 1006). However, when the light of the first polarization and the second polarization reach the segmented retarder zone plate, an axial shift along the optical image axis of the medical imaging device occurs between the light of the first polarization and the second polarization. Accordingly, there is an offset between the image position of the light paths 6000P1 and 6000P2 when the light of the respective polarization encounters the sensor with the sub-pixel structure. In other words, light paths with the same object plane position are shifted in the image plane. Therefore, each light path (corresponding to each polarization) will have a different in-focus position.
Figure 6D of the present disclosure illustrates an example focus shift in accordance with embodiments of the disclosure. In this example, input light (being either elliptical or circular in polarization) is received from a plurality of different object distances. When the light from these different object distances reaches the segmented retarder zone plate, an axial shift along the optical image axis of the medical imaging device occurs between the light of the different polarization states. Indeed, as explained above, it will be appreciated that ‘in focus’ light for each of the different polarization states of light will correspond to light received from an object with a different object distance in the scene. That is, in focus light for the vertical polarization state of light may originate from an object in the scene with a first object distance X, while in focus light for a different polarization state of light (e.g. a horizontal polarization state of light) will originate from an object in the scene with a second object distance Y. Hence, at the sensor plane the light from the different object distances is able to be kept in focus using the different polarization states of the light.
Figure 6E of the present disclosure illustrates an example focus shift in accordance with embodiments of the disclosure. In this example, input light (being either elliptical or circular in polarization in this example) is received from an object (or, from a plurality of objects at the same object distance from the lens). When the light of the first polarization and the second polarization reach the segmented retarder zone plate, an axial shift along the optical image axis of the medical imaging device occurs between the light of the first polarization and the second polarization. Therefore, light of different polarization states will have a different in-focus position at the sensor plane.
As such, by independently imaging each of the polarization states of the scene, with the image sensor circuitry 1002, in focus images can be obtained for a range of object distances within the image scene at the same instance of time. That is, for each polarization state provided by the polarization unit 1004, the image sensor circuitry 1002 is configured to produce an individual image of the scene at that instance of time. Furthermore, because the focal position of the images corresponding to different polarization states will be different (owing to the shift of focal position) different regions of each of these respective images (corresponding to the different polarization states, with different in-focus object distances) will be in focus.
The subsequent array of in-focus images (each with a different in-focus distance), corresponding to each of the polarization states of light provided by the polarization unit 1004 and independently detected by the image sensor circuitry 1002, can be produced without any increase in the size (e.g. form factor) of the medical imaging device. In other words, a plurality of different focus positions (planes) can be generated using the imaging system 1000 of the present disclosure. Therefore, the imaging system 1000 of the present disclosure provides an increase in the depth of field which can be obtained by the imaging device.
<Example Image Processing>
The processing applied to the images output by the imaging system 1000 in accordance with embodiments of the disclosure is not particularly limited, and will vary depending on the situation to which the embodiments of the disclosure are applied. In some examples, the image data of each of the polarization states may be analysed individually. Alternatively, the image data may be stored in a storage unit. In fact, any type of image processing may be applied to the image data as required. However, in certain situations, it may be desired to provide a single image, for each instance of time, of the imaging environment, that image having an extended depth of field.
As such, according to embodiments of the disclosure, the image sensor circuitry (or indeed, in certain examples, processor circuitry which receives the image data from the image sensor circuitry (either directly, or from an intermediate storage)) is configured to perform a series of processing steps in order to produce an image, for the first instance of time, having an extended depth of field.
The processing is herein described with reference to a single instance of time. However, it will be appreciated that these steps may, alternatively, be applied, respectively and in sequence, to images obtained at subsequent instances of time. In this case, a sequence of images, at respective instances of time, having an extended depth of field is obtained.
Firstly, each individual image data obtained at the first instance of time, corresponding to each individual polarization state provided by the polarization unit 1004, is partitioned into regions. The size of these regions is not particularly limited, and will vary in accordance with processing requirements of the system. However, it will be appreciated that, for a given instance of time, the size of these regions will be the same across the images of the different polarization states. Then, once each individual image data has been partitioned (corresponding to each individual polarization state at that instance of time), the sharpness level of each region for each image is determined. That is, the sharpest image of the plurality of image data (corresponding to the different polarization states, each having a different focal position) obtained at the first instance of time is determined for each region of the image data. The skilled person will appreciate that any appropriate method known in the art may be used to determine the sharpness level of each region in each of the images (e.g. a modulation transfer function may be applied to the image data in order to determine the sharpness level of each region).
Furthermore, a sharpness transport function is then applied to the image data in order to transfer image detail from the image having the highest sharpness level for each region to the corresponding regions of the images of the other polarization states (having a lower sharpness level for that region). The sharpness transport function enables the production of a reconstructed image having an extended depth of field. It shall be understood that transporting the sharpness level from the polarization image (or image data) having the highest sharpness level to the complementary polarization images (or image data) having a lower sharpness level for each region of the image data may be carried out using any suitable sharpness transport technique, such as those known in the art.
As such, the sharpness gradient (with respect to image texture) will be analysed for each of the sub-pixel types of the image sensor circuitry 1002, with the high frequency information being transferred to sub-pixel types with a lower sharpness gradient. This enables the re-sharpening of objects at different object distances in a single image.
Accordingly, resultant image data having an extended depth of field can be produced in accordance with embodiments of the disclosure.
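The steps described above (partition each per-polarization image into regions, score the sharpness of each region, and combine the sharpest content) can be sketched as a simple per-block selection. Two simplifying assumptions are made for illustration: Laplacian variance is used as one possible sharpness measure, and the sharpest block is copied outright rather than having its high-frequency detail transported to the other images as a true sharpness transport function would.

```python
import numpy as np

def laplacian(img):
    """Discrete Laplacian of a 2D float array (zero at the borders),
    used here as a proxy for high-frequency image detail."""
    out = np.zeros_like(img, dtype=float)
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1]
                       + img[1:-1, :-2] + img[1:-1, 2:]
                       - 4.0 * img[1:-1, 1:-1])
    return out

def fuse_extended_dof(images, block=8):
    """Fuse a list of equally sized 2D images (one per polarization state,
    each with a different in-focus distance) into one image, taking each
    block from the image whose Laplacian variance is highest there."""
    h, w = images[0].shape
    laps = [laplacian(im) for im in images]
    fused = np.zeros((h, w))
    for y in range(0, h, block):
        for x in range(0, w, block):
            scores = [np.var(l[y:y + block, x:x + block]) for l in laps]
            best = int(np.argmax(scores))
            fused[y:y + block, x:x + block] = images[best][y:y + block, x:x + block]
    return fused
```

Applied to the four per-polarization images output at the first instance of time, such a procedure yields a single image in which regions at different object distances are each taken from the polarization state for which they are in focus.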
<Computer Device>
Referring to Figure 7, an apparatus 7000 according to embodiments of the disclosure is shown.
Typically, an apparatus 7000 according to embodiments of the disclosure is a computer device such as a personal computer or a terminal connected to a server. Indeed, in embodiments, the apparatus may also be a server. The apparatus 7000 is controlled using a microprocessor or other processing circuitry 7002.
In some examples, the apparatus 7000 may be a portable computing device such as a mobile phone, laptop computer or tablet computing device.
The processing circuitry 7002 may be a microprocessor carrying out computer instructions or may be an Application Specific Integrated Circuit. The computer instructions are stored on storage medium 7004 which may be a magnetically readable medium, optically readable medium or solid state type circuitry. The storage medium 7004 may be integrated into the apparatus 7000 or may be separate to the apparatus 7000 and connected thereto using either a wired or wireless connection. The computer instructions may be embodied as computer software that contains computer readable code which, when loaded onto the processor circuitry 7002, configures the processor circuitry 7002 to perform a method according to embodiments of the disclosure (such as a method of imaging for a medical imaging device as illustrated with reference to Figure 8 of the present disclosure).
Additionally, an optional user input device 7006 is shown connected to the processing circuitry 7002.
The user input device 7006 may be a touch screen or may be a mouse or stylus type input device. The user input device 7006 may also be a keyboard or any combination of these devices.
A network connection 7008 may optionally be coupled to the processor circuitry 7002. The network connection 7008 may be a connection to a Local Area Network or a Wide Area Network such as the Internet or a Virtual Private Network or the like. The network connection 7008 may be connected to a server allowing the processor circuitry 7002 to communicate with another apparatus in order to obtain or provide relevant data. The network connection 7008 may be behind a firewall or some other form of network security.
Additionally, shown coupled to the processing circuitry 7002, is a display device 7010. The display device 7010, although shown integrated into the apparatus 7000, may additionally be separate to the apparatus 7000 and may be a monitor or some kind of device allowing the user to visualise the operation of the system. In addition, the display device 7010 may be a printer, projector or some other device allowing relevant information generated by the apparatus 7000 to be viewed by the user or by a third party.
<Method>
Figure 8 illustrates a method of imaging for an imaging device in accordance with embodiments of the disclosure. The imaging device may comprise the imaging system described with reference to Figure 3 of the present disclosure.
The method begins at step S8000, and proceeds to step S8002.
In step S8002, the method comprises receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident.
Once the light from the polarization unit has been received, the method proceeds to step S8004.
In step S8004, the method comprises outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
Once the image data corresponding to each of the respective image sensing regions has been output, the method proceeds to, and ends with, step S8006.
Of course, the method of imaging for an imaging device in accordance with embodiments of the disclosure is not particularly limited to the method illustrated in Figure 8 of the present disclosure. For example, once method step S8004 has been completed, the method according to embodiments of the present disclosure may, instead of proceeding to method step S8006, comprise returning to method step S8000 or S8002.
Aspects of the present disclosure can further be arranged in accordance with the following numbered clauses:
1. Imaging system for an imaging device, the imaging system comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
2. The imaging system according to Clause 1, wherein each of the plurality of image sensing regions is configured to be sensitive to a different polarization state of light.
3. The imaging system according to Clause 1 or 2, wherein the certain polarization states of the image sensing regions are linear polarization states of light.
4. The imaging system according to any preceding Clause, wherein the image sensor circuitry has four image sensing regions, each being configured to output image data at the first instance of time.
5. The imaging system according to any preceding Clause, wherein the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are annular regions of the polarization unit.
6. The imaging system according to any preceding Clause, wherein the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are segmented regions of the polarization unit.
7. The imaging system according to Clause 5 or 6, wherein the polarization unit is a quarter-wave zone plate.
8. The imaging system according to any preceding Clause, wherein the regions of the polarization unit are configured to be co-aligned with the image sensing regions of the image sensor circuitry.
9. The imaging system according to any preceding Clause, further comprising processing circuitry configured to analyse a portion of the image data corresponding to each of the respective image sensing regions to identify an image sensing region with a highest sharpness gradient for that portion of the image data.
10. The imaging system according to Clause 9, wherein the processing circuitry is further configured to transfer, for the portion of the image data, high frequency information from the image sensing region with the highest sharpness gradient to image data from a different image sensing region.
11. The imaging system according to any preceding Clause, wherein the imaging device is a laparoscopic imaging device or an endoscopic imaging device.
12. The imaging system of any preceding Clause, wherein the imaging device is a medical imaging device or an industrial imaging device.
13. The imaging system according to any preceding Clause, further comprising a light source configured to emit light of a predetermined polarization.
14. Method of imaging for an imaging device, the method comprising: receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
15. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out a method of imaging for an imaging device, the method comprising: controlling image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state to receive light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and controlling the image sensor circuitry to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
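Clauses 9 and 10 describe processing circuitry that, for a portion of the image data, identifies the image sensing region with the highest sharpness gradient and transfers high-frequency information from it into the image data of a different region. A rough, hypothetical sketch of one way such processing could be realised is given below; the gradient-magnitude sharpness measure, the box blur used to split high from low frequencies, and all function and variable names are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def sharpness(patch):
    """Mean gradient magnitude of a patch (a simple 'sharpness gradient')."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def box_blur(img, k=3):
    """Simple low-pass filter; stands in for any frequency-splitting filter."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def transfer_high_frequency(patches, target_key):
    """Find the sharpest region's patch, then add its high-frequency detail
    layer to the patch of a different sensing region."""
    best_key = max(patches, key=lambda k: sharpness(patches[k]))
    high = patches[best_key] - box_blur(patches[best_key])  # detail layer
    return best_key, patches[target_key] + high

# Hypothetical patches: region 0 is in focus (checkerboard), region 90 is flat.
sharp = np.indices((8, 8)).sum(axis=0) % 2 * 1.0
flat = np.full((8, 8), 0.5)
best, fused = transfer_high_frequency({0: sharp, 90: flat}, target_key=90)
```

Repeating this per portion of the image data is one way a system could combine differently focused polarization sub-images into a single image with an extended depth of field, as discussed for the industrial and medical applications below.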
While embodiments of the disclosure have been described in relation to an imaging system for a medical imaging device, it will be appreciated that the claimed invention is not limited to medical imaging (or medical imaging devices), and could, instead, be used in any imaging situation. The imaging system according to embodiments of the disclosure could be employed to good effect in an industrial imaging device such as an industrial endoscopic device. For example, embodiments of the disclosure could be used in architectural endoscopy, whereby a scale model of a new building or complex can be correctly viewed from the perspective of a person walking through the architectural creation, improving the visualisation, design and construction of proposed buildings.
Embodiments of the disclosure could be used for internal visualisation of works of engineering. For example, an imaging device according to embodiments of the disclosure could be used to view the interior of underground pipe systems, such as water pipes, in order to locate leaks or generally survey the structure. An imaging device according to embodiments of the disclosure could also be used for quality control and internal inspection of other mechanical systems such as turbines and engine components.
Alternatively, embodiments of the disclosure could be used in the security and surveillance industry. For example, an imaging device according to embodiments of the disclosure could be used to conduct surveillance in an area where the presence of a person is restricted, such as in an enclosed area or a very tight space.
In all these applications, an imaging system according to embodiments of the disclosure may be applied to the imaging device in order to capture high resolution images with an extended depth of field. It will be appreciated that the above are merely examples of possible industrial applications of an imaging system according to embodiments of the disclosure, and many further applications of the imaging device are possible, as would be apparent to the skilled person when reading the disclosure.
Obviously, numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practiced otherwise than as specifically described herein.
In so far as embodiments of the disclosure have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present disclosure.
It will be appreciated that the above description for clarity has described embodiments with reference to different functional units, circuitry and/or processors. However, it will be apparent that any suitable distribution of functionality between different functional units, circuitry and/or processors may be used without detracting from the embodiments.
Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors. Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in any manner suitable to implement the technique.

Claims

CLAIMS:
1) Imaging system for an imaging device, the imaging system comprising: image sensor circuitry having a plurality of image sensing regions, each of the respective image sensing regions configured to be sensitive to light of a certain polarization state; and a polarization unit being configured to be arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit further being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; the image sensor circuitry being configured to be arranged to receive light from the polarization unit and further being configured to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
2) The imaging system according to Claim 1, wherein each of the plurality of image sensing regions is configured to be sensitive to a different polarization state of light.
3) The imaging system according to Claim 1 or 2, wherein the certain polarization states of the image sensing regions are linear polarization states of light.
4) The imaging system according to any preceding Claim, wherein the image sensor circuitry has four image sensing regions, each being configured to output image data at the first instance of time.
5) The imaging system according to any of Claims 1 to 4, wherein the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are annular regions of the polarization unit.
6) The imaging system according to any of Claims 1 to 4, wherein the polarization unit is configured such that the regions of the polarization unit which provide light of different polarization states are segmented regions of the polarization unit.
7) The imaging system according to Claim 5 or 6, wherein the polarization unit is a quarter-wave zone plate.
8) The imaging system according to any preceding Claim, wherein the regions of the polarization unit are configured to be co-aligned with the image sensing regions of the image sensor circuitry.
9) The imaging system according to any preceding Claim, further comprising processing circuitry configured to analyse a portion of the image data corresponding to each of the respective image sensing regions to identify an image sensing region with a highest sharpness gradient for that portion of the image data.
10) The imaging system according to Claim 9, wherein the processing circuitry is further configured to transfer, for the portion of the image data, high frequency information from the image sensing region with the highest sharpness gradient to image data from a different image sensing region.
11) The imaging system according to any preceding Claim, wherein the imaging device is a laparoscopic imaging device or an endoscopic imaging device.
12) The imaging system of any preceding Claim, wherein the imaging device is a medical imaging device or an industrial imaging device.
13) The imaging system according to any preceding Claim, further comprising a light source configured to emit light of a predetermined polarization.
14) Method of imaging for an imaging device, the method comprising: receiving, using image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state, light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and outputting, at a first instance of time, image data corresponding to each of the respective image sensing regions.
15) A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out a method of imaging for an imaging device, the method comprising: controlling image sensor circuitry having a plurality of image sensing regions configured to be sensitive to light of a certain polarization state to receive light from a polarization unit arranged along an optical axis of the imaging device between an imaging lens system of the imaging device and the image sensor circuitry, the polarization unit being configured to receive incident light and provide light of different polarization states depending on the region of the polarization unit upon which the incident light is incident; and controlling the image sensor circuitry to output, at a first instance of time, image data corresponding to each of the respective image sensing regions.
EP22712261.1A 2021-03-15 2022-02-22 An imaging system, method and computer program product for an imaging device Pending EP4309358A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP21162644 2021-03-15
PCT/EP2022/054359 WO2022194497A1 (en) 2021-03-15 2022-02-22 An imaging system, method and computer program product for an imaging device

Publications (1)

Publication Number Publication Date
EP4309358A1 true EP4309358A1 (en) 2024-01-24

Family

ID=74874749

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22712261.1A Pending EP4309358A1 (en) 2021-03-15 2022-02-22 An imaging system, method and computer program product for an imaging device

Country Status (2)

Country Link
EP (1) EP4309358A1 (en)
WO (1) WO2022194497A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2868254B1 (en) * 2012-06-28 2018-04-04 Olympus Corporation Endoscope system
JP2017158764A (en) * 2016-03-09 2017-09-14 ソニー株式会社 Image processing device, image processing method, and recording medium
US10924689B1 (en) * 2019-11-22 2021-02-16 Karl Storz Imaging, Inc. Method and apparatus to improve high dynamic range image capture using image sensor with polarization

Also Published As

Publication number Publication date
WO2022194497A1 (en) 2022-09-22


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231001

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)