EP2976609B1 - System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light - Google Patents

System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light

Info

Publication number
EP2976609B1
EP2976609B1 (granted patent; application EP14714393.7A)
Authority
EP
European Patent Office
Prior art keywords
light field
camera
image
hyperspectral
projector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP14714393.7A
Other languages
German (de)
French (fr)
Other versions
EP2976609A1 (en)
Inventor
Frederik Jan De Bruijn
Remco Theodorus Johannes Muijs
Jorrit Ernst DE VRIES
Bernardus Hendrikus Wilhelmus Hendriks
Drazenko Babic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Publication of EP2976609A1 (patent/EP2976609A1/en)
Application granted
Publication of EP2976609B1 (patent/EP2976609B1/en)
Legal status: Active

Classifications

    • H04N 9/3185: Projection devices for colour picture display; geometric adjustment, e.g. keystone or convergence
    • H04N 23/11: Cameras or camera modules comprising electronic image sensors, for generating image signals from visible and infrared light wavelengths
    • A61B 5/0035: Imaging apparatus adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A61B 5/0075: Measuring for diagnostic purposes using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B 5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/489: Locating particular structures in or on the body; blood vessels
    • A61B 6/5247: Combining image data of a patient from an ionising-radiation and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • G01J 3/2823: Investigating the spectrum; imaging spectrometer
    • G01J 5/025: Radiation pyrometry; interfacing a pyrometer to an external device or network; user interface
    • G01J 5/0896: Radiation pyrometry; optical arrangements using a light source, e.g. for illuminating a surface
    • G02B 27/141: Beam splitting or combining systems operating by reflection only, using dichroic mirrors
    • G02B 3/0006: Simple or compound lenses; arrays
    • G06T 5/73: Image enhancement or restoration; deblurring, sharpening
    • H04N 23/10: Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths
    • H04N 9/3194: Projection devices for colour picture display; testing thereof including sensor feedback
    • H04N 9/646: Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • A61B 2090/374: Surgical systems with images on a monitor during operation; NMR or MRI
    • A61B 2090/3762: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy, using computed tomography systems [CT]
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound
    • A61B 6/4441: Constructional features related to the mounting of source and detector units, coupled by a rigid structure being a C-arm or U-arm
    • G01J 2005/0077: Radiation pyrometry; imaging
    • G06T 2207/10024: Image acquisition modality; color image
    • G06T 2207/10052: Image acquisition modality; images from lightfield camera

Definitions

  • the invention relates to a system for hyperspectral imaging comprising a camera for recording a hyperspectral image of an object and a display device for displaying the recorded hyperspectral image in visible light, and to a method for recording a hyperspectral image and displaying the hyperspectral image in visible light.
  • Hyperspectral imaging is known to reveal details impossible or difficult to see by the human eye, such as for instance tissue differences in a human.
  • an image is taken of an object in one or more wavelength bands where at least one wavelength band is at least partly invisible to the human eye or at least very difficult to see. This image is then converted into a visible image, which image is provided in visible light to a viewer.
  • Hyperspectral imaging can be based on spectrally selective illumination (i.e. illuminating an object with light in a certain wavelength band) or on spectrally selective filtering prior to image capture (i.e. using a filter that transmits only light in a certain wavelength band). In both cases image processing is required to generate a resulting image revealing the structure contrast of interest.
  • a hyperspectral image (e.g. image extending beyond the visible spectrum) is taken and the result is shown on a display screen.
  • the viewer, interested in details of the object under observation that are invisible or hardly visible, can study the image on the screen in visible light as it would appear in for instance UV light or IR light.
  • US2012/0200829 A1 discloses plenoptic cameras and detectors.
  • US2010/0177184 A1 discloses an infrared imaging system and projector to enhance the visibility of vascular structures in the body of a patient.
  • Taheri et al., Skin Research and Technology 2013, 19, 288-290 discloses the use of light field cameras for imaging of body parts.
  • EP2075616A1 discloses an imaging device and projector for projecting the recorded image of the body part of a patient on the same body part for surgical applications.
  • US2012/0170824 A1 discloses an imaging and projection device comprising visible and X-ray imaging sensors for projecting an X-ray image on the surface of the body of a patient.
  • the system of the invention is characterized in that the camera is a light field capturing camera for recording a hyperspectral image of a patient in a spectral range of radiation at least including one of UV, IR and Terahertz radiation, and the display device is a light field projector, wherein the camera and projector share a coaxial optical path, wherein the camera is arranged to capture a hyperspectral light field and comprises an output for sending data on the captured light field to an input of the light field projector, and wherein the light field projector is arranged to project a light field in visible light over the patient based on the data received from the camera.
  • the system further comprises a secondary imaging system for providing secondary image data on a three-dimensional internal image of the patient, wherein the system comprises a processor to provide depth information based on the data of the captured light field and to format, based on the depth information, the secondary data into an image projected on the object.
  • the secondary imaging system is one of an X-ray system, MRI, CT, PET-CT or ultrasound system.
  • the method of the invention is characterized in that a light field of a patient, in a range of radiation at least including one of UV, IR and Terahertz radiation, is captured by a light field camera; the data on the light field captured by the camera is processed to provide projection image data for a light field projector; and the light field projector projects a light field in visible light, based on the projection image data, over the patient, wherein the camera and projector share a coaxial optical path.
  • Secondary image data is provided on a three-dimensional internal image of the patient, wherein the secondary image is obtained using one of an X-ray, MRI, CT, PET-CT or ultrasound system, and wherein said secondary image data is reformatted using the depth information and said reformatted data is provided to the light field projector.
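As an illustration of how depth information can be derived from a captured light field (this sketch is not part of the patent disclosure; the function name and the per-pixel matching strategy are hypothetical simplifications), disparity between two horizontally displaced sub-aperture views can be estimated, depth being inversely related to disparity:

```python
import numpy as np

def disparity_from_views(left, right, max_d):
    """Toy disparity estimate between two sub-aperture views of a
    light field: per pixel, pick the horizontal shift of `right`
    that minimizes the squared difference against `left`."""
    best_d = np.zeros(left.shape, dtype=int)
    best_err = np.full(left.shape, np.inf)
    for d in range(max_d + 1):
        # shift the right view by d pixels and compare per pixel
        err = (left - np.roll(right, d, axis=1)) ** 2
        mask = err < best_err
        best_d[mask] = d
        best_err[mask] = err[mask]
    return best_d
```

A practical system would aggregate the matching cost over neighbourhoods and over many view pairs; the resulting depth map is what drives the reformatting of the secondary image data onto the patient's surface.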
  • the light field capturing camera captures a light field in a hyper range of the spectrum, i.e. in a spectral range of radiation at least partly not visible to the human eye and the light field projector projects a light field in visible light.
  • the light field projector forms a display device for displaying the recorded hyperspectral image in visible light.
  • the projected light field causes the display of a projected 3D image overlaying the object, said 3D image being sharp throughout a large range of depths.
  • a shared coaxial optical path provides for a relatively easy alignment of captured and projected light fields. This allows accurate and real time projection by the projector of the hyperspectral image in visible light on the object of observation of which the camera has captured the hyperspectral light field, also in case the object of observation is not flat but has a 3D form.
  • a light field capturing camera has, compared to a regular 2D or even a 3D camera, the advantage that a complete light field is obtained, with the possibility of obtaining sharp images throughout a range of depths.
  • a normal 2D camera does not provide a large depth of view and although a 3D camera can provide some depth information, neither are capable of providing a sharp image throughout a range of depths.
  • a light field camera is also called a plenoptic camera.
  • a light-field camera is a camera that captures light field information about a scene using plenoptic imaging. Plenoptic imaging captures an incident light field preserving both intensity and direction of incident light.
  • the implementation of a plenoptic imaging system can be based on various techniques: a microlens array, a continuously graded attenuation mask, or an aperture encoding mask.
  • Of the techniques described above, using a microlens array is preferred. In a continuously graded attenuation mask and in an aperture encoding mask, some of the light passing through the mask is attenuated, leading to a loss in intensity. In a microlens array, a higher percentage of the available light is used.
  • a microlens array is situated between a lens and an image sensor of the plenoptic camera.
  • the microlens array refocuses light captured by the lens onto the image sensor thereby creating many small images taken from slightly different viewpoints.
  • the 3D information is stored in the small images, each of which is produced by a single microlens.
  • Each of the small images has a relatively low spatial resolution.
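The reorganization of such raw sensor data into a 4D light field can be sketched as follows (illustrative only, not part of the patent disclosure; it assumes an idealized, rectified layout in which each microlens covers an s × s pixel block):

```python
import numpy as np

def extract_light_field(raw, s):
    """Reorganize a raw plenoptic sensor image into a 4D light field.

    Assumes each microlens covers an s x s block of pixels, so `raw`
    has shape (H*s, W*s) for a grid of H x W microlenses. Returns an
    array of shape (s, s, H, W) indexed as (u, v, x, y), where (u, v)
    is the angular coordinate within a microlens image.
    """
    H = raw.shape[0] // s
    W = raw.shape[1] // s
    # raw[i*s + u, j*s + v] -> lf[u, v, i, j]
    return raw.reshape(H, s, W, s).transpose(1, 3, 0, 2)
```

Here `lf[u, v]` is the sub-aperture image for angular direction (u, v), i.e. one of the "small images taken from slightly different viewpoints" mentioned above.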
  • Another type of light field capturing camera that does not use a microlens array, is a system that uses a so-called sweeping focus or sweeping lens technique.
  • This technique amounts to integrating images over the sweep of the focus (i.e. over a particular range of depths of field).
  • the resulting image comprises for the focus sweep all image information and also captures all available light.
  • the image taken can be deconvoluted to provide sharp images at various depths and reconstruct a plenoptic projected light field.
  • Using a microlens array is preferred since the light field can be obtained instantaneously. With a microlens array it is also relatively easy to align the captured light field, captured by the camera, with the projected light field, projected by the projector.
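Obtaining sharp images at various depths from a captured 4D light field can be illustrated with the classic shift-and-add refocusing scheme (a simplified sketch, not the patent's method; the integer-pixel shifts are a hypothetical simplification of proper sub-pixel interpolation):

```python
import numpy as np

def refocus(lf, alpha):
    """Synthetic refocusing of a light field lf of shape (U, V, H, W)
    by shifting each angular view proportionally to its angular
    offset and averaging; alpha selects the refocus depth."""
    U, V, H, W = lf.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - cu)))
            dv = int(round(alpha * (v - cv)))
            out += np.roll(lf[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

Sweeping `alpha` over a range yields the stack of images "sharp at various depths" referred to above.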
  • the camera and the projector share a common chain of optical imaging elements along the shared coaxial optical axis. This enables better alignment of the captured light field and the projected light field.
  • the system comprises an element providing a plenoptic function being positioned in the shared coaxial optical path.
  • Providing the element providing a plenoptic function in the shared coaxial path increases the ease of alignment of captured and projected light field.
  • Such elements may be a microlens array, a coded aperture or a wavefront encoder.
  • Of these, a microlens array is preferred.
  • the system comprises a beam splitter for splitting light paths from and to the common optical axis to and from the light field camera and light field projector respectively, the beam splitter having a dichroic, spectrally selective property.
  • the dichroic beam splitter passes or reflects light in the hyperspectral range to the camera, while reflecting or passing light in the visible range coming from the projector.
  • a spectrally selective illumination is used.
  • the system is a mobile, preferably portable system, for instance a hand held system. This allows e.g. a physician to view veins immediately and on the spot. When needles have to be inserted in a vein, such on-the-spot inspection is a great advantage.
  • the system is part of a surgical luminaire.
  • Hyperspectral imaging provides contrast, e.g. tissue contrast, that is invisible to the naked eye.
  • the improved contrast can for instance be used to reveal blood vessels and nerves during surgery or introduction of needles into veins. It can also be used to identify malignant tissue.
  • Hyperspectral image capture can be based on a monochrome, non-spectral-selective image sensor and the use of spectrally selective filtering in front of the image sensor, similar to a normal RGB camera but with more color channels and with different filter characteristics. Otherwise, hyperspectral image capture can also be based on spectrally selective (controlled) illumination in combination with an unfiltered image sensor. A combination of 'filtered illumination' and 'filtered acquisition' is also possible.
  • Differences in spectral response between different materials are generally converted into a visible contrast (b/w or pseudo-color) by way of a linear weighted combination of different spectral input values for the same spatial location.
  • Various different predetermined weight combinations lead to different tissue contrasts.
  • the result from hyperspectral image capture is generally an image with an enhanced contrast of the material (liquid or tissue) of interest. This way it is possible, for example, to reveal the position of veins and arteries on the basis of their subtle but distinct spectral response compared to e.g. the skin.
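The linear weighted combination described above can be sketched as follows (illustrative only, not part of the patent disclosure; the function name and the normalization to [0, 1] for display are assumptions):

```python
import numpy as np

def spectral_contrast(cube, weights):
    """Convert a hyperspectral cube of shape (H, W, bands) into a
    single contrast image via a linear weighted combination of the
    band values at each spatial location."""
    img = np.tensordot(cube, np.asarray(weights, dtype=float),
                       axes=([2], [0]))
    # normalize to [0, 1] for display (an illustrative choice)
    img = img - img.min()
    rng = img.max()
    return img / rng if rng > 0 else img
```

Different predetermined weight vectors, applied to the same cube, then yield the "different tissue contrasts" mentioned below.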
  • the corresponding result image depicts the blood vessel structure directly under the observed skin area. It is an object of the invention to project the result image on the observed tissue, in real time and in constant correct alignment with the observed tissue.
  • Fig. 1 illustrates an embodiment of a system and method according to the invention.
  • hyperspectral light is shone from hyperspectral source 2.
  • the light source may be part of the system, as it is in preferred embodiments, or it may be provided separately.
  • the hyperspectral source causes the tissue to form a hyperspectral image, for instance in IR or in UV.
  • the object may itself, independent of the presence of a hyperspectral light source, provide a hyperspectral image, i.e. an image in a wavelength invisible or difficult to see with the human eye.
  • the object may be provided with a substance that, after having been illuminated in the past, phosphoresces in a particular wavelength.
  • the tissue 1 may, even without a light source being shone upon it, provide an IR image showing details in an IR wavelength that are invisible in visible wavelengths.
  • the object may be illuminated with a source that supplies visible light as well as, for instance, UV and/or IR light; a wavelength selective element is then provided in the light path to the camera, or in the camera itself, so that the camera records the hyperspectral image.
  • the camera may be provided with sensor pixels that electronically record the image in visible light and sensor pixels that record the image in a hyperspectral range of radiation, in which case the data from the hyperspectrally sensitive pixels is used for the hyperspectral light field.
  • It is also possible to use a light field camera that comprises pixels sensitive both to visible light and to hyperspectral radiation (for instance the IR and/or UV part of the spectrum), to place wavelength selective filters time-sequentially in front of a source providing visible light as well as hyperspectral radiation, the filters passing either visible light or a hyperspectral part of the spectrum, and to synchronize the data acquisition from the light field camera with the time-sequential illumination so as to provide the light field data in the hyperspectral range and possibly also in the visible part of the spectrum.
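The synchronization bookkeeping for such time-sequential acquisition can be sketched as a simple demultiplexing step (illustrative only; the function name and the state labels are hypothetical):

```python
def demultiplex(frames, filter_states):
    """Split a time-sequential frame stream into one stream per
    illumination/filter state, given the filter state that was
    synchronously active for each captured frame."""
    streams = {}
    for frame, state in zip(frames, filter_states):
        streams.setdefault(state, []).append(frame)
    return streams
```

Each per-state stream can then be processed as a separate (visible or hyperspectral) light field sequence.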
  • the hyperspectral image is taken in a UV or IR range of the electromagnetic spectrum. These embodiments are preferred.
  • the hyperspectral image may be taken in other ranges of the electromagnetic spectrum, for instance by means of X-ray imaging or Terahertz imaging.
  • the light field camera is an X-ray or Terahertz imaging device providing light field data in the X-ray or Terahertz part of the electromagnetic spectrum.
  • the 4D light field provided by the tissue is captured by light field camera 3 through a lens system 5.
  • the lens system 5 comprises a beam splitter 6 and a microlens array 7.
  • the captured light field is denoted by CLF in figure 1 .
  • the light field camera comprises a sensor on which the light field is captured.
  • the data on the captured light field are provided, via an image processor 8, to a light field projector 4.
  • the output of the camera thus provides data for the input of the projector.
  • "Providing data” should, however, not be interpreted as to mean that the camera data are directly supplied to the projector, but that the camera data form a basis for the data for the projector. Processing of the data may be provided in between the output of the light field camera 3 and the input of the light field projector 4.
  • the light field projector projects a light field PLF on the tissue 1 via beam splitter 6 and microlens array 7. It is preferred that the light source forms part of the system. This enables a control of the intensity of the light shining on the object 1.
  • the embodiment of figure 1 shows a system wherein a hyperspectral image is taken in UV or IR. As explained above, such an image can be taken in various ways. For simplicity's sake, no wavelength selective element is shown in the figure. Such a wavelength selective element can for instance be put in front of the source, or in front of the camera, or, if the camera comprises different pixels for visible light than for UV or IR, the data can be electronically filtered, i.e. a data filter can be applied to the data acquired by the light field camera.
  • Due to the generally short focal length of the microlenses in the microlens array, the array tends to create an array of micro images focused very closely behind the lens array.
  • the optical lens system between microlens array 7 and beam splitter 6, and also behind the beam splitter, relays this (micro-)image plane such that it coincides with the sensor plane of the camera and with the plane of the image-generating element in the projector.
  • the image generating element can be for instance an array of light emitting elements, an array of switching mirrors (typically a DLP element), or an array of LCD light shutters.
  • the projector 4 and the camera 3 share a common coaxial optical axis.
  • a common optical axis is illustrated in figure 1 by the fact that the light rays are parallel.
  • the advantage of using a common optical path for image capture and projection is that the projected overlay is in good alignment with the associated tissue. Apart from scaling for differences in sensing- and projecting-element size, no complex 3D processing is required.
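The remaining scaling step, mapping each captured angular view from the sensor grid onto the projector element grid, can be sketched as a nearest-neighbour resampling (illustrative only; a real system would calibrate the two grids and interpolate):

```python
import numpy as np

def rescale_views(lf, ph, pw):
    """Rescale each angular view of a light field lf of shape
    (U, V, H, W) from the sensor grid (H, W) to the projector
    element grid (ph, pw) by nearest-neighbour index mapping."""
    U, V, H, W = lf.shape
    ys = np.arange(ph) * H // ph   # projector row -> sensor row
    xs = np.arange(pw) * W // pw   # projector col -> sensor col
    return lf[:, :, ys][:, :, :, xs]
```

Because capture and projection share the coaxial optical path, this per-view scaling is the only geometric correction needed; no 3D reprojection is involved.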
  • Each microlens can be regarded as a super-pixel that stores angular information in addition to the intensity of the incident light at the location of that 'super-pixel'.
  • a projector that generates the same micro-images in association with a microlens array will lead to a projection whose focal plane coincides with the original surface plane, regardless of its curved shape.
  • the use of a common optical path and alignment of sensor and projector pixels will lead to a projection that is always in focus on the surface that is captured with the camera.
  • the use of a microlens array is preferred since a microlens array does not attenuate the light field.
  • the system can be called a plenoptic hyperspectral augmented-reality system providing range-invariant capture and projection.
  • the beam splitter 6 can also provide spectral selectivity. Particularly when the image capture is primarily in an invisible light domain such as IR, the beam splitter can have a dichroic property. In that case, the incident IR light follows a straight path towards the camera, while the visible light from the projector is reflected by the beam splitter.
  • Figure 2 also illustrates an embodiment of a system according to the invention.
  • a mirror is used to fold the projected light field. This allows, in some circumstances, a more compact design of the system.
  • Figure 3 illustrates a further embodiment.
  • the camera and the projector comprise different microlens arrays.
  • the systems of figures 1 and 2 are preferred but, if for instance the spectral wavelength of the hyperspectral imaging requires a specific microlens material that is less suitable for visible light wavelengths, separate microlens arrays can be used.
  • the camera and the projector share common imaging elements along the common optical axis.
  • Figures 4 and 5 illustrate a preferred embodiment of the system.
  • the system is a mobile, preferably portable system.
  • the system is a hand held system.
  • the system comprises a hyperspectral source within the hand held device, together with a camera and projector; the portable device is used to capture a tissue region and to provide a projection of otherwise invisible data, e.g. on the position of veins, as illustrated in figure 5.
  • Having a properly captured and properly projected sharp image of e.g. veins using a portable device provides great advantages in situations where it is important or even vital to find a vein fast.
  • this is particularly true in an emergency situation such as an accident.
  • Existing systems do not provide the possibility to, accurately and in real time, and on the spot of the accident, provide an image of the position of veins or other hyperspectral details.
  • the portable system of figures 4 and 5 does provide this possibility.
  • the system is hand held.
  • the system may be worn on a helmet or on a sleeve so that the hands are free to insert a needle or perform other medical procedures.
  • Fig. 6 illustrates use of a system according to the invention in a surgical lamp or a dentist lamp.
  • the lamp can optionally provide spectrally selective illumination as part of the hyperspectral image capturing.
  • the system comprises a secondary imaging system, for instance an X-ray imaging system, or more generally a system that produces an internal image of the object under observation, e.g. a system as described in patent application WO2010067281.
  • In Fig. 7 a schematic drawing of a system for such an embodiment is shown.
  • the system comprises an X-ray C-arm with two cameras sensitive to UV, Visible, or Infrared wavelengths attached.
  • the illustrated C-arm X-ray system is composed of a base frame 72 movable on wheels 71, at which a C-arm 73 is seated such that it is rotatable around the axis 74 (angulation) and can also be turned around an axis 75 in the direction of the double arrow 76 (orbital rotation).
  • Although a mobile system is described here, the X-ray system can also be fixed to the wall, as in a cathlab.
  • An X-ray source 77 and a detector 81, preferably a rectangular flat detector, residing 180 degrees opposite one another, are secured to the C-arm 73 in the region of its ends.
  • the X-ray C-arm is capable of acquiring a three-dimensional internal image of the patient.
  • Camera system 82 is attached next to the detector 81 and is capable of capturing images of the patient's operation field.
  • the camera system is capable of three-dimensional imaging of the patient.
  • a hyperspectral imaging system 83 according to the invention is also attached to the detector 81 and is capable of projecting information in visible light back onto the patient in such a way that the images are in focus on the curved surfaces of the patient. For instance, structures such as tumour boundaries are better delineated in the hyperspectral image and can be projected back onto the patient in visible light according to the invention. This makes the tumour boundaries better visible to the surgeon.
  • the back projection of images taken by the X-ray system and converted to visible images is possible by the system 83.
  • the position of the tumour deep inside the body visible with X-ray imaging is projected back onto the patient body.
  • important structures such as large blood vessels that lie just below the surface and are not visible by the eyes can be indicated.
  • the surgeon knows in advance to be careful when making incisions at this position.
  • a similar approach can also be applied to a MRI, CT, PET-CT or Ultrasound system.
  • a terahertz imaging system can also be used. All these systems provide an internal image of an object under observation and in all cases the data sources produce a stream of 2D images which form a secondary data set in addition to the data based on the camera acquisitions.
  • Preferably, means are provided to determine the relative positions of the hyperspectral imaging system and the secondary imaging system. This may be done automatically, for instance by providing an electronic means to measure the X, Y and Z coordinates of both imaging systems and preferably also the orientations or axes of the imaging systems if this information is relevant. This may of course also be done by a manual input of such data.
  • image features, either naturally occurring or specifically placed within the range of the respective images, present in both the hyperspectral and the secondary image may be used to align the hyperspectral and secondary images. For instance, small metal objects placed on the patient at various points, which would show in the hyperspectral as well as the visible and X-ray images, could be used for this purpose.
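The marker-based alignment described above can be sketched numerically: from the (x, y) positions of the same fiducials (e.g. the small metal objects) found in both images, an isotropic scale and a translation can be fitted by least squares. This is a minimal sketch, not the disclosed system's algorithm; the function name is illustrative and rotation is deliberately ignored for simplicity.

```python
def fit_scale_translation(src, dst):
    """Least-squares fit of an isotropic scale s and translation (tx, ty)
    mapping marker positions seen in one image (src) onto the same
    markers seen in the other (dst), i.e. dst ~= s * src + t.
    Points are lists of (x, y) tuples."""
    n = len(src)
    # Centroids of both point sets.
    mx = sum(p[0] for p in src) / n
    my = sum(p[1] for p in src) / n
    ux = sum(q[0] for q in dst) / n
    uy = sum(q[1] for q in dst) / n
    # Scale: ratio of the cross-covariance to the source variance.
    num = sum((p[0] - mx) * (q[0] - ux) + (p[1] - my) * (q[1] - uy)
              for p, q in zip(src, dst))
    den = sum((p[0] - mx) ** 2 + (p[1] - my) ** 2 for p in src)
    s = num / den
    # Translation follows from matching the centroids.
    return s, (ux - s * mx, uy - s * my)
```

With the markers located in the hyperspectral image as `src` and in the secondary image as `dst`, the fitted scale and translation can map any further point between the two images.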
  • Figure 8 illustrates the system of figure 7 further.
  • the use of such secondary image data, coming for instance from the X-ray system, requires the explicit calculation of a depth map d(x,y) describing the distance d between the plenoptic camera/projector and the tissue surface for every pixel (x,y) of the projector.
  • This is in contrast to the plenoptic camera data itself, which only requires a spatial interpolation to match the input pixel grid of the plenoptic camera to the projector's output pixel grid.
  • the captured light field comprises depth information.
  • various solutions have been proposed, e.g., by Bishop et al. in T. Bishop, P. Favaro, "Plenoptic depth estimation from multiple aliased views", in: 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), IEEE, pp. 1622-1629, Los Alamitos, 2009, and by Wanner et al. in S. Wanner, J. Fehr, B. Jaehne, "Generating EPI representations of 4D light fields with a single lens focused plenoptic camera", in: Proc. ISVC 2011, G. Bebis et al. eds., pp. 90-101, 2011.
  • the recovered depth map d(x,y) is then used in part 9 to reformat the image from the secondary data source into an array of micro-images. In case of proper alignment with the microlens array, the secondary data will then also project in proper focus on the tissue surface, regardless of its shape and orientation.
  • the part 9 may also have an input for inputting data on the relative positions and/or orientations of the hyperspectral and X-ray imaging system.
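That the captured light field itself contains the depth information needed for the map d(x,y) can be illustrated with a toy sketch: the same scene point appears shifted (disparity) between two sub-aperture views extracted from the 4D light field, and disparity relates inversely to depth. Real plenoptic depth estimation, e.g. the Bishop et al. or Wanner et al. methods cited above, works per pixel and is far more elaborate; the array layout L[s, t, u, v] and the single global brute-force search below are assumptions for illustration only.

```python
import numpy as np

def subaperture(L, u, v):
    """Extract one sub-aperture view from a 4D light field L[s, t, u, v]
    by taking the same angular sample (u, v) under every microlens."""
    return L[:, :, u, v]

def estimate_disparity(view_a, view_b, max_d=3):
    """Brute-force estimate of a single global horizontal disparity
    between two sub-aperture views: the best shift is the one that makes
    the views coincide.  Disparity is inversely related to depth d."""
    errs = {d: float(np.abs(np.roll(view_b, d, axis=1) - view_a).sum())
            for d in range(-max_d, max_d + 1)}
    return min(errs, key=errs.get)
```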
  • Fig. 9 illustrates the principle of using a micro lens to capture a light field and project a light field.
  • the top part of Fig. 9 illustrates capturing of a light field.
  • Plenoptic imaging stores spatial information of the incident light field.
  • the 3D information is stored in small micro-images, each of which is produced by a single microlens of the microlens array.
  • the captured light field is in fact 4-dimensional, as each light ray is characterized by a 2D location on the sensor and a horizontal and vertical angle of incidence, adding 2 more dimensions.
  • Each microlens can be regarded as a super-pixel that not only stores angular information but also the intensity of the incident light at the location of that 'super-pixel'.
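The micro-image description above maps directly onto a 4D array: each tile of the raw sensor image is the micro-image of one microlens ('super-pixel'), indexed by its lens position (s, t) and the angular sample (u, v) within the tile. The sketch below, assuming square micro-images of side n, shows the reindexing in both directions, the reverse direction being the pattern a light field projector would need to display:

```python
import numpy as np

def raw_to_lightfield(raw, n):
    """Reindex a raw plenoptic sensor image into a 4D light field
    L[s, t, u, v]: (s, t) = microlens index, (u, v) = angular sample
    within that lens's n x n micro-image."""
    H, W = raw.shape
    assert H % n == 0 and W % n == 0
    return raw.reshape(H // n, n, W // n, n).swapaxes(1, 2)

def lightfield_to_raw(L):
    """Inverse operation: tile the micro-images back into the flat
    pattern that a sensor records or a projector panel must display."""
    S, T, n, _ = L.shape
    return L.swapaxes(1, 2).reshape(S * n, T * n)
```

The round trip is lossless, reflecting that capture and projection use the same micro-image representation.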
  • the bottom part of fig. 9 illustrates projecting a light field from the pixels of projector 4.
  • the light rays are reversed.
  • a projector that generates the same micro-images in association with a microlens array will lead to a projection of which the focal plane coincides with the original surface plane, regardless of its curved shape.
  • the use of a common optical path and alignment of sensor and projector pixels will lead to a projection that is always in focus on the surface that is captured with the camera. If all elements are exactly the same, same size, same position etc., there is a simple one-to-one relation between the pixels of the camera and the pixels of the projector. In reality the two may differ in size or exact location. However, the relation remains a simple task of translation (T) and scaling (S). This is performed in processor 8.
  • the task of translating could also be done mechanically by providing the projector or the camera with a means for translating the sensor or projecting surface in x and y-direction.
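The translation (T) and scaling (S) performed in processor 8 reduce, per sensor pixel, to one multiply and one add per coordinate. A minimal sketch, in which the function name and the rounding policy are illustrative assumptions:

```python
def camera_to_projector(x, y, scale, tx, ty):
    """Map a camera sensor pixel (x, y) to the corresponding projector
    pixel when the two pixel arrays differ only in element size (scale)
    and position (translation), as described for processor 8.
    Rounds to the nearest integer projector pixel."""
    return (round(scale * x + tx), round(scale * y + ty))
```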
  • the microlens array 7 increases the correspondence between the optical paths of image recording and projection thereby simplifying the processing.
  • Figure 10 illustrates a method for finding the required translation and scaling factors.
  • a test image T is provided, this test image T is recorded by the camera 3 which sends the data on the recorded image to processor 8; the processor 8 applies an initial T and S transformation, found for instance by a previous computer generated optical ray tracing assuming known features of camera and projector, to the data and sends it to projector 4.
  • the projected image is compared to the test image, which can for instance be done with a separate camera capable of recording both the hyperspectral image and the projected image. If the test image and the projected image coincide, the preset values for T and S are used; if not, the values of T and S are varied until the test image and the projected image coincide. This is one way of finding the T and S values.
  • a method for aligning the light field camera and light field projector of a system according to the invention is shown, in which the translation and scaling factors T and S are adjusted to align a test image T with the projected light field image. In preferred methods according to the invention this testing and alignment procedure is done prior to acquiring and projecting light field images.
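The "vary T and S until the test image and the projected image coincide" procedure of figure 10 can be sketched as an exhaustive search. The toy version below varies only an integer translation and uses a cyclic shift in place of the real optical displacement; the mismatch measure and search range are assumptions for illustration.

```python
import numpy as np

def find_translation(test_img, captured_img, search=2):
    """Return the integer translation (tx, ty) that minimizes the
    mismatch between the re-captured projection and the test image,
    mimicking the calibration loop of figure 10 on toy data."""
    best_err, best_t = None, (0, 0)
    for tx in range(-search, search + 1):
        for ty in range(-search, search + 1):
            # Candidate correction applied to the captured projection.
            shifted = np.roll(captured_img, (tx, ty), axis=(0, 1))
            err = float(np.abs(shifted - test_img).sum())
            if best_err is None or err < best_err:
                best_err, best_t = err, (tx, ty)
    return best_t
```

A real system would extend the same search (or a gradient-based variant) to the scale factor S as well.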
  • the different elements of a system may be, and preferably are, in a single device, but various elements may be at various physical positions, for instance when light field data are sent from the light field camera to part 8 to be processed to provide projection light field data for projector 4.
  • This part 8 may be in the same device as the camera and the projector, and preferably is, but can also be in a CPU or on a site on the internet or shared by various systems.
  • the data can be transmitted from camera 3 to part 8 by any means for transmission of data, by wire as well as wireless. The same holds for data from part 8 to the projector 4.
  • the invention also relates, for those embodiments in which the invention is partly implemented by means of software, to a computer program product comprising program code means stored on a computer readable medium for performing a method according to the invention, and to a computer program product to be loaded by a computer arrangement, comprising instructions for a method according to the invention.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Vascular Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Geometry (AREA)
  • Theoretical Computer Science (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
  • Photoreceptors In Electrophotography (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Description

    FIELD OF THE INVENTION
  • The invention relates to a system for hyperspectral imaging comprising a camera for recording a hyperspectral image of an object and a display device for displaying the recorded hyperspectral image in visible light, and to a method for recording a hyperspectral image and displaying the hyperspectral image in visible light.
  • BACKGROUND OF THE INVENTION
  • Hyperspectral imaging is known to reveal details impossible or difficult to see by the human eye, such as for instance tissue differences in a human. In hyperspectral imaging an image is taken of an object in one or more wavelength bands where at least one wavelength band is at least partly invisible to the human eye or at least very difficult to see. This image is then converted into a visible image, which image is provided in visible light to a viewer. Hyperspectral imaging can be based both on spectrally selective illumination (i.e. illuminating an object with light in a certain wavelength band) and on spectrally selective filtering (i.e. using a filter that transmits only light in a certain wavelength band) prior to image capture. In both cases image processing is required to generate a resulting image revealing the structure contrast of interest.
  • In such a system conventionally a hyperspectral image (e.g. an image extending beyond the visible spectrum) is taken and the result is shown on a display screen. Sitting behind the display screen, the viewer, interested in invisible or hardly visible details of the object under observation, can study the image on the screen in visible light as it would appear in, for instance, UV light or IR light.
  • Although using a screen is a very useful technique, the possibilities are limited. It has been proposed to project a hyperspectral image on a studied object, for instance in R.K. Miyake, H.D. Zeman, F.H. Duarte, R. Kikuchi, E. Ramacciotti, G. Lovhoiden, C. Vrancken, "Vein imaging: A new method of near infrared imaging where a processed image is projected onto the skin for the enhancement of vein treatment", Dermatologic, Surgery, vol. 32, pp. 1031-1038, 2006. The projection is performed with a laser projector.
  • It is difficult, if not near impossible, using the known technique to provide a good sharp projection wherein the projection coincides with the object to a relatively high degree of alignment, unless the object, in the known prior art the skin, is stationary and to a high degree flat. US2012/0200829 A1 discloses plenoptic cameras and detectors. US2010/0177184 A1 discloses an infrared imaging system and projector to enhance the visibility of vascular structures in the body of a patient. Taheri et al., Skin Research and Technology 2013, 19, 288-290, discloses the use of light field cameras for imaging of body parts. EP2075616A1 discloses an imaging device and projector for projecting the recorded image of the body part of a patient on the same body part for surgical applications. US2012/0170824 A1 discloses an imaging and projection device comprising visible and X-ray imaging sensors for projecting an X-ray image on the surface of the body of a patient.
  • It is an object of the invention to provide a system and a method that allows directly seeing hyperspectral details of an object under observation and in correct alignment.
  • SUMMARY OF THE INVENTION
  • To this end the system of the invention is characterized in that the camera is a light field capturing camera for recording a hyperspectral image of a patient in a spectral range of radiation at least including one of UV, IR and Terahertz radiation, and the display device is a light field projector, wherein the camera and projector share a coaxial optical path and wherein the camera is arranged to capture a hyperspectral light field and comprises an output for sending data on the captured light field to an input of the light field projector, and the light field projector is arranged to project a light field in visible light over the patient based on the data received from the camera. The system further comprises a secondary imaging system for providing secondary image data on a three-dimensional internal image of the patient, wherein the system comprises a processor to provide depth information based on the data of the captured light field and to format, based on the depth information, the secondary data into an image projected on the object. The secondary imaging system is one of an X-ray system, MRI, CT, PET-CT or ultrasound system.
  • To this end the method of the invention is characterized in that a light field, in a range of radiation at least including one of UV, IR and Terahertz radiation, of a patient is captured by a light field camera, the data on the light field captured by the camera is processed to provide projection image data for a light field projector, the light field projector projecting a light field based on the projection image data over the patient, wherein the camera and projector share a coaxial optical path and a light field in visible light is projected on the object by the light field projector. Secondary image data is provided on a three-dimensional internal image of the patient, wherein the secondary image is obtained using one of an X-ray, MRI, CT, PET-CT or Ultrasound system, and wherein said secondary image data is reformatted using the depth information and said reformatted data is provided to the light field projector.
  • The light field capturing camera captures a light field in a hyper range of the spectrum, i.e. in a spectral range of radiation at least partly not visible to the human eye, and the light field projector projects a light field in visible light. The light field projector forms a display device for displaying the recorded hyperspectral image in visible light. The projected light field causes the display of a projected 3D image overlaying the object, said 3D image being sharp throughout a large range of depths. A shared coaxial optical path provides for a relatively easy alignment of captured and projected light fields. This allows accurate and real time projection by the projector of the hyperspectral image in visible light on the object of observation of which the camera has captured the hyperspectral light field, also in case the object of observation is not flat but has a 3D form.
  • A light field capturing camera has, compared to a regular 2D or even a 3D camera, the advantage that a complete light field is obtained, with the possibility of obtaining sharp images throughout a range of depths. A normal 2D camera does not provide a large depth of view and although a 3D camera can provide some depth information, neither is capable of providing a sharp image throughout a range of depths. A light field camera is also called a plenoptic camera. A light-field camera is a camera that captures light field information about a scene using plenoptic imaging. Plenoptic imaging captures an incident light field preserving both intensity and direction of incident light. The implementation of a plenoptic imaging system can be based on various techniques: a microlens array as in M. Levoy et al., "Light field microscopy", ACM Trans. on Graphics, vol. 25, no. 3, pp. 924-934, July 2006; dappled photography with a continuously graded attenuation mask as in A. Veeraraghavan et al., "Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing", ACM Trans. on Graphics (Proc. SIGGRAPH 2007), vol. 26, no. 3, July 2007; an aperture encoding mask as in A. Levin et al., "Image and depth from a conventional camera with a coded aperture", ACM Trans. on Graphics (Proc. SIGGRAPH 2007), vol. 26, no. 3, July 2007; a wavefront encoder as in E.R. Dowski et al., "Extended depth of field through wave-front coding", Applied Optics, vol. 34, no. 11, pp. 1859-1866, Apr. 1995; sweeping-focus imaging as in H. Nagahara et al., "Flexible Depth of Field Photography", in Proc. ECCV 2008, Oct. 2008. Plenoptic imaging stores spatial information of the incident light field. The captured light field is in fact 4-dimensional, as each light ray is characterized by a 2D location on the sensor and a horizontal and vertical angle of incidence, adding 2 more dimensions.
The projected light field creates an image on the object that is sharp throughout a large range of optical depths.
  • Of the techniques described above using a microlens array is preferred. In a continuously graded attenuation mask and in an aperture encoding mask some of the light passing through the mask is attenuated, leading to a loss in intensity. In a microlens array, a higher percentage of the available light is used.
  • A microlens array is situated between a lens and an image sensor of the plenoptic camera. The microlens array refocuses light captured by the lens onto the image sensor thereby creating many small images taken from slightly different viewpoints. The 3D information is stored in the small images, each of which is produced by a single microlens. Each of the small images has a relatively low spatial resolution.
  • Another type of light field capturing camera, one that does not use a microlens array, is a system that uses a so-called sweeping focus or sweeping lens technique. In such cameras the focusing lens and/or the sensor position is changed during capturing of the image. This technique amounts to integrating images over the sweep of the focus (i.e. over a particular range of depths of field). The resulting image comprises all image information for the focus sweep and also captures all available light. The image taken can be deconvolved to provide sharp images at various depths and reconstruct a plenoptic projected light field. Using a microlens array is preferred since the light field can be obtained instantaneously. Using a microlens it is relatively easy to align the captured light field, captured by the camera, and the projected light field, projected by the projector.
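The sweep-and-deconvolve idea can be illustrated in one dimension: the sensor integrates the scene blurred by every focus setting visited during the sweep, and because the combined blur is known it can be divided out again in the Fourier domain. A minimal sketch using circular convolution and a naive damped inverse filter; real systems use regularized deconvolution and the kernels below are illustrative.

```python
import numpy as np

def circ_conv(x, k):
    """Circular convolution via the FFT (keeps the toy model exact)."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

def sweep_capture(scene, sweep_kernels):
    """Sensor output of a focal sweep: the average of the scene blurred
    by each defocus kernel visited during the sweep."""
    return sum(circ_conv(scene, k) for k in sweep_kernels) / len(sweep_kernels)

def sweep_deconvolve(captured, sweep_kernels, eps=1e-8):
    """Recover an all-in-focus signal by inverting the combined sweep
    kernel in the Fourier domain (damped inverse filter)."""
    K = np.fft.fft(sum(sweep_kernels) / len(sweep_kernels))
    X = np.fft.fft(captured) * np.conj(K) / (np.abs(K) ** 2 + eps)
    return np.real(np.fft.ifft(X))
```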
  • Preferably the camera and the projector share a common chain of optical imaging elements along the shared coaxial optical axis. This enables better alignment of the captured light field and the projected light field.
  • Preferably the system comprises an element providing a plenoptic function being positioned in the shared coaxial optical path.
  • Providing the element providing a plenoptic function in the shared coaxial path increases the ease of alignment of captured and projected light field.
  • Such elements may be a microlens array, a coded aperture or a wavefront encoder.
  • Of these elements the microlens array is preferred.
  • In embodiments the system comprises a beam splitter for splitting light paths from and to the common optical axis to and from the light field camera and light field projector respectively, the beam splitter having a dichroic, spectrally selective property. The dichroic beam splitter passes or reflects light in the hyperspectral range to the camera, while reflecting or passing light in the visible range coming from the projector.
  • In another embodiment a spectrally selective illumination is used.
  • In preferred embodiments the system is a mobile, preferably portable system, for instance a hand held system. This allows e.g. a physician to view veins immediately and on the spot. When needles have to be inserted in a vein, such on-the-spot inspection is a great advantage.
  • In another preferred embodiment the system is part of a surgical luminaire.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other objects and advantageous aspects will become apparent from exemplary embodiments that will be described using the following Figs.
    • Fig. 1 illustrates an embodiment of a system according to the invention;
    • Fig. 2 illustrates another embodiment of a system according to the invention;
    • Fig. 3 illustrates another embodiment of a system according to the invention;
    • Fig. 4 illustrates a hand-held device comprising a system according to the invention;
    • Fig. 5 illustrates vein image enhancement using a hand-held system as shown in Fig. 4;
    • Fig. 6 illustrates a surgical or dentist lamp comprising a system according to the invention;
    • Figs. 7 and 8 illustrate an X-ray system comprising a system according to the invention;
    • Fig. 9 illustrates the principle of using a micro lens to capture a light field and project a light field;
    • Fig. 10 illustrates a method for fine tuning correspondence between captured and projected light field.
  • The figures are not drawn to scale. Generally, identical components are denoted by the same reference numerals in the figures.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • It is an object of the invention to provide a result image as projection on the observed tissue, yet in such a way that the projection is always in correct focus on the tissue, regardless of the surface curvature of the tissue or its orientation with respect to the capturing/projecting device.
  • Hyperspectral imaging provides contrast, e.g. tissue contrast that is invisible by the naked eye. The improved contrast can for instance be used to reveal blood vessels and nerves during surgery or introduction of needles into veins. It can also be used to identify malignant tissue.
  • Hyperspectral image capture can be based on a monochrome, non-spectral-selective image sensor and the use of spectrally selective filtering prior to the image sensor, similar to a normal RGB camera but with more color channels and with different filter characteristics. Alternatively, hyperspectral image capture can also be based on spectrally selective (controlled) illumination in combination with an unfiltered image sensor. A combination of 'filtered illumination' and 'filtered acquisition' is also possible.
  • Differences in spectral response between different materials are generally converted into a visible contrast (b/w or pseudo-color) by way of a linear weighted combination of different spectral input values for the same spatial location. Various predetermined weight combinations lead to different tissue contrasts. As such, the result from hyperspectral image capture is generally an image with an enhanced contrast of the material (liquid or tissue) of interest. This way it is possible, for example, to reveal the position of veins and arteries on the basis of their subtle but distinct spectral response compared to e.g. the skin. The corresponding result image depicts the blood vessel structure directly under the observed skin area. It is an object of the invention to project the result image on the observed tissue, in real time and in constant correct alignment with the observed tissue.
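The linear weighted combination described above is simple to state in code: one predetermined weight per spectral channel, summed per spatial location. The function names and weights below are purely illustrative; in practice a weight set is chosen for the tissue contrast of interest.

```python
def spectral_contrast(bands, weights):
    """Linear weighted combination of the spectral input values of one
    spatial location, yielding a single contrast value (b/w, or one
    channel of a pseudo-color)."""
    return sum(w * b for w, b in zip(weights, bands))

def contrast_image(cube, weights):
    """Apply the combination to a hyperspectral cube given as rows of
    per-pixel band lists, producing a 2D contrast image."""
    return [[spectral_contrast(px, weights) for px in row] for row in cube]
```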
  • A disadvantage of current hyperspectral imaging systems in general is that the result data appears separately on a display screen, such that the geometrical relation with the real tissue is easily lost. Nowadays, the use of augmented reality glasses is a popular method to keep the result data, generated in the glasses, which form the display screens, in constant alignment with the tissue that is observed by the surgeon. A major disadvantage is that this requires a head-mounted device of which the location and orientation is tracked against the position of the working area, adding to the complexity of such solutions. Moreover, it forces the specialist to wear special glasses. Also, in operating theatres many people are present; if only the specialist wears special glasses, assistants are not able to see what the specialist is seeing, unless they also wear glasses and the position and orientation of those glasses is also tracked, adding immense complexity to the system.
  • It is an object of the invention to provide a system and a method allowing directly seeing hyperspectral details of an object under observation and in correct alignment without the need for 3D object-tracking or wearing special glasses.
  • Fig. 1 illustrates an embodiment of a system and method according to the invention.
  • On object 1, in this example a human tissue, hyperspectral light is shone from hyperspectral source 2. The light source may be part of the system, and in preferred embodiments it is, or could be separately provided. The hyperspectral source causes the tissue to form a hyperspectral image, for instance in IR or in UV. Alternatively the object may itself, independent from the presence of a hyperspectral light source, provide a hyperspectral image, i.e. an image in a wavelength not visible or difficult to see with the human eye. For instance, the object may be provided with a substance that, after having been illuminated in the past, phosphoresces in a particular wavelength.
  • Alternatively or in addition the tissue 1 may, even without a light source being shone upon it, provide an IR image showing details in an IR wavelength that are invisible in visible wavelengths. Alternatively or in addition, the object may be illuminated with a source that supplies visible light as well as, for instance, UV and/or IR light, and a wavelength selective element is provided in the light path to the camera or in the camera so that the camera records the hyperspectral image.
  • Alternatively or in addition, the camera may be provided with sensor pixels that electronically record the image in visible light and sensor pixels that record the image in a hyperspectral range of radiation and the data from the hyperspectral sensitive pixels is used for the hyperspectral light field.
  • It is also possible to use a light field camera that comprises pixels sensitive both to visible light and to hyperspectral radiation (for instance the IR and/or UV part of the spectrum), to time-sequentially put wavelength selective filters in front of a source providing visible light as well as hyperspectral radiation, wherein the filters pass either visible light or a hyperspectral part of the spectrum, and to synchronize the data acquisition from the light field camera with the time-sequential illumination to provide the light field data in the hyperspectral range and possibly also in the visible part of the spectrum.
  • In embodiments the hyperspectral image is taken in a UV or IR range of the electromagnetic spectrum. These embodiments are preferred.
  • However, the hyperspectral image may be taken in other ranges of the electromagnetic spectrum, for instance by means of X-ray imaging or Terahertz imaging.
  • For such embodiments the light field camera is an X-ray or Terahertz imaging device providing light field data in the X-ray or Terahertz part of the electromagnetic spectrum.
  • The 4D light field provided by the tissue is captured by light field camera 3 through a lens system 5. The lens system 5 comprises a beam splitter 6 and a microlens array 7. The captured light field is denoted by CLF in figure 1. The light field camera comprises a sensor on which the light field is captured. The data on the captured light field are provided, via an image processor 8, to a light field projector 4. The output of the camera thus provides data for the input of the projector. "Providing data" should, however, not be interpreted as to mean that the camera data are directly supplied to the projector, but that the camera data form a basis for the data for the projector. Processing of the data may be provided in between the output of the light field camera 3 and the input of the light field projector 4. The light field projector projects a light field PLF on the tissue 1 via beam splitter 6 and microlens array 7. It is preferred that the light source forms part of the system. This enables control of the intensity of the light shining on the object 1. The embodiment of figure 1 shows a system wherein a hyperspectral image is taken in UV or IR. As explained above, such an image can be taken in various ways. For simplicity's sake no wavelength selective element has been shown in the figure. Such a wavelength selective element can for instance be put in front of the source, or in front of the camera or, if the camera comprises different pixels for visible light than for UV or IR, the data can be electronically filtered, i.e. by means of a data filter to filter the data acquired by the light field camera.
  • Due to the generally short focal length of the microlenses in the microlens array, the microlens array tends to create an array of micro images focused very closely behind the lens array. The optical lens system between microlens array 7 and beam splitter 6, and also behind the beam splitter, relays this (micro-)image plane such that the micro image plane coincides with the sensor plane of the camera and with the plane of the image-generating element in the projector. The image generating element can be for instance an array of light emitting elements, an array of switching mirrors (typically a DLP element), or an array of LCD light shutters.
  • The projector 4 and the camera 3 share a common coaxial optical axis. A common optical axis is illustrated in figure 1 by the fact that the light rays are parallel. The advantage of using a common optical path for image capture and projection is that the projected overlay is in good alignment with the associated tissue. Apart from scaling for differences in sensing- and projecting-element size, no complex 3D processing is required.
  • Each microlens can be regarded as a super-pixel that stores not only angular information but also the intensity of the incident light at the location of that 'super-pixel'. Similarly, a projector that generates the same micro-images in association with a microlens array will lead to a projection of which the focal plane coincides with the original surface plane, regardless of its curved shape. The use of a common optical path and alignment of sensor and projector pixels will lead to a projection that is always in focus on the surface that is captured with the camera. The use of a microlens array is preferred since a microlens array does not attenuate the light field.
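The 'super-pixel' structure described above can be made concrete with a small sketch: a raw plenoptic sensor image is rearranged into a 4D array indexed by microlens position (spatial) and position under the microlens (angular). The function name and the assumption of square microlenses exactly aligned to the pixel grid are illustrative only, not taken from the patent.

```python
import numpy as np

def sensor_to_light_field(sensor, lenslet_px):
    """Rearrange a raw plenoptic sensor image into a 4D light field.

    sensor     : 2D array, raw image behind the microlens array
    lenslet_px : number of sensor pixels under each (square) microlens

    Returns L[s, t, u, v]: (s, t) selects a microlens (the 'super-pixel',
    i.e. spatial position), (u, v) selects a pixel under that lens
    (i.e. angle of incidence).
    """
    h, w = sensor.shape
    S, T = h // lenslet_px, w // lenslet_px
    L = (sensor[:S * lenslet_px, :T * lenslet_px]
         .reshape(S, lenslet_px, T, lenslet_px)
         .transpose(0, 2, 1, 3))  # reorder axes to (s, t, u, v)
    return L

# Toy example: 6x6 sensor, 3x3 pixels per microlens -> 2x2 microlenses
raw = np.arange(36).reshape(6, 6)
L = sensor_to_light_field(raw, 3)
print(L.shape)  # -> (2, 2, 3, 3)
```

Each `L[s, t]` is then exactly the micro-image produced by one microlens of the array.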
  • The system can be called a plenoptic hyperspectral augmented-reality system providing range-invariant capture and projection.
  • Depending on the application, the beam splitter 6 can also provide spectral selectivity. Particularly when the image capture is primarily in an invisible light domain such as IR, the beam splitter can have a dichroic property. In that case, the incident IR light follows a straight path towards the camera, and the visible light from the projector is refracted by the beam splitter.
  • Figure 2 also illustrates an embodiment of a system according to the invention. A mirror is used to fold the projected light field. This allows, in some circumstances, a more compact design of the system.
  • Figure 3 illustrates a further embodiment. In this embodiment the camera and the projector comprise different microlens arrays. The systems of figures 1 and 2 are preferred but, if for instance the spectral wavelength of the hyperspectral imaging requires a specific material for the microlens that is less suitable for visible light wavelengths, separate microlens arrays can be used. In figures 1 to 3 the camera and the projector share common imaging elements along the common optical axis.
  • Figures 4 and 5 illustrate a preferred embodiment of the system. In this embodiment the system is a mobile, preferably portable, system; here it is a hand held system. The system comprises a hyperspectral source within the hand held device as well as a camera and projector, wherein the portable device is used to capture a tissue region and provide a projection of otherwise invisible data, e.g. on the position of veins, as illustrated in figure 5. Having a properly captured and properly projected sharp image of e.g. veins using a portable device provides great advantages in situations wherein it is important or even vital to find a vein fast. When inserting a needle into a vein in e.g. an emergency situation, such as an accident, it may be critical or even a matter of life or death to work fast and accurately and to need only a relatively simple device which can be operated easily and brought to the scene. Existing systems do not provide the possibility to provide, accurately, in real time and on the spot of the accident, an image of the position of veins or other hyperspectral details. The portable system of figures 4 and 5 does provide this possibility. In this example the system is hand held; the system may alternatively be worn on a helmet or on a sleeve so that the hands are free to insert a needle or perform other medical procedures.
  • Fig. 6 illustrates use of a system according to the invention in a surgical lamp or a dentist lamp. The lamp can optionally provide spectrally selective illumination as part of the hyperspectral image capturing.
  • According to the invention the system comprises a secondary imaging system, for instance an X-ray imaging system, or more generally a system that produces an internal image of the object under observation, e.g. a system as described in patent application WO2010067281 .
  • In Figure 7 a schematic drawing of a system for such an embodiment is shown. The system comprises an X-ray C-arm with two cameras attached that are sensitive to UV, visible, or infrared wavelengths. The illustrated C-arm X-ray system is composed of a base frame 72, movable on wheels 71, on which a C-arm 73 is seated such that it is rotatable around the axis 74 (angulation) and can also be turned around an axis 75 in the direction of the double arrow 76 (orbital rotation). Although a mobile system is described here, the X-ray system can also be fixed to the wall, as in a cathlab. An X-ray source 77 and a detector 81, preferably a rectangular flat detector, residing 180 degrees opposite one another, are secured to the C-arm 73 in the region of its ends.
  • The X-ray C-arm is capable of acquiring a three-dimensional internal image of the patient. Camera system 82 is attached beside the detector 81 and is capable of capturing images of the patient's operation field. In a particular embodiment the camera system is capable of three-dimensional imaging of the patient. Furthermore, a hyperspectral imaging system 83 according to the invention is also attached to the detector 81 and is capable of projecting information in visible light back onto the patient in such a way that the images are in focus on the curved surfaces of the patient. For instance, structures such as tumour boundaries are better delineated in the hyperspectral image and can, according to the invention, be projected back onto the patient in visible light. This makes the tumour boundaries better visible to the surgeon. Apart from this hyperspectral back projection, the back projection of images taken by the X-ray system and converted to visible images is also possible with the system 83. For instance, the position of a tumour deep inside the body, visible with X-ray imaging, is projected back onto the patient's body. In this way, the surgeon has a much better indication of where the tumour is located. Also, important structures such as large blood vessels that lie just below the surface and are not visible to the eye can be indicated. In this way, the surgeon knows in advance to be careful when making incisions at this position. Instead of an X-ray system, a similar approach can also be applied to an MRI, CT, PET-CT or ultrasound system. A terahertz imaging system can also be used. All these systems provide an internal image of an object under observation, and in all cases the data sources produce a stream of 2D images which form a secondary data set in addition to the data based on the camera acquisitions.
  • In the system of figure 7 the relative positions of the hyperspectral imaging system and the secondary imaging system (the X-ray system in figure 7) are known and fixed. This enables a relatively simple matching of hyperspectral and internal imaging.
  • In systems wherein the relative positions of the hyperspectral imaging system and the secondary internal imaging system are variable to a greater or lesser extent, means are preferably provided to determine the relative positions of the hyperspectral imaging system and the secondary imaging system. This may be done automatically, for instance by providing electronic means to measure the X, Y and Z coordinates of both imaging systems, and preferably also the orientation or axes of the imaging systems if this information is relevant. It may of course also be done by manual input of such data. Alternatively or in addition, image features present in both the hyperspectral and the secondary image, either naturally occurring or specifically placed within the range of the respective images, may be used to align the hyperspectral and secondary images. For instance, small metal objects placed on the patient at various points, which would show in the hyperspectral as well as the visible as well as the X-ray images, could be used for this purpose.
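The marker-based alignment described above can be sketched as a point-correspondence fit: from the positions of the same fiducials detected in both images, a transform mapping one image onto the other is estimated. The closed-form 2D similarity fit below (an Umeyama-style least-squares solution) is only one possible choice and is not prescribed by the patent; the function name and the restriction to a 2D similarity transform are assumptions.

```python
import numpy as np

def similarity_from_markers(p_hyper, p_secondary):
    """Estimate a 2D similarity transform (scale s, rotation R, translation t)
    mapping marker positions seen in the secondary (e.g. X-ray) image onto
    the same markers seen in the hyperspectral image, by least squares.
    Both inputs are (N, 2) arrays of corresponding marker coordinates.
    """
    src = np.asarray(p_secondary, float)
    dst = np.asarray(p_hyper, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance of the point sets
    U, D, Vt = np.linalg.svd(cov)
    sign = np.sign(np.linalg.det(U @ Vt))     # guard against reflections
    R = U @ np.diag([1.0, sign]) @ Vt
    s = (D * [1.0, sign]).sum() / src_c.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Markers in the secondary image related to the hyperspectral image by
# scale 2, a 90-degree rotation and a shift of (1, 0):
src = np.array([[0, 0], [1, 0], [0, 1]], float)
R90 = np.array([[0, -1], [1, 0]], float)
dst = 2.0 * src @ R90.T + np.array([1.0, 0.0])
s, R, t = similarity_from_markers(dst, src)
```

With noise-free correspondences the fit recovers the transform exactly; with real detections it gives the least-squares best match.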
  • Figure 8 further illustrates the system of figure 7. The use of such secondary image data, coming for instance from the X-ray system, requires the explicit calculation of a depth map d(x,y) describing the distance d between the plenoptic camera/projector and the tissue surface for every pixel (x,y) of the projector. This is in contrast to the plenoptic camera data itself, which only requires a spatial interpolation to match the input pixel grid of the plenoptic camera to the projector's output pixel grid.
  • The captured light field comprises depth information. To recover the distance profile from the captured light-field data, various solutions have been proposed, e.g., by Bishop et al. in T. Bishop, P. Favaro, "Plenoptic depth estimation from multiple aliased views", in: 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), IEEE, pp. 1622-1629, Los Alamitos, 2009, and by Wanner et al. in S. Wanner, J. Fehr, B. Jaehne, "Generating EPI representations of 4D light fields with a single lens focused plenoptic camera", in: Proc. ISVC 2011, G. Bebis et al. eds., pp. 90-101, 2011. This then becomes an extra task that is performed by the processing block 8 in Figure 8. The recovered depth map d(x,y) is then used in part 9 to reformat the image from the secondary data source into an array of micro-images. In case of proper alignment with the microlens array, the secondary data will then also project in proper focus on the tissue surface, regardless of its shape and orientation. Although not shown, part 9 may also have an input for data on the relative positions and/or orientations of the hyperspectral and X-ray imaging systems.
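As an illustration of the kind of depth recovery performed in processing block 8, the toy sketch below estimates a per-pixel disparity from a 4D light field by testing candidate disparities and keeping, for each spatial position, the one that makes the angular views most photo-consistent. This is a simplified stand-in for the cited methods of Bishop et al. and Wanner et al., not their actual algorithms; the function name and the nearest-neighbour view shifting are assumptions.

```python
import numpy as np

def depth_from_light_field(L, disparities):
    """Toy depth-from-correspondence estimate on a 4D light field
    L[s, t, u, v] (spatial s, t; angular u, v). For each candidate
    disparity the angular views are shifted back into alignment; the
    disparity giving the lowest variance across views at each spatial
    position wins (photo-consistency).
    """
    S, T, U, V = L.shape
    uc, vc = (U - 1) / 2, (V - 1) / 2          # centre of the angular grid
    best_cost = np.full((S, T), np.inf)
    best_disp = np.zeros((S, T))
    s_idx, t_idx = np.meshgrid(np.arange(S), np.arange(T), indexing="ij")
    for d in disparities:
        views = []
        for u in range(U):
            for v in range(V):
                # shift each view proportional to its angular offset from centre
                ss = np.clip(np.round(s_idx + d * (u - uc)).astype(int), 0, S - 1)
                tt = np.clip(np.round(t_idx + d * (v - vc)).astype(int), 0, T - 1)
                views.append(L[ss, tt, u, v])
        cost = np.var(np.stack(views), axis=0)  # variance across realigned views
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_disp[better] = d
    return best_disp
```

For a surface patch at the correct disparity all realigned views agree, so the variance drops to zero there; the resulting disparity map plays the role of the depth map d(x,y).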
  • Fig. 9 illustrates the principle of using a microlens array to capture a light field and project a light field. The top part of Fig. 9 illustrates capturing of a light field. Plenoptic imaging stores spatial as well as angular information of the incident light field. In case a microlens array is used, the 3D information is stored in small micro-images, each of which is produced by a single microlens of the microlens array. The captured light field is in fact 4-dimensional, as each light ray is characterized by a 2D location on the sensor and a horizontal and vertical angle of incidence, adding 2 more dimensions.
  • Each microlens can be regarded as a super-pixel that not only stores angular information but also the intensity of the incident light at the location of that 'super-pixel'.
  • The bottom part of fig. 9 illustrates projecting a light field from a pixel of projector 4. The light rays are reversed. A projector that generates the same micro-images in association with a microlens array will lead to a projection of which the focal plane coincides with the original surface plane, regardless of its curved shape. The use of a common optical path and alignment of sensor and projector pixels will lead to a projection that is always in focus on the surface that is captured with the camera. If all elements are exactly the same (same size, same position, etc.) there is a simple one-to-one relation between the pixels of the camera and the pixels of the projector. In reality the two may differ in size or exact location. However, the relation remains a simple one of translating (T) and scaling (S). This is performed in processor 8.
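The translate-and-scale relation between camera pixels and projector pixels can be sketched as a simple resampling step. The nearest-neighbour interpolation and the (scale, translation) parameterisation below are illustrative assumptions, not the patent's specified processing.

```python
import numpy as np

def camera_to_projector(cam_img, S, T, proj_shape):
    """Map camera pixels onto the projector grid with a scale S and a
    translation T = (tx, ty): projector pixel (x, y) looks up camera
    pixel ((x - tx)/S, (y - ty)/S) (nearest neighbour). Pixels falling
    outside the camera sensor are left black.
    """
    H, W = proj_shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    src_x = np.round((xs - T[0]) / S).astype(int)
    src_y = np.round((ys - T[1]) / S).astype(int)
    valid = ((0 <= src_x) & (src_x < cam_img.shape[1]) &
             (0 <= src_y) & (src_y < cam_img.shape[0]))
    out = np.zeros(proj_shape, dtype=cam_img.dtype)
    out[valid] = cam_img[src_y[valid], src_x[valid]]
    return out

img = np.arange(4).reshape(2, 2)
same = camera_to_projector(img, 1.0, (0, 0), (2, 2))   # identity mapping
up = camera_to_projector(img, 2.0, (0, 0), (4, 4))     # 2x scaled grid
```

With S = 1 and T = (0, 0) the mapping is the identity; differing sensor and projecting-element sizes only change S and T.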
  • The task of translating could also be performed mechanically by providing the projector or the camera with a means for translating the sensor or projecting surface in the x- and y-directions.
  • Having common optical elements, and in particular a common element that provides the plenoptic function (in figure 9 the microlens array 7), increases the correspondence between the optical paths of image recording and projection, thereby simplifying the processing.
  • Figure 10 illustrates a method for finding the required translation and scaling factors.
  • In figure 10 a test image T is provided. This test image is recorded by the camera 3, which sends the data on the recorded image to processor 8; the processor 8 applies an initial T and S transformation, found for instance by previous computer-generated optical ray tracing assuming known features of camera and projector, to the data and sends it to projector 4. The projected image is compared to the test image, which can for instance be done with a separate camera capable of recording both the hyperspectral image and the projected image. If the test image and the projected image coincide, the preset values for T and S are used; if not, the values of T and S are varied until the test image and the projected image coincide. This is one way of finding the T and S values. Figure 10 thus shows a method for aligning the light field camera and light field projector of a system according to the invention by adjusting the translation and scaling factors T and S until the test image and the projected light field image are aligned. In preferred methods according to the invention, this testing and alignment procedure is done prior to acquiring and projecting light field images.
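One way to "vary T and S until the images coincide", as described above, is a simple search over the parameters that minimises the mismatch between the test image and the observed projection. The sketch below uses coordinate descent with step halving; the `project(S, T)` callback, which stands in for physically projecting the image and re-capturing it with the separate camera, and all names are hypothetical.

```python
import numpy as np

def refine_T_S(test_img, project, S0=1.0, T0=(0.0, 0.0), steps=40):
    """Vary scale S and translation T from initial guesses until the
    projected image best coincides with the test image, by coordinate
    descent on the sum of squared differences. `project(S, T)` returns
    the projected image for given parameters."""
    S, T = float(S0), list(T0)

    def cost(S_, T_):
        return float(((project(S_, tuple(T_)) - test_img) ** 2).sum())

    best = cost(S, T)
    dS, dT = 0.1, 1.0                      # initial step sizes
    for _ in range(steps):
        improved = False
        for delta in (+dS, -dS):           # try nudging the scale
            if cost(S + delta, T) < best:
                S += delta
                best = cost(S, T)
                improved = True
        for axis in range(2):              # try nudging each translation axis
            for delta in (+dT, -dT):
                T_try = list(T)
                T_try[axis] += delta
                if cost(S, T_try) < best:
                    T = T_try
                    best = cost(S, T)
                    improved = True
        if not improved:                   # no move helped: refine the steps
            dS /= 2.0
            dT /= 2.0
    return S, tuple(T), best

# Toy forward model standing in for the real projection-and-capture chain:
def toy_project(S, T):
    return np.array([[S, T[0]], [T[1], 2.0 * S]])

S_fit, T_fit, residual = refine_T_S(toy_project(1.3, (2.0, -3.0)), toy_project)
```

Because the mismatch shrinks monotonically and the steps are halved once no move helps, the search settles on the parameters at which the two images coincide.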
  • The invention is defined by the appended independent claims.
  • The word "comprising" does not exclude the presence of other elements or steps than those listed in a claim. Use of the article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
  • The word "means" comprises any means, whether in the form of software, hardware or any combination thereof, for performing the function indicated.
  • The different elements of a system may be, and preferably are, in a single device, but various elements may be at various physical positions, for instance when light field data are sent from the light field camera to part 8 to be processed to provide projection light field data for projector 4. This part 8 may be, and preferably is, in the same device as the camera and the projector, but can also be in a CPU, at a site on the internet, or shared by various systems. The data can be transmitted from camera 3 to part 8 by any means for transmission of data, by wire as well as wirelessly. The same holds for data from part 8 to the projector 4.
  • The invention also relates, for those embodiments in which the invention is partly implemented by means of software, to a computer program product comprising program code means stored on a computer readable medium for performing a method according to the invention, and to a computer program product to be loaded by a computer arrangement, comprising instructions for a method according to the invention.

Claims (9)

  1. System comprising a hyperspectral imaging system, the hyperspectral imaging system comprising:
    - a camera (3) for recording a hyperspectral image of a patient, and
    - a display device (4) for displaying the hyperspectral image in visible light,
    wherein the camera (3) is a light field capturing camera and the display device (4) is a light field projector, wherein the camera and projector share a coaxial optical path, and wherein the camera is arranged to capture a light field in a spectral range of radiation at least including one of UV, IR and Terahertz radiation, thereby obtaining the hyperspectral image of the patient, and wherein the light field capturing camera (3) comprises an output for sending data on the captured light field to an input of the light field projector (4), and the light field projector (4) is arranged to project a light field in visible light over the patient based on the data received from the light field capturing camera (3);
    the system further comprising:
    - a secondary imaging system for providing secondary image data on a three-dimensional internal image of the patient, where the secondary imaging system is one of an X-ray, MRI, CT, PET-CT or Ultrasound system; and
    - a processor (8) to provide depth information based on data of the captured light field and to format, based on the depth information, the secondary image data into an image for projection on the patient using the light field projector (4).
  2. System as claimed in claim 1, wherein the light field camera (3) and the light field projector (4) share a common chain of optical imaging elements along the shared coaxial optical axis.
  3. System as claimed in claim 1 or 2, wherein the hyperspectral imaging system comprises an element providing a plenoptic function being positioned in the shared coaxial optical path.
  4. System as claimed in any of the preceding claims wherein the hyperspectral imaging system comprises a microlens array (7) providing a plenoptic function.
  5. System as claimed in claim 4 wherein the microlens array (7) is an element common to the light field camera (3) and the light field projector (4).
  6. System as claimed in any of the preceding claims wherein the hyperspectral imaging system comprises a beam splitter (6) for splitting light paths from and to the common optical axis to and from the light field camera (3) and light field projector (4), respectively, the beam splitter (6) having a dichroic, spectrally selective property.
  7. System as claimed in any of the preceding claims, wherein the hyperspectral imaging system is one of a mobile system, a portable system, and a hand held system.
  8. Method for recording a hyperspectral image and displaying the hyperspectral image in visible light, wherein a light field in a spectral range of radiation, at least including one of UV, IR and Terahertz radiation, of a patient is captured by a light field camera (3) thereby obtaining the hyperspectral image of the patient, the data on the light field captured by the camera is processed to provide projection image data for a light field projector (4), the light field projector (4) projecting a light field based on the projection image data over the patient, wherein the light field camera (3) and light field projector (4) share a coaxial optical path and a light field in visible light is projected on the patient by the light field projector (4), the data on the light field captured by the light field camera (3) is processed to provide depth information and wherein secondary image data is provided on a three-dimensional internal image of the patient, wherein the secondary image is obtained using one of an X-ray, MRI, CT, PET-CT or Ultrasound system, and wherein said secondary image data is reformatted using the depth information and said reformatted data is provided to the light field projector (4).
  9. Computer program product comprising program code means stored on a computer readable medium for performing a method as claimed in claim 8, when used in a computer in combination with a system according to claim 1.
EP14714393.7A 2013-03-19 2014-03-12 System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light Active EP2976609B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361803169P 2013-03-19 2013-03-19
PCT/IB2014/059652 WO2014147515A1 (en) 2013-03-19 2014-03-12 System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light

Publications (2)

Publication Number Publication Date
EP2976609A1 EP2976609A1 (en) 2016-01-27
EP2976609B1 true EP2976609B1 (en) 2022-01-05

Family

ID=50397213

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14714393.7A Active EP2976609B1 (en) 2013-03-19 2014-03-12 System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light

Country Status (7)

Country Link
US (1) US9736402B2 (en)
EP (1) EP2976609B1 (en)
JP (1) JP5974174B2 (en)
CN (1) CN104380066B (en)
BR (1) BR112014028811B1 (en)
RU (1) RU2655018C2 (en)
WO (1) WO2014147515A1 (en)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012222375B3 (en) * 2012-12-06 2014-01-30 Siemens Aktiengesellschaft Magnetic coil device for investigation on head of patient, has light field camera element which is provided in camera unit of magnetic coil assembly, such that camera element is arranged within receiving region surrounding shell unit
US10107747B2 (en) * 2013-05-31 2018-10-23 Ecole Polytechnique Federale De Lausanne (Epfl) Method, system and computer program for determining a reflectance distribution function of an object
DE102014210938A1 (en) * 2014-06-06 2015-12-17 Siemens Aktiengesellschaft Method for controlling a medical device and control system for a medical device
WO2016048911A1 (en) * 2014-09-22 2016-03-31 Invuity, Inc. Hyperspectral imager
EP3226877B1 (en) * 2014-12-04 2021-05-05 PerkinElmer Health Sciences, Inc. Systems and methods for facilitating placement of labware components
US9906759B2 (en) 2015-04-09 2018-02-27 Qualcomm Incorporated Combined processing and display device package for light field displays
CN104887181A (en) * 2015-04-29 2015-09-09 浙江大学 Portable vein projector
US10722200B2 (en) * 2015-06-04 2020-07-28 Siemens Healthcare Gmbh Apparatus and methods for a projection display device on X-ray imaging devices
CN106331442B (en) * 2015-07-02 2021-01-15 松下知识产权经营株式会社 Image pickup apparatus
US10317667B2 (en) * 2015-07-04 2019-06-11 The Regents Of The University Of California Compressive plenoptic microscopy for functional brain imaging
CN105158888B (en) * 2015-09-29 2020-09-11 南京理工大学 Programmable microscope condenser device based on LCD (liquid crystal display) panel and imaging method thereof
JP2017080159A (en) * 2015-10-29 2017-05-18 パイオニア株式会社 Image processing apparatus, image processing method, and computer program
US10448910B2 (en) * 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
ES2932185T3 (en) 2016-02-26 2023-01-16 Univ Southern California Optimized volumetric imaging with selective volume illumination and light field detection
DE102016207501A1 (en) * 2016-05-02 2017-11-02 Siemens Healthcare Gmbh Method for operating a magnetic resonance device and magnetic resonance device
EP3284396B1 (en) * 2016-08-16 2020-02-12 Leica Instruments (Singapore) Pte. Ltd. Observation apparatus and method for visual enhancement of an observed object
EP3509529A4 (en) * 2016-09-09 2020-06-03 Intuitive Surgical Operations Inc. Simultaneous white light and hyperspectral light imaging systems
EP3582711A1 (en) 2017-02-14 2019-12-25 Atracsys Sàrl High-speed optical tracking with compression and/or cmos windowing
EP3379487B1 (en) * 2017-03-21 2023-10-04 Siemens Healthcare GmbH Contrast enhanced reproduction of spectral ct image data
GB201713512D0 (en) * 2017-08-23 2017-10-04 Colordyne Ltd Apparatus and method for projecting and detecting light on a 2D or 3D surface, e.g. for semantic lighting based therapy
CN109087341B (en) * 2018-06-07 2022-07-05 华南农业大学 Fusion method of close-range hyperspectral camera and ranging sensor
CN108836506A (en) * 2018-07-20 2018-11-20 东北大学 A kind of black light for operation shows that equipment and optics instruct system
CN108937992B (en) * 2018-08-06 2020-10-23 清华大学 In-situ visualization system for X-ray perspective imaging and calibration method thereof
GB201817092D0 (en) * 2018-10-19 2018-12-05 Cancer Research Tech Ltd Apparatus and method for wide-field hyperspectral imaging
US11550145B2 (en) 2019-01-16 2023-01-10 Korea Photonics Technology Institute Optical system for implementing augmented reality and device including the same
KR102222076B1 (en) * 2019-03-19 2021-03-03 한국광기술원 Optical System for Realizing Augmented Reality and Medical Augmented Reality Apparatus Including the Same
GB201902668D0 (en) * 2019-02-27 2019-04-10 Colordyne Ltd Appoaratus for selectively illuminating a target field, for example, in a self dimming headlight system
JP7281632B2 (en) * 2019-06-25 2023-05-26 パナソニックIpマネジメント株式会社 projection system
CN112001998B (en) * 2020-09-02 2021-02-19 西南石油大学 Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms
US20220319031A1 (en) * 2021-03-31 2022-10-06 Auris Health, Inc. Vision-based 6dof camera pose estimation in bronchoscopy
WO2024086564A1 (en) * 2022-10-17 2024-04-25 Monogram Orthopaedics Inc. Markerless tracking with spectral imaging camera(s)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999022640A2 (en) * 1997-10-30 1999-05-14 Hypermed Imaging, Inc. Multispectral/hyperspectral medical instrument
JP4625956B2 (en) * 2004-08-27 2011-02-02 国立大学法人東京工業大学 Image processing apparatus and image processing method
CA2631564A1 (en) * 2004-11-29 2006-06-01 Hypermed, Inc. Medical hyperspectral imaging for evaluation of tissue and tumor
JP5149015B2 (en) * 2004-12-28 2013-02-20 ハイパーメツド・イメージング・インコーポレイテツド Hyperspectral / multispectral imaging in the determination, evaluation and monitoring of systemic physiology and shock
US8838210B2 (en) * 2006-06-29 2014-09-16 AccuView, Inc. Scanned laser vein contrast enhancer using a single laser
GB0602137D0 (en) 2006-02-02 2006-03-15 Ntnu Technology Transfer As Chemical and property imaging
US20080298642A1 (en) * 2006-11-03 2008-12-04 Snowflake Technologies Corporation Method and apparatus for extraction and matching of biometric detail
KR20090113324A (en) * 2007-02-14 2009-10-29 루미네트엑스 코포레이션 System and method for projection of subsurface structure onto an object's surface
EP2075616A1 (en) 2007-12-28 2009-07-01 Möller-Wedel GmbH Device with a camera and a device for mapping and projecting the picture taken
JP5587861B2 (en) * 2008-03-28 2014-09-10 コーニンクレッカ フィリップス エヌ ヴェ Target localization of X-ray images
BRPI0917591A2 (en) * 2008-12-11 2015-11-17 Koninkl Philips Electronics Nv system for generating an image and method for generating an image
CN102421365B (en) * 2009-05-13 2015-03-25 皇家飞利浦电子股份有限公司 System for detecting global patient movement during imaging procedures
US20120200829A1 (en) * 2011-02-09 2012-08-09 Alexander Bronstein Imaging and projecting devices and methods
US8897522B2 (en) * 2012-05-30 2014-11-25 Xerox Corporation Processing a video for vascular pattern detection and cardiac function analysis

Also Published As

Publication number Publication date
JP5974174B2 (en) 2016-08-23
JP2015529482A (en) 2015-10-08
CN104380066B (en) 2018-12-21
WO2014147515A1 (en) 2014-09-25
RU2655018C2 (en) 2018-05-23
US20150381908A1 (en) 2015-12-31
US9736402B2 (en) 2017-08-15
RU2014153621A (en) 2016-07-20
BR112014028811B1 (en) 2020-11-17
BR112014028811A2 (en) 2017-06-27
CN104380066A (en) 2015-02-25
RU2014153621A3 (en) 2018-03-19
EP2976609A1 (en) 2016-01-27

Similar Documents

Publication Publication Date Title
EP2976609B1 (en) System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light
US20240000295A1 (en) Light field capture and rendering for head-mounted displays
ES2972214T3 (en) Generation of one or more luminosity edges to form three-dimensional models of objects
US20200081530A1 (en) Method and system for registering between an external scene and a virtual image
EP3198330A1 (en) Hyperspectral imager
KR20110016896A (en) System and method for generating a multi-dimensional image
US11779210B2 (en) Ophthalmic imaging apparatus and system
CN108601505A (en) Imaging device, imaging method and imaging system
KR20240100446A (en) Systems and methods for medical imaging
US20180192871A1 (en) Ophthalmic surgery using light-field microscopy
KR20140041012A (en) Multi 3-dimension camera using multi pattern beam and method of the same
JP6831506B2 (en) Light measuring device and light measuring method
Kagawa et al. Variable field-of-view visible and near-infrared polarization compound-eye endoscope
KR101133503B1 (en) Integrated optical and x-ray ct system and method of reconstructing the data thereof
KR101355671B1 (en) Portable scanning probe with monitor and optical coherence tomography using the same
EP3478182A2 (en) Method, system, software, and device for remote, miniaturized, and three-dimensional imaging and analysis of human lesions. research and clinical applications thereof
Bae et al. New technique of three-dimensional imaging through a 3-mm single lens camera
Kwan et al. Development of a Light Field Laparoscope for Depth Reconstruction
JP5692716B2 (en) Fundus observation device
JP2010249907A (en) Photographing device and imaging method
Viganò et al. Towards X-ray Plenoptic Imaging: Emulation with a Laboratory X-ray Scanner
Maiden et al. The Remarkably Flexible Ptychographic Data Set
Sergeyev et al. Design of the stereoscopic eye-tracking system for quantitative remote sensing applications

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141106

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190426

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: KONINKLIJKE PHILIPS N.V.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G01J 5/08 20060101AFI20210630BHEP

Ipc: G01J 5/02 20060101ALI20210630BHEP

Ipc: A61B 6/00 20060101ALI20210630BHEP

Ipc: A61B 5/00 20060101ALI20210630BHEP

Ipc: G01J 3/28 20060101ALI20210630BHEP

Ipc: A61B 90/00 20160101ALN20210630BHEP

Ipc: G01J 5/00 20060101ALN20210630BHEP

INTG Intention to grant announced

Effective date: 20210730

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1460994

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220115

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014082011

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20220105

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1460994

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220505

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220405

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220405

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220406

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220505

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014082011

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220331

26N No opposition filed

Effective date: 20221006

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220312

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220312

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20140312

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220105

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240328

Year of fee payment: 11

Ref country code: GB

Payment date: 20240319

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240326

Year of fee payment: 11