WO2021205356A1 - Space time imaging system and method - Google Patents


Info

Publication number
WO2021205356A1
Authority
WO
WIPO (PCT)
Prior art keywords
sample
previous
optical system
illumination pattern
images
Prior art date
Application number
PCT/IB2021/052887
Other languages
French (fr)
Inventor
Theo Lasser
Original Assignee
Perseus Biomics
Priority date
Filing date
Publication date
Application filed by Perseus Biomics filed Critical Perseus Biomics
Publication of WO2021205356A1 publication Critical patent/WO2021205356A1/en


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/58 Optics for apodization or superresolution; Optical synthetic aperture systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/6408 Fluorescence; Phosphorescence with measurement of decay time, time resolved fluorescence
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6456 Spatially resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/06 Means for illuminating specimens
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G02B21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison

Definitions

  • the invention relates to imaging systems for optical imaging, and additionally and more particularly to a reader or genome reader with high accuracy and contrast allowing high-speed super-resolved read-out.
  • Structured illumination, also known as SIM (structured illumination microscopy), enhances the classical resolution by almost a factor of two [1, 2].
  • This gain in resolution in its simplest form uses a fringe illumination. Shifting and rotating this fringe-based illumination pattern yields a sequence of images, which serves as an input to an algorithm for generating a super-resolved image of an object.
  • For two-dimensional imaging, this gain in resolution typically needs at least 9 images (3 phase shifts for each of 3 distinct angular directions). In short, this gain in resolution for a two-dimensional object slows down the whole acquisition process by almost one order of magnitude.
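The acquisition overhead described above can be made concrete with a short sketch (the pattern size, wave-vector magnitude and angles below are hypothetical illustration values, not parameters from this disclosure) that generates the 9 raw fringe images classical two-dimensional SIM needs per super-resolved frame:

```python
import numpy as np

def sim_fringe_pattern(shape, k_mag, angle, phase):
    """Sinusoidal fringe illumination I(r) = 1 + cos(k.r + phase).

    k_mag : fringe wave-vector magnitude [rad/pixel]
    angle : fringe orientation [rad]
    phase : phase offset [rad]
    """
    y, x = np.indices(shape)
    kx, ky = k_mag * np.cos(angle), k_mag * np.sin(angle)
    return 1.0 + np.cos(kx * x + ky * y + phase)

# Classical 2-D SIM: 3 orientations x 3 phase steps = 9 raw images.
patterns = [
    sim_fringe_pattern((64, 64), k_mag=0.5, angle=a, phase=p)
    for a in (0.0, np.pi / 3, 2 * np.pi / 3)
    for p in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)
]
print(len(patterns))  # 9 raw images per super-resolved frame
```

A SIM reconstruction algorithm would combine all nine raw images into one super-resolved image, which is the order-of-magnitude acquisition slowdown the text refers to.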
  • the concept can be substantially simplified.
  • a one-dimensional structured illumination reading concept can provide an improved spatial resolution as well as a better contrast.
  • the DNA samples should be well-positioned for taking at least 2 images with phase-shifted illumination patterns.
  • An objective of this invention is to resolve at least the above-mentioned problems and/or disadvantages and to provide the advantages described herewith in the following.
  • An objective of this invention is to improve the accuracy in the identification and reading of the DNA code.
  • a further objective of this invention is to improve the matching quality between theoretical intensity profiles and the experimentally measured intensity profiles.
  • An additional objective of this invention is to disclose a device or system configuration or method integrating a highly synchronized sample translation, image acquisition and pulsed illumination.
  • a further objective of this invention is to exploit the a priori knowledge of stretched DNA strands for improving the matching accuracy and the DNA strand identification.
  • a further objective of this invention is to disclose an optical imaging system based on static structured illumination for imaging a moving object in a selected direction, thereby avoiding the well-known stitching problem in classical microscope imaging.
  • An additional objective of this invention is to teach how to extract this super-resolved information based on 2 images (or only two images) taken of the moving object.
  • a further objective of this invention is the extraction of the super-resolved image when using a 3-dimensional Fourier transform (in time and in 2 spatial dimensions) with a further gain for the signal-to-noise ratio.
  • the present invention concerns an optical imaging system according to claim 1, and an imaging method according to claim 24.
  • an improved optical system for reading and acquiring a metagenomic abundancy profile is provided, based on and organized by an interplay of synchronized subsystems of an overall reader system.
  • the optical imaging system and method of the present disclosure assures high quality and/or super-resolved imaging.
  • the optical imaging system and method permits the generation of super-resolved image information of one or more objects and/or a super-resolved image of one or more objects.
  • the optical imaging system and method assure fast and accurate localisation.
  • Such objects may be, but not limited to, labelled or tagged molecules, for example, fluorescently tagged molecules.
  • This may be, for example, nucleotide-specific labeling of long DNA strands (typical length > 5 kilobase pairs). Tagging a preselected 4-nucleotide subsequence of this lengthy DNA strand permits a specific intensity profile to be generated and, once accurately imaged, permits identifying and matching this measured profile to known intensity profiles taken from existing databanks. This can be used for identifying bacteria, for example, but is in no case limited hereto.
  • Reading for example, fluorescently labeled DNA strands allows discriminating bacteria among an enormous bacterial diversity.
  • the optical readout of these patterns permits, for example, a faster and accurate identification of bacteria.
  • “Structured illumination pattern” or “structured illumination” is used herein for an illumination generating a specific intensity profile [1], well known for achieving a resolution beyond the classical resolution limit known as the Abbe limit or the diffraction limit.
  • the structured illumination pattern can be generated in several ways, for example by the interference of mutually coherent light fields or by imaging a mask on the object under investigation. In no case does the choice of technique, whether by interference or by imaging an appropriate mask, represent an innovative way to overcome the teaching of this disclosure.
  • “Abundancy profile” is used herein for relative and/or absolute proportions of specific entries within a population, as for example the intestinal microbiome, but not limited hereto. Such abundance profiles allow comparing populations in terms of their content and functionality.
  • the abundance profile can be displayed in tabular format, with each line corresponding to a specific entry (e.g. virus, microorganism, bacteria, and species) or groups thereof (families, functional clusters).
  • the abundance profiles can be relative or absolute, by inclusion of non-assigned signals.
  • the abundancy profile may be used as a diagnostic means, due to its potential alterations caused by diseases.
  • Detector is used herein to mean any device capable of measuring energy in an electromagnetic signal as a function of wavelength.
  • a detector array means a plurality of detectors.
  • the preferred detector arrays used in this disclosure have their optimal sensitivity in the wavelength range of the used source and/or of the fluorescence light induced by the light source.
  • the detectors can be one-dimensional, multi-dimensional or line arrays, depending on the optical setup.
  • In the mostly used wavelength range of 400-700 nm, CMOS detectors currently have the best performance with respect to sensitivity and readout speed.
  • high speed cameras are array detectors acquiring images at a frame rate of at least 100 frames per second (fps). It is obvious that high frame rates translate to short integration times per array pixel.
  • CMOS-TDI cameras (TDI: time delay integration) overcome this limit of integration time by shifting and adding the induced photoelectrons into the readout register [3].
  • This TDI principle and its use for spacetime imaging is also understood as a fast readout concept offering an increased sensitivity.
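The TDI shift-and-add principle can be sketched numerically. The stage count, signal level and noise model below are illustrative assumptions, not parameters from this disclosure:

```python
import numpy as np

def tdi_readout(line_exposures):
    """TDI principle: charges from successive line exposures of the same
    object row are shifted along the sensor and summed before readout."""
    return np.sum(line_exposures, axis=0)

# 8 TDI stages observing the same 16-pixel line (hypothetical signal of 10
# photoelectrons per pixel plus uncorrelated unit-variance noise):
rng = np.random.default_rng(0)
line = np.full(16, 10.0)
exposures = line + rng.normal(0.0, 1.0, size=(8, 16))
out = tdi_readout(exposures)
print(out.shape)  # (16,)
```

Summing N stages grows the signal N-fold while uncorrelated noise grows only by sqrt(N), which is the sensitivity gain referred to above.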
  • Fig. 1 discloses the optical system 1010, 1110 complemented by an electronic synchronization system 230 for space-time imaging.
  • the optical detection system 1010 comprises an objective 13 and a tube lens 12 collecting and transporting the light emanating from the sample 1030. As indicated, the sample is moving across a static illumination field generated by the illumination system 1110.
  • This illumination system 1110 contains a light source LS and optical means to generate at least a two-beam configuration, which allows creating a static fringe pattern at the object plane.
  • a synchronization electronic system 231 generates the triggering of the light source LS or illumination system 1110 according to the temporal position of the sample (measured by an attached sensor MS) and ensures the synchronization with the camera 11 clock frequency.
  • Fig. 2 shows the time-space diagram and is intended to convey the underlying principles of space-time imaging. This diagram 100 is decomposed into 4 subparts. The first subdiagram 120 shows the structured illumination pattern 133. This pattern extends over the full field, but stays static in time.
  • the 2nd subdiagram 140 shows the moving object at three distinct time points.
  • the constantly moving object is illuminated by the static periodic structured illumination.
  • This structured illumination pattern is preferentially oriented perpendicular to the movement direction. It is obvious to those skilled in the art that a pulsed illumination, or illumination 111 of short pulse duration (for example, for a typical sample movement of 1 mm/s, a pulse duration of ~1 µs will avoid blurring), allows defining the relative phase shifts seen by a subfield at least at 2 timepoints during the movement of the whole object across the observation field. As said, the time duration of the structured illumination is chosen short enough to avoid any blurring artifact.
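The anti-blur condition is simple arithmetic: during the pulse, the sample must move less than the tolerated blur. A minimal sketch using the 1 mm/s speed quoted above and a hypothetical 1 nm blur tolerance:

```python
def max_pulse_duration(speed_m_per_s, max_blur_m):
    """Longest illumination pulse keeping motion blur below max_blur_m."""
    return max_blur_m / speed_m_per_s

# Sample moving at 1 mm/s (as above); tolerating a hypothetical 1 nm of
# motion during the pulse bounds the pulse duration to ~1 microsecond:
tau = max_pulse_duration(1e-3, 1e-9)
print(tau)  # ~1e-6 s
```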
  • the subdiagram 160 shows the subfield at three distinct time points.
  • the synchronization between a moving object, the pulsed illumination and the acquisition by the camera is important for a high quality super-resolved imaging.
  • Subdiagram 180 shows the illumination pattern as seen by an observer moving with the object. Obviously, this underlines the phase-shifted illumination pattern which finally allows the super-resolved imaging.
  • Fig. 3 [300] illustrates the moving object [310] at different but determined time points
  • the static illumination is shown [330, 332-338] as well as the dynamic “SIM” object at the time points mentioned above. It is obvious to those skilled in the art that this is equivalent to a situation of a static object and a moving illumination pattern.
  • Fig. 4 shows stretched DNA strands [400] on a glass slide.
  • the various DNA fragments attached to this slide appear in this experimental approach as a 1-dimensional object [401] almost perfectly oriented in a predefined direction.
  • Fig. 5 demonstrates the signal acquisition and processing for the case of two π-shifted images.
  • Sub-Figure 5001 shows the ground truth, the exact localizations of the marked positions of a 1-dimensional object (but not limited hereto).
  • Sub-Figure 5002 shows the ground truth as a reference and the DC part of the signal, which would correspond to a classical image with no super-resolution.
  • Sub-Figure 5003 shows the AC part corresponding to the difference of two consecutive images shifted by π. Obviously the resolution is improved and the different peak positions are clearly correlated to the ground truth. As said, this represents only the AC part and not the total intensity, which explains the negative values below zero intensity.
  • Sub-Figure 5004 indicates the difference in signal resolution and localization accuracy between a classical image as shown in sub-Figure 5002 and the AC part in sub-Figure 5003.
  • the dash-dot line in sub-Figure 5003 is the spatial derivative, which indicates that further processing may help to improve the localization of markers that are fully unresolved when taking a classical image (see also sub-Figure 5002). It is obvious that the derivative shown is only an example of more sophisticated processing tools.
  • Sub-Figure 5005 shows the super-resolved image based on the AC part with a clear super-resolved quality. Even if the first 3 lines are not resolved, the increased signal amplitude allows at least an estimation of the underlying multiple labels.
  • Sub-Figure 5006 is an overlay of the ground truth as shown in sub-Figure 5001, the AC part as shown in sub-Figure 5003, a classical image as shown in sub-Figure 5002, and the super-resolved image as shown in sub-Figure 5005.
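The DC/AC decomposition behind sub-Figures 5002 and 5003 can be reproduced for a simulated 1-dimensional object. The object, fringe wave vector and sampling below are hypothetical illustration values:

```python
import numpy as np

# Simulated 1-D object: two sub-resolution markers (hypothetical values).
x = np.linspace(0.0, 10.0, 500)
obj = np.exp(-((x - 4.0) ** 2) / 0.01) + np.exp(-((x - 4.3) ** 2) / 0.01)

k_ill = 4.0                                     # fringe wave vector [rad/unit]
img_0 = obj * (1 + np.cos(k_ill * x))           # acquisition at phase 0
img_pi = obj * (1 + np.cos(k_ill * x + np.pi))  # acquisition shifted by pi

dc = 0.5 * (img_0 + img_pi)  # sum: fringes cancel -> classical image
ac = 0.5 * (img_0 - img_pi)  # difference: obj * cos(k_ill * x), the AC part
```

The sum of the two π-shifted images removes the fringes and yields the classical (DC) image, while the difference isolates the fringe-encoded (AC) part carrying the higher spatial frequencies, including the negative excursions below zero intensity noted above.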
  • Fig. 1 discloses an exemplary optical system or optical imaging system 1 for space-time imaging according to the present disclosure.
  • the optical system 1 comprises, for example, an illumination system 1110, displacement means DM, an optical detection system 1010 and a synchronization system 230.
  • the illumination system 1110 is configured to generate a structured illumination pattern 133 or static structured illumination pattern 133 (see Fig. 2) and configured or arranged to illuminate the at least one sample or object 1030 (to be investigated or under investigation) with the static structured illumination pattern 133.
  • the static structured illumination pattern 133 comprises or consists of, for example, a fringe-based illumination pattern as shown for example in Figure 2, where the fringes comprise or consist of, for example, alternating bright and dark regions or elongations.
  • the static structured illumination pattern 133 is generated by a structured light pattern generator 1012 of the illumination system 1110 and imaged onto the sample or support 1030 by at least one imaging means 1011 , for example, at least one optical lens.
  • the static structured illumination pattern 133 may have, for example, a pattern oriented (substantially) perpendicular to a direction of displacement of the sample 1030 by the displacement means DM, as shown for example in Fig. 2.
  • the structured light pattern generator 1012 includes, for example, a light source LS and optical means or elements configured or arranged to generate at least a two-beam configuration arranged to generate a static fringe pattern via interference.
  • the static fringe pattern is provided or created at an object plane and imaged by imaging means 1011 onto the sample 1030. As indicated, the sample is arranged to move across this static illumination field generated by the illumination system 1110.
  • the light source LS may, for example, comprise or consist of a continuous or pulsed light source.
  • the light source LS may comprise or consist of, for example, a LED or laser.
  • the light source LS may comprise or consist of a coherent light source.
  • Pulsed light illumination may, for example, be used to generate the static structured illumination pattern 133.
  • the pulsed light illumination may be provided by the pulsed light source or by modulating the output of the continuous light source.
  • a pulse duration of the pulsed light illumination may be, for example, set to a value that reduces or assures an absence of blurring, as discussed further below.
  • the optical means or elements configured to generate at least a two-beam configuration and the static fringe pattern, may for example comprise or consist of a plurality of optical reflectors (mirrors) and at least one beam splitter arranged to generate interference between the beams.
  • the structured illumination pattern 133 can thus be generated by the interference of mutually coherent light fields.
  • the structured illumination pattern can however, be generated in other ways, for example, by imaging a mask onto the object under investigation to produce the desired pattern on the object.
  • the orientation of the static structured illumination pattern 133 may, for example, be defined by the orientation of the optical reflectors.
  • the displacement means (or sample displacer) DM is configured to displace the sample 1030 relative to the illumination system 1110, and relative to the other elements of the optical system 1 , for example, the optical detection system 1010.
  • the displacement means DM is configured to displace the sample 1030 relative to the generated static structured illumination pattern 133, or relative to the image plane or image formed by the imaging means 1011 of structured illumination pattern 133.
  • the displacement means DM comprises, for example, a motorized translation stage and controller. A sample holder, for example, attached to the translation stage may hold the sample 1030.
  • a displacement speed V of the translation stage or sample may, for example, be measured or determined. For example, it may be determined on the basis of a measured current provided to a displacement motor, or a measured rotation speed of the motor.
  • the controller or the synchronization system 230 may determine the displacement speed V based, for example, on a look-up table containing displacement speed values associated with motor current or motor rotation.
  • the controller provides the determined displacement speed V to the synchronization system 230.
  • the displacement speed V is set or determined by the controller or the synchronization system 230 setting a current to be provided to the displacement motor based, for example, on the look-up table containing displacement speed values associated with motor current.
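A look-up-table-based speed determination of the kind described above might be sketched as follows; the table entries and the choice of linear interpolation between them are assumptions for illustration, not values from this disclosure:

```python
import bisect

# Hypothetical look-up table: motor current [mA] -> stage speed [mm/s].
CURRENT_MA = [10, 20, 40, 80]
SPEED_MM_S = [0.25, 0.5, 1.0, 2.0]

def speed_from_current(i_ma):
    """Determine displacement speed V from motor current via the LUT,
    clamping outside the table and interpolating linearly inside it."""
    if i_ma <= CURRENT_MA[0]:
        return SPEED_MM_S[0]
    if i_ma >= CURRENT_MA[-1]:
        return SPEED_MM_S[-1]
    j = bisect.bisect_right(CURRENT_MA, i_ma)
    i0, i1 = CURRENT_MA[j - 1], CURRENT_MA[j]
    s0, s1 = SPEED_MM_S[j - 1], SPEED_MM_S[j]
    return s0 + (s1 - s0) * (i_ma - i0) / (i1 - i0)

print(speed_from_current(40))  # 1.0 mm/s (exact table entry)
```

The same table can be read in the opposite direction to set a motor current for a desired speed V, as the text describes.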
  • the displacement means DM can be, for example, configured to continuously displace the sample 1030 in the generated static structured illumination pattern 133 or across the generated static structured illumination pattern 133.
  • the displacement means DM can be, for example, configured to displace the sample 1030 in or across the generated static structured illumination pattern 133 at a constant or non-constant speed V.
  • the displacement means DM can be, for example, configured to displace the sample 1030 in or across the generated static structured illumination pattern 133 unidirectionally or non-unidirectionally, or in an oscillating manner.
  • the optical detection system 1010 comprises, for example, a first imaging means 13, for instance an objective 13, arranged to receive light emanating from the sample 1030, a second imaging means 12, for instance a tube-lens 12, arranged to collect light from the first imaging means 13 and to transport or direct the light emanating from the sample 1030 towards a camera or detector 11.
  • the first and second imaging means 12, 13 are arranged relative to the sample 1030 and displacement means DM to image the light emanating from/through the sample 1030 onto the camera or detector 11 where the camera 11 captures or performs image acquisitions.
  • the camera or detector 11 comprises or consists of, for example, the detector defined earlier.
  • the optical detection system 1010 is configured to receive electromagnetic radiation emanating from (or transmitted from) the illuminated sample 1030 during displacement of the sample 1030 in the static structured illumination pattern 133, or during displacement of the sample 1030 across the static structured illumination pattern 133.
  • the optical detection system 1010 can image the illuminated sample 1030 within a field-of-view that is less than an observable field-of-view of the optical detection system 1010, for example, within a field-of-view that is half the observable field-of-view. This can be done, for example, by setting an acquisition frame rate of the camera 11 to a rate that is sufficiently fast to capture a predetermined number of images as the sample 1030 is being displaced, for example, only two or at least two images.
  • Fig.2 shows, for example, a plurality of subfields.
  • the synchronization system 230 comprises or consists of, for example, an electronic synchronization system 230.
  • the electronic synchronization system 230 is connected to (and/or configured to communicate with) the illumination system 1110, for example, to the structured light pattern generator 1012 or a component thereof, such as a pulsed light source, and configured to control operation of the illumination system 1110 and elements of the illumination system 1110.
  • the electronic synchronization system 230 is also connected to (and/or configured to communicate with) the optical detection system 1010, for example, to the camera 11 and configured to control operation of the camera 11.
  • the electronic synchronization system 230 is also connected to the displacement means DM and configured to control operation of the displacement means DM.
  • the electronic synchronization system 230 is configured to control or synchronize operation of these elements or components.
  • the electronic synchronization system 230 includes, for example, processing means, for instance a processor 231 and/or analog and/or digital processing circuits, so-called encoding (encoder) and decoding (decoder) means, connected to elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010.
  • the processing means or processor 231 is connected to optical source LS of the illumination system 1110, the camera 11 of the optical detection system 1010 and a controller of the displacement means DM.
  • the processing means or processor 231 may, for example, be connected directly to elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010 to communicate and control operation via a local controller.
  • the processing means or processor 231 may, for example, be connected indirectly to these elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010 via controllers 232, 233, 234 included in the electronic synchronization system 230.
  • the processing means or a processor 231 is connected to these elements of the system 1 and configured to control and command these elements to permit operation or synchronized or timely operation of different or interrelated elements of the system 1.
  • the system 1 or electronic synchronization system 230 includes, for example, a memory or storage means ST (for example, semiconductor memory, HDD, or flash memory) configured to store or storing at least one program or processor executable instructions.
  • the at least one program or processor-executable instructions may comprise instructions permitting, for example, to control and command the system elements (such as the camera 11, the optical source and the displacement means DM) and to communicate data to and from system elements.
  • the processor executable instructions may, for example, comprise instructions permitting to receive and process the captured images or image data from the camera 11.
  • the processing means or a processor 231 and the memory ST can be, for example, included in a computer or a portable device such as a smart phone.
  • the processor executable instructions can include instructions permitting various different actions concerning capturing and processing images and image data of the present disclosure.
  • the processor executable instructions are provided to or obtained by the processor for execution.
  • the electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 at different timepoints tN during the displacement of the sample 1030 in or across the static structured illumination pattern 133. This provides a plurality of signals (or images to the camera 11) comprising a relative phase shift of the generated static structured illumination pattern 133.
  • the signals are processed by the processor 231.
  • Triggering the generation of the static structured illumination pattern 133 at different timepoints by the synchronization electronic system 231 may, for example, be carried out by generating the triggering of the light source LS (to output light) according to the temporal position of the sample 1030 and is done in a manner to assure synchronization with a clock frequency of the camera 11.
  • the clock frequency of the camera 11 determines the specific time points of the image acquisition of the camera 11.
  • the optical system 1 includes a position measurement device or sensor MS configured to measure or determine the position in time of the sample 1030.
  • the position is, for example, measured or determined while the sample 1030 is being displaced by the displacement means DM.
  • a first image of the static structured illumination pattern 133 is acquired at a first position in time and, during the displacement of the sample 1030, a second image (or further images) of the static structured illumination pattern 133 is acquired at a later position in time (with respect to the position in time of the first acquisition) that corresponds to a predetermined or desired phase shift of the static structured illumination pattern 133 relative to the first acquisition, for example, a phase shift of π or λ/2.
  • Real time displacement monitoring with subwavelength precision is, for example, carried out by a displacement monitoring system included, for example, in the position measurement device or sensor MS.
  • the displacement monitoring system includes, for example, an interferometer system such as a Michelson interferometer system including, for example, a separate coherent light source such as a laser, where a first reflector is attached to and displaced with the sample 1030 to provide a simple back-reflection that is arranged to interfere with reflected light from a second reflector fixed in relation to the illumination system 1110 or the optical detection system 1010 (located in the fixed frame).
  • This interferometric scheme provides an interference signal for monitoring the displacement of the sample 1030 and permits determining a position of the sample 1030 with a precision better than λ/20.
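The phase-to-displacement conversion in such a Michelson scheme follows from the round-trip path change 2d. A minimal sketch (the 633 nm laser wavelength and the assumed phase resolution are illustrative values, not parameters from this disclosure):

```python
import math

def displacement_from_phase(phase_rad, wavelength_m):
    """Michelson interferometer: a mirror displacement d changes the
    round-trip path by 2d, i.e. the interference phase by 4*pi*d/lambda."""
    return phase_rad * wavelength_m / (4 * math.pi)

# Resolving the interference phase to ~2*pi/10 at an assumed 633 nm laser
# wavelength corresponds to a displacement precision of lambda/20:
d = displacement_from_phase(2 * math.pi / 10, 633e-9)
print(d)  # ~3.165e-08 m, i.e. lambda/20
```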
  • processing means 231 is configured to determine the positions in time where image acquisition of the static structured illumination pattern 133 is carried out.
  • a camera or photodetector may capture the interference signal that is processed by controller 234 and/or processing means 231 to determine when the sample 1030 has been displaced to a position where the predetermined or desired phase shift of the static structured illumination pattern 133 relative to the first or a previous acquisition is achieved, and determine the position in time that a second or further image acquisition is carried out.
  • the electronic synchronization system 230 is configured to trigger and/or synchronize acquisition by the camera 11 of the electromagnetic radiation emanating from the illuminated sample 1030 with the triggering of the generation of the static structured illumination pattern 133 by the illumination system 1110.
  • the electronic synchronization system 230 is configured to receive or determine the temporal positions of the samples 1030, and further configured to trigger the generation of the static structured illumination pattern 133 and/or the acquisition, by the camera 11 , of the electromagnetic radiation emanating from the illuminated sample 1030 based on the temporal sample positions at which image acquisition is to be carried out.
  • the electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 and the acquisition of the electromagnetic radiation emanating from the illuminated sample 1030 at different timepoints tN (or different temporal positions of the sample 1030), for example, determined by the equation tN = N·π/(V·Km), where tN is a timepoint, V is the speed of displacement of the sample 1030, N is a positive natural number and Km is the wave vector of the illumination light producing the static structured illumination pattern 133.
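Assuming, as an illustration, a phase step of π per acquisition, the condition Km·V·tN = N·π yields the acquisition timepoints. A minimal sketch with hypothetical speed and fringe parameters:

```python
import math

def acquisition_timepoints(v, k_m, n_images, dphi=math.pi):
    """Timepoints t_N at which the moving sample has accumulated a fringe
    phase shift of N*dphi: k_m * v * t_N = N * dphi."""
    return [n * dphi / (v * k_m) for n in range(n_images)]

# Assumed values: speed 1 mm/s, fringe wave vector 2*pi / 500 nm.
# A pi phase shift then accumulates every ~0.25 ms.
t = acquisition_timepoints(1e-3, 2 * math.pi / 500e-9, 2)
print(t)  # [0.0, ~2.5e-4] seconds
```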
  • the electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 and the acquisition of at least two (or a plurality of) signals of the electromagnetic radiation emanating from the illuminated sample 1030 at least two (or a plurality of) different timepoints tN.
  • the optical system 1 may include, for example, at least one sample or support 1030.
  • the sample or support 1030 may for example, comprise a slide or glass slide which holds a 1 D object or objects.
  • the sample or support 1030 may include, for example, a plurality of 1 -dimensional (1 D) objects, or elongated 1 D objects.
  • the 1 D object or the plurality of 1 D objects extend, for example, in or along one spatial direction (or solely one spatial direction). Orthogonal directions to the direction of extension or all orthogonal object dimensions are (much) smaller when compared to an optical wavelength, for example, the wavelength of the light being used in the optical system 1.
  • the 1 D object(s) may comprise optical light emitting markers or labels, for example, fluorescently labeled or marked.
  • the 1 D object may comprise or consist of, for example, a DNA strand.
  • Fig. 4 shows stretched DNA strands on a glass slide 400. The various DNA fragments attached to this slide appear in this experimental approach as a 1-dimensional object 401 (almost perfectly) oriented in a predefined direction.
  • the DNA strand may, for example, be a fluorescently labeled DNA strand.
  • space-time imaging is carried out by the optical system 1 by acquiring electromagnetic radiation emanating from fluorescent markers or light scatterers of the illuminated sample 1030, as well as a sequence of images at different timepoints tN, as is now further explained.
  • SIM microscopy shifts the illumination fringe pattern for the acquisition of 3 images taken with 3 distinct phase lags φi per k-space orientation [LIT], given by φi = 2π·i/3 for i = 0, 1, 2, or correspondingly 3 phase lags at 0, 120 and 240 degrees, for generating 3 distinct “SIM objects”.
  • shifting the phase is equivalent to moving or displacing the object q(x) by a certain equivalent displacement Δxi relative to the reference position given by the static fringe illumination.
  • This fringe illumination is characterized by the illumination K-vector, denoted Km = 2π/Λ with Λ the illumination fringe period, and we obtain for the displacement Δxi = φi/Km.
  • OTF: optical transfer function, related to the Fourier transform of the point spread function (PSF).
  • PSF: point spread function.
  • the first term, which we named the DC-term, is nothing other than the classical image, which does not depend on time.
  • This phase factor P(t) carries the innovation of the present space-time imaging method and system.
  • the moving object causes a continuous phase modulation of its Fourier transformed image. As indicated above, this phase factor, depending on time and on constant parameters, allows a super-resolved image to be reconstructed.
  • the optical system 1 of the present disclosure is configured to process consecutive phase-shifted images (for example, 2 images with a π phase shift) to provide corresponding Fourier transformed images, and to perform a subtraction of the Fourier transformed images to determine the information carrying higher spatial frequencies, providing super-resolution, or an improved localization or spatial information, of the 1D object or objects, such as DNA strands, located on the sample 1030.
  • splitting the intensity into an AC and a DC part can easily be generalized by taking an arbitrary phase shift between consecutive images. Obviously, a phase shift of π needs only two images for separating and extracting the AC and DC parts. It is possible to recombine the AC and DC parts in such a manner that a super-resolved image can be extracted by this simple sum and difference operation on the AC and DC parts. Besides the simplicity of the image processing, this procedure provides a faster signal acquisition, as only 2 images are needed (see also Fig. 5).
  • an object moving with a constant speed leads to a phase modulation of the Fourier transformed image.
  • This continuous phase modulation can be used for selecting the AC-part, while suppressing the DC-part by a simple image subtraction, supposing an appropriate time difference determined by the speed and the wave vector of the illumination.
  • the system 1 or the electronic synchronization system 230 is configured to carry out the above-described processing steps.
  • the processor 231 and the memory ST include a program or instructions configured to carry out the above-described processing steps.
  • the moving object is constantly moving out of the observable field. This objection is correct, but can easily be circumvented by restricting the observed field to half the observable field size. This restricted sub-field can be imaged at least 2 times by an appropriate choice of the camera frame rate in relation to the speed of the object.
  • Those skilled in the art may further object that the image of a moving object will be blurred due to the sample movement. This artefact can easily be circumvented by using a short pulsed illumination, where the illumination duration is chosen to be short enough.
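The sum and difference operation on two π-shifted images described in the bullets above can be sketched numerically as follows. This is a minimal illustration, not the patented implementation: the marker positions, fringe period and speed are invented example values.

```python
import numpy as np

# Sketch of the AC/DC separation from two pi-shifted images (illustrative
# values only; fringe period, speed and marker positions are assumptions).
x = np.linspace(0.0, 10.0, 2000)          # position along the 1D object, in um
K_m = 2.0 * np.pi / 0.5                   # illumination wave vector (0.5 um fringe period)
V = 1.0                                   # sample speed, um per time unit

obj = np.zeros_like(x)                    # sparse fluorescent markers (ground truth)
obj[[300, 340, 900, 1500]] = 1.0

def image(t):
    # In the co-moving frame the static fringe pattern appears phase shifted
    # by K_m * V * t, so two well-timed acquisitions sample two phases.
    illum = 0.5 * (1.0 + np.cos(K_m * (x - V * t)))
    return obj * illum

dt = np.pi / (K_m * V)                    # time step giving a pi phase shift
I1, I2 = image(0.0), image(dt)

dc = 0.5 * (I1 + I2)                      # classical, diffraction-limited content
ac = 0.5 * (I1 - I2)                      # modulated part with higher spatial frequencies

# Sum and difference recombine exactly into the original acquisitions.
assert np.allclose(dc + ac, I1) and np.allclose(dc - ac, I2)
```

Because the two illumination phases differ by exactly π, their sum cancels the fringe modulation (leaving the DC part) while their difference isolates it (the AC part), which is why only two images suffice here.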


Abstract

The present disclosure concerns an optical system for space-time imaging comprising an illumination system configured to generate a static structured illumination pattern for illuminating at least one sample to be investigated; displacement means for displacing the sample relative to the generated static structured illumination pattern; an optical detection system configured to receive electromagnetic radiation emanating from the illuminated sample during displacement of the sample in or across the static structured illumination pattern; and an electronic synchronization system configured to trigger the generation of the static structured illumination pattern at different timepoints during the displacement of the sample in or across the static structured illumination pattern to provide a plurality of signals or images comprising a relative phase shift of the generated static structured illumination pattern.

Description

SPACE TIME IMAGING SYSTEM AND METHOD
FIELD OF THE INVENTION [0001] The invention relates to imaging systems for optical imaging, and additionally and more particularly to a reader or genome reader with high accuracy and contrast allowing high-speed super-resolved read-out.
BACKGROUND
[0002] Structured light illumination, also known as SIM (structured illumination microscopy), allows the classical resolution to be enhanced by almost a factor of two [1,2]. This gain in resolution in its most simple form uses a fringe illumination. Shifting and rotating this fringe-based illumination pattern yields a sequence of images, which serves as an input to an algorithm for generating a super-resolved image of an object. The gain in resolution when used for two-dimensional imaging typically needs at least 9 images for 3 distinct angular directions. In short, this gain in resolution for a two-dimensional object slows down the whole acquisition process by almost one order of magnitude. [0003] For imaging a one-dimensional object the concept can be substantially simplified. Exploiting this a priori knowledge of the object, only 3 images are necessary for acquiring a better resolved image of a one-dimensional object. It is well understood that a one-dimensional object extends only in one spatial direction, whereas all orthogonal object dimensions are much smaller than the optical wavelength. A typical example is a DNA strand with a length of up to several micrometers, but with a lateral extent of only several nanometers. It should be well understood that this is not a principal limitation to this specific example.
[0004] Speed and accuracy of the reading procedure are of utmost importance for accurately determining a quantitative, precise localization of fluorescently tagged molecules. This is the case for nucleotide-specific labeling of long DNA strands (typical length > 5 k-base pairs). By tagging a preselected 4-nucleotide subsequence of this lengthy DNA strand, a specific intensity profile can be generated for identifying and matching this measured profile to known intensity profiles taken from existing databanks. This general concept is of high interest for identifying bacteria, but is in no case limited hereto.
[0005] Microbiome analysis demands reading and identifying a huge amount of bacterial DNA fragments. As has been shown in former research projects, reading these fluorescently labeled DNA strands allows bacteria to be discriminated among an enormous bacterial diversity. An optical readout of these patterns permits a faster and more accurate identification of bacteria.
[0006] It is obvious to those skilled in the art that an accurate, high-contrast reader concept for the labeled marker positions is of utmost importance. In addition, a fast identification of the bacteria present in the microbiome will be a strong advantage when assessing a huge amount of DNA strands.
[0007] As already mentioned, a one-dimensional structured illumination reading concept can provide an improved spatial resolution as well as a better contrast. For such a reader concept the DNA samples should be well-positioned for taking at least 2 images with phase-shifted illumination patterns.
[0008] Herein is disclosed a reader concept which overcomes this stringent sample positioning requirement for taking 2 images. Another way of achieving this super-resolved reading combines a well-synchronized image acquisition, pulsed light illumination and a continuous sample translation synchronized to the periodic image acquisition, as disclosed herein.
OBJECTIVES [0009] An objective of this invention is to resolve at least the above-mentioned problems and/or disadvantages and to provide the advantages described herein in the following.
[0010] An objective of this invention is to improve the accuracy in the identification and reading of the DNA code. [0011] A further objective of this invention is to improve the matching quality between theoretical intensity profiles and the experimentally measured intensity profiles.
[0012] An additional objective of this invention is to disclose a device or system configuration or method integrating a highly synchronized sample translation, image acquisition and pulsed illumination.
[0013] A further objective of this invention is to exploit the a priori knowledge of stretched DNA strands for improving the matching accuracy and the DNA strand identification. [0014] A further objective of this invention is to disclose an optical imaging system based on static structured illumination for imaging a moving object in a selected direction, thereby avoiding the well-known stitching problem in classical microscope imaging. [0015] An additional objective of this invention is to teach how to extract this super-resolved information based on 2 images (or only two images) taken of the moving object.
[0016] A further objective of this invention is the extraction of the super-resolved image when using a 3-dimensional Fourier transform (in time and in 2 spatial dimensions) with a further gain for the signal-to-noise ratio.
[0017] Additional advantages, objects and features of the invention will be set forth in part in the description and claims which follow, and in part will become evident to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objects and advantages of the invention may be realized and attained as particularly pointed out in the appended claims. SUMMARY OF THE INVENTION
[0018] To achieve the afore-mentioned objects, the present invention concerns an optical imaging system according to claim 1, and an imaging method according to claim 24. To achieve the afore-mentioned objects, an improved optical system for reading and acquiring a metagenomic abundancy profiling is provided, based on and organized around an interplay of synchronized subsystems of an overall reader system. The optical imaging system and method of the present disclosure assure high quality and/or super-resolved imaging. The optical imaging system and method permit the generation of super-resolved image information of one or more objects and/or a super-resolved image of one or more objects. The optical imaging system and method assure fast and accurate localization.
When included in a reader or conceived as a fast reader for the fluorescently labelled marker on DNA fragments, fast and accurate reading can be assured to accurately determine a quantitative precise localization of objects, or fast and accurate imaging of a specific intensity profile(s).
Such objects may be, but are not limited to, labelled or tagged molecules, for example, fluorescently tagged molecules. This may be, for example, nucleotide-specific labeling of long DNA strands (typical length > 5 k-base pairs). Tagging a preselected 4-nucleotide subsequence of this lengthy DNA strand permits a specific intensity profile to be generated and, once accurately imaged, permits identifying and matching this measured profile to known intensity profiles taken from existing databanks. This can be used for identifying bacteria, for example, but is in no case limited thereto.
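The matching of a measured intensity profile against databank profiles mentioned above could, for instance, be scored with a normalized correlation. The scoring function, the profile values and the species names below are purely illustrative assumptions, not part of the disclosure:

```python
import numpy as np

# Hedged sketch of profile matching: compare a measured 1D intensity profile
# against reference profiles from a databank using a normalized correlation
# score (the scoring choice is ours, not specified in the text).
def match_score(measured, reference):
    m = (measured - measured.mean()) / measured.std()
    r = (reference - reference.mean()) / reference.std()
    return float(np.dot(m, r) / len(m))   # 1.0 for a perfect match

databank = {  # hypothetical reference profiles
    "species_A": np.array([0., 1., 0., 0., 1., 1., 0., 1.]),
    "species_B": np.array([1., 0., 1., 1., 0., 0., 1., 0.]),
}
measured = np.array([0., 1., 0., 0., 1., 1., 0., 1.])  # hypothetical read-out

# Pick the databank entry with the highest correlation to the measurement.
best = max(databank, key=lambda k: match_score(measured, databank[k]))
print(best)  # species_A
```

In practice a matching step would also need to handle stretching variation and partial overlap of the DNA fragment, which this toy score does not model.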
Reading, for example, fluorescently labeled DNA strands allows discriminating bacteria among an enormous bacterial diversity. The optical readout of these patterns permits, for example, a faster and accurate identification of bacteria.
Accurate and high contrast optical imaging by the optical imaging system and method of the present disclosure allows accurate reading of the labeled marker positions and a fast identification of bacteria present in the microbiome, which is particularly advantageous when assessing a significant amount of DNA strands. DEFINITIONS, TERMS AND ELEMENTS
[0019] “Structured illumination pattern” or “Structured illumination” is used herein to generate a specific intensity profile [1], well known for achieving a resolution beyond the classical resolution known as the Abbe limit or the diffraction limit. The structured illumination pattern can be generated in several ways, for example by the interference of mutually coherent light fields or by imaging a mask on the object under investigation. In no case does the choice of generation technique, whether by interference or by imaging an appropriate mask, represent an innovative way to overcome the teaching of this disclosure.
[0020] “Abundancy profile” is used herein for relative and/or absolute proportions of specific entries within a population, as for example the intestinal microbiome, but not limited hereto. Such abundance profiles allow populations to be compared in terms of their content and functionality. The abundance profile can be displayed in tabular format, with each line corresponding to a specific entry (e.g. virus, microorganism, bacteria, and species) or groups thereof (families, functional clusters). The abundance profiles can be relative or absolute, by inclusion of non-assigned signals. The abundancy profile may be used as a diagnostic means, due to its potential alterations caused by diseases.
[0021] “Detector” is used herein to mean any device capable of measuring energy in an electromagnetic signal as a function of wavelength. A detector array means a plurality of detectors. In general, the preferred detector arrays used in this disclosure have their optimal sensitivity in the wavelength range of the used source and/or of the fluorescence light induced by the light source. The detectors can be one-dimensional, multi-dimensional or line arrays, depending on the optical setup. In the mostly used wavelength range of 400-700 nm, CMOS detectors currently have the best performance with respect to sensitivity and read-out speed. As a common definition, high speed cameras are array detectors acquiring images at a frame rate of at least 100 frames per second (fps). It is obvious that high frame rates translate to short integration times per array pixel. It is known by those skilled in the art that CMOS-TDI cameras (TDI - time delay integration) overcome this limit of integration time by shifting and adding the induced photoelectrons into the readout register [3]. This TDI principle and its use for space-time imaging is also understood as a fast readout concept offering an increased sensitivity.
[0022] The aforementioned embodiments and advantages are exemplary and are not to be construed as limiting the present invention. The present teaching may be extended to other instrumentations. The detailed description of the present invention is intended to be illustrative, and in no case to limit the scope of this invention. Many alternatives, alterations, modifications and variations will be apparent to those skilled in the art.
A BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
[0023] Fig. 1 discloses the optical system 1010, 1110 complemented by an electronic synchronization system 230 for space-time imaging. The optical detection system 1010 comprises an objective 13 and a tube lens 12 collecting and transporting the light emanating from the sample 1030. As indicated, the sample is moving across a static illumination field generated by the illumination system 1110. This illumination system 1110 contains a light source LS and optical means to generate at least a 2-beam configuration, which allows a static fringe pattern to be created at the object plane.
A synchronization electronic system 231 generates the triggering of the light source LS or illumination system 1110 according to the temporal position of the sample (measured by an attached sensor MS) and ensures the synchronization with the clock frequency of the camera 11. [0024] Fig. 2 shows the time-space diagram and is intended to convey the underlying principles of space-time imaging. This diagram 100 is decomposed into 4 subparts. The first subdiagram 120 shows the structured illumination pattern 133. This pattern extends over the full field, but stays static in time.
[0025] The 2nd subdiagram 140 shows the moving object at three distinct time points. The constantly moving object is illuminated by the static periodic structured illumination. This structured illumination pattern is preferentially oriented perpendicular to the movement direction. It is obvious for those skilled in the art that a pulsed illumination of short duration 111 (for example, for a typical sample movement of 1 mm/s, a pulse duration of <1 μs will avoid blurring) allows the relative phase shifts seen by a subfield to be defined at least at 2 timepoints during the movement of the whole object across the observation field. As said, the time duration of the structured illumination is chosen short enough to avoid any blurring artifact.
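The blur bound quoted in the parenthesis above follows from motion blur being the distance travelled during one pulse; a quick check with the example figures from the text:

```python
# Motion blur equals the distance the sample travels during one illumination
# pulse; the numbers are the example values quoted in the text.
V = 1e-3          # sample speed: 1 mm/s, in m/s
tau = 1e-6        # pulse duration: 1 us, in s

blur = V * tau    # distance travelled during the pulse, in m
print(blur)       # 1e-09, i.e. 1 nm - far below an optical wavelength
```

A 1 nm smear is negligible against both the fringe period and the diffraction-limited spot size, which is why such a pulse duration avoids any visible blurring artifact.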
[0026] The subdiagram 160 shows the subfield at three distinct time points. The synchronization between the moving object, the pulsed illumination and the acquisition by the camera is important for high quality super-resolved imaging. Subdiagram 180 shows the illumination pattern as seen by an observer moving with the object. Obviously, this underlines the phase-shifted illumination pattern which finally allows the super-resolved imaging.
[0027] Fig. 3 [300] illustrates the moving object [310] at different but determined time points tN. The static illumination is shown [330, 332-338], as well as the dynamic “SIM” object at the time points mentioned above. It is obvious to those skilled in the art that this is equivalent to a situation of a static object and a moving illumination pattern.
[0028] Fig. 4 shows stretched DNA strands [400] on a glass slide. The various DNA fragments attached to this slide appear in this experimental approach as a 1-dimensional object [401] almost perfectly oriented in a predefined direction.
Fig. 5 demonstrates the signal acquisition and processing for the case of two π-shifted images. Sub-Figure 5001 shows the ground truth, the exact localizations of the marked positions of a 1-dimensional object (but not limited hereto). Sub-Figure 5002 shows the ground truth as a reference and the DC part of the signal, which would correspond to a classical image with no super-resolution. Sub-Figure 5003 shows the AC part corresponding to the difference operation of two consecutive images shifted by π. Obviously, the resolution is improved and the different peak positions are clearly correlated to the ground truth. As said, this represents only the AC part and not the total intensity, which explains the negative values below zero intensity. Sub-Figure 5004 indicates the difference in signal resolution and localization accuracy between a classical image as shown in sub-Figure 5002 in comparison to the AC part in sub-Figure 5003. The dash-dot line in sub-Figure 5003 is the spatial derivative, which indicates that further processing may help to improve the localization of markers which are fully unresolved when taking a classical image (see also sub-Figure 5002). It is obvious that the derivative shown is only an example of more sophisticated processing tools.
Finally, sub-Figure 5005 shows the super-resolved image based on the AC part with a clear super-resolved quality. Even if the first 3 lines are not resolved, the increased signal amplitude allows at least an estimation of the underlying multiple labels. Sub-Figure 5006 is an overlay of the ground truth as shown in sub-Figure 5001, the AC part as shown in sub-Figure 5003, a classical image as shown in sub-Figure 5002 and the super-resolved image as shown in sub-Figure 5005.
Herein, identical reference numerals are used, where possible, to designate identical elements that are common to the Figures.
DETAILED DESCRIPTION OF THE INVENTION
[0029] Fig. 1 discloses an exemplary optical system or optical imaging system 1 for space-time imaging according to the present disclosure. The optical system 1 comprises, for example, an illumination system 1110, displacement means DM, an optical detection system 1010 and a synchronization system 230.
[0030] The illumination system 1110 is configured to generate a structured illumination pattern 133 or static structured illumination pattern 133 (see Fig. 2) and configured or arranged to illuminate the at least one sample or object 1030 (to be investigated or under investigation) with the static structured illumination pattern 133. The static structured illumination pattern 133 comprises or consists of, for example, a fringe-based illumination pattern as shown for example in Figure 2, where the fringes comprise or consist of, for example, alternating bright and dark regions or elongations. The static structured illumination pattern 133 is generated by a structured light pattern generator 1012 of the illumination system 1110 and imaged onto the sample or support 1030 by at least one imaging means 1011 , for example, at least one optical lens. The static structured illumination pattern 133 may have, for example, a pattern oriented (substantially) perpendicular to a direction of displacement of the sample 1030 by the displacement means DM, as shown for example in Fig. 2.
The structured light pattern generator 1012 includes, for example, a light source LS and optical means or elements configured or arranged to generate at least a two-beam configuration arranged to generate a static fringe pattern via interference. The static fringe pattern is provided or created at an object plane and imaged by imaging means 1011 onto the sample 1030. As indicated, the sample is arranged to move across this static illumination field generated by the illumination system 1110. The light source LS may, for example, comprise or consist of a continuous or pulsed light source. The light source LS may comprise or consist of, for example, an LED or a laser. The light source LS may comprise or consist of a coherent light source. Pulsed light illumination may, for example, be used to generate the static structured illumination pattern 133. The pulsed light illumination may be provided by a pulsed light source or by modulating the output of a continuous light source. A pulse duration of the pulsed light illumination may, for example, be set to a value that reduces or assures an absence of blurring, as discussed further below.
The optical means or elements, configured to generate at least a two-beam configuration and the static fringe pattern, may for example comprise or consist of a plurality of optical reflectors (mirrors) and at least one beam splitter arranged to generate interference between the beams. The structured illumination pattern 133 can thus be generated by the interference of mutually coherent light fields. As previously mentioned, the structured illumination pattern can however, be generated in other ways, for example, by imaging a mask onto the object under investigation to produce the desired pattern on the object. The orientation of the static structured illumination pattern 133 may, for example, be defined by the orientation of the optical reflectors.
[0031] The displacement means (or sample displacer) DM is configured to displace the sample 1030 relative to the illumination system 1110, and relative to the other elements of the optical system 1, for example, the optical detection system 1010. The displacement means DM is configured to displace the sample 1030 relative to the generated static structured illumination pattern 133, or relative to the image plane or image formed by the imaging means 1011 of the structured illumination pattern 133. The displacement means DM comprises, for example, a motorized translation stage and controller. A sample holder, for example, attached to the translation stage may hold the sample 1030.
A displacement speed V of the translation stage or sample may, for example, be measured or determined. For example, it may be determined on the basis of a measured current provided to a displacement motor, or a measured rotation speed of the motor. The controller or the synchronization system 230 may determine the displacement speed V based, for example, on a look-up table containing displacement speed values associated with motor current or motor rotation. The controller provides the determined displacement speed V to the synchronization system 230. Alternatively, for example, the displacement speed V is set or determined by the controller or the synchronization system 230 setting a current to be provided to the displacement motor based, for example, on the look-up table containing displacement speed values associated with motor current. Various solutions for such translation stages exist, even with an integrated displacement module for measuring the displacement in real-time and with an accuracy better than 10 nm (see, for example, linear motor solutions etc., References [7] and [8]). The displacement means DM can be, for example, configured to continuously displace the sample 1030 in the generated static structured illumination pattern 133 or across the generated static structured illumination pattern 133. The displacement means DM can be, for example, configured to displace the sample 1030 in or across the generated static structured illumination pattern 133 at a constant or non-constant speed V. The displacement means DM can be, for example, configured to displace the sample 1030 in or across the generated static structured illumination pattern 133 unidirectionally or non-unidirectionally, or in an oscillating manner.
[0032] The optical detection system 1010 comprises, for example, a first imaging means 13, for instance an objective 13, arranged to receive light emanating from the sample 1030, a second imaging means 12, for instance a tube-lens 12, arranged to collect light from the first imaging means 13 and to transport or direct the light emanating from the sample 1030 towards a camera or detector 11. The first and second imaging means 12, 13 are arranged relative to the sample 1030 and displacement means DM to image the light emanating from/through the sample 1030 onto the camera or detector 11 where the camera 11 captures or performs image acquisitions.
The camera or detector 11 comprises or consists of, for example, the detector defined earlier.
The optical detection system 1010 is configured to receive electromagnetic radiation emanating from (or transmitted from) the illuminated sample 1030 during displacement of the sample 1030 in the static structured illumination pattern 133, or during displacement of the sample 1030 across the static structured illumination pattern 133.
By setting an acquisition frame rate of the camera 11 at an appropriate value relative to the speed of displacement V of the sample 1030, the optical detection system 1010 can image the illuminated sample 1030 within a field-of-view that is less than an observable field-of-view of the optical detection system 1010, for example, within a field-of-view that is half the observable field-of-view. This can be done, for example, by setting an acquisition frame rate of the camera 11 to a rate that is sufficiently fast to capture a predetermined number of images as the sample 1030 is being displaced, for example, only two or at least two images. Fig. 2 shows, for example, a plurality of subfields.
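The frame-rate condition implied in the paragraph above (at least two acquisitions of each object point within a sub-field of half the observable field) can be sketched as follows. The field width and speed are assumed values, and the specific inequality is our reading of the argument rather than a formula stated in the text:

```python
# Assumed numbers: observable field W and sample speed V. Capturing every
# object point at least twice within a sub-field of width W/2 requires the
# displacement between frames, V / f, to be at most W / 2.
W = 100e-6        # observable field of view: 100 um (assumption)
V = 1e-3          # sample speed: 1 mm/s (assumption)

f_min = 2.0 * V / W   # minimum camera frame rate, frames per second
print(f_min)          # 20.0 fps for these assumed numbers
```

Any frame rate above this bound guarantees the two phase-shifted views of each sub-field that the AC/DC separation needs.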
[0033] The synchronization system 230 comprises or consists of, for example, an electronic synchronization system 230. The electronic synchronization system 230 is connected to (and/or configured to communicate with) the illumination system 1110, for example, to the structured light pattern generator 1012 or a component thereof, such as a pulsed light source, and is configured to control operation of the illumination system 1110 and elements of the illumination system 1110.
The electronic synchronization system 230 is also connected to (and/or configured to communicate with) the optical detection system 1010, for example, to the camera 11 and configured to control operation of the camera 11.
The electronic synchronization system 230 is also connected to the displacement means DM and configured to control operation of the displacement means DM. The electronic synchronization system 230 is configured to control or synchronize operation of these elements or components.
The electronic synchronization system 230 includes, for example, processing means, for instance a processor 231 and/or analog and/or digital processing circuits, so-called encoding (encoder) and decoding (decoder) means, connected to elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010. For example, the processing means or processor 231 is connected to the optical source LS of the illumination system 1110, the camera 11 of the optical detection system 1010 and a controller of the displacement means DM. The processing means or processor 231 may, for example, be connected directly to elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010 to communicate and control operation via a local controller. Alternatively, the processing means or processor 231 may, for example, be connected indirectly to these elements of the illumination system 1110, the displacement means DM, and the optical detection system 1010 via controllers 232, 233, 234 included in the electronic synchronization system 230.
The processing means or processor 231 is connected to these elements of the system 1 and configured to control and command these elements to permit synchronized or timely operation of the different or interrelated elements of the system 1.
The system 1 or electronic synchronization system 230 includes, for example, a memory or storage means ST (for example, semiconductor memory, HDD, or flash memory) configured to store or storing at least one program or processor executable instructions. The at least one program or processor executable instructions may comprise instructions permitting, for example, the control and command of the system elements (such as the camera 11, the optical source and the displacement means DM) and the communication of data to and from system elements. The processor executable instructions may, for example, comprise instructions permitting the captured images or image data from the camera 11 to be received and processed.
The processing means or a processor 231 and the memory ST can be, for example, included in a computer or a portable device such as a smart phone.
The processor executable instructions can include instructions permitting various different actions concerning capturing and processing images and image data of the present disclosure.
The processor executable instructions are provided to or obtained by the processor for execution. The electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 at different timepoints tN during the displacement of the sample 1030 in or across the static structured illumination pattern 133. This provides a plurality of signals (or images captured by the camera 11) comprising a relative phase shift of the generated static structured illumination pattern 133. The signals are processed by the processor 231.
Triggering the generation of the static structured illumination pattern 133 at different timepoints by the electronic synchronization system 230 may, for example, be carried out by triggering the light source LS (to output light) according to the temporal position of the sample 1030, in a manner that assures synchronization with a clock frequency of the camera 11. The clock frequency of the camera 11, for example, determines the specific time points of the image acquisition of the camera 11.
The optical system 1 includes a position measurement device or sensor MS configured to measure or determine the position in time of the sample 1030. The position is, for example, measured or determined while the sample 1030 is being displaced by the displacement means DM.
A first image of the static structured illumination pattern 133 is acquired at a first position in time and, during the displacement of the sample 1030, a second image (or further images) of the static structured illumination pattern 133 is acquired at a later position in time (with respect to the position in time of the first acquisition) that corresponds to a predetermined or desired phase shift of the static structured illumination pattern 133 relative to the first acquisition, for example, a phase shift of π or λ/2.
Real time displacement monitoring with subwavelength precision is, for example, carried out by a displacement monitoring system included, for example, in the position measurement device or sensor MS. The displacement monitoring system includes, for example, an interferometer system such as a Michelson interferometer system including, for example, a separate coherent light source such as a laser, where a first reflector is attached to and displaced with the sample 1030 to provide a simple back-reflection that is arranged to interfere with reflected light from a second reflector fixed in relation to the illumination system 1110 or the optical detection system 1010 (located in the fixed frame). This interferometric scheme provides an interference signal for monitoring the displacement of the sample 1030 and permits to determine a position of the sample 1030 with a precision better than λ/20.
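The phase-to-displacement conversion behind this interferometric read-out can be sketched in a few lines. The sketch below is illustrative only (the function name and the laser wavelength are assumptions, not taken from the disclosure); it converts an unwrapped Michelson interference phase into a sample displacement.

```python
import numpy as np

def displacement_from_phase(phase_rad, wavelength):
    """Michelson geometry: displacing the sample reflector by d changes the
    optical round-trip path by 2*d, so the interference phase is
    phi = 4*pi*d / wavelength, i.e. d = phi * wavelength / (4*pi)."""
    return phase_rad * wavelength / (4 * np.pi)

# Illustrative numbers: a HeNe laser at 633 nm.
lam = 633e-9
d_one_fringe = displacement_from_phase(2 * np.pi, lam)   # one full fringe = lambda/2
# Resolving the phase to a tenth of a fringe (2*pi/10) already localizes the
# sample to lambda/20, the precision quoted above.
d_tenth = displacement_from_phase(2 * np.pi / 10, lam)
```

Resolving the interference phase to a tenth of a fringe thus directly yields the λ/20 positioning precision mentioned in the text.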
In view of the precisely determined displacement position of the sample 1030, processing means 231 is configured to determine the positions in time where image acquisition of the static structured illumination pattern 133 is carried out.
A camera or photodetector, for example, may capture the interference signal that is processed by controller 234 and/or processing means 231 to determine when the sample 1030 has been displaced to a position where the predetermined or desired phase shift of the static structured illumination pattern 133 relative to the first or a previous acquisition is achieved, and determine the position in time that a second or further image acquisition is carried out.
The electronic synchronization system 230 is configured to trigger and/or synchronize acquisition by the camera 11 of the electromagnetic radiation emanating from the illuminated sample 1030 with the triggering of the generation of the static structured illumination pattern 133 by the illumination system 1110.
The electronic synchronization system 230 is configured to receive or determine the temporal positions of the sample 1030, and further configured to trigger the generation of the static structured illumination pattern 133 and/or the acquisition, by the camera 11, of the electromagnetic radiation emanating from the illuminated sample 1030 based on the temporal sample positions at which image acquisition is to be carried out. The electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 and the acquisition of the electromagnetic radiation emanating from the illuminated sample 1030 at different timepoints tN (or different temporal positions of the sample 1030), for example, determined by the following equation:
\[ t_N = \frac{N\,\pi}{K_{ill}\,V} \]
where tN is a timepoint, V is the speed of displacement of the sample 1030, N is a positive natural number and K_ill is the wave vector of the illumination light producing the static structured illumination pattern 133.
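The trigger timepoints above are easy to compute. The following minimal sketch is illustrative (the function name, fringe period and speed are assumed example values, not part of the disclosure):

```python
import math

def trigger_timepoints(speed, k_ill, n_images):
    """Timepoints t_N = N*pi / (K_ill * V): between consecutive triggers the
    sample advances by half a fringe period, i.e. a pattern phase shift of pi."""
    return [n * math.pi / (k_ill * speed) for n in range(1, n_images + 1)]

# Illustrative numbers: 500 nm fringe period, sample moving at 5 mm/s.
k_ill = 2 * math.pi / 500e-9          # illumination wave vector K_ill [rad/m]
v = 5e-3                              # displacement speed V [m/s]
t = trigger_timepoints(v, k_ill, 4)
# Between consecutive triggers the sample moves v * (t[1] - t[0]) = 250 nm,
# i.e. half a fringe period, which corresponds to a pi phase shift.
```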
The electronic synchronization system 230 is configured to trigger the generation of the static structured illumination pattern 133 and the acquisition of at least two (or a plurality of) signals of the electromagnetic radiation emanating from the illuminated sample 1030 at least two (or a plurality of) different timepoints tN.
The optical system 1 may include, for example, at least one sample or support 1030. The sample or support 1030 may, for example, comprise a slide or glass slide which holds a 1D object or objects. The sample or support 1030 may include, for example, a plurality of 1-dimensional (1D) objects, or elongated 1D objects. The 1D object or the plurality of 1D objects extend, for example, in or along one spatial direction (or solely one spatial direction). Orthogonal directions to the direction of extension, or all orthogonal object dimensions, are (much) smaller when compared to an optical wavelength, for example, the wavelength of the light being used in the optical system 1.
The 1D object(s) may comprise optical light emitting markers or labels, for example, fluorescently labeled or marked.
The 1D object may comprise or consist of, for example, a DNA strand. Fig. 4 shows stretched DNA strands on a glass slide 400. The various DNA fragments attached to this slide appear in this experimental approach as 1-dimensional objects 401 (almost perfectly) oriented in a predefined direction. The DNA strands may, for example, be fluorescently labeled. According to another aspect of the present disclosure, space-time imaging is carried out by the optical system 1 by acquiring electromagnetic radiation emanating from fluorescent markers or light scatterers of the illuminated sample 1030 as well as a sequence of images at different timepoints tN, as is now further explained.
Space-Time imaging
[0034] Classical structured illumination microscopy (SIM) shifts the illumination fringe pattern for the acquisition of 3 images taken with 3 distinct phase lags θi per k-space orientation [LIT], as given by
\[ \theta_i = (i-1)\,\frac{2\pi}{3}, \qquad i = 1, 2, 3 \]
or correspondingly 3 phase lags at 0, 120, 240 degrees for generating 3 distinct “SIM-objects”
\[ O_i^{SIM}(x) = O(x)\,\big[1 + \cos(K_{ill}\,x + \theta_i)\big] \]
This procedure is repeated for additional angular directions. We obtain for a fringe pattern illuminated object of the sample
\[ O^{SIM}(x) = O(x)\,\big[1 + \cos(K_{ill}\,x + \theta_i)\big] \]
where O(x) is the object given as a function of the spatial coordinates x = (x', y', z'). [0035] For those skilled in the art, it is obvious that shifting the phase is equivalent to moving or displacing the object O(x) by a certain equivalent displacement Δxi relative to a reference position given by the static fringe illumination. This fringe illumination is characterized by the illumination K-vector denoted as
\[ K_{ill} = \frac{2\pi}{\Lambda_{ill}} \]
or the illumination wavelength
\[ \Lambda_{ill} = \frac{2\pi}{K_{ill}} \]
and we obtain for the displacement
\[ \Delta x_i = \frac{\theta_i}{K_{ill}} \]
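The phase-to-displacement equivalence can be checked numerically; in the sketch below the fringe period is an assumed example value, not taken from the disclosure:

```python
import math

fringe_period = 500e-9                       # Lambda_ill, illustrative value
k_ill = 2 * math.pi / fringe_period          # K_ill = 2*pi / Lambda_ill
# Classical SIM phase lags (0, 120, 240 degrees) and their equivalent
# object displacements Delta_x_i = theta_i / K_ill.
displacements = [theta / k_ill
                 for theta in (0.0, 2 * math.pi / 3, 4 * math.pi / 3)]
# A phase lag of 2*pi/3 corresponds to shifting the object by a third
# of a fringe period.
```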
[0036] There are various ways to shift/displace the object relative to the static fringe illumination. The Inventor exploits the equivalence of an object moving with a known and (precisely) measurable speed Vx in a given spatial direction x.
Taking now images of this dynamic or moving “SIM-object”
\[ O^{SIM}(x, t) \]
[Fig. 3: 310, 312-318] moving, as given in this example (but in general not limited hereto), with a beforehand determined constant speed Vx, we write for a dynamic “SIM” object
\[ O^{SIM}(x, t) = O(x - V\,t)\,\big[1 + \cos(K_{ill}\,x + \varphi)\big] \]
(V is the movement speed of the sample; K_ill the illumination wave vector producing the fringe pattern used for illumination; and φ an insignificant constant residual phase lag.) Derived from classical SIM we write for our dynamic “SIM” object [Fig. 3: 350, 352-358]
\[ O^{SIM}(x, t) = O(x - V\,t)\,\Big[1 + \tfrac{1}{2}\,e^{\,i(K_{ill}\,x + \varphi)} + \tfrac{1}{2}\,e^{-i(K_{ill}\,x + \varphi)}\Big] \]
where φ is again the residual phase and i the imaginary unit, as we used the complex representation of the cosine function (Euler formula).
[0037] For those skilled in the art, it is obvious that moving the object across a static illumination pattern is equivalent to moving the illumination pattern
\[ c_{ill}(x, t) \]
over a static object O(x) (in other terms this is just a change of the coordinate system). We write for this moving illumination pattern [Fig.3: 330,332-338]
\[ c_{ill}(x, t) = 1 + \cos\!\big(K_{ill}\,x - K_{ill}\,V\,t + \varphi\big) \]
or in its equivalent form given in Fourier space based on the Fourier coordinates p
\[ C_{ill}(p, t) = \delta(p) + \tfrac{1}{2}\,e^{-i(K_{ill}\,V\,t - \varphi)}\,\delta(p - K_{ill}) + \tfrac{1}{2}\,e^{+i(K_{ill}\,V\,t - \varphi)}\,\delta(p + K_{ill}) \]
[0038] This Fourier transform (indicated by the capital C) of c_ill(x,t) bears the key ingredients for the super-resolution based on structured illumination. This is recognized by the Dirac functions δ(p ∓ K_ill) and the corresponding phase shift
\[ e^{\pm i(K_{ill}\,V\,t - \varphi)} \]
which clearly indicates an increase of the spatial frequency footprint in Fourier space or the resulting super-resolution in the real space (coordinate x as explained above). [0039] Integrating and assembling all these elements into the classical image-object relation
\[ I(x'') = PSF(x'') \otimes O(x) \]
where PSF is the point spread function and ⊗ represents a mathematical convolution relating the image I(x'') to the object O(x) (x'' represents the coordinates in image space and x the coordinates in object space).
For the equivalent expression in Fourier space we write
\[ \tilde{I}(p) = OTF(p)\,\tilde{O}(p) \]
where OTF is the optical transfer function related to the Fourier transform of the point spread function PSF. For the general object O(x) and its Fourier spectrum
\[ \tilde{O}(p) \]
we substitute the SIM-Object
\[ O^{SIM}(x, t) = O(x)\,c_{ill}(x, t) \]
and obtain for the corresponding object spectrum
\[ \tilde{O}^{SIM}(p, t) = \tilde{O}(p) + \tfrac{1}{2}\,e^{-i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p - K_{ill}) + \tfrac{1}{2}\,e^{+i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p + K_{ill}) \]
and for the image spectrum
\[ \tilde{I}(p, t) = OTF(p)\,\Big[\tilde{O}(p) + \tfrac{1}{2}\,e^{-i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p - K_{ill}) + \tfrac{1}{2}\,e^{+i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p + K_{ill})\Big] \]
in Fourier space.
Those skilled in the art recognize immediately the differences from classical structured illumination imaging.
The first term
\[ OTF(p)\,\tilde{O}(p) \]
which we name the DC-term, is nothing else than the classical image, not depending on time.
The 2nd and 3rd terms
\[ \tfrac{1}{2}\,OTF(p)\,e^{\mp i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p \mp K_{ill}) \]
which we name the AC-terms, are shifted in Fourier space and carry the higher spatial frequencies providing the super-resolution known in analogy from classical SIM. In obvious contrast to SIM, both terms are modulated in time due to the time dependence of the phase factor
\[ P(t) = e^{\pm i(K_{ill}\,V\,t - \varphi)} \]
This phase factor P(t) carries the innovation of the present space-time imaging method and system.
[0040] As explained, the moving object causes a continuous phase modulation of its Fourier transformed image. As indicated above, this phase factor depending on time and the constant parameters
\[ K_{ill},\quad V,\quad \varphi \]
allows the reconstruction of a super-resolved image.
[0041] Taking only 2 images with a π phase shift between these 2 images, we obtain for the aforementioned phase factor
\[ P(t_2) = e^{\pm i(K_{ill}\,V\,t_1 - \varphi + \pi)} = -P(t_1) \]
Taking the sum of 2 images with a total phase shift of π we will obtain a sum DC-signal corresponding to
\[ \tilde{I}(p, t_1) + \tilde{I}(p, t_2) = 2\,OTF(p)\,\tilde{O}(p) \]
or by taking the difference of both π-shifted images we will obtain an AC-only signal corresponding to
\[ \tilde{I}(p, t_1) - \tilde{I}(p, t_2) = OTF(p)\,\Big[e^{-i(K_{ill}\,V\,t_1 - \varphi)}\,\tilde{O}(p - K_{ill}) + e^{+i(K_{ill}\,V\,t_1 - \varphi)}\,\tilde{O}(p + K_{ill})\Big] \]
which clearly shows the extended Fourier space or the high spatial frequency content which leads to an improved localization accuracy. This is in essence shown in Fig. 5, where the sub-image 5002 shows the DC-signal and sub-image 5004 shows the AC-signal. The localization improvement due to the enlarged spatial frequency contributions is quite obvious. In summary, taking the difference between the Fourier transforms of images according to the rules mentioned above will generate “difference images” having the DC-term eliminated and the AC-contribution doubled. In this example, 2 images are sufficient to extract the individual AC-terms and achieve a super-resolution of 1-dimensional objects. The optical system 1 of the present disclosure is configured to process consecutive phase shifted images (for example, 2 images with a π phase shift) to provide corresponding Fourier transformed images, and to perform subtraction of the Fourier transformed images to determine information carrying higher spatial frequencies providing a super-resolution, or an improved localization or spatial information of the 1D object or objects, such as DNA strands, located on the sample 1030.
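The two-image sum/difference scheme can be sketched in a minimal 1-D simulation. All numbers and names below are illustrative assumptions (the PSF is omitted for brevity): two π-shifted acquisitions are simulated, their sum isolates the DC (classical) part, and their difference isolates the AC part, whose spectrum is the object spectrum shifted by ±K_ill.

```python
import numpy as np

n = 1024
x = np.arange(n)
obj = np.zeros(n)
obj[[400, 430]] = 1.0                      # two sub-resolution emitters
m = 16                                     # fringe periods across the field
k = 2 * np.pi * m / n                      # illumination wave vector K_ill

def acquire(phase):
    """Fringe-illuminated object for one pattern phase (PSF omitted)."""
    return obj * (1.0 + np.cos(k * x + phase))

i1, i2 = acquire(0.0), acquire(np.pi)      # two consecutive pi-shifted images
dc_image = i1 + i2                         # = 2*obj: the classical (DC) image
ac_image = i1 - i2                         # = 2*obj*cos(k*x): the AC-only signal
ac_spec = np.fft.fft(ac_image)             # object spectrum shifted by +-m bins
```

The AC spectrum equals the object spectrum displaced by ±K_ill, which is exactly the extended Fourier support from which the improved localization follows.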
[0042] If the acquisition rate is high enough, a 3D Fourier transform over the temporal dimension of the 2-dimensional acquired image stack will offer an alternative filtering of the DC and AC contributions. The DC-term appears at the temporal frequency
\[ \nu = 0 \]
whereas the AC-contribution will appear at the a priori known temporal frequency
\[ \nu = \pm\,\frac{K_{ill}\,V}{2\pi} \]
This signal processing will result in an improved signal-to-noise ratio. [0043] As shown above, splitting the intensity into an AC and a DC part (as indicated by the formula above) can be easily generalized by taking a phase shift between consecutive images. Obviously, a phase shift of π needs only two images for separating and extracting the AC and DC parts. It is possible to recombine the AC and DC parts in such a manner that a super-resolved image can be extracted by this simple sum and difference operation of the AC and DC parts. Besides the simplicity of image processing, this procedure provides a faster signal acquisition as only 2 images are needed (see also Fig. 5).
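The alternative temporal-Fourier filtering can also be sketched numerically. In the illustrative example below (frame rate and the K_ill·V product are assumed values, not part of the disclosure), the time trace of a single pixel of the image stack is transformed: the DC part lands in the zero-frequency bin and the AC part at f = K_ill·V/(2π).

```python
import numpy as np

n_frames, frame_rate = 256, 1000.0          # illustrative acquisition settings
t = np.arange(n_frames) / frame_rate
k_ill_v = 2 * np.pi * 62.5                  # K_ill * V chosen so f_AC = 62.5 Hz
pixel_trace = 1.0 + np.cos(k_ill_v * t)     # one pixel of the image stack
spec = np.abs(np.fft.rfft(pixel_trace)) / n_frames
f_ac_bin = round(62.5 * n_frames / frame_rate)   # FFT bin of the AC peak
# spec[0] holds the DC term; spec[f_ac_bin] holds the AC contribution at the
# a priori known temporal frequency K_ill*V/(2*pi).
```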
[0044] In summary, an object moving with a constant speed leads to a phase modulation of the Fourier transformed image. This continuous phase modulation can be used for selecting the AC-part, while suppressing the DC-part by a simple image subtraction, supposing an appropriate time difference determined by the speed and the wave vector of the illumination.
The system 1 or the electronic synchronization system 230 is configured to carry out the above-described processing steps. The processor 231 and the memory ST include a program or instructions configured to carry out the above-described processing steps.
[0045] This is clearly an advantage of dynamic SIM when compared to classical SIM where the super-resolved information needs at least 3 images.
[0046] Those skilled in the art may object that the moving object is constantly moving out of the observable field. This objection is correct, but can easily be circumvented by restricting the observed field to half the observable field size. This restricted sub-field can be imaged at least 2 times by an appropriate choice of the camera frame rate in relation to the speed of the object. [0047] Those skilled in the art may further object that the image of a moving object will be blurred due to the sample movement causing this artefact. This artefact can easily be circumvented when using a short pulsed illumination, where the illumination duration is chosen short enough. A simple calculation shows that for a speed in the order of several mm/sec, a pulsed illumination with a duration of less than a microsecond fulfills the requirement of a negligible blur effect. [0048] It is worth mentioning that this space time imaging can be generalized to any movement, unidirectional or not, even oscillating, constant or not, as long as the moving sample can be matched in time and space to an adequate illumination pattern. This will in no case be considered as an alternative to the innovative solution of the problems enumerated in the introductory paragraphs.
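The blur argument in [0047] is easy to verify with the example values from the text (the function name and the quoted diffraction limit are illustrative):

```python
def motion_blur(speed, pulse_duration):
    """Distance travelled by the sample during one illumination pulse."""
    return speed * pulse_duration

# Example values from the text: a few mm/s with a sub-microsecond pulse.
blur = motion_blur(5e-3, 1e-6)     # 5 mm/s * 1 us = 5 nm
# 5 nm is far below a typical diffraction limit (roughly 250 nm), so the
# motion-induced blur is indeed negligible.
```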
[0049] While the invention has been disclosed with reference to certain preferred embodiments, numerous modifications, alterations, and changes to the described embodiments, and equivalents thereof, are possible without departing from the sphere and scope of the invention. Accordingly, it is intended that the invention not be limited to the described embodiments and be given the broadest reasonable interpretation in accordance with the language of the appended claims. The features of any one of the above-described embodiments may be included in any other embodiment described herein.
REFERENCES
[1] M. G. Gustafsson, “Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy,” J. Microsc. 198(2), 82-87 (2000).
[2] W. Vandenberg, M. Leutenegger, T. Lasser et al., “Diffraction-unlimited imaging: from pretty pictures to hard numbers,” Cell Tissue Res. 360, 151-178 (2015). https://doi.org/10.1007/s00441-014-2109-0
[3] H. Mikami, C. Lei, N. Nitta, T. Sugimura, T. Ito, Y. Ozeki, and K. Goda, “High-speed imaging meets single-cell analysis,” Chem 4(10), 2278-2300 (2018).
[4] A. Lal, C. Shan and P. Xi, “Structured Illumination Microscopy Image Reconstruction Algorithm,” IEEE Journal of Selected Topics in Quantum Electronics 22(4), 50-63 (2016), Art. no. 6803414, doi:10.1109/JSTQE.2016.2521542.
For Fourier optics and interferometry:
[5] J. Goodman, Introduction to Fourier Optics, 3rd edition, Roberts & Company, Englewood, Colorado.
[6] P. Hariharan, Optical Interferometry, 2nd edition, Academic Press, 2003.
Translation stages and position monitoring:
[7] Linear piezo motors with integrated position measuring device, Dr. Fritz Faulhaber GmbH & Co. KG, Daimlerstraße 23/25, 71101 Schönaich, Germany, www.faulhaber.com
DC motors and integrated encoders etc.:
[8] maxon motor ag, Brünigstrasse 220, 6072 Sachseln, Switzerland, [email protected]
The content of each of the above being fully incorporated herein by reference.

Claims

1. Optical system for space-time imaging comprising:
- an illumination system (1110) configured to generate a static structured illumination pattern (133) for illuminating at least one sample (1030) to be investigated; - displacement means for displacing the sample (1030) relative to the generated static structured illumination pattern (133);
- an optical detection system (1010) configured to receive electromagnetic radiation emanating from the illuminated sample (1030) during displacement of the sample (1030) in or across the static structured illumination pattern (133); and - an electronic synchronization system (230) configured to trigger the generation of the static structured illumination pattern (133) at different timepoints (tN) during the displacement of the sample (1030) in or across the static structured illumination pattern (133) to provide a plurality of signals or images comprising a relative phase shift of the generated static structured illumination pattern (133). 2. Optical system according to the previous claim, wherein the electronic synchronization system (230) is configured to trigger or synchronize acquisition of the electromagnetic radiation emanating from the illuminated sample (1030) with the triggering of the generation of the static structured illumination pattern (133).
3. Optical system according to any one of the previous claims, wherein the electronic synchronization system (230) is configured to determine positions in time of the sample (1030), and further configured to trigger the generation of the static structured illumination pattern (133) and/or acquisition of the electromagnetic radiation emanating from the illuminated sample (1030) based on said positions in time of the sample (1030).
4. Optical system according to any one of the previous claims, wherein the electronic synchronization system (230) is configured to trigger the generation of the static structured illumination pattern (133) and acquire the electromagnetic radiation emanating from the illuminated sample (1030) at different timepoints (tN) determined by:
\[ t_N = \frac{N\,\pi}{K_{ill}\,V} \]
where tN is a timepoint, V is a speed of displacement of the sample (1030), N is a positive natural number and K_ill is an illumination wave vector producing the static structured illumination pattern (133).
5. Optical system according to any one of the previous claims, wherein the electronic synchronization system (230) is configured to trigger the generation of the static structured illumination pattern (133) and the acquisition of at least two signals or images of the electromagnetic radiation emanating from the illuminated sample (1030) at at least two different timepoints (tN).
6. Optical system according to any one of the previous claims, wherein the optical system is configured to process the plurality of signals or images comprising a relative phase shift of the generated static structured illumination pattern (133) to provide a plurality of Fourier transformed signals or images, and to perform subtraction of the Fourier transformed images or signals to determine super-resolved information of an object or objects located on the sample (1030). 7. Optical system according to any one of the previous claims, wherein the optical system is configured to process first and second signals or images, the second signal or image being a consecutive π phase-shifted signal or image of the generated static structured illumination pattern (133), and wherein the optical system is configured to perform a Fourier transform of the first and second signals or images, and to carry out a subtraction of the Fourier transformed first and second signals or images to determine super-resolved information of an object or objects located on the sample (1030).
8. Optical system according to the previous claim, wherein solely first and second signals or images are processed to determine the super-resolved information of an object or objects located on the sample (1030). 9. Optical system according to any one of the previous claims, wherein the displacement means is configured to continuously displace the sample (1030) in or across the generated static structured illumination pattern (133).
10. Optical system according to any one of the previous claims, wherein the displacement means is configured to displace the sample (1030) in or across the generated static structured illumination pattern (133) at a constant or non-constant speed (V).
11. Optical system according to any one of the previous claims, wherein the displacement means is configured to displace the sample (1030) in or across the generated static structured illumination pattern (133) unidirectionally or non-unidirectionally, or in an oscillating manner. 12. Optical system according to any one of the previous claims, wherein the illumination system (1110) is configured to generate a static structured illumination pattern (133) using pulsed light illumination.
13. Optical system according to the previous claim, wherein the illumination system (1110) is configured to set a pulse duration of the pulsed light illumination to a value assuring an absence of blurring.
14. Optical system according to any one of the previous claims, wherein the optical detection system (1010) is configured to image the illuminated sample (1030) within a field-of-view that is less than an observable field-of-view of the optical detection system (1010). 15. Optical system according to the previous claim, wherein the optical detection system (1010) is configured to image the illuminated sample (1030) within a field-of-view that is half the observable field-of-view.
16. Optical system according to any one of the previous claims 14 or 15, wherein the optical detection system (1010) is configured to image the illuminated sample (1030) within a field-of-view of the observable field-of-view by setting an acquisition frame rate of the optical detection system (1010) at an appropriate value relative to the speed (V) of the sample (1030).
17. Optical system according to any one of the previous claims, further including means for setting, measuring or determining a displacement speed (Vx) of the sample (1030). 18. Optical system according to any one of the previous claims, wherein the illumination system (1110) is configured to generate a static structured illumination pattern (133) having a pattern oriented perpendicular to a direction of displacement of the sample (1030).
19. Optical system according to any one of the previous claims, further including the sample (1030), wherein the sample includes a plurality of 1D objects, or elongated 1D objects.
20. Optical system according to the previous claim, wherein the 1D object extends in one spatial direction and all orthogonal object dimensions are much smaller when compared to an optical wavelength.
21. Optical system according to any one of the previous claims 19 or 20, wherein the 1D object comprises or consists of a DNA strand.
22. Optical system according to any one of the previous claims, wherein the optical system is configured to determine object localization information based on the following equation
\[ \tilde{I}(p, t) = OTF(p)\,\Big[\tilde{O}(p) + \tfrac{1}{2}\,e^{-i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p - K_{ill}) + \tfrac{1}{2}\,e^{+i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p + K_{ill})\Big] \]
wherein O is an object or structured light microscopy object, OTF is the optical transfer function, φ is a residual phase, p represents Fourier coordinates in Fourier space, and a sequence of images used in the determination of the object localization information are provided at timepoints (tN) determined by:
\[ t_N = \frac{N\,\pi}{K_{ill}\,V} \]
and wherein tN is a timepoint, V is a speed of displacement of the sample (1030), N is a positive natural number and K_ill is an illumination wave vector producing the static structured illumination pattern (133).
23. Optical system according to any one of the previous claims, wherein the optical system is configured to determine object localization information or AC contributions using a 3D-Fourier transform in two spatial dimensions and one temporal dimension.
24. Space-Time imaging method comprising the steps of:
- providing an optical system according to any one of the previous claims;
- acquiring electromagnetic radiation emanating from the illuminated sample (1030) and a sequence of images or signals at different timepoints (tN). 25. Space-Time imaging method according to the previous claim 24, further including the steps of processing the sequence of signals or images comprising a relative phase shift of the generated static structured illumination pattern (133) to provide a plurality of Fourier transformed signals or images, and performing subtraction of the Fourier transformed images or signals to determine super-resolved information of an object or objects located on the sample (1030).
26. Space-Time imaging method according to any one of the previous claims 24 or 25, further including processing first and second signals or images, the second signal or image being a consecutive π phase-shifted signal or image of the generated static structured illumination pattern (133), performing a Fourier transform of the first and second signals or images, and carrying out a subtraction of the Fourier transformed first and second signals or images to determine super-resolved information of an object or objects located on the sample (1030). 27. Space-Time imaging method according to the previous claim, wherein solely first and second signals or images are processed to determine the super-resolved information of an object or objects located on the sample (1030).
28. Space-Time imaging method according to any one of the previous claims 24 to 27, further including the step of: - determining object localization information or AC contributions according to the following equation
\[ OTF(p)\,\Big[e^{-i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p - K_{ill}) + e^{+i(K_{ill}\,V\,t - \varphi)}\,\tilde{O}(p + K_{ill})\Big] \]
using the acquired sequence of images, wherein O is an object or structured light microscopy object, OTF is the optical transfer function, φ is a residual phase, p represents Fourier coordinates in Fourier space, V is a speed of displacement of the sample (1030), t is time, and K_ill is an illumination wave vector producing the static structured illumination pattern (133).
29. Space-Time imaging method according to any one of the previous claims 24 to 28, wherein the sequence of images at timepoints (tN) are determined by:
\[ t_N = \frac{N\,\pi}{K_{ill}\,V} \]
where tN is a timepoint, V is a speed of displacement of the sample (1030), N is a positive natural number and K_ill is an illumination wave vector producing the static structured illumination pattern (133).
30. Space-Time imaging method according to any one of the previous claims 24 to 29, further including determining AC contributions using a 3D-Fourier transform in two spatial dimensions and one temporal dimension.
PCT/IB2021/052887 2020-04-08 2021-04-07 Space time imaging system and method WO2021205356A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IBPCT/IB2020/053361 2020-04-08
IB2020053361 2020-04-08

CN103477309B (en) Improvements in or relating to optical navigation devices
Zhao et al. Enhancing detectable fluorescence fluctuation for high-throughput and four-dimensional live-cell super-resolution imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21729627

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21729627

Country of ref document: EP

Kind code of ref document: A1