CN104155320B - Time-resolved ptychographic imaging method - Google Patents

Time-resolved ptychographic imaging method

Info

Publication number
CN104155320B
CN104155320B
Authority
CN
China
Prior art keywords
target object
function
reconstruction
wave
probe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410419950.8A
Other languages
Chinese (zh)
Other versions
CN104155320A (en)
Inventor
王鹏
高斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Original Assignee
Nanjing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University filed Critical Nanjing University
Priority to CN201410419950.8A priority Critical patent/CN104155320B/en
Publication of CN104155320A publication Critical patent/CN104155320A/en
Application granted granted Critical
Publication of CN104155320B publication Critical patent/CN104155320B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

The invention discloses a time-resolved ptychographic imaging method, i.e. a method of providing image data. The method reconstructs information about a region of a target object from a set of acquired diffraction images. The steps of the method are: acquire images of the target until a sufficient set of diffraction images has been collected; obtain a time-averaged reconstruction of the target object with the enhanced ptychographic iterative engine (ePIE); then, taking the full time series of target object and probe functions as the goal, change the initial input and obtain the final result through the enhanced ptychographic iterative engine (ePIE). Whereas the prior art, including ptychography and many other imaging techniques, can only reconstruct information about a static target object, the present invention can obtain a time-resolved reconstruction of the target object.

Description

Time-resolved ptychographic imaging
Technical Field
The present invention relates to the field of ptychographic imaging, and more particularly to a method of reconstructing a target object that recovers the evolution of the target over time from reconstructions of a set of acquired diffraction images.
Background
There are many imaging techniques for deriving spatial information about a target object. With ptychography, the achievable resolution can approach the theoretical wavelength limit, and the aberrations and instabilities introduced by the lenses normally present in an optical system can be avoided. Conventional scanning transmission imaging takes a significant amount of time to complete an image and measures only the total scattered intensity at the detector, not phase information; ptychography, in contrast, can complete the acquisition of an image in a short time, is less affected by system instability, and recovers several pieces of information about the target object by iterative means.
International application WO 2005/106531 discloses a method and apparatus for acquiring images from which phase information of a target object (hereinafter also referred to as the sample) can be reconstructed; the method is called ptychography. WO 2005/106531 also discloses a method of reconstructing the target object from images obtained with a moving probe function, called the ptychographic iterative engine (PIE). A typical arrangement for ptychography is to illuminate the object under test in a defocused condition and to detect, with at least one detector placed at a distance from the target object, the intensity scattered by it. The method of forming images in this way is known as ptychographic imaging.
International application WO 2010/064051 discloses an enhanced ptychographic iterative engine (ePIE). Its distinguishing feature is that the probe function can be computed step by step within the iteration, starting from an estimated initial value. The enhanced ptychographic iterative engine provides a technique for recovering several features of the target object in the region of interest from a set of measured diffraction patterns: with a known or unknown probe function and the acquired diffraction images, both the target object and the probe function are updated at every iteration.
International applications WO 2008/142360 and WO 2012/038749 disclose a method of reconstructing a three-dimensional target object region which determines, by an iterative process, several features of the target object at respective depths within it. This method shares the same technical basis as ptychography.
In general, however, these techniques, including ptychography and many other imaging techniques, can only reconstruct information about a static target object; they cannot reconstruct information about a dynamic target object.
Disclosure of Invention
1. Technical problem to be solved
Aiming at the problem that ptychography and other prior-art imaging techniques can only reconstruct information about a static target object, the invention provides a time-resolved ptychographic imaging method which, by modifying the iteration scheme, obtains time-resolved reconstructed object information.
2. Technical scheme
The purpose of the invention is realized by the following technical scheme:
a time-resolved ptychographic imaging method comprising the following steps (a schematic code sketch follows the list):
(1): acquiring a group of images by an image acquisition method typified by ptychography, determining the corresponding acquisition time points, and simultaneously recording the displacement of the probe function relative to the origin;
(2): determining a set of reconstruction time points and corresponding reconstruction time intervals;
(3): utilizing a method for constructing a three-dimensional target object region to complete the reconstruction of a target object and a probe function, and taking the result as an initial value of a subsequent step;
(4): reconstructing the target object and the probe function for a single reconstruction time point;
(5): repeating step (4) to obtain the reconstruction of the target object and the probe function at all reconstruction time points, and combining these to obtain the time-resolved reconstruction of the target object and the probe function.
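The overall flow of steps (1)-(5) can be sketched as follows in Python/NumPy (the patent's own examples were computed in MATLAB; all function names here, such as reconstruct_static and reconstruct_at_time, are illustrative placeholders for the reconstruction engines described in the cited applications, not symbols from the original text):

```python
# Minimal sketch of the time-resolved procedure; not the patent's implementation.
import numpy as np

def time_resolved_ptychography(images, acq_times, shifts, recon_times, tau1, tau2,
                               reconstruct_static, reconstruct_at_time):
    """images: list of R diffraction images, acq_times: acquisition times tr_r,
    shifts: probe displacements x_r, recon_times: reconstruction time points t_m."""
    # Step (3): time-averaged reconstruction used as the common starting point.
    obj0, probe0 = reconstruct_static(images, shifts)

    objs, probes = [], []
    for t_m in recon_times:
        # Step (2.2): keep only the images acquired inside [t_m - tau1, t_m + tau2].
        idx = [r for r, tr in enumerate(acq_times) if t_m - tau1 <= tr <= t_m + tau2]
        # Step (4): re-run the iterative engine on that subset, starting from obj0/probe0.
        obj_m, probe_m = reconstruct_at_time([images[r] for r in idx],
                                             [shifts[r] for r in idx],
                                             obj0.copy(), probe0.copy())
        objs.append(obj_m)
        probes.append(probe_m)
    # Step (5): the time-resolved reconstruction is the collection over all t_m.
    return objs, probes
```

Starting every time point from the same time-averaged estimate keeps the per-window iterations short, since only the changing parts of the object need to be corrected.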
Preferably, the step (2) further comprises the following steps:
(2.1) determining a set of reconstruction time points t_1, t_2, ..., t_M satisfying t_1 < t_2 < ... < t_M, and a time interval [-τ1, τ2], such that two or more acquisition time points exist in the time interval corresponding to each reconstruction time point;
(2.2) collecting into a set the subscripts r that satisfy the condition tr_r ∈ [t_m - τ1, t_m + τ2]; the images corresponding to this set are used in step (4), where 1 ≤ m ≤ M.
Preferably, the step (4) further comprises the following steps:
step (4.1): taking the result of the step (3) as an initial condition of the target object and the probe function;
step (4.2): updating the target object and the probe function by using one of the collected images, and calculating a root mean square error;
step (4.3): repeating step (4.2) until all the images have been used, and calculating the average of the root-mean-square error over the images used; the condition these images must satisfy is given in step (2.2);
step (4.4): repeating steps (4.2) to (4.3) until the required updates of the target object and of the probe function are obtained.
Preferably, said step (4.2) further comprises the steps of:
step (4.2.1): slicing the target object in a layered manner, dividing the three-dimensional target object into S quasi-two-dimensional slices, the wave function on the front surface of the first slice of the target object being set equal to the currently updated probe function;
step (4.2.2): computing, from the target object as reconstructed at the present stage, the wave functions on the front and back surfaces of all slices of the target object;
step (4.2.3): replacing the modulus of the wave function on the detection plane by the modulus of the image, and calculating the root mean square error;
step (4.2.4): applying update operators, following the method of constructing a three-dimensional target object region, to the data obtained in step (4.2.2), thereby updating the target object and the wave functions on the front and back surfaces of each slice;
step (4.2.5): the updated probe function is equal to the wave function of the target object at the front surface of the first slice.
3. Advantageous effects
Compared with the prior art, the invention has the advantages that:
(1) compared with ordinary ptychography, which can only reconstruct pieces of information about a static target object, the invention can obtain a time-resolved reconstruction of the target object;
(2) compared with conventional scanning transmission imaging, which takes a long time to complete an image, the method is based on ptychography and therefore only needs to acquire images at a small number of positions; it can thus reach a higher time resolution in the reconstructed sample information, reduces the damage to the target object caused by long irradiation, and at the same time recovers several kinds of reconstructed information such as the phase. Implementations of ptychography often need close to a hundred images; the present method further increases the time-resolving capability of ptychography;
(3) there is a trend in optics, X-ray optics and electron optics to use single-shot techniques, typically emitting single photon or electron pulses from ultra-fast pulsed sources and then extracting the desired information from the acquired images. If one wishes to increase the resolution of such a technique by combining it with ptychography, the time needed to acquire close to a hundred images runs counter to the ultra-fast time resolution. The present invention may provide a way of combining the two;
(4) for target objects that change rapidly and continuously, the original ptychography may introduce many artificial structures (artifacts) in the changing region; compared with it, the present method improves the time resolution and can, to a certain extent, reduce the appearance of artifacts in the reconstruction;
(5) the invention can reconstruct both two-dimensional and three-dimensional images resolved in time, and only a small amount of imaging time is needed for each reconstruction.
Drawings
FIG. 1 is a schematic diagram of a single-layer (two-dimensional) implementation of ptychographic image acquisition;
FIG. 2 is a schematic diagram of a three-dimensional-mode implementation of ptychography;
FIG. 3 is a schematic diagram of reconstruction time point selection in time resolution;
fig. 4 is a flow chart of reconstruction of a single reconstruction time point in time resolution.
The reference numbers in the figures illustrate:
1: radiation propagation path; 2: radiation translation; 3: target sample front surface; 4: target sample back surface; 5: defocus; 6: target object; 7: detection plane; 8: slice spacing; 9: radiation vacuum propagation region; 10: front surface of the second slice position of the three-dimensional target object; 11: back surface of the second slice position of the three-dimensional target object; 12: back surface of the last slice of the target object; 13: target object back surface and detection plane position; 14: acquisition time points; 15: earlier interval around the reconstruction time point; 16: later interval around the reconstruction time point.
Detailed Description
The technical solution of the present invention is further explained with reference to the drawings and to specific embodiments. The data in the examples were programmed and computed in MATLAB.
A time-resolved ptychographic imaging method comprising the following steps:
(1): acquiring a set of R images {Image_1, Image_2, ..., Image_R} by an image acquisition method typified by ptychography, and determining the corresponding acquisition time points {tr_1, tr_2, ..., tr_R}. At the same time, taking the point where the optical axis meets the front surface of the target object as the origin, the displacement {x_1, x_2, ..., x_R} of the probe function relative to this origin is recorded during the acquisition of the images.
Since there is no specific requirement on the ordering of the R images, they are rearranged so that tr_1 < tr_2 < ... < tr_R, where "<" means earlier in time.
The determination of the corresponding acquisition time points comprises recording the acquisition time points by the image acquisition device during the acquisition process and determining the distribution of the acquisition time points by an estimation method.
(2): determining a set of reconstruction time points t1,t2...tMSatisfy t1<t2<...<tMAnd 0 is<M<R, for the case that the number of reconstruction time points is more than the number of acquisition time points, even 0<M<When the R condition is not satisfied, the technical scheme also gives results as long as other conditions are satisfied.
(2.1) determining a time interval [-τ1, τ2] such that the interval [t_m - τ1, t_m + τ2] corresponding to each reconstruction time point contains at least two acquisition time points, m being any element of the set {1, 2, ..., M}; mathematically, for every m there exist at least two indices r with tr_r ∈ [t_m - τ1, t_m + τ2].
(2.2) for an arbitrary reconstruction time point t_m, forming the set of indices r that satisfy the condition tr_r ∈ [t_m - τ1, t_m + τ2]; that is, the subscripts satisfying this condition are collected into a set, and the images corresponding to this set are used in step (4), with 1 ≤ m ≤ M.
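The selection of steps (2.1)-(2.2) amounts to collecting, for every reconstruction time point, the indices of the acquisitions that fall inside its window. A short Python/NumPy sketch (variable and function names are illustrative, not from the original text):

```python
# For each t_m, gather the indices r with t_m - tau1 <= tr_r <= t_m + tau2,
# and check the condition of step (2.1) that each window holds >= 2 acquisitions.
import numpy as np

def index_sets(acq_times, recon_times, tau1, tau2):
    acq_times = np.asarray(acq_times)
    sets = []
    for t_m in recon_times:
        idx = np.where((acq_times >= t_m - tau1) & (acq_times <= t_m + tau2))[0]
        if idx.size < 2:
            raise ValueError("each reconstruction window must contain >= 2 acquisitions")
        sets.append(idx)
    return sets
```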
(3): the target object is reconstructed by a method for constructing a three-dimensional target object region, and the result is recorded asAnd reconstruction of the Probe function, the results are noted
The three-dimensional target object is divided into S layers of quasi-two-dimensional target objects. For a target object with thickness D, to achieve an axial resolution of d, let S = [D/d], where the square brackets [.] denote rounding down. The thickness D of the target object may be obtained by other measurement means, and S is an arbitrary natural number.
In particular, when S is 1, the invention corresponds to a time-resolved reconstruction of a two-dimensional target object and of the probe function.
(4): for a single reconstruction time point tmPerforming a reconstruction of the target object and the probe function, tm∈{t1,t2...tM}。
The goal is to obtain the three-dimensional reconstruction O(t_m) of the target object at time t_m and the reconstruction P(t_m) of the probe function.
(5): repeating the step (4) to ensure that tmTraversal set t1,t2...tMAnd finally obtaining time-resolved three-dimensional reconstruction of the target objectAnd reconstruction of probe functions
Specifically, the step (4) includes the steps of:
(4.1): taking the result of the step (3) as an initial bar of the target object and the probe functionA piece:and
(4.2): using captured ImagerUpdating target objects and probe functions, arbitrarilyAnd calculating the root mean square error Ej,rIs given in step (2).
(4.3): repeating the step (4.2) to make r traverse the setAnd calculating the mean of the root mean square error over the acquired images usedEjThe root mean square error is the jth iteration. Where j represents the number of iterations and r traverses the set
Completing one step (4.3) is recorded as completing one iteration of the loop.
At the end of each loop the current target object and probe function are relabelled with the index j, which represents the number of iterations, j = 0, 1, 2, 3, ...
(4.4): repeating steps (4.2) to (4.3) until the required updates of the target object and of the probe function are obtained.
The stopping criterion of step (4.4) is either to stop when a given number of iterations has been reached, or to judge by the root-mean-square error: when E_j is smaller than a value specified by the implementation, e.g. 1×10^-8, the iteration can be ended.
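A minimal Python sketch of the loop structure of steps (4.2)-(4.4), assuming a helper update_with_image that performs the single-image update of step (4.2) (the helper name and the default limits are illustrative, not taken from the patent):

```python
# One pass over the selected images = one iteration; stop on a maximum
# iteration count or when the mean RMS error E_j drops below a tolerance.
def iterate(obj, probe, images, shifts, update_with_image,
            max_iter=200, tol=1e-8):
    for j in range(max_iter):
        errors = []
        for img, x in zip(images, shifts):
            obj, probe, e_jr = update_with_image(obj, probe, img, x)
            errors.append(e_jr)
        E_j = sum(errors) / len(errors)   # mean RMS error of the j-th iteration
        if E_j < tol:
            break
    return obj, probe, E_j
```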
More specifically, step (4.2) comprises the steps of:
(4.2.1): in front of the first slice of the target object, the wave function ψi,1Equal to the probe function currently updated, i.e.The front surface and the back surface are relative to the target object and the radiation incidence direction, the front surface represents the surface of the target object facing the radiation incidence direction, and the back surface is opposite. The terms referred to in the present invention, front surface and back surface are selected accordingly. Wherein x is the plane coordinate of the probe function.
For a wave function ψ_{i,s} or ψ_{e,s}, the first subscript i denotes the wave function on the front surface of a slice of the target object and the first subscript e denotes the wave function on the back surface of a slice. The second subscript s is the slice index of the target object, s ∈ {1, 2, ..., S}; the index increases from 1 at the first slice, which faces the direction of radiation incidence, towards the slices farther from the incidence direction.
(4.2.2): and acquiring wave functions of the wave functions on the front surface and the back surface of all target object slices by using the target object reconstructed at the present stage. Psi obtained from the previous stepi,1And repeatedly using the formula for many times:
obtaining { psii,1i,2,...ψi,S},{ψe,1e,2,...ψe,S},
Where H is the transfer function of the wave function in vacuum, and H has different approximate expressions under different radiation sources, and, in general,wherein z is the propagation distance and w is the dimensionless spectral coordinate. λ is the wavelength.
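As an illustration of the forward multislice pass of step (4.2.2), the following Python/NumPy sketch multiplies the wave by each slice and propagates it across the vacuum gap with a transfer function H. The discretised angular-spectrum form of H coded here is an assumption consistent with the definitions above; the FFT/IFFT ordering is the conventional one and differs from the order in which the text writes the transform pair only by convention.

```python
import numpy as np

def transfer_function(shape, pixel_size, z, lam):
    # dimensionless spectral coordinate w = lam * spatial frequency
    fy = np.fft.fftfreq(shape[0], d=pixel_size)
    fx = np.fft.fftfreq(shape[1], d=pixel_size)
    w2 = (lam * fx[None, :])**2 + (lam * fy[:, None])**2
    prop = np.exp(1j * 2 * np.pi * z / lam * np.sqrt(np.maximum(1.0 - w2, 0.0)))
    return np.where(w2 < 1.0, prop, 0.0)   # suppress evanescent components

def multislice_forward(probe, slices, H):
    """probe: incident wave on slice 1; slices: list of S complex slice
    transmission functions O_s; H: vacuum transfer function between slices."""
    psi_i, psi_e = [], []
    wave = probe
    for s, O_s in enumerate(slices):
        psi_i.append(wave)
        wave = O_s * wave                       # psi_e,s = O_s * psi_i,s
        psi_e.append(wave)
        if s < len(slices) - 1:                 # propagate to the next slice front
            wave = np.fft.ifft2(np.fft.fft2(wave) * H)
    return psi_i, psi_e
```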
(4.2.3): using ImagerReplaces the modulus of the wave function in the detection plane and records the root mean square error Ej,rOn the probe plane, the wave function psid=FFT(ψe,S) ImagerModulus replacing wave functionSince the image recorded by the detector is intensity information, the quadratic root of the image represents the modulus of the wave function in the detection plane in the actually acquired image.Then psi is removeddWhile only the phase information is retained. Extrapolating back the wave function U (psi) of the surface of the wave function after the last slice by the updated wave functione,s)=FFT-1(U(ψd) Root mean square error (rms)The summation here is the summation of the intensities everywhere on the detection plane.
(4.2.4): the wave functions of the target object and the front and back surfaces of each slice are updated from the data obtained in the previous steps.
The update operators used are as follows:
U(O_s) = O_s + α · ψ_{i,s}* / (max|ψ_{i,s}|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{i,s}) = ψ_{i,s} + β · O_s* / (max|O_s|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{e,s-1}) = FFT[FFT^{-1}[U(ψ_{i,s})] × H^{-1}]
In the update operators, α represents the update speed of the target object and β the update speed of the wave function; usually 0 < α ≤ 1 and 0 < β ≤ 1.
In the update operators, ψ_{i,s} and ψ_{e,s} represent the wave functions at the different surfaces and O_s represents slice s of the target object, the subscripts having been explained above. The notation max|·| represents the maximum modulus over the plane of the quantity inside it, which is then squared, and * denotes the complex conjugate.
Applying these operators from s = S down to s = 1 thus yields {U(ψ_{i,1}), U(ψ_{i,2}), ..., U(ψ_{i,S})}, {U(ψ_{e,1}), U(ψ_{e,2}), ..., U(ψ_{e,S})} and the updated target object {U(O_1), ..., U(O_S)}.
(4.2.5): the updated probe function is equal to the wave function U (ψ) of the front surface of the first slice of the target objecti,1),If the target object and the probe function need to be further processed iteratively subsequently, the updated target object and probe function need to be known, that is:
example 1
Three-dimensional-mode ptychography is realized in the following steps:
step 1: the method of acquiring images represented by lap correlation imaging, International publication No. WO 2005/106531, acquires a set of 25 images { Image }1,Image2...Image25And determines the corresponding acquisition time point 14, and marks the interval as t0
As shown in FIG. 1, ptychography is here simplified to a single-layer, i.e. two-dimensional, acquisition scheme: the radiation travels along the propagation path 1 with defocus 5, passes through the target object 6 via the target sample front surface 3 and the target sample back surface 4, and its intensity, i.e. an image, is recorded at the detection plane 7; a set of such images is acquired by means of the radiation translation 2.
When the three-dimensional target object is reconstructed, it is divided in a layered manner into S quasi-two-dimensional slices. For a target object with thickness D, to achieve an axial resolution of d, let S = [D/d], where the brackets denote rounding down and S is an arbitrary natural number.
Referring to FIG. 2, a schematic diagram of an implementation of three-dimensional-mode ptychography is shown, in which the three-dimensional target object has two layers that change with time during the image acquisition.
The radiation source is an electron beam accelerated through a 300 keV potential, which can be described by a set of plane waves focused through a 20 mrad aperture, with a defocus 5 of -600 nm and 1.2 mm third-order spherical aberration at the plane of the target object 6.
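For illustration, such a probe can be modelled numerically from an aperture and an aberration phase. The Python/NumPy sketch below is not taken from the patent text: it uses the standard aberration function χ(θ) = (2π/λ)(C_s·θ⁴/4 + Δf·θ²/2) with the 20 mrad aperture, -600 nm defocus and 1.2 mm spherical aberration quoted above, on the 1024 × 1024 grid with 0.02 nm pixels used later in this example; the sign convention of the defocus term is an assumption.

```python
import numpy as np

def make_probe(n=1024, pixel=0.02e-9, lam=1.97e-12,
               aperture=20e-3, df=-600e-9, cs=1.2e-3):
    f = np.fft.fftfreq(n, d=pixel)                      # spatial frequencies (1/m)
    theta = lam * np.hypot(f[None, :], f[:, None])      # scattering angle (rad)
    chi = 2 * np.pi / lam * (cs * theta**4 / 4 + df * theta**2 / 2)
    pupil = (theta <= aperture) * np.exp(-1j * chi)     # aperture with aberration phase
    probe = np.fft.ifft2(pupil)                         # focused probe in real space
    return np.fft.fftshift(probe / np.sqrt(np.sum(np.abs(probe)**2)))
```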
Step 2: as shown in FIG. 3, a set of 24 reconstruction time points, each located between the acquisition time points 14 satisfying 25 acquisition time points, are determined, with a time interval [ - τ 1, τ 2]=[-0.6t0,0.6t0]τ 1 is an earlier interval 15 around the reconstruction time point, τ 2 is a later interval 16 around the reconstruction time point, then the first, second acquisition time point is within the first reconstruction time period, the second, third acquisition time point is within the second reconstruction time period, and so on, to determine a set of reconstruction time points.
The goal of this example is thus to obtain a time series of probe functions {P(t_1), ..., P(t_24)} and of target objects {O(t_1), ..., O(t_24)}. The reconstruction at each single reconstruction time point proceeds according to the flowchart of FIG. 4.
Using a method of constructing a three-dimensional target object region (International Publication No. WO 2012/038749), the reconstruction of the target object is recorded as O_0 and the result of the reconstruction of the probe function is recorded as P_0.
(a) The following steps (a)-(g) are described taking the reconstruction of the target object and the probe function at time t_1 as an example.
The result obtained above is taken as the initial value: the initial target object is O_0 and the initial probe function is P_0.
(b) The wave function ψ_{i,1} at the target sample front surface 3 is set equal to the currently updated probe function, i.e. ψ_{i,1}(x) = P(x).
the wave function firstly acts with the first layer of the target object to obtain the wave function of the back surface 4 of the target sample, then propagates to the front surface 10 of the second layer of the slice position of the three-dimensional target object, and then acts with the second layer to obtain the wave function of the back surface 11 of the second layer of the slice position of the three-dimensional target object:
ψi,2=FFT[FFT-1e,1]×H]
where H(w) ≈ exp(i·2πz/λ)·exp(-iπzw²/λ), i.e. the paraxial approximation of the general expression given above, is the transfer function of the electron wave function in the radiation vacuum propagation region 9.
z is the slice spacing 8 between the two layers of the target object, here taken as 10 nm. x is the coordinate in the plane of the probe function, the step of the spectral coordinate is λ/X, and X is the size of the plane of the probe function. Since space is described by a finite matrix, in the simulation this plane is represented by a 1024 × 1024 matrix, each pixel, i.e. each matrix element, representing a spatial extent of 0.02 nm, hence X = 20.48 nm. λ is the wavelength; the wavelength of electrons accelerated through 300 keV is about 1.97 pm.
Since the target sample has two layers in total, four data sets ψ_{i,1}, ψ_{e,1}, ψ_{i,2}, ψ_{e,2} are finally obtained.
(c) The wave function propagates from the back surface 12 of the last slice of the target object to the detection plane 7; with the detection plane position 13 as shown in FIG. 2, this corresponds to a Fourier transform: ψ_d = FFT(ψ_{e,2}). In the two-slice case the back surface 12 of the last slice of the target object and the back surface 11 of the second slice of the three-dimensional target object lie in the same plane.
The exit wave function is updated with an acquired image, the modulus of ψ_d being replaced by the square root of Image_r while its phase is retained; the acquisition time point can only be tr_1 or tr_2, since only the first and second acquisition time points lie within the first reconstruction time window.
Finally, the updated wave function on the back surface 12 behind the last slice of the target object is obtained by propagating back: U(ψ_{e,2}) = FFT^{-1}(U(ψ_d)).
In addition, the root-mean-square error E_{j,r} is calculated as in step (4.2.3).
(d) The target object and the probe function are updated using the updated exit wave.
The update operators are the same as in step (4.2.4):
U(O_s) = O_s + α · ψ_{i,s}* / (max|ψ_{i,s}|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{i,s}) = ψ_{i,s} + β · O_s* / (max|O_s|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{e,s-1}) = FFT[FFT^{-1}[U(ψ_{i,s})] × H^{-1}]
where α and β express the (possibly different) update rates of the target object and of the wave function, and H^{-1} denotes the transfer function of the wave function in the radiation vacuum propagation region 9 for propagation in the direction opposite to the H appearing in the formulas above. In this way U(ψ_{i,1}), U(ψ_{i,2}), U(ψ_{e,1}), U(ψ_{e,2}) and the updated target object slices U(O_1), U(O_2) are obtained.
(e) The updated probe function is set equal to the wave function of the target object at the target sample front surface 3: P = U(ψ_{i,1}). At this point the update of the target object and of the probe function using one acquired image is complete.
(f) Steps (b)-(e) are repeated until all images satisfying the condition have been used, which completes one loop iteration, and the mean root-mean-square error over the acquired images used is calculated; since only two images satisfy the condition, namely those corresponding to the first and second acquisition time points within the first reconstruction time window, E_j = (E_{j,1} + E_{j,2})/2.
(g) Steps (b)-(f) are repeated and stopped once the root-mean-square error satisfies E_j < 1e-4. This yields the reconstruction for the single time point t_1, i.e. O(t_1) and P(t_1).
(h) Taking each of the different reconstruction time points as the target in turn, steps (a)-(g) are repeated to obtain the reconstructions at all 24 time points.
The invention and its embodiments have been described above schematically and without limitation; the embodiment shown in the drawings is only one embodiment of the invention, and the actual structure is not limited to it. Therefore, if a person skilled in the art, informed by the teachings of the present invention, devises without inventive effort structures and embodiments similar to the above technical solution, these should be covered by the scope of protection of the present patent.

Claims (1)

1. A time-resolved ptychographic imaging method comprising the following steps:
(1): acquiring a group of images by an image acquisition method typified by ptychography, determining the corresponding acquisition time points, and simultaneously recording the displacement of the probe function relative to the origin;
(2): determining a set of reconstruction time points and corresponding reconstruction time intervals;
the method comprises the following steps: (2.1) determining a set of reconstruction time points t_1, t_2, ..., t_M satisfying t_1 < t_2 < ... < t_M, 0 < M < R, and a time interval [-τ1, τ2], such that two or more acquisition time points exist in the time interval corresponding to each reconstruction time point;
(2.2) for an arbitrary reconstruction time point t_m, forming the set of indices r that satisfy the condition tr_r ∈ [t_m - τ1, t_m + τ2]; that is, the subscripts satisfying this condition are collected into a set, and the images corresponding to this set are used in step (4), where 1 ≤ m ≤ M;
(3): using a method of constructing a three-dimensional target object region, completing a reconstruction of the target object and of the probe function, and taking the result as the initial value for the subsequent steps; the three-dimensional target object is divided into S quasi-two-dimensional slices, and for a target object with thickness D, to achieve an axial resolution of d, let S = [D/d], where the square brackets [.] denote rounding down; the thickness D of the target object may be obtained by other measurement means, and S is an arbitrary natural number;
(4): reconstructing the target object and the probe function for a single reconstruction time point;
step (4.1): taking the result of the step (3) as an initial condition of the target object and the probe function;
step (4.2): updating the target object and the probe function by using one of the collected images, and calculating a root mean square error;
step (4.2.1): slicing the target object in a layered manner, dividing the three-dimensional target object into S quasi-two-dimensional slices, the wave function on the front surface of the first slice of the target object being set equal to the currently updated probe function;
step (4.2.2): computing, from the target object as reconstructed at the present stage, the wave functions on the front and back surfaces of all slices of the target object; starting from the wave function ψ_{i,1} obtained in the previous step and repeatedly using the formulas
ψ_{e,s} = O_s · ψ_{i,s},   ψ_{i,s+1} = FFT[FFT^{-1}[ψ_{e,s}] × H],
obtaining {ψ_{i,1}, ψ_{i,2}, ..., ψ_{i,S}} and {ψ_{e,1}, ψ_{e,2}, ..., ψ_{e,S}};
wherein ψ_{i,s} and ψ_{e,s} are wave functions, O_s is slice s of the target object, and H is the transfer function of the wave function in vacuum; the first subscript i denotes the wave function on the front surface of a slice of the target object and the first subscript e the wave function on its back surface, and the second subscript s is the slice index of the target object, s ∈ {1, 2, ..., S}, increasing stepwise from 1 at the first slice facing the direction of radiation incidence towards the direction away from the incidence direction; z is the propagation distance, w is the dimensionless spectral coordinate, and λ is the wavelength; the subscript j represents the number of iterations, j = 0, 1, 2, 3, ...; t_m is a reconstruction time point;
step (4.2.3): replacing the modulus of the wave function on the detection plane by the modulus of the image, and calculating the root mean square error;
step (4.2.4): applying update operators, following the method of constructing a three-dimensional target object region, to the data obtained in step (4.2.2), thereby updating the target object and the wave functions on the front and back surfaces of each slice; the update operators used are as follows:
U(O_s) = O_s + α · ψ_{i,s}* / (max|ψ_{i,s}|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{i,s}) = ψ_{i,s} + β · O_s* / (max|O_s|)² · [U(ψ_{e,s}) − O_s · ψ_{i,s}]
U(ψ_{e,s-1}) = FFT[FFT^{-1}[U(ψ_{i,s})] × H^{-1}]
in the update operators, α represents the update speed of the target object and β the update speed of the wave function, 0 < α ≤ 1 and 0 < β ≤ 1;
in the update operators, ψ_{i,s} and ψ_{e,s} represent the wave functions at different locations and O_s represents the target object; the notation max|·| represents the maximum modulus over the plane of the quantity inside it, which is then squared, and * denotes the complex conjugate;
U(O_s), U(ψ_{i,s}) and U(ψ_{e,s-1})
represent the updated target object and the wave functions on the front and back surfaces of each slice;
step (4.2.5): the updated probe function is equal to the wave function of the target object at the front surface of the first slice;
step (4.3): repeating step (4.2) until all the images have been used, and calculating the average of the root-mean-square error over the images used; the condition these images must satisfy is given in step (2.2);
step (4.4): repeating steps (4.2)-(4.3) until the required updates of the target object and of the probe function are obtained;
(5): repeating step (4) to obtain the reconstruction of the target object and the probe function at all reconstruction time points, and combining these to obtain the time-resolved reconstruction of the target object and the probe function.
CN201410419950.8A 2014-08-22 2014-08-22 A kind of time resolution overlapping associations Imaging Active CN104155320B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410419950.8A CN104155320B (en) 2014-08-22 2014-08-22 A kind of time resolution overlapping associations Imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410419950.8A CN104155320B (en) 2014-08-22 2014-08-22 A kind of time resolution overlapping associations Imaging

Publications (2)

Publication Number Publication Date
CN104155320A CN104155320A (en) 2014-11-19
CN104155320B true CN104155320B (en) 2018-08-10

Family

ID=51880877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410419950.8A Active CN104155320B (en) 2014-08-22 2014-08-22 A kind of time resolution overlapping associations Imaging

Country Status (1)

Country Link
CN (1) CN104155320B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106768395B (en) * 2016-12-14 2019-10-08 中国科学院光电技术研究所 Precision measurement method for alignment errors of adaptive optical wavefront sensor and wavefront corrector

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101820817A (en) * 2007-05-22 2010-09-01 阶段聚焦有限公司 Three dimensional imaging
EP2410353A2 (en) * 2008-12-04 2012-01-25 Phase Focus Limited Provision of Image Data
CN103200870A (en) * 2010-09-24 2013-07-10 相位聚焦有限公司 Three dimensional imaging
CN103595414A (en) * 2012-08-15 2014-02-19 王景芳 Sparse sampling and signal compressive sensing reconstruction method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0409572D0 (en) * 2004-04-29 2004-06-02 Univ Sheffield High resolution imaging

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101820817A (en) * 2007-05-22 2010-09-01 阶段聚焦有限公司 Three dimensional imaging
EP2410353A2 (en) * 2008-12-04 2012-01-25 Phase Focus Limited Provision of Image Data
CN103200870A (en) * 2010-09-24 2013-07-10 相位聚焦有限公司 Three dimensional imaging
CN103595414A (en) * 2012-08-15 2014-02-19 王景芳 Sparse sampling and signal compressive sensing reconstruction method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Dynamic Imaging Using Ptychography; Jesse N. Clark et al.; Physical Review Letters; 20140318; pp. 113901-1 to 113901-5 *
Influence of a central beamstop on scanning coherent X-ray diffraction imaging; Liu Haigang et al.; Acta Physica Sinica (《物理学报》); 20131231; Vol. 62, No. 15; pp. 150702-1 to 150702-8 *
Influence of synchrotron radiation X-ray spot information on scanning coherent diffraction imaging; Liu Haigang et al.; Journal of Radiation Research and Radiation Processing (《辐射研究与辐射工艺学报》); 20131231; Vol. 31, No. 6; page 1, left column, paragraph 1; page 2, section 1 *

Also Published As

Publication number Publication date
CN104155320A (en) 2014-11-19

Similar Documents

Publication Publication Date Title
EP2356487B1 (en) Provision of image data
JP5917507B2 (en) Calibration method of probe by typography method
EP1740975B1 (en) High resolution imaging
EP2152164B1 (en) Three dimensional imaging
CN107655405B (en) Method for eliminating axial distance error between object and CCD by using self-focusing iterative algorithm
Mukherjee et al. An iterative algorithm for phase retrieval with sparsity constraints: application to frequency domain optical coherence tomography
JP6556623B2 (en) Improved phase recovery
CN102645739B (en) Phase microscopic device for transmission type samples and phase microscopic method
Vine et al. Ptychographic Fresnel coherent diffractive imaging
CN106646511B (en) A kind of reconstruction processing method of laser reflection tomography data for projection
CA2810610A1 (en) Improvements in three dimensional imaging
CN104484894A (en) Multi-wavelength lamination imaging technology facing to three-dimensional information recovery
EP3830628B1 (en) Device and process for capturing microscopic plenoptic images with turbulence attenuation
CN116895349A (en) Strain assessment method, device and storage medium based on Bayesian neural network
EP2227705B1 (en) Method and apparatus for providing image data
CN104155320B (en) A kind of time resolution overlapping associations Imaging
CN104132952B (en) Time resolution ptychography
Whitehead et al. Fresnel diffractive imaging: Experimental study of coherence and curvature
Wang et al. Phase imaging with rotating illumination
CN111369638B (en) Laser reflection tomography undersampled reconstruction method, storage medium and system
Nicolas et al. 3D reconstruction of compressible flow by synchronized multi camera BOS
Hu et al. Hybrid method for accurate phase retrieval based on higher order transport of intensity equation and multiplane iteration
WO2019012796A1 (en) Information processing device, information processing method, program, and cell observation system
CN115993611B (en) Non-visual field imaging method and device based on transient signal super-resolution network
Mounaix Advanced Data Processing For Tomography and 3D Rendering With Terahertz Waves

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant