Embodiments of the present invention relate to imaging systems, more particularly to super-resolution microscopy imaging systems.
It may be desirable to achieve a high spatial resolution and a wide field of view (FOV) simultaneously in a microscopy imaging system. In conventional microscope systems, a combination of an objective lens and a tube lens may be used to image an object. Designing a high numerical aperture (NA) lens with diffraction-limited performance over a large field of view may be challenging. In addition, conventional microscope systems with optical lenses tend to be bulky and expensive. Therefore, there is a need for improved microscopy imaging systems.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam to be incident on the sample, a translation mechanism coupled to the sample mount and configured to scan the sample to a plurality of sample positions in a plane substantially perpendicular to an optical axis of the imaging system, a mask positioned downstream from the sample along the optical axis, and an image sensor positioned downstream from the mask along the optical axis. The image sensor is configured to acquire a plurality of images as the sample is translated to the plurality of sample positions. Each respective image corresponds to a respective sample position. The imaging system further includes a processor configured to process the plurality of images to recover a complex profile of the sample based on positional shifts extracted from the plurality of images.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam to be incident on the sample, a translation mechanism coupled to the sample mount and configured to scan the sample to a plurality of sample positions in a plane substantially perpendicular to an optical axis of the imaging system, and an image sensor positioned downstream from the sample along the optical axis. A top surface of the image sensor is tilted with respect to a surface of the sample. The image sensor is configured to acquire a plurality of images as the sample is translated to the plurality of sample positions. Each respective image corresponds to a respective sample position. The imaging system further includes a processor configured to process the plurality of images to recover a complex profile of the sample based on positional shifts extracted from the plurality of images.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam, the light beam including light in a plurality of wavelengths, and a light dispersing element configured to disperse the light beam into a plurality of sub light beams to be incident on the sample at a plurality of angles of incidence. Each respective sub light beam corresponds to a respective wavelength and is incident on the sample at a respective angle of incidence. The imaging system further includes a translation mechanism coupled to the sample mount and configured to scan the sample to a plurality of sample positions in a plane substantially perpendicular to an optical axis of the imaging system, a mask positioned downstream from the sample along the optical axis, and an image sensor positioned downstream from the mask along the optical axis. The image sensor is configured to acquire a plurality of images as the sample is translated to the plurality of sample positions. Each respective image corresponds to a respective sample position. The imaging system further includes a processor configured to process the plurality of images to recover a plurality of complex profiles of the sample based on positional shifts extracted from the plurality of images. Each respective complex profile of the sample corresponds to a respective wavelength.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam, a diffuser positioned in front of the light source and configured to transform the light beam into a speckle illumination beam characterized by a speckle pattern, a mirror configured to receive and reflect the speckle illumination beam toward the sample, a scanning mechanism coupled to the mirror and configured to scan the mirror to a plurality of mirror angles such that the speckle illumination beam is incident on the sample at a plurality of angles of incidence, and an image sensor positioned downstream from the sample along an optical axis of the imaging system. The image sensor is configured to acquire a plurality of images as the mirror is being scanned so that the speckle illumination beam is incident on the sample at the plurality of angles of incidence. Each respective image corresponds to a respective angle of incidence. The imaging system further includes a processor configured to process the plurality of images to recover a complex profile of the sample based on positional shifts extracted from the plurality of images.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam to be incident on the sample, a mask positioned downstream from the sample along an optical axis of the imaging system, a translation mechanism coupled to the mask and configured to scan the mask to a plurality of mask positions in a plane substantially perpendicular to the optical axis of the imaging system, and an image sensor positioned downstream from the mask along the optical axis. The image sensor is configured to acquire a plurality of images as the mask is scanned to the plurality of mask positions. Each respective image corresponds to a respective mask position. The imaging system further includes a processor configured to process the plurality of images to recover a complex profile of the sample based on positional shifts extracted from the plurality of images.
According to some embodiments, an imaging system includes a sample mount for holding a sample to be imaged, a light source configured to emit a light beam to be incident on the sample, a first transparent plate positioned downstream from the sample along an optical axis of the imaging system, a scanning mechanism coupled to the first transparent plate and configured to rotate the first transparent plate around a first axis orthogonal to the optical axis so that the first transparent plate is rotated to a plurality of first angles, a mask positioned downstream from the first transparent plate along the optical axis, and an image sensor positioned downstream from the mask along the optical axis. The image sensor is configured to acquire a plurality of images as the first transparent plate is scanned to the plurality of first angles. Each respective image corresponds to a respective first angle. The imaging system further includes a processor configured to process the plurality of images to recover a complex profile of the sample based on positional shifts extracted from the plurality of images.
Embodiments of the present invention provide various imaging systems for achieving super-resolution imaging via translated speckle illumination, translated pattern modulation, translated phase modulation, and wavelength-encoded mask modulation. In some embodiments, the imaging systems may not include any optical lens. Such imaging systems are referred to herein as lensless imaging systems. Compared with conventional microscope imaging systems, the imaging systems according to embodiments of the present invention may be able to achieve high spatial resolution and a large field of view at the same time. The achievable spatial resolution may surpass the diffraction-limited resolution of conventional microscope imaging systems.
The imaging systems according to embodiments of the present invention may have applications in digital pathology, quantitative phase imaging, and the like. In addition, these imaging platforms can be employed in visible light imaging systems, coherent X-ray imaging systems, and electron imaging systems to increase spatial resolution and provide quantitative absorption and object phase contrast.
The imaging systems according to embodiments of the present invention may afford numerous advantages. For example, by not including any optical lens, the imaging systems may be made to be compact, portable, and cost-effective, and therefore may be suitable for deployment in point-of-care settings.
The imaging system 100 further includes a light source 120. The light source 120 may comprise a laser or a light-emitting diode (LED), and is configured to emit a coherent or partially coherent light beam. The light beam may be collimated, partially collimated, or uncollimated. The imaging system 100 further includes a diffuser 150 positioned in front of the light source 120. The diffuser 150 may include an unknown pattern formed thereon. Thus, as the light beam emitted by the light source 120 passes through the diffuser 150, the light beam may be transformed into a speckle illumination beam. The imaging system 100 may further include a mirror 130 configured to receive and reflect the speckle illumination beam toward the sample 110.
The imaging system 100 further includes a scanning mechanism (not shown in
The image sensor 140, which is positioned downstream from the sample 110 along the optical axis 102 of the imaging system 100, is configured to capture a plurality of images as the speckle illumination beam is incident on the sample 110 at the plurality of angles of incidence. Each respective image corresponds to a respective angle of incidence. The plurality of images may be processed by a processor (not shown in
According to some embodiments, to address the positioning repeatability and accuracy issues, the positional shifts of the speckle pattern are recovered based on the phase correlations among the plurality of images. To bypass the resolution limit set by the pixel size of the image sensor 140, a sub-sampled ptychographic phase retrieval process is used to recover the complex profile of the sample 110. The complex profile of the sample 110 may include an intensity image as well as a phase image of the sample 110. The reconstruction process may recover the unknown speckle pattern as well.
According to some embodiments, the reconstruction process may include the following steps.
At S101, initialize the complex object O(x, y) (e.g., the sample) and the speckle pattern P(x, y).
At S102, estimate the jth translated position of the speckle pattern (xj, yj) based on image cross-correlation, or other tracking algorithms such as mutual information optimization and the like.
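The cross-correlation tracking in S102 can be sketched with a standard phase-correlation tracker. The function below is illustrative rather than the exact implementation used by the imaging system, and it assumes integer-pixel circular shifts between captures:

```python
import numpy as np

def estimate_shift(img_ref, img_j):
    """Estimate the translation of img_j relative to img_ref via phase
    correlation (one common cross-correlation-based tracker; a sketch)."""
    F_ref = np.fft.fft2(img_ref)
    F_j = np.fft.fft2(img_j)
    # Normalized cross-power spectrum; epsilon avoids division by zero.
    cross_power = F_j * np.conj(F_ref)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past half the image size correspond to negative shifts.
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Sub-pixel refinements (e.g., fitting the correlation peak) or mutual-information optimization may be substituted where the speckle motion is not an integer number of pixels.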
At S103, according to the imaging model, generate the jth complex image's exit wave ψj(x, y) at the image sensor plane based on the translated position (xj, yj), O(x, y), and P(x, y):
ψj(x, y)=(O(x, y)·P(x−xj, y−yj))*PSFfree(d)=φj(x, y)*PSFfree(d),
where (xj, yj) is the jth positional shift of the speckle pattern, PSFfree(d) is the point spread function (PSF) for free-space propagation over a distance d, ‘*’ denotes the convolution operation, and φj(x, y)=O(x, y)·P(x−xj, y−yj).
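The convolution with PSFfree(d) is commonly applied in the Fourier domain. The angular-spectrum sketch below is one standard realization of the free-space propagation step; the function signature and sampling assumptions are illustrative:

```python
import numpy as np

def propagate(field, d, wavelength, pixel_size):
    """Angular-spectrum free-space propagation over distance d, i.e. the
    convolution with PSFfree(d) in the imaging model (a sketch; all
    lengths in consistent units, e.g. meters)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, pixel_size)
    fy = np.fft.fftfreq(ny, pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Longitudinal spatial frequency; evanescent components are cut off.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * d) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the transfer function H has unit magnitude within the propagating band, energy is conserved and back-propagation is obtained simply by passing a negative distance, matching the PSFfree(−d) steps used later in the reconstruction.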
At S104, at the image sensor plane, use the following equation to update the exit wave ψj(x, y) based on the captured intensity image Ij(x, y):
In the above equation, the image sizes of ψj(x, y) and Ij(x, y) are different. If Ij has a size of 100 by 100 pixels, ψj will have 300 by 300 pixels, with an up-sampling factor M=3. The term ‘Ij(x, y)↑M’ represents the nearest-neighbor up-sampling of the captured image Ij. In the denominator of the above equation, the term |ψj(x, y)|2 is first convolved with an averaging filter (an M by M all-ones matrix, ones(M, M)). It is then down-sampled by a factor of M, followed by nearest-neighbor up-sampling by a factor of M. In some embodiments, other up-sampling factors (e.g., M=4, 5, 6, . . . ) may be used.
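Since the update equation is described in words above, the sketch below implements it as stated: the field magnitude is corrected by the ratio of the up-sampled measurement to the block-averaged field intensity, keeping the phase unchanged. Taking the square root of the intensity ratio and adding a small epsilon guard are assumptions of this sketch:

```python
import numpy as np

def sensor_update(psi, I_meas, M=3):
    """Sub-sampled intensity update at the sensor plane (a sketch).
    psi: complex field, shape (M*h, M*w); I_meas: measured image (h, w)."""
    h, w = I_meas.shape
    # Nearest-neighbor up-sampling of the captured image: I_meas ↑ M.
    I_up = np.kron(I_meas, np.ones((M, M)))
    # |psi|^2 averaged over M-by-M blocks (the all-ones filter followed by
    # M-times down-sampling), then up-sampled back to the fine grid.
    intensity = np.abs(psi) ** 2
    block = intensity.reshape(h, M, w, M).mean(axis=(1, 3))
    denom = np.kron(block, np.ones((M, M)))
    # Magnitude correction by the intensity ratio; phase is unchanged.
    return psi * np.sqrt(I_up / (denom + 1e-12))
```

When the current estimate already reproduces the measurement, the ratio is unity and the field passes through unchanged, which is the fixed point of the iteration.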
At S105, propagate the updated ψ′j(x, y) to the object plane to get φ′j(x, y). Update the object O(x, y) and P(x, y):
where ‘conj’ denotes conjugate, and αobj and αP are algorithm constants.
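The joint update at S105 can be sketched in an ePIE-style form. The normalization by the peak modulus and the role of αobj and αP as step sizes are assumptions of this sketch, not necessarily the exact update used:

```python
import numpy as np

def joint_update(O, P_shift, phi, phi_new, alpha_obj=1.0, alpha_p=1.0):
    """One common (ePIE-style) form of the joint object/pattern update;
    a sketch. phi and phi_new are the object-plane exit waves before and
    after the sensor-plane intensity update."""
    diff = phi_new - phi  # correction demanded by the measurement
    O_new = O + alpha_obj * np.conj(P_shift) * diff / np.max(np.abs(P_shift)) ** 2
    P_new = P_shift + alpha_p * np.conj(O) * diff / np.max(np.abs(O)) ** 2
    return O_new, P_new
```

If the sensor update leaves the exit wave unchanged (phi_new equals phi), both estimates are left untouched, so the update is consistent with convergence.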
At S106, j=j+1 and repeat steps S102-S105.
At S107, repeat steps S102-S106 until the solution converges.
It should be appreciated that the specific steps S101-S107 discussed above provide a particular reconstruction process according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps S101-S107 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
The performance of the imaging system 100 was validated using a resolution target, a phase target, and a biological sample. It was demonstrated that accurate, high-quality complex images can be obtained from an image set including as few as 10 images. In some embodiments, a 6.4 mm by 4.6 mm field of view (FOV) and a half pitch resolution of 1 μm can be achieved.
The imaging system 500 further includes a light source 520. The light source 520 may comprise a laser or a light-emitting diode (LED), and is configured to emit a coherent or partially coherent light beam to be incident on the sample 510. The light beam may be collimated, partially collimated, or uncollimated. The imaging system 500 may include a mirror 530 positioned substantially at a 45 degree angle with respect to the path of the light beam emitted by the light source 520, so as to fold the light beam for a more compact configuration. The mirror 530 is optional.
The imaging system 500 further includes a mask 550 positioned downstream from the sample 510 along an optical axis 502 of the imaging system 500, and above the image sensor 540. The mask 550 may include an unknown pattern formed thereon. Thus, as the light beam is transmitted through the sample 510 and the mask 550, a diffused image may be formed at the image sensor 540.
The imaging system 500 further includes a translation mechanism (not shown in
As the mask 550 is scanned, the diffused image formed at the image sensor 540 may shift accordingly. The image sensor 540 is configured to capture a plurality of images as the mask 550 is scanned to the plurality of mask positions. Each respective image corresponds to a respective mask position. The plurality of images may be processed by a processor (not shown in
According to some embodiments, the reconstruction process may include the following steps.
At S201, initialize the complex object O(x, y) (e.g., the sample) and the diffuser pattern P(x, y).
At S202, estimate the jth translated position of the diffuser pattern (xj, yj) based on image cross-correlation, or other tracking algorithms such as mutual information optimization and the like.
At S203, according to the imaging model, O(x, y) is propagated over a distance ‘d1’ to the diffuser plane:
Od1(x, y)=O(x, y)*PSFfree(d1).
At S204, generate the jth complex image's exit wave ψj(x, y) at the image sensor plane based on the translated diffuser position (xj, yj), O(x, y), and P(x, y):
ψj(x, y)=(Od1(x, y)·P(x−xj, y−yj))*PSFfree(d)=φj(x, y)*PSFfree(d),
where PSFfree(d) is the point spread function (PSF) for free-space propagation over a distance d, and ‘*’ stands for convolution operation, and φj(x, y)=Od1(x, y)·P(x−xj, y−yj).
At S205, at the image sensor plane, use the following equation to update the exit wave ψj(x, y) based on the captured intensity image Ij(x, y):
In the above equation, the image sizes of ψj(x, y) and Ij(x, y) are different. If Ij has a size of 100 by 100 pixels, ψj will have 300 by 300 pixels, with an up-sampling factor M=3. The term ‘Ij(x, y)↑M’ represents the nearest-neighbor up-sampling of the captured image Ij. In the denominator of the above equation, the term |ψj(x, y)|2 is first convolved with an averaging filter (an M by M all-ones matrix, ones(M, M)). It is then down-sampled by a factor of M, followed by nearest-neighbor up-sampling by a factor of M. In some embodiments, other up-sampling factors (e.g., M=4, 5, 6, . . . ) may be used.
At S206, propagate the updated ψ′j(x, y) to the object plane to get φ′j(x, y). Update the object Od1(x, y) and P(x, y):
where ‘conj’ denotes conjugate, and αobj and αP are algorithm constants.
At S207, j=j+1 and repeat steps S202-S206.
At S208, repeat steps S202-S207 until the solution converges.
At S209, propagate the recovered Od1(x, y) to the object plane.
It should be appreciated that the specific steps S201-S209 discussed above provide a particular reconstruction process according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps S201-S209 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
The imaging system 700 further includes a light source 720. The light source 720 may comprise a laser or a light-emitting diode (LED), and is configured to emit a coherent or partially coherent light beam to be incident on the sample 710. The light beam may be collimated, partially collimated, or uncollimated. The imaging system 700 may include a mirror 730 positioned substantially at a 45 degree angle with respect to the path of the light beam emitted by the light source 720, so as to fold the light beam for a more compact configuration. The mirror 730 is optional.
The imaging system 700 further includes a mask 750 positioned downstream from the sample 710 along an optical axis 702 of the imaging system 700, and above the image sensor 740. The mask 750 may include an unknown pattern formed thereon. Thus, as the light beam is transmitted through the sample 710 and the mask 750, a diffused image may be formed at the image sensor 740.
The imaging system 700 further includes a first transparent plate 760 and a second transparent plate 770 positioned between the sample 710 and the mask 750. The imaging system 700 may further include a scanning mechanism (not shown in
As the first transparent plate 760 and the second transparent plate 770 are rotated, the diffused image formed at the image sensor 740 may shift accordingly. The image sensor 740 is configured to capture a plurality of images as the first transparent plate 760 is scanned to a plurality of first angles and the second transparent plate 770 is scanned to a plurality of second angles. Each respective image corresponds to a respective first angle of the first transparent plate 760 and a respective second angle of the second transparent plate 770. The plurality of images may be processed by a processor (not shown in
According to some embodiments, the reconstruction process may include the following steps.
At S301, initialize the complex object O(x, y) (e.g., the sample) and the diffuser pattern P(x, y).
At S302, estimate the jth translated position of the sample (xj, yj) based on image cross-correlation, or other tracking algorithms such as mutual information optimization and the like.
At S303, according to the imaging model, O(x−xj, y−yj) is propagated over a distance ‘d1’ to the diffuser plane:
Od1(x−xj, y−yj)=O(x−xj, y−yj)*PSFfree(d1).
At S304, generate the jth complex image's exit wave ψj(x, y) at the image sensor plane based on the translated diffuser position (xj, yj), O(x, y), and P(x, y):
ψj(x, y)=(Od1(x−xj, y−yj)·P(x, y))*PSFfree(d)=φj(x, y)*PSFfree(d),
where PSFfree(d) is the point spread function (PSF) for free-space propagation over a distance d, and ‘*’ stands for convolution operation, and φj(x, y)=Od1(x−xj, y−yj)·P(x, y).
At S305, at the image sensor plane, use the following equation to update the exit wave ψj(x, y) based on the captured intensity image Ij(x, y):
In the above equation, the image sizes of ψj(x, y) and Ij(x, y) are different. If Ij has a size of 100 by 100 pixels, ψj will have 300 by 300 pixels, with an up-sampling factor M=3. The term ‘Ij(x, y)↑M’ represents the nearest-neighbor up-sampling of the captured image Ij. In the denominator of the above equation, the term |ψj(x, y)|2 is first convolved with an averaging filter (an M by M all-ones matrix, ones(M, M)). It is then down-sampled by a factor of M, followed by nearest-neighbor up-sampling by a factor of M. In some embodiments, other up-sampling factors (e.g., M=4, 5, 6, . . . ) may be used.
At S306, propagate the updated ψ′j(x, y) to the object plane to get φ′j(x, y). Update the object O(x, y) and P(x, y):
where ‘conj’ denotes conjugate, and αobj and αP are algorithm constants.
At S307, j=j+1 and repeat steps S302-S306.
At S308, repeat steps S302-S307 until the solution converges.
At S309, propagate the recovered Od1(x, y) to the object plane.
It should be appreciated that the specific steps S301-S309 discussed above provide a particular reconstruction process according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps S301-S309 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
The imaging system 900 further includes a light source 920. The light source 920 may comprise a laser or a light-emitting diode (LED), and is configured to emit a coherent or partially coherent light beam to be incident on the sample 910. The light beam may be collimated, partially collimated, or uncollimated. The imaging system 900 may include a mirror 930 positioned substantially at a 45 degree angle with respect to the path of the light beam emitted by the light source 920, so as to fold the light beam for a more compact configuration. The mirror 930 is optional.
The imaging system 900 further includes a mask 950 positioned downstream from the sample 910 along an optical axis 902 of the imaging system 900, and above the image sensor 940. The mask 950 may include an unknown pattern formed thereon. Thus, as the light beam is transmitted through the sample 910 and the mask 950, a diffused image may be formed at the image sensor 940. In some embodiments, the mask 950 may include an area 952 that is free of the pattern. Thus, the image sensor 940 may detect an image of a feature on the sample 910. By tracking the movement of the feature, the movement of the sample 910 may be tracked. The detected positional shift of the sample is used to recover the sample and/or the mask profile in the reconstruction process.
The imaging system 900 further includes a translation mechanism (not shown in
As the sample 910 is scanned, the diffused image formed at the image sensor 940 may shift accordingly. The image sensor 940 is configured to capture a plurality of images as the sample 910 is scanned to the plurality of sample positions. Each respective image corresponds to a respective sample position. The plurality of images may be processed by a processor (not shown in
According to some embodiments, the reconstruction process may include steps similar to steps S301-S309 as discussed above.
The imaging system 1200 further includes a light source 1220. The light source 1220 is configured to emit a light beam 1270 of multiple wavelengths. The light beam 1270 may be collimated, partially collimated, or uncollimated. In some embodiments, the light source 1220 may comprise multiple light-emitting elements (e.g., 3, 5, or up to 20 laser diodes) configured to emit light in different wavelengths. Alternatively, the light source 1220 may comprise a broadband light source, for example, a broadband light-emitting diode (LED). The imaging system 1200 may include a mirror 1230 positioned substantially at a 45 degree angle with respect to the path of the light beam 1270 emitted by the light source 1220, so as to fold the light beam 1270 for a more compact configuration. The mirror 1230 is optional.
The imaging system 1200 further includes a light dispersing element 1260 configured to receive and disperse the light beam 1270 into a plurality of sub light beams 1272a, 1272b, and 1272c, each sub light beam 1272a, 1272b, or 1272c corresponding to a respective wavelength. The light dispersing element 1260 may comprise, for example, a prism, an optical diffraction grating, or the like. Although only three sub light beams are illustrated in
The imaging system 1200 further includes a mask 1250 positioned downstream from the sample 1210 along an optical axis 1202 of the imaging system 1200, and above the image sensor 1240. The mask 1250 may include an unknown pattern formed thereon. Thus, as the plurality of sub light beams 1272a, 1272b, and 1272c is transmitted through the sample 1210 and the mask 1250, a diffused image may be formed at the image sensor 1240. The diffused image may be a superposition of a plurality of sub-images corresponding to the different wavelengths of the plurality of sub light beams 1272a, 1272b, and 1272c. Since the plurality of sub light beams 1272a, 1272b, and 1272c is incident on the mask 1250 at different angles of incidence, the light modulation produced by the mask 1250 may be wavelength-dependent. The wavelength-dependent feature of the light modulation may be used to recover the profiles of the sample 1210 at different wavelengths in the phase retrieval process.
In some embodiments, the mask 1250 may include an area that is free of the pattern. Thus, the image sensor 1240 may detect an image of a feature on the sample 1210. By tracking the movement of the feature, the movement of the sample 1210 may be tracked. The detected positional shift of the sample is used to recover the sample and/or the mask profile in the reconstruction process.
The imaging system 1200 further includes a translation mechanism (not shown in
As the sample 1210 is scanned, the diffused image (e.g., a superposition of a plurality of sub-images corresponding to the different wavelengths) formed at the image sensor 1240 may shift accordingly. The image sensor 1240 is configured to capture a plurality of images as the sample 1210 is scanned to the plurality of sample positions. Each respective image corresponds to a respective sample position. The plurality of images may be processed by a processor (not shown in
According to some embodiments, the reconstruction process may include the following steps.
At S401, initialize multiple object (e.g., sample) estimates Ot(x, y) and the diffuser pattern or the modulation mask pattern Pt(x, y), where t=1, 2, . . . , T, and T represents the number of wavelengths used for illumination.
At S402, estimate the ith translated sample position (xi, yi) based on cross-correlation or mutual information of the captured images, or other tracking algorithms.
At S403, according to the imaging model, Ot(x, y) is propagated over a distance ‘d1’ to the modulation plane based on the translated position (xi, yi), to obtain:
Ot,d1(x−xi, y−yi)=Ot(x−xi, y−yi)*PSFfree(d1).
Then generate the corresponding target image It,i(x, y) at the image sensor plane as follows:
where ‘·’ stands for point-wise multiplication, and ‘*’ denotes the convolution operation. ‘d1’ is the distance between the object and the diffuser, and ‘d2’ is the distance between the diffuser and the image sensor. PSFfree(d) is used to model the point spread function (PSF) for free-space propagation over distance ‘d’. ‘↓M’ in the above equation represents the down-sampling process.
At S404, sum It,i(x, y) over t to generate the incoherent mixture:
Iincoherent,i(x, y)=Σt=1T It,i(x, y).
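The incoherent summation and the down-sampling to the sensor pixel grid can be sketched together as below. Folding the per-wavelength down-sampling into a single block-mean step, and the block-mean interpretation of the ‘↓M’ operator, are assumptions of this sketch:

```python
import numpy as np

def incoherent_mixture(fields, M=3):
    """Sum per-wavelength intensities |psi_t|^2 and down-sample by M to
    the sensor pixel grid; a sketch with illustrative names."""
    total = sum(np.abs(psi) ** 2 for psi in fields)  # incoherent sum over t
    h, w = total.shape
    # M-times down-sampling: average over M-by-M sensor-pixel blocks.
    return total.reshape(h // M, M, w // M, M).mean(axis=(1, 3))
```

Because the sum is over intensities rather than fields, no cross terms between wavelengths appear, which is what makes the mixture incoherent.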
At S405, update ψt,i(x, y) using the ratio between the actual measurement Im,i(x, y) and Iincoherent,i(x, y), and keep the phase unchanged:
The term Im,i(x, y)↑M represents the nearest-neighbor up-sampling of the captured image Im,i(x, y). In the denominator of the equation, the term Iincoherent,i(x, y) is first convolved with an averaging filter (an M by M all-ones matrix). It is then down-sampled by a factor of M, followed by nearest-neighbor up-sampling by a factor of M. In some embodiments, other up-sampling factors (e.g., M=4, 5, 6 . . . ) may be used.
At S406, back propagate ψ′t,i(x, y) to the modulation plane:
ψ′t,i(x, y)=ψ′t,i(x, y)*PSFfree(−d2).
At S407, update Ot,d1(x−xi, y−yi) and the pattern Pt(x, y).
At S408, update the shifted object Ot(x−xi, y−yi) using:
Otupdate(x−xi, y−yi)=Ot,d1(x−xi, y−yi)*PSFfree(−d1).
At S409, i=i+1 and repeat steps S402-S408.
At S410, repeat steps S402-S409 until the solution converges.
At S411, propagate the recovered Otupdate(x, y) to the object plane.
It should be appreciated that the specific steps S401-S411 discussed above provide a particular reconstruction process according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps S401-S411 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
An advantage of the imaging systems illustrated in
This concept was validated using a thick potato sample.
According to some embodiments, in the imaging systems described above, the light source may be replaced by a light source array, such as an LED array. Different light sources in the light source array may illuminate the sample at different angles of incidence. A plurality of complex profiles of the sample may be recovered, each respective profile corresponding to a respective light source. A three-dimensional tomographic image of the sample may be reconstructed from the plurality of complex profiles of the sample.
The imaging systems discussed above according to embodiments of the present invention may afford numerous advantages. For example, it is not necessary to know the position of the speckle pattern or the mask modulation. Therefore, the image acquisition process can be free-run. That is, any scanning motion (e.g., the scanning of the mirror 130 shown in
It is also understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application and scope of the appended claims.
This application is a non-provisional application of and claims the benefit and priority under 35 U.S.C. 119(e) of U.S. Provisional Application No. 62/825,120, filed Mar. 28, 2019 entitled “SUPER-RESOLUTION IMAGING VIA TRANSLATED PATTERN ILLUMINATION AND TRANSLATED PATTERN MODULATION,” the entire content of which is incorporated herein by reference for all purposes.
This invention was made with government support under Grant No. 1510077 awarded by the National Science Foundation. The government has certain rights in the invention.
Number | Date | Country
---|---|---
62/825,120 | Mar. 2019 | US
62/832,403 | Apr. 2019 | US