The field of the invention generally relates to methods and devices for imaging of microscopic structures such as cells. More particularly, the field of the invention pertains to systems and methods for the tomographic imaging of small particles such as cells, organelles, cellular particles and the like in a static sample or flowing within a microfluidic environment.
Light microscopy has been an irreplaceable tool in life sciences for several centuries. Nevertheless, its design has not fundamentally changed since its inception, i.e., the image of the specimen is magnified through a system of lenses and other optical components before being detected by the eye or a digital sensor array for visualization. The quest to resolve smaller features with better resolution and contrast has improved the capabilities of light microscopy at the cost of increasing its size and complexity. On the other hand, emerging technologies such as microfluidic and lab-on-a-chip systems have flourished, offering fast and efficient handling and processing of biological samples within highly miniaturized architectures. However, optical inspection of specimens is still being performed by conventional light microscopes, which in general have a size mismatch of several orders of magnitude compared to the scale of the microfluidic systems. As a result, there is a clear need for alternative compact microscopy modalities that are capable of integrating with miniaturized lab-on-a-chip platforms.
The urge for new optical microscopy modalities is not solely driven by the need for miniaturization and microfluidic integration. The fact that high resolution is achieved at the cost of significant field-of-view (FOV) reduction is another fundamental limitation of lens-based imaging. The relatively small FOV of conventional light microscopy brings additional challenges for its application to several important problems such as rare cell imaging or optical phenotyping of model organisms, where high throughput microscopy is highly desired.
In order to provide a complementary solution to these aforementioned needs, alternative, lens-free microscopy platforms have been developed which combine high resolution and large FOV in a compact, on-chip imaging architecture. In this modality, digital in-line holograms of micro-objects are recorded on a sensor array using partially coherent illumination with unit fringe magnification such that the entire active area of the sensor serves as the imaging FOV. To overcome the resolution limitation imposed by the pixel size at the sensor, multiple sub-pixel shifted holograms of the sample are acquired, and pixel super-resolution techniques are then applied to achieve sub-micron lateral resolution without compromising the large FOV. As a result, a lateral imaging performance comparable to a microscope objective with a numerical aperture (NA) of ~0.5 has been achieved over an FOV of 24 mm2, which is more than two orders-of-magnitude larger than that of an objective lens with similar resolution. See e.g., Bishara W. et al., Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution, Optics Express 18:11181-11191 (2010).
While pixel super-resolution techniques in partially coherent lens-free in-line holography enable imaging with sub-micron lateral resolution over a large FOV, the axial resolution is unfortunately significantly lower (e.g., >40-50 μm) due to the inherently long depth-of-focus of digital in-line holography. Accordingly, despite the fact that holographic reconstruction can be numerically focused at different depths, sectioning of planes closer than ˜50 μm has not been feasible with lens-free wide-field holographic microscopes regardless of their detection numerical apertures. This fundamental limitation needs to be addressed.
Along the same lines, in recent years, there has been an increased interest in optical microscopy modalities that enable sectional imaging. As an example, Optical Projection Tomography (OPT) has been proposed, in which an optically cleared specimen immersed in index-matching gel is rotated with respect to the fixed optical path of a conventional lens-based microscope; OPT offers an isotropic resolution of ~10 μm in all three dimensions within an imaging volume of up to ~1 cm3. See Sharpe J et al., Optical Projection Tomography as a Tool for 3D Microscopy and Gene Expression Studies, Science 296:541-545 (2002).
A modified version of OPT by using high NA objective lenses has also been implemented recently to achieve sub-micron resolution cell imaging over a significantly reduced volume of e.g., <0.0005 mm3. See Fauver M et al., Three-dimensional imaging of single isolated cell nuclei using optical projection tomography, Optics Express 13:4210-4223 (2005).
Optical Diffraction Tomography (ODT) is another powerful technique where digital holography is utilized to reconstruct the 3D refractive index distribution of the specimen by changing the illumination direction, rotating the object, or by capturing multiple images at different wavelengths. These tomographic systems can routinely image cells, potentially achieving sub-micron resolution in all three dimensions. However, the trade-off between resolution and imaging volume also applies to these systems just like conventional microscopy, and high resolution is achieved at the cost of a significantly reduced imaging FOV of e.g., less than 0.04-0.2 mm2 and a depth-of-field (DOF) of less than 10-20 μm depending on the objective lens that is used.
For the same purpose, another imaging modality, namely, Selective Plane Illumination Microscopy (SPIM) has also been introduced, which utilizes a light sheet generated by a cylindrical lens to successively illuminate selective planes within a fluorescent sample to create a 3D image with enhanced axial resolution. See Huisken J et al., Optical Sectioning Deep Inside Live Embryos by Selective Plane Illumination Microscopy, Science 305:1007-1009 (2004). SPIM, which is limited to only fluorescent imaging, achieves ~6 μm axial resolution in thick samples up to a few millimeters over an FOV ranging between 0.04-2 mm2, which is dictated by either the NA of the objective lens that is used or the active area of the opto-electronic sensor array. In general, these existing optical tomography platforms, as summarized above, all rely on relatively complex and bulky optical setups that are challenging to miniaturize and integrate with microfluidic systems. Therefore, an alternative tomographic microscopy platform which offers both high resolution and a large imaging volume in a compact embodiment may offer an important imaging toolset in various fields including cell and developmental biology, neuroscience and drug discovery.
In one aspect of the invention, a system and method for lens-free optical tomography is provided that achieves less than 1 μm lateral resolution together with an axial resolution of ~2.5-3 μm over a large FOV of ~14 mm2 as well as an extended DOF of ~4 mm, enabling an on-chip imaging volume of ~15 mm3. This lens-free optical tomography platform merges high resolution in three dimensions (3D) with a significantly large imaging volume, offering a 3D space-bandwidth product that is unmatched by existing optical computed tomography modalities.
In one approach, lens-free tomographic imaging is achieved by rotating a partially coherent light source with ˜10 nm spectral bandwidth to illuminate the sample volume from multiple angles (spanning ±50° in air), where at each illumination angle several sub-pixel shifted inline projection holograms of the objects are recorded without using any lenses, lasers or other bulky optical components. The sub-pixel images are then digitally processed to generate a single, high resolution (e.g., pixel super-resolution) hologram of each angular projection. The high resolution holograms are then digitally reconstructed to obtain phase and amplitude information which are then back-projected to compute tomograms of the sample.
Limited spatial and temporal coherence of the hologram recording geometry brings important advantages to the reconstructed images such as reduced speckle and multiple reflection interference noise terms. Furthermore, the unit fringe magnification in the geometry permits recording of inline holograms of the objects even at oblique illumination angles of e.g., >40° which would not be normally feasible with conventional coherent inline holographic imaging schemes that utilize fringe magnification.
In order to minimize the artifacts due to the limited angular range of tilted illumination, a dual-axis tomography scheme may be adopted where the light source is rotated along two substantially orthogonal axes. Tomographic imaging performance is quantified using microbeads of different dimensions, as well as by imaging wild type C. elegans. Probing a large volume with good 3D spatial resolution, this lens-free optical tomography platform provides a powerful tool for high-throughput imaging applications in e.g., cell and developmental biology.
In one embodiment, a system for three dimensional imaging of an object contained within a sample includes an image sensor; a sample holder configured to hold the sample, the sample holder disposed adjacent to the image sensor; and an illumination source comprising partially coherent light or coherent light, the illumination source configured to illuminate the sample through at least one of an aperture, fiber-optic cable, or optical waveguide interposed between the illumination source and the sample holder, wherein the illumination source is configured to illuminate the sample through a plurality of different angles.
In another embodiment, a method of obtaining a three dimensional image of an object contained within a sample includes illuminating a sample holder configured to hold the sample with an illumination source emitting partially coherent light or coherent light at a first angle, the light passing through at least one of an aperture or a fiber-optic cable prior to illuminating the sample; illuminating the sample holder with the illumination source emitting light at different angles, the light passing through the aperture or a fiber-optic cable prior to illuminating the sample; obtaining, at each angle, a plurality of sub-pixel image frames from an image sensor disposed on an opposing side of the sample holder; digitally converting the sub-pixel image frames at each angle into a single higher resolution hologram for each angle; digitally reconstructing projection images for each angle from the higher resolution holograms; and digitally back projecting three dimensional tomographic images of the object within the sample.
In still another embodiment, a method of performing three dimensional imaging of an object contained within a sample includes flowing a sample through a flow cell disposed adjacent to an image sensor; illuminating the sample with an illumination source emitting partially coherent light or coherent light at a first angle, the light passing through at least one of an aperture, fiber-optic cable, or optical waveguide prior to illuminating the sample; obtaining a plurality of image frames of the object in the moving sample at the first angle with the image sensor; illuminating the sample with the illumination source at one or more different angles, the light passing through at least one of the aperture, fiber-optic cable, or optical waveguide prior to illuminating the sample; obtaining a plurality of image frames of the object in the moving sample at the one or more different angles with the image sensor; digitally reconstructing a super-resolved projection hologram of the object from the plurality of image frames obtained at the first and one or more different angles; digitally reconstructing complex projection images of the object within the sample based on the super-resolved projection holograms obtained at the first angle and the one or more different angles; and digitally reconstructing three dimensional tomograms of the object within the sample through filtered back-projection of the complex projection images.
In still another embodiment, a portable tomographic imager includes a housing containing a sample holder configured to hold a sample therein; a plurality of partially coherent or coherent light sources disposed in the housing at varying angles with respect to a first side of the sample, each of the plurality of light sources being coupled to respective waveguides; a microcontroller operatively connected to the plurality of light sources, the microcontroller configured to selectively activate individual light sources; an electromagnetic actuator configured to move the waveguides in substantially orthogonal directions; and an image sensor disposed in the housing on a second opposing side of the sample.
In another embodiment a portable tomographic imager includes a housing containing a sample holder configured to hold a sample therein; a plurality of partially coherent or coherent light sources disposed in the housing at varying angles with respect to a first side of the sample, each of the plurality of light sources being coupled to respective spatial apertures; a microcontroller operatively connected to the plurality of light sources, the microcontroller configured to selectively activate individual light sources; an electromagnetic actuator configured to move the spatial apertures in substantially orthogonal directions; and an image sensor disposed in the housing on a second opposing side of the sample.
Regardless, the surface of image sensor 16 may be in contact with or in close proximity to the sample 14. Generally, the object 12 within the sample 14 is located within several millimeters of the active surface of the image sensor 16. The image sensor 16 may include, for example, a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) device. The image sensor 16 may be monochromatic or color. The image sensor 16 generally has a small pixel size which is less than 9.0 μm in size and more particularly, smaller than 5.0 μm in size (e.g., 2.2 μm or smaller). Generally, image sensors 16 having smaller pixel size will produce higher resolutions. One benefit of the imaging method described herein is that a spatial resolution better than the pixel size can be obtained.
Still referring to
With reference to
Still referring to
Still referring to
While
In operation 1400, the multiple sub-pixel images at each angle are digitally converted to a single, higher resolution hologram (SR hologram), using a pixel super-resolution technique, the details of which are disclosed in Bishara et al., Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution, Optics Express 18:11181-11191 (2010), which is incorporated by reference. First, the shifts between these holograms are estimated with a local-gradient based iterative algorithm. Once the shifts are estimated, a high resolution grid is iteratively calculated, which is compatible with all the measured shifted holograms. In these iterations, the cost function to minimize is chosen as the mean square error between the down-sampled versions of the high-resolution hologram and the corresponding sub-pixel shifted raw holograms.
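The pixel super-resolution step described above can be sketched in simplified form. The example below assumes the sub-pixel shifts have already been estimated and happen to be integer multiples of the high-resolution pixel pitch (the cited algorithm handles arbitrary fractional shifts); the function names and the plain gradient-descent solver are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np

def downsample(hr, factor):
    """Model the sensor's pixel integration by averaging factor x factor blocks."""
    h, w = hr.shape
    return hr.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def super_resolve(lr_frames, shifts, factor=2, iters=300):
    """Iteratively estimate a high-resolution hologram whose shifted, downsampled
    copies match all measured sub-pixel-shifted low-resolution frames
    (least-squares data term, plain gradient descent)."""
    # warm start: upsampled average of the raw frames
    hr = np.kron(np.mean(lr_frames, axis=0), np.ones((factor, factor)))
    step = factor ** 2  # safe step size: the forward operator has norm 1/factor^2
    for _ in range(iters):
        grad = np.zeros_like(hr)
        for frame, (dy, dx) in zip(lr_frames, shifts):
            # forward model: shift (in high-res pixels), then pixel-average
            resid = downsample(np.roll(hr, (-dy, -dx), axis=(0, 1)), factor) - frame
            # adjoint: upsample the residual and undo the shift
            up = np.kron(resid, np.ones((factor, factor))) / factor ** 2
            grad += np.roll(up, (dy, dx), axis=(0, 1))
        hr -= step * grad / len(lr_frames)
    return hr
```

Each iteration compares shifted, downsampled copies of the high-resolution estimate against the raw frames and back-projects the residuals, which descends the mean square error cost described above.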
Next, in operation 1500, complex projection images are digitally reconstructed at each angle. Digitally synthesized super-resolved holographic projections are reconstructed to obtain the lens-free projection images of the objects at various illumination angles. It should be emphasized that the holograms recorded with oblique illumination angles are still in-line holograms due to co-axial propagation of the scattered object wave and the unperturbed reference wave toward the sensor array. Consequently, digitally reconstructed images are contaminated by the twin-image artifact, which is a manifestation of the fact that the phase of the complex field in the detector plane is lost during the recording process. In order to obtain faithful projection images, a size-constrained iterative phase recovery algorithm is utilized, which enables recovering the phase of the complex field detected by the sensor. Details regarding the phase recovery algorithm may be found in Mudanyali et al., Compact, Light-weight and Cost-effective Microscope based on Lensless Incoherent Holography for Telemedicine Applications, Lab Chip 10:1417-1428 (2010), which is incorporated by reference as if set forth fully herein.
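A hedged sketch of such a size-constrained iterative phase recovery is given below: the field is propagated between the sensor and object planes with the angular spectrum method, the measured amplitude is enforced at the sensor, and a unit (background) field is assumed outside the object support. The parameter names and the specific support handling are simplifying assumptions rather than the exact algorithm of the cited reference:

```python
import numpy as np

def propagate(field, dz, wavelength, dx):
    """Angular spectrum propagation of a complex field by a distance dz."""
    fx = np.fft.fftfreq(field.shape[1], d=dx)
    fy = np.fft.fftfreq(field.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * dz) * (arg > 0)  # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

def recover_phase(amplitude, support, z2, wavelength, dx, iters=20):
    """Size-constrained iterative phase recovery: propagate between sensor and
    object planes, keeping the measured amplitude at the sensor and suppressing
    the twin image by forcing a unit background outside the object support."""
    field = amplitude.astype(complex)  # start with zero phase
    for _ in range(iters):
        obj = propagate(field, -z2, wavelength, dx)       # back to object plane
        obj = np.where(support, obj, 1.0)                 # outside support: background
        field = propagate(obj, z2, wavelength, dx)        # forward to sensor plane
        field = amplitude * np.exp(1j * np.angle(field))  # keep measured amplitude
    return propagate(field, -z2, wavelength, dx)
```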
Similar to the conventional vertical illumination case, holograms recorded with oblique illumination angles are multiplied with a reference wave that is the digital replica of the reference wave utilized for recording the holograms, which translates to using a plane reference wave tilted with respect to sensor normal. It should be noted that the tilt angle of this reconstruction wave is not equal to the tilt of the illuminating beam, due to refraction of light in the sample holder. In fact, the digital reconstruction angle for projection holograms are determined by calculating the inverse tangent of the ratio Δd/z2, where Δd denotes the lateral shifts of the holograms of objects with respect to their positions in the vertical projection image, and z2 is either experimentally known, or is iteratively determined by the digital reconstruction distance of the vertical hologram.
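In code, this angle determination reduces to an inverse tangent; the numerical values in the comment are purely illustrative:

```python
import math

def reconstruction_angle(delta_d, z2):
    """Digital reconstruction angle (degrees) from the lateral hologram shift
    delta_d and the object-to-sensor distance z2 (both in the same units)."""
    return math.degrees(math.atan2(delta_d, z2))

# e.g., a 700 um lateral shift for an object 1 mm (1000 um) above the sensor:
# reconstruction_angle(700.0, 1000.0) -> ~35 degrees
```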
For iterative phase recovery, the complex field is digitally propagated back and forth between the parallel image detector and object planes. In order to obtain the projection image in the plane normal to the illumination, the recovered field is also interpolated on a grid whose dimension along the tilt direction is rescaled by cos(θ), where θ is the angle of digital reconstruction. In addition, the projection images need to be aligned with respect to a common center-of-rotation before computing the tomograms. To achieve that, an automated two-step cross-correlation based image registration algorithm was implemented. Since the projection images obtained with successive illumination angles (e.g., 50° and 48°) are very similar to each other, the first step of image-registration is performed by cross-correlating the projection images obtained at adjacent angles. In most cases, especially when the object is a large connected structure such as C. elegans, this step yields a successfully registered set of projections. However, if the FOV contains distributed small objects such as beads, the slight differences in projection images due to perspective change, even for adjacent angles, may deteriorate the registration accuracy. In this case, the bead at the center of the projection images, which is also assumed to be the center-of-rotation, walks off the center of the projection images, indicating poor image registration. Then, a second step of registration is utilized following the first one, where the bead at the center of the vertical projection image is used as a global reference, and all other projection images are automatically aligned with respect to that particular bead. Since the reference bead is already roughly aligned in the first step, the second correlation step is performed only on the reference bead by correlating cropped projection images with the cropped global, i.e. vertical, projection image.
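A minimal sketch of the first, neighbor-to-neighbor registration pass might look as follows (integer-pixel, FFT-based circular cross-correlation is assumed for simplicity; the second, reference-bead pass would repeat `cc_shift` on cropped images against the vertical projection):

```python
import numpy as np

def cc_shift(a, b):
    """Integer shift (dy, dx) such that np.roll(b, (dy, dx)) best aligns b to a,
    found as the peak of the FFT-based circular cross-correlation."""
    c = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    dy, dx = np.unravel_index(np.argmax(c), c.shape)
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]  # wrap to a signed shift
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

def register_projections(projections):
    """First-pass registration: align each projection to its angular neighbor and
    accumulate the shifts so all projections share a common center-of-rotation."""
    aligned = [projections[0]]
    total = np.array([0, 0])
    for prev, cur in zip(projections, projections[1:]):
        total += cc_shift(prev, cur)
        aligned.append(np.roll(cur, tuple(total), axis=(0, 1)))
    return aligned
```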
The large z1/z2 ratio in this lens-free recording scheme permits a detection NA that is close to the refractive index of the medium. While this property of the system is of paramount importance for recording holograms with tilted illumination beams, the design of the opto-electronic sensor arrays limits the maximum angle that can be utilized. Opto-electronic sensor arrays in general are designed for lens-based imaging systems, where the angle of incident rays does not typically exceed 20°-30°, as a result of which holograms recorded at illumination angles larger than ±50° start to exhibit artifacts. For this reason, experimental projection holograms were obtained within a limited angular range of −50° to +50°, along two different rotation axes. It should be understood, however, that the angular range may be larger than this, for example, spanning angles between −89° and +89°.
The lens-free projection images (both phase and amplitude) are subject to a filtered back-projection algorithm to produce three-dimensional images as seen in operation 1600. Fourier-projection theorem allows reconstructing the 3D transmission function of an object from its 2D projections along different directions. Details regarding the back-projection method may be found in Radermacher M., Weighted back-projection methods, Electron Tomography: Methods for three dimensional visualization of structures in the cell, (Springer, New York, 2nd ed.) pp. 245-273(2006), which is incorporated herein by reference. Of course, other tomographic reconstruction methods known to those skilled in the art may be used as well.
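For intuition, a bare-bones 2D filtered back-projection (ramp filter applied in the Fourier domain of each parallel projection, followed by nearest-neighbor back-projection) can be sketched as follows; this is a textbook illustration, not the weighted back-projection implementation of the cited reference:

```python
import numpy as np

def filtered_back_projection(sinogram, angles_deg):
    """Reconstruct a 2-D slice from 1-D parallel projections (one row of
    `sinogram` per angle) using a ramp (Ram-Lak) filter."""
    n_angles, n_det = sinogram.shape
    # ramp filter each projection in its 1-D Fourier domain
    freqs = np.abs(np.fft.fftfreq(n_det))
    filtered = np.fft.ifft(np.fft.fft(sinogram, axis=1) * freqs, axis=1).real
    # back-project: smear each filtered projection across the image grid
    grid = np.arange(n_det) - n_det / 2
    X, Y = np.meshgrid(grid, grid)
    recon = np.zeros((n_det, n_det))
    for proj, ang in zip(filtered, np.deg2rad(angles_deg)):
        t = X * np.cos(ang) + Y * np.sin(ang) + n_det / 2
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
        recon += proj[idx]
    return recon * np.pi / n_angles
```

In the lens-free tomography context the same operation is applied to the complex (phase and amplitude) projection images, slice by slice, for each of the two rotation axes.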
Accordingly, one pixel super-resolved (SR) hologram for each illumination angle is digitally synthesized by utilizing multiple sub-pixel (LR) shifted holograms, which is followed by holographic reconstruction of all high resolution holograms yielding lens-free projection images. Then, in operation 1600, these reconstructed lens-free projection images (both phase and amplitude) are used to compute 3D tomograms of micro-objects using a filtered back-projection algorithm. A fundamental requirement for this technique, commonly referred to as the projection assumption, is that the projection images should represent a linear summation of a certain property of the object, for which tomograms can be computed. While it is much easier to satisfy this condition in X-Ray Computed Tomography due to negligible diffraction at that part of the electromagnetic spectrum, computed tomography in the optical regime requires weakly scattering objects. Similarly, this lens-free optical tomography modality also requires that the majority of the photons experience at most a single scattering event over the volume of each stack of tomograms. For weakly scattering objects, together with the long depth-of-focus of the system, complex scattering potential becomes additive along the direction of illumination. Consequently, tomograms of complex scattering potential of an object can be computed by applying a filtered back-projection algorithm whose inputs are the complex projection images calculated by holographic reconstruction of pixel super-resolved lens-free holograms at each illumination angle.
Since holograms are recorded for a limited angular range of ±50°, there is a missing region in the Fourier space of the object, commonly known as the missing wedge. The main implication of the missing wedge is reduced axial resolution, which limits the axial resolution to a value larger than the lateral. Further, in the lateral plane, ringing artifacts are observed as well as narrowing of the point-spread function (PSF) along the direction of rotation of the illumination such that the PSF in the x-y plane becomes elliptical.
In order to minimize these imaging artifacts, a dual-axis tomography scheme is used. Projection images obtained along each tilt direction are separately back-projected to compute two sets of complex tomograms. These tomograms are merged in Fourier space following the sequence given in Mastronarde D. N., Dual-Axis Tomography: An Approach with Alignment Methods That Preserve Resolution, Journal of Structural Biology 120:343-352 (1997), which is incorporated by reference as if set forth fully herein. Accordingly, regions where both sets of tomograms have data are averaged, while regions where only one set has useful data in its Fourier space are filled with the data of the corresponding tomograms. As a result, the missing wedge is reduced to a missing pyramid, significantly reducing the aforementioned limited-angle tomography artifacts. To further reduce the artifacts outside the support of the object, the mask utilized for digital reconstruction of the vertical projection hologram is applied to all tomograms. The missing wedge could also be iteratively filled to improve resolution and reduce artifacts by implementing iterative constraint algorithms based on a priori information of the 3D support or transmission function of the object.
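The Fourier-space merging rule described above (average where both axes have data, copy where only one does) can be sketched as follows, assuming boolean masks marking the region of Fourier space sampled by each axis are available from the known illumination angular range:

```python
import numpy as np

def merge_dual_axis(tomo_a, tomo_b, mask_a, mask_b):
    """Merge two tomograms in Fourier space: average the regions sampled by both
    tilt axes, and fill regions sampled by only one axis from that tomogram."""
    Fa, Fb = np.fft.fftn(tomo_a), np.fft.fftn(tomo_b)
    both = mask_a & mask_b
    only_a = mask_a & ~mask_b
    only_b = mask_b & ~mask_a
    F = np.zeros_like(Fa)
    F[both] = 0.5 * (Fa[both] + Fb[both])
    F[only_a] = Fa[only_a]
    F[only_b] = Fb[only_b]
    return np.fft.ifftn(F).real
```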
The moving objects 12 are moved or flowed through the flow cell 52 using one or more pumping techniques. For example, a pressure gradient may be established to pump fluid containing objects 12 within flow cell 52. Alternatively, the moving objects 12 may be moved through the flow cell 52 using electro-kinetic motion with electrodes at opposing ends of the flow cell 52 being used. In this regard, any particular pumping modality may be used to move the objects 12 through the flow cell 52. Examples include the use of pumps like syringe pumps, dielectrophoresis based electrodes, magnetohydrodynamic electrodes, and the like.
Still referring to
A spatial filter 56 may be integrated into the distal end of the illumination source 54 as illustrated in
As seen in
Still referring to
As seen in
Still referring to
Moving objects 12 that flow through the flow cell 52 are imaged using the image sensor 58. In particular, a plurality of low resolution holographic image frames is acquired using the angularly offset image sensor 58. Because of the unit fringe magnification of the system imaging geometry, depending on the pixel size at the image sensor 58, the acquired holograms may be under-sampled. On the other hand, since during the flow each lens-free object hologram is sampled with different sub-pixel shifts as a function of time, one can use a pixel super-resolution algorithm to digitally synthesize a high-resolution hologram that has an effective pixel size of e.g., ≤0.5 μm, which is significantly smaller than the physical pixel size of the sensor (e.g., >2 μm). Thus, the system 50 uses the flow of the moving object 12 within the flow cell 52 to digitally create smaller pixels for hologram sampling. Such a super-resolved digital in-line hologram, after elimination of the twin-image artifact, enables high-resolution lens-free imaging of the moving objects 12.
The embodiment of
Each optical fiber 68 acts as a waveguide and the array of optical fibers 68 are tiled along an arc as illustrated in
A battery (not shown) could be used to power the imager 62. For example, standard alkaline batteries (with a capacity of e.g., 3000 mAh) could be used to actuate the fibers without the need for replacement for at least several days of continuous use of the tomographic microscope. Alternatively, the imager 62 could be powered by an external power source.
The imager 62 further includes a microcontroller 78 in the housing 64. The microcontroller 78 is used to control the firing of the LEDs that make up the illumination source 66. For instance, the microcontroller 78 may activate or trigger each individual LED at the appropriate time. As an example, the LEDs may be activated sequentially along the bridge 72. The microcontroller 78 may also be used to control the actuation of the coils 76.
Still referring to
The system 60 further includes a computer 30 having at least one processor 32 therein that is used to execute software for the processing and analysis of images as in the prior embodiment. A monitor 34 and input device 36 may be connected to the computer 30 for displaying results and interfacing with the computer 30. The computer 30, monitor 34, and input device 36 operate in the same or similar manner as in the prior embodiments.
The embodiment illustrated in
Lens-free tomographic imaging is achieved by rotating a partially coherent light source with ˜10 nm spectral bandwidth to illuminate the sample volume from multiple angles (spanning ±50° in air), where at each illumination angle several sub-pixel shifted inline projection holograms of the objects on the chip are recorded without using any lenses, lasers or other bulky optical components. Limited spatial and temporal coherence of the hologram recording geometry brings important advantages to the reconstructed images such as reduced speckle and multiple reflection interference noise terms. Furthermore, the unit fringe magnification in this geometry permits recording of inline holograms of the objects even at oblique illumination angles of e.g., >40° which would not be normally feasible with conventional coherent inline holographic imaging schemes that utilize fringe magnification.
In order to combat the limited angle artifacts in the tomograms, a dual-axis tomography scheme is employed by sequentially rotating the illumination source in two orthogonal directions as illustrated in
These results constitute the first time that (1) optical tomographic imaging has been extended to lens-free on-chip imaging; (2) dual-axis tomography has been applied to the optical part of the electromagnetic spectrum; and (3) pixel super-resolution techniques have been applied to optical tomographic imaging. Without the use of any lenses or coherent sources such as lasers, the presented lens-free tomographic imaging scheme achieves a spatial resolution of <1 μm × <1 μm × ~2.5-3 μm (in x, y and z, respectively) over a large imaging volume of ~15 mm3 using a dual-axis tomography scheme. The imaging volume increases to ~30 mm3, at the cost of ~15% reduction in axial resolution, if only single-axis projection data is utilized. Offering good spatial resolution over a large imaging volume, lens-free optical tomography could in general be quite useful for high-throughput imaging applications in e.g., cell and developmental biology.
In the lens-free tomographic imaging setup used in this experiment, the light source, situated about z1=70 mm away from the sensor (Aptina MT9P031STC, 5 Megapixels, 2.2 μm pixel size), provides partially coherent illumination to record inline holograms of the objects, whose distance to the sensor surface ranges between e.g., z2=0.5-4 mm depending on the chamber height. For experimental flexibility, a monochromator was utilized to provide tunable broadband illumination with ~10 nm bandwidth centered around 500 nm. After being filtered through an aperture of diameter 0.05-0.1 mm and propagating a distance of z1=70 mm, the illuminating beam acquires a spatial coherence diameter of ~0.5-1 mm, which permits recording the inline holograms of individual objects. Multi-angle illumination is achieved by rotating the light source, using a motorized stage, along an arc whose origin is the center of the sensor array. Due to the large z1/z2 ratio, this alignment is not sensitive, and the robustness of the setup is maintained.
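The quoted coherence diameter can be checked against a standard van Cittert-Zernike estimate; the 1.22 prefactor below assumes a circular, spatially incoherent aperture, and this back-of-the-envelope check is illustrative rather than part of the disclosure:

```python
def coherence_diameter(wavelength_m, z1_m, aperture_d_m):
    """Van Cittert-Zernike estimate of the spatial coherence diameter at a
    distance z1 from an incoherent circular source of diameter aperture_d."""
    return 1.22 * wavelength_m * z1_m / aperture_d_m

# With the parameters in the text (500 nm, z1 = 70 mm, 0.05-0.1 mm aperture):
# coherence_diameter(500e-9, 70e-3, 0.1e-3)  -> ~0.43 mm
# coherence_diameter(500e-9, 70e-3, 0.05e-3) -> ~0.85 mm
```

Both values fall in the ~0.5-1 mm range stated above, consistent with the ability to record inline holograms of individual micro-objects.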
At every illumination angle, a series of sub-pixel shifted holograms are recorded for implementing pixel super-resolution (operation 1400 of
Because most digital sensor arrays are designed to operate in lens-based imaging systems where the angle of incident rays measured from the sensor surface normal does not exceed 20°-30°, the waves incident with large k-vectors are sampled with increased artifacts and reduced SNR. Therefore, even though the detection NA of the system can reach the refractive index of the medium owing to the short z2, it has been observed that the reconstructed projection images for angles above ±50° exhibit artifacts and including these projections for tomographic reconstruction can deteriorate the final image quality rather than improving it. Consequently, projections are acquired only within a tilt range of ±50°, with 2° angular increments.
In order to reduce the artifacts of limited angle tomography, the dual-axis tomography scheme was used. Accordingly, after the completion of recording the projections along one axis, the sensor, with the sample mounted on it, is rotated 90° using a computer controlled rotating stage to record a second set of projections along the orthogonal direction. A custom-developed LabVIEW interface is used to automate the data acquisition process and a total of 918 wide FOV lens-free holograms are recorded. Acquiring a set of 459 projections along one axis takes ~5 min with a frame rate of ~4 fps, which can be significantly improved by using a faster frame rate sensor.
To characterize the lens-free tomographic system, a series of experiments using microbeads of different dimensions was conducted.
Although the results of
To further investigate the imaging properties of the tomographic microscope, 2 μm diameter beads distributed in an optical adhesive were imaged.
In addition to enabling 3D imaging of objects over a wide FOV, owing to its lens-free unit-magnification geometry, the platform also enjoys a significantly extended DOF compared to imaging systems where conventional microscope objectives are used. To demonstrate the large DOF, a multilayer chamber composed of 10 μm beads, which has four layers stacked with ˜1 mm separation (i.e., a total thickness of 3.3 mm), was imaged. The chamber is then elevated above the sensor active area, and the furthest layer is situated ˜4 mm away from the sensor chip. With an illumination angle spanning ±50° in air, the entire tomographic data corresponding to a volume of 14 mm2×3.3 mm is acquired over ˜10 minutes using dual-axis scanning. Once this raw data is acquired (which includes nine sub-pixel shifted holograms at each illumination angle), separate tomograms for each depth layer are computed. These tomograms are then digitally combined into a single volumetric image, which now has a DOF of ˜4 mm. A holographically recorded set of projections, one of which is illustrated in
One important challenge for tomographic reconstruction over such a large DOF is the implementation of pixel super-resolution at each illumination angle. Since the raw holograms of particles/objects located at considerably separated depths shift by different amounts, blind realization of pixel super-resolution will create errors for at least some of the overlapping particle holograms if their holograms overlap at the detector plane. To mitigate this challenge, the raw holograms of different layers were filtered from each other such that pixel super-resolution can be separately applied to lens-free holograms of different depth layers. Computing the super-resolved holographic projections for axially overlapping objects in thick samples requires additional digital processing because the holograms of objects with an axial separation >200-300 μm shift by significantly different amounts over the sensor-chip. As a result, the raw holograms obtained by shifting the light source are essentially different two-dimensional functions rather than translated versions of the same 2D raw hologram, which is a requirement that must be met for the pixel super-resolution technique. Consequently, a single super-resolved projection hologram at a given illumination angle cannot be calculated for the entire sample depth. Instead, separate super-resolved holograms are calculated for each depth layer. To achieve this, the measured holographic projections such as the measured hologram of
In order to demonstrate the performance of the lens-free tomographic microscope for applications in life sciences, a wild-type C. elegans worm at the L4 stage (˜650 μm in length) was imaged in deionized water. The worm was temporarily immobilized with 4 mM levamisole (Sigma Aldrich L9756) solution to avoid undesired motion during the imaging process. Because the worm was aligned parallel to the y-axis during data acquisition, only the projections obtained by tilts along the x-axis were utilized to compute the tomograms of the worm, which took ˜4 min using a single GPU.
The lens-free tomographic imaging system provides a unique microscopy modality that can probe a wide FOV of ˜14 mm2 and a long DOF of ˜4 mm at a lateral resolution of <1 μm and an axial resolution of ˜2.5-3 μm. These results suggest a resolving power that is comparable to a standard 20× objective lens (NA˜0.4, FOV<1 mm2) but over >104 times larger imaging volume. This makes the platform especially suitable for high-throughput imaging and screening applications such as 3D model animal imaging. Also note that the imaging volume can be increased to ˜30 mm3 by utilizing projections acquired with a single-axis data set, at the cost of a lower axial resolution of ˜3-3.5 μm.
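The >10^4 imaging-volume comparison above can be checked to order of magnitude. The objective's depth of field is estimated here with the textbook λ/NA² scaling, which is an assumption not stated in the text:

```python
# Order-of-magnitude check of the >10^4 imaging-volume comparison.
lensfree_volume = 14.0 * 4.0            # mm^3: ~14 mm^2 FOV x ~4 mm DOF
lam_mm = 0.5e-3                         # illumination wavelength in mm
objective_dof = lam_mm / 0.4**2         # mm, textbook DOF ~ lambda/NA^2 (assumed)
objective_volume = 1.0 * objective_dof  # mm^3, taking the FOV < 1 mm^2 bound as 1 mm^2
ratio = lensfree_volume / objective_volume
print(f"volume ratio ~ {ratio:.0f}")    # comfortably above 10^4
```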
There are several unique aspects of the lens-free incoherent holography scheme that enable achieving on-chip tomographic imaging over such a wide FOV and an extended DOF. For instance, choosing a large z1/z2 ratio of ˜20-100 allows holographic imaging with unit magnification, which brings the large FOV to this imaging modality. The limited hologram resolution dictated by this unit-magnification and the pixel-size at the sensor-chip is balanced by a pixel super-resolution approach, which increases the lateral numerical aperture up to 0.4-0.5 without a trade off in imaging FOV. The same large z1/z2 ratio also permits the use of unusually large illumination apertures (e.g., >50 μm), which significantly simplifies the illumination end without the need for any light-coupling optics, a sensitive alignment or a trade-off in achievable resolution. As a result, projections are easily acquired by tilting the light source rather than having to rotate the object which would unnecessarily complicate the setup, and perturb the sample. Moreover, the simplicity of the optics and the alignment-free structure of the lens-free setup also permit straightforward implementation of dual-axis tomography, since either the tilt-axis of the light source or the sensor (with the sample mounted on it) can be rotated 90° to acquire projections along two orthogonal directions.
Another unique aspect of the lens-free tomography scheme is the use of partially coherent light, both temporally and spatially. The spectral width of the illumination is ˜10 nm with a center wavelength of ˜500 nm, which limits the coherence length to be <10 μm. This relatively short coherence length does not impose any limitations for the technique and in fact, it significantly reduces two major sources of noise, i.e., the speckle and multiple-reflection interference noise terms. The latter one would especially have been a nuisance under laser illumination at oblique angles. In addition, such a limited coherence length also partially eliminates the cross-talk of different depths with each other. Such cross-interference terms are undesired and in fact are entirely ignored in any holographic reconstruction scheme. The same cross-interference also occurs within a given depth layer. In other words, scattering centers within the sample volume actually interfere with each other at the detector plane, which once again is a source of artifact as far as holographic reconstruction (e.g., twin-image elimination) is concerned. The limited spatial coherence also helps us to mitigate this issue by choosing a spatial coherence diameter (e.g., <0.5-1 mm) that is sufficiently large to record individual holograms of the objects, and yet that is significantly smaller than the entire imaging FOV. This spatial coherence diameter is rather straightforward to engineer in this geometry by changing the illumination aperture (e.g., 0.05-0.1 mm) as well as by changing the distance between the source aperture and the sample volume.
In this experiment, the embodiment of
Multi-angle illumination for tomographic imaging would not be feasible with conventional optofluidic microscopy architectures because at higher illumination angles the projection images of different cross-sections of the same object would start to lose resolution due to increased distance and diffraction between the object and the aperture/sensor planes. In this optofluidic tomography platform, at each illumination angle (spanning e.g., θ=−50°:+50°) several projection holograms (i.e., 15 frames) are recorded while the sample flows rigidly above the sensor array. These lower-resolution (LR) lens-free holograms are then digitally synthesized into a single super-resolved (SR) hologram by using pixel super-resolution techniques to achieve a lateral resolution of <1 μm for each projection hologram corresponding to a given illumination direction. These SR projection holograms are digitally reconstructed to obtain complex projection images of the same object, which can then be back-projected using a filtered back-projection algorithm to compute tomograms of the objects.
An experiment was conducted where a wild-type C. elegans worm was sequentially imaged during its flow within a microfluidic channel at various illumination angles spanning θ=−50°:+50° in discrete increments of 2°. The design of the CMOS sensor-chip utilized in these experiments ultimately limits the maximum useful angle of illumination. Most digital sensors are designed to work in lens-based imaging systems, and therefore holograms recorded at illumination angles larger than ±50° exhibit artifacts. For this reason, the angular range was limited to ±50°. For each illumination angle, ˜15 holographic frames were captured of the flowing object (in <3 seconds), resulting in a total imaging time of ˜2.5 minutes per tomogram under the electro-kinetic flow condition. These illumination angles are automatically created by a computer-controlled rotation stage holding the light source, and they define rotation of the source within the x-z plane with respect to the detector array, which is located at the x-y plane. Some exemplary LR holograms recorded with this set-up are illustrated in
To obtain complex projection images of the sample through digital holographic reconstruction, the synthesized SR holograms are digitally multiplied with a tilted reference wave. The tilt angle of this reconstruction wave is not equal to the tilt of the light source, because of the refraction of light in the microfluidic chamber. Instead, the digital reconstruction angle (θ) for the projection holograms is determined by calculating the inverse tangent of the ratio Δd/z2, where Δd denotes the lateral shifts of the holograms of objects with respect to their positions in the vertical projection image, and z2 can be either experimentally known, or determined by the digital reconstruction distance of the vertical projection hologram. It should be noted that despite the use of tilted illumination angles, the recorded holograms are still in-line holograms since the reference wave and the object wave propagate co-axially. As a result, an iterative phase recovery algorithm based on an object-support constraint is utilized to reconstruct the complex field transmitted through the object. Throughout these iterations, the optical field is propagated back and forth between the parallel hologram and object planes. Once the iterations converge, the projection of the complex field in the plane normal to the illumination angle is obtained by interpolating the recovered field on a grid whose dimension along the tilt direction is rescaled by cos(θ). Exemplary reconstructions are shown in
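The reconstruction-angle relation described above (the inverse tangent of Δd/z2) can be sketched as follows; the numbers in the usage example are illustrative, not measured values from the experiment:

```python
import math

def reconstruction_angle_deg(delta_d_mm, z2_mm):
    """Digital reconstruction angle from the lateral hologram shift
    (delta_d) relative to the vertical-illumination position, and the
    object-to-sensor distance z2."""
    return math.degrees(math.atan2(delta_d_mm, z2_mm))

# Illustrative (assumed) numbers: a 1.0 mm hologram shift at z2 = 2.0 mm.
print(f"{reconstruction_angle_deg(1.0, 2.0):.1f} deg")  # -> 26.6 deg
```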
For weakly scattering objects, the complex field obtained through digital holographic reconstruction (as shown in
Due to the limited angular range of holograms that can be recorded, there is a missing region in the Fourier space of the object, commonly known as the “missing wedge”. The most significant effect of the missing wedge is the elongation of the PSF in the axial direction, which limits the axial resolution to a value larger than the lateral, which is estimated to be ˜3 μm in this case. Reduction of such artifacts can be achieved by implementing iterative constraint algorithms either based on the 3D support of the object or by utilizing a priori information about the transmission function of the object, which enables iteratively filling the missing region in the 3D Fourier space of the object function.
Embodiment three relates to a field-portable lens-free tomographic microscope that can achieve depth sectioning of objects on a chip. This compact lens-free optical tomographic microscope, weighing only ˜110 grams, is based on partially-coherent digital in-line holography and can achieve an axial resolution of <7 μm over a large FOV of ˜20 mm2 and a depth-of-field (DOF) of ˜1 mm, probing a large sample volume of ˜20 mm3 on a chip. By extending the DOF to ˜4 mm, the imaging volume can also be increased to ˜80 mm3 at the cost of reduced spatial resolution.
In this field-portable lens-free tomographic platform, the major factors that enable a significantly enhanced 3D spatial resolution are: (i) to record multiple digital in-line holograms of objects with varying illumination angles for tomographic imaging; and (ii) to implement pixel super-resolution to significantly increase the lateral resolution of lens-free holograms at each viewing angle. For implementation of this tomographic on-chip microscope, twenty four (24) light-emitting diodes (LEDs—each with a cost of <0.3 USD) that are individually butt-coupled to an array of fiber-optic waveguides tiled along an arc as illustrated in
In order to record lens-free projection holograms from multiple angles, the LEDs are sequentially and automatically turned on/off by a low-cost micro-controller (Atmel ATmega8515, ˜3 USD/per piece). A digital sensor array (Aptina MT9P031STC, 5 Megapixels, 2.2 μm pixel size), which is placed z1=˜60 mm away from the fiber-ends records the lens-free projection holograms of the objects that are loaded (with z2<5 mm distance to the active area of the sensor-chip) through a sample tray inserted from one side of the lens-free microscope (see
Despite the fact that the large z1/z2 ratio in the hologram recording geometry permits recording of holograms at angles close to ±90°, the design of the digital sensor array itself restricts the actual range of illumination angles that can be used in the tomographic microscope. Most digital sensor arrays are designed for imaging systems that use lenses as imaging elements, as a result of which the angle of incident rays measured from the sensor surface normal is typically less than 20°-30°. As a result, the sensitivity of these opto-electronic sensors, by design, rapidly drops for incidence angles that are larger than 50°, and aberrations become significant. Therefore, even though the hologram recording geometry permits the use of higher angles (e.g. 70°-80°), the angular range of illumination is limited to ±50° for this particular tomographic microscopy set-up.
As described earlier, the optical fibers that are used for multi-angle illumination are connected to a common arc-shaped lightweight bridge (˜1.7 grams), which moves together with all the fibers when actuated by electromagnetic forces. The other ends of these fiber-optic cables are mechanically fixed and are butt-coupled to individually addressed LEDs. Therefore, the entire structure can be modeled as a spring-mass system, where all the fibers collectively act as a spring, and the bridge piece is the mass load.
There are several critical specifications that need to be taken into account for the design of this structure: (1) to keep the form factor of the instrument small, the overall architecture of the actuator should be as compact as possible; (2) the structure should be stiff enough to stay rigid by itself such that small external perturbations do not randomly move the fiber tips during image acquisition, which would otherwise cause blurring of the recorded holograms; (3) the natural mechanical resonant frequency of the lowest vibrational mode of the structure should be as high as possible such that the structure does not move due to coupling of external vibrations, which also helps the fiber ends to reach the steady-state displacement rapidly without swinging for a long duration; and (4) sufficient actuation should be achieved with reasonable current and voltage values that can be supplied using standard batteries for field use. While (1), (2) and (3) can be achieved by keeping the fibers short, which makes the structure compact and stiff (also increasing the resonant frequencies), this would unfortunately demand a significant increase in the required electromagnetic force, and thereby would result in high electrical power consumption.
To better analyze this mechanical system, we assume a simple model where each fiber-optic waveguide acts as a cantilever beam with a cylindrical cross-section such that the stiffness (k) of the structure can be written as:
k=3πEr⁴/(4L³) (1)
where E is the Young's modulus of the silica fiber (E=72 GPa), r is the radius of the fiber (r=˜62.5 μm) and L is the length of the fibers. In this lens-free tomographic microscope design, a fiber length of L=14 mm was chosen, which is the distance between the plastic bridge and the fixed-end of the fibers. Assuming that these fibers act as parallel springs forming a lumped system of N=24 fibers, one can calculate the mechanical resonant frequency of the structure as:
f0=(1/2π)·√(N·k/m) (2)
Equation (2) yields an expected value of f0˜24 Hz when a measured mass of m=1.7 grams is used for the plastic bridge and the two magnets. According to this calculation, the time to reach the steady-state displacement for the fibers once a force is applied can be estimated as ˜300 ms assuming a quality factor of e.g., ˜45. The actual settlement time of the fibers is short, supporting these calculations. Furthermore, during the experiments no undesired swinging of the fiber-array was observed due to external perturbations, and the entire structure is quite robust and sturdy, making it suitable for field use.
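The lumped spring-mass estimate can be evaluated numerically with the stated parameters. This sketch uses the textbook fixed-free cantilever stiffness, k = 3πEr⁴/(4L³), which lands near ˜18 Hz, the same order as the quoted ˜24 Hz; the simple model ignores the distributed fiber mass and the exact boundary conditions at the bridge, so some deviation is expected:

```python
import math

# Lumped spring-mass estimate: each fiber modeled as a fixed-free
# cylindrical cantilever (textbook assumption), with the N fibers
# acting as parallel springs loading the bridge mass m.
E = 72e9       # Pa, Young's modulus of silica
r = 62.5e-6    # m, fiber radius
L = 14e-3      # m, free fiber length
N = 24         # number of fibers
m = 1.7e-3     # kg, mass of the bridge and two magnets

k = 3 * math.pi * E * r**4 / (4 * L**3)     # per-fiber stiffness, ~0.94 N/m
f0 = math.sqrt(N * k / m) / (2 * math.pi)   # resonant frequency, ~18 Hz
print(f"k ~ {k:.2f} N/m per fiber, f0 ~ {f0:.0f} Hz")
```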
To achieve electromagnetic actuation of the illumination fibers, two Neodymium magnets were mounted at each end of the plastic bridge. One of these magnets is aligned such that, when a DC current is applied to the coil mounted across it with ˜1-2 mm distance, the electromagnetic force moves the fibers along the direction of the arc. The other magnet is placed to generate an orthogonal displacement when its corresponding coil is operated. Therefore, displacements of the fiber-ends in both x and y directions can be achieved to generate super-resolved projection holograms of the samples. These coils are placed such that their cylindrical axes are aligned with the magnetization vector of the magnets. In this configuration, the force generated on the magnets (Fmag) can be calculated as:
Fmag=S·M·(Hz1−Hz2)=S·M·ΔHz (3)
where S is the cylindrical cross-sectional area (in units of m2) of the magnet, M is the magnetization (in Tesla), Hz1 and Hz2 (in A/m) are the axial components of the magnetic field intensity at the top and bottom of the magnet, respectively. As Equation (3) suggests, the generated force is directly proportional to the magnetic field difference, ΔHz, across the two ends of the magnet, and it can be used to pull or push the magnet along the cylindrical axis depending on the polarity of the applied current.
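Equation (3) can be evaluated as follows; all of the numerical values below are assumed for illustration and are not given in the text:

```python
# Evaluating Equation (3), Fmag = S * M * (Hz1 - Hz2) = S * M * dHz,
# with illustrative (assumed) values.
S = 3.14159e-6   # m^2, magnet cross-sectional area (~1 mm radius, assumed)
M = 1.2          # T, magnetization typical of Neodymium magnets (assumed)
dHz = 1e4        # A/m, field-intensity difference across the magnet (assumed)
F = S * M * dHz  # N; the current polarity sets whether the magnet is pushed or pulled
print(f"F ~ {F * 1e3:.1f} mN")
```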
As illustrated in
In order to perform pixel super-resolution (SR) for enhancing the spatial resolution at each illumination angle, the fiber-optic waveguide ends are mechanically displaced by small amounts (<500 μm) through electromagnetic actuation. In this scheme, the fibers are connected to a common bridge (radius: 3.1 mm, length: 6.2 mm) with low-cost Neodymium magnets attached on both ends. Compact circular electro-coils (radius: 5 mm, height: 5 mm) are mounted inside the plastic housing, which are used to electromagnetically actuate the magnets, resulting in simultaneous shift of all the fibers along both the x and y directions.
The exact amounts of displacement for these fiber-ends do not need to be known beforehand or even be repeatable or accurately controlled. As a matter of fact, the individual displacement of each fiber-end can be digitally calculated using the acquired lens-free hologram sequence. Once the fibers are shifted to a new position by driving the coils with a DC current, a new set of lens-free projection holograms are recorded, each of which is slightly shifted in 2D with respect to the sensor array. A maximum current of 80 mA is required for the largest fiber displacement (i.e., <500 μm), with ˜4 volts of potential difference applied across the electro-coil (50Ω). Standard alkaline batteries (with a capacity of e.g., 3000 mAh) could be used to actuate the fibers without the need for replacement for at least several days of continuous use of the tomographic microscope.
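The battery-life claim can be checked from the quoted figures: continuous worst-case drive alone would last about a day and a half, and since the coils only need to be driven during fiber shifts, several days of intermittent use is plausible:

```python
# Battery-life check using the figures quoted above (80 mA peak coil
# current, ~4 V across the 50-ohm coil, 3000 mAh alkaline capacity).
capacity_mAh = 3000
peak_current_mA = 80
hours_at_peak = capacity_mAh / peak_current_mA
print(hours_at_peak)   # -> 37.5 hours of continuous worst-case drive
```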
With the above described set-up, 10-15 projection holograms are recorded at each illumination angle to digitally synthesize one SR hologram for a given illumination angle.
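A minimal shift-and-add interpretation of the pixel super-resolution step is sketched below. This is a simplified illustration, not the exact synthesis algorithm used in the system (which may be iterative and cost-function based), and the estimation of the sub-pixel shifts themselves is omitted:

```python
import numpy as np

def shift_and_add_sr(frames, shifts, factor):
    """Shift-and-add pixel super-resolution sketch: place each
    low-resolution frame onto a grid that is 'factor' times finer, at
    its estimated sub-pixel (dy, dx) shift, and average the samples.
    frames: list of (H, W) arrays; shifts: (dy, dx) in LR pixel units."""
    H, W = frames[0].shape
    acc = np.zeros((H * factor, W * factor))
    cnt = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, shifts):
        # Offset of this frame's samples on the fine grid.
        oy = int(round(dy * factor)) % factor
        ox = int(round(dx * factor)) % factor
        acc[oy::factor, ox::factor] += frame
        cnt[oy::factor, ox::factor] += 1
    # Average where samples landed; empty fine-grid sites stay zero.
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)
```

With shifts covering all sub-pixel offsets, every site of the fine grid receives at least one measured sample.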
The shifted holograms recorded at each illumination angle are digitally processed to synthesize projection holograms with higher spatial resolution. This is illustrated as operation 1400 in
Once the projection images at each illumination angle are calculated, they need to be registered with respect to a common center-of-rotation before computing the tomograms (see e.g.,
The filtered back-projection algorithm (described in more detail in the Radermacher M. publication incorporated herein by reference) is utilized to compute tomograms of the objects from their lens-free projection images. A fundamental requirement for the validity of this approach is that the projection images should represent a linear summation of a property of the object for which tomograms are being computed (e.g. phase, absorption, scattering strength, etc.). This is generally satisfied by weakly scattering objects in which case the majority of the incident photons experience at most a single scattering event over the volume of the object.
Assume that a weakly scattering object is represented by a complex scattering function s(xθ,yθ,zθ), which satisfies |s(xθ,yθ,zθ)|<<1 where (xθ,yθ,zθ) defines a coordinate system whose z-axis is aligned with the direction of illumination angle at a particular projection angle. In this case, the contribution of cross-interference terms to the hologram will be negligible in comparison to the actual holographic heterodyne terms. This assumption is further validated by the low spatial coherence (which minimizes cross-talk between objects with lateral separation larger than coherence diameter) and low temporal coherence (which minimizes the cross-talk between different layers with separation longer than coherence length) of the system, acting as a 3D coherence filter. As a result, for each projection image within a single tomogram volume (spanning e.g., Δz˜±25 μm), the holographically reconstructed image contrast will yield the linear summation of the scattering strength function given by: ∫|s(xθ,yθ,zθ)|·dzθ. This conclusion is further justified by the fact that, regardless of their detection numerical apertures, digital in-line holography schemes in general have a very long depth of focus as a result of which the scattering coefficients along a given zθ direction can be approximated to add up linearly after appropriate twin-image elimination steps. Consequently, tomograms of scattering strength of an object can be computed by applying a filtered back-projection algorithm whose inputs are the projection images calculated by holographic reconstruction of pixel super-resolved lens-free holograms acquired at various illumination angles.
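The filtered back-projection step can be sketched as follows. This is a minimal parallel-beam implementation with a Ram-Lak (ramp) filter, not the dual-axis weighted scheme of the cited Radermacher publication, and it reconstructs a single 2D slice from line projections of the scattering strength:

```python
import numpy as np

def filtered_back_projection(sinogram, angles_deg):
    """Minimal parallel-beam filtered back-projection (Ram-Lak filter).
    sinogram: (n_angles, n_det) array of line projections.
    Returns an (n_det, n_det) reconstructed slice."""
    n_angles, n_det = sinogram.shape
    # 1) Filter each projection with a ramp (|f|) filter in Fourier space.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # 2) Smear each filtered projection back across the image grid.
    mid = (n_det - 1) / 2.0
    xs = np.arange(n_det) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + mid  # detector coordinate
        t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = np.clip(t - t0, 0.0, 1.0)                    # linear interpolation weight
        recon += (1.0 - w) * proj[t0] + w * proj[t0 + 1]
    return recon * np.pi / (2.0 * n_angles)
```

A point object (a delta at the center of every projection) reconstructs to a peak at the center of the slice, which is a standard sanity check for this algorithm.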
To validate the performance of the field-portable lens-free tomographic microscope, micro-beads of different dimensions as well as a Hymenolepis nana egg, which is an infectious parasitic flatworm, were imaged. Without utilizing lenses, lasers or other costly opto-mechanical components, the presented lens-free tomographic microscope offers sectional imaging with an axial resolution of <7 μm, while also implementing pixel super-resolution that can increase the NA of each projection image up to ˜0.3-0.4, over a large imaging volume of ˜20 mm3. Furthermore, this volume can also be extended up to ˜80 mm3 (corresponding to a DOF of ˜4 mm) at the cost of reduced spatial resolution. Offering good spatial resolution over such a large imaging volume, this compact, light-weight (˜110 grams) and cost-effective lens-free tomographic microscope could provide a valuable tool for telemedicine and high-throughput imaging applications in remote locations.
Next the reconstructed depth (z) profiles were investigated corresponding to the LR and SR holograms shown in
To mitigate this fundamental axial resolution limitation, lens-free SR holograms synthesized for ˜20 illumination angles spanning a range of ±50° were used to create a tomogram of the same micro-particle, as illustrated in
To further demonstrate the depth sectioning capability of the field-portable lens-free tomographic microscope, 5 μm diameter spherical micro-beads (refractive index ˜1.68, Corpuscular Inc.), randomly distributed within a ˜50 μm thick chamber filled with an optical adhesive (refractive index ˜1.52, Norland NOA65), were imaged.
The lens-free hologram recording geometry shown in
To specifically demonstrate this capability, a multilayer chamber (four layers stacked together with ˜1 mm separation in between, i.e., a total thickness of ˜3.5 mm) was imaged that was composed of 10 μm beads embedded in an optical adhesive. This thick object is placed at ˜0.7 mm away from the active area of the sensor-chip with its furthest layer situated at z˜4.2 mm from the sensor plane.
As stated above, for a chamber where the objects are distributed within a height of e.g., <200-300 μm, the holograms of all the objects shift almost equally for a given source shift. Therefore, a single SR hologram satisfying the measured data in all the sub-pixel shifted holograms can be synthesized. For thick or multilayer chambers, however, the lens-free holograms of objects that are axially separated by >200-300 μm shift considerably different amounts, and the recorded holograms for different source shifts look different. As a result, a single SR hologram to satisfy all shifted holograms cannot be calculated. To solve this issue, new holograms with the information of only the desired layers can be obtained by digitally erasing the undesired layers from the hologram intensity. To achieve this, the lens-free hologram for a thick (or multilayer) chamber as in
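The layer-erasure procedure described above can be sketched as follows. This is a simplified illustration assuming angular-spectrum free-space propagation and a known object-support mask for the layer being erased; twin-image handling and the system's exact filtering steps are omitted:

```python
import numpy as np

def angular_spectrum(field, dz, wavelength, dx):
    """Free-space propagation of a square complex field by dz (meters)
    using the angular spectrum method; dx is the pixel pitch."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    # Evanescent components (arg <= 0) are simply discarded.
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz) * (arg > 0))

def erase_layer(hologram, z_layer, wavelength, dx, object_mask):
    """Back-propagate the hologram to one depth layer, keep only the
    in-focus objects via a support mask, propagate them forward again,
    and subtract their contribution from the recorded hologram."""
    field = angular_spectrum(hologram.astype(complex), -z_layer, wavelength, dx)
    back = angular_spectrum(field * object_mask, z_layer, wavelength, dx)
    return hologram - np.real(back)
```

Applying `erase_layer` once per undesired layer leaves a hologram containing, approximately, only the desired layer's interference terms, to which pixel super-resolution can then be applied.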
In order to validate the performance of the field-portable lens-free tomographic microscope for potential applications in bio-medicine, a Hymenolepis nana (H. nana) egg was imaged. The H. nana egg is an infectious parasitic flatworm of humans having an approximately spherical structure with ˜40 μm diameter. Due to the long depth-of-focus of lens-free in-line holography, optical sectioning of this egg is not possible by merely reconstructing its recorded hologram at any given illumination angle. However, as demonstrated in
While embodiments have been shown and described, various modifications may be made without departing from the scope of the inventive concepts disclosed herein. The invention(s), therefore, should not be limited, except to the following claims, and their equivalents.
This application claims priority to U.S. Provisional Patent Application No. 61/430,465 filed on Jan. 6, 2011 and U.S. Provisional Patent Application No. 61/486,685 filed on May 16, 2011. Priority is claimed pursuant to 35 U.S.C. §119. The above-noted patent applications are incorporated by reference as if set forth fully herein.
Provisional applications to which priority is claimed:

Number | Date | Country
---|---|---
61430465 | Jan 2011 | US
61486685 | May 2011 | US

Related U.S. application data:

Relation | Number | Date | Country
---|---|---|---
Parent | 13976197 | Jun 2013 | US
Child | 15432611 | | US