The present disclosure is directed to quantitative phase imaging (QPI), and more particularly, to methods and systems for performing multiple (multi)-wavelength QPI that are well suited for imaging biological cells and other applications, such as lens testing and atmospheric correction of images captured by cell phone cameras, for example.
Quantitative phase imaging (QPI) techniques for mapping phase distribution continue to expand, offering information about the complex optical field (amplitude and phase) in the image plane. Shifts in optical path length of a sample are mapped to form the measured image, which contains information about the thickness and refractive index of the sample. QPI techniques allow for a variety of applications, particularly in biology, including imaging of red blood cells, optical properties of tissues, label-free in vitro biological sample imaging, and cell refractive index.
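For context, the well-known thin-sample relation that connects these quantities (a textbook model, not a formula specific to this disclosure) is

\[ \phi_\lambda(x,y) = \frac{2\pi}{\lambda}\,\big[n_s(x,y,\lambda) - n_m(\lambda)\big]\,t(x,y), \]

where \(\phi_\lambda\) is the measured phase at wavelength \(\lambda\), \(n_s\) is the refractive index of the sample, \(n_m\) is the refractive index of the surrounding medium, and \(t\) is the local sample thickness. This is why a single phase map carries coupled information about both thickness and refractive index.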
Measurement of the multi-spectral refractive index of cells is a growing topic of interest in the biological and medical fields. In biological cells and fluids, the refractive index provides a characteristic marker of individual materials, governing how the tissue interacts with electromagnetic radiation. Direct measurement of the refractive index has the potential to differentiate healthy from pathologic tissue, leading to novel methods for cancer detection and diagnosis. The refractive index marker has been used for tracking the development of diabetes by examining the glycation process and glycated proteins. Additionally, other optical diagnostic technologies make use of spectral light sources to analyze tissue properties in the visible and near-infrared (NIR) regions.
Typical quantitative techniques that can recover the phase delay caused by the sample make use of coherent illumination interferometry, creating expensive and highly sensitive systems. Most QPI techniques require the use of multiple images to retrieve complex optical field information. A common example of this method is Differential Phase Contrast (DPC). In this partially coherent QPI technique, the amplitude and phase are recovered from the Fourier asymmetry and a single-step deconvolution.
Several of the QPI techniques recently developed are single-shot methods, requiring only one image taken by an RGB camera to capture the white-light illuminated sample. An early example of a single-shot QPI method is Instantaneous Spatial Light Interference Microscopy (iSLIM), which allows for spatially and temporally sensitive optical path length measurements, provides phase dispersion imaging, and multiplexes with fluorescence imaging. These QPI systems typically require modifications to the internal portions of the microscope or require modification and/or addition of hardware elements in an auxiliary image path.
A need exists for a multi-wavelength QPI system that has a relatively simple configuration and that can be employed with microscopes without having to modify the internal hardware of the microscope.
The present disclosure describes QPI systems and methods. In accordance with an embodiment, a QPI system is provided for imaging a biological sample. The QPI system of this embodiment comprises an optics system, a sample holder, a multi-wavelength light source, an optically transmissive substrate, a lens, a multi-wavelength image sensor array, and a processor. The optics system has at least first and second optical ports. The sample holder is positioned in a preselected position and orientation relative to the first optical port and has a biological sample disposed thereon. The multi-wavelength light source is configured to emit multiple wavelengths of light and is positioned relative to the first optical port to ensure that at least a portion of the emitted light of multiple wavelengths is transmitted through the biological sample disposed on the sample holder. Light transmitted through the biological sample is directed by the optics system out of the second optical port. The optically transmissive substrate is positioned relative to the second optical port to ensure that light passing out of the second optical port is incident on a structure formed in the optically transmissive substrate. The structure comprises at least one of a computer-generated hologram (CGH) and a diffractive grating and generates at least one respective phasorgram image for each respective wavelength of light. The lens is positioned in a preselected position relative to the optically transmissive substrate. The optically transmissive substrate, the lens and the multi-wavelength image sensor array are positioned relative to one another to ensure that the phasorgram images are directed by the lens onto the multi-wavelength image sensor array, which converts the phasorgram images into electrical signals representing the phasorgram images. The processor is configured to perform a reconstruction algorithm that processes the electrical signals representing the phasorgram images to obtain a phase distribution estimate of the light transmitted through the biological sample.
In accordance with an embodiment, the optics system is an optics system of a microscope.
In accordance with an embodiment, at least the multi-wavelength light source and the sample holder are coupled together in a first device that is adapted to be attached to the microscope in alignment with the first optical port.
In accordance with an embodiment, the optically transmissive substrate and the lens are coupled together in a second device that is adapted to be attached to the microscope in alignment with the second optical port.
In accordance with an embodiment, the optically transmissive substrate, the lens and the multi-wavelength image sensor array are coupled together in a second device that is adapted to be attached to the microscope in alignment with the second optical port.
In accordance with an embodiment, the multi-wavelength image sensor array is inside of a red-green-blue (RGB) camera.
In accordance with an embodiment, the reconstruction algorithm comprises an iterative Fourier algorithm with field averaging.
In accordance with an embodiment, the reconstruction algorithm is performed for each of the wavelengths of light on the respective phasorgram images associated with the respective wavelengths of light, and each reconstruction algorithm comprises the following steps (a minimal illustrative code sketch of these steps is provided after the list):
dividing one of the phasorgram images into N phasorgram subareas, where N is a positive integer that is greater than or equal to one;
multiplying a complex transmission of N effective filters by an initial estimate of an extrinsic phase distribution to obtain N pupil fields;
performing a Fourier transformation algorithm on the N pupil fields to generate N complex fields on an image plane;
applying an amplitude constraint that replaces N calculated amplitudes of the N complex fields with N amplitudes of the N phasorgram subareas, respectively, to generate N amplitude-constrained complex fields;
performing an inverse Fourier transformation on the N amplitude-constrained complex fields to transform the N amplitude-constrained complex fields into N complex pupil fields;
dividing the N complex pupil fields by the N effective filters to obtain N estimates, respectively, of an extrinsic phase distribution associated with the light transmitted through the biological sample;
averaging at least a subset of the N estimates to obtain an estimate for a current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the biological sample; and
reiterating the steps of the reconstruction algorithm until convergence of the reconstruction algorithm occurs, wherein a final estimate of the extrinsic phase distribution associated with the light transmitted through the biological sample is obtained when convergence is reached.
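Purely as an illustration of how the steps listed above could be organized for a single wavelength channel, the following is a minimal Python/NumPy sketch. It is not the disclosed implementation: the function and variable names (reconstruct_phase, effective_filters, phasorgram_subareas), the FFT sign conventions, the assumption that each measured subarea is embedded at its image-plane position in a full-size array, the fixed iteration count used in place of an explicit convergence test, and the default 10% amplitude threshold applied when dividing by the effective filters are all assumptions of the sketch.

```python
import numpy as np

def reconstruct_phase(phasorgram_subareas, effective_filters,
                      num_iterations=100, amp_threshold=0.1):
    """Iterative Fourier reconstruction with field averaging for one
    wavelength channel.

    phasorgram_subareas : (N, H, W) real array of measured amplitudes, each
        subarea embedded at its position in a full-size image-plane array.
    effective_filters   : (N, H, W) complex array of effective pupil filters
        extracted from the CGH design.
    """
    # Initial estimate of the extrinsic pupil field: unit amplitude, flat phase.
    estimate = np.ones(effective_filters.shape[1:], dtype=complex)

    # Exclude regions where the filter amplitude is small (here below 10% of
    # the maximum) to avoid noise amplification when dividing by the filters.
    mask = np.abs(effective_filters) > amp_threshold * np.abs(effective_filters).max()

    for _ in range(num_iterations):
        # Multiply the N effective filters by the current estimate -> N pupil fields.
        pupil_fields = effective_filters * estimate[np.newaxis, :, :]
        # Fourier transform to the image plane -> N complex image fields.
        image_fields = np.fft.fft2(pupil_fields, axes=(-2, -1))
        # Amplitude constraint: keep the calculated phases, impose measured amplitudes.
        image_fields = phasorgram_subareas * np.exp(1j * np.angle(image_fields))
        # Inverse Fourier transform back to the pupil plane -> N complex pupil fields.
        pupil_fields = np.fft.ifft2(image_fields, axes=(-2, -1))
        # Divide by the effective filters to form N estimates of the extrinsic field.
        estimates = np.where(mask, pupil_fields / np.where(mask, effective_filters, 1), 0)
        # Average the valid estimates at each pixel to form this iteration's estimate.
        counts = np.maximum(mask.sum(axis=0), 1)
        estimate = estimates.sum(axis=0) / counts

    return np.angle(estimate)  # final extrinsic phase distribution estimate
```

In a multi-wavelength system, one such reconstruction would run per wavelength channel on the phasorgram subareas recorded in the corresponding color channel of the sensor.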
In accordance with an embodiment, the reconstruction algorithm further comprises:
prior to dividing one of the phasorgram images into N phasorgram subareas, performing a spectral filtering algorithm that uses the phasorgram images associated with the other wavelengths of light to obtain a spectrally-filtered phasorgram image, and wherein the step of dividing one of the phasorgram images into N phasorgram subareas comprises dividing the spectrally-filtered phasorgram image into N phasorgram subareas.
In accordance with an embodiment, the reconstruction algorithm further comprises:
after averaging said at least a subset of the N estimates to obtain an estimate for a current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the biological sample, and prior to reiterating the steps of the reconstruction algorithm until convergence, performing a spectral filtering algorithm that uses the estimate of the extrinsic phase distribution for the current iteration obtained by the other reconstruction algorithms being performed in parallel to obtain a spectrally-filtered estimate for the current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the biological sample.
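The disclosure leaves the form of the spectral filtering algorithm open. As a hedged illustration only, the sketch below blends the current channel's phase estimate with predictions derived from the other channels by rescaling their phases by the wavelength ratio (which assumes a thin, weakly dispersive sample whose phase scales roughly as 1/λ); the function name spectral_filter, the blending weight, and the scaling rule are assumptions, not the disclosed algorithm.

```python
import numpy as np

def spectral_filter(phase_estimates, wavelengths, channel, weight=0.5):
    """Blend one channel's phase estimate with 1/lambda-rescaled predictions
    from the other wavelength channels (illustrative assumption only)."""
    current = phase_estimates[channel]
    others = [phase_estimates[k] * (wavelengths[k] / wavelengths[channel])
              for k in range(len(wavelengths)) if k != channel]
    cross_channel = np.mean(others, axis=0)
    # weight = 1.0 would keep only the current channel's own estimate.
    return weight * current + (1.0 - weight) * cross_channel
```

For three RGB channels running in parallel, each channel's reconstruction would call such a filter once per iteration using the other two channels' current estimates.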
The phase distribution estimate contains information about a refractive index of the biological sample. In accordance with an embodiment, the processor processes the refractive index to produce a marker that is characteristic of one or more materials comprising the biological sample and of how tissue of the sample interacts with electromagnetic radiation.
In accordance with an embodiment, the marker is used to differentiate healthy from pathologic tissue.
In accordance with an embodiment, the marker is used for cancer detection and diagnosis.
In accordance with an embodiment, the marker is used to track a development of diabetes.
In accordance with an embodiment, the QPI system is used for testing lenses and comprises a multi-wavelength light source, an optically transmissive substrate, a multi-wavelength image sensor array and a processor. The optically transmissive substrate is positioned relative to the light source to ensure that light emitted by the light source is incident on a structure formed in the optically transmissive substrate. The structure comprises at least one of a CGH and a diffractive grating and generates at least one respective phasorgram image for each respective wavelength of light. A lens under test is positioned in a preselected position relative to the optically transmissive substrate. The optically transmissive substrate, the lens under test and the multi-wavelength image sensor array are positioned relative to one another to ensure that the phasorgram images are directed by the lens under test onto the multi-wavelength image sensor array, which converts the phasorgram images into electrical signals representing the phasorgram images. The processor is configured to perform a reconstruction algorithm that processes the electrical signals representing the phasorgram images to obtain a phase distribution estimate of the light transmitted through the lens under test.
In accordance with an embodiment, the reconstruction algorithm performed for lens testing is performed for each of the wavelengths of light on the respective phasorgram images associated with the respective wavelengths of light and comprises steps of:
dividing one of the phasorgram images into N phasorgram subareas, where N is a positive integer that is greater than or equal to one;
multiplying a complex transmission of N effective filters by an initial estimate of an extrinsic phase distribution to obtain N pupil fields;
performing a Fourier transformation algorithm on the N pupil fields to generate N complex fields on an image plane;
applying an amplitude constraint that replaces N calculated amplitudes of the N complex fields with N amplitudes of the N phasorgram subareas, respectively, to generate N amplitude-constrained complex fields;
performing an inverse Fourier transformation on the N amplitude-constrained complex fields to transform the N amplitude-constrained complex fields into N complex pupil fields;
dividing the N complex pupil fields by the N effective filters to obtain N estimates, respectively, of an extrinsic phase distribution associated with the light transmitted through the lens under test;
averaging at least a subset of the N estimates to obtain an estimate for a current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the lens under test; and
reiterating the steps of the reconstruction algorithm until convergence of the reconstruction algorithm occurs, wherein a final estimate of the extrinsic phase distribution associated with the light transmitted through the lens under test is obtained when convergence is reached.
In accordance with an embodiment, the reconstruction algorithm performed for lens testing further comprises:
prior to dividing one of the phasorgram images into N phasorgram subareas, performing a spectral filtering algorithm that uses the phasorgram images associated with the other wavelengths of light to obtain a spectrally-filtered phasorgram image, and wherein the step of dividing one of the phasorgram images into N phasorgram subareas comprises dividing the spectrally-filtered phasorgram image into N phasorgram subareas.
In accordance with an embodiment, the reconstruction algorithm performed for lens testing further comprises:
after averaging said at least a subset of the N estimates to obtain an estimate for a current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the lens under test, and prior to reiterating the steps of the reconstruction algorithm until convergence, performing a spectral filtering algorithm that uses the estimate of the extrinsic phase distribution for the current iteration obtained by the other reconstruction algorithms being performed in parallel to obtain a spectrally-filtered estimate for the current iteration of the reconstruction algorithm of the extrinsic phase distribution associated with the light transmitted through the lens under test.
In accordance with an embodiment, the QPI system is used for correcting images captured by a camera module of a mobile device for atmospheric aberration. The QPI system comprises an adapter, a lens of the mobile device, a multi-wavelength image sensor of the mobile device, and a processor of the mobile device. The adapter is configured to be mechanically coupled to the mobile device in optical alignment with the lens of the mobile device. The adapter comprises a multi-wavelength light source configured to emit multiple wavelengths of light, and an optically transmissive substrate positioned relative to the light source to ensure that light emitted by the light source is incident on a structure formed in the optically transmissive substrate. The structure comprises at least one of a CGH and a diffractive grating. The structure generates at least one respective phasorgram image for each respective wavelength of light. The lens of the mobile device directs the phasorgram images onto a multi-wavelength image sensor array of the mobile device, which converts the phasorgram images into electrical signals representing the phasorgram images. The processor of the mobile device is configured to perform a reconstruction algorithm that processes the electrical signals representing the phasorgram images to obtain a phase distribution estimate of the light transmitted through the lens. The processor is configured to perform an atmospheric aberration correction algorithm that uses the phase distribution estimate to correct images captured by other camera modules of the mobile device to correct for atmospheric aberration in the captured images.
In accordance with an embodiment of the QPI system for correcting atmospheric aberration, the multi-wavelength light source comprises a combination of a natural light source and a plurality of filters that operate on natural light from the natural light source to produce the light of multiple wavelengths.
In accordance with another representative embodiment, the phasorgram images for each respective wavelength are captured by the multi-wavelength image sensor array in a single-shot acquisition.
In accordance with another representative embodiment, the multi-wavelength light source comprises at least one of (1) a combination of a natural light source and a plurality of filters that operate on natural light from the natural light source to produce the light of multiple wavelengths and (2) a microscope illumination lamp.
These and other features and advantages will become apparent from the following description, drawings and claims.
The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
The present disclosure is directed to systems and methods for performing multi-wavelength QPI that are well suited for imaging of biological cells and other applications, such as lens testing and atmospheric correction in cell phone cameras. The multi-wavelength QPI system can have a relatively simple configuration and can be employed with microscopes without having to modify the hardware (e.g., the auxiliary image pathway) of the microscope.
In accordance with a representative embodiment, multi-wavelength red-green-blue (RGB) quantitative information is acquired in a single image with a system having a relatively simple configuration, as will be described below in more detail with reference to the accompanying drawing figures.
In the following detailed description, for purposes of explanation and not limitation, exemplary, or representative, embodiments disclosing specific details are set forth in order to provide a thorough understanding of inventive principles and concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the present disclosure that other embodiments according to the present teachings that are not explicitly described or shown herein are within the scope of the appended claims. Moreover, descriptions of well-known apparatuses and methods may be omitted so as not to obscure the description of the exemplary embodiments. Such methods and apparatuses are clearly within the scope of the present teachings, as will be understood by those of skill in the art. It should also be understood that the word “example,” as used herein, is intended to be non-exclusionary and non-limiting in nature.
The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. Any defined terms are in addition to the technical, scientific, or ordinary meanings of the defined terms as commonly understood and accepted in the relevant context.
The terms “a,” “an” and “the” include both singular and plural referents, unless the context clearly dictates otherwise. Thus, for example, “a device” includes one device and plural devices. The terms “substantial” or “substantially” mean to within acceptable limits or degrees acceptable to those of skill in the art. For example, the term “substantially parallel to” means that a structure or device may not be made perfectly parallel to some other structure or device due to tolerances or imperfections in the process by which the structures or devices are made. The term “approximately” means to within an acceptable limit or amount to one of ordinary skill in the art. Relative terms, such as “in,” “out,” “over,” “above,” “below,” “top,” “bottom,” “upper” and “lower” may be used to describe the various elements' relationships to one another, as illustrated in the accompanying drawings. These relative terms are intended to encompass different orientations of the device and/or elements in addition to the orientation depicted in the drawings. For example, if the device were inverted with respect to the view in the drawings, an element described as “above” another element, for example, would now be below that element.
Unless otherwise stated herein, elements or components that are referred to as being “connected to” or “coupled to” one another may be directly connected or coupled to one another or indirectly connected or coupled to one another through an intervening element, component, device or structure.
The terms “memory” and “memory device,” as those terms are used herein, are intended to denote a non-transitory computer-readable storage medium that is capable of storing computer instructions, or computer code, for execution by one or more processors. References herein to “memory” or “memory device” should be interpreted as one or more memories or memory devices. The memory may, for example, be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
A “processor,” “processing device,” or “processing logic,” as those terms are used herein, encompass an electronic component that is able to execute computer instructions. References herein to a system comprising “a processor,” “a processing device,” or “processing logic” should be interpreted as a system having one or more processors, processing devices or instances of processing logic. A processor may be, for example, a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems. The term “computer,” as that term is used herein, should be interpreted as possibly referring to a single computer or computing device or to a collection or network of computers (e.g., cloud computing), each comprising a processor or processors. Instructions of a computer program can be performed by a single computer or processor or by multiple processors that may be within the same computer or distributed across multiple computers.
Light transmitted through the CGH 105 is focused by a Fourier transform (FT) lens 107 onto an RGB camera 108, which acquires a phasorgram image. A processor 110 of the system then performs a complex diversity reconstruction algorithm that processes the phasorgram image to reconstruct a QPI image of the biological sample. Memory 111 of the system 100 is in communication with the processor 110 and stores data and/or computer instructions executed by the processor 110.
The CGH 105 is designed to produce multiple focus points that are spatially separated in the phasorgram, where each focus point is disturbed with a known amount of random perturbation.
The processor 110 performs a complex diversity reconstruction algorithm that processes the phasorgram images for the three different wavelengths shown in the accompanying drawing figures.
The single-shot imaging technique overcomes drawbacks of conventional imaging techniques by capturing multiple images generated by the CGH 105 in a single acquisition. However, the discussion below shows that there are special considerations in the design of the system and the reconstruction algorithm for single-shot techniques that have not, before now, been adequately addressed. The concept of complex diversity described in the present disclosure, and introduced in a previously-filed PCT application by the Applicant, application number PCT/US2020/042112, adequately accounts for these special considerations, where complex-number pupil filters containing both amplitude and phase values are extracted by numerical propagation from the CGH design.
The design of the CGH 105 of the present disclosure in accordance with a representative embodiment comprises three processes, which are illustrated in the accompanying drawing figures.
During the first process 501, the seed filters are converted to complex field phasorgrams by taking the inverse Fast Fourier Transforms (IFFTs). Amplitudes of the complex fields of the N individual phasorgrams are mapped onto different positions in the image plane. This multiple-phasorgram amplitude distribution is the target amplitude for the CGH design. Conventional CGH design techniques, such as the Gerchberg-Saxton (GS) algorithm and the modified GS technique, for example, can be used for this design process. The constraints in the CGH design preferably include a pure phase constraint in the pupil plane and the amplitude target distribution in the image plane.
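As an illustration of the conventional design techniques mentioned above, the following is a minimal Gerchberg-Saxton-style sketch in Python/NumPy. The propagation direction (IFFT from pupil to image plane, FFT back) follows the description in this disclosure, while the function name design_cgh, the random starting phase, and the fixed iteration count are assumptions of the sketch rather than the disclosed design tool.

```python
import numpy as np

def design_cgh(target_amplitude, num_iterations=200, seed=0):
    """Gerchberg-Saxton-style design of a pure-phase CGH whose image-plane
    amplitude approximates the tiled multi-phasorgram target amplitude."""
    rng = np.random.default_rng(seed)
    # Start from a random pure-phase pattern in the pupil plane.
    cgh_phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)

    for _ in range(num_iterations):
        pupil_field = np.exp(1j * cgh_phase)                 # pure-phase constraint
        image_field = np.fft.ifft2(pupil_field)              # propagate to image plane
        # Impose the target amplitude while keeping the calculated phase.
        image_field = target_amplitude * np.exp(1j * np.angle(image_field))
        pupil_field = np.fft.fft2(image_field)               # propagate back to pupil
        cgh_phase = np.angle(pupil_field)                    # keep only the phase

    return cgh_phase
```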
The second process, represented by the block labeled 502, is an extraction of the N complex effective filters. Although the CGH generates the same amplitude patterns as generated from individual seed filters, the field modulations introduced into the diffraction orders are not the same as the seed filters when using the CGH because the CGH design process does not constrain these field modulations. Thus, in accordance with a representative embodiment, the actual effective filters are extracted from the designed CGH during the extraction process 502.
During the extraction process 502, the complex field reflected from the CGH pattern is calculated in a computer by assuming illumination with a uniform plane-wave amplitude. Then, the field is numerically propagated to the image plane by performing an IFFT. The complex field at the image plane is divided into N individual subareas corresponding to the phasorgrams by cropping data in the image plane, which results in a collection of phasorgrams. The phasorgram fields from the cropped subareas are individually propagated back to the pupil plane by performing an FFT. The resulting collection of N complex field patterns in the pupil plane are the effective filters introduced to the individual diffraction orders.
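A minimal sketch of this extraction, consistent with the FFT conventions and the full-size filter arrays assumed in the earlier reconstruction sketch; the function name extract_effective_filters and the subarea_slices argument describing where each phasorgram lands in the image plane are hypothetical.

```python
import numpy as np

def extract_effective_filters(cgh_phase, subarea_slices):
    """Propagate a uniform plane wave through the designed CGH, crop the N
    phasorgram subareas in the image plane, and propagate each one back to
    the pupil plane to obtain the N complex effective filters."""
    pupil_field = np.exp(1j * cgh_phase)       # uniform plane-wave illumination
    image_field = np.fft.ifft2(pupil_field)    # numerical propagation to image plane

    effective_filters = []
    for rows, cols in subarea_slices:          # e.g. (slice(0, 256), slice(256, 512))
        cropped = np.zeros_like(image_field)
        cropped[rows, cols] = image_field[rows, cols]    # keep one phasorgram only
        effective_filters.append(np.fft.fft2(cropped))   # back to the pupil plane
    return np.array(effective_filters)
```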
It should be noted that the design process 501 and the extraction process 502 need only be performed one time when a new CGH (or diffractive grating) is generated. Thus, these additional process steps do not increase computational time for the reconstruction, represented by the block labeled 503.
The third process 503 is reconstruction. The reconstruction process represented by block 503 is performed for each wavelength used in the multi-wavelength QPI system 100, but for ease of illustration and discussion, a single instance of the reconstruction algorithm is shown and described. The experimentally measured phasorgram associated with one of the wavelengths, formed by using the CGH in the manner described above, serves as the input to that instance of the reconstruction algorithm.
In accordance with a representative embodiment, the reconstruction algorithm is an iterative Fourier method with field averaging proposed by Gerchberg, which is modified for complex diversity in accordance with the inventive principles and concepts. In accordance with this embodiment, the reconstruction algorithm starts by setting the initial guess in the pupil plane to a flat phase, i.e., no aberration. The following steps are then performed iteratively: (1) the effective filters obtained by the extraction process 502 are applied to the estimate of the extrinsic phase distribution (the initial guess in the first iteration) to yield N individual pupil fields by multiplying the complex transmission of the effective filters by the estimate of the extrinsic phase distribution; (2) FFTs generate N individual complex fields on the image plane; (3) an amplitude constraint preferably is applied by replacing the calculated amplitudes with the measured phasorgram set without disturbing the phase distribution; (4) inverse FFTs are applied to the N complex fields to propagate them back to the pupil plane; (5) the resulting complex pupil fields are divided by the effective filters in order to form N individual estimates of the extrinsic phase; (6) these N estimates are averaged to obtain the iteration's estimate of the extrinsic field, except in areas where amplitudes of the effective filters are smaller than a threshold value, to avoid noise amplification. The threshold value may be set to, for example, 10% of the maximum amplitude, but other threshold values could be used.
In accordance with a preferred embodiment, the reconstruction algorithm includes a step (7), which is a spectral filtering algorithm applied to the iteration's estimate obtained by averaging the complex fields. In the spectral filtering algorithm, input from the other spectral channels is used to define the estimates of the complex-valued electric field U_s passed to the algorithm, which are the current estimates of the extrinsic phase and amplitude at each wavelength. For example, if there are three wavelength channels (RGB), then there are three parallel reconstruction algorithms, with the phase estimates from the other spectral channels being used in the spectral filtering block.
The process from steps (1) to (6) or (1) to (7) is defined as one iteration, and iterations preferably are repeated until an error measure between the measured irradiance pattern and the synthetically generated pattern falls below a target threshold value, or the iteration number reaches a pre-determined maximum. The final estimates obtained after spectral filtering by all of the parallel reconstruction algorithms provide a QPI result that provides information associated with the biological sample 102.
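The disclosure does not define the error measure used for the stopping test; a normalized root-mean-square difference between the measured and synthetically generated irradiance patterns is one plausible choice, sketched below with hypothetical names.

```python
import numpy as np

def irradiance_error(measured_amplitude, synthetic_field):
    """Normalized RMS difference between measured and synthetic irradiance."""
    measured_irradiance = measured_amplitude ** 2
    synthetic_irradiance = np.abs(synthetic_field) ** 2
    diff = synthetic_irradiance - measured_irradiance
    return np.sqrt(np.mean(diff ** 2)) / np.mean(measured_irradiance)

# Iterations stop when the error falls below a target threshold or when the
# iteration count reaches a predetermined maximum, e.g.:
#     if irradiance_error(measured, synthetic) < 1e-3 or it >= max_iters:
#         break
```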
Steps 705-708 represent the reconstruction algorithms (performed per wavelength) performed by the processor 110, as described above with reference to block 503.
One of the benefits of the QPI system 100 shown in the accompanying drawing figures is its relatively simple configuration, which allows it to be used with an existing microscope without modifying the internal hardware of the microscope.
The processor 110 and memory 111 may be components of a computer system, such as, for example, a personal computer (PC) that can be electrically coupled via a wired or wireless link to the camera 108. The aforementioned attachments, if used, could be configured to attach to the respective ports in aligned positions via keying features on the attachments and the ports to ensure precise alignment along the optical pathways.
The final phase distribution estimate obtained at convergence of the reconstruction algorithm contains information about the thickness and refractive index of the biological sample. This information is useful for a variety of applications, particularly in biology, including imaging of red blood cells, optical properties of tissues, label-free in vitro biological sample imaging, and cell refractive index. In biological cells and fluids, the refractive index provides a characteristic marker of individual materials, governing how the tissue interacts with electromagnetic radiation. The information about the refractive index has the potential to be used to differentiate healthy from pathologic tissue, leading to novel methods for cancer detection and diagnosis. The refractive index marker has been used in the past for tracking the development of diabetes by examining the glycation process and glycated proteins.
Alternatively, the cell-phone camera image sensor can be used in place of the high-quality RGB camera 108. In this case, the cell phone with a simple CGH adaptor and filter can be used as a single instrument for detecting the phase of an incoming wavefront at multiple wavelengths, which is useful for correcting atmospheric aberrations that distort images, such as the distortion observed when imaging scenes across a hot, sandy beach. That is, the addition of a simple CGH/filter adaptor can be used in the cell phone camera system to correct images from other camera modules in the system. Modern cell phone cameras typically have multiple camera modules for wide-angle, telephoto, and normal viewing, as well as for time-of-flight information. The additional camera module could be used for atmospheric correction.
The A, B, C and D orders are separated and processed in the computer algorithm used to reconstruct the phase of the object. Depending on the manner in which the reconstruction algorithm is designed and implemented, diffracted orders may or may not be used. For example, since RGB information is separated spatially in the A, B, C and D orders, this information may be used and the Z order not used. In addition, the CGH may form additional diffracted orders that may or may not be used in the reconstruction algorithm.
Also shown in the drawing is a dashed line that represents a software fiducial that is a common feature of commercially available camera viewing software. A fiducial of this type is useful in aligning the CGH 105 with the camera 108. For example, the Z-order information focused from the CGH 105 can be used with a mechanical stage that controls the position of the camera in order to align the camera so that the Z order is in the center of the camera area. All diffraction orders move together, so the entire pattern is centered when the Z order is centered.
The CGH 105 can also have simple gratings or other diffractive features to aid in the alignment of the CGH 105 and the camera 108. For example, diffracted orders a, b, c, and d result from a simple linear grating that is designed into a region outside the CGH 105 used for the phase reconstruction. In other words, in the optically transmissive substrate in which the CGH is formed, a simple grating may be formed in a region of the substrate outside of the region in which the CGH is formed. These orders provide a reference for rotation of the CGH 105 with respect to the camera face. As with centering the Z order described above, rotation of the CGH 105 to align a, b, c and d with the vertical and horizontal lines 901 and 902, respectively, of the fiducial also rotates the CGH orders used for image processing. Thus, with the combination of x and y translation of the camera 108 and rotation of the CGH 105 using a fiducial and the reference grating orders, the system is aligned easily.
In addition to the CGH 105 that is used to produce phasorgrams for the computer algorithm, a second CGH can be designed without random phase changes in order initially to find the centers of diffracted orders on the camera. That is, the second CGH forms point foci at centers of the A, B, C and D circles. These foci are used to determine the centers of the phasorgram circles from the first CGH, which is information that is useful for the portion of the computer algorithm that divides the image from the first CGH into separate phasorgrams. Like with the first CGH 105, the second CGH can contain alignment features, like gratings, in order to align the second CGH with the camera.
An extension of the methods described above uses multiple input beams coupled onto the CGH at different angles.
It should be noted that the inventive principles and concepts have been described with reference to representative embodiments, but that the inventive principles and concepts are not limited to the representative embodiments described herein. Although the inventive principles and concepts have been illustrated and described in detail in the drawings and in the foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art, from a study of the drawings, the disclosure, and the appended claims.
This application is a nonprovisional Patent Cooperation Treaty (PCT) application that claims priority to, and the benefit of the filing date of, U.S. Provisional Application having Ser. No. 62/969,055, filed on Feb. 1, 2020 and entitled “SYSTEMS AND METHODS FOR PERFORMING MULTIPLE-WAVELENGTH QUANTITATIVE PHASE IMAGING (QPI),” which is incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2021/016101 | 2/1/2021 | WO |

Number | Date | Country
---|---|---
62969055 | Feb 2020 | US