The embodiments discussed in the present disclosure are related to a method of using phase engineering and computational recovery for spatial 'omics.
Unless otherwise indicated in the present disclosure, the materials described in the present disclosure are not prior art to the claims in the present application and are not admitted to be prior art by inclusion in this section.
In spatial 'omics applications (e.g., genomics, transcriptomics, proteomics, metabolomics, etc.), spatial organization may describe biological networks. In these biological networks, sub-cellular components (e.g., proteins, ribonucleic acid (RNA), messenger RNA (mRNA), deoxyribonucleic acid (DNA), etc.) may be influenced by a surrounding environment including other sub-cellular components. Additionally, spatial 'omics is a powerful tool in cellular quantification and cellular phenotyping, and in the understanding of important mechanisms such as tissue organization or cell regulation by examination of a multitude of expression factors (e.g., mitosis, trafficking, morphology, chromaticity, density, etc.). An optical system (e.g., an optical microscope) may be implemented to capture data and/or images representative of a sample (e.g., a biological sample) in spatial 'omics applications.
The subject matter claimed in the present disclosure is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one example technology area where some embodiments described in the present disclosure may be practiced.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential characteristics of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
One or more embodiments of the present disclosure may include an optical system. The optical system may include an objective lens including a back focal plane. The optical system may also include a camera configured to capture an image representative of a biological sample. In addition, the optical system may include an optical element optically coupled to the back focal plane. The optical element may extend a depth of field defined by the objective lens and maintain light throughput to permit the camera to capture the image representative of the biological sample within the extended depth of field. Further, the optical system may include a computing device communicatively coupled to the camera. The computing device may determine a location of tissue, cellular, or sub-cellular components within the biological sample, count a number of the components within the biological sample, or determine interactions of the components within the biological sample based on the image representative of the biological sample within the extended depth of field.
One or more embodiments of the present disclosure may include a system including one or more computer-readable storage media configured to store instructions and one or more processors communicatively coupled to the one or more computer-readable storage media and configured to, in response to execution of the instructions, cause the system to perform operations. The operations may include receiving data representative of a biological sample within an extended depth of field. The operations may also include extracting information corresponding to tissue, cellular, or sub-cellular components within the biological sample within the extended depth of field from the data representative of the biological sample within the extended depth of field. In addition, the operations may include determining 'omics information of the tissue, cellular, or sub-cellular components based on the extracted information.
One or more embodiments of the present disclosure may include a method. The method may include receiving data representative of a biological sample within an extended depth of field. The method may also include extracting information corresponding to tissue, cellular, or sub-cellular components within the biological sample within the extended depth of field from the data representative of the biological sample within the extended depth of field. In addition, the method may include determining 'omics information based on the extracted information.
The object and advantages of the embodiments will be realized and achieved at least by the elements, features, and combinations particularly pointed out in the claims. Both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive.
Example embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
An optical system (e.g., an optical microscope) may include an objective lens that defines an image resolution, a magnification, a lateral field of view, a depth of field, or some combination thereof to capture data and/or images representative of the sample. A computing device may use the data and/or the images to determine information about the biological networks. For example, the computing device may determine a role of a genome's three-dimensional (3D) organization in regulation of transcription and/or other cellular functions.
Spatially resolved transcriptomics may greatly expand the knowledge and understanding of complex multicellular biological networks. Spatially resolved transcriptomics may combine gene expression data with spatial information. The gene expression data and the spatial information may be obtained using one or more of various methods. One or more of these methods may include sample chemistry and may be tissue-type dependent.
Some objective lenses may be designed in accordance with Gaussian optics design principles to define an image resolution, a magnification, a lateral field of view, a depth of field, or some combination thereof to capture an image. These objective lenses may be designed to operate at the highest possible resolution and image quality only at a nominal focal plane. If the sample's axial dimension is greater than (i.e., extends outside) the depth of field associated with this nominal focal plane, the data and/or images that are obtained of the sample may include errors (e.g., may be out of focus or incomplete). In addition, these objective lenses may include intrinsic tradeoffs. For example, these objective lenses may provide a higher image resolution (e.g., include a higher numerical aperture (NA)) while providing a lower depth of field. As another example, these objective lenses may provide a lower image resolution (e.g., include a lower NA) while providing a greater depth of field. Objective lenses designed in accordance with Gaussian optics design may be unable to simultaneously provide both a higher image resolution and a greater depth of field.
The tradeoff between the resolution and the depth of field of these objective lenses may become more pronounced as the NA of the objective lenses increases due to a quadratic dependence of the depth of field on the NA. As an example, the depth of field of a 0.95 NA air objective lens at an emission wavelength of six hundred seventy nanometers (nm) may be equal to roughly 0.8 micrometers (μm). The depth of field of the 0.95 NA air objective lens may be defined by a full width at half maximum (FWHM) of a peak intensity of a point spread function (PSF) when measured as a function of defocus. The PSF may correspond to an impulse response of the optical system to a point source.
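By way of illustration only, the quadratic dependence of the depth of field on the NA may be sketched with a simplified air-objective approximation (DOF ≈ λ/NA²); the function name and exact value below are illustrative assumptions, and a rigorous value would be defined by the FWHM of the PSF as described above:

```python
def depth_of_field_um(wavelength_nm: float, na: float) -> float:
    """Approximate depth of field of an air objective via DOF ~ lambda / NA^2
    (a simplified approximation; the exact value is defined by the FWHM of the
    PSF peak intensity as a function of defocus)."""
    wavelength_um = wavelength_nm / 1000.0
    return wavelength_um / (na ** 2)

dof = depth_of_field_um(670.0, 0.95)  # 670 nm emission, 0.95 NA air objective
print(f"DOF ~ {dof:.2f} um")  # roughly 0.74 um, consistent with "roughly 0.8 um"
```

The quadratic dependence means that doubling the NA reduces the depth of field by roughly a factor of four.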
Some samples in image-based spatial 'omics applications are thicker than the depth of field of these objective lenses (e.g., thicker than 0.8 μm). For example, cells within the sample may be equal to or greater than two μm and sections of the sample may be equal to or greater than ten μm. Furthermore, due to the limited depth of field of these objective lenses, small imperfections on a surface of a coverslip, a petri dish, a silicon surface, a multi-well imaging plate, or other surface that the sample is physically positioned on may cause focus deviations that are greater than the depth of field, which may cause the sample to be out of focus in the image.
Some optical systems that include these objective lenses may perform axial scanning to generate a combined image of a sample that is greater than the depth of field. Axial scanning may include the process of stepping through focus of the objective lens (e.g., stepping through focus in a Z axis away or toward the objective lens) and capturing images of different portions of the sample at each axial step. The computing device may combine images together in post processing to generate the combined image.
To generate the combined image of the sample using axial scanning, each axial step may be made in accordance with a Nyquist sampling rate. The Nyquist sampling rate may be equal to roughly half the depth of field of the corresponding objective lens. For example, the Nyquist sampling rate for the 0.95 NA air objective lens may be roughly equal to four hundred nm. Thus, ten or more axial steps may be performed for the optical system including the 0.95 NA air objective lens to capture the images to generate the combined image of a typical four μm sample.
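By way of illustration only, the number of axial steps implied by Nyquist-rate sampling may be estimated as follows (a simplified sketch; the function name is illustrative):

```python
import math

def axial_steps(sample_thickness_um: float, depth_of_field_um: float) -> int:
    """Number of axial steps when stepping at the Nyquist rate, taken here as
    roughly half the depth of field per step."""
    step_um = depth_of_field_um / 2.0  # e.g., ~400 nm for a ~0.8 um DOF
    return math.ceil(sample_thickness_um / step_um)

print(axial_steps(4.0, 0.8))  # a typical 4 um sample: 10 steps
```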
Some optical systems may implement a feedback loop or other mechanical system to perform axial scanning. These optical systems may experience a settling time as the optical system performs each axial step. These optical systems may experience mechanical errors, positional errors, or some combination thereof moving between one or more axial steps. These errors may increase an amount of time the optical system takes to capture the images to generate the combined image. To try and avoid these errors, these optical systems may introduce a delay between each axial step. The delay between the axial steps may reduce a frame rate of the optical system, increase the amount of time to capture the images to generate the combined image, or some combination thereof.
These optical systems may also increase an amount of data to be captured, stored, processed, or some combination thereof due to multiple images being captured and combined to generate the combined image. The increased data may introduce delays experienced by these optical systems by increasing an amount of time to process the increased data (e.g., may increase an extensiveness of labor, scale of the data, throughput of the data, or some combination thereof).
Delays experienced by these optical systems may increase an experiment time (e.g., an amount of time to determine the spatial organization of the sample). For example, these optical systems may increase the experiment time between fifty to one hundred percent compared to an optical system that captures an image of the sample that is equal to or smaller than the depth of field.
Some optical systems may include a spatial light modulator or a binary phase mask pattern optically coupled to the objective lens to extend the depth of field. Spatial light modulators have drawbacks with respect to polarization dependence, pixel size, phase modulation variance, phase and amplitude efficiency, and the like. Spatial light modulators are also limited to only diffractive phase elements. The binary phase pattern creates a null in a modulation transfer function (MTF) at lower spatial frequencies for the optical system, reducing the image resolution to a level that is below a computational recovery level to capture a useful image. In addition, the binary phase pattern may also be sensitive to system aberrations.
Some embodiments described in the present disclosure may include an optical element to extend the depth of field while maintaining the image resolution defined by the objective lens. Some embodiments described in the present disclosure may overcome the tradeoffs between the depth of field and the image resolution of the objective lens.
For example, in some embodiments, an optical system may include the optical element (e.g., a dielectric phase mask) optically coupled to a back focal plane of the objective lens of the optical system. The optical element may extend the depth of field of the objective lens while maintaining the image resolution of the objective lens. The optical element may be matched to 'omics problems of locating, counting, and/or determining the interactions of the sub-cellular components within the sample. For example, the optical element may be designed based on known characteristics of the objective lens, the sample, or some combination thereof.
The optical element may be designed to match the PSF of the objective lens and the optical element to a thickness of the sample, a density of the sample, or some combination thereof. The optical element may alter an optical response of the optical system by positioning a phase mask in the back focal plane. In addition, a computational recovery algorithm may be applied to the image that is captured using the optical element to extract spatial 'omics data.
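By way of illustration only, the effect of a phase mask positioned in the back focal plane may be modeled with a simplified scalar Fourier-optics sketch, in which the PSF is the squared magnitude of the Fourier transform of the pupil function; all names, sizes, and units below are illustrative assumptions:

```python
import numpy as np

def simulate_psf(pupil_radius_px: int, phase_mask: np.ndarray) -> np.ndarray:
    """Simulate an in-focus PSF as |FFT(pupil * exp(i * phase))|^2, where the
    phase mask models an optical element positioned in the back focal plane."""
    n = phase_mask.shape[0]
    y, x = np.mgrid[-(n // 2):n // 2, -(n // 2):n // 2]
    pupil = (x ** 2 + y ** 2 <= pupil_radius_px ** 2).astype(float)  # circular aperture
    field = pupil * np.exp(1j * phase_mask)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))) ** 2
    return psf / psf.sum()  # normalize to unit total energy

n = 128
flat_phase = np.zeros((n, n))                  # no mask: conventional PSF
psf_conventional = simulate_psf(32, flat_phase)
```

In this sketch, a candidate mask design could be evaluated by substituting a nonzero phase profile and examining the resulting PSF as a function of defocus.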
Alternatively or additionally, in some embodiments, the objective lens may include a back focal plane to which the optical element is optically coupled. The camera may be configured to capture an image representative of the sample within the depth of field defined by the objective lens. The optical element may be configured to extend the depth of field defined by the objective lens and substantially maintain light throughput to permit the camera to capture the image representative of the sample within the extended depth of field.
One or more embodiments described in the present disclosure may enhance 'omics information recovery, reduce an amount of time for acquisition of 'omics information, reduce a size of data sets for analysis, improve existing instrumentation, enhance an image volume, or some combination thereof. In addition, one or more embodiments described in the present disclosure may reduce an amount of time to extract 'omics information from collected data, which may improve the throughput and speed of use of the optical system.
The optical element may reduce or eliminate the use of axial scanning to capture images of different portions of the sample with an axial dimension that is greater than the depth of field defined by the objective lens. The optical element may also increase a sample volume and/or a density of information captured in a single image without using axial scanning. In some embodiments, the optical element may provide twenty times as much information in a single image than an optical system that does not include the optical element. In addition, the optical element may maintain the image resolution (e.g., lateral resolution) over the extended depth of field.
The optical element may reduce the experiment time compared to an optical system using axial scanning. For example, the optical element may permit the computing device to determine a number of fluorescing sub-cellular components (e.g., particles), a location of the sub-cellular components, a brightness of the sub-cellular components, and other key attributes of the sub-cellular components at an increased rate and with enhanced recovery compared to an optical system that performs axial scanning, since a given axial range may be imaged with fewer images. In addition, the optical element may permit moving parts to be removed from the optical system, because the optical system does not perform axial steps, which may reduce a cost of the optical system, a number of failure modes of the optical system, or some combination thereof.
The optical system including the optical element may increase a focal volume to capture more sample information related to the sample in a single image. Increasing the focal volume may ensure that the sample stays in focus to reduce or eliminate the use of electro-mechanical parts (e.g., autofocus stages and/or autofocus algorithms). The reduction or elimination of electro-mechanical parts may reduce a number of possible failure modes of the optical system. Furthermore, the optical system may include the optical element with a fixed alignment (e.g., fully encased in or attached to an objective lens) to further improve reliability of the optical system. In some embodiments, the optical element may comprise a passive device with the fixed alignment.
The optical element may be configured to interface with (e.g., optically couple to a back focal plane and physically attach to) standard microscope systems (e.g., existing system architectures).
These and other embodiments of the present disclosure will be explained with reference to the accompanying figures. It is to be understood that the figures are diagrammatic and schematic representations of such example embodiments, and are not limiting, nor are they necessarily drawn to scale. In the figures, features with like numbers indicate like structure and function unless described otherwise.
Light rays from the sample 108 may traverse the dish 111, the objective lens 102, and the tube lens 106 and be captured by a camera sensor within the camera 104 as an image. The light rays from the sample 108 may include reflected light rays (e.g., light rays reflected by the sample 108), transmitted light rays (e.g., light rays transmitted through the sample 108), fluorescence light rays (e.g., light rays emitted from the sample 108 via fluorescence), and/or other light rays which may be used to image the sample 108. Characteristics of the image captured by the camera 104 may be based on the image resolution, the magnification, the lateral field of view, the depth of field, or other aspects defined by the optical system 100.
As illustrated in
The computing device 113 may receive the images corresponding to the different depths of field 110a-c captured by the camera 104 (e.g., the first image, the second image, and the third image). The computing device 113 may process and combine the images together to generate a combined image representative of the portions within each of the depths of field 110a-c. For example, the computing device 113 may process and combine together the first image, the second image, and the third image to generate the combined image representative of the first portion, the second portion, and the third portion of the sample 108. The computing device 113 may use the combined image to extract the 'omics information corresponding to the sample 108.
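By way of illustration only, one common way to combine such a stack of axially stepped images is a maximum intensity projection (MIP); a minimal sketch (function name illustrative) follows:

```python
import numpy as np

def combine_z_stack_mip(z_stack: np.ndarray) -> np.ndarray:
    """Combine an axial stack of images (shape: [n_slices, height, width])
    into one combined image via a maximum intensity projection (MIP)."""
    return z_stack.max(axis=0)

# Example: three slices, each with a different portion of the sample in focus.
stack = np.stack([
    np.array([[0, 5], [1, 0]]),
    np.array([[9, 0], [0, 2]]),
    np.array([[0, 1], [7, 0]]),
])
combined = combine_z_stack_mip(stack)
print(combined)  # [[9 5]
                 #  [7 2]]
```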
The optical element 214 may include a non-traditional optics device (e.g., a non-traditional optics dielectric phase mask), which may be placed in an optical path of the optical system 200. In addition, the optical element 214 may comprise a passive device with a fixed alignment.
The optical element 214 may be optically coupled to a back focal plane of the objective lens 212. The optical element 214 may extend the depth of field defined by the objective lens 212, maintain an image resolution defined by the objective lens 212, substantially maintain light throughput, or some combination thereof to permit the camera 104 to capture an image representative of the sample within the extended depth of field. The extended depth of field 218 may be greater than the depth of field defined just by the objective lens 212. In some embodiments, the optical element 214 may generally maintain ninety-eight percent or greater of the light throughput. In these and other embodiments, the optical element 214 may generally maintain ninety percent or greater of the image resolution.
Light rays from the sample 108 may traverse the dish 111, the objective lens 212, the optical element 214, and the tube lens 106 and be captured by the camera sensor within the camera 104 as an image. The light rays from the sample 108 may include reflected light rays, transmitted light rays, fluorescence light rays, and/or other light rays which may be used to image the sample 108. As illustrated in
Characteristics of the image captured by the camera 104 may be based on the image resolution, the magnification, the lateral field of view, the depth of field, or other aspects defined by the objective lens 212, the optical element 214, or some combination thereof.
The optical element 214 may include a phase mask optically coupled to the back focal plane of the objective lens 212. An example phase mask 216 is illustrated in
The objective lens 212 is illustrated in
The optical element 214 may be designed based on physics of PSF engineering matched to the objective lens 212, the sample 108, 'omics problems, or some combination thereof. For example, the optical element 214 may be designed based on the physics of PSF engineering matched to characteristics of the objective lens 212, a thickness of the sample 108, a density of sub-cellular components within the sample 108, or some combination thereof.
The optical element 214 may also be designed based on the physics of PSF engineering in conjunction with a computational recovery algorithm that is matched to the objective lens 212, the sample 108, 'omics problems, or some combination thereof.
The computing device 213 may receive the image representative of the sample 108 within the extended depth of field 218 captured by the camera 104. The computing device 213 may apply the computational recovery algorithm to process the image and extract the 'omics information corresponding to the sample 108. The 'omics information may include locations of the sub-cellular components, interactions of the sub-cellular components, counts of the sub-cellular components, densities of the sub-cellular components, clusters of the sub-cellular components, chromaticity of the sub-cellular components, brightness of the sub-cellular components, colocations of the sub-cellular components, trajectories of the sub-cellular components, velocity of the sub-cellular components, or some combination thereof. Additionally or alternatively, the 'omics information may include locations of cellular components, interactions of the cellular components, counts of the cellular components, densities of the cellular components, clusters of the cellular components, chromaticity of the cellular components, brightness of the cellular components, colocations of the cellular components, trajectories of the cellular components, velocity of the cellular components, or some combination thereof. Additionally or alternatively, the 'omics information may include locations of tissue components, interactions of the tissue components, counts of the tissue components, densities of the tissue components, clusters of the tissue components, chromaticity of the tissue components, brightness of the tissue components, colocations of the tissue components, trajectories of the tissue components, velocity of the tissue components, or some combination thereof. The computational recovery algorithm may include identifying clusters of overlapping signal emissions representative of different tissue, cellular, or sub-cellular components within the image, fitting the clustered signals simultaneously, and fitting a variant background beneath each cluster of signal emissions.
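By way of illustration only, the cluster-fitting step of such a computational recovery algorithm may be sketched as a simultaneous least-squares fit of overlapping Gaussian emission spots over a tilted-plane (variant) background; the Gaussian spot model, the parameterization, and the function names are simplifying assumptions and not the algorithm of any particular embodiment:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_cluster(patch, initial_centers, sigma=1.5):
    """Simultaneously fit a cluster of overlapping Gaussian emission spots,
    together with a tilted-plane (variant) background beneath the cluster.
    Parameters per spot: (amplitude, x0, y0); background: (offset, gx, gy)."""
    yy, xx = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    n = len(initial_centers)

    def model(p):
        # Variant background: offset plus gradients in x and y.
        img = p[3 * n] + p[3 * n + 1] * xx + p[3 * n + 2] * yy
        for i in range(n):
            a, x0, y0 = p[3 * i:3 * i + 3]
            img = img + a * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
        return img

    p0 = []
    for (x0, y0) in initial_centers:
        p0 += [float(patch.max()), x0, y0]  # crude initial amplitude and position
    p0 += [float(patch.min()), 0.0, 0.0]    # crude initial background
    fit = least_squares(lambda p: (model(p) - patch).ravel(), p0)
    return [tuple(fit.x[3 * i:3 * i + 3]) for i in range(n)]
```

In this sketch, the returned tuples contain the fitted amplitude and center of each spot in the cluster; fitting all spots and the background jointly avoids the bias that fitting each spot in isolation would incur when their emissions overlap.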
The light rays from the tube lens 106 may be re-directed by the reflector 303 and captured by the camera sensor within the camera 104 as an image. The camera 104 may capture the image representative of the entire sample 108 during a single step.
The computing device 313 may receive the image representative of the sample 108 within the extended depth of field 218 captured by the camera 104. The computing device 313 may apply the computational recovery algorithm to process the image and extract the 'omics information corresponding to the sample 108. The 'omics information may include locations of the sub-cellular components, interactions of the sub-cellular components, counts of the sub-cellular components, densities of the sub-cellular components, clusters of the sub-cellular components, chromaticity of the sub-cellular components, brightness of the sub-cellular components, colocations of the sub-cellular components, trajectories of the sub-cellular components, velocity of the sub-cellular components, or some combination thereof. Additionally or alternatively, the 'omics information may include locations of cellular components, interactions of the cellular components, counts of the cellular components, densities of the cellular components, clusters of the cellular components, chromaticity of the cellular components, brightness of the cellular components, colocations of the cellular components, trajectories of the cellular components, velocity of the cellular components, or some combination thereof. Additionally or alternatively, the 'omics information may include locations of tissue components, interactions of the tissue components, counts of the tissue components, densities of the tissue components, clusters of the tissue components, chromaticity of the tissue components, brightness of the tissue components, colocations of the tissue components, trajectories of the tissue components, velocity of the tissue components, or some combination thereof. The computational recovery algorithm may include identifying clusters of overlapping signal emissions representative of different tissue, cellular, or sub-cellular components within the image, fitting the clustered signals simultaneously, and fitting a variant background beneath each cluster of signal emissions.
The graphical representation 501 illustrates a conventional lateral or an XY PSF of the optical system that did not include the optical element. The graphical representation 503 illustrates an XZ cross section of the conventional PSF of the optical system that did not include the optical element. The graphical representation 505 illustrates a DF lateral or XY PSF of the optical system that included the optical element. The graphical representation 507 illustrates an XZ cross section of the DF PSF of the optical system that included the optical element.
The graphical representation 501 illustrates the conventional lateral or XY PSF as a function of defocus of the optical system that did not include the optical element. The graphical representation 505 illustrates the DF lateral or XY PSF normalized to a brightest PSF at the original focal plane of the optical system that included the optical element. The graphical representation 507 of the XZ cross section of the DF PSF compared to the graphical representation 503 of the XZ cross section of the conventional PSF shows an extension in the depth of field of the optical system that included the optical element compared to the optical system that did not include the optical element.
A graphical representation 509 of a phase mask of the optical element used for the simulations is illustrated in
The optical system that included the optical element extended the depth of field to roughly twenty-one μm while generally maintaining a high image resolution (e.g., a lower PSF footprint). As illustrated in the graphical representation 513, due to conservation of energy, extending the depth of field resulted in a drop in the peak intensity of the PSF. In the simulations, the peak intensity of the optical system that included the optical element at focus was about one third the peak intensity of the optical system that did not include the optical element, which shows that energy was conserved with minimal losses to side lobes. However, away from focus, the peak intensity of the optical system that included the optical element was higher than that of the optical system that did not include the optical element, which resulted in a higher signal to noise ratio (SNR), a greater depth of field, or some combination thereof.
The graphical representation 601 illustrates a conventional lateral or XY PSF of the optical system that did not include the optical element. The graphical representation 603 illustrates an XZ cross section of the conventional PSF of the optical system that did not include the optical element. The graphical representation 605 illustrates a DF lateral or XY PSF of the optical system that included the optical element. The graphical representation 607 illustrates an XZ cross section of the DF PSF of the optical system that included the optical element.
The graphical representation 601 illustrates the conventional lateral or XY PSF as a function of defocus of the optical system that did not include the optical element. The graphical representation 605 illustrates the DF lateral or XY PSF normalized to a brightest PSF at the original focal plane of the optical system that included the optical element. The graphical representation 607 of the XZ cross section of the DF PSF compared to the graphical representation 603 of the XZ cross section of the conventional PSF shows an extension in the depth of field in the optical system that included the optical element compared to the optical system that did not include the optical element.
A graphical representation 609 of a phase mask of the optical element used for the simulations is illustrated in
The graphical representation 701 illustrates a conventional lateral or XY PSF of the optical system that did not include the optical element. The graphical representation 703 illustrates an XZ cross section of the conventional PSF of the optical system that did not include the optical element. The graphical representation 705 illustrates a DF lateral or XY PSF of the optical system that included the optical element. The graphical representation 707 illustrates an XZ cross section of the DF PSF of the optical system that included the optical element.
The graphical representation 701 illustrates the conventional lateral or XY PSF as a function of defocus of the optical system that did not include the optical element. The graphical representation 705 illustrates the DF lateral or XY PSF normalized to a brightest PSF at the original focal plane of the optical system that included the optical element. The graphical representation 707 of the XZ cross section of the DF PSF compared to the graphical representation 703 of the XZ cross section of the conventional PSF shows an extension in the depth of field in the optical system that included the optical element compared to the optical system that did not include the optical element.
A graphical representation 709 of a phase mask of the optical element used for the simulations is illustrated in
The simulations were performed based on single molecule fluorescence in-situ hybridization (smFISH) image data. smFISH may be used to study gene expression in cells and tissues by counting individual mRNA molecules. In smFISH imaging experiments, individual molecules may be pinpointed by locating a center of an emission signal of each molecule. However, since the tissue, cellular, or sub-cellular components (e.g., mRNA) are distributed throughout the volume of the sample, the limited depth of field of the microscope that did not include the optical element may cause the components to be out of focus, which may lead to increased background blur and misidentification, misclassification, or miscounting of the components.
In the graphical representations 801 and 803, ground truths of the tissue, cellular, or sub-cellular components are illustrated as circles and detected positions of the components are illustrated as plus signs. As illustrated in the graphical representations 801 of the simulations of the microscope that did not include the optical element, in all but one of the simulations, all of the components were out of focus and not detected due to the relatively small depth of field. In simulation 3 using the microscope that did not include the optical element, one out of three of the components was in focus and was detected. However, as illustrated in the graphical representations 803 of the simulations of the microscope that included the optical element, three out of three components were in focus and two out of three components were detected for each of the simulations due to the extended depth of field.
The workflow 1001 illustrates the workflow and the images 1005a-d captured or generated using the standard objective lens. The workflow 1001 may include one or more blocks 1009, 1011, 1013, 1015, or 1017. At block 1009, a first image 1005a may be captured. At block 1011, a Z-stage may be stepped and a second image 1005b may be captured. At block 1013, another Z-stage may be stepped and another image 1005c may be captured. The workflow 1001 may repeat block 1013 to capture N images. At block 1015, the images 1005a-c may be stored and combined as a combined image 1005d (e.g., an N-slice maximum intensity projection (MIP)). At block 1017, spots in the combined image 1005d may be located. As illustrated in
The workflow 1003 illustrates the steps and the image 1007 captured using the DF objective lens. The workflow 1003 may include one or more blocks 1019, 1021, or 1023. At block 1019, the image 1007 may be captured. At block 1021, the image 1007 may be saved and stored. At block 1023, spots in the image 1007 may be located. As illustrated in
At block 1102, data representative of a biological sample within an extended depth of field may be received. At block 1104, information corresponding to tissue, cellular, or sub-cellular components within the biological sample may be extracted from the data representative of the biological sample within the extended depth of field. At block 1106, 'omics information may be determined based on the extracted information.
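The three blocks of method 1100 can be sketched end to end. The example below is an assumed, simplified pipeline for illustration: it receives a single extended-depth-of-field image, extracts component locations by segmenting bright regions, and derives a simple 'omics readout (a molecule count). The function name, threshold, and the choice of count-as-readout are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def analyze_sample(image, threshold=0.5):
    """Sketch of method 1100: receive image data (block 1102), extract
    component locations (block 1104), and derive a simple 'omics
    readout -- here, a molecule count (block 1106)."""
    # Block 1104: segment bright regions and take each region's centroid.
    labels, n = ndimage.label(image > threshold)
    centers = ndimage.center_of_mass(image, labels, range(1, n + 1))
    # Block 1106: treat the spot count as a per-image expression estimate.
    return {"locations": centers, "count": n}

img = np.zeros((32, 32))
img[5, 5] = img[20, 25] = 1.0  # two hypothetical in-focus components
result = analyze_sample(img)
```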
Modifications, additions, or omissions may be made to the method 1100 without departing from the scope of the present disclosure. For example, the operations of method 1100 may be implemented in differing order. Additionally or alternatively, two or more operations may be performed at the same time. Furthermore, the outlined operations and actions are only provided as examples, and some of the operations and actions may be optional, combined into fewer operations and actions, or expanded into additional operations and actions without detracting from the essence of the described embodiments.
In general, the processor 1150 may include any suitable special-purpose or general-purpose computer, computing entity, or processing device including various computer hardware or software modules and may be configured to execute instructions stored on any applicable computer-readable storage media. For example, the processor 1150 may include a microprocessor, a microcontroller, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a Field-Programmable Gate Array (FPGA), or any other digital or analog circuitry configured to interpret and/or to execute program instructions and/or to process data. Although illustrated as a single processor in
In some embodiments, the processor 1150 may be configured to interpret and/or execute program instructions and/or process data stored in the memory 1152, the data storage 1154, or both. In some embodiments, the processor 1150 may fetch program instructions from the data storage 1154 and load the program instructions into the memory 1152. After the program instructions are loaded into the memory 1152, the processor 1150 may execute the program instructions such that the computing system 1200 may implement the operations as directed by the instructions.
The memory 1152 and the data storage 1154 may include computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable storage media may include any available media that may be accessed by a general-purpose or special-purpose computer, such as the processor 1150. By way of example, and not limitation, such computer-readable storage media may include tangible or non-transitory computer-readable storage media including Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory devices (e.g., solid state memory devices), or any other storage medium which may be used to carry or store particular program code in the form of computer-executable instructions or data structures and which may be accessed by a general-purpose or special-purpose computer. Combinations of the above may also be included within the scope of computer-readable storage media. Computer-executable instructions may include, for example, instructions and data configured to cause the processor 1150 to perform a certain operation or group of operations.
The communication unit 1156 may include any component, device, system, or combination thereof that is configured to transmit or receive information over a network. In some embodiments, the communication unit 1156 may communicate with other devices at other locations, the same location, or even other components within the same system. For example, the communication unit 1156 may include a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device (such as an antenna), and/or chipset (such as a Bluetooth® device, an 802.6 device (e.g., Metropolitan Area Network (MAN)), a WiFi device, a WiMax device, cellular communication facilities, etc.), and/or the like. The communication unit 1156 may permit data to be exchanged with a network and/or any other devices or systems described in the present disclosure.
Modifications, additions, or omissions may be made to the computing system 1200 without departing from the scope of the present disclosure. For example, in some embodiments, the computing system 1200 may include any number of other components that may not be explicitly illustrated or described.
Terms used in the present disclosure and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open terms” (e.g., the term “including” should be interpreted as “including, but not limited to.”).
Additionally, if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is expressly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” or “one or more of A, B, and C, etc.” is used, in general such a construction is intended to include A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B, and C together, etc.
Further, any disjunctive word or phrase preceding two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both of the terms. For example, the phrase “A or B” should be understood to include the possibilities of “A” or “B” or “A and B.”
All examples and conditional language recited in the present disclosure are intended for pedagogical objects to aid the reader in understanding the present disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Although embodiments of the present disclosure have been described in detail, various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the present disclosure.
Embodiments described in the present disclosure may be implemented using computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media may be any available media that may be accessed by a general-purpose or special purpose computer. By way of example, and not limitation, such computer-readable media may include non-transitory computer-readable storage media including Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory devices (e.g., solid state memory devices), or any other storage medium which may be used to carry or store desired program code in the form of computer-executable instructions or data structures and which may be accessed by a general-purpose or special purpose computer. Combinations of the above may also be included within the scope of computer-readable media.
Computer-executable instructions may include, for example, instructions and data, which cause a general-purpose computer, special purpose computer, or special purpose processing device (e.g., one or more processors) to perform a certain function or group of functions. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are described as example forms of implementing the claims.
This patent application claims the benefit of and priority to U.S. Provisional App. No. 63/371,193 filed Aug. 11, 2022, titled “PHASE ENGINEERING AND COMPUTATIONAL RECOVERY FOR SPATIAL 'OMICS” and to U.S. Provisional App. No. 63/479,908 filed Jan. 13, 2023, titled “METHOD OF USING PHASE ENGINEERING AND COMPUTATIONAL RECOVERY FOR SPATIAL 'OMICS,” each of which is incorporated in the present disclosure by reference in its entirety.
Number | Date | Country
---|---|---
63371193 | Aug 2022 | US
63479908 | Jan 2023 | US