This invention relates to optical sensing devices, and in particular, it relates to optical sensing devices based on light intensity detectors integrated with nanostructures.
Spectrometers using an integrated frequency filter and light sensor structure have been described. For example, U.S. Pat. Appl. Pub. No. 20150211922, Jul. 30, 2015, describes a spectrometer which “employs multiple filters having complex filter spectra that can be generated robustly from received light over short optical path lengths. The complex filter spectra provide data that can be converted to a spectrum of the received light using compressed sensing techniques.” (Id., Abstract). More specifically, the spectrometer includes “a frequency filter receiving light and modifying the light according to a set of different filter spectra each defining a frequency-dependent attenuation of the received light to provide a corresponding set of filtered light beams each associated with a different filter spectra; a broadband light detector receiving the set of filtered light beams to provide a corresponding set of independent measures of each filtered light beam; an electronic computer executing a program stored in non-transient memory to receive the independent measures of the filtered light beams to generate a spectrum derived from the set of independent measures, the spectrum indicating intensity as a function of frequency for different light frequencies over a range of frequencies; wherein each different filter spectra is a broadband spectrum with substantially non-periodic variations in value as a function of frequency.” (Id., claim 1.) It also describes a method of measuring a spectrum using such a spectrometer, which includes “(a) illuminating a sample material to obtain multiple independent measures of each filtered light beam; (b) comparing the multiple independent measures of each light signal to known different filter spectra to produce partial spectra indicating selective frequency attenuation of a broadband light signal by the filter spectra and the sample material; and (c) combining the partial spectra into the spectrum.” (Id., claim 15.)
U.S. Pat. Appl. Pub. No. 20140146207, May 29, 2014, describes a “solid-state image sensor and an imaging system which are capable of providing a solid-state image sensor and an imaging system which are capable of realizing a spectroscopic/imaging device for visible/near-infrared light having a high sensitivity and high wavelength resolution, and of achieving two-dimensional spectrum mapping with high spatial resolution. There are provided a two-dimensional pixel array, and a plurality of types of filters that are arranged facing a pixel region of the two-dimensional pixel array, the filters each including a spectrum function and a periodic fine pattern shorter than a wavelength to be detected, wherein each of the filters forms a unit which is larger than the photoelectric conversion device of each pixel on the two-dimensional pixel array, where one type of filter is arranged for a plurality of adjacent photoelectric conversion device groups, wherein the plurality of types of filters are arranged for adjacent unit groups to form a filter bank, and wherein the filter banks are arranged in a unit of N×M, where N and M are integers of one or more, facing the pixel region of the two-dimensional pixel array.” (Id., Abstract.)
WIPO Pub. No. WO2013064510 describes “A spectral camera for producing a spectral output [which] has an objective lens for producing an image, a mosaic of filters for passing different bands of the optical spectrum, and a sensor array arranged to detect pixels of the image at the different bands passed by the filters, wherein for each of the pixels, the sensor array has a cluster of sensor elements for detecting the different bands, and the mosaic has a corresponding cluster of filters of different bands, integrated on the sensor element so that the image can be detected simultaneously at the different bands. The filters are first order Fabry-Perot filters, which can give any desired passband to give high spectral definition.” (Id., Abstract.)
Z. Wang et al., Spectral analysis based on compressive sensing in nanophotonic structures, Optics Express, Vol. 22, No. 21, 25608-25614, 13 Oct. 2014 (“Wang et al. 2014”), describes a “method of spectral sensing based on compressive sensing . . . . The random bases used in compressive sensing are created by the optical response of a set of different nanophotonic structures, such as photonic crystal slabs. The complex interferences in these nanostructures offer diverse spectral features suitable for compressive sensing.” (Id., Abstract.)
The present invention is directed to an optical sensing device and related fabrication method that substantially obviates one or more of the problems due to limitations and disadvantages of the related art.
One object of the present invention is to provide improved sensors for high resolution hyperspectral imaging.
Another object of the present invention is to provide multi-modal optical sensing devices for simultaneously sensing spectral information and one or more of polarization, angle, and phase information of the incident light field.
Additional features and advantages of the invention will be set forth in the descriptions that follow and in part will be apparent from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
To achieve the above objects, the present invention provides an optical sensing device, which includes: a light intensity detector; and a three-dimensional nanostructure integrated above the light intensity detector, the three-dimensional nanostructure having feature sizes in all three dimensions comparable to a wavelength range of an incident light to be detected. In some embodiments, the three-dimensional nanostructure includes a plurality of layers of two-dimensional nanostructures.
In another aspect, the present invention provides an optical sensing device, which includes: a plurality of light intensity detectors disposed adjacent to each other; and a plurality of nanostructures each integrated above a corresponding one of the plurality of light intensity detectors, wherein the nanostructures for different light intensity detectors have different feature sizes and locations, and wherein the nanostructure for each light intensity detector has an intrinsic anisotropy which includes either an anisotropy in geometries of the nanostructure or an anisotropy in a material that forms the nanostructure or both, and the nanostructures for different light intensity detectors have different intrinsic anisotropies.
In yet another aspect, the present invention provides an optical sensing device which includes: a plurality of light intensity detectors disposed adjacent to each other within an area; and a plurality of nanostructures each integrated above a corresponding one of the plurality of light intensity detectors, wherein the plurality of light intensity detectors includes a first group of light intensity detectors located in a first region of the area and a second group of light intensity detectors located in a second region of the area, wherein the nanostructures for different ones of the second group of light intensity detectors are configured to have different spectral responses to an incident light and low sensitivities to an angle of the incident light, and the nanostructures for different ones of the first group of light intensity detectors are configured to have different responses to the angle of the incident light.
In yet another aspect, the present invention provides an optical sensing device which includes: a plurality of light intensity detectors disposed adjacent to each other; a plurality of nanostructures each integrated above a corresponding one of the plurality of light intensity detectors, wherein the plurality of light intensity detectors and corresponding nanostructures form an array of identical subunits, each subunit including an array of light intensity detectors and corresponding nanostructures, wherein within each subunit, the nanostructures for different light intensity detectors are different; a lenslet array disposed above the plurality of nanostructures, the lenslet array having a pitch larger than a pitch of the plurality of subunits in the array of subunits; and a transparent optical medium disposed in a space between the plurality of nanostructures and the lenslet array.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
Embodiments of the present invention provide optical sensing devices based on broadband light intensity detectors integrated with nanostructures, where the nanostructures, located above the light intensity detectors, have feature sizes in at least one dimension comparable to the wavelength of the incident light. CMOS (complementary metal-oxide-semiconductor) image sensors are preferably used as the light intensity detectors and are assumed in the descriptions below, but other image sensors such as CCD image sensors may be used as well.
In some embodiments, the optical sensing device is multi-modal, i.e., it has the capabilities of simultaneously detecting multiple parameters of the incident light field, including spectral, polarization, angle, and phase information.
In some embodiments, by extending the nanostructures into the direction (z direction) normal to the CMOS sensor surface (xy directions), the design space is greatly expanded. Variations of structural geometries in the two-dimensional plane parallel to the sensor surface are combined with those in the z direction to create more optical filters.
The nanostructures allow multiple scatterings and interferences of the light in the direction normal to the sensor surface (z direction) in addition to the direction parallel to the sensor surface (xy plane). This effectively increases the optical path length, encoding the light intensity detected by the light intensity detector with complex patterns that reveal more information about the incident light field.
To exploit the multiple scatterings and interferences of light from the nanostructures, the feature size of the nanostructures is preferably comparable to the representative wavelength of the incident light, preferably within 1/10 to 5 times the representative wavelength. The nanostructures are preferably placed within 10 μm of the surface of the CMOS image sensor, so that the photocurrent of the CMOS image sensor records the optical response of the nanostructures.
To describe the structure of the optical sensing device, three levels of “pixels” are defined here: spatial pixels, sampling pixels, and detector pixels. A detector pixel is a single photocurrent-generating unit, for example a photodiode and its associated transistors and circuitry in a CMOS image sensor. Each sampling pixel performs one sampling of one or more types of optical information; for example, a sampling pixel may consist of a specific nanostructure on top of one or multiple detector pixels, and the outputs of these detector pixels, which are light intensities, are used to reconstruct specific spectral, polarization, angular, and/or phase information of the incident light. A spatial pixel is a set of sampling pixels in which the nanostructures of different sampling pixels are different; the outputs from all the sampling pixels collectively generate a reconstruction of the spectral, polarization, angle and/or phase information of the incident light field. A spatial pixel may be used as a stand-alone optical sensor, or as one spatial point of a periodic array in the xy space when used for imaging.
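For purposes of illustration only, the following sketch (in Python, with hypothetical class and field names not used elsewhere in this description) shows one possible way the three pixel levels could be organized in software, where each sampling pixel aggregates the intensities of its detector pixels and each spatial pixel collects the measurements of its sampling pixels:

from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectorPixel:
    # A single photocurrent-generating unit (e.g., one photodiode).
    row: int
    col: int
    intensity: float = 0.0  # measured photocurrent or digital count

@dataclass
class SamplingPixel:
    # One nanostructure plus the detector pixel(s) beneath it; performs
    # one sampling of the incident light field.
    nanostructure_id: int
    detectors: List[DetectorPixel] = field(default_factory=list)

    def measurement(self) -> float:
        # One scalar sample: total intensity recorded under this nanostructure.
        return sum(d.intensity for d in self.detectors)

@dataclass
class SpatialPixel:
    # A set of sampling pixels with mutually different nanostructures; its
    # measurements are jointly inverted to recover spectral, polarization,
    # angle and/or phase information at one image point.
    sampling_pixels: List[SamplingPixel] = field(default_factory=list)

    def measurements(self) -> List[float]:
        return [sp.measurement() for sp in self.sampling_pixels]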
The nanostructure of each of these sampling pixels has a distinct response to, and modulation of, the spectral, polarization, angle and/or phase content of the incident light, which includes different spectral components, polarization states, incident angles and/or phases. For each sampling pixel, a specific nanostructure can be designed, and a light intensity detector is placed under the nanostructure to measure the incident light field intensity after modulation by the nanostructure. Reconstruction algorithms analyze the data measured by an entire spatial pixel in the CMOS sensor to recover the spectrum, polarization states, angle or phase of the incident light. These algorithms exploit the correlations among the photocurrents of the detector pixels of multiple adjacent sampling pixels, which result from the optical responses of those sampling pixels' nanostructures. Responses from at least 2×2 sampling pixels are necessary to constitute a spatial pixel for reconstruction of meaningful information of the incident light field.
The nanostructure can be made of any types of materials that influence the incident light by scattering or absorption, including but not limited to dielectric, metallic and polymeric materials. The embodiments and manufacturing methods described below include specific examples of nanostructures and their preferred materials.
A conventional CMOS sensor used to measure one of the spectral, polarization, angle or phase information of the incident light uses two-dimensional (2D) structures (over the xy plane) to modulate the incident light. A structure is referred to as a 2D structure when it has structural variations distributed in the xy directions but is constant in the third direction.
In some embodiments of the present invention, on the other hand, 3D nanostructures are used, which have structural variations (with nano-scale feature sizes) in all three dimensions. With a 3D nanostructure, resonance in the z direction greatly reduces the need for resonance in the xy plane, which would otherwise require a nanostructure fabricated over more than one photodiode (detector pixel) on the image sensor. As such, in spectral sensing, whereas a conventional 2D structure needs to cover more than one detector pixel to complete one valid measurement of the spectral information of a single point on the image, the 3D nanostructure according to embodiments of the present invention makes it possible to complete this measurement using a single detector pixel. Therefore the sampling pixel can include a single detector pixel, which is the smallest number possible. The footprint of one spatial pixel, which contains a specific set of sampling pixels, is therefore reduced, and higher or the highest possible resolution for imaging can be achieved.
In some embodiments of the present invention, in addition to the improved spectral sensing performance, at least one layer (parallel to the xy plane) of nanostructures can be designed to be sensitive to the angle, phase and/or polarization of the incident light. As such, not only the spectral information, but also the incident angle, polarization states, and/or phase of the incident light are simultaneously encoded in the intensity patterns detected by the CMOS sensor. As such, a single measurement from the CMOS image sensor with an array of spatial pixels includes sufficient information to recover the multi-modal parameters of the light field, without consuming more area in the CMOS sensor surface. This significantly shrinks the footprint of the sensor and improves the spatial resolution of the sensor for imaging.
One group of embodiments of the present invention provide an optical sensing device for high spatial resolution hyperspectral imaging to obtain the spectrum for each pixel in the image of a scene.
In a first embodiment, schematically illustrated in
The 3D nanostructure can be equivalently viewed as being formed by stacking 2D nanostructures layer by layer, each layer containing random 2D nanostructures, and the layer thicknesses may also be randomly chosen. The nanostructures within each sampling pixel may be periodic or non-periodic in the direction perpendicular to the surface of the image sensor. They may also be periodic or non-periodic in the plane of the image sensor. The materials system of the nanostructures contains at least two different materials whose refractive indices have sufficient contrast in the wavelengths of interest to form resonances under the three-dimensional nanostructure design.
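As a purely illustrative sketch of such a randomized layered design (the grid size, layer count, and thickness range are assumptions chosen for illustration, consistent with feature sizes on the order of the wavelength), a stacked specification may be generated as follows:

import numpy as np

rng = np.random.default_rng(seed=1)

def random_layered_nanostructure(n_layers=5, grid=16, wavelength_nm=550.0):
    # Each layer is a random binary 2D pattern of two materials (0 = material A,
    # 1 = material B); layer thicknesses are drawn between roughly 1/10 and
    # 1x a representative wavelength.
    layers = []
    for _ in range(n_layers):
        pattern = rng.integers(0, 2, size=(grid, grid))
        thickness_nm = rng.uniform(0.1, 1.0) * wavelength_nm
        layers.append({"pattern": pattern, "thickness_nm": thickness_nm})
    return layers

# One such specification per sampling pixel of a spatial pixel, e.g. 4x4:
spatial_pixel_design = [random_layered_nanostructure() for _ in range(16)]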
The transmission spectra of each sampling pixel may be characterized (calibrated), for example, by using a wavelength-tunable light source and measuring the response of each sampling pixel, or by other suitable methods. Further description of calibration may be found in, for example, Wang et al. 2014 and Z. Wang et al., Single-Shot On-Chip Spectral Sensors Based on Photonic Crystal Slabs, Nature Communications (2019) 10:1020 (“Wang et al. 2019”). The spectrum of the incident light can be reconstructed from the detected light intensities of the sampling pixels using the data reconstruction method described in Wang et al. 2014 or other suitable methods.
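A minimal sketch of such a reconstruction is given below, assuming a calibrated transmission matrix T whose rows are the transmission spectra of the sampling pixels; it uses a simple L1-regularized non-negative least-squares solver (iterative soft thresholding) as a stand-in for the compressive sensing method of Wang et al. 2014, and all sizes and parameter values are illustrative:

import numpy as np

def reconstruct_spectrum(y, T, lam=1e-2, n_iter=500):
    # y: (m,) measured intensities, one per sampling pixel.
    # T: (m, n) calibrated transmission matrix; row i is the transmission
    #    spectrum of sampling pixel i sampled at n wavelengths.
    # Returns an estimate s of the n-point incident spectrum with y ~ T @ s.
    m, n = T.shape
    s = np.zeros(n)
    L = np.linalg.norm(T, 2) ** 2                 # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = T.T @ (T @ s - y)                  # gradient of 0.5 * ||T s - y||^2
        s = np.maximum(s - grad / L - lam / L, 0.0)   # gradient step + prox (L1, s >= 0)
    return s

# Example with synthetic calibration data (sizes are illustrative):
rng = np.random.default_rng(0)
T = rng.random((36, 200))                         # 6x6 sampling pixels, 200 wavelength bins
s_true = np.zeros(200)
s_true[[50, 120]] = [1.0, 0.6]                    # sparse test spectrum
y = T @ s_true
s_est = reconstruct_spectrum(y, T)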
In a second embodiment, schematically illustrated in
Two exemplary methods for manufacturing a 3D nanostructure according to embodiments of the present invention are described with reference to
Note that in the step of forming a subsequent layer (referred to as the current layer) above a preceding layer, the deposited material of the current layer, e.g. material 32 shown in
In the embodiment shown in
The total number of lithographic cycles equals the number of patterned layers in the 3D nanostructure. Variations to this baseline process can be used to achieve various 3D nanostructure designs; for example, some of the patterning steps may be skipped to form multiple layers of Fabry-Perot resonators in the 3D nanostructure. The method may be monolithically integrated in a typical CMOS process flow.
In a third embodiment, schematically illustrated in
In a fourth embodiment, schematically illustrated in
A manufacturing method according to an embodiment of the present invention, which achieves different dielectric layer thicknesses or etching depths for different sampling pixels of a spatial pixel such as that shown in
More specifically,
Grayscale lithography is described in, for example, MicroChemicals GmbH, Greyscale Lithography with Photoresists, published online at https://www.microchemicals.com/technical_information/greyscale_lithography.pdf, and C. Williams et al., Grayscale-to-color: Single-step fabrication of bespoke multispectral filter arrays, published online at https://arxiv.org/abs/1901.10949.
In a fifth embodiment, schematically illustrated in
A manufacturing method according to an embodiment of the present invention for forming 3D nanostructures with nanoparticles, such as that shown in
In a sixth embodiment, schematically illustrated in
The homogenizer film 92 may be made from dielectric materials with a certain thickness and a surface roughness on the scale of the wavelength of the incident light, to randomize the angle and polarization of the incident light. One method of manufacturing such a homogenizer film on top of nanostructures includes deposition of a layer of dielectric material 92A followed by etching (such as physical sputtering) to create the desired surface roughness, as shown in
In a seventh embodiment, schematically illustrated in three examples in
The sensors in the above-described embodiments may be used for hyperspectral imaging using compressive sensing techniques.
With compressive sensing, since the number of spectral samples is much smaller than the number of unknown points in the spectrum, the reconstruction from the measurements is an under-determined problem. Various methods have been proposed, each based on certain prior knowledge, to reduce the range of possible solutions. One commonly used prior is spectral sparsity, which assumes that the spectrum of the sample is sparse, containing zeros or very small values over most regions of the spectrum. Under this assumption, the optimization process minimizes the L1 norm of the reconstructed spectrum. This approach is described in Wang et al. 2014. Another prior is smoothness, which assumes that the spectrum of the sample is largely smooth. The derivatives of the reconstructed spectrum can be used to regularize the optimization process to enforce the smoothness constraint. This approach is described in Wang et al. 2019. However, in many applications the sample spectrum is neither purely sparse nor purely smooth, so neither of the aforementioned methods yields a satisfactory reconstruction.
According to an embodiment of the present invention, a new spectral reconstruction method combines the sparsity and smoothness criteria and jointly minimizes the L1 and L2 norms and the derivatives (specifically the second derivative) of the reconstructed spectrum.
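One possible formulation of this joint criterion is sketched below, where the data term is combined with an L1 term (sparsity), an L2 term, and a squared second-derivative term (smoothness), and solved by proximal gradient descent; the weights alpha, beta, gamma and the choice of solver are illustrative assumptions rather than prescribed values:

import numpy as np

def second_difference_matrix(n):
    # (n - 2) x n matrix whose rows compute discrete second derivatives.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def reconstruct_joint(y, T, alpha=1e-2, beta=1e-3, gamma=1e-1, n_iter=1000):
    # Minimize 0.5*||T s - y||^2 + alpha*||s||_1 + beta*||s||_2^2
    # + gamma*||D2 s||_2^2 subject to s >= 0, by proximal gradient descent.
    m, n = T.shape
    D = second_difference_matrix(n)
    s = np.zeros(n)
    # Lipschitz constant of the gradient of the smooth terms.
    L = np.linalg.norm(T, 2) ** 2 + 2 * beta + 2 * gamma * np.linalg.norm(D, 2) ** 2
    for _ in range(n_iter):
        grad = T.T @ (T @ s - y) + 2 * beta * s + 2 * gamma * (D.T @ (D @ s))
        s = np.maximum(s - grad / L - alpha / L, 0.0)   # prox of alpha*||s||_1 with s >= 0
    return s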
In addition to the aforementioned general-purpose spectrum reconstruction method, better reconstruction results may be achieved using reconstruction methods that are optimized for specific tasks. With more constraints associated with the purpose of each task, additional prior knowledge may be obtained and used to facilitate the spectral reconstruction. In one approach according to an embodiment of the present invention, deep learning techniques are used to automatically learn the different types of prior knowledge and improve the optimization process. Under this approach, a sufficient number of sample spectra are collected, either through experiments or through synthesis by simulation, and fed into a neural network. Various types of neural network may be used, such as convolutional neural networks or residual neural networks. The output of the neural network indicates the most appropriate parameters to use for the optimization problem, such as the weights of the sparsity and smoothness terms. Alternatively, an end-to-end pipeline can be trained to reconstruct the spectra directly from the raw measurements using deep learning. This process may be repeated for better performance on designated types of spectral sensing tasks.
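As an illustrative sketch of the first variant (predicting optimization parameters from raw measurements), the following assumes a small fully-connected network in PyTorch, a hypothetical input size of 36 sampling-pixel measurements, and training targets consisting of regularization weights found to work well (e.g., by grid search on simulated spectra); none of these choices are prescribed by the invention:

import torch
from torch import nn

# Hypothetical sizes: 36 sampling-pixel measurements in, 2 positive weights out
# (e.g., the sparsity weight alpha and smoothness weight gamma of the solver above).
model = nn.Sequential(
    nn.Linear(36, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2), nn.Softplus(),       # Softplus keeps the predicted weights positive
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(measurements, best_weights):
    # measurements: (batch, 36) raw sampling-pixel intensities.
    # best_weights: (batch, 2) regularization weights found to work best for each
    # training spectrum (e.g., by grid search on simulated or measured data).
    optimizer.zero_grad()
    loss = loss_fn(model(measurements), best_weights)
    loss.backward()
    optimizer.step()
    return loss.item()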
Another group of embodiments of the present invention provide optical sensing devices for simultaneous sensing of the spectrum and another property of the incident light, such as polarization or angle, at each spatial location, thereby achieving multi-modal sensing.
In an eighth embodiment, schematically illustrated in
In a ninth embodiment, schematically illustrated in
An alternative embodiment may combine the eighth and ninth embodiments, i.e., the nanostructures are formed in compound materials which have intrinsic anisotropy and, at the same time, their geometry has an intrinsic anisotropy.
Methods of producing arrays of aligned nanowires are known, and any suitable method may be used to implement this embodiment.
A method for manufacturing the structures of polymeric films embedded with nanowires of certain orientations, such as that shown in
In an alternative manufacturing method (not shown), nanowires are grown on a substrate with pre-formed grooves which guide the nanowire orientation. The nanowires may be grown directly on top of the sensor, or grown on a separate substrate and then transferred on top of the sensor.
The principle of detecting polarization states of light is generally known. See, for example, Y. Maruyama et al., 3.2-MP Back-Illuminated Polarization Image Sensor With Four-Directional Air-Gap Wire Grid and 2.5-μm Pixels, IEEE Transactions on Electron Devices, Vol. 65, No. 6, June 2018; and D. Kwon et al., Optical planar chiral metamaterial designs for strong circular dichroism and polarization rotation, Vol. 16, No. 16/Optics Express 11802, 4 Aug. 2008. The principles may be applied to process the data measured by the polarization and spectral sampling pixels in the above embodiments to obtain the polarization of the incident light.
In a tenth embodiment, schematically illustrated in
In implementation, the angle-insensitive sampling pixels may be realized by reducing the size of each sampling pixel as compared to the size of the angle-sensitive sampling pixels. It is well known that photonic crystals have angle-sensitive spectral responses; however, it is believed that by reducing the overall lateral size of the nanostructure, the angle sensitivity will be reduced.
The measured data from the angle-sensitive sampling pixels are compared to the angle-insensitive measurements to retrieve the incident angle information of the incident light. More specifically, the transmission spectra of the angle-sensitive pixels at different angles are obtained through a wavelength and angle scanning calibration process. The spectrum of the incident light is reconstructed from the measurements of the angle-insensitive pixels (an example is schematically illustrated in
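A minimal sketch of this comparison is shown below, assuming the angle-dependent calibration is stored as one transmission matrix per candidate angle; the incident angle is estimated as the candidate whose calibrated response best predicts the angle-sensitive measurements (the names and data layout are illustrative):

import numpy as np

def estimate_incident_angle(y_angle, T_by_angle, s_est):
    # y_angle:    (k,) intensities from the k angle-sensitive sampling pixels.
    # T_by_angle: dict mapping candidate angle (degrees) -> (k, n) calibrated
    #             transmission matrix of those pixels at that angle.
    # s_est:      (n,) spectrum reconstructed from the angle-insensitive pixels.
    # Returns the candidate angle whose calibrated response best explains y_angle.
    best_angle, best_err = None, np.inf
    for angle, T in T_by_angle.items():
        err = np.linalg.norm(T @ s_est - y_angle)   # prediction residual
        if err < best_err:
            best_angle, best_err = angle, err
    return best_angle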
In an eleventh embodiment, schematically illustrated in
A system (not nanostructure based) that uses a principle similar to that described above to detect angle of incident light is described in R. Ng et al., Light Field Photography with a Hand-held Plenoptic Camera, Stanford Tech Report CTSR 2005-02, 2005.
It will be apparent to those skilled in the art that various modifications and variations can be made in the optical sensing device and related method of the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover modifications and variations that come within the scope of the appended claims and their equivalents.