This disclosure relates generally to optics, and in particular to polarizers.
Optical components in devices include refractive lenses, diffractive lenses, color filters, neutral density filters, and polarizers. Linear and circular polarizers are commonplace in both commercial and consumer systems and devices, for example. Wire-grid polarizers are a common type of polarizer.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of liquid crystal polarizers are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Throughout this specification, several terms of art are used. These terms are to take on their ordinary meaning in the art from which they come, unless specifically defined herein or the context of their use would clearly suggest otherwise.
In some implementations of the disclosure, the term “near-eye” may be defined as including an element that is configured to be placed within 50 mm of an eye of a user while a near-eye device is being utilized. Therefore, a “near-eye optical element” or a “near-eye system” would include one or more elements configured to be placed within 50 mm of the eye of the user.
In aspects of this disclosure, visible light may be defined as having a wavelength range of approximately 380 nm-700 nm. Non-visible light may be defined as light having wavelengths that are outside the visible light range, such as ultraviolet light and infrared light. Infrared light having a wavelength range of approximately 700 nm-1 mm includes near-infrared light. In aspects of this disclosure, near-infrared light may be defined as having a wavelength range of approximately 700 nm-1.4 μm.
In aspects of this disclosure, the term “transparent” may be defined as having greater than 90% transmission of light. In some aspects, the term “transparent” may be defined as a material having greater than 90% transmission of visible light.
Wire-grid polarizers are traditionally used in products for infrared applications. However, micropatterned wire-grid polarizers (1) have limited spatial resolution, (2) perform poorly at visible wavelengths, (3) require complicated lithographic processing, and (4) are susceptible to defects.
In this disclosure, a liquid crystal polarizer (LCP) fabricated by photoalignment of absorbing materials is disclosed as an alternative to creating patterned polarizers (e.g. micro-patterned wire-grid polarizers) for particular imaging systems. The LCP may be fabricated with polymers and photoalignment of absorbing materials. The photoalignment of absorbing materials in polymers can produce micron-sized polarizers of high efficiency and extinction for ultraviolet (UV), visible, and near-infrared (NIR) wavelengths. In some implementations, the absorbing materials have dimensions of less than 10 microns. In some implementations, the features may be as small as 2.5 microns. In some implementations, the LCP includes twisted liquid crystals. In some implementations, the LCP includes untwisted liquid crystals. In some implementations, the LCP includes both twisted liquid crystals and untwisted liquid crystals.
In implementations of the disclosure, a CMOS sensor with liquid crystal polarizers (LCPs) is disclosed that allows for full or partial Stokes imaging (e.g.
An implementation of the disclosure includes an optical sensor with a patterned liquid crystal polarizer on top of a photo-sensitive region with photodiode(s) beneath it to measure Stokes parameters for polarization imaging. Above the patterned LCP, there can be a light-guiding element (e.g., a microlens) to improve optical efficiency. Between the patterned LCP and the photo-sensitive region, there can be optional optical structures including a filter, a high-absorption protrusion, back-side metals, deep trench isolation, and a polarization-sensitive element. Deep trench isolation (DTI) can be added around the boundaries of the photo-sensitive region (e.g. silicon) for each pixel to reduce crosstalk between pixels.
Another implementation of the disclosure includes an optical sensor with a patterned liquid crystal polarizer on top of a photo-sensitive region with photodiode(s) beneath it to measure partial Stokes parameters for polarization difference imaging. Above the patterned LCP, there can be a light-guiding element (e.g., a microlens) to improve optical efficiency. Between the patterned LCP and the photo-sensitive region, there can be optional optical structures including a filter, a high-absorption protrusion, back-side metals, deep trench isolation, and a polarization-sensitive element.
Some implementations of the disclosure may include an LC-PBP lens disposed over photodiode(s) to measure components of RHC and LHC for polarization imaging. These and other embodiments are described in more detail in connection with
In operation, imaging light 190 is incident on subpixel 101 and microlens 140A focuses the imaging light 190 to semiconductor substrate region 110A. 0-degree (vertical) polarizer 131 passes the vertically polarized portion 191 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. Vertically polarized portion 191 of imaging light 190 becomes incident on semiconductor substrate region 110A and generates a first imaging signal 181 in response to the intensity of the vertically polarized portion 191 of imaging light 190.
In operation, imaging light 190 is incident on subpixel 102 and microlens 140B focuses the imaging light 190 to semiconductor substrate region 110B. 45-degree polarizer 132 passes the 45-degree polarized portion 192 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. 45-degree polarized portion 192 of imaging light 190 becomes incident on semiconductor substrate region 110B and generates a second imaging signal 182 in response to the intensity of the 45-degree polarized portion 192 of imaging light 190.
In operation, imaging light 190 is incident on subpixel 103 and microlens 140C focuses the imaging light 190 to semiconductor substrate region 110C. 90-degree (horizontal) polarizer 133 passes the horizontally polarized portion 193 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. Horizontally polarized portion 193 of imaging light 190 becomes incident on semiconductor substrate region 110C and generates a third imaging signal 183 in response to the intensity of the horizontally polarized portion 193 of imaging light 190.
In operation, imaging light 190 is incident on subpixel 104 and microlens 140D focuses the imaging light 190 to semiconductor substrate region 110D. 135-degree polarizer 134 passes the 135-degree polarized portion 194 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. 135-degree polarized portion 194 of imaging light 190 becomes incident on semiconductor substrate region 110D and generates a fourth imaging signal 184 in response to the intensity of the 135-degree polarized portion 194 of imaging light 190.
In operation, imaging light 190 is incident on subpixel 105 and microlens 140E focuses the imaging light 190 to semiconductor substrate region 110E. RHC polarizing layer 160 passes the RHC polarized portion 195 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. RHC polarized portion 195 of imaging light 190 becomes incident on semiconductor substrate region 110E and generates a fifth imaging signal 185 in response to the intensity of the RHC polarized portion 195 of imaging light 190.
In operation, imaging light 190 is incident on subpixel 106 and microlens 140F focuses the imaging light 190 to semiconductor substrate region 110F. LHC polarizing layer 170 passes the LHC polarized portion 196 of imaging light 190 and blocks/rejects other polarizations of imaging light 190. LHC polarized portion 196 of imaging light 190 becomes incident on semiconductor substrate region 110F and generates a sixth imaging signal 186 in response to the intensity of the LHC polarized portion 196 of imaging light 190.
Hence, subpixels 105 and 106 of
S0 = Horizontal + Vertical

S1 = Horizontal − Vertical

S2 = 45° − 135°

S3 = LHC − RHC
Those skilled in the art will appreciate that the reference coordinate system for "vertical," 45-degree, "horizontal," and 135-degree can be rotated arbitrarily in different implementations as long as the angles of transmission differ by 45 degrees from each other. In addition, there may be a margin range for each polarization orientation. For example, the term "45-degree linearly polarized light" may include 40 degree to 50 degree linearly polarized light and the term "135-degree linearly polarized light" may include 130 degree to 140 degree linearly polarized light.
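The Stokes-parameter relationships above can be sketched in code. The function below is an illustrative sketch only; the variable names are hypothetical and any calibration, gain matching, or normalization of the imaging signals is omitted. Note that in this disclosure 0 degrees is labeled "vertical" and 90 degrees "horizontal."

```python
# Illustrative sketch: full-Stokes parameters from the six subpixel
# intensity signals described above. Names are hypothetical; signal
# calibration and scaling are omitted.

def stokes_parameters(i_0, i_45, i_90, i_135, i_rhc, i_lhc):
    """Return (S0, S1, S2, S3) from polarized intensity measurements.

    i_0, i_45, i_90, i_135 -- intensities behind the 0-degree (vertical),
                              45-degree, 90-degree (horizontal), and
                              135-degree linear polarizers
    i_rhc, i_lhc           -- intensities behind the right- and left-hand
                              circular polarizing layers
    """
    s0 = i_0 + i_90    # Horizontal + Vertical (total intensity)
    s1 = i_90 - i_0    # Horizontal - Vertical
    s2 = i_45 - i_135  # 45-degree - 135-degree
    s3 = i_lhc - i_rhc # LHC - RHC, per the convention above
    return s0, s1, s2, s3
```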
Region 03 of LCP 301 is configured to pass LHC polarized light to a photodiode disposed below region 03 and region 08 of LCP 301 is configured to pass RHC polarized light to a photodiode disposed below region 08. Subpixels disposed below regions 04 (X) and 07 (X) of LCP 301 may be configured to sense infrared light, visible light, and/or specific bandwidths of visible light and infrared light. In an implementation, at least one of region 04 or region 07 is configured to sense horizontally polarized light and vertically polarized light to generate an intensity signal. A refractive microlens 342 may optionally be disposed over regions 03, 04, 07, and 08 to focus imaging light to the subpixels.
LCP 401 includes regions 01, 02, 03, 04, 05, 06, 07, 08, 09, 10, 11, 12, 13, 14, 15, and 16. The configuration of each region is notated similarly to the notation of the regions of LCP 301 (e.g. 0, 90, X). A refractive microlens 441 may be optionally disposed over regions 01, 02, 05, and 06 of LCP 401 to focus imaging light to the subpixels. A refractive microlens 442 may optionally be disposed over regions 03, 04, 07, and 08 of LCP 401 to focus imaging light to the subpixels.
In operation, control logic 708 drives image pixel array 702 to capture an image. Image pixel array 702 may be configured to have a global shutter or a rolling shutter, for example. Each subpixel may be configured in a 3-transistor (3T) or 4-transistor (4T) readout circuit configuration. Processing logic 712 is configured to receive the imaging signals from each subpixel. Processing logic 712 may perform further operations such as subtracting some imaging signals from, or adding them to, other imaging signals. For example, determining a Stokes parameter may require adding or subtracting imaging signals from various subpixels. Processing logic 712 may be configured to generate a partial-Stokes image 715 in response to first signals 181, second signals 182, third signals 183, and fourth signals 184 from all the subpixels in image pixel array 702. In an implementation where LCP 301 is disposed over image pixel array 702, processing logic 712 may be configured to generate a full-Stokes image 715 in response to first signals 181, second signals 182, third signals 183, fourth signals 184, fifth signals 185/281, and sixth signals 186/282 from all the subpixels in image pixel array 702. Processing logic 712 may also be configured to assist in generating a PDI image 715 where LCP 401, 501, or 601 are disposed over image pixel array 702.
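As an illustrative sketch of the polarization-difference imaging (PDI) operation that processing logic 712 might perform, the value for one pixel can be formed as a normalized difference of two orthogonal linear polarization channels. The function name and the choice of normalization are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative sketch of a per-pixel polarization-difference computation.
# The normalization (dividing by the summed intensity) is an assumed
# design choice; some systems use the raw difference instead.

def pdi(i_0, i_90, eps=1e-12):
    """Normalized difference of orthogonal linear polarization intensities.

    i_0, i_90 -- intensities from the 0-degree and 90-degree polarized
                 subpixels of one pixel
    eps       -- small constant to avoid division by zero in dark pixels
    """
    return (i_0 - i_90) / (i_0 + i_90 + eps)
```

Applying this to every pixel pair of the array would yield a PDI image in which unpolarized regions map near zero and strongly polarized regions map toward plus or minus one.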
Image pixel array 802 includes an on-axis pixel 852. On-axis pixel 852 may be disposed in a center of image pixel array 802 and may receive the image light from a middle of focusing element 815. Image pixel 852 includes a first subpixel 812A and a second subpixel 812B. First subpixel 812A is configured to receive RHC polarized light and second subpixel 812B is configured to receive LHC polarized light. Microlens 842 may be configured to focus light to first subpixel 812A and second subpixel 812B.
Image pixel array 802 also includes off-axis pixel 851 disposed closer to an outside boundary of the image pixel array 802 than on-axis pixel 852. Image pixel 851 includes a first subpixel 811A and a second subpixel 811B. First subpixel 811A is configured to receive RHC polarized light and second subpixel 811B is configured to receive LHC polarized light. Microlens 841 may be configured to focus light to first subpixel 811A and second subpixel 811B.
A contiguous LC-PBP 830 may be disposed over subpixels 811A, 811B, 812A, 812B (and all the image pixels in image pixel array 802). LC-PBP 830 may be configured similarly to LC-PBP 230 of
Hence,
First subpixel 911A is configured to receive RHC polarized light and second subpixel 911B is configured to receive LHC polarized light. PBP lens 941 may be configured to focus image light to subpixels 911A and 911B while also being configured with the functionality of LC-PBP 230 of
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The term “processing logic” (e.g. processing logic 712) in this disclosure may include one or more processors, microprocessors, multi-core processors, Application-specific integrated circuits (ASIC), and/or Field Programmable Gate Arrays (FPGAs) to execute operations disclosed herein. In some embodiments, memories (not illustrated) are integrated into the processing logic to store instructions to execute operations and/or store data. Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
A “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures. The “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
Networks may include any network or network system such as, but not limited to, the following: a peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; a wireless network; a wired network; a wireless and wired combination network; and a satellite network.
Communication channels may include or be routed through one or more wired or wireless communications utilizing IEEE 802.11 protocols, Bluetooth, SPI (Serial Peripheral Interface), I2C (Inter-Integrated Circuit), USB (Universal Serial Bus), CAN (Controller Area Network), cellular data protocols (e.g. 3G, 4G, LTE, 5G), optical communication networks, Internet Service Providers (ISPs), a peer-to-peer network, a Local Area Network (LAN), a Wide Area Network (WAN), a public network (e.g. "the Internet"), a private network, a satellite network, or otherwise.
A computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise. A server computer may be located remotely in a data center or be stored locally.
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine (e.g., computer) readable storage medium, that when executed by a machine will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
A tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
This application claims priority to U.S. provisional Application No. 63/218,605 filed Jul. 6, 2021, which is hereby incorporated by reference.