The present application is an International Application, which claims the benefit of European Patent Application No. EP21306035, entitled “PYRAMIDAL STRUCTURE FOR PLENOPTIC PIXEL,” filed Jul. 26, 2021, which is hereby incorporated by reference in its entirety.
An integrated plenoptic sensor is an imaging sensor in which a microlens is shared between multiple subpixels. An integrated plenoptic sensor records a “4D light-field” which may give indications of the angle of incidence of light. An integrated plenoptic sensor may be used in cameras and smartphones to drive the autofocus of the main lens. An integrated plenoptic sensor may provide additional abilities, such as passive depth mapping, refocusing, or aberration correction.
The present disclosure relates to plenoptic cameras. A plenoptic camera is similar to a common camera with a lens system and a light sensor, with the addition of a micro-lens array over the image sensor. Each micro-lens produces a micro-image on the sensor. The resulting plenoptic image may be referred to as a 4D light field, which gives indications of the sensor and pupil coordinates of the photon trajectory. For later display and processing, the 4D light field may be processed through an operation known as projection into a 2D re-focused image. The projection operation allows for the possibility of tuning the focalization distance.
In some plenoptic cameras, each pixel of the light sensor is covered by a color filter that primarily allows light of one color to reach the corresponding pixel. In some such cameras, the color filters are arranged as a so-called Bayer filter. The conventional Bayer filter allows one color (red, green, or blue) to be recorded by each corresponding pixel. When an image has been captured using a Bayer filter, each pixel has only one associated color value, corresponding to the color of the filter associated with that pixel. From this image, it may be desirable to obtain an image in which each of the pixels has all three color values. This may be done with processing to obtain the two missing color values for each pixel. Such processing techniques are referred to as demosaicing. Demosaicing can be a non-trivial process, particularly for images or regions of images that cover highly textured areas.
Bayer color filters have been used with plenoptic cameras. To process 4D light field images captured with such cameras, demosaicing may be performed concurrently with a 2D refocusing process.
Conventional plenoptic cameras 100 are similar to ordinary 2D cameras with a main lens 102 and the addition of a micro-lens array 104 set just in front of the sensor 106 as illustrated schematically in
Plenoptic cameras record 4D light-field data which can be transformed into various by-products such as re-focused images with freely selected distances of focalization.
The sensor of a light-field camera records an image which is made of a collection of small 2D images arranged within a larger 2D image. Each micro-lens in the array, and each corresponding small micro-lens image generated under that lens, may be indexed by the coordinates (i, j). The pixels of the light field may be associated with four coordinates (x, y, i, j), where (x, y) identifies the location of the pixel in the complete image. The 4D light field recorded by the sensor may be represented by L(x, y, i, j).
where r is the number of consecutive micro-lens images in one dimension, and ⌊·⌋ is the floor function. An object is theoretically visible in r² micro-lens images. Depending on the shape of the micro-lens image, some of the r² views of the object might be invisible.
The distances p and w introduced in the previous sub-section are given in units of pixel size. They can be converted into physical unit distances (e.g., meters), respectively P and W, by multiplying them by the pixel size δ, such that W = δw and P = δp. These distances can vary depending on the light-field camera characteristics.
In an alternative light-field camera design referred to as a type I plenoptic camera 500, the parameters are selected such that the focal length f equals d, the distance between the micro-lens array 504 and the sensor 506. An example of such a design is illustrated in
The replication distance W varies with z, the distance of the object. To establish the relation between W and z, one may refer to the thin lens equation
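The equation itself is not reproduced above. In its usual form, with z the object distance, z' the image distance behind the main lens, and F the focal length of the main lens (these symbol names are assumptions, since the original notation is not preserved), the thin-lens relation reads:

$$\frac{1}{z} + \frac{1}{z'} = \frac{1}{F}.$$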
Combining the previous two equations, one can deduce
The relation between W and z does not assume that the micro-lens images are in focus. Micro-lens images may be in focus when the thin lens equation is satisfied such that
Also, from the Thales (intercept) theorem, one derives P as follows.
The ratio e defines the enlargement between the micro-lens pitch and the micro-lens image pitch. This ratio is very close to 1 since D ≫ d.
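The equation itself is not reproduced above, but a reconstruction consistent with the surrounding definitions (with ϕ the micro-lens pitch in pixels, D the distance from the exit pupil of the main lens to the micro-lens array, and d the distance from the micro-lens array to the sensor, all assumed symbol meanings) is:

$$p = e\,\phi, \qquad e = \frac{D+d}{D} = 1 + \frac{d}{D}, \qquad P = \delta\, p,$$

so that p is very close to ϕ when D ≫ d, matching the statement above.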
Some of the plenoptic cameras as described above have the following properties: the micro-lens array has a square lattice (like the pixel array) and is not rotated relative to the pixels; and the micro-lens image diameter is equal, or almost equal, to an integer number of pixels. These properties are satisfied by most feasible plenoptic sensors. These properties allow for the generation of images known as sub-aperture images.
A sub-aperture image collects all of the 4D light-field pixels having the same relative position within their respective micro-lens image, for example all of the pixels having the same (u, v) coordinates. If the array of micro-lenses has the size I×J, then each sub-aperture image also has size I×J. And if there is a p×p array of pixels under each micro-lens, then there are p×p sub-aperture images. If the number of pixels of the sensor is Nx×Ny, then each sub-aperture image may have the size of Nx/p×Ny/p.
An example of generating a sub-aperture image from a light-field image is as follows. In
The relations between (x, y, i, j) and (α, β, u, v) may be expressed as follows:
where ⌊·⌋ denotes the floor function, and mod denotes the modulo function.
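The expressions themselves are not reproduced above; a reconstruction consistent with the definitions of the sub-aperture images (aligned micro-images of integer pitch p) is:

$$\alpha = \left\lfloor \frac{x}{p} \right\rfloor, \qquad \beta = \left\lfloor \frac{y}{p} \right\rfloor, \qquad u = x \bmod p, \qquad v = y \bmod p,$$

so that S(α, β, u, v) = L(x, y, i, j), with the micro-lens indices (i, j) coinciding with (α, β) in this aligned case.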
If p is not exactly an integer but close to an integer, then the sub-aperture images can be computed by considering the distance between the micro-lens images to be equal to ⌊p⌋, the largest integer not exceeding p. This case occurs especially when the micro-lens diameter ϕ is equal to an integer number of pixels; in that case, p = ϕe is slightly larger than ϕ, since e = (D+d)/D is slightly greater than 1. The advantage of considering ⌊p⌋ is that the sub-aperture images are computed without interpolation, since one pixel L(x, y, i, j) corresponds to an integer-coordinate sub-aperture pixel S(α, β, u, v). The drawback is that the portion of the pupil from which photons are recorded is not constant within a given sub-aperture image S(u, v). As a result, the sub-aperture image S(u, v) does not exactly sample the (u, v) pupil coordinate.
In cases where p is not an integer, or where the micro-lens array is rotated relative to the pixel array, the sub-aperture images may be computed using interpolation, since the centers (x_{i,j}, y_{i,j}) of the micro-lenses are not at integer coordinates.
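As an illustration of the integer-pitch case, the following Python sketch extracts the p × p sub-aperture images by pure re-indexing, with no interpolation. The array layout and names are assumptions, not taken from the original:

```python
import numpy as np

def subaperture_images(lightfield: np.ndarray, p: int) -> np.ndarray:
    """Split a 2D light-field image into its p x p sub-aperture images.

    lightfield: 2D sensor image of shape (Ny, Nx), assumed covered by an
        aligned square grid of micro-images, each p x p pixels.
    Returns S of shape (p, p, Ny // p, Nx // p), where S[v, u] collects,
    from every micro-image, the pixel at relative position (u, v).
    """
    Ny, Nx = lightfield.shape
    Ny, Nx = (Ny // p) * p, (Nx // p) * p  # crop to whole micro-images
    L = lightfield[:Ny, :Nx]
    # Reshape to axes (beta, v, alpha, u), then bring (v, u) to the front.
    return L.reshape(Ny // p, p, Nx // p, p).transpose(1, 3, 0, 2)
```

Each S[v, u] then approximately samples one (u, v) pupil coordinate, subject to the caveat above when p is only close to an integer.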
Within the light-field image L(x, y, i, j), an object is made visible in several micro-images with a replication distance w. In the sub-aperture images, an object is also visible several times. From one sub-aperture image to the next horizontal one, an object coordinate (α, β) appears shifted by the disparity ρ. The relation between ρ and w can be expressed by:
Also, it is possible to establish a relation between the disparity ρ and the distance z of the object by combining equations (5) and (9):
Image refocusing consists of projecting the light-field pixels L(x, y, i, j) recorded by the sensor into a 2D refocused image of coordinates (X, Y). The projection may be performed by shifting the micro-images (i, j):
where w_focus is the selected replication distance corresponding to z_focus, the distance of the objects that appear in focus in the computed refocused image, and s is a zoom factor which controls the size of the refocused image. The value of the light-field pixel L(x, y, i, j) is added to the refocused image at coordinate (X, Y). If the projected coordinate is non-integer, the pixel is added using interpolation. To record the number of pixels projected into the refocused image, a weight-map image having the same size as the refocused image is created. This image is preliminarily set to 0. For each light-field pixel projected onto the refocused image, the value 1.0 is added to the weight map at the coordinate (X, Y). If interpolation is used, the same interpolation kernel is used for both the refocused and the weight-map images. After all of the light-field pixels are projected, the refocused image is divided pixel by pixel by the weight-map image. This normalization step provides brightness consistency of the normalized refocused image.
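As an illustration, the following Python sketch implements this projection-and-normalization procedure. The exact shift equation is not reproduced above: the form X = s(x - w_focus*i), Y = s(y - w_focus*j) used here is an assumption, as is the nearest-neighbor splatting that stands in for the interpolation kernel:

```python
import numpy as np

def refocus(lightfield: np.ndarray, p: int, w_focus: float, s: float = 1.0) -> np.ndarray:
    """Project light-field pixels into a 2D refocused image, with a weight
    map recording how many pixels land on each refocused coordinate."""
    Ny, Nx = lightfield.shape
    out_h, out_w = int(s * Ny), int(s * Nx)
    refocused = np.zeros((out_h, out_w))
    weights = np.zeros((out_h, out_w))      # weight map, preliminarily set to 0

    ys, xs = np.mgrid[0:Ny, 0:Nx]
    js, is_ = ys // p, xs // p              # micro-image indices (j, i) per pixel
    # Assumed projection: align the micro-images by shifting them by w_focus.
    X = np.rint(s * (xs - w_focus * is_)).astype(int)
    Y = np.rint(s * (ys - w_focus * js)).astype(int)

    ok = (X >= 0) & (X < out_w) & (Y >= 0) & (Y < out_h)
    # Add each light-field pixel value, and a weight of 1.0, at (Y, X).
    np.add.at(refocused, (Y[ok], X[ok]), lightfield[ys[ok], xs[ok]])
    np.add.at(weights, (Y[ok], X[ok]), 1.0)

    # Normalization step: divide pixel by pixel for brightness consistency.
    return np.divide(refocused, weights,
                     out=np.zeros_like(refocused), where=weights > 0)
```

Selecting w_focus selects which replication distance, and hence which object distance z_focus, is brought into alignment.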
In another technique for performing refocusing, the refocused image can be computed by summing up the sub-aperture images S(α, β), taking into consideration the disparity ρ_focus for which objects at distance z_focus are in focus.
The sub-aperture pixels are projected onto the refocused image, and a weight map records the contribution of each projected pixel, following the same procedure described above.
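A minimal Python sketch of this second technique, under the same layout assumption as the earlier sub-aperture sketch and restricted to integer shifts (so no interpolation kernel is needed):

```python
import numpy as np

def refocus_from_subapertures(S: np.ndarray, rho_focus: float) -> np.ndarray:
    """Shift-and-add refocusing from sub-aperture images S[v, u]."""
    p = S.shape[0]
    c = (p - 1) / 2.0                   # center of the pupil sampling grid
    acc = np.zeros(S.shape[2:])
    wmap = np.zeros(S.shape[2:])        # weight map, as in the projection above
    for v in range(p):
        for u in range(p):
            # Shift each view back by its disparity so that objects at the
            # selected distance align across views.
            dy = -int(round(rho_focus * (v - c)))
            dx = -int(round(rho_focus * (u - c)))
            acc += np.roll(S[v, u], (dy, dx), axis=(0, 1))
            wmap += 1.0                 # np.roll wraps, so every pixel counts
    return acc / wmap
```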
An example apparatus in accordance with some embodiments may include a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, wherein a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer.
For some embodiments of the example apparatus, the pyramidal structure may be formed to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to increase sensitivity of at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens.
For some embodiments of the example apparatus, the pyramidal structure may be configured to confine light to one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reflect light away from at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure.
For some embodiments of the example apparatus, a focal length of the microlens may be less than a threshold.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may include a dual pixel sensor; and the pyramidal structure may include a triangular structure.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may include a quad pixel sensor.
For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer.
For some embodiments of the example apparatus, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer.
Some embodiments of the example apparatus may further include a color filter system.
For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer; and the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may further include a color filter layer of the color filter system above the material layer.
For some embodiments of the example apparatus, the color filter layer may include a color filter resin.
For some embodiments of the example apparatus, the color filter system may include a planarization layer and a color filter layer.
For some embodiments of the example apparatus, a horizontal facet angle of the pyramidal structure may be 54.7 degrees.
For some embodiments of the example apparatus, a base of the pyramidal structure may be equal to half of a diameter of the microlens.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2×1 grid, and the pyramidal structure may be centered on a line dividing the 2×1 grid in half.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2×2 grid, and the pyramidal structure may be centered on a center of the 2×2 grid.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 3×3 grid, and the pyramidal structure may be centered on a center of the 3×3 grid.
For some embodiments of the example apparatus, the pyramidal structure may be a frustum, and a flat top of the frustum may be configured to match a center subpixel sensor of the 3×3 grid.
For some embodiments of the example apparatus, a tip of the pyramidal structure may be a square.
For some embodiments of the example apparatus, a base of the pyramidal structure may be a disk.
For some embodiments of the example apparatus, a tip of the pyramidal structure may be a disk.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a grid array, and the pyramidal structure may be centered on a center of the grid array.
For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a triangle extruded in at least one direction.
For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a polygon extruded in at least one direction.
For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be a square.
Some embodiments of the example apparatus may further include one or more external insulators adjacent to an external edge of at least one of the plenoptic subpixel sensors.
For some embodiments of the example apparatus, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels.
For some embodiments of the example apparatus, at least one of the two or more plenoptic subpixel sensors may be adjacent to the color filter system.
Some embodiments of the example apparatus may further include an oxide layer, wherein the oxide layer may be adjacent to the color filter system, wherein at least one of the two or more plenoptic subpixel sensors may be adjacent to the oxide layer, and wherein the oxide layer may be between the color filter system and at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be adjacent to the oxide layer.
For some embodiments of the example apparatus, a size of a base of a pyramid portion of the pyramidal structure may be adjusted based on a curvature of the microlens.
For some embodiments of the example apparatus, the pyramidal structure may be symmetrical along a first plane cutting through a center of the pyramidal structure, the pyramidal structure may be symmetrical along a second plane cutting through the center of the pyramidal structure, and the first plane may be perpendicular to the second plane.
For some embodiments of the example apparatus, the color filter system may be adjacent to the microlens.
For some embodiments of the example apparatus, two or more plenoptic subpixel sensors may be adjacent to the color filter system.
For some embodiments of the example apparatus, the color filter system may include a filter pattern of two or more subpixels.
For some embodiments of the example apparatus, the pyramidal structure may include silicon dioxide (SiO2).
For some embodiments of the example apparatus, the pyramidal structure may include silicon oxide.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two pixels.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two microlenses.
A graph of the left/right ratio vs. angle of incidence according to some embodiments.
Graphs of the left/right ratio and a threshold vs. angle of incidence for a 3.5 μm quad pixel without a pyramid shape etched in the silicon according to some embodiments.
Graphs of the left/right ratio and a threshold vs. angle of incidence for a 3.5 μm quad pixel with a pyramid shape etched in the silicon according to some embodiments.
Graphs of the left/right ratio and a threshold vs. angle of incidence for a 1.6 μm quad pixel without a pyramid shape etched in the silicon according to some embodiments.
Graphs of the left/right ratio and a threshold vs. angle of incidence for a 1.6 μm quad pixel with a pyramid shape etched in the silicon according to some embodiments.
The entities, connections, arrangements, and the like that are depicted in—and described in connection with—the various figures are presented by way of example and not by way of limitation. As such, any and all statements or other indications as to what a particular figure “depicts,” what a particular element or entity in a particular figure “is” or “has,” and any and all similar statements—that may in isolation and out of context be read as absolute and therefore limiting—may only properly be read as being constructively preceded by a clause such as “In at least one embodiment, . . . ” For brevity and clarity of presentation, this implied leading clause is not repeated ad nauseam in the detailed description.
A wireless transmit/receive unit (WTRU) may be used, e.g., as a plenoptic camera in some embodiments described herein.
The processor 718 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 718 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 702 to operate in a wireless environment. The processor 718 may be coupled to the transceiver 720, which may be coupled to the transmit/receive element 722. While
The transmit/receive element 722 may be configured to transmit signals to, or receive signals from, a base station over the air interface 716. For example, in one embodiment, the transmit/receive element 722 may be an antenna configured to transmit and/or receive RF signals. In an embodiment, the transmit/receive element 722 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 722 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 722 may be configured to transmit and/or receive any combination of wireless signals.
The transceiver 720 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 722 and to demodulate the signals that are received by the transmit/receive element 722. The WTRU 702 may have multi-mode capabilities. Thus, the transceiver 720 may include multiple transceivers for enabling the WTRU 702 to communicate via multiple radio access technologies, such as New Radio and IEEE 802.11, for example.
The processor 718 of the WTRU 702 may be coupled to, and may receive user input data from, the speaker/microphone 724, the keypad 726, the display/touchpad 728 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit), and/or the camera 736. The processor 718 may also output user data to the speaker/microphone 724, the keypad 726, and/or the display/touchpad 728. In addition, the processor 718 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 730 and/or the removable memory 732. The non-removable memory 730 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 732 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 718 may access information from, and store data in, memory that is not physically located on the WTRU 702, such as on a server or a home computer (not shown).
The processor 718 may receive power from the power source 734, and may be configured to distribute and/or control the power to the other components in the WTRU 702. The power source 734 may be any suitable device for powering the WTRU 702. For example, the power source 734 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
The processor 718 may also be coupled to the GPS chipset 736, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 702. In addition to, or in lieu of, the information from the GPS chipset 736, the WTRU 702 may receive location information over the air interface 716 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 702 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
The processor 718 may further be coupled to other peripherals 738, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 738 may include an accelerometer, an e-compass, a satellite transceiver, an additional digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like. The peripherals 738 may include one or more sensors, and the sensors may be one or more of a gyroscope, an accelerometer, a hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor, a geolocation sensor, an altimeter, a light sensor, a touch sensor, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.
The emulation devices may be designed to implement one or more tests of other devices in a lab environment and/or in an operator network environment. For example, the one or more emulation devices may perform the one or more, or all, functions while being fully or partially implemented and/or deployed as part of a wired and/or wireless communication network in order to test other devices within the communication network. The one or more emulation devices may perform the one or more, or all, functions while being temporarily implemented/deployed as part of a wired and/or wireless communication network. The emulation device may be directly coupled to another device for purposes of testing and/or may perform testing using over-the-air wireless communications.
The one or more emulation devices may perform the one or more, including all, functions while not being implemented/deployed as part of a wired and/or wireless communication network. For example, the emulation devices may be utilized in a testing scenario in a testing laboratory and/or a non-deployed (e.g., testing) wired and/or wireless communication network in order to implement testing of one or more components. The one or more emulation devices may be test equipment. Direct RF coupling and/or wireless communications via RF circuitry (e.g., which may include one or more antennas) may be used by the emulation devices to transmit and/or receive data.
As noted previously, an integrated plenoptic sensor is an imaging sensor in which a microlens is shared between multiple subpixels. An integrated plenoptic sensor records a “4D light-field” which may give indications of the angle of incidence of light. An integrated plenoptic sensor may be used in cameras and smartphones to drive the autofocus of the main lens. Furthermore, an integrated plenoptic sensor may provide additional abilities, such as passive depth mapping, refocusing, or aberration correction.
In particular, an example practical implementation of the example design shown in
Examples of quad-pixel sensors, along with example quad-pixel sensor designs and examples of related wave optics simulations and results, are included in: Chataignier, G., Vandame, B., and Vaillant, J., Joint Electromagnetic and Ray-Tracing Simulations for Quad-Pixel Sensor and Computational Imaging, 27:21 Optics Express (2019).
For some embodiments, a shape such as a pyramid shape (not shown in
ratio vs. angle of incidence; (4) to increase sensitivity between adjacent (subpixel) sensors; (5) to increase angular discrimination of light incident on the microlens (or main lens for some embodiments) at an angle near a chief ray angle of the microlens. For some embodiments, one or more external insulators may be adjacent to an external edge of a photo detector. For example, an external insulator (or isolator for some embodiments) may be attached to or next to a portion of a photo detector, such as the example shown in
A quad-pixel sensor was simulated using electromagnetic simulations with the Finite-Difference Time-Domain (FDTD) method to study and optimize the design of quad pixels of 1.75 μm and 0.8 μm in size. The power absorbed by the silicon (P_abs) as a function of the angle of incidence is called the angular response of the pixel. Two types of crosstalk have been identified: inside the microlens and between the microlenses. Different pixel designs were compared in the manner of Kobayashi, Masahiro, A Low Noise and High Sensitivity Image Sensor with Imaging and Phase-Difference Detection AF in All Pixels, 4:2 ITE Transactions on Media Technology and Applications (2016), as described below.
When light enters the microlens, the light forms a diffraction spot, which has a spatial extent. The diffraction spot may cause crosstalk between subpixels, especially near normal incidence (perpendicular to the surface).
For light entering a microlens at small angles of incidence, diffraction inside the pixel may be a source of crosstalk between the subpixels. Reducing crosstalk between subpixels may help avoid pollution between sub-aperture images (SAIs). With more crosstalk, the SAIs look more similar; and if the SAIs look more similar, disparity estimation is more difficult because there are fewer cues indicating depth.
Compare the sub-aperture images 1100, 1110, 1120, 1130, 1140 of
Described herein are embodiments which may decrease crosstalk between microlenses 1202. With a reduction in crosstalk, some embodiments may pair a sensor with a main lens that has a higher f-number (f/#). The f-number is the ratio of the focal length to the diameter of the entrance pupil of the lens; for example, a 4 mm focal length with a 2 mm pupil gives f/2. The higher the f-number, the smaller the aperture for a given focal length. Some embodiments may make the sensor more resilient with respect to the chief ray angle.
For some embodiments, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels. For example, the microlens may have a shape or arc that shifts the chief ray angle by an amount corresponding to a shift as shown in
For each simulated angle of incidence, the left/right ratio described in Kobayashi was used to compare different pixel designs. This figure of merit was computed and compared to an arbitrary threshold; if the ratio was below the threshold, the signal was treated as “good.” Three areas on the ratio curve were distinguished.
The graph 1450 shows the left/right ratio vs. angle of incidence according to some embodiments.
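As an illustration of how such a figure of merit might be applied, the following Python sketch flags the “good” angles. The specific ratio definition used here (the dimmer subpixel's absorbed power over the brighter one's) and all names are assumptions, since the original formula is not reproduced:

```python
import numpy as np

def good_angles(pabs_left: np.ndarray, pabs_right: np.ndarray,
                threshold: float) -> np.ndarray:
    """Return a boolean mask of angles of incidence whose left/right
    figure-of-merit ratio falls below the threshold ('good' signal).

    pabs_left, pabs_right: power absorbed by the left and right subpixels
    for each simulated angle of incidence.
    """
    lo = np.minimum(pabs_left, pabs_right)
    hi = np.maximum(pabs_left, pabs_right)
    # A small ratio means the light is well separated between subpixels.
    ratio = np.divide(lo, hi, out=np.ones_like(lo, dtype=float), where=hi > 0)
    return ratio < threshold
```

Contiguous runs of such angles then delimit ranges like Ra and Rb discussed below.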
For some embodiments, the angular range Ra is reduced (or even minimized for some embodiments), and the angular range Rb is increased (or even maximized for some embodiments). For some embodiments, the radius of curvature of the microlens and the height of the optical stack (the distance between the bottom of the microlens and the surface of the silicon) are adjusted to reduce Ra and increase Rb.
To speed up simulation of a quad-pixel, only subpixels A and B, which are shown in
For some embodiments, an inverted pyramid shape is etched in the silicon to reduce crosstalk between microlenses. The use of such an inverted pyramid shape (which may be a pyramidal structure or pyramid-shaped prism depending on context for some embodiments) may increase convergence (and even ensure optimal convergence for some embodiments) of the microlens at low angles of incidence, thereby reducing Ra, and decrease (or “break”) convergence at higher angles of incidence. As such, at higher angles of incidence, the silicon absorbs more of the light before the light enters a neighboring pixel cell structure.
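As a rough, single-interface illustration of this prism effect, the following Python sketch applies Snell's law at a tilted pyramid facet. The refractive indices (n ≈ 1.46 for an SiO2-filled pyramid, n ≈ 4 for silicon near 550 nm), the 54.7° facet, and the two-dimensional geometry are assumptions, not values from the original:

```python
import numpy as np

def facet_refraction(theta_in_deg: float, facet_deg: float = 54.7,
                     n1: float = 1.46, n2: float = 4.0) -> float:
    """Refract a downward-traveling ray at an oxide/silicon pyramid facet.

    theta_in_deg: incoming ray angle from vertical, inside the oxide fill.
    facet_deg: facet tilt from horizontal, so the facet normal is tilted
        by (90 - facet_deg) degrees from vertical.
    Returns the refracted ray angle from vertical, inside the silicon.
    """
    normal_tilt = np.radians(90.0 - facet_deg)
    theta1 = np.radians(theta_in_deg) - normal_tilt   # angle vs. facet normal
    theta2 = np.arcsin(np.clip(n1 * np.sin(theta1) / n2, -1.0, 1.0))
    return float(np.degrees(theta2 + normal_tilt))    # back to vertical axis

# facet_refraction(0.0) is roughly 23 degrees: a near-vertical ray is bent
# noticeably off axis, away from the center of the pixel group.
```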
For some embodiments, the optical stack may be decreased (or “thinned”). For example, compare the optical stack 1500, 1520 of microlens 1502, 1522, planarization 1504, 1524, color filter 1506, 1526, oxide layer 1508, 1528, and photo detector 1510, 1530 of
For some embodiments, a pixel stack structure may include, e.g., a microlens, a color filter system, and two or more plenoptic subpixel sensors, in which the two or more plenoptic subpixel sensors may include two or more photodiodes formed in a silicon layer, and in which a pyramid shape is etched in the two or more respective silicon photodiodes formed in the silicon layer. For some embodiments, the pyramid shape and/or pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors. In some embodiments, the pyramidal structure may include a pyramid-shaped insulator or pyramid-shaped insulator structure. For some embodiments, a horizontal facet angle of the pyramid shape of the pyramidal structure may be 54.7 degrees. An example of a pyramidal structure 1550 is shown in
An example color filter system is shown in
For some embodiments, an apparatus may include: a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, in which a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer. For some embodiments, the pyramidal structure may be formed and/or configured for one or more of the following features: to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors; to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors; to increase sensitivity of at least one of the two or more plenoptic subpixel sensors; to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens; to confine light to one of the two or more plenoptic subpixel sensors; and to reflect light away from at least one of the two or more plenoptic subpixel sensors. For some embodiments, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure. For some embodiments, the focal length of the microlens may be less than a threshold.
The graphs 1600, 1650 show the left/right ratio and a threshold vs. angle of incidence for a 3.5 μm quad pixel without a pyramid shape etched in the silicon according to some embodiments.
The graphs 1700, 1750 show the left/right ratio and a threshold vs. angle of incidence for a 3.5 μm quad pixel with a pyramid shape etched in the silicon according to some embodiments.
The graphs 1800, 1850 show the left/right ratio and a threshold vs. angle of incidence for a 1.6 μm quad pixel without a pyramid shape etched in the silicon according to some embodiments.
The graphs 1900, 1950 show the left/right ratio and a threshold vs. angle of incidence for a 1.6 μm quad pixel with a pyramid shape etched in the silicon according to some embodiments.
In some embodiments, using a structure with a pyramidal or pyramid shape allows substantial gains on the crosstalk between microlenses, with reduced (or even minimal) losses on the crosstalk between subpixels. In some embodiments, this pyramidal structure allows the use of a wide-aperture main lens, and this structure makes the sensor more robust with respect to the chief ray angle.
Constraints in the manufacturing process may limit the degree to which the optical stack components may be thinned, which explains the losses on the Ra range. To preserve color filtering functionality, for some embodiments a limit may be imposed on the amount of height reduction of the color filter. For some embodiments, the base of the pyramid may be set equal to half of the diameter of the microlens, and the height of the pyramid may be set according to equation 13:
For some embodiments, the 54.7° facet angle may be used to determine the height of the pyramid. For some embodiments, the horizontal facet angle may be an angle other than 54.7°, and the pyramid height may be calculated as shown in Eq. 14:
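Equations 13 and 14 are not reproduced above. For a pyramid whose square base has side b and whose facets make a horizontal facet angle θ, elementary geometry (a reconstruction, not the original equations) gives the apex height:

$$h = \frac{b}{2}\tan\theta, \qquad h\big|_{\theta = 54.7^\circ} = \frac{b}{2}\tan(54.7^\circ) \approx 0.71\, b.$$

With the base set to half the microlens diameter as described above, this gives a height of roughly 0.35 times the microlens diameter.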
For some embodiments, a base of the pyramid shape may be equal to half of a diameter of the microlens. For example, the size of the base may be calculated using Eq. 13 for some embodiments.
For some embodiments, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer. For example, the pyramidal structure may fill in a pyramid-shaped trench, such as a triangle-shaped trench, and form a flat base level with the top of two or more photo detectors as shown in
It will be understood, of course, that in some embodiments and in different manufacturing processes, a pyramidal structure that is part of a dual- or multi-pixel sensor configuration may approximate, or exhibit generally, a pyramid shape in observable, or potentially measurable, respects, but the pyramidal structure might not be formed as a “perfect” pyramidal or triangular geometric shape. The pyramidal structure may also have, e.g., convex or concave features, examples of which are shown in
For some embodiments, the plenoptic subpixel sensors may be arranged in a 2×2 grid, in which the pyramidal shape is centered on a center of the 2×2 grid. For example, the pyramid shape may have a square base which is centered around the center point of the 2×2 grid as shown in
An angle of 54.7° was used in the analysis described herein because it is the natural facet angle of silicon when using anisotropic wet etching (the angle between the (100) surface and the (111) crystal planes). Alternatively, other angles may be used; for example, greyscale lithography may be used for some embodiments. For some embodiments, a non-straight face, such as the examples shown in
For some embodiments, the plenoptic subpixel sensors may be arranged in a 3×3 grid, and the pyramid shape may be centered on a center of the 3×3 grid. For some embodiments, the pyramid shape may be a frustum, and the flat top of the frustum may be configured to match a center subpixel sensor of the 3×3 grid. For example, the pyramid shape may be symmetrical and centered around the center subpixel sensor of the 3×3 grid for some embodiments. In some embodiments, the tip of the pyramid shape is a square. For example, the tip of the pyramidal structure may be clipped so that the tip is a square. The square tip may correspond to a square center subpixel sensor of a 3×3 grid of subpixels. For some embodiments, the pyramid portion of the pyramid shape may be a polygon extruded in at least one direction.
For some embodiments, the pyramidal structure may include a material layer above the silicon layer. For some embodiments, the pyramidal structure may include a color filter layer of the color filter system above the material layer. For some embodiments, the color filter layer may include a color filter resin. For some embodiments, the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors. For some embodiments, the pyramidal structure may be made of silicon dioxide (SiO2).
An example apparatus in accordance with some embodiments may include a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, wherein a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer.
For some embodiments of the example apparatus, the pyramidal structure may be formed to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to increase sensitivity of at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens.
For some embodiments of the example apparatus, the pyramidal structure may be configured to confine light to one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reflect light away from at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure.
For some embodiments of the example apparatus, a focal length of the microlens may be less than a threshold.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may include a dual pixel sensor; and the pyramidal structure may include a triangular structure.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may include a quad pixel sensor.
For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer.
For some embodiments of the example apparatus, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer.
Some embodiments of the example apparatus may further include a color filter system.
For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer; and the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, the pyramidal structure may further include a color filter layer of the color filter system above the material layer.
For some embodiments of the example apparatus, the color filter layer may include a color filter resin.
For some embodiments of the example apparatus, the color filter system may include a planarization layer and a color filter layer.
For some embodiments of the example apparatus, a horizontal facet angle of the pyramidal structure may be 54.7 degrees.
For some embodiments of the example apparatus, a base of the pyramidal structure may be equal to half of a diameter of the microlens.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2×1 grid, and the pyramidal structure may be centered on a line dividing the 2×1 grid in half.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2×2 grid, and the pyramidal structure may be centered on a center of the 2×2 grid.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 3×3 grid, and the pyramidal structure may be centered on a center of the 3×3 grid.
For some embodiments of the example apparatus, the pyramidal structure may be a frustum, and a flat top of the frustum may be configured to match a center subpixel sensor of the 3×3 grid.
For some embodiments of the example apparatus, a tip of the pyramidal structure may be a square.
For some embodiments of the example apparatus, a base of the pyramidal structure may be a disk.
For some embodiments of the example apparatus, a tip of the pyramidal structure may be a disk.
For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a grid array, and the pyramidal structure may be centered on a center of the grid array.
For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a triangle extruded in at least one direction.
For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a polygon extruded in at least one direction.
For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be a square.
Some embodiments of the example apparatus may further include one or more external insulators adjacent to an external edge of at least one of the plenoptic subpixel sensors.
For some embodiments of the example apparatus, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels.
For some embodiments of the example apparatus, at least one of the two or more plenoptic subpixel sensors may be adjacent to the color filter system.
Some embodiments of the example apparatus may further include an oxide layer, wherein the oxide layer may be adjacent to the color filter system, wherein at least one of the two or more plenoptic subpixel sensors may be adjacent to the oxide layer, and wherein the oxide layer may be between the color filter system and at least one of the two or more plenoptic subpixel sensors.
For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be adjacent to the oxide layer.
For some embodiments of the example apparatus, a size of a base of a pyramid portion of the pyramidal structure may be adjusted based on a curvature of the microlens.
For some embodiments of the example apparatus, the pyramidal structure may be symmetrical along a first plane cutting through a center of the pyramidal structure, the pyramidal structure may be symmetrical along a second plane cutting through the center of the pyramidal structure, and the first plane may be perpendicular to the second plane.
For some embodiments of the example apparatus, the color filter system may be adjacent to the microlens.
For some embodiments of the example apparatus, two or more plenoptic subpixel sensors may be adjacent to the color filter system.
For some embodiments of the example apparatus, the color filter system may include a filter pattern of two or more subpixels.
For some embodiments of the example apparatus, the pyramidal structure may include silicon dioxide (SiO2).
For some embodiments of the example apparatus, the pyramidal structure may include silicon oxide.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two pixels.
For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two microlenses.
Note that various hardware elements of one or more of the described embodiments are referred to as “modules” that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as commonly referred to as RAM, ROM, etc.
Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
Number | Date | Country | Kind |
---|---|---|---
21306035.3 | Jul 2021 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/EP2022/070209 | 7/19/2022 | WO |