Image sensors are used in electronic devices such as cellular telephones, cameras, and computers to capture images. In particular, an electronic device is provided with an array of image pixels arranged in a grid pattern. Each image pixel receives incident photons of light and converts the photons into electrical signals. Many image sensors suffer from low-light sensitivity. That is, in low-light situations, such as dawn or dusk, insufficient photons may be captured to recreate a suitable image.
One example is an image sensor comprising a plurality of image pixels. Each image pixel may comprise: a color router defining a router collection area on an upper surface; a first photosensitive region beneath the color router; a second photosensitive region beneath the color router; and a third photosensitive region beneath the color router. The color router may be configured to route photons of a first wavelength received at the router collection area to the first photosensitive region, route photons of a second wavelength received at the router collection area to the second photosensitive region, and route photons of a third wavelength received at the router collection area to the third photosensitive region.
In the example image sensor, when the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, and when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to cyan to the third photosensitive region.
In the example image sensor, when the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, and when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to blue to the third photosensitive region.
In the example image sensor, each image pixel may further comprise a fourth photosensitive region beneath the color router. When the color router routes photons of a first wavelength, the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, and when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to green to the third photosensitive region, and the color router may be further configured to route photons having a wavelength corresponding to blue to the fourth photosensitive region.
In the example image sensor, each image pixel may further comprise a fourth photosensitive region beneath the color router. When the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to green to the second photosensitive region, when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to blue to the third photosensitive region, and the color router may be further configured to route photons having a wavelength corresponding to infrared to the fourth photosensitive region.
The example image sensor may further comprise: the first photosensitive region defining a first collection area; the second photosensitive region defining a second collection area smaller than the first collection area; and the third photosensitive region defining a third collection area smaller than the second collection area. The first wavelength may be longer than the second wavelength, and the second wavelength may be longer than the third wavelength. The first wavelength may correspond to red, the second wavelength may correspond to blue, and the third wavelength may correspond to green. The first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength different than the first wavelength, and the third wavelength may correspond to a third infrared wavelength. The example image sensor may further comprise a fourth photosensitive region beneath the color router, the fourth photosensitive region defining a fourth collection area larger than the first collection area, and the color router may be configured to route photons of a fourth wavelength to the fourth photosensitive region, the fourth wavelength longer than the first wavelength. The first wavelength may correspond to red, the second wavelength may correspond to blue, the third wavelength may correspond to green, and the fourth wavelength may correspond to infrared.
In the example image sensor, each image pixel may define a long dimension measured parallel to the router collection area, and each image pixel may further comprise: the first photosensitive region defining a collection area defining a first shape; the second photosensitive region defining a collection area defining a second shape; and the third photosensitive region defining a collection area defining a third shape. The first shape, the second shape, and the third shape may be configured such that the longest horizontal distance a photon is routed through the color router is half the long dimension.
In the example image sensor, the color router may define a first quadrant and a second quadrant. The first photosensitive region may define a composite collection area made up of a plurality of discrete photosensitive regions, and the plurality of discrete photosensitive regions may be equally divided beneath the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to discrete photosensitive regions directly beneath the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to discrete photosensitive regions directly beneath the second quadrant. The example image sensor may further comprise an imaging controller operatively coupled to the first photosensitive region. The imaging controller may be configured to detect a phase imbalance based on a different number of photons arriving in the first quadrant compared to the second quadrant, and to modify a focus parameter based on the phase imbalance.
The example image sensor may further comprise a collimator disposed above the color router.
Another example is an imaging system comprising: an imaging controller and a camera module. The camera module may comprise: a lens system coupled to the imaging controller; and a plurality of image pixels in operational relationship to the lens system and communicatively coupled to the imaging controller. Each image pixel may comprise: a color router defining a router collection area on an upper surface; a first photosensitive region beneath the color router; a second photosensitive region beneath the color router; and a third photosensitive region beneath the color router. The color router may be configured to route photons of a first wavelength received at the router collection area to the first photosensitive region, route photons of a second wavelength received at the router collection area to the second photosensitive region, and route photons of a third wavelength received at the router collection area to the third photosensitive region.
In the example imaging system, the imaging controller and camera module may be associated with an automobile.
In the example imaging system, when the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, and when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to cyan to the third photosensitive region.
In the example imaging system, when the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, and when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to blue to the third photosensitive region.
In the example imaging system, each image pixel may further comprise a fourth photosensitive region beneath the color router. When the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to yellow to the second photosensitive region, when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to green to the third photosensitive region, and the color router may be further configured to route photons having a wavelength corresponding to blue to the fourth photosensitive region.
In the example imaging system, each image pixel may further comprise a fourth photosensitive region beneath the color router. When the color router routes photons of a first wavelength the color router may be further configured to route photons having a wavelength corresponding to red to the first photosensitive region, when the color router routes photons of a second wavelength the color router may be further configured to route photons having a wavelength corresponding to green to the second photosensitive region, when the color router routes photons of a third wavelength the color router may be further configured to route photons having a wavelength corresponding to blue to the third photosensitive region, and the color router may be further configured to route photons having a wavelength corresponding to infrared to the fourth photosensitive region.
In the example imaging system, the first photosensitive region may define a first collection area, the second photosensitive region may define a second collection area smaller than the first collection area, and the third photosensitive region may define a third collection area smaller than the second collection area. The first wavelength may be longer than the second wavelength, and the second wavelength may be longer than the third wavelength. The first wavelength may correspond to red, the second wavelength may correspond to blue, and the third wavelength may correspond to green. The first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength different than the first wavelength, and the third wavelength may correspond to a third infrared wavelength.
The example imaging system may further comprise a fourth photosensitive region beneath the color router, the fourth photosensitive region defining a fourth collection area larger than the first collection area. The color router may be configured to route photons of a fourth wavelength to the fourth photosensitive region, the fourth wavelength longer than the first wavelength. The first wavelength may correspond to red, the second wavelength may correspond to blue, the third wavelength may correspond to green, and the fourth wavelength may correspond to infrared.
In the example imaging system, each image pixel may define a long dimension measured parallel to the router collection area, and each image pixel may further comprise: the first photosensitive region defining a collection area defining a first shape; the second photosensitive region defining a collection area defining a second shape; and the third photosensitive region defining a collection area defining a third shape. The first shape, the second shape, and the third shape may be configured such that the longest horizontal distance a photon is routed through the color router is half the long dimension.
In the example imaging system, the color router may define a first quadrant and a second quadrant, the first photosensitive region may define a composite collection area made up of a plurality of discrete photosensitive regions, and the plurality of discrete photosensitive regions may be equally divided beneath the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to discrete photosensitive regions directly beneath the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to discrete photosensitive regions directly beneath the second quadrant. The example imaging system may further comprise an imaging controller operatively coupled to the first photosensitive region. The imaging controller may be configured to detect a phase imbalance based on a different number of photons arriving in the first quadrant compared to the second quadrant, and to modify a focus parameter based on the phase imbalance.
The example imaging system may further comprise a collimator disposed above the color router.
Yet still other examples are methods of operating an image sensor, the method comprising: directing photons from a scene into a color router positioned above a plurality of photosensitive regions; routing, by the color router, photons of a first wavelength to a first photosensitive region beneath the color router; routing photons of a second wavelength to a second photosensitive region beneath the color router; and routing photons of a third wavelength to a third photosensitive region beneath the color router.
In the example method, the first wavelength may correspond to red, the second wavelength may correspond to yellow, and the third wavelength may correspond to cyan.
In the example method, the first wavelength may correspond to red, the second wavelength may correspond to yellow, and the third wavelength may correspond to blue.
In the example method, the first wavelength may correspond to red, the second wavelength may correspond to yellow, and the third wavelength may correspond to green. The method may further comprise routing, by the color router, photons of a fourth wavelength corresponding to blue to a fourth photosensitive region beneath the color router.
In the example method, the first wavelength may correspond to red, the second wavelength may correspond to green, and the third wavelength may correspond to blue. The method may further comprise routing, by the color router, photons of a fourth wavelength corresponding to infrared to a fourth photosensitive region beneath the color router.
In the example method, the first photosensitive region may define a first collection area, the second photosensitive region may define a second collection area smaller than the first collection area, and the third photosensitive region may define a third collection area smaller than the second collection area. The first wavelength may be longer than the second wavelength, and the second wavelength may be longer than the third wavelength. The first wavelength may correspond to red, the second wavelength may correspond to blue, and the third wavelength may correspond to green. The first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength, and the third wavelength may correspond to a third infrared wavelength. The example method may further comprise routing, by the color router, photons of a fourth wavelength to a fourth photosensitive region, the fourth photosensitive region defining a fourth collection area larger than the first collection area. The first wavelength may correspond to red, the second wavelength may correspond to blue, the third wavelength may correspond to green, and the fourth wavelength may correspond to infrared.
In the example method, each image pixel may define a long dimension measured parallel to a router collection area of the color router. Routing photons of the first wavelength may further comprise horizontally routing the photons of the first wavelength no more than three-quarters of the long dimension to reach the first photosensitive region. Routing photons of the second wavelength may further comprise horizontally routing the photons of the second wavelength no more than three-quarters of the long dimension to reach the second photosensitive region. Routing photons of the third wavelength may further comprise horizontally routing the photons of the third wavelength no more than three-quarters of the long dimension to reach the third photosensitive region.
In the example method: the color router may define a first quadrant and a second quadrant; the first photosensitive region may define a composite collection area made up of a plurality of discrete photosensitive regions; and the plurality of discrete photosensitive regions may be equally divided beneath the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to discrete photosensitive regions directly beneath the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to discrete photosensitive regions directly beneath the second quadrant. The example method may further comprise: detecting a phase imbalance based on a different number of photons arriving in the first quadrant compared to the second quadrant; and modifying a focus parameter based on the phase imbalance.
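The phase-imbalance detection and focus adjustment described above can be sketched as follows. This is an illustrative sketch only; the function names, the normalization, and the proportional focus update are assumptions for exposition, not the claimed method.

```python
# Illustrative sketch: a phase imbalance derived from photon counts
# collected beneath two quadrants of a composite photosensitive region,
# used to modify a hypothetical focus parameter.

def phase_imbalance(counts_q1, counts_q2):
    """Normalized difference between photon counts collected beneath
    the first and second quadrants; zero when the counts match."""
    total = counts_q1 + counts_q2
    if total == 0:
        return 0.0
    return (counts_q1 - counts_q2) / total

def adjust_focus(focus_position, counts_q1, counts_q2, gain=0.5):
    """Nudge a hypothetical focus parameter in proportion to the
    imbalance; an in-focus scene yields equal counts and no change."""
    return focus_position + gain * phase_imbalance(counts_q1, counts_q2)
```

Equal quadrant counts leave the focus parameter unchanged; a surplus of photons in either quadrant moves the parameter in the corresponding direction.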
The example method may further comprise collimating photons between the scene and the color router.
For a detailed description of example embodiments, reference will now be made to the accompanying drawings in which:
Various terms are used to refer to particular system components. Different companies may refer to a component by different names—this document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple” or “couples” is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
Terms defining an elevation, such as “above,” “below,” “upper,” and “lower” shall be locational terms in reference to a direction of light incident upon a pixel array and/or an image pixel. Light entering shall be considered to interact with or pass objects and/or structures that are “above” and “upper” before interacting with or passing objects and/or structures that are “below” or “lower.” Thus, the locational terms may not have any relationship to the direction of the force of gravity.
“IR” shall mean infrared.
In relation to electrical devices, whether stand alone or as part of an integrated circuit, the terms “input” and “output” refer to electrical connections to the electrical devices, and shall not be read as verbs requiring action. For example, a differential amplifier, such as an operational amplifier, may have a first differential input and a second differential input, and these “inputs” define electrical connections to the operational amplifier, and shall not be read to require inputting signals to the operational amplifier.
“Controller” shall mean, alone or in combination, individual circuit components, an application specific integrated circuit (ASIC), a microcontroller with controlling software, a reduced-instruction-set computing (RISC) processor with controlling software, a digital signal processor (DSP), a processor with controlling software, a programmable logic device (PLD), a field programmable gate array (FPGA), or a programmable system-on-a-chip (PSOC), configured to read inputs and drive outputs responsive to the inputs.
The following discussion is directed to various embodiments of the invention. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art understands that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
Various examples are directed to imaging systems, image pixels, and related methods. More particularly, at least some examples are directed to image pixels designed and constructed to be more sensitive in low-light situations by not using color filters, which color filters tend to absorb light of certain frequencies and thus reduce the overall number of photons available to the photosensitive regions. More particularly, various examples are directed to image pixels that use color routers to direct photons of light, incident upon a collection area of the color router, to the underlying photosensitive regions. The routing is based on the wavelength of each photon. Other examples are directed to image pixels in which each photosensitive region has a collection area associated with an overlying color router, and where the collection area of each photosensitive region is proportional to the wavelength of photons directed to each photosensitive region. That is, photons having shorter wavelengths such as blue are directed to photosensitive regions with smaller collection areas, and photons having longer wavelengths such as red or infrared are directed to photosensitive regions with larger collection areas. In yet still other examples, the underlying photosensitive regions are arranged such that the longest horizontal distance a photon travels through a color router is half the long dimension of the image pixel. The specification now turns to example systems to orient the reader.
The imaging controller 108 may include one or more integrated circuits, such as image processing circuits, microprocessors, and storage devices such as random-access memory and non-volatile memory. The imaging controller 108 may be implemented using components that are separate from camera module 102 or that form part of camera module 102, such as circuits that form part of the image sensor 106. Digital image data captured by the camera module 102 may be processed and stored using the imaging controller 108. Processed image data may, if desired, be provided to external equipment, such as a computer, an external display, or other devices, using wired and/or wireless communications paths coupled to imaging controller 108.
The image sensor 106 comprises a pixel array 210 containing a plurality of image pixels 212 arranged in rows and columns. Pixel array 210 may comprise, for example, hundreds or thousands of rows and columns of image pixels 212. Control and readout of the pixel array 210 may be implemented by an image sensor controller 214 coupled to a row controller 216 and a column controller 218. The example row controller 216 may receive row addresses from image sensor controller 214 and supply corresponding row control signals to the image pixels 212, such as reset, row-select, charge transfer, dual conversion gain, and readout control signals. The row control signals may be communicated over one or more conductors, such as row control paths 220.
Column controller 218 may be coupled to the pixel array 210 by way of one or more conductors, such as column lines 222. The column controller may sometimes be referred to as a column control circuit, a readout circuit, and/or column decoder. Column lines 222 may be used for reading out image signals from image pixels 212 and for supplying bias currents and/or bias voltages to the image pixels 212. If desired, during pixel readout operations, a pixel row in the pixel array 210 may be selected using row controller 216 and image signals generated by image pixels 212 in that row can be read out along the column lines 222. Each image pixel 212 may comprise a plurality of photosensitive regions, such as four, nine, or sixteen, and thus while each column line 222 is shown as a single conductor, a plurality of such column lines may be associated with each image pixel 212 in a column.
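The row-by-row readout sequence described above can be sketched as follows. The hypothetical 10-bit quantization step and all names here are illustrative assumptions, not part of the specification.

```python
# Illustrative sketch: row-select followed by column readout with a
# hypothetical 10-bit analog-to-digital conversion step.

def adc(value, full_scale=1.0, bits=10):
    """Quantize an analog pixel value into a digital code."""
    code = round(value / full_scale * (2**bits - 1))
    return max(0, min(2**bits - 1, code))

def read_out(pixel_array):
    """Select each row in turn (the row controller's role) and sample
    every column line in that row (the column controller's role)."""
    frame = []
    for row in pixel_array:          # one row selected at a time
        frame.append([adc(value) for value in row])
    return frame
```

A two-pixel row with analog values at the rails reads out as the minimum and maximum digital codes, e.g. `read_out([[0.0, 1.0]])` yields `[[0, 1023]]`.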
The example column controller 218 may include sample-and-hold circuitry for sampling and temporarily storing image signals read out from pixel array 210, amplifier circuitry, analog-to-digital conversion (ADC) circuitry, bias circuitry, column memory, latch circuitry for selectively enabling or disabling the column circuitry, or other circuitry that is coupled to one or more columns of pixels in the pixel array 210 for operating the image pixels 212 and for reading out image signals from the image pixels 212. ADC circuitry in the column controller 218 may convert analog pixel values received from the pixel array 210 into corresponding digital image data. Column controller 218 may supply digital image data to the image sensor controller 214 and/or the imaging controller 108 (
Similarly,
Referring to
Another reason for poor performance in low-light situations is that the color filters are not themselves perfectly efficient in passing the colors within the passband. For example, the green color filter 330 may be made of material that passes only 90% of the green light incident upon the filter, and it follows that the green color filter 330 absorbs about 10% of the green light. Thus, of the green light that falls on overall image pixel 312, only 50% falls on the two example green sensors 302 and 304, and only 90% of the 50%, or only about 45%, of the green light finds its way into the photosensitive regions of the green sensors 302 and 304. The color filters of the red sensor 300 and the blue sensor may have similar issues.
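The arithmetic above can be checked directly; the 50% area fraction (two green sensors of four) and the 90% filter transmission are the example figures given in the text.

```python
# Fraction of the pixel's incident green light actually collected when
# color filters are used, per the example figures above.
area_fraction = 0.50        # two of the four sensors are green
filter_transmission = 0.90  # the green filter passes ~90% of green light
collected = area_fraction * filter_transmission  # about 0.45
```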
The performance of image sensors may be improved by use of nanophotonic structures or color routers. In particular, a color router is a semiconductor structure that accepts photons incident on an upper surface, the upper surface defining a router collection area. The color router then routes photons from the router collection area to the underlying photosensitive regions, with the routing based on the wavelength of each particular photon. The router collection area may correspond to an overall image pixel, thereby being greater in area than an area above a single photosensitive region. The specification hereby defines and adopts the following notation for referencing the color represented by the wavelength of a photon: red photons are photons having a wavelength that corresponds to the color red; yellow photons are photons having a wavelength that corresponds to the color yellow; blue photons are photons having a wavelength that corresponds to the color blue; cyan photons are photons having a wavelength that corresponds to the color cyan; infrared photons are photons having a wavelength that corresponds to infrared; and so on. Consider, as an example, a red-yellow-yellow-cyan (RYYCy) image pixel. With that nomenclature in mind, of all the photons that are incident on the router collection area of the color router of the example RYYCy image pixel: red photons are directed to the photosensitive region designated for receiving red; yellow photons are directed to the photosensitive regions designated for receiving yellow; and cyan photons are directed to the photosensitive region designated for receiving cyan. Thus, for example, red photons that happen to arrive physically above the photosensitive region designated for cyan or the photosensitive regions designated for yellow are not lost to absorption, but are routed to the photosensitive region designated for red.
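Conceptually, the routing is a function of wavelength alone, independent of where on the router collection area a photon arrives. A minimal sketch for the RYYCy example follows; the numeric band edges are hypothetical, since the specification assigns no wavelength values.

```python
# Hypothetical wavelength bands in nanometers; illustrative only.
BANDS = {
    "cyan":   (480, 520),
    "yellow": (560, 590),
    "red":    (620, 750),
}

def route(wavelength_nm):
    """Return the designated color region for a photon, regardless of
    its arrival position on the router collection area."""
    for color, (low, high) in BANDS.items():
        if low <= wavelength_nm <= high:
            return color
    return None  # out-of-band photons are not modeled in this sketch
```

Under these assumed bands, a 650 nm photon arriving above the cyan region is still routed to the red region, which is the behavior the paragraph describes.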
The photosensitive regions 402, 404, 406, and 408 are semiconductor regions, such as silicon, within which photons of light may be captured or absorbed to create electrical signals, such as voltage or current. The photosensitive regions 402, 404, 406, and 408 themselves are agnostic to the wavelength of the photons absorbed; photons of any suitable wavelength for image pixels, spanning from the visible region into the infrared region, may be absorbed once the photons find their way into the semiconductor material. In many cases, each photosensitive region is designed and constructed as a photodiode that produces electrical voltage and current responsive to capture or absorption of photons of light.
The example color router 400 defines a router collection area 410 on an upper surface thereof. That is, the upper surface of the color router 400 defines a length LCR and a width WCR, and the length LCR and the width WCR considered together define the router collection area. Each photosensitive region 402, 404, 406, and 408 itself defines a collection aperture or collection area. For example, photosensitive region 408 defines a length LPD and a width WPD, and the length LPD and the width WPD considered together define a collection area. Thus, the photosensitive region 402 defines a collection area 412, the photosensitive region 404 defines a collection area 414, the photosensitive region 406 defines a collection area 416, and the photosensitive region 408 defines a collection area 418. Given that the photosensitive regions 402, 404, 406, and 408 reside in the area beneath the color router 400 or coextensive with the color router 400, the collection area for each photosensitive region is smaller than the router collection area 410. In other examples, the router collection area 410 may be coextensive with more or fewer photosensitive collection areas than what is shown in
In various examples, photons of light that are incident upon the router collection area 410 of the color router 400 enter the structure of the color router 400 and are routed to the collection areas of particular underlying photosensitive regions. In an image pixel 212 having four underlying photosensitive regions, the color router 400 may be designed and constructed to: route photons of a first wavelength received at the router collection area 410 to the photosensitive region 402; route photons of a second wavelength received at the router collection area 410 to the photosensitive region 404; route photons of a third wavelength received at the router collection area 410 to the photosensitive region 406; and route photons of a fourth wavelength received at the router collection area 410 to the photosensitive region 408. The representative example of
The example image pixel 212 with the RYGB sensitivity may be particularly suited for automotive applications. That is, one of the more difficult tasks performed by automated driving systems is distinguishing between red and yellow, for example, the red and yellow of a stop light. The wavelengths corresponding to red and the wavelengths corresponding to yellow are close together. Related-art image pixels using color filters not only suffer the collection-area shortcomings noted above, but yellow color filters designed to pass yellow photons tend to absorb a high percentage of the desirable photons given the proximity of the yellow wavelengths to red. However, using a color router 400 reduces the collection-area shortcomings, as yellow photons incident anywhere on the router collection area 410 may be routed to the underlying photosensitive region 404. And with no yellow color filter needed, the percentage of yellow photons that find their way into the photosensitive region 404 is significantly higher than for related-art image pixels using color filters. Of course, the statement regarding yellow is true for each color sensitivity of the image pixel 212.
The color router 400 may take any suitable form. In many cases, the color router 400 is constructed of several levels or layers, where each layer is designed and constructed to perform at least a partial routing. Each layer may be designed and constructed of a plurality of three-dimensional structures, such as cuboids of materials having varying indices of refraction and/or different sizes. For example, the three-dimensional structures of a particular layer may be selectively made of silicon dioxide and silicon nitride to bend and reflect photons at least partially toward the designated underlying photosensitive region. For the example image pixel 212 of
The design of the color router 400 may not be 100% efficient at routing photons received at the router collection area 410 to the corresponding color collection areas 412, 414, 416, and 418. For example, some photons may be reflected back out through the router collection area 410. Other photons may be misrouted, such as photons with a high angle of incidence. Moreover, refractions and reflections within the color router 400 may send photons streaming out between the router collection area 410 and the various collection areas 412, 414, 416, and 418. However, even taking into account the potential inefficiencies of such a color router, the overall image pixel 212 may still have better collection efficiency, and thus better low-light sensitivity, than other image pixels using color filters. Consider, as an example, that the color router 400 is only 50% efficient in routing red photons. That is, in the example only half the red photons incident upon the router collection area 410 find their way to the photosensitive region 406. The 50% assumed efficiency is still likely higher than the efficiency of the color-filtered image pixel's collection of red photons over only 25% of the same region. Further, given that a red color filter may pass only about 90% of the red photons, an example image pixel with a color router 400 having only 50% efficiency in routing red photons may potentially collect more than double the number of photons of the related-art image pixel 312 with a red color filter 322.
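The arithmetic behind this comparison can be sketched as follows. This is an illustrative calculation only, using the assumed example numbers above (50% routing efficiency, a red sub-region covering 25% of the pixel area, and a 90% filter transmission), not measured device data:

```python
# Assumed illustrative numbers from the example above, not measured data.
router_efficiency = 0.50      # assumed fraction of red photons routed correctly
filter_area_fraction = 0.25   # red sub-pixel covers 1/4 of the pixel area
filter_transmission = 0.90    # assumed fraction of red photons a filter passes

# Photons collected per unit of red light incident on the full pixel area:
router_collected = router_efficiency * 1.0     # routing over the full area
filtered_collected = filter_area_fraction * filter_transmission

ratio = router_collected / filtered_collected
print(round(ratio, 2))  # ~2.22, i.e., more than double
```

Even with the pessimistic 50% routing assumption, the color-routed pixel collects roughly 2.2 times the red photons of the color-filtered pixel in this example.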
Again, the example image pixel 212 may be particularly suited for automotive applications. However, use of an image pixel with both red and yellow sensitivity, along with a color router, is not limited to just RYGB sensitivity.
The various specific embodiments discussed to this point have been directed to automotive applications concerned with the visible spectrum; however, use of the color router may also be beneficial when the image pixel is designed and constructed to receive infrared wavelengths.
In yet still further cases, the image pixels may be designed and constructed for hyperspectral use. Hyperspectral imaging may be used in industrial applications for counterfeit detection, powder analysis, and/or checking the integrity of packaging. In agriculture, hyperspectral imaging may be used for precision watering, fertilizing, weed control, and pest control. There may be applications of hyperspectral imaging in medical and surveillance contexts as well.
Still referring to
The example photosensitive region 1100 designated for red defines a length LR and a width WR, and together the length LR and the width WR define a collection area for the photosensitive region 1100. Given that the overlaying color router, not shown in the shorthand notation, spans the entire image pixel 212, the collection area for the photosensitive region 1100 is smaller than the router collection area 410 (
The example photosensitive region 1102 designated for green defines a length LG and a width WG, and together the length LG and the width WG define a collection area for the photosensitive region 1102. The collection area for the photosensitive region 1102 is smaller than the router collection area 410 (
Still referring to
Now consider a blue photon incident upon the router collection area 410. On average, such a blue photon may have a longer horizontal distance to travel to reach the blue photosensitive region 1106 than the red photon, since the blue photosensitive region 1106 is smaller than the red photosensitive region 1100. But because the wavelength of blue is shorter than the wavelength of red, the blue photon may be more efficiently absorbed by the silicon of the photodiode than the red photon.
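The wavelength-dependent absorption noted above can be illustrated with a simple Beer-Lambert sketch. The absorption coefficients below are rough, assumed values chosen only to show the trend that shorter wavelengths are absorbed in shallower silicon; they are not part of the original description:

```python
import math

def fraction_absorbed(alpha_per_um, depth_um):
    """Beer-Lambert estimate of the fraction of photons absorbed
    within depth_um of silicon, given absorption coefficient alpha."""
    return 1.0 - math.exp(-alpha_per_um * depth_um)

# Rough, assumed illustrative coefficients (1/um), not measured data:
alpha_blue = 2.5   # near 450 nm, absorbed quickly
alpha_red = 0.25   # near 650 nm, penetrates deeper

depth = 1.0  # um of silicon
print(round(fraction_absorbed(alpha_blue, depth), 2))  # blue: ~0.92
print(round(fraction_absorbed(alpha_red, depth), 2))   # red:  ~0.22
```

Under these assumed coefficients, a given depth of silicon absorbs most of the blue light but only a fraction of the red, consistent with the statement that the blue photon may be absorbed more efficiently despite the longer horizontal travel.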
In some cases, each image pixel 212 of
The size considerations for the collection areas of the photosensitive regions are not limited to just wavelengths in the visible spectrum. The same collection area and wavelength considerations may apply for image pixels with mixed sensitivity, such as including visible and infrared wavelengths.
In the example of
The size considerations for the collection areas of the photosensitive regions are not limited to just wavelengths in the visible spectrum and mixed visible and infrared. The same collection area and wavelength considerations may apply for image pixels dedicated only to infrared.
In the example of
In the examples described above, each photosensitive region defines a cuboid with a square collection aperture or collection area, such as
In some examples, multiple discrete photosensitive regions may be arranged together to create the desired shape for the given wavelength range. It follows that the collection areas for the multiple discrete photosensitive regions considered together define the overall collection area for the given wavelength range. The detected signal for the given wavelength range may be realized by summing the signals generated by the discrete photosensitive regions in the analog or digital domain. For example, in
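The digital-domain summing described above can be sketched as follows. The function name and readout values are hypothetical, for illustration only:

```python
# Hypothetical sketch: combine readouts from discrete photosensitive
# regions that together serve one wavelength range (digital-domain sum).
def combine_region_signals(signals):
    """Sum per-region digital readouts into one detected value."""
    return sum(signals)

# e.g., four discrete regions all routed the same wavelength range:
red_readouts = [120, 118, 121, 119]  # illustrative digital numbers
print(combine_region_signals(red_readouts))  # 478
```

An analog-domain equivalent would tie the regions to a shared readout node so that their photocharges accumulate before conversion, trading per-region readout flexibility for simplicity.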
The example photosensitive region 1400 designated for red defines a collection area with a first shape. In the example of
Still referring to
The example image pixel 212 of
Other considerations for the design of the color router 400 may include phase detection auto focus (PDAF) considerations. That is, the design and construction of the color router 400 may route photons to respective photosensitive regions not only based on wavelength, but also based on the physical location of the router collection area at which a photon enters the color router 400.
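Though not part of the original disclosure, the location-based routing idea can be illustrated with a minimal phase-comparison sketch. Here, `left` and `right` are hypothetical signal profiles read from photosensitive regions fed by the left and right halves of the router collection area; the shift that best aligns the two profiles indicates the direction and magnitude of defocus:

```python
# Hypothetical sketch of the PDAF phase comparison, not the patented design.
def pdaf_disparity(left, right, max_shift=3):
    """Return the shift (in samples) that best aligns `left` onto `right`,
    found by maximizing a simple cross-correlation."""
    best_shift, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = 0.0
        for i in range(len(left)):
            j = i + s
            if 0 <= j < len(right):
                score += left[i] * right[j]
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

# Out-of-focus scene: the right-half profile is displaced by two samples.
left = [0, 1, 5, 9, 5, 1, 0]
right = [0, 0, 0, 1, 5, 9, 5]
print(pdaf_disparity(left, right))  # 2
```

A zero disparity indicates the scene is in focus; the sign of a nonzero disparity indicates whether the lens should move nearer or farther.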
To describe such operation,
Using the image pixel 212 in
The same reasoning may be applied with respect to the quadrants shown in
In the example PDAF considerations discussed with respect to
In some examples, the image pixel may be associated with a collimator or other means of reducing an angle of incidence of incoming photons. As alluded to above, the ability of the color router 400 to perform the wavelength- and/or location-based routing may be impaired when the photon impinges upon the router collection area with a high angle of incidence. Reducing the angle of incidence of incoming photons to a color router may therefore improve routing efficiency.
As the name implies, the collimator 1710 is designed and constructed to at least partially collimate the photons collected by the collimator 1710 before those photons are incident upon the router collection area 410. Stated otherwise, the collimator 1710 is designed and constructed to modify the angle of incidence of at least some photons prior to those photons being incident upon the router collection area 410. The collimator 1710 may take any suitable form, such as a set of parallel walls that form a grid pattern. The collimator may be designed and constructed of a plurality of three-dimensional structures, such as cuboids, of materials having varying indices of refraction and/or different sizes. For example, the three-dimensional structures of the collimator layer may be selectively made of silicon dioxide and silicon nitride to modify the angle of incidence.
Turning to the design considerations for the color router 400, as noted above, the color router 400 may take any suitable form. In many cases, the color router 400 is constructed of several levels or layers, where each layer is designed and constructed to perform at least a partial routing. Each layer may be designed and constructed of a plurality of three-dimensional structures of materials having varying indices of refraction and/or different sizes. For example, the three-dimensional structures may be cuboids or other forms. The materials may include dielectrics and/or metallic materials. In order to reduce the complexity of the construction, constraints may be placed on the size of the elements that make up each layer of the color router.
In example cases, each layer is made of cuboids of a particular size or volume. For example, each cuboid of the layer 1900 defines a first size or first volume. Each cuboid of the layer 1902 defines a second size or second volume larger than the first size or the first volume. Each cuboid of the layer 1904 defines a third size or third volume larger than the second size or the second volume. The size change of the cuboids in each layer may be in any order, even though the example of
In some cases, the cuboids of each example layer are made of the same material, such as silicon oxide, silicon nitride, or titanium oxide, embedded in a lower refractive index material, such as silicon oxide and/or air. In other cases, however, the cuboids of each layer may be made of different materials than the layer above and/or below. For example, a particular layer may be made of oxide, an abutting layer may be made of nitride, and yet another layer may be made of metal, such as gold or silver. Keeping the material of the cuboids the same on each layer may simplify the design and/or construction of the color router 400. In other examples, a different combination of materials in a given layer may be used.
As before, each layer in
The above discussion is meant to be illustrative of the principles and various embodiments of the present invention. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
This application claims the benefit of U.S. Prov. App. No. 63/266,804 filed Jan. 14, 2022 titled “Nanophotonic Color Filter and Lens.” The provisional application is incorporated by reference herein as if reproduced in full below.
| Number | Date | Country |
|---|---|---|
| 63266804 | Jan 2022 | US |