This disclosure relates generally to image sensors, and in particular but not exclusively, relates to 3D image sensors.
Recently, three-dimensional (3D) movies have been shown in cinemas and have received a great deal of attention from the public. 3D TVs and displays are also available for home use, and 3D movies can be played at home using a 3D DVD player. A 3D display may also be known as a stereoscopic display.
Displaying a 3D image is essentially accomplished by allowing the left eye to see a left view only and the right eye to see a right view only. The difference between the left and right views creates a parallax such that the viewer feels as if he or she is seeing a 3D object. To let the left eye see the left view only and the right eye see the right view only, the two views must be isolated from each other. The left and right views provided by a display may be encoded in time, i.e., alternately displayed. A pair of goggle shutters may be turned on and off in a synchronized fashion, such that by wearing the goggles the left eye will see the left view only and the right eye will see the right view only. The left and right views may also be polarization-encoded. For example, the left view may be horizontally polarized while the right view is vertically polarized. A pair of polarized goggles may then have a horizontal polarizer for the left eye and a vertical polarizer for the right eye. By wearing the polarized goggles, the left and right eyes will see their respective views only.
Some conventional 3D imaging systems use two separate cameras to capture a 3D image—one to record the left view and another to record the right view. Other conventional systems may record the left and right views using a single camera having two individual image sensors. Further conventional systems may include a single camera that uses a single image sensor, where the left and right views are recorded in left and right halves of the image sensor, respectively.
In more recent conventional systems, a single camera that uses a single image sensor may be implemented where the left and right views are separated using a lenticular-lens or micro-lens array such that the left and right images are interleaved onto the image sensor. However, the interleaving using a lenticular-lens or micro-lens array may increase pixel crosstalk and may also introduce crosstalk between the captured left and right images.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of an Image Sensor with Optical Filters having Alternating Polarization for 3D Imaging are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
For example, pixel array 105 is a two-dimensional array of backside or frontside illuminated imaging pixels (e.g., pixels P1, P2 . . . , Pn). In one embodiment, each pixel is an active pixel sensor (“APS”), such as a complementary metal-oxide-semiconductor (“CMOS”) imaging pixel. As illustrated, each pixel is arranged into a row (e.g., rows R1 to Ry) and a column (e.g., columns C1 to Cx) to acquire image data of a person, place, or object, which can then be used to render an image of the person, place, or object.
After each pixel has acquired its image data or image charge, the image data is read out by readout circuitry 110 and transferred to function logic 115. Readout circuitry 110 may include amplification circuitry, analog-to-digital conversion circuitry, or otherwise. Function logic 115 may simply store the image data or may manipulate the image data by applying post image effects (e.g., crop, rotate, remove red eye, adjust brightness, adjust contrast, or otherwise). In one embodiment, readout circuitry 110 may read out a row of image data at a time along readout column lines (illustrated) or may read out the image data using a variety of other techniques (not illustrated), such as a serial readout or a full parallel readout of all pixels simultaneously.
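As a rough illustration of the row-at-a-time readout and simple post-processing described above (a sketch only; the function names and data values are hypothetical, not from this disclosure):

```python
# Hypothetical model of row-at-a-time readout (readout circuitry 110)
# feeding function logic 115, which stores the data and may apply a
# simple post image effect such as a brightness adjustment.

def readout_by_rows(pixel_array):
    """Yield one row of image data at a time, as readout circuitry 110 might."""
    for row in pixel_array:
        yield list(row)

def adjust_brightness(image, offset):
    """A simple post image effect applied by function logic 115 (clamped to 8-bit)."""
    return [[max(0, min(255, p + offset)) for p in row] for row in image]

# Usage: read out a 2x3 frame row by row, then brighten it.
frame = [[10, 20, 30], [40, 50, 60]]
stored = [row for row in readout_by_rows(frame)]  # function logic stores the data
brightened = adjust_brightness(stored, 5)
```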
Control circuitry 120 is coupled to pixel array 105 to control operational characteristics of pixel array 105. For example, control circuitry 120 may generate a shutter signal for controlling image acquisition.
In
Reset transistor T2 is coupled between a power rail VDD and the floating diffusion node FD to reset (e.g., discharge or charge the FD to a preset voltage) under control of a reset signal RST. The floating diffusion node FD is coupled to the gate of SF transistor T3. SF transistor T3 is coupled between the power rail VDD and select transistor T4. SF transistor T3 operates as a source-follower providing a high impedance output from floating diffusion node FD. Finally, select transistor T4 selectively couples the output of pixel circuitry 200 to the readout column line under control of a select signal SEL. In one embodiment, the TX signal, the RST signal, and the SEL signal are generated by control circuitry 120. The TX signal, the RST signal, the SEL signal, VDD, and ground may be routed in pixel circuitry 200 by way of metal interconnect layers included in the image sensor.
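The reset/transfer/select sequence described above can be sketched as a toy simulation (an illustrative model only; the class, voltage values, and signal-dependent drop are hypothetical and not from this disclosure):

```python
# Toy model of a 4T pixel's control sequence: RST resets the floating
# diffusion (FD) to a preset voltage, TX transfers photo-generated charge
# onto FD (lowering its voltage), and SEL gates the buffered FD voltage
# onto the readout column line.

VDD = 3.3  # hypothetical preset reset voltage

class FourTPixel:
    def __init__(self):
        self.fd = 0.0          # floating diffusion node voltage
        self.selected = False

    def reset(self):           # RST asserted via reset transistor T2
        self.fd = VDD

    def transfer(self, charge_drop):  # TX asserted via transfer transistor
        self.fd -= charge_drop        # transferred charge lowers FD voltage

    def select(self):          # SEL asserted via select transistor T4
        self.selected = True

    def column_output(self):
        # SF transistor T3 buffers FD onto the column line when selected
        return self.fd if self.selected else None

pixel = FourTPixel()
pixel.reset()
pixel.transfer(0.8)   # hypothetical signal-dependent voltage drop
pixel.select()
out = pixel.column_output()   # approximately 2.5 V in this toy model
```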
As illustrated, each pixel unit of array 300 is arranged into a row (e.g., rows R1 to Rj) and a column (e.g., columns C1 to Ci). Thus, an image sensor in accordance with the teachings herein may include both an array of imaging pixels and an array of pixel units, where the array of pixel units is simply an organized grouping of the imaging pixels in the pixel array.
Embodiments of the present invention may be configured to receive polarization-encoded light that contains information for imaging a three-dimensional (3D) image. For example, light incident on the image sensor may include light polarized to a first polarization corresponding to a first view of a 3D object and may also include light polarized to a second polarization corresponding to a second view of the 3D object. In one embodiment, the first view is a “left view” and the second view is a “right view” to allow for the creation of a parallax by the image sensor. Thus, image sensors in accordance with the teachings given herein may include optical filters having alternating polarizations to filter the polarization-encoded light to photosensing regions of the pixel units.
For example,
The optical filters may cover an entire pixel unit (e.g., all four imaging pixels of a Bayer pattern, as shown in
By way of example,
Although
By alternating the polarizations of the optical filters for each adjacent column of pixel units, the first and second views of a 3D object may be interleaved across the image sensor. For example, if the left view is polarized along horizontal polarization and the right view is polarized along vertical polarization, the captured left and right images will be interleaved column by column.
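Downstream processing may then separate the two views by column parity. A minimal sketch (assuming, hypothetically, that each element of a row corresponds to one pixel-unit column, with the left view in even-indexed columns and the right view in odd-indexed columns):

```python
def deinterleave_columns(frame):
    """Split a column-interleaved frame into left and right images.

    Assumes pixel-unit columns alternate polarization, so even-indexed
    columns carry the left view and odd-indexed columns the right view.
    """
    left = [row[0::2] for row in frame]
    right = [row[1::2] for row in frame]
    return left, right

# Usage: "L*" marks left-view samples, "R*" marks right-view samples.
frame = [["L0", "R0", "L1", "R1"],
         ["L2", "R2", "L3", "R3"]]
left, right = deinterleave_columns(frame)
# left  -> [["L0", "L1"], ["L2", "L3"]]
# right -> [["R0", "R1"], ["R2", "R3"]]
```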
The optical filters may cover an entire pixel unit (e.g., all four imaging pixels of a Bayer pattern, as shown in
By way of example,
By alternating the polarizations of the optical filters for each adjacent row of pixel units, the first and second views of a 3D object may be interleaved across the image sensor. For example, if the left view is polarized along horizontal polarization and the right view is polarized along vertical polarization, the captured left and right images will be interleaved row by row.
The optical filters may cover an entire pixel unit (e.g., all four imaging pixels of a Bayer pattern, as shown in
By way of example,
Similarly, pixel units 602, 614, and 616 are all disposed in the same column C1 of array 600. Pixel unit 602 adjoins a top-side of pixel unit 614 and pixel unit 616 adjoins a bottom-side of pixel unit 614, such that pixel unit 614 is between pixel units 602 and 616. Optical filters 608 and 620, included in pixel units 602 and 616 respectively, both have the same polarization, such as vertical polarization. However, optical filter 618, included in the in-between pixel unit 614, has a polarization that is different from that of the optical filters above and below it. In one embodiment, the polarization of optical filter 618 is orthogonal to that of optical filters 608 and 620.
By alternating the polarizations of the optical filters for each adjacent row and for each adjacent column of pixel units, the first and second views of a 3D object may be interleaved across the image sensor. For example, if the left view is polarized along horizontal polarization and the right view is polarized along vertical polarization, the captured left and right images will be interleaved in a checkerboard pattern corresponding with the rows and columns of the pixel units.
Array 700 is similar to that of array 600 of
a) shows an embodiment of array 700 of pixel units 702 and 704 having alternate polarization of optical filters, while
If the left view is polarized along horizontal polarization and the right view is polarized along vertical polarization, the captured left and right images are sampled by array 700 and interleaved accordingly. For example,
e) shows that one R pixel, one B pixel, and two G pixels (i.e., indicated in
By alternating the polarizations of the optical filters for each adjacent row and for each adjacent column of pixel units, the first and second views of a 3D object may be interleaved across the image sensor. For example, if the left view is polarized along horizontal polarization and the right view is polarized along vertical polarization, the captured left and right images will be interleaved in a checkerboard pattern corresponding with the rows and columns of the pixel units. It is thus appreciated that the array of optical filters of alternating polarization may be arranged in any desired pattern. When the left view is polarized along a first polarization and the right view is polarized along a second polarization, the captured left and right images are sampled by the array of imaging pixels and interleaved according to the pattern of polarization of the corresponding optical filters.
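A checkerboard-interleaved frame can likewise be separated by row-plus-column parity. A hedged sketch (assuming, hypothetically, that the left view falls where the pixel-unit row and column indices sum to an even number):

```python
def deinterleave_checkerboard(frame):
    """Split a checkerboard-interleaved frame into sparse left/right images.

    Pixel units where (row + column) is even are assumed to carry the left
    view; the remaining positions are set to None here and would be filled
    by interpolation in a real pipeline.
    """
    rows, cols = len(frame), len(frame[0])
    left = [[frame[r][c] if (r + c) % 2 == 0 else None for c in range(cols)]
            for r in range(rows)]
    right = [[frame[r][c] if (r + c) % 2 == 1 else None for c in range(cols)]
             for r in range(rows)]
    return left, right

frame = [[1, 2],
         [3, 4]]
left, right = deinterleave_checkerboard(frame)
# left  -> [[1, None], [None, 4]]
# right -> [[None, 2], [3, None]]
```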
In the illustrated example, first channel 802 is configured to generate a first view 803 of a 3D object. In one embodiment, first view 803 may be a “left view” of the object. Second channel 804 may be configured to generate a second view 805 of the 3D object. By way of example, the second view 805 may be a “right view” of the object. When displayed together the first and second views 803 and 805 may allow for the creation of a parallax of the object being imaged.
As shown in
In one embodiment, optional polarizers 816 and 826 are not included, such that first and second views 803 and 805 are non-polarized views of the object being imaged. In this embodiment, beam splitter 808 is a polarizing beam splitter configured to polarize and combine first and second views 803 and 805. For example, beam splitter 808 may polarize first view 803 with a first polarization and may also polarize second view 805 with a second polarization. Beam splitter 808 may then combine the polarized first and second views to generate the polarization-encoded light 801.
In another embodiment, beam splitter 808 is a non-polarizing beam splitter and optional polarizers 816 and 826 are included in their respective channels. Polarizer 816 is then configured to polarize the first view with a first polarization, while polarizer 826 is configured to polarize the second view with a second polarization. Non-polarizing beam splitter 808 then combines the polarized first and second views, 803 and 805, to generate the polarization-encoded light 801. Although the polarizers 816 and 826 are illustrated on a light-incident side of the imaging lenses 810 and 820, the polarizers can be disposed at any position along the optical path of their respective channels before reaching non-polarizing beam splitter 808.
In one example, first view 803 is a “left view” of the object being imaged and is horizontally polarized by polarizing beam splitter 808. Similarly, second view 805 may be a “right view” of the object and is vertically polarized by polarizing beam splitter 808. In this way, imaging pixels of image sensor 806 that include optical filters of horizontal polarization capture a left image of the object, while imaging pixels that include optical filters of vertical polarization capture a right image of the object.
Additionally, 3D display 916 may include any device that displays (e.g., outputs, projects, emits, presents, etc.) a parallax of the object imaged by image sensor 910. For example, 3D display 916 may be an LCD display, a stereoscopic display, a projector, a lenticular printer, etc.
Disposed on semiconductor layer 1010 is a layer (or combination of layers) 1016. Layer(s) 1016 may include one or more metal layers for routing electrical signals between imaging pixel 1014 and readout or control circuitry disposed on a periphery area (not shown) of image sensor 1000. By way of example, periphery circuitry of image sensor 1000 may include readout circuitry 110, function logic 115, and control circuitry 120 of
Also shown in
In operation, polarization-encoded light 1035 is received at a light incident side of image sensor 1000. The polarization-encoded light 1035 includes light 1037 that is horizontally polarized and light 1039 that is vertically polarized. In one embodiment, light 1037 corresponds with a “left view” of a 3D object, while light 1039 corresponds with a “right view” of the 3D object, such that image sensor 1000 may form a parallax image of the object. Since optical filter 1026 has a horizontal polarization, optical filter 1026 transmits (i.e., passes through) the horizontally polarized light 1037 of the polarization-encoded light 1035, but blocks (i.e., absorbs) the vertically polarized light 1039. Similarly, optical filter 1028 of the adjacent pixel unit 1004 transmits the vertically polarized light 1039 and blocks the horizontally polarized light 1037. Thus, photosensing region 1008 may capture a portion of the left image of the object, while photosensing region 1012 may capture a portion of the right image of the object.
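The transmit/block behavior described above can be approximated with Malus's law for an ideal linear polarizer (a physics sketch, not a device model from this disclosure): the transmitted intensity is I·cos²θ, where θ is the angle between the light's polarization and the filter's transmission axis.

```python
import math

def transmitted_intensity(intensity, light_angle_deg, filter_angle_deg):
    """Ideal linear polarizer: Malus's law, I_out = I * cos^2(theta)."""
    theta = math.radians(light_angle_deg - filter_angle_deg)
    return intensity * math.cos(theta) ** 2

# A horizontal filter (0 degrees) passes horizontally polarized light
# essentially in full and blocks vertically polarized light.
passed = transmitted_intensity(1.0, 0, 0)    # -> 1.0
blocked = transmitted_intensity(1.0, 90, 0)  # -> ~0.0
```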
In one embodiment, optical filter 1026 is made from polyvinyl alcohol (PVA) impregnated with iodine. During manufacture of the optical filters, the PVA polymer chains may be stretched such that they form an array of aligned, linear molecules in the material. The iodine dopant attaches to the PVA molecules and makes them conducting along the length of the chains. Light polarized parallel to the chains is absorbed, and light polarized perpendicular to the chains is transmitted. For the embodiments described above, a polarized optical filter may be formed on the image sensor during manufacture of the image sensor using a standard lithographic process. The iodine crystals in the polarizer array may be aligned by applying electric or magnetic fields during the formation of the horizontally and vertically polarized optical filters, respectively. With the crystals aligned and fixed, the optical filter may absorb light which is polarized parallel to the direction of the crystal alignment, and transmit light which is polarized perpendicular to it. In another embodiment, the optical filters may be cut from polarizer sheets and pasted onto the image sensor.
Image sensor 1100 is similar to image sensor 1000 of
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
Number | Name | Date | Kind |
---|---|---|---|
5727239 | Hankawa et al. | Mar 1998 | A |
5835133 | Moreton et al. | Nov 1998 | A |
5877040 | Park et al. | Mar 1999 | A |
6108130 | Raj | Aug 2000 | A |
6395576 | Chang et al. | May 2002 | B1 |
6545741 | Meltzer | Apr 2003 | B2 |
7154527 | Goldstein et al. | Dec 2006 | B1 |
20040070767 | Tobiason et al. | Apr 2004 | A1 |
20040246352 | Suzuki | Dec 2004 | A1 |
20060283952 | Wang | Dec 2006 | A1 |
20080231946 | Scott et al. | Sep 2008 | A1 |
20090272880 | Stanton et al. | Nov 2009 | A1 |
20100200738 | Yamashita | Aug 2010 | A1 |
20100282945 | Yokogawa | Nov 2010 | A1 |
20100321476 | Martinez et al. | Dec 2010 | A1 |
20110266441 | Fagan et al. | Nov 2011 | A1 |
20110316983 | Hiramoto et al. | Dec 2011 | A1 |
20120075513 | Chipman et al. | Mar 2012 | A1 |
20120307018 | Damstra et al. | Dec 2012 | A1 |
Number | Date | Country |
---|---|---|
2001-016611 | Jan 2001 | JP |
2011-43365 | Dec 2011 | TW |
WO 2010144866 | Dec 2010 | WO |
Entry |
---|
Ohta, Jun, “Smart CMOS image sensors and applications,” CRC Press Taylor & Francis Group, Boca Raton, 2007, pp. 80-83. |
TW 102102370—First Taiwan Office Action with English Translation, issued Jan. 28, 2015, 18 pages. |
CN 201310025756.7—First Chinese Office Action with English Translation, issued Feb. 6, 2015, 21 pages. |
Number | Date | Country |
---|---|---|
20130188023 A1 | Jul 2013 | US |