1. Field of the Invention
The present invention relates to the fields of directional light modulation, 3D displays, emissive micro displays, and 2D/3D switchable displays.
2. Prior Art
In 3D displays, directional modulation of the emitted light is necessary to create the 3D viewing perception. In a typical 3D display, a backlight providing uniform illumination in multiple illumination directions is required in order to display images of the same scene from different directions, using some combination of spatial and temporal multiplexing in the spatial light modulator. In these 3D displays the light coming from the directional backlight is usually processed by a directionally selective filter (such as a diffractive plate or a holographic optical plate, for example) before it reaches the spatial light modulator pixels, which modulate the light's color and intensity while preserving its directionality.
Currently available directional light modulators are a combination of an illumination unit comprising multiple light sources and a directional modulation unit that directs the light emitted from the light sources to a designated direction (see
In both electro-mechanically and electro-optically modulated directional light modulators there are three main drawbacks:
1. Response time: The mechanical movement or optical surface change is typically not achieved instantaneously and degrades the modulator response time. In addition, these operations usually consume a portion of the image frame time, which reduces the achievable display brightness.
2. Volumetric aspects: These methods need a working distance between the light source and the directional modulation device, which increases the total volume of the display.
3. Light loss: Coupling light onto a moving mirror creates light losses, which in turn degrade the display system power efficiency and create heat that has to be removed by incorporating bulky cooling methods that add still more volume and power consumption.
In addition to being slow, bulky and optically lossy, the prior art directional backlight units need to have narrow spectral bandwidth, high collimation and individual controllability in order to be combined with a directionally selective filter for 3D display purposes. Achieving narrow spectral bandwidth and high collimation requires device-level innovations and optical light conditioning, increasing the cost and the volumetric aspects of the overall display system. Achieving individual controllability requires additional circuitry and multiple light sources, increasing the system complexity, bulk and cost. U.S. patent application Ser. No. 13/329,107 introduced a novel spatio-optical directional light modulator that overcomes most of these drawbacks; however, its angular coverage is limited by the numerical aperture of its light collimation optics.
It is therefore an objective of this invention to introduce an extended angular coverage spatio-temporal light modulator that overcomes the limitation of the prior art, thus making it feasible to create 3D and high resolution 2D displays that provide the volumetric advantages plus a viewing experience over a wide viewing angle. Additional objectives and advantages of this invention will become apparent from the following detailed description of a preferred embodiment thereof that proceeds with reference to the accompanying drawings.
References in the following detailed description of the present invention to “one embodiment” or “an embodiment” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in this detailed description are not necessarily all referring to the same embodiment.
A new class of emissive micro-scale pixel array devices has recently been introduced. These devices feature high brightness, very fast multi-color light intensity and spatial modulation capabilities in a very small single device size that includes all the drive circuitry. The SSL light emitting pixels of one such device may be either a light emitting diode (LED) or laser diode (LD) whose on-off state is controlled by the drive circuitry contained within a CMOS chip (or device) upon which the emissive micro-scale pixel array is bonded. The size of the pixels comprising the emissive array of such devices would typically be in the range of approximately 5-20 microns, with the typical emissive surface area of the device being in the range of approximately 15-150 square millimeters. The pixels within the emissive micro-scale pixel array device are individually addressable spatially, chromatically and temporally, typically through the drive circuitry of its CMOS chip. One example of such devices is the QPI device (see U.S. Pat. Nos. 7,623,560, 7,767,479, 7,829,902, 8,049,231, and 8,098,265, and U.S. Patent Application Publication Nos. 2010/0066921, 2012/0033113), referred to in the exemplary embodiments described below. Another example of such devices is an OLED-based micro-display. However, it is to be understood that the QPI device is merely an example of the types of devices that may be used in the present invention. Thus, in the description to follow, references to a QPI device are to be understood to be for purposes of specificity in the embodiments disclosed, and not for any limitation of the present invention.
The present invention combines the emissive micro pixel array capabilities of the QPI device with passive wafer level optics (WLO) and an articulated movement of the entire assembly to create a light modulator that can simultaneously perform the functions of the directional light source and the diffractive plate of the prior art. As used herein, wafer level or wafer means a device or matrix of devices having a diameter of at least 2 inches, and more preferably 4 inches or more. WLO are fabricated monolithically on the wafer from a polymer using ultraviolet (UV) imprint lithography. Among the primary advantages of WLO are the ability to fabricate small-feature micro lens arrays (MLA) and the ability to precisely align multiple WLO micro lens array layers together and with an optoelectronic device such as a CMOS sensor or the QPI. The alignment precision that can be achieved by a typical WLO fabrication technique can be less than one micron. The combination of the individual pixel addressability of the emissive micro emitter pixel array of the QPI and the WLO micro lens array (MLA) that can be precisely aligned with respect to the micro emitter array of the QPI eliminates the need experienced in the prior art for a directionally selective filter in the system, while relaxing the requirement for narrow spectral bandwidth in the light source, simultaneously reducing the system volume, complexity and cost. In this invention directional modulation of the emitted light is achieved by the combination of the light divergence achieved by the WLO and the articulated movement of the entire assembly.
Referring to
The x-axis and y-axis articulation of QPI/MLA assembly 230 as illustrated in
The 2-axis articulation of the QPI/MLA assembly 230 of the spatio-temporal directional light modulator 200 of this invention can be either temporally continuous or discrete (stepwise).
Referring to
As an alternative, using the 3×3 example again, if Θx represents the angular extent (half angle) of one lens element around the x axis and Θy represents the angular extent of one lens element around the y axis and if αx equals 2Θx and αy equals 2Θy, the total angular extent, including the articulation, will be three times the angular extent of one micro lens element (3 times 2Θx or 3 times 2Θy). By way of example, for the x axis, these three contiguous angular extents will be:
(−αx−Θx) to (−Θx)
(−Θx) to (Θx), and
(Θx) to (Θx+αx)
each angular extent also constituting an angular increment in articulation.
The three contiguous individual angular extents in each direction can be considered as a two dimensional angular extent matrix as follows:
1, 2, 3
4, 5, 6
7, 8, 9
This alternative is a discrete technique, namely to display angular extent 1 for an allotted time, then advance around a first axis by one angular increment and display angular extent 2 for the same allotted time, then advance one more angular increment and display angular extent 3 for the allotted time, then advance one angular increment on the other axis to display angular extent 6 for the allotted time, then go back one angular increment on the first axis and display angular extent 5 for the allotted time, etc. After angular extent 9 is displayed for the allotted time, one could repeat angular extent 9 (continue displaying it for twice the allotted time) and then backtrack, so as to avoid changing more than one angular increment on one axis at a time, though this would be expected to create a flicker unless a higher rate were used. A better approach would be to go from angular extent 9 to angular extent 1, a jump of two angular increments on the two axes at the same time. However, a jump of two angular increments on two axes should not take twice as long as a change of one angular increment on one axis, as the x and y axes are independent of each other, and any change comprises an angular acceleration followed by an angular deceleration, so the average velocity is higher for a change of two angular increments than for a change of one angular increment. Still further alternatives might include a combination of discrete and continuous techniques. The point is that there are many alternatives one could choose from, all of which are within the scope of the present invention.
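By way of a non-limiting illustration of this discrete scanning order, the following sketch (in Python, with the 3×3 matrix size and serpentine ordering taken from the example above; function and variable names are merely illustrative) enumerates the articulation steps so that only one axis advances by one angular increment per step, except for the final wrap-around from angular extent 9 back to angular extent 1.

```python
# Illustrative sketch (not from the specification) of the discrete, stepwise
# articulation sequence for the 3x3 angular extent matrix described above:
#   1, 2, 3
#   4, 5, 6
#   7, 8, 9
# The serpentine ordering advances by only one angular increment on a single
# axis per step, except for the wrap-around from extent 9 back to extent 1.

def serpentine_order(rows=3, cols=3):
    """Visit the angular extent matrix one increment at a time on one axis."""
    order = []
    for r in range(rows):
        cols_in_row = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cols_in_row:
            order.append((r, c))
    return order

def extent_id(r, c, cols=3):
    """Map matrix position (row, column) to the extent numbering used above."""
    return r * cols + c + 1

for step, (r, c) in enumerate(serpentine_order()):
    print(f"step {step}: display angular extent {extent_id(r, c)} for the allotted time")
# Prints the extents in the order 1, 2, 3, 6, 5, 4, 7, 8, 9.
```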
One embodiment of this invention, herein referred to as 600, is illustrated in
Referring to the side view illustration of
Referring to the side view illustration of
The drive electrical signals to the set of electromagnets 635, which are generated by the QPI device 210 and supplied to the set of electromagnets 635 via the metal rails and contacts incorporated in the hinged interior segment 625 described earlier, would comprise a base component and a correction component. The base component of the drive electrical signals to the set of electromagnets 635 would represent a nominal value, and the correction component would be derived from an angular articulation error value generated by a set of four sensors positioned on the backside of the hinged interior segment 625 in alignment with the hinges 624 and 626. These sensors would be an array of infrared (IR) detectors placed on the backside of the interior segment 625 in alignment with four IR emitters placed on the topside of the base layer 630. The output values of these four IR detector arrays would be routed to the QPI device, again via the metal rails and contacts incorporated in the hinged interior segment 625 described earlier, and used to compute an estimate of the error between the derived and the actual articulation angle, which would be incorporated as a correction to the drive signals provided by the QPI to the set of electromagnets 635. The sensors positioned on the backside of the hinged interior segment 625 could also be micro-scale gyros properly aligned to detect the actuation angle along each of the two axes of the gimbal.
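The relationship between the base and correction components of the electromagnet drive signals can be summarized by the following minimal sketch, assuming a simple proportional correction; the gains, function and variable names are illustrative assumptions and are not taken from the specification.

```python
# Minimal sketch, assuming a simple proportional correction: the electromagnet
# drive signal is the sum of a nominal (base) component corresponding to the
# commanded articulation angle and a correction component derived from the
# error between the commanded angle and the angle reported by the IR detector
# arrays (or micro-scale gyros).  Gains and names are illustrative only.

def electromagnet_drive(alpha_commanded_deg, alpha_sensed_deg,
                        base_gain=1.0, correction_gain=0.2):
    base = base_gain * alpha_commanded_deg           # nominal drive component
    error = alpha_commanded_deg - alpha_sensed_deg   # articulation-angle error
    correction = correction_gain * error             # feedback correction
    return base + correction

# Example: +10 degrees commanded about one gimbal axis, sensors report +9.4.
print(electromagnet_drive(10.0, 9.4))  # nominal drive plus a small correction
```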
Another embodiment of this invention is illustrated in
The hinged pad 721/735 is retained in place within the surface curvature of the pedestal 730/736 by the spring layer 725 which contains at each of its four corners a single spiral shaped spring 726 that is etched into the spring layer 725. As illustrated in
Also illustrated in
The actuation of the hinged pad 721 together with the bonded QPI/MLA assembly 230 would be accomplished using a set of electromagnets embedded within the spherical pivot 735 and a set of permanent magnets embedded within the spherical socket 736. The actuation electrical drive signal would be routed to the electromagnets embedded within the spherical pivot 735 in order to effect the actuation movement described in the earlier paragraphs. The base component of the actuation electrical drive signals to the electromagnets embedded within the spherical pivot 735 would represent a nominal value, and a correction component would be derived from an angular articulation error value generated by a set of four sensors positioned on the backside of the hinged pad 721. These sensors are an array of infrared (IR) detectors placed on the backside of the hinged pad 721 in alignment with four IR emitters placed on the topside of the base layer 730. The output values of these four IR detector arrays would be routed to the QPI device, again via the metal rails and contacts incorporated in the hinged pad 721 described earlier, and used to compute an estimate of the error between the derived and the actual articulation angle, which would be incorporated as a correction to the drive signals provided by the QPI device to the set of electromagnets embedded within the spherical pivot 735. The sensors positioned on the backside of the hinged pad 721 could also be micro-scale gyros properly aligned to detect the actuation angle along each of the two axes of the gimbal.
The permanent magnets embedded within the spherical socket 736 would be thin magnetic rods or wires, typically of neodymium magnet (Nd2Fe14B) or the like, and would be shaped to provide a uniform magnetic field across the curved cavity of the spherical socket 736. Actuation of the hinged pad 721 together with the bonded QPI/MLA assembly 230 as described earlier would be accomplished by driving the set of electromagnets embedded within the spherical pivot 735 with an electrical signal having the appropriate temporal amplitude variation to effect the appropriate temporal variation in the magnetic attraction between the set of electromagnets embedded within the spherical pivot 735 and the permanent magnets embedded within the spherical socket 736, which would cause the hinged pad 721 together with the bonded QPI/MLA assembly 230 to be temporally articulated as described earlier. The drive electrical signals to the set of electromagnets embedded within the spherical pivot 735, which are generated by the QPI device and routed via the metal rails and contacts incorporated on the hinged pad 721 described earlier, would be made synchronous with the pixel modulation performed by the QPI device to an extent that will enable the desired directional modulation of the intensity and color modulated light emitted from the pixel array of the QPI device. The temporal variation of the drive electrical signals to the set of electromagnets embedded within the spherical pivot 735 would be selected to enable the temporal angular articulation of the hinged pad 721 together with the bonded QPI/MLA assembly 230 along both of their x-axis and y-axis as illustrated in
A person skilled in the art would know that the gimbal actuators of the embodiments 600 and 700 of this invention described in the previous paragraphs can be implemented to achieve substantially the same objective by exchanging the positions of the electromagnets and the permanent magnets.
The two exemplary embodiments 600 and 700 of this invention differ mainly in the maximum value αmax of the temporal angular articulation α(t) each can achieve and in the outer area each embodiment needs beyond the boundary of the QPI/MLA assembly 230. First, as illustrated in
The angular extent Θ of the MLA 220 micro lens system 810, 820 and 830 can be made either larger or smaller than the ±15° of the exemplary embodiment of
It should be noted that unlike the prior art that uses a scanning mirror to tempo-directionally modulate a light beam, the spatio-temporal light modulator of this invention differs in one very important aspect in that it generates, at any given instant of time, a multiplicity of light beams that are directionally modulated simultaneously. In the case of the spatio-temporal light modulators of this invention, the multiplicity of directionally modulated light beams would be temporally multiplexed by the articulation of the gimbaled QPI/MLA assembly 230 to expand the directional modulation resolution and angular extent. As explained earlier (see
In addition to the directional modulation capabilities of the spatio-temporal directional light modulator of this invention, spatial modulation would also be possible using an array of (N×M) of the QPI pixel modulation groups Gi such as that described in the previous design example. If, for example, it is required to create a directional light modulator of this invention with a spatial modulation resolution of N=16 by M=16 that provides the 9×(128)2=147,456 directional modulation resolution of the previous example, the spatio-temporal directional light modulator of this invention would comprise an array of (16×16) directional modulation groups Gi, and when a QPI with (5×5) micron pixel size is used, the total size of the spatio-temporal directional light modulator would be approximately 10.24×10.24 mm. Using the angular extent values of the previous example, the light emitted from such a spatio-optical directional light modulator of this invention can be spatially modulated at a resolution of (16×16) and directionally modulated at a resolution of 147,456 within the angular extent ±45°, and can also be modulated in color and intensity in each direction.
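The arithmetic of this design example can be verified with the short sketch below; the numerical values are those given in the text, while the variable names are merely illustrative.

```python
# Arithmetic of the design example above (values taken from the text).
pixel_pitch_um   = 5     # QPI pixel size: 5x5 micron
group_pixels     = 128   # pixels per modulation group Gi, per axis
groups_per_axis  = 16    # N = M = 16 modulation groups
articulation_pos = 3     # 3x3 angular increments provided by the gimbal
theta_deg        = 15    # angular extent (half angle) of one MLA lens element

# Directional resolution: 9 articulation positions x 128x128 directions per group
directional_resolution = articulation_pos**2 * group_pixels**2   # 147,456
# Spatial resolution: the 16x16 array of modulation groups
spatial_resolution = (groups_per_axis, groups_per_axis)
# Emissive aperture: 16 groups x 128 pixels x 5 micron = 10.24 mm per side
device_size_mm = groups_per_axis * group_pixels * pixel_pitch_um / 1000.0
# Total angular extent: three contiguous 2*theta extents, i.e. +/-45 degrees
half_angle_deg = articulation_pos * theta_deg

print(directional_resolution, spatial_resolution, device_size_mm, half_angle_deg)
# -> 147456 (16, 16) 10.24 45
```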
As illustrated by the previous examples, the spatial and directional modulation resolutions of the spatio-temporal light modulator of this invention, in terms of the number of individually addressable directions within a given angular extent, would be determined by selecting the resolution and pixel pitch of the emissive micro emitter array QPI device 210, the pitch of the MLA 220 lens elements, the angular extent of the MLA 220 lens elements and the maximum articulation angle of the modulator gimbal. It is obvious to a person skilled in the art that the MLA lens system can be designed to allow either a wider or narrower angular extent, the gimbal design can be selected to allow either a wider or narrower articulation angle, and the number of pixels within each modulation group can be selected to be either smaller or larger in order to create a spatio-temporal directional light modulator that can achieve any desired spatial and directional modulation capabilities following the teachings provided in the preceding discussion.
Any desired spatial and directional modulation capabilities can be realized using the spatio-optical directional light modulators of this invention. The previous example illustrated how a spatio-optical directional light modulator of this invention with (16)2 spatial resolution and (3×128)2 directional resolution can be implemented using a single 10.24×10.24 mm QPI device 210. In order to realize higher spatial resolution, the spatio-temporal directional light modulator of this invention can be implemented as a tiled array comprising a multiplicity of smaller spatial resolution spatio-temporal directional light modulators of this invention. For example, when an array of (3×3) of the spatio-temporal directional light modulators of the previous example is tiled as illustrated in
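Since the tiled example above refers to an illustration not reproduced here, the following minimal sketch only shows the tiling arithmetic that follows from the stated tile count, offered as an assumption: tiling multiplies the spatial resolution and the emissive aperture by the number of tiles per axis, while the per-device directional resolution is unchanged.

```python
# Tiled-array sketch (assumption): a (3x3) tile of the single-device example.
tiles_per_axis        = 3
device_spatial_groups = 16      # modulation groups per axis in one device
device_size_mm        = 10.24   # emissive aperture of one device, per side

tiled_spatial_resolution = tiles_per_axis * device_spatial_groups   # 48 per axis
tiled_size_mm            = tiles_per_axis * device_size_mm          # 30.72 mm per side

print(tiled_spatial_resolution, tiled_size_mm)
```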
The principle of operation of the spatio-temporal directional light modulator of this invention will be described in reference to the illustrations of
Where αx(t) and αy(t) are the values of the articulation angles around the x-axis and y-axis at the time epoch t, respectively; the angles θ(t) and φ(t) are the values of the directional modulation spherical coordinates at the time epoch t, with the polar axis at θ=0 parallel to the z-axis of the emissive surface of the modulation group Gi; and m=log2 n is the number of bits used to express the x and y pixel resolution within the modulation group Gi. The spatial resolution of the spatio-temporal directional light modulator of this invention is defined by the coordinates (X, Y) of each of the individual modulation groups Gi within the two dimensional array of modulation groups comprising the overall spatio-temporal directional light modulator. In essence, the spatio-temporal light modulator of this invention would be capable of temporally generating (modulating) a light field described by the spatial coordinates (X, Y) defined by its modulation group array and the directional coordinates (θ, φ), with the latter being defined by the values of the coordinates (x, y) of the emissive pixels within the modulation group Gi and the temporal value of the articulation angle of the spatio-temporal directional light modulator as defined by Eq. 1 and 2 above.
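Eq. 1 and 2 themselves are not reproduced in this text. The sketch below is therefore only a hypothetical stand-in consistent with the description above, assuming a simple linear mapping from the pixel coordinates (x, y) within a modulation group, the MLA half-angle Θ and the articulation angles αx(t), αy(t) to the spherical direction (θ, φ); it should not be read as the specification's actual equations, and all names are illustrative.

```python
# Hypothetical illustration only (assumed mapping, not Eq. 1 and 2 of the
# specification): derive a beam direction (theta, phi) from the pixel position
# (x, y) inside a modulation group, the MLA half-angle and the instantaneous
# articulation angles alpha_x(t) and alpha_y(t).
import math

def beam_direction(x, y, n, theta_mla_deg, alpha_x_deg, alpha_y_deg):
    """x, y in [0, n-1]; returns (theta, phi) in degrees, theta = 0 along the surface normal."""
    # Pixel position under its micro lens -> angular offset within +/- theta_mla,
    # shifted by the gimbal articulation about each axis.
    tilt_x = (2.0 * x / (n - 1) - 1.0) * theta_mla_deg + alpha_x_deg
    tilt_y = (2.0 * y / (n - 1) - 1.0) * theta_mla_deg + alpha_y_deg
    # Convert the two tilt angles to the spherical coordinates used in the text.
    tx, ty = math.tan(math.radians(tilt_x)), math.tan(math.radians(tilt_y))
    theta = math.degrees(math.atan(math.hypot(tx, ty)))
    phi = math.degrees(math.atan2(ty, tx))
    return theta, phi

# Example: corner pixel of a 128x128 group, 15 degree MLA half-angle,
# gimbal articulated +30 degrees about the x-axis -> roughly (45, 0) degrees.
print(beam_direction(x=127, y=64, n=128, theta_mla_deg=15, alpha_x_deg=30, alpha_y_deg=0))
```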
Using 16 bits for representing the directional modulation and the typical 24 bits for representing the modulated light intensity and color in each direction, the total number of bits that would represent the modulation data word for each modulation group would be 40 bits. Assuming, without loss of generality, that such 40-bit words would be input to the spatio-temporal directional light modulator of this invention for addressing its constituent modulation groups sequentially; i.e., sequential addressing is used to input the modulation group data 40-bit words, block 120 of
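A possible layout of such a 40-bit modulation group data word is sketched below. The text does not specify the bit ordering, so the field layout (8 directional bits per axis plus 8 bits each of R, G and B) is an assumption for illustration only.

```python
# Illustrative packing of a 40-bit modulation group data word:
# 16 directional bits (assumed 8 per axis) + 24 color/intensity bits (R, G, B).

def pack_word(dir_x, dir_y, r, g, b):
    """Pack five 8-bit fields into one 40-bit word (field order is assumed)."""
    assert all(0 <= v < 256 for v in (dir_x, dir_y, r, g, b))
    return (dir_x << 32) | (dir_y << 24) | (r << 16) | (g << 8) | b

def unpack_word(word):
    """Recover the five 8-bit fields from a 40-bit word."""
    return ((word >> 32) & 0xFF, (word >> 24) & 0xFF,
            (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF)

w = pack_word(dir_x=200, dir_y=31, r=255, g=128, b=0)
assert unpack_word(w) == (200, 31, 255, 128, 0)
print(f"{w:010x}")  # the 40-bit word as ten hexadecimal digits
```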
Possible Applications
The spatio-temporal directional light modulator of this invention can be used to implement a 3D display with an arbitrary size that is realized, for example, as a tiled array of a multiplicity of spatio-temporal directional light modulator devices such as that illustrated in
The spatio-temporal directional light modulator of this invention can also be used as a backlight for a liquid crystal display (LCD) to implement a 3D display. The spatio-temporal directional light modulator of this invention can also be operated as a 2D high resolution display. In this case the individual pixels of the QPI device 210 would be used to modulate the color and intensity while the MLA 220 would be used to fill the viewing angle of the display. It is also possible for the spatio-temporal light modulator of this invention to be switched between 2D and 3D display modes by adapting the format of its input data to be commensurate with the desired operational mode. When the spatio-temporal directional light modulator of this invention is used as a 2D display, its light angular extent will be that associated with its MLA 220 micro lens element plus the articulation angle of its gimbal, ±(Θ+αmax), with the pixel resolution of the individual modulation groups Gi leveraged to achieve higher spatial resolution.
This application is a divisional of U.S. patent application Ser. No. 13/546,858 filed Jul. 11, 2012 now U.S. Pat. No. 8,854,724 B2 which claims the benefit of U.S. Provisional Patent Application No. 61/616,249 filed Mar. 27, 2012.
Number | Name | Date | Kind |
---|---|---|---|
5059008 | Flood et al. | Oct 1991 | A |
5691836 | Clark | Nov 1997 | A |
5986811 | Wohlstadter | Nov 1999 | A |
6137535 | Meyers | Oct 2000 | A |
6151167 | Melville | Nov 2000 | A |
6433907 | Lippert et al. | Aug 2002 | B1 |
6795221 | Urey | Sep 2004 | B1 |
6795241 | Holzbach | Sep 2004 | B1 |
6803561 | Dunfield | Oct 2004 | B2 |
6924476 | Wine et al. | Aug 2005 | B2 |
6937221 | Lippert et al. | Aug 2005 | B2 |
6999238 | Glebov et al. | Feb 2006 | B2 |
7009652 | Tanida et al. | Mar 2006 | B1 |
7061450 | Bright et al. | Jun 2006 | B2 |
7071594 | Yan et al. | Jul 2006 | B1 |
7106519 | Aizenberg et al. | Sep 2006 | B2 |
7190329 | Lewis et al. | Mar 2007 | B2 |
7193758 | Wiklof et al. | Mar 2007 | B2 |
7209271 | Lewis et al. | Apr 2007 | B2 |
7215475 | Woodgate et al. | May 2007 | B2 |
7232071 | Lewis et al. | Jun 2007 | B2 |
7286143 | Kang et al. | Oct 2007 | B2 |
7292257 | Kang et al. | Nov 2007 | B2 |
7324687 | Zitnick, III et al. | Jan 2008 | B2 |
7334901 | El-Ghoroury | Feb 2008 | B2 |
7369321 | Ren et al. | May 2008 | B1 |
7379583 | Zitnick, III et al. | May 2008 | B2 |
7400439 | Holman | Jul 2008 | B2 |
7482730 | Davis et al. | Jan 2009 | B2 |
7486255 | Brown et al. | Feb 2009 | B2 |
7561620 | Winder et al. | Jul 2009 | B2 |
7580007 | Brown et al. | Aug 2009 | B2 |
7609906 | Matusik et al. | Oct 2009 | B2 |
7619807 | Baek et al. | Nov 2009 | B2 |
7620309 | Georgiev | Nov 2009 | B2 |
7623560 | El-Ghoroury et al. | Nov 2009 | B2 |
7630118 | Onvlee | Dec 2009 | B2 |
7639293 | Narabu | Dec 2009 | B2 |
7656428 | Trutna, Jr. | Feb 2010 | B2 |
7671893 | Li et al. | Mar 2010 | B2 |
7702016 | Winder et al. | Apr 2010 | B2 |
7703924 | Nayar | Apr 2010 | B2 |
7724210 | Sprague et al. | May 2010 | B2 |
7732744 | Utagawa | Jun 2010 | B2 |
7767479 | El-Ghoroury et al. | Aug 2010 | B2 |
7780364 | Raskar et al. | Aug 2010 | B2 |
7791810 | Powell | Sep 2010 | B2 |
7792423 | Raskar et al. | Sep 2010 | B2 |
7829902 | El-Ghoroury et al. | Nov 2010 | B2 |
7835079 | El-Ghoroury et al. | Nov 2010 | B2 |
7841726 | Conner | Nov 2010 | B2 |
7872796 | Georgiev | Jan 2011 | B2 |
7880794 | Yamagata et al. | Feb 2011 | B2 |
7897910 | Roichman et al. | Mar 2011 | B2 |
7916934 | Vetro et al. | Mar 2011 | B2 |
7936392 | Ng et al. | May 2011 | B2 |
7949252 | Georgiev | May 2011 | B1 |
7952809 | Takai | May 2011 | B2 |
7956924 | Georgiev | Jun 2011 | B2 |
7957061 | Connor | Jun 2011 | B1 |
7962033 | Georgiev | Jun 2011 | B2 |
7965936 | Raskar et al. | Jun 2011 | B2 |
8009358 | Zalevsky et al. | Aug 2011 | B2 |
8019215 | Georgiev et al. | Sep 2011 | B2 |
8049231 | El-Ghoroury et al. | Nov 2011 | B2 |
8098265 | El-Ghoroury et al. | Jan 2012 | B2 |
8106994 | Ichimura | Jan 2012 | B2 |
8126323 | Georgiev et al. | Feb 2012 | B2 |
8681185 | Guncer | Mar 2014 | B2 |
8749620 | Knight et al. | Jun 2014 | B1 |
8854724 | El-Ghoroury et al. | Oct 2014 | B2 |
8928969 | Alpaslan et al. | Jan 2015 | B2 |
8970646 | Guncer | Mar 2015 | B2 |
20030107804 | Dolgoff | Jun 2003 | A1 |
20050179868 | Seo et al. | Aug 2005 | A1 |
20060061660 | Brackmann | Mar 2006 | A1 |
20060098285 | Woodgate et al. | May 2006 | A1 |
20060221209 | McGuire et al. | Oct 2006 | A1 |
20060238723 | El-Ghoroury | Oct 2006 | A1 |
20070045518 | Mishina et al. | Mar 2007 | A1 |
20070109813 | Copeland et al. | May 2007 | A1 |
20070279535 | Fiolka | Dec 2007 | A1 |
20080117491 | Robinson | May 2008 | A1 |
20080144174 | Lucente et al. | Jun 2008 | A1 |
20080170293 | Lucente et al. | Jul 2008 | A1 |
20080218853 | El-Ghoroury et al. | Sep 2008 | A1 |
20080278808 | Redert | Nov 2008 | A1 |
20090086170 | El-Ghoroury et al. | Apr 2009 | A1 |
20090140131 | Utagawa | Jun 2009 | A1 |
20090190022 | Ichimura | Jul 2009 | A1 |
20090278998 | El-Ghoroury et al. | Nov 2009 | A1 |
20100003777 | El-Ghoroury et al. | Jan 2010 | A1 |
20100007804 | Guncer | Jan 2010 | A1 |
20100026852 | Ng et al. | Feb 2010 | A1 |
20100026960 | Sprague | Feb 2010 | A1 |
20100066921 | El-Ghoroury et al. | Mar 2010 | A1 |
20100085468 | Park et al. | Apr 2010 | A1 |
20100091050 | El-Ghoroury et al. | Apr 2010 | A1 |
20100165155 | Chang | Jul 2010 | A1 |
20100208342 | Olsen | Aug 2010 | A1 |
20100220042 | El-Ghoroury et al. | Sep 2010 | A1 |
20100225679 | Guncer | Sep 2010 | A1 |
20100245957 | Hudman et al. | Sep 2010 | A1 |
20100265369 | Chang | Oct 2010 | A1 |
20100265386 | Raskar et al. | Oct 2010 | A1 |
20110075257 | Hua et al. | Mar 2011 | A1 |
20110096156 | Kim et al. | Apr 2011 | A1 |
20110128393 | Tavi et al. | Jun 2011 | A1 |
20110157367 | Chang | Jun 2011 | A1 |
20110157387 | Han et al. | Jun 2011 | A1 |
20110234841 | Akeley et al. | Sep 2011 | A1 |
20120307357 | Choi et al. | Dec 2012 | A1 |
20130033586 | Hulyalkar | Feb 2013 | A1 |
20130141895 | Alpaslan et al. | Jun 2013 | A1 |
20130148950 | Chang | Jun 2013 | A1 |
Number | Date | Country |
---|---|---|
2190019 | May 2010 | EP |
2398235 | Dec 2011 | EP |
2008-304572 | Dec 2008 | JP |
2010-117398 | May 2010 | JP |
WO-2005048599 | May 2005 | WO |
WO-2007092545 | Aug 2007 | WO |
WO-2011065738 | Jun 2011 | WO |
Entry |
---|
“International Search Report and Written Opinion of the International Searching Authority Dated Mar. 19, 2013, International Application No. PCT/US2012/068029”, (Mar. 19, 2013). |
“International Search Report and Written Opinion of the International Searching Authority Dated Sep. 18, 2013; International Application No. PCT/US2012/068028”, (Sep. 18, 2013). |
“Invitation to Pay Additional Fees, Partial Search Report Dated Jan. 25, 2013, International Application No. PCT/US2012/068028”, (Jan. 25, 2013). |
“Notice of Allowance Dated Aug. 21, 2014; U.S. Appl. No. 13/329,107”, (Aug. 21, 2014). |
“Notice of Allowance Dated May 30, 2014; U.S. Appl. No. 13/546,858”, (May 30, 2014). |
“Office Action Dated Mar. 21, 2014; U.S. Appl. No. 13/329,107”, (Mar. 21, 2014). |
“Office Action Dated Nov. 22, 2013; U.S. Appl. No. 13/546,858”, (Nov. 22, 2013). |
“Office Action Dated Sep. 26, 2013; U.S. Appl. No. 13/546,858”, (Sep. 26, 2013). |
Adelson, Edward H., et al., “Single Lens Stereo with a Plenoptic Camera”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 2, (Feb. 1992), pp. 99-106. |
Bolles, Robert C., et al., “Epipolar-Plane Image Analysis: An Approach to Determining Structure from Motion”, International Journal of Computer Vision, vol. 1, (1987), pp. 7-55. |
Georgiev, Todor , et al., “Light Field Camera Design for Integral View Photography”, Adobe Technical Report, (2003), pp. 1-13. |
Nayar, Shree K., “Computational Cameras: Approaches, Benefits and Limits”, Columbia University Technical Report No. CUCS-001-11, (Jan. 15, 2011), pp. 1-22. |
Ng, Ren , “Digital Light Field Photography”, Stanford University Doctorial Thesis, (Jul. 2006), 203 pp. total. |
Ng, Ren , et al., “Light Field Photography with a Hand-held Plenoptic Camera”, Stanford University Tech Report CTSR 2005-02, (2005), pp. 1-11. |
Veeraraghavan, Ashok , et al., “Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing”, Mitsubishi Electric Research Laboratories (MERL) TR2007-115; ACM Transactions on Graphics, vol. 26, No. 3, Article 69, (Jul. 2007), pp. 69-1-69-12, 14 pp. total. |
Arai, Jun , “Depth-control method for integral imaging”, Optics Letters, vol. 33, No. 3, (Feb. 1, 2008), pp. 279-281. |
Arai, Jun , et al., “Effects of focusing on the resolution characteristics of integral photography”, J. Opt. Soc. Am. A, vol. 20, No. 6, (Jun. 2003), pp. 996-1004. |
Baasantseren, Ganbat , et al., “Computational Integral Imaging with Enhanced Depth Sensitivity”, Journal of Information Display, vol. 10, No. 1, (Mar. 2009), pp. 1-5. |
Baasantseren, Ganbat , et al., “Integral floating-image display using two lenses with reduced distortion and enhanced depth”, Journal of the SID, vol. 18, No. 7, (2010), pp. 519-526. |
Baasantseren, Ganbat , et al., “Viewing angle enhanced integral imaging display using two elemental image masks”, Optics Express, vol. 17, No. 16, (Aug. 3, 2009), pp. 14405-14417. |
Bagheri, Saeed , et al., “A Fast Optimization Method for Extension of Depth-of-Field in Three-Dimensional Task-Specific Imaging Systems”, Journal of Display Technology, vol. 6, No. 10, (Oct. 2010), pp. 412-421. |
Castro, Albertina , et al., “Integral imaging with large depth of field using an asymmetric phase mask”, Opt. Express, vol. 15, (2007), pp. 10266-12073. |
Choi, Heejin , et al., “Depth- and viewing-angle-enhanced 3-D/2-D switchable display system with high contrast ratio using multiple display devices and a lens array”, Journal of the SID, 15/5, (2007), pp. 315-320. |
Choi, Heejin , et al., “Depth-enhanced integral imaging using two parallel display devices”, Proceedings of the Pacific Rim Conference on Lasers and Electro-Optics 2005. CLEO/Pacific Rim 2005., (Aug. 2005), pp. 201-202. |
Choi, Heejin , et al., “Depth-enhanced integral imaging with a stepped lens array or a composite lens array for three-dimensional display”, Proceedings of the 16th Annual Meeting of the IEEE Lasers and Electro-Optics Society, 2003. LEOS 2003, vol. 2, (Oct. 27-28, 2003), pp. 730-731. |
Choi, Heejin , et al., “Improved analysis on the viewing angle of integral imaging”, Applied Optics, vol. 44, No. 12, (Apr. 20, 2005), pp. 2311-2317. |
Choi, Heejin , et al., “Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays”, Optics Express, vol. 11, No. 8, (Apr. 21, 2003), pp. 927-932. |
Choi, Heejin , et al., “Wide-viewing-angle 3D/2D convertible display system using two display devices and a lens array”, Optics Express, vol. 13, No. 21, (Oct. 17, 2005), pp. 8424-8432. |
Date, Munekazu , et al., “Depth reproducibility of multiview depth-fused 3-D display”, Journal of the SID, vol. 18, No. 7, (2010), pp. 470-475. |
Goodman, Joseph W., “Introduction to Fourier Optics, Third Edition”, Roberts & Company Publishers, (2005), pp. 138-145, 154-162, 186-212, 355-367. |
Hahn, Joonku , et al., “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators”, Optics Express, vol. 16, No. 16, (Aug. 4, 2008), pp. 12372-12386. |
Hudson, Alex , “Could 3D TV be dangerous to watch?”, BBC News, http://news.bbc.co.uk/2/hi/programmes/click—online/9378577.stm, (Jan. 28, 2011), 3 pp. total. |
Hyun, Joobong , et al., “Curved Projection Integral Imaging Using an Additional Large-Aperture Convex Lens for Viewing Angle Improvement”, ETRI Journal, vol. 31, No. 2, (Apr. 2009), pp. 105-110. |
Jang, Ju-Seog , et al., “Depth and lateral size control of three-dimensional images in projection integral imaging”, Optics Express, vol. 12, No. 16, (Aug. 9, 2004), pp. 3778-3790. |
Jang, Ju-Seog , et al., “Three-dimensional projection integral imaging using micro-convex-mirror arrays”, Optics Express, vol. 12, No. 6, (Mar. 22, 2004), pp. 1077-1083. |
Jang, Jae-Young , et al., “Viewing angle enhanced integral imaging display by using a high refractive index medium”, Applied Optics, vol. 50, No. 7, (Mar. 1, 2011), pp. B71-B76. |
Javidi, Bahram , et al., “New developments in active and passive 3D image sensing, visualization, and processing”, Proc. of SPIE, vol. 5986, (2005), pp. 598601-1 to 59806-11. |
Javidi, Bahram , et al., “Orthoscopic, long-focal-depth integral imaging by hybrid method”, Proc. of SPIE, vol. 6392, (2006), pp. 639203-1 to 639203-8. |
Jung, Sungyong , et al., “Depth-enhanced integral-imaging 3D display using different optical path lengths by polarization devices or mirror barrier array”, Journal of the SID, 12/4, (2004), pp. 461-467. |
Jung, Sungyong , et al., “Viewing-angle-enhanced integral 3-D imaging using double display devices with masks”, Opt. Eng., vol. 41, No. 10, (Oct. 2002), pp. 2389-2390. |
Jung, Sungyong , et al., “Viewing-angle-enhanced integral three-dimensional imaging along all directions without mechanical movement”, Optics Express, vol. 11, No. 12, (Jun. 16, 2003), pp. 1346-1356. |
Jung, Sungyong , et al., “Wide-viewing integral three-dimensional imaging by use of orthogonal polarization switching”, Applied Optics, vol. 42, No. 14, (May 10, 2003), pp. 2513-2520. |
Kavehvash, Zahra , et al., “Extension of depth of field using amplitude modulation of the pupil function for bio-imaging”, Proc. of SPIE, vol. 7690, (2010), pp. 76900O-1 to 76900O-8. |
Kim, Youngmin , et al., “Depth-enhanced integral floating imaging system with variable image planes using polymer-dispersed liquid-crystal films”, OSA Optics and Photonics Spring Congress, St. Petersburg, Florida, USA, paper JMA2, (2008), 3 pp. total. |
Kim, Yunhee , et al., “Depth-enhanced three-dimensional integral imaging by use of multilayered display devices”, Applied Optics, vol. 45, No. 18, (Jun. 20, 2006), pp. 4334-4343. |
Kim, Hwi , et al., “Image volume analysis of omnidirectional parallax regular-polyhedron three-dimensional displays”, Optics Express, vol. 17, No. 8, (Apr. 13, 2009), pp. 6389-6396. |
Kim, Yunhee , et al., “Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array”, Optics Express, vol. 15, No. 26, (Dec. 24, 2007), pp. 18253-18267. |
Kim, Youngmin , et al., “Projection-type integral imaging system using multiple elemental image layers”, Applied Optics, vol. 50, No. 7, (Mar. 1, 2011), pp. B18-B24. |
Kim, Hwi , et al., “The use of a negative index planoconcave lens array for wide-viewing angle integral imaging”, Optics Express, vol. 16, No. 26, (Dec. 22, 2008), pp. 21865-21880. |
Kim, Joowhan , et al., “Viewing region maximization of an integral floating display through location adjustment of viewing window”, Optics Express, vol. 15, No. 20, (Oct. 2007), pp. 13023-13034. |
Kim, Yunhee , et al., “Viewing-angle-enhanced integral imaging system using a curved lens array”, Optics Express, vol. 12, No. 3, (Feb. 9, 2004), pp. 421-429. |
Lee, Byoungho , et al., “Viewing-angle-enhanced integral imaging by lens switching”, Optics Letters, vol. 27, No. 10, (May 15, 2002), pp. 818-820. |
Martinez-Corral, Manuel , et al., “Integral imaging with extended depth of field”, Proc. of SPIE, vol. 6016, (2005), pp. 601602-1 to 601602-14. |
Martinez-Corral, Manuel , et al., “Integral imaging with improved depth of field Manuel by use of amplitude-modulated microlens arrays”, Applied Optics, vol. 43, No. 31, (Nov. 1, 2004), pp. 5806-5813. |
Martinez-Corral, Manuel , et al., “Orthoscopic, long-focal-depth 3D Integral Imaging”, Proc. of SPIE, vol. 6934, (2006), pp. 69340H-1 to 69340H-9. |
Martinez-Cuenca, Raul , et al., “Enhanced depth of field integral imaging with sensor resolution constraints”, Optics Express, vol. 12, No. 21, (Oct. 18, 2004), pp. 5237-5242. |
Martinez-Cuenca, R. , et al., “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system”, Optics Express, vol. 15, No. 24, (Nov. 26, 2007), pp. 16255-16260. |
Martinez-Cuenca, Raul , et al., “Extended Depth-of-Field 3-D Display and Visualization by Combination of Amplitude-Modulated Microlenses and Deconvolution Tools”, Journal of Display Technology, vol. 1, No. 2, (Dec. 2005), pp. 321-327. |
Min, Sung-Wook , et al., “Analysis of an optical depth converter used in a three-dimensional integral imaging system”, Applied Optics, vol. 43, No. 23, (Aug. 10, 2004), 2004), pp. 4539-4549. |
Min, Sung-Wook , et al., “New Characteristic Equation of Three-Dimensional Integral Imaging System and its Application”, Japanese Journal of Applied Physics, vol. 44, No. 2, (2005), pp. L71-L74. |
Navarro, H. , et al., “3D integral imaging display by smart pseudoscopic-to-orthoscopic conversion (SPOC)”, Optics Express, vol. 18, No. 25, (Dec. 6, 2010), pp. 25573-25583. |
Navarro, Hector , et al., “Method to Remedy Image Degradations Due to Facet Braiding in 3D Integral-Imaging Monitors”, Journal of Display Technology, vol. 6, No. 10, (Oct. 2010), pp. 404-411. |
Okano, Fumio , et al., “Depth Range of a 3D Image Sampled by a Lens Array with the Integral Method”, IEEE 3DTV-CON, (2009), 4 pp. total. |
Okoshi, Takanori , “Three-Dimensional Imaging Techniques”, Academic Press, Inc. Publishers, (1976), pp. 43-123, 295-349, 351-357. |
Park, Soon-Gi , et al., “2D/3D convertible display with enhanced 3D viewing region based on integral imaging”, Proc. of the SPIE, 7524, (2010), 9 pp. total. |
Park, Jae-Hyeung , et al., “Analysis of viewing parameters for two display methods based on integral photography”, Applied Optics, vol. 40, No. 29, (Oct. 10, 2001), pp. 5217-5232. |
Park, Chan-Kyu , et al., “Depth-extended integral imaging system based on a birefringence lens array providing polarization switchable focal lengths”, Optics Express, vol. 17, No. 21, (Oct. 12, 2009), pp. 19047-19054. |
Park, Jae-Hyeung , et al., “Integral imaging with multiple image planes using a uniaxial crystal plate”, Optics Express, vol. 11, No. 16, (Aug. 11, 2003), pp. 1862-1875. |
Park, Gilbae , et al., “Multi-viewer tracking integral imaging system and its viewing zone analysis”, Optics Express, vol. 17, No. 20, (Sep. 28, 2009), pp. 17895-17908. |
Park, Jae-Hyeung , et al., “Recent progress in three-dimensional information processing based on integral imaging”, Applied Optics, vol. 48, No. 34, (Dec. 1, 2009), pp. H77-H94. |
Ponce-Diaz, Rodrigo , et al., “Digital Magnification of Three-Dimensional Integral Images”, Journal of Display Technology, vol. 2, No. 3, (Sep. 2006), pp. 284-291. |
Saavedra, G. , et al., “Digital slicing of 3D scenes by Fourier filtering of integral images”, Optics Express, vol. 16, No. 22, (Oct. 27, 2008), pp. 17154-17160. |
Song, Yong-Wook , et al., “3D object scaling in integral imaging display by varying the spatial ray sampling rate”, Optics Express, vol. 13, No. 9, (May 2, 2005), pp. 3242-3251. |
Stern, Adrian , et al., “3-D computational synthetic aperture integral imaging (COMPSAII)”, Optics Express, vol. 11, No. 19, (Sep. 22, 2003), pp. 2446-2451. |
The Telegraph, “Samsung warns of dangers of 3D television”, The Telegraph, http://www.telegraph.co.uk/technology/news/7596241/Samsung-warns-of-dangers-of-3D-television.html, (Apr. 16, 2010), 2 pp. total. |
Tolosa, A. , et al., “Optical implementation of micro-zoom arrays for parallel focusing in integral imaging”, J. Opt. Soc. Am. A, vol. 27, No. 3, (Mar. 2010), pp. 495-500. |
Wakabayashi, Daisuke , “Panasonic, Japan Work on 3-D Safety”, The Wall Street Journal, http://blogs.wsj.com/digits/2011/01/06/panasonic-working-with-japan-on-3-d-standards/, (Jan. 6, 2011), 2 pp. total. |
Kim, Joohwan , et al., “A depth-enhanced floating display system based on integral imaging”, Proceedings of the 2006 SPIE-IS&T Electronic Imaging, SPIE vol. 6055, 60551F, (2006), pp. 60551F-1 to 60551F-9, April. |
Number | Date | Country | |
---|---|---|---|
20150033539 A1 | Feb 2015 | US |
Number | Date | Country | |
---|---|---|---|
61616249 | Mar 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13546858 | Jul 2012 | US |
Child | 14486758 | US |