The present invention relates to an image capture system, and more particularly to an image capture system utilizing light path length modulation.
Providing multiple focal planes, or discrete steps of focus adjustment, is useful for a number of applications, including the capture of three-dimensional data. In the prior art, multiple focus capture relied on mechanical movement, such as gears, or on liquid lenses. Such mechanisms are expensive, slow, and relatively fragile. Another prior art method of capturing multiple focal lengths uses multiple cameras; however, this approach is bulky and expensive. The bulk and expense also limit the number of focal lengths that can be captured simultaneously.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements.
An image capture system is described. The image capture system in one embodiment includes a modulation stack, made up of one or more digital light path length modulators. Each digital light path length modulator includes an optical path length extender (OPLE) and a polarization modulator, and can be used to adjust the path length of light. An OPLE is made up of one or more plates with a plurality of polarization-sensitive reflective elements. In one embodiment, light with state 1 polarization travels through a longer path in the OPLE than light with state 2 polarization. This can be used to capture data at two or more focal distances. In one embodiment, the image capture system may include a striped OPLE to capture images at two focal distances.
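For illustration only, the following sketch models this path-selection behavior in Python. The class names, variable names, and path-length values are hypothetical, not taken from the patent:

```python
# Toy model of a digital light path length modulator: each polarization
# state selects one of two optical path lengths through the OPLE.
from dataclasses import dataclass

@dataclass
class DigitalLightPathModulator:
    short_path_mm: float  # path for state 2 polarized light
    long_path_mm: float   # path for state 1 polarized light

    def path_length(self, state: int) -> float:
        """Return the optical path length for polarization state 1 or 2."""
        return self.long_path_mm if state == 1 else self.short_path_mm

def total_path_length(stack, states):
    """A modulation stack's total path is the sum over its modulators."""
    return sum(m.path_length(s) for m, s in zip(stack, states))

stack = [DigitalLightPathModulator(10.0, 12.0),
         DigitalLightPathModulator(10.0, 14.0)]
print(total_path_length(stack, (1, 2)))  # 12.0 + 10.0 = 22.0 mm
```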
This mechanism can be used for cameras and image capture, and various other uses in which light waves or other waves in a similar spectrum are captured. Cameras may include conventional cameras with both fixed and interchangeable lenses, including but not limited to cell phone cameras and DSLR cameras, medical cameras such as endoscopes, and other types of probes or cameras which acquire image data.
The following detailed description of embodiments of the invention makes reference to the accompanying drawings, in which like references indicate similar elements, and which show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized, and that logical, mechanical, electrical, functional, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
In one embodiment, by positioning the focus distance adjustments within the camera lens assembly, the system can be used to retrofit an existing camera system to capture multiple focal distances utilizing a single CCD (charge-coupled device) or other image capture mechanism. The camera system 100 is merely an example of an image capture system.
Processor 270 in one embodiment may be used to control the image capture sub-system 205, to adjust the light path lengths. The processor 270 may also be used to post-process the captured data, in one embodiment. Memory 275 may be a buffer memory, enabling the image capture system 200 to capture image and video data. I/O system 280 makes the captured image data available to other systems. In one embodiment, post-processing takes place in a separate device, such as a computer or server system, which provides post-processing for the captured image data.
The image capture subsystem 205 includes, in one embodiment, capture assembly 220, modulation stack 230, and imager system 255. The lens 215 in one embodiment is a conventional lens to capture an image.
The polarizer 225 allows light with a particular polarization to pass through. In one embodiment, the polarization may be S/P polarization or circular polarization. The polarizer 225 in one embodiment is a linear polarizer; in another embodiment, it is a linear polarizer combined with a backwards circular quarter-wave polarizer. Such a combined polarizer addresses glare issues.
The capture assembly 220 includes a modulation stack 230. The modulation stack 230 includes one or more digital light path length modulators 235, 240. The digital light path length modulators 235, 240 alter the light path length based on the polarization of the light. In one embodiment, polarizer 225 may be positioned after modulation stack 230.
Capture assembly 220 is used to capture images at multiple focal distances. In one embodiment, the image capture subsystem 205 may include additional mechanical and optical elements 250 which can provide correction or alteration of the image.
The image data is captured by imager system 255. Imager system 255 includes an image sensor 260. In one embodiment, it further includes a digital correction system 265. In another embodiment, the digital correction may take place in a different system from the image capture system 200. This may include, in one embodiment, reconstructing the image(s), based on the captured data, and data from the modulation stack.
In one embodiment, the system may receive data from auxiliary data system 285, which may be used to control the image capture sub-system 205, directly or through processor 270. The auxiliary data system 285 may provide information for selecting the appropriate focal lengths. As noted above, the modulation stack 230 allows image elements to be captured at various focal distances. The auxiliary data system 285 may be used to select the focal distances, based on various factors.
One such auxiliary input may be content-based focal distance selection 290, which enables the system to selectively choose one or more focal distances based on what is being captured. For example, the system may selectively choose a portion of the image data for focus. In one embodiment, content-based focal distance selection 290 may utilize a distance sensor to detect a distance for capture. One example of a distance sensor that may be used is a time-of-flight sensor. Based on the type of system, the focal distance selection may vary. For example, to capture a fingerprint including blood flow data, the system may sense a distance to the finger, and then identify a second focal distance beyond that distance. In one embodiment, the user input system 292 may enable a user to select the use of the image capture system 200. In one embodiment, such selection may be made via a remote system, such as a mobile device paired with the image capture system 200.
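As a minimal sketch of the fingerprint example above, the following assumes a time-of-flight distance reading and a hypothetical subsurface offset; the 2 mm value is illustrative, not from the patent:

```python
def select_focal_distances(tof_distance_mm: float,
                           subsurface_offset_mm: float = 2.0) -> list[float]:
    """Pick the sensed surface distance plus a second focal distance beyond it."""
    return [tof_distance_mm, tof_distance_mm + subsurface_offset_mm]

# e.g. a finger sensed 45 mm away yields one focal distance at the skin
# surface and a second one 2 mm deeper, to capture blood flow data
print(select_focal_distances(45.0))  # [45.0, 47.0]
```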
User input systems 292 enable focus selection based on user input, through various mechanisms. Such user input may be provided via gesture, voice control, an adjustment dial or another selection mechanism. As noted above, the user input system 292 may be coupled with the content-based focal distance selection 290, such that the user enters the targeted image capture and purpose, and the actual focal distances are selected automatically. User input systems 292 may include video game controllers, microphones, cameras, inertial measurement sensors, coupled devices, and other sensors for detecting user input.
Other control data 293 may also be provided to the system. Any of this data from the auxiliary data system 285 may be used to adjust the focal distances for the one or more captured image elements. In one embodiment, in addition to the auxiliary data system 285, the system may also accept manual adjustment 295.
In one embodiment, the image capture system 200 may capture multiple focal distances within a single image, utilizing the polarizer and a striped OPLE, as will be described below.
In one embodiment, the image capture system 200 may include a display system 299 to display the captured image. In one embodiment, the display system 299 may be a conventional display of a camera, showing what image is being captured. In one embodiment, the display system 299 may utilize a near-eye display system or other system which enables display of multiple image elements at multiple virtual distances.
In one embodiment, the capture assembly 310 may correct for chromatic aberration and other irregularities of optical systems.
With the shown set of three different OPLEs, the system can create up to eight (2³) virtual object distances by selectively modulating the polarization: each OPLE independently switches light between its shorter and its longer path, as sketched below.
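A sketch of this combinatorial behavior, with made-up path-length increments chosen so that all eight totals are distinct (the values are illustrative only):

```python
from itertools import product

# (short, long) path length in mm for each of three hypothetical OPLEs
ople_paths_mm = [(10.0, 11.0), (10.0, 12.0), (10.0, 14.0)]

# Each OPLE is switched independently, so there are 2**3 = 8 combinations.
for states in product((0, 1), repeat=len(ople_paths_mm)):
    total = sum(paths[s] for paths, s in zip(ople_paths_mm, states))
    print(states, total)  # eight distinct totals: 30.0 mm through 37.0 mm
```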
In one embodiment, the configuration utilizes two OPLEs to capture image elements at four focal distances.
The stripes 540 provide a light blocking coating on the entry surface of the OPLE. In one embodiment, the light blocking coating is a metal. In one embodiment, the light blocking coating is aluminum. The polarization coating on the OPLE enables the striping of images at two focal distances onto a single image capture apparatus. The separate images can then be reconstructed in post-processing.
In one embodiment, the striped OPLE may assist in capturing multiple focal distances simultaneously. In one embodiment, additionally or alternatively, focal distances may be captured time sequentially, e.g. with different focal distances captured as the length of the image path is changed.
In one embodiment, for simultaneous capture of images, the reconstruction may utilize edge finding to identify the separation between the focal planes, which have a magnification difference.
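One way such edge finding might work, sketched under the assumption of horizontal stripes and a simple row-statistics heuristic (this is not the patent's prescribed algorithm):

```python
import numpy as np

def find_stripe_boundaries(image: np.ndarray, threshold: float = 10.0) -> np.ndarray:
    """Return row indices where mean intensity jumps sharply, which can hint
    at the boundaries between stripes captured at different focal planes."""
    row_means = image.mean(axis=1)      # one value per image row
    jumps = np.abs(np.diff(row_means))  # change between adjacent rows
    return np.where(jumps > threshold)[0] + 1
```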
As shown in the figure, striped OPLE 593 is positioned in proximity to the imaging assembly 594. The striped OPLE separates the portion of the light with S-polarization from the portion of the light with P-polarization, and has a longer light path for one of those polarizations. The imaging assembly 594 utilizes the image data to display image elements at two focal distances, corresponding to the two polarizations.
In one embodiment, the image correction and image assembly may be done using processor 596. In one embodiment, memory 597 may store the data, and I/O system 598 may obtain the data to be displayed from outside the display system 580. In one embodiment, additional optical elements, and optionally additional OPLEs or modulation stack elements may be incorporated into the system.
At block 620 the system selects one or more focal planes for capture. In one embodiment, the selection may be made manually by a user. In one embodiment, the selection may be made automatically by the system based on an evaluation of the image, as discussed above. In one embodiment, the focal planes may be a plurality of discrete focal steps, covering the real-world image being captured.
At block 630, the first image/subframe is captured with a first polarization, at a first focal distance. At block 640, the second image/subframe is captured with a second polarization, at a second focal distance. In one embodiment, the first and second images may be captured simultaneously. In another embodiment, the first and second images may be captured sequentially.
At block 650, the images are reconstructed. In one embodiment, a series of images captured at a plurality of focal lengths may be combined to create a single multi-dimensional image. In one embodiment, simultaneously captured images may be reconstructed by separating the image portions at the plurality of focal distances. In one embodiment, other adjustments may be made to the image, including color correction, optical correction, etc. This may be done by a processor on the image capture device, or on a separate computer system.
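For combining a focal series into a single image, here is a hedged sketch of one common approach, focus stacking by per-pixel sharpness; the patent does not prescribe this particular method, and numpy plus scipy are assumed:

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(images: list[np.ndarray]) -> np.ndarray:
    """Merge grayscale captures at different focal distances into one image,
    keeping each pixel from the capture where it is locally sharpest."""
    stack = np.stack(images)  # shape (n, height, width)
    sharpness = np.stack([uniform_filter(np.abs(laplace(im.astype(float))), size=9)
                          for im in images])
    best = sharpness.argmax(axis=0)  # index of sharpest capture per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```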
At block 660, the captured image data is stored and/or displayed. In one embodiment, the image data may be stored in memory. In one embodiment, interim image data may be stored in a cache memory, and the final image data may be stored in memory, which may be a flash memory, a hard drive, or another type of storage. In one embodiment, the display may use a near-eye display system or another system capable of displaying multi-focal images. The process then ends at block 670. In one embodiment, if the system is capturing video or a stream of images, the system may automatically return to block 620 to select subsequent focal planes and capture additional images, until it is turned off or otherwise terminated. Although this flowchart shows only two images/subframes being captured, it should be understood that any number of images/subframes may be captured, at different focal distances, utilizing the described system and process.
At block 720, the system selects one or more focal planes for image capture. In one embodiment, for a system providing simultaneous capture, a plurality of simultaneously captured focal planes may be selected. In one embodiment, for a system providing sequential capture, the first focal plane for capture may be selected.
At block 730, the system adjusts the modulation stack for the selected focal planes. As discussed above, two potential focal planes may be selected for simultaneous capture, utilizing a striped OPLE, or utilizing a polarizer.
At block 740 the image(s) are captured.
At block 750, the process determines whether all of the images to be captured have been captured. If not, the process returns to block 730 to adjust the light path length through the modulation stack. As noted above, this is done by altering the polarization of the light. If all images have been captured, the process continues to block 760.
At block 760, the process determines whether there are any simultaneously captured images. If not, at block 780, the captured image data is stored and/or displayed, and the process ends at block 790.
If there were simultaneously captured images, the process continues to block 770. The simultaneous images are captured on the same CCD/sensor, so the system, at block 770, reconstructs the individual images in software. In one embodiment, this may be done by identifying the edges of each captured image segment, separating the image segments by focal distance, and combining the image segments to generate an image. The process then continues to block 780.
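A rough sketch of this separation step, assuming a fixed stripe height and a crude nearest-row fill in place of real interpolation and magnification correction (both the stripe geometry and the fill strategy are assumptions for illustration):

```python
import numpy as np

def separate_striped_capture(frame: np.ndarray, stripe_h: int = 8):
    """Split one striped sensor frame into two images, one per focal distance."""
    rows = np.arange(frame.shape[0])
    band = (rows // stripe_h) % 2          # alternating 0/1 per stripe
    img_a, img_b = frame.copy(), frame.copy()
    for r in rows:
        if band[r] == 1:                   # row belongs to image B; patch A
            img_a[r] = frame[r - stripe_h]
        elif r >= stripe_h:                # row belongs to image A; patch B
            img_b[r] = frame[r - stripe_h]
    return img_a, img_b
```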
At block 820, image data is received through a striped OPLE. In one embodiment, this configuration includes only a single OPLE, which is striped. In one embodiment, the striping is as shown in the figure.
At block 830, the image is captured. As noted above, the image captures two focal distances in a single image, separated by polarization, based on the striping.
At block 840, the captured images are reconstructed. The reconstruction of the images in one embodiment creates two separate complete images, having different focal distances.
At block 850, the captured images are stored and/or displayed. The process then ends at block 860.
At block 920, the process determines whether the image capture system is capable of processing the data locally. There may be image capture systems that have integrated processors capable of processing the data locally. However, some systems cannot do so. For example, the system may be implemented as a retrofit lens for a camera. In such an embodiment, the modulation stack may be built into a camera lens assembly, enabling retrofitting of existing camera bodies for multi-focal capture. Such systems cannot perform post-processing. There are other embodiments in which the camera cannot perform processing, or can only do some of the potential processing needed to generate the final images.
If the camera can process locally, in one embodiment, it does so, and the process continues directly to block 930. If the image capture device cannot complete processing locally, at block 925, the post-processing, such as image separation, and building of multi-dimensional images, is implemented in a separate system. In one embodiment, the processing may be split between the computer and the image capture device, with the image capture device doing pre-processing.
In one embodiment, the separate system may be a client computer. In one embodiment, the separate system may be a server computer. If a server computer is used, in one embodiment, the user (or user's computer or other device) may upload the image data to a server, and receive the processed image data from the server. The process then continues to block 930.
At block 930, the process determines whether image capture was simultaneous. When image capture is simultaneous a single sensor captures interleaved slices of images at more than one focal distance. In such an instance the images are separated in post-processing, at block 935. The process then continues to block 940.
At block 940, the process determines whether the capture is differentiated by wavelength. In one embodiment, the images may be separated by wavelength, rather than focal length, or in addition to focal length. If so, at block 945, the captured images are separated by wavelength. This enables capturing data such as blood flow, and other image data not in the visible spectrum. The process then continues to block 950.
At block 950, the process determines whether the data is designed for polarity analysis. If so, differential intensity analysis is used to generate this data, at block 955. This may be used to discern the polarity of objects, based on the energy distribution from a light source. The differential intensity of two polarizations may be used to determine elliptical polarization. The process then continues to block 960.
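A minimal sketch of differential intensity analysis, assuming two intensity images captured at the two polarization states; the normalized difference below is a generic Stokes-like measure, not the patent's exact formulation:

```python
import numpy as np

def degree_of_polarization(i_state1: np.ndarray, i_state2: np.ndarray) -> np.ndarray:
    """Per-pixel normalized intensity difference between two polarizations.
    Values near +/-1 indicate strongly polarized light; near 0, unpolarized."""
    total = i_state1 + i_state2
    out = np.zeros_like(total, dtype=float)
    np.divide(i_state1 - i_state2, total, out=out, where=total > 0)
    return out
```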
At block 960, the process determines whether the data is designed for light field reconstruction. Light field reconstruction captures all the rays of light, so that each pixel captures not only the total light level, but the directionality of the light. By capturing a plurality of images, at different focal distances, the directionality of each light ray may be used to reconstruct the light field, at block 965. By capturing the light field, the system can later adjust the focus area(s). The process then continues to block 970.
At block 970, the process determines whether the data is for a 3D scan. A 3D scan reconstructs a three-dimensional image. By capturing a plurality of images at a plurality of focal distances, the system may be used to construct a 3D image. This is done at block 975.
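One way a 3D image might be constructed from such a focal series is depth from focus: for each pixel, take the focal distance at which that pixel is sharpest. The Laplacian sharpness measure below is an assumption for illustration, not specified by the patent (numpy and scipy assumed):

```python
import numpy as np
from scipy.ndimage import laplace

def depth_from_focus(images: list[np.ndarray],
                     focal_distances_mm: list[float]) -> np.ndarray:
    """Return a per-pixel depth map from captures at known focal distances."""
    sharpness = np.stack([np.abs(laplace(im.astype(float))) for im in images])
    best = sharpness.argmax(axis=0)              # sharpest capture per pixel
    return np.asarray(focal_distances_mm)[best]  # depth estimate in mm
```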
In one embodiment, the system may also be used for 360 degree spherical cameras. By capturing a sequence of images at different orientations, with uniform focal changes at each orientation, the system can capture data at a plurality of discrete focal steps that can be calibrated for. This may enable easier stitching of images to create the 360 degree image. Of course, this technique, like the others described, may be applied to video or image sequences as well as still images.
The process then ends at block 990.
The data processing system illustrated in the figure includes a bus or other internal communication means 1040 for communicating information, and a processing unit 1010 coupled to the bus 1040 for processing information.
The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 1020 (referred to as memory), coupled to bus 1040 for storing information and instructions to be executed by processor 1010. Main memory 1020 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 1010.
The system also comprises in one embodiment a read only memory (ROM) 1050 and/or static storage device 1050 coupled to bus 1040 for storing static information and instructions for processor 1010. In one embodiment, the system also includes a data storage device 1030 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system. Data storage device 1030 in one embodiment is coupled to bus 1040 for storing information and instructions.
The system may further be coupled to an output device 1070, such as a cathode ray tube (CRT) or a liquid crystal display (LCD) coupled to bus 1040 through bus 1060 for outputting information. The output device 1070 may be a visual output device, an audio output device, and/or tactile output device (e.g. vibrations, etc.).
An input device 1075 may be coupled to the bus 1060. The input device 1075 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 1010. An additional user input device 1080 may further be included. One such user input device 1080 is a cursor control device, such as a mouse, a trackball, a stylus, cursor direction keys, or a touch screen, which may be coupled to bus 1040 through bus 1060 for communicating direction information and command selections to processing unit 1010, and for controlling movement on display device 1070.
Another device, which may optionally be coupled to computer system 1000, is a network device 1085 for accessing other nodes of a distributed system via a network. The communication device 1085 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices. The communication device 1085 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 1000 and the outside world.
Note that any or all of the components of this system illustrated in the figure, and associated hardware, may be used in various embodiments of the present invention.
It will be appreciated by those of ordinary skill in the art that the particular machine that embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 1020, mass storage device 1030, or other storage medium locally or remotely accessible to processor 1010.
It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 1020 or read only memory 1050 and executed by processor 1010. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein and being readable by the mass storage device 1030 and for causing the processor 1010 to operate in accordance with the methods and teachings herein.
The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 1040, the processor 1010, and memory 1050 and/or 1020.
The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 (1075) or input device #2 (1080). The handheld device may also be configured to include an output device 1070 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.
The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a kiosk or a vehicle. For example, the appliance may include a processing unit 1010, a data storage device 1030, a bus 1040, and memory 1020, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate with the device in a basic manner. In general, the more special-purpose the device is, the fewer elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 1085.
It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 1010. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
The present application is a continuation of U.S. patent application Ser. No. 15/398,705, filed on Jan. 4, 2017 and issued on Aug. 21, 2018 as U.S. Pat. No. 10,057,488, which claims priority to U.S. patent application Ser. No. 15/236,101, filed on Aug. 12, 2016 (14100P0030). The present application also claims priority to U.S. patent application Ser. No. 15/358,040, filed on Nov. 21, 2016 (14100P0036). The above applications are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3586416 | Bitetto | Jun 1971 | A |
3856407 | Takeda et al. | Dec 1974 | A |
4253723 | Kojima | Mar 1981 | A |
4670744 | Buzak | Jun 1987 | A |
5610765 | Colucci | Mar 1997 | A |
5751243 | Turpin | May 1998 | A |
6134031 | Nishi et al. | Oct 2000 | A |
6515801 | Shimizu | Feb 2003 | B1 |
6580078 | O'Callaghan et al. | Jun 2003 | B1 |
7023548 | Pallingen | Apr 2006 | B2 |
7360899 | McGuire, Jr. et al. | Apr 2008 | B2 |
7798648 | Ijzerman et al. | Sep 2010 | B2 |
7905600 | Facius et al. | Mar 2011 | B2 |
8262234 | Watanabe | Sep 2012 | B2 |
8755113 | Gardner et al. | Jun 2014 | B2 |
9025067 | Gray | May 2015 | B2 |
9304319 | Bar-Zeev et al. | Apr 2016 | B2 |
9494805 | Ward et al. | Nov 2016 | B2 |
9588270 | Merrill et al. | Mar 2017 | B2 |
10057488 | Evans | Aug 2018 | B2 |
10185153 | Eash | Jan 2019 | B2 |
10187634 | Eash | Jan 2019 | B2 |
20010027125 | Kiyomatsu | Oct 2001 | A1 |
20020191300 | Neil | Dec 2002 | A1 |
20030020925 | Patel et al. | Jan 2003 | A1 |
20040156134 | Furuki et al. | Aug 2004 | A1 |
20040263806 | Silverstein et al. | Dec 2004 | A1 |
20050141076 | Bausenwein | Jun 2005 | A1 |
20060119951 | McGuire | Jun 2006 | A1 |
20070030456 | Duncan et al. | Feb 2007 | A1 |
20070030543 | Javidi et al. | Feb 2007 | A1 |
20070139760 | Baker et al. | Jun 2007 | A1 |
20070146638 | Ma et al. | Jun 2007 | A1 |
20080130887 | Harvey et al. | Jun 2008 | A1 |
20080174741 | Yanagisawa | Jul 2008 | A1 |
20080205244 | Kitabayashi | Aug 2008 | A1 |
20090021824 | Ijzerman et al. | Jan 2009 | A1 |
20090046262 | Okazaki | Feb 2009 | A1 |
20090052838 | McDowall et al. | Feb 2009 | A1 |
20090061505 | Hong | Mar 2009 | A1 |
20090061526 | Hong | Mar 2009 | A1 |
20090237785 | Bloom | Sep 2009 | A1 |
20090244355 | Horie | Oct 2009 | A1 |
20110032436 | Shimizu et al. | Feb 2011 | A1 |
20110075257 | Hua et al. | Mar 2011 | A1 |
20110149245 | Barth | Jun 2011 | A1 |
20120075588 | Suga | Mar 2012 | A1 |
20120113092 | Bar-Zeev et al. | May 2012 | A1 |
20130070338 | Gupta et al. | Mar 2013 | A1 |
20130100376 | Sawado | Apr 2013 | A1 |
20130222770 | Tomiyama | Aug 2013 | A1 |
20130344445 | Clube et al. | Dec 2013 | A1 |
20140168035 | Luebke et al. | Jun 2014 | A1 |
20140176818 | Watson et al. | Jun 2014 | A1 |
20150061976 | Ferri | Mar 2015 | A1 |
20150153572 | Miao | Jun 2015 | A1 |
20150205126 | Schowengerdt | Jul 2015 | A1 |
20150319342 | Schowengerdt | Nov 2015 | A1 |
20160041390 | Poon et al. | Feb 2016 | A1 |
20160041401 | Suga | Feb 2016 | A1 |
20160077338 | Robbins et al. | Mar 2016 | A1 |
20160131920 | Cook | May 2016 | A1 |
20160195718 | Evans | Jul 2016 | A1 |
20160225337 | Ek et al. | Aug 2016 | A1 |
20160227195 | Venkataraman et al. | Aug 2016 | A1 |
20160306168 | Singh | Oct 2016 | A1 |
20160381352 | Palmer | Dec 2016 | A1 |
20170038579 | Yeoh | Feb 2017 | A1 |
20170068103 | Huang et al. | Mar 2017 | A1 |
20170075126 | Carls et al. | Mar 2017 | A1 |
20170097507 | Yeoh | Apr 2017 | A1 |
20170146803 | Kishigami et al. | May 2017 | A1 |
20170160518 | Lanman et al. | Jun 2017 | A1 |
20170227770 | Carollo et al. | Aug 2017 | A1 |
20170269369 | Qin | Sep 2017 | A1 |
20180045973 | Evans | Feb 2018 | A1 |
20180045974 | Eash et al. | Feb 2018 | A1 |
20180045984 | Evans | Feb 2018 | A1 |
20180048814 | Evans | Feb 2018 | A1 |
20180149862 | Kessler et al. | May 2018 | A1 |
20180283969 | Wang | Oct 2018 | A1 |
20190007610 | Evans | Jan 2019 | A1 |
20190086675 | Carollo et al. | Mar 2019 | A1 |
20190155045 | Eash | May 2019 | A1 |
20190174124 | Eash | Jun 2019 | A1 |
Number | Date | Country |
---|---|---|
1910499 | Feb 2007 | CN |
102566049 | Jul 2012 | CN |
103765294 | Apr 2014 | CN |
105739093 | Jul 2016 | CN |
109997357 | Jul 2019 | CN |
0195584 | Sep 1986 | EP |
H06258673 | Sep 1994 | JP |
3384149 | Mar 2003 | JP |
2012104839 | Aug 2012 | WO |
2012175939 | Dec 2012 | WO |
2015190157 | Dec 2015 | WO |
2016087393 | Jun 2016 | WO |
Entry |
---|
Hu, Xinda et al., “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Opt. Express 22, 13896-13903 (2014). |
Lee, Yun-Han et al., "Switchable Lens for 3D Display, Augmented Reality and Virtual Reality," Society for Information Display (SID), International Symposium Digest of Technical Papers, vol. 47, Issue 1, May 25, 2016 (4 pages). |
Matjasec et al., "All-Optical Thermo-Optical Path Length Modulation based on the Vanadium-Doped Fibers," Optical Society of America, vol. 21, No. 10, May 2013, pp. 1-14. |
Pate, Michael, Polarization Conversion Systems for Digital Projectors, Web Publication, Apr. 21, 2006, Downloaded from http://www.zemax.com/os/resources/learn/knowledgebase/polarization-conversion-systems-for-digital-projectors on Jun. 17, 2016 (8 pages). |
PCT Search Report & Written Opinion PCT/US2017/046645, dated Oct. 30, 2017, 12 pages. |
Polatechno Co., Ltd., LCD Projector Components, http://www.polatechno.co.jp/english/products/projector.html downloaded Jun. 17, 2016 (2 pages). |
Sandner et al., “Translatory MEMS Actuators for optical path length modulation in miniaturized Fourier-Transform infrared spectrometers,” MEMS MOEMS 7(2), Apr.-Jun. 2008 pp. 1-11. |
Wang Hui, “Optical Science and Applications Series: Digital holographic three-dimensional display and detection”. ISBN: 9787313100801. Shanghai Jiaotong University Press. Published Nov. 1, 2013. (4 pages). |
Yang Tie Jun, Industry Patent Analysis Report (vol. 32)—New Display, ISBN: 7513033447. Intellectual Property Publishing House Co., Ltd. Published Jun. 2015. (4 pages). |
Number | Date | Country | |
---|---|---|---|
20190007610 A1 | Jan 2019 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15398705 | Jan 2017 | US |
Child | 16105943 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15358040 | Nov 2016 | US |
Child | 15398705 | US | |
Parent | 15236101 | Aug 2016 | US |
Child | 15358040 | US |