The disclosure relates generally to an imaging system and method and in particular to a system and method for recording wave fronts of light using holography and integral imaging.
There are many approaches to recording wave fronts of light. These approaches may be divided, somewhat arbitrarily, into three classes of recording:
1. Approaches that record the amplitude of a real image on a recording medium, the real image being produced by an optical system. Typically, these processes are called photography.
2. Approaches that record the phase of the wave front of light by recording the amplitude of an interference pattern on a recording medium produced by a signal wave front and a reference wave front. Typically, these processes are called holography.
3. Approaches that record the phase of a wave front of light by simultaneously recording, on a recording medium, the light amplitude from a large number of points of view. These processes are often known as integral imaging.
Each of the above approaches has advantages and limitations.
For example, photography produces an image that is easily processed and understood, but is limited to a single point of view and a single focal plane. Technologies for recording photographs are well understood and widely used, and their recording and computational components have benefited from the cost efficiencies of mass production. Unfortunately, photography makes poor use of the energy of the input wave front of light, because the photographic process must filter the light spatially, using a small aperture for optimal depth of field, and by frequency, to record color. The use of a small aperture also limits the amount of information a photograph may capture, due to the diffraction limits of an optical system with a small aperture.
Holography produces an image that is a full recreation of the wave front of light over the aperture of the recording medium, with no limitations on viewpoint or focal plane. Holography is very efficient in its use of the energy of the input wave front of light: the entire incident light from the two wave fronts is recorded. Holograms are recorded using interference, which is matched to the diffraction limits of the wave fronts recorded. Since the recording aperture is the entire recording area, holograms approach the theoretical information and resolution limits of recording visual information. Unfortunately, there are severe constraints limiting when a hologram may be recorded that make holograms impractical as a general approach to recording a wide variety of imagery. For example, holograms may only be recorded using light that has spatial and frequency coherence.
Integral imaging can record multiple points of view and, through the use of additional computation, is free of focal plane constraints. The use of classic optical systems in integral imaging imposes the same efficiency and information limits as photography, with the additional resource burden of imaging many points of view, often on a single recording device or surface.
Thus, it is desirable to provide an imaging system and method that overcomes the above limitations of the known approaches, and it is to this end that the disclosure is directed.
The disclosure is particularly applicable to the hybrid integral imaging system and method described below that efficiently records large aperture wave fronts of visible and near visible light and it is in this context that the disclosure will be described. It will be appreciated, however, that the system and method has greater utility. For example, the embodiment described below may be used for visible light, but the hybrid integral imaging system and method may also be used with non-visible light. More generally, the hybrid integral imaging system and method may be used with various types of electromagnetic radiation.
In one implementation, the system and method may be used with electromagnetic radiation that has line spectra that fall within a visible region and a near infrared (IR) region of the electromagnetic spectrum. For example, the line spectra (with a range of wavelengths) may include one or more of violet at 380-450 nm, blue at 450-495 nm, green at 495-570 nm, yellow at 570-590 nm, orange at 590-620 nm, red at 620-750 nm, and near IR at 750-2,500 nm.
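As an illustration of how these line spectra might be handled in software, the following minimal sketch classifies a wavelength into the bands listed above; the function name is illustrative and the band table simply restates the ranges given in the text.

```python
# Spectral bands from the text above: (name, lower bound nm, upper bound nm).
BANDS = [
    ("violet", 380, 450),
    ("blue", 450, 495),
    ("green", 495, 570),
    ("yellow", 570, 590),
    ("orange", 590, 620),
    ("red", 620, 750),
    ("near IR", 750, 2500),
]

def classify_wavelength(nm: float) -> str:
    """Return the name of the band containing the given wavelength."""
    for name, low, high in BANDS:
        if low <= nm < high:
            return name
    return "outside the visible/near-IR range"

print(classify_wavelength(532.0))  # "green" (a common laser line)
```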
In addition to the implementation of the processing module 102 shown in the figures, other implementations of the processing module 102 may be used.
The holographic multiplexing element 200 provides efficient multiplexing of information in the input light wave front to individual recording devices. More particularly, the holographic multiplexing element 200 may efficiently multiplex the wave front into discrete channels for Color, Viewpoint, and Depth.
The hybrid integral imaging unit 104 may be composed of five components, including the holographic multiplexing element 200, one or more image recording elements 202, a support component 204, and one or more light ways 206, which are shown in the figures.
A. Holographic Multiplexing Element 200
The holographic multiplexing element 200 may be a volume reflection hologram recorded in the Lippmann-Bragg regime. This type of hologram contains recorded interference fringes slicing through a volume of the recording medium in half-wavelength layers. This geometry reflects and filters light of a specific wavelength. Recording materials are available that are very efficient in reconstruction, so it is possible to record in this hologram many virtual off-axis concave parabolic (or spheroidal) focusing sub-elements, such as mirrors, that efficiently reflect and focus a narrow wavelength band of light. Each virtual mirror sub-element may be designed to focus an image of a particular frequency band from a specific point of view, with a specific virtual aperture and a specific depth of field, onto an individual recording element. The sub-elements may also be mapped to portions of an image recording element. Alternatively, each focusing sub-element may be paired with an image recording element.
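As a worked example of the Lippmann-Bragg geometry, the sketch below computes the half-wavelength fringe spacing inside the recording medium and the free-space wavelength such layers reflect. The refractive index of 1.52 is an assumed value typical of holographic recording materials, not a figure from the disclosure.

```python
import math

def lippmann_fringe_spacing(wavelength_nm: float, n_medium: float = 1.52) -> float:
    """Fringe layer spacing (nm) inside the medium: d = lambda / (2 * n)."""
    return wavelength_nm / (2.0 * n_medium)

def bragg_reflected_wavelength(spacing_nm: float, n_medium: float = 1.52,
                               angle_deg: float = 0.0) -> float:
    """First-order Bragg condition: lambda = 2 * n * d * cos(theta), where
    theta is the ray angle from the layer normal inside the medium."""
    return 2.0 * n_medium * spacing_nm * math.cos(math.radians(angle_deg))

d = lippmann_fringe_spacing(532.0)     # ~175 nm layers for green light
print(bragg_reflected_wavelength(d))   # reflects ~532 nm at normal incidence
```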
In one implementation, the holographic multiplexing element 200 may be implemented as a set of discrete volume reflection hologram elements. Each holographic element may filter the incoming wave front of light by color spectra using constructive interference in the Lippmann-Bragg domain. Each holographic element may filter based on viewpoint and/or depth by reconstructing a real image of a portion of the incoming wave front of light using constructive interference in the Fresnel domain.
To manufacture the holographic multiplexing element 200, each discrete volume hologram may be written to a holographic recording substrate (such as, for example, dichromated gelatin on optical glass) sequentially using an automated process that uses specific spectra of coherent light as an object wave and a reference wave to describe the mapping of the aperture of each discrete holographic element to an input point in the incoming wave front and an output point on the image recording element.
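The sketch below shows one way the sequential writing schedule could be represented in software; the class and function names are hypothetical stand-ins, since the disclosure does not specify the recording hardware interface.

```python
from dataclasses import dataclass

@dataclass
class SubElementExposure:
    """One discrete holographic sub-element to be written in sequence."""
    wavelength_nm: float   # spectral band the element will reflect
    input_point: tuple     # (x, y, z) point sampled in the incoming wave front
    output_point: tuple    # (x, y) landing point on an image recording element
    aperture_mm: float     # virtual aperture of the focusing sub-element

def write_multiplexing_element(exposures, write_exposure):
    """Drive the automated recorder, one exposure per discrete sub-element.

    `write_exposure` stands in for hardware control that would steer the
    object and reference beams of the given wavelength so the recorded
    fringes map the element's aperture from input_point to output_point.
    """
    for exposure in exposures:
        write_exposure(exposure)

schedule = [
    SubElementExposure(620.0, (0.0, 0.0, 100.0), (12.5, 4.0), 5.0),
    SubElementExposure(532.0, (0.0, 0.0, 100.0), (12.5, 8.0), 5.0),
]
write_multiplexing_element(schedule, write_exposure=print)  # print as a stub
```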
The holographic multiplexing element 200 may alternatively be implemented as a set of discrete relief transmission holograms that have a specific blazed coating applied. Each discrete holographic element may filter based on viewpoint and depth by reconstructing a real image of a portion of the incoming wave front of light using constructive interference in the Fresnel domain. Each holographic element may filter the color spectra of the incoming wave front of light by constructive interference of the blazed coating. In the manufacture of this embodiment of the holographic multiplexing element 200, each discrete relief hologram may be written to a holographic recording substrate (such as, for example, a photosensitive plastic applied to optical glass) sequentially using an automated process that uses specific spectra of coherent light as an object wave and a reference wave to describe the inverse mapping of the aperture of each discrete holographic element to an input point in the incoming wave front and an output point on the image recording element. Each recording element is coated with a series of films through a mask to provide spectral filtering. The resulting holographic multiplexing element may then be mounted in a manner to use its conjugate (or pseudoscopic) reconstruction geometry. The above methods may also be used to manufacture a master holographic multiplexing element that is then replicated using methods known to those skilled in the art.
The holographic multiplexing element 200 may have data encoded in it that may be read by the processing module 102. For example, it is possible to encode, on an unused portion of the holographic multiplexing element, binary data using a known gray code to produce a pattern of lines or dots. In this manner, digital data may be associated with a specific holographic multiplexing element. The digital data may contain information about how the holographic multiplexing element maps the incoming wave front of light to portions of the one or more image recording elements. For example, the digital data may indicate that a first element of the holographic multiplexing element maps to a first image recording element at a particular location.
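A sketch of the binary-reflected Gray code mentioned above, with the encoded value rendered as a line/dot pattern; treating the value as a sub-element mapping index is an illustrative assumption.

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code: adjacent indices differ in exactly one bit."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Decode a Gray-coded value back to the original integer."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Hypothetical use: encode a mapping index as a pattern of lines or dots
# on an unused portion of the element (1 = mark present, 0 = absent).
index = 37
pattern = format(to_gray(index), "08b")     # "00110111"
assert from_gray(int(pattern, 2)) == index  # a reader recovers the index
```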
The characteristics of the holographic multiplexing element 200 may be changed to optimize the imaging device for a specific use. For instance, a holographic multiplexing element 200 may be manufactured with discrete elements that provide a large variety of spectral filters and only a few spatial points, which would be useful for spectroscopy. Alternatively, a holographic multiplexing element 200 may be manufactured that encodes only red, green and blue spectral filters and a variety of spatial viewpoints arranged on the horizontal plane, which would be useful for three dimensional imaging.
B. Image Recording Elements 202
The holographic multiplexing element 200 multiplexes the input wave front of light onto the one or more image recording elements 202 that are positioned at predetermined locations on the support component 204, such as the positioning shown in the figures.
Each image recording element has a mechanism, such as a wire or cable, to transfer the electrical signals to the processing module 102.
C. Support Component 204
The support component 204 supports the holographic multiplexing element 200 and the one or more image recording elements 202 and matches the off-axis multiplexing relationship between the holographic multiplexing element 200 and the one or more image recording elements 202, as shown in the figures.
D. Light Ways 206
The light ways 206 may each be a piece of material, such as plastic or metal, with certain properties to suppress ambient or reflected electromagnetic energy, that is connected to the support component 204, such as in the configuration shown in the figures.
For simple mapping geometries, a light way may be constructed using a honeycomb shape, where the proportion of the width to depth of a honeycomb element is chosen to match the camera angle of view, and the width of the honeycomb is chosen as a proportion to the thickness of the honeycomb wall to optimize throughput of the electromagnetic radiation wave front and the reduction of shadows or artifacts from the light ways. For more complex light ways, the path of each input and output ray for a given discrete holographic sub-element may be laser cut into a lightweight, rigid foam block.
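A geometric sketch of the honeycomb proportion described above, assuming straight cell walls and ignoring wall thickness and internal reflections; the function name and example values are illustrative.

```python
import math

def honeycomb_depth(cell_width_mm: float, angle_of_view_deg: float) -> float:
    """Cell depth that admits rays only within the camera angle of view.

    A straight-walled cell of width w and depth d passes rays up to roughly
    atan(w / d) off its axis, so matching a full angle of view `aov` gives
    d = w / tan(aov / 2).
    """
    return cell_width_mm / math.tan(math.radians(angle_of_view_deg) / 2.0)

print(honeycomb_depth(2.0, 30.0))  # ~7.5 mm deep cells for a 30 degree view
```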
E. Processing Module 102
The processing module 102 may be a processing unit or a processing unit based device (such as the computer system shown in the figures) that controls the operation of the imaging system.
The processing module may then acquire the data from the image recording elements (406) using specific methods and attributes. For example, the image data may be acquired from several optical elements containing similar viewpoints and, using multiple processing elements mapped to portions of the image recording elements, these viewpoints may be combined into sharper, higher resolution image data. In addition, input image data from a variety of views may be analyzed and converted into a three dimensional representation, such as a binary space partition or octree, representing component watersheds of the wave front. The image data may also be compressed using known methods, such as entropy encoding based on differences in one or more spatial or color spaces.
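A minimal sketch of the viewpoint-combination and difference-based compression steps, assuming the views are already co-registered; plain averaging stands in for whatever combination method an implementation would use.

```python
import numpy as np

def combine_similar_views(views):
    """Average co-registered images of similar viewpoints into one frame.

    Averaging raises signal-to-noise; a real pipeline would register the
    views first and could exploit sub-pixel offsets for super-resolution.
    """
    return np.stack([v.astype(np.float64) for v in views]).mean(axis=0)

def view_residuals(views):
    """Differences between adjacent views, ready for entropy encoding."""
    return [views[0]] + [b.astype(np.int32) - a.astype(np.int32)
                         for a, b in zip(views, views[1:])]

views = [np.random.rand(240, 320) for _ in range(4)]
print(combine_similar_views(views).shape)  # (240, 320)
```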
The processing module may then store the acquired data from the image recording elements and/or load already stored data from the image recording elements (408). The processing module (such as the computer system shown in the figures) may then reconstruct one or more images from the acquired or stored data.
During the reconstruction process, for example, the processing module may use multiple processors to decode information stored in a 3D form, and transform the input data to represent a given point of view, synthetic aperture, or color space. The processing module may also construct an image by directly assembling images from a specific point of view encoded in the holographic multiplexing element, and mapped to the image recording element in a manner that is easily tiled and reconstructed. The processing module may then refine such a reconstructed image using edges or high frequencies found in similar or adjacent views.
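A sketch of the direct tiled assembly of a single point of view; the tile-to-grid mapping is assumed to come from the metadata encoded in the holographic multiplexing element, and all names are illustrative.

```python
import numpy as np

def assemble_view(tiles, tile_h, tile_w, rows, cols):
    """Assemble one point of view from tiled sub-images.

    `tiles` maps a (row, col) grid position to the sub-image that the
    multiplexing element focused onto the corresponding portion of the
    image recording elements.
    """
    out = np.zeros((rows * tile_h, cols * tile_w))
    for (r, c), img in tiles.items():
        out[r * tile_h:(r + 1) * tile_h, c * tile_w:(c + 1) * tile_w] = img
    return out

tiles = {(r, c): np.full((240, 320), r * 2 + c) for r in range(2) for c in range(2)}
print(assemble_view(tiles, 240, 320, 2, 2).shape)  # (480, 640)
```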
The processing module (or the other device) may also adjust its reconstruction process based on configuration options from the user interface component or from another device over a network.
F. User Interface Component/Module
The user interface component has a device, mechanism, or process for a user to set attributes and methods and communicate them to the processing module 102. For instance, one attribute may be the specific method by which the processing module will acquire data from the image recording elements. Additionally, one of these attributes may be when the processing module starts and/or stops recording of the wave front of light from the image recording elements. The attributes may also, for instance, include a method to display a reconstruction of the acquired data as an image. Additionally, one of the attributes set by the user and communicated to the processing module may be which point of view to reconstruct, and at what depth.
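A sketch of how the user-set attributes might be represented when communicated to the processing module; the field names and defaults are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CaptureSettings:
    """Attributes a user interface could pass to the processing module."""
    recording: bool = False          # start/stop recording the wave front
    acquisition_method: str = "all"  # how to acquire data from the elements
    view_azimuth_deg: float = 0.0    # which point of view to reconstruct
    focus_depth_m: float = 2.0       # at what depth to reconstruct
    display_mode: str = "2d"         # how to display the reconstruction
```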
G. Enclosure
The enclosure holds all of the other system components in a manner that protects them from light, dust and/or physical damage. It does this in a way that is pleasing for a user to hold, to touch and to see. The enclosure may be made of a carbon composite material which may be rigid and light and may be cast with a texture on the surface that may be decorative and may provide an easy to hold surface.
H. Implementation of the Hybrid Integral Imaging Geometry
The hybrid integral imaging geometry may be designed to maximize the efficiency of the holographic multiplexing element in the Lippmann-Bragg regime while reducing internal reflections from the various system elements. One example of a configuration is to place the plane of the holographic multiplexing element facing the incoming wave front of light at the back of the device, and to place the recording elements off axis under the device aperture, as shown in the figures.
The holographic multiplexing element may be removed from the device and replaced as needed. Once the hybrid integral imaging geometry is determined, the geometry between the holographic multiplexing element and the image recording elements remains fixed. However, the form of multiplexing may be changed with a replacement of the holographic multiplexing element. For instance, if the hybrid integral imaging system had 64 image recording elements of 3200×2400 pixels, possible multiplexing configurations (each requiring a holographic multiplexing element change) could be:
1. 530 (640×480) points of view at 3 colors, with an aggregate 491 megapixels.
2. 2130 (320×240) points of view at 3 colors.
3. 16 (3200×2400) points of view, 12 depths, monochrome.
4. 1 (3200×2400) point of view, 8 depths, 8 color samples for full spectrum imaging.
In the above examples, the same number of image recording element pixels (or sensing areas) is used for each configuration, and the only change needed to switch from the large aperture 3D light field camera of example (2) to the high resolution spectral depth imaging camera of example (4) is the holographic multiplexing element 200. In the case of example (2), the holographic multiplexing element 200 may be manufactured to map many input wave front spatial points onto small sections of the image recording element. In the case of example (4), more frequency filtering and fewer input spatial points are used to map the input wave front of light onto large areas of the image recording element. An example of the geometry of an implementation of the system is shown in the figures.
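As an arithmetic check of the examples above (a sketch; the function and its arguments are illustrative), the fixed aggregate pixel budget can be re-partitioned among views, colors, and depths:

```python
def views_for_config(sensors, sensor_w, sensor_h, view_w, view_h, channels):
    """Views supported by a fixed pixel budget at a given view size/channels."""
    total = sensors * sensor_w * sensor_h       # aggregate sensor pixels
    return total / (view_w * view_h * channels)

# 64 image recording elements of 3200x2400 pixels (~491.5 megapixels total).
print(views_for_config(64, 3200, 2400, 640, 480, 3))     # ~533, example (1)
print(views_for_config(64, 3200, 2400, 320, 240, 3))     # ~2133, example (2)
print(views_for_config(64, 3200, 2400, 3200, 2400, 64))  # 1.0, example (4):
# 8 depths x 8 color samples = 64 channel slots per full-resolution view
```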
The hybrid integral imaging system receives a wave front of light and, using the constructive interference of light, distributes the wave front into a multiplicity of amplitude images (by the holographic multiplexing element) focused onto the one or more image recording elements. In the system, the image focused onto each image recording element is separated by wavelength, point of view, and/or depth of focus.
While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made without departing from the principles and spirit of the disclosure, the scope of which is defined by the appended claims.
This application claims the benefit of and priority under 35 USC 119(e) and 120 to U.S. Provisional Patent Application Ser. No. 61/581,389, filed on Dec. 29, 2011 and entitled “System and Method for the Efficient Recording of Large Aperture Wave Fronts of Visible and Near Visible Light”, the entirety of which is incorporated herein by reference.