Wavelength and spatially modulated non-visible light is useful in a variety of applications. However, traditional techniques for generating non-visible images in conjunction with visible light images have resulted in devices that are complicated or expensive to implement.
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical components or features.
Non-visible light images are useful in several applications. Non-visible light may be used to transmit information to devices, form structured light patterns, broadcast synchronization pulses, and so forth. In the case of structured light, a known image or pattern is projected onto an otherwise unknown physical scene. By observing the distortion in the known pattern, the topology of the scene may be characterized. Structured light that is non-visible to a user's eye, such as infrared (IR) or ultraviolet (UV) light, allows for characterization of the scene in a manner that is imperceptible to the user.
Described herein are devices and techniques for projecting both visible and non-visible images with a single device. Projected images, including structured light patterns, may use one or more different wavelengths (or colors) of light. A light source for the projected image may, in some implementations, comprise a multiple-wavelength light source, such as an incandescent lamp. Un-modulated light from the multiple-wavelength light source may be modulated by a wavelength modulator to produce a particular wavelength or pre-determined range of wavelengths of light.
In one implementation, the wavelength modulator may be a color wheel placed in an optical path of an image projector. The color wheel comprises a plurality of segments, with each segment configured to pass a pre-determined range of wavelengths. A color wheel motor spins the color wheel, producing a sequence of light in the pre-determined range of wavelengths associated with each segment. One or more segments of the color wheel are configured to pass or transmit the pre-determined range of visible wavelengths and substantially block other wavelengths outside of the pre-determined range. One or more other segments of the color wheel are configured to pass the pre-determined range of non-visible infrared wavelengths while substantially blocking wavelengths outside of that non-visible range. Images are formed by timing the spatial modulation of the light.
In other implementations, the wavelength modulator may comprise other devices such as acousto-optic modulators, prisms, diffraction gratings, and so forth. The wavelength modulator is constructed such that incoming light is processed and outgoing light is of a pre-determined range of wavelengths.
In some implementations, the light source itself may be configurable to provide wavelength modulated light. Rather than filtering the output of a lamp that emits over a wide range of wavelengths, a wavelength modulated light source may be configured to produce a particular wavelength or pre-determined range of wavelengths directly.
The wavelength modulated light source may comprise solid state light sources such as lasers, light emitting diodes (LEDs), and so forth. These solid state light sources may be switched on and off to allow for production of a particular pre-determined range of wavelengths, or a particular wavelength where the light source is monochromatic, at a particular time. In other implementations, non-solid state light sources such as gas lasers, dye lasers, and so forth may also be used as wavelength modulated light sources.
The wavelength modulated light source includes emitters for visible and non-visible light. By having both visible and non-visible light available, specific images may be rendered in particular wavelengths by an imaging modulator.
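As one illustration of how such a source might be driven, the following sketch assumes a GPIO-style enable line per emitter; the channel names and pin assignments are hypothetical, not part of the described device:

```python
# Hypothetical emitter map: one enable pin per solid state light source.
EMITTER_PINS = {"nonvisible": 0, "red": 1, "green": 2, "blue": 3}

def select_emitter(channel: str, set_enable) -> None:
    """Enable only the requested emitter so the imaging modulator receives
    light of a single pre-determined wavelength range at a time."""
    for name, pin in EMITTER_PINS.items():
        set_enable(pin, name == channel)
```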
The imaging modulator spatially modulates the wavelength modulated light to produce an image. The wavelength modulator or wavelength modulated light source may be synchronized with the imaging modulator such that when a particular pre-determined range of wavelengths is being produced, a particular image may be generated therefrom.
Compared to the rates of change of the wavelength modulator and imaging modulator, the eye has a relatively long integration time. This effect is also known as “persistence of vision” and enables the non-visible image to be interspersed amongst visible light images. For example, a given frame of a color image may include four sub-frames: a red sub-frame with an associated image corresponding to the red channel of the frame, a green sub-frame with an associated image corresponding to the green channel of the frame, a blue sub-frame with an associated image corresponding to the blue channel of the frame, and a non-visible light sub-frame with an associated image such as a structured light pattern. The sequencing of the sub-frames and the interval during which each sub-frame is projected are configured to minimize or eliminate user-perceptible flicker. In some implementations, the duration of the non-visible sub-frame may differ from the duration of the visible sub-frames. For example, the non-visible sub-frame may have a shorter or longer duration than the sub-frames corresponding to visible light.
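The sub-frame arrangement lends itself to a simple data structure. Below is a minimal sketch using the example durations given later in this description (about 4 ms non-visible, about 8 ms visible); the names and types are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class SubFrame:
    channel: str        # "red", "green", "blue", or "nonvisible"
    duration_ms: float  # interval during which this sub-frame is projected

def build_frame(visible_ms: float = 8.0, nonvisible_ms: float = 4.0):
    """Return one frame as the four sub-frames described above; the
    non-visible sub-frame carries, e.g., a structured light pattern."""
    return [
        SubFrame("red", visible_ms),
        SubFrame("green", visible_ms),
        SubFrame("blue", visible_ms),
        SubFrame("nonvisible", nonvisible_ms),
    ]
```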
A wavelength modulation synchronization signal may also be generated which indicates the state of the wavelength modulated light source. A camera configured to sense the non-visible light may use this signal to synchronize image capture and recover structured light data. A computing device may then accept the structured light data and process the data. Processing may include determining the physical arrangement of objects within a room, or the shape of the room itself, based at least in part upon the structured light data.
Illustrative Projection Systems
At least a portion of the un-modulated light 104 is configured to impinge upon a wavelength modulator 106. The wavelength modulator 106 is configured to selectively pass a pre-determined range of wavelengths for a given interval of time. In the implementation shown in this figure, the wavelength modulator 106 comprises a color wheel 108 coupled to a motor 110. The color wheel 108 comprises a plurality of segments. Each segment of the color wheel is configured to pass a pre-determined range of wavelengths. These wavelengths may be visible or non-visible. The motor 110 rotates the color wheel, as indicated with arrow 112, such that for a given moment of time while a segment is in an optical path, the particular pre-determined range of wavelengths of that segment may pass. As the color wheel 108 rotates, over time the pre-determined range of wavelengths changes according to the sequence of the segments on the color wheel.
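The mapping from wheel angle to the segment in the optical path can be sketched as follows; the four equal 90-degree segments are an assumption for illustration:

```python
# Upper angular bound of each segment, in rotation order (assumed layout).
SEGMENTS = [
    (90.0,  "nonvisible"),  # segment 114: e.g., ~935-940 nm infrared
    (180.0, "color_1"),     # segment 116
    (270.0, "color_2"),     # segment 118
    (360.0, "color_3"),     # segment 120
]

def segment_in_path(wheel_angle_deg: float) -> str:
    """Return which segment the optical path passes through at this angle."""
    angle = wheel_angle_deg % 360.0
    for upper_bound, name in SEGMENTS:
        if angle < upper_bound:
            return name
    return SEGMENTS[-1][1]  # unreachable for angles in [0, 360)
```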
The color wheel 108 is illustrated with four segments, one for non-visible light and three for visible light. Four segments are shown by way of illustration, and not as a limitation. More or fewer segments may be used. Furthermore, in some implementations multiple segments may be configured to pass the same pre-determined range of wavelengths.
A non-visible light segment 114 is configured to pass a pre-determined range of non-visible wavelengths. These non-visible wavelengths are outside of the range of wavelengths visible to the user. For example, non-visible wavelengths may be longer or shorter than the range of wavelengths visible to the user. In one implementation, the non-visible wavelengths are in an infrared portion of the spectrum. More specifically, in one implementation the non-visible light segment may be configured to pass wavelengths of about 935-940 nanometers (nm) and to attenuate wavelengths outside of this range to below a pre-determined threshold, or to block them entirely. For ease of illustration, and not by way of limitation, one non-visible light segment 114 is shown here. In other implementations, additional non-visible light segments 114 may be present on the color wheel 108.
Three visible light color segments are also shown in this illustration: a first visible light color segment 116, a second visible light color segment 118, and a third visible light color segment 120. For example, in some implementations these may correspond to red, green, and blue filters suitable for the reproduction of a color image from a generally white light source.
After being filtered by the wavelength modulator 106, the un-modulated light 104 is now wavelength modulated light 122. The wavelength modulated light 122 may then impinge upon an imaging modulator 124. The imaging modulator 124 is configured to spatially modulate incident light. The imaging modulator 124 may comprise a digital micromirror device (DMD), liquid crystal on silicon (LCOS), liquid crystal display (LCD), light valve, and so forth. For example, the imaging modulator 124 may take a beam of incident wavelength modulated light 122 and form an image comprising an array of pixels. In some implementations multiple imaging modulators 124 may be used.
In another implementation, the color wheel 108 may be configured without a visible light color segment, and instead comprise two or more non-visible light segments 114. The two or more non-visible light segments 114 and their corresponding pre-determined ranges of non-visible wavelengths may then be used to generate images such as structured light patterns in these non-visible wavelengths. For example, one non-visible light segment may be configured to pass near infrared, another to pass far infrared, and a third to pass ultraviolet.
A controller 126 may be coupled to the wavelength modulator 106 and the imaging modulator 124. The controller 126 may be configured to maintain synchronization between the wavelength modulation and the image modulation. For example, the controller 126 may be configured such that when a non-visible image is to be modulated on the imaging modulator 124, the non-visible light segment 114 of the color wheel 108 is in the optical path between the multiple-wavelength light source 102 and the imaging modulator 124.
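A minimal sketch of the controller's coordinating loop follows, assuming the wavelength modulator, imaging modulator, and synchronization output are exposed as callables; these interfaces are hypothetical placeholders, not an API from the specification:

```python
import time

def run_frame(select_wavelength, display_image, emit_sync, sub_frames):
    """Step the wavelength modulator and imaging modulator in lockstep for
    one frame. sub_frames is a list of (channel, image, duration_ms) triples;
    the three callables stand in for the hardware interfaces."""
    for channel, image, duration_ms in sub_frames:
        select_wavelength(channel)          # e.g., segment 114 into the path
        display_image(image)                # spatial modulation, element 124
        emit_sync(channel == "nonvisible")  # synchronization signal 128
        time.sleep(duration_ms / 1000.0)    # hold for the sub-frame interval
```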
The controller 126 may also be configured to generate or accept a wavelength modulation synchronization signal 128. This synchronization signal 128 may be used to coordinate the operation of other devices, such as a camera.
After image modulation, the now image and wavelength modulated light 130 continues to a target 132. The target 132 may comprise a projection screen, wall, clothing, user, or essentially any other surface configured to accept and present the incident light.
As shown here, the wavelength modulated light source 302 comprises a plurality of emitters or sources of light of varying wavelengths. A non-visible light source 304 is shown. Similarly, a first visible light color source 306, a second visible light color source 308, and a third visible light color source 310 are also shown as part of the wavelength modulated light source 302. By having both visible and non-visible light available, specific images may be rendered in particular wavelengths by the imaging modulator 124.
The light sources within the wavelength modulated light source 302 may comprise solid state devices such as lasers, light emitting diodes (LEDs), electro- or sono-luminescent materials, and so forth. These solid state light sources may be switched on and off allowing production of a particular pre-determined range of wavelengths, or a particular wavelength where the light source is monochromatic, at a particular time.
The wavelength modulated light source 302 is optically coupled to the imaging modulator 124 such that image and wavelength modulated light 130 may be generated. The controller 126 may coordinate the imaging modulator 124 and the wavelength modulated light source 302 as described above.
In some implementations, the non-visible sub-frame 406 may be configured with a duration different from that of the visible light sub-frames. For example, the fourth sub-frame 406, carrying the non-visible light, may have a duration of about 4 milliseconds (ms) while the visible light sub-frames 406 may each have a duration of about 8 ms. Furthermore, the duration of the visible light sub-frames 406 may vary as well.
The timing and distribution of non-visible sub-frames 406 within the wavelength modulation pattern may be configured to reduce or eliminate flicker perceptible to the eye. For example, an overall image frame rate 404 may be 30 hertz while the sub-frames 406 are modulated at 120 hertz, or 1 frame for every 4 sub-frames 406.
When the wavelength modulator 106 allows for selection of a particular range of wavelengths at any time, such as with the wavelength modulated light source 302, the wavelength modulation pattern may be adjusted dynamically. Thus, one or more of the frequency, duration, or sequencing of the non-visible light sub-frames may be changed. For example, when a level of motion or displacement of objects within a physical scene exceeds a pre-determined threshold, additional non-visible light sub-frames may be injected to increase the scanning rate of the scene with structured light.
In another implementation, display settings or environmental conditions may result in a dynamic adjustment to the wavelength modulation pattern. For example, when ambient light increases, the number of non-visible light sub-frames may be reduced to increase the overall brightness of the projected image.
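Both of the adjustments above may be combined into a simple policy. The sketch below is illustrative only; the thresholds and counts are assumptions:

```python
def nonvisible_subframes_per_frame(motion_level: float,
                                   ambient_light: float,
                                   base_count: int = 1) -> int:
    """Return how many non-visible sub-frames to schedule in the next frame,
    given normalized (0-1) motion and ambient light estimates."""
    count = base_count
    if motion_level > 0.5:   # scene changing quickly: scan more often
        count += 1
    if ambient_light > 0.8:  # bright room: give time back to visible output
        count = max(0, count - 1)
    return count
```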
A sphere 506 is shown positioned between the projector 502 and the target 132. A shadow 508 from the sphere 506 is shown on the target 132. Also shown is a deformation effect 510 of the structured light pattern 504 as it interacts with the curved surface of the sphere 506.
This deformation effect 510 is detected by camera 512. The camera 512 is configured to sense or detect the non-visible light. In some implementations the camera 512 may also sense or detect visible light, allowing for multi-spectral imaging of the target. Other effects on the structured light pattern 504 may also be used. For example, a dispersion pattern of points in the structured light pattern 504 may provide details about the scene.
The image captured by the camera 512 is processed by the computing device 514 to determine physical attributes of the scene. The computing device 514 may comprise one or more processors 516, one or more input/output interfaces 518, and a memory 520. The memory may store an operating system 522, a structured light analysis module 524, and a synchronization module 526. In some implementations, resources may be shared among a plurality of computing devices 514. These resources may include the processors 516, the memory 520, input/output devices, and so forth. The memory 520 may include computer-readable storage media (“CRSM”). The CRSM may be any available physical media accessible by a computing device to implement the instructions stored thereon. CRSM may include, but is not limited to, random access memory (“RAM”), read-only memory (“ROM”), electrically erasable programmable read-only memory (“EEPROM”), flash memory or other memory technology, compact disk read-only memory (“CD-ROM”), digital versatile disks (“DVD”) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.
The input/output interface 518 may be configured to couple the computing device 514 to the projector 502 and the camera 512. The coupling between the computing device 514 and the external devices, such as the projector 502 and the camera 512, may be via wire, fiber optic cable, wireless connection, and so forth.
The structured light analysis module 524 is configured to compare the structured light pattern as projected by the projector 502 with the structured light data gathered by the camera 512 to determine characteristics of the topology of the scene.
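The specification does not fix an analysis method, but one common structured-light calculation is triangulating a pattern feature's apparent displacement into depth. A hedged sketch of that calculation, offered purely as an illustration of what the module 524 might compute:

```python
def depth_from_disparity(disparity_px: float,
                         baseline_m: float,
                         focal_length_px: float) -> float:
    """Distance to a surface point from the apparent shift of a projected
    pattern feature (standard projector-camera triangulation)."""
    if disparity_px <= 0.0:
        raise ValueError("feature not displaced; depth cannot be resolved")
    return baseline_m * focal_length_px / disparity_px
```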
A synchronization module 526 is configured to maintain synchronization between components and actions within the device or within multiple devices. The synchronization module 526 may be configured to synchronize particular wavelength modulator states and the acquisition of images by the camera 512. A wavelength modulation synchronization signal 128 may be distributed to the projector 502, the camera 512, or both. An independent clocking source, the projector 502, the camera 512, or another device may also generate the signal 128. When synchronous, the non-visible light image such as the structured light pattern 504 is presented and contemporaneously imaged by the camera 512. Synchronization thus aids in the recovery of the structured light data from the scene.
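One way the synchronization might gate image capture, assuming the camera is exposed through a hypothetical trigger callable:

```python
def on_wavelength_sync(sync_active: bool, trigger_exposure) -> None:
    """Gate camera capture on the synchronization signal 128: expose only
    while the non-visible sub-frame is projected, so each capture contains
    the structured light pattern rather than a visible sub-frame image."""
    if sync_active:
        trigger_exposure()
```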
The synchronization module 526 may also be configured to coordinate multiple projectors 502, multiple cameras 512, and so forth. For example, multiple projectors 502 in a room may be coordinated such that each projector 502 is configured to generate a structured light pattern with the non-visible wavelengths at different times, using different pre-determined ranges of non-visible wavelengths, or both.
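A sketch of one possible slot-assignment scheme, separating projectors by wavelength range first and by time second; the scheme itself is an assumption:

```python
def assign_projector_slots(projector_ids, wavelength_ranges_nm):
    """Round-robin projectors over the available non-visible ranges; any two
    projectors sharing a range receive different time slots."""
    plan = {}
    for i, pid in enumerate(projector_ids):
        plan[pid] = {
            "range_nm": wavelength_ranges_nm[i % len(wavelength_ranges_nm)],
            "time_slot": i // len(wavelength_ranges_nm),
        }
    return plan
```

With three projectors and two non-visible ranges, for example, the first two projectors receive distinct ranges in time slot 0, and the third reuses the first range in time slot 1.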
The wavelength modulation synchronization signal may be triggered by one or more different events. In one implementation, the synchronization may be based on frame timing. In another implementation, the synchronization may be based on sub-frame timing, or the wavelength modulation state.
Recovery of the structured light data from the scene may also occur using a “free running” camera 512 which does not use the synchronization signal 128. In such an implementation, the camera 512 may be configured to scan at a sufficiently high rate, such as twice the sub-frame rate, and to track and recover the timing of the non-visible light sub-frame.
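One way to recover the sub-frame timing from free-running captures is to score each capture against the known pattern and accumulate the scores by sample phase; the scoring function is assumed to be available elsewhere:

```python
def recover_subframe_phase(match_scores, samples_per_cycle: int) -> int:
    """Given per-capture pattern-match scores taken at a fixed rate (e.g.,
    twice the sub-frame rate) over several cycles, return the sample phase
    most likely to coincide with the non-visible sub-frame."""
    totals = [0.0] * samples_per_cycle
    for i, score in enumerate(match_scores):
        totals[i % samples_per_cycle] += score
    return max(range(samples_per_cycle), key=lambda p: totals[p])
```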
While these modules are described as being stored such as in the memory 520 and executed by the processor 516, it is understood that in some implementations all or part of the functions provided by these modules may be performed by devices such as application-specific integrated circuits, field programmable gate arrays, or other dedicated circuitry.
A camera detection threshold 606 is also shown. The camera detection threshold 606 indicates the minimum intensity of received light at the camera detector that will generate a signal in the camera. For ease of illustration and discussion, the camera detection threshold 606 is shown as flat across the entire spectrum. However, it is understood that the detection threshold of a given camera may vary with wavelength.
Among the wavelengths shown and their designations on this chart is ultraviolet (UV) light, with a wavelength of about 10 nm to 390 nm. Visible light as shown extends from about 390 nm to 750 nm. Infrared (IR) is shown from about 750 nm to about 30,000 nm. The definition of visible and non-visible light may vary with the user. For example, the typical human eye is considered capable of detecting the visible light wavelengths, but is generally incapable of detecting wavelengths outside of this range. Thus, UV and IR light are considered non-visible to the human eye. However, where the users include other organisms, the definition of visible and non-visible light wavelengths may be suitably adjusted to accommodate the differing physiologies of these users. For example, where the human user has a pet bird whose visible range extends into a portion of the ultraviolet spectrum, non-UV non-visible light may be used.
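The approximate band boundaries above can be captured in a small classifier; a sketch using those figures, with the caveat noted above that the "visible" band would shift for other organisms:

```python
def classify_wavelength(nm: float) -> str:
    """Classify a wavelength per the approximate band boundaries above
    (UV ~10-390 nm, visible ~390-750 nm, IR ~750-30,000 nm)."""
    if 10.0 <= nm < 390.0:
        return "ultraviolet (non-visible to humans)"
    if 390.0 <= nm <= 750.0:
        return "visible"
    if 750.0 < nm <= 30000.0:
        return "infrared (non-visible to humans)"
    return "outside the bands shown"
```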
As described above, wavelength modulated light 122 may encompass a pre-determined range of wavelengths 608. In some implementations the pre-determined range of wavelengths 608 may be very narrow, such as in the case of monochromatic light. In other implementations, the pre-determined range of wavelengths 608 may be broader, such as a green filter segment configured to pass wavelengths of 520-570 nm.
Also shown on the chart are the transmittance/irradiance 604 curves for several different “colors” or pre-determined ranges of wavelengths 608. A non-visible light in the UV band is shown at 610(1). Likewise, a non-visible light in the IR band is shown at 610(2). In one implementation, non-visible light with a wavelength of about 940 nm may be used. Water in the atmosphere absorbs 940 nm light well, and thus this wavelength is useful for structured light at short ranges because longer-range sources, such as the sun, are heavily attenuated by that absorption. Additionally, 940 nm light is readily produced by available light sources, such as light-emitting diodes and lasers, and is readily imaged by existing cameras. Also shown are curves corresponding to the irradiance/transmittance of light sources or filters, respectively, for blue light 612, green light 614, and red light 616.
At 702, a wavelength modulation synchronization signal for non-visible light is received. In some implementations, the controller 126, projector 502, camera 512, input/output interface 518, and so forth may generate the signal 128.
At 704, a light source is modulated to produce a pre-determined range of non-visible wavelengths. This modulation may include setting a color wheel to place a particular non-visible light segment into the optical path 704(1) or activating a non-visible light source 704(2). For example, as described above the non-visible light segment 114 of the color wheel 108 may be placed into the optical path to filter the multiple-wavelength light source 102, or a particular non-visible light source 304 may be activated.
At 706, a structured light pattern is generated with the non-visible wavelengths. For example, the imaging modulator 124 may be configured to generate the structured light pattern 504 from the non-visible light.
At 708, the light source is modulated to produce a pre-determined range of visible wavelengths. This modulation may include setting a color wheel 708(1) to place a particular visible light segment into the optical path, or activating a visible light source 708(2). For example, as described above the first visible light segment 116 of the color wheel 108 may be placed into the optical path, or a particular first visible light source 306 may be activated.
At 710, a visible light image is generated with the visible light wavelengths. For example, the imaging modulator 124 may be configured to generate the sub-frame image associated with the wavelength of visible light. Thus, the same projection system is able to generate visible and non-visible images.
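Taken together, blocks 702-710 describe one pass of a projection loop. A condensed sketch follows, with the hardware calls abstracted as hypothetical callables; the ordering follows the figure, the details are assumed:

```python
def project_mixed_frame(set_wavelength, show_image, frame) -> None:
    """One pass through blocks 702-710 for a single frame."""
    # 704: modulate the light source to the non-visible range
    set_wavelength("nonvisible")
    # 706: spatially modulate the structured light pattern in that range
    show_image(frame["structured_light_pattern"])
    # 708/710: repeat for each visible range and its sub-frame image
    for channel in ("red", "green", "blue"):
        set_wavelength(channel)
        show_image(frame[channel])
```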
Although the subject matter has been described in language specific to structural features, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features described. Rather, the specific features are disclosed as illustrative forms of implementing the claims.