Optical fault monitoring

Information

  • Patent Grant
  • Patent Number
    9,063,001
  • Date Filed
    Friday, November 2, 2012
  • Date Issued
    Tuesday, June 23, 2015
Abstract
Various embodiments related to monitoring for optical faults in an optical system are disclosed. For example, one disclosed embodiment provides, in an optical system comprising a light source, a light outlet, and an optical element disposed between the light source and the light outlet, a method of monitoring for optical system faults. The method includes detecting, via a light sensor directed toward an interface surface of the optical element closest to the light source, an intensity of light traveling from the interface surface of the optical element to the light sensor, and comparing the intensity of light detected to one or more threshold intensity values. The method further includes identifying an optical system fault condition based on comparing the intensity of light detected to one or more threshold values, and modifying operation of the optical system.
Description
BACKGROUND

Optical projectors and other optical devices may utilize a laser or other relatively bright light source to project an image onto a surface. For example, some depth-sensing cameras may utilize a diffractive optical element to transform light from a laser source to project a structured light pattern on a target in the field of view of an image sensor. Variations in the structured light pattern from an expected pattern that are caused by the distance of the target from the camera may be used to determine a distance of the target from the camera.


Depth-sensing cameras and other optical systems may rely upon the locations of diffractive optical elements (DOEs) and other optical components remaining constant for proper device performance. Therefore, in the case of a depth-sensing camera, if an optical element becomes displaced or damaged, the reference structured light image may change compared to that expected by the image processing software. However, such an optical system fault may not be easily discernable by the camera and depth-sensing image processing software. Therefore, various errors in operation may result.


SUMMARY

Accordingly, various embodiments related to optical fault monitoring are disclosed herein. For example, one disclosed embodiment provides, in an optical system comprising a light source, a light outlet, and an optical element disposed between the light source and the light outlet, a method of monitoring for optical system faults. The method includes detecting, via a light sensor directed toward an interface surface of the optical element closest to the light source, an intensity of light traveling from the interface surface of the optical element to the light sensor, and comparing the intensity of light detected to one or more threshold intensity values. The method further includes identifying an optical system fault condition based on comparing the intensity of light detected to one or more threshold values, and modifying operation of the optical system based upon the optical system fault condition.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a schematic depiction of an example optical system.



FIG. 2 shows a flow diagram of an embodiment of an example method of monitoring for optical system faults.



FIG. 3 shows a flow diagram of another embodiment of an example method of monitoring for optical faults.



FIG. 4 shows a schematic depiction of an example depth-sensing camera.





DETAILED DESCRIPTION

Optical devices such as depth-sensing cameras may utilize a laser, or other such light source, modulated by a diffractive optical element to project a structured light pattern on a target in the field of view of an image sensor. As such, the distance from the camera to the target (i.e., the depth from the camera to the target) may be determined based on detecting variations in the projected structured light pattern. For example, a variation may be detected if the reference structured light image differs from that expected by the image processing software. However, other sources may cause variations in the projected structured light pattern that are independent of depth detection, and instead result from optical faults in the optical system. Optical faults may include, but are not limited to, damage to and/or contamination of an optical element, changes in positioning of an optical element, physical objects in an optical path of the optical element, and the like. Such optical faults may not be easily discernable by the camera and depth-sensing image processing software, leaving it unclear how such variations should be mitigated.


Therefore, the monitoring of faults in such an optical device, as described herein, may provide for the detection and determination of optical faults, and enable the application of corrective and/or mitigating actions. FIG. 1 shows an example optical system 100 within an optical device 102, wherein optical device 102 includes a light source 104 configured to output a beam of light 106. Examples of suitable light-producing elements 107 for use within light source 104 may include, but are not limited to, one or more lasers, laser diodes, light-emitting diodes, etc. Further, in some embodiments, light source 104 may include a collimating lens 109 configured to collimate the beam of light 106.


As depicted, the beam of light 106 exits optical device 102 through a light outlet 108. Light outlet 108 may be any suitable outlet through which the light may leave the optical device, such as a hole, a filter, a plastic cover, a lens, etc. Optical device 102 further includes an optical element 110 disposed between light source 104 and light outlet 108. Optical element 110 may be any suitable optical element configured to receive the beam of light 106 on a light-source side of the optical element (i.e., at an interface surface 112) and to diffract the beam of light 106 to form a structured pattern, as depicted in FIG. 1 at 114. As an example, in a structured light depth sensor, optical element 110 may comprise a diffractive optical element.


Due to propagation reciprocity symmetry, optical element 110 may be bidirectional. As such, in addition to optical element 110 directing the beam of light 106 from an interface surface 112 toward the light outlet 108 as described above, optical element 110 may also direct light received through the light outlet 108 toward the interface surface 112. As an example, upon exiting light outlet 108, beam of light 106 may reflect off of a physical object within the optical path, and this reflected light may then be directed back through light outlet 108 and through optical element 110 toward the interface surface 112.


As such, optical device 102 further includes a light sensor 116 directed toward interface surface 112 of optical element 110 closest to the light source 104 (i.e., a light-source side of the optical element 110) so as to detect such light traveling from interface surface 112 toward light sensor 116. Light sensor 116 may comprise any suitable sensor for detecting an intensity of light traveling from interface surface 112 of optical element 110 to light sensor 116. Examples include, but are not limited to, photodetectors and image sensors.


Optical device 102 further includes a controller 118 configured to perform various device functions. For example, where the optical device 102 is a structured light depth sensor, the controller 118 may be configured to control the projection of a structured light pattern, and to determine a distance of objects located in front of the depth sensor via an image of the structured light pattern, as described above. Further, controller 118 may be configured to detect an optical fault condition based upon a signal received from the light sensor 116. Controller 118 may determine an optical fault condition in any suitable manner. For example, controller 118 may monitor an intensity of light received from interface surface 112 as measured by light sensor 116, and compare the measured intensity of light to one or more threshold intensity values. Controller 118 may be further configured to apply one or more response actions upon detecting an optical fault condition. For example, controller 118 may be further configured to change a power state of optical device 102 if an upper or lower threshold is met (e.g. shut off light source 104), and/or display a warning message on a display device. Methods of optical fault monitoring are described in more detail hereafter with reference to FIGS. 2-4.



FIG. 2 shows a flow diagram of an embodiment of an example method 200 of monitoring for optical system faults in an optical system, wherein the optical system comprises a light source, a light outlet, and an optical element disposed between the light source and the light outlet, as described above. At 202, method 200 includes detecting an intensity of light traveling from the interface surface of the optical element to the light sensor. As described above, in some embodiments, an interface surface of an optical element may comprise the surface of the optical element closest to the light source. The intensity of the light may be detected via any suitable sensor, including but not limited to a photodetector and/or an image sensor directed toward the interface surface of the optical element closest to the light source.


Next, at 204, method 200 includes comparing the intensity of light detected to one or more threshold intensity values, and then at 206, determining if an optical fault condition exists based on this comparison. As will be described in more detail hereafter with reference to FIG. 3, identifying an optical system fault condition may include determining that the intensity of light detected is less than a threshold value, greater than a threshold value, outside of an operating range of expected values, etc. Examples of optical system fault conditions include, but are not limited to, a change in a location of the optical element within the optical system, a physical object close to or blocking the light outlet, a contamination of the optical element, and other such conditions that may interfere with proper optical system operation.


Continuing with FIG. 2, if it is determined at 206 that an optical fault is not detected, then method 200 returns to 202. However, if it is determined at 206 that an optical fault is detected, at 208, method 200 includes modifying operation of the optical system based upon the optical system fault condition. The operation of the optical system may be modified in any suitable manner depending upon the nature of the optical fault detected. Examples include, but are not limited to, changing a power state of the optical device, performing a corrective action, displaying a warning message on a display device, displaying a message prompting a user to perform an action, etc. The optical system may further determine whether or not the user has performed the action, and the optical system may then further modify operation of the optical system based on this determination.
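For illustration only, the detect-compare-respond loop of method 200 might be sketched as follows; the threshold values, fault names, and the response callback here are hypothetical placeholders and are not part of the disclosed embodiments.

```python
# Illustrative sketch of the monitoring loop of method 200 (FIG. 2).
# LOWER_THRESHOLD, UPPER_THRESHOLD, and the fault labels are assumed
# placeholder values, not values taken from the disclosure.

LOWER_THRESHOLD = 0.2   # below this: optical element may have moved
UPPER_THRESHOLD = 0.8   # above this: an object may block the light outlet

def identify_fault(intensity):
    """Compare a detected intensity to threshold values (steps 204-206)."""
    if intensity < LOWER_THRESHOLD:
        return "element_displaced"
    if intensity > UPPER_THRESHOLD:
        return "object_blocking"
    return None  # no fault detected; loop returns to detection (step 202)

def monitor_step(intensity, modify_operation):
    """One pass of method 200: on a fault, modify operation (step 208)."""
    fault = identify_fault(intensity)
    if fault is not None:
        modify_operation(fault)  # e.g., power change, warning message
    return fault
```

The response action is passed in as a callback because, as the text notes, the appropriate modification depends on the nature of the fault detected.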


As an example, in one embodiment, the optical system may determine an optical fault condition indicating presence of a physical object located on a light-outlet side of the optical element and in the optical path of the optical element. The optical system may in response display on a display device a warning message asking the user to remove the physical object. If the optical system determines that the physical object has not been removed, for example after a predetermined time duration, the optical system may further modify operation of the optical system by performing a shutdown operation. Additional examples of optical fault conditions and corrective actions are described hereafter.



FIG. 3 shows a flow diagram of another embodiment of an example method 300 of monitoring for optical faults. Method 300 may be performed, for example, by a structured light depth-sensing camera comprising a light source, a light outlet, and a diffractive optical element disposed between the light source and the light outlet. FIG. 4 shows a schematic depiction of an example embodiment of a depth-sensing camera 400 comprising a light source 402 configured to output a beam of light 404 that is directed through a diffractive optical element 406 toward a light outlet 408. As a nonlimiting example, light source 402 may comprise a laser diode, and may utilize a lens 410 to collimate the beam of light as indicated at 412. Diffractive optical element 406 then outputs diffracted light through light outlet 408 as a structured pattern, as indicated at 414.


Returning to FIG. 3, at 302 method 300 includes detecting, via a photodetector located on a light-source side of the DOE, an intensity of light traveling from the diffractive optical element. As described above, a diffractive optical element and other optical components may be bidirectional: in addition to transmitting and diffracting light received from the light source, such an element may also receive light at the light outlet and transmit that light toward the photodetector. FIG. 4 shows an example photodetector 416 located on the light-source side of diffractive optical element 406, and configured to measure an intensity of light traveling from the light-source side of diffractive optical element 406 via inherent reflections.


Continuing with FIG. 3, method 300 next includes comparing the measured intensity of the light to one or more threshold values. Two thresholds are described in more detail as follows; however, it is to be understood that additional and/or other comparisons to additional and/or other threshold values may also be made without departing from the scope of this disclosure. At 304, method 300 includes determining if the intensity of light is less than a first threshold value. If it is determined that the intensity of light is less than the first threshold value, at 306 method 300 includes identifying an optical fault due to a change in a location of the diffractive optical element. For example, the diffractive optical element may have fallen, become dislodged, broken, etc., such that it is no longer properly located within the optical path, thus reducing the intensity of (unintended but inherent) light reflected from the diffractive optical element interface that reaches the photodetector. In this case, as indicated at 308, method 300 may include applying a first corrective action. For example, in some specific embodiments, the first corrective action may include performing a shutdown operation on the projector or the overall depth-sensing camera, as indicated at 310. It will be understood that the term “shutdown” operation as used herein refers to any operation in which the projected beam of light is shut off, whether or not other device components remain powered.


Continuing with FIG. 3, if it is determined that the intensity of light is not less than the first threshold value, then method 300 proceeds to 312, where it is determined if the intensity of light is greater than a second threshold value. If it is determined that the intensity of light is greater than the second threshold value, at 314 method 300 includes identifying an optical fault due to a physical object blocking the projected beam of light. Such a physical object may be proximal to the light outlet such that light exiting the light outlet is reflected by the physical object and returns back through the diffractive optical element toward the photodetector, thus increasing the intensity of light reaching the photodetector and causing the second threshold to be reached. In this case, method 300 comprises applying a second corrective action. As an example, the second corrective action may include displaying a warning message on a display device, as indicated at 318. For example, the warning message may indicate to a user that there may be a physical object present in the optical path of the depth-sensing camera interfering with proper operation of the depth-sensing camera, and request that the user remove the physical object to continue operation of the depth-sensing camera. It will be understood that any other suitable corrective action may be applied in other embodiments.


In some embodiments, method 300 may further include determining that no response has yet been taken to the warning message, for example, within a predetermined time duration, and performing another corrective action, such as performing a shutdown operation. Then, in some embodiments, the depth-sensing camera may periodically be re-powered to determine whether the object has been removed from the beam path. In some embodiments, after performing a shutdown operation, a response may be detected to the warning message and the optical system may be returned to a normal operating state. In other embodiments, the depth-sensing camera may remain in the shut-down state until re-activated by a user.
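The warning-then-shutdown behavior described above can be viewed as a small state machine; the sketch below is one hypothetical realization, in which the state names, the timeout value, and the tick-based interface are all assumptions for illustration rather than part of the disclosure.

```python
# Hypothetical state machine for the warning/shutdown handling described
# above: warn, shut down if no response within a timeout, and return to
# normal operation once the blocking object is removed.

class WarningHandler:
    TIMEOUT = 10.0  # assumed seconds to wait for the user to respond

    def __init__(self):
        self.state = "warning"   # warning displayed, awaiting response
        self.elapsed = 0.0

    def tick(self, dt, object_removed):
        """Advance dt seconds; returns the current state."""
        if self.state == "shutdown":
            if object_removed:          # response detected after shutdown:
                self.state = "normal"   # return to normal operating state
            return self.state
        if object_removed:
            self.state = "normal"       # user removed the object in time
        else:
            self.elapsed += dt
            if self.elapsed >= self.TIMEOUT:
                self.state = "shutdown" # no response: shut the beam off
        return self.state
```

In embodiments where the camera instead remains shut down until re-activated by a user, the `shutdown` branch would simply omit the transition back to `normal`.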


Continuing with FIG. 3, if it is determined at 312 that the intensity of light is not greater than a second threshold value, then method 300 returns to 302.


As described above, any other additional and/or alternative threshold comparisons may be used to determine other fault conditions without departing from the scope of this disclosure. For example, in some embodiments, method 300 may include determining if the intensity of light is outside of an operating range of accepted values, for example, due to contamination of an optical component (e.g. moisture on the diffractive optical element, etc.). If it is determined that the intensity of light is outside of such an operating range, method 300 may include identifying an optical fault due to degraded performance of the diffractive optical element or other optical element, and applying a third corrective action.
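The threshold comparisons of method 300, including the optional operating-range check just described, might be sketched as follows; the numeric threshold values, the operating range, and the corrective-action names are assumptions chosen for illustration only.

```python
# Illustrative classification of fault conditions per method 300 (FIG. 3).
# All numeric values and action names are hypothetical placeholders.

FIRST_THRESHOLD = 0.2          # step 304: below -> DOE displaced
SECOND_THRESHOLD = 0.8         # step 312: above -> object blocking outlet
OPERATING_RANGE = (0.3, 0.7)   # optional check: outside -> degraded element

def classify_fault(intensity):
    """Map a photodetector reading to a (fault, corrective action) pair."""
    if intensity < FIRST_THRESHOLD:
        # Steps 306-310: element no longer properly located; shut off beam.
        return ("doe_displaced", "shutdown")
    if intensity > SECOND_THRESHOLD:
        # Steps 314-318: reflection from a blocking object; warn the user.
        return ("object_blocking", "display_warning")
    lo, hi = OPERATING_RANGE
    if not (lo <= intensity <= hi):
        # Degraded performance (e.g., contamination such as moisture).
        return ("degraded_element", "third_corrective_action")
    return (None, None)  # no fault; monitoring returns to step 302
```

Note that the order of the comparisons matters in this sketch: the first and second thresholds are checked before the narrower operating range, mirroring the sequence of decisions in FIG. 3.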


In some embodiments, the above-described optical system and methods may be tied to a computing device. As an example, a depth-sensing camera may be included within a gaming system including a gaming console and a display device. It will be appreciated that the computing devices described herein may be any suitable computing device configured to execute the programs described herein. For example, the computing devices may be a mainframe computer, personal computer, laptop computer, portable data assistant (PDA), computer-enabled wireless telephone, networked computing device, or other suitable computing device, and may be connected to each other via computer networks, such as the Internet. These computing devices typically include a processor and associated volatile and non-volatile memory, and are configured to execute programs stored in non-volatile memory using portions of volatile memory and the processor. As used herein, the term “program” refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that computer-readable media may be provided having program instructions stored thereon, which upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.


It should be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof are therefore intended to be embraced by the claims.

Claims
  • 1. In an optical system comprising a light source, a light outlet, and an optical element disposed between the light source and the light outlet, a method of monitoring for optical system faults, the method comprising: detecting, via a light sensor configured to receive light from an interface surface of the optical element closest to the light source, an intensity of light traveling from the interface surface of the optical element to the light sensor;if a first optical system fault condition, due to a change in location of the optical element, is identified based upon the intensity of light, then modifying operation of the optical system in a first manner; andif a second optical system fault condition, due to a physical object blocking an optical path, is identified based upon the intensity of light, then modifying operation of the optical system in a second manner that is different than the first manner.
  • 2. The method of claim 1, further comprising identifying the first optical system fault condition by determining that the intensity of light detected, due to the change in location of the optical element, is less than a threshold value.
  • 3. The method of claim 2, wherein modifying operation of the optical system in the first manner includes performing a shutdown operation to the optical system.
  • 4. The method of claim 1, wherein identifying the second optical system fault condition includes determining that the intensity of light detected, based upon the physical object blocking the optical path, is greater than a threshold value.
  • 5. The method of claim 4, wherein modifying operation of the optical system in the second manner includes providing a warning message for display on a display device.
  • 6. The method of claim 5, further comprising detecting no response to the warning message, and performing a shutdown operation to the optical system.
  • 7. The method of claim 6, further comprising, after performing the shutdown operation, detecting a response to the warning message and in response, returning the optical system to a normal operating state.
  • 8. An optical device, comprising: a light source configured to output a beam of light;a diffractive optical element configured to receive the beam of light on a light-source side of the diffractive optical element and to diffract the beam of light to form a structured pattern;a photodetector directed toward the light-source side of the diffractive optical element, the photodetector configured to measure an intensity of light traveling from the light-source side of the diffractive optical element; anda controller configured to distinguish one or more optical fault conditions from proper optical device operation by monitoring the intensity of light as measured by the photodetector, and to take action corresponding to the one or more optical fault conditions.
  • 9. The optical device of claim 8, wherein the controller is further configured to change a power state of the optical device if a first optical fault condition is detected.
  • 10. The optical device of claim 9, wherein the controller is configured to detect the first optical fault condition by comparing the intensity of light to a first threshold.
  • 11. The optical device of claim 8, wherein the controller is further configured to provide a warning message for display if a second optical fault condition is detected.
  • 12. The optical device of claim 11, wherein the controller is configured to detect the second optical fault condition by comparing the intensity of light to a second threshold that is different from a first threshold used to detect the first optical fault condition.
  • 13. The optical device of claim 11, wherein the controller is further configured to perform a shutdown operation if no response to the warning message is detected.
  • 14. The optical device of claim 13, wherein the controller is further configured to, after performing the shutdown operation, detect a response to the warning message and in response, return the optical system to a normal operating state.
  • 15. The optical device of claim 8, wherein the optical device is a depth-sensing camera.
  • 16. An optical system, comprising: a light source;a light outlet;one or more optical elements;a light sensor configured to receive light from an interface surface of an optical element closest to the light source; anda controller configured to: control the light source to project a beam of light through the one or more optical elements;detect, via the light sensor, an intensity of light transmitted through the optical element to the light sensor;compare the intensity of light detected to one or more threshold intensity values;identify an optical system fault condition based on comparing the intensity of light detected to the one or more threshold intensity values; andtake one or more actions based upon the optical system fault condition.
  • 17. The method of claim 16, wherein identifying the optical system fault condition includes determining that the intensity of light detected is less than a threshold value.
  • 18. The method of claim 17, wherein modifying operation of the optical system includes performing a shutdown operation to the optical system.
  • 19. The method of claim 16, wherein identifying the optical system fault condition includes determining that the intensity of light detected is greater than a threshold value.
  • 20. The method of claim 19, wherein modifying operation of the optical system includes displaying a warning message.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/559,160, titled OPTICAL FAULT MONITORING and filed Sep. 14, 2009, the disclosure of which is hereby incorporated by reference.

US Referenced Citations (189)
Number Name Date Kind
4627620 Yang Dec 1986 A
4630910 Ross et al. Dec 1986 A
4645458 Williams Feb 1987 A
4695953 Blair et al. Sep 1987 A
4702475 Elstein et al. Oct 1987 A
4711543 Blair et al. Dec 1987 A
4751642 Silva et al. Jun 1988 A
4796997 Svetkoff et al. Jan 1989 A
4809065 Harris et al. Feb 1989 A
4817950 Goo Apr 1989 A
4843568 Krueger et al. Jun 1989 A
4893183 Nayar Jan 1990 A
4901362 Terzian Feb 1990 A
4925189 Braeunig May 1990 A
5101444 Wilson et al. Mar 1992 A
5148154 MacKay et al. Sep 1992 A
5184295 Mann Feb 1993 A
5229754 Aoki et al. Jul 1993 A
5229756 Kosugi et al. Jul 1993 A
5239463 Blair et al. Aug 1993 A
5239464 Blair et al. Aug 1993 A
5288078 Capper et al. Feb 1994 A
5291028 Droge et al. Mar 1994 A
5295491 Gevins Mar 1994 A
5320538 Baum Jun 1994 A
5347306 Nitta Sep 1994 A
5385519 Hsu et al. Jan 1995 A
5405152 Katanics et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5423554 Davis Jun 1995 A
5454043 Freeman Sep 1995 A
5469740 French et al. Nov 1995 A
5495576 Ritchey Feb 1996 A
5516105 Eisenbrey et al. May 1996 A
5524637 Erickson Jun 1996 A
5534917 MacDougall Jul 1996 A
5563988 Maes et al. Oct 1996 A
5577981 Jarvik Nov 1996 A
5580249 Jacobsen et al. Dec 1996 A
5594469 Freeman et al. Jan 1997 A
5597309 Riess Jan 1997 A
5616078 Oh Apr 1997 A
5617312 Iura et al. Apr 1997 A
5638300 Johnson Jun 1997 A
5641288 Zaenglein, Jr. Jun 1997 A
5682196 Freeman Oct 1997 A
5682229 Wangler Oct 1997 A
5690582 Ulrich et al. Nov 1997 A
5703367 Hashimoto et al. Dec 1997 A
5704837 Iwasaki et al. Jan 1998 A
5715834 Bergamasco et al. Feb 1998 A
5875108 Hoffberg et al. Feb 1999 A
5877803 Wee et al. Mar 1999 A
5913727 Ahdoot Jun 1999 A
5933125 Fernie et al. Aug 1999 A
5980256 Carmein Nov 1999 A
5989157 Walton Nov 1999 A
5995649 Marugame Nov 1999 A
6005548 Latypov et al. Dec 1999 A
6009210 Kang Dec 1999 A
6054991 Crane et al. Apr 2000 A
6066075 Poulton May 2000 A
6072494 Nguyen Jun 2000 A
6073489 French et al. Jun 2000 A
6077201 Cheng et al. Jun 2000 A
6098458 French et al. Aug 2000 A
6100896 Strohecker et al. Aug 2000 A
6101289 Kellner Aug 2000 A
6128003 Smith et al. Oct 2000 A
6130677 Kunz Oct 2000 A
6141463 Covell et al. Oct 2000 A
6147678 Kumar et al. Nov 2000 A
6152856 Studor et al. Nov 2000 A
6159100 Smith Dec 2000 A
6173066 Peurach et al. Jan 2001 B1
6181343 Lyons Jan 2001 B1
6188777 Darrell et al. Feb 2001 B1
6215890 Matsuo et al. Apr 2001 B1
6215898 Woodfill et al. Apr 2001 B1
6226396 Marugame May 2001 B1
6229913 Nayar et al. May 2001 B1
6256033 Nguyen Jul 2001 B1
6256400 Takata et al. Jul 2001 B1
6283860 Lyons et al. Sep 2001 B1
6289112 Jain et al. Sep 2001 B1
6299308 Voronka et al. Oct 2001 B1
6308565 French et al. Oct 2001 B1
6316934 Amorai-Moriya et al. Nov 2001 B1
6363160 Bradski et al. Mar 2002 B1
6384819 Hunter May 2002 B1
6411744 Edwards Jun 2002 B1
6430997 French et al. Aug 2002 B1
6476834 Doval et al. Nov 2002 B1
6496598 Harman Dec 2002 B1
6503195 Keller et al. Jan 2003 B1
6509967 Pingel et al. Jan 2003 B1
6539931 Trajkovic et al. Apr 2003 B2
6570555 Prevost et al. May 2003 B1
6633294 Rosenthal et al. Oct 2003 B1
6640202 Dietz et al. Oct 2003 B1
6661918 Gordon et al. Dec 2003 B1
6681031 Cohen et al. Jan 2004 B2
6714665 Hanna et al. Mar 2004 B1
6731799 Sun et al. May 2004 B1
6738066 Nguyen May 2004 B1
6765726 French et al. Jul 2004 B2
6788809 Grzeszczuk et al. Sep 2004 B1
6801637 Voronka et al. Oct 2004 B2
6873723 Aucsmith et al. Mar 2005 B1
6876496 French et al. Apr 2005 B2
6937742 Roberts et al. Aug 2005 B2
6950534 Cohen et al. Sep 2005 B2
7003134 Covell et al. Feb 2006 B1
7036094 Cohen et al. Apr 2006 B1
7038855 French et al. May 2006 B2
7039676 Day et al. May 2006 B1
7042440 Pryor et al. May 2006 B2
7050606 Paul et al. May 2006 B2
7058204 Hildreth et al. Jun 2006 B2
7060957 Lange et al. Jun 2006 B2
7113918 Ahmad et al. Sep 2006 B1
7121946 Paul et al. Oct 2006 B2
7149383 Chen Dec 2006 B2
7162114 Donval et al. Jan 2007 B2
7170492 Bell Jan 2007 B2
7184048 Hunter Feb 2007 B2
7184585 Hamza et al. Feb 2007 B2
7185987 Tamura Mar 2007 B2
7202898 Braun et al. Apr 2007 B1
7222078 Abelow May 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7259747 Bell Aug 2007 B2
7308112 Fujimura et al. Dec 2007 B2
7317836 Fujimura et al. Jan 2008 B2
7348963 Bell Mar 2008 B2
7359121 French et al. Apr 2008 B2
7367887 Watabe et al. May 2008 B2
7379563 Shamaie May 2008 B2
7379566 Hildreth May 2008 B2
7389591 Jaiswal et al. Jun 2008 B2
7412077 Li et al. Aug 2008 B2
7421093 Hildreth et al. Sep 2008 B2
7430312 Gu Sep 2008 B2
7436496 Kawahito Oct 2008 B2
7450736 Yang et al. Nov 2008 B2
7452275 Kuraishi Nov 2008 B2
7460690 Cohen et al. Dec 2008 B2
7489812 Fox et al. Feb 2009 B2
7536032 Bell May 2009 B2
7555142 Hildreth et al. Jun 2009 B2
7560701 Oggier et al. Jul 2009 B2
7570805 Gu Aug 2009 B2
7574020 Shamaie Aug 2009 B2
7576727 Bell Aug 2009 B2
7590262 Fujimura et al. Sep 2009 B2
7593552 Higaki et al. Sep 2009 B2
7598942 Underkoffler et al. Oct 2009 B2
7607509 Schmiz et al. Oct 2009 B2
7620202 Fujimura et al. Nov 2009 B2
7668340 Cohen et al. Feb 2010 B2
7680298 Roberts et al. Mar 2010 B2
7683954 Ichikawa et al. Mar 2010 B2
7684592 Paul et al. Mar 2010 B2
7701439 Hillis et al. Apr 2010 B2
7702130 Im et al. Apr 2010 B2
7704135 Harrison, Jr. Apr 2010 B2
7710391 Bell et al. May 2010 B2
7729530 Antonov et al. Jun 2010 B2
7746345 Hunter Jun 2010 B2
7760182 Ahmad et al. Jul 2010 B2
7809167 Bell Oct 2010 B2
7834846 Bell Nov 2010 B1
7852262 Namineni et al. Dec 2010 B2
RE42256 Edwards Mar 2011 E
7898522 Hildreth et al. Mar 2011 B2
8028918 Zhang et al. Oct 2011 B2
8035612 Bell et al. Oct 2011 B2
8035614 Bell et al. Oct 2011 B2
8035624 Bell et al. Oct 2011 B2
8072470 Marks Dec 2011 B2
20020109844 Christel et al. Aug 2002 A1
20030103211 Lange et al. Jun 2003 A1
20050199725 Craen et al. Sep 2005 A1
20050200840 Terui Sep 2005 A1
20070215822 Wuestefeld Sep 2007 A1
20070295814 Tanaka et al. Dec 2007 A1
20080026838 Dunstan et al. Jan 2008 A1
20080101843 Murahashi et al. May 2008 A1
20090084851 Vinogradov et al. Apr 2009 A1
Foreign Referenced Citations (7)
Number Date Country
101254344 Jun 2010 CN
0583061 Feb 1994 EP
08044490 Feb 1996 JP
9310708 Jun 1993 WO
9717598 May 1997 WO
9944698 Sep 1999 WO
2008109932 Sep 2008 WO
Non-Patent Literature Citations (30)
Entry
Kanade, et al., "A Stereo Machine for Video-rate Dense Depth Mapping and Its New Applications", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1996, pp. 196-202, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA.
Miyagawa, et al., “CCD-Based Range Finding Sensor”, Oct. 1997, pp. 1648-1652, vol. 44 No. 10, IEEE Transactions on Electron Devices.
Rosenhahn, et al., “Automatic Human Model Generation”, 2005, pp. 41-48, University of Auckland (CITR), New Zealand.
Aggarwal, et al., “Human Motion Analysis: A Review”, IEEE Nonrigid and Articulated Motion Workshop, 1997, University of Texas at Austin, Austin, TX, pp. 13.
Shao, et al., “An Open System Architecture for a Multimedia and Multimodal User Interface”, Aug. 24, 1998, Japanese Society for Rehabilitation of Persons with Disabilities (JSRPD), Japan, pp. 8.
Kohler, Markus., “Special Topics of Gesture Recognition Applied in Intelligent Home Environments”, In Proceedings of the Gesture Workshop, 1998, pp. 285-296, Germany.
Kohler, Markus., “Vision Based Remote Control in Intelligent Home Environments”, University of Erlangen-Nuremberg/Germany, 1996, pp. 147-154, Germany.
Kohler, Markus., “Technical Details and Ergonomical Aspects of Gesture Recognition applied in Intelligent Home Environments”, 1997, Germany, pp. 35.
Hasegawa, et al., “Human-Scale Haptic Interaction with a Reactive Virtual Human in a Real-Time Physics Simulator”, Jul. 2006, vol. 4, No. 3, Article 6C, ACM Computers in Entertainment, New York, NY, pp. 12.
Qian, et al., “A Gesture-Driven Multimodal Interactive Dance System”, Jun. 2004, pp. 1579-1582, IEEE International Conference on Multimedia and Expo (ICME), Taipei, Taiwan.
Zhao, Liang., “Dressed Human Modeling, Detection, and Parts Localization”, 2001, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, pp. 121.
He, Lei., “Generation of Human Body Models”, Apr. 2005, University of Auckland, New Zealand, pp. 111.
Isard, et al., “Condensation—Conditional Density Propagation for Visual Tracking”, 1998, pp. 5-28, International Journal of Computer Vision 29(1), Netherlands.
Livingston, Mark Alan., “Vision-based Tracking with Dynamic Structured Light for Video See-through Augmented Reality”, 1998, University of North Carolina at Chapel Hill, North Carolina, USA, pp. 145.
Wren et al., "Pfinder: Real-Time Tracking of the Human Body", MIT Media Laboratory Perceptual Computing Section Technical Report No. 353, Jul. 1997, vol. 19, No. 7, pp. 780-785, IEEE Transactions on Pattern Analysis and Machine Intelligence, Cambridge, MA.
Breen, et al., "Interactive Occlusion and Collision of Real and Virtual Objects in Augmented Reality", Technical Report ECRC-95-02, 1995, European Computer-Industry Research Center GmbH, Munich, Germany, pp. 22.
Freeman, et al., "Television Control by Hand Gestures", Dec. 1994, Mitsubishi Electric Research Laboratories, TR94-24, Cambridge, MA, pp. 7.
Hongo, et al., “Focus of Attention for Face and Hand Gesture Recognition Using Multiple Cameras”, Mar. 2000, pp. 156-161, 4th IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France.
Pavlovic, et al., “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review”, Jul. 1997, pp. 677-695, vol. 19, No. 7, IEEE Transactions on Pattern Analysis and Machine Intelligence.
Azarbayejani, et al., “Visually Controlled Graphics”, Jun. 1993, vol. 15, No. 6, IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 602-605.
Granieri, et al., “Simulating Humans in VR”, The British Computer Society, Oct. 1994, Academic Press, pp. 15.
Brogan, et al., “Dynamically Simulated Characters in Virtual Environments”, Sep./Oct. 1998, pp. 2-13, vol. 18, Issue 5, IEEE Computer Graphics and Applications.
Fisher, et al., “Virtual Environment Display System”, ACM Workshop on Interactive 3D Graphics, Oct. 1986, Chapel Hill, NC, pp. 12.
“Virtual High Anxiety”, Tech Update, Aug. 1995, pp. 1.
Sheridan, et al., “Virtual Reality Check”, Technology Review, Oct. 1993, pp. 22-28, vol. 96, No. 7.
Stevens, “Flights into Virtual Reality Treating Real World Disorders”, The Washington Post, Mar. 27, 1995, Science Psychology, pp. 2.
“Simulation and Training”, 1994, Division Incorporated, pp. 6.
Taga, et al., "Power Penalty Due to Optical Back Reflection in Semiconductor Optical Amplifier Repeater Systems", Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=00053262>>, IEEE Photonics Technology Letters, vol. 2, No. 4, Apr. 1990, pp. 279-281.
"Beam Dump", Retrieved at <<http://en.wikipedia.org/wiki/Beam_dump>>, Jul. 22, 2009, pp. 2.
Swain, et al., "Manufacturing Low Insertion Loss Fiber-Lens Elements", Retrieved at <<http://www.photon-inc.com/support/library/pdf/Mfg_FiberLensElements.pdf>>, pp. 1-7.
Related Publications (1)
Number Date Country
20130056615 A1 Mar 2013 US
Continuations (1)
Number Date Country
Parent 12559160 Sep 2009 US
Child 13667915 US