The technology herein relates to direction finding and pointing, and more particularly to pointing and/or targeting techniques, devices and systems. Still more particularly, the technology herein relates to position indicating light-emitting devices, and to video game and other remote control devices that sense such light-emitting devices for use in determining where they are pointing.
Since ancient times, people have used lights in the sky to find their way. An experienced celestial navigator can find his or her way at night or in the daytime with great accuracy just by noting the positions of the points of light in the heavens above us. Such “celestial navigation” makes use of observed positions of the North star and other stars, the Sun, the Moon and certain planets to ascertain position and direction. Even a young child knows that the Sun rises in the East and sets in the West. By using a sextant or other measuring device, navigators ancient and modern have been able to nearly exactly ascertain their position (e.g., longitude and latitude) on the earth's surface.
Artificial lights can also be used to ascertain heading. For example, a sailor piloting a ship off the coast can estimate the direction and distance to port by observing the lights of cities, towns and lighthouses. Aircraft can ascertain position by observing many light beacons on radio towers or other structures.
In the electronic world, it is possible for a computer-based camera device or other light sensor or detector to automatically ascertain distance relative to man-made light sources by measuring the spacing between detected points of light or other illumination. If the detector “sees” two spaced-apart light sources as being close together, then the detector is relatively far away from the light sources. If the detector detects that the two spaced-apart light sources have “moved” farther apart, then the detector has moved closer to the light sources. Furthermore, just as someone familiar with the summer night sky can tell which direction they are looking simply by observing the orientation of patterns of the star constellations they see, a light detector provided with the appropriate processing capabilities (e.g., software and/or hardware) can determine some aspects of its orientation relative to point light sources based on received light pattern orientation.
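As a non-limiting illustration, the distance and orientation determinations described above can be sketched with a simple pinhole-camera approximation. The function name, the field-of-view figure used in the example, and the assumed physical spacing between the two light sources are hypothetical values chosen for illustration only:

```python
import math

def estimate_distance_and_roll(p1, p2, real_spacing_m, image_width_px, horizontal_fov_deg):
    """Estimate detector distance and roll angle from two imaged point sources.

    p1, p2: (x, y) pixel coordinates of the two detected lights.
    real_spacing_m: known physical spacing between the light sources.
    Assumes an ideal pinhole camera with the given horizontal field of view.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    separation_px = math.hypot(dx, dy)
    # Focal length in pixels, derived from the horizontal field of view.
    focal_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)
    # Similar triangles: real spacing / distance == pixel separation / focal length.
    distance_m = real_spacing_m * focal_px / separation_px
    # Orientation of the light pair in the image indicates the detector's roll.
    roll_deg = math.degrees(math.atan2(dy, dx))
    return distance_m, roll_deg
```

Consistent with the observation above, when the two imaged points appear closer together the estimated distance grows: halving the apparent pixel separation doubles the distance estimate.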
Such principles can be used for various applications, including, but not limited to detecting some aspects of the orientation of a handheld or other pointing or control device relative to a variety of target surfaces, such as display surfaces. For example, using such techniques, it is possible to detect how a handheld pointing device is aimed toward a display or other presentation surface. Such a handheld pointing device may be for example as described in US2007/0066394 filed Sep. 15, 2006, incorporated herein by reference.
While many display devices, including, but not limited to, computer, television, graphical and other displays, are capable of generating and emitting light, such light emissions are generally for the purpose of conveying information to human eyes. For example, the electron beam scanning performed by a conventional television display or computer monitor causes display pixels to emit visible light in the form of visible images. Some in the past have used these same displays to generate machine-recognizable light targets (e.g., so-called "flying spot scanners"). In addition, some display light detection techniques have been commercially successful. For example, a so-called "light gun" was used with early Nintendo video games such as "Duck Hunt" to determine where on a display screen the user was aiming a simulated weapon. However, further improvements and additional techniques are possible and desirable.
The technology herein provides exemplary illustrative non-limiting systems, methods, devices and techniques for supplying convenient and effective targeting or “marking” light sources for use with presentation surfaces including but not limited to 2D and 3D video display systems. Useful non-limiting applications include electronic and non-electronic displays of all types such as televisions, computer monitors, light projection systems, whiteboards, blackboards, easels and any other presentation or other surface imaginable. Such targeting or marking can be used for example to control cursors, other symbols, or objects on electronic displays.
An exemplary illustrative non-limiting implementation provides an elongated member such as a bar shaped housing including spaced-apart point light sources. The point sources could be disposed within a housing of any shape, could be in separate housings, could be included in the display's housing, etc. In one exemplary illustrative non-limiting implementation, the elongated member may have first and second ends. A point source may be disposed on each end or anywhere else along the housing. In one exemplary illustrative non-limiting implementation, the spacing between the two spaced-apart point sources may be 20 centimeters or more. The point sources could be closer together or further apart.
In an exemplary illustrative non-limiting implementation, each point source comprises an array of plural point illumination sources. The plural point illumination sources in each array may be directional. The point sources may be aimed in different directions to provide different illumination patterns. For example, some (e.g., three) of the point sources can have a primary radiation directionality (lobes) that is substantially perpendicular to a front face of the bar-like structure, whereas other point sources can have primary radiation directionalities (lobes) that define acute angles with respect to such perpendicular direction. In one exemplary illustrative non-limiting implementation, some of the point sources are directed forward, while others are directed outwardly, and still others are directed inwardly with respect to the elongated member. Such expanded irradiation coverage area can provide advantages for multi-player games or the like where two or more spaced-apart detection devices each independently detect the point sources from different positions.
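To illustrate why differently-aimed point sources widen coverage, the combined radiation pattern can be modeled by summing a simple cosine-power lobe for each source. The half-intensity angle and the ±30° aiming angles below are hypothetical figures for illustration, not a specification of the actual implementation:

```python
import math

def lobe_intensity(view_angle_deg, aim_angle_deg, half_angle_deg=17.0):
    """Approximate a directional point source as a cosine-power lobe.

    half_angle_deg is the off-axis angle at which intensity falls to 50%
    (a hypothetical figure of the kind found on LED datasheets).
    """
    # Choose exponent n so that cos(half_angle)**n == 0.5.
    n = math.log(0.5) / math.log(math.cos(math.radians(half_angle_deg)))
    off_axis = abs(view_angle_deg - aim_angle_deg)
    if off_axis >= 90:
        return 0.0
    return math.cos(math.radians(off_axis)) ** n

def array_intensity(view_angle_deg, aim_angles_deg):
    """Total intensity seen from view_angle_deg for sources aimed at aim_angles_deg."""
    return sum(lobe_intensity(view_angle_deg, a) for a in aim_angles_deg)
```

With three sources aimed forward and two fanned outward at ±30°, the modeled intensity at 40° off axis substantially exceeds that of five forward-aimed sources, while on-axis intensity remains adequate for distant detectors.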
In one exemplary illustrative non-limiting implementation, the point sources can be generally oriented to emit light within a common horizontal plane or into different planes (e.g., some upwardly, some downwardly, etc.). Such 3D directionality can provide a potentially wider coverage area horizontally and/or vertically. The point sources could use a single point source in each array. The point source arrays could be oriented in varying directions or in the same direction.
In an exemplary non-limiting implementation of arrays with a plurality of point sources, the point sources in each array may emit the same or different light colors or frequencies of light. For example, one exemplary illustrative non-limiting implementation may provide, on each end of a rigid “marker bar” or other structure, an array of differently-aimed infrared point light sources, with the different point light sources emitting the same frequencies or wavelengths of infrared or other light. Other arrangements are possible.
In an exemplary illustrative non-limiting implementation, the elongated member may comprise a rigid bar or other structure that is especially adapted for mounting to the top, bottom, side or other dimension of an electronic display device such as a television set. Such light emitting bar structure can be mounted by a variety of convenient means including but not limited to adhesive tape, Velcro, gravity, interlocking parts, or any other desired mechanism. The device could also be affixed to a stand on which the display sits or to which the display is attached. Still other arrangements could provide structures that are integral or partially integral to display devices.
Additional aspects of exemplary illustrative non-limiting implementations include:
These and other non-limiting illustrative exemplary aspects will be better and more completely understood by referring to the following description of exemplary illustrative non-limiting implementations in conjunction with the drawings, of which:
The
Housing 102 shown in
In the example shown, printed circuit boards 116, 118 each have light emitting components mounted thereon. For example, printed circuit board 116 is shown with four light emitting diodes 120a, 120b, 120c, 120d mounted thereon. Similarly, printed circuit board 118 is shown with four light emitting diodes 122a, 122b, 122c, 122d mounted thereon. In another exemplary illustrative non-limiting implementation, each of printed circuit boards 116, 118 may be provided with five (5) light emitting diodes to provide adequate on-axis intensity to extend on-axis range while also providing some off-axis coverage as well. More or fewer light emitting diodes can be used in other exemplary illustrative non-limiting implementations. The light emitting diodes can emit infrared radiation at for example 940 nm.
One advantage of targeting or marking device 100 emitting infrared light is that such emitted infrared (IR) light will generally not be interfered with by visible light including the light emitted by conventional television and computer cathode ray tubes, plasma type displays, liquid crystal displays and other illuminated human readable displays. On the other hand, care may need to be taken to ensure that heat sources (e.g., fireplaces, candles, heat lamps, sunlight streaming in a nearby window, etc.) do not interfere. Usually, users locate their televisions or other display devices away from such heat sources and interference may not be a concern. In addition, a remote portable sensing device designed to sense the infrared light emitted by marking device 100 can be programmed to have adjustable or fixed sensitivity that will allow the sensing device to sense the light emitted by marking device 100 without suffering from undue interference due to ambient heat sources.
Other implementations can use other light generation and/or emission mechanisms for generating and/or emitting light at any frequency including but not limited to lasing, incandescence, chemoluminescence, bioluminescence, fluorescence, radioactive decay, gas discharge, phosphorescence, scintillation, sonoluminescence, triboluminescence, or any other mechanism for generating and/or emitting radiation at any detectable frequency. Furthermore, other exemplary illustrative non-limiting implementations could use fewer light sources and provide additional or different mechanism(s) for spreading, aiming and/or concentrating light or otherwise developing light emission patterns, such mechanisms including lenses, mirrors or any other optics.
As shown in
The exemplary illustrative non-limiting implementation's use of plural light sources aimed in different directions provides a wider illumination coverage area or radiation pattern. This allows sensing devices that are substantially "off axis" with respect to a direction perpendicular to housing surface 110 to nevertheless detect the positions of ports 112, 114. This can be advantageous for example in multiplayer situations in which several (e.g., two, three or four) different players are positioned in front of the same marking device 100 and associated display device to participate in the same activity such as playing a video game. With this many participants, especially in active games in which participants may need room to move around, at least some of the participants may be located substantially off axis with respect to the central aiming direction of the marking device 100. As will be explained in more detail below, the exemplary positioning of some LEDs 120 to aim directly forward while also providing additional LEDs 120a, 120d, 122a, 122d aiming off axis provides both adequate on-axis detection (even for larger rooms) and adequate off-axis detection (especially since off-axis players are not likely to be located very far from the display, so that they can see it at an angle).
For example,
This exemplary implementation allows the same fixed-length marker bar 100 to function for displays of all sizes, since the marker is marking positions in space, not display edges. Marker bars 100 that are variable in size and designed to mark the display edges as well as positions in space can also be used, but in one exemplary illustrative non-limiting implementation, there are advantages in terms of manufacturing and marketing to providing a "one size fits all" marker bar 100 that does not need to be customized, modified or special ordered for differently sized display devices. As anyone who has ever shopped for a television or computer display knows, there is a wide range of display sizes, from subcompact (e.g., 9 inches) to massive (e.g., 54 inches or more). The exemplary illustrative non-limiting marker bar 100 can be used with any such display.
There is therefore no need in the exemplary illustrative non-limiting implementation for the position of marker bar 100 to be aligned with the edges or extents of the corresponding display device.
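One way to see why no alignment with display edges is needed is that a pointing computation can work entirely in normalized sensor-image coordinates. The following sketch maps the midpoint of the two detected marker dots to a cursor position in [0, 1]; the mapping and the assumed image resolution are illustrative assumptions, not the actual product algorithm:

```python
def cursor_from_dots(p1, p2, image_size=(1024, 768)):
    """Map the imaged marker-dot pair to a normalized cursor position.

    The midpoint of the two dots sweeps across the sensor image as the
    handheld device is aimed; normalizing it to [0, 1] yields a cursor
    position independent of the physical display size (a hypothetical
    mapping offered for illustration).
    """
    mid_x = (p1[0] + p2[0]) / 2
    mid_y = (p1[1] + p2[1]) / 2
    # Invert horizontally: panning the device rightward moves the imaged
    # dots leftward across the sensor.
    u = 1.0 - mid_x / image_size[0]
    v = mid_y / image_size[1]
    return u, v
```

Because the result is a normalized position, the same computation serves a 9-inch display and a 54-inch display alike; only the on-screen scaling differs.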
One exemplary illustrative non-limiting application is to use the reflected infrared signals for signaling. For example, such reflected signals can be used to remotely control the operation of the television set 332 or other devices in the room. Typical televisions are remotely controlled by continuous wave (CW) modulated (i.e., on/off switched) infrared signals of a certain wavelength (e.g., 980 nm). Simple mark-and-space binary codes, for example, are often used to switch IR remote control devices for popular television sets. If the main unit 154 applying power to marker bar 100 were to switch the marker bar 100 on and off using such encodings, the infrared signals radiated by the marker bar and reflected from the surrounding room back toward the television set could be used to remotely control the television set to perform certain functions (e.g., turn the television on and off, control volume, etc.). IR signals can bounce off of players as well as walls, and software within main unit 154 could optionally be used to "flash" the marker bar 100 on and off to provide a conventional remote control function. Software can be provided in the system powering the marking device 100 to allow it to be adapted to control a variety of IR controlled devices, from TVs to receivers to cable boxes.
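The mark-and-space switching described above can be illustrated with a pulse-distance encoding in the style of common consumer IR protocols. The specific timings below follow the widely used NEC convention and are offered as an illustration only, not as the actual encoding used by any particular television or console:

```python
def mark_space_encode(bits, mark_us=560, space0_us=560, space1_us=1690):
    """Encode bits as mark/space durations (NEC-style pulse-distance coding).

    Each bit is a fixed-length 'mark' (IR source on) followed by a short
    or long 'space' (IR source off); the space duration distinguishes a
    0 bit from a 1 bit. Returns a list of (is_on, duration_us) pairs
    that a controller could replay by flashing the marker bar.
    """
    timeline = []
    for bit in bits:
        timeline.append((True, mark_us))   # mark: emitters switched on
        timeline.append((False, space1_us if bit else space0_us))  # space: off
    return timeline
```

A receiver at the television decodes the bit stream by measuring the gaps between marks, which is why reflected (rather than direct line-of-sight) illumination can still carry the code.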
Exemplary specifications for an exemplary LED may be as follows:
Alternative Exemplary Non-Limiting Implementations
In one exemplary illustrative non-limiting implementation, a user may wish to verify that there are no bright light sources, including sunlight, behind or near the TV, shining towards the remote device or reflecting off the TV screen. It may be desirable to avoid sources of infrared light in the gameplay area such as electric, propane or kerosene heaters, flames from fireplaces or candles, and stoves or other sources of heat. If there are bright lights shining directly behind the TV or on the screen, it may be best to turn the lights off. A user may wish to make sure that the marker bar 100 is set up correctly. To do this, it may be desirable to check the cord on the marker bar for any frayed wires or kinks, and to verify that the marker bar is free of obstructions. It may also be desirable to verify that the remote sensing device is being used between 3 and 8 feet directly in front of the TV, and that the marker bar is placed properly.
In one exemplary illustrative non-limiting implementation, it may also be desirable to make sure that the marker bar sensitivity is properly set. This sensitivity may be a property of a remote detecting device as opposed to the marker bar itself. The marker bar sensitivity determines the distance the player can be from the TV. If the player moves out of range of the marker bar, the cursor can become erratic. The higher the sensitivity is set, the more susceptible the remote detecting device is to interfering light and infrared heat sources. It may be useful to make sure there are no bright light sources, including sunlight, behind or near the TV, shining towards the remote or reflecting off the TV screen. It may be desirable to avoid sources of infrared light in the gameplay area such as electric, propane or kerosene heaters, flames from fireplaces or candles, and stoves or other sources of heat.
In one exemplary illustrative non-limiting implementation, it may also be desirable to verify the sensitivity setting dots. To do this, it is possible to go to the marker bar sensitivity setting in a user interface associated with the remote sensing device. If only one dot or no dots appear on the sensitivity screen, there is likely a problem with the marker bar. If there are more than two dots, an additional light or infrared heat source is likely being picked up by the remote. Once again, it may be desirable under some circumstances to avoid sources of infrared light in the gameplay area such as electric, propane or kerosene heaters, flames from fireplaces or candles, and stoves or other sources of heat. If both dots appear and the erratic behavior continues even when the player moves closer to the TV, there could be a problem with the remote sensing device.
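The dot-count troubleshooting steps above amount to a simple classification, which can be sketched as follows (a hypothetical illustration of the logic, not actual firmware):

```python
def diagnose_sensor_dots(num_dots):
    """Classify the sensitivity-screen dot count per the steps above.

    Two dots are expected, one per marker-bar point source cluster;
    fewer suggests a marker bar fault, and more suggests a stray light
    or infrared heat source in the room.
    """
    if num_dots <= 1:
        return "possible marker bar problem"
    if num_dots == 2:
        return "ok"
    return "extra light or infrared heat source in room"
```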
It is to be understood that the invention is not to be limited to the disclosed exemplary illustrative non-limiting implementations. On the contrary, the invention is intended to cover various modifications and equivalent arrangements included within the scope of the claims.
This application is a continuation of U.S. application Ser. No. 15/180,337 filed Jun. 13, 2016 (now U.S. Pat. No. 9,694,278 issued Jul. 4, 2017), which is a continuation of U.S. application Ser. No. 13/230,953 filed Sep. 13, 2011 (now U.S. Pat. No. 9,364,755 issued Jun. 14, 2016), which is a division of U.S. application Ser. No. 11/745,842 filed May 8, 2007, which application claims the benefit of priority from provisional Application No. 60/746,769 filed May 8, 2006; and is a continuation-in-part of U.S. application Ser. No. 29/268,255 filed Nov. 2, 2006 (now U.S. Design No. D589965), and also is a continuation-in-part of U.S. application Ser. No. 29/268,254 filed Nov. 2, 2006 (now U.S. Design No. D583876). The entire contents of each of these prior applications are incorporated herein by reference as if expressly set forth.
Number | Name | Date | Kind |
---|---|---|---|
4839838 | LaBiche et al. | Jun 1989 | A |
4862152 | Milner | Aug 1989 | A |
4988981 | Zimmerman | Jan 1991 | A |
5009501 | Fenner | Apr 1991 | A |
5181181 | Glynn | Jan 1993 | A |
5325133 | Adachi | Jun 1994 | A |
5440326 | Quinn | Aug 1995 | A |
5554980 | Hashimoto | Sep 1996 | A |
5574479 | Odell | Nov 1996 | A |
5598187 | Ide et al. | Jan 1997 | A |
5616901 | Crandall | Apr 1997 | A |
5623358 | Madey | Apr 1997 | A |
5627565 | Morishita et al. | May 1997 | A |
5645077 | Foxlin | Jul 1997 | A |
5710623 | Kim | Jan 1998 | A |
5757360 | Nitta et al. | May 1998 | A |
5757530 | Crandall, Jr. | May 1998 | A |
5796354 | Cartabiano et al. | Aug 1998 | A |
5796387 | Curran | Aug 1998 | A |
5867146 | Kim et al. | Feb 1999 | A |
6002138 | Bond et al. | Dec 1999 | A |
6012980 | Yoshida | Jan 2000 | A |
6184863 | Sibert | Feb 2001 | B1 |
6283612 | Hunter | Sep 2001 | B1 |
6297802 | Fujioka | Oct 2001 | B1 |
6424410 | Pelosi | Jul 2002 | B1 |
6450664 | Kelly | Sep 2002 | B1 |
RE37929 | Fernandez | Dec 2002 | E |
6498860 | Sasaki | Dec 2002 | B1 |
6545661 | Goschy et al. | Apr 2003 | B1 |
6636199 | Kobayashi | Oct 2003 | B2 |
6640337 | Lu | Oct 2003 | B1 |
6659623 | Friend | Dec 2003 | B2 |
6903674 | Hoesel et al. | Jun 2005 | B2 |
6967644 | Kobayashi | Nov 2005 | B1 |
6982697 | Wilson et al. | Jan 2006 | B2 |
7039218 | Lin | May 2006 | B2 |
7053932 | Lin et al. | May 2006 | B2 |
7102616 | Sleator | Sep 2006 | B1 |
7151561 | Lin et al. | Dec 2006 | B2 |
7169633 | Huang et al. | Jan 2007 | B2 |
7173652 | Lin et al. | Feb 2007 | B2 |
7199350 | Chien | Apr 2007 | B2 |
7230611 | Bischoff | Jun 2007 | B2 |
7242391 | Lin et al. | Jul 2007 | B2 |
7274836 | Chien et al. | Sep 2007 | B1 |
7324088 | Lin et al. | Jan 2008 | B2 |
7342570 | Lin et al. | Mar 2008 | B2 |
7355588 | Lin et al. | Apr 2008 | B2 |
7388997 | Lin et al. | Jun 2008 | B2 |
7796116 | Salsman | Sep 2010 | B2 |
7834848 | Ohta | Nov 2010 | B2 |
7864159 | Sweetser | Jan 2011 | B2 |
7969413 | Aonuma et al. | Jun 2011 | B2 |
8089455 | Wieder | Jan 2012 | B1 |
8441440 | Makita | May 2013 | B2 |
8658995 | Hotelling et al. | Feb 2014 | B2 |
8870655 | Ikeda | Oct 2014 | B2 |
9364755 | Ashida | Jun 2016 | B1 |
9694278 | Ashida | Jul 2017 | B2 |
20010002139 | Hiraoka | May 2001 | A1 |
20010010514 | Ishino | Aug 2001 | A1 |
20010036082 | Kanesaka | Nov 2001 | A1 |
20010050672 | Kobayashi | Dec 2001 | A1 |
20020036617 | Pryor | Mar 2002 | A1 |
20020136010 | Luk | Sep 2002 | A1 |
20030002033 | Boman | Jan 2003 | A1 |
20030038778 | Noguera | Feb 2003 | A1 |
20040066659 | Mezei et al. | Apr 2004 | A1 |
20040207597 | Marks | Oct 2004 | A1 |
20040222969 | Buchenrieder | Nov 2004 | A1 |
20050157515 | Chen et al. | Jul 2005 | A1 |
20050180159 | Wu et al. | Aug 2005 | A1 |
20050276064 | Wu et al. | Dec 2005 | A1 |
20060152488 | Salsman | Jul 2006 | A1 |
20060152489 | Sweetser | Jul 2006 | A1 |
20060264260 | Zalewski | Nov 2006 | A1 |
20060268565 | Chang | Nov 2006 | A1 |
20070049374 | Ikeda et al. | Mar 2007 | A1 |
20070050597 | Ikeda et al. | Mar 2007 | A1 |
20070052177 | Ikeda et al. | Mar 2007 | A1 |
20070060228 | Akasaka | Mar 2007 | A1 |
20070060391 | Ikeda et al. | Mar 2007 | A1 |
20070066394 | Ikeda et al. | Mar 2007 | A1 |
20070072680 | Ikeda et al. | Mar 2007 | A1 |
20070236452 | Venkatesh | Oct 2007 | A1 |
20080015017 | Ashida et al. | Jan 2008 | A1 |
20080030991 | Yeh | Feb 2008 | A1 |
20080039202 | Sawano et al. | Feb 2008 | A1 |
20080174550 | Laurila et al. | Jul 2008 | A1 |
20090005166 | Sato | Jan 2009 | A1 |
20090027335 | Ye | Jan 2009 | A1 |
Number | Date | Country |
---|---|---|
3-74434 | Jul 1991 | JP |
6-50758 | Feb 1994 | JP |
6-154422 | Jun 1994 | JP |
10-99542 | Apr 1998 | JP |
11-506857 | Jun 1999 | JP |
2005-040493 | Feb 2005 | JP |
2005-063230 | Mar 2005 | JP |
02-17054 | Feb 2002 | WO |
Entry |
---|
Allen, et al., “A General Method for Comparing the Expected Performance of Tracking and Motion Capture Systems,” {VRST} '05: Proceedings of the ACM symposium on Virtual reality software and technology, pp. 201-210 (Nov. 2005). |
Allen, et al., “Tracking: Beyond 15 Minutes of Thought,” SIGGRAPH 2001 Course 11 (Course Pack) from Computer Graphics (2001). |
Analog Devices “ADXL202E Low-Cost ±2 g Dual-Axis Accelerometer with Duty Cycle Output” (Data Sheet), Rev. A (2000). |
Analog Devices “ADXL50 Single Axis Accelerometer” (Data Sheet), http://www.analog.com/en/obsolete/adx150/products/product.html (Mar. 1996). |
Bachmann et al., “Inertial and Magnetic Posture Tracking for Inserting Humans into Networked Virtual Environments,” Virtual Reality Software and Technology archive, Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Baniff, Alberta, Canada, pp. 9-16 (2001). |
Bachmann, “Inertial and Magnetic Angle Tracking of Limb Segments for Inserting Humans into Synthetic Environments,” Dissertation, Naval Postgraduate School, Monterey, CA (Dec. 2000). |
Benbasat, “An Inertial Measurement Unit for User Interfaces,” Massachusetts Institute of Technology Dissertation, (Sep. 2000). |
Bhatnagar, “Position trackers for Head Mounted Display systems: A survey” (Technical Report), University of North Carolina at Chapel Hill (Mar. 1993). |
Bishop, “The Self-Tracker: A Smart Optical Sensor on Silicon,” Ph.D. Dissertation, Univ. of North Carolina at Chapel Hill (1984). |
Bishop, et al., “Grids Progress Meeting” (Slides), University of North Carolina at Chapel Hill, NC (1998). |
Bishop, et al., Self-Tracker: Tracking for Hybrid Environments without Infrastructure (1996). |
“Celestial Navigation—Wikipedia,” http://en.wikipedia.org/wiki/Celestial_navigation, 7 pages (last edit May 1, 2007). |
Cutts, “A Hybrid Image/Inertial System for Wide-Area Tracking” (Internal to UNC-CH Computer Science) (Jun. 1999). |
Duck Hunt, 4 pages (1985). |
Eißele, “Orientation as an additional User Interface in Mixed-Reality Environments,” 1. Workshop Erweiterte und Virtuelle Realität, pp. 79-90, GI-Fachgruppe AR/VR (2007). |
Ferrin, “Survey of Helmet Tracking Technologies,” Proc. SPIE vol. 1456, p. 86-94 (Apr. 1991). |
Foxlin, “Generalized architecture for simultaneous localization, auto-calibration, and map-building,” IEEE/RSJ Conf. on Intelligent Robots and Systems, Lausanne, Switzerland (Oct. 2002). |
Foxlin, “Inertial Head Tracker Sensor Fusion by a Complementary Separate-bias Kalman Filter,” Proceedings of the IEEE 1996 Virtual Reality Annual International Symposium, pp. 185-194, 267 (1996). |
Foxlin, “Motion Tracking Requirements and Technologies,” Chapter 7, from Handbook of Virtual Environment Technology, Stanney Kay, Ed. (2002). |
Foxlin, et al., “VIS-Tracker: A Wearable Vision-Inertial Self-Tracker,” IEEE Computer Society (2003). |
Fuchs, “Inertial Head-Tracking,” Massachusetts Institute of Technology, Sep. 1993. |
Goschy, “Midway Velocity Controller” (youtube video http://www.youtube.com/watch?v=wjLhSrSxFNw) (Sep. 8, 2007). |
Harada, et al., “Portable orientation estimation device based on accelerometers, magnetometers and gyroscope sensors for sensor network,” Proceedings of IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI2003, pp. 191-196 (Jul. 2003). |
Hogue, “MARVIN: A Mobile Automatic Realtime Visual and INertial tracking system,” Master's Thesis, York University (2003). |
Intersense, “InterSense InertialCube2 Manual for Serial Port Model” (2001). |
Meyer et al., “A Survey of Position Trackers,” Presence, vol. 1, Issue 2, pp. 173-200, MIT Press (1992). |
Mizell, “Using Gravity to Estimate Accelerometer Orientation,” IEEE Computer Society (2003). |
Pique, “Semantics of Interactive Rotations,” Interactive 3D Graphics, Proceedings of the 1986 workshop on Interactive 3D graphics, pp. 259-269 (Oct. 1986). |
Selectech, “Airmouse Remote Control System Model AM-1 User's Guide,” Colchester, VT (Sep. 24, 1991). |
Selectech, “AirMouse Remote Controls, AirMouse Remote Control Warranty” (1991). |
Selectech, “Changing Driver Versions on CDTV/AMIGA” (Oct. 17, 1991). |
Selectech, “Selectech AirMouse Remote Controls, Model # AM-R1,” photographs (1991). |
Selectech, Facsimile Transmission from Rossner to Monastiero, Airmouse Remote Controls, Colchester, VT (Mar. 25, 1992). |
Selectech, Selectech AirMouse Devices (image) (1991). |
Selectech, Software, “AirMouse for DOS and Windows IBM & Compatibles,” “AirMouse Remote Control B0100EN-C, Amiga Driver, CDTV Driver, Version: 1.00,” “AirMouse Remote Control B0100EM-C.1, Apple Macintosh Serial Driver Version: 1.00 (1.01B),” “AirMouse Remote Control B0100EL-B/3.05 DOS Driver Version: 3.0, Windows Driver Version 1.00,” AirMouse Remote Control MS-DOS Driver Version: 3.00/3.05, Windows 3.0 Driver Version: 1.00 (1991). |
Simon, et al. “The Yo Yo: A Handheld Combining Elastic and Isotonic Input,” http://www.uni-weimar.de/cms/fileadmin/medien/vr/documents/publications/TheYoYo-Interact2003-Talk.pdf (2003). |
Spencer, Mark, “A TV Remote Control Decoder,” http://www.arrl.org/news/features/2004/03/30/1/, 5 pages (Mar. 30, 2004). |
Titterton et al., “Strapdown Inertial Navigation Technology,” pp. 1-56 and pp. 292-321 (May 1997). |
Ward, et al., “A Demonstrated Optical Tracker With Scalable Work Area for Head-Mounted Display Systems,” Symposium on Interactive 3D Graphics, Proceedings of the 1992 Symposium on Interactive 3D Graphics, pp. 43-52, ACM Press, Cambridge, MA (1992). |
“Washington Area Model Accessibility Project (Washington Area MAP),” Talking Signs, Infrared Communications System, Baton Rouge, LA, 7 pages (copyright 2000, revised May 11, 2006). |
Welch et al., “Motion Tracking: No Silver Bullet, but a Respectable Arsenal,” IEEE Computer Graphics and Applications, vol. 22, No. 6, pp. 24-38 (Nov. 2002). |
Welch, et al., “The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments,” ACM SIGGRAPH, Addison-Wesley (1999). |
Welch, et al., “High-Performance Wide-Area Optical Tracking: The HiBall Tracking System,” Presence, vol. 10, No. 1, MIT Press (Feb. 2001). |
Wilson, “XWand: UI for Intelligent Environments,” http://research.microsoft.com/en-us/um/people/awilson/wand/default.htm (Apr. 2004). |
Wilson, et al., “Xwand: UI for Intelligent Spaces,” CHI 2003, Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 545-552 (Apr. 2003). |
Wormell, et al., “Advancements in 3D Interactive Devices for Virtual Environments,” ACM International Conference Proceeding Series; vol. 39 (2003). |
www.3rdtech.com (2000-2006). |
Office Action in corresponding U.S. Appl. No. 11/745,842 dated Jan. 22, 2016. |
Office Action in corresponding U.S. Appl. No. 11/745,842 dated Apr. 3, 2015. |
Office Action in corresponding U.S. Appl. No. 11/745,842 dated Feb. 22, 2013. |
Office Action in corresponding U.S. Appl. No. 11/745,842 dated Nov. 9, 2012. |
Office Action in corresponding U.S. Appl. No. 11/745,842 dated Nov. 28, 2011. |
Office Action in U.S. Appl. No. 11/745,842 dated Aug. 10, 2016. |
Office Action in U.S. Appl. No. 15/180,337 dated Oct. 27, 2016. |
Notice of Allowance in U.S. Appl. No. 15/180,337 dated Mar. 2, 2017. |
Number | Date | Country | |
---|---|---|---|
20170333784 A1 | Nov 2017 | US |
Number | Date | Country | |
---|---|---|---|
60746769 | May 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11745842 | May 2007 | US |
Child | 13230953 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15180337 | Jun 2016 | US |
Child | 15635463 | US | |
Parent | 13230953 | Sep 2011 | US |
Child | 15180337 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 29268255 | Nov 2006 | US |
Child | 11745842 | US | |
Parent | 29268254 | Nov 2006 | US |
Child | 29268255 | US |