The present invention relates generally to data processing systems. More particularly, this invention relates to a free-space multi-dimensional absolute pointer using a projection marker system.
Among the several handheld devices that exist for remotely controlling electronic equipment, the free-space multi-dimensional absolute pointer (as described in the above-incorporated applications) stands to bring ease of use by unifying control of nearly all devices under one simple operational paradigm: point-twist-zoom. Much as the mouse and the graphical user interface brought simplicity and user-friendliness to the PC (personal computer) platform in the early 1980's with the “point-and-click” paradigm, the world of the digital living room is now experiencing a rapid convergence of electronic equipment and technologies that is overwhelming the control capabilities of traditional interfaces, such as universal IR remote controls, mice, and keyboards.
This is becoming even more evident with several key consumer trends: 1) strong sales of large screen digital TVs, 2) strong demand for digital video recording functionality (e.g., TiVo) and advanced TV viewing, 3) pervasive entrenchment of the internet in all aspects of human life (e.g., information search, travel, purchase/sales, banking, etc.), 4) nearly complete conversion to digital cameras and camcorders in the USA, and 5) increased demand for gaming for recreational purposes (e.g., on-line games, casual games, multi-player games, etc.). When these trends collide in the digital living room, the user needs a simple device and user paradigm to be able to manage and navigate this flood of content.
Methods and apparatuses for free-space multi-dimensional absolute pointer using a projection marker system are described herein. In one embodiment, a presentation system includes, but is not limited to, a projection-based marker apparatus to project one or more optical spots on a display surface for displaying machine generated content capable of being manipulated via a cursor of a pointing device, a handheld device to wirelessly capture the projected optical spots from the display surface, and a control unit communicatively coupled to the projection-based marker apparatus and the handheld device to determine coordinates of the cursor based on characteristics of the captured light spots.
Other features of the present invention will be apparent from the accompanying drawings and from the detailed description which follows.
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
Methods and apparatuses for a free-space multi-dimensional absolute pointer using a projection marker system are described herein. In the following description, numerous details are set forth to provide a more thorough explanation of embodiments of the present invention. It will be apparent, however, to one skilled in the art, that embodiments of the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment.
According to certain embodiments of the invention, a free-space absolute pointer, henceforth referred to as the WavIt, provides such a tool by combining simple 3D pointing with a graphical user interface on a large screen monitor/TV. The WavIt is an absolute pointer: where the user points is where the cursor goes. It works on any type of screen (e.g., CRT, DLP, RPTV, LCD, plasma, etc.). The WavIt also tracks other degrees of freedom, such as the absolute angle of rotation of the user's wrist and the user's absolute distance from the screen. Some versions also track the user's location in the room. All this takes place in real time, and multiple users can use devices at the same time, which is of particular interest for multi-player gaming.
One embodiment of the invention expands on ways in which the WavIt absolute pointing system may be engineered to work with large front- (and/or rear-) projection screens. Specifically, the techniques described throughout this application focus on how a projection-POD (Photonic Origin Designator), or p-POD, may be developed to allow easy usability and setup, primarily for conference room settings.
The WavIt multi-dimensional pointer is a high precision tracking system in which the core miniature sensing unit resides inside a handheld remote control device. The device tracks multiple degrees of freedom in absolute space, meaning that not only does it sense where the user is pointing, but it also senses whether the user is twisting his wrist, leaning forward, or sitting to the side of the room. Functionally, it acts like a localized GPS (global positioning system) device that tracks where the user is with respect to the TV screen, a laser pointer that detects where the user is pointing, and a tilt sensor that measures how much the wrist is twisted. These functions, which typically require several distinct sensing and pointing technologies, are all achieved using the same underlying optical tracking that is core to the WavIt system.
A receiving device 106, hereafter referred to as the POD, receives the data from the Handset using any of a variety of wireless communication protocols, such as, for example, IR, Bluetooth, or IEEE 802.xx protocols. This device is coupled to a computer via a communication link such as a USB connection. The receiver channels the data from the Handset into a data processing system 105. The receiver also has the ability to “blast” IR signals to all other infrared-sensitive devices within a predetermined proximity, such as a room. A sub-section of the POD is dedicated to generating the optical beacons, which serve as the optical markers that are tracked by the Handset.
A host computer 105 (or set-top box) receives the data from the POD. This is handled by a driver, which communicates with the Handset via the USB-connected POD. Based on the data received, the driver calculates position and pointing coordinates, reads the button presses, and uses this information to control the PC and specific programs or environments.
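As a rough illustration of the host-side handling described above, the following sketch maps marker-image coordinates and button states reported through the POD to a cursor position. The packet layout, resolutions, and all names are assumptions added for illustration; this is not the actual WavIt driver.

```python
# Illustrative sketch of a host-side driver step (hypothetical packet format and
# names; not the actual WavIt driver).
import struct

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution
SENSOR_W, SENSOR_H = 352, 288     # assumed handset image-sensor resolution

def parse_report(raw: bytes):
    """Unpack one hypothetical report: two spot centroids plus a button bitmask."""
    x1, y1, x2, y2, buttons = struct.unpack("<HHHHB", raw)
    return (x1, y1), (x2, y2), buttons

def cursor_from_spots(spot_a, spot_b):
    """Map the midpoint of the two marker images to screen coordinates.
    The x axis is mirrored because rotating the handset to the right moves the
    (fixed) markers toward the left of the sensor image (sign conventions are
    an assumption and depend on sensor orientation)."""
    mx = (spot_a[0] + spot_b[0]) / 2.0
    my = (spot_a[1] + spot_b[1]) / 2.0
    cx = int((1.0 - mx / SENSOR_W) * SCREEN_W)
    cy = int((my / SENSOR_H) * SCREEN_H)
    return min(max(cx, 0), SCREEN_W - 1), min(max(cy, 0), SCREEN_H - 1)

spot_a, spot_b, buttons = parse_report(struct.pack("<HHHHB", 150, 140, 190, 140, 1))
print(cursor_from_spots(spot_a, spot_b), bin(buttons))
```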
All interaction happens via a display surface such as a TV screen. This is the screen on which the content, e.g., movies or internet pages, will be displayed. It is also where additional graphical overlays may appear as dictated by a specific user interface.
The WavIt multi-dimensional tracker is based on optical tracking of one or more spots, or marker images, on an optical sensor. In one embodiment, the optical sensor is incorporated into the handset. This arrangement is one of the key aspects of the WavIt system from which many of its highly desirable features are derived. By incorporating specific optical wavelength filtering in the sensor, according to one embodiment, the sensor can be made to see or detect only light of a specific wavelength range, such as, for example, approximately 900 to 1000 nm.
There are a number of ways to generate the optical beacons or markers. In one embodiment, one or more IR LEDs may be incorporated into a POD unit that is placed near the screen, with the IR LEDs emitting into the room, towards the handset. However, the invention is not so limited; different embodiments of a POD and/or beacons may incorporate one or more IR LEDs that emit into the room. For example, the beacons may be built into the RF receiver and/or USB chip enclosure. Alternatively, one unit may contain only the beacons, with RF reception handled by a separate USB dongle unit containing the RF receiver.
With IR LEDs placed in the POD, according to certain embodiments, these light sources may be seen or detected by the handset as distinct spots. A single spot, as seen by the handset's image sensor and microcontroller, is shown in the accompanying figure.
It will be appreciated that multiple spots may also be implemented, as shown in the accompanying figures.
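As a rough illustration of the kind of processing involved in detecting such spots, the sketch below thresholds an IR-filtered sensor frame and computes an intensity-weighted centroid for each bright region. The threshold, frame size, and approach are assumptions for illustration, not the actual handset firmware.

```python
# Minimal spot-detection sketch for an IR-filtered sensor frame (illustrative only).
import numpy as np
from scipy import ndimage

def find_spots(frame: np.ndarray, threshold: int = 128):
    """Return (x, y) intensity-weighted centroids of bright regions in an 8-bit frame."""
    mask = frame >= threshold
    labels, n = ndimage.label(mask)
    centers = ndimage.center_of_mass(frame, labels, range(1, n + 1))
    return [(col, row) for (row, col) in centers]  # convert (row, col) to (x, y)

# Synthetic 288x352 frame with two bright regions, as two projected markers might appear.
frame = np.zeros((288, 352), dtype=np.uint8)
frame[138:142, 148:152] = 255
frame[138:142, 188:192] = 255
print(find_spots(frame))  # approximately [(149.5, 139.5), (189.5, 139.5)]
```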
There are situations in which placing a physical POD near a screen is either not very feasible or undesirable. This may be the case, for example, in a conference room scenario. Here the user would need to find a way to mount the POD near a screen and would then have a long USB cable extending back to the user's PC in order to make his system operational. Without prior knowledge of the room and/or time to set up before a meeting, this may be a risky undertaking.
One option is to have a 3-piece system, as shown in the accompanying figure.
In one embodiment, which obviates the need for the beacons to be mounted or attached to a wall or screen, a virtual POD is created inside the screen.
In one embodiment, the two spots are generated by two collimated 980 nm IR lasers, pointing out of the projection-POD at a slightly diverging angle. It should be noted that it is also possible to project light from IR LEDs onto the screen, according to another embodiment, but that care must then be taken to refocus (re-image) the spots whenever the projection-POD is moved so that its distance from the screen changes appreciably. In order to have minimal system dependence (e.g., signal strength and spot size) on the POD location, it is useful to use collimated light sources, and for this reason, lasers are an ideal source. The IR spots could originate from individual sources or from a single source that is optically split into multiple beams.
The handset will now see two spots when pointing in the vicinity of the screen, much as if a POD had been mounted in the middle of the projection screen. The projection-POD has the benefit of not requiring cables extending from the screen to the PC. The setup procedure is also relatively simple: the user simply points the p-POD at the screen. In one embodiment, one or more guiding visible lasers are used to facilitate placement of the invisible IR spots onto the screen.
The functional components of a p-POD in which the light source and receiver electronics are integrated into a single unit are shown in the accompanying figure.
Note that some or all of these functions (wireless Rx/Tx, micro-control, and computer interface) may be integrated into one or two chips. For example, the Chipcon/TI CC2430 combines the wireless radio and micro-controller functions.
Referring to the accompanying figure, the angle between the two projected IR beams determines the separation of the spots on the screen and is therefore chosen based on the intended operating geometry.
For example, for typical operating distances of 2 to 5 meters from a screen with a 50 inch diagonal, a projected spot separation of approximately 15 to 25 cm is a relatively good compromise for a vision system with a 50 degree field of view. If the p-POD is placed approximately 2 to 3 meters from the screen, where a typical front projector would be located, then the optimal angle between the IR beams would be approximately 3 to 7 degrees. For other operating conditions and system characteristics, different optimal beam angles will result. In some configurations, this angle is fixed, and in other configurations, the angle is made to be adjustable.
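The quoted angle range follows from simple geometry: the full angle between the two beams needed to produce a spot separation s at a p-POD-to-screen distance d is θ = 2·arctan(s / 2d). The short check below reproduces the figures quoted above.

```python
# Geometry check of the beam-angle figures quoted above.
import math

def beam_angle_deg(spot_separation_m: float, pod_to_screen_m: float) -> float:
    return math.degrees(2 * math.atan(spot_separation_m / (2 * pod_to_screen_m)))

print(beam_angle_deg(0.15, 3.0))  # ~2.9 degrees (15 cm separation, p-POD 3 m from screen)
print(beam_angle_deg(0.25, 2.0))  # ~7.2 degrees (25 cm separation, p-POD 2 m from screen)
```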
In a particular embodiment, the visible guide laser beam bisects the two IR beams so that when it is pointed at the middle of the screen, it is known that the invisible IR spots are symmetrically located around the middle of the screen. During subsequent operation of the system, the visible guide laser may be turned off and used only when the p-POD alignment needs to be verified. The guide laser may be controlled by the micro-controller or by other means, such as an electromechanical switch. In one such embodiment, the two IR beams are generated from a single laser device using optical beamsplitters and mirrors, both standard optical components. In one embodiment, the beamsplitter divides the incident IR light from the IR laser into two equal-power components, transmitting one and reflecting the other. The reflected beam then reflects off of the mirror and exits the POD. The beamsplitter and mirror are adjusted to provide the desired beam angle. The advantage of this arrangement is that only one IR laser is used, thus saving cost, component count, and space. However, this laser must be more powerful than those used in the two-laser arrangement (approximately twice the power), and additional optical components are needed.
Note that the visible alignment laser may be included in either the single or two-laser embodiments, and that some or all of the receiver/transmitter components may be housed separately from the lasers. For example, in an alternative embodiment, the optical components of the p-POD are contained in an enclosure that resides near the projector and the receiver components are contained in a small enclosure that plugs into a computer input port (e.g., a USB dongle device). In this arrangement, the Handset and receiver communicate with each other, and the p-POD is used only to project the reference markers onto the screen. The p-POD would then have its own power source and switch. If it is desirable to communicate and control the lasers in the p-POD remotely, then a micro-controller and wireless chip could be included with the p-POD. This arrangement might be desirable in situations where the projector and computer are located far from each other.
In one embodiment, the laser beam vectors are slightly diverging (as shown in the accompanying figure).
Other configurations for the lasers and IR spots are possible. For example, according to one embodiment, the lasers may be mounted such that the divergence between the beams is in the vertical plane instead of the horizontal plane, thus producing spots that are arranged vertically on the screen, or any combination of horizontal and vertical displacement between the beams (e.g., spots arranged diagonally on the screen) is also possible. Other geometries include ones in which the beams from the two lasers cross and then diverge before reaching the screen or simply converge from the p-POD and do not cross before hitting the screen.
The infrared spots that are projected onto a normal screen will tend to scatter in the same way that visible light does. This means that for normal screens there will be near-Lambertian scatter of the incident light from the p-POD, so the spots will be visible from very large angles to the screen. In addition, many projection screens (rear projection in particular) are designed to scatter the incident light asymmetrically in order to increase the viewing angle, typically in the horizontal plane. Such screens will also work well with the p-POD system since, in general, similar non-uniform scattering will increase the operational region (both in angle and distance from the screen) for the WavIt system.
Another benefit of projected spots is that they are relatively easily tailored without the physical constraints that govern the design of a regular emissive POD. For example, according to certain embodiments, the projected spots can be made bigger without impacting the size of the p-POD. Shaped spots, such as lines and crosses, can be projected into various corners of the screen. Larger and shaped spots may be more easily resolved by the detection system in the Handset and thus provide additional information. Multiple spots could also be arranged in a circle. The benefit of this is that if the spots are arranged into a large circle, then the aspect ratio of the circle, which can be determined more accurately for a larger arrangement, could help the WavIt Handset determine its location in the room. For example, a tall oval shape would indicate that the user is positioned to the left or right of the screen and not directly in front, while a flat oval would indicate that the user is positioned above or below the screen.
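As a rough illustration of how such an aspect ratio could be used (a simplified geometric sketch, not the actual WavIt algorithm): a circle viewed off-axis is foreshortened along one direction by the cosine of the viewing angle, so the ratio of the short to the long apparent axis gives an estimate of that angle.

```python
# Simplified estimate of off-axis viewing angle from the apparent aspect ratio of
# a circular marker arrangement (illustrative sketch only).
import math

def view_angle_deg(short_axis: float, long_axis: float) -> float:
    return math.degrees(math.acos(min(1.0, short_axis / long_axis)))

# A "tall oval" whose horizontal extent is 70% of its vertical extent would suggest
# the user is roughly 45 degrees off to the side of the screen.
print(view_angle_deg(0.7, 1.0))  # ~45.6 degrees
```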
Multiple arrangements of spots (and/or their powers) can be used to break the symmetry and allow the system to discern whether the user is to the right or left of the screen. For example, a four-marker arrangement in which the separation between the markers on the screen is a non-negligible fraction of the user's distance from the screen can permit the distinction between left and right view angles. This can be done using one or more of at least two properties of the markers: their apparent separation (the spot separation on the Handset sensor) and their detected powers. In essence, the pair of markers that is closer to the Handset will have both a larger separation and stronger signal strengths. The side of the screen on which the user is located will determine which pair of spot images is stronger and/or farther apart. The procedures for determining various degrees of freedom using multiple marker configurations are described in detail in the pending applications previously referenced. Returning to the two-spot geometry, it is also now more feasible to place the spots much farther apart (perhaps even the length of the screen). In addition to permitting better view-angle resolution as described above (by using the difference in the received power between the pairs of spots), this has the benefit of improving the resolution of the measurement of distance from the WavIt to the screen, since the Handset uses the spot separation to gauge its distance from the screen. It also improves the resolution of the roll angle (the angle of rotation of the WavIt about the axis perpendicular to the sensor surface).
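For the two-spot geometry, the relationship between the spot separation on the sensor, the distance, and the roll angle can be sketched with a simple pinhole-camera model. The marker separation and focal length below are assumed values for illustration and are not taken from the actual handset calibration.

```python
# Pinhole-camera sketch of distance and roll estimation from two marker images
# (assumed parameter values; illustrative only).
import math

MARKER_SEPARATION_M = 0.20   # assumed physical separation of the projected spots
FOCAL_LENGTH_PX = 400.0      # assumed lens focal length expressed in pixel units

def distance_and_roll(spot_a, spot_b):
    """spot_a, spot_b: (x, y) image coordinates of the two marker images."""
    dx, dy = spot_b[0] - spot_a[0], spot_b[1] - spot_a[1]
    separation_px = math.hypot(dx, dy)
    distance_m = MARKER_SEPARATION_M * FOCAL_LENGTH_PX / separation_px
    roll_deg = math.degrees(math.atan2(dy, dx))  # 0 when the spot images are level
    return distance_m, roll_deg

print(distance_and_roll((150, 140), (190, 140)))  # ~(2.0 m, 0.0 deg)
```

A wider physical separation between the spots yields a larger image separation for a given distance, which is why spacing the spots farther apart improves the distance and roll resolution as noted above.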
In one embodiment, the WavIt handset can be modified to have a very narrowband laser-line band-pass 980 nm filter. This allows only light in very close spectral proximity to 980 nm to pass through to the image sensor. A continuous-wave (CW) 980 nm laser usually has a bandwidth of much less than 1 nm, in comparison to an LED, whose spectrum normally spans >10 nm. This means that the system can be made much more optically robust to spurious light sources, such as room lights, essentially increasing the inherent signal-to-noise ratio of the system.
In one embodiment, the p-POD contains two 35 mW 980 nm diode laser sources that each draw approximately 70 mA at an operating voltage of 3.2 V. This is well within the power limit of the USB port of a PC, which can supply 500 mA at 5 V.
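The power-budget claim is easy to verify with a back-of-the-envelope check; the allowance for the receiver electronics below is an assumption added for illustration.

```python
# Rough USB power-budget check for the two-laser p-POD described above.
LASER_CURRENT_A = 0.070       # per 35 mW, 980 nm laser (from the text)
N_LASERS = 2
RECEIVER_ALLOWANCE_A = 0.050  # assumed margin for radio, micro-controller, etc.

total_a = N_LASERS * LASER_CURRENT_A + RECEIVER_ALLOWANCE_A
print(f"Estimated draw ~{total_a * 1000:.0f} mA of the 500 mA available from a USB port")
```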
While it is noted that an alternative arrangement is possible in which the laser is placed in the Handset and/or the camera is fixed on the table or onto the projector, this “reverse” configuration suffers from a few weaknesses: 1) it does not lend itself very well to non-interfering, robust multi-user operation because it would be difficult to distinguish which spots belong to which user without a much more computationally intensive and expensive image analysis system; 2) it involves placing lasers in a handset, where free hand motion is more likely to direct potentially dangerous laser beams into unsuspecting eyes; 3) the lasers consume more power (˜150 mA at 3.3 V, or ˜500 mW) and hence would require much more frequent recharging when compared to the <20 mA (or ˜70 mW) for a WavIt optical tracking handset; 4) the lasers are significantly more expensive than an optical sensor; and 5) the space requirement for two lasers is generally greater than that for a single image sensor and thus would add to the size of the Handset.
According to one embodiment of the invention, the p-POD WavIt system is compatible with the regular WavIt POD system in that no modifications need be made to the WavIt Handset. This also means that all the regular benefits of the WavIt tracking system apply. For example, it allows for non-interfering, robust multi-user operation, and the handset remains relatively low cost, low power, and robust with a very small form factor.
Note that, although we have discussed primarily the use of a p-POD configuration for front projectors, it is clear to those with ordinary skill in the art that the same principle applies equally to a rear-projection TV system, in which the lasers, or other projected IR light sources, are mounted inside the TV and the spots are projected onto the back-side of the TV screen along with the normal projected picture.
In traditional RP systems, the screen is designed to scatter the light isotropically. More recently, systems are designed so that the front surface of the viewing screen scatters the incident light asymmetrically in the horizontal and vertical directions. This is done in order to make more efficient use of the available light since typical viewers will not be located at large vertical angles with respect to the screen. Advanced technologies such as Fresnel lenses and lenticular arrays are used to produce asymmetrically scattering screens.
There are several different arrangements that could be employed for the incorporation of the p-POD-based absolute pointing device into an RP system. One such arrangement is shown in the accompanying figure.
The vision system in the Handset held by the viewer will see two spots located near the center of the screen and displaced horizontally with respect to each other, as shown in the accompanying figure.
In addition to standard RPTV systems, laser-based projection TVs are another, recently-developed, type of display system in which the p-POD may be integrated. The main difference between standard RP and laser-based RP displays is the type of light source. Instead of filtered lamp light or, in some cases, visible LEDs, laser-based displays use lasers as the source of illumination. The main advantages of laser-based displays are higher efficiency, smaller size and weight, longer lifetime, and superior color gamut compared with conventional projection systems. Although still in their infancy, laser-based TVs and displays are anticipated to become more prevalent over the next several years due to the continuing improvements in the quality and cost of the component solid state lasers used as the sources.
Laser-based projection displays (both front and rear) are potentially ideally suited for incorporation of a laser-based p-POD as the source of the reference markers for use with a WavIt pointing device. In typical laser-based displays, at least one of the three component visible light sources (red, green, and blue) is derived from an IR laser source. For example, typically, the blue light is obtained by frequency doubling of a near-IR laser, although in some cases, the green and/or the red are also derived from an IR source via second-harmonic generation. This is done because of the difficulty and inefficiency in generating shorter wavelength (e.g. blue) laser light directly. A typical wavelength range for the blue component is 430 nm-490 nm, which places the fundamental wavelength in the 860 nm-980 nm range. This light is typically not used and must be blocked or filtered out of the projection system. However, this range is nearly ideal as a near-IR source of marker light. Furthermore, because of the relatively low conversion efficiency in the frequency-doubling process (<50%), there is available a high power source of residual ˜920 nm laser light that is otherwise wasted. The only additional components necessary to use the IR light may be collimating, beam steering, and beam splitting optics to separate the IR from the visible light, split it into the desired number of beams (one for each marker), shape the beams as needed, and direct them to the screen.
Note that, although the use of available IR light is ideal in many ways, it may not be practical in some systems. In such cases, an external IR source may be added to the system, as shown in the accompanying figure.
The preferred embodiments described thus far involve the use of one or more IR lasers as the source of the projected marker light in the p-POD. Lasers are generally preferred primarily because of their superior optical properties. In particular, their inherent brightness (or radiance) is typically many orders of magnitude larger than for incoherent light sources such as LEDs and lamps. This fact results in the ability to more efficiently collect the emitted light from the source and project to a target (e.g., a wall or screen). In some cases, however, it may be necessary or desirable to use more conventional sources such as IREDs, either for cost or safety reasons. Because of the inherently larger radiance associated with lasers, it is not practical for an IRED-based p-POD to produce the same signal level in a WavIt Handset as a laser-based p-POD with the same optical power. However, it is conceivable, depending on the details of the system, to design a p-POD based on IREDs or other incoherent light sources. The relevant system details include the image detection method in the Handset, the required optical power and beam size, and the distance between the p-POD and the screen.
The radiance of an optical source cannot be increased during transmission through an optical system, and because the inherent radiance of LEDs is much smaller than that of lasers, it is generally difficult to achieve the same amount of detectable optical power scattered off of the screen using LEDs versus using lasers with the same source power. There are a few approaches, however, that may permit the use of LED-based p-PODs in some situations. Note that one of the relevant parameters for determining the signal level is the amount of light from the source that is scattered in the direction of the Handset and not necessarily the radiance of the source. Therefore, even though the radiance of LEDs is much smaller than that of lasers, it is still conceivable that they can be used in some situations. In most cases, what matters more than the radiance (e.g., W/cm2-sr) is the irradiance (e.g., W/cm2) of the projected light at the target. This is because the incident light is typically scattered uniformly at the screen, regardless of the source. Therefore, depending on the required spot size, LEDs may provide sufficient signal strength. The required spot size (or range of spot sizes) will depend on several factors including the detection method, the sensor resolution, and the operating distance.
In general, because of the radiance limitations, it is impractical to maintain sufficiently small spot sizes when the p-POD is located far from the screen (typically more than ˜1 meter) without sacrificing optical power. In such cases, the irradiance at the screen may be increased in one of two ways. Increasing the size of the projection optics (e.g., refractive and/or reflective elements) will decrease the target spot size, thereby increasing the irradiance. However, in many cases, the required optic size would be impractically large. It is also possible to attempt to collimate or focus the light from the LED (or other incoherent light source) to obtain a smaller spot. However, in order to achieve a sufficiently small spot containing a significant fraction of the optical power from the LED, the required optical system would also be impractically large (in length and height). The other option is to simply increase the power projected to the target area by adding additional sources and directing their projected beams to the desired location. Of course, this approach also requires a larger effective size for the p-POD and the addition of more sources which results in higher power requirements and additional cost.
A specific example of an LED-based p-POD uses, for each marker, three separate LEDs, each of which has a narrow divergence angle (<10° is typically achievable for a 5 mm diameter device). The LEDs may be oriented so that their beams have maximum overlap at the target location, which may be 1-3 meters away depending on the details of the display system (e.g., front vs rear projection). Additional optics may be added to the system to further reduce the spot size (subject to the inherent radiance limits of the source). In general, the further the p-POD is from the screen, the larger the optics must be to maintain the same irradiance. Using a set of three standard high-power (e.g., 35 mW) IREDs for each marker would result in ˜100 mW of total power contained in a spot size of ˜10 cm for a screen ˜1 meter away. The corresponding irradiance of ˜1.3 mW/cm2 is to be compared with ˜7.5 mW/cm2 for a 35 mW laser spot of ˜2.5 cm diameter. The larger the acceptable spot size for the marker, the more feasible the LED-based p-POD approach becomes since more of the available power is used. The maximum acceptable spot size will depend on factors such as the detection method and the acceptable separation between the two projected markers. In some detection methods, the signal on the image sensor is saturated such that the detected image size is much larger than the actual image size. In such cases, broadening the marker size on the screen can be done without significantly affecting the quality of the detected signal. In fact, in some cases, the signal may be improved (larger and/or more stable) by increasing the spot size whether using a laser or LED. The upper limit on spot size is ultimately determined by the maximum operating distance of the Handset since the images on the sensor approach each other as the user moves farther away from the screen. If the spots are too large then they will become too close to each other on the sensor to resolve for sufficiently large operating distances.
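The irradiance figures quoted above follow from spreading the total optical power over a circular spot (a uniform-disc approximation), as the short check below illustrates.

```python
# Uniform-disc approximation reproducing the irradiance comparison above.
import math

def irradiance_mw_per_cm2(power_mw: float, spot_diameter_cm: float) -> float:
    area_cm2 = math.pi * (spot_diameter_cm / 2.0) ** 2
    return power_mw / area_cm2

print(irradiance_mw_per_cm2(100, 10.0))  # ~1.3 mW/cm^2: three-IRED marker, 10 cm spot
print(irradiance_mw_per_cm2(35, 2.5))    # ~7.1 mW/cm^2: 35 mW laser, 2.5 cm spot
```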
Note that while the accompanying figure illustrates various components of a computer system, it is not intended to represent any particular architecture or manner of interconnecting the components, as such details are not germane to the present invention.
As shown in the figure, the computer system, which is a form of a data processing system, includes a bus coupled to one or more microprocessors, ROM, volatile RAM, and non-volatile memory.
Typically, the input/output devices 810 are coupled to the system through input/output controllers 809. The volatile RAM 805 is typically implemented as dynamic RAM (DRAM) which requires power continuously in order to refresh or maintain the data in the memory. The non-volatile memory 806 is typically a magnetic hard drive, a magnetic optical drive, an optical drive, or a DVD RAM or other type of memory system which maintains data even after power is removed from the system. Typically, the non-volatile memory will also be a random access memory, although this is not required.
Methods and apparatuses for free-space multi-dimensional absolute pointer using a projection marker system have been described herein. Some portions of the preceding detailed descriptions have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the ways used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Embodiments of the present invention also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable ROMs (EPROMs), electrically erasable programmable ROMs (EEPROMs), magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method operations. The required structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of embodiments of the invention as described herein.
A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
In the foregoing specification, embodiments of the invention have been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the invention as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
This application claims the benefit of U.S. Provisional Application No. 60/831,735, filed Jul. 17, 2006, which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4229009 | Ohta | Oct 1980 | A |
4285523 | Lemelson | Aug 1981 | A |
4395045 | Baer | Jul 1983 | A |
4813682 | Okada | Mar 1989 | A |
4955812 | Hill | Sep 1990 | A |
4959721 | Micic et al. | Sep 1990 | A |
4964837 | Collier | Oct 1990 | A |
5009501 | Fenner et al. | Apr 1991 | A |
5023943 | Heberle | Jun 1991 | A |
5027132 | Manns | Jun 1991 | A |
5045843 | Per | Sep 1991 | A |
5115230 | Smoot | May 1992 | A |
5130693 | Gigandet | Jul 1992 | A |
5170002 | Suzuki et al. | Dec 1992 | A |
5181015 | Marshall et al. | Jan 1993 | A |
5181181 | Glynn | Jan 1993 | A |
5187540 | Morrison | Feb 1993 | A |
5227985 | DeMenthon | Jul 1993 | A |
5237617 | Miller | Aug 1993 | A |
5297061 | Dementhon | Mar 1994 | A |
5309137 | Kajiwara | May 1994 | A |
5317140 | Dunthorn | May 1994 | A |
5319387 | Yoshikawa | Jun 1994 | A |
5353042 | Klapman | Oct 1994 | A |
5371802 | McDonald et al. | Dec 1994 | A |
5388059 | DeMenthon | Feb 1995 | A |
5394326 | Liu | Feb 1995 | A |
5396281 | Maeda | Mar 1995 | A |
5424556 | Symosek et al. | Jun 1995 | A |
5457478 | Frank | Oct 1995 | A |
5502459 | Marshall et al. | Mar 1996 | A |
5502568 | Ogawa et al. | Mar 1996 | A |
5504501 | Hauck et al. | Apr 1996 | A |
5506605 | Paley | Apr 1996 | A |
5510811 | Tobey et al. | Apr 1996 | A |
5510893 | Suzuki | Apr 1996 | A |
5515079 | Hauck | May 1996 | A |
5557690 | O'Gorman | Sep 1996 | A |
5574479 | Odell | Nov 1996 | A |
5581372 | Kerz | Dec 1996 | A |
5594807 | Liu | Jan 1997 | A |
5608528 | Ogawa | Mar 1997 | A |
5627565 | Morishita et al. | May 1997 | A |
5694495 | Hara et al. | Dec 1997 | A |
5707237 | Takemoto et al. | Jan 1998 | A |
5724106 | Autry et al. | Mar 1998 | A |
5733201 | Caldwell et al. | Mar 1998 | A |
5736947 | Imanaka | Apr 1998 | A |
5736974 | Selker | Apr 1998 | A |
5736975 | Lunetta | Apr 1998 | A |
5754094 | Frushour | May 1998 | A |
5793353 | Wu | Aug 1998 | A |
5793361 | Kahn et al. | Aug 1998 | A |
5796387 | Curran et al. | Aug 1998 | A |
5805165 | Thorne, III et al. | Sep 1998 | A |
5832139 | Batterman et al. | Nov 1998 | A |
5860648 | Petermeier et al. | Jan 1999 | A |
5883569 | Kolefas | Mar 1999 | A |
5900863 | Numazaki | May 1999 | A |
5917472 | Perälä | Jun 1999 | A |
5926168 | Fan | Jul 1999 | A |
5929444 | Leichner | Jul 1999 | A |
5953077 | Honey et al. | Sep 1999 | A |
5963145 | Escobosa | Oct 1999 | A |
5973618 | Ellis | Oct 1999 | A |
5973757 | Aubuchon et al. | Oct 1999 | A |
5983161 | Lemelson et al. | Nov 1999 | A |
5986644 | Herder et al. | Nov 1999 | A |
5999166 | Rangan | Dec 1999 | A |
6008800 | Pryor | Dec 1999 | A |
6008899 | Trebino et al. | Dec 1999 | A |
6037936 | Ellenby et al. | Mar 2000 | A |
6045446 | Ohshima | Apr 2000 | A |
6061055 | Marks | May 2000 | A |
6061644 | Leis | May 2000 | A |
6067079 | Shieh | May 2000 | A |
6091378 | Richardson et al. | Jul 2000 | A |
6094189 | Quillen et al. | Jul 2000 | A |
6097374 | Howard | Aug 2000 | A |
6110039 | Oh | Aug 2000 | A |
6115128 | Vann | Sep 2000 | A |
6125190 | Wen | Sep 2000 | A |
6146278 | Kobayashi | Nov 2000 | A |
6184863 | Sibert et al. | Feb 2001 | B1 |
6188388 | Arita et al. | Feb 2001 | B1 |
6244956 | Nakayama et al. | Jun 2001 | B1 |
6251011 | Yamazaki | Jun 2001 | B1 |
6252720 | Haseltine | Jun 2001 | B1 |
6259431 | Futatsugi et al. | Jul 2001 | B1 |
6281878 | Montellese | Aug 2001 | B1 |
6287198 | McCauley | Sep 2001 | B1 |
6295051 | Kanevsky et al. | Sep 2001 | B1 |
6317784 | Mackintosh et al. | Nov 2001 | B1 |
6324255 | Kondo | Nov 2001 | B1 |
6324296 | McSheery et al. | Nov 2001 | B1 |
6331848 | Stove et al. | Dec 2001 | B1 |
6373961 | Richardson et al. | Apr 2002 | B1 |
6377242 | Sweed | Apr 2002 | B1 |
6404416 | Kahn et al. | Jun 2002 | B1 |
6429856 | Omura | Aug 2002 | B1 |
6445409 | Ito | Sep 2002 | B1 |
6456892 | Dara-Abrams | Sep 2002 | B1 |
6489945 | Gordon | Dec 2002 | B1 |
6515651 | Berstis | Feb 2003 | B1 |
6540607 | Mokris | Apr 2003 | B2 |
6545661 | Goschy | Apr 2003 | B1 |
6559935 | Tew | May 2003 | B1 |
6600475 | Gutta et al. | Jul 2003 | B2 |
6603880 | Sakamoto | Aug 2003 | B2 |
6618076 | Sukthankar et al. | Sep 2003 | B1 |
6641269 | Kitazawa | Nov 2003 | B2 |
6650822 | Zhou | Nov 2003 | B1 |
6660475 | Jack et al. | Dec 2003 | B2 |
6664948 | Crane | Dec 2003 | B2 |
6677987 | Girod | Jan 2004 | B1 |
6683628 | Nakagawa et al. | Jan 2004 | B1 |
6724368 | Strubbe | Apr 2004 | B2 |
6727885 | Ishino et al. | Apr 2004 | B1 |
6727887 | Levine et al. | Apr 2004 | B1 |
6753849 | Curran | Jun 2004 | B1 |
6757446 | Li | Jun 2004 | B1 |
6765555 | Wu | Jul 2004 | B2 |
6765608 | Himeda | Jul 2004 | B1 |
6766036 | Pryor | Jul 2004 | B1 |
6766066 | Kitazawa | Jul 2004 | B2 |
6791531 | Johnston | Sep 2004 | B1 |
6795068 | Marks | Sep 2004 | B1 |
6811489 | Shimizu | Nov 2004 | B1 |
6844871 | Hinckley et al. | Jan 2005 | B1 |
6847348 | Rojewski | Jan 2005 | B2 |
6881147 | Naghi et al. | Apr 2005 | B2 |
6890262 | Oishi | May 2005 | B2 |
6900791 | Tanaka et al. | May 2005 | B2 |
6924809 | Chao et al. | Aug 2005 | B2 |
6926610 | Kobayashi | Aug 2005 | B2 |
6955598 | Hagiwara | Oct 2005 | B2 |
6956503 | Yokokohji | Oct 2005 | B2 |
6973202 | Mostafavi | Dec 2005 | B2 |
6975301 | Fan | Dec 2005 | B2 |
6978037 | Fechner et al. | Dec 2005 | B1 |
6982697 | Wilson | Jan 2006 | B2 |
6990639 | Wilson | Jan 2006 | B2 |
7039218 | Lin | May 2006 | B2 |
7050043 | Huang et al. | May 2006 | B2 |
7053798 | Popineau | May 2006 | B2 |
7053932 | Lin et al. | May 2006 | B2 |
7061468 | Tiphane | Jun 2006 | B2 |
7064742 | Navab | Jun 2006 | B2 |
7069516 | Rekimoto | Jun 2006 | B2 |
7071908 | Guttag | Jul 2006 | B2 |
7091949 | Hansen et al. | Aug 2006 | B2 |
7102616 | Sleator | Sep 2006 | B1 |
7105795 | Cartlidge | Sep 2006 | B2 |
7130469 | Adachi | Oct 2006 | B2 |
7133031 | Wang | Nov 2006 | B2 |
7136053 | Hendriks | Nov 2006 | B2 |
7139983 | Kelts | Nov 2006 | B2 |
7158118 | Liberty | Jan 2007 | B2 |
7158181 | Cartlidge | Jan 2007 | B2 |
7161596 | Hoile | Jan 2007 | B2 |
7170492 | Bell | Jan 2007 | B2 |
7176905 | Baharav | Feb 2007 | B2 |
7215322 | Genc | May 2007 | B2 |
7227526 | Hildreth | Jun 2007 | B2 |
7232986 | Worthington et al. | Jun 2007 | B2 |
7262760 | Liberty | Aug 2007 | B2 |
7268774 | Pittel | Sep 2007 | B2 |
7292151 | Ferguson et al. | Nov 2007 | B2 |
7414596 | Satoh | Aug 2008 | B2 |
7414611 | Liberty | Aug 2008 | B2 |
7417719 | Michelsson | Aug 2008 | B2 |
7430303 | Sefcik | Sep 2008 | B2 |
7440610 | Wirtz | Oct 2008 | B1 |
7457439 | Madsen | Nov 2008 | B1 |
7542072 | DeMenthon | Jun 2009 | B2 |
7773076 | Pittel | Aug 2010 | B2 |
7787992 | Pretlove | Aug 2010 | B2 |
7796116 | Salsman et al. | Sep 2010 | B2 |
7800585 | Gordon | Sep 2010 | B2 |
7852317 | Grunnet-Jepsen | Dec 2010 | B2 |
7859523 | Kong | Dec 2010 | B2 |
7864159 | Sweetser et al. | Jan 2011 | B2 |
7869618 | Thelen | Jan 2011 | B2 |
7893924 | Lieberman | Feb 2011 | B2 |
7912286 | Ozaki et al. | Mar 2011 | B2 |
7927216 | Ikeda et al. | Apr 2011 | B2 |
7940986 | Mekenkamp | May 2011 | B2 |
8095200 | Quaid, III | Jan 2012 | B2 |
20010010514 | Ishino | Aug 2001 | A1 |
20010030668 | Erten | Oct 2001 | A1 |
20010045940 | Hansen | Nov 2001 | A1 |
20020011987 | Kitazawa | Jan 2002 | A1 |
20020056136 | Wistendahl | May 2002 | A1 |
20020078446 | Dakss et al. | Jun 2002 | A1 |
20020085097 | Colmenarez | Jul 2002 | A1 |
20020098887 | Himoto | Jul 2002 | A1 |
20020103617 | Uchiyama | Aug 2002 | A1 |
20020107069 | Ishino | Aug 2002 | A1 |
20020135565 | Gordon et al. | Sep 2002 | A1 |
20030017872 | Oishi | Jan 2003 | A1 |
20030048280 | Russell | Mar 2003 | A1 |
20030144056 | Leifer | Jul 2003 | A1 |
20030189545 | Trajkovic et al. | Oct 2003 | A1 |
20030199324 | Wang | Oct 2003 | A1 |
20030216176 | Shimizu et al. | Nov 2003 | A1 |
20040004276 | Hsu et al. | Jan 2004 | A1 |
20040016939 | Akiba et al. | Jan 2004 | A1 |
20040017357 | Kinoshita et al. | Jan 2004 | A1 |
20040017473 | Marks | Jan 2004 | A1 |
20040046736 | Pryor | Mar 2004 | A1 |
20040046744 | Rafii et al. | Mar 2004 | A1 |
20040063481 | Wang | Apr 2004 | A1 |
20040095317 | Zhang et al. | May 2004 | A1 |
20040151218 | Branzoi | Aug 2004 | A1 |
20040160420 | Baharav | Aug 2004 | A1 |
20040169639 | Pate et al. | Sep 2004 | A1 |
20040174340 | Bruneau | Sep 2004 | A1 |
20040174569 | Karito | Sep 2004 | A1 |
20040196451 | Aoyama | Oct 2004 | A1 |
20040207597 | Marks | Oct 2004 | A1 |
20040222969 | Buchenrieder | Nov 2004 | A1 |
20040239670 | Marks | Dec 2004 | A1 |
20040246229 | Yamada | Dec 2004 | A1 |
20040266528 | Wang | Dec 2004 | A1 |
20040268393 | Hunleth et al. | Dec 2004 | A1 |
20050026689 | Marks | Feb 2005 | A1 |
20050028191 | Sullivan et al. | Feb 2005 | A1 |
20050052415 | Braun | Mar 2005 | A1 |
20050059488 | Larsen | Mar 2005 | A1 |
20050073525 | Chao et al. | Apr 2005 | A1 |
20050104632 | Lettvin | May 2005 | A1 |
20050104849 | Hoile | May 2005 | A1 |
20050137774 | Rupp | Jun 2005 | A1 |
20050168444 | Lin et al. | Aug 2005 | A1 |
20050200351 | Shimizu | Sep 2005 | A1 |
20050225536 | Hsieh et al. | Oct 2005 | A1 |
20050237303 | Yang | Oct 2005 | A1 |
20050244034 | Miyahara | Nov 2005 | A1 |
20050272502 | Marks | Dec 2005 | A1 |
20060007142 | Wilson et al. | Jan 2006 | A1 |
20060023111 | DeMenthon | Feb 2006 | A1 |
20060028442 | Bynum et al. | Feb 2006 | A1 |
20060047472 | Krumm | Mar 2006 | A1 |
20060049930 | Zruya | Mar 2006 | A1 |
20060082546 | Wey | Apr 2006 | A1 |
20060094286 | Lee et al. | May 2006 | A1 |
20060105842 | Kim | May 2006 | A1 |
20060108507 | Huang | May 2006 | A1 |
20060125932 | Lu et al. | Jun 2006 | A1 |
20060148563 | Yang | Jul 2006 | A1 |
20060152487 | Grunnet-Jepsen et al. | Jul 2006 | A1 |
20060152489 | Sweetser et al. | Jul 2006 | A1 |
20060267935 | Corson | Nov 2006 | A1 |
20060284841 | Hong et al. | Dec 2006 | A1 |
20070002037 | Kuroki | Jan 2007 | A1 |
20070060228 | Akasaka | Mar 2007 | A1 |
20070066394 | Ikeda et al. | Mar 2007 | A1 |
20070089915 | Ogawa | Apr 2007 | A1 |
20070124694 | Van De Sluis et al. | May 2007 | A1 |
20070155502 | Wu | Jul 2007 | A1 |
20070176908 | Lipman | Aug 2007 | A1 |
20070188447 | Wu | Aug 2007 | A1 |
20070211027 | Ohta | Sep 2007 | A1 |
20070298882 | Marks | Dec 2007 | A1 |
20080094354 | Thelen | Apr 2008 | A1 |
20080100574 | Lou et al. | May 2008 | A1 |
20080174550 | Laurila | Jul 2008 | A1 |
20080188959 | Kneissler | Aug 2008 | A1 |
20080204404 | Kneissler et al. | Aug 2008 | A1 |
20090085869 | Destura et al. | Apr 2009 | A1 |
20090128815 | Draaijer et al. | May 2009 | A1 |
20090295595 | Thelen et al. | Dec 2009 | A1 |
20090300535 | Skourup | Dec 2009 | A1 |
20100073289 | Kneissler | Mar 2010 | A1 |
Number | Date | Country |
---|---|---|
1338961 | Mar 2002 | CN |
1559644 | Jan 2005 | CN |
3930581 | Mar 1991 | DE |
19701344 | Jul 1997 | DE |
19701374 | Jul 1997 | DE |
19648487 | Jun 1998 | DE |
19814254 | Oct 1998 | DE |
19937307 | Feb 2000 | DE |
10029173 | Jan 2002 | DE |
10241392 | May 2003 | DE |
10219198 | Nov 2003 | DE |
0 852 961 | Jul 1998 | EP |
0 993 845 | Apr 2000 | EP |
1 062 994 | Dec 2000 | EP |
1 062 994 | Dec 2000 | EP |
1 081 635 | Mar 2001 | EP |
1081635 | Mar 2001 | EP |
1279425 | Jan 2003 | EP |
1 293 237 | Mar 2003 | EP |
1 450 243 | Aug 2004 | EP |
1524334 | Sep 1978 | GB |
2 244546 | Dec 1991 | GB |
2 284478 | Jun 1995 | GB |
2 307133 | May 1997 | GB |
2 316482 | Feb 1998 | GB |
2319374 | May 1998 | GB |
2381686 | May 2003 | GB |
60-77231 | May 1985 | JP |
62-14527 | Jan 1987 | JP |
03-074434 | Jul 1991 | JP |
03-059619 | Sep 1991 | JP |
05-056191 | May 1993 | JP |
06-050758 | Feb 1994 | JP |
06-154422 | Jun 1994 | JP |
06-190144 | Jul 1994 | JP |
3000028 | Jul 1994 | JP |
06-273058 | Sep 1994 | JP |
06-308879 | Nov 1994 | JP |
07-028591 | Jan 1995 | JP |
07-044315 | Feb 1995 | JP |
07-107573 | Apr 1995 | JP |
07-115690 | May 1995 | JP |
07121293 | May 1995 | JP |
07-146123 | Jun 1995 | JP |
07-200142 | Aug 1995 | JP |
H7-230354 | Aug 1995 | JP |
07-262797 | Oct 1995 | JP |
07-302148 | Nov 1995 | JP |
07-318332 | Dec 1995 | JP |
08-071252 | Mar 1996 | JP |
08-095704 | Apr 1996 | JP |
08-106352 | Apr 1996 | JP |
08-111144 | Apr 1996 | JP |
08-114415 | May 1996 | JP |
08-122070 | May 1996 | JP |
08-152959 | Jun 1996 | JP |
08-211993 | Aug 1996 | JP |
08-292998 | Nov 1996 | JP |
08-305355 | Nov 1996 | JP |
08-335136 | Dec 1996 | JP |
09-166417 | Jun 1997 | JP |
09-230997 | Sep 1997 | JP |
09-265346 | Oct 1997 | JP |
09-274534 | Oct 1997 | JP |
09-319510 | Dec 1997 | JP |
10-021000 | Jan 1998 | JP |
10-033831 | Feb 1998 | JP |
10-099542 | Apr 1998 | JP |
H10-154038 | Jun 1998 | JP |
10-228349 | Aug 1998 | JP |
H10-228349 | Aug 1998 | JP |
10-254614 | Sep 1998 | JP |
11-053994 | Feb 1999 | JP |
11-099284 | Apr 1999 | JP |
11-114223 | Apr 1999 | JP |
11-307243 | May 1999 | JP |
2901476 | Jun 1999 | JP |
11-305935 | Nov 1999 | JP |
2000-063230 | Feb 2000 | JP |
2000-270237 | Sep 2000 | JP |
2000-308756 | Nov 2000 | JP |
2001-038052 | Feb 2001 | JP |
2001-104643 | Apr 2001 | JP |
3078268 | Apr 2001 | JP |
2001-175412 | Jun 2001 | JP |
2001-236181 | Aug 2001 | JP |
2002-224444 | Aug 2001 | JP |
3194841 | Aug 2001 | JP |
2001-306245 | Nov 2001 | JP |
3228845 | Nov 2001 | JP |
2001-356875 | Dec 2001 | JP |
2002-062981 | Feb 2002 | JP |
2002-082751 | Mar 2002 | JP |
2002-091642 | Mar 2002 | JP |
3262677 | Mar 2002 | JP |
3273531 | Apr 2002 | JP |
2002-153673 | May 2002 | JP |
2002-202843 | Jul 2002 | JP |
2002-232549 | Aug 2002 | JP |
2002-233665 | Aug 2002 | JP |
2004-139206 | Oct 2002 | JP |
2002032770 | Nov 2002 | JP |
2003-053038 | Feb 2003 | JP |
2003044220 | Feb 2003 | JP |
2003-083715 | Mar 2003 | JP |
2003-140823 | May 2003 | JP |
3422383 | Jun 2003 | JP |
2003-208260 | Jul 2003 | JP |
2005-040493 | Jul 2003 | JP |
2003-279799 | Oct 2003 | JP |
2003-325974 | Nov 2003 | JP |
2004-062774 | Feb 2004 | JP |
2004-070502 | Mar 2004 | JP |
3517482 | Apr 2004 | JP |
2004139206 | May 2004 | JP |
2004-191906 | Jul 2004 | JP |
2004-310074 | Nov 2004 | JP |
2004-313492 | Nov 2004 | JP |
2004-348459 | Dec 2004 | JP |
2005-025170 | Jan 2005 | JP |
2005-040493 | Feb 2005 | JP |
2005-063230 | Mar 2005 | JP |
2006-113019 | Apr 2006 | JP |
2006-136694 | Jun 2006 | JP |
2007-083024 | Apr 2007 | JP |
2007-283134 | Nov 2007 | JP |
9300171 | Aug 1994 | NL |
2 125 853 | Feb 1994 | RU |
2 126 161 | Feb 1999 | RU |
2 141 738 | Nov 1999 | RU |
WO 9402931 | Feb 1994 | WO |
WO-9519031 | Jul 1995 | WO |
WO 9519031 | Jul 1995 | WO |
WO 9605766 | Feb 1996 | WO |
WO 9709101 | Mar 1997 | WO |
WO 9712337 | Apr 1997 | WO |
WO 9717598 | May 1997 | WO |
WO 9728864 | Aug 1997 | WO |
WO 9732641 | Sep 1997 | WO |
WO 97041502 | Nov 1997 | WO |
WO 9811528 | Mar 1998 | WO |
WO 9958214 | Nov 1999 | WO |
WO 0033168 | Jun 2000 | WO |
WO 0035345 | Jun 2000 | WO |
WO-0060534 | Oct 2000 | WO |
WO 0187426 | Nov 2001 | WO |
WO 0191042 | Nov 2001 | WO |
WO 0217054 | Feb 2002 | WO |
WO 03015005 | Feb 2003 | WO |
WO 03056505 | Jul 2003 | WO |
WO 03088147 | Oct 2003 | WO |
WO 03107260 | Dec 2003 | WO |
WO 2004012130 | Feb 2004 | WO |
WO-2004012130 | Feb 2004 | WO |
WO 2004039055 | May 2004 | WO |
WO-2004038328 | May 2004 | WO |
WO 2004051391 | Jun 2004 | WO |
WO 2005013115 | Feb 2005 | WO |
WO-2005013115 | Feb 2005 | WO |
WO 2005040493 | May 2005 | WO |
WO 2005073838 | Aug 2005 | WO |
WO 2006076557 | Jul 2006 | WO |
WO 2006076557 | Jul 2006 | WO |
WO 2007063449 | Jun 2007 | WO |
WO 2007105133 | Sep 2007 | WO |
WO 2007105132 | Sep 2007 | WO |
WO 2008065579 | Jun 2008 | WO |
WO 2008065601 | Jun 2008 | WO |
Entry |
---|
International Search Report mailed Jan. 30, 2008, for International Patent Application No. PCT/US07/15955, filed Jul. 13, 2007, 4 pages. |
Final Office Action from U.S. Appl. No. 11/777,078, mailed Jun. 10, 2010, 13 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,387, mailed Aug. 23, 2010, 9 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,435, mailed Sep. 2, 2010, 11 pages. |
Office Action from U.S. Appl. No. 11/187,387, mailed Feb. 4, 2010, pp. 17. |
Office Action for counterpart European Patent Application No. 06718289.9, mailed Jul. Sep. 7, 2009, 5 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,435, mailed May 3, 2010, 13 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,405, mailed Apr. 29, 2010, 18 pages. |
Second Office Action from foreign counterpart China Patent Application No. 200680006415.3 mailed Oct. 8, 2010, pp. 6. |
Odell, D., FAM 18.5, “An Optical Pointer for Infrared Remote Controllers”, IEEE, 1995, pp. 324-325. |
Vellgdan, J., “Unique Interactive Projection Display Screen”, BNL-64780, U.S. Air Force P.R.A.M. office and by D.A.R.P.A, pp. 9. |
Aihara, T., et al., “Wireless Pointing Device”, IBM Technical Disclosure Bulletin, vol. 36, No. 06B, Jun. 1993, pp. 345-346. |
Office Action for counterpart European Patent Application No. 06718289.9, mailed Mar. 9, 2008, 6 pages. |
Danish Patent and Trademark Office, Novelty Search Dec. 20, 2006, SE2006 05704., 11 pages. |
Office Action from Foreign Counterpart China Patent Application No. 200680006415.3, mailed Jul. 22, 2009, 2 pgs. |
Moore, Stan , “Optimal Pixel Size”, http://www.stanmooreastro.com/pixel—size.htm, 2004, pp. 1-5. |
European Office Action, 06718289.9 mailed Jul. 19, 2011, 6 pages. |
Third Office Action of Chinese Patent Application 200680006415.3 , mailed Aug. 12, 2011, 5 pages. |
Office Action for counterpart Japanese Patent Application JP 2007-551393 mailed Sep. 15, 2011, 2 pages, Office Action and English summary. |
Notice of Allowance for U.S. Appl. No. 12/115,251 mailed Nov. 4, 2011, 5 pages. |
Office Action from U.S. Appl. No. 11/187,387, Mailed Feb. 4, 2010., 17 pages. |
PCT International Search Report and Written Opinion of the International Searching Authority, Mailed Oct. 30, 2006., 14 pages. |
PCT International Search Report and Written Opinion of the International Searching Authority, Mailed Jul. 17, 2007., 11 pages. |
Sukthankar, R., et al., “Smarter Presentations: Exploiting Homography in Camera-Projector Systems,” Proceedings of International Conference on Computer Vision, 2001., 7 pages. |
Madritsch, F., “CCD—Camera Based Optical Tracking for Human-Computer Interaction.” Proceedings 1st European Conference on Disability, Virtual Reality and Associated Technologies, Maidenhead, 1996, pp. 161-170. |
Olsen, Jr. D., et al., “Laser Pointer Interaction,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2001, Seattle, WA., 7 pages. |
Myers, B., et al., “Interacting at a Distance: Measuring the Performance of Laser Pointers and Other Devices,” vol. 4: Issue 1, Proceedings of CHI Apr. 2002, Minneapolis, Minnesota, pp. 33-40. |
Agilent Technologies, “Agilent ADNB-6031 and ADNB-6032 Low Power Laser Mouse Bundles: Datasheet,” http://www.agilent.com/semiconductors. May 27, 2005, pp. 1-45, Agilent Technologies. |
Agilent Technologies, “Agilent ADNK-2623 Optical Mouse Designer's Kit: Product Overview,” http://www.agilent.com/semiconductors. Jul. 3, 2003. Agilent Technologies., 4 pages. |
3D Connexion, “SpaceMouse Plus: Premium Motion Controller,” www.3Dconnexion.com, 2003., 2 pages. |
Gyration, Inc., “MicroGyro MG1101,” http://www.gyration.com, pp. 1-18, 2005, DEO1300-001, Data Sheet, MicroGyro MG1101, Rev A. |
3rdTech, Inc., HiBallTM—3000 Wide-Area Tracker and 3D Digitizer: 6 DOF Position and Orientation with Unparalleled Precision and Performance, 3rdTech, Inc., Chapel Hill, 2 pages. |
SmartHome, “Philips ProntoPro NG Fully Programmable Remote Control #TSU7000,” Printed Sep. 22, 2006, www.smartphone.com/8196.html. pp. 4. |
Owens, R., “Optical Mouse Technology,” pp. 15, Printed on Sep. 22, 2006, www.mstarmetro.net/˜rlowens/OpticalMouse/. |
Gyration, “Gyration Home,” pp. 2, printed on Sep. 22, 2006, www.gyration.com/en-US. |
Gyration, “Gyro Technology,” pp. 2, printed on Sep. 22, 2006, www.gyration.com/en-US/GyroTech.html. |
Intersense, Inc., “The New Standard in Motion Tracking,” pp. 2, printed on Sep. 22, 2006, www.intersense.com. |
Prentke Romich Company, “HeadMouse Extreme,” pp. 1, printed on Sep. 22, 2006, http://store.prentrom.com/cgi-bin/store/HE-X.html. |
SMARTNAV, “Assistive Technology—NaturalPoint head tracking for people with special needs,” pp. 6, printed on Sep. 22, 2006, http://rjcooper.com/smartnav/index.html. |
3rdTech, Inc., “HiBallTM-3000 Wide-Area Tracker and 3D Digitizer”, 2002, 2 pages. |
3rdTech, Inc., HiBallTM—3100 Wide-Area, High-Precision Tracker and 3D Digitizer 2003, 3 pges. |
Air Mouse Go Plus with MotionSense™, Gyration, 2009, 2 pages. |
AirMouse™ Remote Control System Model AM-1 User's Guide, Selectech, Ltd., 1991, 29 pages. |
Baca, A., “Spatial Reconstruction of Marker Trajectories from High-Speed Video Image Sequences,” Technical Note, Med. Eng. Phys., vol. 19, No. 4, Jun. 1997, pp. 367-74. |
Billinghurst, M. & Hirokazu Kato, “Collaborative Augmented Reality”, Communications of the ACM, vol. 45, No. 7, pp. 64-70, Jul. 2002. |
Billinghurst, M. & Hirokazu Kato, “Real World Teleconferencing”, CHI EA'99, 194-194 May 15-20, 1999. |
Billinghurst, M. et al., “The MagicBook-Moving Seamlessly between Reality and Virtuality”, IEEE Computer Graphics and Applications, vol. 21, No. 3, pp. 6-8, May-Jun. 2001. |
Allen, et al., “Tracking: Beyond 15 Minutes of Thought”, SIGGRAPH 2001 Course 11, U.N.C. Chapel Hill, ACM, Inc., Aug. 12-17, 2001, pp. 1-117. |
Bishop, G. Ph.D.,“Self-Tracker: A Smart Optical Sensor on Silicon”, dissertation, U.N.C. Chapel Hill, 1984, 65 pages. |
Browse, R., et al., “Controlling Graphic Objects Naturally: Use Your Head,” Proceedings of the ISP Conference on the Engineering Reality of Virtual Reality, 1997, 6 pages. |
Cameron, A. et al. Proc. SPIE: Helmet- and Head-Mounted Displays and Symbology Design Reqs. II, vol. 2465, 281-95 Helmet Trackers—The Future 1995. |
Cantzler, H. et al., “A Novel Form of a Pointing Device,” Vision, Video, and Graphics, 2003, pp. 1-6. |
Re-examination U.S. Appl. No. 95/002,114, filed Aug. 31, 2012, Part 1, 303 pages. |
Re-examination U.S. Appl. No. 95/002,114, filed Aug. 31, 2012, Part 2, 176 pages. (Total 479 pages). |
Re-examination U.S. Appl. No. 95/002,116, filed Aug. 31, 2012, Part 1, 302 pages. |
Re-examination U.S. Appl. No. 95/002,116, filed Aug. 31, 2012, Part 2, 231 pages. (Total 533 pages). |
Re-examination U.S. Appl. No. 95/002,118, filed Aug. 31, 2012, Part 1, 300 pages. |
Re-examination U.S. Appl. No. 95/002,118, filed Aug. 31, 2012, Part 2, 242 pages. (Total 542 pages). |
Danette, A., et al., “Tracking: Beyond 15 Minutes of Thought,” SIGGRAPH 2001 Course 11, U.N.C. Chapel Hill, ACM, Inc., Aug. 12-17, 2001, 117 pages. |
Decision of Rejection Office Action from foreign counterpart China Patent Application 200680006415.3 mailed Feb. 17, 2012, 10 pages. |
Decision of Rejection Office Action from foreign counterpart Japanese Patent Application JP 2007-551393 mailed Apr. 3, 2012, 2 pages. |
DeMenthon, D. (Principal Investigator), National Science Foundation, “Award Abstract #961576: SBIR Phase I: High Performance Three Dimensional Mouse and Head Tracker,” 1996, 2 pages. |
Dementhon, U.S. Appl. No. 60/591,892, filed Jun. 2, 2009. |
Exam Report from foreign counterpart Europe Patent Application 06718289.0 mailed Dec. 10, 2010, 6 pages. |
Final Office Action from foreign counterpart Japanese Patent Application JP 2007-551393 mailed Oct. 16, 2012, 2 pages. |
Final Office Action from U.S. Appl. No. 11/187,405, mailed Dec. 17, 2009, 18 pages. |
Final Office action from U.S. Appl. No. 11/777,078, mailed Jun. 10, 2010, 11 pgs. |
Final Office action from U.S. Appl. No. 11/777,078, mailed Sep. 29, 2009, 9 pgs. |
Final Office Action from U.S. Appl. No. 11/187,387, mailed Dec. 15, 2008, 24 pages. |
Final Office Action from U.S. Appl. No. 11/187,405, mailed Dec. 30, 2008, 30 pages. |
Final Office Action from U.S. Appl. No. 11/187,435, mailed Dec. 10, 2009, 19 pages. |
Final Office Action from U.S. Appl. No. 11/187,435, mailed Dec. 30, 2008, 17 pages. |
Forstner, W., “On the Geometric Precision of Digital Correlation,” Proceedings of the ISPRS Symposium Mathematical Models, Accuracy Aspects and Quality Control, Finland 1982, International Archives of Photogrammetry, vol. 24, No. 3, 1982, pp. 176-189. |
Foxlin, E. & Leonid Naimark, “VIS-Tracker: A Wearable Vision-Inertial Self-Tracker”, IEEE VR2003, Mar. 22-26, 2003, 8 pages. |
Foxlin, E. et al., “FlightTracker: A Novel Optical/Inertial Tracker for Cockpit Enhanced Vision,” IEEE/ACM ISMAR 2004, Nov. 2-5, 2004, pp. 1-10. |
Gordon, G. et al., “The Use of Dense Stereo Range Data in Augmented Reality”, IEEE/ACM ISMAR 2002, 10 pages. |
Gottschalk, S. & John Hughes, “Autocalibration for Virtual Environments Tracking Hardware,” SIGGRAPH '93, U.N.C. Chapel Hill, pp. 65-72, 1993. |
Grunnet-Jepsen, A. et al. “Convolution-kernel-based optimal trade-off filters for optical pattern recognition,” Applied Optics, vol. 35, No. 20, Jul. 10, 1996, pp. 3874-3879. |
Havelock, D. “Geometric Precision in Noise-Free Digital Images,” IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 11, No. 10, pp. 1065-1075. Oct. 1989. |
Havelock, D., “Geometric Precision in Digital Images”, Int'l Archives of Photogrammetry & Remote Sensing, vol. XXV, part A3, pp. 381-392, 1984. |
Hogue, A., “MARVIN: A Mobile Automatic Realtime Visual and Inertial Tracking System,” Technical Report CSE-2003-13, Ph.D. Thesis, York University, Canada, May 15, 2003, 229 pages. |
Howard, B. et al., “Lightglove: Wrist-Worn Virtual Typing and Pointing,” Proc. 5th IEEE Int'l Symposium on Wearable Computers (ISWC '01), pp. 172-173, 2001; Lightglove, Inc., “Ubiquitous Computing Enabled by Optical Reflectance Controller,” 2004. |
International Search Report mailed Sep. 2, 2008, for related International Application No. PCT/US08/05820, filed May 6, 2008, 2 pages. |
Jobbagy, A., “Centre Estimation in Marker Based Motion Analysis,” Technical Report No. TUB-TR-93-EE04, Ser. Elec. Eng., Department of Measurement and Instrument Engineering, Technical University of Budapest, Budapest, Apr. 15, 1993, 95 pages. |
Kato, H. & Mark Billinghurst, “Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System”, Proc. 2nd IEEE/ACM Int'l Workshop on Augmented Reality (IWAR '99), 1999, 10 pages. |
Kato, H. et al., “Virtual Object Manipulation on a Table-Top AR Environment,” Proc. Int'l Symposium on Augmented Reality (ISAR 2000), pp. 111-119, 2000. |
Kato, H. et al., “The Effect of Spatial Cues in Augmented Reality Video Conferencing”, Proc. 9th Int'l Conf. on Human-Computer Interaction (HCI Int'l 2001), Aug. 5-10, 2001, 4 pages. |
Kim, D. et al., “An Optical Tracker for Augmented Reality and Wearable Computers”, Proc. 1997 Virtual Reality Annual Int'l Symposium (VRAIS '97), pp. 146-150, 1997. |
Nalwa, V., “A Guided Tour of Computer Vision,” Addison-Wesley Publishing Co., 1993, 372 pages. |
NaturalPoint, Inc., TrackIR Software Version 3.10 Manual, 2004, 35 pages. |
NaturalPoint, Inc., TrackIR Software Version 4.0.020 Manual, Jan. 9, 2005, 35 pages. |
NaturalPoint's TrackIR™, “Reviews and Interviews,” 9 pages, 2002, http://www.naturalpoint.com/trackir/02-products/product-reviews.html. |
NaturalPoint's TrackIR2™ “How It Works”, 3 pages, 2003, (located at internet archive's wayback machine) http://web.archive.org/web/20040610203639/http://www.naturalpoint.com /. |
NaturalPoint's TrackIR™ “3-Pro: Put Your Head In The Game”, 2 pages, 2003, (located at internet archive's wayback machine) http://web.archive.org/web/20040619225900/http://www.naturalpoint.com. |
Nitzan, D. et al. NSF Grant DAR78-27128 “Machine Intelligence Research Applied to Industrial Automation,” Tenth Report, Nov. 1980, 208 pages. |
Notice of Allowance for U.S. Appl. No. 11/777,073 mailed Aug. 15, 2013, 55 pages. |
Notice of Allowance for U.S. Appl. No. 11/777,078 mailed Jul. 22, 2013, 11 pages. |
Notice of Allowance for U.S. Appl. No. 12/115,251 mailed Aug. 16, 2013, 47 pages. |
Notice of Allowance from foreign counterpart Japanese Patent Application JP 2007551393 mailed Feb. 26, 2013, 2 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,405, mailed Apr. 29, 2010, 16 pgs. |
Notice of Allowance from U.S. Appl. No. 11/187,435, mailed May 3, 2010, 13 pgs. |
Notice of Allowance from U.S. Appl. No. 11/187,387, mailed Apr. 29, 2010, 14 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554, mailed Apr. 1, 2013, 24 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554, mailed Dec. 20, 2012, 12 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554, mailed Feb. 29, 2012, 18 pages. |
Notice of Allowance from U.S. Appl. No. 12/115,251, mailed Feb. 28, 2013, 13 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,387, mailed Aug. 23, 2010, 9 pages. |
Notice of Allowance from U.S. Appl. No. 11/187,435, mailed Sep. 2, 2010, 11 pages. |
Office Action for counterpart European Patent Application No. 06718289.9, mailed Nov. 9, 2007, 7 pages. |
Office Action for counterpart European Patent Application No. 06718289.9, mailed Dec. 10, 2010, 6 pages. |
Office Action from counterpart Japanese Patent Application JP 2007-551393 mailed Jan. 16, 2013, 11 pages. |
Office Action from foreign counterpart Europe Patent Application 06718289.0 mailed Jul. 19, 2011, 6 pages. |
Office Action from Re-examination U.S. Appl. No. 95/002,114, mailed Oct. 15, 2012, 15 pages. |
Office Action from Re-examination U.S. Appl. No. 95/002,116, mailed Oct. 22, 2012, 15 pages. |
Office Action from Re-examination U.S. Appl. No. 95/002,118, mailed Oct. 4, 2013, 126 pages. |
Office Action from U.S. Appl. No. 11/777,078, mailed Sep. 29, 2009, 9 pages. |
Office Action from U.S. Appl. No. 11/777,078 mailed Oct. 10, 2012, 11 pages. |
Office Action from U.S. Appl. No. 11/187,387, mailed May 13, 2009, 22 pages. |
Office Action from U.S. Appl. No. 11/187,387, mailed Aug. 19, 2008, 24 pages. |
Office Action from U.S. Appl. No. 11/187,405, mailed May 13, 2009, 18 pages. |
Office Action from U.S. Appl. No. 11/187,405, mailed Aug. 20, 2008, 33 pages. |
Office Action from U.S. Appl. No. 11/187,435, mailed May 11, 2009, 15 pages. |
Office Action from U.S. Appl. No. 11/187,435, mailed Aug. 20, 2008, 25 pages. |
Office Action from U.S. Appl. No. 11/777,078 mailed Nov. 22, 2011, 17 pages. |
Office Action from U.S. Appl. No. 12/983,554, mailed Feb. 23, 2011, 54 pages. |
O'Gorman, L. et al., “A Comparison of Fiducial Shapes for Machine Vision Registration”, IAPR Workshop on Machine Vision Applications (MVA '90), pp. 253-256, Nov. 28-30, 1990. |
PCT Application No. PCT/US2006/001198, International Search Report and Written Opinion of the International Searching Authority, mailed Jul. 17, 2007, 11 pages. |
PCT Application No. PCT/US2006/001198, International Search Report and Written Opinion of the International Searching Authority, mailed Oct. 30, 2006, 14 pages. |
Rose, E. et al., “Annotating Real-World Objects Using Augmented Reality,” Technical Report ECRC-94-41, 21 pages (also published in CG International '95 Proceedings, Leeds, UK, Jun. 1995, pp. 357-370). |
Savvides, M. et al., “Face Verification using Correlation Filters”, Proc. 3rd IEEE Workshop. On Automatic Identification Advanced Technologies, pp. 56-61, Mar. 2002. |
Selectech, Ltd.'s AirMouse™ Remote Control System (“AirMouse”). |
Smith, S., “The Scientist and Engineer's Guide to Digital Signal Processing,” California Technical Publishing, 1997, 643 pages (submitted in four parts: pp. 1-199, 200-350, 351-475, and 476-626). |
Smith, S., “The Scientist and Engineer's Guide to Digital Signal Processing,” California Technical Publishing 1st Edition, 1997, pp. 503-534. |
Solymar, L. et al. Oxford Series in Optical and Imaging Sciences, Clarendon Press “The Physics and Applications of Photorefractive Materials,” 1996, 3 pages. |
Stern, A. et al., “Shannon number and information capacity of three-dimensional integral imaging”, J. Opt. Soc. Am. A, vol. 21, No. 9, pp. 1602-1612, Sep. 2004. |
Summons from counterpart European Patent Application No. 06718289.9, mailed Dec. 4, 2012, 8 pages. |
Tew, A., “The Oxford Optical Pointer: a direction-sensing device with proportional electrical output,” Med. & Biol. Eng. & Comput., vol. 26, pp. 68-74, Jan. 1988. |
ThinkOptics, Inc. v. Nintendo of America et al., “Defendant Nyko Technologies, Inc. and Intec, Inc. Invalidity Contentions Exhibit A,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 78 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Invalidity Contentions Pursuant to P.R. 3-3 and 3-4 of Defendants Nintendo of America Inc. and Nintendo Co. Ltd. Exhibit C,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 521 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Imation Corp's 3-3 and 3-4 Disclosures,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 56 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Defendant Nyko Technologies, Inc. and Intec, Inc. Invalidity Contentions Exhibit D,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 3 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Defendant Nyko Technologies, Inc. and Intec, Inc. Invalidity Contentions Exhibit B,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 55 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Imation Corp's 3-3 and 3-4 Disclosures,” Exhibit A, Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 12, 2012, 116 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Invalidity Contentions Pursuant to P.R. 3-3 and 3-4 of Defendants Nintendo of America Inc. and Nintendo Co. Ltd. Exhibit A,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 583 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Invalidity Contentions Pursuant to P.R. 3-3 and 3-4 of Defendants Nintendo of America Inc. and Nintendo Co. Ltd. Exhibit B,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 367 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Defendant Nyko Technologies, Inc. and Intec, Inc. Invalidity Contentions Exhibit C,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 64 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Invalidity Contentions Pursuant to P.R. 3-3 and 3-4 of Defendants Nintendo of America Inc. and Nintendo Co. Ltd.,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 393 pages. |
ThinkOptics, Inc. v. Nintendo of America et al., “Defendant Nyko Technologies, Inc. and Intec, Inc. Invalidity Contentions,” Civ. Action No. 6:11-cv-00455-LED, U.S. Dist. Ct. of Tx., Eastern Dist., Filed Aug. 14, 2012, 9 pages. |
Tian, Q., et al., “Algorithms for Subpixel Registration”, Computer Vision, Graphics, and Image Processing, vol. 35, pp. 220-233, 1986. |
Tjan, B. et al., “Digital Sign System for Indoor Wayfinding for the Visually Impaired,” 2005 IEEE Computer Science Conf. on Computer Vision and Pattern Recognition (CVPRW'05), Jun. 20-26, 2005, 8 pages. |
Van de Velde, G., et al., “Inventory of Embedded Systems for Image Processing,” Version 1.3, HOBU-fund research project 030106, De Nayer Instituut, Oct. 8, 2004, 39 pages. |
Van de Velde, G., et al., “Sub-Pixel Edge Detection,” Version: 1.2, HOBU-fund research project 030106, De Nayer Instituut, Oct. 2004, 7 pages. |
Veligdan, J., “Unique Interactive Projection Display Screen,” BNL-64780, U.S. Air Force P.R.A.M. Office and D.A.R.P.A., 9 pages. |
Welch, G., et al., “Motion Tracking: No Silver Bullet, but a Respectable Arsenal,” IEEE computer Graphics and Applications, vol. 22, No. 6, Nov.-Dec. 2002, pp. 24-38. |
Welch, G., et al., “SCAAT: Incremental Tracking with Incomplete Information,” re-published as appended paper to SIGGRAPH 2001, Course 11, ACM, Inc., Aug. 12-17, 2001, 12 pages. |
Welch, G. et al. “The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments”, Proc. ACM Symposium on Virtual Reality Software & Technology Dec. 20-22, 1999, pp. 1-11. |
Welch, G. et al., “High-Performance Wide-Area Optical Tracking—The HiBall Tracking System”, SIGGRAPH 2001, Course 11, Feb. 2001, pp. 1-22. |
Wilson, A., et al., “Demonstration of the XWand Interface for Intelligent Spaces,” UIST '02 companion, 2002, pp. 37-38. |
Wilson, A. & Steven Shafer, “XWand: UI for Intelligent Spaces”, Proc. SIGCHI Conf. on Human factors in computing systems (CHI '03), pp. 545-552, Apr. 5-10, 2003. |
Wilson, A., “Wireless User Interface Devices for Connected Intelligent Environments,” Ubicomp Workshop on Multi-Device Interfaces for Ubiquitous Peripheral Interaction, 2003, 3 pages. |
Wilson, D. et al., “Gesture Recognition Using the XWand,” Tech. Report CMU-RI-TR-04-57, Robotics Institute, Carnegie Mellon University, Apr. 2004, 10 pages. |
Woods, E. et al., “MagicMouse: an Inexpensive 6-Degree-of-Freedom Mouse,” Proc. 1st Int'l Conf. on Computer Graphics and Interactive Techniques, pp. 285-286, 2003. |
Wormell, D. & Eric Foxlin, “Advancements in 3D Interactive Devices for Virtual Environments,” Proc. Workshop on Virtual Environments (EGVE '03), May 22-23, 2003, pp. 47-56. |
Zuech, N., “Smart Vision Sensors, Consultancy”, AIA Vision Online Sep. 3, 2003, 10 pages. |
Naimark, L., et al., “Encoded LED System for Optical Trackers,” Proceedings of the 4th IEEE/ACM Int'l Symposium on Mixed and Augmented Reality, Oct. 5-8, 2005, pp. 1-4. |
Vicci, L., “Quaternions and Rotations in 3-Space: the Algebra and Its Geometric Interpretation,” Technical Report TR01-014, UNC Chapel Hill, Department of Computer Science, pp. 1-11, republished as appended paper to SIGGRAPH 2001, Course 11, ACM, Inc., Aug. 12-17, 2001. |
Response to Office Action from Foreign Counterpart China Patent Application No. 200680006415.3, dated Dec. 2009, 27 pages. |
Response to Second Office Action from Foreign Counterpart China Patent Application No. 200680006415.3, dated Feb. 2011, 33 pages. |
Response to Third Office Action from Foreign Counterpart China Patent Application No. 200680006415.3, dated Oct. 2011, 21 pages. |
Response to Decision of Rejection Office Action from Foreign Counterpart China Patent Application No. 200680006415.3, mailed Jun. 2012, 28 pages. |
International Preliminary Report on Patentability and Written Opinion of the International Searching Authority mailed Jan. 30, 2008, for International Patent Application No. PCT/US07/15955, 9 pages. |
Response to Office Action for counterpart European Patent Application No. 06718289.9 mailed Nov. 9, 2007, dated May 9, 2008, 25 pages. |
Response to Office Action for counterpart European Patent Application No. 06718289.9 mailed Sep. 7, 2009, dated Mar. 17, 2010, 27 pages. |
Response to Examiners Report from Foreign counterpart European Patent Application No. 06718289.9, mailed Dec. 10, 2010, dated Jun. 8, 2011, 29 pages. |
Response to Office Action for counterpart European Patent Application No. 06718289.9 dated Jan. 5, 2012, 30 pages. |
Response to Summons to Attend Oral Proceedings for counterpart European Patent Application No. 06718289.9 mailed Dec. 4, 2012, filed Jan. 18, 2013, 25 pages. |
Petition for Appeal to Office Action from foreign counterpart Japanese Patent Application JP 2007-551393, filed Aug. 1, 2012, 32 pages. |
Office Action from foreign counterpart Japanese Patent Application JP 2007-551393 mailed Oct. 16, 2012, 2 pages. |
Amendment to Office Action from foreign counterpart Japanese Patent Application JP 2007-551393, filed Nov. 20, 2012, 17 pages. |
PCT International Search Report and Written Opinion of the International Searching Authority from PCT Patent Application No. PCT/US2006/001198, mailed Oct. 30, 2006, 14 pages. |
PCT International Search Report and Written Opinion of the International Searching Authority from PCT Patent Application No. PCT/US2006/001198 mailed Jul. 17, 2007, 11 pages. |
Final Office Action from U.S. Appl. No. 11/187,405, mailed Aug. 20, 2008, 42 pages. |
Final Office Action from U.S. Appl. No. 11/777,078, mailed Oct. 10, 2012, 11 pages. |
Robinette, W. et al., “The Visual Display Transformation for Virtual Reality,” TR94-031, Sep. 1994, pp. 1-30, re-published as appended paper to SIGGRAPH 2001, Course 11, ACM, Inc., Aug. 12-17, 2001. |
Yang, H., et al., “Illumination Insensitive Model-Based 3D Object Tracking and Texture Refinement,” Proceedings of the 3rd Int'l Symposium on 3D Data Processing, Visualization, & Transmission, 2006, 8 pages. |
Office Action from foreign counterpart Chinese Patent Application No. 200680006415.3 mailed Nov. 20, 2013, 5 pages. (no English Translation). |
Final Office Action from U.S. Appl. No. 11/777,078, mailed May 2, 2014, 11 pages. |
Office Action from Re-examination U.S. Appl. No. 95/002,118 mailed Nov. 9, 2012, 13 pages. |
Office Action from U.S. Appl. No. 11/187,387 mailed May 13, 2009, 22 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554 mailed Mar. 19, 2014, 15 pages. |
Final Office Action from U.S. Appl. No. 12/983,554 mailed Jul. 26, 2011, 16 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554 mailed Nov. 14, 2011, 13 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554 mailed Jun. 11, 2012, 11 pages. |
Notice of Allowance from U.S. Appl. No. 12/983,554 mailed Dec. 5, 2013, 26 pages. |
Random House Webster's College Dictionary, see p. 273 (Random House, 2d ed. 2000), 3 pages, ISBN 0-375-42560-8 or ISBN 0-375-42561-6 (Deluxe Edition). |
Hobbs, Philip C. D., “Building Electro-Optical Systems: Making It All Work,” Second Edition, Wiley, A John Wiley & Sons, Inc., Publication, 2009, copyright page and p. 29, ISBN 978-0-470-40229-0, 3 pages. |
Related Publications:
Number | Date | Country
20080012824 A1 | Jan 2008 | US

Provisional Applications:
Number | Date | Country
60/831,735 | Jul 2006 | US