Resistive force sensor with capacitive discrimination

Information

  • Patent Grant
  • Patent Number
    9,654,104
  • Date Filed
    Tuesday, July 17, 2007
  • Date Issued
    Tuesday, May 16, 2017
Abstract
A resistive force sensor with capacitive discrimination is disclosed. According to an example of the disclosure, a sensor is directed to detect resistance and capacitance in an alternating fashion, the resistance indicating a force being applied to an input area of a device, and the capacitance indicating a proximity of a body part to the input area of the device, and the detected resistance and capacitance are utilized to determine whether the body part has pressed the input area of the device.
Description
FIELD OF THE DISCLOSURE

The disclosure of the present application relates to input mechanisms, and more particularly, to sensing input through the use of force and proximity sensors.


BACKGROUND

Virtually every consumer product device on the market has some form of input mechanism that allows a user to interact with the device. One of the most common input mechanisms is the button, which when pressed by a user causes the device to change a state associated with the button. The button may take many forms, from a mechanical push button, such as a rubber knob commonly found on TV remote controls and calculators, to a virtual button, such as a graphical user interface input area displayed on a flat and/or rigid touch-sensitive surface commonly found on ATMs and some handheld computing devices.


Irrespective of the form, the button is usually associated with two states—“pressed” or “not pressed”. Pressing or selecting a button changes the “not pressed” state to “pressed”, causing the “pressed” state to be activated. Releasing the button changes the “pressed” state back to “not pressed”, causing the “pressed” state to be deactivated. In this sense, the button allows a user to define the state of input into the device.


For example, when a device is powered off and a user presses the power button, the button press activates the power button's “pressed” state, which triggers the device to power on. When the user releases the button, the button release deactivates the “pressed” state, usually to no effect. In a different example, when a user presses a horn on a car (which can be considered a large button), the horn press activates the horn's “pressed” state, triggering the car to sound the horn. When the user releases the horn, the horn release deactivates the “pressed” state, triggering the car to stop sounding the horn.


The mechanism behind the operation of many buttons is a force sensor. When a user presses a button, a force sensor detects the force being applied to the button from the user's finger, hand or other object. When the output of the sensor indicates that the force exceeds a threshold amount (e.g., a strong enough press of the user's finger to indicate the user is intending to press the button), the “pressed” state of the button is activated, triggering an action to be taken by the device due to the button being pressed.


Thus, in order for the button to work properly, it is important that the button's sensor output be interpreted correctly to indicate that the button has been pressed or released. An incorrect interpretation of the button's sensor output can result in a phantom button press or release, which can trigger an unintended action with potentially damaging consequences.


SUMMARY

In order to correctly interpret whether a user is pressing a button of a device, methods of the present disclosure can detect both the force applied to the button area as well as the proximity of a user's finger to the button area.


In this manner, proximity detection can be used to verify that a detected force is actually caused by an intended press of a button and not some other effect, such as temperature change or a stuck button, for example.


For instance, when a temperature change causes a force sensor to indicate a force being applied to a button, the combination of proximity detection with force detection can prevent the temperature change from being mistaken for a user's button press if the proximity sensor indicates that no finger is in the button area.


Similarly, when a stuck button causes a force sensor to indicate a force being applied to a button, the combination of proximity detection with force detection can prevent the stuck button from being mistaken for a user continuing to hold down the button if the proximity sensor indicates that the user's finger has left the button area.


In addition to resolving these signal conditioning issues, the present disclosure teaches that the same physical sensor can be utilized to switch back and forth between force detection and proximity detection, since the same sensor element can be directed to detect both resistance (to indicate applied force) and capacitance (to indicate proximity of a user's finger).


The use of a single sensor device to accomplish both force and proximity sensing can be advantageous from an implementation and a cost standpoint. From an implementation standpoint, it can be beneficial to have dual-sensing ability in one physical sensor because it ensures that the same input area can be detected for force and proximity. From a cost standpoint, it is less expensive to use one physical sensor for detecting both force and proximity, rather than two sensors whereby one is used for detecting only force and the other for detecting only proximity.


Further, the present disclosure teaches the ability of a device to programmatically change threshold amounts of the force and/or proximity output required in order to activate an input state of a button. For example, if the device can alter the level of force required to activate a button's “pressed” state, and/or the level of proximity of a finger to the button area to activate the button's “pressed” state, the effective size of the button area can be changed without changing the physical sensor associated with the button.


Such an ability could allow a user to resize a virtual button displayed on a device surface by merely adjusting the sensor threshold parameters via software control.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a graph of an example of idealized force sensor output and corresponding button input state.



FIG. 2 is a graph of an example of force sensor output with a drifting baseline and corresponding button input state.



FIG. 3 is a graph of an example of force sensor output subject to hysteresis and corresponding button input state.



FIG. 4 is a diagram of an example of switching sensor operation modes.



FIG. 5 is a flow chart of an example of an algorithm for activating a button input state.



FIG. 6 is a flow chart of an example of an algorithm utilizing proximity detection for deactivating a button input state.



FIG. 7 is a flow chart of an example of an algorithm utilizing force detection for deactivating a button input state.



FIG. 8 is a flow chart of an example of an algorithm that accounts for baseline drift and hysteresis.



FIG. 9 is a graph of an example of force sensor output and corresponding button input state that accounts for baseline drift and hysteresis.



FIG. 10 is a flow chart of an example of a button resizing and input state activation algorithm.



FIG. 11 is a diagram of an example of a housing.



FIG. 12 is a diagram of an example of a sensor configuration.



FIG. 13 is a diagram of another example of a sensor configuration.



FIGS. 14a and 14b are diagrams of examples of sensor contact configurations.



FIG. 15 is a diagram of an example of a device.





DETAILED DESCRIPTION

The present disclosure teaches the use of resistive force detection in combination with capacitive proximity detection in order to implement a button, for example. The resistive force detection and capacitive proximity detection may work through a rigid cover or housing, including glass, for example. The same physical sensor element may be used for both resistive force detection and capacitive proximity detection.


The resistive force sensor can be used to detect force applied by a user's finger to an input area of a device. To address situations in which the force sensor output changes due to unintended effects, such as, for example, temperature changes, a stuck button or even a user applying force to the device but not directly over the force sensor area, the capacitive proximity sensor can be used to detect the proximity of the user's finger to the input area in order to confirm the finger press.


Temperature change and sticking buttons relate to signal conditioning issues referred to as baseline drift and hysteresis, respectively. These issues make it difficult to properly interpret the sensor's output signal as clearly indicating either the “pressed” or “not pressed” state.


Baseline drift occurs when factors other than a user pressing a button, such as changes in temperature, cause the sensor to output a signal indicating that a user pressed the button. In this situation, the simple act of placing a cell phone or portable music player in the sun or near a hot appliance could cause the sensor's output to indicate that a button has been pressed.


Hysteresis occurs when a button “sticks”, or fails to return completely to its original position, after being pressed. In this situation, because the “stuck” button is still exerting a force on the sensor, the sensor output may incorrectly indicate that the user is continuing to press the button.


In an effort to better illustrate these issues, a basic description of the workings of a force sensor is warranted. In a basic sense, a force sensor usually works by detecting the resistance of a sensor element, and outputting a signal indicating the level of the detected resistance. A sensor element usually includes two contacts positioned closely together—but not touching—while at rest, as shown in FIGS. 14a and 14b for example. When a force is applied to the contacts, they are pushed closer together causing the contact resistance between them to be reduced. As a force being applied to the sensor element increases, the resistance between the contacts decreases.


Thus, when a force sensor detects a drop in resistance of the sensor element, the drop is interpreted as a force being applied to the sensor. The greater the drop in resistance, the greater the level of force interpreted as being applied to the sensor.


In order to detect a drop in resistance, a baseline resistance is usually established from which to measure any subsequent drop. The baseline resistance is the level of resistance detected in the sensor element when at rest—i.e., when no intended force is being applied to the sensor.


To illustrate these issues graphically, FIG. 1 shows an example of ideal force sensor output that is not affected by baseline drift or hysteresis as a user presses and releases a button.


Although a force sensor output indicates a level of resistance, force sensor output plot 100 plots the sensor output in terms of conductance over time for better presentation purposes. Conductance is the inverse of resistance (depicted as 1/R), and enables the resistance output to be plotted with an increasing, rather than a decreasing, slope in relation to an increasing force being applied to the sensor (and vice-versa).


In an ideal situation, plot 100 shows that the force sensor only provides an output above baseline 130 when the user is pressing the button beginning at point 140. When the user releases the user's finger from the button at point 160, the output returns to baseline 130. In such a situation, a simple threshold algorithm can be utilized to interpret the button press—when the output exceeds a threshold amount of resistance, the button is considered pressed; when the output falls below the threshold amount, the button is considered released.
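
For illustration only (the disclosure contains no source code), a sketch of this simple threshold algorithm in Python might look as follows; the threshold value and function names are assumptions, and the resistance output is inverted to conductance as in plot 100.

```python
# Minimal sketch of the simple threshold algorithm, assuming periodic
# samples of sensor resistance in ohms; the threshold is an assumed value.

ACTIVATION_THRESHOLD = 0.005  # conductance (1/R) level treated as a press


def interpret(resistance_samples):
    """Yield True while the button is considered pressed, False otherwise."""
    for resistance in resistance_samples:
        conductance = 1.0 / resistance  # plot 100 uses 1/R for presentation
        yield conductance > ACTIVATION_THRESHOLD
```

For example, a drop from 1 kΩ at rest to 100 Ω under a firm press raises the conductance from 0.001 to 0.01, crossing the assumed threshold and activating the "pressed" state.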


As shown in plot 100, the “pressed” state of the button is activated at point 150, which is when the force of the finger press exceeds the threshold amount of resistance depicted by activation threshold 120. When the output falls below activation threshold 120 at point 170, the “pressed” state of the button is deactivated, indicating that the button has been released by the user. In a real application, the output is never this clean.



FIG. 2 shows an example of force sensor output that is affected by baseline drift. In this example the user does not press the button, so the sensor output should be considered at baseline at every point. In plot 200, other factors such as temperature change cause the output to drift, leading to drifting baseline 130.


Once the output (and hence drifting baseline 130) drifts from starting baseline 220 and exceeds activation threshold 120 at point 150, the “pressed” state of the button is activated. As shown in plot 210, the output is interpreted as if a user is continuing to press the button after point 150.


Thus, the simple threshold algorithm is impractical to implement in a baseline drift situation.



FIG. 3 shows an example of a force sensor output that is affected by hysteresis. In plot 300, the sensor output is correctly interpreted as the button being pressed when it exceeds activation threshold 120 at point 150. However, when the user releases the button at point 160, the button becomes partially stuck and continues to exert a force on the sensor element, leading to a new baseline above activation threshold 120.


In this situation under the simple threshold algorithm, because the output did not fall back below activation threshold 120, the “pressed” state remains activated as illustrated in corresponding plot 310.


Thus, the simple threshold algorithm is also impractical to implement in a hysteresis situation.


Although some algorithms more complex than the simple threshold algorithm, such as a re-baseline algorithm and a derivative algorithm, may attempt to interpret force sensor output properly for switch-like operation in light of baseline drift and hysteresis, each possesses drawbacks that hinder its ability to appropriately compensate for these signal conditioning issues.


The re-baseline algorithm adjusts the baseline (or “re-baselines”) to match the current output level at a specified time interval. Unfortunately, this algorithm depends on picking the correct time interval at which to re-baseline. If the algorithm re-baselines too quickly, it will miss button pushes because it will re-baseline to the force applied by the user's finger. If it re-baselines too slowly, it will allow accidental button pushes because it will not catch the baseline drift in time. In some cases, there is no appropriate “happy medium” interval.
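
A hedged sketch of such a re-baseline algorithm is shown below; the interval and activation delta are assumed values, and the comment marks where the choice of interval causes the failure modes just described.

```python
# Sketch of a re-baseline algorithm over periodic conductance (1/R) samples.
# REBASELINE_INTERVAL and ACTIVATION_DELTA are illustrative assumptions.

REBASELINE_INTERVAL = 50   # samples between baseline updates
ACTIVATION_DELTA = 0.002   # rise above baseline interpreted as a press


def interpret(conductance_samples):
    """Yield True while the button is considered pressed, False otherwise."""
    baseline = None
    for i, value in enumerate(conductance_samples):
        if baseline is None or i % REBASELINE_INTERVAL == 0:
            # Too small an interval absorbs the user's finger force;
            # too large an interval fails to track baseline drift in time.
            baseline = value
        yield (value - baseline) > ACTIVATION_DELTA
```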


The derivative algorithm relies on the derivative of the sensor output. In other words, it looks not at the change in output at discrete intervals in time (as in the re-baseline algorithm), but rather at how quickly the output changes over a short period of time. It therefore requires the user to press quickly on the button in order for the force to be interpreted as a button press. If the user presses slowly by holding a finger over the button and gradually applying force, the button push could be missed altogether.
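
A comparable sketch of a derivative algorithm follows; the rate threshold is an assumed value, and the comment notes why a slow, gradual press can be missed.

```python
# Sketch of a derivative algorithm over periodic conductance (1/R) samples.
# RATE_THRESHOLD is an illustrative assumption.

RATE_THRESHOLD = 0.001  # minimum rise per sample treated as a press


def interpret(conductance_samples):
    """Yield True while the button is considered pressed, False otherwise."""
    previous = None
    for value in conductance_samples:
        if previous is None:
            yield False
        else:
            # A slow, gradual press never produces a large enough rate of
            # change, so the button push can be missed altogether.
            yield (value - previous) > RATE_THRESHOLD
        previous = value
```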


Accordingly, the use of resistive force detection in combination with capacitive proximity detection can overcome these signal conditioning issues when implementing a button, for example.



FIG. 4 shows an example of a controller that can switch the operation of a sensor between two distinct operation modes—a force detection mode for providing output responsive to a force applied by an object, and a proximity detection mode for providing output responsive to a proximity of the object.


In step 420, controller 400 can switch sensor 410 into force detection mode by directing sensor 410 to detect resistance between its sensor contacts. While in force detection mode in step 430, sensor 410 can output a signal indicating the level of detected resistance which may be interpreted by controller 400 as a level of force being applied to sensor 410. In step 440, controller 400 can switch sensor 410 into proximity detection mode by directing sensor 410 to detect capacitance of the sensor element instead of resistance. While in proximity detection mode in step 450, sensor 410 can output a signal indicating the level of detected capacitance which may be interpreted by controller 400 as a level of proximity of an object to sensor 410. As indicated by the bent arrows, switching between the two sensor operation modes may occur in an alternating fashion.


Controller 400 can switch back and forth between detection modes using, for example, a copper pattern shape as a force sensor element for part of the time and as a capacitive sensor element for part of the time. Controller 400 can be programmed or instructed to direct sensor 410 to alternate between resistive force detection and capacitive proximity detection every 25 milliseconds or less, for example, so that a time lag would not be evident to a user between pressing the button and the device identifying the press as a button press (i.e., activating the “pressed” state of the button).
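
As a rough sketch of this alternating operation, the loop below switches a single sensor between the two modes on the 25-millisecond cadence mentioned above; the sensor object and its measure_resistance and measure_capacitance methods are hypothetical stand-ins for the drive and detection circuitry.

```python
# Sketch of controller 400 alternating sensor 410 between force detection
# (resistance) and proximity detection (capacitance). The sensor interface
# is a hypothetical stand-in; only the 25 ms cadence comes from the text.

import time

SWITCH_PERIOD_S = 0.025  # alternate modes every 25 ms or less


def poll_once(sensor):
    """Run one force/proximity cycle and return (conductance, capacitance)."""
    resistance = sensor.measure_resistance()    # force detection mode
    time.sleep(SWITCH_PERIOD_S)
    capacitance = sensor.measure_capacitance()  # proximity detection mode
    time.sleep(SWITCH_PERIOD_S)
    return 1.0 / resistance, capacitance        # lower R implies more force
```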


In another method of the present disclosure, at specified intervals controller 400 may receive only resistive force detection output from one sensor and only capacitive proximity detection output from a different sensor situated in close proximity to the first sensor.



FIG. 5 depicts an example of an algorithm for activating a button input state. In this example, at step 500 a processor (such as controller 400) may recurringly receive force and proximity output from one or more sensors corresponding to an input area of a housing. At step 510 the processor can determine whether the proximity output exceeds a threshold amount of proximity, and at step 520, whether the force output exceeds a threshold amount of force. If the processor determines that the threshold amounts of force and proximity have been exceeded, at step 530 the processor may activate a “pressed” input state indicating a button press on the input area of the housing.
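
A minimal sketch of this activation logic is shown below; the threshold values are illustrative assumptions.

```python
# Sketch of the FIG. 5 logic: activate the "pressed" state only when both
# the proximity and force outputs exceed their thresholds. Values assumed.

FORCE_THRESHOLD = 0.004      # conductance rise treated as sufficient force
PROXIMITY_THRESHOLD = 3.0    # capacitance level treated as "finger present"


def should_activate(force_output, proximity_output):
    """Return True if the "pressed" input state should be activated."""
    return (proximity_output > PROXIMITY_THRESHOLD
            and force_output > FORCE_THRESHOLD)
```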



FIG. 6 depicts an example of an algorithm utilizing proximity detection for deactivating a button input state. In this example, a processor can determine whether the proximity output exceeds a threshold amount of proximity at step 600, and whether the force output exceeds a threshold amount of force at step 610, in order to activate the “pressed” input state at step 620. Once the state has been activated, it may remain activated until the proximity output falls below the threshold amount of proximity at step 630, at which time the processor can deactivate the “pressed” state at step 640, indicating user release of the button. This can be advantageous in situations in which a user removes a finger from the input area but the button continues to apply a force due to sticking, for example.



FIG. 7 depicts an example of an algorithm utilizing force detection for deactivating a button input state. In this example, a processor can determine whether the proximity output exceeds a threshold amount of proximity at step 700, and whether the force output exceeds a threshold amount of force at step 710, in order to activate the “pressed” input state at step 720. Once the state has been activated, it may remain activated until the force output falls below the threshold amount of force at step 730, at which time the processor can deactivate the “pressed” state at step 740. This can be advantageous in situations in which it is less likely that a button will stick, and more likely that a user would intend to release a button by lightening up on the force applied to the button without moving away from the button, for example.
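
The two release strategies of FIGS. 6 and 7 might be sketched together as follows; the threshold defaults and the release_on parameter are illustrative assumptions.

```python
# Sketch covering both release strategies: FIG. 6 deactivates the "pressed"
# state when proximity falls below its threshold, FIG. 7 when force does.
# Threshold defaults and the release_on parameter are illustrative.

def update_pressed(pressed, force_output, proximity_output,
                   release_on="proximity",
                   force_threshold=0.004, proximity_threshold=3.0):
    """Return the new "pressed" state given the latest sensor outputs."""
    if not pressed:
        return (proximity_output > proximity_threshold
                and force_output > force_threshold)
    if release_on == "proximity":
        return proximity_output > proximity_threshold   # FIG. 6: finger left
    return force_output > force_threshold               # FIG. 7: force eased
```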



FIG. 8, in combination with plot 900 of FIG. 9, depicts an example of an algorithm that accounts for baseline drift and hysteresis. In this example, a processor can continually or intermittently adjust force output baseline 920 to match detected force output levels when a user's body part, such as a finger, is not near the sensor area.


At step 800 the processor can determine if the proximity output exceeds a threshold amount of proximity, indicating proximity of a finger to the sensor area. When the threshold amount of proximity is exceeded at point 940, the processor can disable the adjusting baseline functionality by switching to static baseline 930 mode at step 810. At this point, the processor can continue to determine, without adjusting for baseline drift, whether the proximity and force output exceed the threshold amounts of proximity and force, respectively, at steps 820 and 830, in order to activate the “pressed” button state at step 840.


If the proximity output falls below the threshold amount of proximity (e.g., indicating the finger moved away) at step 820, which occurs prior to the “pressed” state being activated, the processor can simply switch back to adjusting baseline 920 mode at step 870. If the proximity output falls below the threshold amount of proximity at step 850 and points 160 and 950, which occur after the “pressed” state has been activated, the processor can deactivate the “pressed” state at step 860 and switch back to adjusting baseline 920 mode at step 870. As plot 910 illustrates, the button state is correctly activated and deactivated in light of the baseline drift and hysteresis factors.


Of course, in step 850 the force output could be utilized instead of the proximity output to determine whether to deactivate the switch, similar to step 730, or a combination of force and proximity output may be utilized, for example.
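
A hedged sketch of the overall FIG. 8 flow is given below; the smoothing factor, thresholds, and sample format are assumptions, and release follows the proximity output as in step 850 (the force-based or combined variants mentioned above would only change the release test).

```python
# Sketch of the FIG. 8 flow: the force baseline adjusts to follow drift
# while no finger is near, is held static once proximity exceeds its
# threshold, and resumes adjusting after the finger leaves. All numeric
# values and the (force, proximity) sample format are assumptions.

PROXIMITY_THRESHOLD = 3.0
FORCE_DELTA_THRESHOLD = 0.002   # rise above baseline required for a press
BASELINE_ALPHA = 0.05           # smoothing factor for the adjusting baseline


def interpret(samples):
    """samples: iterable of (force_output, proximity_output) pairs."""
    baseline = None
    pressed = False
    for force, proximity in samples:
        if baseline is None:
            baseline = force
        if proximity > PROXIMITY_THRESHOLD:
            # Static-baseline mode (step 810): stop tracking drift and
            # test for activation (steps 820-840).
            if not pressed and (force - baseline) > FORCE_DELTA_THRESHOLD:
                pressed = True
        else:
            # Finger gone: deactivate if needed (steps 850-860) and switch
            # back to adjusting-baseline mode (step 870).
            pressed = False
            baseline += BASELINE_ALPHA * (force - baseline)
        yield pressed
```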



FIG. 10 depicts an example of a button resizing and input state activation algorithm. In this example, as above, a processor can receive force and proximity sensor output at step 1020 to determine whether the threshold amounts have been exceeded at steps 1030 and 1040 for activating the “pressed” button state at step 1050. However, in this example, the processor may also receive at step 1000 a request to resize the button input area to be pressed by a user in order to activate the “pressed” button state.


This request could be generated by a user via a user interface associated with the device. Upon receiving the request, at step 1010 the processor may adjust the force and/or proximity thresholds accordingly in order to change the physical detection coverage for a virtual button displayed on an input area of the device.
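
A small sketch of this resizing idea follows; the class name, scaling convention, and threshold defaults are illustrative assumptions.

```python
# Sketch of the FIG. 10 idea: a resize request simply adjusts the force
# and/or proximity thresholds, changing the effective button area without
# changing the physical sensor. All numeric values are assumptions.

class VirtualButton:
    def __init__(self, force_threshold=0.004, proximity_threshold=3.0):
        self.force_threshold = force_threshold
        self.proximity_threshold = proximity_threshold

    def resize(self, scale):
        """scale < 1 enlarges the effective button; scale > 1 shrinks it."""
        self.force_threshold *= scale
        self.proximity_threshold *= scale

    def is_pressed(self, force_output, proximity_output):
        return (proximity_output > self.proximity_threshold
                and force_output > self.force_threshold)
```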



FIG. 11 depicts an example of a housing. In this example, the housing may comprise a device including touch screen display area 1100, cover 1110 fabricated from a rigid material such as glass, for example, and input area 1120 where a user may press in order to activate a “pressed” state of a virtual button. Examples of the housing may include portable music players, mobile communications devices and other handheld computing devices.



FIG. 12 depicts an example of a sensor configuration. In this example, a hybrid force/proximity sensor may include deformable material 1210, such as, for example, rubber that is doped with carbon to make the rubber slightly conductive (although, in an embodiment, somewhat less conductive than a piece of metal). Pattern 1230 on printed circuit board (“PCB”) 1220 may be disposed underneath rubber element 1210. FIGS. 14a and 14b depict exemplary pattern configurations.


If rubber 1210 is compressed onto PCB 1220 pattern 1230, then the contact resistance between the two halves of the pattern can be reduced. The change in resistance caused by this force may be measured by, for example, a processor. Adhesive 1240 may be included to allow doped rubber 1210 to actually push harder on PCB 1220 pattern 1230, with adhesive 1240 compressing slightly when the user pushes their finger directly on the input area 1120 of the cover 1110.


Cover 1110 may be adhered to frame 1200, which has a small hole. PCB 1220 may be stuck to the bottom of frame 1200 and have pattern 1230 on it.



FIG. 13 depicts an example of a sensor configuration without the hole in the frame. This example is similar to that of FIG. 12, except instead of having a hole drilled all the way through frame 1200, a small indentation may be carved out of frame 1200 in which the circuit 1300 (which may be flexible) and deformable material 1210 may be inserted. Conductive paint 1310 may be applied between cover 1110 and circuit 1300.



FIG. 15 depicts an example of a device. In this example, the device may include processor 1500, memory 1510, controller 400 and sensor 410.


Controller 400 may provide the necessary drive and detection circuitry to obtain force and proximity output from sensor 410. Controller 400 can process the received force and proximity output to determine whether input area 1120 was pressed or released by a user with the intent to activate or deactivate the “pressed” state of a button. In order to activate or deactivate the “pressed” state, controller 400 can send a signal indicating such activation or deactivation to processor 1500 (e.g., a central processor responsible for running the device), which may trigger the appropriate programming functionality to react to the indicated button press or release.


Memory 1510 may include, for example, one or more of the following types of storage media: magnetic disks; optical media; and semiconductor memory devices such as static and dynamic random access memory (RAM), Electrically Programmable Read-Only Memory (“EPROM”), Electrically Erasable Programmable Read-Only Memory (“EEPROM”), Programmable Gate Arrays and flash devices.


The processing functionality described herein may be performed by a processor located on the sensor board itself, controller 400 or the central processor responsible for running the device, for example.


Although the claimed subject matter has been fully described in connection with examples thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the present disclosure as defined by the appended claims.

Claims
  • 1. A method, comprising: switching operation of a sensor between a first operation mode and a second operation mode, wherein when operating in the first operation mode, the sensor provides a first output responsive to a force applied by an object, and when operating in the second operation mode, the sensor provides a second output responsive to a proximity of the object; using a combination of the first and second outputs and corresponding force and proximity activation thresholds to determine whether the object has pressed an input area of a device; and using the second output to determine whether to adjust a force baseline in accordance with the first output, wherein the force baseline is different from the activation thresholds and represents the level of force detected in the sensor when no force is being applied to the sensor.
  • 2. The method of claim 1, further comprising receiving the first output when the sensor is operating in the first operation mode, and receiving the second output when the sensor is operating in the second operation mode.
  • 3. The method of claim 2, further comprising activating an input state based on changes in the first and second received outputs.
  • 4. The method of claim 3, further comprising de-activating the input state based on further changes in the first and second received outputs.
  • 5. The method of claim 1, wherein the object is a finger.
  • 6. A method for sensing force with capacitive discrimination, comprising: receiving from one or more sensors one or more signals indicating levels of force and proximity detected at an input area; determining whether the detected level of force exceeds a threshold amount of force; determining whether the detected level of proximity exceeds a threshold amount of proximity; if the threshold amounts of both force and proximity are determined to have been exceeded, activating a state indicating receiving a user input at the input area; and determining whether to adjust a force baseline in accordance with the detected levels of force based on the detected level of proximity indicated in the one or more signals, wherein the force baseline is different from the threshold amounts of force and proximity and represents the level of force detected in the sensor when no force is being applied to the sensor.
  • 7. The method of claim 6, wherein the one or more signals indicating the detected levels of force are received from the same one or more sensors from which the one or more signals indicating the detected levels of proximity are received.
  • 8. The method of claim 6, wherein the one or more signals indicating the detected levels of force are received from different sensors from which the one or more signals indicating the detected levels of proximity are received.
  • 9. The method of claim 6, further comprising receiving from any of the one or more sensors a subsequent level of proximity detected at the input area of the housing; and deactivating the state if the subsequently detected level of proximity falls below the threshold amount of proximity.
  • 10. The method of claim 6, further comprising receiving from any one of the one or more sensors a subsequent level of force detected at the input area of the housing; and deactivating the state if the subsequently detected level of force falls below the threshold amount of force.
  • 11. The method of claim 6, further comprising if the threshold amount of proximity is determined to have been exceeded, disabling the functionality that adjusts the force baseline in accordance with the levels of force indicated in signals received from the one or more sensors.
  • 12. The method of claim 6, further comprising receiving from any of the one or more sensors a subsequent level of proximity detected at the input area; and if the subsequently detected level of proximity falls below the threshold amount of proximity, enabling the functionality that adjusts the force baseline in accordance with the levels of force indicated in signals received from the one or more sensors.
  • 13. The method of claim 6, wherein the activating of the state comprises sending a signal indicating that a virtual button displayed on the input area of the housing has been pressed.
  • 14. A device comprising: a sensor having at least a first mode and a second mode, the sensor providing in the first mode a first output associated with force applied to the sensor by an object, the sensor providing in the second mode a second output associated with proximity to the sensor of the object, and a processor capable of processing the first output and the second output and determining whether the object has pressed an input area of the device based on a combination of the first and second outputs and corresponding force and proximity activation thresholds, the processor further capable of determining whether to adjust a force baseline in accordance with the first output based on the second output, wherein the force baseline is different from the activation thresholds and represents the level of force detected in the sensor when no force is being applied to the sensor.
  • 15. A method comprising: operating a sensor in a first mode providing a first output associated with force applied to the sensor by an object, operating the sensor in a second mode providing a second output associated with proximity to the sensor of the object, and processing the first output and the second output to determine whether the object has pressed an input area of a device based on a combination of the first and second outputs and corresponding force and proximity activation thresholds, and determine whether to adjust a force baseline in accordance with the first output based on the second output, wherein the force baseline is different from the activation thresholds and represents the level of force detected in the sensor when no force is being applied to the sensor.
  • 16. A method comprising: providing a sensor having at least a first mode and a second mode, the sensor providing in the first mode a first output associated with force applied to the sensor by an object, the sensor providing in the second mode a second output associated with proximity to the sensor of the object, and providing a processor capable of processing the first output and the second output and determining whether the object has pressed an input area of a device based on a combination of the first and second outputs and corresponding force and proximity activation thresholds, the processor further capable of determining whether to adjust a force baseline in accordance with the first output based on the second output, wherein the force baseline is different from the activation thresholds and represents the level of force detected in the sensor when no force is being applied to the sensor.
  • 17. The device of claim 14, wherein the processor is further configured to receive the first and second outputs from the sensor and activate or de-activate an input state based on changes in the first and second outputs.
  • 18. The device of claim 14, wherein the processor is further configured to determine whether the first output exceeds a threshold amount of force and the second output exceeds a threshold amount of proximity.
  • 19. The device of claim 18, wherein the processor is further configured to activate an input state if the threshold amounts of force and proximity are determined to have been exceeded.
Related Publications (1)
Number Date Country
20090019949 A1 Jan 2009 US