Multi-dimensional scroll wheel

Information

  • Patent Grant
  • Patent Number
    8,866,780
  • Date Filed
    Monday, April 8, 2013
  • Date Issued
    Tuesday, October 21, 2014
Abstract
A multi-dimensional scroll wheel is disclosed. Scroll wheel circuitry is provided to detect input gestures that traverse the center of the scroll wheel and to detect multi-touch input. The scroll wheel can include a first plurality of sensor elements arranged in a first closed loop and a second plurality of sensor elements arranged in a second closed loop, the first and second closed loops being concentrically arranged about the center of the scroll wheel.
Description
FIELD OF THE DISCLOSURE

This relates generally to touch sensitive input devices, and more particularly, to enhanced functionality scroll wheels.


BACKGROUND

Many types of input devices exist for performing operations in consumer electronic devices. One type of input device that has enjoyed popularity in devices such as media players is the touch-based scroll wheel. Touch-based scroll wheels generally use capacitive sensor elements to detect the rotational motion of a user's finger and translate that motion into linear (e.g., horizontal or vertical) movement of a cursor or content on a display screen.


For example, if a user wishes to scroll down through a list of songs displayed on a media player, the user can move a finger along the scroll wheel in a clockwise direction to see a cursor move from song to song down the list. Conversely, if the user wishes to scroll up through the list, the user can move a finger along the scroll wheel in a counter-clockwise direction to see the cursor move from song to song up the list.


Accordingly, scroll wheels have proven useful and convenient for certain applications, such as navigation control using a single point of contact (“single touch” input). However, as consumer electronic devices evolve and provide more applications, it can become challenging to design such applications to operate based on the single touch rotational input detected by scroll wheels as described above.


SUMMARY

To improve the effectiveness of a touch-based scroll wheel, the present disclosure provides multi-touch scroll wheel circuitry capable of sensing input in multiple ways. For example, scroll wheel circuitry can sense a moving object, such as a finger, as it is moved not only in a rotational manner but also in a linear manner across the center of the scroll wheel. The scroll wheel circuitry can also sense more than one object at a time, such as multi-finger touch or motion. In this manner, the scroll wheel circuitry can enable a device to recognize a wider range of input. To be clear, a wheel can be circular, but it can also have any shape that enables a closed-loop type of input.


Rather than having a single region of sensor elements for sensing single touch rotational input, the scroll wheel of the present disclosure can include multiple regions of sensor elements capable of independently or cooperatively, sequentially or simultaneously sensing the position of an object. The sensor elements can be arranged in any manner suitable for sensing varied input events.


For example, the scroll wheel can include an inner region and an outer region of capacitive sensor elements. The inner region can include one or more elements located at or near the center of the scroll wheel, for example, and the outer region can include a plurality of elements arranged around the inner region. The different regions of sensor elements can aid in sensing gestures that traverse the center of the scroll wheel. For example, the different regions of sensor elements can provide several data points for sensing linear motion as it traverses the sensor elements of each region including the center of the scroll wheel.


Applications can be enhanced by the improved range of input enabled by the scroll wheel circuitry. For example, linear motion, such as a swipe across the scroll wheel, can enable an image browsing application to cause images, such as album cover pictures for example, to be transitioned across a screen. Multi-touch input, such as one finger touching an inner region of the scroll wheel and another finger rotating in the outer region, can enable a zooming application to cause a displayed image to be zoomed in or out, depending on the direction of the rotation for example. A pinching or expanding of a user's fingers can also enable the zooming application to cause a zooming action.


The scroll wheel circuitry can also bias the sensor element configuration according to the type of input event expected. For example, if a particular application permits only linear motion input along a particular axis (e.g., a horizontal or vertical swipe), the scroll wheel circuitry can utilize only the sensor elements arranged along that path to sense for an input event. By using less than all available sensor elements in this manner, the scroll wheel circuitry can achieve power savings.


By sensing input from at least some of the inner region sensor elements and outer region sensor elements, an angular and/or radial position of one or more of a user's fingers can be determined with relatively high accuracy.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example of an electronic device.



FIG. 2 illustrates an example of a capacitive sensor element arrangement.



FIG. 3 illustrates an example of a capacitive sensor element arrangement.



FIG. 4 illustrates an example of a capacitive sensor element arrangement.



FIG. 5 illustrates an example of a capacitive sensor element arrangement.



FIG. 6 illustrates an example of a capacitive sensor element arrangement.



FIG. 7 illustrates an example of a capacitive sensor element arrangement.



FIG. 8 illustrates an example of a histogram indicating one finger being applied to a touch sensitive surface.



FIG. 9 illustrates an example of a histogram indicating one finger being applied to a touch sensitive surface.



FIG. 10 illustrates an example of a histogram indicating two fingers being applied to a touch sensitive surface.



FIG. 11 illustrates an example of a histogram indicating two fingers being applied to a touch sensitive surface.



FIG. 12 illustrates an example of a histogram indicating two fingers being applied to a touch sensitive surface.



FIG. 13 illustrates an example of a histogram indicating two fingers being applied to a touch sensitive surface.



FIG. 14 illustrates an example of a user application that may be performed on an electronic device in response to a linear input.



FIG. 15 illustrates an example of a user application that may be performed on an electronic device in response to a linear input.



FIG. 16 illustrates an example of a user application that may be performed on an electronic device in response to a multi-touch input.



FIG. 17 illustrates an example of a user application that may be performed on an electronic device in response to a multi-touch input.



FIGS. 18A-18C illustrate an example of operations of an input device.



FIG. 19 illustrates an example of an input device.



FIG. 20 illustrates an example of a computing system.



FIGS. 21A-21D illustrate examples of applications of input devices.



FIGS. 22A-22B illustrate an example of an installation of an input device into a media player.



FIG. 23 illustrates an example of a remote control incorporating an input device.





DETAILED DESCRIPTION

The present disclosure describes scroll wheels that can sense and resolve positions of one or more objects (e.g., fingers) as they touch the scroll wheel in a rotational, linear or other manner. According to one aspect of the disclosure, a scroll wheel can be provided on an electronic device to facilitate user interaction therewith.


The present disclosure will now be described in detail with reference to a few embodiments as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art, that the present disclosure may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to unnecessarily obscure the present disclosure.



FIG. 1 illustrates an example of an electronic device. The electronic device may be any consumer electronic product. The electronic device may be a computing device and more particularly it may be a media player, PDA, phone, remote control, camera and the like. In the embodiment illustrated in FIG. 1, the electronic device 100 may correspond to a media player. The term “media player” generally refers to computing devices dedicated to processing media such as audio, video or other images, including, for example, music players, game players, video players, video recorders and the like. These devices can be portable to allow a user to, for example, listen to music, play games or video, record video or take pictures wherever the user travels. In one embodiment, the electronic device can be a handheld device that is sized for placement into a pocket of the user. By being pocket sized, the device may be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a portable computer). Furthermore, the device can be operated in the user's hands, thus no reference surface such as a desktop is required.


Electronic devices (e.g., media players) generally have connection capabilities that allow a user to upload and download data to and from a host device, such as a general purpose computer (e.g., desktop computer, portable computer, etc.). For example, in the case of a camera, photo images can be downloaded to the general purpose computer for further processing (e.g., printing). With regard to music players, for example, songs and play lists stored on the general purpose computer can be downloaded into the music player. In the embodiment illustrated in FIG. 1, electronic device 100 can be a pocket-sized hand-held media player (e.g., MP3 player) that allows a user to store a collection of music, photos, album art, contacts, calendar entries, and other desirable media assets. It should be appreciated however, that media players are not a limitation as the electronic device may be embodied in other forms as mentioned above.


As shown in FIG. 1, electronic device 100 may include housing 102 that can enclose various electrical components, such as integrated circuit chips and other circuitry, for example. The integrated circuit chips and other circuitry may include, for example, a microprocessor, memory (e.g., ROM, RAM), a power supply (e.g., battery), a circuit board, a hard drive or Flash memory (e.g., NAND flash) for storing media for example, one or more orientation detection elements (e.g., accelerometer) and various input/output (I/O) support circuitry. In the case of music players, the electrical components can include components for outputting music such as an amplifier and a digital signal processor (DSP) for example. In the case of video recorders or cameras the electrical components can include components for capturing images such as image sensors (e.g., charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS)) or optics (e.g., lenses, splitters, filters) for example. In addition to the above, the housing can also define the shape or form of the electronic device. That is, the contour of housing 102 may embody the outward physical appearance of electronic device 100 in one embodiment.


Electronic device 100 may also include display screen 104. Display screen 104 can be used to display a graphical user interface as well as other information to the user (e.g., text, objects, graphics). By way of example, display screen 104 may be a liquid crystal display (LCD). In one embodiment, the display screen can correspond to an X-by-Y pixel high-resolution display, with a white LED backlight to give clear visibility in daylight as well as low-light conditions. Display screen 104 can also exhibit a “wide screen” aspect ratio (e.g., similar to a 16:9 aspect ratio) such that it may be relatively easy to perceive portrait and landscape orientations. In other embodiments, electronic device 100 does not include display screen 104. The lack of display screen 104 can allow electronic device 100 to be configured with smaller dimensions than it would otherwise have with display screen 104. For example, in one embodiment, electronic device 100 without display screen 104 may be less than two inches wide and less than two inches tall.


Electronic device 100 may also include input device 110. Input device 110 can be configured to provide one or more control functions for controlling various applications associated with electronic device 100. For example, a control function can be used to move an object or perform an action on display screen 104 or to make selections or issue commands associated with operating electronic device 100. Input device 110 may be widely varied. In one embodiment, input device 110 can include a rigid sensor mechanism for detecting input. The rigid sensor mechanism can include, for example, a touch sensitive surface that provides location information for an object, such as a finger for example, in contact with or in proximity to the touch sensitive surface. In another embodiment, input device 110 can include one or more movable sensor mechanisms for detecting input. The movable sensor mechanism can include, for example, one or more moving members that actuate a switch when a particular area of input device 110 is pressed. The movable sensor mechanism may operate as a mechanical push button and perform a clicking action when actuated. In a further embodiment, input device 110 may include a combination of a rigid sensor mechanism and one or more movable sensor mechanisms.


An example of an input device comprising a rigid sensor mechanism may be found in U.S. Pat. No. 7,046,230 entitled “Touch Pad Handheld Device,” which is incorporated herein by reference in its entirety. An example of an input device comprising a combination of a rigid sensor mechanism and a movable sensor mechanism may be found in U.S. patent application Ser. No. 11/812,383 entitled “Gimballed Scroll Wheel,” filed Jun. 18, 2007, which is incorporated herein by reference in its entirety.



FIGS. 2-7 illustrate examples of some arrangements of capacitive sensor elements that can be configured to sense touch events caused by an object, such as a finger, in contact with or in proximity to a touch sensitive surface of input device 110. Touch events detectable by the capacitive sensor elements of input device 110 may be widely varied, and may include, for example, rotational motion, linear motion, taps, holds, and other gestures and any combinations thereof provided by one (single touch input) or more than one (multi-touch input) of a user's fingers across the touch sensitive surface. The capacitive sensor elements can be configured to detect input based on self capacitance (as illustrated in FIGS. 2-7) or mutual capacitance. In self capacitance, the “self” capacitance of a single electrode is measured, for example, relative to ground. In mutual capacitance, the mutual capacitance between at least first and second electrodes is measured. In either case, each of the sensor elements can work independently of the other sensor elements to produce simultaneously occurring signals representative of different points of input on the touch sensitive surface at a particular time. Input device 110 can include a controller configured to detect input sensed by the sensor elements by measuring a change in capacitance of the sensor elements.


An example of an input device configured to detect multiple simultaneous touches or near touches may be found in U.S. patent application Ser. No. 10/840,862 entitled “Multipoint Touchscreen,” filed May 6, 2004, which is incorporated herein by reference in its entirety. An example of a touch event model that can be associated with such an input device may be found in U.S. patent application Ser. No. 12/042,318 entitled “Touch Event Model,” filed Mar. 4, 2008, which is incorporated herein by reference in its entirety. An example of gestures that may be implemented on such an input device may be found in U.S. patent application Ser. No. 11/818,342 entitled “Gestures for Controlling, Manipulating, and Editing of Media Files Using Touch Sensitive Devices,” filed Jun. 13, 2007, which is incorporated herein by reference in its entirety.


The present disclosure is not limited to the input device configurations illustrated herein. Rather, an input device of any suitable technology or configuration for enabling detection of input in accordance with the teachings of the present disclosure can be utilized.


Rather than having a single region of sensor elements for sensing single touch rotational input, input device 110 can include multiple regions of sensor elements capable of independently or cooperatively, sequentially or simultaneously sensing an object.


The sensor elements of input device 110 can be arranged in any manner suitable for sensing varied types of input. For example, input device 110 can include an inner region and an outer region of capacitive sensor elements. The inner region can include one or more elements located at or near the center of the input device, for example, and the outer region can include a plurality of elements arranged around the inner region. The different regions of sensor elements can aid in sensing gestures that traverse the center of the input device. For example, the different regions of sensor elements can provide several data points for sensing linear motion as it traverses the sensor elements of each region including the center of the input device.


Input device 110 can enhance the detection of input that can be characterized as linear or non-linear. For example, a linear input can involve a substantially straight-line application of an object (e.g., finger) across the input device. Input device 110 can also be configured to permit detection of multi-touch events—the detection of two or more objects (e.g., fingers) on the input device simultaneously, or nearly simultaneously. It is understood that rotational touch events and touch events comprising any gestural pattern can also be detected with input device 110.



FIG. 2 illustrates capacitive sensor element arrangement 200 in accordance with one embodiment. In the embodiment illustrated in FIG. 2, input device 110 can include inner region 210, shown as a circle, and outer region 220 which makes up the periphery of the input device. Inner region 210 includes capacitive sensor element 212. Outer region 220 can include several capacitive sensor elements 222. Including sensor element 212 in the center can provide an extra data point to assist in the detection of gestures that traverse the center of the input device, such as linear input. For example, if a user swipes left-to-right across arrangement 200, each of elements 222A, 212, and 222B can be utilized to detect the gesture. Processing circuitry (not shown) can process this data and determine that the user swiped from left-to-right.
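
As a rough illustration of this kind of detection, the sketch below (not taken from the patent) classifies a swipe from the time-ordered sequence of activated sensor elements; the element labels and the classify_swipe helper are assumptions made only for illustration.

```python
# Hypothetical illustration: classifying a swipe from the time-ordered
# sequence of sensor elements activated in FIG. 2. The labels "222A", "212"
# and "222B" refer to the elements named above; the helper itself and its
# return values are assumptions made for this sketch.

LEFT_OUTER, CENTER, RIGHT_OUTER = "222A", "212", "222B"

def classify_swipe(activation_sequence):
    """Return 'left_to_right', 'right_to_left', or None for other patterns."""
    if activation_sequence == [LEFT_OUTER, CENTER, RIGHT_OUTER]:
        return "left_to_right"
    if activation_sequence == [RIGHT_OUTER, CENTER, LEFT_OUTER]:
        return "right_to_left"
    return None

print(classify_swipe(["222A", "212", "222B"]))  # -> left_to_right
```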



FIG. 3 illustrates capacitive sensor element arrangement 300 in accordance with one embodiment. In the embodiment illustrated in FIG. 3, input device 110 can include inner region 310 and outer region 320. Inner region 310 can be devoid of or optionally include a capacitive sensor element. Outer region 320 can include several multi-element capacitive sensing areas 322 that include more than one capacitive sensor element capable of independently or cooperatively sensing an object. Sensing areas 322 can provide additional data points to assist the processing circuitry in detecting touch events. For example, in a left-to-right linear input gesture, sensor elements 322A, 322B, 322C, and 322D can be utilized to detect the gesture. Although FIG. 3 shows sensing areas 322 as having two sensor elements, it is understood that sensing areas 322 can have any number of sensor elements (e.g., 3, 4, 5, etc. elements).



FIGS. 4 and 5 illustrate capacitive sensor element arrangements that have multi-element capacitive sensing areas in their inner regions in accordance with some embodiments. In the embodiment illustrated in FIG. 4, inner region 410 of input device 110 can include multi-element capacitive sensing area 412. Although sensing area 412 is shown to include 4 sensor elements each capable of sensing an object, any number of elements may be used. Similarly, outer region 420 can include any number of capacitive sensor elements 422. In the embodiment illustrated in FIG. 5, inner region 510 of input device 110 can include multi-element capacitive sensing area 512, and outer region 520 can include a number of multi-element capacitive sensing areas 524.



FIG. 6 illustrates capacitive sensor element arrangement 600 having capacitive sensor elements and a trackpad in accordance with one embodiment. In the embodiment illustrated in FIG. 6, inner region 610 can include trackpad 612. Trackpad 612 can be a resistance-sensing or capacitance-sensing trackpad, for example, formed of rows and columns of sensor elements. Outer region 620 can include a combination of capacitive sensor elements 622 and multi-element capacitive sensing areas 624 arranged in a closed loop path, or only single sensor elements or only multi-element sensing areas may be used. Combining a trackpad in an input device with discrete capacitive sensor elements can increase cost and power consumption compared to an input device having only discrete capacitive sensor elements, since different or more complex processing circuitry may be required to compensate for the coordinate-based sensing of the trackpad element. In other embodiments, inner region 610 can include any other type of sensing device, such as a fingerprint sensor or light-based (e.g., laser or LED) sensing device for example.



FIG. 7 illustrates capacitive sensor element arrangement 700 in accordance with one embodiment. In the embodiment illustrated in FIG. 7, input device 110 can include 17 capacitive sensor elements arranged in an inner region near the center of input device 110 (represented by sensor element 1) and an outer region making up the periphery of input device 110 (represented by sensor elements 2-17). Since each sensor element is capable of independently or cooperatively sensing an object, the sensor elements can be configured to sense either single touch or multi-touch input events.


In the embodiment illustrated in FIG. 7, sensor element 1 can occupy the center of the substantially circular sensing area. A first or inner concentric ring of four sensor elements 2-5 can be positioned about central sensor element 1. Each of the sensor elements in the first concentric ring can define a shape of substantially equal area. A second or outer concentric ring including twelve sensor elements 6-17 can be positioned about the first concentric ring. Each of the sensor elements in the second concentric ring can similarly define a shape of substantially equal area. By designing sensor elements, such as sensor elements 2-17 for example, to define approximately the same surface area, the corresponding capacitive detection circuitry can be simplified since sensor elements with similar areas can produce similar capacitances in response to a touch of a user's finger.


Although the illustrated sensor element arrangements have a particular number of sensor elements arranged in a particular way, any suitable number and arrangement of sensor elements can be used. A suitable number and arrangement can be determined by weighing a combination of factors, such as sensor resolution, cost and power, for example. Higher sensor resolution can be achieved with a greater number of smaller sensor elements arranged closely together in a tight pitch. However, a greater number of sensor elements can increase cost, and require additional power to compensate for the time it takes to scan the additional elements to detect an input.


Higher sensor resolution can be advantageous in a variety of situations. For example, higher sensor resolution can enable better detection of motion and multi-touch input gestures because more sensor elements are likely to be touched during such gestures. This can be advantageous for applications in which user interface functionality relies heavily on the rate at which motion input is sensed, such as navigation control applications for example. This can also improve multi-touch input detection because more sensor elements can better capture the difference between a touch by one large finger in contrast to two small fingers positioned closely together, for example.


Device size, input device size and packaging can also influence the determination of a suitable number of elements to use. Packaging issues, for example, can include how certain materials may influence capacitive sensor detection, such as a glass cover placed over the sensor element arrangement for example. A thicker glass may require larger sensor elements to be utilized to ensure adequate capacitance detection through the thick glass.


As shown in the illustrated embodiments, the sensor elements can be circumferentially arranged, such as in a substantially circular closed loop pattern for example. In alternative embodiments the sensor elements can be arranged in a variety of other geometric patterns including, without limitation, ovals, ellipsoids, oblongs, etc. As shown in the illustrated embodiments, the sensor elements can be arranged on a substantially flat surface. However, alternative embodiments can include sensor elements arranged on non-flat surfaces, including without limitation, convex and concave surfaces.


The annular arrangement of the sensor elements can enable an angular position of an input, provided by a user's finger for example, to be determined relative to the input device. For example, utilizing the sensor element arrangement illustrated in FIG. 7, an angular position of a touch by a user's finger relative to input device 110 can be determined by first scanning sensor elements 2-17 in the following combinations to capture the following twelve readings (o)-(xi):


(o) scan sensor elements 2 and 6;


(i) scan sensor elements 2 and 7;


(ii) scan sensor elements 2 and 8;


(iii) scan sensor elements 3 and 9;


(iv) scan sensor elements 3 and 10;


(v) scan sensor elements 3 and 11;


(vi) scan sensor elements 4 and 12;


(vii) scan sensor elements 4 and 13;


(viii) scan sensor elements 4 and 14;


(ix) scan sensor elements 5 and 15;


(x) scan sensor elements 5 and 16;


(xi) scan sensor elements 5 and 17.


In this example, each of the twelve readings (o)-(xi) is associated with an angular position of a sensor element in the outer ring, which comprises sensor elements 6-17. The angular position determination can be made by scanning only the twelve outer ring elements, but including sensor elements 2-5 in the scanning process can increase the accuracy and resolution of the angular position determination.
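
A minimal sketch of how these twelve scan combinations might be driven in software is shown below. The measure() callback is an assumed stand-in for whatever capacitance-readout routine the controller provides; it is not an interface defined in the patent.

```python
# Sketch of the twelve scan combinations (o)-(xi) from FIG. 7.

SCAN_PAIRS = [
    (2, 6), (2, 7), (2, 8),      # readings (o)-(ii)
    (3, 9), (3, 10), (3, 11),    # readings (iii)-(v)
    (4, 12), (4, 13), (4, 14),   # readings (vi)-(viii)
    (5, 15), (5, 16), (5, 17),   # readings (ix)-(xi)
]

def capture_angular_readings(measure):
    """Scan each inner/outer element pair and return readings R0..R11."""
    return [measure(inner, outer) for inner, outer in SCAN_PAIRS]
```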


Additionally, the sensor elements can be scanned at a speed substantially faster than that at which the user's finger moves relative to the sensor elements. For example, all of the sensor element combinations comprising readings (o)-(xi) can be scanned within a few milliseconds. In effect, the relatively fast scanning speed provides a snapshot of the angular position of the user's finger relative to the sensing area. Since higher scanning speeds consume more power, it can be advantageous to scan only as fast and as often as necessary.


Using the twelve readings (o)-(xi) described above, the angular position of the user's finger can be calculated in a number of ways. In one embodiment, a centroid detection algorithm can be utilized according to the following formula:






C = \frac{\sum_{i=0}^{11} i \, R_i}{\sum_{i=0}^{11} R_i}

where


i represents an index number uniquely associated with the angular position of each one of the sensor elements in the outer ring (i.e., the angular position of each of sensor elements 6-17 can be uniquely associated with a corresponding index number 0-11);


Ri represents the amplitude of the signal measured in the reading (o)-(xi) associated with each of sensor elements 6-17 in the outer ring; and


C represents a numerical value which varies as a function of detected signal amplitude and which can be used to identify the angular position of a user's finger relative to the sensor elements.
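
The formula above translates directly into code. The sketch below assumes the twelve readings have already been captured in the order of readings (o)-(xi); wrap-around of the circular ring is ignored here and is handled by the range restriction discussed in the following paragraphs.

```python
def angular_centroid(readings):
    """Weighted average (centroid) C of angular index positions 0..11.

    readings: the twelve amplitudes R_i captured in the order of readings
    (o)-(xi). Returns a fractional index identifying the angular position
    of the touch, or None when nothing is sensed.
    """
    total = sum(readings)
    if total == 0:
        return None  # no touch detected
    return sum(i * r for i, r in enumerate(readings)) / total
```

For instance, readings peaked at index 0 (the scan of sensor elements 2 and 6) would yield a centroid near 0, corresponding to the 11 o'clock position discussed below.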


As shown in the histogram of measured reading values of FIG. 8, the result of the calculation, C, is the centroid, or average, of the angular positions P weighted by the captured readings through which the plotted line is interpolated. Each position P relates to an angular position of one of sensor elements 6-17 in the outer ring. A peak in this histogram corresponds to sensor element signals of relatively greater amplitude, indicating the angular position of the user's finger. The result of the centroid calculation corresponds to this peak.


For example, if the calculated centroid C indicates a histogram peak near the (o) reading (in which sensor elements 2 and 6 are scanned), then this can indicate that the user's finger is located near the 11 o'clock position (associated with sensor element 6) shown in FIG. 7. Applying this example in the context of FIG. 8, Pj can represent the 9 o'clock position (associated with sensor element 16) and Pj+k can represent the 1 o'clock position (associated with sensor element 8). These positions show low readings, as evidenced by the low values of the plotted line, because sensor elements 16 and 8 are two sensor elements removed from the touched sensor element (sensor element 6) and therefore do not sense much capacitance. C can represent the 11 o'clock position (associated with sensor element 6).


The processing circuitry can apply the centroid calculation over all angular positions or over only those in which the captured readings exceed a certain threshold amount. In the above example, the centroid calculation can be applied over only those angular positions associated with sensor elements 16-17-6-7-8. The readings from the remaining sensor element positions can be presumed to be below a baseline that indicates no finger touch.


The processing circuitry can also define the range for each centroid calculation to be between angular positions located at or below the baseline. If the range is not defined in this manner, the circuitry can incorrectly conclude the presence of two fingers rather than one. For example, the histogram of FIG. 9 shows a peak reading at sensor element angular position Pj (e.g., corresponding to the i=0 position in the current example) and at sensor element angular position Pj+n (e.g., corresponding to the i=11 position in the current example). Although the sensor elements associated with positions Pj and Pj+n can be angularly adjacent to one another on an input device, the histogram shows two peaks rather than one. If the range for the centroid calculation is defined from i=Pj to Pj+n, the circuitry can identify two centroids (one between Pj and Pj+k, and another between Pj+n−k and Pj+n), incorrectly indicating a finger at two locations on the input device. In contrast, by defining the calculation range from one low point, such as Pj+n−k, to the next low point, such as Pj+k, the circuitry can correctly identify only one centroid indicating the presence of one finger on the input device.
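
One way to implement this range restriction, sketched under the assumption that the baseline is known, is to split the circular ring of readings into contiguous above-baseline runs and compute one centroid per run; the helper names and the unwrapping of indices are illustrative choices, not details from the patent.

```python
def touch_segments(readings, baseline):
    """Split the circular ring of readings into contiguous above-baseline runs.

    Returns one list of angular indices per touch so that a centroid can be
    computed per run rather than over the whole ring, which could otherwise
    falsely report two fingers (as discussed for FIG. 9). Indices in a run
    that wraps past the last position continue counting upward (unwrapped).
    """
    n = len(readings)
    above = [r > baseline for r in readings]
    if not any(above):
        return []
    if all(above):
        return [list(range(n))]
    start = next(i for i in range(n) if not above[i])  # begin at a low point
    segments, current = [], []
    for k in range(n):
        idx = start + k          # unwrapped index; idx % n is the element slot
        if above[idx % n]:
            current.append(idx)
        elif current:
            segments.append(current)
            current = []
    if current:
        segments.append(current)
    return segments


def segment_centroid(readings, segment):
    """Centroid C over one above-baseline run, wrapped back onto 0..n-1."""
    n = len(readings)
    weights = [readings[idx % n] for idx in segment]
    c = sum(idx * w for idx, w in zip(segment, weights)) / sum(weights)
    return c % n
```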


In another embodiment, the angular position of the user's finger can be calculated using a line crossing method based on the twelve readings (o)-(xi) described above. In the line crossing method, the slope of the histogram of measured reading values can be used to determine the location of the histogram peak. For example, readings that increase from one detected angular position to the next provide a positively-sloped histogram. Conversely, readings that decrease from one detected angular position to the next provide a negatively-sloped histogram. The histogram peak, indicating the angular position of the user's finger, can be identified as the mid-point between the positively-sloped histogram values and the negatively-sloped histogram values.
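
A possible rendering of the line crossing method is shown below; it assumes a single, well-separated peak that does not wrap around the ring.

```python
def line_crossing_peak(readings):
    """Estimate the histogram peak as the midpoint between the rising
    (positively-sloped) and falling (negatively-sloped) portions of the
    readings. A sketch of the line crossing method described above.
    """
    slopes = [readings[i + 1] - readings[i] for i in range(len(readings) - 1)]
    last_rise = max((i for i, s in enumerate(slopes) if s > 0), default=None)
    first_fall = min((i for i, s in enumerate(slopes) if s < 0), default=None)
    if last_rise is None or first_fall is None:
        return None
    # slope i describes the transition from position i to position i + 1
    return (last_rise + 1 + first_fall) / 2
```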


The methods described above can also enable detection of more than one finger applied to the sensing surface. For example, when two fingers are placed on the sensing surface, a histogram of the measured sensor element readings can have two peaks, with each peak being associated with the respective angular location of an applied finger. FIG. 10 illustrates an example of a histogram indicating two fingers being applied to the sensing surface. In the case of two fingers, the detected amplitude (also occasionally referred to as the detected “mass”) can be about twice the detected amplitude for one finger. When two fingers are spaced relatively far apart, the resulting histogram can display two relatively separate and distinct bell-shaped curves (as shown, for example, in FIG. 10). When two fingers are spaced relatively close together, the separation between the bell-shaped curves can be less distinct (as shown, for example, in FIG. 11).


Accordingly, to enable multi-touch detection, in one embodiment the processing circuitry can first determine whether the aggregate of the sensor element readings exceeds a certain threshold. The thresholds can be set at levels indicating whether one, two, three or more fingers are deemed to be touching the touch sensitive surface of the input device. If the aggregate readings indicate one finger on the touch sensitive surface, the angular position of the touch can be determined by the centroid calculation or the line crossing method described above, for example. If the aggregate readings indicate more than one finger on the touch sensitive surface, the angular position of the touch can be determined in a number of ways.
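
A simple way to express this threshold test is sketched below; the numeric thresholds are invented for illustration, as the patent does not specify values.

```python
# Hypothetical threshold test for estimating how many fingers are present.
ONE_FINGER_MASS = 100.0   # assumed aggregate amplitude for a single finger
TWO_FINGER_MASS = 200.0   # roughly double, per the "detected mass" discussion

def estimate_finger_count(readings):
    """Return 0, 1, or 2 (meaning two or more) from the aggregate amplitude."""
    total = sum(readings)
    if total < 0.5 * ONE_FINGER_MASS:
        return 0
    if total < 0.5 * (ONE_FINGER_MASS + TWO_FINGER_MASS):
        return 1
    return 2
```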


For example, if the fingers are spaced relatively far apart, their angular position can be determined using the line crossing method described above since the peaks of a resulting histogram are likely to be well-defined. However, in a situation in which two fingers are spaced relatively close together, the line crossing method may not be effective due to the lack of separation between peaks as shown, for example, in FIG. 11. In this situation, the angular position of each finger can be determined to be an offset, by a standard deviation factor, from the midpoint position of the readings determined by the centroid calculation described above. For instance, the centroids of each of two fingers (represented as C0 and C1) can be calculated as shown in FIGS. 12 and 13 and as follows:






\sigma = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left( x_i - \bar{x} \right)^2}

C_0 = C_2 - K \sigma

C_1 = C_2 + K \sigma





Where


σ (sigma) represents the standard deviation of the entire “two finger” histogram;


N represents the number of angular positions;


xi represents the amplitude of the signal measured (the same as Ri above);


x̄ represents the mean of the measured amplitudes xi;


K represents a constant between 0.5 and 3.0; and


C2 represents the centroid of the entire histogram.
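
Putting these formulas together, a sketch of the two-finger estimate might look as follows; the default K = 1.0 is an arbitrary value within the stated 0.5 to 3.0 range.

```python
import math

def two_finger_centroids(readings, K=1.0):
    """Estimate the centroids C0 and C1 of two closely spaced fingers.

    Per the formulas above, sigma is the standard deviation of the reading
    amplitudes, C2 is the centroid of the whole histogram, and C0, C1 are
    offset from C2 by -K*sigma and +K*sigma respectively.
    """
    n = len(readings)
    total = sum(readings)
    if total == 0:
        raise ValueError("no touch sensed")
    mean = total / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in readings) / n)
    c2 = sum(i * r for i, r in enumerate(readings)) / total
    return c2 - K * sigma, c2 + K * sigma
```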


The ability to detect the angular position of more than one finger applied to the sensing surface can provide several advantages. For example, if the sensing area is used in combination with a display, the ability to detect the location of two different fingers can enable the user to use at least two pieces of information to manipulate displayed data. This feature can enable detection, for example, of an increase in relative distance between two fingers on the sensing surface (a gesture sometimes referred to as “zooming”). Similarly, this feature can enable detection of a decrease in relative distance between two fingers on the sensing surface (a gesture sometimes referred to as “pinching”). In this manner, two pieces of information can be used to manipulate data on the user interface. The need for additional buttons or other input devices to manipulate data can thereby be reduced or eliminated.


In a somewhat similar manner, a radial position of the user's finger can also be determined. Again referring to input device 110 configured with the sensor element arrangement illustrated in FIG. 7, the relative radial position of a user's finger can be detected by scanning sensor elements 1-17 in the following combinations to capture the following three readings (xii)-(xiv):


(xii) scan sensor element 1;


(xiii) scan sensor elements 2-5;


(xiv) scan sensor elements 6-17.


In this example, the first reading (xii) is associated with central sensor element 1, the second reading (xiii) is associated with the inner ring of sensor elements 2-5, and the third reading (xiv) is associated with the outer ring of sensor elements 6-17. The scanning speed can again preferably be substantially faster than the speed at which a user's finger moves relative to the sensor elements. For example, in this embodiment all of the sensor element combinations comprising the three readings (xii)-(xiv) may be scanned within a few milliseconds. In effect, the relatively fast scanning speed can provide a snapshot of the radial position of the user's finger relative to the sensing area.


Using the three readings (xii)-(xiv), the radial position of the user's finger can be calculated using a centroid detection algorithm according to the following formula:






C = \frac{\sum_{i=0}^{2} i \, R_i}{\sum_{i=0}^{2} R_i}







Where


i represents an index number uniquely associated with a radial position of the center sensor element, the inner ring or the outer ring;


Ri represents the amplitude of the signal measured in the reading (xii)-(xiv) associated with the center sensor element, the inner ring or the outer ring;


C represents a numerical value which varies as a function of detected signal amplitude and which may be used to identify the radial position of a user's finger relative to the sensor elements.
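
The radial centroid can be computed from the three readings in the same way as the angular centroid. In the sketch below, the index assignment of 0 (center element), 1 (inner ring) and 2 (outer ring) is an assumption consistent with the definitions above.

```python
def radial_centroid(center_reading, inner_ring_reading, outer_ring_reading):
    """Centroid C over the three radial readings (xii)-(xiv).

    Index 0 is the center element, 1 the inner ring, 2 the outer ring, so C
    ranges from 0 to 2, with larger values indicating a touch nearer the
    outer ring.
    """
    readings = [center_reading, inner_ring_reading, outer_ring_reading]
    total = sum(readings)
    if total == 0:
        return None
    return sum(i * r for i, r in enumerate(readings)) / total
```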


As before, the results of the radial calculation can be represented in the form of a histogram. A peak in this histogram corresponds to the detection of sensor element signals of relatively greater amplitude, indicating the radial position of the user's finger. The result of the centroid calculation corresponds to this peak. For example, if the calculation indicates a histogram peak near the (xiv) reading (in which the outer ring sensor elements are collectively scanned) then this can indicate that the user's finger is located near the outer ring.


In some situations, the processing circuitry may only need to determine whether more than one finger is applied to the sensing area, rather than determine a specific location of the fingers. In this situation, if the magnitude of the measured signals is above a certain threshold amount that indicates the presence of more than one finger, the input device circuitry need not perform the above positional calculations. In other situations, the input device circuitry can adjust its sensing sensitivity in order to compensate for fingers of different sizes, such as those of adults and children.



FIGS. 14-17 illustrate examples of user applications that can be performed on a portable electronic device using an input device configured in accordance with the various embodiments described herein.


In the embodiment illustrated in FIG. 14, input device 1410 can detect a linear input gesture to enable portable media device 1400 to execute an image browsing application. An example of an image browsing application that causes images, such as album cover pictures (in a “coverflow” application) for example, to be transitioned across a screen may be found in U.S. patent application Ser. No. 11/767,409 entitled “Media Player with Imaged Based Browsing,” filed Jun. 22, 2007, which is incorporated herein by reference in its entirety. In the embodiment illustrated in FIG. 14, media device 1400 can display picture #1 in a landscape orientation. Device 1400 can determine its orientation based on one or more orientation detection elements, such as an accelerometer for example. When input device 1410 detects a swipe in a left-to-right direction, media device 1400 can transition the display of picture #1 to picture #2 by sliding picture #2 in from the left of the display screen as illustrated.


In the embodiment illustrated in FIG. 15, input device 1510 can detect a linear input gesture to enable portable media device 1500 to provide list navigation. In a portrait orientation, media device 1500 can display a list of items on its display screen. When input device 1510 detects a vertical swipe in an up-to-down direction, media device 1500 can scroll the list in a downward direction in response to the applied input.


In the embodiment illustrated in FIG. 16, input device 1610 can detect a multi-touch input gesture to enable portable media device 1600 to execute a zoom-in feature in connection with a displayed image. In the illustrated embodiment, media device 1600 can display an image on its display screen. When input device 1610 detects one finger in inner region 1630 and another finger rotating in the clockwise direction in outer region 1620, media device 1600 can zoom in on the displayed image in response to the applied input. Similarly, in the embodiment illustrated in FIG. 17, when input device 1610 detects one finger in inner region 1630 and another finger rotating in the counterclockwise direction in outer region 1620, media device 1600 can zoom out from the displayed image in response to the applied input.


An input device can also enable zooming by detecting a pinch gesture. In one embodiment, an input device can enable a distance between at least two fingers applied to its touch sensitive surface to be determined. In another embodiment, an input device can enable a distance between one or more fingers applied to its touch sensitive surface and some other reference point, such as the center of the input device for example, to be determined. If the determined distance increases during the input event, indicating a spreading apart motion, a zoom-in signal can be generated. If the determined distance decreases during the input event, indicating a closing together motion, a zoom-out signal can be generated.
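
A minimal sketch of this distance comparison is given below; the point format and the signal names are assumptions made for illustration.

```python
import math

def zoom_signal(old_points, new_points):
    """Compare finger separation before and after a movement.

    old_points/new_points are pairs of (x, y) touch coordinates. Returns
    'zoom_in' when the separation grows, 'zoom_out' when it shrinks, and
    None when it is unchanged.
    """
    def separation(points):
        (x0, y0), (x1, y1) = points
        return math.hypot(x1 - x0, y1 - y0)

    delta = separation(new_points) - separation(old_points)
    if delta > 0:
        return "zoom_in"
    if delta < 0:
        return "zoom_out"
    return None
```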


The amount of zooming can vary according to the determined distance. Furthermore, the zooming can occur in relation to the motion of the fingers. For instance, as the fingers spread apart or close together, the object can be zoomed in or zoomed out at the same time. Although this methodology can be directed at zooming, it can also be used for other applications, such as enlarging or reducing for example. Zooming can be particularly useful in graphical programs such as publishing, photo, and drawing programs for example.


In another embodiment, input device processing circuitry can be configured to recognize principal directions of applied gestures and modify the scanning pattern accordingly. For example, the circuitry can optimize the scanning pattern so that input events applied at positions of 0, 90, 180, 270 degrees, for example, obtain better signal to noise ratios than input events applied at other positions.


In another embodiment, input device processing circuitry can switch between different sensing configurations in order to achieve power savings. In a particular sensing configuration, the circuitry can enable only sensor elements arranged according to a predefined input pattern to sense input. For example, in a “swipe mode”, only linearly arranged sensor elements can be enabled to sense input. Similarly, in a “legacy wheel mode”, only radially arranged sensor elements can be enabled to sense input. In a “gesture mode”, all sensor elements can be enabled so that any pattern of gestural input can be sensed by the input device. The particular sensing configuration to be utilized by the input device can be based on a context of an application running on the portable media device, for example.
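
One way such mode-dependent configurations could be expressed is sketched below, using the FIG. 7 element numbering. The elements chosen for the horizontal swipe path are inferred from the clock positions discussed earlier, and "legacy wheel mode" is interpreted here as enabling only the outer ring; both are assumptions rather than configurations specified in the patent.

```python
# Hypothetical mode-to-elements mapping using the FIG. 7 numbering.
ALL_ELEMENTS = set(range(1, 18))      # elements 1-17
HORIZONTAL_PATH = {16, 5, 1, 3, 10}   # assumed left-to-right path of elements
OUTER_RING = set(range(6, 18))        # elements 6-17

def enabled_elements(mode):
    """Return the set of sensor elements left powered for a sensing mode."""
    if mode == "swipe":
        return HORIZONTAL_PATH   # only linearly arranged elements
    if mode == "legacy_wheel":
        return OUTER_RING        # only the rotational ring
    if mode == "gesture":
        return ALL_ELEMENTS      # everything, so any gesture can be sensed
    raise ValueError(f"unknown sensing mode: {mode}")
```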



FIGS. 18A-18C illustrate operations of an input device according to some embodiments of the present disclosure. By way of example, the input device may generally correspond to input device 110. In the example shown in FIG. 18A, input device 1830 can be configured to send information or data to an electronic device in order to perform an action on a display screen (e.g., via a graphical user interface). Examples of actions that may be performed include moving an input pointer, making a selection, providing instructions, etc. The input device can interact with the electronic device through a wired connection (e.g., cable/connector) or a wireless connection (e.g., IR, Bluetooth, etc.). Input device 1830 may be a stand-alone unit or it may be integrated into the electronic device. As a stand-alone unit, the input device can have its own enclosure. When integrated into an electronic device, the input device can typically use the enclosure of the electronic device. In either case, the input device can be structurally coupled to the enclosure, as for example, through screws, snaps, retainers, adhesives and the like. In some cases, the input device may be removably coupled to the electronic device, as for example, through a docking station. The electronic device to which the input device may be coupled can correspond to any consumer related electronic product. By way of example, the electronic device can correspond to a computer such as a desktop computer, laptop computer or PDA, a media player such as a music player, a communication device such as a cellular phone, another input device such as a keyboard, and the like.


As shown in FIG. 18A, in this embodiment input device 1830 may include frame 1832 (or support structure) and touch pad 1834. Frame 1832 can provide a structure for supporting the components of the input device. Frame 1832 in the form of a housing can also enclose or contain the components of the input device. The components, which may include touch pad 1834, can correspond to electrical, optical and/or mechanical components for operating input device 1830. Frame 1832 may be a separate component or it may be an integral component of the housing of the electronic device.


Touch pad 1834 can provide location information for an object, such as a finger for example, in contact with or in proximity to the touch pad. This information can be used in combination with information provided by a movement indicator to generate a single command associated with the movement of the touch pad. The touch pad may be used as an input device by itself; for example, the touch pad may be used to scroll through a list of items on the device.


The shape, size and configuration of touch pad 1834 may be widely varied. In addition to the touchpad configurations disclosed above, a conventional touch pad based on the Cartesian coordinate system or on a polar coordinate system can be configured to provide scrolling using rotational movements and can be configured to accept multi-touch input and gestures, for example those described herein. An example of a touch pad based on polar coordinates may be found in U.S. Pat. No. 7,046,230, which is incorporated by reference above. Furthermore, touch pad 1834 can be used in at least two different modes, which may be referred to as a relative mode and an absolute mode. In absolute mode, touch pad 1834 can, for example, report the absolute coordinates of the location at which it may be touched. For example, these would be “x” and “y” coordinates in the case of a standard Cartesian coordinate system or (r, θ) in the case of a polar coordinate system. In relative mode, touch pad 1834 can report the direction and/or distance of change, for example, left/right, up/down, and the like. In most cases, the signals produced by touch pad 1834 can direct movement on the display screen in a direction similar to the direction of the finger as it may be moved across the surface of touch pad 1834.
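
For reference, converting an absolute-mode Cartesian report into the polar form mentioned above is a short calculation; this helper is illustrative only.

```python
import math

def to_polar(x, y):
    """Convert an absolute-mode Cartesian report (x, y) into the (r, theta)
    form used by a polar-coordinate touch pad; theta is returned in degrees,
    measured counter-clockwise from the positive x axis."""
    r = math.hypot(x, y)
    theta = math.degrees(math.atan2(y, x)) % 360
    return r, theta
```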


Further examples of touch pad configurations may be found in U.S. patent application Ser. No. 10/949,060 entitled “Raw Data Track Pad Device and System,” filed Sep. 24, 2004, U.S. patent application Ser. No. 11/203,692 entitled “Method of Increasing the Spatial Resolution of Touch Sensitive Devices,” filed Aug. 15, 2005, and U.S. patent application Ser. No. 11/818,395 entitled “Touch Screen Stack-Ups,” filed Jun. 13, 2007, all of which are incorporated herein by reference in their entireties.


Further examples of touch pad sensing may be found in U.S. patent application Ser. No. 10/903,964 entitled “Gestures for Touch Sensitive Input Devices,” filed Jul. 30, 2004, U.S. patent application Ser. No. 11/038,590 entitled “Mode-Based Graphical User Interfaces for Touch Sensitive Input Devices,” filed Jan. 18, 2005, U.S. patent application Ser. No. 11/048,264 entitled “Gestures for Touch Sensitive Input Devices,” filed Jan. 31, 2005, U.S. patent application Ser. No. 11/232,299 entitled “System and Method for Processing Raw Data of Track Pad Device,” filed Sep. 21, 2005, and U.S. patent application Ser. No. 11/619,464 entitled “Multi-Touch Input Discrimination,” filed Jan. 3, 2007, all of which are incorporated herein by reference in their entireties.


The shape of touch pad 1834 may be widely varied. For example, it may be circular, oval, square, rectangular, triangular, and the like. In general, the outer perimeter can define the working boundary of touch pad 1834. In the embodiment illustrated in FIG. 18, the touch pad may be circular. Circular touch pads can allow a user to continuously swirl a finger in a free manner, i.e., the finger may be rotated through 360 degrees of rotation without stopping. This form of motion can produce incremental or accelerated scrolling through a list of songs being displayed on a display screen, for example. Furthermore, the user may rotate his or her finger tangentially from all sides, thus providing more finger position range. Both of these features may help when performing a scrolling function. Furthermore, the size of touch pad 1834 can accommodate manipulation by a user (e.g., the size of a finger tip or larger).


Touch pad 1834 can generally take the form of a rigid platform. The rigid platform may be planar, convex or concave, and may include touchable outer surface 1836, which may be textured, for receiving a finger or other object for manipulation of the touch pad. Although not shown in FIG. 18A, beneath touchable outer surface 1836 can be a sensor arrangement that may be sensitive to such things as the pressure and movement of a finger thereon. The sensor arrangement may typically include a plurality of sensors that can be configured to activate as the finger sits on, taps on or passes over them. In the simplest case, an electrical signal can be produced each time the finger is positioned over a sensor. The number of signals in a given time frame may indicate location, direction, speed and acceleration of the finger on touch pad 1834, i.e., the more signals, the more the user moved his or her finger. In most cases, the signals can be monitored by an electronic interface that converts the number, combination and frequency of the signals into location, direction, speed and acceleration information. This information can then be used by the electronic device to perform the desired control function on the display screen. The sensor arrangement may be widely varied. By way of example, the sensors can be based on resistive sensing, surface acoustic wave sensing, pressure sensing (e.g., strain gauge), optical sensing, capacitive sensing and the like.


In the embodiment illustrated in FIG. 18, touch pad 1834 may be based on capacitive sensing. In most cases, the capacitive touch pad may include a protective shield, one or more electrode layers, a circuit board and associated electronics including an application specific integrated circuit (ASIC). The protective shield can be placed over the electrodes, the electrodes can be mounted on the top surface of the circuit board, and the ASIC can be mounted on the bottom surface of the circuit board. The protective shield may serve to protect the underlayers and to provide a surface for allowing a finger to slide thereon. The surface may generally be smooth so that the finger does not stick to it when moved. The protective shield also may provide an insulating layer between the finger and the electrode layers. The electrode layer may include a plurality of spatially distinct electrodes. Any suitable number of electrodes can be used. As the number of electrodes increases, the resolution of the touch pad also increases.


In accordance with one embodiment, touch pad 1834 can be movable relative to the frame 1832. This movement can be detected by a movement detector that generates another control signal. By way of example, touch pad 1834 in the form of a rigid planar platform can rotate, pivot, slide, translate, flex and/or the like relative to frame 1832. Touch pad 1834 can be coupled to frame 1832 and/or it can be movably restrained by frame 1832. By way of example, touch pad 1834 can be coupled to frame 1832 through axles, pin joints, slider joints, ball and socket joints, flexure joints, magnets, cushions and/or the like. Touch pad 1834 can also float within a space of the frame (e.g., gimbal). It should be noted that input device 1830 may additionally include a combination of joints such as a pivot/translating joint, pivot/flexure joint, pivot/ball and socket joint, translating/flexure joint, and the like to increase the range of movement (e.g., increase the degree of freedom).


When moved, touch pad 1834 can be configured to actuate a movement detector circuit that generates one or more signals. The circuit may generally include one or more movement detectors such as switches, sensors, encoders, and the like.


In the embodiment illustrated in FIG. 18, touch pad 1834 can be part of a depressible platform. The touch pad can operate as a button and perform one or more mechanical clicking actions. Multiple functions or the same function of the device may be accessed by depressing the touch pad 1834 in different locations. A movement detector signals that touch pad 1834 has been depressed, and touch pad 1834 signals a location on the platform that has been touched. By combining both the movement detector signals and touch pad signals, touch pad 1834 acts like multiple buttons such that depressing the touch pad at different locations corresponds to different buttons. As shown in FIGS. 18B and 18C, according to one embodiment touch pad 1834 can be capable of moving between an upright position (FIG. 18B) and a depressed position (FIG. 18C) when a requisite amount of force from finger 1838, palm, hand or other object is applied to touch pad 1834. Touch pad 1834 can be spring biased in the upright position, as for example through a spring member. Touch pad 1834 moves to the depressed position when the spring bias is overcome by an object pressing on touch pad 1834.


As shown in FIG. 18B, touch pad 1834 generates tracking signals when an object such as a user's finger is moved over the top surface of the touch pad in the x, y plane. As shown in FIG. 18C, in the depressed position (z direction), touch pad 1834 generates positional information and a movement indicator generates a signal indicating that touch pad 1834 has moved. The positional information and the movement indication can be combined to form a button command. Different button commands or the same button command can correspond to depressing touch pad 1834 in different locations. The different commands may be used for various functionalities including, but not limited to, making selections or issuing commands associated with operating an electronic device. By way of example, in the case of a music player, the button commands may be associated with opening a menu, playing a song, fast forwarding a song, seeking through a menu and the like.


To elaborate, touch pad 1834 can be configured to actuate a movement detector, which together with the touch pad positional information, can form a button command when touch pad 1834 is moved to the depressed position. The movement detector can be located within frame 1832 and coupled to touch pad 1834 and/or frame 1832. The movement detector may be any combination of switches and sensors. Switches can be generally configured to provide pulsed or binary data such as activate (on) or deactivate (off). By way of example, an underside portion of touch pad 1834 can be configured to contact or engage (and thus activate) a switch when the user presses on touch pad 1834. The sensors, on the other hand, can be generally configured to provide continuous or analog data. By way of example, the sensor can be configured to measure the position or the amount of tilt of touch pad 1834 relative to the frame when a user presses on touch pad 1834. Any suitable mechanical, electrical and/or optical switch or sensor may be used. For example, tact switches, force sensitive resistors, pressure sensors, proximity sensors, and the like may be used. In some cases, the spring bias for placing touch pad 1834 in the upright position may be provided by a movement detector that includes a spring action. In other embodiments, input device 1830 can include one or more movement detectors in various locations positioned under and/or above touch pad 1834 to form button commands associated with the particular locations in which the movement detector is actuated.


Touch pad 1834 can also be configured to provide a force feedback response. An example of a touch pad configuration providing a haptic feedback response may be found in U.S. Pat. No. 6,337,678 entitled “Force Feedback Computer Input and Output Device with Coordinated Haptic Elements,” which is incorporated herein by reference in its entirety.



FIG. 19 illustrates a simplified perspective diagram of input device 1870. Like the input device shown in the embodiment of FIGS. 18A-18C, this input device 1870 incorporates the functionality of one or more buttons directly into touch pad 1872, i.e., the touch pad acts like a button. In this embodiment, however, touch pad 1872 can be divided into a plurality of independent and spatially distinct button zones 1874. Button zones 1874 may represent regions of the touch pad 1872 that can be moved by a user to implement distinct button functions or the same button function. The dotted lines may represent areas of touch pad 1872 that make up an individual button zone. Any number of button zones may be used, for example, two or more, four, eight, etc. In the embodiment illustrated in FIG. 19, touch pad 1872 may include four button zones 1874 (i.e., zones A-D).


As should be appreciated, the button functions generated by pressing on each button zone may include selecting an item on the screen, opening a file or document, executing instructions, starting a program, viewing a menu, and/or the like. The button functions may also include functions that make it easier to navigate through the electronic system, as for example, zoom, scroll, open different menus, home the input pointer, perform keyboard related actions such as enter, delete, insert, page up/down, and the like. In the case of a music player, one of the button zones may be used to access a menu on the display screen, a second button zone may be used to seek forward through a list of songs or fast forward through a currently playing song, a third button zone may be used to seek backwards through a list of songs or rewind through a currently playing song, and a fourth button zone may be used to pause or stop a song that may be in the process of being played.


To elaborate, touch pad 1872 can be capable of moving relative to frame 1876 so as to create a clicking action. Frame 1876 can be formed from a single component or a combination of assembled components. The clicking action can actuate a movement detector contained inside frame 1876. The movement detector can be configured to sense movements of the button zones during the clicking action and to send a signal corresponding to the movement to the electronic device. By way of example, the movement detectors may be switches, sensors and/or the like.


In addition, touch pad 1872 can be configured to send positional information indicating which button zone is acted on when the clicking action occurs. The positional information can allow the device to determine which button zone to activate when the touch pad is moved relative to the frame.
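A minimal sketch of this zone-based dispatch follows: the click signal says "the pad moved," the positional information says which zone was acted on, and the two together select a button function. The angular zone boundaries and the music-player function names are assumptions chosen to mirror the example above, not any embodiment's actual firmware.

```python
# Sketch of zone-based dispatch for the four button zones A-D of FIG. 19.
# Zone geometry and function names are illustrative assumptions.
import math

ZONE_FUNCTIONS = {
    "A": "open_menu",
    "B": "seek_forward",     # or fast forward a currently playing song
    "C": "seek_backward",    # or rewind a currently playing song
    "D": "pause_or_stop",
}

def zone_for_touch(x: float, y: float) -> str:
    """Map a touch at (x, y), measured from the pad center, to zone A-D by angle."""
    angle = math.degrees(math.atan2(y, x)) % 360.0
    if 45 <= angle < 135:
        return "A"   # top
    if angle < 45 or angle >= 315:
        return "B"   # right
    if 225 <= angle < 315:
        return "D"   # bottom
    return "C"       # left

def on_click(x: float, y: float) -> str:
    """Called when the movement detector reports the clicking action."""
    return ZONE_FUNCTIONS[zone_for_touch(x, y)]

if __name__ == "__main__":
    print(on_click(0.0, 1.0))    # open_menu (zone A)
    print(on_click(0.9, -0.1))   # seek_forward (zone B)
```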


The movements of each of button zones 1874 may be provided by various rotations, pivots, translations, flexes and the like. In one embodiment, touch pad 1872 can be configured to gimbal relative to frame 1876. By gimbal, it is generally meant that the touch pad 1872 can float in space relative to frame 1876 while still being constrained thereto. The gimbal can allow the touch pad 1872 to move in single or multiple degrees of freedom (DOF) relative to the housing, for example, movements in the x, y and/or z directions and/or rotations about the x, y, and/or z axes (θx, θy and/or θz).



FIG. 20 illustrates an example of a simplified block diagram of a computing system 1839. The computing system may generally include input device 1840 operatively connected to computing device 1842. By way of example, input device 1840 can generally correspond to input device 1830 shown in FIGS. 18A-18C, and the computing device 1842 can correspond to a computer, PDA, media player or the like. As shown, input device 1840 may include depressible touch pad 1844 and one or more movement detectors 1846. Touch pad 1844 can be configured to generate tracking signals and movement detector 1846 can be configured to generate a movement signal when the touch pad is depressed. Although touch pad 1844 may be widely varied, in this embodiment, touch pad 1844 can include capacitance sensors 1848 and control system 1850 (which can generally correspond to the sensor controller described above) for acquiring position signals from sensors 1848 and supplying the signals to computing device 1842. Control system 1850 can include an application specific integrated circuit (ASIC) that can be configured to monitor the signals from sensors 1848, to compute the absolute location, angular location, direction, speed and/or acceleration of the monitored signals and to report this information to a processor of computing device 1842. Movement detector 1846 may also be widely varied. In this embodiment, however, movement detector 1846 can take the form of a switch that generates a movement signal when touch pad 1844 is depressed. Movement detector 1846 can correspond to a mechanical, electrical or optical style switch. In one particular implementation, movement detector 1846 can be a mechanical style switch that includes protruding actuator 1852 that may be pushed by touch pad 1844 to generate the movement signal. By way of example, the switch may be a tact or dome switch.
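As a rough illustration of the processing attributed to control system 1850, the sketch below estimates an angular location from a ring of capacitance readings and derives a direction and speed from successive estimates. The sensor count, sampling period, centroid method and rotational sign convention are all assumptions made for this example.

```python
# Rough illustration of a sensor controller's processing: estimate an
# angular location from a ring of capacitance readings, then derive
# direction and speed from successive estimates. All constants and the
# centroid method are assumptions, not the disclosed ASIC's algorithm.
import math

N_SENSORS = 16            # assumed number of elements arranged around the ring
SAMPLE_PERIOD_S = 0.01    # assumed 100 Hz reporting rate

def angular_location(readings):
    """Estimate the finger angle in degrees as a circular centroid of capacitance."""
    sx = sum(c * math.cos(2 * math.pi * i / N_SENSORS) for i, c in enumerate(readings))
    sy = sum(c * math.sin(2 * math.pi * i / N_SENSORS) for i, c in enumerate(readings))
    return math.degrees(math.atan2(sy, sx)) % 360.0

def motion(prev_angle, new_angle):
    """Return (direction, speed in deg/s); which sense is 'clockwise' is an assumption."""
    delta = (new_angle - prev_angle + 180.0) % 360.0 - 180.0   # shortest signed arc
    direction = "clockwise" if delta < 0 else "counter-clockwise"
    return direction, abs(delta) / SAMPLE_PERIOD_S

if __name__ == "__main__":
    frame1 = [0.0] * N_SENSORS
    frame2 = [0.0] * N_SENSORS
    frame1[2] = 1.0          # finger over sensor element 2 (about 45 degrees)
    frame2[3] = 1.0          # finger over element 3 one sample later (about 67.5 degrees)
    a1, a2 = angular_location(frame1), angular_location(frame2)
    print(motion(a1, a2))    # roughly ('counter-clockwise', 2250.0)
```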


Both touch pad 1844 and movement detector 1846 can be operatively coupled to computing device 1842 through communication interface 1854. The communication interface provides a connection point for direct or indirect connection between the input device and the electronic device. Communication interface 1854 may be wired (wires, cables, connectors) or wireless (e.g., transmitter/receiver).


Referring to computing device 1842, it may include processor 1857 (e.g., CPU or microprocessor) configured to execute instructions and to carry out operations associated with computing device 1842. For example, using instructions retrieved from memory, the processor can control the reception and manipulation of input and output data between components of computing device 1842. Processor 1857 can be configured to receive input from both movement detector 1846 and touch pad 1844 and can form a signal/command that may be dependent upon both of these inputs. In most cases, processor 1857 can execute instructions under the control of an operating system or other software. Processor 1857 may be a single-chip processor or may be implemented with multiple components.


Computing device 1842 may also include input/output (I/O) controller 1856 that can be operatively coupled to processor 1857. I/O controller 1856 can be integrated with processor 1857 or it may be a separate component as shown. I/O controller 1856 can generally be configured to control interactions with one or more I/O devices that may be coupled to the computing device 1842, as for example input device 1840 and orientation detector 1855, such as an accelerometer. I/O controller 1856 can generally operate by exchanging data between computing device 1842 and I/O devices that desire to communicate with computing device 1842.


Computing device 1842 may also include display controller 1858 that can be operatively coupled to processor 1857. Display controller 1858 can be integrated with processor 1857 or it may be a separate component as shown. Display controller 1858 can be configured to process display commands to produce text and graphics on display screen 1860. By way of example, display screen 1860 may be a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, liquid crystal display (e.g., active matrix, passive matrix and the like), cathode ray tube (CRT), plasma displays and the like. In the embodiment illustrated in FIG. 20, the display device corresponds to a liquid crystal display (LCD).


In some cases, processor 1857 together with an operating system operates to execute computer code and produce and use data. The computer code and data can reside within program storage area 1862 that may be operatively coupled to processor 1857. Program storage area 1862 can generally provide a place to hold data that may be used by computing device 1842. By way of example, the program storage area may include Read-Only Memory (ROM), Random-Access Memory (RAM), hard disk drive and/or the like. The computer code and data could also reside on a removable program medium and be loaded or installed onto the computing device when needed. In one embodiment, program storage area 1862 can be configured to store information for controlling how the tracking and movement signals generated by the input device may be used, either alone or in combination, for example, by computing device 1842 to generate an input event command, such as a single button press for example.
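One way such stored information might look is sketched below as a small lookup table that combines a movement signal with a tracking gesture to produce an input event command. The table format and the event names are invented for illustration and are not taken from the disclosure.

```python
# Invented-for-illustration example of information that program storage
# could hold: a table controlling how tracking and movement signals are
# combined, alone or together, into a single input event command.
EVENT_TABLE = {
    # (movement signal present, tracking gesture observed) -> input event command
    (True,  "none"):        "single_button_press",
    (True,  "rotate_cw"):   "select_and_scroll_down",
    (False, "rotate_cw"):   "scroll_down",
    (False, "rotate_ccw"):  "scroll_up",
}

def input_event(moved: bool, gesture: str) -> str:
    return EVENT_TABLE.get((moved, gesture), "no_op")

if __name__ == "__main__":
    print(input_event(True, "none"))        # single_button_press
    print(input_event(False, "rotate_cw"))  # scroll_down
```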



FIGS. 21A-21D illustrate applications of an input device according to some embodiments of the present disclosure. As previously mentioned, the input devices described herein can be integrated into an electronic device or they can be separate stand alone devices. FIGS. 21A-21D show some implementations of input device 1820 integrated into an electronic device. FIG. 21A shows input device 1820 incorporated into media player 1812. FIG. 21B shows input device 1820 incorporated into laptop computer 1814. FIGS. 21C and 21D, on the other hand, show some implementations of input device 1820 as a stand alone unit. FIG. 21C shows input device 1820 as a peripheral device that can be connected to desktop computer 1816. FIG. 21D shows input device 1820 as a remote control that wirelessly connects to docking station 1818 with media player 1822 docked therein. It should be noted, however, that in some embodiments the remote control can also be configured to interact with the media player (or other electronic device) directly, thereby eliminating the need for a docking station. An example of a docking station for a media player may be found in U.S. patent application Ser. No. 10/423,490, entitled “Media Player System,” filed Apr. 25, 2003, which is incorporated herein by reference in its entirety. It should be noted that these particular embodiments do not limit the present disclosure and that many other devices and configurations may be used.


Referring back to FIG. 21A, media player 1812, housing 1822 and display screen 1824 may generally correspond to those described above. As illustrated in the embodiment of FIG. 21A, display screen 1824 can be visible to a user of media player 1812 through opening 1825 in housing 1822 and through transparent wall 1826 disposed in front of opening 1825. Although transparent, transparent wall 1826 can be considered part of housing 1822 since it helps to define the shape or form of media player 1812.


Media player 1812 may also include touch pad 1820 such as any of those previously described. Touch pad 1820 can generally consist of touchable outer surface 1831 for receiving a finger for manipulation on touch pad 1820. Although not illustrated in the embodiment of FIG. 21A, beneath touchable outer surface 1831 a sensor arrangement can be configured in a manner as previously described. Information provided by the sensor arrangement can be used by media player 1812 to perform the desired control function on display screen 1824. For example, a user may easily scroll through a list of songs by swirling the finger around touch pad 1820.


In addition to the above, the touch pad may also include one or more movable button zones A-D as well as a center button E, for example. The button zones can be configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating media player 1812. By way of example, in the case of an MP3 music player, the button functions can be associated with opening a menu, playing a song, fast forwarding a song, seeking through a menu, making selections and the like. In some embodiments, the button functions can be implemented via a mechanical clicking action.


The position of touch pad 1820 relative to housing 1822 may be widely varied. For example, touch pad 1820 can be placed at any external surface (e.g., top, side, front, or back) of housing 1822 accessible to a user during manipulation of media player 1812. In some embodiments, touch sensitive surface 1831 of touch pad 1820 can be completely exposed to the user. In the embodiment illustrated in FIG. 21A, touch pad 1820 can be located in a lower front area of housing 1822. Furthermore, touch pad 1820 can be recessed below, level with, or extend above the surface of housing 1822. In the embodiment illustrated in FIG. 21A, touch sensitive surface 1831 of touch pad 1820 can be substantially flush with the external surface of housing 1822.


The shape of touch pad 1820 may also be widely varied. Although illustrated as circular in the embodiment of FIG. 21A, the touch pad can also be square, rectangular, triangular, and the like, for example. More particularly, the touch pad can be annular, i.e., shaped like or forming a ring. As such, the inner and outer perimeters of the touch pad can define the working boundary of the touch pad.


Media player 1812 may also include hold switch 1834. Hold switch 1834 can be configured to activate or deactivate the touch pad and/or buttons associated therewith for example. This can be generally done to prevent unwanted commands by the touch pad and/or buttons, as for example, when the media player is stored inside a user's pocket. When deactivated, signals from the buttons and/or touch pad cannot be sent or can be disregarded by the media player. When activated, signals from the buttons and/or touch pad can be sent and therefore received and processed by the media player.
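The gating behavior of the hold switch can be summarized in a few lines; the class and signal names below are illustrative only and are not the media player's actual software.

```python
# Minimal sketch of the hold-switch behavior: when the hold switch is
# engaged, button and touch pad signals are disregarded rather than
# forwarded for processing. Names are illustrative assumptions.
class InputGate:
    def __init__(self) -> None:
        self.hold_engaged = False

    def set_hold(self, engaged: bool) -> None:
        self.hold_engaged = engaged

    def forward(self, signal: str):
        """Return the signal for processing, or None while the pad is deactivated."""
        return None if self.hold_engaged else signal

if __name__ == "__main__":
    gate = InputGate()
    print(gate.forward("button_A_press"))   # button_A_press
    gate.set_hold(True)                     # e.g. player placed in a pocket
    print(gate.forward("button_A_press"))   # None -> command is ignored
```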


Moreover, media player 1812 may also include one or more headphone jacks 1836 and one or more data ports 1838. Headphone jack 1836 can be capable of receiving a headphone connector associated with headphones configured for listening to sound being outputted by media player 1812. Data port 1838, on the other hand, can be capable of receiving a data connector/cable assembly configured for transmitting and receiving data to and from a host device such as a general purpose computer (e.g., desktop computer, portable computer). By way of example, data port 1838 can be used to upload or download audio, video and other images to and from media player 1812. For example, the data port can be used to download songs and play lists, audio books, ebooks, photos, and the like into the storage mechanism of the media player.


Data port 1838 may be widely varied. For example, the data port can be a PS/2 port, a serial port, a parallel port, a USB port, a Firewire port and/or the like. In some embodiments, data port 1838 can be a radio frequency (RF) link or optical infrared (IR) link to eliminate the need for a cable. Although not illustrated in the embodiment of FIG. 21A, media player 1812 can also include a power port that receives a power connector/cable assembly configured for delivering power to media player 1812. In some cases, data port 1838 can serve as both a data and power port. In the embodiment illustrated in FIG. 21A, data port 1838 can be a USB port having both data and power capabilities.


Although only one data port may be shown, it should be noted that this does not limit the present disclosure and that multiple data ports may be incorporated into the media player. In a similar vein, the data port can include multiple data functionality, i.e., integrating the functionality of multiple data ports into a single data port. Furthermore, it should be noted that the position of the hold switch, headphone jack and data port on the housing may be widely varied, in that they are not limited to the positions shown in FIG. 21A. They can be positioned almost anywhere on the housing (e.g., front, back, sides, top, bottom). For example, the data port can be positioned on the top surface of the housing rather than the bottom surface as shown.



FIGS. 22A and 22B illustrate installation of an input device into a media player according to some embodiments of the present disclosure. By way of example, input device 1850 may correspond to any of those previously described and media player 1852 may correspond to the one shown in FIG. 21A. As shown, input device 1850 may include housing 1854 and touch pad assembly 1856. Media player 1852 may include shell or enclosure 1858. Front wall 1860 of shell 1858 may include opening 1862 for allowing access to touch pad assembly 1856 when input device 1850 is introduced into media player 1852. The inner side of front wall 1860 may include channel or track 1864 for receiving input device 1850 inside shell 1858 of media player 1852. Channel 1864 can be configured to receive the edges of housing 1854 of input device 1850 so that input device 1850 can be slid into its desired place within shell 1858. The channel can have a shape that generally coincides with the shape of housing 1854. During assembly, circuit board 1866 of touch pad assembly 1856 can be aligned with opening 1862, and cosmetic disc 1868 and button cap 1870 can be mounted onto the top side of circuit board 1866, for example. As shown in the embodiment illustrated in FIG. 22B, cosmetic disc 1868 can have a shape that may generally coincide with opening 1862. The input device can be held within the channel via a retaining mechanism such as screws, snaps, adhesives, press fit mechanisms, crush ribs and the like, for example.



FIG. 23 illustrates a simplified block diagram of a remote control incorporating an input device according to some embodiments of the present disclosure. By way of example, input device 1882 may generally correspond to any of the previously described input devices. In this particular embodiment, input device 1882 may correspond to the input device shown in FIGS. 18A-18C, and thus the input device may include touch pad 1884 and a plurality of switches 1886. Touch pad 1884 and switches 1886 can be operatively coupled to wireless transmitter 1888. Wireless transmitter 1888 can be configured to transmit information over a wireless communication link so that an electronic device that has receiving capabilities can receive the information over the wireless communication link. Wireless transmitter 1888 may be widely varied. For example, it can be based on wireless technologies such as FM, RF, Bluetooth, 802.11, UWB (ultra wide band), IR, magnetic link (induction) and the like. In the embodiment illustrated in FIG. 23, wireless transmitter 1888 can be based on IR. IR generally refers to wireless technologies that convey data through infrared radiation. As such, wireless transmitter 1888 may generally include IR controller 1890. IR controller 1890 can take the information reported from touch pad 1884 and switches 1886 and convert this information into infrared radiation, as for example using light emitting diode 1892.
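A hedged sketch of the reporting path follows: touch pad and switch data are packed into a small frame of the kind IR controller 1890 could modulate onto light emitting diode 1892. The frame layout, device address and checksum are invented for illustration and do not follow any specific IR protocol.

```python
# Illustrative packing of one touch pad / switch report into a byte frame.
# The layout, address and checksum are assumptions for this sketch only.
def build_ir_frame(switch_bits: int, wheel_position: int) -> bytes:
    """Pack one report into a 4-byte frame: address, switches, position, checksum."""
    address = 0xA7                           # assumed device address
    payload = [address, switch_bits & 0xFF, wheel_position & 0xFF]
    checksum = (-sum(payload)) & 0xFF        # simple two's-complement checksum
    return bytes(payload + [checksum])

if __name__ == "__main__":
    frame = build_ir_frame(switch_bits=0b0001, wheel_position=96)
    print(frame.hex())                       # a70160f8
    # An LED driver would then modulate these bytes onto an infrared carrier.
```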


It will be appreciated that the above description for clarity has described embodiments of the disclosure with reference to different functional units and processors. However, it will be apparent that any suitable distribution of functionality between different functional units or processors may be used without detracting from the disclosure. For example, functionality illustrated to be performed by separate processors or controllers may be performed by the same processors or controllers. Hence, references to specific functional units may be seen as references to suitable means for providing the described functionality rather than indicative of a strict logical or physical structure or organization.


The disclosure may be implemented in any suitable form, including hardware, software, firmware, or any combination of these. The disclosure may optionally be implemented partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of an embodiment of the disclosure may be physically, functionally, and logically implemented in any suitable way. Indeed, the functionality may be implemented in a single unit, in a plurality of units, or as part of other functional units. As such, the disclosure may be implemented in a single unit or may be physically and functionally distributed between different units and processors.


One skilled in the relevant art will recognize that many possible modifications and combinations of the disclosed embodiments can be used, while still employing the same basic underlying mechanisms and methodologies. The foregoing description, for purposes of explanation, has been written with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described to explain the principles of the disclosure and their practical applications, and to enable others skilled in the art to best utilize the disclosure and various embodiments with various modifications as suited to the particular use contemplated.

Claims
  • 1. An input device comprising: a touch sensitive input element, a touch sensitive surface arranged around the touch sensitive input element, the touch sensitive input element and touch sensitive surface being configured to enable relative displacement between the touch sensitive input element and the touch sensitive surface surrounding the touch sensitive input element when the touch sensitive input element is depressed relative to the touch sensitive surface, and a controller that detects input sensed by at least one of the touch sensitive input element and the touch sensitive surface.
  • 2. The input device of claim 1 wherein the controller detects a gesture traversing the touch sensitive surface and the touch sensitive input element.
  • 3. The input device of claim 1, wherein the touch sensitive surface is enabled by a first plurality of circumferentially arranged sensor elements, the first plurality of sensor elements being concentrically arranged relative to the touch sensitive input element, and wherein the controller is configured to detect a gesture traversing the first plurality of sensor elements and the touch sensitive input element.
  • 4. The input device of claim 3, wherein the touch sensitive surface is enabled by a second plurality of circumferentially arranged sensor elements, the second plurality of sensor elements being concentrically arranged relative to the touch sensitive input element, and wherein the controller is configured to detect a gesture traversing the first plurality of sensor elements, the second plurality of sensor elements and the touch sensitive input element.
  • 5. The input device of claim 1, wherein a user interface application is configured to be executed in response to the detection of the gesture.
  • 6. The input device of claim 1, wherein the gesture comprises a linear motion.
  • 7. The input device of claim 1, wherein the gesture comprises a linear motion and a user interface application is configured to change a display of content in response to the detection of the gesture.
  • 8. The input device of claim 3, wherein the first plurality of sensor elements comprise capacitive sensor elements.
  • 9. The input device of claim 3, wherein the first plurality of sensor elements are concentrically arranged relative to one or more capacitive sensor elements.
  • 10. The input device of claim 3, wherein the first plurality of sensor elements are concentrically arranged relative to a trackpad.
  • 11. The input device of claim 1, wherein the controller is configured to detect a simultaneous presence of two or more objects on the touch sensitive surface and the touch sensitive input element.
  • 12. The input device of claim 11, wherein the controller is configured to execute a user interface application in response to the simultaneous detection of the objects.
  • 13. The input device of claim 12, wherein the simultaneous detection comprises the controller sensing a first object on the touch sensitive surface, and the controller sensing a second object on the touch sensitive surface.
  • 14. The input device of claim 12, wherein the simultaneous detection comprises the controller sensing a first object on the touch sensitive input element, and the controller sensing a second object on the touch sensitive surface.
  • 15. The input device of claim 12, wherein the simultaneous detection comprises the controller detecting a pinching gesture by two objects.
  • 16. The input device of claim 12, wherein the simultaneous detection comprises the controller detecting a presence of a first of two objects on the touch sensitive input element, and the controller detecting a rotation of a second of the two objects on the touch sensitive surface.
  • 17. The input device of claim 5, wherein the user interface application comprises a zoom-in or a zoom-out operation executed in response to the rotation of the second object on the touch sensitive surface while the first object touches the touch sensitive input element.
  • 18. The input device of claim 1, wherein the touch sensitive input element comprises a fingerprint sensor.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/205,757, filed Sep. 5, 2008, which claims priority under 35 USC 119(e) to U.S. Provisional Application No. 60/992,056, filed Dec. 3, 2007, and U.S. Provisional Application No. 61/017,436, filed Dec. 28, 2007, the entireties of which are incorporated herein by reference.

US Referenced Citations (535)
Number Name Date Kind
1061578 Wischhusen et al. May 1913 A
2063276 Thomas Dec 1936 A
2798907 Schneider Jul 1957 A
2903229 Landge Sep 1959 A
2945111 McCormick Jul 1960 A
3005055 Mattke Oct 1961 A
3965399 Walker et al. Jun 1976 A
3996441 Ohashi Dec 1976 A
4029915 Ojima Jun 1977 A
4103252 Bobick Jul 1978 A
4110749 Janko et al. Aug 1978 A
4115670 Chandler Sep 1978 A
4121204 Welch et al. Oct 1978 A
4129747 Pepper Dec 1978 A
4158216 Bigelow Jun 1979 A
4242676 Piguet et al. Dec 1980 A
4246452 Chandler Jan 1981 A
4264903 Bigelow Apr 1981 A
4266144 Bristol May 1981 A
4293734 Pepper, Jr. Oct 1981 A
D264969 McGourty Jun 1982 S
4338502 Hashimoto et al. Jul 1982 A
4380007 Steinegger Apr 1983 A
4380040 Posset Apr 1983 A
4394649 Suchoff et al. Jul 1983 A
4475008 Doi et al. Oct 1984 A
4570149 Thornburg et al. Feb 1986 A
4583161 Gunderson et al. Apr 1986 A
4587378 Moore May 1986 A
4604786 Howie, Jr. Aug 1986 A
4613736 Shichijo et al. Sep 1986 A
4644100 Brenner et al. Feb 1987 A
4719524 Morishima et al. Jan 1988 A
4734034 Maness et al. Mar 1988 A
4736191 Matzke et al. Apr 1988 A
4739191 Puar Apr 1988 A
4739299 Eventoff et al. Apr 1988 A
4752655 Tajiri et al. Jun 1988 A
4755765 Ferland Jul 1988 A
4764717 Tucker et al. Aug 1988 A
4771139 DeSmet Sep 1988 A
4798919 Miessler et al. Jan 1989 A
4810992 Eventoff Mar 1989 A
4822957 Talmage, Jr. et al. Apr 1989 A
4831359 Newell May 1989 A
4849852 Mullins Jul 1989 A
4856993 Maness et al. Aug 1989 A
4860768 Hon et al. Aug 1989 A
4866602 Hall Sep 1989 A
4876524 Jenkins Oct 1989 A
4897511 Itaya et al. Jan 1990 A
4914624 Dunthorn Apr 1990 A
4917516 Retter Apr 1990 A
4943889 Ohmatoi Jul 1990 A
4951036 Grueter et al. Aug 1990 A
4954823 Binstead Sep 1990 A
4976435 Shatford et al. Dec 1990 A
4990900 Kikuchi Feb 1991 A
5008497 Asher Apr 1991 A
5036321 Leach et al. Jul 1991 A
5053757 Meadows Oct 1991 A
5086870 Bolduc Feb 1992 A
5125077 Hall Jun 1992 A
5159159 Asher Oct 1992 A
5179648 Hauck Jan 1993 A
5186646 Pederson Feb 1993 A
5192082 Inoue et al. Mar 1993 A
5193669 Demeo et al. Mar 1993 A
5231326 Echols Jul 1993 A
5237311 Mailey et al. Aug 1993 A
5278362 Ohashi Jan 1994 A
5305017 Gerpheide Apr 1994 A
5313027 Inoue et al. May 1994 A
D349280 Kaneko Aug 1994 S
5339213 O'Callaghan Aug 1994 A
5367199 Lefkowitz et al. Nov 1994 A
5374787 Miller et al. Dec 1994 A
5379057 Clough et al. Jan 1995 A
5404152 Nagai Apr 1995 A
5408621 Ben-Arie Apr 1995 A
5414445 Kaneko et al. May 1995 A
5416498 Grant May 1995 A
5424756 Ho et al. Jun 1995 A
5432531 Calder et al. Jul 1995 A
5438331 Gilligan et al. Aug 1995 A
D362431 Kaneko et al. Sep 1995 S
5450075 Waddington Sep 1995 A
5453761 Tanaka Sep 1995 A
5473343 Kimmich et al. Dec 1995 A
5473344 Bacon et al. Dec 1995 A
5479192 Carroll, Jr. et al. Dec 1995 A
5494157 Golenz et al. Feb 1996 A
5495566 Kwatinetz Feb 1996 A
5508703 Okamura et al. Apr 1996 A
5508717 Miller Apr 1996 A
5543588 Bisset et al. Aug 1996 A
5543591 Gillespie et al. Aug 1996 A
5555004 Ono et al. Sep 1996 A
5559301 Bryan, Jr. et al. Sep 1996 A
5559943 Cyr et al. Sep 1996 A
5561445 Miwa et al. Oct 1996 A
5564112 Hayes et al. Oct 1996 A
5565887 McCambridge et al. Oct 1996 A
5578817 Bidiville et al. Nov 1996 A
5581670 Bier et al. Dec 1996 A
5585823 Duchon et al. Dec 1996 A
5589856 Stein et al. Dec 1996 A
5589893 Gaughan et al. Dec 1996 A
5596347 Robertson et al. Jan 1997 A
5596697 Foster et al. Jan 1997 A
5598183 Robertson et al. Jan 1997 A
5611040 Brewer et al. Mar 1997 A
5611060 Belfiore et al. Mar 1997 A
5613137 Bertram et al. Mar 1997 A
5617114 Bier et al. Apr 1997 A
5627531 Posso et al. May 1997 A
5632679 Tremmel May 1997 A
5640258 Kurashima et al. Jun 1997 A
5648642 Miller et al. Jul 1997 A
D382550 Kaneko et al. Aug 1997 S
5657012 Tait Aug 1997 A
5661632 Register Aug 1997 A
D385542 Kaneko et al. Oct 1997 S
5675362 Clough et al. Oct 1997 A
5689285 Asher Nov 1997 A
5721849 Amro Feb 1998 A
5726687 Belfiore et al. Mar 1998 A
5729219 Armstrong et al. Mar 1998 A
5730165 Philipp Mar 1998 A
5748185 Stephan et al. May 1998 A
5751274 Davis May 1998 A
5754890 Holmdahl et al. May 1998 A
5764066 Novak et al. Jun 1998 A
5777605 Yoshinobu et al. Jul 1998 A
5786818 Brewer et al. Jul 1998 A
5790769 Buxton et al. Aug 1998 A
5798752 Buxton et al. Aug 1998 A
5805144 Scholder et al. Sep 1998 A
5808602 Sellers Sep 1998 A
5812239 Eger Sep 1998 A
5812498 Teres Sep 1998 A
5815141 Phares Sep 1998 A
5825351 Tam Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5825353 Will Oct 1998 A
5828364 Siddiqui Oct 1998 A
5838304 Hall Nov 1998 A
5841078 Miller et al. Nov 1998 A
5841423 Carroll, Jr. et al. Nov 1998 A
D402281 Ledbetter et al. Dec 1998 S
5850213 Imai et al. Dec 1998 A
5856645 Norton Jan 1999 A
5856822 Du et al. Jan 1999 A
5859629 Tognazzini Jan 1999 A
5861875 Gerpheide Jan 1999 A
5869791 Young Feb 1999 A
5875311 Bertram et al. Feb 1999 A
5883619 Ho et al. Mar 1999 A
5889236 Gillespie et al. Mar 1999 A
5889511 Ong et al. Mar 1999 A
5894117 Kamishima Apr 1999 A
5903229 Kishi May 1999 A
5907152 Dandiliker et al. May 1999 A
5907318 Medina May 1999 A
5909211 Combs et al. Jun 1999 A
5910802 Shields et al. Jun 1999 A
5914706 Kono Jun 1999 A
5923388 Kurashima et al. Jul 1999 A
D412940 Kato et al. Aug 1999 S
5933102 Miller et al. Aug 1999 A
5933141 Smith Aug 1999 A
5936619 Nagasaki et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5953000 Weirich Sep 1999 A
5956019 Bang et al. Sep 1999 A
5959610 Silfvast Sep 1999 A
5959611 Smailagic et al. Sep 1999 A
5964661 Dodge Oct 1999 A
5973668 Watanabe Oct 1999 A
6000000 Hawkins et al. Dec 1999 A
6002093 Hrehor et al. Dec 1999 A
6002389 Kasser Dec 1999 A
6005299 Hengst Dec 1999 A
6025832 Sudo et al. Feb 2000 A
6031518 Adams et al. Feb 2000 A
6034672 Gaultier et al. Mar 2000 A
6057829 Silfvast May 2000 A
6075533 Chang Jun 2000 A
6084574 Bidiville Jul 2000 A
D430169 Scibora Aug 2000 S
6097372 Suzuki Aug 2000 A
6104790 Narayanaswami Aug 2000 A
6122526 Parulski et al. Sep 2000 A
6124587 Bidiville et al. Sep 2000 A
6128006 Rosenberg et al. Oct 2000 A
6131048 Sudo et al. Oct 2000 A
6141068 Iijima Oct 2000 A
6147856 Karidis Nov 2000 A
6163312 Furuya Dec 2000 A
6166721 Kuroiwa et al. Dec 2000 A
6179496 Chou Jan 2001 B1
6181322 Nanavati Jan 2001 B1
D437860 Suzuki et al. Feb 2001 S
6188391 Seely et al. Feb 2001 B1
6188393 Shu Feb 2001 B1
6191774 Schena et al. Feb 2001 B1
6198054 Janniere Mar 2001 B1
6198473 Armstrong Mar 2001 B1
6211861 Rosenberg et al. Apr 2001 B1
6219038 Cho Apr 2001 B1
6222528 Gerpheide et al. Apr 2001 B1
D442592 Ledbetter et al. May 2001 S
6225976 Yates et al. May 2001 B1
6225980 Weiss et al. May 2001 B1
6226534 Aizawa May 2001 B1
6227966 Yokoi May 2001 B1
D443616 Fisher et al. Jun 2001 S
6243078 Rosenberg Jun 2001 B1
6243080 Molne Jun 2001 B1
6243646 Ozaki et al. Jun 2001 B1
6248017 Roach Jun 2001 B1
6254477 Sasaki et al. Jul 2001 B1
6256011 Culver Jul 2001 B1
6259491 Ekedahl et al. Jul 2001 B1
6262717 Donohue et al. Jul 2001 B1
6262785 Kim Jul 2001 B1
6266050 Oh et al. Jul 2001 B1
6285211 Sample et al. Sep 2001 B1
D448810 Goto Oct 2001 S
6297795 Kato et al. Oct 2001 B1
6297811 Kent et al. Oct 2001 B1
6300946 Lincke et al. Oct 2001 B1
6307539 Suzuki Oct 2001 B2
D450713 Masamitsu et al. Nov 2001 S
6314483 Goto et al. Nov 2001 B1
6321441 Davidson et al. Nov 2001 B1
6323845 Robbins Nov 2001 B1
D452250 Chan Dec 2001 S
6337678 Fish Jan 2002 B1
6340800 Zhai et al. Jan 2002 B1
D454568 Andre et al. Mar 2002 S
6357887 Novak Mar 2002 B1
D455793 Lin Apr 2002 S
6373265 Morimoto et al. Apr 2002 B1
6373470 Andre et al. Apr 2002 B1
6377530 Burrows Apr 2002 B1
6396523 Segal et al. May 2002 B1
6424338 Anderson Jul 2002 B1
6429846 Rosenberg et al. Aug 2002 B2
6429852 Adams et al. Aug 2002 B1
6452514 Philipp Sep 2002 B1
6465271 Ko et al. Oct 2002 B1
6473069 Gerpheide Oct 2002 B1
6492602 Asai et al. Dec 2002 B2
6492979 Kent et al. Dec 2002 B1
6496181 Bomer et al. Dec 2002 B1
6497412 Bramm Dec 2002 B1
D468365 Bransky et al. Jan 2003 S
D469109 Andre et al. Jan 2003 S
D472245 Andre et al. Mar 2003 S
6546231 Someya et al. Apr 2003 B1
6563487 Martin et al. May 2003 B2
6587091 Serpa Jul 2003 B2
6606244 Liu et al. Aug 2003 B1
6618909 Yang Sep 2003 B1
6636197 Goldenberg et al. Oct 2003 B1
6639584 Li Oct 2003 B1
6640250 Chang et al. Oct 2003 B1
6650975 Ruffner Nov 2003 B2
D483809 Lim Dec 2003 S
6658773 Rohne et al. Dec 2003 B2
6664951 Fujii et al. Dec 2003 B1
6677927 Bruck et al. Jan 2004 B1
6678891 Wilcox et al. Jan 2004 B1
6686904 Sherman et al. Feb 2004 B1
6686906 Salminen et al. Feb 2004 B2
6703550 Chu Mar 2004 B2
6724817 Simpson et al. Apr 2004 B1
6727889 Shaw Apr 2004 B2
D489731 Huang May 2004 S
6738045 Hinckley et al. May 2004 B2
6750803 Yates et al. Jun 2004 B2
6781576 Tamura Aug 2004 B2
6784384 Park et al. Aug 2004 B2
6788288 Ano Sep 2004 B2
6791533 Su Sep 2004 B2
6795057 Gordon Sep 2004 B2
D497618 Andre et al. Oct 2004 S
6810271 Wood et al. Oct 2004 B1
6822640 Derocher Nov 2004 B2
6834975 Chu-Chia et al. Dec 2004 B2
6844872 Farag et al. Jan 2005 B1
6855899 Sotome Feb 2005 B2
6865718 Levi Montalcini Mar 2005 B2
6886842 Vey et al. May 2005 B2
6894916 Reohr et al. May 2005 B2
D506476 Andre et al. Jun 2005 S
6922189 Fujiyoshi Jul 2005 B2
6930494 Tesdahl et al. Aug 2005 B2
6958614 Morimoto Oct 2005 B2
6977808 Lam et al. Dec 2005 B2
6978127 Bulthuis et al. Dec 2005 B1
6985137 Kaikuranta Jan 2006 B2
7006077 Uusimaki Feb 2006 B1
7019225 Matsumoto et al. Mar 2006 B2
7046230 Zadesky et al. May 2006 B2
7050292 Shimura et al. May 2006 B2
7069044 Okada et al. Jun 2006 B2
7078633 Ihalainen Jul 2006 B2
7084856 Huppi Aug 2006 B2
7113196 Kerr Sep 2006 B2
7117136 Rosedale Oct 2006 B1
7119792 Andre et al. Oct 2006 B1
7215319 Kamijo et al. May 2007 B2
7233318 Farag et al. Jun 2007 B1
7236154 Kerr et al. Jun 2007 B1
7236159 Siversson Jun 2007 B1
7253643 Seguine Aug 2007 B1
7279647 Philipp Oct 2007 B2
7288732 Hashida Oct 2007 B2
7297883 Rochon et al. Nov 2007 B2
7310089 Baker et al. Dec 2007 B2
7312785 Tsuk et al. Dec 2007 B2
7321103 Nakanishi et al. Jan 2008 B2
7333092 Zadesky et al. Feb 2008 B2
7348898 Ono Mar 2008 B2
7382139 Mackey Jun 2008 B2
7394038 Chang Jul 2008 B2
7395081 Bonnelykke Kristensen Jul 2008 B2
7397467 Park et al. Jul 2008 B2
7439963 Geaghan et al. Oct 2008 B2
7466307 Trent et al. Dec 2008 B2
7479949 Jobs et al. Jan 2009 B2
7486323 Lee et al. Feb 2009 B2
7502016 Trent, Jr. et al. Mar 2009 B2
7503193 Schoene et al. Mar 2009 B2
7593782 Jobs et al. Sep 2009 B2
7645955 Huang et al. Jan 2010 B2
7671837 Forsblad et al. Mar 2010 B2
7708051 Katsumi et al. May 2010 B2
7772507 Orr et al. Aug 2010 B2
8581879 Vieta Nov 2013 B2
20010011991 Wang et al. Aug 2001 A1
20010011993 Saarinen Aug 2001 A1
20010033270 Osawa et al. Oct 2001 A1
20010043545 Aratani Nov 2001 A1
20010050673 Davenport Dec 2001 A1
20010051046 Watanabe et al. Dec 2001 A1
20020000978 Gerpheide Jan 2002 A1
20020011993 Lui et al. Jan 2002 A1
20020027547 Kamijo Mar 2002 A1
20020030665 Ano Mar 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020039493 Tanaka Apr 2002 A1
20020045960 Phillips et al. Apr 2002 A1
20020071550 Pletikosa Jun 2002 A1
20020089545 Levi Montalcini Jul 2002 A1
20020103796 Hartley Aug 2002 A1
20020118131 Yates et al. Aug 2002 A1
20020118169 Hinckley et al. Aug 2002 A1
20020145594 Derocher Oct 2002 A1
20020154090 Lin Oct 2002 A1
20020158844 McLoone et al. Oct 2002 A1
20020164156 Bilbrey Nov 2002 A1
20020168947 Lemley Nov 2002 A1
20020180701 Hayama et al. Dec 2002 A1
20020196239 Lee Dec 2002 A1
20030002246 Kerr Jan 2003 A1
20030025679 Taylor et al. Feb 2003 A1
20030028346 Sinclair et al. Feb 2003 A1
20030043121 Chen Mar 2003 A1
20030043174 Hinckley et al. Mar 2003 A1
20030050092 Yun Mar 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076303 Huppi Apr 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030091377 Hsu et al. May 2003 A1
20030095095 Pihlaja May 2003 A1
20030095096 Robbin et al. May 2003 A1
20030098851 Brink May 2003 A1
20030103043 Mulligan et al. Jun 2003 A1
20030122792 Yamamoto et al. Jul 2003 A1
20030135292 Husgafvel et al. Jul 2003 A1
20030142081 Iizuka et al. Jul 2003 A1
20030184517 Senzui et al. Oct 2003 A1
20030197740 Reponen Oct 2003 A1
20030206202 Moriya Nov 2003 A1
20030210537 Engelmann Nov 2003 A1
20030224831 Engstrom et al. Dec 2003 A1
20040027341 Derocher Feb 2004 A1
20040074756 Kawakami et al. Apr 2004 A1
20040080682 Dalton Apr 2004 A1
20040109357 Cernea et al. Jun 2004 A1
20040150619 Baudisch et al. Aug 2004 A1
20040156192 Kerr et al. Aug 2004 A1
20040178997 Gillespie et al. Sep 2004 A1
20040200699 Matsumoto et al. Oct 2004 A1
20040215986 Shakkarwar Oct 2004 A1
20040224638 Fadell et al. Nov 2004 A1
20040239622 Proctor et al. Dec 2004 A1
20040252109 Trent, Jr. et al. Dec 2004 A1
20040252867 Lan et al. Dec 2004 A1
20040253989 Tupler et al. Dec 2004 A1
20040263388 Krumm et al. Dec 2004 A1
20040267874 Westberg et al. Dec 2004 A1
20050012644 Hurst et al. Jan 2005 A1
20050017957 Yi Jan 2005 A1
20050024341 Gillespie et al. Feb 2005 A1
20050030048 Bolender Feb 2005 A1
20050052425 Zadesky et al. Mar 2005 A1
20050052426 Hagermoser et al. Mar 2005 A1
20050052429 Philipp Mar 2005 A1
20050068304 Lewis et al. Mar 2005 A1
20050083299 Nagasaka Apr 2005 A1
20050083307 Aufderheide Apr 2005 A1
20050090288 Stohr et al. Apr 2005 A1
20050104867 Westerman et al. May 2005 A1
20050110768 Marriott et al. May 2005 A1
20050129199 Abe Jun 2005 A1
20050139460 Hosaka Jun 2005 A1
20050140657 Park et al. Jun 2005 A1
20050143124 Kennedy et al. Jun 2005 A1
20050156881 Trent et al. Jul 2005 A1
20050162402 Watanachote Jul 2005 A1
20050204309 Szeto Sep 2005 A1
20050237308 Autio et al. Oct 2005 A1
20060026521 Hotelling et al. Feb 2006 A1
20060026535 Hotelling et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060032680 Elias et al. Feb 2006 A1
20060038791 Mackey Feb 2006 A1
20060066582 Lyon et al. Mar 2006 A1
20060066588 Lyon et al. Mar 2006 A1
20060095848 Naik May 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060131156 Voelckers Jun 2006 A1
20060143574 Ito et al. Jun 2006 A1
20060174568 Kinoshita et al. Aug 2006 A1
20060181517 Zadesky et al. Aug 2006 A1
20060197750 Kerr et al. Sep 2006 A1
20060232557 Fallot-Burghardt Oct 2006 A1
20060236262 Bathiche et al. Oct 2006 A1
20060250377 Zadesky et al. Nov 2006 A1
20060274042 Krah et al. Dec 2006 A1
20060274905 Lindahl et al. Dec 2006 A1
20060279896 Bruwer Dec 2006 A1
20060284836 Philipp Dec 2006 A1
20070013671 Zadesky et al. Jan 2007 A1
20070018970 Tabasso et al. Jan 2007 A1
20070052044 Forsblad et al. Mar 2007 A1
20070052691 Zadesky et al. Mar 2007 A1
20070080936 Tsuk et al. Apr 2007 A1
20070080938 Robbin et al. Apr 2007 A1
20070080952 Lynch et al. Apr 2007 A1
20070083822 Robbin et al. Apr 2007 A1
20070085841 Tsuk et al. Apr 2007 A1
20070097086 Battles et al. May 2007 A1
20070120834 Boillot May 2007 A1
20070126696 Boillot Jun 2007 A1
20070152975 Ogihara Jul 2007 A1
20070152977 Ng et al. Jul 2007 A1
20070152983 McKillop et al. Jul 2007 A1
20070155434 Jobs et al. Jul 2007 A1
20070157089 Van Os et al. Jul 2007 A1
20070242057 Zadesky et al. Oct 2007 A1
20070247421 Orsley et al. Oct 2007 A1
20070247443 Philipp Oct 2007 A1
20070271516 Carmichael Nov 2007 A1
20070273671 Zadesky et al. Nov 2007 A1
20070276525 Zadesky et al. Nov 2007 A1
20070279394 Lampell et al. Dec 2007 A1
20070285404 Rimon et al. Dec 2007 A1
20070290990 Robbin et al. Dec 2007 A1
20070291016 Philipp Dec 2007 A1
20070296709 GuangHai Dec 2007 A1
20080006453 Hotelling et al. Jan 2008 A1
20080006454 Hotelling Jan 2008 A1
20080007533 Hotelling et al. Jan 2008 A1
20080007539 Hotelling et al. Jan 2008 A1
20080012837 Marriott et al. Jan 2008 A1
20080018615 Zadesky et al. Jan 2008 A1
20080018616 Lampell et al. Jan 2008 A1
20080018617 Ng et al. Jan 2008 A1
20080036473 Jansson Feb 2008 A1
20080036734 Forsblad et al. Feb 2008 A1
20080060925 Weber et al. Mar 2008 A1
20080062141 Chandhri Mar 2008 A1
20080069412 Champagne et al. Mar 2008 A1
20080079699 Mackey Apr 2008 A1
20080087476 Prest Apr 2008 A1
20080088582 Prest Apr 2008 A1
20080088596 Prest Apr 2008 A1
20080088597 Prest Apr 2008 A1
20080088600 Prest Apr 2008 A1
20080094352 Tsuk et al. Apr 2008 A1
20080098330 Tsuk et al. Apr 2008 A1
20080110739 Peng et al. May 2008 A1
20080111795 Bollinger May 2008 A1
20080143681 XiaoPing Jun 2008 A1
20080158145 Westerman Jul 2008 A1
20080165141 Christie Jul 2008 A1
20080165158 Hotelling et al. Jul 2008 A1
20080196945 Konstas Aug 2008 A1
20080202824 Philipp et al. Aug 2008 A1
20080209442 Setlur et al. Aug 2008 A1
20080264767 Chen et al. Oct 2008 A1
20080280651 Duarte Nov 2008 A1
20080284742 Prest Nov 2008 A1
20080293274 Milan Nov 2008 A1
20090021267 Golovchenko et al. Jan 2009 A1
20090026558 Bauer et al. Jan 2009 A1
20090033635 Wai Feb 2009 A1
20090036176 Ure Feb 2009 A1
20090058687 Rothkopf et al. Mar 2009 A1
20090058801 Bull Mar 2009 A1
20090058802 Orsley et al. Mar 2009 A1
20090073130 Weber et al. Mar 2009 A1
20090078551 Kang Mar 2009 A1
20090109181 Hui et al. Apr 2009 A1
20090141046 Rathnam et al. Jun 2009 A1
20090160771 Hinckley et al. Jun 2009 A1
20090179854 Weber et al. Jul 2009 A1
20090197059 Weber et al. Aug 2009 A1
20090229892 Fisher et al. Sep 2009 A1
20090273573 Hotelling Nov 2009 A1
20100058251 Rottler et al. Mar 2010 A1
20100060568 Fisher et al. Mar 2010 A1
20100073319 Lyon et al. Mar 2010 A1
20100149127 Fisher et al. Jun 2010 A1
20100289759 Fisher et al. Nov 2010 A1
20100313409 Weber et al. Dec 2010 A1
20110005845 Hotelling et al. Jan 2011 A1
20130135195 Josephson et al. May 2013 A1
20130271422 Hotelling et al. Oct 2013 A1
20130307818 Pope et al. Nov 2013 A1
Foreign Referenced Citations (202)
Number Date Country
1139235 Jan 1997 CN
1455615 Nov 2003 CN
1499356 May 2004 CN
1659506 Aug 2005 CN
3615742 Nov 1987 DE
19722636 Dec 1998 DE
10022537 Nov 2000 DE
20019074 Feb 2001 DE
10 2004 043 663 Apr 2006 DE
0178157 Apr 1986 EP
0419145 Mar 1991 EP
0 498 540 Aug 1992 EP
0 521 683 Jan 1993 EP
0 674 288 Sep 1995 EP
0 731 407 Sep 1996 EP
0 551 778 Jan 1997 EP
0 880 091 Nov 1998 EP
1 026 713 Aug 2000 EP
1 081 922 Mar 2001 EP
1 098 241 May 2001 EP
1 133 057 Sep 2001 EP
1 162 826 Dec 2001 EP
1 168 396 Jan 2002 EP
1 205 836 May 2002 EP
1 244 053 Sep 2002 EP
1 251 455 Oct 2002 EP
1 263 193 Dec 2002 EP
1 347 481 Sep 2003 EP
1 376 326 Jan 2004 EP
1 467 392 Oct 2004 EP
1 482 401 Dec 2004 EP
1 496 467 Jan 2005 EP
1 517 228 Mar 2005 EP
1 542 437 Jun 2005 EP
1 589 407 Oct 2005 EP
1 784 058 May 2007 EP
1 841 188 Oct 2007 EP
1 850 218 Oct 2007 EP
1 876 711 Jan 2008 EP
2 686 440 Jul 1993 FR
2015167 Sep 1979 GB
2072389 Sep 1981 GB
2315186 Jan 1998 GB
2333215 Jul 1999 GB
2391060 Jan 2004 GB
2402105 Dec 2004 GB
57-95722 Jun 1982 JP
57-97626 Jun 1982 JP
61-117619 Jun 1986 JP
61-124009 Jun 1986 JP
63-20411 Jan 1988 JP
63-106826 May 1988 JP
63-181022 Jul 1988 JP
63-298518 Dec 1988 JP
03-57617 Jun 1991 JP
3-192418 Aug 1991 JP
4-32920 Feb 1992 JP
4-205408 Jul 1992 JP
5-041135 Feb 1993 JP
5-080938 Apr 1993 JP
5-101741 Apr 1993 JP
5-36623 May 1993 JP
5-189110 Jul 1993 JP
5-205565 Aug 1993 JP
5-211021 Aug 1993 JP
5-217464 Aug 1993 JP
5-233141 Sep 1993 JP
5-262276 Oct 1993 JP
5-265656 Oct 1993 JP
5-274956 Oct 1993 JP
5-289811 Nov 1993 JP
5-298955 Nov 1993 JP
5-325723 Dec 1993 JP
6-20570 Jan 1994 JP
6-084428 Mar 1994 JP
6-089636 Mar 1994 JP
6-96639 Apr 1994 JP
6-111695 Apr 1994 JP
6-139879 May 1994 JP
6-187078 Jul 1994 JP
6-208433 Jul 1994 JP
6-267382 Sep 1994 JP
6-283993 Oct 1994 JP
6-333459 Dec 1994 JP
7-107574 Apr 1995 JP
7-41882 Jul 1995 JP
7-201249 Aug 1995 JP
7-201256 Aug 1995 JP
7-253838 Oct 1995 JP
7-261899 Oct 1995 JP
7-261922 Oct 1995 JP
7-296670 Nov 1995 JP
7-319001 Dec 1995 JP
8-016292 Jan 1996 JP
8-115158 May 1996 JP
8-203387 Aug 1996 JP
8-293226 Nov 1996 JP
8-298045 Nov 1996 JP
8-299541 Nov 1996 JP
8-316664 Nov 1996 JP
9-044289 Feb 1997 JP
9-069023 Mar 1997 JP
9-128148 May 1997 JP
9-134248 May 1997 JP
9-218747 Aug 1997 JP
9-230993 Sep 1997 JP
9-231858 Sep 1997 JP
9-233161 Sep 1997 JP
9-251347 Sep 1997 JP
9-258895 Oct 1997 JP
9-288926 Nov 1997 JP
9-512979 Dec 1997 JP
10-63467 Mar 1998 JP
10-74127 Mar 1998 JP
10-074429 Mar 1998 JP
10-198507 Jul 1998 JP
10-227878 Aug 1998 JP
10-240693 Sep 1998 JP
10-320322 Dec 1998 JP
10-326149 Dec 1998 JP
11-24834 Jan 1999 JP
11-184607 Jul 1999 JP
11-194863 Jul 1999 JP
11-194872 Jul 1999 JP
11-194882 Jul 1999 JP
11-194883 Jul 1999 JP
11-194891 Jul 1999 JP
11-195353 Jul 1999 JP
11-203045 Jul 1999 JP
11-212725 Aug 1999 JP
11-272378 Oct 1999 JP
11-338628 Dec 1999 JP
2000-200147 Jul 2000 JP
2000-215549 Aug 2000 JP
2000-267777 Sep 2000 JP
2000-267786 Sep 2000 JP
2000-267797 Sep 2000 JP
2000-353045 Dec 2000 JP
2001-11769 Jan 2001 JP
2001-22508 Jan 2001 JP
2001-184158 Jul 2001 JP
3085481 Feb 2002 JP
2002-215311 Aug 2002 JP
2003-015796 Jan 2003 JP
2003-060754 Feb 2003 JP
2003-099198 Apr 2003 JP
2003-150303 May 2003 JP
2003-517674 May 2003 JP
2003-280799 Oct 2003 JP
2003-280807 Oct 2003 JP
2004-362097 Dec 2004 JP
2005-251218 Sep 2005 JP
2005-285140 Oct 2005 JP
2005-293606 Oct 2005 JP
2006-004453 Jan 2006 JP
2006-178962 Jul 2006 JP
3852854 Dec 2006 JP
2007-123473 May 2007 JP
1998-71394 Oct 1998 KR
1999-50198 Jul 1999 KR
2000-0008579 Feb 2000 KR
2001-0052016 Jun 2001 KR
2001-108361 Dec 2001 KR
2002-65059 Aug 2002 KR
10-2006-0021678 Mar 2006 KR
431607 Apr 2001 TW
00470193 Dec 2001 TW
547716 Aug 2003 TW
I220491 Aug 2004 TW
WO-9417494 Aug 1994 WO
WO-9500897 Jan 1995 WO
WO-9627968 Sep 1996 WO
WO-9814863 Apr 1998 WO
WO-9949443 Sep 1999 WO
WO-0079772 Dec 2000 WO
WO-0102949 Jan 2001 WO
WO-0144912 Jun 2001 WO
WO-0208881 Jan 2002 WO
WO-03025960 Mar 2003 WO
WO-03044645 May 2003 WO
WO-03044956 May 2003 WO
WO-03088176 Oct 2003 WO
WO-03090008 Oct 2003 WO
WO-2004001573 Dec 2003 WO
WO-2004040606 May 2004 WO
WO-2004091956 Oct 2004 WO
WO-2005055620 Jun 2005 WO
WO-2005076117 Aug 2005 WO
WO-2005114369 Dec 2005 WO
WO-2005124526 Dec 2005 WO
WO-2006020305 Feb 2006 WO
WO-2006021211 Mar 2006 WO
WO-2006037545 Apr 2006 WO
WO 2006104745 Oct 2006 WO
WO-2006135127 Dec 2006 WO
WO-2007025858 Mar 2007 WO
WO-2007078477 Jul 2007 WO
WO-2007084467 Jul 2007 WO
WO-2007089766 Aug 2007 WO
WO-2008007372 Jan 2008 WO
WO-2008045414 Apr 2008 WO
WO-2008045833 Apr 2008 WO
Non-Patent Literature Citations (190)
Entry
“About Quicktip®” www.logicad3d.com/docs/qt.html, downloaded Apr. 8, 2002.
Ahl, David, “Controller Update,” Creative Computing Vol. 9, No. 12, Dec. 1983.
Ahmad, “A Usable Real-Time 3D Hand Tracker,” Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2) vol. 2 (Oct. 1994).
“Alps Electric Ships GlidePoint Keyboard for the Macintosh; Includes a GlidePoint Touchpad, Erase-Eaze Backspace Key and Contoured Wrist Rest,” Business Wire (Jul. 1, 1996).
“Alps Electric introduces the GlidePoint Wave Keyboard; combines a gently curved design with Alps' advanced GlidePoint Technology,” Business Wire (Oct. 21, 1996).
“Apple Presents iPod: Ultra-Portable MP3 Music Player Puts 1,000 Songs in Your Pocket,” retrieved from http://www.apple.com/pr/library/2001/oct/23ipod.html on Oct. 23, 2001.
“Apple Unveils Optical Mouse and New Pro Keyboard,” Press Release, Jul. 19, 2000.
“APS show guide to exhibitors,” Physics Today 49(3) (Mar. 1996).
“Atari VCS/2600 Peripherals,” www.classicgaming.com/gamingmuseum/2006p.html, downloaded Feb. 28, 2007, pp. 1-15.
Baig, E.C., “Your PC Just Might Need a Mouse,” U.S. News & World Report 108(22) (Jun. 4, 1990).
Bang & Olufsen Telecom a/s. (2000). BeoCom 6000 User Guide; 53 pages.
Bartimo, Jim, “The Portables: Travelling Quickly,” Computerworld (Nov. 14, 1983).
Boling, Douglas (1993) “Programming Microsoft Windows CE.NET,” p. 109.
Bray, “Phosphors help switch on xenon,” Physics in Action, pp. 1-3, Apr. 1999.
Brink et al., “Pumped-up portables,” U.S. News & World Report 116(21) (May 30, 1994).
Brown et al., “Windows on Tablets as a Means of Achieving Virtual Input Devices,” Human-Computer Interaction—INTERACT '90 (1990).
Buxton et al., “Issues and Techniques in Touch-Sensitive Tablet Input,” Computer Graphics 19(3), Proceedings of SIGGRAPH '85 (1985).
Chapweske, Adam “PS/2 Mouse/Keyboard Protocol,” 1999, http://panda.cs.ndsu.nodak.edu/˜achapwes/PICmicro/PS2/ps2.htm.
Chen et al., “A Study in Interactive 3-D Rotation Using 2-D Control Devices,” Computer Graphics 22(4) (Aug. 1988).
De Meyer, Kevin, “Crystal Optical Mouse,” Feb. 14, 2002, Heatseekerz, Web Article 19.
“Der Klangmeister,” Connect Magazine, Aug. 1998.
“Design News literature plus,” Design News 51(24) (Dec. 18, 1995).
“Diamond Multimedia Announces Rio PMP300 Portable MP3 Music Player,” located at http://news.harmony-central.com/Newp/1998/Rio-PMP300.html visited on May 5, 2008. (4 pages).
Evans et al., “Tablet-based Valuators that Provide One, Two, or Three Degrees of Freedom,” Computer Graphics 15(3) (Aug. 1981).
EVB Elektronik “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems” dated Jan. 2004 1 page.
Fiore, “Zen Touchpad,” Cornell University, May 2000.
Gadgetboy, “Point and click with the latest mice,” CNET Asia Product Review, www.asia.cnet.com/reviews...are/gadgetboy/0,39001770,380235900,00.htm, downloaded Dec. 5, 2001.
Gfroerer, “Photoluminescence in Analysis of Surfaces and Interfaces,” Encyclopedia of Analytical Chemistry, pp. 1-23, Copyright John Wiley & Sons Ltd, Chichester, 2000.
Interlink Electronics, VersaPad: Integration Guide, © 1998 (VersaPad), pp. 1-35.
Jesitus, John, “Broken promises?”, Industry Week/IW 246(20) (Nov. 3, 1997).
Kobayashi (1996) “Design of Dynamic Soundscape: Mapping Time to Space for Audio Browsing with Simultaneous Listening,” Thesis submitted to Program in Media Arts and Sciences at the Massachusetts Institute of Technology, (58 pages).
Kobayashi et al. (1994) “Development of the Touch Switches with the Click Response,” Koukuu Denshi Gihou No. 17, pp. 44-48 (published by the Japan Aviation Electronics Industry, Ltd.).
Kobayashi et al. (1997) “Dynamic Soundscape: Mapping Time to Space for Audio Browsing,” Computer Human Interaction: 16 pages.
Letter re: Bang & Olufsen a/s by David Safran, Nixon Peabody, LLP, May 21, 2004, with BeoCom 6000 Sales Training Brochure, 7 pages.
Luna Technologies International, Inc., LUNA Photoluminescent Safety Products, “Photoluminescence—What is Photoluminescence?” from website at http://www.lunaplast.com/photoluminescence.com on Dec. 27, 2005.
“Manufactures,” Laser Focus World, Buyers Guide '96, 31(12) (Dec. 1995).
Mims, Forrest M., III, “A Few Quick Pointers; Mouses, Touch Screens, Touch Pads, Light Pads, and the Like Can Make System Easier to Use,” Computers & Electronics (22) (May 1984).
Nass, Richard, “Touchpad input device goes digital to give portable systems a desktop ‘mouse-like’ feel,” Electronic Design 44(18) (Sep. 3, 1996).
“National Design Engineering Show,” Design News 52(5) (Mar. 4, 1996).
“Neuros MP3 Digital Audio Computer,” www.neurosaudio.com, downloaded Apr. 9, 2003.
“OEM Touchpad Modules” website www.glidepoint.com/sales/modules.index.shtml, downloaded Feb. 13, 2002.
Perenson, Melissa, “New & Improved: Touchpad Redux,” PC Magazine (Sep. 10, 1996).
Petersen, Marty, “Koala Pad Touch Tablet & Micro Illustrator Software,” InfoWorld (Oct. 10, 1983).
Petruzzellis, “Force-Sensing Resistors,” Electronics Now 64(3) (Mar. 1993).
Photographs of Innovation 2000 Best of Show Award Presented at the 2000 Int'l CES Innovations Design & Engineering Showcase, Jan. 6, 2000, 1 page.
“Preview of exhibitor booths at the Philadelphia show,” Air Conditioning, Heating & News 200(2) (Jan. 13, 1997).
“Product news,” Design News 53(11) (Jun. 9, 1997).
“Product news,” Design News 53(9) (May 5, 1997).
“Product Overview—ErgoCommander®,” www.logicad3d.com/products/ErgoCommander.htm, downloaded Apr. 8, 2002.
“Product Overview—SpaceMouse® Classic,” www.logicad3d.com/products/Classic.htm, downloaded Apr. 8, 2002.
SanDisk Sansa Connect User Guide, 2007; 29 pages.
Soderholm, Lars G., “Sensing Systems for ‘Touch and Feel,’” Design News (May 8, 1989): pp. 72-76.
Sony presents “Choice Without Compromise” at IBC '97 M2 Presswire (Jul. 24, 1997).
Spiwak, Marc, “A Great New Wireless Keyboard,” Popular Electronics 14(12) (Dec. 1997).
Spiwak, Marc, “A Pair of Unusual Controllers,” Popular Electronics 14(4) (Apr. 1997).
Sylvania, “Intellvision™ Intelligent Television Master Component Service Manual,” pp. 1, 2 and 8, 1979.
“Synaptics Tough Pad Interfacing Guide,” Second Edition, Mar. 25, 1998, Synaptics, Inc., San Jose, CA, pp. 1-90.
“System Service and Troubleshooting Manual,” www.dsplib.com/intv/Master, downloaded Dec. 11, 2002.
Tessler, Franklin, “Point Pad,” Macworld 12(10) (Oct. 1995).
Tessler, Franklin, “Smart Input: How to Choose from the New Generation of Innovative Input Devices,” Macworld 13(5) (May 1996).
Tessler, Franklin, “Touchpads,” Macworld 13(2) (Feb. 1996).
Translation of Trekstor's Defense Statement to the District Court Mannheim of May 23, 2008; 37 pages.
“Triax Custom Controllers due; Video Game Controllers,” HFD—The Weekly Home Furnishing Newspaper 67(1) (Jan. 4, 1993).
“Touchpad,” Notebook PC Manual, Acer Information Co. Ltd., Feb. 16, 2005, pp. 11-12.
Beaver et al., U.S. Appl. No. 12/042,318, filed Mar. 4, 2008 entitled, “Touch Event Model”; 36 pages.
Robbin, U.S. Appl. No. 60/346,237 entitled, “Method and System for List Scrolling,” filed Oct. 22, 2001.
Zadesky et al., U.S. Appl. No. 60/359,551 entitled “Touchpad for Handheld Device,” filed Feb. 25, 2002.
Robbin et al., U.S. Appl. No. 60/387,692 entitled “Method and Apparatus for Use of Rotational User Inputs,” filed Jun. 10, 2002.
Robbin et al., U.S. Appl. No. 60/399,806 entitled “Graphical User Interface and Methods of Use Thereof in a Multimedia Player,” filed Jul. 30, 2002.
Grignon et al., U.S. Appl. No. 60/755,656, filed Dec. 30, 2005, entitled “Touch Pad with Feedback”.
Elias et al., U.S. Appl. No. 60/522,107, filed Aug. 16, 2004, entitled, “A Method for Increasing the Spatial Resolution of Touch Sensitive Devices”.
Hotelling, U.S. Appl. No. 60/658,777 titled “Multi-Functional Hand-held Device,” filed Mar. 4, 2005.
Zadesky et al., U.S. Appl. No. 60/714,609 entitled “Scrolling Input Arrangements Using Capacitive Sensors on a Flexible Membrane,” filed Sep. 6, 2005.
Lampell et al., U.S. Appl. No. 60/810,423, filed Jun. 2, 2006, entitled “Techniques for Interactive Input to Portable Electronic Devices”.
Prest et al., U.S. Appl. No. 60/850,662, filed Oct. 11, 2006, entitled, “Capacitive Scroll Wheel”.
Bollinger, U.S. Appl. No. 60/858,404, filed Nov. 13, 2006, entitled “Method of Capacitively Sensing Finger Position”.
Rothkopf, U.S. Appl. No. 60/935,854 titled “Compact Media Players,” filed Sep. 4, 2007.
Rathnam et al., U.S. Appl. No. 60/992,056, filed Dec. 3, 2007, entitled, “Scroll Wheel Circuit Arrangements and Methods of Use Thereof”.
Rathnam et al., U.S. Appl. No. 61/017,436, filed Dec. 28, 2007, entitled, “Multi-Touch Scroll Wheel Circuit Arrangements and Processing Methods”.
Weber et al., U.S. Appl. No. 61/020,531, filed Jan. 11, 2008 entitled “Modifiable Clickwheel Text”; 11 pages.
Weber et al., U.S. Appl. No. 61/025,531, filed Feb. 1, 2008 entitled “Co-Extruded Materials and Methods”; 11 pages.
Fisher et al., U.S. Appl. No. 61/036,804, filed Mar. 14, 2008 entitled “Switchable Sensor Configurations”; 46 pages.
International Search Report and Written Opinion mailed Oct. 19, 2009, directed to PCT/US2008/085137; 15 pages.
IPRP mailed Jun. 8, 2010 directed to counterpart PCT/US2008/085137; 10 pages.
CN Office Action issued Dec. 29, 2006, directed to CN Application No. 200510103886.3; 25 pages.
Tsuk et al., U.S. Office Action mailed Oct. 13, 2006, directed to U.S. Appl. No. 10/256,716; 16 pages.
Tsuk et al., U.S. Office Action mailed Aug. 3, 2006, directed to U.S. Appl. No. 10/256,716; 15 pages.
Tsuk et al., U.S. Office Action mailed Jan. 10, 2006, directed to U.S. Appl. No. 10/256,716; 12 pages.
Tsuk et al., U.S. Office Action mailed Jun. 24, 2005, directed to U.S. Appl. No. 10/256,716; 12 pages.
Tsuk et al., U.S. Office Action mailed Sep. 30, 2004, directed to U.S. Appl. No. 10/256,716; 11 pages.
Tsuk et al., U.S. Office Action mailed Aug. 7, 2009, directed to U.S. Appl. No. 11/610,181; 20 pages.
Tsuk et al., U.S. Office Action mailed Jul. 7, 2009, directed to U.S. Appl. No. 11/610,190; 24 pages.
Tsuk et al., U.S. Office Action mailed Dec. 31, 2009, directed to U.S. Appl. No. 11/610,190; 25 pages.
Tsuk et al., U.S. Office Action mailed Apr. 28, 2010, directed to U.S. Appl. No. 11/610,190; 29 pages.
Tsuk et al., U.S. Office Action mailed Aug. 6, 2010, directed to U.S. Appl. No. 11/610,190; 30 pages.
Tsuk et al., U.S. Office Action mailed Apr. 19, 2011, directed to U.S. Appl. No. 11/610,190; 25 pages.
Tsuk et al., U.S. Office Action mailed Nov. 1, 2010, directed to U.S. Appl. No. 11/959,918; 8 pages.
Tsuk et al., U.S. Office Action mailed Mar. 31, 2011, directed to U.S. Appl. No. 11/959,918; 9 pages.
Tsuk et al., U.S. Office Action mailed Oct. 26, 2010, directed to U.S. Appl. No. 11/959,942; 27 pages.
Robbin et al., U.S. Office Action mailed Jan. 18, 2007, directed to U.S. Appl. No. 10/259,159; 18 pages.
Robbin et al., U.S. Office Action mailed Oct. 13, 2006, directed to U.S. Appl. No. 10/259,159; 18 pages.
Robbin et al., U.S. Office Action mailed Aug. 3, 2006, directed to U.S. Appl. No. 10/259,159; 15 pages.
Robbin et al., U.S. Office Action mailed Jan. 11, 2006, directed to U.S. Appl. No. 10/259,159; 15 pages.
Robbin et al., U.S. Office Action mailed Jun. 16, 2005, directed to U.S. Appl. No. 10/259,159; 16 pages.
Robbin et al., U.S. Office Action mailed Sep. 30, 2004, directed to U.S. Appl. No. 10/259,159; 14 pages.
Robbin et al., U.S. Office Action mailed Aug. 10, 2009, directed to U.S. Appl. No. 11/610,376; 11 pages.
Robbin et al., U.S. Office Action mailed Aug. 12, 2009, directed to U.S. Appl. No. 11/610,384; 20 pages.
Robbin et al., U.S. Office Action mailed Oct. 29, 2010, directed to U.S. Appl. No. 11/838,845; 8 pages.
Robbin et al., U.S. Office Action mailed Apr. 26, 2011, directed to U.S. Appl. No. 11/838,845; 9 pages.
Zadesky et al., U.S. Office Action mailed Jul. 9, 2008, directed to U.S. Appl. No. 10/643,256; 12 pages.
Zadesky et al., U.S. Office Action mailed Dec. 12, 2007, directed to U.S. Appl. No. 10/643,256; 12 pages.
Zadesky et al., U.S. Office Action mailed Jul. 13, 2007, directed to U.S. Appl. No. 10/643,256; 13 pages.
Zadesky et al., U.S. Office Action mailed Mar. 23, 2007, directed to U.S. Appl. No. 10/643,256; 11 pages.
Zadesky et al., U.S. Office Action mailed Oct. 27, 2006, directed to U.S. Appl. No. 10/643,256; 14 pages.
Zadesky et al., U.S. Office Action mailed Mar. 30, 2010, directed to U.S. Appl. No. 11/592,679; 13 pages.
Zadesky et al., U.S. Office Action mailed Mar. 31, 2011, directed to U.S. Appl. No. 11/882,005; 7 pages.
Ng et al., U.S. Office Action mailed Jan. 14, 2010, directed to U.S. Appl. No. 11/394,493; 20 pages.
Ng et al., U.S. Office Action mailed Jun. 22, 2010, directed to U.S. Appl. No. 11/394,493; 14 pages.
Ng et al., U.S. Office Action mailed Dec. 9, 2010, directed to U.S. Appl. No. 11/394,493; 13 pages.
Ng et al., U.S. Office Action mailed Jan. 15, 2010, directed to U.S. Appl. No. 11/882,423; 22 pages.
Ng et al., U.S. Office Action mailed Jul. 8, 2010, directed to U.S. Appl. No. 11/882,423; 19 pages.
Ng et al., U.S. Office Action mailed Oct. 26, 2010, directed to U.S. Appl. No. 11/882,423; 18 pages.
Forsblad et al., U.S. Office Action mailed Jun. 25, 2009, directed to U.S. Appl. No. 11/355,022; 18 pages.
Forsblad et al., U.S. Office Action mailed Jan. 26, 2009, directed to U.S. Appl. No. 11/355,022; 15 pages.
Forsblad et al., U.S. Office Action mailed Jan. 27, 2009, directed to U.S. Appl. No. 11/882,421; 15 pages.
Lampell, U.S. Office Action mailed Sep. 15, 2009, directed to U.S. Appl. No. 11/530,807; 15 pages.
Lampell, U.S. Office Action mailed Jun. 4, 2010, directed to U.S. Appl. No. 11/530,807; 15 pages.
Lampell, U.S. Office Action mailed Dec. 3, 2010, directed to U.S. Appl. No. 11/530,807; 17 pages.
Lampell et al., U.S. Office Action mailed Dec. 22, 2010, directed to U.S. Appl. No. 11/882,427; 16 pages.
Zadesky et al., U.S. Office Action mailed Aug. 6, 2009, directed to U.S. Appl. No. 11/057,050; 30 pages.
Zadesky et al., U.S. Office Action mailed Feb. 20, 2009, directed to U.S. Appl. No. 11/057,050; 25 pages.
Zadesky et al., U.S. Office Action mailed Dec. 24, 2008, directed to U.S. Appl. No. 11/057,050; 25 pages.
Zadesky et al., U.S. Office Action mailed Nov. 26, 2008, directed to U.S. Appl. No. 11/057,050; 25 pages.
Zadesky et al., U.S. Office Action mailed Aug. 19, 2008, directed to U.S. Appl. No. 11/057,050; 23 pages.
Zadesky et al., U.S. Office Action mailed Nov. 20, 2007, directed to U.S. Appl. No. 11/057,050; 33 pages.
Zadesky et al., U.S. Office Action mailed Oct. 4, 2010, directed to U.S. Appl. No. 11/057,050; 31 pages.
Zadesky et al., U.S. Office Action mailed Mar. 5, 2009, directed to U.S. Appl. No. 11/477,469; 12 pages.
Zadesky et al., U.S. Office Action mailed Feb. 4, 2010, directed to U.S. Appl. No. 11/477,469; 14 pages.
Zadesky et al., U.S. Office Action mailed Nov. 16, 2010, directed to U.S. Appl. No. 11/477,469; 13 pages.
Prest et al., U.S. Office Action mailed Jun. 22, 2010, directed to U.S. Appl. No. 11/878,132; 32 pages.
Prest et al., U.S. Office Action mailed Jun. 22, 2010, directed to U.S. Appl. No. 11/882,882; 32 pages.
Prest et al., U.S. Office Action mailed Jun. 23, 2010, directed to U.S. Appl. No. 11/812,384; 29 pages.
Prest et al., U.S. Office Action mailed Jun. 22, 2010, directed to U.S. Appl. No. 11/882,890; 15 pages.
Prest et al., U.S. Office Action mailed Jun. 22, 2010, directed to U.S. Appl. No. 11/812,383; 21 pages.
Prest et al., U.S. Office Action mailed Jun. 23, 2010, directed to U.S. Appl. No. 11/882,889; 13 pages.
Bollinger et al., U.S. Office Action mailed Jun. 25, 2010, directed to U.S. Appl. No. 11/842,724; 22 pages.
Bollinger et al., U.S. Office Action mailed Mar. 21, 2011, directed to U.S. Appl. No. 11/842,724; 22 pages.
McKillop et al., U.S. Office Action mailed Sep. 16, 2010, directed to U.S. Appl. No. 11/591,752; 14 pages.
McKillop et al., U.S. Office Action mailed Mar. 24, 2011, directed to U.S. Appl. No. 11/591,752; 11 pages.
Zadesky et al., U.S. Office Action mailed Jul. 30, 2004, directed to U.S. Appl. No. 10/188,182; 7 pages.
Zadesky et al., U.S. Office Action mailed Sep. 21, 2005, directed to U.S. Appl. No. 10/188,182; 10 pages.
Zadesky et al., U.S. Office Action mailed Mar. 4, 2004, directed to U.S. Appl. No. 10/188,182; 8 pages.
Zadesky et al., U.S. Office Action mailed Oct. 4, 2007, directed to U.S. Appl. No. 11/386,238; 12 pages.
Zadesky et al., U.S. Office Action mailed Oct. 4, 2007, directed to U.S. Appl. No. 11/806,957; 14 pages.
Zadesky et al., U.S. Office Action mailed Aug. 2, 2010, directed to U.S. Appl. No. 11/882,004; 9 pages.
Zadesky et al., U.S. Office Action mailed Feb. 1, 2011, directed to U.S. Appl. No. 11/882,004; 16 pages.
Zadesky et al., U.S. Office Action mailed Sep. 29, 2010, directed to U.S. Appl. No. 11/882,003; 13 pages.
Zadesky et al., U.S. Office Action mailed Mar. 16, 2011, directed to U.S. Appl. No. 11/882,003; 12 pages.
Marriott et al., U.S. Office Action mailed Jan. 30, 2008, directed to U.S. Appl. No. 10/722,948; 17 pages.
Marriott et al., U.S. Office Action mailed Jul. 13, 2007, directed to U.S. Appl. No. 10/722,948; 15 pages.
Marriott et al., U.S. Office Action mailed Dec. 12, 2006, directed to U.S. Appl. No. 10/722,948; 14 pages.
Marriott et al., U.S. Office Action mailed Jun. 2, 2006, directed to U.S. Appl. No. 10/722,948; 12 pages.
Marriott et al., U.S. Office Action mailed Aug. 19, 2010, directed to U.S. Appl. No. 11/882,422; 13 pages.
Hotelling, U.S. Office Action mailed Sep. 1, 2009, directed to U.S. Appl. No. 11/482,286; 14 pages.
Hotelling, U.S. Office Action mailed Jan. 25, 2010, directed to U.S. Appl. No. 11/482,286; 17 pages.
Hotelling, U.S. Office Action mailed Jun. 9, 2010, directed to U.S. Appl. No. 11/482,286; 21 pages.
Hotelling, U.S. Office Action mailed Oct. 1, 2010, directed to U.S. Appl. No. 11/482,286; 28 pages.
Hotelling, U.S. Office Action mailed Dec. 8, 2010, directed to U.S. Appl. No. 11/482,286; 33 pages.
Hotelling, U.S. Office Action mailed Aug. 18, 2010, directed to U.S. Appl. No. 11/882,424; 16 pages.
Hotelling et al., U.S. Office Action mailed Jul. 24, 2009, directed to U.S. Appl. No. 11/483,008; 17 pages.
Hotelling et al., U.S. Office Action mailed Mar. 30, 2010, directed to U.S. Appl. No. 11/483,008; 20 pages.
Hotelling et al., U.S. Office Action mailed Oct. 27, 2010, directed to U.S. Appl. No. 11/483,008; 23 pages.
Hotelling et al., U.S. Office Action mailed Jul. 27, 2009, directed to U.S. Appl. No. 11/882,420; 17 pages.
Elias et al., U.S. Office Action mailed Aug. 4, 2009, directed to U.S. Appl. No. 11/203,692; 12 pages.
Elias et al., U.S. Office Action mailed Feb. 23, 2009, directed to U.S. Appl. No. 11/203,692; 13 pages.
Elias et al., U.S. Office Action mailed Sep. 17, 2008, directed to U.S. Appl. No. 11/203,692; 8 pages.
Elias et al., U.S. Office Action mailed Mar. 30, 2010, directed to U.S. Appl. No. 11/203,692; 15 pages.
Elias et al., U.S. Office Action mailed Jun. 11, 2010, directed to U.S. Appl. No. 11/203,692; 17 pages.
Elias et al., U.S. Office Action mailed Nov. 22, 2010, directed to U.S. Appl. No. 11/203,692; 6 pages.
Bull, U.S. Office Action mailed Jul. 9, 2010, directed to U.S. Appl. No. 11/849,801; 13 pages.
Bull, U.S. Office Action mailed Feb. 4, 2011, directed to U.S. Appl. No. 11/849,801; 22 pages.
Weber et al., U.S. Office Action mailed Jun. 7, 2010, directed to U.S. Appl. No. 11/856,530; 15 pages.
Weber et al., U.S. Office Action mailed Jan. 7, 2011, directed to U.S. Appl. No. 11/856,530; 13 pages.
Rathnam et al., U.S. Office Action mailed Mar. 24, 2011, directed to U.S. Appl. No. 12/205,757; 14 pages.
Rathnam et al., U.S. Office Action mailed Dec. 7, 2011, directed to U.S. Appl. No. 12/205,757; 18 pages.
Weber et al., U.S. Office Action mailed Oct. 13, 2010, directed to U.S. Appl. No. 12/205,795; 15 pages.
Weber et al., U.S. Office Action mailed Jan. 7, 2011, directed to U.S. Appl. No. 12/205,795; 21 pages.
Weber et al., U.S. Office Action mailed Feb. 17, 2011, directed to U.S. Appl. No. 12/844,502; 11 pages.
Lynch et al., U.S. Office Action mailed Oct. 5, 2009, directed to U.S. Appl. No. 11/499,360; 7 pages.
Lynch et al., U.S. Office Action mailed Jan. 27, 2010, directed to U.S. Appl. No. 11/499,360; 8 pages.
Related Publications (1)

  - Number: 20140092057 A1; Date: Apr. 2014; Country: US

Provisional Applications (2)

  - Number: 60/992,056; Date: Dec. 2007; Country: US
  - Number: 61/017,436; Date: Dec. 2007; Country: US

Continuations (1)

  - Parent: 12/205,757; Date: Sep. 2008; Country: US
  - Child: 13/858,603; Country: US