The invention relates generally to data input methods and devices for electronic equipment and, more particularly, to methods and devices for discriminating between various inputs to a multi-touch touch-surface input device.
There currently exist many types of input devices for performing operations with an electronic system. These operations often correspond to moving a cursor and/or making selections on a display screen. Illustrative electronic systems include tablet, notebook, desktop and server computer systems, personal digital assistants, audio and video control systems, portable music and video players and mobile and satellite telephones. The use of touch pad and touch screen systems (collectively “touch-surfaces”) has become increasingly popular in these types of electronic systems because of their ease of use and versatility of operation.
One particular type of touch-surface is the touch screen. Touch screens typically include a touch panel, a controller and a software driver. The touch panel is characteristically an optically clear panel with a touch sensitive surface that is positioned in front of a display screen so that the touch sensitive surface is coextensive with a specified portion of the display screen's viewable area (most often, the entire display area). The touch panel registers touch events and sends signals indicative of these events to the controller. The controller processes these signals and sends the resulting data to the software driver. The software driver, in turn, translates the resulting data into events recognizable by the electronic system (e.g., finger movements and selections).
Unlike earlier input devices, touch-surfaces now becoming available are capable of simultaneously detecting multiple objects as they approach and/or contact the touch-surface, and detecting object shapes in much more detail. To take advantage of this capability, it is necessary to measure, identify and distinguish between the many kinds of objects that may approach or contact such touch-surfaces simultaneously. Prior art touch-surface systems (including their supporting software and/or circuitry) do not provide a robust ability to do this. Thus, it would be beneficial to provide methods and devices that identify and discriminate multiple simultaneous hover or touch events such as, for example, two or more closely grouped fingers, palm heels from one or more fingers, fingers from thumbs, and fingers from ears and cheeks.
In one embodiment the invention provides a method to process a proximity image from a multi-touch touch surface. The method includes obtaining a dispersion image having a plurality of pixels, determining an irregularity measure value for the dispersion image and controlling the operation of a multi-touch touch-surface device if the irregularity measure value is above a specified threshold. In one embodiment, the irregularity measure is determined for the entire proximity image. In another embodiment, an irregularity measure is determined for one or more regions within the image (e.g., patches). Device control may be manifested through, for example, changing the device's operating mode (e.g., off to on) or ignoring the identified irregular object so that it does not cause a change in the device's operating state.
Illustrative multi-touch touch-surface devices include, but are not limited to, tablet computer systems, notebook computer systems, portable music and video systems and mobile telephones. Methods in accordance with any of the described methodologies may be stored in any media that is readable and executable by a programmable control device such as, for example, by a general purpose computer processor.
Methods and devices to detect and discriminate between multiple simultaneous close approaches or touches to a touch-surface are described. The following embodiments are presented to enable any person skilled in the art to make and use the invention as claimed and are provided in the context of mutual capacitance touch-surface devices. Variations using other types of touch-surfaces such as force or optical sensing touch-surfaces will be readily apparent to those skilled in the art. Accordingly, the claims appended hereto are not intended to be limited by the disclosed embodiments, but are to be accorded their widest scope consistent with the principles and features disclosed herein.
As previously noted, recent touch-surface input devices are capable of simultaneously detecting multiple objects as they approach and/or contact the touch-surface. For a hand-held multi-touch touch-surface device that may be put into a pocket, purse, or held against the head (e.g., portable music player, portable video player, personal digital assistant or mobile phone), detecting when the device is being clasped on the way into or out of the pocket, against the body, or against the head is very useful for: input rejection (ensuring that touch-surface input signals generated as a result of these actions are not mistaken for normal finger/stylus touches); operational mode transitions (e.g., dimming the device's backlight, putting the device to sleep and waking the device from a low-power state); and, for mobile telephones, answering calls (e.g., when the device is brought near, but not necessarily touching the head) and/or terminating calls (e.g., when the unit is placed into a pocket or purse).
Each sensing element (aka “pixel”) in a two dimensional array of sensing elements (i.e., a touch-surface) generates an output signal indicative of the electric field disturbance (for capacitance sensors), force (for pressure sensors) or optical coupling (for optical sensors) at the sensor element. The ensemble of pixel values represents a “proximity image.” As described herein, various embodiments of the invention address the ability to detect and discriminate between touch-surface signals (represented as a proximity image) resulting from, for example, the types of actions identified in the preceding paragraph.
Referring to FIG. 1, a multi-touch processing methodology in accordance with one embodiment of the invention begins with the acquisition of proximity image data (block 105); the acquired image is denoted [PROX].
Next, [PROX] image data feeds other processing blocks that may operate sequentially or in parallel with one another (blocks 110, 115 and 120). It has been found that filtering or smoothing a proximity image (block 115) prior to segmentation (block 125) reduces the number of spurious peaks and thus helps reduce over-segmentation. In one embodiment of block 115, each pixel value may be averaged with its nearest neighbor pixels in accordance with a discrete diffusion operation. If this approach is employed, it has been found beneficial to insert a “border” around the captured image so that there is a value with which to average the pixels at the edge of the captured image. For example, a one (1) pixel border may be added to the [PROX] image—where each “border” pixel is assigned a value corresponding to the image's “background” (e.g., zero). In another embodiment, both temporal (e.g., obtaining multiple images over a period of time) and spatial (e.g., averaging neighbor pixels) smoothing operations may be used. Multiple smoothing operations may be beneficial if the captured pixel data is particularly noisy. As noted in FIG. 1, the result of smoothing operations in accordance with block 115 is a smoothed proximity image, denoted [SMTH].
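The following sketch illustrates one plausible reading of this smoothing step, assuming NumPy arrays; the five-point averaging kernel, the function name and its arguments are illustrative rather than taken from the specification.

```python
import numpy as np

def smooth_proximity(prox, background=0.0):
    """One diffusion-style smoothing pass: average each pixel with its
    four nearest neighbors, using a one-pixel 'background' border so
    edge pixels have neighbors to average with."""
    padded = np.pad(prox, 1, mode="constant", constant_values=background)
    smth = (padded[1:-1, 1:-1] +      # the pixel itself
            padded[:-2, 1:-1] +       # neighbor above
            padded[2:, 1:-1] +        # neighbor below
            padded[1:-1, :-2] +       # neighbor to the left
            padded[1:-1, 2:]) / 5.0   # neighbor to the right
    return smth
```

Calling the same function on its own output would implement the repeated smoothing suggested above for particularly noisy pixel data.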
While [PROX] image pixel values are typically zero or positive in response to an object contacting the touch-surface (aka, a “grounded” object), background noise or objects close to but not touching the touch-surface (aka “ungrounded” objects) may produce an image some of whose pixel values are negative. Background noise may be static or vary with circuit temperature, touch-surface moisture, or other factors. Noisy, negative pixels can cause excessive jitter in centroid and other patch measurements (see discussion below regarding block 135). To compensate for this, [PROX] image pixel values may be confined to a desired, typically positive, range (block 110), for example, by subtracting a noise threshold from each pixel and clamping the result at a background value. Subtracting the noise threshold helps reduce centroid jitter induced from pixels that wander around (above and below) the noise threshold in successive image frames. As noted in FIG. 1, the result of image constraint operations in accordance with block 110 is a constrained proximity image, denoted [CNST].
In one embodiment, the noise-threshold value is set to between 1 and 3 standard deviations of the noise measured at each pixel and the background-value is set to zero. One skilled in the art will recognize that other values are possible and that the precise choice of values depends, inter alia, on the type of sensor element used, the actual or expected level of pixel noise and the multi-touch device's operational environment. For example, the noise threshold may be set to a specified expected value on a per-pixel basis or a single value may be used for all pixels in an image. In addition, pixel noise values may be allowed to vary over time such that thermal and environmental effects on sensor element noise may be compensated for.
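A minimal sketch of the constraint operation just described, assuming a scalar or per-pixel noise estimate; the function name, the k parameter and the clamping behavior are assumptions consistent with the text, not the specification's own interface.

```python
import numpy as np

def constrain(prox, noise_sigma, k=2.0, background=0.0):
    """Subtract a noise threshold of k standard deviations (k between
    1 and 3 per the text; noise_sigma may be a scalar or a per-pixel
    array) and clamp the result so no pixel falls below background."""
    thresholded = prox - k * noise_sigma
    return np.maximum(thresholded, background)
```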
Touch-surface contacts typically show up as grouped collections of “active” pixel values, where each region of fleshy contact (e.g., finger, palm, cheek, ear or thigh) is represented by a roughly elliptical patch of pixels.
By analyzing an image's topography, image segmentation operations can identify distinct pixel patches that correspond to touch-surface contacts (block 125). In one embodiment, bottom-up, ridge-hiking algorithms may be used to group pixels that are part of the same watershed around each peak pixel—each watershed group or pixel patch corresponds to a touch-surface contact. In another embodiment, top-down search algorithms may be used to identify pixel patches surrounding each peak pixel, starting from the peak, searching outward and stopping at valleys. As part of the image segmentation process, one-dimensional patches may be culled from the identified patches, as they generally result from isolated noise spikes or failure of an entire row or column of sensor elements and/or associated circuitry. In addition, because large contacts such as palms and elongated thumbs may produce multiple peaks in a proximity image (due to noise or non-uniform signal saturation, for example), multiple peaks in the image can grow into multiple, split patches. To account for this phenomenon, multiple detected patches may be merged to produce a reduced number of patches for further processing. Heuristic or empirically determined rules may, for example, be applied to accomplish this. For example, two separately identified patches may be merged when the saddle point along their shared border is not “very deep”—e.g., when the saddle magnitude is more than 60% to 80% of the two patches' peak pixel values. As noted in FIG. 1, the result of image segmentation operations in accordance with block 125 is a collection of identified pixel patches, [P1, P2, . . . Pn].
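A compact stand-in for this step, using connected-component labeling in place of the watershed-style grouping described above (adequate for well-separated contacts, though it will not split merged peaks around distinct maxima); the names and the activity test are assumptions.

```python
import numpy as np
from scipy import ndimage

def segment(cnst):
    """Simplified segmentation stand-in: group active (positive) pixels
    of the constrained image into connected components, yielding one
    patch per connected region."""
    labels, n_patches = ndimage.label(cnst > 0)
    patches = [np.argwhere(labels == k + 1) for k in range(n_patches)]
    return labels, patches
```

Each returned patch is an (N, 2) array of (row, col) pixel coordinates, the form assumed by the parameterization sketch later in this section.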
Analysis shows that noise from pixels on the periphery of a patch, far from the center or peak pixel, can cause more jitter in calculated centroid (center-of-‘mass’) measurements than the same amount of noise from central pixels. This phenomenon applies to other statistically-fitted patch parameters such as major/minor radii and orientation as well. This jitter can be a particularly serious problem for the smooth tracking of hovering objects because hovering objects do not generally induce strong central pixels, leaving the peripheral pixels with even greater influence on the centroid measurement. However, completely leaving these peripheral pixels out of a patch's centroid calculation would discard potentially useful information about the position, size, and shape of the patch. It is further noted that performing patch parameterization on diffused images may reduce noise from peripheral pixels, but standard spatial filtering processes also cause swelling and distortion of patch shape, cause adjacent patches to spread into one another and other effects that bias centroid and ellipse radii measurements in particular. Thus, a technique is needed that minimizes the amount of noise from patch periphery pixels without strongly distorting patch shape and ensuing measurements.
In accordance with one embodiment of the invention, therefore, patch peripheral pixel values may be selectively reduced, down-scaled or dampened (block 130). Generally, patch centroid determination may be improved by selectively down-scaling patch peripheral pixels that are fairly weak and whose neighbors are very weak. More specifically, in one embodiment calibrated image pixel values (e.g., in [CNST]) whose corresponding smoothed value (e.g., in [SMTH]) falls within a specified range defined by a lower-limit and an upper-limit are reduced in proportion to where the smoothed value falls within that range. Lower and upper limits are chosen empirically so that only those pixel values that are relatively weak (compared to patch peak values and background noise) are manipulated. It has been found that: if the lower-limit is set too low, the patch will “bloom” from background pixels that happen to have positive noise; if the lower-limit is set too high, the patch's centroid position will have a spatially periodic bias toward sensor element centers (e.g., capacitive electrode plate centers); if the upper-limit is not sufficiently higher than the lower-limit, periphery dampening will not provide any significant centroid jitter reduction benefits; and if the upper-limit is too high, all patch pixels besides the patch's peak pixel will be affected, again biasing determination of the patch's centroid toward sensor element centers. In accordance with one embodiment of the invention, the lower-limit is set, on a pixel-by-pixel basis, to approximately twice the background noise standard deviation and the upper-limit is set to approximately four times the background noise standard deviation (with the background value typically being zero). In another embodiment, the lower-limit is set to a value indicative of the “average” or “expected” noise across all pixels in the proximity image. In some embodiments, the noise value may change dynamically to reflect changing operational conditions (see comments above). As noted in FIG. 1, the result of patch periphery dampening operations in accordance with block 130 is a dampened image used during subsequent patch parameterization.
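A sketch of the proportional down-scaling described above, under the assumption (not stated explicitly in the text) that pixels whose smoothed values fall at or below the lower-limit are suppressed entirely and those at or above the upper-limit are left untouched; the names are illustrative.

```python
import numpy as np

def dampen_periphery(cnst, smth, lower, upper):
    """Selectively down-scale weak peripheral pixels: where the smoothed
    value lies between lower and upper (scalars or per-pixel arrays,
    upper > lower), scale the calibrated pixel in proportion to its
    position within that range."""
    scale = np.clip((smth - lower) / (upper - lower), 0.0, 1.0)
    return cnst * scale
```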
Patch peripheral pixel dampening such as described above is equally applicable to touch-surfaces that provide one-dimensional proximity images. For example, projection scan touch-surfaces provide an output value (or signal) for each row and column of sensor elements in a touch-surface. In these types of touch-surfaces, a “patch” comprises a plurality of values, where each value represents a row or column measurement. The values at the ends of these patches (i.e., the peripheral values) may benefit from noise dampening as described here.
For certain touch-surface input devices such as telephones, the ear and earlobe may contact the touch-surface sooner or more often than the cheek during calls. Unfortunately, earlobe patches can be very close in size to finger and thumb patches—but should, nevertheless, not cause spurious finger-button activations during a call. In accordance with one embodiment of the invention, a measurement of patch irregularity is defined that does not look for any specific ear (patch) shape, but rather indicates a general roughness, non-roundness or folds in the pixel patch (block 120). That is, if a patch's irregularity measure is above a specified threshold, the contact is identified as an irregular object (e.g., not a cheek, finger or palm), otherwise the patch is identified as not an irregular object (e.g., a cheek, finger or palm).
Referring to FIG. 2, a patch irregularity calculation in accordance with one embodiment of the invention begins with the generation of a dispersion image (block 200). The dispersion image, denoted [DISP], retains the high-frequency content of the proximity image and may be generated by subtracting a smoothed version of the proximity image from the proximity image itself:
[DISP] = [PROX] − [SMTH] EQ. 1
Next, the total energy for each patch [P1, P2, . . . Pn] is computed (block 205). In one embodiment, for example, a patch's total energy may be calculated by summing the square of each pixel value in the patch. This may be expressed mathematically as follows:

Total Energy Per Patch: E_p = Σ over pixels (i,j) in patch p of ([DISP]_i,j)² EQ. 2

As noted in FIG. 2, the result of total energy determination operations in accordance with block 205 is a total energy value for each identified patch.
The total energy between adjacent pixels in a patch is then determined (block 210). To reduce the effect of energy spikes for pixel patches straddling an edge, the summations below should neglect (i.e., assume a value of zero) contributions from pixels whose neighboring pixels are at the image's border, see EQ. 3. For the same reason, the summations below should ignore contributions from pixels whose neighboring pixels are from a different patch.

Spatial Energy Per Patch: SE_p = [ Σ over pixels (i,j) in patch p of ( ([DISP]_i,j − [DISP]_i−1,j)² + ([DISP]_i,j − [DISP]_i+1,j)² + ([DISP]_i,j − [DISP]_i,j−1)² + ([DISP]_i,j − [DISP]_i,j+1)² ) ] ÷ 4 EQ. 3

The sum is divided by 4 because each pixel gets counted once for each direction in the proximity image (left, right, up and down). As noted in FIG. 2, the result of spatial energy determination operations in accordance with block 210 is a spatial energy value for each identified patch.
Next, the energy attributable to each patch's peak pixel is determined (block 215). In one embodiment, a patch's peak energy may be taken as the square of the patch's maximum dispersion-image pixel value:

Peak Energy Per Patch: PE_p = (maximum [DISP] value in patch p)² EQ. 4

As noted in FIG. 2, the result of peak energy determination operations in accordance with block 215 is a peak energy value for each identified patch.
Finally, an irregularity measure for each patch is calculated (block 220). In one embodiment, the irregularity measure is defined as the ratio of a patch's spatial energy minus its peak energy to the patch's total energy:

Irregularity Measure Per Patch: IM_p = (SE_p − PE_p) ÷ E_p EQ. 5

As noted in FIG. 2, the result of irregularity measure determination operations in accordance with block 220 is an irregularity measure value for each identified patch.
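Taken together, EQS. 1 through 5 can be sketched as follows, assuming NumPy arrays and a boolean mask marking the patch's pixels. The border and neighbor-patch exclusions are approximated by masking, and the squared-maximum form of the peak energy is an assumption consistent with EQS. 2 and 3.

```python
import numpy as np

def irregularity(prox, smth, patch_mask):
    """Sketch of EQS. 1-5 for a single patch (patch_mask: boolean array
    marking the patch's pixels; non-patch pixels are zeroed)."""
    disp = prox - smth                           # EQ. 1: dispersion image
    d = np.where(patch_mask, disp, 0.0)
    total_energy = np.sum(d ** 2)                # EQ. 2

    # EQ. 3: energy between adjacent pixels, each direction counted once.
    spatial = 0.0
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbor = np.roll(d, shift, axis=(0, 1))
        valid = patch_mask & np.roll(patch_mask, shift, axis=(0, 1))
        spatial += np.sum(((d - neighbor) ** 2)[valid])
    spatial /= 4.0

    peak_energy = np.max(d) ** 2                 # EQ. 4 (assumed form)
    return (spatial - peak_energy) / total_energy if total_energy else 0.0
```

Passing a mask of all ones treats the entire image as a single patch, which is the whole-image variant discussed next.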
In another embodiment, the irregularity measure may be based on the proximity image as a whole. That is, the entirety of the dispersion image (i.e., all pixels) may be treated as a single “patch” for purposes of generating an irregularity measure value. One benefit to this approach is that abnormal touch-surface conditions may be detected, and responded to, prior to segmentation operations in accordance with block 125 (see FIG. 1).
In general, an oddly shaped collection of pixels (i.e., a patch) can require a relatively large set of numbers to define its boundary and signal value at each pixel within the patch. To reduce the computational complexity of identifying, distinguishing and tracking touch events, however, it is advantageous to characterize patches identified in accordance with block 125 with as few numbers as practical. Because most patches from flesh contact tend to have an elliptical shape, one approach to patch parameterization is to fit an ellipse to each patch. One benefit of this approach is that an ellipse is completely described by a relatively small collection of numbers—its center coordinates, major and minor axis lengths, and major axis orientation.
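One standard way to realize such an ellipse fit, though not prescribed by the text, is through signal-weighted image moments: the weighted mean gives the centroid, and the eigenstructure of the weighted covariance supplies the radii and orientation. A sketch, with all names assumed:

```python
import numpy as np

def fit_ellipse(img, patch_pixels):
    """Moment-based ellipse fit for one patch. img: the (dampened)
    calibrated image; patch_pixels: (N, 2) array of (row, col)
    coordinates belonging to the patch."""
    rows = patch_pixels[:, 0].astype(float)
    cols = patch_pixels[:, 1].astype(float)
    w = np.maximum(img[patch_pixels[:, 0], patch_pixels[:, 1]], 0.0)
    cy = (w * rows).sum() / w.sum()            # centroid (row)
    cx = (w * cols).sum() / w.sum()            # centroid (col)
    cov = np.cov(np.vstack([rows, cols]), aweights=w)
    evals, evecs = np.linalg.eigh(cov)         # eigenvalues ascending
    minor_r, major_r = np.sqrt(evals)          # standard-deviation radii
    orientation = np.arctan2(evecs[0, 1], evecs[1, 1])  # major axis angle
    return (cy, cx), major_r, minor_r, orientation
```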
Referring again to FIG. 1, identified patches may next be parameterized (block 135). In one embodiment, each patch is characterized by its centroid, major and minor radii, orientation and total signal, where a patch's total signal may be calculated by summing the value of each of its pixels:

Total Signal Per Patch: TS_p = Σ over pixels (i,j) in patch p of [CNST]_i,j EQ. 6

Because total signal varies with both contact size and contact proximity, it is also useful to compute a patch's signal density. In one embodiment, for example, patch signal density may be calculated by dividing a patch's total signal by its area:

Signal Density Per Patch: SD_p = TS_p ÷ (area of patch p) EQ. 7
In another embodiment, patch signal density may be approximated by dividing a patch's total signal by the number of pixels it comprises:

Signal Density Per Patch: SD_p ≈ TS_p ÷ (number of pixels in patch p) EQ. 8
Prior art techniques to discriminate objects that actually contact a touch-surface from those that are merely hovering above it have relied upon a patch's total signal parameter (see, for example, EQ. 6). This approach, however, is very dependent upon the size of the object being identified. That is, prior art techniques that threshold on a patch's total signal value generally only work well for objects of a single size. For example, a total patch signal threshold selected to identify a fingertip contact would trigger detection of a thumb or palm when those objects are far above the touch-surface. Such a situation leads to the misidentification of patches (e.g., identifying a patch actually caused by a palm as a thumb).
In contrast, a discrimination technique in accordance with one embodiment of the invention uses a patch's signal density parameter (see, for example, EQS. 7 and 8). It has been found that this approach provides a robust means to distinguish objects that contact the touch-surface from those that are held or hovering above the surface—regardless of the object's size.
If the patch signal density parameter is normalized such that a firm fingertip contacting the touch-surface produces a peak value of 1, then a lightly brushing contact is indicated by a patch density value of slightly greater than 0.5 (e.g., half the normalized value) while a hovering object would be indicated by a patch density value equal to or slightly less than 0.5. It will be recognized that what constitutes “slightly greater” or “slightly less” is dependent upon factors such as the type of sensor elements used in the touch-surface and the associated sensor element drive and measure circuitry. Accordingly, while the precise determination of a threshold value based on patch signal density will require some experimentation, it would be well within the purview of an artisan of ordinary skill with benefit of this disclosure.
It has also been determined that fingernail touches produce patch signal density values generally less than approximately 0.5. This is because the non-conductive fingernail holds the conductive finger flesh more than approximately 1 millimeter above the touch-surface. Accordingly, a threshold operation based on patch signal density is also a reliable means for discriminating between fleshy fingertip touches and back-of-fingernail touches. This same technique has also been found to reliably determine whether large objects (e.g., cheeks and palms) are hovering or actually in contact with the touch-surface.
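A toy illustration of this density threshold, assuming densities have been normalized so a firm fingertip reads approximately 1.0; the function, the labels and the EQ. 8-style density are illustrative.

```python
def classify_by_density(total_signal, num_pixels, touch_threshold=0.5):
    """Density-based contact discrimination sketch: densities above the
    threshold indicate flesh in contact; lower densities indicate an
    object hovering above the surface or a fingernail touch."""
    density = total_signal / num_pixels   # EQ. 8-style approximation
    if density > touch_threshold:
        return "flesh contact"            # firm or lightly brushing touch
    return "hover or fingernail"
```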
With patch parameterization complete, the various types of touch-surface contacts may be discriminated (block 140). Using the parameters identified above, it is possible to robustly and reliably distinguish large objects (e.g., cheeks and palms) from other objects (e.g., fingers and thumbs), irregular objects (e.g., ears) from regular objects (e.g., fingers, thumbs, cheeks and palms) and finger-clasp actions (e.g., when a user clasps a multi-touch touch-surface device to put it into or withdraw it from a pocket). Identification of and discrimination between these types of touch-surface inputs permits an associated device to be controlled in a more robust manner. For example, in one embodiment detection of a large object may be used to transition the device from one operational state (e.g., off) to another (e.g., on). In another embodiment, input identified as the result of a large or irregular object, which might normally cause a state transition, may be safely ignored if in one or more specified states. For example, if a touch-surface telephone is already in an “on” or “active” state, identification of a large or irregular object may be ignored.
As previously noted, it can be advantageous to distinguish large objects (e.g., cheeks and palms) from small objects (e.g., fingertips), regardless of whether the objects are hovering a few millimeters above the touch-surface or are pressed firmly against the surface. It has been found that a contact's minor radius measure provides a robust discriminative measure to accomplish this. If a patch's minor radius exceeds a specified threshold, the contact can reliably be classified as a cheek—as opposed to a finger or thumb, for example. This same measurement can also detect a nearby leg (e.g., thigh) through a few millimeters of fabric (e.g., when a device is inserted in the pocket with its touch-surface facing the body). This measurement has been found to be so robust that if other patches appear on the surface with smaller minor radii (e.g., from an earlobe), they may be safely ignored. Referring to FIG. 3, an empirically determined distribution of patch minor radii illustrates this discrimination; contacts whose minor radii fall above a decision threshold (e.g., thresholds 310) may be classified as cheeks (or thighs) rather than fingers or thumbs.
A similar size testing may be performed using a patch's major or geometric mean radius (i.e., √(major radius × minor radius)), but the minor-radius discrimination described here has been found to be superior because it is less easily fooled by thumbs or flattened fingers. (Flattened fingers may produce major radii as large as a cheek major radius, but their minor radii are typically no larger than those of a normal fingertip touch.)
It will be recognized that distinguishing a palm contact from fingertip or thumb contacts can be especially difficult because the patch radii resulting from a palm contact for people with small hands may approach the patch radii caused by thumb or fingertip contacts for people with large hands. These types of contacts may also be distinguished in accordance with the invention using the patch minor radius parameter. Referring to FIG. 4, an empirically determined distribution of patch minor radii for palm contacts illustrates a decision threshold that separates palm contacts from fingertip and thumb contacts.
Ear and earlobe contacts can generate patches that are roughly the same size as those generated by fingers and thumbs. It has been found, however, that the creases, ridges, and generally rough topography of the ear do produce proximity images distinct from those of fingers and thumbs, at least if the imaging sensor (i.e., touch-surface) covers a significant portion of the ear (i.e., not just the fleshy lobule). The irregularity measure described above is one way to characterize contact roughness (see EQ. 5). This permits a robust means to discriminate contacts due to ears and earlobes from contacts due to fingers, thumbs, cheeks, thighs and palms. It has been found that the defined irregularity measure tends to give values between 1.0 and 2.0 for ear and earlobe contacts while regular (e.g., smooth) contacts attributable to fingers, thumbs, palms and cheeks give values less than about 1.0. Referring to FIG. 5, an empirically determined distribution of irregularity measure values illustrates this separation between irregular (ear) contacts and regular contacts.
In one embodiment, successive proximity images (aka “frames”) are used to track objects as they move across a touch-surface. For example, as an object is moved across a touch-surface, its associated patch(es) may be correlated through overlap calculations. That is, patches identified in successive images that overlap in a specified number of pixels (or fraction of patch pixels) may be interpreted to be caused by the same object. In such embodiments, the maximum patch minor radius over the life of the tracked contact may be compared to the thresholds discussed above (e.g., thresholds 310 in FIG. 3).
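A sketch of the overlap-based correlation just described, representing each patch as a set of (row, col) pixel coordinates; the minimum-overlap threshold and the greedy matching order are illustrative choices.

```python
def correlate_patches(prev_patches, cur_patches, min_overlap=5):
    """Treat patches in successive frames that share at least
    min_overlap pixels as the same tracked contact."""
    matches = {}
    for i, prev in enumerate(prev_patches):
        for j, cur in enumerate(cur_patches):
            if len(prev & cur) >= min_overlap and j not in matches.values():
                matches[i] = j   # previous-frame index -> current-frame index
                break
    return matches
```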
When taking a multi-touch device in and out of a pocket, or otherwise generally handling it, users should have the freedom to clasp their hand around it without producing spurious input. Such finger-clasps can be detected via any one of the following criteria:
In another embodiment of the invention, multi-touch processing methodology may include far-field processing. As used herein, far-field processing refers to the detection and processing associated with bodies (e.g., fingers, palms, cheeks, ears, thighs, . . . ) that are close to (e.g., less than one millimeter to more than a centimeter) but not in contact with the touch-surface. The ability to detect far-field objects may be beneficial in touch-surface devices that, during normal use, are brought into close proximity to a user. One example of such a device is a telephone that includes a touch-surface for user input (e.g., dialing).
Referring to FIG. 6, far-field processing in accordance with one embodiment of the invention begins with the generation of an initial, or negative, far-field image by subtracting a noise factor from each pixel in the acquired proximity image (block 600):
Negative Far-Field Image = [PROX] − (Noise Factor) EQ. 9
In one embodiment, the noise factor may be set to between approximately 1 and 2 standard deviations of the average noise measured or expected over the entire image. This will cause most pixels in the resulting negative far-field image to be slightly negative rather than neutral in the absence of any touch-surface contact. As noted in FIG. 6, the negative far-field image is denoted [NFAR].
Next, each pixel in the [NFAR] image is saturated to the highest level expected from an object hovering a few millimeters from the touch-surface (block 605). In one embodiment, the resulting far-field saturation image (denoted [SFAR] in FIG. 6) is generated by limiting each [NFAR] pixel value to a specified far-field saturation limit.
Since a goal of far-field operations is to be sensitive to large numbers of pixels only slightly activated (e.g., having small positive values), without being overwhelmed by a few strongly active pixels (e.g., having large positive values), the saturation limit value should be less than the peak pixel value from fingers or thumbs hovering within approximately 1 to 2 millimeters of the touch-surface, but not so low as to cause the resulting [SFAR] image to lose too much information content. While the precise far-field saturation limit value will vary from implementation to implementation (due to differences in sensor element technology and associated circuitry), it has been determined empirically that a suitable value will generally lie between +3 standard deviations and +6 standard deviations of noise associated with the initial far-field image. (Again, this noise may be on a per-pixel or whole-image basis.)
If the initial proximity image [PROX] contains a significant amount of noise, it may be beneficial to filter the [SFAR] image (block 610). In one embodiment, a finite impulse response filter technique may be used wherein two or more consecutive [SFAR] images are averaged together. In another embodiment, an infinite impulse response filter technique may be used to generate a smoothed image. It will be recognized that an infinite impulse response filter generates a weighted running average (or auto-regressive) image. In one embodiment, for example, an infinite impulse response filter combines the current far-field saturated image (e.g., [SFAR]new) with the immediately prior far-field saturated image (e.g., [SFAR]prior) in a one-third to two-thirds ratio. As noted in FIG. 6, the result of far-field filtering operations in accordance with block 610 is a filtered far-field image.
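A sketch of blocks 600 through 610: the negative far-field image of EQ. 9, the saturation step, and the one-third/two-thirds infinite-impulse-response filtering mentioned above. The k factors are illustrative points in the ranges given in the text, and the function name and arguments are assumptions.

```python
import numpy as np

def far_field_image(prox, noise_std, prior_sfar=None,
                    noise_factor_k=1.5, sat_k=4.5):
    """Build, saturate and IIR-filter the far-field image."""
    nfar = prox - noise_factor_k * noise_std        # EQ. 9: [NFAR]
    sfar = np.minimum(nfar, sat_k * noise_std)      # saturate: [SFAR]
    if prior_sfar is not None:
        sfar = sfar / 3.0 + 2.0 * prior_sfar / 3.0  # 1/3 new, 2/3 prior
    return sfar
```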
Following image segmentation operations in accordance with block 125 (see FIG. 1), a weighted sum of non-linearly scaled background pixel values in the filtered far-field image may be used to generate a scalar far-field measurement value:

FAR-FIELD = Σ over background pixels (i,j) of ENeg([SFAR]_i,j) × [LOC]_i,j EQ. 10
where the ENeg( ) function non-linearly amplifies pixel values below a threshold (e.g., zero) and [LOC] represents a pixel weighting mechanism. As indicated in EQ. 10, only proximity image background pixels contribute to the computed FAR-FIELD value. That is, pixels identified as belonging to a patch during image segmentation operations are excluded during far-field measurement operations.
In one embodiment, the ENeg( ) function disproportionately emphasizes the contributions from background pixels as follows:

ENeg(pixel value) = pixel value, for 0 ≤ pixel value
ENeg(pixel value) = B × pixel value, for pixel value < 0 EQ. 11
where B represents a far-field saturation limit value. Empirically determined, B is chosen to permit a small number of negative pixels to cancel out a finger or thumb-sized patch of positive pixels. In this way, only a nearly full coverage cheek-sized patch of positive pixels, plus a small remainder of neutral/background pixels, can produce a strongly positive far-field measurement.
While not necessary, disproportionately emphasizing the contributions from background pixels in accordance with EQs. 10 and 11 permits the FAR-FIELD measurement to be more selective for bodies large enough to positively affect most of a touch-surface's pixels (e.g., cheeks and legs), while not being overly influenced by medium-sized objects (e.g., hovering thumbs). For example, if a hovering thumb causes half of a touch-surface's sensor elements to have a slightly above-background pixel value, disproportionately emphasizing the half that remain below background will keep the measured FAR-FIELD value below zero—indicating no large object is “near” the touch-surface (e.g., within 1 to 3 centimeters). In another embodiment, background pixels may be linearly combined (e.g., summed).
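A sketch of EQ. 10 using the piecewise ENeg form given above (whose exact shape is an assumption); background pixels are those not assigned to any patch during segmentation, and all names are illustrative.

```python
import numpy as np

def far_field_measure(sfar, background_mask, loc, B=3.0):
    """Weighted far-field measurement over background pixels, with
    sub-zero pixels amplified by B so a few negative pixels can cancel
    a thumb-sized patch of positive ones."""
    eneg = np.where(sfar < 0.0, B * sfar, sfar)  # assumed ENeg form
    return np.sum(eneg[background_mask] * loc[background_mask])
```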
As noted above, [LOC] represents a pixel weighting mechanism. In general, there is one value in [LOC] for each pixel present in the touch-surface. If it is desired to consider all touch-surface pixels equally, each value in the [LOC] image may be set to 1.0 (or some similar constant value). For hand-held form-factors, however, selectivity for large bodies may be improved by lowering the weights near the bottom and side edges (for example, to values between 0.1 and 0.9). Doing this can lessen false-positive contributions from a hand whose fingers wrap around the device during (clasping) operations. In mobile phone form-factors, to retain sensitivity to ear and cheek far-fields, the weights along the top edge (where thumbs and fingers are less likely to hover or wrap) may be kept at full strength.
Returning now to FIG. 1, a strongly positive FAR-FIELD measurement indicates that a large conductive body (e.g., a cheek or leg) is near, though not necessarily touching, the touch-surface. This indication may be supplied to contact discrimination operations in accordance with block 140.
In addition, one or more proximity sensors may be positioned above the touch-surface's top edge or around, for example, a telephone's receiver opening. Illustrative proximity sensors of this type include, but are not limited to, active infrared-reflectance sensors and capacitance-sensitive electrode strips. In a mobile telephone form-factor, when the device is held such that the receiver is centered on the ear canal, ear ridges may trigger the proximity sensor. Meanwhile, the earlobe may cause a small pixel patch in the top portion of the touch-surface. Discrimination operations in accordance with block 140 could decide that when a pixel patch at the top of the touch-surface is accompanied by any significant receiver proximity trigger, the pixel patch must be an ear, not a finger. In another embodiment, the same conditions but with a significant FAR-FIELD value for the lower portion of the touch-surface (indicating a hovering cheek) may be used to trigger detection of an ear at the top of the touch-surface. Generally speaking, one or more of signal density (see EQS. 7 and 8), patch irregularity (see EQ. 5), FAR-FIELD measurement (see EQ. 10) and proximity sensor input may be combined (e.g., via a weighted average) so that ear detection can trigger when multiple indicators are weakly active, or just one indicator is strongly active. Finally, it is noted that contact discrimination parameters such as a patch's centroid, minor axis radius, patch irregularity (EQ. 5), patch signal density (EQS. 7 and 8), far-field (EQ. 10) and proximity sensor input (if available) may be (low-pass) filtered to help counteract their often sporadic nature. This may be particularly beneficial if the filters employ adaptive time constants that rise quickly in response to rising input values, but decay more slowly when input values drop and/or are missing.
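A toy sketch of the weighted-indicator combination just described; the indicator names, their normalization and the trigger level are assumptions.

```python
def ear_score(indicators, weights, trigger=1.0):
    """Weighted combination of normalized detection indicators (e.g.,
    signal density, irregularity, FAR-FIELD, proximity sensor): fires
    when several indicators are weakly active or one is strongly
    active."""
    score = sum(w * x for w, x in zip(weights, indicators))
    return score >= trigger
```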
Referring to FIG. 7, a multi-touch touch-surface device 700 capable of implementing the above-described methodologies is shown in block diagram form.
As illustrated, touch-surface element 705 includes sensor elements and necessary drive and signal acquisition and detection circuitry. Memory 710 may be used to retain acquired proximity image information (e.g., [PROX] image data) and computed image information (e.g., patch characterization parameters) for processor 715. Processor 715 represents a computational unit or programmable control device that is capable of using the information generated by touch-surface element 705 to determine various metrics in accordance with the methodologies described above (see FIGS. 1, 2 and 6).
Various changes in the materials, components, circuit elements, as well as in the details of the illustrated operational methods are possible without departing from the scope of the following claims. It will be recognized, for example, that not all steps identified in FIGS. 1, 2 and 6 need be performed, while others may be combined.
With respect to illustrative touch-surface device 700, touch-surface element 705 may incorporate memory (e.g., 710) and/or processor (e.g., 715) functions. In addition, external component 720 may represent a hardware element (e.g., a host processor) or a software element (e.g., a driver utility).
Finally, acts in accordance with FIGS. 1, 2 and 6 may be performed by a programmable control device executing instructions organized into one or more program modules, where the instructions are stored in any media that is readable and executable by that device.
This application is a continuation of U.S. patent application Ser. No. 11/619,490, filed Jan. 3, 2007, the entire disclosure of which is incorporated herein by reference in its entirety for all purposes.