1. Field of the Invention
The present invention relates generally to gesturing associated with touch sensitive devices.
2. Description of the Related Art
There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and making selections on a display screen. The operations may also include paging, scrolling, panning, zooming, etc. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joy sticks, touch screens and the like. Each of these devices has advantages and disadvantages that are taken into account when designing the computer system.
Buttons and switches are generally mechanical in nature and provide limited control with regards to the movement of the cursor and making selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.).
In mice, the movement of the input pointer corresponds to the relative movements of the mouse as the user moves the mouse along a surface. In trackballs, the movement of the input pointer corresponds to the relative movements of a ball as the user moves the ball within a housing. Mice and trackballs also include one or more buttons for making selections. Mice may also include scroll wheels that allow a user to move through the GUI by simply rolling the wheel forward or backward.
With touch pads, the movement of the input pointer corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch screens, on the other hand, are a type of display screen that has a touch-sensitive transparent panel covering the screen. When using a touch screen, a user makes a selection on the display screen by pointing directly to GUI objects on the screen (usually with a stylus or finger).
In order to provide additional functionality, gestures have been implemented with some of these input devices. By way of example, in touch pads, selections may be made when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, and in other cases a dedicated portion of the touch pad may be tapped. In addition to selections, scrolling may be initiated by using finger motion at the edge of the touch pad.
U.S. Pat. Nos. 5,612,719 and 5,590,219, assigned to Apple Computer, Inc., describe some other uses of gesturing. U.S. Pat. No. 5,612,719 discloses an onscreen button that is responsive to at least two different button gestures made on the screen on or near the button. U.S. Pat. No. 5,590,219 discloses a method for recognizing an ellipse-type gesture input on a display screen of a computer system.
In recent times, more advanced gestures have been implemented. For example, scrolling may be initiated by placing four fingers on the touch pad so that the scrolling gesture is recognized and thereafter moving these fingers on the touch pad to perform scrolling events. The methods for implementing these advanced gestures, however, have several drawbacks. By way of example, once the gesture is set, it cannot be changed until the user resets the gesture state. In touch pads, for example, if four fingers equals scrolling, and the user puts a thumb down after the four fingers are recognized, any action associated with the new gesture including four fingers and the thumb will not be performed until the entire hand is lifted off the touch pad and put back down again (e.g., reset). Simply put, the user cannot change gesture states midstream. In a similar vein, only one gesture may be performed at any given time. That is, multiple gestures cannot be performed simultaneously.
Based on the above, there is a need for improvements in the way gestures are performed on touch sensitive devices.
The invention relates, in one embodiment, to a computer implemented method for processing touch inputs. The method includes reading data from a multipoint touch screen. The data pertains to touch input with respect to the touch screen. The method also includes identifying at least one multipoint gesture based on the data from the multipoint touch screen.
The invention relates, in another embodiment, to a gestural method. The method includes detecting multiple touches at different points on a touch sensitive surface at the same time. The method also includes segregating the multiple touches into at least two separate gestural inputs occurring simultaneously. Each gestural input has a different function such as zooming, panning, rotating and the like.
The invention relates, in another embodiment, to a gestural method. The method includes concurrently detecting a plurality of gestures that are concurrently performed with reference to a touch sensing device. The method also includes producing different commands for each of the gestures that have been detected.
The invention relates, in another embodiment, to a gestural method. The method includes displaying a graphical image on a display screen. The method also includes detecting a plurality of touches at the same time on a touch sensitive device. The method further includes linking the detected multiple touches to the graphical image presented on the display screen.
The invention relates, in another embodiment, to a method of invoking a user interface element on a display via a multipoint touch screen of a computing system. The method includes detecting and analyzing the simultaneous presence of two or more objects in contact with the multipoint touch screen. The method also includes selecting a user interface tool, from a plurality of available tools, to display on a display for interaction by a user of the computing system based at least in part on the analyzing. The method further includes controlling the interface tool based at least in part on the further movement of the objects in relation to the multipoint touch screen.
The invention relates, in another embodiment, to a touch-based method. The method includes detecting a user input that occurs over a multipoint sensing device. The user input includes one or more inputs. Each input has a unique identifier. The method also includes, during the user input, classifying the user input as a tracking or selecting input when the user input includes one unique identifier or a gesture input when the user input includes at least two unique identifiers. The method further includes performing tracking or selecting during the user input when the user input is classified as a tracking or selecting input. The method additionally includes performing one or more control actions during the user input when the user input is classified as a gesturing input. The control actions being based at least in part on changes that occur between the at least two unique identifiers.
The invention relates, in another embodiment, to a touch-based method. The method includes outputting a GUI on a display. The method also includes detecting a user input on a touch sensitive device. The method further includes analyzing the user input for characteristics indicative of tracking, selecting or gesturing. The method additionally includes categorizing the user input as a tracking, selecting or gesturing input. The method further includes performing tracking or selecting in the GUI when the user input is categorized as a tracking or selecting input. Moreover, the method includes performing control actions in the GUI when the user input is categorized as a gesturing input, the actions being based on the particular gesturing input.
The invention relates, in another embodiment, to a touch-based method. The method includes capturing an initial touch image. The method also includes determining the touch mode based on the touch image. The method further includes capturing the next touch image. The method further includes determining if the touch mode changed between the initial and next touch images. The method additionally includes, if the touch mode changed, setting the next touch image as the initial touch image and determining the touch mode based on the new initial touch image. Moreover, the method includes, if the touch mode stayed the same, comparing the touch images and performing a control function based on the comparison.
The invention relates, in another embodiment, to a computer implemented method for processing touch inputs. The method includes reading data from a touch screen, the data pertaining to touch input with respect to the touch screen, the touch screen having a multipoint capability. The method also includes converting the data to a collection of features. The method further includes classifying the features and grouping the features into one or more feature groups. The method additionally includes calculating key parameters of the feature groups and associating the feature groups to user interface elements on a display.
The invention relates, in another embodiment, to a computer implemented method. The method includes outputting a graphical image. The method also includes receiving a multitouch gesture input over the graphical image. The method further includes changing the graphical image based on and in unison with the multitouch gesture input.
The invention relates, in another embodiment, to a touch based method. The method includes receiving a gestural input over a first region. The method also includes generating a first command when the gestural input is received over the first region. The method further includes receiving the same gestural input over a second region. The method additionally includes generating a second command when the same gestural input is received over the second region. The second command being different than the first command.
The invention relates, in another embodiment, to a method for recognizing multiple gesture inputs. The method includes receiving a multitouch gestural stroke on a touch sensitive surface. The multitouch gestural stroke maintaining continuous contact on the touch sensitive surface. The method also includes recognizing a first gesture input during the multitouch gestural stroke. The method further includes recognizing a second gesture input during the multitouch gestural stroke.
The invention relates, in another embodiment, to a computer implemented method. The method includes detecting a plurality of touches on a touch sensing device. The method also includes forming one or more touch groups with the plurality of touches. The method further includes monitoring the movement of and within each of the touch groups. The method additionally includes generating control signals when the touches within the touch groups are moved or when the touch groups are moved in their entirety.
It should be noted that in each of the embodiments described above, the methods may be implemented using a touch based input device such as a touch screen or touch pad, more particularly a multipoint touch based input device, and even more particularly a multipoint touch screen. It should also be noted that the gestures, gesture modes, gestural inputs, etc. may correspond to any of those described below in the detailed description. For example, the gestures may be associated with zooming, panning, scrolling, rotating, enlarging, floating controls, zooming targets, paging, inertia, keyboarding, wheeling, and/or the like.
The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
The invention generally pertains to gestures and methods of implementing gestures with touch sensitive devices. Examples of touch sensitive devices include touch screens and touch pads. One aspect of the invention relates to recognizing at least two simultaneously occurring gestures. Another aspect of the invention relates to displaying a graphical image and linking different touches that occur to the graphical image. Another aspect of the invention relates to immediately recognizing gestures so that actions associated with the gestures can be implemented at the same time. Another aspect of the invention relates to changing a displayed image based on and in unison with a gestural input, i.e., the displayed image continuously changes with changes in the gestural input such that the displayed image continuously follows the gestural input. Another aspect of the invention relates to implementing an input mode based on the number of fingers (or other objects) in contact with the input device. Another aspect of the invention relates to providing region sensitivity where gestures mean different things when implemented over different areas of the input device. Another aspect of the invention relates to changing an input while making continuous contact with the touch sensitive surface of the touch sensitive device.
These and other aspects of the invention are discussed below with reference to
The exemplary computer system 50 shown in
In most cases, the processor 56 together with an operating system operates to execute computer code and produce and use data. Operating systems are generally well known and will not be described in greater detail. By way of example, the operating system may correspond to OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system can also be a special purpose operating system, such as may be used for limited purpose appliance-type computing devices. The operating system, other computer code and data may reside within a memory block 58 that is operatively coupled to the processor 56. Memory block 58 generally provides a place to store computer code and data that are used by the computer system 50. By way of example, the memory block 58 may include Read-Only Memory (ROM), Random-Access Memory (RAM), hard disk drive and/or the like. The information could also reside on a removable storage medium and be loaded or installed onto the computer system 50 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and a network component.
The computer system 50 also includes a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix and the like). Alternatively, the display device 68 may be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, video graphics array (VGA) display, super VGA display, cathode ray tube (CRT), and the like. The display device may also correspond to a plasma display or a display implemented with electronic inks.
The display device 68 is generally configured to display a graphical user interface (GUI) 69 that provides an easy to use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, the GUI 69 represents programs, files and operational options with graphical images. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. The GUI 69 can additionally or alternatively display information, such as non interactive text and graphics, for the user on the display device 68.
The computer system 50 also includes an input device 70 that is operatively coupled to the processor 56. The input device 70 is configured to transfer data from the outside world into the computer system 50. The input device 70 may for example be used to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch sensing device configured to receive input from a user's touch and to send this information to the processor 56. By way of example, the touch-sensing device may correspond to a touchpad or a touch screen. In many cases, the touch-sensing device recognizes touches, as well as the position and magnitude of touches on a touch sensitive surface. The touch sensing means reports the touches to the processor 56 and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch. A dedicated processor can be used to process touches locally and reduce demand for the main processor of the computer system. The touch sensing device may be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the like. Furthermore, the touch sensing means may be based on single point sensing or multipoint sensing. Single point sensing is capable of only distinguishing a single touch, while multipoint sensing is capable of distinguishing multiple touches that occur at the same time.
The input device 70 may be a touch screen that is positioned over or in front of the display 68. The touch screen 70 may be integrated with the display device 68 or it may be a separate component. The touch screen 70 has several advantages over other input technologies such as touchpads, mice, etc. For one, the touch screen 70 is positioned in front of the display 68 and therefore the user can manipulate the GUI 69 directly. For example, the user can simply place their finger over an object to be controlled. In touch pads, there is no one-to-one relationship such as this. With touchpads, the touchpad is placed away from the display typically in a different plane. For example, the display is typically located in a vertical plane and the touchpad is typically located in a horizontal plane. This makes its use less intuitive, and therefore more difficult when compared to touch screens. In addition to being a touch screen, the input device 70 can be a multipoint input device. Multipoint input devices have advantages over conventional singlepoint devices in that they can distinguish more than one object (finger). Singlepoint devices are simply incapable of distinguishing multiple objects. By way of example, a multipoint touch screen, which can be used herein, is shown and described in greater detail in copending and commonly assigned U.S. patent application Ser. No. 10/840,862, which is hereby incorporated herein by reference.
The computer system 50 also includes capabilities for coupling to one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to keyboards, printers, scanners, cameras, speakers, and/or the like. The I/O devices 80 may be integrated with the computer system 50 or they may be separate components (e.g., peripheral devices). In some cases, the I/O devices 80 may be connected to the computer system 50 through wired connections (e.g., cables/ports). In other cases, the I/O devices 80 may be connected to the computer system 50 through wireless connections. By way of example, the data link may correspond to PS/2, USB, IR, RF, Bluetooth or the like.
In accordance with one embodiment of the present invention, the computer system 50 is designed to recognize gestures 85 applied to the input device 70 and to control aspects of the computer system 50 based on the gestures 85. In some cases, a gesture is defined as a stylized interaction with an input device that is mapped to one or more specific computing operations. The gestures 85 may be made through various hand motions, and more particularly finger motions. Alternatively or additionally, the gestures may be made with a stylus. In all of these cases, the input device 70 receives the gestures 85 and the processor 56 executes instructions to carry out operations associated with the gestures 85. In addition, the memory block 58 may include a gesture operational program 88, which may be part of the operating system or a separate application. The gesture operational program 88 generally includes a set of instructions that recognizes the occurrence of gestures 85 and informs one or more software agents of the gestures 85 and/or what action(s) to take in response to the gestures 85.
When a user performs one or more gestures, the input device 70 relays gesture information to the processor 56. Using instructions from memory 58, and more particularly, the gesture operational program 88, the processor 56 interprets the gestures 85 and controls different components of the computer system 50, such as memory 58, a display 68 and I/O devices 80, based on the gestures 85. The gestures 85 may be identified as commands for performing actions in applications stored in the memory 58, modifying GUI objects shown on the display 68, modifying data stored in memory 58, and/or for performing actions in I/O devices 80. By way of example, the commands may be associated with zooming, panning, scrolling, paging, rotating, sizing, and the like. As further examples, the commands may also be associated with launching a particular program, opening a file or document, viewing a menu, making a selection, executing instructions, logging onto the computer system, permitting authorized individuals access to restricted areas of the computer system, loading a user profile associated with a user's preferred arrangement of the computer desktop, and/or the like.
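By way of illustration, the following Python sketch shows one way the dispatch role described above could be realized: a gesture program informs registered software agents of recognized gestures. The class and function names are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the dispatch role of a gestural operation
# program such as program 88: it recognizes a gesture and informs
# registered software agents. All names are illustrative.

from typing import Callable, Dict, List

class GestureProgram:
    def __init__(self) -> None:
        # Agents register a callback per gesture name (e.g., an
        # application interested in "zoom" events).
        self._agents: Dict[str, List[Callable[[dict], None]]] = {}

    def register(self, gesture: str, agent: Callable[[dict], None]) -> None:
        self._agents.setdefault(gesture, []).append(agent)

    def report(self, gesture: str, event: dict) -> None:
        # Inform every agent that subscribed to this gesture.
        for agent in self._agents.get(gesture, []):
            agent(event)

program = GestureProgram()
program.register("zoom", lambda e: print(f"zoom by {e['scale']:.2f}"))
program.report("zoom", {"scale": 1.25})
```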
A wide range of different gestures can be utilized. By way of example, the gestures may be single point or multipoint gestures; static or dynamic gestures; continuous or segmented gestures; and/or the like. Single point gestures are those gestures that are performed with a single contact point, e.g., the gesture is performed with a single touch as for example from a single finger, a palm or a stylus. Multipoint gestures are those gestures that can be performed with multiple points, e.g., the gesture is performed with multiple touches as for example from multiple fingers, fingers and palms, a finger and a stylus, multiple styli and/or any combination thereof. Static gestures are those gestures that do not include motion, and dynamic gestures are those gestures that do include motion. Continuous gestures are those gestures that are performed in a single stroke, and segmented gestures are those gestures that are performed in a sequence of distinct steps or strokes.
In one embodiment, the computer system 50 is configured to register multiple gestures at the same time, i.e., multiple gestures can be performed simultaneously. By way of example, a zoom gesture may be performed at the same time as a rotate gesture, or a rotate gesture may be performed at the same time as a pan gesture. In one particular implementation, zoom, rotate and pan gestures can all occur simultaneously in order to perform zooming, rotating and panning at the same time.
In another embodiment, the system is configured to immediately recognize the gestures so that actions associated with the gestures can be implemented at the same time as the gesture, i.e., the gesture and action simultaneously occur side by side rather than being a two-step process. By way of example, during a scrolling gesture, the screen moves with the finger motion.
In another embodiment, an object presented on a display 68 continuously follows the gesture occurring on a touch screen. There is a one to one relationship between the gesture being performed and the objects shown on the display 68. For example, as the gesture is performed, modifications simultaneously occur to the objects located underneath the gesture. For example, during a zooming gesture, the fingers may spread apart or close together in order to cause the object shown on the display 68 to zoom in during the spreading and zoom out during the closing. During this operation, the computer system 50 recognizes the user input as a zoom gesture, determines what action should be taken, and outputs control data to the appropriate device, in this case the display 68.
In another embodiment, the computer system 50 provides region sensitivity where gestures mean different things when implemented over different areas of the input device 70. For example, a rotation gesture over a volume knob causes volume increase/decrease, whereas a rotation gesture over a photo causes rotation of the photo.
In another embodiment, the number of fingers in contact with the touch screen may indicate an input mode. For example, a single touch as for example by a single finger may indicate the desire to perform tracking, i.e., pointer or cursor movements, or selections, whereas multiple touches as for example by a group of fingers may indicate the desire to perform gesturing. The number of fingers for implementing gesturing may be widely varied. By way of example, two fingers may indicate a first gesture mode, three fingers may indicate a second gesture mode, etc. Alternatively, any number of fingers, i.e., more than one, may be used for the same gesture mode, which can include one or more gesture controls. The orientation of the fingers may similarly be used to denote the desired mode. The profile of the finger may be detected to permit different modal operations based on whether the user has used his thumb or index finger, for example.
In another embodiment, an input can be changed while making a continuous stroke on the input device without stopping the stroke (e.g., lifting off the touch sensitive surface). In one implementation, the user can switch from a tracking (or selection) mode to gesturing mode while a stroke is being made. For example, tracking or selections may be associated with a single finger and gesturing may be associated with multiple fingers; therefore, the user can toggle between tracking/selection and gesturing by picking up and placing down a second finger on the touch screen. In another implementation, the user can switch from one gesture mode to another gesture mode while a stroke is being made. For example, zooming may be associated with spreading a pair of fingers and rotating may be associated with rotating the pair of fingers; therefore, the user can toggle between zooming and rotating by alternating the movement of their fingers between spreading and rotating. In yet another implementation, the number of gesture inputs can be changed while a stroke is being made (e.g., added or subtracted). For example, during zooming where the fingers are spread apart, the user may further rotate their fingers to initiate both zooming and rotation. Furthermore during zooming and rotation, the user can stop spreading their fingers so that only rotation occurs. In other words, the gesture inputs can be continuously input, either simultaneously or consecutively.
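By way of illustration, the following sketch shows one possible way to derive the input mode from the finger count on every sensor frame, so that setting down or lifting a finger switches modes mid-stroke without a reset. The names and frame representation are assumptions.

```python
# Illustrative sketch (not the patent's implementation) of choosing an
# input mode from the number of fingers on each sensor frame. Because
# the mode is re-evaluated every frame, adding or lifting a finger
# switches modes mid-stroke without lifting the whole hand.

def input_mode(finger_count: int) -> str:
    if finger_count == 1:
        return "track/select"     # single finger: pointing or selecting
    if finger_count >= 2:
        return "gesture"          # multiple fingers: gesturing
    return "idle"

# A continuous stroke in which a second finger is set down and lifted:
frames = [1, 1, 2, 2, 2, 1]
print([input_mode(n) for n in frames])
# ['track/select', 'track/select', 'gesture', 'gesture', 'gesture',
#  'track/select']
```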
In one particular embodiment, a single finger initiates tracking (or selection) and two or more fingers in close proximity to one another initiates scrolling or panning. Two fingers is generally preferred so as to provide easy toggling between one and two fingers, i.e., the user can switch between modes very easily by simply picking up or setting down an additional finger. This has the advantage of being more intuitive than other forms of mode toggling. During tracking, cursor movement is controlled by the user moving a single finger on the touch sensitive surface of a touch sensing device. The sensor arrangement of the touch sensing device interprets the finger motion and generates signals for producing corresponding movement of the cursor on the display. During scrolling, screen movement is controlled by the user moving dual fingers on the touch sensitive surface of the touch sensing device. When the combined fingers are moved in the vertical direction, the motion is interpreted as a vertical scroll event, and when the combined fingers are moved in the horizontal direction, the motion is interpreted as a horizontal scroll event. The same can be said for panning, although panning can occur in all directions rather than just the horizontal and vertical directions.
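By way of illustration, the sketch below shows one plausible interpretation of dual-finger scrolling: the combined (centroid) displacement of the fingers between frames is mapped to a vertical or horizontal scroll event depending on which component dominates. The coordinate convention is assumed.

```python
# A minimal sketch, under assumed names, of interpreting dual-finger
# motion as scroll events: the centroid displacement of the fingers
# across consecutive frames becomes a vertical or horizontal scroll.

def scroll_event(prev, curr):
    # prev/curr: lists of (x, y) finger positions on consecutive frames.
    cx0 = sum(p[0] for p in prev) / len(prev)
    cy0 = sum(p[1] for p in prev) / len(prev)
    cx1 = sum(p[0] for p in curr) / len(curr)
    cy1 = sum(p[1] for p in curr) / len(curr)
    dx, dy = cx1 - cx0, cy1 - cy0
    if abs(dy) >= abs(dx):
        return ("vertical", dy)
    return ("horizontal", dx)

print(scroll_event([(10, 10), (14, 10)], [(10, 16), (14, 16)]))
# ('vertical', 6.0)
```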
The term “scrolling” as used herein generally pertains to moving displayed data or images (e.g., text or graphics) across a viewing area on a display screen so that a new set of data (e.g., line of text or graphics) is brought into view in the viewing area. In most cases, once the viewing area is full, each new set of data appears at the edge of the viewing area and all other sets of data move over one position. That is, the new set of data appears for each set of data that moves out of the viewing area. In essence, the scrolling function allows a user to view consecutive sets of data currently outside of the viewing area. The viewing area may be the entire viewing area of the display screen or it may only be a portion of the display screen (e.g., a window frame).
As mentioned above, scrolling may be implemented vertically (up or down) or horizontally (left or right). In the case of vertical scrolling, when a user scrolls down, each new set of data appears at the bottom of the viewing area and all other sets of data move up one position. If the viewing area is full, the top set of data moves out of the viewing area. Similarly, when a user scrolls up, each new set of data appears at the top of the viewing area and all other sets of data move down one position. If the viewing area is full, the bottom set of data moves out of the viewing area.
By way of example, the display screen, during operation, may display a list of media items (e.g., songs). A user is able to linearly scroll through the list of media items by moving his or her finger across a touch screen. As the finger moves across the touch screen, the displayed items from the list of media items are varied such that the user is able to effectively scroll through the list of media items. In most cases, the user is able to accelerate their traversal of the list of media items by moving his or her finger at greater speeds. Some embodiments, which may be related to the above example, are described in greater detail below. See for example
Following block 102, multipoint processing method 100 proceeds to block 104 where the image is converted into a collection or list of features. Each feature represents a distinct input such as a touch. In most cases, each feature includes its own unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle θ, area A, and the like. By way of example,
The conversion from data or images to features may be accomplished using methods described in copending U.S. patent application Ser. No. 10/840,862 which is hereby incorporated herein by reference. As disclosed therein, the raw data is received. The raw data is typically in a digitized form, and includes values for each node of the touch screen. The values may be between 0 and 256 where 0 equates to no touch pressure and 256 equates to full touch pressure. Thereafter, the raw data is filtered to reduce noise. Once filtered, gradient data, which indicates the topology of each group of connected points, is generated. Thereafter, the boundaries for touch regions are calculated based on the gradient data, i.e., a determination is made as to which points are grouped together to form each touch region. By way of example, a watershed algorithm may be used. Once the boundaries are determined, the data for each of the touch regions are calculated (e.g., x, y, Z, θ, A).
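By way of illustration, the following simplified sketch follows the pipeline described above, with thresholding and connected-component grouping standing in for the noise filtering and watershed segmentation; it computes a pressure-weighted centroid (x, y), total pressure Z, and area A for each touch region (angle θ is omitted for brevity). All names are illustrative.

```python
# Simplified feature extraction over a grid of raw node values (0-256).
# Connected-component grouping stands in for the watershed segmentation
# described in the text; this is a sketch, not the patented method.

def extract_features(raw, threshold=32):
    rows, cols = len(raw), len(raw[0])
    seen = set()
    features = []
    for r in range(rows):
        for c in range(cols):
            if raw[r][c] >= threshold and (r, c) not in seen:
                # Flood-fill one connected touch region.
                stack, region = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and raw[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                z = sum(raw[y][x] for y, x in region)
                features.append({
                    "id": len(features),
                    "x": sum(x * raw[y][x] for y, x in region) / z,
                    "y": sum(y * raw[y][x] for y, x in region) / z,
                    "Z": z,                # total pressure magnitude
                    "A": len(region),      # area in sensor nodes
                })
    return features

grid = [[0, 0, 0, 0],
        [0, 120, 200, 0],
        [0, 90, 160, 0],
        [0, 0, 0, 0]]
print(extract_features(grid))
```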
Following block 104, multipoint processing method 100 proceeds to block 106 where feature classification and groupings are performed. During classification, the identity of each of the features is determined. For example, the features may be classified as a particular finger, thumb, palm or other object. Once classified, the features may be grouped. The manner in which the groups are formed can vary widely. In most cases, the features are grouped based on some criteria (e.g., they carry a similar attribute). For example, the two features shown in
Following block 106, the multipoint processing method 100 proceeds to block 108 where key parameters for the feature groups are calculated. The key parameters may include distance between features, x/y centroid of all features, feature rotation, total pressure of the group (e.g., pressure at centroid), and the like. As shown in
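By way of illustration, the sketch below computes several of the key parameters named above (inter-feature distance, x/y centroid, and total group pressure) for a two-finger group; feature rotation requires comparison against a prior frame and is illustrated with the rotate method further below. Names are assumed.

```python
# A hedged sketch of computing key group parameters from features of
# the kind produced in the previous sketch. Not the patent's code.

import math

def group_parameters(features):
    total_z = sum(f["Z"] for f in features)
    # Pressure-weighted x/y centroid of the group.
    cx = sum(f["x"] * f["Z"] for f in features) / total_z
    cy = sum(f["y"] * f["Z"] for f in features) / total_z
    # Distance between the first two features (finger separation).
    f1, f2 = features[0], features[1]
    dist = math.hypot(f2["x"] - f1["x"], f2["y"] - f1["y"])
    return {"centroid": (cx, cy), "distance": dist, "pressure": total_z}

thumb = {"x": 0.0, "y": 0.0, "Z": 100}
index = {"x": 3.0, "y": 4.0, "Z": 100}
print(group_parameters([thumb, index]))
# {'centroid': (1.5, 2.0), 'distance': 5.0, 'pressure': 200}
```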
Following block 108, the process flow proceeds to block 110 where the group is associated with a user interface (UI) element. UI elements include buttons, boxes, lists, sliders, wheels, knobs, etc. Each UI element represents a component or control of the user interface. The application behind the UI element(s) has access to the parameter data calculated in block 108. In one implementation, the application ranks the relevance of the touch data to the UI element corresponding thereto. The ranking may be based on some predetermined criteria. The ranking may include producing a figure of merit, and whichever UI element has the highest figure of merit is given sole access to the group. There may even be some degree of hysteresis as well (once one of the UI elements claims control of that group, the group sticks with the UI element until another UI element has a much higher ranking). By way of example, the ranking may include determining proximity of the centroid (or features) to the GUI object associated with the UI element.
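By way of illustration, the following sketch shows one plausible reading of the ranking and hysteresis behavior: each UI element produces a figure of merit based on centroid proximity, and a challenger claims the group only when its figure of merit clearly exceeds the current owner's. The class names and the margin constant are assumptions.

```python
# Illustrative figure-of-merit ranking with hysteresis; assumed names.

import math

class UIElement:
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y

    def merit(self, centroid):
        # Closer centroid -> higher figure of merit.
        return 1.0 / (1.0 + math.hypot(self.x - centroid[0],
                                       self.y - centroid[1]))

def claim(elements, centroid, owner=None, margin=1.5):
    best = max(elements, key=lambda e: e.merit(centroid))
    if owner is not None and best is not owner:
        # Hysteresis: keep the current owner unless the challenger's
        # figure of merit is markedly higher.
        if best.merit(centroid) < margin * owner.merit(centroid):
            return owner
    return best

knob = UIElement("volume knob", 10, 10)
photo = UIElement("photo", 50, 50)
owner = claim([knob, photo], (12, 11))   # knob claims the touch group
print(owner.name, claim([knob, photo], (30, 30), owner).name)
```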
Following block 110, the multipoint processing method 100 proceeds to blocks 112 and 114. The blocks 112 and 114 can be performed approximately at the same time. From the user perspective, in one embodiment, the blocks 112 and 114 appear to be performed concurrently. In block 112, one or more actions are performed based on differences between initial and current parameter values as well as the UI element to which they are associated. In block 114, user feedback pertaining to the one or more actions being performed is provided. By way of example, user feedback may include display, audio, tactile feedback and/or the like.
The above methods and techniques can be used to implement any number of GUI objects and actions. For example, gestures can be created to detect and effect a user command to resize a window, scroll a display, rotate an object, zoom in or out of a displayed view, delete or insert text or other objects, etc. Gestures can also be used to invoke and manipulate virtual control interfaces, such as volume knobs, switches, sliders, handles, knobs, doors, and other widgets that may be created to facilitate human interaction with the computing system.
To cite an example using the above methodologies, and referring to
Once knob 170 is displayed as shown in
As shown in
As shown in
As shown in
As shown in
Also as shown in
As shown in
It should be noted that additional gestures can be performed simultaneously with the virtual control knob gesture. For example, more than one virtual control knob can be controlled at the same time using both hands, i.e., one hand for each virtual control knob. Alternatively or additionally, one or more slider bars can be controlled at the same time as the virtual control knob, i.e., one hand operates the virtual control knob, while at least one finger and maybe more than one finger of the opposite hand operates at least one slider and maybe more than one slider bar, e.g., one slider bar for each finger.
It should also be noted that although the embodiment is described using a virtual control knob, in another embodiment, the UI element can be a virtual scroll wheel. As an example, the virtual scroll wheel can mimic an actual scroll wheel such as those described in U.S. Patent Publication Nos.: 2003/0076303A1, 2003/0076301A1, 2003/0095096A1, which are all herein incorporated by reference. For example, when the user places their finger on the surface of the virtual scroll wheel and makes a swirling, rotational or tangential gesture motion, a scrolling action can be performed with respect to a list of items displayed in a window.
If the user input is classified as a gesture input, the touch-based method 200 proceeds to block 208 where one or more gesture control actions are performed corresponding to the user input. The gesture control actions are based at least in part on changes that occur with or between the at least two unique identifiers.
Following block 304, the touch-based method 300 proceeds to block 306 where the GUI object is modified based on and in unison with the gesture input. By modified, it is meant that the GUI object changes according to the particular gesture or gestures being performed. By in unison, it is meant that the changes occur approximately while the gesture or gestures are being performed. In most cases, there is a one to one relationship between the gesture(s) and the changes occurring at the GUI object and they occur substantially simultaneously. In essence, the GUI object follows the motion of the fingers. For example, spreading of the fingers may simultaneously enlarge the object, closing of the fingers may simultaneously reduce the GUI object, rotating the fingers may simultaneously rotate the object, and translating the fingers may allow simultaneous panning or scrolling of the GUI object.
In one embodiment, block 306 may include determining which GUI object is associated with the gesture being performed, and thereafter locking the displayed object to the fingers disposed over it such that the GUI object changes in accordance with the gestural input. By locking or associating the fingers to the GUI object, the GUI object can continuously adjust itself in accordance with what the fingers are doing on the touch screen. Often the determination and locking occurs at set down, i.e., when the finger is positioned on the touch screen.
Following block 352, the zoom gesture method 350 proceeds to block 354 where the distance between at least the two fingers is compared. The distance may be from finger to finger or from each finger to some other reference point as for example the centroid. If the distance between the two fingers increases (spread apart), a zoom-in signal is generated as shown in block 356. If the distance between two fingers decreases (close together), a zoom-out signal is generated as shown in block 358. In most cases, the set down of the fingers will associate or lock the fingers to a particular GUI object being displayed. For example, the touch sensitive surface can be a touch screen, and the GUI object can be displayed on the touch screen. This typically occurs when at least one of the fingers is positioned over the GUI object. As a result, when the fingers are moved apart, the zoom-in signal can be used to increase the size of the embedded features in the GUI object and when the fingers are pinched together, the zoom-out signal can be used to decrease the size of embedded features in the object. The zooming typically occurs within a predefined boundary such as the periphery of the display, the periphery of a window, the edge of the GUI object, and/or the like. The embedded features may be formed on a plurality of layers, each of which represents a different level of zoom. In most cases, the amount of zooming varies according to the distance between the two objects. Furthermore, the zooming typically can occur substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or close together, the object zooms in or zooms out at the same time. Although this methodology is directed at zooming, it should be noted that it may also be used for enlarging or reducing. The zoom gesture method 350 may be particularly useful in graphical programs such as publishing, photo, and drawing programs. Moreover, zooming may be used to control a peripheral device such as a camera, i.e., when the fingers are spread apart, the camera zooms out and when the fingers are closed the camera zooms in.
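By way of illustration, the following sketch captures the core of blocks 354 through 358: finger separation on consecutive frames is compared, and a zoom-in or zoom-out signal is produced whose magnitude follows the change in separation. The frame representation is assumed.

```python
# Minimal zoom gesture sketch: compare finger-to-finger distance on
# consecutive frames and emit a signal whose magnitude tracks the
# change in separation. Names and representation are assumptions.

import math

def zoom_signal(prev_pair, curr_pair):
    d0 = math.dist(prev_pair[0], prev_pair[1])
    d1 = math.dist(curr_pair[0], curr_pair[1])
    if d1 > d0:
        return ("zoom-in", d1 / d0)    # fingers spreading apart
    if d1 < d0:
        return ("zoom-out", d1 / d0)   # fingers pinching together
    return ("none", 1.0)

print(zoom_signal([(0, 0), (10, 0)], [(0, 0), (15, 0)]))
# ('zoom-in', 1.5)
```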
Following block 452, the rotate method 450 proceeds to block 454 where the angle of each of the fingers is set. The angles are typically determined relative to a reference point. Following block 454, rotate method 450 proceeds to block 456 where a rotate signal is generated when the angle of at least one of the objects changes relative to the reference point. In most cases, the set down of the fingers will associate or lock the fingers to a particular GUI object displayed on the touch screen. Typically, when at least one of the fingers is positioned over the image on the GUI object, the GUI object will be associated with or locked to the fingers. As a result, when the fingers are rotated, the rotate signal can be used to rotate the object in the direction of finger rotation (e.g., clockwise, counterclockwise). In most cases, the amount of object rotation varies according to the amount of finger rotation, i.e., if the fingers move 5 degrees then so will the object. Furthermore, the rotation typically can occur substantially simultaneously with the motion of the fingers. For instance, as the fingers rotate, the object rotates with the fingers at the same time.
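A comparable sketch for the rotate method: the angle of a finger about the pair's centroid is measured on consecutive frames, and the wrapped change becomes a rotate signal (positive values here denote counterclockwise rotation in the standard math convention). The representation is assumed.

```python
# Minimal rotate gesture sketch: measure one finger's angle about the
# pair's centroid on consecutive frames; the angle change is the
# rotate signal, in degrees. A sketch, not the patented method.

import math

def rotate_signal(prev_pair, curr_pair):
    def angle(pair):
        cx = (pair[0][0] + pair[1][0]) / 2
        cy = (pair[0][1] + pair[1][1]) / 2
        return math.atan2(pair[0][1] - cy, pair[0][0] - cx)
    delta = math.degrees(angle(curr_pair) - angle(prev_pair))
    delta = (delta + 180.0) % 360.0 - 180.0   # wrap into (-180, 180]
    return ("rotate", delta)

print(rotate_signal([(0, 0), (10, 0)], [(5, -5), (5, 5)]))
# ('rotate', 90.0)
```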
It should be noted that the methods described in
Following block 504, the GUI operational method 500 proceeds to block 506 where an image in the vicinity of the object is generated. The image is typically based on the recognized object. The image may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. In some cases, the user can select and activate the image (or features embedded therein) in order to initiate functions and tasks. By way of example, the image may be a user interface element or a group of user interface elements (e.g., one or more buttons that open, close, minimize, or maximize a window). The image may also be one or more icons that launch a particular program or files that open when selected. The image may additionally correspond to non interactive text and graphics. In most cases, the image is displayed as long as the object is detected or it may be displayed for some preset amount of time, i.e., after a period of time it times out and is removed.
In one particular embodiment, the image includes one or more control options that can be selected by the user. The control options may include one or more control buttons for implementing various tasks. For example, the control option box may include music listening control buttons as for example, play, pause, seek and menu.
The GUI operational method 650 generally begins at block 652 where a graphical image is displayed on a GUI. Following block 652, the GUI operational method 650 proceeds to block 654 where a scrolling or panning stroke on a touch sensitive surface is detected. By way of example, the stroke may be a linear or rotational stroke. During a linear stroke, the direction of scrolling or panning typically follows the direction of the stroke. During a rotational stroke (see
The GUI operational method 650 may additionally include blocks A and B. In block A, an object such as a finger is detected on the touch sensitive surface when the image is moving without the assistance of the object (block 660). In block B, the motion of the image is stopped when the object is detected, i.e., the new touch serves as a braking means. Using the metaphor above, while the piece of paper is sliding across the desktop, the user presses their finger on the paper thereby stopping its motion.
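By way of illustration, the sketch below models the inertia and braking behavior of blocks A and B: after lift-off the image continues at the release velocity, decays under a virtual friction factor, and stops immediately when a new touch is detected. The constants are assumptions, not values from the patent.

```python
# Illustrative inertia-with-braking sketch; assumed names and constants.

def glide(velocity, friction=0.9, touch_frames=frozenset(), max_frames=50):
    position, frame = 0.0, 0
    while abs(velocity) > 0.5 and frame < max_frames:
        if frame in touch_frames:      # finger set down: brake
            return position, frame
        position += velocity
        velocity *= friction           # virtual friction each frame
        frame += 1
    return position, frame

print(glide(20.0))                     # coasts to a stop on its own
print(glide(20.0, touch_frames={5}))   # stopped early by a new touch
```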
Scrolling generally pertains to moving displayed data or images (e.g., media items 681) across a viewing area on a display screen so that a new set of data (e.g., media items 681) is brought into view in the viewing area. In most cases, once the viewing area is full, each new set of data appears at the edge of the viewing area and all other sets of data move over one position. That is, the new set of data appears for each set of data that moves out of the viewing area. In essence, these functions allow a user to view consecutive sets of data currently outside of the viewing area. In most cases, the user is able to accelerate their traversal through the data sets by moving his or her finger at greater speeds. Examples of scrolling through lists can be found in U.S. Patent Publication Nos.: 2003/0076303A1, 2003/0076301A1, 2003/0095096A1, which are herein incorporated by reference.
As shown in
In one embodiment, only a single control signal is generated when the first object is detected over the first key and when the second object is detected over the second key at the same time. By way of example, the first key may be a shift key and the second key may be a symbol key (e.g., letters, numbers). In this manner, the keyboard acts like a traditional keyboard, i.e., the user is allowed to select multiple keys at the same time in order to change the symbol, e.g., lower/upper case. The keys may also correspond to the control key, alt key, escape key, function key, and the like.
In another embodiment, a control signal is generated for each actuated key (key touch) that occurs at the same time. For example, groups of characters can be typed at the same time. In some cases, the application running behind the keyboard may be configured to determine the order of the characters based on some predetermined criteria. For example, although the characters may be jumbled, the application can determine the correct order of the characters based on spelling, usage, context, and the like.
Although only two keys are described, it should be noted that two keys is not a limitation and that more than two keys may be actuated simultaneously to produce one or more control signals. For example, control-alt-delete functionality may be implemented or larger groups of characters can be typed at the same time.
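By way of illustration, the following sketch shows one plausible way simultaneously actuated keys could be combined, so that a modifier held with a symbol key yields a single control signal (e.g., an upper-case character) while larger chords such as control-alt-delete also map to one signal. The key names are assumptions.

```python
# Illustrative chorded-key sketch; key names and behavior are assumed.

MODIFIERS = {"shift", "ctrl", "alt"}

def control_signal(touched_keys):
    mods = sorted(k for k in touched_keys if k in MODIFIERS)
    symbols = sorted(k for k in touched_keys if k not in MODIFIERS)
    if mods == ["shift"] and len(symbols) == 1:
        return symbols[0].upper()          # one signal for shift+key
    return "+".join(mods + symbols)        # e.g., 'alt+ctrl+delete'

print(control_signal({"shift", "a"}))            # 'A'
print(control_signal({"ctrl", "alt", "delete"})) # 'alt+ctrl+delete'
```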
In some cases, the principles of inertia as described above can be applied to the virtual scroll wheel. In cases such as these, the virtual scroll wheel continues to rotate when the fingers (or one of the fingers) are lifted off of the virtual scroll wheel and slowly comes to a stop via virtual friction. Alternatively or additionally, the continuous rotation can be stopped by placing the fingers (or the removed finger) back on the scroll wheel thereby braking the rotation of the virtual scroll wheel.
It should be noted that although a surface scroll wheel is shown, the principles thereof can be applied to more conventional scroll wheels which are virtually based. For example, scroll wheels, whose axis is parallel to the display screen and which appear to protrude through the display screen as shown in
The various aspects, embodiments, implementations or features of the invention can be used separately or in any combination.
The invention is preferably implemented by hardware, software or a combination of hardware and software. The software can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents, which fall within the scope of this invention. For example, although the invention has been primarily directed at touchscreens, it should be noted that in some cases touch pads may also be used in place of touchscreens. Other types of touch sensing devices may also be utilized. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
Number | Name | Date | Kind |
---|---|---|---|
3333160 | Gorski | Jul 1967 | A |
3541541 | Engelbart | Nov 1970 | A |
3609695 | Pirkle | Sep 1971 | A |
3662105 | Hurst et al. | May 1972 | A |
3748751 | Breglia et al. | Jul 1973 | A |
3757322 | Barkan et al. | Sep 1973 | A |
3798370 | Hurst | Mar 1974 | A |
3825730 | Worthington, Jr. et al. | Jul 1974 | A |
3846826 | Mueller | Nov 1974 | A |
4014000 | Uno et al. | Mar 1977 | A |
4017848 | Tannas, Jr. | Apr 1977 | A |
4146924 | Birk et al. | Mar 1979 | A |
4202041 | Kaplow et al. | May 1980 | A |
4219847 | Pinkney et al. | Aug 1980 | A |
4246452 | Chandler | Jan 1981 | A |
4303856 | Serras-Paulet | Dec 1981 | A |
4305071 | Bell et al. | Dec 1981 | A |
4305131 | Best | Dec 1981 | A |
4346376 | Mallos | Aug 1982 | A |
4375674 | Thornton | Mar 1983 | A |
4396945 | DiMatteo et al. | Aug 1983 | A |
4435835 | Sakow et al. | Mar 1984 | A |
4475122 | Green | Oct 1984 | A |
4484179 | Kasday | Nov 1984 | A |
4542375 | Alles et al. | Sep 1985 | A |
4550221 | Mabusth | Oct 1985 | A |
4561017 | Greene | Dec 1985 | A |
4613942 | Chen | Sep 1986 | A |
4629319 | Clarke et al. | Dec 1986 | A |
4631525 | Serravalle, Jr. | Dec 1986 | A |
4631676 | Pugh | Dec 1986 | A |
4644100 | Brenner et al. | Feb 1987 | A |
4644326 | Villalobos et al. | Feb 1987 | A |
4654872 | Hisano et al. | Mar 1987 | A |
4672364 | Lucas | Jun 1987 | A |
4672558 | Beckes et al. | Jun 1987 | A |
4686374 | Liptay-Wagner et al. | Aug 1987 | A |
4692809 | Beining et al. | Sep 1987 | A |
4695827 | Beining et al. | Sep 1987 | A |
4703306 | Barritt | Oct 1987 | A |
4710760 | Kasday | Dec 1987 | A |
4733222 | Evans | Mar 1988 | A |
4734685 | Watanabe | Mar 1988 | A |
4746770 | McAvinney | May 1988 | A |
4763356 | Day, Jr. et al. | Aug 1988 | A |
4771276 | Parks | Sep 1988 | A |
4772028 | Rockhold et al. | Sep 1988 | A |
4787040 | Ames et al. | Nov 1988 | A |
4788384 | Bruere-Dawson et al. | Nov 1988 | A |
4806846 | Kerber | Feb 1989 | A |
4814759 | Gombrich et al. | Mar 1989 | A |
4853888 | Lata et al. | Aug 1989 | A |
4898555 | Sampson | Feb 1990 | A |
4948371 | Hall | Aug 1990 | A |
4968877 | McAvinney et al. | Nov 1990 | A |
4988981 | Zimmerman et al. | Jan 1991 | A |
4993806 | Clausen et al. | Feb 1991 | A |
5003519 | Noirjean | Mar 1991 | A |
5017030 | Crews | May 1991 | A |
5038401 | Inotsume | Aug 1991 | A |
5045843 | Hansen | Sep 1991 | A |
5045846 | Gay et al. | Sep 1991 | A |
5072294 | Engle | Dec 1991 | A |
5119079 | Hube et al. | Jun 1992 | A |
5153829 | Furuya et al. | Oct 1992 | A |
5168531 | Sigel | Dec 1992 | A |
5178477 | Gambaro | Jan 1993 | A |
5189403 | Franz et al. | Feb 1993 | A |
5194862 | Edwards | Mar 1993 | A |
5212555 | Stoltz | May 1993 | A |
5224861 | Glass et al. | Jul 1993 | A |
5227985 | DeMenthon | Jul 1993 | A |
5241308 | Young | Aug 1993 | A |
5252951 | Tannenbaum et al. | Oct 1993 | A |
5281966 | Walsh | Jan 1994 | A |
5297041 | Kushler et al. | Mar 1994 | A |
5305017 | Gerpheide | Apr 1994 | A |
5319386 | Gunn et al. | Jun 1994 | A |
5328190 | Dart et al. | Jul 1994 | A |
5345543 | Capps et al. | Sep 1994 | A |
5347295 | Agulnick et al. | Sep 1994 | A |
5347629 | Barrett et al. | Sep 1994 | A |
5367453 | Capps et al. | Nov 1994 | A |
5376948 | Roberts | Dec 1994 | A |
5379057 | Clough et al. | Jan 1995 | A |
5398310 | Tchao et al. | Mar 1995 | A |
5412189 | Cragun | May 1995 | A |
5418760 | Kawashima et al. | May 1995 | A |
5422656 | Allard et al. | Jun 1995 | A |
5442742 | Greyson et al. | Aug 1995 | A |
5459793 | Naoi et al. | Oct 1995 | A |
5463388 | Boie et al. | Oct 1995 | A |
5463696 | Beernink et al. | Oct 1995 | A |
5463725 | Henckel et al. | Oct 1995 | A |
5471578 | Moran et al. | Nov 1995 | A |
5479528 | Speeter | Dec 1995 | A |
5483261 | Yasutake | Jan 1996 | A |
5488204 | Mead et al. | Jan 1996 | A |
5495077 | Miller et al. | Feb 1996 | A |
5495269 | Elrod et al. | Feb 1996 | A |
5495576 | Ritchey | Feb 1996 | A |
5502514 | Vogeley et al. | Mar 1996 | A |
5510806 | Busch | Apr 1996 | A |
5511148 | Wellner | Apr 1996 | A |
5513309 | Meier et al. | Apr 1996 | A |
5515079 | Hauck | May 1996 | A |
5523775 | Capps | Jun 1996 | A |
5530455 | Gillick et al. | Jun 1996 | A |
5530456 | Kokubo | Jun 1996 | A |
5543590 | Gillespie et al. | Aug 1996 | A |
5543591 | Gillespie et al. | Aug 1996 | A |
5548667 | Tu | Aug 1996 | A |
5559301 | Bryan, Jr. et al. | Sep 1996 | A |
5563632 | Roberts | Oct 1996 | A |
5563996 | Tchao | Oct 1996 | A |
5565658 | Gerpheide et al. | Oct 1996 | A |
5568604 | Hansen | Oct 1996 | A |
5572239 | Jaeger | Nov 1996 | A |
5572647 | Blades | Nov 1996 | A |
5579036 | Yates, IV | Nov 1996 | A |
5581243 | Ouellette et al. | Dec 1996 | A |
5581681 | Tchao et al. | Dec 1996 | A |
5583946 | Gourdol | Dec 1996 | A |
5589856 | Stein et al. | Dec 1996 | A |
5590219 | Gourdol | Dec 1996 | A |
5592566 | Pagallo et al. | Jan 1997 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5594810 | Gourdol | Jan 1997 | A |
5596694 | Capps | Jan 1997 | A |
5612719 | Beernink et al. | Mar 1997 | A |
5613913 | Ikematsu et al. | Mar 1997 | A |
5625715 | Trew et al. | Apr 1997 | A |
5631805 | Bonsall | May 1997 | A |
5633955 | Bozinovic et al. | May 1997 | A |
5634102 | Capps | May 1997 | A |
5636101 | Bonsall et al. | Jun 1997 | A |
5642108 | Gopher et al. | Jun 1997 | A |
5644657 | Capps et al. | Jul 1997 | A |
5649706 | Treat, Jr. et al. | Jul 1997 | A |
5666113 | Logan | Sep 1997 | A |
5666502 | Capps | Sep 1997 | A |
5666552 | Greyson et al. | Sep 1997 | A |
5675361 | Santilli | Oct 1997 | A |
5675362 | Clough et al. | Oct 1997 | A |
5677710 | Thompson-Rohrlich | Oct 1997 | A |
5689253 | Hargreaves et al. | Nov 1997 | A |
5689667 | Kurtenbach | Nov 1997 | A |
5709219 | Chen et al. | Jan 1998 | A |
5710844 | Capps et al. | Jan 1998 | A |
5711624 | Klauber | Jan 1998 | A |
5712661 | Jaeger | Jan 1998 | A |
5726685 | Kuth et al. | Mar 1998 | A |
5729249 | Yasutake | Mar 1998 | A |
5729250 | Bishop et al. | Mar 1998 | A |
5730165 | Philipp | Mar 1998 | A |
5736974 | Selker | Apr 1998 | A |
5736975 | Lunetta | Apr 1998 | A |
5736976 | Cheung | Apr 1998 | A |
5741990 | Davies | Apr 1998 | A |
5745116 | Pisutha-Arnond | Apr 1998 | A |
5745716 | Tchao et al. | Apr 1998 | A |
5745719 | Falcon | Apr 1998 | A |
5746818 | Yatake | May 1998 | A |
5748184 | Shieh | May 1998 | A |
5748269 | Harris et al. | May 1998 | A |
5748512 | Vargas | May 1998 | A |
5764222 | Shieh | Jun 1998 | A |
5764818 | Capps et al. | Jun 1998 | A |
5767457 | Gerpheide et al. | Jun 1998 | A |
5767842 | Korth | Jun 1998 | A |
5777603 | Jaeger | Jul 1998 | A |
5790104 | Shieh | Aug 1998 | A |
5790107 | Kasser et al. | Aug 1998 | A |
5798760 | Vayda et al. | Aug 1998 | A |
5801941 | Bertram | Sep 1998 | A |
5802516 | Shwarts et al. | Sep 1998 | A |
5805145 | Jaeger | Sep 1998 | A |
5805146 | Jaeger et al. | Sep 1998 | A |
5805167 | Van Cruyningen | Sep 1998 | A |
5808567 | McCloud | Sep 1998 | A |
5808605 | Shieh | Sep 1998 | A |
5809267 | Moran et al. | Sep 1998 | A |
5818451 | Bertram et al. | Oct 1998 | A |
5821690 | Martens et al. | Oct 1998 | A |
5821930 | Hansen | Oct 1998 | A |
5823782 | Marcus et al. | Oct 1998 | A |
5825232 | Kimura | Oct 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5825351 | Tam | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5831601 | Vogeley et al. | Nov 1998 | A |
5841428 | Jaeger et al. | Nov 1998 | A |
5844506 | Binstead | Dec 1998 | A |
5844547 | Minakuchi et al. | Dec 1998 | A |
5850218 | LaJoie et al. | Dec 1998 | A |
5854625 | Frisch et al. | Dec 1998 | A |
5859631 | Bergman et al. | Jan 1999 | A |
5867149 | Jaeger | Feb 1999 | A |
5870091 | Lazarony et al. | Feb 1999 | A |
5871251 | Welling et al. | Feb 1999 | A |
5874948 | Shieh | Feb 1999 | A |
5877751 | Kanemitsu et al. | Mar 1999 | A |
5880411 | Gillespie et al. | Mar 1999 | A |
5883619 | Ho et al. | Mar 1999 | A |
5886697 | Naughton et al. | Mar 1999 | A |
5898434 | Small et al. | Apr 1999 | A |
5900876 | Yagita et al. | May 1999 | A |
5910800 | Shields et al. | Jun 1999 | A |
5920309 | Bisset et al. | Jul 1999 | A |
5923319 | Bishop et al. | Jul 1999 | A |
5933134 | Shieh | Aug 1999 | A |
5933141 | Smith | Aug 1999 | A |
5936613 | Jaeger et al. | Aug 1999 | A |
5943043 | Furuhata et al. | Aug 1999 | A |
5943044 | Martinelli et al. | Aug 1999 | A |
5943052 | Allen et al. | Aug 1999 | A |
5943053 | Ludolph et al. | Aug 1999 | A |
5949345 | Beckert et al. | Sep 1999 | A |
5956291 | Nehemiah et al. | Sep 1999 | A |
5956822 | Brieden et al. | Sep 1999 | A |
5963671 | Comerford et al. | Oct 1999 | A |
5974541 | Hall et al. | Oct 1999 | A |
5977867 | Blouin | Nov 1999 | A |
5982302 | Ure | Nov 1999 | A |
5982352 | Pryor | Nov 1999 | A |
5982353 | Gallery et al. | Nov 1999 | A |
5982355 | Jaeger et al. | Nov 1999 | A |
5995101 | Clark et al. | Nov 1999 | A |
5995104 | Kataoka et al. | Nov 1999 | A |
5995106 | Naughton et al. | Nov 1999 | A |
5999895 | Forest | Dec 1999 | A |
6002389 | Kasser | Dec 1999 | A |
6002808 | Freeman | Dec 1999 | A |
6005549 | Forest | Dec 1999 | A |
6005555 | Katsurahira et al. | Dec 1999 | A |
6008800 | Pryor | Dec 1999 | A |
6013956 | Anderson, Jr. | Jan 2000 | A |
6020881 | Naughton et al. | Feb 2000 | A |
6028271 | Gillespie et al. | Feb 2000 | A |
6031524 | Kunert | Feb 2000 | A |
6034685 | Kuriyama et al. | Mar 2000 | A |
6037882 | Levy | Mar 2000 | A |
6040824 | Maekawa et al. | Mar 2000 | A |
6049326 | Beyda et al. | Apr 2000 | A |
6049328 | Vanderheiden | Apr 2000 | A |
6050825 | Nichol et al. | Apr 2000 | A |
6052339 | Frenkel et al. | Apr 2000 | A |
6054984 | Alexander | Apr 2000 | A |
6054990 | Tran | Apr 2000 | A |
6057540 | Gordon et al. | May 2000 | A |
6057845 | Dupouy | May 2000 | A |
6061177 | Fujimoto | May 2000 | A |
6066075 | Poulton | May 2000 | A |
6072494 | Nguyen | Jun 2000 | A |
6073036 | Heikkinen et al. | Jun 2000 | A |
6075531 | DeStefano | Jun 2000 | A |
6084576 | Leu et al. | Jul 2000 | A |
6094197 | Buxton et al. | Jul 2000 | A |
6104384 | Moon et al. | Aug 2000 | A |
6105419 | Michels et al. | Aug 2000 | A |
6107997 | Ure | Aug 2000 | A |
6128003 | Smith et al. | Oct 2000 | A |
6130665 | Ericsson | Oct 2000 | A |
6131299 | Raab et al. | Oct 2000 | A |
6135958 | Mikula-Curtis et al. | Oct 2000 | A |
6144380 | Shwarts et al. | Nov 2000 | A |
6151596 | Hosomi | Nov 2000 | A |
6154194 | Singh | Nov 2000 | A |
6154201 | Levin et al. | Nov 2000 | A |
6154209 | Naughton et al. | Nov 2000 | A |
6160551 | Naughton et al. | Dec 2000 | A |
6169538 | Nowlan et al. | Jan 2001 | B1 |
6175610 | Peter | Jan 2001 | B1 |
6188391 | Seely et al. | Feb 2001 | B1 |
6198515 | Cole | Mar 2001 | B1 |
6208329 | Ballare | Mar 2001 | B1 |
6219035 | Skog et al. | Apr 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6222531 | Smith | Apr 2001 | B1 |
6239790 | Martinelli et al. | May 2001 | B1 |
6243071 | Shwarts et al. | Jun 2001 | B1 |
6246862 | Grivas et al. | Jun 2001 | B1 |
6249606 | Kiraly et al. | Jun 2001 | B1 |
6255604 | Tokioka et al. | Jul 2001 | B1 |
6256020 | Pabon et al. | Jul 2001 | B1 |
6259436 | Moon et al. | Jul 2001 | B1 |
6271835 | Hoeksma | Aug 2001 | B1 |
6278441 | Gouzman et al. | Aug 2001 | B1 |
6278443 | Amro et al. | Aug 2001 | B1 |
6288707 | Philipp | Sep 2001 | B1 |
6289326 | LaFleur | Sep 2001 | B1 |
6292178 | Bernstein et al. | Sep 2001 | B1 |
6292179 | Lee | Sep 2001 | B1 |
6295049 | Minner | Sep 2001 | B1 |
6295052 | Kato et al. | Sep 2001 | B1 |
6308144 | Bronfeld et al. | Oct 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6313853 | Lamontagne et al. | Nov 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6326956 | Jaeger et al. | Dec 2001 | B1 |
6333753 | Hinckley | Dec 2001 | B1 |
6337678 | Fish | Jan 2002 | B1 |
6339748 | Hiramatsu | Jan 2002 | B1 |
6344861 | Naughton et al. | Feb 2002 | B1 |
6347290 | Bartlett | Feb 2002 | B1 |
6359572 | Vale | Mar 2002 | B1 |
6359632 | Eastty et al. | Mar 2002 | B1 |
6377009 | Philipp | Apr 2002 | B1 |
6380931 | Gillespie et al. | Apr 2002 | B1 |
6400379 | Johnson et al. | Jun 2002 | B1 |
6411287 | Scharff et al. | Jun 2002 | B1 |
6414671 | Gillespie et al. | Jul 2002 | B1 |
6414672 | Rekimoto et al. | Jul 2002 | B2 |
6414674 | Kamper et al. | Jul 2002 | B1 |
6421042 | Omura et al. | Jul 2002 | B1 |
6421046 | Edgren | Jul 2002 | B1 |
6421234 | Ricks et al. | Jul 2002 | B1 |
6424338 | Anderson | Jul 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6433801 | Moon et al. | Aug 2002 | B1 |
6441806 | Jaeger | Aug 2002 | B1 |
6441807 | Yamaguchi | Aug 2002 | B1 |
6442440 | Miller | Aug 2002 | B1 |
6452514 | Philipp | Sep 2002 | B1 |
6456952 | Nathan | Sep 2002 | B1 |
6457355 | Philipp | Oct 2002 | B1 |
6457834 | Cotton et al. | Oct 2002 | B1 |
6466036 | Philipp | Oct 2002 | B1 |
6469722 | Kinoe et al. | Oct 2002 | B1 |
6473069 | Gerpheide | Oct 2002 | B1 |
6473102 | Rodden et al. | Oct 2002 | B1 |
6478432 | Dyner | Nov 2002 | B1 |
6480188 | Horsley | Nov 2002 | B1 |
6481851 | McNelley et al. | Nov 2002 | B1 |
6489978 | Gong et al. | Dec 2002 | B1 |
6501464 | Cobbley et al. | Dec 2002 | B1 |
6501515 | Iwamura | Dec 2002 | B1 |
6515669 | Mohri | Feb 2003 | B1 |
6525711 | Shaw et al. | Feb 2003 | B1 |
6525749 | Moran et al. | Feb 2003 | B1 |
6535200 | Philipp | Mar 2003 | B2 |
6543684 | White et al. | Apr 2003 | B1 |
6543947 | Lee | Apr 2003 | B2 |
6545670 | Pryor | Apr 2003 | B1 |
6563492 | Furuya et al. | May 2003 | B1 |
6570557 | Westerman et al. | May 2003 | B1 |
6570584 | Cok et al. | May 2003 | B1 |
6573844 | Venolia et al. | Jun 2003 | B1 |
6583676 | Krah et al. | Jun 2003 | B2 |
6593916 | Aroyan | Jul 2003 | B1 |
6597345 | Hirshberg | Jul 2003 | B2 |
6610936 | Gillespie et al. | Aug 2003 | B2 |
6611252 | DuFaux | Aug 2003 | B1 |
6624833 | Kumar et al. | Sep 2003 | B1 |
6639577 | Eberhard | Oct 2003 | B2 |
6639584 | Li | Oct 2003 | B1 |
6650319 | Hurst et al. | Nov 2003 | B1 |
6654733 | Goodman et al. | Nov 2003 | B1 |
6658994 | McMillan | Dec 2003 | B1 |
6661920 | Skinner | Dec 2003 | B1 |
6664989 | Snyder et al. | Dec 2003 | B1 |
6670894 | Mehring | Dec 2003 | B2 |
6677932 | Westerman | Jan 2004 | B1 |
6677933 | Yogaratnam | Jan 2004 | B1 |
6677934 | Blanchard | Jan 2004 | B1 |
6680677 | Tiphane | Jan 2004 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6697721 | Arlinsky | Feb 2004 | B2 |
6703999 | Iwanami et al. | Mar 2004 | B1 |
6710771 | Yamaguchi et al. | Mar 2004 | B1 |
6721784 | Leonard et al. | Apr 2004 | B1 |
6724366 | Crawford | Apr 2004 | B2 |
6757002 | Oross et al. | Jun 2004 | B1 |
6760020 | Uchiyama et al. | Jul 2004 | B1 |
6791467 | Ben-Ze'ev | Sep 2004 | B1 |
6795059 | Endo | Sep 2004 | B2 |
6798768 | Gallick et al. | Sep 2004 | B1 |
6803905 | Capps et al. | Oct 2004 | B1 |
6803906 | Morrison et al. | Oct 2004 | B1 |
6806869 | Yamakado | Oct 2004 | B2 |
6819312 | Fish | Nov 2004 | B2 |
6842672 | Straub et al. | Jan 2005 | B1 |
6856259 | Sharp | Feb 2005 | B1 |
6857800 | Zhang et al. | Feb 2005 | B2 |
6874129 | Smith | Mar 2005 | B2 |
6882641 | Gallick et al. | Apr 2005 | B1 |
6888536 | Westerman et al. | May 2005 | B2 |
6896375 | Peterson et al. | May 2005 | B2 |
6900795 | Knight, III et al. | May 2005 | B1 |
6903730 | Mathews et al. | Jun 2005 | B2 |
6920619 | Milekic | Jul 2005 | B1 |
6926609 | Martin | Aug 2005 | B2 |
6927761 | Badaye et al. | Aug 2005 | B2 |
6930661 | Uchida et al. | Aug 2005 | B2 |
6942571 | McAllister et al. | Sep 2005 | B1 |
6944591 | Raghunandan | Sep 2005 | B1 |
6952203 | Banerjee et al. | Oct 2005 | B2 |
6954583 | Nagasaka et al. | Oct 2005 | B2 |
6958749 | Matsushita et al. | Oct 2005 | B1 |
6961912 | Aoki et al. | Nov 2005 | B2 |
6965375 | Gettemy et al. | Nov 2005 | B1 |
6970749 | Chinn et al. | Nov 2005 | B1 |
6972401 | Akitt et al. | Dec 2005 | B2 |
6972749 | Hinckley et al. | Dec 2005 | B2 |
6975304 | Hawkins et al. | Dec 2005 | B1 |
6977666 | Hedrick | Dec 2005 | B1 |
6985801 | Straub et al. | Jan 2006 | B1 |
6992659 | Gettemy | Jan 2006 | B2 |
7002749 | Kremen | Feb 2006 | B2 |
7015894 | Morohoshi et al. | Mar 2006 | B2 |
7022075 | Grunwald et al. | Apr 2006 | B2 |
7030863 | Longe et al. | Apr 2006 | B2 |
7031228 | Born et al. | Apr 2006 | B2 |
7038659 | Rajkowski | May 2006 | B2 |
7057607 | Mayoraz et al. | Jun 2006 | B2 |
7058902 | Iwema et al. | Jun 2006 | B2 |
7075512 | Fabre et al. | Jul 2006 | B1 |
7079114 | Smith et al. | Jul 2006 | B1 |
7084859 | Pryor | Aug 2006 | B1 |
7091410 | Ito et al. | Aug 2006 | B2 |
7098891 | Pryor | Aug 2006 | B1 |
7098896 | Kushler et al. | Aug 2006 | B2 |
7100105 | Nishimura et al. | Aug 2006 | B1 |
7143355 | Yamaguchi et al. | Nov 2006 | B2 |
7149981 | Lundy et al. | Dec 2006 | B1 |
7158123 | Myers et al. | Jan 2007 | B2 |
7180502 | Marvit et al. | Feb 2007 | B2 |
7184064 | Zimmerman et al. | Feb 2007 | B2 |
7194699 | Thomson et al. | Mar 2007 | B2 |
7233316 | Smith et al. | Jun 2007 | B2 |
7233319 | Johnson et al. | Jun 2007 | B2 |
7240289 | Naughton et al. | Jul 2007 | B2 |
7242311 | Hoff et al. | Jul 2007 | B2 |
7260422 | Knoedgen | Aug 2007 | B2 |
7310781 | Chen et al. | Dec 2007 | B2 |
7319454 | Thacker et al. | Jan 2008 | B2 |
7320112 | Yamaguchi et al. | Jan 2008 | B2 |
7339580 | Westerman et al. | Mar 2008 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7340685 | Chen et al. | Mar 2008 | B2 |
7346853 | Chen et al. | Mar 2008 | B2 |
7417681 | Lieberman et al. | Aug 2008 | B2 |
7443316 | Lim | Oct 2008 | B2 |
7466843 | Pryor | Dec 2008 | B2 |
7474772 | Russo et al. | Jan 2009 | B2 |
7475390 | Berstis et al. | Jan 2009 | B2 |
7477240 | Yanagisawa | Jan 2009 | B2 |
7478336 | Chen et al. | Jan 2009 | B2 |
7489303 | Pryor | Feb 2009 | B1 |
7509113 | Knoedgen | Mar 2009 | B2 |
7515810 | Nagasaka et al. | Apr 2009 | B2 |
7519223 | Dehlin et al. | Apr 2009 | B2 |
7526738 | Ording et al. | Apr 2009 | B2 |
7614008 | Ording | Nov 2009 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7664748 | Harrity | Feb 2010 | B2 |
7694231 | Kocienda et al. | Apr 2010 | B2 |
7706616 | Kristensson et al. | Apr 2010 | B2 |
7714849 | Pryor | May 2010 | B2 |
7746325 | Roberts | Jun 2010 | B2 |
7831932 | Josephsoon et al. | Nov 2010 | B2 |
7856472 | Arav | Dec 2010 | B2 |
7861188 | Josephsoon et al. | Dec 2010 | B2 |
7877705 | Chambers et al. | Jan 2011 | B2 |
7898529 | Fitzmaurice et al. | Mar 2011 | B2 |
8239784 | Hotelling et al. | Aug 2012 | B2 |
8314775 | Westerman et al. | Nov 2012 | B2 |
8330727 | Westerman et al. | Dec 2012 | B2 |
8334846 | Westerman et al. | Dec 2012 | B2 |
20010012001 | Rekimoto et al. | Aug 2001 | A1 |
20010012022 | Smith | Aug 2001 | A1 |
20010012769 | Sirola et al. | Aug 2001 | A1 |
20010015718 | Hinckley et al. | Aug 2001 | A1 |
20010026678 | Nagasaka et al. | Oct 2001 | A1 |
20010040554 | Nakagawa | Nov 2001 | A1 |
20010055038 | Kim | Dec 2001 | A1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020044132 | Fish | Apr 2002 | A1 |
20020044161 | Sugai | Apr 2002 | A1 |
20020051018 | Yeh | May 2002 | A1 |
20020054175 | Miettinen et al. | May 2002 | A1 |
20020075317 | Dardick | Jun 2002 | A1 |
20020080123 | Kennedy et al. | Jun 2002 | A1 |
20020085037 | Leavitt et al. | Jul 2002 | A1 |
20020113778 | Rekimoto et al. | Aug 2002 | A1 |
20020118848 | Karpenstein | Aug 2002 | A1 |
20020120543 | Brittingham et al. | Aug 2002 | A1 |
20020130839 | Wallace et al. | Sep 2002 | A1 |
20020133522 | Greetham et al. | Sep 2002 | A1 |
20020135615 | Lang | Sep 2002 | A1 |
20020140633 | Rafii et al. | Oct 2002 | A1 |
20020140668 | Crawford | Oct 2002 | A1 |
20020140679 | Wen | Oct 2002 | A1 |
20020140680 | Lu | Oct 2002 | A1 |
20020167545 | Kang et al. | Nov 2002 | A1 |
20020180763 | Kung et al. | Dec 2002 | A1 |
20020191029 | Gillespie et al. | Dec 2002 | A1 |
20020196227 | Surloff et al. | Dec 2002 | A1 |
20020196238 | Tsukada et al. | Dec 2002 | A1 |
20020196274 | Comfort et al. | Dec 2002 | A1 |
20030001899 | Partanen et al. | Jan 2003 | A1 |
20030005454 | Rodriguez et al. | Jan 2003 | A1 |
20030006974 | Clough et al. | Jan 2003 | A1 |
20030016253 | Aoki et al. | Jan 2003 | A1 |
20030030664 | Parry | Feb 2003 | A1 |
20030063073 | Geaghan et al. | Apr 2003 | A1 |
20030071850 | Geidl | Apr 2003 | A1 |
20030071858 | Morohoshi | Apr 2003 | A1 |
20030072077 | Peterson et al. | Apr 2003 | A1 |
20030073461 | Sinclair | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030076306 | Zadesky et al. | Apr 2003 | A1 |
20030076363 | Murphy | Apr 2003 | A1 |
20030095095 | Pihlaja | May 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030095135 | Kaasila et al. | May 2003 | A1 |
20030098858 | Perski et al. | May 2003 | A1 |
20030128188 | Wilbrink et al. | Jul 2003 | A1 |
20030128195 | Banerjee et al. | Jul 2003 | A1 |
20030132950 | Surucu et al. | Jul 2003 | A1 |
20030152241 | Eastty et al. | Aug 2003 | A1 |
20030160808 | Foote et al. | Aug 2003 | A1 |
20030164820 | Kent | Sep 2003 | A1 |
20030169303 | Islam et al. | Sep 2003 | A1 |
20030179201 | Thacker | Sep 2003 | A1 |
20030193481 | Sokolsky | Oct 2003 | A1 |
20030197736 | Murphy | Oct 2003 | A1 |
20030201972 | Usuda | Oct 2003 | A1 |
20030206202 | Moriya | Nov 2003 | A1 |
20030210260 | Palmer et al. | Nov 2003 | A1 |
20030210286 | Gerpheide et al. | Nov 2003 | A1 |
20030214534 | Uemura et al. | Nov 2003 | A1 |
20030222977 | Yoshino | Dec 2003 | A1 |
20030234768 | Rekimoto et al. | Dec 2003 | A1 |
20030237043 | Novak et al. | Dec 2003 | A1 |
20040001048 | Kraus et al. | Jan 2004 | A1 |
20040009788 | Mantyjarvi et al. | Jan 2004 | A1 |
20040017499 | Ambiru | Jan 2004 | A1 |
20040019505 | Bowman et al. | Jan 2004 | A1 |
20040021643 | Hoshino et al. | Feb 2004 | A1 |
20040021644 | Enomoto | Feb 2004 | A1 |
20040021696 | Molgaard | Feb 2004 | A1 |
20040036622 | Dukach et al. | Feb 2004 | A1 |
20040046887 | Ikehata et al. | Mar 2004 | A1 |
20040053661 | Jones et al. | Mar 2004 | A1 |
20040056837 | Koga et al. | Mar 2004 | A1 |
20040056839 | Yoshihara | Mar 2004 | A1 |
20040056849 | Lohbihler et al. | Mar 2004 | A1 |
20040064473 | Thomas et al. | Apr 2004 | A1 |
20040080529 | Wojcik | Apr 2004 | A1 |
20040100479 | Nakano et al. | May 2004 | A1 |
20040119750 | Harrison | Jun 2004 | A1 |
20040125081 | Hayakawa | Jul 2004 | A1 |
20040134238 | Buckroyd et al. | Jul 2004 | A1 |
20040135774 | La Monica | Jul 2004 | A1 |
20040135818 | Thomson et al. | Jul 2004 | A1 |
20040136564 | Roeber et al. | Jul 2004 | A1 |
20040140956 | Kushler et al. | Jul 2004 | A1 |
20040141157 | Ramachandran et al. | Jul 2004 | A1 |
20040145601 | Brielmann et al. | Jul 2004 | A1 |
20040146688 | Treat | Jul 2004 | A1 |
20040150668 | Myers et al. | Aug 2004 | A1 |
20040150669 | Sabiers et al. | Aug 2004 | A1 |
20040155888 | Padgitt et al. | Aug 2004 | A1 |
20040160419 | Padgitt | Aug 2004 | A1 |
20040165924 | Griffin | Aug 2004 | A1 |
20040178994 | Kairls, Jr. | Sep 2004 | A1 |
20040181749 | Chellapilla et al. | Sep 2004 | A1 |
20040183833 | Chua | Sep 2004 | A1 |
20040189720 | Wilson et al. | Sep 2004 | A1 |
20040198463 | Knoedgen | Oct 2004 | A1 |
20040218963 | Van Diepen et al. | Nov 2004 | A1 |
20040227739 | Tani et al. | Nov 2004 | A1 |
20040227830 | Kobayashi et al. | Nov 2004 | A1 |
20040230912 | Clow et al. | Nov 2004 | A1 |
20040237052 | Allport | Nov 2004 | A1 |
20040245352 | Smith | Dec 2004 | A1 |
20040262387 | Hart | Dec 2004 | A1 |
20040263484 | Mantysalo et al. | Dec 2004 | A1 |
20050005241 | Hunleth et al. | Jan 2005 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050015731 | Mak et al. | Jan 2005 | A1 |
20050016366 | Ito et al. | Jan 2005 | A1 |
20050024341 | Gillespie et al. | Feb 2005 | A1 |
20050052425 | Zadesky et al. | Mar 2005 | A1 |
20050052427 | Wu et al. | Mar 2005 | A1 |
20050057524 | Hill et al. | Mar 2005 | A1 |
20050064936 | Pryor | Mar 2005 | A1 |
20050066270 | Ali et al. | Mar 2005 | A1 |
20050071771 | Nagasawa et al. | Mar 2005 | A1 |
20050078087 | Gates et al. | Apr 2005 | A1 |
20050091577 | Torres et al. | Apr 2005 | A1 |
20050104867 | Westerman et al. | May 2005 | A1 |
20050110768 | Marriott et al. | May 2005 | A1 |
20050116941 | Wallington | Jun 2005 | A1 |
20050120312 | Nguyen | Jun 2005 | A1 |
20050132072 | Pennell et al. | Jun 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050169527 | Longe et al. | Aug 2005 | A1 |
20050171783 | Suominen | Aug 2005 | A1 |
20050174975 | Mgrdechian et al. | Aug 2005 | A1 |
20050190970 | Griffin | Sep 2005 | A1 |
20050204008 | Shinbrood | Sep 2005 | A1 |
20050204889 | Swingle et al. | Sep 2005 | A1 |
20050211766 | Robertson et al. | Sep 2005 | A1 |
20050223308 | Gunn et al. | Oct 2005 | A1 |
20050253816 | Himberg et al. | Nov 2005 | A1 |
20050253818 | Nettamo | Nov 2005 | A1 |
20050259087 | Hoshino et al. | Nov 2005 | A1 |
20060001650 | Robbins et al. | Jan 2006 | A1 |
20060007174 | Shen | Jan 2006 | A1 |
20060010374 | Corrington et al. | Jan 2006 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060022956 | Lengeling et al. | Feb 2006 | A1 |
20060026335 | Hodgson et al. | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060031752 | Surloff et al. | Feb 2006 | A1 |
20060032680 | Elias et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060035681 | Oh | Feb 2006 | A1 |
20060044259 | Hotelling et al. | Mar 2006 | A1 |
20060052885 | Kong | Mar 2006 | A1 |
20060053387 | Ording | Mar 2006 | A1 |
20060066582 | Lyon et al. | Mar 2006 | A1 |
20060066590 | Ozawa et al. | Mar 2006 | A1 |
20060071915 | Rehm | Apr 2006 | A1 |
20060085757 | Andre et al. | Apr 2006 | A1 |
20060085767 | Hinckley et al. | Apr 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060150120 | Dresti et al. | Jul 2006 | A1 |
20060161846 | Van Leeuwen | Jul 2006 | A1 |
20060161870 | Hotelling et al. | Jul 2006 | A1 |
20060161871 | Hotelling et al. | Jul 2006 | A1 |
20060167993 | Aaron et al. | Jul 2006 | A1 |
20060181517 | Zadesky et al. | Aug 2006 | A1 |
20060181519 | Vernier et al. | Aug 2006 | A1 |
20060197750 | Kerr et al. | Sep 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060242587 | Eagle et al. | Oct 2006 | A1 |
20060242607 | Hudson | Oct 2006 | A1 |
20060253793 | Zhai et al. | Nov 2006 | A1 |
20060265668 | Rainisto | Nov 2006 | A1 |
20060274051 | Longe et al. | Dec 2006 | A1 |
20060290921 | Hotelling et al. | Dec 2006 | A1 |
20070011603 | Makela | Jan 2007 | A1 |
20070033269 | Atkinson et al. | Feb 2007 | A1 |
20070061754 | Ardhanari et al. | Mar 2007 | A1 |
20070070050 | Westerman et al. | Mar 2007 | A1 |
20070070051 | Westerman et al. | Mar 2007 | A1 |
20070070052 | Westerman et al. | Mar 2007 | A1 |
20070078919 | Westerman et al. | Apr 2007 | A1 |
20070081726 | Westerman et al. | Apr 2007 | A1 |
20070083823 | Jaeger | Apr 2007 | A1 |
20070087766 | Hardy et al. | Apr 2007 | A1 |
20070088787 | Hardy et al. | Apr 2007 | A1 |
20070139395 | Westerman et al. | Jun 2007 | A1 |
20070159453 | Inoue | Jul 2007 | A1 |
20070171210 | Chaudhri et al. | Jul 2007 | A1 |
20070177803 | Elias et al. | Aug 2007 | A1 |
20070177804 | Elias et al. | Aug 2007 | A1 |
20070180360 | Neil | Aug 2007 | A1 |
20070222768 | Geurts et al. | Sep 2007 | A1 |
20070268273 | Westerman et al. | Nov 2007 | A1 |
20070268274 | Westerman et al. | Nov 2007 | A1 |
20070268275 | Westerman et al. | Nov 2007 | A1 |
20070276875 | Brunswig et al. | Nov 2007 | A1 |
20080016468 | Chambers et al. | Jan 2008 | A1 |
20080024463 | Pryor | Jan 2008 | A1 |
20080036743 | Westerman et al. | Feb 2008 | A1 |
20080041639 | Westerman et al. | Feb 2008 | A1 |
20080042986 | Westerman et al. | Feb 2008 | A1 |
20080042987 | Westerman et al. | Feb 2008 | A1 |
20080042988 | Westerman et al. | Feb 2008 | A1 |
20080042989 | Westerman et al. | Feb 2008 | A1 |
20080088587 | Pryor | Apr 2008 | A1 |
20080128182 | Westerman et al. | Jun 2008 | A1 |
20080129707 | Pryor | Jun 2008 | A1 |
20080139297 | Beaulieu et al. | Jun 2008 | A1 |
20080174553 | Trust | Jul 2008 | A1 |
20080189622 | Sanchez et al. | Aug 2008 | A1 |
20080211775 | Hotelling et al. | Sep 2008 | A1 |
20080211779 | Pryor | Sep 2008 | A1 |
20080211783 | Hotelling et al. | Sep 2008 | A1 |
20080211784 | Hotelling et al. | Sep 2008 | A1 |
20080211785 | Hotelling et al. | Sep 2008 | A1 |
20080229236 | Carrer et al. | Sep 2008 | A1 |
20080231610 | Hotelling et al. | Sep 2008 | A1 |
20080316183 | Westerman et al. | Dec 2008 | A1 |
20090021489 | Westerman et al. | Jan 2009 | A1 |
20090064006 | Naick et al. | Mar 2009 | A1 |
20090160816 | Westerman et al. | Jun 2009 | A1 |
20090244031 | Westerman et al. | Oct 2009 | A1 |
20090244032 | Westerman et al. | Oct 2009 | A1 |
20090244033 | Westerman et al. | Oct 2009 | A1 |
20090249236 | Westerman et al. | Oct 2009 | A1 |
20090251435 | Westerman et al. | Oct 2009 | A1 |
20090251438 | Westerman et al. | Oct 2009 | A1 |
20090251439 | Westerman et al. | Oct 2009 | A1 |
20090267921 | Pryor | Oct 2009 | A1 |
20090273563 | Pryor | Nov 2009 | A1 |
20090273574 | Pryor | Nov 2009 | A1 |
20090273575 | Pryor | Nov 2009 | A1 |
20090300531 | Pryor | Dec 2009 | A1 |
20090322499 | Pryor | Dec 2009 | A1 |
20100149092 | Westerman et al. | Jun 2010 | A1 |
20100149134 | Westerman et al. | Jun 2010 | A1 |
20100231506 | Pryor | Sep 2010 | A1 |
20110037725 | Pryor | Feb 2011 | A1 |
20120293440 | Hotelling et al. | Nov 2012 | A1 |
20120293442 | Westerman et al. | Nov 2012 | A1 |
Number | Date | Country |
---|---|---|
1243096 | Oct 1988 | CA |
100 42 300 | Mar 2002 | DE |
100 59 906 | Jun 2002 | DE |
101 40 874 | Mar 2003 | DE |
102 51 296 | May 2004 | DE |
0 394 614 | Oct 1990 | EP |
0 422 577 | Apr 1991 | EP |
0 462 759 | Dec 1991 | EP |
0 288 692 | Jul 1993 | EP |
0 588 210 | Mar 1994 | EP |
0 622 722 | Nov 1994 | EP |
0 664 504 | Jul 1995 | EP |
0 464 908 | Sep 1996 | EP |
0 880 090 | Nov 1996 | EP |
0 817 000 | Jan 1998 | EP |
0 880 090 | Nov 1998 | EP |
0 926 588 | Jun 1999 | EP |
1 014 295 | Jan 2002 | EP |
1 233 330 | Aug 2002 | EP |
1 271 295 | Jan 2003 | EP |
1 517 228 | Mar 2005 | EP |
1 569 079 | Aug 2005 | EP |
1 505 484 | Sep 2005 | EP |
1 674 976 | Jun 2006 | EP |
2 330 670 | Apr 1999 | GB |
2 332 293 | Jun 1999 | GB |
2 337 349 | Nov 1999 | GB |
2 344 894 | Jun 2000 | GB |
2 351 639 | Jan 2001 | GB |
2 380 583 | Apr 2003 | GB |
60-198586 | Oct 1985 | JP |
63-167923 | Jul 1988 | JP |
04-048318 | Feb 1992 | JP |
04-054523 | Feb 1992 | JP |
04-198795 | Jul 1992 | JP |
05-297979 | Nov 1993 | JP |
06-161661 | Jun 1994 | JP |
07-129312 | May 1995 | JP |
07-230352 | Aug 1995 | JP |
09-033278 | Feb 1997 | JP |
09-330175 | Dec 1997 | JP |
10-039748 | Feb 1998 | JP |
10-171600 | Jun 1998 | JP |
11-053093 | Feb 1999 | JP |
11-073271 | Mar 1999 | JP |
11-085380 | Mar 1999 | JP |
11-119911 | Apr 1999 | JP |
11-133816 | May 1999 | JP |
11-175258 | Jul 1999 | JP |
11-194863 | Jul 1999 | JP |
2000-010705 | Jan 2000 | JP |
2000-163031 | Jun 2000 | JP |
2000-163444 | Jun 2000 | JP |
2000-231670 | Aug 2000 | JP |
2001-134382 | May 2001 | JP |
2001-147918 | May 2001 | JP |
2001-230992 | Aug 2001 | JP |
2001-356870 | Dec 2001 | JP |
2002-034023 | Jan 2002 | JP |
2002-501271 | Jan 2002 | JP |
2002-342033 | Nov 2002 | JP |
2003-173237 | Jun 2003 | JP |
2004-110388 | Apr 2004 | JP |
2000-163193 | Jun 2006 | JP |
10-2001-0040410 | May 2001 | KR |
4057131 | Jul 2004 | KR |
WO-9202000 | Feb 1992 | WO |
WO-9429788 | Dec 1994 | WO |
WO-9718547 | May 1997 | WO |
WO-9723738 | Jul 1997 | WO |
WO-9736225 | Oct 1997 | WO |
WO-9740744 | Nov 1997 | WO |
WO-9814863 | Apr 1998 | WO |
WO-9833111 | Jul 1998 | WO |
WO-9928813 | Jun 1999 | WO |
WO-9938149 | Jul 1999 | WO |
WO-9954807 | Oct 1999 | WO |
WO-0038042 | Jun 2000 | WO |
WO-0102949 | Jan 2001 | WO |
WO-0201482 | Jan 2002 | WO |
WO-0239245 | May 2002 | WO |
WO-03027822 | Apr 2003 | WO |
WO-03036457 | May 2003 | WO |
WO-03062978 | Jul 2003 | WO |
WO-03088176 | Oct 2003 | WO |
WO-03098421 | Nov 2003 | WO |
WO-03098417 | Nov 2003 | WO |
WO-2004029789 | Apr 2004 | WO |
WO-2004047069 | Jun 2004 | WO |
WO-2004051392 | Jun 2004 | WO |
WO-2004091956 | Oct 2004 | WO |
WO-2005064442 | Jul 2005 | WO |
WO-2005114369 | Dec 2005 | WO |
WO-2006003590 | Jan 2006 | WO |
WO-2006020304 | Feb 2006 | WO |
WO-2006020305 | Feb 2006 | WO |
WO-2006023569 | Mar 2006 | WO |
WO-2006026012 | Mar 2006 | WO |
WO-2007037808 | Apr 2007 | WO |
WO-2007037809 | Apr 2007 | WO |
WO-2007089766 | Aug 2007 | WO |
WO-2008094791 | Jul 2008 | WO |
WO-2010135478 | Nov 2010 | WO |
Entry |
---|
“Touch Technologies: Touch is Everywhere.” (printed Aug. 30, 2005) http://www.3m.com/3MTouchSystems/downloads/PDFs/TouchTechOV.pdf. |
Annex to Form PCT/ISA/206 Communication Relating to the Results of the Partial International Search received in corresponding PCT Application No. PCT/US2005/025641 dated Feb. 19, 2007. |
PCT International Search Report received in corresponding PCT Application No. PCT/US2006/031523 dated Feb. 27, 2007. |
PCT International Search Report received in corresponding PCT Application No. PCT/US2005/025657 dated Feb. 26, 2007. |
PCT International Search Report received in corresponding PCT Application No. PCT/US2006/031527 dated Feb. 27, 2007. |
PCT International Search Report received in corresponding PCT Application No. PCT/US2006/031526 dated Feb. 14, 2007. |
Schiphorst, et al.; “Using a Gestural Interface Toolkit for Tactile Input to a Dynamic Virtual Space;” Conference on Human Factors in Computing Systems, Proceedings, Apr. 25, 2002, pp. 754-755. |
Chen, et al.; “Flowfield and Beyond: Applying Pressure-Sensitive Multi-Point Touchpad Interaction;” Multimedia and Expo, 2003, ICME '03, Proceedings, Jul. 9, 2003, pp. I-49-I-52. |
Jones; “MTC Express Multi-touch Controller;” Computer Music Journal 25.1, 2001, pp. 97-99. |
“Touch Technologies Overview,” 2001, 3M Touch Systems, Massachusetts. |
Ian Hardy, “Fingerworks,” Mar. 7, 2002. |
“Symbol Commander,” http://www.sensiva.com/symbolcomander/, downloaded Aug. 30, 2005. |
“Mouse Gestures in Opera,” http://www.opera.com/features/mouse/, downloaded Aug. 30, 2005. |
“A Brief Overview of Gesture Recognition,” http://www.dai.ed.ac.uk/Cvonline/LOCAL_COPIES/COHEN/gesture_overview.html, downloaded Apr. 20, 2004. |
Jun Rekimoto, “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces,” CHI 2002, Apr. 20-25, 2002. |
“Mouse Gestures,” Optimoz, May 21, 2004. |
“iGesture Products for Everyone (learn in minutes),” FingerWorks, Aug. 30, 2005. |
“MultiTouch Overview,” FingerWorks, http://www.fingerworks.com/multoverview.html, downloaded Aug. 30, 2005. |
“Gesture Recognition,” http://www.fingerworks.com/gesture_recognition.html, downloaded Aug. 30, 2005. |
“Tips for Typing,” FingerWorks, http://www.fingerworks.com/mini_typing.html, downloaded Aug. 30, 2005. |
“Mouse Emulation,” FingerWorks, http://www.fingerworks.com/gesture_guide_mouse.html, downloaded Aug. 30, 2005. |
U.S. Appl. No. 10/840,862, filed May 6, 2004. |
U.S. Appl. No. 10/903,964, filed Jul. 30, 2004. |
U.S. Appl. No. 10/927,925, filed Aug. 26, 2004. |
U.S. Appl. No. 11/048,264, filed Jan. 31, 2005. |
U.S. Appl. No. 10/654,108, filed Sep. 2, 2003 entitled “Ambidextrous Mouse”. |
U.S. Appl. No. 10/789,676, filed Feb. 27, 2004 entitled “Shape Detecting Input Device”. |
“4-Wire Resistive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-4resistive.html generated Aug. 5, 2005. |
“5-Wire Resistive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-resistive.html generated Aug. 5, 2005. |
“A Brief Overview of Gesture Recognition” obtained from http://www.dai.ed.ac.uk/Cvonline/LOCAL_COPIES/COHEN/gesture_overview.html, generated Apr. 20, 2004. |
“Capacitive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-capacitive.html generated Aug. 5, 2005. |
“Capacitive Position Sensing” obtained from http://www.synaptics.com/technology/cps.cfm generated Aug. 5, 2005. |
“Comparing Touch Technologies” obtained from http://www.touchscreens.com/intro-touchtypes.html generated Oct. 10, 2004. |
“Gesture Recognition” http://www.fingerworks.com/gesture_recognition.html, Jul. 25, 2006. |
“GlidePoint” obtained from http://www.cirque.com/technology/technology_gp.html generated Aug. 5, 2005. |
“How do touchscreen monitors know where you're touching?” obtained from http://www.electronics.howstuffworks.com/question716.html generated Aug. 5, 2005. |
“How does a touchscreen work?” obtained from http://www.touchscreens.com/intro-anatomy.html generated Aug. 5, 2005. |
“iGesture Products for Everyone (learn in minutes) Product Overview” FingerWorks.com, downloaded Aug. 30, 2005. |
“Infrared Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-infrared.html generated Aug. 5, 2005. |
“Mouse Emulation” FingerWorks obtained from http://www.fingerworks.com/gesture_guide_mouse.html generated Aug. 30, 2005. |
“Mouse Gestures in Opera” obtained from http://www.opera.com/products/desktop/mouse/index.dml generated Aug. 30, 2005. |
“MultiTouch Overview” FingerWorks obtained from http://www.fingerworks.com/multoverview.html generated Aug. 30, 2005. |
“Near Field Imaging Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-nfi.html generated Aug. 5, 2005. |
“PenTouch Capacitive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-pentouch.html generated Aug. 5, 2005. |
“Surface Acoustic Wave Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-saw.html generated Aug. 5, 2005. |
“Symbol Commander” obtained from http://www.sensiva.com/symbolcomander/, generated Aug. 30, 2005. |
“Tips for Typing” FingerWorks http://www.fingerworks.com/mini_typing.html generated Aug. 30, 2005. |
“Touch Technologies Overview” 2001, 3M Touch Systems, Massachusetts. |
“Wacom Components—Technology” obtained from http://www.wacom-components.com/english/tech.asp generated on Oct. 10, 2004. |
“Watershed Algorithm” http://rsb.info.nih.gov/ij/plugins/watershed.html generated Aug. 5, 2005. |
“FingerWorks—Gesture Guide—Application Switching,” obtained from http://www.fingerworks.com/gesture_guide_apps.html, generated on Aug. 27, 2004, 1-pg. |
“FingerWorks—Gesture Guide—Editing,” obtained from http://www.fingerworks.com/gesture_guide_editing.html, generated on Aug. 27, 2004, 1-pg. |
“FingerWorks—Gesture Guide—File Operations,” obtained from http://www.fingerworks.com/gesture_guide_files.html, generated on Aug. 27, 2004, 1-pg. |
“FingerWorks—Gesture Guide—Text Manipulation,” obtained from http://www.fingerworks.com/gesture_guide_text_manip.html, generated on Aug. 27, 2004, 2-pg. |
“FingerWorks—Gesture Guide—Tips and Tricks,” obtained from http://www.fingerworks.com/gesture_guide_tips.html, generated Aug. 27, 2004, 2-pgs. |
“FingerWorks—Gesture Guide—Web,” obtained from http://www.fingerworks.com/gesture_guide_web.html, generated on Aug. 27, 2004, 1-pg. |
“FingerWorks—Guide to Hand Gestures for USB Touchpads,” obtained from http://www.fingerworks.com/igesture_userguide.html, generated Aug. 27, 2004, 1-pg. |
“FingerWorks—iGesture—Technical Details,” obtained from http://www.fingerworks.com/igesture_tech.html, generated Aug. 27, 2004, 1-pg. |
“FingerWorks—The Only Touchpads with Ergonomic Full-Hand Resting and Relaxation!” obtained from http://www.fingerworks.com/resting.html, Copyright 2001, 1-pg. |
“FingerWorks—Tips for Typing on the Mini,” obtained from http://www.fingerworks.com/mini_typing.html, generated on Aug. 27, 2004, 2-pgs. |
“iGesture Pad—the MultiFinger USB TouchPad with Whole-Hand Gestures,” obtained from http://www.fingerworks.com/igesture.html, generated Aug. 27, 2004, 2-pgs. |
Bier, et al., “Toolglass and Magic Lenses: The see-through interface” In James Kajiya, editor, Computer Graphics (SIGGRAPH '93 Proceedings), vol. 27, pp. 73-80, Aug. 1993. |
Douglas et al., The Ergonomics of Computer Pointing Devices (1997). |
European Search Report received in EP 1 621 989 (@ Beyer Weaver & Thomas, LLP) dated Mar. 27, 2006. |
Evb Elektronik “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems” dated Jan. 2004 1-pg. |
Fisher et al., “Repetitive Motion Disorders: The Design of Optimal Rate-Rest Profiles,” Human Factors, 35(2):283-304 (Jun. 1993). |
Fukumoto, et al., “ActiveClick: Tactile Feedback for Touch Panels,” In CHI 2001 Summary, pp. 121-122, 2001. |
Fukumoto and Yoshinobu Tonomura, “Body Coupled Fingering: Wireless Wearable Keyboard,” CHI 97, pp. 147-154 (Mar. 1997). |
Hardy, “Fingerworks” Mar. 7, 2002; BBC World on Line. |
Hillier and Gerald J. Lieberman, Introduction to Operations Research (1986). |
International Search Report dated Mar. 3, 2006 (PCT/US 05/03325; 119-0052WO). |
Jacob et al., “Integrality and Separability of Input Devices,” ACM Transactions on Computer-Human Interaction, 1:3-26 (Mar. 1994). |
Hinckley et al., “Touch-Sensing Input Devices,” in CHI '99 Proceedings, pp. 223-230, 1999. |
Kionix “KXP84 Series Summary Data Sheet” copyright 2005, dated Oct. 21, 2005, 4-pgs. |
Lee et al., “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” in CHI '85 Proceedings, pp. 121-128, 1985. |
Lee, “A Fast Multiple-Touch-Sensitive Input Device,” Master's Thesis, University of Toronto (1984). |
Matsushita et al., “HoloWall: Designing a Finger, Hand, Body and Object Sensitive Wall,” In Proceedings of UIST '97, Oct. 1997. |
Quantum Research Group “QT510 / QWheel™ Touch Slider IC” copyright 2004-2005, 14-pgs. |
Quek, “Unencumbered Gestural Interaction,” IEEE Multimedia, 3:36-47 (Winter 1996). |
Radwin, “Activation Force and Travel Effects on Overexertion in Repetitive Key Tapping,” Human Factors, 39(1):130-140 (Mar. 1997). |
Rekimoto “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces” CHI 2002, Apr. 20-25, 2002. |
Rekimoto et al., “ToolStone: Effective Use of the Physical Manipulation Vocabularies of Input Devices,” In Proc. of UIST 2000, 2000. |
Rubine et al., “Programmable Finger-Tracking Instrument Controllers,” Computer Music Journal, vol. 14, No. 1 (Spring 1990). |
Rutledge et al., “Force-To-Motion Functions for Pointing,” Human-Computer Interaction—INTERACT (1990). |
Subutai Ahmad, “A Usable Real-Time 3D Hand Tracker,” Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2), vol. 2 (Oct. 1994). |
Texas Instruments “TSC2003 / I2C Touch Screen Controller” Data Sheet SBAS 162, dated Oct. 2001, 20-pgs. |
Wellner, “The DigitalDesk Calculator: Tangible Manipulation on a Desk Top Display” in ACM UIST '91 Proceedings, pp. 27-34, Nov. 1991. |
Williams, “Applications for a Switched-Capacitor Instrumentation Building Block” Linear Technology Application Note 3, Jul. 1985, pp. 1-16. |
Yamada et al., “A Switched-Capacitor Interface for Capacitive Pressure Sensors” IEEE Transactions on Instrumentation and Measurement, vol. 41, No. 1, Feb. 1992, pp. 81-86. |
Yeh et al., “Switched Capacitor Interface Circuit for Capacitive Transducers” 1985 IEEE. |
Zhai et al., “Dual Stream Input for Pointing and Scrolling,” Proceedings of CHI '97 Extended Abstracts (1997). |
Zimmerman et al., “Applying Electric Field Sensing to Human-Computer Interfaces,” in CHI '95 Proceedings, pp. 280-287, 1995. |
U.S. Appl. No. 10/774,053, filed Feb. 5, 2004. |
U.S. Appl. No. 11/140,529, filed May 27, 2005. |
U.S. Appl. No. 11/381,313, filed May 2, 2006 entitled “Multipoint Touch Surface Controller”. |
U.S. Appl. No. 11/332,861, filed Jan. 13, 2006. |
U.S. Appl. No. 11/380,109, filed Apr. 25, 2006 entitled “Keystroke Tactility Arrangement on Smooth Touch Surface.” |
U.S. Appl. No. 11/428,501, filed Jul. 3, 2006 entitled “Capacitive Sensing Arrangement”. |
U.S. Appl. No. 11/428,503, filed Jul. 3, 2006 entitled “Touch Surface”. |
U.S. Appl. No. 11/428,506, filed Jul. 3, 2006 entitled “User Interface Gestures”. |
U.S. Appl. No. 11/428,515, filed Jul. 3, 2006 entitled “User Interface Gestures”. |
U.S. Appl. No. 11/428,522, filed Jul. 3, 2006 entitled “Identifying Contacts on a Touch Surface”. |
U.S. Appl. No. 11/428,521, filed Jul. 3, 2006 entitled “Identifying Contacts on a Touch Surface”. |
U.S. Appl. No. 11/426,078, filed Jun. 23, 2006 entitled “Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control”. |
U.S. Appl. No. 11/278,080, filed Mar. 30, 2006 entitled “Force Imaging Input Device and System”. |
U.S. Appl. No. 11/382,402, filed May 9, 2006 entitled “Force and Location Sensitive Display”, which is a continuation of U.S. Appl. No. 11/278,080 listed above (see C81). |
International Search Report received in corresponding PCT application No. PCT/US2006/008349 dated Oct. 6, 2006. |
Westerman, Wayne, et al., “Multi-Touch: A New Tactile 2-D Gesture Interface for Human-Computer Interaction,” Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting, 2001. |
Anonymous. “Ai Squared Products—ZoomText Magnifier,” http://www.aisquared.com/Products/zoomtexturemag/index.cfm, downloaded Oct. 26, 2005. |
Anonymous. “Ai Squared Products,” http://www.aisquared.com/Products/index.cfm, downloaded Oct. 25, 2005. |
Anonymous. “Lunar Screen Magnifier and Lunar Plus Enhanced Screen Magnifier,” www.dolphincomputeraccess.com/products/lunar.htm, downloaded Oct. 25, 2005. |
Anonymous. “Touchscreen Technology Choices,” http://www.elotouch.com/products/detech2.asp, downloaded Aug. 5, 2005. |
Anonymous. “Visual Disabilities,” http://depts.stcc.edu/ods/ACCESS/bpvisual.htm, downloaded Oct. 25, 2005. |
Buxton, W. et al. (Jul. 22, 1985). “Issues and Techniques in Touch-Sensitive Tablet Input,” Proceedings ACM Siggraph, pp. 215-224. |
Chang, C-C. et al. (Aug. 1, 1993). “A Hashing-Oriented Nearest Neighbor Searching Scheme,” Pattern Recognition Letters, 14(8):625-630. |
Crowley, J.L. (Mar. 1, 1997). “Vision for Man-Machine Interaction,” Robotics and Autonomous Systems, 19(3-4):347-358. |
Davis, J. et al. (May 2, 1994). “Recognizing Hand Gestures,” European Conference on Computer Vision, Berlin, DE, 1:331-340. |
Davis, J. et al. (Oct. 31, 1994). “Determining 3-D Hand Motion,” Signals, Systems and Computers, 1994 Conference Record of the 28th Asilomar Conference on Pacific Grove, CA, Oct. 31-Nov. 2, 1994, Los Alamitos, CA, pp. 1262-1266. |
European Examination Report for European Patent Application No. 06016830.9, mailed Aug. 6, 2008. |
European Examination Report for European Patent Application No. 06016856.4 mailed Sep. 16, 2008. |
European Examination Report for European Patent Application No. 99904228.6, mailed Apr. 20, 2006. |
European Examination Report for European Patent Application No. 99904228.6, mailed Mar. 23, 2007. |
European Search Report for European Patent Application No. 06016830.9 mailed Dec. 3, 2007. |
European Search Report mailed Dec. 12, 2008, for EP Application No. 06016855.6 filed Jan. 25, 1999, six pages. |
European Search Report mailed Dec. 13, 2007, for EP Application No. 05772892.5, filed Jul. 19, 2005, three pages. |
European Search Report mailed Dec. 15, 2008, for EP Application No. 08016449.4, filed Jul. 19, 2005, six pages. |
European Search Report mailed Dec. 15, 2008, for EP Application No. 08016450.2, filed Jul. 19, 2005, six pages. |
European Search Report mailed Dec. 23, 2008, for EP Application No. 06016831.7 filed Jan. 25, 1999, seven pages. |
European Search Report mailed Jan. 9, 2009, for EP Application No. 06016832.5 filed Jan. 25, 1999, four pages. |
European Supplementary Search Report for European Patent Application No. 99904228.6, mailed Feb. 16, 2005. |
Extended European Search Report for European Patent Application No. 06016858.0, mailed Dec. 21, 2007. |
Extended European Search Report for European Patent Application No. 06016856.4, mailed Mar. 14, 2008. |
Final Office Action mailed Dec. 20, 2007, for U.S. Appl. No. 10/927,925, filed Aug. 26, 2004, 25 pages. |
Final Office Action mailed May 21, 2008, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 11 pages. |
Final Office Action mailed Oct. 16, 2008, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 35 pages. |
Final Office Action mailed Nov. 10, 2008, for U.S. Appl. No. 10/927,925, filed Aug. 26, 2004, 22 pages. |
Final Office Action mailed Dec. 24, 2008, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, 12 pages. |
Final Office Action mailed Mar. 5, 2009, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 15 pages. |
Final Office Action mailed Mar. 17, 2009, for U.S. Appl. No. 11/241,839, filed Sep. 30, 2005, 16 pages. |
Heap, T. et al. (Oct. 14, 1996). “Towards 3D Hand Tracking Using a Deformable Model,” Proceedings of the 2nd International Conference, Killington, VT, USA, Oct. 14-16, 1996, Automatic Face and Gesture Recognition, IEEE Comput. Soc., pp. 140-145. |
International Search Report for PCT/US99/01454, mailed May 14, 1999. |
International Search Report mailed Apr. 24, 2007, for PCT Application No. PCT/US2005/025641 filed Jul. 19, 2005, five pages. |
International Search Report mailed Aug. 11, 2008, for PCT Application No. PCT/US2007/002512 filed Jan. 30, 2007, six pages. |
International Search Report mailed Oct. 8, 2008, for PCT Application No. PCT/US2008/051727, filed Jan. 22, 2008, six pages. |
Japanese Office Action mailed Oct. 27, 2008, for JP Patent Application No. 2007-523644, one page. |
Kahney, L. (Mar. 8, 2004). “Pocket PCs Masquerade as IPods,” available at: http://www.wired.com/gadgets/mac/news/2004/03/62543, last visited Apr. 28, 2008, two pages. |
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25. |
Mohri, K. (Nov. 25, 2000). “Wearable Human Interface Based on Hand/Finger Gesture Recognition,” Human Interface Association Magazine 2(4):9-18. (Abstract Only in English.). |
Nirei, K. et al. (Aug. 5, 1996). “Human Hand Tracking from Binocular Image Sequences,” Proceedings of the 1996 IEEE IECON 22nd International Conference, Taipei, Taiwan, Aug. 5-10, 1996, Industrial Electronics, Control, and Instrumentation 1(5):297-302. |
Non-Final Office Action mailed Jul. 24, 2007, for U.S. Appl. No. 10/927,925, filed on Aug. 26, 2004, 20 pages. |
Non-Final Office Action mailed Sep. 21, 2007, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, seven pages. |
Non-Final Office Action mailed Sep. 24, 2007, for U.S. Appl. No. 11/428,506, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Sep. 24, 2007, for U.S. Appl. No. 11/428,521, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Sep. 28, 2007, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 11 pages. |
Non-Final Office Action mailed Nov. 1, 2007, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, 20 pages. |
Non-Final Office Action mailed Dec. 31, 2007, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 32 pages. |
Non-Final Office Action mailed Jan. 28, 2008, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, seven pages. |
Non-Final Office Action mailed Feb. 4, 2008, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Apr. 30, 2008, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, 12 pages. |
Non-Final Office Action mailed May 5, 2008, for U.S. Appl. No. 10/927,925, filed Aug. 26, 2004, 22 pages. |
Non-Final Office Action mailed Jul. 9, 2008, for U.S. Appl. No. 11/428,521, filed Jul. 3, 2006, 11 pages. |
Non-Final Office Action mailed Sep. 2, 2008, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Sep. 15, 2008, for U.S. Appl. No. 11/428,506, filed Jul. 3, 2006, eight pages. |
Non-Final Office Action mailed Sep. 17, 2008, for U.S. Appl. No. 11/241,839, filed Sep. 30, 2005, 18 pages. |
Non-Final Office Action mailed Oct. 3, 2008, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 16 pages. |
Non-Final Office Action mailed Oct. 31, 2008, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, seven pages. |
Non-Final Office Action mailed Dec. 11, 2008, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 11 pages. |
Non-Final Office Action mailed Feb. 17, 2009, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Apr. 2, 2009, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 41 pages. |
Notice of Allowability mailed Feb. 11, 2009, for U.S. Appl. No. 11/428,521, filed Jul. 3, 2006, five pages. |
Pavlovic, V.I. et al. (Jul. 1997). “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review,” IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7):677-695. |
Smith, R. et al. (1996). “Relating Distortion to Performance in Distortion-Oriented Displays,” Proceedings of the 6th Australian Conference on Computer-Human Interaction (OZCHI '96), pp. 6-11. |
The Gadgeteer. (Jun. 6, 2003). “Apple iPod (30GB),” available at http://the-gadgeteer.com/review/apple_ipod_30gb_review, last visited Apr. 28, 2008, 19 pages. |
Westerman, W. (Jan. 1, 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” Dissertation, University of Delaware, pp. 1-333. |
Non-Final Office Action mailed Mar. 2, 2010, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, 13 pages. |
Malik, S. et al. (2004). “Visual Touchpad: A Two-Handed Gestural Input Device,” Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, Oct. 13-15, 2004, ICMI '04, ACM pp. 289-296. |
Non-Final Office Action mailed Jan. 27, 2010, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, five pages. |
Non-Final Office Action mailed Feb. 3, 2010, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 20 pages. |
Final Office Action mailed Dec. 31, 2009, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 36 pages. |
Non-Final Office Action mailed Dec. 18, 2009, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 11 pages. |
Non-Final Office Action mailed Dec. 22, 2009, for U.S. Appl. No. 11/559,833, filed Nov. 14, 2006, six pages. |
Non-Final Office Action mailed Dec. 24, 2009, for U.S. Appl. No. 11/677,958, filed Feb. 22, 2007, six pages. |
Sun Microsystems. (1992). “The Star7 PDA Prototype,” located at http://www.youtube.com/watch?v=Ahg8OBYixL0, last visited Jan. 15, 2010, seven pages. |
Non-Final Office Action mailed Oct. 5, 2009, for U.S. Appl. No. 11/559,799, filed Nov. 14, 2006, 14 pages. |
Non-Final Office Action mailed Oct. 6, 2009, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, 24 pages. |
Non-Final Office Action mailed Oct. 14, 2009, for U.S. Appl. No. 11/428,501, filed Jul. 3, 2006, six pages. |
Final Office Action mailed Mar. 19, 2009, for U.S. Appl. No. 11/428,506, filed Jul. 3, 2006, seven pages. |
Non-Final Office Action mailed Mar. 18, 2009, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, 12 pages. |
Non-Final Office Action mailed Apr. 2, 2009, for U.S. Appl. No. 11/428,501, filed Jul. 3, 2006, 11 pages. |
Non-Final Office Action mailed Apr. 2, 2009, for U.S. Appl. No. 11/428,503, filed Jul. 3, 2006, 12 pages. |
Non-Final Office Action mailed Aug. 25, 2009, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, six pages. |
Non-Final Office Action mailed Sep. 2, 2009, for U.S. Appl. No. 11/559,736, filed Nov. 14, 2006, 12 pages. |
Notice of Allowability mailed Sep. 2, 2009, for U.S. Appl. No. 11/428,506, filed Jul. 3, 2006, five pages. |
Notice of Allowability mailed Sep. 3, 2009, for U.S. Appl. No. 11/241,839, filed Sep. 30, 2005, 10 pages. |
Non-Final Office Action mailed Aug. 18, 2009, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 15 pages. |
Final Office Action mailed Jul. 7, 2009, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 14 pages. |
Non-Final Office Action mailed Jun. 10, 2009, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, 13 pages. |
Notice of Allowability mailed Jul. 10, 2009, for U.S. Appl. No. 11/428,521, filed Jul. 3, 2006, five pages. |
Final Office Action mailed Nov. 19, 2009, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, 14 pages. |
Non-Final Office Action mailed Oct. 29, 2009, for U.S. Appl. No. 11/559,822, filed Nov. 14, 2006, 11 pages. |
Non-Final Office Action mailed Oct. 30, 2009, for U.S. Appl. No. 11/428,503, filed Jul. 3, 2006, nine pages. |
Final Office Action mailed Jan. 19, 2011, for U.S. Appl. No. 11/980,721, filed Oct. 31, 2007, nine pages. |
Final Office Action mailed Jan. 20, 2011, for U.S. Appl. No. 11/830,757, filed Jul. 30, 2007, 21 pages. |
Non-Final Office Action mailed Oct. 29, 2010, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 12 pages. |
Anonymous. (2011). “Jog Dial” located at http://www.ask.com/wiki/Jog_dial, last visited Feb. 27, 2011, two pages. |
Final Office Action mailed Apr. 21, 2011, for U.S. Appl. No. 11/830,808, filed Jul. 30, 2007, 10 pages. |
Final Office Action mailed May 11, 2011, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 12 pages. |
Non-Final Office Action mailed Apr. 21, 2011, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, seven pages. |
Non-Final Office Action mailed Apr. 28, 2011, for U.S. Appl. No. 11/830,781, filed Jul. 30, 2007, 16 pages. |
Non-Final Office Action mailed Apr. 29, 2011, for U.S. Appl. No. 11/980,721, filed Oct. 31, 2007, 10 pages. |
Non-Final Office Action mailed May 4, 2011, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 40 pages. |
Non-Final Office Action mailed May 4, 2011, for U.S. Appl. No. 12/118,639, filed May 9, 2008, seven pages. |
Final Office Action mailed May 27, 2011, for U.S. Appl. No. 11/700,636, filed Jan. 30, 2007, nine pages. |
Non-Final Office Action mailed Jun. 7, 2011, for U.S. Appl. No. 11/878,024, filed Jul. 20, 2007, 10 pages. |
Notification of Reason(s) for Refusal mailed Apr. 25, 2011, for JP Patent Application No. 2008-531106, with English Translation, five pages. |
Non-Final Office Action mailed Mar. 31, 2011, for U.S. Appl. No. 12/479,573, filed Jun. 5, 2009, 17 pages. |
Non-Final Office Action mailed Mar. 31, 2011, for U.S. Appl. No. 12/479,617, filed Jun. 5, 2009, 20 pages. |
Non-Final Office Action mailed Mar. 31, 2011, for U.S. Appl. No. 12/479,678, filed Jun. 5, 2009, 18 pages. |
Non-Final Office Action mailed Apr. 4, 2011, for U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, nine pages. |
Final Office Action mailed Mar. 9, 2011, for U.S. Appl. No. 11/980,722, filed Oct. 31, 2007, 11 pages. |
Final Office Action mailed Mar. 21, 2011, for U.S. Appl. No. 11/832,134, filed Aug. 1, 2007, 33 pages. |
Non-Final Office Action mailed Mar. 18, 2011, for U.S. Appl. No. 11/830,774, filed Jul. 30, 2007, 18 pages. |
Non-Final Office Action mailed Mar. 21, 2011, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 11 pages. |
Non-Final Office Action mailed Jul. 19, 2011, for U.S. Appl. No. 11/980,722, filed Oct. 31, 2007, 12 pages. |
Non-Final Office Action mailed Jul. 20, 2011, for U.S. Appl. No. 11/830,788, filed Jul. 30, 2007, 12 pages. |
Non-Final Office Action mailed Jun. 28, 2011, for U.S. Appl. No. 12/422,197, filed Apr. 10, 2009, 18 pages. |
Non-Final Office Action mailed Jun. 28, 2011, for U.S. Appl. No. 12/422,205, filed Apr. 10, 2009, 13 pages. |
Non-Final Office Action mailed Jun. 28, 2011, for U.S. Appl. No. 12/422,212, filed Apr. 10, 2009, 16 pages. |
Non-Final Office Action mailed Jun. 29, 2011, for U.S. Appl. No. 12/342,027, filed Dec. 22, 2008, 32 pages. |
Final Office Action mailed Apr. 14, 2010, for U.S. Appl. No. 11/559,736, filed Nov. 14, 2006, 11 pages. |
Final Office Action mailed May 12, 2010, for U.S. Appl. No. 11/428,503, filed Jul. 3, 2006, 12 pages. |
Final Office Action mailed Jun. 11, 2010, for U.S. Appl. No. 11/559,833, filed Nov. 14, 2006, eight pages. |
Final Office Action mailed Jul. 6, 2010, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 12 pages. |
Final Office Action mailed Jul. 19, 2010, for U.S. Appl. No. 11/428,522, filed Jul. 3, 2006, six pages. |
Final Office Action mailed Aug. 17, 2010, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, nine pages. |
Non-Final Office Action mailed May 11, 2010, for U.S. Appl. No. 11/830,788, filed Jul. 30, 2007, eight pages. |
Non-Final Office Action mailed Jun. 8, 2010, for U.S. Appl. No. 11/696,693, filed Apr. 4, 2007, 27 pages. |
Non-Final Office Action mailed Jun. 8, 2010, for U.S. Appl. No. 11/830,808, filed Jul. 30, 2007, 13 pages. |
Non-Final Office Action mailed Jun. 9, 2010, for U.S. Appl. No. 11/830,793, filed Jul. 30, 2007, eight pages. |
Non-Final Office Action mailed Jun. 10, 2010, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 10 pages. |
Non-Final Office Action mailed Jun. 22, 2010, for U.S. Appl. No. 11/830,815, filed Jul. 30, 2007, 12 pages. |
Notice of Allowance mailed Mar. 23, 2010, for U.S. Appl. No. 11/428,501, filed Jul. 3, 2006, eight pages. |
Notice of Allowance mailed Apr. 26, 2010, for U.S. Appl. No. 11/559,822, filed Nov. 14, 2006, nine pages. |
Notice of Allowance mailed Jun. 21, 2010, for U.S. Appl. No. 11/677,958, filed Feb. 22, 2007, eight pages. |
Notice of Allowability (Supplemental) mailed May 12, 2010, for U.S. Appl. No. 11/559,822, filed Nov. 14, 2006, two pages. |
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements of the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages. |
Rubine, D. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660. |
Non-Final Office Action mailed Jan. 20, 2011, for U.S. Appl. No. 11/830,757, filed Jul. 30, 2007, 21 pages. |
Non-Final Office Action mailed Feb. 9, 2011, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, nine pages. |
Non-Final Office Action mailed Feb. 17, 2011, for U.S. Appl. No. 11/852,690, filed Sep. 10, 2007, 10 pages. |
Non-Final Office Action mailed Feb. 17, 2011, for U.S. Appl. No. 11/830,766, filed Jul. 30, 2007, 20 pages. |
Bales, J. W. et al. (Apr. 1981). “Marking Parts to Aid Robot Vision,” NASA Technical Paper 1819, 37 pages. |
Final Office Action mailed Nov. 12, 2009, for U.S. Appl. No. 11/349,350, filed Feb. 8, 2006, nine pages. |
Final Office Action mailed Jul. 20, 2010, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 19 pages. |
Final Office Action mailed Sep. 2, 2010, for U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, nine pages. |
Final Office Action mailed Oct. 19, 2010, for U.S. Appl. No. 11/559,799, filed Nov. 14, 2006, eight pages. |
Final Office Action mailed Oct. 29, 2010, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, 15 pages. |
Final Office Action mailed Nov. 23, 2010, for U.S. Appl. No. 11/696,693, filed Apr. 4, 2007, 24 pages. |
Final Office Action mailed Nov. 23, 2010, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 13 pages. |
Final Office Action mailed Nov. 26, 2010, for U.S. Appl. No. 11/830,793, filed Jul. 30, 2007, nine pages. |
Final Office Action mailed Dec. 2, 2010, for U.S. Appl. No. 11/830,815, filed Jul. 30, 2007, nine pages. |
Final Office Action mailed Dec. 3, 2010, for U.S. Appl. No. 11/830,788, filed Jul. 30, 2007, 15 pages. |
International Search Report mailed Aug. 28, 2007, for PCT Application No. PCT/US2004/009701, filed Mar. 31, 2004, one page. |
Non-Final Office Action mailed Jan. 6, 2009, for U.S. Appl. No. 11/349,350, filed Feb. 8, 2006, 10 pages. |
Non-Final Office Action mailed Mar. 5, 2009, for U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, 15 pages. |
Non-Final Office Action mailed Dec. 7, 2009, for U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, seven pages. |
Non-Final Office Action mailed Jul. 29, 2010, for U.S. Appl. No. 11/980,721, filed Oct. 31, 2007, nine pages. |
Non-Final Office Action mailed Aug. 2, 2010, for U.S. Appl. No. 11/980,722, filed Oct. 31, 2007, five pages. |
Non-Final Office Action mailed Sep. 17, 2010, for U.S. Appl. No. 11/832,134, filed Aug. 1, 2007, 26 pages. |
Non-Final Office Action mailed Nov. 18, 2010, for U.S. Appl. No. 11/700,636, filed Jan. 30, 2007, eight pages. |
Non-Final Office Action mailed Nov. 23, 2010, for U.S. Appl. No. 11/830,808, filed Jul. 30, 2007, 11 pages. |
U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, by Pryor. |
U.S. Appl. No. 11/349,350, filed Feb. 8, 2006, by Pryor. |
U.S. Appl. No. 11/878,024, filed Jul. 20, 2007, by Pryor. |
U.S. Appl. No. 90/010,571, filed Jun. 10, 2009, by Pryor. |
Final Office Action mailed Aug. 10, 2011, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, seven pages. |
Final Office Action mailed Sep. 27, 2011, for U.S. Appl. No. 11/830,774, filed Jul. 30, 2007, 15 pages. |
Final Office Action mailed Sep. 28, 2011, for U.S. Appl. No. 12/479,678, filed Jun. 5, 2009, 13 pages. |
Final Office Action mailed Oct. 14, 2011, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 16 pages. |
Final Office Action mailed Oct. 19, 2011, for U.S. Appl. No. 12/479,573, filed Jun. 5, 2009, 13 pages. |
Final Office Action mailed Nov. 10, 2011, for U.S. Appl. No. 11/272,868, filed Nov. 15, 2005, nine pages. |
Final Office Action mailed Nov. 17, 2011, for U.S. Appl. No. 11/830,781, filed Jul. 30, 2007, 16 pages. |
Final Office Action mailed Nov. 17, 2011, for U.S. Appl. No. 12/479,617, filed Jun. 5, 2009, 18 pages. |
Final Office Action mailed Nov. 18, 2011, for U.S. Appl. No. 11/878,024, filed Jul. 20, 2007, 18 pages. |
Final Office Action mailed Nov. 28, 2011, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 43 pages. |
Final Office Action mailed Dec. 16, 2011, for U.S. Appl. No. 12/422,212, filed Apr. 10, 2009, 20 pages.
Non-Final Office Action mailed Aug. 4, 2011, for U.S. Appl. No. 11/830,793, filed Jul. 30, 2007, 13 pages.
Non-Final Office Action mailed Aug. 5, 2011, for U.S. Appl. No. 12/422,222, filed Apr. 10, 2009, 15 pages.
Non-Final Office Action mailed Aug. 5, 2011, for U.S. Appl. No. 12/422,225, filed Apr. 10, 2009, 17 pages.
Non-Final Office Action mailed Sep. 1, 2011, for U.S. Appl. No. 11/830,766, filed Jul. 30, 2007, 29 pages.
Non-Final Office Action mailed Sep. 16, 2011, for U.S. Appl. No. 11/830,757, filed Jul. 30, 2007, 26 pages.
Non-Final Office Action mailed Sep. 23, 2011, for U.S. Appl. No. 12/500,973, filed Jul. 10, 2009, five pages.
Non-Final Office Action mailed Sep. 27, 2011, for U.S. Appl. No. 11/700,636, filed Jan. 30, 2007, eight pages.
Non-Final Office Action mailed Sep. 29, 2011, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 11 pages.
Non-Final Office Action mailed Sep. 30, 2011, for U.S. Appl. No. 12/468,401, filed May 19, 2009, 19 pages.
Non-Final Office Action mailed Oct. 14, 2011, for U.S. Appl. No. 12/434,439, filed May 1, 2009, nine pages.
Non-Final Office Action mailed Oct. 27, 2011, for U.S. Appl. No. 12/139,411, filed Jun. 13, 2008, six pages.
Non-Final Office Action mailed Nov. 8, 2011, for U.S. Appl. No. 12/118,639, filed May 9, 2008, five pages.
Non-Final Office Action mailed Nov. 10, 2011, for U.S. Appl. No. 11/830,815, filed Jul. 30, 2007, 15 pages.
Non-Final Office Action mailed Nov. 23, 2011, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, eight pages.
Non-Final Office Action mailed Dec. 8, 2011, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 16 pages.
Non-Final Office Action mailed Dec. 8, 2011, for U.S. Appl. No. 12/500,925, filed Jul. 10, 2009, nine pages.
Non-Final Office Action mailed Dec. 9, 2011, for U.S. Appl. No. 12/500,984, filed Jul. 10, 2009, nine pages.
Non-Final Office Action mailed Dec. 12, 2011, for U.S. Appl. No. 12/500,951, filed Jul. 10, 2009, eight pages.
Non-Final Office Action mailed Dec. 16, 2011, for U.S. Appl. No. 12/500,978, filed Jul. 10, 2009, nine pages.
Notice of Allowance mailed Aug. 16, 2011, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, nine pages.
Notice of Allowance mailed Oct. 26, 2011, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, nine pages.
Non-Final Office Action mailed Dec. 22, 2011, for U.S. Appl. No. 11/696,693, filed Apr. 4, 2007, 29 pages.
Non-Final Office Action mailed Jan. 19, 2012, for U.S. Appl. No. 11/830,808, filed Jul. 30, 2007, eight pages.
Final Office Action mailed Feb. 3, 2012, for U.S. Appl. No. 12/422,205, filed Apr. 10, 2009, 16 pages.
Non-Final Office Action mailed Jan. 30, 2012, for U.S. Appl. No. 11/428,503, filed Jul. 3, 2006, 15 pages.
Final Office Action mailed Mar. 1, 2012, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, 13 pages.
Final Office Action mailed Mar. 9, 2012, for U.S. Appl. No. 11/700,636, filed Jan. 30, 2007, nine pages.
Final Office Action mailed Mar. 26, 2012, for U.S. Appl. No. 12/118,639, filed May 9, 2008, seven pages.
Notice of Allowance mailed Mar. 9, 2012, for U.S. Appl. No. 11/240,788, filed Sep. 30, 2005, five pages.
Anonymous. (Apr. 1, 1994). “Jog Shuttle Graphical,” IBM Technical Disclosure Bulletin, 37(4A):47-48.
Anonymous. (2011). “(graphical or virtual) (jog dial) or (jog wheel) or (scroll wheel),” Ask Search located at http://www.ask.com/web?q=%28graphical+or+virtual%29++%28jog+job+dial%29+or+%28jo . . . , last visited Feb. 27, 2011, two pages.
Anonymous. (2011). “What Is a Jog Wheel?” Ask Search located at http://www.ask.com/web?q=what+is+a+jog+wheel&search=&qsrc=0&o=0&1=dir, last visited Feb. 27, 2011, two pages.
Anonymous. (2012). “Emulating Mouse Scroll Wheel?” Ask Search located at http://www.ask.com/web?q=emulate+scroll+wheel&qsrc=1&o=0&1=dir&qid=A23E49EA, last visited Mar. 23, 2012, one page.
Anonymous. (2012). “Virtual Scroll Wheel,” Ask Search located at http://www.ask.com/web?q=virtual+scroll+wheel&qsrc=0&o=0&1=dir&oo=0, last visited Mar. 23, 2012, two pages.
Notice of Allowance mailed Apr. 2, 2012, for U.S. Appl. No. 11/038,590, filed Jan. 18, 2005, 11 pages.
Non-Final Office Action mailed May 7, 2012, for U.S. Appl. No. 12/118,645, filed May 9, 2008, five pages.
Non-Final Office Action mailed May 9, 2012, for U.S. Appl. No. 12/118,641, filed May 9, 2008, four pages.
Non-Final Office Action mailed May 17, 2012, for U.S. Appl. No. 12/118,648, filed May 9, 2008, four pages.
Final Office Action mailed Jun. 7, 2012, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 16 pages.
European Search Report mailed Jun. 8, 2012, for EP Application No. 12166818.0, filed Jan. 30, 2007, seven pages.
European Search Report mailed Jun. 14, 2012, for EP Application No. 12166820.6, filed Jan. 30, 2007, six pages.
Final Office Action mailed Jul. 6, 2012, for U.S. Appl. No. 11/696,693, filed Apr. 4, 2007, 24 pages.
Non-Final Office Action mailed Jul. 27, 2012, for U.S. Appl. No. 12/118,659, filed May 9, 2008, five pages.
Non-Final Office Action mailed Jul. 31, 2012, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 14 pages.
Final Office Action mailed Jan. 4, 2013, for U.S. Appl. No. 12/118,659, filed May 9, 2008, six pages.
Anonymous. (Sep. 2003). “P900 User Guide,” Second Edition, Sony Ericsson Mobile Communications AB, located at http://www.sonyericsson.com/downloads/P900_UG_R1b_EN.pdf, pp. 8, 16, 17, 20, 24-26, 42-45, 137 (98 pages total).
Anonymous. (2004). “Devices,” Technology Loan Catalog, located at http://www.tsbvi.edu/outreach/techloan/catalog.html, last visited Jun. 6, 2008, nine pages.
Anonymous. (2004). “Fastap™ Keypads Redefine Mobile Phones,” DIGITWireless, located at http://www.digitwireless.com, last visited Nov. 18, 2005, nine pages.
Anonymous. (Jan. 21, 2004). “Compare Keyboards with the Keyboard Compatibility Chart, Learn More About Alternative Keyboards,” Solutions for Humans, located at http://www.keyalt.com/kkeybrdp.html, last visited Dec. 8, 2005, five pages.
Anonymous. (2005). “Fastap™,” DIGITWireless, located at http://www.digitwireless.com/about/faq.html, last visited Dec. 6, 2005, five pages.
Anonymous. (2005). “Four-Button Keyboard,” WikiPodLinux located at http://ipodlinux.org/Four-Button_Keyboard, last visited Dec. 5, 2005, two pages.
Anonymous. (2005). “Introducing the Ultimate Smartphone Keypad,” Delta II™ Keypads, Chicago Logic, Inc., located at http://www.chicagologic.com, last visited Nov. 18, 2005, two pages.
Anonymous. (2005). “T9® Text Input for Keypad Devices,” Tegic Communications, located at http://www.tegic.com, last visited Nov. 18, 2005, one page.
Anonymous. (2005). “Text Input (Legacy),” WikiPodLinux located at http://ipodlinux.org/Text_Input_%28legacy%29, last visited Dec. 5, 2005, eight pages.
Anonymous. (2005). “Text Input Concepts,” WikiPodLinux located at http://web.archive.org/web/20051211165254/http://ipodlinux.org/Text_Input_Concepts, last visited Dec. 5, 2005, three pages.
Anonymous. (2005). “Text Input Methods,” WikiPodLinux located at http://ipodlinux.org/Text_Input_Methods, last visited Dec. 5, 2005, five pages.
Anonymous. (2005). “You Heard of Touch Screens Now Check Out Touch Keys,” Phoneyworld, located at http://www.phoneyworld.com/newspage.aspx?n=1413, last visited Nov. 18, 2005, two pages.
Anonymous. (Apr. 6, 2005). “Microsoft's New Smart Phone Interface: Your Thumb,” textually.org located at http://www.textually.org/textually/archives/2005/04/007819.html, last visited Nov. 18, 2005, two pages.
Anonymous. (Jun. 22, 2005). “LG Develops New Touch Pad Cell Phones,” textually.org located at http://textually.org/textually/archives/2005/06/009903.html, last visited Nov. 18, 2005, one page.
Anonymous. (Nov. 3, 2005). “Samsung Releases Keyboard Phone in US,” textually.org located at http://www.textually.org/textually/archives/2005/11/010482.html, last visited Nov. 18, 2005, one page.
Anonymous. (2006). “Centroid,” located at http://faculty.evansville.edu/ck6/tcenter/class/centroid.html, last visited Apr. 28, 2006, one page.
Anonymous. (2006). “Centroid,” located at http://www.pballew.net/centroid.html, last visited Apr. 28, 2006, three pages.
Casario, M. (Oct. 5, 2005). “Hands on Macromedia World: Touch Screen Keypad for Mobile Phone by DoCoMo,” located at http://casario.blogs.com/mmworld/2005/10/touch_screen_ke.html, last visited Nov. 18, 2005, one page.
Day, B. (Jan. 6, 2004). “Will Cell Phones Render iPods Obsolete?” Java.net, located at http://weblogs.java.net/pub/wlg/883, last visited Dec. 12, 2005, three pages.
Final Office Action mailed May 12, 2009, for U.S. Appl. No. 11/228,700, filed Sep. 16, 2005, 13 pages.
Final Office Action mailed Dec. 8, 2009, for U.S. Appl. No. 11/459,615, filed Jul. 24, 2006, 11 pages.
Final Office Action mailed Nov. 18, 2010, for U.S. Appl. No. 11/961,663, filed Dec. 20, 2007, 14 pages.
Final Office Action mailed Feb. 17, 2012, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 14 pages.
Final Office Action mailed Mar. 1, 2012, for U.S. Appl. No. 11/830,793, filed Jul. 30, 2007, 14 pages.
Final Office Action mailed Apr. 13, 2012, for U.S. Appl. No. 12/468,401, filed May 19, 2009, 21 pages.
Final Office Action mailed Apr. 16, 2012, for U.S. Appl. No. 12/422,222, filed Apr. 10, 2009, 14 pages.
Final Office Action mailed Apr. 25, 2012, for U.S. Appl. No. 12/422,197, filed Apr. 10, 2009, 12 pages.
Final Office Action mailed Apr. 27, 2012, for U.S. Appl. No. 11/830,788, filed Jul. 30, 2007, 13 pages.
Final Office Action mailed May 1, 2012, for U.S. Appl. No. 12/434,439, filed May 1, 2009, 14 pages.
Final Office Action mailed May 8, 2012, for U.S. Appl. No. 12/139,411, filed Jun. 13, 2008, seven pages.
Final Office Action mailed May 9, 2012, for U.S. Appl. No. 11/980,722, filed Oct. 31, 2007, 14 pages.
Final Office Action mailed May 24, 2012, for U.S. Appl. No. 11/830,757, filed Jul. 30, 2007, 19 pages.
Final Office Action mailed May 29, 2012, for U.S. Appl. No. 12/500,978, filed Jul. 10, 2009, 10 pages.
Final Office Action mailed Jun. 8, 2012, for U.S. Appl. No. 12/422,225, filed Apr. 10, 2009, 12 pages.
Final Office Action mailed Jul. 6, 2012, for U.S. Appl. No. 11/830,815, filed Jul. 30, 2007, 15 pages.
Final Office Action mailed Jul. 27, 2012, for U.S. Appl. No. 11/696,701, filed Apr. 4, 2007, 13 pages.
Final Office Action mailed Aug. 3, 2012, for U.S. Appl. No. 11/830,808, filed Jul. 30, 2007, seven pages.
Final Office Action mailed Sep. 18, 2012, for U.S. Appl. No. 11/852,690, filed Sep. 10, 2007, 13 pages.
Final Office Action mailed Oct. 12, 2012, for U.S. Appl. No. 12/118,648, filed May 9, 2008, eight pages.
Final Office Action mailed Oct. 25, 2012, for U.S. Appl. No. 11/832,134, filed Aug. 1, 2007, 27 pages.
Final Office Action mailed Dec. 12, 2012, for U.S. Appl. No. 12/500,984, filed Jul. 10, 2009, 10 pages.
International Search Report mailed Apr. 11, 2008, for PCT Application No. PCT/US2007/060119, nine pages.
International Search Report mailed Sep. 15, 2008, for PCT Application No. PCT/US2007/088904, nine pages.
Non-Final Office Action mailed Sep. 17, 2008, for U.S. Appl. No. 11/228,700, filed Sep. 16, 2005, 18 pages.
Non-Final Office Action mailed May 22, 2009, for U.S. Appl. No. 11/459,615, filed Jul. 24, 2006, nine pages.
Non-Final Office Action mailed May 28, 2009, for U.S. Appl. No. 11/459,606, filed Jul. 24, 2006, 17 pages.
Non-Final Office Action mailed Apr. 13, 2010, for U.S. Appl. No. 11/459,615, filed Jul. 24, 2006, nine pages.
Non-Final Office Action mailed Mar. 2, 2012, for U.S. Appl. No. 11/852,690, filed Sep. 10, 2007, 12 pages.
Non-Final Office Action mailed Apr. 5, 2012, for U.S. Appl. No. 12/479,678, filed Jun. 5, 2009, 13 pages.
Non-Final Office Action mailed Apr. 16, 2012, for U.S. Appl. No. 12/422,212, filed Apr. 10, 2009, 20 pages.
Non-Final Office Action mailed May 2, 2012, for U.S. Appl. No. 11/832,134, filed Aug. 1, 2007, 25 pages.
Non-Final Office Action mailed May 23, 2012, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, 15 pages.
Non-Final Office Action mailed Jun. 7, 2012, for U.S. Appl. No. 11/830,766, filed Jul. 30, 2007, 14 pages.
Non-Final Office Action mailed Jun. 13, 2012, for U.S. Appl. No. 12/500,984, filed Jul. 10, 2009, eight pages.
Non-Final Office Action mailed Jun. 22, 2012, for U.S. Appl. No. 11/830,774, filed Jul. 30, 2007, 17 pages.
Non-Final Office Action mailed Jun. 25, 2012, for U.S. Appl. No. 11/830,781, filed Jul. 30, 2007, 19 pages.
Non-Final Office Action mailed Jun. 25, 2012, for U.S. Appl. No. 12/479,573, filed Jun. 5, 2009, 11 pages.
Non-Final Office Action mailed Aug. 21, 2012, for U.S. Appl. No. 12/434,439, filed May 1, 2009, 12 pages.
Non-Final Office Action mailed Aug. 30, 2012, for U.S. Appl. No. 11/830,801, filed Jul. 30, 2007, 15 pages.
Non-Final Office Action mailed Sep. 4, 2012, for U.S. Appl. No. 11/830,788, filed Jul. 30, 2007, 13 pages.
Non-Final Office Action mailed Sep. 6, 2012, for U.S. Appl. No. 11/830,793, filed Jul. 30, 2007, 14 pages.
Non-Final Office Action mailed Sep. 13, 2012, for U.S. Appl. No. 12/422,205, filed Apr. 10, 2009, 16 pages.
Non-Final Office Action mailed Sep. 25, 2012, for U.S. Appl. No. 11/428,515, filed Jul. 3, 2006, 15 pages.
Non-Final Office Action mailed Oct. 12, 2012, for U.S. Appl. No. 12/422,212, filed Apr. 10, 2009, five pages.
Non-Final Office Action mailed Oct. 26, 2012, for U.S. Appl. No. 12/468,401, filed May 19, 2009, 22 pages.
Non-Final Office Action mailed Nov. 8, 2012, for U.S. Appl. No. 12/479,678, filed Jun. 5, 2009, five pages.
Non-Final Office Action mailed Nov. 16, 2012, for U.S. Appl. No. 13/569,065, filed Aug. 7, 2012, eight pages.
Non-Final Office Action mailed Nov. 29, 2012, for U.S. Appl. No. 11/559,833, filed Nov. 14, 2006, 10 pages.
Non-Final Office Action mailed Dec. 18, 2012, for U.S. Appl. No. 11/878,024, filed Jul. 20, 2007, eight pages.
Non-Final Office Action mailed Dec. 19, 2012, for U.S. Appl. No. 11/048,264, filed Jan. 31, 2005, nine pages.
Non-Final Office Action mailed Jan. 4, 2013, for U.S. Appl. No. 12/139,411, filed Jun. 13, 2008, six pages.
Notice of Allowance mailed Mar. 26, 2012, for U.S. Appl. No. 12/500,973, filed Jul. 10, 2009, 12 pages.
Notice of Allowance mailed Apr. 13, 2012, for U.S. Appl. No. 12/342,027, filed Dec. 22, 2008, 10 pages.
Notice of Allowance mailed Jun. 27, 2012, for U.S. Appl. No. 11/559,736, filed Nov. 14, 2006, 11 pages.
Notice of Allowance mailed Jul. 26, 2012, for U.S. Appl. No. 11/428,503, filed Jul. 3, 2006, nine pages.
Notice of Allowance mailed Aug. 22, 2012, for U.S. Appl. No. 11/559,763, filed Nov. 14, 2006, 10 pages.
Notice of Allowance mailed Sep. 6, 2012, for U.S. Appl. No. 13/556,019, filed Jul. 23, 2012, seven pages.
Notice of Allowance mailed Jan. 15, 2013, for U.S. Appl. No. 12/479,617, filed Jun. 5, 2009, 10 pages.
O'Neal, W. (2005). “Smart Phones with Hidden Keyboards,” located at http://msn.com.com/4250-6452_16-6229969-1.html, last visited Nov. 18, 2005, three pages.
Sears, A. et al. (2005). “Data Entry for Mobile Devices Using Soft Keyboards: Understanding the Effects of Keyboard Size and User Tasks,” Abstract, Int'l Journal of Human-Computer Interaction, vol. 16, No. 2, one page.
Non-Final Office Action mailed Feb. 14, 2013, for U.S. Appl. No. 11/228,758, filed Sep. 16, 2005, 14 pages.
Number | Date | Country
---|---|---
20060026521 A1 | Feb. 2006 | US