The disclosure relates to moving a graphical selector.
Computing devices often have a display device that displays a graphical selector, such as a cursor or a pointer controlled by a mouse or other input device, which may be used to interact with other displayed elements. A user may wish to move the graphical selector to a desired location on the display device. When the display device is a presence-sensitive device, such as a touch screen, the user may attempt to move the graphical selector by placing a finger over the desired location.
In one example, a method includes activating, by a computing device, a graphical key that is displayed with a presence-sensitive interface of the computing device. Upon activation of the graphical key, the method also includes receiving gesture input corresponding to a directional gesture using the presence-sensitive interface of the computing device and moving a graphical selector displayed with the presence-sensitive interface from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input.
In another example, a tangible computer-readable medium comprises instructions for causing a programmable processor to perform operations including activating, by a computing device, a graphical key that is displayed with a presence-sensitive interface of the computing device. The instructions further include, upon activation of the graphical key, receiving gesture input corresponding to a directional gesture using the presence-sensitive interface of the computing device. The instructions also include moving a graphical selector displayed by the computing device from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input.
In yet another example, a computing device includes one or more processors. The computing device may also include an input device that receives gesture input corresponding to a directional gesture and an output device that displays a graphical selector. The computing device further includes means for moving the graphical selector from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
In accordance with common practice, the various described features are not drawn to scale and are drawn to emphasize features relevant to the present disclosure. Like reference characters denote like elements throughout the figures and text.
Techniques of the present disclosure allow a computing device to implement, on a graphical user interface (GUI), a graphical key that moves a graphical selector based on detected user gestures. In response to a swipe or fling gesture originating from the graphical key, the graphical selector (e.g., a cursor or pointer) may be moved to a new location on a display based on properties of the user gesture. The movement of the graphical selector on the display may be defined in specific increments based on properties of the gesture, which may include, for example, a duration, type, displacement, or speed of the gesture. When the computing device has a relatively small screen (as on a mobile phone, tablet computer, or other mobile device), it may be difficult for a user to precisely pinpoint the location of the graphical selector. Techniques of the present disclosure may conserve screen real estate (e.g., by reducing four or more directional arrow keys to a single graphical key) and may provide more precise movement of a graphical selector on small and medium size presence-sensitive screens, where user input is often provided using a finger.
For example, a user performs a swipe-left gesture, e.g., a swipe motion starting at the graphical key and moving to the left, to cause a cursor in a textual environment to move one character to the left. In another example, a fling-down gesture, e.g., a fling motion starting at the graphical key and moving downward, causes the cursor to move downward one line. In some examples, the movement of the cursor is based on or proportional to the distance traveled by the gesture, e.g., a relatively longer gesture (e.g., a gesture whose length is equal to or greater than a length threshold) moves the cursor by an entire word while a relatively shorter gesture (e.g., a gesture whose length is less than the length threshold) moves the cursor by a single character. In other examples, the movement of the cursor is based on or proportional to the speed of the gesture, e.g., a fast upward gesture moves the cursor up to the top of a current paragraph, up to the top of a preceding paragraph, or up to the top of the textual environment. In still other examples, the magnitude of the cursor movement may be based on the duration that the user presses the graphical key before release.
This functionality may be provided by adding a graphical key to a graphical keyboard. Other examples overlay the functionality onto a pre-existing key of a graphical keyboard, for example, onto a space bar. In such examples, a tap made at the space bar adds a space to the text, while a swipe gesture originating at the space bar moves the cursor based on a vector (e.g., magnitude and direction) of the gesture. The present disclosure can also apply outside of the textual environment, e.g., in a pictorial context.
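By way of illustration only, the following Kotlin sketch shows one way the space-bar overlay described above might be dispatched, treating short contacts as taps that insert a space and larger displacements as cursor-moving swipes. The type names and the tap-radius threshold are hypothetical assumptions and are not part of the disclosure.

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Hypothetical displacement vector of a gesture that originated at the space bar.
data class Gesture(val dx: Float, val dy: Float)

sealed class Action {
    object InsertSpace : Action()
    data class MoveCursor(val chars: Int, val lines: Int) : Action()
}

// Assumed threshold: contacts that move less than this are treated as taps.
const val TAP_RADIUS_PX = 10f

fun dispatchSpaceBar(gesture: Gesture): Action {
    if (hypot(gesture.dx, gesture.dy) < TAP_RADIUS_PX) return Action.InsertSpace
    // A swipe moves the cursor along the dominant axis of its vector.
    return if (abs(gesture.dx) >= abs(gesture.dy)) {
        Action.MoveCursor(chars = if (gesture.dx > 0) 1 else -1, lines = 0)
    } else {
        Action.MoveCursor(chars = 0, lines = if (gesture.dy > 0) 1 else -1)
    }
}
```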
Computing device 2 may include an input/output (I/O) device 12 (e.g., a presence- or touch-sensitive device). In some examples, I/O device 12 may be a presence-sensitive screen capable of detecting gestures made proximate to the presence-sensitive screen. In other examples, I/O device 12 may be a touch-based interface capable of receiving touch input from a user 14 (e.g., touch screen, track pad, track point, or the like). In some examples, I/O device 12 may comprise a display device 20. User 14 may interact with display device 20, for example, by performing touch input on I/O device 12. For purposes of illustration only, in this disclosure, I/O device 12 is described as a touch-sensitive device 12, but aspects of this disclosure should not be considered limited to such devices. In other examples, techniques disclosed herein are applied to a presence-sensitive device.
Computing device 2 includes a user input module 6 that, when executed, may provide functionality to computing device 2 to receive and interpret user input. Computing device 2 further includes keyboard application 8 that, when executed, may provide functionality to computing device 2 to provide graphical key 18. Keyboard application 8 may also provide signals to display device 20 to display information related to gesture input. User input module 6 may also provide signals related to user input to keyboard application 8.
Keyboard application 8 may include a graphical key module 10. Graphical key module 10, in various instances, provides computing device 2 with capabilities to display graphical key 18 that may be used to move graphical selector 24. For example, graphical key module 10 provides capabilities to move graphical selector 24 by a selected increment from a first graphical location 26 to a second graphical location 28.
Display device 20 of computing device 2 may display text 22, graphical selector 24, and graphical key 18. Graphical key 18 may be a graphical representation of a virtual button or icon, for example, a touch target. Graphical key 18 may be an image having any shape, size, coloration, or style that is displayed on display device 20. In one example, display device 20 may also display a graphical keyboard 4. In some examples, graphical key 18 is part of graphical keyboard 4. In other examples, graphical key 18 is a functionality overlaid on a pre-existing key in graphical keyboard 4. In additional examples, graphical key 18 is not part of graphical keyboard 4. In alternative examples, graphical key 18 is located at any location of display device 20. In another example, graphical key 18 is overlaid on top of graphical keyboard 4.
At least part of computing device 2 may be operating in a textual environment, for example, where display device 20 may display text 22. In such an example, graphical selector 24 may be a text cursor.
In order to place graphical selector 24 in the desired location (in this example, on either side of the “w”), user 14 may activate graphical key 18 in order to precisely move graphical selector 24. User 14 may tap graphical key 18 to activate it, for example. Alternatively, graphical key 18 may be active at all times when it is displayed. User 14 may perform a swipe gesture originating from graphical key 18 in order to reposition graphical selector 24.
In some examples, graphical key 18 may be displayed at some times or at all times during operation of computing device 2. In one example, graphical key 18 may only be displayed when computing device 2 is operating in an environment pertaining to graphical key 18. For example, graphical key 18 may be displayed to move a cursor when user 14 is operating in a text-based environment. As another example, graphical key 18 may be displayed whenever graphical selector 24 is also displayed. In another example, graphical key 18 may be displayed when graphical keyboard 4 is also displayed. In yet another example, graphical key 18 may be displayed upon activation of a mode (e.g., activating a mode button) to initiate movement of graphical selector 24.
Graphical key module 10 may interpret gesture input 16 as a command to move graphical selector 24 by an increment 29 to a second graphical location 28. In some examples, graphical key module 10 of keyboard application 8 may detect at least one property of gesture input 16. In another example, user input module 6 may detect at least one property of gesture input 16 and provide a signal related to the at least one property to keyboard application 8. Based on the at least one property of gesture input 16, keyboard application 8 interprets gesture input 16 as corresponding to one or more increments 29 in one or more directions. In response, keyboard application 8 may move graphical selector 24 based on the at least one property of gesture input 16.
Properties of gesture input 16 may, for example, include a direction of gesture input 16, a displacement of gesture input 16, a speed of gesture input 16, a width of gesture input 16, a duration for which graphical key 18 is pressed, or any other property. In one example where gesture input 16 is a touch-based input, a width of gesture input 16 may be related to a length of gesture input 16 in a direction orthogonal to the direction of the larger movement. For example, a width of a touch-based input may correspond to the width of a finger of user 14 that is making gesture input 16; a width of a touch-based input from a finger may be larger than a width of a touch-based input from a stylus. In another example, the length of time that graphical key 18 is pressed before a fling or swipe gesture is made may be used to determine how far graphical selector 24 is moved. For example, a shorter touch before a fling may move graphical selector 24 a first distance, whereas a longer touch before a fling may move graphical selector 24 a second distance larger than the first distance.
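As a minimal sketch of how such properties might be computed from a sequence of touch samples, the following Kotlin is illustrative only; the sample format, field names, and units are assumptions rather than part of the disclosure.

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

// Hypothetical touch sample: a position in pixels and a timestamp in milliseconds.
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

data class GestureProperties(
    val directionRad: Float,    // angle of the overall movement
    val displacementPx: Float,  // straight-line distance from start to end
    val speedPxPerMs: Float,    // displacement divided by gesture duration
    val pressDurationMs: Long   // time the graphical key was held before the gesture
)

fun gestureProperties(samples: List<TouchSample>, keyPressMs: Long): GestureProperties {
    val first = samples.first()
    val last = samples.last()
    val dx = last.x - first.x
    val dy = last.y - first.y
    val displacement = hypot(dx, dy)
    val durationMs = (last.timeMs - first.timeMs).coerceAtLeast(1L)
    return GestureProperties(
        directionRad = atan2(dy, dx),
        displacementPx = displacement,
        speedPxPerMs = displacement / durationMs,
        pressDurationMs = keyPressMs
    )
}
```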
In one specific example, computing device 2 includes one or more processors 30, memory 32, a network interface 34, one or more storage devices 36, one or more input devices 38, one or more output devices 40, and one or more power sources 42. Computing device 2 may also include an operating system 44 and one or more applications 46, such as keyboard application 8, that are executable by computing device 2.
Processors 30 may be configured to implement functionality and/or process instructions for execution in computing device 2. Processors 30 may be capable of processing instructions stored in memory 32 or instructions stored on storage devices 36.
Memory 32 may be configured to store information within computing device 2 during operation. Memory 32 may, in some examples, be described as a non-transitory or tangible computer-readable storage medium. In some examples, memory 32 is a temporary memory, meaning that a primary purpose of memory 32 is not long-term storage. Memory 32 may also, in some examples, be described as a volatile memory, meaning that memory 32 does not maintain stored contents when the computer is turned off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, memory 32 may be used to store program instructions for execution by processors 30. Memory 32 may be used by software or applications running on computing device 2 (e.g., one or more of applications 46) to temporarily store information during program execution.
Storage devices 36 may also include one or more non-transitory or tangible computer-readable storage media. Storage devices 36 may be configured to store larger amounts of information than memory 32. Storage devices 36 may further be configured for long-term storage of information. In some examples, storage devices 36 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In one example, storage device 36 contains a database that includes a mapping of properties of gesture inputs (such as gesture input 16) to increments (such as increment 29). In such an example, graphical key module 10 may access the database stored in storage device 36 when interpreting gesture input 16.
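For illustration, a minimal sketch of such a mapping is shown below as an in-memory table rather than an on-device database; the increment names, rule format, and threshold values are assumptions and not part of the disclosure.

```kotlin
// Hypothetical cursor increments for a textual environment.
enum class Increment { CHARACTER, WORD, LINE, PARAGRAPH }

// Hypothetical rule: gestures whose displacement meets the minimum map to the increment.
data class Rule(val minDisplacementCm: Float, val increment: Increment)

// Rules ordered by displacement; the largest satisfied rule wins.
val incrementMapping = listOf(
    Rule(0.0f, Increment.CHARACTER),
    Rule(1.0f, Increment.WORD),
    Rule(3.0f, Increment.PARAGRAPH)
)

fun lookUpIncrement(displacementCm: Float): Increment =
    incrementMapping.last { displacementCm >= it.minDisplacementCm }.increment
```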
Computing device 2 also includes a network interface 34. Computing device 2 may utilize network interface 34 to communicate with external devices via one or more networks. In one example, network interface 34 may correspond to an interface for receiving data from other computing devices.
Examples of such network interfaces 34 may include Bluetooth®, 3G and WiFi® radios in mobile computing devices as well as USB. Network interface 34 may be configured to connect to a wide-area network such as the Internet, a local-area network (LAN), an enterprise network, a wireless network, a cellular network, a telephony network, a Metropolitan area network (e.g., Wi-Fi, WAN, or WiMAX), one or more other types of networks, or a combination of two or more different types of networks (e.g., a combination of a cellular network and the Internet). In some examples, computing device 2 may utilize network interface 34 to wirelessly communicate with an external device or other networked computing device.
Computing device 2 may also include one or more input devices 38. Input device 38 may be configured to receive input from user 14 through tactile, audio, or video input. Examples of input device 38 may include a touch-sensitive display, a mouse, a keyboard, a voice responsive system, a microphone, a video camera, or any other type of device for detecting a command from user 14.
One or more output devices 40 may also be included in computing device 2, e.g., display device 20. Output device 40 may be configured to provide output to user 14 using tactile, audio, or video stimuli. Output device 40 may include a touch-sensitive display, a sound card, a video graphics adapter card, or any other type of device for converting a signal into an appropriate form understandable to humans or machines. Additional examples of output device 40 may include a speaker, a cathode ray tube (CRT) monitor, a liquid crystal display (LCD), or any other type of device that can provide output to user 14.
Computing device 2 may include one or more batteries or other power sources 42, which may be rechargeable and provide power to computing device 2. The one or more batteries 42 may be made from nickel-cadmium, lithium-ion, or any other suitable material. In other examples, the one or more power sources 42 are located external to computing device 2. The one or more batteries 42 may be rechargeable and/or computing device 2 may be powered via a power connection.
Computing device 2 may include operating system 44. Operating system 44 may control the operation of components of computing device 2. For example, operating system 44 may facilitate the interaction of application 46 or keyboard application 8 with processors 30, memory 32, network interface 34, storage device 36, input device 38, output device 40, and battery 42. Examples of operating system 44 may include Android®, Apple iOS®, Blackberry® OS, Symbian OS®, Linux®, Microsoft Windows Phone 7®, or the like.
Keyboard application 8 may additionally include graphical key module 10, which may be executed as part of operating system 44. In other cases, graphical key module 10 may be implemented or executed by other components of computing device 2. Graphical key module 10 may process gesture input, e.g., gesture input 16, and may provide a graphical key 18 for display device 20 to display. Additionally, graphical key module 10 may receive input from a component such as processors 30, memory 32, network interface 34, storage devices 36, one or more output devices 40, battery 42, or operating system 44. In some cases, graphical key module 10 may perform additional processing on gesture input 16 or graphical selector 24. In other cases, graphical key module 10 may transmit input to an application, e.g., application 46, or other component in computing device 2.
Any applications, e.g., application 46 or keyboard application 8, implemented within or executed by computing device 2 may be implemented or contained within, operable by, executed by, and/or be operatively/communicatively coupled to components of computing device 2, e.g., processors 30, memory 32, network interface 34, and/or storage devices 36.
Method 50 includes activating a graphical key that is displayed with a touch-based interface of a computing device (52). For example, user 14 activates graphical key 18 by touching the graphical key with a finger or other device. In one example, graphical key 18 may be deactivated when the touch that activated graphical key 18 is released. In other examples, tapping graphical key 18 may activate and deactivate graphical key 18. In another example, a graphical key mode may be toggled by a swipe gesture made anywhere on display device 20 (that is, not only a swipe originating at graphical key 18). In such an example, a swipe gesture performed anywhere on touch-based device 12 may be used to move graphical selector 24 while computing device 2 is operating in the graphical key mode. In one example, the graphical key mode may be toggled by tapping graphical key 18.
Upon activation of the graphical key, method 50 may further include receiving gesture input corresponding to a directional gesture using the touch-based interface of the computing device (54). For example, the gesture input corresponding to a directional gesture may be a swipe motion originating at the graphical key, for example, gesture input 16 originating at graphical key 18. In other examples, gesture input 16 may be a swipe motion that does not originate at graphical key 18.
Method 50 may further include moving a graphical selector displayed by the computing device from a first graphical location to a second graphical location by at least one selected increment based on a property of the gesture input (56). For example, graphical selector 24 may be moved from first graphical location 26 to second graphical location 28 based on a direction of gesture input 16. That is, the distance graphical selector 24 is moved, and in what direction graphical selector 24 is moved, may be based on one or more properties of gesture input 16.
For example, the property of gesture input 16 may be a direction of gesture input 16. In such an example, graphical selector 24 may be moved in a direction related to the direction of gesture input 16 (e.g., to the left for a leftward swipe).
In another example, a property of gesture input 16 may be a displacement of gesture input 16. For example, the distance from a location where gesture input 16 is initiated (for example, where a touch begins) to a location where gesture input 16 is completed (for example, where the touch is released) is used to determine how far to move graphical selector 24. That is, graphical key module 10 may select which increment 29, or how many increments 29, to apply when moving graphical selector 24 based on the displacement of gesture input 16.
For example, when the displacement of gesture input 16 is below a displacement threshold, graphical key module 10 selects an increment of a first amount. Similarly, when the displacement of gesture input 16 is equal to or above the displacement threshold, graphical key module 10 selects an increment of a second amount, wherein the second amount is larger than the first amount. For example, the displacement threshold may be set to be approximately 1 centimeter (cm). Whenever a directional gesture is less than approximately 1 cm, graphical selector 24 may be moved by one letter, space, or character in a direction corresponding to the directional gesture. In contrast, whenever a directional gesture is equal to or greater than approximately 1 cm, graphical selector 24 may be moved by one word in a direction corresponding to the directional gesture. In other examples, other increments and thresholds may be implemented.
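A minimal sketch of this displacement test follows, reusing the hypothetical Increment type from the earlier sketch and assuming a screen density value for converting pixels to centimeters; both constants are assumptions rather than values taken from the disclosure.

```kotlin
// Assumed density of the presence-sensitive screen, in pixels per centimeter.
const val PIXELS_PER_CM = 160f

// Displacement threshold of approximately 1 cm, per the example above.
const val DISPLACEMENT_THRESHOLD_CM = 1.0f

// Below the threshold the selector moves by one character; at or above it, by one word.
fun incrementForDisplacement(displacementPx: Float): Increment {
    val displacementCm = displacementPx / PIXELS_PER_CM
    return if (displacementCm < DISPLACEMENT_THRESHOLD_CM) Increment.CHARACTER
    else Increment.WORD
}
```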
In another example, a property of gesture input 16 may be a speed of the gesture input 16. In such an example, graphical selector 24 may be moved by a first increment when the speed of gesture input 16 is below a speed threshold and may be moved by a second increment when the speed of the gesture input 16 is above or equal to the speed threshold. In such an example, the second increment may be larger than the first increment. For example, when the speed of gesture input 16 is relatively slow, graphical selector 24 may be moved by a letter, space, character, or line in a direction corresponding to the direction of gesture input 16. Likewise, when the speed of gesture input 16 is relatively quick, graphical selector 24 may be moved by a word or paragraph in a direction corresponding to the direction of gesture input 16. In other examples, other increments and thresholds may be implemented.
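A corresponding sketch for the speed-based case is shown below, again reusing the hypothetical Increment type; the speed threshold and the choice of increments are assumptions for illustration only.

```kotlin
// Assumed threshold separating relatively slow gestures from relatively fast ones.
const val SPEED_THRESHOLD_PX_PER_MS = 1.5f

// A slow gesture moves the selector by a character; a fast gesture moves it by a word.
fun incrementForSpeed(speedPxPerMs: Float): Increment =
    if (speedPxPerMs < SPEED_THRESHOLD_PX_PER_MS) Increment.CHARACTER
    else Increment.WORD
```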
In other examples, other properties of gesture input 16 may be used in determining how to move graphical selector 24. For example, which increment 29, and how many increments 29, graphical selector 24 moves may be based on the duration that user 14 presses graphical key 18 before releasing it.
In further examples, graphical selector 24 may be moved by one or more selected increments. In such an example, the number of selected increments may be proportional to at least one property of the gesture input. That is, the number of operations performed by graphical key module 10 in response to gesture input 16 may increase with the displacement of gesture input 16. For example, graphical key module 10 may move graphical selector 24 by one increment for approximately every 1 cm of displacement of gesture input 16. In such an example, when gesture input 16 corresponds to a directional gesture of approximately 4.3 cm, graphical key module 10 may move graphical selector 24 by four increments in the direction of gesture input 16. In some examples, a location of graphical selector 24 is updated in real time during performance of the directional gesture. In other examples, graphical selector 24 is not moved until completion of the directional gesture.
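A minimal sketch of this proportional behavior, again assuming a hypothetical pixels-per-centimeter value, is shown below; a 4.3 cm gesture yields four increments, matching the example above.

```kotlin
import kotlin.math.floor

// One increment for approximately every 1 cm of gesture displacement.
fun incrementCount(displacementPx: Float, pixelsPerCm: Float = 160f): Int =
    floor(displacementPx / pixelsPerCm).toInt()
```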
In some examples, a directional gesture may be fragmented into a plurality of sub-gestures. Moving the graphical selector 24 may include moving the graphical selector according to one or more properties of each sub-gesture. For example, when the directional gesture is not an approximately linear gesture input, but rather a curved input, the curve may be fragmented into approximately linear sub-gestures. Each of these sub-gestures may be treated as a separate gesture input 16. In one example, the plurality of sub-gestures is placed into a chronological order of inputs, wherein graphical selector 24 is moved sequentially in the chronological order of inputs. In another example, non-linear directional gestures may result in an approximately curving motion of graphical selector 24.
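One way such fragmentation might be performed is sketched below: the sampled path of the directional gesture is split into a new sub-gesture whenever its heading turns by more than a tolerance. The sample format and tolerance value are assumptions and are not part of the disclosure.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

data class Point(val x: Float, val y: Float)

// Splits a sampled gesture path into approximately linear sub-gestures by starting
// a new segment whenever the heading changes by more than the tolerance (radians).
fun fragment(path: List<Point>, toleranceRad: Float = 0.35f): List<List<Point>> {
    if (path.size < 2) return listOf(path)
    val segments = mutableListOf(mutableListOf(path[0], path[1]))
    var heading = atan2(path[1].y - path[0].y, path[1].x - path[0].x)
    for (i in 2 until path.size) {
        val prev = path[i - 1]
        val next = path[i]
        val h = atan2(next.y - prev.y, next.x - prev.x)
        // Wrap-safe angular difference between the new step and the segment heading.
        val turn = abs(atan2(sin(h - heading), cos(h - heading)))
        if (turn > toleranceRad) {
            segments.add(mutableListOf(prev))  // begin a new, roughly linear sub-gesture
            heading = h
        }
        segments.last().add(next)
    }
    return segments
}
```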
In another example, graphical key 18 may be a pre-existing key of graphical keyboard 4, wherein directional functionality is overlaid on the pre-existing key. In one example, the pre-existing key of graphical keyboard 4 may be a space bar. Activating graphical key 18 may include performing a swipe gesture originating at the space bar.
In one example, the response of graphical key 18 may be user configurable. That is, user 14 may select what properties of gesture input 16 are recognized and how those properties are used to move graphical selector 24. This user configurable information may be stored in a mapping database in storage device 36.
In some examples, computing device 2 may operate at least partially in a textual environment. In such a textual environment, graphical selector 24 may be a text cursor. Similarly, an increment may be one of a space, a letter, a word, a line, a sentence, a paragraph, or a combination thereof. In other examples, computing device 2 may operate at least partially in a pictorial or image environment. In such an image environment, graphical selector 24 may be a pointer. Likewise, an increment may be one of a single pixel, a selected number of pixels, a block, an image segment, or a combination thereof.
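As an illustration of the pictorial case, the following sketch applies a selected pixel-based increment to a pointer position; the increment sizes and type names are hypothetical.

```kotlin
data class PointerPosition(val x: Int, val y: Int)

// Hypothetical pictorial increments, expressed in pixels.
enum class PixelIncrement(val pixels: Int) {
    SINGLE_PIXEL(1),
    PIXEL_BLOCK(16)
}

// Moves the pointer by the selected increment in the direction (dx, dy),
// where dx and dy are each -1, 0, or 1.
fun movePointer(pos: PointerPosition, increment: PixelIncrement, dx: Int, dy: Int): PointerPosition =
    PointerPosition(pos.x + dx * increment.pixels, pos.y + dy * increment.pixels)
```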
In another example, a short or slow gesture may move pointer 74 one pixel, row, or column in a direction corresponding to the gesture, while a long or fast gesture may move pointer 74 by two or more pixels, rows, or columns in a direction corresponding to the gesture.
Techniques of the present disclosure may provide a graphical key that may be used to precisely locate a graphical selector displayed on a computing device. The graphical selector may be moved from a first graphical location to a second graphical location by at least one selected increment based on a property of a gesture input. For example, a short or slow gesture may move the graphical selector one character or line in a direction corresponding to the gesture, while a long or fast gesture may move the graphical selector one word or paragraph in a direction corresponding to the gesture. This allows a user to finely move a graphical selector, even when a touch-sensitive screen of the computing device is relatively small compared with the implement used to touch the touch-sensitive screen.
Techniques described herein may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described embodiments may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing circuitry” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described herein. In addition, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units are realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
Techniques described herein may also be embodied or encoded in an article of manufacture including a computer-readable storage medium encoded with instructions. Instructions embedded or encoded in an article of manufacture including an encoded computer-readable storage medium, may cause one or more programmable processors, or other processors, to implement one or more of the techniques described herein, such as when instructions included or encoded in the computer-readable storage medium are executed by the one or more processors. Computer readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer readable media. In some examples, an article of manufacture may comprise one or more computer-readable storage media.
In some examples, computer-readable storage media may comprise non-transitory media. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in RAM or cache).
Various aspects of the disclosure have been described. Aspects or features of examples described herein may be combined with any other aspect or feature described in another example. These and other embodiments are within the scope of the following claims.