Method, device, and graphical user interface providing word recommendations for text input

Information

  • Patent Grant
  • Patent Number
    11,474,695
  • Date Filed
    Monday, July 26, 2021
  • Date Issued
    Tuesday, October 18, 2022
Abstract
A portable electronic device having a touch screen display performs a set of operations, including displaying a plurality of key icons, each having an adjustable size hit region, and receiving a sequence of individual touch points input by a user on the touch screen display. The operations performed by the device further include processing the received individual touch points by: forming a user-input directed graph for the sequence of individual touch points received so far, determining a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons, displaying a sequence of characters corresponding to the sequence of individual touch points, and updating sizes of the adjustable hit regions for a plurality of the key icons in accordance with the sequence of individual touch points input by the user.
Description
TECHNICAL FIELD

The disclosed embodiments relate generally to text input on portable communication devices, and more particularly, to methods and systems for providing word recommendations in response to text input.


BACKGROUND

In recent years, the functional capabilities of portable communications devices have increased dramatically. Current devices enable communication by voice, text, and still or moving images. Communication by text, such as by email, instant message (IM) or short messaging service (SMS), has proven to be quite popular.


However, the small size of these portable communication devices also restricts the size of the text input device, such as a physical or virtual keyboard, in the portable device. With a size-restricted keyboard, designers are often forced to make the keys smaller or overload the keys. Both choices may lead to typing mistakes and thus to more backtracking to correct the mistakes. This makes the process of communication by text on such devices inefficient and reduces user satisfaction with such portable communication devices.


Accordingly, there is a need for more efficient ways of entering text into portable devices.


SUMMARY

In accordance with some embodiments, a computer-implemented method, performed at a portable electronic device having a touch screen display, includes displaying a plurality of key icons, each key icon having an adjustable hit region of dynamically adjustable size, and receiving a sequence of individual touch points input by a user on the touch screen display. Each touch point is determined at lift off of a contact from the touch screen display. An image with an enlarged version of a character that will be selected as the character corresponding to an individual touch point is displayed prior to lift off of a respective contact, wherein the character image that is displayed prior to lift off is selected in accordance with the adjustable hit regions of the displayed key icons. After receiving each of the individual touch points, the method performs a set of operations, including: forming a user-input directed graph for the sequence of individual touch points received so far; determining a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons; displaying a sequence of characters corresponding to the sequence of individual touch points, including the determined character; and updating sizes of the adjustable hit regions for a plurality of the key icons.
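The per-touch-point processing described above can be sketched as follows. All names, the key geometry, and the hit-region adjustment policy are illustrative assumptions; the summary leaves the exact mechanics open:

```python
# Hypothetical sketch of the per-touch-point processing loop: resolve each
# touch point to a character via the current (adjustable) hit regions, then
# rebalance the hit regions. Not taken from the patent's actual implementation.

class Key:
    def __init__(self, char, x, y, width, height):
        self.char = char
        # The visible key icon and its hidden hit region start out identical.
        self.hit = [x, y, width, height]

    def contains(self, px, py):
        x, y, w, h = self.hit
        return x <= px < x + w and y <= py < y + h

def likely_next(touch_points, char):
    # Placeholder predictor; a real device would consult a dictionary
    # of candidate words to estimate which characters come next.
    return False

def process_touch_point(keys, touch_points, point):
    """Append a touch point, determine its character from the current hit
    regions, and update the hit regions of all keys."""
    touch_points.append(point)  # grow the received sequence
    char = next((k.char for k in keys if k.contains(*point)), None)
    # Widen hit regions of keys likely to follow; shrink unlikely ones.
    # (The scale factors here are arbitrary illustrative values.)
    for k in keys:
        scale = 1.1 if likely_next(touch_points, k.char) else 0.95
        x, y, w, h = k.hit
        k.hit = [x, y, w * scale, h]
    return char
```

With `likely_next` always returning `False`, every hit region shrinks slightly after each touch point; a real predictor would instead grow the regions of probable next characters.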


In accordance with some embodiments, a computer readable storage medium, for use in conjunction with a portable electronic device having a touch screen display, stores one or more programs for execution by one or more processors of the portable electronic device. The one or more programs include instructions for displaying on the touch screen display a plurality of key icons, each key icon having an adjustable hit region of dynamically adjustable size. The one or more programs further include instructions for receiving a sequence of individual touch points input by a user on the touch screen display. Each touch point is determined at lift off of a contact from the touch screen display. An image with an enlarged version of a character that will be selected as the character corresponding to an individual touch point is displayed prior to lift off of a respective contact, wherein the character image that is displayed prior to lift off is selected in accordance with the adjustable hit regions of the displayed key icons. The one or more programs further include instructions for processing the received individual touch points by performing operations after receiving each of the individual touch points, the operations including: forming a user-input directed graph for the sequence of individual touch points received so far; determining a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons; displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points, including the determined character; and updating sizes of the adjustable hit regions for a plurality of the key icons.


In accordance with some embodiments, a portable electronic device having a touch screen display includes one or more processors, memory, and one or more programs stored in the memory, the one or more programs configured to be executed by the one or more processors. The one or more programs include instructions for displaying on the touch screen display a plurality of key icons, each key icon having an adjustable hit region of dynamically adjustable size. The one or more programs further include instructions for receiving a sequence of individual touch points input by a user on the touch screen display. Each touch point is determined at lift off of a contact from the touch screen display. An image with an enlarged version of a character that will be selected as the character corresponding to an individual touch point is displayed prior to lift off of a respective contact, wherein the character image that is displayed prior to lift off is selected in accordance with the adjustable hit regions of the displayed key icons. The one or more programs further include instructions for processing the received individual touch points by performing operations after receiving each of the individual touch points, the operations including: forming a user-input directed graph for the sequence of individual touch points received so far; determining a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons; displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points, including the determined character; and updating sizes of the adjustable hit regions for a plurality of the key icons.


In accordance with some embodiments, a portable electronic device having a touch screen display includes one or more processors and memory. The portable electronic device further includes means for displaying on the touch screen display a plurality of key icons, each key icon having an adjustable hit region of dynamically adjustable size, and means for receiving a sequence of individual touch points input by a user on the touch screen display. Each touch point is determined at lift off of a contact from the touch screen display. An image with an enlarged version of a character that will be selected as the character corresponding to an individual touch point is displayed prior to lift off of a respective contact, wherein the character image that is displayed prior to lift off is selected in accordance with the adjustable hit regions of the displayed key icons. The portable electronic device further includes means for processing the received individual touch points by performing operations after receiving each of the individual touch points, the operations including: forming a user-input directed graph for the sequence of individual touch points received so far; determining a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons; displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points, including the determined character; and updating sizes of the adjustable hit regions for a plurality of the key icons.


In accordance with some embodiments, a computer-implemented method, performed at a portable electronic device having a touch screen display, includes displaying a plurality of key icons, receiving a sequence of individual touch points input by a user on the touch screen display, and displaying a sequence of characters corresponding to the sequence of individual touch points. The method also includes receiving a touch point corresponding to a deletion key icon, and deleting one or more of the displayed characters to produce a shortened sequence of characters. Then the method includes receiving additional individual touch points. After receiving each of the additional individual touch points, the method performs a set of operations, including displaying a current sequence of characters including characters associated with the additional individual touch points, and determining and displaying a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria.
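The post-deletion suggestion behavior described above amounts to a simple gate: a candidate is shown only if it extends the shortened sequence and passes the criteria. A minimal sketch, in which the criteria predicate is hypothetical since the patent leaves the criteria predefined but unspecified:

```python
def suggest(candidates, shortened_prefix, current_input, meets_criteria):
    """Return a suggested character string only if it starts with the
    shortened (post-deletion) sequence of characters AND satisfies the
    predefined suggestion criteria; otherwise suggest nothing."""
    for word in candidates:
        if word.startswith(shortened_prefix) and meets_criteria(word, current_input):
            return word
    return None
```

For example, with a criteria predicate requiring the suggestion to be longer than the current input, `suggest(["apple", "apply"], "app", "appl", ...)` would offer "apple", while candidates not starting with "app" would never be shown.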


In accordance with some embodiments, a computer readable storage medium, for use in conjunction with a portable electronic device having a touch screen display, stores one or more programs for execution by one or more processors of the portable electronic device. The one or more programs include instructions for displaying on the touch screen display a plurality of key icons, for receiving a sequence of individual touch points input by a user on the touch screen display, and for displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points. The one or more programs also include instructions for receiving a touch point corresponding to a deletion key icon, and for deleting one or more of the displayed characters to produce a shortened sequence of characters. The one or more programs further include instructions for receiving additional individual touch points, and instructions for processing the received individual touch points by performing operations after receiving each of the additional individual touch points, including displaying a current sequence of characters including characters associated with the additional individual touch points, and determining and displaying a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria.


In accordance with some embodiments, a portable electronic device having a touch screen display includes one or more processors, memory, and one or more programs stored in the memory, the one or more programs configured to be executed by the one or more processors. The one or more programs include instructions for displaying on the touch screen display a plurality of key icons, for receiving a sequence of individual touch points input by a user on the touch screen display, and for displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points. The one or more programs also include instructions for receiving a touch point corresponding to a deletion key icon, and for deleting one or more of the displayed characters to produce a shortened sequence of characters. The one or more programs further include instructions for receiving additional individual touch points, and instructions for processing the received individual touch points by performing operations after receiving each of the additional individual touch points, including displaying a current sequence of characters including characters associated with the additional individual touch points, and determining and displaying a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria.


In accordance with some embodiments, a portable electronic device having a touch screen display includes one or more processors and memory. The portable electronic device further includes means for displaying on the touch screen display a plurality of key icons, and means for receiving a sequence of individual touch points input by a user on the touch screen display, and means for displaying on the touch screen display a sequence of characters corresponding to the sequence of individual touch points. The portable electronic device also includes means for receiving a touch point corresponding to a deletion key icon, and means for deleting one or more of the displayed characters to produce a shortened sequence of characters. The portable electronic device further includes means for receiving additional individual touch points, and means for processing the received individual touch points by performing operations after receiving each of the additional individual touch points, including displaying a current sequence of characters including characters associated with the additional individual touch points, and determining and displaying a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria.


In accordance with some embodiments, a computer-implemented method, performed at a portable electronic device having a touch screen display, includes: displaying a current character string being input by a user with a soft keyboard in a first area of the touch screen display; displaying a suggested replacement character string for the current character string in a second area of the touch screen display, wherein the second area includes a suggestion rejection icon adjacent to the suggested replacement character string; replacing the current character string in the first area with the suggested replacement character string in response to detecting user activation of a key on the soft keyboard associated with a delimiter; and keeping the current character string in the first area and ceasing to display the suggested replacement character string and the suggestion rejection icon in response to detecting a finger gesture on the suggested replacement character string displayed in the second area.
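The accept/reject behavior in this paragraph can be modeled as a small state machine: a delimiter key accepts the suggestion, while a finger gesture on the suggestion keeps the user's own text. A hedged sketch; the class, method names, and delimiter set are invented for illustration:

```python
class SuggestionBar:
    """Toy model of the two display areas: the text being typed (first
    area) and the suggested replacement string (second area)."""

    def __init__(self):
        self.current = ""        # string shown in the first (text) area
        self.suggestion = None   # string shown in the second (suggestion) area

    def on_key(self, char, delimiters=" .,!?"):
        if char in delimiters and self.suggestion is not None:
            # A delimiter accepts the suggestion: replace the current
            # string with it, then append the delimiter itself.
            self.current = self.suggestion + char
            self.suggestion = None
        else:
            self.current += char

    def on_tap_suggestion(self):
        # Tapping the suggested string (or its rejection icon) keeps the
        # user's own text and dismisses the suggestion display.
        self.suggestion = None
```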


In accordance with some embodiments, a computer readable storage medium has stored therein instructions, which when executed by a portable electronic device with a touch screen display, cause the portable electronic device to: display a current character string being input by a user with a soft keyboard in a first area of the touch screen display; display a suggested replacement character string for the current character string in a second area of the touch screen display, wherein the second area includes a suggestion rejection icon adjacent to the suggested replacement character string; replace the current character string in the first area with the suggested replacement character string in response to detecting user activation of a key on the soft keyboard associated with a delimiter; and keep the current character string in the first area and cease to display the suggested replacement character string and the suggestion rejection icon in response to detecting a finger gesture on the suggested replacement character string displayed in the second area.


In accordance with some embodiments, a portable electronic device includes: a touch screen display; one or more processors; memory; and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a current character string being input by a user with a soft keyboard in a first area of the touch screen display; displaying a suggested replacement character string for the current character string in a second area of the touch screen display, wherein the second area includes a suggestion rejection icon adjacent to the suggested replacement character string; replacing the current character string in the first area with the suggested replacement character string in response to detecting user activation of a key on the soft keyboard associated with a delimiter; and keeping the current character string in the first area and ceasing to display the suggested replacement character string and the suggestion rejection icon in response to detecting a finger gesture on the suggested replacement character string displayed in the second area.


In accordance with some embodiments, a graphical user interface on a portable electronic device with a touch screen display includes a first area of the touch screen display; a current character string being input by a user with a soft keyboard in the first area of the touch screen display; and a second area of the touch screen display that includes a suggested replacement character string and a suggestion rejection icon adjacent to the suggested replacement character string. In response to detecting user activation of a key on the soft keyboard associated with a delimiter, the current character string in the first area is replaced with the suggested replacement character string. In response to detecting a finger gesture on the suggested replacement character string displayed in the second area, the current character string is kept in the first area and display of the suggested replacement character string and the suggestion rejection icon is ceased.


Thus, the embodiments provide more efficient ways to enter text in a portable device.





BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.



FIG. 1 is a block diagram illustrating a portable communications device in accordance with some embodiments.



FIG. 2 is a flow diagram illustrating a process of providing word recommendations in accordance with some embodiments.



FIG. 3 is a flow diagram illustrating a process of scoring candidate words in accordance with some embodiments.



FIG. 4 is a flow diagram illustrating a process of selecting and presenting candidate words in accordance with some embodiments.



FIGS. 5A and 5B illustrate exemplary layouts of letter keys on a keyboard in accordance with some embodiments.



FIG. 6 illustrates an exemplary derivation of candidate words based on text input in accordance with some embodiments.



FIGS. 7A-7C illustrate examples of scoring of candidate words in accordance with some embodiments.



FIGS. 8A-8C illustrate an exemplary method for dynamically adjusting hidden hit regions associated with soft keyboard keys as a word is typed with the soft keyboard keys in accordance with some embodiments.



FIG. 9 illustrates an exemplary derivation of candidate words based on text input in accordance with some embodiments.



FIGS. 10A and 10B are flow diagrams illustrating text input processes in accordance with some embodiments.



FIGS. 11A and 11B illustrate an exemplary user interface for inputting text in accordance with some embodiments.



FIG. 12 is a flow diagram illustrating a process for inputting text on a portable electronic device with a soft keyboard and a touch screen display in accordance with some embodiments.





DESCRIPTION OF EMBODIMENTS

Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.


A portable communication device includes a user interface and a text input device. Via the interface and the text input device, a user may enter text into the device. The text includes words, which are sequences of characters separated by whitespace or particular punctuation. For a word as it is being entered, or for an already entered word, the device identifies and offers word recommendations that may be selected by the user to replace the word as input by the user.
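Under this definition, splitting entered text into words reduces to breaking on whitespace and punctuation. A minimal sketch; the exact delimiter set is an assumption, since the patent says only "whitespace or particular punctuation":

```python
import re

def words(text):
    # Words are maximal runs of characters delimited by whitespace or
    # (an assumed set of) punctuation; empty pieces are discarded.
    return [w for w in re.split(r"[\s.,;:!?]+", text) if w]
```

For example, `words("Hello, world! How are you?")` yields the five words without their delimiters.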


Attention is now directed to an embodiment of a portable communications device.



FIG. 1 is a block diagram illustrating an embodiment of a device 100, such as a portable electronic device having a touch-sensitive display 112. The device 100 may include a memory controller 120, one or more data processors, image processors and/or central processing units 118 and a peripherals interface 116. The memory controller 120, the one or more processors 118 and/or the peripherals interface 116 may be separate components or may be integrated, such as in one or more integrated circuits 104. The various components in the device 100 may be coupled by one or more communication buses or signal lines 103.


The peripherals interface 116 may be coupled to an optical sensor (not shown), such as a CMOS or CCD image sensor; RF circuitry 108; audio circuitry 110; and/or an input/output (I/O) subsystem 106. The audio circuitry 110 may be coupled to a speaker 142 and a microphone 144. The device 100 may support voice recognition and/or voice replication. The RF circuitry 108 may be coupled to one or more antennas 146 and may allow communication with one or more additional devices, computers and/or servers using a wireless network. The device 100 may support a variety of communications protocols, including code division multiple access (CDMA), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wi-Fi (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Bluetooth, Wi-MAX, a protocol for email, instant messaging, and/or a short message service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document. In an exemplary embodiment, the device 100 may be, at least in part, a cellular telephone.


The I/O subsystem 106 may include a touch screen controller 152 and/or other input controller(s) 154. The touch screen controller 152 may be coupled to a touch-sensitive screen or touch-sensitive display system 112.


The touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. The display controller 152 receives and/or sends electrical signals from/to the display system 112. The display system 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.


A touch screen in display system 112 is a touch-sensitive surface that accepts input from the user based on haptic and/or tactile contact. The display system 112 and the display controller 152 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the display system 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. The touch screen 112 may be used to implement virtual or soft buttons and/or a keyboard. In an exemplary embodiment, a point of contact between a touch screen in the display system 112 and the user corresponds to a finger of the user.


The touch screen in the display system 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen in the display system 112 and the display controller 152 may detect contact and any movement or breaking (lift off) thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen in the display system 112. A touch-sensitive display in some embodiments of the display system 112 may be analogous to the multi-touch sensitive tablets described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), U.S. Pat. No. 6,677,932 (Westerman), and U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, a touch screen in the display system 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output. The touch screen in the display system 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen in the display system has a resolution of approximately 168 dpi. The user may make contact with the touch screen in the display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position (e.g., a touch point position) or command for performing the actions desired by the user.
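One way to realize this translation is to collapse the finger's contact area to a single centroid point and hit-test it against the keys' (possibly enlarged) hidden hit regions. A sketch under the assumption of rectangular hit regions; the function and data shapes are illustrative, not the patent's implementation:

```python
def resolve_touch(contact_pixels, keys):
    """Reduce a blob of contacted pixels to one touch point (its centroid),
    then hit-test that point against each key's rectangular hit region.
    `keys` maps a character to its hit region (x, y, width, height)."""
    xs = [p[0] for p in contact_pixels]
    ys = [p[1] for p in contact_pixels]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    for char, (x, y, w, h) in keys.items():
        if x <= cx < x + w and y <= cy < y + h:
            return (cx, cy), char
    return (cx, cy), None  # touch landed outside every hit region
```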


A touch-sensitive display in some embodiments of the display system 112 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed on May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed on May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed on Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed on Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed on Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed on Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed on Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed on Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed on Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.


The other input controller(s) 154 may be coupled to other input/control devices 114, such as one or more buttons, a keyboard, infrared port, USB port, and/or a pointer device such as a mouse. The one or more buttons (not shown) may include an up/down button for volume control of the speaker 142 and/or the microphone 144. The one or more buttons (not shown) may include a push button. A quick press of the push button (not shown) may engage or disengage a lock of the touch screen 112. A longer press of the push button (not shown) may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons.


In some embodiments, the device 100 may include circuitry for supporting a location determining capability, such as that provided by the global positioning system (GPS). In some embodiments, the device 100 may be used to play back recorded music stored in one or more files, such as MP3 files or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod devices.


The device 100 also includes a power system 137 for powering the various components. The power system 137 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices. The device 100 may also include one or more external ports 135 for connecting the device 100 to other devices.


The memory controller 120 may be coupled to memory 102, which may include one or more types of computer readable storage media. Memory 102 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory. Memory 102 may store an operating system 122, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 122 may include procedures (or sets of instructions) for handling basic system services and for performing hardware dependent tasks. Memory 102 may also store communication procedures (or sets of instructions) in a communication module 124. The communication procedures may be used for communicating with one or more additional devices, one or more computers and/or one or more servers. Memory 102 may include a display module (or a set of instructions) 125, a contact/motion module (or a set of instructions) 126 to determine one or more points of contact and/or their movement, and a graphics module (or a set of instructions) 128. The graphics module 128 may support widgets, that is, modules or applications with embedded graphics. The widgets may be implemented using JavaScript, HTML, or other suitable languages.


Memory 102 may also include one or more applications 130. Examples of applications include email applications, text messaging or instant messaging applications, web browsers, memo pad applications, address books or contact lists, and calendars.


Also in memory 102 are one or more dictionaries 132 and a word recommendation module (or set of instructions) 134. In some embodiments, a dictionary contains a list of words and corresponding usage frequency rankings. The usage frequency ranking of a word is the statistical usage frequency for that word in a language, or by a predefined group of people, or by the user of the device 100, or a combination thereof. As described below, a dictionary may include multiple usage frequency rankings for regional variations of the same language and/or be tailored to a user's own usage frequency, e.g., derived from the user's prior emails, text messages, address book, and other previous input from the user. The word recommendation module identifies word recommendations for presentation to the user in response to text input by the user.
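A dictionary that blends a general-language ranking with the user's own usage might look like the following sketch. The linear blend and its weight are illustrative assumptions; the patent says only that the rankings may be combined:

```python
def combined_rank(word, language_freq, user_freq, user_weight=0.5):
    """Blend a word's general-language usage frequency with the user's own
    usage frequency. The 50/50 linear blend is an assumed policy."""
    return (1 - user_weight) * language_freq.get(word, 0.0) \
           + user_weight * user_freq.get(word, 0.0)

def recommend(prefix, dictionary, language_freq, user_freq, n=3):
    """Return up to n dictionary words matching the typed prefix,
    highest combined ranking first."""
    matches = [w for w in dictionary if w.startswith(prefix)]
    matches.sort(key=lambda w: combined_rank(w, language_freq, user_freq),
                 reverse=True)
    return matches[:n]
```

Note how a word the user types often (here "them") can outrank a word that is more common in the language at large ("the"): per-user frequency pulls recommendations toward the user's own vocabulary.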


Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules. The various modules and sub-modules may be rearranged and/or combined. Memory 102 may include additional modules and/or sub-modules, or fewer modules and/or sub-modules. Memory 102, therefore, may include a subset or a superset of the above identified modules and/or sub-modules. Various functions of the device 100 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.


Attention is now directed to FIG. 2, a flow diagram illustrating a process of providing word recommendations in accordance with some embodiments. Process flow 200 describes a process of providing word recommendations in response to input of a character string by a user.


A sequence of input characters is received from an input device (202). A user inputs a sequence of characters into the portable communications device via an input device, such as a keyboard, and the device receives the input. As used herein, the input character sequence is a sequence of non-whitespace characters, delimited by whitespaces or punctuation, input by the user via the input device. The sequence of characters may constitute a word.


In some embodiments, the input device is a virtual keyboard (also called a soft keyboard) displayed on a touch-sensitive display of the portable device, where the user hits the keys of the keyboard (“types on the keyboard”) by touching the touch-sensitive display on locations corresponding to keys of the virtual keyboard. In some other embodiments, the input device is a physical keyboard on the device (also called a hard keyboard).


The keyboard, whether virtual or physical, has a plurality of keys, each key corresponding to one or more characters, such as letters, numbers, punctuation, or symbols. The keys are arranged in accordance with a predefined layout that defines the positions of the keys on the keyboard. On the layout, each key has at least one neighbor key. In some embodiments, the keyboard layout follows the well-known QWERTY layout or a variant thereof. In some other embodiments, the keyboard layout may follow other layouts. Furthermore, in some embodiments, the layout may change depending on the language used on the device. For example, if English is selected as the user interface language, then the active keyboard layout may be the QWERTY layout, and other layouts may be active when another language, such as Swedish or French, is selected as the user interface language. Further details regarding keyboard layouts are described below in relation to FIG. 5.


Permutations of input characters and neighbor characters are determined and a set of strings are generated from the permutations (204). As used herein, a “permutation” is a sequence of characters, wherein each character in the sequence is either the input character in the corresponding position in the input character sequence or a neighbor character of that input character on the keyboard layout. The first character in the permutation is the first character of the input character sequence or a neighbor of that first character on the keyboard layout, the second character in the permutation is the second character of the input character sequence or a neighbor of that second character on the keyboard layout, and so forth, up to and perhaps including the last character in the input character sequence. Thus, the length of a permutation and of a generated string is at most the length of the input character sequence.


For example, if the input sequence is “rheater,” then the first character in any of the permutations generated for this input sequence is “r” (the first character in the input sequence) or any characters that are neighbors to “r” on the keyboard layout. The second character in a permutation is “h” or any neighbor thereof. The third character in a permutation is “e” (the third character in the input sequence) or neighbors thereof, and so forth.
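The permutation step above can be sketched as follows. The neighbor map is a hypothetical fragment of the layout 502 neighbor sets described with FIG. 5; a full implementation would cover every key.

```python
from itertools import product

# Hypothetical fragment of a keyboard-neighbor map for layout 502.
NEIGHBORS = {
    "r": ["e", "d", "f", "t"],
    "h": ["y", "u", "g", "j", "b", "n"],
    "e": ["w", "s", "d", "r"],
}

def prefix_permutations(input_seq, length=3):
    """Each position may hold the input character itself or any of its
    keyboard-layout neighbors; the cross product of these per-position
    pools yields every permutation of the given prefix length."""
    pools = [[c] + NEIGHBORS.get(c, []) for c in input_seq[:length]]
    return ["".join(chars) for chars in product(*pools)]

perms = prefix_permutations("rheater")
```

For "rheater" this produces 5 × 7 × 5 = 175 three-character strings, including "rhe" itself and "the" (since "t" neighbors "r").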


In some embodiments, permutations may be determined for a predefined-length subset of the input sequence and strings of the same predefined length may be generated from the permutations. In some embodiments, the predefined length is 3 characters. That is, the permutations are determined and prefix strings are generated from the first three characters in the input sequence and neighbors thereof. If the length of the input sequence is less than the predefined length, a process other than process flow 200 may be used to provide word recommendations. For example, if the input sequence is one or two characters long, the input sequence in its entirety may be compared against words in a dictionary and best matches are identified.


The set of strings are compared against a dictionary. Words in the dictionary that have any of the set of strings as a prefix are identified (206). As used herein, “prefix” means that the string is a prefix of a word in the dictionary or is itself a word in the dictionary. A dictionary, as used herein, refers to a list of words. The dictionary may be pre-made and stored in the memory. The dictionary may also include usage frequency rankings for each word in the dictionary. A usage frequency ranking for a word indicates (or more generally, corresponds to) the statistical usage frequency for that word in a language. In some embodiments, the dictionary may include different usage frequency rankings for different variants of a language. For example, a dictionary of words in the English language may have different usage frequency rankings with respect to American English and British English.
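Identifying candidate words from the generated strings amounts to a prefix filter over the dictionary. A minimal sketch, with illustrative word lists:

```python
def candidate_words(prefix_strings, dictionary):
    """Return dictionary words that have any generated string as a
    prefix; a word equal to one of the strings also qualifies, per the
    definition of "prefix" used here."""
    prefixes = tuple(set(prefix_strings))
    return [word for word in dictionary if word.startswith(prefixes)]

words = candidate_words(["the", "rus", "rye", "due"],
                        ["theater", "rye", "rusty", "due", "the", "there", "goal"])
```

Here `words` preserves dictionary order and excludes "goal", which starts with none of the generated strings.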


In some embodiments, the dictionary may be customizable. That is, additional words may be added to the dictionary by the user. Furthermore, in some embodiments, different applications may have different dictionaries with different words and usage frequency rankings. For example, an email application and an SMS application may have different dictionaries, with different words and perhaps different usage frequency rankings within the same language.


The identified words are the candidate words that may be presented to the user as recommended replacements for the input sequence. The candidate words are scored (208). Each candidate word is scored based on a character-to-character comparison with the input sequence and optionally other factors. Further details regarding the scoring of candidate words are described below, in relation to FIGS. 3 and 7A-7C. A subset of the candidate words are selected based on predefined criteria (210) and the selected subset is presented to the user (212). In some embodiments, the selected candidate words are presented to the user as a horizontal listing of words.


Attention is now directed to FIG. 3, a flow diagram illustrating a process of scoring candidate words in accordance with some embodiments. Process flow 300 describes a process of scoring a candidate word. The scoring helps determine which word(s) in the dictionary is/are the best potential replacement(s) for the input sequence of characters.


Each character in a candidate word is compared to the character in the corresponding position in the input sequence (302). Thus, the first character in the candidate word is compared to the first character in the input sequence, the second character in the candidate word is compared to the second character in the input sequence, and so forth. If either the candidate word or the input sequence is longer than the other, then the additional characters beyond the shorter length of the two are ignored in the comparison. In some embodiments, further comparison of the candidate word with the input sequence may be made. For example, the further comparison may include determining the number of character differences between the candidate word and the input sequence, and determining if any character differences are a result of transposed characters. A score is calculated for the candidate word based on the comparison described above (304). Each character comparison yields a value, and the values are added to yield the score for the candidate word.


In some embodiments, the score value given for a character comparison is based on the actual characters as opposed to merely whether the characters match. More particularly, the value may be based on whether the character in the candidate word matches the corresponding character in the input sequence exactly and/or whether the character in the candidate word is a keyboard layout neighbor of the corresponding character in the input sequence.


Optionally, a first “bonus” may be added to the score of the candidate word if the candidate word and the input sequence are different in only one character (306). Similarly, an optional second “bonus” may be added to the score of the candidate word if the candidate word and the input sequence are different in only a pair of transposed adjacent characters (308). Further details regarding candidate word scoring are described below, in relation to FIGS. 7A-7C.


Attention is now directed to FIG. 4, a flow diagram illustrating a process of selecting and presenting candidate words in accordance with some embodiments. Process flow 400 describes in further detail blocks 210 and 212 (FIG. 2), which involve the selection and presentation of candidate words.


In some embodiments, the candidate words are split into two groups based on their usage frequency rankings within the dictionary (402). A first group includes the candidate words whose usage frequency rankings exceed a predefined threshold. The second group includes the candidate words whose usage frequency rankings do not exceed the threshold. Within each of the two groups, the candidate words are sorted by their candidate word scores.


There may be candidate words in the second group whose scores are very high because, for example, they match the input sequence exactly or almost exactly. In some embodiments, these high-scoring words may be removed from the second group and added to the first group if their scores exceed the score of the highest scoring candidate word in the first group by a predefined margin (404). In some embodiments, the predefined margin is that the score of the candidate word in the second group must be at least two times the highest candidate word score in the first group.


One or more of the highest scoring candidate words in the first group are presented to the user (406). It should be appreciated that if candidate words from the second group were moved to the first group as described above, then the candidate words that are presented will include at least one candidate word that was originally in the second group since that candidate word has a higher score than any of the original candidate words in the first group.


In some embodiments, if block 404 is not performed, either because no candidate word in the second group satisfies the score margin threshold or because the moving of candidate words is not performed at all, the highest scoring candidate word in the second group may nevertheless be presented along with the candidate words from the first group (408). Furthermore, in some embodiments, the input sequence as entered by the user may be presented as a matter of course (410). The user may choose any one of the presented candidate words to replace the input sequence, including choosing the input sequence as entered if the user is satisfied with it.
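The grouping, promotion, and selection steps of process flow 400 can be sketched as follows. The threshold, margin, and sample numbers are illustrative choices, not values from the disclosure.

```python
def select_candidates(scores, rankings, threshold, margin=2.0, max_words=3):
    """Split candidates by usage frequency ranking (block 402), promote
    second-group words whose scores beat the best first-group score by
    the margin (block 404), then return the top scorers (block 406)."""
    first = {w: s for w, s in scores.items() if rankings[w] > threshold}
    second = {w: s for w, s in scores.items() if rankings[w] <= threshold}
    if first:
        best = max(first.values())
        for w in list(second):
            if second[w] >= margin * best:   # promotion margin
                first[w] = second.pop(w)
    return sorted(first, key=first.get, reverse=True)[:max_words]
```

For example, with scores {"theater": 5.5, "there": 3.0, "threats": 3.5} and only "theater" and "there" above the ranking threshold, "threats" stays in the second group because 3.5 is less than twice 5.5.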


Attention is now directed to FIGS. 5A and 5B, which are exemplary layouts of letter keys on a keyboard in accordance with some embodiments. As described above, the prefix strings from which candidate words are identified are generated from characters in the input sequence and their corresponding neighbor characters on a keyboard layout. Keyboard layouts 502 and 504 are exemplary keyboard layouts. A keyboard layout defines the positions of each key on the keyboard and the alignment of the keys relative to each other. For ease of description, only the letter keys of the layouts 502 and 504 are shown. It should be appreciated, however, that a keyboard layout may also include keys for numbers, punctuation, symbols, and functional keys. In some embodiments, some keys may be overloaded, that is, a key may correspond to multiple characters and/or functions.


Layouts 502 and 504 are layouts that follow the well-known QWERTY layout. However, the key alignment in layout 502 is different from the key alignment in layout 504. In layout 502, the keys are aligned in rows but not in columns; a key in one row may straddle two keys in an adjacent row. For example, key “T” straddles keys “F” and “G” in layout 502. In layout 504, the keys are aligned in columns as well as in rows. The definition of which keys are the neighbors of a key may be different depending on how the keys are aligned. In layout 502, the neighbors of a particular key may be defined as the keys that are directly adjacent to the particular key or whose peripheries “touch” a periphery of the particular key. For example, the neighbors of key “G” in layout 502 are keys “T,” “Y,” “F,” “H,” “V,” and “B;” and the neighbors of key “W” are keys “Q,” “E,” “A,” and “S.” In layout 504, the neighbors of a particular key may be defined as the keys that are immediately above, below, to the side of, and diagonal of the particular key. For example, the neighbors of key “G” in layout 504 are keys “R,” “T,” “Y,” “F,” “H,” “C,” “V,” and “B;” and the neighbors of key “W” are keys “Q,” “E,” “A,” “S,” and “D.”
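For a column-aligned layout like 504, neighbor sets can be derived from grid coordinates. The sketch below assumes the bottom letter row is offset by one column, an assumption chosen because it reproduces the neighbor sets given above for keys “G” and “W”; actual layouts may differ.

```python
# (row, column) grid for a QWERTY layout aligned as in layout 504;
# the bottom row is assumed to be offset by one column.
ROWS = [("qwertyuiop", 0), ("asdfghjkl", 0), ("zxcvbnm", 1)]
POS = {ch: (r, c + off)
       for r, (row, off) in enumerate(ROWS)
       for c, ch in enumerate(row)}

def neighbors(key):
    """Keys immediately above, below, beside, or diagonal to `key`."""
    r, c = POS[key]
    return {ch for ch, (r2, c2) in POS.items()
            if ch != key and abs(r - r2) <= 1 and abs(c - c2) <= 1}
```

Under these coordinates, `neighbors("g")` yields {R, T, Y, F, H, C, V, B} and `neighbors("w")` yields {Q, E, A, S, D}, matching the layout 504 examples above.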


It should be appreciated, however, that layouts 502 and 504 are merely exemplary, and that other layouts and key alignments are possible and the same key may have different neighbors in different layouts.


Attention is now directed to FIG. 6, an exemplary derivation of candidate words based on a text input in accordance with some embodiments. FIG. 6 illustrates an example of the identification of candidate words from an input sequence.


In FIG. 6, the input sequence 602 is “rheatre.” For prefix strings of three characters in length, the first three characters and their corresponding neighbors 604 are identified. Here, the first character is “r” and its neighbors, in accordance with the layout 502, are “e,” “d,” “f,” and “t.” The second character is “h,” and its neighbors are “y,” “u,” “g,” “j,” “b,” and “n.” The third character is “e,” and its neighbors are “w,” “s,” “d,” and “r.”


From the input characters and corresponding neighbors, the character permutations 606 are determined. Each permutation is a character combination where the first character is the first input character or a neighbor thereof, the second character is the second input character or a neighbor thereof, and the third character is the third input character or a neighbor thereof. From these permutations, prefix strings are generated and compared to words in the dictionary. Examples of three-character permutations based on the input sequence 602 include “the,” “rus,” “rye,” and “due.” Words in the dictionary that have one of these strings as a prefix are identified as candidate words 608. Examples of candidate words include “theater,” “rye,” “rusty,” “due,” “the,” and “there.” In other embodiments, the character permutations may include four, five, or more characters, rather than three characters.


Attention is now directed to FIGS. 7A-7C, which are examples of scoring of candidate words in accordance with some embodiments. FIG. 7A shows an input sequence and three possible candidate words that may be identified from permutations of the first three characters of the input sequence. The candidate words are compared to the input sequence character-by-character and scores for the candidate words are tallied.


In some embodiments, a score tally of a candidate word involves assigning a value for each character comparison and adding the values together. The value that is assigned for a character comparison is based on the result of the comparison. Particularly, the value is based on whether the character in the candidate word, compared to the character in the corresponding position in the input sequence, is an exact match, a neighbor on the keyboard layout, or neither. In some embodiments, the value assigned for an exact match is a predefined value N. If the characters are not an exact match but are neighbors, then the value assigned is a value αN, where α is a constant and α<1. In some embodiments, α is 0.5. In other words, the value assigned for a neighbor match is a reduction of the value for an exact match.


In some embodiments, if the character in the candidate word is neither an exact match nor a neighbor of the corresponding character in the input sequence, then the assigned value is βN, where β is a constant and β<α<1. For example, β may be 0.25. In some other embodiments, β may be a function of the “distance” between the characters on the keyboard layout. That is, β may be a smaller number if the candidate word character is farther away on the keyboard layout from the input sequence character than if the candidate word character is closer to the input sequence character without being a neighbor.


More generally, the value assigned for a character comparison is γN, where N is a predefined value, γ=1 for an exact match, and γ may vary based on some function of the “distance” on the layout between the character in the candidate word and the corresponding character in the input sequence. For example, γ may be 1 for an exact match, 0.5 for a neighbor, and 0 otherwise. As another example, γ may be 0.5 for a neighbor (a 1-key radius), 0.25 for keys that are two keys away (a 2-key radius), and 0 for keys that are three or more keys away. In some embodiments, N is equal to 1.
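One distance-based choice of γ can be sketched as follows. The grid coordinates and the use of Chebyshev distance to define a k-key radius are illustrative assumptions (the bottom row offset matches the layout 504 neighbor examples).

```python
# Grid positions for an aligned QWERTY layout; the bottom row is
# assumed to be offset by one column (an illustrative choice).
ROWS = [("qwertyuiop", 0), ("asdfghjkl", 0), ("zxcvbnm", 1)]
KEY_POS = {ch: (r, c + off)
           for r, (row, off) in enumerate(ROWS)
           for c, ch in enumerate(row)}

def comparison_value(candidate_char, input_char, n=1.0):
    """gamma * N for one character comparison: gamma is 1 for an exact
    match, 0.5 within a 1-key radius, 0.25 within a 2-key radius, and
    0 otherwise."""
    if candidate_char == input_char:
        return n
    r1, c1 = KEY_POS[candidate_char]
    r2, c2 = KEY_POS[input_char]
    radius = max(abs(r1 - r2), abs(c1 - c2))  # distance measured in keys
    if radius == 1:
        return 0.5 * n
    if radius == 2:
        return 0.25 * n
    return 0.0
```

So comparing “r” against an input “g” scores 0.5N (1-key radius), while “u” against “g” scores 0.25N (2-key radius) and “p” against “g” scores 0.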


If the candidate word has a length that is longer than the input sequence, or vice versa, then the character positions that are beyond the lesser of the two lengths are ignored or assigned a value of 0.


The first candidate word shown in FIG. 7A is “theater.” Compared to the input sequence of “rheatre,” there are exact matches in the second through fifth positions. The characters in the first, sixth, and seventh positions of the candidate word are keyboard layout neighbors of input sequence characters in the corresponding positions. Thus, the score for “theater” in this case is 0.5N+N+N+N+N+0.5N+0.5N=5.5N.


The second candidate word is “threats.” Compared to the input sequence of “rheatre,” there is an exact match in the second position. The characters in the first, third, sixth, and seventh positions of the candidate word are keyboard layout neighbors of the input sequence characters in the corresponding positions, and the characters in the fourth and fifth positions of the candidate word are neither exact matches nor neighbors of the input sequence characters in the corresponding positions. Thus, the score for “threats” in this case is 0.5N+N+0.5N+0.25N+0.25N+0.5N+0.5N=3.5N.


The third candidate word is “there.” Compared to the input sequence of “rheatre,” there are exact matches in the second and third positions. The character in the first position of the candidate word is a keyboard layout neighbor of the input sequence character in the corresponding position, and the characters in the fourth and fifth positions of the candidate word are neither exact matches nor neighbors of the input sequence characters in the corresponding positions. Furthermore, because the input sequence is two characters longer than the candidate word, the last two characters in the input sequence are ignored in the comparison and are assigned score values of 0. Thus, the score for “there” in this case is 0.5N+N+N+0.25N+0.25N=3N.


Some candidate words, when compared to the input sequence, may merit a score bonus, examples of which are shown in FIGS. 7B and 7C. In FIG. 7B, the input sequence is “thaeter” and the candidate word is “theater.” The score based on the character comparisons alone is 5.5N. However, the only difference between “thaeter” and “theater” is a pair of transposed or swapped characters, namely “ae” in “thaeter” vs. “ea” in “theater.” In some embodiments, a first bonus P is added to the score for this fact. In FIG. 7C, the input sequence is “thester” and the candidate word is “theater.” The score based on the character comparisons alone is 6.5N. However, the only difference between “thester” and “theater” is a single character, namely “s” in “thester” vs. “a” in “theater.” In some embodiments, a second bonus Q is added to the score for this fact. In some embodiments, both P and Q are equal to 0.75.
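The character-by-character scoring and bonuses of FIGS. 7A-7C can be sketched as follows. The neighbor map is a hypothetical fragment of layout 502, and β = 0.25 with bonuses of 0.75 follow the example values above; with these inputs the sketch reproduces the 5.5N score for “theater” vs. “rheatre” and the bonus cases of FIGS. 7B and 7C.

```python
# Hypothetical fragment of a layout-502 neighbor map.
NEIGHBORS = {
    "r": ["e", "d", "f", "t"],
    "h": ["y", "u", "g", "j", "b", "n"],
    "e": ["w", "s", "d", "r"],
    "a": ["q", "w", "s", "z"],
    "s": ["a", "w", "e", "d", "z", "x"],
}

def score(candidate, input_seq, n=1.0, p=0.75, q=0.75):
    """N for an exact match, 0.5N for a neighbor, 0.25N otherwise;
    characters beyond the shorter length are ignored. A bonus is added
    when the only difference is a single character (q) or a single
    transposed adjacent pair (p)."""
    total = 0.0
    for cw, ci in zip(candidate, input_seq):
        if cw == ci:
            total += n
        elif cw in NEIGHBORS.get(ci, []):
            total += 0.5 * n
        else:
            total += 0.25 * n
    if len(candidate) == len(input_seq):
        diffs = [i for i, (a, b) in enumerate(zip(candidate, input_seq))
                 if a != b]
        if len(diffs) == 1:
            total += q                      # single-character difference
        elif (len(diffs) == 2 and diffs[1] == diffs[0] + 1
              and candidate[diffs[0]] == input_seq[diffs[1]]
              and candidate[diffs[1]] == input_seq[diffs[0]]):
            total += p                      # transposed adjacent pair
    return total
```

For “rheatre” the three differing positions rule out both bonuses; for “thaeter” only the transposed “ae” pair differs (5.5N + P), and for “thester” only one character differs (6.5N + Q).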


It should be appreciated that, in some other embodiments, alternative candidate word scoring and selection schemes other than the ones described above may be used.


For example, in one alternative scheme, instead of dividing the candidate words into the first and second groups based on usage frequency rankings, the usage frequency rankings may be used as weights applied to the candidate word scores. That is, the score of a candidate word is multiplied by the usage frequency ranking of the candidate word, and candidate words for presentation are selected based on their weighted scores.


As another example, another scheme replaces candidate word scoring based on character-by-character comparisons, as described above, with scoring based on the edit distance (also known as the Levenshtein distance) between the input sequence and the candidate word. That is, the score of a candidate word is the edit distance between the candidate word and the input sequence, or a function thereof, and candidate words are selected for presentation based on the edit distance scores. Alternately, the score for each candidate is based on the edit distance multiplied by (or otherwise combined with) the usage frequency ranking of the candidate, and candidate words are selected for presentation based on these scores.
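The edit-distance alternative can be sketched with a standard dynamic-programming Levenshtein routine; the function combining edit distance with usage frequency is one illustrative choice among the combinations mentioned above.

```python
def edit_distance(a, b):
    """Levenshtein distance between strings a and b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def weighted_score(candidate, input_seq, usage_ranking):
    """Combine edit distance with usage frequency ranking: closer and
    more frequent words score higher (an illustrative combination)."""
    return usage_ranking / (1 + edit_distance(candidate, input_seq))
```

A transposition such as “thaeter” vs. “theater” costs two edits under plain Levenshtein distance, which is one motivation for the separate transposition bonus in the character-comparison scheme.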


As another example, another scheme uses a graph-matching technique. In this technique, the sequence of individual touch points that a user inputs into the device for a word (e.g., by contacts with a virtual keyboard on the touch screen) form a directed graph. This user-input directed graph is compared against a collection of directed graphs for respective words in a dictionary to generate a list of dictionary words that most closely match the user typing. In some embodiments, the probability that a user-input directed graph matches the directed graph for a dictionary word is calculated as follows:


Let U1 . . . n be each point in the user-input directed graph.


Let D1 . . . n be each point in the directed graph of a dictionary word. Points in this directed graph are assigned based on the centroid of the key that inputs the corresponding letter, as represented in the keyboard user interface.


Let P1 . . . n be, for each point in the user-input directed graph, the probability that the letter corresponding to Ux equals the letter corresponding to Dx. In some embodiments, a respective Px is computed by calculating the Euclidean distance between the points Ux and Dx, and applying a factor based on the size of the user interface elements that indicate the keys on the keyboard. A minimum probability may be entered for Px if the graphs for the user word and the dictionary word are different lengths. In one embodiment, the factor (based on the size of the user interface elements that indicate the keys on the keyboard) is a divisor that is equal to, or proportional to, the distance between center points of two horizontally adjacent keys on the keyboard.


Multiplying the probabilities in P1 . . . n together yields G, the probability that a graph for a dictionary word matches the user-input graph. In some embodiments, G is multiplied by F, the frequency that the word occurs in the source language/domain. Furthermore, in some embodiments G is also multiplied by N, a factor calculated by considering one or more words previously typed by the user. For example, in a sentence/passage being typed by a user, “to” is more likely to follow “going,” but “ti” is more likely to follow “do re mi fa so la.” In some embodiments, G is multiplied by both F and N to yield Ω, the probability that a user-input directed graph matches a dictionary word.
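The graph-matching probability Ω can be sketched as follows. The exponential decay with distance and the minimum probability value are illustrative assumptions; the disclosure specifies only that each Px is computed from the Euclidean distance between Ux and Dx, scaled by a factor based on key size.

```python
import math

def graph_match_probability(user_points, word_points, key_pitch,
                            word_freq=1.0, context_factor=1.0,
                            min_p=0.05):
    """Probability that a user-input directed graph matches a dictionary
    word's graph. Each per-point probability decays with the Euclidean
    distance between the touch point and the key centroid, divided by
    the horizontal key pitch; min_p covers length mismatches. The
    result is multiplied by word frequency (F) and context factor (N)."""
    n = max(len(user_points), len(word_points))
    omega = word_freq * context_factor
    for i in range(n):
        if i >= len(user_points) or i >= len(word_points):
            p = min_p                       # graphs of different lengths
        else:
            (ux, uy), (dx, dy) = user_points[i], word_points[i]
            dist = math.hypot(ux - dx, uy - dy) / key_pitch
            p = math.exp(-dist)             # 1.0 when the touch is dead-on
        omega *= p
    return omega
```

A touch sequence that lands exactly on every key centroid yields probability 1.0 (times F and N); each extra or missing point multiplies in the minimum probability.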


The collection of dictionary words with the highest probabilities may be presented in a display for user consideration, for example as described in “Method, System, and Graphical User Interface for Providing Word Recommendations” (U.S. patent application Ser. No. 11/620,642, filed Jan. 5, 2007), the content of which is hereby incorporated by reference in its entirety. In some embodiments, the top-ranked word is presented for selection by the user as described below with respect to FIGS. 11A, 11B, and 12. In some embodiments, the top-ranked word is selected for the user by the device without user intervention.


In some embodiments, as word recommendations are offered by the portable device and selected by the user, statistics regarding the corrections made are collected. For example, the characters in an input sequence that was replaced by a user-selected candidate word, along with the corresponding characters of that candidate word, may be logged. Over time, the corrections log may be analyzed for patterns that may indicate a pattern of repeated typing errors by the user. If the keyboard is a virtual keyboard on a touch screen of the portable device, the portable device may automatically adjust or recalibrate the contact regions of the keys of the virtual keyboard to compensate for the user pattern of typing errors. As another example, for a given input sequence, the word selected by the user may be recommended first or given a higher score when the same input sequence is subsequently entered by the user.


In some embodiments, a user interface object is activated if the determined touch point position falls within a user interface object's “hit region” (or equivalently, “hit area”) and there are no overlapping objects with larger hit regions. The hit region of a user interface object may be the same size as, or larger, or smaller, than the visible size of the user interface object as it is displayed on the touch screen. As explained below, an object (e.g., a high-probability next character in a keyboard during text input) may have a hit region that is larger than the visible size of the user interface object as it is displayed on the touch screen (e.g., key icon “O” has an enlarged hit region denoted by the dotted line around the “O” key icon in FIG. 8B). For such objects, the portion of the hit region that is larger than the corresponding user interface object is called a hidden hit region.


For example, the total hit area associated with a key icon on a keyboard may be dynamically adjusted (e.g., updated after each character in a character string is entered) with the following formula:

ATotal(i)=AVisible(i)+AHidden(i)=AVisible(i){1+[P(i)·K]}

where,

  • ATotal (i)=total hit area for the adjustable hit region for character i,
  • AVisible (i)=visible key area on the touch screen for character i,
  • AHidden (i)=hidden hit area for character i,
  • P(i)=probability that the next character entered will be i (where the probability is based on the previous characters entered in the character string and one or more types of word usage frequency), and
  • K=an empirically determined, positive constant that depends on the size of the keys on the touch screen display.


For this example, the total hit area ATotal (i) is never less than the visible key area AVisible (i). The adjustable hit region for a key icon includes a visible key area displayed on the touch screen display and a hidden hit region not displayed on the touch screen display. In some embodiments, the visible key area AVisible (i) for a given key is constant, while the hidden hit area AHidden (i) is dynamic, thereby making the total hit area ATotal (i) dynamic as well. K can be determined by trial and error by observing users interacting with a particular keyboard. K approaches zero as the size of the visible keys increases on the touch screen. In other words, if the key icons become comparable to or larger than the finger contact areas on the touch screen, hidden hit regions are no longer needed to help identify the key icon that the user probably intended to hit.
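Under these definitions the formula reduces to a one-line function; the sample areas, probability, and K value below are invented for illustration.

```python
def total_hit_area(visible_area, p_next, k):
    """A_total(i) = A_visible(i) * (1 + P(i) * K), where p_next is the
    probability that the next character entered will be i and k is the
    empirically determined positive constant. Because p_next >= 0, the
    total hit area never falls below the visible key area."""
    return visible_area * (1 + p_next * k)

# After "G-O" has been typed, a likely next key such as "D" gets a
# larger hit area than an unlikely key such as "K" (numbers invented).
likely = total_hit_area(100.0, 0.25, 2.0)    # enlarged hit area
unlikely = total_hit_area(100.0, 0.0, 2.0)   # just the visible area
```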



FIGS. 8A-8C illustrate an exemplary method for dynamically adjusting hidden hit regions associated with soft keyboard keys as a word is typed with the soft keyboard keys in accordance with some embodiments. The user interface includes an input field 5620 and a soft keyboard 5640. A user selection of any key icon of the soft keyboard 5640 enters a corresponding user-selected character in the input field 5620. For illustrative purposes, as shown in FIG. 8A, all the key icons initially have the same size hit regions, which correspond to the visible boundaries of the keys.



FIG. 8B depicts changes in the sizes of hidden hit regions associated with different key icons after two characters “G-O” are entered into the input field 5620. The sizes of the hidden regions for the keys have been adjusted in accordance with the previously entered characters. For example, the size of the hidden hit region for the key icon “D” increases because “God” is a common English word. Thus, the key icon “D” may be activated even if the next finger contact (as illustrated by the “+” sign in FIG. 8B) is on the visible area of the key icon “F.” In some embodiments, if the hidden hit regions of two (or more) keys overlap with the finger contact (or with a touch point position derived from a finger contact), then the key with the largest hit region ATotal (i) (including its hidden hit region AHidden (i)) is selected. Similarly, the hidden hit regions associated with key icons “A” and “O” are also increased because each of the strings “Goa” and “Goo” leads to one or more legitimate English words such as “Goal”, “Good”, or “Goad.” In some embodiments, the hit regions of unlikely next characters (e.g., key icon “K” is unlikely because the string “Gok” is not found at the beginning of any common English words) shrink so that the hit area is less than the visible area of the key. In some embodiments, the hit regions of unlikely next characters do not shrink. Such keys will not be selected in the areas where they overlap with keys with enlarged hit regions.



FIG. 8C depicts the updated hidden hit regions associated with different key icons after another character “a” is entered into the input field 5620. Given the string “Goa” that has been entered, the user may be typing the word “Goal.” Accordingly, the size of the hidden hit region associated with the key icon “L” increases, whereas the hidden hit region associated with the key icon “O” shrinks (e.g., to its default initial size) because the string “Goao” is not found at the beginning of any common English words.



FIG. 9 illustrates an exemplary derivation of candidate words based on text input in accordance with some embodiments. A set of key icons (e.g., soft keyboard 5640, FIG. 8A) are displayed on a touch screen display, as discussed above. A first touch point input is received from the user, and is construed by the portable electronic device as the character “R,” because the probability of the character “R” is determined to be greater than the probability of other characters, such as “T,” based on the location of the first touch point input. For example, the first touch point may be on a key icon for the letter “R.” Alternately, the first touch point may be at a location that is between the key icons for the letters “R” and “T,” but closer to the letter “R.” With only a single touch point input so far, the probabilities are determined based solely or primarily on touch probabilities.


Next, a second touch point input is received from the user, this time within the hit region for the key icon for the letter “h”. The device now evaluates both touch-based probabilities (Ptouch(r), Ptouch(h)) and usage-based probabilities (Pusage(r), Pusage(h)) for candidate characters corresponding to the first and second touch point inputs. A combined probability for each of a plurality of candidate character sequences (e.g., Rh, Th) is determined. If there is a candidate character sequence having a probability that is greater than the probability for the displayed character sequence, and that meets any other applicable selection criteria, then a suggested character string corresponding to the candidate character sequence having the highest probability is displayed. If the displayed suggested character string (e.g., a word) is selected by the user, for example by touching the space bar or other delimiter icon (e.g., as shown in FIGS. 11A, 11B, and 12), the displayed suggested character string replaces the previously displayed sequence of characters. If the user does not select the displayed suggested character string (e.g., does not touch an icon for a delimiter), the user can instead continue to input additional touch points. In the example shown in FIG. 9, the third touch point input is on or near the key icon for the letter “e.” At this point, the device may determine that the probability for the candidate character sequence “T-h-e” [e.g., Ptouch(t)Pusage(t) Ptouch(h)Pusage(h) Ptouch(e)Pusage(e), FIG. 9] is greater than the probability for the currently displayed character sequence, “R-h-e” [e.g., Ptouch(r)Pusage(r) Ptouch(h)Pusage(h) Ptouch(e)Pusage(e), FIG. 9], and is also greater than the probability for any other candidate character sequence that is compatible with the sequence of touch point inputs received so far.
In that case, the suggested character string “The” is displayed and made available for selection by the user. If the user is actually trying to type the word “rhesus,” then in response to detecting a user touch point on or near the “s” icon key, the device will cease to display the suggested word “The.” On the other hand, in response to detecting a user touch point on or near the space bar or other delimiter icon key, the device will replace “Rhe” with the suggested word “The.”
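The “Rhe” versus “The” comparison above can be sketched as a product of per-character probabilities. The numeric probability values below are invented purely for illustration; only the scoring form (the product of touch and usage probabilities) follows the description.

```python
# Sketch of scoring alternate character sequences by the product of
# per-character touch and usage probabilities, as in the "Rhe" vs. "The"
# example. All numeric probabilities here are illustrative assumptions.
def sequence_score(touch_probs, usage_probs):
    """Product over the sequence of P_touch(c) * P_usage(c)."""
    score = 1.0
    for p_touch, p_usage in zip(touch_probs, usage_probs):
        score *= p_touch * p_usage
    return score

# Touch slightly favored "r" for the first input, but usage statistics
# strongly favor "t" once "h" and "e" follow.
score_rhe = sequence_score([0.6, 0.9, 0.8], [0.1, 0.9, 0.9])
score_the = sequence_score([0.4, 0.9, 0.8], [0.8, 0.9, 0.9])
suggested = "The" if score_the > score_rhe else None
```

Even though the touch probability favored “r” at the first input, the usage term dominates once three characters are available, so “The” is offered as the suggestion.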



FIGS. 10A and 10B are flow diagrams illustrating text input processes 1000 and 1050 in accordance with some embodiments. The processes increase the accuracy of text entered on touch screen keyboards by touch input and the accuracy of suggested words, respectively. Process 1000 is performed at a portable electronic device having a touch screen display (e.g., device 100).


The device displays (1002) a plurality of key icons (e.g., in keyboard 5640, FIG. 8A), each key icon having an adjustable hit region of dynamically adjustable size (e.g., FIGS. 8B-8C). In some embodiments, the hit region of each key icon has a default size equal to a visible display size of the key icon (e.g., FIG. 8A).


The device receives (1004) a sequence of individual touch points input by a user on the touch screen display. Each touch point is determined at lift off of a contact (e.g., a finger contact or a stylus contact) from the touch screen display. An image with an enlarged version of the character that will be selected as the character corresponding to an individual touch point is displayed prior to lift off of a respective contact (e.g., the letter “N” in FIG. 11A). The character image that is displayed prior to lift off is selected in accordance with the adjustable hit regions of the displayed key icons. For example, if the hidden hit regions of two (or more) keys overlap with the finger contact (or with a touch point position derived from a finger contact), then a character image that corresponds to the key with the largest hit region ATotal(i) (including its hidden hit region AHidden(i)) is selected for display.


After receiving each of the individual touch points, the device:

    • forms a user-input directed graph for the sequence of individual touch points received so far;
    • determines a character corresponding to a last received individual touch point in accordance with the adjustable hit regions of the displayed key icons;
    • displays a sequence of characters corresponding to the sequence of individual touch points, including the determined character; and
    • updates sizes of the adjustable hit regions for a plurality of the key icons (1006).


As noted above, in some embodiments, if the hidden hit regions of two (or more) keys overlap with a finger contact (or with a touch point position derived from the finger contact), then the character corresponding to the key with the largest hit region (including its hidden hit region) is selected as the determined character.
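The largest-region rule for resolving overlapping hit regions can be sketched as below. Rectangular regions and all coordinates are illustrative assumptions; the patent does not specify region geometry.

```python
# Sketch of overlap resolution: when a touch point lies inside several
# keys' total hit regions (visible plus hidden), select the key whose
# total region A_total(i) has the largest area. Rectangles and all
# coordinates here are illustrative assumptions.
def area(rect):
    x0, y0, x1, y1 = rect
    return (x1 - x0) * (y1 - y0)

def contains(rect, point):
    x0, y0, x1, y1 = rect
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

def select_key(hit_regions, touch_point):
    """hit_regions: dict mapping a character to its total hit rectangle
    (x0, y0, x1, y1). Returns the character whose region contains the
    touch point and has the largest area, or None."""
    hits = [c for c, r in hit_regions.items() if contains(r, touch_point)]
    if not hits:
        return None
    return max(hits, key=lambda c: area(hit_regions[c]))

# "d" has an enlarged hidden hit region that overlaps "f"'s visible area,
# as in the "Go" -> "God" example of FIG. 8B.
regions = {"d": (0, 0, 12, 10), "f": (10, 0, 20, 10)}
```

A touch at (11, 5) falls inside both rectangles but selects “d”, because its total region is larger; a touch at (15, 5) lies only inside “f” and selects it normally.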


In some embodiments, the device determines (1008) one or more alternate sequences of characters corresponding to the sequence of individual touch points, and determines a respective probability for each of the alternate sequences of characters and for the displayed sequence of characters. In some embodiments, the device displays a suggested replacement character string comprising a selected one of the alternate sequences of characters when the probability of the selected alternate sequence meets one or more predefined criteria with respect to the probability of the displayed sequence of characters [e.g., in FIG. 9, Ptouch(t)Pusage(t) Ptouch(h)Pusage(h) Ptouch(e)Pusage(e)>Ptouch(r)Pusage(r) Ptouch(h)Pusage(h) Ptouch(e)Pusage(e)].


In some embodiments, the device receives (1010) a touch point corresponding to a deletion key icon; deletes one or more of the displayed characters to produce a shortened sequence of characters; receives additional individual touch points; and, after receiving each of the additional individual touch points, determines and displays a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria.


In some embodiments, the size of the adjustable hit region for a respective key icon is updated in accordance with the sequence of individual touch points input by the user. In some embodiments, updating the size of the adjustable hit region for a respective key icon includes determining a probability associated with the respective key icon and determining a size of the adjustable hit region in accordance with the determined probability. In some embodiments, the probability associated with the respective key icon is determined in accordance with the displayed sequence of characters (e.g., “Go” in FIG. 8B). In some embodiments, the probability associated with the respective key icon is determined in accordance with a plurality of character sequences including the displayed sequence of characters and at least one other sequence of characters consistent with the sequence of individual touch points input by the user. For example, the probability associated with the respective key icon may be based on the displayed sequence of characters (e.g., “Go” in FIG. 8B) and the top N (where N=1, 2, 5, etc.) candidate words (e.g., God, Goal, and Good for N=3) consistent with the sequence of individual touch points input by the user.


In some embodiments, the device determines a respective probability for each of a plurality of character sequences consistent with the sequence of individual touch points input by the user. The probability associated with the respective key icon is determined in accordance with determined probabilities of the plurality of character sequences, each of which comprises a potential prefix for a next character corresponding to a next touch point input by the user.


Process 1050 is performed at a portable electronic device having a touch screen display (e.g., device 100). The process increases the accuracy of suggested words by using information derived from character deletion by a user on the character string currently being entered. In the process, in addition to meeting other predefined word suggestion criteria, a word is not suggested unless the word starts with the shortened sequence of characters that remain after a user has deleted characters from the current character string being input by the user.


The device displays (1030) a plurality of key icons (e.g., soft keyboard 5640, FIG. 8A).


The device receives (1032) a sequence of individual touch points input by a user on the touch screen display.


The device displays (1034) a sequence of characters corresponding to the sequence of individual touch points.


The device receives (1036) a touch point corresponding to a deletion key icon.


The device deletes (1038) one or more of the displayed characters to produce a shortened sequence of characters.


The device receives (1040) additional individual touch points.


After receiving each of the additional individual touch points, the device:

    • displays a current sequence of characters including characters associated with the additional individual touch points; and
    • determines and displays a suggested character string only when the suggested character string starts with the shortened sequence of characters and the suggested character string meets predefined character string suggestion criteria (1042).
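The post-deletion filtering step in process 1050 can be sketched as follows. The candidate dictionary and its probabilities are illustrative assumptions; only the two conditions (prefix match on the shortened sequence, probability above the current string) follow the description.

```python
# Sketch of the post-deletion filter in process 1050: a word qualifies as
# a suggestion only if it starts with the shortened prefix left after
# deletion AND its probability beats the current string's probability.
# The candidate words and probability values are illustrative assumptions.
def suggest(shortened_prefix, current, candidates):
    """candidates: dict mapping word -> probability. Returns the most
    probable eligible suggestion, or None."""
    p_current = candidates.get(current, 0.0)
    eligible = {w: p for w, p in candidates.items()
                if w.startswith(shortened_prefix) and p > p_current}
    if not eligible:
        return None
    return max(eligible, key=eligible.get)

# After deleting back to "go" and typing "a", "throat" is ruled out even
# though it is the most probable word overall: it fails the prefix test.
candidates = {"goal": 0.5, "good": 0.4, "throat": 0.9, "goa": 0.1}
```

Here `suggest("go", "goa", candidates)` yields “goal” rather than the higher-probability “throat”, because the deletion constrained suggestions to the shortened prefix.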


In some embodiments, the device determines a respective probability for the suggested character string and for the current sequence of characters. The predefined character string suggestion criteria include a requirement that the determined probability for the suggested character string be greater than the determined probability for the current sequence of characters.


In some embodiments, the predefined character string suggestion criteria include a requirement that the determined probability for the suggested character string be greater than the determined probability for the current sequence of characters by at least a predefined margin. For example, the margin may be a 10% margin, requiring that the probability for the suggested character string is at least 10% greater than the probability for the current sequence of characters.
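The margin criterion above amounts to a single comparison; a minimal sketch, with an assumed function name and the 10% default from the example:

```python
# Sketch of the predefined-margin suggestion criterion: the suggestion's
# probability must exceed the current string's probability by at least
# the margin (10% in the example above). Names are illustrative.
def meets_margin(p_suggested, p_current, margin=0.10):
    return p_suggested > p_current * (1.0 + margin)
```

So with a 10% margin, a suggestion with probability 0.12 beats a current string at 0.10, but one at 0.105 does not, even though it is strictly greater.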


In some embodiments, the determined probabilities are determined in accordance with the additional individual touch points input by the user.


In some embodiments, the determined probabilities are determined in accordance with the shortened sequence of characters and the additional individual touch points input by the user.


In some embodiments, the suggested character string comprises a complete word.


In some embodiments, the suggested character string comprises a complete word that includes at least one character not currently included in the plurality of displayed key icons. For example, if the user enters the letters “bete,” the suggested word may be “bête.” The suggested word includes “ê”, which may not be displayed on a key icon on the keyboard.
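One way to match a typed string against words containing characters absent from the keyboard (such as “bête” for “bete”) is to compare diacritic-folded forms. This folding approach is an assumption for illustration, not the patent's stated method.

```python
# Sketch: suggesting "bête" for the typed letters "bete" by comparing
# diacritic-folded forms. Unicode NFD decomposition splits "ê" into "e"
# plus a combining circumflex, which is then stripped. This particular
# matching strategy is an illustrative assumption.
import unicodedata

def fold(word):
    """Strip combining marks so 'bête' compares equal to 'bete'."""
    decomposed = unicodedata.normalize("NFD", word)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

def matches_typed(word, typed):
    """True if the folded word starts with the folded typed string."""
    return fold(word).lower().startswith(fold(typed).lower())
```

Under this scheme, a dictionary entry “bête” matches the input “bete” even though no “ê” key icon is displayed.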



FIGS. 11A and 11B illustrate an exemplary user interface for inputting text in accordance with some embodiments.


In some embodiments, user interfaces 1100A and 1100B include the following elements, or a subset or superset thereof:

    • Signal strength indicator(s) 650 for wireless communication(s), such as cellular and Wi-Fi signals;
    • Time 652;
    • Battery status indicator 654;
    • Text entry area 612;
    • Send icon 614 that when activated (e.g., by a finger tap on the icon) initiates sending of the message in text box 612 to another party (e.g., Mike Van Os);
    • Soft keyboard 616 for entering text in area 612;
    • Alternate keyboard selector icon 618 that when activated (e.g., by a finger tap on the icon) initiates the display of a different keyboard (e.g., a soft keyboard with numbers);
    • Return icon 620 that when activated (e.g., by a finger tap on the icon) initiates a new line in the message in text box 612;
    • Shift key 628 that when activated (e.g., by a finger tap on the icon) capitalizes the next letter chosen on soft keyboard 616;
    • Recipient input field 632 that when activated (e.g., by a finger tap on the field) receives and displays the phone number of the recipient of the instant message (or the recipient's name if the recipient is already in the user's contact list);
    • Add recipient icon 634 that when activated (e.g., by a finger tap on the icon) initiates the display of a scrollable list of contacts;
    • Cancel icon 636 that when activated (e.g., by a finger tap on the icon) cancels the new instant message;
    • Second area with suggested word 644 (e.g., adjacent to the word being input in text entry area 612);
    • Rejection icon 645;
    • Space bar 646; and/or
    • Insertion marker 656 (e.g., a cursor, insertion bar, insertion point, or pointer).


In some embodiments, a user can set whether the second area 644 with a suggested word is shown (e.g., by setting a user preference). In some embodiments, a letter is enlarged briefly before or after it is selected (e.g., an enlarged “N” 660 is displayed briefly while typing the “n” in “din” in FIG. 11A) to provide feedback to the user.



FIG. 12 is a flow diagram illustrating a process 1200 for inputting text on a portable electronic device with a soft keyboard and a touch screen display (e.g., device 100) in accordance with some embodiments. The process makes it very simple and intuitive for a user to accept or reject suggested words.


In a first area (e.g., text entry area 612) of the touch screen display, the device displays (1202) a current character string being input by a user with the soft keyboard (e.g., “din”, FIG. 11A).


In a second area 644 of the touch screen display, the device displays (1204) a suggested replacement character string for the current character string (e.g., “dinner” in area 644, FIG. 11A). In some embodiments, the second area 644 includes a suggestion rejection icon 645 adjacent to the suggested replacement character string (e.g., the circled “X” in area 644 adjacent to “dinner”).


The device replaces (1206) the current character string in the first area with the suggested replacement character string in response to detecting user activation of a key on the soft keyboard associated with a delimiter. For example, if the user activates the space bar key 646 on keyboard 616, the character string “din” in area 612 is replaced with the suggested replacement character string “dinner,” as shown in FIG. 11B.
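The accept/reject behavior of process 1200 can be sketched as a small state update: a delimiter accepts the suggestion and replaces the current string, while a tap on the suggestion itself dismisses it. The event names and the appended space are illustrative assumptions.

```python
# Sketch of process 1200's accept/reject behavior. Event names and the
# trailing space on acceptance are illustrative assumptions.
def handle_event(current, suggestion, event):
    """Return the (text, suggestion) state after the event."""
    if event == "delimiter" and suggestion is not None:
        return suggestion + " ", None   # accept: "din" -> "dinner "
    if event == "tap_suggestion":
        return current, None            # reject: keep "din", dismiss 644
    return current, suggestion          # anything else: no change
```

Tapping the space bar with “din” displayed and “dinner” suggested yields the replaced text, whereas tapping the suggestion keeps “din” and simply ends display of the suggestion, as in FIGS. 11A and 11B.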


The device keeps (1208) the current character string in the first area in response to detecting a finger gesture on the suggested replacement character string displayed in the second area (e.g., a tap gesture on the suggested replacement character string “dinner” ends display of the suggested replacement character string “dinner” and the suggestion rejection icon 645, while the current character string “din” is kept in area 612, not shown).


The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims
  • 1. An electronic device, comprising: a touch-sensitive surface;a display;one or more processors; andmemory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving, via the touch-sensitive surface, a set of one or more inputs;in response to receiving the set of one or more inputs, displaying, in a first area of the display, a sequence of characters corresponding to the set of one or more inputs;in accordance with a determination that a display suggested word setting is enabled: displaying, in a second area of the display, one or more suggested character strings based on the sequence of characters;receiving, via the touch-sensitive surface, an input;in response to receiving the input: in accordance with a determination that the input corresponds to selection of the one or more suggested character strings: maintaining display of the sequence of characters corresponding to the set of one or more inputs; and ceasing to display the one or more suggested character strings in the second area on the display;in accordance with a determination that the input corresponds to a key associated with a non-delimiter: concatenating display of the sequence of characters in the first area with the non-delimited character; in accordance with a determination that respective probabilities of the one or more suggested character strings exceeds respective probabilities associated with the concatenated sequence of characters, maintaining display of the one or more suggested character strings in the second area on the display; and in accordance with a determination that the respective probabilities associated with the concatenated sequence of characters exceeds the respective probabilities of the one or more suggested character strings, ceasing to display the one or more suggested character strings in the second area on the display; andin accordance with a determination that the display suggested 
word setting is disabled, forgoing to display one or more suggested character strings in the second area on the display.
  • 2. The electronic device of claim 1, wherein the second area is adjacent to the sequence of characters in the first area.
  • 3. The electronic device of claim 1, the one or more programs further including instructions for: in response to receiving the input and in accordance with a determination that the input corresponds to a key associated with a delimiter: maintaining display of the sequence of characters corresponding to the set of one or more inputs; andceasing to display the one or more suggested character strings in the second area on the display.
  • 4. The electronic device of claim 3, wherein the key associated with the delimiter is a space bar.
  • 5. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and a touch-sensitive surface, the one or more programs including instructions for: receiving, via the touch-sensitive surface, a set of one or more inputs;in response to receiving the set of one or more inputs, displaying, in a first area of the display, a sequence of characters corresponding to the set of one or more inputs;in accordance with a determination that a display suggested word setting is enabled: displaying, in a second area of the display, one or more suggested character strings based on the sequence of characters;receiving, via the touch-sensitive surface, an input;in response to receiving the input: in accordance with a determination that the input corresponds to selection of the one or more suggested character strings: maintaining display of the sequence of characters corresponding to the set of one or more inputs; andceasing to display the one or more suggested character strings in the second area on the display;in accordance with a determination that the input corresponds to a key associated with a non-delimiter: concatenating display of the sequence of characters in the first area with the non-delimited character;in accordance with a determination that respective probabilities of the one or more suggested character strings exceeds respective probabilities associated with the concatenated sequence of characters, maintaining display of the one or more suggested character strings in the second area on the display; andin accordance with a determination that the respective probabilities associated with the concatenated sequence of characters exceeds the respective probabilities of the one or more suggested character strings, ceasing to display the one or more suggested character strings in the second area on the display; andin accordance with a determination that the display 
suggested word setting is disabled, forgoing to display one or more suggested character strings in the second area on the display.
  • 6. The non-transitory computer-readable storage medium of claim 5, wherein the second area is adjacent to the sequence of characters in the first area.
  • 7. The non-transitory computer-readable storage medium of claim 5, the one or more programs further including instructions for: in response to receiving the input and in accordance with a determination that the input corresponds to a key associated with a delimiter: maintaining display of the sequence of characters corresponding to the set of one or more inputs; andceasing to display the one or more suggested character strings in the second area on the display.
  • 8. The non-transitory computer-readable storage medium of claim 7, wherein the key associated with the delimiter is a space bar.
  • 9. A method, comprising: at an electronic device with a display and a touch-sensitive surface: receiving, via the touch-sensitive surface, a set of one or more inputs;in response to receiving the set of one or more inputs, displaying, in a first area of the display, a sequence of characters corresponding to the set of one or more inputs;in accordance with a determination that a display suggested word setting is enabled: displaying, in a second area of the display, one or more suggested character strings based on the sequence of characters;receiving, via the touch-sensitive surface, an input;in response to receiving the input: in accordance with a determination that the input corresponds to selection of the one or more suggested character strings: maintaining display of the sequence of characters corresponding to the set of one or more inputs; and ceasing to display the one or more suggested character strings in the second area on the display;in accordance with a determination that the input corresponds to a key associated with a non-delimiter: concatenating display of the sequence of characters in the first area with the non-delimited character; in accordance with a determination that respective probabilities of the one or more suggested character strings exceeds respective probabilities associated with the concatenated sequence of characters, maintaining display of the one or more suggested character strings in the second area on the display; and in accordance with a determination that the respective probabilities associated with the concatenated sequence of characters exceeds the respective probabilities of the one or more suggested character strings, ceasing to display the one or more suggested character strings in the second area on the display; andin accordance with a determination that the display suggested word setting is disabled forgoing to display one or more suggested character strings in the second area on the display.
  • 10. The method of claim 9, wherein the second area is adjacent to the sequence of characters in the first area.
  • 11. The method of claim 9, further comprising: in response to receiving the input and in accordance with a determination that the second input corresponds to a key associated with a delimiter: maintaining display of the sequence of characters corresponding to the set of one or more inputs; andceasing to display the one or more suggested character strings in the second area on the display.
  • 12. The method of claim 11, wherein the key associated with the delimiter is a space bar.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/270,396, filed Feb. 7, 2019, which is a continuation of U.S. patent application Ser. No. 14/800,378, filed Jul. 15, 2015, which is a continuation of U.S. patent application Ser. No. 13/559,495, filed Jul. 26, 2012, now U.S. Pat. No. 9,086,802, which is a divisional application of U.S. patent application Ser. No. 12/165,554, filed Jun. 30, 2008, now U.S. Pat. No. 8,232,973, which claims priority to U.S. Provisional Patent Application No. 61/010,619, “Method, Device, and Graphical User Interface Providing Word Recommendations for Text Input,” filed Jan. 9, 2008, the entire contents of which are incorporated by reference herein in their entirety. This application is related to: U.S. patent application Ser. No. 11/620,641, “Method and System for Providing Word Recommendations for Text Input,” filed Jan. 5, 2007; U.S. patent application Ser. No. 11/620,642, “Method, System, and Graphical User Interface for Providing Word Recommendations,” filed Jan. 5, 2007; U.S. patent application Ser. No. 11/850,015, “Methods for Determining a Cursor Position from a Finger Contact with a Touch Screen Display,” filed Sep. 4, 2007; and U.S. patent application Ser. No. 12/101,832, “Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics,” filed Apr. 11, 2008. All of these applications are incorporated by reference herein in their entirety.

US Referenced Citations (341)
Number Name Date Kind
5038401 Inotsume Aug 1991 A
5053758 Cornett et al. Oct 1991 A
5128672 Kaehler Jul 1992 A
5253325 Clark Oct 1993 A
5297041 Kushler et al. Mar 1994 A
5305205 Weber et al. Apr 1994 A
5565894 Bates et al. Oct 1996 A
5581484 Prince Dec 1996 A
5615378 Nishino et al. Mar 1997 A
5736974 Selker Apr 1998 A
5748512 Vargas May 1998 A
5748927 Stein et al. May 1998 A
5758314 Mckenna May 1998 A
5761689 Rayson et al. Jun 1998 A
5765168 Burrows Jun 1998 A
5774834 Visser Jun 1998 A
5778405 Ogawa Jul 1998 A
5797008 Burrows Aug 1998 A
5801941 Bertram Sep 1998 A
5805165 Thorne et al. Sep 1998 A
5805911 Miller Sep 1998 A
5818437 Grover et al. Oct 1998 A
5818451 Bertram et al. Oct 1998 A
5896321 Miller et al. Apr 1999 A
5943443 Itonori et al. Aug 1999 A
5953541 King et al. Sep 1999 A
5956021 Kubota et al. Sep 1999 A
5963671 Comerford et al. Oct 1999 A
5999895 Forest Dec 1999 A
6023536 Visser Feb 2000 A
6040824 Maekawa et al. Mar 2000 A
6049326 Beyda et al. Apr 2000 A
6073036 Heikkinen et al. Jun 2000 A
6094197 Buxton et al. Jul 2000 A
6169538 Nowlan et al. Jan 2001 B1
6212412 Rogers et al. Apr 2001 B1
6259436 Moon et al. Jul 2001 B1
6271835 Hoeksma Aug 2001 B1
6292179 Lee Sep 2001 B1
6295052 Kato et al. Sep 2001 B1
6298321 Karlov et al. Oct 2001 B1
6307548 Flinchem et al. Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6359572 Vale Mar 2002 B1
6377965 Hachamovitch et al. Apr 2002 B1
6401060 Critchlow et al. Jun 2002 B1
6424338 Anderson Jul 2002 B1
6426761 Kanevsky et al. Jul 2002 B1
6434581 Forcier et al. Aug 2002 B1
6456952 Nathan Sep 2002 B1
6469722 Kinoe et al. Oct 2002 B1
6470347 Gillam Oct 2002 B1
6570557 Westerman et al. May 2003 B1
6573844 Venolia et al. Jun 2003 B1
6597345 Hirshberg Jul 2003 B2
6654733 Goodman et al. Nov 2003 B1
6671856 Gillam Dec 2003 B1
6675169 Bennett et al. Jan 2004 B1
6677932 Westerman Jan 2004 B1
6714221 Christie et al. Mar 2004 B1
6760580 Robinson et al. Jul 2004 B2
6795059 Endo Sep 2004 B2
6801190 Longe et al. Oct 2004 B1
6801659 O'dell Oct 2004 B1
6803905 Capps et al. Oct 2004 B1
6804677 Shadmon et al. Oct 2004 B2
6856318 Lewak Feb 2005 B1
6857800 Zhang et al. Feb 2005 B2
6926609 Martin Aug 2005 B2
6938220 Shigematsu et al. Aug 2005 B1
7030863 Longe et al. Apr 2006 B2
7038659 Rajkowski May 2006 B2
7057607 Mayoraz et al. Jun 2006 B2
7075512 Fabre et al. Jul 2006 B1
7096432 Huapaya et al. Aug 2006 B2
7098896 Kushler et al. Aug 2006 B2
7194699 Thomson et al. Mar 2007 B2
7277088 Robinson et al. Oct 2007 B2
7283072 Plachta et al. Oct 2007 B1
7283126 Leung Oct 2007 B2
7319957 Robinson et al. Jan 2008 B2
7382358 Kushler et al. Jun 2008 B2
7443316 Lim Oct 2008 B2
7475063 Datta et al. Jan 2009 B2
7477240 Yanagisawa Jan 2009 B2
7487147 Bates et al. Feb 2009 B2
7490034 Finnigan et al. Feb 2009 B2
7502017 Ratzlaff et al. Mar 2009 B1
7508324 Suraqui Mar 2009 B2
7526738 Ording et al. Apr 2009 B2
7565380 Venkatachary Jul 2009 B1
7584093 Potter et al. Sep 2009 B2
7609179 Diaz-gutierrez et al. Oct 2009 B2
7614008 Ording Nov 2009 B2
7619677 Matsuda et al. Nov 2009 B2
7650562 Bederson et al. Jan 2010 B2
7676763 Rummel et al. Mar 2010 B2
7679534 Kay et al. Mar 2010 B2
7683886 Willey Mar 2010 B2
7694231 Kocienda et al. Apr 2010 B2
7707026 Liu Apr 2010 B2
7712053 Bradford et al. May 2010 B2
7725838 Williams May 2010 B2
7778818 Longe et al. Aug 2010 B2
7793228 Mansfield et al. Sep 2010 B2
7797269 Rieman et al. Sep 2010 B2
7809719 Furuuchi et al. Oct 2010 B2
7809744 Nevidomski et al. Oct 2010 B2
7880730 Robinson et al. Feb 2011 B2
7941762 Tovino et al. May 2011 B1
7957955 Christie et al. Jun 2011 B2
7969421 Huh Jun 2011 B2
8037034 Plachta et al. Oct 2011 B2
8041557 Liu Oct 2011 B2
8059101 Westerman et al. Nov 2011 B2
8074172 Kocienda et al. Dec 2011 B2
8090571 Elshishiny et al. Jan 2012 B2
8112529 Van et al. Feb 2012 B2
8136052 Shin et al. Mar 2012 B2
8179370 Yamasani et al. May 2012 B1
8232973 Kocienda et al. Jul 2012 B2
8245156 Mouilleseaux et al. Aug 2012 B2
8286085 Denise Oct 2012 B1
8299943 Longe et al. Oct 2012 B2
8370737 Zahavi et al. Feb 2013 B2
8423916 Chihara et al. Apr 2013 B2
8542206 Westerman et al. Sep 2013 B2
8601389 Schulz et al. Dec 2013 B2
8645825 Cornea et al. Feb 2014 B1
8661340 Goldsmith et al. Feb 2014 B2
8671343 Oberstein Mar 2014 B2
8706750 Hansson et al. Apr 2014 B2
8825484 Yamada et al. Sep 2014 B2
8843845 Bi et al. Sep 2014 B2
8896556 Frazier et al. Nov 2014 B2
8938688 Bradford et al. Jan 2015 B2
8994660 Neels et al. Mar 2015 B2
9007311 Kwak et al. Apr 2015 B2
9021380 Ouyang et al. Apr 2015 B2
9046928 Kumhyr Jun 2015 B2
9058092 Rogers Jun 2015 B2
9086802 Kocienda et al. Jul 2015 B2
9116551 Huang et al. Aug 2015 B2
9250797 Roberts et al. Feb 2016 B2
9310889 Griffin et al. Apr 2016 B2
9405740 King et al. Aug 2016 B2
9436380 Chmielewski et al. Sep 2016 B2
9465536 Goldsmith et al. Oct 2016 B2
9535597 Wong et al. Jan 2017 B2
9557913 Griffin et al. Jan 2017 B2
9557916 Robinson et al. Jan 2017 B2
9740399 Paek et al. Aug 2017 B2
10037139 Pasquero et al. Jul 2018 B2
20010015718 Hinckley et al. Aug 2001 A1
20020010726 Rogson Jan 2002 A1
20020015024 Westerman et al. Feb 2002 A1
20020015064 Robotham et al. Feb 2002 A1
20020051018 Yeh May 2002 A1
20020085037 Leavitt et al. Jul 2002 A1
20020126097 Savolainen Sep 2002 A1
20020135615 Lang Sep 2002 A1
20020140679 Wen Oct 2002 A1
20020140680 Lu Oct 2002 A1
20020156615 Takatsuka et al. Oct 2002 A1
20020167545 Kang et al. Nov 2002 A1
20020191029 Gillespie et al. Dec 2002 A1
20030024375 Sitrick Feb 2003 A1
20030041147 Van et al. Feb 2003 A1
20030063073 Geaghan et al. Apr 2003 A1
20030090467 Hohl et al. May 2003 A1
20030100965 Sitrick et al. May 2003 A1
20030149978 Plotnick Aug 2003 A1
20030159113 Bederson et al. Aug 2003 A1
20030189553 Goren Oct 2003 A1
20030193481 Sokolsky Oct 2003 A1
20030197736 Murphy Oct 2003 A1
20030204392 Finnigan et al. Oct 2003 A1
20030216913 Keely et al. Nov 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040009788 Mantyjarvi et al. Jan 2004 A1
20040070567 Longe et al. Apr 2004 A1
20040095395 Kurtenbach May 2004 A1
20040135774 La Monica Jul 2004 A1
20040140956 Kushler et al. Jul 2004 A1
20040155869 Robinson et al. Aug 2004 A1
20040157586 Robinson et al. Aug 2004 A1
20040160419 Padgitt Aug 2004 A1
20040165924 Griffin Aug 2004 A1
20040178994 Kairls Sep 2004 A1
20040183833 Chua Sep 2004 A1
20040196256 Wobbrock et al. Oct 2004 A1
20040218963 Van Diepen et al. Nov 2004 A1
20040243389 Thomas et al. Dec 2004 A1
20040261021 Mittal et al. Dec 2004 A1
20050024341 Gillespie et al. Feb 2005 A1
20050027622 Walker et al. Feb 2005 A1
20050057498 Gentle Mar 2005 A1
20050093826 Huh May 2005 A1
20050099398 Garside et al. May 2005 A1
20050114324 Mayer May 2005 A1
20050116927 Voelckers Jun 2005 A1
20050131687 Sorrentino Jun 2005 A1
20050162395 Unruh Jul 2005 A1
20050169527 Longe et al. Aug 2005 A1
20050190970 Griffin Sep 2005 A1
20050193351 Huoviala Sep 2005 A1
20050216331 Ahrens et al. Sep 2005 A1
20050246365 Lowles Nov 2005 A1
20050253816 Himberg et al. Nov 2005 A1
20050253818 Nettamo Nov 2005 A1
20050278647 Leavitt et al. Dec 2005 A1
20050283364 Longe et al. Dec 2005 A1
20050283726 Lunati Dec 2005 A1
20060004744 Nevidomski et al. Jan 2006 A1
20060007174 Shen Jan 2006 A1
20060044278 Fux et al. Mar 2006 A1
20060052885 Kong Mar 2006 A1
20060053387 Ording Mar 2006 A1
20060062461 Longe et al. Mar 2006 A1
20060066590 Ozawa et al. Mar 2006 A1
20060085757 Andre et al. Apr 2006 A1
20060117067 Wright et al. Jun 2006 A1
20060152496 Knaven Jul 2006 A1
20060161846 Van Leeuwen Jul 2006 A1
20060181519 Vernier et al. Aug 2006 A1
20060190256 Stephanick et al. Aug 2006 A1
20060206454 Forstall et al. Sep 2006 A1
20060241944 Potter et al. Oct 2006 A1
20060246955 Nirhamo et al. Nov 2006 A1
20060247915 Bradford et al. Nov 2006 A1
20060265208 Assadollahi Nov 2006 A1
20060265648 Rainisto Nov 2006 A1
20060274051 Longe et al. Dec 2006 A1
20060288024 Braica Dec 2006 A1
20060293880 Elshishiny et al. Dec 2006 A1
20070024736 Matsuda et al. Feb 2007 A1
20070040813 Kushler et al. Feb 2007 A1
20070046641 Lim Mar 2007 A1
20070061753 Ng et al. Mar 2007 A1
20070061754 Ardhanari et al. Mar 2007 A1
20070067272 Flynt et al. Mar 2007 A1
20070130128 Garg et al. Jun 2007 A1
20070143262 Kasperski Jun 2007 A1
20070152978 Kocienda et al. Jul 2007 A1
20070156747 Samuelson et al. Jul 2007 A1
20070174387 Jania et al. Jul 2007 A1
20070180392 Russo Aug 2007 A1
20070198566 Sustik Aug 2007 A1
20070229323 Plachta et al. Oct 2007 A1
20070229476 Huh Oct 2007 A1
20070257896 Huh Nov 2007 A1
20070260595 Beatty et al. Nov 2007 A1
20070279711 King et al. Dec 2007 A1
20070285958 Platchta et al. Dec 2007 A1
20070288449 Datta et al. Dec 2007 A1
20080036743 Westerman et al. Feb 2008 A1
20080059876 Hantler et al. Mar 2008 A1
20080072156 Sitrick Mar 2008 A1
20080109401 Sareen et al. May 2008 A1
20080114591 Williamson May 2008 A1
20080167858 Christie et al. Jul 2008 A1
20080168366 Kocienda et al. Jul 2008 A1
20080177717 Kumar et al. Jul 2008 A1
20080195388 Bower et al. Aug 2008 A1
20080209358 Yamashita Aug 2008 A1
20080259022 Mansfield et al. Oct 2008 A1
20080266261 Idzik et al. Oct 2008 A1
20080310723 Manu et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20090051661 Kraft et al. Feb 2009 A1
20090058823 Kocienda Mar 2009 A1
20090119289 Gibbs et al. May 2009 A1
20090174667 Kocienda et al. Jul 2009 A1
20090193332 Lee Jul 2009 A1
20090193361 Lee et al. Jul 2009 A1
20090249198 Davis et al. Oct 2009 A1
20090284471 Longe et al. Nov 2009 A1
20090319172 Almeida et al. Dec 2009 A1
20090327977 Bachfischer et al. Dec 2009 A1
20100023318 Lemoine Jan 2010 A1
20100036655 Cecil et al. Feb 2010 A1
20100177056 Kocienda et al. Jul 2010 A1
20100188357 Kocienda et al. Jul 2010 A1
20100188358 Kocienda et al. Jul 2010 A1
20100192086 Kocienda et al. Jul 2010 A1
20100235780 Westerman et al. Sep 2010 A1
20100287486 Coddington et al. Nov 2010 A1
20100325588 Reddy et al. Dec 2010 A1
20100333030 Johns Dec 2010 A1
20110004849 Oh Jan 2011 A1
20110183720 Dinn Jul 2011 A1
20110201387 Paek et al. Aug 2011 A1
20110202876 Badger et al. Aug 2011 A1
20110209098 Hinckley et al. Aug 2011 A1
20120036469 Suraqui Feb 2012 A1
20120047135 Hansson et al. Feb 2012 A1
20120079373 Kocienda et al. Mar 2012 A1
20120079412 Kocienda et al. Mar 2012 A1
20120119997 Gutowitz et al. May 2012 A1
20120136897 Kawauchi May 2012 A1
20120167009 Davidson et al. Jun 2012 A1
20120239395 Foo et al. Sep 2012 A1
20120240036 Howard et al. Sep 2012 A1
20130002553 Colley et al. Jan 2013 A1
20130036387 Murata Feb 2013 A1
20130104068 Murphy et al. Apr 2013 A1
20130125037 Pasquero et al. May 2013 A1
20130187858 Griffin et al. Jul 2013 A1
20130285927 Pasquero et al. Oct 2013 A1
20130339283 Grieves et al. Dec 2013 A1
20140002363 Griffin et al. Jan 2014 A1
20140028571 St. et al. Jan 2014 A1
20140063067 Compton et al. Mar 2014 A1
20140085311 Gay et al. Mar 2014 A1
20140176776 Morita Jun 2014 A1
20140195979 Branton et al. Jul 2014 A1
20140310639 Zhai et al. Oct 2014 A1
20140317547 Bi et al. Oct 2014 A1
20150121285 Eleftheriou et al. Apr 2015 A1
20150142602 Williams et al. May 2015 A1
20150242114 Hirabayashi et al. Aug 2015 A1
20150269432 Motoi Sep 2015 A1
20150281788 Noguerol et al. Oct 2015 A1
20150317078 Kocienda et al. Nov 2015 A1
20150331605 Park et al. Nov 2015 A1
20150347007 Jong et al. Dec 2015 A1
20150347379 Jong et al. Dec 2015 A1
20150378982 Mckenzie et al. Dec 2015 A1
20160026730 Hasan Jan 2016 A1
20160070441 Paek et al. Mar 2016 A1
20160092431 Motoi Mar 2016 A1
20160098186 Sugiura Apr 2016 A1
20160132232 Baba et al. May 2016 A1
20160139805 Kocienda et al. May 2016 A1
20170024126 Kocienda et al. Jan 2017 A1
20180047189 Diverdi et al. Feb 2018 A1
20190147035 Chaudhri et al. May 2019 A1
20190187892 Kocienda et al. Jun 2019 A1
20190303423 Thimbleby Oct 2019 A1
20200379638 Zhu et al. Dec 2020 A1
20210150121 Thimbleby May 2021 A1
Foreign Referenced Citations (28)
Number Date Country
1834872 Sep 2006 CN
0880090 Nov 1998 EP
1271295 Jan 2003 EP
1674976 Jun 2006 EP
2332293 Jun 1999 GB
2337349 Nov 1999 GB
2351639 Jan 2001 GB
2380583 Apr 2003 GB
8-249122 Sep 1996 JP
9-81320 Mar 1997 JP
11-53093 Feb 1999 JP
11-305933 Nov 1999 JP
2000-29630 Jan 2000 JP
2001-521793 Nov 2001 JP
2002-518721 Jun 2002 JP
2002-222039 Aug 2002 JP
2003-216312 Jul 2003 JP
2005-92256 Apr 2005 JP
199833111 Jul 1998 WO
200038041 Jun 2000 WO
200038042 Jun 2000 WO
2003098417 Nov 2003 WO
2004051392 Jun 2004 WO
2005006442 Jan 2005 WO
2005008899 Jan 2005 WO
2006003590 Jan 2006 WO
2006115946 Nov 2006 WO
2007068505 Jun 2007 WO
Non-Patent Literature Citations (156)
Entry
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/245,140, dated Apr. 2, 2021, 4 pages.
Notice of Allowance received for U.S. Appl. No. 16/245,140, dated Jul. 30, 2021, 8 pages.
Notice of Allowance received for U.S. Appl. No. 16/245,140, dated May 18, 2021, 14 pages.
Notice of Allowance received for U.S. Appl. No. 16/814,770, dated Jun. 1, 2021, 9 pages.
Advisory Action received for U.S. Appl. No. 15/288,579, dated May 2, 2019, 5 pages.
Advisory Action received for U.S. Appl. No. 16/270,396, dated Apr. 9, 2020, 5 pages.
Anonymous, "Swipe to Edit Using Better Touch Tool: Mac Automation Tips", Mac Automation Tips, XP55217837, Available Online at: https://macautomationtips.wordpress.com/2011/03/11/swipe-to-edit-using-bettertouchtool/, Mar. 11, 2011, 1 page.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/245,140, dated Feb. 1, 2021, 4 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/245,140, dated Sep. 16, 2020, 6 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/270,396, dated Apr. 6, 2020, 5 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/270,396, dated Feb. 22, 2021, 4 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/270,396, dated Sep. 14, 2020, 5 pages.
CALL Centre, “Word Prediction”, The CALL Centre & Scottish Executive Education Dept., 1999, pp. 63-73.
Casario M., "Hands on Macromedia World: Touch Screen Keypad for Mobile Phone by DoCoMo", Available Online at: http://casario.blogs.com/mmworld/2005/10/touch_screen_ke.html, retrieved on Nov. 18, 2005, 1 page.
Centroid, Available Online at: http://faculty.evansville.edu/ck6/tcenters/class/centroid.html, Apr. 28, 2006, 1 page.
Centroid, Available Online at : http://www.pballew.net/centroid.html, Apr. 28, 2006, 3 pages.
Chavda Prekesh, "Swipe to Edit", Dribbble, XP55217832, Available Online at: https://dribbble.com/shots/1320750-Swipe-to-Edit-animation, Nov. 21, 2013, 7 pages.
Compare Keyboards with the Keyboard Compatibility Chart, Learn more About Alternative Keyboards, Solutions for Humans, Available Online at: http://www.keyalt.com/kkeybrdp.htm, Dec. 8, 2005, 5 pages.
Day B., "Will Cell Phones Render iPods Obsolete?", Available Online at: http://weblogs.java.net/pub/wig/883, Dec. 12, 2005, 3 pages.
Decision to Grant received for Chinese Patent Application No. 201210377992.0, dated May 6, 2015, 4 pages.
Decision to Grant received for European Patent Application No. 15716372.6, dated Aug. 16, 2019, 2 pages.
Devices, Technology Loan Catalog, Available Online at: http://www.tsbvi.edu/outreach/techloan/catalog.html, retrieved on Dec. 8, 2005, 10 pages.
dyslexic.com, "AlphaSmart 3000 with CoWriter SmartApplet: Don Johnston Special Needs", Available Online at: http://www.dyslexic.com/products.php?catid=2&pid=465&PHPSESSID=2511b800000f7da, retrieved on Dec. 6, 2005, pp. 1-13.
Fastap Keypads Redefine Mobile Phones, DigitWireless, Available Online at http://www.digitwireless.com, retrieved on Nov. 18, 2005, 10 pages.
Fastap, DigitWireless, Available Online at: http://www.digitwireless.com/about/faq.html, Dec. 6, 2005, 5 pages.
Final Office Action Received for U.S. Appl. No. 14/502,711 , dated Sep. 22, 2017, 27 pages.
Final Office Action received for U.S. Appl. No. 11/459,615, dated Dec. 8, 2009, 12 pages.
Final Office Action received for U.S. Appl. No. 11/549,624, dated Apr. 10, 2009, 9 pages.
Final Office Action received for U.S. Appl. No. 11/549,624, dated Feb. 1, 2010, 9 pages.
Final Office Action received for U.S. Appl. No. 11/620,641, dated Jun. 25, 2010, 31 pages.
Final Office Action received for U.S. Appl. No. 11/620,642, dated Nov. 29, 2010, 14 pages.
Final Office Action received for U.S. Appl. No. 11/961,663 dated Mar. 17, 2011, 24 pages.
Final Office Action received for U.S. Appl. No. 12/207,429, dated Nov. 28, 2011, 11 pages.
Final Office Action received for U.S. Appl. No. 12/207,429, dated Oct. 25, 2012, 8 pages.
Final Office Action received for U.S. Appl. No. 12/505,382, dated Jul. 9, 2012, 35 pages.
Final Office Action received for U.S. Appl. No. 12/505,382, dated May 3, 2012, 27 pages.
Final Office Action received for U.S. Appl. No. 13/559,495, dated Sep. 8, 2014, 7 pages.
Final Office Action received for U.S. Appl. No. 14/502,711, dated Apr. 12, 2017, 29 pages.
Final Office Action Received for U.S. Appl. No. 14/503,147, dated Jun. 15, 2017, 18 pages.
Final Office Action received for U.S. Appl. No. 14/800,378, dated Sep. 7, 2018, 18 pages.
Final Office Action received for U.S. Appl. No. 15/003,773, dated May 10, 2018, 12 pages.
Final Office Action received for U.S. Appl. No. 15/288,579, dated Jan. 7, 2019, 12 pages.
Final Office Action received for U.S. Appl. No. 16/245,140, dated Feb. 11, 2021, 23 pages.
Final Office Action received for U.S. Appl. No. 16/270,396, dated Mar. 6, 2020, 11 pages.
Final Office Action received for U.S. Appl. No. 16/270,396, dated Oct. 19, 2020, 10 pages.
Four-Button Keyboard, WikiPodlinux, Available Online at: http://ipodlinux.org/Four_Button_Keyboard, retrieved on Dec. 5, 2005, 2 pages.
Glossary of Adaptive Technologies: Word Prediction, Available Online at: http://www.utoronto.ca/atrc/reference/techwordpred.html, retrieved on Dec. 6, 2005, pp. 1-5.
Hardy Ed, “Apple Adds iTunes Wi-Fi Music Store to iPhone”, Brighthand, Available Online at: http://www.brighthand.com/printArticle.asp?newsID=13379, Sep. 28, 2007, 1 page.
Intention to Grant received for European Patent Application No. 15716372.6, dated Apr. 3, 2019, 7 pages.
International Preliminary Report on Patentability Received for PCT Application No. PCT/US2015/023946 , dated Dec. 15, 2016, 17 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/060119, dated Jul. 8, 2008, 13 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/088872, dated Jul. 7, 2009, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/088873, dated Jul. 7, 2009, 6 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/023946 , dated Oct. 12, 2015, 22 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/060119, dated Apr. 11, 2008, 18 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088872, dated May 8, 2008, 8 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088873, dated May 8, 2008, 7 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088904, dated Sep. 15, 2008, 17 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2008/050426, dated Jun. 13, 2008, 10 pages.
Introducing the Ultimate Smartphone Keypad, Delta II™ Keypads, Available Online at: http://www.chicagologic.com, retrieved on Nov. 18, 2005, 2 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2007/060119, dated Jan. 2, 2008, 9 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2007/088904, dated Jun. 23, 2008, 8 pages.
LG Develops New Touch Pad Cell Phones, Textually, Available Online at: http://textually.org/textually/archives/2005/06/009903.html, retrieved on Nov. 18, 2005, 1 page.
Mactech, “Keystrokes 3.5 for Mac OS X Boosts Word Prediction”, Available Online at: http://www.mactech.com/news/?p=1007129, retrieved on Jan. 7, 2008, pp. 1-3.
Masui Toshiyuki, “POBox: An Efficient Text Input Method for Handheld and Ubiquitous Computers”, Proceedings of the 1st International Symposium on Handheld and Ubiquitous Computing, 1999, 12 pages.
Microsoft New-Smart Phone Interface: Your Thumb, Textually, Available Online at: http://www.textually.org, retrieved on Nov. 18, 2005, 2 pages.
Minutes of the Oral proceedings received for European Patent Application No. 15716372.6, mailed on Mar. 29, 2019, 9 pages.
Minutes of the Oral Proceedings received for European Patent Application No. 15716372.6, mailed on Feb. 21, 2019, 6 pages.
Mobile Tech News, “T9 Text Input Software Updated”, Available Online at: http://www.mobiletechnews.com/info/2004/11/23/122155.html, Nov. 23, 2004, 4 pages.
NCIP, “NCIP Library: Word Prediction Collection”, Available Online at: http://www2.edc.org/ncip/library/wp/toc.htm, 1998, 4 pages.
NCIP, “What is Word Prediction?”, Available Online at: http://www2.edc.org/NCIP/library/wp/what_is.htm, 1998, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,641, dated Nov. 20, 2009, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,217, dated May 11, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 11/228,737, dated Mar. 19, 2009, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,606, dated May 28, 2009, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,615, dated Apr. 13, 2010, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,615, dated May 22, 2009, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/549,624, dated Jul. 22, 2009, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 11/549,624, dated Sep. 30, 2008, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,642, dated Feb. 18, 2011, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,642, dated Mar. 30, 2010, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/961,663, dated Nov. 18, 2010, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/165,554, dated Nov. 21, 2011, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 12/207,429, dated Jun. 21, 2013, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 12/207,429, dated Jun. 9, 2011, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 12/207,429, dated Mar. 30, 2012, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 12/505,382, dated Jan. 5, 2012, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,219, dated Feb. 17, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,220, dated Feb. 16, 2012, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,221, dated Feb. 16, 2012, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 12/976,834, dated Mar. 12, 2013, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 13/220,202, dated Jun. 12, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/310,586, dated Jul. 9, 2015, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/310,592, dated Jun. 22, 2015, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/559,495, dated Dec. 16, 2013, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 13/559,495, dated Dec. 7, 2012, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 14/502,711, dated Apr. 26, 2018, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 14/503,147 , dated Nov. 2, 2016, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 14/800,378, dated Feb. 23, 2018, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/003,773, dated Oct. 5, 2017, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/288,579, dated Apr. 4, 2018, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 16/245,140, dated Jun. 2, 2020, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 16/245,140, dated Oct. 30, 2020, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 16/270,396, dated Aug. 22, 2019, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 16/270,396, dated Jun. 12, 2020, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 90/012,892, dated Aug. 5, 2014, 54 pages.
Non-Final Office Action received for U.S. Appl. No. 90/012,892, dated Mar. 27, 2015, 102 pages.
Non-Final Office Action Received for U.S. Appl. No. 14/502,711, dated Nov. 21, 2016, 23 pages.
Notice of Allowance received for U.S. Appl. No. 13/310,586, dated Sep. 14, 2015, 9 pages.
Notice of Allowance received for U.S. Appl. No. 11/459,606, dated Dec. 18, 2009, 7 pages.
Notice of Allowance received for U.S. Appl. No. 11/549,624, dated Jun. 3, 2010, 6 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,641, dated Apr. 13, 2011, 6 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,641, dated Mar. 18, 2011, 12 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,642, dated Oct. 24, 2011, 8 pages.
Notice of Allowance received for U.S. Appl. No. 12/165,554, dated Apr. 2, 2012, 9 pages.
Notice of Allowance received for U.S. Appl. No. 12/207,429, dated Oct. 8, 2013, 9 pages.
Notice of Allowance received for U.S. Appl. No. 13/220,202, dated Nov. 25, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/310,592, dated Jul. 15, 2015, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Aug. 15, 2013, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Dec. 12, 2014, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Jun. 25, 2013, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Mar. 13, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/187,176, dated Jul. 29, 2016, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/502,711, dated Sep. 28, 2018, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/503,147, dated Jan. 28, 2019, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/503,147, dated Sep. 12, 2018, 7 pages.
Notice of Allowance received for U.S. Appl. No. 16/270,396, dated Apr. 1, 2021, 10 pages.
Notice of Intent received for U.S. Appl. No. 90/012,892, dated Sep. 17, 2015, 10 pages.
Office Action received for Australian Patent Application No. 2007342164, dated Apr. 1, 2010, 2 pages.
Office Action received for Chinese Patent Application No. 200780006621.9, dated Aug. 16, 2010, 4 pages.
Office Action received for Chinese Patent Application No. 200780052020.1, dated Nov. 25, 2010, 14 pages.
Office Action received for Chinese Patent Application No. 201210377992.0, dated Sep. 3, 2014, 9 pages.
Office Action received for European Patent Application No. 07709955.4, dated Jul. 31, 2009, 6 pages.
Office Action received for European Patent Application No. 07709955.4, dated Oct. 10, 2008, 5 pages.
Office Action received for European Patent Application No. 07869922.0, dated Dec. 7, 2010, 5 pages.
Office Action received for European Patent Application No. 07869922.0, dated May 26, 2010, 5 pages.
Office Action received for European Patent Application No. 07869923.8, dated May 26, 2010, 4 pages.
Office Action received for European Patent Application No. 15716372.6, dated Nov. 15, 2017, 8 pages.
Office Action received for Korean Patent Application No. 10-2008-7019114, dated Aug. 31, 2010, 4 pages.
Office Action received for Taiwan Patent Application No. 097100079, dated Apr. 17, 2012, 34 pages.
Office Action received in Japanese Patent Application No. 2008-549646, dated Apr. 27, 2011, 4 pages.
O'Neal, “Smart Phones with Hidden Keyboards”, Available Online at: http://msc.com/4250-6452_16-6229969-1.html, Nov. 18, 2005, 3 pages.
P900 User Guide, Sony Ericsson Mobile Communications AB, XP002479719, Available Online at: http://www.sonyericsson.com/downloads/P900_UG_R1b_EN.pdf, pp. 8, 16-17, 20, 24-26, 42-45, 137, 98 pages.
Plaisant C., “Touchscreen Toggle Design”, Available Online at: http://www.youtube.com/watch?v=wFWbdxicvK0, retrieved on Nov. 15, 2013, 2 pages.
Pogue D., “iPhone:The Missing Manual”, Second Edition, O'Reilly Media, Aug. 2008, pp. 19-26.
Pogue David, “iPhone: The Missing Manual”, Aug. 2007, 306 pages.
Samsung Releases Keyboard Phone in US, Textually, Available Online at: http://www.textually.org/textually/archives/2005/11/010482.htm, retrieved on Nov. 18, 2005, 1 page.
Sears et al., “Data Entry for Mobile Devices Using Soft Keyboards: Understanding the Effects of Keyboard Size and User Tasks”, Abstract, Int'l Journal of Human-Computer Interaction, vol. 16, No. 2, 1 page.
Summons to Attend Oral Proceedings received for European Patent Application No. 07709955.4, mailed on May 11, 2010, 8 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 15716372.6, mailed on Jul. 13, 2018, 9 pages.
T9® Text Input for Keypad Devices, Available Online at: http://tegic.com, 1 page.
Text Input (legacy), WikiPodlinux, Available Online at: http://ipodlinux.org/TextInput_%28legacy%29, retrieved on Dec. 5, 2005, 8 pages.
Text Input Concepts, WikiPodlinux, Available Online at: http://web.archive.org/web/20051211165254/http://ipodlinux.org/Text_Input_Concepts, 3 pages.
Text Input Methods, WikiPodlinux, Available Online at: http://ipodlinux.org/Text_Input_Methods, retrieved on Dec. 5, 2005, 5 pages.
Third Party Rejection received for U.S. Appl. No. 90/012,892, dated Jun. 14, 2013, 681 pages.
You Heard of Touch Screens Now Check Out Touch Keys, Phoneyworld, Available Online at: http://www.phoneyworld.com, retrieved on Nov. 18, 2005, 2 pages.
Related Publications (1)
Number Date Country
20210349631 A1 Nov 2021 US
Provisional Applications (1)
Number Date Country
61010619 Jan 2008 US
Divisions (1)
Number Date Country
Parent 12165554 Jun 2008 US
Child 13559495 US
Continuations (3)
Number Date Country
Parent 16270396 Feb 2019 US
Child 17385547 US
Parent 14800378 Jul 2015 US
Child 16270396 US
Parent 13559495 Jul 2012 US
Child 14800378 US