Method, system, and graphical user interface for providing word recommendations

Information

  • Patent Grant
  • Patent Number
    11,416,141
  • Date Filed
    Thursday, July 15, 2021
  • Date Issued
    Tuesday, August 16, 2022
Abstract
One aspect of the invention involves a method that includes: in a first area of the touch screen, displaying a current character string being input by a user with the keyboard; in a second area of the touch screen, displaying the current character string or a portion thereof and a suggested replacement for the current character string; replacing the current character string in the first area with the suggested replacement if the user activates a delimiter key on the keyboard; replacing the current character string in the first area with the suggested replacement if the user performs a first gesture on the suggested replacement displayed in the second area; and keeping the current character string in the first area if the user performs a second gesture on the current character string or the portion thereof displayed in the second area.
Description
TECHNICAL FIELD

The disclosed embodiments relate generally to text input on portable electronic devices, and more particularly, to a method, system, and graphical user interface for providing word recommendations on a portable electronic device.


BACKGROUND

In recent years, the functional capabilities of portable electronic devices have increased dramatically. Current devices enable communication by voice, text, and still or moving images. Communication by text, such as by email or short message service (SMS), has proven to be quite popular.


However, the small size of these portable communication devices restricts the size of the text input device, such as a physical or virtual keyboard, in the portable device. With a size-restricted keyboard, designers are often forced to make the keys smaller or to overload the keys. Both approaches may lead to typing mistakes and thus to more backtracking to correct those mistakes. This makes the process of inputting text on such devices inefficient and reduces user satisfaction with them.


Accordingly, there is a need for more efficient ways of entering text into portable devices.


SUMMARY

The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed device that includes a text input interface that provides word recommendations.


According to some embodiments, a computer-implemented method may be performed at a portable electronic device with a keyboard and a touch screen display. The method includes: in a first area of the touch screen display, displaying a current character string being input by a user with the keyboard; in a second area of the touch screen display, displaying the current character string or a portion thereof and a suggested replacement character string for the current character string; replacing the current character string in the first area with the suggested replacement character string if the user activates a key on the keyboard associated with a delimiter; replacing the current character string in the first area with the suggested replacement character string if the user performs a first gesture on the suggested replacement character string displayed in the second area; and keeping the current character string in the first area if the user performs a second gesture on the current character string or the portion thereof displayed in the second area.


According to some embodiments, a graphical user interface on a portable electronic device with a keyboard and a touch screen display includes a first area of the touch screen display that displays a current character string being input by a user with the keyboard, and a second area of the touch screen display that displays the current character string or a portion thereof and a suggested replacement character string for the current character string. The current character string in the first area is replaced with the suggested replacement character string if the user activates a key on the keyboard associated with a delimiter. The current character string in the first area is replaced with the suggested replacement character string if the user performs a gesture on the suggested replacement character string in the second area. The current character string in the first area is kept if the user performs a gesture on the current character string or the portion thereof displayed in the second area.


According to some embodiments, a portable electronic device includes a touch screen display, one or more processors, memory, and a program. The program is stored in the memory and configured to be executed by the one or more processors. The program includes: instructions for displaying, in a first area of the touch screen display, a current character string being input by a user with the keyboard; instructions for displaying, in a second area of the touch screen display, the current character string and a suggested replacement character string for the current character string; instructions for replacing the current character string in the first area with the suggested replacement character string if the user activates a key on the keyboard associated with a delimiter; instructions for replacing the current character string in the first area with the suggested replacement character string if the user performs a first gesture on the suggested replacement character string displayed in the second area; and instructions for keeping the current character string in the first area if the user performs a second gesture on the current character string or the portion thereof displayed in the second area.


According to some embodiments, a computer-program product includes a computer readable storage medium and a computer program mechanism embedded therein. The computer program mechanism includes instructions, which when executed by a portable electronic device with a touch screen display, cause the device to: in a first area of the touch screen display, display a current character string being input by a user with the keyboard; in a second area of the touch screen display, display the current character string or a portion thereof and a suggested replacement character string for the current character string; replace the current character string in the first area with the suggested replacement character string if the user activates a key on the keyboard associated with a delimiter; replace the current character string in the first area with the suggested replacement character string if the user performs a first gesture on the suggested replacement character string displayed in the second area; and keep the current character string in the first area if the user performs a second gesture on the current character string or the portion thereof displayed in the second area.


According to some embodiments, a portable electronic device with a touch screen display includes: means for displaying a current character string being input by a user with the keyboard in a first area of the touch screen display; means for displaying the current character string or a portion thereof and a suggested replacement character string for the current character string in a second area of the touch screen display; means for replacing the current character string in the first area with the suggested replacement character string if the user activates a key on the keyboard associated with a delimiter; means for replacing the current character string in the first area with the suggested replacement character string if the user performs a first gesture on the suggested replacement character string displayed in the second area; and means for keeping the current character string in the first area if the user performs a second gesture on the current character string or the portion thereof displayed in the second area.





BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.



FIG. 1 is a block diagram illustrating a portable electronic device in accordance with some embodiments.



FIG. 2 illustrates a portable electronic device having a touch screen and a soft keyboard in accordance with some embodiments.



FIG. 3 is a flow diagram illustrating a process for providing word recommendations in accordance with some embodiments.



FIGS. 4A-4I illustrate a user interface for providing word recommendations in accordance with some embodiments.



FIGS. 5A-5B illustrate a user interface for showing originally entered text in accordance with some embodiments.





DESCRIPTION OF EMBODIMENTS

Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.


Embodiments of user interfaces and associated processes for using a portable electronic device are described. In some embodiments, the device is a portable communications device such as a mobile telephone. The user interface may include a click wheel in addition to a touch screen. A click wheel is a physical user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. For simplicity, in the discussion that follows, a portable electronic device (e.g., a cellular telephone that may also contain other functions, such as text messaging, PDA and/or music player functions) that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that the user interfaces and associated processes may be applied to other devices, such as personal digital assistants (PDAs), personal computers and laptops, which may include one or more other physical user-interface devices, such as a click wheel, a keyboard, a mouse and/or a joystick.


The device may support a variety of applications, such as one or more telephone applications, a text messaging application, a word processing application, an email application, a web browsing application, and a music player. The music player may be compatible with one or more file formats, such as MP3 and/or AAC. In an exemplary embodiment, the device includes an iPod music player (iPod trademark of Apple Computer, Inc.).


The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. In embodiments that include a touch screen, one or more functions of the touch screen as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch screen) of the device may support the variety of applications with user interfaces that are intuitive and transparent to a user.


The user interfaces may include one or more keyboard embodiments displayed on a touch screen. The keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user, for example, based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce the probability of user error when selecting one or more icons, and thus one or more symbols, when using the keyboard embodiments.


Attention is now directed to an embodiment of a portable communications device. FIG. 1 is a block diagram illustrating an embodiment of a device 100, such as a portable electronic device having a touch-sensitive display 112. The touch-sensitive display 112 is sometimes called a “touch screen” for convenience. The device 100 may include a memory controller 120, one or more data processors, image processors and/or central processing units 118 and a peripherals interface 116. The memory controller 120, the one or more processors 118 and/or the peripherals interface 116 may be separate components or may be integrated, such as in one or more integrated circuits 104. The various components in the device 100 may be coupled by one or more communication buses or signal lines 103.


If the device 100 includes picture taking capabilities, the peripherals interface 116 may be coupled to an optical sensor 148, such as a CMOS or CCD image sensor. The peripherals interface 116 is also coupled to RF circuitry 108; audio circuitry 110; and/or an input/output (I/O) subsystem 106. The audio circuitry 110 may be coupled to a speaker 142 and a microphone 144. The device 100 may support voice recognition and/or voice replication. The RF circuitry 108 may be coupled to one or more antennas 146 and may allow communication with one or more additional devices, computers and/or servers using a wireless network. The device 100 may support a variety of communications protocols, including code division multiple access (CDMA), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wi-Fi (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Bluetooth, Wi-MAX, a protocol for email, instant messaging, and/or a short message service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document. In an exemplary embodiment, the device 100 may be, at least in part, a mobile phone (e.g., a cellular telephone).


The I/O subsystem 106 may include a touch screen controller 132 and/or other input controller(s) 134. The touch-screen controller 132 is coupled to a touch-sensitive screen or touch sensitive display system 112. The touch screen 112 and touch screen controller 132 may detect contact and any movement or break thereof using any of a plurality of touch sensitivity technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch-sensitive screen 112. A touch-sensitive display in some embodiments of the display system 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference. However, a touch screen in the display system 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output. The touch-sensitive screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch-sensitive screen 112 has a resolution of approximately 168 dpi. The other input controller(s) 134 may be coupled to other input/control devices 114, such as one or more buttons. In some alternate embodiments, input controller(s) 134 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and/or a pointer device such as a mouse. The one or more buttons (not shown) may include an up/down button for volume control of the speaker 142 and/or the microphone 144. The one or more buttons (not shown) may include a push button. A quick press of the push button (not shown) may disengage a lock of the touch screen 112. 
A longer press of the push button (not shown) may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 may be used to implement virtual or soft buttons and/or one or more keyboards.


A touch-sensitive display in some embodiments of the display system 112 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed on May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed on May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed on Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed on Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed on Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed on Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed on Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed on Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed on Mar. 3, 2006. All of these applications are incorporated by reference herein.


In some embodiments, the device 100 may include circuitry for supporting a location determining capability, such as that provided by the Global Positioning System (GPS). In some embodiments, the device 100 may be used to play back recorded music, such as one or more files, such as MP3 files or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.). In some embodiments, the device 100 may include a multi-pin (e.g., 30-pin) connector that is compatible with the iPod.


The device 100 also includes a power system 137 for powering the various components. The power system 137 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices. The device 100 may also include one or more external ports 135 for connecting the device 100 to other devices.


The memory controller 120 may be coupled to memory 102, which may include one or more types of computer readable medium. Memory 102 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory. Memory 102 may store an operating system 122, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 122 may include procedures (or sets of instructions) for handling basic system services and for performing hardware dependent tasks. Memory 102 may also store communication procedures (or sets of instructions) in a communication module 124. The communication procedures may be used for communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 102 may include a display module (or a set of instructions) 125, a contact/motion module (or a set of instructions) 126 to determine one or more points of contact and/or their movement, and a graphics module (or a set of instructions) 128. The graphics module 128 may support widgets, that is, modules or applications with embedded graphics. The widgets may be implemented using JavaScript, HTML, Adobe Flash, or other suitable computer program languages and technologies.


The memory 102 may also include one or more applications 130. Examples of applications that may be stored in memory 102 include telephone applications, email applications, text messaging or instant messaging applications, memo pad applications, address books or contact lists, calendars, picture taking and management applications, and music playing and management applications. The applications 130 may include a web browser (not shown) for rendering pages written in the Hypertext Markup Language (HTML), Wireless Markup Language (WML), or other languages suitable for composing web pages or other online content.


Also included in the memory 102 are a keyboard module (or a set of instructions) 131, a word recommendations module (or a set of instructions) 133, and a dictionary 136. The keyboard module 131 operates one or more soft keyboards. The word recommendations module 133 determines word completion or replacement recommendations for text entered by the user. The dictionary 136 includes a list of words in a language, from which word recommendations are drawn. In some embodiments, the dictionary also includes usage frequency rankings associated with the words in the dictionary.
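The cooperation between the word recommendations module 133 and the frequency-ranked dictionary 136 can be illustrated with a short sketch. The description defers the actual selection procedure to a separately incorporated application, so the matching strategy below (fuzzy matching via Python's `difflib`) and the frequency table are illustrative assumptions only, not the patented procedure:

```python
from difflib import get_close_matches

# Hypothetical dictionary with usage-frequency rankings (higher = more common).
DICTIONARY = {"car": 950, "cat": 900, "cab": 400, "cabinet": 300, "candle": 250}

def suggest_replacements(current: str, limit: int = 4) -> list[str]:
    # Illustrative sketch: fuzzy-match the current character string against
    # dictionary words, then order the matches by descending usage frequency.
    matches = get_close_matches(current.lower(), DICTIONARY, n=10, cutoff=0.5)
    return sorted(matches, key=lambda w: -DICTIONARY[w])[:limit]
```

For the entry "cae", this sketch ranks "car" first, consistent with the example shown in FIGS. 4D-4F.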


Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules. The various modules and sub-modules may be rearranged and/or combined. Memory 102 may include additional modules and/or sub-modules, or fewer modules and/or sub-modules. Memory 102, therefore, may include a subset or a superset of the above identified modules and/or sub-modules. Various functions of the device 100 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.


Attention is now directed towards embodiments of user interfaces and associated processes that may be implemented on the device 100. FIG. 2 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 200. The device 200 includes a touch screen 208. In some embodiments, the touch screen may display one or more trays. A tray is a defined region or area within a graphical user interface. One tray may include a user entry interface, such as a virtual or soft keyboard 210 that includes a plurality of icons. The icons may include one or more symbols. In this embodiment, as well as others described below, a user may select one or more of the icons, and thus, one or more of the corresponding symbols, by making contact or touching the keyboard 210, for example, with one or more fingers 212 (not drawn to scale in the figure). The contact may correspond to the one or more icons. In some embodiments, selection of one or more icons occurs when the user breaks contact with the one or more icons. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (e.g., from left to right, right to left, upward and/or downward) and/or a rolling of a finger (e.g., from right to left, left to right, upward and/or downward) that has made contact with the device 200. In some embodiments, inadvertent contact with an icon may not select a corresponding symbol. For example, a swipe gesture that sweeps over an icon may not select a corresponding symbol if the gesture corresponding to selection is a tap gesture.
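The distinction drawn above between a selecting tap and a non-selecting swipe can be sketched as follows. The coordinate inputs and the travel threshold are assumptions chosen for illustration; the patent does not specify how contacts are classified:

```python
from math import hypot

TAP_MAX_TRAVEL = 10.0  # assumed maximum finger travel (in pixels) for a tap

def classify_gesture(x0: float, y0: float, x1: float, y1: float) -> str:
    # A contact that moves little between touch-down (x0, y0) and lift-off
    # (x1, y1) is treated as a tap; longer travel is treated as a swipe,
    # which would not select an icon whose selection requires a tap.
    return "tap" if hypot(x1 - x0, y1 - y0) <= TAP_MAX_TRAVEL else "swipe"
```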


Alternatively, in some other embodiments, the keyboard may be a physical keyboard that includes a set of push buttons, a keypad, or the like. The physical keyboard is not a part of the touch screen display. The physical keyboard includes keys that correspond to the plurality of icons described above. A user may select one or more of the icons by pushing the corresponding keys on the physical keyboard.


The device 200 may include a display tray 214, which is displayed on the touch screen 208. The display tray 214 may display one or more of the characters and/or symbols that are selected by the user. The device 200 may also include one or more physical buttons, such as the clear, hold and menu buttons shown in FIG. 2. The menu button may be used to navigate to any application in a set of applications that may be executed on the device 200. Alternatively, in some embodiments, the clear, hold, and/or menu buttons are implemented as soft keys in a GUI in touch screen 208.


Attention is now directed to FIG. 3, which illustrates a flow diagram of a process flow 300 for providing word recommendations in accordance with some embodiments. As text is entered by a user on a device, one or more candidate character sequences (suggested replacements) may be provided in response to the entered text. The user may select a candidate character sequence to further extend or to complete the entered text.


A current character string is displayed in a first area of a touch screen of a portable device (302). In some embodiments, the current character string (which is a word, number, symbol, or a combination thereof) is at least a portion of a sequence of characters entered into the device by a user. The user inputs a sequence of characters into the portable device via an input device, such as a keyboard 210, and the device receives and displays the input on the touch screen. In some embodiments, the current character string is the endmost sequence of non-whitespace characters input by the user via the input device and delimited from the rest of the sequence of characters entered by the user by delimiters, such as whitespaces, line breaks, and punctuation.
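The delimiting rule just described can be sketched as follows, assuming the delimiter set listed later in this description (whitespace, line breaks, and terminal punctuation); the exact set is an assumption here:

```python
import re

# Assumed delimiter set: spaces, line breaks, and terminal punctuation.
DELIMITERS = " \n,.;!?"

def current_character_string(entered_text: str) -> str:
    # The current character string is the endmost run of non-delimiter
    # characters, i.e., everything after the last delimiter entered so far.
    parts = re.split("[" + re.escape(DELIMITERS) + "]", entered_text)
    return parts[-1]
```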


The current character string (or a portion thereof) and one or more suggested replacements for the current character string are displayed in a second area (for example, a word selection area 216) of the touch screen (304). The second area may be located between the first area and the keyboard. The one or more suggested replacements, which may be words, numbers, or combinations thereof, are selected from a dictionary 136 for display by the device in accordance with predefined procedures. An example of a procedure for selecting suggested replacements for display is described in U.S. patent application Ser. No. 11/620,641, which is hereby incorporated by reference as background information. The user may take one of a plurality of actions with respect to the current character string and the suggested replacement displayed in the second area. If the user action is activation of a key on the keyboard associated with a delimiter
(306),
the current character string in the first area of the touch screen is replaced with the suggested replacement (308). The delimiter associated with the activated key may be appended to the end of the suggested replacement in the first area. For example, if the activated key is associated with a comma, a comma is appended to the suggested replacement (which replaces the current character string) in the first area. In some embodiments, delimiters include spaces, line breaks (sometimes called line returns), and terminal punctuation (for example, commas, periods, exclamation points, question marks, and semicolons). In other embodiments, delimiters may include a subset of the delimiters listed above, and may optionally include additional delimiters as well.


If the user action is performance of a first gesture on the suggested replacement in the second area of the touch screen
(306), the current character string in the first area of the touch screen is replaced with the suggested replacement (308). In some embodiments, a whitespace is appended to the end of the suggested replacement in the first area. In some embodiments, the first gesture includes one or more taps on the suggested replacement in the second area.


If the user action is performance of a second gesture on the current character string in the second area
(306),
the current character string is maintained in the first area (310). In some embodiments, a whitespace is appended to the end of the current character string in the first area. In some embodiments, the second gesture includes one or more taps on the current character string in the second area.
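The three branches of the process flow (steps 306-310) can be summarized in a single sketch. The action names and the simple string model of the first area are illustrative assumptions, not terminology from the patent:

```python
def handle_action(text: str, current: str, suggestion: str,
                  action: str, delimiter: str = " ") -> str:
    # Illustrative sketch of steps 306-310 for a text buffer whose
    # endmost characters are the current character string.
    prefix = text[: len(text) - len(current)]
    if action == "delimiter_key":
        # Step 308: replace with the suggestion, then append the delimiter.
        return prefix + suggestion + delimiter
    if action == "tap_suggestion":
        # Step 308: replace with the suggestion, then append a whitespace.
        return prefix + suggestion + " "
    if action == "tap_current":
        # Step 310: keep the current string, then append a whitespace.
        return prefix + current + " "
    return text
```

With the entered text ending in "cae" and the suggestion "car", a delimiter key or a tap on the suggestion commits "car", while a tap on the duplicate of "cae" keeps "cae", as in FIGS. 4B-4E.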


In some embodiments, the device displays a plurality of suggested replacements in the word selection area. In these embodiments, the user may select the desired replacement by performing a gesture on the desired replacement. However, if the user activates a key associated with a delimiter, a replacement is selected from amongst the plurality in accordance with one or more default rules. For example, a default rule may be that the highest ranked suggested replacement is selected.
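The default rule just described reduces to a single selection over ranked suggestions. The numeric ranks below are an assumption for illustration:

```python
def select_default(ranked_suggestions: dict[str, int]) -> str:
    # When a delimiter key commits the text without an explicit tap,
    # the highest ranked suggested replacement is chosen by default.
    return max(ranked_suggestions, key=ranked_suggestions.get)
```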


In some embodiments, if the current character string in the first area was replaced with the suggested replacement, the user may review the current character string that was replaced. The user may perform a third gesture on the suggested replacement in the first area. After the third gesture is performed, the (original) current character string is displayed in the first area for a predetermined amount of time. In some embodiments, the third gesture includes one or more taps on the suggested replacement in the first area. Further details regarding reviewing the replaced current character string are described below in relation to FIGS. 5A-5B.


Attention is now directed to FIGS. 4A-4I, which illustrate a user interface for providing word recommendations in accordance with some embodiments. In a portable electronic device 200, text 218 entered by the user via a keyboard 210 or other input may be displayed in a first area, e.g. display tray 214. A cursor or insertion marker 220 may be displayed in the display tray 214 to indicate the insertion position of the next entered character.


The text 218 may include one or more strings separated by one or more delimiters, such as spaces and punctuation. The endmost string in the text 218 may be highlighted as the current character string 222 (FIG. 4B). The current character string 222 may be a complete or incomplete word. The device 200 may display one or more suggested replacements 224 (for example, “car” in FIG. 4D; “car,” “cat,” “cabinet,” and “candle” in FIG. 4F) in a second area, e.g. word selection area 216. A duplicate 226 of the current character string 222 may also be displayed in the word selection area 216. In some embodiments, the suggested replacement(s) and the current character string duplicate 226 are displayed on opposite sides of the word selection area 216. For example, the suggested replacement(s) may be displayed in the left side of the word selection area 216 and the current character string duplicate 226 may be displayed in the right side of the word selection area 216.


The user may perform a gesture (such as a tap on the touch screen) on either the duplicate 226 of the current character string 222 or the suggested replacement 224. If the user taps on the duplicate 226 of the current character string 222 in the word selection area 216 with a finger 212, as indicated by the finger contact area 228 in FIG. 4B, the current character string 222 is left as is in the display tray 214. If the user taps on the suggested replacement 224 in the word selection area 216 with a finger 212, as indicated by the finger contact area 228 in FIG. 4D, the current character string 222 is replaced in the display tray 214 by the suggested replacement 224 (FIG. 4E).
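The keep-versus-replace behavior described above can be illustrated with a minimal sketch. This is not the patent's implementation; the class and method names (`DisplayTray`, `tap_duplicate`, `tap_suggestion`) are illustrative, and the state is modeled as simple strings.

```python
class DisplayTray:
    """Illustrative model of the first area (display tray) state."""

    def __init__(self, text=""):
        self.text = text    # committed text, e.g. "I saw a "
        self.current = ""   # current character string being typed

    def tap_duplicate(self):
        # Tap on the duplicate of the current string: keep it as typed
        # and commit it with a trailing space (cf. FIG. 4C).
        self.text += self.current + " "
        self.current = ""

    def tap_suggestion(self, suggestion):
        # Tap on a suggested replacement: replace the current string
        # and commit the replacement with a trailing space (cf. FIG. 4E).
        self.text += suggestion + " "
        self.current = ""
```

For example, with committed text `"I saw a "` and current string `"cae"`, `tap_suggestion("car")` yields `"I saw a car "`, while `tap_duplicate()` would instead yield `"I saw a cae "`.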


As an example, the current character string 222 "cae" is highlighted, as shown in FIG. 4B. If the user taps the duplicate 226 of the current character string 222 in the word selection area 216, the current character string "cae" is completed and becomes part of the text 218 for which the device 200 is not providing suggested replacements, as shown in FIG. 4C. In some embodiments, a space is added to the end of the completed current character string, as shown in FIG. 4C. In some embodiments, the completed current character string ("cae" in FIG. 4C) is added to the dictionary 136. If the user instead taps the suggested replacement 224 "car" in the word selection area 216 (FIG. 4D), the current character string "cae" is replaced in the display tray 214 with the suggested replacement "car," as shown in FIG. 4E. In some embodiments, a space is added to the end of the suggested replacement in the display tray 214, as shown in FIG. 4E.


Returning to FIG. 4D, if the user hits (as indicated by the finger contact area 228 on the space bar 227) a key on the keyboard 210 that is associated with a delimiter, such as a space bar 227, the current character string 222 in the display tray 214 is replaced by the suggested replacement 224, and the delimiter associated with the key that was hit by the user is appended to the end of the suggested replacement in the display tray 214.
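The delimiter-key behavior above can be sketched as a small function. This is a hedged illustration, not the patent's code; the function name and the `None` convention for "no suggestion available" are assumptions.

```python
def apply_delimiter(text, current_string, suggestion, delimiter):
    """Commit text when a delimiter key (e.g. the space bar) is hit.

    If a suggested replacement exists, it replaces the current string
    and the delimiter is appended to it; otherwise the string is
    committed as typed, followed by the delimiter.
    """
    if suggestion is not None:
        return text + suggestion + delimiter
    return text + current_string + delimiter
```

For instance, `apply_delimiter("I saw a ", "cae", "car", " ")` produces `"I saw a car "`.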


In some embodiments, the device 200 may display a plurality of suggested replacements 224 for a current character string 222 in the word selection area 216, as shown in FIG. 4F. A user may perform a gesture (e.g., a tap) on one of the plurality of suggested replacements to select that suggested replacement. The current character string 222 is replaced with the selected suggested replacement. As an example, in FIG. 4F, suggested replacements for the current character string "cae" include "car," "cat," "cabinet," and "candle." If the user taps on the suggested replacement "cabinet," as indicated by the contact area 228 in the word selection area 216, the current character string "cae" is replaced in the display tray 214 with the selected replacement "cabinet," as shown in FIG. 4G. If the user hits a key on the keyboard 210 that is associated with a delimiter, the current character string 222 in the display tray 214 may be replaced by the suggested replacement 224 in the word selection area 216 that is highest ranked (e.g., "car" in FIG. 4F). In some embodiments, the suggested replacements 224 are displayed in ranking order (ascending or descending, depending on the particular embodiment and/or user preferences) in the word selection area 216, so that the user may identify which suggested replacement is the highest ranked.
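Selection among ranked suggestions might be modeled as follows. This sketch assumes the list is stored highest-ranked first; the function name and the `tapped` parameter are illustrative, not from the patent.

```python
def select_suggestion(suggestions, tapped=None):
    """Return the replacement string to use.

    If the user tapped a specific suggestion, that one is used;
    otherwise (e.g., a delimiter key was hit) fall back to the
    highest-ranked suggestion, here assumed to be first in the list.
    """
    if tapped is not None and tapped in suggestions:
        return tapped
    return suggestions[0] if suggestions else None
```

With the FIG. 4F example, `select_suggestion(["car", "cat", "cabinet", "candle"], "cabinet")` returns `"cabinet"`, while hitting a delimiter with no tap falls back to `"car"`.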


In some embodiments, if the current character string 222 is longer than a predefined length (based on the number of characters), the duplicate 226 of the current character string 222 in the word selection area 216 may show a subset of the characters in the current character string 222. For example, the duplicate 226 may show the first six characters of the current character string 222, as shown in FIG. 4H. As another example, the duplicate 226 may show the first three and the last three characters of the current character string 222.
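The two abbreviation schemes just described can be sketched as below. The six-character limit matches the example above; the ellipsis separating the first three and last three characters is an illustrative choice not specified in the text.

```python
def abbreviate_prefix(s, limit=6):
    """Show only the first `limit` characters of a long string."""
    return s if len(s) <= limit else s[:limit]


def abbreviate_ends(s, limit=6):
    """Show the first three and last three characters of a long string.

    The "..." marker between the halves is an assumed visual cue.
    """
    if len(s) <= limit:
        return s
    return s[:3] + "..." + s[-3:]
```

A short string such as `"cae"` is shown unchanged, while a longer one is truncated under either scheme.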


As shown in FIG. 4I, in some embodiments, the highest ranked suggested replacement 240 is displayed within the space bar 227. If the user performs a predefined gesture on or near the touch screen display (e.g., taps or touches the space bar 227), the current character string 222 is replaced by the replacement string 240 shown in the space bar 227, and the display of the space bar 227 is then returned to its normal or default status (e.g., blank, or with the word "space" displayed in the space bar (see FIG. 4H)). It is noted that the space bar 227 corresponds to a delimiter (i.e., a space). In some of these embodiments, only the highest ranked suggested replacement is presented to the user, and thus any other corrections must be made manually by the user. If the user performs a second gesture with respect to the touch screen display, such as tapping any key of the keyboard other than the space bar 227, the current character string 222 is retained.


The embodiments of the invention, as described above, provide an intuitive way to integrate explicit word selection (via suggested word replacements in the second area), implicit word selection (e.g., via the space bar or other delimiter keys), and explicit non-selection of suggested word replacements (via keeping the current word, e.g., for words with unusual spellings).


In some embodiments, the device 200 may allow the user to review strings replaced by user-selected suggested replacements. Attention is now directed to FIGS. 5A and 5B, which illustrate a user interface for reviewing the originally entered strings that were replaced by suggested replacements. A user may perform a gesture over a word 229 in the entered text 218. For example, the user may tap the word 229 on the touch screen with a finger 212, as indicated by the contact area 228 in the display tray 214. If the word 229 (FIG. 5A) was a replacement for some originally entered text, the originally entered text 230 may be displayed (FIG. 5B). Alternately, the originally entered text may be displayed if the user's finger hovers over the word 229 for at least a threshold period of time (e.g., 0.5 seconds, 1.0 second, or a value between 0.35 and 1.25 seconds). In some embodiments, the originally entered text 230 is displayed in place of the word 229 for a predetermined amount of time, such as 2 seconds. After the time has elapsed, the word 229 is displayed back in its place unless an undo gesture (e.g., a tap on the original text) is performed, in which case the originally entered text 230 is durably restored. In some other embodiments, the originally entered text 230 is displayed in a balloon graphic or the like extending from the word 229.
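The review-and-undo flow above can be sketched with a small state object. This is an assumption-laden illustration: a real implementation would drive the reveal window with a UI timer, whereas here the current time is passed in explicitly, and all names (`ReplacedWord`, `tap`, `displayed`) are hypothetical.

```python
class ReplacedWord:
    """Illustrative state for a word that replaced originally entered text."""

    def __init__(self, shown, original, reveal_seconds=2.0):
        self.shown = shown                    # e.g. "car"
        self.original = original              # e.g. "cae"
        self.reveal_seconds = reveal_seconds  # predetermined display time
        self.revealed_at = None               # time of the reveal gesture

    def tap(self, now):
        if self.revealed_at is not None and now - self.revealed_at < self.reveal_seconds:
            # Undo gesture during the reveal window: durably restore
            # the originally entered text.
            self.shown = self.original
            self.revealed_at = None
        else:
            # First gesture: reveal the originally entered text
            # for reveal_seconds.
            self.revealed_at = now

    def displayed(self, now):
        if self.revealed_at is not None and now - self.revealed_at < self.reveal_seconds:
            return self.original
        return self.shown
```

A tap reveals `"cae"` in place of `"car"` for two seconds; a second tap within that window restores `"cae"` durably, otherwise `"car"` reappears.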


The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims
  • 1. An electronic device, comprising: a display; one or more processors; memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: concurrently displaying, on the display: a current character string in a first area; and a plurality of suggested replacement character strings in a second area distinct from the first area; detecting a first input; in accordance with a determination that the first input corresponds to one of the plurality of suggested replacement character strings: replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings; and ceasing to display the plurality of suggested replacement character strings in the second area of the display; and subsequent to replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings and in response to detecting a second input, displaying, on the display, one or more suggested replacement character strings in a third area of the display different from the first area of the display and the second area of the display, wherein one of the one or more suggested replacement character strings matches the current character string.
  • 2. The electronic device of claim 1, wherein the third area of the display is a balloon graphic extending from the one of the plurality of suggested replacement character strings.
  • 3. The electronic device of claim 1, wherein the one or more suggested replacement character strings is different from the plurality of suggested replacement character strings.
  • 4. The electronic device of claim 1, wherein the second input corresponds to an undo input.
  • 5. The electronic device of claim 1, wherein the one or more programs further include instructions for: in accordance with the determination that the first input corresponds to one of the plurality of suggested replacement character strings, adding a space after the one of the plurality of suggested replacement character strings in the first area on the display.
  • 6. The electronic device of claim 1, wherein the one or more programs further include instructions for: in accordance with the determination that a third input corresponds to one of the one or more suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings; and ceasing to display the one or more suggested replacement character strings in the third area of the display and the one or more suggested replacement character strings in the second area on the display.
  • 7. The electronic device of claim 1, wherein the one or more programs further include instructions for: in accordance with the determination that a third input corresponds to one of the plurality of suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings, and foregoing adding a space after the one or more suggested replacement character strings in the second area on the display.
  • 8. The electronic device of claim 1, wherein the first input comprises a finger tap on the second area of the display.
  • 9. The electronic device of claim 1, wherein the first input comprises activation of a key on a keyboard.
  • 10. The electronic device of claim 1, wherein the one or more programs further include instructions for: highlighting the one of the plurality of suggested replacement character strings among the plurality of suggested replacement character strings.
  • 11. The electronic device of claim 1, wherein the one or more programs further include instructions for: detecting an activation of a key on a keyboard that is associated with a delimiter; and in response to detecting the activation of the key on the keyboard that is associated with the delimiter, replacing the current character string in the first area of the display with a suggested replacement character string that is highest ranked among the plurality of suggested replacement character strings displayed in the second area of the display.
  • 12. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and a touch-sensitive surface, the one or more programs including instructions for: concurrently displaying, on the display: a current character string in a first area; and a plurality of suggested replacement character strings in a second area distinct from the first area; detecting, on the display, a first input; in accordance with a determination that the first input corresponds to one of the plurality of suggested replacement character strings: replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings; and ceasing to display the plurality of suggested replacement character strings in the second area of the display; and subsequent to replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings and in response to detecting a second input, displaying, on the display, one or more suggested replacement character strings in a third area of the display different from the first area of the display and the second area of the display, wherein one of the one or more suggested replacement character strings matches the current character string.
  • 13. The non-transitory computer-readable storage medium of claim 12, wherein the third area of the display is a balloon graphic extending from the one of the plurality of suggested replacement character strings.
  • 14. The non-transitory computer-readable storage medium of claim 12, wherein the one or more suggested replacement character strings is different from the plurality of suggested replacement character strings.
  • 15. The non-transitory computer-readable storage medium of claim 12, wherein the second input corresponds to an undo input.
  • 16. The non-transitory computer-readable storage medium of claim 12, wherein the one or more programs further include instructions for: in accordance with the determination that the first input corresponds to one of the plurality of suggested replacement character strings, adding a space after the one of the plurality of suggested replacement character strings in the first area on the display.
  • 17. The non-transitory computer-readable storage medium of claim 12, wherein the one or more programs further include instructions for: in accordance with the determination that a third input corresponds to one of the one or more suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings; and ceasing to display the one or more suggested replacement character strings in the third area of the display and the one or more suggested replacement character strings in the second area on the display.
  • 18. The non-transitory computer-readable storage medium of claim 12, wherein the one or more programs further include instructions for: in accordance with the determination that a third input corresponds to one of the plurality of suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings, and foregoing adding a space after the one or more suggested replacement character strings in the second area on the display.
  • 19. The non-transitory computer-readable storage medium of claim 12, wherein the first input comprises a finger tap on the second area of the display.
  • 20. The non-transitory computer-readable storage medium of claim 12, wherein the first input comprises activation of a key on a keyboard.
  • 21. The non-transitory computer-readable storage medium of claim 12, wherein the one or more programs further include instructions for: highlighting the one of the plurality of suggested replacement character strings among the plurality of suggested replacement character strings.
  • 22. The non-transitory computer-readable storage medium of claim 12, wherein the one or more programs further include instructions for: detecting an activation of a key on a keyboard that is associated with a delimiter; and in response to detecting the activation of the key on the keyboard that is associated with the delimiter, replacing the current character string in the first area of the display with a suggested replacement character string that is highest ranked among the plurality of suggested replacement character strings displayed in the second area of the display.
  • 23. A method, comprising: at an electronic device with a touch-sensitive surface and display: concurrently displaying, on the display: a current character string in a first area; and a plurality of suggested replacement character strings in a second area distinct from the first area; detecting, on the display, a first input; in accordance with a determination that the first input corresponds to one of the plurality of suggested replacement character strings: replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings; and ceasing to display the plurality of suggested replacement character strings in the second area of the display; and subsequent to replacing display of the current character string in the first area of the display with the one of the plurality of suggested replacement character strings and in response to detecting a second input, displaying, on the display, one or more suggested replacement character strings in a third area of the display different from the first area of the display and the second area of the display, wherein one of the one or more suggested replacement character strings matches the current character string.
  • 24. The method of claim 23, wherein the third area of the display is a balloon graphic extending from the one of the plurality of suggested replacement character strings.
  • 25. The method of claim 23, wherein the one or more suggested replacement character strings is different from the plurality of suggested replacement character strings.
  • 26. The method of claim 23, wherein the second input corresponds to an undo input.
  • 27. The method of claim 23, further comprising: in accordance with the determination that the first input corresponds to one of the plurality of suggested replacement character strings, adding a space after the one of the plurality of suggested replacement character strings in the first area on the display.
  • 28. The method of claim 23, further comprising: in accordance with the determination that a third input corresponds to one of the one or more suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings; and ceasing to display the one or more suggested replacement character strings in the third area of the display and the one or more suggested replacement character strings in the second area on the display.
  • 29. The method of claim 23, further comprising: in accordance with the determination that a third input corresponds to one of the plurality of suggested replacement character strings: replacing display of the one of the plurality of suggested replacement character strings in the first area of the display with the one or more suggested replacement character strings, and foregoing adding a space after the one or more suggested replacement character strings in the second area on the display.
  • 30. The method of claim 23, wherein the first input comprises a finger tap on the second area of the display.
  • 31. The method of claim 23, wherein the first input comprises activation of a key on a keyboard.
  • 32. The method of claim 23, further comprising: highlighting the one of the plurality of suggested replacement character strings among the plurality of suggested replacement character strings.
  • 33. The method of claim 23, further comprising: detecting an activation of a key on a keyboard that is associated with a delimiter; and in response to detecting the activation of the key on the keyboard that is associated with the delimiter, replacing the current character string in the first area of the display with a suggested replacement character string that is highest ranked among the plurality of suggested replacement character strings displayed in the second area of the display.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/781,574, filed Feb. 4, 2020, which is a continuation of U.S. patent application Ser. No. 15/003,773, filed Jan. 21, 2016, which is a continuation of U.S. patent application Ser. No. 13/310,586, filed Dec. 2, 2011, now U.S. Pat. No. 9,244,536, which is a continuation of U.S. patent application Ser. No. 11/620,642, filed Jan. 5, 2007, now U.S. Pat. No. 8,074,172, the entire contents of which are incorporated herein by reference. This application is related to U.S. patent application Ser. No. 11/620,641, filed Jan. 5, 2007, now U.S. Pat. No. 7,957,955, entitled “Method and System for Providing Word Recommendations for Text Input,” and U.S. patent application Ser. No. 13/310,592, filed Dec. 2, 2011, now U.S. Pat. No. 9,189,079, entitled “Method, System, and Graphical User Interface for Providing Word Recommendations,” the entire contents of which are incorporated herein by reference.

US Referenced Citations (292)
Number Name Date Kind
5038401 Inotsume Aug 1991 A
5053758 Cornett et al. Oct 1991 A
5128672 Kaehler Jul 1992 A
5253325 Clark Oct 1993 A
5297041 Kushler et al. Mar 1994 A
5305205 Weber et al. Apr 1994 A
5539839 Bellegarda et al. Jul 1996 A
5565894 Bates et al. Oct 1996 A
5581484 Prince Dec 1996 A
5615378 Nishino et al. Mar 1997 A
5710831 Beernink et al. Jan 1998 A
5736974 Selker Apr 1998 A
5748512 Vargas May 1998 A
5748927 Stein et al. May 1998 A
5758314 Mckenna May 1998 A
5765168 Burrows Jun 1998 A
5774834 Visser Jun 1998 A
5778405 Ogawa Jul 1998 A
5797008 Burrows Aug 1998 A
5801941 Bertram Sep 1998 A
5805165 Thorne et al. Sep 1998 A
5818437 Grover et al. Oct 1998 A
5818451 Bertram et al. Oct 1998 A
5896321 Miller et al. Apr 1999 A
5943443 Itonori et al. Aug 1999 A
5953541 King et al. Sep 1999 A
5956021 Kubota et al. Sep 1999 A
5963671 Comerford et al. Oct 1999 A
5999895 Forest Dec 1999 A
6023536 Visser Feb 2000 A
6040824 Maekawa et al. Mar 2000 A
6049326 Beyda et al. Apr 2000 A
6073036 Heikkinen et al. Jun 2000 A
6094197 Buxton et al. Jul 2000 A
6169538 Nowlan et al. Jan 2001 B1
6212412 Rogers et al. Apr 2001 B1
6259436 Moon et al. Jul 2001 B1
6271835 Hoeksma Aug 2001 B1
6292179 Lee Sep 2001 B1
6295052 Kato et al. Sep 2001 B1
6298321 Karlov et al. Oct 2001 B1
6307548 Flinchem et al. Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6340967 Maxted Jan 2002 B1
6359572 Vale Mar 2002 B1
6424338 Anderson Jul 2002 B1
6456952 Nathan Sep 2002 B1
6469722 Kinoe et al. Oct 2002 B1
6470347 Gillam Oct 2002 B1
6570557 Westerman et al. May 2003 B1
6573844 Venolia et al. Jun 2003 B1
6597345 Hirshberg Jul 2003 B2
6636163 Hsieh Oct 2003 B1
6654733 Goodman et al. Nov 2003 B1
6671856 Gillam Dec 2003 B1
6675169 Bennett et al. Jan 2004 B1
6677932 Westerman Jan 2004 B1
6707942 Cortopassi et al. Mar 2004 B1
6714221 Christie et al. Mar 2004 B1
6760580 Robinson et al. Jul 2004 B2
6795059 Endo Sep 2004 B2
6801190 Longe et al. Oct 2004 B1
6803905 Capps et al. Oct 2004 B1
6804677 Shadmon et al. Oct 2004 B2
6857800 Zhang et al. Feb 2005 B2
6926609 Martin Aug 2005 B2
6938220 Shigematsu et al. Aug 2005 B1
7038659 Rajkowski May 2006 B2
7057607 Mayoraz et al. Jun 2006 B2
7075512 Fabre et al. Jul 2006 B1
7098896 Kushler et al. Aug 2006 B2
7177797 Micher et al. Feb 2007 B1
7194699 Thomson et al. Mar 2007 B2
7277088 Robinson et al. Oct 2007 B2
7283072 Plachta et al. Oct 2007 B1
7319957 Robinson et al. Jan 2008 B2
7382358 Kushler et al. Jun 2008 B2
7443316 Lim Oct 2008 B2
7475063 Datta et al. Jan 2009 B2
7477240 Yanagisawa Jan 2009 B2
7490034 Finnigan et al. Feb 2009 B2
7508324 Suraqui Mar 2009 B2
7526738 Ording et al. Apr 2009 B2
7565380 Venkatachary Jul 2009 B1
7584093 Potter et al. Sep 2009 B2
7609179 Diaz-Gutierrez et al. Oct 2009 B2
7614008 Ording Nov 2009 B2
7669149 Dietl et al. Feb 2010 B2
7679534 Kay et al. Mar 2010 B2
7683886 Willey Mar 2010 B2
7694231 Kocienda et al. Apr 2010 B2
7707026 Liu Apr 2010 B2
7712053 Bradford et al. May 2010 B2
7725838 Williams May 2010 B2
7793228 Mansfield et al. Sep 2010 B2
7797269 Rieman et al. Sep 2010 B2
7809744 Nevidomski et al. Oct 2010 B2
7880730 Robinson et al. Feb 2011 B2
7957955 Christie et al. Jun 2011 B2
8037034 Plachta et al. Oct 2011 B2
8041557 Liu Oct 2011 B2
8074172 Kocienda et al. Dec 2011 B2
8090571 Elshishiny et al. Jan 2012 B2
8094941 Rowley et al. Jan 2012 B1
8175389 Matic et al. May 2012 B2
8232973 Kocienda et al. Jul 2012 B2
8310461 Morwing et al. Nov 2012 B2
8370737 Zahavi et al. Feb 2013 B2
8884905 Morwig et al. Nov 2014 B2
8938688 Bradford et al. Jan 2015 B2
8994660 Neels et al. Mar 2015 B2
9086802 Kocienda et al. Jul 2015 B2
9111139 Morwing et al. Aug 2015 B2
9633191 Fleizach et al. Apr 2017 B2
9921744 Ha et al. Mar 2018 B2
9928651 Mariappan Mar 2018 B2
9998888 Chang et al. Jun 2018 B1
10013162 Fleizach et al. Jul 2018 B2
10133397 Smith Nov 2018 B1
20020010726 Rogson Jan 2002 A1
20020015024 Westerman et al. Feb 2002 A1
20020015064 Robotham et al. Feb 2002 A1
20020019731 Masui et al. Feb 2002 A1
20020051018 Yeh May 2002 A1
20020067854 Reintjes et al. Jun 2002 A1
20020085037 Leavitt et al. Jul 2002 A1
20020107896 Ronai Aug 2002 A1
20020126097 Savolainen Sep 2002 A1
20020135615 Lang Sep 2002 A1
20020140679 Wen Oct 2002 A1
20020140680 Lu Oct 2002 A1
20020156615 Takatsuka et al. Oct 2002 A1
20020167545 Kang et al. Nov 2002 A1
20020191029 Gillespie et al. Dec 2002 A1
20030038788 Demartines et al. Feb 2003 A1
20030063073 Geaghan et al. Apr 2003 A1
20030090467 Hohl et al. May 2003 A1
20030149978 Plotnick Aug 2003 A1
20030189553 Goren Oct 2003 A1
20030193481 Sokolsky Oct 2003 A1
20030197736 Murphy Oct 2003 A1
20030204392 Finnigan et al. Oct 2003 A1
20040009788 Mantyjarvi et al. Jan 2004 A1
20040021691 Dostie Feb 2004 A1
20040070567 Longe et al. Apr 2004 A1
20040135774 La Monica Jul 2004 A1
20040140956 Kushler et al. Jul 2004 A1
20040157586 Robinson et al. Aug 2004 A1
20040160419 Padgitt Aug 2004 A1
20040165924 Griffin Aug 2004 A1
20040178994 Kairls Sep 2004 A1
20040183833 Chua Sep 2004 A1
20040196256 Wobbrock et al. Oct 2004 A1
20040218963 Van Diepen et al. Nov 2004 A1
20040243389 Thomas et al. Dec 2004 A1
20050024341 Gillespie et al. Feb 2005 A1
20050027622 Walker et al. Feb 2005 A1
20050057498 Gentle Mar 2005 A1
20050093826 Huh May 2005 A1
20050099398 Garside et al. May 2005 A1
20050116927 Voelckers Jun 2005 A1
20050131687 Sorrentino Jun 2005 A1
20050162395 Unruh Jul 2005 A1
20050169527 Longe et al. Aug 2005 A1
20050190970 Griffin Sep 2005 A1
20050193351 Huoviala Sep 2005 A1
20050216331 Ahrens et al. Sep 2005 A1
20050237311 Nakajima Oct 2005 A1
20050246365 Lowles et al. Nov 2005 A1
20050253816 Himberg et al. Nov 2005 A1
20050253818 Nettamo Nov 2005 A1
20050278647 Leavitt et al. Dec 2005 A1
20050283364 Longe et al. Dec 2005 A1
20050283726 Lunati Dec 2005 A1
20060004744 Nevidomski et al. Jan 2006 A1
20060007174 Shen Jan 2006 A1
20060044278 Fux et al. Mar 2006 A1
20060052885 Kong Mar 2006 A1
20060053387 Ording Mar 2006 A1
20060062461 Longe et al. Mar 2006 A1
20060066590 Ozawa et al. Mar 2006 A1
20060085757 Andre et al. Apr 2006 A1
20060152496 Knaven Jul 2006 A1
20060161846 Van Leeuwen Jul 2006 A1
20060181519 Vernier et al. Aug 2006 A1
20060190256 Stephanick et al. Aug 2006 A1
20060206454 Forstall et al. Sep 2006 A1
20060246955 Nirhamo et al. Nov 2006 A1
20060247915 Bradford et al. Nov 2006 A1
20060265208 Assadollahi Nov 2006 A1
20060265648 Rainisto et al. Nov 2006 A1
20060274051 Longe Dec 2006 A1
20060288024 Braica Dec 2006 A1
20060293880 Elshishiny et al. Dec 2006 A1
20070040813 Kushler et al. Feb 2007 A1
20070046641 Lim Mar 2007 A1
20070061754 Ardhanari et al. Mar 2007 A1
20070067272 Flynt et al. Mar 2007 A1
20070130128 Garg et al. Jun 2007 A1
20070146340 Webb et al. Jun 2007 A1
20070152978 Kocienda et al. Jul 2007 A1
20070198566 Sustik Aug 2007 A1
20070229323 Plachta et al. Oct 2007 A1
20070229476 Huh Oct 2007 A1
20070257896 Huh Nov 2007 A1
20070260595 Beatty et al. Nov 2007 A1
20070285958 Platchta et al. Dec 2007 A1
20070288449 Datta et al. Dec 2007 A1
20080059876 Hantler et al. Mar 2008 A1
20080098456 Alward et al. Apr 2008 A1
20080167858 Christie et al. Jul 2008 A1
20080168366 Kocienda et al. Jul 2008 A1
20080259022 Mansfield et al. Oct 2008 A1
20080270118 Kuo et al. Oct 2008 A1
20080304890 Shin et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20090174667 Kocienda et al. Jul 2009 A1
20090228842 Westerman et al. Sep 2009 A1
20090249198 Davis et al. Oct 2009 A1
20090327977 Bachfischer et al. Dec 2009 A1
20100023318 Lemoine Jan 2010 A1
20100036655 Cecil et al. Feb 2010 A1
20100177056 Kocienda et al. Jul 2010 A1
20100188357 Kocienda et al. Jul 2010 A1
20100188358 Kocienda et al. Jul 2010 A1
20100192086 Kocienda et al. Jul 2010 A1
20100235780 Westerman et al. Sep 2010 A1
20100246964 Matic et al. Sep 2010 A1
20100309147 Fleizach et al. Dec 2010 A1
20100325588 Reddy et al. Dec 2010 A1
20110279379 Morwing et al. Nov 2011 A1
20120079373 Kocienda et al. Mar 2012 A1
20120079412 Kocienda et al. Mar 2012 A1
20120089632 Zhou et al. Apr 2012 A1
20120216113 Li Aug 2012 A1
20120216141 Li et al. Aug 2012 A1
20130034303 Morwing et al. Feb 2013 A1
20130120274 Ha et al. May 2013 A1
20130251249 Huo et al. Sep 2013 A1
20140035823 Khoe et al. Feb 2014 A1
20140035851 Kim et al. Feb 2014 A1
20140044357 Moorthy et al. Feb 2014 A1
20140129931 Hashiba May 2014 A1
20140244234 Huang et al. Aug 2014 A1
20140267072 Andersson et al. Sep 2014 A1
20140270529 Sugiura et al. Sep 2014 A1
20140285460 Morwing et al. Sep 2014 A1
20140361983 Dolfing et al. Dec 2014 A1
20140363074 Dolfing et al. Dec 2014 A1
20140363082 Dixon et al. Dec 2014 A1
20140363083 Xia et al. Dec 2014 A1
20140365949 Xia et al. Dec 2014 A1
20150040213 Fleizach et al. Feb 2015 A1
20150067488 Liu Mar 2015 A1
20150100537 Grieves et al. Apr 2015 A1
20150161463 Morwing et al. Jun 2015 A1
20150169948 Motoi Jun 2015 A1
20150169975 Kienzle et al. Jun 2015 A1
20150193141 Goldsmith et al. Jul 2015 A1
20150234588 Andersson et al. Aug 2015 A1
20150235097 Wang et al. Aug 2015 A1
20150248235 Offenberg et al. Sep 2015 A1
20150268768 Woodhull et al. Sep 2015 A1
20150294145 Bouaziz et al. Oct 2015 A1
20150310267 Nicholson et al. Oct 2015 A1
20150317069 Clements et al. Nov 2015 A1
20150317078 Kocienda et al. Nov 2015 A1
20150324011 Czelnik et al. Nov 2015 A1
20150370529 Zambetti et al. Dec 2015 A1
20150370779 Dixon et al. Dec 2015 A1
20160019201 Qian et al. Jan 2016 A1
20160139805 Kocienda et al. May 2016 A1
20160179225 Black et al. Jun 2016 A1
20160259548 Ma Sep 2016 A1
20160274686 Alonso Ruiz et al. Sep 2016 A1
20160357752 Yang et al. Dec 2016 A1
20170010802 Xia et al. Jan 2017 A1
20170017835 Dolfing et al. Jan 2017 A1
20170075878 Jon et al. Mar 2017 A1
20170115875 Ha et al. Apr 2017 A1
20170300559 Fallah Oct 2017 A1
20170351420 Rigouste Dec 2017 A1
20170357438 Dixon et al. Dec 2017 A1
20170359302 Van Os et al. Dec 2017 A1
20180091732 Wilson et al. Mar 2018 A1
20180173415 Xia et al. Jun 2018 A1
20190163359 Dixon et al. May 2019 A1
20190332259 Xia et al. Oct 2019 A1
20200057556 Dixon et al. Feb 2020 A1
20200174658 Xia et al. Jun 2020 A1
20200174663 Kocienda et al. Jun 2020 A1
20210124485 Dixon et al. Apr 2021 A1
Foreign Referenced Citations (61)
Number Date Country
101123044 Feb 2008 CN
102243570 Nov 2011 CN
102449640 May 2012 CN
102722240 Oct 2012 CN
103365446 Oct 2013 CN
104951175 Sep 2015 CN
105247540 Jan 2016 CN
880090 Nov 1998 EP
1271295 Jan 2003 EP
1674976 Jun 2006 EP
2031485 Mar 2009 EP
2336871 Jun 2011 EP
2367097 Sep 2011 EP
2386984 Nov 2011 EP
2386984 Jun 2013 EP
2650766 Oct 2013 EP
3065083 Sep 2016 EP
2332293 Jun 1999 GB
2337349 Nov 1999 GB
2351639 Jan 2001 GB
2380583 Apr 2003 GB
1220276 Apr 2017 HK
8-249122 Sep 1996 JP
9-81320 Mar 1997 JP
10-91346 Apr 1998 JP
11-53093 Feb 1999 JP
2000-29630 Jan 2000 JP
2001-521793 Nov 2001 JP
2002-518721 Jun 2002 JP
2002-222039 Aug 2002 JP
2003-216312 Jul 2003 JP
2005-341411 Dec 2005 JP
2009-289188 Dec 2009 JP
2013-206141 Oct 2013 JP
2014-56389 Mar 2014 JP
2015-501022 Jan 2015 JP
2015-97103 May 2015 JP
2015-148946 Aug 2015 JP
2016-24684 Feb 2016 JP
10-1417286 Jul 2014 KR
10-2016-0003112 Jan 2016 KR
10-2016-0065174 Jun 2016 KR
199833111 Jul 1998 WO
200038041 Jun 2000 WO
200038042 Jun 2000 WO
2003098417 Nov 2003 WO
2004051392 Jun 2004 WO
2005006442 Jan 2005 WO
2005008899 Jan 2005 WO
2006003590 Jan 2006 WO
2006115825 Nov 2006 WO
2006115946 Nov 2006 WO
2008005304 Jan 2008 WO
2010117505 Oct 2010 WO
2010117505 Jan 2011 WO
2013048880 Apr 2013 WO
2014166114 Oct 2014 WO
2014200736 Dec 2014 WO
2014205648 Dec 2014 WO
2015094587 Jun 2015 WO
2015122885 Aug 2015 WO
Non-Patent Literature Citations (177)
Entry
Advisory Action received for U.S. Appl. No. 15/003,773, dated Sep. 4, 2018, 6 pages.
Advisory Action received for U.S. Appl. No. 15/003,773, dated Sep. 12, 2019, 6 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/781,574, dated Mar. 17, 2021, 2 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/781,574, dated Nov. 18, 2020, 5 pages.
Call Centre, “Word Prediction”, The CALL Centre & Scottish Executive Education Dept., 1999, pp. 63-73.
Casario M., "Hands on Macromedia World: Touch Screen Keypad for Mobile Phone by DoCoMo", available at: http://casario.blogs.com/mmworld/2005/10/touch_screen_ke.html, retrieved on Nov. 18, 2005, 1 page.
Centroid, Online available at: http://faculty.evansville.edu/ck6/tcenters/class/centroid.html, Apr. 28, 2006, 1 page.
Centroid, Online available at: http://www.pballew.net/centroid.html, Apr. 28, 2006, 3 pages.
Compare Keyboards with the Keyboard Compatibility Chart, Learn more About Alternative Keyboards, Solutions for Humans, available at: http://www.keyalt.com/kkeybrdp.htm, Dec. 8, 2005, 5 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 16/265,676, dated Aug. 26, 2019, 4 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 15/614,276, dated Dec. 10, 2018, 2 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 16/663,070, dated Nov. 5, 2020, 2 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 16/663,070, dated Nov. 18, 2020, 2 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 16/663,070, dated Oct. 20, 2020, 2 pages.
Day B., "Will Cell Phones Render iPods Obsolete?", available at: http://weblogs.java.net/pub/wlg/883, Dec. 12, 2005, 3 pages.
Decision to Grant received for Danish Patent Application No. PA201670624, dated Feb. 5, 2018, 3 pages.
Decision to Grant received for Danish Patent Application No. PA201670626, dated Mar. 21, 2018, 2 pages.
Decision to Grant received for European Patent Application No. 17173810.7, dated Apr. 4, 2019, 2 pages.
Decision to Grant received for European Patent Application No. 17211174.2, dated Aug. 29, 2019, 2 pages.
Devices, Technology Loan Catalog, available at: http://www.tsbvi.edu/outreach/techloan/catalog.html, retrieved on Dec. 8, 2005, 10 pages.
dyslexic.com, "AlphaSmart 3000 with CoWriter SmartApplet: Don Johnston Special Needs", Available at: http://www.dyslexic.com/products.php?catid=2&pid=465&PHPSESSID=2511b800000f7da, retrieved on Dec. 6, 2005, pp. 1-13.
Extended European Search Report received for European Patent Application No. 17173810.7, dated Oct. 17, 2017, 24 pages.
Extended European Search Report received for European Patent Application No. 17211174.2, dated Mar. 27, 2018, 13 pages.
Extended European Search Report received for European Patent Application No. 19171354.4, dated Sep. 23, 2019, 11 pages.
Fastap, Digit wireless, available at: http://www.digitwireless.com/about/faq.html, Dec. 6, 2005, 5 pages.
Fastap, Keypads Redefine Mobile Phones, Digit Wireless, available at: http://www.digitwireless.com, retrieved on Nov. 18, 2005, 10 pages.
Final Office Action received for U.S. Appl. No. 11/459,615, dated Dec. 8, 2009, 12 pages.
Final Office Action received for U.S. Appl. No. 11/549,624, dated Apr. 10, 2009, 9 pages.
Final Office Action received for U.S. Appl. No. 11/549,624, dated Feb. 1, 2010, 9 pages.
Final Office Action received for U.S. Appl. No. 11/620,641, dated Jun. 25, 2010, 31 pages.
Final Office Action received for U.S. Appl. No. 11/620,642, dated Nov. 29, 2010, 14 pages.
Final Office Action received for U.S. Appl. No. 11/961,663 dated Mar. 17, 2011, 24 pages.
Final Office Action received for U.S. Appl. No. 12/505,382, dated Jul. 9, 2012, 35 pages.
Final Office Action received for U.S. Appl. No. 12/505,382, dated May 3, 2012, 27 pages.
Final Office Action received for U.S. Appl. No. 13/559,495, dated Sep. 8, 2014, 7 pages.
Final Office Action received for U.S. Appl. No. 14/800,378, dated Sep. 7, 2018, 18 pages.
Final Office Action received for U.S. Appl. No. 15/003,773, dated May 10, 2018, 12 pages.
Final Office Action received for U.S. Appl. No. 15/003,773, dated May 30, 2019, 11 pages.
Final Office Action received for U.S. Appl. No. 16/781,574, dated Jan. 14, 2021, 13 pages.
Four-Button Keyboard, WikiPodlinux, available at: http://ipodlinux.org/Four_Button_Keyboard, retrieved on Dec. 5, 2005, 2 pages.
Glossary of Adaptive Technologies: Word Prediction, available at: http://www.utoronto.ca/atrc/reference/techwordpred.html, retrieved on Dec. 6, 2005, pp. 1-5.
Hardy Ed, “Apple Adds iTunes Wi-Fi Music Store to iPhone”, Brighthand, available online at: http://www.brighthand.com/printArticle.asp?newsID=13379, Sep. 28, 2007, 1 page.
Intention to Grant received for Danish Patent Application No. PA201670624, dated Oct. 17, 2017, 2 pages.
Intention to grant received for Danish Patent Application No. PA201670626, dated Jan. 26, 2018, 2 pages.
Intention to Grant received for European Patent Application No. 17173810.7, dated Nov. 21, 2018, 8 pages.
Intention to Grant received for European Patent Application No. 17211174.2, dated Apr. 9, 2019, 7 pages.
Intention to Grant received for European Patent Application No. 17211174.2, dated Aug. 20, 2019, 6 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/060119, dated Jul. 8, 2008, 13 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/088872, dated Jul. 7, 2009, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/088873, dated Jul. 7, 2009, 6 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/060119, dated Apr. 11, 2008, 18 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088872, dated May 8, 2008, 8 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088873, dated May 8, 2008, 7 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/088904, dated Sep. 15, 2008, 17 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2008/050426, dated Jun. 13, 2008, 10 pages.
Introducing the Ultimate Smartphone Keypad, Delta II™ Keypads, available at: http://www.chicagologic.com, retrieved on Nov. 18, 2005, 2 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2007/060119, dated Jan. 2, 2008, 9 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2007/088904, dated Jun. 23, 2008, 8 pages.
LG Develops New Touch Pad Cell Phones, Textually, available at: http://textually.org/textually/archives/2005/06/009903.html, retrieved on Nov. 18, 2005, 1 page.
Mactech, “Keystrokes 3.5 for Mac OS X Boosts Word Prediction”, available at: http://www.mactech.com/news/?p=1007129, retrieved on Jan. 7, 2008, pp. 1-3.
Masui Toshiyuki, “POBox: An Efficient Text Input Method for Handheld and Ubiquitous Computers”, Proceedings of the 1st International Symposium on Handheld and Ubiquitous Computing, 1999, 12 pages.
Microsoft's New-Smart Phone Interface: Your Thumb, Textually, available at: http://www.textually.org, retrieved on Nov. 18, 2005, 2 pages.
Mobile Tech News, "T9 Text Input Software Updated", available at: http://www.mobiletechnews.com/info/2004/11/23/122155.html, Nov. 23, 2004, 4 pages.
NCIP, “NCIP Library: Word Prediction Collection”, Available at: http://www2.edc.org/ncip/library/wp/toc.htm, 1998, 4 pages.
NCIP, “What is Word Prediction?”, available at: http://www2.edc.org/NCIP/library/wp/what_is.htm, 1998, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,217, dated May 11, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 11/228,737, dated Mar. 19, 2009, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,606, dated May 28, 2009, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,615, dated Apr. 13, 2010, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/459,615, dated May 22, 2009, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/549,624, dated Jul. 22, 2009, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 11/549,624, dated Sep. 30, 2008, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,641, dated Nov. 20, 2009, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,642, dated Feb. 18, 2011, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 11/620,642, dated Mar. 30, 2010, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/961,663, dated Nov. 18, 2010, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/165,554, dated Nov. 21, 2011, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 12/505,382, dated Jan. 5, 2012, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,219, dated Feb. 17, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,220, dated Feb. 16, 2012, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/727,221, dated Feb. 16, 2012, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 13/220,202, dated Jun. 12, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/310,586, dated Jul. 9, 2015, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/310,592, dated Jun. 22, 2015, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/559,495, dated Dec. 16, 2013, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 13/559,495, dated Dec. 7, 2012, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 14/800,378, dated Feb. 23, 2018, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/003,773, dated Dec. 11, 2018, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 15/003,773, dated Oct. 5, 2017, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/614,276, dated Jul. 12, 2018, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 16/781,574, dated Sep. 15, 2020, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 90/012,892, dated Aug. 5, 2014, 54 pages.
Non-Final Office Action received for U.S. Appl. No. 90/012,892, dated Mar. 27, 2015, 102 pages.
Notice of Acceptance received for Australian Patent Application No. 2017203816, dated Jul. 30, 2018, 3 pages.
Notice of Acceptance received for Australian Patent Application No. 2018260930, dated Nov. 11, 2019, 3 pages.
Notice of Allowance received for Japanese Patent Application No. 2017-108227, dated Feb. 4, 2019, 4 pages.
Notice of Allowance received for Korean Patent Application No. 10-2017-0068927, dated Feb. 25, 2019, 3 pages.
Notice of Allowance received for Korean Patent Application No. 10-2019-0054454, dated Oct. 24, 2019, 3 pages.
Notice of Allowance received for U.S. Appl. No. 11/459,606, dated Dec. 18, 2009, 7 pages.
Notice of Allowance received for U.S. Appl. No. 11/549,624, dated Jun. 3, 2010, 6 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,641, dated Apr. 13, 2011, 6 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,641, dated Mar. 18, 2011, 12 pages.
Notice of Allowance received for U.S. Appl. No. 11/620,642, dated Oct. 24, 2011, 8 pages.
Notice of Allowance received for U.S. Appl. No. 12/165,554, dated Apr. 2, 2012, 9 pages.
Notice of Allowance received for U.S. Appl. No. 13/220,202, dated Nov. 25, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/310,586, dated Sep. 14, 2015, 9 pages.
Notice of Allowance received for U.S. Appl. No. 13/310,592, dated Jul. 15, 2015, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Aug. 15, 2013, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Dec. 12, 2014, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Jun. 25, 2013, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/559,495, dated Mar. 13, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/003,773, dated Nov. 15, 2019, 5 pages.
Notice of Allowance received for U.S. Appl. No. 15/614,276, dated Jan. 17, 2019, 4 pages.
Notice of Allowance received for U.S. Appl. No. 15/614,276, dated Oct. 31, 2018, 11 pages.
Notice of Allowance received for U.S. Appl. No. 16/265,676, dated Jul. 3, 2019, 11 pages.
Notice of Allowance received for U.S. Appl. No. 16/663,070, dated Sep. 3, 2020, 10 pages.
Notice of Allowance received for U.S. Appl. No. 16/781,574, dated Apr. 29, 2021, 6 pages.
Notice of Intent received for U.S. Appl. No. 90/012,892, dated Sep. 17, 2015, 10 pages.
Office Action received for Australian Patent Application No. 2007342164, dated Apr. 1, 2010, 2 pages.
Office Action received for Australian Patent Application No. 2017203816, dated Feb. 12, 2018, 3 pages.
Office Action received for Australian Patent Application No. 2018260930, dated Jun. 26, 2019, 3 pages.
Office Action received for Australian Patent Application No. 2020200191, dated Sep. 25, 2020, 3 pages.
Office Action received for Chinese Patent Application No. 200780006621.9, dated Aug. 16, 2010, 4 pages.
Office Action received for Chinese Patent Application No. 200780052020.1, dated Nov. 25, 2010, 14 pages.
Office Action received for Chinese Patent Application No. 201710424212.6, dated Oct. 28, 2019, 20 pages.
Office Action received for Chinese Patent Application No. 201710424212.6, dated Sep. 9, 2020, 6 pages.
Office Action received for Chinese Patent Application No. 201711258408.9, dated Jun. 23, 2020, 14 pages.
Office Action received for Danish Patent Application No. PA201670624, dated Jun. 28, 2017, 3 pages.
Office Action received for Danish Patent Application No. PA201670624, dated Oct. 20, 2016, 8 pages.
Office Action received for Danish Patent Application No. PA201670626, dated Jun. 30, 2017, 3 pages.
Office Action received for Danish Patent Application No. PA201670626, dated Oct. 24, 2016, 8 pages.
Office Action received for Danish Patent Application No. PA201770921, dated Apr. 26, 2019, 6 pages.
Office Action received for Danish Patent Application No. PA201770921, dated Dec. 6, 2018, 6 pages.
Office Action received for Danish Patent Application No. PA201770921, dated May 3, 2018, 3 pages.
Office Action received for European Patent Application No. 07709955.4, dated Jul. 31, 2009, 6 pages.
Office Action Received for European Patent Application No. 07709955.4, dated Oct. 10, 2008, 5 pages.
Office Action received for European Patent Application No. 07869922.0, dated Dec. 7, 2010, 5 pages.
Office Action received for European Patent Application No. 07869922.0, dated May 26, 2010, 5 pages.
Office Action received for European Patent Application No. 07869923.8, dated May 26, 2010, 4 pages.
Office Action received for Japanese Patent Application No. 2017-108227, dated Aug. 6, 2018, 8 pages.
Office Action received for Japanese Patent Application No. 2019-040836, dated Aug. 14, 2020, 4 pages.
Office Action received for Japanese Patent Application No. 2019-040836, dated May 15, 2020, 5 pages.
Office Action received for Korean Patent Application No. 10-2008-7019114, dated Aug. 31, 2010, 4 pages.
Office Action received for Korean Patent Application No. 10-2017-0068927, dated Jun. 11, 2018, 14 pages.
Office Action received for Korean Patent Application No. 10-2019-0054454, dated May 20, 2019, 7 pages.
Office Action received for Korean Patent Application No. 10-2020-0010129, dated Jul. 27, 2020, 11 pages.
Office Action received for Taiwanese Patent Application No. 097100079, dated Apr. 17, 2012, 34 pages.
Office Action received in Japanese Patent Application No. 2008-549646, dated Apr. 27, 2011, 4 pages.
O'Neal, “Smart Phones with Hidden Keyboards”, Available at: http://msc.com/4250-6452_16-6229969-1.html, Nov. 18, 2005, 3 pages.
P900 User Guide, Sony Ericsson Mobile Communications AB, XP002479719, Available at: http://www.sonyericsson.com/downloads/P900_UG_R1b_EN.pdf, 2003, pp. 8, 16, 17, 20, 24-26, 42-45, 137; 98 pages.
Plaisant C., “Touchscreen Toggle Design”, Available at: http://www.youtube.com/watch?v=wFWbdxicvK0, retrieved on Nov. 15, 2013, 2 pages.
Pogue David, “iPhone: The Missing Manual”, Aug. 2007, 306 pages.
Samsung Releases Keyboard Phone in US, Textually, available at: http://www.textually.org/textually/archives/2005/11/010482.htm, retrieved on Nov. 18, 2005, 1 page.
Search Report received for Danish Patent Application No. PA201770921, dated Jan. 23, 2018, 7 pages.
Sears et al., "Data Entry for Mobile Devices Using Soft Keyboards: Understanding the Effects of Keyboard Size and User Tasks", Abstract, Int'l Journal of Human-Computer Interaction, vol. 16, No. 2, 2003, 1 page.
Summons to Attend Oral Proceedings received for European Patent Application No. 07709955.4, dated May 11, 2010, 8 pages.
T9® Text Input for Keypad Devices, Available at: http://tegic.com, Nov. 18, 2005, 1 page.
Text Input (legacy), WikiPodlinux, Available at: http://ipodlinux.org/TextInput_%28legacy%29, retrieved on Dec. 5, 2005, 8 pages.
Text Input Concepts, WikiPodlinux, Available at: http://web.archive.org/web/20051211165254/http://ipodlinux.org/Text_Input_Concepts, retrieved on Dec. 5, 2005, 3 pages.
Text Input Methods, WikiPodlinux, Available at: http://ipodlinux.org/Text_Input_Methods, retrieved on Dec. 5, 2005, 5 pages.
Third Party Rejection received for U.S. Appl. No. 90/012,892, dated Jun. 14, 2013, 681 pages.
Warren Tom, “Microsoft Android Wear keyboard”, Online Available at: https://www.youtube.com/watch?v=_lu7bUKKrJE, Oct. 11, 2014, 4 pages.
You Heard of Touch Screens, Now Check Out Touch Keys, Phoneyworld, Available at: http://www.phoneyworld.com, retrieved on Nov. 18, 2005, 2 pages.
Office Action received for Australian Patent Application No. 2020273352, dated Nov. 15, 2021, 5 pages.
Office Action received for Japanese Patent Application No. 2020-205139, dated Nov. 12, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Notice of Acceptance received for Australian Patent Application No. 2020200191, dated Nov. 30, 2020, 3 pages.
Notice of Allowance received for Chinese Patent Application No. 201710424212.6, dated Mar. 4, 2021, 4 pages (1 page of English Translation and 3 pages of Official Copy).
Notice of Allowance received for Chinese Patent Application No. 201711258408.9, dated Mar. 12, 2021, 2 pages (1 page of English Translation and 1 page of Official Copy).
Notice of Allowance received for Korean Patent Application No. 10-2020-0010129, dated Dec. 1, 2020, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201711258408.9, dated Jan. 4, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Office Action received for European Patent Application No. 19171354.4, dated Apr. 14, 2021, 8 pages.
Office Action received for Korean Patent Application No. 10-2021-0024638, dated May 6, 2021, 9 pages (4 pages of English Translation and 5 pages of Official Copy).
Notice of Allowance received for Japanese Patent Application No. 2019-040836, dated Nov. 26, 2021, 16 pages (1 page of English Translation and 15 pages of Official Copy).
Notice of Allowance received for Korean Patent Application No. 10-2021-0024638, dated Nov. 22, 2021, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Intention to Grant received for European Patent Application No. 19171354.4, dated Mar. 24, 2022, 8 pages.
Office Action received for Australian Patent Application No. 2020273352, dated Jan. 21, 2022, 3 pages.
Office Action received for Chinese Patent Application No. 202110446637.3, dated Dec. 20, 2021, 19 pages (9 pages of English Translation and 10 pages of Official Copy).
Related Publications (1)
Number Date Country
20210342064 A1 Nov 2021 US
Continuations (4)
Number Date Country
Parent 16781574 Feb 2020 US
Child 17376774 US
Parent 15003773 Jan 2016 US
Child 16781574 US
Parent 13310586 Dec 2011 US
Child 15003773 US
Parent 11620642 Jan 2007 US
Child 13310586 US