Portable electronic device including touch-sensitive display and method of controlling same

Information

  • Patent Grant
  • Patent Number
    8,659,569
  • Date Filed
    Wednesday, August 1, 2012
  • Date Issued
    Tuesday, February 25, 2014
Abstract
A method includes displaying a first keyboard on a touch-sensitive display of an electronic device, detecting a moving touch on the first keyboard, and, as the touch moves, changing the first keyboard into a second keyboard by moving keys of the first keyboard relative to other keys of the first keyboard, from first locations, along respective key paths, to second locations on the touch-sensitive display.
Description
FIELD OF TECHNOLOGY

The present disclosure relates to electronic devices including but not limited to portable electronic devices having touch-sensitive displays and their control.


BACKGROUND

Electronic devices, including portable electronic devices, have gained widespread use and may provide a variety of functions including, for example, telephonic, electronic messaging, and other personal information manager (PIM) application functions. Portable electronic devices include several types of devices, including mobile stations such as simple cellular telephones, smart telephones (smart phones), Personal Digital Assistants (PDAs), tablet computers, and laptop computers with wireless network communications or near-field communications connectivity such as Bluetooth® capabilities.


Portable electronic devices such as PDAs or tablet computers are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability. A touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and may have limited space for user input and output. The information displayed on the display may be modified depending on the functions and operations being performed.


Improvements in electronic devices with touch-sensitive displays are desirable.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure will now be described, by way of example only, with reference to the attached Figures, wherein:



FIG. 1 is a block diagram of a portable electronic device in accordance with an example;



FIG. 2 is a flowchart illustrating an example of a method of changing a keyboard displayed on an electronic device; and



FIG. 3 through FIG. 9 are front views illustrating one example of changing a keyboard displayed on an electronic device in accordance with the method of FIG. 2.





DETAILED DESCRIPTION

The following describes an electronic device and method including displaying a first keyboard on a touch-sensitive display of an electronic device, detecting a touch on the first keyboard, and when the touch is associated with a keyboard transformation function, changing the first keyboard into a second keyboard by moving keys of the first keyboard relative to other keys of the first keyboard, from first locations, along respective key paths, to second locations on the touch-sensitive display.


For simplicity and clarity of illustration, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. Numerous details are set forth to provide an understanding of the examples described herein. The examples may be practiced without these details. In other instances, well-known methods, procedures, and components are not described in detail to avoid obscuring the examples described. The description is not to be considered as limited to the scope of the examples described herein.


The disclosure generally relates to an electronic device, such as a portable electronic device as described herein. Examples of electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, tablet computers, mobile internet devices, electronic navigation devices, and so forth. The portable electronic device may also be a portable electronic device without wireless communication capabilities, such as a handheld electronic game device, digital photograph album, digital camera, media player, e-book reader, and so forth.


A block diagram of an example of a portable electronic device 100 is shown in FIG. 1. The electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of the portable electronic device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 104. Data received by the portable electronic device 100 is decompressed and decrypted by a decoder 106. The communication subsystem 104 receives messages from and sends messages to a wireless network 150. The wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications. A power source 142, such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100.


The processor 102 interacts with other components, such as a Random Access Memory (RAM) 108, memory 110, a touch-sensitive display 118, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132 and other device subsystems 134. The touch-sensitive display 118 includes a display 112 and touch sensors 114 that are coupled to at least one controller 116 that is utilized to interact with the processor 102. Input via a graphical user interface is provided via the touch-sensitive display 118. Information, such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102. Optionally, the processor may interact with one or more force sensors 122. The processor 102 may also interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.


To identify a subscriber for network access, the portable electronic device 100 may utilize a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150. Alternatively, user identification information may be programmed into memory 110.


The portable electronic device 100 includes an operating system 146 and software programs, applications, or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.


A received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102. The processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124. A subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104. For voice communications, the overall operation of the portable electronic device 100 is similar. The speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.


The touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth. A capacitive touch-sensitive display includes one or more capacitive touch sensors 114. The capacitive touch sensors may comprise any suitable material, such as indium tin oxide (ITO).


One or more touches, also known as touch contacts or touch events, may be detected by the touch-sensitive display 118. The processor 102 may determine attributes of the touch, including a location of the touch. Touch location data may include data for an area of contact or data for a single point of contact, such as a point at or near a center of the area of contact. The location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118. A touch may be detected from any suitable input member, such as a finger, thumb, appendage, or other objects, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
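
By way of illustration only, reducing a reported area of contact to a single point at or near its center, with x horizontal and y vertical, might look like the following Python sketch; the ContactArea type and its fields are assumptions, not part of the device described.

```python
from dataclasses import dataclass

@dataclass
class ContactArea:
    """Bounding box of a detected area of contact, in display pixels (illustrative)."""
    left: float
    top: float
    right: float
    bottom: float

def touch_point(area: ContactArea) -> tuple[float, float]:
    """Reduce an area of contact to a single point at or near its center.

    The x component is horizontal and the y component is vertical with
    respect to the view of the touch-sensitive display.
    """
    return ((area.left + area.right) / 2.0, (area.top + area.bottom) / 2.0)

# Example: a fingertip contact roughly 40 x 48 pixels in size.
print(touch_point(ContactArea(left=100, top=200, right=140, bottom=248)))  # (120.0, 224.0)
```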


One or more gestures may also be detected by the touch-sensitive display 118. A gesture, such as a swipe, also known as a flick, is a particular type of touch on a touch-sensitive display 118 and may begin at an origin point and continue to an end point, for example, a concluding end of the gesture. A gesture may be identified by attributes of the gesture, including the origin point, the end point, the distance travelled, the duration, the velocity, and the direction, for example. A gesture may be long or short in distance and/or duration. Two points of the gesture may be utilized to determine a direction of the gesture. A gesture may also include a hover. A hover may be a touch at a location that is generally unchanged over a period of time or is associated with the same selection item for a period of time.
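
The gesture attributes listed above (origin point, end point, distance travelled, duration, velocity, direction, and a hover as a touch that is generally unchanged over a period of time) could be derived from raw touch samples roughly as follows; the sample format and the hover thresholds are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float
    y: float
    t: float  # time of the sample, in seconds

def gesture_attributes(samples: list[TouchSample]) -> dict:
    """Derive gesture attributes from a sequence of touch samples.

    Origin and end points, distance travelled, duration, velocity, and
    direction are computed; a gesture whose location is essentially
    unchanged over its duration is treated as a hover. The numeric
    thresholds are assumptions, not values from the description.
    """
    origin, end = samples[0], samples[-1]
    dx, dy = end.x - origin.x, end.y - origin.y
    distance = math.hypot(dx, dy)
    duration = end.t - origin.t
    velocity = distance / duration if duration > 0 else 0.0
    direction = math.degrees(math.atan2(dy, dx))  # 0 degrees = rightward
    is_hover = distance < 10.0 and duration > 0.5  # assumed hover thresholds
    return {
        "origin": (origin.x, origin.y),
        "end": (end.x, end.y),
        "distance": distance,
        "duration": duration,
        "velocity": velocity,
        "direction": direction,
        "hover": is_hover,
    }

swipe = [TouchSample(50, 400, 0.00), TouchSample(60, 340, 0.05), TouchSample(70, 260, 0.10)]
print(gesture_attributes(swipe))
```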


Optional force sensors 122 may be disposed in conjunction with the touch-sensitive display 118 to determine or react to forces applied to the touch-sensitive display 118. The force sensors 122 may be force-sensitive resistors, strain gauges, piezoelectric or piezoresistive devices, pressure sensors, quantum tunneling composites, force-sensitive switches, or other suitable devices. Force as utilized throughout the specification, including the claims, refers to force measurements, estimates, and/or calculations, such as pressure, deformation, stress, strain, force density, force-area relationships, thrust, torque, and other effects that include force or related quantities. Optionally, force information associated with a detected touch may be utilized to select information, such as information associated with a location of a touch. For example, a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option. Selection options include, for example, displayed or virtual keys of a keyboard; selection boxes or windows, e.g., “cancel,” “delete,” or “unlock”; function buttons, such as play or stop on a music player; and so forth. Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in panning, and a higher force may result in zooming.
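
A hedged sketch of the force-threshold behaviour described above, in which a touch that does not meet the force threshold highlights a selection option and a touch that meets it selects that option; the threshold values and returned strings are assumptions for illustration.

```python
def handle_touch_force(force: float, option: str, select_threshold: float = 1.0) -> str:
    """Map a force reading at a touch location to a behaviour.

    A touch that does not meet the force threshold highlights the selection
    option; a touch that meets the threshold selects or inputs it. The
    threshold value and the returned strings are illustrative assumptions.
    """
    if force < select_threshold:
        return f"highlight {option}"
    return f"select {option}"

# Different magnitudes of force may be mapped to different functions,
# e.g., a lesser force to panning and a higher force to zooming.
def pan_or_zoom(force: float, zoom_threshold: float = 2.0) -> str:
    return "zoom" if force >= zoom_threshold else "pan"

print(handle_touch_force(0.4, "unlock"))  # highlight unlock
print(handle_touch_force(1.3, "unlock"))  # select unlock
print(pan_or_zoom(2.5))                   # zoom
```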


A flowchart illustrating an example of a method of changing a keyboard displayed on an electronic device, such as the electronic device 100, is shown in FIG. 2. The method may be carried out by software executed, for example, by the processor 102 and/or the controller 116. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one controller or processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.


A keyboard is displayed on the touch-sensitive display 118 at 202. The keyboard may be any suitable keyboard such as a QWERTY keyboard, QWERTZ keyboard, AZERTY keyboard, and so forth. The keyboard includes a plurality of keys that are associated with characters that may be entered utilizing the keyboard. The keyboard may be displayed in any suitable application. For example, the keyboard may be displayed for composition of a message in a messaging application. The keyboard may be displayed for entry of information in a data entry field in a Web browser application. The keyboard may be displayed for entry of information in other applications, such as a calendar application, a contacts or address book application, a word processing application, or any other suitable application.
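
For illustration, a keyboard of the kind described, a plurality of keys each associated with a character, can be modelled with a small data structure; this sketch assumes a three-row QWERTY layout and omits modifier keys for brevity.

```python
from dataclasses import dataclass

@dataclass
class Key:
    character: str             # character entered when the key is tapped
    location: tuple[int, int]  # (row, column) within the keyboard

def qwerty_keyboard() -> list[Key]:
    """Build a minimal three-row QWERTY layout as a list of character keys.
    Modifier keys (shift, backspace, space, return) are omitted for brevity."""
    rows = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
    return [Key(ch, (r, c)) for r, row in enumerate(rows) for c, ch in enumerate(row)]

keys = qwerty_keyboard()
print(len(keys), keys[0])  # 26 keys; Key(character='Q', location=(0, 0))
```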


When a touch is detected on the keyboard at 204, the attributes of the touch on the touch-sensitive display 118 are determined. The touch may be a gesture, a multi-touch gesture, a tap, a multi-touch tap, or any other suitable touch. The attributes include, for example, the duration of the touch, the number of touch contacts, the direction of the touch when the touch is a gesture, and so forth.


The touch may be associated with a function, and the function is identified at 206. The function that is associated with the touch is dependent on the attributes of the touch. For example, a gesture on the keyboard may be associated with a keyboard transformation function to change the layout of the keys of the keyboard. A tap on a location associated with one of the keys of the keyboard may be associated with entry of the character associated with that key. Another gesture, such as a swipe from a location on the keyboard in the downward direction, may be associated with a function to hide the keyboard.
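
As an illustration only, and not the claimed implementation, the mapping from touch attributes to functions described above might be sketched as follows; the attribute names and the multi-touch requirement are assumptions for the sketch.

```python
def identify_function(attrs: dict) -> str:
    """Identify the function associated with a touch from its attributes.

    Mirrors the examples above: a gesture on the keyboard may map to the
    keyboard transformation function, a tap on a key maps to entry of that
    key's character, and a downward swipe hides the keyboard. The attribute
    names and the two-contact requirement are illustrative assumptions.
    """
    if attrs.get("type") == "tap" and attrs.get("key") is not None:
        return "enter_character"
    if attrs.get("type") == "gesture":
        if attrs.get("direction") == "up" and attrs.get("contacts", 1) >= 2:
            return "keyboard_transformation"
        if attrs.get("direction") == "down":
            return "hide_keyboard"
    return "no_function"

print(identify_function({"type": "gesture", "direction": "up", "contacts": 2}))  # keyboard_transformation
print(identify_function({"type": "tap", "key": "Q"}))                            # enter_character
```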


When the touch is associated with a keyboard transformation function at 208, the process continues at 210. The keyboard transformation function is a function to change the keyboard layout by changing the locations of the keys, for example, to increase the number of rows of the keyboard, to decrease the number of rows of the keyboard, to increase the number of columns, or to decrease the number of columns. Different keyboard layouts may also include greater or fewer numbers of keys. The locations of the keys are changed by moving keys of the keyboard relative to other keys of the keyboard. The keys that are moved move along their respective key paths. For example, when increasing the number of rows of the keyboard, keys may move along a path from one row to the new row. Other keys may also move along a path from one row to another row. Still other keys may move along a path within the row. The keys of the keyboard may also be resized based on the available display width and based on the number of keys of the keyboard. Additional keys may be added when the number of rows is increased. Alternatively, keys may be removed when the number of rows is decreased.
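
A key path can be thought of as a first location and a second location for each moving key. The following is a minimal Python sketch of that idea, assuming straight-line paths and illustrative coordinates; the description does not limit key paths to straight lines.

```python
from dataclasses import dataclass

@dataclass
class KeyPath:
    """Path of one key from its first location to its second location."""
    label: str
    start: tuple[float, float]  # first location (x, y)
    end: tuple[float, float]    # second location (x, y)

    def position(self, progress: float) -> tuple[float, float]:
        """Location of the key when the transformation is `progress` complete,
        with 0.0 at the first location and 1.0 at the second location."""
        p = max(0.0, min(1.0, progress))
        return (self.start[0] + p * (self.end[0] - self.start[0]),
                self.start[1] + p * (self.end[1] - self.start[1]))

# Example paths: a space key moving down into a new fourth row and a
# backspace key moving from the second row to the third row (coordinates
# are illustrative, not taken from the figures).
paths = [
    KeyPath("space", start=(160, 300), end=(120, 360)),
    KeyPath("backspace", start=(420, 240), end=(440, 300)),
]
for path in paths:
    print(path.label, path.position(0.5))  # key locations when halfway along the paths
```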


At 210, the keys move at a rate or speed that is dependent on the speed of the gesture detected. Thus, a slow gesture may be utilized to move the keys slowly to the new locations. Alternatively, a fast gesture may be utilized to move the keys quickly. The keys move with movement of the touch such that the keys move a distance along their respective key paths based on a location of the touch. When the touch moves farther from an origin or origins of the touch, the keys move farther along their respective key paths. If the direction of the gesture is reversed such that the touch moves in the reverse direction, toward the origin point(s) of the touch, the movement of the keys may be reversed. The distance of movement of the keys may be dependent on the distance of movement of the touch until, for example, the keys are located at their respective end locations along their respective key paths. Thus, further movement of the touch may not result in further movement of the keys.
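
Continuing the sketch, the description above ties key movement to movement of the touch: farther from the origin means farther along the key paths, the movement reverses with the touch, and the keys stop at the ends of their paths. A hedged example of that progress calculation, with an assumed full-travel distance:

```python
def transformation_progress(origin_y: float, current_y: float,
                            full_travel: float = 120.0) -> float:
    """Progress of the keyboard change driven by the moving touch.

    The keys move farther along their key paths as the touch moves farther
    from its origin, reverse when the touch moves back toward the origin,
    and stop once they reach the ends of their paths (progress clamped at
    1.0). An upward gesture decreases the y coordinate in this sketch, so
    travel is measured as origin_y - current_y; full_travel is an assumed
    distance, in pixels, for a completed transformation.
    """
    travel = origin_y - current_y
    return max(0.0, min(1.0, travel / full_travel))

print(transformation_progress(origin_y=400, current_y=340))  # 0.5: keys halfway along their paths
print(transformation_progress(origin_y=400, current_y=420))  # 0.0: touch reversed past its origin
print(transformation_progress(origin_y=400, current_y=200))  # 1.0: keys clamped at their end locations
```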


When the touch ends at 212, the keyboard associated with the last detected location of the touch is displayed at 214. For example, when the touch moves a distance that does not meet a threshold, the keys may return, along their respective key paths, to their starting locations, or locations prior to the touch. When the touch moves a distance that meets or exceeds the threshold, the keys may move to end locations along their respective key paths.
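
The decision at the end of the touch might be sketched as a simple snap based on how far the transformation progressed; the 0.5 threshold below is an assumption, since the description expresses the threshold as a distance of touch movement rather than a fixed fraction.

```python
def keyboard_on_release(progress_at_release: float, threshold: float = 0.5) -> str:
    """Choose which keyboard to display when the touch ends.

    If the touch did not move far enough to meet the threshold, the keys
    return along their key paths to the first keyboard; otherwise they
    complete their paths and the second keyboard is displayed. The 0.5
    threshold is an illustrative assumption.
    """
    return "second_keyboard" if progress_at_release >= threshold else "first_keyboard"

print(keyboard_on_release(0.3))  # first_keyboard: keys return to their starting locations
print(keyboard_on_release(0.8))  # second_keyboard: keys move to their end locations
```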


The movement of the keys along their respective key paths is displayed on the touch-sensitive display 118 when the keyboard is changed. Ready identification of the new locations of keys is facilitated by displaying the movement of the keys during changing of the keyboard.


When the touch is not associated with a keyboard transformation function at 208, the process continues at 216 where a function associated with the touch is performed.


One example of changing a keyboard displayed on an electronic device 100 is illustrated in FIG. 3 through FIG. 9 and described with continued reference to FIG. 2. In the front view of FIG. 3, a first keyboard 302 is displayed on the touch-sensitive display 118 at 202. In the example of FIG. 3, the first keyboard 302 is a QWERTY keyboard and includes three rows 304, 306, 308 of keys 310. The keys 310 of the keyboard are sized such that the rows 304, 306, 308 fit the width of the touch-sensitive display 118 when the touch-sensitive display 118 is in the landscape orientation.


A touch is detected on the keyboard at 204, and the attributes of the touch, including the touch contact locations and the directions of movement on the touch-sensitive display 118, are determined. In the example illustrated in FIG. 4, the touch is a multi-touch gesture, including one touch contact beginning at the location illustrated by the circle 402 and moving upwardly in the direction illustrated by the arrow 404 and another touch contact beginning at the location illustrated by the circle 406 and moving upwardly in the direction illustrated by the arrow 408.


The touch contacts are illustrated by the circles 402, 406 in FIG. 4 through FIG. 9. The touch contacts begin at locations, illustrated in FIG. 4, that are associated with the “S” and “L” keys of the keyboard 302. For the purpose of the example of FIG. 3 through FIG. 9, the “S” and “L” keys are predetermined locations on the keyboard 302 from which an upward gesture is associated with the function to change the layout of the keyboard, referred to as the keyboard transformation function, and the function is identified at 206.


In the example illustrated in FIG. 3 through FIG. 9, the keyboard transformation function is a function to change the keyboard layout by changing the locations of the keys 310, for example, to increase the number of rows of the keyboard to four rows. The keyboard is changed at 210. The change is illustrated in FIG. 5 through FIG. 9.


As illustrated in FIG. 5, each of the rows 304, 306, 308 of keys 310 of the keyboard 302 is moved upwardly, away from the bottom edge 502 of the display area 504 of the touch-sensitive display 118, as the touches, or locations of touch contact, move. The terms up or upwardly and down or downwardly are utilized herein to refer to directions relative to the orientation of the displayed keyboard illustrated in the figures. The rows of keys 310 are moved upwardly with the gesture such that the contact locations, illustrated by the circles 402, 406 in FIG. 4, are locations at which the keyboard 302 is grabbed to move the keys 310. The keys 310 are moved a sufficient distance from the bottom edge 502 to provide space for an additional row of keys 310.


After the keys are moved away from the edge 502, some of the keys drop back down toward the edge 502 as the locations of touch contact move. Not all the keys drop back down toward the edge; the keys move relative to each other, i.e., ones of the keys move relative to other ones of the keys. In this example, the space key 506, the backspace key 508, and the return key 510 drop down such that the space key 506 and the return key 510 move along their respective key paths to a new, or fourth, row, as illustrated in FIG. 5 and FIG. 6. The backspace key 508 moves from the second row 306 to the third row 308. Two new keys are added to the new, or fourth, row 516 of keys: the period, or “.”, key 512 and the comma, or “,”, key 514. The new keys are displayed as entering the display area of the touch-sensitive display 118 from the bottom edge 502.


Movement of the space bar to the fourth row 516 provides additional space in the third row 308. As the locations of touch contact continue to move, the keys in the third row 308 are moved along their respective key paths to new locations in the third row to utilize the space, as illustrated in FIG. 7. The “Z” key 702, the “X” key 704, the “C” key 706, and the “V” key 708 move to the right such that the “Z” key 702 moves away from the left edge 710 of the display area 504 to leave a space between the left edge 710 and the “Z” key 702. The “B” key 712, the “N” key 714, the “M” key 716, and the backspace key 508 move along respective key paths to new locations in the third row 308 to utilize the space left after movement of the space key 506. Keys in the second row 306 may also be moved along their respective key paths to new locations in the second row to utilize the space left after movement of the backspace key 508.


Additionally, the shift key 718 and a key 720 that is associated with a numeric/symbolic keyboard are moved down along their respective key paths such that the key 720 associated with the numeric/symbolic keyboard is moved to the fourth row 516 and the shift key 718 is moved from the second row 306 to the third row 308.


As the locations of touch contact move further, the keys in the first row 304 are moved along their respective key paths to space the keys along the first row 304, as illustrated in FIG. 8. Each of the keys is resized by changing the width of the keys such that the keys are spaced along and generally fill the width of the display area, with small spaces between the keys, as illustrated in FIG. 9, which shows the second keyboard. As illustrated, the key widths in the second keyboard are not all equal. The keys in the first row 304, for example, are not as wide as the keys in the second row 306 and the third row 308. The fourth row 516, which includes the space key 506, the return key 510, and the key 720 associated with the numeric/symbolic keyboard, includes keys of greater width than the remaining keys.
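
To summarize the example of FIG. 3 through FIG. 9 in data form, the transformation can be viewed as a change between two row layouts. The arrangement of modifier keys in this sketch is assumed for illustration and is not taken exactly from the figures:

```python
# Three-row first keyboard 302 (FIG. 3); modifier key placement is assumed.
first_keyboard = [
    list("QWERTYUIOP"),
    list("ASDFGHJKL") + ["SHIFT", "BACKSPACE"],
    list("ZXCVBNM") + ["123/sym", "SPACE", "RETURN"],
]

# Four-row second keyboard (FIG. 9): space, return, and the numeric/symbolic
# key move to the new fourth row 516, backspace and shift move to the third
# row, and the "." and "," keys are added.
second_keyboard = [
    list("QWERTYUIOP"),
    list("ASDFGHJKL"),
    ["SHIFT"] + list("ZXCVBNM") + ["BACKSPACE"],
    ["123/sym", ",", "SPACE", ".", "RETURN"],
]

first_keys = {k for row in first_keyboard for k in row}
second_keys = {k for row in second_keyboard for k in row}
print(sorted(second_keys - first_keys))  # keys added by the transformation: [',', '.']
```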


In this example, when the touch ends, the last detected locations of touch contact are beyond a threshold distance, illustrated by the dashed line 902. Thus, the second keyboard is maintained on the touch-sensitive display 118 when the touch ends.


When a touch ends at locations that do not meet the threshold distance, e.g., locations below the line 902 illustrated in FIG. 9, the keys return, along their respective key paths, to the three-row keyboard illustrated in FIG. 3. A multi-touch gesture, such as the gesture illustrated by the circles 402, 406 in FIG. 4 through FIG. 9, may meet the threshold when one or both touches reach or extend above the line 902. Alternatively, a multi-touch gesture may be determined to meet the threshold only when both touches reach or extend above the line 902.
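
That two-contact decision might be sketched as follows, with the convention (assumed here) that smaller y values are higher on the display:

```python
def multi_touch_meets_threshold(end_ys: list[float], line_y: float,
                                require_all: bool = False) -> bool:
    """Decide whether a multi-touch gesture meets the threshold line.

    As described above, the gesture may be considered to meet the threshold
    when one touch ends at or above the line, or, alternatively, only when
    both (all) touches do. Smaller y is higher on the display in this sketch.
    """
    above = [y <= line_y for y in end_ys]
    return all(above) if require_all else any(above)

print(multi_touch_meets_threshold([180, 260], line_y=200))                    # True: one contact is above the line
print(multi_touch_meets_threshold([180, 260], line_y=200, require_all=True))  # False: the other contact is below
```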


In the example described above with reference to FIG. 3 through FIG. 9, a first keyboard that includes three rows of keys is changed to a second keyboard that includes four rows of keys. The keyboard may include other rows of keys and more rows of keys may be added. The number of rows may also be reduced, for example, from four to three rows.


The method is not limited to the portable electronic device illustrated in the examples. The method may be applied utilizing other electronic devices. The method may also be applied to a keyboard displayed in a portrait orientation.


A method includes displaying a first keyboard on a touch-sensitive display of an electronic device, detecting a touch on the first keyboard, and when the touch is associated with a keyboard transformation function, changing the first keyboard into a second keyboard by moving keys of the first keyboard relative to other keys of the first keyboard, from first locations, along respective key paths, to second locations on the touch-sensitive display.


An electronic device includes a touch-sensitive display and at least one processor coupled to the touch-sensitive display and configured to display a first keyboard on the touch-sensitive display, detect a touch on the first keyboard, and when the touch is associated with a keyboard transformation function, change the first keyboard into a second keyboard by moving keys of the first keyboard relative to other keys of the first keyboard, from first locations, along respective key paths, to second locations on the touch-sensitive display.


More rows may be added to a keyboard such that additional keys may be added to increase the number of characters that may be entered utilizing the keyboard, and/or to increase the size of keys of the keyboard to facilitate selection of the keys. Alternatively, keys may be removed when the number of rows is decreased or the size of keys may be decreased. The movement of the keys along their respective key paths is displayed on the touch-sensitive display when the keyboard is changed. Ready identification of the new locations of keys is facilitated by displaying the movement of the keys during changing of the keyboard. The user may control the movement of the keys, for example, by controlling the speed. A user may also reverse the movement of the keys by reversing the direction of movement of the touch. Thus, the user may follow the movement of the keys to their new locations.


The present disclosure may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the present disclosure is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A method comprising: displaying a first keyboard on a touch-sensitive display of an electronic device; detecting a moving touch associated with the first keyboard; and as the touch moves, changing the first keyboard into a second keyboard by moving keys of the first keyboard from first locations, along a plurality of directions based on respective key paths, to second locations on the touch-sensitive display, wherein: the keys of the first keyboard move at a rate based on the speed of the moving touch, changing the first keyboard into the second keyboard changes spacing between the keys of the first keyboard as compared with the second keyboard, and a space is provided to display an additional key row while displaying all keys of the first keyboard.
  • 2. The method according to claim 1, wherein the moving touch comprises a moving multi-touch.
  • 3. The method according to claim 1, wherein the moving touch comprises a moving touch beginning at a predetermined location for changing the first keyboard.
  • 4. The method according to claim 1, wherein changing comprises resizing the keys.
  • 5. The method according to claim 4, wherein resizing the keys comprises changing a width of the keys.
  • 6. The method according to claim 5, wherein the width of the keys is changed based on the available display width.
  • 7. The method according to claim 1, wherein the keys are arranged in rows and changing the keyboard comprises: moving at least one of the keys from one of the rows to another of the rows; and changing the number of keys included in at least one or more of the rows.
  • 8. The method according to claim 1, wherein the moving touch comprises a gesture in a first direction, beginning on the first keyboard and wherein the keys are moved in the first direction to provide a space for the additional key row.
  • 9. The method according to claim 8, wherein ones of the keys move back in a second direction opposite the first direction, into the additional key row, after being moved in the first direction.
  • 10. The method according to claim 9, wherein changing comprises resizing the keys after the ones of the keys are moved into the additional key row.
  • 11. The method according to claim 9, wherein others of the keys are moved in a third direction different from both the first direction and the second direction after the ones of the keys are moved.
  • 12. The method according to claim 8, wherein further keys that were not included in the first keyboard are added to the additional key row.
  • 13. The method according to claim 1, wherein the moving touch comprises a gesture in a direction beginning on the first keyboard to remove a key row.
  • 14. The method according to claim 1, wherein the moving touch comprises a gesture beginning on the first keyboard to add or remove a key row.
  • 15. The method according to claim 1, wherein a distance of movement of the keys along respective key paths is based on a distance of movement of the touch.
  • 16. The method according to claim 15, further comprising: while detecting a moving touch associated with the first keyboard, reversing movement of the keys along their respective key paths in response to a change in direction of movement of the touch.
  • 17. The method according to claim 1, comprising detecting an end of the touch and displaying one of the first keyboard and the second keyboard associated with a last-detected location of the touch.
  • 18. The method according to claim 1, comprising detecting an end of the touch and displaying the first keyboard again when the distance moved by the moving touch does not meet a threshold.
  • 19. The method according to claim 18, wherein displaying the first keyboard again comprises moving the keys back along their respective key paths to their starting locations.
  • 20. The method according to claim 18, comprising displaying the second keyboard when the distance moved by the moving touch meets the threshold.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority as a continuation of U.S. patent application Ser. No. 13/485,723, filed May 31, 2012, which claims the benefit of U.S. Provisional Patent Application 61/603,094, filed Feb. 24, 2012, which are hereby incorporated by reference in their entirety.

US Referenced Citations (188)
Number Name Date Kind
3872433 Holmes et al. Mar 1975 A
4408302 Fessel et al. Oct 1983 A
5261009 Bokser Nov 1993 A
5664127 Anderson et al. Sep 1997 A
5832528 Kwatinetz et al. Nov 1998 A
5963671 Comerford et al. Oct 1999 A
6002390 Masui Dec 1999 A
6064340 Croft et al. May 2000 A
6094197 Buxton et al. Jul 2000 A
6223059 Haestrup Apr 2001 B1
6226299 Henson May 2001 B1
6351634 Shin Feb 2002 B1
6646572 Brand Nov 2003 B1
7098896 Kushler et al. Aug 2006 B2
7107204 Liu et al. Sep 2006 B1
7216588 Suess May 2007 B2
7277088 Robinson et al. Oct 2007 B2
7292226 Matsuura et al. Nov 2007 B2
7382358 Kushler et al. Jun 2008 B2
7394346 Bodin Jul 2008 B2
7443316 Lim Oct 2008 B2
7479949 Jobs et al. Jan 2009 B2
7487461 Zhai et al. Feb 2009 B2
7530031 Iwamura et al. May 2009 B2
7661068 Lund Feb 2010 B2
7698127 Trower, II et al. Apr 2010 B2
7886233 Rainisto et al. Feb 2011 B2
8023930 Won Sep 2011 B2
8065624 Morin et al. Nov 2011 B2
8201087 Kay et al. Jun 2012 B2
20020097270 Keely et al. Jul 2002 A1
20020154037 Houston Oct 2002 A1
20020180797 Bachmann Dec 2002 A1
20040111475 Schultz Jun 2004 A1
20040135818 Thomson et al. Jul 2004 A1
20040140956 Kushler et al. Jul 2004 A1
20040153963 Simpson et al. Aug 2004 A1
20050017954 Kay et al. Jan 2005 A1
20050024341 Gillespie et al. Feb 2005 A1
20050039137 Bellwood et al. Feb 2005 A1
20050052425 Zadesky et al. Mar 2005 A1
20050093826 Huh May 2005 A1
20050195173 McKay Sep 2005 A1
20060022947 Griffin et al. Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060053387 Ording Mar 2006 A1
20060176283 Suraqui Aug 2006 A1
20060209040 Garside et al. Sep 2006 A1
20060239562 Bhattacharyay et al. Oct 2006 A1
20060253793 Zhai et al. Nov 2006 A1
20060265648 Rainisto et al. Nov 2006 A1
20060265668 Rainisto Nov 2006 A1
20060279548 Geaghan Dec 2006 A1
20070046641 Lim Mar 2007 A1
20070061753 Ng et al. Mar 2007 A1
20070150842 Chaudhri et al. Jun 2007 A1
20070156394 Banerjee et al. Jul 2007 A1
20070157085 Peters Jul 2007 A1
20070256029 Maxwell Nov 2007 A1
20070263932 Bernardin et al. Nov 2007 A1
20080100581 Fux May 2008 A1
20080122796 Jobs et al. May 2008 A1
20080126387 Blinnikka May 2008 A1
20080136587 Orr Jun 2008 A1
20080141125 Ghassabian Jun 2008 A1
20080158020 Griffin Jul 2008 A1
20080184360 Kornilovsky et al. Jul 2008 A1
20080189605 Kay et al. Aug 2008 A1
20080231610 Hotelling et al. Sep 2008 A1
20080259040 Ording et al. Oct 2008 A1
20080273013 Levine et al. Nov 2008 A1
20080281583 Slothouber et al. Nov 2008 A1
20080304890 Shin et al. Dec 2008 A1
20080309644 Arimoto Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20080318635 Yoon et al. Dec 2008 A1
20090002326 Pihlaja Jan 2009 A1
20090025089 Martin et al. Jan 2009 A1
20090058823 Kocienda Mar 2009 A1
20090058830 Herz et al. Mar 2009 A1
20090066668 Kim et al. Mar 2009 A1
20090077464 Goldsmith et al. Mar 2009 A1
20090085881 Keam Apr 2009 A1
20090094562 Jeong et al. Apr 2009 A1
20090125848 Keohane et al. May 2009 A1
20090132576 Miller et al. May 2009 A1
20090144667 Christoffersson et al. Jun 2009 A1
20090160800 Liu et al. Jun 2009 A1
20090167700 Westerman et al. Jul 2009 A1
20090174667 Kocienda et al. Jul 2009 A1
20090193334 Assadollahi Jul 2009 A1
20090213081 Case, Jr. Aug 2009 A1
20090228792 Van Os et al. Sep 2009 A1
20090228842 Westerman et al. Sep 2009 A1
20090247112 Lundy et al. Oct 2009 A1
20090251410 Mori et al. Oct 2009 A1
20090254818 Jania et al. Oct 2009 A1
20090259962 Beale Oct 2009 A1
20090265669 Kida et al. Oct 2009 A1
20090284471 Longe et al. Nov 2009 A1
20090295737 Goldsmith et al. Dec 2009 A1
20090307768 Zhang et al. Dec 2009 A1
20090313693 Rogers Dec 2009 A1
20100020033 Nwosu Jan 2010 A1
20100020036 Hui et al. Jan 2010 A1
20100045705 Vertegaal et al. Feb 2010 A1
20100052880 Laitinen et al. Mar 2010 A1
20100070908 Mori et al. Mar 2010 A1
20100079413 Kawashima et al. Apr 2010 A1
20100095238 Baudet Apr 2010 A1
20100115402 Knaven et al. May 2010 A1
20100127991 Yee May 2010 A1
20100131900 Spetalnick May 2010 A1
20100141590 Markiewicz et al. Jun 2010 A1
20100156813 Duarte et al. Jun 2010 A1
20100156818 Burrough et al. Jun 2010 A1
20100161538 Kennedy, Jr. et al. Jun 2010 A1
20100197352 Runstedler et al. Aug 2010 A1
20100199176 Chronqvist Aug 2010 A1
20100225599 Danielsson et al. Sep 2010 A1
20100235726 Ording et al. Sep 2010 A1
20100253620 Singhal Oct 2010 A1
20100257478 Longe et al. Oct 2010 A1
20100259482 Ball Oct 2010 A1
20100259561 Forutanpour et al. Oct 2010 A1
20100277424 Chang et al. Nov 2010 A1
20100287486 Coddington Nov 2010 A1
20100292984 Huang et al. Nov 2010 A1
20100295801 Bestle et al. Nov 2010 A1
20100313127 Gosper et al. Dec 2010 A1
20100313158 Lee et al. Dec 2010 A1
20100315266 Gunawardana et al. Dec 2010 A1
20100325721 Bandyopadhyay et al. Dec 2010 A1
20100333027 Martensson et al. Dec 2010 A1
20110010655 Dostie et al. Jan 2011 A1
20110018812 Baird Jan 2011 A1
20110029862 Scott et al. Feb 2011 A1
20110035696 Elazari et al. Feb 2011 A1
20110041056 Griffin et al. Feb 2011 A1
20110043455 Roth et al. Feb 2011 A1
20110060984 Lee Mar 2011 A1
20110061029 Yeh et al. Mar 2011 A1
20110063231 Jakobs et al. Mar 2011 A1
20110078613 Bangalore Mar 2011 A1
20110086674 Rider et al. Apr 2011 A1
20110090151 Huang et al. Apr 2011 A1
20110099505 Dahl Apr 2011 A1
20110099506 Gargi et al. Apr 2011 A1
20110119623 Kim May 2011 A1
20110148572 Ku Jun 2011 A1
20110171617 Yeh et al. Jul 2011 A1
20110179355 Karlsson Jul 2011 A1
20110193797 Unruh Aug 2011 A1
20110202835 Jakobsson et al. Aug 2011 A1
20110202876 Badger et al. Aug 2011 A1
20110209087 Guyot-Sionnest Aug 2011 A1
20110233407 Wu et al. Sep 2011 A1
20110239153 Carter et al. Sep 2011 A1
20110242138 Tribble Oct 2011 A1
20110248945 Higashitani Oct 2011 A1
20110249076 Zhou et al. Oct 2011 A1
20110256848 Bok et al. Oct 2011 A1
20110285656 Yaksick et al. Nov 2011 A1
20110302518 Zhang Dec 2011 A1
20110305494 Kang Dec 2011 A1
20120005576 Assadollahi Jan 2012 A1
20120023447 Hoshino et al. Jan 2012 A1
20120029910 Medlock et al. Feb 2012 A1
20120030566 Victor Feb 2012 A1
20120030623 Hoellwarth Feb 2012 A1
20120036469 Suraqui Feb 2012 A1
20120053887 Nurmi Mar 2012 A1
20120062465 Spetalnick Mar 2012 A1
20120062494 Hsieh et al. Mar 2012 A1
20120068937 Backlund et al. Mar 2012 A1
20120079373 Kocienda et al. Mar 2012 A1
20120092278 Yamano Apr 2012 A1
20120110518 Chan et al. May 2012 A1
20120117506 Koch et al. May 2012 A1
20120119997 Gutowitz May 2012 A1
20120149477 Park et al. Jun 2012 A1
20120159317 Di Cocco et al. Jun 2012 A1
20120166696 Kallio et al. Jun 2012 A1
20120167009 Davidson et al. Jun 2012 A1
20120223959 Lengeling Sep 2012 A1
20120306772 Tan et al. Dec 2012 A1
20120311437 Weeldreyer et al. Dec 2012 A1
20130222255 Pasquero et al. Aug 2013 A1
Foreign Referenced Citations (40)
Number Date Country
101021762 Aug 2007 CN
0844571 May 1998 EP
0880090 Nov 1998 EP
1847917 Oct 2007 EP
1850217 Oct 2007 EP
1909161 Apr 2008 EP
2077491 Jul 2009 EP
2109046 Oct 2009 EP
2128750 Dec 2009 EP
2146271 Jan 2010 EP
2184686 May 2010 EP
2256614 Dec 2010 EP
2282252 Feb 2011 EP
2293168 Mar 2011 EP
2320312 May 2011 EP
2336851 Jun 2011 EP
2402846 Jan 2012 EP
2420925 Feb 2012 EP
2431842 Mar 2012 EP
2011-197782 Oct 2011 JP
2012-68963 Apr 2012 JP
20120030652 Mar 2012 KR
03029950 Apr 2003 WO
03054681 Jul 2003 WO
04001560 Dec 2003 WO
2006100509 Sep 2006 WO
2007068505 Jun 2007 WO
2007076210 Jul 2007 WO
2007134433 Nov 2007 WO
WO2008057785 May 2008 WO
2009019546 Feb 2009 WO
2010035574 Apr 2010 WO
WO2010035574 Apr 2010 WO
WO2010099835 Sep 2010 WO
WO2010112841 Oct 2010 WO
2011073992 Jun 2011 WO
WO2011073992 Jun 2011 WO
2011098925 Aug 2011 WO
WO2011113057 Sep 2011 WO
2012043932 Apr 2012 WO
Non-Patent Literature Citations (81)
Entry
“Features Included in the T-Mobile G1”, http://www.t-mobileg1.com/T-Mobile-G1-Features.pdf, 2009.
BlackBerry Seeker—Freeware—Pattern Lock v1.0.7, http://www.blackberryseeker.com/applications/preview/Pattern-Lock-v107.aspx, Jul. 28, 2009.
Chong et al., Exploring the Use of Discrete Gestures for Authentication, IFIP International Federation for Information Processing, 2009.
European Search Report dated Feb. 28, 2011, issued in European Patent Application No. 10160590.5.
GSMArena—Samsung announce s5600 & s5230 full touch midrange phones, http://www.gsmarena.com/samsung—announce—s5600—and—s5230—full—touch—midrange—phones-news-825.php, Mar. 10, 2009.
Hardware Sphere—Samsung s5600 & s5230 Touchscreen phones, http://hardwaresphere.com/2009/03/09/samsung-s5600-s5230-touchscreen-phones/, Mar. 9, 2009.
International Search Report and Written Opinion issued in International Application No. PCT/IB2011/003273, on Jun. 14, 2012, 8 pages.
iPhone User Guide—for iPhone OS 3.1 Software, 2009 (217 pages).
Madhvanath, Sriganesh, HP-Gesture based computing interfaces, Mar. 2008.
Manual del usuario Samsung Moment™ with Google™, dated May 20, 2012 (224 pages).
Mobile Tech News—Samsung launches new Gesture Lock touchscreen handsets, http://www.mobiletechnews.com/info/2009/03/11/124559.html, Mar. 11, 2009.
Partial European Search Report; Application No. 10160590.5; Sep. 16, 2010.
Sprint Support Tutorial Set the Screen Lock Pattern—Samsung Moment, http://supportsprint.com/support/tutorial/Set—the—Screen—Lock—Pattern—Samsung—Moment/10887-171, date of access: May 31, 2012 (9 pages).
Sprint Support Tutorial Unlock a Forgotten Lock Pattern—Samsung Moment, http://support.sprint.com/support/tutorial/Unlock—a—Forgotten—Lock—Pattern—Samsung—Moment/10887-339, date of access: May 31, 2012 (7 pages).
Support—Sprint Cell Phones SPH-M900—Samsung Cell Phones, http://www.samsung.com/us/support/owners/product/SPH-M900?tabContent-content2, date of access: May 31, 2012 (1 page).
T-Mobile Forum—Help & How to—Hidden Pattern, http://forums.t-mobile.com/tmbl/board/message?board.id=Android3&message.id=3511&query.id=52231#M3511, Oct. 23, 2008.
T-Mobile Forum—Help & How to—Screen Unlock Pattern, http://forums.t-mobile.com/tmbl/board/message?board.id=Android3&message.id=6015&query.id=50827#M6015, Oct. 22, 2008.
T-Mobile launches the highly anticipated T-Mobile G1, Oct. 22, 2008.
U.S. Office Action for U.S. Appl. No. 12/764,298, dated Jul. 20, 2012, 38 pages.
U.S. Office Action for U.S. Appl. No. 13/482,705, dated Aug. 7, 2012, 10 pages.
User Guide Samsung Moment(TM) with Google(TM), dated Dec. 4, 2009 (122 pages).
User Guide Samsung Moment(TM) with Google(TM), dated Mar. 2, 2010 (218 pages).
Conveniently select text, images, annotations, etc. in a PDF or any other text format on a touch based mobile/tablet device, IP.com Journal, Mar. 1, 2011, XP013142665, (10 pages).
Droid X by Motorola © 2010 Screen shots.
Droid X by Motorola © 2010 User Manual (72 pages).
Extended European Search Report dated Aug. 24, 2012, issued in European Application No. 12166115.1 (5 pages).
Extended European Search Report dated Oct. 9, 2012, issued in European Application No. 12166244.9 (6 pages).
Extended European Search Report dated Sep. 10, 2012, issued in European Application No. 12166246.4 (6 pages).
Extended European Search Report dated Sep. 10, 2012, issued in European Application No. 12166247.2 (8 pages).
Extended European Search Report dated Sep. 21, 2012, issued in European Application No. 12164240.9 (6 pages).
Extended European Search Report dated Sep. 25, 2012, issued in European Application No. 11192713.3 (7 pages).
Extended European Search Report dated Sep. 3, 2012, issued in European Application No. 12164300.1 (7 pages).
Google Mobile Help—Editing text, http://support.google.com/mobile/bin/answer.py?hl=en&answer=168926, date of access: Jun. 6, 2012 (2 pages).
International Search Report and Written Opinion issued in International Application No. PCT/EP2012/057944, on Oct. 12, 2012, (10 pages).
International Search Report and Written Opinion mailed Sep. 10, 2012, issued for International Application No. PCT/EP2012/057945 (11 pages).
Merrett, Andy, “iPhone OS 3.0: How to cut, copy and paste text and images”, http://www.iphonic.tv/2009/06/iphone—os—30—how—to—cut—copy—a.html, Jun. 18, 2009, XP002684215, (8 pages).
U.S. Office Action dated Oct. 15, 2012, issued in U.S. Appl. No. 13/560,270 (15 pages).
U.S. Office Action dated Oct. 17, 2012, issued in U.S. Appl. No. 13/563,943 (17 pages).
U.S. Office Action dated Oct. 18, 2012, issued in U.S. Appl. No. 13/563,182 (12 pages).
U.S. Office Action dated Oct. 23, 2012, issued in U.S. Appl. No. 12/764,298 (41 pages).
U.S. Office Action dated Oct. 25, 2012, issued in U.S. Appl. No. 13/459,732 (15 pages).
U.S. Office Action dated Oct. 5, 2012, issued in U.S. Appl. No. 13/447,835 (20 pages).
U.S. Office Action dated Sep. 10, 2012, issued in U.S. Appl. No. 13/524,678 (12 pages).
U.S. Office Action dated Sep. 28, 2012, issued in U.S. Appl. No. 13/494,794 (14 pages).
“Windows Mobile Café—Software (Freeware): Touchpal, Let's Try Tabbing Up to 300 Chars/Min”, Nov. 4, 2007, retrieved from URL:http://windows-mobile-cafe.blogspot.nl/2007/11/software-freeware-touchpal-lets-try.html, accessed online Jan. 18, 2013 (2 pages).
European Examination Report dated Apr. 5, 2013, issued in European Application No. 12180190.6 (7 pages).
European Partial Search Report dated Jan. 16, 2013, issued in European Application No. 12182612.7 (5 pages).
European Partial Search Report dated Mar. 7, 2013, issued in European Application No. 12184574.7 (5 pages).
Extended European Search Report dated Aug. 24, 2012, issued in European Application No. 12172458.7 (6 pages).
Extended European Search Report dated Aug. 31, 2012, issued in European Application No. 12166170.6 (7 pages).
Extended European Search Report dated Feb. 28, 2013, issued in European Application No. 12182610.1 (7 pages).
Extended European Search Report dated Jan. 25, 2013, issued in European Application No. 12166520.2 (8 pages).
Extended European Search Report dated Mar. 8, 2013, issued in European Application No. 12182611.9 (8 pages).
Extended European Search Report dated Nov. 22, 2012, issued in European Application No. 12172892.7 (7 pages).
Extended European Search Report dated Sep. 25, 2012, issued in European Application No. 12176453.4 (7 pages).
Extended European Search Report dated Sep. 25, 2012, issued in European Application No. 12180190.6 (8 pages).
Final Office Action dated Apr. 4, 2013, issued in U.S. Appl. No. 13/447,835 (20 pages).
Final Office Action dated Feb. 1, 2013, issued in U.S. Appl. No. 13/563,943 (17 pages).
Final Office Action dated Feb. 28, 2013, issued in U.S. Appl. No. 13/524,678 (21 pages).
Final Office Action dated Jan. 18, 2013, issued in U.S. Appl. No. 13/482,705 (18 pages).
Final Office Action dated Mar. 15, 2013, issued in U.S. Appl. No. 13/572,232 (36 pages).
iPhone J.D. Typing Letters or Symbols That Are Not on the iPhone Keyboard dated Mar. 19, 2010, accessed “http://www.iphonejd.com/iphone—jd2010/03/typing-letters-or-symbols-that-are-not-on-the-iphone-keyboard.html” on Feb. 26, 2013 (3 pages).
Notice of Allowance dated Mar. 15, 2013, issued in U.S. Appl. No. 13/373,356 (25 pages).
Office Action dated Dec. 28, 2012, issued in U.S. Appl. No. 13/459,301 (22 pages).
Office Action dated Jan. 22, 2013, issued in U.S. Appl. No. 13/564,687 (19 pages).
Office Action dated Jan. 29, 2013, issued in U.S. Appl. No. 13/563,182 (19 pages).
Office Action dated Jan. 7, 2013, issued in U.S. Appl. No. 13/564,070 (21 pages).
Office Action dated Jan. 7, 2013, issued in U.S. Appl. No. 13/564,697 (19 pages).
Office Action dated Mar. 12, 2013, issued in U.S. Appl. No. 13/560,796 (22 pages).
Office Action dated Nov. 14, 2012, issued in U.S. Appl. No. 13/572,232 (24 pages).
Office Action dated Nov. 16, 2012, issued in U.S. Appl. No. 13/554,583 (21 pages).
Office Action dated Nov. 8, 2012, issued in U.S. Appl. No. 13/373,356 (18 pages).
Office Action dated Oct. 26, 2012, issued in U.S. Appl. No. 13/554,436 (22 pages).
PCT International Search Report and Written Opinion dated Jan. 24, 2013, issued in International Application No. PCT/CA2012/050274 (9 pages).
PCT International Search Report and Written Opinion dated Nov. 7, 2012, issued in International Application No. PCT/CA2012/050362 (9 pages).
PCT International Search Report and Written Opinion dated Nov. 8, 2012, issued in International Application No. PCT/CA2012/050405 (12 pages).
Swype Product Features, accessed online at http://www.swype.com/about/specifications/ on Feb. 25, 2013 (2 pages).
U.S. Appl. No. 13/601,736, filed Aug. 31, 2012 (44 pages).
U.S. Appl. No. 13/616,423, filed Sep. 14, 2012 (30 pages).
U.S. Appl. No. 13/773,812, filed Feb. 22, 2013 (94 pages).
Wang, Feng, et al., “Detecting and Leveraging Finger Orientation for Interaction with Direct-Touch Surfaces”, UIST '09, Oct. 4-7, 2009, Victoria, British Columbia, Canada (10 pages).
Related Publications (1)
Number Date Country
20130222256 A1 Aug 2013 US
Provisional Applications (1)
Number Date Country
61603094 Feb 2012 US
Continuations (1)
Number Date Country
Parent 13485723 May 2012 US
Child 13563943 US