The present invention relates generally to mobile computer systems, and more particularly to methods and systems for revealing functions assigned to particular keys on mobile devices such as cellular telephones.
The usage of mobile electronic devices (mobile devices), such as cellular telephones, is ever increasing due to their portability, connectivity and ever increasing computing power. As mobile devices grow in sophistication, the variety and sophistication of application software are increasing, turning mobile devices into multipurpose productivity tools. Yet, the usefulness of mobile devices and their applications is limited by the small area available for the user-interface. Traditional cellular telephones included a simple keypad of fixed configuration. To provide more functionality for mobile devices having fixed keypads, application software frequently assigns functions to the keys which differ from the label on the key (e.g., 1, 2, etc.). However, this solution may leave users unsure about the functionality assigned to each key.
Various embodiment systems and methods reveal a value or function assigned to a key of a computing device based on the position of the user's finger or a pointing device on or near the key. Application software running on the computing device determines the current meaning (i.e., the value or function) assigned to the key. The meaning of the key is presented in a portion of the display area. The current meaning of the key may be managed by a keypad protocol operating as part of the system software. Applications control the description of the key function or value (i.e., the current meaning of the key) that is presented on the display in response to the key being touched or nearly touched.
The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments of the invention, and, together with the general description given above and the detailed description given below, serve to explain features of the invention.
The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.
In this description, the term “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
As used herein, the term “computing device” refers to any programmable computer device including a display and a keyboard or keypad. In the description of the embodiments, reference is made to “mobile devices,” which are but one type of computing device that implements the various embodiments. As used herein, the terms “mobile handsets” and “mobile devices” are used interchangeably and refer to any one of various cellular telephones, personal data assistants (PDAs), palm-top computers, laptop computers with wireless modems, wireless electronic mail receivers (e.g., the Blackberry® and Treo® devices), multimedia Internet enabled cellular telephones (e.g., the iPhone®), and similar computing devices. A mobile device may include a programmable processor and memory as described more fully below with reference to
As used herein, the term “keypad” refers to any of a variety of user interfaces in which a user presses a button or key in order to communicate to a mobile device that a function associated with the key should be implemented. Examples of keypads encompassed within the following description include the number keypads of conventional cellular telephones, miniature keyboards implemented on a variety of mobile devices, external keypads and keyboards which may be electronically coupled to a mobile device (e.g., via a wired or wireless data link), computer keyboards, and musical keyboards which may be coupled to a personal computer, mobile device or other computing device. For ease of description, the figures depict and the descriptions refer to the keypad of a typical cellular telephone. However, these descriptions and illustrations are for example only, and are not intended to limit the scope of the description or the claims to a particular keypad configuration.
As used herein, the terms “touch” and “touch-sensitive” are intended to encompass close proximity as well as actual physical touching of a key. The “touch-sensitive” keypads described herein may also (or alternatively) be able to sense close proximity of a finger, stylus or other object. Thus, the use of “touch” and “touch-sensitive” in the following description should not be interpreted as being limited to requiring physical touching or as excluding close-proximity sensitive keypads. As used herein, the term “near touch” refers to a close proximity event, as when a user brings a finger into close proximity with a close-proximity sensitive key.
The various embodiments enable a mobile device to sense the close proximity or touch of a user's finger or stylus to a key and display for the user a description of the function assigned to a particular key by an application. By displaying the function assigned to a key without requiring the user to press the key, mobile device applications can assign a variety of different functions to keys on a fixed keypad without requiring users to memorize the function assignments and without having to block the display with a menu of key-function assignments. The various embodiments may be useful in applications which use a fixed keypad to receive commands that are inconsistent with the value printed on the keys (e.g., “1,” “2,” “3”, etc.). Additionally, the embodiments enable mobile devices to implement alphabets and number formats different from those printed on the keys while providing users with a handy mechanism for locating desired keys in their native language. The embodiments may also be useful for mobile devices that include keypads which have application-assignable keys, such as the function keys on a conventional computer keyboard.
The embodiments described herein may be implemented on any of a variety of mobile devices. Typically, such mobile devices will have in common the components illustrated in
In an embodiment, a mobile device includes a keypad that is configured to sense the touch or close proximity of a finger, stylus or other pointing device. A variety of sensors can be used to sense the touch or close proximity of a finger, stylus or other pointing device to a key. Such sensors may include, for example, electrical property sensors (e.g., capacitance, inductance or voltage), thermal sensors (e.g., capable of detecting the temperature of a finger in close proximity to the key), light sensors (e.g., to detect a shadow cast by a finger or pointing device covering the key), and pressure sensors (e.g., to detect the light touch of a finger or pointing device). The touch-sensitive keypad is configured to provide a signal to the mobile device processor 11 that indicates when a particular key is touched that is different from the signal indicating that the key has been pressed. By sensing the touch or close proximity of a finger or stylus to a key, the mobile device can be configured with software to provide a display showing the function presently assigned to the particular key before the key is pressed. Such a display may be presented in a portion of the mobile device display 13 that does not block other information and graphics on the display.
A touch-sensitive keypad is a user interface which has the capability of sensing both the touch and the press of a key as different kinds of events and can signal the key touch and key press events to a processor 11. An example embodiment of a touch-sensitive keypad is illustrated in
Referring to
Also associated with each key 31 is a touch or near-touch sensing circuit 34, such as a capacitor or a capacitance sensor. A capacitance sensor circuit 34 is a circuit which can detect a change in capacitance as may occur when a user touches or nearly touches a key 31, thereby adding their body to the electrostatic materials that comprise a capacitor assembly between the key 31 and a bottom support 35. In another embodiment, the touch sensing circuit 34 may be a low-voltage detection circuit which can sense the voltage passed to a key 31 from a user's body when a finger is brought into close proximity or touches the key 31. In another embodiment, the touch sensing circuit 34 may be a thermal or temperature sensing circuit that is sensitive enough to detect a change in temperature that occurs when a user's finger touches or comes in close proximity to the key 31. In a further embodiment, the touch sensing circuit 34 may be a light sensing circuit that can detect a change in light through the key 31 that occurs when a user's finger shades the key as when it touches or comes in close proximity to the key 31. These alternative embodiments are not separately illustrated since when diagrammed as a circuit block a voltage, thermal, temperature or light sensing circuit would appear the same as the capacitance sensor 34 illustrated in
The touch-sensitive keypad 30 may also include side support structures 33 (which may be made of an insulator material) and electrical insulator material 36 between keys so as to electrically isolate each key 31 and touch sensing circuit 34 from one another. As illustrated in
In another embodiment, illustrated in
As with any keypad, circuits will be included for routing signals received from the keypress sensor circuit 32 and from the touch sensing circuit 34, 38 to external circuits and ultimately to the processor of the mobile device. Any of the keypad circuitry known in the art may be implemented for this purpose, and so such circuitry is not included in the figures.
In a preferred embodiment, the touch-sensitive keypad 30 is built into the mobile device 10 as its primary keypad (i.e., replacing the conventional keypad 20 illustrated in
Traditionally, keypads function by transforming the depression of a key 31 into an electrical signal that can be interpreted by the mobile device and its application software.
In an embodiment, a user touching or nearly touching a key without pressing the key is sensed by the touch-sensitive keypad 30 and converted into a key touch event message (shown as dash and dot arrows) that is sent to the hardware driver 4. Key touch event messages may be transmitted via a runtime environment 3 to an application 2. Upon receiving a key touch event message, the application 2 determines the value or function assigned to the associated key (i.e., the key that is being touched or nearly touched), and directs the user interface 1 to display the associated value or function within the mobile device display 13 as described below.
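For purposes of illustration only, the following sketch in C shows one way in which such a key touch event message might be represented and acted upon as it reaches an application; the enumeration, structure and function names (KeyEventType, KeyEvent, on_key_event) are hypothetical assumptions and do not correspond to the interfaces of any particular platform.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical event types distinguishing a touch (or near touch) from a press. */
typedef enum {
    KEY_EVENT_TOUCH = 0,   /* finger on or near the key, key not depressed */
    KEY_EVENT_PRESS = 1    /* key fully depressed */
} KeyEventType;

/* Hypothetical message passed from the keypad driver toward the application. */
typedef struct {
    KeyEventType type;      /* touch versus press */
    uint8_t      keypad_id; /* which keypad generated the event */
    uint8_t      key_id;    /* which key was touched or pressed */
} KeyEvent;

/* Application-side handler: on a touch event the assigned meaning is revealed
 * on the display; on a press event the assigned function is performed. */
static void on_key_event(const KeyEvent *ev)
{
    if (ev->type == KEY_EVENT_TOUCH)
        printf("reveal meaning of key %u on display\n", (unsigned)ev->key_id);
    else
        printf("perform function assigned to key %u\n", (unsigned)ev->key_id);
}

int main(void)
{
    KeyEvent touch = { KEY_EVENT_TOUCH, 0, 5 };  /* user rests a finger on key 5 */
    KeyEvent press = { KEY_EVENT_PRESS, 0, 5 };  /* user then presses key 5 */
    on_key_event(&touch);
    on_key_event(&press);
    return 0;
}
```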
Information regarding a key touch event or a key press event may be communicated from the keypad 30 to the driver 4 and from the driver to the application 2 in a variety of data and signal structures as would be appreciated by one of skill in the art. An example of signals being passed among the various software layers is described below with reference to
As another example, the keypad hardware 30 or the keypad driver software 4 may signal a key touch event or a key press event by sending a software interrupt to the runtime environment layer 3 or the application 2. An example of the data structure of such an interrupt is described below with reference to
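A minimal sketch, assuming a 16-bit interrupt argument, of how such an interrupt might carry the event type, keypad ID and key ID is shown below; the bit layout and macro names are illustrative only and are not taken from any particular hardware or driver specification.

```c
#include <stdint.h>

/* Hypothetical 16-bit word that a keypad driver might pass with a software
 * interrupt to signal a key event: one bit distinguishes touch from press,
 * the remaining bits carry the keypad and key identifiers. */
#define KEY_EVENT_PRESS_FLAG   (1u << 15)           /* set = press, clear = touch */
#define KEY_EVENT_KEYPAD_SHIFT 8
#define KEY_EVENT_KEYPAD_MASK  (0x7Fu << KEY_EVENT_KEYPAD_SHIFT)
#define KEY_EVENT_KEY_MASK     0xFFu

/* Pack an event into the interrupt argument word. */
static uint16_t pack_key_event(int is_press, uint8_t keypad_id, uint8_t key_id)
{
    uint16_t word = 0;
    if (is_press)
        word |= KEY_EVENT_PRESS_FLAG;
    word |= (uint16_t)(keypad_id << KEY_EVENT_KEYPAD_SHIFT) & KEY_EVENT_KEYPAD_MASK;
    word |= key_id & KEY_EVENT_KEY_MASK;
    return word;
}

/* Unpack the word on the receiving (runtime environment or application) side. */
static void unpack_key_event(uint16_t word, int *is_press,
                             uint8_t *keypad_id, uint8_t *key_id)
{
    *is_press  = (word & KEY_EVENT_PRESS_FLAG) != 0;
    *keypad_id = (uint8_t)((word & KEY_EVENT_KEYPAD_MASK) >> KEY_EVENT_KEYPAD_SHIFT);
    *key_id    = (uint8_t)(word & KEY_EVENT_KEY_MASK);
}
```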
Example processing steps that may be performed upon a keypress or key touch event are illustrated in
When a user nearly touches or touches but does not press a key 31, the touch-sensitive keypad 30 senses this touch event and sends a key touch event electrical signal to the keypad driver, step 71. The keypad driver receives the key touch event signal from the keypad, recognizes the key that has been touched or nearly touched and sends an appropriate key touch notification to the operating system or runtime environment layer, step 73. The runtime environment layer forwards the key touch notification to the application, step 75. Upon receiving the key touch event notification, the application determines the function or value assigned to the particular key, step 77. The application may also determine whether the event was a keypress or a key touch event, test 79. Being a key touch event, the application communicates to the display (or changes the image presented on the display to indicate) the value or function associated with the touched or nearly touched key, thereby informing the user of the value that will be entered or the function that will be performed if the key is pressed.
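The application-side portion of this process (steps 77-81) may be sketched as follows; the per-key assignment table and its media-player contents are purely hypothetical and serve only to illustrate how a touch event results in a display update while a press event results in the assigned function being performed.

```c
#include <stdio.h>

/* Hypothetical per-key assignment table maintained by an application that
 * maps each physical key to the function it currently performs. */
#define NUM_KEYS 12

static const char *key_function[NUM_KEYS] = {
    "Play", "Stop", "Next track", "Previous track",
    "Volume up", "Volume down", "Mute", "Shuffle",
    "Repeat", "Playlist", "Info", "Exit"
};

/* On a touch event the application only reveals the assigned function on the
 * display; on a press event it performs that function. */
static void application_handle_key(int key_id, int is_press)
{
    const char *function = (key_id >= 0 && key_id < NUM_KEYS)
                               ? key_function[key_id] : "Unassigned";
    if (!is_press)
        printf("display: key performs \"%s\" if pressed\n", function);
    else
        printf("performing \"%s\"\n", function);
}

int main(void)
{
    application_handle_key(0, 0);  /* finger touches key 0: reveal "Play"  */
    application_handle_key(0, 1);  /* key 0 is pressed: perform "Play"     */
    return 0;
}
```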
The method steps illustrated in
When a key is pressed, the touch-sensitive hardware 30 senses this event and sends an electrical signal to the keypad driver layer 4 informing it of the event and the particular key involved, message 71. The keypad driver 4 translates this event into a signal which can be understood by the runtime environment layer 3, message 73a. This message informs the runtime environment layer 3 of both the nature of the event (i.e., a keypress event) and the particular key involved, such as by providing the key ID of the pressed key. The runtime environment layer 3 then forwards the keypress event information to the application 2, message 75a. The application performs the processing of steps 77-81 to determine the function (or value) associated with the pressed key and then perform that function (or enter the value). Once the function has been performed (or the value entered), the application sends a signal to the display 13 or reconfigures the display 13 to present the results of the performed function (or display the entered value), message 83.
In the foregoing description referencing
In embodiments in which the key touch or keypress event is communicated by storing flags and a key ID in a register, the messages illustrated in
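For illustration, a register-based embodiment might resemble the following sketch, in which the flag and identifier fields, and the polling routine that reads them, are hypothetical stand-ins for whatever register or shared memory location a given device provides.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical register image used when key events are communicated by
 * setting flags and a key ID in a register rather than by messages.  In a
 * real device this might be a memory-mapped hardware register or a location
 * written by the keypad driver; here it is an ordinary variable. */
typedef struct {
    volatile bool    touch_flag;  /* set by the driver on a key touch event */
    volatile bool    press_flag;  /* set by the driver on a key press event */
    volatile uint8_t key_id;      /* identifier of the key involved         */
} KeyEventRegister;

static KeyEventRegister key_event_register;

/* Check performed by higher-layer software, either periodically (polling)
 * or upon receiving an interrupt; returns true if an event was pending. */
static bool poll_key_event(uint8_t *key_id, bool *is_press)
{
    if (key_event_register.touch_flag || key_event_register.press_flag) {
        *key_id   = key_event_register.key_id;
        *is_press = key_event_register.press_flag;
        key_event_register.touch_flag = false;   /* clear the flags once read */
        key_event_register.press_flag = false;
        return true;
    }
    return false;
}
```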
In order to implement these embodiments, the application software may be configured to recognize key touch events and interact with the mobile device display in order to reveal the value or function associated with the touched or nearly touched key. Such a configuration may be accomplished by adding additional processing steps that recognize a key touch event signal or registry value and present the assigned value or function to the display. Runtime environment layer software may also be adapted to recognize a key touch event and to appropriately notify applications of this event in a manner (e.g., a data format or flag values) different from that of a keypress event. Additionally, the hardware driver used with a touch-sensitive keypad will be configured to distinguish the two kinds of key events and to appropriately communicate key touch event and keypress event information to the runtime environment or the application.
The added complexity required of the application software to distinguish and act upon key touch events versus keypress events may be avoided by implementing the various embodiments in conjunction with a keypad protocol layer within the operating system of the mobile device. Such a keypad protocol is described in U.S. patent application Ser. No. ______ entitled “Standardized Method and Systems for Interfacing with Configurable Keypads”, which is filed concurrently herewith, the entire contents of which are hereby incorporated by reference. The keypad protocol layer serves as an interface between application software and keypad drivers that enables application software to define keypad functions to the operating system and receive key event notifications in standard formats. By doing so, the process of displaying the value or function assigned to a touched or nearly touched key can be performed by the keypad protocol, removing the need for this processing from the application software. If a mobile device is equipped with a touch-sensitive keypad, then this will be known to the keypad protocol layer, which can communicate with the mobile device display to present the associated value or function that has been assigned by the application. In this manner, a software application can be written for a variety of mobile devices without having to accommodate the touch-sensitive keypad functionality described herein. The following description with reference to
As illustrated in
The keypad protocol 100 receives the key press event signal from the driver layer 110 and sends a keypress event notification to an application 180 in a standardized format that application developers can anticipate and accommodate with standard software instructions. In doing so, the keypad protocol 100 configures a key press event message, such as a notification object, which can be interpreted by the application 180. This configured key press event message/notification object may be passed to an application 180 through the runtime environment software layer 170. Alternatively, the keypad protocol 100 may communicate the key press event message/notification object directly to the application 180. The application 180 may also communicate the key press event to a user-interface layer 190 providing the display function. Alternatively, the keypad protocol 100 may communicate the key value or function directly to the user-interface layer 190 for presentation on the display 13.
Of particular advantage to the various embodiments of the present invention, the keypad protocol 100 can receive key function assignments and configuration commands from applications 180 allowing it to determine the value or function assigned to a particular key at any given moment. Values and functions assigned to various keys are defined by the application running on the mobile device depending upon the functions of that software. In some instances, the value or function assigned to a particular key will depend upon whether other keys are pressed previously or simultaneously (e.g., such as following the press of a “shift” or “alt” key). In other instances, the function assigned to a particular key will depend upon the current operation being performed by the application. For example, in a media player application, the same key may be used to stop and start the media play, with the “stop” functionality assigned to the key whenever the media is playing, and the “start” functionality assigned to the key whenever a media file is selected but not yet playing. Thus, the value or functionality assigned to a particular key is context dependent and may change frequently during the operation of an application. To simplify application development while enabling dynamic key function assignments, an application may configure the keypad protocol to report each keypress event using a command associated with the implicated functionality or value, leaving the processing of the particular keypress event and context to the keypad protocol 100. For example, a media player application may configure the keypad protocol 100 to report a keypress event as a “play” function or a “stop” function depending upon the context of the keypress event as determined by the keypad protocol. Thus, the keypad protocol 100 may communicate with the application using function definitions that are convenient for the application developer. In such an implementation, the application may be unable to determine the value or function assigned to a particular key at any given instant, leaving that processing to the keypad protocol 100. Since the keypad protocol 100 is informed of the function or value assigned to a particular key, the protocol can communicate this information to the display in response to a key touch event. By allocating to the keypad protocol 100 the processing of key touch events and revealing the assigned value or functionality on the display, application software can be easier to develop and need not be configured to interrupt other processing in order to reveal key assignments.
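The context-dependent reporting described above may be sketched as follows; the key number, state names and function strings are illustrative assumptions showing only how the keypad protocol, rather than the application, might resolve the same physical key into a “play” or “stop” command.

```c
#include <stdio.h>

/* Hypothetical sketch of context-dependent key reporting by a keypad
 * protocol layer: the same physical key is reported to the application as
 * "play" or "stop" depending on the current media state, so the application
 * never needs to track which raw key was pressed. */
typedef enum { MEDIA_IDLE, MEDIA_PLAYING } MediaState;

static const char *resolve_key_command(int key_id, MediaState state)
{
    if (key_id == 5)                          /* key 5: illustrative play/stop key */
        return (state == MEDIA_PLAYING) ? "stop" : "play";
    return "unassigned";
}

int main(void)
{
    MediaState state = MEDIA_IDLE;
    printf("key 5 -> %s\n", resolve_key_command(5, state)); /* reports "play" */
    state = MEDIA_PLAYING;
    printf("key 5 -> %s\n", resolve_key_command(5, state)); /* reports "stop" */
    return 0;
}
```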
The keypad protocol 100 can also receive graphics from the application associated with the value or function assigned to a particular key. Such graphics may be used in the value/function reveal display generated by the keypad protocol. For example, if the application supports foreign language letters and numerals, the graphics for such letters and numerals may be provided by the application to the keypad protocol 100 so that they may be used when revealing the assigned value in the display. Similarly, if the application assigns functions to keys that can be represented graphically, such as an arrow to indicate “play” and two vertical bars to indicate “stop,” such graphics can be provided to the keypad protocol 100 and used to reveal the assigned functionality in the display instead of describing the function in text form. In situations where the mobile device has or is connected to graphic user interfaces, the keypad protocol 100 can use such graphic files to configure the user interface to display the graphic.
As described above, the keypad protocol 100 can receive key touch events from the hardware driver 110 and communicate with the display 190 to reveal the function associated with a touched or nearly touched key. Alternatively and in some implementations, the keypad protocol 100 can also communicate key touch events to the application 180, such as by way of the runtime environment layer 170, if the application is configured to process key touch events. For example, some applications may be written for mobile devices having touch-sensitive keypads, and thus be able to receive the key touch event notification and communicate the associated value or function to the user interface display 190 in a manner similar to that described above with reference to
The keypad protocol 100 may include a standard set of APIs that the application developer can utilize in developing applications software. Thus, the keypad protocol layer 100 can serve as a standard software interface for higher-level software. The keypad protocol 100 may also include software tailored to interface directly with keypad drivers 110 to enable it to identify the particular key that has been touched (or nearly touched) or pressed based on a key event signal received from the keypad driver 110. Since the nature of keypad functions and interface signals may vary dramatically among different types of keypads, the keypad controller layer 104 provides a software layer for accommodating such complexity and hiding the complexity from the application layer 180.
In order to inform the keypad protocol 100 of the function or value assigned to particular keys, the application 180 needs to be able to provide keypad definition commands and graphics. Such definition and graphic information can be provided by the application 180 to the keypad protocol 100 directly or by way of the runtime environment layer 170. Similarly, user-interface software 190 may provide keypad definition and graphic configuration information to the keypad protocol 100. The keypad protocol 100 then uses such definition and graphics information to determine the value or function assigned to each key in the keypad. The keypad protocol 100 may also provide keypad configuration commands to the keypad hardware driver 110.
When an application 180 is first started, it may interact with the keypad protocol 100 in order to configure the keypad for operation consistent with the application's functionality. Example steps for this process are illustrated in
When an application 180 is loaded or otherwise needs to determine the available keypad and its capabilities (e.g., whether it is a touch-sensitive keypad), the application may ask for this information from the keypad protocol 100, such as by issuing an API, step 210. Even in situations where the mobile device has only one keypad, the application 180 may need to request information regarding the capabilities of the keypad since applications are typically written to operate on a variety of different types of mobile devices. For illustrative purposes, an example API entitled “Query_Keypad” is illustrated in the figures for performing this function. This API may simply ask the keypad protocol 100 to inform the application 180 of the keypads that are available for use as well as their various capabilities (e.g., configurable keypad or touchscreen). Upon receiving such a Query_Keypad API, the keypad protocol 100 may inform the application of the available (i.e., activated and connected) keypads and their capabilities, step 212. The format for informing the application of the available keypad(s) may be standardized in order to provide a common interface for application developers. The format of the information may be any suitable data structure, such as the data structure described below with reference to
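One possible shape for such a Query_Keypad interface is sketched below; the structure fields, the function signature and the stub implementation standing in for the keypad protocol are assumptions made solely for illustration.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical record describing one available keypad and its capabilities. */
typedef struct {
    uint8_t keypad_id;        /* index used in later configuration calls  */
    char    type[16];         /* e.g., "12-key", "qwerty", "touchscreen"  */
    bool    touch_sensitive;  /* true if key touch events are supported   */
    bool    configurable;     /* true if key functions can be reassigned  */
} KeypadInfo;

/* Hypothetical prototype: fills 'info' with up to 'max' available keypads
 * and returns how many were reported by the keypad protocol. */
int Query_Keypad(KeypadInfo *info, int max);

/* Stub standing in for the keypad protocol, for illustration only. */
int Query_Keypad(KeypadInfo *info, int max)
{
    if (max < 1)
        return 0;
    info[0].keypad_id = 0;
    strcpy(info[0].type, "12-key");
    info[0].touch_sensitive = true;
    info[0].configurable = true;
    return 1;
}

/* Example application-side use (step 210). */
static void discover_keypads(void)
{
    KeypadInfo info[4];
    int n = Query_Keypad(info, 4);
    for (int i = 0; i < n; i++) {
        printf("keypad %u: %s, touch-sensitive=%d\n",
               (unsigned)info[i].keypad_id, info[i].type,
               (int)info[i].touch_sensitive);
    }
}

int main(void)
{
    discover_keypads();
    return 0;
}
```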
Upon receiving the keypad availability and configuration information, an application may provide configuration information to the keypad protocol, step 220. This configuration step may be in the form of an API to provide a common application interface for application developers. For illustrative purposes, example APIs entitled “Key_Config” and “Keypad_Config” are illustrated in the figures for performing this function. Such an API may specify the index number of the keypad and provide key configuration information on a key-by-key basis. Such configuration information may include the identifier that the application uses for a particular key event, a string describing the function or value assigned to the particular key or key event, and graphics information that can be used to display the key function in a graphical manner. An example format and content of such key-by-key configuration information is discussed below with reference to
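The key-by-key configuration information might, for example, take a form such as the following; the record fields, the Keypad_Config prototype (shown here as a declaration without an implementation) and the media-player key assignments are hypothetical and intended only to illustrate the kind of information an application could supply.

```c
#include <stdint.h>

/* One possible shape for the key-by-key configuration information passed
 * with a "Key_Config"/"Keypad_Config" call. */
typedef struct {
    uint8_t     key_id;        /* physical key on the selected keypad        */
    uint16_t    app_key_id;    /* identifier the application wants reported  */
    const char *description;   /* text revealed on a key touch event         */
    const char *graphic_file;  /* optional graphic revealed instead of text  */
} KeyConfigRecord;

/* Hypothetical prototype offered by the keypad protocol layer. */
int Keypad_Config(uint8_t keypad_id, const KeyConfigRecord *records, int count);

/* Example: a media player assigning its keys (step 220). */
static const KeyConfigRecord media_player_keys[] = {
    { 5, 100, "Play / Stop",    "play_stop.bmp" },
    { 6, 101, "Next track",     "next.bmp"      },
    { 4, 102, "Previous track", "prev.bmp"      },
};

static int configure_media_keypad(void)
{
    return Keypad_Config(0, media_player_keys,
                         (int)(sizeof(media_player_keys) /
                               sizeof(media_player_keys[0])));
}
```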
The keypad protocol 100 receives the keypad configuration information from the application 180, step 222 and any graphics files or images associated with the selected keypad, step 224. The keypad protocol 100 may configure a translation table associated with the keypad, step 226. Such a translation table can be used by the keypad protocol 100 to determine the appropriate command string or application key identifier to provide to an application 180 in response to each key press event. The keypad protocol 100 may also use the assigned value or function stored in the translation table to generate the display of the assigned value/function in response to a key touch event. Additionally, the keypad protocol 100 may further configure the keypad if required to match the functionality of the application, step 230. Upon completing the keypad configuration operations, the keypad protocol may inform the application 180 that the keypad is ready for operation, reply 232.
The process steps illustrated in
Using information received from the keypad protocol 100, the application 180 may send keypad configuration information and, optionally, graphics files to the keypad protocol 100, messages 220, 224. As with other messages, this information may be sent by way of the runtime environment layer 170 or directly to the keypad protocol 100 as illustrated. The application 180 may also provide graphics files to the display layer, message 234, to present a display consistent with the application and functions assigned to various keys.
Using the keypad configuration and graphics files provided by the application 180, the keypad protocol 100 may configure a key translation table, process 226, and configure the keypad, message 230. Additionally, the keypad protocol 100 may provide some keypad display files to the display, message 228. For example, if the keypad includes configurable keys (e.g., keys 22-24 illustrated in
The processing illustrated in
Applications may also interface with the keypad protocol 100 in order to obtain more information about particular keypads that may be useful in making a selection. In mobile devices that include two or more keypads or user interfaces, such as a telephone keypad used for telephone applications and a miniature keyboard used for text and e-mail applications, an application 180 may need to select one of those keypads for receiving user inputs based upon the application functionality. For example, an application involving significant text entry, such as a messaging or e-mail application, may be best supported by a miniature keyboard if such a keypad is available and active on the mobile device, while a media player or game may be best supported by a telephone keypad (see
An example process by which the application 180 may obtain information regarding the capabilities of a particular keypad is illustrated in
Information regarding the available keypad capabilities may be provided to applications by the keypad protocol 100 in a standardized data format, such as illustrated in
The keypad information provided to the application 180 may be in the form of a standardized key set identifier and may use standardized keypad definitions to communicate the particular type of keypad and its capabilities. Alternatively, the keypad capabilities data table 300 may list individual keys that are available and their individual capabilities and configurations. The entries shown in the keypad capabilities table 300 are provided for illustrative purposes only and in a typical implementation are more likely to store data in the form of binary codes that can be recognized and understood by an application 180.
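As one illustration of such binary codes, the capabilities of a keypad might be packed into bit flags as sketched below; the flag names, values and record layout are assumptions and not a required encoding.

```c
#include <stdint.h>

/* Illustrative encoding in which the capabilities of a keypad are packed
 * into a single byte of bit flags recognized by applications. */
#define KEYPAD_CAP_TOUCH_SENSITIVE  0x01u  /* key touch/near-touch events supported */
#define KEYPAD_CAP_CONFIGURABLE     0x02u  /* key functions may be reassigned       */
#define KEYPAD_CAP_QWERTY           0x04u  /* miniature keyboard layout             */
#define KEYPAD_CAP_TOUCHSCREEN      0x08u  /* virtual keypad on a touchscreen       */

typedef struct {
    uint8_t keypad_id;     /* standardized key set identifier */
    uint8_t capabilities;  /* bitwise OR of the flags above   */
} KeypadCapabilityRecord;

/* Example entry: a touch-sensitive, configurable 12-key telephone keypad. */
static const KeypadCapabilityRecord example_entry = {
    0, KEYPAD_CAP_TOUCH_SENSITIVE | KEYPAD_CAP_CONFIGURABLE
};
```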
Applications 180 may provide a variety of data and configuration parameters to the keypad protocol 100 for use in interpreting key touch and keypress events and in translating those events into signals or data structures which the application 180 can process. An example of a data structure for storing such information for use by the keypad protocol 100 is illustrated in
The keypad translation data structure 320 may also include graphics (data field 332) associated with the function assigned to a key. The application 180 may provide graphic files to be displayed in response to a key touch event in order to graphically illustrate the key functionality assigned by the application. The graphics file 332 can be used by the keypad protocol 100 to generate a graphic key function reveal display in response to a key touch event. Rather than store the graphics within the keypad translation data structure 320, the data field may include a pointer (i.e., memory address) to the memory location storing the graphic file associated with the particular key. Such graphics may be in the form of simple symbols that communicate a particular key function, such as arrows (left, right, up, down or curved), circles, mathematical operation symbols, etc.
To configure keypads using the keypad protocol 100, an application 180 need only provide some of the information to be stored in the keypad translation data structure 320 in the form of a series of data records. Such data records may be linked to standard key identifiers that the keypad protocol can recognize. For example, if the keypad being configured is a standard 12 key numeric keypad, the application 180 may identify a key by its standard numeral value. Using that identifier, the application 180 can provide the application identifier key ID that the keypad protocol 100 can use to inform the application of a key press event, along with the function description string and/or function graphic or file pointer. The keypad protocol 100 can receive such data records and store them in a data table such as illustrated in
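An illustrative in-memory form of such a translation table, as the keypad protocol might maintain it, is sketched below; the structure names, field sizes and the store_key_record helper are hypothetical.

```c
#include <stdint.h>
#include <string.h>

/* One illustrative layout for the keypad translation data structure:
 * one record per key, indexed by key ID within a keypad. */
#define MAX_KEYS 16

typedef struct {
    uint16_t    app_key_id;       /* identifier reported to the application */
    char        description[32];  /* text revealed on a key touch event     */
    const void *graphic;          /* pointer to an optional graphic or NULL */
} KeyTranslationRecord;

typedef struct {
    uint8_t              keypad_id;
    KeyTranslationRecord keys[MAX_KEYS];
} KeypadTranslationTable;

/* Store a data record received from the application in the table. */
static void store_key_record(KeypadTranslationTable *table, uint8_t key_id,
                             uint16_t app_key_id, const char *description,
                             const void *graphic)
{
    if (key_id >= MAX_KEYS)
        return;
    table->keys[key_id].app_key_id = app_key_id;
    strncpy(table->keys[key_id].description, description,
            sizeof(table->keys[key_id].description) - 1);
    table->keys[key_id].description[sizeof(table->keys[key_id].description) - 1] = '\0';
    table->keys[key_id].graphic = graphic;
}
```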
One of skill in the art will appreciate that keypad translation and configuration data may be stored in memory in a variety of different data structures. The data structure illustrated in
Processing flow of key touch and key press events is illustrated in
When a key is touched, nearly touched or pressed on the keypad 120, the key circuitry and its keypad driver 110 can inform the keypad protocol 100 of the event in a variety of ways, such as by providing an interrupt, or storing data in a particular register or portion of memory used for setting system flags. For example, as illustrated in
When the keypad protocol 100 is informed of a key touch event, it can translate the key touch event information into a functional description that can be presented in the key function reveal portion of the display. Similarly, when the keypad protocol 100 is informed of a keypress event, it can translate the key press event into information that an application can interpret. An example of method steps that may be implemented by the keypad protocol 100 in receiving a key touch event and a keypress event are illustrated in
If the event is determined to be a keypress event in test 245, the keypad protocol 100 can locate the corresponding data record within the key translation table 320 using the key ID and keypad ID, step 246. Using the data stored in the corresponding data record, the keypad protocol 100 can retrieve the application ID specified by the application 180 corresponding to the particular keypress event, step 248. Using that information, the keypad protocol can create a notification object for communication to the application 180, step 250. Finally, the keypad protocol sends the keypress notification object to the application 180, step 252. In sending the notification object, the keypad protocol 100 may send the object directly to the application 180 or by way of the operating system or runtime environment 170.
If the event is determined to be a key touch event in test 245, the keypad protocol 100 can locate the corresponding data record within the key translation table 320 using the key ID and keypad ID, step 272, and retrieve the function description text or graphic information associated with the key, step 274. Using that text or graphic information, the keypad protocol can format a display or generate a display object for presentation within the key function reveal window within the display, step 276. Finally, the keypad protocol 100 sends the key function reveal text/graphic or display object to the display, step 278. In sending the key function reveal text/graphic or display object to the display, the keypad protocol 100 may send the information directly to the display or may provide the information to a display layer which configures and manages the generation of images on the display.
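The branch performed at test 245 and the two processing paths that follow it may be sketched as shown below; the record structure and the stand-in notification and display routines are hypothetical and merely illustrate that a press event is translated into the application's identifier while a touch event only updates the key function reveal window.

```c
#include <stdio.h>
#include <stdint.h>

/* Minimal translation record looked up by keypad ID and key ID. */
typedef struct {
    uint16_t    app_key_id;   /* identifier the application expects      */
    const char *description;  /* text for the key function reveal window */
} TranslationRecord;

/* Stand-ins for the notification and display paths. */
static void send_notification_to_application(uint16_t app_key_id)
{
    printf("notify application: key event %u\n", (unsigned)app_key_id);
}

static void send_reveal_to_display(const char *description)
{
    printf("reveal window: %s\n", description);
}

/* Branch corresponding to test 245. */
static void keypad_protocol_handle_event(const TranslationRecord *rec, int is_press)
{
    if (is_press) {
        /* steps 246-252: notify the application of the keypress            */
        send_notification_to_application(rec->app_key_id);
    } else {
        /* steps 272-278: send the reveal text or graphic to the display    */
        send_reveal_to_display(rec->description);
    }
}

int main(void)
{
    TranslationRecord play_key = { 100, "Play / Stop" };
    keypad_protocol_handle_event(&play_key, 0);  /* touch: reveal only     */
    keypad_protocol_handle_event(&play_key, 1);  /* press: notify the app  */
    return 0;
}
```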
The process of receiving and processing a key press event may be accomplished in a series of messages among the different hardware and software layers in the mobile device 10 as illustrated in
When a key is touched or nearly touched, the keypad will send a key touch event signal to the keypad driver, message 241. In turn the keypad driver sends a key touch event flag along with the keypad ID and key ID to the keypad protocol, message 242a. As discussed above, this message may be in the form of information that is saved to a memory location that the keypad protocol may periodically access or access upon detecting a set flag or upon receiving an interrupt. Using this information, the keypad protocol generates the key function reveal text or graphic, processing steps 244 and 272-276, and then transmits the key function reveal text or graphic to the display, message 276.
When a key is pressed, the keypad will send a key press event signal to the keypad driver, message 240a. In turn the keypad driver sends the keypad ID and key ID to the keypad protocol, message 242b. As discussed above, this message may be in the form of information that is saved to a memory location that the keypad protocol may periodically access or access upon detecting a set flag or upon receiving an interrupt. Using this information, the keypad protocol generates the key press notification object, processing steps 244 and 246-250, and then transmits the key value to the runtime environment, message 252a, for relay to the application 180 in message 253a. Alternatively, the keypad protocol may communicate the key value directly to the application 180. Additionally, the keypad protocol 100 may send a key value or graphic to the display, message 254a, so the display can reflect the key press event (e.g., presenting on the display the value of the key that was pressed).
A subsequent key press event will be handled in the same way, as illustrated in messages 240b through 254b in
In some situations, a key press event may prompt an application 180 to redefine key values or functions for subsequent key presses. For example, if the application 180 is a media player, such as an MP3 player, and a first key press event is interpreted by the application as initiating audio play (i.e., the first key press had a “play” function), the application may change the functionality of the same key so that a subsequent press will be interpreted as pausing or stopping the media play (i.e., the second key press will have a “stop” function).
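A simple sketch of such a toggling key, from the application's point of view, is shown below; in a keypad-protocol implementation the new description would also be communicated to the keypad protocol (e.g., with another configuration call) so that subsequent key touch events reveal the updated function. The function name and strings are illustrative only.

```c
#include <stdio.h>
#include <string.h>

/* Current meaning of the shared play/stop key, toggled on every press. */
static char play_stop_key_function[8] = "play";

static void on_play_stop_key_pressed(void)
{
    if (strcmp(play_stop_key_function, "play") == 0) {
        printf("starting playback\n");
        strcpy(play_stop_key_function, "stop");   /* next press will stop  */
    } else {
        printf("stopping playback\n");
        strcpy(play_stop_key_function, "play");   /* next press will play  */
    }
}

int main(void)
{
    on_play_stop_key_pressed();  /* first press: play; key becomes "stop"  */
    on_play_stop_key_pressed();  /* second press: stop; key becomes "play" */
    return 0;
}
```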
The advantages of the various embodiments may be further explained by way of some examples which are illustrated in
In the foregoing embodiments, the key function reveal display may be maintained on the display so long as the key remains touched or nearly touched. Alternatively, the key function reveal display may be maintained for a preset duration following a key touch or near touch, even if the user stops touching the key before the preset duration or continues to touch or near touch the key beyond the preset duration. To accomplish this alternative implementation, a key touch event may also initiate a timer which determines how long the key function reveal display remains.
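Such a timed reveal might be sketched as follows; the preset duration, the function names and the use of the standard C time facility (in place of an operating-system timer service) are illustrative assumptions.

```c
#include <stdbool.h>
#include <time.h>

/* Keep the key function reveal display for a preset duration after a key
 * touch event, driven here by a simple expiry time. */
#define REVEAL_DURATION_SECONDS 2

static time_t reveal_expires_at = 0;

/* Called when a key touch (or near touch) event is received. */
static void on_key_touch(void)
{
    reveal_expires_at = time(NULL) + REVEAL_DURATION_SECONDS;
    /* ...the key function reveal text or graphic would also be drawn here... */
}

/* Called periodically (e.g., from the display refresh loop) to decide
 * whether the reveal display should remain on the screen. */
static bool reveal_should_remain(void)
{
    return time(NULL) < reveal_expires_at;
}
```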
The flexibility and usefulness of the various embodiments are particularly evident when the mobile device is operating applications which can utilize a non-alphabetic user-interface in order to make the operation of the application more intuitive to a user. For example,
Using the various embodiments, users can be informed of the functions assigned to various keys without having to call up a menu that blocks the display 13, as illustrated in
Using the various embodiments, users can be informed of the game functions assigned to various keys without having to call up a menu that blocks the display 13, as illustrated in
The various embodiments may be implemented by the processor 11 executing software instructions configured to implement one or more of the described methods. Such software instructions may be stored in memory 12 as the device's operating system software, a series of APIs implemented by the operating system, or as compiled software implementing an embodiment method. Further, the software instructions may be stored on any form of tangible processor-readable memory, including: a random access memory 12, a memory module plugged into the mobile device 10, such as an SD memory chip, an external memory chip such as a USB-connectable external memory (e.g., a “flash drive”), read only memory (such as an EEPROM), hard disc memory, a floppy disc, and/or a compact disc.
Those of skill in the art would appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in processor readable memory which may be any of RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal or mobile device. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal or mobile device. Additionally, in some aspects, the steps and/or actions of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a machine readable medium and/or computer readable medium, which may be incorporated into a computer program product.
The foregoing description of the various embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein, and instead the claims should be accorded the widest scope consistent with the principles and novel features disclosed herein.
The present application claims the benefit of priority to U.S. Provisional Patent Application No. 60/950,112 filed Jul. 16, 2007 entitled “Dynamically Configurable Keypad,” the entire contents of which are hereby incorporated by reference.