This invention relates in general to user interfaces and more particularly to intuitive user interfaces using sensors to enable various user interface functions.
Currently, many hand-held electronic devices, such as mobile telephones, personal digital assistants (PDAs) and the like, include extensive and sophisticated user interface functionality. Furthermore, many of these devices are physically small with limited areas for conventional user controls, such as keys or buttons and corresponding switches that may be activated by a user in order to exercise aspects of the user interface. Practitioners in these instances have typically resorted to a menu-driven system to control the user interface functions. Unfortunately, as additional features and flexibility are incorporated into these electronic devices, the menu system can become relatively complex with many levels. The end result is that the user of the device can be presented with a bewildering, confusing and time-consuming process for activating or adjusting features or functions.
The various aspects, features and advantages of the present invention will become more fully apparent to those having ordinary skill in the art upon careful consideration of the following Detailed Description of the Drawings with the accompanying drawings described below.
While the present invention is achievable by various forms of embodiment, there are shown in the drawings, and described hereinafter, present exemplary embodiments with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments contained herein. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
As further discussed below various inventive concepts, principles or combinations thereof are advantageously employed to provide various methods and apparatus for creating an intuitive user interface for an electronic device, e.g. cellular phone or the like, or promoting intuitive control of an interface by a user. This is accomplished in various embodiments by providing a sensor that may be activated by a user, where the sensor is logically located relative to (or located so as to be logically related to) a user interface component feature, function, or functionality. For example, an operating mode or volume level of a speaker may be controlled or such control may be activated or initiated by user activation of a corresponding sensor(s) that is proximate to or co-located with the speaker or corresponding sound port(s). This intuitive control can be augmented by inputs, such as output signals from additional sensors that provide additional contextual input. For example, if the device is being held by the user rather than lying on another surface or the like as indicated by the inputs from additional sensors, the intuitive control of the volume level of the speaker can be further conditioned on these inputs. Sensors carried on the device, internally or externally, sense environmental or contextual characteristics of the device in relation to other objects or the user. The contextual characteristics may be static or dynamic.
It is further understood that the use of relational terms, if any, such as first and second, top and bottom, upper and lower and the like are used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “a” or “an” as used herein are defined as one or more than one. The term “plurality” as used herein is defined as two or more than two. The term “another” as used herein is defined as at least a second or more. The terms “including,” “having” and “has” as used herein are defined as comprising (i.e., open language). The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically.
Some of the inventive functionality and inventive principles are best implemented or supported with or in software programs or instructions and integrated circuits (ICs), such as application-specific ICs, as well as physical structures. It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions, ICs, and physical structures with minimal experimentation. Therefore, in the interest of brevity and minimization of any risk of obscuring the principles and concepts according to the present invention, further discussion of such structures, software and ICs, if any, will be limited to the essentials with respect to the principles and concepts used by the exemplary embodiments.
The user interface components, functions, or features 105 are normally coupled to interface circuitry 117 having one or more respective interface circuits that are configured to process signals for or from (and couple signals to or from) the respective user interface component, function, or feature. For example, the respective interface circuits include circuitry, such as an amplifier 119 for driving the speaker 107 or an amplifier 121 for amplifying signals, e.g. audio signals, from the microphone 109, where these amplifiers are further coupled from/to additional audio processing (vocoders, gates, etc.) as is known and as may vary depending on specifics of the electronic device 101. Other interface circuits include a display driver 123 for driving the display 111, a display backlighting driver 125 for driving and controlling levels, etc. for the display backlight 113, and other drivers 127 for supporting interfaces with other user interface components 115, such as a keyboard driver and decoder for the keyboard.
The intuitive user interface 103 further includes one or more sensors 129 that are located or physically placed in a position that is logically or intuitively associated with a respective user interface component, function, or feature. For example, a logical or intuitive association can be formed when a sensor is proximate (physically near) to, co-located with, or located to correspond with functionality (e.g., sound ports or other visual indicia for a speaker or microphone) of the corresponding user interface component, function or feature. The sensors may be of varying known forms, such as pressure sensors, resistive sensors, or capacitive sensors, with various inventive embodiments of the latter described below. The individual sensors form a sensor system that provides or facilitates an intuitive user interface.
In the exemplary embodiment shown in
Note that each of these sensors 131, 133, 135, 137 is configured to provide an output signal (including a change in an output signal) when the sensor is triggered or activated by proximity to a user (including objects, such as a desk or a stylus, etc. used by a user) and this output signal facilitates changing an operating mode of the user interface component, function, or feature, e.g. speaker level, microphone muting or sensitivity, backlighting level, display contrast, and so forth. Additionally shown are other sensors 139 that may be used to determine the context of the electronic device as will be discussed further below. Note that different embodiments of electronic devices may have all or fewer or more sensors (or different sets of the sensors) than shown in
As reflected in the interface circuitry 117 of
The interface circuitry 117 and respective circuits are coupled to a controller 143 that further includes a processor 145 inter-coupled to a memory 147 and possibly various other circuits and functions (not shown) that will be device specific. Note that all or part of the interface circuitry 117 may be included as part of (or considered as part of) the controller 143. The controller 143 can be further coupled to a network interface function 148, such as a transceiver for a wire line or wireless network. The processor 145 is generally known and can be microprocessor or digital signal processor based, using known microprocessors or the like. The memory 147 may be one or more generally known types of memory (RAM, ROM, etc.) and generally stores instructions and data that, when executed and utilized by the processor 145, support the functions of the controller 143.
A multiplicity of software routines, databases and the like will be stored in a typical memory for the controller 143. These include the operating system, variables, and data 149, which are high-level software instructions that establish the overall operating procedures and processes for the controller 143, i.e. result in the electronic device 101 performing as expected. A table(s) of sensor characteristics 151 is shown that includes or stores expected parameters for sensor(s), such as a frequency range or time constant values and the like that can be used to determine whether a given sensor has been activated or triggered. Another software routine is a determining sensor activation routine 153 that facilitates comparing output signals from sensor(s) 131, 133, 135, 137, e.g. a frequency, to corresponding entries in the table 151. This routine also determines a duration of the sensor activation as well as parameters corresponding to repeated activations as needed. An operating mode control routine 155 provides for controlling operating modes of one or more of the user interface components, functions, or features, such as one or more of a speaker, microphone, display and display backlighting, keypad, and the like.
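By way of illustration only, the cooperation between the table of sensor characteristics 151 and the determining sensor activation routine 153 might be sketched in software along the following lines. The Python names, sensor identifiers, and frequency ranges below are hypothetical assumptions made for illustration and are not specified by this disclosure.

    # Illustrative sketch only; all names and values are hypothetical.
    # The table 151 stores expected parameters, e.g. a frequency range,
    # used to decide whether a given sensor has been activated.
    SENSOR_TABLE = {
        "speaker_sensor_131": (8000.0, 12000.0),      # expected Hz range
        "microphone_sensor_133": (8000.0, 12000.0),
    }

    def is_activated(sensor_id, measured_hz):
        """Compare a sensor's measured output frequency to the table 151."""
        low, high = SENSOR_TABLE[sensor_id]
        return low <= measured_hz <= high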
In operation, the device of
Execution of the operating mode control routine 155 can result in a change in an operating mode of the speaker, e.g. an output level of the amplifier 119 and thus speaker. This can be accomplished by changing an input level to the amplifier, e.g. level from other audio processing circuits, or changing the gain of the amplifier 119 under direction of the processor 145, e.g. increasing/decreasing the level by 20 dB, muting the speaker, a 3 dB change in volume level, or the like. For example, the operating mode of the speaker can be changed from a private or earpiece mode to a loud speaker mode when, for example, the electronic device is a cellular phone and being used in a speaker phone mode. Note that other techniques can be implemented using the intuitive control and a logically associated sensor. For example, by distinguishing (or detecting) taps or multiple taps (momentary activations) relative to longer activations, e.g. based on a duration of the output signal from a sensor, the processor 145 can initiate a volume setting operation, e.g. by gradually increasing or decreasing volume levels possibly with an accompanying test tone or message or displayed message or the like.
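As a rough sketch of how taps, multiple taps, and longer activations might be distinguished based on the duration of the sensor output signal, consider the following; the thresholds and names are hypothetical assumptions, not values taken from this disclosure.

    # Hypothetical thresholds; real values would come from ergonomics testing.
    TAP_MAX_S = 0.25         # activations shorter than this count as taps
    DOUBLE_TAP_GAP_S = 0.40  # maximum gap between the taps of a double tap

    def classify(durations, gaps):
        """Classify a non-empty sequence of activation durations (seconds)."""
        if durations[-1] > TAP_MAX_S:
            return "hold"        # e.g., begin a gradual volume-setting ramp
        if len(durations) >= 2 and gaps and gaps[-1] <= DOUBLE_TAP_GAP_S:
            return "double_tap"  # e.g., toggle between private and speakerphone
        return "tap"             # e.g., step the volume level by 3 dB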
Alternatively the processor can provide a menu, using the menu generation routine 157 and driving the display accordingly via the display driver 123. A user can then select from various options on the menu, e.g. volume up or down, mute, speaker phone, private, etc. The particulars of any given embodiment of the operating mode control routine 155 and resultant actions vis-a-vis the speaker are left to the practitioner and will depend on an assessment of user ergonomics and various other practicalities, e.g. what is a single tap versus double taps versus tap-and-hold, etc.
Execution of the operating mode control routine 155 can analogously result in a change of the operating mode of the microphone 109, e.g., muting the microphone (disabling the amplifier 121 or otherwise blocking its output), changing the sensitivity of the microphone, or possibly varying other microphone characteristics as may be required, for example, in going from a private mode to a speakerphone operating mode, or alternatively bringing up a microphone-related menu with choices for control of the microphone and its functions and features. Similarly, a backlighting level can be adjusted responsive to an output from the display sensor 135, a display 111 may be adjusted in terms of contrast level or enabled or disabled responsive to the display sensor 135 output, a keypad or other user interface components 115 may be enabled or disabled in response to other user interface sensors 137, or the like. For example, when a user interface component is a display 111, the interface circuitry 117 generally includes a display driver 123 or display backlighting driver 125, and the sensor 135 is located proximate to or integrated with the display 111. The display driver 123 or the backlighting driver 125 can be configured to be responsive to the output signal provided when the sensor 135 is activated or triggered. It is further noted that the change in one or more of the operating modes noted above can be further conditioned on output signals from the other sensors 139 in the sensor system.
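One way the operating mode control routine 155 might dispatch sensor activations to mode changes, gated on contextual input from the other sensors 139, is sketched below; the mapping, identifiers, and function names are hypothetical.

    # Hypothetical dispatch from activated sensor to operating-mode change.
    ACTIONS = {
        "speaker_sensor_131": "toggle earpiece/speakerphone mode",
        "microphone_sensor_133": "mute or unmute the microphone",
        "display_sensor_135": "step the display backlighting level",
        "other_ui_sensor_137": "enable or disable the keypad",
    }

    def on_sensor(sensor_id, context_permits=True):
        """Return the mode change for a sensor, conditioned on other sensors 139."""
        if not context_permits:   # e.g., device not being held by the user
            return "no change"
        return ACTIONS.get(sensor_id, "no change")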
Turning to
A context sensor 224 is coupled to a processor or microprocessor 204. The context sensor 224 may be a single sensor or a plurality of sensors. In this exemplary embodiment, a touch sensor 211, an accelerometer 213, an infrared (IR) sensor 215, a photo sensor 217, and a proximity sensor 219, together or in any combination, make up the context sensor 224; all of which are coupled to the microprocessor 204. Other context sensors, such as a camera 240, scanner 242, and microphone 220 and the like may be used as well, i.e. the above list is exemplary rather than exhaustive. The device 200 may also have a vibrator 248 to provide haptic feedback to the user, or a heat generator (not shown), both of which are coupled to the microprocessor 204 directly or through an I/O driver (not shown).
The contextual or context sensor 224 is for sensing an environmental or contextual characteristic associated with the device 200 and sending the appropriate signals to the microprocessor 204. The microprocessor 204 takes all the input signals from each individual sensor and executes an algorithm which determines a device context depending on the combination of input signals and input signal levels. A context sensor module 244 may also perform the same function and may be coupled to the microprocessor 204 or embedded within the microprocessor 204. Optionally a proximity sensor 219 senses the proximity of a human body, e.g. hand, face, ear or the like and may condition intuitive user interface control on such proximity. The sensor may sense actual contact with another object or a second wireless communication device or at least close proximity therewith.
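A minimal sketch of the kind of algorithm the microprocessor 204 or context sensor module 244 might execute, combining the individual sensor inputs into a coarse device context, is given below; the thresholds and context labels are assumptions made for illustration only.

    # Hypothetical context classifier; thresholds and labels are illustrative.
    def determine_context(touched, accel_g, ir_near, lux):
        """Combine individual sensor readings into a coarse device context."""
        if touched and ir_near:
            return "held_near_head"       # gripped, with an object close in front
        if not touched and lux < 1.0:
            return "covered_or_face_down"
        if not touched and abs(accel_g - 1.0) < 0.05:
            return "resting_on_surface"   # stationary, approximately 1 g
        return "unknown"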
A housing (not depicted) holds the transceiver 227 made up of the receiver 228 and the transmitter circuitry 234, the microprocessor 204, the context sensor module 244, and the memory 206.
Still further in
In yet another embodiment, the electronic device 200 may carry one or more sensors, such as a touch sensor (see
Referring to
As noted above, changing the operating mode 307 can take a multiplicity of forms, depending on the particular user interface component, function, or feature as well as a practitioner's preferences and possibly other sensor signals. Some examples are shown as alternatives in the more detailed diagrams making up 307. For example, where the user interface feature or component is a microphone and a sensor is located proximate to the microphone, the changing, responsive to the output signal, can further include changing an audio level 309 corresponding to the microphone, i.e. muting or unmuting the microphone or otherwise varying the sensitivity thereof. Alternatively, where a speaker is provided together with a sensor located proximate to the speaker, the changing 307, responsive to the output signal, can include changing an output level 311 of the speaker, such as can be required when changing from a private, i.e. earpiece, mode to a loudspeaker, i.e. hands-free or speakerphone, mode of operation. The change from earpiece to hands-free mode can be an output level change to the same speaker or a switching of the signal from an earpiece speaker to a hands-free speaker. Similarly, where providing the user interface feature and the sensor includes providing a display and a sensor located proximate to or integrated with the display, the changing, responsive to the output signal, of the operating mode 307 may further enable the display, disable the display, change a display contrast level, or change (e.g., on/off or up/down) a backlighting level for the display (not depicted). For example, if the display is enabled, activation or triggering of a proximate or integral sensor could disable or turn off the display.
One other example is shown that further exemplifies various alternative processes that may be performed as part of the processes at 307 or at 305. For example, the method 300 can determine whether the output signal duration or number of repetitions of the output signal satisfies some requirement 313 and, if not, the method returns to 305. If so, a determination of whether other sensor signals are present 315, e.g. indicating the device is being held near a user's face or ear, can be performed, where if the required other sensor signals are not present, the method returns to 305. If the conditions at 313 and 315 are satisfied, a menu can be displayed for the corresponding user interface feature 317 or a mode of operation can be changed 319 for an associated user interface feature, such as the speaker, microphone, display, display backlighting, keypad or the like. Note that none, either, or both of 313 and 315 may be used as further conditional steps for any change in mode of operation. The method of
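The conditional flow at 313, 315, 317 and 319 might be expressed in software roughly as follows; the thresholds and names are hypothetical assumptions, not values specified by this disclosure.

    # Sketch of the flow at 313/315/317/319; thresholds are hypothetical.
    def handle_activation(duration_s, repetitions, other_sensors_ok):
        if duration_s < 0.2 and repetitions < 2:
            return "monitor"   # requirement at 313 not met; return to 305
        if not other_sensors_ok:
            return "monitor"   # required other sensor signals (315) absent
        # 317: display a menu for repeated taps; 319: change mode otherwise
        return "show_menu" if repetitions >= 2 else "change_mode"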
Each of the sensors is a thin flexible layered structure, e.g. a flex circuit, that includes a bottom conductive layer, e.g. copper or other conductive material, that in certain embodiments is fabricated to include a ground plane and a shield plate that is a separate structure that is non-overlapping with the ground plane and that is driven (see
The front sensor 419 includes a ground plane 425, shield plate 427, and sensor plate 429 arranged and configured substantially as shown with respective connector tabs. The shield plate 427 shields the sensor plate from the interfering hardware. Each of the three connector tabs is electrically coupled, via, for example, a connector or other electrical contact, to the appropriate circuitry (not shown). Note that the sensor plate 429 substantially overlaps or overlays the shield plate 427 and both are electrically isolated from the ground plane 425. The back sensor 421 includes a ground plane 431 and a shield plate 433 formed on the lower conductive layer and isolated from each other as depicted. The top layer includes a sensor plate 435 that overlaps the lower shield plate 433, and each of these structures includes a connector tab. The side sensor 423 includes a ground plane 437, first and second shield plates 439, 441, and first and second sensor plates 443, 445, respectively overlaying the shield plates. The first sensor plate 443 operates as the side sensor and the second sensor plate 445 may be used, for example, as a volume control or the like.
In practice these sensors would be attached to or integrated with a housing for the device, such as proximate to an inner surface of the housing. A user of the device can switch operating modes of, for example, the speaker by touching the front sensor near the speaker. This change in operating modes can be further conditioned on whether the device is being held in a user's hand, e.g. based on output signals from the back sensor 421 and the side sensor 423. This change in the speaker operating mode can be further conditioned on an output signal from the front sensor 419 that corresponds to the front of the device being near the user's head, e.g. an output signal corresponding to the sensor overlaying the thumbwheel 415.
The configuration or relative location of the eight touch sensors on the housing 500, a portion of which are included in the overall device context sensor, allows the microprocessor 204 to determine, for example, how the housing 500 is held by the user or whether the housing 500 is placed on a surface in a particular manner. When the housing 500 is held by the user, a subset of the plurality of touch sensors is activated by contact with the user's hand while the remainder are not. The particular subset of touch sensors that are activated correlates to the manner in which the user has gripped the housing 500. For example, if the user is gripping the device so as to make or initiate a telephone call (i.e., making contact with a subset of touch sensors), the first touch sensor 502 and the second touch sensor 506 will be activated in addition to the sixth touch sensor 522 on the back of the housing 500. The remaining touch sensors will typically not be active. Therefore, signals from three of the eight touch sensors are received, and in combination with each sensor's known relative position, the software in the electronic device correlates the information to a predetermined grip. In particular, this touch sensor subset activation pattern can indicate that the user is holding the device in a phone mode with the display 516 facing the user.
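The correlation of an activated touch sensor subset to a predetermined grip might be implemented as a simple table lookup, sketched below using the sensor reference numerals from this example; the table contents and grip labels are otherwise hypothetical.

    # Hypothetical grip table keyed by the set of activated touch sensors.
    GRIP_TABLE = {
        frozenset({502, 506, 522}): "phone_grip_display_toward_user",
    }

    def classify_grip(active_sensor_ids):
        """Map the set of activated touch sensors to a predetermined grip."""
        return GRIP_TABLE.get(frozenset(active_sensor_ids), "unknown_grip")

    # classify_grip({502, 506, 522}) -> "phone_grip_display_toward_user"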
In another exemplary embodiment, one touch sensor is electrically associated with a user interface component, function, or feature adjacent thereto. For example, the third touch sensor 510 which is adjacent to the speaker 512 is operative to control the speaker. Touching the area adjacent to the speaker may, for example, toggle the speaker on or off or cycle the speaker between two or more different operating modes. This provides intuitive interactive control and management of the electronic device operation.
The touch sensor in the exemplary embodiment is carried on the outside of the housing 500. A cross section illustrating the housing 500 and an exemplary touch sensor is shown in
Moving to
Turning back to
In another embodiment, the output from the IR sensor 528 and the output from the plurality of touch sensors are used to determine the contextual environment of the device 101, 200. For example, as discussed above, the volume may be controlled by the sensed proximity of objects, and in particular the user's face. To ensure that the desired operation is carried out at the appropriate time (i.e., reducing the volume of the speaker to a level appropriate for private mode), additional contextual information may be used. For example, using the touch sensors 502, 506, 510, 514, 518, 522, 524 and 526 which are carried on the housing 500, the device may determine when the housing is being gripped by the user in a manner that would coincide with holding the housing 500 adjacent to the user's face. Therefore, a combination of input signals sent to the microprocessor 204 (one, or one set, from the subset of touch sensors and a signal from the IR sensor 528 representing the close proximity of an object, i.e., the user's head) may be required to change the speaker volume. The result of sensing the close proximity of an object may also depend on the current mode of the device 101, 200. For example, if the device is a radiotelephone, but not in a call, the volume would not be changed as a result of the sensed contextual characteristic. Similar concepts and principles are applicable to adjusting microphone sensitivity or other user interface features and functions.
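The combined requirement described above (grip pattern, IR proximity, and current call mode) might reduce to a guard condition of the following form; the predicate name and labels are hypothetical.

    # Hypothetical guard: all three context conditions must hold before
    # the speaker volume is reduced to the private-mode level.
    def volume_change_permitted(grip, ir_near, in_call):
        return grip == "phone_grip_display_toward_user" and ir_near and in_call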
Similarly, a light sensor 802, as illustrated in
Similar to the example discussed above concerning context changes resulting in the change in speaker volume, when the light sensor 802 reads substantially zero, the device 801 is assumed, in one exemplary embodiment, to be placed on its back, such as on a table. In this exemplary embodiment, the device 801 could automatically configure to speakerphone mode with the volume adjusted accordingly. Another contextual characteristic would result from the light sensor sensing substantially zero light and the IR sensor sensing the close proximity of an object. This may indicate that the device is covered on both the front and back, such as in the user's shirt pocket. When this contextual characteristic is sensed, the device can change to a vibrate mode to indicate incoming calls, for example.
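The two light-sensor examples above might be captured by decision rules of the following form; the lux threshold and mode labels are hypothetical assumptions.

    # Hypothetical decision rules for the light/IR examples above.
    def select_mode(lux, ir_near):
        if lux < 1.0 and ir_near:
            return "vibrate"        # covered front and back, e.g., shirt pocket
        if lux < 1.0:
            return "speakerphone"   # lying on its back, e.g., on a table
        return "unchanged"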
Other contextual sensors may be a microphone, a global positioning system (GPS) receiver, temperature sensors or the like. The microphone may sense ambient noise to determine the device's environment. The ambient noise, in combination with any of the other contextual characteristic sensors, may be used to determine the device's context. As GPS technology is reduced in size and cost, the technology is incorporated into more and more electronic devices. Having GPS reception capability provides location and motion information as another contextual characteristic. The temperature of the device 101, 200 may also be considered as a contextual characteristic, either alone or in combination with any of the other contextual sensors of the device.
Other contextual characteristics that may be sensed by any combination of contextual sensors, including those listed above, include the manner in which the device 101, 200 is held, the relation of the device to other objects, the motion of the device (including velocity and/or acceleration), temperature, mode, ambient light, received signal strength, transmission power, battery charge level, the number of base stations in range of the device, and the number of internet access points, as well as any other context-related characteristics of the device.
When a logically associated sensor such as speaker sensor 131 shown in
When a logically associated sensor such as microphone sensor 133 shown in
It should also be noted that conductive portions of a typical display or display panel, such as a frame of such a panel (not specifically depicted), can be utilized as a portion of a sensor that is integral to the display in much the same manner as discussed above with reference to a speaker or microphone. A display including a touch sensor could use signals corresponding to the touch sensor to enable or disable the display or to control backlighting levels between on and off or among a plurality of distinct levels.
Thus an electronic device, such as a cellular phone or other communication device, that includes a user interface arranged and constructed for intuitive control of interface functionality has been shown, described, and discussed. The user interface includes various embodiments having a plurality of user interface functions and features, such as one or more user audio/visual (AV) input/output (I/O) features (one or more speakers, microphones, displays with display backlighting, keyboards, and the like). Further included are various and appropriate interface circuits coupled to the user AV I/O features and configured to couple signals to or from the user AV I/O features. To facilitate the intuitive control, a sensor, such as a capacitive sensor, is located in a position that is intuitively or logically associated with the user AV I/O feature or functionality thereof (proximate to, co-located with, or integral with) and configured to provide an output signal that changes when the sensor is triggered by proximity to a user or associated object. A processor, such as a microprocessor or dedicated integrated circuit or the like, is coupled to the output signal and configured to detect a change in the output signal and modify, responsive to the output signal or change thereto, an operating mode of the user AV I/O feature.
The electronic device including the intuitive user interface can be advantageously utilized to modify or change the operating mode of the user interface function, e.g. user AV I/O feature, in one or more intuitive manners, such as controlling or adjusting between different volume levels at a speaker, e.g. speaker phone level and private or earpiece level, muted and unmuted microphone modes, enabled and disabled display modes, various different backlighting levels for a display, or the like. For example, a user can merely touch the area of the speaker to switch between speaker phone and private modes, touch the microphone area to mute or unmute the microphone or adjust sensitivity, touch a particular portion of a keypad possibly in a particular way (for example two short taps and hold briefly) to enable the keypad rather than navigate a complex menu system or enter a lock code, or touch the display to adjust backlighting levels. The particular adjustments may be further conditioned on whether the user is holding the device, e.g. cellular phone, versus the device being disposed on another surface.
While the present inventions and what is considered presently to be the best modes thereof have been described in a manner that establishes possession thereof by the inventors and that enables those of ordinary skill in the art to make and use the inventions, it will be understood and appreciated that there are many equivalents to the exemplary embodiments disclosed herein and that myriad modifications and variations may be made thereto without departing from the scope and spirit of the inventions, which are to be limited not by the exemplary embodiments but by the appended claims.
This application is a continuation in part of and claims priority from U.S. patent application Ser. No. 10/814,370, titled METHOD AND APPARATUS FOR DETERMINING THE CONTEXT OF A DEVICE, by Kotzin et al., filed on Mar. 31, 2004. The priority application is assigned to the same assignee as the present application and is hereby incorporated herein by reference in its entirety.
Parent: U.S. application Ser. No. 10/814,370, filed March 2004, United States.
Child: U.S. application Ser. No. 11/015,566, filed December 2004, United States.