The present disclosure is related generally to user-interface techniques for computing devices and, more particularly, to a system and method for responding to a touch input on a user interface of a computing device.
As mobile devices have diminished in size, new methods of user input have developed. For example, while user input was initially received exclusively via hardware such as buttons and sliders, users are now able to interface with many mobile devices via touch-screen inputs. Despite the general effectiveness of such input methods, they often draw a great deal of power from an internal power source because they require an always-on processor. Input technology that takes account of processor power schemes could therefore provide greater power-saving capability.
The present disclosure is directed to a system that may provide enhanced power saving capabilities. However, it should be appreciated that any such benefits are not a limitation on the scope of the disclosed principles or of the attached claims, except to the extent expressly noted in the claims. Additionally, the discussion of technology in this Background section is merely reflective of inventor observations or considerations and is not an indication that the discussed technology represents actual prior art.
While the appended claims set forth the features of the present techniques with particularity, these techniques, together with their objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
In overview of the disclosed principles, an electronic device may include two processors, that is, a first processor and a second processor. The first processor is a general-purpose (or “application”) processor. While broadly capable, this first processor tends to use a significant amount of power, which may present an energy-use challenge for small, battery-powered devices. To address the issue of excessive power consumption and for other reasons, the electronic device's second processor may use significantly less power than the first processor. In some embodiments, this second, low-power processor may be or include a sensor hub.
In an example method for responding to a touch input, the first processor is placed in a very low power (or “sleep”) mode. While the first processor sleeps, the second processor monitors the environment of the device. Based on this monitoring, the second processor may decide that the device needs to perform some task beyond the capabilities of the second processor. For example, the second processor may detect a button press or a swipe gesture from a user that indicates that the user wishes to interact with the device. In this situation, the second processor wakes up the first processor. The first processor then performs whatever work is required of it.
Eventually, there may be no more work for the first processor to perform. For example, the user may eventually finish his interaction with the device and put the device in a pocket. At this point, the first processor goes to sleep in order to save power, while the second processor remains on, sensing the environment. In some embodiments, while the first processor is asleep, the second processor monitors a touch-input system for specific inputs. If an input is received that is one of a set of specific inputs, then the second processor wakes the first processor to respond to the input; otherwise, the input is ignored. In one example of a specific input, the second processor may ignore all inputs except a “wake up” touch gesture from the user. In some implementations, the touch-input system itself is intelligent enough to recognize gestures. In such examples, the touch-input system instructs the second processor as to what type of gesture has been received. In other implementations, the second processor interprets touch information itself to determine if a specific gesture has been performed.
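As a rough illustration of this monitoring behavior, the sketch below shows how firmware on the second processor might filter touch events against a set of specific wake inputs. It is a minimal sketch, not the disclosure's required implementation: the gesture codes, the touch_event_t structure, and the next_touch_event() and wake_application_processor() hooks are hypothetical stand-ins for platform-specific equivalents.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical gesture codes reported by an intelligent touch controller. */
typedef enum {
    GESTURE_NONE,
    GESTURE_TAP,
    GESTURE_SWIPE,
    GESTURE_WAKE_UP          /* the designated "wake up" gesture */
} gesture_t;

/* A touch event as delivered to the second processor. */
typedef struct {
    gesture_t gesture;       /* gesture classified by the touch controller */
    uint16_t  x, y;          /* touch location on the screen */
} touch_event_t;

/* Hypothetical platform hooks in the sensor-hub firmware. */
extern bool next_touch_event(touch_event_t *ev);   /* blocks for the next event */
extern void wake_application_processor(void);      /* asserts a wake signal */

/* Is this one of the specific inputs that justifies a wake? */
static bool is_wake_input(const touch_event_t *ev)
{
    return ev->gesture == GESTURE_WAKE_UP;
}

/* Monitoring loop run by the second (low power) processor while the
 * first (application) processor sleeps. */
void monitor_while_first_processor_sleeps(void)
{
    touch_event_t ev;
    while (next_touch_event(&ev)) {
        if (is_wake_input(&ev))
            wake_application_processor();
        /* All other inputs are ignored, keeping the first processor asleep. */
    }
}
```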
In another example, the second processor may logically divide a screen of the touch-input system into “live” and “non-live” areas. For example, just before the first processor goes to sleep, it may display one or more selectable icons on the screen, or the first processor may tell the second processor to display these icons. Areas associated with these icons are considered to be “live,” while the remainder of the screen is considered to be non-live. If, while the first processor is asleep, a touch is received that corresponds to a location of one of these icons, then the second processor wakes the first processor. Touches received in non-live areas are ignored. Because the designation of areas of the screen as live or non-live ultimately depends upon the first processor, these areas may change.
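The live/non-live division can be viewed as a hit test of the touch location against a list of regions registered by the first processor before it sleeps. The following minimal sketch assumes rectangular icon areas and reuses a hypothetical wake_application_processor() hook; the region structure, the fixed region limit, and the handler name are illustrative assumptions.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* A rectangular live area, e.g. the bounds of a selectable icon. */
typedef struct {
    uint16_t x, y, w, h;
} live_area_t;

#define MAX_LIVE_AREAS 8

/* Registered by the first processor just before it goes to sleep; the
 * designation may change each time the first processor sleeps. */
static live_area_t live_areas[MAX_LIVE_AREAS];
static size_t      live_area_count;

/* Hypothetical platform hook, as in the sketch above. */
extern void wake_application_processor(void);

static bool in_live_area(uint16_t x, uint16_t y)
{
    for (size_t i = 0; i < live_area_count; i++) {
        const live_area_t *a = &live_areas[i];
        if (x >= a->x && x < a->x + a->w &&
            y >= a->y && y < a->y + a->h)
            return true;
    }
    return false;  /* the remainder of the screen is non-live */
}

/* Touch handler run by the second processor in this embodiment. */
void handle_touch_at(uint16_t x, uint16_t y)
{
    if (in_live_area(x, y))
        wake_application_processor();
    /* Touches in non-live areas are ignored. */
}
```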
There are multiple options for connecting the first and second processors. In one implementation, touch events are sent in parallel to both processors. When the second processor wakes the first processor in this embodiment, the first processor already has access to the relevant touch event. In another implementation, all touch events go only to the second processor. If the second processor decides to wake the first processor in this embodiment, then the second processor sends the relevant touch event to the first processor.
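A minimal sketch of these two interconnect options follows, assuming a hypothetical wake line and inter-processor message path; the wake_application_processor() and send_to_application_processor() calls are stand-ins for platform-specific mechanisms.

```c
#include <stdint.h>

/* A touch event as it might be delivered by the touch-input system. */
typedef struct {
    uint16_t x, y;        /* touch location on the screen */
    uint32_t timestamp;   /* when the touch occurred */
} touch_event_t;

/* Hypothetical platform hooks. */
extern void wake_application_processor(void);
extern void send_to_application_processor(const touch_event_t *ev);

/* Option 1: touch events are delivered to both processors in parallel,
 * so the second processor only asserts the wake signal; the first
 * processor already has access to the relevant event. */
void wake_parallel(void)
{
    wake_application_processor();
}

/* Option 2: touch events reach only the second processor, which must
 * forward the relevant event after waking the first processor. */
void wake_and_forward(const touch_event_t *ev)
{
    wake_application_processor();
    send_to_application_processor(ev);
}
```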
Turning to the drawings, wherein like reference numerals refer to like elements, techniques of the present disclosure are illustrated as being implemented in a suitable environment. The following description is based on example embodiments and should not be taken as limiting the claims with regard to alternative embodiments that are not explicitly described herein.
Referring now to FIG. 1, there is illustrated an example electronic device 100.
In an example embodiment, the electronic device 100 has a housing 101 comprising a front surface 103 which includes a visible display 105 and a user interface. For example, the user interface may be a touch screen including a touch-sensitive surface that overlays the display 105. In another embodiment, the user interface or touch screen of the electronic device 100 may include a touch-sensitive surface supported by the housing 101 that does not overlay any type of display. In yet another embodiment, the user interface of the electronic device 100 may include one or more input keys 107. Examples of the input keys 107 include, but are not limited to, keys of an alphabetic or numeric keypad or keyboard, physical keys, touch-sensitive surfaces, mechanical surfaces, multipoint direction keys, and side buttons or side keys.
The electronic device 100 may also comprise apertures 109, 111 for audio output and input at the front surface 103. It is to be understood that the electronic device 100 may include a variety of different combinations of displays and interfaces. The electronic device 100 may include one or more sensors 113 positioned at or within an exterior boundary of the housing 101. For example, one such sensor 113 is illustrated by FIG. 1.
Turning now to FIG. 2, there is illustrated a block diagram of example internal components 200 of the electronic device 100. The components 200 may include one or more transceivers 201, an application processor 203, a low power processor 204, a memory 205, one or more output components 207, and one or more input components 209.
The internal components 200 may further include a device interface 215 to provide a direct connection to auxiliary components or accessories for additional or enhanced functionality. In addition, the internal components 200 preferably include a power source or supply 217, such as a portable battery, for providing power to the other internal components and to allow portability of the electronic device 100.
Further, the application processor 203 and the low power processor 204 may both generate commands based on information received from one or more input components 209. The processors 203, 204 may process the received information alone or in combination with other data, such as information stored in the memory 205. Thus, the memory 205 may be used by the processors 203, 204 to store and retrieve data. Additionally, the components 200 may include one or more processors in addition to the application processor 203 and the low power processor 204.
The data that may be stored by the memory 205 include, but are not limited to, operating systems, applications, and data. Each operating system includes executable code that controls basic functions of the electronic device 100, such as interaction among the internal components 200, communication with external devices via each transceiver 201 or the device interface 215, and storage and retrieval of applications and data to and from the memory 205. Each application may include executable code utilizing an operating system to provide more specific functionality for the electronic device 100. Data are non-executable code or information that may be referenced or manipulated by an operating system or application for performing functions of the electronic device 100.
The input components 209, such as a user interface, may produce an input signal in response to detecting a predetermined gesture at a touch input 219, which may be a gesture sensor. In the present example, the touch input 219 is a touch-sensitive surface substantially parallel to the display 105. The touch input 219 may further include at least one of a capacitive touch sensor, a resistive touch sensor, an acoustic sensor, an ultrasonic sensor, a proximity sensor, or an optical sensor.
The input components 209 may also include other sensors, such as a visible light sensor, a motion sensor, and a proximity sensor. Likewise, the output components 207 of the internal components 200 may include one or more video, audio, or mechanical outputs. For example, the output components 207 may include a video-output component such as a cathode-ray tube, liquid-crystal display, plasma display, incandescent light, fluorescent light, front or rear projection display, or a light-emitting diode indicator. Other examples of output components 207 include an audio-output component such as a speaker, alarm, or buzzer, or a mechanical output component such as vibrating or motion-based mechanisms.
Although the input components 209 described above are intended to cover all types of input components included or utilized by the electronic device 100, the components 200 may include additional sensors 223. The various sensors 223 may include, but are not limited to, power sensors, temperature sensors, pressure sensors, moisture sensors, accelerometer or gyroscopic sensors, or other sensors, such as ambient-noise sensors, light sensors, motion sensors, proximity sensors, and the like.
It is to be understood that FIG. 2 is provided for illustrative purposes only and is not intended to be a complete schematic diagram of the various components that may be included in the electronic device 100.
Referring now to FIG. 3, there is illustrated an example configuration in which the touch input 219 includes a touch-input screen 301 in communication with both the application processor 203 and the low power processor 204.
In some embodiments, the application processor 203 may be in a very low power state (a “sleep mode”). While the application processor 203 is in the sleep mode, the low power processor 204 receives information associated with a touch from the touch-input system 219. The touch information may include a location of the touch on the touch-input screen 301; the touch itself may be a single-point touch, a multi-point touch, or any recognizable gesture. When the low power processor 204 receives the touch information, it either ignores the touch or wakes the application processor 203, based on the location of the touch.
Waking the application processor 203 may be done by sending a handover signal from the low power processor 204 to the application processor 203. In some examples, the application processor 203 may receive information associated with the touch from the touch input 219 upon waking from the sleep mode. Further, the application processor 203 may transition from the sleep mode to a non-sleep mode upon waking.
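From the application processor's side, this wake sequence might look like the following minimal sketch; the sleep, handover-wait, and event-read primitives are hypothetical stand-ins for a platform's power-management and inter-processor interfaces.

```c
#include <stdbool.h>
#include <stdint.h>

typedef struct { uint16_t x, y; } touch_event_t;

/* Hypothetical platform primitives available to the application processor. */
extern void sleep_until_handover_signal(void);          /* enter sleep mode; return on wake */
extern bool read_pending_touch(touch_event_t *ev);      /* from the touch input 219 */
extern void respond_to_touch(const touch_event_t *ev);  /* normal user-interface handling */

void application_processor_main_loop(void)
{
    for (;;) {
        /* Transition to the sleep mode; execution resumes when the low
         * power processor 204 sends the handover signal. */
        sleep_until_handover_signal();

        /* Now in the non-sleep mode: retrieve and respond to the touch
         * that caused the wake. */
        touch_event_t ev;
        if (read_pending_touch(&ev))
            respond_to_touch(&ev);
    }
}
```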
In some examples, the low power processor 204 is configured for displaying information via the touch screen 301 while the application processor 203 is in sleep mode. Additionally or alternatively, the application processor 203 may display information via the touch screen 301 while the application processor 203 is in the non-sleep mode.
Continuing, the flow chart 400 of FIG. 4 illustrates an example method for responding to a touch input received while the application processor 203 is in the sleep mode.
In an alternative embodiment shown in FIG. 5, the low power processor 204 logically divides the touch screen 301 into live areas 502 and non-live areas 504. For example, just before the application processor 203 goes to sleep, it may display one or more selectable icons on the screen 301; areas associated with these icons are designated as the live areas 502, while the remainder of the screen is designated as the non-live areas 504.
If, while the application processor 203 is asleep, a touch is received that corresponds to one of the live areas 502, then the low power processor 204 wakes the application processor 203. Touches received in the non-live areas 504 are ignored. Because the designation of areas of the screen as live or non-live ultimately depends upon the application processor 203, these areas 502, 504 may change over time.
The flow chart 600 of FIG. 6 illustrates an example method employing these live and non-live areas 502, 504.
In view of the many possible embodiments to which the principles of the present disclosure may be applied, it should be recognized that the embodiments described herein with respect to the drawing figures are meant to be illustrative only and should not be taken as limiting the scope of the claims. Therefore, the techniques as described herein contemplate all such embodiments as may come within the scope of the following claims and equivalents thereof.
The present application claims priority to U.S. Provisional Patent Application 61/748,794, filed on Jan. 4, 2013, which is incorporated herein by reference in its entirety.