Embodiments of the invention relate generally to handheld electronic devices and more specifically to a touch interface for such a device that reconfigures in response to the way a user operates the device.
Portable handheld computing devices such as cell phones, personal digital assistants (PDAs), audio players, cameras, global positioning satellite (GPS) devices, games, etc., increasingly use a touch interface. In a touch interface, the user is presented with images of controls such as buttons, icons, links, sliders, bars, boxes, etc. The user manipulates the controls by touching the display screen on which the controls are displayed. Typically a user holds the device in one hand and uses a finger of the other hand to tap or slide a control on the screen. However, a user may often try to operate the device in a different manner, such as one-handed operation where the thumb of the holding hand is used to operate the controls. In such one-handed operation it may be difficult or impossible to operate the controls easily.
Embodiments of the invention provide a system for changing the location, number, type or other properties of touch controls. For example, a standard display of controls that works well in two-handed operation may not work well for one-handed operation where the user tries to operate the controls with a thumb of the holding hand. In such a case, the device can detect that the user is trying to operate the controls with the thumb of the holding hand, and the controls are reconfigured to be more suitable for thumb operation. Reconfiguration can also occur in other modes of operation, such as when the user is using a stylus, multiple fingers of a second (non-holding) hand, fingers on both hands, etc. In some embodiments, detection of the mode of operation can be automatic, such as where a camera takes an image to determine the mode. Or the user can change modes manually with a control or by voice, touch, gesture, device movement or other commands.
In one embodiment a method for reconfiguring controls shown on a touch screen display is disclosed. The touch screen display is included in a device, and the reconfiguring allows one or more of the controls to be more accessible for operation by a thumb of a hand holding the device. The method includes displaying at least one control in a first position on the touch screen display; accepting a signal to indicate that reconfiguring for a thumb mode of operation is desired; and changing the display in response to the signal so that the at least one control is moved to a thumb access area, wherein the at least one control is more accessible for operation by the thumb than when it was in the first position.
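The reconfiguring step described above can be sketched in code. This is a minimal illustration, not the specification's implementation: it assumes a point-based screen coordinate system with the origin at the top left, a circular thumb-reach region centered on the lower-right corner (a right-handed grip), and hypothetical names such as `Control` and `reposition_for_thumb`.

```python
import math
from dataclasses import dataclass

# Illustrative screen size in points; origin at top-left.
SCREEN_W, SCREEN_H = 320, 480
THUMB_REACH = 250  # assumed radius of comfortable thumb reach, in points

@dataclass
class Control:
    name: str
    x: float  # center coordinates
    y: float

def in_thumb_area(c, pivot=(SCREEN_W, SCREEN_H)):
    """True if the control lies within the assumed thumb-reach arc
    centered on the lower-right corner (right-handed grip)."""
    return math.hypot(c.x - pivot[0], c.y - pivot[1]) <= THUMB_REACH

def reposition_for_thumb(controls):
    """Move any control outside the thumb access area radially inward
    until it falls inside; controls already inside are left alone."""
    pivot = (SCREEN_W, SCREEN_H)
    for c in controls:
        d = math.hypot(c.x - pivot[0], c.y - pivot[1])
        if d > THUMB_REACH:
            scale = THUMB_REACH / d
            c.x = pivot[0] + (c.x - pivot[0]) * scale
            c.y = pivot[1] + (c.y - pivot[1]) * scale
    return controls
```

A real layout engine would also avoid overlapping controls after the move; this sketch only shows the "first position to thumb access area" relocation itself.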
Another embodiment provides a method for reconfiguring controls shown on a touch screen display, wherein the touch screen display is included in a device, wherein the reconfiguring allows one or more of the controls to be more adapted for use with a stylus, the method comprising: displaying at least one control in a first position on the touch screen display; accepting a signal to indicate that reconfiguring for a stylus mode of operation is desired; and changing the display in response to the signal so that the at least one control is adapted for the stylus mode.
Although specific controls are shown and discussed herein, it should be apparent that any number, type and arrangement of controls can be used in different situations. The specific details of the provided Figures are intended to provide examples to illustrate features of various embodiments of the invention. These features may be used in any other suitable devices and interfaces.
It should be apparent that by using the index finger, or other finger, of the user's other hand—in this case the right hand (not shown)—it would be a simple matter to press, swipe or otherwise activate or manipulate any of the controls on the touch screen of the device.
In general, in a standard or two-handed mode of operation the entire touch screen area may be used for controls, similar to the arrangement shown in the Figures.
The access area can be automatically defined by the device such as by having a default access area. The device can also request that the user trace an access area and use the resulting trace. The device can also use the front-facing camera 130 to image the movements of the thumb and generate the access area by observation. In one embodiment, the interface can be automatically switched from a standard mode (e.g., default or two-handed mode) of control arrangement to the one-handed mode (i.e., thumb mode) of control arrangement upon the device detecting one-handed operation by using the front-facing camera 130. For example, if a thumb is detected by the camera then the user interface is switched to thumb mode. If it is determined that an index finger is approaching the screen then the screen can be switched back to two-handed mode. Such switching may be delayed to take place after the first activation of a control so that the user does not try to press a control only to see the control move away to a different location on the screen.
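The deferred switching described above—where a camera detection requests a mode change but the layout is only reconfigured after the user's next control activation, so a control does not move away as the user reaches for it—can be sketched as follows. The class and method names, and the mapping from detected objects to modes, are illustrative assumptions, not from the specification.

```python
class ModeSwitcher:
    """Sketch of deferred mode switching: a camera detection queues a
    target mode, and the layout change is applied only after the next
    control activation completes."""

    def __init__(self):
        self.mode = "two_handed"   # current layout mode
        self.pending = None        # requested by detection, not yet applied

    def on_camera_detection(self, detected):
        # Map what the camera saw to a layout mode (assumed mapping).
        target = {"thumb": "thumb",
                  "index_finger": "two_handed",
                  "stylus": "stylus"}.get(detected)
        if target and target != self.mode:
            self.pending = target  # defer; do not reconfigure yet

    def on_control_activated(self):
        # Apply the queued mode only after the press completes, so the
        # control the user aimed at never jumps away mid-reach.
        if self.pending:
            self.mode, self.pending = self.pending, None
```

The same two-step pattern works for any detection source (camera, grip sensor, or manual command): detection proposes, activation commits.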
A user can be allowed to change the shape of the access area. Different patterns for the access area can be provided and the user can select from among them. Updates to the access area shape can be by system or application updates performed by the manufacturer of the device, operating system or another entity. Access patterns can be transferred to or from other devices or users. Other ways to define, estimate or manipulate the access area are possible.
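One way to represent a traced or user-selected access area is as a polygon of sampled touch points, with a standard ray-casting test to decide whether a candidate control position falls inside it. This is a generic computational-geometry sketch of that idea, not a technique named in the specification.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting containment test: count how many polygon edges a
    horizontal ray from (x, y) crosses; an odd count means inside.
    `polygon` is a list of (x, y) vertices, e.g. from a user's trace."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's height
            # x-coordinate where the edge crosses the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside
```

Stored this way, an access-area pattern is just a vertex list, which also makes it easy to transfer between devices or users as the text describes.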
Similar to thumb pressing, thumb swipe sensing can be modified so that the thumb is not required to make movements that might be too awkward. For example, given the curved, arcuate nature of the retracted curve 206 and extended curve 208, swipe gestures can be sensed along the natural arc of the thumb's travel rather than requiring a straight-line motion.
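An arc-shaped thumb swipe can be converted into a linear scroll amount by measuring angular travel about an assumed pivot near the base of the thumb (here, the lower-right screen corner). The pivot location and the mapping are illustrative assumptions for this sketch, not values from the specification.

```python
import math

def arc_swipe_distance(start, end, pivot=(320, 480)):
    """Interpret a thumb swipe as travel along an arc about `pivot`:
    returns the swept arc length (usable as a scroll amount), so the
    thumb need not move in a straight line."""
    a0 = math.atan2(start[1] - pivot[1], start[0] - pivot[0])
    a1 = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
    radius = math.hypot(start[0] - pivot[0], start[1] - pivot[1])
    # Normalize the angular difference into (-pi, pi] to avoid wrap-around.
    da = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    return abs(da) * radius  # arc length = |angle| * radius
```

A quarter-circle sweep at radius 100 therefore yields a scroll distance of about 157 points, regardless of how curved the finger's path was.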
A similar approach can be taken to reconfigure one or more controls on the touch screen when it is determined that a user wants to use the device in a stylus mode. A stylus, such as a pen or other pen-like device, can allow a user to point to, touch, activate or otherwise manipulate smaller controls, such as smaller icons, than the user would be able to efficiently use with their fingertip, thumb or other digit of their hand. In a stylus mode, the icons can be made smaller and/or placed closer together. The stylus mode can be entered automatically, such as by using the camera to capture and analyze an image to determine how the user is trying to use the device. Or the stylus mode can be entered manually, by accepting a signal from a user control such as an on-screen control, or a physical button, slider, etc., on the device, or by using a voice command, gesture or other type of user input.
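The "smaller and/or closer together" step can be sketched as regenerating the same control grid with a reduced icon size and a tighter cell pitch. All sizes here are illustrative assumptions (rough finger-friendly versus stylus-friendly dimensions), not values from the specification.

```python
FINGER_SIZE, FINGER_PITCH = 44, 56   # assumed finger-friendly icon size / cell pitch
STYLUS_SIZE, STYLUS_PITCH = 24, 30   # assumed smaller, denser stylus layout

def layout_grid(n_controls, columns, size, pitch):
    """Return (x, y, size) for n_controls laid out on a grid with the
    given cell pitch; used for both finger and stylus arrangements."""
    return [((i % columns) * pitch, (i // columns) * pitch, size)
            for i in range(n_controls)]

def reconfigure_for_stylus(n_controls, columns=4):
    """Stylus mode: same controls, smaller icons, tighter pitch, so more
    controls fit in the same screen area."""
    return layout_grid(n_controls, columns, STYLUS_SIZE, STYLUS_PITCH)
```

Because both modes share `layout_grid`, switching modes is just a re-layout with different constants, which matches the reconfiguration idea in the text.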
Physical controls 306 include any tactile dedicated buttons such as a volume rocker, power on/off switch, home button, etc., that may be provided on the device. Processor 308 represents any type of processor configuration including electrical, optical, quantum, chemical, biological, micro-electromechanical systems (MEMS), etc. One or more processors may be used to achieve the functionality described herein. Although specific interconnections are shown, other arrangements and interconnections of the components are possible.
Random Access Memory (RAM) 310 includes instructions that are executable by processor 308 to achieve the functions described herein. In general, RAM 310 can be any type of processor-readable storage device such as electronic, electromagnetic, optical, etc. Camera 312 includes optical, infrared, still or video, or other types of cameras that may be provided in a device. As discussed above, the inclusion of a camera can make it possible to automatically determine how the user is trying to use the device so that mode switching and control reconfiguration can be automatic. Image data captured by camera 312 is transferred to processor 308 for analysis. Processor 308 can determine, for example, whether a user is trying to operate on-screen controls with an index finger, thumb, stylus or other object.
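As a toy illustration of the kind of decision the processor might make after image analysis, the approaching object could be classified by its apparent width and the screen edge it enters from (a stylus tip is much narrower than a finger; a thumb is wider and reaches in from the grip side). The thresholds and categories below are purely hypothetical; a real implementation would rest on actual image-processing output, which the specification does not detail.

```python
def classify_pointer(apparent_width_mm, entry_edge):
    """Toy heuristic classifier for what is approaching the screen,
    based on apparent object width (mm) and entry edge.
    Thresholds are illustrative assumptions, not from the spec."""
    if apparent_width_mm < 5:
        return "stylus"         # narrow tip: pen-like object
    if apparent_width_mm > 18 and entry_edge in ("bottom", "side"):
        return "thumb"          # wide digit reaching from the grip
    return "index_finger"
```

The returned label is exactly what a mode-switching component would consume to pick thumb, stylus, or two-handed layouts.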
Camera 312 can be used to detect gesture commands. Microphone 314 can be used to receive voice commands. Other components and/or subsystems shown as 316, 318, 320 and 322 can be used for other purposes such as to output audio, communicate with other devices, store information, etc.
Although the invention has been described with respect to particular embodiments thereof, these particular embodiments are merely illustrative, and not restrictive. Details, including camera detection of user operations, can be found in documents incorporated by reference at the beginning of this specification.
Larger devices may be adaptable for use with features described herein even though the devices may be considered too large for easy “handheld” or “portable” operation. For example, tablet or slate computers such as the iPad™ by Apple Computer, Inc. can be used even though these devices are significantly larger than cell phones.
Any suitable programming language can be used to implement the routines of particular embodiments including C, C++, Java, assembly language, etc. Different programming techniques can be employed such as procedural or object oriented, scripts, interpreted or compiled code, etc. The routines can execute on a single processing device or multiple processors. Although the steps, operations, or computations may be presented in a specific order, this order may be changed in different particular embodiments. In some particular embodiments, multiple steps shown as sequential in this specification can be performed at the same time.
Particular embodiments may be implemented in a computer-readable storage medium for use by or in connection with the instruction execution system, apparatus, system, or device. Particular embodiments can be implemented in the form of control logic in software or hardware or a combination of both. The control logic, when executed by one or more processors, may be operable to perform that which is described in particular embodiments.
Particular embodiments may be implemented by using a programmed general purpose digital computer, application specific integrated circuits, programmable logic devices, or field programmable gate arrays; optical, chemical, biological, quantum or nano-engineered systems, components and mechanisms may also be used. In general, the functions of particular embodiments can be achieved by any means as is known in the art. Distributed, networked systems, components, and/or circuits can be used. Communication, or transfer, of data may be wired, wireless, or by any other means.
It will also be appreciated that one or more of the elements depicted in the drawings/figures can also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application. It is also within the spirit and scope to implement a program or code that can be stored in a machine-readable medium to permit a computer to perform any of the methods described above.
As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” includes plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
Thus, while particular embodiments have been described herein, latitudes of modification, various changes, and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of particular embodiments will be employed without a corresponding use of other features without departing from the scope and spirit as set forth. Therefore, many modifications may be made to adapt a particular situation or material to the essential scope and spirit.
This application claims priority from U.S. Provisional Patent Application Ser. No. 61/590,284, entitled “USER INTERFACE USING DEVICE AWARENESS”, filed on Jan. 24, 2012, which is hereby incorporated by reference as if set forth in full in this document for all purposes.
Number | Date | Country
---|---|---
20130188081 A1 | Jul 2013 | US

Number | Date | Country
---|---|---
61590284 | Jan 2012 | US