Handheld devices have become increasingly prevalent, taking forms such as cellular phones, wireless phones, smartphones, music players, video players, netbooks, laptop computers, e-reading devices, tablet computers, cameras, controllers, remote controls, analytic devices, sensors, and many other types of devices.
User interfaces for handheld devices have become increasingly sophisticated, and many user interfaces now include color bitmap displays. Furthermore, many user interfaces utilize touch sensitive color displays that can detect touching by a finger or stylus. There are many varieties of touch sensitive displays, including those using capacitive sensors, resistive sensors, and active digitizers. Some displays are limited to detecting only single touches, while others are capable of sensing multiple simultaneous touches.
Touch sensitive displays are convenient in handheld devices because of the simplicity of their operation to the user. Menu items can be displayed and a user can interact directly with the menu items by touching or tapping them, without the need to position or manipulate an on-screen indicator such as a pointer, arrow, or cursor. Furthermore, the touch capabilities of the display reduce the need for additional hardware input devices such as buttons, knobs, switches, mice, pointing sticks, track pads, joysticks, and other types of input devices.
One disadvantage of touch sensitive user interfaces, however, is that a user's finger can often obstruct the user's view of the display, and repeated touching of the display can result in fingerprints and smudges that obscure the display. Furthermore, it may be awkward in some devices for a user to both hold the device and to provide accurate touch input via the display, especially with one hand. Because of this, many devices are more awkward in operation than would be desirable.
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
Handheld device 100 has a touch-sensitive sensor 103, also referred to herein as a touch panel. Touch panel 103 is situated in the alternate surface, in this embodiment facing away from a user who is holding handheld device 100. In operation, a user's finger, such as the user's index finger, may be positioned over or on touch panel 103; touch panel 103 is positioned in such a way as to make this finger placement comfortable and convenient.
Touch panel 103 has multiple areas that are tactually delineated from each other so that a user can distinguish between the areas by touch. In the described embodiment, the areas comprise a plurality of successively nested or hierarchically arranged annular rings or bands 104. In the illustrated example, there are three such bands: an outer band 104(a), a middle band 104(b), and an inner band 104(c). Bands 104 may be concentric in some embodiments, and may surround a common central touch area 105. Individual bands 104 may be referred to as touch bands in the following discussion.
In the described embodiment, each of bands 104 has a different elevation or depth relative to alternate surface 102 of handheld device 100. There are steps or discontinuous edges between the different elevations that provide tactile differentiation between areas or bands 104, allowing a user to reliably locate a particular touch band, via tactile feedback with a finger, without visually looking at touch panel 103.
In this example, each successively inward band is stepped down in elevation from alternate surface 102 or from its outwardly neighboring band. In particular, outer band 104(a) is stepped down from alternate surface 102 and therefore is deeper or has a lower elevation than alternate surface 102. Middle band 104(b) is stepped down from its outwardly neighboring band 104(a) and is therefore deeper and has a lower elevation than outer band 104(a). Inner band 104(c) is stepped down from its outwardly neighboring band 104(b) and is therefore deeper and has a lower elevation than middle band 104(b). Similarly, central area 105 is stepped down from surrounding inner band 104(c) and is therefore deeper and has a lower elevation than inner band 104(c). Those of skill in the art will understand that touch bands 104 may instead each successively extend upward from the bordering larger band. Thus, outer band 104(a) may be lower than middle band 104(b), which in turn is lower than inner band 104(c), which is in turn lower than central area 105, thus forming a convex arrangement. In another embodiment, the respective bands may all share the same level, but may be tactually detectable by virtue of a raised border between them. For purposes of simplicity, however, the disclosed embodiment will address only a concave arrangement of touch panel 103.
The progressively and inwardly increasing depths of bands 104 and central area 105 relative to alternate surface 102 create a concavity or depression 106 relative to alternate surface 102. Position and dimensions of touch panel 103 can be chosen so that a user's index finger naturally locates and rests within concavity 106, such that it is comfortable to move the finger to different locations around touch panel 103.
Bands 104 can be irregularly shaped or can form a wide variety of shapes such as circles, ovals, rectangles, or squares. In the illustrated embodiment, bands 104 are irregularly shaped to allow easy finger positioning at desired locations. The irregular shape of bands 104 allows a user to learn the orientation of the bands and thus aids in non-visual interaction with touch panel 103.
Touch panel 103 is sensitive to touch, and can detect the particular location at which it is touched or pressed. Thus, it can detect which individual band 104 is touched, and the position or coordinates along the band of the touched location. A user can slide his or her finger radially between bands 104 or around a single band 104, and touch panel 103 can detect the movement and absolute placement of the finger as it moves along or over the bands. Central area 105 is also sensitive to touch in the same manner.
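The band and position detection just described can be sketched as a radial hit test. The concentric circular geometry and the normalized band radii below are simplifying assumptions for illustration; because the disclosed bands are irregularly shaped, a real implementation would more likely use a per-band mask or lookup table.

```python
import math

# Hypothetical band boundaries as normalized radii (illustrative only;
# the disclosed bands are irregularly shaped, not true circles).
BAND_RADII = [0.25, 0.55, 0.80, 1.00]
BAND_NAMES = ["center", "inner", "middle", "outer"]

def locate_touch(x, y):
    """Map a touch point (origin at the panel center, coordinates
    normalized to [-1, 1]) to a band name and an angular position
    in degrees along that band, or None if outside the panel."""
    r = math.hypot(x, y)
    if r > 1.0:
        return None  # touch fell outside the touch panel
    angle = math.degrees(math.atan2(y, x)) % 360.0
    for radius, name in zip(BAND_RADII, BAND_NAMES):
        if r <= radius:
            return name, angle
    return None
```

A controller polling the sensor would call `locate_touch` on each reported coordinate to determine which band is touched and where along the band the finger lies, which is exactly the information needed to track radial and circumferential finger movement.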
Touch panel 103 can be implemented using capacitive, resistive, or pressure sensing technology, or using other technologies that can detect a user's finger placement. Touch panel 103 may also integrate additional sensors, such as sensors that detect the pressing or depression of central area 105 or other areas of touch panel 103.
Different embodiments may utilize different numbers of bands; for example, some embodiments may use only a single band or two bands. Furthermore, the bands may be shaped and positioned differently.
As an example of a different touch area configuration,
Tactile delineation between touch bands 401 and 402 can be provided by a ridge or valley between the bands. Alternatively, the bands can have different elevations relative to right side surface 403.
Display panel 501 can be used as part of a user interface to operate handheld device 100. It can also be used to display content, such as text, video, pictures, etc.
A graphical menu 502 can be displayed at times on front display 501. Menu 502 has a plurality of graphically- or visually-delineated menu areas or bands 504 corresponding respectively to the tactually-delineated touch sensitive areas 104 on alternate surface 102. In this example, menu areas 504 include an outer band 504(a), a middle band 504(b), and an inner band 504(c). In addition, menu 502 includes a center visual area 505.
Generally, graphical menu 502 faces the user, and touch panel 103 faces away from the user. However, display panel 501 and touch panel 103 may or may not be precisely parallel with each other. Although in particular embodiments it may be desirable to position graphical menu 502 so that it is directly in front of and aligned with touch panel 103 as illustrated, other arrangements may work well in certain situations. In particular, in some embodiments there may be a lateral and/or angular offset between graphical menu 502 and touch panel 103, such that touch panel 103 is not directly behind menu 502 or is not parallel with the surface of display panel 501. Furthermore, the correspondence in size and shape between the menu bands and the touch bands may not be exact in all embodiments. Thus, the bands and center area of touch panel 103 and menu 502 may differ from one another, but will be similar enough that when a user interacts with touch panel 103, the user perceives it to have a one-to-one positional correspondence with the elements of menu 502.
In operation, as will be described in more detail below, menu items are displayed in menu bands 504. Each displayed menu item is located at a particular point on a menu band 504, and therefore corresponds to a similar point on corresponding touch band 104 of touch panel 103. A particular menu band 504 can be selected or activated by touching its corresponding touch band. A particular menu item can be selected or activated by touching the corresponding position or location on the corresponding touch band 104.
Generally, touching any particular location on touch panel 103 can be considered similar to touching or clicking on the corresponding location on graphical menu 502. If a user desires to select a menu item or some other graphical object positioned at a particular point on menu 502, for example, he or she presses the corresponding point or location on touch panel 103. The tactual delineations between bands of touch panel 103 help the user identify and move between graphical menu bands to locate particular menu item groups.
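The positional correspondence described above can be sketched as a lookup from a touch band and angular position to the nearest menu item. The slot angles below are illustrative assumptions (the item labels follow the example bands described later); a real layout for irregularly shaped bands would differ.

```python
# Hypothetical angular layout for two menu bands. Slot angles are
# assumptions for illustration; item labels follow the example menu
# structure described in this disclosure.
MENU = {
    "inner": {0: "ITEM A1", 60: "ITEM A2", 120: "ITEM A3",
              180: "ITEM A4", 240: "ITEM A5", 300: "ITEM A6"},
    "outer": {0: "ITEM B1", 51: "ITEM B2", 103: "ITEM B3",
              154: "ITEM B4", 206: "ITEM B5", 257: "ITEM B6",
              309: "ITEM B7"},
}

def item_at(band, angle):
    """Return the item on the given band whose slot angle is closest
    to the touched angle, treating 0 and 360 degrees as adjacent."""
    def wrap_dist(slot):
        d = abs(slot - angle) % 360.0
        return min(d, 360.0 - d)
    slots = MENU[band]
    return slots[min(slots, key=wrap_dist)]
```

Because the lookup uses the nearest slot rather than exact coordinates, small differences in shape between a touch band and its corresponding menu band do not prevent a touch from resolving to the intended item.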
Generally, each of the menu bands 701 and 702 contains a group of related menu items. Each menu item may be represented by text or a graphical element, object, or icon. In this example, the items are represented by text. Inner menu band 702 contains menu items labeled “ITEM A1”, “ITEM A2”, “ITEM A3”, “ITEM A4”, “ITEM A5” and “ITEM A6”. Outer menu band 701 contains menu items labeled “ITEM B1”, “ITEM B2”, “ITEM B3”, “ITEM B4”, “ITEM B5”, “ITEM B6”, and “ITEM B7”.
Each menu band 701 and 702 may also have a band heading or title, indicating the category or type of menu items contained within the band. In this example, inner menu band 702 has a heading “GROUP A”, and outer menu band 701 has a heading “GROUP B”.
Generally, individual menu items correspond to actions, and selecting a menu item initiates the corresponding action. Thus, handheld device 100 is configured to initiate actions associated respectively with the menu items in response to their selection.
In a configuration such as this, touch panel 103 may be symmetrical, with bands that are the same width on their left and right sides. Menu 502 might be non-symmetrical, similar to menu structure 700. The non-symmetry of menu 502 might allow menu item labels and icons to easily fit within its right-hand side. However, the slight differences between the shapes of the touch bands and the corresponding menu bands will likely be nearly imperceptible to a user, or at least easily ignored. This arrangement allows menu 502 to be displayed using either a right-hand or left-hand orientation, depending on preferences of a user, while using the same touch panel 103.
User interaction can be implemented in different ways. For purposes of discussion, interaction with touch panel 103 will be described with reference to bands and locations of menu structure 700. Thus, “touching” or “tapping” ITEM A1 is understood to mean that the user touches the corresponding location on touch panel 103.
Menu structure 700 can be sensitive to the context that is otherwise presented by handheld device 100. In other words, the particular menu items found on menu 700 may vary depending on the activity that is being performed on handheld device 100. Furthermore, different bands of menu 700 can have menu items that vary depending on a previous selection within a different band. Specific examples will be described below.
In certain embodiments, menu 700 may be activated or initiated by touching center touch area 105 of touch panel 103. In response, handheld device 100 displays menu 700. Alternatively, menu 700 might be activated by touching any portion of touch panel 103, or by some other means such as by interaction with front-surface elements of handheld device 100.
Upon initially displaying menu structure 700, individual menu items may or may not be displayed. For example, upon initial display, each menu band may only indicate its group heading or title, and the individual menu items may be hidden.
After activating menu structure 700 by touching center area 703, the user may touch one of the touch bands to activate or reveal the menu items within that touch band. For example, the user may touch inner band 702, which causes device 100 to activate that band and to display or reveal its individual menu items. In addition, activating a particular band might result in that band being highlighted in some manner, such as by an animation, bold text, or distinguishing shades or colors. Activation or selection of a band might also be indicated by enlarging that band on displayed menu 700 in relation to other, non-activated bands.
Another band might be activated by touching it, or by selecting an item from a first band. For example, outer band 701 may contain items that depend on a previous selection made from the items of inner band 702. Thus, touching or selecting an item within inner band 702 may activate outer band 701, and outer band 701 might in this scenario contain items or commands related to the menu item selected from inner band 702.
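The dependency between bands described above can be sketched as a simple mapping from an inner-band selection to a set of related outer-band commands. The category and command names below are illustrative only and are not specified by this disclosure.

```python
# Illustrative mapping from an inner-band selection to the related
# commands shown in the outer band. Names are hypothetical examples.
CONTEXT_ITEMS = {
    "Edit": ["Crop", "Rotate", "Resize"],
    "Share": ["eMail", "Text", "Post"],
}

class RadialMenu:
    def __init__(self):
        self.inner = list(CONTEXT_ITEMS)  # inner-band category items
        self.outer = []                   # outer band starts empty

    def select_inner(self, item):
        """Selecting an inner-band item activates the outer band and
        populates it with commands related to that selection."""
        self.outer = CONTEXT_ITEMS.get(item, [])
        return self.outer
```

In this sketch, touching "Edit" on the inner band would cause the outer band to display editing commands, mirroring the scenario in which outer-band items depend on a previous inner-band selection.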
Selection of a band or menu item may be made by touching and releasing the corresponding location on touch panel 103. Alternatively, selection may be made by touching at one location, sliding to another location, and releasing. For example, menu structure 700 may be implemented such that touching center area 703 opens menu structure 700, and sliding to inner band 702 allows the user to move to a menu item on inner band 702. Releasing when over a particular menu item might select or activate that menu item.
Selection within menu structure 700 or within a band of menu structure 700 may be accompanied by a highlight indicating the location of the user's finger at any time within the menu structure. For example, touching in a location on touch panel 103 in a location corresponding to ITEM A1 may cause ITEM A1 to become bold or otherwise highlighted. Furthermore, any area that is currently being touched can be made to glow on display panel 501, or some similar visual mechanism can be used to indicate finger placement and movement on menu structure 700. Thus, a user might touch a menu band, move his or her finger along the menu band until the desired menu item is highlighted, and then release his or her touch, thereby activating the menu item that was highlighted upon the touch release.
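The touch-slide-release behavior described above, including the moving highlight, can be modeled as a small gesture tracker. The class name, event methods, and activation callback below are assumptions for illustration; they are not part of the disclosure.

```python
# Minimal sketch of the touch-slide-release gesture: touching
# highlights the item under the finger, sliding moves the highlight,
# and releasing activates whatever item was highlighted at release.
class GestureTracker:
    def __init__(self, item_under_finger, on_activate):
        # item_under_finger: (x, y) -> menu item or None (hit test)
        # on_activate: callback invoked with the selected item
        self._item_under = item_under_finger
        self._activate = on_activate
        self.highlighted = None

    def touch_down(self, x, y):
        self.highlighted = self._item_under(x, y)

    def touch_move(self, x, y):
        # The highlight follows the finger as it slides along a band.
        self.highlighted = self._item_under(x, y)

    def touch_up(self):
        item, self.highlighted = self.highlighted, None
        if item is not None:
            self._activate(item)
        return item
```

The display logic would render `highlighted` in bold or with a glow, giving the user continuous visual feedback about finger placement on the rear panel before committing to a selection on release.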
The user interface arrangement described above can be used in a variety of ways. The following examples assume the use of front-facing display panel 501 and rear-facing touch panel 103. For purposes of example and illustration, touch panel 103 will not be explicitly shown in the figures accompanying this discussion. It is assumed that in the examples described, touch panel 103 lies directly behind the illustrated graphical menus, and that the touch bands of the touch panel have shapes and sizes that correspond at least roughly with the menu bands of the displayed graphical menus. User interactions with the touch panel will be described with reference to corresponding points on the displayed graphical menus.
Suppose, for example, that the user wants to crop the displayed picture 801. The user first touches and releases center area 703 to activate menu 700. The user then touches inner band 702, which reveals menu items 901 relating to editing actions. The user moves his or her finger until touching the menu item “Crop”, and releases. This causes device 100 to display an on-screen tool for cropping picture 801. Although this tool is not illustrated, picture 801 may be again displayed in full size on front display panel 501, as in
In the example of
In this example, “Jim” has been previously selected from inner band 1203 and is displayed in center area 1204 as the object of any selected operations. The menu items and corresponding operations include “eMail”, “Text”, “Call”, “Chat”, and “Twitter”. The available menu items might vary depending on the information available for the selected contact. For example, some contacts might only include a telephone number, and communications options might therefore be limited to texting and calling. Other contacts might include other information such as Chat IDs, and a “Chat” activity might therefore be available for these contacts. Thus, the menu items available in this band are sensitive to the menu context selected in previous interactions with menu 1200.
The above usage scenarios are only examples, and the described user interaction techniques might be useful in many different situations. As another example, the described menu structure might be used as an application launcher, with different types of applications being organized within different menu bands. End-users may be given the ability to organize applications within menu bands in accordance with personal preferences.
The described menu structure might also be used as a general context menu, presenting operations such as copy, paste, delete, add bookmark, refresh, etc., depending on operations that might be appropriate at a particular time when the menu structure is opened. Again, different types of operations might be presented in different menu bands, such as “edit” operations in an inner band and “sharing” operations in an outward band.
Furthermore, support for the menu structure can be provided through an application programming interface (API) and corresponding software development kit (SDK) to allow the menu functionality to be used and customized by various application programs. In addition, the operating system of the handheld device can expose APIs allowing application programs to register certain activities and actions that might be performed with respect to certain types of objects, or in certain contexts. Registering in this manner would result in the indicated activities or actions being included in the contextual menus described above.
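One possible shape for such a registration API is sketched below, under the assumption of a simple per-object-type registry. All names here are hypothetical; the disclosure does not specify the actual API surface.

```python
# Hedged sketch of the registration mechanism described above: an
# application registers actions for a type of object, and the system
# includes those actions when building a contextual menu for an
# object of that type. All identifiers are illustrative assumptions.
class MenuRegistry:
    def __init__(self):
        self._actions = {}  # object type -> list of (label, handler)

    def register(self, object_type, label, handler):
        """Called by an application to register an action it can
        perform on objects of the given type."""
        self._actions.setdefault(object_type, []).append((label, handler))

    def contextual_items(self, object_type):
        """Return the menu item labels registered for this object
        type, for display in an appropriate menu band."""
        return [label for label, _ in self._actions.get(object_type, [])]
```

Under this model, an operating system building a contextual menu for, say, a selected image would query the registry for that object type and place the resulting items in the appropriate band.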
An action 1602 comprises displaying menu items in the menu bands. As already described, each menu item corresponds to a position on the rear-facing touch sensor of the handheld device.
An action 1603 comprises navigating among the menu bands and menu items in response to rear touch sensor input. Action 1604 comprises selecting a particular one of the menu items in response to the user touching its corresponding position on the rear-facing touch sensor.
Note that in the embodiments described above, having a front-facing touch-sensitive display, some of the user interactions might be performed by touching the display itself at the desired menu location, as an alternative to touching the corresponding location on the rear touch panel. Some embodiments may allow the user to touch either the front displayed menu or the corresponding rear touch panel, at the user's discretion.
The handheld device 100 of
In many cases, the programs and logic of memory 1702 will be organized as an operating system (OS) 1703 and applications 1704. OS 1703 contains logic for basic device operation, while applications 1704 work in conjunction with OS 1703 to implement additional, higher-level functionality. Applications 1704 may in many embodiments be installed by device manufacturers, resellers, retailers, or end-users. In other embodiments, the OS and applications may be built into the device at manufacture.
Note that memory 1702 may include internal device memory as well as other memory that may be removable or installable. Internal memory may include different types of machine-readable media, such as electronic memory, flash memory, and/or magnetic memory, and may include both volatile and non-volatile memory. External memory may similarly be of different machine-readable types, including rotatable magnetic media, flash storage media, so-called “memory sticks,” external hard drives, network-accessible storage, etc. Both applications and operating systems may be distributed on such external memory and installed from there. Applications and operating systems may also be installed and/or updated from remote sources that are accessed using wireless means, such as WiFi, cellular telecommunications technology, and so forth.
Handheld device 100 also has a front-facing display 501 and a rear-facing touch panel 103, the characteristics of which are described above. OS 1703 interacts with front display 501 and rear touch panel 103 to implement the user interface behaviors and techniques described above. In many embodiments, handheld device 100 might have an application programming interface (API) 1705 that exposes the functionality of front display 501 and rear touch panel 103 to applications through high-level function calls, allowing third-party applications to utilize the described functionality without the need for interacting with device components at a low level. API 1705 may include function calls for performing the actions described with reference to
Similarly, API 1705 may allow application programs to register certain functions or actions, along with potential objects of those functions or actions, allowing the handheld device to include those functions and activities as menu items in appropriate contexts.
Note that various embodiments include programs, devices, and components that are configured or programmed to perform in accordance with the descriptions above, as well as computer-readable storage media containing programs or instructions for implementing the described functionality.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.
Further, it should be noted that the system configurations illustrated above are purely exemplary of systems in which the implementations may be provided, and the implementations are not limited to the particular hardware configurations illustrated. In the description, numerous details are set forth for purposes of explanation in order to provide a thorough understanding of the disclosure. However, it will be apparent to one skilled in the art that not all of these specific details are required.