The present disclosure relates generally to a menu for a communication device. More specifically, the present disclosure relates to a graphical context short menu for a mobile communication device.
With the advent of more robust wireless communications systems, compatible mobile communication devices are becoming more prevalent, as well as more advanced. Where in the past such mobile communication devices typically accommodated either voice transmission (cell phones) or text transmission (pagers and PDAs), today's consumer often demands a combination device capable of performing both types of transmission, including sending and receiving e-mail. Furthermore, these higher-performance devices can also be capable of sending and receiving other types of data, including data that allows the viewing and use of Internet websites. These higher-level functionalities necessarily require greater user interaction with the devices through included user interfaces (UIs), which may originally have been designed to accommodate making and receiving telephone calls and sending messages over a related Short Messaging Service (SMS). As might be expected, suppliers of such mobile communication devices and the related service providers are eager to meet these customer requirements, but the demands of these more advanced functionalities have in many circumstances rendered the traditional user interfaces unsatisfactory, forcing designers to improve the UIs through which users input information and control these sophisticated operations.
Most application programs are menu-driven as opposed to being command-driven. Menu-driven applications provide a list of possible action commands or options from which a user may choose, while command-driven applications require users to enter explicit commands. Thus, menu-driven applications are generally easier for the average user to learn than are command-driven applications. Menus are typically implemented as a list of textual or graphical choices (i.e., menu items) from which a user can choose. Thus, menus allow a user to select a menu item, for example, by pointing to the item with a mouse and then clicking on the item. Examples of other methods of selecting menu items include highlighting an item and then hitting the “return” key or “enter” key, and pressing directly on a menu item through a touch-sensitive screen.
One particularly useful type of menu is a hierarchical menu. Hierarchical menus typically present a parent menu that has selectable menu items. Selecting a parent menu item normally causes another menu, or submenu, to be displayed next to the currently displayed menu, with additional menu choices that are related to the selected parent menu item. The depth of a hierarchical menu can extend in this manner to many levels of submenus.
The conventional hierarchical menus generally lay out from left to right across a display screen as menu choices are selected. This menu format provides various advantages such as retaining previous and current menus on the display screen at the same time. This provides a historical menu map as menu selections are made and their corresponding submenus are displayed across the screen. Users can therefore review previous menu selections that have been made while progressing to the most recently displayed menu—thus making it easier to move between different menu items and menu levels.
Although such hierarchical menus provide useful advantages, there are scenarios in which their use is impracticable. One such scenario is when hierarchical menus are used on devices having small display screens. The problems presented when attempting to implement conventional hierarchical menus on small-screen devices have generally discouraged the use of hierarchical menus with such devices.
Hierarchical menus generally lay out across the display screen from left to right. On small-screen devices where the room on the screen is not wide enough to accommodate all of the menus, the menus often lay out across the screen in both directions, from left to right and back again. In this scenario, the menus typically begin to overlap one another, creating various problems. Overlapping menus can be confusing to the user. Overlapping menus can make it difficult for a user to discern previous menu selections which can, in turn, make it difficult to determine how to return to previous menus to make different menu selections. Thus, one of the intended benefits of a hierarchical menu can be undermined when the hierarchical menu is implemented on a small-screen device.
Overlapping menus can also be difficult to work with on small-screen devices (as well as others) that employ pen-based or stylus-based touch-sensitive screens. With such devices, it is often difficult to maintain contact continuity between menus on the screen when the menus are overlapping. In other words, it is easy to move off of menus with small-screen, touch-based devices. If continuity is lost when moving from one menu to another, menus will often disappear from the screen, causing the user to have to go back and reactivate the menu from a prior menu. This problem becomes worse when using pen-based devices that “track”. In the present context, the terminology of “tracking” is used to indicate a situation in which a cursor on the screen follows (tracks) the movement of the pen as the pen moves over the screen even though the pen is not touching the screen. Tracking is lost if the pen is pulled too far away from the screen. Thus, pen-based devices that “track” tend to lose more menus when hierarchical menus are employed.
One method of addressing this issue involves displaying submenus in place of a parent menu, and vice versa, when the appropriate menu items are selected from within the parent menus and submenus. Like a typical hierarchical menu, the depth of a hierarchical in-place menu can extend in this manner to many levels of submenus, such as second, third, fourth and fifth levels, with submenus being parent menus to other submenus. Parent menu items selected from within parent menus are displayed within submenus as links back to previous parent menus and are separated from that submenu's items by a divider. For example, a parent menu item "Launch App" appears in a parent menu with a forward pointer indicating that a submenu will replace that parent menu upon selection of "Launch App". In each of the submenus, "Launch App" then carries a backward-pointing arrow that facilitates going back to the previous menu in the hierarchy. However, each of the menus provides the full complement of available menu items. This can be overwhelming for a novice user and irritating to an experienced user, a problem exacerbated by adding the hierarchical history of parent menus to the list.
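A minimal sketch of such an in-place hierarchy, for illustration only, follows; the class, method, and menu names are hypothetical and are not drawn from the disclosure. The sketch shows a submenu replacing its parent in a single on-screen slot, with the item that opened the submenu rendered as a back link above a divider.

// Hypothetical sketch only: a submenu replaces its parent menu in the single
// on-screen slot, and the item that opened it becomes a "back" entry separated
// from the submenu's own items by a divider.
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

public class InPlaceMenu {
    record Menu(String title, List<String> items) {}

    private final Deque<Menu> history = new ArrayDeque<>(); // parents already replaced
    private Menu current;

    InPlaceMenu(Menu root) { current = root; }

    // Selecting an item with a forward pointer replaces the current menu in place.
    void open(String selectedItem, Menu submenu) {
        history.push(current);
        current = submenu;
        render(selectedItem);
    }

    // Selecting the back entry restores the previous parent menu.
    void back() {
        if (!history.isEmpty()) {
            current = history.pop();
            render(null);
        }
    }

    private void render(String backEntry) {
        System.out.println("[" + current.title() + "]");
        if (backEntry != null) {
            System.out.println("< " + backEntry);  // link back to the previous parent menu
            System.out.println("----------");      // divider
        }
        current.items().forEach(item -> System.out.println("  " + item + " >"));
    }

    public static void main(String[] args) {
        Menu parent = new Menu("Main", List.of("Launch App", "Settings"));
        Menu sub = new Menu("Launch App", List.of("Browser", "Messages"));
        InPlaceMenu ui = new InPlaceMenu(parent);
        ui.open("Launch App", sub);  // the submenu replaces the parent menu
        ui.back();                   // the parent menu is restored
    }
}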
Another approach is the use of short menus and full menus. A full or extended menu lists all available menu items at that particular level, and a short menu is a subset of the full menu. The short menu can be a dynamic menu in that a user selects menu items from the corresponding extended menu to be included in the short menu. However, navigating such menus can be difficult with the navigation tools of a mobile communication device, because the user has to select or highlight the desired menu option from a vertical list of menu options.
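For illustration only, a short menu maintained as a user-selected subset of a full or extended menu might be modeled as in the following sketch; the class name and the menu item labels are assumptions made for this example.

// Hypothetical sketch: a short menu kept as a user-chosen subset of the
// full (extended) menu available at a given level.
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class ShortMenu {
    private final List<String> fullMenu;          // every item available at this level
    private final Set<String> shortSelection = new LinkedHashSet<>();

    ShortMenu(List<String> fullMenu) { this.fullMenu = fullMenu; }

    // The user promotes an item from the extended menu into the short menu.
    void include(String item) {
        if (fullMenu.contains(item)) shortSelection.add(item);
    }

    List<String> shortItems() { return List.copyOf(shortSelection); }
    List<String> fullItems()  { return fullMenu; }

    public static void main(String[] args) {
        ShortMenu m = new ShortMenu(List.of("Call", "Email", "SMS", "Delete", "Copy", "Properties"));
        m.include("Call");
        m.include("Email");
        System.out.println("Short menu: " + m.shortItems()); // [Call, Email]
        System.out.println("Full menu:  " + m.fullItems());
    }
}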
Embodiments of the present application will now be described, by way of example only, with reference to the attached Figures, wherein:
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures and components have not been described in detail so as not to obscure the relevant feature being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein.
Referring to
As shown, the exemplary communication devices 100 are communicatively coupled to a wireless network 219 as exemplified in the block diagram of
Referring to
The auxiliary I/O subsystem 228 can take the form of a variety of different navigation tools 127 (multi-directional or single-directional) such as a trackpad navigation tool 221 as illustrated in the exemplary embodiment shown in
As may be appreciated from
As described above, the communication device 100 may include the auxiliary input 228 that acts as a cursor navigation tool and which can be also exteriorly located upon the front face 170 of the communication device 100. Its front face location allows the tool to be easily thumb-actuable like the keys of the keyboard 232. An embodiment provides the navigation tool 127 in the form of the trackpad 121, which can be utilized to instruct two-dimensional screen cursor movement in substantially any direction, as well as act as an actuator when the trackpad 121 is depressed like a button. The placement of the navigation tool 127 may be above the keyboard 232 and below the display screen 222; here, it can avoid interference during keyboarding and does not block the operator's view of the display screen 222 during use, e.g., as shown in
As illustrated in
Furthermore, the communication device 100 is equipped with components to enable operation of various programs, as shown in
When the communication device 100 is enabled for two-way communication within the wireless communication network 219, it can send and receive signals from a mobile communication service. Examples of communication systems enabled for two-way communication include, but are not limited to, the General Packet Radio Service (GPRS) network, the Universal Mobile Telecommunication Service (UMTS) network, the Enhanced Data for Global Evolution (EDGE) network, the Code Division Multiple Access (CDMA) network, High-Speed Packet Access (HSPA) networks, Universal Mobile Telecommunication Service Time Division Duplexing (UMTS-TDD), Ultra Mobile Broadband (UMB) networks, Worldwide Interoperability for Microwave Access (WiMAX), and other networks that can be used for data and voice, or just data or voice. For the systems listed above, the communication device 100 may require a unique identifier to enable the communication device 100 to transmit and receive signals from the communication network 219. Other systems may not require such identifying information. GPRS, UMTS, and EDGE use a smart card such as a Subscriber Identity Module (SIM) in order to allow communication with the communication network 219. Likewise, most CDMA systems use a Removable User Identity Module (RUIM) in order to communicate with the CDMA network. A smart card can be used in multiple different communication devices 100. The communication device 100 may be able to operate some features without a smart card, but it will not be able to communicate with the network 219. A smart card interface 244 located within the communication device 100 allows for removal or insertion of a smart card (not shown). The smart card features memory and holds key configurations 251, and other information 253 such as identification and subscriber-related information. With a properly enabled communication device 100, two-way communication between the communication device 100 and communication network 219 is possible.
If the communication device 100 is enabled as described above or the communication network 219 does not require such enablement, the two-way communication enabled communication device 100 is able to both transmit and receive information from the communication network 219. The transfer of communication can be from the communication device 100 or to the communication device 100. In order to communicate with the communication network 219, the communication device 100 in the presently described exemplary embodiment is equipped with an integral or internal antenna 218 for transmitting signals to the communication network 219. Likewise the communication device 100 in the presently described exemplary embodiment is equipped with another antenna 216 for receiving communication from the communication network 219. These antennae (216, 218) in another exemplary embodiment are combined into a single antenna (not shown). As one skilled in the art would appreciate, the antenna or antennae (216, 218) in another embodiment are externally mounted on the communication device 100.
When equipped for two-way communication, the communication device 100 features the communication subsystem 211. As is understood in the art, this communication subsystem 211 is modified so that it can support the operational needs of the communication device 100. The subsystem 211 includes a transmitter 214 and receiver 212 including the associated antenna or antennae (216, 218) as described above, local oscillators (LOs) 213, and a processing module 220 which in the presently described exemplary embodiment is a digital signal processor (DSP) 220.
It is contemplated that communication by the communication device 100 with the wireless network 219 can be any type of communication that both the wireless network 219 and communication device 100 are enabled to transmit, receive and process. In general, these can be classified as voice and data. Voice communication generally refers to communication in which signals for audible sounds are transmitted by the communication device 100 through the communication network 219. Data generally refers to all other types of communication that the communication device 100 is capable of performing within the constraints of the wireless network 219.
The keyboard 232 can include a plurality of keys that can be of a physical nature such as actuable buttons, or they can be of a software nature, typically constituted by virtual representations of physical keys on the display screen 222 (referred to herein as “virtual keys”). It is also contemplated that the user input can be provided as a combination of the two types of keys. Each key of the plurality of keys has at least one actuable action which can be the input of a character, a command or a function. In this context, “characters” are contemplated to exemplarily include alphabetic letters, language symbols, numbers, punctuation, insignias, icons, pictures, and even a blank space.
In the case of virtual keys, the indicia for the respective keys are shown on the display screen 222, which in one embodiment is enabled by touching the display screen 222, for example, with a stylus, finger, or other pointer, to generate the character or activate the indicated command or function. Some examples of display screens 222 capable of detecting a touch include resistive, capacitive, projected capacitive, infrared and surface acoustic wave (SAW) touchscreens.
Physical and virtual keys can be combined in many different ways as appreciated by those skilled in the art. In one embodiment, physical and virtual keys are combined such that the plurality of enabled keys for a particular program or feature of the communication device 100 is shown on the display screen 222 in the same configuration as the physical keys. Using this configuration, the operator can select the appropriate physical key corresponding to what is shown on the display screen 222. Thus, the desired character, command or function is obtained by depressing the physical key corresponding to the character, command or function displayed at a corresponding position on the display screen 222, rather than touching the display screen 222.
While the above description generally describes the systems and components associated with a mobile communication device, the communication device 100 could be another communication device such as a PDA, a laptop computer, a desktop computer, a server, or other communication device. In those embodiments, different components of the above system might be omitted in order to provide the desired communication device 100. Additionally, other components not described above may be required to allow the communication device 100 to function in a desired fashion. The above description provides only general components, and additional components may be required to enable the system to function. These systems and components would be appreciated by those of ordinary skill in the art.
Referring to
Referring to
Referring to
The graphical context short menu 500 can be a popup grid menu. The graphical context short menu 500 can be a dynamic menu that includes menu items from a full or extended menu; in other words, the graphical context short menu 500 can comprise menu items that are a subset of a full or extended menu. A full or extended menu can list all available menu items at that particular level and can be accessed by selecting the more menu item 502. The full or extended menu can be graphical or non-graphical.
The menu items for the graphical context short menu 500 can be designed in different ways. For example, each graphical context short menu 500 can include menu items that are predefined, that reflect programmer preferences, that are selected or built by the user, or that are the most commonly used or the user's most frequently used commands in the context. Context can be based on the application, the function selected, or the screen context. There are two types of context menus: disambiguation menus and contextual actions menus. A disambiguation menu is displayed to clarify what action should be taken when clicking on an item. For example, when a contact name is highlighted in an address book, the menu can clarify how the user would like to communicate with the contact, e.g., by email, phone, or SMS. A contextual actions menu provides more actions than the default action. For example, when a contact name is highlighted in an email message, the menu can default to the "reply" menu item but can also include other items such as phone or SMS.
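As a rough, non-limiting sketch of the distinction just described, the choice between a disambiguation menu and a contextual actions menu could be keyed to where the contact is highlighted; the context names and action lists below are illustrative assumptions only.

// Hypothetical sketch: choosing menu items by context. A contact highlighted in
// the address book yields a disambiguation menu (which way to communicate?),
// while the same contact highlighted in an email message yields contextual
// actions with "Reply" as the default.
import java.util.List;

public class ContextMenuChooser {
    enum Context { ADDRESS_BOOK, EMAIL_MESSAGE }

    static List<String> menuFor(Context ctx) {
        switch (ctx) {
            case ADDRESS_BOOK:  return List.of("Email", "Phone", "SMS");          // disambiguation
            case EMAIL_MESSAGE: return List.of("Reply", "Phone", "SMS", "More");  // contextual actions
            default:            return List.of("More");
        }
    }

    public static void main(String[] args) {
        System.out.println(menuFor(Context.ADDRESS_BOOK));
        System.out.println(menuFor(Context.EMAIL_MESSAGE));
    }
}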
The menu items can be positioned in the graphical context short menus 500 as consistently as possible to leverage muscle memory. For example, a default menu option 504 can be placed in the center of each menu 500 and the more menu item 502 can be placed in the bottom right of each menu 500. By including the more menu item 502 in each menu 500, there are no dead ends in the menus 500 because a means to access the full menu is always provided. The graphical context short menus 500 can provide the available actions for on-screen items, and a user can select a desired menu option using the navigational tool 127. The grid format can be visually appealing and can allow for easier navigation, since the selectable area for a menu option is larger than in a traditional list menu comprising text only. The menu options can also be selectable using a double-click action, e.g., clicking on a menu option once to highlight it and again to select it. In one or more embodiments, the default menu option 504 can be highlighted when the graphical context short menu 500 is displayed; in such embodiments, selecting the default menu option 504 can require only one click. As discussed below, the menu options can be selected using other selection means.
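A minimal sketch of such a grid placement follows; a 3x3 grid and the specific option labels are assumptions made for illustration, with the default option kept in the center cell and the "more" item kept in the bottom-right cell so their positions stay consistent across contexts.

// Hypothetical sketch of a popup grid menu: a fixed 3x3 grid with the default
// option in the center cell and the "More" item in the bottom-right cell.
public class GridShortMenu {
    private final String[][] grid = new String[3][3];

    GridShortMenu(String defaultOption, String[] others) {
        grid[1][1] = defaultOption;   // center: default option, highlighted on open
        grid[2][2] = "More...";       // bottom right: always leads to the full menu
        int k = 0;
        for (int r = 0; r < 3 && k < others.length; r++)
            for (int c = 0; c < 3 && k < others.length; c++)
                if (grid[r][c] == null) grid[r][c] = others[k++];
    }

    void render() {
        for (String[] row : grid) {
            for (String cell : row)
                System.out.printf("[%-10s]", cell == null ? "" : cell);
            System.out.println();
        }
    }

    public static void main(String[] args) {
        new GridShortMenu("Call", new String[] {"Email", "SMS", "Copy", "Delete"}).render();
    }
}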
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
At block 1302, a page can be displayed. For example, the page can be displayed on the display or display screen 222 of the mobile communication device 100. The page can include information associated with a contact. After displaying the page, the method 1300 can proceed to block 1304.
At block 1304, a menu request can be generated. For example, a user can select or highlight an object (e.g., an application, a message, a header, a contact or text) using the navigational tool 127. The microprocessor 238 or menu program can generate the menu request. After the menu request is generated, the method 1300 can proceed to block 1306.
At block 1306, the menu request can be received. For example, the microprocessor 238 or menu program can receive the menu request. After receiving the menu request, the method 1300 can proceed to block 1308.
At block 1308, a determination can be made whether a contact is associated with the displayed information. For example, the microprocessor 238 or menu program can determine if a contact is associated with the displayed information. If a contact is associated with the displayed information, the method 1300 can proceed to block 1310. If a contact is not associated with the displayed information, the method 1300 can proceed to block 1312.
At block 1310, a graphical context short menu is displayed including a contact icon. For example, the microprocessor 238 or menu program can display a graphical context short menu having the contact icon in the center of the grid as shown in
At block 1312, a graphical context short menu is displayed with a default option selected or highlighted. For example, the microprocessor 238 or menu program can display a graphical context short menu having a default option selected or highlighted in the center of the grid as shown in
At block 1314, a menu option is selected. For example, the user can use the navigational tool 127 to select a menu option. The microprocessor 238 or menu program can receive the selected menu option. Depending on the selected menu option, the method can proceed to another block in accordance with the selected menu option. For example, the method can proceed to block 1316, 1318, 1320, or 1322.
At block 1316, in the event the selected option is an unambiguous selection, another menu can be displayed. The menu can be graphical (shown in
At block 1318, in the event the more menu items option is selected, a full menu can be displayed. For example, if the more menu items option 702 in
At block 1320, in the event a menu item is selected, the selected menu item can be acted on. For example, if the call option 704 of
At block 1322, in the event the exit button 1252 is selected, the menu, e.g., a graphical context short menu or a full menu, can disappear. For example, the microprocessor 238 or menu program can remove the displayed menu.
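Read purely as an illustrative sketch, the decision points of blocks 1308 through 1322 of method 1300 might be expressed as follows; the class, handler, and option names are hypothetical, and only the branch structure mirrors the blocks described above.

// Hypothetical sketch of the branch structure of method 1300: show a grid menu
// with a contact icon or with the default option highlighted, then dispatch on
// the selected option (unambiguous selection, "More", a menu item, or exit).
public class Method1300 {
    enum Selection { UNAMBIGUOUS, MORE, MENU_ITEM, EXIT }

    static void handleMenuRequest(boolean contactAssociated, Selection selected) {
        if (contactAssociated) {                                          // block 1308
            System.out.println("Block 1310: show grid menu with contact icon in the center");
        } else {
            System.out.println("Block 1312: show grid menu with default option highlighted");
        }
        switch (selected) {                                               // block 1314
            case UNAMBIGUOUS -> System.out.println("Block 1316: display another menu");
            case MORE        -> System.out.println("Block 1318: display the full menu");
            case MENU_ITEM   -> System.out.println("Block 1320: act on the selected item, e.g. place a call");
            case EXIT        -> System.out.println("Block 1322: remove the displayed menu");
        }
    }

    public static void main(String[] args) {
        handleMenuRequest(true, Selection.MENU_ITEM);   // contact page, "Call" chosen
        handleMenuRequest(false, Selection.MORE);       // no contact, user asks for the full menu
    }
}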
The technology can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements. In one embodiment, the technology is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc. Furthermore, the technology can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For example, method 1300 can be a computer program product or can be program code on a computer-readable medium. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium (though propagation mediums in and of themselves as signal carriers are not included in the definition of physical computer-readable medium). Examples of a physical computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk read-only memory (CD-ROM), compact disk read/write (CD-R/W) and DVD. Both processors and program code for implementing each aspect of the technology can be centralized or distributed (or a combination thereof) as known to those skilled in the art.
A data processing system suitable for storing program code and for executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. Input/output or I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
Exemplary embodiments have been described hereinabove regarding the implementation of a graphical context short menu for a mobile communication device. Various modifications to and departures from the disclosed embodiments will occur to those having skill in the art. The subject matter that is intended to be within the spirit of this disclosure is set forth in the following claims.
This application claims the benefit of U.S. Provisional Application No. 61/304,773 filed on Feb. 15, 2010, which is incorporated herein by reference in its entirety.