Mobile electronic devices, such as personal digital assistants, contemporary mobile telephones, hand-held computers, tablet personal computers, laptop personal computers, “smart” phones, and the like are becoming popular user tools. These electronic devices can run a general purpose operating system, such as MICROSOFT WINDOWS® Mobile, and can have a rich set of functionalities including e-mail access, Internet capabilities, document editing, calendar functions, music players, and even games. Such features and capabilities have increased both the utility and complexity of mobile devices.
Mobile electronic devices tend to be small, lightweight and easily portable. Consequently, these mobile devices typically have limited display space. Providing access to the volume and variety of available information and services, therefore, tends to clutter the user interface, thereby inhibiting users from accessing features or entering, retrieving, and/or viewing data. Users can become frustrated when they are unable to locate the desired information or services and may be unable to fully exploit the advantages of the mobile device.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Navigating through content data includes receiving navigational input; cycling through categories to select a new category; and cycling through the pages associated with the selected category to select a new page. Categories and pages can be cycled regardless of which page is currently selected and displayed. According to aspects of the disclosure, the selected page fills a substantial portion of the display space.
Navigational instructions can be provided through touch-sensitive displays. Such displays can support two different communication styles: tap-based and gesture-based. Gesture-based navigation can be inverted to suit the style of a particular user.
Embodiments of the present disclosure are directed to a navigational user interface for displaying content data and services stored, e.g., on a mobile electronic device. Other aspects relate to navigating through the navigational user interface on the mobile device to access and display the content data and services. The described navigational user interface is applicable to mobile devices (e.g., handheld computing devices), but may be applied to other computing devices as appropriate.
Content data on mobile devices can take many forms including, but not limited to, contact information, calendar items, email messages, voicemail recordings, music, photos, documents, and tasks or actions. Mobile device services can include telephone services, email services, text messaging services, music play, and the like. The user interface generally arranges the content data onto content pages, which are organized under content categories (e.g., messages, contacts, tools, and settings). In one embodiment, each content category is associated with a software application appropriate for processing the content data (e.g., plug-in type applications). Within each content category, a user can traverse through content pages, which include full screen displays within which action items and/or content data are arranged.
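By way of illustration, the category/page organization described above lends itself to a simple tree-like model. The following sketch (in Python; the type and field names are illustrative and not part of the disclosure) shows one way the structure might be represented:

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class ActionItem:
        """Selectable content on a page (e.g., a button or hyperlink)."""
        label: str
        on_select: Callable[[], None]

    @dataclass
    class ContentPage:
        """A full-screen display of content data and action items."""
        title: str
        items: List[ActionItem] = field(default_factory=list)

    @dataclass
    class ContentCategory:
        """A category (e.g., messages, contacts) grouping related pages."""
        name: str
        pages: List[ContentPage] = field(default_factory=list)
        default_page: int = 0  # index of the page shown when the category is opened

Each category could then also carry a reference to the software application (e.g., a plug-in) that processes its content data.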
The term “action item” as used in this application encompasses any specific content on a content page with which a user may interact. Action items also may be referred to as selectable objects or elements. Upon navigation to an action item, a user may “select” that action item, thereby causing an action to occur. Examples of action items include, among others, hyperlinks, images, embedded content pages, and interface elements, such as, buttons, text boxes, drop-down menus, etc. Action items may lead to displays enabling a user to implement services, such as phone, email, or text messaging services.
A navigational user interface having features that are examples of inventive aspects in accordance with the principles of the present disclosure enables a user to navigate through content pages and content categories regardless of what content data is currently being displayed to the user. For example, the navigational user interface includes a category input and a page input. The category input cycles through the categories to enable a user to select a new category, thereby providing access to the pages organized under the newly selected category. The page input cycles through the pages organized under the selected category to sequentially display the pages. The category input and the page input each can be selected regardless of what page is currently displayed to the user.
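One minimal way to realize these two inputs, building on the ContentCategory sketch above and assuming the looped (wrap-around) page arrangement described later in this disclosure, is a small navigator object that tracks indices:

    class Navigator:
        """Tracks the selected category and page; cycling wraps at the ends."""

        def __init__(self, categories):
            self.categories = categories
            self.cat_index = 0
            self.page_index = categories[0].default_page

        def cycle_category(self, step=1):
            # Category input: usable regardless of which page is displayed.
            self.cat_index = (self.cat_index + step) % len(self.categories)
            # Selecting a new category shows its default page.
            self.page_index = self.categories[self.cat_index].default_page

        def cycle_page(self, step=1):
            # Page input: sequentially displays pages of the selected category.
            pages = self.categories[self.cat_index].pages
            self.page_index = (self.page_index + step) % len(pages)

        @property
        def current_page(self):
            return self.categories[self.cat_index].pages[self.page_index]
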
In certain embodiments, the navigational user interface receives input from a screen display. In such cases, the navigational user interface can provide touch support implemented with a tap system and/or a gesture system as will be described herein. By implementing a combination of both tap- and gesture-based systems, the navigational user interface improves the user experience by enabling gesture-based communication while still accommodating users who prefer tap-based communication.
In general, the content panel 120 extends over a substantial portion of the navigational user interface 100. For example, in different embodiments, the content panel 120 can extend over an area ranging from about 20% to about 100% of the navigational user interface 100. Preferably, the navigation bar 110 is arranged adjacent the content panel 120 and occupies a substantially smaller portion of the navigational user interface 100 than the content panel 120. For example, the navigation bar 110 can extend over a side of the user interface 100 as shown in FIG. 1.
Display elements 112 (e.g., icons) can be arranged within the navigation bar 110. Each display element 112 represents a category available to the user for selection. In some embodiments, display elements 112 of only a subset of the available categories are visible on the navigation bar 110 at any given time. As will be discussed in greater detail herein, the user can scroll (i.e., cycle) through the display elements 112 of the navigation bar 110 to view the remaining display elements 112. In some embodiments, the navigation bar 110 is differentiated from a background of the user interface 100. In other embodiments, however, only the display elements 112 arranged on the navigation bar 110 are distinguished from the background.
Example categories that can be represented on the navigation bar 110 include, but are not limited to, email, contacts, settings, and media. In some embodiments, one or more of the categories correspond with software applications (e.g., plug-in-type applications) providing access to and optionally enabling manipulation of specific types of content data. Examples of software applications include, inter alia, email programs, scheduling programs, PIM (personal information management) programs, word processing programs, spreadsheet programs, and Internet browser programs. For example, an email program can enable a user to create, send, and view email messages. A media editor may enable viewing, editing, and/or sorting of still images and/or videos.
Computing device 200 may typically include at least one processing unit 202 and system memory 204. Computing device 200 may also include a plurality of processing units that cooperate in executing programs. Depending on the exact configuration and type of computing device, the system memory 204 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two.
An operating system 205 suitable for controlling the operation of a networked personal computer, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash., is typically stored in system memory 204. The system memory 204 also may include configuration settings 226 and one or more software applications, such as program modules 206 and navigation processing application 222. The navigation processing application 222 obtains user input (e.g., via input device 212), ascertains a navigational instruction from the user input, and executes the navigational instruction to change which content data or services are presented to the user.
Computing device 200 may have output device(s) 214, such as a display (e.g., a screen), speakers, external printer, etc. The computing device 200 also may have input device(s) 212, such as a D-pad, jog-wheel, hardware buttons, soft keys, keyboard, pen, voice input device, touch input screen, external mouse, etc. These devices are well known in the art and need not be discussed at length here. In some embodiments, navigational instructions can be hardwired into certain input devices 212. In other embodiments, navigational instructions can be associated with input devices 212 via software. For example, a user can view and select navigation indicia 126 presented on the display 214 of the computing device 200.
The computing device 200 also may have additional features or functionality. For example, the computing device 200 may also include additional data storage devices (removable and/or non-removable). Such additional storage is illustrated in FIG. 2.
The computing device 200 also may contain communication connections 216 that allow the device to communicate with other computing devices 218, such as over a wireless network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 216 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Mobile device 300 is shown with many features. However, embodiments of the mobile device 300 may be implemented with fewer or additional components. The example mobile device 300 is shown in FIG. 3.
Mobile device 300 also can include navigational input features including a D-pad 335, a jog-wheel 343, a track ball 337, and an interactive (e.g., touch-sensitive) display 342. The display 342 also may provide soft key options. Touch-sensitive displays receive input through some form of tapping implement (e.g., stylus, finger, etc.) that is tapped and/or dragged across the display. Typically, the display 342 is a relatively small display screen. In addition, due to space and available power constraints, certain capabilities (e.g., resolution) of the display 342 may be more limited than those of a traditional large display. A navigational user interface displaying the available options or content data within such a display may become cluttered. Accordingly, a navigational user interface that facilitates access to options and content data, such as the navigational user interface 100, is advantageous.
Navigational input is organized into traversal sets of navigational directions: a category traversal set and a page traversal set.
For example, selection of one of the navigational directions in the category traversal set changes which category is currently accessed, regardless of what content data is currently being displayed. Selection of one of the navigational directions in the page traversal set sequentially traverses the content pages associated with the accessed category, regardless of what content data is currently displayed to the user. Typically, each traversal set includes two opposing navigational directions (e.g., up and down).
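As a sketch, the two traversal sets might be encoded as a lookup from input direction to a (traversal type, step) pair, dispatched to the navigator sketched above. The particular pairing of the vertical pair with categories and the horizontal pair with pages is an assumption for illustration, not mandated by the disclosure:

    from enum import Enum

    class Direction(Enum):
        UP = "up"
        DOWN = "down"
        LEFT = "left"
        RIGHT = "right"

    # Hypothetical assignment: each traversal set holds two opposing
    # directions; here the vertical pair traverses categories and the
    # horizontal pair traverses pages.
    TRAVERSAL_SETS = {
        Direction.UP:    ("category", +1),
        Direction.DOWN:  ("category", -1),
        Direction.RIGHT: ("page", +1),
        Direction.LEFT:  ("page", -1),
    }

    def navigate(navigator, direction):
        kind, step = TRAVERSAL_SETS[direction]
        if kind == "category":
            navigator.cycle_category(step)  # valid whatever page is displayed
        else:
            navigator.cycle_page(step)
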
As shown in FIG. 7, a navigation process 700 begins when navigational input indicating a traversal direction is received.
A traverse operation 706 navigates through the content data in accordance with the received traversal direction. For example, if the traversal direction indicates the categories should be incremented, then the traverse operation 706 accesses the next category in the sequence and displays the default page associated with that category. Alternatively, if the traversal direction indicates a content page should be decremented, then the traverse operation 706 accesses the prior content page in the sequence. A display operation 708 displays the content data to which the user navigated. The navigation process 700 completes and ends at a stop module 710.
The ascertain process 800 of FIG. 8 determines the navigational instruction, including the traversal direction, indicated by the received user input.
The traversal processes 900 and 1000 of FIGS. 9 and 10 implement, respectively, the category traversal and the page traversal described above.
For example, in FIG. 4, content data is organized into content pages arranged under a plurality of categories 410.
In certain embodiments, the categories 410 include software applications (e.g., plug-in type software applications) configured to process the content data organized under the respective categories. For example, the categories 410 may include an email program, a text messaging program, and a music player program. In such embodiments, navigating in one of the category traversal directions sequentially closes the current software application and loads and executes the next software application along with its respective content data.
For example, to access email capabilities, a user navigates to and opens an email application. A default content page associated with the email application is displayed to the user in the content panel. The user then navigates through content pages to obtain access to action items (e.g., a link to an internal phonebook or a message inbox) and/or content data (e.g., phonebook listings, email messages, etc.).
In one embodiment, the user can navigate to a content page associated with an email editor and select an action item on the page. Selection of the action item opens an email template within the content panel. The user may enter information, such as a textual message, recipient contact information, a subject line, and/or message attachments, into the email template within the content panel.
In another embodiment, the user navigates to the content page associated with the user's inbox and selects an action item on the page. Selection of the action item opens the inbox, which can contain messages organized into a series of one or more content pages. The user views different messages by navigating through the series of content pages by sequentially displaying each content page within the content panel.
Referring to FIG. 12, a navigational user interface 1200 includes a navigation bar 1210 on which display elements 1212 representing the available categories are arranged.
In some embodiments, only a subset of the available categories is visible on the navigation bar 1210 at any one time. For example, display icons 1212′ are not visible on the navigation bar 1210 in FIG. 12.
In certain embodiments, the navigational user interface 1200 can have an acceleration mode. When configured in such a mode, the navigational user interface 1200 enables a user to cycle quickly through the available categories and/or pages without rendering and displaying the content data associated with each sequential page. The content data is rendered and displayed only after a category and/or page is chosen.
A cycle operation 1308 traverses through the categories in the sequence of categories until a stop instruction is received at an obtain operation 1310. In one embodiment, the obtain operation 1310 receives an affirmative instruction to stop cycling (e.g., tapping an input key). In another embodiment, however, the obtain operation 1310 determines when the navigational instruction received by the receive operation 1304 is no longer received (e.g., lifting up on an input key after depressing the input key for an extended period of time).
The cycle operation 1308 does not access each category through which it cycles. Rather, an access operation 1312 accesses only the category that is eventually selected when the stop instruction is received. Access operation 1312 determines a default page associated with the selected category and displays the default page. The cycling process 1300 completes and ends at a stop module 1314.
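A sketch of this press-and-hold cycling, combined with the acceleration mode described above (intermediate categories are highlighted but never rendered), might look like the following. It reuses the navigator sketch above; key_is_down, highlight_icon, and render are hypothetical hooks into the device's input and drawing layers:

    import time

    def cycle_until_released(nav, key_is_down, highlight_icon, render,
                             step=1, interval=0.15):
        """Cycle categories while an input key is held; the stop instruction
        is the key release. Only the final selection is accessed/rendered."""
        while key_is_down():
            nav.cat_index = (nav.cat_index + step) % len(nav.categories)
            highlight_icon(nav.cat_index)   # distinguish the current icon only
            time.sleep(interval)            # pacing between cycle steps
        # Access the selected category and display its default page.
        nav.page_index = nav.categories[nav.cat_index].default_page
        render(nav.current_page)
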
A cycle operation 1406 scrolls through display elements 1212 in the navigation bar 1210 that represent the available categories. As the user cycles through the categories, a differentiate operation 1408 indicates the current category available for selection. For example, the differentiate operation 1408 can enlarge, color, outline, or otherwise distinguish a display element 1212 representing the current category. In the example shown in FIG. 12, the display element 1212 representing the current category is distinguished as the display elements 1212 scroll along the navigation bar 1210.
In another embodiment, however, the display elements 1212 remain in place with respect to the navigation bar 1210. Instead, the differentiate operation 1408 sequentially distinguishes the display elements 1212 while the categories are cycled. For example, in such an embodiment, decrementing categories on the interface 1500 shown in FIG. 15 sequentially distinguishes the display elements 1212 without scrolling them along the navigation bar 1210.
Touch-sensitive displays can support two different styles of communication with the navigational user interface: tap-based communication and gesture-based communication.
In tap-based communication, a user interacts with the navigational user interface by tapping on different sections of the touch screen. For example, FIG. 16 shows a navigational user interface 1600 in which tap areas are defined over a navigation bar 1610, navigational indicia 1626, and a content panel 1620.
To support tapping on a display element 1612 of the navigation bar 1610, an area 1617 over each display element 1612 is defined as a tap area. In general, when the interface framework identifies a tap anywhere within the tap area 1617, the interface framework will navigate immediately to the category (e.g., plug-in application) associated with the tap area 1617. Tap areas 1617 follow their corresponding display elements 1612 when the display elements 1612 scroll along the display screen.
Typically, tap areas 1617 extend beyond the corresponding display elements 1612 without overlapping one another. This extra area facilitates tapping by reducing the need for accuracy. In certain embodiments, the tap area 1617 is arranged in a square pattern centered on the display element 1612. In one embodiment, the tap area 1617 is less than or equal to about fifty-five pixels by fifty-five pixels.
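A hit test against such square tap areas is straightforward; the sketch below assumes each display element is described by its center point, and the 55-pixel figure is the upper bound mentioned above:

    def in_tap_area(tap_x, tap_y, center_x, center_y, size=55):
        """True if a tap falls inside the square tap area (at most about
        55 x 55 pixels) centered on a display element."""
        half = size / 2
        return abs(tap_x - center_x) <= half and abs(tap_y - center_y) <= half

    def hit_test(tap_x, tap_y, element_centers):
        # Tap areas do not overlap, so at most one element can match.
        for index, (cx, cy) in enumerate(element_centers):
            if in_tap_area(tap_x, tap_y, cx, cy):
                return index   # navigate to this element's category
        return None            # the tap missed every display element
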
To further provide tap-based communication, navigational indicia 1626 enable a user to navigate through content pages via a tap action. In general, the navigational indicia 1626 function to increment and decrement the content page displayed to the user. An area 1627 over each of the navigational indicia 1626 is defined as a tap area. When the interface framework identifies a tap anywhere within the tap area 1627 of one of the navigational indicia 1626, the interface framework will navigate immediately to the next or previous content page as appropriate.
When the content pages are arranged along a looped flow path, the navigation indicia are always visible (as there is always a page to navigate to in both directions). In other embodiments, the content pages are arranged in linear arrays. In such embodiments, increment and decrement indicia 1626 are shown as appropriate. In certain embodiments, the navigational indicia 1626 also can inform users of new and/or special events (e.g., through a glowing action). In one embodiment, if content has been updated, these indicia 1626 can flash to indicate a direction in which the user should navigate to reach the new content. For example, increment indicia 1626 can flash when a new email message is received to inform the user that navigating in an incremental direction will display the new email message.
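The visibility rule for the indicia can be summarized in a few lines; the looped flag distinguishes the looped flow path from a linear array:

    def indicia_visibility(page_index, page_count, looped):
        """Decide which navigation indicia to show. On a looped flow path
        both are always visible; on a linear array the ends are trimmed."""
        if looped:
            return {"increment": True, "decrement": True}
        return {
            "increment": page_index < page_count - 1,  # a next page exists
            "decrement": page_index > 0,               # a previous page exists
        }
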
The navigational user interface 1600 also is configured to enable tapping on the content panel 1620 and/or the action items 1624 within the content panel 1620. In some embodiments, tapping anywhere on the display of the navigational user interface 1600 except near the navigation bar 1610 or the navigational indicia 1626 selects the entire content panel 1620. In other embodiments, however, the user can separately select action items 1624 and/or content 1622 within the content panel 1620.
Referring to FIG. 17, the navigational user interface also can receive navigational instructions through gesture-based communication on the touch screen.
In general, gesture-based communication includes tapping motions and dragging motions. The navigational user interface processes movement of a tapping implement (e.g., a finger, a stylus, a light pen, etc.) to determine whether a gesture was made and to ascertain the navigational instruction indicated by the gesture. The navigational instruction indicates a type of navigation (i.e., category or page) and a navigational direction. The direction of navigation is generally based on the direction of the dragging motion.
For example, a first direction of drag can be associated with incremental category navigation and a second direction of drag can be associated with decremental category navigation. Typically, the first direction extends opposite the second direction. A third direction of drag can be associated with incremental page navigation and a fourth direction of drag can be associated with decremental page navigation.
Gesture-based communication implemented in accordance with the principles of the present disclosure can include at least two different types of gestures: a) basic navigation gestures; and b) navigation bar gestures. The former facilitates navigation through categories and content pages. The latter facilitates accelerated navigation between categories. Typically, basic navigation gestures are initiated by tapping anywhere on the display, except the navigation bar. Navigation bar gestures are initiated by tapping on the navigation bar.
For example, FIG. 17 shows a basic navigation gesture made on a navigational user interface 1700. In the example, the user taps the display at a tap location 1730 and then drags the tapping implement toward one of five navigation instruction zones defined relative to the tap location 1730.
Four of these zones 1750, 1760, 1770, 1780 are generally formed by splitting the display area of the navigational user interface 1700 into four areas centered on the original tap point 1730. In one embodiment, the zones 1750, 1760, 1770, 1780 are defined by four triangular areas of approximately equal area. In another embodiment, the area of each zone 1750, 1760, 1770, 1780 can differ from the other areas. A fifth zone 1740 includes an area overlaying and surrounding the tap location 1730. For example, the fifth zone 1740 can include a circular area extending outwardly from the tap location 1730. In one embodiment, the circular area of the fifth zone 1740 has about a twenty pixel radius.
In general, providing the “just tap” zone 1740 inhibits the accidental selection of a gesture-based navigational instruction by the user. The navigational user interface does not interpret movement of a tapping implement within the “just tap” zone 1740 as a “tap and drag” gesture. Advantageously, the “just tap” zone 1740, therefore, forgives (i.e., allows for) slight movement of the tapping implement during a tapping motion without misinterpreting the tap as a navigation gesture.
When the tapping implement moves along the display area from the tap location 1730 to a location outside of the “just tap” zone 1740, however, the navigational user interface interprets the movement as a “tap and drag” gesture and ascertains a navigational instruction from the gesture. Dragging the tapping implement outside the “just tap” zone 1740 in any direction, therefore, commits the user to the “tap and drag” gesture.
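The zone test reduces to a dead-zone radius check plus a dominant-axis comparison, since the diagonals through the tap point are exactly where |dx| equals |dy|. A sketch using the twenty-pixel radius mentioned above; the zone labels (and which instruction each zone maps to) are illustrative assumptions:

    def classify_drag(tap, pos, just_tap_radius=20):
        """Classify the implement position relative to the original tap
        point into the 'just tap' zone or one of four triangular zones."""
        dx = pos[0] - tap[0]
        dy = pos[1] - tap[1]
        if dx * dx + dy * dy <= just_tap_radius ** 2:
            return "just_tap"          # still within the forgiving dead zone
        if abs(dy) >= abs(dx):         # above/below the diagonals
            return "zone_up" if dy < 0 else "zone_down"   # screen y grows downward
        return "zone_left" if dx < 0 else "zone_right"
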
A drag module 1906 determines whether a drag motion is subsequently detected. The drag module 1906 defines a “just tap” zone, such as “just tap” zone 1740 of FIG. 17, around the location of the tapping motion. If the tapping implement is not dragged outside the “just tap” zone, then the motion is interpreted as a simple tap rather than a navigation gesture.
Alternatively, if the drag module 1906 determines the tapping implement is dragged from the location of the tapping motion to an area outside the “just tap” zone, then an ascertain operation 1910 divides the display area of the user interface into navigation instruction zones, such as the zones 1750, 1760, 1770, 1780 of FIG. 17, and ascertains a navigational instruction based on the zone entered by the tapping implement.
In one embodiment, the ascertain operation 1910 determines the zone first entered by the tapping implement after leaving the “just tap” zone. In another embodiment, the ascertain operation 1910 determines the zone last entered by the tapping implement after leaving the “just tap” zone and before the gesture is finalized. For example, the ascertain operation 1910 can determine the zone last entered by the tapping implement before the tapping implement is lifted from the touch screen. In other embodiments, the ascertain operation 1910 can apply other logical rules to determine the intended zone.
Typically, the ascertain operation 1910 will not process further movement of the tapping implement after the drag portion of the gesture is finalized until a new tapping motion is detected. As noted above, finalizing can include lifting a tapping implement from the touch screen. In other embodiments, however, a gesture is considered to be finalized when the ascertain operation 1910 ascertains a navigational instruction.
A navigate operation 1912 changes the display of the navigational user interface based on the ascertained navigation instruction. For example, the navigate operation 1912 can display the next content page if the tapping implement is dragged from the tapping location to an increment page zone. The determine process 1900 completes and ends at a stop module 1914.
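Using classify_drag from the sketch above, the “last entered zone” variant of the ascertain operation might process a stream of drag positions as follows. The zone_to_instruction mapping is a hypothetical table such as {"zone_up": ("page", +1), "zone_down": ("page", -1), ...}:

    def ascertain_instruction(tap, drag_positions, zone_to_instruction):
        """Track the zone under the implement until the gesture is finalized
        (here, the end of the position stream, i.e., the implement lifting)."""
        committed = False
        last_zone = None
        for pos in drag_positions:
            zone = classify_drag(tap, pos)
            if zone != "just_tap":
                committed = True   # leaving the dead zone commits the gesture
                last_zone = zone   # remember the most recently entered zone
        if not committed:
            return None            # interpret the motion as a plain tap
        return zone_to_instruction[last_zone]

The first-entered-zone embodiment would instead return as soon as the implement leaves the dead zone.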
Referring to FIG. 20, a navigation bar gesture (e.g., a “tap and hold” gesture) enables accelerated scrolling through the display elements 2012 arranged on a navigation bar 2010 of a navigational user interface 2000.
In certain embodiments, the navigational user interface 2000 will interpret the motion of a tapping implement as a navigation bar gesture only if the tapping implement first taps the touch screen within the navigation bar 2010 and then drags along the touch screen. If the initial tap occurs outside the navigation bar 2010, then the navigational user interface 2000 interprets the movement as a basic navigation gesture disclosed above. In other embodiments, the navigation bar gesture (i.e., the tap and the drag movements) must be performed completely within the confines of the navigation bar 2010.
A drag module 2106 determines whether a drag motion is subsequently detected. The drag module 2106 defines a “stall” zone, such as “stall” zone 2092 of FIG. 20, around the location of the tapping motion. When the tapping implement is dragged outside the “stall” zone, a locate operation 2108 is invoked.
The locate operation 2108 determines the direction in which the tapping implement is dragged. For example, as shown in FIG. 20, the display area can be divided into the stall zone 2092, an incremental scroll zone 2094, and a decremental scroll zone 2096.
A scroll operation 2110 cycles through the display elements 2012 in a direction based on the scroll zone 2092, 2094, 2096 entered by the tapping implement. For example, if the tapping implement is dragged over the incremental scroll zone 2094, then the scroll operation 2110 cycles through the display elements 2012 in a first direction. If the tapping implement is dragged over the decremental scroll zone 2096, however, then the scroll operation 2110 cycles through the display elements 2012 in a second, opposite direction. Advantageously, the scroll operation 2110 facilitates quick access to display elements 2012 that are not visible on the navigation bar 2010, thereby facilitating access to the represented categories.
In certain embodiments, the navigational user interface 2000 shows an animation of the display elements 2012 scrolling in the appropriate direction during the scroll operation 2110. Showing the animation enables a user to access display elements 2012 that initially are not visible in the navigation bar 2010. In other embodiments, however, the navigational user interface 2000 does not display an animation. Rather, the display elements 2012 remain in place and are sequentially modified (e.g., enlarged) to indicate when each is selected during the cycle. In one embodiment, no content data is displayed in the content panel 2020 while scrolling through the categories.
A finalized module 2112 determines whether the “tap and hold” gesture has been completed. In one embodiment, the gesture can be finalized by lifting the tapping implement off the touch screen. In another embodiment, dragging the tapping implement over the stall zone 2092 for a predetermined period of time indicates completion of the gesture. For example, in one embodiment dragging the tapping implement over the stall zone 2092 for 300 ms may indicate completion of the gesture. In other embodiments, however, dragging the tapping implement over the stall zone 2092 does not, by itself, indicate completion of the gesture.
If the gesture is not finalized, then the scrolling process 2100 returns to the drag module 2106. By looping back in this manner, the accelerated scrolling process 2100 enables the user to alternately scroll in opposite directions without initiating a new gesture. For example, if a user taps on the navigation bar 2010 portion of the touch screen with a tapping implement and drags the tapping implement into the incremental scroll zone 2094, then the display elements 2012 begin cycling continuously in a first direction. To pause the scrolling of the display elements 2012, the user can drag the tapping implement back to the stall zone 2092. To reverse the direction in which the display elements 2012 cycle, the user drags the tapping element into the decremental scroll zone 2096. The user can repeatedly change scrolling directions through dragging motions until finalizing the gesture.
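The loop behavior described above (scroll while in a scroll zone, pause in the stall zone, reverse freely, finalize on lift) suggests a small per-event state machine. A sketch, reusing the navigator from above; each entry in events is assumed to be ("move", (x, y)) or ("up", None), and the zone predicates are supplied by the interface layout:

    def nav_bar_scroll(nav, events, in_stall_zone, in_increment_zone):
        """Accelerated 'tap and hold' scrolling. For simplicity this sketch
        advances one display element per move event; a real implementation
        would pace the cycling on a timer."""
        direction = 0
        for kind, pos in events:
            if kind == "up":
                break              # lifting the implement finalizes the gesture
            if in_stall_zone(pos):
                direction = 0      # pause; the user may reverse from here
            elif in_increment_zone(pos):
                direction = +1
            else:                  # otherwise the decremental scroll zone
                direction = -1
            if direction:
                nav.cat_index = (nav.cat_index + direction) % len(nav.categories)
        # Access only the finally selected category; show its default page.
        nav.page_index = nav.categories[nav.cat_index].default_page
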
When the gesture is finalized, the scrolling process 2100 proceeds to an ascertain operation 2114. The ascertain operation 2114 determines which category is represented by the selected display element 2012 when the scrolling process 2100 stops cycling through the categories. A navigate operation 2116 accesses the category represented by the selected display element 2012 and renders a default content page associated with the category. The scrolling process 2100 completes and ends at a stop module 2118.
Gesture-based navigation also can be inverted to suit the style of a particular user. Some users intuitively associate a given drag direction with incremental navigation, whereas other users associate the same direction with decremental navigation.
To accommodate both types of users, a navigational user interface may allow a user to select a preference of which directions (e.g., up, down, left, right) to associate with incremental navigation and which directions to associate with decremental navigation. The user's preference is typically stored in system memory with other types of configuration data (e.g., the configuration settings 226 of FIG. 2).
For example, a first user may associate an upward drag with incrementing the displayed category, while a second user may invert the mapping so that the same upward drag decrements the displayed category. In either case, the navigational user interface applies the stored preference when ascertaining the navigational instruction indicated by a gesture.
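A stored inversion preference can then be applied at the moment the gesture's step is ascertained; the flag names below are illustrative, not part of the disclosure:

    # Hypothetical per-user settings, persisted with other configuration
    # data (cf. configuration settings 226).
    preferences = {"invert_category": False, "invert_page": True}

    def apply_inversion(kind, step, prefs):
        """Flip the ascertained step if the user chose inverted directions
        for this traversal type ('category' or 'page')."""
        return -step if prefs.get(f"invert_{kind}") else step

For a user with invert_page enabled, a drag that would normally increment the content page instead decrements it, without any change to the zone or direction detection itself.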
Embodiments of the disclosure described above may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product also may be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.
These computer processes can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations of devices of the type described in this document. Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be with only a machine that performs a portion of the program.
While the embodiments have been described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules. Further, while specific file formats and software or hardware modules are described, a system according to embodiments of the present disclosure is not limited to the definitions and examples described above. Displaying and manipulating data may be performed using other file formats, modules, and techniques.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.