Modern productivity applications enable users to perform a large number of commands on documents. For example, a word processor application can enable a user to manipulate the appearance of text, insert tables, insert footnotes, create tables of contents, add page numbers, review changes, and so on. In another example, a spreadsheet application can enable a user to select styles for cells, create and insert charts, set the layout for spreadsheet pages, and so on.
Traditionally, productivity applications have used menu systems to enable users to select and perform commands on documents. A menu system comprises a set of menus. Each of the menus contains one or more menu items. Selection of a menu item can cause a productivity application to perform a command on a document, open an interface that provides the user with more options, or perform some other action. Menu systems can be beneficial because menu systems frequently do not occupy large amounts of onscreen space. However, users can find it difficult to locate valuable commands because the menu items associated with those commands can be located somewhere deep within one of the menus. Moreover, it can take several clicks for a user to select the desired menu item.
In addition to menu systems, some productivity applications provide toolbars. A toolbar comprises a fixed set of selectable icons associated with commands. The icons can graphically suggest the effect of performing the commands associated with the icons. Selection of an icon can cause the productivity application to perform some command. Toolbars can be beneficial because the graphical icons can help users more easily understand the associated commands. Furthermore, toolbars can be beneficial because toolbars can remain onscreen and thus can be selected with a single click. However, it may be difficult for a user to determine which icons are associated with which commands. Labeling the icons with text can cause each icon to become so large that the toolbar occupies an unacceptable amount of onscreen space.
A ribbon-shaped user interface can include a set of toolbars placed on tabs in a tab bar. The tab bar can be rectangular in shape. Ribbon-shaped user interfaces can have the benefits of toolbars in that users can see and select graphical icons to perform commands. Furthermore, ribbon-shaped user interfaces can have some of the benefits of menu systems because not all of the icons are onscreen at once. As a result, a ribbon-shaped user interface can occupy less onscreen space than a toolbar containing the same number of icons.
However, it can still be challenging for users to learn how to use a ribbon-shaped user interface. This can especially be true for users accustomed to menu systems or menu systems accompanied by one or more static toolbars. Because users can find it difficult to learn how to use ribbon-shaped user interfaces, users may never learn how to use potentially valuable commands provided by productivity applications.
A computing device displays a user interface containing a ribbon-shaped user interface. The ribbon-shaped user interface contains multiple tabs. Each of the tabs contains multiple controls. Furthermore, the computing device displays a challenge and a tab visualization control in the user interface. The challenge instructs a user of the computing device to perform a task using the ribbon-shaped user interface. However, the challenge does not instruct the user how to perform the task. If the user does not know how to perform the task using the ribbon-shaped user interface, the user can select the tab visualization control. In response to receiving selection of the tab visualization control, the computing device displays a popup window in the user interface. The popup window initially contains an image of an initial portion of a given tab in the ribbon-shaped user interface. The image of the given tab is scrolled within the popup window such that a target control in the given tab is visible within the popup window. The user may need to use the target control to perform the task. Scrolling the image of the given tab within the popup window can help the user learn the location of the target control within the tab.
This summary is provided to introduce a selection of concepts. These concepts are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is this summary intended as an aid in determining the scope of the claimed subject matter.
The computing device 102 comprises an input device 104 and a display device 106. The input device 104 enables the computing device 102 to receive input from the user 100. For example, the input device 104 can be a mouse, a keyboard, a microphone, a touch screen, a keypad, or another type of device that enables the computing device 102 to receive input from the user 100. The display device 106 is a device that is capable of displaying a user interface to the user 100. For example, the display device 106 can be a computer monitor, a touch screen, a television screen, a projector, or another type of device that is capable of displaying a user interface to the user 100.
In the example of
The productivity application 108 displays a user interface to the user 100 via the display device 106. The user interface comprises a ribbon-shaped user interface. The ribbon-shaped user interface has multiple tabs. Each of the tabs comprises multiple user-selectable controls. Selection of the controls causes the productivity application 108 to perform commands. For example, selection of a given control can cause the productivity application 108 to apply a selected font to a block of text in a document. The user 100 is trying to learn how to use the ribbon-shaped user interface effectively.
Furthermore, a learning tool 110 runs on the computing device 102. The learning tool 110 can implement a method for teaching the user 100 to use the ribbon-shaped user interface. For example, the learning tool 110 can implement the RIBBON HERO™ tool for teaching users to use ribbon-shaped user interfaces. For ease of explanation, this patent document describes this functionality as being performed by the learning tool 110. However, the learning tool 110 can be related to the productivity application 108 in various ways. For example, the learning tool 110 can be part of the productivity application 108. In another example, the learning tool 110 can be a plug-in for the productivity application 108. In yet another example, the learning tool 110 can be an update pack for the productivity application 108.
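As an illustrative sketch of this relationship, the TypeScript interfaces below show one way a learning tool packaged as a plug-in could attach to a host application's user interface. The names (ProductivityHost, LearningPlugin, addPane, showPopup, onCommandExecuted) are hypothetical and do not describe any particular product's API.

    // Hypothetical host surface exposed by the productivity application 108.
    interface ProductivityHost {
      addPane(id: string, title: string): void;                       // e.g., the learning tool pane 310
      showPopup(imageUri: string): void;                              // e.g., the popup window 312
      onCommandExecuted(handler: (commandId: string) => void): void;  // notification of executed commands
    }

    // Hypothetical contract a plug-in packaging of the learning tool could implement.
    interface LearningPlugin {
      activate(host: ProductivityHost): void;
      deactivate(): void;
    }

    class LearningTool implements LearningPlugin {
      activate(host: ProductivityHost): void {
        host.addPane("learning-tool", "Challenges");
        // Watch executed commands so completed challenges can be detected.
        host.onCommandExecuted((commandId) => console.log(`command used: ${commandId}`));
      }
      deactivate(): void { /* release any resources held by the plug-in */ }
    }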
The learning tool 110 displays a series of challenges to the user 100 in the user interface of the productivity application 108. The challenges instruct the user 100 to use the ribbon-shaped user interface to perform various tasks. Because the user 100 is merely learning how to use the ribbon-shaped user interface, the learning tool 110 provides a hint button in the user interface. When the user 100 selects the hint button, the learning tool 110 displays one or more hints to help the user 100 use the ribbon-shaped user interface to complete the tasks specified by the challenges.
To further help the user 100 use the ribbon-shaped user interface to complete a task specified by a challenge, the learning tool 110 can display a tab visualization control in the user interface. When the user 100 provides input to select the tab visualization control, the learning tool 110 displays a popup window in the user interface. At least a horizontal onscreen dimension of the popup window is smaller than the horizontal onscreen dimension of the ribbon-shaped user interface. The popup window contains an image of a portion of a given tab in the ribbon-shaped user interface. The portion of the given tab can be displayed in the popup window at approximately the same size as the given tab is displayed in the ribbon-shaped user interface. The popup window only contains a portion of the given tab because the horizontal onscreen dimension of the popup window is smaller than the horizontal onscreen dimension of the ribbon-shaped user interface. Initially, the popup window shows a leftmost portion of the given tab.
The given tab contains a target control. Performance of a task specified by one of the challenges can involve selection of the target control. Because the target control may not initially be in the portion of the given tab displayed in the popup window, the learning tool 110 scrolls the image of the given tab within the popup window until the target control is displayed. Scrolling the image of the given tab within the popup window in this manner can help the user 100 locate the target control within the ribbon-shaped user interface.
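As a rough sketch of this scroll-to-reveal behavior, the following TypeScript function computes how far the image of a tab would need to be scrolled horizontally so that a target control becomes fully visible in a narrower popup window. The geometry model, the function name offsetToReveal, and the pixel values in the example are hypothetical and serve only to illustrate the viewport arithmetic.

    // Horizontal geometry of a control inside the tab image (pixels, hypothetical).
    interface ControlBounds { left: number; width: number; }

    // Returns the scroll offset (0 = leftmost portion of the tab shown) that brings
    // the target control fully into a popup viewport of the given width.
    function offsetToReveal(target: ControlBounds, popupWidth: number, tabWidth: number): number {
      const maxOffset = Math.max(0, tabWidth - popupWidth);
      // Offset needed so the control's right edge falls inside the viewport.
      const needed = Math.max(0, target.left + target.width - popupWidth);
      return Math.min(needed, maxOffset);
    }

    // Example: a 900 px wide tab viewed through a 400 px wide popup; the target
    // control starts at x = 620 and is 60 px wide.
    console.log(offsetToReveal({ left: 620, width: 60 }, 400, 900)); // 280

An offset of zero corresponds to the initial state in which the leftmost portion of the tab is shown.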
To further help the user 100 learn to select the target control when performing a task specified by a challenge, the learning tool 110 can draw attention to the target control within the popup window. In various embodiments, the learning tool 110 can draw attention to the target control in various ways. For example, the learning tool 110 can display a colored pulsing frame around the target control. In another example, the learning tool 110 can display a still frame around the target control. In yet another example, the learning tool 110 can magnify the target control within the popup window.
The computing device 102 and the server system 200 communicate via a network 202. The network 202 can be any of a variety of different types of networks. For example, the network 202 can be the Internet. In another example, the network 202 can be a local area network, a virtual local area network, a virtual private network, or another type of network.
In the example of
Furthermore, the user interface 300 also comprises a document pane 308. The document pane 308 displays at least a portion of a document. For instance, the document pane 308 can display at least a portion of a word processor document, a slideshow, a spreadsheet, or another type of document.
The user interface 300 also comprises a learning tool pane 310. The learning tool 110 can use the learning tool pane 310 to display hints for completing tasks specified by challenges.
The user interface 300 also contains a popup window 312. The learning tool 110 displays the popup window 312 in the user interface 300 to further help the user 100 learn how to use the ribbon-shaped user interface 302 to complete a task specified by a challenge. The user interface 300 does not initially contain the popup window 312. Rather, the learning tool 110 can display the popup window 312 in the user interface 300 in response to input from the user 100 at some time after the other parts of the user interface 300 are displayed.
It should be appreciated that the document pane 308 can include challenge text instructing the user 100 to use the ribbon-shaped user interface 302 to perform other tasks and can include content other than the table 402. For instance, if the productivity application 108 is a spreadsheet application, the document pane 308 can include challenge text that instructs the user 100 to use the ribbon-shaped user interface 302 to sort rows in a spreadsheet based on the values in a given column.
Furthermore, when the user 100 selects the hint control 500, the learning tool 110 causes the learning tool pane 310 to contain a popup control 600. In the example of
Furthermore, in some embodiments, the learning tool 110 can display the popup window 312 in response to input from the user 100 that does not involve the popup control 600. For example, the learning tool 110 can display the popup window 312 in response to the user 100 providing one or more keystrokes on a keyboard. In another example, the learning tool 110 can display the popup window 312 in response to a voice command or a gesture. In such embodiments, the user interface 300 may or may not contain the popup control 600.
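The sketch below illustrates how a keystroke trigger of this kind might be wired up in TypeScript; the handler name and the key combination Ctrl+Shift+H are arbitrary placeholders rather than documented behavior.

    // Hypothetical keyboard trigger for the popup window 312.
    type ShowPopup = () => void;

    function handleKeystroke(key: string, ctrl: boolean, shift: boolean, showPopup: ShowPopup): void {
      if (ctrl && shift && key.toLowerCase() === "h") {
        showPopup(); // same effect as selecting the popup control 600
      }
    }

    // Example: simulate the user pressing Ctrl+Shift+H.
    handleKeystroke("H", true, true, () => console.log("popup window 312 displayed"));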
The image of the tab 700 is not actually the tab in the ribbon-shaped user interface 302. For instance, the user 100 may not be able to select controls in the image of the tab 700. Rather, the learning tool 110 causes the popup window 312 to display the image of the tab 700 to guide the user 100 on how to find a control within an actual tab in the ribbon-shaped user interface 302.
Because the horizontal onscreen dimension of the popup window 312 is less than the horizontal onscreen dimension of the ribbon-shaped user interface 302, it may not be possible to display all of the controls in the tab 700 in the popup window 312 at one time. Rather, the popup window 312 is only able to display a portion of the controls in the tab 700. For example, the popup window 312 can initially display only the controls in a leftmost portion of the tab 700.
To show the target control 800 within the popup window 312, the learning tool 110 scrolls the image of the tab 700 within the popup window 312. For example, if the popup window 312 initially shows an image of a leftmost part of the tab 700, the learning tool 110 appears to move the controls in the tab 700 to the left over a period of time, thereby progressively exposing in the popup window 312 controls that are further right in the tab 700. The learning tool 110 stops scrolling the image of the tab 700 when the popup window 312 shows a control to which the learning tool 110 wants to attract the attention of the user 100 (e.g., the target control 800). Scrolling the image of the tab 700 in this way can help the user 100 learn where the target control 800 is located within the tab 700 by showing the user 100 where the target control 800 is located relative to the controls in the leftmost portion of the tab 700. In other words, by displaying the target control 800 at its location in the tab 700 relative to other controls in the tab 700, the user 100 may be better able to find the target control 800 when the user 100 next wants to perform the task specified by the challenge text 400.
As the learning tool 110 scrolls the image of the tab 700, the learning tool 110 does not move the label 702 for the tab 700 within the popup window 312. Rather, the label 702 remains at a fixed position within the popup window 312 while the image of the tab 700 scrolls within the popup window 312. Thus, the popup window 312 continues to display the label 702 even if the label for the tab 700 would not actually be displayed above the portion of the tab 700 shown in the popup window 312. By leaving the label 702 at the same position within the popup window 312, it may be easier for the user 100 to remember what tab is being displayed in the popup window 312.
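A minimal sketch of the scrolling animation described in the two preceding paragraphs is given below. It steps the scroll offset from zero to the offset that reveals the target control while handing the tab label, unchanged, to the renderer on every frame, so the label stays at a fixed position onscreen. The frame count, the placeholder label text, and the render callback are hypothetical.

    // Sketch of scrolling the tab image while the tab label 702 stays pinned.
    function animateScroll(
      targetOffset: number,
      frames: number,
      render: (offset: number, label: string) => void,
    ): void {
      for (let i = 0; i <= frames; i++) {
        const offset = (targetOffset * i) / frames; // linear interpolation of the offset
        // The label is passed unchanged each frame, so it never moves onscreen.
        render(offset, "Design");
      }
    }

    // Example: scroll 280 px over 5 frames, logging each frame.
    animateScroll(280, 5, (offset, label) =>
      console.log(`label "${label}" fixed; image scrolled ${offset.toFixed(0)} px`));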
In other embodiments, the learning tool 110 can initially display a rightmost portion of the image of the tab 700 and scroll the image of the tab 700 to display controls further left in the tab. Scrolling the image of the tab 700 to the right may have a more natural feel in some cultures.
Furthermore, in some embodiments, the learning tool 110 starts scrolling the image of the tab 700 within the popup window 312 automatically without receiving additional input from the user 100. In other embodiments, the learning tool 110 starts scrolling the image of the tab 700 within the popup window 312 in response to input from the user 100. Furthermore, in some embodiments, the learning tool 110 can scroll the image of the tab 700 back and forth within the popup window 312 in response to dragging input from the user 100.
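For the drag-driven variant, one plausible mapping from a horizontal drag gesture to a clamped scroll offset is sketched below; the function name and the pixel values are hypothetical.

    // Sketch of dragging the tab image back and forth within the popup window 312.
    // The offset is clamped so the image can never be dragged past either end.
    function dragScroll(currentOffset: number, dragDeltaX: number, popupWidth: number, tabWidth: number): number {
      const maxOffset = Math.max(0, tabWidth - popupWidth);
      // Dragging the content to the left (negative delta) exposes controls further right.
      const next = currentOffset - dragDeltaX;
      return Math.min(Math.max(next, 0), maxOffset);
    }

    // Example: from offset 100, a 150 px drag to the left moves the view to offset 250.
    console.log(dragScroll(100, -150, 400, 900)); // 250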
To further draw the attention of the user 100 to the target control 800, the learning tool 110 displays a frame 802 around the target control 800. The frame 802 is a screen element designed to attract the attention of the user 100 to the target control 800. The frame 802 can have a high contrast color relative to the color of the image of the tab 700. Furthermore, in some embodiments, the frame 802 can flash or pulsate to draw the attention of the user 100 to the target control 800.
In other embodiments, the learning tool 110 can use visual elements other than a frame to draw attention to the target control 800. For example, the learning tool 110 can display one or more arrows pointing to the target control 800. In another example, the learning tool 110 can magnify the target control 800 relative to other controls in the image of the tab 700.
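One plausible way to realize the pulsing effect is to oscillate the opacity of the frame over time, as sketched below. The one-second period and the opacity range are arbitrary illustrative choices; a still frame would simply hold a constant opacity instead.

    // Sketch of a pulsing highlight for the frame 802 around the target control 800.
    // Opacity oscillates between 0.3 and 1.0 with a one-second period (hypothetical values).
    function pulseOpacity(timeMs: number, periodMs: number = 1000): number {
      const phase = (2 * Math.PI * timeMs) / periodMs;
      return 0.65 + 0.35 * Math.sin(phase);
    }

    // A "still frame" variant simply holds a constant opacity.
    const stillOpacity = (): number => 1.0;

    console.log(pulseOpacity(0).toFixed(2));   // "0.65"
    console.log(pulseOpacity(250).toFixed(2)); // "1.00" at a quarter period
    console.log(stillOpacity().toFixed(2));    // "1.00"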
The user 100 may be unfamiliar with using the contextual menu. Consequently, the user 100 may not know how to find menu items in the contextual menu. Accordingly, the user 100 can select the hint control 500 in the learning tool pane 310. In this context, selecting the hint control 500 causes the step area 504 of the learning tool pane 310 to contain text describing steps for using the contextual menu to complete the task specified by the challenge text 400. Selecting the hint control 500 also causes the learning tool pane 310 to contain the popup control 600. Selection of the popup control 600 causes the learning tool 110 to display the popup window 312.
Because the user 100 needs to use the contextual menu to complete the task specified by the challenge text 400, the popup window 312 contains the contextual menu image 900, instead of the image of a tab in the ribbon-shaped user interface 302. The contextual menu image 900 is an image of the contextual menu, and not the contextual menu itself. The user 100 may not be able to select menu items in the contextual menu image 900.
The popup window 312 comprises a frame 902. The frame 902 is a screen element designed to attract the attention of the user 100 to a target menu item 904. The target menu item 904 is a control in the contextual menu that the user 100 can use in performing the task specified by the challenge text 400. By displaying the target menu item 904 at its location in the contextual menu relative to other menu items in the contextual menu, the user 100 may be better able to find the target menu item 904 when the user 100 next wants to perform the task.
Furthermore, in some embodiments, the contextual menu may be too long in the vertical dimension for all menu items in the contextual menu to be displayed concurrently within the popup window 312. In such embodiments, the learning tool 110 can initially display a top portion of the contextual menu image 900 in the popup window 312. The learning tool 110 can then scroll down the contextual menu image 900 until the target menu item 904 is displayed within the popup window 312. In this way, the popup window 312 displays the target menu item 904 at its location relative to other menu items in the contextual menu. As a result, the user 100 may later be better able to find the target menu item 904 in the contextual menu even if the target menu item 904 is low in the contextual menu.
It should be appreciated that
The term computer readable media as used herein may include computer storage media and communication media. As used in this document, a computer storage medium is a device or article of manufacture that stores data and/or computer-executable instructions. Computer storage media may include volatile and nonvolatile, removable and non-removable devices or articles of manufacture implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. By way of example, and not limitation, computer storage media may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), reduced latency DRAM, DDR2 SDRAM, DDR3 SDRAM, solid state memory, read-only memory (ROM), electrically-erasable programmable ROM, optical discs (e.g., CD-ROMs, DVDs, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), magnetic tapes, and other types of devices and/or articles of manufacture that store data. Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
In the example of
The processing system 1004 includes one or more processing units. A processing unit is a physical device or article of manufacture comprising one or more integrated circuits that selectively execute software instructions. In various embodiments, the processing system 1004 is implemented in various ways. For example, the processing system 1004 can be implemented as one or more processing cores. In another example, the processing system 1004 can comprise one or more separate microprocessors. In yet another example embodiment, the processing system 1004 can comprise an application-specific integrated circuit (ASIC) that provides specific functionality. In yet another example, the processing system 1004 provides specific functionality by using an ASIC and by executing computer-executable instructions.
The secondary storage device 1006 includes one or more computer storage media. The secondary storage device 1006 stores data and software instructions not directly accessible by the processing system 1004. In other words, the processing system 1004 performs an I/O operation to retrieve data and/or software instructions from the secondary storage device 1006. In various embodiments, the secondary storage device 1006 comprises various types of computer storage media. For example, the secondary storage device 1006 can comprise one or more magnetic disks, magnetic tape drives, optical discs, solid state memory devices, and/or other types of computer storage media.
The network interface card 1008 enables the computing device 1000 to send data to and receive data from a communication network. In different embodiments, the network interface card 1008 is implemented in different ways. For example, the network interface card 1008 can be implemented as an Ethernet interface, a token-ring network interface, a fiber optic network interface, a wireless network interface (e.g., WiFi, WiMax, etc.), or another type of network interface.
The video interface 1010 enables the computing device 1000 to output video information to the display unit 1012. The display unit 1012 can be various types of devices for displaying video information, such as a cathode-ray tube display, an LCD display panel, a plasma screen display panel, a touch-sensitive display panel, an LED screen, or a projector. The video interface 1010 can communicate with the display unit 1012 in various ways, such as via a Universal Serial Bus (USB) connector, a VGA connector, a digital visual interface (DVI) connector, an S-Video connector, a High-Definition Multimedia Interface (HDMI) interface, or a DisplayPort connector.
The external component interface 1014 enables the computing device 1000 to communicate with external devices. For example, the external component interface 1014 can be a USB interface, a FireWire interface, a serial port interface, a parallel port interface, a PS/2 interface, and/or another type of interface that enables the computing device 1000 to communicate with external devices. In various embodiments, the external component interface 1014 enables the computing device 1000 to communicate with various external components, such as external storage devices, input devices, speakers, modems, media player docks, other computing devices, scanners, digital cameras, and fingerprint readers.
The communications medium 1016 facilitates communication among the hardware components of the computing device 1000. In the example of
The memory 1002 stores various types of data and/or software instructions. For instance, in the example of
The various embodiments described above are provided by way of illustration only and should not be construed as limiting. Those skilled in the art will readily recognize various modifications and changes that may be made without following the example embodiments and applications illustrated and described herein. For example, the operations shown in the figures are merely examples. In various embodiments, similar operations can include more or fewer steps than those shown in the figures. Furthermore, in other embodiments, similar operations can include the steps of the operations shown in the figures in different orders.