In the field of computing, windowing environments have been used to provide applications with windows. Applications or programs executing on a computing device may have corresponding application windows through which a user interacts with the applications. In addition, it has been known to concurrently display windows on multiple displays locally connected to one computing device. Typically, an operating system of a multi-display computing device handles details for managing multiple displays and may provide different display modes such as display mirroring or display concatenation.
Usually, windowing systems or environments include system-provided user interface elements that a user can interact with to control and manage windows. For example, OS X™ has an “app launcher” tool, the Android™ operating system provides a default “Launcher” that is used to start applications and access system settings, and various versions of Microsoft Windows™ have provided a “Start” element, fast-switch lists, and other elements. In addition, there have been many third-party applications that have provided similar functionality.
To date, such user interface elements for controlling applications have been unable to work efficiently and intuitively in the presence of multiple displays connected to a same device. Add-on user interface programs for application management often lack logic, perhaps at the kernel level, that might be helpful for smooth and consistent use across multiple displays. System-provided application managers such as those mentioned above have not been designed for a multi-display user experience and therefore fall short of providing the behavior a user might expect when using multiple displays.
Techniques related to providing application management user interface elements for computing devices with multiple displays are discussed below.
The following summary is included only to introduce some concepts discussed in the Detailed Description below. This summary is not comprehensive and is not intended to delineate the scope of the claimed subject matter, which is set forth by the claims presented at the end.
Described herein are techniques for a computing device executing a windowing system that automatically maintains a tiled arrangement of application windows on a first display and on a second display. A user interface element has indicia of applications, and the indicia can be used to open the corresponding applications. Responsive to a first user input, the user interface element is displayed on the first display, and while the user interface element is displayed on the first display, the windowing system maintains two or more of the application windows in a tiled arrangement on the second display. The user interface element may be part of a system user interface and may be implemented in a variety of ways. For example, it may be a full-screen set of application representations, possibly user-selected, a list of recently used applications, a list of open applications, etc.
Many of the attendant features will be explained below with reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein like reference numerals are used to designate like parts in the accompanying description.
Embodiments described below relate to providing application management user interface elements for computing devices with multiple displays. Discussion will begin with an overview of windowing systems for multiple displays. Tiled or non-occluding windowing systems will be discussed next, followed by user interface elements for application window management and manipulation, and finally an explanation of how such user interface elements can be integrated into a multi-display setting.
The computing device 100 may be any of a variety of types which are described later with reference to
A windowing system 110 may be partly integrated with or closely coupled with the operating system 102. For purposes herein, distinctions between these components are not significant; an operating system itself may be considered to be a windowing system. The windowing system 110 may have functionality that is known in the computing arts, such as handling of input events (e.g., touches/strokes, clicks, keypresses, mouse drags, etc.) inputted by a user with various types of input devices. The windowing system 110 also manages the application windows 106 by handling related events or user interactions such as closing windows, moving windows, resizing windows, directing input to windows, rendering multiple concurrent windows, and others. The windowing system 110 may also provide a background and system controls (user interface elements) not specific to an application, which will be addressed further below.
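For purposes of illustration only, the window-management responsibilities described above might be sketched as follows in TypeScript; the names WindowingSystem, AppWindow, and UserInputEvent are hypothetical and merely exemplary, not a definitive implementation:

    // Hypothetical sketch of the window-management responsibilities described above.
    interface AppWindow {
      id: string;
      appId: string;
      bounds: { x: number; y: number; width: number; height: number };
    }

    type UserInputEvent =
      { kind: "pointer"; x: number; y: number; action: "click" | "drag" } |
      { kind: "key"; key: string };

    class WindowingSystem {
      private windows: AppWindow[] = [];

      openWindow(appId: string, bounds: AppWindow["bounds"]): AppWindow {
        const win = { id: appId + "-" + this.windows.length, appId, bounds };
        this.windows.push(win);
        return win;
      }

      closeWindow(id: string): void {
        this.windows = this.windows.filter(w => w.id !== id);
      }

      // Direct a pointer event to whichever window contains the input point.
      dispatch(event: UserInputEvent): AppWindow | undefined {
        if (event.kind !== "pointer") return undefined;
        return this.windows.find(w =>
          event.x >= w.bounds.x && event.x < w.bounds.x + w.bounds.width &&
          event.y >= w.bounds.y && event.y < w.bounds.y + w.bounds.height);
      }
    }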
Some tiled windowing systems may omit from windows traditional window elements such as borders, title bars, menu bars, and other elements. That is to say, application windows may have immersive qualities, for instance they may have minimal or no window adornments and may have an appearance typically associated with the “full screen” mode of many software applications. Application windows may have such appearance even in cases where multiple application windows are displayed on a same display.
Returning to
The second example layout 126 reflects the addition of a second application window 128, whether automatically or interactively inserted; the tiled windowing system automatically manages the window layout to cause the currently displayed application windows 122, 128 to somewhat maximize occupancy of the display 108. The tiled windowing system (window manager) may insert between windows a divider 130 that can be moved by a user to resize the application windows adjacent to the divider 130 while maintaining a tiled arrangement.
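The divider behavior might, purely for illustration, be modeled as a constrained resize of only the two adjacent windows; the TypeScript sketch below uses hypothetical names (Tile, moveDivider) and an assumed minimum window width:

    // Hypothetical sketch: dragging a divider resizes only the two adjacent
    // windows in a single-row tiled arrangement (widths in pixels).
    interface Tile { appWindowId: string; width: number }

    function moveDivider(tiles: Tile[], dividerIndex: number, deltaX: number, minWidth = 200): Tile[] {
      const left = tiles[dividerIndex];
      const right = tiles[dividerIndex + 1];
      // Clamp the drag so neither adjacent window shrinks below a minimum width.
      const clamped = Math.max(minWidth - left.width, Math.min(deltaX, right.width - minWidth));
      const next = tiles.slice();
      next[dividerIndex] = { ...left, width: left.width + clamped };
      next[dividerIndex + 1] = { ...right, width: right.width - clamped };
      return next;  // the total width is unchanged, so the tiled arrangement is preserved
    }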
The third example layout 132 shows a third application window 134 having been inserted. With tiled layout management, a user may only need to designate the third application window 134 to be inserted and/or possibly designate a slot or location for inserting the third application window 134; the tiled window manager may automatically resize the displayed application windows or take other measures to accommodate the new application window.
The fourth and fifth example layouts 136, 138 show other divisions of screen real estate that may be used. For ease of discussion, the examples discussed below show tiling using only a single horizontal row of application windows; however, any rectilinear arrangement may be used, possibly with asymmetries.
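One simple way a tiled window manager might apportion a single horizontal row when windows are inserted or removed is to give each displayed window an equal share of the display width. The following sketch is merely illustrative; layoutRow and the pixel values are assumptions, not a prescribed implementation:

    // Hypothetical sketch: recompute a single-row tiled layout so the displayed
    // windows share the display width whenever a window is inserted or removed.
    interface LayoutSlot { appWindowId: string; x: number; width: number }

    function layoutRow(appWindowIds: string[], displayWidth: number): LayoutSlot[] {
      const width = Math.floor(displayWidth / appWindowIds.length);
      return appWindowIds.map((id, i) => ({ appWindowId: id, x: i * width, width }));
    }

    // Inserting a third window simply re-runs the layout over the new window set:
    const twoUp = layoutRow(["mail", "browser"], 1920);
    const threeUp = layoutRow(["mail", "notes", "browser"], 1920);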
The tiled windowing system 140 may have various features or facilities that allow a user to manage applications on the computing device. Such features, which are sometimes referred to herein as “user interface elements” or “system elements”, might include a recent applications module 182, an active applications module 184, and/or a favorites module 186.
These modules cooperate with the windowing system (or are a part thereof) to track the semantically relevant information. When applications are opened or used the windowing system might populate a recent-applications list 188 with indicia of recently used applications. The windowing system might similarly populate or provide an active-applications list 190, which might include applications currently displayed on any connected monitors and/or applications that are executing or suspended but are not currently displayed. Similarly, a user might maintain a favorite-applications list 192.
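For purposes of illustration, the three lists might be maintained roughly as in the following TypeScript sketch; the class and method names are hypothetical:

    // Hypothetical sketch of the recent-, active-, and favorite-applications lists.
    class ApplicationTracker {
      private recent: string[] = [];       // most recently used first
      private active = new Set<string>();  // open applications, displayed or not
      private favorites: string[] = [];    // maintained by the user

      noteOpened(appId: string): void {
        this.active.add(appId);
        this.recent = [appId, ...this.recent.filter(id => id !== appId)].slice(0, 20);
      }

      noteClosed(appId: string): void {
        this.active.delete(appId);
      }

      addFavorite(appId: string): void {
        if (!this.favorites.includes(appId)) this.favorites.push(appId);
      }

      recentApplications(): string[] { return [...this.recent]; }
      activeApplications(): string[] { return [...this.active]; }
      favoriteApplications(): string[] { return [...this.favorites]; }
    }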
These lists are used by the graphical user interface 180 to display corresponding user interface elements 194, 196, 198 that can be invoked and used by the user to activate applications or application groups, as the case may be. In some embodiments, the user interface elements 194, 196, 198 may be persistently displayed, and in other embodiments they are displayed only when activated by a user input such as after pressing and releasing a hardware or software button, or while a hot key is depressed, or after inputting a touch gesture, etc. Some of the user interface elements 194, 196, 198 may be undisplayed when they are used to open an application, or when display-sustaining input ends, or when a user invokes a close command.
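The display and dismissal behavior just described amounts to a small show/hide state machine. A minimal sketch follows, assuming the trigger names shown; it is illustrative only:

    // Hypothetical sketch: show/hide logic for a non-persistent user interface element.
    type ElementTrigger =
      "buttonRelease" | "gesture" | "hotKeyDown" | "hotKeyUp" | "appOpened" | "closeCommand";

    class SystemElementController {
      visible = false;
      private sustainedByHotKey = false;

      handle(trigger: ElementTrigger): void {
        switch (trigger) {
          case "buttonRelease":
          case "gesture":
            this.visible = true;                 // shown after an explicit invocation
            break;
          case "hotKeyDown":
            this.visible = true;                 // shown only while the key is held
            this.sustainedByHotKey = true;
            break;
          case "hotKeyUp":
            if (this.sustainedByHotKey) {        // display-sustaining input ended
              this.visible = false;
              this.sustainedByHotKey = false;
            }
            break;
          case "appOpened":                      // element was used to open an application
          case "closeCommand":                   // user invoked a close command
            this.visible = false;
            break;
        }
      }
    }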
The system user interface elements 194, 196, 198 in
In one embodiment, an application representation 222 may display dynamically updated content received, for example, from other applications, system services, or from network-based resources. Such live updating may occur even when an application representation's application is not open. The app launcher may be implemented as a scrollable surface, and may also include dashboard-like features such as a clock, a logout mechanism, network status information, system or application notifications, and so forth. At times, as discussed below, the application launcher 220 is not displayed until requested by a user.
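Live updating of this kind might, purely as an illustration, be modeled as a system service or network listener pushing content to a representation on behalf of an application that is not running; the names in the sketch below are hypothetical:

    // Hypothetical sketch: an application representation that renders pushed
    // content even when its application is not running.
    interface RepresentationContent { title: string; body: string; receivedAt: Date }

    class ApplicationRepresentation {
      private latest?: RepresentationContent;
      constructor(readonly appId: string) {}

      // Called by a system service or network listener, not by the application itself.
      pushContent(content: RepresentationContent): void {
        this.latest = content;
      }

      render(): string {
        return this.latest
          ? `${this.appId}: ${this.latest.title} (${this.latest.body})`
          : this.appId;  // fall back to a static label or symbol
      }
    }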
Application representations 222 may be interactively rearranged, removed, added, perhaps resized, configured with settings, updated by applications with dynamic content, etc. Application representations 222 may also be activated or selected by a user to open a corresponding application window. In some embodiments, the application launcher is undisplayed when a graphic application representation 222 is actuated to open an application window; the opened application window may supplant the application launcher on the display where the application launcher was used.
The user interface element 194 includes graphic application representations 222 of corresponding applications. An application representation may be displayed as a thumbnail image of the corresponding application (live or previously captured) or a graphic symbol representing the application. The application representation 222 may be interactively selected, for instance by a click or touch, or dragged from the user interface element 194 by the user. When the application representation 222 is activated or released from a drag, the corresponding application window is opened. Various visual effects may be used. For instance, a rendering of the prior or emerging layout may be animated as enlarging to occupy the display before switching to live activation of the windows. In some embodiments, the application representation 222 may represent a group of applications and can be used to open those applications all at once. Note that a pointer 224 may or may not be displayed, and in this description the pointer 224 may represent not a graphic pointer but rather an input point moved or entered by a user.
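For illustration, actuating a representation (or a group representation) might be handled roughly as in the following sketch, which assumes, per the behavior described above, that the opened windows join the tiled arrangement on the display where the launcher was shown and that the launcher is then undisplayed:

    // Hypothetical sketch: activating a representation opens its application
    // window(s) and supplants the launcher on the display where it was used.
    interface DisplayState { id: string; tiledAppIds: string[]; launcherVisible: boolean }

    function activateRepresentation(display: DisplayState, appIds: string[]): DisplayState {
      return {
        ...display,
        tiledAppIds: [...display.tiledAppIds, ...appIds],  // a group opens all of its applications at once
        launcherVisible: false,                            // the launcher is undisplayed
      };
    }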
Regarding the multi-display behavior of the user interface element 194, the user interface element may, in one embodiment, be opened by the user on either the first display 108A or on the second display 108B, and optionally might not be displayable on both displays simultaneously, as shown in the four sequential stages of
Sequentially, the user interface element 194 is initially not displayed but is available to be activated by the user on either the first display 108A or the second display 108B. As shown in the first quarter of
When the user activates the user interface element 194 on the second display 108B, as shown in the second quarter of
As can be seen from the discussion above, the content of the recent-applications list 188 and the appearance of the user interface element 194 are consistent across multiple displays, regardless of which display the user interface element 194 is displayed on or which display an application window is opened to. In addition, activity limited to one display does not affect what is displayed on any other display, although, if dormant, the user interface element 194 will reflect such activity when later displayed on the other display.
When a user input is received and directed to a particular target display, which at any given time could be either display, the windowing system displays the application launcher 220 on that target display, as shown in the middle portion of
When the application launcher 220 is displayed on one arbitrary display, user input directed to the other display, for instance interacting with an application window, does not cause the application launcher 220 to be undisplayed; the application launcher 220 remains available to be used by the user, possibly displaying live information in dashboard fashion or providing other system functionality. If the application launcher 220 is displayed on a first display and launcher-invoking input is directed to a second display, then the application launcher 220 is removed from the first display and is displayed on the second display.
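This behavior can be summarized as follows: launcher-invoking input relocates the launcher to the target display, while ordinary application input directed to the other display leaves the launcher where it is. A minimal sketch with hypothetical names:

    // Hypothetical sketch: the application launcher is shown on at most one
    // display at a time.
    interface MultiDisplayState { launcherOn: string | null }  // display id, or null if hidden

    function onLauncherInvoked(state: MultiDisplayState, targetDisplayId: string): MultiDisplayState {
      // Invoking the launcher on any display moves it there, removing it from
      // whichever display it previously occupied.
      return { launcherOn: targetDisplayId };
    }

    function onApplicationInput(state: MultiDisplayState, _targetDisplayId: string): MultiDisplayState {
      // Interacting with an application window on another display does not
      // dismiss the launcher; it remains where it is.
      return state;
    }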
As shown in the middle portion of
At times one or more of multiple displays may be disconnected or become inoperable. An application capture feature may be implemented to respond to the loss of a display by capturing indicia of the application windows that were displayed on that display. If a disconnected display is reconnected within a predetermined period of time (e.g., five minutes), then the application windows are automatically displayed on that display again to reproduce its appearance before it was disconnected. Note that if all displays are disconnected, this timer might not apply. That is to say, there may be scenarios where there are no available displays, such as when display drivers are updating, when a connection is made to a remote machine, or when some system failures occur. In such cases, all screens can be restored.
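A rough, merely illustrative sketch of such an application capture feature follows; the five-minute grace period and the names used are assumptions:

    // Hypothetical sketch: capture indicia of application windows when a display
    // is lost and restore them if the display reconnects within a grace period.
    const GRACE_MS = 5 * 60 * 1000;  // assumed five-minute period

    interface CapturedDisplay { appWindowIds: string[]; capturedAt: number }

    class DisplayCapture {
      private captured = new Map<string, CapturedDisplay>();

      onDisplayLost(displayId: string, appWindowIds: string[]): void {
        this.captured.set(displayId, { appWindowIds, capturedAt: Date.now() });
      }

      // Returns the windows to redisplay on the reconnected display, or an empty
      // list if the grace period has lapsed.
      onDisplayReconnected(displayId: string, allDisplaysWereLost: boolean): string[] {
        const entry = this.captured.get(displayId);
        if (!entry) return [];
        this.captured.delete(displayId);
        const expired = Date.now() - entry.capturedAt > GRACE_MS;
        // When every display was lost (e.g., driver update, remote session, or a
        // system failure), the timer is not applied and the screens are restored.
        if (expired && !allDisplaysWereLost) return [];
        return entry.appWindowIds;
      }
    }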
Further regarding how an operating system and/or a windowing system handles multiple monitors, the windowing system may also allow application windows to be interactively moved across displays. For example, a user might be allowed to drag a window on a first display over to a second display (or the user might input a “switch displays” command). That is to say, the windowing system is able to maintain a tiled arrangement and provide user interface elements seamlessly within a concatenated display model.
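Moving a window across displays within a concatenated display model might, for illustration, be modeled as removing the window from one display's tile list and inserting it into the other's, after which each display is re-tiled (for example, by a layout routine like the one sketched earlier); the names below are hypothetical:

    // Hypothetical sketch: move an application window from one display's tile
    // list to another's while preserving the order of the remaining windows.
    function moveAcrossDisplays(
      sourceTiles: string[], targetTiles: string[], appWindowId: string, insertAt: number
    ): { sourceTiles: string[]; targetTiles: string[] } {
      const nextSource = sourceTiles.filter(id => id !== appWindowId);
      const nextTarget = [
        ...targetTiles.slice(0, insertAt), appWindowId, ...targetTiles.slice(insertAt)];
      // Both displays would then be re-tiled so each remains fully occupied.
      return { sourceTiles: nextSource, targetTiles: nextTarget };
    }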
As can be seen from the embodiments described above, when a computing device has only one connected display, a number of system user interface elements may be available to open applications or perform other application management functions. When a second display is connected, those system user interface elements, even if dormant or not currently displayed, become equally available to be activated on both displays. In addition, their display or use on one display need not affect the contents of the other display. When the user interface elements are deactivated or undisplayed from one display, the contents (e.g., windows) of the other display may continue to be displayed thereon.
Embodiments and features discussed above can be realized in the form of information stored in volatile or non-volatile computer-readable or device-readable devices. This is deemed to include at least devices such as optical storage (e.g., compact-disk read-only memory (CD-ROM)), magnetic media, flash read-only memory (ROM), or any other devices for storing digital information in physical matter. The stored information can be in the form of machine executable instructions (e.g., compiled executable binary code), source code, bytecode, or any other information that can be used to enable or configure computing devices to perform the various embodiments discussed above. This is also deemed to include at least volatile memory such as random-access memory (RAM) and/or virtual memory storing information such as central processing unit (CPU) instructions during execution of a program carrying out an embodiment, as well as non-volatile media storing information that allows a program or executable to be loaded and executed. The embodiments and features can be performed on any type of computing device, including portable devices, workstations, servers, mobile wireless devices, and so on.