For some time, in the field of computing, windowing environments have been able to provide applications with multiple windows. Applications having a clear distinction between control and content playback have sometimes used multiple windows to display content in one window and to display controls in another window. For example, media players, slide presentation applications, video games, and other applications have separated, with multiple windows, the display of content (display or presentation graphics) from the display of interactive graphic controls used to control the content. Such applications will be referred to loosely as presentation applications. To date, presentation-type applications have been developed to handle control-presentation separation on an ad hoc basis. That is, the applications themselves have been developed with logic to separate presentation of content from the controls that control the content. Some such applications require complex scenario-handling logic that often fails.
In addition, it has been known to use multiple displays concurrently with one computing device. Typically, an operating system of a multi-display device handles the hardware details of managing multiple displays and may provide different display modes such as display mirroring or display concatenation. Again, presentation-style applications, if they have used multiple displays, have done so at the application level. Each application may have its own logic for handling multiple monitors. Again, such logic can be complex and may fail under certain use cases and display events. In addition, such applications or systems may not allow interaction with content or even other applications on a projected or secondary display.
Techniques related to system-managed multi-display projection logic are discussed below.
The following summary is included only to introduce some concepts discussed in the Detailed Description below. This summary is not comprehensive and is not intended to delineate the scope of the claimed subject matter, which is set forth by the claims presented at the end.
Described herein are embodiments performed by a computing device capable of having a first display and a second display. The device may also execute a windowing system. Arbitrary applications execute on the computing device. Each such application has a corresponding application window managed by the windowing system. A start-projecting request may be received from an arbitrary one of the applications, and the windowing system may respond to the start request by generating, displaying, and managing a projection window. The application may generate and display content via the projection window. Responsive to the start-projecting request, display information about the first display and the second display may be used by the windowing system to display the projection window. A stop-projecting request from the application may cause the windowing system to terminate the projection window.
Many of the attendant features will be explained below with reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein like reference numerals are used to designate like parts in the accompanying description.
Embodiments discussed below relate to managing a secondary or projection view or window for arbitrary applications with advantage taken of multiple displays and with multitasking enabled on the multiple displays. While such functionality may be helpful for presentation applications, a system-managed projection window may have any arbitrary use and may display any content as determined by its corresponding parent application.
The windowing system may be partly integrated with or closely coupled with the operating system 106. For purposes herein, distinctions between these components are not significant. The windowing system 108 may have some functionality that is known in the computing arts, such as handling of input events (e.g., touches/strokes, clicks, keypresses, mouse drags, etc.) inputted by a user with various types of input devices. The windowing system 108 also manages application windows 110, handling related events or user interactions such as closing windows, moving windows, resizing windows, rendering multiple concurrent windows, possibly providing a background and controls not specific to an application, etc. Additional details of a windowing system are described later with reference to
The windowing system 108 manages application windows 110. The application windows 110 correspond to respective processes executing on the computing device 100 and managed by the operating system 106. Note that each process need not have a window, and each window need not have its own unique process. Some processes may have multiple application windows 110. Again, windows and their basic features are well known in the art of computing and require no further explanation. In embodiments described herein, some applications (e.g., processes with application windows 110) may access services or functions of the windowing system 108 via one or more application programming interfaces (APIs) 112 or the like.
In extend mode the displays 102A, 102B are treated as one logical display surface extended across two displays. Various forms of extend mode may be implemented. In one form the displays 102A, 102B are abstracted to a virtual display 130 that is indistinguishable from a single display. All graphics are bridged across the displays, and the individual displays may not be distinctly visible to applications. Another form of extend mode treats the displays partly as a single device and partly as multiple devices. For example, a user shell or graphical user interface (GUI) might have distinct environmental components such as taskbars or application icons duplicated on each display, but at the same time this form of extend mode might provide applications with the ability to individually address the displays 102A, 102B. For purposes herein, “extend mode” refers to nearly any system-provided multi-display mode that allows different application windows to be displayed on different respective displays.
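The first form of extend mode described above can be illustrated with a short sketch. The class name and display representation here are hypothetical, not taken from any actual implementation: physical displays are concatenated into one logical surface, and a virtual coordinate is mapped back to a physical display only inside the abstraction.

```python
class VirtualDisplay:
    """Sketch of one form of extend mode: a single logical surface
    spanning physical displays placed side by side (hypothetical names)."""

    def __init__(self, displays):
        # Each display is a dict with "width" and "height" in pixels.
        self.displays = displays
        self.width = sum(d["width"] for d in displays)
        self.height = max(d["height"] for d in displays)

    def locate(self, x):
        """Map a virtual x coordinate to (display_index, local_x).
        Applications drawing on the virtual surface never see this mapping."""
        offset = 0
        for i, d in enumerate(self.displays):
            if x < offset + d["width"]:
                return i, x - offset
            offset += d["width"]
        raise ValueError("x outside virtual surface")
```

Under this form, an application that draws at x = 2000 on two 1920-wide displays is unaware that its output lands 80 pixels into the second display.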
An application may be a first-order object having a primary window that can be interactively moved, resized, moved from one display to another, etc. In one embodiment, the application manager may have layout logic that manages the layout of application windows 110. For instance, the application manager (or a layout manager) may allow a user to interactively insert an application window 110, interactively replace one application window 110 with another, concurrently reallocate display space among application windows 110 (e.g., using moveable dividers), interactively move a window, and so forth. In sum, the application manager 150 provides the user with multiple application windows for multi-tasking.
The monitor manager 152 may handle hardware details of having multiple displays 102. The monitor manager 152 may operate as a service providing abstract access to display functionality. For example, the monitor manager 152 may handle the addition and removal (e.g., turning on/off, connecting/disconnecting, enabling/disabling) of displays. The monitor manager 152 also provides the different display modes such as extend mode and duplicate mode. The monitor manager 152 may also provide, to applications, the application manager 150, and the projection manager 154, information about the current display mode, information about which displays are available and their properties, and the like. In addition, the monitor manager 152 may switch between display modes as instructed by the projection manager 154.
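The responsibilities attributed to the monitor manager 152 can be sketched as follows. All names here are hypothetical illustrations of the described behavior, not an actual system API: the manager tracks attached displays, exposes the current display mode, switches modes on request, and notifies subscribers of display addition and removal.

```python
from enum import Enum


class DisplayMode(Enum):
    DUPLICATE = "duplicate"  # same image on each display
    EXTEND = "extend"        # different windows on different displays


class MonitorManager:
    """Sketch of a monitor-manager service (hypothetical names)."""

    def __init__(self):
        self.displays = []
        self.mode = DisplayMode.DUPLICATE
        self._listeners = []

    def subscribe(self, callback):
        # Applications and other managers can learn about display events.
        self._listeners.append(callback)

    def attach(self, display):
        self.displays.append(display)
        self._notify("attached", display)

    def detach(self, display):
        self.displays.remove(display)
        self._notify("detached", display)

    def set_mode(self, mode):
        # Called, for example, by a projection manager when a projection
        # window requires extend mode.
        self.mode = mode

    def _notify(self, event, display):
        for callback in self._listeners:
            callback(event, display)
```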
The projection manager 154 is a component that allows applications to use window projection functionality without concern for implementation details. The projection manager 154 may be accessible through an API that includes at least a start-projecting call and a stop-projecting call. When an application needs a projection window for presentation-style display of content, the application may use the projection manager to start projecting.
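The start-projecting and stop-projecting calls described above can be sketched as a minimal API surface. The class and method names here are hypothetical, chosen only to mirror the described calls; a real implementation would also coordinate with the monitor manager and window manager.

```python
class ProjectionWindow:
    """A system-managed window into which an application renders content."""

    def __init__(self, app_id, display_id):
        self.app_id = app_id
        self.display_id = display_id
        self.open = True


class ProjectionManager:
    """Sketch of a projection-manager API (hypothetical names)."""

    def __init__(self):
        self._projections = {}  # app_id -> ProjectionWindow

    def start_projecting(self, app_id, display_id):
        # One projection window per application, as suggested later
        # in the description.
        if app_id in self._projections:
            raise RuntimeError("application already has a projection window")
        window = ProjectionWindow(app_id, display_id)
        self._projections[app_id] = window
        return window

    def stop_projecting(self, app_id):
        # Terminate the application's projection window, if any.
        window = self._projections.pop(app_id, None)
        if window is not None:
            window.open = False
```

An application would thus call start_projecting when it needs a presentation-style display of content and stop_projecting when finished, without handling any display details itself.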
The projection manager 154 may go through a decision-making process to determine on which of the two (or more) displays the projection window is to be displayed. When switching from duplicate mode to extend mode, the main or primary window of the requesting (projecting) application may not be associated with either display. Consequently, the projection manager 154 may decide for the user which display will display the primary application window and which display will display the projection window. For instance, the projection manager 154 may take into account the display types (e.g., projector), the sizes of the displays, whether a display has input associated with it (e.g., a touch-sensitive display), previous display uses, etc. As will be discussed later, the projection manager 154 will also generate and manage the projection window even when there is only one display available or when the display mode is locked into duplicate mode per a user preference.
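The display-selection decision described above can be sketched as a simple heuristic. The function name, the display fields, and the ordering of the criteria are all assumptions for illustration; the description only says that factors such as display type, size, and input capability may be taken into account.

```python
def choose_projection_display(displays):
    """Pick the display to host the projection window; the remaining
    display(s) host the primary application window. Hypothetical
    heuristic: prefer a projector-type display, then the largest
    display without input."""
    def area(d):
        return d["width"] * d["height"]

    projectors = [d for d in displays if d.get("kind") == "projector"]
    if projectors:
        return max(projectors, key=area)
    non_input = [d for d in displays if not d.get("has_input")]
    candidates = non_input or displays
    return max(candidates, key=area)
```

A touch-sensitive built-in panel would thus tend to keep the primary (interactive) window, while an attached projector receives the projection window.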
After the projection window has been displayed, at step 178 the windowing system and its components begin managing the projection window much like any other application window, for example by responding to user input directed to or affecting the main application window, the projection window, or other application windows that might be displayed on one or more of the displays after projecting has begun. During this time, the content displayed in the projection window may be controlled by user interaction with the main window or user interface elements thereof such as “play”, “skip”, “rewind”, “stop”, or any other control not necessarily of the playback control variety (for example, game controls may be used to play a game with the projection window displaying game graphics).
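The control relationship described above, where input directed at the main window changes what the projection window displays, can be sketched briefly. The class and command names are hypothetical stand-ins for the "play"/"skip"/"rewind" style of controls mentioned above.

```python
class PresentationApp:
    """Sketch: controls in the main application window drive the content
    rendered in the projection window (hypothetical names)."""

    def __init__(self, slides):
        self.slides = slides
        self.index = 0

    def handle_control(self, command):
        # Input arrives via the main window's user interface elements;
        # only the rendered state of the projection window changes.
        if command == "skip" and self.index < len(self.slides) - 1:
            self.index += 1
        elif command == "rewind":
            self.index = 0

    def projection_content(self):
        # What the system-managed projection window currently shows.
        return self.slides[self.index]
```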
At the same time, the main application window 190 and the presentation window 192 (a child window of the application) are objects subject to user manipulation and management by the application manager 150. In other words, at any time the user might change which displays are available, may add or remove windows to/from either display 102A, 102B, and so forth. In other words, both displays 102A, 102B may be thought of as providing a multi-tasking user interface in conjunction with a system-curated or managed presentation view. The lower parts of
While two-display embodiments have been described above, the same embodiments are readily extendible to any number of displays. In one embodiment the presentation activation request may be denied by the presentation manager if the requesting application is not the active application. It is also possible that the application may itself make sure it is active before issuing a projection call, or there may be a separate API. The window manager is also a point where limiting projection to an active application can be implemented if needed. Note that the active application is of interest at least in part because of the multitasking environment.
The presentation manager may have other calls related to presentation. For example, a call may be provided to allow an application to swap its presentation window and main window between displays. An application may also query about the presence of a second display or can receive a notification when a second display becomes available. It may be practical in some implementations to allow an application to have only one presentation window. In yet another embodiment closing the main application window also closes its presentation window (or all non-projection windows of the application, as the case may be).
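The swap call mentioned above, which exchanges the displays hosting an application's presentation window and main window, can be sketched as follows. The function name and state representation are assumptions for illustration only.

```python
def swap_windows(app_state):
    """Hypothetical helper mirroring a swap call: exchange the displays
    that host an application's main window and presentation window."""
    app_state["main_display"], app_state["presentation_display"] = (
        app_state["presentation_display"],
        app_state["main_display"],
    )
    return app_state
```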
In some implementations or cases a projected window may be able to accept input but does not afford input. That is, the windowing system might assume that there is no input on a remote/second screen (for example, a remote window goes away when the local window is closed). If a user does have input on the remote/second screen (for example, a touch panel), the user can touch it and interact with it just like any other screen.
Embodiments and features discussed above can be realized in the form of information stored in volatile or non-volatile computer or device readable devices. This is deemed to include at least devices such as optical storage (e.g., compact-disk read-only memory (CD-ROM)), magnetic media, flash read-only memory (ROM), or devices for storing digital information. The stored information can be in the form of machine executable instructions (e.g., compiled executable binary code), source code, bytecode, or any other information that can be used to enable or configure computing devices to perform the various embodiments discussed above. This is also deemed to include at least volatile memory such as random-access memory (RAM) and/or virtual memory storing information such as central processing unit (CPU) instructions during execution of a program carrying out an embodiment, as well as non-volatile devices storing information that allows a program or executable to be loaded and executed. The embodiments and features can be performed on any type of computing device, including portable devices, workstations, servers, mobile wireless devices, and so on.