The invention relates generally to computer operating systems. More specifically, the invention provides a method for transforming a work area (e.g., a desktop) of a graphical operating system in a virtual three-dimensional space so that an information component can be viewed in the revealed presentation area.
Computer operating systems have evolved significantly in recent years. Typically, these systems have a shell that provides a graphical user interface (GUI) to an end-user. The shell consists of one or a combination of software components that provide direct communication between the user and the operating system. Speed improvements in computer hardware, e.g., memory, hard drives, processors, graphics cards, system buses, and the like, have enabled richer GUIs that are dramatically easier for users to comprehend. Accompanying hardware price reductions have made computer systems more affordable, enabling broad adoption of computers as productivity tools and multimedia systems. GUIs have allowed users who may have been unschooled or unfamiliar with computers to quickly and intuitively grasp the meaning of desktops, icons, windows, and applications, and how the user can interact with each.
The desktop illustrated in
To some extent, this two-dimensional shortcoming has been driven by the video hardware available in personal computers. In the past, advancements in mid- and lower-end computer video hardware have been driven in large part by the graphical services available in popular operating systems. However, the graphical services available in these systems have not significantly advanced for a variety of reasons, including the need to maintain compatibility with older application software and the limited capabilities of the affordable range of video hardware. More recently, however, real-time 3D computer games have overtaken operating systems as the primary market incentive for advancing retail video hardware, which has in a short time attained an exceptional level of sophistication. Real-time, hardware-based 3D acceleration is now available to consumers at reasonable cost. Thus, graphics hardware features once considered highly advanced, such as accelerated texture and lighting algorithms as well as 3D transformations, are readily available. At present, generally only game software and highly specialized graphics applications actively exploit such features.
An operating system, such as Microsoft Windows XP® brand or Windows 2000® brand operating systems, will typically comprise a graphical method for launching new software applications within its GUI.
When a user clicks on the Start button 205 in
From a broader perspective, a program launching menu, like the Start Menu, occupying the same work area as the software applications inhibits a user's fundamental understanding of the operating system. Manipulating application windows and the content therein can be viewed as tasks within and under the auspices of the operating system. For these tasks (e.g., editing a document or clicking on a link in a web page), the operating system can be viewed as arbitrating communication between the user and the application, displaying application output for the user, and passing user input to the application. Using this same perspective, launching a new application can be viewed as a meta-task, or as making a direct request of the operating system which operates outside the normal user-input-application-output model. That being the case, a program launching menu which occupies an existing work area inhabited by other windows and icons has the potential to confuse an end user, both visually and conceptually.
Thus, it would be an advancement in the art to provide for viewing a program launching menu in a way which does not clutter a work area such as a desktop, and also conceptually decouples the operating system from the applications it hosts.
The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. The summary is not an extensive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to the more detailed description below.
A first embodiment of the invention provides a method for displaying content to a user through a three-dimensional graphical user interface on a computer. The method comprises transforming a presently displayed work area, which includes desktops, windows, and the like. The transformation can involve rotating the work area away from the user and revealing a portion of a presentation area situated behind the work area. Finally, an information component, such as a Start Menu, is displayed in the visible portion of the presentation area.
A second embodiment of the invention provides a computer system comprising a pointing device, a processor, a display, and a memory, the memory storing computer executable instructions. The computer executable instructions provide for a graphical user interface using three-dimensional graphics. In addition, the computer executable instructions provide for transforming a presently displayed work area, and displaying an information component in the portion of the presentation area revealed behind the work area.
A more complete understanding of the present invention and the advantages thereof may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numbers indicate like features, and wherein:
In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration various embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope and spirit of the present invention.
Illustrative Operating Environment
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers; server computers; portable and hand-held devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, DVD or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
The invention may use a compositing desktop window manager (CDWM), further described below and in co-pending application Ser. No. 10/691,450, filed Oct. 23, 2003 and entitled “Compositing Desktop Window Manager.” The CDWM is used to draw and maintain the display using a composited desktop model, i.e., a bottom-to-top rendering methodology in a virtual three-dimensional graphical space, as opposed to simulated 3D in a two-dimensional graphical space. The CDWM may maintain content in a buffer memory area for future reference. The CDWM composes the display by drawing from the bottom up, beginning with the presentation area background, then a desktop background, and proceeding through overlapping windows in reverse Z order. While composing a desktop, the CDWM may draw each window based in part on the content in front of which the window is being drawn (e.g., transparency), and based in part on other environmental factors (e.g., light source, reflective properties, etc.). For example, the CDWM may use the alpha channel of an ARGB format texture to provide transparency to a window, and may selectively emphasize portions of window content (e.g., the frame) based on a virtual light source.
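By way of illustration only, and not as a description of the actual CDWM implementation, the following C++ sketch conveys the general idea of bottom-to-top alpha compositing in reverse Z order. The Surface, BlendOver, and Compose names are hypothetical, and the sketch assumes each surface has already been positioned into a full-screen-sized buffer.

```cpp
// Illustrative sketch only -- not the CDWM's actual implementation.
// Composites a stack of ARGB surfaces bottom-to-top (reverse Z order),
// blending each pixel onto the result using its alpha channel.
#include <cstdint>
#include <vector>

struct Surface {
    int width, height;
    std::vector<uint32_t> argb;   // 0xAARRGGBB pixels
};

// Blend a source pixel over a destination pixel (the "over" operator).
static uint32_t BlendOver(uint32_t dst, uint32_t src) {
    uint32_t a = (src >> 24) & 0xFF;
    auto mix = [a](uint32_t d, uint32_t s) {
        return (s * a + d * (255 - a)) / 255;
    };
    uint32_t r = mix((dst >> 16) & 0xFF, (src >> 16) & 0xFF);
    uint32_t g = mix((dst >> 8) & 0xFF, (src >> 8) & 0xFF);
    uint32_t b = mix(dst & 0xFF, src & 0xFF);
    return 0xFF000000 | (r << 16) | (g << 8) | b;
}

// Compose the display: background first, then each surface in reverse Z order.
// For simplicity, every surface is assumed to match the target's dimensions.
void Compose(Surface& target, const std::vector<Surface>& bottomToTop) {
    for (const Surface& s : bottomToTop) {
        for (int i = 0; i < s.width * s.height; ++i) {
            target.argb[i] = BlendOver(target.argb[i], s.argb[i]);
        }
    }
}
```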
The CDWM may rely upon a lower level graphics compositing subsystem, referred to herein as a Unified Compositing Engine (UCE), further described below and in co-pending application serial number (attorney docket number 50037.201US01), filed Oct. 23, 2003, entitled “System and Method for a Unified Composition Engine in a Graphics Processing System”, herein incorporated by reference in its entirety for all purposes. In one illustrative embodiment the UCE is based on or uses Direct3D® and DirectX® technology by Microsoft Corporation of Redmond, Wash. In alternative embodiments other graphics compositing subsystems may be used, such as variations of the X Window platform based on the OpenGL® graphics engine by Silicon Graphics, Inc. of Mountain View, Calif., and the like. The UCE enables 3D graphics, animation, transparency, shadows, lighting effects, bump mapping, environment mapping, and other rich visual features on the desktop.
A Unified Compositing Engine (UCE) 194 may service rendering instructions and coalesce resources emitted from the CDWM via a Programming Interface 194a. The UCE Programming Interface 194a provides an abstract interface to a broad range of graphics services including resource management, encapsulation of multiple-display scenarios, and remote desktop support. Rendering desktops to multiple displays requires abstraction of the differences in refresh rate, pixel format support, and device coordinate mapping among heterogeneous display devices. The UCE may provide this abstraction.
Graphics resource contention between write operations and rendering operations may be arbitrated by an internal Resource Manager 194b. Requests for resource updates and rendering services are placed on the UCE's Request Queue 194c by the Programming Interface subcomponent 194a. These requests may be processed asynchronously by the Rendering Module 194d at intervals coinciding with the refresh rate of the display devices installed on the system. Thus, the Rendering Module 194d of the UCE 194 may access and manipulate resources stored in the Resource Manager 194b as necessary, and assemble and deliver display-specific rendering instructions to the 3D Graphics Interface 195.
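The following C++ sketch is a simplified illustration of the queued, refresh-paced processing described above. The RequestQueue, Post, Drain, and RenderLoop names are hypothetical and do not reflect the actual UCE programming interface, and the 16 ms interval merely stands in for the display refresh period.

```cpp
// Illustrative sketch only; names are hypothetical, not the UCE API.
#include <chrono>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

class RequestQueue {
public:
    // Called by the programming interface to enqueue a resource update
    // or rendering request.
    void Post(std::function<void()> request) {
        std::lock_guard<std::mutex> lock(mutex_);
        requests_.push(std::move(request));
    }
    // Drain all pending requests for processing by the rendering module.
    void Drain() {
        std::queue<std::function<void()>> pending;
        {
            std::lock_guard<std::mutex> lock(mutex_);
            std::swap(pending, requests_);
        }
        while (!pending.empty()) { pending.front()(); pending.pop(); }
    }
private:
    std::mutex mutex_;
    std::queue<std::function<void()>> requests_;
};

// Rendering loop: process queued requests once per display refresh
// (a fixed ~60 Hz interval is assumed here for simplicity).
void RenderLoop(RequestQueue& queue, const bool& running) {
    const auto refreshInterval = std::chrono::milliseconds(16);
    while (running) {
        queue.Drain();                        // apply updates, render the scene
        std::this_thread::sleep_for(refreshInterval);
    }
}
```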
The UCE may also be responsible for delivering graphics data over a network connection in remote display configurations. In order to efficiently present the display of one system remotely on another, resource contention should be avoided, performance optimizations should be enacted, and security should be robust. These responsibilities may also rest with the UCE.
The 3D Graphics Interface 195 may include a low-level, immediate-mode (stateless) graphics service such as Direct3D®, OpenGL®, or the like. A purpose of the 3D Graphics Interface may be to provide an abstract interface over the features of the particular graphics hardware configuration. The 3D Graphics Interface may service a single display device; the UCE may parse and distribute rendering instructions among multiple graphics output devices 197 in a multiple-display system via multiple device drivers 196.
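By way of illustration only, the sketch below suggests how an abstract 3D graphics interface could hide per-device differences while rendering instructions are fanned out to several output devices. All type and function names are hypothetical and are not part of Direct3D®, OpenGL®, or the UCE.

```cpp
// Illustrative sketch only; hypothetical types, not an actual graphics API.
#include <memory>
#include <vector>

struct RenderInstruction { /* device-independent drawing command */ };

// Abstract interface over the features of a particular graphics device.
class Abstract3DGraphicsInterface {
public:
    virtual ~Abstract3DGraphicsInterface() = default;
    virtual void Execute(const RenderInstruction& instruction) = 0;
};

// Distribute one frame's instructions among multiple output devices,
// each behind its own driver-specific implementation of the interface.
void Distribute(const std::vector<RenderInstruction>& frame,
                std::vector<std::unique_ptr<Abstract3DGraphicsInterface>>& devices) {
    for (auto& device : devices) {
        for (const auto& instruction : frame) {
            device->Execute(instruction);
        }
    }
}
```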
It should be noted that the component architecture depicted in
Although the illustrative embodiments of
Returning to
Transforming the presently displayed work area can be accomplished using a 3D graphics system present in the hardware and/or software (e.g., in the operating system) of the host computer, as described above. Optionally, ongoing visual activity within the work area may continue while it is transformed, especially with the assistance of the 3D graphics system. For example, if a window within the work area is showing a video clip, the video may continue to play, although in a transformed state. When a user clicks on the Start button, the operating system uses a three-dimensional transform to tilt the work area. The specifics of this transformation are provided in more detail below. Although a three-dimensional rendering system is used here, the visual transformation can be simulated by conventional two-dimensional algorithms. The resulting display conceptually decouples the operating system from the applications it hosts and prevents visual clutter while taking full advantage of the graphics capabilities of the host computer.
Once the presently displayed work area is transformed and the information component displayed, the work area may retain some level of interactivity. At a minimum, if the user points the mouse in the work area and clicks, the work area can be returned to its initial un-transformed state, and the user may resume normal manipulation of the work area. Alternatively, the location of the user's click may be processed as a normal click upon the screen, triggering activity within the work area. For example, clicking on a window in the transformed work area might bring the work area back into the forefront, and additionally give the focus to the window clicked while the work area was transformed. Another possibility is that the user may click on a specific control or item within an application window in the transformed work area. The exact location of the click can be un-transformed into two-dimensional space and passed through to the application running within the work area. The work area can be returned to the forefront, and the application can process the click as it normally would.
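By way of illustration, and under the assumption that the 3D hit point on the tilted work-area plane has already been obtained (for example, through the graphics system's picking facilities), the following C++ sketch un-rotates that point and maps it back to two-dimensional desktop pixel coordinates. The function names and the pixel-mapping parameters are hypothetical.

```cpp
// Minimal sketch only. Assumes the 3D hit point on the tilted work-area
// plane is already known; applies the inverse of the Y-axis rotation that
// tilted the work area, then maps the result to 2D desktop pixels.
#include <cmath>

struct Vec3 { float x, y, z; };
struct Point2D { int x, y; };

// Undo a rotation about a vertical (Y-parallel) axis located at (pivotX, pivotZ).
Vec3 InverseRotateAboutY(Vec3 p, float angleRadians, float pivotX, float pivotZ) {
    float c = std::cos(-angleRadians), s = std::sin(-angleRadians);
    float dx = p.x - pivotX, dz = p.z - pivotZ;
    return { pivotX + dx * c + dz * s, p.y, pivotZ - dx * s + dz * c };
}

// Convert the un-rotated plane point to desktop pixel coordinates
// (hypothetical convention: origin at the top-left, Y growing downward).
Point2D ToDesktopPixels(Vec3 flat, float desktopLeft, float desktopTop,
                        float pixelsPerUnit) {
    return { static_cast<int>((flat.x - desktopLeft) * pixelsPerUnit),
             static_cast<int>((desktopTop - flat.y) * pixelsPerUnit) };
}
```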
Mapping elements of work area 411 into 3D space can be accomplished in any number of ways, for example, using the resources of the previously described compositing desktop window manager, low level graphics APIs, such as Direct3D® or OpenGL®, a high level graphics API, such as Java 3D™, or working directly with the specialized hardware of a 3D graphics video card. One possible method for creating the 3D scene presented in
The meshes required to produce the scene set in
Each of the meshes described above can be created in the host computer's 3D system using a 3D graphics API, such as Direct3D®, simply by specifying the X, Y, and Z coordinates of the vertices. Once the meshes are described and placed, their surfaces are defined. The desktop 404, for example, may be a simple texture map of a photograph, or a single solid color with no transparency. The contents of open window 406 may be projected onto its respective mesh as a texture map, or each component of the open window can be drawn as its own mesh, each with its own attendant image and surface properties.
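As a simplified illustration of specifying such a mesh, the following C++ sketch defines a rectangular work-area quad with per-vertex texture coordinates. The coordinate values are arbitrary placeholders, and in a Direct3D® implementation the same data would typically be copied into a vertex buffer.

```cpp
// Illustrative sketch: a rectangular work-area mesh described by four
// textured vertices; values are placeholders, not the invention's geometry.
struct TexturedVertex {
    float x, y, z;   // position in the 3D presentation space
    float u, v;      // texture coordinates mapping the desktop image onto the quad
};

// A quad facing the viewer; two triangles share these four corners.
static const TexturedVertex kWorkAreaQuad[4] = {
    { -1.0f,  1.0f, 0.0f, 0.0f, 0.0f },   // top-left
    {  1.0f,  1.0f, 0.0f, 1.0f, 0.0f },   // top-right
    { -1.0f, -1.0f, 0.0f, 0.0f, 1.0f },   // bottom-left
    {  1.0f, -1.0f, 0.0f, 1.0f, 1.0f },   // bottom-right
};
```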
Once the surfaces are specified for each of the meshes, the scene is set in the memory of the computer. Next, the computer must render the 2D audience view (
Although the scene above is described as being set in 3D space, it is only one embodiment of the invention. The frontal view illustrated in
The transformation shown here is one of rotating the work area 411 away from the user around an invisible axis 510, the axis in this embodiment running parallel to the Y axis. Other axes of rotation are possible, including horizontal and diagonal axes, and the axis can be located either inside or outside of the presentation area. In addition, the particular transformation need not be a rotation; the work area may retreat from the screen and move to one side, for example. A 3D graphics API, such as Direct3D®, can accomplish this displacement of selected objects in the presentation area 401 with a transformation command, and the new scene or scenes can be rendered for presentation to the user. Optionally, the work area 411 is rotated away in one frame, without animation. However, in order to help the user mentally transition from the work area context to the operating system context, a smooth animation is preferred. The steps between FIGS. 4A/4B and FIGS. 5A/5B can be animated by rotating the work area 411 around axis 510 in small increments and rendering each of the frames in between.
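The following Direct3D® 9 sketch is illustrative only: it places the audience camera with a perspective projection (which also produces the foreshortening discussed below) and steps the work-area rotation in small per-frame increments. Device creation and drawing are omitted, and the pivot location, angles, and frame count are placeholders rather than parameters of the invention.

```cpp
// Illustrative Direct3D 9 sketch; not the invention's actual implementation.
#include <d3dx9.h>

// Place the audience camera and a perspective projection for the 2D view.
void SetAudienceCamera(IDirect3DDevice9* device, float aspectRatio) {
    D3DXVECTOR3 eye(0.0f, 0.0f, -5.0f), at(0.0f, 0.0f, 0.0f), up(0.0f, 1.0f, 0.0f);
    D3DXMATRIX view, projection;
    D3DXMatrixLookAtLH(&view, &eye, &at, &up);
    D3DXMatrixPerspectiveFovLH(&projection, D3DX_PI / 4, aspectRatio, 1.0f, 100.0f);
    device->SetTransform(D3DTS_VIEW, &view);
    device->SetTransform(D3DTS_PROJECTION, &projection);
}

// World transform rotating the work-area quad about a vertical axis at
// (pivotX, 0, pivotZ) by 'angle' radians.
D3DXMATRIX WorkAreaTransform(float angle, float pivotX, float pivotZ) {
    D3DXMATRIX toPivot, rotate, fromPivot;
    D3DXMatrixTranslation(&toPivot, -pivotX, 0.0f, -pivotZ);
    D3DXMatrixRotationY(&rotate, angle);
    D3DXMatrixTranslation(&fromPivot, pivotX, 0.0f, pivotZ);
    return toPivot * rotate * fromPivot;
}

// Animate from flat (0 radians) to the tilted pose in small per-frame steps.
void AnimateTilt(IDirect3DDevice9* device, float targetAngle, int frames) {
    for (int i = 1; i <= frames; ++i) {
        D3DXMATRIX world = WorkAreaTransform(targetAngle * i / frames, 1.0f, 0.0f);
        device->SetTransform(D3DTS_WORLD, &world);
        // ... clear the back buffer, draw the work-area mesh and the revealed
        // presentation area behind it, then present the frame ...
    }
}
```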
Start button 405, as depicted here, does not move with the rotation in this embodiment. Rather, it retains its fixed location so that the user always sees it as a starting point and positional reference within the 3D presentation area 401. It should be noted that Start button 405 may be situated at any location within the display, and not just the lower left corner. For instance, the button 405 may be placed on the right side of the screen. In such a situation, the rotational transformation may optionally occur with work area 411 rotating away to the left rather than to the right.
As the work area 411 is rotated away in 3D, a portion of the presentation area 401 may be revealed. Any objects stored in this “backstage” area that were previously hidden by the work area 411 may now be exposed. The presentation area 401 may simply comprise a solid color background, distinct from the colors of the work area 411. Alternatively, presentation area 401 may comprise a 3D table top (not shown), along which the work area may slide as it rotates away. In such a setting, the table top may comprise a flat mesh with reflective marble-like properties, thereby creating a mirrored reflection of the desktop.
The 3D engine optionally used to render each scene may take into account the visual perspective which occurs as objects move along the Z axis. Hence, objects that are closer to the user along the Z axis will appear larger to the viewer, and items further away along the Z axis will appear smaller. 3D perspective causes lines which are substantially parallel to appear to merge at some distant vanishing point on an invisible horizon. Thus, the portions of work area 411 which are further away will appear smaller in the frontal view of
Once the work area 411 is rotated away, the program launcher 512 can appear in the portion of the presentation area revealed.
Once program launcher 512 is revealed, the user may choose to launch one of the applications in the list of applications and submenus. A new application or new window for an existing application can be launched into work area 411 or into its own work area. Either way, program launcher 512 disappears, and work area 411 (or the new work area) returns to the forefront of the scene, preferably using a 3D animation. If the item selected from program launcher 512 is associated with the operating system, it may be launched as an information component in the same location as Start Menu 512. If the item selected from program launcher 512 requires the display of a submenu, then the program launcher remains, and work area 411 may be further transformed to make room for the submenu.
The submenu selection described above is depicted in
This 3D layering of windows in a work area is further depicted in
If a user were to click on a portion of work area 711 rather than on the menu and submenu 712, as described above, the click may result in one of several events. The click may simply cause work area 711 to be transformed back to the forefront. Or the click location may be passed through to the work area and used appropriately. For example, if the user clicks on window 706a, not only may work area 711 return to the forefront, but window 706a may move to the top of the stack of open windows. Alternatively, clicking only once on window 706a may result in that window moving to the front of the stack, in front of window 706b, but work area 711 may remain transformed. In this scenario, a double click may be used to return work area 711 to the forefront.
At this point, in step 1004, the information component required is displayed in the portion of the presentation area revealed. The user, in step 1005, controls the next course of action by directing input, such as a mouse click or keyboard stroke, to either the information component displayed or the recently transformed work area. Alternatively, the information component may be timed to retreat after a certain period of time. If the user's input requires a new work area in decision step 1006, then a new work area will be displayed in step 1007. Otherwise, if the user interacts directly with the transformed work area, or launches a new window in the transformed work area, then the information component may retreat, and the work area may return to the forefront.
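By way of illustration only, the following C++ sketch summarizes this decision flow; the enumeration and function are hypothetical and merely mirror the steps described above.

```cpp
// Illustrative sketch of the decision flow only; names are hypothetical,
// not an actual operating-system interface.
enum class UserAction { ClickedInformationComponent, ClickedWorkArea, TimedOut };

void HandleInput(UserAction action, bool launchesNewWorkArea) {
    switch (action) {
    case UserAction::ClickedInformationComponent:
        if (launchesNewWorkArea) {
            // Step 1007: create and display a new work area for the launched item.
        } else {
            // Launch into the existing work area, retract the information
            // component, and return the work area to the forefront.
        }
        break;
    case UserAction::ClickedWorkArea:
    case UserAction::TimedOut:
        // Retract the information component and restore the work area.
        break;
    }
}
```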
Although the embodiments of the invention described herein make reference to their use in an operating system, this does not imply that additional embodiments cannot be used within an individual software application. Software programs such as word processors, games or database managers can benefit from displaying information components in this fashion.
While the invention has been described with respect to specific examples including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above described devices and techniques that fall within the spirit and scope of the invention as set forth in the appended claims.