Computing systems and networks have transformed the way we work, play, and communicate. Computing systems obtain their functionality by executing commands on computing resources accessible to the computing system. Commands might be, for instance, initiated by a user. In that case, the user interfaces with a visualization of the command, thereby causing corresponding operations on the computing asset. During various stages of the lifecycle of a command, the user may be presented with dialogs that ask for confirmation, inform of success or failure, or inform of progress of the command.
At least some embodiments described herein relate to computing systems in which multiple non-context-sensitive or core commands may be initiated from each of a number of different user interface contexts. There are also multiple context-sensitive mechanisms for visualizing the commands, depending on which of the multiple possible user interface contexts the commands appear in. At least some embodiments described herein also relate to the presentation of dialogs at various stages of the command lifecycle without the system needing to know the underlying operations of the command, while allowing the developer to specify when dialogs are to appear in that lifecycle.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Even though the user experience for displaying the commands and the context where the command is displayed may be different, the actual command is the same as illustrated in
Commanding is a common way of describing behavior in a system, whether distributed or otherwise. Each command represents a unit of functionality that can be applicable to an asset within the system, to the system itself, or to any arbitrary artifact. Commands can be provided by the system (i.e., built-in commands) or by other parties (extrinsic commands).
In accordance with the principles described herein, commands are provided consistently across an entire system, even though the system itself may be operating a number of different applications composed by entirely different parties. Furthermore, the embodiments described herein improve security by running commands in the right isolation mode, such that harmful (but not necessarily malicious) code does not compromise the system. Preferably, commands should not block the user interface, so they run asynchronously. As for the user experience, the embodiments described herein allow commands to be surfaced following the same patterns (e.g., command bar, context menu, and so forth), also referred to herein as a context-sensitive mechanism for visualization, and provide interactivity options to the users (e.g., dialogs) so they can participate in the operation and also understand the operation's status and result.
The principles described herein may be implemented using a computing system. For instance, the users may be engaging with the system using a client computing system. The executable logic supporting the system and providing visualizations thereon may also be performed using a computing system. The computing system may even be distributed. Accordingly, a brief description of a computing system will now be provided.
Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. An example computing system is illustrated in
As illustrated in
In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the memory 104 of the computing system 100. Computing system 100 may also contain communication channels 108 that allow the computing system 100 to communicate with other message processors over, for example, network 110.
The computing system 100 also includes a display 112 on which a user interface, such as the user interfaces described herein, may be rendered. Such user interfaces may be generated in computer hardware or other computer-represented form prior to rendering. The presentation and/or rendering of such user interfaces may be performed by the computing system 100 by having the processing unit(s) 102 execute one or more computer-executable instructions that are embodied on one or more computer-readable media. Such computer-readable media may form all or a part of a computer program product.
Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
In accordance with principles described herein, a user interface element (often called herein a “part”) represents a basic unit of the user interface. Each of at least some of the parts is associated with corresponding controls that the user may interact with to thereby cause the system to execute respective commands. The execution of the command may, for instance, return data to project via the corresponding part. The parts may incorporate extrinsic commands that implement given contracts, and the system may reason about them.
In accordance with the principles described herein, commands (also called hereinafter “non-context-sensitive commands” or “core commands”) can be associated with resources in the system (such as a website, database, an arbitrary artifact, the system itself, or a piece of the user interface). This association may be persistent, such that when that resource is displayed in different user interface contexts, the non-context-sensitive commands associated with that resource are still available, but displayed using the right context-sensitive mechanism. For instance, the context-sensitive mechanism for visualizing these core commands may be a user experience form factor appropriate for the context in which the part is displayed. Consistency may also be further achieved by having the same context-visualization mechanism used to display the commands of any user interface element that is displayed in a particular user interface context. Commands are thus offered to the user via a well-defined experience that is consistent across the entire system.
The system provides a set of abstractions through a portal that enable application developers to create commands. A command encapsulates an action in the system. The composition tree describes structure and the commands describe behavior. Commands provide a well-defined surface that the system can reason about to support units of behavior. Commands can be system commands (built-in) or custom commands (provided by the application developer). Commands are offered to the user via a well-defined experience that is consistent across the entire system. This experience is built-in and cannot be redefined by application developers. Application developers can only contribute new commands, not new ways of exposing those commands to the user. Thus, the manner of exposing commands (the command experience) is governed by the system.
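By way of illustration only, the following TypeScript sketch shows one shape such a command contract could take; the names (`Command`, `CommandRegistry`, `ResourceRef`) are hypothetical assumptions and do not reflect the system's actual interfaces. The developer supplies only the command's identity and behavior, while the manner of exposing it remains with the system.

```typescript
// Hypothetical sketch of a command contract; all names are illustrative assumptions.

interface ResourceRef {
  type: string;   // e.g. "website", "database", "virtualMachine"
  id: string;     // identity of the specific asset
}

interface Command {
  id: string;                       // unique identifier, e.g. "website.stop"
  label: string;                    // text shown by whichever visualization the system picks
  kind: "built-in" | "extrinsic";   // system command vs. application-developer command
  resourceType?: string;            // asset type this command has affinity with, if any
  // The behavior the developer owns; how the command is surfaced stays with the system.
  execute(target: ResourceRef): Promise<void>;
}

// The portal owns registration and, later, how registered commands are exposed.
class CommandRegistry {
  private commands: Command[] = [];

  register(command: Command): void {
    this.commands.push(command);
  }

  // Developers contribute commands but not new ways of exposing them, so the
  // query surface the system reasons over stays narrow and uniform.
  forResourceType(type: string): Command[] {
    return this.commands.filter(c => c.resourceType === type);
  }
}
```

Under a contract of this kind, the composition tree still describes structure while commands describe behavior, and the system can treat built-in and extrinsic commands uniformly.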
Commands provide application developers and users a consistent model across applications (sometimes referred to as “extensions”), compatible with browser capabilities and scalable to all parties, to describe behavior in the system. A command may have affinity with portal assets, which can make it available everywhere that asset is presented, if so desired.
The user interface may be rich in allowing different user interface elements to be presented in entirely different contexts. For instance, as will be described further below, the user interface might include different contexts such as a favorites area, a blade, and hubs. These represent different places where user interface elements (also called herein “parts”) can be displayed. The commands associated with a given asset can be available in all of these different contexts, enabling taking action on a resource in the way that is most convenient (and, in addition, new commands can be added to accommodate the specifics of each context if needed).
Commands can be associated with different resources in the system. For instance, the resource might be a portion of the user interface itself, such as a part. Another example of a portion of the user interface is what will be referred to herein as a “blade”. A blade is a user interface element that may be placed on a canvas that extends in an extendible direction (such as horizontally). For substantially all of a particular range of the canvas in the extendible dimension, the blade may occupy substantially all of the canvas in the dimension perpendicular to the extendible direction of the canvas. The resource might also be an actual asset in the system, such as a website, database, virtual machine, and so forth. Blades or other parts can also be associated with assets, creating a transitive relationship between the commands and their container, if so desired.
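By way of illustration only, the transitive relationship described above could be modeled with simple lookups such as the hypothetical `assetCommands` map and `Blade` shape in the following TypeScript sketch; the names and structure are assumptions, not the system's actual data model.

```typescript
// Hypothetical sketch of command/asset/blade association; names are assumptions.

type CommandId = string;
type AssetId = string;

// Commands associated directly with an asset (website, database, virtual machine, ...).
const assetCommands = new Map<AssetId, CommandId[]>();
assetCommands.set("website:wandering", ["website.start", "website.stop", "website.delete"]);

// A blade may itself be bound to an asset, creating the transitive relationship:
// the asset's commands become available wherever the blade is displayed.
interface Blade {
  id: string;
  boundAsset?: AssetId;
  ownCommands: CommandId[];   // commands that belong to the blade itself
}

function commandsForBlade(blade: Blade): CommandId[] {
  const inherited = blade.boundAsset ? (assetCommands.get(blade.boundAsset) ?? []) : [];
  return [...inherited, ...blade.ownCommands];
}

const websiteBlade: Blade = { id: "blade-1", boundAsset: "website:wandering", ownCommands: [] };
console.log(commandsForBlade(websiteBlade)); // ["website.start", "website.stop", "website.delete"]
```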
Commands are visualized through different context-sensitive mechanisms, depending on the user interface context in which the associated part is displayed. Each context-sensitive mechanism supports a particular user experience. For instance,
In
The command bar 210 also includes an overflow control 221 (also called hereinafter a “command bar expansion control 221”) that is presented when there are more commands associated with the blade than the blade can display in the available space.
The selectable non-context-sensitive commands 212 through 214 may be presented regardless of the user interface context in which the commands appear. For instance, when the user interface element is associated with a resource, the non-context-sensitive commands may be basic commands associated with the resource that the user might like to initiate regardless of the user interface context in which the resource is presented. For instance, a user might like to start, stop, restart, or delete a web site from any one of a number of different user interface contexts.
Referring again to
In one embodiment, context-sensitive commands include one or more of 1) non-selectable non-context-sensitive commands (such as the start command 211), 2) commands that are associated with an underlying resource, but which are not to be performed in every user interface context (such as the browse command 222 and the reset publish profile command 223), and 3) commands that are associated with the user interface element itself, but not the underlying resource (such as the overflow indicator 221 or command bar expansion command).
The context menu 400 again visualizes the active non-context-sensitive commands, including the stop command 412 (corresponding to the stop command 212 of
In this example, the blade user interface element 200 is one example of a user interface context with the command bar 210 being an associated context-sensitive visualization for the commands. The context menu 400 is another example of the associated context-sensitive visualization for the commands, which is associated with other user interface contexts (such as smaller parts, favorites areas, grids, activity panes, and so forth).
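By way of illustration only, the following TypeScript sketch assumes a fixed mapping from user interface context to visualization mechanism; the context and mechanism names are drawn from the examples above and are not an actual API.

```typescript
// Hypothetical sketch: the system, not the developer, chooses the visualization
// mechanism for a given user interface context. Names are illustrative only.

type UiContext = "blade" | "smallPart" | "favorites" | "grid" | "activityPane";
type Visualization = "commandBar" | "contextMenu";

function visualizationFor(context: UiContext): Visualization {
  // A blade has room for a command bar; denser contexts fall back to a context menu.
  return context === "blade" ? "commandBar" : "contextMenu";
}

// The same non-context-sensitive commands are shown either way; only the
// presentation differs, keeping the experience consistent across the system.
console.log(visualizationFor("blade"));     // "commandBar"
console.log(visualizationFor("favorites")); // "contextMenu"
```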
The user interface element 500 again displays the non-context-sensitive commands, including the stop command 512 (corresponding to the stop commands 212 and 412 in
The commands 511 through 514, 522 and 523 are application commands 510 (also referred to herein as “extrinsic commands”) offered by application developers and not the underlying system. The extended context menu 500 also includes system commands 530, such as an unpin command 531, and size selection commands 532 through 534. Such system commands 530 are offered by the system regardless of the underlying resource, so long as the commands were selected within the given user interface context that generated the extended context menu 500.
The built-in commands provide general infrastructure services (pin/unpin parts, resizing parts, restoring layout, and so forth) and are general in that they apply across all usage domains. Commands provided by application developers are domain specific. For instance, an example set of extrinsic commands for a web site application might include “start”, “stop”, “delete website” and so forth.
Commands are authored by application developers by leveraging a set of artifacts (interfaces and base classes provided by the system) that expose the command contract to the application developers. This allows the application developer to provide the actual behavior of the command (what happens when the command is executed), provide dialogs (which are optional) that will display at different moments of the command's life cycle, and influence the command life cycle.
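By way of illustration only, the following TypeScript sketch assumes a hypothetical `CommandBase` class and `DialogSpec` shape standing in for the artifacts mentioned above; neither is the system's actual contract, but they show how a developer might supply behavior, opt into dialogs at particular life-cycle points, and thereby influence the life cycle.

```typescript
// Hypothetical sketch of authoring an extrinsic command against system-provided
// artifacts. The base class, dialog options, and lifecycle points are assumptions.

type LifecyclePoint = "beforeExecute" | "progress" | "succeeded" | "failed";

interface DialogSpec {
  kind: "confirmation" | "progress" | "success" | "failure";
  template?: string;   // optional template the system fills in with resource data
}

abstract class CommandBase {
  // Dialogs the developer opts into, keyed by lifecycle point; all optional.
  dialogs: Partial<Record<LifecyclePoint, DialogSpec>> = {};

  // The actual behavior: what happens when the command is executed.
  abstract execute(resourceName: string): Promise<void>;
}

class StopWebsiteCommand extends CommandBase {
  dialogs = {
    beforeExecute: { kind: "confirmation" as const },  // ask before stopping
    failed: { kind: "failure" as const },               // offer a retry on failure
  };

  async execute(resourceName: string): Promise<void> {
    // Domain-specific behavior would go here (e.g., calling a management API).
    console.log(`stopping website ${resourceName}`);
  }
}
```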
The non-context-sensitive commands can follow a resource in multiple contexts. For example, commands associated with a website can be present in the website's blade (see
Before the command is initiated, there is no operation, which is the none state 801 in
The developer can specify whether or not constrained user interface elements (or dialogs) are to appear at each of the transition 811 through 814 for each command. Accordingly, when making a transition 811 through 814, the system can check to determine whether a dialog is to appear as part of the transition. For instance, such dialogs could ask users for confirmations, inform of progress, or inform of the result of an operation, all depending on which transition 811 through 814 is being made, and what the resource associated with the command is.
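By way of illustration only, the following TypeScript sketch models the lifecycle as a small state machine with a dialog check on each transition. The states beyond the none and pending states, and all names used, are assumptions made for illustration; dialog text is produced from a template plus the resource name, so the system needs no knowledge of the underlying operation.

```typescript
// Minimal sketch of a command lifecycle with a dialog check at each transition.

type CommandState = "none" | "pending" | "executing" | "succeeded" | "failed";

interface Transition {
  from: CommandState;
  to: CommandState;
}

// Returns dialog text if the developer registered a dialog for this transition.
type DialogCheck = (t: Transition, resourceName: string) => string | undefined;

function transition(
  current: CommandState,
  next: CommandState,
  resourceName: string,
  dialogFor: DialogCheck,
): CommandState {
  const t: Transition = { from: current, to: next };
  // On every transition the system asks whether a dialog was specified for
  // this point in the lifecycle; if so, it is shown before moving on.
  const dialogText = dialogFor(t, resourceName);
  if (dialogText !== undefined) {
    console.log(`dialog: ${dialogText}`);
  }
  return next;
}

// Example: a confirmation dialog only on the none -> pending transition.
const confirmStop: DialogCheck = (t, name) =>
  t.from === "none" && t.to === "pending"
    ? `Are you sure you want to stop '${name}'?`
    : undefined;

let state: CommandState = "none";
state = transition(state, "pending", "Wandering", confirmStop);   // shows the dialog
state = transition(state, "executing", "Wandering", confirmStop); // no dialog
```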
Immediately upon receiving the stop command, the stop command begins transitioning (as represented by transition 811) from the none state 801 to the pending state 802. However, as part of this transition, the system verifies whether the browser developer has indicated that a dialog is to appear at this point for this type of resource (e.g., website), and/or whether the web site developer has indicated that a dialog is to appear at this point for that particular resource (e.g., the “Wandering” website). The browser developer and/or the website author may also specify a dialog template in cases in which there are multiple templates that could be used for that transition and resource type.
Here, the system verifies that a dialog is to appear, and thus presents dialog 900. The dialog 900 may be generated knowing nothing more than which transition is involved (and potentially also a dialog template to use which may also be specified by the developer). The dialog 900 may then populate the dialog template using the name of the resource (e.g., “Wandering” website), and then present the dialog to the user. Thus, the presentation of dialogs may be consistent throughout the system regardless of the command being executed, or the resource being operated upon, even without the system knowing the specifics of the underlying operations that support the command.
These dialogs are data-driven and extremely constrained to provide a uniform user experience across applications. Dialogs include confirmation (with yes/no buttons), show progress (deterministic and non-deterministic), show success, and show failure (with a retry button). Application developers can configure the command to surface the dialogs at certain points in the lifecycle of the operation.
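By way of illustration only, the constrained dialog set could be represented as plain data along the lines of the following TypeScript sketch; the field names are assumptions, and the point is that developers supply data for one of the four dialog kinds rather than arbitrary user interface.

```typescript
// Sketch of how the constrained, data-driven dialogs could be modeled.
// Only the four kinds named above exist; developers supply data, never layout.

type Dialog =
  | { kind: "confirmation"; message: string }                 // yes / no buttons
  | { kind: "progress"; message: string; percent?: number }   // omit percent when non-deterministic
  | { kind: "success"; message: string }
  | { kind: "failure"; message: string; retry: boolean };     // failure may offer a retry button

const stopping: Dialog = { kind: "progress", message: "Stopping 'Wandering'..." };
const failed: Dialog = { kind: "failure", message: "Stop failed.", retry: true };
```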
As illustrated in
In some cases, the portal can provide abstractions that application developers can use to create intrinsic commands that the system will recognize (at least to the point of being able to track the state machine of
Commands are executed asynchronously by the system. Commands provided by application authors are executed leveraging the system's isolation model to ensure that they do not compromise the overall portal (as the execution is isolated within the application that owns the command).
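By way of illustration only, the following TypeScript sketch shows asynchronous execution with failures contained to the owning application; the actual isolation model (for example, separate frames or processes per application) is not specified here and is not depicted.

```typescript
// Sketch of asynchronous, contained execution of an extrinsic command.

async function runIsolated(
  appName: string,
  commandBody: () => Promise<void>,
): Promise<"succeeded" | "failed"> {
  try {
    await commandBody();   // awaited asynchronously; the user interface is never blocked
    return "succeeded";
  } catch (err) {
    // A faulty (not necessarily malicious) command fails alone; the portal keeps running.
    console.error(`command from '${appName}' failed:`, err);
    return "failed";
  }
}
```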
The application developer creates a small set of commands that are available in multiple contexts. All capabilities of those commands (execution logic, dialogs, and so forth) are preserved, and the user experience is adapted to the constraints of where the command is rendered. This makes it possible for users to interact with a resource at any place in the user interface. There is no single location where “actions” can be executed; rather, any place in the portal allows rich interactions with resources.
Accordingly, a system has been described that provides consistency in how commands are visualized, as well as how dialogs associated with the command lifecycle are visualized. This is true regardless of there being user interface elements of different applications within the system.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
This application claims the benefit of each of the following provisional patent applications, each of which is incorporated herein by reference in its entirety: 1. U.S. Provisional Application Ser. No. 61/905,114, filed Nov. 15, 2013; 2. U.S. Provisional Application Ser. No. 61/884,743, filed Sep. 30, 2013; 3. U.S. Provisional Application Ser. No. 61/905,111, filed Nov. 15, 2013; 4. U.S. Provisional Application Ser. No. 61/905,243, filed Nov. 17, 2013; 5. U.S. Provisional Application Ser. No. 61/905,116, filed Nov. 15, 2013; 6. U.S. Provisional Application Ser. No. 61/905,129, filed Nov. 15, 2013; 7. U.S. Provisional Application Ser. No. 61/905,105, filed Nov. 15, 2013; 8. U.S. Provisional Application Ser. No. 61/905,247, filed Nov. 17, 2013; 9. U.S. Provisional Application Ser. No. 61/905,101, filed Nov. 15, 2013; 10. U.S. Provisional Application Ser. No. 61/905,128, filed Nov. 15, 2013; and 11. U.S. Provisional Application Ser. No. 61/905,119, filed Nov. 15, 2013.