Computer systems are currently in wide use, as are development environments for developing such computer systems.
It is not uncommon for developers to use development environments (such as interactive development environments, or IDEs) to develop computer systems. An IDE may have a plurality of different designer elements that can be used by a developer to perform the development tasks.
One example of a scenario where developers use an IDE to perform development is in developing or modifying business systems. Business systems are often relatively large computer systems and can include, for instance, enterprise resource planning (ERP) systems, customer relationship management (CRM) systems, and line-of-business (LOB) systems, among a variety of others. Business systems are often developed, or manufactured, by a manufacturer who sells a base system which is often customized (and sometimes highly customized) to meet the needs of an individual organization, so that it can be deployed at that organization. Thus, developers may use an IDE not only to develop the base product, but also to perform development in customizing the base product to meet the needs of the end user organization. Such development is sometimes performed by independent software vendors (ISVs), partners, developers, or other parties.
In performing development tasks, a developer may find that the particular set of interactive development tools provided by the IDE is insufficient, inefficient, or otherwise not adequate for the developer on a given project. For a developer to write his or her own interactive tools, the developer may spend a relatively large amount of time and other resources generating code that may not necessarily be relevant to his or her development task.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A design time extension framework provides a set of application programming interfaces that are used by a developer to create extensions to the development environment.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Developer 110 illustratively does this to develop or modify business system 104 into modified business system 112. Modified business system 112 is illustratively modified to meet the needs of a given organization that is deploying modified business system 112. Therefore, once modified, modified business system 112 generates user interface displays 114 with user input mechanisms 116 that user 118 interacts with in order to control and manipulate modified business system 112. User 118 does this in order to perform the business tasks of the organization deploying modified business system 112.
It will be noted that, while architecture 100 is shown with IDE 102 being used to modify business system 104 to generate modified business system 112, a business system is only one example of a scenario where a developer may use IDE 102. A wide variety of other scenarios can be used as well. The business system scenario is described for the sake of example only.
In the example shown in
In the embodiment shown in
Design time functionality 138 illustratively provides the design time functionality that developer 110 can use in order to modify business system 104. In the embodiment shown in
As discussed in the background section, it may be that developer 110 finds the designer elements 146, 148 and 150, that already exist in IDE 102, to be insufficient for the development task at hand, for a variety of different reasons. Therefore, in one embodiment, developer 110 illustratively uses design time extension framework 136 to generate his or her own add-in designer elements 152-154.
In the embodiment shown, framework 136 illustratively includes add-in templates 166 and extension application programming interfaces (APIs) 168, which themselves include metadata API 170 and automation API 172, and can include other APIs 174. Framework 136 also illustratively includes other add-in generation functionality 176, automatic deployment component 178, add-in discovery component 180, and it can include other items 182 as well.
Developer 110 can illustratively author add-ins from scratch, or developer 110 can invoke add-in templates 166 which help facilitate the generation of add-ins. Metadata API 170 is illustratively a programming interface that enables creating and changing metadata elements in the file system of business system 104. Automation API 172 is illustratively a programming interface that enables creating and changing designer elements (such as creating add-in designer elements 152-154, or changing designer elements 146, 148 and 150) in IDE 102. Automatic deployment component 178 automatically deploys an add-in, once it has been created by developer 110, to design time functionality 138. Add-in discovery component 180 illustratively allows developer 110 to easily discover the various add-ins 152-154 that have been deployed to design time functionality 138.
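By way of illustration only, the two programming interfaces described above can be thought of as two small API surfaces: one facing the metadata of the system under development, and one facing the designer elements of the IDE itself. The following Java sketch is a hypothetical rendering of that split; every type and member name in it (DesignTimeMetadataApi, DesignTimeAutomationApi, MetadataElement, DesignerElement, and so on) is an assumption introduced for this example and does not correspond to the API of any particular product.

// Hypothetical sketch of the two extension APIs described above.
// All names are illustrative assumptions, not an actual product API.

/* Metadata API (cf. metadata API 170): creates and changes metadata
 * elements in the file system of the system under development. */
interface DesignTimeMetadataApi {
    MetadataElement createElement(String elementType, String name);
    MetadataElement getElement(String name);
    void saveElement(MetadataElement element);  // persist the change to the metadata store
    void deleteElement(String name);
}

/* Automation API (cf. automation API 172): creates and changes designer
 * elements, including add-in designer elements, in the IDE. */
interface DesignTimeAutomationApi {
    DesignerElement createDesignerElement(String displayName);
    DesignerElement getDesignerElement(String id);
    void registerAddIn(DesignerElement addIn);  // make the add-in known to the IDE
}

/* Minimal placeholder types used by the two interfaces above. */
interface MetadataElement {
    String getName();
    void setProperty(String key, String value);
}

interface DesignerElement {
    String getId();
    String getDisplayName();
}

In this sketch, the metadata API operates on the artifacts being developed, while the automation API operates on the IDE's own designer surface, mirroring the separation between business system 104 and IDE 102 described above.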
Developer 110 then provides inputs to design time extension framework 136 indicating that developer 110 wishes to access framework 136 to generate or modify an add-in. This is indicated by block 200 in
IDE 102 then generates a developer interface display that allows developer 110 to indicate whether developer 110 wishes to use an add-in template 166 or to simply use add-in generation functionality 176 to generate the add-in. If developer 110 does decide to use a template, then framework 136 displays the template for use by developer 110. This is indicated by blocks 208 and 210 in
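Purely as a hypothetical illustration of what such a template might hand to developer 110 as a starting point, a template-generated add-in skeleton could resemble the following Java sketch. The DesignerAddIn annotation and the AddIn base class are assumptions declared here only to keep the sketch self-contained; they are not part of any actual template.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/* Hypothetical supporting types a template project might reference. */
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface DesignerAddIn {
    String displayName();   // shown in the designer's context menu
    String appliesTo();     // designer context the add-in is relevant to
}

abstract class AddIn {
    /* Called when the developer selects the add-in from a designer context menu. */
    public abstract void onExecute(Object designerContext);
}

/* Hypothetical skeleton that the template generates as a starting point. */
@DesignerAddIn(displayName = "My designer add-in", appliesTo = "FormDesigner")
class MyDesignerAddIn extends AddIn {
    @Override
    public void onExecute(Object designerContext) {
        // TODO: call the metadata and automation APIs to implement the new tool.
    }
}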
Regardless of whether developer 110 uses an add-in template 166, developer 110 then provides inputs, such as through automation API 172, to create or modify an add-in designer element 152-154, or to modify another designer element. This is indicated by block 212 in
At some point, developer 110 will be finished designing the add-in. This is indicated by block 228. Developer 110 will then provide an input indicating this. Automatic deployment component 178 then automatically deploys the add-in to the design time functionality 138 of IDE 102. This is indicated by block 230 in
Automatic deployment component 178 illustratively makes the newly created add-in available for selection from context menus in design time functionality 138 of IDE 102. This is indicated by block 134. Automatic deployment component 178 can perform a variety of other tasks as well. This is indicated by block 236.
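Under the assumption that deployment here amounts to registering the new add-in under the designer context it targets, a minimal Java sketch of such a deployment component might look like the following; the class, method, and context names are illustrative only.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/* Hypothetical sketch of an automatic deployment component (cf. automatic
 * deployment component 178). Everything here is an illustrative assumption. */
class AddInDeploymentComponent {

    /* Context-menu entries, keyed by the designer context an add-in targets. */
    private final Map<String, List<String>> contextMenus = new HashMap<>();

    /* "Deploys" a newly created add-in by making it selectable from the
     * context menus of the designer context it declares. */
    void deploy(String addInDisplayName, String targetDesignerContext) {
        contextMenus
            .computeIfAbsent(targetDesignerContext, key -> new ArrayList<>())
            .add(addInDisplayName);
    }

    /* Returns the menu entries to show for a given designer context. */
    List<String> menuEntriesFor(String designerContext) {
        return contextMenus.getOrDefault(designerContext, List.of());
    }
}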
Once the add-in has been authored or generated by developer 110 and deployed in design time functionality 138, it can then be used by developer 110 (or a different developer) to perform development tasks.
IDE 102 then receives inputs from developer 110 in which developer 110 seeks to discover designer elements or add-in designer elements, given the developer's current development context. This is indicated by block 250 in
Design time functionality 138 then displays related designer elements and add-in designer elements that can be selected by developer 110, for use in performing his or her development tasks. This is indicated by block 252.
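As one hypothetical way to realize this context-sensitive discovery (and the attribute-based discovery mentioned later in this description), the discovery component might scan the deployed add-in classes for the DesignerAddIn annotation assumed in the template sketch above and keep only those whose declared context matches the developer's current context. The Java below is a sketch under that assumption only.

import java.util.List;
import java.util.stream.Collectors;

/* Hypothetical sketch of attribute-based add-in discovery (cf. add-in
 * discovery component 180), reusing the DesignerAddIn annotation assumed
 * in the earlier template sketch. */
class AddInDiscoveryComponent {

    /* Returns the display names of deployed add-ins whose declared designer
     * context matches the developer's current context. */
    List<String> discover(List<Class<?>> deployedAddInClasses, String currentContext) {
        return deployedAddInClasses.stream()
            .map(addInClass -> addInClass.getAnnotation(DesignerAddIn.class))
            .filter(annotation -> annotation != null
                    && annotation.appliesTo().equals(currentContext))
            .map(DesignerAddIn::displayName)
            .collect(Collectors.toList());
    }
}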
As shown in
Referring again to
When this happens, add-in factory 156 illustratively generates an instance of the selected add-in. This is indicated by block 276 in
The add-in instance is illustratively configured to use the metadata API 170 to modify metadata (e.g., the metadata that is being developed) according to the functionality designed into the add-in. Thus, when the user provides inputs manipulating the add-in, the add-in accesses the metadata API to modify the metadata, based on those inputs. This is indicated by block 280.
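A minimal sketch of such an add-in instance, reusing the hypothetical DesignTimeMetadataApi and MetadataElement types assumed earlier, might look like the following in Java. The constructor call stands in for the add-in factory creating the instance, and the relabeling operation is only one illustrative example of modifying metadata based on developer inputs.

/* Hypothetical sketch of an add-in instance produced by the add-in factory,
 * using the metadata API assumed earlier to modify the metadata under
 * development. All names are illustrative assumptions. */
class RelabelElementAddIn {

    private final DesignTimeMetadataApi metadataApi;

    RelabelElementAddIn(DesignTimeMetadataApi metadataApi) {
        this.metadataApi = metadataApi;  // instance wired up with the metadata API
    }

    /* Invoked when the developer manipulates the add-in, for example by
     * confirming a new label in the add-in's dialog. */
    void execute(String elementName, String newLabel) {
        MetadataElement element = metadataApi.getElement(elementName);
        element.setProperty("Label", newLabel);  // change the metadata
        metadataApi.saveElement(element);        // persist the change
    }
}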
It will be understood that the class diagrams shown in
It can thus be seen that the present description provides a framework that allows a developer to advantageously generate new tools as add-ins to a development environment. It describes the use of add-in project templates, automatic deployment and discovery of add-ins, a metadata API and a designer automation API. These advantageously allow the developer to quickly develop and deploy add-ins to implement new design tools. They also allow the developer to perform attribute-based discovery of the add-ins. Add-in action menus are automatically created on the fly, based on how relevant a given add-in is to the designer context. This enables the system to quickly surface relevant add-ins for use in the development environment. The templates enhance both the design experience and the performance of the development environment. Because the templates are surfaced, the developer is not developing the add-ins from scratch, which can reduce the processing and memory overhead used by the system. The framework is also independent of the particular development environment in which it is deployed. This enhances the flexibility of the framework.
The present discussion has mentioned processors and servers. In one embodiment, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong, and are activated by, and facilitate the functionality of, the other components or items in those systems.
Also, a number of user interface displays have been discussed. They can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. They can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). They can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. They can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which they are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.
A number of data stores have also been discussed. It will be noted they can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.
Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.
The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.
A public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free up the end users from managing the hardware. A private cloud may be managed by the organization itself and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as installations and repairs, etc.
In the embodiment shown in
It will also be noted that architecture 100, or portions of it, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation,
The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only,
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The drives and their associated computer storage media discussed above and illustrated in
A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.
The computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections depicted in
When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
It should also be noted that the different embodiments described herein can be combined in different ways. That is, parts of one or more embodiments can be combined with parts of one or more other embodiments. All of this is contemplated herein.
Example 1 is a computing system, comprising:
an add-in generation component, in a development system, configured to generate an add-in creation user interface with creation user input mechanisms that are actuated to generate add-in functionality for an add-in designer element; and
an add-in deployment component configured to deploy the add-in designer element in the development system.
Example 2 is the computing system of any or all previous examples wherein the add-in deployment component is configured to expose an automation application programming interface invoked by the add-in generation component to generate the add-in designer element in the development system.
Example 3 is the computing system of any or all previous examples wherein the automation application programming interface is configured to be invoked by the add-in generation component to modify an existing designer element in the development system to generate the add-in designer element.
Example 4 is the computing system of any or all previous examples wherein the add-in deployment component is configured to expose a metadata application programming interface that is invoked by the add-in designer element to modify metadata defining objects under development.
Example 5 is the computing system of any or all previous examples wherein the add-in generation component is configured to access an add-in template and to generate the add-in creation user interface based on the add-in template.
Example 6 is the computing system of any or all previous examples wherein the add-in generation component is configured to generate a template selection user input mechanism that is actuated to select the add-in template from a plurality of different add-in templates.
Example 7 is the computing system of any or all previous examples and further comprising:
an add-in discovery component configured to identify a context in the development system being accessed by a user and, in response, generate an add-in selection user interface display with an add-in selection user input mechanism that is actuated to select the add-in designer element.
Example 8 is the computing system of any or all previous examples and further comprising:
an add-in factory that generates an instance of the add-in designer element for user interaction in response to actuation of the add-in selection user input mechanism.
Example 9 is a computing system in a developer environment, comprising:
a metadata visualization component configured to generate a visualization of metadata for a system under development, based on a context of the developer system being accessed by a user;
an add-in discovery component configured to identify the context in the developer environment that is being accessed and to generate an add-in user interface display, based on the context, with an add-in selection user input mechanism that is actuated to select an add-in designer element in the developer environment; and
an add-in factory configured to generate an instance of the selected add-in designer element in the developer environment, the instance having properties that define functionality for modifying the metadata.
Example 10 is the computing system of any or all previous examples wherein the selected add-in designer element instance is configured to invoke a metadata application programming interface to modify the metadata based on the functionality defined in the selected add-in designer element instance.
Example 11 is the computing system of any or all previous examples wherein the metadata visualization component is configured to generate a visualization of metadata for a business system under development, the metadata defining objects in the business system.
Example 12 is the computing system of any or all previous examples wherein the add-in discovery component is configured to generate the add-in user interface display by identifying add-in designer elements relevant to the context and to generate context menus for the relevant add-in designer elements.
Example 13 is a method, comprising:
displaying an add-in creation user interface, in a development system, with creation user input mechanisms;
receiving actuation of a creation user input mechanism;
in response to the received actuation, generating add-in functionality for an add-in designer element; and
deploying the add-in designer element in the development system.
Example 14 is the method of any or all previous examples and further comprising:
exposing an automation application programming interface; and
invoking the automation application programming interface to generate the add-in functionality of the add-in designer element in the development system.
Example 15 is the method of any or all previous examples wherein generating add-in functionality comprises:
modifying an existing designer element in the development system to generate the add-in functionality to obtain the add-in designer element.
Example 16 is the method of any or all previous examples wherein deploying the add-in designer element comprises:
exposing a metadata application programming interface; and
invoking the metadata application programming interface to modify metadata defining objects under development, in the development system.
Example 17 is the method of any or all previous examples wherein displaying an add-in creation user interface comprises:
accessing an add-in template; and
generating the add-in creation user interface based on the add-in template.
Example 18 is the method of any or all previous examples wherein accessing the add-in template comprises:
generating a template selection user input mechanism; and
receiving actuation of the template selection user input mechanism to select the add-in template from a plurality of different add-in templates.
Example 19 is the method of any or all previous examples and further comprising:
identifying a context in the development system being accessed by a user; and
generating an add-in selection user interface display with an add-in selection user input mechanism that is actuated to select the add-in designer element, based on the identified context.
Example 20 is the method of any or all previous examples further comprising:
receiving actuation of the add-in selection user input mechanism; and
instantiating an instance of the add-in designer element for user interaction in response to actuation of the add-in selection user input mechanism.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
The present application is based on and claims the benefit of U.S. provisional patent application Ser. No. 62/004,450, filed May 29, 2014, the content of which is hereby incorporated by reference in its entirety.