Computer systems are currently in wide use. They are deployed in many different types of environments.
One type of computer system is a business computer system. Business systems include customer relationship management (CRM) systems, enterprise resource planning (ERP) systems, line-of-business (LOB) systems, etc. These types of systems often include business data that is stored as entities or other business data records. Such business data records (or entities) often include records that are used to describe various aspects of a business. For instance, they can include customer entities that describe and identify customers, vendor entities that describe and identify vendors, sales entities that describe particular sales, quote entities, order entities, inventory entities, etc. The business systems also commonly include process functionality that facilitates performing various business processes or tasks on the data. Users log into the business system in order to perform business tasks for conducting the business.
Such business systems also currently include roles. Users are assigned one or more roles based upon the types of tasks they are to perform for the business. The roles can include certain security permissions, and they can also provide access to different types of data records (or entities), based upon a given role.
Business systems can also be very large. They contain a great number of data records (or entities) that can be displayed or manipulated through the use of thousands of different forms. Therefore, visualizing the data in a meaningful way can be very difficult. This problem is exacerbated when a user has more than one role, or when a user has a given role that is responsible for a wide variety of different types of business tasks. It can be very cumbersome and time consuming for a user to navigate through various portions of a business system in order to view data or other information that is useful to that particular user, in that particular role.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A role-based dashboard display is generated, showing a plurality of different display sections that display information from a computer system. The display sections include a customer-branded section, a favorites section, a workspace display section and a live data feed section. The sections have display elements linked to underlying data.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Visualization component 114 illustratively generates various visualizations, or views, of the data and processes (or workflows) stored in business data store 108. Visualizations can include, for example, one or more dashboard displays 126, a plurality of different workspace displays 128, a plurality of different list page displays 129, a plurality of different entity hub displays 130, and other displays 132.
Dashboard display 126 is illustratively an overview of the various data and workflows in business system 100. It illustratively provides a plurality of different links to different places within the applications comprising business system 100. Dashboard display 126 illustratively includes a plurality of different display sections that each include a variety of different display elements. For instance, dashboard display 126 can include an end-customer-branded section that includes a customer logo, for instance, or other customer branding display elements. It can also include a workspace section that includes a combination of workspace display elements that can be manipulated by the user. Further, it can include a newsfeed and notification section that shows a running stream of information about work that the user has been assigned, or that the user wishes to be notified of, along with related company news (both internal and external) in a newsfeed. Dashboard display 126 can also present a highly personalized experience. Dashboard 126 is described in greater detail below with respect to
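By way of illustration, and not limitation, one way to model the dashboard structure just described is sketched below in TypeScript. The interface and type names (DashboardDisplay, DashboardSection, DisplayElement, buildDashboard) and the sample values are hypothetical assumptions made for the example only; they merely show how display sections can carry display elements that link back to underlying data.

```typescript
// Hypothetical sketch of a role-based dashboard made of typed sections,
// each holding display elements linked to underlying data.
interface DisplayElement {
  id: string;
  label: string;
  // Link back to the underlying entity or workflow the element represents.
  dataLink: { entityType: string; entityId: string };
}

type SectionKind = "customerBranded" | "favorites" | "workspace" | "newsfeedAndNotifications";

interface DashboardSection {
  kind: SectionKind;
  elements: DisplayElement[];
}

interface DashboardDisplay {
  userId: string;
  roles: string[]; // roles assigned to the user in the business system
  sections: DashboardSection[];
}

// Assemble a dashboard for a user from the sections relevant to his or her role(s).
function buildDashboard(userId: string, roles: string[], sections: DashboardSection[]): DashboardDisplay {
  return { userId, roles, sections };
}

const dashboard = buildDashboard("user-106", ["administrator"], [
  { kind: "customerBranded", elements: [] },
  {
    kind: "workspace",
    elements: [{ id: "ws-security", label: "Security", dataLink: { entityType: "workspace", entityId: "security" } }],
  },
]);
console.log(dashboard.sections.length); // 2
```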
Workspace display 128 is illustratively a customizable, activity-oriented display that provides user 106 with visibility into the different work (tasks, activities, data, etc.) performed by user 106 in executing his or her job. The workspace display 128 illustratively consolidates information from several different areas in business system 100 (e.g., in one or more business applications that execute the functionality of business system 100) and presents it in an organized way for visualization by user 106.
List page display 129 is illustratively a page that breaks related items out into their individual rows. Other displays 126, 128 and 130 illustratively have user actuatable links that summarize related information, but can be actuated to navigate the user to a list page display 129 that has the related information broken out. For example, whereas a workspace display 128 may have multiple individual elements (such as tiles or lists or charts, etc.) that summarize the related information, the corresponding list page 129 breaks the summarized information out into individual rows. A workspace display 128 can also have multiple elements that each point to a different list page display 129.
Entity hub display 130 is illustratively a display that shows a great deal of information about a single data record (such as a single entity 118 or other data record 120, which may be a vendor record, a customer record, an employee record, etc.). The entity hub display 130 illustratively includes a plurality of different sections of information, with each section designed to present its information in a way suited to the type of information it contains (such as a data field, a list, etc.).
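By way of illustration only, the sketch below shows one hypothetical way an entity hub could be modeled as a collection of sections, each presenting information about a single record in a form suited to that information. The names (EntityHub, HubSection) and the sample data are illustrative assumptions, not part of the system described above.

```typescript
// Hypothetical model of an entity hub: many sections describing one record.
interface HubSection {
  title: string;
  kind: "fields" | "list"; // each section presents its data in a suitable form
  content: Record<string, string> | string[];
}

interface EntityHub {
  entityId: string;
  entityType: "vendor" | "customer" | "employee";
  sections: HubSection[];
}

const customerHub: EntityHub = {
  entityId: "cust-001",
  entityType: "customer",
  sections: [
    { title: "General", kind: "fields", content: { name: "Contoso Ltd.", phone: "555-0100" } },
    { title: "Recent orders", kind: "list", content: ["SO-1041", "SO-1042"] },
  ],
};
console.log(customerHub.sections[1].content); // ["SO-1041", "SO-1042"]
```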
Business process component 110 illustratively accesses and facilitates the functionality of the various workflows 124 that are performed in business system 100. It can access the various data (such as entities 118 and business records 120) stored in data store 108 in facilitating this functionality as well.
Display customization component 116 illustratively allows user 106 to customize the displays that user 106 has access to in business system 100. For instance, display customization component 116 can provide functionality that allows user 106 to customize the dashboard display 126 or one or more of the workspace displays 128 that user 106 has access to in system 100.
Processor 112 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). It is illustratively a functional part of business system 100 and is activated by, and facilitates the functionality of, other components or items in business system 100.
Data store 108 is shown as a single data store, and is local to system 100. It should be noted, however, that it can be multiple different data stores as well. Also, one or more data stores can be remote from system 100, or local to system 100, or some can be local while others are remote.
User input mechanisms 104 can take a wide variety of different forms. For instance, they can be text boxes, active tiles that dynamically display parts of the underlying information, check boxes, icons, links, drop down menus, or other input mechanisms. In addition, they can be actuated by user 106 in a variety of different ways as well. For instance, they can be actuated using a point and click device (such as a mouse or trackball), using a soft or hard keyboard, a thumb pad, a keypad, various buttons, a joystick, etc. In addition, where the device on which the user interface displays are displayed has a touch sensitive screen, they can be actuated using touch gestures (such as with the user's finger, a stylus, etc.). Further, where the device or system includes speech recognition components, they can be actuated using voice commands.
It will also be noted that multiple blocks are shown in
In one embodiment, each user 106 is assigned a role 122, based upon the types of activities or tasks that the given user 106 will perform in business system 100. Thus, in one embodiment, dashboard display 126 is generated to provide information related to the role of a given user 106. That is, user 106 is provided with different information on a corresponding dashboard display 126, based upon the particular role or roles that are assigned to user 106 in business system 100. In this way, user 106 is presented with a visualization of information that is highly relevant to the job being performed by user 106 in business system 100.
In addition, some types of roles 122 may have multiple corresponding workspace displays 128 generated for them. By way of example, assume that user 106 is assigned an administrator's role in business system 100. In that case, user 106 may be provided with access to multiple different workspace displays 128. A workspace display 128 may show information for a security workspace. The security workspace may include information related to security features of business system 100, such as access, permissions granted in system 100, security violations in system 100, authentication issues related to system 100, etc. User 106 (being in an administrative role) may also have access to a workspace display 128 corresponding to a workspace that includes information about the health of system 100. This workspace display 128 may include information related to the performance of system 100, the memory usage and speed of system 100, etc. Thus, a given user 106 that has only a single role 122 may have access to multiple different workspace displays 128.
Similarly, a given user 106 may have multiple different roles 122. By way of example, assume that a given user 106 is responsible for both the human resources tasks related to business system 100, and the payroll tasks. In that case, the given user 106 may have a human resources role 122 and a payroll role 122. Thus, user 106 may have access to one or more workspace displays 128 for each role 122 assigned to user 106 in business system 100. In this way, when user 106 is performing the human resources tasks, user 106 can access the human resources workspace display 128, through dashboard display 126, which will contain a set of information that user 106 believes is relevant to the human resources role and the human resources tasks. Then, when user 106 is performing the payroll tasks in system 100, user 106 can access one or more payroll workspace displays 128, through dashboard 126, which contain the information that user 106 believes is relevant to the payroll tasks and role. In this way, the user need not have just a single display with all the information related to both the payroll tasks and the human resources tasks combined, which can be confusing and cumbersome to work with. Instead, the user 106 can illustratively have workspace display elements on the dashboard display 126, each workspace display element corresponding to a different workspace display. When the user actuates one of the workspace display elements, the user can then be navigated to the corresponding workspace display 128.
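By way of illustration, and not limitation, the sketch below shows one hypothetical way the role-to-workspace relationship described above could be represented: a user with multiple roles sees the union of the workspaces for each role. The role and workspace names are assumptions made for the example only.

```typescript
// Hypothetical mapping from roles to the workspace displays they expose.
const workspacesByRole: Record<string, string[]> = {
  administrator: ["security", "systemHealth"],
  humanResources: ["hrTasks"],
  payroll: ["payrollRuns", "payrollApprovals"],
};

// A user with multiple roles has access to the union of the workspaces for each role.
function workspacesForUser(roles: string[]): string[] {
  const result = new Set<string>();
  for (const role of roles) {
    for (const workspace of workspacesByRole[role] ?? []) {
      result.add(workspace);
    }
  }
  return Array.from(result);
}

console.log(workspacesForUser(["humanResources", "payroll"]));
// ["hrTasks", "payrollRuns", "payrollApprovals"]
```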
This can include a wide variety of different things. For instance, user 106 can provide authentication information 152 (such as a username and password) or a role 154 (or the role can be automatically accessed within system 100 once the user provides authentication information 152). In addition, user 106 can provide other information 156 as well.
In response, visualization component 114 illustratively generates a dashboard display 126 that is specific to the given user 106, having the assigned role. Displaying the user's dashboard display 126 is indicated by block 158 in
Dashboard display 126 also illustratively includes a workspace display section 166 that includes a plurality of workspace display elements 168. Each workspace display element 168 illustratively represents a different workspace display that, itself, shows information for a workspace in the business system 100 that is relevant to user 106. As will be described below with respect to
Dashboard display 126 also illustratively includes a notifications section 170 and a newsfeed section 172. It will be noted that sections 170 and 172 can be either separate sections, or combined into a single section. In one embodiment, notifications section 170 illustratively includes a set of notification elements each corresponding to a notification that can be customized by user 106. Therefore, user 106 can add items that the user wishes to be notified of, into section 170. Newsfeed section 172 illustratively includes links to news from a plurality of different sources. The sources can be multiple internal sources, or external sources, or a combination of internal and external sources. For instance, the newsfeed section 172 can include links to news on a social network, on an internal company network, news identified from external news sites, etc. In one embodiment, when the user actuates one of the newsfeed items in section 172, the user is navigated to the underlying news story.
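By way of a non-limiting illustration, the following sketch shows one hypothetical way newsfeed items from internal and external sources could be merged into a single feed ordered by time. The item shape, source names, and sample URLs are assumptions for the example only.

```typescript
// Hypothetical newsfeed item drawn from an internal or external source.
interface NewsfeedItem {
  title: string;
  url: string; // actuating the item navigates to the underlying story
  source: "internalNetwork" | "socialNetwork" | "externalNewsSite";
  publishedAt: Date;
}

// Merge items from several sources into one feed, newest first.
function mergeNewsfeeds(...feeds: NewsfeedItem[][]): NewsfeedItem[] {
  return feeds.flat().sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
}

const feed = mergeNewsfeeds(
  [{ title: "Quarterly results posted", url: "https://intranet.example.com/news/1", source: "internalNetwork", publishedAt: new Date("2014-03-01") }],
  [{ title: "Industry outlook", url: "https://news.example.com/outlook", source: "externalNewsSite", publishedAt: new Date("2014-03-02") }],
);
console.log(feed[0].title); // "Industry outlook"
```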
In one embodiment, the initial display of dashboard display 126 can be dynamic. For instance, when the user first requests access to dashboard display 126, visualization component 114 can begin by displaying section 160. Thus, the user can see a company logo display 176, one or more different images 178, or a variety of other end-customer branding information, or even personalized information, such as the user's name, the user's role or roles, along with the date, time, or other information. However, as visualization component 114 loads data into dashboard display 126 (after several seconds, for instance), visualization component 114 can illustratively change the display, such as by pushing sections 160 and 162 off the screen to the left, and stopping on workspace display section 166. Thus, once visualization component 114 has loaded all of the data into dashboard display 126, the final landing page for user 106 may or may not be section 160. For instance, workspace display section 166 can be the first fully-viewable section that is presented to the user once loading completes. In one embodiment, the user can adjust the final landing page so that the particular sections of dashboard display 126 that are shown on display screen 176, once dashboard display 126 is fully loaded, can be selected by the user. In another embodiment, the final landing page display is predetermined.
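By way of illustration only, the sketch below models the dynamic initial display described above: one section is shown while the data loads, and the display then settles on a landing section that may be user-selected or predetermined. The option names and the simulated load are hypothetical.

```typescript
type SectionId = "branding" | "favorites" | "workspaces" | "newsfeed";

interface DashboardLoadOptions {
  initialSection: SectionId; // section shown immediately while data loads
  landingSection: SectionId; // section the display settles on once loading completes
}

// Show the initial section, load the dashboard data, then settle on the landing section.
async function openDashboard(opts: DashboardLoadOptions, loadData: () => Promise<void>): Promise<SectionId> {
  console.log(`Showing ${opts.initialSection} section while data loads...`);
  await loadData();
  return opts.landingSection;
}

openDashboard(
  { initialSection: "branding", landingSection: "workspaces" },
  () => new Promise((resolve) => setTimeout(resolve, 100)), // stand-in for loading dashboard data
).then((section) => console.log(`Landed on the ${section} section`));
```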
Referring again to the flow diagram of
Visualization component 114 then illustratively receives a user input indicating a user interaction with some portion of dashboard display 126. This is indicated by block 192 in the flow diagram of
Once the user has provided an input to interact with display 126, visualization component 114 illustratively performs an action based on the user input. This is indicated by block 204 in
It should also be noted that the particular items displayed in each of sections 160, 162, 166, 170 and 172 can be customized as well. For instance, in one embodiment, user 106 can navigate to a specific place in the application or applications which are run in business system 100 and “pin” or otherwise select items to be displayed as the user interface elements in each of the sections on dashboard 126. Modifying the particular elements displayed on each individual workspace display element 168 is described in more detail below with respect to
By way of example, assume that display screen 176 is a touch sensitive display screen. Then, if user 106 touches item 210, visualization component 114 toggles through the visual representations of workspace display cards 168 to a next visual representation.
In one embodiment, user 106 can again actuate item 210 to toggle to yet a different visual representation of workspace display cards 168. For instance, if user 106 toggles item 210 again, the user interface display elements corresponding to each of the workspaces can be displayed as list items within a list.
In the user interface display shown in
In another embodiment, user 106 can toggle item 210 to have visualization component 114 display the user interface display elements in section 166 in yet a different representation.
Also, while a number of visual representations have been discussed, others can be displayed as well. For instance, all workspace display cards 168 can be displayed in small representations or in other representations.
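By way of illustration, and not limitation, the toggling behavior described above can be sketched as a simple cycle through a set of visual representations, advancing one representation per actuation of the toggle item. The representation names (largeCards, smallCards, listItems) are hypothetical.

```typescript
// Hypothetical set of visual representations the workspace section can cycle through.
const representations = ["largeCards", "smallCards", "listItems"] as const;
type Representation = (typeof representations)[number];

// Each actuation of the toggle item advances to the next representation, wrapping around.
function nextRepresentation(current: Representation): Representation {
  const index = representations.indexOf(current);
  return representations[(index + 1) % representations.length];
}

let view: Representation = "largeCards";
view = nextRepresentation(view); // "smallCards"
view = nextRepresentation(view); // "listItems"
view = nextRepresentation(view); // wraps back to "largeCards"
console.log(view);
```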
Each workspace display card 240-248 also includes an additional information section 302-310, respectively. The particular visual display elements displayed on additional information sections 302-310 can vary widely. They are also illustratively selectively placed there by user 106. By way of example, the display elements can include active or dynamic tiles, lists, activity feeds, charts, quick links, images, label/value pairs, calendars, maps, other cards, or other information. By way of example, additional information section 302 in card 240 illustratively includes three different tiles 312, two of which are relatively small and one of which is relatively larger. Each tile 312 is illustratively a dynamic tile, so that it displays information corresponding to an underlying data record or process. As the underlying data or process changes, the information on the dynamic tile 312 changes as well.
Additional information section 302 also illustratively includes a chart 314. Again, the chart is illustratively dynamic, so that as the underlying data it represents changes, the display of chart 314 changes as well. In addition, each of the display elements 312-314 in section 302 can be a user actuatable display element. Therefore, when the user actuates one of those elements (such as by tapping it or clicking on it), visualization component 114 navigates the user to a more detailed display of the underlying information or process. In one example, the entire workspace display card is a user actuatable element as well. Therefore, if the user actuates it (such as by tapping it or by clicking on it) anywhere on the display card, the user is navigated to a more detailed display of the actual workspace that is represented by the corresponding workspace display card. This is described in greater detail below with respect to
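By way of a non-limiting illustration, the sketch below shows one hypothetical way a dynamic tile or chart could stay in sync with its underlying data: the display element subscribes to the data it represents and re-renders when that data changes. The ObservableValue class and the sample tile are assumptions for the example only.

```typescript
// Minimal observable value a dynamic display element can subscribe to.
type Listener<T> = (value: T) => void;

class ObservableValue<T> {
  private listeners: Listener<T>[] = [];
  constructor(private value: T) {}
  get(): T {
    return this.value;
  }
  set(next: T): void {
    this.value = next;
    this.listeners.forEach((listener) => listener(next)); // notify every bound display element
  }
  subscribe(listener: Listener<T>): void {
    this.listeners.push(listener);
  }
}

// The tile subscribes to the underlying data; when the data changes, its display updates.
const openOrders = new ObservableValue(12);
let tileText = `Open orders: ${openOrders.get()}`;
openOrders.subscribe((count) => {
  tileText = `Open orders: ${count}`;
});

openOrders.set(15);
console.log(tileText); // "Open orders: 15"
```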
In one embodiment, the particular information that shows up on the various visual representations of workspace display elements shown in section 166 on dashboard display 126 can be customized by user 106. That is, user 106 can select items that will be displayed on the various visual representations of the workspace display cards and list items discussed above.
It is first assumed that user 106 provides inputs to system 100 so that visualization component 114 generates a workspace display 128, for a given workspace. In one embodiment, the user can simply actuate one of the workspace display cards or list items on dashboard 126. This is indicated by block 350 shown in
Section 354 illustratively shows the information that is displayed on the corresponding display card 240 on the dashboard display 126. In the embodiment shown in
Once the workspace display 128 is displayed, the user can illustratively customize the information that appears on the corresponding display card 240 on dashboard display 126, by choosing the items that appear in section 354 on workspace display 128. In one example, the user can simply move items from sections 356, 358 and 360 into section 354, and position them within section 354 as desired. In response, customization component 116 customizes the corresponding workspace display card 240 so that it shows the information placed on section 354 by user 106.
By way of example, user 106 can illustratively select tile 370 (indicated by the dashed line around tile 370) and move it to a desired position in section 354, as indicated by arrow 372. This can be done using a drag and drop operation, or a wide variety of other user inputs as well. Once the user has done this, when the user returns to dashboard display 126, tile 370 will appear on the corresponding card 240, as shown in section 354.
The user can illustratively remove items from card 240 by again going to the workspace display 128 and removing those items from section 354, and placing them back in one of the other sections 356-360, or by simply deleting them, in which case they will no longer appear on workspace 128 or card 240. In addition, user 106 can place other items on the corresponding workspace display card 240 by moving them from the corresponding sections 356-360, into section 354. They will appear on card 240, where the user places them in section 354, when the user navigates back to dashboard display 126.
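By way of illustration only, the customization just described can be sketched as moving an element from one of the other workspace sections into the section that is mirrored on the dashboard card. The layout shape and element names below are hypothetical assumptions for the example.

```typescript
// Hypothetical workspace layout: one section is mirrored on the dashboard card,
// while the remaining sections hold elements not shown on the card.
interface WorkspaceLayout {
  cardSection: string[];     // elements echoed onto the corresponding dashboard card
  otherSections: string[][]; // elements available elsewhere on the workspace display
}

// Pin an element from another section onto the card section at the drop position.
function pinToCard(layout: WorkspaceLayout, sectionIndex: number, element: string, position: number): WorkspaceLayout {
  const source = layout.otherSections[sectionIndex].filter((item) => item !== element);
  const card = [...layout.cardSection];
  card.splice(position, 0, element);
  const otherSections = layout.otherSections.map((section, i) => (i === sectionIndex ? source : section));
  return { cardSection: card, otherSections };
}

const before: WorkspaceLayout = {
  cardSection: ["salesChart"],
  otherSections: [["overdueInvoicesTile"], []],
};
const after = pinToCard(before, 0, "overdueInvoicesTile", 1);
console.log(after.cardSection); // ["salesChart", "overdueInvoicesTile"]
```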
Returning again to the flow diagram of
The present discussion has mentioned processors and/or servers. In one embodiment, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong and are activated by, and facilitate the functionality of, the other components or items in those systems.
Also, a number of user interface displays have been discussed. They can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. They can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). They can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. They can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which they are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.
A number of data stores have also been discussed. It will be noted they can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.
Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.
The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.
A public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free the end users from managing the hardware. A private cloud may be managed by the organization itself, and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as performing installations and repairs.
In the embodiment shown in
It will also be noted that system 100, or portions of it, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.
Under other embodiments, applications or systems are received on a removable Secure Digital (SD) card that is connected to an SD card interface 15. SD card interface 15 and communication links 13 communicate with a processor 17 (which can also embody processor 112 from
I/O components 23, in one embodiment, are provided to facilitate input and output operations. I/O components 23 for various embodiments of the device 16 can include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches, as well as output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well.
Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17.
Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 17 can be activated by other components to facilitate their functionality as well.
Examples of the network settings 31 include things such as proxy information, Internet connection information, and mappings. Application configuration settings 35 include settings that tailor the application for a specific enterprise or user. Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, SMS parameters, connection user names and passwords.
Applications 33 can be applications that have previously been stored on the device 16 or applications that are installed during use, although these can be part of operating system 29, or hosted external to device 16, as well.
The mobile device of
Note that other forms of the devices 16 are possible.
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation,
The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only,
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The drives and their associated computer storage media discussed above and illustrated in
A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the visual display 891, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.
The computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections depicted in
When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
It should also be noted that the different embodiments described herein can be combined in different ways. That is, parts of one or more embodiments can be combined with parts of one or more other embodiments. All of this is contemplated herein.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.