Prior software tools have attempted to provide users with the ability to plan and track progress of projects. For example, prior spreadsheet tools have allowed users to enter information about the progress of a project, such as by entering an expected completion date for a task, entering a list of workers that will be responsible for completing a task, or adding a notation when a particular task has been completed. Such tools can be shared by multiple users of computers on a network, with users being able to access and edit shared documents to reflect and track progress and plan future tasks.
Yet, prior software tools have also created technological problems of their own. For example, if 50 users involved in a project have access to a spreadsheet being used to track progress of the project, they may all wish at different times to update the spreadsheet to reflect their own progress or to access the spreadsheet to gain knowledge about the project from the updates of other users. This creates several technical problems, such as how to provide updates to users without locking the document to editing, and how to avoid consumption of valuable computing resources as users constantly access the spreadsheet to check for updates. It is also a technical and computational challenge to resolve updates of different users to avoid inconsistency and logical contradictions within the information that is presented. If inconsistencies and contradictions become apparent, or if a project simply does not seem to be progressing as expected, even more computing resources may be consumed as users continually check for progress or send communications to each other about a stalled project or a contradictory project plan. The problem further demands technical and computational solutions when the number of information items in the history and in the ongoing and planned work queues reaches the thousands or millions, requiring significant computational and data resources to collect, interact with, and report on those items.
Many tools have been applied to planning and tracking the progress of projects and related information items. Human activity does not automatically make a record of itself, or of progress, other than in human memory and experience. For example, a new home contractor may know how far along construction is, and what delays have occurred or remain likely; the unschooled homeowner may not. For another person to discern progress, technical tools can be applied. Historically, these tools have proven very important to the technical organization of time and direction of the worker(s). Methods associated with the technical management of work and related items include time recorders and clocks; daily planners; calendars; planning meetings and agendas; to-do lists; email; and project planner software. These methods are typically deployed by supervisors to track and manage workers' activity. In more recent technology, such as the Gantt chart, project tasks are typically shown with reference to a timeline. A task may be represented with a graphic, such as a bar, in the chart. The characteristics of the graphic and its position in the chart can provide information about the task. For example, the length of a bar that represents a particular task can provide an indication of how long the task is expected to take to complete. The position of the bar along the timeline can indicate when the task is scheduled to begin or to be completed. This type of display provides an overview, albeit a static one, of the state of a task or item.
Graphical tools can be used to take advantage of the information provided in Gantt charts. According to U.S. Pre-Grant Publication No. 2010/0217418, by Fontanot et al., “computer-based graphical tools based on the metaphor of the interactive Gantt-chart are used. . . . Usually, such tools give to the user the ability to move on a screen, by means of a pointing device (mouse, trackball or touch-screen), the graphical objects that represent the tasks that are part of the production schedule.” According to U.S. Pre-Grant Publication No. 2010/0017740, by Gonzalez Veron, et al., “project management application may display a project schedule timeline in conjunction with the Gantt chart or other schedule data. The timeline provides a summary of the project schedule by visually representing the schedule along a timescale from the start of the project until the finish. The timeline may also display crucial time information about the project including phases and milestones.”
According to U.S. Pre-Grant Publication No. 2014/033117, by Holler et al., “Progress and status reports may be displayed graphically. For example, a ‘dashboard’ user interface may display multiple graphical reports. Possible graphical reports include burn-down charts, velocity charts, burn-up charts, Gantt charts, parking lot reports, scope change, defect trending, test case status, and defect actuals. A burn-down chart illustrates remaining work vs. time.”
However, these solutions do not provide a technical solution to tracking items and interactive projects at scale, with all their dependencies and users. Despite providing features such as project schedule timelines, previous interactive Gantt charts do not offer the ability to easily determine, directly from the display, the actual progress of tasks, the underlying interactions and dynamics between items, or measurable progress, which is especially important for agile project management techniques. Therefore, users of such charts are required to personally investigate the progress of tasks in the timeline, often in side communications with personnel engaged in the task, which costs valuable time and resources, inherently limits the utility of such tools, and limits the trustworthiness of the information in the display. Although burn-down graphs, burn-up graphs, trend graphs, and the like are known to be used to track progress in limited ways, they have previously been viewed as separate tools that provide a different, and quite limited, view of a project.
While a visual display of progress is one of the benefits of prior project management techniques, those techniques do not offer the computational and dynamic-interaction advantages of the technical solutions described below.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one aspect, a computing device displays at least a portion of an interactive chart (e.g., a task row) presented in a first view. The computing device receives a first user request (e.g., by touch or gesture input received via a touchscreen of the computing device) for a change in the view, and updates the view responsive to the request. The updated view comprises a dynamic progress view. The dynamic progress view may include a burn-down graph. A task row may be height-adjusted to accommodate a dynamic progress view (e.g., by adjusting the height to display a burn-down graph in addition to other information in the task row).
The computing device may receive a second user request for a change in the view and further update the view responsive to the second user request. In such a scenario, the corresponding portion of the interactive chart may be an expandable task row, and the further updated view may include representations of a plurality of individual tasks associated with the expandable task row. The expandable task row may include an expand/collapse element, and the second user request may be associated with the expand/collapse element.
In another aspect, a computing device presents a dynamic progress view (which may include a burn-down graph) for at least a portion of an interactive chart, receives user input associated with the dynamic progress view, and automatically performs one or more of the following steps responsive to the received user input: (a) presenting additional information about a task or project associated with the interactive chart; (b) generating a notification associated with the interactive chart; (c) updating resource allocation for a task associated with the interactive chart; or (d) adjusting a projected completion time for a task associated with the interactive chart. For example, steps (b) and (d) may be performed, and the notification of step (b) may include information relating to the adjusted projected completion time of step (d). As another example, steps (b) and (c) may be performed, and the notification of step (b) may include information relating to the updated resource allocation of step (c).
In examples described herein, the computing device may be in communication with a remote computer system (e.g., one or more server computers), and the dynamic progress view may include project information (e.g., project schedule information, resource allocation information, etc.) received from the remote computer system. The dynamic progress view also may include project information received from a local project cache on the computing device. Changes to the project information (e.g., changes made by a user interacting with the computing device) may be synchronized back to the remote computer system.
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
The detailed description set forth below in connection with the appended drawings where like numerals reference like elements is intended as a description of various embodiments of the disclosed subject matter and is not intended to represent the only embodiments. Each embodiment described in this disclosure is provided merely as an example or illustration and should not be construed as preferred or advantageous over other embodiments. The illustrative examples provided herein are not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of illustrative embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that many embodiments of the present disclosure may be practiced without some or all of the specific details. In some instances, well-known process steps have not been described in detail in order not to unnecessarily obscure various aspects of the present disclosure. Further, it will be appreciated that embodiments of the present disclosure may employ any combination of features described herein.
In the example shown in
The task bar 240 has an associated task bar label 242 that provides additional information about the task, such as the person or entity responsible for completing the task. The information to be displayed in the task bar label 242 may be selectable (e.g., automatically or by a user) or predefined. Alternatively, the task bar label 242 can be omitted or presented in some other way.
In the example shown in
Embodiments described herein provide a visual environment that implements technical solutions to permit rapid, contextual display of status information for one or more scheduled items. The visual environment includes interactive charts (e.g., interactive Gantt charts) with dynamic progress monitoring (e.g., in the form of dynamic progress views with specialized, multi-function burn-down graphs, trend lines, or the like). Automated notifications and modifications can be generated based on inputs to the visual environment (e.g., via interactive elements of a dynamic progress view) or outputs from the visual environment and/or a related back-end computer system. Contextual presentation of possible or actual changes to the scheduled items can be provided by the visual environment (e.g., in the form of updates to a dynamic progress view). The back-end computer system can perform automated processing to modify scheduled items based on, e.g., status information and underlying constraints. Real-time data inputs by users can be tracked by the back-end computer system, which can make updates to the visual environment, such as updating a trend line in a dynamic progress view. This type of dynamic computing system permits a high level of interaction with ongoing work by a large group of users.
According to some embodiments of the present disclosure, solutions are provided for the management and presentation of information in interactive charts with selectable and customizable dynamic progress views, which allow the actual progress of a task or other scheduled item to be quickly and accurately understood. As used herein, a dynamic progress view depicts progress (e.g., an amount of work remaining) against time. The dynamic progress view includes a visual representation (e.g., in the form of a trend-line or spark-line) of a rate of progress. The dynamic progress view may depict a present rate, a historic rate, and/or a projected future rate of task completion. The dynamic progress view also may depict the present, historic, or projected future rate versus a planned rate of completion (e.g., with separate lines to represent each). Different time periods in a dynamic progress view may be represented distinctively (e.g., with different colors, shadings, etc.). For example, time periods may be represented in different colors depending on whether the respective time period is past or future time. A customized color scheme may be used to indicate different times or conditions, e.g., green for a time elapsed (past time) region, blue for an on-schedule trend line, yellow for a behind-schedule trend line, or red for stalled trend line. The presentation of the time allotted for completion can be made to contrast with the trend-line for projected completion, to enhance the visual effect of the dynamic progress view.
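As a non-limiting illustration (and not part of the disclosed embodiments themselves), the projection of a trend from work-remaining samples can be sketched in Python; the function name, sample format, and status labels below are hypothetical placeholders:

```python
from datetime import date, timedelta

def project_completion(samples, planned_end):
    """Project a completion date from (date, work_remaining) samples.

    samples: list of (date, remaining_units) pairs, oldest first.
    planned_end: the scheduled completion date.
    Returns (projected_end, status), where status is one of
    "on-schedule", "behind-schedule", or "stalled".
    """
    (d0, r0), (d1, r1) = samples[0], samples[-1]
    days = (d1 - d0).days or 1
    rate = (r0 - r1) / days  # units of work completed per day
    if rate <= 0:
        # No measurable progress: the trend line is flat or rising.
        return None, "stalled"
    projected_end = d1 + timedelta(days=round(r1 / rate))
    status = "on-schedule" if projected_end <= planned_end else "behind-schedule"
    return projected_end, status
```

A dynamic progress view could then color its trend line based on the returned status (e.g., blue for on-schedule, yellow for behind-schedule, red for stalled), consistent with the customized color scheme described above.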
As mentioned above, task rows and the elements within them can be presented in different views. Views can change in response to, e.g., user requests. User requests may include requests to (1) refresh a current view; (2) select from among available views (including, e.g., a calendar view, a Gantt view, and a dynamic progress view), or switch between views; and (3) expand/collapse views (if applicable). In at least one embodiment, refreshing a view involves updating task data for a task row based on a selected cell or cells in an underlying spreadsheet. Selected cell data can be, for example, content in the cell, metadata about the cell, and data derived from or about the cell from the sheet itself.
The dynamic progress view 270D depicted in
The examples shown in
It will be understood that, with the flexibility offered by the described dynamic progress views, other progress scenarios also can be depicted in such views.
The interactive elements 710 and 720 can be clicked, tapped, dragged, or otherwise activated to interact with the interactive chart via the dynamic progress view. By interacting with the interactive chart via the dynamic progress view (e.g., with the elements shown or in other ways), a user can obtain information about a task, cause a notification to be generated, cause updates to resource allocation for a task, adjust a projected completion time for a task, or invoke a combination of such functionality or other functionality. Examples of such interactions and related functionality are described in detail below.
In the examples shown in
In the examples shown in
The examples shown in
The server 920 can process input obtained from the client device 910 and/or other devices. For example, the server 920 can accept input indicating completion of a task, an update on the amount of work remaining for a task, or the like. The server 920 can store such data in a data store 928 and/or use such data for further processing. For example, the server 920 can use a change engine 924 to push corresponding changes to the interactive chart. The changes may be based on underlying calculations (e.g., updates to expected time remaining before a task is completed) performed by a computation engine 926. The server 920 may use a message engine 922 to generate messages relating to the interactive chart. For example, the server 920 may use the message engine 922 to automatically generate messages to be transmitted (e.g., to client device 910 via the network 930) to users that may be affected by an update to a task.
Messages may be automatically generated by the message engine 922 based on data that indicates current status, constraints, or the like. Similarly, automated changes, if permitted, may be made by the change engine 924 based on such data. The timing of automatically generated messages or changes also may be based on such data. For example, if a projected rate of completion indicates that a task is stalled or will be delayed, and the time remaining is less than a specified threshold, a warning message may be sent to a manager or other responsible entity. The specified threshold may depend on the condition of the task. For example, a stalled task may need to be addressed earlier than a delayed task, so the threshold time remaining for sending a warning message on a stalled task may be greater than for a delayed task.
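The condition-dependent threshold logic described above can be sketched as follows; this is a minimal illustration only, and the threshold values and status labels are hypothetical, not values from the disclosure:

```python
def warning_due(status, days_remaining,
                stalled_threshold=14, delayed_threshold=7):
    """Decide whether a warning message should be sent for a task.

    A stalled task uses a larger time-remaining threshold than a
    delayed task, so a stalled task is flagged earlier. The default
    thresholds are illustrative placeholders.
    """
    if status == "stalled":
        return days_remaining < stalled_threshold
    if status == "delayed":
        return days_remaining < delayed_threshold
    return False  # on-schedule tasks generate no warning
```

For example, with these placeholder thresholds, a stalled task with 10 days remaining would trigger a warning, while a merely delayed task with the same time remaining would not.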
Messages also may be sent to provide regular status reports at predefined times (e.g., when 25% or 50% of the allocated time for a task has elapsed), when particular conditions are met (e.g., when a task is 25% or 50% completed), or the like. For example, at 25% time remaining, a message engine may generate a message including a condition statement (for example, tasks completed and remaining, items yet to be received, reviewers assigned but not active) and, if appropriate, a dependency statement (for example, other items and persons relying upon completion).
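A status report of the kind described above, combining a condition statement with an optional dependency statement, might be assembled as in the following sketch; the field names and message layout are hypothetical placeholders, not part of the disclosure:

```python
def status_message(task, pct_time_elapsed):
    """Assemble a periodic status report for a task.

    task: a dict with hypothetical keys "name", "completed",
    "remaining", and optionally "dependents" (items relying on
    completion of this task).
    """
    total = len(task["completed"]) + len(task["remaining"])
    lines = [
        f"Status at {pct_time_elapsed}% time elapsed for '{task['name']}':",
        f"  Completed: {len(task['completed'])} of {total} sub-tasks",
    ]
    if task.get("dependents"):
        # Dependency statement: who or what relies on this task.
        lines.append("  Depended on by: " + ", ".join(task["dependents"]))
    return "\n".join(lines)
```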
The design of described embodiments allows tracking of user changes to task parameters (e.g., allocated time, or resources such as workers, equipment, etc., for a task); tracking of actual work (e.g., completed tasks or sub-tasks); tracking and adjustment of resource, schedule and user assignments; and implementation of management mandates relating to schedule, resource and user time allocations. The design of described embodiments also allows for partitioning of underlying metadata and generation of reports, which can be automatically updated and produced on request. Such reports may include reports specific to past time in a task or project, such as who worked on a task, tasks completed, changes in task assignments, etc., or future time in a task or project, such as who is assigned to a future task or subtask, tasks remaining, task assignments yet to be made, etc. In addition to separate reports, some information may be available by directly inspecting the dynamic progress view of the interactive chart. For example, a task bar or trend-line can be segmented in different colors or shading to indicate who was assigned to a task during a particular period of time, to indicate a future task that has not been assigned to a worker yet, etc.
The computing device may receive a second user request for a change in the view and further update the view responsive to the second user request. In such a scenario, the further updated view may include representations of a plurality of individual tasks associated with an expandable task row. The expandable task row may include an expand/collapse element, and the second user request may be associated with the expand/collapse element.
In the illustrative method 1100 shown in
In the above described methods or other techniques described herein, the computing device may be in communication with a remote computer system (e.g., one or more server computers), and the dynamic progress view may include project information (e.g., project schedule information, resource allocation information, etc.) received from the remote computer system. The dynamic progress view also may include project information received from a local project cache on the computing device. Changes to the project information (e.g., changes made by a user interacting with the computing device) may be synchronized back to the remote computer system.
In described embodiments, user interface operations (e.g., scrolling operations, zoom operations, panning operations, activation of user interface elements such as buttons or menu options, etc.) can be initiated and/or controlled in response to user input events (e.g., a touch of a button, a gesture or tap on a touchscreen, or the like). User interface operations permit user interaction with charts and views. The available user interface operations, as well as the particular input that is used to initiate or control the available user interface options, can vary depending on factors such as the capabilities of the device (e.g., whether the device has a touchscreen), the underlying operating system, user interface design, user preferences, or other factors.
Scrolling operations (e.g., horizontal scrolling or vertical scrolling) can be initiated and/or controlled by user input such as a button press or touch, a gesture on a touchscreen (e.g., a horizontal flick gesture), or the like. Zoom operations (e.g., zoom-in operations, zoom-out operations) can be initiated and/or controlled by user input such as a button press or touch, a gesture on a touchscreen (e.g., a tap or pinch gesture), or the like. Zoom operations can be useful for viewing or interacting with a chart at different levels of detail. The effect of a zoom operation may vary depending on the current zoom level, the direction and/or magnitude of the zoom, the nature of the user input, or other factors. Zoom operations can be initiated and/or controlled by gestures in a touch-enabled interface. For example, a zoom-in operation can be initiated by a pinch gesture, in which two points of contact (e.g., by a user's fingers) are brought closer together during the gesture, and a zoom-out operation can be initiated by an “un-pinch” gesture, in which two points of contact are moved further apart during the gesture. The effect of a zoom operation can depend on the magnitude of the gesture (e.g., the change in distance between the contact points), the direction of the gesture (e.g., whether the gesture is a pinch or un-pinch gesture), the angle of the gesture (e.g., the angle relative to a horizontal or vertical axis), the current zoom level, and/or other factors. A single gesture can lead to any of several possible zoom effects depending on the characteristics of the gesture, as described in further detail below.
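The classification of a two-finger gesture by the change in separation between contact points, following the convention described above (pinch initiating zoom-in, un-pinch initiating zoom-out), can be sketched as follows; the function name and return format are illustrative assumptions:

```python
import math

def classify_zoom(start_points, end_points):
    """Classify a two-finger gesture from its contact points.

    start_points, end_points: pairs of (x, y) contact coordinates at
    the start and end of the gesture. Per the convention described
    above, bringing the points closer together (pinch) initiates
    zoom-in; moving them apart (un-pinch) initiates zoom-out. The
    returned magnitude is the ratio of final to initial separation.
    """
    def separation(points):
        (x0, y0), (x1, y1) = points
        return math.hypot(x1 - x0, y1 - y0)
    d0, d1 = separation(start_points), separation(end_points)
    if d0 == 0 or d1 == d0:
        return "none", 1.0
    operation = "zoom-in" if d1 < d0 else "zoom-out"
    return operation, d1 / d0
```

The magnitude could then scale the zoom effect, and the gesture angle (not modeled here) could select among the several possible zoom effects described below.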
In addition to scrolling operations and zoom operations, other user interface operations can permit other types of user interaction with charts and views. A tap or press-and-hold gesture can be recognized if the gesture occurs in a designated area of the display corresponding to a portion of the user interface; such a designated area is referred to herein as a "hit area." For example, a user can interact with user interface elements by performing a tap gesture or a press-and-hold gesture on a designated hit area in a row, if the hit area is within the display area.
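A simple hit-area test can be sketched as follows, assuming a rectangular hit area padded some distance beyond the visible element so that the target is big enough for easy interaction; the function name, coordinate convention, and padding value are illustrative assumptions:

```python
def in_hit_area(tap_x, tap_y, elem_x, elem_y, elem_size, padding):
    """Return True if a tap falls within the expanded hit area of a
    square user interface element (e.g., a row-height expand/collapse
    box). The hit area extends `padding` pixels beyond the visible
    element on each side."""
    return (elem_x - padding <= tap_x <= elem_x + elem_size + padding and
            elem_y - padding <= tap_y <= elem_y + elem_size + padding)
```

For example, with a 20-pixel element at the origin and 8 pixels of padding, a tap at (25, 10) lands outside the visible element but inside the hit area and is still recognized.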
In at least one embodiment, the following user interface operations can be initiated by a tap gesture on a designated hit area in a row, if the hit area is within the display area.
Tap expand/collapse element: If an expand/collapse element (e.g., a box) is present (e.g., if a row has children), tapping in an area on or near the element (e.g., a row-height square area) can toggle the expanded/collapsed state. In at least one embodiment, the hit or tap area extends beyond the visible element, potentially even covering a small part of other content outside the expand/collapse element (e.g., text to its right), to ensure that the hit area is big enough for easy interaction.
Tap bar/marker to open a row view: Tapping on the bar or marker (e.g., a task bar, a group bar, a milestone marker, etc.) in a row can open a row view. In at least one embodiment, the row to be viewed is opened in a “view row” form, and the user can move to the next or previous row or invoke an editing operation from that view. If the row is edited, the view can be refreshed to show any corresponding changes.
The above effects may be presented temporarily, for a period of time after invocation by the user, or for as long as the user holds a given button or item on the page, for a "show me" effect on the selected data element. In this way, the user can see a temporary view of the information without the view being changed permanently.
Unless otherwise specified in the context of specific examples, described techniques and tools may be implemented by any suitable computing devices, including, but not limited to, laptop computers, desktop computers, smart phones, tablet computers, and/or the like.
Some of the functionality described herein may be implemented in the context of a client-server relationship. In this context, server devices may include suitable computing devices configured to provide information and/or services described herein. Server devices may include any suitable computing devices, such as dedicated server devices. Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device. The term “client” can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server. At various times, a single device may act as a server, a client, or both a server and a client, depending on context and configuration. Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location.
In its most basic configuration, the computing device 1200 includes at least one processor 1202 and a system memory 1204 connected by a communication bus 1206. Depending on the exact configuration and type of device, the system memory 1204 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or other memory technology. Those of ordinary skill in the art and others will recognize that system memory 1204 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 1202. In this regard, the processor 1202 may serve as a computational center of the computing device 1200 by supporting the execution of instructions.
As further illustrated in
In the illustrative embodiment depicted in
As used herein, the term “computer-readable medium” includes volatile and nonvolatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, in-memory databases, or other data. In this regard, the system memory 1204 and storage medium 1208 depicted in
For ease of illustration and because it is not important for an understanding of the claimed subject matter,
In any of the described examples, data can be captured by input devices and transmitted or stored for future processing. The processing may include encoding data streams, which can be subsequently decoded for presentation by output devices. Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device). Input devices can be separate from and communicatively coupled to computing device 1200 (e.g., a client device), or can be integral components of the computing device 1200. In some embodiments, multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone). Any suitable input device either currently known or developed in the future may be used with systems described herein.
The computing device 1200 may also include output devices such as a display, speakers, printer, etc. The output devices may include video output devices such as a display or touchscreen. The output devices also may include audio output devices such as external speakers or earphones. The output devices can be separate from and communicatively coupled to the computing device 1200, or can be integral components of the computing device 1200. In some embodiments, multiple output devices may be combined into a single device (e.g., a display with built-in speakers). Further, some devices (e.g., touchscreens) may include both input and output functionality integrated into the same input/output device. Any suitable output device either currently known or developed in the future may be used with described systems.
In general, functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like. Computing logic may be compiled into executable programs or written in interpreted programming languages. Generally, functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub-modules. The computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general-purpose or special-purpose processors, thus creating a special-purpose computing device configured to provide functionality described herein.
Although some embodiments that may be used for project scheduling and collaboration are described herein with reference to Gantt charts, it should be understood that such embodiments may also be applied to other types of project scheduling charts, or other charts that are not specifically limited to project scheduling, but have features in which an interactive chart of the progress of related tasks or other scheduled items can be presented to a user.
Although some embodiments are described herein with reference to burn-down graphs that track work remaining, similar principles could be used to track work completed (e.g., in a burn-up graph, a task list, a calendar view, an activity graph, etc.). Although some examples are described herein with regard to illustrative touch-enabled mobile computing devices and a corresponding application that can be executed on the illustrative devices, the principles described herein also can be applied to other computing devices, whether such computing devices employ touchscreen input or other input modes.
For touch-enabled devices, many different types of touch input can be used, and touch input can be interpreted in different ways. Inertia effects, friction effects, and the like can be used to provide a more realistic feel for touch input. For example, in a touch-enabled interface, a flick gesture can be used to initiate a scrolling motion at an initial velocity that gradually decreases (e.g., based on a friction coefficient) before coming to rest.
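The inertial scrolling described above, in which a flick's initial velocity decays under a friction coefficient until the motion comes to rest, can be sketched as follows; the function name, per-frame model, and default coefficients are illustrative assumptions:

```python
def scroll_offsets(v0, friction=0.9, min_velocity=0.5):
    """Simulate inertial scrolling after a flick gesture.

    The scroll velocity starts at v0 and decays by a constant
    friction coefficient each frame; motion stops once the velocity
    drops below min_velocity. Returns the per-frame scroll offsets.
    """
    offsets, v = [], v0
    while abs(v) >= min_velocity:
        offsets.append(v)
        v *= friction  # apply friction each frame
    return offsets
```

Each returned offset would be applied to the view position on successive frames, producing a scroll that starts fast and gradually comes to rest.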
Visual display areas may represent defined data sets and metadata, and such information may be displayed, reports may be generated, and/or actions may be taken responsive to such information through any number of user interface devices (e.g., mouse, keyboard, keypad, touchpad, touch-enabled display, etc.). Multiple display devices may be used with the system.
Many alternatives to the systems and devices described herein are possible. For example, individual modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems. As another example, modules or subsystems can be omitted or supplemented with other modules or subsystems. As another example, functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems. Although some examples in the present disclosure include descriptions of devices comprising specific hardware components in specific arrangements, techniques and tools described herein can be modified to accommodate different hardware components, combinations, or arrangements. Further, although some examples in the present disclosure include descriptions of specific usage scenarios, techniques and tools described herein can be modified to accommodate different usage scenarios. Functionality that is described as being implemented in software can instead be implemented in hardware, or vice versa.
Many alternatives to the techniques described herein are possible. For example, processing stages in the various techniques can be separated into additional stages or combined into fewer stages. As another example, processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages. As another example, processing stages that are described as occurring in a particular order can instead occur in a different order. As another example, processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages. As another example, processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.
Many alternatives to the user interfaces described herein are possible. In practice, the user interfaces described herein may be implemented as separate user interfaces or as different states of the same user interface, and the different states can be presented in response to different events, e.g., user input events. The elements shown in the user interfaces can be modified, supplemented, or replaced with other elements in various possible implementations.
The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the claimed subject matter.