Enterprise software in a hosted web browser environment has typically been driven by efforts to replicate mainline desktop applications, with all of their complexity and user interface options. Rich client applications and broadband connections to data sources are presently available for tasks such as word processing, spreadsheets, presentations, image and video manipulation, project management, and other productivity applications. Similarly, strategies for synchronizing data and actions between an online application system and an end user's client computing device, and for displaying the user interface on the client, are well known.
The advent of mobile and tablet devices with touchscreen displays, together with their popularity among consumer and business users, mandates a different direction for the presentation of software products. It is particularly challenging for mobile applications, especially applications relating to complex subjects such as project management, to simplify the user experience so that information can be successfully presented and manipulated on a smaller screen, and to tune the application for deployment on mobile networks with lower bandwidth. It is also desirable to empower users to customize the display of information dynamically on a mobile or touchscreen device.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one aspect, a computing device with a touchscreen (e.g., a tablet computer, smart phone, etc.) renders an initial interactive chart view (e.g., a Gantt chart view) comprising a time axis and a task axis and receives first user input (e.g., a pinch gesture) associated with a zoom operation via the touchscreen. Based at least in part on the first user input, the computing device determines whether the zoom operation is a one-dimensional zoom operation or a multi-dimensional zoom operation and renders a zoom-adjusted interactive chart view for display responsive to the first user input. In a time-axis zoom operation, the zoom-adjusted interactive chart view can be zoomed in or out on the time axis relative to the initial interactive chart view. In a task-axis zoom operation, the zoom-adjusted interactive chart view can be zoomed in or out on the task axis relative to the initial interactive chart view. In a multi-dimensional zoom operation, the zoom-adjusted interactive chart view can be zoomed in or out on the time axis and the task axis relative to the initial interactive chart view.
The initial interactive chart view may comprise an initial time scale and an initial number of task rows to be displayed. The zoom-adjusted interactive chart view may comprise an adjusted height or number of task rows to be displayed and/or an adjusted time scale. The initial interactive chart view may comprise a number of connection lines representing dependencies to be displayed. The zoom-adjusted interactive chart view may comprise an adjustment of the number, positioning, or orientation of the connection lines.
Additional user input can be used to perform other operations. For example, the computing device can receive second user input (e.g., a tap gesture or press-and-hold gesture) associated with an operation in a task row via the touchscreen and render a modified view of task information responsive to the second user input. Rendering the modified view of task information may include expanding or collapsing an element, repositioning (e.g., centering) a task label on the display, rendering a row view, or rendering a row menu. The row menu may include a dependencies button configured to, when activated, highlight dependencies for a task.
In another aspect, a computing device renders a chart view (e.g., a Gantt chart view) comprising a time axis and a task row having a task name label with a directional indicator. The computing device renders the directional indicator such that it points in the direction of a task bar at a current position. The task bar is associated with the task row. The computing device receives user input associated with a change in the current position of the task bar and adjusts the directional indicator based at least in part on the change in the current position of the task bar. The directional indicator can be initially positioned at a first end of the task name label and subsequently positioned at a second end of the task name label. Adjusting the directional indicator can include adjusting the size or orientation of the directional indicator, or adjusting the size of the directional indicator in proportion to a distance between the task name label and the task bar.
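By way of a non-limiting illustration, the indicator adjustment described above might be sketched as follows. The type names, property names, and numeric bounds in this Swift sketch are hypothetical assumptions, not required elements of any embodiment.

```swift
// Minimal sketch of directional-indicator adjustment; all names are hypothetical.
enum IndicatorOrientation { case left, right }

struct DirectionalIndicator {
    var orientation: IndicatorOrientation
    var size: Double  // e.g., width of the arrow or chevron, in points
}

/// Re-aims and re-sizes the indicator when the task bar's position changes.
/// `labelX` and `barX` are horizontal positions in view coordinates.
func adjustIndicator(labelX: Double, barX: Double,
                     minSize: Double = 6, maxSize: Double = 18,
                     pointsPerSizeStep: Double = 100) -> DirectionalIndicator {
    // Point toward the task bar: right if the bar is ahead of the label, else left.
    let orientation: IndicatorOrientation = barX >= labelX ? .right : .left
    // Grow the indicator in proportion to the distance, within fixed bounds.
    let distance = abs(barX - labelX)
    let size = min(maxSize, minSize + distance / pointsPerSizeStep)
    return DirectionalIndicator(orientation: orientation, size: size)
}
```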
In another aspect, a computing device renders a chart view comprising a time axis for a project comprising tasks and detects a pinch gesture via a touchscreen. The pinch gesture corresponds to a zoom operation. The computing device automatically adjusts a time scale of the time axis responsive to the pinch gesture, subject to a time scale adjustment constraint that is based at least in part on the tasks. The time scale adjustment constraint may include a minimum number of tasks to be visible at the adjusted time scale. The time scale adjustment constraint may require an entire task to be visible at the adjusted time scale.
A computing device performing such methods may be in communication with a server computer, and chart views may be generated using project schedule information received from the server computer. Changes made to the project schedule information can be synchronized back to the server.
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
The detailed description set forth below in connection with the appended drawings where like numerals reference like elements is intended as a description of various embodiments of the disclosed subject matter and is not intended to represent the only embodiments. Each embodiment described in this disclosure is provided merely as an example or illustration and should not be construed as preferred or advantageous over other embodiments. The illustrative examples provided herein are not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of exemplary embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that many embodiments of the present disclosure may be practiced without some or all of the specific details. In some instances, well-known process steps have not been described in detail in order not to unnecessarily obscure various aspects of the present disclosure. Further, it will be appreciated that embodiments of the present disclosure may employ any combination of features described herein.
According to some embodiments of the present disclosure, solutions are provided for the management and presentation of information (e.g., task, status, and calendar information) in interactive charts. Described embodiments may be implemented in the context of a hosted project management application presented on a mobile device, in which the mobile device presents a user interface that allows interaction with project information provided by a remote device, such as a server. Alternatively, according to principles described herein, management and presentation of such information (or other information) can be performed in the context of other applications and/or on other types of devices.
According to some embodiments of the present disclosure, an interactive chart view application is provided that allows interaction with charts having a time axis and a task axis (e.g., charts for project scheduling). A Gantt chart is an example of a chart that can be used for project scheduling. In a Gantt chart, projects are broken down into tasks. The tasks in a Gantt chart are typically shown with reference to a timeline. A task may be represented with a graphic, such as a bar or line, in the chart. The characteristics of the graphic and its position in the chart can provide information about the task. For example, the length of a bar or line that represents a particular task can provide an indication of how long the task is expected to take to complete. As another example, the position of the bar or line along the timeline can indicate when the task is scheduled to begin or to be completed. Tasks may be dependent on one another. For example, one task may need to be completed before another task can begin. Dependencies between tasks can be depicted (e.g., graphically) in the chart.
Although some embodiments that may be used for project scheduling and collaboration are described herein with reference to Gantt charts, it should be understood that such embodiments may also be applied to other types of project scheduling charts, or other charts that are not specifically limited to project scheduling.
According to some embodiments of the present disclosure, modifiable views of project information are provided. Such views can be modified according to display rules. Modifications of views can be initiated and controlled by user interface elements. For example, on a mobile device with a touchscreen interface, modifications of views can be initiated and controlled by gestures, button taps, or other user interactions. As a specific example, a Gantt view provided for interaction with a Gantt chart can be dynamically modified according to a pre-defined set of display rules and related user interface elements.
In some embodiments, multiple views are provided. The respective views can be displayed individually, or multiple views (or portions of views) can be displayed at the same time, depending on factors such as the available screen area for displaying views. As used herein, a view refers to a depiction of a portion of a user interface that relates to a particular type of content. For example, a chart view depicts content relating to a chart (e.g., a Gantt chart). A chart view that depicts a Gantt chart can be referred to as a Gantt view. In one illustrative embodiment, three views are provided to facilitate user interaction with a Gantt chart: a Gantt view, a list view, and a grid view.
In described embodiments, an interactive chart view application provides views that can be rich in graphics and other information without providing unnecessary distractions. Such views allow the user's data to be easily viewable and accessible on a small display, such as a display on a mobile phone. The views are also flexible enough to take advantage of larger displays, such as tablet PC displays, allowing users to use different devices as their needs and resources change.
Treatment of views and view states can vary depending on application design, user preferences, or other factors. For example, an interactive chart view application may be configured to store a current view state when a user exits the application, and the stored view state can be automatically restored when the application is opened again. Alternatively or additionally, default views and/or view states can be used. For example, an interactive chart view application may be configured to display a default view each time the application is opened. As another example, a user can choose to display a default view or a previously stored view state when the application is opened. The default view may be customizable or pre-defined. For example, a user can select a Gantt view, list view, or grid view as a default view.
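As a non-limiting illustration, saving a view state on exit and restoring it (or a default) on launch might be sketched as follows. The ViewState shape, the storage key, and the use of UserDefaults are assumptions for illustration only.

```swift
import Foundation

// Minimal sketch of view-state persistence; names and keys are hypothetical.
enum ViewKind: String, Codable { case gantt, list, grid }

struct ViewState: Codable {
    var kind: ViewKind
    var zoomLevel: Double
    var firstVisibleTaskRow: Int
}

let stateKey = "lastViewState"  // hypothetical storage key

func saveOnExit(_ state: ViewState) {
    if let data = try? JSONEncoder().encode(state) {
        UserDefaults.standard.set(data, forKey: stateKey)
    }
}

func restoreOnLaunch(defaultState: ViewState) -> ViewState {
    guard let data = UserDefaults.standard.data(forKey: stateKey),
          let saved = try? JSONDecoder().decode(ViewState.self, from: data)
    else { return defaultState }  // fall back to the default view
    return saved
}
```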
A user can interact with views (including Gantt views) in various ways. For example, views can be initiated, selected, modified, and/or manipulated on mobile devices using onscreen or physical device buttons, taps or gestures, movements of the device itself (e.g., shaking), voice commands, menu-selected user interface options, or any other suitable method. The invocation of a view (e.g., a Gantt view) may invoke a modal view (e.g., a view that is displayed until dismissed, modified, or a change is requested), or it may invoke a transparent or semi-transparent temporal view. For example, a Gantt view can overlay an overall project view or some other view for a period of time (e.g., a period of time after shaking the device) or pendency of an action (e.g., holding a touch on a button).
Illustrative Interactive Chart Views
Although the chart view 110 shown in
In the example shown in
In the example shown in
In the example shown in
The task bar 240 also has an associated task bar label 242. The task bar label 242 can provide additional information about the task, such as the person or entity responsible for completing the task. In the example shown in
Different portions of a chart view may be allocated respective percentages of the display area. For example, a portion of a Gantt view that includes task bars 240 and connection lines 250 may be allocated 40% of the width of the display area. As another example, the size of task name labels 230 can be limited to a percentage of the display area (e.g., 45% of the width of the display area).
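A minimal sketch of such percentage-based allocation follows. The fractions are the example values above, and the type and member names are hypothetical.

```swift
// Minimal sketch of percentage-based display-area allocation.
struct GanttLayout {
    let displayWidth: Double
    var barRegionFraction = 0.40  // example: task bars and connection lines get 40%
    var labelMaxFraction = 0.45   // example: task name labels capped at 45%

    var barRegionWidth: Double { displayWidth * barRegionFraction }

    /// Caps a label's preferred width at the allocated percentage of the display.
    func labelWidth(preferred: Double) -> Double {
        return min(preferred, displayWidth * labelMaxFraction)
    }
}
```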
Elements of a chart view (e.g., task name labels, task bars, etc.) may be presented as transparent, semi-transparent, or opaque, and the transparency of such elements can vary depending on the view state. For example, the task name labels 230 may be semi-transparent (allowing content behind them, such as task bars, to be partially visible) or opaque.
In some embodiments, an interactive chart view application can track dimensions of the screen allocated to display the application data and the interactions of the mobile device user. Some rules that may be used in such embodiments are described below with reference to corresponding figures.
The appearance of some elements can be altered in response to events. For example, in state 300B (
Some elements can be configured to float over the Gantt view. For example, in state 300B (
Changing to a landscape orientation (e.g., by rotating the display) can provide a greater screen width. To take advantage of changes in display orientation, at least some aspects of the Gantt view (e.g., the size of task name labels, the length of the time axis, etc.) can be automatically adjusted in response to a change in display orientation (e.g., from portrait to landscape orientation, or vice versa).
Task name labels may include a directional indicator (e.g., an arrow or chevron pointer) to indicate where corresponding task bars are located relative to the task name labels. For example, in
Task name labels 230J and 230L are associated with task bars 240J and 240L, respectively. In at least one embodiment, the task bars 240J and 240L are rectangles with a height equal to the height of the text in the corresponding task name labels 230J and 230L, respectively, with a gray border and a colored fill that can be customized, e.g., to represent different types of tasks.
Task name labels 230K and 230M are associated with milestone markers 240K and 240M, respectively. Milestone markers can be considered a special type of task bar associated with a milestone. In at least one embodiment, if the user sets a task to zero time length, the task is represented as a milestone marker (e.g., a black diamond). The milestone marker can be vertically and horizontally centered within the box representing the corresponding time period, as shown in
As indicated above, in described embodiments a chart view comprises a time axis. The time axis may have an associated header with multiple levels of time markers (e.g., letters representing days of the week, names of months, etc.) and multiple available time scales and levels of zoom.
The time scales and the associated time markers can be updated automatically in response to zooming in or out on the time axis 204. Alternatively, the levels of time markers can be adjusted independently. As another alternative, one or more of the time scales may be fixed. Further details regarding zooming functionality and time scales are provided below.
Many different date formats may be used for time markers.
An interactive chart view application running on a local device can provide all of the time markers used for a chart. Alternatively, sufficient information to calculate time markers can be stored on the local device. If time markers are calculated locally, a remote service may still provide some date information directly (e.g., the month that begins a fiscal year, such as January, April, July, or October). Fiscal year information may be important for providing the right year for fiscal-year-formatted dates and for displaying the right index for quarter and week numbers if they are associated with the fiscal year.
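As a non-limiting illustration, a fiscal quarter index could be computed locally from a fiscal-year start month supplied by the remote service. The function name and the 1-12 month numbering are assumptions.

```swift
// Minimal sketch of fiscal-quarter indexing from a fiscal-year start month
// (e.g., 1, 4, 7, or 10, per the examples above).
func fiscalQuarter(calendarMonth: Int, fiscalYearStartMonth: Int) -> Int {
    // Months are 1...12; shift so the fiscal-year start month maps to offset 0.
    let offset = (calendarMonth - fiscalYearStartMonth + 12) % 12
    return offset / 3 + 1
}

// Example: with an April (4) fiscal-year start, March (3) falls in fiscal Q4.
// fiscalQuarter(calendarMonth: 3, fiscalYearStartMonth: 4) == 4
```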
In at least some embodiments, there are two types of column dividers—a secondary column divider (representing smaller time units in the time scale) and a primary column divider (representing larger time units in the time scale). In the example shown in
At the level of zoom shown in
In some cases, a primary column divider may not line up with a corresponding secondary column divider. For example, in a week/month time scale, the primary column divider may appear between two secondary column dividers if, as is often the case, the end of a month does not coincide with the end of a week. The primary column divider can be positioned behind the time markers in the header row, if necessary, in order to avoid obscuring the time markers.
Referring again to
Illustrative User Interface Operations for Interactive Chart Views
In described embodiments, user interface operations (e.g., scrolling operations, zoom operations, panning operations, activation of user interface elements such as buttons or menu options, etc.) can be initiated and/or controlled in response to user input events (e.g., a touch of a button, a gesture or tap on a touchscreen, or the like). User interface operations permit user interaction with charts and views. The available user interface operations, as well as the particular input used to initiate or control them, can vary depending on factors such as the capabilities of the device (e.g., whether the device has a touchscreen), the underlying operating system, user interface design, user preferences, or other factors.
Scrolling operations (e.g., horizontal scrolling or vertical scrolling) can be initiated and/or controlled by user input such as a button press or touch, a gesture on a touchscreen (e.g., a horizontal flick gesture), or the like. Horizontal scrolling allows a user to view content that is outside the display area in the horizontal direction, and vertical scrolling allows a user to view content that is outside the display area in the vertical direction. Some elements may be altered (e.g., in terms of size, shape, or content) in response to scrolling. For example, the shape, size, or content of task name labels can be altered in response to horizontal scrolling in a Gantt view, as described in detail above. In at least one embodiment, vertical scrolling allows headers (such as time scale headers) to continue to be displayed as the user scrolls vertically (e.g., to view additional task rows in a Gantt view), thereby providing the user with consistent visual cues as to the information that is being displayed.
Zoom operations (e.g., zoom-in operations, zoom-out operations) can be initiated and/or controlled by user input such as a button press or touch, a gesture on a touchscreen (e.g., a tap or pinch gesture), or the like. Zoom operations in a chart view (e.g., a Gantt view) can be useful for viewing or interacting with a chart at different levels of detail. The effect of a zoom operation may vary depending on the current zoom level, the direction and/or magnitude of the zoom, the nature of the user input, or other factors.
In described embodiments, zoom operations can be initiated and/or controlled by gestures in a touch-enabled interface. For example, a zoom-in operation can be initiated by a pinch gesture, in which two points of contact (e.g., by a user's fingers) are brought closer together during the gesture, and a zoom-out operation can be initiated by an “un-pinch” gesture, in which two points of contact are moved further apart during the gesture. The effect of a zoom operation can depend on the magnitude of the gesture (e.g., the change in distance between the contact points), the direction of the gesture (e.g., whether the gesture is a pinch or un-pinch gesture), the angle of the gesture (e.g., the angle relative to a horizontal or vertical axis), the current zoom level, and/or other factors. A single gesture can lead to any of several possible zoom effects depending on the characteristics of the gesture, as described in further detail below.
In some embodiments, zoom operations can be used to adjust time scales in a chart view (e.g., a Gantt view). A time scale may have an upper zoom limit and a lower zoom limit. If a zoom operation extends beyond a limit of a current time scale and another time scale is available beyond the respective limit, the zoom operation can cause the view to transition from the current time scale to another time scale (e.g., from day/week to week/month, or vice versa).
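One possible realization of such limit-triggered transitions is sketched below. The column-width limits and the ladder of scales are illustrative assumptions; a zoom that crosses a limit steps to the adjacent scale when one is available.

```swift
// Minimal sketch of time-scale transitions at zoom limits; values are illustrative.
enum TimeScale: Int { case dayWeek = 0, weekMonth, monthQuarter, quarterYear }

struct ScaleLimits { let minColumnWidth: Double; let maxColumnWidth: Double }

let limits: [TimeScale: ScaleLimits] = [
    .dayWeek:      ScaleLimits(minColumnWidth: 20, maxColumnWidth: 80),
    .weekMonth:    ScaleLimits(minColumnWidth: 24, maxColumnWidth: 96),
    .monthQuarter: ScaleLimits(minColumnWidth: 24, maxColumnWidth: 96),
    .quarterYear:  ScaleLimits(minColumnWidth: 24, maxColumnWidth: 96),
]

/// Applies a zoom factor to the current column width; if the result would cross
/// a limit of the current scale and an adjacent scale exists, transitions to it.
func applyZoom(scale: TimeScale, columnWidth: Double, factor: Double) -> (TimeScale, Double) {
    guard let lim = limits[scale] else { return (scale, columnWidth) }
    let w = columnWidth * factor
    if w < lim.minColumnWidth, let coarser = TimeScale(rawValue: scale.rawValue + 1),
       let next = limits[coarser] {
        return (coarser, next.maxColumnWidth)  // e.g., day/week -> week/month
    }
    if w > lim.maxColumnWidth, let finer = TimeScale(rawValue: scale.rawValue - 1),
       let next = limits[finer] {
        return (finer, next.minColumnWidth)    // e.g., week/month -> day/week
    }
    return (scale, min(max(w, lim.minColumnWidth), lim.maxColumnWidth))
}
```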
Transitions between time scales can be implemented in many different ways. For example, to provide a smooth transition between time scales, a dissolve effect or other visual effect can be used to animate transitions between day/week, week/month, month/quarter, quarter/year, or other time scales. As another example, column lines can appear or disappear or change appearance in response to zoom operations.
Views also can be adjusted in response to changes in zoom level within the same time scale. For example, an intermediate level of zoom that does not exceed a limit of a current time scale may be accompanied by a change in font size in headers and/or labels, a change in column size and/or row size, or the like. As an example, a Gantt view can respond to zoom-out and zoom-in operations by moving column lines (e.g., day lines) closer and further apart, respectively, within the same time scale.
In at least one embodiment, a Gantt view supports zoom operations along the time axis and the task axis, which can be carried out either individually or together, responsive to pinch and un-pinch gestures. A zoom operation directed to only the time axis (e.g., a horizontal axis) can be used to grow or shrink the columns representing time units and/or change time scales, allowing the user to view the same number of tasks over a longer or shorter time period.
Time scale adjustments may be constrained by time scale adjustment constraints. Such constraints may be based on task visibility. For example, time scale adjustments may be constrained to keep all tasks visible (e.g., if a project includes a number of tasks below a threshold number). This may be useful, for example, where a project contains a small number of tasks (e.g., 3, 4, or 5 tasks). As another example, time scale adjustments may be constrained to keep a minimum number of tasks visible (e.g., 2 tasks). As another example, time scale adjustments may be constrained by requiring at least one entire task (including beginning and end points) to be visible at an adjusted time scale. This may be useful, for example, to avoid zooming in so far on the time axis that the beginning point and/or end point of a task (or set of tasks) is not visible. Other task visibility constraints are also possible. Time scale adjustment constraints also can be based on user preferences and/or other factors, including factors unrelated to task visibility.
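As a non-limiting sketch, a task-visibility constraint might be checked as follows before committing an adjusted time scale. The Task shape and the constraint values are illustrative assumptions drawn from the examples above.

```swift
// Minimal sketch of a task-visibility constraint on time-scale adjustment.
struct Task { let start: Double; let end: Double }  // positions on an abstract time axis

struct VisibilityConstraint {
    var minVisibleTasks = 2        // example: keep at least two tasks visible
    var requireOneEntireTask = true
}

/// Returns true if the proposed visible time window satisfies the constraint.
func allows(window: ClosedRange<Double>, tasks: [Task],
            constraint: VisibilityConstraint) -> Bool {
    // A task is (at least partially) visible if it overlaps the window.
    let visible = tasks.filter { $0.end >= window.lowerBound && $0.start <= window.upperBound }
    guard visible.count >= constraint.minVisibleTasks else { return false }
    if constraint.requireOneEntireTask {
        // An "entire" task has both its beginning and end points inside the window.
        return visible.contains { window.contains($0.start) && window.contains($0.end) }
    }
    return true
}
```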
Referring again to
In described embodiments, a zoom operation can be directed to the task axis (e.g., a vertical axis) to view more or fewer tasks over the same time period. The zoom level along the task axis may also affect legibility of the task labels. For example, font sizes and/or label sizes may be configured to grow (and perhaps become more legible) as fewer tasks are displayed, or shrink (and perhaps become less legible) as more tasks are displayed. In order to limit any adverse effects on legibility, a limit on the number of tasks to be viewed can be set, e.g., to prevent font size and/or label size from getting too small. Such a limit may vary depending on the size of the display, user preferences, or other factors.
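A minimal sketch of such limits follows, clamping the number of visible task rows and deriving a bounded font size from the resulting row height. All bounds are hypothetical and could vary with display size or user preference.

```swift
// Minimal sketch of task-axis zoom limits intended to preserve legibility.
func visibleTaskRows(requested: Int, minRows: Int = 2, maxRows: Int = 30) -> Int {
    return min(max(requested, minRows), maxRows)
}

func rowFontSize(displayHeight: Double, rows: Int,
                 minPoints: Double = 9, maxPoints: Double = 17) -> Double {
    // Fewer rows -> taller rows -> a larger (more legible) font, within bounds.
    let rowHeight = displayHeight / Double(rows)
    return min(maxPoints, max(minPoints, rowHeight * 0.6))
}
```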
A zoom operation directed to both the time axis and the task axis can be used to adjust both the time period to be viewed and the number of tasks to be viewed (e.g., in response to a single pinch or un-pinch gesture). In at least one embodiment, a diagonal pinch gesture is used to zoom on both the time axis and the task axis. The system can determine whether a pinch gesture is diagonal or not by comparing the angle of the pinch gesture with the horizontal or vertical axis.
In at least one embodiment, the system determines whether a pinch gesture is vertical, horizontal, or diagonal as follows: if the line between the two contact points of the pinch is within 15° of horizontal, the pinch gesture is horizontal; if the line between the two contact points of the pinch is within 15° of vertical, the pinch gesture is vertical; otherwise, the pinch gesture is diagonal. These ranges are represented in
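The classification rule described above might be sketched as follows. The point type and function name are hypothetical; the 15° tolerance is the example value from this embodiment.

```swift
import Foundation  // for atan2

// Minimal sketch of pinch-gesture classification by angle.
struct ContactPoint { var x: Double; var y: Double }
enum PinchAxis { case horizontal, vertical, diagonal }

func classifyPinch(_ a: ContactPoint, _ b: ContactPoint,
                   toleranceDegrees: Double = 15) -> PinchAxis {
    // Angle of the line between the two contact points, folded into 0...90 degrees.
    let radians = atan2(abs(b.y - a.y), abs(b.x - a.x))
    let degrees = radians * 180 / Double.pi
    if degrees <= toleranceDegrees { return .horizontal }     // within 15° of horizontal
    if degrees >= 90 - toleranceDegrees { return .vertical }  // within 15° of vertical
    return .diagonal
}
```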
In at least one embodiment, a diagonal pinch gesture in a chart view causes the same degree of zoom on both the time axis and the task axis, regardless of the specific angle. However, the application of the same degree of zoom along each axis may differ in the respective effects presented in the view, as described above. Alternatively, the degree of zoom caused by a diagonal pinch gesture can vary depending on the specific angle (e.g., with a greater degree of zoom on the time axis if the angle is closer to horizontal, or with a greater degree of zoom on the task axis if the angle is closer to vertical).
In addition to scrolling operations and zoom operations, other user interface operations can permit other types of user interaction with charts and views. A gesture such as a tap or press-and-hold can be recognized when it occurs in a designated area of the display corresponding to a portion of the user interface; such a designated area is referred to herein as a “hit area.” For example, a user can interact with user interface elements by performing a tap gesture or a press-and-hold gesture on a designated hit area in a row, if the hit area is within the display area.
In at least one embodiment, the following user interface operations can be initiated by a tap gesture on a designated hit area in a row, if the hit area is within the display area.
Tap expand/collapse element: If an expand/collapse element (e.g., a box) is present (e.g., if a row has children), tapping in an area on or near the element (e.g., a row-height square area) can toggle the expanded/collapsed state. In at least one embodiment, the hit or tap area extends beyond the visible element and potentially even covers a small part of other content outside the expand/collapse element (e.g., text to its right) to ensure that the hit area is big enough for easy interaction, as illustrated in the sketch following this list.
Tap label to show and/or center task on display: If the user taps the label of a task (or a milestone or group) the view can horizontally scroll (e.g., so the left-hand side of the task bar is in the middle of the screen). This scrolling operation can be animated, but quick (e.g., 300 ms). This scrolling operation also can include shifting other content (e.g., a table or grid) to the side, if needed.
Tap bar/marker to open a row view: Tapping on the bar or marker (e.g., a task bar, a group bar, a milestone marker, etc.) in a row can open a row view. In at least one embodiment, the row to be viewed is opened in a “view row” form, and the user can move to the next or previous row or invoke an editing operation from that view. If the row is edited, the chart view can be refreshed to show any corresponding changes.
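As a non-limiting illustration of the expanded hit area described in the first item above, a hit test might be sketched as follows. The geometry type and padding value are assumptions.

```swift
// Minimal sketch of an expanded hit area for the expand/collapse element.
struct Rect {
    var x: Double, y: Double, width: Double, height: Double

    func contains(_ px: Double, _ py: Double) -> Bool {
        return px >= x && px <= x + width && py >= y && py <= y + height
    }

    /// Grows the rectangle on all sides so the tappable area exceeds the visible element.
    func expanded(by pad: Double) -> Rect {
        return Rect(x: x - pad, y: y - pad, width: width + 2 * pad, height: height + 2 * pad)
    }
}

func tapTogglesExpandCollapse(tapX: Double, tapY: Double,
                              elementFrame: Rect, padding: Double = 8) -> Bool {
    return elementFrame.expanded(by: padding).contains(tapX, tapY)
}
```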
The effects described in the preceding list may be presented temporarily, for a period of time after invocation by the user, or for as long as the user holds a given button or item on the page, providing a “show me” effect for the selected data element. In this way, the user can view the information temporarily, without permanently changing the display.
In at least one embodiment, a row menu can be activated by a press-and-hold gesture on a designated hit area in a row, if the hit area is within the display area.
Dependencies can be highlighted in different ways. For example, using the illustrative row menu 270 shown in
The extent of dependency highlighting may be limited to one task before and/or after the selected task, or more dependencies may be highlighted. Related milestones may be highlighted or ignored for highlighting purposes.
Optionally, the user may invoke a scroll-to operation by tapping on an arrow associated with a dependency, such as an antecedent. This scroll-to operation may be, e.g., a temporary zoom-to or snap-back scroll, in which the view shows the antecedent or precedent (e.g., by centering the view on it) and then returns to the original view position, or a move-to scroll that repositions the view (e.g., by centering the view on the antecedent or precedent) and comes to rest at the antecedent or precedent's location on the chart.
Highlighting can be temporary, or it can be displayed until dismissed. Some gestures may cause highlighting to be dismissed, while other gestures may allow highlighting to continue to be displayed. In at least one embodiment, a tap gesture is effective to dismiss the highlighting while pinch and flick gestures do not dismiss the highlighting. As shown in
Highlighting also can be preserved in a persistent state. Persistent highlighting can be used for generating a tracing of dependencies between tasks, milestones, and the like. Such tracings can be preserved in a snapshot format (e.g., as an image or as a saved state of an interactive chart view) for later reference. In one illustrative scenario, tracings can be saved and shared with other users, e.g., by sending a tracing (e.g., in an image file) to other users or by allowing other users to access a saved state of an interactive chart view.
In the illustrative method 1300 shown in
Operating Environment
Unless otherwise specified in the context of specific examples, described techniques and tools may be implemented by any suitable computing devices, including, but not limited to, laptop computers, desktop computers, smart phones, tablet computers, and/or the like.
Some of the functionality described herein may be implemented in the context of a client-server relationship. In this context, server devices may include suitable computing devices configured to provide information and/or services described herein. Server devices may include any suitable computing devices, such as dedicated server devices. Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device. The term “client” can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server. At various times, a single device may act as a server, a client, or both a server and a client, depending on context and configuration. Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location.
In its most basic configuration, the computing device 1400 includes at least one processor 1402 and a system memory 1404 connected by a communication bus 1406. Depending on the exact configuration and type of device, the system memory 1404 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or other memory technology. Those of ordinary skill in the art and others will recognize that system memory 1404 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 1402. In this regard, the processor 1402 may serve as a computational center of the computing device 1400 by supporting the execution of instructions.
As further illustrated in
In the exemplary embodiment depicted in
As used herein, the term “computer-readable medium” includes volatile and nonvolatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, in-memory databases, or other data. In this regard, the system memory 1404 and storage medium 1408 depicted in
For ease of illustration and because it is not important for an understanding of the claimed subject matter,
In any of the described examples, data can be captured by input devices and transmitted or stored for future processing. The processing may include encoding data streams, which can be subsequently decoded for presentation by output devices. Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device). Input devices can be separate from and communicatively coupled to computing device 1400 (e.g., a client device), or can be integral components of the computing device 1400. In some embodiments, multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone). Any suitable input device either currently known or developed in the future may be used with systems described herein.
The computing device 1400 may also include output devices such as a display, speakers, printer, etc. The output devices may include video output devices such as a display or touchscreen. The output devices also may include audio output devices such as external speakers or earphones. The output devices can be separate from and communicatively coupled to the computing device 1400, or can be integral components of the computing device 1400. In some embodiments, multiple output devices may be combined into a single device (e.g., a display with built-in speakers). Further, some devices (e.g., touchscreens) may include both input and output functionality integrated into the same input/output device. Any suitable output device either currently known or developed in the future may be used with described systems.
In general, functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like. Computing logic may be compiled into executable programs or written in interpreted programming languages. Generally, functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub-modules. The computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general-purpose or special-purpose processors, thus creating a special-purpose computing device configured to provide functionality described herein.
Extensions and Alternatives
Although some examples are described herein with regard to illustrative touch-enabled mobile computing devices and a corresponding interactive chart view application that can be executed on the illustrative devices, the principles described herein also can be applied to any other computing devices having limited display areas, whether such computing devices employ touchscreen input or other input modes. Described embodiments can be applied to any size display to provide powerful and flexible capabilities, even though the need may be more acute for smaller displays.
For touch-enabled devices, many different types of touch input can be used, and touch input can be interpreted in different ways. Inertia effects, friction effects, and the like can be used to provide a more realistic feel for touch input. For example, in a touch-enabled interface, a flick gesture can be used to initiate a scrolling motion at an initial velocity that gradually decreases (e.g., based on a friction coefficient) before coming to rest.
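As a non-limiting illustration, a friction-based decay of flick velocity might be sketched as follows. The exponential decay model and the constants are assumptions rather than a required implementation.

```swift
import Foundation  // for exp

// Minimal sketch of inertial scrolling with a friction coefficient.
func flickOffsets(initialVelocity: Double,   // points per second, from the flick
                  friction: Double = 4.0,    // larger values stop the scroll sooner
                  frameInterval: Double = 1.0 / 60.0) -> [Double] {
    var offsets: [Double] = []
    var velocity = initialVelocity
    var position = 0.0
    // Exponential decay: velocity shrinks each frame until effectively at rest.
    while abs(velocity) > 1.0 {
        position += velocity * frameInterval
        velocity *= exp(-friction * frameInterval)
        offsets.append(position)
    }
    return offsets
}
```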
A “tools” menu can be provided to enhance the functionality of an interactive chart view application. For example, functions such as share (e.g., to share a link to a chart), send (e.g., to send a copy of a chart or a related file), refresh (e.g., to update a chart view after editing), help (e.g., to access a web-based or application-based help library), and cancel (e.g., to discard edits), can be provided by a tools menu.
Many alternatives to the systems and devices described herein are possible. For example, individual modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems. As another example, modules or subsystems can be omitted or supplemented with other modules or subsystems. As another example, functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems. Although some examples in the present disclosure include descriptions of devices comprising specific hardware components in specific arrangements, techniques and tools described herein can be modified to accommodate different hardware components, combinations, or arrangements. Further, although some examples in the present disclosure include descriptions of specific usage scenarios, techniques and tools described herein can be modified to accommodate different usage scenarios. Functionality that is described as being implemented in software can instead be implemented in hardware, or vice versa.
Many alternatives to the techniques described herein are possible. For example, processing stages in the various techniques can be separated into additional stages or combined into fewer stages. As another example, processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages. As another example, processing stages that are described as occurring in a particular order can instead occur in a different order. As another example, processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages. As another example, processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.
Many alternatives to the user interfaces described herein are possible. In practice, the user interfaces described herein may be implemented as separate user interfaces or as different states of the same user interface, and the different states can be presented in response to different events, e.g., user input events. The elements shown in the user interfaces can be modified, supplemented, or replaced with other elements in various possible implementations.
The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the claimed subject matter.
This application is a continuation of U.S. patent application Ser. No. 14/205,277, filed Mar. 11, 2014, which claims the benefit of U.S. Provisional Application No. 61/862,919, filed Aug. 6, 2013, the disclosures of which are hereby incorporated by reference herein in their entirety.
Related U.S. Application Data
Provisional application: No. 61/862,919, filed Aug. 2013 (US).
Parent application: Ser. No. 14/205,277, filed Mar. 2014 (US); child application: Ser. No. 14/268,987 (US).