This invention relates generally to devices and methods for displaying graphical views of data. The invention relates specifically to devices and methods for manipulating user interfaces displaying graphical views of data.
Data sets with hundreds of variables or more arise today in many contexts, including, for example: gene expression data for uncovering the link between the genome and the various proteins for which it codes; demographic and consumer profiling data for capturing underlying sociological and economic trends; sales and marketing data for huge numbers of products in vast and ever-changing marketplaces; and environmental measurements for understanding phenomena such as pollution, meteorological changes, and resource impact issues.
Data visualization is a powerful tool for exploring large data sets, both by itself and coupled with data mining algorithms. Graphical views provide user-friendly ways to visualize and interpret data. However, the task of effectively visualizing large databases imposes significant demands on the human-computer interface to the visualization system.
In addition, as computing and networking speeds increase, data visualization that was traditionally performed on desktop computers can also be performed on portable electronic devices, such as smart phones, tablets, and laptop computers. These portable devices typically use touch-sensitive surfaces (e.g., touch screens and/or trackpads) as input devices. These portable devices typically have significantly smaller displays than desktop computers. Thus, additional challenges arise in using touch-sensitive surfaces to manipulate graphical views of data in a user-friendly manner on portable devices.
Consequently, there is a need for faster, more efficient methods and interfaces for manipulating graphical views of data. Such methods and interfaces may complement or replace conventional methods for visualizing data. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above deficiencies and other problems associated with visualizing data are reduced or eliminated by the disclosed methods, devices, and storage mediums. Various implementations of methods, devices, and storage mediums within the scope of the appended claims each have several aspects, no single one of which is solely responsible for the attributes described herein. Without limiting the scope of the appended claims, after considering this disclosure, one will understand how the aspects of various implementations are used to visualize data.
In one aspect, some embodiments include methods for visualizing data.
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a first chart on the display. The first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. The method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
In some embodiments, the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, ceasing to display the first visual mark.
In some embodiments, the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, displaying an indicium that the first category has been removed.
In some embodiments, the method includes, while displaying the indicium that the first category has been removed, changing from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart. The second chart concurrently displays a second set of categories that are distinct from the first set of categories, and each respective category in the second set of categories has a corresponding visual mark displayed in the second chart. The method also includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed and, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart.
In some embodiments, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering display of the second set of categories in the second chart.
In some embodiments, the method includes, after updating display of the second chart to reflect inclusion of data that corresponds to the first category, detecting a third touch input, and, in response to detecting the third touch input, updating display of the second chart to reflect removal of data that corresponds to the first category in the first chart.
In some embodiments, the method includes, while displaying the first chart on the display, detecting a fourth touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart. The method also includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart: maintaining display of the second category and the second visual mark in the first chart; removing display of all categories, other than the second category, in the first set of categories; and removing display of all visual marks, other than the second visual mark, that correspond to categories in the first set of categories.
In some embodiments, the method includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, displaying an indicium that only the second category in the first set of categories remains displayed.
In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction.
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a first chart on the display. The first chart is derived from a set of data. The first chart concurrently displays a first set of categories and a label for the first set of categories. Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. The method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. The second chart is derived from the set of data. The second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
In some embodiments, a label for the first field and aggregation type is displayed with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart.
In some embodiments, a label for the first field and aggregation type is displayed with the first chart and the method includes, in response to detecting the first touch input: displaying an animation of the second set of categories replacing the first set of categories; displaying an animation of the label for the second set of categories replacing the label for the first set of categories; and maintaining display of the label for the first field and aggregation type.
In some embodiments, the method includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field. The method also includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, updating display of the second chart to reflect inclusion of the predefined subset of data in the aggregated values.
In some embodiments, updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering display of the second set of categories in the second chart.
In some embodiments, the method includes, after updating display of the second chart to reflect inclusion of the predefined subset of data, detecting a third touch input. The method also includes, in response to detecting the third touch input, updating display of the second chart to reflect removal of the predefined subset of data.
In some embodiments, replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs without displaying a selection menu.
In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. The method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart. The method further includes detecting selection of a respective set of categories in the selection menu; and, in response to detecting selection of the respective set of categories in the selection menu: replacing display of the second chart with a third chart that contains the selected respective set of categories; and ceasing to display the selection menu.
In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface; and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. The method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart. The method further includes detecting selection of a first set of categories in the selection menu and a second set of categories in the selection menu; and, in accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu: replacing display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceasing to display the selection menu.
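The label-swipe behavior above amounts to re-aggregating the same underlying data set by a different categorical field while the aggregated field and aggregation type are unchanged. The following TypeScript sketch illustrates that re-aggregation under assumed names (DataRow, aggregateByCategory) and sample fields; it is not taken from the disclosure.

```typescript
// Hypothetical re-aggregation of the same records by a different grouping field.

interface DataRow {
  [field: string]: string | number;
}

type AggregationType = "sum" | "count";

// Groups rows by a categorical field and aggregates a numeric field,
// producing one (category, value) pair per visual mark in the chart.
function aggregateByCategory(
  rows: DataRow[],
  categoryField: string,
  valueField: string,
  aggregation: AggregationType
): Map<string, number> {
  const result = new Map<string, number>();
  for (const row of rows) {
    const key = String(row[categoryField]);
    const current = result.get(key) ?? 0;
    const contribution = aggregation === "sum" ? Number(row[valueField]) : 1;
    result.set(key, current + contribution);
  }
  return result;
}

const sales: DataRow[] = [
  { region: "West", quarter: "Q1", amount: 100 },
  { region: "West", quarter: "Q2", amount: 150 },
  { region: "East", quarter: "Q1", amount: 80 },
];

// First chart: aggregated by region; after swiping the category label,
// the second chart aggregates the same field by quarter instead.
console.log(aggregateByCategory(sales, "region", "amount", "sum"));  // West -> 250, East -> 80
console.log(aggregateByCategory(sales, "quarter", "amount", "sum")); // Q1 -> 180, Q2 -> 150
```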
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display. The chart has a horizontal axis and a vertical axis. The horizontal axis includes first horizontal scale markers. The vertical axis includes first vertical scale markers. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. The method further includes, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
In some embodiments, the first touch input is a de-pinch gesture.
In some embodiments, the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; displaying second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continuing to maintain the vertical scale of the chart.
In some embodiments, the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; replacing a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continuing to maintain the vertical scale of the chart.
In some embodiments, the method includes, after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, ceasing to detect the first touch input. The method also includes, in response to ceasing to detect the first touch input, changing a vertical scale of the chart.
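A minimal sketch of the horizontal-only expansion described above, assuming a ChartScales structure with separate horizontal and vertical pixels-per-unit factors: the horizontal factor grows with the gesture while the vertical factor is held constant, and the vertical scale is recomputed only after the touch input ends. Introducing finer horizontal scale markers or finer-grained data marks once the horizontal factor is large enough, as in the embodiments above, is omitted here for brevity; all names and thresholds are assumptions.

```typescript
// Hypothetical scales model for horizontal-only expansion during a de-pinch.

interface ChartScales {
  pixelsPerHorizontalUnit: number; // distance between adjacent horizontal scale markers
  pixelsPerVerticalUnit: number;   // distance between adjacent vertical scale markers
}

// Applied repeatedly while the de-pinch gesture is in progress.
function expandHorizontally(scales: ChartScales, gestureScaleFactor: number): ChartScales {
  return {
    pixelsPerHorizontalUnit: scales.pixelsPerHorizontalUnit * gestureScaleFactor,
    pixelsPerVerticalUnit: scales.pixelsPerVerticalUnit, // vertical scale unchanged during the gesture
  };
}

// Applied once the touch input is no longer detected: the vertical scale is
// adjusted so that the visible data marks fill the plot height.
function rescaleVertically(scales: ChartScales, visibleValues: number[], plotHeightPixels: number): ChartScales {
  const maxVisible = Math.max(...visibleValues, 1);
  return {
    pixelsPerHorizontalUnit: scales.pixelsPerHorizontalUnit,
    pixelsPerVerticalUnit: plotHeightPixels / maxVisible,
  };
}

let scales: ChartScales = { pixelsPerHorizontalUnit: 40, pixelsPerVerticalUnit: 2 };
scales = expandHorizontally(scales, 1.5);               // during the gesture
scales = rescaleVertically(scales, [30, 55, 42], 300);  // after lift-off
console.log(scales);
```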
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. The method further includes, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. The method further includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
In some embodiments, the second touch input is a same type of touch input as the first touch input.
In some embodiments, the information about the first data mark comprises a data record that corresponds to the first data mark.
In some embodiments, the data-mark-information-display criteria include the second magnification being a predefined magnification.
In some embodiments, the data-mark-information-display criteria include the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input.
In some embodiments, the data-mark-information-display criteria include the first data mark reaching a predefined magnification during the second touch input.
In some embodiments, the data-mark-information-display criteria include the device zooming in to display only the first data mark in the plurality of data marks during the second touch input.
In some embodiments, the method includes, in accordance with the determination that one or more predefined data-mark-information-display criteria are met, ceasing to display the first data mark.
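The following TypeScript sketch illustrates one way the data-mark-information-display criteria above could gate a zoom gesture: zoom further when the criteria are not met, and display information about the remaining data mark when they are. The threshold value and the particular criteria chosen here (a single visible mark, or a magnification threshold) are illustrative assumptions drawn from the possibilities listed above.

```typescript
// Hypothetical decision logic for a zoom gesture over a data mark.

interface ZoomState {
  magnification: number;
  visibleDataMarkIds: number[];
}

const DETAIL_MAGNIFICATION_THRESHOLD = 8; // assumed threshold, for illustration

function infoDisplayCriteriaMet(state: ZoomState): boolean {
  const onlyOneMarkVisible = state.visibleDataMarkIds.length === 1;
  const thresholdReached = state.magnification >= DETAIL_MAGNIFICATION_THRESHOLD;
  return onlyOneMarkVisible || thresholdReached;
}

function handleZoomGesture(state: ZoomState, zoomFactor: number): string {
  if (infoDisplayCriteriaMet(state)) {
    // Criteria met: display information (e.g., the underlying data record)
    // about the remaining data mark instead of zooming further.
    return `show info for data mark ${state.visibleDataMarkIds[0]}`;
  }
  // Criteria not met: zoom in to a smaller portion of the chart.
  return `zoom to magnification ${state.magnification * zoomFactor}`;
}

console.log(handleZoomGesture({ magnification: 2, visibleDataMarkIds: [3, 4, 5] }, 2)); // keeps zooming
console.log(handleZoomGesture({ magnification: 8, visibleDataMarkIds: [4] }, 2));       // shows detail
```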
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display, the chart including a plurality of data marks and detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value. The method also includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area and visually distinguishing the first predefined area. The method further includes, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface and, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
In some embodiments, the first touch input is a tap gesture.
In some embodiments, the first predefined area includes a column in the chart.
In some embodiments, the first predefined area includes a single data mark in the plurality of data marks.
In some embodiments, data marks in the plurality of data marks are displayed in corresponding columns in the chart, with a single data mark per column.
In some embodiments, data marks in the plurality of data marks are separated horizontally from one another.
In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area.
In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area.
In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area.
In some embodiments, the second touch input is a drag gesture, and the method includes detecting movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values. The method also includes, in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, displaying a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas.
In some embodiments, after the second touch input, a selected area in the chart comprises the first predefined area and the sequence of predefined areas, and the method includes detecting a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface. The method also includes, in response to detecting the third touch input: moving the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one; and displaying a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area.
In some embodiments, after the second touch input, a selected area in the chart comprises the first predefined area and the sequence of predefined areas, and the method includes detecting a fourth touch input. The method also includes, in response to detecting the fourth touch input: zooming in on the selected area in the chart; in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, maintaining selection of the selected area; and in accordance with a determination that only areas in the chart in the selected area are displayed on the display, ceasing selection of the selected area.
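As a non-limiting sketch of the selection behavior above, the following TypeScript selects a column, extends the selection by a drag across adjacent columns, reports the change between the first and last selected values, and slides the fixed-width selection window in response to a further input. The names (Selection, selectionChange, moveSelection) and the sample values are assumptions for illustration.

```typescript
// Hypothetical column-selection model: tap selects, drag extends, a later drag
// moves the whole selection while keeping its width constant.

interface Selection {
  startIndex: number; // first predefined area (e.g., a column) in the selection
  endIndex: number;   // last predefined area reached by the drag
}

// Change displayed between the first selected value and the last one.
function selectionChange(values: number[], selection: Selection): number {
  return values[selection.endIndex] - values[selection.startIndex];
}

// Moves the whole selected window across the chart while keeping its width,
// clamping it to the available columns.
function moveSelection(selection: Selection, offset: number, columnCount: number): Selection {
  const width = selection.endIndex - selection.startIndex;
  const newStart = Math.min(Math.max(selection.startIndex + offset, 0), columnCount - 1 - width);
  return { startIndex: newStart, endIndex: newStart + width };
}

const columnValues = [10, 14, 9, 20, 25, 18];
let selection: Selection = { startIndex: 1, endIndex: 1 };    // first touch input selects column 1
selection = { ...selection, endIndex: 4 };                    // drag extends across columns 2..4
console.log(selectionChange(columnValues, selection));        // 25 - 14 = 11
selection = moveSelection(selection, 1, columnValues.length); // later input slides the window right
console.log(selection, selectionChange(columnValues, selection)); // {2..5}, 18 - 9 = 9
```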
In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display. The chart has a horizontal axis with a first horizontal scale with first horizontal scale markers. The chart has a vertical axis with a first vertical scale with first vertical scale markers. The chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate. The chart includes a line that connects adjacent data marks in the first set of data marks. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; adding a second set of data marks, distinct from the first set of data marks, on the line. Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate. Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark. The method further includes, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
In some embodiments, adjacent data marks in the first set of data marks are separated by a first horizontal distance.
In some embodiments, adjacent data marks in the second set of data marks are separated by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale.
In some embodiments, each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark.
In some embodiments, a shape of the line is maintained when the second set of data marks is added to the line.
In some embodiments, a single data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks.
In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs while detecting the first input.
In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs after ceasing to detect the first input.
In some embodiments, the second vertical scale is the same as the first vertical scale.
In some embodiments, animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the second set of data marks occur concurrently.
In some embodiments, the method includes ceasing to display the first set of data marks when the second set of data marks is added.
In some embodiments, the method includes ceasing to display the first set of data marks after the second set of data marks is added.
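The two-step transition above, in which finer-grained data marks are first placed on the existing line using only their abscissas and are then animated vertically to their true ordinates while the line is redrawn through them, can be sketched as follows. Linear interpolation along the line and the names used here are illustrative assumptions, not the disclosed implementation.

```typescript
// Hypothetical sketch: place finer-grained marks on the existing line, then
// animate each mark toward its true ordinate.

interface DataMark {
  x: number; // abscissa
  y: number; // ordinate
}

// Linear interpolation of the existing line at a given abscissa.
function ordinateOnLine(line: DataMark[], x: number): number {
  for (let i = 0; i < line.length - 1; i++) {
    const a = line[i];
    const b = line[i + 1];
    if (x >= a.x && x <= b.x) {
      const t = (x - a.x) / (b.x - a.x);
      return a.y + t * (b.y - a.y);
    }
  }
  return line[line.length - 1].y;
}

// Step 1: place each finer-grained mark on the line, independent of its ordinate.
function placeOnLine(line: DataMark[], finerMarks: DataMark[]): DataMark[] {
  return finerMarks.map((m) => ({ x: m.x, y: ordinateOnLine(line, m.x) }));
}

// Step 2: one animation frame that moves each mark toward its true ordinate
// (progress runs from 0 to 1 over the animated transition).
function animateTowardOrdinates(placed: DataMark[], finerMarks: DataMark[], progress: number): DataMark[] {
  return placed.map((m, i) => ({ x: m.x, y: m.y + (finerMarks[i].y - m.y) * progress }));
}

const yearlyLine: DataMark[] = [{ x: 0, y: 100 }, { x: 12, y: 160 }];
const monthlyMarks: DataMark[] = [
  { x: 0, y: 100 }, { x: 3, y: 90 }, { x: 6, y: 170 }, { x: 9, y: 130 }, { x: 12, y: 160 },
];
const placed = placeOnLine(yearlyLine, monthlyMarks);         // marks initially sit on the old line
console.log(animateTowardOrdinates(placed, monthlyMarks, 1)); // after the animation, marks match their ordinates
```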
In another aspect, some embodiments include electronic devices for visualizing data. In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a first chart on the display. The first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a first chart on the display. The first chart is derived from a set of data. The first chart concurrently displays a first set of categories and a label for the first set of categories. Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. The second chart is derived from the set of data. The second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display. The chart has a horizontal axis and a vertical axis. The horizontal axis includes first horizontal scale markers. The vertical axis includes first vertical scale markers. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. The one or more programs further include instructions for, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. The one or more programs further include instructions for, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. The one or more programs further include instructions for, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display, the chart including a plurality of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area; and visually distinguishing the first predefined area. The one or more programs further include instructions for, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface. The one or more programs further include instructions for, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display. The chart has a horizontal axis with a first horizontal scale with first horizontal scale markers. The chart has a vertical axis with a first vertical scale with first vertical scale markers. The chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate. The chart includes a line that connects adjacent data marks in the first set of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; adding a second set of data marks, distinct from the first set of data marks, on the line. Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate. Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark. The one or more programs further include instructions for, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing any of the methods described herein.
In yet another aspect, some embodiments include a non-transitory computer readable storage medium, storing one or more programs for execution by one or more processors of an electronic device with a display and a touch-sensitive surface, the one or more programs including instructions for performing any of the methods described herein.
In yet another aspect, some embodiments include a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising user interfaces displayed in accordance with any of the methods described herein.
Thus, electronic devices with displays and touch-sensitive surfaces are provided with faster, more efficient methods and interfaces for data visualization, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for data visualization.
So that the present disclosure can be understood in greater detail, a more particular description may be had by reference to the features of various implementations, some of which are illustrated in the appended drawings. The appended drawings, however, merely illustrate the more pertinent features of the present disclosure and are therefore not to be considered limiting, for the description may admit to other effective features.
In accordance with common practice, the various features illustrated in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may not depict all of the components of a given system or device. Finally, like reference numerals may be used to denote like features throughout the specification and figures.
As portable electronic devices become more compact, and the number of functions performed by applications on any given device increases, it has become a significant challenge to design user interfaces that allow users to interact with the applications easily. This challenge is particularly significant for portable devices with smaller screens and/or limited input devices. In addition, data visualization applications need to provide user-friendly ways to explore data in order to enable a user to extract significant meaning from a particular data set. Some application designers have resorted to using complex menu systems to enable a user to perform desired functions. These conventional user interfaces often result in complicated key sequences and/or menu hierarchies that must be memorized by the user and/or that are otherwise cumbersome and/or not intuitive to use.
The methods, devices, and GUIs described herein make manipulation of data sets and data visualizations more efficient and intuitive for a user. A number of different intuitive user interfaces for data visualizations are described below. For example, applying a filter to a data set can be accomplished by a simple touch input on a given portion of a displayed chart rather than via a nested menu system. Additionally, switching between chart categories can be accomplished by a simple touch input on a displayed chart label.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Embodiments of electronic devices and user interfaces for such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, a microphone, and/or a joystick.
Device 100 includes one or more processing units (CPU's) 302, input/output (I/O) subsystem 306, memory 308 (which optionally includes one or more computer readable storage mediums), and network communications interface 310. These components optionally communicate over one or more communication buses or signal lines 304. Communication buses 304 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
Memory 308 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 308 optionally includes one or more storage devices remotely located from processor(s) 302. Memory 308, or alternately the non-volatile memory device(s) within memory 308, comprises a non-transitory computer readable storage medium.
In some embodiments, the software components stored in memory 308 include operating system 318, communication module 320, input/output (I/O) module 322, and applications 328. In some embodiments, one or more of the various modules comprises a set of instructions in memory 308. In some embodiments, memory 308 stores one or more data sets in one or more database(s) 332.
Operating system 318 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware, software, and/or firmware components.
Communication module 320 facilitates communication with other devices over one or more external ports and also includes various software components for handling data received from other devices.
I/O module 322 includes touch input sub-module 324 and graphics sub-module 326. Touch input sub-module 324 optionally detects touch inputs with touch screen 102 and other touch sensitive devices (e.g., a touchpad or physical click wheel). Touch input sub-module 324 includes various software components for performing various operations related to detection of a touch input, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Touch input sub-module 324 receives contact data from the touch-sensitive surface (e.g., touch screen 102). These operations are, optionally, applied to single touch inputs (e.g., one finger contacts) or to multiple simultaneous touch inputs (e.g., “multitouch”/multiple finger contacts). In some embodiments, touch input sub-module 324 detects contact on a touchpad.
Touch input sub-module 324 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of a data mark). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
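A minimal sketch of classifying a contact pattern as a tap or a swipe from finger-down, finger-dragging, and finger-up events, as described above: the distance threshold ("slop") and the function names are illustrative assumptions, and a production touch-input module would also handle multi-finger contacts, velocity, and cancellation.

```typescript
// Hypothetical gesture classification from a sequence of contact events.

interface TouchEventSample {
  type: "down" | "move" | "up";
  x: number;
  y: number;
}

type Gesture =
  | { kind: "tap" }
  | { kind: "swipe"; direction: "left" | "right" | "up" | "down" }
  | { kind: "none" };

const TAP_SLOP_PX = 10; // maximum movement still treated as "substantially the same position"

function classifyGesture(events: TouchEventSample[]): Gesture {
  const down = events.find((e) => e.type === "down");
  const up = [...events].reverse().find((e) => e.type === "up");
  if (!down || !up) return { kind: "none" };

  const dx = up.x - down.x;
  const dy = up.y - down.y;
  // Tap: finger-up at (substantially) the same position as finger-down.
  if (Math.hypot(dx, dy) <= TAP_SLOP_PX) return { kind: "tap" };

  // Swipe: finger-down, one or more finger-dragging events, then finger-up.
  if (Math.abs(dx) >= Math.abs(dy)) {
    return { kind: "swipe", direction: dx < 0 ? "left" : "right" };
  }
  return { kind: "swipe", direction: dy < 0 ? "up" : "down" };
}

console.log(classifyGesture([{ type: "down", x: 200, y: 300 }, { type: "up", x: 203, y: 301 }])); // tap
console.log(classifyGesture([
  { type: "down", x: 200, y: 300 }, { type: "move", x: 120, y: 305 }, { type: "up", x: 80, y: 302 },
])); // swipe left
```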
Graphics sub-module 326 includes various known software components for rendering and displaying graphics on touch screen 102 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation data visualizations, icons (such as user-interface objects including soft keys), text, digital images, animations and the like. In some embodiments, graphics sub-module 326 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics sub-module 326 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to the display or touch screen.
Applications 328 optionally include data visualization module 330 for displaying graphical views of data and one or more other applications. Examples of other applications that are, optionally, stored in memory 308 include word processing applications, email applications, and presentation applications.
In conjunction with I/O subsystem 306, including touch screen 102, CPU(s) 302, and/or database(s) 332, data visualization module 330 includes executable instructions for displaying and manipulating various graphical views of data.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 308 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 308 optionally stores additional modules and data structures not described above.
Device 200 typically includes one or more processing units/cores (CPUs) 352, one or more network or other communications interfaces 362, memory 350, I/O interface 356, and one or more communication buses 354 for interconnecting these components. Communication buses 354 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
I/O interface 356 comprises screen 202 (also sometimes called a display), touch-sensitive surface 204, and one or more sensor(s) 360 (e.g., optical, acceleration, proximity, and/or touch-sensitive sensors). I/O interface 356 optionally includes a keyboard and/or mouse (or other pointing device) 358. I/O interface 356 couples input/output peripherals on device 200, such as screen 202, touch-sensitive surface 204, other input devices 358, and one or more sensor(s) 360, to CPU(s) 352 and/or memory 350.
Screen 202 provides an output interface between the device and a user. Screen 202 displays visual output to the user. The visual output optionally includes graphics, text, icons, data marks, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects. Screen 202 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments.
In addition to the touch screen, device 200 includes touch-sensitive surface 204 (e.g., a touchpad) for detecting touch inputs. Touch-sensitive surface 204 accepts input from the user via touch inputs. For example, touch input 210 in
Memory 350 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 350 optionally includes one or more storage devices remotely located from CPU(s) 352. In some embodiments, the software components stored in memory 350 include operating system 364, communication module 366, input/output (I/O) module 368, and applications 374. In some embodiments, one or more of the various modules comprises a set of instructions in memory 350. In some embodiments, memory 350 stores one or more data sets in one or more database(s) 378. In some embodiments, I/O module 368 includes touch input sub-module 370 and graphics sub-module 372. In some embodiments, applications 374 include data visualization module 376.
In some embodiments, memory 350 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 308 of portable multifunction device 100 (
Device 200 also includes a power system for powering the various components. The power system optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management, and distribution of power in portable devices.
Each of the above identified elements in
Attention is now directed towards embodiments of user interfaces (“UI”) that are, optionally, implemented on portable multifunction device 100 or device 200. The following examples are shown utilizing a touch screen (e.g., touch screen 102 in
Attention is now directed towards methods that are, optionally, implemented on portable multifunction device 100 or device 200.
As described below, method 2000 provides an intuitive way to change filtering. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when applying and/or removing filters, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust filters faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2002) a first chart on the display. For example,
The first chart concurrently displays (2004) a first set of categories. For example, the bar chart in
Each respective category in the first set of categories has (2006) a corresponding visual mark (e.g., a picture, drawing, or other graphic) displayed in the first chart. For example, a respective category in a bar chart has a corresponding bar that represents a value for that respective category, a respective category in a pie chart has a corresponding slice of the pie chart that represents a value for that respective category, etcetera. For example, the bar chart in
The device detects (2008) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. For example,
In some embodiments, the first touch input is (2010) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface. For example, a leftward drag gesture. For example, the movement of contact 510 shown in
In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device removes (2014) the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. For example,
In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device updates (2016) display of the first chart. For example, repositioning the remaining categories in the first set and their corresponding visual marks (e.g., graphics) in the first chart. Thus, data that corresponds to the first category is filtered out of the first chart. This process may be repeated to remove additional categories in the first set of categories from the first chart. In some embodiments, the contact is a stylus contact. For example,
In some embodiments, in response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device ceases (2018) to display the first visual mark. In some embodiments, the first visual mark remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the first visual mark ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface. For example,
In some embodiments, while displaying (2020) the first chart on the display, the device detects a fourth touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart. For example,
In some embodiments, in response (2022) to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, the device: maintains (2024) display of the second category and the second visual mark in the first chart; removes display of all categories, other than the second category, in the first set of categories; and removes display of all visual marks, other than the second visual mark, that correspond to categories in the first set of categories. In some embodiments, the device responds differently to different finger gestures made on the touch-sensitive surface at a location that corresponds to a respective graphic for a respective category in the chart. For example, in accordance with a determination that the gesture is a leftward swipe or drag gesture, the device removes the respective category from the chart. On the other hand, in accordance with a determination that the gesture is a rightward swipe or drag gesture, the device maintains the respective category, but removes all the other categories from the chart. For example,
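By way of illustration only, a minimal sketch of classifying the swipe direction over a visual mark, where a leftward swipe maps to removing the category and a rightward swipe maps to keeping only that category; the pixel threshold and action names are assumptions for illustration.

```typescript
// Hypothetical sketch: dispatching on the direction of a swipe or drag gesture
// made over a visual mark. The 30-pixel threshold is an assumption.

type SwipeAction = "remove-category" | "keep-only-category" | "none";

function classifySwipe(startX: number, endX: number, threshold = 30): SwipeAction {
  const deltaX = endX - startX;
  if (deltaX <= -threshold) return "remove-category";   // leftward drag or swipe
  if (deltaX >= threshold) return "keep-only-category"; // rightward drag or swipe
  return "none";
}

// Example: a finger contact that starts at x=200 and lifts off at x=140.
console.log(classifySwipe(200, 140)); // "remove-category"
console.log(classifySwipe(200, 270)); // "keep-only-category"
```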
In some embodiments, in response (2022) to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, the device displays (2026) an indicium that only the second category in the first set of categories remains displayed. For example,
In some embodiments, the first touch input is (2028) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture) and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction (e.g., a rightward drag gesture). In some embodiments, the second predefined direction is opposite the first predefined direction. In some embodiments, the second predefined direction is perpendicular to the first predefined direction. For example, the movement of contact 510 shown in
In some embodiments, in response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device displays (2030) an indicium that the first category has been removed. In some embodiments, an indicium is displayed that indicates that data corresponding to the first category has been filtered out of the data that is used to create various related charts, such as the first chart and the second chart. For example,
In some embodiments, while displaying the indicium that the first category has been removed, the device changes (2032) from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart. For example,
In some embodiments, the second chart concurrently displays (2034) a second set of categories that are distinct from the first set of categories. Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart. For example,
In some embodiments, while displaying the second chart with the second set of categories, the device detects (2036) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed. For example,
In some embodiments, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, the device updates (2038) display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart. Thus, data that corresponds to the first category, which was filtered out of the first chart and remained filtered out when the second chart was initially displayed, is added to the second chart and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that corresponds to the first category. For example,
In some embodiments, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering (2040) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart are ordered largest to smallest, and adding in the data that corresponds to the first category in the first chart changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order of the second set of categories. For example, via an animated rearrangement of the second set of categories as shown in
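By way of illustration only, a small sketch of the reordering described above, assuming the categories are kept sorted largest to smallest; the data shapes are hypothetical.

```typescript
// Hypothetical sketch: after filtered-out data is re-included, re-sort the
// categories largest to smallest so the visual marks appear in the updated order.

interface Bar {
  category: string;
  value: number;
}

function reorderLargestToSmallest(bars: Bar[]): Bar[] {
  return [...bars].sort((a, b) => b.value - a.value);
}

// Example: re-including the "Furniture" data changes the order of the categories.
const bars: Bar[] = [
  { category: "Technology", value: 300 },
  { category: "Office Supplies", value: 250 },
  { category: "Furniture", value: 280 }, // newly re-included
];
console.log(reorderLargestToSmallest(bars).map((b) => b.category));
// ["Technology", "Furniture", "Office Supplies"]
```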
In some embodiments, after updating display of the second chart to reflect inclusion of data that corresponds to the first category, the device detects (2042) a third touch input. For example, a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the first category had been removed. For example,
In some embodiments, in response to detecting a third touch input, the device updates (2044) display of the second chart to reflect removal of data that corresponds to the first category in the first chart. Thus, data that corresponds to the first category, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture) and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the data that corresponds to the first category. For example,
As described below, method 2100 provides an intuitive way to change chart categories. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when changing chart categories, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to switch categories faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2102) a first chart on the display. For example,
The first chart is derived (2104) from a set of data. For example, the chart in
The first chart concurrently displays (2106) a first set of categories and a label for the first set of categories. For example,
Each respective category in the first set of categories has (2108) a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. For example, in
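By way of illustration only, the following sketch shows one way to compute the aggregate value that each visual mark represents, grouping rows by category and applying an aggregation type such as SUM, AVERAGE, MAX, MIN, or COUNT. Field names and data shapes are assumptions for illustration, not the disclosed implementation.

```typescript
// Hypothetical sketch: aggregating a field of the data set by category.

interface Row {
  [field: string]: string | number;
}

type Aggregation = "SUM" | "AVERAGE" | "MAX" | "MIN" | "COUNT";

function aggregateByCategory(
  rows: Row[],
  categoryField: string,
  valueField: string,
  agg: Aggregation
): Map<string, number> {
  // Group the raw values by category.
  const groups = new Map<string, number[]>();
  for (const row of rows) {
    const key = String(row[categoryField]);
    const value = Number(row[valueField]);
    groups.set(key, [...(groups.get(key) ?? []), value]);
  }
  // Reduce each group according to the aggregation type.
  const result = new Map<string, number>();
  for (const [key, values] of groups) {
    switch (agg) {
      case "SUM":
        result.set(key, values.reduce((a, b) => a + b, 0));
        break;
      case "AVERAGE":
        result.set(key, values.reduce((a, b) => a + b, 0) / values.length);
        break;
      case "MAX":
        result.set(key, Math.max(...values));
        break;
      case "MIN":
        result.set(key, Math.min(...values));
        break;
      case "COUNT":
        result.set(key, values.length);
        break;
    }
  }
  return result;
}

// Example: sum of sales by region; each region's bar would show this total.
const rows: Row[] = [
  { region: "Central", sales: 100 },
  { region: "East", sales: 60 },
  { region: "Central", sales: 40 },
];
console.log(aggregateByCategory(rows, "region", "sales", "SUM"));
// Map { "Central" => 140, "East" => 60 }
```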
The device detects (2110) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. For example,
In some embodiments, the first touch input is (2112) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture). For example, the movement of contact 610 shown in
In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, the device replaces (2114) display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. For example,
The second chart is derived (2116) from the set of data. For example, the chart in
The second chart concurrently displays (2118) a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. In some embodiments, the label for the first set of categories remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the label for the first set of categories ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface. For example,
Each respective category in the second set of categories has (2120) a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories. For example, in
In some embodiments, a label for the first field and aggregation type is displayed (2122) with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart. For example,
In some embodiments, a label for the first field and aggregation type (e.g., SUM, MAX, MIN, AVERAGE, COUNT) is displayed (2124) with the first chart. For example,
In some embodiments, in response to detecting the first touch input, the device: displays (2126) an animation of the second set of categories replacing the first set of categories; displays an animation of the label for the second set of categories replacing the label for the first set of categories; and maintains display of the label for the first field and aggregation type. For example,
In some embodiments, while displaying the second chart with the second set of categories, the device detects (2128) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field. For example,
In some embodiments, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, the device updates display (2130) of the second chart to reflect inclusion of the predefined subset of data in the aggregated values. In some embodiments, data that was filtered out of the second set of data is added to the second set of data and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that was previously filtered out. For example,
In some embodiments, updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering (2132) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart are ordered largest to smallest, and adding in the predefined subset of data changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order of the second set of categories. For example, via an animated rearrangement of the second set of categories as shown in
In some embodiments, after updating display of the second chart to reflect inclusion of the predefined subset of data, the device detects (2134) a third touch input. For example, a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the predefined subset of data is not included in the second set of data. For example,
In some embodiments, in response to detecting a third touch input, the device updates (2136) display of the second chart to reflect removal of the predefined subset of data. Thus, the predefined subset of data, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture) and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the predefined subset of data. For example,
In some embodiments, replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs (2138) without displaying a selection menu. For example, between displaying the first chart and displaying the second chart, the device does not display a selection menu that contains possible sets of categories to display in the second chart. For example,
In some embodiments, the first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface. For example, the movement of contact 610 shown in
In some embodiments, while displaying the second chart, the device detects (2142) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. For example,
In some embodiments, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2144) a selection menu with possible sets of categories to display in a third chart. For example,
In some embodiments, the device detects (2146) selection of a respective set of categories in the selection menu. For example, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the respective set of categories in the selection menu. For example,
In some embodiments, in response to detecting selection of the respective set of categories in the selection menu, the device: replaces (2148) display of the second chart with a third chart that contains the selected respective set of categories; and ceases to display the selection menu. Thus, in some embodiments, swipe or drag gestures on a chart label are used as a shortcut to quickly move between different chart types, whereas a tap gesture on the chart label is used to display a selection menu with available chart types and another tap gesture is used to select and display a particular chart type. For example,
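By way of illustration only, a minimal sketch of the dispatch just described, assuming a small movement threshold distinguishes a tap on the chart label (open the selection menu) from a swipe or drag on the label (cycle to an adjacent set of categories); the threshold and action names are hypothetical.

```typescript
// Hypothetical sketch: tap on the label opens a selection menu; a swipe or drag
// acts as a shortcut that cycles between category sets without a menu.

type LabelAction = "next-category-set" | "previous-category-set" | "open-selection-menu";

// deltaX: horizontal movement of the contact from touch-down to lift-off, in pixels.
function handleLabelGesture(deltaX: number, moveThreshold = 10): LabelAction {
  if (Math.abs(deltaX) < moveThreshold) return "open-selection-menu"; // tap
  return deltaX < 0 ? "next-category-set" : "previous-category-set";  // swipe or drag
}

// Example: a quick leftward swipe on the label skips the menu and advances to
// the next set of categories; a tap opens the selection menu instead.
console.log(handleLabelGesture(-80)); // "next-category-set"
console.log(handleLabelGesture(2));   // "open-selection-menu"
```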
In some embodiments, the first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface. For example, the movement of contact 620 shown in
In some embodiments, while displaying the second chart, the device detects (2150) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. For example,
In some embodiments, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2152) a selection menu with possible sets of categories to display in a third chart. For example,
In some embodiments, the device detects (2154) selection of a first set of categories in the selection menu and a second set of categories in the selection menu. For example, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the first set of categories in the selection menu and detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the second set of categories in the selection menu. For example,
In some embodiments, in accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu, the device: replaces (2156) display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceases to display the selection menu. For example, in some embodiments, the third chart contains categories 612 as shown in
As described below, method 2200 provides an intuitive way to adjust chart magnification (e.g., zooming in and/or zooming out the chart view). This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when adjusting chart magnification, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust magnification faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2202) a first chart on the display. For example,
The chart has (2204) a horizontal axis and a vertical axis. For example, the chart in
The horizontal axis includes (2206) first horizontal scale markers. For example, the chart in
The vertical axis includes (2208) first vertical scale markers. For example, the chart in
The device detects (2210) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. For example,
In some embodiments, the first touch input is (2212) a de-pinch gesture. For example, the movement of contacts 1110 and 1120 shown in
While detecting the first touch input, the device: horizontally expands (2214) a portion of the chart such that a distance between first horizontal scale markers increases; and maintains a vertical scale of the chart such that a distance between first vertical scale markers remains the same. For example,
In some embodiments, after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, the device ceases (2216) to detect the first touch input. For example,
In some embodiments, in response to ceasing to detect the first touch input (e.g., detecting lift off of the fingers in the first touch input), the device changes (2218) a vertical scale of the chart. In some embodiments, the vertical scale is adjusted so that all of the data marks are visible within a predefined margin. For example, in
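By way of illustration only, a sketch of the asymmetric zoom described above: while the gesture is active, only the horizontal domain is changed; on lift-off, the vertical domain is refit so that all visible data marks fall within a margin. The axis representation and margin value are assumptions.

```typescript
// Hypothetical sketch of horizontally zooming during a de-pinch and refitting
// the vertical scale when the touch input ends.

interface Point { x: number; y: number; }

interface Axes {
  xMin: number; xMax: number;
  yMin: number; yMax: number;
}

// While the gesture is active: shrink the x-domain around a focal x value so
// the on-screen distance between horizontal scale markers increases, while the
// y-domain (and thus the distance between vertical scale markers) is untouched.
function horizontalZoom(axes: Axes, focusX: number, scale: number): Axes {
  return {
    ...axes,
    xMin: focusX - (focusX - axes.xMin) / scale,
    xMax: focusX + (axes.xMax - focusX) / scale,
  };
}

// On lift-off: fit the y-domain to the data marks that remain visible,
// with a small margin above and below.
function rescaleVertical(axes: Axes, marks: Point[], margin = 0.05): Axes {
  const visible = marks.filter((m) => m.x >= axes.xMin && m.x <= axes.xMax);
  if (visible.length === 0) return axes;
  const yLo = Math.min(...visible.map((m) => m.y));
  const yHi = Math.max(...visible.map((m) => m.y));
  const pad = (yHi - yLo) * margin;
  return { ...axes, yMin: yLo - pad, yMax: yHi + pad };
}
```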
In some embodiments, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2220), the device, while continuing to detect the first touch input: continues (2222) to horizontally expand a portion of the chart; displays second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continues to maintain the vertical scale of the chart. In some embodiments, the horizontal scale markers change from years to months, months to weeks, weeks to days, or days to hours, as shown in
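By way of illustration only, one way to pick progressively finer horizontal scale markers as the chart is stretched, assuming the choice is driven by the on-screen width allotted to one year; the pixel thresholds are hypothetical.

```typescript
// Hypothetical sketch: refining horizontal scale markers from years to months,
// weeks, days, or hours as horizontal expansion continues.

type TickUnit = "year" | "month" | "week" | "day" | "hour";

function chooseTickUnit(pixelsPerYear: number): TickUnit {
  if (pixelsPerYear < 400) return "year";
  if (pixelsPerYear < 3000) return "month";
  if (pixelsPerYear < 12000) return "week";
  if (pixelsPerYear < 80000) return "day";
  return "hour";
}

// Example: as a de-pinch widens one year from 200 px to 5000 px on screen,
// the horizontal scale markers change from years to weeks.
console.log(chooseTickUnit(200));  // "year"
console.log(chooseTickUnit(5000)); // "week"
```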
In some embodiments, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2224), the device, while continuing to detect the first touch input: continues (2226) to horizontally expand a portion of the chart; replaces a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continues to maintain the vertical scale of the chart. For example,
As described below, method 2300 provides an intuitive way to display information about a data mark. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when accessing information about a data mark, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to access data mark information faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2302) at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks (e.g., circles, squares, triangles, bars, or other representations of data points). For example,
The device detects (2304) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart. For example,
In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, the device zooms (2306) in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. For example,
While displaying the second portion of the chart at the second magnification, the device detects (2308) a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. For example,
In some embodiments, the second touch input is (2310) a same type of touch input as the first touch input (e.g., both the first touch input and the second touch input are de-pinch gestures). For example, contacts 1210 and 1220 shown in
In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), the device, in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooms (2314) in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks.
In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), the device, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displays (2316) information about the first data mark. In some embodiments, while displaying information about the first data mark, the device detects a third touch input on the touch-sensitive surface; and in response to detecting the third touch input, ceases to display the information about the first data mark and display a fourth portion of the chart. In some embodiments, the fourth portion of the chart is the second portion of the chart. For example,
In some embodiments, the information about the first data mark comprises (2318) a data record that corresponds to the first data mark. For example,
In some embodiments, the data-mark-information-display criteria include (2320) the second magnification being a predefined magnification. For example, if the first touch input zooms in the chart to a predefined maximum magnification, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
In some embodiments, the data-mark-information-display criteria include (2322) the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input. For example, if the first touch input zooms in the chart so that only the first data mark is displayed, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
In some embodiments, the data-mark-information-display criteria include (2324) the first data mark reaching a predefined magnification during the second touch input. In some embodiments, if the first data mark reaches a predefined magnification during the second touch input (e.g., during a de-pinch gesture), then the device zooms in during the second touch input prior to reaching the predefined magnification, and the device displays the information about the first data mark after reaching the predefined magnification (with or without continuing to zoom in the chart during the remainder of the second touch input).
In some embodiments, the data-mark-information-display criteria include (2326) the device zooming in to display only the first data mark in the plurality of data marks during the second touch input. In some embodiments, if during the second touch input (e.g., a de-pinch gesture), the device zooms in such that the first data mark is the only data mark that is displayed, the device displays the information about the first data mark after the first data mark is the only data mark that is displayed (with or without continuing to zoom in the chart during the remainder of the second touch input).
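By way of illustration only, a sketch of the decision made in response to the second touch input, using two of the example criteria above (a predefined maximum magnification has been reached, or only one data mark remains on screen); the state shape and names are assumptions.

```typescript
// Hypothetical sketch: keep zooming in unless the data-mark-information-display
// criteria are met, in which case show information about the data mark.

interface ZoomState {
  magnification: number;
  maxMagnification: number;
  visibleMarkCount: number;
}

type ZoomResponse = "zoom-in-further" | "show-data-mark-info";

function respondToSecondInput(state: ZoomState): ZoomResponse {
  const atMaxMagnification = state.magnification >= state.maxMagnification;
  const onlyOneMarkVisible = state.visibleMarkCount === 1;
  return atMaxMagnification || onlyOneMarkVisible
    ? "show-data-mark-info"
    : "zoom-in-further";
}

// Example: with three marks still visible below maximum zoom, the second
// de-pinch keeps zooming; once only one mark remains, the same gesture
// displays the underlying data record instead.
console.log(respondToSecondInput({ magnification: 4, maxMagnification: 8, visibleMarkCount: 3 })); // "zoom-in-further"
console.log(respondToSecondInput({ magnification: 4, maxMagnification: 8, visibleMarkCount: 1 })); // "show-data-mark-info"
```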
In some embodiments, in accordance with the determination that one or more predefined data-mark-information-display criteria are met, the device ceases (2328) to display the first data mark. In some embodiments, display of the first data mark is replaced by display of a data record that corresponds to the first data mark when the one or more predefined data-mark-information-display criteria are met (e.g., via an animated transition). For example,
As described below, method 2400 provides an intuitive way to select portions of a chart and/or display information about the underlying data. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when selecting chart areas, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to select portions of a chart and view information about the underlying data faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2402) a chart on the display, the chart including a plurality of data marks. For example,
In some embodiments, data marks in the plurality of data marks are displayed (2404) in corresponding columns in the chart, with a single data mark per column. For example, data marks 1312 in
In some embodiments, data marks in the plurality of data marks are separated (2406) horizontally from one another. For example, data marks 1312 in
The device detects (2408) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart (e.g., a bar in a bar chart), the first predefined area having a corresponding first value. For example,
In some embodiments, the first touch input is (2410) a tap gesture. For example, in some embodiments, contact 1309 shown in
In some embodiments, the first predefined area includes (2412) a column in the chart. For example,
In some embodiments, the first predefined area includes (2414) a single data mark in the plurality of data marks. For example, selected portion 1302 shown in
In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart (2416), the device: selects (2418) the first predefined area; and visually distinguishes the first predefined area. For example,
While the first predefined area is selected, the device detects (2420) a second touch input on the touch-sensitive surface. For example,
In some embodiments, the second touch input is initially detected (2422) at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area. For example,
In some embodiments, the second touch input is initially detected (2424) at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area. For example, contact 1310 in
In some embodiments, the second touch input is initially detected (2426) at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area. For example, contact 1310 in
In some embodiments, the second touch input is (2428) a drag gesture. For example, the movement of contact 1310 shown in
In some embodiments, the device detects (2430) movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values. For example,
In some embodiments, in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, the device displays (2432) a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas. For example,
In response to detecting the second touch input on the touch-sensitive surface (2434), the device visually distinguishes (2436) a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area. For example,
In response to detecting the second touch input on the touch-sensitive surface (2434), the device displays (2438) a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas. For example,
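By way of illustration only, a sketch of extending the selection along adjacent columns and computing the change that is displayed between the first selected value and the value of the last predefined area reached by the drag; the column representation is an assumption for illustration.

```typescript
// Hypothetical sketch: build the selected sequence from the first tapped column
// to the column under the drag, and report the change between their values.

interface Column { label: string; value: number; }

interface Selection {
  selectedIndices: number[]; // the first area plus the adjacent sequence
  change: number;            // last value minus first value
}

function extendSelection(columns: Column[], firstIndex: number, dragIndex: number): Selection {
  const step = dragIndex >= firstIndex ? 1 : -1;
  const selectedIndices: number[] = [];
  for (let i = firstIndex; i !== dragIndex + step; i += step) selectedIndices.push(i);
  return {
    selectedIndices,
    change: columns[dragIndex].value - columns[firstIndex].value,
  };
}

// Example: tapping January then dragging to April selects four columns and
// reports the change from January's value to April's value.
const months: Column[] = [
  { label: "Jan", value: 120 },
  { label: "Feb", value: 150 },
  { label: "Mar", value: 140 },
  { label: "Apr", value: 180 },
];
console.log(extendSelection(months, 0, 3)); // { selectedIndices: [0, 1, 2, 3], change: 60 }
```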
In some embodiments, after the second touch input, a selected area in the chart comprises (2440) the first predefined area and the sequence of predefined areas. For example,
In some embodiments, the device detects (2442) a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface. For example,
In some embodiments, in response to detecting the third touch input (2444), the device moves (2446) the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one. For example, in some embodiments, in response to detecting movement of contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March (i.e., data for three months).
In some embodiments, in response to detecting the third touch input (2444), the device displays (2448) a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area. For example, in some embodiments, in response to detecting movement of contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March and the change value updates to denote the change in value between the leftmost data mark in the selected portion and the rightmost data mark in the selected portion.
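By way of illustration only, a sketch of shifting the selected area while keeping the number of selected columns fixed; clamping the window at the chart edges is an assumption. After the move, the displayed change would be recomputed from the new leftmost and rightmost values as described above.

```typescript
// Hypothetical sketch: move a fixed-width selection window left or right in
// accordance with the finger movement, clamped to the chart's columns.

function moveSelection(
  totalColumns: number,
  startIndex: number,
  length: number,
  shiftBy: number
): { startIndex: number; endIndex: number } {
  const maxStart = totalColumns - length;
  const newStart = Math.min(Math.max(startIndex + shiftBy, 0), maxStart);
  return { startIndex: newStart, endIndex: newStart + length - 1 };
}

// Example: a three-column selection over February through April moved one
// column left becomes January through March; it still spans three columns.
console.log(moveSelection(12, 1, 3, -1)); // { startIndex: 0, endIndex: 2 }
```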
In some embodiments, after the second touch input, a selected area in the chart comprises (2450) the first predefined area and the sequence of predefined areas. For example, selected portion 1308 in
In some embodiments, the device detects (2452) a fourth touch input (e.g., a de-pinch gesture). For example,
In some embodiments, in response to detecting the fourth touch input (2454), the device zooms (2456) in on the selected area in the chart. For example,
In some embodiments, in response to detecting the fourth touch input (2454), the device, in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, maintains (2458) selection of the selected area. In some embodiments, while zooming in, the device maintains selection of the selected area. In some embodiments, after zooming in, the device maintains selection of the selected area. For example,
In some embodiments, in response to detecting the fourth touch input (2454), the device, in accordance with a determination that only areas in the chart within the selected area are displayed on the display, ceases (2460) selection of the selected area. In some embodiments, the selected area disappears when the device zooms in to the chart such that no area in the chart outside the selected area is displayed. In some embodiments, while zooming in, the device ceases selection of the selected area. In some embodiments, after zooming in, the device ceases selection of the selected area.
As described below, method 2500 provides an intuitive way to update chart views. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when adjusting a chart view (e.g., adjusting chart magnification), thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust chart views faster and more efficiently conserves power and increases the time between battery charges.
The device displays (2502) a chart on the display. For example,
The chart has (2504) a horizontal axis with a first horizontal scale with first horizontal scale markers. For example, the chart in
The chart has (2506) a vertical axis with a first vertical scale with first vertical scale markers. For example, the chart in
The chart includes (2508) a first set of data marks. For example, the chart in
In some embodiments, adjacent data marks in the first set of data marks are separated (2510) by a first horizontal distance. In some embodiments, the first horizontal distance corresponds to the first horizontal scale. For example, the chart in
Each respective data mark in the first set of data marks has (2512) a respective abscissa and a respective ordinate. For example, in some embodiments, each data mark 1902 in
The chart includes (2514) a line that connects adjacent data marks in the first set of data marks. For example, the chart in
The device detects (2516) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. For example, the movement of contacts 1910 and 1912 shown in
While detecting the first touch input (2518), the device expands (2520) at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input. For example,
While detecting the first touch input (2518), the device expands (2522) at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input. For example, the expanded portion of the chart shown in
While detecting the first touch input (2518), the device adds (2524) a second set of data marks, distinct from the first set of data marks, on the line. For example,
Each respective data mark in the second set of data marks includes (2526) a respective abscissa and a respective ordinate. For example, in some embodiments, each data mark 1904 shown in
Each respective data mark in the second set of data marks is (2528) placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark. For example, in some embodiments, each data mark 1904 shown in
In some embodiments, adjacent data marks in the second set of data marks are separated (2530) by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale. For example, the chart in
In some embodiments, each respective data mark in the second set of data marks is placed (2532) on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark. For example, in some embodiments, each data mark in data marks 1904 shown in
In some embodiments, a shape of the line is maintained (2534) when the second set of data marks is added to the line. For example, in some embodiments, the shape of the line in
In some embodiments, a single data mark in the first set of data marks corresponds (2536) to a plurality of data marks in the second set of data marks. For example, in some embodiments, each data mark in data marks 1902 corresponds to twelve data marks in data marks 1904 (e.g., one for each month in the year).
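By way of illustration only, a sketch of one way to place the finer-grained marks on the existing line at their own abscissas, by linear interpolation between the coarser marks, before they are animated vertically to their true ordinates; the interpolation shown here is an assumption, not the disclosed implementation.

```typescript
// Hypothetical sketch: initial placement of the second set of data marks on the
// line defined by the first set, based only on each mark's abscissa.

interface Mark { x: number; y: number; }

// Ordinate of the piecewise-linear line through `line` at abscissa x.
// Assumes `line` is sorted by increasing x.
function ordinateOnLine(line: Mark[], x: number): number {
  if (x <= line[0].x) return line[0].y;
  for (let i = 1; i < line.length; i++) {
    if (x <= line[i].x) {
      const t = (x - line[i - 1].x) / (line[i].x - line[i - 1].x);
      return line[i - 1].y + t * (line[i].y - line[i - 1].y);
    }
  }
  return line[line.length - 1].y;
}

// Initial on-line positions for the finer marks; their true ordinates are the
// targets of the subsequent vertical animation.
function placeOnLine(line: Mark[], finerMarks: Mark[]): Mark[] {
  return finerMarks.map((m) => ({ x: m.x, y: ordinateOnLine(line, m.x) }));
}
```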
In some embodiments, the device ceases (2538) to display the first set of data marks when the second set of data marks is added. For example, in some embodiments, the device ceases to display data marks 1902 when data marks 1904 are added to the line.
After adding the second set of data marks on the line (2540), the device, for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moves (2542) the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis. For example, in some embodiments, data marks 1904 are animatedly moved from their initial positions shown in
In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs (2544) while detecting the first input. For example, data marks 1904 are animatedly moved from their initial positions shown in
In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs (2546) after ceasing to detect the first input.
In some embodiments, the second vertical scale is (2548) the same as the first vertical scale.
After adding the second set of data marks on the line (2540), the device animatedly adjusts (2550) the line so that the line connects the second set of data marks. For example, in some embodiments, the line connecting data marks 1904 is animatedly adjusted from its initial position shown in
In some embodiments, animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the second set of data marks occur (2552) concurrently.
In some embodiments, the device ceases (2554) to display the first set of data marks after the second set of data marks is added.
Initially, the user has filtered the data to display sales data for just the Central region, as shown in
The method concurrently displays a first chart on the display, such as the visual graphic 2610-C in
While displaying the first chart, the method detects a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the filter indicium. This is illustrated by the contact points 2614 and 2616 in
It will be understood that, although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without changing the meaning of the description, so long as all occurrences of the “first contact” are renamed consistently and all occurrences of the second contact are renamed consistently. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the claims. As used in the description of the embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting,” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
The foregoing description, for purpose of explanation, has been provided with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the claims to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain principles of operation and practical applications, to thereby enable others skilled in the art to best utilize the disclosed embodiments.
This application is a continuation of U.S. application Ser. No. 15/859,235, filed Dec. 29, 2017, entitled “Methods and Devices for Adjusting Chart Magnification,” which is a continuation of U.S. application Ser. No. 14/603,330, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Magnification,” now U.S. Pat. No. 9,857,952, which claims priority to U.S. Provisional Application Ser. No. 62/047,429, filed Sep. 8, 2014, entitled “Methods and Devices for Manipulating Graphical Views of Data,” each of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 14/603,302, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Filters,” U.S. patent application Ser. No. 14/603,312, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Magnification Asymmetrically,” and U.S. patent application Ser. No. 14/603,322, filed Jan. 22, 2015, entitled “Methods and Devices for Displaying Data Mark Information,” each of which is incorporated by reference herein in its entirety.
Provisional application: 62/047,429, Sep. 2014, US.
Continuations: parent 15/859,235, Dec. 2017, US (child 16/877,373, US); parent 14/603,330, Jan. 2015, US (child 15/859,235, US).