Time-series analysis system

Information

  • Patent Grant
  • Patent Number
    10,360,702
  • Date Filed
    Thursday, November 17, 2016
  • Date Issued
    Tuesday, July 23, 2019
Abstract
Various systems and methods are provided that display various graphs in an interactive user interface in substantially real-time in response to input from a user in order to determine information related to measured data points and provide the determined information to the user in the interactive user interface. For example, a computing device may be configured to retrieve data from one or more databases and generate one or more interactive user interfaces. The one or more interactive user interfaces may display the retrieved data in one or more graphs, such as time-series graphs or scatterplots. The user interface may be interactive in that a user may manipulate one graph, which causes an identical or nearly identical manipulation of another displayed graph in real-time. The manipulations may occur even if the displayed graphs include data across different time ranges.
Description
TECHNICAL FIELD

The present disclosure relates to systems and techniques for querying databases and displaying queried data in an interactive user interface.


BACKGROUND

A database may store a large quantity of data. For example, a system may comprise a large number of sensors that each collect measurements at regular intervals, and the measurements may be stored in the database and/or a system of databases. The measurement data can be supplemented with other data, such as information regarding events that occurred while the system was operational, and the supplemental data can also be stored in the database and/or the system of databases.


In some cases, a user may attempt to analyze a portion of the stored data. For example, the user may attempt to analyze a portion of the stored data that is associated with a specific time period. However, as the number of measurements increases over time, it can become very difficult for the user to identify the relevant data and perform the analysis.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


Disclosed herein are various systems and methods for displaying various graphs in an interactive user interface in substantially real-time in response to input from a user in order to determine information related to measured data points and provide the determined information to the user in the interactive user interface. For example, a computing device may be configured to retrieve data from one or more databases and generate one or more interactive user interfaces. The one or more interactive user interfaces may display the retrieved data in one or more graphs, such as time-series graphs or scatterplots. The user interface may be interactive in that a user may manipulate one graph, which causes an identical or nearly identical manipulation of another displayed graph in real-time. The manipulations may occur even if the displayed graphs include data across different time ranges. The user interface may also be interactive in that a user may select a portion of a graph (e.g., data across a certain time period) to view tangential data related to the selection (e.g., events that occurred during a time period represented by the selection).


The various systems described herein may improve the speed and accuracy of data displayed in user interfaces by using zoom-level-specific caching. For example, depending on the zoom level of the data displayed, each individual pixel represents a different time range (e.g., 1 day in February, 1 week in February, 1 month in 2014, etc.). Over the time range of an individual pixel, the computing system may determine a maximum value and a minimum value of the data to be displayed in the graph. The determined minimum and maximum values may then be cached so that they are available when that same zoom level is later requested by the user or other users, sparing the system from recalculating the same minimum and maximum values (possibly from multiple data points within each pixel's time range). For each individual pixel, a line may be rendered from the maximum value to the minimum value. If the granularity of the measured data matches the time range of an individual pixel, then the maximum value and the minimum value may be the same. In one embodiment, the computing system may display the graph at the closest zoom level at which maximum and minimum values have been cached to ensure that the cached data can be used effectively.


One aspect of the disclosure provides a computing system configured to access one or more databases in substantially real-time in response to input from a user provided in an interactive user interface in order to determine information related to measured data points and provide the determined information to the user in the interactive user interface. The computing system comprises a computer processor. The computing system further comprises a database storing at least first sensor values for a first sensor at each of a plurality of times and second sensor values for a second sensor at each of a plurality of times. The computing system further comprises a computer readable storage medium storing program instructions configured for execution by the computer processor in order to cause the computing system to generate user interface data for rendering the interactive user interface on a computing device, the interactive user interface including a first container and a second container, where the first container includes a first graph and the second container includes a second graph, where the first container and the second container have a same width, where the first graph includes first sensor values for the first sensor over a first time period and the second graph includes second sensor values for the second sensor over a second time period that is shorter than the first time period, and wherein portions of the first graph and the second graph are each selectable by the user; receive an identification of a selection by the user of a first data point in the first graph, where the first data point corresponds to a first time range; update the user interface data such that the interactive user interface includes a first marker at a location of the first data point in the first graph; access the database to determine a second sensor value that corresponds to a beginning of the first time range and a second sensor value that corresponds to an end of the first time range; and update the user interface data to include a second marker at a location of a second data point in the second graph that corresponds to the beginning of the first time range and a third marker at a location of a third data point in the second graph that corresponds to the end of the first time range.


The computing system of the preceding paragraph can have any sub-combination of the following features: where the instructions are further configured to cause the computing system to: receive an indication from the user of a change to the first time period in the first graph, in response to receiving the indication from the user of the change to the first time period, adjust positions of the first and second markers indicating the first time period in the second graph; where the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data to include a third container, where the third container includes a list of events that occurred within the first time range; where the first graph, for each event that occurred within the first time range, includes a mark that indicates a data point on the first graph that corresponds with a time that the respective event occurred; where the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data to include a marker at a location in the first graph corresponding to a first event in the list of events in response to selection by the user of a location in the third container that corresponds to the first event; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication by the user of a selection in the first graph at a fourth data point such that a new event is added at a time that corresponds with the fourth data point, and update the user interface data such that the third container includes an identification of the new event; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication by the user that the new event corresponds with the first graph, and update the user interface data such that a first mark is displayed in the first graph at the time that corresponds with the fourth data point; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication by the user that the new event corresponds with the second graph, and update the user interface data such that a first mark is displayed in the second graph at the time that corresponds with the fourth data point; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication by the user that the new event corresponds with the first graph and the second graph, and update the user interface data such that a first mark is displayed in the first graph at the time that corresponds with the fourth data point and in the second graph at the time that corresponds with the fourth data point; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection by the user of a first event in the list of events, and update the user interface data such that the first graph includes an icon at a position of a data point in the first graph that corresponds with the first event; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection, by the user, of a first location corresponding to the first time in the first graph, and update the user interface data such that the first graph includes a marker at 
the location in the first graph corresponding to the first time; where the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the second graph includes a second marker at a location in the second graph corresponding to the first time; where the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection, by the user, of a second location corresponding to a second time in the first graph, and update the user interface data such that the first graph includes the marker at the second location in the first graph corresponding to the second time; where the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the second graph includes the second marker at a location in the second graph corresponding to the second time; where the first data point comprises a line from a location in the first graph that corresponds with a highest value measured by the first sensor during the first time range to a location in the first graph that corresponds with a lowest value measured by the first sensor during the first time range; where the computer readable storage medium further stores program instructions that cause the computing system to receive an indication that a zoom level of the first graph is adjusted from a first zoom level to a second zoom level; where the computer readable storage medium further stores program instructions that cause the computing system to retrieve, from a cache, for a second time range that corresponds to a first pixel in an x-axis of the first graph, a highest value measured by the first sensor during the second time range and a lowest value measured by the first sensor during the second time range; where the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the first graph includes a line from a location in the first graph that corresponds with the highest value to a location in the first graph that corresponds with the lowest value; where the first sensor and the second sensor are oil well sensors; and where the first sensor values correspond to oil extracted from an oil well, and where the second sensor values correspond to water extracted from the oil well.


The present disclosure also comprises a computer program product, for example a non-transitory or transitory computer-readable medium, that comprises the program instructions recited in any of the appended claims, and/or comprises the program instructions disclosed in the present description. The present disclosure further comprises a method in which the steps recited in any of the appended claims, and/or the steps disclosed in the present description, are executed by one or more computing devices.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a user interface that displays a first time-series graph and a second time-series graph.



FIGS. 2A-2B illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 3A-3D illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 4A-4C illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 5A-5B illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 6A-6C illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 7A-7C illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 8A-8B illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 9A-9E illustrate another user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 10A-10D illustrate a user interface that displays the first time-series graph and the second time-series graph of FIG. 1.



FIGS. 11A-11D illustrate another user interface that displays interactive information about an oil well.



FIG. 12 illustrates another user interface that displays drill bit, hole depth, and rock layer information.



FIG. 13 is a flowchart depicting an illustrative operation of accessing one or more databases in substantially real-time in response to input from a user provided in an interactive user interface in order to determine information related to measured data points and provide the determined information to the user in the interactive user interface.



FIG. 14 illustrates a computer system with which certain methods discussed herein may be implemented, according to one embodiment.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview


As described above, it can become very difficult for the user to identify relevant data and perform an analysis when a database and/or a system of databases includes a large amount of data. This may be especially true if the user would like to compare two or more data sets over a specific period of time, where the data sets correspond to measurements taken by sensors in a system. In conventional systems, user interfaces may be generated that allow the user to view graphs of the data sets. However, it may be difficult or burdensome for the user to try to identify trends in the data and/or determine the reasons why a sensor acted in a given manner.


Accordingly, disclosed herein are various systems and methods for displaying various graphs in an interactive user interface. For example, a computing device (e.g., the computing system 1400 of FIG. 14 described below) may be configured to retrieve data from one or more databases and generate one or more interactive user interfaces. The one or more interactive user interfaces may display the retrieved data in one or more graphs, such as time-series graphs or scatterplots. The user interface may be interactive in that a user may manipulate one graph, which causes an identical or nearly identical manipulation of another displayed graph in real-time. The manipulations may occur even if the displayed graphs include data across different time ranges. The user interface may also be interactive in that a user may select a portion of a graph (e.g., data across a certain time period) to view tangential data related to the selection (e.g., events that occurred during a time period represented by the selection).


The data in the graphs may be rendered in the interactive user interfaces according to a technique that efficiently uses the pixels that are available for displaying the graphs. For example, every individual pixel (in the x-axis of an x-y graph) may represent a time range (e.g., 1 day in February, 1 week in February, 1 month in 2014, etc.). Over the time range of an individual pixel, the computing system may determine a maximum value and a minimum value of the data to be displayed in the graph. If the granularity of the measured data matches the time range of an individual pixel (e.g., an individual pixel represents a time range of 1 hour and data was measured every hour), then the maximum value and the minimum value may be the same. For each individual pixel (in the x-axis), a line may be rendered from the maximum value to the minimum value (in the y-axis).
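
To make this rendering technique concrete, the following is a minimal sketch (illustrative only, not code from the patent) that reduces raw (time, value) samples to one minimum/maximum pair per pixel column; the function and type names are invented for this example.

    from dataclasses import dataclass

    @dataclass
    class PixelColumn:
        x: int          # pixel column along the graph's x-axis
        y_min: float    # lowest measured value within this pixel's time range
        y_max: float    # highest measured value within this pixel's time range

    def downsample_min_max(samples, t_start, t_end, width_px):
        """Reduce (timestamp, value) samples to one min/max pair per pixel.

        samples: iterable of (timestamp, value) tuples.
        t_start, t_end: time range shown on the x-axis.
        width_px: number of pixel columns available for the graph.
        """
        span = (t_end - t_start) / width_px  # time range covered by one pixel
        columns = {}
        for t, v in samples:
            if not (t_start <= t < t_end):
                continue
            x = int((t - t_start) / span)
            lo, hi = columns.get(x, (v, v))
            columns[x] = (min(lo, v), max(hi, v))
        # One vertical line per pixel, from y_min up to y_max. If the data
        # granularity matches the pixel's time range, y_min equals y_max and
        # the line collapses to a single point.
        return [PixelColumn(x, lo, hi) for x, (lo, hi) in sorted(columns.items())]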


As described above, the graphs may be manipulated by the user. For example, the user may zoom into a portion of a graph. In an embodiment, the computing system predetermines each possible zoom level (or at least the most commonly used zoom levels) and pre-calculates the maximum and minimum values for the time periods associated with the minimum display resolution (e.g., each individual pixel in the x-axis may be associated with a time period) at each possible zoom level. These pre-calculated maximum and minimum values may be cached, such that they may be retrieved as a user adjusts zoom levels in order to more rapidly update the graph to include the most granular data available at the particular zoom level. In one embodiment, if the user selects a zoom level whose time periods per pixel (or other display unit) have not been pre-cached, the computing system may display the graph at the closest zoom level at which maximum and minimum values have been cached to ensure that the cached data can be used effectively.
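
A corresponding sketch of the zoom-level cache follows. It is hypothetical: it reuses downsample_min_max from the previous example and keys the cache by seconds-per-pixel, which is one plausible reading of "zoom level". An exact hit returns the cached columns directly; a miss falls back to the nearest cached zoom level, mirroring the embodiment described above.

    class ZoomLevelCache:
        """Pre-computed per-pixel min/max values, keyed by zoom level."""

        def __init__(self):
            self._cache = {}  # seconds-per-pixel -> list of PixelColumn

        def precompute(self, samples, t_start, t_end, widths_px):
            # Pre-calculate and store min/max columns for each anticipated
            # zoom level (e.g., the most commonly used ones).
            for width in widths_px:
                key = (t_end - t_start) / width
                self._cache[key] = downsample_min_max(samples, t_start, t_end, width)

        def lookup(self, seconds_per_pixel):
            # Exact hit: reuse the cached columns without recomputation.
            if seconds_per_pixel in self._cache:
                return self._cache[seconds_per_pixel]
            if not self._cache:
                return None
            # Miss: fall back to the closest cached zoom level so the cached
            # data can still be used effectively.
            nearest = min(self._cache, key=lambda k: abs(k - seconds_per_pixel))
            return self._cache[nearest]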


In some embodiments, the interactive user interfaces may include information about a system and sensors associated with the system. For example, the interactive user interfaces may include time-series graphs that display data measured by sensors associated with an oil well. The time-series graphs (and/or any other graphs displayed in the user interface) may be manipulated by the user in any manner as described herein. While the disclosure is described herein with respect to time-series data measured by sensors associated with an oil well, this is not meant to be limiting. The various graphs described herein can depict any time-series data measured by sensors, not just sensors associated with an oil well. For example, the various graphs described herein can depict time-series data measured by outdoor or indoor temperature sensors, humidity sensors, sensors that measure water levels, sensors that measure traffic congestion, sensors that detect seismic activity, and/or the like. Furthermore, the various graphs described herein can depict any type of time-series data, not just time-series data derived from a sensor. For example, the various graphs described herein can depict healthcare data (e.g., pharmaceutical batch failure data over time, the number of claims filed over time, etc.), financial data (e.g., the price of a stock over time), polling data (e.g., the number of respondents that view an issue favorably in polls conducted over a period of time), census information (e.g., the population of a city over time), and/or the like. The time-series data depicted in the graphs and derived from any source can be manipulated by the user in any manner as described herein.


The systems and methods described herein may provide several benefits. For example, the systems and methods described herein may improve the usability of the user interface by providing graphs that can be manipulated by a user in a concurrent manner, thereby allowing the user to identify trends or other information associated with the graphs without having to separately manipulate each individual graph. As another example, the systems and methods described herein may reduce the processor load while the user is interacting with the user interfaces by predetermining each possible zoom level and pre-calculating the maximum and minimum values. The systems and methods described herein may also increase the processing speed as the computing system may not have to determine in real-time how a graph should be updated when the zoom level is changed. Rather, the computing system can retrieve the appropriate data from the cache to update the graph. As another example, the systems and methods described herein may reduce the latency in generating updated user interfaces as the zoom levels and maximum and minimum values may be predetermined and can be retrieved from cache rather than the databases that store the actual data (e.g., which could be located externally from the computing system). Thus, the systems and methods described herein may improve the usability of the user interface.


Examples of Manipulating Time-Series Graphs in an Interactive User Interface



FIG. 1 illustrates a user interface 100 that displays a time-series graph 110 and a time-series graph 120. As illustrated in FIG. 1, the time-series graph 110 plots water allocation data across several months. The time-series graph 120 plots temperature data across several years. While the time-series graph 110 and the time-series graph 120 are each illustrated as plotting a single type of data, this is merely for simplicity and not meant to be limiting. The time-series graph 110 and/or the time-series graph 120 can plot multiple types of data simultaneously. For example, the time-series graph 110 can plot both water allocation data across several months and bottomhole pressure data for the same time period. The depiction of the plotted data can be varied to distinguish between the different types of data plotted in a single time-series graph. The time-series graphs 110 and/or 120 may be resized vertically, horizontally, diagonally, and/or the like. In an embodiment, not shown, the user interface 100 may include a button that, when selected, causes the computing system that generates the user interface 100 to request current or updated data from an external source for display in the graph 110 and/or the graph 120. In some embodiments, the water allocation data and the temperature data are measured by sensors associated with the same system (e.g., an oil well).


In an embodiment, the water allocation data was measured at a granularity that matches each individual pixel in the x-axis of the time-series graph 110. Thus, the maximum and minimum values of the water allocation data at each individual pixel may be the same and a single point (the size of a pixel in the x and y direction) may represent each water allocation measurement.


The temperature data, however, may be measured at a granularity that does not match each individual pixel in the x-axis of the time-series graph 120. For example, the temperature may have been measured every day, yet each individual pixel may represent a 2 week time period. Thus, the computing system that generates the user interface 100 may calculate the maximum and minimum temperature values for each 2 week time period between the beginning and the end of the time range associated with the time-series graph 120 (e.g., 2010 to 2014). For each individual pixel in the x-axis of the time-series graph 120, a line may be rendered from the maximum temperature value to the minimum temperature value.



FIGS. 2A-2B illustrate another user interface 200 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 2A, a user may place a cursor 210 (e.g., a mouse pointer, a finger, etc.) over a portion of the time-series graph 120. For example, the user may select the portion of the time-series graph 120 at the location of the cursor 210. Because the temperature data may be measured at a granularity that does not match each individual pixel, the selected portion of the time-series graph 120 may correspond to a first time period (e.g., 1 month in the year 2013). Thus, selection of the portion of the time-series graph 120 at the location of the cursor 210 may include selecting all temperature values within the first time period.


In an embodiment, selection of the portion of the time-series graph 120 at the location of the cursor 210 causes a marker 220 to appear at the selection in the time-series graph 120, as illustrated in FIG. 2B. Furthermore, the selection of the portion of the time-series graph 120 at the location of the cursor 210 may cause a corresponding selection to be made and displayed in the time-series graph 110, e.g., water allocation data recorded over a same time period as is represented by the marker 220 with reference to temperature data. For example, markers 230A and 230B may be displayed in the time-series graph 110. The selection in the time-series graph 110 may correspond to the selection in the time-series graph 120 in that the water allocation values between markers 230A and 230B may have been measured over the first time period (e.g., during the same month in 2013 selected by marker 220).


The user interface 200 may further include a window 240 where users can provide or view notes associated with particular sensor data or with sensor data at a particular time or time period. In the example of FIG. 2B, the window 240 includes notes associated with both graphs 110 and 120 during the selected time period, but as discussed further below, the user can choose to have note information for only a single chart displayed. In some embodiments, the notes can include actual measurement data associated with the corresponding graph. For example, in the embodiment of FIG. 2B, the window 240 includes water allocation maximum and minimum data during the selected time period, while window 250 includes temperature data for the selected time period.



FIGS. 3A-3D illustrate another user interface 300 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 3A, a user may place the cursor 210 over a portion of the time-series graph 110. For example, the user may begin to select a portion of the time-series graph 110 at the location of the cursor 210. As illustrated in FIG. 3B, the user may drag the cursor 210, while the time-series graph 110 is selected, from left to right to complete the selection at the new location of the cursor 210. As the user is dragging the cursor 210, a marker 310A may appear in the time-series graph 110 to indicate where the selection began.


In an embodiment, as the selection of the end of the desired time period is made in graph 110, a second marker 310B appears in the time-series graph 110 to indicate the end of the time period. Marker 320 (or possibly two markers showing the start and end of the time period, if the scale of graph 120 is such that the time period spans multiple pixels) is updated in response to changes in the time period selected in graph 110, such that the markers in each of graphs 110 and 120 indicate the same time period even though the time-series graphs are on different time scales. Thus, the selected time period in the time-series graph 110 with reference to water allocation is automatically used to select a corresponding time period in the time-series graph 120 with reference to temperature values.
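
This cross-graph synchronization reduces to a linear mapping between timestamps and pixel columns. The sketch below is an assumption about how such a mapping could be implemented (the names are invented, and the graphs are assumed to share a common time basis such as epoch seconds):

    def time_to_pixel(t, graph_t0, graph_t1, width_px):
        """Map a timestamp to a pixel column within a graph's displayed range."""
        frac = (t - graph_t0) / (graph_t1 - graph_t0)
        return round(frac * (width_px - 1))

    def sync_selection(sel_start, sel_end, graphs):
        """Place start/end markers for one selected time period in every graph.

        graphs: list of (t0, t1, width_px) tuples describing each displayed
        graph's time range and pixel width. If both endpoints land on the
        same pixel (as in the coarser graph 120), the two markers collapse
        into a single marker.
        """
        markers = []
        for t0, t1, width in graphs:
            x_start = time_to_pixel(sel_start, t0, t1, width)
            x_end = time_to_pixel(sel_end, t0, t1, width)
            markers.append((x_start, x_end) if x_end != x_start else (x_start,))
        return markers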


The user may indicate that all graphs or just a subset of graphs displayed in the user interface 300 should be synchronized or identically manipulated in a manner described herein (e.g., an identical manipulation such as a selection of a time period in one graph causing a selection of a corresponding time period in another graph). For example, if the user interface 300 displayed a third graph, the user may be provided with an option to synchronize the graph 110, the graph 120, and the third graph, the graph 110 and the third graph, or the graph 120 and the third graph. If, for example, the user selected to synchronize the graph 120 and the third graph, then any changes to the third graph by the user may also occur to the graph 120, but not to the graph 110. Likewise, any changes to the graph 110 by the user would not affect the graph 120 or the third graph.


In further embodiments, the user may elect to synchronize certain manipulations of a graph, but not other manipulations of a graph. For example, the user may select an option to synchronize the zoom level in two or more graphs, but not the time period displayed within the graphs. As illustrated in FIG. 3D, the user has selected an option to synchronize the zoom levels in the graph 110 and the graph 120 (e.g., the x-axis for both graphs 110 and 120 is at the same zoom level); however, each graph displays data for a different period of time (e.g., the x-axis for graph 110 ranges from March to July and the x-axis for graph 120 ranges from January to May). If the user, for example, places a marker in the graph 110, the marker may appear in the graph 120 if the marker is placed at a time that appears on the x-axis for the graph 120 (e.g., if the marker is placed on April 1st, which also appears on the x-axis in the graph 120). If the user, as another example, manipulates the graph 110 by scrolling to the right, the graph 120 may be manipulated in the same way.
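
One way to represent these per-manipulation synchronization choices is a small policy structure. The sketch below is an assumed model, not the patent's implementation:

    from dataclasses import dataclass, field

    @dataclass
    class SyncGroup:
        """A user-chosen group of graphs that share manipulations."""
        graph_ids: set = field(default_factory=set)
        sync_zoom: bool = True    # propagate zoom-level changes in the group
        sync_period: bool = True  # propagate the displayed time period

    def propagate(change, source_id, groups):
        """Return the ids of graphs that should mirror a manipulation.

        change: "zoom" or "period", the kind of manipulation performed on
        the graph identified by source_id.
        """
        targets = set()
        for g in groups:
            if source_id not in g.graph_ids:
                continue
            if (change == "zoom" and g.sync_zoom) or \
               (change == "period" and g.sync_period):
                targets |= g.graph_ids - {source_id}
        return targets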



FIGS. 4A-4C illustrate another user interface 400 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 4A, a user may place the cursor 210 over a portion of the time-series graph 120. For example, the user may hover over the portion of the time-series graph 120 at the location of the cursor 210. Hovering over the time-series graph 120 may cause the computing system to generate a marker 410 that is displayed in the time-series graph 120 at the location of the cursor 210, and which can be moved in response to movement, by the user, of the hovering cursor 210 over other portions of the graph 120. In addition, a corresponding marker 420 may be displayed in the time-series graph 110. The marker 420 may be located at a location in the time-series graph 110 that represents a water allocation value that was measured at a same time as a temperature value that falls within the time period represented by the marker 410.


In an embodiment, as the user moves the cursor 210 to different locations within the time-series graph 120, the marker 410 may follow the cursor 210. Furthermore, as illustrated in FIGS. 4B-4C, the marker 420 may also move such that the marker 420 continues to correspond to the marker 410 in a manner as described above. Because the time scales of the two time-series graphs 110 and 120 are different, the marker 420 may move at a faster rate than the marker 410.



FIGS. 5A-5B illustrate another user interface 500 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 5A, the time-series graph 110 and the time-series graph 120 include data plotted over the same time period (e.g., March to June). Furthermore, in this example the user may have selected water allocation values in the time-series graph 110 over a particular time period, represented by markers 510A-B, or the user may have selected temperature values in the time-series graph 120 over a time period, represented by markers 520A-B, causing the other time-series graph to display an automatically determined selection of the same time period.


In an embodiment, events (e.g., a manufacturing failure, a contamination event, etc.) may have occurred during the time period associated with the selections in the time-series graph 110 and the time-series graph 120, and/or annotations may have been marked during that time period. The events that occurred and/or the annotations may be associated with the sensor that measured the water allocation values, the sensor that measured the temperature values, and/or other sensors that measured other data (not shown). Marks 530, 540, and 550 may identify a time at which an event occurred and/or an annotation was marked, or a time range during which an event occurred and/or an annotation was marked. For example, the mark 530 may indicate that an event occurred or an annotation was marked at a time corresponding to the location of the mark 530 in the time-series graph 110, where the event or annotation is associated with the sensor that measured the water allocation values. Likewise, the mark 540 may indicate that an event occurred or an annotation was marked at a time corresponding to the location of the mark 540 in the time-series graph 110 (e.g., where the event or annotation is associated with the sensor that measured the water allocation values) and the mark 550 may indicate that an event occurred or an annotation was marked at a time corresponding to the location of the mark 550 in the time-series graph 120 (e.g., where the event or annotation is associated with the sensor that measured the temperature values). The marks 530, 540, and/or 550 can be represented in various ways. For example, if the event occurs or the annotation is marked at a specific time instant, the marks 530, 540, and/or 550 can be represented as vertical lines. If the event occurs or an annotation is marked during a range of time, the marks 530, 540, and/or 550 can be represented as blocks (e.g., rectangular blocks) that encompass the time range.


Furthermore, the user interface 500 may include an event information pane or notebook 560. The event information pane 560 may include information on the events that occurred (and/or annotations made by the user) corresponding to the sensors that measured the water allocation data, the temperature data, and/or other data (not shown). The information may include a time-series graph or sensor that the event or annotation is associated with, a time-series within the time-series graph that the event or annotation is associated with, a time that the event occurred (or that the annotation is associated with), and a description of the event or annotation itself, such as a description of the event or annotation provided by a human operator. In an embodiment, the event information pane 560 includes event or annotation information for any event that occurred during a time range for which data was collected and/or for any annotation marked within a time range for which data was collected. In another embodiment, the information displayed in the event information pane 560 is for events that occurred during the entire time range displayed (e.g., March to June) and/or for annotations marked during the entire time range displayed. In another embodiment, the information displayed in the event information pane 560 is for events that occurred during the selected portions (e.g., late April to late May, as represented by the markers 510A-B and 520A-B) and/or for annotations marked within the selected portions. The user interface may include controls that allow the user to select the desired time period for which event information should be included in the event information pane 560.
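
The three scoping embodiments for the event information pane can be summarized in a small filter. The sketch below is illustrative; the Event structure and the scope names are assumptions:

    from dataclasses import dataclass

    @dataclass
    class Event:
        graph_id: str     # graph/sensor the event or annotation is tied to
        t_start: float    # time the event occurred (or annotation applies)
        t_end: float      # equal to t_start for an instantaneous event
        description: str

    def events_for_pane(events, scope, displayed=None, selected=None):
        """Pick the events shown in the event information pane.

        scope: "all" (any event in the collected data), "displayed" (events
        within the displayed time range), or "selected" (events within the
        user's current selection). displayed/selected are (t0, t1) tuples.
        """
        if scope == "all":
            window = None
        elif scope == "displayed":
            window = displayed
        else:
            window = selected
        if window is None:
            return sorted(events, key=lambda e: e.t_start)
        t0, t1 = window
        # Keep any event whose time range overlaps the chosen window.
        return sorted((e for e in events if e.t_start <= t1 and e.t_end >= t0),
                      key=lambda e: e.t_start)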


The event information pane 560 may display event and/or annotation information for every available time-series graph or just selected time-series graphs. For example, the user may use cursor 210 to select the time-series graph 120 (e.g., also referred to as “Graph 2”) and not the time-series graph 110 (e.g., also referred to as “Graph 1”), as illustrated by the dark outline of time-series graph 120 in FIG. 5B. Selecting the time-series graph 120 may cause the event information pane 560 to only display the events and/or annotations that are associated with the time-series graph 120 (e.g., the sensor that measured the temperature values) during the currently selected time period associated with markers 520A and 520B. The event information pane 560 may also include a search field 570 that allows the user to search for and identify specific events and/or annotations that may have occurred or been marked within the currently displayed events (or among other events that are not displayed in some embodiments).



FIGS. 6A-6C illustrate another user interface 600 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 6A, the time-series graph 120 has been selected by the user (as indicated by the dark outline around time-series graph 120). In this example, the user is hovering the cursor 210 over the time-series graph 120, causing the user interface 600 to display marker 610 at the location of the cursor 210. As described above, a marker 620 corresponding to the marker 610 may be displayed in the time-series graph 110 as a result.


As illustrated in FIG. 6B, the user may provide an instruction to add an event and/or an annotation at a time (or time range) that corresponds to the location of the cursor 210. For example, the user may right-click on a mouse, tap a touch screen, or press a keyboard hotkey to indicate that an event is to be added. Once the user interface 600 receives the instruction, the user interface 600 may display an add event window 630 that appears near a location where the event and/or annotation is to be added. The add event window 630 may be a pop-up window or may be a window that overlays the window displaying the time-series graphs 110 and 120.


In an embodiment, the user can specify, within the add event window 630, a description of the event and/or annotation and a time-series within the time-series graph 120 that the event and/or annotation corresponds to. As described above, the time-series graph 120 can depict multiple time-series. However, the time-series graph 120 as illustrated in FIG. 6B only includes a single time-series (e.g., the temperature time-series data, also referred to as “Series 1”). Thus, the add event window 630 only provides an option to associate the event and/or annotation with the time-series depicted in the time-series graph 120. However, if the time-series graph 120 as illustrated in FIG. 6B included two or more time-series, then the add event window 630 would provide the option to specify that the event and/or annotation corresponds to the first time-series (e.g., the temperature time-series), a second time-series, a third time-series, and so on, and/or all time-series or combinations of time-series that are displayed within the time-series graph 120. In many embodiments, each time-series graph is associated with a different sensor or other data source, while in other embodiments a time-series graph may be associated with multiple sensors or other data sources, such as to indicate derived values that are based on two or more sensor values (e.g., a ratio of temperature to pressure). In other embodiments, not shown, the user can specify that an event and/or annotation can be associated with time-series depicted in different time-series graphs (e.g., the time-series graph 110 and the time-series graph 120).


As illustrated in FIG. 6C, the user has specified that the new event and/or annotation is to correspond with the temperature sensor time-series data (illustrated in time-series graph 120). Accordingly, a mark 650 is placed in the time-series graph 120 at the corresponding time. Because the time-series graph 120 is still selected for display of event information, the event information pane 560 is updated to include information about the newly added event and/or annotation (e.g., “Event 2”). Note that former “Event 3” has now become “Event 4” because the events and/or annotations can be listed (and/or numbered) in chronological order and the newly added event or annotation occurs prior to “Event 4” represented by marker 550.
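
This renumbering behavior follows naturally from keeping events in a chronologically sorted list and labeling them by position. A brief sketch, reusing the hypothetical Event structure from the earlier example:

    import bisect

    def add_event(events, new_event):
        """Insert an event in chronological order and relabel the list.

        events: list of Event objects already sorted by t_start. Labels are
        positional ("Event 1", "Event 2", ...), so an insertion shifts the
        label of every later event -- the behavior shown in FIG. 6C, where
        the former "Event 3" becomes "Event 4".
        """
        keys = [e.t_start for e in events]
        events.insert(bisect.bisect_left(keys, new_event.t_start), new_event)
        return {f"Event {i + 1}": e for i, e in enumerate(events)}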



FIGS. 7A-7C illustrate another user interface 700 that displays the time-series graph 110 and the time-series graph 120. As illustrated in FIG. 7A, the time-series graph 120 has been selected by the user. Furthermore, the event information pane 560 includes two events: Event 2 that corresponds with the mark 650 and Event 4 that corresponds with the mark 550.


As illustrated in FIG. 7B, the user may hover over information about an event and/or annotation using the cursor 210. In the example of FIG. 7B, the user has hovered over Event 2. In an embodiment, when the cursor 210 hovers over and/or is used to select an event and/or annotation, a marker is displayed at a location of the event and/or annotation in the corresponding time-series graph(s). For example, when the cursor 210 hovers over Event 2, a marker 710 is displayed at a location in the time-series graph 110 that corresponds with a time that the event and/or annotation occurred and/or a marker 720 is displayed at a location of the mark 650 in the time-series graph 120. In other embodiments, an event and/or annotation may be selected in any other manner, and the corresponding data on the time-series graphs may be indicated by other visual representations, such as an animated circular marker that changes size, blinks off and on, etc.


As illustrated in FIG. 7C, the user has moved the cursor over Event 4. When the cursor 210 hovers over Event 4, a marker 730 is displayed at a location in the time-series graph 110 that corresponds with a time that the event and/or annotation occurred and a marker 740 is displayed at a location of the mark 550 in the time-series graph 120.



FIGS. 8A-8B illustrate another user interface 800 that displays the time-series graph 110 and the time-series graph 120. In some embodiments, a physical component that is monitored by a sensor may begin to operate outside normal operating conditions. For example, the physical component may have encountered a mechanical issue that causes the physical component to operate at sub-optimal levels. In some cases, the abnormal performance of the physical component could cause a system slowdown or failure. Accordingly, the computing system that generates the user interface 800 may generate an alert to notify a human operator of the abnormal operation.


In an embodiment, the abnormal performance of the physical component is represented by sensor values that are outside of an expected range and an alert may be triggered when the sensor values are outside of the expected range. For example, an alert may be generated for a sensor that measures temperature values for a physical component of an oil well if the measured temperature values exceed certain levels (e.g., 200° F.). Alerts may also be triggered based on a combination of sensor values. For example, an alert may be triggered if values associated with a first sensor (e.g., a temperature sensor) exceed certain values and values associated with a second sensor (e.g., a pressure sensor) do not exceed certain values. Triggering of alerts may initiate real-time (or substantially real-time) notifications to one or more users, such as via text messages, email, phone calls, etc. Thus, the alert may allow the user to make adjustments to the sensor and/or other system components in order to reduce the impact of the physical component operating outside of its normal range. Alerts may be recorded, associated with a particular sensor, and stored for display alongside time-series graphs for the particular sensor in the future, such as in the notes or event information areas of the user interface.
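
The following sketch shows how such threshold and combination rules might be evaluated. It is an assumption-laden illustration: only the 200° F. temperature level comes from the example above, and the pressure threshold of 150 is an invented placeholder.

    def out_of_range(value, low=None, high=None):
        """True if a sensor value falls outside its expected range."""
        return (low is not None and value < low) or \
               (high is not None and value > high)

    def check_alerts(readings):
        """Evaluate illustrative alert rules over the latest readings.

        readings: dict mapping sensor name -> latest measured value.
        """
        alerts = []
        temp = readings.get("temperature")
        pressure = readings.get("pressure")
        # Single-sensor rule: temperature exceeds its expected range.
        if temp is not None and out_of_range(temp, high=200.0):
            alerts.append("temperature above expected range")
        # Combination rule: first sensor exceeds its level while the second
        # sensor does not exceed its own (placeholder) level.
        if temp is not None and pressure is not None:
            if temp > 200.0 and not pressure > 150.0:
                alerts.append("high temperature without corresponding pressure")
        # Triggered alerts could initiate real-time notifications (text
        # message, email, phone call) and be recorded alongside the
        # sensor's time-series for later display.
        return alerts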


The user interface 800 may display markers that indicate when an alert would be or should be triggered. For example, marker 810 may indicate an upper boundary at which point an alert may be triggered and marker 820 may indicate a lower boundary at which point an alert may be triggered. As illustrated in FIG. 8A, an alert was triggered in May as the water allocation values exceeded the value associated with the marker 810.



FIG. 8B illustrates an example recorded alert 830 that was triggered when the water allocation values exceeded the value associated with the marker 810. The user interface 800 may display the alert 830 if a user hovers over the portion of the time-series graph 110 that includes values that exceed the value associated with the marker 810 or that do not exceed the value associated with the marker 820.



FIGS. 9A-9E illustrate a user interface 900 that correlates time-series and scatterplot graphs. As illustrated in FIG. 9A, the user interface 900 includes a time-series graph 910. The user may use the cursor 210 to begin selecting a portion of the time-series graph 910.


As illustrated in FIG. 9B, the user, via the cursor 210, has selected a portion of the time-series graph 910 represented by markers 920 and 930. The selected portion of the time-series graph 910 represents water allocation values for a time period between late April and early June.


As illustrated in FIG. 9C, based on the selection in the time-series graph 910, a scatterplot 950 is displayed in the user interface 900. The scatterplot 950 may include temperature values plotted against pressure values. Each combination of temperature and pressure values may have been measured at a time within the time period corresponding to the selected portion of the time-series graph 910 (e.g., late April and early June). Thus, for every time increment in the time period, the computing system may retrieve a temperature value and a pressure value and generate the user interface 900 such that it plots the temperature value as a function of the pressure value.
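
Constructing the scatterplot amounts to joining the two sensor series on timestamp within the selected period. A minimal, hypothetical sketch:

    def scatter_pairs(temps, pressures, t0, t1):
        """Pair temperature and pressure values measured at the same times.

        temps, pressures: dicts mapping timestamp -> value for each sensor.
        Only timestamps that fall within the selected period [t0, t1] and
        appear in both series contribute a (pressure, temperature) point.
        """
        times = sorted(set(temps) & set(pressures))
        return [(pressures[t], temps[t]) for t in times if t0 <= t <= t1]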


In an embodiment, the water allocation values may be measured by a sensor associated with a system. The temperature values and the pressure values may also be measured by sensors associated with the same system.


As illustrated in FIG. 9D, the user can make a selection in the scatterplot 950 using the cursor 210. For example, the user can make a selection represented by box 960, where the box 960 includes various combinations of temperature and pressure values.


Once the user makes the selection in the scatterplot 950, the computing device may determine all times that the individual combinations of temperature and pressure values within the box 960 occurred. For example, while the combination of temperature and pressure values in the box 960 occurred during the time period between markers 920 and 930, the same combination of temperature and pressure values may have occurred at other times. Thus, the user interface 900 may indicate such times. As illustrated in FIG. 9E, markers 970A and 970B designate a first time period during which some or all combinations of temperature and pressure values in the box 960 occurred, markers 980A and 980B designate a second time period during which some or all combinations of temperature and pressure values in the box 960 occurred, and markers 990A and 990B designate a third time period during which some or all combinations of temperature and pressure values in the box 960 occurred. Alternatively or in addition, the portions of the time-series graph 910 that correspond with the times that the individual combinations of temperature and pressure values within the box 960 occurred can be bolded, highlighted, and/or otherwise annotated to indicate such times.
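
The inverse lookup, from a selected box of value combinations back to the times at which those combinations occurred, can be sketched as follows. The max_gap grouping parameter is an assumption used to coalesce neighboring hits into the marked time ranges:

    def times_in_box(temps, pressures, p_lo, p_hi, temp_lo, temp_hi):
        """Find every time at which the (pressure, temperature) combination
        falls inside the user's scatterplot selection box."""
        return [t for t in sorted(set(temps) & set(pressures))
                if p_lo <= pressures[t] <= p_hi and temp_lo <= temps[t] <= temp_hi]

    def group_into_ranges(hits, max_gap):
        """Coalesce individual hit times into contiguous time ranges, which
        the user interface can mark on the time-series graph (e.g., the
        periods designated by markers 970A-B, 980A-B, and 990A-B)."""
        ranges = []
        for t in hits:
            if ranges and t - ranges[-1][1] <= max_gap:
                ranges[-1][1] = t
            else:
                ranges.append([t, t])
        return [(a, b) for a, b in ranges]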


Example Use Case of an Interactive User Interface with Time-Series Graphs



FIGS. 10A-10D illustrate a user interface 1000 that provides interactive information about an oil well. As illustrated in FIG. 10A, the user interface 1000 includes a first window 1010 and a second window 1020. The window 1010 may be a navigation window that includes a list of selectable buttons that can be used to provide further information about the oil well. The window 1020 may be an informational window that provides details about the oil well, such as the well's age, the well's location, and/or the well's objectives.


As illustrated in FIG. 10B, the user may select a sensors button in the window 1010 using the cursor 210. Selecting the sensors button may cause the computing system to retrieve data measured by the various sensors of the oil well. For example, the data may be retrieved from databases associated with the system that includes the sensors, and the data may be displayed in time-series graphs, scatterplot graphs, and/or the like as described herein.


As illustrated in FIG. 10C, after the sensors button is selected, the window 1020 may display various time-series graphs 1030 and 1040. The time-series graphs 1030 and/or 1040 may be manipulated in any manner as described herein with respect to FIGS. 1 through 9E. In addition, the user interface 1000 may display any number of time-series graphs, each of which may be manipulated in the manners described herein. For example, several time-series graphs (e.g., three, four, five, or more graphs) may be concurrently displayed on one or more display devices, each with differing (or identical) timescales for the corresponding sensor data displayed. According to the systems and methods discussed above, a user may select a particular time or time period on one of the displayed time-series graphs and, in response to such selection, the system automatically selects corresponding time periods in each of the other time-series graphs.


As illustrated in FIG. 10D, the user interface 1000 may also include an event information pane 1050 that provides event information for events that occurred and/or annotations that are related to the sensors that measured the data depicted in the time-series graphs 1030 and/or 1040.



FIGS. 11A-11D illustrate another user interface 1100 that provides interactive information about an oil well. As illustrated in FIG. 11A, the user interface 1100 includes the windows 1010 and 1020. In an embodiment, the user selects a maps button in the window 1010.


As illustrated in FIG. 11B, once the maps button is selected, the window 1020 displays a map showing a location of an oil well 1140, injectors 1110 and 1120, producer 1130, and/or other related components (not shown).


As illustrated in FIG. 11C, the user may hover and/or select one or more of the components depicted in the map to view more information. For example, the cursor 210 may hover over the injector 1110, which causes the user interface 1100 to display text associated with the injector 1110 in the window 1020 (e.g., “well communication between A01 and A1”).


As illustrated in FIG. 11D, selecting a component may allow the user to view additional information about the component in the window 1020. For example, selection of the injector 1110 causes the user interface 1100 to provide more information about the injector 1110 (e.g., the injector's age, the injector's location, the injector's objectives, etc.). Furthermore, from the information displayed in FIG. 11D the user may select the sensors indicator in order to view one or more time-series graphs associated with sensor data of the selected component (a particular well in the example of FIG. 11D).



FIG. 12 illustrates another user interface 1200 that displays drill bit, hole depth, and rock layer information. As illustrated in FIG. 12, the user interface 1200 includes graph 1205 and window 1240. The graph 1205 may display a vertical position of an item (e.g., a drill bit) in an underground crevice or structure (e.g., an oil well) over a period of time in which a rock layer at the position of the item, a rock layer at the bottom of the underground crevice or structure (e.g., a hole depth), and/or events associated with the underground crevice or structure that have occurred during the period of time are indicated. For example, the graph 1205 may have an x-axis that represents time (e.g., in months), may have a y-axis that represents depth below the surface (e.g., in meters), and may display a vertical position of a drill bit in an oil well over a period of time. The graph 1205 may also display a rock layer at the position of the drill bit during the period of time, a rock layer at the bottom of the oil well during the period of time, and events associated with the oil well that occurred during the period of time.


As illustrated in FIG. 12, the graph 1205 includes a curve 1202 that represents a depth of the drill bit at each point in time. The graph 1205 further includes a curve 1204 that represents a depth of the bottom of the oil well at each point in time. For example, the drill bit may be used alone or as a part of a larger apparatus to drill into the ground to increase the depth of the oil well. Occasionally, the drill bit may be raised from the bottom of the oil well to the surface (e.g., to perform maintenance on the drill). Thus, the curve 1202 may rise and fall as drilling begins, ends, and restarts. However, the depth of the bottom of the oil well may not decrease (e.g., the depth may not decrease from 1000 m below the surface to 500 m below the surface) unless, for example, the hole in the oil well is filled in. Thus, the curve 1204 may remain static over time or continue to descend along the y-axis.


Row 1210 identifies the different rock layers at a vertical position of the drill bit over time. For example, between February and mid-March, the drill bit may be at a depth that falls within rock layer 1212 (except for times in which the drill bit is at the surface, which is indicated by a blank space in the row 1210). After mid-March, the drill bit may briefly be at a depth that falls within rock layer 1214. However, prior to the beginning of April, the drill bit may be slowly raised to the surface. During this time, the drill bit may pass from the rock layer 1214 to the rock layer 1212 before reaching the surface, as indicated in the row 1210. Likewise, in May, the drill bit may reach a depth below the surface that falls within rock layer 1216, as indicated in the row 1210.


Row 1220 identifies the different rock layers at the bottom of the oil well over time. For example, the row 1220 may identify the deepest rock layer reached by the drill bit (assuming that the oil well is not filled in and that the deepest region reached by the drill bit corresponds with the depth of the bottom of the oil well). Thus, while the drill bit may be raised to the surface in mid-May, the row 1220 indicates that the rock layer 1216 is the rock layer at the depth of the bottom of the oil well.


Row 1230 identifies a time or time range at which various events 1231-1238 may have occurred in the period of time viewed within the graph 1205. Information on one or more of the events 1231-1238 may be provided in the window 1240. For example, the window 1240 may identify an event, a time that the event occurred, and/or a description of the event. Accordingly, a user may be able to identify times in which an oil well is not being drilled, reasons why such delays have occurred, and/or possible solutions for reducing such delays.


In further embodiments, not shown, additional data or curves can be included in the graph 1205. For example, a curve indicating levels of gamma radiation at the vertical position of the drill bit (e.g., a curve in which the y-axis value represents gamma radiation levels at the vertical position of the drill bit at a time instant and the x-axis value represents time), a curve indicating levels of gamma radiation at the bottom of the oil well (e.g., a curve in which the y-axis value represents gamma radiation levels at the bottom of the oil well at a time instant and the x-axis value represents time), a curve indicating levels of gamma radiation at a static or dynamic depth within the oil well (e.g., a curve in which the y-axis value represents gamma radiation levels at the static or dynamic depth and the x-axis value represents time), and/or the like may be included in the graph 1205.


Example Process Flow



FIG. 13 is a flowchart 1300 depicting an illustrative operation of accessing one or more databases in substantially real-time in response to input from a user provided in an interactive user interface in order to determine information related to measured data points and provide the determined information to the user in the interactive user interface. Depending on the embodiment, the method of FIG. 13 may be performed by various computing devices, such as by the computing system 1400 described below. Depending on the embodiment, the method of FIG. 13 may include fewer and/or additional blocks and the blocks may be performed in an order different than illustrated. While the flowchart 1300 is described with respect to sensor data depicted in graphs, this is not meant to be limiting. The illustrative operation depicted in the flowchart 1300 can be implemented on any type of time-series data from any source.


In block 1302, user interface data for rendering an interactive user interface is generated. The interactive user interface may include a first graph in a first container and a second graph in a second container that has a same width as the first container. The first graph may include first sensor values over a first time period and the second graph may include second sensor values over a second time period.
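As a non-limiting sketch of what the user interface data of block 1302 might contain, the two same-width containers could be described by a structure such as the following (TypeScript; all field and function names are assumptions rather than part of the disclosure):

```typescript
interface TimePeriod { start: Date; end: Date }
interface DataPoint { t: Date; value: number }

interface GraphContainer {
  widthPx: number;      // the second container has the same width as the first
  period: TimePeriod;   // the time period covered by this graph
  points: DataPoint[];  // sensor values within that period
}

function buildUserInterfaceData(
  widthPx: number,
  first: { period: TimePeriod; points: DataPoint[] },
  second: { period: TimePeriod; points: DataPoint[] },
): { firstGraph: GraphContainer; secondGraph: GraphContainer } {
  return {
    firstGraph: { widthPx, period: first.period, points: first.points },
    secondGraph: { widthPx, period: second.period, points: second.points },
  };
}
```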


In block 1304, an identification of a selection of a first data point in the first graph that corresponds to a first time range is received. For example, the first time range may be determined based on the time range that each individual pixel in the x-axis represents, as sketched below.
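One plausible mapping divides the first graph's time period evenly across the pixel columns of its container, so that a selection at pixel x resolves to the slice of time that column represents. This is an illustration only; the disclosure leaves the mapping open.

```typescript
// Maps a pixel column on the x-axis to the time range that column represents,
// assuming the time period is divided evenly across the container width.
function timeRangeForPixel(
  pixelX: number,
  widthPx: number,
  period: { start: Date; end: Date },
): { start: Date; end: Date } {
  const totalMs = period.end.getTime() - period.start.getTime();
  const msPerPixel = totalMs / widthPx;
  const startMs = period.start.getTime() + pixelX * msPerPixel;
  return { start: new Date(startMs), end: new Date(startMs + msPerPixel) };
}
```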


In block 1306, the user interface data is updated such that the interactive user interface includes a first marker at a location of the first data point in the first graph. The marker may be a vertical line that is temporarily displayed in the interactive user interface.


In block 1308, the database is accessed to determine one second sensor value that corresponds to a beginning of the first time range and another second sensor value that corresponds to an end of the first time range. For example, the second sensor value that corresponds to the beginning of the first time range may be a sensor value that was measured at a time that corresponds with the beginning of the first time range.
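Block 1308 thus amounts to two point lookups: the second sensor value measured at (or nearest before) the start of the range, and the one measured at (or nearest before) its end. The sketch below uses an in-memory array in place of a database query, and the helper names are invented:

```typescript
interface Measurement { t: Date; value: number }

// Returns the latest measurement taken at or before `when`, if any.
// A real system might instead issue an indexed range query against the database.
function valueAt(series: Measurement[], when: Date): Measurement | undefined {
  let best: Measurement | undefined;
  for (const m of series) {
    if (m.t.getTime() <= when.getTime() && (!best || m.t.getTime() > best.t.getTime())) {
      best = m;
    }
  }
  return best;
}

function boundaryValues(
  series: Measurement[],
  range: { start: Date; end: Date },
): { atStart?: Measurement; atEnd?: Measurement } {
  return { atStart: valueAt(series, range.start), atEnd: valueAt(series, range.end) };
}
```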


In block 1310, the user interface data is updated to include a second marker at a location of a second data point in the second graph that corresponds to the beginning of the first time range and a third marker at a location of a third data point in the second graph that corresponds to the end of the first time range. Thus, the user may be able to view, within the interactive user interface, first sensor values and second sensor values that were measured at the same time.
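Taken together, blocks 1306 and 1310 can be expressed as a pure transformation from a selection to a set of markers, as in the following sketch (field names invented for illustration):

```typescript
interface TimeRange { start: Date; end: Date }
interface Marker { graph: "first" | "second"; t: Date; shape: "vertical-line" | "point" }

function markersForSelection(selectionTime: Date, range: TimeRange): Marker[] {
  return [
    { graph: "first", t: selectionTime, shape: "vertical-line" }, // first marker (block 1306)
    { graph: "second", t: range.start, shape: "point" },          // second marker (block 1310)
    { graph: "second", t: range.end, shape: "point" },            // third marker (block 1310)
  ];
}
```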


Implementation Mechanisms


According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.


Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 14 is a block diagram that illustrates a computer system 1400 upon which an embodiment may be implemented. For example, any of the computing devices discussed herein may include some or all of the components and/or functionality of the computer system 1400.


Computer system 1400 includes a bus 1402 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 1404 coupled with bus 1402 for processing information. Hardware processor(s) 1404 may be, for example, one or more general purpose microprocessors.


Computer system 1400 also includes a main memory 1406, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 1402 for storing information and instructions to be executed by processor 1404. Main memory 1406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1404. Such instructions, when stored in storage media accessible to processor 1404, render computer system 1400 into a special-purpose machine that is customized to perform the operations specified in the instructions. Main memory 1406 may also store cached data, such as zoom levels and maximum and minimum sensor values at each zoom level.
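The cached data mentioned here could, for example, be organized as a small pyramid keyed by zoom level, with one pre-aggregated minimum/maximum pair per x-axis pixel, so that redrawing after a zoom change (see claims 16 through 18 below) is a cache lookup rather than a rescan of the raw measurements. A hedged TypeScript sketch, with an invented structure:

```typescript
// One pre-aggregated bucket per x-axis pixel at a given zoom level.
interface Bucket { min: number; max: number }

const zoomCache = new Map<number, Bucket[]>();

// Assumes `raw` and `widthPx` are fixed for a given zoom level, so the level
// alone is a sufficient cache key in this sketch.
function bucketsForZoom(level: number, raw: number[], widthPx: number): Bucket[] {
  const cached = zoomCache.get(level);
  if (cached) return cached;
  const perBucket = Math.max(1, Math.ceil(raw.length / widthPx));
  const buckets: Bucket[] = [];
  for (let i = 0; i < raw.length; i += perBucket) {
    const slice = raw.slice(i, i + perBucket);
    buckets.push({ min: Math.min(...slice), max: Math.max(...slice) });
  }
  zoomCache.set(level, buckets);
  return buckets;
}
```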


Computer system 1400 further includes a read only memory (ROM) 1408 or other static storage device coupled to bus 1402 for storing static information and instructions for processor 1404. A storage device 1410, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 1402 for storing information and instructions. For example, the storage device 1410 may store measurement data obtained from a plurality of sensors.


Computer system 1400 may be coupled via bus 1402 to a display 1412, such as a cathode ray tube (CRT) or LCD display (or touch screen), for displaying information to a computer user. For example, the display 1412 can be used to display any of the user interfaces described herein with respect to FIGS. 1 through 12. An input device 1414, including alphanumeric and other keys, is coupled to bus 1402 for communicating information and command selections to processor 1404. Another type of user input device is cursor control 1416, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1404 and for controlling cursor movement on display 1412. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allow the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.


Computing system 1400 may include a user interface module to implement and/or update (e.g., in response to the graph manipulations described herein) a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computer system 1400 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 1400 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 1400 in response to processor(s) 1404 executing one or more sequences of one or more instructions contained in main memory 1406. Such instructions may be read into main memory 1406 from another storage medium, such as storage device 1410. Execution of the sequences of instructions contained in main memory 1406 causes processor(s) 1404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 1410. Volatile media includes dynamic memory, such as main memory 1406. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.


Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 1404 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 1400 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 1402. Bus 1402 carries the data to main memory 1406, from which processor 1404 retrieves and executes the instructions. The instructions received by main memory 1406 may optionally be stored on storage device 1410 either before or after execution by processor 1404.


Computer system 1400 also includes a communication interface 1418 coupled to bus 1402. Communication interface 1418 provides a two-way data communication coupling to a network link 1420 that is connected to a local network 1422. For example, communication interface 1418 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 1418 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 1418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.


Network link 1420 typically provides data communication through one or more networks to other data devices. For example, network link 1420 may provide a connection through local network 1422 to a host computer 1424 or to data equipment operated by an Internet Service Provider (ISP) 1426. ISP 1426 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 1428. Local network 1422 and Internet 1428 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 1420 and through communication interface 1418, which carry the digital data to and from computer system 1400, are example forms of transmission media.


Computer system 1400 can send messages and receive data, including program code, through the network(s), network link 1420 and communication interface 1418. In the Internet example, a server 1430 might transmit a requested code for an application program through Internet 1428, ISP 1426, local network 1422 and communication interface 1418.


The received code may be executed by processor 1404 as it is received, and/or stored in storage device 1410, or other non-volatile storage for later execution.


Terminology


Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computing system comprising: a computer processor; and a computer readable storage medium storing program instructions configured for execution by the computer processor in order to cause a user interface module of the computing system to: generate user interface data for rendering an interactive user interface on a computing device, the interactive user interface including a first graph and a second graph, wherein the first graph includes first data values over a first time period and the second graph includes second data values over a second time period that is shorter than the first time period; receive an identification of a selection of a first data point in the first graph, wherein the first data point corresponds to a first time range; and in response to the selection of the first data point: update, by the user interface module, the user interface data such that the interactive user interface includes a first marker at a location of the first data point; and update, by the user interface module, the user interface data to include both a second marker at a location of a second data point in the second graph that corresponds to a beginning of the first time range and a third marker at a location of a third data point in the second graph that corresponds to an end of the first time range.
  • 2. The computing system of claim 1, wherein the program instructions are further configured to cause the computing system to: receive an indication of a change to the first time period in the first graph; in response to receiving the indication of the change to the first time period, adjust positions of the second and third markers indicating the first time range in the second graph.
  • 3. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data to include a list of events that occurred within the first time range.
  • 4. The computing system of claim 3, wherein the first graph, for each event that occurred within the first time range, includes a graphical representation that indicates a data point on the first graph that corresponds with a time that the respective event occurred.
  • 5. The computing system of claim 4, wherein the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data to include a fourth marker at a first location in the first graph corresponding to a first event in the list of events in response to selection of a second location that corresponds to the first event.
  • 6. The computing system of claim 3, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of a selection in the first graph at a fourth data point such that a new event is added at a time that corresponds with the fourth data point; and update the user interface data to include an identification of the new event.
  • 7. The computing system of claim 6, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication that the new event corresponds with the first graph; and update the user interface data such that a first graphical representation is displayed in the first graph at the time that corresponds with the fourth data point.
  • 8. The computing system of claim 6, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication that the new event corresponds with the second graph; and update the user interface data such that a first graphical representation is displayed in the second graph at the time that corresponds with the fourth data point.
  • 9. The computing system of claim 6, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication that the new event corresponds with the first graph and the second graph; and update the user interface data such that a first graphical representation is displayed in the first graph at the time that corresponds with the fourth data point and a second graphical representation is displayed in the second graph at the time that corresponds with the fourth data point.
  • 10. The computing system of claim 3, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection of a first event in the list of events; and update the user interface data such that the first graph includes a graphical representation at a position of a data point in the first graph that corresponds with the first event.
  • 11. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection of a first location corresponding to the first time range in the first graph; and update the user interface data such that the first graph includes a fourth marker at the first location in the first graph corresponding to the first time range.
  • 12. The computing system of claim 11, wherein the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the second graph includes a fifth marker at a second location in the second graph corresponding to the first time range.
  • 13. The computing system of claim 12, wherein the computer readable storage medium further stores program instructions that cause the computing system to: receive an indication of selection of a third location corresponding to a second time in the first graph; and update the user interface data such that the first graph includes the fourth marker at the third location in the first graph corresponding to the second time.
  • 14. The computing system of claim 13, wherein the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the second graph includes the fifth marker at a location in the second graph corresponding to the second time.
  • 15. The computing system of claim 1, wherein the first data point comprises a line from a location in the first graph that corresponds with a highest value measured during the first time range to a location in the first graph that corresponds with a lowest value measured during the first time range.
  • 16. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to receive an indication that a zoom level of the first graph is adjusted from a first zoom level to a second zoom level.
  • 17. The computing system of claim 16, wherein the computer readable storage medium further stores program instructions that cause the computing system to retrieve, from a cache, for a second time range that corresponds to a first pixel in an x-axis of the first graph, a highest value measured during the second time range and a lowest value measured during the second time range.
  • 18. The computing system of claim 17, wherein the computer readable storage medium further stores program instructions that cause the computing system to update the user interface data such that the first graph includes a line from a location in the first graph that corresponds with the highest value to a location in the first graph that corresponds with the lowest value.
  • 19. A computer-implemented method comprising: generating user interface data for rendering an interactive user interface on a computing device, the interactive user interface including a first graph and a second graph, wherein the first graph includes first data values over a first time period and the second graph includes second data values over a second time period that is shorter than the first time period; receiving an identification of a selection of a first data point in the first graph, wherein the first data point corresponds to a first time range; and in response to the selection of the first data point: updating, by a user interface unit of the computing device, the user interface data such that the interactive user interface includes a first marker at a location of the first data point; and updating, by the user interface unit, the user interface data to include both a second marker at a location of a second data point in the second graph that corresponds to a beginning of the first time range and a third marker at a location of a third data point in the second graph that corresponds to an end of the first time range.
  • 20. Non-transitory, computer-readable storage media comprising computer-executable instructions for providing data in an interactive user interface, wherein the computer-executable instructions, when executed by a computer system, cause a user interface component of the computer system to: generate user interface data for rendering the interactive user interface on a computing device, the interactive user interface including a first graph and a second graph, wherein the first graph includes first data values over a first time period and the second graph includes second data values over a second time period that is shorter than the first time period; receive an identification of a selection of a first data point in the first graph, wherein the first data point corresponds to a first time range; and in response to the selection of the first data point: update, by the user interface component, the user interface data such that the interactive user interface includes a first marker at a location of the first data point; and update, by the user interface component, the user interface data to include both a second marker at a location of a second data point in the second graph that corresponds to a beginning of the first time range and a third marker at a location of a third data point in the second graph that corresponds to an end of the first time range.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/871,465, filed on Sep. 30, 2015, entitled “TIME-SERIES ANALYSIS SYSTEM”, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application No. 62/059,601, entitled “CHRONICLE TIME-SERIES ANALYSIS SYSTEM” and filed on Oct. 3, 2014, and U.S. Provisional Application No. 62/206,159, entitled “TIME-SERIES ANALYSIS SYSTEM” and filed on Aug. 17, 2015, which are hereby incorporated by reference in their entireties.

US Referenced Citations (718)
Number Name Date Kind
4881179 Vincent Nov 1989 A
5109399 Thompson Apr 1992 A
5241625 Epard et al. Aug 1993 A
5329108 Lamoure Jul 1994 A
5632009 Rao et al. May 1997 A
5670987 Doi et al. Sep 1997 A
5781704 Rossmo Jul 1998 A
5798769 Chiu et al. Aug 1998 A
5845300 Comer Dec 1998 A
5999911 Berg et al. Dec 1999 A
6057757 Arrowsmith et al. May 2000 A
6065026 Cornelia et al. May 2000 A
6091956 Hollenberg Jul 2000 A
6101479 Shaw Aug 2000 A
6161098 Wallman Dec 2000 A
6219053 Tachibana et al. Apr 2001 B1
6232971 Haynes May 2001 B1
6237138 Hameluck et al. May 2001 B1
6243706 Moreau et al. Jun 2001 B1
6247019 Davies Jun 2001 B1
6279018 Kudrolli et al. Aug 2001 B1
6341310 Leshem et al. Jan 2002 B1
6366933 Ball et al. Apr 2002 B1
6369835 Lin Apr 2002 B1
6370538 Lamping et al. Apr 2002 B1
6374251 Fayyad et al. Apr 2002 B1
6430305 Decker Aug 2002 B1
6456997 Shukla Sep 2002 B1
6523019 Borthwick Feb 2003 B1
6549944 Weinberg et al. Apr 2003 B1
6560620 Ching May 2003 B1
6581068 Bensoussan et al. Jun 2003 B1
6594672 Lampson et al. Jul 2003 B1
6631496 Li et al. Oct 2003 B1
6642945 Sharpe Nov 2003 B1
6665683 Meltzer Dec 2003 B1
6674434 Chojnacki et al. Jan 2004 B1
6714936 Nevin, III Mar 2004 B1
6775675 Nwabueze et al. Aug 2004 B1
6820135 Dingman Nov 2004 B1
6828920 Owen et al. Dec 2004 B2
6839745 Dingari et al. Jan 2005 B1
6850317 Mullins et al. Feb 2005 B2
6877137 Rivette et al. Apr 2005 B1
6944777 Belani et al. Sep 2005 B1
6944821 Bates et al. Sep 2005 B1
6967589 Peters Nov 2005 B1
6976210 Silva et al. Dec 2005 B1
6978419 Kantrowitz Dec 2005 B1
6980984 Huffman et al. Dec 2005 B1
6985950 Hanson et al. Jan 2006 B1
7036085 Barros Apr 2006 B2
7043702 Chi et al. May 2006 B2
7055110 Kupka et al. May 2006 B2
7086028 Davis et al. Aug 2006 B1
7139800 Bellotti et al. Nov 2006 B2
7158878 Rasmussen et al. Jan 2007 B2
7162475 Ackerman Jan 2007 B2
7168039 Bertram Jan 2007 B2
7171427 Witowski et al. Jan 2007 B2
7174377 Bernard et al. Feb 2007 B2
7194680 Roy et al. Mar 2007 B1
7213030 Jenkins May 2007 B1
7269786 Malloy et al. Sep 2007 B1
7278105 Kitts Oct 2007 B1
7290698 Poslinski et al. Nov 2007 B2
7333998 Heckerman et al. Feb 2008 B2
7370047 Gorman May 2008 B2
7379811 Rasmussen et al. May 2008 B2
7379903 Caballero et al. May 2008 B2
7392254 Jenkins Jun 2008 B1
7426654 Adams et al. Sep 2008 B2
7441182 Beilinson et al. Oct 2008 B2
7441219 Perry et al. Oct 2008 B2
7454466 Bellotti et al. Nov 2008 B2
7467375 Tondreau et al. Dec 2008 B2
7487139 Fraleigh et al. Feb 2009 B2
7502786 Liu et al. Mar 2009 B2
7525422 Bishop et al. Apr 2009 B2
7529727 Arning et al. May 2009 B2
7529734 Dirisala May 2009 B2
7558677 Jones Jul 2009 B2
7574409 Patinkin Aug 2009 B2
7574428 Leiserowitz et al. Aug 2009 B2
7579965 Bucholz Aug 2009 B2
7596285 Brown et al. Sep 2009 B2
7614006 Molander Nov 2009 B2
7617232 Gabbert et al. Nov 2009 B2
7620628 Kapur et al. Nov 2009 B2
7627812 Chamberlain et al. Dec 2009 B2
7634717 Chamberlain et al. Dec 2009 B2
7703021 Flam Apr 2010 B1
7706817 Bamrah et al. Apr 2010 B2
7712049 Williams et al. May 2010 B2
7716077 Mikurak May 2010 B1
7716140 Nielsen et al. May 2010 B1
7725530 Sah et al. May 2010 B2
7725547 Albertson et al. May 2010 B2
7730082 Sah et al. Jun 2010 B2
7730109 Rohrs et al. Jun 2010 B2
7765489 Shah Jul 2010 B1
7770100 Chamberlain et al. Aug 2010 B2
7805457 Viola et al. Sep 2010 B1
7809703 Balabhadrapatruni et al. Oct 2010 B2
7818291 Ferguson et al. Oct 2010 B2
7818658 Chen Oct 2010 B2
7870493 Pall et al. Jan 2011 B2
7877421 Berger et al. Jan 2011 B2
7880921 Dattilo et al. Feb 2011 B2
7894984 Rasmussen et al. Feb 2011 B2
7899611 Downs et al. Mar 2011 B2
7899796 Borthwick et al. Mar 2011 B1
7917376 Bellin et al. Mar 2011 B2
7920963 Jouline et al. Apr 2011 B2
7933862 Chamberlain et al. Apr 2011 B2
7941336 Robin-Jan May 2011 B1
7958147 Turner et al. Jun 2011 B1
7962281 Rasmussen et al. Jun 2011 B2
7962495 Jain et al. Jun 2011 B2
7962848 Bertram Jun 2011 B2
7966199 Frasher Jun 2011 B1
7970240 Chao et al. Jun 2011 B1
7971150 Raskutti et al. Jun 2011 B2
7984374 Caro et al. Jul 2011 B2
8001465 Kudrolli et al. Aug 2011 B2
8001482 Bhattiprolu et al. Aug 2011 B2
8010507 Poston et al. Aug 2011 B2
8010545 Stefik et al. Aug 2011 B2
8015487 Roy et al. Sep 2011 B2
8024778 Cash et al. Sep 2011 B2
8036632 Cona et al. Oct 2011 B1
8042110 Kawahara et al. Oct 2011 B1
8073857 Sreekanth Dec 2011 B2
8103543 Zwicky Jan 2012 B1
8134457 Velipasalar et al. Mar 2012 B2
8145703 Frishert et al. Mar 2012 B2
8185819 Sah et al. May 2012 B2
8191005 Baier et al. May 2012 B2
8214361 Sandler et al. Jul 2012 B1
8214764 Gemmell et al. Jul 2012 B2
8225201 Michael Jul 2012 B2
8229947 Fujinaga Jul 2012 B2
8230333 Decherd et al. Jul 2012 B2
8271461 Pike et al. Sep 2012 B2
8280880 Aymeloglu et al. Oct 2012 B1
8290838 Thakur et al. Oct 2012 B1
8290926 Ozzie et al. Oct 2012 B2
8290942 Jones et al. Oct 2012 B2
8301464 Cave et al. Oct 2012 B1
8301904 Gryaznov Oct 2012 B1
8302855 Ma et al. Nov 2012 B2
8312367 Foster Nov 2012 B2
8312546 Alme Nov 2012 B2
8352881 Champion et al. Jan 2013 B2
8368695 Howell et al. Feb 2013 B2
8386377 Xiong et al. Feb 2013 B1
8392556 Goulet et al. Mar 2013 B2
8397171 Klassen et al. Mar 2013 B2
8412707 Mianji Apr 2013 B1
8447722 Ahuja et al. May 2013 B1
8452790 Mianji May 2013 B1
8463036 Ramesh et al. Jun 2013 B1
8489331 Kopf et al. Jul 2013 B2
8489641 Seefeld et al. Jul 2013 B1
8498984 Hwang et al. Jul 2013 B1
8510743 Hackborn et al. Aug 2013 B2
8514082 Cova et al. Aug 2013 B2
8515207 Chau Aug 2013 B2
8527949 Pleis et al. Sep 2013 B1
8554579 Tribble et al. Oct 2013 B2
8554653 Falkenborg et al. Oct 2013 B2
8554709 Goodson et al. Oct 2013 B2
8560413 Quarterman Oct 2013 B1
8577911 Stepinski et al. Nov 2013 B1
8589273 Creeden et al. Nov 2013 B2
8595234 Siripuapu et al. Nov 2013 B2
8620641 Farnsworth et al. Dec 2013 B2
8639757 Zang et al. Jan 2014 B1
8646080 Williamson et al. Feb 2014 B2
8676857 Adams et al. Mar 2014 B1
8682696 Shanmugam Mar 2014 B1
8688573 Ruknoic et al. Apr 2014 B1
8689108 Duffield et al. Apr 2014 B1
8713467 Goldenberg et al. Apr 2014 B1
8726379 Stiansen et al. May 2014 B1
8732574 Burr et al. May 2014 B2
8739278 Varghese May 2014 B2
8742934 Sarpy et al. Jun 2014 B1
8744890 Bernier Jun 2014 B1
8745516 Mason et al. Jun 2014 B2
8781169 Jackson et al. Jul 2014 B2
8787939 Papakipos et al. Jul 2014 B2
8788407 Singh et al. Jul 2014 B1
8798354 Bunzel et al. Aug 2014 B1
8799313 Satlow Aug 2014 B2
8799799 Cervelli et al. Aug 2014 B1
8807948 Luo et al. Aug 2014 B2
8812960 Sun et al. Aug 2014 B1
8830322 Nerayoff et al. Sep 2014 B2
8832594 Thompson et al. Sep 2014 B1
8868537 Colgrove et al. Oct 2014 B1
8917274 Ma et al. Dec 2014 B2
8924872 Bogomolov et al. Dec 2014 B1
8930874 Duff et al. Jan 2015 B2
8937619 Sharma et al. Jan 2015 B2
8938686 Erenrich et al. Jan 2015 B1
8984390 Aymeloglu et al. Mar 2015 B2
9009171 Grossman et al. Apr 2015 B1
9009827 Albertson et al. Apr 2015 B1
9021260 Falk et al. Apr 2015 B1
9021384 Beard et al. Apr 2015 B1
9043696 Meiklejohn et al. May 2015 B1
9043894 Dennison et al. May 2015 B1
9058315 Burr et al. Jun 2015 B2
9069842 Melby Jun 2015 B2
9116975 Shankar et al. Aug 2015 B2
9165100 Begur et al. Oct 2015 B2
9286373 Elliot et al. Mar 2016 B2
9348880 Kramer et al. May 2016 B1
20010021936 Bertram Sep 2001 A1
20020032677 Morgenthaler et al. Mar 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020065708 Senay et al. May 2002 A1
20020091707 Keller Jul 2002 A1
20020095360 Joao Jul 2002 A1
20020095658 Shulman Jul 2002 A1
20020103705 Brady Aug 2002 A1
20020116120 Ruiz et al. Aug 2002 A1
20020130907 Chi et al. Sep 2002 A1
20020174201 Ramer et al. Nov 2002 A1
20020194119 Wright et al. Dec 2002 A1
20020196229 Chen et al. Dec 2002 A1
20030028560 Kudrolli et al. Feb 2003 A1
20030036848 Sheha et al. Feb 2003 A1
20030036927 Bowen Feb 2003 A1
20030039948 Donahue Feb 2003 A1
20030061132 Mason et al. Mar 2003 A1
20030093755 O'Carroll May 2003 A1
20030126102 Borthwick Jul 2003 A1
20030140106 Raguseo Jul 2003 A1
20030144868 MacIntyre et al. Jul 2003 A1
20030163352 Surpin et al. Aug 2003 A1
20030200217 Ackerman Oct 2003 A1
20030225755 Iwayama et al. Dec 2003 A1
20030229848 Arend et al. Dec 2003 A1
20040032432 Baynger Feb 2004 A1
20040034570 Davis Feb 2004 A1
20040044648 Anfindsen et al. Mar 2004 A1
20040064256 Barinek et al. Apr 2004 A1
20040078451 Dietz et al. Apr 2004 A1
20040085318 Hassler et al. May 2004 A1
20040095349 Bito et al. May 2004 A1
20040111410 Burgoon et al. Jun 2004 A1
20040126840 Cheng et al. Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040143796 Lerner et al. Jul 2004 A1
20040163039 Gorman Aug 2004 A1
20040181554 Heckerman et al. Sep 2004 A1
20040193600 Kaasten et al. Sep 2004 A1
20040205492 Newsome Oct 2004 A1
20040221223 Yu et al. Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040236711 Nixon et al. Nov 2004 A1
20040260702 Cragun et al. Dec 2004 A1
20040267746 Marcjan et al. Dec 2004 A1
20050010472 Quatse et al. Jan 2005 A1
20050027705 Sadri et al. Feb 2005 A1
20050028094 Allyn Feb 2005 A1
20050039116 Slack-Smith Feb 2005 A1
20050039119 Parks et al. Feb 2005 A1
20050065811 Chu et al. Mar 2005 A1
20050078858 Yao et al. Apr 2005 A1
20050080769 Gemmell Apr 2005 A1
20050086207 Heuer et al. Apr 2005 A1
20050091186 Elish Apr 2005 A1
20050125715 Di Franco et al. Jun 2005 A1
20050154628 Eckart et al. Jul 2005 A1
20050154769 Eckart et al. Jul 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050166144 Gross Jul 2005 A1
20050180330 Shapiro Aug 2005 A1
20050182793 Keenan et al. Aug 2005 A1
20050183005 Denoue et al. Aug 2005 A1
20050210409 Jou Sep 2005 A1
20050246327 Yeung et al. Nov 2005 A1
20050251786 Citron et al. Nov 2005 A1
20060026120 Carolan et al. Feb 2006 A1
20060026170 Kreitler et al. Feb 2006 A1
20060026561 Bauman et al. Feb 2006 A1
20060031779 Theurer et al. Feb 2006 A1
20060045470 Poslinski et al. Mar 2006 A1
20060053097 King et al. Mar 2006 A1
20060053170 Hill et al. Mar 2006 A1
20060059139 Robinson Mar 2006 A1
20060059423 Lehmann et al. Mar 2006 A1
20060074866 Chamberlain et al. Apr 2006 A1
20060074881 Vembu et al. Apr 2006 A1
20060080139 Mainzer Apr 2006 A1
20060080283 Shipman Apr 2006 A1
20060080619 Carlson et al. Apr 2006 A1
20060093222 Saffer et al. May 2006 A1
20060129746 Porter Jun 2006 A1
20060136513 Ngo et al. Jun 2006 A1
20060139375 Rasmussen et al. Jun 2006 A1
20060142949 Helt Jun 2006 A1
20060143034 Rothermel Jun 2006 A1
20060143075 Carr et al. Jun 2006 A1
20060149596 Surpin et al. Jul 2006 A1
20060155654 Plessis et al. Jul 2006 A1
20060178915 Chao Aug 2006 A1
20060203337 White Sep 2006 A1
20060218637 Thomas et al. Sep 2006 A1
20060241974 Chao et al. Oct 2006 A1
20060242040 Rader et al. Oct 2006 A1
20060242630 Koike et al. Oct 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060271277 Hu et al. Nov 2006 A1
20060277460 Forstall et al. Dec 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20070000999 Kubo et al. Jan 2007 A1
20070011150 Frank Jan 2007 A1
20070016363 Huang et al. Jan 2007 A1
20070018986 Hauser Jan 2007 A1
20070038646 Thota Feb 2007 A1
20070038962 Fuchs et al. Feb 2007 A1
20070043686 Teng et al. Feb 2007 A1
20070057966 Ohno et al. Mar 2007 A1
20070061752 Cory Mar 2007 A1
20070078832 Ott et al. Apr 2007 A1
20070083541 Fraleigh et al. Apr 2007 A1
20070088596 Berkelhamer et al. Apr 2007 A1
20070094389 Nussey et al. Apr 2007 A1
20070113164 Hansen et al. May 2007 A1
20070136095 Weinstein Jun 2007 A1
20070150369 Zivin Jun 2007 A1
20070162454 D'Albora et al. Jul 2007 A1
20070168871 Jenkins Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070185850 Walters et al. Aug 2007 A1
20070192122 Routson et al. Aug 2007 A1
20070192265 Chopin et al. Aug 2007 A1
20070198571 Ferguson et al. Aug 2007 A1
20070208497 Downs et al. Sep 2007 A1
20070208498 Barker et al. Sep 2007 A1
20070208736 Tanigawa et al. Sep 2007 A1
20070233709 Abnous Oct 2007 A1
20070240062 Christena et al. Oct 2007 A1
20070245339 Bauman et al. Oct 2007 A1
20070266336 Nojima et al. Nov 2007 A1
20070284433 Domenica et al. Dec 2007 A1
20070294643 Kyle Dec 2007 A1
20070299697 Friedlander et al. Dec 2007 A1
20080016155 Khalatian Jan 2008 A1
20080040275 Paulsen et al. Feb 2008 A1
20080040684 Crump Feb 2008 A1
20080051989 Welsh Feb 2008 A1
20080052142 Bailey et al. Feb 2008 A1
20080077597 Butler Mar 2008 A1
20080077642 Carbone et al. Mar 2008 A1
20080082486 Lermant et al. Apr 2008 A1
20080091693 Murthy Apr 2008 A1
20080104019 Nath May 2008 A1
20080109714 Kumar et al. May 2008 A1
20080126951 Sood et al. May 2008 A1
20080148398 Mezack et al. Jun 2008 A1
20080155440 Trevor et al. Jun 2008 A1
20080162616 Gross et al. Jul 2008 A1
20080172607 Baer Jul 2008 A1
20080177782 Poston et al. Jul 2008 A1
20080186904 Koyama et al. Aug 2008 A1
20080195417 Surpin et al. Aug 2008 A1
20080195608 Clover Aug 2008 A1
20080208735 Balet et al. Aug 2008 A1
20080222295 Robinson et al. Sep 2008 A1
20080249820 Pathria Oct 2008 A1
20080249983 Meisels et al. Oct 2008 A1
20080255973 El Wade et al. Oct 2008 A1
20080263468 Cappione et al. Oct 2008 A1
20080267107 Rosenberg Oct 2008 A1
20080270328 Lafferty et al. Oct 2008 A1
20080276167 Michael Nov 2008 A1
20080278311 Grange et al. Nov 2008 A1
20080281819 Tenenbaum et al. Nov 2008 A1
20080288306 MacIntyre et al. Nov 2008 A1
20080288475 Kim et al. Nov 2008 A1
20080301042 Patzer Dec 2008 A1
20080301559 Martinsen et al. Dec 2008 A1
20080301643 Appleton et al. Dec 2008 A1
20080313132 Hao et al. Dec 2008 A1
20080313243 Poston et al. Dec 2008 A1
20080313281 Scheidl et al. Dec 2008 A1
20090002492 Velipasalar et al. Jan 2009 A1
20090024962 Gotz Jan 2009 A1
20090027418 Maru et al. Jan 2009 A1
20090030915 Winter et al. Jan 2009 A1
20090031401 Cudich et al. Jan 2009 A1
20090037912 Stoitsev et al. Feb 2009 A1
20090043801 LeClair Feb 2009 A1
20090055251 Shah et al. Feb 2009 A1
20090070162 Leonelli et al. Mar 2009 A1
20090076845 Bellin et al. Mar 2009 A1
20090088964 Schaaf et al. Apr 2009 A1
20090089651 Herberger et al. Apr 2009 A1
20090094270 Alirez et al. Apr 2009 A1
20090106178 Chu Apr 2009 A1
20090112678 Luzardo Apr 2009 A1
20090112745 Stefanescu Apr 2009 A1
20090119309 Gibson et al. May 2009 A1
20090125359 Knapic May 2009 A1
20090125369 Kloosstra et al. May 2009 A1
20090125459 Norton et al. May 2009 A1
20090132921 Hwangbo et al. May 2009 A1
20090132953 Reed et al. May 2009 A1
20090143052 Bates et al. Jun 2009 A1
20090144262 White et al. Jun 2009 A1
20090144274 Fraleigh et al. Jun 2009 A1
20090150868 Chakra et al. Jun 2009 A1
20090157732 Hao et al. Jun 2009 A1
20090164934 Bhattiprolu et al. Jun 2009 A1
20090171939 Athsani et al. Jul 2009 A1
20090172511 Decherd et al. Jul 2009 A1
20090172821 Daira et al. Jul 2009 A1
20090177962 Gusmorino et al. Jul 2009 A1
20090179892 Tsuda et al. Jul 2009 A1
20090187464 Bai et al. Jul 2009 A1
20090187546 Whyte et al. Jul 2009 A1
20090199106 Jonsson et al. Aug 2009 A1
20090216562 Faulkner et al. Aug 2009 A1
20090222400 Kupershmidt et al. Sep 2009 A1
20090222759 Drieschner Sep 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090228365 Tomchek et al. Sep 2009 A1
20090234720 George et al. Sep 2009 A1
20090248757 Havewala et al. Oct 2009 A1
20090249178 Ambrosino et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090254970 Agarwal et al. Oct 2009 A1
20090271343 Vaiciulis et al. Oct 2009 A1
20090281839 Lynn et al. Nov 2009 A1
20090282068 Shockro et al. Nov 2009 A1
20090287470 Farnsworth et al. Nov 2009 A1
20090292626 Oxford Nov 2009 A1
20090300589 Watters et al. Dec 2009 A1
20090307049 Elliott et al. Dec 2009 A1
20090313463 Pang et al. Dec 2009 A1
20090318775 Michelson et al. Dec 2009 A1
20090319891 MacKinlay Dec 2009 A1
20100004857 Pereira et al. Jan 2010 A1
20100011282 Dollard et al. Jan 2010 A1
20100042922 Bradateanu et al. Feb 2010 A1
20100057622 Faith et al. Mar 2010 A1
20100057716 Stefik et al. Mar 2010 A1
20100070523 Delgo et al. Mar 2010 A1
20100070842 Aymeloglu et al. Mar 2010 A1
20100070844 Aymeloglu et al. Mar 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070897 Aymeloglu et al. Mar 2010 A1
20100076813 Ghosh et al. Mar 2010 A1
20100098318 Anderson Apr 2010 A1
20100100963 Mahaffey Apr 2010 A1
20100103124 Kruzeniski et al. Apr 2010 A1
20100106752 Eckardt et al. Apr 2010 A1
20100114887 Conway et al. May 2010 A1
20100122152 Chamberlain et al. May 2010 A1
20100131457 Heimendinger May 2010 A1
20100162176 Dunton Jun 2010 A1
20100191563 Schlaifer et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20100223260 Wu Sep 2010 A1
20100228812 Uomini Sep 2010 A1
20100238174 Haub et al. Sep 2010 A1
20100250412 Wagner Sep 2010 A1
20100262901 DiSalvo Oct 2010 A1
20100280851 Merkin Nov 2010 A1
20100280857 Liu et al. Nov 2010 A1
20100293174 Bennett et al. Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100306722 LeHoty et al. Dec 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100313239 Chakra et al. Dec 2010 A1
20100318924 Frankel et al. Dec 2010 A1
20100321399 Ellren et al. Dec 2010 A1
20100325526 Ellis et al. Dec 2010 A1
20100325581 Finkelstein et al. Dec 2010 A1
20100330801 Rouh Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110004626 Naeymi-Rad et al. Jan 2011 A1
20110029526 Knight et al. Feb 2011 A1
20110047159 Baid et al. Feb 2011 A1
20110047540 Williams et al. Feb 2011 A1
20110060753 Shaked et al. Mar 2011 A1
20110061013 Bilicki et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110074788 Regan et al. Mar 2011 A1
20110074811 Hanson et al. Mar 2011 A1
20110078055 Faribault et al. Mar 2011 A1
20110078173 Seligmann et al. Mar 2011 A1
20110093327 Fordyce, III et al. Apr 2011 A1
20110099133 Chang et al. Apr 2011 A1
20110107196 Foster May 2011 A1
20110117878 Barash et al. May 2011 A1
20110119100 Ruhl et al. May 2011 A1
20110137766 Rasmussen et al. Jun 2011 A1
20110153384 Horne et al. Jun 2011 A1
20110161096 Buehler et al. Jun 2011 A1
20110161409 Nair Jun 2011 A1
20110167105 Ramakrishnan et al. Jul 2011 A1
20110170799 Carrino et al. Jul 2011 A1
20110173032 Payne et al. Jul 2011 A1
20110173093 Psota et al. Jul 2011 A1
20110179048 Satlow Jul 2011 A1
20110185316 Reid et al. Jul 2011 A1
20110208565 Ross et al. Aug 2011 A1
20110208724 Jones et al. Aug 2011 A1
20110213655 Henkin Sep 2011 A1
20110218934 Elser Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225198 Edwards et al. Sep 2011 A1
20110225482 Chan et al. Sep 2011 A1
20110225586 Bentley et al. Sep 2011 A1
20110225650 Margolies et al. Sep 2011 A1
20110238495 Kang Sep 2011 A1
20110238553 Raj et al. Sep 2011 A1
20110251951 Kolkowtiz Oct 2011 A1
20110258158 Resende et al. Oct 2011 A1
20110270705 Parker Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289407 Naik et al. Nov 2011 A1
20110289420 Morioka et al. Nov 2011 A1
20110291851 Whisenant Dec 2011 A1
20110310005 Chen et al. Dec 2011 A1
20110314007 Dassa et al. Dec 2011 A1
20120004894 Butler Jan 2012 A1
20120004904 Shin et al. Jan 2012 A1
20120019559 Siler et al. Jan 2012 A1
20120022945 Falkenborg et al. Jan 2012 A1
20120036013 Neuhaus et al. Feb 2012 A1
20120036434 Oberstein Feb 2012 A1
20120050293 Carlhian et al. Mar 2012 A1
20120059853 Jagota Mar 2012 A1
20120065987 Farooq et al. Mar 2012 A1
20120066296 Appleton et al. Mar 2012 A1
20120072825 Sherkin et al. Mar 2012 A1
20120079363 Folting et al. Mar 2012 A1
20120084117 Tavares et al. Apr 2012 A1
20120084118 Bai et al. Apr 2012 A1
20120084184 Raleigh Apr 2012 A1
20120106801 Jackson May 2012 A1
20120117082 Koperda et al. May 2012 A1
20120123989 Yu et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120137235 Ts et al. May 2012 A1
20120144335 Abeln et al. Jun 2012 A1
20120159307 Chung et al. Jun 2012 A1
20120159362 Brown et al. Jun 2012 A1
20120159399 Bastide et al. Jun 2012 A1
20120170847 Tsukidate Jul 2012 A1
20120173985 Peppel Jul 2012 A1
20120180002 Campbell et al. Jul 2012 A1
20120188252 Law Jul 2012 A1
20120196557 Reich et al. Aug 2012 A1
20120196558 Reich et al. Aug 2012 A1
20120197651 Robinson et al. Aug 2012 A1
20120197657 Prodanovic Aug 2012 A1
20120197660 Prodanovic Aug 2012 A1
20120203708 Psota et al. Aug 2012 A1
20120208636 Feige Aug 2012 A1
20120215784 King et al. Aug 2012 A1
20120221511 Gibson et al. Aug 2012 A1
20120221553 Wittmer et al. Aug 2012 A1
20120221580 Barney Aug 2012 A1
20120226590 Love et al. Sep 2012 A1
20120245976 Kumar et al. Sep 2012 A1
20120246148 Dror Sep 2012 A1
20120254129 Wheeler et al. Oct 2012 A1
20120266245 McDougal et al. Oct 2012 A1
20120284345 Costenaro et al. Nov 2012 A1
20120284670 Kashik et al. Nov 2012 A1
20120290879 Shibuya et al. Nov 2012 A1
20120296907 Long et al. Nov 2012 A1
20120304244 Xie et al. Nov 2012 A1
20120311684 Paulsen et al. Dec 2012 A1
20120323829 Stokes et al. Dec 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120330973 Ghuneim et al. Dec 2012 A1
20130006426 Healey et al. Jan 2013 A1
20130006725 Simanek et al. Jan 2013 A1
20130006916 McBride et al. Jan 2013 A1
20130016106 Yip et al. Jan 2013 A1
20130018796 Kolhatkar et al. Jan 2013 A1
20130024268 Manickavelu Jan 2013 A1
20130046635 Grigg et al. Feb 2013 A1
20130046842 Muntz et al. Feb 2013 A1
20130055264 Burr et al. Feb 2013 A1
20130060786 Serrano et al. Mar 2013 A1
20130061169 Pearcy et al. Mar 2013 A1
20130073377 Heath Mar 2013 A1
20130073454 Busch Mar 2013 A1
20130078943 Biage et al. Mar 2013 A1
20130086482 Parsons Apr 2013 A1
20130097482 Marantz et al. Apr 2013 A1
20130101159 Chao et al. Apr 2013 A1
20130110822 Ikeda et al. May 2013 A1
20130110877 Bonham et al. May 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117651 Waldman et al. May 2013 A1
20130124567 Balinsky et al. May 2013 A1
20130150004 Rosen Jun 2013 A1
20130151148 Parundekar et al. Jun 2013 A1
20130151305 Akinola et al. Jun 2013 A1
20130151388 Falkenborg et al. Jun 2013 A1
20130151453 Bhanot et al. Jun 2013 A1
20130157234 Gulli et al. Jun 2013 A1
20130166348 Scotto Jun 2013 A1
20130166480 Popescu et al. Jun 2013 A1
20130166550 Buchmann et al. Jun 2013 A1
20130176321 Mitchell et al. Jul 2013 A1
20130179420 Park et al. Jul 2013 A1
20130197925 Blue Aug 2013 A1
20130224696 Wolfe et al. Aug 2013 A1
20130225212 Khan Aug 2013 A1
20130226318 Procyk Aug 2013 A1
20130226953 Markovich et al. Aug 2013 A1
20130232045 Tai et al. Sep 2013 A1
20130238616 Rose et al. Sep 2013 A1
20130246170 Gross et al. Sep 2013 A1
20130251233 Yang et al. Sep 2013 A1
20130262527 Hunter et al. Oct 2013 A1
20130262528 Foit Oct 2013 A1
20130263019 Castellanos et al. Oct 2013 A1
20130267207 Hao et al. Oct 2013 A1
20130268520 Fisher et al. Oct 2013 A1
20130279757 Kephart Oct 2013 A1
20130282696 John et al. Oct 2013 A1
20130288719 Alonzo Oct 2013 A1
20130290011 Lynn et al. Oct 2013 A1
20130290825 Arndt et al. Oct 2013 A1
20130297619 Chandarsekaran et al. Nov 2013 A1
20130311375 Priebatsch Nov 2013 A1
20130325826 Agarwal et al. Dec 2013 A1
20140019936 Cohanoff Jan 2014 A1
20140032506 Hoey et al. Jan 2014 A1
20140033010 Richardt et al. Jan 2014 A1
20140040371 Gurevich et al. Feb 2014 A1
20140047319 Eberlein Feb 2014 A1
20140047357 Alfaro et al. Feb 2014 A1
20140058763 Zizzamia et al. Feb 2014 A1
20140059038 McPherson et al. Feb 2014 A1
20140067611 Adachi et al. Mar 2014 A1
20140068487 Steiger et al. Mar 2014 A1
20140074855 Zhao et al. Mar 2014 A1
20140081685 Thacker et al. Mar 2014 A1
20140089339 Siddiqui et al. Mar 2014 A1
20140095273 Tang et al. Apr 2014 A1
20140095363 Caldwell Apr 2014 A1
20140095509 Patton Apr 2014 A1
20140108068 Williams Apr 2014 A1
20140108380 Gotz et al. Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140129261 Bothwell et al. May 2014 A1
20140129936 Richards et al. May 2014 A1
20140149436 Bahrami et al. May 2014 A1
20140156484 Chan et al. Jun 2014 A1
20140156527 Grigg et al. Jun 2014 A1
20140157172 Peery et al. Jun 2014 A1
20140164502 Khodorenko et al. Jun 2014 A1
20140189536 Lange et al. Jul 2014 A1
20140195515 Baker et al. Jul 2014 A1
20140195887 Ellis et al. Jul 2014 A1
20140208281 Ming Jul 2014 A1
20140214579 Shen et al. Jul 2014 A1
20140222521 Chait Aug 2014 A1
20140222793 Sadkin et al. Aug 2014 A1
20140244284 Smith Aug 2014 A1
20140244388 Manouchehri et al. Aug 2014 A1
20140258246 Lo Faro et al. Sep 2014 A1
20140267294 Ma Sep 2014 A1
20140267295 Sharma Sep 2014 A1
20140279824 Tamayo Sep 2014 A1
20140310266 Greenfield Oct 2014 A1
20140316911 Gross Oct 2014 A1
20140333651 Cervelli et al. Nov 2014 A1
20140337772 Cervelli et al. Nov 2014 A1
20140344230 Krause et al. Nov 2014 A1
20140351070 Christner et al. Nov 2014 A1
20140358829 Hurwitz Dec 2014 A1
20150019394 Unser et al. Jan 2015 A1
20150026622 Roaldson et al. Jan 2015 A1
20150046870 Goldenberg et al. Feb 2015 A1
20150073929 Psota et al. Mar 2015 A1
20150073954 Braff Mar 2015 A1
20150089353 Folkening Mar 2015 A1
20150089424 Duffield et al. Mar 2015 A1
20150100897 Sun et al. Apr 2015 A1
20150100907 Erenrich et al. Apr 2015 A1
20150106379 Elliot et al. Apr 2015 A1
20150134666 Gattiker et al. May 2015 A1
20150169709 Kara et al. Jun 2015 A1
20150169726 Kara et al. Jun 2015 A1
20150170077 Kara et al. Jun 2015 A1
20150178825 Huerta Jun 2015 A1
20150178877 Bogomolov et al. Jun 2015 A1
20150186483 Tappan et al. Jul 2015 A1
20150186821 Wang et al. Jul 2015 A1
20150187036 Wang et al. Jul 2015 A1
20150212663 Papale et al. Jul 2015 A1
20150227295 Meiklejohn et al. Aug 2015 A1
20150242401 Liu Aug 2015 A1
20150254220 Burr et al. Sep 2015 A1
20150309719 Ma et al. Oct 2015 A1
20150317342 Grossman et al. Nov 2015 A1
20150324868 Kaftan et al. Nov 2015 A1
20160062555 Ward et al. Mar 2016 A1
20160098176 Cervelli et al. Apr 2016 A1
20160110369 Cervelli et al. Apr 2016 A1
20160162519 Stowe et al. Jun 2016 A1
Foreign Referenced Citations (48)
Number Date Country
2013251186 Nov 2015 AU
102054015 May 2014 CN
102014103482 Sep 2014 DE
102014215621 Feb 2015 DE
1672527 Jun 2006 EP
2551799 Jan 2013 EP
2560134 Feb 2013 EP
2778977 Sep 2014 EP
2835745 Feb 2015 EP
2835770 Feb 2015 EP
2838039 Feb 2015 EP
2846241 Mar 2015 EP
2851852 Mar 2015 EP
2858014 Apr 2015 EP
2858018 Apr 2015 EP
2863326 Apr 2015 EP
2863346 Apr 2015 EP
2869211 May 2015 EP
2881868 Jun 2015 EP
2884439 Jun 2015 EP
2884440 Jun 2015 EP
2891992 Jul 2015 EP
2911078 Aug 2015 EP
2911100 Aug 2015 EP
2940603 Nov 2015 EP
2940609 Nov 2015 EP
2993595 Mar 2016 EP
3002691 Apr 2016 EP
3009943 Apr 2016 EP
2516155 Jan 2015 GB
2518745 Apr 2015 GB
2012778 Nov 2014 NL
2013306 Feb 2015 NL
624557 Dec 2014 NZ
WO 2000009529 Feb 2000 WO
WO 01025906 Apr 2001 WO
WO 2001088750 Nov 2001 WO
WO 2002065353 Aug 2002 WO
WO 2005104736 Nov 2005 WO
WO 2007133206 Nov 2007 WO
WO 2008064207 May 2008 WO
WO 2009061501 May 2009 WO
WO 2010000014 Jan 2010 WO
WO 2010030913 Mar 2010 WO
WO 2010030914 Mar 2010 WO
WO 2012119008 Sep 2012 WO
WO 2013010157 Jan 2013 WO
WO 2013102892 Jul 2013 WO
Non-Patent Literature Citations (324)
Entry
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, pp. 30.
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, pp. 2.
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/.
Abbey, Kristen, “Review of Google Docs,” May 1, 2007, pp. 2.
About 80 Minutes, “Palantir in a Number of Parts—Part 6—Graph,” Mar. 21, 2013, pp. 1-6.
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316.
Adams et al., “Worklets: A Service-Oriented Implementation of Dynamic Flexibility in Workflows,” R. Meersman, Z. Tari et al. (Eds.): OTM 2006, LNCS, 4275, pp. 291-308, 2006.
Alur et al., “Chapter 2: IBM InfoSphere DataStage Stages,” IBM InfoSphere DataStage Data Flow and Job Design, Jul. 1, 2008, pp. 35-137.
Amnet, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html.
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/Desktop/javase6/modality/ Jan. 21, 2006, pp. 8.
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286.
Boyce, Jim, “Microsoft Outlook 2010 Inside Out,” Aug. 1, 2010, retrieved from the internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf.
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages.
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10.
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on May 18, 2015.
Chaudhuri et al., “An Overview of Business Intelligence Technology,” Communications of the ACM, Aug. 2011, vol. 54, No. 8.
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152.
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80.
Conner, Nancy, “Google Apps: The Missing Manual,” May 1, 2008, pp. 15.
Definition “Identify”, downloaded Jan. 22, 2015, 1 page.
Definition “Overlay”, downloaded Jan. 22, 2015, 1 page.
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” Bioinformatics, vol. 23, No. 6, 2007, pp. 673-679.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine, Jul. 2, 2005 in 10 pages, http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411.
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29.
Ferreira et al., “A Scheme for Analyzing Electronic Payment Systems,” Brazil, 1997.
Galliford, Miles, “SnagIt Versus Free Screen Capture Software: Critical Tools for Website Owners,” <http://www.subhub.com/articles/free-screen-capture-software>, Mar. 27, 2008, pp. 11.
Gesher, Ari, “Palantir Screenshots in the Wild: Swing Sightings,” The Palantir Blog, Sep. 11, 2007, pp. 1-12.
GIS-NET 3 Public, Department of Regional Planning, Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html.
Glaab et al., “EnrichNet: Network-Based Gene Set Enrichment Analysis,” Bioinformatics 28.18 (2012): pp. i451-i457.
Goswami, Gautam, “Quite Writly Said!,” One Brick at a Time, Aug. 21, 2005, pp. 7.
“GrabUp—What a Timesaver!” <http://atlchris.com/191/grabup/>, Aug. 11, 2008, pp. 3.
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144.
Gu et al., “Record Linkage: Current Practice and Future Directions,” Jan. 15, 2004, pp. 32.
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010.
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News on Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages.
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context, Mar. 18, 2011, pp. 16.
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95.
Hua et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, pp. 277-288, 2006.
Huang et al., “Systematic and Integrative Analysis of Large Gene Lists Using DAVID Bioinformatics Resources,” Nature Protocols, 4.1, 2008, 44-57.
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33.
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36.
Hur et al., “SciMiner: web-based literature mining tool for target identification and functional enrichment analysis,” Bioinformatics 25.6 (2009): pp. 838-840.
IBM, “Determining Business Object Structure,” IBM, 2004, 9 pages.
JetScreenshot.com, “Share Screenshots via Internet in Seconds,” <http://web.archive.org/web/20130807164204/http://www.jetscreenshot.com/>, Aug. 7, 2013, pp. 1.
Kahan et al., “Annotea: an Open RDF Infrastructure for Shared Web Annotations”, Computer Networks, Elsevier Science Publishers B.V., vol. 39, No. 5, dated Aug. 5, 2002, pp. 589-608.
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf> downloaded May 12, 2014 in 8 pages.
Keylines.com, “KeyLines Datasheet,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf> downloaded May 12, 2014 in 2 pages.
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf> downloaded May 12, 2014 in 10 pages.
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21.
Kwout, <http://web.archive.org/web/20080905132448/http://www.kwout.com/> Sep. 5, 2008, pp. 2.
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607.
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8.
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15.
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10.
Manske, “File Saving Dialogs,” <http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html>, Jan. 20, 1999, pp. 7.
Map Builder, “Rapid Mashup Development Tool for Google and Yahoo Maps!” <http://web.archive.org/web/20090626224734/http://www.mapbuilder.net/> printed Jul. 20, 2012 in 2 pages.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com.
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, <http://msdn.microsoft.com/en-us/library/ff604039%28v=office.14%29.aspx> as printed Apr. 4, 2014 in 17 pages.
Microsoft Office—Visio, “About connecting shapes,” <http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx> printed Aug. 4, 2011 in 6 pages.
Microsoft Office—Visio, “Add and glue connectors with the Connector tool,” <http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1> printed Aug. 4, 2011 in 1 page.
Microsoft Windows, “Microsoft Windows Version 2002 Print Out 2,” 2002, pp. 1-6.
Microsoft, “Registering an Application to a URI Scheme,” <http://msdn.microsoft.com/en-us/library/aa767914.aspx>, printed Apr. 4, 2009 in 4 pages.
Microsoft, “Using the Clipboard,” <http://msdn.microsoft.com/en-us/library/ms649016.aspx>, printed Jun. 8, 2009 in 20 pages.
Mizrachi, Ilene, “Chapter 1: GenBank: The Nucleotide Sequence Database,” The NCBI Handbook, Oct. 2002, pp. 1-14.
“Money Laundering Risks and E-Gaming: A European Overview and Assessment,” 2009, http://www.cf.ac.uk/socsi/resources/Levi_Final_Money_Laundering_Risks_egaming.pdf.
Nierman, “Evaluating Structural Similarity in XML Documents”, 6 pages, 2002.
Nitro, “Trick: How to Capture a Screenshot as PDF, Annotate, Then Share It,” <http://blog.nitropdf.com/2008/03/04/trick-how-to-capture-a-screenshot-as-pdf-annotate-it-then-share/>, Mar. 4, 2008, pp. 2.
Nolan et al., “MCARTA: A Malicious Code Automated Run-Time Analysis Framework,” 2012 IEEE Conference on Technologies for Homeland Security, Nov. 13, 2012, pp. 13-17.
Olanoff, Drew, “Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility,” May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http://techcrunch.com/2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/.
Online Tech Tips, “Clip2Net—Share files, folders and screenshots easily,” <http://www.online-tech-tips.com/free-software-downloads/share-files-folders-screenshots/>, Apr. 2, 2008, pp. 5.
O'Reilly.com, http://oreilly.com/digitalmedia/2006/01/01/mac-os-x-screenshot-secrets.html published Jan. 1, 2006 in 10 pages.
Palantir Technologies, “Palantir Labs—Timeline,” Oct. 1, 2010, retrieved from the internet https://www.youtube.com/watch?v=JCgDW5bru9M.
Palmas et al., “An Edge-Bundling Layout for Interactive Parallel Coordinates,” 2014 IEEE Pacific Visualization Symposium, pp. 57-64.
Perdisci et al., “Behavioral Clustering of HTTP-Based Malware and Signature Generation Using Malicious Network Traces,” USENIX, Mar. 18, 2010, pp. 1-14.
Quest, “Toad for Oracle 11.6—Guide to Using Toad,” Sep. 24, 2012, pp. 1-162.
Rouse, Margaret, “OLAP Cube,” <http://searchdatamanagement.techtarget.com/definition/OLAP-cube>, Apr. 28, 2012, pp. 16.
Schroder, Stan, “15 Ways to Create Website Screenshots,” <http://mashable.com/2007/08/24/web-screenshots/>, Aug. 24, 2007, pp. 2.
Shi et al., “A Scalable Implementation of Malware Detection Based on Network Connection Behaviors,” 2013 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery, IEEE, Oct. 10, 2013, pp. 59-66.
Sigrist, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166.
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11.
SnagIt, “SnagIt 8.1.0 Print Out 2,” Software release date Jun. 15, 2006, pp. 1-3.
SnagIt, “SnagIt 8.1.0 Print Out,” Software release date Jun. 15, 2006, pp. 6.
SnagIt, “SnagIt Online Help Guide,” <http://download.techsmith.com/snagit/docs/onlinehelp/enu/snagit_help.pdf>, TechSmith Corp., Version 8.1, printed Feb. 7, 2007, pp. 284.
Symantec Corporation, “E-Security Begins with Sound Security Policies,” Announcement Symantec, Jun. 14, 2001.
Thompson, Mick, “Getting Started with GEO,” Jul. 26, 2011.
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18.
“Using Whois Based Geolocation and Google Maps API for Support Cybercrime Investigations,” http://wseas.us/e-library/conferences/2013/Dubrovnik/TELECIRC/TELECIRC-32.pdf.
Wang et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter,” IEEE 2010, 5 pages.
Warren, Christina, “TUAW Faceoff: Screenshot apps on the firing line,” <http://www.tuaw.com/2008/05/05/tuaw-faceoff-screenshot-apps-on-the-firing-line/>, May 5, 2008, pp. 11.
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221.
Wright et al., “Palantir Technologies VAST 2010 Challenge Text Records: Investigations into Arms Dealing,” Oct. 29, 2010, pp. 1-10.
Yang et al., “HTML Page Analysis Based on Visual Cues”, A129, pp. 859-864, 2001.
Zheng et al., “GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis,” Nucleic Acids Research 36.suppl 2 (2008): pp. W358-W363.
Notice of Acceptance for Australian Patent Application No. 2013251186 dated Nov. 6, 2015.
Notice of Acceptance for Australian Patent Application No. 2014250678 dated Oct. 7, 2015.
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Apr. 11, 2016.
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Nov. 2, 2015.
Notice of Allowance for U.S. Appl. No. 13/247,987 dated Mar. 17, 2016.
Notice of Allowance for U.S. Appl. No. 14/102,394 dated Aug. 25, 2014.
Notice of Allowance for U.S. Appl. No. 14/108,187 dated Aug. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/135,289 dated Oct. 14, 2014.
Notice of Allowance for U.S. Appl. No. 14/148,568 dated Aug. 26, 2015.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Dec. 16, 2014.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Apr. 20, 2015.
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/265,637 dated Feb. 13, 2015.
Notice of Allowance for U.S. Appl. No. 14/268,964 dated Dec. 3, 2014.
Notice of Allowance for U.S. Appl. No. 14/294,098 dated Dec. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/323,935 dated Oct. 1, 2015.
Notice of Allowance for U.S. Appl. No. 14/326,738 dated Nov. 18, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,552 dated Jul. 24, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Feb. 27, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Jan. 5, 2015.
Notice of Allowance for U.S. Appl. No. 14/486,991 dated May 1, 2015.
Notice of Allowance for U.S. Appl. No. 14/504,103 dated May 18, 2015.
Notice of Allowance for U.S. Appl. No. 14/552,336 dated Nov. 3, 2015.
Notice of Allowance for U.S. Appl. No. 14/579,752 dated Apr. 4, 2016.
Notice of Allowance for U.S. Appl. No. 14/616,080 dated Apr. 2, 2015.
Notice of Allowance for U.S. Appl. No. 14/676,621 dated Feb. 10, 2016.
Notice of Allowance for U.S. Appl. No. 14/961,481 dated May 2, 2016.
Official Communication for Australian Patent Application No. 2013251186 dated Mar. 12, 2015.
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015.
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015.
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015.
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015.
Official Communication for Canadian Patent Application No. 2831660 dated Jun. 9, 2015.
Official Communication for European Patent Application No. 12181585.6 dated Sep. 4, 2015.
Official Communication for European Patent Application No. 14158861.6 dated Jun. 16, 2014.
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014.
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015.
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015.
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015.
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015.
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015.
Official Communication for European Patent Application No. 14187739.9 dated Jul. 6, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 19, 2016.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 29, 2016.
Official Communication for European Patent Application No. 14189347.9 dated Mar. 4, 2015.
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015.
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015.
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197938.5 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015.
Official Communication for European Patent Application No. 15155845.9 dated Oct. 6, 2015.
Official Communication for European Patent Application No. 15155846.7 dated Jul. 8, 2015.
Official Communication for European Patent Application No. 15165244.3 dated Aug. 27, 2015.
Official Communication for European Patent Application No. 15175106.2 dated Nov. 5, 2015.
Official Communication for European Patent Application No. 15175151.8 dated Nov. 25, 2015.
Official Communication for European Patent Application No. 15183721.8 dated Nov. 23, 2015.
Official Communication for European Patent Application No. 15188106.7 dated Feb. 3, 2016.
Official Communication for European Patent Application No. 15190307.7 dated Feb. 19, 2016.
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014.
Official Communication for Great Britain Patent Application No. 1404486.1 dated May 21, 2015.
Official Communication for Great Britain Patent Application No. 1404486.1 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404489.5 dated May 21, 2015.
Official Communication for Great Britain Patent Application No. 1404489.5 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404499.4 dated Aug. 20, 2014.
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014.
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014.
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014.
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015.
Official Communication for Netherlands Patent Application No. 2011729 dated Aug. 13, 2015.
Official Communication for Netherlands Patent Application No. 2012437 dated Sep. 18, 2015.
Official Communication for Netherlands Patent Application No. 2012438 dated Sep. 21, 2015.
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015.
Official Communication for New Zealand Patent Application No. 622473 dated Jun. 19, 2014.
Official Communication for New Zealand Patent Application No. 622473 dated Mar. 27, 2014.
Official Communication for New Zealand Patent Application No. 622513 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 624557 dated May 14, 2014.
Official Communication for New Zealand Patent Application No. 627962 dated Aug. 5, 2014.
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014.
Official Communication for New Zealand Patent Application No. 628263 dated Aug. 12, 2014.
Official Communication for New Zealand Patent Application No. 628495 dated Aug. 19, 2014.
Official Communication for New Zealand Patent Application No. 628585 dated Aug. 26, 2014.
Official Communication for New Zealand Patent Application No. 628840 dated Aug. 28, 2014.
Official Communication for U.S. Appl. No. 12/556,318 dated Jul. 2, 2015.
Official Communication for U.S. Appl. No. 12/556,321 dated Feb. 25, 2016.
Official Communication for U.S. Appl. No. 12/556,321 dated Jun. 6, 2012.
Official Communication for U.S. Appl. No. 12/556,321 dated Dec. 7, 2011.
Official Communication for U.S. Appl. No. 12/556,321 dated Jul. 7, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Apr. 2, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Sep. 22, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated Aug. 26, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated May 2, 2016.
Official Communication for U.S. Appl. No. 13/669,274 dated May 6, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Dec. 1, 2014.
Official Communication for U.S. Appl. No. 13/827,491 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Mar. 30, 2016.
Official Communication for U.S. Appl. No. 13/827,491 dated Oct. 9, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Feb. 11, 2016.
Official Communication for U.S. Appl. No. 13/831,791 dated Mar. 4, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Aug. 6, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 17, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Sep. 30, 2015.
Official Communication for U.S. Appl. No. 13/839,026 dated Aug. 4, 2015.
Official Communication for U.S. Appl. No. 14/102,394 dated Mar. 27, 2014.
Official Communication for U.S. Appl. No. 14/108,187 dated Apr. 17, 2014.
Official Communication for U.S. Appl. No. 14/108,187 dated Mar. 20, 2014.
Official Communication for U.S. Appl. No. 14/134,558 dated Oct. 7, 2015.
Official Communication for U.S. Appl. No. 14/135,289 dated Apr. 16, 2014.
Official Communication for U.S. Appl. No. 14/135,289 dated Jul. 7, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 26, 2015.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 27, 2014.
Official Communication for U.S. Appl. No. 14/192,767 dated Sep. 24, 2014.
Official Communication for U.S. Appl. No. 14/192,767 dated May 6, 2014.
Official Communication for U.S. Appl. No. 14/196,814 dated Aug. 13, 2014.
Official Communication for U.S. Appl. No. 14/196,814 dated May 5, 2015.
Official Communication for U.S. Appl. No. 14/196,814 dated Oct. 7, 2015.
Official Communication for U.S. Appl. No. 14/222,364 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 2, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 21, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 26, 2016.
Official Communication for U.S. Appl. No. 14/225,084 dated Jan. 4, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Jan. 25, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014.
Official Communication for U.S. Appl. No. 14/265,637 dated Sep. 26, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Jul. 11, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014.
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Sep. 4, 2015.
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 14, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/306,154 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Nov. 16, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/319,765 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014.
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014.
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/332,306 dated May 20, 2016.
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015.
Official Communication for U.S. Appl. No. 14/473,860 dated Nov. 4, 2014.
Official Communication for U.S. Appl. No. 14/479,160 dated Apr. 20, 2016.
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Aug. 18, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Jan. 27, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015.
Official Communication for U.S. Appl. No. 14/552,336 dated Jul. 20, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Nov. 10, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Feb. 23, 2016.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 24, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/631,633 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/631,633 dated Feb. 3, 2016.
Official Communication for U.S. Appl. No. 14/639,606 dated Oct. 16, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Apr. 5, 2016.
Official Communication for U.S. Appl. No. 14/676,621 dated Oct. 29, 2015.
Official Communication for U.S. Appl. No. 14/676,621 dated Jul. 30, 2015.
Official Communication for U.S. Appl. No. 14/715,834 dated Apr. 13, 2016.
Official Communication for U.S. Appl. No. 14/715,834 dated Feb. 19, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Mar. 1, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/741,256 dated Feb. 9, 2016.
Official Communication for U.S. Appl. No. 14/800,447 dated Dec. 10, 2015.
Official Communication for U.S. Appl. No. 14/800,447 dated Mar. 3, 2016.
Official Communication for U.S. Appl. No. 14/800,447 dated Jun. 6, 2016.
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015.
Official Communication for U.S. Appl. No. 14/813,749 dated Apr. 8, 2016.
Official Communication for U.S. Appl. No. 14/841,338 dated Feb. 18, 2016.
Official Communication for U.S. Appl. No. 14/842,734 dated Nov. 19, 2015.
Official Communication for U.S. Appl. No. 14/871,465 dated Feb. 9, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/961,481 dated Mar. 2, 2016.
Official Communication for U.S. Appl. No. 14/975,215 dated May 19, 2016.
Restriction Requirement for U.S. Appl. No. 13/839,026 dated Apr. 2, 2015.
Notice of Allowance for U.S. Appl. No. 14/871,465 dated Jul. 19, 2016.
Official Communication for European Patent Application No. 15188106.7 dated Feb. 21, 2017.
Official Communication for U.S. Appl. No. 14/871,465 dated Apr. 11, 2016.
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf.
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015.
Related Publications (1)
Number Date Country
20170139558 A1 May 2017 US
Provisional Applications (2)
Number Date Country
62206159 Aug 2015 US
62059601 Oct 2014 US
Continuations (1)
Number Date Country
Parent 14871465 Sep 2015 US
Child 15354868 US