Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases

Information

  • Patent Grant
  • Patent Number
    10,871,887
  • Date Filed
    Wednesday, November 29, 2017
  • Date Issued
    Tuesday, December 22, 2020
Abstract
Embodiments of the present disclosure relate to user interfaces and systems that may enable dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases. The data objects may be accessed from the one or more databases, and presented in multiple related portions of a display. In particular, the system provides a time-based visualization of data objects (and/or properties associated with the data objects) to a user such that the user may, for example, determine connections between various data objects, observe flows of information among data objects, and/or investigate related data objects.
Description
BACKGROUND

Embodiments of the present disclosure relate to systems and techniques for time-based display of data objects.


An example of a time-based display of data objects is a timeline. A timeline is a way of displaying a list of events in chronological order. A timeline is typically presented as a graphic in which a long bar is labeled with dates or times, with events marked at the points along the bar where they occurred.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


Embodiments of the present disclosure relate to a system that may enable efficient analysis of data objects. In particular, the system provides a time-based visualization of data objects (and/or properties associated with the data objects) to a user such that the user may, for example, determine connections between various data objects, observe flows of information among data objects, and/or investigate related data objects.


For example, the system described herein may show various data objects as graphical icons in a display area, and may indicate relationships among those data objects with graphical lines or links between the graphical icons. Further, there may be time-based properties associated with the data objects. These time-based properties may not be apparent in the display area, but may be made clearer in a time-based display. The time-based display may indicate time-dependent relationships among the various data objects. For example, the data objects may be represented by bars and/or lines along a timeline, and may be grouped together into bins corresponding to various periods of time.


The system enables a user to efficiently understand time-based relationships among the data objects, as the data objects may be simultaneously displayed with graphical links in one portion of the display, and in a time-based display in another portion. Interactions with the data objects in either portion of the display may cause corresponding indications and/or updates in another portion. For example, selection of particular data objects in the time-based portion of the display may cause the corresponding data objects as represented by graphical icons to be highlighted. Similarly, selection of graphical icons representing data objects may cause a corresponding highlighting and/or other adjustment to the time-based display. Additionally, the graphical icons may be highlighted in an animated fashion based on time-based properties associated with the corresponding data objects, enabling a user to efficiently determine a sequence of events associated with related data items.


According to an embodiment, a computer system is disclosed comprising one or more computer readable storage devices configured to store: a plurality of computer executable instructions; and a plurality of data objects, each of the data objects associated with one or more properties; and one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the plurality of computer executable instructions in order to cause the computer system to: access one or more data objects from the one or more computer readable storage devices; determine one or more relationships among the one or more data objects based on the one or more properties associated with respective ones of the one or more data objects; generate a data object display panel including one or more data objects in a graph layout, the graph layout including indications of the one or more relationships; determine a time-based property associated with each of at least some of the one or more data objects; generate a time-based display panel including representations of each of the at least some of the one or more data objects; display the data object display panel and the time-based display panel on an electronic display of the computer system; determine a window of time associated with the time-based display panel; and in response to receiving an input indicating a selection of a play indicator: move the window of time along the time-based display panel in an animated fashion; and highlight, in the data object display panel, data objects corresponding to the location of the window of time as it moves along the time-based display panel.


According to an aspect, the window of time is determined based on at least one of: a percentage of time represented in the time-based display panel, a percentage of the one or more data objects represented in the time-based display panel, a number of data objects represented in the time-based display panel, or an amount of time represented in the time-based display panel.


According to another aspect, the user may adjust the speed at which the window of time moves along the time-based display panel.


According to yet another aspect, highlighting the data objects comprises greying out any data objects not corresponding to the location of the window of time.


According to another aspect, the time-based display panel includes information bins in which the one or more data objects are placed.


According to yet another aspect, the information bins comprise at least one of bars or points along a line.


According to yet another aspect, in response to an input indicating a selection of one or more of the bars or the points along the line, corresponding data objects are highlighted in the data object display panel.


According to another aspect, the data object display panel and the time-based display panel are simultaneously displayed on the electronic display of the computer system.


According to yet another aspect, the time-based display panel comprises a timeline.


According to another aspect, the graph layout comprises graphical icons representing each of the one or more data objects, and wherein the indications of the one or more relationships comprise graphical lines connecting respective graphical icons representing related data objects.


According to another embodiment, a computer system is disclosed comprising: one or more computer readable storage devices configured to store: a plurality of computer executable instructions; and a plurality of data objects, each of the data objects associated with one or more properties; and one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the plurality of computer executable instructions in order to cause the computer system to: access one or more data objects from the one or more computer readable storage devices; generate a data object display panel including one or more data objects; determine a time-based property associated with each of at least some of the one or more data objects; generate a time-based display panel including representations of each of the at least some of the one or more data objects; display the data object display panel and the time-based display panel on an electronic display of the computer system; determine a window of time associated with the time-based display panel; and in response to receiving an input indicating a selection of a play indicator: move the window of time along the time-based display panel in an animated fashion; and highlight, in the data object display panel, data objects corresponding to the location of the window of time as it moves along the time-based display panel.


According to an aspect, the window of time is determined based on at least one of: a percentage of time represented in the time-based display panel, a percentage of the one or more data objects represented in the time-based display panel, a number of data objects represented in the time-based display panel, or an amount of time represented in the time-based display panel.


According to another aspect, the user may adjust the speed at which the window of time moves along the time-based display panel.


According to yet another aspect, highlighting the data objects comprises greying out any data objects not corresponding to the location of the window of time.


According to another aspect, the time-based display panel includes information bins in which the one or more data objects are placed.


According to yet another aspect, the information bins comprise at least one of bars or points along a line.


According to yet another aspect, in response to an input indicating a selection of one or more of the bars or the points along the line, corresponding data objects are highlighted in the data object display panel.


According to another aspect, the data object display panel and the time-based display panel are simultaneously displayed on the electronic display of the computer system.


According to yet another aspect, the time-based display panel comprises a timeline.


According to another aspect, the one or more hardware computer processors are configured to execute the plurality of computer executable instructions in order to further cause the computer system to: determine one or more relationships among the one or more data objects based on the one or more properties associated with respective ones of the one or more data objects, wherein the data object display panel includes the one or more data objects in a graph layout, the graph layout including indications of the one or more relationships.


According to yet another aspect, the graph layout comprises graphical icons representing each of the one or more data objects, and wherein the indications of the one or more relationships comprise graphical lines connecting respective graphical icons representing related data objects.


According to another aspect, the data object display panel includes the one or more data objects in at least one of a histogram, a table, a list, or a map.


In various embodiments, computer-implemented methods are disclosed in which, under control of one or more hardware computing devices configured with specific computer executable instructions, one or more aspects of the above-described embodiments are implemented and/or performed.


In various embodiments, a non-transitory computer-readable storage medium storing software instructions is disclosed that, in response to execution by a computer system having one or more hardware processors, configure the computer system to perform operations comprising one or more aspects of the above-described embodiments.


Further, as described herein, a system may be configured and/or designed to generate user interface data useable for rendering the various interactive user interfaces described. The user interface data may be used by the system, and/or another computer system, device, and/or software program (for example, a browser program), to render the interactive user interfaces. The interactive user interfaces may be displayed on, for example, electronic displays (including, for example, touch-enabled displays).


Additionally, it has been noted that design of computer user interfaces “that are useable and easily learned by humans is a non-trivial problem for software developers.” (Dillon, A. (2003) User Interface Design. MacMillan Encyclopedia of Cognitive Science, Vol. 4, London: MacMillan, 453-458.) The various embodiments of interactive and dynamic user interfaces of the present disclosure are the result of significant research, development, improvement, iteration, and testing. This non-trivial development has resulted in the user interfaces described herein which may provide significant cognitive and ergonomic efficiencies and advantages over previous systems. The interactive and dynamic user interfaces include improved human-computer interactions that may provide reduced mental workloads, improved decision-making, reduced work stress, and/or the like, for an analyst user.


Further, the interactive and dynamic user interfaces described herein are enabled by innovations in efficient interactions between the user interfaces and underlying systems and components. For example, disclosed herein are improved methods of receiving user inputs, translation and delivery of those inputs to various system components (for example, retrieval of relevant data objects and/or properties), automatic and dynamic execution of complex processes in response to the input delivery (for example, identifying related data objects and highlighting corresponding data objects in various views), automatic interaction among various components and processes of the system, and/or automatic and dynamic updating of the user interfaces. The interactions and presentation of data via the interactive user interfaces described herein may accordingly provide cognitive and ergonomic efficiencies and advantages over previous systems.


Advantageously, according to various embodiments, the disclosed techniques provide a more effective user interface for an investigation of data objects of various types. An analyst may be able to investigate a group of related data objects instead of an individual data object, and may be able to determine time-based relationships among the data objects that may not otherwise be apparent.





BRIEF DESCRIPTION OF THE DRAWINGS

The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims. Aspects and many of the attendant advantages of this disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:



FIGS. 1A-1Q illustrate example embodiments of user interfaces, and functionality associated with the example user interfaces, of a system including time-based displays of data objects.



FIG. 2 illustrates one embodiment of a database system using an ontology.



FIG. 3 illustrates one embodiment of a system for creating data in a data store using a dynamic ontology.



FIG. 4 illustrates a sample user interface using relationships described in a data store using a dynamic ontology.



FIG. 5 illustrates a computer system with which certain methods discussed herein may be implemented.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview


Although certain preferred embodiments and examples are disclosed below, inventive subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses and to modifications and equivalents thereof. Thus, the scope of the claims appended hereto is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated components or as separate components. For purposes of comparing various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may also be taught or suggested herein.


As mentioned above, various embodiments of an interactive time-based display of data objects (referred to herein as a “Time Chart”) are disclosed. The Time Chart provides a time-based visualization of data objects (and/or properties associated with the data objects) to a user such that the user may, for example, determine connections between various data objects, observe flows of information among data objects, and/or investigate related data objects.


For example, the system described herein may show various data objects as graphical icons in a display area, and may indicate relationships among those data objects with graphical lines or links between the graphical icons. Further, there may be time-based properties associated with the data objects. These time-based properties may not be apparent in the display area, but may be made clearer in a time-based display. The time-based display may indicate time-dependent relationships among the various data objects. For example, the data objects may be represented by bars and/or lines along a timeline, and may be grouped together into bins corresponding to various periods of time.


The system enables a user to efficiently understand time-based relationships among the data objects, as the data objects may be simultaneously displayed with graphical links in one portion of the display, and in a time-based display in another portion. Interactions with the data objects in either portion of the display may cause corresponding indications and/or updates in another portion. For example, selection of particular data objects in the time-based portion of the display may cause the corresponding data objects as represented by graphical icons to be highlighted. Similarly, selection of graphical icons representing data objects may cause a corresponding highlighting and/or other adjustment to the time-based display. Additionally, the graphical icons may be highlighted in an animated fashion based on time-based properties associated with the corresponding data objects, enabling a user to efficiently determine a sequence of events associated with related data items.



FIGS. 1A-1Q, described below, illustrate example embodiments of user interfaces, and functionality associated with the example user interfaces, of a system including the Time Chart.


Definitions


In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.


Ontology: Stored information that provides a data model for storage of data in one or more databases. For example, the stored data may comprise definitions for object types and property types for data in a database, and how objects and properties may be related.


Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (Oracle database, mySQL database, etc.), spreadsheets, XML files, and text files, among others.


Data Object or Object: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a phone call, a news report, or a written paper or article. Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g. metadata about the object) may be represented in one or more properties.


Object Type: Type of a data object (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g. an agent may be a sub-object type of a person object type), and the properties the object type may have.


Properties: Attributes of a data object that represent individual data items. At a minimum, each property of a data object has a property type and a value or values.


Property Type: The type of data a property is, such as a string, an integer, or a double. Property types may include complex property types, such as a series of data values associated with timed ticks (e.g. a time series), etc.


Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.


Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.


Link Set: Set of multiple links that are shared between two or more data objects.
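

To make the relationships among these defined terms concrete, the following is a minimal sketch, in TypeScript, of how an object-centric data model along the lines of the definitions above might be represented. All names here are illustrative assumptions, not the actual implementation of any embodiment described in this disclosure.

    // Sketch of the object-centric data model (hypothetical names).
    type PropertyValue = string | number | Date;

    interface Property {
      type: string;             // property type defined by the ontology, e.g., "Address"
      values: PropertyValue[];  // a property may have multiple values
    }

    interface DataObject {
      id: string;               // unique identifier for the data object
      objectType: string;       // e.g., "Person", "Event", or "Document"
      properties: Property[];
    }

    interface Link {
      from: string;             // id of one data object
      to: string;               // id of the other data object
      linkType: string;         // e.g., "Payment" or "Child Of"
      directional: boolean;     // e.g., a payment from person A to B is directional
    }

    // A link set is simply the collection of links shared between two or more objects.
    type LinkSet = Link[];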


Object Centric Data Model


To provide a framework for the following discussion of specific systems and methods described herein, an example database system 210 using an ontology 205 will now be described. This description is provided for the purpose of providing an example and is not intended to limit the techniques to the example data model, the example database system, or the example database system's use of an ontology to represent information.


In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 205. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 209 based on the ontology 205. For example, each object of the conceptual data model may correspond to one or more rows in a relational database or an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.



FIG. 2 illustrates an object-centric conceptual data model according to an embodiment. An ontology 205, as noted above, may include stored information providing a data model for storage of data in the database 209. The ontology 205 may be defined by one or more object types, which may each be associated with one or more property types. At the highest level of abstraction, data object 201 is a container for information representing things in the world. For example, data object 201 can represent an entity such as a person, a place, an organization, a market instrument, or other noun. Data object 201 can represent an event that happens at a point in time or for a duration. Data object 201 can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object 201 is associated with a unique identifier that uniquely identifies the data object within the database system.


Different types of data objects may have different property types. For example, a “Person” data object might have an “Eye Color” property type and an “Event” data object might have a “Date” property type. Each property 203 as represented by data in the database system 210 may have a property type defined by the ontology 205 used by the database 209.


Objects may be instantiated in the database 209 in accordance with the corresponding object definition for the particular object in the ontology 205. For example, a specific monetary payment (e.g., an object of type “event”) of US$30.00 (e.g., a property of type “currency”) taking place on Mar. 27, 2009 (e.g., a property of type “date”) may be stored in the database 209 as an event object with associated currency and date properties as defined within the ontology 205.


The data objects defined in the ontology 205 may support property multiplicity. In particular, a data object 201 may be allowed to have more than one property 203 of the same property type. For example, a “Person” data object might have multiple “Address” properties or multiple “Name” properties.
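

As a hedged illustration only, the monetary payment example and the property multiplicity just described might be instantiated with the sketch types introduced above as follows (all identifiers and values are hypothetical):

    // The US$30.00 payment event of Mar. 27, 2009, as an "Event" object.
    const payment: DataObject = {
      id: "event-001",
      objectType: "Event",
      properties: [
        { type: "Currency", values: [30.0] },                // US$30.00
        { type: "Date", values: [new Date("2009-03-27")] },  // Mar. 27, 2009
      ],
    };

    // Property multiplicity: a "Person" object with two "Address" properties.
    const person: DataObject = {
      id: "person-001",
      objectType: "Person",
      properties: [
        { type: "Name", values: ["A. Example"] },
        { type: "Address", values: ["123 Main St."] },
        { type: "Address", values: ["456 Oak Ave."] },
      ],
    };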


Each link 202 represents a connection between two data objects 201. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical. For example, “Person” data object A may be connected to “Person” data object B by a “Child Of” relationship (where “Person” data object B has an asymmetric “Parent Of” relationship to “Person” data object A), a “Kin Of” symmetric relationship to “Person” data object C, and an asymmetric “Member Of” relationship to “Organization” data object X. The type of relationship between two data objects may vary depending on the types of the data objects. For example, “Person” data object A may have an “Appears In” relationship with “Document” data object Y or have a “Participate In” relationship with “Event” data object E. As an example of an event connection, two “Person” data objects may be connected by an “Airline Flight” data object representing a particular airline flight if they traveled together on that flight, or by a “Meeting” data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for example, an “Appears In” relationship.


As an example of a matching properties connection, two “Person” data objects representing a brother and a sister may both have an “Address” property that indicates where they live. If the brother and the sister live in the same home, then their “Address” properties likely contain similar, if not identical, property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link, and other types of connections may be represented; embodiments are not limited to any particular types of connections between data objects. For example, a document might contain references to two different objects. For example, a document may contain a reference to a payment (one object) and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.
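

One way a matching-properties link such as the shared “Address” example could be detected is sketched below; this is an assumption about a possible implementation, not the algorithm used by any particular embodiment (it builds on the sketch types introduced above):

    // Infer a bidirectional link between two objects when they share at least
    // one value of the given property type.
    function inferMatchingPropertyLink(
      a: DataObject,
      b: DataObject,
      propertyType: string,
    ): Link | null {
      const valuesOf = (o: DataObject): string[] =>
        o.properties
          .filter((p) => p.type === propertyType)
          .flatMap((p) => p.values.map(String));
      const shared = valuesOf(a).filter((v) => valuesOf(b).includes(v));
      return shared.length > 0
        ? { from: a.id, to: b.id, linkType: `Matching ${propertyType}`, directional: false }
        : null;
    }

For the brother and sister above, inferMatchingPropertyLink(brother, sister, "Address") would return a “Matching Address” link whenever their “Address” property values coincide.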


Each data object 201 can have multiple links with another data object 201 to form a link set 204. For example, two “Person” data objects representing a husband and a wife could be linked through a “Spouse Of” relationship, a matching “Address” property, and one or more matching “Event” properties (e.g., a wedding). Each link 202 as represented by data in a database may have a link type defined by the database ontology used by the database.



FIG. 3 is a block diagram illustrating exemplary components and data that may be used in identifying and storing data according to an ontology. In this example, the ontology may be configured, and data in the data model populated, by a system of parsers and ontology configuration tools. In the embodiment of FIG. 3, input data 300 is provided to parser 302. The input data may comprise data from one or more sources. For example, an institution may have one or more databases with information on credit card transactions, rental cars, and people. The databases may contain a variety of related information and attributes about each type of data, such as a “date” for a credit card transaction, an address for a person, and a date for when a rental car is rented. The parser 302 is able to read a variety of source input data types and determine which type of data it is reading.


In accordance with the discussion above, the example ontology 205 comprises stored information providing the data model of data stored in database 209, and the ontology is defined by one or more object types 310, one or more property types 316, and one or more link types 330. Based on information determined by the parser 302 or other mapping of source input information to object type, one or more data objects 201 may be instantiated in the database 209 based on respective determined object types 310, and each of the objects 201 has one or more properties 203 that are instantiated based on property types 316. Two data objects 201 may be connected by one or more links 202 that may be instantiated based on link types 330. The property types 316 each may comprise one or more data types 318, such as a string, number, etc. Property types 316 may be instantiated based on a base property type 320. For example, a base property type 320 may be “Locations” and a property type 316 may be “Home.”


In an embodiment, a user of the system uses an object type editor 324 to create and/or modify the object types 310 and define attributes of the object types. In an embodiment, a user of the system uses a property type editor 326 to create and/or modify the property types 316 and define attributes of the property types. In an embodiment, a user of the system uses link type editor 328 to create the link types 330. Alternatively, other programs, processes, or programmatic controls may be used to create link types and property types and define attributes, and using editors is not required.


In an embodiment, creating a property type 316 using the property type editor 326 involves defining at least one parser definition using a parser editor 322. A parser definition comprises metadata that informs parser 302 how to parse input data 300 to determine whether values in the input data can be assigned to the property type 316 that is associated with the parser definition. In an embodiment, each parser definition may comprise a regular expression parser 304A or a code module parser 304B. In other embodiments, other kinds of parser definitions may be provided using scripts or other programmatic elements. Once defined, both a regular expression parser 304A and a code module parser 304B can provide input to parser 302 to control parsing of input data 300.


Using the data types defined in the ontology, input data 300 may be parsed by the parser 302 to determine which object type 310 should receive data from a record created from the input data, and which property types 316 should be assigned to data from individual field values in the input data. Based on the object-property mapping 301, the parser 302 selects one of the parser definitions that is associated with a property type in the input data. The parser parses an input data field using the selected parser definition, resulting in creating new or modified data 303. The new or modified data 303 is added to the database 209 according to ontology 205 by storing values of the new or modified data in a property of the specified property type. As a result, input data 300 having varying format or syntax can be created in database 209. The ontology 205 may be modified at any time using object type editor 324, property type editor 326, and link type editor 328, or under program control without human use of an editor. Parser editor 322 enables creating multiple parser definitions that can successfully parse input data 300 having varying format or syntax and determine which property types should be used to transform input data 300 into new or modified data 303.
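

The parser definitions described above might, as a rough sketch, look like the following; the names and shapes here are assumptions for illustration and do not reflect the actual parser 302 or its configuration format:

    // A regular-expression parser definition: metadata telling the parser
    // whether an input field can be assigned to a given property type.
    interface RegexParserDefinition {
      propertyType: string;  // property type this definition feeds, e.g., "Date"
      pattern: RegExp;       // expression that recognizes values of that type
    }

    const dateDefinition: RegexParserDefinition = {
      propertyType: "Date",
      pattern: /^\d{4}-\d{2}-\d{2}$/,  // e.g., "2009-03-27"
    };

    // Parse one input field: return a Property (per the sketch types above)
    // if the definition matches, or null if the field cannot be assigned.
    function parseField(field: string, def: RegexParserDefinition): Property | null {
      return def.pattern.test(field)
        ? { type: def.propertyType, values: [field] }
        : null;
    }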


The properties, objects, and links (e.g. relationships) between the objects can be visualized using a graphical user interface (GUI). For example, FIG. 4 displays a user interface showing a graph representation 403 of relationships (including relationships and/or links 404, 405, 406, 407, 408, 409, 410, 411, 412, and 413) between the data objects (including data objects 421, 422, 423, 424, 425, 426, 427, 428, and 429) that are represented as nodes in the example of FIG. 4. In this embodiment, the data objects include person objects 421, 422, 423, 424, 425, and 426; a flight object 427; a financial account 428; and a computer object 429. In this example, each person node (associated with person data objects), flight node (associated with flight data objects), financial account node (associated with financial account data objects), and computer node (associated with computer data objects) may have relationships and/or links with any of the other nodes through, for example, other objects such as payment objects.


For example, in FIG. 4, relationship 404 is based on a payment associated with the individuals indicated in person data objects 421 and 423. The link 404 represents these shared payments (for example, the individual associated with data object 421 may have paid the individual associated with data object 423 on three occasions). The relationship is further indicated by the common relationship between person data objects 421 and 423 and financial account data object 428. For example, link 411 indicates that person data object 421 transferred money into financial account data object 428, while person data object 423 transferred money out of financial account data object 428. In another example, the relationships between person data objects 424 and 425 and flight data object 427 are indicated by links 406, 409, and 410. In this example, person data objects 424 and 425 have a common address and were passengers on the same flight data object 427. In an embodiment, further details related to the relationships between the various objects may be displayed. For example, links 411 and 412 may, in some embodiments, indicate the timing of the respective money transfers. In another example, the time of the flight associated with the flight data object 427 may be shown.


Relationships between data objects may be stored as links, or in some embodiments, as properties, where a relationship may be detected between the properties. In some cases, as stated above, the links may be directional. For example, a payment link may have a direction associated with the payment, where one person object is the receiver of a payment, and another person object is the payer of the payment.


In addition to visually showing relationships between the data objects, the user interface may allow various other manipulations. For example, the objects within database 209 may be searched using a search interface 450 (e.g., text string matching of object properties), inspected (e.g., properties and associated data viewed), filtered (e.g., narrowing the universe of objects into sets and subsets by properties or relationships), and statistically aggregated (e.g., numerically summarized based on summarization criteria), among other operations and visualizations.
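

As a simple sketch of the filtering and statistical aggregation operations mentioned above (hypothetical helpers, not the system's actual search interface; they build on the sketch types introduced earlier):

    // Narrow the universe of objects to those having a given property value.
    function filterByProperty(
      objects: DataObject[],
      type: string,
      value: string,
    ): DataObject[] {
      return objects.filter((o) =>
        o.properties.some((p) => p.type === type && p.values.map(String).includes(value)),
      );
    }

    // Numerically summarize a set of objects by counting each object type.
    function countByObjectType(objects: DataObject[]): Map<string, number> {
      const counts = new Map<string, number>();
      for (const o of objects) {
        counts.set(o.objectType, (counts.get(o.objectType) ?? 0) + 1);
      }
      return counts;
    }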


Time Chart


As described above, FIGS. 1A-1Q illustrate example embodiments of user interfaces, and functionality associated with the example user interfaces, of a system including the Time Chart. It is to be understood that some of the various functionality, features, and aspects of the system as described below in reference to FIGS. 1A-1Q may or may not appear in any particular embodiment. For example, while many of the user interfaces of FIGS. 1A-1Q include both a Timeline and a Time Chart (as described below), in various embodiments the user interface includes only the Timeline or the Time Chart.



FIG. 1A shows an example user interface with various aspects that are similar among FIGS. 1A-1Q, and accordingly the following general description of the user interface of FIG. 1A provides a basis for the descriptions of the remaining figures (FIGS. 1B-1Q). The user interface of FIG. 1A includes a first time-based information display panel 1001 (“Timeline”), an object display panel 1002, and a second time-based information display panel 1003 (“Time Chart”). While the present description is focused on the Time Chart (1003) of the user interface, it is recognized that much of the description of the Timeline (1001) may also be applicable to the Time Chart (in various embodiments). Accordingly, in some embodiments the Time Chart may include one or more aspects of the functionality and/or features of the Timeline.


The object display panel 1002, in various embodiments, may include a display of data objects as described above in reference to FIG. 4. For example, data objects 1006 may be displayed which include a person, multiple emails, multiple phone calls (and/or phone numbers), and various connections among the data objects. As described above, these data objects 1006 may be selected and/or arranged by a user of the system. The various data objects may be associated with time-based properties, for example, the time an email was sent or the time a phone call was made. As described below, in various embodiments the Time Chart (and/or Timeline) provides time-based visualizations of the data objects (and/or properties associated with the data objects) to a user such that the user may, for example, determine connections between various data objects, observe flows of information among data objects, and/or generally investigate related data objects.


The Timeline 1001 includes bars (or information bins) 1005 that each indicate an absolute number of various types of objects that are associated with particular time periods (or bins). The time periods (for example, the length of time spanned by each of the time periods) may be adjusted. Accordingly, more or fewer objects may be associated with each time period (or bin), and greater or lesser granularity may be displayed to the user. As shown, the numbers of each type of object included in a given bar are represented by coloring of the bar. The numbers of objects are stacked in each bar such that the total number of objects is shown. On the left of the Timeline, at 1004, a breakdown of the types of objects (including Email and Phone Call events) and associated properties is shown. The various objects/events may be individually selected or deselected such that only selected objects are displayed in the Timeline. Also shown are the colors related to each type of object.
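

The stacked bars of the Timeline suggest a binning step along the following lines; this is a minimal sketch under assumed names (using the sketch types introduced earlier and an assumed "Date" property), not the actual rendering code:

    // Count objects of each type per fixed-width time period (bin), based on
    // each object's time-based property.
    function binByPeriod(
      objects: DataObject[],
      start: Date,
      binMillis: number,
    ): Map<number, Map<string, number>> {
      const bins = new Map<number, Map<string, number>>();
      for (const obj of objects) {
        const v = obj.properties.find((p) => p.type === "Date")?.values[0];
        if (v === undefined) continue;  // skip objects without a time-based property
        const t = v instanceof Date ? v.getTime() : new Date(v).getTime();
        const bin = Math.floor((t - start.getTime()) / binMillis);
        const counts = bins.get(bin) ?? new Map<string, number>();
        counts.set(obj.objectType, (counts.get(obj.objectType) ?? 0) + 1);
        bins.set(bin, counts);
      }
      return bins;
    }

Each inner map then yields one stacked bar: the per-type counts are drawn on top of one another so that the bar's total height shows the total number of objects in that bin.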


Similar to the Timeline, the Time Chart 1003 includes bars (or information bins) 1007 that each indicate absolute numbers of objects associated with each time period (or bin). FIG. 1B shows the Time Chart 1003 in further detail. In the present embodiment the Time Chart includes a chart area 1021, an information panel 1026, a legends panel selector 1022, a tools panel selector 1023, a play button 1024, and a chart overview/viewport 1025.


Markers 1028 indicate the portion of the chart that is currently displayed in the chart area 1021. As shown in example Time Chart 1030, the markers may show that a small portion 1031 of the overall chart is displayed in the chart area. Further, the user may move the markers independently or simultaneously to view particular portions of the chart. In an embodiment the Time Chart may additionally include a fit button for fitting all selected object data into the current chart display. In an embodiment the user may use a mouse wheel to expand or compress the viewport.


The information panel 1026 may display legend information and/or tools information. In FIG. 1B tools information is displayed, including a dropdown for a user to select a chart type (for example, bar chart or line chart), radio buttons for a user to select what portion of the data objects are to be used to populate the Time Chart (for example, all data points/objects or selected data points/objects), and/or a button to capture history snapshots (for example, capture a snapshot of the currently displayed Time Chart). Other options may include, for example, a selector for choosing which object properties to include in the Time Chart and a selector for choosing which time zone to display on the Time Chart.


In FIG. 1C legend information is displayed, including events and/or properties 1027. Various items of the legend information may be selected/de-selected and/or may be associated with colors, etc. (similar to the events and properties described above with respect to the Timeline). Thus, in an embodiment, the legend panel may be used to choose which object types to include in the Time Chart.


As described below, the play button 1024 may be used to cause the system to display current Time Chart data, as indicated by a time window, in chronological order. In an embodiment the play button causes the chronological display of information to play in a continuous loop.



FIG. 1D shows an example user interface similar to FIG. 1A in which, in response to the user selecting one of the bars (e.g., bar 1041) in the Timeline, corresponding data objects (e.g., object 1042) in the object display panel are highlighted. Similarly, the user may select a bar (e.g., bar 1043) in the Time Chart to cause the objects to be highlighted. In an embodiment, selecting a bar in one of the Timeline or Time Chart may cause a corresponding bar in the other to be highlighted.
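

A sketch of how selecting a bar might map back to object identifiers for highlighting is shown below; the helper name and the use of a "Date" property are assumptions for illustration (again building on the sketch types introduced earlier):

    // Return the ids of objects whose time-based property falls in the
    // selected bin; the object display panel would highlight these ids.
    function objectIdsInBin(
      objects: DataObject[],
      start: Date,
      binMillis: number,
      selectedBin: number,
    ): string[] {
      return objects
        .filter((obj) => {
          const v = obj.properties.find((p) => p.type === "Date")?.values[0];
          if (v === undefined) return false;
          const t = v instanceof Date ? v.getTime() : new Date(v).getTime();
          return Math.floor((t - start.getTime()) / binMillis) === selectedBin;
        })
        .map((obj) => obj.id);
    }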



FIG. 1E shows an example user interface similar to FIG. 1A in which the user has selected a window of time 1051. In various embodiments the user may move the beginning and end of the window of time, and/or may move the entire window of time (to, for example, scrub the window along the Timeline or Time Chart). As shown, in the object display panel any objects not within the window of time are greyed out (e.g., object 1052), while the objects within the window of time are not greyed out (e.g., object 1053).



FIG. 1F shows an example user interface similar to FIG. 1A in which the user has de-selected the “Phone Call” object type 1061. Accordingly, in the example user interface the phone call objects are no longer represented in the Timeline. As shown, when the user selects an information bin 1062 in the Timeline, the corresponding objects that are highlighted (e.g., object 1065) in the object display panel show that none of the objects are phone call objects. In the Time Chart, the highlighted portion of information bin 1063 represents object types still selected (e.g., Email), while the un-highlighted portion 1064 represents the un-selected object type (e.g., Phone Call).


Similar to FIG. 1D, FIG. 1G shows an example user interface similar to FIG. 1A in which, in response to the user selecting one of the bars (e.g., bar 1071) in the Time Chart, corresponding data objects (e.g., object 1072) in the object display panel are highlighted. As shown, because the phone call object type has been re-selected (1070), phone call objects are included in the Time Chart and are accordingly selected in the object display panel.



FIG. 1H shows an example user interface similar to FIG. 1A in which multiple information bins 1081 in the Time Chart are selected (and corresponding objects are highlighted in the object display panel).



FIG. 1I shows an example user interface similar to FIGS. 1A and 1E in which the user has selected a window of time 1091 on the Time Chart. As shown, objects that are not within the window of time are greyed out (e.g., object 1095), while objects that are within the window of time are not greyed out (e.g., object 1094). Additionally, as shown, objects corresponding to the selected bars of the Time Chart (1092) may still be highlighted (e.g., object 1093). In an embodiment, the user may select the play button 1096 to cause the window of time to automatically and smoothly move along the Time Chart as indicated by arrow 1097. Such automatic and smooth movement of the window of time along the Time Chart may be referred to herein as the window of time moving in an “animated fashion.” As used herein, the term “animated fashion” is a broad term encompassing its ordinary and customary meaning, and includes but is not limited to any type of movement and/or shape change of a displayed element (e.g., a graphical element such as the window of time 1091). Accordingly, movement of the window of time 1091 in an animated fashion as indicated by arrow 1097 means, in an embodiment, that the window of time 1091 appears to a user to slide along the Time Chart, sequentially entering and encompassing each of the bars of the Time Chart, until an end of the Time Chart is reached. In some instances, when an end of the Time Chart is reached, the window of time 1091 may gradually slide off the end and wrap around to a beginning of the Time Chart and continue. Alternatively, the window of time 1091 may stop when the end of the Time Chart is reached. In some embodiments, when only a portion of the whole Time Chart is viewable by the user due to, for example, a view of the Time Chart being zoomed in, as the window of time 1091 progresses along the Time Chart in the animated fashion, the Time Chart itself may begin to scroll along in an animated fashion to reveal additional portions of the Time Chart as the window of time 1091 advances. As the window of time moves along the Time Chart in an animated fashion, corresponding objects are greyed out or not greyed out based on the location of the window. As mentioned above, in an embodiment the play button causes the movement of the window to continuously loop back to the beginning of the Time Chart when the end of the Time Chart is reached.
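

A possible implementation of this play behavior is sketched below, assuming a timer that repeatedly slides the window and reports which objects fall inside it; the function and parameter names are hypothetical, and the sketch builds on the types introduced earlier:

    // Slide a window of time along the chart in an animated fashion. At each
    // step, report the ids of objects inside the window (the caller would
    // grey out all other objects), wrap around at the end of the chart, and
    // return a function that stops playback.
    function play(
      objects: DataObject[],
      chartStart: Date,
      chartEnd: Date,
      windowMillis: number,
      stepMillis: number,
      onStep: (insideIds: string[]) => void,
    ): () => void {
      let windowStart = chartStart.getTime();
      const timer = setInterval(() => {
        const windowEnd = windowStart + windowMillis;
        const inside = objects
          .filter((obj) => {
            const v = obj.properties.find((p) => p.type === "Date")?.values[0];
            if (v === undefined) return false;
            const t = v instanceof Date ? v.getTime() : new Date(v).getTime();
            return t >= windowStart && t < windowEnd;
          })
          .map((o) => o.id);
        onStep(inside);             // un-grey these objects; grey out the rest
        windowStart += stepMillis;  // advance the window along the Time Chart
        if (windowStart > chartEnd.getTime()) {
          windowStart = chartStart.getTime();  // loop back to the beginning
        }
      }, 50);  // redraw every 50 ms so the movement appears smooth
      return () => clearInterval(timer);  // call to stop playback
    }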



FIG. 1J shows the user interface of FIG. 1I after the window of time has moved (e.g., slid along the Time Chart in an animated fashion) to location 1111. As shown, the indicator 1112 shows the location of the window of time with respect to the entire Time Chart. In various embodiments, not just objects, but connections between objects are highlighted and/or greyed/not greyed out based on user selections and/or the location of the window of time on the Time Chart. In an embodiment, a size of the window of time may be automatically determined by the system. For example, the size of the window of time may be determined based on one or more of: a percentage of time represented in the Time Chart, a percentage of objects represented in the Time Chart, a number of objects represented in the Time Chart, an amount of time represented in the Time Chart, and/or any other factor or item of information. In an embodiment, the user may adjust the speed at which the window moves along the Time Chart when the play button is selected. In an embodiment, the system may determine a speed at which the window moves (which may be determined based on any of the factors or items of information mentioned above with respect to the size of the window).
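

For illustration, one way the system might derive a default window size from the factors listed above is sketched here; the specific percentage and threshold are assumptions, not values taken from this disclosure:

    // Choose a default window size: a fixed fraction of the time represented
    // in the Time Chart, widened if needed so an evenly distributed data set
    // would place at least a few objects inside the window at any position.
    function defaultWindowMillis(
      chartStart: Date,
      chartEnd: Date,
      objectCount: number,
    ): number {
      const span = chartEnd.getTime() - chartStart.getTime();
      const byTime = span * 0.10;  // assumed: 10% of the represented time
      const byCount =
        objectCount > 0 ? (span / objectCount) * 5 : byTime;  // assumed: ~5 objects
      return Math.min(span, Math.max(byTime, byCount));
    }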



FIG. 1K shows an example user interface similar to FIG. 1A in which the user may adjust the information bin size by selection from the dropdown 1121. In an embodiment, the bin size may be automatically determined by the system based on, for example, an amount of time represented and/or a number of objects represented in the Time Chart.
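

A corresponding sketch for automatic bin sizing, again with assumed heuristics rather than values from this disclosure:

    // Choose a bin size from the amount of time represented in the Time
    // Chart and the number of objects, aiming for a readable bar count.
    function defaultBinMillis(
      chartStart: Date,
      chartEnd: Date,
      objectCount: number,
    ): number {
      const span = chartEnd.getTime() - chartStart.getTime();
      const targetBins = Math.min(50, Math.max(10, objectCount));  // assumed bounds
      return Math.max(1, Math.floor(span / targetBins));
    }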



FIG. 1L shows an example user interface similar to FIG. 1A in which the user may change the Time Chart from Bar Chart to Line Chart via a dropdown 1131. FIG. 1M shows an example of the Line Chart version 1141 of the Time Chart. As shown, different colors of lines may represent different types of objects/events (e.g., purple line 1142 represents Phone Calls, while blue line 1143 represents Emails). In the Line Chart embodiment shown, objects represented in the Time Chart are not stacked (as they are in the Bar Chart).



FIG. 1N shows an example user interface similar to FIG. 1M in which a Line Chart version of the Time Chart is shown. As shown, the user may select a portion or region 1151 of the Time Chart that may include only objects of particular types for selected time periods. Accordingly, in the example shown, the user has selected four phone calls and eight emails. The user may select portions of the Line Chart, rather than particular information bins (as in the Bar Chart version of the Time Chart). As shown, objects corresponding to the selected portion of the Line Chart are highlighted in the object display panel.



FIG. 1O shows an example user interface similar to FIG. 1A in which the user has selected a particular subset of objects 1162 in the object display panel. The user may select the “Selected data points” radio button 1161 in the tools panel so that the Time Chart only shows a representation (e.g., bars or lines) of the selected subset of objects. FIG. 1P shows an example of the user selecting a portion or region 1171 of the Time Chart (which includes only the subset of objects selected in the object display panel as shown in FIG. 1O) and the corresponding objects being highlighted in the object display panel.



FIG. 1Q shows an example user interface similar to FIG. 1A in which a user may add an annotation 1181 to the Time Chart. In various embodiments the user may add annotations to particular objects, groups of objects, information bins (e.g., bars or points), and/or other items on the Time Chart.


In some embodiments the window of time may be referred to as a filter (e.g., a Time Chart or Timeline filter).


In various embodiments, the object display panel 1002 described above in reference to various figures may be replaced with another view of data objects. For example, rather than a graph layout or graph representation of data objects, object display panel 1002 may include data objects shown in another format and/or layout. Such other formats and/or layouts may include, for example, histograms of data objects, data objects represented on a map, data objects displayed in a postboard view, data objects displayed in a list, data objects displayed in a reader view, and/or the like. Other formats may further include, e.g., any representation of data objects in graphs, maps, tables, timelines, histograms, and/or lists, among other types of data visualizations. Examples of such formats of display of data objects are described in U.S. Pat. No. 8,713,467, titled “Context-Sensitive Views,” the disclosure of which is hereby incorporated by reference herein in its entirety and for all purposes.


As with the graph layout of the object display panel 1002 described above, other views of data objects in various other embodiments may function similarly. For example, the user may select data objects via the object display panel 1002 which may then be shown in, e.g., the Timeline and Time Chart. Further, when data objects are selected (e.g., by a user and/or as a window of time scrolls along) in any display (e.g., object display panel 1002, the Timeline, and/or the Time Chart), corresponding data objects in one or more of the other views may be highlighted or indicated, as described above.


Implementation Mechanisms


According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.


Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 5 is a block diagram that illustrates a computer system 800 upon which an embodiment may be implemented. Computer system 800 includes a bus 802 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 804 coupled with bus 802 for processing information. Hardware processor(s) 804 may be, for example, one or more general purpose microprocessors.


Computer system 800 also includes a main memory 806, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 802 for storing information and instructions to be executed by processor 804. Main memory 806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 804. Such instructions, when stored in storage media accessible to processor 804, render computer system 800 into a special-purpose machine that is customized to perform the operations specified in the instructions.


Computer system 800 further includes a read only memory (ROM) 808 or other static storage device coupled to bus 802 for storing static information and instructions for processor 804. A storage device 810, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 802 for storing information and instructions.


Computer system 800 may be coupled via bus 802 to a display 812, such as a cathode ray tube (CRT) or LCD display (or touch screen), for displaying information to a computer user. An input device 814, including alphanumeric and other keys, is coupled to bus 802 for communicating information and command selections to processor 804. Another type of user input device is cursor control 816, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 804 and for controlling cursor movement on display 812. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.


Computing system 800 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computer system 800 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 800 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 800 in response to processor(s) 804 executing one or more sequences of one or more instructions contained in main memory 806. Such instructions may be read into main memory 806 from another storage medium, such as storage device 810. Execution of the sequences of instructions contained in main memory 806 causes processor(s) 804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 810. Volatile media includes dynamic memory, such as main memory 806. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, any other memory chip or cartridge, and networked versions of the same.


Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 802. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 804 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 800 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 802. Bus 802 carries the data to main memory 806, from which processor 804 retrieves and executes the instructions. The instructions received by main memory 806 may optionally be stored on storage device 810 either before or after execution by processor 804.


Computer system 800 also includes a communication interface 818 coupled to bus 802. Communication interface 818 provides a two-way data communication coupling to a network link 820 that is connected to a local network 822. For example, communication interface 818 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 818 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
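As a non-limiting sketch of such a two-way data communication coupling, the following Python fragment uses the standard socket module to send and receive a digital data stream over a loopback connection. The echo behavior and the use of an ephemeral local port are illustrative assumptions only and do not describe communication interface 818 itself.

```python
# Minimal loopback sketch of a two-way data communication coupling.
# The echo protocol and ephemeral port are illustrative assumptions.
import socket
import threading


def echo_server(listener):
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(1024)      # receive a digital data stream
        conn.sendall(data.upper())  # send a transformed stream back


listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))     # bind to any free local port
listener.listen(1)
port = listener.getsockname()[1]

threading.Thread(target=echo_server, args=(listener,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"digital data stream")
    print(client.recv(1024))        # b'DIGITAL DATA STREAM'
```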


Network link 820 typically provides data communication through one or more networks to other data devices. For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP 826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 828. Local network 822 and Internet 828 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 820 and through communication interface 818, which carry the digital data to and from computer system 800, are example forms of transmission media.


Computer system 800 can send messages and receive data, including program code, through the network(s), network link 820 and communication interface 818. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822 and communication interface 818.


The received code may be executed by processor 804 as it is received, and/or stored in storage device 810, or other non-volatile storage for later execution.


Additional Embodiments

While the foregoing is directed to various embodiments, other and further embodiments may be devised without departing from the basic scope thereof. For example, aspects of the present disclosure may be implemented in hardware or software or in a combination of hardware and software. An embodiment of the disclosure may be implemented as a program product for use with a computer system. The program(s) of the program product define functions of the embodiments (including the methods described herein) and may be contained on a variety of computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, flash memory, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may alternatively be implemented partially or wholly in application-specific circuitry.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


The term “comprising” as used herein should be given an inclusive rather than exclusive interpretation. For example, a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.


The term “continuous,” as used herein, is a broad term encompassing its plain and ordinary meaning and, as used in reference to various types of activity (for example, scanning, monitoring, logging, and the like), includes without limitation substantially continuous activity and/or activity that may include periodic or intermittent pauses or breaks, but which accomplishes the intended purposes described (for example, continuous scanning may include buffering and/or storage of data that is thereafter processed, for example, in batch and/or the like).
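A minimal sketch of this usage, assuming hypothetical scan_events and process_batch helpers: scanning proceeds substantially continuously while data is buffered and thereafter processed in batch, consistent with the definition above.

```python
# Sketch of "continuous" scanning with intermittent pauses: scanned data
# is buffered and thereafter processed in batch. All names are illustrative.
import time


def scan_events():
    """Stand-in event source; a real scanner would poll a device or a log."""
    for i in range(10):
        yield {"seq": i, "ts": time.time()}


def process_batch(batch):
    print(f"processing batch of {len(batch)} events")


BATCH_SIZE = 4
buffer = []

for event in scan_events():        # substantially continuous activity
    buffer.append(event)           # buffering of scanned data
    if len(buffer) >= BATCH_SIZE:  # periodic pause to process in batch
        process_batch(buffer)
        buffer.clear()

if buffer:                         # flush any remaining buffered events
    process_batch(buffer)
```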


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention may be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computer system comprising: one or more computer readable storage devices configured to store: computer executable instructions; and data objects, the data objects associated with one or more properties; and one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the computer executable instructions to cause the computer system to: access a plurality of data objects from the one or more computer readable storage devices; determine one or more time-based properties associated with the plurality of data objects; generate, based on the time-based properties, a time-based display panel representing the plurality of data objects according to their associated time-based properties; display the time-based display panel on an electronic display of the computer system; determine a window of time associated with the time-based display panel; in response to receiving an input indicating a selection of a play indicator, automatically move the window of time with respect to the time-based display panel in an animated fashion; receive a second input, via the time-based display panel, providing a two-dimensional selection of one or more data objects within the time-based display panel; and in response to the second input, highlight, in a data object display panel separate from the time-based display panel, the one or more data objects.
  • 2. The computer system of claim 1, wherein the time-based display panel includes a timeline with bars or points along a line that visually represent groups associated with the data objects.
  • 3. The computer system of claim 1, wherein the one or more hardware computer processors are configured to execute the computer executable instructions to further cause the computer system to: display a viewport visually representing groups associated with the data objects, the time-based display panel corresponding to at least a portion of the viewport.
  • 4. The computer system of claim 3, wherein in response to a second input the computer system is configured to adjust a size parameter of the viewport.
  • 5. The computer system of claim 3, wherein the computer system is configured to automatically adjust a size parameter of the viewport.
  • 6. The computer system of claim 1, wherein providing a two-dimensional selection of one or more data objects comprises determining a size parameter of a selection window.
  • 7. The computer system of claim 1, wherein highlighting the one or more data objects comprises highlighting the one or more data objects corresponding to the location of the window of time as it moves along the time-based display panel.
  • 8. The computer system of claim 1, wherein highlighting the one or more data objects comprises greying out any data objects not corresponding to the location of the window of time.
  • 9. The computer system of claim 1, wherein the data object display panel and the time-based display panel are simultaneously displayed on the electronic display of the computer system.
  • 10. The computer system of claim 1, wherein the data object display panel includes the one or more data objects in at least one of a histogram, a table, a list, or a map.
  • 11. The computer system of claim 1, wherein in response to receiving the input indicating a selection of a play indicator, the computer system is configured to display a time window and current data of the time-based display panel.
  • 12. The computer system of claim 1, wherein the computer system is further configured to display an information panel comprising a legend and/or a toolbar.
  • 13. The computer system of claim 1, wherein the user may adjust the speed at which the window of time moves along the time-based display panel.
  • 14. A computer-implemented method comprising: accessing a plurality of data objects from one or more computer readable storage devices; determining one or more time-based properties associated with the plurality of data objects; generating, based on the time-based properties, a time-based display panel representing the plurality of data objects according to their associated time-based properties; displaying the time-based display panel on an electronic display of a computer system; determining a window of time associated with the time-based display panel; in response to receiving an input indicating a selection of a play indicator, automatically moving the window of time with respect to the time-based display panel in an animated fashion; receiving a second input, via the time-based display panel, providing a two-dimensional selection of one or more data objects within the time-based display panel; and in response to the second input, highlighting, in a data object display panel separate from the time-based display panel, the one or more data objects.
  • 15. The method of claim 14, wherein the time-based display panel includes a timeline with bars or points along a line that visually represent groups associated with the data objects.
  • 16. The method of claim 14, further comprising: displaying a viewport visually representing groups associated with the data objects, the time-based display panel corresponding to at least a portion of the viewport.
  • 17. The method of claim 16, further comprising: in response to a second input, adjusting a size parameter of the viewport.
  • 18. The method of claim 16, further comprising: automatically adjusting a size parameter of the viewport.
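For illustration only, the following Python sketch walks through the interaction flow recited in claims 1 and 14 under assumed, hypothetical data structures: DataObject, animate_window, and select_2d are invented names, not the claimed system's API. A window of time moves along a time-based panel as if a play indicator had been selected, and a two-dimensional selection determines which objects would be highlighted in a separate data object display panel.

```python
# Hypothetical sketch of the flow in claims 1 and 14; data structures and
# function names are illustrative assumptions, not the claimed system's API.
from dataclasses import dataclass


@dataclass
class DataObject:
    name: str
    timestamp: float  # time-based property of the data object
    row: int          # vertical position within the time-based display panel


def animate_window(objects, start, width, step, end):
    """Move the window of time along the panel, as after a play input."""
    t = start
    while t < end:
        current = [o for o in objects if t <= o.timestamp < t + width]
        print(f"window [{t}, {t + width}):", [o.name for o in current])
        t += step


def select_2d(objects, t0, t1, row0, row1):
    """Two-dimensional selection within the time-based display panel."""
    return [o for o in objects
            if t0 <= o.timestamp <= t1 and row0 <= o.row <= row1]


objects = [DataObject("A", 1.0, 0), DataObject("B", 2.5, 1), DataObject("C", 4.0, 0)]
animate_window(objects, start=0.0, width=2.0, step=1.0, end=5.0)
selected = select_2d(objects, 0.0, 3.0, 0, 1)
print("highlight in data object panel:", [o.name for o in selected])
```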
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/696,069, filed Apr. 24, 2015, and titled “SYSTEMS AND USER INTERFACES FOR DYNAMIC AND INTERACTIVE ACCESS OF, INVESTIGATION OF, AND ANALYSIS OF DATA OBJECTS STORED IN ONE OR MORE DATABASES,” which application claims benefit of U.S. Provisional Patent Application No. 61/985,403, filed Apr. 28, 2014, and titled “TIME-BASED DISPLAY OF DATA OBJECTS.” The entire disclosure of each of the above items is hereby made part of this specification as if set forth fully herein and incorporated by reference for all purposes, for all that it contains. Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.

Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Jan. 25, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Jul. 11, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014.
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Sep. 4, 2015.
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 14, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Jun. 3, 2016.
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/306,154 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Nov. 16, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/319,765 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014.
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014.
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015.
Official Communication for U.S. Appl. No. 14/473,860 dated Nov. 4, 2014.
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Aug. 18, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015.
Official Communication for U.S. Appl. No. 14/570,914 dated Sep. 16, 2016.
Official Communication for U.S. Appl. No. 14/570,914 dated Dec. 19, 2016.
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/596,552 dated Dec. 23, 2016.
Official Communication for U.S. Appl. No. 14/596,552 dated Sep. 23, 2016.
Official Communication for U.S. Appl. No. 14/596,552 dated Oct. 5, 2016.
Official Communication for U.S. Appl. No. 14/631,633 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Oct. 16, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015.
Official Communication for U.S. Appl. No. 14/645,304 dated Jan. 25, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015.
Official Communication for U.S. Appl. No. 14/874,690 dated Jun. 1, 2016.
Official Communication for U.S. Appl. No. 14/874,690 dated Dec. 21, 2015.
Official Communication for U.S. Appl. No. 14/948,009 dated Feb. 25, 2016.
Official Communication for U.S. Appl. No. 15/092,456 dated Nov. 4, 2016.
Official Communication for U.S. Appl. No. 15/392,624 dated Mar. 10, 2017.
Official Communication for U.S. Appl. No. 15/397,562 dated Mar. 14, 2017.
Official Communication for U.S. Appl. No. 15/397,562 dated May 24, 2017.
Restriction Requirement for U.S. Appl. No. 13/839,026 dated Apr. 2, 2015.
Notice of Allowance for U.S. Appl. No. 14/696,069 dated Aug. 29, 2017.
Official Communication for U.S. Appl. No. 14/696,069 dated Jul. 3, 2017.
Official Communication for U.S. Appl. No. 15/092,456 dated Mar. 21, 2017.
Related Publications (1)
US 2018/0081531 A1, published Mar. 2018 (US).
Provisional Applications (1)
U.S. Provisional Application No. 61/985,403, filed Apr. 2014 (US).
Continuations (1)
Parent: U.S. Appl. No. 14/696,069, filed Apr. 2015 (US). Child: U.S. Appl. No. 15/826,402 (US).