Interactive data object map

Information

  • Patent Grant
  • Patent Number
    9,953,445
  • Date Filed
    Thursday, July 3, 2014
  • Date Issued
    Tuesday, April 24, 2018
Abstract
An interactive data object map system is disclosed in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. The interactive data object map system allows for rapid and deep analysis of various objects, features, and/or metadata by the user. A layer ontology may be displayed to the user. In various embodiments, when the user rolls a selection cursor over an object/feature, an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature. The interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. The user may perform geosearches, generate heatmaps, and/or perform keyword searches, among other actions.
Description
TECHNICAL FIELD

The present disclosure relates to systems and techniques for geographical data integration, analysis, and visualization. More specifically, the present disclosure relates to interactive maps including data objects.


BACKGROUND

Interactive geographical maps, such as web-based mapping service applications and Geographical Information Systems (GIS), are available from a number of providers. Such maps generally comprise satellite images or generic base layers overlaid by roads. Users of such systems may generally search for and view locations of a small number of landmarks, and determine directions from one location to another. In some interactive graphical maps, 3D terrain and/or 3D buildings may be visible in the interface.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


The systems, methods, and devices of the present disclosure may provide, among other features, high-performance, interactive geospatial and/or data object map capabilities in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. In various embodiments, an interactive geospatial map system (also referred to as an interactive data object map system) may enable rapid and deep analysis of various objects, features, and/or metadata by the user. In some embodiments, a layer ontology may be displayed to the user. In various embodiments, when the user rolls a selection cursor over an object/feature, an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. Various aspects of the present disclosure may enable the user to perform geosearches, generate heatmaps, and/or perform keyword searches, among other actions.


In an embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on an electronic display of the computer system; include on the interactive map one or more features or objects, wherein the features or objects are selectable by a user of the computer system, and wherein the features or objects are accessed from the electronic data structure; receive a first input from the user selecting one or more of the included features or objects; and in response to the first input, access, from the electronic data structure, the metadata associated with each of the selected features or objects; determine one or more metadata categories based on the accessed metadata; organize the selected features or objects into one or more histograms based on the determined metadata categories and the accessed metadata; and display the one or more histograms on the electronic display.


According to an aspect, the features or objects may comprise vector data.


According to another aspect, the features or objects may comprise at least one of roads, terrain, lakes, rivers, vegetation, utilities, street lights, railroads, hotels or motels, schools, hospitals, buildings or structures, regions, transportation objects, entities, events, or documents.


According to yet another aspect, the metadata associated with the features or objects may comprise at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.


According to another aspect, the features or objects may be selectable by a user using a mouse and/or a touch interface.


According to yet another aspect, each histogram of the one or more histograms may be specific to a particular metadata category.


According to another aspect, each histogram of the one or more histograms may comprise a list of items of metadata specific to the particular metadata category of the histogram, wherein the list of items is organized in descending order from an item having the largest number of related objects or features to an item having the smallest number of related objects or features.


According to yet another aspect, the one or more histograms displayed on the electronic display may be displayed so as to partially overlay the displayed interactive map.


According to another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a second input from the user selecting a second one or more features or objects from the one or more histograms; and in response to the second input, update the interactive map to display the second one or more features or objects on the display; and highlight the second one or more features or objects on the interactive map.


According to yet another aspect, updating the interactive map may comprise panning and/or zooming.


According to another aspect, highlighting the second one or more features may comprise at least one of outlining, changing color, bolding, or changing contrast.


According to yet another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a third input from the user selecting a drill-down group of features or objects from the one or more histograms; and in response to the third input, drill-down on the selected drill-down group of features or objects by: accessing the metadata associated with each of the features or objects of the selected drill-down group; determining one or more drill-down metadata categories based on the accessed metadata associated with each of the features or objects of the selected drill-down group; organizing the features or objects of the selected drill-down group into one or more drill-down histograms based on the determined drill-down metadata categories and the accessed metadata associated with each of the features or objects of the selected drill-down group; and displaying on the interactive map the one or more drill-down histograms.


According to another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to enable the user to further drill down into the one or more drill-down histograms.


According to yet another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object hover over input from the user; and in response to receiving the hover over input, highlight, on the electronic display, metadata associated with the particular hovered over feature or object to the user.


According to another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object selection input from the user; and in response to receiving the selection input, display, on the electronic display, metadata associated with the particular selected feature or object to the user.


In another embodiment, a computer system is disclosed comprising: an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map tiles accessed from the electronic data structure, the map tiles each comprising an image composed of one or more vector layers; include on the interactive map a plurality of features or objects accessed from the electronic data structure, the features or objects being selectable by a user, each of the features or objects including associated metadata; receive an input from a user including at least one of a zoom action, a pan action, a feature or object selection, a layer selection, a geosearch, a heatmap, and a keyword search; and in response to the input from the user: request, from a server, updated map tiles, the updated map tiles being updated according to the input from the user; receive the updated map tiles from the server; and update the interactive map with the updated map tiles.


According to an aspect, the one or more vector layers may comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.


According to an aspect, each of the one or more vector layers may be comprised of one or more sub-vector layers.


In yet another embodiment, a computer system is disclosed comprising: a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map layers; determine a list of available map layers; organize the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; and display on the interactive map the hierarchical layer ontology, wherein the user may select one or more of the displayed layers, and wherein each of the available map layers is associated with one or more feature or object types.


According to an aspect, the map layers may comprise at least one of vector layers and base layers.





BRIEF DESCRIPTION OF THE DRAWINGS

The following aspects of the disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings.



FIG. 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 2A illustrates a sample user interface of the interactive data object map system in which map layers are displayed to a user, according to an embodiment of the present disclosure.



FIG. 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure.



FIG. 2C illustrates a sample user interface of the interactive data object map system in which various objects are displayed, according to an embodiment of the present disclosure.



FIG. 3A illustrates a sample user interface of the interactive data object map system in which objects are selected, according to an embodiment of the present disclosure.



FIGS. 3B-3G illustrate sample user interfaces of the interactive data object map system in which objects are selected and a histogram is displayed, according to embodiments of the present disclosure.



FIGS. 3H-3I illustrate sample user interfaces of the interactive data object map system in which objects are selected and a list of objects is displayed, according to embodiments of the present disclosure.



FIGS. 3J-3K illustrate sample user interfaces of the interactive data object map system in which objects are outlined when hovered over, according to embodiments of the present disclosure.



FIGS. 4A-4D illustrate sample user interfaces of the interactive data object map system in which a radius geosearch is displayed, according to embodiments of the present disclosure.



FIGS. 5A-5D illustrate sample user interfaces of the interactive data object map system in which a heatmap is displayed, according to embodiments of the present disclosure.



FIGS. 5E-5F illustrate sample user interfaces of the interactive data object map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure.



FIG. 5G illustrates a sample user interface of the interactive data object map system in which a keyword object search is displayed, according to an embodiment of the present disclosure.



FIG. 5H illustrates an example of a UTF grid of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 6A shows a flow diagram depicting illustrative client-side operations of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 7A shows a flow diagram depicting illustrative server-side operations of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 7B shows a flow diagram depicting illustrative server-side layer composition of the interactive data object map system, according to an embodiment of the present disclosure.



FIG. 8A illustrates one embodiment of a database system using an ontology.



FIG. 8B illustrates one embodiment of a system for creating data in a data store using a dynamic ontology.



FIG. 8C illustrates a sample user interface using relationships described in a data store using a dynamic ontology.



FIG. 8D illustrates a computer system with which certain methods discussed herein may be implemented.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
Overview

In general, a high-performance, interactive data object map system (or “map system”) is disclosed in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. The interactive data object map system allows for rapid and deep analysis of various objects, features, and/or metadata by the user. For example, millions of data objects and/or features may be simultaneously viewed and selected by the user on the map interface. A layer ontology may be displayed to the user that allows the user to select and view particular layers. In various embodiments, when the user rolls a selection cursor over an object/feature (and/or otherwise selects the object/feature), an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature.


In an embodiment, the user may rapidly zoom in and out and/or move and pan around the map interface to variously see more or less detail, and more or fewer objects. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. In various embodiments, the user may perform geosearches (based on any selections and/or drawn shapes), generate heatmaps, and/or perform keyword searches, among other actions as described below.


In an embodiment, the interactive data object map system includes server-side computer components and/or client-side computer components. The client-side components may implement, for example, displaying map tiles, showing object outlines, allowing the user to draw shapes, and/or allowing the user to select objects/features, among other actions. The server-side components may implement, for example, composition of layers into map tiles, caching of composed map tiles and/or layers, and/or providing object/feature metadata, among other actions. Such functions may be distributed in any other manner. In an embodiment, object/feature outlines and/or highlighting are accomplished on the client-side through the use of a UTF grid.


Definitions

In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.


Ontology: A hierarchical arrangement and/or grouping of data according to similarities and differences. The present disclosure describes two ontologies. The first relates to the arrangement of vector layers consisting of map and object data as used by the interactive data object map system (as described below with reference to FIGS. 2A-2B). The second relates to the storage and arrangement of data objects in one or more databases (as described below with reference to FIGS. 8A-8C). For example, the stored data may comprise definitions for object types and property types for data in a database, and how objects and properties may be related.


Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (Oracle database, mySQL database, etc.), spreadsheets, XML files, and text files, among others.


Data Object, Object, or Feature: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g. metadata about the object) may be represented in one or more properties. For the purposes of the present disclosure, the terms “feature,” “data object,” and “object” may be used interchangeably to refer to items displayed on the map interface of the interactive data object map system, and/or otherwise accessible to the user through the interactive data object map system. Features/objects may generally include, but are not limited to, roads, terrain (such as hills, mountains, rivers, and vegetation, among others), street lights (which may be represented by a streetlight icon), railroads, hotels/motels (which may be represented by a bed icon), schools (which may be represented by a parent-child icon), hospitals, other types of buildings or structures, regions, transportation objects, and other types of entities, events, and documents, among others. Objects displayed on the map interface generally comprise vector data, although other types of data may also be displayed. Objects generally have associated metadata and/or properties.


Object Type: Type of a data object (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g. an agent may be a sub-object type of a person object type), and the properties the object type may have.


Properties: Also referred to as “metadata,” includes attributes of a data object/feature. At a minimum, each property/metadata of a data object has a type (such as a property type) and a value or values. Properties/metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school object may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In another example, metadata associated with a road object may include a speed (for example, 25 mph), a width (for example, 2 lanes), and/or a county (for example, Arlington), among other items of metadata.


Property Type: The data type of a property, such as a string, an integer, or a double. Property types may include complex property types, such as a series of data values associated with timed ticks (e.g. a time series), etc.


Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.


Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.


Link Set: Set of multiple links that are shared between two or more data objects.
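

Taken together, the definitions above suggest a simple data model. The following TypeScript sketch is purely illustrative (the type and field names are assumptions, not taken from the disclosure), but it shows how objects, their properties/metadata, and links might be represented:

```typescript
// Hypothetical types modeling the definitions above; names and fields
// are illustrative, not taken from the disclosure.
type PropertyValue = string | number | boolean;

interface Property {
  type: string;            // property type, e.g. "address" or "speed"
  values: PropertyValue[]; // a property may have multiple values
}

interface DataObject {
  id: string;              // unique identifier for the object/feature
  objectType: string;      // e.g. "Person", "Event", "Document", "Road"
  properties: Property[];  // the object's attributes/metadata
}

interface Link {
  from: string;            // id of one data object
  to: string;              // id of the other data object
  directional: boolean;    // e.g. a payment from person A to B
  basis: string;           // relationship, event, or matching property
}

// Example: a school feature with the metadata described above.
const school: DataObject = {
  id: "school-0001",
  objectType: "School",
  properties: [
    { type: "address", values: ["123 S. Orange Street"] },
    { type: "district", values: ["509c"] },
    { type: "gradeLevel", values: ["K-6"] },
    { type: "phoneNumber", values: ["800-0000"] },
  ],
};
```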


DESCRIPTION OF THE FIGURES

Embodiments of the disclosure will now be described with reference to the accompanying Figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.



FIG. 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure. The user interface includes a map interface 100, a selection button/icon 102, a shape button/icon 104, a layers button/icon 106, a geosearch button/icon 108, a heatmap button/icon 110, a search box 112, a feature information box 114, a coordinates information box 116, map scale information 118, zoom selectors 120, and highlighted features 122. The functionality of the interactive data object map system may be implemented in one or more computer modules and/or processors, as is described below with reference to FIG. 8D.


The map interface 100 of FIG. 1 is composed of multiple map tiles. The map tiles are generally composed of multiple layers of geographical, vector, and/or other types of data. Vector data layers (also referred to as vector layers) may include associated and/or linked data objects/features. In an embodiment, vector layers are composed of data objects/features. The various data objects and/or features associated with a particular vector layer may be displayed to the user when that particular vector layer is activated. For example, a transportation vector layer may include road, railroad, and bike path objects and/or features that may be displayed to the user when the transportation layer is selected. The layers used to compose the map tiles and the map interface 100 may vary based on, for example, whether a user has selected features displayed in the map interface 100, and/or the particular layers a user has selected for display. In an embodiment, composition of map tiles is accomplished by server-side components of the interactive data object map system. In an embodiment, composed map tiles may be cached by the server-side components to speed up map tile delivery to client-side components. The map tiles may then be transmitted to the client-side components of the interactive data object map system where they are composed into the map interface 100.
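

The disclosure does not mandate a particular tiling scheme, but the common Web Mercator ("slippy map") convention gives a concrete sense of how a client might determine which map tiles to request for a given view. A minimal, hypothetical sketch (the request path in the final comment is likewise an assumption):

```typescript
// Hypothetical helper: which tile contains a WGS84 coordinate at a given
// zoom level, assuming the common Web Mercator ("slippy map") tiling
// scheme; the disclosure itself does not mandate any particular scheme.
function tileForCoordinate(
  lat: number,
  lon: number,
  zoom: number
): { x: number; y: number; z: number } {
  const n = 2 ** zoom; // tiles per axis at this zoom level
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y, z: zoom };
}

// The client could then request each composed tile (and its UTF grid)
// from the server-side components, e.g.:
//   GET /tiles/{z}/{x}/{y}?layers=transportation,terrain   (illustrative)
console.log(tileForCoordinate(40.7128, -74.006, 12)); // tile containing lower Manhattan
```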


In general, the user interface of FIG. 1 is displayed on an electronic display viewable by a user of the interactive data object map system. The user of the interactive data object map system may interact with the user interface of FIG. 1 by, for example, touching the display when the display is touch-enabled and/or using a mouse pointer to click on the various elements of the user interface.


The map interface 100 includes various highlighted features 122 and feature icons. For example, the map interface 100 includes roads, buildings and structures, utilities, lakes, rivers, vegetation, and railroads, among other features. The user may interact with the map interface 100 by, for example, rolling over and/or clicking on various features. In one embodiment, rolling over and/or placing the mouse pointer over a feature causes the feature to be outlined and/or otherwise highlighted. Additionally, the name of the feature and/or other information about the feature may be shown in the feature information box 114.


The user of the map system may interact with the user interface of FIG. 1 by scrolling or panning up, down, and/or side to side; zooming in or out; selecting features; drawing shapes; selecting layers; performing a geosearch; generating a heat map; and/or performing a keyword search; among other actions as are described below. Various user actions may reveal more or less map detail, and/or more or fewer features/objects.



FIG. 2A illustrates a sample user interface of the map system in which map layers are displayed to a user, according to an embodiment of the present disclosure. In the user interface of FIG. 2A, the user has selected the layers button 106, revealing the layers window 202. The layers window 202 includes a list of base layers, vector layers, and user layers. The base layers include, for example, overhead imagery, topographic, blank (Mercator), base map, aviation, and blank (unprojected). The vector layers include general categories such as, for example, regions, buildings/structures, terrain, transportation, and utilities/infrastructure. While no user layers are included in the user interface of FIG. 2A, user layers may be added by the user of the map system, as is described below.


In an embodiment, the user may select one or more of the base layers which may be used during composition of the map tiles. For example, selection of the overhead imagery base layer will produce map tiles in which the underlying map tile imagery is made up of recent aerial imagery. Similarly, selection of the topographic base layer will produce map tiles in which the underlying map tile imagery includes topographic map imagery.


Further, in an embodiment, the user may select one or more of the vector layers which may be used during composition of the map tiles. For example, selecting the transportation layer results in transportation-related objects and/or features being displayed on the map tiles. Transportation-related features may include, for example, roads, railroads, street signs, and/or street lights, among others. Examples of transportation-related features may be seen in the user interface of FIG. 2A where various roads, railroads, and street light icons are displayed.


In an embodiment, the user of the map system may create and save map layers. These saved map layers may be listed as user layers in the layers window 202.



FIG. 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure. As mentioned above with reference to FIG. 2A, the list of vector layers in the layers window 202 may include general categories/layers such as regions, buildings/structures, terrain, transportation, and utilities/infrastructure. The vector layers available in the map system may be further organized into an ontology, or hierarchical arrangement. For example, as shown in the vector layers window 206, the buildings/structures category 208 may be further subdivided into layers including structures, government, medical, education, and commercial. The terrain category 210 may include vegetation and/or water/hydrography layers. The utilities/infrastructure category may include fire and/or storage/draining.


In an embodiment, the user of the map system may select one or more of the layers and/or sub-layers of the layer ontology. As shown in FIG. 2B, the user has deselected the vegetation sub-layer, and all of the utilities/infrastructure layers. Selecting and deselecting vector layers, or toggling vector layers on and off, may cause the vector objects and/or features associated with those layers to be displayed or not displayed in the map interface. For example, when the user selects the transportation category/layer, road objects associated with the transportation layer may be displayed on the map interface. Likewise, when a user deselects the transportation category/layer, road objects associated with the transportation layer may be removed from the map interface.


In an embodiment, additional hierarchical levels of layers may be displayed to the user. For example, the vector layers window 206 may include sub-sub-layers (for example, the education sub-layer may be divided into elementary schools, secondary schools, and post-secondary schools). Alternatively, fewer hierarchical levels may be displayed to the user.


In an embodiment, each of the vector layers shown in the vector layers window 206 may be made up of many layers of map vector data. In this embodiment, the map system may advantageously generate a simplified layer ontology, such as the one shown in 206. The simplified layer ontology allows the user to easily select layers of interest from a reduced number of layers, rather than a large number of discrete layers. As described above, vector layers may contain data regarding associated features and/or objects. Thus, features visible in the map interface correspond to the currently active/selected layers. In an embodiment, the layer ontology may have an arbitrary depth.
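

As a rough illustration, such a hierarchical layer ontology can be modeled as a tree whose selected leaf layers determine what is composed into the map tiles. The sketch below is hypothetical; the node names simply mirror the categories of FIG. 2B:

```typescript
// Hypothetical model of the layer ontology: a tree of layers that the
// user toggles on and off; only selected leaves contribute to the map.
interface LayerNode {
  name: string;          // e.g. "Terrain" or "Water/Hydrography"
  selected: boolean;
  children: LayerNode[]; // empty for leaf layers; depth is arbitrary
}

// Collect all selected leaf layers. A deselected parent hides its whole
// subtree, matching the toggle behavior described for FIG. 2B.
function activeLayers(node: LayerNode): string[] {
  if (!node.selected) return [];
  if (node.children.length === 0) return [node.name];
  return node.children.flatMap(activeLayers);
}

const ontology: LayerNode = {
  name: "Vector Layers",
  selected: true,
  children: [
    {
      name: "Terrain",
      selected: true,
      children: [
        { name: "Vegetation", selected: false, children: [] }, // deselected
        { name: "Water/Hydrography", selected: true, children: [] },
      ],
    },
    { name: "Utilities/Infrastructure", selected: false, children: [] },
  ],
};

console.log(activeLayers(ontology)); // ["Water/Hydrography"]
```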



FIG. 2C illustrates a sample user interface of the map system in which various objects are displayed, according to an embodiment of the present disclosure. The user interface of FIG. 2C includes a map interface 214, an outlined feature 216, and feature information box 114 indicating that the outlined feature 216 is called “Union Park.” Various features/objects may be seen in the map interface 214 including, for example, roads, buildings, terrain, street lights (represented by a streetlight icon), railroads, hotels/motels (represented by a bed icon), and schools (represented by a parent-child icon), among other features.



FIG. 3A illustrates a sample user interface of the map system in which objects are selected, according to an embodiment of the present disclosure. The user interface of FIG. 3A includes a highlighted user selection rectangle 302. The highlighted user selection rectangle 302 illustrates the user actively selecting a particular region of the map interface so as to select the features/objects that fall within the bounds of that rectangle. In an embodiment, visible features may be selected by the user, while features that are not currently visible are not selectable. For example, features related to layers that are not currently active are not selected when the user performs a selection. In another embodiment, even features that are not visible in a selected area may be selected.



FIGS. 3B-3C illustrate sample user interfaces of the map system in which objects are selected and a feature histogram 304 is displayed in a selection window, according to embodiments of the present disclosure. The selected objects/features of FIG. 3B (including roads 310 and other features 312) may have been selected via the highlighted user selection rectangle 302 of FIG. 3A. Selected features are indicated by highlighting and/or altered colors on the map tiles making up the map interface.


Feature histogram 304 is shown in a selection window included in the user interface of FIG. 3B. The histogram 304 shows a categorized histogram of all objects/features selected by the user in the map interface. The histogram divides the features into common buckets and/or categories based on related metadata (also referred to as metadata categories). For example, at 306, “Belongs to Layer” indicates that the following histogram includes all selected features organized by layer category. In this example there are over 70,000 selected buildings/structures features, over 40,000 selected facility features, and over 6,000 selected road features, among others. Further, the feature histogram 304 includes histograms of the selected objects organized by account and acreage. In various embodiments, the map system may select histogram categories and/or metadata categories based on, for example, the features selected and/or types of features selected, among others. Any other categorization of selected features may be displayed in the histograms of the feature histogram 304.


In an embodiment, the user of the map system may select a subset of the selected features for further analysis and/or histogram generation. For example, the user may select a subset comprising selected objects belonging to the road category by, for example, clicking on the roads item 308. This selection may result in “drilling down” to histograms of that subset of features, as shown in FIG. 3C. Thus, a drill-down group of features/objects (for example, the subset of features/objects) may be used by the map system to determine new drill-down metadata categories, or buckets of related metadata. At 314 in FIG. 3C, the arrow icon indicates that of the originally selected 124,172 features, the feature histogram now shows an analysis of the 6,724 features belonging to the road category (see item 316). The feature histogram window of FIG. 3C thus shows a new set of histograms organized by layer, address, addressed, and agency, among others. The user may thus “drill down” and “drill up” through the selected features via the displayed histograms.
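

One plausible implementation of this bucketing, sketched below with hypothetical field names: selected features are grouped by the value they carry for a metadata category, buckets are sorted in descending order by count (as in FIG. 3B), and drilling down is simply filtering to one bucket and re-bucketing the subset by a new category:

```typescript
// Hypothetical sketch of histogram generation: bucket selected features
// by the value of a metadata category and sort buckets by count.
interface Feature {
  id: string;
  metadata: Record<string, string>; // category -> value, e.g. layer: "Road"
}

function histogram(
  features: Feature[],
  category: string
): Array<{ value: string; count: number }> {
  const counts = new Map<string, number>();
  for (const f of features) {
    const value = f.metadata[category];
    if (value !== undefined) {
      counts.set(value, (counts.get(value) ?? 0) + 1);
    }
  }
  // Descending order, from the bucket with the most related features
  // to the bucket with the fewest (as described in the Summary).
  return [...counts.entries()]
    .map(([value, count]) => ({ value, count }))
    .sort((a, b) => b.count - a.count);
}

// "Drilling down" filters to one bucket, after which new histograms can
// be generated for the subset, e.g. all roads re-bucketed by speed limit.
function drillDown(features: Feature[], category: string, value: string): Feature[] {
  return features.filter((f) => f.metadata[category] === value);
}
```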


In an embodiment, items selected in the feature histogram are correspondingly highlighted in the map interface of the map system. For example, in the map interface of FIG. 3B, the user has selected the roads in the histogram at 308. Corresponding features (in this example, roads) are thus highlighted in the map interface (as shown at 310).



FIGS. 3D-3G illustrate additional example user interfaces of the map system in which objects are selected from a histogram and correspondingly highlighted in the map interface, according to embodiments of the present disclosure. In FIGS. 3D-3F, in the selection window, the user is viewing a histogram of all selected roads organized according to road speed limit. In FIG. 3D, the user has selected (at 318) roads with speed limits of 55 and 65. The corresponding road features are highlighted in the map interface at, for example, 320. In FIG. 3E, the user has selected (at 322) roads with speed limits of 35, 45, 40, 55, and 65. The corresponding road features are highlighted in the map interface at, for example, 324. In FIG. 3F, the user has selected (at 326) roads with speed limits of 25. The corresponding road features are highlighted in the map interface at, for example, 328. In FIG. 3G, the user may “drill down” into the histogram by, for example, right clicking on an item and selecting “Remove other objects in histogram” (330).



FIGS. 3H and 3I illustrate sample user interfaces of the map system in which objects are selected and a list of selected objects 332 is displayed in the selection window, according to embodiments of the present disclosure. With reference to FIG. 3H, the list of features 332 indicates that the user has drilled down further into the selected features of FIG. 3G by selecting a subset of selected features consisting of only roads with speed limits of 20. Thus, the subset of the example of FIG. 3H includes the 163 features that are roads with speed limits of 20. The user has additionally selected to view the list of features 332 in the selection window (rather than the feature histogram). The list of features 332 lists each individual feature that is included in the currently selected subset. For example, the list includes S Central Av 334, among others.


In FIG. 3I, the user has selected feature Hamilton St at 336. In an embodiment, when a feature is selected from the list of features, the map interface automatically zooms to the location of that feature. The user may select the feature from the list of features by clicking on the name of the feature and/or the displayed thumbnail. In an embodiment, the map interface only zooms to the feature when the user clicks on, and/or selects, the thumbnail associated with the feature. In the example of FIG. 3I, the map interface is automatically zoomed to the location of the selected Hamilton St, and the selected feature is highlighted (338). Additionally, the name of the selected feature is shown in the feature information box 114. In an embodiment, the name of the selected feature is shown in the feature information box 114 when the user hovers the cursor over the thumbnail associated with the feature in the list of features. In an embodiment, the selected feature may be any other type of object, and may be outlined or otherwise highlighted when selected.


In various embodiments, the user of the map system may select either the list of features, or the feature histogram, of the selection window to view information about the selected features.



FIGS. 3J-3K illustrate sample user interfaces of the map system in which objects are outlined when hovered over, according to embodiments of the present disclosure. In FIG. 3J, the user is hovering over a building feature with the mouse cursor. The feature being hovered over is automatically outlined (340). Additionally, the name of the feature is displayed in the feature information box 114. In FIG. 3K, the user is hovering over a shelter feature with the mouse cursor. The feature being hovered over is automatically outlined (342), and the name of the feature is displayed in the feature information box 114. The user of the map system may, at any time, highlight and/or outline any feature/object by rolling over, hovering over, selecting, and/or touching that feature/object in the map interface.


In various embodiments, the user may select a feature in order to view a feature information window. The feature information window may include, for example, metadata associated with the selected feature. For example, the user may select a building feature, resulting in a display of information associated with that building feature such as the building size, the building name, and/or the building address or location, among others. Metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In an embodiment, a history of the object, changes made to the object, and/or user notes related to the object, among other items, may be displayed. In an embodiment, a user may edit metadata associated with a selected feature.



FIGS. 4A-4D illustrate sample user interfaces of the map system in which a radius geosearch is displayed, according to embodiments of the present disclosure. In FIG. 4A, the user has selected the shape button 104 and is drawing a circle selection 404 on the map interface by first selecting a center and then a radius. Shape window 402 indicates the coordinates of the center of the circle selection, as well as the radius of the circle selection. In various embodiments, any type of polygon or other shape may be drawn on the map interface to select features.


In FIG. 4B, the user has selected the geosearch button 108 so as to perform a geosearch within the selection circle 408. In an embodiment, a geosearch comprises a search through one or more databases of data objects, and metadata associated with those data objects, for any objects that meet the criteria of the geosearch. For example, a geosearch may search for any objects with geographic metadata and/or properties that indicate the object may be geographically within, for example, selection circle 408. A geosearch within a selected circle may be referred to as a radius search. Geosearch window 406 indicates various items of information related to the radius search, and includes various parameters that may be adjusted by the user. For example, the geosearch window 406 includes a search area slider that the user may slide to increase or decrease the radius of the selection circle 408. The user may also indicate a time range for the geosearch. In an embodiment, objects/features shown and/or searchable in the map system may include a time component and/or time metadata. Thus, for example, the user of the map system may specify a date or time period, resulting in the display of any objects/features with associated time metadata, for example, falling within the specified time period. In various embodiments, associated time metadata may indicate, for example, a time the feature was created, a time the feature was added to a database of features, a time the feature was previously added to a vector layer, a time the feature was last accessed by the map system and/or a user, a time the feature was built, and/or any combination of the foregoing. Alternatively, the user may select and/or search for objects/features within particular time periods, as shown in FIG. 4B. The geosearch window 406 also allows the user to specify the types of objects to be searched, for example, entities, events, and/or documents, among others.


In an embodiment, the user of the map system may perform a search by clicking and/or touching a search button. The map system may then perform a search of an object database for any objects matching the criteria specified in the geosearch. For example, in the example of FIG. 4B the map system will search for any objects with associated location information that falls within the selection circle 408. Objects searched by the map system may include objects other than those shown on the map interface. For example, in an embodiment the map system may access one or more databases of objects (and object metadata) that may be unrelated to the features currently shown in the map interface, or features related to the currently selected vector layers. The databases accessed may include databases external to any database storing data associated with the map system. Any objects found in the geosearch may then be made available to the user (as shown in FIG. 4B), and the user may be given the option of adding the objects to a new layer in the map interface (as shown in the geosearch information window 406).
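

A radius search of this kind might be implemented as a simple filter over object coordinates and time metadata. The sketch below uses the standard haversine formula for the distance test; the disclosure does not prescribe a particular distance computation, and the object shape is an assumption:

```typescript
// Hypothetical radius geosearch: keep objects inside the selection
// circle, optionally restricted to a time range. Field names assumed.
interface GeoObject {
  id: string;
  lat: number;
  lon: number;
  timestamp?: number; // epoch millis, e.g. when the object was created
}

const EARTH_RADIUS_M = 6_371_000;

// Standard haversine great-circle distance between two WGS84 points.
function haversineMeters(aLat: number, aLon: number, bLat: number, bLon: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

function radiusSearch(
  objects: GeoObject[],
  center: { lat: number; lon: number },
  radiusMeters: number,
  timeRange?: { from: number; to: number }
): GeoObject[] {
  return objects.filter((o) => {
    if (haversineMeters(center.lat, center.lon, o.lat, o.lon) > radiusMeters) {
      return false;
    }
    if (timeRange && o.timestamp !== undefined) {
      return o.timestamp >= timeRange.from && o.timestamp <= timeRange.to;
    }
    return true;
  });
}
```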



FIG. 4C shows objects added to the map interface following the geosearch in FIG. 4B. The search results are also shown in the feature histogram 410. In this example the returned objects include various entities and events. FIG. 4D shows the user has selected, in the feature histogram, all search result objects with related metadata indicating a drug law violation. Those selected objects are additionally highlighted in the map interface of FIG. 4D. In another example, geosearch may be used to determine, for example, that many crimes are concentrated in a downtown area of a city, while DUIs are more common in areas with slow roads.



FIGS. 5A-5D illustrate sample user interfaces of the map system in which a heatmap is displayed, according to embodiments of the present disclosure. In FIG. 5A, the user has selected the heatmap button 110 so as to create a heatmap 504 based on the objects selected in FIG. 4D. A heatmap information window 502 is displayed in which the user may specify various parameters related to the generation of the heatmap. For example, referring now to FIG. 5B, the user may adjust a radius (506) of the circular heatmap related to each selected object, an opacity (508) of the heatmap, a scale of the heatmap, and an auto scale setting. In FIG. 5B, the user has decreased the opacity of the generated heatmap and zoomed in on the map interface so as to more clearly view various objects and the underlying map tiles.



FIG. 5C shows the user selecting various objects and/or features with the rectangle selection tool while the heatmap is displayed, for example to view information regarding those features in a histogram. FIG. 5D shows the objects selected in FIG. 5C now highlighted (512).


In the map system a heatmap may be generated on any object type, and/or on multiple object types. In an embodiment, different heatmap radii may be set for different object types. For example, the user may generate a heatmap in which streetlights have a 20 m radius, while hospitals have a 500 m radius. In an embodiment, the heatmap may be generated based on arbitrary shapes. For example, rather than a circular-based heatmap, the heatmap may be rectangular-based or ellipse-based. In an embodiment, the heatmap may be generated based on error ellipses and/or tolerance ellipses. A heatmap based on error ellipses may be advantageous when the relevant objects have associated error regions. For example, when a location of an object is uncertain, or multiple datapoints associated with an object are available, an error ellipse may help the user determine the actual location of the object.
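

As an illustration of per-type radii, such a heatmap might be accumulated onto a grid as sketched below. The linear-falloff kernel and the field names are assumptions for illustration, not details from the disclosure:

```typescript
// Hypothetical heatmap accumulation with per-object-type radii (e.g.
// 20 m for streetlights, 500 m for hospitals). The linear-falloff
// kernel is an assumption; the disclosure does not specify one.
interface HeatPoint {
  x: number; // grid coordinates, e.g. projected map pixels
  y: number;
  objectType: string;
}

function buildHeatmap(
  points: HeatPoint[],
  radiusByType: Record<string, number>,
  width: number,
  height: number
): Float32Array {
  const grid = new Float32Array(width * height);
  for (const p of points) {
    const r = radiusByType[p.objectType] ?? 10; // default radius
    const x0 = Math.max(0, Math.floor(p.x - r));
    const x1 = Math.min(width - 1, Math.ceil(p.x + r));
    const y0 = Math.max(0, Math.floor(p.y - r));
    const y1 = Math.min(height - 1, Math.ceil(p.y + r));
    for (let y = y0; y <= y1; y++) {
      for (let x = x0; x <= x1; x++) {
        const d = Math.hypot(x - p.x, y - p.y);
        if (d <= r) grid[y * width + x] += 1 - d / r; // linear falloff
      }
    }
  }
  return grid; // scaled and colored per the opacity/scale settings
}
```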



FIGS. 5E-5F illustrate sample user interfaces of the map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure. In FIG. 5E, the user has selected the shape button 104, and a shape information window 514 is shown. In the user interface of FIG. 5E the user has drawn lines 518; however, any shape may be drawn on the map interface. Information related to the drawn lines 518 is displayed in the shape information window 514. For example, at 516 the starting points, distance, and azimuth related to each line are displayed. Further, a total distance from the start to the end of the line is shown.
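

The azimuth readout for each drawn segment can be computed with the standard forward-azimuth (initial bearing) formula on a sphere, while distance can reuse a haversine computation like the one sketched above. A minimal example:

```typescript
// Hypothetical azimuth computation for a drawn line segment, using the
// standard forward-azimuth (initial bearing) formula on a sphere.
function initialBearingDegrees(
  aLat: number, aLon: number,
  bLat: number, bLon: number
): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const phi1 = toRad(aLat);
  const phi2 = toRad(bLat);
  const dLon = toRad(bLon - aLon);
  const y = Math.sin(dLon) * Math.cos(phi2);
  const x =
    Math.cos(phi1) * Math.sin(phi2) -
    Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  const deg = (Math.atan2(y, x) * 180) / Math.PI;
  return (deg + 360) % 360; // normalize to [0, 360)
}

// Due east along the equator is an azimuth of 90 degrees:
console.log(initialBearingDegrees(0, 0, 0, 1)); // 90
```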



FIG. 5F shows a geosearch performed on the line shape drawn in FIG. 5E. Geosearch information window 520 indicates a search area 522, a time range 524, and an object type 526 as described above with reference to FIG. 4B. The search area is indicated on the map interface by the highlighted area 528 along the drawn line. The geosearch may be performed, and results may be shown, in a manner similar to that described above with reference to FIGS. 4B-4D. For example, geosearch along a path may be used to determine points of interest along that path.



FIG. 5G illustrates a sample user interface of the map system in which a keyword object search is displayed, according to an embodiment of the present disclosure. The user may type words, keywords, numbers, and/or geographic coordinates, among others, into the search box 112. In FIG. 5G, the user has typed Bank (530). As the user types, the map system automatically searches for objects and/or features that match the information typed. Matching may be performed based on object data and/or metadata. Search results are displayed as shown at 532 in FIG. 5G. In the example, a list of banks (bank features) is shown. The user may then select from the list shown, at which point the map system automatically zooms to the selected feature and indicates the selected feature with an arrow 534. In various embodiments, the selected feature may be indicated by highlighting, outlining, and/or any other type of indicator. In an embodiment, the search box 112 may be linked to a gazetteer so as to enable simple word searches for particular geographic locations. For example, a search for a city name, New York, may be linked with the geographic coordinates of the city, taking the user directly to that location on the map interface.
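

An as-you-type search of this kind might simply match the typed text against object names and metadata values, with the gazetteer as a separate name-to-coordinates lookup consulted first. The sketch below is hypothetical:

```typescript
// Hypothetical as-you-type keyword search over object names and
// metadata values; the matching strategy is an assumption.
interface SearchableObject {
  id: string;
  name: string;                     // e.g. "First National Bank"
  metadata: Record<string, string>;
}

function keywordSearch(objects: SearchableObject[], query: string): SearchableObject[] {
  const q = query.trim().toLowerCase();
  if (q.length === 0) return [];
  return objects.filter(
    (o) =>
      o.name.toLowerCase().includes(q) ||
      Object.values(o.metadata).some((v) => v.toLowerCase().includes(q))
  );
}

// A gazetteer, as described above, can be a simple name-to-coordinates
// table consulted before the object search, so that typing "New York"
// pans the map directly to the city.
const gazetteer: Record<string, { lat: number; lon: number }> = {
  "new york": { lat: 40.7128, lon: -74.006 },
};
```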



FIG. 5H illustrates an example of a UTF grid of the map system, according to an embodiment of the present disclosure. In an embodiment, the UTF grid enables feature outlining and/or highlighting of many objects with client-side components. In one embodiment, each map tile (or image) of the map interface includes an associated textual UTF (UCS Transformation Format) grid. In FIG. 5H, an example map tile 536 is shown next to an associated example UTF grid 538. In this example, the map tile and associated UTF grid are generated by the server-side components and sent to the client-side components. In the UTF grid, each character represents a pixel in the map tile image, and each character indicates what feature is associated with the pixel. Each character in the UTF grid may additionally be associated with a feature identifier which may be used to request metadata associated with that feature.


Contiguous regions of characters in the UTF grid indicate the bounds of a particular feature, and may be used by the client-side components to provide the feature highlighting and/or outlining. For example, when a user hovers a mouse pointer over a feature on a map tile, the map system determines the character and portion of the UTF grid associated with the pixel hovered over, draws a feature outline based on the UTF grid, and may additionally access metadata associated with the feature based on the feature identifier associated with the feature. In an embodiment, the UTF grid is sent to the client-side components in a JSON (JavaScript Object Notation) format.
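

The disclosure does not name a specific grid encoding, but the widely used UTFGrid 1.x convention illustrates the pixel-to-feature lookup: each character encodes an index into a key list after reversing two reserved-codepoint shifts. A hypothetical client-side decoder, assuming that convention:

```typescript
// Hypothetical UTF grid decoder, assuming the widely used UTFGrid 1.x
// encoding (the disclosure does not name a specific encoding).
interface UtfGrid {
  grid: string[]; // one string per pixel row of the map tile
  keys: string[]; // keys[0] is "" (no feature); others are feature ids
}

// Reverse the UTFGrid codepoint shifts: the encoder skips 34 (") and
// 92 (\), and offsets indices by 32 so every cell is a printable char.
function featureKeyAt(g: UtfGrid, px: number, py: number): string | undefined {
  let code = g.grid[py].charCodeAt(px);
  if (code >= 93) code -= 1;
  if (code >= 35) code -= 1;
  return g.keys[code - 32];
}

// On mouse move, the client maps the cursor pixel to a feature key; a
// non-empty key identifies which feature to outline and whose metadata
// to request (see FIG. 6B).
```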



FIG. 6A shows a flow diagram depicting illustrative client-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in FIG. 6A. In an embodiment, one or more blocks in FIG. 6A may be performed by client-side components of the map system, for example, computer system 800 (described below in reference to FIG. 8D).


At block 602, the map system provides a user interface (for example, the user interface of FIG. 1) to the user. As described above and below, the user interface may be provided to the user through any electronic device, such as a desktop computer, a laptop computer, a mobile smartphone, and/or a tablet, among others. At block 604, an input is received from the user of the map system. For example, the user may use a mouse to roll over and/or click on an item of the user interface, or the user may touch the display of the interface (in the example of a touch screen device).


Inputs received from the user may include, for example, hovering over, rolling over, and/or touching an object in the user interface (606); filling out a text field (614); drawing a shape in the user interface (608); and/or drawing a selection box and/or shape in the user interface (610); among other actions or inputs as described above.


At block 612, any of inputs 606, 614, 608, and 610 may cause the map system to perform client-side actions to update the user interface. For example, hovering over an object (606) may cause the client-side components of the map system to access the UTF grid, determine the boundaries of the object, and draw an outline around the hovered-over object. In another example, filling out a text field (614) may include the user inputting data into the map system. In this example, the user may input geographic coordinates, metadata, and/or other types of data to the map system. These actions may result in, for example, the client-side components of the map system storing the inputted data and/or taking an action based on the inputted data. For example, the user inputting coordinates may result in the map interface being updated to display the inputted information, such as an inputted name overlaying a particular object. In yet another example, the actions/inputs of drawing a shape (608) and/or drawing a selection (610) may cause the client-side components of the map system to update the user interface with colored and/or highlighted shapes (see, for example, FIG. 3A).


In an embodiment, one or more blocks in FIG. 6A may be performed by server-side components of the map system, for example, server 830 (described below in reference to FIG. 8D).



FIG. 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in FIG. 6B. In an embodiment, one or more blocks in FIG. 6B may be performed by client-side components of the map system, for example, computer system 800.


At block 620, the client-side components of the map system detect that the user is hovering over and/or touching an object in the user interface. At block 622, and as described above, the client-side components may access the UTF grid to determine the feature identifier and object boundaries associated with the hovered-over object. Then, at block 624, the client-side components may render the feature shape on the image or map interface. The feature shape may be rendered as an outline and/or other highlighting.


At block 626, the client-side components detect whether the user has selected the object. Objects may be selected, for example, if the user clicks on the object and/or touches the object. If the user has selected the object, then at block 628, the client-side components query the server-side components to retrieve metadata associated with the selected object. In an embodiment, querying of the server-side components may include transmitting the feature identifier associated with the selected object to the server, the server retrieving from a database the relevant metadata, and the server transmitting the retrieved metadata back to the client-side components.
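

On the client side, the round trip at block 628 might look like the following sketch; the endpoint path and response shape are assumptions for illustration only:

```typescript
// Hypothetical client-side metadata request (block 628): send the
// feature identifier from the UTF grid to the server and receive the
// associated metadata. The endpoint path is illustrative only.
async function fetchFeatureMetadata(
  featureId: string
): Promise<Record<string, string>> {
  const resp = await fetch(`/api/features/${encodeURIComponent(featureId)}/metadata`);
  if (!resp.ok) {
    throw new Error(`metadata request failed: ${resp.status}`);
  }
  return resp.json(); // shown to the user, e.g. in a metadata window
}
```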


At block 630, the metadata is received by the client-side components and displayed to the user. For example, the metadata associated with the selected object may be displayed to the user in the user interface in a dedicated metadata window, among other possibilities.


In an embodiment, one or more blocks in FIG. 6B may be performed by server-side components of the map system, for example, server 830.



FIG. 7A shows a flow diagram depicting illustrative server-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in FIG. 7A. In an embodiment, one or more blocks in FIG. 7A may be performed by server-side components of the map system, for example, server 830.


Server-side operations of the map system may include composing and updating the map tiles that make up the map interface. For example, when the user changes the selection of the base layer and/or one or more of the vector layers, the map tiles are re-composed and updated in the map interface to reflect the user's selection. Selection of objects resulting in highlighting of those objects may also involve re-composition of the map tiles. Further, UTF grids may be generated by the server-side components for each map tile composed.


At block 702, the user interface is provided to the user. At block 704 an input from the user is received. Inputs received from the user that may result in server-side operations may include, for example, an object selection (706), a change in layer selection (708), a geosearch (710), generating a heatmap (712), searching from the search box (714), and/or panning or zooming the map interface, among others.


At block 716, the client-side components of the map system may query the server-side components in response to any of inputs 706, 708, 710, 712, and 714 from the user. The server-side components then update and re-compose the map tiles and UTF grids of the map interface in accordance with the user input (as described below in reference to FIG. 7B), and transmit those updated map tiles and UTF grids back to the client-side components.


At block 718, the client-side components receive the updated map tile information from the server, and at block 720 the user interface is updated with the received information.
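One way to sketch this round trip (blocks 716-720) on the client is shown below; the request shape, endpoint, and `updateMapInterface` helper are illustrative assumptions:

```typescript
// Sketch of the client-side query/update loop (blocks 716-720).
type UserInput =
  | { kind: 'objectSelection'; featureIds: string[] }
  | { kind: 'layerChange'; selectedLayers: string[] }
  | { kind: 'geosearch'; bounds: [number, number, number, number] }
  | { kind: 'heatmap'; layer: string }
  | { kind: 'search'; query: string };

declare function updateMapInterface(
  tiles: ArrayBuffer[],
  utfGrids: object[],
): void;

async function requestTileUpdate(input: UserInput): Promise<void> {
  // Block 716: query the server-side components with the user's input.
  const response = await fetch('/api/tiles/update', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(input),
  });
  // Blocks 718-720: receive updated tiles/UTF grids and refresh the UI.
  const { tiles, utfGrids } = await response.json();
  updateMapInterface(tiles, utfGrids);
}
```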


In an embodiment, additional information and/or data, in addition to updated map tiles, may be transmitted to the client-side components from the server-side components. For example, object metadata may be transmitted in response to a user selecting an object.


In an embodiment, one or more blocks in FIG. 7A may be performed by client-side components of the map system, for example, computer system 800.



FIG. 7B shows a flow diagram depicting illustrative server-side layer composition of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in FIG. 7B. In an embodiment, one or more blocks in FIG. 7B may be performed by server-side components of the map system, for example, server 830.


At block 730, a query is received by the server-side components from the client-side components. Such a query may originate, for example, at block 716 of FIG. 7A. At block 732, the server-side components determine the map tile composition based on the query. For example, if the user has selected an object or group of objects, the map tiles containing those objects may be updated to include highlighted objects. In another example, if the user has changed the layer selection, the map tiles may be updated to include only those layers that are currently selected. In the example of FIG. 7B, the layers currently selected are determined, and the layers are composed and/or rendered into the map tiles. In another example, if the user has performed a geosearch and selected to add the search result objects to the map interface, the map tiles are updated to include those search result objects. In yet another example, when the user has generated a heatmap, the map tiles are updated to show the generated heatmap. In another example, if the user searches via the search box, the selected objects may be highlighted in the re-composed map tiles. In another example, when the user pans and/or zooms in the map interface, the map tiles are updated to reflect the new view selected by the user. In all cases, an updated UTF grid may also be generated for each composed map tile.
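The decision at block 732 might be sketched as a dispatch over the query type, as below; the `Query` and `TileSpec` shapes are illustrative assumptions rather than the disclosed data structures:

```typescript
// Sketch of block 732: derive the tile composition from the query type.
type Query =
  | { kind: 'objectSelection'; featureIds: string[] }
  | { kind: 'layerChange'; selectedLayers: string[] }
  | { kind: 'heatmap'; layer: string };

interface TileSpec {
  layers: string[];     // layers to compose into the tile
  highlights: string[]; // objects to draw highlighted
  heatmap?: string;     // heatmap layer to render, if any
}

function determineTileComposition(query: Query, current: TileSpec): TileSpec {
  switch (query.kind) {
    case 'objectSelection': // highlight the selected objects
      return { ...current, highlights: query.featureIds };
    case 'layerChange': // include only the currently selected layers
      return { ...current, layers: query.selectedLayers };
    case 'heatmap': // draw the generated heatmap into the tiles
      return { ...current, heatmap: query.layer };
  }
}
```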


At block 734, the map system determines whether the layers necessary to compose the requested map tiles are cached. For example, when a layer is selected by the user, that layer may be composed by the map system and placed in a memory of the server-side components for future retrieval. Caching of composed layers may obviate the need for recomposing those layers later, which advantageously may save time and/or processing power.


If the required layers are cached, then at block 740 the layers are composed into the requested map tiles and, at block 742, transmitted to the client-side components.


When the required layers are not cached, at block 736, the server-side components calculate and/or compose the requested layer and/or layers, and may then, at block 738, optionally cache the newly composed layers for future retrieval. Then, at blocks 740 and 742, the layers are composed into map tiles and provided to the client-side components.
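Blocks 734-742 amount to a cache-aside pattern over composed layers; a minimal sketch, assuming an in-memory cache keyed by layer and tile and illustrative rendering helpers:

```typescript
// Sketch of layer caching (blocks 734-742); helpers are illustrative.
type RenderedLayer = Uint8Array; // rendered layer image bytes

declare function renderLayer(layerId: string, tileKey: string): RenderedLayer;
declare function flattenLayers(layers: RenderedLayer[]): RenderedLayer;

const layerCache = new Map<string, RenderedLayer>();

function composeTile(tileKey: string, layerIds: string[]): RenderedLayer {
  const rendered: RenderedLayer[] = [];
  for (const layerId of layerIds) {
    const cacheKey = `${layerId}@${tileKey}`;
    let layer = layerCache.get(cacheKey); // block 734: is the layer cached?
    if (layer === undefined) {
      layer = renderLayer(layerId, tileKey); // block 736: compose the layer
      layerCache.set(cacheKey, layer); // block 738: cache for future retrieval
    }
    rendered.push(layer);
  }
  return flattenLayers(rendered); // block 740: compose layers into the tile
}
```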


In an embodiment, entire map tiles may be cached by the server-side components. In an embodiment, the size and/or quality of the map tiles that make up the map interface may be selected and/or dynamically selected based on at least one of: the bandwidth available for transmitting the map tiles to the client-side components, the size of the map interface, and/or the complexity of the layer composition, among other factors. In an embodiment, the map tiles comprise images, for example, in one or more of the following formats: PNG, GIF, JPEG, TIFF, BMP, and/or any other type of appropriate image format.
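As a small illustration of such dynamic sizing, a selection function might weigh bandwidth and layer complexity; the thresholds below are invented for the example, not values from the disclosure:

```typescript
// Sketch of dynamic tile sizing; thresholds are illustrative assumptions.
function selectTileSize(bandwidthKbps: number, layerCount: number): number {
  if (bandwidthKbps < 500 || layerCount > 8) {
    return 256; // constrained link or complex composition: small tiles
  }
  if (bandwidthKbps < 5000) {
    return 512;
  }
  return 1024; // fast link, simple composition: large tiles
}
```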


In an embodiment, the layer and object data composed into layers and map tiles comprises vector data. The vector data (for example, object data) may include associated metadata, as described above. In an embodiment, the vector, layer, and/or object data and associated metadata may originate from one or more databases and/or electronic data stores.


In an embodiment, one or more blocks in FIG. 7B may be performed by client-side components of the map system, for example, computer system 800.


In an embodiment, the map system may display more than 50 million selectable features to a user simultaneously. In an embodiment, the map system may support tens or hundreds of concurrent users accessing the same map and object data. In an embodiment, map and object data used by the map system may be mirrored and/or spread across multiple computers, servers, and/or server-side components.


In an embodiment, rather than updating the map tiles to reflect a selection by the user of one or more objects, the map system may show an approximation of the selection to the user based on client-side processing.


In an embodiment, a user may drag and drop files, for example, vector data and/or vector layers, onto the user interface of the map system, causing the map system to automatically render the file in the map interface.


In an embodiment, icons and/or styles associated with various objects in the map interface may be updated and/or changed by the user. For example, the styles of the various objects may be specified in or by a style data file. The style data file may be formatted according to a particular format or standard readable by the map system. In an embodiment, the style data file is formatted according to the JSON format standard. The user may thus change the look of the objects and shapes rendered in the map interface of the map system by changing the style data file. The style data file may further define the looks for objects and terrain (among other items and data) at various zoom levels.
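A hypothetical style data entry of this kind, written here as a TypeScript constant mirroring a JSON style file, might look like the following; the field names and values are assumptions, not the schema used by the system:

```typescript
// Illustrative style data, keyed by layer with per-zoom style rules.
const styleData = {
  layers: {
    roads: {
      zoom: {
        '0-10': { strokeColor: '#999999', strokeWidth: 1, label: false },
        '11-18': { strokeColor: '#ffcc00', strokeWidth: 3, label: true },
      },
    },
    buildings: {
      zoom: {
        // faux-3D rendering only at close zoom levels
        '14-18': { fillColor: '#c0c0c0', outline: '#808080', faux3d: true },
      },
    },
  },
} as const;
```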


In an embodiment, objects, notes, metadata, and/or other types of data may be added to the map system by the user through the user interface. In an embodiment, user added information may be shared between multiple users of the map system. In an embodiment, a user of the map system may add annotations and shapes to the map interface that may be saved and shared with other users. In an embodiment, a user of the map system may share a selection of objects with one or more other users.


In an embodiment, the user interface of the map system may include a timeline window. The timeline window may enable the user to view objects and layers specific to particular moments in time and/or time periods. In an embodiment, the user may view tolerance ellipses overlaid on the map interface indicating the likely position of an object across a particular time period.


In an embodiment, the map system may include elevation profiling. Elevation profiling may allow a user of the system to determine the elevation along a path on the map interface, to perform a viewshed analysis (determining objects and/or terrain viewable from a particular location), and to perform a reverse-viewshed analysis (determining, for a particular location, objects and/or terrain that may view the location), among other analyses.


In an embodiment, vector data, object data, metadata, and/or other types of data may be prepared before it is entered into or accessed by the map system. For example, the data may be converted from one format to another, may be crawled for common items of metadata, and/or may be prepared for application of a style file or style information, among other actions. In an embodiment, a layer ontology may be automatically generated based on a group of data. In an embodiment, the map system may access common data sources available on the Internet, for example, road data available from openstreetmap.org.


In an embodiment, roads shown in the map interface are labeled with their names, and buildings are rendered in faux-3D to indicate the building heights. In an embodiment, Blue Force Tracking may be integrated into the map system as a layer with the characteristics of both a static vector layer and a dynamic selection layer. A Blue Force layer may enable the use of the map system for live operational analysis. In an embodiment, the map system may quickly render detailed choropleths or heatmaps with minimal data transfer. For example, the system may render a choropleth with a property value on the individual shapes of the properties themselves, rather than aggregating this information at a county or zip code level.


Advantageously, the map system displays many items of data, objects, features, and/or layers in a single map interface. A user may easily interact with objects and features on the map and gather information by hovering over or selecting features, even though those features may not be labeled. The user may select features, may “drill down” on a particular type of feature (for example, roads), may view features through histograms, may use histograms to determine common characteristics (for example, determine the most common speed limit), and/or may determine correlations among features (for example, see that slower speed limit areas are centered around schools). Further, the map system may be useful in many different situations. For example, the system may be useful to operational planners and/or disaster relief personnel.


Additionally, the map system embodies at least three core ideas: providing a robust and fast back-end (server-side) renderer, keeping data on the back-end, and transferring only the data necessary for interactivity. In one embodiment, the primary function of the server-side components is rendering map tiles. The server is capable of drawing very detailed maps with a variety of styles that can be based on vector metadata. Rendered map tiles for a vector layer are cached, and several of these layer tiles are drawn on top of one another to produce the final tile that is sent to the client-side browser. Map tile rendering is fast enough for displaying dynamic tiles for selection and highlight to the user. Server-side operations allow for dynamic selections of very large numbers of features, calculation of the histogram, determining the number of items shown and/or selected, and drawing the selection, for example. Further, the heatmap may include large numbers of points without incurring the cost of transferring those points to the client-side browser. Additionally, transferring only as much data as necessary for interactivity enables quick server rendering of dynamic selections and vector layers. On the other hand, highlighting hovered-over features may be performed client-side nearly instantaneously, and provides useful feedback that enhances the interactivity of the map system. In an embodiment, to avoid transferring too much geometric data, the geometries of objects (in the map tiles and UTF grid) are down-sampled depending on how far the user has zoomed in on the map interface. Thus, map tiles may be rendered and presented to a user of the map system in a dynamic and useable manner.
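The zoom-dependent down-sampling mentioned above could be driven by a simple tolerance schedule, as in this sketch; the formula is an illustrative assumption (real systems often feed such a tolerance into a Douglas-Peucker-style simplifier):

```typescript
// Sketch of zoom-dependent geometry down-sampling tolerance.
function simplificationTolerance(zoom: number, maxZoom = 18): number {
  // Coarser geometry (larger tolerance) the further the user is zoomed out.
  return Math.pow(2, maxZoom - zoom) * 0.5; // tolerance in pixels
}
```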


Object Centric Data Model


To provide a framework for the following discussion of specific systems and methods described above and below, an example database system 1210 using an ontology 1205 will now be described. This description is provided by way of example and is not intended to limit the techniques to the example data model, the example database system, or the example database system's use of an ontology to represent information.


In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 1205. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 1209 based on the ontology 1205. For example, each object of the conceptual data model may correspond to one or more rows in a relational database or an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.



FIG. 8A illustrates an object-centric conceptual data model according to an embodiment. An ontology 1205, as noted above, may include stored information providing a data model for storage of data in the database 1209. The ontology 1205 may be defined by one or more object types, which may each be associated with one or more property types. At the highest level of abstraction, data object 1201 is a container for information representing things in the world. For example, data object 1201 can represent an entity such as a person, a place, an organization, a market instrument, or other noun. Data object 1201 can represent an event that happens at a point in time or for a duration. Data object 1201 can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object 1201 is associated with a unique identifier that uniquely identifies the data object within the database system.


Different types of data objects may have different property types. For example, a “Person” data object might have an “Eye Color” property type and an “Event” data object might have a “Date” property type. Each property 1203 as represented by data in the database system 1210 may have a property type defined by the ontology 1205 used by the database 1209.


Objects may be instantiated in the database 1209 in accordance with the corresponding object definition for the particular object in the ontology 1205. For example, a specific monetary payment (e.g., an object of type “event”) of US$30.00 (e.g., a property of type “currency”) taking place on Mar. 27, 2009 (e.g., a property of type “date”) may be stored in the database 1209 as an event object with associated currency and date properties as defined within the ontology 1205.


The data objects defined in the ontology 1205 may support property multiplicity. In particular, a data object 1201 may be allowed to have more than one property 1203 of the same property type. For example, a “Person” data object might have multiple “Address” properties or multiple “Name” properties.
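A minimal sketch of this model in code, including property multiplicity, follows; the interface names are illustrative, not the disclosed schema:

```typescript
// Sketch of the object-centric model: typed properties on typed objects,
// with a list of properties to allow multiplicity.
interface ObjectProperty {
  type: string; // property type defined by the ontology, e.g. "Address"
  value: string | number | Date;
}

interface DataObject {
  id: string; // unique identifier within the database system
  objectType: string; // e.g. "Person", "Event", "Document"
  properties: ObjectProperty[]; // repeated property types are allowed
}

// The monetary payment example from the text, as an instantiated object.
const payment: DataObject = {
  id: 'event-0001',
  objectType: 'Event',
  properties: [
    { type: 'Currency', value: 30.0 },
    { type: 'Date', value: new Date('2009-03-27') },
  ],
};
```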


Each link 1202 represents a connection between two data objects 1201. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical. For example, “Person” data object A may be connected to “Person” data object B by a “Child Of” relationship (where “Person” data object B has an asymmetric “Parent Of” relationship to “Person” data object A), a “Kin Of” symmetric relationship to “Person” data object C, and an asymmetric “Member Of” relationship to “Organization” data object X. The type of relationship between two data objects may vary depending on the types of the data objects. For example, “Person” data object A may have an “Appears In” relationship with “Document” data object Y or have a “Participate In” relationship with “Event” data object E. As an example of an event connection, two “Person” data objects may be connected by an “Airline Flight” data object representing a particular airline flight if they traveled together on that flight, or by a “Meeting” data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for example, an “Appears In” relationship.


As an example of a matching properties connection, two “Person” data objects representing a brother and a sister may both have an “Address” property that indicates where they live. If the brother and the sister live in the same home, then their “Address” properties likely contain similar, if not identical, property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link, and other types of connections may be represented; embodiments are not limited to any particular types of connections between data objects. For example, a document may contain references to two different objects, such as a payment (one object) and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.


Each data object 1201 can have multiple links with another data object 1201 to form a link set 1204. For example, two “Person” data objects representing a husband and a wife could be linked through a “Spouse Of” relationship, a matching “Address” property, and one or more matching “Event” properties (e.g., a wedding). Each link 1202 as represented by data in a database may have a link type defined by the database ontology used by the database.
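Links and link sets might be sketched as below; the `Link` shape and the symmetry flag are illustrative assumptions:

```typescript
// Sketch of links between data objects and a link set for one object pair.
interface Link {
  type: string; // e.g. "Spouse Of", "Parent Of", "Appears In"
  from: string; // source object id (e.g. the payer of a payment)
  to: string;   // target object id
  symmetric: boolean; // "Kin Of" is symmetric; "Parent Of" is not
}

type LinkSet = Link[]; // all links connecting the same two objects

const husbandAndWife: LinkSet = [
  { type: 'Spouse Of', from: 'person-A', to: 'person-B', symmetric: true },
  { type: 'Matching Address', from: 'person-A', to: 'person-B', symmetric: true },
];
```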



FIG. 8B is a block diagram illustrating exemplary components and data that may be used in identifying and storing data according to an ontology. In this example, the ontology may be configured, and data in the data model populated, by a system of parsers and ontology configuration tools. In the embodiment of FIG. 8B, input data 1300 is provided to parser 1302. The input data may comprise data from one or more sources. For example, an institution may have one or more databases with information on credit card transactions, rental cars, and people. The databases may contain a variety of related information and attributes about each type of data, such as a “date” for a credit card transaction, an address for a person, and a date for when a rental car is rented. The parser 1302 is able to read a variety of source input data types and determine which type of data it is reading.


In accordance with the discussion above, the example ontology 1205 comprises stored information providing the data model of data stored in database 1209, and the ontology is defined by one or more object types 1310, one or more property types 1316, and one or more link types 1330. Based on information determined by the parser 1302 or other mapping of source input information to object type, one or more data objects 1201 may be instantiated in the database 1209 based on respective determined object types 1310, and each of the objects 1201 has one or more properties 1203 that are instantiated based on property types 1316. Two data objects 1201 may be connected by one or more links 1202 that may be instantiated based on link types 1330. The property types 1316 each may comprise one or more data types 1318, such as a string, number, etc. Property types 1316 may be instantiated based on a base property type 1320. For example, a base property type 1320 may be “Locations” and a property type 1316 may be “Home.”


In an embodiment, a user of the system uses an object type editor 1324 to create and/or modify the object types 1310 and define attributes of the object types. In an embodiment, a user of the system uses a property type editor 1326 to create and/or modify the property types 1316 and define attributes of the property types. In an embodiment, a user of the system uses link type editor 1328 to create the link types 1330. Alternatively, other programs, processes, or programmatic controls may be used to create link types and property types and define attributes, and using editors is not required.


In an embodiment, creating a property type 1316 using the property type editor 1326 involves defining at least one parser definition using a parser editor 1322. A parser definition comprises metadata that informs parser 1302 how to parse input data 1300 to determine whether values in the input data can be assigned to the property type 1316 that is associated with the parser definition. In an embodiment, each parser definition may comprise a regular expression parser 1304A or a code module parser 1304B. In other embodiments, other kinds of parser definitions may be provided using scripts or other programmatic elements. Once defined, both a regular expression parser 1304A and a code module parser 1304B can provide input to parser 1302 to control parsing of input data 1300.
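A regular-expression parser definition of the kind described (1304A) might be sketched as follows; the `ParserDefinition` shape, the phone-number pattern, and `tryParse` are illustrative assumptions:

```typescript
// Sketch of a regex parser definition feeding a property type.
interface ParsedProperty {
  type: string;
  value: string;
}

interface ParserDefinition {
  propertyType: string; // the property type this definition can populate
  pattern: RegExp;      // the regular expression parser
}

const phoneParser: ParserDefinition = {
  propertyType: 'Phone Number',
  pattern: /\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}/,
};

// Core parsing step: if the pattern matches an input field, the matched
// value can be assigned to the associated property type.
function tryParse(def: ParserDefinition, field: string): ParsedProperty | null {
  const match = def.pattern.exec(field);
  return match ? { type: def.propertyType, value: match[0] } : null;
}

// tryParse(phoneParser, 'call (555) 867-5309')
//   -> { type: 'Phone Number', value: '(555) 867-5309' }
```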


Using the data types defined in the ontology, input data 1300 may be parsed by the parser 1302 to determine which object type 1310 should receive data from a record created from the input data, and which property types 1316 should be assigned to data from individual field values in the input data. Based on the object-property mapping 1301, the parser 1302 selects one of the parser definitions that is associated with a property type in the input data. The parser parses an input data field using the selected parser definition, resulting in creating new or modified data 1303. The new or modified data 1303 is added to the database 1209 according to ontology 1205 by storing values of the new or modified data in a property of the specified property type. As a result, input data 1300 having varying format or syntax can be created in database 1209. The ontology 1205 may be modified at any time using object type editor 1324, property type editor 1326, and link type editor 1328, or under program control without human use of an editor. Parser editor 1322 enables creating multiple parser definitions that can successfully parse input data 1300 having varying format or syntax and determine which property types should be used to transform input data 1300 into new or modified input data 1303.


The properties, objects, and links (e.g. relationships) between the objects can be visualized using a graphical user interface (GUI). For example, FIG. 8C displays a user interface showing a graph representation 1403 of relationships (including relationships and/or links 1404, 1405, 1406, 1407, 1408, 1409, 1410, 1411, 1412, and 1413) between the data objects (including data objects 1421, 1422, 1423, 1424, 1425, 1426, 1427, 1428, and 1429) that are represented as nodes in the example of FIG. 8C. In this embodiment, the data objects include person objects 1421, 1422, 1423, 1424, 1425, and 1426; a flight object 1427; a financial account 1428; and a computer object 1429. In this example, each person node (associated with person data objects), flight node (associated with flight data objects), financial account node (associated with financial account data objects), and computer node (associated with computer data objects) may have relationships and/or links with any of the other nodes through, for example, other objects such as payment objects.


For example, in FIG. 8C, relationship 1404 is based on a payment associated with the individuals indicated in person data objects 1421 and 1423. The link 1404 represents these shared payments (for example, the individual associated with data object 1421 may have paid the individual associated with data object 1423 on three occasions). The relationship is further indicated by the common relationship between person data objects 1421 and 1423 and financial account data object 1428. For example, link 1411 indicates that person data object 1421 transferred money into financial account data object 1428, while person data object 1423 transferred money out of financial account data object 1428. In another example, the relationships between person data objects 1424 and 1425 and flight data object 1427 are indicated by links 1406, 1409, and 1410. In this example, person data objects 1424 and 1425 have a common address and were passengers on the same flight data object 1427. In an embodiment, further details related to the relationships between the various objects may be displayed. For example, links 1411 and 1412 may, in some embodiments, indicate the timing of the respective money transfers. In another example, the time of the flight associated with the flight data object 1427 may be shown.


Relationships between data objects may be stored as links, or in some embodiments, as properties, where a relationship may be detected between the properties. In some cases, as stated above, the links may be directional. For example, a payment link may have a direction associated with the payment, where one person object is the receiver of a payment, and another person object is the payer of the payment.


In various embodiments, data objects may further include geographical metadata and/or links. Such geographical metadata may be accessed by the interactive data object map system for displaying objects and features on the map interface (as described above).


In addition to visually showing relationships between the data objects, the user interface may allow various other manipulations. For example, the objects within database 1209 may be searched using a search interface 1450 (e.g., text string matching of object properties), inspected (e.g., properties and associated data viewed), filtered (e.g., narrowing the universe of objects into sets and subsets by properties or relationships), and statistically aggregated (e.g., numerically summarized based on summarization criteria), among other operations and visualizations. Additionally, as described above, objects within database 1209 may be searched, accessed, and implemented in the map interface of the interactive data object map system via, for example, a geosearch and/or radius search.


Implementation Mechanisms


According to an embodiment, the interactive data object map system and other methods and techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.


Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 8D is a block diagram that illustrates a computer system 800 upon which the various systems and methods discussed herein may be implemented. Computer system 800 includes a bus 802 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 804 coupled with bus 802 for processing information. Hardware processor(s) 804 may be, for example, one or more general purpose microprocessors.


Computer system 800 also includes a main memory 806, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 802 for storing information and instructions to be executed by processor 804. Main memory 806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 804. Such instructions, when stored in storage media accessible to processor 804, render computer system 800 into a special-purpose machine that is customized to perform the operations specified in the instructions.


Computer system 800 further includes a read only memory (ROM) 808 or other static storage device coupled to bus 802 for storing static information and instructions for processor 804. A storage device 810, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 802 for storing information and instructions.


Computer system 800 may be coupled via bus 802 to a display 812, such as a cathode ray tube (CRT), LCD display, or touch screen display, for displaying information to a computer user and/or receiving input from the user. An input device 814, including alphanumeric and other keys, is coupled to bus 802 for communicating information and command selections to processor 804. Another type of user input device is cursor control 816, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 804 and for controlling cursor movement on display 812. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.


Computing system 800 may include a user interface module, and/or various other types of modules to implement a GUI, a map interface, and the various other aspects of the interactive data object map system. The modules may be stored in a mass storage device as executable software code that is executed by the computing device(s). These and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computer system 800 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 800 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 800 in response to processor(s) 804 executing one or more sequences of one or more modules and/or instructions contained in main memory 806. Such instructions may be read into main memory 806 from another storage medium, such as storage device 810. Execution of the sequences of instructions contained in main memory 806 causes processor(s) 804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 810. Volatile media includes dynamic memory, such as main memory 806. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.


Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 802. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 804 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions and/or modules into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 800 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 802. Bus 802 carries the data to main memory 806, from which processor 804 retrieves and executes the instructions. The instructions received by main memory 806 may optionally be stored on storage device 810 either before or after execution by processor 804.


Computer system 800 also includes a communication interface 818 coupled to bus 802. Communication interface 818 provides a two-way data communication coupling to a network link 820 that is connected to a local network 822. For example, communication interface 818 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 818 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.


Network link 820 typically provides data communication through one or more networks to other data devices. For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP 826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 828. Local network 822 and Internet 828 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 820 and through communication interface 818, which carry the digital data to and from computer system 800, are example forms of transmission media.


Computer system 800 can send messages and receive data, including program code, through the network(s), network link 820 and communication interface 818. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822 and communication interface 818. Server-side components of the interactive data object map system described above (for example, with reference to FIGS. 7A and 7B) may be implemented in the server 830. For example, the server 830 may compose map layers and tiles, and transmit those map tiles to the computer system 800.


The computer system 800, on the other hand, may implement the client-side components of the map system as described above (for example, with reference to FIGS. 6A and 6B). For example, the computer system may receive map tiles and/or other code that may be executed by processor 804 as it is received, and/or stored in storage device 810, or other non-volatile storage for later execution. The computer system 800 may further compose the map interface from the map tiles, display the map interface to the user, generate object outlines and other functionality, and/or receive input from the user.


In an embodiment, the map system may be accessible by the user through a web-based viewer, such as a web browser. In this embodiment, the map interface may be generated by the server 830 and/or the computer system 800 and transmitted to the web browser of the user. The user may then interact with the map interface through the web browser. In an embodiment, the computer system 800 may comprise a mobile electronic device, such as a cell phone, smartphone, and/or tablet. The map system may be accessible by the user through such a mobile electronic device, among other types of electronic devices.


Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached Figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computer system comprising: one or more hardware processors in communication with a computer readable medium storing software instructions, the one or more hardware processors configured to execute the software instructions to cause the computer system to: cause display of an interactive map in a graphical user interface, wherein the interactive map comprises a plurality of map layers; determine a list of available map layers; organize the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; cause display, on the interactive map in the graphical user interface, of the hierarchical layer ontology including indications of the available map layers, wherein each of the available map layers is user selectable, wherein each of the available map layers is associated with one or more feature types, and wherein the available map layers comprise base layers, vector layers, and user-defined layers; receive a first user input selecting or deselecting one or more map layers of the available map layers; and responsive to the first user input, determine whether map layers needed to compose map tiles based on the selecting or deselecting are cached.
  • 2. The computer system of claim 1, wherein the hierarchical layer ontology includes at least two hierarchical levels of layers.
  • 3. The computer system of claim 2, wherein selection or de-selection of an available map layer in a first hierarchical level causes a corresponding selection or de-selection of all available map layers in any hierarchical levels below the first hierarchical level.
  • 4. The computer system of claim 1, wherein fewer than all the map layers are displayed in the hierarchical layer ontology.
  • 5. The computer system of claim 1, wherein the base layer comprises at least one of an overhead imagery layer, a topographic layer, a subtle base layer, an aviation layer, a blank Mercator layer, or a blank unprojected layer.
  • 6. The computer system of claim 1, wherein the one or more vector layers comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.
  • 7. The computer system of claim 1, wherein each of the one or more vector layers is comprised of one or more sub-vector layers.
  • 8. The computer system of claim 7, wherein each of the one or more vector layers includes vector data associated with features of respective feature types.
  • 9. The computer system of claim 8, wherein the one or more feature types include at least one of regions, buildings/structures, terrain, transportation, or utilities/infrastructure.
  • 10. The computer system of claim 8, wherein each of the features represents at least one of a road, a terrain, a lake, a river, a vegetation, a utility, a street light, a sign, a railroad, a hotel, a motel, a school, a hospital, a building or other structure, a region, a transportation object, an entity, an event, or a document.
  • 11. The computer system of claim 8, wherein metadata associated with the features includes at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.
  • 12. The computer system of claim 11, wherein the one or more hardware processors are further configured to execute the software instructions to cause the computer system to: receive a second user input from the user selecting one or more of the features; and in response to the second user input, access and cause display of metadata associated with at least some of the selected features.
  • 13. The computer system of claim 12, wherein the one or more hardware processors are further configured to execute the software instructions to cause the computer system to: further in response to the second user input, cause display of one or more histograms based on the accessed metadata.
  • 14. The computer system of claim 1, wherein the one or more hardware processors are further configured to execute the software instructions to cause the computer system to: further responsive to the first user input: cause the interactive map to be updated with the one or more map tiles including the map layers needed to compose the map tiles based on the selecting or deselecting.
  • 15. The computer system of claim 14, wherein causing the interactive map to be updated comprises: composing data associated with each of the map layers needed to compose the map tiles into one or more map tiles; and arranging the one or more map tiles into the interactive map in the graphical user interface.
  • 16. The computer system of claim 1, wherein the one or more hardware processors are further configured to execute the software instructions to cause the computer system to: further responsive to the first user input: determine, based on the selecting or deselecting, a map tile composition and the map layers needed to compose the map tiles; compose the map layers needed to compose the map tiles into one or more map tiles; and cause the interactive map to be updated with the one or more map tiles.
  • 17. The computer system of claim 16, wherein the one or more hardware processors are further configured to execute the software instructions to cause the computer system to: further responsive to the first user input: calculate any map layers needed in the map tile composition that are not cached; and cache the calculated map layers.
  • 18. A computer-implemented method comprising: by one or more processors executing program instructions: causing display of an interactive map in a graphical user interface, wherein the interactive map comprises a plurality of map layers; determining a list of available map layers; organizing the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; causing display, on the interactive map in the graphical user interface, of the hierarchical layer ontology including indications of the available map layers, wherein each of the available map layers is user selectable, wherein each of the available map layers is associated with one or more feature types, and wherein the available map layers comprise base layers, vector layers, and user-defined layers; receiving a first user input selecting or deselecting one or more map layers of the available map layers; and responsive to the first user input, determining whether map layers needed to compose map tiles based on the selecting or deselecting are cached.
  • 19. The computer-implemented method of claim 18 further comprising: by the one or more processors executing program instructions: further responsive to the first user input: determining, based on the selecting or deselecting, a map tile composition and the map layers needed to compose the map tiles; composing the map layers needed to compose the map tiles into one or more map tiles; and causing the interactive map to be updated with the one or more map tiles.
  • 20. The computer-implemented method of claim 19 further comprising: by the one or more processors executing program instructions: further responsive to the first user input: calculating any map layers needed in the map tile composition that are not cached; and caching the calculated map layers.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/917,571, filed on Jun. 13, 2013, and titled “INTERACTIVE GEOSPATIAL MAP,” which application claims a priority benefit under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 61/820,608, filed on May 7, 2013, and titled “INTERACTIVE DATA OBJECT MAP.” All of the above-identified applications are hereby incorporated by reference herein in their entireties.

US Referenced Citations (890)
Number Name Date Kind
4899161 Morin et al. Feb 1990 A
4958305 Piazza Sep 1990 A
5109399 Thompson Apr 1992 A
5241625 Epard et al. Aug 1993 A
5329108 Lamoure Jul 1994 A
5623590 Becker et al. Apr 1997 A
5632009 Rao et al. May 1997 A
5632987 Rao et al. May 1997 A
5670987 Doi et al. Sep 1997 A
5754182 Kobayashi May 1998 A
5781195 Marvin Jul 1998 A
5781704 Rossmo Jul 1998 A
5798769 Chiu et al. Aug 1998 A
5845300 Comer Dec 1998 A
5999911 Berg et al. Dec 1999 A
6055569 O'Brien Apr 2000 A
6057757 Arrowsmith et al. May 2000 A
6065026 Cornelia et al. May 2000 A
6091956 Hollenberg Jul 2000 A
6157747 Szeliski et al. Dec 2000 A
6161098 Wallman Dec 2000 A
6169552 Endo et al. Jan 2001 B1
6173067 Payton et al. Jan 2001 B1
6178432 Cook et al. Jan 2001 B1
6219053 Tachibana et al. Apr 2001 B1
6232971 Haynes May 2001 B1
6237138 Hameluck et al. May 2001 B1
6243706 Moreau et al. Jun 2001 B1
6247019 Davies Jun 2001 B1
6279018 Kudrolli et al. Aug 2001 B1
6338066 Martin Jan 2002 B1
6341310 Leshem et al. Jan 2002 B1
6366933 Ball et al. Apr 2002 B1
6369835 Lin Apr 2002 B1
6370538 Lamping et al. Apr 2002 B1
6389289 Voce et al. May 2002 B1
6414683 Gueziec Jul 2002 B1
6430305 Decker Aug 2002 B1
6456997 Shukla Sep 2002 B1
6483509 Rabenhorst Nov 2002 B1
6523019 Borthwick Feb 2003 B1
6529900 Patterson et al. Mar 2003 B1
6549944 Weinberg et al. Apr 2003 B1
6560620 Ching May 2003 B1
6581068 Bensoussan et al. Jun 2003 B1
6584498 Nguyen Jun 2003 B2
6594672 Lampson et al. Jul 2003 B1
6631496 Li et al. Oct 2003 B1
6642945 Sharpe Nov 2003 B1
6662103 Skolnick et al. Dec 2003 B1
6665683 Meltzer Dec 2003 B1
6674434 Chojnacki et al. Jan 2004 B1
6714936 Nevin, III Mar 2004 B1
6742033 Smith May 2004 B1
6757445 Knopp Jun 2004 B1
6775675 Nwabueze et al. Aug 2004 B1
6820135 Dingman Nov 2004 B1
6828920 Owen et al. Dec 2004 B2
6839745 Dingari et al. Jan 2005 B1
6850317 Mullins et al. Feb 2005 B2
6877137 Rivette et al. Apr 2005 B1
6944821 Bates et al. Sep 2005 B1
6967589 Peters Nov 2005 B1
6976210 Silva et al. Dec 2005 B1
6978419 Kantrowitz Dec 2005 B1
6980984 Huffman et al. Dec 2005 B1
6983203 Wako Jan 2006 B1
6985950 Hanson et al. Jan 2006 B1
7003566 Codella Feb 2006 B2
7036085 Barros Apr 2006 B2
7043702 Chi et al. May 2006 B2
7055110 Kupka et al. May 2006 B2
7086028 Davis et al. Aug 2006 B1
7103852 Kairis, Jr. Sep 2006 B2
7139800 Bellotti et al. Nov 2006 B2
7149366 Sun Dec 2006 B1
7158878 Rasmussen et al. Jan 2007 B2
7162475 Ackerman Jan 2007 B2
7168039 Bertram Jan 2007 B2
7171427 Witowski et al. Jan 2007 B2
7174377 Bernard et al. Feb 2007 B2
7194680 Roy et al. Mar 2007 B1
7213030 Jenkins May 2007 B1
7269786 Malloy et al. Sep 2007 B1
7278105 Kitts Oct 2007 B1
7290698 Poslinski et al. Nov 2007 B2
7333998 Heckerman et al. Feb 2008 B2
7370047 Gorman May 2008 B2
7375732 Arcas May 2008 B2
7379811 Rasmussen et al. May 2008 B2
7379903 Caballero et al. May 2008 B2
7392254 Jenkins Jun 2008 B1
7426654 Adams et al. Sep 2008 B2
7441182 Beilinson et al. Oct 2008 B2
7441219 Perry et al. Oct 2008 B2
7454466 Bellotti et al. Nov 2008 B2
7457706 Malero et al. Nov 2008 B2
7467375 Tondreau et al. Dec 2008 B2
7487139 Fraleigh et al. Feb 2009 B2
7502786 Liu et al. Mar 2009 B2
7519470 Brasche et al. Apr 2009 B2
7525422 Bishop et al. Apr 2009 B2
7529195 Gorman May 2009 B2
7529727 Arning et al. May 2009 B2
7529734 Dirisala May 2009 B2
7539666 Ashworth et al. May 2009 B2
7558677 Jones Jul 2009 B2
7558822 Fredricksen Jul 2009 B2
7574409 Patinkin Aug 2009 B2
7574428 Leiserowitz et al. Aug 2009 B2
7579965 Bucholz Aug 2009 B2
7596285 Brown et al. Sep 2009 B2
7614006 Molander Nov 2009 B2
7617232 Gabbert et al. Nov 2009 B2
7617314 Bansod et al. Nov 2009 B1
7620628 Kapur et al. Nov 2009 B2
7627812 Chamberlain et al. Dec 2009 B2
7634717 Chamberlain et al. Dec 2009 B2
7653883 Hotelling Jan 2010 B2
7663621 Allen et al. Feb 2010 B1
7693816 Nemoto Apr 2010 B2
7703021 Flam Apr 2010 B1
7706817 Bamrah et al. Apr 2010 B2
7712049 Williams et al. May 2010 B2
7716077 Mikurak May 2010 B1
7725530 Sah et al. May 2010 B2
7725547 Albertson et al. May 2010 B2
7730082 Sah et al. Jun 2010 B2
7730109 Rohrs et al. Jun 2010 B2
7747749 Erikson Jun 2010 B1
7756843 Palmer Jul 2010 B1
7765489 Shah Jul 2010 B1
7770100 Chamberlain et al. Aug 2010 B2
7791616 Ioup et al. Sep 2010 B2
7805457 Viola et al. Sep 2010 B1
7809703 Balabhadrapatruni et al. Oct 2010 B2
7818658 Chen Oct 2010 B2
7870493 Pall et al. Jan 2011 B2
7872647 Mayer et al. Jan 2011 B2
7877421 Berger et al. Jan 2011 B2
7880921 Dattilo et al. Feb 2011 B2
7890850 Bryar Feb 2011 B1
7894984 Rasmussen Feb 2011 B2
7899611 Downs et al. Mar 2011 B2
7899796 Borthwick et al. Mar 2011 B1
7917376 Bellin et al. Mar 2011 B2
7920963 Jouline et al. Apr 2011 B2
7933862 Chamberlain et al. Apr 2011 B2
7941321 Greenstein et al. May 2011 B2
7941336 Robin-Jan May 2011 B1
7945852 Pilskains May 2011 B1
7949960 Roessler May 2011 B2
7958147 Turner et al. Jun 2011 B1
7962281 Rasmussen et al. Jun 2011 B2
7962495 Jain et al. Jun 2011 B2
7962848 Bertram Jun 2011 B2
7966199 Frasher Jun 2011 B1
7970240 Chao et al. Jun 2011 B1
7971150 Raskutti et al. Jun 2011 B2
7984374 Caro et al. Jul 2011 B2
8001465 Kudrolli et al. Aug 2011 B2
8001482 Bhattiprolu et al. Aug 2011 B2
8010507 Poston et al. Aug 2011 B2
8010545 Stefik et al. Aug 2011 B2
8015487 Roy et al. Sep 2011 B2
8024778 Cash et al. Sep 2011 B2
8036632 Cona et al. Oct 2011 B1
8036971 Aymeloglu et al. Oct 2011 B2
8046283 Burns Oct 2011 B2
8054756 Chand et al. Nov 2011 B2
8065080 Koch Nov 2011 B2
8073857 Sreekanth Dec 2011 B2
8085268 Carrino et al. Dec 2011 B2
8095434 Puttick et al. Jan 2012 B1
8103543 Zwicky Jan 2012 B1
8134457 Velipasalar et al. Mar 2012 B2
8145703 Frishert et al. Mar 2012 B2
8185819 Sah et al. May 2012 B2
8191005 Baier et al. May 2012 B2
8200676 Frank Jun 2012 B2
8214361 Sandler et al. Jul 2012 B1
8214490 Vos et al. Jul 2012 B1
8214764 Gemmell et al. Jul 2012 B2
8225201 Michael Jul 2012 B2
8229902 Vishniac et al. Jul 2012 B2
8229947 Fujinaga Jul 2012 B2
8230333 Decherd et al. Jul 2012 B2
8271461 Pike et al. Sep 2012 B2
8280880 Aymeloglu et al. Oct 2012 B1
8290838 Thakur et al. Oct 2012 B1
8290926 Ozzie et al. Oct 2012 B2
8290942 Jones et al. Oct 2012 B2
8290943 Carbone et al. Oct 2012 B2
8301464 Cave et al. Oct 2012 B1
8301904 Gryaznov Oct 2012 B1
8302855 Ma et al. Nov 2012 B2
8312367 Foster Nov 2012 B2
8312546 Alme Nov 2012 B2
8325178 Doyle, Jr. Dec 2012 B1
8352881 Champion et al. Jan 2013 B2
8368695 Howell et al. Feb 2013 B2
8386377 Xiong et al. Feb 2013 B1
8396740 Watson Mar 2013 B1
8397171 Klassen et al. Mar 2013 B2
8400448 Doyle, Jr. Mar 2013 B1
8407180 Ramesh et al. Mar 2013 B1
8412234 Gatmir-Motahair et al. Apr 2013 B1
8412707 Mianji Apr 2013 B1
8422825 Neophytou et al. Apr 2013 B1
8447722 Ahuja et al. May 2013 B1
8452790 Mianji May 2013 B1
8463036 Ramesh et al. Jun 2013 B1
8473454 Evanitsky et al. Jun 2013 B2
8484115 Aymeloglu et al. Jul 2013 B2
8489331 Kopf et al. Jul 2013 B2
8489641 Seefeld Jul 2013 B1
8494984 Hwang et al. Jul 2013 B2
8498984 Hwang et al. Jul 2013 B1
8508533 Cervelli et al. Aug 2013 B2
8510743 Hackborn et al. Aug 2013 B2
8514082 Cova et al. Aug 2013 B2
8514229 Cervelli et al. Aug 2013 B2
8515207 Chau Aug 2013 B2
8527949 Pleis et al. Sep 2013 B1
8554579 Tribble et al. Oct 2013 B2
8554653 Falkenborg et al. Oct 2013 B2
8554709 Goodson et al. Oct 2013 B2
8560413 Quarterman Oct 2013 B1
8564596 Carrino et al. Oct 2013 B2
8577911 Stepinski et al. Nov 2013 B1
8589273 Creeden et al. Nov 2013 B2
8595234 Siripurapu et al. Nov 2013 B2
8599203 Horowitz et al. Dec 2013 B2
8620641 Farnsworth et al. Dec 2013 B2
8639757 Zang et al. Jan 2014 B1
8646080 Williamson et al. Feb 2014 B2
8676857 Adams et al. Mar 2014 B1
8682696 Shanmugam Mar 2014 B1
8688573 Rukonic et al. Apr 2014 B1
8689108 Duffield et al. Apr 2014 B1
8713467 Goldenberg et al. Apr 2014 B1
8726379 Stiansen et al. May 2014 B1
8732574 Burr et al. May 2014 B2
8739278 Varghese May 2014 B2
8742934 Sarpy et al. Jun 2014 B1
8744890 Bernier Jun 2014 B1
8745516 Mason et al. Jun 2014 B2
8781169 Jackson et al. Jul 2014 B2
8787939 Papakipos et al. Jul 2014 B2
8788407 Singh et al. Jul 2014 B1
8799313 Satlow Aug 2014 B2
8799799 Cervelli Aug 2014 B1
8807948 Luo et al. Aug 2014 B2
8812960 Sun et al. Aug 2014 B1
8830322 Nerayoff et al. Sep 2014 B2
8832594 Thompson et al. Sep 2014 B1
8868537 Colgrove et al. Oct 2014 B1
8917274 Ma et al. Dec 2014 B2
8924388 Elliot et al. Dec 2014 B2
8924389 Elliot et al. Dec 2014 B2
8924872 Bogomolov et al. Dec 2014 B1
8930874 Duff et al. Jan 2015 B2
8937619 Sharma et al. Jan 2015 B2
8938434 Jain et al. Jan 2015 B2
8938686 Erenrich et al. Jan 2015 B1
8949164 Mohler Feb 2015 B1
8983494 Onnen et al. Mar 2015 B1
8984390 Aymeloglu et al. Mar 2015 B2
9009171 Grossman et al. Apr 2015 B1
9009177 Zheng et al. Apr 2015 B2
9009827 Albertson et al. Apr 2015 B1
9021260 Falk et al. Apr 2015 B1
9021384 Beard et al. Apr 2015 B1
9043696 Meiklejohn et al. May 2015 B1
9043894 Dennison et al. May 2015 B1
9058315 Burr et al. Jun 2015 B2
9100428 Visbal Aug 2015 B1
9104293 Kornfeld et al. Aug 2015 B1
9104695 Cervelli et al. Aug 2015 B1
9111380 Piemonte et al. Aug 2015 B2
9116975 Shankar et al. Aug 2015 B2
9129219 Robertson et al. Sep 2015 B1
9146125 Vulcano et al. Sep 2015 B2
9165100 Begur et al. Oct 2015 B2
9280618 Bruce et al. Mar 2016 B1
9600146 Cervelli et al. Mar 2017 B2
20010021936 Bertram Sep 2001 A1
20020003539 Abe Jan 2002 A1
20020032677 Morgenthaler et al. Mar 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020065708 Senay et al. May 2002 A1
20020091707 Keller Jul 2002 A1
20020095360 Joao Jul 2002 A1
20020095658 Shulman et al. Jul 2002 A1
20020103705 Brady Aug 2002 A1
20020116120 Ruiz et al. Aug 2002 A1
20020130867 Yang et al. Sep 2002 A1
20020130906 Miyaki Sep 2002 A1
20020130907 Chi et al. Sep 2002 A1
20020147805 Leshem et al. Oct 2002 A1
20020174201 Ramer et al. Nov 2002 A1
20020194119 Wright et al. Dec 2002 A1
20020196229 Chen et al. Dec 2002 A1
20030028560 Kudrolli et al. Feb 2003 A1
20030036848 Sheha et al. Feb 2003 A1
20030036927 Bowen Feb 2003 A1
20030039948 Donahue Feb 2003 A1
20030052896 Higgins et al. Mar 2003 A1
20030093755 O'Carroll May 2003 A1
20030103049 Kindratenko et al. Jun 2003 A1
20030126102 Borthwick Jul 2003 A1
20030140106 Raguseo Jul 2003 A1
20030144868 MacIntyre et al. Jul 2003 A1
20030163352 Surpin et al. Aug 2003 A1
20030200217 Ackerman Oct 2003 A1
20030225755 Iwayama et al. Dec 2003 A1
20030229848 Arend et al. Dec 2003 A1
20040030492 Fox et al. Feb 2004 A1
20040032432 Baynger Feb 2004 A1
20040034570 Davis Feb 2004 A1
20040039498 Ollis et al. Feb 2004 A1
20040044648 Anfindsen et al. Mar 2004 A1
20040064256 Barinek et al. Apr 2004 A1
20040085318 Hassler et al. May 2004 A1
20040095349 Bito et al. May 2004 A1
20040098236 Mayer et al. May 2004 A1
20040111410 Burgoon et al. Jun 2004 A1
20040111480 Yue Jun 2004 A1
20040123135 Goddard Jun 2004 A1
20040126840 Cheng et al. Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040143796 Lerner et al. Jul 2004 A1
20040153418 Hanweck Aug 2004 A1
20040163039 Gorman Aug 2004 A1
20040175036 Graham Sep 2004 A1
20040181554 Heckerman et al. Sep 2004 A1
20040193600 Kaasten et al. Sep 2004 A1
20040205492 Newsome Oct 2004 A1
20040217884 Samadani et al. Nov 2004 A1
20040221223 Yu et al. Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040236711 Nixon et al. Nov 2004 A1
20040260702 Cragun et al. Dec 2004 A1
20040267746 Marcjan et al. Dec 2004 A1
20050010472 Quatse et al. Jan 2005 A1
20050027705 Sadri et al. Feb 2005 A1
20050028094 Allyn Feb 2005 A1
20050028191 Sullivan Feb 2005 A1
20050031197 Knopp Feb 2005 A1
20050034062 Bufkin et al. Feb 2005 A1
20050039116 Slack-Smith Feb 2005 A1
20050039119 Parks et al. Feb 2005 A1
20050065811 Chu et al. Mar 2005 A1
20050080769 Gemmell Apr 2005 A1
20050086207 Heuer et al. Apr 2005 A1
20050091186 Elish Apr 2005 A1
20050125715 Franco et al. Jun 2005 A1
20050154628 Eckart et al. Jul 2005 A1
20050154769 Eckart et al. Jul 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050166144 Gross Jul 2005 A1
20050180330 Shapiro Aug 2005 A1
20050182502 Iyengar Aug 2005 A1
20050182793 Keenan et al. Aug 2005 A1
20050183005 Denoue et al. Aug 2005 A1
20050210409 Jou Sep 2005 A1
20050223044 Ashworth et al. Oct 2005 A1
20050246327 Yeung et al. Nov 2005 A1
20050251786 Citron et al. Nov 2005 A1
20050267652 Allstadt et al. Dec 2005 A1
20060026120 Carolan et al. Feb 2006 A1
20060026170 Kreitler et al. Feb 2006 A1
20060026561 Bauman et al. Feb 2006 A1
20060031779 Theurer et al. Feb 2006 A1
20060045470 Poslinski et al. Mar 2006 A1
20060047804 Fredricksen Mar 2006 A1
20060053097 King et al. Mar 2006 A1
20060053170 Hill et al. Mar 2006 A1
20060059139 Robinson Mar 2006 A1
20060059423 Lehmann et al. Mar 2006 A1
20060074866 Chamberlain et al. Apr 2006 A1
20060074881 Vembu et al. Apr 2006 A1
20060080139 Mainzer Apr 2006 A1
20060080283 Shipman Apr 2006 A1
20060080619 Carlson et al. Apr 2006 A1
20060093222 Saffer et al. May 2006 A1
20060129191 Sullivan Jun 2006 A1
20060129746 Porter Jun 2006 A1
20060136513 Ngo et al. Jun 2006 A1
20060139375 Rasmussen et al. Jun 2006 A1
20060142949 Helt Jun 2006 A1
20060143034 Rothermel Jun 2006 A1
20060143075 Carr et al. Jun 2006 A1
20060143079 Basak et al. Jun 2006 A1
20060146050 Yamauchi Jul 2006 A1
20060149596 Surpin et al. Jul 2006 A1
20060155654 Plessis et al. Jul 2006 A1
20060178915 Chao Aug 2006 A1
20060200384 Arutunian et al. Sep 2006 A1
20060203337 White Sep 2006 A1
20060218637 Thomas et al. Sep 2006 A1
20060241974 Chao et al. Oct 2006 A1
20060242040 Rader Oct 2006 A1
20060242630 Koike et al. Oct 2006 A1
20060251307 Florin et al. Nov 2006 A1
20060259527 Devarakonda et al. Nov 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060271277 Hu et al. Nov 2006 A1
20060277460 Forstall et al. Dec 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20060294223 Glasgow Dec 2006 A1
20070000999 Kubo et al. Jan 2007 A1
20070011150 Frank Jan 2007 A1
20070011304 Error Jan 2007 A1
20070016363 Huang et al. Jan 2007 A1
20070016435 Bevington Jan 2007 A1
20070024620 Muller-Fischer et al. Feb 2007 A1
20070038646 Thota Feb 2007 A1
20070038962 Fuchs et al. Feb 2007 A1
20070043686 Teng et al. Feb 2007 A1
20070057966 Ohno et al. Mar 2007 A1
20070061752 Cory Mar 2007 A1
20070078832 Ott et al. Apr 2007 A1
20070083541 Fraleigh et al. Apr 2007 A1
20070094389 Nussey et al. Apr 2007 A1
20070113164 Hansen et al. May 2007 A1
20070115373 Gallagher et al. May 2007 A1
20070136095 Weinstein Jun 2007 A1
20070150369 Zivin Jun 2007 A1
20070150801 Chidlovskii et al. Jun 2007 A1
20070156673 Maga Jul 2007 A1
20070162454 D'Albora et al. Jul 2007 A1
20070168871 Jenkins Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070185850 Walters et al. Aug 2007 A1
20070185867 Maga Aug 2007 A1
20070185894 Swain Aug 2007 A1
20070188516 Ioup et al. Aug 2007 A1
20070192122 Routson et al. Aug 2007 A1
20070192265 Chopin et al. Aug 2007 A1
20070198571 Ferguson et al. Aug 2007 A1
20070208497 Downs et al. Sep 2007 A1
20070208498 Barker et al. Sep 2007 A1
20070208736 Tanigawa et al. Sep 2007 A1
20070233709 Abnous Oct 2007 A1
20070240062 Christena et al. Oct 2007 A1
20070245339 Bauman et al. Oct 2007 A1
20070258642 Thota Nov 2007 A1
20070266336 Nojima et al. Nov 2007 A1
20070284433 Domenica et al. Dec 2007 A1
20070294643 Kyle Dec 2007 A1
20070299697 Friedlander et al. Dec 2007 A1
20080010605 Frank Jan 2008 A1
20080016155 Khalatian Jan 2008 A1
20080016216 Worley et al. Jan 2008 A1
20080040275 Paulsen et al. Feb 2008 A1
20080040684 Crump Feb 2008 A1
20080051989 Welsh Feb 2008 A1
20080052142 Bailey et al. Feb 2008 A1
20080066052 Wolfram Mar 2008 A1
20080069081 Chand et al. Mar 2008 A1
20080077597 Butler Mar 2008 A1
20080077642 Carbone et al. Mar 2008 A1
20080082486 Lermant et al. Apr 2008 A1
20080082578 Hogue et al. Apr 2008 A1
20080091693 Murthy Apr 2008 A1
20080098085 Krane et al. Apr 2008 A1
20080103996 Forman et al. May 2008 A1
20080104019 Nath May 2008 A1
20080109714 Kumar et al. May 2008 A1
20080126951 Sood et al. May 2008 A1
20080133579 Lim Jun 2008 A1
20080148398 Mezack et al. Jun 2008 A1
20080155440 Trevor et al. Jun 2008 A1
20080162616 Gross et al. Jul 2008 A1
20080163073 Becker et al. Jul 2008 A1
20080172607 Baer Jul 2008 A1
20080177782 Poston et al. Jul 2008 A1
20080192053 Howell et al. Aug 2008 A1
20080195417 Surpin et al. Aug 2008 A1
20080195474 Lau Aug 2008 A1
20080195608 Clover Aug 2008 A1
20080208735 Balet et al. Aug 2008 A1
20080222295 Robinson et al. Sep 2008 A1
20080223834 Griffiths et al. Sep 2008 A1
20080229056 Agarwal et al. Sep 2008 A1
20080243711 Aymeloglu et al. Oct 2008 A1
20080249820 Pathria Oct 2008 A1
20080249983 Meisels et al. Oct 2008 A1
20080255973 El Wade et al. Oct 2008 A1
20080263468 Cappione et al. Oct 2008 A1
20080267107 Rosenberg Oct 2008 A1
20080270468 Mao Oct 2008 A1
20080276167 Michael Nov 2008 A1
20080278311 Grange et al. Nov 2008 A1
20080288306 MacIntyre et al. Nov 2008 A1
20080294678 Gorman et al. Nov 2008 A1
20080301643 Appleton et al. Dec 2008 A1
20080313132 Hao et al. Dec 2008 A1
20080313243 Poston et al. Dec 2008 A1
20090002492 Velipasalar et al. Jan 2009 A1
20090026170 Tanaka et al. Jan 2009 A1
20090027418 Maru et al. Jan 2009 A1
20090030915 Winter et al. Jan 2009 A1
20090031401 Cudich et al. Jan 2009 A1
20090055251 Shah et al. Feb 2009 A1
20090076845 Bellin et al. Mar 2009 A1
20090088964 Schaaf et al. Apr 2009 A1
20090089651 Herberger et al. Apr 2009 A1
20090094166 Aymeloglu et al. Apr 2009 A1
20090094187 Miyaki Apr 2009 A1
20090094270 Alirez et al. Apr 2009 A1
20090100018 Roberts Apr 2009 A1
20090106178 Chu Apr 2009 A1
20090112678 Luzardo Apr 2009 A1
20090112745 Stefanescu Apr 2009 A1
20090115786 Shimasaki et al. May 2009 A1
20090119309 Gibson et al. May 2009 A1
20090125359 Knapic May 2009 A1
20090125369 Kloostra et al. May 2009 A1
20090125459 Norton et al. May 2009 A1
20090132921 Hwangbo et al. May 2009 A1
20090132953 Reed et al. May 2009 A1
20090143052 Bates et al. Jun 2009 A1
20090144262 White et al. Jun 2009 A1
20090144274 Fraleigh et al. Jun 2009 A1
20090150868 Chakra et al. Jun 2009 A1
20090157732 Hao et al. Jun 2009 A1
20090158185 Lacevic et al. Jun 2009 A1
20090164934 Bhattiprolu et al. Jun 2009 A1
20090171939 Athsani et al. Jul 2009 A1
20090172511 Decherd et al. Jul 2009 A1
20090172821 Daira et al. Jul 2009 A1
20090177962 Gusmorino et al. Jul 2009 A1
20090179892 Tsuda et al. Jul 2009 A1
20090187447 Cheng et al. Jul 2009 A1
20090187464 Bai et al. Jul 2009 A1
20090187546 Whyte et al. Jul 2009 A1
20090187548 Ji et al. Jul 2009 A1
20090199106 Jonsson et al. Aug 2009 A1
20090222400 Kupershmidt et al. Sep 2009 A1
20090222759 Drieschner Sep 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090234720 George et al. Sep 2009 A1
20090248593 Putzolu Oct 2009 A1
20090248757 Havewala et al. Oct 2009 A1
20090249178 Ambrosino et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090254970 Agarwal et al. Oct 2009 A1
20090271343 Vaiciulis et al. Oct 2009 A1
20090281839 Lynn et al. Nov 2009 A1
20090282068 Shockro et al. Nov 2009 A1
20090287470 Farnsworth et al. Nov 2009 A1
20090292626 Oxford Nov 2009 A1
20090307049 Elliott et al. Dec 2009 A1
20090313463 Pang et al. Dec 2009 A1
20090319418 Herz Dec 2009 A1
20090319891 MacKinlay Dec 2009 A1
20100011282 Dollard et al. Jan 2010 A1
20100016910 Sullivan Jan 2010 A1
20100030722 Goodson et al. Feb 2010 A1
20100031141 Summers et al. Feb 2010 A1
20100031183 Kang Feb 2010 A1
20100042922 Bradateanu et al. Feb 2010 A1
20100049872 Roskind Feb 2010 A1
20100057622 Faith et al. Mar 2010 A1
20100057716 Stefik et al. Mar 2010 A1
20100063961 Guiheneuf et al. Mar 2010 A1
20100070523 Delgo et al. Mar 2010 A1
20100070842 Aymeloglu et al. Mar 2010 A1
20100070844 Aymeloglu et al. Mar 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070897 Aymeloglu et al. Mar 2010 A1
20100076968 Boyns et al. Mar 2010 A1
20100088304 Jackson Apr 2010 A1
20100088398 Plamondon Apr 2010 A1
20100098318 Anderson Apr 2010 A1
20100100963 Mahaffey Apr 2010 A1
20100103124 Kruzeniski et al. Apr 2010 A1
20100106420 Mattikalli et al. Apr 2010 A1
20100114887 Conway et al. May 2010 A1
20100122152 Chamberlain et al. May 2010 A1
20100131457 Heimendinger May 2010 A1
20100131502 Fordham May 2010 A1
20100161735 Sharma Jun 2010 A1
20100162176 Dunton Jun 2010 A1
20100185692 Zhang et al. Jul 2010 A1
20100191563 Schlaifer et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20100223260 Wu Sep 2010 A1
20100228812 Uomini Sep 2010 A1
20100235915 Memon et al. Sep 2010 A1
20100238174 Haub et al. Sep 2010 A1
20100250412 Wagner Sep 2010 A1
20100262688 Hussain et al. Oct 2010 A1
20100262901 DiSalvo Oct 2010 A1
20100277611 Holt et al. Nov 2010 A1
20100280851 Merkin Nov 2010 A1
20100280857 Liu et al. Nov 2010 A1
20100293174 Bennett et al. Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100306722 LeHoty et al. Dec 2010 A1
20100312837 Bodapati et al. Dec 2010 A1
20100312858 Mickens Dec 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100313239 Chakra et al. Dec 2010 A1
20100318924 Frankel et al. Dec 2010 A1
20100321399 Ellren et al. Dec 2010 A1
20100321871 Diebel Dec 2010 A1
20100325526 Ellis et al. Dec 2010 A1
20100325581 Finkelstein et al. Dec 2010 A1
20100328112 Liu Dec 2010 A1
20100330801 Rouh Dec 2010 A1
20100332324 Khosravy et al. Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110022312 McDonough et al. Jan 2011 A1
20110029526 Knight et al. Feb 2011 A1
20110029641 Fainberg Feb 2011 A1
20110047159 Baid et al. Feb 2011 A1
20110047540 Williams et al. Feb 2011 A1
20110060753 Shaked et al. Mar 2011 A1
20110061013 Bilicki et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110074788 Regan et al. Mar 2011 A1
20110074811 Hanson et al. Mar 2011 A1
20110078055 Faribault et al. Mar 2011 A1
20110078173 Seligmann et al. Mar 2011 A1
20110090085 Belz Apr 2011 A1
20110090254 Carrino et al. Apr 2011 A1
20110093327 Fordyce, III et al. Apr 2011 A1
20110099046 Weiss et al. Apr 2011 A1
20110099133 Chang et al. Apr 2011 A1
20110117878 Barash et al. May 2011 A1
20110119100 Ruhl et al. May 2011 A1
20110125372 Ito May 2011 A1
20110137766 Rasmussen et al. Jun 2011 A1
20110153368 Pierre et al. Jun 2011 A1
20110153384 Horne et al. Jun 2011 A1
20110161096 Buehler et al. Jun 2011 A1
20110161409 Nair Jun 2011 A1
20110167105 Ramakrishnan et al. Jul 2011 A1
20110170799 Carrino et al. Jul 2011 A1
20110173032 Payne et al. Jul 2011 A1
20110173093 Psota et al. Jul 2011 A1
20110179048 Satlow Jul 2011 A1
20110185316 Reid et al. Jul 2011 A1
20110208565 Ross et al. Aug 2011 A1
20110208724 Jones Aug 2011 A1
20110213655 Henkin Sep 2011 A1
20110218934 Elser Sep 2011 A1
20110218955 Tang Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225198 Edwards et al. Sep 2011 A1
20110225482 Chan et al. Sep 2011 A1
20110238495 Kang Sep 2011 A1
20110238553 Raj et al. Sep 2011 A1
20110238690 Arrasvuori et al. Sep 2011 A1
20110251951 Kolkowitz Oct 2011 A1
20110258158 Resende et al. Oct 2011 A1
20110270604 Qi et al. Nov 2011 A1
20110270705 Parker Nov 2011 A1
20110270834 Sokolan et al. Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289407 Naik et al. Nov 2011 A1
20110289420 Morioka et al. Nov 2011 A1
20110291851 Whisenant Dec 2011 A1
20110295649 Fine Dec 2011 A1
20110310005 Chen et al. Dec 2011 A1
20110314007 Dassa et al. Dec 2011 A1
20110314024 Chang et al. Dec 2011 A1
20120004894 Butler Jan 2012 A1
20120011238 Rathod Jan 2012 A1
20120011245 Gillette et al. Jan 2012 A1
20120019559 Siler et al. Jan 2012 A1
20120022945 Falkenborg et al. Jan 2012 A1
20120036013 Neuhaus et al. Feb 2012 A1
20120036434 Oberstein Feb 2012 A1
20120050293 Carlhian et al. Mar 2012 A1
20120054284 Rakshit Mar 2012 A1
20120059853 Jagota Mar 2012 A1
20120066166 Curbera et al. Mar 2012 A1
20120066296 Appleton et al. Mar 2012 A1
20120072825 Sherkin et al. Mar 2012 A1
20120079363 Folting et al. Mar 2012 A1
20120084117 Tavares et al. Apr 2012 A1
20120084118 Bai et al. Apr 2012 A1
20120084184 Raleigh Apr 2012 A1
20120084287 Lakshminarayan et al. Apr 2012 A1
20120106801 Jackson May 2012 A1
20120117082 Koperda et al. May 2012 A1
20120123989 Yu et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120137235 TS et al. May 2012 A1
20120144325 Mital et al. Jun 2012 A1
20120144335 Abeln et al. Jun 2012 A1
20120158527 Cannelongo et al. Jun 2012 A1
20120159307 Chung et al. Jun 2012 A1
20120159362 Brown et al. Jun 2012 A1
20120159363 DeBacker et al. Jun 2012 A1
20120159399 Bastide et al. Jun 2012 A1
20120170847 Tsukidate Jul 2012 A1
20120173381 Smith Jul 2012 A1
20120173985 Peppel Jul 2012 A1
20120180002 Campbell et al. Jul 2012 A1
20120188252 Law Jul 2012 A1
20120196557 Reich et al. Aug 2012 A1
20120196558 Reich et al. Aug 2012 A1
20120197651 Robinson et al. Aug 2012 A1
20120197657 Prodanovic Aug 2012 A1
20120197660 Prodanovic Aug 2012 A1
20120203708 Psota et al. Aug 2012 A1
20120206469 Hulubei et al. Aug 2012 A1
20120208636 Feige Aug 2012 A1
20120215784 King et al. Aug 2012 A1
20120221511 Gibson et al. Aug 2012 A1
20120221553 Wittmer et al. Aug 2012 A1
20120221580 Barney Aug 2012 A1
20120226523 Weiss Sep 2012 A1
20120226590 Love et al. Sep 2012 A1
20120245976 Kumar et al. Sep 2012 A1
20120246148 Dror Sep 2012 A1
20120254129 Wheeler et al. Oct 2012 A1
20120284345 Costenaro et al. Nov 2012 A1
20120284670 Kashik et al. Nov 2012 A1
20120290879 Shibuya et al. Nov 2012 A1
20120296907 Long et al. Nov 2012 A1
20120311684 Paulsen et al. Dec 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120330973 Ghuneim et al. Dec 2012 A1
20130006426 Healey et al. Jan 2013 A1
20130006725 Simanek et al. Jan 2013 A1
20130006916 McBride et al. Jan 2013 A1
20130016106 Yip et al. Jan 2013 A1
20130018796 Kolhatkar et al. Jan 2013 A1
20130021445 Cossette-Pacheco et al. Jan 2013 A1
20130024268 Manickavelu Jan 2013 A1
20130046635 Grigg et al. Feb 2013 A1
20130046842 Muntz et al. Feb 2013 A1
20130054306 Bhalla Feb 2013 A1
20130057551 Ebert et al. Mar 2013 A1
20130060786 Serrano et al. Mar 2013 A1
20130061169 Pearcy et al. Mar 2013 A1
20130073377 Heath Mar 2013 A1
20130073454 Busch Mar 2013 A1
20130076732 Cervelli et al. Mar 2013 A1
20130078943 Biage et al. Mar 2013 A1
20130086482 Parsons Apr 2013 A1
20130096988 Grossman et al. Apr 2013 A1
20130097482 Marantz et al. Apr 2013 A1
20130100134 Cervelli et al. Apr 2013 A1
20130101159 Chao et al. Apr 2013 A1
20130110746 Ahn May 2013 A1
20130110822 Ikeda et al. May 2013 A1
20130110877 Bonham et al. May 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117651 Waldman et al. May 2013 A1
20130132398 Pfeifle May 2013 A1
20130150004 Rosen Jun 2013 A1
20130151148 Parundekar et al. Jun 2013 A1
20130151305 Akinola et al. Jun 2013 A1
20130151388 Falkenborg et al. Jun 2013 A1
20130151453 Bhanot et al. Jun 2013 A1
20130157234 Gulli et al. Jun 2013 A1
20130166348 Scotto Jun 2013 A1
20130166480 Popescu et al. Jun 2013 A1
20130166550 Buchmann et al. Jun 2013 A1
20130176321 Mitchell et al. Jul 2013 A1
20130179420 Park et al. Jul 2013 A1
20130185245 Anderson Jul 2013 A1
20130185307 El-Yaniv et al. Jul 2013 A1
20130224696 Wolfe et al. Aug 2013 A1
20130225212 Khan Aug 2013 A1
20130226318 Procyk Aug 2013 A1
20130226953 Markovich et al. Aug 2013 A1
20130232045 Tai et al. Sep 2013 A1
20130238616 Rose et al. Sep 2013 A1
20130246170 Gross et al. Sep 2013 A1
20130246537 Gaddala Sep 2013 A1
20130246597 Iizawa et al. Sep 2013 A1
20130251233 Yang et al. Sep 2013 A1
20130254900 Sathish et al. Sep 2013 A1
20130262527 Hunter et al. Oct 2013 A1
20130263019 Castellanos et al. Oct 2013 A1
20130267207 Hao et al. Oct 2013 A1
20130268520 Fisher et al. Oct 2013 A1
20130279757 Kephart Oct 2013 A1
20130282696 John et al. Oct 2013 A1
20130282723 Petersen et al. Oct 2013 A1
20130290011 Lynn et al. Oct 2013 A1
20130290825 Arndt et al. Oct 2013 A1
20130297619 Chandrasekaran et al. Nov 2013 A1
20130304770 Boero et al. Nov 2013 A1
20130311375 Priebatsch Nov 2013 A1
20130339891 Blumenberg et al. Dec 2013 A1
20140012796 Petersen et al. Jan 2014 A1
20140019936 Cohanoff Jan 2014 A1
20140032506 Hoey et al. Jan 2014 A1
20140033010 Richardt et al. Jan 2014 A1
20140033120 Bental et al. Jan 2014 A1
20140040371 Gurevich et al. Feb 2014 A1
20140047319 Eberlein Feb 2014 A1
20140047357 Alfaro et al. Feb 2014 A1
20140058914 Song et al. Feb 2014 A1
20140059038 McPherson et al. Feb 2014 A1
20140067611 Adachi et al. Mar 2014 A1
20140068487 Steiger et al. Mar 2014 A1
20140074855 Zhao et al. Mar 2014 A1
20140095273 Tang et al. Apr 2014 A1
20140095509 Patton Apr 2014 A1
20140108068 Williams Apr 2014 A1
20140108380 Gotz et al. Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140123279 Bishop et al. May 2014 A1
20140129261 Bothwell et al. May 2014 A1
20140129936 Richards et al. May 2014 A1
20140136285 Carvalho May 2014 A1
20140143009 Brice et al. May 2014 A1
20140149436 Bahrami et al. May 2014 A1
20140156527 Grigg et al. Jun 2014 A1
20140157172 Peery et al. Jun 2014 A1
20140164502 Khodorenko et al. Jun 2014 A1
20140176606 Narayan et al. Jun 2014 A1
20140189536 Lange et al. Jul 2014 A1
20140195515 Baker et al. Jul 2014 A1
20140195887 Ellis et al. Jul 2014 A1
20140208281 Ming Jul 2014 A1
20140214579 Shen et al. Jul 2014 A1
20140218400 O'Toole et al. Aug 2014 A1
20140222521 Chait Aug 2014 A1
20140222793 Sadkin et al. Aug 2014 A1
20140229554 Grunin et al. Aug 2014 A1
20140244284 Smith Aug 2014 A1
20140244388 Manouchehri et al. Aug 2014 A1
20140258246 Lo Faro et al. Sep 2014 A1
20140267294 Ma Sep 2014 A1
20140267295 Sharma Sep 2014 A1
20140279824 Tamayo Sep 2014 A1
20140310266 Greenfield Oct 2014 A1
20140316911 Gross Oct 2014 A1
20140333651 Cervelli et al. Nov 2014 A1
20140337772 Cervelli et al. Nov 2014 A1
20140344230 Krause et al. Nov 2014 A1
20140351070 Christner et al. Nov 2014 A1
20140358829 Hurwitz Dec 2014 A1
20140361899 Layson Dec 2014 A1
20140365965 Bray et al. Dec 2014 A1
20140366132 Stiansen et al. Dec 2014 A1
20150019394 Unser et al. Jan 2015 A1
20150026622 Roaldson et al. Jan 2015 A1
20150029176 Baxter et al. Jan 2015 A1
20150046870 Goldenberg et al. Feb 2015 A1
20150073929 Psota et al. Mar 2015 A1
20150073954 Braff Mar 2015 A1
20150089353 Folkening Mar 2015 A1
20150089424 Duffield et al. Mar 2015 A1
20150095773 Gonsalves et al. Apr 2015 A1
20150100897 Sun et al. Apr 2015 A1
20150100907 Erenrich et al. Apr 2015 A1
20150106170 Bonica Apr 2015 A1
20150106379 Elliot et al. Apr 2015 A1
20150134666 Gattiker et al. May 2015 A1
20150135256 Hoy et al. May 2015 A1
20150169709 Kara et al. Jun 2015 A1
20150169726 Kara et al. Jun 2015 A1
20150170077 Kara et al. Jun 2015 A1
20150178825 Huerta Jun 2015 A1
20150178877 Bogomolov et al. Jun 2015 A1
20150186483 Tappan et al. Jul 2015 A1
20150186821 Wang et al. Jul 2015 A1
20150187036 Wang et al. Jul 2015 A1
20150187100 Berry et al. Jul 2015 A1
20150188872 White Jul 2015 A1
20150212663 Papale et al. Jul 2015 A1
20150227295 Meiklejohn et al. Aug 2015 A1
20150254220 Burr et al. Sep 2015 A1
20150309719 Ma et al. Oct 2015 A1
20150312323 Peterson Oct 2015 A1
20150317342 Grossman et al. Nov 2015 A1
20150324868 Kaftan et al. Nov 2015 A1
20150338233 Cervelli et al. Nov 2015 A1
20150379413 Robertson et al. Dec 2015 A1
20160004764 Chakerian et al. Jan 2016 A1
20160026923 Erenrich et al. Jan 2016 A1
20160055501 Mukherjee et al. Feb 2016 A1
20160062555 Ward et al. Mar 2016 A1
20170052654 Cervelli et al. Feb 2017 A1
20170052655 Cervelli et al. Feb 2017 A1
20170052747 Cervelli et al. Feb 2017 A1
Foreign Referenced Citations (70)
Number Date Country
2012216622 May 2015 AU
2013251186 Nov 2015 AU
102546446 Jul 2012 CN
103167093 Jun 2013 CN
102054015 May 2014 CN
102014103482 Sep 2014 DE
102014204827 Sep 2014 DE
102014204830 Sep 2014 DE
102014204834 Sep 2014 DE
102013222023 Jan 2015 DE
102014215621 Feb 2015 DE
0763201 Mar 1997 EP
1672527 Jun 2006 EP
2487610 Aug 2012 EP
2551799 Jan 2013 EP
2560134 Feb 2013 EP
2575107 Apr 2013 EP
2778977 Sep 2014 EP
2835745 Feb 2015 EP
2835770 Feb 2015 EP
2838039 Feb 2015 EP
2846241 Mar 2015 EP
2851852 Mar 2015 EP
2858014 Apr 2015 EP
2858018 Apr 2015 EP
2863326 Apr 2015 EP
2863346 Apr 2015 EP
2869211 May 2015 EP
2881868 Jun 2015 EP
2884439 Jun 2015 EP
2884440 Jun 2015 EP
2889814 Jul 2015 EP
2891992 Jul 2015 EP
2892197 Jul 2015 EP
2911078 Aug 2015 EP
2911100 Aug 2015 EP
2940603 Nov 2015 EP
2940609 Nov 2015 EP
2963595 Jan 2016 EP
2988258 Feb 2016 EP
2993595 Mar 2016 EP
3070622 Sep 2016 EP
3133510 Feb 2017 EP
3139333 Mar 2017 EP
2516155 Jan 2015 GB
2518745 Apr 2015 GB
2012778 Nov 2014 NL
2013306 Feb 2015 NL
624557 Dec 2014 NZ
WO 9532424 Nov 1995 WO
WO 0009529 Feb 2000 WO
WO 01025906 Apr 2001 WO
WO 0188750 Nov 2001 WO
WO 0198925 Dec 2001 WO
WO 02065353 Aug 2002 WO
WO 2004057268 Jul 2004 WO
WO 2005013200 Feb 2005 WO
WO 2005104736 Nov 2005 WO
WO 2005116851 Dec 2005 WO
WO 2007133206 Nov 2007 WO
WO 2008064207 May 2008 WO
WO 2009061501 May 2009 WO
WO 2009123975 Oct 2009 WO
WO 2010000014 Jan 2010 WO
WO 2010030913 Mar 2010 WO
WO 2010030914 Mar 2010 WO
WO 2011058507 May 2011 WO
WO 2012119008 Sep 2012 WO
WO 2013010157 Jan 2013 WO
WO 2013102892 Jul 2013 WO
Non-Patent Literature Citations (404)
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, pp. 2.
Boyce, Jim, "Microsoft Outlook 2010 Inside Out," Aug. 1, 2010, retrieved from the internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf.
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10.
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80.
Conner, Nancy, “Google Apps: The Missing Manual,” May 1, 2008, pp. 15.
Definition “Identify” downloaded Jan. 22, 2015, 1 page.
Definition “Overlay” downloaded Jan. 22, 2015, 1 page.
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” BioInformatics, vol. 23, No. 6, 2007, pp. 673-679.
Gorr et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation,” Grant 98-IJ-CX-K005, May 6, 2002, 37 pages.
Goswami, Gautam, “Quite Writely Said!,” One Brick at a Time, Aug. 21, 2005, pp. 7.
Hansen et al. “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010.
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News on Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages.
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95.
“HunchLab: Heat Map and Kernel Density Calculation for Crime Analysis,” Azavea Journal, printed from www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab/ on Sep. 9, 2014, 2 pages.
Kahan et al., “Annotea: an open RDF infrastructure for shared WEB annotations”, Computer Networks 39, pp. 589-608, 2002.
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21.
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607.
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15.
Mizrachi, Ilene, "Chapter 1: GenBank: The Nucleotide Sequence Database," The NCBI Handbook, Oct. 2002, pp. 1-14.
Nierman, “Evaluating Structural Similarity in XML Documents,” 2002, 6 pages.
Olanoff, Drew, "Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility," May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http://techcrunch.com/2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/.
Palmas et al., "An Edge-Bundling Layout for Interactive Parallel Coordinates," 2014 IEEE Pacific Visualization Symposium, pp. 57-64.
Sigrist, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166.
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11.
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29.
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18.
Valentini et al., “Ensembles of Learning Machines,” M. Marinaro and R. Tagliaferri (Eds.): WIRN VIETRI 2002, LNCS 2486, pp. 3-20.
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221.
Wongsuphasawat et al., “Visual Analytics for Transportation Incident Data Sets,” Transportation Research Record 2138, 2009, pp. 135-145.
Yang et al., “HTML Page Analysis Based on Visual Cues,” 2001, pp. 859-864.
Notice of Allowance for U.S. Appl. No. 14/102,394 dated Aug. 25, 2014.
Notice of Allowance for U.S. Appl. No. 14/108,187 dated Aug. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/135,289 dated Oct. 14, 2014.
Notice of Allowance for U.S. Appl. No. 14/268,964 dated Dec. 3, 2014.
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Jan. 5, 2015.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Dec. 16, 2014.
Notice of Allowance for U.S. Appl. No. 14/294,098 dated Dec. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/616,080 dated Apr. 2, 2015.
Notice of Allowance for U.S. Appl. No. 14/486,991 dated May 1, 2015.
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/504,103 dated May 18, 2015.
Notice of Allowance for U.S. Appl. No. 13/948,859 dated Dec. 10, 2014.
Notice of Acceptance for Australian Patent Application No. 2012216622 dated Jan. 6, 2015.
Official Communication for New Zealand Patent Application No. 628585 dated Aug. 26, 2014.
Official Communication for European Patent Application No. 14158861.6 dated Jun. 16, 2014.
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 628263 dated Aug. 12, 2014.
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014.
Official Communication for New Zealand Patent Application No. 627962 dated Aug. 5, 2014.
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014.
Official Communication for New Zealand Patent Application No. 628840 dated Aug. 28, 2014.
Official Communication in New Zealand Patent Application No. 628495 dated Aug. 19, 2014.
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014.
Official Communication for New Zealand Patent Application No. 622513 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014.
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014.
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014.
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015.
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015.
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015.
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015.
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015.
Official Communication for European Patent Application No. 14189347.9 dated Mar. 4, 2015.
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015.
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015.
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015.
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014.
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014.
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014.
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014.
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015.
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015.
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Mar. 4, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Apr. 2, 2015.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 26, 2015.
Official Communication for U.S. Appl. No. 12/840,673 dated Sep. 17, 2014.
Official Communication for U.S. Appl. No. 12/840,673 dated Jan. 2, 2015.
Official Communication for U.S. Appl. No. 14/319,161 dated Jan. 23, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Jan. 27, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Mar. 17, 2015.
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286.
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf downloaded May 12, 2014 in 8 pages.
Keylines.com, “KeyLines Datasheet,” Mar. 2014, http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf downloaded May 12, 2014 in 2 pages.
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf downloaded May 12, 2014 in 10 pages.
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10.
Rouse, Margaret, “OLAP Cube,” http://searchdatamanagement.techtarget.com/definition/OLAP-cube, Apr. 28, 2012, pp. 16.
Official Communication in Great Britain Application No. 1319225.7 dated May 2, 2014.
Official Communication in New Zealand Application No. 624557 dated May 14, 2014.
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, pp. 30.
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316.
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/Desktop/javase6/modality/ Jan. 21, 2006, pp. 8.
“Andy Turner's GISRUK 2012 Notes” https://docs.google.com/document/d/1cTmxg7mVx5gd89lqblCYvCEnHA4QAivH4I4WpyPsqE4/edit?pli=1 printed Sep. 16, 2013 in 15 pages.
Barnes et al., "Viewshed Analysis", GIS-ARC/INFO 2001, www.evsc.virginia.edu/~jhp7e/evsc466/student_pres/Rounds.pdf.
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages.
Carver et al., “Real-Time Visibility Analysis and Rapid Viewshed Calculation Using a Voxel-Based Modelling Approach,” GISRUK 2012 Conference, Apr. 11-13, Lancaster UK, Apr. 13, 2012, pp. 6.
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine, Jul. 2, 2005 in 10 pages, http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411.
Ghosh, P., “A Solution of Polygon Containment, Spatial Planning, and Other Related Problems Using Minkowski Operations,” Computer Vision, Graphics, and Image Processing, 1990, vol. 49, pp. 1-35.
GIS-NET 3 Public—Department of Regional Planning. Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html.
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144.
Haralick et al., “Image Analysis Using Mathematical Morphology,” Pattern Analysis and Machine Intelligence, IEEE Transactions, Jul. 1987, vol. PAMI-9, No. 4, pp. 532-550.
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context Mar. 18, 2011, pp. 16.
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33.
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36.
Ipbucker, C., “Inverse Transformation for Several Pseudo-cylindrical Map Projections Using Jacobian Matrix,” ICCSA 2009, Part 1 LNCS 5592, pp. 553-564.
Levine, N., “Crime Mapping and the Crimestat Program,” Geographical Analysis, 2006, vol. 38, pp. 41-56.
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8.
Mandagere, Nagapramod, "Buffer Operations in GIS," http://www-users.cs.umn.edu/~npramod/enc_pdf.pdf retrieved Jan. 28, 2010, pp. 7.
Manske, "File Saving Dialogs," http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html, Jan. 20, 1999, pp. 7.
Map Builder, “Rapid Mashup Development Tool for Google and Yahoo Maps!” http://web.archive.org/web/20090626224734/http://www.mapbuilder.net/ printed Jul. 20, 2012 in 2 pages.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com.
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, http://msdn.microsoft.com/en-us/library/ff604039%28v=office.14%29.aspx as printed Apr. 4, 2014 in 17 pages.
Microsoft Office—Visio, “About connecting shapes,” http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx printed Aug. 4, 2011 in 6 pages.
Microsoft Office—Visio, "Add and glue connectors with the Connector tool," http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1 printed Aug. 4, 2011 in 1 page.
Murray, C., Oracle Spatial Developer's Guide—6 Coordinate Systems (Spatial Reference Systems), http://docs.oracle.com/cd/B28359_01/appdev.111/b28400.pdf, Jun. 2009.
Open Street Map, “Amm's Diary:Unconnected ways and other data quality issues,” http://www.openstreetmap.org/user/amm/diary printed Jul. 23, 2012 in 3 pages.
POI Editor, “How to: Create Your Own Points of Interest,” http://www.poieditor.com/articles/how_to_create_your_own_points_of_interest/ printed Jul. 22, 2012 in 4 pages.
Pozzi et al., “Vegetation and Population Density in Urban and Suburban Areas in the U.S.A.” Third International Symposium of Remote Sensing of Urban Areas Istanbul, Turkey, Jun. 2002, pp. 8.
Qiu, Fang, “3d Analysis and Surface Modeling”, http://web.archive.org/web/20091202221925/http://www.utsa.edu/lrsg/Teaching/EES6513/08-3D.pdf printed Sep. 16, 2013 in 26 pages.
Reddy et al., "Under the hood of GeoVRML 1.0," SRI International, Proceedings of the fifth symposium on Virtual Reality Modeling Language (Web3D-VRML), New York, NY, Feb. 2000, pp. 23-28. http://pdf.aminer.org/000/648/038/under_the_hood_of_geovrml.pdf.
Reibel et al., “Areal Interpolation of Population Counts Using Pre-classified Land Cover Data,” Population Research and Policy Review, 2007, vol. 26, pp. 619-633.
Reibel, M., “Geographic Information Systems and Spatial Data Processing in Demography: a Review,” Population Research and Policy Review, 2007, vol. 26, pp. 601-618.
Rizzardi et al., “Interfacing U.S. Census Map Files with Statistical Graphics Software: Application and Use in Epidemiology,” Statistics in Medicine, Oct. 1993, vol. 12, No. 19-20, pp. 1953-1964.
Snyder, “Map Projections—A Working Manual,” U.S. Geological Survey Professional paper 1395, United States Government Printing Office, Washington: 1987, pp. 11-21 and 60-70.
Sonris, “Using the Area of Interest Tools,” http://web.archive.org/web/20061001053327/http://sonris-www.dnr.state.la.us/gis/instruct_files/tutslide12 printed Jan. 3, 2013 in 1 page.
Tangelder et al., “Freeform Shape Matching Using Minkowski Operations,” The Netherlands, Jun. 1996, pp. 12.
VB Forums, “Buffer A Polygon,” Internet Citation, http://www.vbforums.com/showthread.php?198436-Buffer-a-Polygon, Specifically Thread #1, #5 & #11 retrieved on May 2, 2013, pp. 8.
Vivid Solutions, “JTS Topology Suite: Technical Specifications,” http://www.vividsolutions.com/jts/bin/JTS%20Technical%20Specs.pdf Version 1.4, 2003, pp. 36.
Wikipedia, “Douglas-Peucker-Algorithms,” http://de.wikipedia.org/w/index.php?title=Douglas-Peucker-Algorithmus&oldid=91846042 printed Jul. 2011, pp. 2.
Wikipedia, “Ramer-Douglas-Peucker Algorithm,” http://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm printed Jul. 2011, pp. 3.
Woodbridge, Stephen, “[geos-devel] Polygon simplification,” http://lists.osgeo.org/pipermail/geos-devel/2011-May/005210.html dated May 8, 2011, pp. 3.
International Search Report and Written Opinion in Application No. PCT/US2009/056703, dated Mar. 15, 2010.
Official Communication in Australian Application No. AU2010257305, dated Apr. 12, 2011.
Official Communication in Australian Application No. AU2010257305, dated Sep. 22, 2011.
European Search Report in European Application No. EP10195798.3, dated May 17, 2011.
Official Communication in Australian Application No. AU2010227081, dated Mar. 18, 2011.
European Search Report in European Application No. EP12186236.1, dated May 17, 2013.
Official Communication in New Zealand Application No. 616167 dated Oct. 10, 2013.
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/.
Abbey, Kristen, “Review of Google Docs,” May 1, 2007, pp. 2.
About 80 Minutes, “Palantir in a Number of Parts—Part 6—Graph,” Mar. 21, 2013, pp. 1-6.
Adams et al., “Worklets: A Service-Oriented Implementation of Dynamic Flexibility in Workflows,” R. Meersman, Z. Tari et al. (Eds.): OTM 2006, LNCS, 4275, pp. 291-308, 2006.
Alur et al., “Chapter 2: IBM InfoSphere DataStage Stages,” IBM InfoSphere DataStage Data Flow and Job Design, Jul. 1, 2008, pp. 35-137.
Amnet, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html.
Appacts, “Smart Thinking for Super Apps,” <http://www.appacts.com> Printed Jul. 18, 2013 in 4 pages.
Apsalar, “Data Powered Mobile Advertising,” “Free Mobile App Analytics” and various analytics related screen shots <http://apsalar.com> Printed Jul. 18, 2013 in 8 pages.
Capptain—Pilot Your Apps, <http://www.capptain.com> Printed Jul. 18, 2013 in 6 pages.
Celik, Tantek, "CSS Basic User Interface Module Level 3 (CSS3 UI)," Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from the internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow on May 18, 2015.
Chaudhuri et al., “An Overview of Business Intelligence Technology,” Communications of the ACM, Aug. 2011, vol. 54, No. 8.
Cohn, et al., “Semi-supervised clustering with user feedback,” Constrained Clustering: Advances in Algorithms, Theory, and Applications 4.1 (2003): 17-32.
Countly Mobile Analytics, <http://count.ly/> Printed Jul. 18, 2013 in 9 pages.
Distimo—App Analytics, <http://www.distimo.com/app-analytics> Printed Jul. 18, 2013 in 5 pages.
Flurry Analytics, <http://www.flurry.com/> Printed Jul. 18, 2013 in 14 pages.
Galliford, Miles, "SnagIt Versus Free Screen Capture Software: Critical Tools for Website Owners," <http://www.subhub.com/articles/free-screen-capture-software>, Mar. 27, 2008, pp. 11.
Gesher, Ari, “Palantir Screenshots in the Wild: Swing Sightings,” The Palantir Blog, Sep. 11, 2007, pp. 1-12.
Google Analytics Official Website—Web Analytics & Reporting, <http://www.google.com/analytics/index.html> Printed Jul. 18, 2013 in 22 pages.
“GrabUp—What a Timesaver!” <http://atlchris.com/191/grabup/>, Aug. 11, 2008, pp. 3.
Gu et al., “Record Linkage: Current Practice and Future Directions,” Jan. 15, 2004, pp. 32.
Hua et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, pp. 277-288, 2006.
Huang et al., “Systematic and Integrative Analysis of Large Gene Lists Using DAVID Bioinformatics Resources,” Nature Protocols, 4.1, 2008, 44-57.
JetScreenshot.com, “Share Screenshots via Internet in Seconds,” <http://web.archive.org/web/20130807164204/http://www.jetscreenshot.com/>, Aug. 7, 2013, pp. 1.
Kontagent Mobile Analytics, <http://www.kontagent.com/> Printed Jul. 18, 2013 in 9 pages.
Kwout, <http://web.archive.org/web/20080905132448/http://www.kwout.com/> Sep. 5, 2008, pp. 2.
Localytics—Mobile App Marketing & Analytics, <http://www.localytics.com/> Printed Jul. 18, 2013 in 12 pages.
Microsoft Windows, “Microsoft Windows Version 2002 Print Out 2,” 2002, pp. 1-6.
Microsoft, “Registering an Application to a URI Scheme,” <http://msdn.microsoft.com/en-us/library/aa767914.aspx>, printed Apr. 4, 2009 in 4 pages.
Microsoft, “Using the Clipboard,” <http://msdn.microsoft.com/en-us/library/ms649016.aspx>, printed Jun. 8, 2009 in 20 pages.
Mixpanel—Mobile Analytics, <https://mixpanel.com/> Printed Jul. 18, 2013 in 13 pages.
“Money Laundering Risks and E-Gaming: A European Overview and Assessment,” 2009, http://www.cf.ac.uk/socsi/resources/Levi_Final_Money_Laundering_Risks_egaming.pdf.
Nitro, “Trick: How to Capture a Screenshot as PDF, Annotate, Then Share It,” <http://blog.nitropdf.com/2008/03/04/trick-how-to-capture-a-screenshot-as-pdf-annotate-it-then-share/>, Mar. 4, 2008, pp. 2.
Nolan et al., “MCARTA: A Malicious Code Automated Run-Time Analysis Framework,” Homeland Security, 2012 IEEE Conference on Technologies for, Nov. 13, 2012, pp. 13-17.
Online Tech Tips, “Clip2Net—Share files, folders and screenshots easily,” <http://www.online-tech-tips.com/free-software-downloads/share-files-folders-screenshots/>, Apr. 2, 2008, pp. 5.
Open Web Analytics (OWA), <http://www.openwebanalytics.com/> Printed Jul. 19, 2013 in 5 pages.
O'Reilly.com, http://oreilly.com/digitalmedia/2006/01/01/mac-os-x-screenshot-secrets.html published Jan. 1, 2006 in 10 pages.
Palantir Technologies, “Palantir Labs_Timeline,” Oct. 1, 2010, retrieved from the internet https://www.youtube.com/watch?v=JCgDW5bru9M.
Perdisci et al., “Behavioral Clustering of HTTP-Based Malware and Signature Generation Using Malicious Network Traces,” USENIX, Mar. 18, 2010, pp. 1-14.
Piwik—Free Web Analytics Software. <http://piwik.org/> Printed Jul. 19, 2013 in 18 pages.
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf.
Quest, “Toad for ORACLE 11.6—Guide to Using Toad,” Sep. 24, 2012, pp. 1-162.
"Refresh CSS Ellipsis When Resizing Container—Stack Overflow," Jul. 31, 2013, retrieved from the internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container on May 18, 2015.
Schroder, Stan, “15 Ways to Create Website Screenshots,” <http://mashable.com/2007/08/24/web-screenshots/>, Aug. 24, 2007, pp. 2.
Shi et al., “A Scalable Implementation of Malware Detection Based on Network Connection Behaviors,” 2013 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery, IEEE, Oct. 10, 2013, pp. 59-66.
SnagIt, “SnagIt 8.1.0 Print Out 2,” Software release date Jun. 15, 2006, pp. 1-3.
SnagIt, “SnagIt 8.1.0 Print Out,” Software release date Jun. 15, 2006, pp. 6.
SnagIt, "SnagIt Online Help Guide," <http://download.techsmith.com/snagit/docs/onlinehelp/enu/snagit_help.pdf>, TechSmith Corp., Version 8.1, printed Feb. 7, 2007, pp. 284.
StatCounter—Free Invisible Web Tracker, Hit Counter and Web Stats, <http://statcounter.com/> Printed Jul. 19, 2013 in 17 pages.
Symantec Corporation, “E-Security Begins with Sound Security Policies,” Announcement Symantec, Jun. 14, 2001.
TestFlight—Beta Testing on the Fly, <http://testflightapp.com/> Printed Jul. 18, 2013 in 3 pages.
Thompson, Mick, "Getting Started with GEO," Jul. 26, 2011.
trak.io, <http://trak.io/> printed Jul. 18, 2013 in 3 pages.
UserMetrix, <http://usermetrix.com/android-analytics> printed Jul. 18, 2013 in 3 pages.
“Using Whois Based Geolocation and Google Maps API for Support Cybercrime Investigations,” http://wseas.us/e-library/conferences/2013/Dubrovnik/TELECIRC/TELECIRC-32.pdf.
Vose et al., “Help File for ModelRisk Version 5,” 2007, Vose Software, pp. 349-353. [Uploaded in 2 Parts].
Wang et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter,” IEEE 2010, 5 pages.
Warren, Christina, “TUAW Faceoff: Screenshot apps on the firing line,” <http://www.tuaw.com/2008/05/05/tuaw-faceoff-screenshot-apps-on-the-firing-line/>, May 5, 2008, pp. 11.
Wikipedia, “Multimap,” Jan. 1, 2013, https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748.
Wright et al., “Palantir Technologies VAST 2010 Challenge Text Records_Investigations into Arms Dealing,” Oct. 29, 2010, pp. 1-10.
Notice of Acceptance for Australian Patent Application No. 2013251186 dated Nov. 6, 2015.
Notice of Acceptance for Australian Patent Application No. 2014250678 dated Oct. 7, 2015.
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Nov. 2, 2015.
Notice of Allowance for U.S. Appl. No. 12/840,673 dated Apr. 6, 2015.
Notice of Allowance for U.S. Appl. No. 13/728,879 dated Jun. 21, 2016.
Notice of Allowance for U.S. Appl. No. 14/148,568 dated Aug. 26, 2015.
Notice of Allowance for U.S. Appl. No. 14/265,637 dated Feb. 13, 2015.
Notice of Allowance for U.S. Appl. No. 14/319,161 dated May 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/323,935 dated Oct. 1, 2015.
Notice of Allowance for U.S. Appl. No. 14/326,738 dated Nov. 18, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,552 dated Jul. 24, 2015.
Notice of Allowance for U.S. Appl. No. 14/479,863 dated Mar. 31, 2015.
Notice of Allowance for U.S. Appl. No. 14/552,336 dated Nov. 3, 2015.
Notice of Allowance for U.S. Appl. No. 14/730,123 dated Apr. 12, 2016.
Notice of Allowance for U.S. Appl. No. 14/746,671 dated Jan. 21, 2016.
Notice of Allowance for U.S. Appl. No. 14/934,004 dated Nov. 4, 2016.
Official Communication for Australian Patent Application No. 2013251186 dated Mar. 12, 2015.
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015.
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015.
Official Communication for Canadian Patent Application No. 2831660 dated Jun. 9, 2015.
Official Communication for European Patent Application No. 12181585.6 dated Sep. 4, 2015.
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015.
Official Communication for European Patent Application No. 14187739.9 dated Jul. 6, 2015.
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015.
Official Communication for European Patent Application No. 14197938.5 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14200246.8 dated May 29, 2015.
Official Communication for European Patent Application No. 14200298.9 dated May 13, 2015.
Official Communication for European Patent Application No. 15155845.9 dated Oct. 6, 2015.
Official Communication for European Patent Application No. 15155846.7 dated Jul. 8, 2015.
Official Communication for European Patent Application No. 15165244.3 dated Aug. 27, 2015.
Official Communication for European Patent Application No. 15175106.2 dated Nov. 5, 2015.
Official Communication for European Patent Application No. 15175151.8 dated Nov. 25, 2015.
Official Communication for European Patent Application No. 15181419.1 dated Sep. 29, 2015.
Official Communication for European Patent Application No. 15183721.8 dated Nov. 23, 2015.
Official Communication for European Patent Application No. 15184764.7 dated Dec. 14, 2015.
Official Communication for European Patent Application No. 15188106.7 dated Feb. 3, 2016.
Official Communication for European Patent Application No. 15190307.7 dated Feb. 19, 2016.
Official Communication for Great Britain Patent Application No. 1404486.1 dated May 21, 2015.
Official Communication for Great Britain Patent Application No. 1404486.1 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404489.5 dated May 21, 2015.
Official Communication for Great Britain Patent Application No. 1404489.5 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404499.4 dated Jun. 11, 2015.
Official Communication for Great Britain Patent Application No. 1404499.4 dated Aug. 20, 2014.
Official Communication for Netherlands Patent Application No. 2011632 dated Feb. 8, 2016.
Official Communication for Netherlands Patent Application No. 2011729 dated Aug. 13, 2015.
Official Communication for Netherlands Patent Application No. 2012417 dated Sep. 18, 2015.
Official Communication for Netherlands Patent Application No. 2012421 dated Sep. 18, 2015.
Official Communication for Netherlands Patent Application No. 2012437 dated Sep. 18, 2015.
Official Communication for Netherlands Patent Application No. 2012438 dated Sep. 21, 2015.
Official Communication for Netherlands Patent Application No. 2012778 dated Sep. 22, 2015.
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015.
Official Communication for New Zealand Patent Application No. 622473 dated Jun. 19, 2014.
Official Communication for New Zealand Patent Application No. 622473 dated Mar. 27, 2014.
Official Communication for U.S. Appl. No. 12/556,318 dated Jul. 2, 2015.
Official Communication for U.S. Appl. No. 12/556,321 dated Feb. 25, 2016.
Official Communication for U.S. Appl. No. 12/556,321 dated Jun. 6, 2012.
Official Communication for U.S. Appl. No. 12/556,321 dated Dec. 7, 2011.
Official Communication for U.S. Appl. No. 12/556,321 dated Jul. 7, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Sep. 22, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated Aug. 26, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated May 6, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Nov. 20, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Dec. 1, 2014.
Official Communication for U.S. Appl. No. 13/827,491 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Oct. 9, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Aug. 6, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 17, 2015.
Official Communication for U.S. Appl. No. 13/839,026 dated Aug. 4, 2015.
Official Communication for U.S. Appl. No. 14/134,558 dated Oct. 7, 2015.
Official Communication for U.S. Appl. No. 14/141,252 dated Oct. 8, 2015.
Official Communication for U.S. Appl. No. 14/196,814 dated May 5, 2015.
Official Communication for U.S. Appl. No. 14/222,364 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 2, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Dec. 21, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Jan. 4, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015.
Official Communication for U.S. Appl. No. 14/265,637 dated Sep. 26, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated May 9, 2016.
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Sep. 4, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 14, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Mar. 4, 2016.
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Nov. 16, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015.
Official Communication for U.S. Appl. No. 14/323,878 dated Mar. 30, 2017.
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015.
Official Communication for U.S. Appl. No. 14/451,221 dated Oct. 21, 2014.
Official Communication for U.S. Appl. No. 14/463,615 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/463,615 dated May 12, 2016.
Official Communication for U.S. Appl. No. 14/463,615 dated Nov. 13, 2014.
Official Communication for U.S. Appl. No. 14/463,615 dated Mar. 21, 2016.
Official Communication for U.S. Appl. No. 14/463,615 dated May 21, 2015.
Official Communication for U.S. Appl. No. 14/463,615 dated Jan. 28, 2015.
Official Communication for U.S. Appl. No. 14/463,615 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/479,863 dated Dec. 26, 2014.
Official Communication for U.S. Appl. No. 14/483,527 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/483,527 dated Jan. 28, 2015.
Official Communication for U.S. Appl. No. 14/483,527 dated Oct. 28, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Aug. 18, 2015.
Official Communication for U.S. Appl. No. 14/552,336 dated Jul. 20, 2015.
Official Communication for U.S. Appl. No. 14/562,524 dated Nov. 10, 2015.
Official Communication for U.S. Appl. No. 14/562,524 dated Sep. 14, 2015.
Official Communication for U.S. Appl. No. 14/562,524 dated Feb. 18, 2016.
Official Communication for U.S. Appl. No. 14/571,098 dated Nov. 10, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 24, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/631,633 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Oct. 16, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015.
Official Communication for U.S. Appl. No. 14/676,621 dated Oct. 29, 2015.
Official Communication for U.S. Appl. No. 14/676,621 dated Jul. 30, 2015.
Official Communication for U.S. Appl. No. 14/715,834 dated Feb. 29, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/730,123 dated Sep. 21, 2015.
Official Communication for U.S. Appl. No. 14/746,671 dated Nov. 12, 2015.
Official Communication for U.S. Appl. No. 14/800,447 dated Dec. 10, 2015.
Official Communication for U.S. Appl. No. 14/800,447 dated Mar. 3, 2016.
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015.
Official Communication for U.S. Appl. No. 14/841,338 dated Feb. 18, 2016.
Official Communication for U.S. Appl. No. 14/842,734 dated Jun. 1, 2017.
Official Communication for U.S. Appl. No. 14/842,734 dated Nov. 19, 2015.
Official Communication for U.S. Appl. No. 14/871,465 dated Feb. 9, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/929,584 dated May 25, 2016.
Official Communication for U.S. Appl. No. 14/929,584 dated Feb. 4, 2016.
Official Communication for U.S. Appl. No. 14/934,004 dated Feb. 16, 2016.
Official Communication for U.S. Appl. No. 14/934,004 dated Jul. 29, 2016.
Official Communication for U.S. Appl. No. 15/072,133 dated Nov. 10, 2016.
Official Communication for U.S. Appl. No. 15/072,133 dated Mar. 17, 2017.
Restriction Requirement for U.S. Appl. No. 13/839,026 dated Apr. 2, 2015.
“What Was It Again? Ways to Make Feature Tile Layers Interactive,” WordPress.com, published Jun. 12, 2011, retrieved from https://whatwasitagain.wordpress.com/2011/06/12/interactive-feature-tile-layers/.
Ask Drexel University Knowledge Base, “How to: Auto Save a Document Before Printing in Word 2007,” published Nov. 13, 2007.
Harville et al., “Mediabeads: An Architecture for Path-Enhanced Media Applications,” 2004 IEEE International Conference on Multimedia and Expo, Jun. 27-30, 2004, Taipei, Taiwan, vol. 1, pp. 455-458.
MacWright, Tom, "Announcing MapBox.js 1.0 with Leaflet," Mapbox.com blog, Apr. 18, 2013, retrieved from https://www.mapbox.com/blog/mapbox-js-with-leaflet/.
Palantir, “Basic Map Searches,” YouTube, Sep. 12, 2013, retrieved from https://www.youtube.com/watch?v=UC-1x44xFR0.
Palantir, “Intelligence Integration in Palantir: An Open-Source View of the Afghan Conflict,” YouTube, Jul. 5, 2012, retrieved from https://www.youtube.com/watch?v=FXTxs2YqHY4.
Notice of Allowance for U.S. Appl. No. 14/319,765 dated Nov. 25, 2016.
Notice of Allowance for U.S. Appl. No. 15/072,133 dated Jun. 23, 2017.
Official Communication for Australian Patent Application No. 2012216622 dated Jan. 6, 2015.
Official Communication for European Patent Application No. 16160781.7 dated May 27, 2016.
Official Communication for European Patent Application No. 16184373.5 dated Jan. 17, 2017.
Official Communication for European Patent Application No. 16186622.3 dated Jan. 18, 2017.
Official Communication for Great Britain Patent Application No. 1620827.4 dated Jun. 28, 2017.
Official Communication for Great Britain Patent Application No. 1620827.4 dated Jan. 12, 2017.
Official Communication for U.S. Appl. No. 12/840,673 dated Jul. 25, 2012.
Official Communication for U.S. Appl. No. 12/840,673 dated Jan. 4, 2013.
Official Communication for U.S. Appl. No. 14/289,596 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/323,878 dated Jul. 27, 2017.
Official Communication for U.S. Appl. No. 14/490,612 dated Jan. 27, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Feb. 23, 2016.
Official Communication for U.S. Appl. No. 14/715,834 dated Feb. 19, 2016.
Official Communication for U.S. Appl. No. 14/746,671 dated Sep. 28, 2015.
Related Publications (1)
US 2014/0337772 A1, published Nov. 2014 (US)

Provisional Applications (1)
U.S. Provisional Appl. No. 61/820,608, filed May 2013 (US)

Continuations (1)
Parent: U.S. Appl. No. 13/917,571, filed Jun. 2013 (US)
Child: U.S. Appl. No. 14/323,881 (US)