The present disclosure relates to systems and techniques for geographical data integration, analysis, and visualization. More specifically, the present disclosure relates to interactive maps including data objects.
Interactive geographical maps, such as web-based mapping service applications and Geographical Information Systems (GIS), are available from a number of providers. Such maps generally comprise satellite images or generic base layers overlaid by roads. Users of such systems may generally search for and view locations of a small number of landmarks, and determine directions from one location to another. In some interactive graphical maps, 3D terrain and/or 3D buildings may be visible in the interface.
The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.
The systems, methods, and devices of the present disclosure may provide, among other features, high-performance, interactive geospatial and/or data object map capabilities in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. In various embodiments, an interactive geospatial map system (also referred to as an interactive data object map system) may enable rapid and deep analysis of various objects, features, and/or metadata by the user. In some embodiments, a layer ontology may be displayed to the user. In various embodiments, when the user rolls a selection cursor over an object an outline of the object is displayed. Selection of an object may cause display of metadata associated with that object. In various embodiments, the interactive data object map system may automatically generate object lists and/or histograms based on selections made by the user. Various aspects of the present disclosure may enable the user to perform geosearches, generate heatmaps, define and apply filters to displayed data, copy data objects between different interactive maps, import additional data objects from files, and/or perform keyword searches, among other actions.
It has been noted that design of computer user interfaces “that are useable and easily learned by humans is a non-trivial problem for software developers.” (Dillon, A. (2003) User Interface Design. MacMillan Encyclopedia of Cognitive Science, Vol. 4, London: MacMillan, 453-458.) The present disclosure describes various embodiments of interactive and dynamic user interfaces that are the result of significant development. This non-trivial development has resulted in the user interfaces described herein which may provide significant cognitive and ergonomic efficiencies and advantages over previous systems. The interactive and dynamic user interfaces include improved human-computer interactions that may provide reduced mental workloads, improved decision-making, reduced work stress, and/or the like, for a user. For example, user interaction with the interactive user interfaces described herein may provide an optimized display of maps, and may enable a user to more quickly and accurately access, navigate, assess, and digest the map and its associated object data than previous systems.
Further, the interactive and dynamic user interfaces described herein are enabled by innovations in efficient interactions between the user interfaces and underlying systems and components. For example, disclosed herein are improved methods of displaying geographic maps and data objects associated with the map using a plurality of layers, generating heatmaps on the map, transferring data objects between different maps or applications, and generating and manipulating filters for data objects displayed on the map. The interactions and presentation of data via the interactive user interfaces described herein may accordingly provide cognitive and ergonomic efficiencies and advantages over previous systems.
Various embodiments of the present disclosure provide improvements to various technologies and technological fields. For example, existing map display is limited in various ways, and various embodiments of the disclosure provide significant improvements over such technology. Additionally, various embodiments of the present disclosure are inextricably tied to computer technology. In particular, various embodiments rely on detection of user inputs via graphical user interfaces, generation of map tile layers based on those user inputs, generation of heatmaps based upon user-selected attributes or aggregations of user-selected attributes, generation and manipulation of filters based upon user-selected attributes, and/or the like. Such features and others are intimately tied to, and enabled by, computer technology, and would not exist except for computer technology. For example, the interactions with displayed data described below in reference to various embodiments cannot reasonably be performed by humans alone, without the computer technology upon which they are implemented. Further, the implementation of the various embodiments of the present disclosure via computer technology enables many of the advantages described herein, including more efficient interaction with, and presentation of, various types of electronic map and image data.
In an embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of objects, wherein each of the objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to generate user interface data for rendering an interactive user interface on a client computing device, the interactive user interface including an interactive map, wherein the interactive map includes a plurality of data objects, wherein the interactive map is comprised of a plurality of map tiles, each map tile comprising one or more tile layers. The one or more hardware processors may be further configured to receive a query from the client computing device, identify a map tile of the plurality of map tiles that is associated with the received query, determine a tile layer composition for the map tile based at least in part upon the received query, wherein the tile layer composition specifies a plurality of tile layers comprising at least a vector layer and an inactive layer, generate the plurality of tile layers, and provide the generated tile layers to the client computing device. In some embodiments, the plurality of tile layers further comprises a base layer and a selection layer.
In some embodiments, generating the plurality of tile layers comprises determining if one or more of the plurality of tile layers is stored in a cache. If a tile layer of the plurality of tile layers is determined not to be stored in the cache, the tile layer may be generated and stored in the cache.
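By way of non-limiting illustration, the cache-or-generate behavior described above might be sketched as follows in TypeScript. All identifiers (TileLayerSpec, getTileLayer, the cache key format, and the render callback) are illustrative assumptions and do not form part of the disclosed system:

```typescript
// Illustrative sketch: return a cached tile layer if present; otherwise
// generate it, store it in the cache, and return it.
interface TileLayerSpec {
  tileId: string;                                      // e.g., "zoom/x/y"
  layerType: "base" | "vector" | "inactive" | "selection";
  query: string;                                       // query that shaped this layer
}

const layerCache = new Map<string, Uint8Array>();      // cache key -> rendered image bytes

function cacheKey(spec: TileLayerSpec): string {
  return `${spec.tileId}:${spec.layerType}:${spec.query}`;
}

async function getTileLayer(
  spec: TileLayerSpec,
  render: (spec: TileLayerSpec) => Promise<Uint8Array>,
): Promise<Uint8Array> {
  const key = cacheKey(spec);
  const cached = layerCache.get(key);
  if (cached !== undefined) return cached;             // cache hit: reuse the layer
  const image = await render(spec);                    // cache miss: generate the layer
  layerCache.set(key, image);                          // store for subsequent requests
  return image;
}
```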
In some embodiments, the vector tile layer may comprise one or more data objects having locations associated with the map tile. In some embodiments, the vector tile layer comprises an aggregation of a plurality of selected vector layers. The vector tile layer may further comprise one or more interface elements corresponding to data objects obtained through a search performed by the user.
In some embodiments, the selection tile layer comprises one or more interface elements corresponding to data objects that have been selected by the user. In some embodiments, the inactive tile layer comprises one or more interface elements corresponding to data objects of the plurality of data objects that are not selectable by the user.
In some embodiments, a tile layer of the plurality of tile layers is generated as an image. For example, a tile layer may comprise a PNG image.
In some embodiments, generating the plurality of tile layers comprises composing the plurality of tile layers into an updated map tile, and providing the generated tile layers to the client computing device comprises providing the updated map tile to the client computing device.
In some embodiments, the map tile is associated with a UTF grid, wherein the UTF grid comprises a plurality of characters corresponding to pixels on the map tile. A character of the UTF grid may indicate a data object associated with the corresponding pixel.
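As a non-limiting sketch, a UTF grid lookup might resemble the following TypeScript. This assumes the UTFGrid-style character encoding used by some web mapping libraries (codes offset by 32, skipping the double-quote and backslash characters); the disclosure does not mandate this exact encoding:

```typescript
// Each row of the tile is a string; each character encodes an index into
// the keys array, which maps to a data object ID ("" meaning no object).
interface UtfGrid {
  grid: string[];   // one string per row of pixels (or pixel blocks)
  keys: string[];   // decoded index -> object ID
}

// Reverse the encoding offsets around the skipped control characters
// (34 = double quote, 92 = backslash).
function decodeCharCode(code: number): number {
  let id = code;
  if (id >= 93) id -= 1;
  if (id >= 35) id -= 1;
  return id - 32;
}

// Identify the data object (if any) associated with a pixel on the tile.
function objectIdAtPixel(grid: UtfGrid, x: number, y: number): string | null {
  const key = grid.keys[decodeCharCode(grid.grid[y].charCodeAt(x))];
  return key === "" ? null : key;
}
```

Because the lookup is a pure client-side table access, object outlining on cursor roll-over can occur without a server round trip, consistent with the client-side highlighting described elsewhere herein.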
In some embodiments, the query comprises a selection of one or more data objects, a search for one or more data objects, a selection of one or more layers, a request to generate a heatmap, or the application of a filter to filter one or more data objects. A map tile of the plurality of map tiles is identified as being associated with the received query if a data object associated with the query is located within the map tile.
In some embodiments, the one or more hardware processors are further configured to assemble the generated tile layers into a completed map tile, and wherein providing the generated tile layers to the client computing device comprises providing the completed map tile to the client computing device.
In another embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of objects, wherein each of the objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to display, to a user, an interactive user interface including an interactive map, the map containing a plurality of data objects, wherein each of the plurality of data objects is associated with at least one attribute, and distributions of the plurality of data objects over attribute values of one or more attributes of the plurality of data objects. The one or more hardware processors may be further configured to receive a selection of an attribute value of an attribute of the one or more attributes, generate a first filter by providing an indication of the attribute value of the attribute to a server computing device, wherein the attribute value is associated with a portion of the plurality of data objects. In response to generating the first filter, the one or more hardware processors may be further configured to receive an update to the interactive map from the server computing device based on the indication, the update comprising at least an updated map tile of the interactive map, update the interactive map with the updated map tile such that the portion of the plurality of data objects that are selected are displayed as active data objects, while a remainder of the plurality of data objects that are not selected are displayed as inactive data objects, and update the interactive user interface to further include a user interface element corresponding to the first filter.
In some embodiments, the distributions of the plurality of data objects over attribute values comprise a histogram. The one or more hardware processors may be configured to receive a selection of an attribute value of an attribute of the one or more attributes in response to a user selecting one or more bars associated with the histogram.
In some embodiments, the one or more hardware processors may be further configured to, in response to receiving a selection of an attribute value for an attribute of the one or more attributes, display one or more distributions of the displayed data objects associated with the selected attribute value over at least a portion of the remaining attributes of the one or more attributes.
In some embodiments, the attribute value may comprise an attribute value range. In some embodiments, the attribute value may comprise all attribute values except a specified set of attribute values.
In some embodiments, the one or more hardware processors may be further configured to receive an indication to deactivate the first filter, in response to the indication to deactivate the first filter, receive a second update to the interactive map from the server computing device based on the indication, the second update comprising at least a second updated map tile of the interactive map, and update the interactive map with the second updated map tile such that the remainder of the plurality of data objects are displayed as active data objects. In some embodiments, the indication may comprise a selection by the user of the user interface element corresponding to the first filter.
In some embodiments, the one or more hardware processors are further configured to receive a selection of a second attribute value, generate a second filter by providing an indication of the second attribute value to a server computing device, receive a second update to the interactive map from the server computing device based on the indication, the second update comprising at least a second updated map tile of the interactive map, update the interactive map with the second updated map tile such that data objects of the plurality of data objects that are associated with both the first attribute value and the second attribute value are displayed as active data objects, while the remaining data objects are displayed as inactive data objects, and update the interactive user interface to further include a user interface element corresponding to the second filter. In some embodiments, the one or more hardware processors may further receive an indication to deactivate the first filter, receive a third update to the interactive map from the server computing device based on the indication, the third update comprising at least a third updated map tile of the interactive map, and update the interactive map with the third updated map tile such that data objects of the displayed data objects that are associated with the second attribute value are displayed as active data objects, while a remainder of the plurality of data objects that are not selected are displayed as inactive data objects. In some embodiments, the first attribute value and second attribute value may be associated with different attributes.
In another embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of objects, wherein each of the objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to generate user interface data for rendering an interactive user interface on a client computing device, the interactive user interface including an interactive map, wherein the interactive map includes a plurality of data objects, wherein each data object is associated with at least one attribute. The one or more hardware processors may be further configured to receive a query from the client computing device corresponding to a selection of an attribute, receive an indication from the client computing device identifying one or more shapes, wherein a shape of the one or more shapes defines a region on the interactive map, and for each shape of the one or more shapes, calculate an aggregate attribute value corresponding to the shape, based at least in part upon the selected attribute, generate data for rendering a heatmap on the interactive map, based at least in part upon the calculated aggregate attribute values for the one or more shapes. The generated data may be transmitted to a client computing device for rendering a heatmap.
In some embodiments, the one or more shapes may comprise one or more spaces in a rectangular grid. In other embodiments, the one or more shapes are associated with one or more data objects. For example, the one or more data objects may be associated with buildings, with the one or more shapes corresponding to building footprints. In some embodiments, the one or more data objects are associated with geographic regions, and the one or more shapes correspond to geographic borders.
In some embodiments, generating the heatmap on the map comprises assigning a color to each shape, wherein the color for a shape is assigned based at least in part upon the aggregate attribute value for the shape.
In some embodiments, the attribute may comprise a function of two or more different attributes. The two or more different attributes may comprise a first attribute associated with a first data object type, and a second attribute associated with a second data object type.
In some embodiments, calculating an aggregate attribute value for the shape comprises identifying one or more data objects associated with the attribute within the region defined by the shape, and calculating the aggregate attribute value for the shape based at least in part upon values of the attribute associated with the one or more identified data objects.
In some embodiments, the one or more hardware processors may be further configured to receive a request from the client computing device specifying one or more filter conditions, and in response to the received request, generate the heatmap such that only shapes having an aggregate attribute value satisfying the one or more filter conditions are displayed.
In another embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of objects, wherein each of the objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to generate user interface data for rendering an interactive user interface on a client computing device, the interactive user interface including a first interactive map and a second interactive map, wherein the first interactive map includes a plurality of data objects, and receive a query from the client computing device corresponding to a selection of one or more data objects in the first interactive map. The one or more hardware processors may be further configured to receive a first indication corresponding to a request to copy the one or more selected data objects, generate an ID corresponding to the one or more selected data objects, receive a second indication corresponding to a request to copy the one or more selected data objects from the first interactive map to the second interactive map, wherein the second indication comprises the ID received at the second interactive map, use the ID to identify the one or more data objects corresponding to the ID, generate a set of user interface data for updating the second interactive map, wherein updating the second interactive map comprises displaying the identified one or more data objects on the second interactive map, and provide the set of generated user interface data to the client computing device to display the identified one or more data objects on the second interactive map.
In some embodiments, each data object is associated with an identifier, and the ID is generated by applying a hash function on the identifiers of the one or more data objects. In some embodiments, the ID comprises a string of a fixed length.
In some embodiments, the first indication may correspond to a drag and drop operation performed by the user at the client computing device. In some embodiments, the first indication may correspond to a copy operation performed by the user at the client computing device, and the second indication may correspond to a paste operation performed by the user at the client computing device.
In some embodiments, the one or more hardware processors are further configured to, in response to receiving the first indication, generate and provide to the client computing device user interface data for rendering an aggregate icon representing the one or more selected data objects.
In some embodiments, the first interactive map and the second interactive map are displayed in different tabs in a web browser at the client computing device. Alternatively, the first interactive map and the second interactive map may be displayed in different windows.
In some embodiments, the generated ID may be stored in a mapping table. Identifying the one or more data objects corresponding to the ID may be performed by accessing the mapping table.
Additional embodiments of the disclosure are described below in reference to the appended claims, which may serve as an additional summary of the disclosure.
In various embodiments, computer-implemented methods are disclosed in which, under control of one or more hardware computing devices configured with specific computer executable instructions, one or more aspects of the above-described embodiments (including one or more aspects of the appended claims) are implemented and/or performed.
In various embodiments, non-transitory computer-readable storage mediums storing software instructions are disclosed, wherein, in response to execution by a computing system having one or more hardware processors, the software instructions configure the computing system to perform operations comprising one or more aspects of the above-described embodiments (including one or more aspects of the appended claims).
Further, as described herein, various embodiments of the system may be configured and/or designed to generate user interface data useable for rendering the various interactive user interfaces described. The user interface data may be used by the system, and/or another computer system, device, and/or software program (for example, a browser program), to render the interactive user interfaces. The interactive user interfaces may be displayed on, for example, electronic displays (including, for example, touch-enabled displays).
The following aspects of the disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings.
In general, a high-performance, interactive data object map system (or “map system”) is disclosed in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. The interactive data object map system allows for rapid and deep analysis of various objects, features, and/or metadata by the user. For example, millions of data objects and/or features may be simultaneously viewed and selected by the user on the map interface. A layer ontology may be displayed to the user that allows the user to select and view particular layers. In various embodiments, when the user rolls a selection cursor over an object/feature (and/or otherwise selects the object/feature) an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature. For the purposes of the present specification, the terms “objects,” “data objects,” and “features” may be used synonymously, and may hereinafter be collectively referred to as “objects.”
In an embodiment, the user may rapidly zoom in and out and/or move and pan around the map interface to variously see more or less detail, and more or fewer objects. In various embodiments, the interactive data object map system may automatically generate object lists and/or histograms based on selections made by the user. In various embodiments, the user may perform searches (such as geosearches based on any selections and/or drawn shapes, and/or other types of searches), generate heatmaps (e.g., based upon a grid or object shapes), copy objects between different interactive maps, import additional objects from files, and/or define filters to display data, among other actions as described below.
In an embodiment, the interactive data object map system includes server-side computer components and/or client-side computer components. The client-side components may implement, for example, displaying map tiles, showing object outlines, allowing the user to draw shapes, and/or allowing the user to select objects, among other actions. The server-side components may implement, for example, composition of layers into map tiles, caching of composed map tiles and/or layers, and/or providing object metadata, among other actions. Such functions may be distributed in any other manner. In an embodiment, object outlines and/or highlighting are accomplished on the client-side through the use of a UTF grid.
In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.
Ontology: A hierarchical arrangement and/or grouping of data according to similarities and differences. The present disclosure describes two ontologies. The first relates to the arrangement of vector layers consisting of map and object data as used by the interactive data object map system (as described below with reference to the accompanying figures). The second relates to the storage and arrangement of data objects and their associated properties in one or more databases (as reflected in the definitions below).
Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (for example, Oracle database, mySQL database, and the like), non-relational databases (for example, a NoSQL database), spreadsheets, XML files, and text files, among others. The various terms “database,” “data store,” and “data source” may be used interchangeably in the present disclosure.
Data Object, Object, or Feature: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g. metadata about the object) may be represented in one or more properties. For the purposes of the present disclosure, the terms “feature,” “data object,” “object,” and “data item” may be used interchangeably to refer to items displayed on the map interface of the interactive data object map system, and/or otherwise accessible to the user through the interactive data object map system. Features/objects may generally include, but are not limited to, roads, terrain (such as hills, mountains, rivers, and vegetation, among others), street lights (which may be represented by a streetlight icon), railroads, hotels/motels (which may be represented by a bed icon), schools (which may be represented by a parent-child icon), hospitals, other types of buildings or structures, regions, transportation objects, and other types of entities, events, and documents, among others. Objects displayed on the map interface generally comprise vector data, although other types of data may also be displayed. Objects generally have associated metadata and/or properties.
Object (or Feature) Type: Type of a data object or feature (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g. an agent may be a sub-object type of a person object type), and the properties the object type may have.
Properties: Also referred to as “metadata” or “attributes” of a data object/feature. A property of a data item may include any item of information associated with, and/or relevant to, the data item. At a minimum, each property/metadata of a data object has a type (such as a property type) and a value or values. Properties/metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school object may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In another example, metadata associated with a road object may include a speed (for example, 25 mph), a width (for example, 2 lanes), and/or a county (for example, Arlington), among other items of metadata.
Property Type: The data type of a property, such as a string, an integer, or a double. Property types may include complex property types, such as a series of data values associated with timed ticks (e.g., a time series), etc.
Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.
Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.
Link Set: Set of multiple links that are shared between two or more data objects.
Embodiments of the disclosure will now be described with reference to the accompanying Figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.
Each map may comprise different types of objects and/or different map layers. For example, a first map may map data related to a first topic (e.g., instances of disease outbreaks), while a second map may map data related to a second topic (e.g., weather patterns). In some embodiments, data associated with different maps may correspond to different geographical areas. Once a desired map has been selected, objects associated with the map may be loaded to be viewed and/or manipulated by the user.
The map interface 100, illustrated in the accompanying figures, is described in detail below.
In general, the user interface is displayed on an electronic display viewable by a user of the interactive data object map system. The user of the interactive data object map system may interact with the user interface by, for example, touching the display when the display is touch-enabled and/or using a mouse pointer to click on the various elements of the user interface.
The map interface 100 includes various visible objects 122 and object icons. For example, the map interface 100 includes roads, buildings and structures, utilities, lakes, rivers, vegetation, and railroads, among other objects. The user may interact with the map interface 100 by, for example, rolling over and/or clicking on various objects. In one embodiment, rolling over and/or placing the mouse pointer over an object causes the object to be outlined and/or otherwise highlighted. Additionally, in some embodiments, the name of the object and/or other information about the feature may be shown in the feature information box 114.
The user of the map system may interact with the user interface in the various ways described below.
Map Layers
In an embodiment, the user may select one or more of the base layers which may be used during composition of the map tiles. For example, selection of the overhead imagery base layer will produce map tiles in which the underlying map tile imagery is made up of recent aerial imagery. Similarly, selection of the topographic base layer will produce map tiles in which the underlying map tile imagery includes topographic map imagery.
Further, in an embodiment, the user may select one or more of the vector layers which may be used during composition of the map tiles. For example, selecting the transportation layer results in transportation-related objects being displayed on the map tiles. Transportation-related objects may include, for example, roads, railroads, street signs, and/or street lights, among others. Examples of transportation-related objects may be seen in the map interface 100 described above.
In an embodiment, the user of the map system may create and save map layers. These saved map layers may be listed as user layers in the layers window 202.
In an embodiment, the user of the map system may select one or more of the layers and/or sub-layers of the layer ontology. As shown in the accompanying figures, the layers and sub-layers available for selection may be presented in the layers window 202.
In an embodiment, additional hierarchical levels of layers may be displayed to the user. For example, the vector layers window 206 may include sub-sub-layers (for example, the education sub-layer may be divided into elementary schools, secondary schools, and post-secondary schools). Alternatively, fewer hierarchical levels may be displayed to the user.
In an embodiment, each of the vector layers shown in the vector layers window 206 may be made up of many layers of map vector data. In this embodiment, the map system may advantageously generate a simplified layer ontology, such as the one shown in 206. The simplified layer ontology allows the user to easily select layers of interest from a reduced number of layers, rather than a large number of discrete layers. As described above, vector layers may contain data regarding associated objects. Thus, objects visible in the map interface correspond to the currently active/selected layers. In an embodiment, the layer ontology may have an arbitrary depth.
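By way of illustration only, a layer ontology of arbitrary depth might be modeled as a simple tree, with visible objects determined by the currently active layers. The names below are assumptions of this sketch, not elements of the disclosure:

```typescript
// A layer node may be a top-level layer, a sub-layer, or deeper still.
interface LayerNode {
  name: string;            // e.g., "Transportation" or "Education"
  active: boolean;         // whether the user has selected this layer
  children: LayerNode[];   // sub-layers, to arbitrary depth
}

// Walk the tree and collect the names of all active layers; objects shown
// on the map interface correspond to these currently active layers.
function activeLayers(node: LayerNode, out: string[] = []): string[] {
  if (node.active) out.push(node.name);
  for (const child of node.children) activeLayers(child, out);
  return out;
}
```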
In some embodiments, a user may be able to select layer objects (e.g., a province defined by a provincial border) by selecting (e.g., touching on a touchscreen, or clicking using a mouse cursor) a point on a border of a desired province. Selecting layer objects may cause the objects to become highlighted (e.g., as shown at 310). On the other hand, in some embodiments, if the user selects a point on the map within a province but not on a border, the user may then pan across the map interface (e.g., by dragging the mouse cursor in a desired direction). In some embodiments, how the user is able to select layer objects and manipulate the map interface (e.g., pan, zoom, and/or the like) may be based upon a type of object associated with a selected vector layer.
In addition, highlighting (e.g., placing a cursor over an object) or selecting (e.g., clicking on an object) an object may change how the object is displayed. For example, in the illustrated embodiment, the user has placed a cursor over an object corresponding to a section of road 320 associated with the transportation vector layer, causing the section of road to be highlighted. Highlighting an object may comprise changing a color associated with how the object is presented (e.g., changing the line color of the selected section of road from yellow to teal). In some embodiments, how an object is highlighted may also be based at least in part upon a thickness or dimension of the object. For example, highlighting a road object represented with a thin line may cause the object to be displayed using a thicker line, while highlighting a road object represented with a thick line may cause no change in line thickness. In some embodiments, selecting an object may cause the same change in how the object is displayed when highlighted, while in other embodiments, selecting the object may cause a different change. In some embodiments, selecting an object can also cause a change in the view of the map (e.g., panning or zooming the map to show the selected object and surrounding area of the map).
In some embodiments, one or more objects 324 may be designated as inactive. Inactive objects, while still displayed on the map, are not able to be highlighted or selected. In some embodiments, a user may define one or more filters for a plurality of displayed objects, wherein objects that satisfy the filter conditions stay active, while objects that do not satisfy the filter conditions are made inactive. In some embodiments, an object may be indicated as inactive by the icon or shape used to represent the object being greyed out.
Search
In some embodiments, the user may configure various parameters of the search using a search panel 404. For example, the user may, at 406, select from other types of geospatial searches that may be performed, corresponding to different ways for the user to specify the search area, such as a radial search, a search from an existing selection, and/or the like. For example, a radial search may allow the user to define the search area by specifying a center point and a radius.
In some embodiments, additional parameters may be specified when performing a geographic search. For example, at 408, the user may specify a time range, restricting the returned search results to objects associated with a time value that falls within the specified range. The time range may be specified by an inputted start date and end date, or by a user selection of a time period based upon the current date (e.g., “last week,” “last month,” and/or the like). In some embodiments, the time may correspond to a time an object was created, a time an object was added to a database of data objects/features, a time an object was previously added to a vector layer, a time an object was last accessed by the map system and/or a user, a time an object was built, a time an object corresponding to an event occurred, and/or any combination of the foregoing.
In addition, at 410, the user may restrict the returned search results to certain types of objects and/or exclude certain types of objects from the search results. In some embodiments, other types of attributes and/or properties may also be used to restrict the search results. For example, in some embodiments, a geospatial search may only search for objects that are part of a currently displayed vector layer. In other embodiments, objects searched by the map system may include objects other than those shown on the map interface. For example, in an embodiment the map system may access one or more databases of objects (and object metadata) that may be unrelated to the objects currently shown in the map interface, or objects related to the currently selected vector layers. The databases accessed may include databases external to any database storing data associated with the map system.
In some embodiments, a particular map may be associated with hundreds of thousands or millions of objects. If the number of objects is sufficiently large, a search may take some time to perform. In addition, if a large number of search results are returned and displayed on the map, the map may become cluttered and difficult to read. As such, in some cases a user may wish to pause a search as it is in progress, due to the search taking too long, there being enough search results already displayed on the map, and/or the like. In some embodiments, a progress bar 412 may be displayed as the search is being executed. The progress bar 412 may display how many search results have been loaded, how many objects have been searched through, and/or the like. When the user feels that enough search results have been displayed, or if enough time has elapsed in the search, they may click on a stop button 414 or other interface element to stop the search before it has reached full completion. In some embodiments, search results may be displayed on the map as the search progresses. The results may be displayed as they are found, or on a periodic basis, wherein the period may be based upon a period of elapsed time, a number of search results found, and/or a number of data objects searched. In some embodiments, the search may pause periodically in order to allow the user to stop the search prematurely. In some embodiments, for the sake of visual clarity, the search may only display up to a predetermined number of results on the map, even if more search results have been found.
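One non-limiting way to realize such a pausable, incrementally rendered search is to stream results in batches, as in the following TypeScript sketch (the generator shape and all names are assumptions of the sketch):

```typescript
interface GeoObject { id: string; lat: number; lon: number; }

// Yield matching objects in batches so the client can render partial
// results and stop consuming (e.g., when the stop button 414 is clicked).
async function* geosearch(
  candidates: AsyncIterable<GeoObject>,
  matches: (obj: GeoObject) => boolean,
  batchSize = 100,
): AsyncGenerator<GeoObject[]> {
  let batch: GeoObject[] = [];
  for await (const obj of candidates) {
    if (matches(obj)) batch.push(obj);
    if (batch.length >= batchSize) {
      yield batch;                 // periodic flush: display these results now
      batch = [];
    }
  }
  if (batch.length > 0) yield batch;
}
```

Abandoning the generator mid-iteration leaves the remaining candidates unsearched, which corresponds to stopping the search before full completion.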
In some embodiments, other types of searches may be performed. For example, the user may perform a keyword search, restricting the returned objects to those having metadata matching one or more entered keywords.
Heatmaps
Heatmaps may be generated across maps of any size, using any number of objects, in order to more easily visualize attributes associated with large numbers of objects. For example, a heatmap may be generated to visualize aggregate rainfall amounts across a large geographic region.
Heatmaps can thus be used as a way to quickly visualize the attribute values of various objects displayed on a map. In some embodiments, in order to more clearly view how attribute values of objects vary by location, heatmap values may be aggregated based upon geographic regions or shapes (hereinafter also referred to as a “shape heatmap”). In some embodiments, shape heatmaps display data at a coarser grain in comparison to raster heatmaps (as illustrated in the accompanying figures).
For example, a particular map may contain objects corresponding to rain events, wherein each rain event is associated with a rainfall attribute. In order to generate a grid heatmap based upon the rainfall attribute of the rain events, an aggregate rainfall value may be calculated for each shape within the grid (e.g., by calculating a sum of the rainfall attribute values of the rainfall events within each shape).
In some embodiments, an object may span multiple shapes. For example, a road object may run through multiple shapes in a rectangular grid. In such cases, the aggregation for a particular shape may be based upon only the portion of the object located within the shape (e.g., the length of the road within the shape). Alternatively, in some embodiments, an object that spans multiple shapes may be considered, for the purposes of generating a heatmap, to be part of the shape that contains the largest portion of the object. For example, a road object that runs through multiple shapes may be considered to be part of the shape that contains the greatest length of the road object. In some embodiments, an object that spans multiple shapes may be considered to be part of all of the shapes that it spans.
In some embodiments, a shape heatmap may be filtered in order to focus and limit the heatmap data that is displayed. For example, the heatmap may be filtered so that only shapes having an aggregate attribute value satisfying one or more filter conditions are displayed.
A filtered shape heatmap can be further used to retrieve underlying objects that satisfy certain criteria located in the filtered heatmap shapes. For example, a filtered heatmap may only show heatmap shapes having a heatmap value that satisfies a first criterion (e.g., heatmap shapes having a heatmap value meeting a first threshold value of rainfall). The original set of objects may be filtered to obtain all objects within the selected heatmap shapes satisfying a second criterion (e.g., rainfall events meeting a second threshold value of rainfall). As such, only objects meeting the second criterion within heatmap shapes that meet the first criterion are displayed.
In another example, filtering on a shape heatmap allows a user to aggregate on a particular attribute (e.g. find average home price per zip code region), filter to the regions with high or low aggregates (e.g. top 10% of zip code regions by average home price), and then use select-within-select to go back to the original set of objects (e.g. retrieve all home objects within zip code regions that have the top 10% average home price). This may result in a different set of objects compared to simply filtering for objects having high or low attributes (retrieving the top 10% of home objects by price).
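A minimal sketch of this select-within-select flow follows, under the simplifying assumption that each heatmap shape already records its aggregate value and the IDs of its contained objects (all names are illustrative):

```typescript
interface HeatShape { id: string; aggregate: number; objectIds: string[]; }
interface RainEvent { id: string; rainfall: number; }

// First criterion filters shapes by aggregate value; second criterion
// filters the individual objects inside the surviving shapes.
function selectWithinSelect(
  shapes: HeatShape[],
  objects: Map<string, RainEvent>,
  shapeThreshold: number,
  objectThreshold: number,
): RainEvent[] {
  return shapes
    .filter((s) => s.aggregate >= shapeThreshold)
    .flatMap((s) => s.objectIds)
    .map((id) => objects.get(id))
    .filter((o): o is RainEvent => o !== undefined && o.rainfall >= objectThreshold);
}
```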
In some embodiments, heatmaps may be generated based upon multiple attributes. Objects may be associated with a plurality of different attributes. For example, an object representing a road may have a width attribute and an average congestion attribute. These different attributes may be combined to form an aggregate attribute value, such as a ratio of the two attributes, which may be used for heatmap generation.
In some embodiments, a heatmap may be generated based upon a plurality of different attributes associated with different types of objects. For example, the heatmap may be based upon a function of a first attribute associated with a first type of object and a second attribute associated with a second type of object.
As used herein, an aggregation may comprise any type of mathematical operation or combination of operations that may be performed on one or more attributes (e.g., counts, sums, ratios, products, maximum value, minimum value, median value, mode, and/or the like). In some embodiments, the user may define different types of aggregations to be performed between attributes in a set of attributes from which the aggregate value is generated. For example, in an embodiment, the user may define an aggregate value that comprises a ratio between a first attribute and a sum of a second attribute and a third attribute. In some embodiments, aggregations can also be performed on layers. For example, objects from a first layer may be aggregated to form objects to be displayed on a second layer. The attribute values of the objects on the second layer may comprise aggregations of attribute values of objects on the first layer. For example, in an embodiment, people objects can be aggregated to form family objects. In some embodiments, objects from multiple layers can be aggregated to form new objects on a new layer (e.g., objects from a first layer aggregated with objects from a second layer to form objects on a third layer).
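By way of example, a user-defined aggregate such as the ratio between a first attribute and the sum of a second and third attribute might be expressed as a function over per-shape attribute sums. The attribute names here are hypothetical:

```typescript
type AttributeSums = Record<string, number>;   // per-shape sums of each attribute

// Build an aggregate of the form attrA / (attrB + attrC), as in the example above.
const ratioOverSum =
  (attrA: string, attrB: string, attrC: string) =>
  (sums: AttributeSums): number =>
    sums[attrA] / (sums[attrB] + sums[attrC]);

// Usage: compute the aggregate value for one shape from pre-computed sums.
const aggregate = ratioOverSum("rainfall", "area", "population");
const value = aggregate({ rainfall: 120, area: 30, population: 10 }); // 120 / 40 = 3
```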
In some embodiments, instead of a grid with rectangular shapes, a shape heatmap may be generated using a plurality of shapes derived from objects. Certain types of objects in the map system may cover a geographic area. These may comprise regions such as provinces (e.g., regions bounded by provincial borders), building footprints, and/or the like.
For example, a shape heatmap may be generated using building footprints as the shapes, with each footprint colored according to its aggregate attribute value.
In some embodiments, generating the shape heatmap comprises calculating an intersection between the selected shapes (e.g., building footprints) and the objects associated with the attribute to be heatmapped (e.g., rainfall events associated with a rainfall attribute). The objects that are located within each shape may then be used to calculate an aggregate heatmap value for the shape.
At block 604, one or more shapes may be identified. In some embodiments, the shapes may be defined by a rectangular grid overlaid on the map. In other embodiments, the shapes may be defined by one or more objects or object types. For example, a user may specify a type of object represented by an area on the map from which the shapes may be defined. These may include borders (e.g., state borders, regional borders), footprints (e.g., building footprints), and/or the like. In some embodiments, the shapes may directly border each other to cover a continuous area on the map. In other embodiments, such as with building footprints, there may be empty space between different shapes.
At block 606, attribute values are aggregated for each shape. In some embodiments, an intersection between the shapes and the locations of the objects associated with an attribute value for the identified attribute is calculated, in order to determine which objects fall within the shapes. The attribute values of the objects within the shape may be combined or aggregated to calculate an aggregate attribute value for the shape. In some embodiments in which the attribute corresponds to a combination or aggregation of two or more different attributes, objects within the shape associated with each attribute may be identified and aggregated. The aggregation may comprise any type of mathematical operation or combination of operations. For example, an aggregated attribute value for a particular shape may comprise an aggregation of a first value corresponding to an aggregation of values associated with a first attribute within the shape and a second value corresponding to an aggregation of values associated with a second attribute within the shape.
At block 608, a heatmap is generated based upon the aggregate attribute values for the shapes. For example, the aggregate attribute value for each shape may be mapped to a heatmap color. Each shape is filled with the heatmap color to generate the heatmap.
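The following TypeScript sketch illustrates blocks 606 and 608 under simplifying assumptions: shapes are axis-aligned rectangles, each object falls in at most one shape, and the aggregation is a sum. None of the names or the color ramp are mandated by the disclosure:

```typescript
interface Shape { id: string; minX: number; minY: number; maxX: number; maxY: number; }
interface PointObject { x: number; y: number; value: number; }   // e.g., a rain event

// Block 606: intersect objects with shapes and sum attribute values per shape.
function aggregateByShape(shapes: Shape[], objects: PointObject[]): Map<string, number> {
  const totals = new Map<string, number>(shapes.map((s): [string, number] => [s.id, 0]));
  for (const obj of objects) {
    const shape = shapes.find(
      (s) => obj.x >= s.minX && obj.x <= s.maxX && obj.y >= s.minY && obj.y <= s.maxY,
    );
    if (shape) totals.set(shape.id, totals.get(shape.id)! + obj.value);
  }
  return totals;
}

// Block 608: map an aggregate value onto a simple blue-to-red color ramp.
function heatColor(value: number, min: number, max: number): string {
  const t = max > min ? Math.min(1, Math.max(0, (value - min) / (max - min))) : 0;
  const red = Math.round(255 * t);
  return `rgb(${red}, 0, ${255 - red})`;                // low = blue, high = red
}
```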
Drag and Drop
In some embodiments, a user may wish to view more than one map, or be able to analyze object data using different software applications (e.g., a chart application for creating one or more charts based upon attribute data associated with selected objects). For example, a user may view a first map associated with a first set of objects, and a second map associated with a second set of objects. The two different maps may be open in different software application windows (e.g., in an operating system environment). In some embodiments, an interactive data object map system may be implemented as a web application, wherein the user accesses the system through a web browser (e.g., Internet Explorer, Google Chrome, Mozilla Firefox, and/or the like). The user may have a first map open in a first tab or window, and a second map open in a second tab or window. In some embodiments, the user may view a first map in a first software application window or tab, and a second, different application in a second software application window or tab.
In some cases, a user may wish to be able to copy selected data objects from one map to another (e.g., in order to view certain data objects associated with the first map concurrently with data objects associated with the second map), or to another application (e.g., a chart creation application for creating one or more charts based upon attributes of the selected objects). It would be convenient for a user to be able to select the desired objects in the first map, and then drag and drop the selected objects into the second map or application, such that the selected objects will be displayed in the second map or processed by the application.
The user may select at least a portion of the displayed objects. For example, the user may select the desired objects by clicking on them individually, or by drawing a shape around them.
However, for security reasons and/or due to technical constraints, web browsers and operating systems often restrict the type of data that may be communicated between different tabs and/or application windows. As such, it is often not possible to directly copy the objects from the first map to the second map or second application. Thus, in some embodiments, in order to be able to drag and drop objects between different maps or applications, when a drag and drop operation is detected, an ID string corresponding to the selected objects may be generated that can be passed between different tabs and/or windows. In some embodiments, a hash operation is performed on the selected objects in order to generate the ID string. The ID string may be mapped to the plurality of selected objects on the server side. For example, the server may maintain a mapping table keeping track of which ID strings are mapped to which objects or sets of objects.
The drag and drop operation will thus pass the ID string between the different tabs and/or windows. When the second map or application receives the ID string, it may access a server-side mapping to identify the plurality of objects that the received ID string corresponds to. Once the objects are identified, they may be placed on the second map by the server and displayed to the user at the client, or processed by the second application (e.g., attribute values of the received objects used to create a table or chart).
In some embodiments, if the user performs a copy and paste operation, the ID string is generated in response to the user performing a copy command. The ID string may then be stored on a clipboard on the client system. When the user selects a tab or application window associated with the second map or second application and performs a paste command, the ID string is retrieved from the clipboard and transmitted to the second map. The second map may then access the server-side mapping to identify the objects that are associated with the ID string.
At block 804, a drag and drop operation and/or copy operation to copy the selected objects from the first map to a second map or second application is detected at the client. In some embodiments, the operation is detected in response to the user clicking on (e.g., using a mouse cursor, a touch on a touchscreen, and/or the like) the selected objects and dragging them in a desired direction. In some embodiments, the operation is detected in response to the user specifying a copy command to be performed on the selected objects. The client may then notify the server that a drag and drop and/or copy operation has been initiated.
At block 806, the server, in response to receiving an indication of a drag and drop and/or copy operation, generates an ID based upon the selected objects. For example, in some embodiments, each object may be associated with an identifier. A hash function may be performed on the identifiers of the selected objects in order to produce the ID. The ID may then be mapped to the selected objects by the server. For example, the server may maintain a mapping table that keeps track of generated IDs and which sets of objects they are associated with. In some embodiments, the ID may comprise a string of a fixed length. In other embodiments, the ID may comprise a string of variable length. In some embodiments, the generated ID is transmitted from the server to the client. In some embodiments, the ID may be placed onto a clipboard on the client system.
At block 808, the generated ID is received by the second map or second application displayed at the client. In some embodiments, the ID may be received by the second map or second application in response to the user dragging the selected objects to a tab or window associated with the second map or second application, and dropping the selected objects (e.g., releasing a mouse cursor, removing a touch from a touch screen, and/or the like). In some embodiments, the ID may be received by the second map or second application in response to the user performing a paste command. In some embodiments, the ID is retrieved from a clipboard at the client and received by the second map or second application. Once the second map or second application has received the ID at the client, the received ID may be transmitted to the server.
At block 810, the server, upon receiving the ID received at the second map or second application, identifies one or more objects based upon the received ID. In some embodiments, the second map or second application accesses one or more mappings stored on the server (e.g., a mapping table) that map each ID to one or more corresponding objects. At block 812, the identified objects may then be displayed on the second map or processed by the second application. For example, the server, having identified the objects associated with the ID, generates one or more updated map tiles for the second map containing the identified data objects, which may then be transmitted to the client to be displayed as part of the second map. In addition, metadata associated with the objects may be retrieved and transmitted to the client.
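A minimal server-side sketch of blocks 806 and 810 follows, assuming a Node.js environment and a SHA-256 hash truncated to a fixed length; the disclosure does not specify the hash function or ID format, so both are assumptions of the sketch:

```typescript
import { createHash } from "node:crypto";

const idToObjects = new Map<string, string[]>();   // mapping table: ID -> object IDs

// Block 806: derive a fixed-length ID string from the selected object IDs
// and record the mapping so the ID can later be resolved.
function registerSelection(objectIds: string[]): string {
  const id = createHash("sha256")
    .update([...objectIds].sort().join(","))       // order-independent hash input
    .digest("hex")
    .slice(0, 16);                                 // fixed-length string ID
  idToObjects.set(id, objectIds);
  return id;
}

// Block 810: resolve an ID received from the second map or application.
function resolveSelection(id: string): string[] | undefined {
  return idToObjects.get(id);
}
```

Because only the short ID string crosses the tab or window boundary, the browser restrictions described above are never violated; the object data itself stays on the server.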
Importing Data Objects
In some embodiments, a user may import objects from other sources to be displayed on a map. For example, a user may wish to import objects from one or more files, in order to compare those objects to those associated with a map.
In some embodiments, objects imported from a file may be contained in their own layer 904, which may be referred to as a “user layer.” In some embodiments, each imported file may have its own corresponding user layer. In some embodiments, user layers may be functionally similar to vector layers. For example, a user may toggle user layers on and off. The user may be able to organize and group different user layers into categories or folders (similar to the layer ontology described above).
In some embodiments, different types of files containing object information may be imported. In addition, different types of coordinate systems may be used in different files to specify object locations or shapes. In some embodiments, a file may not specify the coordinate system that is used to specify location data for the objects within the file.
In some embodiments, the system may also attempt to display the objects specified by the file using a default coordinate system. In some embodiments, the system may first check the coordinate values contained within the file against one or more allowed ranges associated with the default coordinate system. For example, a particular coordinate system may require that all coordinate values be between 0 and 180. If the coordinate values contained within a file contain values that are outside of this range, then it may be inferred that the file does not use that particular coordinate system. If it is determined that the coordinate values specified in the file do not use the default coordinate system, the system may attempt to display the data objects using a different coordinate system. The user may view the displayed objects, and choose to keep the coordinate system chosen by the system (e.g., default coordinate system), or specify a different coordinate system to be used.
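The range-check heuristic might be sketched as follows. The allowed range of 0 to 180 comes from the example above; the list of alternative systems and all names are assumptions of the sketch:

```typescript
interface CoordinateSystem { name: string; min: number; max: number; }

// A file's coordinate values can only belong to a system whose allowed
// range contains every value.
function fitsSystem(values: number[], sys: CoordinateSystem): boolean {
  return values.every((v) => v >= sys.min && v <= sys.max);
}

// Try the default system first; fall back to the first alternative whose
// range check passes. The user may still override the inferred choice.
function inferCoordinateSystem(
  values: number[],
  defaultSys: CoordinateSystem,
  alternatives: CoordinateSystem[],
): CoordinateSystem | undefined {
  if (fitsSystem(values, defaultSys)) return defaultSys;
  return alternatives.find((sys) => fitsSystem(values, sys));
}

// Example: a file containing the value 250 cannot use a system whose
// allowed range is 0 to 180, so an alternative system would be tried.
```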
Histograms and Filters
In some embodiments, various tools and functionalities may be provided to allow a user to better analyze displayed objects. For example, histograms may be used to allow the user to view how the displayed objects are distributed over different attribute values. In some embodiments, the user may also be able to define one or more filters based upon attribute values or attribute value ranges, allowing the user to restrict the displayed objects to those deemed most relevant.
For each displayed attribute, the number of displayed objects having a particular attribute value or within a particular attribute value range may be displayed. For example, an age attribute 1006 may be divided into “bins” of 10 years each (e.g., a “10 yrs to 20 yrs” bin, a “20 yrs to 30 yrs” bin, and so forth). The number of displayed objects having an age attribute value that falls within each bin may be displayed next to their respective bins. In addition, in some embodiments a histogram bar may be displayed next to the numbers for each bin. In some embodiments, different bin sizes may be displayed for each attribute. In some embodiments, the user may be able to specify a desired bin size to be displayed.
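The binning described above may be expressed as in the following sketch, which counts objects per fixed-width bin; the bin-label format is illustrative.

```typescript
// Count objects per fixed-width bin of a numeric attribute (e.g., age
// attribute 1006 divided into bins of 10 years each).
function binByAttribute(values: number[], binSize: number): Map<string, number> {
  const bins = new Map<string, number>();
  for (const v of values) {
    const lo = Math.floor(v / binSize) * binSize;
    const label = `${lo} yrs to ${lo + binSize} yrs`;
    bins.set(label, (bins.get(label) ?? 0) + 1);
  }
  return bins;
}

// Example: ages of displayed "person" objects with a bin size of 10 years.
const counts = binByAttribute([12, 17, 24, 28, 36], 10);
// counts: "10 yrs to 20 yrs" -> 2, "20 yrs to 30 yrs" -> 2, "30 yrs to 40 yrs" -> 1
```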
In some embodiments, a user may select a particular attribute value or attribute value range. For example, as illustrated in
In addition, how the selected objects are distributed for other attributes displayed in summary panel 1004 may be shown. In some embodiments, this may comprise displaying a number next to each attribute value or attribute range bin for other displayed attributes, indicating a number of the selected objects that satisfy that attribute value or attribute value range. For example, the summary panel 1004 indicates that of the 51 displayed data objects 1002, 3 data objects 1010 satisfy the selected attribute value range, and all 3 are of the “person” type (out of a total of 10 objects of the “person” type). This allows the user to quickly view how objects having certain attribute values are distributed over other attributes.
In some embodiments, an object may be represented in a plurality of different histograms corresponding to attributes associated with the data object/feature. For example, an object of the “person” type may be associated with a “type” attribute, an “age” attribute, a “gender” attribute, and a “job” attribute. As such, the object may be represented in the histograms that correspond to the type, age, gender, and job attributes. If there are different types of selected objects having different attributes (e.g., the user has selected a plurality of “person” objects and a plurality of “restaurant” objects), the data objects may be represented in different histograms, depending upon which attributes they are associated with. For example, a selected “restaurant” object may be associated with a “type” attribute but not a “job” attribute, and thus a restaurant object will not be represented in the histogram associated with the job attribute, but will be represented in the histogram associated with the type attribute. Additionally, objects of different types may be represented in a same histogram when the objects have the same types of attributes. For example, a “house” object and a “road” object, while different types of objects, may both be associated with a “city” or “location” attribute, and may thus be represented in the same histogram. However, the “house” object and the “road” object may, at the same time, have different types of attributes (e.g., “house” object has attribute “square feet,” while “road” object has attribute “speed limit”) such that the objects may be represented in different histograms.
In some embodiments, a timeline 1012 may be displayed indicating how the displayed objects are distributed over different time periods. In some embodiments, timeline 1012 functions similarly to a histogram as displayed in summary panel 1004. In some embodiments, the user may select objects that fall within a particular time range (e.g., by selecting one or more bars of the timeline). In addition, the selected bars of the timeline may be highlighted.
In some embodiments, a range covered by each bar in the timeline may automatically change as the user zooms in or out on the timeline. For example, the user may zoom in on the timeline such that each bar corresponds to a month. After selecting bars corresponding to one or more months, the user may then zoom back out such that each bar corresponds to a year. In some embodiments, if the user has selected one or more bars at a first zoom level (e.g., representing one or more months), and zooms out to a second zoom level (e.g., where the bars each represent a year), the highlighted bars at the first zoom level may be displayed using one or more partially highlighted bars at the second zoom level. In this way, the user may be able to quickly view a proportion of the number of objects associated with one or more selected time periods (e.g., the months of April and May) as compared to a number of objects associated with a second, longer time period (e.g., the entire year).
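One possible computation of these partial highlights is sketched below: per-month counts are rolled up into years, and each year bar receives the fraction of its objects that fall in the selected months. The month-key format ("YYYY-MM") is an assumption for illustration.

```typescript
// Roll per-month object counts (keys like "2020-04") up to years, and
// compute the highlighted fraction of each year bar at the coarser zoom.
function partialHighlight(
  countsByMonth: Map<string, number>,
  selectedMonths: Set<string>,
): Map<string, number> {
  const totals = new Map<string, number>();
  const selected = new Map<string, number>();
  for (const [month, count] of countsByMonth) {
    const year = month.slice(0, 4);
    totals.set(year, (totals.get(year) ?? 0) + count);
    if (selectedMonths.has(month)) {
      selected.set(year, (selected.get(year) ?? 0) + count);
    }
  }
  const fractions = new Map<string, number>();
  for (const [year, total] of totals) {
    fractions.set(year, (selected.get(year) ?? 0) / total);
  }
  return fractions; // e.g., "2020" -> 0.17 if April and May hold 17% of 2020's objects
}
```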
In some embodiments, the user may perform a search to display the desired objects. Alternatively, the user may select objects that do not satisfy the desired parameters and remove them from the map, leaving only the desired objects. However, these methods may make it difficult for the user to revisit previous data. For example, if the user has performed several selections within selections, the user may only be able to undo selections to view previous data in the reverse order that the selections were performed.
In some embodiments, the user may define a filter based upon the desired parameters (e.g., within the borders of a specified province), which may be displayed at 1014. Once the filter has been defined, objects 1016 that satisfy the filter criteria (e.g., roads within the selected province) are selected, while other objects 1018 (e.g., roads that are outside of the selected province) are made inactive.
In some embodiments, a user may define filters based upon any attribute or combination of attributes associated with the objects. For example, the user may define a filter by specifying one or more attribute values or attribute value ranges of one or more attributes associated with the displayed objects. In some embodiments, this may be done by selecting one or more attribute values or attribute values ranges of a displayed histogram associated with a desired filter attribute. For example, referring to
In some embodiments, the desired attribute values or attribute value ranges used to construct the filter may be based at least in part upon an attribute of another object. For example, as described above, the user may define a filter that filters objects having a location attribute value that is within an area defined by another type of object (e.g., a province or region).
In some embodiments, the user may later switch the filter off by clicking on the filter at 1014 to deselect the filter, resulting in the inactive objects and features 1018 displayed on the map to become active again. In some embodiments, the user may create multiple filters, each defining a different set of filter criteria. Each filter may be displayed as a button or other interface element at 1014. For example,
By clicking on the filters at 1014, the user may be able to switch each filter on and off independently. This allows the user to reorder and reapply the filters in any combination. For example, the user may define three different filters to view data objects that satisfy a first, a second, and a third filter parameter. The user may then turn off the second filter in order to view objects that satisfy the first and third filter parameters, without regard for the second filter parameter. At a later time, the user may turn the second filter back on, and turn off the first or third filters, in order to view another combination of objects.
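This independent on/off behavior amounts to intersecting the currently enabled filters, as in the following hypothetical sketch.

```typescript
type DataObject = Record<string, unknown>;

interface Filter {
  name: string;
  enabled: boolean;
  test: (o: DataObject) => boolean;
}

// An object is active only if it satisfies every currently enabled filter;
// disabled filters are simply ignored, so any combination may be applied.
function activeObjects(objects: DataObject[], filters: Filter[]): DataObject[] {
  const enabled = filters.filter((f) => f.enabled);
  return objects.filter((o) => enabled.every((f) => f.test(o)));
}

// Turning the second filter off and recomputing yields the objects that
// satisfy the first and third filter parameters, regardless of the second.
```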
In some embodiments, the user may create a filter by selecting a desired attribute value or attribute value range, and clicking on a filter button or other interface element. For example, in
In some embodiments, a created filter may be applied on all displayed objects. In other embodiments, the filter may apply to a portion of the displayed objects (e.g., objects from the same search, objects on the same vector layer, and/or the like). For example, a filter for roads within a particular region may be applied to objects corresponding to roads, but not objects corresponding to regions.
At block 1104, a selection of an attribute value or attribute value range (hereinafter collectively referred to as “attribute value”) is received. In some embodiments, the selected attribute value may be based upon a relationship between an object and a different type of object. For example, the selected attribute value may correspond to objects that are within a region defined by another type of object, such as objects within the borders of a selected region.
At block 1106, in response to the selection specifying the attribute value, one or more updates to the user interface may be performed. For example, at block 1106-1, a portion of the displayed objects satisfying the selected attribute value may be highlighted. In some embodiments, this comprises selecting the portion of objects. In addition, in some embodiments, at block 1106-2, one or more distributions (e.g., histograms) may be displayed illustrating how the portion of objects satisfying the selected attribute value are distributed over the attribute values of other attributes. For example, in some embodiments, one or more attribute histograms may be modified to indicate how many of the selected objects are associated with different attribute values or attribute value ranges (e.g., as illustrated in
At block 1108, an indication to generate a filter based upon the selected attribute value is received. In some embodiments, this may comprise the user clicking on or selecting a “create filter” button to create a filter based upon the selected attribute value. At block 1110, a filter is generated based upon the selected attribute value in response to the received indication. In addition, the portion of the displayed objects satisfying the selected attribute value may be displayed as active objects (e.g., selectable data objects), while remaining objects that do not satisfy the selected attribute value may be displayed as inactive objects (e.g., not selectable). In some embodiments, histograms may be regenerated for one or more attributes based upon the filtered data objects. For example, the histograms may be updated such that they reflect only the currently active objects that satisfy the currently activated filter(s), but not the currently inactive objects that do not satisfy the currently activated filter(s).
In some embodiments, a user interface element (e.g., a button) is created that corresponds to the generated filter. The user interface element may be used by the user to turn on or turn off the filter.
At block 1114, objects that satisfy the first filter (e.g., are associated with the first attribute value) are displayed as active objects, while objects that do not satisfy the first filter are displayed as inactive. In some embodiments, the active objects may also be selected. In addition, as described above, one or more attribute histograms may be updated or regenerated to reflect distributions of the currently active objects over different attribute values or attribute value ranges.
At block 1116, a second filter is generated based upon a second selected attribute value. In some embodiments, the second attribute value may be associated with a different attribute than the first attribute value.
At block 1118, objects that satisfy both the first and second filters are displayed as active objects, while objects that do not satisfy both the first and second filters may be displayed as inactive. In addition, as described above, one or more attribute histograms may be updated or regenerated to reflect distributions of the currently active objects over different attribute values or attribute value ranges.
At block 1120, an indication is received to turn off or deactivate the first filter. In some embodiments, this may comprise the user clicking on a user interface element (e.g., a button) associated with the first filter, in order to turn off the first filter.
At block 1122, objects that satisfy the second filter are displayed as active data objects, while objects that do not satisfy the second filter may be displayed as inactive. Because the first filter has been turned off, whether or not the objects satisfy the first filter is no longer considered. In addition, as described above, one or more attribute histograms may be updated or regenerated to reflect distributions of the currently active objects over different attribute values or attribute value ranges.
As such, the user is able to create a plurality of different filters based upon different attribute values associated with objects that the user wishes to analyze. By being able to activate and deactivate different filters without needing to adhere to the order in which the filters were created, the user may be able to more easily explore different sets of objects that have different combinations of attribute values.
Tiles, Tile Layers, and UTF Grid
A single map may be associated with hundreds, thousands, or even millions of objects. Because the processing required to handle large numbers of objects (e.g., identifying which objects/features are to be displayed, how they are to be displayed, and/or the like) may exceed the processing and/or memory capacity of client-side components, the composition of map tiles may be accomplished by server-side components of the interactive data object map system. The map tiles may then be transmitted to the client-side components of the interactive data object map system where they are composed into the map interface. In some embodiments, each composed map tile may comprise one or more static image files (e.g., a PNG file, or other image file type).
In some embodiments, each map tile may comprise one or more tile layers, wherein each tile layer corresponds to a different type of data. For example, a user may specify a base layer and one or more vector layers to be displayed (see, e.g.,
Vector tile layer 1206 may be used to display objects to be overlaid on the base tile layer 1204. For example, the user may select one or more vector layers to be displayed (e.g., as illustrated in
In some embodiments, for the purposes of transmitting to the client, multiple vector layers (e.g., multiple vector layers selected by a user) may be compressed into a single vector tile layer 1206. In other embodiments, each vector layer may be transmitted as its own separate vector tile layer.
In some embodiments, not all selected vector layers need to be considered when generating vector tile layer 1206. For example, a particular selected vector layer may only contain objects within a certain area of the map. As such, when generating vector tile layer 1206, only vector layers having objects within the area of the map tile need to be considered. In some embodiments, in order to be able to quickly eliminate vector layers that do not have to be considered when generating vector tile layer 1206 for a given map tile, a bounding box may be drawn around the objects of a vector layer, to determine an area covered by the vector layer. If the area of the bounding box is outside the area covered by the map tile, the vector layer is not considered when generating vector tile layer 1206 for the map tile.
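The bounding-box test described above reduces to an axis-aligned rectangle intersection, as sketched below with illustrative types.

```typescript
interface BBox { minX: number; minY: number; maxX: number; maxY: number; }

// Two axis-aligned boxes overlap unless one lies entirely to one side of
// the other on either axis.
function intersects(a: BBox, b: BBox): boolean {
  return a.minX <= b.maxX && b.minX <= a.maxX &&
         a.minY <= b.maxY && b.minY <= a.maxY;
}

// Only vector layers whose bounding box overlaps the map tile's extent are
// considered when generating vector tile layer 1206 for that tile.
function layersForTile<L extends { bbox: BBox }>(layers: L[], tile: BBox): L[] {
  return layers.filter((layer) => intersects(layer.bbox, tile));
}
```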
In some embodiments, a user may be able to select one or more of the displayed objects shown in vector tile layer 1206. When an object is selected, the icon or shape representing the data object may change color and/or shape in response to being selected. For example, a circular icon representing a person object may contain a ring that changes color when the object is selected. A blue line representing a section of river may, in response to a selection by the user, be overlaid using a thicker line of a different color. As it is possible for a user to select hundreds or thousands of objects at a time (e.g., by performing a large search or selecting over an area), it is often beneficial to keep track of which objects have been selected on the server side. As such, the modified icons/shapes caused by selection of the objects may be displayed in optional selection tile layer 1210. In some embodiments, other visual effects that may be applied on the map, such as a heatmap, may also be displayed in selection tile layer 1210.
In some embodiments, not all of the displayed objects are able to be interactive or selectable by the user. Objects that are not interactive or selectable may be referred to as being “inactive.” For example, a user may define a filter to exclude a subset of the displayed objects. As such, objects that are inactive may be displayed on optional inactive tile layer 1208 instead of on vector tile layer 1206. In some embodiments, a visual effect may be applied on the icon or shape representing an inactive object (e.g., greying out the icon representing the object), in order to differentiate the object from selectable objects on the map.
The various layers (base tile layer 1204, vector tile layer 1206, inactive tile layer 1208, and selection tile layer 1210) are generated at the server side and overlaid on each other to form completed map tiles, which may then be sent to the client system. An array of completed map tiles is assembled to form the map. In some embodiments, images corresponding to individual tile layers may be sent to the client system, whereupon the client system assembles the received tile layers into completed map tiles.
In some embodiments, user interactions with the displayed map may cause the tile layers to be updated. For example, if the user clicks a particular location on the map, a determination may be made as to whether the location corresponds to an icon or shape representing an object. In some embodiments, this determination may be made at the server side in response to the client sending coordinates of the click to the server. If it is determined that the location is associated with an object, the object may be selected, causing an update to one or more tile layers (e.g., selection tile layer 1210). In some embodiments, this may cause the server to generate one or more updated layers with which to create a re-composed tile. The re-composed tile comprising the updated tile layers may then be sent from the server to the client, to be displayed in the tile location.
In some embodiments, not all tile layers of a map tile need to be updated in response to a user interaction. For example, when a user selects a displayed object on a particular tile, the interaction may necessitate updates to the vector tile layer and/or selection tile layer associated with the tile. However, it is possible that the base tile layer and inactive tile layer associated with the tile do not need to be updated. As such, in some embodiments, one or more updated tile layers are sent from the server to the client, and used to replace one or more corresponding tile layers previously displayed at the location. In some embodiments, in order to maintain visual continuity, the previous tile layers are not replaced until all updated tile layers for the particular tile have been received.
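A client-side sketch of this partial update follows; updated layers are buffered and swapped in together only once all expected layers for the tile have arrived, preserving visual continuity. The class and method names are hypothetical.

```typescript
type TileLayerName = "base" | "vector" | "inactive" | "selection";

class TileView {
  private layers = new Map<TileLayerName, ImageBitmap>();
  private pending = new Map<TileLayerName, ImageBitmap>();

  // Called as each updated tile layer arrives from the server. "expected"
  // lists the layers the server re-composed for this interaction (e.g.,
  // only "vector" and "selection" after an object selection).
  receiveUpdatedLayer(name: TileLayerName, image: ImageBitmap, expected: TileLayerName[]): void {
    this.pending.set(name, image);
    if (expected.every((n) => this.pending.has(n))) {
      // Swap all updated layers at once; unchanged layers are kept as-is.
      for (const [n, img] of this.pending) this.layers.set(n, img);
      this.pending.clear();
      this.redraw();
    }
  }

  private redraw(): void {
    // Composite this.layers in z-order (base, vector, inactive, selection)
    // onto the tile's position in the map interface.
  }
}
```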
In some embodiments, at least a portion of the determinations may be performed by the client instead of the server. For example, in some embodiments, because the user may only highlight a single object at a time, highlighting an object may be performed by the client instead of by the server. As such, highlighting objects on the map may be performed nearly instantaneously, and provides useful feedback that enhances the interactivity of the map system.
In some embodiments, in order to determine whether a particular location on the map corresponds to a data object or feature, a UTF grid may be used.
Contiguous regions of characters in the UTF grid indicate the bounds of a particular object, and may be used by the client-side components to provide the object highlighting and/or outlining. For example, when a user hovers a mouse pointer over an object on a map tile, the map system determines the character and portion of the UTF grid associated with the pixel hovered over, draws an object outline based on the UTF grid, and may additionally access metadata associated with the object based on the object identifier associated with the object. For example, the object identifier is sent to a back-end server, where the identifier is used to identify the object and retrieve metadata associated with the object to be transmitted to the client. In some embodiments, characters in the UTF grid may also be associated with certain types of metadata associated with the object (e.g., object name), allowing the metadata to be displayed without having to first retrieve the metadata using the object identifier from the back-end server. This allows the metadata to be displayed almost instantly when the user selects or highlights the object. In an embodiment, the UTF grid is sent to the client-side components in a JSON (JavaScript Object Notation) format.
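A sketch of the client-side lookup follows, using the layout common to UTFGrid implementations (a "grid" of character rows, a "keys" array of object identifiers, and optional per-object "data"); the exact encoding used by a given embodiment may differ.

```typescript
// A UTF grid for one map tile, in the common UTFGrid JSON layout.
interface UtfGrid {
  grid: string[];                  // rows of characters, one cell per character
  keys: string[];                  // keys[0] is "" (no object under the cell)
  data?: Record<string, unknown>;  // optional metadata keyed by object identifier
}

// Standard UTFGrid character decoding: the encoding skips the codepoints
// that are unsafe in JSON strings (34 and 92).
function decodeChar(ch: string): number {
  let c = ch.charCodeAt(0);
  if (c >= 93) c -= 1;
  if (c >= 35) c -= 1;
  return c - 32;
}

// Map a pixel position within the tile to the object identifier under it;
// a cell size of 4 pixels per character is a common convention.
function objectIdAt(g: UtfGrid, px: number, py: number, cellSize = 4): string {
  const row = g.grid[Math.floor(py / cellSize)];
  return g.keys[decodeChar(row.charAt(Math.floor(px / cellSize)))];
}
```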
For example,
At block 1502, the client-side components of the map system detect that the user is hovering over and/or touching an object in the user interface. At block 1504, and as described above, the client-side components may access the UTF grid to determine the object identifier and object boundaries associated with the hovered-over object. Then, at block 1506, the client-side components may render the object shape on the image or map interface. The object shape may be rendered as an outline and/or other highlighting.
At block 1508, the client-side components detect whether the user has selected the object. Objects may be selected, for example, if the user clicks on and/or touches the object. If the user has selected the object, then at block 1510, the client-side components query the server-side components to retrieve metadata associated with the selected object. In an embodiment, querying of the server-side components may include transmitting the object identifier associated with the selected object to the server, the server retrieving from a database the relevant metadata, and the server transmitting the retrieved metadata back to the client-side components. In other embodiments, only a location of the selection needs to be sent to the server, whereupon the server may identify the object based upon the selection location.
At block 1512, the metadata may be received by the client-side components and displayed to the user. For example, the metadata associated with the selected object may be displayed to the user in the user interface in a dedicated metadata window, among other possibilities. In some embodiments, the metadata may be used as part of one or more aggregations or combinations that are displayed to the user. For example, the metadata may be used to generate one or more attribute histograms. In some embodiments, the metadata is aggregated and processed at the server side (e.g., to create a histogram), wherein the processed metadata (e.g., histogram data) is then transmitted to the client.
In an embodiment, one or more blocks in
Server-side operations of the map system may include composing and updating the map tiles that make up the map interface. For example, when the user changes the selection of the base layer and/or one or more of the vector layers, the map tiles are re-composed and updated in the map interface to reflect the user's selection. Selection of objects resulting in highlighting of those objects may also involve re-composition of the map tiles. Further, UTF grids may be generated by the server-side components for each map tile composed.
At block 1602, the user interface is provided to the user. At block 1604 an input from the user is received. Inputs received from the user that may result in server-side operations may include, for example, an object selection (1604-1), a change in layer selection (1604-2), a geosearch (1604-3), generating a heatmap (1604-4), searching from the search box (1604-5), panning or zooming the map interface (1604-6), and/or generating one or more filters (1604-7), among others.
At block 1606, the client-side components of the map system may query the server-side components in response to any of inputs 1604-1, 1604-2, 1604-3, 1604-4, 1604-5, 1604-6, and 1604-7 from the user. The server-side components then update and re-compose the map tiles and UTF grids of the map interface in accordance with the user input (as described above in reference to
At block 1608, the client-side components receive the updated map tile information from the server, and at block 1610 the user interface is updated with the received information.
In an embodiment, additional information and/or data, in addition to updated map tiles, may be transmitted to the client-side components from the server-side components. For example, object metadata may be transmitted in response to a user selecting an object.
In an embodiment, one or more blocks in
At block 1630, a query is received by the server-side components from the client-side components. Such a query may originate, for example, at block 1616 of
For example, the map tile may comprise a plurality of overlaid tile layers (e.g., as illustrated in
At block 1634, the map system determines whether the layers necessary to compose the requested map tiles are cached. For example, when a layer is selected by the user, that layer may be composed by the map system and placed in a memory of the server-side components for future retrieval. Caching of composed layers may obviate the need for recomposing those layers later, which advantageously may save time and/or processing power.
If the required layers are cached, then at block 1640 the layers are composed into the requested map tiles and, at block 1642, transmitted to the client-side components.
When the required layers are not cached, at block 1636, the server-side components calculate and/or compose the requested layer and/or layers, and may then, at block 1638, optionally cache the newly composed layers for future retrieval. Then, at blocks 1640 and 1642, the layers are composed into map tiles and provided to the client-side components. In some embodiments, the layers are composed into tile layers. For example, multiple selected vector layers may be composed into a single vector tile layer. In some embodiments, the tile layers (e.g., a base tile layer, a vector tile layer, a selection tile layer, and/or an inactive tile layer) are composed into map tiles by the server and provided to the client-side components. In other embodiments, the tile layers are provided to the client-side components, which use them to compose the map tiles (e.g., by overlaying the tile layers for a particular map tile on top of each other to form the map tile).
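The cache check at blocks 1634-1642 might be organized as in the following server-side sketch; the key scheme and names are illustrative assumptions.

```typescript
// Composed layers cached under a key derived from the layer identity and
// tile coordinates, so later requests for the same tile skip re-composition.
const layerCache = new Map<string, Uint8Array>();

async function getLayer(
  layerId: string,
  z: number,
  x: number,
  y: number,
  compose: () => Promise<Uint8Array>, // expensive layer composition step
): Promise<Uint8Array> {
  const key = `${layerId}/${z}/${x}/${y}`;
  const cached = layerCache.get(key);
  if (cached) return cached;           // block 1634: layer found in cache
  const composed = await compose();    // block 1636: calculate/compose layer
  layerCache.set(key, composed);       // block 1638: optionally cache it
  return composed;                     // blocks 1640-1642: compose into tiles
}
```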
In an embodiment, tile layers and/or entire map tiles may be cached by the server-side components. In an embodiment, the size and/or quality of the map tiles that make up the map interface may be selected and/or dynamically selected based on at least one of: the bandwidth available for transmitting the map tiles to the client-side components, the size of the map interface, and/or the complexity of the layer composition, among other factors. In an embodiment, the map tiles comprise images, for example, in one or more of the following formats: PNG, GIF, JPEG, TIFF, BMP, and/or any other type of appropriate image format.
In an embodiment, the layer and object data composed into layers and map tiles comprises vector data. The vector data (for example, object data) may include associated metadata, as described above. In an embodiment, the vector, layer, and/or object data and associated metadata may originate from one or more databases and/or electronic data stores.
In an embodiment, the map system may display more than 50 million selectable features to a user simultaneously. In an embodiment, the map system may support tens or hundreds of concurrent users accessing the same map and object data. In an embodiment, map and object data used by the map system may be mirrored and/or spread across multiple computers, servers, and/or server-side components.
In an embodiment, rather than updating the map tiles to reflect a selection by the user of one or more objects, the map system may show an approximation of the selection to the user based on client-side processing.
In an embodiment, icons and/or styles associated with various objects in the map interface may be updated and/or changed by the user. For example, the styles of the various objects may be specified in or by a style data file. The style data file may be formatted according to a particular format or standard readable by the map system. In an embodiment, the style data file is formatted according to the JSON format standard. The user may thus change the look of the objects and shapes rendered in the map interface of the map system by changing the style data file. The style data file may further define the look of objects and terrain (among other items and data) at various zoom levels.
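By way of illustration only, a style data file of the kind described might take the following shape; the field names and rule-matching behavior are assumptions, not a format defined by this disclosure.

```typescript
// Illustrative per-object-type style rules, optionally varying by zoom level.
interface StyleRule {
  objectType: string;    // e.g., "road", "building", "person"
  minZoom?: number;      // rule applies at or above this zoom level
  icon?: string;         // icon name for point objects
  strokeColor?: string;  // e.g., "#666666"
  strokeWidth?: number;
  fillColor?: string;
}

// Parsed from a JSON style data file; editing the file changes the look
// of objects and shapes rendered in the map interface.
const styleRules: StyleRule[] = [
  { objectType: "road", strokeColor: "#999999", strokeWidth: 1 },
  { objectType: "road", minZoom: 14, strokeColor: "#666666", strokeWidth: 3 },
  { objectType: "person", icon: "person-circle" },
];

// Pick the last rule that matches the object type at the current zoom.
function styleFor(objectType: string, zoom: number): StyleRule | undefined {
  const matching = styleRules.filter(
    (r) => r.objectType === objectType && (r.minZoom === undefined || zoom >= r.minZoom),
  );
  return matching[matching.length - 1];
}
```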
In an embodiment, objects, notes, metadata, and/or other types of data may be added to the map system by the user through the user interface. In an embodiment, user added information may be shared between multiple users of the map system. In an embodiment, a user of the map system may add annotations and shapes to the map interface that may be saved and shared with other users. In an embodiment, a user of the map system may share a selection of objects with one or more other users.
In an embodiment, the user interface of the map system may include a timeline window. The timeline window may enable the user to view objects and layers specific to particular moments in time and/or time periods. In an embodiment, the user may view tolerance ellipses overlaid on the map interface indicating the likely position of an object across a particular time period.
In an embodiment, the map system may include elevation profiling. Elevation profiling may allow a user of the system to determine the elevation along a path on the map interface, to perform a viewshed analysis (determine objects and/or terrain viewable from a particular location), and to perform a reverse-viewshed analysis (for a particular location, determine objects and/or terrain that may view the location), among other analyses.
In an embodiment, vector data, object data, metadata, and/or other types of data may be prepared before it is entered into or accessed by the map system. For example, the data may be converted from one format to another, may be crawled for common items of metadata, and/or may be prepared for application of a style file or style information, among other actions. In an embodiment, a layer ontology may be automatically generated based on a group of data. In an embodiment, the map system may access common data sources available on the Internet, for example, road data available from openstreetmap.org.
In an embodiment, roads shown in the map interface are labeled with their names, and buildings are rendered in faux-3D to indicate the building heights. In an embodiment, Blue Force Tracking may be integrated into the map system as a layer with the characteristics of both a static vector layer and a dynamic selection layer. A Blue Force layer may enable the use of the map system for live operational analysis. In an embodiment, the map system may quickly render detailed choropleths or heatmaps with minimal data transfer. For example, the system may render a choropleth with a property value on the individual shapes of the properties themselves, rather than aggregating this information on a county or zip code level.
Advantageously, the map system displays many items of data, objects, features, and/or layers in a single map interface. A user may easily interact with features on the map and gather information by hovering over or selecting features, even though those features may not be labeled. The user may select features, may “drill down” on a particular type of feature (for example, roads), may view features through histograms, may use histograms to determine common characteristics (for example, determine the most common speed limit), and/or may determine correlations among features (for example, see that slower speed limit areas are centered around schools). Further, the map system may be useful in many different situations. For example, the system may be useful to operational planners and/or disaster relief personnel.
Additionally, the map system embodies at least three core ideas: providing a robust and fast back-end (server-side) renderer, keeping data on the back-end, and only transferring the data necessary to have interactivity. In one embodiment, the primary function of the server-side components is rendering map tiles. The server is capable of drawing very detailed maps with a variety of styles that can be based on vector metadata. Rendered map tiles for a vector layer may be cached, and several of these layer tiles are drawn on top of one another to produce the final tile that is sent to the client-side browser. Map tile rendering is fast enough for displaying dynamic tiles for selection and highlight to the user. Server-side operations allow for dynamic selections of very large numbers of features, calculation of the histogram, determining the number of items shown and/or selected, and drawing the selection, for example. Further, the heatmap may include large numbers of points without incurring the cost of transferring those points to the client-side browser. Additionally, transferring only as much data as necessary to have interactivity enables quick server rendering of dynamic selections and vector layers. On the other hand, highlighting hovered-over features may be performed client-side nearly instantaneously, and provides useful feedback that enhances the interactivity of the map system. In an embodiment, to avoid transferring too much geometric data, the geometries of objects (in the map tiles and UTF grid) are down-sampled depending on how zoomed in the user is to the map interface. Thus, map tiles may be rendered and presented to a user of the map system in a dynamic and useable manner.
Object Centric Data Model
To provide a framework for the following discussion of specific systems and methods described above and below, an example database system 1710 using an ontology 1705 will now be described. This description is provided for the purpose of providing an example and is not intended to limit the techniques to the example data model, the example database system, or the example database system's use of an ontology to represent information.
In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 1705. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 1709 based on the ontology 1705. For example, each object of the conceptual data model may correspond to one or more rows in a relational database or an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.
Different types of data objects may have different property types. For example, a “Person” data object might have an “Eye Color” property type and an “Event” data object might have a “Date” property type. Each property 1703 as represented by data in the database system 1710 may have a property type defined by the ontology 1705 used by the database 1709.
Objects may be instantiated in the database 1709 in accordance with the corresponding object definition for the particular object in the ontology 1705. For example, a specific monetary payment (e.g., an object of type “event”) of US$30.00 (e.g., a property of Type “currency”) taking place on Mar. 27, 2009 (e.g., a property of type “date”) may be stored in the database 1709 as an event object with associated currency and date properties as defined within the ontology 1705.
The data objects defined in the ontology 1705 may support property multiplicity. In particular, a data object 1701 may be allowed to have more than one property 1703 of the same property type. For example, a “Person” data object might have multiple “Address” properties or multiple “Name” properties.
Each link 1702 represents a connection between two data objects 1701. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical. For example, “Person” data object A may be connected to “Person” data object B by a “Child Of” relationship (where “Person” data object B has an asymmetric “Parent Of” relationship to “Person” data object A), a “Kin Of” symmetric relationship to “Person” data object C, and an asymmetric “Member Of” relationship to “Organization” data object X. The type of relationship between two data objects may vary depending on the types of the data objects. For example, “Person” data object A may have an “Appears In” relationship with “Document” data object Y or have a “Participate In” relationship with “Event” data object E. As an example of an event connection, two “Person” data objects may be connected by an “Airline Flight” data object representing a particular airline flight if they traveled together on that flight, or by a “Meeting” data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for example, an “Appears In” relationship.
As an example of a matching properties connection, two “Person” data objects representing a brother and a sister, may both have an “Address” property that indicates where they live. If the brother and the sister live in the same home, then their “Address” properties likely contain similar, if not identical property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link and other types of connections may be represented; embodiments are not limited to any particular types of connections between data objects. For example, a document might contain references to two different objects. For example, a document may contain a reference to a payment (one object), and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.
Each data object 1701 can have multiple links with another data object 1701 to form a link set 1704. For example, two “Person” data objects representing a husband and a wife could be linked through a “Spouse Of” relationship, a matching “Address” property, and one or more matching “Event” properties (e.g., a wedding). Each link 1702 as represented by data in a database may have a link type defined by the database ontology used by the database.
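The object-centric model described above may be summarized by the following minimal sketch; the type names are illustrative, and the reference numerals in the comments refer to the discussion above.

```typescript
// A property 1703: a typed value; multiple properties of the same type are
// allowed on one object (property multiplicity).
interface Property {
  type: string;   // e.g., "Address", "Name", "Date"
  value: unknown;
}

// A data object 1701 instantiated according to an object type in the ontology.
interface DataObject {
  id: string;
  objectType: string;      // e.g., "Person", "Event", "Document"
  properties: Property[];
}

// A link 1702 connecting two data objects through a relationship, an event,
// or matching properties; relationships may be symmetric or asymmetric.
interface Link {
  type: string;            // e.g., "Spouse Of", "Child Of", "Appears In"
  from: string;            // id of the first data object
  to: string;              // id of the second data object
  symmetric?: boolean;     // e.g., true for "Kin Of", false for "Parent Of"
}

// A link set 1704: all links connecting the same pair of data objects.
function linkSet(links: Link[], a: string, b: string): Link[] {
  return links.filter(
    (l) => (l.from === a && l.to === b) || (l.from === b && l.to === a),
  );
}
```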
In accordance with the discussion above, the example ontology 1705 comprises stored information providing the data model of data stored in database 1709, and the ontology is defined by one or more object types 1810, one or more property types 1816, and one or more link types 1830. Based on information determined by the parser 1802 or other mapping of source input information to object type, one or more data objects 1701 may be instantiated in the database 1709 based on respective determined object types 1810, and each of the objects 1701 has one or more properties 1703 that are instantiated based on property types 1816. Two data objects 1701 may be connected by one or more links 1702 that may be instantiated based on link types 1830. The property types 1816 each may comprise one or more data types 1818, such as a string, number, etc. Property types 1816 may be instantiated based on a base property type 1820. For example, a base property type 1820 may be “Locations” and a property type 1816 may be “Home.”
In an embodiment, a user of the system uses an object type editor 1824 to create and/or modify the object types 1810 and define attributes of the object types. In an embodiment, a user of the system uses a property type editor 1826 to create and/or modify the property types 1816 and define attributes of the property types. In an embodiment, a user of the system uses link type editor 1828 to create the link types 1830. Alternatively, other programs, processes, or programmatic controls may be used to create link types and property types and define attributes, and using editors is not required.
In an embodiment, creating a property type 1816 using the property type editor 1826 involves defining at least one parser definition using a parser editor 1822. A parser definition comprises metadata that informs parser 1802 how to parse input data 1800 to determine whether values in the input data can be assigned to the property type 1816 that is associated with the parser definition. In an embodiment, each parser definition may comprise a regular expression parser 1804A or a code module parser 1804B. In other embodiments, other kinds of parser definitions may be provided using scripts or other programmatic elements. Once defined, both a regular expression parser 1804A and a code module parser 1804B can provide input to parser 1802 to control parsing of input data 1800.
Using the data types defined in the ontology, input data 1800 may be parsed by the parser 1802 to determine which object type 1810 should receive data from a record created from the input data, and which property types 1816 should be assigned to data from individual field values in the input data. Based on the object-property mapping 1801, the parser 1802 selects one of the parser definitions that is associated with a property type in the input data. The parser parses an input data field using the selected parser definition, resulting in creating new or modified data 1803. The new or modified data 1803 is added to the database 1709 according to ontology 1705 by storing values of the new or modified data in a property of the specified property type. As a result, input data 1800 having varying format or syntax can be created in database 1709. The ontology 1705 may be modified at any time using object type editor 1824, property type editor 1826, and link type editor 1828, or under program control without human use of an editor. Parser editor 1822 enables creating multiple parser definitions that can successfully parse input data 1800 having varying format or syntax and determine which property types should be used to transform input data 1800 into new or modified data 1803.
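A regular-expression parser definition of the kind described might be sketched as follows; the patterns and names are illustrative assumptions.

```typescript
// Each parser definition maps field values matching its pattern to a
// property type (cf. regular expression parser 1804A).
interface ParserDefinition {
  propertyType: string;
  pattern: RegExp;
}

const parserDefinitions: ParserDefinition[] = [
  { propertyType: "date", pattern: /^\d{4}-\d{2}-\d{2}$/ },      // e.g., "2009-03-27"
  { propertyType: "currency", pattern: /^US\$\d+(\.\d{2})?$/ },  // e.g., "US$30.00"
];

// The parser tries each definition against an input field value and assigns
// the property type of the first definition whose pattern matches.
function assignPropertyType(fieldValue: string): string | undefined {
  return parserDefinitions.find((d) => d.pattern.test(fieldValue))?.propertyType;
}
```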
The properties, objects, and links (e.g., relationships) between the objects can be visualized using a graphical user interface (GUI). For example,
For example, in
Relationships between data objects may be stored as links, or in some embodiments, as properties, where a relationship may be detected between the properties. In some cases, as stated above, the links may be directional. For example, a payment link may have a direction associated with the payment, where one person object is the receiver of a payment, and another person object is the payer of the payment.
In various embodiments, data objects may further include geographical metadata and/or links. Such geographical metadata may be accessed by the interactive data object map system for displaying objects and features on the map interface (as described above). In some embodiments, geographical metadata may be associated with specific properties of a data object (e.g., an object corresponding to a flight may have an origin location and a destination location).
In addition to visually showing relationships between the data objects, the user interface may allow various other manipulations. For example, the objects within database 1709 may be searched using a search interface 1950 (e.g., text string matching of object properties), inspected (e.g., properties and associated data viewed), filtered (e.g., narrowing the universe of objects into sets and subsets by properties or relationships), and statistically aggregated (e.g., numerically summarized based on summarization criteria), among other operations and visualizations. Additionally, as described above, objects within database 1709 may be searched, accessed, and implemented in the map interface of the interactive data object map system via, for example, a geosearch and/or radius search.
Implementation Mechanisms
According to an embodiment, the interactive data object map system and other methods and techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.
Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.
For example,
Computer system 2000 also includes a main memory 2006, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 2002 for storing information and instructions to be executed by processor 2004. Main memory 2006 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 2004. Such instructions, when stored in storage media accessible to processor 2004, render computer system 2000 into a special-purpose machine that is customized to perform the operations specified in the instructions.
Computer system 2000 further includes a read only memory (ROM) 2008 or other static storage device coupled to bus 2002 for storing static information and instructions for processor 2004. A storage device 2010, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 2002 for storing information and instructions.
Computer system 2000 may be coupled via bus 2002 to a display 2012, such as a cathode ray tube (CRT), LCD display, or touch screen display, for displaying information to a computer user and/or receiving input from the user. An input device 2014, including alphanumeric and other keys, is coupled to bus 2002 for communicating information and command selections to processor 2004. Another type of user input device is cursor control 2016, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 2004 and for controlling cursor movement on display 2012. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.
Computing system 2000 may include a user interface module, and/or various other types of modules to implement a GUI, a map interface, and the various other aspects of the interactive data object map system. The modules may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
Computer system 2000 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 2000 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 2000 in response to processor(s) 2004 executing one or more sequences of one or more modules and/or instructions contained in main memory 2006. Such instructions may be read into main memory 2006 from another storage medium, such as storage device 2010. Execution of the sequences of instructions contained in main memory 2006 causes processor(s) 2004 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 2010. Volatile media includes dynamic memory, such as main memory 2006. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 2002. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 2004 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions and/or modules into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 2000 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 2002. Bus 2002 carries the data to main memory 2006, from which processor 2004 retrieves and executes the instructions. The instructions received by main memory 2006 may optionally be stored on storage device 2010 either before or after execution by processor 2004.
Computer system 2000 also includes a communication interface 2018 coupled to bus 2002. Communication interface 2018 provides a two-way data communication coupling to a network link 2020 that is connected to a local network 2022. For example, communication interface 2018 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 2018 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 2018 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 2020 typically provides data communication through one or more networks to other data devices. For example, network link 2020 may provide a connection through local network 2022 to a host computer 2024 or to data equipment operated by an Internet Service Provider (ISP) 2026. ISP 2026 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 2028. Local network 2022 and Internet 2028 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 2020 and through communication interface 2018, which carry the digital data to and from computer system 2000, are example forms of transmission media.
Computer system 2000 can send messages and receive data, including program code, through the network(s), network link 2020 and communication interface 2018. In the Internet example, a server 2030 might transmit a requested code for an application program through Internet 2028, ISP 2026, local network 2022 and communication interface 2018. Server-side components of the interactive data object map system described above (for example, with reference to
The computer system 2000, on the other hand, may implement the client-side components of the map system as described above (for example, with reference to
In an embodiment, the map system may be accessible by the user through a web-based viewer, such as a web browser. In this embodiment, the map interface may be generated by the server 2030 and/or the computer system 2000 and transmitted to the web browser of the user. The user may then interact with the map interface through the web browser. In an embodiment, the computer system 2000 may comprise a mobile electronic device, such as a cell phone, smartphone, and/or tablet. The map system may be accessible by the user through such a mobile electronic device, among other types of electronic devices.
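For purposes of illustration only, the following is a minimal, non-limiting sketch of the client-server division described above: a server (such as server 2030) generates and transmits a map interface page to the user's web browser, and the client-side viewer then requests object metadata over HTTP. The endpoint paths, the MAP_PAGE payload, and the sample metadata below are hypothetical and are not drawn from any particular embodiment; Python's standard library is used solely for concreteness.

```python
# Hypothetical sketch of the server-side/client-side split described above.
# Endpoint paths and payloads are illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder for a generated map interface page transmitted to the browser.
MAP_PAGE = b"<html><body><div id='map'>Interactive map placeholder</div></body></html>"

# Hypothetical object metadata keyed by object identifier.
OBJECT_METADATA = {"42": {"name": "Example object", "layer": "landmarks"}}

class MapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            # Serve the generated map interface to the web-based viewer.
            self._send(200, "text/html", MAP_PAGE)
        elif self.path.startswith("/objects/"):
            # The client-side viewer requests metadata for a selected object.
            obj_id = self.path.rsplit("/", 1)[-1]
            meta = OBJECT_METADATA.get(obj_id)
            if meta is None:
                self._send(404, "application/json", b'{"error": "not found"}')
            else:
                self._send(200, "application/json", json.dumps(meta).encode())
        else:
            self._send(404, "text/plain", b"not found")

    def _send(self, status, content_type, body):
        # Write status line, headers, and body for a single HTTP response.
        self.send_response(status)
        self.send_header("Content-Type", content_type)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), MapHandler).serve_forever()
```

In such a sketch, the same handler could equally run on the computer system 2000 itself; nothing in the embodiments described above restricts which machine generates the map interface.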
Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.
The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
The term “comprising” as used herein should be given an inclusive rather than exclusive interpretation. For example, a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.
The term “a” as used herein should be given an inclusive rather than exclusive interpretation. For example, unless specifically noted, the term “a” should not be understood to mean “exactly one” or “one and only one”; instead, the term “a” means “one or more” or “at least one,” whether used in the claims or elsewhere in the specification and regardless of uses of quantifiers such as “at least one,” “one or more,” or “a plurality” elsewhere in the claims or specification.
Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached Figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.
Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57. This application claims benefit of U.S. patent application Ser. No. 14/934,004 entitled “Interactive Geospatial Map” filed Nov. 5, 2015 and U.S. Provisional Patent Application Ser. No. 62/206,174 entitled “Interactive Geospatial Map” filed Aug. 17, 2015, both of which are hereby incorporated by reference in their entireties and for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
4899161 | Morin et al. | Feb 1990 | A |
4958305 | Piazza | Sep 1990 | A |
5109399 | Thompson | Apr 1992 | A |
5241625 | Epard et al. | Aug 1993 | A |
5283562 | Kaneko | Feb 1994 | A |
5329108 | Lamoure | Jul 1994 | A |
5623590 | Becker et al. | Apr 1997 | A |
5632009 | Rao et al. | May 1997 | A |
5670987 | Doi et al. | Sep 1997 | A |
5754182 | Kobayashi | May 1998 | A |
5781195 | Marvin | Jul 1998 | A |
5781704 | Rossmo | Jul 1998 | A |
5798769 | Chiu et al. | Aug 1998 | A |
5845300 | Comer | Dec 1998 | A |
5884217 | Koyanagi | Mar 1999 | A |
5925091 | Ando | Jul 1999 | A |
5936631 | Yano | Aug 1999 | A |
5999911 | Berg et al. | Dec 1999 | A |
6055569 | O'Brien et al. | Apr 2000 | A |
6057757 | Arrowsmith et al. | May 2000 | A |
6065026 | Cornelia et al. | May 2000 | A |
6091956 | Hollenberg | Jul 2000 | A |
6157747 | Szeliski et al. | Dec 2000 | A |
6161098 | Wallman | Dec 2000 | A |
6169552 | Endo | Jan 2001 | B1 |
6173067 | Payton et al. | Jan 2001 | B1 |
6178432 | Cook et al. | Jan 2001 | B1 |
6219053 | Tachibana et al. | Apr 2001 | B1 |
6232971 | Haynes | May 2001 | B1 |
6237138 | Hameluck et al. | May 2001 | B1 |
6243706 | Moreau et al. | Jun 2001 | B1 |
6247019 | Davies | Jun 2001 | B1 |
6279018 | Kudrolli et al. | Aug 2001 | B1 |
6338066 | Martin et al. | Jan 2002 | B1 |
6341310 | Leshem et al. | Jan 2002 | B1 |
6366933 | Ball et al. | Apr 2002 | B1 |
6369835 | Lin | Apr 2002 | B1 |
6370538 | Lamping et al. | Apr 2002 | B1 |
6389289 | Voce et al. | May 2002 | B1 |
6414683 | Gueziec | Jul 2002 | B1 |
6430305 | Decker | Aug 2002 | B1 |
6456997 | Shukla | Sep 2002 | B1 |
6483509 | Rabenhorst | Nov 2002 | B1 |
6523019 | Borthwick | Feb 2003 | B1 |
6529900 | Patterson et al. | Mar 2003 | B1 |
6549944 | Weinberg et al. | Apr 2003 | B1 |
6560620 | Ching | May 2003 | B1 |
6581068 | Bensoussan et al. | Jun 2003 | B1 |
6584498 | Nguyen | Jun 2003 | B2 |
6594672 | Lampson et al. | Jul 2003 | B1 |
6631496 | Li et al. | Oct 2003 | B1 |
6642945 | Sharpe | Nov 2003 | B1 |
6662103 | Skolnick et al. | Dec 2003 | B1 |
6665683 | Meltzer | Dec 2003 | B1 |
6674434 | Chojnacki et al. | Jan 2004 | B1 |
6714936 | Nevin, III | Mar 2004 | B1 |
6742033 | Smith et al. | May 2004 | B1 |
6757445 | Knopp | Jun 2004 | B1 |
6775675 | Nwabueze et al. | Aug 2004 | B1 |
6792422 | Stride et al. | Sep 2004 | B1 |
6820135 | Dingman | Nov 2004 | B1 |
6828920 | Owen et al. | Dec 2004 | B2 |
6839745 | Dingari et al. | Jan 2005 | B1 |
6850317 | Mullins et al. | Feb 2005 | B2 |
6877137 | Rivette et al. | Apr 2005 | B1 |
6944821 | Bates et al. | Sep 2005 | B1 |
6967589 | Peters | Nov 2005 | B1 |
6976210 | Silva et al. | Dec 2005 | B1 |
6978419 | Kantrowitz | Dec 2005 | B1 |
6980984 | Huffman et al. | Dec 2005 | B1 |
6983203 | Wako | Jan 2006 | B1 |
6985950 | Hanson et al. | Jan 2006 | B1 |
7003566 | Codella et al. | Feb 2006 | B2 |
7036085 | Barros | Apr 2006 | B2 |
7043702 | Chi et al. | May 2006 | B2 |
7055110 | Kupka et al. | May 2006 | B2 |
7086028 | Davis et al. | Aug 2006 | B1 |
7103852 | Kairis, Jr. | Sep 2006 | B2 |
7139800 | Bellotti et al. | Nov 2006 | B2 |
7149366 | Sun | Dec 2006 | B1 |
7158878 | Rasmussen et al. | Jan 2007 | B2 |
7162475 | Ackerman | Jan 2007 | B2 |
7168039 | Bertram | Jan 2007 | B2 |
7171427 | Witowski et al. | Jan 2007 | B2 |
7174377 | Bernard et al. | Feb 2007 | B2 |
7194680 | Roy et al. | Mar 2007 | B1 |
7213030 | Jenkins | May 2007 | B1 |
7269786 | Malloy et al. | Sep 2007 | B1 |
7278105 | Kitts | Oct 2007 | B1 |
7290698 | Poslinski et al. | Nov 2007 | B2 |
7333998 | Heckerman et al. | Feb 2008 | B2 |
7370047 | Gorman | May 2008 | B2 |
7375732 | Arcas | May 2008 | B2 |
7379811 | Rasmussen et al. | May 2008 | B2 |
7379903 | Caballero et al. | May 2008 | B2 |
7392254 | Jenkins | Jun 2008 | B1 |
7426654 | Adams et al. | Sep 2008 | B2 |
7441182 | Beilinson et al. | Oct 2008 | B2 |
7441219 | Perry et al. | Oct 2008 | B2 |
7454466 | Bellotti et al. | Nov 2008 | B2 |
7457706 | Malero et al. | Nov 2008 | B2 |
7467375 | Tondreau et al. | Dec 2008 | B2 |
7487139 | Fraleigh et al. | Feb 2009 | B2 |
7502786 | Liu et al. | Mar 2009 | B2 |
7519470 | Brasche et al. | Apr 2009 | B2 |
7525422 | Bishop et al. | Apr 2009 | B2 |
7529195 | Gorman | May 2009 | B2 |
7529727 | Arning et al. | May 2009 | B2 |
7529734 | Dirisala | May 2009 | B2 |
7539666 | Ashworth et al. | May 2009 | B2 |
7558677 | Jones | Jun 2009 | B2 |
7558822 | Fredricksen et al. | Jul 2009 | B2 |
7574409 | Patinkin | Aug 2009 | B2 |
7574428 | Leiserowitz et al. | Aug 2009 | B2 |
7579965 | Bucholz | Aug 2009 | B2 |
7596285 | Brown et al. | Sep 2009 | B2 |
7614006 | Molander | Nov 2009 | B2 |
7617232 | Gabbert et al. | Nov 2009 | B2 |
7617314 | Bansod et al. | Nov 2009 | B1 |
7620628 | Kapur et al. | Nov 2009 | B2 |
7627812 | Chamberlain et al. | Dec 2009 | B2 |
7634717 | Chamberlain et al. | Dec 2009 | B2 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7663621 | Allen et al. | Feb 2010 | B1 |
7693816 | Nemoto et al. | Apr 2010 | B2 |
7703021 | Flam | Apr 2010 | B1 |
7706817 | Bamrah et al. | Apr 2010 | B2 |
7712049 | Williams et al. | May 2010 | B2 |
7716077 | Mikurak | May 2010 | B1 |
7725530 | Sah et al. | May 2010 | B2 |
7725547 | Albertson et al. | May 2010 | B2 |
7730082 | Sah et al. | Jun 2010 | B2 |
7730109 | Rohrs et al. | Jun 2010 | B2 |
7747749 | Erikson et al. | Jun 2010 | B1 |
7756843 | Palmer | Jul 2010 | B1 |
7765489 | Shah | Jul 2010 | B1 |
7770100 | Chamberlain et al. | Aug 2010 | B2 |
7791616 | Ioup et al. | Sep 2010 | B2 |
7805457 | Viola et al. | Sep 2010 | B1 |
7809703 | Balabhadrapatruni et al. | Oct 2010 | B2 |
7818658 | Chen | Oct 2010 | B2 |
7870493 | Pall et al. | Jan 2011 | B2 |
7872647 | Mayer et al. | Jan 2011 | B2 |
7877421 | Berger et al. | Jan 2011 | B2 |
7880921 | Dattilo et al. | Feb 2011 | B2 |
7890850 | Bryar et al. | Feb 2011 | B1 |
7894984 | Rasmussen et al. | Feb 2011 | B2 |
7899611 | Downs et al. | Mar 2011 | B2 |
7899796 | Borthwick et al. | Mar 2011 | B1 |
7917376 | Bellin et al. | Mar 2011 | B2 |
7920963 | Jouline et al. | Apr 2011 | B2 |
7933862 | Chamberlain et al. | Apr 2011 | B2 |
8085268 | Carrino et al. | Apr 2011 | B2 |
7941321 | Greenstein et al. | May 2011 | B2 |
7941336 | Robin-Jan | May 2011 | B1 |
7945852 | Pilskains | May 2011 | B1 |
7949960 | Roessler et al. | May 2011 | B2 |
7958147 | Turner et al. | Jun 2011 | B1 |
7962281 | Rasmussen et al. | Jun 2011 | B2 |
7962495 | Jain et al. | Jun 2011 | B2 |
7962848 | Bertram | Jun 2011 | B2 |
7966199 | Fresher | Jun 2011 | B1 |
7970240 | Chao et al. | Jun 2011 | B1 |
7971150 | Raskutti et al. | Jun 2011 | B2 |
7984374 | Caro et al. | Jun 2011 | B2 |
8001465 | Kudrolli et al. | Aug 2011 | B2 |
8001482 | Bhattiprolu et al. | Aug 2011 | B2 |
8010507 | Poston et al. | Aug 2011 | B2 |
8010545 | Stefik et al. | Aug 2011 | B2 |
8015487 | Roy et al. | Sep 2011 | B2 |
8024778 | Cash et al. | Sep 2011 | B2 |
8036632 | Cona et al. | Oct 2011 | B1 |
8036971 | Aymeloglu et al. | Oct 2011 | B2 |
8046283 | Burns | Oct 2011 | B2 |
8054756 | Chand et al. | Nov 2011 | B2 |
8065080 | Koch | Nov 2011 | B2 |
8073857 | Sreekanth | Dec 2011 | B2 |
8095434 | Puttick et al. | Jan 2012 | B1 |
8103543 | Zwicky | Jan 2012 | B1 |
8134457 | Velipasalar et al. | Mar 2012 | B2 |
8145703 | Frishert et al. | Mar 2012 | B2 |
8185819 | Sah et al. | May 2012 | B2 |
8191005 | Baier et al. | May 2012 | B2 |
8200676 | Frank | Jun 2012 | B2 |
8214361 | Sandler et al. | Jul 2012 | B1 |
8214490 | Vos et al. | Jul 2012 | B1 |
8214764 | Gemmell et al. | Jul 2012 | B2 |
8225201 | Michael | Jul 2012 | B2 |
8229902 | Vishniac et al. | Jul 2012 | B2 |
8229947 | Fujinaga | Jul 2012 | B2 |
8230333 | Decherd et al. | Jul 2012 | B2 |
8271461 | Pike et al. | Sep 2012 | B2 |
8280880 | Aymeloglu et al. | Oct 2012 | B1 |
8290838 | Thakur et al. | Oct 2012 | B1 |
8290926 | Ozzie et al. | Oct 2012 | B2 |
8290942 | Jones et al. | Oct 2012 | B2 |
8290943 | Carbone et al. | Oct 2012 | B2 |
8301464 | Cave et al. | Oct 2012 | B1 |
8301904 | Gryaznov | Oct 2012 | B1 |
8302855 | Ma et al. | Nov 2012 | B2 |
8312367 | Foster | Nov 2012 | B2 |
8312546 | Alme | Nov 2012 | B2 |
8325178 | Doyle, Jr. | Dec 2012 | B1 |
8352881 | Champion et al. | Jan 2013 | B2 |
8368695 | Howell et al. | Feb 2013 | B2 |
8386377 | Xiong et al. | Feb 2013 | B1 |
8396740 | Watson | Mar 2013 | B1 |
8397171 | Klassen et al. | Mar 2013 | B2 |
8400448 | Doyle, Jr. | Mar 2013 | B1 |
8407180 | Ramesh et al. | Mar 2013 | B1 |
8412234 | Gatmir-Motahair et al. | Apr 2013 | B1 |
8412707 | Mianji | Apr 2013 | B1 |
8422825 | Neophytou et al. | Apr 2013 | B1 |
8447722 | Ahuja et al. | May 2013 | B1 |
8452790 | Mianji | May 2013 | B1 |
8463036 | Ramesh et al. | Jun 2013 | B1 |
8473454 | Evanitsky et al. | Jun 2013 | B2 |
8484115 | Aymeloglu et al. | Jul 2013 | B2 |
8489331 | Kopf et al. | Jul 2013 | B2 |
8489641 | Seefeld et al. | Jul 2013 | B1 |
8498984 | Hwang et al. | Jul 2013 | B1 |
8508533 | Cervelli et al. | Aug 2013 | B2 |
8510743 | Hackborn et al. | Aug 2013 | B2 |
8514082 | Cova et al. | Aug 2013 | B2 |
8514229 | Cervelli et al. | Aug 2013 | B2 |
8515207 | Chau | Aug 2013 | B2 |
8527949 | Pleis et al. | Sep 2013 | B1 |
8554579 | Tribble et al. | Oct 2013 | B2 |
8554653 | Falkenborg et al. | Oct 2013 | B2 |
8554709 | Goodson et al. | Oct 2013 | B2 |
8560413 | Quarterman | Oct 2013 | B1 |
8564596 | Carrino et al. | Oct 2013 | B2 |
8577911 | Stepinski et al. | Nov 2013 | B1 |
8589273 | Creeden et al. | Nov 2013 | B2 |
8595234 | Siripuapu et al. | Nov 2013 | B2 |
8599203 | Horowitz et al. | Dec 2013 | B2 |
8620641 | Farnsworth et al. | Dec 2013 | B2 |
8639757 | Zang et al. | Jan 2014 | B1 |
8646080 | Williamson et al. | Feb 2014 | B2 |
8676857 | Adams et al. | Mar 2014 | B1 |
8682696 | Shanmugam | Mar 2014 | B1 |
8688573 | Ruknoic et al. | Apr 2014 | B1 |
8689108 | Duffield et al. | Apr 2014 | B1 |
8713467 | Goldenberg et al. | Apr 2014 | B1 |
8726379 | Stiansen et al. | May 2014 | B1 |
8732574 | Burr et al. | May 2014 | B2 |
8739278 | Varghese | May 2014 | B2 |
8742934 | Sarpy et al. | Jun 2014 | B1 |
8744890 | Bernier | Jun 2014 | B1 |
8745516 | Mason et al. | Jun 2014 | B2 |
8781169 | Jackson et al. | Jul 2014 | B2 |
8787939 | Papakipos et al. | Jul 2014 | B2 |
8788407 | Singh et al. | Jul 2014 | B1 |
8799313 | Satlow | Aug 2014 | B2 |
8799799 | Cervelli | Aug 2014 | B1 |
8807948 | Luo et al. | Aug 2014 | B2 |
8812960 | Sun et al. | Aug 2014 | B1 |
8830322 | Nerayoff et al. | Sep 2014 | B2 |
8832594 | Thompson et al. | Sep 2014 | B1 |
8868537 | Colgrove et al. | Oct 2014 | B1 |
8917274 | Ma et al. | Dec 2014 | B2 |
8924388 | Elliot et al. | Dec 2014 | B2 |
8924389 | Elliot et al. | Dec 2014 | B2 |
8924872 | Bogomolov et al. | Dec 2014 | B1 |
8930874 | Duff et al. | Jan 2015 | B2 |
8937619 | Sharma et al. | Jan 2015 | B2 |
8938434 | Jain et al. | Jan 2015 | B2 |
8949164 | Mohler | Feb 2015 | B1 |
8938494 | Onnen et al. | Mar 2015 | B2 |
8938686 | Erenrich et al. | Mar 2015 | B1 |
8983494 | Onnen et al. | Mar 2015 | B1 |
8984390 | Aymeloglu et al. | Mar 2015 | B2 |
9009171 | Grossman et al. | Apr 2015 | B1 |
9009177 | Zheng | Apr 2015 | B2 |
9009827 | Albertson et al. | Apr 2015 | B1 |
9021260 | Falk et al. | Apr 2015 | B1 |
9021384 | Beard et al. | Apr 2015 | B1 |
9043696 | Meiklejohn et al. | May 2015 | B1 |
9043894 | Dennison et al. | May 2015 | B1 |
9058315 | Burr et al. | Jun 2015 | B2 |
9100428 | Visbal | Aug 2015 | B1 |
9104293 | Kornfeld | Aug 2015 | B1 |
9104695 | Cervelli et al. | Aug 2015 | B1 |
9111380 | Piemonte et al. | Aug 2015 | B2 |
9116975 | Shankar et al. | Aug 2015 | B2 |
9129219 | Robertson et al. | Sep 2015 | B1 |
9146125 | Vulcano | Sep 2015 | B2 |
9165100 | Begur et al. | Oct 2015 | B2 |
9280618 | Bruce et al. | Mar 2016 | B1 |
9600146 | Cervelli et al. | Mar 2017 | B2 |
9734217 | Kara | Aug 2017 | B2 |
9891808 | Wilson et al. | Feb 2018 | B2 |
9953445 | Cervelli et al. | Apr 2018 | B2 |
20010021936 | Bertram | Sep 2001 | A1 |
20010030667 | Kelts | Oct 2001 | A1 |
20020003539 | Abe | Jan 2002 | A1 |
20020032677 | Morgenthaler et al. | Mar 2002 | A1 |
20020033848 | Sciammarella et al. | Mar 2002 | A1 |
20020065708 | Senay et al. | May 2002 | A1 |
20020091707 | Keller | Jul 2002 | A1 |
20020095360 | Joao | Jul 2002 | A1 |
20020095658 | Shulman | Jul 2002 | A1 |
20020103705 | Brady | Aug 2002 | A1 |
20020116120 | Ruiz et al. | Aug 2002 | A1 |
20020130867 | Yang et al. | Sep 2002 | A1 |
20020130906 | Miyaki | Sep 2002 | A1 |
20020130907 | Chi et al. | Sep 2002 | A1 |
20020147805 | Leshem et al. | Oct 2002 | A1 |
20020174201 | Ramer et al. | Nov 2002 | A1 |
20020194119 | Wright et al. | Dec 2002 | A1 |
20020196229 | Chen et al. | Dec 2002 | A1 |
20030028560 | Kudrolli et al. | Feb 2003 | A1 |
20030036848 | Sheha et al. | Feb 2003 | A1 |
20030036927 | Bowen | Feb 2003 | A1 |
20030039948 | Donahue | Feb 2003 | A1 |
20030052896 | Higgins et al. | Mar 2003 | A1 |
20030093755 | O'Carroll | May 2003 | A1 |
20030103049 | Kindratenko et al. | Jun 2003 | A1 |
20030126102 | Borthwick | Jul 2003 | A1 |
20030140106 | Raguseo | Jul 2003 | A1 |
20030144868 | MacIntyre et al. | Jul 2003 | A1 |
20030163352 | Surpin et al. | Aug 2003 | A1 |
20030200217 | Ackerman | Oct 2003 | A1 |
20030225755 | Iwayama et al. | Dec 2003 | A1 |
20030229848 | Arend et al. | Dec 2003 | A1 |
20040030492 | Fox et al. | Feb 2004 | A1 |
20040032432 | Baynger | Feb 2004 | A1 |
20040034570 | Davis | Feb 2004 | A1 |
20040039498 | Ollis et al. | Feb 2004 | A1 |
20040044648 | Anfindsen et al. | Mar 2004 | A1 |
20040064256 | Barinek et al. | Apr 2004 | A1 |
20040085318 | Hassler et al. | May 2004 | A1 |
20040095349 | Bito et al. | May 2004 | A1 |
20040098236 | Mayer et al. | May 2004 | A1 |
20040111410 | Burgoon et al. | Jun 2004 | A1 |
20040111480 | Yue | Jun 2004 | A1 |
20040123135 | Goddard | Jun 2004 | A1 |
20040126840 | Cheng et al. | Jul 2004 | A1 |
20040143602 | Ruiz et al. | Jul 2004 | A1 |
20040143796 | Lerner et al. | Jul 2004 | A1 |
20040153418 | Hanweck | Aug 2004 | A1 |
20040163039 | Gorman | Aug 2004 | A1 |
20040175036 | Graham | Sep 2004 | A1 |
20040181554 | Heckerman et al. | Sep 2004 | A1 |
20040193600 | Kaasten et al. | Sep 2004 | A1 |
20040205492 | Newsome | Oct 2004 | A1 |
20040217884 | Samadani et al. | Nov 2004 | A1 |
20040221223 | Yu et al. | Nov 2004 | A1 |
20040236688 | Bozeman | Nov 2004 | A1 |
20040236711 | Nixon et al. | Nov 2004 | A1 |
20040260702 | Cragun et al. | Dec 2004 | A1 |
20040267746 | Marcjan et al. | Dec 2004 | A1 |
20050010472 | Quatse et al. | Jan 2005 | A1 |
20050027705 | Sadri et al. | Feb 2005 | A1 |
20050028094 | Allyn | Feb 2005 | A1 |
20050028191 | Sullivan et al. | Feb 2005 | A1 |
20050031197 | Knopp | Feb 2005 | A1 |
20050034062 | Bufkin et al. | Feb 2005 | A1 |
20050039116 | Slack-Smith | Feb 2005 | A1 |
20050039119 | Parks et al. | Feb 2005 | A1 |
20050065811 | Chu et al. | Mar 2005 | A1 |
20050080769 | Gemmell | Apr 2005 | A1 |
20050086207 | Heuer et al. | Apr 2005 | A1 |
20050091186 | Elish | Apr 2005 | A1 |
20050125715 | Di Franco et al. | Jun 2005 | A1 |
20050143602 | Yada et al. | Jun 2005 | A1 |
20050154628 | Eckart et al. | Jul 2005 | A1 |
20050154769 | Eckart et al. | Jul 2005 | A1 |
20050162523 | Darrell et al. | Jul 2005 | A1 |
20050166144 | Gross | Jul 2005 | A1 |
20050180330 | Shapiro | Aug 2005 | A1 |
20050182502 | Iyengar | Aug 2005 | A1 |
20050182793 | Keenan et al. | Aug 2005 | A1 |
20050183005 | Denoue et al. | Aug 2005 | A1 |
20050210409 | Jou | Sep 2005 | A1 |
20050223044 | Ashworth et al. | Oct 2005 | A1 |
20050246327 | Yeung et al. | Nov 2005 | A1 |
20050251786 | Citron et al. | Nov 2005 | A1 |
20050267652 | Allstadt et al. | Dec 2005 | A1 |
20060026120 | Carolan et al. | Feb 2006 | A1 |
20060026170 | Kreitler et al. | Feb 2006 | A1 |
20060026561 | Bauman et al. | Feb 2006 | A1 |
20060031779 | Theurer et al. | Feb 2006 | A1 |
20060045470 | Poslinski et al. | Mar 2006 | A1 |
20060047804 | Fredricksen et al. | Mar 2006 | A1 |
20060053097 | King et al. | Mar 2006 | A1 |
20060053170 | Hill et al. | Mar 2006 | A1 |
20060059139 | Robinson | Mar 2006 | A1 |
20060059423 | Lehmann et al. | Mar 2006 | A1 |
20060074866 | Chamberlain et al. | Apr 2006 | A1 |
20060074881 | Vembu et al. | Apr 2006 | A1 |
20060080139 | Mainzer | Apr 2006 | A1 |
20060080283 | Shipman | Apr 2006 | A1 |
20060080619 | Carlson et al. | Apr 2006 | A1 |
20060093222 | Saffer et al. | May 2006 | A1 |
20060129191 | Sullivan et al. | Jun 2006 | A1 |
20060129746 | Porter | Jun 2006 | A1 |
20060136513 | Ngo et al. | Jun 2006 | A1 |
20060139375 | Rasmussen et al. | Jun 2006 | A1 |
20060142949 | Helt | Jun 2006 | A1 |
20060143034 | Rothermel | Jun 2006 | A1 |
20060143075 | Carr et al. | Jun 2006 | A1 |
20060143079 | Basak et al. | Jun 2006 | A1 |
20060146050 | Yamauchi | Jul 2006 | A1 |
20060149596 | Surpin et al. | Jul 2006 | A1 |
20060155654 | Plessis et al. | Jul 2006 | A1 |
20060178915 | Chao | Aug 2006 | A1 |
20060200384 | Arutunian | Sep 2006 | A1 |
20060203337 | White | Sep 2006 | A1 |
20060218637 | Thomas et al. | Sep 2006 | A1 |
20060241974 | Chao et al. | Oct 2006 | A1 |
20060242040 | Rader et al. | Oct 2006 | A1 |
20060242630 | Koike et al. | Oct 2006 | A1 |
20060251307 | Florin et al. | Nov 2006 | A1 |
20060259527 | Devarakonda et al. | Nov 2006 | A1 |
20060265417 | Amato et al. | Nov 2006 | A1 |
20060271277 | Hu et al. | Nov 2006 | A1 |
20060277460 | Forstall et al. | Dec 2006 | A1 |
20060279630 | Aggarwal et al. | Dec 2006 | A1 |
20060294223 | Glasgow et al. | Dec 2006 | A1 |
20070000999 | Kubo et al. | Jan 2007 | A1 |
20070011150 | Frank | Jan 2007 | A1 |
20070011304 | Error | Jan 2007 | A1 |
20070016363 | Huang et al. | Jan 2007 | A1 |
20070016435 | Bevington | Jan 2007 | A1 |
20070024620 | Muller-Fischer et al. | Feb 2007 | A1 |
20070038646 | Thota | Feb 2007 | A1 |
20070038962 | Fuchs et al. | Feb 2007 | A1 |
20070043686 | Teng et al. | Feb 2007 | A1 |
20070057966 | Ohno et al. | Mar 2007 | A1 |
20070061752 | Cory | Mar 2007 | A1 |
20070078832 | Ott et al. | Apr 2007 | A1 |
20070083541 | Fraleigh et al. | Apr 2007 | A1 |
20070094389 | Nussey et al. | Apr 2007 | A1 |
20070113164 | Hansen et al. | May 2007 | A1 |
20070115373 | Gallagher et al. | May 2007 | A1 |
20070136095 | Weinstein | Jun 2007 | A1 |
20070150369 | Zivin | Jun 2007 | A1 |
20070150801 | Chidlovskii et al. | Jun 2007 | A1 |
20070156673 | Maga | Jul 2007 | A1 |
20070162454 | D'Albora et al. | Jul 2007 | A1 |
20070168871 | Jenkins | Jul 2007 | A1 |
20070174760 | Chamberlain et al. | Jul 2007 | A1 |
20070185850 | Walters et al. | Aug 2007 | A1 |
20070185867 | Maga | Aug 2007 | A1 |
20070185894 | Swain et al. | Aug 2007 | A1 |
20070188516 | Loup et al. | Aug 2007 | A1 |
20070192122 | Routson et al. | Aug 2007 | A1 |
20070192265 | Chopin et al. | Aug 2007 | A1 |
20070198571 | Ferguson et al. | Aug 2007 | A1 |
20070208497 | Downs et al. | Sep 2007 | A1 |
20070208498 | Barker et al. | Sep 2007 | A1 |
20070208736 | Tanigawa et al. | Sep 2007 | A1 |
20070233709 | Abnous | Oct 2007 | A1 |
20070240062 | Christena et al. | Oct 2007 | A1 |
20070245339 | Bauman et al. | Oct 2007 | A1 |
20070258642 | Thota | Nov 2007 | A1 |
20070266336 | Nojima et al. | Nov 2007 | A1 |
20070284433 | Domenica et al. | Dec 2007 | A1 |
20070294643 | Kyle | Dec 2007 | A1 |
20070299697 | Friedlander et al. | Dec 2007 | A1 |
20080010605 | Frank | Jan 2008 | A1 |
20080016155 | Khalatian | Jan 2008 | A1 |
20080016216 | Worley et al. | Jan 2008 | A1 |
20080040275 | Paulsen et al. | Feb 2008 | A1 |
20080040684 | Crump | Feb 2008 | A1 |
20080051989 | Welsh | Feb 2008 | A1 |
20080052142 | Bailey et al. | Feb 2008 | A1 |
20080066052 | Wolfram | Mar 2008 | A1 |
20080069081 | Chand et al. | Mar 2008 | A1 |
20080077597 | Butler | Mar 2008 | A1 |
20080077642 | Carbone et al. | Mar 2008 | A1 |
20080082486 | Lermant et al. | Apr 2008 | A1 |
20080082578 | Hogue et al. | Apr 2008 | A1 |
20080091693 | Murthy | Apr 2008 | A1 |
20080098085 | Krane et al. | Apr 2008 | A1 |
20080103996 | Forman et al. | May 2008 | A1 |
20080104019 | Nath | May 2008 | A1 |
20080109714 | Kumar et al. | May 2008 | A1 |
20080126951 | Sood et al. | May 2008 | A1 |
20080133579 | Lim | Jun 2008 | A1 |
20080148398 | Mezack et al. | Jun 2008 | A1 |
20080155440 | Trevor et al. | Jun 2008 | A1 |
20080162616 | Gross et al. | Jul 2008 | A1 |
20080163073 | Becker et al. | Jul 2008 | A1 |
20080172607 | Baer | Jul 2008 | A1 |
20080177782 | Poston et al. | Jul 2008 | A1 |
20080192053 | Howell et al. | Aug 2008 | A1 |
20080195417 | Surpin et al. | Aug 2008 | A1 |
20080195474 | Lau et al. | Aug 2008 | A1 |
20080195608 | Clover | Aug 2008 | A1 |
20080208735 | Balet et al. | Aug 2008 | A1 |
20080222295 | Robinson et al. | Sep 2008 | A1 |
20080223834 | Griffiths et al. | Sep 2008 | A1 |
20080229056 | Agarwal et al. | Sep 2008 | A1 |
20080243711 | Aymeloglu et al. | Oct 2008 | A1 |
20080249820 | Pathria | Oct 2008 | A1 |
20080249983 | Meisels et al. | Oct 2008 | A1 |
20080255973 | El Wade et al. | Oct 2008 | A1 |
20080263468 | Cappione et al. | Oct 2008 | A1 |
20080267107 | Rosenberg | Oct 2008 | A1 |
20080270468 | Mao | Oct 2008 | A1 |
20080276167 | Michael | Nov 2008 | A1 |
20080278311 | Grange et al. | Nov 2008 | A1 |
20080288306 | MacIntyre et al. | Nov 2008 | A1 |
20080294678 | Gorman et al. | Nov 2008 | A1 |
20080301643 | Appleton et al. | Dec 2008 | A1 |
20080313132 | Hao et al. | Dec 2008 | A1 |
20080313243 | Poston et al. | Dec 2008 | A1 |
20090002492 | Velipasalar et al. | Jan 2009 | A1 |
20090026170 | Tanaka et al. | Jan 2009 | A1 |
20090027418 | Maru et al. | Jan 2009 | A1 |
20090030915 | Winter et al. | Jan 2009 | A1 |
20090031401 | Cudich et al. | Jan 2009 | A1 |
20090055251 | Shah et al. | Feb 2009 | A1 |
20090076845 | Bellin et al. | Mar 2009 | A1 |
20090088964 | Schaaf et al. | Apr 2009 | A1 |
20090089651 | Herberger et al. | Apr 2009 | A1 |
20090094166 | Aymeloglu et al. | Apr 2009 | A1 |
20090094187 | Miyaki | Apr 2009 | A1 |
20090094270 | Alirez et al. | Apr 2009 | A1 |
20090100018 | Roberts | Apr 2009 | A1 |
20090106178 | Chu | Apr 2009 | A1 |
20090112678 | Luzardo | Apr 2009 | A1 |
20090112745 | Stefanescu | Apr 2009 | A1 |
20090115786 | Shmiaski et al. | May 2009 | A1 |
20090119309 | Gibson et al. | May 2009 | A1 |
20090125359 | Knapic | May 2009 | A1 |
20090125369 | Kloosstra et al. | May 2009 | A1 |
20090125459 | Norton et al. | May 2009 | A1 |
20090132921 | Hwangbo et al. | May 2009 | A1 |
20090132953 | Reed et al. | May 2009 | A1 |
20090143052 | Bates et al. | Jun 2009 | A1 |
20090144262 | White et al. | Jun 2009 | A1 |
20090144274 | Fraleigh et al. | Jun 2009 | A1 |
20090150868 | Chakra et al. | Jun 2009 | A1 |
20090157732 | Hao et al. | Jun 2009 | A1 |
20090158185 | Lacevic et al. | Jun 2009 | A1 |
20090164934 | Bhattiprolu et al. | Jun 2009 | A1 |
20090171939 | Athsani et al. | Jul 2009 | A1 |
20090172511 | Decherd et al. | Jul 2009 | A1 |
20090172821 | Daira et al. | Jul 2009 | A1 |
20090177962 | Gusmorino et al. | Jul 2009 | A1 |
20090179892 | Tsuda et al. | Jul 2009 | A1 |
20090187447 | Cheng | Jul 2009 | A1 |
20090187464 | Bai et al. | Jul 2009 | A1 |
20090187546 | Whyte et al. | Jul 2009 | A1 |
20090187548 | Ji et al. | Jul 2009 | A1 |
20090199106 | Jonsson et al. | Aug 2009 | A1 |
20090222400 | Kupershmidt et al. | Sep 2009 | A1 |
20090222759 | Drieschner | Sep 2009 | A1 |
20090222760 | Halverson et al. | Sep 2009 | A1 |
20090234720 | George et al. | Sep 2009 | A1 |
20090248593 | Putzolu et al. | Oct 2009 | A1 |
20090248757 | Havewala et al. | Oct 2009 | A1 |
20090249178 | Ambrosino et al. | Oct 2009 | A1 |
20090249244 | Robinson et al. | Oct 2009 | A1 |
20090254970 | Agarwal et al. | Oct 2009 | A1 |
20090271343 | Vaiciulis et al. | Oct 2009 | A1 |
20090281839 | Lynn et al. | Nov 2009 | A1 |
20090282068 | Shockro et al. | Nov 2009 | A1 |
20090287470 | Farnsworth et al. | Nov 2009 | A1 |
20090292626 | Oxford | Nov 2009 | A1 |
20090307049 | Elliott et al. | Dec 2009 | A1 |
20090313463 | Pang et al. | Dec 2009 | A1 |
20090319418 | Herz | Dec 2009 | A1 |
20090319891 | MacKinlay | Dec 2009 | A1 |
20100011282 | Dollard et al. | Jan 2010 | A1 |
20100016910 | Sullivan et al. | Jan 2010 | A1 |
20100030722 | Goodson et al. | Feb 2010 | A1 |
20100031141 | Summers et al. | Feb 2010 | A1 |
20100031183 | Kang | Feb 2010 | A1 |
20100042922 | Bradateanu et al. | Feb 2010 | A1 |
20100049872 | Roskind | Feb 2010 | A1 |
20100057622 | Faith et al. | Mar 2010 | A1 |
20100057716 | Stefik et al. | Mar 2010 | A1 |
20100063961 | Guiheneuf et al. | Mar 2010 | A1 |
20100070523 | Delgo et al. | Mar 2010 | A1 |
20100070842 | Aymeloglu et al. | Mar 2010 | A1 |
20100070844 | Aymeloglu et al. | Mar 2010 | A1 |
20100070845 | Facemire et al. | Mar 2010 | A1 |
20100070897 | Aymeloglu et al. | Mar 2010 | A1 |
20100076968 | Boyns et al. | Mar 2010 | A1 |
20100088304 | Jackson | Apr 2010 | A1 |
20100088398 | Plamondon | Apr 2010 | A1 |
20100094548 | Tadman | Apr 2010 | A1 |
20100098318 | Anderson | Apr 2010 | A1 |
20100100963 | Mahaffey | Apr 2010 | A1 |
20100103124 | Kruzeniski et al. | Apr 2010 | A1 |
20100106420 | Mattikalli et al. | Apr 2010 | A1 |
20100114887 | Conway et al. | May 2010 | A1 |
20100122152 | Chamberlain et al. | May 2010 | A1 |
20100131457 | Heimendinger | May 2010 | A1 |
20100131502 | Fordham | May 2010 | A1 |
20100161735 | Sharma | Jun 2010 | A1 |
20100162176 | Dunton | Jun 2010 | A1 |
20100185692 | Zhang et al. | Jul 2010 | A1 |
20100191563 | Schlaifer et al. | Jul 2010 | A1 |
20100198684 | Eraker et al. | Aug 2010 | A1 |
20100199225 | Coleman et al. | Aug 2010 | A1 |
20100223260 | Wu | Sep 2010 | A1 |
20100228812 | Uomini | Sep 2010 | A1 |
20100235915 | Memon et al. | Sep 2010 | A1 |
20100238174 | Haub et al. | Sep 2010 | A1 |
20100250412 | Wagner | Sep 2010 | A1 |
20100262688 | Hussain et al. | Oct 2010 | A1 |
20100262901 | DiSalvo | Oct 2010 | A1 |
20100277611 | Holt et al. | Nov 2010 | A1 |
20100280851 | Merkin | Nov 2010 | A1 |
20100280857 | Liu et al. | Nov 2010 | A1 |
20100293174 | Bennett et al. | Nov 2010 | A1 |
20100306713 | Geisner et al. | Dec 2010 | A1 |
20100306722 | LeHoty et al. | Dec 2010 | A1 |
20100312837 | Bodapati et al. | Dec 2010 | A1 |
20100312858 | Mickens et al. | Dec 2010 | A1 |
20100313119 | Baldwin et al. | Dec 2010 | A1 |
20100313239 | Chakra et al. | Dec 2010 | A1 |
20100318924 | Frankel et al. | Dec 2010 | A1 |
20100321399 | Ellren et al. | Dec 2010 | A1 |
20100321871 | Diebel et al. | Dec 2010 | A1 |
20100325526 | Ellis et al. | Dec 2010 | A1 |
20100325581 | Finkelstein et al. | Dec 2010 | A1 |
20100328112 | Liu | Dec 2010 | A1 |
20100330801 | Rouh | Dec 2010 | A1 |
20100332324 | Khosravy et al. | Dec 2010 | A1 |
20110004498 | Readshaw | Jan 2011 | A1 |
20110022312 | McDonough et al. | Jan 2011 | A1 |
20110029526 | Knight et al. | Feb 2011 | A1 |
20110029641 | Fainberg et al. | Feb 2011 | A1 |
20110047159 | Baid et al. | Feb 2011 | A1 |
20110047540 | Williams et al. | Feb 2011 | A1 |
20110060753 | Shaked et al. | Mar 2011 | A1 |
20110061013 | Bilicki et al. | Mar 2011 | A1 |
20110066933 | Ludwig | Mar 2011 | A1 |
20110074788 | Regan et al. | Mar 2011 | A1 |
20110074811 | Hanson et al. | Mar 2011 | A1 |
20110078055 | Faribault et al. | Mar 2011 | A1 |
20110078173 | Seligmann et al. | Mar 2011 | A1 |
20110090085 | Belz et al. | Apr 2011 | A1 |
20110090254 | Carrino et al. | Apr 2011 | A1 |
20110093327 | Fordyce, III et al. | Apr 2011 | A1 |
20110099046 | Weiss et al. | Apr 2011 | A1 |
20110099133 | Chang et al. | Apr 2011 | A1 |
20110117878 | Barash et al. | May 2011 | A1 |
20110119100 | Ruhl et al. | May 2011 | A1 |
20110125372 | Ito | May 2011 | A1 |
20110137766 | Rasmussen et al. | Jun 2011 | A1 |
20110153368 | Pierre | Jun 2011 | A1 |
20110153384 | Home et al. | Jun 2011 | A1 |
20110161096 | Buehler et al. | Jun 2011 | A1 |
20110161409 | Nair | Jun 2011 | A1 |
20110167105 | Ramakrishnan et al. | Jul 2011 | A1 |
20110170799 | Carrino et al. | Jul 2011 | A1 |
20110173032 | Payne et al. | Jul 2011 | A1 |
20110173093 | Psota et al. | Jul 2011 | A1 |
20110179048 | Satlow | Jul 2011 | A1 |
20110185316 | Reid et al. | Jul 2011 | A1 |
20110208565 | Ross et al. | Aug 2011 | A1 |
20110208724 | Jones et al. | Aug 2011 | A1 |
20110213655 | Henkin | Sep 2011 | A1 |
20110218934 | Elser | Sep 2011 | A1 |
20110218955 | Tang | Sep 2011 | A1 |
20110219450 | McDougal et al. | Sep 2011 | A1 |
20110225198 | Edwards et al. | Sep 2011 | A1 |
20110225482 | Chan et al. | Sep 2011 | A1 |
20110238495 | Kang | Sep 2011 | A1 |
20110238553 | Raj et al. | Sep 2011 | A1 |
20110238690 | Arrasvuori | Sep 2011 | A1 |
20110251951 | Kolkowtiz | Oct 2011 | A1 |
20110258158 | Resende et al. | Oct 2011 | A1 |
20110270604 | Qi et al. | Nov 2011 | A1 |
20110270705 | Parker | Nov 2011 | A1 |
20110270834 | Sokolan et al. | Nov 2011 | A1 |
20110289397 | Eastmond et al. | Nov 2011 | A1 |
20110289407 | Naik et al. | Nov 2011 | A1 |
20110289420 | Morioka et al. | Nov 2011 | A1 |
20110291851 | Whisenant | Dec 2011 | A1 |
20110295649 | Fine | Dec 2011 | A1 |
20110310005 | Chen et al. | Dec 2011 | A1 |
20110314007 | Dassa et al. | Dec 2011 | A1 |
20110314024 | Chang et al. | Dec 2011 | A1 |
20120004894 | Butler | Jan 2012 | A1 |
20120011238 | Rathod | Jan 2012 | A1 |
20120011245 | Gillette et al. | Jan 2012 | A1 |
20120019559 | Siler et al. | Jan 2012 | A1 |
20120022945 | Falkenborg et al. | Jan 2012 | A1 |
20120036013 | Neuhaus et al. | Feb 2012 | A1 |
20120036434 | Oberstein | Feb 2012 | A1 |
20120050293 | Carlhian et al. | Mar 2012 | A1 |
20120054284 | Rakshit | Mar 2012 | A1 |
20120059853 | Jagota | Mar 2012 | A1 |
20120066166 | Curbera et al. | Mar 2012 | A1 |
20120066296 | Appleton et al. | Mar 2012 | A1 |
20120072825 | Sherkin et al. | Mar 2012 | A1 |
20120079363 | Folting et al. | Mar 2012 | A1 |
20120084117 | Tavares et al. | Apr 2012 | A1 |
20120084118 | Bai et al. | Apr 2012 | A1 |
20120084184 | Raleigh | Apr 2012 | A1 |
20120084287 | Lakshminarayan et al. | Apr 2012 | A1 |
20120102013 | Martini | Apr 2012 | A1 |
20120106801 | Jackson | May 2012 | A1 |
20120117082 | Koperda et al. | May 2012 | A1 |
20120123989 | Yu et al. | May 2012 | A1 |
20120131512 | Takeuchi et al. | May 2012 | A1 |
20120137235 | Ts et al. | May 2012 | A1 |
20120144325 | Mital et al. | Jun 2012 | A1 |
20120144335 | Abeln et al. | Jun 2012 | A1 |
20120158527 | Cannelongo et al. | Jun 2012 | A1 |
20120159307 | Chung et al. | Jun 2012 | A1 |
20120159362 | Brown et al. | Jun 2012 | A1 |
20120159363 | DeBacker et al. | Jun 2012 | A1 |
20120159399 | Bastide et al. | Jun 2012 | A1 |
20120170847 | Tsukidate | Jul 2012 | A1 |
20120173381 | Smith | Jul 2012 | A1 |
20120173985 | Peppel | Jul 2012 | A1 |
20120180002 | Campbell et al. | Jul 2012 | A1 |
20120188252 | Law | Jul 2012 | A1 |
20120196557 | Reich et al. | Aug 2012 | A1 |
20120196558 | Reich et al. | Aug 2012 | A1 |
20120197651 | Robinson et al. | Aug 2012 | A1 |
20120197657 | Prodanovic | Aug 2012 | A1 |
20120197660 | Prodanovic | Aug 2012 | A1 |
20120203708 | Psota et al. | Aug 2012 | A1 |
20120206469 | Hulubei et al. | Aug 2012 | A1 |
20120208636 | Feige | Aug 2012 | A1 |
20120215784 | King et al. | Aug 2012 | A1 |
20120221511 | Gibson et al. | Aug 2012 | A1 |
20120221553 | Wittmer et al. | Aug 2012 | A1 |
20120221580 | Barney | Aug 2012 | A1 |
20120226523 | Weiss | Sep 2012 | A1 |
20120226590 | Love et al. | Sep 2012 | A1 |
20120245976 | Kumar et al. | Sep 2012 | A1 |
20120246148 | Dror | Sep 2012 | A1 |
20120254129 | Wheeler et al. | Oct 2012 | A1 |
20120284345 | Costenaro et al. | Nov 2012 | A1 |
20120284670 | Kashik et al. | Nov 2012 | A1 |
20120290879 | Shibuya et al. | Nov 2012 | A1 |
20120296907 | Long et al. | Nov 2012 | A1 |
20120311684 | Paulsen et al. | Dec 2012 | A1 |
20120323888 | Osann, Jr. | Dec 2012 | A1 |
20120330801 | McDougal et al. | Dec 2012 | A1 |
20120330973 | Ghuneim et al. | Dec 2012 | A1 |
20130006426 | Healey et al. | Jan 2013 | A1 |
20130006725 | Simanek et al. | Jan 2013 | A1 |
20130006916 | McBride et al. | Jan 2013 | A1 |
20130016106 | Yip et al. | Jan 2013 | A1 |
20130018796 | Kolhatkar et al. | Jan 2013 | A1 |
20130021445 | Cossette-Pacheco et al. | Jan 2013 | A1 |
20130024268 | Manickavelu | Jan 2013 | A1 |
20130046635 | Grigg et al. | Feb 2013 | A1 |
20130046842 | Muntz et al. | Feb 2013 | A1 |
20130054306 | Bhalla | Feb 2013 | A1 |
20130057551 | Ebert et al. | Mar 2013 | A1 |
20130060786 | Serrano et al. | Mar 2013 | A1 |
20130061169 | Pearcy et al. | Mar 2013 | A1 |
20130073377 | Heath | Mar 2013 | A1 |
20130073454 | Busch | Mar 2013 | A1 |
20130076732 | Cervelli et al. | Mar 2013 | A1 |
20130078943 | Biage et al. | Mar 2013 | A1 |
20130086482 | Parsons | Apr 2013 | A1 |
20130096988 | Grossman et al. | Apr 2013 | A1 |
20130097482 | Marantz et al. | Apr 2013 | A1 |
20130100134 | Cervelli et al. | Apr 2013 | A1 |
20130101159 | Chao et al. | Apr 2013 | A1 |
20130110746 | Ahn | May 2013 | A1 |
20130110822 | Ikeda et al. | May 2013 | A1 |
20130110877 | Bonham et al. | May 2013 | A1 |
20130111320 | Campbell et al. | May 2013 | A1 |
20130117651 | Waldman et al. | May 2013 | A1 |
20130132398 | Pfiefle | May 2013 | A1 |
20130150004 | Rosen | Jun 2013 | A1 |
20130151148 | Parundekar et al. | Jun 2013 | A1 |
20130151305 | Akinola et al. | Jun 2013 | A1 |
20130151388 | Falkenborg et al. | Jun 2013 | A1 |
20130151453 | Bhanot et al. | Jun 2013 | A1 |
20130157234 | Gulli et al. | Jun 2013 | A1 |
20130166348 | Scotto | Jun 2013 | A1 |
20130166480 | Popescu et al. | Jun 2013 | A1 |
20130166550 | Buchmann et al. | Jun 2013 | A1 |
20130176321 | Mitchell et al. | Jul 2013 | A1 |
20130179420 | Park et al. | Jul 2013 | A1 |
20130185245 | Anderson | Jul 2013 | A1 |
20130185307 | El-Yaniv et al. | Jul 2013 | A1 |
20130224696 | Wolfe et al. | Aug 2013 | A1 |
20130225212 | Khan | Aug 2013 | A1 |
20130226318 | Procyk | Aug 2013 | A1 |
20130226953 | Markovich et al. | Aug 2013 | A1 |
20130232045 | Tai et al. | Sep 2013 | A1 |
20130238616 | Rose et al. | Sep 2013 | A1 |
20130246170 | Gross et al. | Sep 2013 | A1 |
20130246537 | Gaddala | Sep 2013 | A1 |
20130246597 | Iizawa et al. | Sep 2013 | A1 |
20130251233 | Yang et al. | Sep 2013 | A1 |
20130254900 | Sathish | Sep 2013 | A1 |
20130262527 | Hunter et al. | Oct 2013 | A1 |
20130263019 | Castellanos et al. | Oct 2013 | A1 |
20130267207 | Hao et al. | Oct 2013 | A1 |
20130268520 | Fisher et al. | Oct 2013 | A1 |
20130279757 | Kephart | Oct 2013 | A1 |
20130282696 | John et al. | Oct 2013 | A1 |
20130282723 | Petersen et al. | Oct 2013 | A1 |
20130290011 | Lynn et al. | Oct 2013 | A1 |
20130290825 | Arndt et al. | Oct 2013 | A1 |
20130297619 | Chandrasekaran et al. | Nov 2013 | A1 |
20130304770 | Boero et al. | Nov 2013 | A1 |
20130311375 | Priebatsch | Nov 2013 | A1 |
20130339891 | Blumenberg | Dec 2013 | A1 |
20140012796 | Petersen et al. | Jan 2014 | A1 |
20140019936 | Cohanoff | Jan 2014 | A1 |
20140032506 | Hoey et al. | Jan 2014 | A1 |
20140033010 | Richardt et al. | Jan 2014 | A1 |
20140033120 | Bental et al. | Jan 2014 | A1 |
20140040371 | Gurevich et al. | Feb 2014 | A1 |
20140043337 | Cardno | Feb 2014 | A1 |
20140047319 | Eberlein | Feb 2014 | A1 |
20140047357 | Alfaro et al. | Feb 2014 | A1 |
20140058914 | Song et al. | Feb 2014 | A1 |
20140059038 | McPherson et al. | Feb 2014 | A1 |
20140067611 | Adachi et al. | Mar 2014 | A1 |
20140068487 | Steiger et al. | Mar 2014 | A1 |
20140074855 | Zhao et al. | Mar 2014 | A1 |
20140095273 | Tang et al. | Apr 2014 | A1 |
20140095509 | Patton | Apr 2014 | A1 |
20140108068 | Williams | Apr 2014 | A1 |
20140108380 | Gotz et al. | Apr 2014 | A1 |
20140108985 | Scott et al. | Apr 2014 | A1 |
20140123279 | Bishop et al. | May 2014 | A1 |
20140129261 | Bothwell et al. | May 2014 | A1 |
20140129936 | Richards et al. | May 2014 | A1 |
20140136285 | Carvalho | May 2014 | A1 |
20140143009 | Brice et al. | May 2014 | A1 |
20140149436 | Bahrami et al. | May 2014 | A1 |
20140156527 | Grigg et al. | Jun 2014 | A1 |
20140157172 | Peery et al. | Jun 2014 | A1 |
20140164502 | Khodorenko et al. | Jun 2014 | A1 |
20140176606 | Narayan et al. | Jun 2014 | A1 |
20140189536 | Lange et al. | Jul 2014 | A1 |
20140195515 | Baker et al. | Jul 2014 | A1 |
20140195887 | Ellis et al. | Jul 2014 | A1 |
20140208281 | Ming | Jul 2014 | A1 |
20140214579 | Shen et al. | Jul 2014 | A1 |
20140218400 | O'Toole | Aug 2014 | A1 |
20140222521 | Chait | Aug 2014 | A1 |
20140222793 | Sadkin et al. | Aug 2014 | A1 |
20140229554 | Grunin et al. | Aug 2014 | A1 |
20140244284 | Smith | Aug 2014 | A1 |
20140244388 | Manouchehri et al. | Aug 2014 | A1 |
20140258246 | Lo Faro et al. | Sep 2014 | A1 |
20140267294 | Ma | Sep 2014 | A1 |
20140267295 | Sharma | Sep 2014 | A1 |
20140279824 | Tamayo | Sep 2014 | A1 |
20140310266 | Greenfield | Oct 2014 | A1 |
20140316911 | Gross | Oct 2014 | A1 |
20140333651 | Cervelli et al. | Nov 2014 | A1 |
20140337772 | Cervelli et al. | Nov 2014 | A1 |
20140344230 | Krause et al. | Nov 2014 | A1 |
20140351070 | Christner et al. | Nov 2014 | A1 |
20140358829 | Hurwitz | Dec 2014 | A1 |
20140361899 | Layson | Dec 2014 | A1 |
20140365965 | Bray et al. | Dec 2014 | A1 |
20140366132 | Stiansen et al. | Dec 2014 | A1 |
20150019394 | Unser et al. | Jan 2015 | A1 |
20150026622 | Roaldson et al. | Jan 2015 | A1 |
20150029176 | Baxter et al. | Jan 2015 | A1 |
20150046870 | Goldenberg et al. | Feb 2015 | A1 |
20150073929 | Psota et al. | Mar 2015 | A1 |
20150073954 | Braff | Mar 2015 | A1 |
20150089353 | Folkening | Mar 2015 | A1 |
20150089424 | Duffield et al. | Mar 2015 | A1 |
20150095773 | Gonsalves et al. | Apr 2015 | A1 |
20150100897 | Sun et al. | Apr 2015 | A1 |
20150100907 | Erenrich et al. | Apr 2015 | A1 |
20150106170 | Bonica | Apr 2015 | A1 |
20150106379 | Elliot et al. | Apr 2015 | A1 |
20150112963 | Mojtahedi | Apr 2015 | A1 |
20150134666 | Gattiker et al. | May 2015 | A1 |
20150135256 | Hoy et al. | May 2015 | A1 |
20150169694 | Longo | Jun 2015 | A1 |
20150169709 | Kara et al. | Jun 2015 | A1 |
20150169726 | Kara et al. | Jun 2015 | A1 |
20150170077 | Kara et al. | Jun 2015 | A1 |
20150172396 | Longo | Jun 2015 | A1 |
20150178825 | Huerta | Jun 2015 | A1 |
20150178877 | Bogomolov et al. | Jun 2015 | A1 |
20150186483 | Tappan et al. | Jul 2015 | A1 |
20150186821 | Wang et al. | Jul 2015 | A1 |
20150187036 | Wang et al. | Jul 2015 | A1 |
20150187100 | Berry | Jul 2015 | A1 |
20150188872 | White | Jul 2015 | A1 |
20150212663 | Papale et al. | Jul 2015 | A1 |
20150227295 | Meiklejohn et al. | Aug 2015 | A1 |
20150254220 | Burr et al. | Sep 2015 | A1 |
20150309719 | Ma et al. | Oct 2015 | A1 |
20150312323 | Peterson | Oct 2015 | A1 |
20150317342 | Grossman et al. | Nov 2015 | A1 |
20150324868 | Kaftan et al. | Nov 2015 | A1 |
20150338233 | Cervelli et al. | Nov 2015 | A1 |
20150379413 | Robertson et al. | Dec 2015 | A1 |
20160004764 | Chakerian et al. | Jan 2016 | A1 |
20160026923 | Erenrich et al. | Jan 2016 | A1 |
20160055501 | Mukherjee et al. | Feb 2016 | A1 |
20160062555 | Ward et al. | Mar 2016 | A1 |
20170052654 | Cervelli et al. | Feb 2017 | A1 |
20170052655 | Cervelli et al. | Feb 2017 | A1 |
20170052747 | Cervelli et al. | Feb 2017 | A1 |
20180136831 | Wilson et al. | May 2018 | A1 |
Number | Date | Country |
---|---|---|
2012216622 | May 2015 | AU |
2013251186 | Nov 2015 | AU |
102546446 | Jul 2012 | CN |
103167093 | Jun 2013 | CN |
102054015 | May 2014 | CN |
102014103482 | Sep 2014 | DE |
102014204827 | Sep 2014 | DE |
102014204834 | Sep 2014 | DE |
102013222023 | Jan 2015 | DE |
102014215621 | Feb 2015 | DE |
0763201 | Mar 1997 | EP |
1672527 | Jun 2006 | EP |
2487610 | Aug 2012 | EP |
2551799 | Jan 2013 | EP |
2560134 | Feb 2013 | EP |
2575107 | Apr 2013 | EP |
2778977 | Sep 2014 | EP |
2835745 | Feb 2015 | EP |
2835770 | Feb 2015 | EP |
2838039 | Feb 2015 | EP |
2846241 | Mar 2015 | EP |
2851852 | Mar 2015 | EP |
2858014 | Apr 2015 | EP |
2858018 | Apr 2015 | EP |
2863326 | Apr 2015 | EP |
2863346 | Apr 2015 | EP |
2869211 | May 2015 | EP |
2881868 | Jun 2015 | EP |
2884439 | Jun 2015 | EP |
2884440 | Jun 2015 | EP |
2889814 | Jul 2015 | EP |
2891992 | Jul 2015 | EP |
2892197 | Jul 2015 | EP |
2911078 | Aug 2015 | EP |
2911100 | Aug 2015 | EP |
2940603 | Nov 2015 | EP |
2940609 | Nov 2015 | EP |
2963595 | Jan 2016 | EP |
2988258 | Feb 2016 | EP |
2993595 | Mar 2016 | EP |
3070622 | Sep 2016 | EP |
3133510 | Feb 2017 | EP |
3139333 | Mar 2017 | EP |
2516155 | Jan 2015 | GB |
2518745 | Apr 2015 | GB |
102014204830 | Sep 2014 | DE |
2012778 | Nov 2014 | NL |
2013306 | Feb 2015 | NL |
624557 | Dec 2014 | NZ |
WO 95032424 | Nov 1995 | WO |
WO 00009529 | Feb 2000 | WO |
WO 01025906 | Apr 2001 | WO |
WO 2001088750 | Nov 2001 | WO |
WO 01098925 | Dec 2001 | WO |
WO 2002065353 | Aug 2002 | WO |
WO 2004057268 | Jul 2004 | WO |
WO 2005013200 | Feb 2005 | WO |
WO 2005104736 | Nov 2005 | WO |
WO 2005116851 | Dec 2005 | WO |
WO 2007133206 | Nov 2007 | WO |
WO 2008064207 | May 2008 | WO |
WO 2009061501 | May 2009 | WO |
WO 2009123975 | Oct 2009 | WO |
WO 2010000014 | Jan 2010 | WO |
WO 2010030913 | Mar 2010 | WO |
WO 2010030914 | Mar 2010 | WO |
WO 2011058507 | May 2011 | WO |
WO 2012119008 | Sep 2012 | WO |
WO 2013010157 | Jan 2013 | WO |
WO 2013102892 | Jul 2013 | WO |
Entry |
---|
“GrabUp—What a Timesaver!” <http://atlchris.com/191/grabup/>, Aug. 11, 2008, pp. 3. |
Abbey, Kristen, “Review of Google Docs,” May 1, 2007, pp. 2. |
Adams et al., “Worklets: A Service-Oriented Implementation of Dynamic Flexibility in Workflows,” R. Meersman, Z. Tari et al. (Eds.): OTM 2006, LNCS, 4275, pp. 291-308, 2006. |
Chaudhuri et al., “An Overview of Business Intelligence Technology,” Communications of the ACM, Aug. 2011, vol. 54, No. 8. |
Galliford, Miles, “SnagIt Versus Free Screen Capture Software: Critical Tools for Website Owners,” <http://www.subhub.com/articles/free-screen-capture-software>, Mar. 27, 2008, pp. 11. |
JetScreenshot.com, “Share Screenshots via Internet in Seconds,” <http://web.archive.org/web/20130807164204/http://www.jetscreenshot.com/>, Aug. 7, 2013, pp. 1. |
Kwout, <http://web.archive.org/web/20080905132448/http://www.kwout.com/> Sep. 5, 2008, pp. 2. |
Microsoft Windows, “Microsoft Windows Version 2002 Print Out 2,” 2002, pp. 1-6. |
Microsoft, “Registering an Application to a URI Scheme,” <http://msdn.microsoft.com/en-us/library/aa767914.aspx>, printed Apr. 4, 2009 in 4 pages. |
Microsoft, “Using the Clipboard,” <http://msdn.microsoft.com/en-us/library/ms649016.aspx>, printed Jun. 8, 2009 in 20 pages. |
Nitro, “Trick: How to Capture a Screenshot as PDF, Annotate, Then Share It,” <http://blog.nitropdf.com/2008/03/04/trick-how-to-capture-a-screenshot-as-pdf-annotate-it-then-share/>, Mar. 4, 2008, pp. 2. |
Online Tech Tips, “Clip2Net—Share files, folders and screenshots easily,” <http://www.online-tech-tips.com/free-software-downloads/share-files-folders-screenshots/>, Apr. 2, 2008, pp. 5. |
O'Reilly.com, http://oreilly.com/digitalmedia/2006/01/01/mac-os-x-screenshot-secrets.html published Jan. 1, 2006 in 10 pages. |
Schroder, Stan, “15 Ways to Create Website Screenshots,” http://mashable.com/2007/08/24/web-screenshots/>, Aug. 24, 2007, pp. 2. |
SnagIt, “SnagIt 8.1.0 Print Out 2,” Software release date Jun. 15, 2006, pp. 1-3. |
SnagIt, “SnagIt 8.1.0 Print Out,” Software release date Jun. 15, 2006, pp. 6. |
SnagIt, “SnagIt Online Help Guide,” <http://download.techsmith.com/snagit/docs/onlinehelp/enu/snagit_help.pdf>, TechSmith Corp., Version 8.1, printed Feb. 7, 2007, pp. 284. |
Warren, Christina, “TUAW Faceoff: Screenshot apps on the firing line,” <http://www.tuaw.com/2008/05/05/tuaw-faceoff-screenshot-apps-on-the-firing-line/>, May 5, 2008, pp. 11. |
Wikipedia, “Multimap,” Jan. 1, 2013, https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748. |
IBM Predictive Analytics, https://www.ibm.com/analytics/us/en/technology/predictive-analytics/, as printed Feb. 15, 2017 in 12 pages. |
IBM SPSS Modeler, https://www.ibm.com/us-en/marketplace/spss-modeler, as printed Feb. 15, 2017 in 5 pages. |
IBM Analytics, “IBM SPSS software and Watson Analytics: A powerful combo for the cognitive age,” available at https://www.youtube.com/watch?v=AvYctzFt8gc as published on Apr. 14, 2016. |
Armand Ruiz, “Watson Analytics, SPSS Modeler and Esri ArcGIS,” available at https://www.youtube.com/watch?v=fk49hw4OrN4, as published on Jul. 28, 2015. |
IBM Knowledge Center, “Merge Node,” https://www.ibm.com/support/knowledgecenter/en/SS3RA7_15.0.0/com.ibm.spss.modeler.help/merge_overview.html[ibm.com], as printed Feb. 14, 2017 in 1 page. |
IBM Knowledge Center, “New features in IBM SPSS Modeler Professional,” https://www.ibm.com/support/knowledgecenter/en/SS3RA7_15.0./com.ibm.spss.modeler.help/whatsnew_features_pro.htm[ibm.com], as printed Feb. 14, 2017 in 2 pages. |
IBM Knowledge Center, “Overview—What's new in IBM Watson Explorer Content Analytics Version 10.0,” https://www.ibm.com/support/knowledgecenter/en/SS8NLW_10.0.0/com.ibm.discovery.es.nav.doc/llysawhatsnew.htm, as printed Mar. 6, 2017 in 4 pages. |
Yates, Rob, “Introducing the IBM Watson Natural Language Classifier,” IBM developerWorks/Developer Centers, posted Jul. 10, 2015 in 4 pages, https://developer.ibm.com/watson/blog/2015/07/10/the-ibm-watson-natural-language-classifier/. |
Goyal, Manish, “Announcing our largest release of Watson Developer Cloud services,” IBM developerWorks/Developer Centers, posted Sep. 24, 2015 in 6 pages, https://developer.ibm.com/watson/blog/2015/09/24/announcing-our-largest-release-of-watson-developer-cloud-services/. |
IBM Analytics Communities, “Is IBM SPSS statistics now integrated to WatsonAnalytics?” https://community.watsonanalytics.com/discussions/questions/1464/is-ibm-spss-statistics-now-integrated-to-watsonana.html, as printed Mar. 7, 2017 in 2 pages. |
IBM Support, “Software lifecycle—Watson Explorer 10.0.0,” https://www-01.ibm.com/software/support/lifecycleapp/PLCDetail.wss?q45=T283072T66911H98, as printed Mar. 7, 2017 in 1 page. |
IBM Analytics Communities, “Creating a map visualization for UK coordinates,” https://community.watsonanalytics.com/discussions/questions/3753/creating-a-map-visualisation-for-uk-coordinates.html, as printed Mar. 9, 2017 in 1 page. |
Esri News, “IBM and Esri Team Up to Offer Cognitive Analytics and IoT in the IBM Cloud,” http://www.esri.com/esri-news/releases/16-4qtr/ibm-and-esri-team-up-to-offer-cognitive-analytics-and-lot-in-the-ibm-cloud, as published on Oct. 26, 2016, in 2 pages. |
Notice of Acceptance for Australian Patent Application No. 2013251186 dated Nov. 6, 2015. |
Notice of Allowance for U.S. Appl. No. 14/265,637 dated Feb. 13, 2015. |
Notice of Allowance for U.S. Appl. No. 14/323,935 dated Oct. 1, 2015. |
Notice of Allowance for U.S. Appl. No. 14/552,336 dated Nov. 3, 2015. |
Notice of Allowance for U.S. Appl. No. 14/746,671 dated Jan. 21, 2016. |
Notice of Allowance for U.S. Appl. No. 14/934,004 dated Nov. 4, 2016. |
Official Communication for Australian Patent Application No. 2013251186 dated Mar. 12, 2015. |
Official Communication for Canadian Patent Application No. 2831660 dated Jun. 9, 2015. |
Official Communication for European Patent Application No. 12181585.6 dated Sep. 4, 2015. |
Official Communication for European Patent Application No. 14200246.8 dated May 29, 2015. |
Official Communication for European Patent Application No. 15184764.7 dated Dec. 14, 2015. |
Official Communication for European Patent Application No. 15188106.7 dated Feb. 3, 2016. |
Official Communication for European Patent Application No. 15190307.7 dated Feb. 19, 2016. |
Official Communication for Great Britain Patent Application No. 1404486.1 dated Aug. 27, 2014. |
Official Communication for Great Britain Patent Application No. 1404489.5 dated Aug. 27, 2014. |
Official Communication for Great Britain Patent Application No. 1404499.4 dated Aug. 20, 2014. |
Official Communication for Netherlands Patent Application No. 2011729 dated Aug. 13, 2015. |
Official Communication for Netherlands Patent Application No. 2012417 dated Sep. 18, 2015. |
Official Communication for Netherlands Patent Application No. 2012421 dated Sep. 18, 2015. |
Official Communication for Netherlands Patent Application No. 2012438 dated Sep. 21, 2015. |
Official Communication for U.S. Appl. No. 12/556,321 dated Feb. 25, 2016. |
Official Communication for U.S. Appl. No. 12/556,321 dated Jun. 6, 2012. |
Official Communication for U.S. Appl. No. 12/556,321 dated Dec. 7, 2011. |
Official Communication for U.S. Appl. No. 12/556,321 dated Jul. 7, 2015. |
Official Communication for U.S. Appl. No. 13/669,274 dated Aug. 26, 2015. |
Official Communication for U.S. Appl. No. 13/669,274 dated May 6, 2015. |
Official Communication for U.S. Appl. No. 14/141,252 dated Oct. 8, 2015. |
Official Communication for U.S. Appl. No. 14/222,364 dated Dec. 9, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Dec. 21, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Jan. 4, 2016. |
Official Communication for U.S. Appl. No. 14/265,637 dated Sep. 26, 2014. |
Official Communication for U.S. Appl. No. 14/306,138 dated Mar. 17, 2016. |
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 24, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Mar. 4, 2016. |
Official Communication for U.S. Appl. No. 14/306,154 dated Feb. 1, 2016. |
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 17, 2016. |
Official Communication for U.S. Appl. No. 14/463,615 dated May 12, 2016. |
Official Communication for U.S. Appl. No. 14/463,615 dated Mar. 21, 2016. |
Official Communication for U.S. Appl. No. 14/463,615 dated Dec. 9, 2015. |
Official Communication for U.S. Appl. No. 14/483,527 dated Oct. 28, 2015. |
Official Communication for U.S. Appl. No. 14/562,524 dated Nov. 10, 2015. |
Official Communication for U.S. Appl. No. 14/562,524 dated Feb. 18, 2016. |
Official Communication for U.S. Appl. No. 14/571,098 dated Nov. 10, 2015. |
Official Communication for U.S. Appl. No. 14/676,621 dated Oct. 29, 2015. |
Official Communication for U.S. Appl. No. 14/715,834 dated Feb. 19, 2016. |
Official Communication for U.S. Appl. No. 14/746,671 dated Nov. 12, 2015. |
Official Communication for U.S. Appl. No. 14/800,447 dated Dec. 10, 2015. |
Official Communication for U.S. Appl. No. 14/800,447 dated Mar. 3, 2016. |
Official Communication for U.S. Appl. No. 14/841,338 dated Feb. 18, 2016. |
Official Communication for U.S. Appl. No. 14/842,734 dated Jun. 1, 2017. |
Official Communication for U.S. Appl. No. 14/842,734 dated Nov. 19, 2015. |
Official Communication for U.S. Appl. No. 14/871,465 dated Feb. 9, 2016. |
Official Communication for U.S. Appl. No. 14/883,498 dated Mar. 17, 2016. |
Official Communication for U.S. Appl. No. 14/883,498 dated Dec. 24, 2015. |
Official Communication for U.S. Appl. No. 15/072,133 dated Mar. 17, 2017. |
Ask Drexel University Knowledge Base, “How to: Auto Save a Document Before Printing in Word 2007,” published Nov. 13, 2007. |
Harville et al., “Mediabeads: An Architecture for Path-Enhanced Media Applications,” 2004 IEEE International Conference on Multimedia and Expo, Jun. 27-30, 2004, Taipei, Taiwan, vol. 1, pp. 455-458. |
MacWright, Tom, "Announcing MapBox.js 1.0 with Leaflet," Mapbox.com blog, Apr. 18, 2013, retrieved from https://www.mapbox.com/blog/mapbox-js-with-leaflet/. |
Palantir, “Basic Map Searches,” YouTube, Sep. 12, 2013, retrieved from https://www.youtube.com/watch?v=UC-1x44xFR0. |
Palantir, “Intelligence Integration in Palantir: An Open-Source View of the Afghan Conflict,” YouTube, Jul. 5, 2012, retrieved from https://www.youtube.com/watch?v=FXTxs2YqHY4. |
“What Was It Again? Ways to Make Feature Tile Layers Interactive,” WordPress.com, published Jun. 12, 2011, retrieved from https://whatwasitagain.wordpress.com/2011/06/12/interactive-feature-tile-layers/. |
International Search Report and Written Opinion in Application No. PCT/US2009/056703 dated Mar. 15, 2010. |
Notice of Acceptance for Australian Patent Application No. 2012216622 dated Jan. 6, 2015. |
Notice of Allowance for U.S. Appl. No. 14/319,765 dated Nov. 25, 2016. |
Notice of Allowance for U.S. Appl. No. 14/323,881 dated Jun. 30, 2017. |
Notice of Allowance for U.S. Appl. No. 14/486,991 dated May 1, 2015. |
Notice of Allowance for U.S. Appl. No. 15/072,133 dated Jun. 23, 2017. |
Official Communication for Australian Patent Application No. 2010227081 dated Mar. 18, 2011. |
Official Communication for Australian Patent Application No. 2010257305 dated Apr. 12, 2011. |
Official Communication for Australian Patent Application No. 2010257305 dated Sep. 22, 2011. |
Official Communication for European Patent Application No. 10195798.3 dated May 17, 2011. |
Official Communication for European Patent Application No. 12186236.1 dated May 17, 2013. |
Official Communication for European Patent Application No. 16160781.7 dated May 27, 2016. |
Official Communication for European Patent Application No. 16184373.5 dated Jan. 17, 2017. |
Official Communication for European Patent Application No. 16186622.3 dated Jan. 18, 2017. |
Official Communication for Great Britain Patent Application No. 1319225.7 dated May 2, 2014. |
Official Communication for Great Britain Patent Application No. 1620827.4 dated Jan. 12, 2017. |
Official Communication for Great Britain Patent Application No. 1620827.4 dated Jun. 28, 2017. |
Official Communication for New Zealand Patent Application No. 616167 dated Oct. 10, 2013. |
Official Communication for U.S. Appl. No. 14/323,878 dated Jul. 27, 2017. |
Official Communication for U.S. Appl. No. 14/323,878 dated Mar. 30, 2017. |
Official Communication for U.S. Appl. No. 14/323,881 dated Apr. 18, 2017. |
Official Communication for U.S. Appl. No. 14/571,098 dated Feb. 23, 2016. |
Official Communication for U.S. Appl. No. 14/746,671 dated Sep. 28, 2015. |
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, pp. 30. |
“Andy Turner's GISRUK 2012 Notes” https://docs.google.com/document/d/1cTmxg7mVx5gd89lqbICYvCEnHA4QAivH4l4WpyPsqE4/edit?pli=1 printed Sep. 16, 2013 in 15 pages. |
Barnes et al., "Viewshed Analysis", GIS-ARC/Info 2001, www.evsc.virginia.edu/~jhp7e/evsc466/student_pres/Rounds.pdf. |
Carver et al., “Real-Time Visibility Analysis and Rapid Viewshed Calculation Using a Voxel-Based Modelling Approach,” GISRUK 2012 Conference, Apr. 11-13, Lancaster UK, Apr. 13, 2012, pp. 6. |
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152. |
Definition “Identify” downloaded Jan. 22, 2015, 1 page. |
Definition “Overlay” downloaded Jan. 22, 2015, 1 page. |
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine, Jul. 2, 2005 in 10 pages, http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411. |
Ghosh, P., “A Solution of Polygon Containment, Spatial Planning, and Other Related Problems Using Minkowski Operations,” Computer Vision, Graphics, and Image Processing, 1990, vol. 49, pp. 1-35. |
GIS-NET 3 Public—Department of Regional Planning. Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html. |
Gorr et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation,” Grant 98-IJ-CX-K005, May 6, 2002, 37 pages. |
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144. |
Haralick et al., “Image Analysis Using Mathematical Morphology,” Pattern Analysis and Machine Intelligence, IEEE Transactions, Jul. 1987, vol. PAMI-9, No. 4, pp. 532-550. |
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context, Mar. 18, 2011, pp. 16. |
Huang et al., "Systematic and Integrative Analysis of Large Gene Lists Using DAVID Bioinformatics Resources," Nature Protocols, vol. 4, No. 1, 2008, pp. 44-57. |
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33. |
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36. |
“HunchLab: Heat Map and Kernel Density Calculation for Crime Analysis,” Azavea Journal, printed from www.azavea.com/blogs/newsletter/v414/kernel-density-capabilities-added-to-hunchlab/ on Sep. 9, 2014, 2 pages. |
Ipbucker, C., “Inverse Transformation for Several Pseudo-cylindrical Map Projections Using Jacobian Matrix,” ICCSA 2009, Part 1 LNCS 5592, pp. 553-564. |
Levine, N., “Crime Mapping and the Crimestat Program,” Geographical Analysis, 2006, vol. 38, pp. 41-56. |
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8. |
Mandagere, Nagapramod, "Buffer Operations in GIS," http://www-users.cs.umn.edu/~npramod/enc_pdf.pdf retrieved Jan. 28, 2010, pp. 7. |
Map Builder, “Rapid Mashup Development Tool for Google and Yahoo Maps!” http://web.archive.org/web/20090626224734/http://www.mapbuilder.net/ printed Jul. 20, 2012 in 2 pages. |
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com. |
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com. |
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com. |
Murray, C., Oracle Spatial Developer's Guide-6 Coordinate Systems (Spatial Reference Systems), http://docs.oracle.com/cd/B28359_01/appdev.111/b28400.pdf, Jun. 2009. |
Open Street Map, "Amm's Diary: Unconnected ways and other data quality issues," http://www.openstreetmap.org/user/amm/diary printed Jul. 23, 2012 in 3 pages. |
POI Editor, “How to: Create Your Own Points of Interest,” http://www.poieditor.com/articles/how_to_create_your_own_points_of_interest/ printed Jul. 22, 2012 in 4 pages. |
Pozzi et al., “Vegetation and Population Density in Urban and Suburban Areas in the U.S.A.” Third International Symposium of Remote Sensing of Urban Areas Istanbul, Turkey, Jun. 2002, pp. 8. |
Qiu, Fang, "3D Analysis and Surface Modeling", http://web.archive.org/web/20091202221925/http://www.utsa.edu/Irsg/Teaching/EES6513/08-3D.pdf printed Sep. 16, 2013 in 26 pages. |
Reddy et al., "Under the hood of GeoVRML 1.0," SRI International, Proceedings of the Fifth Symposium on Virtual Reality Modeling Language (Web3D-VRML), New York, NY, Feb. 2000, pp. 23-28. http://pdf.aminer.org/000/648/038/under_the_hood_of_geovrml.pdf. |
Reibel et al., “Areal Interpolation of Population Counts Using Pre-classified Land Cover Data,” Population Research and Policy Review, 2007, vol. 26, pp. 619-633. |
Reibel, M., “Geographic Information Systems and Spatial Data Processing in Demography: a Review,” Population Research and Policy Review, 2007, vol. 26, pp. 601-618. |
Rizzardi et al., “Interfacing U.S. Census Map Files with Statistical Graphics Software: Application and Use in Epidemiology,” Statistics in Medicine, Oct. 1993, vol. 12, No. 19-20, pp. 1953-1964. |
Snyder, “Map Projections—A Working Manual,” U.S. Geological Survey Professional paper 1395, United States Government Printing Office, Washington: 1987, pp. 11-21 and 60-70. |
Sonris, “Using the Area of Interest Tools,” http://web.archive.org/web/20061001053327/http://sonris-www.dnr.state.la.us/gis/instruct_files/tutslide12 printed Jan. 3, 2013 in 1 page. |
Tangelder et al., “Freeform Shape Matching Using Minkowski Operations,” The Netherlands, Jun. 1996, pp. 12. |
Thompson, Mick, “Getting Started with GEO,” Getting Started with GEO, Jul. 26, 2011. |
Valentini et al., “Ensembles of Learning Machines,” M. Marinaro and R. Tagliaferri (Eds.): WIRN VIETRI 2002, LNCS 2486, pp. 3-20. |
VB Forums, “Buffer a Polygon,” Internet Citation, http://www.vbforums.com/showthread.php?198436-Buffer-a-Polygon, Specifically Thread #1, #5 & #11 retrieved on May 2, 2013, pp. 8. |
Vivid Solutions, “JTS Topology Suite: Technical Specifications,” http://www.vividsolutions.com/jts/bin/JTS%20Technical%20Specs.pdf Version 1.4, 2003, pp. 36. |
Wikipedia, “Douglas-Peucker-Algorithms,” http://de.wikipedia.org/w/index.php?title=Douglas-Peucker-Algorithmus&oldid=91846042 printed Jul. 2011, pp. 2. |
Wikipedia, “Ramer-Douglas-Peucker Algorithm,” http://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm printed Jul. 2011, pp. 3. |
Wongsuphasawat et al., “Visual Analytics for Transportation Incident Data Sets,” Transportation Research Record 2138, 2009, pp. 135-145. |
Woodbridge, Stephen, “[Geos-devel] Polygon simplification,” http://lists.osgeo.org/pipermail/geos-devel/2011-May/005210.html dated May 8, 2011, pp. 3. |
Notice of Allowance for U.S. Appl. No. 12/840,673 dated Apr. 6, 2015. |
Notice of Allowance for U.S. Appl. No. 13/948,859 dated Dec. 10, 2014. |
Notice of Allowance for U.S. Appl. No. 14/294,098 dated Dec. 29, 2014. |
Notice of Allowance for U.S. Appl. No. 14/319,161 dated May 4, 2015. |
Notice of Allowance for U.S. Appl. No. 14/730,123 dated Apr. 12, 2016. |
Official Communication for Australian Patent Application No. 2012216622 dated Jan. 6, 2015. |
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015. |
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015. |
Official Communication for European Patent Application No. 14187739.9 dated Jul. 6, 2015. |
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014. |
Official Communication for Netherlands Patent Application No. 2011632 dated Feb. 8, 2016. |
Official Communication for Netherlands Patent Application No. 2012778 dated Sep. 22, 2015. |
Official Communication for New Zealand Patent Application No. 628585 dated Aug. 26, 2014. |
Official Communication for New Zealand Patent Application No. 628840 dated Aug. 28, 2014. |
Official Communication for U.S. Appl. No. 12/840,673 dated Sep. 17, 2014. |
Official Communication for U.S. Appl. No. 12/840,673 dated Jan. 2, 2015. |
Official Communication for U.S. Appl. No. 13/728,879 dated Aug. 12, 2015. |
Official Communication for U.S. Appl. No. 13/728,879 dated Mar. 17, 2015. |
Official Communication for U.S. Appl. No. 13/728,879 dated Nov. 20, 2015. |
Official Communication for U.S. Appl. No. 13/728,879 dated Jan. 27, 2015. |
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014. |
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015. |
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015. |
Official Communication for U.S. Appl. No. 14/289,596 dated May 9, 2016. |
Official Communication for U.S. Appl. No. 14/289,596 dated Aug. 5, 2015. |
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014. |
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015. |
Official Communication for U.S. Appl. No. 14/289,599 dated Sep. 4, 2015. |
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014. |
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014. |
Official Communication for U.S. Appl. No. 14/319,161 dated Jan. 23, 2015. |
Official Communication for U.S. Appl. No. 14/490,612 dated Aug. 18, 2015. |
Official Communication for U.S. Appl. No. 14/490,612 dated Mar. 31, 2015. |
Official Communication for U.S. Appl. No. 14/490,612 dated Jan. 27, 2015. |
Official Communication for U.S. Appl. No. 14/730,123 dated Sep. 21, 2015. |
Official Communication for U.S. Appl. No. 14/929,584 dated Feb. 4, 2016. |
Official Communication for U.S. Appl. No. 14/934,004 dated Feb. 16, 2016. |
Official Communication for U.S. Appl. No. 14/934,004 dated Jul. 29, 2016. |
Official Communication for U.S. Appl. No. 12/840,673 dated Jul. 25, 2012. |
Official Communication for U.S. Appl. No. 12/840,673 dated Jan. 4, 2013. |
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, pp. 2. |
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/. |
“Money Laundering Risks and E-Gaming: A European Overview and Assessment,” 2009, http://www.cf.ac.uk/socsi/resources/Levi_Final_Money_Laundering_Risks_egaming.pdf. |
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf. |
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015. |
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29. |
“Using Whois Based Geolocation and Google Maps API for Support Cybercrime Investigations,” http://wseas.us/e-library/conferences/2013/Dubrovnik/TELECIRC/TELECIRC-32.pdf. |
About 80 Minutes, “Palantir in a Number of Parts—Part 6—Graph,” Mar. 21, 2013, pp. 1-6. |
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316. |
Alur et al., “Chapter 2: IBM InfoSphere DataStage Stages,” IBM InfoSphere DataStage Data Flow and Job Design, Jul. 1, 2008, pp. 35-137. |
Amnet, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html. |
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/Desktop/javase6/modality/ Jan. 21, 2006, pp. 8. |
Appacts, “Smart Thinking for Super Apps,” <http://www.appacts.com> Printed Jul. 18, 2013 in 4 pages. |
Apsalar, “Data Powered Mobile Advertising,” “Free Mobile App Analytics” and various analytics related screen shots <http://apsalar.com> Printed Jul. 18, 2013 in 8 pages. |
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286. |
Boyce, Jim, “Microsoft Outlook 2010 Inside Out,” Aug. 1, 2010, retrieved from the Internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf. |
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages. |
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10. |
Capptain—Pilot Your Apps, <http://www.capptain.com> Printed Jul. 18, 2013 in 6 pages. |
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from Internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on May 18, 2015. |
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80. |
Cohn, et al., “Semi-supervised clustering with user feedback,” Constrained Clustering: Advances in Algorithms, Theory, and Applications 4.1 (2003): 17-32. |
Conner, Nancy, “Google Apps: The Missing Manual,” May 1, 2008, pp. 15. |
Countly Mobile Analytics, <http://count.ly/> Printed Jul. 18, 2013 in 9 pages. |
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” BioInformatics, vol. 23, No. 6, 2007, pp. 673-679. |
DISTIMO—App Analytics, <http://www.distimo.com/app-analytics> Printed Jul. 18, 2013 in 5 pages. |
Flurry Analytics, <http://www.flurry.com/> Printed Jul. 18, 2013 in 14 pages. |
Gesher, Ari, “Palantir Screenshots in the Wild: Swing Sightings,” The Palantir Blog, Sep. 11, 2007, pp. 1-12. |
Google Analytics Official Website—Web Analytics & Reporting, <http://www.google.com/analytics.index.html> Printed Jul. 18, 2013 in 22 pages. |
Goswami, Gautam, “Quite Writly Said!,” One Brick at a Time, Aug. 21, 2005, pp. 7. |
Gu et al., “Record Linkage: Current Practice and Future Directions,” Jan. 15, 2004, pp. 32. |
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010. |
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News on Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages. |
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95. |
Hua et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, pp. 277-288, 2006. |
Kahan et al., “Annotea: an Open RDF Infrastructure for Shared Web Annotations”, Computer Networks, Elsevier Science Publishers B.V., vol. 39, No. 5, dated Aug. 5, 2002, pp. 589-608. |
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf> downloaded May 12, 2014 in 8 pages. |
Keylines.com, “KeyLines Datasheet,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf> downloaded May 12, 2014 in 2 pages. |
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf> downloaded May 12, 2014 in 10 pages. |
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21. |
Kontagent Mobile Analytics, <http://www.kontagent.com/> Printed Jul. 18, 2013 in 9 pages. |
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607. |
Localytics—Mobile App Marketing & Analytics, <http://www.localytics.com/> Printed Jul. 18, 2013 in 12 pages. |
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15. |
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10. |
Manske, “File Saving Dialogs,” <http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html>, Jan. 20, 1999, pp. 7. |
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, <http://msdn.microsoft.com/en-us/library/ff604039%28v=office.14%29.aspx> as printed Apr. 4, 2014 in 17 pages. |
Microsoft Office—Visio, “About connecting shapes,” <http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx> printed Aug. 4, 2011 in 6 pages. |
Microsoft Office—Visio, “Add and glue connectors with the Connector tool,” <http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1> printed Aug. 4, 2011 in 1 page. |
Mixpanel—Mobile Analytics, <https://mixpanel.com/> Printed Jul. 18, 2013 in 13 pages. |
Mizrachi, Ilene, "Chapter 1: GenBank: The Nucleotide Sequence Database," The NCBI Handbook, Oct. 2002, pp. 1-14. |
Nierman, “Evaluating Structural Similarity in XML Documents”, 6 pages, 2002. |
Nolan et al., “MCARTA: A Malicious Code Automated Run-Time Analysis Framework,” Homeland Security, 2012 IEEE Conference on Technologies for, Nov. 13, 2012, pp. 13-17. |
Olanoff, Drew, “Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility,” May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http://techcrunch.com/2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/. |
Open Web Analytics (OWA), <http://www.openwebanalytics.com/> Printed Jul. 19, 2013 in 5 pages. |
Palantir Technologies, "Palantir Labs - Timeline," Oct. 1, 2010, retrieved from the internet https://www.youtube.com/watch?v=JCgDW5bru9M. |
Palmas et al., "An Edge-Bundling Layout for Interactive Parallel Coordinates," 2014 IEEE Pacific Visualization Symposium, pp. 57-64. |
Perdisci et al., “Behavioral Clustering of HTTP-Based Malware and Signature Generation Using Malicious Network Traces,” USENIX, Mar. 18, 2010, pp. 1-14. |
Piwik—Free Web Analytics Software, <http://piwik.org/> Printed Jul. 19, 2013 in 18 pages. |
Quest, “Toad for Oracle 11.6—Guide to Using Toad,” Sep. 24, 2012, pp. 1-162. |
Rouse, Margaret, “OLAP Cube,” <http://searchdatamanagement.techtarget.com/definition/OLAP-cube>, Apr. 28, 2012, pp. 16. |
Shi et al., “A Scalable Implementation of Malware Detection Based on Network Connection Behaviors,” 2013 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery, IEEE, Oct. 10, 2013, pp. 59-66. |
Sigrist, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166. |
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11. |
StatCounter—Free Invisible Web Tracker, Hit Counter and Web Stats, <http://statcounter.com/> Printed Jul. 19, 2013 in 17 pages. |
Symantec Corporation, “E-Security Begins with Sound Security Policies,” Announcement Symantec, Jun. 14, 2001. |
TestFlight—Beta Testing on the Fly, <http://testflightapp.com/> Printed Jul. 18, 2013 in 3 pages. |
trak.io, <http://trak.io/> printed Jul. 18, 2013 in 3 pages. |
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18. |
UserMetrix, <http://usermetrix.com/android-analytics> printed Jul. 18, 2013 in 3 pages. |
Vose et al., “Help File for ModelRisk Version 5,” 2007, Vose Software, pp. 349-353. [Uploaded in 2 Parts]. |
Wang et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter,” IEEE 2010, 5 pages. |
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221. |
Wright et al., "Palantir Technologies VAST 2010 Challenge Text Records - Investigations into Arms Dealing," Oct. 29, 2010, pp. 1-10. |
Yang et al., “HTML Page Analysis Based on Visual Cues”, A129, pp. 859-864, 2001. |
Notice of Acceptance for Australian Patent Application No. 2014250678 dated Oct. 7, 2015. |
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Nov. 2, 2015. |
Notice of Allowance for U.S. Appl. No. 13/728,879 dated Jun. 21, 2016. |
Notice of Allowance for U.S. Appl. No. 14/102,394 dated Aug. 25, 2014. |
Notice of Allowance for U.S. Appl. No. 14/108,187 dated Aug. 29, 2014. |
Notice of Allowance for U.S. Appl. No. 14/135,289 dated Oct. 14, 2014. |
Notice of Allowance for U.S. Appl. No. 14/148,568 dated Aug. 26, 2015. |
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Dec. 16, 2014. |
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015. |
Notice of Allowance for U.S. Appl. No. 14/268,964 dated Dec. 3, 2014. |
Notice of Allowance for U.S. Appl. No. 14/326,738 dated Nov. 18, 2015. |
Notice of Allowance for U.S. Appl. No. 14/473,552 dated Jul. 24, 2015. |
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Jan. 5, 2015. |
Notice of Allowance for U.S. Appl. No. 14/479,863 dated Mar. 31, 2015. |
Notice of Allowance for U.S. Appl. No. 14/504,103 dated May 18, 2015. |
Notice of Allowance for U.S. Appl. No. 14/616,080 dated Apr. 2, 2015. |
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015. |
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015. |
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015. |
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015. |
Official Communication for European Patent Application No. 14158861.6 dated Jun. 16, 2014. |
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014. |
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015. |
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015. |
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015. |
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015. |
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015. |
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015. |
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015. |
Official Communication for European Patent Application No. 14189347.9 dated Mar. 4, 2015. |
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015. |
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015. |
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015. |
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015. |
Official Communication for European Patent Application No. 14197938.5 dated Apr. 28, 2015. |
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015. |
Official Communication for European Patent Application No. 14200298.9 dated May 13, 2015. |
Official Communication for European Patent Application No. 15155845.9 dated Oct. 6, 2015. |
Official Communication for European Patent Application No. 15155846.7 dated Jul. 8, 2015. |
Official Communication for European Patent Application No. 15165244.3 dated Aug. 27, 2015. |
Official Communication for European Patent Application No. 15175106.2 dated Nov. 5, 2015. |
Official Communication for European Patent Application No. 15175151.8 dated Nov. 25, 2015. |
Official Communication for European Patent Application No. 15181419.1 dated Sep. 29, 2015. |
Official Communication for European Patent Application No. 15183721.8 dated Nov. 23, 2015. |
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014. |
Official Communication for Great Britain Patent Application No. 1404486.1 dated May 21, 2015. |
Official Communication for Great Britain Patent Application No. 1404489.5 dated May 21, 2015. |
Official Communication for Great Britain Patent Application No. 1404499.4 dated Jun. 11, 2015. |
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014. |
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014. |
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015. |
Official Communication for Netherlands Patent Application No. 2012437 dated Sep. 18, 2015. |
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015. |
Official Communication for New Zealand Patent Application No. 622473 dated Jun. 19, 2014. |
Official Communication for New Zealand Patent Application No. 622473 dated Mar. 27, 2014. |
Official Communication for New Zealand Patent Application No. 622513 dated Apr. 3, 2014. |
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014. |
Official Communication for New Zealand Patent Application No. 624557 dated May 14, 2014. |
Official Communication for New Zealand Patent Application No. 627962 dated Aug. 5, 2014. |
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014. |
Official Communication for New Zealand Patent Application No. 628263 dated Aug. 12, 2014. |
Official Communication for New Zealand Patent Application No. 628495 dated Aug. 19, 2014. |
Official Communication for U.S. Appl. No. 12/556,318 dated Jul. 2, 2015. |
Official Communication for U.S. Appl. No. 13/247,987 dated Apr. 2, 2015. |
Official Communication for U.S. Appl. No. 13/247,987 dated Sep. 22, 2015. |
Official Communication for U.S. Appl. No. 13/827,491 dated Dec. 1, 2014. |
Official Communication for U.S. Appl. No. 13/827,491 dated Jun. 22, 2015. |
Official Communication for U.S. Appl. No. 13/827,491 dated Oct. 9, 2015. |
Official Communication for U.S. Appl. No. 13/831,791 dated Mar. 4, 2015. |
Official Communication for U.S. Appl. No. 13/831,791 dated Aug. 6, 2015. |
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 17, 2015. |
Official Communication for U.S. Appl. No. 13/839,026 dated Aug. 4, 2015. |
Official Communication for U.S. Appl. No. 14/134,558 dated Oct. 7, 2015. |
Official Communication for U.S. Appl. No. 14/148,568 dated Oct. 22, 2014. |
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 26, 2015. |
Official Communication for U.S. Appl. No. 14/196,814 dated May 5, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014. |
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 2, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 11, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014. |
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014. |
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014. |
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014. |
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 14, 2015. |
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015. |
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014. |
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015. |
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014. |
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015. |
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015. |
Official Communication for U.S. Appl. No. 14/306,154 dated Nov. 16, 2015. |
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015. |
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014. |
Official Communication for U.S. Appl. No. 14/319,765 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015. |
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014. |
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015. |
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015. |
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014. |
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015. |
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014. |
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015. |
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015. |
Official Communication for U.S. Appl. No. 14/451,221 dated Oct. 21, 2014. |
Official Communication for U.S. Appl. No. 14/463,615 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated Nov. 13, 2014. |
Official Communication for U.S. Appl. No. 14/463,615 dated May 21, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated Jan. 28, 2015. |
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015. |
Official Communication for U.S. Appl. No. 14/479,863 dated Dec. 26, 2014. |
Official Communication for U.S. Appl. No. 14/483,527 dated Jun. 22, 2015. |
Official Communication for U.S. Appl. No. 14/483,527 dated Jan. 28, 2015. |
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015. |
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015. |
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015. |
Official Communication for U.S. Appl. No. 14/552,336 dated Jul. 20, 2015. |
Official Communication for U.S. Appl. No. 14/562,524 dated Sep. 14, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Mar. 11, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 24, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 5, 2015. |
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015. |
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015. |
Official Communication for U.S. Appl. No. 14/631,633 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/639,606 dated Oct. 16, 2015. |
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015. |
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015. |
Official Communication for U.S. Appl. No. 14/676,621 dated Jul. 30, 2015. |
Official Communication for U.S. Appl. No. 14/726,353 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015. |
Official Communication for U.S. Appl. No. 14/929,584 dated May 25, 2016. |
Official Communication for U.S. Appl. No. 15/072,133 dated Nov. 10, 2016. |
Restriction Requirement for U.S. Appl. No. 13/839,026 dated Apr. 2, 2015. |
Notice of Allowance for U.S. Appl. No. 15/072,133 dated Sep. 25, 2017. |
Official Communication for Great Britain Patent Application No. 1620827.4 dated Nov. 13, 2017. |
Official Communication for Great Britain Patent Application No. 1620827.4 dated Sep. 21, 2017. |
Official Communication for U.S. Appl. No. 14/323,878 dated Sep. 28, 2017. |
Official Communication for U.S. Appl. No. 14/323,881 dated Nov. 1, 2017. |
Official Communication for U.S. Appl. No. 14/842,734 dated Dec. 7, 2017. |
Official Communication for U.S. Appl. No. 14/323,878 dated Feb. 28, 2018. |
Official Communication for U.S. Appl. No. 15/146,841 dated Mar. 8, 2018. |
Official Communication for European Patent Application No. 16184373.5 dated May 28, 2018. |
Official Communication for U.S. Appl. No. 14/323,878 dated Sep. 7, 2018. |
Official Communication for U.S. Appl. No. 15/146,841 dated Nov. 28, 2018. |
Related Publications

| Number | Date | Country |
| --- | --- | --- |
| 20170052747 A1 | Feb 2017 | US |

Provisional Applications

| Number | Date | Country |
| --- | --- | --- |
| 62206174 | Aug 2015 | US |

Continuations

| Number | Date | Country |
| --- | --- | --- |
| Parent 14934004 | Nov 2015 | US |
| Child 15146842 | | US |