Server implemented geographic information system with graphical interface

Information

  • Patent Grant
  • Patent Number
    10,437,850
  • Date Filed
    Tuesday, December 19, 2017
  • Date Issued
    Tuesday, October 8, 2019
Abstract
Example embodiments described herein pertain to a geographic information system (GIS), configured to obtain geospatial data representing a geographic area, assign a projection and coordinate system to the geospatial data, apply a transformation to the geospatial data, and generate a tile cache based on the transformed geospatial data, the tile cache including the determined projection and coordinate system.
Description
TECHNICAL FIELD

The subject matter disclosed herein generally relates to machines configured to process data. Specifically, example embodiments relate to a server implemented geographic information system.


BACKGROUND

A geographic information system (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present geospatial data. Typically, a GIS uses a spatio-temporal location as the key index variable for all other information and calculations. A GIS can relate otherwise unrelated information (e.g., geographic data) by using location as the key index variable. Thus, any variable that can be located spatially can be referenced using a GIS. Locations in Earth space-time may be recorded as dates/times of occurrence, and x, y, and z coordinates representing longitude, latitude, and elevation, respectively.





BRIEF DESCRIPTION OF THE DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present inventive subject matter and cannot be considered as limiting its scope.



FIG. 1 is a network diagram illustrating a network environment suitable for generating and presenting a tile cache based on geospatial data, according to some example embodiments.



FIG. 2 is a block diagram illustrating components of a geographic information system suitable to receive geospatial data usable to generate and display a tile cache, according to some example embodiments.



FIG. 3 is a flowchart illustrating operations of the geographic information system in performing a method of obtaining geospatial data in order to generate and display a tile cache, according to some example embodiments.



FIG. 4 is a flowchart illustrating operations of the geographic information system in performing a method for determining and assigning a projection and coordinate system to the obtained geospatial data, according to some example embodiments.



FIG. 5 is a flowchart illustrating operations of the geographic information system in performing a method for determining and assigning a projection and coordinate system to the obtained geospatial data, according to some example embodiments.



FIG. 6 is an interaction diagram illustrating various example interactions between the geographic information system, third party servers, and a client device, according to some example embodiments.



FIG. 7 is a diagram illustrating a user interface for presenting geospatial data usable by the geographic information system to generate and display a tile cache, according to some example embodiments.



FIG. 8 is a diagram illustrating a user interface for presenting a base map usable by the geographic information system as a reference to determine a projection and coordinate system to apply to the obtained geospatial data, according to some example embodiments.



FIG. 9 is a diagram illustrating a user interface configured to receive user inputs defining common landmarks of the geospatial data and the base map in order to determine a projection and coordinate system, according to some example embodiments.



FIG. 10 is a diagram illustrating a user interface configured to receive user inputs adjusting a position of the geospatial data relative to the base map in order to determine a projection and coordinate system, according to some example embodiments.



FIG. 11 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.





DETAILED DESCRIPTION

Example embodiments described herein pertain to a geographic information system (GIS) configured to receive geospatial data from a multitude of sources, and use the geospatial data to generate and display a tile cache. The GIS may be or include a group of one or more server machines configured to provide one or more GIS services. A client device may accordingly request and receive, from the GIS, a tile cache based on multiple geospatial data inputs, including geospatial data submitted via scripts or external applications. The GIS may then determine an accurate corresponding projection and coordinate system of the geospatial data based on a user input, and in some example embodiments may apply a transformation to the geospatial data. Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.


The GIS is configured (e.g., by one or more suitable modules that include one or more processors) to obtain geospatial data (e.g., images captured via satellite and aerial sources), determine a projection and coordinate system of the geospatial data based on user inputs and a base map or corresponding metadata (e.g., a default projection and coordinate system), apply transformations to the geospatial data, and generate a tile cache useable by any conventional mapping system, based on at least the geospatial data. A tile cache is a collection of images made from geospatial data, comprising images of the geospatial data at several different scales. For example, based on the source data, and either corresponding metadata (e.g., which includes a projection and coordinate system) or a user input (e.g., defining the projection and coordinate system), a determination may be made regarding what “scales” are needed for the tile cache, and the size of the tiles comprising the tile cache. The GIS may obtain the geospatial data from a third party source, or directly from a client device.
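To make the scale determination concrete, the following is a minimal sketch, assuming a Web-Mercator-style scheme in which each zoom level doubles the resolution of the previous one and tiles are 256 pixels square; the function names and constants are illustrative assumptions rather than details prescribed by the embodiments described here.

```python
import math

# Approximate ground resolution (meters per pixel) of zoom level 0
# at the equator in a Web-Mercator-style scheme with 256-pixel tiles.
# (Assumed convention; not specified by the text above.)
ZOOM0_RESOLUTION_M = 156543.03392804097
TILE_SIZE = 256

def max_zoom_for_resolution(source_resolution_m: float) -> int:
    """Pick the deepest zoom level whose resolution is at least as
    fine as the source imagery; deeper levels would only upsample."""
    zoom = math.log2(ZOOM0_RESOLUTION_M / source_resolution_m)
    return max(0, math.ceil(zoom))

def tile_grid_size(width_px: int, height_px: int) -> tuple[int, int]:
    """Number of tile columns and rows needed to cover the image."""
    return (math.ceil(width_px / TILE_SIZE),
            math.ceil(height_px / TILE_SIZE))

# Example: 0.5 m/pixel aerial imagery, 10,000 x 8,000 pixels.
print(max_zoom_for_resolution(0.5))  # -> 19
print(tile_grid_size(10000, 8000))   # -> (40, 32)
```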


In some example embodiments, the GIS automatically determines a projection and coordinate system of the geospatial data based on corresponding metadata of the geospatial data. Metadata is information about digital data. Numerous metadata standards have been developed in the area of geographic information systems, including at least the Federal Geographic Data Committee (FGDC) standard, the Machine-Readable Cataloging (MARC) record, and Dublin Core. For example, the geospatial data may include metadata representing longitude, latitude, and elevation values useable to determine an appropriate projection and coordinate system to assign to the geospatial data. After obtaining the geospatial data, the GIS checks the metadata of the geospatial data to identify whether it includes longitude, latitude, and elevation values. If the corresponding metadata includes the coordinate values (e.g., longitude, latitude, and elevation values), the GIS determines and assigns a projection and coordinate system to the geospatial data. If the GIS determines that the corresponding metadata does not include any coordinates usable by the GIS in determining a projection and coordinate system, the GIS causes display of a notification on the client device, prompting the user to provide user inputs useable to identify the projection and coordinate system of the geospatial data.
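A minimal sketch of this metadata check, assuming a simple dictionary of metadata fields; the key names and the WGS 84 fallback are illustrative assumptions, not a schema defined by the embodiments.

```python
# Key names below are hypothetical; real geospatial metadata follows
# standards such as FGDC, MARC, or Dublin Core.
REQUIRED_KEYS = ("longitude", "latitude", "elevation")

def projection_from_metadata(metadata: dict) -> str | None:
    """Return a coordinate reference system identifier when the
    metadata carries usable coordinates; return None to signal that
    the user must be prompted for georectification inputs."""
    if all(key in metadata for key in REQUIRED_KEYS):
        # A real system would derive the projection from the values;
        # geographic WGS 84 is assumed here for illustration.
        return "EPSG:4326"
    return None

meta = {"longitude": -122.4, "latitude": 37.8, "elevation": 12.0}
crs = projection_from_metadata(meta)
if crs is None:
    print("Prompt the user to identify a projection and coordinate system")
else:
    print(f"Assigned {crs}")
```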


In instances where the geospatial data has no corresponding longitude, latitude, and elevation values, the GIS determines a projection and coordinate system to assign to the geospatial data based on user input. For example, the GIS is configured to present the geospatial data at a client device, displayed beside a base map, where the geospatial data and the base map both represent a geographic region, and the base map includes a corresponding base-projection and base-coordinate system. The user input may include inputs to “hand georectify” the geospatial data by manually selecting points on the geospatial data and the base map, where the selected points represent pairs of matching landmarks. Georectification is the digital alignment of a satellite or aerial image with a map of the same area. In georectification, a number of corresponding control points (e.g., landmarks such as street intersections) are marked on both the image (e.g., the geospatial data) and the map (e.g., the base map). These locations become reference points in the subsequent processing of the image. The GIS determines a projection and coordinate system for the geospatial data based on at least the base-projection and base-coordinate system of the base map and the user inputs identifying the matching pairs of points.
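In practice, this control-point step often reduces to fitting a transform that maps the selected image points onto the corresponding base-map points. The sketch below assumes an affine model solved by least squares; the embodiments above do not prescribe a particular model or solver, and the sample coordinates are invented for illustration.

```python
import numpy as np

def fit_affine(src_pts: np.ndarray, dst_pts: np.ndarray) -> np.ndarray:
    """Least-squares fit of a 2x3 affine transform mapping image
    (pixel) control points onto base-map coordinates. An affine
    transform has six unknowns, so at least three non-collinear
    point pairs are required."""
    n = len(src_pts)
    assert n >= 3, "need at least three control-point pairs"
    # Design matrix: one row [x, y, 1] per source point.
    A = np.hstack([src_pts, np.ones((n, 1))])
    # Solve A @ coeffs = dst_pts in the least-squares sense.
    coeffs, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)
    return coeffs.T  # rows give the x' and y' equations

# Three matched landmarks: pixel positions -> base-map coordinates.
src = np.array([[120.0, 40.0], [900.0, 60.0], [450.0, 700.0]])
dst = np.array([[-122.51, 37.80], [-122.38, 37.80], [-122.45, 37.70]])
T = fit_affine(src, dst)
# Map a new pixel location into base-map coordinates.
print(T @ np.array([500.0, 350.0, 1.0]))
```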


According to various example embodiments, the GIS is further configured to apply a transformation to the geospatial data at the server, based on the determined projection and coordinate system. The transformations include: affine transformations (e.g., in order to create a correctly georectified version of the source data); converting between formats (e.g., GeoTIFF to JPEG); converting the image to other standard projections (e.g., as defined by the European Petroleum Survey Group (EPSG)); changing the transparency and/or color of the data; or generating a composite image based on multiple images (e.g., source data) imported at different projections, such that the composite image has a single, uniform projection and transparency. The transformation of the geospatial data aligns the geospatial data with the base map of the same area. For example, the server may distort the geospatial data (e.g., a satellite image) in such a way as to put the image in the corresponding spatial projection system of the base map. In some embodiments, the transformation is applied to the geospatial data by a conventional georectification application (e.g., the Geospatial Data Abstraction Library (GDAL)). With the transformed (e.g., georectified) geospatial data, the GIS generates a tile cache, where the tile cache is a fraction of the size of the original image.
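As a concrete illustration of such a server-side pipeline, here is a minimal sketch using the GDAL Python bindings; the file names, the ground control points, and the choice of EPSG:3857 as the target projection are assumptions made for the example, not values from the embodiments above.

```python
from osgeo import gdal

# Ground control points: map coordinates (x, y, z) paired with
# pixel/line positions, e.g. produced by hand georectification.
gcps = [
    gdal.GCP(-122.51, 37.80, 0, 120, 40),
    gdal.GCP(-122.38, 37.80, 0, 900, 60),
    gdal.GCP(-122.45, 37.70, 0, 450, 700),
]

# Attach the control points and their reference system to a copy
# of the source image.
gdal.Translate("georef.tif", "source.tif",
               GCPs=gcps, outputSRS="EPSG:4326")

# Warp (georectify) into a standard projection; this applies the
# distortion that aligns the image with the base map.
gdal.Warp("rectified.tif", "georef.tif", dstSRS="EPSG:3857")

# Format conversion, e.g. GeoTIFF to JPEG for lightweight display.
gdal.Translate("rectified.jpg", "rectified.tif", format="JPEG")
```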


As an illustrative example from a user perspective, suppose a user launches an application configured to interact with the GIS on a client device, and the application enables the user to submit geospatial data to a server in order to generate a tile cache useable by a conventional mapping system. The GIS application may cause the display of a notification window on the client device, through which the user identifies an aerial surveillance image (e.g., geospatial data) of a geographic region. The aerial surveillance image may be located within a local storage component of the client device, or within a third-party database. The user may use the client device to transmit the aerial surveillance image to a server of the GIS.


Having obtained the aerial surveillance image from the user through the client device or a third-party server, the GIS may then identify, determine, and assign a projection and coordinate system to the aerial surveillance image, in order to apply any necessary transformations to the aerial surveillance image. If the GIS determines that the aerial surveillance image has no associated metadata which identifies a corresponding projection and coordinate system, the GIS obtains a base map of the same geographic region of the geospatial data (e.g., from the client device). The GIS may then display the base map and the aerial surveillance image within a graphical user interface presented on the client device.


The user may then identify matching pairs of landmarks located within the aerial surveillance image and the base map through user inputs on the graphical user interface displayed on the client device. Once the user has selected a minimum number of landmark pairs, the GIS identifies and assigns a projection and coordinate system to the aerial surveillance image. The GIS then applies a transformation to the aerial surveillance image based on the assigned projection and coordinate system.


Having applied a transformation to the aerial surveillance image, the GIS generates a tile cache at the GIS server, and delivers the tile cache to the client device. Additionally, the user may choose to request individual tiles, or to download the entire tile cache. The user may then use the individual tiles, or the entire tile cache within any conventional mapping system to view the surveyed area.



FIG. 1 is a network diagram illustrating a network environment 100 suitable for operating a GIS (e.g., geographic information system application (GIS) 142), according to some example embodiments. A networked system 102 provides server-side functionality, via a network 104 (e.g., an intranet, the Internet, or a wide area network (WAN)), to one or more clients. FIG. 1 illustrates, for example, a web client 112 (e.g., a web browser), client application(s) 114, and a programmatic client 116 executing on a respective client device 110. It shall be appreciated that although the various functional components of the network environment 100 are discussed in the singular sense, multiple instances of one or more of the various functional components may be employed.


An Application Program Interface (API) server 120 and a web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 140. The application server(s) 140 host the GIS 142. The application servers 140 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.


The GIS 142 is a server application with a web front-end that obtains geospatial data, allows georectification (e.g., an application of transformations) of the geospatial data, and may output the data in various forms for the networked system 102. For example, the GIS 142 may be configured to obtain geospatial data, apply a georectification (e.g., transformation) to the geospatial data, and output a tile cache based on the georectified geospatial data. While the GIS 142 is shown in FIG. 1 to form part of the networked system 102, it will be appreciated that, in alternative embodiments, the GIS 142 may form part of a system that is separate and distinct from the networked system 102.



FIG. 1 also illustrates a third-party application 132, executing on a third-party server 130, as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120. The third-party server 130 may, for example, be a source of geospatial data useable by the GIS 142.



FIG. 2 is a block diagram illustrating components of a geographic information system (e.g., the GIS 142) suitable to receive geospatial data, apply transformations, and generate and display a tile cache, according to some example embodiments. As is understood by skilled artisans in the relevant computer and Internet-related arts, each component (e.g., a module or engine) illustrated in FIG. 2 represents a set of executable software instructions and the corresponding hardware (e.g., memory and processor) for executing the instructions. The GIS 142 is shown as including a data retrieval module 202, a coordinate module 204, a transformation module 206, a tile caching module 208, and a presentation module 210, each of which is configured to communicate with the other modules (e.g., via a bus, shared memory, or a switch).


Geospatial data is obtained via the data retrieval module 202, from one or more data sources (e.g., the third-party servers 130 or the client device 110). In such instances, the data retrieval module 202 may receive a request to retrieve geospatial data from the third-party server 130 or from the client device 110. For example, a user on the client device 110 may submit a request to the GIS 142 to retrieve geospatial data. The request may identify a source of the geospatial data, either at the third-party server 130 or at a location on the client device 110. Responsive to receiving the request, the data retrieval module 202 retrieves the geospatial data for the GIS 142 from the identified data source.


After obtaining the geospatial data, the data retrieval module 202 provides the geospatial data to the coordinate module 204. The coordinate module 204 is configured to determine a projection and coordinate system of the geospatial data. In some example embodiments, the coordinate module 204 determines a projection and coordinate system based on corresponding metadata of the geospatial data. For example, the geospatial metadata may include coordinate values representing longitude, latitude, and elevation. Based on the coordinate values, the coordinate module 204 determines and assigns a projection and coordinate system to the geospatial data.


In instances where the geospatial data has no corresponding metadata, the coordinate module 204 determines a projection and coordinate system of the geospatial data based on user input. For example, the coordinate module 204 may receive user input from the client device 110 identifying matching pairs of landmarks on the geospatial data and a base map, where the base map represents the same geographic area as the geospatial data. The user inputs may, for example, include sets of points which represent matching landmark pairs located within the geospatial data and the base map. With the sets of points, the coordinate module 204 may determine and assign a projection and coordinate system to the geospatial data.


The transformation module 206 is configured to georectify (e.g., apply a transformation to) the geospatial data based on the projection and coordinate system determined by the coordinate module 204. The transformation converts the coordinate system of the geospatial data to another coordinate system (e.g., the coordinate system of the base map), and includes distortions applied to the geospatial data.


The tile caching module 208 is configured to obtain the transformed geospatial data with its corresponding projection and coordinate system, and generate a tile cache based on the transformed geospatial data. In a tile cache, the geospatial data is tiled so that it may be represented as a set of polygonal tiles. Tiling breaks the geospatial data into a manageable rectangular set of rows and columns of pixels, a technique typically used to process a large amount of data without consuming vast quantities of computer memory. Thus, by tiling the geospatial data in order to generate a tile cache, the GIS 142 enables a user to process a large amount of geospatial data without consuming large quantities of computer memory.
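A minimal sketch of this tiling step, assuming the Pillow imaging library and a 256-pixel tile size; the file paths and the row/column naming scheme are illustrative assumptions.

```python
import os
from PIL import Image

TILE_SIZE = 256  # assumed tile size; not prescribed by the text above

def tile_image(path: str, out_dir: str) -> None:
    """Cut a (georectified) raster into fixed-size tiles named by
    row and column, so clients can fetch only the tiles they need.
    Tiles on the right and bottom edges may be smaller."""
    os.makedirs(out_dir, exist_ok=True)
    image = Image.open(path)
    width, height = image.size
    for row, top in enumerate(range(0, height, TILE_SIZE)):
        for col, left in enumerate(range(0, width, TILE_SIZE)):
            box = (left, top,
                   min(left + TILE_SIZE, width),
                   min(top + TILE_SIZE, height))
            image.crop(box).save(os.path.join(out_dir, f"{row}_{col}.png"))

tile_image("rectified.tif", "tiles/zoom19")
```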


The presentation module 210 is configured to present a graphical user interface on the client device 110, where the graphical user interface includes at least a presentation of the geospatial data. In other example embodiments, the presentation module 210 also causes display of a base map and the generated tile cache on the client device 110.


Any one or more of the modules described may be implemented using hardware alone (e.g., one or more of the processors 212 of a machine) or a combination of hardware and software. For example, any module described of the GIS 142 may physically include an arrangement of one or more of the processors 212 (e.g., a subset of or among the one or more processors 212 of the machine) configured to perform the operations described herein for that module. As another example, any module of the GIS 142 may include software, hardware, or both, that configures an arrangement of one or more processors 212 (e.g., among the one or more processors 212 of the machine) to perform the operations described herein for that module. Accordingly, different modules of the GIS 142 may include and configure different arrangements of such processors 212 or a single arrangement of such processors 212 at different points in time. Moreover, any two or more modules of the GIS 142 may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.



FIG. 3 is a flowchart illustrating operations of the GIS 142 in performing a method 300 of obtaining geospatial data in order to generate a tile cache, according to some example embodiments. The method 300 may be embodied in computer-readable instructions for execution by one or more processors such that the steps of the method 300 may be performed in part or in whole by the components of the GIS 142; accordingly, the method 300 is described below by way of example with reference thereto. However, it shall be appreciated that the method 300 may be deployed on various other hardware configurations and is not intended to be limited to the GIS 142.


At operation 305, the data retrieval module 202 obtains geospatial data. In some example embodiments the GIS 142 retrieves geospatial data responsive to a request from the client device 110. For example, a user on the client device 110 may provide the GIS 142 with geospatial data directly, or alternatively may identify a source of the geospatial data on a third-party server 130 or a database server 124. The geospatial data represents a geographic region, and may include corresponding metadata.


At operation 310, the coordinate module 204 assigns a projection and coordinate system to the geospatial data. In instances in which the metadata includes coordinate values of the longitude, latitude, and elevation of the geospatial data, the coordinate module 204 determines the projection and coordinate system using the metadata of the geospatial data. Example operations for carrying out operation 310 in scenarios in which the metadata of the geospatial data is not useable in determining a projection and coordinate system are discussed below in reference to FIG. 4 and FIG. 5.


At operation 315, the transformation module 206 georectifies (e.g., applies a transformation to) the geospatial data based on at least the determined projection and coordinate system. Georectification takes an image (e.g., the geospatial data) that has not been adjusted to be in a known coordinate system and, through applied transformations, puts the image into a known coordinate system. As discussed above, a projection and coordinate system may be determined by identifying sets of matching points between the image (e.g., the geospatial data) and a base map which includes a known projection and coordinate system. The transformation includes rotation, distortion, and scaling of the geospatial data.


At operation 320, the tile caching module 208 generates a tile cache based on at least the transformed geospatial data. As discussed above, a tile cache is a representation of the geospatial data, in which the geospatial data is represented as a set of polygonal tiles, or polyhedral blocks, such that no figures overlap and there are no gaps.


In some example embodiments, the GIS 142 assigns a timestamp to the tile cache. The timestamp indicates a time and date when the geospatial data was first obtained. For example, the GIS 142 may obtain the time and date that the data was obtained by checking the metadata of the geospatial data. In some embodiments, a user may provide a timestamp to the GIS 142 to be assigned to the geospatial data. By assigning timestamps to the geospatial data, the GIS 142 may enable a user to compare imagery taken on different dates in order to see any changes that may have occurred (e.g., pre- and post-disaster imagery). Additionally, the user may retrieve a tile cache from among one or more tile caches the user previously created, based on a user query which includes one or more criteria, such as a particular time or projection system. Thus, in this way, a user may view the most recent tile cache based on a query.
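A minimal sketch of such timestamp-based retrieval, assuming an in-memory list of cache records; the record fields and sample dates are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TileCacheRecord:
    path: str
    projection: str
    captured_at: datetime  # when the source imagery was obtained

caches = [
    TileCacheRecord("tiles/pre_event",  "EPSG:3857", datetime(2017, 8, 1)),
    TileCacheRecord("tiles/post_event", "EPSG:3857", datetime(2017, 9, 15)),
]

def most_recent(records: list[TileCacheRecord],
                projection: str) -> TileCacheRecord:
    """Return the newest cache matching the requested projection,
    e.g. to place pre- and post-disaster imagery side by side."""
    matching = [r for r in records if r.projection == projection]
    return max(matching, key=lambda r: r.captured_at)

print(most_recent(caches, "EPSG:3857").path)  # -> tiles/post_event
```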


As shown in FIG. 4, one or more operations 311, 312, 313, and 314 may be performed as part (e.g., a precursor task, a subroutine, or portion) of operation 310 of the method 300, in which the coordinate module 204 determines and assigns a projection and coordinate system to geospatial data, according to some example embodiments. FIG. 4 depicts a scenario in which the geospatial data lacks useful corresponding metadata (e.g., longitude, latitude, or elevation values).


Operation 311 may be performed by the presentation module 210. The presentation module 210 causes presentation of a graphical user interface including a display of a graphical representation of the geospatial data on the client device 110. The geospatial data may be of a file format that does not include metadata useable to determine a projection and coordinate system of the geospatial data. In some example embodiments the graphical user interface may include a file retrieval or entry field, enabling the user to upload or select geospatial data to be displayed in the graphical user interface at the client device 110. For example, the user may upload geospatial data (e.g., a surveillance image from a drone) representative of a geographic region to the application servers 140 to be accessed by the GIS 142.


At operation 312, the coordinate module 204 presents base map data (e.g., from the database 126) of the geographic region represented by the geospatial data selected by the user in the graphical user interface. The base map includes a corresponding base-projection and base-coordinate system, useable by the coordinate module 204 to determine and assign a projection and coordinate system to the geospatial data. In some example embodiments, the base map may be selected manually by the user via the graphical user interface on the client device 110. A user may search for a base map based on search criteria, the geospatial data, or both. The GIS 142 may cause presentation of a set of base maps in the graphical user interface, from which the user may select a base map.


In some example embodiments, the coordinate module 204 is further configured to automatically identify a base map based on at least the geospatial data submitted by the user. As an example, the geospatial data may include a presentation of landmarks and features of a geographic region, and based on the locations of the landmarks and features relative to one another, the coordinate module 204 may search for and retrieve a set of base maps with similar landmarks and features. The presentation module 210 causes presentation of the set of base maps on the client device, along with a set of graphical elements that allow the user to select an appropriate base map.


At operation 313, the coordinate module 204 receives user inputs identifying landmark pairs (e.g., coordinate pairs) on the geospatial data and the base map. The presentation module 210 may present the base map and the geospatial data side by side in the graphical user interface. The user may then select one or more pairs of matching landmarks visible on the geospatial data and the base map. The geospatial data may include a presentation of a geographic region. The base map therefore includes a presentation of the same geographic region. The user may place markers (e.g., pins, flags, poles, or indicators) at corresponding locations on the geospatial data and the base map. For example, both the base map and the geospatial data may include a presentation of a portion of a city with streets and intersections, and the user may place a marker at an intersection visible in the geospatial data, and then place a marker at the same intersection visible in the base map. In some example embodiments, the coordinate module 204 requires a predetermined number of landmark pairs in order to determine a projection and coordinate system of the geospatial data. For example, the coordinate module 204 may require that the user provide at least three landmark pairs, since an affine transformation has six unknown parameters and each landmark pair supplies two equations.


At operation 314, the coordinate module 204 determines a projection and coordinate system of the geospatial data based on at least the user inputs and the base map.


As shown in FIG. 5, one or more operations 515, 516, 517, and 518 may be performed as an alternative part (e.g., a precursor task, a subroutine, or portion) of operation 310 of the method 300, in which the coordinate module 204 determines and assigns a projection and coordinate system to geospatial data, according to some example embodiments. FIG. 5 depicts a scenario in which the geospatial data lacks useful corresponding metadata (e.g., longitude, latitude, or elevation values).


Operation 515 may be performed by the presentation module 210. A user accessing the GIS 142 via the client device 110 is presented with a graphical user interface including a display of the geospatial data on the client device 110. In some example embodiments, the geospatial data may be transparently overlaid over a base map.


At operation 516, as in operation 312 of FIG. 4, the coordinate module 204 retrieves and presents a base map (e.g., from database 126) of the geographic region represented by the geospatial data selected by the user in the graphical user interface. The base map includes a corresponding base-projection and base-coordinate system, useable by the coordinate module 204 as reference values to determine and assign a projection and coordinate system to the geospatial data.


In some example embodiments, the presentation module 210 displays the geospatial data with transparency in the graphical user interface along with the base map. At operation 517, the coordinate module 204 receives user inputs adjusting a position of the geospatial data in the graphical user interface to align the geospatial data with the base map. For example, the user may adjust the position of the geospatial data in the graphical user interface so that the landmarks of the geospatial data line up with the matching landmarks of the base map. The presentation module 210 may be configured to receive user inputs from the client device 110 that scale the size of the geospatial data, and adjust the position of the geospatial data in the graphical user interface.


At operation 518, the coordinate module 204 determines a projection and coordinate system of the geospatial data based on at least the position of the geospatial data in the graphical user interface relative to the base map, and the base-projection and base-coordinate system of the base map.
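A minimal sketch of how such an alignment might be turned into georeferencing parameters, assuming the base map's viewport has a known top-left map coordinate and ground resolution; the viewport model, parameter names, and sample values are illustrative assumptions, not details from the embodiments above.

```python
def geotransform_from_alignment(base_origin_x: float,
                                base_origin_y: float,
                                base_res_m: float,
                                overlay_left_px: float,
                                overlay_top_px: float,
                                overlay_scale: float):
    """Derive a GDAL-style geotransform for the overlaid image from
    its on-screen position and scale relative to the base map."""
    # Map coordinate of the overlay's top-left corner.
    origin_x = base_origin_x + overlay_left_px * base_res_m
    origin_y = base_origin_y - overlay_top_px * base_res_m
    # Ground size of one overlay pixel after the user's scaling.
    pixel_size = base_res_m * overlay_scale
    # (origin_x, pixel width, 0, origin_y, 0, -pixel height)
    return (origin_x, pixel_size, 0.0, origin_y, 0.0, -pixel_size)

# Example: base map viewport at 2 m/pixel; the user dragged the
# overlay 340 px right, 120 px down, and scaled it to 75%.
print(geotransform_from_alignment(-13630000.0, 4550000.0, 2.0,
                                  340.0, 120.0, 0.75))
```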



FIG. 6 is an interaction diagram depicting example exchanges between the GIS 142, third-party servers 130, and client device 110, consistent with some example embodiments. At operation 602, geospatial data is generated (e.g., via aerial surveillance or a satellite) and stored in the third-party servers 130. For example, a surveillance drone may take a set of high resolution images and transmit the images to the third-party server 130.


At operation 604, the GIS 142 obtains the geospatial data from the third-party server 130. In some example embodiments, the GIS 142 may obtain the geospatial data responsive to receiving a request from the client device 110 via the data retrieval module 202. For example, the request may identify a source for the geospatial data, and an identifier of the geospatial data. The data retrieval module 202 may then retrieve the geospatial data based on at least the identified source and identifier of the geospatial data.


At operation 606, the presentation module 210 of the GIS 142 causes the client device 110 to display the geospatial data. At operation 608, the geospatial data is displayed on the client device 110. In some example embodiments, the presentation module 210 may also display a graphical user interface on the client device 110. The graphical user interface includes a presentation of the geospatial data. At operation 610, the client device 110 receives one or more user inputs on the geospatial data, the one or more user inputs useable by the GIS 142 to determine a projection and coordinate system of the geospatial data.


At operation 612, the GIS 142 determines a projection and coordinate system of the geospatial data based on at least the one or more user inputs received via the client device 110. In some example embodiments, the coordinate module 204 may also check the geospatial data for corresponding metadata which may include coordinate values for longitude, latitude, and elevation. At operation 614, having determined an appropriate projection and coordinate system to apply to the geospatial data, the coordinate module 204 assigns the projection and coordinate system to the geospatial data.


At operation 616, the transformation module 206 of the GIS 142 georectifies (e.g., applies a transformation to) the geospatial data based on at least the assigned projection and coordinate system. After applying appropriate transformations to the geospatial data, the tile caching module 208 of the GIS 142 generates a tile cache. At operation 618, the tile cache is presented on the client device 110.



FIG. 7 is a diagram illustrating a user interface (e.g., GIS interface 700) for presenting geospatial data 702 usable by the GIS 142 to generate and display a tile cache, according to some example embodiments. The GIS interface 700 is shown to include an imagery search field 710, configured to enable a user to search for, select, retrieve, and upload images (e.g., geospatial data 702) into the GIS 142. The geospatial data 702 is usable by the GIS 142 to generate and cause the display of a tile cache, according to some example embodiments. The geospatial data 702 may include a single image, or multiple images (e.g., captured by an aerial surveillance drone or satellite) depicting a geographic region. The geospatial data 702 may further include representations of unique landmarks and features (e.g., 704, 706, and 708).


In some example embodiments, the geospatial data 702 includes a high-resolution image obtained via aerial surveillance (e.g., drone, helicopter, airplane, or satellite), which may be transmitted to a server accessible by the GIS 142 (e.g., the third-party server 130 or the database server 124). In further embodiments, the geospatial data 702 may reside on the client device 110 and be uploaded to the servers of the GIS 142 (e.g., the database servers 124) via the GIS interface 700.


A user accessing the GIS 142 on a client device 110 is presented with the GIS interface 700, including the imagery search field 710. The user may provide the imagery search field 710 with search criteria (e.g., a file name or file source) in order to retrieve one or more images and data to be uploaded into the GIS 142, and to generate and present a tile cache based on the uploaded image (e.g., the geospatial data 702).



FIG. 8 is a diagram illustrating the GIS interface 700 displaying a base map 802 of the geographic region represented by geospatial data (e.g., geospatial data 702). The coordinate module 204 of the GIS 142 uses the base map 802 to determine a projection and coordinate system of the geospatial data (e.g., geospatial data 702). The base map 802 includes metadata (e.g., FGDC, MARC, Dublin Core) useable by the coordinate module 204 to determine a projection and coordinate system of the geospatial data 702, as described in the operations of FIG. 3 above. For example, the corresponding metadata may include information defining coordinates of the base map 802 (e.g., longitude, latitude, and elevation) that may be used by the coordinate module 204 to determine and assign a projection and coordinate system to the base map 802.


In some example embodiments, the GIS 142 may retrieve the base map 802 in response to a user uploading geospatial data (e.g., the geospatial data 702). The GIS 142 may search for a base map (e.g., the base map 802) based on the locations of one or more landmarks (e.g., 704, 706, and 708) within the geospatial data (e.g., geospatial data 702), or a set of coordinates corresponding to the geographic region represented by the geospatial data 702. Upon identifying a base map (e.g., the base map 802) based on at least coordinates or the locations of the one or more landmarks (e.g., 704, 706, and 708), the GIS 142 causes display of the base map 802 on the client device 110.


In further embodiments, a user may simply retrieve and upload the base map 802 into the GIS 142 via the imagery search field 710, by providing it with a file name and file location of the base map 802. The base map 802 may reside within a server remote from the GIS 142 (e.g., the third-party server 130), or within the client device 110. The GIS 142 may retrieve the base map 802 responsive to a command from the client device 110. In some embodiments, the GIS 142 may retain a set of base maps, including the base map 802, within a local server (e.g., the database server 124), and access the base map 802 responsive to a command from the client device 110 identifying the base map 802.



FIG. 9 is a diagram illustrating the GIS interface 700, including a presentation of the geospatial data 702 and the base map 802, configured to receive one or more user inputs 904, 906, and 908 identifying matching landmark pairs between geospatial data (e.g., the geospatial data 702) and a base map (e.g., the base map 802). Thus, a user on a client device 110 may provide the GIS interface 700 with inputs (e.g., inputs 904, 906, and 908) via a cursor 902, identifying matching landmark pairs (e.g., 704, 706, and 708) between the geospatial data 702 and the base map 802.



FIG. 10 is a diagram illustrating the GIS interface 700, generated by the presentation module 210, configured to receive a user input aligning geospatial data (e.g., geospatial data 702) with a base map (e.g., the base map 802), according to some example embodiments. For example, the base map 802 may include a corresponding projection and coordinate system. A user manipulating a cursor 1002 may align the geospatial data 702 with the base map 802, such that the landmarks of the geospatial data 702 occupy the same location in the GIS interface 700 as the corresponding landmarks of the base map 802. In response, the coordinate module 204 may determine and apply the projection and coordinate system of the base map 802 to the geospatial data 702, thus enabling the transformation module 206 to apply a transformation to the geospatial data 702.



FIG. 11 is a block diagram illustrating components of a machine 1100, according to some example embodiments, able to read instructions 1124 from a machine-readable medium 1122 (e.g., a non-transitory machine-readable medium, a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part. Specifically, FIG. 11 shows the machine 1100 in the example form of a computer system (e.g., a computer) within which the instructions 1124 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1100 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part.


In alternative embodiments, the machine 1100 operates as a standalone device or may be communicatively coupled (e.g., networked) to other machines. In a networked deployment, the machine 1100 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment. The machine 1100 may be a server computer, a client computer, a PC, a tablet computer, a laptop computer, a netbook, a cellular telephone, a smartphone, a set-top box (STB), a personal digital assistant (PDA), a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1124, sequentially or otherwise, that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute the instructions 1124 to perform all or part of any one or more of the methodologies discussed herein.


The machine 1100 includes a processor 1102 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1104, and a static memory 1106, which are configured to communicate with each other via a bus 1108. The processor 1102 may contain solid-state digital microcircuits (e.g., electronic, optical, or both) that are configurable, temporarily or permanently, by some or all of the instructions 1124 such that the processor 1102 is configurable to perform any one or more of the methodologies described herein, in whole or in part. For example, a set of one or more microcircuits of the processor 1102 may be configurable to execute one or more modules (e.g., software modules) described herein. In some example embodiments, the processor 1102 is a multicore CPU (e.g., a dual-core CPU, a quad-core CPU, or a 128-core CPU) within which each of multiple cores is a separate processor that is able to perform any one or more of the methodologies discussed herein, in whole or in part. Although the beneficial effects described herein may be provided by the machine 1100 with at least the processor 1102, these same effects may be provided by a different kind of machine that contains no processors (e.g., a purely mechanical system, a purely hydraulic system, or a hybrid mechanical-hydraulic system), if such a processor-less machine is configured to perform one or more of the methodologies described herein.


The machine 1100 may further include a graphics display 1110 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, a cathode ray tube (CRT), or any other display capable of displaying graphics or video). The machine 1100 may also include an input/output device 1112 (e.g., a keyboard or keypad, a mouse, or a trackpad), a storage unit 1116, an audio generation device 1118 (e.g., a sound card, an amplifier, a speaker, a headphone jack, or any suitable combination thereof), and a network interface device 1120.


The storage unit 1116 includes the machine-readable medium 1122 (e.g., a tangible and non-transitory machine-readable storage medium) on which are stored the instructions 1124 embodying any one or more of the methodologies or functions described herein. The instructions 1124 may also reside, completely or at least partially, within the main memory 1104, within the processor 1102 (e.g., within the processor's cache memory), within the static memory 1106, or all three, before or during execution thereof by the machine 1100. Accordingly, the main memory 1104 and the processor 1102 may be considered machine-readable media (e.g., tangible and non-transitory machine-readable media). The instructions 1124 may be transmitted or received over a network 1126 via the network interface device 1120. For example, the network interface device 1120 may communicate the instructions 1124 using any one or more transfer protocols (e.g., hypertext transfer protocol (HTTP)).


As used herein, the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1122 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions 1124. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing the instructions 1124 for execution by the machine 1100, such that the instructions 1124, when executed by one or more processors of the machine 1100 (e.g., processor 1102), cause the machine 1100 to perform any one or more of the methodologies described herein, in whole or in part. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as cloud-based storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more tangible and non-transitory data repositories (e.g., data volumes) in the example form of a solid-state memory chip, an optical disc, a magnetic disc, or any suitable combination thereof. A “non-transitory” machine-readable medium, as used herein, specifically does not include propagating signals per se. In some example embodiments, the instructions 1124 for execution by the machine 1100 may be communicated by a carrier medium. Examples of such a carrier medium include a storage medium (e.g., a non-transitory machine-readable storage medium, such as a solid-state memory, being physically moved from one place to another place) and a transient medium (e.g., a propagating signal that communicates the instructions 1124).


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute software modules (e.g., code stored or otherwise embodied on a machine-readable medium or in a transmission medium), hardware modules, or any suitable combination thereof. A “hardware module” is a tangible (e.g., non-transitory) unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a CPU or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, and such a tangible entity may be physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a CPU configured by software to become a special-purpose processor, the CPU may be configured as respectively different special-purpose processors (e.g., each included in a different hardware module) at different times. Software (e.g., a software module) may accordingly configure one or more processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. Accordingly, the operations described herein may be at least partially processor-implemented, since a processor is an example of hardware. For example, at least some operations of any method may be performed by one or more processor-implemented modules. As used herein, “processor-implemented module” refers to a hardware module in which the hardware includes one or more processors. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).


Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


The performance of certain operations may be distributed among the one or more processors, whether residing only within a single machine or deployed across a number of machines. In some example embodiments, the one or more processors or hardware modules (e.g., processor-implemented modules) may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or hardware modules may be distributed across a number of geographic locations.


Some portions of the subject matter discussed herein may be presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). Such algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms “a” or “an” are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.

Claims
  • 1. A system, comprising: processors; anda memory storing instructions that, when executed by at least one processor among the processors, causes the system to perform operations comprising:receiving geospatial data at the system, the geospatial data including image data that depicts a geographic area;identifying a base map from among a plurality of base maps based on the image data that depicts the geographic area in response to the receiving the geospatial data, the base map having a base projection and base coordinate system;applying a transformation to the geospatial data based on the base projection and base coordinate system of the base map; andgenerating a tile cache based on the transformed geospatial data, the tile cache including the base projection and base coordinate system.
  • 2. The system of claim 1, wherein the identifying the base map from among the plurality of base maps based on the image data includes: identifying a depiction of a landmark within the image data of the geospatial data; andselecting the base map from among the plurality of base maps based on the depiction of the landmark.
  • 3. The system of claim 1, wherein the identifying the base map from among the plurality of base maps is based on the geographic area depicted by the image data.
  • 4. The system of claim 1, wherein the geospatial data has a first orientation, the base map has a base orientation, and the applying the transformation to the geospatial data based on the base projection and base coordinate system of the base map includes: rotating the geospatial data to align the first orientation of the geospatial data with the base orientation of the base map.
  • 5. The system of claim 1, wherein the identifying the base map from among the plurality of base maps based on the image data includes: causing display of an interface that comprises a presentation of the plurality of base maps within the graphical user interface, the presentation of the plurality of base maps including the base map; andreceiving a selection of the base map from among the presentation of the plurality of base maps.
  • 6. The system of claim 1, wherein the instructions cause the system to perform operations further comprising: receiving a request to display the geographic area from a client device; retrieving the tile cache in response to the request to display the geographic area; and rendering a presentation of the geographic area at the client device based on the tile cache.
  • 7. The system of claim 6, wherein the rendering the presentation of the geographic area at the client device based on the tile cache includes: transmitting a portion of the tile cache to the client device; and rendering the geographic area based on the portion of the tile cache.
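Claims 6 and 7 cover the serving side: a client requests a geographic area, the system retrieves the corresponding tile cache, and only a portion of the cache is transmitted for rendering. A minimal sketch under assumed names and an assumed (zoom, x, y) tile key, neither of which comes from the claims:

```python
# Hypothetical request flow for claims 6-7: look up the tile cache for the
# requested area and return only the tiles the client's viewport needs.
TILE_CACHES: dict[str, dict[tuple[int, int, int], bytes]] = {
    "San Francisco": {(0, 0, 0): b"tile-bytes"},  # keyed by (zoom, x, y)
}


def handle_display_request(area: str, viewport: list[tuple[int, int, int]]) -> dict:
    cache = TILE_CACHES[area]                      # retrieve the tile cache
    portion = {key: cache[key] for key in viewport if key in cache}
    return portion                                 # transmitted to the client


tiles = handle_display_request("San Francisco", viewport=[(0, 0, 0)])
```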
  • 8. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising: receiving geospatial data at the machine, the geospatial data including image data that depicts a geographic area; identifying, in response to receiving the geospatial data, a base map from among a plurality of base maps based on the image data that depicts the geographic area, the base map having a base projection and base coordinate system; applying a transformation to the geospatial data based on the base projection and base coordinate system of the base map; and generating a tile cache based on the transformed geospatial data, the tile cache including the base projection and base coordinate system.
  • 9. The non-transitory machine-readable storage medium of claim 8, wherein the identifying the base map from among the plurality of base maps based on the image data includes: identifying a depiction of a landmark within the image data of the geospatial data; and selecting the base map from among the plurality of base maps based on the depiction of the landmark.
  • 10. The non-transitory machine-readable storage medium of claim 8, wherein the identifying the base map from among the plurality of base maps is based on the geographic area depicted by the image data.
  • 11. The non-transitory machine-readable storage medium of claim 8, wherein the geospatial data has a first orientation, the base map has a base orientation, and the applying the transformation to the geospatial data based on the base projection and base coordinate system of the base map includes: rotating the geospatial data to align the first orientation of the geospatial data with the base orientation of the base map.
  • 12. The non-transitory machine-readable storage medium of claim 8, wherein the identifying the base map from among the plurality of base maps based on the image data includes: causing display of a graphical user interface that comprises a presentation of the plurality of base maps, the presentation of the plurality of base maps including the base map; and receiving a selection of the base map from among the presentation of the plurality of base maps.
  • 13. The non-transitory machine-readable storage medium of claim 8, wherein the instructions cause the machine to perform operations further comprising: receiving a request to display the geographic area from a client device; retrieving the tile cache in response to the request to display the geographic area; and rendering a presentation of the geographic area at the client device based on the tile cache.
  • 14. The non-transitory machine-readable storage medium of claim 13, wherein the rendering the presentation of the geographic area at the client device based on the tile cache includes: transmitting a portion of the tile cache to the client device; and rendering the geographic area based on the portion of the tile cache.
  • 15. A method comprising: receiving geospatial data at a system, the geospatial data including image data that depicts a geographic area; identifying, in response to receiving the geospatial data, a base map from among a plurality of base maps based on the image data that depicts the geographic area, the base map having a base projection and base coordinate system; applying a transformation to the geospatial data based on the base projection and base coordinate system of the base map; and generating a tile cache based on the transformed geospatial data, the tile cache including the base projection and base coordinate system.
  • 16. The method of claim 15, wherein the identifying the base map from among the plurality of base maps based on the image data includes: identifying a depiction of a landmark within the image data of the geospatial data; and selecting the base map from among the plurality of base maps based on the depiction of the landmark.
  • 17. The method of claim 15, wherein the identifying the base map from among the plurality of base maps is based on the geographic area depicted by the image data.
  • 18. The method of claim 15, wherein the geospatial data has a first orientation, the base map has a base orientation, and the applying the transformation to the geospatial data based on the base projection and base coordinate system of the base map includes: rotating the geospatial data to align the first orientation of the geospatial data with the base orientation of the base map.
  • 19. The method of claim 15, wherein the identifying the base map from among the plurality of base maps based on the image data includes: causing display of a graphical user interface that comprises a presentation of the plurality of base maps, the presentation of the plurality of base maps including the base map; and receiving a selection of the base map from among the presentation of the plurality of base maps.
  • 20. The method of claim 15, wherein the method further comprises: receiving a request to display the geographic area from a client device; retrieving the tile cache in response to the request to display the geographic area; and rendering a presentation of the geographic area at the client device based on the tile cache.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. patent application Ser. No. 15/209,477, filed Jul. 13, 2016, which is a continuation of U.S. patent application Ser. No. 14/730,123, filed Jun. 3, 2015, both of which are incorporated by reference herein in their entireties.

Continuations (2)
Relation  Number    Date      Country
Parent    15209477  Jul 2016  US
Child     15847576            US
Parent    14730123  Jun 2015  US
Child     15209477            US