Zoom-adaptive data granularity to achieve a flexible high-performance interface for a geospatial mapping system

Information

  • Patent Grant
  • Patent Number
    11,663,694
  • Date Filed
    Tuesday, May 18, 2021
  • Date Issued
    Tuesday, May 30, 2023
Abstract
A geospatial mapping system can access a geospatial layer data set for a first geographic area defined by a first presentation level, and provide the geospatial layer data set for the first geographic area to a client device to present a visual rendering of the first geographic area. The geospatial mapping system can receive a request to utilize an advanced feature set on a subset of geospatial artifacts located within the first geographic area, and promote, to the geospatial layer data set, an additional geospatial artifact data set for the subset of geospatial artifacts, yielding an updated geospatial layer data set for the first geographic area. The geospatial mapping system can provide the updated geospatial layer data set to the client device to provide the advanced feature set for interacting with the subset of geospatial artifacts located within the first geographic area.
Description
TECHNICAL FIELD

The present disclosure generally relates to the technical field of special-purpose machines that manage geospatial artifact data, including computerized variants of such special-purpose machines and improvements to such variants, and to the technologies by which such special-purpose machines become improved compared to other special-purpose machines that manage geospatial artifact data. In particular, the present disclosure addresses systems and methods for promoting geospatial artifact data.


BACKGROUND

Geographic mapping applications allow users to view a visual rendering of a map at various presentation levels. For example, a user can zoom in and out of the map to view the map at varying levels of granularity. At higher presentation levels (e.g., zoomed out), the number of roads, cities, stores, etc., that fall within the view of the map may be overwhelming if each is presented to a user. Accordingly, current systems often do not present every geographic landmark at higher presentation levels, instead providing greater granularity as the user zooms into a particular portion of the map. In some situations, however, a user may want to view geographic landmarks with greater granularity while at a higher presentation level.





BRIEF DESCRIPTION OF THE DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and are not intended to limit its scope to the illustrated embodiments. On the contrary, these examples are intended to cover alternatives, modifications, and equivalents as may be included within the scope of the disclosure.



FIG. 1 shows an exemplary system for promoting geospatial artifact data into a geospatial layer data set, according to some example embodiments.



FIG. 2 shows an example block diagram of a geospatial mapping manager, according to some example embodiments.



FIG. 3 shows an example method for causing a visual rendering of a geographical area to be presented on a client device, according to some example embodiments.



FIG. 4 shows an example method for promoting geospatial artifact data into a geospatial layer data set, according to some example embodiments.



FIG. 5 shows a diagrammatic representation of a computing device in the example form of a computer system within which a set of instructions for causing the computing device to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter of the present disclosure. In the following description, specific details are set forth in order to provide a thorough understanding of the subject matter. It shall be appreciated that embodiments may be practiced without some or all of these specific details.


Disclosed are systems, methods, and computer-readable storage media for promoting geospatial artifact data into a geospatial layer data set. A geospatial mapping system can present a visual rendering of a geographical area on a user's client device. For example, the geospatial mapping system can present a visual rendering of a world map, country map, city map, etc. The visual rendering of the geographic area can include visual renderings of geospatial artifacts that fall within the geographic area. A geospatial artifact can be a data object created to represent any geographic landmark or item. For example, a geospatial artifact can represent any event, object, path, landmark, person, vehicle, etc., that is geographically located within a geographic area. For example, a geospatial artifact can be a city, road, store, church, event, bus, school, etc., that is geographically located within a geographic area.
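
To make the notion concrete, a geospatial artifact might be modeled as a small data object pairing an identifier and type with coordinate data and descriptive attributes. The TypeScript shape below is a hypothetical sketch for illustration only; the field names are not taken from the patent.

```typescript
// Hypothetical shape for a geospatial artifact data object.
// Field names are illustrative, not taken from the patent.
interface GeoCoordinate {
  latitude: number;   // degrees, WGS84 assumed
  longitude: number;  // degrees, WGS84 assumed
}

interface GeospatialArtifact {
  id: string;                           // unique artifact identifier
  kind: string;                         // e.g., "city", "road", "store", "event"
  location: GeoCoordinate[];            // one point, or many for paths/boundaries
  attributes?: Record<string, unknown>; // population, business name, etc.
}

// A minimal example: a store represented as a single-point artifact.
const store: GeospatialArtifact = {
  id: "store-042",
  kind: "store",
  location: [{ latitude: 45.52, longitude: -122.68 }],
  attributes: { name: "Example Market", businessType: "grocery" },
};
```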


The geospatial mapping system can enable a user to adjust a presentation level at which the geographic area is presented. A presentation level can define a portion of the geographic area that is presented to the user. For example, a user can select to zoom in and out to adjust the presentation level, thereby defining the geographic area presented to the user.


The geospatial mapping system can maintain geospatial data defining the geographic area and the geospatial artifacts located within the geographic area. For example, the geospatial data can include map data that generally defines the geographic area, such as land, borders, bodies of water, etc. The geospatial data can also include geospatial artifact data for each geospatial artifact located within the geographic area. The geospatial artifact data for a geospatial artifact can include geographic coordinate data describing the geographic location of the geospatial artifact, as well as additional data describing the geospatial artifact. For example, the geospatial artifact data can include data describing the population of a city, name of a business, type of business, type of event, road name, etc.


To present a visual rendering of a geographical area, the geospatial mapping system can generate a geospatial layer data set that includes geospatial data for the geographic area. This can include map data describing the geographic area as well as a partial or complete set of the geospatial artifact data describing geospatial artifacts that are located within the geographic area. Map data may include the specific points, lines, and polygons required to draw a map on the client machine, and geospatial artifact data may include any subset of the geospatial artifact data which exists for a given set of geospatial artifacts. The geospatial mapping system can provide a generated geospatial layer data set to a user's client device, where it can be used to render the visual rendering of the geographical area.
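
In the same illustrative vein, a geospatial layer data set can be pictured as map geometry bundled with whichever artifact records are included for the current view. The sketch below assumes hypothetical storage accessors; it shows one plausible assembly, not the patented implementation.

```typescript
// Hypothetical assembly of a geospatial layer data set: map geometry
// plus a partial or complete set of artifact data for the area.
type Geometry = { points: [number, number][] }; // lines/polygons to draw

interface GeospatialLayerDataSet {
  area: string;          // identifier for the geographic area
  mapData: Geometry[];   // land, borders, bodies of water, ...
  artifacts: Array<{ id: string; coords: [number, number]; [k: string]: unknown }>;
}

// loadMapGeometry and loadArtifacts are assumed storage accessors,
// introduced only for this sketch.
function buildLayerDataSet(
  area: string,
  loadMapGeometry: (area: string) => Geometry[],
  loadArtifacts: (area: string) => Array<{ id: string; coords: [number, number] }>,
): GeospatialLayerDataSet {
  return { area, mapData: loadMapGeometry(area), artifacts: loadArtifacts(area) };
}
```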


The geospatial layer data set can also allow the client device to provide the user with a set of features to interact with the geospatial artifacts located within the geographic area. For example, features provided by the client device can include causing presentation of visual labels representing geospatial artifacts, adjusting presentation of the visual labels, accessing additional information describing a geospatial artifact, analytical functions, etc.


At higher presentation levels (e.g., zoomed out) the geographic area presented to the user can be relatively large and include a large number of geospatial artifacts. To reduce latency associated with generating and transmitting a geospatial layer data set, the geospatial mapping system can limit the amount of geospatial artifact data that is included in a geospatial layer data set based on the selected presentation level. For example, at higher presentation levels that include a large number of geospatial artifacts, the geospatial mapping system can include a limited portion of the geospatial artifact data in the geospatial layer data set, such as only the geographic coordinate data associated with the geospatial artifacts. In addition, at a higher presentation level, a smaller number of individual points may be required to describe the same line, because multiple points may fall within a single rendered pixel. Alternatively, at lower presentation levels (e.g., zoomed in) that include a smaller number of geospatial artifacts, the geospatial mapping system can include additional geospatial data in the geospatial layer data set. This additional geospatial data may either reference new geospatial artifacts or provide a more granular set of geospatial data for existing geospatial artifacts. For example, at a high presentation level, the geospatial data for the geospatial artifact referencing the United States may not have the full set of points which reference the coastline of Maine. When zooming in on Maine, however, the geospatial mapping system may choose to include more geospatial data about that specific section of the geospatial artifact.
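
One hedged way to picture this thinning is a point filter that drops consecutive line points landing within the same rendered pixel, alongside stripping artifacts down to coordinates at high presentation levels. The threshold and helper below are assumptions chosen for illustration.

```typescript
// Hypothetical zoom-adaptive thinning. At high presentation levels
// (zoomed out) consecutive line points that would fall within the same
// rendered pixel are dropped before the data set is transmitted.
type Point = [number, number]; // [longitude, latitude]

function thinLine(points: Point[], degreesPerPixel: number): Point[] {
  const kept: Point[] = [];
  for (const p of points) {
    const last = kept[kept.length - 1];
    // Keep a point only if it moves at least one pixel from the last kept point.
    if (!last ||
        Math.abs(p[0] - last[0]) >= degreesPerPixel ||
        Math.abs(p[1] - last[1]) >= degreesPerPixel) {
      kept.push(p);
    }
  }
  return kept;
}

// Example: a coastline sampled every ~0.001 degrees collapses to far
// fewer points when one rendered pixel spans 0.01 degrees.
```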


While the limited portion of the geospatial artifact data may provide a user with a limited feature set to interact with geospatial artifacts, it may be insufficient to provide the user with advanced features. In some instances, a user may desire to use advanced features while at a higher presentation level. To provide the user with these additional features, the geospatial mapping system can enable the user to promote geospatial artifact data into the geospatial layer data set. This can include geospatial artifact data for all of the geospatial artifacts included in the geographic area or for a subset of the geospatial artifacts. Once promoted, the additional geospatial artifact data can be used to provide the user with the additional features. Allowing a user to promote artifact data by request can enable the user to access the additional features when desired, while also limiting the amount of geospatial data included in a geospatial layer data set to reduce latency.
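
Reduced to code, the promotion step amounts to merging richer artifact records into an already-delivered layer data set on demand. A minimal sketch, assuming a hypothetical fetchFullArtifacts accessor:

```typescript
// Hypothetical on-demand promotion: merge full artifact records into a
// layer data set that previously held only limited data.
interface LayerDataSet {
  artifacts: Map<string, Record<string, unknown>>;
}

function promoteArtifacts(
  layer: LayerDataSet,
  requestedIds: string[],
  fetchFullArtifacts: (ids: string[]) => Map<string, Record<string, unknown>>,
): LayerDataSet {
  const full = fetchFullArtifacts(requestedIds); // assumed accessor
  for (const [id, data] of full) {
    // Overlay the richer record on top of the limited one.
    layer.artifacts.set(id, { ...(layer.artifacts.get(id) ?? {}), ...data });
  }
  return layer;
}
```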



FIG. 1 shows an exemplary system 100 for promoting geospatial artifact data into a geospatial layer data set. While system 100 employs a client-server architecture, the present inventive subject matter is, of course, not limited to such an architecture, and could equally well find application in an event-driven, distributed, or peer-to-peer architecture system, for example. Moreover, it shall be appreciated that although the various functional components of system 100 are discussed in a singular sense, multiple instances of one or more of the various functional components may be employed.


As shown, system 100 can include multiple computing devices connected to communication network 102 and configured to communicate with each other through use of communication network 102. Communication network 102 can be any type of network, including a local area network (“LAN”), such as an intranet; a wide area network (“WAN”), such as the Internet; or any combination thereof. Further, communication network 102 can be a public network, a private network, or a combination thereof. Communication network 102 can also be implemented using any number of communication links associated with one or more service providers, including one or more wired communication links, one or more wireless communication links, or any combination thereof. Additionally, communication network 102 can be configured to support the transmission of data formatted using any number of protocols.


Multiple computing devices can be connected to communication network 102. A computing device can be any type of general computing device capable of network communication with other computing devices. For example, a computing device can be a personal computing device such as a desktop or workstation; a business server; or a portable computing device, such as a laptop, smart phone, or tablet personal computer (PC). A computing device can include some or all of the features, components, and peripherals of computing device 500 of FIG. 5.


To facilitate communication with other computing devices, a computing device can include a communication interface configured to receive a communication, such as a request, data, etc., from another computing device in network communication with the computing device and pass the communication along to an appropriate module running on the computing device. The communication interface can also be configured to send a communication to another computing device in network communication with the computing device.


As shown, system 100 includes client device 104, geospatial mapping system 106 and secondary data source(s) 108, such as high-scale data source(s). In system 100, a user can interact with geospatial mapping system 106 through client device 104 connected to communication network 102 by direct and/or indirect communication. Client device 104 can be any of a variety of types of computing devices that include at least a display, a computer processor, and communication capabilities that provide access to communication network 102 (e.g., a smart phone, a tablet computer, a personal digital assistant (PDA), a personal navigation device (PND), a handheld computer, a desktop computer, a laptop or netbook, or a wearable computing device).


Geospatial mapping system 106 can consist of one or more computing devices and support connections from a variety of different types of client devices 104, such as desktop computers, mobile computers, mobile communications devices (e.g., mobile phones, smart phones, tablets, etc.), smart televisions, set-top boxes, and/or any other network-enabled computing devices. Client device 104 can be of varying types, capabilities, operating systems, etc. Furthermore, geospatial mapping system 106 can concurrently accept connections from and interact with multiple client devices 104.


A user can interact with geospatial mapping system 106 via client-side application 110 installed on client device 104. In some embodiments, client-side application 110 can include a geospatial mapping system-specific component. For example, the component can be a standalone application, one or more application plug-ins, and/or a browser extension. However, the user can also interact with geospatial mapping system 106 via third-party application 112, such as a web browser, that resides on client device 104 and is configured to communicate with geospatial mapping system 106. In either case, client-side application 110 and/or third-party application 112 can present a user interface (UI) for the user to interact with geospatial mapping system 106.


Geospatial mapping system 106 can include geospatial data storage 114 configured to store geospatial data. Geospatial data can include any type of data, such as digital data, documents, text files, audio files, video files, etc. Geospatial data can be data describing a geographic area. For example, geospatial data can include map data defining the geographic area as well as geospatial artifact data for each geospatial artifact located within the geographic area. The geospatial artifact data for a geospatial artifact can include geographic coordinate data describing the geographic location of the geospatial artifact, as well as additional data describing the geospatial artifact. For example, the geospatial artifact data can include data describing the population of a city, name of a business, type of business, type of event, road name, etc.


Geospatial data storage 114 can be a storage device, multiple storage devices, or one or more servers. Alternatively, geospatial data storage 114 can be a cloud storage provider or network storage. Geospatial mapping system 106 can store data in a storage area network (SAN) device, in a redundant array of inexpensive disks (RAID), etc. Geospatial data storage 114 can store data using one or more file systems, such as FAT, FAT32, NTFS, EXT2, EXT3, EXT4, ReiserFS, BTRFS, and so forth.


System 100 can also include secondary data source(s) 108 that store geospatial data. For example, secondary data source(s) 108 can include secondary data storage(s) 116 configured to store geospatial data.


Geospatial mapping system 106 can include geospatial mapping manager 118 configured to provide a user with a visual rendering of a geographic area. Geospatial mapping manager 118 can receive a request from client device 104 to view a visual rendering of a geographic area, and in response, provide client device 104 with a geospatial layer data set for the geographic area. The geospatial layer data set can include geospatial data for the geographic area that client device 104 can use to render the visual rendering of the geographic area.


The geographic area can be defined by a presentation level selected by a user. For example, the presentation level can indicate a zoom level at which the user would like to view a specified geographic area. The user can adjust the presentation level to view a selected portion of a map at a lower presentation level (e.g., zoomed in) or higher presentation level (e.g., zoomed out), thereby adjusting the geographic area. In some embodiments, the request received from client device 104 can include data identifying the presentation level selected by the user. Geospatial mapping manager 118 can use the received data to determine the geographic area to be presented to the user.
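
For readers who want the geometry concrete: one conventional way (an assumption here, not a requirement of the patent) to turn a presentation level and viewport into a geographic area uses Web Mercator arithmetic. Note that the patent's "higher presentation level" means zoomed out, which corresponds to a smaller conventional zoom number.

```typescript
// Hypothetical mapping from a zoom level and viewport to a geographic
// bounding box, using standard Web Mercator assumptions. The patent does
// not prescribe this math; it is shown only to make "the presentation
// level defines the geographic area" concrete.
function boundingBox(
  centerLon: number, centerLat: number,
  zoom: number,                 // conventional zoom: larger = more zoomed in
  widthPx: number, heightPx: number,
): { west: number; south: number; east: number; north: number } {
  const worldPx = 256 * 2 ** zoom;     // world width in pixels at this zoom
  const degPerPxLon = 360 / worldPx;
  // Crude latitude approximation, adequate for a sketch away from the poles.
  const degPerPxLat = degPerPxLon * Math.cos((centerLat * Math.PI) / 180);
  return {
    west: centerLon - (widthPx / 2) * degPerPxLon,
    east: centerLon + (widthPx / 2) * degPerPxLon,
    south: centerLat - (heightPx / 2) * degPerPxLat,
    north: centerLat + (heightPx / 2) * degPerPxLat,
  };
}
```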


The geospatial layer data set for a geographic area can include map data describing the geographic area as well as a partial or complete set of the geospatial artifact data describing geospatial artifacts that are located within the geographic area. In some embodiments, geospatial mapping manager 118 can generate a geospatial layer data set in response to receiving the request from client device 104. For example, in response to receiving the request, geospatial mapping manager 118 can search geospatial data storage 114 and/or secondary data source(s) 108 for geospatial data for the requested geographic area. Geospatial mapping manager 118 can use data gathered as part of the search to generate the geospatial layer data set.


Alternatively, geospatial mapping manager 118 can access a previously generated geospatial layer data set in response to receiving the request from client device 104. For example, geospatial mapping manager 118 can generate geospatial layer data sets from data gathered from secondary data source(s) 108 and/or geospatial data storage 114. The previously generated geospatial layer data sets can be stored in geospatial data storage 114, and geospatial mapping manager 118 can communicate with geospatial data storage 114 to retrieve a previously generated geospatial layer data set in response to receiving a request.
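
Operationally, this alternative is a cache lookup keyed by geographic area and presentation level, with generation as the fallback. The key format and generator below are assumptions:

```typescript
// Hypothetical cache of pre-generated layer data sets, keyed by
// geographic area and presentation level.
const layerCache = new Map<string, object>();

function getLayerDataSet(
  area: string,
  presentationLevel: number,
  generate: (area: string, level: number) => object, // assumed generator
): object {
  const key = `${area}@${presentationLevel}`;
  let layer = layerCache.get(key);
  if (!layer) {
    layer = generate(area, presentationLevel); // fall back to generation
    layerCache.set(key, layer);                // store for later requests
  }
  return layer;
}
```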


In addition to enabling client device 104 to present a visual rendering of the geographic area, the geospatial data included in the geospatial layer data set can also enable a user of client device 104 to utilize one or more features to interact with the geospatial artifacts. For example, a feature can include causing a label representing a geospatial artifact to be presented, adjusting presentation of the visual labels, accessing additional information describing a geospatial artifact, analytical functions such as creating a histogram or timeline, etc.


Geospatial mapping manager 118 can limit the amount of geospatial artifact data included in a geospatial layer data set. For example, if a large number of geospatial artifacts are located within a geographic area (e.g., when the user selects to view the map at a higher presentation level), geospatial mapping manager 118 can include a limited set of geospatial artifact data for the geospatial artifacts. For example, the limited set of geospatial data can include limited data describing the geospatial artifacts, such as geographic coordinate data for the geospatial artifacts. As another example, the limited data set can include geospatial data for only a subset of the geospatial artifacts included in a geographic area.


Limiting the amount of geospatial artifact data included in the geospatial layer data set can reduce latency associated with generating and/or transmitting the geospatial layer data set. Alternatively, geospatial mapping manager 118 can provide a complete set of geospatial artifact data when a relatively lower number of geospatial artifacts are located within a geographic area (e.g., when the user selects to view the map at a lower presentation level).


Geospatial mapping manager 118 can promote geospatial artifact data into a geospatial layer data set to provide a user with additional and/or advanced features to interact with a geospatial artifact located in a geographic area. In some instances, the limited set of geospatial artifact data included in a geospatial layer data set may be insufficient to provide one or more features for interacting with a geospatial artifact. For example, additional data beyond the geographic coordinate data of a geospatial artifact may be needed to provide a feature such as causing presentation of additional data associated with the geospatial artifact or analytical functions. A user may desire to access these additional features while at a higher presentation level (e.g., without having to zoom in).


In this type of situation, geospatial mapping manager 118 can promote additional geospatial artifact data into a geospatial layer data set. For example, geospatial mapping manager 118 can receive a request from client device 104 indicating that a user of client device 104 would like to utilize a feature that cannot be provided based on the geospatial artifact data included in the geospatial layer data set. In response, geospatial mapping manager 118 can promote additional geospatial artifact data into the geospatial layer data set, which can then be provided to client device 104 to provide the requested feature.



FIG. 2 shows an example block diagram of geospatial mapping manager 118. To avoid obscuring the inventive subject matter with unnecessary detail, various functional components (e.g., modules) that are not germane to conveying an understanding of the inventive subject matter have been omitted from FIG. 2. However, a skilled artisan will readily recognize that various additional functional components may be supported by geospatial mapping manager 118 to facilitate additional functionality that is not specifically described herein. Furthermore, the various functional modules depicted in FIG. 2 may reside on a single computing device or may be distributed across several computing devices in various arrangements such as those used in cloud-based architectures.


As shown, geospatial mapping manager 118 includes interface module 202, geospatial layer data set generation module 204, and promotion module 206. Interface module 202 can provide client device 104 with data that enables client device 104 to present a map interface on a display of client device 104. A map interface can be an interactive user interface that allows a user to view and interact with geospatial data. For example, a map interface can present a user with a visual rendering of a geographic area. Further, a map interface can include one or more user interface elements (e.g., buttons, text boxes, scroll bars, etc.) that enable a user to modify presentation of the geographic area, adjust settings, utilize features, etc. For example, the map interface can enable a user to define a geographic area by adjusting a presentation level. As another example, the map interface can enable a user to select to utilize features on a geospatial artifact and/or a group of geospatial artifacts, such as causing a label representing the geospatial artifact(s) to be presented within the map interface.


Geospatial layer data set generation module 204 can be configured to generate a geospatial layer data set. Geospatial layer data set generation module 204 can gather geospatial data from geospatial data storage 114 and/or secondary data sources 108, which can be used to generate the geospatial layer data set. A geospatial layer data set can include geospatial data for a specified geographic area. For example, the geospatial layer data set can include geospatial data, such as map data describing the geographic area. The geospatial layer data set can also include geospatial artifact data for geospatial artifacts located within the geographic area.


A geospatial layer data set can be used to present a user with a visual rendering of a geographic area. For example, a geospatial layer data set can be provided to client device 104, where client device 104 can utilize the geospatial layer data set to present a visual rendering of the geographic area. Alternatively, geospatial mapping manager 118 can utilize the geospatial layer data set to provide data to client device 104, which can be used by client device 104 to present the visual rendering of the geographic area.


In some embodiments, geospatial layer data set generation module 204 can generate a geospatial layer data set in response to receiving a request from client device 104. The request can include data defining a geographic area that the user would like to view. For example, a user can utilize the map interface to define the geographic area by adjusting the presentation level, causing client device 104 to transmit the request to geospatial mapping manager 118. In response to geospatial mapping manager 118 receiving the request, geospatial layer data set generation module 204 can gather geospatial data from geospatial data storage 114 and/or secondary data source(s) 108 for the requested geographic area. Geospatial layer data set generation module 204 can use the gathered geospatial data to generate the geospatial layer data set.


In some embodiments, geospatial layer data set generation module 204 can pre-generate geospatial layer data sets. For example, geospatial layer data set generation module 204 can generate geospatial layer data sets for geographic areas based on available presentation levels.


The pre-generated geospatial layer data sets can be stored in geospatial data storage 114 and used to present a visual rendering of a geographic area on client device 104. For example, in response to receiving a request from client device 104 that defines a geographic area, geospatial mapping manager 118 can access geospatial data storage 114 to retrieve an appropriate geospatial layer data set to satisfy the received request.


In some embodiments, geospatial layer data set generation module 204 can limit the amount of geospatial artifact data included in a geospatial layer data set to reduce latency associated with generating and/or transmitting the geospatial layer data set. For example, when a geographic area includes a large number of geospatial artifacts (e.g., when the user is at a higher presentation level), geospatial layer data set generation module 204 can include a limited set of geospatial artifact data in the geospatial layer data set. Alternatively, when a geographic area includes a lower number of geospatial artifacts (e.g., when the user is at a lower presentation level), geospatial layer data set generation module 204 can include a complete set of geospatial artifact data in the geospatial layer data set.


In some embodiments, geospatial mapping manager 118 can promote geospatial artifact data into a geospatial layer data set. Promoting geospatial artifact data into a geospatial layer data set can include accessing the geospatial artifact data from geospatial data storage 114 and/or secondary data source(s) 108, and modifying the geospatial layer data set to include the geospatial artifact data. In some embodiments, promoting geospatial artifact data describing a geospatial artifact can further include copying the geospatial data from secondary data source(s) 108 and storing the copied geospatial artifact data in geospatial data storage 114, where it is associated and/or otherwise linked to a data object for the geospatial artifact.
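
The copy-and-link variant might look roughly like the following, where the secondary-source client and the storage API are hypothetical names introduced only for this sketch:

```typescript
// Hypothetical promotion that also copies artifact data from a
// secondary source into primary storage and links it to the artifact's
// data object before updating the layer data set.
interface PrimaryStorage {
  saveArtifactData(artifactId: string, data: object): void;
  linkToDataObject(artifactId: string, data: object): void;
}

function promoteFromSecondary(
  artifactId: string,
  secondaryFetch: (id: string) => object, // assumed secondary-source client
  primary: PrimaryStorage,                // assumed storage API
  layerArtifacts: Map<string, object>,    // artifacts in the layer data set
): void {
  const data = secondaryFetch(artifactId);     // copy from secondary source
  primary.saveArtifactData(artifactId, data);  // persist in primary storage
  primary.linkToDataObject(artifactId, data);  // associate with the data object
  layerArtifacts.set(artifactId, data);        // include in the layer data set
}
```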


Promoting the additional geospatial artifact data into a geospatial layer data set can allow a user to utilize additional features associated with the geospatial artifacts without requiring the user to adjust the presentation level (e.g., zoom in). For example, a user can view a label associated with geospatial artifacts while remaining at a high presentation level.


Promotion module 206 can be configured to promote geospatial artifact data into a generated geospatial layer data set. Promotion module 206 can receive a request from client device 104 to promote geospatial artifact data into a geospatial layer data set and, in response, search geospatial data storage 114 and secondary data source(s) 108 to gather additional geospatial artifact data to include in the geospatial layer data set. Promotion module 206 can then update the geospatial layer data set to include the additional geospatial artifact data.


Additionally, in some embodiments, promotion module 206 can store the geospatial data received from secondary data source(s) 108 in geospatial data storage 114. Further, the stored data can be associated with corresponding data objects. For example, geospatial data received from secondary data source(s) 108 that describes a geospatial artifact can be stored in geospatial data storage 114 and associated with a data object for the geospatial artifact.


In some embodiments, the request received from client device 104 can include data identifying a set of geospatial artifacts that a user would like to interact with, and thus have geospatial artifact data for the set of geospatial artifacts promoted to the geospatial layer data set. For example, the map interface can enable the user to select geospatial artifacts by type, geographic area, etc., that the user would like to promote. For example, the user can select to promote geospatial artifact data for all stores, churches, etc. The resulting request sent to geospatial mapping system 106 can include data identifying the set of geospatial artifacts, which promotion module 206 can use to search geospatial data storage 114 and/or secondary data source(s) 108 for geospatial artifact data.
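
Such a request can carry nothing more than selection criteria. The payload shape below is purely illustrative and not defined by the patent:

```typescript
// Hypothetical promotion request payload: the client identifies which
// artifacts to promote by type and/or area rather than listing them all.
interface PromotionRequest {
  layerId: string;          // layer data set being viewed
  artifactTypes?: string[]; // e.g., ["store", "church"]
  boundingBox?: [number, number, number, number]; // [west, south, east, north]
}

const request: PromotionRequest = {
  layerId: "layer-usa-level-4",
  artifactTypes: ["store", "church"],
};
```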



FIG. 3 shows an example method 300 for causing a visual rendering of a geographical area to be presented on a client device. Method 300 may be embodied in computer-readable instructions for execution by one or more processors such that the operations of method 300 may be performed in part or in whole by geospatial mapping system 106; accordingly, method 300 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of method 300 may be performed on various other hardware configurations and method 300 is not intended to be limited to geospatial mapping system 106.


At operation 302, geospatial mapping system 106 receives a request from client device 104 to view a visual rendering of a geographical area. The request can include data identifying the geographical area. For example, the request can include data identifying a presentation level (e.g., zoom level) selected by a user of the client device that defines the geographic area.


At operation 304, geospatial layer data set generation module 204 generates a geospatial layer data set for the geographic area. The geospatial layer data set can include geospatial data describing the geographic area, such as map data and geospatial artifact data describing geospatial artifacts located in the geographic area.


In some embodiments, geospatial layer data set generation module 204 can search geospatial data storage 114 and secondary data source(s) 108 to gather the geospatial data to include in the geospatial layer data set. Further, geospatial layer data set generation module 204 can include a limited set of geospatial artifact data in the geospatial layer data set. The limited set of geospatial artifact data can include limited data describing a geospatial artifact (e.g., only geographic coordinate data and label data), and/or geospatial artifact data describing only a subset of the geospatial artifacts located in the geographic area. Including a limited set of geospatial artifact data can reduce the time and resources required to generate the geospatial layer data set.


At operation 306, interface module 202 can provide the geospatial layer data set to client device 104. Client device 104 can utilize the received geospatial layer data set to present a visual rendering of the geographic area. For example, client-side application 110 can cause the visual rendering of the geographic area to be presented in a map interface.


The visual rendering of the geographic area can include visual representations of geospatial artifacts included in the geographic area. For example, client device 104 can use the label data and geographic coordinate data describing one or more geospatial artifacts to present a visual label, such as a graphical icon, representing the geospatial artifact. The visual label can be presented at a location to represent the geographic location of the geospatial artifact.


In some embodiments, the visual labels can represent a group of geospatial artifacts rather than a separate label being presented at each geospatial artifact. At higher presentation levels this can provide a user with a cleaner view of the geographic area.
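
One common way to realize grouped labels, offered here as an assumption rather than the patent's method, is grid clustering: snap each artifact to a coarse cell and present one label per occupied cell, annotated with a count.

```typescript
// Hypothetical grid clustering for group labels: artifacts that fall in
// the same coarse cell share a single label showing their count.
type Coord = [number, number]; // [longitude, latitude]

function clusterForLabels(points: Coord[], cellSize: number): Map<string, number> {
  const counts = new Map<string, number>();
  for (const [x, y] of points) {
    const key = `${Math.floor(x / cellSize)},${Math.floor(y / cellSize)}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts; // one label per occupied cell, annotated with the artifact count
}
```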


Although a visual representation of the geospatial artifacts is presented to the user, in some embodiments, additional features cannot be accessed with respect to the presented geospatial artifacts. For example, the user may not be able to select a geospatial artifact to access additional information describing the geospatial artifact without altering the presentation level (e.g., zooming in) or causing geospatial artifact data for the geospatial artifacts to be promoted into the geospatial layer data set.


At operation 308, geospatial mapping system 106 receives, from client device 104, data indicating an updated presentation level selected by a user of client device 104. The updated presentation level can define a smaller geographic area as a result of the user selecting to zoom in to the presented geographic area.


At operation 310, geospatial layer data set generation module 204 generates an updated geospatial layer data set for the updated geographic area. The updated geospatial layer data set for the updated geographic area can include additional geospatial artifact data for the geospatial artifacts included in the updated geographic area.


At operation 312, interface module 202 provides the updated geospatial layer data set to client device 104. Client device 104 can utilize the updated geospatial layer data set to present a visual rendering of the updated geographic area. For example, client-side application 110 can cause the visual rendering of the updated geographic area to be presented in a map interface. Further, the additional geospatial artifact data can provide the user with additional features, such as accessing additional information describing a geospatial artifact. A user can therefore adjust the presentation level (e.g., zoom in) to passively access additional features associated with the geospatial artifacts included in a geographic area.
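
Taken together, operations 302 through 312 suggest a request handler of roughly the following shape; every helper and the threshold are assumptions for illustration:

```typescript
// Hypothetical handler covering both round trips of method 300: an
// initial presentation level yields a limited layer data set, and a
// zoomed-in (lower) level yields a richer one. Helpers are assumed.
interface ViewRequest {
  area: string;
  presentationLevel: number; // higher value = zoomed out, per the patent
}

function handleViewRequest(
  req: ViewRequest,
  generateLayer: (area: string, level: number, limited: boolean) => object,
  send: (layer: object) => void,
  limitThreshold = 10, // levels above this get limited artifact data
): void {
  const limited = req.presentationLevel > limitThreshold;
  send(generateLayer(req.area, req.presentationLevel, limited));
}
```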



FIG. 4 shows an example method 400 for promoting geospatial artifact data into a geospatial layer data set. Method 400 may be embodied in computer-readable instructions for execution by one or more processors such that the operations of method 400 may be performed in part or in whole by geospatial mapping system 106; accordingly, method 400 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of method 400 may be deployed on various other hardware configurations and method 400 is not intended to be limited to geospatial mapping system 106.


At operation 402, geospatial mapping system 106 accesses a geospatial layer data set for a geographic area defined by a presentation level. The geospatial layer data set for the geographic area includes a limited geospatial artifact data set for geospatial artifacts located within the geographic area. For example, geospatial mapping system 106 can gather a pre-generated geospatial layer data set from geospatial data storage 114. Alternatively, geospatial layer data set generation module 204 can generate the geospatial layer data set.


At operation 404, geospatial mapping system 106 provides the geospatial layer data set for the geographic area to client device 104. Client device 104 uses the geospatial layer data set for the geographic area to present a visual rendering of the geographic area. The limited geospatial artifact data set included in the geospatial layer data set for the geographic area enables client device 104 to provide a limited feature set for interacting with the geospatial artifacts located within the geographic area. For example, a user can view a label representing the geospatial artifacts; however, the user may not be able to interact with a geospatial artifact further to access additional information.


At operation 406, geospatial mapping system 106 receives, from client device 104, a request to change, annotate, edit, or utilize an additional feature set on a subset of geospatial artifacts located within the geographic area. For example, a user may have selected to annotate a subset of geospatial artifacts such as stores, churches, etc.


At operation 408, promotion module 206 promotes, to the geospatial layer data set for the geographic area, an additional geospatial artifact data set for the subset of geospatial artifacts, yielding an updated geospatial layer data set for the geographic area. For example, promotion module 206 can access the geospatial artifact data from geospatial data storage 114 and/or secondary data source(s) 108. Promotion module 206 then modifies the geospatial layer data set to include the additional geospatial artifact data set for the subset of geospatial artifacts.


At operation 410, interface module 202 provides the updated geospatial layer data set for the geographic area to client device 104. The additional geospatial artifact data set included in the updated geospatial layer data set enables client device 104 to provide the advanced feature set for interacting with the subset of geospatial artifacts located within the geographic area. Accordingly, a user can actively access additional features associated with the geospatial artifacts included in a geographic area by promoting geospatial artifact data into the geospatial artifact layer data set.
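
Operations 402 through 410 likewise compose into a promotion endpoint along these lines, again with all helper functions assumed:

```typescript
// Hypothetical endpoint for method 400: fetch the current layer data
// set, promote data for the requested subset, and return the update.
function handlePromotionRequest(
  layerId: string,
  subsetIds: string[],
  loadLayer: (id: string) => { artifacts: Map<string, object> }, // op 402
  fetchArtifactData: (ids: string[]) => Map<string, object>,     // assumed
  send: (layer: object) => void,
): void {
  const layer = loadLayer(layerId);                // operation 402
  const additional = fetchArtifactData(subsetIds); // operation 408: gather
  for (const [id, data] of additional) {
    layer.artifacts.set(id, data);                 // operation 408: merge
  }
  send(layer);                                     // operation 410
}
```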



FIG. 5 shows a block diagram illustrating components of a computing device 500, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 5 shows a diagrammatic representation of computing device 500 in the example form of a system, within which instructions 502 (e.g., software, a program, an application, an applet, an app, a driver, or other executable code) for causing computing device 500 to perform any one or more of the methodologies discussed herein may be executed. For example, instructions 502 include executable code that causes computing device 500 to execute methods 300 and 400. In this way, these instructions transform the general, non-programmed computing device into a particular computing device programmed to carry out the described and illustrated functions in the manner described herein. Computing device 500 may operate as a standalone device or may be coupled (e.g., networked) to other machines.


By way of non-limiting example, computing device 500 may comprise or correspond to a television, a computer (e.g., a server computer, a client computer, a PC, a tablet computer, a laptop computer, or a netbook), a set-top box (STB), a PDA, an entertainment media system (e.g., an audio/video receiver), a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a portable media player, or any machine capable of outputting audio signals and capable of executing instructions 502, sequentially or otherwise, that specify actions to be taken by computing device 500. Further, while only a single computing device 500 is illustrated, the term “machine” shall also be taken to include a collection of computing devices 500 that individually or jointly execute instructions 502 to perform any one or more of the methodologies discussed herein.


Computing device 500 may include processors 504, memory 506, storage unit 508, and I/O components 510, which may be configured to communicate with each other such as via bus 512. In an example embodiment, processors 504 (e.g., a central processing unit (CPU), a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 514 and processor 516 that may execute instructions 502. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although FIG. 5 shows multiple processors, computing device 500 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.


Memory 506 (e.g., a main memory or other memory storage) and storage unit 508 are both accessible to processors 504 such as via bus 512. Memory 506 and storage unit 508 store instructions 502 embodying any one or more of the methodologies or functions described herein. In some embodiments, geospatial data storage 114 resides on storage unit 508. Instructions 502 may also reside, completely or partially, within memory 506, within storage unit 508, within at least one of processors 504 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by computing device 500. Accordingly, memory 506, storage unit 508, and the memory of processors 504 are examples of machine-readable media.


As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently and may include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., electrically erasable programmable read-only memory (EEPROM)), or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 502. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 502) for execution by a machine (e.g., computing device 500), such that the instructions, when executed by one or more processors of computing device 500 (e.g., processors 504), cause computing device 500 to perform any one or more of the methodologies described herein (e.g., methods 300 and 400). Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.


Furthermore, the “machine-readable medium” is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium as “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.


The I/O components 510 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 510 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that I/O components 510 may include many other components that are not specifically shown in FIG. 5. I/O components 510 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, I/O components 510 may include input components 518 and output components 520. Input components 518 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components, and the like. Output components 520 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth.


Communication may be implemented using a wide variety of technologies. I/O components 510 may include communication components 522 operable to couple computing device 500 to network 524 or devices 526 via coupling 528 and coupling 530, respectively. For example, communication components 522 may include a network interface component or other suitable device to interface with network 524. In further examples, communication components 522 may include wired communication components, wireless communication components, cellular communication components, near field communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), WiFi® components, and other communication components to provide communication via other modalities. The devices 526 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).


Modules, Components, and Logic


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field-programmable gate array (FPGA) or an ASIC) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware modules). In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between or among such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).


Electronic Apparatus and System


Example embodiments may be implemented in digital electronic circuitry; in computer hardware, firmware, or software; or in combinations of them. Example embodiments may be implemented using a computer program product, for example, a computer program tangibly embodied in an information carrier, for example, in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, for example, a programmable processor, a computer, or multiple computers.


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special-purpose logic circuitry (e.g., an FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice.


Language


Although the embodiments of the present inventive subject matter have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the inventive subject matter. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent, to those of skill in the art, upon reviewing the above description.


All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated references should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.


In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.

Claims
  • 1. A method comprising: receiving, from a client device presenting a map interface based on a first geospatial layer data set rendered according to a first presentation level, a request to utilize an additional feature that is not included in a first set of features enabled by a geospatial artifact data set of the first geospatial layer data set; and in response to receiving the request, promoting additional geospatial artifact data to the first geospatial layer data set, yielding an updated geospatial layer data set rendered according to the first presentation level to provide the map interface at the first presentation level, the updated geospatial layer data set enabling the client device to provide the additional feature of the additional geospatial artifact data within the map interface while the client device is presenting the map interface.
  • 2. The method of claim 1, further comprising: receiving data indicating the first presentation level for viewing the map interface, the first presentation level having been selected by a user of the client device, the first presentation level defining a first subset of a geographic area; and returning, to the client device, the first geospatial layer data set for the first subset of the geographic area, the first geospatial layer data set selected based on the first presentation level.
  • 3. The method of claim 2, further comprising: receiving data indicating a second presentation level selected by the user of the client device; and returning, to the client device, a second geospatial layer data set selected based on the second presentation level.
  • 4. The method of claim 3, wherein the second geospatial layer data set enables the client device to perform a second set of features that includes at least one feature that is not provided by the first set of features.
  • 5. The method of claim 4, further comprising: receiving, from the client device, data indicating a third presentation level; and returning, to the client device, a third geospatial layer data set selected based on the third presentation level, the third geospatial layer data set enabling the client device to perform a third set of features that includes at least one feature not provided by the first set of features and the second set of features.
  • 6. The method of claim 1, wherein the first presentation level is based on a zoom level selected by a user of the client device.
  • 7. The method of claim 1, wherein promoting additional geospatial artifact data to the first geospatial layer data set comprises: modifying the first geospatial layer data set to include the additional geospatial artifact data.
  • 8. The method of claim 1, wherein the first geospatial layer data set includes geographic coordinate data identifying geographic locations for geospatial artifacts located within a first geographic area.
  • 9. A system comprising: one or more computer processors; and one or more computer-readable mediums storing instructions that, when executed by the one or more computer processors, cause the system to perform operations comprising: receiving, from a client device presenting a map interface based on a first geospatial layer data set rendered according to a first presentation level, a request to utilize an additional feature that is not included in a first set of features enabled by a geospatial artifact data set of the first geospatial layer data set; and in response to receiving the request, promoting additional geospatial artifact data to the first geospatial layer data set, yielding an updated geospatial layer data set rendered according to the first presentation level to provide the map interface at the first presentation level, the updated geospatial layer data set enabling the client device to provide the additional feature of the additional geospatial artifact data within the map interface while the client device is presenting the map interface.
  • 10. The system of claim 9, the operations further comprising: receiving data indicating the first presentation level for viewing the map interface, the first presentation level having been selected by a user of the client device, the first presentation level defining a first subset of a geographic area; and returning, to the client device, the first geospatial layer data set for the first subset of the geographic area, the first geospatial layer data set selected based on the first presentation level.
  • 11. The system of claim 10, the operations further comprising: receiving data indicating a second presentation level selected by the user of the client device; and returning, to the client device, a second geospatial layer data set selected based on the second presentation level.
  • 12. The system of claim 11, wherein the second geospatial layer data set enables the client device to perform a second set of features that includes at least one feature that is not provided by the first set of features.
  • 13. The system of claim 12, the operations further comprising: receiving, from the client device, data indicating a third presentation level; and returning, to the client device, a third geospatial layer data set selected based on the third presentation level, the third geospatial layer data set enabling the client device to perform a third set of features that includes at least one feature not provided by the first set of features and the second set of features.
  • 14. The system of claim 9, wherein the first presentation level is based on a zoom level selected by a user of the client device.
  • 15. The system of claim 9, wherein promoting additional geospatial artifact data to the first geospatial layer data set comprises: modifying the first geospatial layer data set to include the additional geospatial artifact data.
  • 16. The system of claim 9, wherein the first geospatial layer data set includes geographic coordinate data identifying geographic locations for geospatial artifacts located within a first geographic area.
  • 17. A non-transitory computer-readable medium storing instructions that, when executed by one or more computer processors of one or more computing devices, cause the one or more computing devices to perform operations comprising: receiving, from a client device presenting a map interface based on a first geospatial layer data set rendered according to a first presentation level, a request to utilize an additional feature that is not included in a first set of features enabled by a geospatial artifact data set of the first geospatial layer data set; and in response to receiving the request, promoting additional geospatial artifact data to the first geospatial layer data set, yielding an updated geospatial layer data set rendered according to the first presentation level to provide the map interface at the first presentation level, the updated geospatial layer data set enabling the client device to provide the additional feature of the additional geospatial artifact data within the map interface while the client device is presenting the map interface.
  • 18. The non-transitory computer-readable medium of claim 17, the operations further comprising: receiving data indicating the first presentation level for viewing the map interface, the first presentation level having been selected by a user of the client device, the first presentation level defining a first subset of a geographic area; and returning, to the client device, the first geospatial layer data set for the first subset of the geographic area, the first geospatial layer data set selected based on the first presentation level.
  • 19. The non-transitory computer-readable medium of claim 18, the operations further comprising: receiving data indicating a second presentation level selected by the user of the client device; and returning, to the client device, a second geospatial layer data set selected based on the second presentation level.
  • 20. The non-transitory computer-readable medium of claim 19, wherein the second geospatial layer data set enables the client device to perform a second set of features that includes at least one feature that is not provided by the first set of features.
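
By way of illustration only, and not by way of limitation of the claims above, the following minimal Python sketch outlines the promotion operation recited in claims 1 and 7: on a request for a feature that the current layer data does not enable, additional artifact data is merged into the layer data set while the presentation level is left unchanged. The data structures, names, and feature-matching logic here are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Artifact:
        artifact_id: str
        coordinates: tuple       # (latitude, longitude) of the artifact
        features: frozenset      # names of features this artifact data enables

    @dataclass
    class LayerDataSet:
        presentation_level: int  # zoom level; promotion never changes this
        artifacts: dict = field(default_factory=dict)  # artifact_id -> Artifact

        def enabled_features(self) -> set:
            return set().union(*(a.features for a in self.artifacts.values()))

    def handle_feature_request(layer: LayerDataSet, requested_feature: str,
                               detail_store: dict) -> LayerDataSet:
        # Already enabled by the first set of features: nothing to promote.
        if requested_feature in layer.enabled_features():
            return layer
        # Promote: modify the layer data set in place to include the
        # additional artifact data that enables the requested feature.
        for artifact in detail_store.values():
            if requested_feature in artifact.features:
                layer.artifacts[artifact.artifact_id] = artifact
        return layer  # updated layer, still at the same presentation level

Under these assumptions, a request for, say, a “measure-distance” feature on a coarse city-level layer would promote only the artifacts whose additional data enables that feature, and the map interface would continue to be rendered at the same zoom level.
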
PRIORITY APPLICATION

This application is a continuation of, and claims priority to, U.S. patent application Ser. No. 16/681,546, filed Nov. 12, 2019, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/808,608, filed Nov. 9, 2017, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/377,668, filed Dec. 13, 2016, the disclosures of which are incorporated herein in their entirety by reference.

US Referenced Citations (248)
Number Name Date Kind
4899161 Morin, Jr. et al. Feb 1990 A
4958305 Piazza Sep 1990 A
5329108 Lamoure Jul 1994 A
5754182 Kobayashi May 1998 A
5781195 Marvin Jul 1998 A
5781704 Rossmo Jul 1998 A
6091956 Hollenberg Jul 2000 A
6157747 Szeliski et al. Dec 2000 A
6169552 Endo et al. Jan 2001 B1
6173067 Payton et al. Jan 2001 B1
6178432 Cook et al. Jan 2001 B1
6247019 Davies Jun 2001 B1
6389289 Voce May 2002 B1
6414683 Gueziec Jul 2002 B1
6483509 Rabenhorst Nov 2002 B1
6529900 Patterson et al. Mar 2003 B1
6631496 Li et al. Oct 2003 B1
6662103 Skolnick et al. Dec 2003 B1
6757445 Knopp Jun 2004 B1
6828920 Owen et al. Dec 2004 B2
6983203 Wako Jan 2006 B1
6985950 Hanson et al. Jan 2006 B1
7036085 Barros Apr 2006 B2
7158878 Rasmussen Jan 2007 B2
7375732 Arcas May 2008 B2
7379811 Rasmussen et al. May 2008 B2
7457706 Melero Nov 2008 B2
7502786 Liu et al. Mar 2009 B2
7519470 Brasche et al. Apr 2009 B2
7529195 Gorman May 2009 B2
7539666 Ashworth et al. May 2009 B2
7558677 Jones Jul 2009 B2
7574428 Leiserowitz et al. Aug 2009 B2
7579965 Bucholz Aug 2009 B2
7617314 Bansod Nov 2009 B1
7620628 Kapur et al. Nov 2009 B2
7663621 Allen Feb 2010 B1
7791616 Ioup et al. Sep 2010 B2
7805457 Viola et al. Sep 2010 B1
7809703 Balabhadrapatruni et al. Oct 2010 B2
7872647 Mayer et al. Jan 2011 B2
7894984 Rasmussen et al. Feb 2011 B2
7899611 Downs et al. Mar 2011 B2
7920963 Jouline et al. Apr 2011 B2
7945852 Pilskains May 2011 B1
7962281 Rasmussen et al. Jun 2011 B2
7970240 Chao et al. Jun 2011 B1
8010545 Stefik et al. Aug 2011 B2
8036632 Cona et al. Oct 2011 B1
8065080 Koch Nov 2011 B2
8085268 Carrino et al. Dec 2011 B2
8134457 Velipasalar et al. Mar 2012 B2
8145703 Frishert et al. Mar 2012 B2
8200676 Frank Jun 2012 B2
8214361 Sandler et al. Jul 2012 B1
8214764 Gemmell et al. Jul 2012 B2
8229947 Fujinaga Jul 2012 B2
8230333 Decherd et al. Jul 2012 B2
8290942 Jones et al. Oct 2012 B2
8290943 Carbone et al. Oct 2012 B2
8301464 Cave et al. Oct 2012 B1
8325178 Doyle et al. Dec 2012 B1
8368695 Howell et al. Feb 2013 B2
8397171 Klassen et al. Mar 2013 B2
8400448 Doyle Mar 2013 B1
8407180 Ramesh et al. Mar 2013 B1
8412234 Gatmir-Motahari et al. Apr 2013 B1
8412707 Mianji Apr 2013 B1
8422825 Neophytou et al. Apr 2013 B1
8452790 Mianji May 2013 B1
8463036 Ramesh et al. Jun 2013 B1
8489331 Kopf et al. Jul 2013 B2
8489641 Seefeld et al. Jul 2013 B1
8498984 Hwang et al. Jul 2013 B1
8508533 Cervelli et al. Aug 2013 B2
8510268 LaForge Aug 2013 B1
8514229 Cervelli et al. Aug 2013 B2
8515207 Chau Aug 2013 B2
8564596 Carrino et al. Oct 2013 B2
8738284 Jones May 2014 B1
8742934 Sarpy, Sr. et al. Jun 2014 B1
8781169 Jackson et al. Jul 2014 B2
8799799 Cervelli et al. Aug 2014 B1
8830322 Nerayoff et al. Sep 2014 B2
8938686 Erenrich et al. Jan 2015 B1
8949164 Mohler Feb 2015 B1
8983494 Onnen et al. Mar 2015 B1
9009177 Zheng et al. Apr 2015 B2
9021384 Beard et al. Apr 2015 B1
9104293 Kornfeld et al. Aug 2015 B1
9104695 Cervelli et al. Aug 2015 B1
9111380 Piemonte et al. Aug 2015 B2
9129219 Robertson et al. Sep 2015 B1
9146125 Vulcano et al. Sep 2015 B2
9280618 Bruce et al. Mar 2016 B1
9865034 Wilczynski et al. Jan 2018 B1
10515433 Wilczynski et al. Dec 2019 B1
20020003539 Abe Jan 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020065691 Twig May 2002 A1
20020116120 Ruiz et al. Aug 2002 A1
20020130867 Yang et al. Sep 2002 A1
20020130906 Miyaki Sep 2002 A1
20030052896 Higgins et al. Mar 2003 A1
20030103049 Kindratenko et al. Jun 2003 A1
20030144868 MacIntyre et al. Jul 2003 A1
20030163352 Surpin et al. Aug 2003 A1
20030225755 Iwayama et al. Dec 2003 A1
20040030492 Fox et al. Feb 2004 A1
20040039498 Ollis et al. Feb 2004 A1
20040098236 Mayer et al. May 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20050031197 Knopp Feb 2005 A1
20050034062 Bufkin et al. Feb 2005 A1
20050080769 Gemmell et al. Apr 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050182502 Iyengar Aug 2005 A1
20050182793 Keenan et al. Aug 2005 A1
20050223044 Ashworth et al. Oct 2005 A1
20050267652 Allstadt et al. Dec 2005 A1
20060026170 Kreitler et al. Feb 2006 A1
20060139375 Rasmussen et al. Jun 2006 A1
20060146050 Yamauchi Jul 2006 A1
20060149596 Surpin et al. Jul 2006 A1
20060200384 Arutunian et al. Sep 2006 A1
20060251307 Florin et al. Nov 2006 A1
20060259527 Devarakonda et al. Nov 2006 A1
20060271277 Hu et al. Nov 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20070011150 Frank Jan 2007 A1
20070016363 Huang et al. Jan 2007 A1
20070024620 Muller-Fischer et al. Feb 2007 A1
20070057966 Ohno et al. Mar 2007 A1
20070078832 Ott et al. Apr 2007 A1
20070115373 Gallagher et al. May 2007 A1
20070188516 Ioup et al. Aug 2007 A1
20070208497 Downs et al. Sep 2007 A1
20070208498 Barker et al. Sep 2007 A1
20070258642 Thota Nov 2007 A1
20070294643 Kyle Dec 2007 A1
20080010605 Frank Jan 2008 A1
20080040684 Crump et al. Feb 2008 A1
20080077642 Carbone Mar 2008 A1
20080082578 Hogue et al. Apr 2008 A1
20080098085 Krane et al. Apr 2008 A1
20080104019 Nath May 2008 A1
20080133579 Lim Jun 2008 A1
20080163073 Becker et al. Jul 2008 A1
20080192053 Howell Aug 2008 A1
20080195417 Surpin Aug 2008 A1
20080223834 Griffiths et al. Sep 2008 A1
20080229056 Agarwal et al. Sep 2008 A1
20080263468 Cappione et al. Oct 2008 A1
20080267107 Rosenberg Oct 2008 A1
20080270468 Mao et al. Oct 2008 A1
20080278311 Grange et al. Nov 2008 A1
20080288306 MacIntyre et al. Nov 2008 A1
20080294678 Gorman et al. Nov 2008 A1
20080301643 Appleton et al. Dec 2008 A1
20090027418 Maru et al. Jan 2009 A1
20090088964 Schaaf et al. Apr 2009 A1
20090100018 Roberts Apr 2009 A1
20090115786 Shimasaki et al. May 2009 A1
20090132921 Hwangbo et al. May 2009 A1
20090132953 Reed, Jr. et al. May 2009 A1
20090144262 White et al. Jun 2009 A1
20090158185 Lacevic et al. Jun 2009 A1
20090171939 Athsani et al. Jul 2009 A1
20090172511 Decherd et al. Jul 2009 A1
20090179892 Tsuda et al. Jul 2009 A1
20090181650 Dicke Jul 2009 A1
20090187447 Cheng et al. Jul 2009 A1
20090187464 Bai et al. Jul 2009 A1
20090222400 Kupershmidt et al. Sep 2009 A1
20090292626 Oxford et al. Nov 2009 A1
20100057716 Stefik et al. Mar 2010 A1
20100063961 Guiheneuf et al. Mar 2010 A1
20100070523 Delgo et al. Mar 2010 A1
20100076968 Boyns et al. Mar 2010 A1
20100106420 Mattikalli et al. Apr 2010 A1
20100162176 Dunton Jun 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20100277611 Holt et al. Nov 2010 A1
20100293174 Bennett Nov 2010 A1
20100321399 Ellren et al. Dec 2010 A1
20110022312 McDonough et al. Jan 2011 A1
20110090254 Carrino et al. Apr 2011 A1
20110117878 Barash et al. May 2011 A1
20110137766 Rasmussen et al. Jun 2011 A1
20110153368 Pierre et al. Jun 2011 A1
20110161096 Buehler et al. Jun 2011 A1
20110170799 Carrino et al. Jul 2011 A1
20110208724 Jones et al. Aug 2011 A1
20110218934 Elser Sep 2011 A1
20110225198 Edwards et al. Sep 2011 A1
20110238690 Arrasvuori et al. Sep 2011 A1
20110270705 Parker Nov 2011 A1
20120066296 Appleton et al. Mar 2012 A1
20120084118 Bai et al. Apr 2012 A1
20120106801 Jackson May 2012 A1
20120144335 Abeln et al. Jun 2012 A1
20120158527 Cannelongo Jun 2012 A1
20120159363 DeBacker et al. Jun 2012 A1
20120173985 Peppel Jul 2012 A1
20120206469 Hulubei et al. Aug 2012 A1
20120208636 Feige Aug 2012 A1
20120221580 Barney Aug 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20130006725 Simanek et al. Jan 2013 A1
20130021445 Cossette-Pacheco et al. Jan 2013 A1
20130057549 Beaver, III Mar 2013 A1
20130057551 Ebert et al. Mar 2013 A1
20130060786 Serrano et al. Mar 2013 A1
20130073377 Heath Mar 2013 A1
20130076732 Cervelli et al. Mar 2013 A1
20130100134 Cervelli et al. Apr 2013 A1
20130101159 Chao et al. Apr 2013 A1
20130124563 Cavelie et al. May 2013 A1
20130132398 Pfeifle May 2013 A1
20130150004 Rosen Jun 2013 A1
20130176321 Mitchell et al. Jul 2013 A1
20130179420 Park et al. Jul 2013 A1
20130254900 Sathish et al. Sep 2013 A1
20130268520 Fisher et al. Oct 2013 A1
20130279757 Kephart Oct 2013 A1
20130282723 Petersen et al. Oct 2013 A1
20130339891 Blumenberg Dec 2013 A1
20140176606 Narayan et al. Jun 2014 A1
20140218400 O'Toole et al. Aug 2014 A1
20140218404 Kreft Aug 2014 A1
20140333651 Cervelli et al. Nov 2014 A1
20140337772 Cervelli et al. Nov 2014 A1
20140347383 Cornell Nov 2014 A1
20140361899 Layson Dec 2014 A1
20150029176 Baxter Jan 2015 A1
20150070397 Miller Mar 2015 A1
20150100907 Erenrich et al. Apr 2015 A1
20150106170 Bonica Apr 2015 A1
20150170385 Appleton et al. Jun 2015 A1
20150186821 Wang et al. Jul 2015 A1
20150187036 Wang et al. Jul 2015 A1
20150187100 Berry et al. Jul 2015 A1
20150312323 Peterson Oct 2015 A1
20150338233 Cervelli et al. Nov 2015 A1
20150379413 Robertson et al. Dec 2015 A1
20160033295 Li et al. Feb 2016 A1
20200082502 Wilczynski et al. Mar 2020 A1
Foreign Referenced Citations (18)
Number Date Country
2012216622 Apr 2013 AU
102013222023 Jan 2015 DE
0763201 Mar 1997 EP
2575107 Apr 2013 EP
2858014 Apr 2015 EP
2963595 Jan 2016 EP
2516155 Jan 2015 GB
2012778 Nov 2014 NL
624557 Aug 2014 NZ
WO-9532424 Nov 1995 WO
WO-2000009529 Feb 2000 WO
WO-2001098925 Dec 2001 WO
WO-2004057268 Jul 2004 WO
WO-2005013200 Feb 2005 WO
WO-2008064207 May 2008 WO
WO-2009061501 May 2009 WO
WO-2009123975 Oct 2009 WO
WO-2011058507 May 2011 WO
Non-Patent Literature Citations (106)
Entry
“A First Look: Predicting Market Demand for Food Retails using a Huff Analysis”, TRF Policy Solutions, CDFI Fund, Capacity Building Initiative, (Jul. 2012), 1-30.
“Amm's Diary: Unconnected ways and other data quality issues”, Open Street Map, [Online], Retrieved from the Internet: <URL: http://www.openstreetmap.org/user/amm/diary>, (Accessed: Jul. 23, 2012), 3 pgs.
“U.S. Appl. No. 12/840,673, Final Office Action dated Jan. 2, 2015”, 21 pgs.
“U.S. Appl. No. 12/840,673, Non Final Office Action dated Sep. 17, 2014”, 21 pgs.
“U.S. Appl. No. 12/840,673, Notice of Allowance dated Apr. 6, 2015”, 11 pgs.
“U.S. Appl. No. 13/728,879, Final Office Action dated Aug. 12, 2015”, 9 pgs.
“U.S. Appl. No. 13/728,879, First Action Interview Office Action Summary dated Mar. 17, 2015”, 5 pgs.
“U.S. Appl. No. 13/728,879, First Action Interview Pre-Interview Communication dated Jan. 27, 2015”, 4 pgs.
“U.S. Appl. No. 13/728,879, Non Final Office Action dated Nov. 20, 2015”, 9 pgs.
“U.S. Appl. No. 13/728,879, Notice of Allowance dated Jun. 21, 2016”, 13 pgs.
“U.S. Appl. No. 13/917,571, Issue Notification dated Aug. 5, 2014”, 1 pg.
“U.S. Appl. No. 13/948,859, Notice of Allowance dated Dec. 10, 2014”, 8 pgs.
“U.S. Appl. No. 14/289,596, Advisory Action dated Apr. 30, 2015”, 3 pgs.
“U.S. Appl. No. 14/289,596, Final Office Action dated Jan. 26, 2015”, 38 pgs.
“U.S. Appl. No. 14/289,596, First Action Interview Pre-Interview Communication dated Jul. 18, 2014”, 4 pgs.
“U.S. Appl. No. 14/289,596, Non Final Office Action dated May 9, 2016”, 37 pgs.
“U.S. Appl. No. 14/289,599, Advisory Action dated Sep. 4, 2015”, 24 pgs.
“U.S. Appl. No. 14/289,599, Final Office Action dated May 29, 2015”, 8 pgs.
“U.S. Appl. No. 14/289,599, First Action Interview Pre-Interview Communication dated Jul. 22, 2014”, 5 pgs.
“U.S. Appl. No. 14/294,098, Final Office Action dated Nov. 6, 2014”, 22 pgs.
“U.S. Appl. No. 14/294,098, First Action Interview Pre-Interview Communication dated Aug. 15, 2014”, 17 pgs.
“U.S. Appl. No. 14/294,098, Notice of Allowance dated Dec. 29, 2014”, 9 pgs.
“U.S. Appl. No. 14/319,161, Final Office Action dated Jan. 23, 2015”, 21 pgs.
“U.S. Appl. No. 14/319,161, Notice of Allowance dated May 4, 2015”, 6 pgs.
“U.S. Appl. No. 14/490,612, Final Office Action dated Aug. 18, 2015”, 71 pgs.
“U.S. Appl. No. 14/730,123, First Action Interview Pre-Interview Communication dated Sep. 21, 2015”, 18 pgs.
“U.S. Appl. No. 14/730,123, Notice of Allowance dated Apr. 12, 2016”, 10 pgs.
“U.S. Appl. No. 14/929,584, Final Office Action dated May 25, 2016”, 42 pgs.
“U.S. Appl. No. 14/929,584, Non Final Office Action dated Feb. 4, 2016”, 15 pgs.
“U.S. Appl. No. 14/934,004, First Action Interview Pre-Interview Communication dated Feb. 16, 2016”, 5 pgs.
“U.S. Appl. No. 15/377,668 Examiner Interview Summary dated Aug. 11, 2017”, 5 pgs.
“U.S. Appl. No. 15/377,668, Final Office Action dated Jun. 16, 2017”, 15 pgs.
“U.S. Appl. No. 15/377,668, First Action Interview—Office Action Summary dated Mar. 30, 2017”, 5 pgs.
“U.S. Appl. No. 15/377,668, First Action Interview—Pre-Interview Communication dated Feb. 16, 2017”, 14 pgs.
“U.S. Appl. No. 15/377,668, Notice of Allowance dated Sep. 11, 2017”, 9 pgs.
“U.S. Appl. No. 15/377,668, Response filed Aug. 16, 2017 to Final Office Action dated Jun. 16, 2017”, 16 pgs.
“U.S. Appl. No. 15/377,668, Response filed May 30, 2017 to First Action Interview—Office Action Summary dated Mar. 30, 2017”, 13 pgs.
“U.S. Appl. No. 15/808,608, Examiner Interview Summary dated Jul. 19, 2019”, 3 pgs.
“U.S. Appl. No. 15/808,608, Non Final Office Action dated Apr. 2, 2019”, 10 pgs.
“U.S. Appl. No. 15/808,608, Notice of Allowance dated Aug. 14, 2019”, 8 pgs.
“U.S. Appl. No. 15/808,608, Preliminary Amendment filed Nov. 22, 2017”, 8 pgs.
“U.S. Appl. No. 15/808,608, Response filed Aug. 1, 2019 to Non-Final Office Action dated Apr. 2, 2019”, 12 pgs.
“U.S. Appl. No. 16/681,546, First Action Interview—Office Action Summary dated Dec. 9, 2020”, 13 pgs.
“U.S. Appl. No. 16/681,546, First Action Interview—Pre-Interview Communication dated Nov. 18, 2020”, 12 pgs.
“U.S. Appl. No. 16/681,546, Notice of Allowance dated Feb. 18, 2021”, 8 pgs.
“U.S. Appl. No. 16/681,546, Response filed Feb. 10, 2021 to First Action Interview—Office Action Summary dated Dec. 9, 2020”, 11 pgs.
“Australian Application Serial No. 2012216622, Office Action dated Jan. 6, 2015”, 2 pgs.
“Australian Application Serial No. 2014202442, Office Action dated Mar. 19, 2015”, 5 pgs.
“Australian Application Serial No. 2014213553, Office Action dated May 7, 2015”, 2 pgs.
“Buffer a Polygon”, VBForums, [Online]Retrieved from the Internet: <URL: http://www.vbforums.com/showthread.php?198436-Buffer-a-Polygon>, (Accessed: Oct. 10, 2016).
“Douglas-Peucker-Algorithms”, Wikipedia (W/ Machine Translation), [Online]. [Archived Jul. 29, 2011]. Retrieved from the Internet: <URL: http://de.wikipedia.org/w/index.php?title=Douglas-Peucker-Algorithmus&oldid=91846042>, (Last Modified: Jul. 29, 2011), 4 pgs.
“European Application Serial No. 14187739.9, Extended European Search Report dated Jul. 6, 2015”, 9 pgs.
“European Application Serial No. 17207163.1, Communication Pursuant to Article 94(3) EPC dated Nov. 19, 2019”, 7 pgs.
“European Application Serial No. 17207163.1, Extended European Search Report dated Mar. 12, 2018”, 10 pgs.
“European Application Serial No. 17207163.1, Response filed Jun. 2, 2020 to Communication Pursuant to Article 94(3) EPC dated Nov. 19, 2019”, 13 pgs.
“European Application Serial No. 17207163.1, Response filed Dec. 20, 2018 to Extended European Search Report dated Mar. 12, 2018”, 21 pgs.
“GIS-NET 3 Public Department of Regional Planning”, Planning & Zoning Information for Unincorporated LA County, [Online] Retrieved from the internet: <http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html>, (Oct. 2, 2013), 1-2.
“Great Britain Application Serial No. 1408025.3, Office Action dated Nov. 6, 2014”, 3 pgs.
“Hunchlab: Heat Map and Kernel Density Calculation for Crime Analysis”, Azavea Journal, [Online]. Retrieved from the Internet: <URL: www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab>, (Sep. 9, 2014), 2 pgs.
“Identify—Definition”, Downloaded Jan. 22, 2015, (Jan. 22, 2015), 1 pg.
“Map Builder: Rapid Mashup Development Tool for Google and Yahoo Maps!”, http://web.archive.org/web/20090626224734/http://www.mapbuilder.net/, (Jul. 20, 2012), 2 pgs.
“Map of San Jose, CA”, Retrieved Oct. 2, 2013 from http://maps.google.com, (Oct. 2, 2013), 1 pg.
“Map of San Jose, CA.”, Retrieved Oct. 2, 2013 from http://maps.yahoo.com, (Oct. 2, 2013), 1 pg.
“Map of San Jose, CA.”, Retrieved Oct. 2, 2013 from http://maps.bing.com, (Oct. 2, 2013), 1 pg.
“Netherlands Application Serial No. 2011632, Netherlands Search Report dated Feb. 8, 2016”, W/ English Translation, 9 pgs.
“Netherlands Application Serial No. 2012778, Netherlands Search Report dated Sep. 22, 2015”, W/ English Translation, 10 pgs.
“New Zealand Application Serial No. 628585, Office Action dated Aug. 26, 2014”, 2 pgs.
“New Zealand Application Serial No. 628840, Office Action dated Aug. 28, 2014”, 2 pgs.
“Overlay—Definition”, Downloaded Jan. 22, 2015, (Jan. 22, 2015), 1 pg.
“Ramer-Douglas-Peucker algorithm”, Wikipedia, [Online]. [Archived May 31, 2013]. Retrieved from the Internet: <URL: http://en.wikipedia.org/w/index.php?title=Ramer-Douglas-Peucker_algorithm&oldid=557739119>, (Jul. 2011), 3 pgs.
“Using the Area of Interest Tools”, Sonris, [Online]. Retrieved from the Internet: <URL: http://web.archive.org/web/20061001053327/http://sonris-www.dnr.state.la.us/gis/instruct_files/tutslide12.htm>, (Oct. 1, 2006), 1 pg.
Aquino, J., et al., “JTS Topology Suite: Technical Specifications”, Vivid Solutions, Technical Specifications Version 1.4, (Oct. 17, 2003), 1-36.
Barnes, Diane, et al., “Viewshed Analysis”, GIS-ARC/INFO, (2001), 1-10.
Barto, “How To: Create Your Own Points of Interest”, How To, [Online]. Retrieved from the Internet: <URL:http://www.poieditor.com/articles/howto_create_your_own_points_of_interest/>, (Jul. 22, 2008), 4 pgs.
Carver, Steve, et al., “Real-time visibility analysis and rapid viewshed calculation using a voxel-based modelling approach”, (Apr. 13, 2012), 6 pgs.
Chen, et al., “Bringing Order to the Web: Automatically Categorizing Search Results”, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, The Hague, The Netherlands, (2000), 145-152.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model”, Directions Magazine, [Online]. Retrieved from the Internet: <URL: http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411>, (Jul. 2, 2005), 10 pgs.
Ghosh, Pijush K, “A Solution of Polygon Containment, Spatial Planning, and Other Related Problems Using Minkowski Operations”, National Centre for Software Technology, Bombay India, Computer Vision, Graphics, and Image Processing, vol. 49, (Feb. 15, 1989), 35 pgs.
Gorr, et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation”, Grant 98-IJ-CX-K005, (May 6, 2002), 37 pgs.
Griffith, Daniel A, et al., “A Generalized Huff Model”, Geographical Analysis, vol. 14, No. 2, (Apr. 1982), 135-144.
Haralick, Robert M, et al., “Image Analysis Using Mathematical Morphology”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-9, No. 4, (Jul. 1987), 532-550.
Hibbert, et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework”, (Mar. 18, 2011), 16 pgs.
Huang, Da Wei, et al., “Systematic and Integrative Analysis of Large Gene Lists Using DAVID Bioinformatics Resources”, Nature Protocols, 4.1, (Dec. 2008), 44-57.
Huff, et al., “Calibrating the Huff Model Using ArcGIS Business Analyst”, ESRI, (Sep. 2008), 33 pgs.
Huff, David L, “Parameter Estimation in the Huff Model”, ESRI, ArcUser, (2003), 34-36.
Ipbucker, C, et al., “Inverse Transformation for Several Pseudo-cylindrical Map Projections Using Jacobian Matrix”, ICCSA 2009, Part 1 LNCS 5592, (2009), 2 pgs.
Levine, Ned, “Crime Mapping and the Crimestat Program”, Geographical Analysis, vol. 38, (2006), 41-56.
Liu, T., “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA”, Papers in Resource Analysis, 2012, vol. 14, (2012), 8 pgs.
Mandagere, Nagapramod, “Buffer Operations in GIS”, [Online]. Retrieved from the Internet: <URL: http://www-users.cs.umn.edu/˜npramod/enc_pdf.pdf>, (Printed: Jan. 20, 2010), 7 pgs.
Murray, C, “Oracle Spatial Developer's Guide-6 Coordinate Systems (Spatial Reference Systems)”, [Online]. Retrieved from the Internet: <URL: http://docs.oracle.com/cd/B28359_01/appdev.111/b28400.pdf>, (Jun. 2009), 870 pgs.
Pozzi, F., et al., “Vegetation and Population Density in Urban and Suburban Areas in the USA”, Presented at the Third International Symposium of Remote Sensing of Urban Areas; Istanbul, Turkey, Jun. 2002, (Jun. 2002), 8 pgs.
Qiu, Fang, “3D Analysis and Surface Modeling”, Powerpoint presentation, [Online] Retrieved from the Internet: <URL: http://web.archive.org/web/20091202221925/http://www.utsa.edu/lrsg/Teaching/EES6513/08-3D.pdf>, (accessed Sep. 16, 2013), 26 pgs.
Reddy, Martin, et al., “Under the Hood of GeoVRML 1.0”, Proceedings of VRML '00, the Fifth Symposium on Virtual Reality Modeling Language (Web3D-VRML), pp. 23-28, [Online] Retrieved from the Internet: <http://pdf.aminer.org/000/648/038/under_the_hood_of_geovrml.pdf>, (Feb. 2000), 7 pgs.
Reibel, M., et al., “Areal Interpolation of Population Counts Using Pre-classified Land Cover Data”, Popul Res Policy Rev. 26, (Sep. 19, 2007), 619-633.
Reibel, M., et al., “Geographic Information Systems and Spatial Data Processing in Demography: a Review”, Popul Res Policy Rev (2007) 26, (Sep. 6, 2007), 601-618.
Rizzardi, M., et al., “Interfacing U.S. Census Map Files With Statistical Graphics Software: Application and Use in Epidemiology”, Statistics in Medicine, vol. 12, (1993), 1953-1964.
Snyder, John P, “Map Projections—A Working Manual”, U.S. Geological Survey Professional Paper, 1395, (1987), 29 pgs.
Tangelder, J W.H, et al., “Freeform Shape Matching Using Minkowski Operations”, (Jun. 1996), 12 pgs.
Thompson, Mick, “Getting Started with GEO”, (Jul. 26, 2011), 3 pgs.
Turner, Andy, “Andy Turner's GISRUK 2012 Notes”, Google Drive—https://docs.google.com/document/d/1cTmxg7mVx5gd89lqblCYvDEnHA4QAivH417WpyPsqE4edit?pli=1, (Sep. 16, 2013), 1-15.
Valentini, Giorgio, et al., “Ensembles of Learning Machines”, Lecture Notes in Computer Science: Neural Nets, Springer Berlin Heidelberg, (Sep. 26, 2002), 3-20.
Wongsuphasawat, Krist, et al., “Visual Analytics for Transportation Incident Data Sets”, Transportation Research Record: Journal of the Transportation Research Board, No. 2138, (2009), 135-145.
Woodbridge, Stephen, “[geos-devel] Polygon simplification”, [Online]. Retrieved from the Internet: <URL: http://lists.osgeo.org/pipermail/geos-devel/2011-May/005210.html>, (May 8, 2011), 2 pgs.
U.S. Appl. No. 15/808,608, U.S. Pat. No. 10,515,433, filed Nov. 9, 2017, Zoom-Adaptive Data Granularity to Achieve a Flexible High-Performance Interface for a Geospatial Mapping System.
U.S. Appl. No. 16/681,546, filed Nov. 12, 2019, Zoom-Adaptive Data Granularity to Achieve a Flexible High-Performance Interface for a Geospatial Mapping System.
U.S. Appl. No. 15/377,668, U.S. Pat. No. 9,865,034, filed Dec. 13, 2016, Zoom-Adaptive Data Granularity to Achieve a Flexible High-Performance Interface for a Geospatial Mapping System.
Related Publications (1)
Number Date Country
20210272234 A1 Sep 2021 US
Continuations (3)
Number Date Country
Parent 16681546 Nov 2019 US
Child 17323507 US
Parent 15808608 Nov 2017 US
Child 16681546 US
Parent 15377668 Dec 2016 US
Child 15808608 US