Capturing overhead images of geographic regions, typically by satellites or aircraft, and providing the images to interested parties has become a popular service in recent years. Such images can be of interest to a wide variety of individuals and organizations, including geographers, researchers, meteorologists, scientists, map service providers, government agencies, and amateur photography enthusiasts, to name a few. The character and quality of these overhead images can vary widely, depending on multiple factors including lighting, elevation, cloud cover, and the equipment used to capture the image. There can be thousands of unique overhead images for a particular geographic region. Consequently, some service providers have implemented web sites and other types of interactive systems for searching and viewing overhead images.
Throughout the drawings, reference numbers may be re-used to indicate correspondence between referenced elements. The drawings are provided to illustrate example embodiments described herein and are not intended to limit the scope of the disclosure.
Overview
The present disclosure provides an overhead image viewing system and associated user interface that enable users to seamlessly browse overhead images (e.g. satellite images) in context while browsing a map, and select an overhead image to view detailed information and/or to purchase. The user interface enables users to browse seamlessly by dynamically retrieving and providing overhead images corresponding to a currently displayed geographic region of interest; by displaying a selected overhead image overlaid on and contextually blended with the geographic region associated with the overhead image; and by automatically re-centering the displayed map when users return to the main view. Accordingly, certain embodiments of the overhead image viewing system allow the user to continuously view and select overhead images while remaining in the context of a map display of a geographic region of interest (which may change as the user pans the map or searches for a new region).
The user interface may be used with any type of interactive system (such as a web site, a mobile application based system, or an interactive television system) that generates a user interface for browsing a map or geographic region of interest and overhead images corresponding to a particular geographic region of interest and/or search filter criteria. For example, the user interface may be part of an interactive system, such as the interactive system 700 of
Examples of User Interfaces for Overhead Image Viewing Systems
As depicted in
In the illustrated embodiment, a filmstrip image viewer 135 is displayed below the geographic region of interest. The filmstrip image viewer 135 may display a set of the overhead images 125 which correspond to the currently displayed geographic region on the map 120. In one embodiment, the set of overhead images displayed in the filmstrip image viewer 135 may be updated, replaced, or refreshed when the user indicates or specifies a different geographic region of interest, for example using the various user interface elements described herein (e.g. the search box 105, the zoom level selector 115, panning the map with the mouse click-and-drag feature, or a touch and/or swipe gesture via a touch-enabled screen). Thus, the set of overhead images displayed to the user may be updated dynamically and automatically as the user navigates or browses the map. Once the user has found a particular geographic region of interest, the set of overhead images displayed in the filmstrip image viewer 135 may be updated to provide the user with the set of overhead images most relevant to the user's browsing.
For example, the user may use the search box 105 to search for overhead images in Maui, Hawaii. In response to this search request, the interactive system 700 may update the map 120 to display the geographic region corresponding to Maui, and update the filmstrip image viewer 135 with a set of overhead images corresponding to the displayed geographic region. Once this view is updated, the user may decide to pan the map using the mouse click-and-drag feature (or a touch and/or swipe gesture via a touch-enabled screen) to view the nearby island of Lanai, or to use the zoom level selector 115 to zoom the map in on the mountain of Haleakala. In any of these interactions, the map may be automatically updated to display the selected geographic region of interest, and the filmstrip image viewer 135 may be automatically refreshed with overhead images corresponding to the selected geographic region. For example, if a zoom-in operation is performed to limit the map display 120 to a specific area, images that do not correspond to that specific area may be removed from the filmstrip image viewer.
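For illustration only, one possible way such a viewport-driven refresh might be wired together is sketched below in TypeScript. The type and function names (e.g., fetchImagesForRegion, renderFilmstrip) are hypothetical and are not part of this disclosure; the sketch simply re-queries the image catalog whenever the displayed region changes and keeps only images whose footprints still overlap that region.

```typescript
// Hypothetical sketch of a viewport-driven filmstrip refresh. The type and
// function names are illustrative only and are not taken from the disclosed system.

interface BoundingBox {
  west: number;   // degrees longitude
  south: number;  // degrees latitude
  east: number;
  north: number;
}

interface OverheadImage {
  id: string;
  footprint: BoundingBox;   // geographic extent covered by the image
  thumbnailUrl: string;
}

// Stand-in for a server call that returns candidate images for a region.
async function fetchImagesForRegion(region: BoundingBox): Promise<OverheadImage[]> {
  return [];   // a real implementation would query the image catalog
}

function intersects(a: BoundingBox, b: BoundingBox): boolean {
  return a.west < b.east && a.east > b.west && a.south < b.north && a.north > b.south;
}

// Stand-in for the UI routine that redraws the filmstrip thumbnails.
function renderFilmstrip(images: OverheadImage[]): void {
  console.log(images.map(img => img.thumbnailUrl));
}

// Called after a search, pan, or zoom settles on a new map view.
async function onViewportChanged(region: BoundingBox): Promise<void> {
  const candidates = await fetchImagesForRegion(region);
  // Drop anything that no longer overlaps the displayed region (e.g., after a zoom-in).
  renderFilmstrip(candidates.filter(img => intersects(img.footprint, region)));
}
```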
The user interface depicted in
In some embodiments, the set of overhead images corresponding to the geographic region of interest on the map 120 may be larger than the available space on the filmstrip image viewer 135. For example, the filmstrip image viewer 135 may be able to display up to 10 images, while the full set of overhead images available for display is more than 10. Thus, in some embodiments, and as illustrated in
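One simple way a filmstrip with a fixed number of visible slots could page through a larger result set is sketched below; the ten-slot capacity is an assumption carried over from the example above, and the helper names are hypothetical.

```typescript
// Hypothetical paging helper for a filmstrip with a fixed number of visible slots.
const SLOTS_PER_PAGE = 10; // assumed capacity; the actual number could vary

function filmstripPage<T>(allImages: T[], pageIndex: number): T[] {
  const start = pageIndex * SLOTS_PER_PAGE;
  return allImages.slice(start, start + SLOTS_PER_PAGE);
}

function pageCount(totalImages: number): number {
  return Math.max(1, Math.ceil(totalImages / SLOTS_PER_PAGE));
}

// Example: 23 matching images span 3 pages; page index 2 holds images 20..22.
const ids = Array.from({ length: 23 }, (_, i) => `img-${i}`);
console.log(pageCount(ids.length));  // 3
console.log(filmstripPage(ids, 2));  // ["img-20", "img-21", "img-22"]
```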
In the illustrated embodiment, the filmstrip image viewer 135 is displayed at the bottom of the page. However, the filmstrip image viewer 135 could also be displayed, for example, across the top of the page, or arranged vertically, or in any other variation. Further, multiple rows or columns of images could be displayed.
In
In some embodiments, the selected overhead image 226 may be significantly larger or smaller than the original geographic region of interest displayed in the user interface of
In some implementations, a still image can be displayed using a pyramid resolution segmentation approach in which a hierarchical set of image “tiles” is used to display images of a region of interest. Each tile can have the same number of pixels; however, the spatial resolution of each pixel can increase with each successive pyramid level. In this approach, regardless of the spatial resolution at which a user views the image, the bandwidth used to deliver the image data to the user remains substantially constant. Overhead images may be ortho-rectified to correct for, e.g., topographic relief, imaging system distortion, tilt of the overhead imaging platform or sensor relative to the region of interest, etc. Ortho-rectified images can roughly correspond to the image that would be observed if the imaging sensor were directly above the region of interest and “looking” straight down. In some implementations, the user interface can provide ortho-rectified video using a pyramid segmentation approach with settings to account for different user system communication bandwidths.
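The arithmetic behind such a tile pyramid can be illustrated with a short sketch. The 256-pixel tile size, the base ground sample distance, and the per-level doubling are assumptions chosen for the example rather than values from this disclosure; the point is that a fixed screen area always needs roughly the same number of equally sized tiles, whichever pyramid level is displayed.

```typescript
// Hypothetical tile-pyramid arithmetic: every tile holds the same number of
// pixels, while the ground distance covered by each pixel halves at each finer
// level, so the data needed to fill a fixed screen area stays roughly constant.

const TILE_SIZE_PX = 256;    // assumed tile edge, in pixels
const BASE_GSD_METERS = 64;  // assumed ground sample distance at the coarsest level

function groundSampleDistance(level: number): number {
  return BASE_GSD_METERS / 2 ** level;  // meters per pixel at this pyramid level
}

function tileGroundExtentMeters(level: number): number {
  return TILE_SIZE_PX * groundSampleDistance(level);  // ground width of one tile
}

// A viewer filling a 1024x768 window needs about the same number of tiles at
// any level, which is what keeps the delivery bandwidth substantially constant.
function tilesForViewport(widthPx: number, heightPx: number): number {
  return Math.ceil(widthPx / TILE_SIZE_PX) * Math.ceil(heightPx / TILE_SIZE_PX);
}

for (let level = 0; level <= 3; level++) {
  console.log(level, groundSampleDistance(level), tileGroundExtentMeters(level));
}
console.log(tilesForViewport(1024, 768));  // 12 tiles, regardless of level
```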
As further depicted in
When the user clicks on the “i” icon 245, the user may be presented with a popup box displaying various detail information about the selected overhead image 226. For example, the information displayed may include degree of cloud cover, degree of blur, panchromatic resolution of the image, spectral or multispectral resolution, spectral bandwidth, sun azimuth, sun elevation, platform, date the image was captured, and the name of the image source provider. In some embodiments, the user may be presented with additional options or features not depicted in
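For illustration only, the kind of detail record that might back such a popup is sketched below; the field names and units are hypothetical and simply mirror the example attributes listed above.

```typescript
// Hypothetical shape of the detail record that could back the "i" popup.
interface OverheadImageDetail {
  cloudCoverPercent: number;            // degree of cloud cover
  blurIndex: number;                    // degree of blur
  panchromaticResolutionM: number;      // meters per pixel, panchromatic band
  multispectralResolutionM: number;     // meters per pixel, multispectral bands
  spectralBandwidthNm: [number, number]; // wavelength range, nanometers
  sunAzimuthDeg: number;
  sunElevationDeg: number;
  platform: string;                     // satellite or aircraft identifier
  captureDate: string;                  // ISO 8601 date the image was captured
  sourceProvider: string;               // name of the image source provider
}

function formatDetailPopup(d: OverheadImageDetail): string {
  return [
    `Cloud cover: ${d.cloudCoverPercent}%`,
    `Sun azimuth/elevation: ${d.sunAzimuthDeg}° / ${d.sunElevationDeg}°`,
    `Captured: ${d.captureDate} (${d.platform}, ${d.sourceProvider})`,
  ].join("\n");
}
```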
In some embodiments of the user interface depicted in
Also depicted in
The user interface depicted in
Other filter criteria not depicted in
In addition to the example user interfaces depicted in
Examples of Methods Performed by Overhead Image Viewing Systems
Once the interactive system 700 receives the user input, at block 410 the interactive system 700 selects a set of overhead images which correspond to the geographic region of interest. The overhead images may be stored, for example, in data source(s) 770, or cached locally in a mass storage device 720 for quicker retrieval. The overhead images may include metadata, such as a shape file providing geolocation information that enables a search to match an image to the geographic region of interest. Other metadata associated with an overhead image may include, for example, cloud cover, slant angle, image resolution, the date the image was taken, lighting conditions, collection geometry, other data used to identify and describe conditions under which the image was taken, average user ratings, vote tallies, user comments and/or feedback, and tags.
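A simplified sketch of the selection at block 410 follows, treating each image's shape-file footprint as an axis-aligned bounding box for brevity (a real footprint could be an arbitrary polygon); the names are hypothetical.

```typescript
// Hypothetical selection step for block 410: keep the catalog images whose
// footprints overlap the geographic region of interest.
type BBox = { west: number; south: number; east: number; north: number };

interface StoredImage {
  id: string;
  footprint: BBox;          // derived from the image's shape-file metadata
  cloudCoverPercent: number;
  captureDate: string;      // ISO 8601
  tags?: string[];
}

function overlaps(a: BBox, b: BBox): boolean {
  return a.west < b.east && a.east > b.west && a.south < b.north && a.north > b.south;
}

function selectImagesForRegion(catalog: StoredImage[], region: BBox): StoredImage[] {
  return catalog.filter(img => overlaps(img.footprint, region));
}
```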
At block 415, the interactive system 700 generates a filmstrip view of the selected overhead images matching the geographic region of interest. The filmstrip view may include a limited number or subset of the selected images, depending on factors such as the screen resolution of the user computing system 722, the file size of the images, and the speed of the network 760 connecting the interactive system 700 to the user computing system 722. The system 700 may provide a graphic, icon, drop-down box, or the like to allow the user to choose whether the filmstrip view displays still images, videos, or both still images and videos. The interactive system 700 may also generate smaller thumbnail versions of the selected images for the filmstrip view. In some embodiments, the selected images may be arranged in the filmstrip according to parameters, such as by date, degree of cloud cover, by type (e.g., still image or video), etc. The arrangement parameters may be user-selected or determined by the interactive system.
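For illustration, the arrangement step might be realized with a parameterized comparator along the following lines; the sort keys shown are examples drawn from the parameters mentioned above, and the names are hypothetical.

```typescript
// Hypothetical ordering of filmstrip entries by a user-selected parameter.
type SortKey = "date" | "cloudCover" | "type";

interface FilmstripEntry {
  captureDate: string;       // ISO 8601, so it sorts lexicographically
  cloudCoverPercent: number;
  kind: "still" | "video";
}

function arrangeFilmstrip(entries: FilmstripEntry[], key: SortKey): FilmstripEntry[] {
  const sorted = [...entries];
  switch (key) {
    case "date":
      sorted.sort((a, b) => a.captureDate.localeCompare(b.captureDate));
      break;
    case "cloudCover":
      sorted.sort((a, b) => a.cloudCoverPercent - b.cloudCoverPercent);
      break;
    case "type":
      sorted.sort((a, b) => a.kind.localeCompare(b.kind)); // stills before videos
      break;
  }
  return sorted;
}
```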
At block 420, the interactive system 700 incorporates the generated filmstrip view with the currently selected geographic region of interest into the geographic region view page 100. For example, as shown in
Beginning at block 505, the interactive system 700 receives user input specifying a selected overhead image of interest. For example, the user input may be provided manually if/when a user requests a particular overhead image by clicking on a thumbnail presented in a filmstrip image viewer 135 of
At block 515, the interactive system 700, using the associated geolocation information, determines the geographic region corresponding to the selected overhead image. At block 520, the interactive system 700 generates a page displaying the selected overhead image superimposed over the corresponding geographic region. The page may also include various user interface elements to enable further browsing of images and/or image detail for the selected overhead image. At block 525, the image viewing page 100 is transmitted by the interactive system 700 to the requesting user computing system 722 for display.
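One possible way the geolocation information could be used to place the selected image over the corresponding map region is sketched below. A simple equirectangular mapping is assumed for brevity; an actual map display would more likely go through Web Mercator or the map library's own projection, and the names are hypothetical.

```typescript
// Hypothetical placement of the selected image over the map at blocks 515-520,
// using a simple equirectangular mapping from geographic to screen coordinates.
type BBox = { west: number; south: number; east: number; north: number };

interface PixelRect { left: number; top: number; width: number; height: number }

function placeOverlay(
  imageFootprint: BBox,   // region covered by the selected overhead image
  viewport: BBox,         // region currently shown by the map
  viewWidthPx: number,
  viewHeightPx: number
): PixelRect {
  const xScale = viewWidthPx / (viewport.east - viewport.west);
  const yScale = viewHeightPx / (viewport.north - viewport.south);
  return {
    left: (imageFootprint.west - viewport.west) * xScale,
    top: (viewport.north - imageFootprint.north) * yScale, // screen y grows downward
    width: (imageFootprint.east - imageFootprint.west) * xScale,
    height: (imageFootprint.north - imageFootprint.south) * yScale,
  };
}
```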
Beginning at block 605, the interactive system 700 receives user input specifying filter criteria to narrow or broaden a search for a set of overhead images which correspond to a geographic region of interest. The filter criteria may include, for example, the filter criteria displayed to the user in the user interface of
Once the interactive system 700 receives the user input, at block 610 the interactive system 700 retrieves a set of overhead images from, for example, data source(s) 770 or a local cache in a mass storage device 720 for quicker retrieval, where the retrieved set includes overhead images that meet the filter criteria and correspond to the geographic region of interest.
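For illustration, the filter applied at block 610 might be expressed as a predicate over per-image metadata, as in the following sketch; the specific criteria fields are examples drawn from the metadata discussed earlier, not a definitive schema.

```typescript
// Hypothetical filter applied at block 610 on top of the region match.
interface CatalogImage {
  cloudCoverPercent: number;
  resolutionMetersPerPixel: number;
  captureDate: string;             // ISO 8601
  kind: "still" | "video";
}

interface FilterCriteria {
  maxCloudCoverPercent?: number;
  maxResolutionMetersPerPixel?: number; // i.e., at least this sharp
  capturedAfter?: string;
  capturedBefore?: string;
  kinds?: Array<"still" | "video">;
}

function matchesCriteria(img: CatalogImage, c: FilterCriteria): boolean {
  if (c.maxCloudCoverPercent !== undefined && img.cloudCoverPercent > c.maxCloudCoverPercent) return false;
  if (c.maxResolutionMetersPerPixel !== undefined &&
      img.resolutionMetersPerPixel > c.maxResolutionMetersPerPixel) return false;
  if (c.capturedAfter !== undefined && img.captureDate < c.capturedAfter) return false;
  if (c.capturedBefore !== undefined && img.captureDate > c.capturedBefore) return false;
  if (c.kinds !== undefined && !c.kinds.includes(img.kind)) return false;
  return true;
}
```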
At block 615, the interactive system 700 generates a filmstrip view of the selected overhead images. The filmstrip view may include a limited number or subset of the selected images, depending on factors such as the screen resolution of the user computing system 722, the file size of the images, the speed of the network 760 connecting the interactive system 700 to the user computing system 722, and whether still images and/or videos are selected for display by the user. The interactive system 700 may also generate smaller thumbnail versions of the selected images for the filmstrip view. In some embodiments, the selected images may be arranged in the filmstrip according to parameters, such as by date, degree of cloud cover, etc. The arrangement parameters may be user-selected or determined by the interactive system 700.
At block 620, the interactive system 700 incorporates the generated filmstrip view with the currently selected geographic region of interest into the geographic region view page 100. At block 630, the image viewing page 100 is transmitted by the interactive system 700 to the requesting user computing system 722 for display.
In the illustrative processes described in
Examples of Overhead Image Viewing Systems
In one embodiment, the interactive system 700 comprises an image view module 790 that carries out the functions, methods, and/or processes described herein (including embodiments of the processes 400, 500, and 600). The image view module 790 may be executed on the interactive system 700 by a central processing unit 750 discussed further below.
Computing System Components
In one embodiment, the processes, systems, and methods illustrated above may be embodied in part or in whole in software that is executed on a computing system. The functionality provided for in the components and modules of the computing system may be provided by one or more components and/or modules. For example, the computing system may comprise multiple central processing units (CPUs) and mass storage devices, such as may be implemented in an array of physical servers.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, C or C++, or the like. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, Lua, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
In one embodiment, the interactive system 700 also comprises a mainframe computer suitable for controlling and/or communicating with large databases, performing high volume transaction processing, and generating reports from large databases. The interactive system 700 also comprises a central processing unit (“CPU”) 750, which may comprise a conventional microprocessor. The interactive system 700 further comprises a memory 730, such as random access memory (“RAM”) for temporary storage of information and/or a read only memory (“ROM”) for permanent storage of information, and a mass storage device 720, such as a hard drive, diskette, optical media storage device, non-volatile computer storage (e.g., flash memory), and so forth. Typically, the modules of the interactive system 700 are connected to the computer using a standards-based bus system. In different embodiments, the standards-based bus system could be Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industry Standard Architecture (ISA), or Extended ISA (EISA) architectures, for example.
Computing System Device/Operating System
The interactive system 700 may include a variety of computing devices, such as, for example, a server, a Windows server, a Structured Query Language server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a cell phone, a personal digital assistant, a kiosk, an audio player, and so forth. The interactive system 700 is generally controlled and coordinated by operating system software, such as z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Linux, BSD, SunOS, Solaris, or other compatible operating systems. On Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the interactive system 700 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.
Network
In the embodiment of
Access to the image view module 790 of the interactive system 700 by user computing systems 722 and/or by data sources 770 may be through a web-enabled user access point, such as a personal computer, cellular phone, laptop, tablet, or other device of the user computing systems 722 or data sources 770 capable of connecting to the network 760. Such a device may have a browser module implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via the network 760.
The browser module or other output module may be implemented as a combination of an all points addressable display such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, or other types and/or combinations of displays. In addition, the browser module or other output module may be implemented to communicate with input devices and may also comprise software with the appropriate interfaces which allow a user to access data through the use of stylized screen elements such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth). Furthermore, the browser module or other output module may communicate with a set of input and output devices to receive signals from the user.
The input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons. The output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer. In addition, a touch screen may act as a hybrid input/output device. In another embodiment, a user may interact with the system more directly, such as through a system terminal connected to the interactive system without communications over the Internet, a WAN, a LAN, or a similar network.
In some embodiments, the system 700 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases on-line in real time. The remote microprocessor may be operated by an entity operating the computer system 700, including the client server systems or the main server system, and/or may be operated by one or more of the data sources 770 and/or one or more of the computing systems. In some embodiments, terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.
In some embodiments, user computing systems 722 that are internal to an entity operating the computer system 700 may access the image view module 790 as an application or process run by the CPU 750.
Other Systems
In addition to the systems that are illustrated in
In some embodiments, the acts, methods, and processes described herein are implemented within, or using, software modules (programs) that are executed by one or more general purpose computers. The software modules may be stored on or within any suitable computer-readable medium, including computer-generated signals or non-transitory computer storage. It should be understood that the various steps or blocks may alternatively be implemented in whole or in part within specially designed hardware.
As will be apparent, many variations on the interactive system 700 described above are possible. For example, in one embodiment, the interactive system 700 may be configured to store, in the data source(s) 770 and/or the mass storage device 720, the user's searches for later re-use or retrieval. In another embodiment, the image viewing interfaces 100, 200 may be configured to display a “heat map” of overhead images corresponding to a geographic region of interest, such that the heat map represents the density of overhead images per area or location in the geographic region of interest. For example, the heat map may indicate a color (e.g. red) for a higher density of overhead images available for a region and another color (e.g. blue) for a lower density of overhead images available for the region. The heat map may, additionally or alternatively, include graphics (e.g., circles) whose size (e.g., diameter) corresponds to density (e.g., larger circle diameter corresponds to larger density).
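For illustration, the per-area density behind such a heat map could be computed with a simple grid count over image footprint centers, as sketched below; the grid resolution and the two-bucket color scheme are assumptions made for the example.

```typescript
// Hypothetical density grid for a heat map of available overhead images.
interface ImagePoint { lon: number; lat: number }   // e.g., image footprint centers
type Region = { west: number; south: number; east: number; north: number };

function densityGrid(points: ImagePoint[], region: Region, cols: number, rows: number): number[][] {
  const grid: number[][] = Array.from({ length: rows }, () => new Array<number>(cols).fill(0));
  const cellW = (region.east - region.west) / cols;
  const cellH = (region.north - region.south) / rows;
  for (const p of points) {
    const col = Math.floor((p.lon - region.west) / cellW);
    const row = Math.floor((region.north - p.lat) / cellH);
    if (col >= 0 && col < cols && row >= 0 && row < rows) grid[row][col]++;
  }
  return grid;
}

// A cell count can then drive either a color (e.g., blue for sparse, red for
// dense) or the diameter of a circle drawn over that cell.
function colorFor(count: number, maxCount: number): string {
  return count > 0.5 * maxCount ? "red" : "blue";   // assumed two-bucket scheme
}
```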
The various components shown in
The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method, event, state, or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described tasks or events may be performed in an order other than that specifically disclosed, or multiple tasks or events may be combined in a single block or state. The example tasks or events may be performed in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to convey that an item, term, etc. may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
While certain example embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions disclosed herein. Thus, nothing in the foregoing description is intended to imply that any particular feature, characteristic, step, module, or block is necessary or indispensable. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions disclosed herein.
This application claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Application No. 61/642,766, filed May 4, 2012, and U.S. Provisional Application No. 61/689,794, filed Jun. 12, 2012, both entitled OVERHEAD IMAGE VIEWING SYSTEMS AND METHODS, the disclosures of which are hereby incorporated by reference in their entireties.