The following relates generally to systems and methods of Earth observation, and can be applied to observing other planetary objects.
Aerial imaging systems are becoming more popular as users wish to obtain images and video of geography and landscapes. For example, helicopters, airplanes and other aircraft are equipped with cameras to obtain aerial images of cities, forests, or other specific locations requested by a customer. Such systems are often limited by the flight time of the aircraft, and the data is often very specific to a customer's request (e.g. surveying forests for forest fires, surveying a city for roads, or surveying land to inspect power lines).
Some satellite spacecraft are equipped with cameras to obtain imagery of the Earth. The images are sent from the satellite to a ground station on Earth, and the images are processed and sent to the customer. The satellite typically acquires a select or limited number of images targeting very specific areas of interest and at very specific times, as requested by a specific customer (e.g. weather companies, land development companies, security and defense organizations, etc.). The general public typically does not have access to the images.
Embodiments will now be described by way of example only with reference to the appended drawings wherein:
It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the example embodiments described herein. However, it will be understood by those of ordinary skill in the art that the example embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the example embodiments described herein. Also, the description is not to be considered as limiting the scope of the example embodiments described herein.
It is recognized herein that there are a growing number of users who wish to consume or view imagery of the Earth, and that the users may vary. Non-limiting examples of users include the general public, consumer companies, advertising companies, social data networks, governments, security organizations, shipping companies, environmental companies, forestry organizations, etc. Providing images to these different types of users can be difficult in terms of acquiring the images and in terms of distributing the images.
It is also recognized that images from a single image provider may not be sufficient to meet the requests of customers, and that additional imagery data from other providers may be advantageous.
It is also recognized herein that users may find it difficult to know, in advance, that they would like imagery captured for a certain time period. For example, a user may realize only after an event has occurred that they would like imagery documenting the event. However, the imagery may not be available, because the user did not give advance instructions to the aircraft operator or satellite operator to obtain the imagery for the event. Moreover, even if a user knows about an event ahead of time, in many cases the user is not able to give instructions to the aircraft or satellite operators with sufficient lead time before the event.
It is also recognized herein that acquiring imagery data in an efficient manner, distributing the imagery data over computer networks, and storing the imagery data in memory so that it can be meaningfully searched later, can all be difficult.
The systems and methods described herein allow an imaging system from a spacecraft or aircraft to acquire imagery data of Earth, transmit the data to ground stations, process the data, and distribute the data over a Web platform to users. These systems and methods attempt to address at least one of the above recognized problems. It can be appreciated that there are other problems recognized herein, which may be explicitly stated or implicitly stated by describing the systems and the methods.
In an example embodiment, the system includes a high resolution camera (HRC) which captures high resolution (HR) imagery and a medium resolution camera (MRC) which captures medium resolution (MR) imagery. Further details about the imaging system are described below.
System Overview
Below is a table, i.e. Table 1, providing example meanings of terms used throughout this document. It can be appreciated that other terms may be used in this document, which are not in the below table.
Turning to
Although Earth is used as an example in this document, the principles described herein also apply to remote sensing operations for other planetary objects. Non-limiting examples include asteroids, meteors, Mars, the Moon, the Sun, etc.
It can be appreciated that spacecraft 100 and aircraft 101 orbit or fly at a distance above the Earth's surface to capture a larger area of the Earth's surface. It can also be appreciated that the principles described herein are described with respect to spacecraft, but the principles also apply to aircraft.
Turning to
It will be appreciated that although the principles described herein apply to aircraft and spacecraft, it is recognized that a spacecraft 100 is able to orbit the Earth. In other words, a spacecraft is able to cover vast distances of the Earth very quickly, compared to an aircraft, and the spacecraft is able to stay positioned above the Earth for extended periods of time, compared to the aircraft.
It will also be appreciated that although cameras and imaging systems are often described herein to observe the Earth, other types of sensors can be used to observe the Earth. Many of the principles described herein also apply to different types of sensors. Non-limiting examples of other types of sensors that can be used to observe the Earth include LiDAR, RADAR, infrared sensors, and Synthetic Aperture RADAR (SAR). Other types of remote sensing technology may also be used.
Turning to
In an example embodiment, the ground stations are in communication with each other. Turning to
The space segment 501 includes a Medium Resolution Camera (MRC) 502. The MRC includes a Medium Resolution Telescope (MRT) 503, a data compression unit (M-DCU) 504, and structure and thermal components 505. The space segment also includes a High Resolution Camera (HRC) 506, which includes a High Resolution Telescope (HRT) 507, a data compression unit (H-DCU) 508, gyroscopes (GYU) 509, and structure and thermal components 510. The space segment also includes a star tracker unit assembly (STUA) 511 and a Data Handling Unit (DHU) 512.
The ground segment 513 includes the following systems, components and modules: an order management system (OMS) 514, a processing system 515, an archiving system 516, a calibration system 517, a control and planning system (CPS) 518, a ground station network 519 (which comprises the ground stations 300 and the network 400), an orbit and attitude system (OAS) 520, a health monitoring system (HMS) 521, a data hub 522, network and communications 523, a Web platform 524, a Web data storage system and content delivery network (CDN) 525, a product delivery system (PDS) 526, and a financial and account system 527. The systems, components and modules described in the ground segment are implemented using server systems and software modules.
The operation segment 528 includes operation facilities 529, which are located at different locations and at the ground stations 300, and an operations team 530.
The observation system 500 may also include or interact with external systems 540, such as public users 541, third party applications 542, customers and distributors 543, external data providers 544, community-sourced data providers 545, and auxiliary data providers 546.
More generally, the space segment 501 includes camera systems installed on the International Space Station (ISS), or some other spacecraft. For example, the MRC 502 provides a medium resolution swath image of the Earth that is approximately 50 km across. The HRC 506 captures true video data, for example at approximately 3 frames/sec, having an area of approximately 5 km by 3.5 km for each image. Other cameras are mounted inside or outside the ISS, looking out the windows.
Some high level operational scenarios are summarized below.
In an example operation scenario, the system acquires image and video data and makes it available on the Web Platform 524 (e.g. a Website or application accessed using the Internet). This includes ongoing collection and sufficient time to build up archives of a significant portion of the Earth. This involves very large data volumes. The benefits to users include constantly updating imagery. Image data is acquired to cover the accessible part of the Earth, with higher priority and quality given to areas of greater user interest. Image data, such as video data and high resolution imagery from the HRC, is acquired for specific areas of interest based on predictions from the system 500 and from input from users.
In another example operation scenario, the Web Platform 524 provides a user experience that incorporates continually refreshed and updated data. The system is able to publish the remote sensing data (e.g. imagery) to users in near real time. Users (e.g. public users 541) will be able to interact with the platform and schedule outdoor events around the time when the events will be viewable by the cameras. The Web Platform will also integrate currently known and future known social media platforms (e.g. Twitter, Facebook, Pinterest, etc.), allowing for a fully geo-located environment with Earth video content. In addition, the API will be open source, allowing developers to create their own educational, environmental, and commercially focused applications.
In another example operation scenario, customers and distributors interact with the systems to submit requests. Requests may be for Earth observation data (e.g. both existing and not-yet-acquired data) and for value-added information services.
In another example operation scenario, an online platform is provided that incorporates components of various currently known and future known online stores (e.g. Amazon.com, the Apple AppStore, Facebook, etc.). The online platform or online store allows consumers to search and purchase software applications developed and uploaded by third party developers. The applications have access to the images obtained by the Earth observation system 500, including images obtained by external systems 540.
Turning to
The space segment 501 includes the Biaxial Pointing Platform (BPP) 605, the On-board Memory Unit (OMU) 610, the TC1-S computer 611, the time synchronization signal generation 609, Internal Camera Equipment (ICE) 608, the Data Transmission Radio Engineering System (DTRES) 607 which is the X-band downlink transmitter, and the on-board S-band telemetry system 606 that is used to receive the command files and transmit real-time telemetry to the Mission Control Centre.
The TC1-S 611 is configured to receive a set of commands used for imaging and downlinking in an Operational Command File (OCF). OCFs are configured to be uplinked through the S-band telemetry system to the TC1-S 611. The TC1-S 611 checks the OCF and then sends the OCF to the DHU 512 which controls the cameras.
Image data, video data, ancillary data, telemetry data, and log data is collected by the Data Handling Unit 512 and then transferred to the OMU 610. This data is then transferred from the OMU 610 to the DTRES 607. The DTRES 607 downlinks this data to ground stations 300 around the Earth.
The Internal Camera Equipment (ICE) 608 is used to provide imagery that is in addition to the MRC and HRC. The ICE includes, for example, a video camera pointed out of a viewing port to observe the Earth's limb (e.g. camera 202), and a still-image camera pointed out of a different viewing port along nadir, or as near to nadir as is possible. The cameras have a USB interface that can be used to transfer the data from the cameras into the DHU 512 to be subsequently downlinked. It will be appreciated that certain components (e.g. 512, 608, 609, 610, 611) are located inside the spacecraft 100 and other components are located outside the spacecraft.
Continuing with
The main elements of the MRC 502 are the Medium Resolution Telescope (MRT) 503, which includes the focal plane and associated electronics, the Data Compression Unit (M-DCU) 504, the structure and thermal enclosure 505, and the corresponding cable harnesses and a connector box.
In an example embodiment, the MRT 503 is a fixed pointing ‘push broom’ imaging system with four linear CCD arrays providing images in four separate spectral bands. For example, the images will have a Ground Sampling Distance (GSD) of approximately 5.4 m×6.2 m and will cover a swath of 47.4 km (at 350 km altitude).
The data from the MRT 503 is fed into the M-DCU 504 which uses a compression process (e.g. JPEG2000 or JPEG2K) to compress the data stream in real-time and then transmit the compressed image data to the DHU. In addition to performing the data compression, the M-DCU 504 is also the main interface to the DHU 512 for controlling the camera. It gathers camera telemetry to be put into log files that are downlinked with the imagery, sets up the MRT 503 for each imaging session (e.g. sets the integration time), and performs the operational thermal control.
The MRC 502 is able to take continuous, or near continuous, images of the Earth, producing long image strips. The image strips will be segmented so that each segment has a given set of parameters (i.e., compression ratio and integration time). Each image strip segment, made up of all 4 spectral bands, is referred to as an “Image Take” (IT). In some cases, there may be a very small gap between Image Takes whenever a control parameter such as compression ratio or integration time is changed.
The imagery is divided into "frames", each of which is JPEG2000 compressed and downlinked as a stream of J2K files. Other compression protocols and data formats may be used.
In an example embodiment, the integration time is varied in a series of steps over the course of the orbit, adjusting for the solar illumination level, including night imaging. The compression ratio may also be varied over the course of the orbit, according to the scene content. Images of the land with reasonable solar illumination levels may be acquired with relatively low compression ratios, yielding high quality products. Images of the ocean and land with low solar illumination levels, and all images at night may be acquired with higher compression ratios with little perceptible losses since they have much lower spatially varying content.
An along-track separation of the bands can occur because the linear CCD arrays are mounted on a common focal plane, but spatially offset with respect to the camera bore sight. The image take data collected by the individual spectral bands of the MRC are acquired at the same time, but are not geo-spatially aligned. In a particular example, the NIR-band (leading band) will record a scene 6 to 7 seconds before the red-band (trailing band). This temporal separation will also cause a cross-track band-to-band separation due to the fact that the Earth has rotated during this period.
The along-track and cross-track band-to-band spatial and temporal separations in the image take data sets are typical of push broom image data collection, and will be compensated for by the image processing performed on the ground by the processing system 515 when making the multi-band image products.
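By way of example only, the following Python sketch estimates the cross-track band-to-band offset caused by the Earth's rotation during the 6 to 7 second band separation described above. The function name and the use of the equatorial surface rotation speed are illustrative assumptions; the actual compensation performed by the processing system 515 is more involved.

```python
import math

EARTH_ROTATION_SPEED_EQUATOR_M_S = 465.1  # surface speed at the equator

def cross_track_band_offset_m(band_separation_s: float, latitude_deg: float) -> float:
    """Approximate cross-track shift between leading and trailing bands
    caused by Earth's rotation during the band-to-band time separation."""
    surface_speed = EARTH_ROTATION_SPEED_EQUATOR_M_S * math.cos(math.radians(latitude_deg))
    return surface_speed * band_separation_s

# Example: 6.5 s NIR-to-red separation over a scene at 45 degrees latitude.
offset = cross_track_band_offset_m(6.5, 45.0)
print(f"Approximate cross-track offset: {offset / 1000:.2f} km")
```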
Continuing with
In an example embodiment, the HRT 507 is configured to produce full frame RGB video at a rate of 3 frames per second. Throughout the system, the HRT video data is largely treated as a time series of independent images, both by the HRC 506 and the processing system 515.
In an example embodiment, the HRT 507 is a large aperture reflective (i.e. uses mirrors) telescope which also includes a refractive element. The HRT also includes a Bayer filter and a two-dimensional, 14 Megapixel CMOS RGB imaging sensor on the focal plane. In an example embodiment, the image area on the ground is 5 km×3.3 km with a GSD of 1.1 m when the space craft is at an altitude of 350 km.
The data from the HRT 507 is fed into the H-DCU 508, which compresses the data stream in real-time and then transmits the compressed image data to the DHU 512. In addition to performing the data compression, the H-DCU 508 is also the main interface to the DHU for controlling the camera. The H-DCU 508 gathers camera telemetry to be put into log files that are downlinked with the imagery, sets up the HRT for each imaging session (e.g., sets the integration time), and performs the operational thermal control.
The imagery is divided into "frames", each of which is JPEG2000 compressed and downlinked as a stream of J2K files. Like the MRC, the integration time for the HRC will be appropriately selected for the solar illumination level, including night imaging. The compression ratio will also be selected according to the scene content. Videos of the land with reasonable solar illumination levels will be acquired with relatively low compression ratios, yielding high quality products. Videos of the ocean and land with low solar illumination levels, and all videos at night, will be acquired with higher compression ratios with little perceptible loss, since they have much lower spatially varying content.
The HRC 506 is mounted to a two-axis steerable platform (e.g. the Biaxial Pointing Platform—BPP). The BPP 605 is capable of pointing the camera's boresight at a fixed point on the ground and maintaining tracking of the ground target. For example, the BPP will rotate the camera to continuously point at the same target for a few minutes while the spacecraft is moving. A 3-axis gyro system 509 is also included in the HRC 506 that measures the angular rates at high frequency. The system 509 sends this angular data to the DHU 512 to be downlinked as ancillary data. This angular data is used in the image processing on the ground to improve the image quality.
Collection of a single video over a selected ground target is referred to as a “Video Take” (VT). A ground target may be a single point where all frames are centered on this one point, a 2D grid of points where a fixed number (e.g. 1-5) of frames is centered on each of the points in a serpentine sequence (resulting in a quilt-like pattern that covers a larger area), or a slowly varying series of points forming a ground track (following along a river, for example).
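By way of example only, the three ground-target types described above could be represented with simple data structures, as in the following Python sketch. The class and field names are hypothetical and are used for illustration only.

```python
from dataclasses import dataclass
from typing import List, Tuple

# A ground coordinate as (latitude_deg, longitude_deg, altitude_m).
Coordinate = Tuple[float, float, float]

@dataclass
class SinglePointTarget:
    """All frames of the Video Take are centered on one ground point."""
    center: Coordinate

@dataclass
class GridTarget:
    """A 2D grid of points imaged in a serpentine sequence,
    with a fixed number of frames per point (e.g. 1-5)."""
    points: List[Coordinate]
    frames_per_point: int = 1

@dataclass
class GroundTrackTarget:
    """A slowly varying series of points following a path on the ground."""
    path: List[Coordinate]
```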
Continuing with
The DHU 512 interfaces to a terminal computer 611. The terminal computer 611 receives the OCFs uplinked from mission control and transfers these files to the DHU 512 as well as inputs to ancillary data files and log files. The DHU 512 and the terminal computer 611 execute the time tagged commands listed in the OCF using their own internal clocks. The clocks are synchronized by use of a GPS-derived time synchronization signal (Pulse Per Second—PPS) to ensure that commands executed by both the DHU and the terminal computer are coordinated. The DHU also sends this same PPS signal to the Gyro Unit 509 in the HRC and to the Star Tracker Assembly Unit 511 so that the angular rate data and attitude data are also time synchronized to the commanding of the system.
Prior to each downlink, the DHU 512 sends the image and video data files to be downlinked, as well as the associated ancillary data and log files to the OMU 610 which then sends the data to the DTRES 607 for downlinking to a ground station 300.
Continuing with
Elements of the Star Tracker Unit Assembly (STUA) 511 include the Power and Interface Control Unit (PICU) 601, and two Star Tracker Heads 602, 603 (e.g. each pointed in a different direction). The STUA 511 also includes structural and thermal elements 604, such as a baseplate, secondary structural items (e.g., brackets), a thermal system (e.g. heaters, multi-layer insulation), and the associated cabling. The PICU 601 interfaces directly to the terminal computer 611 to provide the terminal computer 611 with the real-time localized spacecraft attitude data that may be used to control the BPP 605.
Turning to
As best shown in
As shown in
Public Users (541): General public users can use the Web, internet, and mobile interfaces to look at imagery, video, and other information and to also contribute their own inputs.
Third Party Applications (542): Applications developed by third parties are configured to interact with the earth observation system's Internet services and resources via an application programming interface (API). The applications are expected to support mobile devices.
Customers/Distributors (543): Customers are users that place orders for new collections or for specifically generated image and data products.
External Data Providers (544): In addition to the data acquired from the spacecraft 100, the ground segment of the earth observation system is configured to acquire imagery, video, and other data from External Data Providers such as other satellite data suppliers.
Community Sourced Data Providers (545): Data, including image and video, may also be obtained from the general public.
Auxiliary Data Providers (546): Auxiliary Data Providers provide supporting data such as Digital Elevation Models (DEMs), Ground Control Points (GCPs), Maps, and ground truth data, to the Earth observation system, such as the calibration system 517.
The Earth observation system includes a number of components, such as the Web platform 524. The Web platform 524 provides a Web interface to the general public. It includes capabilities to: browse and view imagery, videos and other geographic data; contribute additional information and social inputs; and accept requests for future data collection activities.
The Web Data Storage & Content Delivery Network (Web DS & CDN) 525 includes cloud infrastructure that is used to store the Web image data, video data, and community-sourced data, and distribute the data around the world using a Content Delivery Network (CDN) service.
The earth observation system also includes a Product Delivery System (PDS) 526. The PDS includes online storage that is used to serve up Products for retrieval by Customers/Distributors.
The Order Management System (OMS) 514 accepts orders for products and services and manages the fulfillment of those orders. The OMS is configured to task the CPS 518 for new acquisitions and the Processing System 515 for processing. Orders are tracked and feedback is provided to users.
The Control and Planning System (CPS) 518 is configured to provide the following functionality: assess the feasibility of future acquisitions; re-plan future acquisitions and downlinks to assess and adjust the feasibility of the overall collection plan for an upcoming time period; and, based on a resource model and updated resource status received from the mission control center (MCC) 530 and the ground station network (GSN) 519, create plans and command files for onboard activities including imaging and downlinks, and tasks for the GSN 519.
The Accounting & Financial, Billing and Customer Management Systems 527 are the general systems that are used to manage the sales and monetary funds of the image data and imaging services.
The Archiving System 516 archives the raw MRC & HRC image and video take data and associated ancillary data.
The Processing System 515 performs several functions. In an example embodiment, the processing system 515 processes the raw camera data to create image tiles (i.e. map tiles), near real-time live feed tiles, and video files for the Web platform 524. This includes, for example, additional compression and other degradation (e.g. adding watermarks) to differentiate this data from the data that is sold to Customers/Distributors 543.
The processing system 515 also processes the data received from External Data Providers 544 and community-sourced data providers 545 to create image tiles and video files for the Web platform 524.
The processing system 515 also processes the raw MRC and HRC data to generate the image products and video products for the Customers/Distributors 543. In an example embodiment, the data for the customers/distributors 543 is of higher quality compared to the data provided on the Web platform 524. In this way, data presented on the Web platform 524 can be more easily displayed and consumed by lower power user devices, like tablets, mobile devices and laptops.
The Calibration system 517 monitors the image quality performance of the system and generates updated parameters for use in the rest of the system. This includes creating HRC and MRC radiometric and geometric correction tables that will be provided to the Processing system 515. The correction tables may include gains and offsets for the radiometric correction, misalignment angles, and optical distortion coefficients for the geometric correction. The Calibration system 517 also includes automated functions to monitor the characteristics of the HRC and MRC and, when necessary, perform updates to the radiometric and geometric correction tables. The Calibration system 517 may also include tools to allow the operators to monitor the characteristics of the HRC and the MRC, and the tools may also allow operators to perform updates to the correction tables.
The Ground Station Network (GSN) 519 is the collection of X-Band Ground Stations that are used for the X-Band downlink of image, video, ancillary, and log data. The GSN is a distributed network of ground stations (e.g. ten ground stations) providing for frequent downlink opportunities.
The Data Hub 522 is responsible for collecting, preprocessing and routing of downlink data.
The Health Monitoring System (HMS) 521 is configured to perform a number of functions. The HMS monitors the health status of the space segment 501, and generates health status reports. The HMS organizes and stores engineering telemetry and diagnostic logs, which can be transmitted to an operator for viewing. The HMS also logs behavior and performance, such as by computing long-term trends and statistical analysis. The HMS is also configured to receive and store engineering inputs for the generation of maintenance, configuration and diagnostic activities of the space segment 501. The HMS is also configured to monitor general performance of the Ground Station Network (GSN). For example, the HMS monitors signal levels and lock synchronization, and may monitor other characteristics.
The Orbit & Attitude System (OAS) 520 publishes definitive and predicted orbit data, and definitive and predicted attitude data, of the ISS. The OAS also provides some related orbit and attitude services to the rest of the system.
The Mission Control Center (MCC) 530 is used to manage communications between the spacecraft 100 and the ground. For supporting earth observation, the MCC station is used for uplinking the command files (e.g. OCFs) and receiving real-time health and status telemetry. The MCC 530 is also configured to transmit resource availability about the spacecraft and the space segment 501 to the CPS 518. This resource availability data may include data regarding power resources, planned orbit adjustment maneuvers, and any scheduled outages or other availability issues.
The MCC 530 receives OCFs from the CPS 518. The MCC 530 then confirms that each OCF meets all resource constraints and availability constraints. If there is a conflict where any resources are not available to the optical telescope system, the MCC will either request a new plan from the CPS 518 or could cancel some imaging sessions to satisfy the constraint.
It will be appreciated that
With respect to
In
In an example embodiment, ancillary data for Level 0 products include orbit position, attitude and other data used for processing.
Turning to
It will be appreciated that any module, component, or system exemplified herein that executes instructions or operations may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data, except transitory propagating signals per se. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the systems, modules or components of the Earth observation system 500, or accessible or connectable thereto. Any application, system or module herein described may be implemented using computer readable/executable instructions or operations that may be stored or otherwise held by such computer readable media.
Camera System
Turning to
Turning to
The terminal computer 611 controls operations on the DHU 512. The terminal computer also controls the MRC and HRC units' power on/off transitions in accordance with the OCF content. The terminal computer 611 provides time updates, Operational Command Files (OCFs 1201), direct commands, spacecraft ancillary data (e.g. ephemeris and attitude data, BPP joint angles, STUA quaternions), and terminal computer telemetry data to the DHU 512. In return, the DHU 512 provides DHU telemetry (e.g. health status) information back to the terminal computer 611.
The Mission Control Center (MCC) 530, located on the Earth, performs checking on the OCFs. If no issues are found, MCC manages the uplink of the OCF to the spacecraft. If any issues are found, the OCF is rejected and is not sent to the spacecraft. The MCC manages the uplink of the OCFs via ground stations. The size of the command data that can be uplinked in one pass is limited. For example, the command data in an OCF is less than 8800 bytes. However, the limit varies from pass to pass (e.g. when the spacecraft passes over a region of a ground station). The MCC will enforce this restriction and will only uplink OCFs that fit within the restriction.
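By way of example only, the following Python sketch shows a minimal form of the per-pass size check described above. The function name and the default limit are illustrative; as noted, the actual limit varies from pass to pass.

```python
def fits_uplink_window(ocf_bytes: bytes, pass_limit_bytes: int = 8800) -> bool:
    """Return True if the OCF command data fits within the uplink
    capacity of the given pass; the limit varies from pass to pass."""
    return len(ocf_bytes) <= pass_limit_bytes

# Example: only OCFs that fit within the current pass limit are uplinked.
ocf = b"\x00" * 5000
if fits_uplink_window(ocf, pass_limit_bytes=8800):
    print("OCF accepted for uplink")
else:
    print("OCF rejected: exceeds uplink capacity for this pass")
```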
The CPS 518 sends OCFs to the MCC 530 and the MCC manages the communications with the spacecraft via the terminal computer 611.
The MCC also receives telemetry from the terminal computer 611, including health and safety data.
In an example embodiment the health and safety data is sent from the spacecraft to a ground station 300 via the spacecraft's DTRES. In particular, space packets of data are generated by the terminal computer 611, and stored on the OMU 610 of the spacecraft. The space packets include image data (e.g. image take files 1206 and video take files 1207), ancillary data files 1204 and health monitoring data (e.g. log files 1205). When possible, the space packets are sent from the OMU to the ground station 300, via the DTRES 607. The ground station sends the data to a data hub 522. The space packets, including the log files, are sent to the HMS 521. The space packets including the ancillary files 1204, the image take files 1206, and the video take files 1207 are sent to the processing system 515.
The terminal computer 611 is configured to, for each Session in the OCF, execute the necessary steps to power on/off the DHU 512, STUA 511, MRC 502 and HRC 506. In some cases, the terminal computer 611 powers on/off the OMU 610 and DTRES (e.g. X-band transmitter) 607 to support the downlinks. For each STUA Session in the OCF, the terminal computers 611 execute the STUA commands. For each HRC Session in the OCF, the terminal computer executes the BPP control.
When the terminal computer 611 receives a new OCF, the terminal computer performs a check on the OCF. If any errors are found, the OCF is discarded and is not passed on to the DHU 512. This occurrence would be reported to the DHU via an execution message and passed to the ground in a Space Segment Log (e.g. log file 1205).
In an example embodiment, the terminal computer 611 is configured to modify an OCF by deleting, inserting, or modifying commands. For example, the terminal computer introduces unplanned instructions (e.g. instructions not contained in the originally received OCF) to activate power off cycles to support off-nominal operations aboard the spacecraft.
After receiving a new OCF from the terminal computer 611, the DHU 512 performs a basic check on the OCF format and, if there is a discrepancy, the DHU 512 rejects the OCF. Such discrepancies and rejections would also be noted in the Space Segment Log.
The DHU 512 generates internal command schedules based on the contents of the OCF. In addition to controlling its internal activities, the DHU 512 will control the MRC and HRC. This includes power on/off and any necessary warm-up/cool-down periods. In particular, the DHU sends MRC commands to the MRC 502, HRC commands to the HRC 506, and gyro commands to the gyroscopic unit 509. The DHU also receives image or video data from the cameras, telemetry data from the cameras, and telemetry data from the gyroscopic unit.
Continuing with
Although not shown in
In an example embodiment, the DHU and the terminal computer are configured to process each OCF to create the internal commanding schedules without affecting the current schedule being executed. The DHU and the terminal computer are configured to stop all activities associated with the previous OCF at the start time of the new OCF and generate their new internal schedule based on this new OCF from that start time.
Regarding the camera system's configuration, onboard update activities such as maintenance, calibration and software patches may be controlled through OCF command files generated on the ground. Configuration parameters include software patches, calibration and configuration settings. The specific configuration instructions that may be commanded include the update and dump of MRC and HRC configuration factors, uploading of any of the telescope configuration tables, update of compression selection tables, DHU configuration, and software updates.
Software updates may be in the form of OCFs, and the transport mechanism could be via the normal command uplink to the terminal computer 611. However, the updates could also be sent via memory stick on a flight to the spacecraft, where an astronaut or cosmonaut (or the like) inserts the memory stick into the terminal computer.
Regarding the MRC 502, the MRC is a fixed pointing camera that takes image strips. The configurable camera settings include ADC gain, integration time and compression ratio. The following are some overall considerations related to compression ratio and integration time. Examples of compression ratios for the MRC are provided below in Table 4.
An Image Take (IT) is a contiguous strip of four-band MRC data collected with the same camera and configuration settings. Four-band data refers to Red, Green, Blue and Near Infrared (RGBN) data. In an example embodiment, all image takes will have a unique Image Take ID.
The MRC camera is primarily operated in a systematic image collection mode. In a default mode, the MRC camera collects imagery all the time, subject to resource constraints. Based on this mode of operation, most of the MR imagery will be part of a systematic background acquisition plan. Most of the imagery will be acquired as long strips. The length of individual strips (Image Takes) is limited in practice by the need to change the ADC gain, integration time and compression ratio as the sun illumination changes along the orbital ground track. This limit may be reached sooner than other limits, such as limits in the size of data that can be downlinked and handled in the rest of the system.
Requests for imaging using the MRC can originate from a customer or external user, are managed by the OMS, and allow a user to: request that imagery of a certain area be covered; request a certain compression level (e.g. higher quality than the default setting); and request delivery to the customer. The request may also include details that exclude or delay the imagery from being placed in the public archive.
The default imaging routine of the MRC is systematic and includes capturing long images of strips of the Earth. The image strips are broken or separated into manageable lengths during planning. The systematic imaging is not done in response to any specific user requests. The imaging concept for the MRC is for continuous global coverage. However, the specific acquisitions may be influenced by: light conditions; land or ocean area; and data volume limitations on board or downlink capacity.
For the MRC, the operational Image Take executable instructions may be performed according to the following order: MRC On command; MRC Warmup delay; one or more Image Takes; MRC Power-off clean-up delay; and MRC off command.
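By way of example only, the following Python sketch illustrates this ordering of operations. The command strings and delay durations are hypothetical placeholders; the real commands and delays are defined by the OCF and the camera configuration.

```python
import time

# Hypothetical placeholder durations (seconds); real values are configuration items.
MRC_WARMUP_DELAY_S = 2.0
MRC_CLEANUP_DELAY_S = 1.0

def send_command(cmd: str) -> None:
    # Stand-in for the DHU command interface.
    print(f"cmd: {cmd}")

def run_mrc_session(image_takes):
    """Execute an MRC session in the order described above:
    power on, warm-up, one or more Image Takes, clean-up delay, power off."""
    send_command("MRC_ON")
    time.sleep(MRC_WARMUP_DELAY_S)        # MRC warm-up delay
    for it in image_takes:
        send_command(f"MRC_IMAGE_TAKE {it}")
    time.sleep(MRC_CLEANUP_DELAY_S)       # power-off clean-up delay
    send_command("MRC_OFF")

run_mrc_session(["IT-0001", "IT-0002"])
```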
Regarding the HRC, the HRC is an agile high-resolution video camera. For example, it can capture images at 3 frames per second, optionally decimated by integral factors. The HRC is capable of collecting video/imagery in several ways.
The HRC is configured to capture video centered on a single target on the ground. The camera remains pointed at the target as the spacecraft moves along its orbit. A video with one frame is equivalent to acquiring a single image. Sets of such videos or images can be used for specific purposes such as stereo pairs. If the images are not frames of a continuous video, then such scenarios are composed of multiple Image Takes.
The HRC is also configured to capture a one-dimensional or a two-dimensional grid of adjacent single frame takes where a fixed number (e.g. 1-5) of frames are acquired in a line or serpentine pointing sequence, resulting in a quilt-like pattern that covers an area longer and/or wider than a single swath. Such an imaging scenario can be commanded such that it is stored as a single Video Take.
In another example embodiment, the HRC is further configured to capture a video of a path on the ground (a video that slides along a path on the ground). Such an imaging scenario can be commanded such that it is stored as a single Video Take.
The HRC is also configured to capture a video of a celestial target (e.g. the moon).
The HRC's configurable camera settings include, for example, ADC gain, integration time and compression ratio. The following are some considerations related to compression ratio, integration time and ADC gain settings. Typical compression settings for the HRC are shown in Table 5 below.
A Video Take (VT) is defined as a set of HRC frames taken of a specific target. During this period, the BPP 605 will normally point the HRC boresight at the target and track this target during the VT. Each VT has constant camera and compression configuration settings.
The HRC may be operated in an order driven mode. These orders could originate from various sources, including Customers/Distributors, planners, and inputs from Public Users. The orders will specify targets, areas, or paths. The HRC operations include, for example, single target video, capturing a grid of images to cover an area, a video of a path, and celestial target video.
For terrestrial imaging, commands from the ground segment give the space segment coordinates of points on the Earth at which to point the HRC. The terminal computer 611 is configured to calculate the BPP pointing angles and to command the BPP. The CPS 518 is responsible for determining the timing of the HRC imaging activities such that slew and settling time rules are followed.
With respect to single target videos, the Order specifies the target's latitude, longitude and altitude. The BPP slews from the previous location, settles, and then dwells on the target for the entire Video Take. The result of this type of imaging is a video (or image if the video has only one frame) centered over a target.
With respect to a grid of images to cover an area, the Order specifies an area polygon. At the time of order entry, limits are automatically placed on the size of the area, based on the maximum area that the camera can acquire in one pass. The CPS 518 takes the requested area and breaks it into a series of individual single target Video Takes that form a grid pattern to cover the area. The BPP 605 slews from the previous location, settles, and then dwells on the first target for the duration of a single frame. The BPP 605 then slews, settles, and dwells on the second target for another single frame. And so it continues through all the individual single target Video Takes.
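By way of example only, the following Python sketch shows one way the requested area could be broken into a serpentine sequence of single target Video Takes. The grid dimensions and coordinate handling are simplified assumptions; the CPS 518 would also account for swath size, slew and settle times, and pass geometry.

```python
def serpentine_grid(lat_min, lat_max, lon_min, lon_max, rows, cols):
    """Break a rectangular area into a serpentine (boustrophedon) sequence
    of grid-cell center points, alternating direction on each row so the
    BPP slews a short distance between consecutive targets."""
    dlat = (lat_max - lat_min) / rows
    dlon = (lon_max - lon_min) / cols
    targets = []
    for r in range(rows):
        lat = lat_min + (r + 0.5) * dlat
        col_order = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in col_order:
            lon = lon_min + (c + 0.5) * dlon
            targets.append((lat, lon))
    return targets

# Example: cover an area with a 3x4 grid of single-frame Video Takes.
for point in serpentine_grid(45.0, 45.1, -75.2, -75.0, rows=3, cols=4):
    print(point)
```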
With respect to a video of a path, the Order specifies a path expressed as a sequence of video center coordinates. The CPS 518 creates a sequence of target center coordinates with appropriate spacing along the requested path. The BPP 605 slews from the previous location, settles, and then “sweeps” smoothly through each of the target center coordinates over the total Video Take duration. The required integration time (e.g. based on lighting level) determines the frame rate which determines the maximum path length. Based on typical values for integration time a swath of up to approximately 20 km could be swept. The result of this type of imaging is a single video that “sweeps” or “pans” over a path on the ground.
With respect to a Video Take of a celestial target, the order contains a coordinate in RAAN, Declination format. The BPP slews from the previous location, settles, and then dwells on the target for the entire Video Take. The result of this type of imaging is a video (or image if the video has only one frame) centered on a target.
In an example embodiment, the HRC operation executable instructions are executed according to the following timeline: HRC on command; gyro warm up delay; BPP on command; BPP align delay; HRC warm up delay; one or more Video Takes; BPP slews to position; acquisition begins and BPP tracks target position; at the end of the Video Take the BPP slews to its next position and waits to start the next Video Take; at the end of the last Video Take in the current HRC Session, the BPP performs a final slew manoeuvre to its stowed position, if so commanded; HRC Power Off Cleanup delay; HRC Off Command; and BPP Off Command.
Turning to
At block 1301, a computing device determines the solar illumination level of the area of the Earth of which the camera will capture images and video. The solar illumination may be estimated based on location, date, and time of day at the location, or can be obtained from measurements. At block 1302, the computing device determines the integration time of the camera based on the solar illumination level (e.g. night time has a higher integration time). At block 1303, the computing device outputs the integration time for the camera.
Continuing with
Regarding compression ratio, after computing the solar illumination level, the computing device determines if the time of day is "night" for the location viewed by the spacecraft (block 1306). If so, and it is night, the computing device outputs a "high" compression ratio (block 1307). This is because there is little data to be obtained from night images, and higher quality images are not required for the night images. If the time of day for the specific location is not night (e.g. is the "day time"), the computing device determines the scene content (e.g. estimated based on the location viewed by the spacecraft) (block 1308). If the scene content is the ocean or water, the computing device outputs a "high" compression ratio (block 1309). If the scene content is land with low solar illumination, the output is a "high" compression ratio (block 1310). If the scene content is land with sufficiently high solar illumination, then the output is a "low" compression ratio (block 1311). It can therefore be appreciated that, by automatically adjusting the camera settings based on these factors, the sizes of the data files are appropriately smaller for less important images and are appropriately larger for more important images. This saves resources related to data storage and transfer.
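By way of example only, the following Python sketch mirrors the decision logic of blocks 1301 to 1311. The numeric thresholds, integration time values, and function name are hypothetical placeholders for illustration.

```python
def select_camera_settings(is_night: bool, scene: str, solar_illumination: float):
    """Select integration time and compression ratio, mirroring the decision
    logic described above; numeric thresholds here are placeholders."""
    # Block 1302: lower illumination implies a longer integration time.
    integration_time_ms = 40.0 if is_night else max(1.0, 10.0 / max(solar_illumination, 0.1))

    # Blocks 1306-1311: pick the compression ratio from time of day and scene content.
    if is_night:
        compression = "high"          # little content in night imagery
    elif scene in ("ocean", "water"):
        compression = "high"          # low spatially varying content
    elif scene == "land" and solar_illumination < 0.3:
        compression = "high"          # land with low solar illumination
    else:
        compression = "low"           # well-lit land: high quality product
    return integration_time_ms, compression

print(select_camera_settings(is_night=False, scene="land", solar_illumination=0.8))
```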
Turning to
Turning to
Continuing with
After determining the interest factor, the computing device activates power or deactivates power for the camera based on the interest factor (block 1504). For example, if the interest factor is sufficiently high, the computing device activates power to the camera; otherwise, it deactivates power to the camera (block 1508).
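By way of example only, the following Python sketch shows a minimal form of this power gating decision; the threshold value and function name are hypothetical.

```python
def set_camera_power(interest_factor: float, threshold: float = 0.5) -> bool:
    """Activate camera power only when the interest factor for the area
    below the spacecraft is sufficiently high (threshold is a placeholder)."""
    power_on = interest_factor >= threshold
    print("camera power", "ON" if power_on else "OFF")
    return power_on

set_camera_power(0.8)   # area of high user interest: power on
set_camera_power(0.2)   # low interest (e.g. open ocean at night): power off
```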
Turning to
In another example embodiment,
Turning to
Turning to
If the BPP is commanded to slew to a starting position for another VT, the process starts again from block 1904. Otherwise, if the BPP is moved to a stowed position, the computing device determines if more than a predetermined amount of time has passed since the last VT (block 1909). If so, the computing device executes a re-alignment procedure for the BPP (block 1910). This realignment is to take into account the vibrations that may have caused the camera to be misaligned while being in the stowed position. If the predetermined amount of time has not yet passed, no action is taken, and the computing device re-evaluates the condition posed in block 1909.
Downlink Processes
Turning to
Alternatively, or in addition, if the spacecraft connects to a secondary ground station, the computing device on the spacecraft downlinks the data to the secondary ground station (block 2003). The secondary ground station receives the data from the spacecraft (block 2005) and sends the data to the central ground station (block 2006), for example, over the network 400. The data is received by the central ground station at block 2008.
In this way, even if a spacecraft is not able to connect to the central ground station, the spacecraft can communicate data to another ground station and the data will be compiled on Earth.
Turning to
At block 2102, the DHU performs planning operations to determine which files are to be downlinked in each downlink window for a given ground station. The DHU has a set of logic rules that implement queuing priority. The DHU populates a downlink list with an ordered list of files to be downlinked, as per the derived plan (block 2103). The DHU then downlinks the files according to the downlink list (block 2104).
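By way of example only, the following Python sketch shows one possible set of ordering rules for populating a downlink list, placing urgent files first. The file types, ordering table, and class names are assumptions made for illustration; the actual logic rules are configurable in the DHU.

```python
from dataclasses import dataclass

@dataclass
class DownlinkFile:
    file_id: str
    file_type: str      # e.g. "ancillary", "log", "image_take", "video_take"
    urgent: bool = False

# Hypothetical ordering: urgent files first, then by file type
# (ancillary and log files early in the window, bulk imagery after).
TYPE_ORDER = {"ancillary": 0, "log": 1, "image_take": 2, "video_take": 2}

def build_downlink_list(candidates):
    """Populate an ordered downlink list from the OCF downlink list plus
    files added by the DHU's configurable priority rules."""
    return sorted(candidates,
                  key=lambda f: (not f.urgent, TYPE_ORDER.get(f.file_type, 3)))

files = [DownlinkFile("VT-9", "video_take"),
         DownlinkFile("ANC-3", "ancillary"),
         DownlinkFile("IT-7", "image_take", urgent=True)]
for f in build_downlink_list(files):
    print(f.file_id)    # IT-7, ANC-3, VT-9
```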
Turning to
During a Downlink Window, the spacecraft 100 downlinks the data to a ground station, which records the data and sends it on to the Data Hub 522. The Data Hub re-assembles the Image/Video Take files which may have been split across several passes and ground stations. When the Image/Video Take files are assembled, the Data Hub 522 forwards them to the Processing System 515. The Processing System catalogues the new data, and then generates the systematic products which are sent to the Web DS & CDN 525.
In particular, the CPS 518 sends a reception schedule 2202 to the ground station network 519 and sends the expected file list 2203 to the data hub. This occurs at the end of a planning session 2204. The spacecraft, using components 606, 607, downlinks the image and video data via an X-band downlink 2205.
A set of operations 2208 is performed for each downlink pass at each ground station. The ground station receiving the data sends a compiled data file 2206, like a Cortex Data File, to the data hub 522 and sends a post-pass report 2207 to the CPS 518.
For each downlink pass received from a ground station, the data hub re-assembles the received files 2209. It is noted that Application Identifiers (APIDs) are embedded into the DATA ID and include SOF and EOF markers, which are used to help reassemble the files. After assembling the files, the data hub sends Level 0 Files and a data quality report 2210 to the processing system 515. The assembled Cortex data file (2211) is also sent to the E-Data Hub 2201. The data hub also sends the data hub log to the CPS and the OMS (2212).
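By way of example only, the following Python sketch illustrates the re-assembly concept: fragments of a file received over several passes and ground stations are collected per Data ID, and a file is assembled once its SOF and EOF markers have been seen and the fragments are contiguous. The class and method names are hypothetical.

```python
from collections import defaultdict

class FileReassembler:
    """Collect downlinked fragments per Data ID (possibly received over
    several passes and ground stations) and reassemble complete files."""

    def __init__(self):
        self.fragments = defaultdict(dict)   # data_id -> {offset: bytes}
        self.sof_seen = set()
        self.eof_seen = set()

    def add_fragment(self, data_id, offset, payload, sof=False, eof=False):
        self.fragments[data_id][offset] = payload
        if sof:
            self.sof_seen.add(data_id)
        if eof:
            self.eof_seen.add(data_id)

    def try_assemble(self, data_id):
        """Return the assembled file if its SOF and EOF markers were seen
        and the fragments are contiguous; otherwise return None."""
        if data_id not in self.sof_seen or data_id not in self.eof_seen:
            return None
        data = b""
        for offset in sorted(self.fragments[data_id]):
            if offset != len(data):
                return None   # gap: wait for a re-downlink of missing packets
            data += self.fragments[data_id][offset]
        return data

hub = FileReassembler()
hub.add_fragment("IT-7", 0, b"abc", sof=True)     # pass 1, ground station A
hub.add_fragment("IT-7", 3, b"def", eof=True)     # pass 2, ground station B
print(hub.try_assemble("IT-7"))                   # b'abcdef'
```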
The processing system then performs a cataloguing process (2214) and then sends Level 0 files and catalogue entries to the archiving system (2215). The processing system also performs additional processing 2216 to generate map tiles, live-feed tiles and pinpoint data, amongst other types of processed data. The processed data is sent to the Web DS & CDN (2217). An image/video take notification is sent to the OMS, for example to alert the users that their data has been acquired and is ready for downloading.
Turning to
The DHU 512 receives compressed blocks of imagery/video from the DCU in each camera 502, 506, 608. The DHU 512 assembles these blocks into Downlink Files composed of space packets. Each file stored on the DHU will have a unique DHU Data ID which serves as a handle for the particular type of file to be accessed. The DHU Data ID for Image/Video Take files contains the Image/Video Take ID that was specified in the OCF.
The output from the DCUs in each camera to the DHU is a stream of image files. The DHU will manage the collection of these image files belonging to a single Image Take or Video Take as one file. The IT or VT will be uniquely identified by an ID included as part of the record command in the OCF. This same ID will be used to identify the file to the DHU when it is included in a downlink list, a delete list, or a do-not-delete list.
The Application Identifier (APID) field in each Space Packet is encoded to identify the downlink file type.
It will be appreciated that the Consultative Committee for Space Data Systems (CCSDS) maintains a set of recommendations that include Space Packets and Digital Video Broadcast transmission standards.
It will be appreciated that ancillary data and log information from both the camera systems and the spacecraft are used by the ground segment to monitor and control the collection of data, and also process the collected data. It may also be used by systems onboard to determine the status of other systems, for example for the DHU to monitor the status of the HRT and MRT.
The DHU is configured to receive ancillary data from the following different sources: the terminal computer 611 (e.g. the spacecraft position, the spacecraft attitude, BPP pointing angles, etc.); the HRC-DCU 508 (e.g. temperatures, dark pixel data, etc.); the HRC-Rate Gyros 509 (e.g. configuration settings); the MRC-DCU 504 (e.g. temperatures, dark pixel data, etc.); and from the DHU itself (e.g. status information). The DHU collects this data and periodically saves the data to a file, such as an ancillary data file and a DHU data log. A new file is created each time the data is saved.
In an example embodiment, the DHU collects log information from the same sources used to collect ancillary data. Log data is collected and written to files in the same manner as the Ancillary Data. The log data includes, for example, the execution result of each command in the OCF, and health/status telemetry.
Turning to
With respect to X-band downlink operations, the OCF contains downlink commands to indicate the times of downlink passes and to indicate the data to be downlinked in each pass. For each upcoming Downlink Window, the DHU will create a list of files to be downlinked by taking the downlink list in the OCF, and the DHU will add other files based on predetermined and configurable rules.
The DHU may arrange for the files for downlinking to be transferred to the OMU just before the downlink start time. The transfer between DHU and OMU may be done in partitions based on data size (e.g. 1 GB). This means that a single Image/Video Take may be split up across 1 GB boundaries and that partial files may be received on the ground. Note that Space Packets are always downlinked in their entirety and are not split up.
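By way of example only, the following Python sketch illustrates this partitioning constraint: a file's space packets are grouped into size-limited partitions, and a partition boundary may fall mid-file but never mid-packet. The names and the partition size parameter are illustrative.

```python
def partition_packets(packets, partition_size=1_000_000_000):
    """Group a file's space packets into size-limited partitions for transfer
    from the DHU to the OMU. A partition may end mid-file (so partial files
    can be received on the ground), but a packet is never split."""
    partitions, current, current_size = [], [], 0
    for pkt in packets:
        if current and current_size + len(pkt) > partition_size:
            partitions.append(current)
            current, current_size = [], 0
        current.append(pkt)
        current_size += len(pkt)
    if current:
        partitions.append(current)
    return partitions

# Example with a tiny partition size for illustration.
pkts = [b"a" * 400, b"b" * 400, b"c" * 400]
print([sum(len(p) for p in part) for part in partition_packets(pkts, 1000)])  # [800, 400]
```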
The OMU/DTRES will then format the files to prepare for the downlink (DVB format). The OMU/DTRES will use the X-Band transmitter to downlink the data to the GSN.
As noted above, the image take files and video take files may be split and downlinked over multiple Downlink Windows and potentially the pieces of the image and video take file may go to different ground stations. The Data Hub is responsible for reconstructing the original files.
The CPS keeps track of the overall downlink usage of the system. The CPS accounts for the downlinks during planning based on identifying downlink windows, the downlink data rate, and the size of each image/video take file. The size of each image/video take file can be computed by estimating the compression ratios to be used for each image.
Further information about the data to be downlinked is below. Table 6 identifies the types of files to be downlinked, for example using the X-band link.
The DHU is able to identify the actual file sizes of each file and, shortly before each Downlink Window, the DHU constructs a plan for the Downlink Window using the actual file sizes, the Downlink Window duration and the downlink data rate.
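By way of example only, the following Python sketch shows a simplified form of this planning step, fitting an ordered file list into the window capacity (window duration multiplied by the downlink data rate) and marking a file that is only partially downlinked, as discussed in the next paragraph. The names and figures are illustrative assumptions.

```python
def plan_downlink_window(files, window_duration_s, data_rate_bps, carryover=0):
    """Fit an ordered list of (file_id, size_bytes) into one Downlink Window.
    'carryover' is the remaining size of a file partially downlinked in a
    previous window; the last file may again be only partially downlinked."""
    capacity = int(window_duration_s * data_rate_bps / 8)   # bytes this window
    plan, remaining = [], capacity - carryover
    for file_id, size in files:
        if remaining <= 0:
            break
        sent = min(size, remaining)
        plan.append((file_id, sent, sent < size))           # (id, bytes, partial?)
        remaining -= sent
    return plan

# Example: a 600 s window at 100 Mbit/s holds 7.5 GB.
files = [("IT-1", 5_000_000_000), ("VT-2", 4_000_000_000)]
for entry in plan_downlink_window(files, 600, 100_000_000):
    print(entry)   # IT-1 fully sent; VT-2 partially sent, finished next window
```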
Some image/video take files may be partially downlinked in a Downlink Window. The remainder of the files will be downlinked at the beginning of the next allowable Downlink Window. In an example embodiment, to increase the usage of the downlink resource, each downlink session produces at least one new partially downlinked file.
In an example embodiment, in order to reduce the risk of losing Ancillary data during downlinks, and since Ancillary data files are relatively small, Ancillary data files will be downlinked twice. For each Downlink Window the DHU will automatically downlink every Ancillary data file generated in the time period starting from the two prior Downlink Windows until the current Downlink Window. This concept is illustrated in the
In
Similarly, ancillary files 2509 generated during downlink window N−1 2503 and downlink window N 2504 are actually downlinked during downlink window N+1 2505. In this way, the ancillary files generated during downlink window N−1 are downlinked twice; a first time during downlink window N 2504 and a second time during downlink window N+1 2505.
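By way of example only, the following Python sketch expresses this selection rule: for Downlink Window N, every ancillary file generated since the start of window N−2 is included, so each file is downlinked in two consecutive windows. The representation of generated files is a simplification for illustration.

```python
def ancillary_files_for_window(n, generated):
    """For Downlink Window n, select every ancillary data file generated in
    the period covering the two prior windows up to window n, so that each
    ancillary file is downlinked in two consecutive windows."""
    return [f for (window, f) in generated if n - 2 <= window < n]

# Ancillary files tagged with the window interval in which they were generated.
generated = [(1, "ANC-a"), (2, "ANC-b"), (3, "ANC-c")]
print(ancillary_files_for_window(3, generated))   # ['ANC-a', 'ANC-b']
print(ancillary_files_for_window(4, generated))   # ['ANC-b', 'ANC-c']
```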
The Ancillary data is downlinked to an allowable list of ground stations (e.g. one of the ground station lists stored in the DHU). In addition, the DHU can also be expressly commanded to downlink Ancillary data files through the OCF. This would be used in the case where the Ancillary data file needs to be re-downlinked because it could not be successfully reconstructed, or needs to be downlinked to another ground station.
In an example embodiment, the DHU places the Ancillary data files before the start of the second or third downlinked file, thus avoiding the beginning and end of the Downlink Window where there is a higher probability of data loss or corruption.
A similar process is used to downlink space segment Log files. In an example embodiment, however, Log files are downlinked once. The DHU will place the Log data files immediately after the Ancillary data files, thus avoiding the beginning and end of the Downlink Window where there is a higher probability of data loss or corruption.
With respect to downlink commanding, the commanding of downlinks is contained within the HRC, MRC, and downlink sessions in OCFs. The relevant fields and how they are determined by CPS is shown in the following table, Table 7.
Each Image/Video take file is assigned one of two priorities (i.e. normal or urgent) in the OCF. The DHU places files with urgent priority at the beginning of its downlink list. It is expected that only a very small fraction of files will be labeled as urgent priority.
If some part of a downlink has been lost for some reason, the OMS detects this based on the Data Hub Logs. If the OMS decides to request the re-downlink of the missing data, the OMS sends a Re-Downlink Request to the CPS. The CPS includes the file ID in the OCF, specifying certain space packets to downlink.
Within each Downlink Window, the sequence of files to be downlinked is shown in Table 8.
As described above, the DHU downlinks files according to certain rules and ground commands. After a file has been downlinked (or, in the case of an Ancillary file, downlinked twice), the file is deleted, unless the protection flag is set, in which case the DHU does not delete the file until a command is received from the ground. The process of handling the data after downlinks is shown in
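By way of example only, the retention rule described above could be expressed as in the following Python sketch; the file type labels and function name are hypothetical.

```python
def should_delete(file_type: str, downlink_count: int, protected: bool) -> bool:
    """Apply the retention rule described above: delete a file once it has
    been downlinked (twice for ancillary files), unless its protection flag
    is set, in which case deletion waits for an explicit ground command."""
    if protected:
        return False
    required = 2 if file_type == "ancillary" else 1
    return downlink_count >= required

print(should_delete("image_take", 1, protected=False))   # True
print(should_delete("ancillary", 1, protected=False))    # False: needs 2nd downlink
print(should_delete("image_take", 1, protected=True))    # False: await ground command
```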
Orbit and Attitude System
It can be appreciated that a spacecraft, such as the ISS or another satellite, performs orbit/attitude maneuvers. The maneuvers are sometimes independent of the imaging operations. Predicted orbit data may be available in advance of the orbit maneuvers. During orbit maneuvers imaging operations can continue, or can be suspended via the insertion of an unavailability period.
Turning to
Planning
Turning to
Further details regarding the planning processes are described below.
Turning to
Continuing with
An example of a modified map 3005 is shown in
Turning to
Note that night-time imaging may be handled differently compared to day-time imaging. Since the compression ratio is set to be high at night, there is less need to split Image Takes at map region boundaries. Having fewer Image Takes has the advantage of reducing the size of the OCF. An example planning process for night time would be for CPS to command night time imaging operations as a single long Image Take with a constant compression setting.
The quality level associated with each region may be identified as one of three “quality index” values that are sent to the CPS: nominal quality index, maximum required quality index, and minimum required quality index. In the CPS, each quality index value is mapped via a table to camera compression ratio settings. Several tables are needed to allow for different lighting conditions, for example separate tables for Summer, Winter, and Night.
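As a minimal sketch, the mapping could be held as one table per lighting condition; the integer index scale and the compression ratio values below are illustrative placeholders, not mission settings.

```python
# Illustrative quality-index-to-compression-ratio tables, one per
# lighting condition. All values are placeholder assumptions.
COMPRESSION_TABLES = {
    "summer": {1: 8.0, 2: 6.0, 3: 4.0, 4: 3.0, 5: 2.0},
    "winter": {1: 9.0, 2: 7.0, 3: 5.0, 4: 3.5, 5: 2.5},
    "night":  {1: 12.0, 2: 10.0, 3: 8.0, 4: 6.0, 5: 5.0},
}

def compression_ratio(quality_index, lighting):
    """Map a quality index to a camera compression ratio setting."""
    return COMPRESSION_TABLES[lighting][quality_index]
```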
It is recognized that the total amount and quality of imaging will be limited by resource constraints, such as total data volume due to downlink limitations. The total data volume is kept within resource limits by the CPS by limiting the data quality and the imaging time.
Image Takes that completely fill all orbits are planned using the nominal value of the compression ratio, also called the image quality. The image quality is then adjusted using the approach described in
Turning to
At block 3201, the CPS applies a nominal value of the image quality level to each region on the map. The CPS determines if the resource usage is above a predefined limit (block 3202). If so, a series of operations 3203 is performed.
In particular, at block 3204, the CPS considers all regions being imaged along the orbit/flight path of the spacecraft for the given Image Take. In reverse order of priority level (e.g. starting with the region with the lowest priority level), the CPS decreases the quality level associated with the region by a decrement. The operation of decreasing the quality level by a decrement is repeated for the next region, based on reverse order, until: resource usage is at/below the predefined limit; or the quality level for all regions has been decreased by one decrement (block 3205). If the resource usage is at/below the predefined limit, the process stops.
However, after repeating the operation of block 3204 for all regions being imaged along the orbit/flight path, and if the resource usage is still above the predefined limit, then the series of operations 3203 is repeated, but further decreasing the quality level by yet another decrement, until resource usage is at/below the predefined limit (block 3206). Again, if the resource usage is at/below the predefined limit, the process stops.
After implementing block 3206, and if the resource usage is still above the predefined limit after the minimum quality level is reached for all regions, then the CPS removes a region from the image taking or video taking plan, starting with the region having the lowest priority level (block 3207). This removal of regions, starting with the lowest priority regions, is repeated until resource usage is at/below the predefined limit (block 3208).
However, if at block 3202 it is decided that the planned resource usage is not above a predefined limit, then the image quality is increased. In particular, a set of operations 3209 is performed and includes blocks 3210 and 3211.
Block 3210 includes the CPS considering all regions being imaged along the orbit/flight path of the spacecraft for the given Image Take and, in order of priority level (e.g. starting with the region with the highest priority level), the CPS increases the quality level associated with the region by an increment. At block 3211, the CPS repeats this operation for the next region based on order of highest priority level, until: resource usage is at the predefined limit; or the quality level for all regions has been increased by one increment.
At block 3212, if the quality level has been increased for all regions (as per operations 3209), the CPS repeats the series of operations 3209, but further increases the quality level by yet another increment, until: resource usage is at the predefined limit; or the maximum quality level has been reached for all regions.
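The complete adjustment loop of blocks 3201-3212 can be summarized in a short sketch. The region fields, the integer quality scale, and the usage() callback are assumptions made for illustration; the CPS's actual resource model is not shown here.

```python
def plan_quality(regions, usage, limit, step=1, q_min=1, q_max=5):
    """Sketch of the quality-adjustment loop of blocks 3201-3212.

    Assumptions for illustration only: each region is a dict with a
    numeric 'priority' (higher = more important) and a 'nominal_quality';
    quality levels are integers on a q_min..q_max scale; usage(regions)
    estimates resource (data-volume) usage for the current assignment.
    """
    for r in regions:
        r["quality"] = r["nominal_quality"]              # block 3201

    if usage(regions) > limit:                           # block 3202
        # Decrease sweeps: lowest-priority region first, one decrement
        # per region per sweep (blocks 3204-3206).
        progressing = True
        while usage(regions) > limit and progressing:
            progressing = False
            for r in sorted(regions, key=lambda r: r["priority"]):
                if r["quality"] - step >= q_min:
                    r["quality"] -= step
                    progressing = True
                if usage(regions) <= limit:
                    return regions
        # Minimum quality reached everywhere yet still over the limit:
        # remove regions, lowest priority first (blocks 3207-3208).
        for r in sorted(regions, key=lambda r: r["priority"]):
            if usage(regions) <= limit:
                break
            regions.remove(r)
    else:
        # Increase sweeps: highest-priority region first, one increment
        # per region per sweep (blocks 3209-3212).
        progressing = True
        while usage(regions) < limit and progressing:
            progressing = False
            for r in sorted(regions, key=lambda r: -r["priority"]):
                if r["quality"] + step <= q_max:
                    r["quality"] += step
                    progressing = True
                if usage(regions) >= limit:
                    return regions
    return regions
```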
Turning to
Turning to
Another aspect of planning is pre-planning, as per operation 2801 in
Continuing with
A more generalized representation of the planning process 2804 is shown in
The planning process outputs include: an advance operating schedule; an activity schedule; a reception schedule; and an expected file list.
It can be appreciated that an acquisition request includes one or more of the following types of information: a region (e.g. area of polygon coordinates), a priority level, a time period, a quality index, camera configuration settings (e.g. integration time setting, ADC gain setting, compression ratio setting), a protection flag indicating the data should not be deleted from the OMU, an urgent downlink flag, a preferred ground station, cloud cover, sun elevation and azimuth angles, and incidence and azimuth angles.
Acquisition requests specific to video may additionally include the location to be imaged and the length of the video. For a single-target video, the request additionally includes target coordinates and a video length. For a grid of images to cover an area, the request additionally includes, for each target in the grid: target coordinates, relative start time, and time duration. For a video of a path, the request additionally includes, for each target along the path: target coordinates, relative start time, and time duration. For a celestial target video, the request additionally includes a vector to the target and a video length.
It can be appreciated that the CPS may also receive re-downlink requests. A re-downlink request is a request received by the CPS to request the DHU on the spacecraft to downlink certain files or portions of files again. Re-downlink requests include the image/video take ID and a parameter specifying the whole file or a specific range of space packets. If the requested file is an ancillary or log file, the re-downlink request includes the time range for a whole file, or an ancillary/log file ID and range of packets for a partial file. The re-downlink request may even include a preferred ground station to receive the data.
As described above, the CPS outputs activity schedules and these schedules are used to create OCFs. The contents of OCFs are summarized in Table 9 below.
In an example embodiment, each OCF corresponds to a time range. One or more OCFs are uplinked on commanding passes. In an example embodiment, OCFs are not split between passes, while downlink files are split.
In an example embodiment, each OCF is divided into five different types of “Sessions” (e.g. DHU, STUA, MRC, HRC and Downlink) and an “Additional Downlink List”. Each Session corresponds to a power-on/off time range, together with a set of time-keyed commands to execute. In the case where the power-on time of a Session is within some pre-defined time of the power-off time of the previous Session of the same type, the power-off/on cycle is not actually performed.
This commanding approach helps to ensure that the different subsystems will be automatically powered down in the event that an OCF does not arrive, arrives late or is corrupted. The configuration of the OCF also helps to ensure that the system is left in a well-defined or “whole” state at the end of each OCF time range.
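A sketch of the skip rule for back-to-back Sessions follows; the 60-second threshold is an assumed placeholder for the pre-defined time, which is not specified here.

```python
def skip_power_cycle(prev_power_off, next_power_on, threshold_s=60.0):
    """Return True if the power-off/on cycle between two consecutive
    Sessions of the same type should be skipped, i.e. the next Session's
    power-on time falls within threshold_s of the previous Session's
    power-off time. Times are seconds since a common epoch (assumed).
    """
    return 0.0 <= next_power_on - prev_power_off <= threshold_s
```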
As also noted above, the CPS generates Reception Schedules for each Ground Station that receives X-band data. Each Reception Schedule contains the planned receptions for the Execution Period (e.g. 24 hours). The Reception Schedules include the following information: identification of the Ground Station and visibility mask in use; visibility start/end time; and approximations of downlink start/end time. Reception Schedules may also include draft information for a longer duration (1-2 weeks ahead) which is updated each time a Reception Schedule is created.
The CPS also generates Expected File Lists for the Data Hubs (the same Expected File List goes to all Data Hubs). The Expected File List includes, for example, a list of Downlink Files (e.g. each Image/Video Take that has been planned for the upcoming Execution Period). For each Image/Video Take Downlink File, the Expected File List includes the following information: the type of file; the Image/Video Take ID; the file priority; the approximate file size (e.g. in bytes, number of space packets, or some other kind of counter); the Order ID or list of Order IDs (e.g. allows the data to be related back to a Customer/Distributor order); the Data Hub time out (e.g. maximum time to wait for the remainder of incomplete files); and the end-file-wait period (e.g. the time to wait for an end-of-file marker).
The Expected File List also includes a schedule of when Ancillary Data is to be generated on-board. The Data Hub can use this schedule to check if all expected Ancillary Data has been received.
The Expected File List also includes a schedule of when Space Segment Log Data is to be generated on-board. The Data Hub can use this schedule to check if all expected Space Segment Log Data has been received.
The Expected File List also includes a list of Downlink Windows for the Execution Period. The list may be, for example, a concatenation of all the Reception Schedules for all the ground stations. This list is used to flag failures like missed passes.
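Gathering the listed fields together, the Expected File List could be represented as follows; the field names and types are illustrative assumptions, not a defined interface.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ExpectedDownlinkFile:
    file_type: str             # e.g. image or video take file
    take_id: str               # Image/Video Take ID
    priority: str              # "normal" or "urgent"
    approx_size_packets: int   # approximate size, here in space packets
    order_ids: List[str]       # relates the data back to customer orders
    timeout_s: float           # max time to wait for remainder of file
    end_file_wait_s: float     # time to wait for an end-of-file marker

@dataclass
class ExpectedFileList:
    downlink_files: List[ExpectedDownlinkFile] = field(default_factory=list)
    # (start, end) times when Ancillary / Log data is generated on-board,
    # used by the Data Hub to check that all expected data arrived.
    ancillary_schedule: List[Tuple[float, float]] = field(default_factory=list)
    log_schedule: List[Tuple[float, float]] = field(default_factory=list)
    # Downlink Windows: concatenation of all ground-station reception
    # schedules for the Execution Period, used to flag missed passes.
    downlink_windows: List[Tuple[float, float]] = field(default_factory=list)
```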
It can be appreciated that scheduling algorithms, currently known or future known, can be applied to the planning process to take into account unavailability periods, acquisition requests, and other resource constraints.
Turning to
Turning to
In an example embodiment, the CPS receives Post-Pass Reports from ground stations, and the CPS analyses this information to determine the status and detect anomalies.
With respect to planning downlink activities, the CPS maintains a database of all ground station visibility times based on orbit geometry and on ground station reported availability periods. Ground stations are assumed to be available within the periods that have been set-up in CPS based on coordination with the ground stations and Ground Station Availability Reports.
It is recognized that ground stations may have overlapping downlink windows. To accommodate for such overlap, the CPS or ground segment is configured to execute the example computer executable instructions shown in
Turning to
Calibration
Primary calibration includes collecting data (primarily processing by-products) and storing it in a database within the calibration system 517. The calibration system then performs trending analysis of this data. Statistical and heuristic methods can be employed. The calibration system also identifies degradations or issues with Image Quality. It makes any required adjustments to system parameters based on the identified trends and degradation issues.
Systems involved in establishing and maintaining operational quality include the calibration system, the processing system and the HMS.
Turning to
Data Processing
Data processing is performed by a number of systems within the Earth observation system, including the processing system 515.
Data processing includes data reception at the ground segment. An example process of computer executable instructions for data reception is shown in
In
From the perspective of the Data Hub, based on the planning session, the Data Hub receives an Expected File List from the CPS and then waits for the expected data to arrive from Ground Stations.
After the data has arrived, the Data Hub 522 executes a number of computer executable instructions. The Data Hub collects Space Packets by DHU Data ID, including searching for start-of-file (SOF) and/or end-of-file (EOF) Space Packets, and filtering duplicate Space Packets based on data quality. It can be appreciated that if there are duplicates of received data, the Data Hub selects the highest quality data and discards the other duplicated data. After the processing of each Raw Product File (e.g. having a Cortex Data File format), the Data Hub generates a Data Hub Log regarding the received data, and sends this log report to the CPS/OMS. The Data Hub also combines sorted Space Packets into the original Downlink Files and, once the files are complete, creates and releases Level-0 Product Files.
The computer executable instructions further include the Data Hub creating a Data Quality Report for each Level 0 Product File. In case a Downlink File is missing from the Space Packets, the Data Hub waits to receive the rest of the file for a configurable period of time. This is in case the remainder of the file is being received at another Ground Station.
In cases when the Data Hub detects missing data, a time-out, or a number of Space Packets with errors exceeding a configurable data quality threshold, the Data Hub includes such conditions or errors in the Data Hub Log. After sending the Data Hub Log to the CPS/OMS, the Data Hub waits to receive a Workflow Control File from the OMS that will instruct the Data Hub to either wait for re-downlinking of Space Packets, or to release the file as is, or to discard the file. Error handling is further described below.
In an example embodiment, the Data Hub Log includes information about each file that is in the Expected File List. Each file is extracted from the Expected File List and is noted in the Data Hub Log, unless the file has been included in a previous Data Hub Log with a status of Complete or Failed. It can be appreciated that data, which does not pertain to a file from the Expected File List, is also included in the Data Hub Log. For each expected file, the Data Hub Log includes: DHU Data ID; file status (e.g. nothing received, incomplete (such as partially received data), completed, or failed); flags whether the SOF or EOF were found and whether a time-out has been reached; the number of expected Space Packets (approximate) extracted from the Expected File List; the number of received Space Packets with no errors; the number of received Space Packets with errors and a list of the Space Packet ranges for the erroneous packets; the number of missing Space Packets and a list of the Space Packet ranges for the missing packets; and other log and status information.
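For the duplicate-filtering step mentioned above, a minimal sketch is shown below; the packet tuple layout is an assumption made for illustration, since two ground stations may both capture portions of the same pass.

```python
def keep_best_packets(received):
    """Filter duplicate Space Packets, keeping the highest-quality copy.

    received: iterable of (seq_number, payload, has_errors) tuples,
    possibly containing duplicates. An error-free copy always replaces
    a copy with errors for the same sequence number. The tuple layout
    is illustrative, not taken from the source.
    """
    best = {}
    for seq, payload, has_errors in received:
        current = best.get(seq)
        if current is None or (current[1] and not has_errors):
            best[seq] = (payload, has_errors)
    return best
```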
In addition to the Data Hub 522 sending the Data Hub Log to the CPS 518 or OMS 514, or both, the Data Hub also sends a Data Quality Report to the Processing System 515, on a per-Level 0 Product basis. It is noted that Level 0 Products pertain to the downlink of raw data. A Data Quality Report (DQR) indicates errors in data. The Data Hub also sends the Data Hub Log and the Data Quality Report to the HMS 521.
A simplified state transition diagram for the handling of each Downlink File is shown in
The downlink file remains in the incomplete state if the SOF and EOF have been received by the Data Hub, and the data quality is not of sufficient quality 4207. The downlink file also remains in the incomplete state if the time out has been reached and the data quality is not of sufficient quality 4208.
The downlink file moves from the incomplete state 4202 to the completed state 4203 if at least one of several possible conditions is met. An example condition is that the SOF and EOF have been received and the data quality is of sufficient quality 4209. Another example condition is if the time out period has been reached and the data quality is of sufficient quality 4210. Another example condition is if the workflow control file indicates the command “release file” for the given downlink file 4211. When the downlink file is in a completed state 4203, the downlink file is released. For example, the downlink file is sent to the processing system 515.
The downlink file moves from the incomplete state 4202 to the failed state 4204 if the workflow control file indicates the command “discard file” for the given downlink file. In the failed state, the downlink file is discarded.
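The transition rules above can be condensed into a small function; the string state names and argument names are illustrative assumptions.

```python
def next_state(state, sof_eof_received, timed_out, quality_ok, command=None):
    """Transition rules for one Downlink File, following conditions
    4207-4211 described above. command is an optional Workflow Control
    File instruction; the boolean arguments are derived from the
    received Space Packets.
    """
    if state != "incomplete":
        return state                       # completed/failed are terminal
    if command == "release file":
        return "completed"                 # condition 4211
    if command == "discard file":
        return "failed"                    # file will be discarded
    if (sof_eof_received or timed_out) and quality_ok:
        return "completed"                 # conditions 4209 / 4210
    return "incomplete"                    # conditions 4207 / 4208
```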
Turning to
In general, the Data Hub is the system that detects a missing file, or missing/corrupted data and reports this condition to the rest of the system.
At the end of the planning session 4302, the CPS sends an expected file list to the Data Hub 4301, which outlines the expected list of files. From the Data Hub's perspective, the spacecraft sends downlink files via the X-band to one or more ground stations 4303, and the one or more ground stations send the raw product files to the Data Hub 4304. The one or more ground stations also send a post-pass report to the HMS 4305. Operations 4303, 4304 and 4305 occur for each downlink pass at each ground station 4306.
Continuing with
However, for files that have been received completely with acceptable quality 4312, the Data Hub sends Level 0 files and the Data Quality Report to the Processing System 4311.
The Data Hub generates and sends the Data Hub Log to the CPS and to the OMS 4313. After receiving the Data Hub Log, the OMS sends a request to re-downlink some space packets 4314, such as for the corrupted or missing space packets. The OMS sends the Workflow Control File to the Data Hub 4315 and sends a re-downlinking request to the CPS 4316.
During a next CPS planning session 4312, the CPS executes a planning session 4317 which takes into account the re-downlinking request, and then sends another expected file list to the processing system 4318. The CPS also sends another OCF to the mission control center 4319, where the OCF includes the re-downlinking request. The mission control center uplinks the OCF to the spacecraft 4320, and the space segment executes the commands of the OCF. The process of re-downlinking is repeated 4322, returning to operation 4303.
Ancillary Data Files and space segment Log Files are handled in a similar way as the downlink files, although the files may not be included in the Expected File List.
Assuming the new image take and video take data, and other data, has been completely received and is of sufficient quality, this raw data is archived. It can be appreciated that raw data, such as Level 0 files, may be provided by the Data Hub. Example types of Level 0 files include image data, video data, spacecraft ancillary data, and imaging system ancillary data. Data may also be provided to the processing system from an External Imagery Provider or the Web Platform, or both.
Turning to
In the process 4401, the Data Hub sends Level 0 files and a Data Quality Report to the Processing System. The Processing System pre-processes the Level 0 files 4405, converting them from the Level 0 downlink format to Level 1, which includes sensor models. The Processing System also submits the pre-processed files and the Data Quality Report to the Archiving System 4406 for storage.
The Orbit and Attitude System sends orbit data to the Processing System 4407. The Processing System pre-processes the orbit data to create an orbit file 4408, and sends this orbit file to the Archiving System 4409. In an example embodiment, the pre-processing of the orbit data includes compiling the data and filtering the data to make the data in the orbit file more accurate prior to sending the orbit file to the Archiving System.
In the process 4402, other imagery providers send raw imagery and ancillary data to the processing system 4410. The data may also include other product imagery and metadata. Non-limiting examples of metadata include Sun angle, geometry, cloud mask, black fill, bounding box, ownership and copyright information, conversion parameters, etc. The Processing System pre-processes this external data 4411 and submits the same to the archiving system 4412.
In the process 4403, the community data providers submit data to the Web platform 4413. The Web platform sends this data to the Web DS and CDN 4414 and further sends a notification about the community-sourced data to the Data Hub 4415. This process is shown in isolation in
After the image take data, or data from the MRC, has been received and archived, the example computer executable instructions of
The Processing System then sends a catalogue update request to the Archiving System 4505. The Processing System also sends an image take notification to the OMS 4506 and sends the processing by-products to the Calibration System 517.
In another aspect, after the video take data, or other data from the HRC, has been received and archived, the example computer executable instructions of
The Processing System performs pinpoint video processing 4604. It will be appreciated that a pinpoint herein refers to an area of interest within a polygon on a map. Afterwards, the Processing System sends the generated pinpoint video to the Web DS and CDN 4605. The Processing System also sends an image take notification or video take notification to the OMS.
The Processing System also performs map tile processing 4607. The map tile processing includes selecting the best tiles (e.g. using down-selection). Desired tiles, for example, are those images or tiles that are cloud-free, or have little cloud cover. The desired tiles are merged with the most recent map tiles on the Web, which includes retrieving them from the Web DS and CDN. In an example embodiment, tiles that are visible within a viewport of a browser (e.g. Internet browser) or application, and that are not already in a cache, are processed, rendered, and served to the viewport. The processed map tiles and metadata are then sent to the Web DS and CDN 4608. The processing by-products are sent to the calibration system 4609.
In another aspect of data processing, product generation is based on raw data already in the archive in response to a user order. The OMS 514 will send the Processing System 515 a Product Generation request. The Processing System will send two data retrieval requests to the Archiving System 516: one request for DEM data, and one for the raw data from the HRC or MRC. When the product generation is completed, OMS is notified, and the product is submitted to the PDS 526.
It can be appreciated that different product types are associated with different processing levels for the data. For customer products, Level 1B products are images that have been corrected for sensor projection and Level 2B products are images that have been corrected for orthographic projection using image processing algorithms. It is appreciated that currently known or future known image processing techniques can be used in combination with each other, or alone, to correct the orthographic projection.
It will also be appreciated that when product generation is completed, the Processing System will transfer the products to the PDS 526 and send a message to the OMS 514. The PDS 526 may then send the data to the customer's computing device, or initiate sending of the data product in another way.
Event-Based Image Tagging
Another approach to generate Acquisition Requests or to process the data is to tag the data based on events. Turning to
At block 4801, a computing device searches the Internet and online data sources for events. Currently known and future known techniques for Web scraping, data mining, analytics, semantic analysis, machine learning, and other automated search techniques, can be used with the principles herein to search and identify events. At block 4802, the computing device identifies events of interest. Non-limiting examples of interesting events include a sports event, a weather event, a natural phenomenon, a military conflict, a social event, etc. In an example embodiment, user input identifying desired events is used to identify whether an event is of interest (block 4811). For example, a user is interested in events related to hurricanes.
At block 4803, the computing device identifies metadata for each event (e.g. date, time, location, keywords, key phrases). At block 4804, the computing device ranks each event based on interest (e.g. interest to potential or current customers). In an example embodiment, the computing device applies ranking algorithms configured to analyze monetization, semantics, and usage patterns, and to also analyze trending topics to determine what imagery to display to the user. Such ranking algorithms may use k-means and z-score indexes for ranking. The ranking approaches may also include, for example, the use of Bayesian statistics, clustering, and other types of machine learning approaches. At block 4805, the instructions include identifying if a given event is in the past or in the future.
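As one simple instance of the z-score ranking mentioned above, consider the sketch below. The per-event 'mention_count' signal is a hypothetical stand-in (e.g. scraped online mentions); a real ranking would blend monetization, semantics, usage patterns, and trending topics.

```python
import statistics

def rank_events(events, signal="mention_count"):
    """Rank events by the z-score of an interest signal.

    events: list of dicts each carrying a numeric value under `signal`
    (an assumed layout). Adds a 'rank_score' field and returns the
    events sorted from most to least interesting.
    """
    values = [e[signal] for e in events]
    mean = statistics.mean(values)
    spread = statistics.pstdev(values) or 1.0   # guard against zero spread
    for e in events:
        e["rank_score"] = (e[signal] - mean) / spread
    return sorted(events, key=lambda e: e["rank_score"], reverse=True)
```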
If the identified event is in the past (e.g. a few minutes ago, a few hours ago, a few days, weeks, months or years ago, etc.), the computing device searches for images/videos from one or more databases (e.g. Archive System, Web DS and CDN, external database, or combination thereof) (block 4806). The sought-after images have the same location, date (and time) as the metadata of the given event. At block 4807, when the images matching the conditions of the metadata are found, the computing device associates the metadata, the event, and the ranking with each of the identified images/videos.
If the identified event is a future event, the computing device creates an Acquisition Request to capture image/video for the location, time and date (block 4808). At block 4809, the spacecraft captures the image(s) based on the Acquisition Request. The computing device stores the image(s) and associates the image(s) with the metadata, the event, and the ranking (block 4810).
Using the above process, the ground segment can automatically identify future events of interest and capture images of those events. Additionally, past events of interest are automatically identified, and the saved images are tagged so that users interested in such past events can easily find the saved images.
For each training query, the system searches the archive and generates a set of results satisfying the training query (block 4824). At block 4826, the system selects a subset of results. For each result in the selected subset, the system determines a rating based at least in part on one or more of a commercial value of the geospatial data and the associated metadata, a measure of a usage pattern of the geospatial data and the associated metadata, a measure of semantic assessment of the geospatial data and the associated metadata, or a trend associated with the geospatial data and associated metadata (block 4828).
For each result in the set of retrieved geospatial data and metadata, the system determines a score reflecting a relevance to the user of the retrieved geospatial data and associated metadata result, the score based at least in part on the user's geospatial query and the rating associated with the data and the metadata (block 4834). If, at block 4835, the system determines the score is greater than a configurable threshold, the system distributes the retrieved geospatial data and associated metadata to the user via the Web platform (block 4836).
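A minimal sketch of the scoring-and-release step of blocks 4834-4836 follows. The linear blend, the 'match' field (degree of fit to the user's geospatial query) and the 'rating' field (learned from the training queries) are illustrative assumptions.

```python
def release_results(results, query_weight=0.5, threshold=0.7):
    """Score retrieved geospatial results and return those above a
    configurable threshold for distribution via the Web platform.

    results: list of dicts with assumed 'match' and 'rating' fields,
    both on a 0..1 scale for this sketch.
    """
    released = []
    for r in results:
        score = query_weight * r["match"] + (1.0 - query_weight) * r["rating"]
        if score > threshold:
            released.append(r)    # distributed to the user
    return released
```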
Natural Environment and Sentiment Tagging
In another aspect of processing the image data, it is herein recognized that images can be difficult to search. The methods described herein provide a way to understand the content of the image, and to tag the image based on sentiment associated with the content of the image.
Turning to
At block 4903, based on the identified characteristics and a mapping table between the characteristics and sentiment values, the computing device identifies corresponding sentiment value(s). An example table is shown in 4905. For example, a sunny characteristic is associated with the sentiments happy and bright. In another example, a comet characteristic (e.g. which occurs when there is a comet in the image) is associated with the sentiment wishful or wish. In another example embodiment, the characteristic of a hurricane is associated with the sentiment values fierce and angry.
At block 4904, the computing device associates the sentiment value(s) and the characteristics, for example as metadata, with the image. In this way, when a user searches for “happy” images, the image of a sunny landscape appears. In another example embodiment, when a user searches for “wishful” images, an image of a comet appears.
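The mapping and tagging steps can be sketched directly from the examples given; the table rows come from the text, while the image/metadata layout is an illustrative assumption.

```python
# Characteristic-to-sentiment mapping, using the example rows given in
# the text; the full table at 4905 would contain many more entries.
SENTIMENT_TABLE = {
    "sunny": ["happy", "bright"],
    "comet": ["wishful", "wish"],
    "hurricane": ["fierce", "angry"],
}

def tag_image_sentiment(image, characteristics):
    """Attach the characteristics and their mapped sentiment values to
    an image as metadata, so a search for e.g. "happy" finds sunny
    scenes. The dict-based image layout is assumed for this sketch.
    """
    sentiments = []
    for c in characteristics:
        sentiments.extend(SENTIMENT_TABLE.get(c, []))
    image.setdefault("metadata", {})
    image["metadata"]["characteristics"] = list(characteristics)
    image["metadata"]["sentiments"] = sorted(set(sentiments))
    return image
```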
Order Management
In another aspect of the Earth observation system, using the systems described herein, a user can interact with the Web platform 524 to view and obtain image data.
Turning to
Turning to
The OMS sends a product notification to the user's computing device (e.g. via email, text, instant messaging, social network, or by a message on the Web platform) 5108. The OMS also sends a report about the product to the financial system for accounting purposes 5109.
The customer then uses their computing device to retrieve the product from the PDS 5110.
Turning to
When the OMS receives the quotation acceptance 5205, the OMS and CPS may, or may not, re-compute the feasibility analysis 5206 based on the period of time elapsed since the initial analysis 5207. Then, the planning and uplinking operations 5208 and the downlinking and processing operations 5209 are executed. These operations were described above.
Turning to
It can be appreciated that there are various types of customer products that can be ordered. Table 10 describes some of the types of data to be ordered through the OMS or through the Web platform. Other types of data and services are made available through the Web platform.
The above products may include processing of the collected data to a specified level.
Other possible products and services include: image/video products; standard image and video products; super resolution images; stereo pairs; change detection; change monitoring reports; activity monitoring reports; mosaics; 3D city models; Digital Elevation Models (DEMs) which describe terrain; Digital Surface Models (DSMs) which describe terrain and any features on the terrain (e.g. trees, buildings, etc.); subsidence monitoring reports; sightlines for buildings and signage; annotated images and videos; broadcast-ready videos (e.g. edited to include content from other sources); large area background collections; and calibration collections.
Although the data products are configured to be sent to the customer over wired or wireless networks, or both, it is recognized that the customer may desire to have the data product on a physical memory media (e.g. USB key, DVD, CD, etc.). Therefore, the ordering system may include automatic data storage processes to automatically store data onto a physical memory media and automatically ship or mail the same to the customer.
In another aspect of the ordering process, the contents of an order are described here. The order includes general order information, such as the customer information (e.g. identification, contact information, etc.) and the information the customer or the application needs. Non-limiting examples of the information requested by the customer or the application include imaging or processing parameters, or both, such as sun angles, the type of product, the areas of interest, time constraints and privacy requirements.
The order also includes a tasking request. There may be zero, or one, or multiple tasking requests in an order. Each tasking request includes: an area of interest (e.g. geographic coordinates or attached Shape/KML file); priority level; time frame (e.g. earliest and latest time that the imagery is allowed to be acquired); a camera selection (e.g. MRC, HRC, other cameras); and the length of video (e.g. number of frames or time), where video data is being ordered.
The tasking request may also include other constraints, for example, including: quality level or maximum compression ratio; the maximum acceptable cloud cover; sun elevation and azimuth angles; incidence elevation and azimuth angles (e.g. for the HRC); preferred ground station for downlinking; and security access settings.
The order may also include a product request. There may be zero, or one, or multiple product requests in an order. The product request, for example, includes: identification of raw imagery; processing level; a resampling kernel (e.g. normal or sharpen); and data specifying geolocation accuracy (e.g. systematic or precision).
The OMS or the Web platform may provide weather forecast information in relation to a product order, which can be used to assist a customer in assessing the cloud cover risks within the time period and location of interest.
Turning to
Each acquisition request results in 1 to N acquisitions (e.g. image takes or video takes) 5406. Each acquisition results in an on-board file (e.g. a file on the space segment) 5407. Each on-board file results in an image/video product file of Level 0 5408.
Turning to
Turning to
Turning to
In another aspect of the ordering process, the Web platform 524 collects inputs from the public users 54. The number of inputs could be in the many thousands. The inputs are filtered by a combination of automatic means to arrive at a shorter list of specific requests that will be turned into orders. The algorithms to be employed to select orders from the public consider various factors, including popularity voting, lottery, rewards for activity, etc.
For example, the Web platform 524 is configured to receive votes for areas of interest over which the space segment will collect imagery data. Areas with the most votes are selected for imaging.
In another example for selecting an order, a public user who has gained an online status above a certain threshold (e.g. absolute or relative threshold) is granted access by the Web platform 524 or the OMS 514, or both, to put in a specific collection request. For example, by earning a sufficient number of reward points for interactions with the Web platform, the status of a particular user increases.
In another example for selecting an order, potential collection opportunities are presented to public users and are auctioned to the highest bidder.
Turning to
Turning to
As noted above, users may browse a catalogue database through the Web platform to search for raw image data and raw video data. In an example embodiment, a GUI, separate from the Web platform for public users, is provided for customers to view the catalogue. In an example embodiment, the catalogue displays everything that is available for retrieval or purchase.
The catalogue system provides a Web accessible interface that allows users to search/browse for raw image data (e.g. each dataset corresponds to one MR Image Take) and raw video data (e.g. each dataset corresponds to one HR Video Take).
The catalogue system can be searched by several criteria. Non-limiting example search criteria include: geographic area (e.g. either polygon or screen extent; the polygon inputted by the user via a GUI); a time range; a specific camera; incidence elevation angle; sun elevation angle; compression ratio; and other data.
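A compact sketch of such a multi-criteria filter is given below. The dataset field names are illustrative, and a production system would use a database query with true polygon geometry rather than the simple bounding-box overlap test assumed here.

```python
def search_catalogue(datasets, area=None, time_range=None, camera=None,
                     max_incidence=None, min_sun_elevation=None,
                     max_compression=None):
    """Filter catalogue datasets by the criteria listed above; any
    criterion left as None is ignored. Footprints and the search area
    are (min_lon, min_lat, max_lon, max_lat) boxes in this sketch.
    """
    def bbox_overlap(a, b):
        # Axis-aligned bounding-box overlap; a stand-in for real
        # polygon geometry from a GIS library.
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    def matches(d):
        if area is not None and not bbox_overlap(d["footprint"], area):
            return False
        if time_range is not None and not (time_range[0] <= d["time"] <= time_range[1]):
            return False
        if camera is not None and d["camera"] != camera:
            return False
        if max_incidence is not None and d["incidence"] > max_incidence:
            return False
        if min_sun_elevation is not None and d["sun_elevation"] < min_sun_elevation:
            return False
        if max_compression is not None and d["compression"] > max_compression:
            return False
        return True

    return [d for d in datasets if matches(d)]
```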
The catalogue system's GUI is configured to display a geographic view of footprints of datasets on a world map, and optionally also browse imagery overlaid on the map. The GUI is also configured to allow a user to browse images and to display a tabular view of datasets that match the search criteria. The GUI also displays a listing of all metadata for datasets as they are selected by the user. The GUI is also configured to receive user input selections for user desired datasets, and is further configured to save the list of user selected datasets for later use outside of the catalogue, for example placing an order with the OMS 514.
User Interfaces
Turning to
In another example embodiment, the Web platform identifies the location of the user viewing the login page, for example based on the user device's IP address. The Web platform then displays streaming video that shows the location of the user. The video data may be recent. In this way, the user is engaged with the relevant data on the Web platform even from the login page.
Continuing with
The login GUI 6001 also includes text fields 6005 and 6007 to receive the user's email and password. Other types of login identifiers, in alternative or in addition to an email address, may be used. A control 6006 is configured to, when selected by the user, initiate a password recovery process. After the text fields 6005, 6007 have been populated with data, and the user selects the login control 6008, the main Web page is displayed.
Turning to
The dashboard GUI 6101 includes an interactive map display 6107 that shows the current locations 6111 and 6112 of one or more spacecraft, in this case two spacecraft. The orbit paths are also shown, both recently traversed orbits and orbits to be soon traversed by the spacecraft. The map display 6107 also shows pinpoint controls 6114, 6113 and 6115 located on the map. The pinpoint control 6114 indicates imagery was collected at a location in North America. The pinpoint control 6113 indicates that video imagery was obtained for a specific location in Europe. Similarly, pinpoint control 6115 indicates that video imagery was obtained for a specific location in Australia. The pinpoint controls are selectable by a user. When a given pinpoint control is selected, the data associated with the given pinpoint control is displayed.
The interactive map 6107 also shows the current night time regions 6108 of the Earth and the current day time regions 6118. The night regions are displayed with a darker shadow, while the day time regions are displayed more brightly. The interactive map 6107 is continuously updated to show the current location(s) of the spacecraft and the current day and night time zones. Furthermore, as new imagery data or event data is made available, the interactive map 6107 is updated to show new pinpoint controls for the new imagery or event data.
The interactive map 6107 also includes a distance scale 6110 and a zoom in/out control 6117.
When the user scrolls down 6119 on the dashboard GUI 6101, a timeline 6201 is shown, as per
In
The timeline also includes an indicator of the year 6206. In this example embodiment, the most recent events of the timeline are shown near the top, but can be reversed in another example embodiment.
The timeline also includes an imagery icon 6207, a description of the imagery 6208, and a time and date of when the imagery was acquired 6211. A map 6209 is displayed in association with the imagery icon 6207, and the map 6209 includes a marker 6210 that indicates the location of the image on the Earth.
The timeline also includes an event icon 6212, a description of the event 6213, and the time and date 6216 of the event. A map 6214 displayed in association with the event icon 6212 includes a marker 6215 showing the location of the event on the Earth.
Another set of data 6217 specific to a place is also included on the timeline. It can be appreciated that many instances of events, video, imagery, places, and other types of data can be shown on the timeline.
Returning to
Turning to
Similarly, the position of the other spacecraft 6111 is displayed. The already travelled orbit path of this other spacecraft is shown by the solid line 6301 and the orbit path to be travelled is shown by the dotted line 6302. It can be appreciated that, by displaying where a spacecraft has recently travelled, a user can determine where and when recent images have been acquired by the spacecraft.
When the Web platform detects a user input 6305 selecting the pinpoint control 6115, the Web platform displays the data associated with the pinpoint control 6115, which is shown on the timeline 6201. Turning to
In
When the Web platform detects a user input 6401 selecting the live control 6103, the Web platform displays the live GUI, such as shown in
Turning to
In another example embodiment, the GUI 6501 includes various controls to allow the user to view past streaming imagery.
The streaming video section 6502 includes controls 6503 and 6504 to increase and decrease the zenith viewing angle of the Earth. Controls 6512 and 6513 may be used to increase and decrease the zoom setting of the streaming video. Control 6514 is used to collapse the secondary interactive maps 6505 and 6506, thereby increasing the available space for displaying the streaming video section 6502. The same control 6514 can be used to restore the display of the secondary interactive maps 6505 and 6506, as currently shown in
The extended secondary interactive map 6505 shows the orbit path of the spacecraft over many orbital cycles around the Earth. This is like a timeline. The map 6505 shows the map of the Earth, but linearly repeated as a pattern. This shows the repeated orbiting path of the spacecraft around the Earth. The position of the spacecraft 6517 in relation to the orbit path 6507 is shown, and that position 6517 is associated with the currently displayed video image 6502. When a user moves a pointer 6508 or selects the position 6517, a pop-up message 6509 is displayed indicating the time of when the video image was collected. The time may be an absolute time, or may be a relative time indicating how long ago from the current time the image was collected. In the example in
The localized secondary interactive map 6506 shows a zoomed-in view of the position of the spacecraft 6518 along the orbit path 6515. A band 6519 encompassing the orbit path 6515 is also displayed. The band 6519 shows the width of the imagery captured by the spacecraft. The controls 6510 and 6511 can be used to zoom in and out of the localized secondary interactive map.
It will be appreciated that the location of the spacecraft in the secondary interactive maps 6505 and 6506 is continuously updated as the spacecraft moves along the orbit path. At the same time, the streaming video 6502 is continuously updated to reflect or match the position of the spacecraft. In other words, the streaming video shows what is being viewed by a camera on the moving spacecraft, and the location of the moving spacecraft is correspondingly being continuously updated in the secondary interactive maps 6505 and 6506.
Turning to
When the Web platform detects that the user has selected or positioned a pointer over another location 6601 on the orbit path 6507, the Web platform displays a line 6603 indicating the cross section in time of the orbit. The Web platform also displays a pop-up message 6602 indicating that at that other location 6601 of the orbit, imagery was captured at a certain time (e.g. 9 hours and 6 minutes ago from the current time), and provides a status regarding the imagery. In the example in
It will be appreciated that selecting the position 6601 will cause the GUI 6501 to display video imagery at that position and time along the orbit 6507, and the location of the spacecraft along the orbit path 6507 will move to the position 6601.
Turning to
When the Web platform detects that the user has selected or positioned a pointer over another location 6702 on the orbit path 6516 in the local secondary interactive map 6506, the Web platform displays a line 6701 highlighting the linear position along the orbit 6516. The Web platform also displays a pop-up message 6703 indicating that at that other location 6702 of the orbit, imagery was captured at a certain time (e.g. 9 hours and 58 minutes ago from the current time), and provides a status regarding the imagery. In the example in
In the example, the Web platform receives a user input selection 6704 with respect to the other position 6702. In response, the Web platform displays the GUI shown in
Turning to
It can be appreciated that the interactive secondary maps 6505 and 6506 are configured to allow a user to control the display of streaming video by selecting a location and a time along the orbit.
When the Web platform detects that the user has selected or hovered over the control 6503 to increase the elevation viewing angle, the pop-up message 6801 is displayed indicating the action to increase the zenith viewing angle. After detecting the user input selection 6602, the GUI in
In
After the Web platform detects the selection input 6901 with respect to the explore control 6104, the GUI in
In
For the polygon tool, the user can select several points to create a polygon area over a section of the map. The Web platform then searches for images, data, and events located within the polygon. The polygon tool therefore provides an intuitive and visual way to search a map, even if the name of a place is not known or is not provided.
Turning to
Turning to
The capture schedule 7201 shows the date, time, weather forecast and/or illumination information for upcoming image captures for the searched city of San Francisco, USA. The schedule also shows the estimated percentage (%) chance that the image will be captured. This information is based on the predicted orbit path of the spacecraft, and the weather forecast associated with the positions and times of the spacecraft. The user can scroll through the dates using the controls 7204.
In particular, the example capture schedule shows an entry 7204 indicating that on Tuesday, October 3rd, the spacecraft will capture an image of San Francisco at 2:30 pm. The weather at that time will have an 80% chance of being sunny, and there will be a 99% chance of capturing the image. Other entries 7206, 7207, 7208, 7209, 7210, and 7211 show various values for similar types of data.
The GUI 7001b also shows several orbit paths 7212, 7213, 7214 that are overlaid on the map of San Francisco and the surrounding area. In this example, the user's pointer 7215 is positioned along a certain location 7216 of the orbit path 7212. The location 7216 is visually indicated with a line across the path 7212, and the path 7212 is highlighted as well to visually distinguish itself from other orbit paths. A pop-up box 7217 is also displayed and it includes a message indicating the predicted date and time the spacecraft will be passing over that location 7216 (e.g. Wednesday, October 4th at 4:25 pm). The pop-up box 7217 also includes the resolution of the expected imagery, which in this example indicates a 5 meter resolution. Furthermore, the scheduled imagery corresponding to the position 7216 indicated by the user's pointer is also highlighted in the capture schedule 7201. In this example, the entry 7210 is highlighted and it indicates the same date and time as the pop-up box 7217, and the entry 7210 further indicates that the weather forecast for that date and time will be foggy. The entry 7210 also indicates a 98% chance that the spacecraft will be able to capture the image. It can be appreciated that this type of GUI can be used to place orders for acquisition requests.
Turning to
The Web platform uses these search criteria to show only those images that meet the selected criteria.
An option box 7305 allows a user to determine if image and video data from all data providers should be displayed or only data from select providers. In this case, all image providers have been selected. Thus, images from 3rd parties will be shown in addition to those images acquired from the spacecraft 100.
Entries 7306, 7307, 7308, 7309 are displayed as archived results corresponding to the searched location (e.g. San Francisco), and according to the search criteria set using controls 7301. Each entry includes an option box indicating whether or not the data from the entry is selected for downloading or detailed viewing, an indicator of which company or organization provided the data, an image resolution value, a specified camera sensor, an image spectrum value, and an indicator of whether the data is image data, video data, or both. For example, entry 7306 shows the following information: the option box is checked; the image data is provided by the company Urthecast; the image resolution is 5 m; the spectrum is 4-band Red Blue Green and Near-Infrared (RBGN); and the data includes image data only.
In an example embodiment, all the entries matching the search criteria are shown and the user can uncheck the option boxes to remove certain search result entries.
A control 7310 shows the percentage of the searched area that is covered by the available archived images. In this example, using the archived images, 100% of the area of San Francisco is covered or imaged.
The map of San Francisco is also overlaid by a mosaic of small rectangles 7311. Each rectangle represents an image or video, and is positioned on the map to indicate the location shown in the image. The images shown in the mosaic 7311 relate to those images returned from the search criteria (e.g. location, sun angle, month, cloud cover, providers, and selected archived entries).
Continuing from
Several entries 7306a and 7307a are no longer selected, while other entries 7308a and 7309a remain selected. The mosaic of rectangles representing the images 7311a is now less dense. The control 7310 still indicates that with the selected data sets, 100% of San Francisco is covered.
In
Turning to
The selection of image processing settings is shown in
Turning to
Processing tools are displayed in the GUI 7001f, which can be used to further process the raw image data. For example, the tools include a tile tool 7607 which, when selected, generates an image as a set of tiles. A mosaic tool 7608, when selected, compiles multiple images together to form a mosaic-like image. A time lapse tool 7609, when selected, compiles images of the same place, but of different times, into a time lapse video. A change tool 7610, when selected, identifies where features in an image have changed. For example, the position of boats, the existence of buildings, the shape of coastlines, or other features may have changed, and the tool 7610 will generate a visual or written report, or both, indicating the change. The change tool 7610 employs image recognition and image processing techniques. A vegetation index tool 7611, when selected, analyzes the images to compute a vegetation index value. For example, this can be based on the color of the images (e.g. green) and the texture of the images (e.g. forests have certain textures when imaged from above).
These image processing options are available to be selected. The controls 7613 can be used to browse through and select other processing tools. The control 7612 allows the user to add the selected processing options to the cart for checkout and purchase.
Turning to
Turning to
The GUI includes a map 7806 showing where the event occurred. Various controls 7803, 7804, 7805 to link or share data about the event over social data networks are provided.
The GUI 7801 also includes a timeline 7807 tracking different times for the event. The timeline includes certain dates 7807, 7808, 7810, each of which is a control. The control of August 6 7807 is currently selected, and thus the image 7806 is of the hurricane event captured on August 6th.
An overview 7819 of the event is displayed. The tabs 7813, 7814, 7815, 7816, and 7817 allow a user to respectively view overview data, photos, videos, social data, and news, all of which are related to the event. Control 7811 allows a user to contribute data (e.g. image data, text data, etc.) to the event page. Another control 7812 allows the user to share the data from the event page.
When a user selection input 7818 is received with respect to control 7810 along the timeline, the GUI is updated as per
Turning now to
When a user selection input 7902 is received with respect to the control tab 7814 for photos, the GUI in
In
When a user selection input 8001 is received with respect to the control tab 7815 for videos, the GUI in
In
When a user selection input 8101 is received with respect to the control tab 7816 for social data, the GUI in
In
When a user selection input 8201 is received with respect to the control tab 7817 for news data, the GUI in
In
Turning to
The image tiles show the icon 8404 and the summary of tile captures 8405. The detailed breakdown of the captures 8406, 8407 and 8408 are also shown. Each capture may show the date and time of the capture, the location of the capture (e.g. by coordinates), and the associated price.
The time lapse data includes a summary of the time lapse 8409, such as the number of frames. The detailed view of each time lapse 8410 is also shown, and it displays the time and date, location, and the associated price.
The GUI shows the totalled cost information 8411 and includes a “checkout” button 8411 to purchase the products.
In other example embodiments, the GUI features shown in
Turning to
Turning to
Turning to
Turning to
In
It will be appreciated that systems and methods, including computer algorithms, are provided herein relating to remote sensing. An Earth observation platform is also provided, which can obtain imagery, video, and other remote sensing data of the Earth or of objects intentionally placed into orbit around planetary objects. The remote sensing data may also be obtained from the International Space Station, other manned vehicles (spacecraft, aircraft), or unmanned vehicles (UAVs, spacecraft probes). A sensor captures observation data and transmits the data to ground stations on the Earth. The ground stations receive the Earth observation data. An archiving system stores the sensor observation data. Customers or users use an order management system to place orders for the observation data, which specify processing parameters for the Earth observation data. Based on the orders, a processing system retrieves the Earth observation data from the archiving system and processes the Earth observation data according to the parameters to generate an Earth observation data product. This system provides unique tools for searching, browsing, and analyzing the data, as well as capabilities for interacting with the system through an API. The system is configured to combine observation data (e.g. remote sensing data) from sources produced internally by the observation platform and by third parties.
The elements in the GUIs described or shown herein are just examples. There may be many variations to these GUI elements without departing from the spirit of the invention. For instance, buttons, images, graphs, and other GUI controls may be displayed and operated in a differing order, or buttons, images, graphs, and other GUI controls may be added, deleted, or modified. The teachings of U.S. provisional patent application Ser. No. 61/911,914 are incorporated by reference herein in their entirety.
The steps or operations in the flow charts described herein are just examples. There may be many variations to these steps or operations without departing from the spirit of the invention. For instance, the steps may be performed in a differing order, or steps may be added, deleted, or modified.
Although the above has been described with reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art as outlined in the appended claims.
A portion of the disclosure of this patent document contains material which is subject to (copyright or mask work) protection. The (copyright or mask work) owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all (copyright or mask work) rights whatsoever. This application is a continuation of U.S. patent application Ser. No. 15/101,336, filed on Jun. 2, 2016, which is a 371 application of Patent Cooperation Treaty Patent Application No. PCT/US2014/068645, which claims priority to U.S. Provisional Patent Application Ser. No. 61/911,914; each of these applications is hereby incorporated by reference in its entirety.
References Cited:

Number | Name | Date | Kind
9648075 | Kalinke | May 2017 | B1
20070112689 | Brown | May 2007 | A1
20080140348 | Frank | Jun 2008 | A1
20140149372 | Sankar | May 2014 | A1

Publication Data:

Number | Date | Country
20220164376 A1 | May 2022 | US

Related Provisional Application:

Number | Date | Country
61911914 | Dec 2013 | US

Related U.S. Application Data:

Relation | Application Number | Country
Parent | 15101336 | US
Child | 17585320 | US