The present invention relates to the field of remote sensing, and in particular to a system and technique for high-altitude remote sensing using an airborne vehicle.
A need to monitor critical infrastructure or other areas of high importance has driven the development of innovative solutions for remote sensing. Significant efforts have been put into attempts to find cost-effective surveillance technologies that could help organizations find and manage problems in a faster, more efficient way. To date, however, surveillance technology has remained slower and more expensive than would be desirable, limiting the ability to inspect and effectively manage critical zones.
In one general aspect, a high-altitude remote sensing system comprises a powered autonomous unmanned aerial vehicle capable of vertical takeoff and ascending to a predetermined altitude of 60,000 to 100,000 feet, comprising: a takeoff propeller configured for taking off from a ground location; an ascent propeller configured for ascending to the predetermined altitude after takeoff; and a cruising propeller configured for cruising and station-keeping after ascent to the predetermined altitude; and a remote sensing electronics package disposed with the autonomous unmanned aerial vehicle.
In a second general aspect, a method of remote sensing comprises: provisioning an autonomous unmanned aerial vehicle with a remote sensing electronics package; taking off the autonomous unmanned aerial vehicle vertically from a ground location using a takeoff propeller; ascending the autonomous unmanned aerial vehicle after takeoff to a predetermined altitude using an ascent propeller, wherein the predetermined altitude is between 60,000 feet and 100,000 feet; flying the autonomous unmanned aerial vehicle autonomously over a target area using a cruising propeller; and capturing remote sensing imagery in flight by a remote sensing electronics package disposed with the autonomous unmanned aerial vehicle.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an implementation of apparatus and methods consistent with the present invention and, together with the detailed description, serve to explain advantages and principles consistent with the invention. In the drawings,
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention may be practiced without these specific details. In other instances, structure and devices are shown in block diagram form to avoid obscuring the invention. References to numbers without subscripts are understood to reference all instances of subscripts corresponding to the referenced number. Moreover, the language used in this disclosure has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter. Reference in the specification to “one embodiment” or to “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment of the invention, and multiple references to “one embodiment” or “an embodiment” should not be understood as necessarily all referring to the same embodiment.
Although some of the following description is written in terms that relate to software or firmware, embodiments can implement the features and functionality described herein in software, firmware, or hardware as desired, including any combination of software, firmware, and hardware. References to daemons, drivers, engines, modules, or routines should not be considered as suggesting a limitation of the embodiment to any type of implementation. The actual specialized control hardware or software code used to implement these systems or methods is not limiting of the implementations. Thus, the operation and behavior of the systems and methods are described herein without reference to specific software code, with the understanding that software and hardware can be used to implement the systems and methods based on the description herein.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, or the like.
Although particular combinations of features are recited in the claims and disclosed in the specification, these combinations are not intended to limit the disclosure of various implementations. Features may be combined in ways not specifically recited in the claims or disclosed in the specification.
Although each dependent claim listed below may directly depend on only one claim, the disclosure of various implementations includes each dependent claim in combination with every other claim in the claim set. No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such.
Various types of remote sensing techniques have been used to date. Various parties have used satellites, piloted drones, trucks, airplanes, and combinations of those systems. Drones require a skilled drone pilot to travel from place to place, launch the drone and pilot it in the air, then recover the drone. The data collected by the drone must then be downloaded and analyzed. Because the types of drones used in such a system have significantly limited flight-time endurance, the area that can be examined by a drone in a single flight is necessarily also significantly limited. In addition, the time and cost of hiring a drone pilot and transporting the drone pilot from place to place are significant. For example, the pilot currently has to drive to the drone landing spot, which takes a significant amount of time.
Truck-mounted sensing systems are simpler, typically requiring only a truck driver with sufficient training to operate the truck-mounted sensing equipment. Visual observers may also drive or ride in the trucks, but such observation is less thorough. However, the range of truck-mounted sensing equipment is low, the truck is typically limited to areas with good roads, and the time required to drive the truck from site to site can be extensive.
Satellite-based remote sensing systems are highly expensive, with significant infrastructure required to manage the satellite while in orbit. Although satellite remote sensing systems have increased their capabilities since the earliest Landsat satellites were launched in the 1970s, the resolution of remote sensing satellites with a high revisit rate is still coarser than desired, while remote sensing satellites with better resolution typically have a prohibitively low revisit rate.
Aircraft flying at low altitudes providing aerial surveillance have been in use for decades and can provide high-resolution sensing capability. However, a single aircraft flying at a low altitude can cover only a limited area at any time. In addition, the costs of the aircraft and of skilled pilots are high.
The desired approach is to get high-resolution sensing of large areas at the lowest possible cost. In one embodiment, a high-altitude remote sensing system uses a high-altitude autonomous unmanned aerial vehicle (UAV) that can take off from the ground without the assistance of another vehicle and ascend to high altitudes, where it can cruise over a predetermined target area while collecting remote sensing data. In some embodiments, the UAV can be an unpowered UAV that can be taken to a high altitude by a balloon or other vehicle and then launched at the desired cruising altitude. In other embodiments, the UAV can be a powered UAV that can take off from the ground without the assistance of another vehicle. For purposes of this discussion, a UAV is considered powered if it includes onboard motive power for ascent, landing, or cruising over a target area and unpowered if it includes no onboard motive power, even though it may contain sources of electrical power for operation of onboard navigation or remote sensing equipment. In some embodiments, the UAV may be a powered UAV that is taken to a desired altitude by another vehicle, then cruises using onboard motive power. Preferably, the UAV is an autonomous vehicle that operates without a human pilot directing its operation remotely.
For purposes of this discussion, “high altitude” is considered to be in the range of approximately 60,000 feet to 100,000 feet.
Remote sensing equipment may be mounted interior to the UAV 100 or on the exterior of the UAV 100 or both as desired, such as internal to or on the exterior of the wings 120 or fuselage 130, as desired. The remote sensors may comprise one or more remote sensors of any desired type, including infrared, optical, electro-optical, synthetic aperture radar, multispectral, or hyperspectral cameras. In some embodiments, the remote sensors and avionics for control of the UAV 100 may be housed in a remote sensing pod 230 or other structure that can be insulated from extreme temperatures and made waterproof. The remote sensing pod 230 may be made of fiberglass or other desired material. One or more onboard data storage devices may also be housed in the remote sensing pod 230 for storing data collected in flight by the remote sensing equipment.
The remote sensing equipment sensors are preferably oriented in a nadir position, but can also be oriented in an oblique position.
Avionics and other relevant electronics for controlling the flight of the aircraft may be included in the remote sensing pod 230, a separate pod 240, or mounted directly in the fuselage 130 of the UAV 100. The avionics and other relevant electronics may include an electronic speed controller (ESC) for one or more electric motors 210, servo motors, a detect and avoid system, an Automatic Dependent Surveillance-Broadcast (ADS-B) transmitter, high precision Global Positioning System (GPS), Real-Time Kinematics (RTK), or Global Navigation Satellite System (GNSS) systems and antenna, and any other aircraft control systems. In some embodiments, real-time data transfer to a ground receiver may be enabled by including a transmitter and antenna for radio connections, such as a long-distance local network connection. Airspeed sensors may be used as part of an autopilot control system for controlling the flight of the UAV 100.
In the embodiment illustrated in
In other embodiments, the UAV 100 may take off or land using a runway (e.g., an airport runway) as with conventional airplanes. In such an embodiment, the UAV 100 may include an undercarriage comprising wheels, skids, pontoons, supporting struts, or other structures that are used to keep it off the ground or above water when it is not flying.
The UAV 100 illustrated in
In embodiments in which the UAV 100 is an autonomous vehicle, a flight path may be preprogrammed before launch or a flight path may be communicated from a ground control station to the UAV 100 via radio from an automatic tracking antenna. An onboard flight computer and autopilot software may then control the flight path of the UAV 100 over the target area. In some embodiments, an optional pilot control system may allow a ground-based pilot to control the UAV 100 as desired, such as in the event of a failure or malfunction of the autopilot. A navigation system, such as a GPS navigation system may confirm the location of the UAV 100 and initialize data collection by the remote sensing equipment once the UAV 100 is over the target area. In some embodiments, an integrated navigation system can consolidate multiple inputs, compare the positions, remove outliers, and output a single position to provide a more resilient basis for navigation of the UAV 100.
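The consolidation of multiple navigation inputs described above can be illustrated with a minimal sketch. The function name `fuse_positions`, the use of a componentwise median for outlier rejection, and the deviation tolerance are illustrative assumptions, not part of the disclosed design:

```python
from statistics import median

def fuse_positions(fixes, max_deviation=0.001):
    """Consolidate (lat, lon) position fixes from multiple sources
    (e.g., GPS, RTK, GNSS), discard outliers that deviate from the
    componentwise median by more than max_deviation degrees, and
    return a single averaged position."""
    med_lat = median(lat for lat, _ in fixes)
    med_lon = median(lon for _, lon in fixes)
    inliers = [(lat, lon) for lat, lon in fixes
               if abs(lat - med_lat) <= max_deviation
               and abs(lon - med_lon) <= max_deviation]
    if not inliers:
        # Fall back to the median itself if every fix was rejected.
        return (med_lat, med_lon)
    return (sum(lat for lat, _ in inliers) / len(inliers),
            sum(lon for _, lon in inliers) / len(inliers))
```

A faulty receiver reporting a position far from the other sources would be excluded from the average, providing the more resilient single-position output described above.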
Because the UAV 100 is a low-weight aircraft with a high glide ratio, the UAV 100 and remote sensing equipment may stay aloft for long periods, such as over 10 hours, before needing to land. This allows the UAV 100, in the event of cloud coverage over the target area that would prevent obtaining clear remote sensing imagery, to loiter over the general target area until the cloud coverage has cleared sufficiently that clear imagery is available.
Once the remote sensing system has completed data capture, the UAV 100 may descend while flying to a predetermined landing zone where the UAV 100 may be recovered and remote sensing data that is stored onboard may be transferred to a ground-based computer for processing as described below. In the event of an uncontrollable descent or any other major malfunction of the UAV 100 that cannot be corrected, embodiments may provide a backup parachute 2500 that can be deployed to bring the UAV 100 down at a safe speed. Geospatial data obtained from the navigation system may be attached to the remote sensing imagery.
The data collected from the remote sensing equipment on the UAV 100 may be inspected individually or processed using algorithms to join the raw data captures (multispectral, hyperspectral, optical, etc.) and stitch the imagery into a panoramic view of the target area for inspection. In addition, the data may be processed to determine changes in the state of the target area or the area surrounding the target area, by referencing previous results to detect changes along a right-of-way, such as vegetation growth or death, hydrocarbon leakage, or any other intrusion or disturbance.
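The change-detection step described above can be sketched as a comparison of a current capture against a previously referenced capture of the same area. This is a minimal illustration, assuming the imagery has already been aligned and normalized; the function name `detect_changes` and the grid representation are hypothetical:

```python
def detect_changes(reference, current, threshold=0.2):
    """Compare two aligned image grids (lists of rows of normalized
    pixel intensities) and return the coordinates of cells whose
    absolute difference exceeds the threshold, i.e., candidate
    disturbances along the right-of-way."""
    changed = []
    for r, (ref_row, cur_row) in enumerate(zip(reference, current)):
        for c, (ref_px, cur_px) in enumerate(zip(ref_row, cur_row)):
            if abs(cur_px - ref_px) > threshold:
                changed.append((r, c))
    return changed
```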
In this example, each of the eight cameras 310A-H are connected via a connector to one of a pair of hubs 320A-B. The hubs 320A-B are then connected to an interface card 330 that provides a connection to a computer 340. Although illustrated as an external card in
The number of cameras 310A-H and hubs 320A-B is illustrative and by way of example only, and any type or number of cameras or hubs may be used as desired, such as to fit into a desired form factor for the camera array. Any convenient type or types of connectors and communication protocols can be used as desired, such as Universal Serial Bus Type C (USB-C). The computer 340 may be any type of device capable of connecting to the cameras 310A-H for collecting the data. In some embodiments, the data is simply collected by the computer 340, then made available for later analysis by other computers or other devices. In other embodiments, the data collected by the computer 340 may be processed or analyzed in real-time during flight, and the analysis used to guide the path of UAV 100 or to provide any other useful guidance to an operator of the sensing system 300. In some embodiments, the data collected by the computer 340 is continuously processed in situ and stored on the computer 340 or another device in the UAV 100 from which the data may be downloaded after the flight. In some embodiments, the data may be transmitted while in flight to a ground station via a wireless network, a satellite data network, or a mobile telephone data network such as a 4G or 5G data network. Although illustrated in
As described above, an onboard computer 340 may process the captured remote sensing imagery in flight. The computer 340 may be attached to both the onboard flight computer and the cameras 310 to get all required information.
Prior to any information being processed, a targeted pipeline's geographic data may be loaded onto the aircraft's onboard computer 340 along with the flight plan. Using this information, the computer 340 may be programmed to begin processing when the pipeline is in the line of sight of the cameras 310. While in flight, a lightweight object identification program on the aircraft may assign to each image 505 a likelihood that there are objects in the pipeline's right-of-way. This program may use the pipeline's geographic location along with a lightweight object detection program. This results in a set of images on the hard drive 515 along with associated object likelihoods.
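The triggering and tagging behavior described above can be sketched as follows. The circular ground-footprint approximation, the function names, and the planar (x, y) coordinates are simplifying assumptions for illustration only:

```python
def pipeline_in_view(camera_pos, pipeline_points, footprint_radius):
    """Return True if any pipeline waypoint (x, y) lies within the
    camera's ground footprint, approximated as a circle of the given
    radius around the point directly below the aircraft."""
    cx, cy = camera_pos
    return any((px - cx) ** 2 + (py - cy) ** 2 <= footprint_radius ** 2
               for px, py in pipeline_points)

def tag_images(captures, pipeline_points, footprint_radius, score_fn):
    """For captures taken while the pipeline is in view, attach a
    likelihood computed by score_fn; other captures get likelihood 0."""
    tagged = []
    for pos, image in captures:
        if pipeline_in_view(pos, pipeline_points, footprint_radius):
            tagged.append((image, score_fn(image)))
        else:
            tagged.append((image, 0.0))
    return tagged
```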
Once on the ground, the hard drive 515's contents may be transferred through one or more networks 520, such as the internet, to a database 532 associated with a ground-based server 530 that may execute image processing software 534, such as a large neural network or other object and issue detection analysis software, to identify objects in the captured remote sensing imagery. In one embodiment, images may be processed in order of likelihood from the airborne computations. This allows the ground-based computer to send likely issues to users as fast as possible. The program may put a running list of flagged locations into a database 536 that users may be able to view in real time. Edge cases may be flagged and manually reviewed by a human in the loop as indicated in block 538.
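Likelihood-ordered processing of this kind can be sketched with a priority queue, so the most probable issues are handled first. The function name and handler interface are illustrative assumptions:

```python
import heapq

def process_by_likelihood(tagged_images, handler):
    """Process (image, likelihood) pairs in descending order of the
    onboard likelihood, so probable right-of-way issues reach users
    first. Returns the processing order."""
    # heapq is a min-heap, so negate likelihoods for max-first order;
    # the index breaks ties without comparing image payloads.
    heap = [(-likelihood, i, image)
            for i, (image, likelihood) in enumerate(tagged_images)]
    heapq.heapify(heap)
    order = []
    while heap:
        neg_likelihood, _, image = heapq.heappop(heap)
        handler(image, -neg_likelihood)
        order.append(image)
    return order
```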
This list of images with right-of-way objects in the database 536 may be shown to users via one or more networks 540 through an online platform 550 provided by a service provider or customer business operations software 560. In some embodiments, all images may be available, but only issues and their corresponding imagery may be raised to users. The flagged images may be shown with optional feedback buttons to correct the algorithm, such as to create a custom input square of the object or to remove a flagged image. These images may then be sent back to the image processing software 534 as training data.
The software executed by the onboard computer 340 may be constrained to run on a computer of limited processing power and may be a standard rules-based algorithm, instead of a machine learning algorithm. Inputs may include a pipeline geographic data file, a current camera location, and the image itself. This program may then draw a line over the expected location of the pipeline and compute a difference gradient over the length of the pipeline in the image. That gradient may then be normalized by the pixel length of the pipeline in the image, producing a likelihood value for use by the ground-based server 530.
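A minimal sketch of such a rules-based gradient computation follows. Representing the image as a mapping from pixel coordinates to intensities, and the specific gradient formulation (sum of absolute differences between consecutive line pixels), are illustrative assumptions rather than the disclosed algorithm:

```python
def pipeline_likelihood(image, line_pixels):
    """Rules-based likelihood that something sits on the pipeline:
    sum the absolute intensity differences between consecutive pixels
    along the expected pipeline line, then normalize by the line's
    pixel length. image maps (row, col) to intensity; line_pixels is
    the ordered list of (row, col) coordinates on the drawn line."""
    if len(line_pixels) < 2:
        return 0.0
    gradient = sum(abs(image[line_pixels[i + 1]] - image[line_pixels[i]])
                   for i in range(len(line_pixels) - 1))
    return gradient / len(line_pixels)
```

A uniform stretch of pipeline yields a gradient near zero, while an object crossing the line produces abrupt intensity changes and a higher normalized value.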
In one embodiment, the image processing software 534 may include a convolutional neural network. Inputs may include the expected pipeline location, the camera location, and the output of the aircraft pre-processing. The outputs of this program may be boxes identifying the locations of objects along the pipeline, with a likelihood of those objects infringing the right-of-way of the pipeline. In one implementation, all objects meeting a threshold confidence level (e.g., 90% or greater) may be flagged to show the user. Any objects with medium-level confidence (e.g., 50%-90%) may be sent to a human for manual review. This program may be trained on an existing corpus of pipeline imagery, but may also be retrained periodically (e.g., weekly) on additional data. All manually reviewed data, along with flagged data, may be sent to this continually increasing corpus of training imagery.
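The confidence-band routing described above can be sketched as a simple split. The function name and the decision to silently drop low-confidence detections are illustrative assumptions; the threshold values mirror the example bands in the text:

```python
def route_detections(detections, flag_threshold=0.9, review_threshold=0.5):
    """Split (object, confidence) detections by confidence band:
    flag high-confidence detections for users, queue medium-confidence
    ones for manual human review, and drop the rest."""
    flagged, review = [], []
    for obj, confidence in detections:
        if confidence >= flag_threshold:
            flagged.append(obj)
        elif confidence >= review_threshold:
            review.append(obj)
    return flagged, review
```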
As shown in
It should be noted that while
While certain example embodiments have been described in detail and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that other embodiments may be devised without departing from the basic scope thereof, which is determined by the claims that follow.
This Patent Application claims priority to U.S. Provisional Patent Application No. 63/202,695 filed on Jun. 21, 2021, entitled “High-Altitude Airborne Remote Sensing.” The disclosure of the prior application is considered part of and is incorporated by reference into this Patent Application.
Number | Date | Country
---|---|---
63202695 | Jun 2021 | US