1. Field of the Invention
The present invention generally relates to controlling small payload air vehicles in flight, and more particularly, to automatically controlling Unmanned Air Vehicles (UAVs) and Remotely Piloted Vehicles (RPVs) to sense and avoid potential collisions with other local air vehicles.
2. Background Description
Currently, Unmanned Air Vehicles (UAVs) and/or Remotely Piloted Vehicles (RPVs) are accompanied by a manned “chaperone” aircraft to mitigate the risk of collision when operating in the National Air Space (NAS). A chaperone is particularly necessary to assure that the aircraft (UAV or RPV) does not collide with other manned or unmanned aircraft operating in the vicinity, or vice versa. Unfortunately, chaperoning such a vehicle is labor-intensive and not particularly useful, other than for test and demonstration purposes.
Manned aircraft rely on air traffic control, transponders, and pilot vision for collision avoidance. While transponders are required on all commercial aircraft, many private aircraft do not carry transponders, and transponders may not be used in combat situations. Further, there have been cases of air traffic control issuing commands that contradict transponder avoidance recommendations. For manned aircraft, the human pilot visually identifies local moving objects and judges whether each object poses a collision threat. Consequently, vision-based detection is necessary, and often critical, for detecting other aircraft in the local vicinity.
Currently, the Federal Aviation Administration (FAA) is seeking an “equivalent level of safety” to existing manned aircraft for operating such aircraft in the NAS. While airspace could be restricted around UAVs, or UAVs could be limited to restricted airspace, to eliminate the possibility of other aircraft posing a collision risk, this limits the range of missions and conditions under which an unmanned aircraft can be employed. So, an unaccompanied UAV must also have some capability to detect and avoid any nearby aircraft. An unmanned air vehicle may be equipped to provide a live video feed from the aircraft (i.e., a video camera relaying a view from the “cockpit”) to the ground-based pilot that remotely pilots the vehicle in congested airspace. Unfortunately, remotely piloting a vehicle with onboard imaging requires additional transmission capability for both the video and control, sufficient bandwidth for both transmissions, and a human pilot continuously in the loop. Consequently, equipping and remotely piloting such a vehicle is costly. Additionally, with a remotely piloted vehicle there is an added delay, both in the video feed from the vehicle to when it is viewed and in the remote control mechanism (i.e., between when the pilot makes course corrections and when the vehicle changes course). So, such remote imaging, while useful for ordinary flying, is not useful for timely threat detection and avoidance.
Thus, there is a need for a small, compact, lightweight, real-time, on-board collision sense and avoidance system with a minimal footprint, especially for unmanned vehicles, that can detect and avoid collisions with other local airborne targets. Further, there is a need for such a collision sense and avoidance system that can determine the severity of threats from other local airborne objects under any flight conditions and also determine an appropriate avoidance maneuver.
An embodiment of the present invention detects objects in the vicinity of an aircraft that may pose a collision risk. Another embodiment of the present invention may propose evasive maneuvers to an aircraft for avoiding any local objects that are identified as posing a collision risk to the aircraft. Yet another embodiment of the present invention visually locates and automatically detects objects in the vicinity of an unmanned aircraft that may pose a collision risk to the unmanned aircraft, and automatically proposes an evasive maneuver for avoiding any identified collision risk.
In particular, embodiments of the present invention include a collision sense and avoidance system and an aircraft, such as an Unmanned Air Vehicle (UAV) and/or Remotely Piloted Vehicle (RPV), including the collision sense and avoidance system. The collision sense and avoidance system includes an image interrogator that identifies potential collision threats to the aircraft and provides maneuvers to avoid any identified threat. Motion sensors (e.g., imaging and/or infrared sensors) provide image frames of the surroundings to a clutter suppression and target detection unit that detects local targets moving in the frames. A Line Of Sight (LOS), multi-target tracking unit tracks detected local targets and maintains a track history in LOS coordinates for each detected local target. A threat assessment unit determines whether any tracked local target poses a collision threat. An avoidance maneuver unit provides flight control and guidance with a maneuver to avoid any identified collision threat.
Advantageously, a preferred collision sense and avoidance system provides a “See & Avoid” or “Detect and Avoid” capability to any aircraft, not only identifying and monitoring local targets, but also identifying any that may pose a collision threat and providing real-time avoidance maneuvers. A preferred image interrogator may be contained within one or more small image processing hardware modules that contain the hardware and embedded software and that weigh only a few ounces. Such a dramatically reduced size and weight makes classic detection and tracking capability available even to a small UAV, e.g., a ScanEagle or smaller.
While developed for unmanned aircraft, a preferred sense and avoidance system has application to alerting pilots of manned aircraft to unnoticed threats, especially in dense or high stress environments. Thus, a preferred collision sense and avoidance system may be used with both manned and unmanned aircraft. In a manned aircraft, a preferred collision sense and avoidance system augments the pilot's vision. In an unmanned aircraft, a preferred collision sense and avoidance system may be substituted for the pilot's vision, detecting aircraft that may pose collision risks, and if necessary, proposing evasive maneuvers to the unmanned aircraft's flight control.
The foregoing and other objects, aspects and advantages will be better understood from the following detailed description of a preferred embodiment of the invention with reference to the drawings, in which:
Turning now to the drawings:
Image data from one or more sensor(s) 102 may be buffered temporarily in the frame buffer 114, which may simply be local Random Access Memory (RAM), static or dynamic (SRAM or DRAM), in the FPGA processor, designated permanently or temporarily for frame buffer storage. Each sensor 102 may be provided with a dedicated frame buffer 114, or a shared frame buffer 114 may temporarily store image frames for all sensors. The image data passes from the frame buffer 114 to a clutter suppression and target detection unit 118 in the preferred image interrogator 112. The clutter suppression and target detection unit 118 is capable of identifying targets under any conditions, e.g., against a natural sky, in clouds, against terrain backgrounds, and under various lighting conditions. A LOS, multi-target tracking unit 120 tracks targets identified by the target detection unit 118 in LOS coordinates. The LOS, multi-target tracking unit 120 also maintains a history 122 of movement for each identified target. A threat assessment unit 124 monitors identified targets and the track history of each to determine the likelihood of a collision with each target. An avoidance maneuver unit 126 determines a suitable avoidance maneuver for any target deemed to be on a collision course with the host aircraft. The avoidance maneuver unit 126 passes the avoidance maneuvers to flight control and guidance 116 for execution.
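This per-frame data flow lends itself to a straightforward software organization. The Python sketch below is illustrative only: the class and method names (CollisionSenseAndAvoid, detect, update, is_threat, plan, execute) are assumptions, and the actual units 114-126 may be partitioned quite differently between FPGA hardware and embedded software.

```python
# Illustrative only: names are assumptions, not taken from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Track:
    """Track history (122) for one detected target, kept in LOS (azimuth/elevation) coordinates."""
    track_id: int
    history: List[Tuple[float, float, float]] = field(default_factory=list)  # (time, az, el)


class CollisionSenseAndAvoid:
    """Per-frame flow corresponding to units 114-126: buffer -> detect -> track -> assess -> maneuver."""

    def __init__(self, detector, tracker, assessor, planner, flight_control):
        self.detector = detector              # clutter suppression and target detection (118)
        self.tracker = tracker                # LOS multi-target tracker (120), owns track history (122)
        self.assessor = assessor              # threat assessment (124)
        self.planner = planner                # avoidance maneuver unit (126)
        self.flight_control = flight_control  # flight control and guidance (116)

    def process_frame(self, frame, timestamp):
        detections = self.detector.detect(frame)              # pixel-level target detections
        tracks = self.tracker.update(detections, timestamp)   # updated tracks in LOS coordinates
        for track in tracks:
            if self.assessor.is_threat(track):
                maneuver = self.planner.plan(track)           # avoidance maneuver for this threat
                self.flight_control.execute(maneuver)         # handed to guidance for execution
```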
The clutter suppression and target detection unit 118 and the LOS, multi-target tracking unit 120 may be implemented using any of a number of suitable, well known algorithms that are widely used in target tracking. Preferably, clutter suppression and target detection is implemented in either a single frame target detection mode or a multi-frame target detection mode. In the single frame mode, each frame is convolved with an Optical Point Spread Function (OPSF). As a result, single pixel noise is rejected, as are all large features, i.e., features that are larger than a few pixels in diameter. So, only unresolved or nearly unresolved shapes remain to identify actual targets. An example of a suitable multi-frame moving target detection approach, generically referred to as a Moving Target Indicator (MTI), is provided by Sanders-Reed, et al., “Multi-Target Tracking In Clutter,” Proc. of the SPIE, 4724, April 2002. Sanders-Reed, et al. assume that a moving target moves relative to the background; hence, everything moving with a constant apparent velocity (the background) is rejected, leaving only the moving targets.
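As a concrete, hedged illustration of the single frame mode, the sketch below uses a Gaussian kernel as a stand-in for the OPSF and subtracts a heavily smoothed copy of the frame to reject large resolved features. The kernel widths and threshold factor are assumptions; this is a generic point-target detector, not necessarily the patented algorithm.

```python
# Generic single-frame point-target detection sketch (assumed, not the patented algorithm):
# a Gaussian kernel stands in for the OPSF, a heavily smoothed copy of the frame removes
# large resolved features, and a crude adaptive threshold keeps only small, bright shapes.
import numpy as np
from scipy.ndimage import gaussian_filter

def detect_point_targets(frame, psf_sigma=1.0, background_sigma=8.0, k=5.0):
    frame = frame.astype(float)
    matched = gaussian_filter(frame, psf_sigma)            # stand-in for OPSF convolution
    background = gaussian_filter(frame, background_sigma)  # estimate of large features
    residual = matched - background                        # keeps only small, bright shapes
    threshold = residual.mean() + k * residual.std()       # crude adaptive threshold
    ys, xs = np.nonzero(residual > threshold)
    return list(zip(ys.tolist(), xs.tolist()))             # candidate target pixel coordinates
```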
The track history 122 provides a time history of each target's motion and may be contained in local storage, e.g., as a table or database. Previously, since typical state-of-the-art tracking units simply track targets in focal-plane pixel coordinates, a high-level coordinate system was necessary to understand target motion. However, the preferred embodiment collision sense and avoidance system 110 does not require such a high-level coordinate system; instead, the LOS, multi-target tracking unit 120 collects the track history 122 in LOS coordinates. See, e.g., J. N. Sanders-Reed, “Multi-Target, Multi-Sensor, Closed Loop Tracking,” Proc. of the SPIE, 5430, April 2004, for an example of a system that develops, maintains and uses a suitable track history.
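A minimal sketch of how such a track history might be logged directly in LOS coordinates is shown below, assuming the tracker knows the sensor boresight pointing and a per-pixel angular subtense (IFOV); the small-angle conversion, the IFOV value, the image center, and all names are illustrative assumptions rather than details from the patent.

```python
# Hedged sketch: logging a track history directly in LOS coordinates.
from collections import defaultdict

IFOV_RAD = 100e-6                     # assumed angular size of one pixel, in radians
CENTER_PX, CENTER_PY = 320.0, 240.0   # assumed optical center of a 640x480 focal plane

track_history = defaultdict(list)     # target id -> list of (time, azimuth, elevation)

def log_detection(target_id, t, px, py, boresight_az, boresight_el):
    """Convert a focal-plane detection (px, py) to LOS azimuth/elevation and append it."""
    az = boresight_az + (px - CENTER_PX) * IFOV_RAD
    el = boresight_el - (py - CENTER_PY) * IFOV_RAD   # image rows increase downward
    track_history[target_id].append((t, az, el))
```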
So, for example, the threat assessment unit 124 might determine in 1250 that, within the next 30 seconds, a target will approach within one mean target diameter of the host aircraft. Moreover, the threat assessment unit 124 may deem in 1252 that this is a collision risk 1254 regardless of the true size and range of the target.
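One hedged way to illustrate such a range-free determination is sketched below: the LOS track is extrapolated over a 30 second horizon, and a collision risk is flagged whenever the predicted angular offset falls below the target's apparent angular diameter, which at any range corresponds to a predicted transverse miss of less than one target diameter. The linear extrapolation, the assumption that the LOS angles are measured relative to the host's velocity vector, and all names and parameters are illustrative.

```python
# Hedged, range-free collision-risk sketch; not the patented method.
import numpy as np

def is_collision_risk(times, az, el, apparent_diameter_rad, horizon_s=30.0, dt_s=0.5):
    """times, az, el: recent track history (at least two samples); angles in radians."""
    az_rate = np.polyfit(times, az, 1)[0]   # simple linear fit of LOS azimuth rate
    el_rate = np.polyfit(times, el, 1)[0]   # simple linear fit of LOS elevation rate
    for t in np.arange(0.0, horizon_s, dt_s):
        offset = np.hypot(az[-1] + az_rate * t, el[-1] + el_rate * t)
        if offset < apparent_diameter_rad:  # predicted miss of less than one target diameter
            return True
    return False
```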
Optionally, the threat assessment unit 124 can make a probabilistic estimate in 1252 of whether a true range estimate is desired or deemed necessary. In those instances where a true range estimate is desired, the threat assessment unit 124 can determine the target's speed-to-size ratio from the reconstructed scaled three-dimensional trajectory, e.g., in 1250. Then, in 1252, the target's speed-to-size ratio can be compared with the speed-to-size ratios and probabilities of known real collision threats, with a match indicating that the target is a collision threat. Optionally, the motion of the host aircraft relative to the ground can be tracked, e.g., by the target detection unit 118, and factored into this probabilistic true range determination for better accuracy.
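The speed-to-size comparison might look like the following sketch. Because the reconstructed trajectory is scaled (range-free), apparent speed and apparent size share the same unknown scale factor, so their ratio is absolute and can be compared against ratios typical of real aircraft. The reference values, the tolerance, and the names below are purely illustrative assumptions, and the probabilistic weighting described above is omitted for brevity.

```python
# Hedged sketch of the optional speed-to-size check; reference values are illustrative only.
KNOWN_THREAT_SPEED_TO_SIZE = {   # rough illustrative (speed m/s) / (wingspan m) ratios
    "general aviation": 60.0 / 11.0,
    "business jet": 180.0 / 16.0,
    "airliner": 230.0 / 35.0,
}

def matches_known_threat(scaled_speed, scaled_size, tolerance=0.35):
    """scaled_speed and scaled_size share the same unknown scale factor, so their ratio is absolute."""
    ratio = scaled_speed / scaled_size
    return any(abs(ratio - ref) / ref < tolerance
               for ref in KNOWN_THREAT_SPEED_TO_SIZE.values())
```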
Short-term intensity spikes may result, for example, from momentary specular reflections. These short-term intensity spikes tend to cause ranging jitter that can impair collision threat assessments. So, for enhanced collision threat assessment accuracy and stability, the threat assessment unit 124 can remove or filter these short-term intensity spikes, e.g., in 1248, using any suitable technique, such as those well known in the art.
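One such well known technique is a short temporal median filter over each target's per-frame intensity history, as sketched below; the window length is an assumption, and the patent does not prescribe this particular filter.

```python
# Hedged sketch: temporal median filtering of a target's intensity history to suppress
# momentary specular glints before threat assessment.
import numpy as np
from scipy.signal import medfilt

def filter_intensity_spikes(intensity_history, window=5):
    """intensity_history: 1-D sequence of per-frame target intensities; window must be odd."""
    return medfilt(np.asarray(intensity_history, dtype=float), kernel_size=window)
```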
In alternative embodiments, the image interrogator 112 may be implemented using a combination of one or more FPGAs with one or more parallel processing devices for higher-level computing capability, as may be required for the threat assessment and avoidance maneuver calculations.
Advantageously, a preferred collision sense and avoidance system 110 provides a “See & Avoid” or “Detect and Avoid” capability to any aircraft, not only identifying and monitoring local targets, but also identifying any that may pose a collision threat and providing real-time avoidance maneuvers. The preferred image interrogator 112 may be contained within a small image processing hardware module that contains the hardware and embedded software and that weighs only a few ounces. Such a dramatically reduced size and weight makes classic detection and tracking capability available even to a small UAV, e.g., a ScanEagle or smaller. Thus, the preferred collision sense and avoidance system 110 may be used with both manned and unmanned aircraft. In a manned aircraft, the preferred collision sense and avoidance system 110 augments the pilot's vision. In an unmanned aircraft, the preferred collision sense and avoidance system 110 may be substituted for the pilot's vision, detecting aircraft that may pose collision risks and, if necessary, proposing evasive maneuvers to the unmanned aircraft's flight control.
While the invention has been described in terms of preferred embodiments, those skilled in the art will recognize that the invention can be practiced with modification within the spirit and scope of the appended claims. It is intended that all such variations and modifications fall within the scope of the appended claims. Examples and drawings are, accordingly, to be regarded as illustrative rather than restrictive.