The present application is related to U.S. patent application Ser. No. 11/470,134, which is incorporated herein by reference.
Unmanned aerial vehicles (UAVs) are remotely piloted or self-piloted aircraft that can carry sensors, communications equipment, or other payloads. They have been used in a reconnaissance and intelligence-gathering role for many years. More recently, UAVs have been developed for surveillance and target tracking. Autonomous surveillance and target tracking performed by UAVs in either military or civilian environments is becoming an important aspect of intelligence-gathering methods.
Typically, UAVs use the Global Positioning System (GPS) to provide navigation over various terrains. In many scenarios, UAVs are required to fly in an urban environment, which often exhibits an urban canyon effect. An “urban canyon” is an artifact of an urban environment similar to a natural canyon. It is caused by streets cutting through dense blocks of high-rise buildings such as skyscrapers. It is known that the urban canyon can cause problems in the reception of GPS signals.
For example, GPS receivers can suffer from multipath errors, which are caused by receiving a composite of direct GPS signals and GPS signals reflected from nearby objects such as buildings. In an urban canyon, a direct line of sight (LOS) GPS signal can be completely blocked by a building structure while a reflected signal still reaches the receiver, resulting in position and velocity errors of significant magnitudes. Thus, when UAVs operate in an urban environment, access to the GPS signals needed to navigate without drift is often denied by the urban canyon.
Many methods have been proposed for image-aided navigation of autonomous vehicles such as UAVs. For example, matching or referencing a GPS-denied vehicle's current image of terrain to a satellite image of the same terrain has been suggested. However, these methods lack the temporal and spatial proximity needed to use the images for UAV navigation in an urban canyon.
The present invention is related to a method and system for navigation of one or more unmanned aerial vehicles in an urban environment. The method comprises flying at least one Global Positioning System (GPS)-aided unmanned aerial vehicle at a first altitude over an urban environment, and flying at least one GPS-denied unmanned aerial vehicle at a second altitude over the urban environment that is lower than the first altitude. The unmanned aerial vehicles are in operative communication with each other so that images can be transmitted therebetween. A first set of images from the GPS-aided unmanned aerial vehicle is captured, and a second set of images from the GPS-denied unmanned aerial vehicle is also captured. Image features from the second set of images are then matched with corresponding image features from the first set of images. A current position of the GPS-denied unmanned aerial vehicle is calculated based on the matched image features from the first and second sets of images.
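By way of illustration only, the kind of data that the GPS-aided unmanned aerial vehicle might transmit alongside each captured image can be sketched as follows. The message structure and all field names below are illustrative assumptions; the disclosure does not prescribe a particular data format for the images communicated between the vehicles.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class GeoTaggedImage:
    """One image frame transmitted by the GPS-aided (higher altitude) UAV.

    All fields are illustrative assumptions; the disclosure does not
    prescribe a message format.
    """
    timestamp_s: float        # capture time in seconds
    latitude_deg: float       # GPS position of the camera at capture time
    longitude_deg: float
    altitude_m: float
    attitude_rpy_rad: tuple   # (roll, pitch, yaw) of the camera
    intrinsics: np.ndarray    # 3x3 camera calibration matrix K
    image: np.ndarray         # pixel array (grayscale or color)
```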
Features of the present invention will become apparent to those skilled in the art from the following description with reference to the drawings. Understanding that the drawings depict only typical embodiments of the invention and are not therefore to be considered limiting in scope, the invention will be described with additional specificity and detail through the use of the accompanying drawings, in which:
In the following detailed description, embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It is to be understood that other embodiments may be utilized without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense.
The present invention is directed to a method and system for GPS-denied navigation using a hierarchy of unmanned aerial vehicles (UAVs), which is particularly useful in an urban environment. In one approach of the present invention, UAVs flying simultaneously at different altitudes communicate their respective images to each other, with a lower altitude GPS-denied UAV registering or matching its images to those of a higher altitude GPS-aided UAV. If the image correspondences are automatically established, then the lower altitude UAV can determine its position to about the same accuracy as the GPS-aided UAV.
In another approach, images from higher altitude GPS-aided UAVs can be used on the fly for planning the path of lower altitude UAVs, and for bringing the lowest-altitude UAV to the desired target location in the shortest possible time.
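The disclosure does not specify a particular planning algorithm. Purely as an illustrative sketch, a grid search such as A* could be run over an obstacle map segmented from the high-altitude imagery to route the lowest-altitude UAV toward the target location; the function names, grid representation, and unit step costs below are assumptions for illustration.

```python
# Illustrative sketch: shortest path over a 2D obstacle grid
# (1 = obstacle, 0 = free). In practice the map would be derived from
# the high-altitude imagery; here it is hardcoded for demonstration.
import heapq


def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    heur = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(heur(start), 0, start, None)]
    came_from, best_cost = {}, {start: 0}
    while open_set:
        _, cost, cell, parent = heapq.heappop(open_set)
        if cell in came_from:        # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:             # reconstruct the path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost + 1
                if new_cost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = new_cost
                    heapq.heappush(open_set,
                                   (new_cost + heur(nxt), new_cost, nxt, cell))
    return None


# Example: route around a single "building" block in a 5x5 obstacle map.
obstacle_map = [[0, 0, 0, 0, 0],
                [0, 1, 1, 1, 0],
                [0, 1, 0, 1, 0],
                [0, 0, 0, 1, 0],
                [0, 0, 0, 0, 0]]
print(astar(obstacle_map, (0, 0), (4, 4)))
```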
The UAVs 110 and 120 can be hover-capable aerial vehicles or fixed-wing aerial vehicles. One or more image sensors, such as a camera, can be carried by each of UAVs 110 and 120, providing a means for capturing images during flight. Additional sensors that can be used by the UAVs include one or more GPS sensors for obtaining GPS measurements; inertial sensors such as accelerometers or gyroscopes; a radar sensor (e.g., Doppler radar for velocity measurements); and laser detection and ranging (LADAR) or acoustic sensors (for distance measurements). Various combinations of any of the above sensors may also be employed in the UAVs.
A GPS satellite 140 transmits GPS signals 150 to a GPS sensor such as a receiver carried onboard UAV 120. An image sensor such as a camera onboard UAV 110 captures a first set of images within a field-of-view (FOV) 160 of the camera. Likewise, an image sensor such as a camera onboard UAV 120 captures a second set of images within a FOV 170 of the camera. Image features from the second set of images are then matched with corresponding image features from the first set of images. A current position of UAV 120 is calculated based on the matched image features from the first and second sets of images. This aids in the navigation of UAV 120 flying at the lower altitude, since urban environment 115 can interfere with GPS signals 150 because of the urban canyon effect.
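By way of illustration only, the registration and position-calculation steps can be sketched with off-the-shelf feature matching, assuming near-nadir cameras, locally planar ground, and a georeferenced high-altitude frame with a known ground sample distance. The OpenCV-based functions, parameters, and flat-earth coordinate conversion below are illustrative assumptions rather than the claimed method.

```python
# Simplified sketch: match the GPS-denied UAV's frame against a
# georeferenced frame from the GPS-aided UAV, then locate the low-altitude
# camera footprint within the high-altitude image.
import cv2
import numpy as np


def locate_in_reference(low_img, high_img, high_center_latlon, m_per_px):
    """Estimate the (lat, lon) under the GPS-denied UAV's image center."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_low, des_low = orb.detectAndCompute(low_img, None)
    kp_high, des_high = orb.detectAndCompute(high_img, None)

    # Match binary descriptors; cross-checking removes many false pairs.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_low, des_high), key=lambda m: m.distance)

    src = np.float32([kp_low[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_high[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly fit the low-to-high image mapping; RANSAC rejects outliers.
    # At least four good correspondences are required.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Project the low-altitude image center into the georeferenced frame.
    h, w = low_img.shape[:2]
    center = np.float32([[[w / 2.0, h / 2.0]]])
    u, v = cv2.perspectiveTransform(center, H)[0, 0]

    # Convert the pixel offset from the high-image center into metres and
    # then into latitude/longitude (small-area flat-earth approximation).
    hh, hw = high_img.shape[:2]
    east_m = (u - hw / 2.0) * m_per_px
    north_m = (hh / 2.0 - v) * m_per_px
    lat0, lon0 = high_center_latlon
    lat = lat0 + north_m / 111_111.0
    lon = lon0 + east_m / (111_111.0 * np.cos(np.radians(lat0)))
    return lat, lon
```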
Although two UAVs are shown in the figure, it should be understood that more than two UAVs flying at different altitudes can be employed in the method and system of the invention.
Using images from collaborating UAVs at different altitudes rather than satellite images has the advantage of spatial and temporal proximity between images from the higher and lower altitude UAVs. This is an important advantage in automatically establishing correspondences between the images.
The present method automatically establishes the correspondences between images of the different altitude UAVs. A combination of the inertial position estimate of the low-altitude UAV, salient feature detection, and epipolar constraints between images can be used to correctly establish these correspondences. The salient feature detection can be carried out by cueing based on salient image features to detect matches between the second set of images and the first set of images. A variety of salient image feature types can be considered depending on the exact scenario.
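A minimal sketch of this correspondence-filtering idea is given below, assuming candidate matches are first gated around the image location predicted by the inertial estimate and then tested against an epipolar (fundamental matrix) constraint. The function names, thresholds, and gating radius are illustrative assumptions, not values specified by the disclosure.

```python
# Illustrative sketch of correspondence filtering for the two image sets.
import cv2
import numpy as np


def filter_matches(pts_low, pts_high, predicted_uv, gate_px=150.0):
    """pts_low/pts_high: Nx2 float32 arrays of matched pixel coordinates."""
    # 1) Inertial gating: keep matches near the region of the high-altitude
    #    image predicted by the low-altitude UAV's inertial position estimate.
    d = np.linalg.norm(pts_high - np.asarray(predicted_uv, np.float32), axis=1)
    keep = d < gate_px
    pts_low, pts_high = pts_low[keep], pts_high[keep]

    # 2) Epipolar constraint: estimate the fundamental matrix with RANSAC
    #    and keep only the inlier correspondences.
    F, inlier_mask = cv2.findFundamentalMat(pts_low, pts_high,
                                            cv2.FM_RANSAC, 3.0, 0.99)
    if inlier_mask is None:
        return pts_low, pts_high  # too few candidates to test; pass through
    inliers = inlier_mask.ravel().astype(bool)
    return pts_low[inliers], pts_high[inliers]
```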
The present method can be implemented by utilizing computer hardware and/or software, which provide a means for matching image features from the second set of images with corresponding image features from the first set of images, and a means for calculating a current position of the GPS-denied UAV based on the matched image features from the first and second sets of images.
Instructions for carrying out the various process tasks, calculations, control functions, and the generation of signals and other data used in the operation of the method and systems described herein can be implemented in software, firmware, or other computer readable instructions. These instructions are typically stored on any appropriate computer readable media used for storage of computer readable instructions or data structures. Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer or processor, or any programmable logic device.
Suitable computer readable media may comprise, for example, non-volatile memory devices including semiconductor memory devices such as EPROM, EEPROM, or flash memory devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; CDs, DVDs, or other optical storage disks; nonvolatile ROM, RAM, and other like media. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs). When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a computer readable medium. Thus, any such connection is properly termed a computer readable medium. Combinations of the above are also included within the scope of computer readable media.
The method of the invention can be implemented in computer readable instructions, such as program modules or applications, which are executed by a data processor. Generally, program modules or applications include routines, programs, objects, data components, data structures, algorithms, etc. that perform particular tasks or implement particular abstract data types. These represent examples of program code means for executing steps of the method disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
The present invention may be embodied in other specific forms without departing from its essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.