This invention relates to a ground-based system to locate perpetrators of aircraft laser strikes.
Laser strikes against commercial aircraft are a growing problem in the United States. In the last decade there has been a large increase in the number of incidents reported to the Federal Aviation Administration, likely driven by the increasing availability of high-power, low-cost laser pointers. Reports rose sharply in 2015, with 7,703 incidents compared to 3,894 in 2014.
Impacts to the pilot include distraction, loss of night vision, temporary flash blindness and retinal burning. These effects generally occur during takeoff and landing, the most critical phases of flight, when the aircraft is closest to the ground; as a result, laser strikes pose a serious safety hazard of great concern to public officials.
Current responses to these incidents are limited. For persistent incidents in a given location, air traffic controllers may change departure and landing patterns to avoid areas where laser strikes have been reported. For more egregious incidents, police helicopters may be deployed to search for the perpetrator on the ground using the helicopter's onboard infrared sensors. Currently, this is the only way for law enforcement to locate a perpetrator and prevent future incidents, yet very few perpetrators have been apprehended to date.
An object of the present invention is a ground-based system to automatically detect laser streaks and geolocate their origin, coupled to an alerting system to allow a rapid law enforcement response and to a post-event analysis system to support prosecution. The system disclosed herein holds the promise of persistent protection of high-criticality airspace around an airport, which will lead to significantly greater potential for perpetrator apprehension and prosecution, with associated strong deterrent effects.
The system according to the invention for geolocation of a laser light source includes at least two spaced-apart ground-based sensors for receiving light from the laser source that has been off-axis scattered by air molecules and particulates to form imagery from the scattered light; and a processor operating on the scattered-light imagery from the two sensors to locate the laser source. In a preferred embodiment, the sensors comprise a large-aperture lens including a laser line or passband filter delivering light to a cooled charge-coupled device (CCD) camera. A suitable lens aperture is 10 centimeters. It is preferred that the cooled CCD camera be astronomy grade. In another embodiment, the imagery defines a plane of interest extending outward from each sensor. It is preferred that the processor form a vector from the intersection of the planes of interest from the two sensors, which is propagated to the ground using a terrain map to establish the laser origin coordinates. Post-event algorithms can be used to overlay the laser beam direction with aircraft coordinates to aid prosecution activities.
The system disclosed herein for geolocation of a laser light source uses two or more ground-based sensors that monitor the sky, typically around a final approach or departure path with respect to an airport runway. When a laser beam enters a region protected by the present invention, the sensors detect an off-axis scatter streak from the laser beam as photons scatter from air molecules and particulates. These photons are detected by the sensors, and software converts the detected streak into a plane of interest extending outward from each sensor. The vector formed by the intersection of the two planes from the two sensors is the location of the actual beam, and that vector is followed to the ground to find the perpetrator's location. With real-time processing and a system to properly alert appropriate authorities, an accurate geolocation of the perpetrator is available to law enforcement within seconds.
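By way of non-limiting illustration, the following numerical sketch (Python with NumPy; the sensor positions, beam geometry, and local east-north-up coordinate frame are assumed solely for the example) shows how the two planes of interest recover the beam and its ground intercept, here against flat ground:

```python
import numpy as np

def plane_normal(sensor_pos, dir_1, dir_2):
    """Unit normal of the plane containing the sensor and two lines of
    sight toward points on the detected streak."""
    n = np.cross(dir_1, dir_2)
    return n / np.linalg.norm(n)

# Hypothetical geometry (meters, east-north-up): a laser fired from the
# ground at (400, 250, 0), pointed up and to the north-east.
true_origin = np.array([400.0, 250.0, 0.0])
beam_dir = np.array([0.3, 0.5, 1.0])
beam_dir /= np.linalg.norm(beam_dir)

sensors = [np.array([0.0, 0.0, 10.0]), np.array([1000.0, 0.0, 10.0])]

# Two points on the scattered streak, as observed by both sensors
p1 = true_origin + 500.0 * beam_dir
p2 = true_origin + 1500.0 * beam_dir

# Each sensor's plane of interest contains the sensor and its lines of
# sight to the streak points
normals = [plane_normal(s, p1 - s, p2 - s) for s in sensors]

# The beam lies along the intersection of the two planes: its direction is
# the cross product of the normals, and any common point is found by
# solving the two plane equations.
d = np.cross(normals[0], normals[1])
d /= np.linalg.norm(d)
A = np.vstack(normals)
b = np.array([normals[0] @ sensors[0], normals[1] @ sensors[1]])
point = np.linalg.lstsq(A, b, rcond=None)[0]

# Follow the beam line down to flat ground (z = 0) as a first approximation.
t = -point[2] / d[2]
print("estimated origin:", point + t * d)   # ~ (400, 250, 0)
```

In practice the ground intercept is computed against a terrain model rather than a flat plane, as described further below.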
The image processing system used in the invention includes a first stage for signal detection and characterization and a second stage for signal synthesis and geolocation. In stage 1, images are processed to improve the signal-to-noise ratio. Processing steps in this stage include frame stacking (image summing), background subtraction, pixel re-binning, spatial or temporal filtering, and thresholding. The processed images are analyzed via a Hough transform to identify line segments in the image that could potentially be laser strikes. Basic features of the detected line segments are used to rule out obvious false positives, for example, line segments that are horizontal, line segments that are shorter than a certain number of pixels, or pairs of line segments whose slopes differ by more than is physically allowed by the sensor baseline geometry.
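As one illustrative sketch of this chain (Python with NumPy and OpenCV; the threshold, Hough, and rejection parameters are assumed solely for the example, pixel re-binning and spatial/temporal filtering are omitted for brevity, and the cross-sensor slope-consistency check is applied in a later step):

```python
import numpy as np
import cv2

def detect_streaks(frames, background, min_length_px=40, max_horizontal_deg=10.0):
    """Stage-1 sketch: stack frames, subtract background, threshold, and
    find candidate line segments with a probabilistic Hough transform."""
    # Frame stacking (image summing) to raise the signal-to-noise ratio
    stacked = np.sum(np.asarray(frames, dtype=np.float32), axis=0)

    # Background subtraction and thresholding
    diff = np.clip(stacked - background, 0, None)
    scaled = cv2.normalize(diff, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, binary = cv2.threshold(scaled, 60, 255, cv2.THRESH_BINARY)

    # Hough transform for candidate line segments
    segments = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180.0, threshold=50,
                               minLineLength=min_length_px, maxLineGap=5)
    if segments is None:
        return []

    # Per-sensor rejection of obvious false positives
    candidates = []
    for x1, y1, x2, y2 in segments[:, 0, :]:
        length = np.hypot(x2 - x1, y2 - y1)
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        near_horizontal = angle < max_horizontal_deg or angle > 180.0 - max_horizontal_deg
        if length < min_length_px or near_horizontal:
            continue
        candidates.append(((x1, y1), (x2, y2)))
    return candidates
```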
The pixels that comprise the detected line segment are passed through a weighted-least-squares linear regression algorithm to determine the position and orientation of the line in pixel space with greater accuracy than is afforded by the Hough transform. The weights are chosen to be a function of the pixel value, i.e., pixels with a higher photon count are given more weight in the regression. The center of each pixel is assigned a local azimuth and elevation value based on a pre-calibrated azimuth and elevation of the image center and the instantaneous field of view (IFOV) of the sensor. The local spherical coordinates of the two outermost points of the detected line segment are recorded and stored for further processing. An initial guess for the slant range of each detection point is recorded in place of the true, as yet unknown, value. The two outermost points of the detected line segment and the known sensor location form a triplet of points that defines a unique plane.
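For example, the weighted fit, the pixel-to-angle conversion, and the construction of the plane of interest might be sketched as follows (Python with NumPy; the simple linear pixel-to-angle model, the boresight calibration values, and the slant-range guess are assumptions of the example):

```python
import numpy as np

def fit_streak(rows, cols, counts):
    """Weighted least-squares fit of col = a*row + b, weighting pixels by
    photon count (appropriate for the non-horizontal streaks kept by stage 1).
    np.polyfit squares the supplied weights, so sqrt(counts) is passed."""
    a, b = np.polyfit(rows, cols, deg=1, w=np.sqrt(np.asarray(counts, dtype=float)))
    return a, b

def pixel_to_az_el(row, col, center_az_deg, center_el_deg, ifov_deg, shape):
    """Assign a local azimuth/elevation to a pixel from the pre-calibrated
    boresight (image-center) angles and the sensor IFOV (small-angle model)."""
    n_rows, n_cols = shape
    az = center_az_deg + (col - (n_cols - 1) / 2.0) * ifov_deg
    el = center_el_deg - (row - (n_rows - 1) / 2.0) * ifov_deg
    return az, el

def az_el_to_unit(az_deg, el_deg):
    """Unit line-of-sight vector in a local east-north-up frame
    (azimuth measured east of north)."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el)])

def plane_of_interest(sensor_pos, az_el_1, az_el_2, slant_guess_m=1000.0):
    """Plane through the sensor and the two outermost streak points, each
    placed at an assumed initial slant range. The plane (and its normal)
    does not depend on the guessed range, only on the two directions."""
    p1 = sensor_pos + slant_guess_m * az_el_to_unit(*az_el_1)
    p2 = sensor_pos + slant_guess_m * az_el_to_unit(*az_el_2)
    n = np.cross(p1 - sensor_pos, p2 - sensor_pos)
    return sensor_pos, n / np.linalg.norm(n)
```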
In the second stage, the two planes detected and characterized in stage 1 intersect along a single line (the true location of the laser streak). The equation for this line is found using planar geometry. The line is extrapolated down to the point at which it intersects the surface of the Earth. The surface of the Earth is modeled by placing digital terrain elevation data (DTED) on top of an appropriately chosen reference ellipsoid and geoid. The longitude, geodetic latitude and ellipsoidal height of the point of intersection give the predicted location of the perpetrator (the origin of the detected laser streak).
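One illustrative way to carry out the ground propagation is sketched below (Python with NumPy); here `terrain_height(x, y)` is a hypothetical stand-in for a DTED lookup over the chosen reference ellipsoid and geoid, and local east-north-up coordinates in meters are assumed:

```python
import numpy as np

def propagate_to_ground(point, direction, terrain_height,
                        step_m=50.0, max_range_m=20000.0):
    """March down the laser line (a point and direction, e.g. from the
    plane-intersection step) until it crosses the terrain surface, then
    refine the crossing by bisection."""
    d = direction / np.linalg.norm(direction)
    if d[2] > 0:
        d = -d                                         # travel downward along the line
    prev, t = point, 0.0
    while t < max_range_m:
        t += step_m
        cur = point + t * d
        if cur[2] <= terrain_height(cur[0], cur[1]):   # crossed the surface
            lo, hi = prev, cur
            for _ in range(25):                        # bisection refinement
                mid = 0.5 * (lo + hi)
                if mid[2] <= terrain_height(mid[0], mid[1]):
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)                     # estimated laser origin
        prev = cur
    return None                                        # no intercept within range

# Hypothetical usage with flat terrain at 120 m elevation:
# origin = propagate_to_ground(np.array([500.0, 300.0, 800.0]),
#                              np.array([0.3, 0.5, 1.0]),
#                              lambda x, y: 120.0)
```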
These spatial coordinates, together with the equation for the laser streak, allow one to calculate other quantities of interest, such as the local direction (azimuth and elevation) in which the laser was pointing and the precise distance to the perpetrator. The details of the laser strike event, together with the timestamps associated with each image, are correlated with recorded aircraft tracks to provide strong evidence for law enforcement that the perpetrator was attempting to illuminate an aircraft with the laser.
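For example, the pointing direction and the distance to the perpetrator follow directly from the estimated origin and the direction vector of the streak line (a sketch in local east-north-up coordinates; the convention of azimuth measured east of north is an assumption of the example):

```python
import numpy as np

def pointing_and_range(origin, beam_direction, sensor_pos):
    """Local azimuth/elevation of the beam and the distance from a sensor
    to the estimated laser origin (east-north-up frame)."""
    d = beam_direction / np.linalg.norm(beam_direction)
    if d[2] < 0:
        d = -d                                    # the beam points upward from the ground
    azimuth_deg = np.degrees(np.arctan2(d[0], d[1])) % 360.0   # east of north
    elevation_deg = np.degrees(np.arcsin(np.clip(d[2], -1.0, 1.0)))
    distance_m = np.linalg.norm(np.asarray(origin) - np.asarray(sensor_pos))
    return azimuth_deg, elevation_deg, distance_m
```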
Those of ordinary skill in the art will recognize that the sensors can be affixed to motorized mounts that tilt the camera up and down in elevation as well as rotate it in azimuth, either to orient the sensors for different runways or to automatically track aircraft. The field of view of a camera presents a tradeoff between accuracy of geolocation (favoring a narrow FOV) and the ability to cover a large geographic area with the intersection of the fields of view of two cameras (favoring a wide FOV). In a preferred embodiment, this tradeoff can be improved by pointing the two or more sensors using moving mounts to observe the areas in which laser strikes are most likely to occur, such as directly beneath an aircraft during approach or departure.
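For example, the tradeoff can be quantified with a simple calculation (all values are assumed solely for illustration): the cross-range geolocation uncertainty contributed by a one-pixel angular error scales with slant range times IFOV, while the ground footprint covered by the sensor scales with the full FOV.

```python
import numpy as np

slant_range_m = 5000.0          # assumed distance from the sensor to the streak
n_pixels_across = 2048          # assumed detector width in pixels

for fov_deg in (5.0, 10.0, 20.0):
    ifov_rad = np.radians(fov_deg) / n_pixels_across          # per-pixel angle
    per_pixel_error_m = slant_range_m * ifov_rad              # cross-range error
    coverage_width_m = 2.0 * slant_range_m * np.tan(np.radians(fov_deg) / 2.0)
    print(f"FOV {fov_deg:4.1f} deg: ~{per_pixel_error_m:.2f} m per pixel, "
          f"~{coverage_width_m:.0f} m coverage width")
```

Widening the FOV from 5 to 20 degrees roughly quadruples both the coverage width and the per-pixel error, which is why steerable narrow-FOV sensors pointed at the most likely strike region improve both sides of the trade.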
It is noted that the system disclosed herein can also be used to geolocate perpetrators of laser strikes against targets other than aircraft, including ships, surface vehicles and individuals.
It is recognized that modifications and variations on the present invention will be apparent to those of ordinary skill in the art and it is intended that all such modifications and variations be included within the scope of the appended claims.
This application claims priority to provisional patent application Ser. No. 62/324,427, filed on Apr. 19, 2016, the contents of which are incorporated herein by reference in their entirety.
This invention was made with government support under contract number FA8721-05-C-0002 awarded by the US Air Force. The government has certain rights in the invention.
Prior Publication Data:

Number | Date | Country
---|---|---
20180010911 A1 | Jan 2018 | US

Related U.S. Application Data:

Number | Date | Country
---|---|---
62324427 | Apr 2016 | US