This document pertains generally, but not by way of limitation, to unmanned aerial systems (UASs), such as an unmanned aerial vehicle (UAV) and a corresponding ground control station, and more particularly, to detection of UAVs and, optionally, negation of such vehicles.
The advent of new technologies has led to the emergence of relatively inexpensive, highly-maneuverable unmanned aerial vehicles (UAVs). Such UAVs have raised concerns regarding privacy, public safety, and security. One threat posed by unauthorized operation of a UAV is inadequate control over UAVs that penetrate sensitive areas. In one approach, an acoustic (e.g., sound-based) technique can be used to identify a presence of a UAV. However, such an approach can present challenges. For example, an acoustic technique may only provide limited detection range and may not be able to provide spatial localization of a detected UAV, particularly in three dimensions. In another approach, UAVs could be required to transmit their position using a standardized protocol or beacon, such as Automatic Dependent Surveillance-Broadcast (ADS-B). However, such an approach can also present challenges. Many existing UAVs are not equipped (and may not be economically equipped) to provide beaconing, particularly “ADS-B Out” transmission capability, or such a transmitter could be intentionally disabled by the user to more easily penetrate sensitive areas without detection.
Radio-controlled unmanned aerial vehicles (UAVs) provide a way to perform certain difficult tasks without putting a human pilot at risk. UAVs have long been used to perform military tasks and surveillance tasks; however, in recent years the availability of low-cost components has reduced the unit cost of producing UAVs. Accordingly, UAVs are now more accessible to other industries and even to individual hobbyists. However, the arbitrary use of UAVs by hobbyists and amateurs has raised concerns in terms of privacy and public security. For example, unauthorized UAVs with cameras can easily become intruders when flying over sensitive areas such as nuclear plants or high-value targets, or when flying into certain areas of airports.
The present inventors have recognized, among other things, that hobbyist and amateur UAVs, in particular, are typically small and difficult to detect by traditional radars, and that other approaches can help alleviate threats from any type of UAV. In some approaches, a wireless distributed acoustic sensor network can identify the appearance and estimate the position of trespassing UAVs that have entered or are about to enter a sensitive area. Once such a UAV is detected, to cope with the diversity in RF characteristics and telemetry protocols of amateur UAVs, the subject matter described herein can include a software-defined radio (SDR) platform to capture signals, along with machine learning approaches to identify and decode the telemetry protocols of the suspected trespassing UAV. Finally, when the UAV is confirmed to be unauthorized, control commands can be utilized to route that UAV away from sensitive areas.
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
As mentioned above, the use of UAVs by hobbyists and amateurs has raised concerns in terms of privacy and public security. For example, unauthorized UAVs with cameras can easily be considered intruders when flying over sensitive areas such as nuclear plants or other high-value targets (e.g., government or military installations, sports arenas). According to the United States Federal Aviation Administration (FAA), more than 500 such unauthorized UAV operations were spotted between July and September 2016. UAVs can be used as weapons if they are made to carry explosive payloads or are otherwise under the control of terrorists. Despite such concerns, unauthorized amateur UAV operations remain difficult to detect or control, due in part to the fact that small-size UAVs are difficult to detect by traditional radars (e.g., aviation surveillance radar) and generally lack a radio beacon. Generally, UAVs are not entirely autonomous, and a ground control station (e.g., a handheld remote control or other device) is used to control the UAV. Even if a location of a UAV is determined, challenges may still exist in determining a location of a corresponding remote control on the ground.
The present inventors have recognized, among other things, that at least some of the challenges mentioned above can be addressed through apparatus and techniques as shown and described herein. Such apparatus and techniques can include one or more of detection and negation or “eviction” of an unauthorized UAV from a sensitive area. First, systems described herein can include a wireless distributed acoustic sensor network to identify the appearance and estimate the position of unwelcome UAVs. Furthermore, a software-defined radio (SDR) can use machine learning approaches to identify and decode the telemetry protocols of the unauthorized UAV. Negation can be achieved, for example, by injecting a negation command, such as one or more control commands, into a control channel once the telemetry protocol has been recognized. Negation can also include transmission of a warning or other notification by injecting a video signal or image data according to the detected protocol, to appear on a display of the remote control. Such transmission can also be accomplished using an SDR-based transmitter.
If the techniques fail to properly classify the protocol, jamming can be used to block communication between the ground control station and the UAV. A transmit power used for one or more of commandeering control, injecting warning information, or blocking communication can be modulated. Such modulation can include establishing a transmit power sufficient to achieve the desired control, warning injection, or signal blocking, without precluding operation of UAVs nearby or otherwise causing unwanted interference. Generally, the approaches described herein can proactively secure a protected physical area from unauthorized UAV operation while still otherwise permitting normal UAV operation elsewhere. Approaches described herein can use information from different sources to increase detection accuracy, and distributed sensing can increase range of coverage and sensitivity.
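The transmit-power modulation described above can be illustrated with a simple link-budget sketch. The example below assumes free-space propagation; the function names, distances, and the 6 dB capture margin are illustrative assumptions, not values from this disclosure.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis): 20*log10(d) + 20*log10(f) - 147.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def min_negation_power_dbm(remote_power_dbm: float,
                           remote_to_uav_m: float,
                           tx_to_uav_m: float,
                           freq_hz: float,
                           margin_db: float = 6.0) -> float:
    """Smallest transmit power (dBm) so the negation signal arrives at the
    target UAV with a margin_db advantage over the legitimate remote-control
    signal (hypothetical capture criterion)."""
    signal_at_uav = remote_power_dbm - fspl_db(remote_to_uav_m, freq_hz)
    return signal_at_uav + margin_db + fspl_db(tx_to_uav_m, freq_hz)

def interference_at_bystander_dbm(tx_power_dbm: float,
                                  bystander_m: float,
                                  freq_hz: float) -> float:
    """Received power (dBm) at a non-target UAV, to verify it stays below
    an interference limit so nearby authorized UAVs are not disrupted."""
    return tx_power_dbm - fspl_db(bystander_m, freq_hz)
```

For instance, with a remote control transmitting 20 dBm from 500 m away and a negation transmitter 300 m from the target UAV at 2.4 GHz, the sketch yields the smallest power that dominates the legitimate link, which can then be checked against an interference limit at the nearest non-target UAV.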
SDR receivers 118 scan radio channels typically used by amateur UAVs (e.g., object 112 or similar UAVs) within a detection area 108 or areas within a threshold distance of the detection area 108. By using pattern recognition techniques described later herein, processing circuitry (e.g., processing circuitry 602 (
In operations 308 and 310, the processing circuitry estimates a signal power of at least one of a) a telemetry signal that is transmitted by an airborne RF transmitter of the object 112, or b) a remaining signal strength available for use by the object 112 remote control. Depending on the signal power, the processing circuitry can use information regarding the signal power to help determine transmission power for negation transmissions, such as jamming, spoofing, warning, etc. In operation 312, the processing circuitry may decode a sample of the object 112 communication signals and use machine-learning-based classifiers to identify one or more of a video or a telemetry channel. Further details regarding operation 312 are provided below with reference to
In operation 316, if the processing circuitry has detected a video streaming protocol, the processing circuitry may transmit, or encode for transmitting, one or more warning video frames using an identified protocol with sufficient power to overcome the normal video signal (e.g., a multiple of the object 112 video channel transmit power, such as twice the normal video channel transmit power). In operation 318, if the processing circuitry identifies a telemetry protocol including control capability, the processing circuitry can transmit, or cause to be transmitted, simulated control commands (commands 212 (
In operation 320, if neither video nor telemetry protocols can be identified or otherwise classified to allow the video or telemetry to be commandeered, then jamming can be performed, such as to trigger a UAV program that causes the UAV to return or search autonomously to re-establish a link with the remote control. As in examples 4a and 4b, such jamming can be performed using a transmitted power sufficient to block reception of control commands via the normal telemetry channel, but without necessarily disrupting operation of other UAVs farther away. Other actions can be triggered if video warnings are not heeded and control cannot be established. For example, the detection techniques shown and described herein can be used to trigger dispatch of an interceptor, such as a surveillance UAV 120 (
Acoustic identification techniques, for example techniques executed by the distributed wireless acoustic sensors 110, can be based on the inventors' observation that the spectrum of UAV acoustic signals differs from the sound of natural backgrounds. Specifically, a UAV acoustic spectrum has a stronger and more concentrated power spectrum and a steeper cutoff frequency than that of natural background sounds. The observation also provides an indication that a low pass filter (LPF) with a cutoff frequency of, for example, 15 kHz, can eliminate unnecessary noise while preserving most acoustic information.
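As a minimal sketch of the filtering step, the single-pole IIR low-pass filter below attenuates spectral content above the 15 kHz cutoff. The filter topology and the 48 kHz sample rate are illustrative assumptions; the disclosure specifies only the cutoff frequency.

```python
import math

def low_pass(samples, cutoff_hz=15_000.0, sample_rate_hz=48_000.0):
    """Single-pole IIR low-pass filter: attenuates content above cutoff_hz
    while preserving the lower-frequency band that carries the UAV acoustic
    signature. Returns the filtered sample sequence."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)   # analog RC time constant
    dt = 1.0 / sample_rate_hz              # sample period
    alpha = dt / (rc + dt)                 # smoothing factor in (0, 1)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)               # y[n] = y[n-1] + a*(x[n] - y[n-1])
        out.append(y)
    return out
```

A practical system would likely use a higher-order filter design for a sharper rolloff; the single-pole form is shown only to keep the sketch dependency-free.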
To avoid manually defining rules for acoustic identification, a support vector machine (SVM) can be used. For each piece of digitalized acoustic signal Si={a0, a1, . . . , an} with length n and class label I, the fast Fourier transform (FFT) is used to convert the time-domain signal into a spectral series (including amplitude information only) Fi={w0, w1, . . . , wn}. Training sets are then generated and the wireless acoustic sensors 110 in the surveillance area (e.g., within the detection area 108 (
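The amplitude-only spectral feature extraction can be sketched as follows, using a direct DFT for clarity. A production system would use an FFT routine, and the resulting vector Fi would be fed to an SVM trained on labeled examples; the SVM training itself is omitted here.

```python
import cmath

def amplitude_spectrum(signal):
    """Amplitude-only DFT of a digitized acoustic frame: the feature vector
    Fi = {w0, ..., wn} fed to the SVM classifier (phase is discarded).
    Direct O(n^2) DFT for illustration; an FFT would be used in practice."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]
```

For a pure tone the spectrum concentrates at the corresponding bins, which matches the observation above that UAV rotor noise shows a strong, concentrated power spectrum compared with diffuse natural background sound.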
The wireless acoustic sensors 110 can use acoustic locating, based on the known speed of sound, to detect object 112 location (e.g., 3D location) such as may be performed in operation 306 (
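The acoustic locating step can be illustrated with a simplified time-of-arrival trilateration sketch. This assumes a known emission time and a 2-D solution with three sensors; a deployed system would more likely use time-difference-of-arrival (where the emission time is unknown) and a fourth sensor for a full 3-D fix.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def locate_2d(sensors, arrival_times_s, emit_time_s=0.0):
    """Trilaterate a 2-D source position from times of arrival at three
    sensors, using ranges r_i = c * (t_i - t_emit). Linearizes the three
    range equations by subtracting the first from the other two, yielding
    a 2x2 linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    r1, r2, r3 = (SPEED_OF_SOUND * (t - emit_time_s) for t in arrival_times_s)
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21   # nonzero when sensors are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With more than the minimum number of sensors, the same linearization extends to an overdetermined least-squares solve, which also improves robustness to timing noise.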
As mentioned above with respect to operation 312 (
A training approach can include generation of an enriched training dataset using a scheme 400 as shown generally in
Inputs can include user-specified geography coordinates 402 and UAV-related data 404 (including, for example, make, model, size, manufacturer, etc.). The protocol generators 406, 408 and 410 can respectively randomly output different types of data packets (e.g., a flying status report, one or more control commands) having payloads established in at least a semi-randomized manner. In such a semi-randomized scheme, a range of different random values can be constrained such as to provide data within the bounds that would be reasonable in actual operation (for example, geographic coordinates or battery status values can be constrained to avoid nonsensical values). Noise in the channel can be simulated at 412 by randomly toggling bit values in the generated packets according to specified error criteria such as bit error rates, minimum or maximum run length of error sequences, or the like.
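The semi-randomized generation and channel-noise steps can be sketched as below. The packet field layout, field names, and coordinate bounds are hypothetical; a real generator would mirror a specific telemetry protocol's payload format.

```python
import random

# Hypothetical field bounds: values are randomized only within ranges
# plausible in actual operation, per the semi-randomized scheme above.
LAT_RANGE = (30.0, 31.0)      # constrained near user-specified geography
LON_RANGE = (-97.0, -96.0)
BATTERY_RANGE = (0, 100)      # percent: avoids nonsensical values

def make_status_packet(rng: random.Random) -> bytes:
    """Semi-randomized flying-status report with fields drawn uniformly,
    but only within operationally plausible bounds."""
    lat = rng.uniform(*LAT_RANGE)
    lon = rng.uniform(*LON_RANGE)
    batt = rng.randint(*BATTERY_RANGE)
    return f"STAT,{lat:.6f},{lon:.6f},{batt}".encode()

def add_channel_noise(packet: bytes, ber: float, rng: random.Random) -> bytes:
    """Simulate the channel by flipping each bit independently with
    probability ber (bit error rate)."""
    out = bytearray(packet)
    for i in range(len(out)):
        for bit in range(8):
            if rng.random() < ber:
                out[i] ^= 1 << bit
    return bytes(out)
```

Error criteria such as burst run lengths could be layered on by correlating successive flip decisions rather than drawing them independently.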
Packets 414, 416, 418 generated using known target protocols can be combined with packets 420 corresponding to random data, and respective packets can be labeled to provide an enriched corpus 422 of training data. Machine learning techniques 424, such as those implemented as a random forest, a support vector machine (SVM, e.g., a one-against-all SVM), or a convolutional neural network (CNN), can then be established using the training data.
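The classification stage can be illustrated with a deliberately simple stand-in. The disclosure names random forests, SVMs, and CNNs; the nearest-centroid classifier over byte-histogram features below is only a dependency-free sketch of the same fit/predict workflow, not the claimed technique.

```python
from collections import defaultdict

def byte_histogram(packet: bytes):
    """256-bin normalized byte histogram used here as a packet feature vector."""
    h = [0.0] * 256
    for b in packet:
        h[b] += 1.0
    n = len(packet) or 1
    return [c / n for c in h]

class NearestCentroid:
    """Minimal stand-in for the random-forest/SVM/CNN protocol classifier:
    each labeled protocol is summarized by the mean of its feature vectors,
    and a new packet is assigned to the nearest centroid."""
    def __init__(self):
        self.centroids = {}

    def fit(self, packets, labels):
        sums = defaultdict(lambda: [0.0] * 256)
        counts = defaultdict(int)
        for pkt, lab in zip(packets, labels):
            for i, v in enumerate(byte_histogram(pkt)):
                sums[lab][i] += v
            counts[lab] += 1
        self.centroids = {lab: [v / counts[lab] for v in vec]
                          for lab, vec in sums.items()}

    def predict(self, packet):
        feat = byte_histogram(packet)
        return min(self.centroids,
                   key=lambda lab: sum((a - b) ** 2 for a, b in
                                       zip(feat, self.centroids[lab])))
```

The enriched corpus 422, including the random-data packets 420 as a negative class, would be fed to `fit`, after which `predict` labels captured samples.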
As mentioned above with respect to operations 316, 318 and 320 (
A principal components analysis (PCA) technique can be used to reduce a dimensionality of the data. Knowing the UAV's telemetry protocol, processing circuitry (e.g., of control center 116 (
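The PCA dimensionality-reduction step can be sketched as follows, using power iteration on the sample covariance to recover the leading principal direction. This is a standard construction of PCA, not a method specific to this disclosure; a real pipeline would retain several components, not just one.

```python
def top_principal_component(data, iters=200):
    """Find the leading PCA direction of `data` (a list of equal-length
    feature rows) via power iteration on the sample covariance matrix.
    Returns (per-feature means, unit eigenvector)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d).
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]     # converges to the top eigenvector
    return means, v

def project(row, means, v):
    """1-D PCA score: compresses one d-dimensional sample to a scalar."""
    return sum((x - m) * c for x, m, c in zip(row, means, v))
```

Projecting each feature vector onto the top few components compresses the data while preserving most of its variance, which keeps downstream classification tractable.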
Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms. Circuitry is a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc.). Circuitry membership may be flexible over time and underlying hardware variability. Circuitries include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuitry may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware comprising the circuitry may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a computer readable medium physically modified (e.g., magnetically, electrically, such as via a change in physical state or transformation of another physical characteristic, etc.) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent may be changed, for example, from an insulating characteristic to a conductive characteristic or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuitry in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, the computer readable medium is communicatively coupled to the other components of the circuitry when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuitry. For example, under operation, execution units may be used in a first circuit of a first circuitry at one point in time and reused by a second circuit in the first circuitry, or by a third circuit in a second circuitry at a different time.
Machine (e.g., computer system) 600 may include a hardware processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 604 and a static memory 606, some or all of which may communicate with each other via an interlink (e.g., bus) 608. The machine 600 may further include a display unit 610, an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse). In an example, the display unit 610, input device 612 and UI navigation device 614 may be a touch screen display. The machine 600 may additionally include a storage device (e.g., drive unit) 616, a signal generation device 618 (e.g., a speaker), a network interface device 620, and one or more sensors 621, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 600 may include an output controller 628, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
The storage device 616 may include a machine readable medium 622 on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 624 may also reside, completely or at least partially, within the main memory 604, within static memory 606, or within the hardware processor 602 during execution thereof by the machine 600. In an example, one or any combination of the hardware processor 602, the main memory 604, the static memory 606, or the storage device 616 may constitute machine readable media.
While the machine readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 624.
The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600 and that cause the machine 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. Accordingly, machine-readable media are not transitory propagating signals. Specific examples of massed machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic or other phase-change or state-change memory circuits; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 626. In an example, the network interface device 620 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Each of the non-limiting aspects above can stand on its own, or can be combined in various permutations or combinations with one or more of the other aspects or other subject matter described in this document.
The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments in which the invention can be practiced. These embodiments are also referred to generally as “examples.” Such examples can include elements in addition to those shown or described. However, the present inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the present inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof), either with respect to a particular example (or one or more aspects thereof), or with respect to other examples (or one or more aspects thereof) shown or described herein.
In the event of inconsistent usages between this document and any documents so incorporated by reference, the usage in this document controls.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In this document, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended, that is, a system, device, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
Method examples described herein can be machine or computer-implemented at least in part. Some examples can include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods can include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code can include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code can be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media can include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments can be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
This patent application claims the benefit of priority of Song et al., U.S. Provisional Patent Application Ser. No. 62/833,153, titled “UAS DETECTION AND NEGATION,” filed on Apr. 12, 2019 (Attorney Docket No. 4568.006PRV), which is hereby incorporated by reference herein in its entirety.
Number | Date | Country
--- | --- | ---
62833153 | Apr 2019 | US