Embodiments of the invention relate to systems and methods for detecting small unmanned aerial systems. More specifically, embodiments of the invention relate to the employment of intra-netted acoustic detection of small unmanned aerial systems in 360-degrees of terrain-independent coverage with multiple radii in depth.
Typical systems and methods of detecting Unmanned Aerial Systems (UAS) employ radar, visible optics, thermal optics, and/or radio frequency detection. However, small UAS may still elude these line-of-sight detection methods because they can fly nap-of-the-earth, leverage terrain features for cover and concealment, and/or move unpredictably within high-clutter, low-altitude areas. Furthermore, UAS may be extremely difficult to detect using radar and/or electro-optical systems. Because current UAS detection methods are line-of-sight, they do not allow for detection in complex urban settings, behind hills, or in valleys where attacking UAS may hide. Furthermore, even when a UAS is in line-of-sight, the cross-section of the UAS may be drastically reduced depending on the orientation of the UAS relative to the radar signal and the materials used in its construction. Such tactical measures may be exploited by attacking UAS and decrease confidence in detecting the UAS using radar and/or electro-optical systems. Further, small UAS have small thermal and visible signatures, are quiet, and can easily be mistaken for birds. These drawbacks of current detection methods make it very difficult to accurately detect and/or identify UAS threats when they move fast at low altitude in highly cluttered, non-line-of-sight conditions.
Further compounding the problem, small UAS are readily available, man-portable, inexpensive, and capable of carrying small payloads of sensors, munitions, and/or contraband, and the worldwide market is expected to grow continuously. Any person may acquire and modify a UAS. These conditions create the basis for a capability whereby a UAS may be flown in restricted zones and outfitted with destructive payloads such as explosives and/or chemical, biological, or radiological materials. Further, many national governments, non-government organizations, and terrorist organizations are experimenting with and employing small UAS for a host of purposes. The abundance of UAS, combined with the difficulties in identifying and tracking them, creates a need for Counter-Small Unmanned Aerial System (C-sUAS) strategies and capabilities.
What is needed is a system that accurately and reliably detects that UAS are present, determines whether they are a threat, provides integrated early warning, engages the UAS, and does so regardless of terrain and/or terrain features, natural or man-made, under both line-of-sight and non-line-of-sight conditions within a redundant, layered construct, all while minimizing constant hands-on attention until a triggering event. Systems and methods utilizing acoustic sensors and acoustic sensor arrays may provide more accurate detection and identification of UAS. Further, the passive nature of acoustics reduces the risk of being targeted by threat actors or forces; thus, detection and identification of UAS by acoustics may provide a lower-risk environment. Measuring acoustic signal characteristics of UAS may provide identification accurate enough that the UAS are not confused with friendly systems. Further, when compared to a database of acoustic signatures, the type of UAS may be identified. Further still, an array of acoustic sensors may be utilized to determine a number of UAS and the position and velocity of each UAS for tracking, display, and engagement. The systems and methods for detecting UAS using acoustic sensors described herein may provide more accurate and reliable detection and identification of UAS under a full range of operating conditions. Detection and identification of UAS may provide for a safer environment.
Embodiments of the invention solve the above-mentioned problems by providing systems and methods for non-line-of-sight passive detection and integrated early warning of UAS by a connected set of acoustic sensors. In some embodiments, the set of acoustic sensors detects non-line-of-sight UAS and triggers other sensors to actively detect, store, and transmit data. In some embodiments, the system also comprises acoustic sensors with integrated electro-optical imaging components operated in an orthogonal manner for further enhancing confidence in detection of UAS. In some embodiments, the system may track and record the UAS by visual sensors and automatically initiate engaging the UAS with weaponry.
A first embodiment of the invention is directed to a method of non-line-of-sight passive detection and integrated early warning of an unmanned aerial system, the method comprising the steps of positioning a plurality of geo-located acoustic sensors in depth within an intra-connected array according to at least one of a terrain, terrain features, or man-made objects or structures, receiving, from at least one acoustic sensor of the plurality of acoustic sensors, an acoustic signal, and comparing a signal indicative of at least a portion of the acoustic signal with known characteristic signals to classify a source of the acoustic signal, wherein the known characteristic signals include information indicative of unmanned aerial systems.
A second embodiment of the invention is directed to a system for non-line-of-sight passive detection and integrated early warning of an unmanned aerial system, the system comprising a plurality of geo-located acoustic sensors in depth within an intra-connected array according to at least one of a terrain, terrain features, or man-made objects or structures, at least one acoustic sensor of the plurality of acoustic sensors receiving an acoustic signal, and a processor. The system also comprises acoustic sensors with integrated electro-optical imaging components operated in an orthogonal manner for further enhancing confidence in detection of UAS. The system further comprises one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by the processor, perform a method of classifying a source of the acoustic signal. The method comprises the step of comparing a signal indicative of at least a portion of the acoustic signal with known characteristic signals to classify the source of the acoustic signal, wherein the known characteristic signals include information indicative of unmanned aerial systems.
A third embodiment of the invention is directed to a system for non-line-of-sight passive detection and integrated early warning of an unmanned aerial system, the system comprising a plurality of geo-located acoustic sensors in depth within an intra-connected array according to at least one of a terrain, terrain features, or man-made objects or structures, at least one acoustic sensor of the plurality of acoustic sensors receiving an acoustic signal, and a processor. The system also comprises acoustic sensors with integrated electro-optical imaging components operated in an orthogonal manner for further enhancing confidence in detection of UAS. The system further comprises one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by the processor, perform a method of classifying a source of the acoustic signal. The method comprises the steps of comparing a signal indicative of at least a portion of the acoustic signal with known characteristic signals to classify the source of the acoustic signal, wherein the known characteristic signals include information indicative of unmanned aerial systems and determining a threat level of the source of the acoustic signal based at least in part on the classification of the source of the acoustic signal.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages of the invention will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.
Embodiments of the invention are described in detail below with reference to the attached drawing figures, wherein:
The drawing figures do not limit the invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
Embodiments of the invention solve the above-described problems and provide a distinct advance in the field by providing a method and system for passively detecting UAS. In some embodiments, acoustic sensors may be arranged in arrays. The acoustic sensors may detect vibrations in the air and ground as derived from UAS propeller rotations. The signal measured by the acoustic sensors may be compared to a database of known characteristic signals to determine the source of the signal and whether the source is friendly or a possible threat. In some embodiments, acoustic sensors may have integrated electro-optical imaging components operated in an orthogonal manner for further enhancing confidence in detection of UAS. In some embodiments, detection of the UAS may trigger additional sensors and systems and methods for countering the threat.
Though UAS are described in embodiments herein, it should be recognized that any vehicle may be detected and recognized. For example, the vehicle may be any aircraft such as a UAS, an airplane, a helicopter, or any other aerial vehicle. Though exemplary small UAS are discussed herein, the UAS may be of any size and weight. Similarly, the vehicle may be any ground-based vehicle, manned or unmanned, military or civilian, such as an automobile or any other ground-based vehicle. Similarly, a water-based vehicle may be detected and recognized, such as a motorboat, sailboat, hydrofoil, submarine, or any other water-based vehicle. The systems and methods described herein are not limited to small UAS.
The following detailed description references the accompanying drawings that illustrate specific embodiments in which the invention can be practiced. The embodiments are intended to describe aspects of the invention in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments can be utilized and changes can be made without departing from the scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense. The scope of the invention is defined only by the appended claims, along with the full scope of equivalents to which such claims are entitled.
In this description, references to “one embodiment,” “an embodiment,” or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment,” “an embodiment,” or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments but is not necessarily included. Thus, the technology can include a variety of combinations and/or integrations of the embodiments described herein.
Turning first to
Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database. For example, computer-readable media include (but are not limited to) RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data temporarily or permanently. However, unless explicitly specified otherwise, the term “computer-readable media” should not be construed to include physical, but transitory, forms of signal transmission such as radio broadcasts, electrical signals through a wire, or light pulses through a fiber-optic cable. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations.
Finally, network interface card (NIC) 124 is also attached to system bus 104 and allows computer 102 to communicate over a network such as network 126. NIC 124 can be any form of network interface known in the art, such as Ethernet, ATM, fiber, Bluetooth, or Wi-Fi (i.e., the IEEE 802.11 family of standards). NIC 124 connects computer 102 to local network 126, which may also include one or more other computers, such as computer 128, and network storage, such as data store 130. Generally, a data store such as data store 130 may be any repository from which information can be stored and retrieved as needed. Examples of data stores include relational or object-oriented databases, spreadsheets, file systems, flat files, directory services such as LDAP and Active Directory, or email storage systems. A data store may be accessible via a complex API (such as, for example, Structured Query Language), a simple API providing only read, write and seek operations, or any level of complexity in between. Some data stores may additionally provide management functions for data sets stored therein such as backup or versioning. Data stores can be local to a single computer such as computer 128, accessible on a local network such as local network 126, or remotely accessible over Internet 132. Local network 126 is in turn connected to Internet 132, which connects many networks such as local network 126, remote network 134 or directly attached computers such as computer 136. In some embodiments, computer 102 can itself be directly connected to Internet 132.
Acoustic sensors 202 may comprise different battery capacities determining the length of time in operation without replacement or maintenance. First sensor 204 may be a sensor capable of remaining in the field without battery replacement or maintenance for two years or more. Second sensor 206 may have a smaller battery and shorter battery life and may remain operational for up to six months. First sensor 204 and second sensor 206 are exemplary, and the battery life of acoustic sensors 202 may depend on the type of battery and any additional power-consuming components. Acoustic sensors 202 may comprise a power management system that allows acoustic sensors 202 to remain in a low-power state until triggered by the detection of an external sound. The power management system may allow acoustic sensors 202 to remain deployed for extensive periods without battery replacement. In some embodiments, acoustic sensors 202 may be connected to wired power and may remain operational indefinitely.
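By way of illustration only, the wake-on-sound behavior of the power management system described above may be sketched in Python; the `PowerManager` name, RMS threshold, and frame counts are illustrative assumptions rather than a definitive implementation:

```python
class PowerManager:
    """Keeps a sensor in a low-power state until incoming sound exceeds
    an amplitude threshold, then returns to low power after a run of
    quiet frames. All thresholds here are illustrative."""

    def __init__(self, wake_threshold=0.1, quiet_frames=50):
        self.threshold = wake_threshold    # RMS amplitude that wakes the sensor
        self.quiet_frames = quiet_frames   # quiet frames before sleeping again
        self.state = "low_power"
        self._quiet = 0

    def process_frame(self, samples):
        """Consume one frame of audio samples and return the new state."""
        rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
        if rms > self.threshold:
            self.state = "active"
            self._quiet = 0
        elif self.state == "active":
            self._quiet += 1
            if self._quiet >= self.quiet_frames:
                self.state = "low_power"
        return self.state
```

In this sketch the sensor consumes meaningful power only while frame energy exceeds the threshold, consistent with remaining deployed for extended periods on battery power.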
As depicted, second sensor 206 is larger than first sensor 204. In some embodiments, different battery types may be used based on the use of acoustic sensors 202. For example, first sensor 204 may be used in proximity to an airport, where there may be no restriction on when first sensor 204 may be maintained and its batteries replaced. Consequently, first sensor 204 may be maintained without concern. Alternatively, second sensor 206 may be used within a high-risk region of interest, such as a military environment where access is restricted. It may be dangerous to access the location of second sensor 206 and, therefore, the battery may be much larger to reduce the frequency of required maintenance. In high-threat regions, second sensor 206 may provide up to a 2-year operational window. Furthermore, acoustic sensors 202 may include a power input such that acoustic sensors 202 may be directly coupled to an external power source.
In some embodiments, acoustic sensors 202 may be positioned in an intra-netted layout, or array, and share a power source that may be a battery or be directly connected to a nearby facility. In the intra-netted layout (or array), at least one of the acoustic sensors 202 is communicatively coupled (i.e., connected) to at least one other acoustic sensor. In some embodiments, each of the acoustic sensors 202 in the intra-netted layout is communicatively connected to all of the other sensors. The “intra-netted” layout as used herein is intended to encompass one or more sensors communicatively connected to one or more other sensors in a plurality of sensors arranged in an array for non-line-of-sight passive detection. Such communicative connection may be obtained via a local area network, Bluetooth, Wi-Fi, or any other presently known or future wired or wireless communication means.
In some embodiments, acoustic sensors 202 may comprise microphones 208 capable of detecting small vibrations in the air. Microphones 208 may be configured to detect desired sounds while filtering sounds that are not desirable. As depicted on first sensor 204, microphones 208 may be disposed on an outer surface, or housing 214. Microphones 208 may be arranged to individually or collectively detect 360 degrees around first sensor 204. Furthermore, microphones 208 may be slightly set back and partially covered by housing 214 such that noise from wind or other ambient sounds is reduced. In some embodiments, microphones 208 may be completely exposed and mounted on a stand or at a separate location from first sensor 204 and communicatively connected by wire or wirelessly. In some embodiments, microphones 208 may have any polar pattern, such as cardioid, omnidirectional, figure-eight, or any other pattern, depending on the arrangement and the target direction. Furthermore, in some embodiments, noise-cancelling or noise-reduction devices may be used to filter known noises prior to detection by microphones 208. For example, baffling, foam, windscreens, and any other noise-cancellation devices may be added based on the expected noises in the environment in which microphones 208 are placed. In some embodiments, microphones 208 may be condenser or dynamic microphones and may be micro-electromechanical system (MEMS) microphones.
An exemplary sensor interior is depicted in
In some embodiments, some components may be exterior and may be communicatively connected to acoustic sensors 202 by electrical ports or by transceivers. For example, in some embodiments, GPS receiver 218 may be positioned at a single location and the acoustic sensors 202 may comprise laser range finders that determine a range between GPS receiver 218 and acoustic sensors 202. In some embodiments, GPS receiver 218 may be positioned at a central server, or data management system. Furthermore, GPS receiver 218 may be positioned at a different location than acoustic sensors 202 if acoustic sensors 202 are under overhead cover resulting in intermittent reception. In some embodiments, position sensors such as, for example, GPS, proximity sensors such as Bluetooth, Radio Frequency Communication (e.g., RFID tags), laser range finders, or any other position sensors may be used to determine the position of the acoustic sensors 202. The position sensors may be used to determine the global coordinates of acoustic sensors 202 as well as the relative location of each sensor to a region of interest and other sensors. Any components included in first sensor 204 may also be included in second sensor 206. Though first sensor 204 is referenced in embodiments described herein, it should be understood that second sensor 206 may include the same or similar components and perform the same or similar function.
In some embodiments, the acoustic sensors 202 may also comprise memory, or local storage 122, containing a database of characteristic signals for comparing to detected acoustic signals. Signals indicative of friendly aircraft and UAS may be stored as non-threats, and signals indicative of small UAS that are not known or known to be unfriendly may be stored as possible threats, or known threats. Furthermore, other phenomena such as, for example, general aviation aircraft, commercial aircraft, ground vehicles, traffic, or any other usual and natural phenomenon common to the environment in which the acoustic sensors 202 are placed, may be stored for comparison to received acoustic signals. Furthermore, algorithms for filtering certain types of noises may be stored. For example, wind, rain, snow, and other environmental conditions may create characteristic signals that may be used to train machine learning algorithms. Once the characteristic signals are learned, the machine learning algorithm may classify a signal as, for example, rain, wind, earthquake, or any other natural or man-made non-threat signals. Once the non-threat signals are classified, the non-threat signals may either be filtered or canceled as described in more detail below.
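By way of illustration only, the comparison of a detected signal against a database of characteristic signals may be sketched in Python; the zero-crossing frequency estimate, the labels, and the tolerance are illustrative assumptions, and a fielded classifier would use richer spectral features or a trained machine learning model as described above:

```python
def dominant_frequency(samples, rate):
    """Estimate the fundamental frequency (Hz) of a signal by counting
    positive-going zero crossings over the sample window."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings * rate / len(samples)

def classify(samples, rate, database, tolerance_hz=10.0):
    """Match the dominant frequency against known signatures; return the
    closest label within tolerance, or 'unknown' with the estimate."""
    freq = dominant_frequency(samples, rate)
    best_label, best_err = "unknown", tolerance_hz
    for label, ref_hz in database.items():
        err = abs(freq - ref_hz)
        if err < best_err:
            best_label, best_err = label, err
    return best_label, freq
```

A signal whose estimated frequency falls within tolerance of a stored signature is classified under that signature's label; anything else is returned as unknown and may be stored for future comparisons.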
Acoustic sensors 202 may comprise transceiver antenna 210 for transmitting and receiving communication from various communication devices. As depicted in
In some embodiments, mobile communication device 220 may be used in combination with acoustic sensors 202 and in communication with transceiver antenna 210. Mobile communication device 220 may receive any communication from acoustic sensors 202 including data from electro-optical sensors, acoustic sensors, and any alerts or notifications. In some embodiments, mobile communication device 220 may be any system comprising hardware platform 100 as described above and depicted in
Complete coverage of ROI 304 may require discrete sensor placements at a number of sensor positions that are non-line-of-sight from a central point and may be optimally placed to account for complex terrain, terrain features, other cluttering conditions, and/or man-made objects so as to achieve assured coverage for operations in depth from a central point. Sensor array 300 may be arranged such that a UAS may not be able to penetrate the perimeter without being detected by acoustic sensors 202. For example, acoustic sensors 202 may be arranged such that detectable areas 302 around each sensor overlap as shown. Placing acoustic sensors 202 such that detectable areas 302 overlap leaves no gaps through which a UAS could breach detectable areas 302 undetected.
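By way of illustration only, a placement check for the overlap condition described above may be sketched in Python; the sketch assumes a perimeter ring of sensors sharing a common detection radius, which is an illustrative simplification of terrain-dependent placement:

```python
import math

def coverage_gaps(positions, radius):
    """Return index pairs of adjacent perimeter sensors whose detectable
    areas do not overlap (i.e., spacing is at least twice the radius)."""
    gaps = []
    n = len(positions)
    for i in range(n):
        x1, y1 = positions[i]
        x2, y2 = positions[(i + 1) % n]          # next sensor around the ring
        if math.hypot(x2 - x1, y2 - y1) >= 2 * radius:
            gaps.append((i, (i + 1) % n))
    return gaps
```

An empty result indicates that every pair of adjacent detectable areas overlaps, leaving no perimeter gap for a UAS to exploit.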
In an exemplary scenario, ROI 304 is a possible target of terrorism. ROI 304 may be any protected facility such as, for example, a government building, prison, national border, power plant, oil field, military facility or other critical infrastructure. Acoustic sensors 202 may be placed around ROI 304 such that all sides may be protected. As shown in
Though a circular array of acoustic sensors 202 is depicted in
In some embodiments, acoustic sensors 202 may be arranged along the uneven terrain such that the UAS may be detected without line-of-sight electromagnetic sensors. A symmetric arrangement of the acoustic sensors 202 is not necessary as long as the location of each sensor is known. This can be achieved by GPS sensors on the acoustic sensors or simply by recording and storing the relative location of each sensor. For more precise location information, range measurement devices may be disposed on the acoustic sensors 202 or at the location of the acoustic sensors 202. For example, each acoustic sensor may be enabled with laser range finding for determining a precise distance from a known location. This may provide extremely accurate location information for the acoustic sensors such that the UAS location may also be accurately determined. Because acoustic sensors 202 may not move, the location may be recorded and stored one time such that each sensor does not have to be equipped with a location detection device.
Continuing with the exemplary embodiment described above where terrorists use a swarm of UAS to attack ROI 304, tactics may be used to hide the UAS from detection. As depicted, ROI 304 is an airfield being attacked by a swarm of UAS. For example, the swarm of UAS may be programmed to hide from line-of-sight detection using canyons, hills, buildings, vegetation, riverbanks, and any other cover. Acoustic sensors 202 may be positioned to detect the UAS when line-of-sight detection methods are diminished or not workable. In the exemplary scenario depicted by GUI 400, mountain area 404 may be mountainous terrain, and the swarm of UAS may be represented by the path 406. The closest sensors, identified with cross lines, may detect the swarm of UAS first. When a sensor of acoustic sensors 202 detects an acoustic signal, the sensor may wake from a low-power state in which it is only listening. Upon waking, the sensor may compare the received signal with stored characteristic signals and classify the signal as a particular type of UAS and a threat level. If the signal is determined to be a threat, the sensor may transmit data to the other sensors of acoustic sensors 202. The transmitted data may simply wake the other sensors such that they begin processing acoustic signals, or it may comprise the classifications and the signal information such that the other sensors know what to listen for and know that the signal source has already been classified as a threat.
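By way of illustration only, the wake-and-cue exchange among intra-netted sensors described above may be sketched in Python; the `Sensor` class, state names, and message contents are illustrative assumptions, not a definitive protocol:

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    """Minimal model of one intra-netted acoustic sensor."""
    name: str
    state: str = "low_power"              # low_power -> listening -> alerting
    peers: list = field(default_factory=list)
    received: list = field(default_factory=list)

    def detect(self, classification, threat):
        """Wake on a detected signal; cue peer sensors when it is a threat."""
        self.state = "listening"
        if threat:
            self.state = "alerting"
            for peer in self.peers:
                peer.cue(self.name, classification)

    def cue(self, source, classification):
        """A peer reported a threat: wake and record what to listen for."""
        if self.state == "low_power":
            self.state = "listening"
        self.received.append((source, classification))
```

Here a cued sensor receives both the source of the alert and the classification, so it knows the signal has already been classified as a threat before it begins its own processing.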
In some embodiments, the transmitted data is received by mobile communication device 220 or at a remote observation station that may be located at ROI 304 (e.g., the airfield). GUI 400 may be displayed via mobile communication device 220 to a user in the field or at any remote observation station. GUI 400 may display any map data, which may be open source, and the locations of acoustic sensors 202 may be displayed on the map. GUI 400 may display location coordinates 408 or any other location indication. Any sensor that detects the acoustic signal may be indicated as such by changing color, blinking, changing size, or by any other method. Furthermore, an indicia 410 may be displayed by GUI 400 indicating that an acoustic signal is detected. Furthermore, the indicia 410 may be indicative of a threat level by color, size, shape, texture, blinking, or any other method.
In some embodiments, acoustic sensors 202 may be coupled with and trigger other sensors. The sensors may detect a threat as described in embodiments above and send a signal to additional sensors to begin recording, processing, storing, and transmitting. The additional sensors may be acoustic sensors in the intra-netted array; however, in some embodiments, the additional sensors may be combined with the sensor and detect various other phenomena associated with the source of the sound vibration. For example, the additional sensors may be optical. In some embodiments, the data transmitted by acoustic sensors may trigger line-of-sight sensors such as, for example, RADAR, video cameras, still image cameras, thermal imaging cameras, electro-optical infrared, and any other cameras that may detect electromagnetic radiation at any wavelength of the spectrum. The alternative sensor may also transmit data to remote observation stations for visual tracking and identification by personnel. In some embodiments, the remote observation station may be a central control station for providing power to and facilitating communication between acoustic sensors 202. The data may be transmitted in near real time such that personnel may monitor the changing situation and provide a quick real-time response. For example, an array of acoustic sensors 202 may be disposed at a military airfield ROI 304 as described in embodiments above. In some embodiments, the acoustic sensors 202 may be coupled with a parabolic microphone for detecting over long ranges in specific directions. For example, line-of-sight sensors such as radar and cameras may be used for threat detection across a large area; however, mountain area 404 may obscure the line-of-sight sensors. Acoustic sensors 202 may be directed toward the valley for specific acoustic detection in the direction of the mountains.
As such, acoustic sensors 202 may detect the acoustic signal associated with the UAS before the line-of-sight sensors do, and acoustic sensors 202 may signal the other sensors to begin recording, processing, and transmitting.
In some embodiments, the data from acoustic sensors 202 may be used to provide visual virtual reality (VR) simulations for display to tactical groups. As described above, acoustic sensors 202 may be placed in an array and may trigger other sensors such as, for example, a video camera. In some embodiments, acoustic sensors 202 may comprise electro-optical sensors. The electro-optical data obtained by the electro-optical sensors may be transmitted with the acoustic data from acoustic sensors 202. In some embodiments, an array of video cameras, or the integrated electro-optical sensors, may be triggered and actuated to focus on the acoustic signal source, which may be the UAS swarm. The video data recorded by the plurality of video cameras (e.g., electro-optical sensors) may be combined into a three-dimensional virtual and/or augmented reality (VR/AR) display of the environment. The VR/AR display of the environment may be provided at a remote location for review by personnel. In some embodiments, the VR/AR display may be provided to personnel on the ground such as, for example, military groups, fire fighters, police officers, or other emergency personnel who may be en route or on location.
In some embodiments, acoustic sensors 202 may transmit signals that trigger initiation of weapons-based man-in-the-loop effectors, generally referenced as weapons 412, that engage the UAS. Weapons 412 may be any engagement device that may use sound, electromagnetic radiation, projectiles, and/or explosives to incapacitate the acoustic signal source. For example, the swarm of UAS may approach the military airfield described above. The swarm of UAS may approach out of sight of line-of-sight detection devices such as optical cameras and radar. The UAS may be detected by acoustic sensors 202 of acoustic detection system 200. Acoustic sensors 202 may detect the sound (i.e., acoustic signal) of the UAS and transmit the signal indicative of the UAS sound to at least one processor that may classify the sound of the UAS and determine a threat level as described in embodiments herein. When it is determined that the UAS pose a threat, weapons 412 may be activated and supplied a position of the detected UAS. In some embodiments, weapons 412 may be a plurality of laser-emitting devices and each laser-emitting device may be activated. Each laser-emitting device may be assigned a UAS or a plurality of UAS.
In some embodiments, the target direction of the laser-emitting devices may be updated in real time as the UAS is tracked. The laser-emitting device may also be connected to an optical sensor, acoustic sensors 202, and any other sensor that allows the laser-emitting device to track and target the UAS when it becomes visible, using a statistical algorithm such as, for example, an extended Kalman filter. When the UAS is targeted, the laser-emitting device may engage and destroy the UAS. After a first UAS is destroyed, the laser-emitting device may move on and engage a second UAS. The laser-emitting device may move to the next-closest UAS, to whichever UAS poses the greatest threat, or may target the UAS in any other tactical manner.
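By way of illustration only, the statistical tracking described above may be sketched in Python; a full extended Kalman filter is nonlinear, so this sketch shows a linear constant-velocity Kalman filter for a single track coordinate, with illustrative process and measurement noise values:

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one coordinate of a UAS track.
    Noise parameters q (process) and r (measurement) are illustrative."""

    def __init__(self, pos, vel, dt=1.0, q=0.01, r=1.0):
        self.x = [pos, vel]                    # state: position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]      # state covariance
        self.dt, self.q, self.r = dt, q, r

    def predict(self):
        """Propagate state and covariance with F = [[1, dt], [0, 1]]."""
        dt, p = self.dt, self.P
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + self.q
        self.P = [[p00, p01], [p10, p11]]

    def update(self, z):
        """Fuse a position-only measurement (H = [1, 0])."""
        y = z - self.x[0]                      # innovation
        s = self.P[0][0] + self.r              # innovation variance
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p = self.P
        self.P = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
```

Fed a stream of position fixes, the filter converges on the target's velocity, allowing the effector's aim point to be propagated forward between measurements.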
In some embodiments, acoustic sensors 202 may be placed in an urban environment. Acoustic sensors 202 may be trained to detect and classify urban sounds such as, for example, conversation, traffic, animals, and alarms, as well as natural sounds. Acoustic sensors 202 may be placed on buildings and towers for relative height displacement. In some embodiments, acoustic sensors 202 may be placed around and on sensitive buildings and other critical infrastructure such as, for example, government buildings, foreign embassies, prisons, defense contractor buildings, and the like. In some embodiments, the system may be connected to law enforcement communications and the Internet and may automatically determine if there is a threat. For example, the system may detect a swarm of UAS and determine from analyzing the news of the area that a local light show involving UAS is underway. Furthermore, the system may be notified by law enforcement communication that unknown UAS are entering secured airspace around the foreign embassy and may automatically activate all sensors, begin storing information, and begin processing acoustic signals.
In some embodiments, acoustic sensors 202 are disposed with vertical displacements as shown in
In some embodiments, any other sensor data may be combined with data from acoustic sensors 202 to provide a better estimate of the location of quadcopter 502. In some embodiments, the three-dimensional location of quadcopter 502 may be determined from a planar array or a sensor array that is terrain-based when the locations of acoustic sensors 202 are known; however, placing acoustic sensors 202 at elevation may provide early warning and more accurate location of higher altitude UAS as well as more accurate tracking of vertical movement of the UAS. Acoustic sensors 202 may be placed at elevation based on the terrain or may be placed at elevation on stands 512.
Turning now to
Furthermore, acoustic sensors 202 may be arranged to reduce noise as described above. A sensor that is further from the ground may reduce ground noise if the sensor is positioned near a roadway, railroad tracks, bridge, or the like. A sensor may be positioned behind a wall or building to reduce wind in a windy environment and may be configured to detect acoustic signals from a specific target direction. These processes may reduce and filter noise and friendly acoustic signals such that the acoustic detection system 200 may process the target acoustic signals.
In some embodiments, acoustic sensors 202 may detect acoustic signals and store the acoustic signals in the local storage 122. Instructions stored on one or more non-transitory computer-readable media may be executed by at least one processor to compare the acoustic signals with a database of known characteristic signals to determine a type of acoustic sound that was detected by the acoustic sensors 202. For example, a gust of wind may be detected. Upon comparison to the database of characteristic signals, it may be determined that the acoustic signal is indicative of a gust of wind, and the acoustic signal may be disregarded or stored for later comparisons. Alternatively, the acoustic signal may be compared to the database of characteristic signals, and it may be determined that the acoustic signal matches a known UAS that is in violation of flying restrictions. For example, the signal may be indicative of the quadcopter 502 turning propellers at a specific RPM indicative of the size of the propellers and the weight of quadcopter 502. The characteristics of the acoustic signal may be compared to the database of characteristic signals, and it may be determined that the source of the signal (e.g., quadcopter 502) is a known threat. When an unknown signal or a known threat is detected, an alert may be transmitted notifying the authorities and personnel at ROI 304 of the threat. When an unknown signal is identified, the unknown signal may be stored as a characteristic signal for future comparisons. In some embodiments, integration of electro-optical imaging components within acoustic sensors 202 may enable real-time orthogonal sensing and deliver higher-confidence detections, especially under non-line-of-sight conditions. In some embodiments, orthogonal sensing may utilize any sensors described herein to cover detectable areas 302. The sensors may be arranged in any location and may be positioned to detect at any angle relative to other sensors, including acute, right, and obtuse angles.
Furthermore, as shown in both log frequency power spectrum 604 and linear frequency power spectrum 606, a Doppler shift in frequency may be indicative of motion of the UAS either towards or away from acoustic sensors 202. As the UAS moves closer to the sensor, the frequency may increase, and as the UAS moves away from the sensor, the frequency may decrease. As such, a single sensor may receive data that can be analyzed to determine motion of the UAS relative to the sensor. The Doppler motion and the increased RPM may be combined to indicate increased speed toward or away from the sensor.
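The frequency shift described above follows the standard Doppler relation for a stationary sensor, f_observed = f_source · c / (c − v), where v is the radial speed of the source (positive when approaching). A minimal sketch, with an assumed speed of sound and an illustrative rotor tone:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumption)

def doppler_observed_freq(source_freq_hz, radial_speed_mps):
    """Observed frequency at a stationary sensor for a source moving
    at radial_speed_mps (positive = approaching, negative = receding)."""
    return source_freq_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - radial_speed_mps)
```

For example, a 200 Hz rotor tone from a quadcopter approaching at 15 m/s is observed near 209 Hz; the same quadcopter receding at 15 m/s is observed below 200 Hz, so the sign of the shift indicates direction of motion relative to the sensor.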
The signals may be analyzed and classified using machine learning algorithms such that the source of the detected sound has an associated classification probability. In some embodiments, the signal extraction may be performed in time, frequency, and wavelet domains, and the acoustic signal may be analyzed for noise, separability, repeatability, and robustness prior to further analysis. In some embodiments, acoustic signal analysis may classify by comparison to characteristic signals using exemplary statistical and machine learning algorithms such as linear discriminant analysis, distance-based likelihood ratio tests, quantitative descriptive analysis, artificial neural networks, and the like.
In some embodiments, a machine learning algorithm (MLA) may be trained for signal classification. The MLA may be trained on known noises such as wind, rain, traffic, human and animal voices, foot traffic, and other non-threat noises that may be expected in the area of the sensors. Furthermore, the MLA may be trained on known and friendly aircraft and vehicles for classification of the vehicles as a non-threat classification. Similarly, the MLA may be trained on known UAS and enemy vehicle sounds such that the MLA may be trained to detect threats with a minimum known probability. In some embodiments, the MLA may provide a probability of detection and a probability of false alarms based on the classification.
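As one illustrative sketch of such a classifier (a simple distance-based stand-in for the linear discriminant, likelihood-ratio, or neural-network approaches named above), distances from a spectral feature vector to per-class centroids may be converted into classification probabilities. The class names, centroid values, and feature layout here are invented for illustration:

```python
import numpy as np

# Invented centroids of spectral feature vectors per class
# (e.g., mean log-power in a few frequency bands).
CLASS_CENTROIDS = {
    "wind":       np.array([8.0, 2.0, 0.5]),
    "airliner":   np.array([6.0, 6.0, 3.0]),
    "quadcopter": np.array([2.0, 7.0, 9.0]),
}

def classify(feature_vec, temperature=1.0):
    """Distance-based soft classification: converts the Euclidean
    distance to each class centroid into a per-class probability via
    a softmax over negative distances."""
    names = list(CLASS_CENTROIDS)
    dists = np.array([np.linalg.norm(feature_vec - CLASS_CENTROIDS[n])
                      for n in names])
    logits = -dists / temperature
    p = np.exp(logits - logits.max())   # subtract max for stability
    p /= p.sum()
    return dict(zip(names, p))
```

The highest-probability class serves as the classification, and its probability feeds directly into the threat-level determination described below.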
In some embodiments, a threat level may be determined. The signal may be compared to the database and the source of the signal determined with a probability. The probability may be used to determine a threat level. For example, the acoustic signal may match known signal characteristics 100%, and it may be determined that the source of the acoustic signal is a commercial airliner. The known commercial airliner is not a threat, so the threat level is indicated as zero. Alternatively, the source of the signal may be determined to be an unknown UAS type. Because the UAS is unknown, the threat level may be 50%, and more information may be required. Accordingly, an action taken may be to deploy surveillance or trigger alternative sensors to determine the UAS type and determine if the UAS is a threat. In the event that the UAS is determined to be a threat, a threat level of 100% may be determined and military action taken. The action based on the threat level may be determined by threshold levels. For example, at 75% threat probability, action is taken; at 25% threat probability, surveillance is initiated; and below 25%, no action is taken. The thresholds noted are examples, and any thresholds and threat levels may be used based on conditions.
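The example thresholds above amount to a simple mapping from threat probability to a response tier; the function and tier names below are illustrative, and the thresholds are configurable as noted:

```python
def threat_action(threat_prob, act_threshold=0.75, watch_threshold=0.25):
    """Maps a threat probability in [0, 1] to a response tier.
    Defaults mirror the 75%/25% example thresholds; both are
    adjustable to conditions."""
    if threat_prob >= act_threshold:
        return "engage"      # military action / lock down
    if threat_prob >= watch_threshold:
        return "surveil"     # deploy surveillance, trigger other sensors
    return "no_action"       # disregard or log the signal
```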
At step 704, the acoustic sensors 202 may send a signal indicative of the acoustic signal to be stored and processed. The acoustic signal may be received by, for example, microphones 208, and an electrical signal indicative of the acoustic signal may be generated and sent for storage and analysis. In some embodiments, many overlapping sounds may be received and, consequently, many overlapping signals may be sent.
At step 706, the signal indicative of the acoustic signal is stored and analyzed as described in embodiments above. The characteristics of the received acoustic signal may be compared to stored characteristics of stored signals in the database. The comparison may measure error between the received signals and the stored signal characteristics using statistical and machine learning algorithms. A low error may indicate a high likelihood that the received acoustic signal is the same as or similar to the stored signal. Likewise, a high error may indicate that the received acoustic signal is not the same as the characteristic signal to which the received signal is compared. The database may store a plurality of characteristic signals indicative of common sounds such as, for example, airplanes, wind, and automobiles. Further, the database may store characteristic signals indicative of known UAS threats. Therefore, the source of the acoustic signal may be determined and analyzed to determine whether the source is a threat.
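The error measurement of step 706 can be sketched, for example, as a mean-squared-error comparison of signal feature vectors against the database, with the lowest-error entry taken as the best match; the helper names and example vectors are assumptions:

```python
import numpy as np

def match_error(received, stored):
    """Mean squared error between a received feature vector and a
    stored characteristic-signal vector; lower error indicates a
    more likely match."""
    received = np.asarray(received, dtype=float)
    stored = np.asarray(stored, dtype=float)
    return float(np.mean((received - stored) ** 2))

def best_match(received, database):
    """Returns the database key whose stored characteristics have the
    lowest error against the received signal."""
    return min(database, key=lambda k: match_error(received, database[k]))
```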
At step 708, the source of the signal is analyzed to determine if the source of the signal is a threat. In some embodiments, a likelihood of threat is determined from the comparison of the acoustic signal and the stored signal characteristics. In some embodiments, and depending on line-of-sight versus non-line-of-sight conditions, the acoustic signal may be compared and correlated in real time against line-of-sight orthogonal sensor data or other non-line-of-sight sensor data, such as from integrated electro-optical components within acoustic sensor 202. The likelihood determined from the comparison at step 706 may be indicative of a likelihood that the source of the acoustic signal is a threat as described in embodiments above. Furthermore, there may be thresholds for determining action based on the perceived threats. The thresholds may be low, medium, and high threat, and actions may be taken based on the likelihood of a threat compared to the thresholds.
At step 710, if the source of the acoustic signal is a threat or is unknown, an automatic action may be taken. In some embodiments, an action may be taken based on the level of threat detected compared to threshold values. For example, no action may be taken, or the signal may be disregarded if no threat is detected. A warning and signal to initiate surveillance may be taken if the signal may be a threat. Military action, or lock down, may be taken if there is a high likelihood of a threat. The thresholds may be placed at any likelihood of a threat and may be customizable by the user.
At step 712, if the object is a threat and the location is, to some degree, known, additional actions may be taken such as, for example, triggering other area sensors and initiating engagement by man-in-the-loop weapons 412. In some embodiments, optical sensors may be triggered and provided the location of the source of the acoustic signal such that the optical sensors may observe the source. Furthermore, any sensor data may be used for tracking the vehicle. In some embodiments, man-in-the-loop weapons 412 may be triggered to engage and mitigate the threat. Any sensors and man-in-the-loop weapons 412 may be used to track, engage, and mitigate the source of the threat acoustic signal. Though man-in-the-loop weapons are described herein, in some embodiments, weapons may be automatically triggered to mitigate the threat.
Although the invention has been described with reference to the embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed, and substitutions made herein without departing from the scope of the invention.
This patent application is a continuation application claiming priority benefit, with regard to all common subject matter of U.S. patent application Ser. No. 17/339,447, filed Jun. 4, 2021, and entitled “ACOUSTIC DETECTION OF SMALL UNMANNED AIRCRAFT SYSTEMS” (“the '447 application”). The '447 application claims priority benefit of U.S. Provisional Application No. 63/036,575, filed Jun. 9, 2020, and entitled “ACOUSTIC DETECTION OF SMALL UNMANNED AIRCRAFT SYSTEMS.” The identified earlier-filed patent applications are hereby incorporated by reference in their entirety into the present application.
Number | Date | Country
---|---|---
63036575 | Jun 2020 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 17339447 | Jun 2021 | US
Child | 18237164 | | US