Unmanned vehicle recognition and threat management

Information

  • Patent Grant
  • Patent Number
    11,893,893
  • Date Filed
    Thursday, September 28, 2023
  • Date Issued
    Tuesday, February 6, 2024
Abstract
Systems and methods for automated unmanned aerial vehicle recognition. A multiplicity of receivers captures RF data and transmits the RF data to at least one node device. The at least one node device comprises a signal processing engine, a detection engine, a classification engine, and a direction finding engine. The at least one node device is configured with an artificial intelligence algorithm. The detection engine and classification engine are trained to detect and classify signals from unmanned vehicles and their controllers based on processed data from the signal processing engine. The direction finding engine is operable to provide lines of bearing for detected unmanned vehicles.
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention

The present invention relates to systems and methods for unmanned vehicle recognition and threat management. More particularly, the systems and methods of the present invention are directed to unmanned vehicle detection, classification and direction finding.


2. Description of the Prior Art

Unmanned Aerial Vehicles (UAVs), commonly known as drones, have become readily available in commercial and retail stores. Detailed schematics for their control systems, along with Software Development Kits (SDKs), are available from many manufacturers and on the internet. Advancements in various technologies (e.g., 3D printing) enable rapid modification. UAVs can be modified to perform dangerous actions and threaten public security; for example, UAVs can be modified to deliver dangerous payloads. Such misuse is no longer a question of if, but of when. Thus, it is imperative that organizations and governments take steps to protect critical assets (e.g., ports, power plants), structures (e.g., buildings, stadiums), personnel, and citizens.


Exemplary U.S. Patent Documents relevant to the prior art include:


U.S. Pat. No. 9,862,489 for “Method and apparatus for drone detection and disablement” by inventors Lee Weinstein et al., filed Feb. 7, 2016 and issued Jan. 9, 2018, describes a method and apparatus for detection and disablement of an unidentified aerial vehicle (UAV) that includes arrays of antenna elements receiving in two modalities (for instance, radio frequency (RF) and acoustic modalities, or RF and optical modalities). Signal processing of outputs from multiple antenna arrays locates a potential UAV at specific coordinates within a volume of space under surveillance, automatically aims video surveillance and a short-range projectile launcher at the UAV, and may automatically fire the projectile launcher to down the UAV.


U.S. Pat. No. 9,858,947 for “Drone detection and classification methods and apparatus” by inventors Brian Hearing et al., filed Nov. 24, 2015 and issued Jan. 2, 2018, describes a system, method, and apparatus for drone detection and classification. An example method includes receiving a sound signal in a microphone and recording, via a sound card, a digital sound sample of the sound signal, the digital sound sample having a predetermined duration. The method also includes processing, via a processor, the digital sound sample into a feature frequency spectrum. The method further includes applying, via the processor, broad spectrum matching to compare the feature frequency spectrum to at least one drone sound signature stored in a database, the at least one drone sound signature corresponding to a flight characteristic of a drone model. The method moreover includes, conditioned on matching the feature frequency spectrum to one of the drone sound signatures, transmitting, via the processor, an alert.


U.S. Pat. No. 9,767,699 for “System for and method of detecting drones” by inventors John W. Borghese et al., filed May 14, 2015 and issued Sep. 19, 2017, describes an apparatus and method that can provide a warning of a drone or unmanned aerial vehicle in the vicinity of an airport. The apparatus can include at least one antenna directionally disposed along the approach or departure path and a detector configured to provide a warning of the presence of an unmanned aerial vehicle or drone. The warning can be provided in response to a radio frequency signal received by the at least one antenna being in a frequency band associated with a transmission frequency for the unmanned aerial vehicle or drone, or in a frequency band associated with interaction from receive circuitry of the unmanned aerial vehicle or drone.


U.S. Pat. No. 9,715,009 for “Deterrent for unmanned aerial systems” by inventors Dwaine A. Parker et al., filed Dec. 2, 2016 and issued Jul. 25, 2017, describes a system for providing an integrated multi-sensor detection and countermeasure against commercial unmanned aerial systems/vehicles that includes a detecting element, a tracking element, an identification element, and an interdiction element. The detecting element detects an unmanned aerial vehicle in flight in the region of, or approaching, a property, place, event or very important person. The tracking element determines the exact location of the unmanned aerial vehicle. The identification/classification element, utilizing data from the other elements, generates the identification and threat assessment of the UAS. The interdiction element, based on automated algorithms, can either direct the unmanned aerial vehicle away from the property, place, event or very important person in a non-destructive manner, or can disable the unmanned aerial vehicle in a destructive manner. The interdiction process may be overridden by intervention by a System Operator/HiL.


U.S. Pat. No. 9,529,360 for “System and method for detecting and defeating a drone” by inventors Howard Melamed et al., filed Apr. 22, 2015 and issued Dec. 27, 2016, describes a system for detecting and defeating a drone. The system has a detection antenna array structured and configured to detect the drone and the drone control signal over a 360 degree field relative to the detection antenna array including detecting the directionality of the drone. The system also includes a neutralization system structured and configured in a communicating relation with the detection antenna array. The neutralization system has a transmission antenna structured to transmit an override signal aimed at the direction of the drone, an amplifier configured to boost the gain of the override signal to exceed the signal strength of the drone control signal, and a processing device configured to create and effect the transmission of the override signal. The patent also discloses a method for detecting and defeating a drone.


U.S. Publication No. 2017/0358103 for “Systems and Methods for Tracking Moving Objects” by inventors Michael Shao et al., filed Jun. 9, 2017 and published Dec. 14, 2017, describes systems and methods for tracking moving objects. The publication discloses an object tracking system comprising a processor, a communications interface, and a memory configured to store an object tracking application. The object tracking application configures the processor to receive a sequence of images; estimate and subtract background pixel values from pixels in the sequence of images; compute sets of summed intensity values for different per-frame pixel offsets from the sequence of images; identify summed intensity values from a set of summed intensity values exceeding a threshold; cluster identified summed intensity values exceeding the threshold corresponding to single moving objects; and identify a location of at least one moving object in an image based on at least one summed intensity value cluster.


U.S. Publication No. 2017/0261613 for “Counter drone system” by inventor Brian R. Van Voorst, filed Feb. 27, 2017 and published Sep. 14, 2017, describes a counter drone system that includes: a cueing sensor to detect the presence of an object, wherein the cueing sensor cues the presence of a target drone; a long range LIDAR system having a sensor pointed in the direction of the target drone to acquire and track the target drone at long range and provide an accurate location, wherein once a track is acquired, the motion of the target drone is used to maintain the track; and a threat detector, wherein LIDAR data is provided to the threat detector to determine if the target drone is a threat.


U.S. Publication No. 2017/0261604 for “Intercept drone tasked to location of lidar tracked drone” by inventor Brian Van Voorst, filed Feb. 27, 2017 and published Sep. 14, 2017, describes a system that includes a long range LIDAR tracking system to track a target drone and provide detection and tracking information of the target drone; a control system to process the detection and tracking information and provide guidance information to intercept the target drone; and a high powered intercept drone controlled by supervised autonomy, the supervised autonomy provided by processing the detection and tracking information of the target drone and sending guidance information to the intercept drone to direct the intercept drone to the target drone.


U.S. Publication No. 2017/0039413 for “Commercial drone detection” by inventor Gary J. Nadler, filed Aug. 3, 2015 and published Feb. 9, 2017, describes a method of capturing the presence of a drone, including: collecting, using at least one sensor, data associated with an aerial object; analyzing, using a processor, the data to determine at least one characteristic of the aerial object; accessing, in a database, a library of stored characteristics of commercially available drones; determining, based on the analyzing, if the at least one characteristic of the aerial object matches a characteristic of a commercially available drone; and responsive to the determining, generating an indication of a positive match.


SUMMARY OF THE INVENTION

The present invention provides systems and methods for unmanned vehicle recognition. In one embodiment, a multiplicity of receivers captures RF data and transmits the RF data to at least one node device. The at least one node device comprises a signal processing engine, a detection engine, a classification engine, and a direction finding engine. The at least one node device is configured with an artificial intelligence algorithm. The detection engine and classification engine are trained to detect and classify signals from unmanned vehicles and their controllers based on processed data from the signal processing engine. The direction finding engine is operable to provide lines of bearing for detected unmanned vehicles. A display and control unit is in network communication with the at least one node device for displaying locations and other related data for the detected unmanned vehicles.


These and other aspects of the present invention will become apparent to those skilled in the art after a reading of the following description of the preferred embodiment when considered with the drawings, as they support the claimed invention.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a system for unmanned vehicle recognition according to one embodiment of the present invention.



FIG. 2 illustrates signal characterization within a spectrum from 700 MHz to 900 MHz.



FIG. 3 is an illustration of Phantom 4 controller and drone signals.



FIG. 4 is a human interface display for drone detection according to one embodiment of the present invention.



FIG. 5 shows a setup for RF data capture in an Anechoic Chamber according to one embodiment of the present invention.



FIG. 6 illustrates a simulation for fading and channel hopping of an OcuSync drone.



FIG. 7 is an illustration of an inception based convolutional neural network.



FIG. 8 illustrates a scenario for an RF environment with just noise.



FIG. 9 illustrates a scenario for an RF environment with a Phantom 4 controller and drone.



FIG. 10 illustrates a scenario for an RF environment with two Mavic Pro drones.



FIG. 11 illustrates a scenario for an RF environment with a Mavic Pro controller only.



FIG. 12 illustrates a scenario for an RF environment with one Mavic Pro drone only.



FIG. 13 illustrates a scenario for an RF environment with one Phantom 3 controller only.



FIG. 14 illustrates a scenario for an RF environment with a Phantom 3 controller and drone.



FIG. 15 illustrates a scenario for an RF environment with a Mavic Pro controller and drone in wideband mode.



FIG. 16 illustrates a scenario for an RF environment with a Mavic Pro drone in wideband mode.



FIG. 17 illustrates a scenario for an RF environment with a Mavic Pro controller and drone.



FIG. 18 illustrates a scenario for an RF environment with a Mavic Pro controller and a Phantom 4 controller.



FIG. 19 is an illustration of identifying drones and controllers based on signal edge detection.



FIG. 20 is an illustration with averaged signal amplitudes and signal edges according to one embodiment of the present invention.



FIG. 21 displays a detection range of less than 500 meters based on equipment specification and location.





DETAILED DESCRIPTION

The present invention provides systems and methods for unmanned vehicle recognition. The present invention relates to automatic signal detection, temporal feature extraction, geolocation, and edge processing disclosed in U.S. patent application Ser. No. 15/412,982 filed Jan. 23, 2017, U.S. patent application Ser. No. 15/478,916 filed Apr. 4, 2017, U.S. patent application Ser. No. 15/681,521 filed Aug. 21, 2017, U.S. patent application Ser. No. 15/681,540 filed Aug. 21, 2017, and U.S. patent application Ser. No. 15/681,558 filed Aug. 21, 2017, each of which is incorporated herein by reference in its entirety.


Currently, commercial and retail UAVs dominate frequencies including the 433 MHz industrial, scientific, and medical (ISM) radio band in Region 1; the 900 MHz ISM band in Regions 1, 2, and 3 (varies by country); 2.4 GHz (channels 1-14); 5 GHz (channels 7-165 most predominant); and 3.6 GHz (channels 131-183). Modulation types used by commercial and retail UAVs include Direct Sequence Spread Spectrum (DSSS), Orthogonal Frequency Division Multiplexing (OFDM), Frequency Hopping Spread Spectrum (FHSS), and Futaba Advanced Spread Spectrum Technology (FASST).


Many counter UAV systems in the prior art focus on the 2.4 GHz and 5.8 GHz bands utilizing demodulation and decryption of radio frequency (RF) signals to detect and analyze each signal to determine if it is related to a UAV.


The present invention provides systems and methods for unmanned vehicle recognition including detection, classification and direction finding. Unmanned vehicles comprise aerial, terrestrial or water borne unmanned vehicles. The systems and methods for unmanned vehicle recognition are operable to counter threats from the aerial, terrestrial or water borne unmanned vehicles.


In one embodiment, a multiplicity of receivers captures RF data and transmits the RF data to at least one node device. The at least one node device comprises a signal processing engine, a detection engine, a classification engine, and a direction finding engine. The at least one node device is configured with an artificial intelligence algorithm. The detection engine and classification engine are trained to detect and classify signals from unmanned vehicles and their controllers based on processed data from the signal processing engine. The direction finding engine is operable to provide lines of bearing for detected unmanned vehicles. A display and control unit is in network communication with the at least one node device for displaying locations and other related data for the detected unmanned vehicles.


In one embodiment, the present invention provides systems and methods for unmanned vehicle (UV) recognition in a radio frequency (RF) environment. A multiplicity of RF receivers and a displaying device are in network communication with a multiplicity of node devices. The multiplicity of RF receivers is operable to capture the RF data in the RF environment, convert the RF data to fast Fourier transform (FFT) data, and transmit the FFT data to the multiplicity of node devices. The multiplicity of node devices each comprises a signal processing engine, a detection engine, a classification engine, a direction-finding engine, and at least one artificial intelligence (AI) engine. The signal processing engine is operable to average the FFT data into at least one tile. The detection engine is operable to group the FFT data into discrete FFT bins over time, calculate average and standard deviation of power for the discrete FFT bins, and identify at least one signal related to at least one UV and/or corresponding at least one UV controller. The at least one AI engine is operable to generate an output for each of the at least one tile to identify at least one UV and corresponding at least one UV controller with a probability, and calculate an average probability based on the output from each of the at least one tile. The classification engine is operable to classify the at least one UV and/or the at least one UV controller by comparing the at least one signal to classification data stored in a classification library in real time or near real time. The direction-finding engine is operable to calculate a line of bearing for the at least one UV. The displaying device is operable to display a classification of the at least one UV and/or the at least one UV controller and/or the line of bearing of the at least one UV. Each of the at least one tile is visually represented in a waterfall image via a graphical user interface on the displaying device.
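
The detection engine's per-bin statistics described above lend themselves to a short sketch. The following is a minimal illustration only, assuming power values in dB and a simple threshold rule; the function name and the z_threshold constant are hypothetical, not taken from the patent:

```python
import numpy as np

def flag_signal_bins(fft_frames, z_threshold=4.0):
    """Flag FFT bins whose latest power rises well above the noise baseline.

    fft_frames: 2D array (time slices x FFT bins) of power values in dB.
    Returns a boolean mask of bins likely carrying a signal.
    """
    avg = fft_frames.mean(axis=0)   # per-bin average power over time
    std = fft_frames.std(axis=0)    # per-bin standard deviation of power
    latest = fft_frames[-1]         # most recent stare at the radio
    return latest > avg + z_threshold * std
```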



FIG. 1 illustrates a system for unmanned vehicle recognition according to one embodiment of the present invention. The system includes a multiplicity of antennas, a receiver and processing unit, and a display and control unit. In one embodiment, there are four multiband omnidirectional antennas. In one embodiment, three multiband omnidirectional antennas are positioned to form an equilateral triangle with 6-meter spacing, as illustrated in FIG. 1 as an example. The receiver and processing unit includes a signal processing engine, a UAV detection engine, a UAV classification engine, a direction finding processing engine, and an internal Global Positioning System (GPS). The receiver and processing unit is operable to receive RF data from the antennas and automatically process the RF data for UAV detection, classification, and direction finding. The display and control unit includes a human interface display. In one embodiment, the human interface display is provided by a remote web-based interface. The display and control unit is operable to display lines of bearing for detected UAVs and controllers, classification for detected UAVs and controllers, received signal strength (RSS) values, and operating frequencies. In one embodiment, the display and control unit is SigBase 4000 as shown in FIG. 1. In another embodiment, any computer, laptop, or tablet configured with the human interface display of the present invention is operable to function as a display and control unit. In one embodiment, the receiver and processing unit is a node device, and there are multiple node devices communicating with each other and forming a group of networked nodes for UAV detection, classification, and direction finding.


The present invention provides a more efficient methodology for UAV detection and identification, which takes advantage of Fast Fourier Transform (FFT) data over a short period of time and its derivation. RF data received from the antennas is directly converted to FFT data with finer granularity. This allows rapid, probability-based identification of the protocols used by high-threat drones without demodulation. An analytics engine is operable to perform near real-time analysis and characterize signals within the spectrum under observation. FIG. 2 illustrates signal characterization within a spectrum from 700 MHz to 900 MHz. Temporal feature extraction is applied for signal characterization, which is described in U.S. patent application Ser. No. 15/412,982 filed Jan. 23, 2017, U.S. patent application Ser. No. 15/681,521 filed Aug. 21, 2017, U.S. patent application Ser. No. 15/681,540 filed Aug. 21, 2017, and U.S. patent application Ser. No. 15/681,558 filed Aug. 21, 2017, each of which is incorporated herein by reference in its entirety.
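
As an illustration of the direct RF-to-FFT conversion step, a minimal sketch assuming complex IQ samples from a receiver and a Hann window; the function name and FFT size are assumptions for this example:

```python
import numpy as np

def iq_to_waterfall(iq, fft_size=1024):
    """Convert complex IQ samples directly into rows of FFT power (a waterfall).

    iq: 1D complex array captured by a receiver.
    Returns a 2D array (rows x fft_size) of log-magnitude power in dB.
    """
    n_rows = len(iq) // fft_size
    frames = iq[: n_rows * fft_size].reshape(n_rows, fft_size)
    windowed = frames * np.hanning(fft_size)      # reduce spectral leakage
    spectra = np.fft.fftshift(np.fft.fft(windowed, axis=1), axes=1)
    return 20 * np.log10(np.abs(spectra) + 1e-12)
```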


Advantageously, multiple receivers in the present invention work together to ingest spectral activities across large blocks of spectrum. The multiple receivers have an instantaneous bandwidth from 40 MHz to 500 MHz. In one embodiment, the multiple receivers are configurable in 40 MHz and 125 MHz segment building blocks. Input data are converted directly to FFT data and fed into process engines, which significantly decreases latency. The process engines are designed for rapid identification of signals of interest (SOI). When an SOI is detected, a direction finding process is initiated autonomously. In one embodiment, the direction finding process is configurable by an operator.


There are multiple types of communications links utilized for command and control of an unmanned vehicle. Although several cost-effective radio communication (RC) protocols are gaining global popularity, WI-FI is still the most popular protocol for command and control of UAVs and camera systems. A remote controller of a UAV acts as a WI-FI access point and the UAV acts as a client. There are several limiting factors for WI-FI-based UAVs. For example, the operational range of a WI-FI-based UAV is typically limited to 150 feet (46 m) indoors and 300 feet (92 m) outdoors. There is significant latency for control and video behaviors. Interference by other WI-FI devices affects the operational continuity of WI-FI-based UAVs.


Demand in the UAV user community has made more professional-level protocols available in the commercial and retail markets. By way of example but not limitation, two common RC protocols used for UAVs are Lightbridge and OcuSync. Enhancements in drone technology inevitably increase the capability of drones for use in industrial espionage and as weapons for nefarious activities.


Lightbridge was developed for long-range, reliable communication, and provides communication within a range of up to 5 km. Lightbridge supports 8 selectable channels, and the selection can be manual or automatic. Drones with the Lightbridge protocol also have the ability to assess interference and move to alternate channels for greater quality.


OcuSync was developed based on the Lightbridge protocol. OcuSync uses effective digital compression and other improvements, which decrease the knowledge required to operate. OcuSync provides reliable HD and UHD video, and OcuSync-based drones can be operated in areas with greater dynamic interference. OcuSync improves command and control efficiencies and reduces latency. With OcuSync, video communications are improved substantially, operational range is increased, and command and control recovery is enhanced when interference occurs.


The systems and methods of the present invention for unmanned vehicle recognition are operable to detect and classify UAVs at a distance, provide directions of the UAVs, and take defensive measures to mitigate risks. The detection and classification are fast, which provides more time to react and respond to threats. Exact detection range is based upon selection of antenna systems, topology, morphology, and client criteria. Classification of the detected UAVs provides knowledge of the UAVs and defines effective actions and capabilities for countering UAV threats. In one embodiment, the direction information of the UAVs provides orientation within the environment based on the location of the UAV detector.


In one embodiment, the systems and methods of the present invention provide an unmanned vehicle recognition solution targeting radio-controlled and WI-FI-based drones. The overall system is capable of surveying the spectrum from 20 MHz to 6 GHz, not just the common 2.4 GHz and 5.8 GHz areas as in the prior art. In one embodiment, the systems and methods of the present invention are applied to address two major categories: RC-based UAV systems and WI-FI-based UAV systems. In one embodiment, UAV systems utilize RC protocols comprising Lightbridge and OcuSync. In another embodiment, UAV systems are WI-FI based, for example but not for limitation, 3DR Solo and Parrot SkyController. The systems and methods of the present invention are operable to detect UAVs and their controllers by protocol.


The systems and methods of the present invention maintain a state-of-the-art learning system and library for classifying detected signals by manufacturer and controller type. The state-of-the-art learning system and library are updated as new protocols emerge.


In one embodiment, classification by protocol chipset is utilized to provide valuable intelligence and knowledge for risk mitigation and threat defense. The valuable intelligence and knowledge include effective operational range, supported peripherals (e.g., external or internal camera, barometers, GPS and dead reckoning capabilities), integrated obstacle avoidance systems, and interference mitigation techniques.


The state-of-the-art learning system of the present invention is highly accurate and capable of assessing detected UAV signals and/or controller signals for classification in less than a few seconds with a high confidence level. The state-of-the-art learning system is operable to discriminate changes in the environment for non-drone signals as well as drone signals. FIG. 3 is an illustration of Phantom 4 controller and drone signals. A human interface is operable to display the classification results. FIG. 4 illustrates a human interface display for drone detection according to one embodiment of the present invention.


It is difficult to recognize commercial and retail drones with the naked eye beyond 100 meters, so it is critical to obtain a vector to the target for situational awareness and defense execution. The systems and methods of the present invention provide lines of bearing for direction finding for multiple UAVs flying simultaneously. Each line of bearing is color coded for display. Angles, along with the frequencies utilized for uplink and downlink, are also displayed on the human interface.


Once a UAV is detected and classified, an alert is posted to a counter UAV system operator (e.g., a network operation center, an individual operator) including the azimuth of the UAV and other information. The alert is transmitted via email, short message service (SMS), or third-party system integration. The counter UAV system is operable to engage an intercession transmission, which disrupts the communication between the UAV and its controller. When the communication between the UAV and its controller is intercepted, the UAV invokes certain safety protocols, such as reducing height and hovering, landing, or returning to the launch point. The counter UAV system may have certain restrictions based on country and classification of the UAV.


In one embodiment, the systems and methods of the present invention are operable to update the UAV library with emerging protocols for classification purposes, and refine the learning engine for wideband spectrum analysis for other potential UAV signatures, emerging protocols, and technologies. In other words, the systems and methods of the present invention are adaptable to any new and emerging protocols and technologies developed for unmanned vehicles. In one embodiment, multiple node devices in the present invention are deployed to operate as a group of networked nodes. In one embodiment, the group of networked nodes is operable to estimate geographical locations for unmanned vehicles. In one embodiment, two node devices are each operable to provide a single line of bearing and together approximate a geographical location of a detected drone or controller. The more node devices there are in the group of networked nodes, the more lines of bearing are operable to be provided, and the more accurately the geographical location is estimated for the detected drone or controller. In one embodiment, the geolocation function provides the altitude and distance of a targeted drone.
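
A minimal sketch of how lines of bearing from two node devices could be intersected to approximate a location, assuming a local flat-earth east/north coordinate frame in meters; this is an illustrative geometry exercise, not the patent's geolocation implementation:

```python
import math

def intersect_bearings(p1, brg1_deg, p2, brg2_deg):
    """Estimate a target position from two lines of bearing (flat-earth sketch).

    p1, p2: (x, y) node positions in meters (x east, y north).
    brg*_deg: bearings in degrees clockwise from north.
    Returns (x, y) of the intersection, or None if the bearings are parallel.
    """
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel lines of bearing never intersect
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two nodes 100 m apart on an east-west baseline, both sighting the target:
print(intersect_bearings((0.0, 0.0), 45.0, (100.0, 0.0), 315.0))  # -> (50.0, 50.0)
```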


In one embodiment, the counter UAV system in the present invention is operable to alert when unexpected signal characteristics are detected in 2.4 GHz and 5.8 GHz areas and classify the unexpected signal characteristics as potential UAV activities. In another embodiment, the counter UAV system in the present invention is operable to alert when unexpected signal characteristics are detected anywhere from 20 MHz to 6 GHz and classify the unexpected signal characteristics as potential UAV activities. In another embodiment, the counter UAV system in the present invention is operable to classify the unexpected signal characteristics as potential UAV activities when unexpected signal characteristics are detected anywhere from 40 MHz to 6 GHz. The automatic signal detection engine and analytics engine are enhanced in the counter UAV system to recognize potential UAV activities across a great portion of the spectrum. In one embodiment, any blocks of spectrum from 40 MHz to 6 GHz are operable to be selected for UAV recognition.


In one embodiment, vector-based information including inclinations, declinations, topology deviations, and user-configurable Northing map orientation is added to the WGS84 mapping system for direction finding and location estimation. In one embodiment, earth-centered earth-fixed vector analysis is provided for multi-node systems to estimate UAV locations, derive UAV velocities from position changes over time, and determine UAV trajectory vectors in fixed nodal processing. In one embodiment, a group of networked node devices is operable to continually provide lines of bearing over time, approximate geographical locations of a detected unmanned vehicle on or above the earth, and track the movement of the detected unmanned vehicle from one estimated location to another. In one embodiment, the group of networked node devices is operable to determine velocities of the detected unmanned vehicle based on estimated locations and travel time. In one embodiment, the group of networked node devices is operable to estimate a trajectory of the detected unmanned vehicle based on the estimated geographical locations over time. In one embodiment, the group of networked node devices is operable to estimate accelerations and decelerations of the unmanned vehicle based on the velocities of the unmanned vehicle over time.
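
A simplified sketch of deriving velocities and accelerations from timestamped location estimates via finite differences, as the preceding paragraph describes; the function name and array layout are assumptions:

```python
import numpy as np

def track_kinematics(positions, times):
    """Derive velocity and acceleration from estimated target positions.

    positions: (N, 2) array of estimated (x, y) locations in meters.
    times: (N,) array of timestamps in seconds.
    Returns per-interval velocity vectors and acceleration vectors.
    """
    positions = np.asarray(positions, dtype=float)
    times = np.asarray(times, dtype=float)
    dt = np.diff(times)[:, None]
    velocities = np.diff(positions, axis=0) / dt          # m/s per interval
    accelerations = np.diff(velocities, axis=0) / dt[1:]  # m/s^2 between intervals
    return velocities, accelerations
```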


In one embodiment, the systems and methods of the present invention are operable for UAV detection and direction finding for different modulation schemes including but not limited to DSSS, OFDM, FHSS, FASST, etc. In one embodiment, the counter UAV system in the present invention is configured with cameras for motion detection. The cameras have both day and night vision.


In one embodiment, the systems and methods of the present invention provide training for unmanned vehicle recognition. RF data is captured for a Phantom 3 drone and its controller and a Phantom 4 drone and its controller, both of which use the Lightbridge protocol. RF data is also captured for a Mavic Pro drone and its controller, which use the OcuSync protocol. The RF data is recorded at different channels, different RF bandwidths, and different video quality settings inside and outside an Anechoic Chamber. FIG. 5 shows a setup for RF data capture in an Anechoic Chamber according to one embodiment of the present invention. The recordings are overlaid on the RF environment, and fading and channel hopping are simulated. FIG. 6 illustrates a simulation for fading and channel hopping of an OcuSync drone.


In one embodiment, the recorded RF data is used to train and calibrate an inception based convolutional neural network comprised in a drone detection system. FIG. 7 is an illustration of an inception based convolutional neural network. U.S. Patent Publication No. 2018/0137406 titled “Efficient Convolutional Neural Networks and Techniques to Reduce Associated Computational Costs” is incorporated herein by reference in its entirety. The inception based convolutional neural network generates probabilities that drones or their controllers are detected. The detection probabilities are updated multiple times per second.
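
For orientation, the following is a generic inception-style block in PyTorch: parallel convolutions at several kernel sizes whose outputs are concatenated along the channel axis. It is a minimal sketch of the architectural idea, not the patent's trained network, and the channel counts are arbitrary:

```python
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    """Minimal inception-style block: parallel convolutions at several
    receptive-field sizes, concatenated along the channel axis."""

    def __init__(self, in_ch, out_ch_per_branch=16):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, out_ch_per_branch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, out_ch_per_branch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch_per_branch, kernel_size=5, padding=2)
        self.pool = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, out_ch_per_branch, kernel_size=1),
        )

    def forward(self, x):
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.pool(x)], dim=1)

# One single-channel 256x256 spectrogram tile:
tile = torch.randn(1, 1, 256, 256)
features = InceptionBlock(in_ch=1)(tile)   # -> shape (1, 64, 256, 256)
```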


The trained inception based convolutional neural network is operable to identify Lightbridge 1 controllers and drones, Lightbridge 2 controllers and drones, and OcuSync controllers and drones. The trained inception based convolutional neural network is operable to identify Lightbridge and OcuSync controllers and drones at the same time. In one embodiment, the drone detection system comprising the trained inception based convolutional neural network is operable to search an instantaneous bandwidth of 147.2 MHz.


In one embodiment, the drone detection system of the present invention includes an artificial intelligence (AI) algorithm running on a single board computer (e.g., Nvidia Jetson TX2) with an execution time of less than 10 ms. The drone detection system is operable to separate Phantom 3 and Phantom 4 controllers, as the waveforms for Phantom 3 and Phantom 4 controllers are sufficiently different to assign separate probabilities.


The Artificial Intelligence (AI) algorithm is used to enhance performance for RF data analytics. The RF data analytics process based on the AI algorithm is visualized. The RF waterfalls of several drone scenarios are presented in FIGS. 8-18. FIG. 8 illustrates a scenario for an RF environment with just noise. FIG. 9 illustrates a scenario for an RF environment with a Phantom 4 controller and drone. FIG. 10 illustrates a scenario for an RF environment with two Mavic Pro drones. FIG. 11 illustrates a scenario for an RF environment with a Mavic Pro controller only. FIG. 12 illustrates a scenario for an RF environment with one Mavic Pro drone only. FIG. 13 illustrates a scenario for an RF environment with one Phantom 3 controller only. FIG. 14 illustrates a scenario for an RF environment with a Phantom 3 controller and drone. FIG. 15 illustrates a scenario for an RF environment with a Mavic Pro controller and drone in wideband mode. FIG. 16 illustrates a scenario for an RF environment with a Mavic Pro drone in wideband mode. FIG. 17 illustrates a scenario for an RF environment with a Mavic Pro controller and drone. FIG. 18 illustrates a scenario for an RF environment with a Mavic Pro controller and a Phantom 4 controller.


Each scenario is illustrated with 6 waterfall images. Each waterfall represents approximately 80 ms of time and 125 MHz of bandwidth. The top left image is the waterfall before AI processing. The other five images are waterfalls after AI processing. For each signal type, the areas of the waterfall that are likely for the RF signal type are highlighted, and areas that are not for the signal type are grayed out. The overall probability that a signal exists in the image is printed in the title of each waterfall image. In one embodiment, the AI algorithm is securely integrated with a state engine and a detection process of the present invention.


In one embodiment, a method for drone detection and classification comprises: applying an FFT function to RF data; converting the FFT data into logarithmic scale in magnitude; averaging the converted FFT data into a 256 by 256 array, representing 125 MHz of bandwidth and 80 ms of time, as a base tile; applying a normalization function to the base tile; applying a series of convolutional and pooling layers; applying a modified You Only Look Once (YOLO) algorithm for detection; grouping bounding boxes displayed in the waterfall images (e.g., the waterfall plots in FIGS. 8-18); classifying signals based on the shape of the detection output; and verifying results with a second-level recurrent neural network (RNN) based pattern estimator.
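
A sketch of the tile-preparation steps listed above (log-scale FFT rows block-averaged into a 256 by 256 base tile, then normalized), assuming NumPy and a waterfall already converted to dB; the function name and normalization constants are hypothetical:

```python
import numpy as np

def build_base_tile(power_db, tile_size=256):
    """Average a high-resolution waterfall down to a base tile and normalize it.

    power_db: 2D array (time rows x frequency bins) of log-scale FFT magnitudes
              covering ~80 ms and 125 MHz; both dimensions >= tile_size.
    """
    rows, cols = power_db.shape
    r, c = rows // tile_size, cols // tile_size
    trimmed = power_db[: r * tile_size, : c * tile_size]
    # Block-average down to tile_size x tile_size.
    tile = trimmed.reshape(tile_size, r, tile_size, c).mean(axis=(1, 3))
    # Normalize to zero mean, unit variance before the convolutional layers.
    return (tile - tile.mean()) / (tile.std() + 1e-9)
```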


In one embodiment, a method for training comprises: recording clean RF signals; randomly shifting the RF signals in frequency; creating truth data for the YOLO output; adding a simulated channel to the RF signals; recording typical RF backgrounds; applying an FFT function to the RF data; converting the FFT data into logarithmic scale in magnitude; averaging the converted FFT data into a 256 by 256 array, representing 125 MHz of bandwidth and 80 ms of time, as a base tile; applying a normalization function to the base tile; applying a series of convolutional and pooling layers; applying a modified You Only Look Once (YOLO) algorithm for detection; grouping bounding boxes displayed in the waterfall images (e.g., the waterfall plots in FIGS. 8-18); applying a sigmoid cross entropy function; and applying an Adaptive Moment Estimation (Adam) based back propagation algorithm.
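
A minimal PyTorch sketch of the loss and optimizer pairing named above: sigmoid cross entropy (BCEWithLogitsLoss) with Adam-based back propagation. The toy detector and tensor shapes are hypothetical stand-ins for the YOLO-style network:

```python
import torch
import torch.nn as nn

# Hypothetical detector producing a YOLO-style grid of logits per tile;
# the truth data has the same shape with 0/1 labels per grid cell and class.
detector = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.Conv2d(8, 5, 1))
loss_fn = nn.BCEWithLogitsLoss()                 # sigmoid cross entropy
optimizer = torch.optim.Adam(detector.parameters(), lr=1e-4)

tiles = torch.randn(4, 1, 256, 256)              # batch of base tiles
truth = torch.randint(0, 2, (4, 5, 256, 256)).float()

optimizer.zero_grad()
loss = loss_fn(detector(tiles), truth)           # compare logits to truth data
loss.backward()                                  # back propagation
optimizer.step()                                 # Adam update
```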


In one embodiment, a drone detection engine is operable to convert FFT flows from a radio into a tile. For each channel, the drone detection engine is operable to standardize the FFT output from the radio at a defined resolution bandwidth and group high-resolution FFT data into distinct bins over time. The drone detection engine is further operable to calculate the average and standard deviation of power for discrete FFT bins and assign a power value to each channel within the tile. Each scan, or single stare at the radio, is a time slice, and multiple time slices with power and channel assignments create a tile. Tiles from different frequency spans and center frequencies are identified as a tile group by a tile group number. Receivers in the drone detection system are operable to be re-tuned to different frequencies and spans. In one embodiment, the drone detection system comprises multiple receivers to generate tiles and tile groups.
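
A simplified data-structure sketch of tiles and tile groups as described above, where each single stare at the radio contributes one time slice and a shared group number ties tiles across frequency spans; the field names are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Tile:
    center_freq_hz: float
    span_hz: float
    group_id: int                                      # tile group number
    time_slices: list = field(default_factory=list)    # one power row per stare

def add_scan(tile, channel_powers):
    """Each single stare at the radio contributes one time slice; multiple
    time slices with per-channel power assignments build up the tile."""
    tile.time_slices.append(list(channel_powers))

# Tiles sharing a group number but tuned to different centers and spans:
group = [Tile(2.44e9, 125e6, group_id=7), Tile(5.80e9, 125e6, group_id=7)]
```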


In one embodiment, a tile is sent to a YOLO AI Engine. Outputs of a decision tree in the YOLO AI engine are used to detect multiple drones and their controllers. Drones of the same type of radio protocol are operable to be identified within the tile. Controllers of the same type of radio protocol are operable to be identified within the tile. Drones of different radio protocols are also operable to be identified within the tile. Controllers of different radio protocols are also operable to be identified within the tile. FIG. 19 is an illustration of identifying drones and controllers based on signal edge detection.


In one embodiment, a plurality of tiles is sent to the YOLO AI engine. In one embodiment, a tile group is sent to the YOLO AI engine. The YOLO AI engine generates an output for each tile to identify drones and their controllers with a probability. An average probability is calculated based on outputs for multiple tiles in the tile group. For each tile group, the YOLO AI engine computes outputs for several tiles per second.
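
A small sketch of averaging per-tile probabilities across a tile group, assuming each AI engine output is a mapping from a label to a detection probability; the label strings are hypothetical:

```python
def group_probability(tile_outputs):
    """Average the per-tile detection probabilities produced by the AI engine
    for one tile group (e.g., several tiles scored each second).

    tile_outputs: list of dicts mapping a label such as 'mavic_pro_ctrl'
                  to that tile's detection probability.
    """
    labels = set().union(*tile_outputs)
    return {
        label: sum(out.get(label, 0.0) for out in tile_outputs) / len(tile_outputs)
        for label in labels
    }

# Example: three tiles in one group scoring the same controller.
print(group_probability([{"mavic_pro_ctrl": 0.91}, {"mavic_pro_ctrl": 0.85},
                         {"mavic_pro_ctrl": 0.88}]))  # -> {'mavic_pro_ctrl': 0.88}
```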


In one embodiment, a state engine controls the flows of tiles and tile groups into one or more AI engines. AI engines do not use frequency values for analytics. Thus, the one or more AI engines are operable for any frequency and frequency span that a drone radio supports. The state engine further correlates output of the one or more AI engines to appropriate tiles and tile groups.
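
A minimal sketch of a state engine that feeds tiles into a frequency-agnostic AI engine and correlates each output back to its tile and tile group; the class shape is an assumption for illustration:

```python
class StateEngine:
    """Route tiles into a frequency-agnostic AI engine and correlate each
    output back to the tile and tile group it came from."""

    def __init__(self, ai_engine):
        self.ai_engine = ai_engine   # sees only tile contents, never frequencies

    def process(self, group_id, tiles):
        results = []
        for tile in tiles:
            output = self.ai_engine(tile)   # per-tile detection probabilities
            results.append({"group": group_id, "tile": tile, "output": output})
        return results

# Usage with a stand-in AI engine:
engine = StateEngine(lambda tile: {"mavic_pro_drone": 0.9})
print(engine.process(group_id=7, tiles=["tile_a", "tile_b"]))
```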


The systems and methods of the present invention are operable for direction finding of drones and their controllers. Outputs from the AI engine are denoted with time basis for the drones and their controllers.


Drones typically maintain the same frequency unless their firmware detects interference, in which case the drones may negotiate a change with their controllers. This does not create an issue for detection as long as the new frequency and span are monitored by the systems and methods of the present invention. Drone controllers typically use frequency hopping spread spectrum (FHSS) or another frequency hopping system (e.g., Gaussian frequency shift keying (GFSK)).


In one embodiment, the systems and methods of the present invention are operable to approximate a start time of a line of bearing for a direction finding (DF) system. The time intervals are either known or estimated based upon the behavior monitored by the AI engine and state engine. This allows the time slice and frequency of each individual drone and/or controller to be passed to the DF system. In one embodiment, three or four receivers are coordinated to collect information in appropriate frequency segments, wherein the frequency segments are similar to the tiles described earlier. FIG. 20 is an illustration with averaged signal amplitudes and signal edges according to one embodiment of the present invention.


The AI engine examines the segments to determine if a drone or controller exists. An azimuth of the drone or controller in an Earth-Centered Earth-Fixed coordinate system is determined based on other information collected from the three or four receivers using time difference of arrival (TDOA), angle of arrival (AOA), power correlative, or interferometry techniques.
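
As one worked example of the techniques named above, a two-element phase interferometer converts the phase difference between two receivers into an angle of arrival. This is a textbook sketch, not the patent's DF implementation:

```python
import math

def aoa_from_phase(phase_diff_rad, freq_hz, baseline_m):
    """Angle of arrival from the phase difference between two receivers.

    phase_diff_rad: measured phase difference between the two antennas.
    freq_hz: signal frequency; baseline_m: antenna spacing in meters.
    Returns the arrival angle in degrees off the array broadside.
    """
    wavelength = 3e8 / freq_hz
    sin_theta = phase_diff_rad * wavelength / (2 * math.pi * baseline_m)
    if abs(sin_theta) > 1:
        raise ValueError("ambiguous measurement: |sin(theta)| > 1")
    return math.degrees(math.asin(sin_theta))

# 2.4 GHz signal, half-wavelength baseline, 60 degrees of measured phase:
print(aoa_from_phase(math.radians(60), 2.4e9, 0.0625))  # ~19.5 degrees
```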


The distance capability of the UAV detection and classification system depends on hardware configuration, environment morphology, and restrictions based on country and classification of the counter UAV operator. In one embodiment, the systems and methods for unmanned vehicle recognition are operable to detect unmanned vehicles within 3-4 kilometers. FIG. 21 displays a detection range of less than 500 meters based on equipment specification and location.


Certain modifications and improvements will occur to those skilled in the art upon a reading of the foregoing description. The above-mentioned examples are provided to serve the purpose of clarifying the aspects of the invention and it will be apparent to one skilled in the art that they do not serve to limit the scope of the invention. All modifications and improvements have been deleted herein for the sake of conciseness and readability but are properly within the scope of the present invention.

Claims
  • 1. A system for signal classification in an RF environment, comprising: at least one node device in communication with at least one RF receiver; wherein the at least one RF receiver is operable to capture RF data in the RF environment and transmit the RF data to the at least one node device; wherein the at least one node device comprises a classification engine and a detection engine; wherein the detection engine is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile; wherein the detection engine is operable to detect at least one signal related to at least one signal emitting device using the at least one tile; and wherein the classification engine is operable to classify the at least one signal emitting device by comparing the at least one signal emitted from the at least one signal emitting device to classification data stored in a classification library in real time.
  • 2. The system of claim 1, further comprising a display operable to display results from the classification engine.
  • 3. The system of claim 1, wherein the at least one node device is operable to estimate a geographical location for the at least one signal emitting device.
  • 4. The system of claim 1, wherein the RF data is from a spectrum between 20 MHz and 6 GHz.
  • 5. The system of claim 1, wherein the detection engine is operable to detect the at least one signal emitting device by radio communication protocols.
  • 6. The system of claim 1, wherein the at least one node device is operable to update the classification library with emerging protocols.
  • 7. The system of claim 1, further comprising a global positioning system (GPS) unit.
  • 8. The system of claim 1, wherein the at least one node device is operable to transmit an alert related to the at least one signal emitting device.
  • 9. The system of claim 1, wherein the at least one RF receiver is operable to transform the RF data to the FFT data and transmit the FFT data to the at least one node device.
  • 10. An apparatus for signal classification in a radiofrequency (RF) environment, comprising: at least one node device comprising a classification engine and a detection engine; wherein the at least one node device is operable to receive RF data from at least one RF receiver; wherein the detection engine is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile; wherein the detection engine is operable to detect at least one signal related to at least one signal emitting device using the at least one tile; and wherein the classification engine is operable to classify the at least one signal emitting device by comparing the at least one signal emitted from the at least one signal emitting device to classification data stored in a classification library in real time.
  • 11. The apparatus of claim 10, wherein the at least one node device is operable to send results from the classification engine to a display, wherein the display is operable to display the results from the classification engine.
  • 12. The apparatus of claim 10, wherein the at least one node device is operable to estimate a geographical location for the at least one signal emitting device.
  • 13. The apparatus of claim 10, wherein the RF data is from a spectrum between 20 MHz and 6 GHz.
  • 14. The apparatus of claim 10, wherein the detection engine is operable to detect the at least one signal emitting device by radio communication protocols.
  • 15. The apparatus of claim 10, wherein the at least one node device is operable to update the classification library with emerging protocols.
  • 16. The apparatus of claim 10, further comprising a global positioning system (GPS) unit.
  • 17. The apparatus of claim 10, wherein the at least one node device is operable to transmit an alert related to the at least one signal emitting device.
  • 18. The apparatus of claim 10, wherein the at least one node device is operable to receive the FFT data from the at least one RF receiver.
  • 19. A method for signal classification in an RF environment, comprising: at least one RF receiver capturing RF data in the RF environment and transmitting the RF data to at least one node device; wherein the at least one node device comprises a classification engine and a detection engine; the detection engine averaging Fast Fourier Transform (FFT) data derived from the RF data into at least one tile; a detection engine of the at least one node device detecting at least one signal related to at least one signal emitting device using the at least one tile; and a classification engine of the at least one node device classifying the at least one signal emitting device by comparing the at least one signal emitted from the at least one signal emitting device to classification data stored in a classification library in real time.
  • 20. The method of claim 19, further comprising the at least one node device estimating a geographical location for the at least one signal emitting device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application relates to and claims priority from the following applications. This application is a continuation of U.S. patent application Ser. No. 18/142,904, filed May 3, 2023, which is a continuation of U.S. patent application Ser. No. 17/991,348, filed Nov. 21, 2022, which is a continuation of U.S. patent application Ser. No. 17/735,615, filed May 3, 2022, which is a continuation of U.S. patent application Ser. No. 17/190,048 filed Mar. 2, 2021, which is a continuation of U.S. patent application Ser. No. 16/732,811 filed Jan. 2, 2020, which is a continuation of U.S. patent application Ser. No. 16/275,575 filed Feb. 14, 2019, which claims the benefit of U.S. Provisional Application 62/632,276 filed Feb. 19, 2018. U.S. patent application Ser. No. 16/275,575 also claims priority from and is a continuation-in-part of U.S. patent application Ser. No. 16/274,933 filed Feb. 13, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 16/180,690 filed Nov. 5, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/412,982 filed Jan. 23, 2017. U.S. patent application Ser. No. 16/180,690 also claims priority from U.S. Provisional Patent Application No. 62/722,420 filed Aug. 24, 2018. U.S. patent application Ser. No. 16/274,933 also claims the benefit of U.S. Provisional Application 62/632,276 filed Feb. 19, 2018. Each of the above-mentioned applications is incorporated herein by reference in its entirety.

US Referenced Citations (604)
Number Name Date Kind
4215345 Robert et al. Jul 1980 A
4400700 Rittenbach Aug 1983 A
4453137 Rittenbach Jun 1984 A
4501020 Wakeman Feb 1985 A
4638493 Bishop et al. Jan 1987 A
4928106 Ashjaee et al. May 1990 A
5134407 Lorenz et al. Jul 1992 A
5230087 Meyer et al. Jul 1993 A
5293170 Lorenz et al. Mar 1994 A
5343212 Rose et al. Aug 1994 A
5393713 Schwob Feb 1995 A
5506864 Schilling Apr 1996 A
5513385 Tanaka Apr 1996 A
5548809 Lemson Aug 1996 A
5570099 DesJardins Oct 1996 A
5589835 Gildea et al. Dec 1996 A
5612703 Mallinckrodt Mar 1997 A
5642732 Wang Jul 1997 A
5831874 Boone et al. Nov 1998 A
5835857 Otten Nov 1998 A
5846208 Pichlmayr et al. Dec 1998 A
5856803 Pevler Jan 1999 A
5936575 Azzarelli et al. Aug 1999 A
6018312 Haworth Jan 2000 A
6039692 Kristoffersen Mar 2000 A
6085090 Yee et al. Jul 2000 A
6115580 Chuprun et al. Sep 2000 A
6134445 Gould et al. Oct 2000 A
6144336 Preston et al. Nov 2000 A
6157619 Ozluturk et al. Dec 2000 A
6160511 Pfeil et al. Dec 2000 A
6167277 Kawamoto Dec 2000 A
6191731 McBurney et al. Feb 2001 B1
6249252 Dupray Jun 2001 B1
6286021 Tran et al. Sep 2001 B1
6296612 Mo et al. Oct 2001 B1
6304760 Thomson et al. Oct 2001 B1
6339396 Mayersak Jan 2002 B1
6418131 Snelling et al. Jul 2002 B1
6433671 Nysen Aug 2002 B1
6490318 Larsson et al. Dec 2002 B1
6492945 Counselman, III et al. Dec 2002 B2
6512788 Kuhn et al. Jan 2003 B1
6628231 Mayersak Sep 2003 B2
6677895 Holt Jan 2004 B1
6707910 Valve et al. Mar 2004 B1
6711404 Arpee et al. Mar 2004 B1
6741595 Maher et al. May 2004 B2
6771957 Chitrapu Aug 2004 B2
6785321 Yang et al. Aug 2004 B1
6850557 Gronemeyer Feb 2005 B1
6850735 Sugar et al. Feb 2005 B2
6859831 Gelvin et al. Feb 2005 B1
6861982 Forstrom et al. Mar 2005 B2
6876326 Martorana Apr 2005 B2
6898197 Lavean May 2005 B1
6898235 Carlin et al. May 2005 B1
6904269 Deshpande et al. Jun 2005 B1
6985437 Vogel Jan 2006 B1
6991514 Meloni et al. Jan 2006 B1
7035593 Miller et al. Apr 2006 B2
7043207 Miyazaki May 2006 B2
7049965 Kelliher et al. May 2006 B2
7110756 Diener Sep 2006 B2
7116943 Sugar et al. Oct 2006 B2
7146176 McHenry Dec 2006 B2
7151938 Weigand Dec 2006 B2
7152025 Lusky et al. Dec 2006 B2
7162207 Kursula et al. Jan 2007 B2
7171161 Miller Jan 2007 B2
7187326 Beadle et al. Mar 2007 B2
7206350 Korobkov et al. Apr 2007 B2
7254191 Sugar et al. Aug 2007 B2
7269151 Diener et al. Sep 2007 B2
7292656 Kloper et al. Nov 2007 B2
7298327 Dupray et al. Nov 2007 B2
7340375 Patenaud et al. Mar 2008 B1
7366463 Archer et al. Apr 2008 B1
7408907 Diener Aug 2008 B2
7424268 Diener et al. Sep 2008 B2
7459898 Woodings Dec 2008 B1
7466960 Sugar Dec 2008 B2
7471683 Maher, III et al. Dec 2008 B2
7522917 Purdy, Jr. et al. Apr 2009 B1
7555262 Brenner Jun 2009 B2
7564816 McHenry et al. Jul 2009 B2
7595754 Mehta Sep 2009 B2
7606335 Kloper et al. Oct 2009 B2
7606597 Weigand Oct 2009 B2
7620396 Floam et al. Nov 2009 B2
7676192 Wilson Mar 2010 B1
7692532 Fischer et al. Apr 2010 B2
7692573 Funk Apr 2010 B1
7702044 Nallapureddy et al. Apr 2010 B2
7725110 Weigand May 2010 B2
7728755 Jocic Jun 2010 B1
7801490 Scherzer Sep 2010 B1
7835319 Sugar Nov 2010 B2
7865140 Levien et al. Jan 2011 B2
7893875 Smith Feb 2011 B1
7933344 Hassan et al. Apr 2011 B2
7945215 Tang May 2011 B2
7953549 Graham et al. May 2011 B2
7965641 Ben Letaief et al. Jun 2011 B2
8001901 Bass Aug 2011 B2
8006195 Woodings et al. Aug 2011 B1
8023957 Weigand Sep 2011 B2
8026846 Mcfadden et al. Sep 2011 B2
8027249 Mchenry et al. Sep 2011 B2
8027690 Shellhammer Sep 2011 B2
8045660 Gupta Oct 2011 B1
8055204 Livsics et al. Nov 2011 B2
8059694 Junell et al. Nov 2011 B2
8060017 Schlicht et al. Nov 2011 B2
8060035 Haykin Nov 2011 B2
8060104 Chaudhri et al. Nov 2011 B2
8064840 McHenry et al. Nov 2011 B2
8077662 Srinivasan et al. Dec 2011 B2
RE43066 McHenry Jan 2012 E
8094610 Wang et al. Jan 2012 B2
8107391 Wu et al. Jan 2012 B2
8125213 Goguillon et al. Feb 2012 B2
8131239 Walker et al. Mar 2012 B1
8134493 Noble et al. Mar 2012 B2
8151311 Huffman et al. Apr 2012 B2
8155039 Wu et al. Apr 2012 B2
8155649 McHenry et al. Apr 2012 B2
8160839 Woodings et al. Apr 2012 B1
8170577 Singh May 2012 B2
8175539 Diener et al. May 2012 B2
8184653 Dain et al. May 2012 B2
8193981 Hwang et al. Jun 2012 B1
8213868 Du et al. Jul 2012 B2
8224254 Haykin Jul 2012 B2
8229368 Immendorf et al. Jul 2012 B1
8233928 Stanforth et al. Jul 2012 B2
8238247 Wu et al. Aug 2012 B2
8249028 Porras et al. Aug 2012 B2
8249631 Sawai Aug 2012 B2
8260207 Srinivasan et al. Sep 2012 B2
8265684 Sawai Sep 2012 B2
8279786 Smith et al. Oct 2012 B1
8280433 Quinn et al. Oct 2012 B2
8289907 Seidel et al. Oct 2012 B2
8290503 Sadek et al. Oct 2012 B2
8295859 Yarkan et al. Oct 2012 B1
8295877 Hui et al. Oct 2012 B2
8305215 Markhovsky et al. Nov 2012 B2
8311483 Tillman et al. Nov 2012 B2
8311509 Feher Nov 2012 B2
8315571 Lindoff et al. Nov 2012 B2
8320910 Bobier Nov 2012 B2
8326240 Kadambe et al. Dec 2012 B1
8326309 Mody et al. Dec 2012 B2
8326313 McHenry et al. Dec 2012 B2
8335204 Samarasooriya et al. Dec 2012 B2
8346273 Weigand Jan 2013 B2
8350970 Birkett et al. Jan 2013 B2
8352223 Anthony et al. Jan 2013 B1
8358723 Hamkins et al. Jan 2013 B1
8364188 Srinivasan et al. Jan 2013 B2
8369305 Diener et al. Feb 2013 B2
8373759 Samarasooriya et al. Feb 2013 B2
8391794 Sawai et al. Mar 2013 B2
8391796 Srinivasan et al. Mar 2013 B2
8401564 Singh Mar 2013 B2
8406776 Jallon Mar 2013 B2
8406780 Mueck Mar 2013 B2
RE44142 Wilson Apr 2013 E
8421676 Moshfeghi Apr 2013 B2
8422453 Abedi Apr 2013 B2
8422958 Du et al. Apr 2013 B2
RE44237 Mchenry May 2013 E
8437700 Mody et al. May 2013 B2
8442445 Mody et al. May 2013 B2
8451751 Challapali et al. May 2013 B2
8463195 Shellhammer Jun 2013 B2
8467353 Proctor Jun 2013 B2
8483155 Banerjea et al. Jul 2013 B1
8494464 Kadambe et al. Jul 2013 B1
8503955 Kang et al. Aug 2013 B2
8504087 Stanforth et al. Aug 2013 B2
8514729 Blackwell Aug 2013 B2
8515473 Mody et al. Aug 2013 B2
8520606 Cleveland Aug 2013 B2
RE44492 Mchenry Sep 2013 E
8526974 Olsson et al. Sep 2013 B2
8532686 Schmidt et al. Sep 2013 B2
8538339 Hu et al. Sep 2013 B2
8548521 Hui et al. Oct 2013 B2
8554264 Gibbons et al. Oct 2013 B1
8559301 Mchenry et al. Oct 2013 B2
8565811 Tan et al. Oct 2013 B2
8599024 Bloy Dec 2013 B2
8718838 Kokkeby et al. May 2014 B2
8761051 Brisebois et al. Jun 2014 B2
8780968 Garcia et al. Jul 2014 B1
8798548 Carbajal Aug 2014 B1
8805291 Garcia et al. Aug 2014 B1
8818283 McHenry et al. Aug 2014 B2
8824536 Garcia et al. Sep 2014 B1
8843155 Burton et al. Sep 2014 B2
8977212 Carbajal Mar 2015 B2
9007262 Witzgall Apr 2015 B1
9008587 Carbajal Apr 2015 B2
9078162 Garcia et al. Jul 2015 B2
9143968 Manku et al. Sep 2015 B1
9185591 Carbajal Nov 2015 B2
9245378 Villagomez et al. Jan 2016 B1
9288683 Garcia et al. Mar 2016 B2
9356727 Immendorf et al. May 2016 B2
9412278 Gong et al. Aug 2016 B1
9414237 Garcia et al. Aug 2016 B2
9439078 Menon et al. Sep 2016 B2
9529360 Melamed et al. Dec 2016 B1
9537586 Carbajal Jan 2017 B2
9572055 Immendorf et al. Feb 2017 B2
9635669 Gormley et al. Apr 2017 B2
9658341 Mathews et al. May 2017 B2
9674684 Mendelson Jun 2017 B1
9674836 Gormley et al. Jun 2017 B2
9686789 Gormley et al. Jun 2017 B2
9715009 Parker et al. Jul 2017 B1
9749069 Garcia et al. Aug 2017 B2
9755972 Mao Sep 2017 B1
9767699 Borghese et al. Sep 2017 B1
9769834 Immendorf et al. Sep 2017 B2
9805273 Seeber et al. Oct 2017 B1
9819441 Immendorf et al. Nov 2017 B2
9858947 Hearing et al. Jan 2018 B2
9862489 Weinstein et al. Jan 2018 B1
9923700 Gormley et al. Mar 2018 B2
9942775 Yun et al. Apr 2018 B2
9998243 Garcia et al. Jun 2018 B2
10104559 Immendorf et al. Oct 2018 B2
10157548 Priest Dec 2018 B2
10194324 Yun et al. Jan 2019 B2
10241140 Moinuddin Mar 2019 B2
10251242 Rosen et al. Apr 2019 B1
10281570 Parker May 2019 B2
10389616 Ryan et al. Aug 2019 B2
10408936 Van Voorst Sep 2019 B2
10459020 Dzierwa et al. Oct 2019 B2
10613209 Emami et al. Apr 2020 B2
10700721 Ayala et al. Jun 2020 B2
10701574 Gormley et al. Jun 2020 B2
10784974 Menon Sep 2020 B2
10917797 Menon et al. Feb 2021 B2
11012340 Ryan et al. May 2021 B2
11035929 Parker et al. Jun 2021 B2
11223431 Garcia et al. Jan 2022 B2
11265652 Kallai et al. Mar 2022 B2
11637641 Garcia et al. Apr 2023 B1
20010020220 Kurosawa Sep 2001 A1
20020044082 Woodington et al. Apr 2002 A1
20020070889 Griffin et al. Jun 2002 A1
20020097184 Mayersak Jul 2002 A1
20020119754 Wakutsu et al. Aug 2002 A1
20020161775 Lasensky et al. Oct 2002 A1
20030013454 Hunzinger Jan 2003 A1
20030087648 Mezhvinsky et al. May 2003 A1
20030104831 Razavilar et al. Jun 2003 A1
20030144601 Prichep Jul 2003 A1
20030145328 Rabinowitz et al. Jul 2003 A1
20030198304 Sugar et al. Oct 2003 A1
20030232612 Richards et al. Dec 2003 A1
20040127214 Reddy et al. Jul 2004 A1
20040147254 Reddy et al. Jul 2004 A1
20040171390 Chitrapu Sep 2004 A1
20040203826 Sugar et al. Oct 2004 A1
20040208238 Thomas et al. Oct 2004 A1
20040219885 Sugar et al. Nov 2004 A1
20040233100 Dibble et al. Nov 2004 A1
20050003828 Sugar et al. Jan 2005 A1
20050096026 Chitrapu et al. May 2005 A1
20050107102 Yoon et al. May 2005 A1
20050176401 Nanda et al. Aug 2005 A1
20050185618 Friday et al. Aug 2005 A1
20050227625 Diener Oct 2005 A1
20050285792 Sugar et al. Dec 2005 A1
20060025118 Chitrapu et al. Feb 2006 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060080040 Garczarek et al. Apr 2006 A1
20060111899 Padhi et al. May 2006 A1
20060128311 Tesfai Jun 2006 A1
20060199546 Durgin Sep 2006 A1
20060238417 Jendbro et al. Oct 2006 A1
20060258347 Chitrapu Nov 2006 A1
20070049823 Li Mar 2007 A1
20070076657 Woodings et al. Apr 2007 A1
20070098089 Li et al. May 2007 A1
20070111746 Anderson May 2007 A1
20070149216 Misikangas Jun 2007 A1
20070203645 Dees et al. Aug 2007 A1
20070223419 Ji et al. Sep 2007 A1
20070233409 Boyan et al. Oct 2007 A1
20070293171 Li et al. Dec 2007 A1
20070297541 McGehee Dec 2007 A1
20080001735 Tran Jan 2008 A1
20080010040 McGehee Jan 2008 A1
20080090563 Chitrapu Apr 2008 A1
20080113634 Gates et al. May 2008 A1
20080123731 Wegener May 2008 A1
20080130519 Bahl et al. Jun 2008 A1
20080133190 Peretz et al. Jun 2008 A1
20080180325 Chung et al. Jul 2008 A1
20080186235 Struckman et al. Aug 2008 A1
20080195584 Nath et al. Aug 2008 A1
20080209117 Kajigaya Aug 2008 A1
20080211481 Chen Sep 2008 A1
20080252516 Ho et al. Oct 2008 A1
20080293353 Mody et al. Nov 2008 A1
20090006103 Koishida et al. Jan 2009 A1
20090011713 Abusubaih et al. Jan 2009 A1
20090018422 Banet et al. Jan 2009 A1
20090046003 Tung et al. Feb 2009 A1
20090046625 Diener et al. Feb 2009 A1
20090066578 Beadle et al. Mar 2009 A1
20090086993 Kawaguchi et al. Apr 2009 A1
20090111463 Simms et al. Apr 2009 A1
20090131067 Aaron May 2009 A1
20090143019 Shellhammer Jun 2009 A1
20090146881 Mesecher Jun 2009 A1
20090149202 Hill et al. Jun 2009 A1
20090190511 Li et al. Jul 2009 A1
20090207950 Tsuruta et al. Aug 2009 A1
20090224957 Chung et al. Sep 2009 A1
20090278733 Haworth Nov 2009 A1
20090282130 Antoniou et al. Nov 2009 A1
20090285173 Koorapaty et al. Nov 2009 A1
20090286563 Ji et al. Nov 2009 A1
20090322510 Berger et al. Dec 2009 A1
20100020707 Woodings Jan 2010 A1
20100056200 Tolonen Mar 2010 A1
20100075704 McHenry et al. Mar 2010 A1
20100109936 Levy May 2010 A1
20100150122 Berger et al. Jun 2010 A1
20100172443 Shim et al. Jul 2010 A1
20100173586 McHenry et al. Jul 2010 A1
20100176988 Maezawa et al. Jul 2010 A1
20100220011 Heuser Sep 2010 A1
20100253512 Wagner et al. Oct 2010 A1
20100255794 Agnew Oct 2010 A1
20100255801 Gunasekara et al. Oct 2010 A1
20100259998 Kwon et al. Oct 2010 A1
20100306249 Hill et al. Dec 2010 A1
20100309317 Wu et al. Dec 2010 A1
20110022342 Pandharipande et al. Jan 2011 A1
20110045781 Shellhammer et al. Feb 2011 A1
20110053604 Kim et al. Mar 2011 A1
20110059747 Lindoff et al. Mar 2011 A1
20110070885 Ruuska et al. Mar 2011 A1
20110074631 Parker Mar 2011 A1
20110077017 Yu et al. Mar 2011 A1
20110087639 Gurney Apr 2011 A1
20110090939 Diener et al. Apr 2011 A1
20110096770 Henry Apr 2011 A1
20110102258 Underbrink et al. May 2011 A1
20110111751 Markhovsky et al. May 2011 A1
20110116484 Henry May 2011 A1
20110117869 Woodings May 2011 A1
20110122855 Henry May 2011 A1
20110129006 Jung et al. Jun 2011 A1
20110131260 Mody Jun 2011 A1
20110134878 Geiger et al. Jun 2011 A1
20110183621 Quan et al. Jul 2011 A1
20110183685 Burton et al. Jul 2011 A1
20110185059 Adnani et al. Jul 2011 A1
20110237243 Guvenc et al. Sep 2011 A1
20110241923 Chernukhin Oct 2011 A1
20110273328 Parker Nov 2011 A1
20110286555 Cho et al. Nov 2011 A1
20110287779 Harper Nov 2011 A1
20110299481 Kim et al. Dec 2011 A1
20120014332 Smith et al. Jan 2012 A1
20120032854 Bull et al. Feb 2012 A1
20120039284 Barbieri et al. Feb 2012 A1
20120052869 Lindoff et al. Mar 2012 A1
20120058775 Dupray et al. Mar 2012 A1
20120071188 Wang et al. Mar 2012 A1
20120072986 Livsics et al. Mar 2012 A1
20120077510 Chen et al. Mar 2012 A1
20120081248 Kennedy et al. Apr 2012 A1
20120094681 Freda et al. Apr 2012 A1
20120100810 Oksanen et al. Apr 2012 A1
20120105066 Marvin et al. May 2012 A1
20120115522 Nama et al. May 2012 A1
20120115525 Kang et al. May 2012 A1
20120120892 Freda et al. May 2012 A1
20120129522 Kim et al. May 2012 A1
20120140236 Babbitt et al. Jun 2012 A1
20120142386 Mody et al. Jun 2012 A1
20120148068 Chandra et al. Jun 2012 A1
20120148069 Bai et al. Jun 2012 A1
20120155217 Dellinger et al. Jun 2012 A1
20120182430 Birkett et al. Jul 2012 A1
20120195269 Kang et al. Aug 2012 A1
20120212628 Wu et al. Aug 2012 A1
20120214511 Vartanian et al. Aug 2012 A1
20120230214 Kozisek et al. Sep 2012 A1
20120246392 Cheon Sep 2012 A1
20120264388 Guo et al. Oct 2012 A1
20120264445 Lee et al. Oct 2012 A1
20120275354 Villain Nov 2012 A1
20120281000 Woodings Nov 2012 A1
20120282942 Uusitalo et al. Nov 2012 A1
20120295575 Nam Nov 2012 A1
20120302190 McHenry Nov 2012 A1
20120302263 Tinnakornsrisuphap et al. Nov 2012 A1
20120309288 Lu Dec 2012 A1
20120322487 Stanforth Dec 2012 A1
20130005240 Novak et al. Jan 2013 A1
20130005374 Uusitalo et al. Jan 2013 A1
20130012134 Jin et al. Jan 2013 A1
20130017794 Kloper et al. Jan 2013 A1
20130023285 Markhovsky et al. Jan 2013 A1
20130028111 Dain et al. Jan 2013 A1
20130035108 Joslyn et al. Feb 2013 A1
20130035128 Chan et al. Feb 2013 A1
20130045754 Markhovsky et al. Feb 2013 A1
20130052939 Anniballi et al. Feb 2013 A1
20130053054 Lovitt et al. Feb 2013 A1
20130062334 Bilchinsky et al. Mar 2013 A1
20130064197 Novak et al. Mar 2013 A1
20130064328 Adnani et al. Mar 2013 A1
20130070639 Demura et al. Mar 2013 A1
20130090071 Abraham et al. Apr 2013 A1
20130095843 Smith et al. Apr 2013 A1
20130100154 Woodings et al. Apr 2013 A1
20130103684 Yee et al. Apr 2013 A1
20130165051 Li et al. Jun 2013 A9
20130165134 Touag et al. Jun 2013 A1
20130165170 Kang Jun 2013 A1
20130183989 Hasegawa et al. Jul 2013 A1
20130183994 Ringstroem et al. Jul 2013 A1
20130184022 Schmidt Jul 2013 A1
20130190003 Smith et al. Jul 2013 A1
20130190028 Wang et al. Jul 2013 A1
20130196677 Smith et al. Aug 2013 A1
20130208587 Bala et al. Aug 2013 A1
20130210457 Kummetz Aug 2013 A1
20130210473 Weigand Aug 2013 A1
20130217406 Villardi et al. Aug 2013 A1
20130217408 Difazio et al. Aug 2013 A1
20130217450 Kanj et al. Aug 2013 A1
20130231121 Kwak et al. Sep 2013 A1
20130237212 Khayrallah et al. Sep 2013 A1
20130242792 Woodings Sep 2013 A1
20130242934 Ueda et al. Sep 2013 A1
20130260703 Actis et al. Oct 2013 A1
20130265198 Stroud Oct 2013 A1
20130279556 Seller Oct 2013 A1
20130288734 Mody et al. Oct 2013 A1
20130315112 Gormley et al. Nov 2013 A1
20130329690 Kim et al. Dec 2013 A1
20130331114 Gormley et al. Dec 2013 A1
20140018683 Park et al. Jan 2014 A1
20140064723 Adles et al. Mar 2014 A1
20140073261 Hassan et al. Mar 2014 A1
20140086212 Kafle et al. Mar 2014 A1
20140128032 Muthukumar et al. May 2014 A1
20140139374 Wellman et al. May 2014 A1
20140163309 Bernhard et al. Jun 2014 A1
20140201367 Trummer et al. Jul 2014 A1
20140204766 Immendorf et al. Jul 2014 A1
20140206279 Immendorf et al. Jul 2014 A1
20140206307 Maurer et al. Jul 2014 A1
20140206343 Immendorf et al. Jul 2014 A1
20140256268 Olgaard Sep 2014 A1
20140256370 Gautier et al. Sep 2014 A9
20140269374 Abdelmonem et al. Sep 2014 A1
20140269376 Garcia et al. Sep 2014 A1
20140301216 Immendorf et al. Oct 2014 A1
20140302796 Gormley et al. Oct 2014 A1
20140335879 Immendorf et al. Nov 2014 A1
20140340684 Edler et al. Nov 2014 A1
20140342675 Massarella et al. Nov 2014 A1
20150016429 Menon et al. Jan 2015 A1
20150072633 Massarella et al. Mar 2015 A1
20150133058 Livis et al. May 2015 A1
20150215794 Gormley et al. Jul 2015 A1
20150215949 Gormley et al. Jul 2015 A1
20150289254 Garcia et al. Oct 2015 A1
20150289265 Gormley et al. Oct 2015 A1
20150296386 Menon et al. Oct 2015 A1
20160014713 Kennedy et al. Jan 2016 A1
20160050690 Yun et al. Feb 2016 A1
20160117853 Zhong et al. Apr 2016 A1
20160124071 Baxley et al. May 2016 A1
20160127392 Baxley et al. May 2016 A1
20160219506 Pratt et al. Jul 2016 A1
20160225240 Voddhi et al. Aug 2016 A1
20160334527 Xu et al. Nov 2016 A1
20160345135 Garcia et al. Nov 2016 A1
20160366685 Gormley et al. Dec 2016 A1
20160374088 Garcia et al. Dec 2016 A1
20170024767 Johnson, Jr. et al. Jan 2017 A1
20170039413 Nadler Feb 2017 A1
20170061690 Laughlin et al. Mar 2017 A1
20170064564 Yun et al. Mar 2017 A1
20170078792 Simons Mar 2017 A1
20170079007 Carbajal Mar 2017 A1
20170094527 Shattil et al. Mar 2017 A1
20170134631 Zhao et al. May 2017 A1
20170148467 Franklin et al. May 2017 A1
20170237484 Heath et al. Aug 2017 A1
20170238201 Gormley et al. Aug 2017 A1
20170238203 Dzierwa et al. Aug 2017 A1
20170243138 Dzierwa et al. Aug 2017 A1
20170243139 Dzierwa et al. Aug 2017 A1
20170250766 Dzierwa et al. Aug 2017 A1
20170261604 Van Voorst Sep 2017 A1
20170261613 Van Voorst Sep 2017 A1
20170261615 Ying et al. Sep 2017 A1
20170274992 Chretien Sep 2017 A1
20170289840 Sung et al. Oct 2017 A1
20170290075 Carbajal et al. Oct 2017 A1
20170358103 Shao et al. Dec 2017 A1
20170374572 Kleinbeck et al. Dec 2017 A1
20170374573 Kleinbeck et al. Dec 2017 A1
20180006730 Kuo et al. Jan 2018 A1
20180014217 Kleinbeck et al. Jan 2018 A1
20180024220 Massarella et al. Jan 2018 A1
20180070362 Ryan et al. Mar 2018 A1
20180083721 Wada et al. Mar 2018 A1
20180129881 Seeber May 2018 A1
20180149729 Grandin et al. May 2018 A1
20180211179 Dzierwa Jul 2018 A1
20180288620 Jayawickrama et al. Oct 2018 A1
20180294901 Garcia et al. Oct 2018 A1
20180313877 Brant et al. Nov 2018 A1
20180331863 Carbajal Nov 2018 A1
20190004518 Zhou et al. Jan 2019 A1
20190018103 Qian et al. Jan 2019 A1
20190064130 Kanazawa et al. Feb 2019 A1
20190064223 Kincaid Feb 2019 A1
20190072601 Dzierwa et al. Mar 2019 A1
20190123428 Packer et al. Apr 2019 A1
20190180630 Kleinbeck Jun 2019 A1
20190191313 Dzierwa et al. Jun 2019 A1
20190208112 Kleinbeck Jul 2019 A1
20190208491 Dzierwa et al. Jul 2019 A1
20190215709 Kleinbeck et al. Jul 2019 A1
20190223139 Kleinbeck et al. Jul 2019 A1
20190230539 Dzierwa et al. Jul 2019 A1
20190230540 Carbajal et al. Jul 2019 A1
20190245722 Carbajal Aug 2019 A1
20190246304 Dzierwa et al. Aug 2019 A1
20190253160 Garcia et al. Aug 2019 A1
20190253905 Kleinbeck et al. Aug 2019 A1
20190274059 Kleinbeck et al. Sep 2019 A1
20190302249 High et al. Oct 2019 A1
20190342202 Ryan et al. Nov 2019 A1
20190360783 Whittaker Nov 2019 A1
20190364533 Kleinbeck et al. Nov 2019 A1
20200034620 Lutterodt Jan 2020 A1
20200036459 Menon Jan 2020 A1
20200043346 Vacek Feb 2020 A1
20200059800 Menon et al. Feb 2020 A1
20200066132 Kleinbeck Feb 2020 A1
20200067752 DelMarco Feb 2020 A1
20200096548 Dzierwa et al. Mar 2020 A1
20200107207 Kleinbeck et al. Apr 2020 A1
20200120266 Kleinbeck Apr 2020 A1
20200128418 Dzierwa et al. Apr 2020 A1
20200145032 Ayala et al. May 2020 A1
20200162890 Spencer et al. May 2020 A1
20200169892 Dzierwa et al. May 2020 A1
20200184832 Kleinbeck Jun 2020 A1
20200196269 Dzierwa et al. Jun 2020 A1
20200196270 Kleinbeck et al. Jun 2020 A1
20200245167 Kleinbeck et al. Jul 2020 A1
20200260306 Kleinbeck et al. Aug 2020 A1
20200295855 Kleinbeck et al. Sep 2020 A1
20200382961 Shattil et al. Dec 2020 A1
20210082254 Givant Mar 2021 A1
20210084217 Kleinbeck Mar 2021 A1
20210211911 Kleinbeck et al. Jul 2021 A1
20210250795 Dzierwa et al. Aug 2021 A1
20210280039 Kleinbeck Sep 2021 A1
20210360423 Dzierwa et al. Nov 2021 A1
20210360450 Kleinbeck et al. Nov 2021 A1
20210360453 Kleinbeck et al. Nov 2021 A1
20210360454 Carbajal et al. Nov 2021 A1
20210409591 Kleinbeck Dec 2021 A1
20220030541 Dzierwa et al. Jan 2022 A1
20220052770 Kleinbeck et al. Feb 2022 A1
20220128612 Dzierwa et al. Apr 2022 A1
20220131623 Garcia et al. Apr 2022 A1
20220150824 Kleinbeck et al. May 2022 A1
20220262228 Kleinbeck Aug 2022 A1
20220262261 Kleinbeck Aug 2022 A1
20220286997 Kleinbeck et al. Sep 2022 A1
20230087729 Goldstein et al. Mar 2023 A1
20230105718 Carbajal Apr 2023 A1
20230114804 Kleinbeck Apr 2023 A1
20230118723 Carbajal et al. Apr 2023 A1
20230123375 Dzierwa et al. Apr 2023 A1
20230209378 Kleinbeck et al. Jun 2023 A1
20230232244 Dzierwa et al. Jul 2023 A1
20230252744 Miller et al. Aug 2023 A1
20230254567 Kleinbeck Aug 2023 A1
20230308789 Tian et al. Sep 2023 A1
20230326323 Kleinbeck Oct 2023 A1
Foreign Referenced Citations (2)
Number Date Country
100248671 Apr 2000 KR
20140041618 Apr 2014 KR
Non-Patent Literature Citations (11)
Entry
“A Low-Cost, Near-Real-Time Two-UAS-Based UWB Emitter Monitoring System”; Wang et al.; IEEE A&E Systems Magazine, Nov. 2015 (Year: 2015).
“Multipath TDOA and FDOA Estimation Using the EM Algorithm”; Belanger; Apr. 27, 1993; 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing (Year: 1993).
“Noise Figure”, Wikipedia, located at https://en.wikipedia.org/wiki/Noise_figure (Year: 2022).
Boll S.F., Suppression of Acoustic Noise in Speech Using Spectral Subtraction, Apr. 1979, IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-27, No. 2 (Year: 1979).
David Eppink and Wolf Kuebler, "TIREM/SEM Handbook", Mar. 1994, IIT Research Institute, p. 1-6, located at http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA296913.
Gabriel Garcia and Daniel Carbajal, U.S. Appl. No. 61/789,758, Provisional Patent Application, Filed Mar. 15, 2013 (Specification, Claims, and Drawings).
Gary L. Sugar, System and method for locating wireless devices in an unsynchronized wireless network, U.S. Appl. No. 60/319,737, Provisional Patent Application filed on Nov. 27, 2002, Specification including the claims, abstract, and drawings.
International Search Report and Written Opinion dated Jun. 21, 2018 issued by the International Application Division, Korean Intellectual Property Office as International Searching Authority in connection with International Application No. PCT/US2018/014504 (21 pages).
“Mobile Emitter Geolocation and Tracking Using TDOA and FDOA Measurements”; Musicki et al.; IEEE Transactions on Signal Processing, vol. 58, No. 3, Mar. 2010 (Year: 2010).
Bluetooth vs Zigbee: difference between Bluetooth and Zigbee (located at https://www.rfwireless-world.com/Terminology/Bluetooth-vs-zigbee.html) (Year: 2012).
Steven W. Smith, The Scientist & Engineer's Guide to Digital Signal Processing, 1999, California Technical Publishing, San Diego, California, 2nd Edition, p. 312 (located at http://www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_ch18.pdf) (Year: 1999).
Provisional Applications (2)
Number Date Country
62722420 Aug 2018 US
62632276 Feb 2018 US
Continuations (6)
Number Date Country
Parent 18142904 May 2023 US
Child 18374376 US
Parent 17991348 Nov 2022 US
Child 18142904 US
Parent 17735615 May 2022 US
Child 17991348 US
Parent 17190048 Mar 2021 US
Child 17735615 US
Parent 16732811 Jan 2020 US
Child 17190048 US
Parent 16275575 Feb 2019 US
Child 16732811 US
Continuation in Parts (3)
Number Date Country
Parent 16274933 Feb 2019 US
Child 16275575 US
Parent 16180690 Nov 2018 US
Child 16274933 US
Parent 15412982 Jan 2017 US
Child 16180690 US