The present invention relates to systems and methods for unmanned vehicle recognition and threat management. More particularly, the systems and methods of the present invention are directed to unmanned vehicle detection, classification and direction finding.
Unmanned Aerial Vehicles (UAVs), commonly known as drones, are readily available in commercial and retail stores. Detailed schematics for their control systems, along with Software Development Kits (SDKs), are available from many manufacturers and on the internet. Advancements in various technologies (e.g., 3D printing) enable rapid modification. UAVs can be modified to carry out dangerous actions and threaten public security; for example, UAVs can be modified to deliver dangerous payloads. It is no longer a question of if such misuse will occur, but when. Thus, it is imperative that organizations and governments take steps to protect critical assets (e.g., ports, power plants), structures (e.g., buildings, stadiums), personnel, and citizens.
Exemplary U.S. Patent Documents relevant to the prior art include:
U.S. Pat. No. 9,862,489 for “Method and apparatus for drone detection and disablement” by inventors Lee Weinstein et al., filed Feb. 7, 2016 and issued Jan. 9, 2018, describes a method and apparatus for detection and disablement of an unidentified aerial vehicle (UAV) that includes arrays of antenna elements receiving in two modalities (for instance, radio frequency (RF) and acoustic modalities, or RF and optical modalities). Signal processing of outputs from multiple antenna arrays locates a potential UAV at specific coordinates within a volume of space under surveillance, automatically aims video surveillance and a short-range projectile launcher at the UAV, and may automatically fire the projectile launcher to down the UAV.
U.S. Pat. No. 9,858,947 for “Drone detection and classification methods and apparatus” by inventors Brian Hearing et al., filed Nov. 24, 2015 and issued Jan. 2, 2018, describes a system, method, and apparatus for drone detection and classification. An example method includes receiving a sound signal in a microphone and recording, via a sound card, a digital sound sample of the sound signal, the digital sound sample having a predetermined duration. The method also includes processing, via a processor, the digital sound sample into a feature frequency spectrum. The method further includes applying, via the processor, broad spectrum matching to compare the feature frequency spectrum to at least one drone sound signature stored in a database, the at least one drone sound signature corresponding to a flight characteristic of a drone model. The method moreover includes, conditioned on matching the feature frequency spectrum to one of the drone sound signatures, transmitting, via the processor, an alert.
U.S. Pat. No. 9,767,699 for “System for and method of detecting drones” by inventors John W. Borghese et al., filed May 14, 2015 and issued Sep. 19, 2017, describes an apparatus and method that can provide a warning of a drone or unmanned aerial vehicle in the vicinity of an airport. The apparatus can include at least one antenna directionally disposed along the approach or departure path and a detector configured to sense and provide a warning of the presence of an unmanned aerial vehicle or drone. The warning can be provided in response to a radio frequency signal received by the at least one antenna being in a frequency band associated with a transmission frequency for the unmanned aerial vehicle or drone or in a frequency band associated with interaction from receive circuitry of the unmanned aerial vehicle or drone.
U.S. Pat. No. 9,715,009 for “Deterent for unmanned aerial systems” by inventors Dwaine A. Parker et al., filed Dec. 2, 2016 and issued Jul. 25, 2017, describes a system for providing an integrated multi-sensor detection and countermeasure against commercial unmanned aerial systems/vehicles that includes a detecting element, a tracking element, an identification element, and an interdiction element. The detecting element detects an unmanned aerial vehicle in flight in the region of, or approaching, a property, place, event or very important person. The tracking element determines the exact location of the unmanned aerial vehicle. The identification/classification element, utilizing data from the other elements, generates the identification and threat assessment of the UAS. The interdiction element, based on automated algorithms, can either direct the unmanned aerial vehicle away from the property, place, event or very important person in a non-destructive manner, or can disable the unmanned aerial vehicle in a destructive manner. The interdiction process may be overridden by intervention by a System Operator/HiL.
U.S. Pat. No. 9,529,360 for “System and method for detecting and defeating a drone” by inventors Howard Melamed et al., filed Apr. 22, 2015 and issued Dec. 27, 2016, describes a system for detecting and defeating a drone. The system has a detection antenna array structured and configured to detect the drone and the drone control signal over a 360 degree field relative to the detection antenna array including detecting the directionality of the drone. The system also includes a neutralization system structured and configured in a communicating relation with the detection antenna array. The neutralization system has a transmission antenna structured to transmit an override signal aimed at the direction of the drone, an amplifier configured to boost the gain of the override signal to exceed the signal strength of the drone control signal, and a processing device configured to create and effect the transmission of the override signal. The patent also discloses a method for detecting and defeating a drone.
U.S. Publication No. 2017/0358103 for “Systems and Methods for Tracking Moving Objects” by inventors Michael Shao et al., filed Jun. 9, 2017 and published Dec. 14, 2017, describes systems and methods for tracking moving objects. The publication discloses an object tracking system comprising a processor, a communications interface, and a memory configured to store an object tracking application. The object tracking application configures the processor to receive a sequence of images; estimate and subtract background pixel values from pixels in the sequence of images; compute sets of summed intensity values for different per-frame pixel offsets from the sequence of images; identify summed intensity values from a set of summed intensity values exceeding a threshold; cluster identified summed intensity values exceeding the threshold corresponding to single moving objects; and identify a location of at least one moving object in an image based on at least one summed intensity value cluster.
U.S. Publication No. 2017/0261613 for “Counter drone system” by inventor Brian R. Van Voorst, filed Feb. 27, 2017 and published Sep. 14, 2017, describes a counter drone system that includes a cueing sensor to detect the presence of an object, wherein the cueing sensor cues the presence of a target drone; a long range LIDAR system having a sensor pointed in the direction of the target drone to acquire and track the target drone at long range and provide an accurate location of the target drone, wherein once a track is acquired, the motion of the target drone is used to maintain the track; and a threat detector, wherein LIDAR data is provided to the threat detector to determine if the target drone is a threat.
U.S. Publication No. 2017/0261604 for “Intercept drone tasked to location of lidar tracked drone” by inventor Brian Van Voorst, filed Feb. 27, 2017 and published Sep. 14, 2017, describes a system that includes a long range LIDAR tracking system to track a target drone and provide detection and tracking information of the target drone; a control system to process the detection and tracking information and provide guidance information to intercept the target drone; and a high powered intercept drone controlled by supervised autonomy, the supervised autonomy provided by processing the detection and tracking information of the target drone and sending guidance information to the intercept drone to direct the intercept drone to the target drone.
U.S. Publication No. 2017/0039413 for “Commercial drone detection” by inventor Gary J. Nadler, filed Aug. 3, 2015 and published Feb. 9, 2017, describes a method of capturing the presence of a drone, including: collecting, using at least one sensor, data associated with an aerial object; analyzing, using a processor, the data to determine at least one characteristic of the aerial object; accessing, in a database, a library of stored characteristics of commercially available drones; determining, based on the analyzing, if the at least one characteristic of the aerial object matches a characteristic of a commercially available drone; and responsive to the determining, generating an indication of a positive match.
U.S. Pat. No. 11,663,992 for “Systems and Methods for Detecting, Monitoring, and Mitigating the Presence of Unauthorized Drones” by inventors Jordan et al., filed Jun. 29, 2020 and issued May 23, 2023, is directed to systems and methods for detecting, monitoring, and mitigating the presence of a drone. In one aspect, a system for detecting the presence of a drone includes a radio-frequency (RF) receiver. The system can further include a processor and a computer-readable memory in communication with the processor and having stored thereon computer-executable instructions to cause the at least one processor to receive a set of samples from the RF receiver for a time interval, obtain predetermined data of expected communication protocols used between the drone and a controller, and determine whether the RF signal corresponds to one of the expected communication protocols by comparing the samples of the RF signal to the predetermined data and decoding the RF signal. In further aspects, the system extracts a unique identifier of the drone based at least partially on the decoded RF signal.
U.S. Pat. No. 11,190,233 for “Systems and Methods for Detecting, Monitoring, and Mitigating the Presence of a Drone Using Frequency Hopping” by inventors Fang-Hsuan Lo et al., filed May 28, 2020 and issued Nov. 10, 2021, is directed to systems and methods for detecting, monitoring, and mitigating the presence of a drone. In one aspect, a system for detecting the presence of one or more drones includes a radio-frequency (RF) receiver configured to receive a first RF signal transmitted between a drone and a controller. The system can further include a processor and a computer-readable memory in communication with the processor and having stored thereon computer-executable instructions to cause the at least one processor to receive a set of samples from the RF receiver for a time interval, the set of samples comprising samples of the first RF signal, obtain a parameter model of first frequency hopping parameters of the first RF signal, and fit the parameter model to the set of samples.
U.S. Pat. No. 10,698,076 for “Radio Frequency Signal Transmission Detector and Locator” by inventors Jones et al., filed Aug. 2, 2018 and issued Jun. 10, 2020, is directed to a system and method for detecting and locating the transmission of radio frequency signals from within a defined geographical area. The system uses statistical confidence limits to detect outliers caused by transmissions in the defined geographical area. The source of the transmission can then be located with triangulation.
U.S. Pat. No. 10,540,905 for “Systems, Aircrafts, and Methods for Drone Detection and Collision Avoidance” by inventors Bohanan et al., filed Mar. 28, 2018 and issued Jan. 1, 2020, is directed to a system and method for drone detection and collision avoidance, particularly for use in an aircraft. The system includes, but is not limited to, a sensor, a processor, and an avoidance unit comprising a control unit. The sensor is configured to detect a drone signal in a predetermined space and to transmit the drone signal to the processor. The processor is configured to determine the presence of a drone in the predetermined space based on the drone signal. The processor is configured to transmit a command to the avoidance unit when the processor determines the presence of a drone. The control unit is configured to receive the command and to generate a warning signal in response to receiving the command.
The present invention provides systems and methods for unmanned vehicle recognition. In one embodiment, a multiplicity of receivers captures RF data and transmits the RF data to at least one node device. The at least one node device comprises a signal processing engine, a detection engine, a classification engine, and a direction finding engine. The at least one node device is configured with an artificial intelligence algorithm. The detection engine and classification engine are trained to detect and classify signals from unmanned vehicles and their controllers based on processed data from the signal processing engine. The direction finding engine is operable to provide lines of bearing for detected unmanned vehicles. A display and control unit is in network communication with the at least one node device for displaying locations and other related data for the detected unmanned vehicles.
In one embodiment, the present invention includes a system for signal identification in a radiofrequency (RF) environment, including at least one node device including a processor and at least one memory in communication with at least one RF receiver, wherein the at least one RF receiver is operable to capture RF data in the RF environment and transmit the RF data to the at least one node device, wherein the at least one node device is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile, wherein the at least one tile is visually represented as at least one waterfall image, wherein the at least one node device is operable to analyze the at least one waterfall image using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise to create at least one analyzed waterfall image, and wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
In another embodiment, the present invention includes an apparatus for signal identification in a radiofrequency (RF) environment, including a node device including a processor and at least one memory, wherein the node device is operable to receive RF data from at least one RF receiver, wherein the node device is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile, wherein the at least one tile is represented as at least one waterfall image, wherein the node device waterfall image is analyzed using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise to create at least one analyzed waterfall image, and wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
In yet another embodiment, the present invention includes a method for signal identification in a radiofrequency (RF) environment, including at least one RF receiver capturing Fast Fourier Transform (FFT) data in the RF environment and transmitting the FFT data to at least one node device, the at least one node device averaging the FFT data derived from the RF data into at least one tile, wherein the at least one tile is represented as at least one waterfall image, the at least one node device analyzing the at least one waterfall image using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise, and the at least one node device creating at least one analyzed waterfall image based on the identification of the at least one signal, the at least one signal type, and/or the noise, wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
These and other aspects of the present invention will become apparent to those skilled in the art after a reading of the following description of the preferred embodiment when considered with the drawings, as they support the claimed invention.
The present invention provides systems and methods for unmanned vehicle recognition. The present invention relates to automatic signal detection, temporal feature extraction, geolocation, and edge processing disclosed in U.S. patent application Ser. No. 15/412,982 filed Jan. 23, 2017, U.S. patent application Ser. No. 15/478,916 filed Apr. 4, 2017, U.S. patent application Ser. No. 15/681,521 filed Aug. 21, 2017, U.S. patent application Ser. No. 15/681,540 filed Aug. 21, 2017, and U.S. patent application Ser. No. 15/681,558 filed Aug. 21, 2017, each of which is incorporated herein by reference in its entirety.
In one embodiment, the present invention includes a system for signal identification in a radiofrequency (RF) environment, including at least one node device including a processor and at least one memory in communication with at least one RF receiver, wherein the at least one RF receiver is operable to capture RF data in the RF environment and transmit the RF data to the at least one node device, wherein the at least one node device is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile, wherein the at least one tile is visually represented as at least one waterfall image, wherein the at least one node device is operable to analyze the at least one waterfall image using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise to create at least one analyzed waterfall image, and wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
In another embodiment, the present invention includes an apparatus for signal identification in a radiofrequency (RF) environment, including a node device including a processor and at least one memory, wherein the node device is operable to receive RF data from at least one RF receiver, wherein the node device is operable to average Fast Fourier Transform (FFT) data derived from the RF data into at least one tile, wherein the at least one tile is represented as at least one waterfall image, wherein the node device waterfall image is analyzed using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise to create at least one analyzed waterfall image, and wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
In yet another embodiment, the present invention includes a method for signal identification in a radiofrequency (RF) environment, including at least one RF receiver capturing Fast Fourier Transform (FFT) data in the RF environment and transmitting the FFT data to at least one node device, the at least one node device averaging the FFT data derived from the RF data into at least one tile, wherein the at least one tile is represented as at least one waterfall image, the at least one node device analyzing the at least one waterfall image using machine learning (ML) or at least one convolutional neural network (CNN) to identify at least one signal, at least one signal type, and/or noise, and the at least one node device creating at least one analyzed waterfall image based on the identification of the at least one signal, the at least one signal type, and/or the noise, wherein the at least one analyzed waterfall image includes a visual indication of the at least one signal, the at least one signal type, and/or the noise.
Currently, commercial and retail UAVs dominate frequencies including the 433 MHz industrial, scientific, and medical (ISM) radio band in Region 1, the 900 MHz ISM band in Regions 1, 2, and 3 (varies by country), 2.4 GHz (channels 1-14), 5 GHz (channels 7-165 most predominant), and 3.6 GHz (channels 131-183). Modulation types used by commercial and retail UAVs include Direct Sequence Spread Spectrum (DSSS), Orthogonal Frequency Division Multiplexing (OFDM), Frequency Hopping Spread Spectrum (FHSS), and Futaba Advanced Spread Spectrum Technology (FASST).
Many counter UAV systems in the prior art focus on the 2.4 GHz and 5.8 GHz bands utilizing demodulation and decryption of radio frequency (RF) signals to detect and analyze each signal to determine if it is related to a UAV.
The present invention provides systems and methods for unmanned vehicle recognition including detection, classification, and direction finding. Unmanned vehicles comprise aerial, terrestrial, or waterborne unmanned vehicles. The systems and methods for unmanned vehicle recognition are operable to counter threats from aerial, terrestrial, or waterborne unmanned vehicles.
In one embodiment, a multiplicity of receivers captures RF data and transmits the RF data to at least one node device. The at least one node device comprises a signal processing engine, a detection engine, a classification engine, and a direction finding engine. The at least one node device is configured with an artificial intelligence algorithm. The detection engine and classification engine are trained to detect and classify signals from unmanned vehicles and their controllers based on processed data from the signal processing engine. The direction finding engine is operable to provide lines of bearing for detected unmanned vehicles. A display and control unit is in network communication with the at least one node device for displaying locations and other related data for the detected unmanned vehicles.
In one embodiment, the present invention provides systems and methods for unmanned vehicle (UV) recognition in a radio frequency (RF) environment. A multiplicity of RF receivers and a displaying device are in network communication with a multiplicity of node devices. The multiplicity of RF receivers is operable to capture the RF data in the RF environment, convert the RF data to fast Fourier transform (FFT) data, and transmit the FFT data to the multiplicity of node devices. The multiplicity of node devices each comprises a signal processing engine, a detection engine, a classification engine, a direction-finding engine, and at least one artificial intelligence (AI) engine. The signal processing engine is operable to average the FFT data into at least one tile. The detection engine is operable to group the FFT data into discrete FFT bins over time, calculate average and standard deviation of power for the discrete FFT bins, and identify at least one signal related to at least one UV and/or corresponding at least one UV controller. The at least one AI engine is operable to generate an output for each of the at least one tile to identify at least one UV and corresponding at least one UV controller with a probability, and calculate an average probability based on the output from each of the at least one tile. The classification engine is operable to classify the at least one UV and/or the at least one UV controller by comparing the at least one signal to classification data stored in a classification library in real time or near real time. The direction-finding engine is operable to calculate a line of bearing for the at least one UV. The displaying device is operable to display a classification of the at least one UV and/or the at least one UV controller and/or the line of bearing of the at least one UV. Each of the at least one tile is visually represented in a waterfall image via a graphical user interface on the displaying device.
The present invention provides a more efficient methodology for UAV detection and identification, which takes advantage of Fast Fourier Transform (FFT) data computed over a short period of time and data derived from it. RF data received from antennas is directly converted to FFT data with finer granularity. This allows rapid, probability-based identification of protocols used by high-threat drones without demodulation. An analytics engine is operable to perform near real-time analysis and characterize signals within the spectrum under observation.
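As an illustration of this FFT-based front end, the following is a minimal sketch, assuming complex IQ samples from a receiver and hypothetical FFT and window parameters, of converting captured RF data into log-magnitude FFT rows suitable for downstream tiling; it is not the production signal processing engine.

```python
import numpy as np

def iq_to_fft_rows(iq_samples, fft_size=1024):
    """Convert complex IQ samples into rows of log-magnitude FFT data.

    Each row corresponds to one stare of the receiver; stacking rows over time
    yields the waterfall data consumed by the detection and classification
    engines. fft_size is a hypothetical value, not a specified parameter.
    """
    usable = (len(iq_samples) // fft_size) * fft_size
    frames = iq_samples[:usable].reshape(-1, fft_size)
    windowed = frames * np.hanning(fft_size)              # reduce spectral leakage
    spectra = np.fft.fftshift(np.fft.fft(windowed, axis=1), axes=1)
    power_db = 20.0 * np.log10(np.abs(spectra) + 1e-12)   # logarithmic scale in magnitude
    return power_db                                        # shape: (num_rows, fft_size)

# Example with synthetic IQ noise standing in for captured RF data
rng = np.random.default_rng(0)
iq = rng.normal(size=2_000_000) + 1j * rng.normal(size=2_000_000)
print(iq_to_fft_rows(iq).shape)
```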
Advantageously, multiple receivers in the present invention work together to ingest spectral activities across large blocks of spectrum. The multiple receivers have an instantaneous bandwidth from 40 MHz to 500 MHz. In one embodiment, the multiple receivers are configurable in 40 MHz and 125 MHz segment building blocks. Input data are converted directly to FFT data and fed into process engines, which significantly decreases latency. The process engines are designed for rapid identification of signals of interest (SOI). When an SOI is detected, a direction finding process is initiated autonomously. In one embodiment, the direction finding process is configurable by an operator.
There are multiple types of communications links utilized for command and control of an unmanned vehicle. Although several cost-effective radio communication (RC) protocols are gaining global popularity, WI-FI is still the most popular protocol for command and control of UAVs and camera systems. A remote controller of a UAV acts as a WI-FI access point and the UAV acts as a client. There are several limiting factors for WI-FI-based UAVs. For example, the operational range of a WI-FI-based UAV is typically limited to 150 feet (46 m) indoor and 300 feet (92 m) outdoor. There is significant latency for control and video behaviors. Interference by other WI-FI devices affects operational continuity of the WI-FI-based UAVs.
Demand in the UAV user community has made more professional-level protocols available in the commercial and retail markets. By way of example but not limitation, two common RC protocols used for UAVs are Lightbridge and OcuSync. Enhancements in drone technology inevitably increase the capability of drones for use in industrial espionage and as weapons for nefarious activities.
Lightbridge was developed for long-range, reliable communication. Communication is available at ranges up to 5 km. Lightbridge supports 8 selectable channels, and the selection can be manual or automatic. Drones using the Lightbridge protocol also have the ability to assess interference and move to alternate channels for greater quality.
OcuSync was developed based on the Lightbridge protocol. OcuSync uses effective digital compression and other improvements, which decrease the knowledge required to operate. OcuSync provides reliable HD and UHD video, and OcuSync-based drones can be operated in areas with greater dynamic interference. OcuSync improves command and control efficiencies and reduces latency. With OcuSync, video communications are improved substantially, operational range is increased, and command and control recovery is enhanced when interference occurs.
The systems and methods of the present invention for unmanned vehicle recognition are operable to detect and classify UAVs at a distance, provide directions of the UAVs, and take defensive measures to mitigate risks. The detection and classification are fast, which provides more time to react and respond to threats. Exact detection range is based upon selection of antenna systems, topology, morphology, and client criteria. Classification of the detected UAVs provides knowledge of the UAVs and defines effective actions and capabilities for countering UAV threats. In one embodiment, the direction information of the UAVs provides orientation within the environment based on the location of the UAV detector.
In one embodiment, the systems and methods of the present invention provide an unmanned vehicle recognition solution targeting radio-controlled and WI-FI-based drones. The overall system is capable of surveying the spectrum from 20 MHz to 6 GHz, not just the common 2.4 GHz and 5.8 GHz areas as in the prior art. In one embodiment, the systems and methods of the present invention are applied to address two major categories: RC-based UAV systems and WI-FI-based UAV systems. In one embodiment, UAV systems utilize RC protocols comprising Lightbridge and OcuSync. In another embodiment, UAV systems are WI-FI based, for example but not for limitation, 3DR Solo and Parrot SkyController. The systems and methods of the present invention are operable to detect UAVs and their controllers by protocol.
The systems and methods of the present invention maintain a state-of-the-art learning system and library for classifying detected signals by manufacturer and controller type. The state-of-the-art learning system and library are updated as new protocols emerge.
In one embodiment, classification by protocol chipset is utilized to provide valuable intelligence and knowledge for risk mitigation and threat defense. The valuable intelligence and knowledge include effective operational range, supported peripherals (e.g., external or internal camera, barometers, GPS and dead reckoning capabilities), integrated obstacle avoidance systems, and interference mitigation techniques.
The state-of-the-art learning system of the present invention is highly accurate and capable of assessing detected UAV signals and/or controller signals for classification in less than a few seconds with a high confidence level. The state-of-the-art learning system is operable to discriminate changes in the environment for non-drone signals as well as drone signals.
It is difficult to recognize commercial and retail drones with the naked eye beyond 100 meters, so it is critical to obtain a vector to the target for situational awareness and defense execution. The systems and methods of the present invention provide lines of bearing for direction finding for multiple UAVs flying simultaneously. Each line of bearing is color coded for display. Angles, along with frequencies utilized for uplink and downlink, are also displayed on the human interface.
Once a UAV is detected and classified, an alert is posted to a counter UAV system operator (e.g., a network operation center, an individual operator) including the azimuth of the UAV and other information. The alert is transmitted via email, short message service (SMS), or third-party system integration. The counter UAV system is operable to engage an intercession transmission, which disrupts the communication between the UAV and its controller. When the communication between the UAV and its controller is intercepted, the UAV invokes certain safety protocols, such as reducing height and hovering, landing, or returning to the launch point. The counter UAV system may have certain restrictions based on country and classification of the UAV.
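As one possible realization of the alerting step, the sketch below pushes a detection event to an operator by email using the Python standard library; the SMTP host, addresses, and alert fields are hypothetical placeholders, and SMS or third-party integrations would follow the same pattern.

```python
import smtplib
from email.message import EmailMessage

def send_uav_alert(classification, azimuth_deg, frequency_mhz,
                   smtp_host="mail.example.com",
                   sender="cuav-node@example.com",
                   recipient="noc@example.com"):
    """Email a UAV detection alert to the counter UAV system operator.

    All endpoints are hypothetical; a deployment would substitute its own
    mail server, recipients, and any additional alert fields.
    """
    msg = EmailMessage()
    msg["Subject"] = f"UAV ALERT: {classification} detected"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(
        f"Classification: {classification}\n"
        f"Azimuth: {azimuth_deg:.1f} degrees\n"
        f"Frequency: {frequency_mhz:.1f} MHz\n"
    )
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```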
In one embodiment, the systems and methods of the present invention are operable to update the UAV library with emerging protocols for classification purposes, and refine the learning engine for wideband spectrum analysis for other potential UAV signatures, emerging protocols, and technologies. In other words, the systems and methods of the present invention are adaptable to any new and emerging protocols and technologies developed for unmanned vehicles. In one embodiment, multiple node devices in the present invention are deployed to operate as a group of networked nodes. In one embodiment, the group of networked nodes is operable to estimate geographical locations for unmanned vehicles. In one embodiment, two node devices each provide a single line of bearing and together approximate a geographical location of a detected drone or controller. The more node devices there are in the group of networked nodes, the more lines of bearing are operable to be provided, and the more accurately the geographical location is estimated for the detected drone or controller. In one embodiment, the geolocation function provides altitude and distance of a targeted drone.
In one embodiment, the counter UAV system in the present invention is operable to alert when unexpected signal characteristics are detected in 2.4 GHz and 5.8 GHz areas and classify the unexpected signal characteristics as potential UAV activities. In another embodiment, the counter UAV system in the present invention is operable to alert when unexpected signal characteristics are detected anywhere from 20 MHz to 6 GHz and classify the unexpected signal characteristics as potential UAV activities. In another embodiment, the counter UAV system in the present invention is operable to classify the unexpected signal characteristics as potential UAV activities when unexpected signal characteristics are detected anywhere from 40 MHz to 6 GHz. The automatic signal detection engine and analytics engine are enhanced in the counter UAV system to recognize potential UAV activities across a great portion of the spectrum. In one embodiment, any blocks of spectrum from 40 MHz to 6 GHz are operable to be selected for UAV recognition.
In one embodiment, vector-based information including inclinations, declinations, topology deviations, and user configurable Northing map orientation is added to the WGS84 mapping system for direction finding and location estimation. In one embodiment, earth-centered earth-fixed vector analysis is provided for multi-node systems to estimate UAV locations, derive UAV velocities from position changes over time, and determine UAV trajectory vectors in fixed nodal processing. In one embodiment, a group of networked node devices are operable to continually provide lines of bearing over time, approximate geographical locations of a detected unmanned vehicle on or above the earth, and track the movement of the detected unmanned vehicle from one estimated location to another. In one embodiment, the group of networked node devices are operable to determine velocities of the detected unmanned vehicle based on estimated locations and travel time. In one embodiment, the group of networked node devices are operable to estimate a trajectory of the detected unmanned vehicle based on the estimated geographical locations over time. In one embodiment, the group of networked node devices are operable to estimate accelerations and decelerations of the unmanned vehicle based on the velocities of the unmanned vehicles over time.
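One way to picture the multi-node geolocation and tracking described above is the sketch below, which intersects lines of bearing from several nodes in a local east-north plane by least squares and derives a velocity estimate from successive position fixes; the node coordinates and bearings are invented for illustration, and a fielded system would perform the equivalent computation in WGS84/earth-centered earth-fixed coordinates.

```python
import numpy as np

def intersect_bearings(node_positions, bearings_deg):
    """Least-squares intersection of lines of bearing in a local east-north plane.

    node_positions: (N, 2) node coordinates in meters (east, north).
    bearings_deg: compass bearings in degrees clockwise from north.
    Returns the point minimizing the squared perpendicular distance to all lines.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, brg in zip(np.asarray(node_positions, dtype=float), bearings_deg):
        theta = np.radians(brg)
        d = np.array([np.sin(theta), np.cos(theta)])   # unit vector along the bearing
        M = np.eye(2) - np.outer(d, d)                 # projector onto the line normal
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

def estimate_velocity(fix_a, fix_b, dt_s):
    """Velocity vector in m/s from two successive position fixes."""
    return (np.asarray(fix_b) - np.asarray(fix_a)) / dt_s

# Three hypothetical nodes observing a drone near (500 m east, 800 m north)
nodes = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
fix_1 = intersect_bearings(nodes, [32.0, 328.0, 111.8])
fix_2 = intersect_bearings(nodes, [33.0, 329.5, 110.5])
print(fix_1, estimate_velocity(fix_1, fix_2, dt_s=2.0))
```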
In one embodiment, the systems and methods of the present invention are operable for UAV detection and direction finding for different modulation schemes including but not limited to DSSS, OFDM, FHSS, FASST, etc. In one embodiment, the counter UAV system in the present invention is configured with cameras for motion detection. The cameras have both day and night vision.
In one embodiment, the systems and methods of the present invention provide training for unmanned vehicle recognition. RF data is captured for a Phantom 3 drone and its controller and a Phantom 4 drone and its controller, both of which use the Lightbridge protocol. RF data is also captured for a Mavic Pro drone and its controller, which uses the OcuSync protocol. The RF data is recorded at different channels, different RF bandwidths, and different video quality settings inside and outside an anechoic chamber.
In one embodiment, the recorded RF data is used to train and calibrate an inception-based convolutional neural network included in a drone detection system.
The trained inception-based convolutional neural network is operable to identify Lightbridge 1 controllers and drones, Lightbridge 2 controllers and drones, and OcuSync controllers and drones. The trained inception-based convolutional neural network is operable to identify Lightbridge and OcuSync controllers and drones at the same time. In one embodiment, the drone detection system comprising the trained inception-based convolutional neural network is operable to search an instantaneous bandwidth of 147.2 MHz.
In one embodiment, the drone detection system of the present invention includes an artificial intelligence (AI) algorithm running on a single board computer (e.g., Nvidia Jetson TX2) with an execution time less than 10 ms. The drone detection system is operable to separate Phantom 3 and Phantom 4 controllers. Waveforms for Phantom 3 and Phantom 4 controllers are sufficiently different to assign separate probabilities.
The Artificial Intelligence (AI) algorithm is used to enhance performance for RF data analytics, and the RF data analytics process based on the AI algorithm is visualized. The RF waterfalls of several drone scenarios are presented in the figures.
Each scenario is illustrated with 6 waterfall images. Each waterfall represents approximately 80 ms of time and 125 MHz of bandwidth. The top left image is the waterfall before AI processing. The other five images are waterfalls after the AI processing. For each signal type, the areas of the waterfall that are likely to contain that RF signal type are highlighted. Areas that are not for the signal type are grayed out. The overall probability that a signal exists in the image is printed in the title of each waterfall image. In one embodiment, the AI algorithm is securely integrated with a state engine and a detection process of the present invention. In one embodiment, AI processing of the waterfall includes a comparison of the waterfall image to a database of waterfall images of the RF environment or similar RF environments for which signals or noise were identified. The comparison of the waterfall image to the database of waterfall images is operable to be performed in real time or near real time. In one embodiment, the database of waterfall images is operable to be updated in real time or near real time with the waterfall image created based on the at least one tile and information associated with the waterfall image, including but not limited to the signal identified in the waterfall image, the signal type(s) identified in the waterfall image, and/or noise identified in the waterfall image.
The comparison includes the use of machine learning (ML) or convolutional neural networks (CNN) in one embodiment. In other embodiments, the node or system is operable to utilize a plurality of learning techniques for analyzing waterfall images including, but not limited to, artificial intelligence (AI), deep learning (DL), artificial neural networks (ANNs), support vector machines (SVMs), Markov decision process (MDP), and/or natural language processing (NLP). The node or system is operable to use any of the aforementioned learning techniques alone or in combination. Further, the node or system is operable to utilize predictive analytics techniques including, but not limited to, machine learning (ML), artificial intelligence (AI), neural networks (NNs) (e.g., long short term memory (LSTM) neural networks), deep learning, historical data, and/or data mining to make future predictions and/or models. The node or system is preferably operable to recommend and/or perform actions based on historical data, external data sources, ML, AI, NNs, and/or other learning techniques. The node or system is operable to utilize predictive modeling and/or optimization algorithms including, but not limited to, heuristic algorithms, particle swarm optimization, genetic algorithms, technical analysis descriptors, combinatorial algorithms, quantum optimization algorithms, iterative methods, deep learning techniques, and/or feature selection techniques.
In one embodiment, a method for drone detection and classification comprises applying an FFT function to RF data, converting the FFT data into a logarithmic scale in magnitude, averaging the converted FFT data into a 256 by 256 array representing 125 MHz of bandwidth and 80 ms of time as a base tile, applying a normalization function to the base tile, applying a series of convolutional and pooling layers, applying a modified You Only Look Once (YOLO) algorithm for detection, and grouping bounding boxes displayed in the waterfall images (e.g., the waterfall plots shown in the figures).
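A minimal numeric sketch of the base-tile step is shown below, assuming log-magnitude FFT rows covering roughly 125 MHz and 80 ms are already available; the block-averaging and zero-mean/unit-variance normalization are illustrative choices rather than the exact production processing.

```python
import numpy as np

def build_base_tile(fft_db_rows, tile_shape=(256, 256)):
    """Average log-magnitude FFT rows into a fixed-size base tile.

    fft_db_rows: (num_rows, num_bins) array covering ~80 ms and ~125 MHz,
    with num_rows and num_bins at least as large as tile_shape. The rows and
    bins are block-averaged down to tile_shape, then normalized before being
    passed to the convolutional layers.
    """
    rows, bins = fft_db_rows.shape
    t, f = tile_shape
    trimmed = fft_db_rows[: (rows // t) * t, : (bins // f) * f]
    tile = trimmed.reshape(t, rows // t, f, bins // f).mean(axis=(1, 3))
    return (tile - tile.mean()) / (tile.std() + 1e-9)   # zero mean, unit variance
```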
In one embodiment, a method for training comprises recording clean RF signals, shifting the RF signals in frequency randomly, creating truth data for the YOLO output, adding a simulated channel to the RF signals, recording typical RF backgrounds, applying an FFT function to the RF data, converting the FFT data into a logarithmic scale in magnitude, averaging the converted FFT data into a 256 by 256 array representing 125 MHz of bandwidth and 80 ms of time as a base tile, applying a normalization function to the base tile, applying a series of convolutional and pooling layers, applying a modified You Only Look Once (YOLO) algorithm for detection, and grouping bounding boxes displayed in the waterfall images (e.g., the waterfall plots shown in the figures).
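The data-augmentation portion of this training flow can be sketched as follows, assuming clean drone recordings and recorded RF backgrounds are available as complex IQ arrays; the shift range, attenuation, and noise level are illustrative values only.

```python
import numpy as np

def augment_recording(clean_iq, background_iq, sample_rate_hz, rng=None):
    """Create one training example from a clean drone recording.

    Applies a random frequency shift, a simple simulated channel (random
    attenuation plus additive complex noise), and mixes in a recorded RF
    background, mirroring the training steps described above.
    """
    rng = rng if rng is not None else np.random.default_rng()
    n = min(len(clean_iq), len(background_iq))
    sig, bg = clean_iq[:n], background_iq[:n]

    shift_hz = rng.uniform(-10e6, 10e6)                    # hypothetical shift range
    t = np.arange(n) / sample_rate_hz
    shifted = sig * np.exp(2j * np.pi * shift_hz * t)

    atten = rng.uniform(0.1, 1.0)                          # simulated channel gain
    noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) * 0.05
    return atten * shifted + noise + bg
```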
In one embodiment, a drone detection engine is operable to convert FFT flows from a radio into a tile. For each channel, the drone detection engine is operable to standardize the FFT output from the radio at a defined resolution bandwidth and group high resolution FFT data into distinct bins over time. The drone detection engine is further operable to calculate the average and standard deviation of power for the discrete FFT bins and assign a power value to each channel within the tile. Each scan or single stare at the radio is a time slice, and multiple time slices with power and channel assignments create a tile. Tiles from different frequency spans and center frequencies are identified as a tile group by a tile group number. Receivers in the drone detection system are operable to be re-tuned to different frequencies and spans. In one embodiment, the drone detection system comprises multiple receivers to generate tiles and tile groups.
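The per-bin statistics and tile-group bookkeeping in this step can be pictured with the short sketch below; the three-standard-deviation threshold is an assumption for illustration, not a value taken from the specification.

```python
import numpy as np

def bin_statistics(power_db, threshold_sigma=3.0):
    """Per-bin power statistics over the time slices of one tile.

    power_db: (num_slices, num_bins) log-magnitude power values.
    Returns the per-bin mean, the per-bin standard deviation, and a boolean
    mask of bins whose latest slice exceeds mean + threshold_sigma * std.
    """
    mean = power_db.mean(axis=0)
    std = power_db.std(axis=0)
    active = power_db[-1] > mean + threshold_sigma * std
    return mean, std, active

def tile_group_id(center_freq_hz, span_hz):
    """Tiles sharing a center frequency and span belong to the same tile group."""
    return (int(center_freq_hz), int(span_hz))
```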
In one embodiment, a tile is sent to a YOLO AI Engine. Outputs of a decision tree in the YOLO AI engine are used to detect multiple drones and their controllers. Drones of the same type of radio protocol are operable to be identified within the tile. Controllers of the same type of radio protocol are operable to be identified within the tile. Drones of different radio protocols are also operable to be identified within the tile. Controllers of different radio protocols are also operable to be identified within the tile.
In one embodiment, a plurality of tiles is sent to the YOLO AI engine. In one embodiment, a tile group is sent to the YOLO AI engine. The YOLO AI engine generates an output for each tile to identify drones and their controllers with a probability. An average probability is calculated based on outputs for multiple tiles in the tile group. For each tile group, the YOLO AI engine computes outputs for several tiles per second.
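A sketch of how per-tile outputs might be combined across a tile group is shown below, assuming each AI engine output is a mapping from a detected class (drone or controller type) to a probability; the data layout is invented for illustration.

```python
from collections import defaultdict

def average_tile_group(tile_outputs):
    """Average per-tile probabilities for each detected class in a tile group.

    tile_outputs: list of dicts such as {"Lightbridge drone": 0.91,
    "Lightbridge controller": 0.85}, one dict per tile. Returns a dict of
    class label -> average probability over the tiles reporting that class.
    """
    sums, counts = defaultdict(float), defaultdict(int)
    for output in tile_outputs:
        for label, prob in output.items():
            sums[label] += prob
            counts[label] += 1
    return {label: sums[label] / counts[label] for label in sums}

# Example: three tiles from one tile group
print(average_tile_group([
    {"OcuSync drone": 0.88, "OcuSync controller": 0.79},
    {"OcuSync drone": 0.93},
    {"OcuSync drone": 0.90, "OcuSync controller": 0.82},
]))
```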
In one embodiment, a state engine controls the flows of tiles and tile groups into one or more AI engines. AI engines do not use frequency values for analytics. Thus, the one or more AI engines are operable for any frequency and frequency span that a drone radio supports. The state engine further correlates output of the one or more AI engines to appropriate tiles and tile groups.
The systems and methods of the present invention are operable for direction finding of drones and their controllers. Outputs from the AI engine are denoted with time basis for the drones and their controllers.
Drones typically maintain the same frequency unless their firmware detects interference, in which case the drones may negotiate a change with their controllers. This does not create an issue for detection as long as the new frequency and span are monitored by the systems and methods of the present invention. Drone controllers typically use frequency hopping spread spectrum (FHSS) or another frequency hopping scheme (e.g., Gaussian frequency shift keying (GFSK)).
In one embodiment, the systems and methods of the present invention are operable to approximate a start time of a line of bearing for a direction finding (DF) system. The time intervals are either known or estimated based upon the behavior monitored by the AI engine and the state engine. This allows the time slice and frequency of each individual drone and/or controller to be passed to the DF system. In one embodiment, three or four receivers are coordinated to collect information in appropriate frequency segments, wherein the frequency segments are similar to the tiles described earlier.
The AI engine examines the segments to determine if a drone or controller exists. An azimuth of the drone or controller in an Earth-Centered Earth-Fixed coordinate system is determined based on other information collected from the three or four receivers using time difference of arrival (TDOA), angle of arrival (AOA), power correlative, or interferometry techniques.
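For the simplest two-receiver, far-field case, the angle-of-arrival geometry behind these techniques reduces to a single arcsine, sketched below; a full implementation combines three or four receivers and works in earth-centered earth-fixed coordinates, so the baseline and delay here are only illustrative.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tdoa_angle_deg(delta_t_s, baseline_m):
    """Far-field angle of arrival from a time difference of arrival.

    For two receivers separated by baseline_m, a plane wave arriving at angle
    theta from broadside satisfies delta_t = (baseline_m / C) * sin(theta).
    The ratio is clipped to the valid arcsine domain to guard against noise.
    """
    x = np.clip(C * delta_t_s / baseline_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(x)))

# Example: a 20 ns delay across a hypothetical 10 m baseline (~37 degrees)
print(tdoa_angle_deg(20e-9, 10.0))
```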
The distance capability of the UAV detection and classification system depends on the hardware configuration, environment morphology, and restrictions based on country and classification of the counter UAV operator. In one embodiment, the systems and methods for unmanned vehicle recognition are operable to detect unmanned vehicles within 3-4 kilometers.
In one embodiment, the systems and methods of the present invention are operable to utilize training data. In one embodiment, the training data contains at least one tile displaying at least one waterfall image for one or more drones. These images are numbered based on the time slices and color coded to keep track of each drone. The numbering and color coding of each drone are fed into the ML drone frequency hopping algorithm. The color coding and numbering of each drone signal allow for easy training of the image analysis algorithm to quickly pick up on trends in drone frequency hopping, which allows for quick training of the algorithm to account for multiple drones and/or frequency hopping types. In one embodiment, the one or more drones use the same and/or different types of radio protocols within at least one tile. In one embodiment, the training data is further utilizable by one or more AI engines operable for any frequency and frequency span supported by a drone radio. In one embodiment, the AI engine is operable to use machine learning (ML), artificial intelligence (AI), You Only Look Once Artificial Intelligence (YOLO AI), deep learning (DL), neural networks (NNs), artificial neural networks (ANNs), convolutional neural networks (CNNs), support vector machines (SVMs), Markov decision process (MDP), natural language processing (NLP), control theory, and/or statistical learning techniques.
In one embodiment, the AI and/or ML algorithm is operable to classify the drone and/or signal modulation type based on the waterfall image created by the signal. The training data and/or waterfall images are operable to be sets of known modulation schemes for drones so that the ML algorithm can perform an image comparison against the signal in the RF environment. The known frequency hopping schemes are operable to be learned by the algorithm and then identified by pattern recognition when a signal in the environment is analyzed and rendered as a waterfall graphical representation. The image comparison ML or AI algorithm is operable to detect the modulation type patterns of the signal.
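One elementary way to realize the image-comparison idea is normalized correlation of a waterfall tile against stored templates of known modulation schemes, sketched below; the production system relies on trained ML/CNN models, so this is only an illustrative stand-in.

```python
import numpy as np

def _normalize(tile):
    """Return a zero-mean, unit-norm vector for a waterfall tile."""
    flat = np.asarray(tile, dtype=float).ravel()
    flat = flat - flat.mean()
    return flat / (np.linalg.norm(flat) + 1e-12)

def classify_by_template(tile, templates):
    """Match a waterfall tile against reference tiles of known modulation schemes.

    templates: dict mapping a label (e.g., "FHSS controller") to a reference
    tile of the same shape as the query tile. Returns the best-matching label
    and its correlation score in [-1, 1].
    """
    query = _normalize(tile)
    scores = {label: float(query @ _normalize(ref)) for label, ref in templates.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]
```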
In one embodiment, the training data is operable to be utilized for “No Drone Zone” identification and avoidance. The Federal Aviation Administration (FAA) designates areas where drones are not permitted to operate as “No Drone Zones,” with each zone containing specific operating restrictions. To report “No Drone Zones” to recreational flyers, the FAA released B4UFLY, a service that allows recreational flyers to determine where drones are and are not permitted to operate.
In one embodiment, the training data is operable to be fed to the ML Drone Frequency Hopping algorithm to allow for identification and avoidance of “No Drone Zones.” In a further embodiment, the ML Drone Frequency Hopping algorithm is operable to cross reference detected signals with B4UFLY to determine if the signal is in a restricted area.
In one embodiment, the training data includes at least a tile or group of tiles representing a drone that is operable to exist in a “No Drone Zone.” In one embodiment, the system utilizes the ML or AI Drone Frequency Hopping algorithm to confirm detection of a drone before cross referencing the signal with B4UFLY, the geolocation engine, and other sensors to determine if the drone has breached a “No Drone Zone.” In one embodiment, the other sensors include visual systems such as LIDAR, cameras, and radar. The system is operable to interface with the multiple data sources, determine if the information from the multiple data sources is consistent or in agreement, and create a report identifying and determining if the drone has breached a “No Drone Zone.” If the sensor information is not in agreement, the system is operable to discard the data and create a report identifying the faulty data.
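A simplified sketch of the geofence check behind this cross-referencing is given below; it tests an estimated drone position against a local list of circular restricted zones using a haversine distance, since no particular B4UFLY interface is assumed here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def breached_zones(drone_lat, drone_lon, zones):
    """Return the restricted zones whose radius contains the drone position.

    zones: list of dicts such as {"name": "Stadium TFR", "lat": 35.0,
    "lon": -78.9, "radius_m": 5000.0}, built from whatever restriction source
    the deployment uses (the structure here is hypothetical).
    """
    return [z for z in zones
            if haversine_m(drone_lat, drone_lon, z["lat"], z["lon"]) <= z["radius_m"]]
```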
In one embodiment, the present invention is operable to utilize training data to detect friendly drones (F-Drones) and nefarious drones (N-Drones). In one embodiment, an F-Drone is identified by a specific radio protocol and/or an N-Drone is identified by not fitting into the desired radio protocol. In another embodiment, an N-Drone is identified by a desired radio protocol and/or an F-Drone is identified by not identifying with a targeted radio protocol. In a further embodiment, one or more AI engines are operable to detect F-Drones and/or N-Drones utilizing training data. The training data is further utilizable by one or more AI engines that are operable for any frequency and frequency span supported by a drone radio. In one embodiment, the training data is operable to be utilized by one or more AI engines to define a “No Drone Zone.” In a further embodiment, the AI engine is operable to detect an N-Drone by identifying a signal within the “No Drone Zone.” In one embodiment, the training data is cross referenced with B4UFLY, the geolocation engine, and other sensors to incorporate parameters, including but not limited to altitude, to detect friendly and/or nefarious drones. In one embodiment, a report is created and sent identifying and detecting F-Drones and/or N-Drones.
Certain modifications and improvements will occur to those skilled in the art upon a reading of the foregoing description. The above-mentioned examples are provided to serve the purpose of clarifying the aspects of the invention and it will be apparent to one skilled in the art that they do not serve to limit the scope of the invention. All modifications and improvements have been deleted herein for the sake of conciseness and readability but are properly within the scope of the present invention.
This application relates to and claims priority from the following applications. This application is a continuation-in-part of U.S. patent application Ser. No. 18/775,710, filed Jul. 17, 2024, which is a continuation-in-part of U.S. patent application Ser. No. 18/428,606, filed Jan. 31, 2024, which is a continuation of U.S. patent application Ser. No. 18/374,376, filed Sep. 28, 2023, which is a continuation of U.S. patent application Ser. No. 18/142,904, filed May 3, 2023, which is a continuation of U.S. patent application Ser. No. 17/991,348, filed Nov. 21, 2022, which is a continuation of U.S. patent application Ser. No. 17/735,615, filed May 3, 2022, which is a continuation of U.S. patent application Ser. No. 17/190,048 filed Mar. 2, 2021, which is a continuation of U.S. patent application Ser. No. 16/732,811 filed Jan. 2, 2020, which is a continuation of U.S. patent application Ser. No. 16/275,575 filed Feb. 14, 2019, which claims the benefit of U.S. Provisional Application 62/632,276 filed Feb. 19, 2018. U.S. patent application Ser. No. 16/275,575 also claims priority from and is a continuation-in-part of U.S. patent application Ser. No. 16/274,933 filed Feb. 13, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 16/180,690 filed Nov. 5, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/412,982 filed Jan. 23, 2017. U.S. patent application Ser. No. 16/180,690 also claims priority from U.S. Provisional Patent Application No. 62/722,420 filed Aug. 24, 2018. U.S. patent application Ser. No. 16/274,933 also claims the benefit of U.S. Provisional Application 62/632,276 filed Feb. 19, 2018. Each of the above-mentioned applications is incorporated herein by reference in its entirety.
8326309 | Mody et al. | Dec 2012 | B2 |
8326313 | McHenry et al. | Dec 2012 | B2 |
8335204 | Samarasooriya et al. | Dec 2012 | B2 |
8346273 | Weigand | Jan 2013 | B2 |
8350970 | Birkett et al. | Jan 2013 | B2 |
8352223 | Anthony et al. | Jan 2013 | B1 |
8358723 | Hamkins et al. | Jan 2013 | B1 |
8364188 | Srinivasan et al. | Jan 2013 | B2 |
8369305 | Diener et al. | Feb 2013 | B2 |
8373759 | Samarasooriya et al. | Feb 2013 | B2 |
8391794 | Sawai et al. | Mar 2013 | B2 |
8391796 | Srinivasan et al. | Mar 2013 | B2 |
8401564 | Singh | Mar 2013 | B2 |
8406776 | Jallon | Mar 2013 | B2 |
8406780 | Mueck | Mar 2013 | B2 |
RE44142 | Wilson | Apr 2013 | E |
8421676 | Moshfeghi | Apr 2013 | B2 |
8422453 | Abedi | Apr 2013 | B2 |
8422958 | Du et al. | Apr 2013 | B2 |
RE44237 | Mchenry | May 2013 | E |
8437700 | Mody et al. | May 2013 | B2 |
8442445 | Mody et al. | May 2013 | B2 |
8451751 | Challapali et al. | May 2013 | B2 |
8463195 | Shellhammer | Jun 2013 | B2 |
8467353 | Proctor | Jun 2013 | B2 |
8483155 | Banerjea et al. | Jul 2013 | B1 |
8494464 | Kadambe et al. | Jul 2013 | B1 |
8503955 | Kang et al. | Aug 2013 | B2 |
8504087 | Stanforth et al. | Aug 2013 | B2 |
8514729 | Blackwell | Aug 2013 | B2 |
8515473 | Mody et al. | Aug 2013 | B2 |
8520606 | Cleveland | Aug 2013 | B2 |
RE44492 | Mchenry | Sep 2013 | E |
8526974 | Olsson et al. | Sep 2013 | B2 |
8532686 | Schmidt et al. | Sep 2013 | B2 |
8538339 | Hu et al. | Sep 2013 | B2 |
8548521 | Hui et al. | Oct 2013 | B2 |
8554264 | Gibbons et al. | Oct 2013 | B1 |
8559301 | Mchenry et al. | Oct 2013 | B2 |
8565811 | Tan et al. | Oct 2013 | B2 |
8599024 | Bloy | Dec 2013 | B2 |
8718838 | Kokkeby et al. | May 2014 | B2 |
8761051 | Brisebois et al. | Jun 2014 | B2 |
8773966 | Petrovic et al. | Jul 2014 | B1 |
8780968 | Garcia et al. | Jul 2014 | B1 |
8798548 | Carbajal | Aug 2014 | B1 |
8805291 | Garcia et al. | Aug 2014 | B1 |
8818283 | McHenry et al. | Aug 2014 | B2 |
8824536 | Garcia et al. | Sep 2014 | B1 |
8843155 | Burton et al. | Sep 2014 | B2 |
8941491 | Polk et al. | Jan 2015 | B2 |
8977212 | Carbajal | Mar 2015 | B2 |
9007262 | Witzgall | Apr 2015 | B1 |
9008587 | Carbajal | Apr 2015 | B2 |
9078162 | Garcia et al. | Jul 2015 | B2 |
9143968 | Manku et al. | Sep 2015 | B1 |
9185591 | Carbajal | Nov 2015 | B2 |
9245378 | Villagomez et al. | Jan 2016 | B1 |
9288683 | Garcia et al. | Mar 2016 | B2 |
9356727 | Immendorf et al. | May 2016 | B2 |
9397619 | Lozhkin | Jul 2016 | B2 |
9412278 | Gong et al. | Aug 2016 | B1 |
9414237 | Garcia et al. | Aug 2016 | B2 |
9439078 | Menon et al. | Sep 2016 | B2 |
9529360 | Melamed | Dec 2016 | B1 |
9537586 | Carbajal | Jan 2017 | B2 |
9538040 | Goergen et al. | Jan 2017 | B2 |
9572055 | Immendorf et al. | Feb 2017 | B2 |
9635669 | Gormley et al. | Apr 2017 | B2 |
9658341 | Mathews et al. | May 2017 | B2 |
9674684 | Mendelson | Jun 2017 | B1 |
9674836 | Gormley et al. | Jun 2017 | B2 |
9686789 | Gormley et al. | Jun 2017 | B2 |
9715009 | Parker et al. | Jul 2017 | B1 |
9749069 | Garcia et al. | Aug 2017 | B2 |
9755972 | Mao et al. | Sep 2017 | B1 |
9767699 | Borghese et al. | Sep 2017 | B1 |
9769834 | Immendorf et al. | Sep 2017 | B2 |
9805273 | Seeber et al. | Oct 2017 | B1 |
9819441 | Immendorf et al. | Nov 2017 | B2 |
9858947 | Hearing et al. | Jan 2018 | B2 |
9862489 | Weinstein et al. | Jan 2018 | B1 |
9923700 | Gormley et al. | Mar 2018 | B2 |
9942775 | Yun et al. | Apr 2018 | B2 |
9989633 | Pandey et al. | Jun 2018 | B1 |
9998243 | Garcia et al. | Jun 2018 | B2 |
10027429 | Kiannejad | Jul 2018 | B1 |
10104559 | Immendorf et al. | Oct 2018 | B2 |
10157548 | Priest | Dec 2018 | B2 |
10194324 | Yun et al. | Jan 2019 | B2 |
10227429 | Watanabe et al. | Mar 2019 | B2 |
10241140 | Moinuddin | Mar 2019 | B2 |
10251242 | Rosen et al. | Apr 2019 | B1 |
10281570 | Parker et al. | May 2019 | B2 |
10389616 | Ryan et al. | Aug 2019 | B2 |
10393784 | Logan et al. | Aug 2019 | B2 |
10408936 | Van Voorst | Sep 2019 | B2 |
10459020 | Dzierwa et al. | Oct 2019 | B2 |
10540905 | Bohanan et al. | Jan 2020 | B2 |
10552738 | Holt et al. | Feb 2020 | B2 |
10594034 | Tran et al. | Mar 2020 | B1 |
10613209 | Emami et al. | Apr 2020 | B2 |
10642813 | Lazier et al. | May 2020 | B1 |
10698076 | Jones et al. | Jun 2020 | B2 |
10700721 | Ayala et al. | Jun 2020 | B2 |
10701574 | Gormley et al. | Jun 2020 | B2 |
10764718 | Boettcher | Sep 2020 | B1 |
10784974 | Menon | Sep 2020 | B2 |
10811771 | Tran et al. | Oct 2020 | B1 |
10907940 | Parker | Feb 2021 | B1 |
10916845 | Tran et al. | Feb 2021 | B2 |
10917797 | Menon et al. | Feb 2021 | B2 |
11012340 | Ryan et al. | May 2021 | B2 |
11035929 | Parker et al. | Jun 2021 | B2 |
11063653 | Ottersten et al. | Jul 2021 | B2 |
11190233 | Lo et al. | Nov 2021 | B2 |
11223431 | Garcia et al. | Jan 2022 | B2 |
11265652 | Kallai et al. | Mar 2022 | B2 |
11321282 | Tran | May 2022 | B2 |
11334807 | O'Shea et al. | May 2022 | B1 |
11336011 | Tran et al. | May 2022 | B2 |
11516071 | Karapantelakis et al. | Nov 2022 | B2 |
11637641 | Garcia et al. | Apr 2023 | B1 |
11663992 | Canberk et al. | May 2023 | B2 |
11671839 | Guo et al. | Jun 2023 | B2 |
11700304 | Brown, Jr. et al. | Jul 2023 | B2 |
11757185 | Tran et al. | Sep 2023 | B2 |
11777783 | Meirosu et al. | Oct 2023 | B2 |
11791913 | Garcia et al. | Oct 2023 | B2 |
11871103 | Kleinbeck | Jan 2024 | B2 |
11889351 | Tagg | Jan 2024 | B2 |
11910305 | Buyukdura | Feb 2024 | B2 |
20010000959 | Campana et al. | May 2001 | A1 |
20010005423 | Rhoads | Jun 2001 | A1 |
20010016503 | Kang | Aug 2001 | A1 |
20010020220 | Kurosawa | Sep 2001 | A1 |
20020044082 | Woodington et al. | Apr 2002 | A1 |
20020070889 | Griffin et al. | Jun 2002 | A1 |
20020097184 | Mayersak | Jul 2002 | A1 |
20020119754 | Wakutsu et al. | Aug 2002 | A1 |
20020161775 | Lasensky et al. | Oct 2002 | A1 |
20020173341 | Abdelmonem et al. | Nov 2002 | A1 |
20030013454 | Hunzinger | Jan 2003 | A1 |
20030083091 | Nuutinen et al. | May 2003 | A1 |
20030087648 | Mezhvinsky et al. | May 2003 | A1 |
20030104831 | Razavilar et al. | Jun 2003 | A1 |
20030144601 | Prichep | Jul 2003 | A1 |
20030145328 | Rabinowitz et al. | Jul 2003 | A1 |
20030198304 | Sugar et al. | Oct 2003 | A1 |
20030206640 | Malvar et al. | Nov 2003 | A1 |
20030224801 | Lovberg et al. | Dec 2003 | A1 |
20030232612 | Richards et al. | Dec 2003 | A1 |
20040001688 | Shen | Jan 2004 | A1 |
20040023674 | Miller | Feb 2004 | A1 |
20040127214 | Reddy et al. | Jul 2004 | A1 |
20040147254 | Reddy et al. | Jul 2004 | A1 |
20040171390 | Chitrapu | Sep 2004 | A1 |
20040203826 | Sugar et al. | Oct 2004 | A1 |
20040208238 | Thomas et al. | Oct 2004 | A1 |
20040219885 | Sugar et al. | Nov 2004 | A1 |
20040233100 | Dibble et al. | Nov 2004 | A1 |
20050003828 | Sugar et al. | Jan 2005 | A1 |
20050096026 | Chitrapu et al. | May 2005 | A1 |
20050107102 | Yoon et al. | May 2005 | A1 |
20050114023 | Williamson et al. | May 2005 | A1 |
20050152317 | Awater et al. | Jul 2005 | A1 |
20050159928 | Moser | Jul 2005 | A1 |
20050176401 | Nanda et al. | Aug 2005 | A1 |
20050185618 | Friday et al. | Aug 2005 | A1 |
20050192727 | Shostak et al. | Sep 2005 | A1 |
20050227625 | Diener | Oct 2005 | A1 |
20050285792 | Sugar et al. | Dec 2005 | A1 |
20060025118 | Chitrapu et al. | Feb 2006 | A1 |
20060047704 | Gopalakrishnan | Mar 2006 | A1 |
20060074558 | Williamson et al. | Apr 2006 | A1 |
20060080040 | Garczarek et al. | Apr 2006 | A1 |
20060111899 | Padhi et al. | May 2006 | A1 |
20060128311 | Tesfai | Jun 2006 | A1 |
20060199546 | Durgin | Sep 2006 | A1 |
20060235574 | Lapinski et al. | Oct 2006 | A1 |
20060238417 | Jendbro et al. | Oct 2006 | A1 |
20060258347 | Chitrapu | Nov 2006 | A1 |
20070049823 | Li | Mar 2007 | A1 |
20070076657 | Woodings et al. | Apr 2007 | A1 |
20070098089 | Li et al. | May 2007 | A1 |
20070111746 | Anderson | May 2007 | A1 |
20070149216 | Misikangas | Jun 2007 | A1 |
20070171889 | Kwon et al. | Jul 2007 | A1 |
20070203645 | Dees et al. | Aug 2007 | A1 |
20070223419 | Ji et al. | Sep 2007 | A1 |
20070233409 | Boyan et al. | Oct 2007 | A1 |
20070273581 | Garrison et al. | Nov 2007 | A1 |
20070293171 | Li et al. | Dec 2007 | A1 |
20070296591 | Frederick et al. | Dec 2007 | A1 |
20070297541 | Mcgehee | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080010040 | Mcgehee | Jan 2008 | A1 |
20080090563 | Chitrapu | Apr 2008 | A1 |
20080113634 | Gates et al. | May 2008 | A1 |
20080123731 | Wegener | May 2008 | A1 |
20080129367 | Murata et al. | Jun 2008 | A1 |
20080130519 | Bahl et al. | Jun 2008 | A1 |
20080133190 | Peretz et al. | Jun 2008 | A1 |
20080180325 | Chung et al. | Jul 2008 | A1 |
20080186235 | Struckman et al. | Aug 2008 | A1 |
20080195584 | Nath et al. | Aug 2008 | A1 |
20080209117 | Kajigaya | Aug 2008 | A1 |
20080211481 | Chen | Sep 2008 | A1 |
20080214903 | Orbach | Sep 2008 | A1 |
20080252516 | Ho et al. | Oct 2008 | A1 |
20080293353 | Mody et al. | Nov 2008 | A1 |
20090006103 | Koishida et al. | Jan 2009 | A1 |
20090011713 | Abusubaih et al. | Jan 2009 | A1 |
20090018422 | Banet et al. | Jan 2009 | A1 |
20090021420 | Sahinoglu | Jan 2009 | A1 |
20090046003 | Tung et al. | Feb 2009 | A1 |
20090046625 | Diener et al. | Feb 2009 | A1 |
20090066578 | Beadle et al. | Mar 2009 | A1 |
20090086993 | Kawaguchi et al. | Apr 2009 | A1 |
20090111463 | Simms et al. | Apr 2009 | A1 |
20090131067 | Aaron | May 2009 | A1 |
20090136052 | Hohlfeld et al. | May 2009 | A1 |
20090143019 | Shellhammer | Jun 2009 | A1 |
20090146881 | Mesecher | Jun 2009 | A1 |
20090149202 | Hill et al. | Jun 2009 | A1 |
20090190511 | Li et al. | Jul 2009 | A1 |
20090207950 | Tsuruta et al. | Aug 2009 | A1 |
20090224957 | Chung et al. | Sep 2009 | A1 |
20090245327 | Michaels | Oct 2009 | A1 |
20090278733 | Haworth | Nov 2009 | A1 |
20090282130 | Antoniou et al. | Nov 2009 | A1 |
20090285173 | Koorapaty et al. | Nov 2009 | A1 |
20090286563 | Ji et al. | Nov 2009 | A1 |
20090322510 | Berger et al. | Dec 2009 | A1 |
20100020707 | Woodings | Jan 2010 | A1 |
20100044122 | Sleeman et al. | Feb 2010 | A1 |
20100056200 | Tolonen | Mar 2010 | A1 |
20100075704 | Mchenry et al. | Mar 2010 | A1 |
20100109936 | Levy | May 2010 | A1 |
20100142454 | Chang | Jun 2010 | A1 |
20100150122 | Berger et al. | Jun 2010 | A1 |
20100172443 | Shim et al. | Jul 2010 | A1 |
20100173586 | Mchenry et al. | Jul 2010 | A1 |
20100176988 | Maezawa et al. | Jul 2010 | A1 |
20100177710 | Gutkin et al. | Jul 2010 | A1 |
20100220011 | Heuser | Sep 2010 | A1 |
20100253512 | Wagner et al. | Oct 2010 | A1 |
20100255794 | Agnew | Oct 2010 | A1 |
20100255801 | Gunasekara et al. | Oct 2010 | A1 |
20100259998 | Kwon et al. | Oct 2010 | A1 |
20100279680 | Reudink | Nov 2010 | A1 |
20100292930 | Koster et al. | Nov 2010 | A1 |
20100306249 | Hill et al. | Dec 2010 | A1 |
20100309317 | Wu et al. | Dec 2010 | A1 |
20100325621 | Andrade et al. | Dec 2010 | A1 |
20110022342 | Pandharipande et al. | Jan 2011 | A1 |
20110045781 | Shellhammer et al. | Feb 2011 | A1 |
20110053604 | Kim et al. | Mar 2011 | A1 |
20110059747 | Lindoff et al. | Mar 2011 | A1 |
20110070885 | Ruuska et al. | Mar 2011 | A1 |
20110074631 | Parker | Mar 2011 | A1 |
20110077017 | Yu et al. | Mar 2011 | A1 |
20110087639 | Gurney | Apr 2011 | A1 |
20110090939 | Diener et al. | Apr 2011 | A1 |
20110096770 | Henry | Apr 2011 | A1 |
20110102258 | Underbrink et al. | May 2011 | A1 |
20110111751 | Markhovsky et al. | May 2011 | A1 |
20110116484 | Henry | May 2011 | A1 |
20110117869 | Woodings | May 2011 | A1 |
20110122855 | Henry | May 2011 | A1 |
20110129006 | Jung et al. | Jun 2011 | A1 |
20110131260 | Mody | Jun 2011 | A1 |
20110134878 | Geiger et al. | Jun 2011 | A1 |
20110151876 | Ishii et al. | Jun 2011 | A1 |
20110183621 | Quan et al. | Jul 2011 | A1 |
20110183685 | Burton et al. | Jul 2011 | A1 |
20110185059 | Adnani et al. | Jul 2011 | A1 |
20110235728 | Karabinis | Sep 2011 | A1 |
20110237243 | Guvenc et al. | Sep 2011 | A1 |
20110241923 | Chernukhin | Oct 2011 | A1 |
20110273328 | Parker | Nov 2011 | A1 |
20110286555 | Cho et al. | Nov 2011 | A1 |
20110286604 | Matsuo | Nov 2011 | A1 |
20110287779 | Harper | Nov 2011 | A1 |
20110299481 | Kim et al. | Dec 2011 | A1 |
20110319120 | Chen et al. | Dec 2011 | A1 |
20120014332 | Smith et al. | Jan 2012 | A1 |
20120032854 | Bull et al. | Feb 2012 | A1 |
20120039284 | Barbieri et al. | Feb 2012 | A1 |
20120047544 | Bouchard | Feb 2012 | A1 |
20120052869 | Lindoff et al. | Mar 2012 | A1 |
20120058775 | Dupray et al. | Mar 2012 | A1 |
20120063302 | Damnjanovic et al. | Mar 2012 | A1 |
20120071188 | Wang et al. | Mar 2012 | A1 |
20120072986 | Livsics et al. | Mar 2012 | A1 |
20120077510 | Chen et al. | Mar 2012 | A1 |
20120078071 | Bohm et al. | Mar 2012 | A1 |
20120081248 | Kennedy et al. | Apr 2012 | A1 |
20120094681 | Freda et al. | Apr 2012 | A1 |
20120100810 | Oksanen et al. | Apr 2012 | A1 |
20120040602 | Charland | May 2012 | A1 |
20120105066 | Marvin et al. | May 2012 | A1 |
20120115522 | Nama et al. | May 2012 | A1 |
20120115525 | Kang et al. | May 2012 | A1 |
20120120892 | Freda et al. | May 2012 | A1 |
20120129522 | Kim et al. | May 2012 | A1 |
20120140236 | Babbitt et al. | Jun 2012 | A1 |
20120142386 | Mody et al. | Jun 2012 | A1 |
20120148068 | Chandra et al. | Jun 2012 | A1 |
20120148069 | Bai et al. | Jun 2012 | A1 |
20120155217 | Dellinger et al. | Jun 2012 | A1 |
20120169424 | Pinarello et al. | Jul 2012 | A1 |
20120182430 | Birkett et al. | Jul 2012 | A1 |
20120195269 | Kang et al. | Aug 2012 | A1 |
20120212628 | Wu et al. | Aug 2012 | A1 |
20120214511 | Vartanian et al. | Aug 2012 | A1 |
20120230214 | Kozisek et al. | Sep 2012 | A1 |
20120246392 | Cheon | Sep 2012 | A1 |
20120264388 | Guo et al. | Oct 2012 | A1 |
20120264445 | Lee et al. | Oct 2012 | A1 |
20120275354 | Villain | Nov 2012 | A1 |
20120281000 | Woodings | Nov 2012 | A1 |
20120282942 | Uusitalo et al. | Nov 2012 | A1 |
20120295575 | Nam | Nov 2012 | A1 |
20120302190 | Mchenry | Nov 2012 | A1 |
20120302263 | Tinnakornsrisuphap et al. | Nov 2012 | A1 |
20120309288 | Lu | Dec 2012 | A1 |
20120321024 | Wasiewicz et al. | Dec 2012 | A1 |
20120322487 | Stanforth | Dec 2012 | A1 |
20130005240 | Novak et al. | Jan 2013 | A1 |
20130005374 | Uusitalo et al. | Jan 2013 | A1 |
20130012134 | Jin et al. | Jan 2013 | A1 |
20130017794 | Kloper et al. | Jan 2013 | A1 |
20130023285 | Markhovsky et al. | Jan 2013 | A1 |
20130028111 | Dain et al. | Jan 2013 | A1 |
20130035108 | Joslyn et al. | Feb 2013 | A1 |
20130035128 | Chan et al. | Feb 2013 | A1 |
20130045754 | Markhovsky et al. | Feb 2013 | A1 |
20130052939 | Anniballi et al. | Feb 2013 | A1 |
20130053054 | Lovitt et al. | Feb 2013 | A1 |
20130062334 | Bilchinsky et al. | Mar 2013 | A1 |
20130064197 | Novak et al. | Mar 2013 | A1 |
20130064328 | Adnani et al. | Mar 2013 | A1 |
20130070639 | Demura et al. | Mar 2013 | A1 |
20130090071 | Abraham et al. | Apr 2013 | A1 |
20130095843 | Smith et al. | Apr 2013 | A1 |
20130100154 | Woodings et al. | Apr 2013 | A1 |
20130103684 | Yee et al. | Apr 2013 | A1 |
20130113659 | Morgan | May 2013 | A1 |
20130165051 | Li et al. | Jun 2013 | A9 |
20130165134 | Touag et al. | Jun 2013 | A1 |
20130165170 | Kang | Jun 2013 | A1 |
20130183989 | Hasegawa et al. | Jul 2013 | A1 |
20130183994 | Ringstroem et al. | Jul 2013 | A1 |
20130184022 | Schmidt | Jul 2013 | A1 |
20130190003 | Smith et al. | Jul 2013 | A1 |
20130190028 | Wang et al. | Jul 2013 | A1 |
20130196677 | Smith et al. | Aug 2013 | A1 |
20130208587 | Bala et al. | Aug 2013 | A1 |
20130210457 | Kummetz | Aug 2013 | A1 |
20130210473 | Weigand | Aug 2013 | A1 |
20130217406 | Villardi et al. | Aug 2013 | A1 |
20130217408 | Difazio et al. | Aug 2013 | A1 |
20130217450 | Kanj et al. | Aug 2013 | A1 |
20130231121 | Kwak et al. | Sep 2013 | A1 |
20130237212 | Khayrallah et al. | Sep 2013 | A1 |
20130242792 | Woodings | Sep 2013 | A1 |
20130242934 | Ueda et al. | Sep 2013 | A1 |
20130260703 | Actis et al. | Oct 2013 | A1 |
20130265198 | Stroud | Oct 2013 | A1 |
20130272436 | Makhlouf et al. | Oct 2013 | A1 |
20130279556 | Seller | Oct 2013 | A1 |
20130288734 | Mody et al. | Oct 2013 | A1 |
20130309975 | Kpodzo et al. | Nov 2013 | A1 |
20130315112 | Gormley et al. | Nov 2013 | A1 |
20130329690 | Kim et al. | Dec 2013 | A1 |
20130331114 | Gormley et al. | Dec 2013 | A1 |
20140003547 | Williams et al. | Jan 2014 | A1 |
20140015796 | Philipp | Jan 2014 | A1 |
20140018683 | Park et al. | Jan 2014 | A1 |
20140024405 | Qiu | Jan 2014 | A1 |
20140064723 | Adles et al. | Mar 2014 | A1 |
20140073261 | Hassan et al. | Mar 2014 | A1 |
20140086212 | Kafle et al. | Mar 2014 | A1 |
20140128032 | Muthukumar et al. | May 2014 | A1 |
20140139374 | Wellman et al. | May 2014 | A1 |
20140163309 | Bernhard et al. | Jun 2014 | A1 |
20140199993 | Dhanda et al. | Jul 2014 | A1 |
20140201367 | Trummer et al. | Jul 2014 | A1 |
20140204766 | Immendorf et al. | Jul 2014 | A1 |
20140206279 | Immendorf et al. | Jul 2014 | A1 |
20140206307 | Maurer et al. | Jul 2014 | A1 |
20140206343 | Immendorf et al. | Jul 2014 | A1 |
20140225590 | Jacobs | Aug 2014 | A1 |
20140256268 | Olgaard | Sep 2014 | A1 |
20140256370 | Gautier et al. | Sep 2014 | A9 |
20140269374 | Abdelmonem et al. | Sep 2014 | A1 |
20140269376 | Garcia et al. | Sep 2014 | A1 |
20140274103 | Steer et al. | Sep 2014 | A1 |
20140287100 | Libman | Sep 2014 | A1 |
20140301216 | Immendorf et al. | Oct 2014 | A1 |
20140302796 | Gormley et al. | Oct 2014 | A1 |
20140335879 | Immendorf et al. | Nov 2014 | A1 |
20140340684 | Edler et al. | Nov 2014 | A1 |
20140342675 | Massarella et al. | Nov 2014 | A1 |
20140348004 | Ponnuswamy | Nov 2014 | A1 |
20140362934 | Kumar | Dec 2014 | A1 |
20150016429 | Menon et al. | Jan 2015 | A1 |
20150023329 | Jiang et al. | Jan 2015 | A1 |
20150068296 | Lanza Di et al. | Mar 2015 | A1 |
20150072633 | Massarella et al. | Mar 2015 | A1 |
20150126181 | Breuer et al. | May 2015 | A1 |
20150133058 | Livis et al. | May 2015 | A1 |
20150150753 | Racette | Jun 2015 | A1 |
20150156827 | Ibragimov et al. | Jun 2015 | A1 |
20150201385 | Mercer et al. | Jul 2015 | A1 |
20150215794 | Gormley et al. | Jul 2015 | A1 |
20150215949 | Gormley et al. | Jul 2015 | A1 |
20150248047 | Chakraborty | Sep 2015 | A1 |
20150289254 | Garcia et al. | Oct 2015 | A1 |
20150289265 | Gormley et al. | Oct 2015 | A1 |
20150296386 | Menon et al. | Oct 2015 | A1 |
20150319768 | Abdelmonem et al. | Nov 2015 | A1 |
20160014713 | Kennedy et al. | Jan 2016 | A1 |
20160037550 | Barabell et al. | Feb 2016 | A1 |
20160050690 | Yun et al. | Feb 2016 | A1 |
20160073318 | Aguirre | Mar 2016 | A1 |
20160086621 | Hearing et al. | Mar 2016 | A1 |
20160095188 | Verberkt et al. | Mar 2016 | A1 |
20160117853 | Zhong et al. | Apr 2016 | A1 |
20160124071 | Baxley et al. | May 2016 | A1 |
20160127392 | Baxley et al. | May 2016 | A1 |
20160154406 | Im et al. | Jun 2016 | A1 |
20160198471 | Young et al. | Jul 2016 | A1 |
20160219506 | Pratt et al. | Jul 2016 | A1 |
20160219590 | Khawer et al. | Jul 2016 | A1 |
20160225240 | Voddhi et al. | Aug 2016 | A1 |
20160334527 | Xu et al. | Nov 2016 | A1 |
20160345135 | Garcia et al. | Nov 2016 | A1 |
20160364079 | Qiu et al. | Dec 2016 | A1 |
20160366685 | Gormley et al. | Dec 2016 | A1 |
20160374088 | Garcia et al. | Dec 2016 | A1 |
20170024767 | Johnson, Jr. et al. | Jan 2017 | A1 |
20170025996 | Cheung et al. | Jan 2017 | A1 |
20170039413 | Nadler | Feb 2017 | A1 |
20170048838 | Chrisikos et al. | Feb 2017 | A1 |
20170061690 | Laughlin et al. | Mar 2017 | A1 |
20170064564 | Yun et al. | Mar 2017 | A1 |
20170078792 | Simons | Mar 2017 | A1 |
20170079007 | Carbajal | Mar 2017 | A1 |
20170094527 | Shattil et al. | Mar 2017 | A1 |
20170134631 | Zhao et al. | May 2017 | A1 |
20170146462 | Baker et al. | May 2017 | A1 |
20170148332 | Ziemba et al. | May 2017 | A1 |
20170148467 | Franklin et al. | May 2017 | A1 |
20170192089 | Parker | Jul 2017 | A1 |
20170234979 | Mathews et al. | Aug 2017 | A1 |
20170237484 | Heath et al. | Aug 2017 | A1 |
20170238201 | Gormley et al. | Aug 2017 | A1 |
20170238203 | Dzierwa et al. | Aug 2017 | A1 |
20170243138 | Dzierwa et al. | Aug 2017 | A1 |
20170243139 | Dzierwa et al. | Aug 2017 | A1 |
20170248677 | Mahmood et al. | Aug 2017 | A1 |
20170250766 | Dzierwa et al. | Aug 2017 | A1 |
20170261604 | Van Voorst | Sep 2017 | A1 |
20170261613 | Van Voorst | Sep 2017 | A1 |
20170261615 | Mng et al. | Sep 2017 | A1 |
20170274992 | Chretien | Sep 2017 | A1 |
20170289840 | Sung et al. | Oct 2017 | A1 |
20170290075 | Carbajal et al. | Oct 2017 | A1 |
20170311307 | Negus et al. | Oct 2017 | A1 |
20170358103 | Shao et al. | Dec 2017 | A1 |
20170366361 | Afkhami et al. | Dec 2017 | A1 |
20170374572 | Kleinbeck et al. | Dec 2017 | A1 |
20170374573 | Kleinbeck et al. | Dec 2017 | A1 |
20180006730 | Kuo et al. | Jan 2018 | A1 |
20180014217 | Kleinbeck et al. | Jan 2018 | A1 |
20180024220 | Massarella et al. | Jan 2018 | A1 |
20180070362 | Ryan et al. | Mar 2018 | A1 |
20180081355 | Magy et al. | Mar 2018 | A1 |
20180083721 | Wada et al. | Mar 2018 | A1 |
20180129881 | Seeber | May 2018 | A1 |
20180149729 | Grandin et al. | May 2018 | A1 |
20180211179 | Dzierwa | Jul 2018 | A1 |
20180284758 | Cella et al. | Oct 2018 | A1 |
20180288620 | Jayawickrama et al. | Oct 2018 | A1 |
20180294901 | Garcia et al. | Oct 2018 | A1 |
20180313877 | Brant et al. | Nov 2018 | A1 |
20180313945 | Parker et al. | Nov 2018 | A1 |
20180324595 | Shima | Nov 2018 | A1 |
20180329020 | Hafizovic | Nov 2018 | A1 |
20180331863 | Carbajal | Nov 2018 | A1 |
20190004518 | Zhou et al. | Jan 2019 | A1 |
20190018103 | Qian et al. | Jan 2019 | A1 |
20190064130 | Kanazawa et al. | Feb 2019 | A1 |
20190064223 | Kincaid | Feb 2019 | A1 |
20190072601 | Dzierwa et al. | Mar 2019 | A1 |
20190074802 | Geha et al. | Mar 2019 | A1 |
20190077507 | Ferris et al. | Mar 2019 | A1 |
20190123428 | Packer et al. | Apr 2019 | A1 |
20190180630 | Kleinbeck | Jun 2019 | A1 |
20190191313 | Dzierwa et al. | Jun 2019 | A1 |
20190200303 | Nakahara | Jun 2019 | A1 |
20190208112 | Kleinbeck | Jul 2019 | A1 |
20190208491 | Dzierwa et al. | Jul 2019 | A1 |
20190215709 | Kleinbeck et al. | Jul 2019 | A1 |
20190223139 | Kleinbeck et al. | Jul 2019 | A1 |
20190230539 | Dzierwa et al. | Jul 2019 | A1 |
20190230540 | Carbajal et al. | Jul 2019 | A1 |
20190236266 | Nashimoto et al. | Aug 2019 | A1 |
20190245722 | Carbajal | Aug 2019 | A1 |
20190246304 | Dzierwa et al. | Aug 2019 | A1 |
20190253160 | Garcia et al. | Aug 2019 | A1 |
20190253905 | Kleinbeck et al. | Aug 2019 | A1 |
20190260768 | Mestha et al. | Aug 2019 | A1 |
20190274059 | Kleinbeck et al. | Sep 2019 | A1 |
20190296910 | Cheung | Sep 2019 | A1 |
20190302249 | High et al. | Oct 2019 | A1 |
20190342202 | Ryan et al. | Nov 2019 | A1 |
20190346571 | Furumoto | Nov 2019 | A1 |
20190360783 | Whittaker | Nov 2019 | A1 |
20190364533 | Kleinbeck et al. | Nov 2019 | A1 |
20200034620 | Lutterodt | Jan 2020 | A1 |
20200036459 | Menon | Jan 2020 | A1 |
20200036487 | Hammond et al. | Jan 2020 | A1 |
20200059800 | Menon et al. | Feb 2020 | A1 |
20200066132 | Kleinbeck | Feb 2020 | A1 |
20200067752 | DelMarco | Feb 2020 | A1 |
20200096548 | Dzierwa et al. | Mar 2020 | A1 |
20200107207 | Kleinbeck et al. | Apr 2020 | A1 |
20200120266 | Kleinbeck | Apr 2020 | A1 |
20200128418 | Dzierwa et al. | Apr 2020 | A1 |
20200137583 | Economy et al. | Apr 2020 | A1 |
20200142029 | Brooker | May 2020 | A1 |
20200145032 | Ayala et al. | May 2020 | A1 |
20200162890 | Spencer et al. | May 2020 | A1 |
20200169892 | Dzierwa et al. | May 2020 | A1 |
20200043346 | Vacek | Jun 2020 | A1 |
20200184832 | Kleinbeck | Jun 2020 | A1 |
20200196269 | Dzierwa et al. | Jun 2020 | A1 |
20200196270 | Kleinbeck et al. | Jun 2020 | A1 |
20200242603 | Salkintzis | Jul 2020 | A1 |
20200245167 | Kleinbeck et al. | Jul 2020 | A1 |
20200260306 | Kleinbeck et al. | Aug 2020 | A1 |
20200295855 | Kleinbeck et al. | Sep 2020 | A1 |
20200382961 | Shattil et al. | Dec 2020 | A1 |
20200388036 | Skrede | Dec 2020 | A1 |
20210067974 | Guo et al. | Mar 2021 | A1 |
20210082254 | Givant | Mar 2021 | A1 |
20210084217 | Kleinbeck | Mar 2021 | A1 |
20210211911 | Kleinbeck et al. | Jul 2021 | A1 |
20210250795 | Dzierwa et al. | Aug 2021 | A1 |
20210255356 | Vu | Aug 2021 | A1 |
20210280039 | Kleinbeck | Sep 2021 | A1 |
20210281510 | Brown, Jr. et al. | Sep 2021 | A1 |
20210306022 | Fernando et al. | Sep 2021 | A1 |
20210360423 | Dzierwa et al. | Nov 2021 | A1 |
20210360450 | Kleinbeck et al. | Nov 2021 | A1 |
20210360453 | Kleinbeck et al. | Nov 2021 | A1 |
20210360454 | Carbajal et al. | Nov 2021 | A1 |
20210409591 | Kleinbeck | Dec 2021 | A1 |
20220030541 | Dzierwa et al. | Jan 2022 | A1 |
20220052770 | Kleinbeck et al. | Feb 2022 | A1 |
20220128612 | Dzierwa et al. | Apr 2022 | A1 |
20220131623 | Garcia et al. | Apr 2022 | A1 |
20220150824 | Kleinbeck et al. | May 2022 | A1 |
20220174525 | Dzierwa et al. | Jun 2022 | A1 |
20220253407 | Tran | Aug 2022 | A1 |
20220262228 | Kleinbeck | Aug 2022 | A1 |
20220262261 | Kleinbeck | Aug 2022 | A1 |
20220286997 | Kleinbeck et al. | Sep 2022 | A1 |
20220376921 | Maria | Nov 2022 | A1 |
20230087729 | Goldstein et al. | Mar 2023 | A1 |
20230105718 | Carbajal | Apr 2023 | A1 |
20230114804 | Kleinbeck | Apr 2023 | A1 |
20230118723 | Carbajal et al. | Apr 2023 | A1 |
20230123375 | Dzierwa et al. | Apr 2023 | A1 |
20230126223 | Kleinbeck et al. | Apr 2023 | A1 |
20230209378 | Kleinbeck et al. | Jun 2023 | A1 |
20230232244 | Dzierwa et al. | Jul 2023 | A1 |
20230252744 | Miller et al. | Aug 2023 | A1 |
20230254054 | Garcia et al. | Aug 2023 | A1 |
20230254567 | Kleinbeck | Aug 2023 | A1 |
20230254702 | Damnjanovic et al. | Aug 2023 | A1 |
20230275791 | Carbajal | Aug 2023 | A1 |
20230276280 | Kleinbeck et al. | Aug 2023 | A1 |
20230308789 | Tian et al. | Sep 2023 | A1 |
20230308915 | Carbajal et al. | Sep 2023 | A1 |
20230326323 | Kleinbeck | Oct 2023 | A1 |
20230345441 | Baxley | Oct 2023 | A1 |
20230349962 | Dzierwa et al. | Nov 2023 | A1 |
20230378645 | Tran | Nov 2023 | A1 |
20230403564 | Dzierwa et al. | Dec 2023 | A1 |
20240007204 | Kleinbeck et al. | Jan 2024 | A1 |
20240023054 | Dzierwa et al. | Jan 2024 | A1 |
20240029572 | Kleinbeck | Jan 2024 | A1 |
20240031042 | Garcia et al. | Jan 2024 | A1 |
20240032084 | Hellwig | Jan 2024 | A1 |
20240097951 | Carbajal | Mar 2024 | A1 |
20240103059 | Dzierwa et al. | Mar 2024 | A1 |
20240114370 | Kleinbeck et al. | Apr 2024 | A1 |
Number | Date | Country |
---|---|---|
100248671 | Apr 2000 | KR |
20140041618 | Apr 2014 | KR |
953557 | Aug 1982 | SU |
2012129932 | Oct 2012 | WO |
2018184682 | Oct 2018 | WO |
Entry |
---|
“A Hardware Design for Time Delay Estimation of TDOA”; Li et al.; 2013 IEEE International Conference on Signal Processing, Communication and Computing (ICSPCC 2013); Aug. 2013 (Year: 2013). |
“A Low-Cost, Near-Real-Time Two-UAS-Based UWB Emitter Monitoring System”; Wang et al.; IEEE A&E Systems Magazine, Nov. 2015 (Year: 2015). |
“Joint TDOA and FDOA Estimation: A Conditional Bound and Its Use for Optimally Weighted Localization”; Yeredor et al.; IEEE Transactions on Signal Processing, vol. 59, No. 4, Apr. 2011 (Year: 2011). |
“Multipath TDOA and FDOA Estimation Using the EM Algorithm”; Belanger; Apr. 27, 1993; 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing (Year: 1993). |
“Noise Figure”, Wikipedia, located at https://en.wikipedia.org/wiki/Noise_figure (Year: 2022). |
“Signal Models for TDOA/FDOA Estimation”; Fowler et al.; IEEE Transactions on Aerospace and Electronic Systems vol. 44, No. 4 Oct. 2008 (Year: 2008). |
“Specific attenuation model for rain for use in prediction methods”, Recommendation ITU-R P.838-3 (Year: 2005). |
Bluetooth vs Zigbee—difference between Bluetooth and Zigbee (located at https://www.rfwireless-world.com/Terminology/Bluetooth-vs-zigbee.html) (Year: 2012). |
Boll, S. F., Suppression of Acoustic Noise in Speech Using Spectral Subtraction, Apr. 1979, IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-27, No. 2 (Year: 1979). |
David Eppink and Wolf Kuebler, “TIREM/SEM Handbook”, Mar. 1994, IIT Research Institute, p. 1-6, located at http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA296913. |
English translation of SU-953557-A1 (Year: 2024). |
Gabriel Garcia and Daniel Carbajal, U.S. Appl. No. 61/789,758, Provisional Patent Application, filed Mar. 15, 2013 (Specification, Claims, and Drawings). |
Gary L. Sugar, System and method for locating wireless devices in an unsynchronized wireless network, U.S. Appl. No. 60/319,737, Provisional Patent Application filed Nov. 27, 2002, Specification including the claims, abstract, and drawings. |
International Search Report and Written Opinion dated Jun. 21, 2018 issued by the International Application Division, Korean Intellectual Property Office as International Searching Authority in connection with International Application No. PCT/US2018/014504 (21 pages). |
Mehmet Ali Aygul, Ahmed Naeem, Huseyin Arslan. “Blind Signal Analysis—Wireless Communication Signals”, located at https://doi.org/10.1002/9781119764441.ch12 (Year: 2021). |
“Mobile Emitter Geolocation and Tracking Using TDOA and FDOA Measurements”; Musicki et al.; IEEE Transactions on Signal Processing, vol. 58, No. 3, Mar. 2010 (Year: 2010). |
S. Dorner, S. Cammerer, J. Hoydis and S. t. Brink, “Deep Learning Based Communication Over the Air,” in IEEE Journal of Selected Topics in Signal Processing, vol. 12, No. 1, pp. 132-143, Feb. 2018, doi: 10.1109/JSTSP.2017.2784180. |
Steven W. Smith, The Scientist & Engineer's Guide to Digital Signal Processing, 1999, California Technical Publishing, San Diego, California, 2nd Edition, p. 312 (located at http://www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_ch18.pdf) (Year: 1999). |
T. J. O'Shea, K. Karra and T. C. Clancy, “Learning to communicate: Channel auto-encoders, domain specific regularizers, and attention,” 2016 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Limassol, Cyprus, 2016, pp. 223-228, doi: 10.1109/ISSPIT.2016.7886039. |
T. O'Shea and J. Hoydis, “An Introduction to Deep Learning for the Physical Layer,” in IEEE Transactions on Cognitive Communications and Networking, vol. 3, No. 4, pp. 563-575, Dec. 2017, doi: 10.1109/TCCN.2017.2758370. |
IEEE 100 The Authoritative Dictionary of IEEE Standards Terms. Seventh Edition. Published by Standards Information Network IEEE Press. p. 6 (Year: 2000). |
Number | Date | Country |
---|---|---|
20240386800 A1 | Nov 2024 | US |
Number | Date | Country |
---|---|---|
62722420 | Aug 2018 | US | |
62632276 | Feb 2018 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 18374376 | Sep 2023 | US |
Child | 18428606 | | US |
Parent | 18142904 | May 2023 | US |
Child | 18374376 | | US |
Parent | 17991348 | Nov 2022 | US |
Child | 18142904 | | US |
Parent | 17735615 | May 2022 | US |
Child | 17991348 | | US |
Parent | 17190048 | Mar 2021 | US |
Child | 17735615 | | US |
Parent | 16732811 | Jan 2020 | US |
Child | 17190048 | | US |
Parent | 16275575 | Feb 2019 | US |
Child | 16732811 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 18775710 | Jul 2024 | US |
Child | 18785936 | | US |
Parent | 18428606 | Jan 2024 | US |
Child | 18775710 | | US |
Parent | 16274933 | Feb 2019 | US |
Child | 16275575 | | US |
Parent | 16180690 | Nov 2018 | US |
Child | 16274933 | | US |
Parent | 15412982 | Jan 2017 | US |
Child | 16180690 | | US |