The present invention relates to the discrimination of law enforcement radar signals from collision avoidance radar systems and other sources of radar signals.
Traditionally, radar detectors include circuitry or systems for identifying incoming radar signals and characterizing those signals by strength, direction or location, frequency and/or on/off patterns. This information is fed to a programmed digital signal processing system which applies frequency filters, location/direction filters, and/or database lookups that characterize known sources, all feeding through traditional logic if-then-else statements, state machines performing analysis of behavior over time, chirp compression filters, and similar logical systems to process the incoming signals. These methods have been developed over many years and have become increasingly complex as a greater number and diversity of non-law enforcement radar sources appear on the open roads, from such diverse sources as collision avoidance/driver assistance systems, roadside traffic sensors, door openers and other non-traffic fixed radar sources, and even poorly shielded radar detectors.
According to principles of the present invention, an artificial intelligence type neural network is applied to the problem of classification and discrimination of radar sources. Specifically, an artificial intelligence deep neural network classifies radar sources received into a radar detector.
To enhance the signal discrimination of the neural network architecture, the radar detector signal processing section employs unique methods for channel analysis. Specifically, the signal processing includes a subchannel processor for evaluating programmable sub-bands of a channel for enhanced pattern identification. The subchannel processor specifically incorporates multiple digital demultiplexing stages, producing N signal subchannels at selectable angular frequencies ω0 through ωN-1. The subchannels are then decimated from the original sampling frequency to a selectable lower sample frequency, e.g. using a cascaded integrator-comb filter, and then filtered, e.g. using a finite impulse response (FIR) filter. The resulting subchannels 0 to N−1 may then be delivered for neural network assessment. Importantly, the subchannel frequencies ω0 through ωN-1, the decimation rates and the FIR filter profiles may be selectively adjusted to emphasize relevant patterns to be extracted from a channel signal.
It will be appreciated that the neural network may consist of multiple deep layers including, but not limited to: convolutional layers, max pooling layers, dropout layers, fully connected layers, recurrent layers, and long short-term memory (LSTM) layers. LSTM layers may prove especially helpful for input data that undergoes meaningful temporal changes. The neural network may be multi-class, with an output class for each radar source that may be encountered. The neural network may also be multi-label, where each output class can be present simultaneously with any combination of the other output classes. In the multi-label case, multiple radar sources and threats can be present at the same time, and the neural network may discriminate and classify each radar source and treat each individually.
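The multi-class versus multi-label distinction above comes down to the output stage. A minimal sketch, with invented logits and class names: a per-class sigmoid head lets several sources fire at once, whereas a softmax head forces a single winner.

```python
# Multi-label vs. multi-class output heads. The logits and the class
# ordering (e.g. [law_enforcement, crash_avoidance, door_opener]) are
# illustrative assumptions, not values from the patent.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def softmax(zs):
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def multi_label_head(logits, threshold=0.5):
    """Each class is scored independently; any combination may be present."""
    return [sigmoid(z) >= threshold for z in logits]

# Two strong sources at once: the multi-label head reports both,
# while softmax probabilities must sum to 1 and favor one winner.
logits = [3.0, 2.5, -4.0]
flags = multi_label_head(logits)
probs = softmax(logits)
```

With these logits the sigmoid head flags the first two classes together, matching the scenario in which law enforcement radar and an interference source are present simultaneously.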
In one specific embodiment the neural network may consist of a reinforcement learning agent. A reinforcement learning agent samples its environment, makes multiple decisions, and is ultimately rewarded or penalized based on a reward function defined by the user.
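The reward function in the reinforcement learning embodiment is user-defined; one plausible shape, with entirely illustrative reward values, penalizes missed law enforcement signals most heavily and false alerts mildly, and rewards correct alerts and correct suppressions.

```python
# A hypothetical reward function for the reinforcement learning agent.
# The numeric values are arbitrary assumptions for illustration.
def reward(alerted, is_law_enforcement):
    if alerted and is_law_enforcement:
        return 1.0       # correct alert
    if not alerted and not is_law_enforcement:
        return 0.5       # correctly suppressed a nuisance source
    if alerted:
        return -0.5      # false alert annoys the user
    return -2.0          # missed a law enforcement signal: worst outcome

def update_value(value, r, alpha=0.1):
    """Incremental value estimate for the chosen action (Q-learning style)."""
    return value + alpha * (r - value)
```

The asymmetry (a miss costs more than a false alert) is a design choice a user could invert if false alerts were deemed more objectionable.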
In one specific embodiment, a supervised initial training of a neural network uses an initial set of known-origin signals, which may include those known to have originated from law enforcement radar and/or non-law enforcement sources. In one embodiment only signals originating from non-law enforcement radar sources are utilized. In training, the neural network develops neural pathways to recognize the known-origin radar signals, as well as subchannel parameters that enhance the distinctive characteristics of those signals in each channel.
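The supervised initial training step can be sketched at toy scale: a single logistic unit trained by gradient descent on binary cross-entropy separates two groups of known-origin feature vectors. The real system would train a deep network over subchannel data; the feature vectors, labels, learning rate, and epoch count here are all illustrative assumptions.

```python
# Toy supervised training on known-origin examples:
# label 1 = law enforcement origin, label 0 = non-law enforcement origin.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(features, labels, epochs=200, lr=0.5):
    """SGD on binary cross-entropy for a single logistic unit."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y                      # dLoss/dlogit for cross-entropy
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Hypothetical known-origin feature vectors (e.g. two subchannel powers).
X = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.8]]
y = [1, 1, 0, 0]
w, b = train(X, y)
```

After training, the learned weights score the law enforcement-like examples above 0.5 and the non-law enforcement-like examples below it, which is the discrimination the deployed network performs at scale.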
It will be appreciated that the known-origin signals may be collected via laboratory simulation of known publicly occurring radar signals or collection of publicly occurring radar signals gathered in field surveys of the target radar environment(s). In an alternative embodiment, the neural network may be trained in an unsupervised manner using signals of unknown origin which are typical publicly occurring signals. In the latter case the neural network will learn to distinguish the various signals and report the signal source. Alternatively, or in addition, the neural network may also learn to distinguish behaviors that are associated with the law enforcement-originated signals enabling the neural network to identify a signal as law enforcement originated. For example, law enforcement originated signals have an intermittent nature, tend to suddenly appear, and tend to correlate with driver behaviors such as slower speeds or braking when drivers recognize that speed monitoring is occurring. The neural network can identify these correlated effects and use them to identify a signal as law enforcement-originated without supervised training.
The initially trained neural network may then be deployed on a radar detector. Once deployed, the neural network may use its trained neural pathways and subchannel parameters to classify the source of incoming radar signals, and specifically to identify the source as either law enforcement radar or a non-law enforcement interference source. From this determination, the radar detector may filter false alert signals during operation and display the identity of a known law enforcement source, such as the make and model of a radar gun or other law enforcement radar source.
In one embodiment, the initially trained neural network further improves its operation by retraining neural pathways based upon characteristics of signals encountered by the neural network while operating in the field (known as “transfer learning”), so that those signals may be characterized as law enforcement-originated or non-law enforcement-originated based upon signal characteristics, vehicle behaviors and/or input from a user of the radar detector. These additional imprinted neural pathways of the radar detector may then improve discrimination of law enforcement-originated radar signals from non-law enforcement radar signals. This process may improve operation by imprinting neural pathways to recognize signals which are unique to a particular geographic area (such as a region in which different licensed bands or different law enforcement equipment is utilized), or which are unique to a particular deployment circumstance (such as the host automobile's on-board crash avoidance system).
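The field retraining step above is the classic transfer learning pattern: keep the factory-trained feature layers frozen and update only the final classification weights from examples (and user feedback) observed in the field. A minimal sketch, with an invented frozen layer and made-up weights:

```python
# Transfer learning sketch: frozen feature layer, trainable head.
# FROZEN_W and all numeric values are illustrative assumptions.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

FROZEN_W = [[0.8, -0.3], [-0.2, 0.9]]   # factory-trained layer, not updated

def features(x):
    """Frozen feature extraction learned during initial training."""
    return [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in FROZEN_W]

def field_update(head_w, head_b, x, label, lr=0.3):
    """Update only the head weights from one field-observed example."""
    f = features(x)
    p = sigmoid(sum(wi * fi for wi, fi in zip(head_w, f)) + head_b)
    err = p - label
    return ([wi - lr * err * fi for wi, fi in zip(head_w, f)],
            head_b - lr * err)

# A user repeatedly marking a signature as a false positive (label 0)
# retrains the head away from alerting on that signature.
w, b = [0.5, 0.5], 0.0
for _ in range(50):
    w, b = field_update(w, b, [1.0, 0.2], 0)
```

Because only the head changes, the detector adapts to a local interference source (or a regional band plan) without disturbing the general signal features learned at the factory.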
It will be recognized that the neural network may learn neural network weights to recognize any number of additional fixed or in-vehicle technologies which appear in the environment where the neural network is deployed. In this respect the use of artificial intelligence/machine learning/neural networks has a unique advantage as compared to traditional radar detector control strategies, which rely upon pre-programmed logic systems and can implement field learning only to the limited extent it is structured by the pre-programmed logic system.
In specific embodiments, the deployed neural networks on radar detectors in the field are subsequently connected to a host to provide details of the retrained neural pathways created therein to the host, such that the host may integrate those modified pathways into its existing neural networks, for improvement of the training of the neural network. Alternatively, or in addition, the connection to the host may permit the deployed radar neural network to be updated using improved training from the host developed in this manner or other manners.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with a general description of the invention given above, and the detailed description of the embodiments given below, serve to explain the principles of the invention.
In the following detailed description of several illustrative embodiments, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific preferred embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is understood that other embodiments may be utilized, and that logical, structural, mechanical, electrical, and chemical changes may be made without departing from the spirit or scope of the invention. To avoid detail not necessary to enable those skilled in the art to practice the embodiments described herein, the description may omit certain information known to those skilled in the art. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the illustrative embodiments is defined only by the appended claims.
The receiver section 12 receives radar signals 13 via one or several antennae such as horn antennae 13 shown in
The output(s) from the receiver section typically comprise an intermediate frequency demodulated signal representing the radio frequency content received in the antenna(e). This intermediate frequency content is typically in the form of a swept spectral representation of the content received by the antenna. Numerous methods have been developed and disclosed for the reception and demodulation of radio frequency signals from one or several antennae and will not be elaborated here. Importantly, the demodulated intermediate frequency signals are delivered to the controller 14 where they are converted to digital signals, typically in an intermediate frequency band, and then processed through digital signal processing 41 to provide a sequence of spectral content snapshots to the neural network 42, which may use the same to identify a signal from its frequency, and/or amplitude or frequency modulation.
Neural network 42 implements the machine learning/artificial intelligence principles, as described in greater detail in Patent Cooperation Treaty Application Serial No. PCT/US20/35532, filed Jun. 1, 2020, which is hereby incorporated herein in its entirety as if fully set forth herein. The neural network 42 uses digital signal processed data derived from the intermediate frequency demodulated signals from the receiver section, as well as (optionally) signals from the sensors 18. Neural network 42 responds to this content and its internal neural pathways to discriminate between incoming radar signals from non-law enforcement sources and those from law enforcement sources, and produce an output indicating the results of this analysis.
The output of the neural network is delivered to a user interface logic section 43 which assesses that output and determines whether to initiate an alert, such as a warning that radar determined to be from law enforcement sources has been identified. The warning information may usefully include make and model information for the known law enforcement radar source detected by the neural network.
Alerts are delivered to a user via the user interface 18, which is connected to and responsive to the user interface logic 43 of the controller 14. The user interface includes a display 51 for presenting information about the operation of the radar detector, including details about alerts identified by the controller 14, which are typically (but not necessarily) accompanied by a display readout and/or a speaker tone, chirp or spoken voice. The user interface further includes a keypad 52 for receiving commands from the user (adjusting settings of the radar detector, acknowledging and/or muting alerts, and optionally providing user feedback on the nature, law enforcement-originated or not, of signals which have triggered an alert). The keypad may take the form of a series of keys or buttons on the housing of the radar detector. Optionally, additional keys or buttons may be included on a remote device which may be positioned conveniently within the passenger cabin, or included on the detector power cord.
The detector power cord may usefully take the form of a USB3 power cord, which differs from a conventional USB3 power cord by the inclusion of one or more buttons the user may press to, for example, mute an alert, confirm an alert, or take other actions as defined by the device user interface.
The radar detector controller 14 is also optionally responsive to a variety of sensor inputs, including for example an accelerometer sensor 61 for detecting acceleration, braking, cornering or collisions of the vehicle in which the radar detector is deployed. Also optionally included are one or more camera sensor(s) which detect conditions around the vehicle, such as the nature of the roadway, weather conditions, proximity of traffic, the behavior of traffic, the visible presence of law enforcement vehicles, and the like. Further, optional proximity sensors can similarly detect proximity of traffic and vehicles, lane positioning of the vehicle, and potentially roadway parameters. Also, a microphone sensor may be included to detect cabin sounds and thereby determine the operational state of the vehicle, and an ambient temperature sensor can provide vehicle temperature information to characterize operation as well.
Additional information useful in determining the operational state of the vehicle can be gathered from the vehicle's data bus, which will generally provide a range of data regarding the operation of the vehicle's engine and main sensor systems. This data is optionally obtained in the illustrated embodiment via an On Board Diagnostics (OBD-II) connector 20 coupled to the data bus of the vehicle 21.
Finally, the detector may optionally interface with a host system, such as a central server, to acquire updates or deliver learned data to the host, or even to receive information on radar signals detected by other devices so that those signals can be fed to the neural network and used in development of neural pathways. This interface can be achieved in a variety of manners, such as via a network connector, USB connection, WiFi circuits, Bluetooth circuits, circuits supporting other wireless protocols such as Zigbee (IEEE 802.15.4), or circuits supporting cellular connectivity, all of which are schematically represented by interface 22, and connect via the public network cloud 23 to the host 24.
Further details of the neural network and its configuration may be found in the above-referenced Patent Cooperation Treaty patent application.
Referring now to
The overall processing activity of the system that is outlined above is orchestrated by a Data Interpreter 130 and System Coordinator 132 implemented in a System Processor. These modules generate Analysis Requests 134 for handling by the FPGA 115 and also originate Raster Requests 136 when a particular signal is to be analyzed by the Neural Network 42.
Analysis Requests are handled by the Command/Response Handler 140, which orchestrates tuning of the incoming signal via Tuning Controller 138 and further orchestrates analysis by a customized integrated circuit such as a Field Programmable Gate Array (FPGA) 115 via commands sent over an SPI interface.
Upon command, incoming signal information is acquired into FPGA 115 from the Analog/Digital Converter 114 under control of the ADC controller and ADC Calibration blocks of the FPGA 115. This data is then delivered as an AXIS stream under the control of the ADC controller/calibration blocks, to one, two or all of three processing engines:
Engine 1 performs Time Domain analysis of the incoming signal to develop RMS Sub-Band complex power data and instantaneous power change information.
Engine 2 performs Frequency Domain (Fourier Transform) analysis of the incoming signal.
Engine 3 performs Time-Frequency (Spectrogram) analysis of the incoming signal.
The outputs of Engines 1, 2 and 3 are delivered as a result AXIS stream via a stream selector and stream adapter to a packet framer which arranges the resulting data for delivery via a USB FIFO and controller to the USB interface of the Carrier Board for handling by the Command/Response Handler 140.
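The three engines listed above can be summarized in software form: Engine 1 computes time-domain power statistics, Engine 2 a frequency-domain spectrum, and Engine 3 a time-frequency spectrogram built from consecutive Engine-2 frames. A plain DFT stands in for the FPGA's transform hardware, and all block sizes are illustrative assumptions.

```python
# Software sketch of the three FPGA processing engines.
import cmath
import math

def engine1_time_domain(samples):
    """Engine 1: block RMS power plus instantaneous power change."""
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    delta = [samples[n] ** 2 - samples[n - 1] ** 2
             for n in range(1, len(samples))]
    return rms, delta

def engine2_frequency_domain(samples):
    """Engine 2: magnitude spectrum of the block (plain DFT stand-in)."""
    n = len(samples)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
                    for m, x in enumerate(samples)))
            for k in range(n)]

def engine3_spectrogram(samples, frame_len):
    """Engine 3: time-frequency view, one spectrum per consecutive frame."""
    return [engine2_frequency_domain(samples[i:i + frame_len])
            for i in range(0, len(samples) - frame_len + 1, frame_len)]
```

Running any or all of the three analyses on the same acquired block mirrors the AXIS-stream fan-out to "one, two or all of three processing engines" described above.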
In the event a signal is to be analyzed by Neural Network 42, a Raster Request 136 is delivered from System Coordinator via the SPI interface to the FPGA 115 which returns the requested Raster, which is then delivered to Neural Network 42 for analysis. The resulting analytical response from Neural Network 42 then informs the decision to initiate Alerts via the User Interface 16 (
It will be appreciated that each of the stages 160, 170, 180 and 190 is responsive to a sampling clock (CLK1, CLK2, CLK3) derived from the base sampling clock CLK0, thus maintaining digital phase lock throughout the process. Sampling clock CLK0 and other clock signals of the subchannel processor are produced by a multi-channel phase locked loop (PLL) 142 in response to a master clock signal 140.
It will be further appreciated that the subchannel demodulator shown in
While the present invention has been illustrated by a description of various embodiments and while these embodiments have been described in considerable detail, it is not the intention of the applicants to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of applicant's general inventive concept.
This application claims priority benefit of provisional application Ser. No. 63/119,064 filed Nov. 30, 2020, which is incorporated herein by reference in its entirety.