Programmable digital subchannel processor for radar detector

Information

  • Patent Grant
  • Patent Number
    12,313,763
  • Date Filed
    Monday, November 29, 2021
  • Date Issued
    Tuesday, May 27, 2025
Abstract
A radar detector employs parameterized subchannel analysis for discrimination of radar signals. Specifically, the signal processing of the radar detector includes a subchannel processor for evaluating programmable sub-bands of a channel for enhanced pattern identification. The subchannel processor specifically incorporates multiple digital demultiplexing stages, producing N signal subchannels at selectable frequencies. The subchannels are decimated from the original sampling frequency to a selectable lower sample frequency, e.g. using a cascaded integrator-comb filter, and then filtered, e.g. using a finite impulse response filter. The resulting subchannels 0 to N−1 are delivered for neural network assessment. Importantly, the subchannel frequencies, decimation rates and FIR filter profiles may be selectively adjusted by the control circuits to emphasize relevant patterns to be extracted by the neural network from a received radar signal.
Description
BACKGROUND OF THE INVENTION

The present invention relates to the discrimination of law enforcement radar signals from collision avoidance radar systems and other sources of radar signals.


Traditionally, radar detectors include circuitry or systems for identifying incoming radar signals and characterizing those signals by strength, direction or location, frequency and/or on/off patterns. This information is fed to a programmed digital signal processing system which applies frequency filters, location/direction filters, and/or database lookups that characterize known sources, all feeding through traditional logic if-then-else statements, state machines performing analysis of behavior over time, chirp compression filters, and similar logical systems to process the incoming signals. These methods have been developed over many years and have become increasingly complex as a greater number and diversity of non-law enforcement radar sources appear on the open roads, from such diverse sources as collision avoidance/driver assistance systems, roadside traffic sensors, door openers and other non-traffic fixed radar sources, and even poorly shielded radar detectors.


SUMMARY OF THE INVENTION

According to principles of the present invention, an artificial intelligence type neural network is applied to the problem of classification and discrimination of radar sources. Specifically, an artificial intelligence deep neural network classifies the radar sources received by a radar detector.


To enhance the signal discrimination of the neural network architecture, the radar detector signal processing section employs unique methods for channel analysis. Specifically, the signal processing includes a subchannel processor for evaluating programmable sub-bands of a channel for enhanced pattern identification. The subchannel processor specifically incorporates multiple digital demultiplexing stages, producing N signal subchannels at selectable angular frequencies ω0 through ωN-1. The subchannels are then decimated from the original sampling frequency to a selectable lower sample frequency, e.g. using a cascaded integrator-comb filter, and then filtered, e.g. using a finite impulse response filter. The resulting subchannels 0 to N−1 may then be delivered for neural network assessment. Importantly, the subchannel frequencies ω0 through ωN-1, decimation rates and FIR filter profiles may be selectively adjusted to emphasize relevant patterns to be extracted from a channel signal.


It will be appreciated that the neural network may consist of multiple deep layers including, but not limited to, convolutional layers, max pooling layers, dropout layers, fully connected layers, recurrent layers, and long short-term memory (LSTM) layers. LSTM layers may prove especially helpful for input data that undergoes meaningful temporal changes. The neural network may be multi-class, with an output class for each radar source that may be encountered. The neural network may also be multi-label, where each output class can be present concurrently with any combination of the other output classes. In the case of a multi-label neural network, multiple radar sources and threats can be present at the same time, and the neural network may discriminate and classify each radar source and treat each individually.
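
As an illustration only, the following is a minimal sketch of such a multi-label network in PyTorch, assuming the input is a batch of spectral snapshots arranged as (batch, subchannels, time); the layer sizes, class count and other hyperparameters are placeholders rather than values drawn from any particular embodiment.

# Minimal multi-label classifier sketch (PyTorch). Input: (batch, subchannels, time).
# All sizes below are illustrative placeholders.
import torch
import torch.nn as nn

class RadarSourceClassifier(nn.Module):
    def __init__(self, n_subchannels: int = 8, n_classes: int = 6):
        super().__init__()
        # Convolutional front end extracts local spectral patterns.
        self.features = nn.Sequential(
            nn.Conv1d(n_subchannels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Dropout(0.25),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM layer captures temporal evolution across snapshots.
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        # One logit per class; an independent sigmoid per class (rather than a
        # softmax) lets several sources be reported at once (multi-label).
        self.head = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x)           # (batch, 64, time/4)
        z = z.transpose(1, 2)          # (batch, time/4, 64)
        _, (h, _) = self.lstm(z)       # final hidden state
        return torch.sigmoid(self.head(h[-1]))

model = RadarSourceClassifier()
probs = model(torch.randn(4, 8, 256))  # (4, n_classes) per-class probabilities

Because each output passes through its own sigmoid, several radar sources may be flagged as present simultaneously, matching the multi-label behavior described above.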


In one specific embodiment the neural network may consist of a reinforcement learning agent. A reinforcement learning agent samples its environment, makes multiple decisions, and is ultimately rewarded or penalized based on a reward function defined by the user.
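
A toy sketch of this sample/decide/reward cycle is shown below; the observations, actions and reward function are hypothetical placeholders chosen only to illustrate the mechanism, not the agent of any embodiment.

# Toy epsilon-greedy reinforcement-learning loop: the agent observes a coarse
# signal feature, decides to alert or suppress, and receives a reward derived
# from (simulated) user feedback. Everything here is an illustrative stand-in.
import random

ACTIONS = ("alert", "suppress")
OBSERVATIONS = ("steady_tone", "intermittent_burst")
q = {(o, a): 0.0 for o in OBSERVATIONS for a in ACTIONS}   # action-value table
n = {(o, a): 0 for o in OBSERVATIONS for a in ACTIONS}
epsilon = 0.1

def sample_environment():
    # In this toy model, intermittent bursts are more often police radar.
    obs = random.choice(OBSERVATIONS)
    is_police = random.random() < (0.8 if obs == "intermittent_burst" else 0.1)
    return obs, is_police

def reward(action, is_police):
    # User feedback as reward: +1 for a correct decision, -1 otherwise.
    return 1.0 if (action == "alert") == is_police else -1.0

for _ in range(5000):
    obs, is_police = sample_environment()
    if random.random() < epsilon:                          # explore
        action = random.choice(ACTIONS)
    else:                                                  # exploit learned values
        action = max(ACTIONS, key=lambda a: q[(obs, a)])
    r = reward(action, is_police)
    n[(obs, action)] += 1
    q[(obs, action)] += (r - q[(obs, action)]) / n[(obs, action)]   # running mean

# After many steps, q favors "alert" for intermittent bursts and "suppress" otherwise.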


In one specific embodiment, a supervised initial training of the neural network uses an initial set of known-origin signals, which may include signals known to originate from law enforcement radar and/or non-law enforcement sources. In one embodiment, only signals originating from non-law enforcement radar sources are utilized. In training, the neural network develops neural pathways to recognize the known-origin radar signals, as well as subchannel parameters to enhance the distinctive characteristics of those signals in each channel.
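
As a sketch only, the following PyTorch loop shows one way such supervised initial training could proceed, using binary cross-entropy over multi-hot labels; the model, tensors and label dimensions are synthetic placeholders rather than actual known-origin signal data.

# Supervised initial training sketch (PyTorch) for a multi-label classifier.
# The feature vectors and multi-hot labels below are synthetic stand-ins for
# known-origin signals and their source labels.
import torch
import torch.nn as nn

n_features, n_classes = 256, 6
model = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, n_classes))

x_train = torch.randn(512, n_features)                   # placeholder signal features
y_train = (torch.rand(512, n_classes) > 0.8).float()     # placeholder multi-hot labels

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()          # binary cross-entropy suits multi-label

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()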


It will be appreciated that the known-origin signals may be collected via laboratory simulation of known publicly occurring radar signals or via collection of publicly occurring radar signals gathered in field surveys of the target radar environment(s). In an alternative embodiment, the neural network may be trained in an unsupervised manner using signals of unknown origin which are typical publicly occurring signals. In the latter case, the neural network will learn to distinguish the various signals and report the signal source. Alternatively, or in addition, the neural network may also learn to distinguish behaviors that are associated with law enforcement-originated signals, enabling the neural network to identify a signal as law enforcement-originated. For example, law enforcement-originated signals have an intermittent nature, tend to suddenly appear, and tend to correlate with driver behaviors such as slower speeds or braking when drivers recognize that speed monitoring is occurring. The neural network can identify these correlated effects and use them to identify a signal as law enforcement-originated without supervised training.


The initially trained neural network may then be deployed on a radar detector. Once deployed in the radar detector, the neural network may use its trained neural pathways and subchannel parameters to classify the source of incoming radar signals, and specifically to identify the source as either law enforcement radar or a non-law enforcement interference source. From this determination, the radar detector may filter false alert signals during operation and display the identity of a known law enforcement source, such as the make and model of a radar gun or other law enforcement radar source.


In one embodiment, the initially trained neural network further improves its operation by retraining neural pathways based upon characteristics of signals encountered by the neural network while operating in the field (known as “transfer learning”), so that those signals may be characterized as law enforcement-originated or non-law enforcement-originated based upon signal characteristics, vehicle behaviors and/or input from a user of the radar detector. These additional imprinted neural pathways of the radar detector may then improve discrimination of law enforcement-originated radar signals from non-law enforcement radar signals. This process may improve operation by imprinting neural pathways to recognize signals which are unique to a particular geographic area (such as a region in which different licensed bands or different law enforcement equipment is utilized), or which are unique to a particular deployment circumstance (such as the host automobile's on-board crash avoidance system).
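
As a sketch of this retraining, assuming the factory-trained network is a PyTorch model, a field update could freeze the previously learned feature layers and fine-tune only the output head on field-collected, user-labeled signals; the model shape, checkpoint file name and learning rate below are hypothetical.

# Field retraining ("transfer learning") sketch: keep factory-learned feature
# layers fixed and fine-tune only the output head on field data.
import torch
import torch.nn as nn

n_features, n_classes = 256, 6
model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),   # pretrained feature layers
    nn.Linear(64, n_classes),               # output head to retrain in the field
)
# model.load_state_dict(torch.load("factory_weights.pt"))  # hypothetical checkpoint

for p in model[0].parameters():             # freeze the factory-learned features
    p.requires_grad = False

# Optimize only the still-trainable parameters at a small learning rate, then run
# the same training loop as above on field-collected, user-labeled signals.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)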


It will be recognized that the neural network may learn neural network weights to recognize any number of additional fixed or in-vehicle technologies which appear in the environment where the neural network is deployed. In this respect, the use of artificial intelligence/machine learning/neural networks has a unique advantage as compared to traditional radar detector control strategies, which rely upon pre-programmed logic systems and can implement field learning only to the limited extent structured by the pre-programmed logic system.


In specific embodiments, the deployed neural networks on radar detectors in the field are subsequently connected to a host to provide details of the retrained neural pathways created therein to the host, such that the host may integrate those modified pathways into its existing neural networks, for improvement of the training of the neural network. Alternatively, or in addition, the connection to the host may permit the deployed radar neural network to be updated using improved training from the host developed in this manner or other manners.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with a general description of the invention given above, and the detailed description of the embodiments given below, serve to explain the principles of the invention.



FIG. 1 is a schematic view of a radar detector utilizing a neural network;



FIG. 2 is an information flow diagram of a field programmable gate array used for digital signal processing in the embodiment of FIG. 1.



FIG. 3 is a block diagram of a subchannel processing system of the FPGA of FIG. 2.





DETAILED DESCRIPTION OF THE INVENTION

In the following detailed description of several illustrative embodiments, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific preferred embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is understood that other embodiments may be utilized, and that logical, structural, mechanical, electrical, and chemical changes may be made without departing from the spirit or scope of the invention. To avoid detail not necessary to enable those skilled in the art to practice the embodiments described herein, the description may omit certain information known to those skilled in the art. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the illustrative embodiments is defined only by the appended claims.



FIG. 1 is a schematic view of a radar detector utilizing a neural network. The detector 10 generally comprises a receiver section 12, controller/filtering section 14, user interface 16, and (optionally) sensors 18.


The receiver section 12 receives radar signals 13 via one or several antennae such as horn antennae 13 shown in FIG. 1. These are fed through stripline, filtering and (optionally) low noise amplification stage 32. In one embodiment the signal is then fed to a demodulation stage 33 where the received radar signal is mixed with a local oscillator from a source 34. Although three antennae independently feeding three demodulation stages 33 using three local oscillator sources 34 are shown in FIG. 1, it will be appreciated that in other embodiments repetition of the front-end radio frequency electronics can be avoided by multiplexing between multiple antennae through a switch. Furthermore, principles of the present invention are applicable to radar detectors having only a single antenna as well as to embodiments featuring multiple antennae.


The output(s) from the receiver section typically comprise an intermediate frequency demodulated signal representing the radio frequency content received in the antenna(e). This intermediate frequency content is typically in the form of a swept spectral representation of the content received by the antenna. Numerous methods have been developed and disclosed for the reception and demodulation of radio frequency signals from one or several antennae and will not be elaborated here. Importantly, the demodulated intermediate frequency signals are delivered to the controller 14 where they are converted to digital signals, typically in an intermediate frequency band, and then processed through digital signal processing 41 to provide a sequence of spectral content snapshots to the neural network 42, which may use them to identify a signal from its frequency and/or its amplitude or frequency modulation.


Neural network 42 implements the machine learning/artificial intelligence principles, as described in greater detail in Patent Cooperation Treaty Application Serial No. PCT/US20/35532, filed Jun. 1, 2020, which is hereby incorporated herein in its entirety as if fully set forth herein. The neural network 42 uses digital signal processed data derived from the intermediate frequency demodulated signals from the receiver section, as well as (optionally) signals from the sensors 18. Neural network 42 responds to this content and its internal neural pathways to discriminate between incoming radar signals from non-law enforcement sources and those from law enforcement sources, and produce an output indicating the results of this analysis.


The output of the neural network is delivered to a user interface logic section 43 which assesses that output and determines whether to initiate an alert, such as a warning that radar determined to be from law enforcement sources has been identified. The warning information may usefully include make and model information for the known law enforcement radar source detected by the neural network.


Alerts are delivered to a user via the user interface 16, which is connected to and responsive to the user interface logic 43 of the controller 14. The user interface includes a display 51 for presenting information about the operation of the radar detector, including details about alerts which are identified by the controller 14, which are typically (but not necessarily) accompanied by a display readout and/or a speaker tone, chirp or spoken voice. The user interface further includes a keypad 52 for receiving commands from the user (adjusting settings of the radar detector, acknowledging and/or muting alerts, and optionally providing user feedback on the nature (law enforcement-originated or not) of signals which have triggered an alert). The keypad may take the form of a series of keys or buttons on the housing of the radar detector. Optionally, additional keys or buttons may be included on a remote device which may be positioned conveniently within the passenger cabin, or included on the detector power cord.


The detector power cord may usefully take the form of a USB3 power cord, which differs from a conventional USB3 power cord by the inclusion of one or more buttons the user may press to, for example, mute an alert, confirm an alert, or take other actions as defined by the device user interface.


The radar detector controller 14 is also optionally responsive to a variety of sensor inputs, including for example an accelerometer sensor 61 for detecting acceleration, braking, cornering or collisions of the vehicle in which the radar detector is deployed. Also optionally included are one or more camera sensor(s) which detect conditions around the vehicle, such as the nature of the roadway, weather conditions, proximity of traffic, the behavior of traffic, the visible presence of law enforcement vehicles, and the like. Further, optional proximity sensors can similarly detect proximity of traffic and vehicles, lane positioning of the vehicle, and potentially roadway parameters. Also, a microphone sensor may be included to detect cabin sounds and thereby determine the operational state of the vehicle, and an ambient temperature sensor can use vehicle temperature information to characterize operation as well.


Additional information useful in determining the operational state of the vehicle can be gathered from the vehicle's data bus, which will generally provide a range of data regarding the operation of the vehicle's engine and main sensor systems. This data is optionally obtained in the illustrated embodiment via an On-Board Diagnostics (OBD-II) connector 20 coupled to the data bus of the vehicle 21.


Finally, the detector may optionally interface with a host system, such as a central server, to acquire updates or deliver learned data to the host, or even to receive information on radar signals detected by other devices so that those signals can be fed to the neural network and used in development of neural pathways. This interface can be achieved in a variety of manners, such as via a network connector, USB connection, WiFi circuits, Bluetooth circuits, circuits supporting other wireless protocols such as Zigbee, or circuits supporting cellular connectivity, all of which are schematically represented by interface 22, and connect via the public network cloud 23 to the host 24.


Further details of the neural network and its configuration may be found in the above-referenced Patent Cooperation Treaty patent application.


Referring now to FIG. 2, the logical flow of signal information of the exemplary device of FIG. 1 is useful to understand the cooperative functioning of the system processor, peripheral circuits, FPGA, and neural network.


The overall processing activity of the system that is outlined above is orchestrated by a Data Interpreter 130 and System Coordinator 132 implemented in a System Processor. These modules generate Analysis Requests 134 for handling by the FPGA 115 and also originate Raster Requests 136 when a particular signal is to be analyzed by the Neural Network 42.


Analysis Requests are handled by the Command/Response Handler 140, which orchestrates tuning of the incoming signal via Tuning Controller 138 and further orchestrates analysis by a customized integrated circuit such as a Field Programmable Gate Array (FPGA) 115 via commands sent over an SPI interface.


Upon command, incoming signal information is acquired into FPGA 115 from the Analog/Digital Converter 114 under control of the ADC controller and ADC Calibration blocks of the FPGA 115. This data is then delivered as an AXIS stream under the control of the ADC controller/calibration blocks, to one, two or all of three processing engines:


Engine 1 performs Time Domain analysis of the incoming signal to develop RMS Sub-Band complex power data and instantaneous power change information.


Engine 2 performs Frequency Domain (Fourier Transform) analysis of the incoming signal.


Engine 3 performs Time-Frequency (Spectrogram) analysis of the incoming signal.


The outputs of Engines 1, 2 and 3 are delivered as a result AXIS stream via a stream selector and stream adapter to a packet framer, which arranges the resulting data for delivery via a USB FIFO and controller to the USB interface of the Carrier Board for handling by the Command/Response Handler 140.
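
For illustration, the following NumPy/SciPy sketch performs analyses analogous to the three engines on a block of digitized I/Q samples; the sample rate, block length, sub-band frequency and filter parameters are placeholders, and the FPGA implementation is not limited to these choices.

# Software analogue of the three analysis engines, applied to a stand-in I/Q block.
import numpy as np
from scipy.signal import firwin, lfilter, spectrogram

fs = 1.0e6                                                  # assumed sample rate (Hz)
x = np.random.randn(4096) + 1j * np.random.randn(4096)      # stand-in I/Q block

# Engine 1: time-domain analysis. One illustrative sub-band is mixed to DC and
# low-pass filtered before its RMS power is measured; instantaneous power change
# is the sample-to-sample difference of |x|^2.
lo = np.exp(-2j * np.pi * 100e3 * np.arange(len(x)) / fs)
subband = lfilter(firwin(63, 50e3, fs=fs), 1.0, x * lo)
rms_subband_power = np.sqrt(np.mean(np.abs(subband) ** 2))
power_change = np.diff(np.abs(x) ** 2)

# Engine 2: frequency-domain (Fourier transform) analysis.
spectrum = np.fft.fftshift(np.fft.fft(x * np.hanning(len(x))))
freqs = np.fft.fftshift(np.fft.fftfreq(len(x), d=1.0 / fs))

# Engine 3: time-frequency (spectrogram) analysis.
f, t, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=128, return_onesided=False)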


In the event a signal is to be analyzed by Neural Network 42, a Raster Request 136 is delivered from System Coordinator via the SPI interface to the FPGA 115 which returns the requested Raster, which is then delivered to Neural Network 42 for analysis. The resulting analytical response from Neural Network 42 then informs the decision to initiate Alerts via the User Interface 16 (FIG. 1). The user interface may also be used to define User Preferences for altering the operation of the system; e.g., by engaging or disengaging various band monitors or filters. Furthermore, in a learning mode, the User Interface can obtain information from a user indicating feedback on the appropriateness or quality of an alert determination in order to improve the operation of the Neural Network.



FIG. 3 illustrates a detailed implementation of an FPGA subchannel processor for evaluating programmable sub-bands of a channel for enhanced pattern identification. The subchannel processor 150 develops N digital subchannel signals, each representing a subchannel of the in-phase and quadrature (I/Q) output of an Analog to Digital Converter (ADC) 144 and representing signal characteristics in one of N selected subchannel frequencies. The subchannel frequencies ω0 through ωN-1 are produced by N complex direct digital synthesizers 160; these subchannel frequencies are used to demodulate the I/Q signal from ADC 144, using N complex multipliers 170. The resulting demodulated subchannel signals are then respectively decimated and low-pass filtered by an N subchannel Cascaded Integrator-Comb (CIC) filter decimator stage 180, which decimates each demodulated subchannel signal by a factor of M, producing N decimated subchannel signals. These N decimated subchannel signals are then filtered by an N subchannel Finite Impulse Response (FIR) compensation filter 190, producing a filtered output for each of subchannels 0 through N−1.
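
For illustration, the following NumPy sketch models this subchannel chain, with a complex direct digital synthesizer and multiplier, a CIC decimator and an FIR stage per subchannel; the subchannel frequencies, decimation factor M and filter taps are placeholders (a plain low-pass stands in for the CIC compensation response) rather than values of the FPGA implementation.

# NumPy model of the N-subchannel chain of FIG. 3: DDS mix to baseband,
# CIC decimation by M, then FIR filtering, for each selected subchannel.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 1.0e6                                          # assumed I/Q sample rate (Hz)
M = 16                                              # decimation factor per subchannel
subchannel_freqs = [50e3, 150e3, 250e3, 350e3]      # example selections for ω0..ωN-1 (Hz)

def cic_decimate(x, R, stages=3):
    """3-stage cascaded integrator-comb decimator with unit differential delay."""
    y = x.astype(np.complex128)
    for _ in range(stages):
        y = np.cumsum(y)                            # integrators at the input rate
    y = y[::R]                                      # rate reduction by R
    for _ in range(stages):
        y = np.diff(y, prepend=0.0)                 # combs at the output rate
    return y / float(R) ** stages                   # normalize the DC gain

def subchannel_bank(iq, freqs, R):
    n = np.arange(len(iq))
    fir = firwin(31, 0.4)                           # compensation stand-in (plain low-pass)
    outputs = []
    for f in freqs:                                 # one DDS + multiplier + CIC + FIR per subchannel
        dds = np.exp(-2j * np.pi * f * n / fs)      # complex DDS at the subchannel frequency
        mixed = iq * dds                            # complex multiplier shifts the sub-band to DC
        outputs.append(lfilter(fir, 1.0, cic_decimate(mixed, R)))
    return outputs                                  # subchannels 0 .. N-1

iq = np.random.randn(1 << 14) + 1j * np.random.randn(1 << 14)   # stand-in ADC I/Q
subchannels = subchannel_bank(iq, subchannel_freqs, M)

In this model the synthesizer frequencies, the decimation factor and the FIR taps are the adjustable quantities, mirroring the programmability of stages 160, 180 and 190 discussed below.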


It will be appreciated that each of the stages 160, 170, 180 and 190 is responsive to a sampling clock (CLK1, CLK2 or CLK3) derived from the base sampling clock CLK0, thus maintaining digital phase lock throughout the process. Sampling clock CLK0 and the other clock signals of the subchannel processor are produced by a multi-channel phase locked loop (PLL) 142 in response to a master clock signal 140.


It will be further appreciated that the subchannel demodulator shown in FIG. 3 is fully programmable as appropriate for extracting distinctive elements from each channel under study, through the adjustment of (a) the subchannel demodulation frequencies ω0 through ωN-1 of the synthesizers 160, (b) the characteristics of the CIC decimator 180 and (c) the characteristics of the FIR filters 190. The appropriate parameters for isolating, decimating and filtering desirable subchannels can be developed during training and neural network assessment using any of the above-described methods. By such a learning technique, the subchannel frequencies ω0 through ωN-1, decimation rates and FIR filter profiles may be selectively adjusted to emphasize relevant patterns to be extracted from a channel signal to accurately identify a radar source of interest.


While the present invention has been illustrated by a description of various embodiments and while these embodiments have been described in considerable detail, it is not the intention of the applicants to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of applicant's general inventive concept.

Claims
  • 1. A radar detector, comprising a. a radar receiver for receiving and characterizing the signal characteristics of a received radar signal, wherein the received radar signal includes signals generated remotely by one or more law enforcement sources, or signals generated remotely by one or more non-law enforcement sources, or both, the radar receiver using a local oscillator that is not synchronized to any of said sources to provide an output representative of the received radar signal comprising one or more of radar frequency, radar intensity and radar direction, b. a control system coupled to the radar receiver and receiving the output thereof, the control system comprising a neural network structured in multiple layers each processing signal characteristics delivered thereto to develop neural pathways associated with the distinguishing signatures of the signal characteristics of remotely generated signals from law enforcement sources when provided to the neural network, and c. a digital signal processing section processing N subchannels of the output of the radar receiver, where N is greater than 1, the digital signal processing section comprising an analog to digital converter, a demultiplexing stage comprising N synthesizers and N multipliers, a decimation stage comprising N decimators, and a filtering stage comprising N compensation filters, the corresponding synthesizer, multiplier, decimator and filter operating to produce subchannels of the radar receiver output, the digital signal processing section coupled to the control system to provide subchannels of the radar receiver output to the neural network, wherein the control system identifies the presence, or absence, of a remotely generated signal from a law enforcement source based upon the distinguishing signatures of signal characteristics associated with neural pathways within the neural network, and communicates the result of the identification as a radar detection output.
  • 2. The radar detector of claim 1 wherein the analog to digital converter is an in phase and quadrature (I/Q) converter producing an I/Q digital output, and the N multipliers are complex multipliers.
  • 3. The radar detector of claim 1 wherein parameters of at least one synthesizer, decimator, or filter for at least one subchannel are adjustable.
  • 4. The radar detector of claim 3 wherein the control system provides parameters for adjustment of at least one synthesizer, decimator, or filter for at least one subchannel in response to learned characteristics of signals to be identified by the neural network.
  • 5. The radar detector of claim 4 wherein the neural network develops neural pathways associated with distinguishing characteristics relating to signals provided to the neural network and parameters for at least one subchannel to emphasize those characteristics.
  • 6. The radar detector of claim 3 wherein at least one kernel of a convolutional layer of said neural network is trained to detect one or more of specific frequencies of a radar signal, specific combinations of different frequencies of a radar signal, specific combinations of different frequencies at different relative amplitudes of a radar signal, and specific transients, or changes in frequency and amplitude, of a radar signal, and produces parameters for adjustment of at least one synthesizer, decimator, or filter for at least one subchannel in response thereto.
  • 7. The radar detector of claim 1 further comprising a display for communicating radar signal information to a user, the display identifying a make and/or model of an identified known law enforcement radar source.
RELATED APPLICATIONS

This application claims priority benefit of provisional application Ser. No. 63/119,064 filed Nov. 30, 2020, which is incorporated herein by reference in its entirety.

Provisional Applications (1)
Number Date Country
63119064 Nov 2020 US