The information provided in this section is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
The present disclosure relates to enhanced radar object detection via dynamic and static Doppler spectrum partitioning.
A vehicle may include a driver assistance system that relies on sensors for blind spot detection, adaptive cruise control, lane departure warnings, etc. In some cases, the sensors may include a radar device (e.g., a long-range or a short-range radar device) to detect objects in the vicinity of the vehicle. Specifically, the radar device detects the objects by emitting waves (e.g., electromagnetic waves, radio waves, etc.) and then detecting reflected waves from the objects.
A vehicle system for enhancing object detection in a vehicle includes at least one radar device configured to detect signals reflected by objects in a vicinity of the vehicle, and a control module in communication with the at least one radar device. The control module is configured to generate a radar spectrum based on the signals detected by the at least one radar device, partition the radar spectrum into static reflections and dynamic reflections separate from the static reflections, extract features from the static reflections with a first machine learning module, extract features from the dynamic reflections with a second machine learning module different than the first machine learning module, merge the extracted features from the static reflections and the extracted features from the dynamic reflections, and detect static objects and dynamic objects in the vicinity of the vehicle based on the merged features from the static reflections and the dynamic reflections.
In other features, the first machine learning module and the second machine learning module are separate deep neural networks.
In other features, the control module is configured to determine Doppler bins for the static reflections based on a velocity of the vehicle.
In other features, the control module is configured to separate the determined Doppler bins for the static reflections from the radar spectrum to partition the radar spectrum into static reflections and dynamic reflections.
In other features, the radar spectrum is a radar tensor formed of range values, angle values, and Doppler values, and each Doppler bin associates one of the range values and one of the angle values at the velocity of the vehicle.
In other features, the control module is configured to estimate the velocity of the vehicle based on data from the radar tensor.
In other features, the vehicle system further includes a sensor configured to detect the velocity of the vehicle.
In other features, the control module is configured to receive one or more signals from the sensor indicative of the velocity of the vehicle.
In other features, the control module is configured to generate a static feature map based on the extracted features from the static reflections, and generate a dynamic feature map based on the extracted features from the dynamic reflections.
In other features, the static feature map includes static feature vectors. Each static feature vector is associated with a learnable feature identified by the first machine learning module.
In other features, the dynamic feature map includes dynamic feature vectors. Each dynamic feature vector is associated with a learnable feature identified by the second machine learning module.
In other features, the control module is configured to concatenate the static feature map and the dynamic feature map to merge the extracted features from the static reflections and the extracted features from the dynamic reflections.
In other features, the vehicle system further includes a vehicle control module in communication with the control module. The vehicle control module is configured to receive one or more signals from the control module indicative of the detected static objects and dynamic objects.
In other features, the vehicle control module is configured to control at least one vehicle control system based on the one or more signals.
In other features, the vehicle control module is configured to generate a map including the detected static objects and dynamic objects in the vicinity of the vehicle based on the one or more signals.
In other features, a vehicle includes the vehicle system.
A method for enhancing object detection includes generating a radar spectrum based on signals detected by at least one radar device, partitioning the radar spectrum into static reflections and dynamic reflections separate from the static reflections, extracting features from the static reflections with a first machine learning module, extracting features from the dynamic reflections with a second machine learning module different than the first machine learning module, merging the extracted features from the static reflections and the extracted features from the dynamic reflections, and detecting static objects and dynamic objects based on the merged features from the static reflections and the dynamic reflections.
In other features, partitioning the radar spectrum includes determining Doppler bins for the static reflections.
In other features, partitioning the radar spectrum includes separating the determined Doppler bins for the static reflections from the radar spectrum.
In other features, the method further includes generating a static feature map based on the extracted features from the static reflections, and generating a dynamic feature map based on the extracted features from the dynamic reflections.
In other features, the static feature map includes static feature vectors. Each static feature vector is associated with a learnable feature identified by the first machine learning module.
In other features, the dynamic feature map includes dynamic feature vectors. Each dynamic feature vector is associated with a learnable feature identified by the second machine learning module.
In other features, merging the extracted features from the static reflections and the extracted features from the dynamic reflections includes concatenating the static feature map and the dynamic feature map.
Further areas of applicability of the present disclosure will become apparent from the detailed description, the claims and the drawings. The detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the disclosure.
The present disclosure will become more fully understood from the detailed description and the accompanying drawings.
In the drawings, reference numbers may be reused to identify similar and/or identical elements.
A radar device emits waves and then detects reflected waves from both static and dynamic objects. The radar device or a control module associated with the device then processes data from the reflected waves to obtain a radar spectrum for the objects. The radar spectrum may include data in the form of a three-dimensional radar tensor with two dimensions having range values and angle values (e.g., range-azimuth space) and the third dimension having Doppler values with velocity information (e.g., velocity of the objects relative to the radar device). In such examples, the radar tensor combines data for both static and dynamic objects. However, detecting objects by processing a radar tensor that mixes both types of reflections often results in limited detection performance. For instance, interference between static and dynamic objects may be induced due to a wide spreading function of the radar device, and different processing approaches (e.g., filtering) may be necessary due to the distinct shapes and Doppler spreads of the static and dynamic objects.
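As a non-limiting illustration, such a range-azimuth-Doppler radar tensor may be represented as a three-dimensional array. The following sketch shows this layout; the names and dimensions below are merely illustrative assumptions.

    import numpy as np

    # Illustrative (assumed) dimensions of a range-azimuth-Doppler radar tensor.
    NUM_RANGE_BINS = 256     # range axis
    NUM_ANGLE_BINS = 128     # azimuth (angle) axis
    NUM_DOPPLER_BINS = 64    # Doppler axis (relative velocity information)

    # Each cell holds the reflected energy for one (range, angle, Doppler)
    # combination; static and dynamic reflections are mixed in this one tensor.
    radar_tensor = np.zeros(
        (NUM_RANGE_BINS, NUM_ANGLE_BINS, NUM_DOPPLER_BINS), dtype=np.float32
    )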
The systems and methods according to the present disclosure provide technical solutions for enhancing radar object detection by partitioning or dividing a radar spectrum into distinct static and dynamic components, independently extracting features from each component with differently trained machine learning modules (e.g., deep neural networks, etc.), and then merging the extracted features from the static and dynamic components. In doing so, static and dynamic objects may be more accurately detected as compared to conventional radar systems. For example, the systems and methods herein enhance the accuracy of object bounding boxes and velocity estimations in radar applications, thereby delivering improved results as compared to conventional radar systems that process the entire radar tensor. As such, the systems and methods herein provide improved object detection at long distances and in adverse weather conditions with high angular resolution, which in turn increases the coverage and achievable driving speed of autonomous driving systems.
Referring now to
Although not shown in
In various embodiments, the vehicle system 100 of
With continued reference to
Then, in various embodiments, a radar spectrum is generated based on the signals detected by at least one of the radar devices 104, 106, 108. For example, the control module 102 may receive the radar data, process the data, and form the radar spectrum based on the radar data. In other examples, a control module associated with one or more of the radar devices 104, 106, 108 may process the radar data, form the radar spectrum based on the radar data, and transmit the radar spectrum to the control module 102. In such examples, the radar spectrum generated by the control module 102 or received by the control module 102 may be a radar tensor formed of range values, angle values, and Doppler values, as is conventional.
For example,
Then, the control module 102 of
In various embodiments, the control module 102 can partition the radar spectrum based on a velocity of the vehicle (e.g., the vehicle 200 of
In other examples, the control module 102 may estimate the velocity of the vehicle if, for example, the velocity sensor 110 is unavailable. In such examples, the velocity may be estimated based on data from the radar tensor. For example, Doppler values close to those expected for the vehicle's own velocity may be used. In some examples, the static reflection points of the radar tensor may be detected by clustering Doppler values of detection points. In such examples, the largest cluster may be taken as the static reflection points, since static reflections typically dominate the scene. For example, the vehicle's velocity may be estimated according to a set of detection points (e.g., a cluster) of static reflections, and the estimated velocity may be broken into components (e.g., vx, vy) which may then be used to partition the radar spectrum into static reflections and dynamic reflections.
As one example, equations 1-3 below represent an approach to estimate the vehicle's velocity based on a cluster of Doppler values of detection points. In equations 1-3, f represents a vector of Doppler values f_0, f_1, ..., f_N, H represents a matrix formed from the angle values θ_0, θ_1, ..., θ_N corresponding to the Doppler values f_0, f_1, ..., f_N, T represents a transpose operator on the matrix H, and v represents the estimated velocity based on the matrix H and the corresponding Doppler values. For example, the Doppler values of the static cluster may be stacked into the vector f, the corresponding angle values may be arranged row-wise into the matrix H (with any constant scale factor, such as the 2/λ factor of equation 4 below, absorbed into H), and the velocity v may be obtained as the least-squares solution of f = Hv:

f = [f_0, f_1, ..., f_N]^T (1)

H = [[cos θ_0, sin θ_0], [cos θ_1, sin θ_1], ..., [cos θ_N, sin θ_N]] (2)

v = (H^T H)^(-1) H^T f (3)
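As a further non-limiting illustration, the following sketch estimates the vehicle's velocity from a cluster of static detections using the least-squares solution of equations 1-3. The function name and the Doppler model f_n = (2/λ)(vx cos θ_n + vy sin θ_n) folded into H are assumptions for illustration only.

    import numpy as np

    def estimate_ego_velocity(doppler_values, angle_values, wavelength):
        """Least-squares ego-velocity estimate from a cluster of static detections.

        doppler_values: Doppler values f_0..f_N of the static cluster.
        angle_values: corresponding angle values theta_0..theta_N (radians).
        wavelength: wavelength of the transmitted radar signal.
        Returns the estimated velocity components (vx, vy).
        """
        f = np.asarray(doppler_values, dtype=float)
        theta = np.asarray(angle_values, dtype=float)
        # Each static detection satisfies f_n ~ (2 / wavelength) * (vx cos theta_n
        # + vy sin theta_n), so stacking all detections gives f = H v
        # (equations 1 and 2, with the 2/wavelength factor absorbed into H).
        H = (2.0 / wavelength) * np.column_stack((np.cos(theta), np.sin(theta)))
        # v = (H^T H)^(-1) H^T f, computed as a numerically stable
        # least-squares solve (equation 3).
        v, *_ = np.linalg.lstsq(H, f, rcond=None)
        return v  # (vx, vy)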
The control module 102 can then partition the radar spectrum by determining or calculating Doppler bins for the static reflections based on the velocity of the vehicle. In various embodiments, the control module 102 may include a Doppler bin calculation module 306 of
As one example, the Doppler bin calculation module 306 may implement equation 4 below for determining a Doppler bin. In equation 4, f represents a Doppler value, λ represents a wavelength of the transmitted radar signal, vx and vy represent components of the vehicle's velocity, and θ represents an angle value:

f = (2/λ)(vx cos θ + vy sin θ) (4)

In such examples, the angle value may be scanned for each range. For instance, the angle may be scanned from −50 degrees to 50 degrees for one range value, from −50 degrees to 50 degrees for another range value, etc. Then, for each scanned angle, the Doppler value may be obtained based on the velocity components.
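Continuing the illustration, the following non-limiting sketch applies equation 4 to determine, for each scanned angle, which Doppler bin of the radar tensor should contain static reflections. The mapping from Doppler values to bin indices via bin centers is an illustrative assumption.

    import numpy as np

    def static_doppler_bins(angle_grid, vx, vy, wavelength, doppler_bin_centers):
        """Expected static-reflection Doppler bin for each scanned angle value.

        angle_grid: angle values scanned for a given range (radians), e.g., the
            equivalent of -50 degrees to 50 degrees.
        vx, vy: components of the vehicle's (sensed or estimated) velocity.
        wavelength: wavelength of the transmitted radar signal.
        doppler_bin_centers: Doppler value at the center of each tensor bin.
        """
        angle_grid = np.asarray(angle_grid, dtype=float)
        doppler_bin_centers = np.asarray(doppler_bin_centers, dtype=float)
        # Equation 4: Doppler value of a static reflection observed at angle theta.
        f = (2.0 / wavelength) * (vx * np.cos(angle_grid) + vy * np.sin(angle_grid))
        # Map each expected Doppler value to the nearest Doppler bin of the tensor.
        return np.argmin(np.abs(f[:, None] - doppler_bin_centers[None, :]), axis=1)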
With continued reference to
Then, once the Doppler bins for the static reflections are known, the control module 102 can separate the determined Doppler bins for the static reflections to partition the radar spectrum into static reflections and dynamic reflections, as shown in
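As a non-limiting sketch of this separation, the static Doppler bins computed above may be moved into one partition, with the remaining bins forming the dynamic partition. For brevity, the sketch assumes one static bin per angle applied across all ranges; in practice, a small window of bins around each computed value may be separated to account for Doppler spread.

    import numpy as np

    def partition_spectrum(radar_tensor, static_bins):
        """Split a (range, angle, Doppler) tensor into static and dynamic parts.

        static_bins[a] is the static Doppler bin index for angle a, e.g., as
        computed by static_doppler_bins() in the sketch above.
        """
        static_part = np.zeros_like(radar_tensor)
        dynamic_part = radar_tensor.copy()
        angles = np.arange(radar_tensor.shape[1])
        # Move the static Doppler bins into the static partition...
        static_part[:, angles, static_bins] = radar_tensor[:, angles, static_bins]
        # ...and zero them in the dynamic partition; all remaining bins are dynamic.
        dynamic_part[:, angles, static_bins] = 0.0
        return static_part, dynamic_part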
With continued reference to
In various embodiments, different and/or independent filters may be employed on the static and dynamic paths if desired. Additionally, if desired, a set number of maximum values may be taken along the Doppler domain to reduce an input dimension.
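One possible reading of this reduction, shown as a non-limiting sketch, is to retain only a fixed number of the strongest responses along the Doppler axis for each range-angle cell; the value of k below is an illustrative assumption.

    import numpy as np

    def reduce_doppler(partition, k=4):
        """Keep the k strongest Doppler responses per (range, angle) cell."""
        # Sort along the Doppler axis and keep the k largest values (top-k).
        return np.sort(partition, axis=-1)[..., -k:]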
In some examples, the control module 102 of
In various embodiments, the control module 102 merges the extracted features from the static reflections and the extracted features from the dynamic reflections. For example, a merger module 324 of
The control module 102 may then generate a new map of vectors based on the merged features. For example, the control module 102 may include a multilayer perceptron module 326 of
Then, the control module 102 detects static objects and dynamic objects in the vicinity of the vehicle based on the merged features from the static reflections and the dynamic reflections. For instance, the control module 102 may process the output generated from the combined features to detect static and dynamic objects. In various embodiments, the control module 102 may employ conventional techniques to detect the objects. For example, the control module 102 may include an object detection module 328 of
In various embodiments, the control module 102 of
In some examples, the detected static objects and dynamic objects may be shown on a map in the vehicle. For example, the vehicle control module 112 may generate a map including the detected static and/or dynamic objects in the vicinity of the vehicle based on received signals from the control module 102. The vehicle control module 112 may then output signals indicative of the map to the display module 114 (e.g., a display in the vehicle), which can then display the map for the driver and/or other passengers in the vehicle. In other examples, the display module 114 may receive the signals indicative of the detected static objects and/or dynamic objects from the control module 102, generate the map with the detected static and/or dynamic objects in the vicinity of the vehicle based on received signals, and display the map.
As shown, the control process 400 begins at 402 where the control module 102 determines whether any radar signals reflected by objects are detected or otherwise received. If no, control may return to 402 as shown in
At 404, the control module 102 generates or receives a radar spectrum based on the reflected signals detected by one or more radar devices (e.g., the radar devices 104, 106, 108 of
At 406, the control module 102 determines Doppler bins for static reflections based on a velocity of the vehicle. In such examples, the velocity may be a sensed value from a suitable sensor in the vehicle or an estimated value. Additionally, the control module 102 may implement equation 4 when determining the Doppler bins for the static reflections, as explained above. Control then proceeds to 408.
At 408, the control module 102 partitions the radar spectrum into static reflection Doppler bins and dynamic reflection Doppler bins. For example, the control module 102 may separate the determined Doppler bins for the static reflections from the radar spectrum. Then, the remaining Doppler bins not identified as static reflections may be Doppler bins for the dynamic reflections. Control then proceeds to 410, 412.
At 410, 412, the control module 102 independently extracts features from the static reflections and the dynamic reflections with different machine learning modules. More specifically, the control module 102 employs a trained machine learning module (e.g., a deep neural network or another suitable machine learning module) for extracting features from the static reflections at 410, and another trained machine learning module (e.g., a deep neural network or another suitable machine learning module) for extracting features from the dynamic reflections at 412. In such examples, each machine learning module may be trained on different labeled data sets for known detected static and dynamic objects, as explained above. Control then proceeds to 414.
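As a non-limiting sketch, the two machine learning modules may be implemented as two separately trained convolutional encoders, one per partition. The architecture and dimensions below are hypothetical; the disclosure requires only two differently trained modules (e.g., deep neural networks).

    import torch.nn as nn

    class ReflectionEncoder(nn.Module):
        """Small convolutional encoder for one partition (hypothetical architecture)."""

        def __init__(self, in_channels, feature_dim=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, feature_dim, kernel_size=3, padding=1), nn.ReLU(),
            )

        def forward(self, x):
            # x: (batch, doppler_channels, range, angle) -> map of learnable
            # feature vectors: (batch, feature_dim, range, angle)
            return self.net(x)

    # Separate networks trained on different labeled data sets, as described above.
    static_encoder = ReflectionEncoder(in_channels=4)   # e.g., reduced Doppler inputs
    dynamic_encoder = ReflectionEncoder(in_channels=4)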
At 414, the control module 102 merges the extracted features from the static reflections and the extracted features from the dynamic reflections. In such examples, the control module 102 may merge (e.g., concatenate, etc.) a static feature map and a dynamic feature map representing at least the extracted features, as explained above. Control then proceeds to 416.
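Continuing the sketch, the merge at 414 may concatenate the static and dynamic feature maps along the channel dimension before a multilayer perceptron mixes them into a new map of vectors. The 1x1 convolution below, which applies the same perceptron at every range-angle location, is an illustrative assumption.

    import torch
    import torch.nn as nn

    FEATURE_DIM = 64  # assumed width of each encoder's feature vectors (see above)

    # A 1x1 convolution applies the same multilayer perceptron at each
    # range-angle location of the concatenated feature map.
    merge_mlp = nn.Sequential(
        nn.Conv2d(2 * FEATURE_DIM, 128, kernel_size=1), nn.ReLU(),
        nn.Conv2d(128, 128, kernel_size=1),
    )

    def merge_features(static_features, dynamic_features):
        # Concatenate the static and dynamic feature maps along the channel axis...
        merged = torch.cat((static_features, dynamic_features), dim=1)
        # ...and produce a new map of merged feature vectors for object detection.
        return merge_mlp(merged)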
At 416, the control module 102 determines whether any static and/or dynamic objects are detected based on the merged features. For example, and as explained above, the control module 102 may process the combined features to detect static and dynamic objects. In such examples, a conventional algorithm may be used to scan the combined features to detect objects. If no object is detected, control may return to 402 as shown in
At 418, the control module 102 generates one or more signals indicative of the detected static and/or dynamic objects, as explained above. In such examples, the control module 102 may transmit the signal(s) to a vehicle control module (e.g., the vehicle control module 112 of
At 420, a vehicle action is initiated based on the signal(s). For example, the vehicle control module may generate a control signal for controlling one or more vehicle control systems based on the detected static objects and/or dynamic objects, as explained above. Additionally and/or alternatively, a display module (e.g., the display module 114 of
The systems and methods described herein enhance radar object detection over conventional systems and methods. For example,
The foregoing description is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. The broad teachings of the disclosure can be implemented in a variety of forms. Therefore, while this disclosure includes particular examples, the true scope of the disclosure should not be so limited since other modifications will become apparent upon a study of the drawings, the specification, and the following claims. It should be understood that one or more steps within a method may be executed in different order (or concurrently) without altering the principles of the present disclosure. Further, although each of the embodiments is described above as having certain features, any one or more of those features described with respect to any embodiment of the disclosure can be implemented in and/or combined with features of any of the other embodiments, even if that combination is not explicitly described. In other words, the described embodiments are not mutually exclusive, and permutations of one or more embodiments with one another remain within the scope of this disclosure.
Spatial and functional relationships between elements (for example, between modules, circuit elements, semiconductor layers, etc.) are described using various terms, including “connected,” “engaged,” “coupled,” “adjacent,” “next to,” “on top of,” “above,” “below,” and “disposed.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the above disclosure, that relationship can be a direct relationship where no other intervening elements are present between the first and second elements, but can also be an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. As used herein, the phrase at least one of A, B, and C should be construed to mean a logical (A OR B OR C), using a non-exclusive logical OR, and should not be construed to mean “at least one of A, at least one of B, and at least one of C.”
In the figures, the direction of an arrow, as indicated by the arrowhead, generally demonstrates the flow of information (such as data or instructions) that is of interest to the illustration. For example, when element A and element B exchange a variety of information but information transmitted from element A to element B is relevant to the illustration, the arrow may point from element A to element B. This unidirectional arrow does not imply that no other information is transmitted from element B to element A. Further, for information sent from element A to element B, element B may send requests for, or receipt acknowledgements of, the information to element A.
In this application, including the definitions below, the term “module” or the term “controller” may be replaced with the term “circuit.” The term “module” may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
The module may include one or more interface circuits. In some examples, the interface circuits may include wired or wireless interfaces that are connected to a local area network (LAN), the Internet, a wide area network (WAN), or combinations thereof. The functionality of any given module of the present disclosure may be distributed among multiple modules that are connected via interface circuits. For example, multiple modules may allow load balancing. In a further example, a server (also known as remote, or cloud) module may accomplish some functionality on behalf of a client module.
The term code, as used above, may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
The term memory circuit is a subset of the term computer-readable medium. The term computer-readable medium, as used herein, does not encompass transitory electrical or electromagnetic signals propagating through a medium (such as on a carrier wave); the term computer-readable medium may therefore be considered tangible and non-transitory. Non-limiting examples of a non-transitory, tangible computer-readable medium are nonvolatile memory circuits (such as a flash memory circuit, an erasable programmable read-only memory circuit, or a mask read-only memory circuit), volatile memory circuits (such as a static random access memory circuit or a dynamic random access memory circuit), magnetic storage media (such as an analog or digital magnetic tape or a hard disk drive), and optical storage media (such as a CD, a DVD, or a Blu-ray Disc).
The apparatuses and methods described in this application may be partially or fully implemented by a special purpose computer created by configuring a general purpose computer to execute one or more particular functions embodied in computer programs. The functional blocks, flowchart components, and other elements described above serve as software specifications, which can be translated into the computer programs by the routine work of a skilled technician or programmer.
The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may encompass a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services, background applications, etc.
The computer programs may include: (i) descriptive text to be parsed, such as HTML (hypertext markup language), XML (extensible markup language), or JSON (JavaScript Object Notation), (ii) assembly code, (iii) object code generated from source code by a compiler, (iv) source code for execution by an interpreter, (v) source code for compilation and execution by a just-in-time compiler, etc. As examples only, source code may be written using syntax from languages including C, C++, C#, Objective-C, Swift, Haskell, Go, SQL, R, Lisp, Java®, Fortran, Perl, Pascal, Curl, OCaml, JavaScript®, HTML5 (Hypertext Markup Language 5th revision), Ada, ASP (Active Server Pages), PHP (PHP: Hypertext Preprocessor), Scala, Eiffel, Smalltalk, Erlang, Ruby, Flash®, Visual Basic®, Lua, MATLAB, SIMULINK, and Python®.