The subject disclosure relates to vehicular radar systems and, in particular, to a system and method for increasing an angular resolution of a vehicular radar array using a motion of the vehicle.
An autonomous vehicle can navigate with respect to an object in its environment by detecting the object and determining a trajectory that avoids the object. Detection can be performed by various detection systems, one of which is a radar system employing one or more radar antennae. An angular resolution of a radar antenna is limited due to its aperture size, which is generally a few centimeters. The angular resolution can be increased by using an array of antennae spanning a wider aperture. However, the dimension of the vehicle limits the dimension of the antenna array, thereby limiting its angular resolution. Accordingly, it is desirable to provide a system and method for operating an antenna array of a vehicle that extends its angular resolution beyond the limits imposed by the dimensions of the vehicle.
In one exemplary embodiment, a method of operating a vehicle is disclosed. A plurality of observations of an object are received at an extended radar array formed by moving a radar array of the vehicle through a selected distance. The plurality of observations is input to a neural network to generate a network output signal. An object parameter of the object with respect to the vehicle is determined from the network output signal. The vehicle is operated based on the object parameter of the object.
In addition to one or more of the features described herein, the method further includes obtaining the plurality of observations at each of a plurality of locations of the radar array as the radar array moves through the selected distance. The method further includes inputting the plurality of observations to the neural network to generate a plurality of features and combining the plurality of features to obtain the network output signal. The neural network includes a plurality of convolution networks, each convolution network receiving a respective observation from the plurality of observations and generating a respective feature of the plurality of features. The method further includes training the neural network by determining values of weights of the neural network that minimize a loss function including the network output signal and a reference signal. The reference signal is generated by coherently combining the plurality of observations over time based on a known relative distance between the radar array and the object during a relative motion between the vehicle and the object. The reference signal includes a product of an observation received from the extended radar array and a synthetic response based on angles and ranges recorded for the observation.
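The per-observation convolution networks and the concatenation of their features can be illustrated with a minimal numerical sketch. The network sizes, kernel banks, and observation length below are hypothetical placeholders rather than values from this disclosure, and simple NumPy convolutions stand in for trained convolution networks:

```python
import numpy as np

rng = np.random.default_rng(0)

N_OBS = 4    # observations collected as the radar array moves (hypothetical)
OBS_LEN = 64 # samples per observation (hypothetical)
N_FILT = 8   # filters per per-observation convolution network (hypothetical)

def conv_network(x, kernels):
    """One per-observation 'convolution network': valid 1-D convolutions
    followed by a ReLU nonlinearity, yielding a feature vector."""
    feats = [np.convolve(x, k, mode="valid") for k in kernels]
    return np.maximum(np.concatenate(feats), 0.0)

# One independent kernel bank per observation, mirroring the disclosure's
# "each convolution network receiving a respective observation".
kernel_banks = [rng.standard_normal((N_FILT, 5)) for _ in range(N_OBS)]
observations = [rng.standard_normal(OBS_LEN) for _ in range(N_OBS)]

# Concatenate the per-observation features into the network output signal.
features = [conv_network(x, kb) for x, kb in zip(observations, kernel_banks)]
network_output = np.concatenate(features)
```

In a trained system the kernel weights would be learned by minimizing the loss against the reference signal; here they are random, so only the data flow is demonstrated.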
In another exemplary embodiment, a system for operating a vehicle is disclosed. The system includes an extended radar array, a processor and a controller. The extended radar array is formed by moving a radar array of the vehicle through a selected distance. The processor is configured to receive a plurality of observations of an object from the extended radar array, operate a neural network to generate a network output signal based on the plurality of observations, and determine an object parameter of the object with respect to the vehicle from the network output signal. The controller operates the vehicle based on the object parameter of the object.
In addition to one or more of the features described herein, the extended radar array obtains the plurality of observations at each of a plurality of locations of the radar array as the radar array moves through the selected distance. The processor is further configured to operate the neural network to generate a plurality of features based on the plurality of observations and to operate a concatenation module to combine the plurality of features to obtain the network output signal. The neural network includes a plurality of convolution networks, each convolution network configured to receive a respective observation from the plurality of observations and generate a respective feature of the plurality of features. The processor is further configured to train the neural network by determining values of weights of the neural network that minimize a loss function including the network output signal and a reference signal. The processor is further configured to generate the reference signal by coherently combining the plurality of observations over time based on a known relative distance between the radar array and the object during a relative motion between the vehicle and the object. The processor is further configured to generate the reference signal from a product of an observation received from the extended radar array and a synthetic response based on angles and ranges recorded for the observation.
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes an extended radar array, a processor and a controller. The extended radar array is formed by moving a radar array of the vehicle through a selected distance. The processor is configured to receive a plurality of observations of an object from the extended radar array, operate a neural network to generate a network output signal, and determine an object parameter of the object with respect to the vehicle from the network output signal. The controller operates the vehicle based on the object parameter of the object.
In addition to one or more of the features described herein, the extended radar array obtains the plurality of observations at each of a plurality of locations of the radar array as the radar array moves through the selected distance. The processor is further configured to operate the neural network to generate a plurality of features based on inputting the plurality of observations and operate a concatenation module to combine the plurality of features to obtain the network output signal. The processor is further configured to train the neural network by determining values of weights of the neural network that minimize a loss function including the network output signal and a reference signal. The processor is further configured to generate the reference signal by coherently combining the plurality of observations over time based on a known relative distance between the radar array and the object during a relative motion between the vehicle and the object. The processor is further configured to generate the reference signal from a product of an observation received from the extended radar array and a synthetic response based on angles and ranges recorded for the observation.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features. As used herein, the term module refers to processing circuitry that may include an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In accordance with an exemplary embodiment, the autonomous vehicle 10 generally includes at least a navigation system 20, a propulsion system 22, a transmission system 24, a steering system 26, a brake system 28, a sensor system 30, an actuator system 32, and a controller 34. The navigation system 20 determines a trajectory plan for automated driving of the autonomous vehicle 10. The propulsion system 22 provides power for creating a motive force for the autonomous vehicle 10 and can, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 24 is configured to transmit power from the propulsion system 22 to two or more wheels 16 of the autonomous vehicle 10 according to selectable speed ratios. The steering system 26 influences a position of the two or more wheels 16. While depicted as including a steering wheel 27 for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 26 may not include a steering wheel 27. The brake system 28 is configured to provide braking torque to the two or more wheels 16.
The sensor system 30 includes a radar system 40 that senses objects in an exterior environment of the autonomous vehicle 10 and provides various radar parameters useful in determining object parameters of one or more objects 50, such as the positions and relative velocities of various remote vehicles in the environment of the autonomous vehicle. Such radar parameters can be provided to the navigation system 20. In operation, the transmitter 42 of the radar system 40 sends out a radio frequency (RF) source signal 48 that is reflected back toward the autonomous vehicle 10 by one or more objects 50 in the field of view of the radar system 40 as one or more reflected echo signals 52, which are received at the receiver 44. The one or more echo signals 52 can be used to determine various object parameters of the one or more objects 50, such as a range of the object, a Doppler frequency or relative radial velocity of the object, an azimuth angle, etc. The sensor system 30 includes additional sensors, such as digital cameras, for identifying road features, etc.
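As context for the radar parameters just listed, an object's range and relative radial velocity follow from the echo's round-trip delay and Doppler shift. The sketch below is illustrative only; the 77 GHz carrier frequency is an assumption typical of automotive radar, not a value stated in this disclosure:

```python
C = 299_792_458.0   # speed of light, m/s
F_CARRIER = 77e9    # automotive radar carrier frequency (assumption)

def range_from_delay(round_trip_s):
    """Target range from the echo's round-trip delay: R = c*t/2."""
    return C * round_trip_s / 2.0

def radial_velocity_from_doppler(doppler_hz):
    """Relative radial velocity from the measured Doppler shift,
    v = c*fd/(2*fc); positive Doppler indicates a closing target."""
    return C * doppler_hz / (2.0 * F_CARRIER)
```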
The navigation system 20 builds a trajectory for the autonomous vehicle 10 based on radar parameters from the radar system 40 and any other relevant parameters. The controller 34 can provide the trajectory to the actuator system 32 to control the propulsion system 22, transmission system 24, steering system 26, and/or brake system 28 in order to navigate the autonomous vehicle 10 with respect to the object 50.
The controller 34 includes a processor 36 and a computer-readable storage device or computer-readable storage medium 38. The computer-readable storage medium 38 includes programs or instructions 39 that, when executed by the processor 36, operate the autonomous vehicle based at least on radar parameters and other relevant data. The computer-readable storage medium 38 may further include programs or instructions 39 that, when executed by the processor 36, determine a state of the object 50 in order to allow the autonomous vehicle to drive with respect to the object.
The radars (202a, 202b, 202c) are substantially aligned along a baseline 204 of the radar array 202. A length of the baseline 204 is defined by a distance from one end of the radar array 202 to an opposite end of the radar array. Although the baseline 204 can be a straight line, in other embodiments the radars (202a, 202b, 202c) are located along a baseline that follows a curved surface, such as a front surface of the autonomous vehicle 10.
Meanwhile, in box 608, the radar array positions (L1, . . . , LN) at each observation (X1, . . . , XN) are recorded. In box 610, the observations (X1, . . . , XN) are coherently combined given the radar array positions for each observation. The combined observations generate a reference signal Z, as shown in Eq. (1):
Z=∥Σn=1N aH(θn, ϕn, Rn)Xn∥  Eq. (1)
where aH(θn, ϕn, Rn) is an array of synthetic responses based on angles and ranges recorded for the nth observation and Xn is the nth observation received from the extended radar array.
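Eq. (1) can be exercised with a small simulation. The element count, observation count, wavelength, and far-field steering model below are simplifying assumptions (the range and elevation dependence of a(θn, ϕn, Rn) is omitted); the point is that multiplying each observation by the conjugated synthetic response phase-aligns the contributions so they add coherently:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 8                # number of observations along the motion (hypothetical)
M = 4                # elements in the physical radar array (hypothetical)
WAVELENGTH = 0.0039  # 77 GHz automotive radar wavelength in metres (assumption)

# Element positions within the physical array (half-wavelength spacing).
elem = np.arange(M) * WAVELENGTH / 2.0

def steering(theta):
    """Synthetic array response a(theta) for a far-field target at angle
    theta; the range and elevation terms are omitted in this sketch."""
    return np.exp(2j * np.pi * elem * np.sin(theta) / WAVELENGTH)

# Angles recorded for each observation as the vehicle moves past the target.
thetas = np.deg2rad(np.linspace(10.0, 12.0, N))

# Simulated snapshots X_n = a(theta_n) plus a little receiver noise
# (unit target amplitude for clarity).
X = np.stack([steering(t) for t in thetas])
X = X + 0.01 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))

# Eq. (1): Z = || sum_n a^H(theta_n) X_n ||.  Each term is phase-aligned
# by a^H, so the N*M contributions add coherently and Z approaches N*M.
Z = np.abs(sum(steering(t).conj() @ x for t, x in zip(thetas, X)))
```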
In box 612, a loss is calculated using a loss function based on the network output signal {circumflex over (Z)} and the reference signal Z as disclosed below in Eq. (2).
loss=E{∥{circumflex over (Z)}−Z∥p} Eq. (2)
where p is a value between 0.5 and 2, E represents an averaging operator over a set of examples (e.g., a training set). Therefore, the loss is an average over differences between the network output signal {circumflex over (Z)} and the reference signal Z. The loss calculated in box 612 is used at box 604 to update weights and coefficients of the neural network. Updating the weights and coefficients includes determining values of the weights and coefficients of the neural network that minimize the loss function or minimize the difference between the network output signal {circumflex over (Z)} and the reference signal Z.
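A minimal sketch of the loss of Eq. (2), with a hypothetical toy batch standing in for the training set:

```python
import numpy as np

def reference_loss(z_hat, z, p=1.0):
    """Eq. (2): loss = E{ || z_hat - z ||^p }, averaged over a set of
    training examples; p is a value between 0.5 and 2."""
    diff = np.abs(np.asarray(z_hat, dtype=float) - np.asarray(z, dtype=float))
    return float(np.mean(diff ** p))

# Toy batch: network output signals versus coherently combined references.
z_hat = np.array([0.9, 1.1, 1.0])
z_ref = np.array([1.0, 1.0, 1.0])
loss = reference_loss(z_hat, z_ref, p=1.0)
```

Minimizing this average difference over the weights is what drives the network output signal toward the coherently combined reference.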
Curve 904 shows an angular resolution for an extended radar array 302 based on the radar array 202 having three radars (202a, 202b, 202c). For objects directly in front of the vehicle (zero degrees), the resolution is the same as that of an individual antenna (e.g., 1.5 degrees) of the antenna array, as shown by curve 904. As the object angle increases, the angular resolution of the extended radar array 302 improves (i.e., the smallest resolvable angle decreases), such that at 10 degrees from the front of the vehicle the angular resolution is about 0.1 degrees. At higher object angles, the angular resolution continues to improve steadily, reaching about 0.02 degrees at 45 degrees.
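The shape of curve 904 is consistent with a simple synthetic-aperture model in which the aperture projected toward the target scales with the sine of the object angle, so broadside targets see no benefit. The wavelength and aperture length below are illustrative assumptions chosen to roughly reproduce the quoted resolutions, not values from the disclosure:

```python
import numpy as np

WAVELENGTH = 0.0039   # 77 GHz automotive radar wavelength in metres (assumption)
APERTURE = 6.0        # synthetic aperture length in metres (hypothetical)
NATIVE_RES_DEG = 1.5  # single-antenna angular resolution in degrees

def extended_resolution_deg(angle_deg):
    """Angular resolution of the motion-extended array under a simple
    synthetic-aperture model: resolution ~ wavelength / (2 * projected
    aperture), where the projected aperture scales with sin(angle)."""
    projected = APERTURE * abs(np.sin(np.deg2rad(angle_deg)))
    if projected == 0.0:
        return NATIVE_RES_DEG  # no synthetic-aperture gain at broadside
    return min(NATIVE_RES_DEG, float(np.rad2deg(WAVELENGTH / (2.0 * projected))))
```

Under these assumptions the model yields roughly 0.1 degrees at a 10-degree object angle and a few hundredths of a degree at 45 degrees, matching the trend of curve 904.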
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made, and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.