The subject disclosure relates to a radar system and method of use and, in particular, to methods for achieving an angular resolution of a radar signal in a radar array using match filtering.
A radar system can be implemented on a vehicle in order to detect objects in the path of the vehicle, allowing the vehicle to navigate with respect to the objects. The radar system can include a plurality of radar nodes at separated locations about the vehicle. Such a radar system forms a wide aperture radar which can provide a low resolution. Match filtering can be used with a wide aperture radar to increase the resolution. However, a straightforward implementation of a match filter is complex, since different elements in the array observe each reflection point at different ranges, angles, and Doppler frequencies due to near-field effects. Accordingly, it is desirable to provide an efficient and practical method of applying a match filter to a signal in a wide aperture radar in a near-field scenario.
In one exemplary embodiment, a method of operating a radar is disclosed. The method includes determining a first far-field parameter measurement for an object for a first node of the radar using sub-nodes of the first node, determining a second far-field parameter measurement for the object for a second node of the radar using sub-nodes of the second node, and obtaining a joint parameter measurement for the object by combining the first far-field parameter measurement with the second far-field parameter measurement by correcting for a near-field phase difference between the first node and the second node.
In addition to one or more of the features described herein, the first node and the second node of the radar form a near-field aperture, the subnodes of the first node form a far-field aperture and the subnodes of the second node form a far-field aperture. The method further includes determining first coarse grid parameter measurements for a first match filter associated with the first node and determining second coarse grid parameter measurements for a second match filter associated with the second node. The method further includes interpolating the first coarse grid parameter measurements to estimate the first far-field parameter measurement at a grid location on a first fine grid and interpolating the second coarse grid parameter measurements to estimate the second far-field parameter measurement at a grid location on a second fine grid. Correcting for the near-field phase difference further includes applying a near-field correction with respect to a selected location to the first far-field parameter measurement and the second far-field parameter measurement. The method further includes performing at least one of (i) a range FFT; (ii) a Doppler FFT; and (iii) beamforming to determine at least one of the first far-field parameter measurement and the second far-field parameter measurement. The method further includes navigating a vehicle with respect to the object using the joint parameter measurement.
In another exemplary embodiment, a radar system is disclosed. The radar system includes a radar array and a processor. The radar array includes at least a first radar node and a second radar node, each of the first radar node and the second radar node having a plurality of subnodes. The processor is configured to determine a first far-field parameter measurement for an object for a first node of the radar using sub-nodes of the first node, determine a second far-field parameter measurement for the object for a second node of the radar using sub-nodes of the second node, and obtain a joint parameter measurement for the object by combining the first far-field parameter measurement with the second far-field parameter measurement by correcting for a near-field phase difference between the first node and the second node.
In addition to one or more of the features described herein, the first node and the second node of the radar form a near-field aperture, the subnodes of the first node form a far-field aperture and the subnodes of the second node form a far-field aperture. The processor is further configured to determine first coarse grid parameter measurements for a first match filter associated with the first node and determine second coarse grid parameter measurements for a second match filter associated with the second node. The processor is further configured to interpolate the first coarse grid parameter measurements to estimate the first far-field parameter measurement at a grid location on a first fine grid and interpolate the second coarse grid parameter measurements to estimate the second far-field parameter measurement at a grid location on a second fine grid. The processor is further configured to apply a near-field correction with respect to a selected location to the first far-field parameter measurement and the second far-field parameter measurement. The processor is further configured to perform at least one of (i) a range FFT; (ii) a Doppler FFT; and (iii) beamforming to determine at least one of the first far-field parameter measurement and the second far-field parameter measurement. The processor is further configured to navigate a vehicle with respect to the object using the joint parameter measurement.
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes a radar array and a processor. The radar array includes at least a first radar node and a second radar node, each of the first radar node and the second radar node having a plurality of subnodes. The processor is configured to determine a first far-field parameter measurement for an object for a first node of the radar using sub-nodes of the first node, determine a second far-field parameter measurement for the object for a second node of the radar using sub-nodes of the second node, obtain a joint parameter measurement for the object by combining the first far-field parameter measurement with the second far-field parameter measurement by correcting for a near-field phase difference between the first node and the second node, and navigate the vehicle with respect to the object using the joint parameter measurement.
In addition to one or more of the features described herein, the first node and the second node of the radar form a near-field aperture, the subnodes of the first node form a far-field aperture and the subnodes of the second node form a far-field aperture. The processor is further configured to determine first coarse grid parameter measurements for a first match filter associated with the first node and determine second coarse grid parameter measurements for a second match filter associated with the second node. The processor is further configured to interpolate the first coarse grid parameter measurements to estimate the first far-field parameter measurement at a grid location on a first fine grid and interpolate the second coarse grid parameter measurements to estimate the second far-field parameter measurement at a grid location on a second fine grid. The processor is further configured to apply a near-field correction with respect to a selected location to the first far-field parameter measurement and the second far-field parameter measurement. The processor is further configured to perform at least one of (i) a range FFT; (ii) a Doppler FFT; and (iii) beamforming to determine at least one of the first far-field parameter measurement and the second far-field parameter measurement.
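By way of a non-limiting illustration of items (i)-(iii) above, the following Python sketch shows one conventional way to obtain far-field measurements at a single node: a range FFT over fast time, a Doppler FFT over slow time, and delay-and-sum beamforming over the subnodes. The data-cube layout, the uniform linear subnode arrangement, the function name, and all numerical values are assumptions made for illustration and are not taken from the disclosure.

```python
import numpy as np

def far_field_measurements(cube, wavelength, spacing, angles_rad):
    """Far-field processing for one radar node (illustrative sketch).

    cube: complex array of shape (n_samples, n_chirps, n_subnodes),
          i.e., fast time x slow time x subnode channels (assumed layout).
    wavelength: carrier wavelength in meters.
    spacing: subnode spacing in meters (uniform linear array assumed).
    angles_rad: candidate angles of arrival in radians.
    """
    n_samples, n_chirps, n_sub = cube.shape

    # (i) Range FFT over fast time.
    range_fft = np.fft.fft(cube, axis=0)

    # (ii) Doppler FFT over slow time (chirps).
    range_doppler = np.fft.fftshift(np.fft.fft(range_fft, axis=1), axes=1)

    # (iii) Conventional (delay-and-sum) beamforming over the subnodes.
    k = 2 * np.pi / wavelength
    positions = spacing * np.arange(n_sub)
    steering = np.exp(1j * k * np.outer(np.sin(angles_rad), positions))  # (n_angles, n_sub)
    beams = range_doppler @ steering.conj().T   # (n_samples, n_chirps, n_angles)
    return beams

# Usage with synthetic data (illustrative values only).
cube = np.random.randn(256, 64, 4) + 1j * np.random.randn(256, 64, 4)
beams = far_field_measurements(cube, wavelength=0.004, spacing=0.002,
                               angles_rad=np.deg2rad(np.arange(-60, 61, 2)))
print(beams.shape)  # (256, 64, 61)
```

The resulting range-Doppler-angle cube can then be searched for peaks to produce the per-node far-field parameter measurements referred to above.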
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with an exemplary embodiment, a vehicle 10 is shown with an associated trajectory planning system 100.
In various embodiments, the vehicle 10 is an autonomous vehicle and the trajectory planning system 100 is incorporated into the autonomous vehicle 10 (hereinafter referred to as the autonomous vehicle 10). The autonomous vehicle 10 is, for example, a vehicle that is automatically controlled to carry passengers from one location to another. The autonomous vehicle 10 is depicted in the illustrated embodiment as a passenger car, but it should be appreciated that any other vehicle including motorcycles, trucks, sport utility vehicles (SUVs), recreational vehicles (RVs), marine vessels, aircraft, etc., can also be used. In an exemplary embodiment, the autonomous vehicle 10 is a so-called Level Four or Level Five automation system. A Level Four system indicates “high automation”, referring to the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene. A Level Five system indicates “full automation”, referring to the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver.
As shown, the autonomous vehicle 10 generally includes a propulsion system 20, a transmission system 22, a steering system 24, a brake system 26, a sensor system 28, an actuator system 30, at least one data storage device 32, and at least one controller 34. The propulsion system 20 may, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 22 is configured to transmit power from the propulsion system 20 to the vehicle wheels 16 and 18 according to selectable speed ratios. According to various embodiments, the transmission system 22 may include a step-ratio automatic transmission, a continuously-variable transmission, or other appropriate transmission. The brake system 26 is configured to provide braking torque to the vehicle wheels 16 and 18. The brake system 26 may, in various embodiments, include friction brakes, brake by wire, a regenerative braking system such as an electric machine, and/or other appropriate braking systems. The steering system 24 influences a position of the vehicle wheels 16 and 18. While depicted as including a steering wheel for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 24 may not include a steering wheel.
The sensor system 28 includes one or more sensing devices 40a-40n that sense observable conditions of the exterior environment and/or the interior environment of the autonomous vehicle 10. The sensing devices 40a-40n can include, but are not limited to, radars, lidars, global positioning systems, optical cameras, thermal cameras, ultrasonic sensors, and/or other sensors. In various embodiments, the vehicle 10 includes a radar system including an array of radar sensors, the radar sensors of the radar array being located at various locations along the vehicle 10. In operation, a radar sensor sends out an electromagnetic pulse 48 that is reflected back at the vehicle 10 by one or more objects 50 in the field of view of the sensor.
The actuator system 30 includes one or more actuator devices 42a-42n that control one or more vehicle features such as, but not limited to, the propulsion system 20, the transmission system 22, the steering system 24, and the brake system 26. In various embodiments, the vehicle features can further include interior and/or exterior vehicle features such as, but not limited to, doors, a trunk, and cabin features such as ventilation, music, lighting, etc. (not numbered).
The controller 34 includes at least one processor 44 and a computer readable storage device or media 46. The processor 44 can be any custom made or commercially available processor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processor among several processors associated with the controller 34, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, any combination thereof, or generally any device for executing instructions. The computer readable storage device or media 46 may include volatile and nonvolatile storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), for example. KAM is a persistent or non-volatile memory that may be used to store various operating variables while the processor 44 is powered down. The computer-readable storage device or media 46 may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (electrically PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions, used by the controller 34 in controlling the autonomous vehicle 10.
The instructions may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. The instructions, when executed by the processor 44, receive and process signals from the sensor system 28, perform logic, calculations, methods and/or algorithms for automatically controlling the components of the autonomous vehicle 10, and generate control signals to the actuator system 30 to automatically control the components of the autonomous vehicle 10 based on the logic, calculations, methods, and/or algorithms. Although only one controller 34 is shown, embodiments of the autonomous vehicle 10 can include any number of controllers 34 that cooperate to process the sensor signals, perform logic, calculations, methods, and/or algorithms, and generate control signals to automatically control features of the autonomous vehicle 10.
The trajectory planning system 100 navigates the autonomous vehicle 10 based on a determination of objects and/or their locations within the environment of the vehicle. In various embodiments, the controller 34 operates a plurality of radars at various locations on the vehicle 10 to determine a location (i.e., range, elevation and azimuth) of the object 50 using interpolation of far-field responses together with a correction for the near-field assumptions of the responses. The determined location can be used either alone or in combination with similar parameters obtained by single radar systems in order to provide range, azimuth and/or elevation of the object 50 for navigation purposes. Upon determining various parameters of the object, such as range, azimuth, elevation, velocity, etc., the controller 34 can operate the one or more actuator devices 42a-42n, the propulsion system 20, transmission system 22, steering system 24 and/or brake system 26 in order to navigate the vehicle 10 with respect to the object 50.
The aperture d of the subnode array is the distance spanned by the subnodes 204a, . . . , 204n. Due to the relatively small size of the aperture d, the subnodes 204a, . . . , 204n are considered to receive signals in a far-field scenario, for which the object is considered to be at infinity. For a small aperture of about 10 cm and a wavelength of 4 mm, the far-field condition applies to objects at a distance greater than about 5 meters. In the far-field scenario, the angles of arrival at each subnode are the same or substantially the same. Similarly, the range measurement obtained from correlation of the signal waveform (and not from the carrier phase measurement) at each subnode is the same or substantially the same, as are the Doppler measurements at each subnode. There is therefore a relatively simple relation between the reflection point position and the phase, range, and Doppler measurements at each subnode 204a, . . . , 204n.
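The 5 meter figure quoted above is consistent with the conventional Fraunhofer far-field criterion of 2d^2/λ; the short sketch below merely evaluates that criterion for the stated aperture and wavelength. The use of this particular criterion, and the helper function itself, are assumptions made for illustration rather than language from the disclosure.

```python
def fraunhofer_distance(aperture_m, wavelength_m):
    """Conventional far-field (Fraunhofer) distance: 2 * d**2 / wavelength."""
    return 2.0 * aperture_m ** 2 / wavelength_m

# Subnode aperture d of about 10 cm and a wavelength of 4 mm:
print(fraunhofer_distance(0.10, 0.004))  # 5.0 meters
```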
The second radar array 310 shows a near-field spacing between nodes 202a, . . . , 202n spanning an aperture D. For the near-field spacing of the array 310, the angles of arrival (θ0, θ1, θ2, θ3) are different for each node. Similarly, the ranges (r1, r2, r3, r4) are different for each node, and the Doppler measurements are all different from each other. There is therefore a complex relation between the reflection point position and the measured phases, ranges, and Doppler frequencies at the nodes.
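To make the contrast with the far-field case concrete, the following sketch computes the per-node range and angle of arrival to a single reflection point for hypothetical node positions spread over a wide aperture; the node positions and the reflection-point location are invented for illustration only. In this near-field geometry the ranges and angles differ from node to node, whereas a far-field model would treat them as equal.

```python
import numpy as np

# Hypothetical node centers spread across a wide aperture D of about 1.8 m (x, y in meters).
node_positions = np.array([[-0.9, 0.0], [-0.3, 0.0], [0.3, 0.0], [0.9, 0.0]])

# A reflection point a few meters ahead of the array -- well inside the near field for this D.
reflection_point = np.array([1.0, 4.0])

offsets = reflection_point - node_positions                    # vector from each node to the point
ranges = np.linalg.norm(offsets, axis=1)                       # r1, ..., r4 differ per node
angles = np.degrees(np.arctan2(offsets[:, 0], offsets[:, 1]))  # azimuth from boresight (y-axis)

for i, (r, a) in enumerate(zip(ranges, angles)):
    print(f"node {i}: range = {r:.3f} m, angle = {a:.2f} deg")
```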
Methods disclosed herein determine radar parameters of an object, such as range, Doppler and angle, by first obtaining a far-field estimate of each parameter using measurements at the subnodes of a node. Then, the far-field estimates are combined across the nodes of the array. Combining the far-field estimates includes applying a near-field correction based on the spacing of the nodes of the array. These methods are discussed in further detail below.
In various embodiments, a signal is received by reflection of the source signal from the object 50, which is located at a distance d1 with respect to the first node 202a. Interpolation determines the location and complex value of the signal using the coarse grid complex values (x1, x2, . . . , xN) for the first match filter 404 and the known positions of the grid points of the first match filter 404. The interpolation is shown in Eq. (1):
y1 = ((A^H A)^(-1) A^H a0)^H x   Eq. (1)
where
x = [x1 x2 x3 x4]^T   Eq. (2)
and
A = [a1 a2 a3 a4]
where a1, a2, a3, and a4 are vectors of the expected array response for each of the reflection point positions that correspond to the grid points x1, x2, x3 and x4, respectively, and a0 is the array response to a reflection point that is at the desired point on the fine grid.
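A minimal numerical sketch of the interpolation of Eq. (1), simplified to an angle-only grid, is given below. It assumes the coarse grid values x1, . . . , x4 are obtained by correlating a subnode snapshot with the expected responses a1, . . . , a4, and it forms the least-squares weights (A^H A)^(-1) A^H a0 to estimate the match filter output at the fine grid point. The array geometry, grid spacing, and numerical values are illustrative assumptions rather than details of the disclosure.

```python
import numpy as np

rng = np.random.default_rng(0)

wavelength = 0.004                           # 4 mm carrier wavelength
n_sub = 8                                    # number of subnodes in the node (assumed)
pos = (wavelength / 2) * np.arange(n_sub)    # half-wavelength subnode spacing (assumed)

def response(angle_rad):
    """Expected far-field subnode-array response for a given angle of arrival."""
    return np.exp(1j * 2 * np.pi * pos * np.sin(angle_rad) / wavelength)

# Coarse grid points of the match filter and the desired fine grid point (angles only).
coarse_angles = np.deg2rad([-3.0, -1.0, 1.0, 3.0])
fine_angle = np.deg2rad(0.4)

A = np.column_stack([response(a) for a in coarse_angles])   # A = [a1 a2 a3 a4]
a0 = response(fine_angle)                                   # response at the fine grid point

# Simulated subnode snapshot for a reflection near the fine grid point, plus small noise.
snapshot = 0.8 * response(fine_angle) + 0.01 * (rng.standard_normal(n_sub)
                                                + 1j * rng.standard_normal(n_sub))
x = A.conj().T @ snapshot                                   # coarse grid values x1, ..., x4

# Eq. (1): w = (A^H A)^(-1) A^H a0 (computed via the pseudo-inverse), then y1 = w^H x.
w = np.linalg.pinv(A) @ a0
y1 = w.conj() @ x

print(abs(y1), abs(a0.conj() @ snapshot))   # interpolated value vs. direct fine-grid match filter
```

The two printed magnitudes agree closely, illustrating that the fine-grid response can be recovered from the coarse-grid outputs without evaluating the match filter on the fine grid directly.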
The interpolated value y1 for the first node 202a and the corresponding value y2, obtained by an analogous interpolation for the second node 202b, are combined into the joint parameter measurement z by applying a near-field phase correction, as shown in Eq. (3):
z = y1 exp(j2πd1/λ) + y2 exp(j2πd2/λ)   Eq. (3)
where d1 is a distance between the center point of the first node 202a and the reflection point location of the first parameter measurement, d2 is a distance between the center point of the second node 202b and the reflection point location of the second parameter measurement, and λ is the wavelength of the source signal of the radar system.
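The combination of Eq. (3) can be sketched as follows for two nodes. The node center positions, the reflection point, and the simulated values y1 and y2 below are hypothetical; the sketch simply applies the per-node phase terms exp(j2πd/λ) before summing, and shows that the coherent sum is largest when the assumed reflection-point location matches the true one, illustrating how the wide near-field aperture sharpens the joint measurement.

```python
import numpy as np

wavelength = 0.004                                    # 4 mm source wavelength

# Hypothetical center points of the first node 202a and the second node 202b (x, y in meters).
node_centers = np.array([[-0.6, 0.0], [0.6, 0.0]])

def near_field_combine(y1, y2, candidate_point):
    """Eq. (3): combine the two far-field measurements with near-field phase
    corrections evaluated for a candidate reflection-point location."""
    d1 = np.linalg.norm(candidate_point - node_centers[0])
    d2 = np.linalg.norm(candidate_point - node_centers[1])
    return (y1 * np.exp(1j * 2 * np.pi * d1 / wavelength)
            + y2 * np.exp(1j * 2 * np.pi * d2 / wavelength))

# Simulated per-node measurements for a reflection point at (0.5, 5.0) m:
true_point = np.array([0.5, 5.0])
d1_true = np.linalg.norm(true_point - node_centers[0])
d2_true = np.linalg.norm(true_point - node_centers[1])
y1 = np.exp(-1j * 2 * np.pi * d1_true / wavelength)   # phase carried by the first node's measurement
y2 = np.exp(-1j * 2 * np.pi * d2_true / wavelength)   # phase carried by the second node's measurement

print(abs(near_field_combine(y1, y2, true_point)))             # 2.0 at the true location
print(abs(near_field_combine(y1, y2, np.array([0.51, 5.0]))))  # ~0.5 for a ~1 cm offset
```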
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.