The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicle sensing system that utilizes one or more sensors at a vehicle.
Use of radar sensors in vehicle sensing systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 9,146,898; 8,027,029 and/or 8,013,780, which are hereby incorporated herein by reference in their entireties.
A vehicular driving assist system includes a sensor disposed at a vehicle equipped with the vehicular driving assist system and that senses exterior of the vehicle and that is operable to capture sensor data. The vehicular driving assist system also includes an electronic control unit (ECU) that has electronic circuitry and associated software. The electronic circuitry of the ECU includes at least one data processor and the ECU is operable to process captured sensor data provided to the ECU. The vehicular driving assist system, via processing at the ECU of sensor data captured by the sensor, detects an object present exterior of the equipped vehicle. The vehicular driving assist system, via processing at the ECU of captured sensor data, determines a first set of angular velocities for the detected object. The vehicular driving assist system, via processing at the ECU of captured sensor data, determines a second set of angular velocities for the detected object. The first set of angular velocities for the detected object is determined at a first time and the second set of angular velocities for the detected object is determined at a second time that is after the first time. The vehicular driving assist system converts (i) the first set of angular velocities into a first set of Cartesian velocities and (ii) the second set of angular velocities into a second set of Cartesian velocities. The vehicular driving assist system pairs each angular velocity from the first set of angular velocities with a corresponding angular velocity from the second set of angular velocities to generate a set of paired Cartesian velocities for the detected object. The vehicular driving assist system determines velocity of the detected object relative to the equipped vehicle based on the set of paired Cartesian velocities for the detected object and controls the equipped vehicle based at least in part on the determined velocity of the detected object relative to the equipped vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle sensing system and/or driver or driving assist system and/or object detection system and/or alert system operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a forward or rearward direction. The system includes a processor that is operable to process sensor data that is transferred to the processor from one or more sensors (e.g., radar sensors, lidar sensors, ultrasonic sensors, etc.) and provide an output, such as an alert or control of a vehicle system.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a vehicular sensing system 12 that includes one or more sensors that sense exterior of the vehicle.
Autonomous vehicles and vehicles with advanced driver assist systems (ADAS) rely on information from various sensors including imaging sensors or cameras, radar sensors, and/or lidar sensors. In particular, these autonomous vehicles and driving assistance systems include sensing systems that implement instantaneous or substantially instantaneous detection models (i.e., object detection and object tracking models) that process sensor data from these various sensors to detect and track objects located in the environment surrounding the vehicle. Yet, the detection models are oftentimes constrained by the sensor data captured by the sensors. For example, cameras capture image data that does not directly include velocity information of the objects captured by the image data. Extracting velocity information from the image data can be a non-trivial and difficult process for detection models. To that end, many sensing systems implement other sensors, such as Frequency Modulated Continuous Wave (FMCW) sensors, that directly capture velocity (i.e., Doppler) information of the detected objects. The FMCW sensors may be radar sensors or lidar sensors.
However, a drawback to using these other sensors is that the velocity information includes radial velocity for multiple different points of a detected object, while detection models generally work with Cartesian coordinates and thus cannot readily process the captured radial velocities. Simply put, the detection models expect any captured velocity information to be in Cartesian coordinate format (e.g., (x, y) velocities) rather than the radial (i.e., angular) velocities that are captured by radar and lidar sensors. This problem is further complicated by the fact that radar and lidar sensors capture the radial velocity at multiple points of each detected object. One current approach to this problem includes implementing multiple sensors that provide information across multiple different dimensions, thereby allowing detection models to determine current or instantaneous Cartesian velocities from individual angular velocities for every measurement point. However, this approach requires a cumbersome process of integrating sensor data from multiple sensors to determine the Cartesian velocities. Another current approach includes utilizing a tracking framework model (e.g., a Kalman Filter-based approach) that leverages temporal information over a range of time to estimate the velocities of the tracked objects. The tracking framework model requires processing data over multiple points in time, which is computationally expensive.
Implementations herein are directed towards a vehicular sensing system 12, such as a radar sensing system that includes a sensor (e.g., an imaging sensor or a non-imaging sensor, such as a radar sensor, a lidar sensor, an ultrasonic sensor, etc.) sensing exterior of the equipped vehicle by capturing sensor data, for example, radar data. The sensors may be located at one or both side mirrors of the vehicle and/or at a roof-top of the vehicle. Additionally or alternatively, the sensors may be located at front and/or rear corners of the vehicle, such as each corner of the vehicle (e.g., at the bumpers). The vehicular sensing system may include an electronic control unit (ECU) having a data processor for processing the sensor data captured by the sensor. In some examples, the vehicular sensing system includes a single sensor, for example, a single radar sensor. Alternatively, the data processor may only process sensor data from a respective one of the sensors at a given time despite the availability of sensor data from multiple sensors at the given time. The system leverages point-cloud information directly for performing velocity estimation (instead of a time-derivative of position), while relying on a general sense of the object over two instances of time (without the requirement of a full-blown tracker). This helps in achieving a semi-instantaneous estimation of Cartesian velocities which can be of great advantage in a multi-sensor perception system.
In some examples, a detection model may use the method of least squares to fit a probability distribution to a set of data. The least squares method estimates parameters by minimizing the squared discrepancies between observed data and the corresponding expected data. An overdetermined system of equations (e.g., Ax=b) generally has no exact solution. Thus, determining the vector x that comes closest to solving the system of equations is referred to as the least squares solution. Accordingly, a system of linear equations may be solved by the least squares solution and may be represented by equation (1). Singular Value Decomposition (SVD) and QR decomposition are common methods of solving such least squares systems.
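As an illustrative sketch (not part of the original disclosure), the following Python/numpy snippet shows how an overdetermined system Ax=b may be solved in the least squares sense via an SVD-based solver; the matrix values are arbitrary example data:

```python
# Minimal sketch: least squares solution of an overdetermined system Ax = b.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # 3 equations, 2 unknowns: overdetermined
b = np.array([1.0, 2.0, 2.9])

# x minimizes ||Ax - b||_2; np.linalg.lstsq uses an SVD internally
x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
print(x)                        # least squares estimate of the unknowns
```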
Assuming the detected object is moving in a perfectly linear motion, the Cartesian velocity at each measurement point of the object will be the same. Under this assumption, the radar sensing system can determine a range rate for each radar detection (i.e., measurement point) from the radar point-cloud of data according to:
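(A plausible reconstruction of Equation (1), whose display is not reproduced in this text, with $\dot r_i$ the range rate of the i-th detection, $\theta_i$ its azimuth angle, and $(v_x, v_y)$ the Cartesian velocity of the object:)

$$\dot r_i = v_x\cos\theta_i + v_y\sin\theta_i \tag{1}$$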
However, Equation (1) requires the assumption that detected objects travel in perfectly linear motion (e.g., with no angular motion), which is rarely (if ever) true in real-world scenarios. Moreover, Equation (1) corresponds to determining the range rate when the sensor is static or stationary. To compensate for scenarios when the sensor is in motion (e.g., the sensor and/or vehicle is moving), the vehicular sensing system determines the resulting Doppler distortion of the sensor and determines a modified range rate according to:
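(Plausible reconstructions of Equations (2) and (3), assuming the ego-motion-induced Doppler distortion is the projection of the sensor velocity $(v_{s,x}, v_{s,y})$ onto the line of sight; sign conventions may differ in the original:)

$$\dot r_{i,dist} = -\left(v_{s,x}\cos\theta_i + v_{s,y}\sin\theta_i\right)\tag{2}$$

$$\dot r_i^{\,c} = \dot r_{i,meas} - \dot r_{i,dist} = v_x\cos\theta_i + v_y\sin\theta_i\tag{3}$$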
Equations (2) and (3) determine the range rate for a single radar detection from the radar point-cloud of data for the detected object. Applying the least squares solution to Equations (2) and (3), the detection model can determine the range rate for any number N of radar detections from the radar point-cloud of data captured by the sensor, represented by:
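(A plausible reconstruction of Equation (4), stacking the compensated range rates of all N detections into one overdetermined linear system in the unknown Cartesian velocity:)

$$\begin{bmatrix}\cos\theta_1 & \sin\theta_1\\ \vdots & \vdots \\ \cos\theta_N & \sin\theta_N\end{bmatrix}\begin{bmatrix}v_x\\ v_y\end{bmatrix}=\begin{bmatrix}\dot r_1^{\,c}\\ \vdots\\ \dot r_N^{\,c}\end{bmatrix}\tag{4}$$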
Moreover, in some implementations, some of the radar detections captured by the sensor include noisy data. As such, the radar sensing system may apply outlier rejection to improve the quality of the radar detections used to determine the velocity of the detected object. For example, the radar sensing system may apply Random Sample Consensus (RANSAC) to randomly choose a minimal set of radar detections and determine a likelihood that each randomly chosen radar detection includes noisy data. If the likelihood satisfies a threshold (e.g., exceeds a noise threshold), the radar sensing system may randomly select another set of radar detections or replace the noisy radar detection with another radar detection.
In some implementations, the radar sensing system uses a state space model to detect single objects or multiple objects present in the scene surrounding the vehicle. For example, the state space model may use a Kalman Filter state estimation technique. The state space equation may be represented by:
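(A plausible reconstruction of Equation (5), following the symbol definitions given immediately below:)

$$X_{k+1} = F(\omega_k)\,X_k + w\tag{5}$$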
In Equation (5), X_{k+1} represents a propagated state of the detected object, X_k represents a current state of the detected object, ω_k represents the assumed turn-rate (i.e., the changing rate of the heading angle of the detected object), w represents a noise vector, and F represents the state space model. Thus, the state space vector, including positions and velocities of the detected object, may be represented by:
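(A plausible reconstruction of Equation (6), using the state components defined immediately below:)

$$X = \begin{bmatrix} x & v_x & y & v_y \end{bmatrix}^{\top}\tag{6}$$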
In Equation (6), x represents longitudinal position (in meters), v_x represents longitudinal velocity (in meters/second), y represents lateral position (in meters), and v_y represents lateral velocity (in meters/second). Moreover, the state space model F from Equation (5) may be represented by:
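(A plausible reconstruction of Equation (7), assuming the standard coordinated-turn (constant turn-rate) propagation model for the state ordering [x, v_x, y, v_y] with cycle time T; the exact form in the original is not reproduced here:)

$$F(\omega)=\begin{bmatrix}1 & \frac{\sin\omega T}{\omega} & 0 & -\frac{1-\cos\omega T}{\omega}\\ 0 & \cos\omega T & 0 & -\sin\omega T\\ 0 & \frac{1-\cos\omega T}{\omega} & 1 & \frac{\sin\omega T}{\omega}\\ 0 & \sin\omega T & 0 & \cos\omega T\end{bmatrix}\tag{7}$$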
The velocity estimation approach described herein may rely on several assumptions. First, velocity estimation of objects utilizes a group of detections assumed to be from the same object over two subsequent time instances. That is, accurate clustering or grouping of detections (from the same object) for every time instance is not within the scope of the velocity estimation approach described herein. Moreover, accurate association of a cluster from a current time instance to the propagated track from a previous time instance is not within the scope of this disclosure. Accurate cluster association only helps track the detected object over a period of time. Second, the sensing system assumes the vehicle operates within a known or predetermined operating range of yaw-rate, such as corresponding to road curvature, vehicle maneuvers, etc. That is, the sensing system may assume the yaw-rate remains less than a threshold amount. If the vehicle operates outside of the predetermined range, accuracy of the velocity estimation may be degraded. Third, resolution of the estimated velocity accuracy (e.g., primarily yaw-rate) is dependent upon the discretization of yaw-rate. In turn, the discretization of yaw-rate is dependent upon the processing power of the sensing system (e.g., processing power of the processor, ECU, or other hardware). Fourth, velocities of the detected object, including both linear velocity and angular yaw-rate, are expected to remain the same between two subsequent time instances. Because most sensing systems operate at a high frequency, for example, 50 Hertz, the assumption that the velocities of the detected object remain the same between two subsequent time instances is reasonable. For example, the system may assume a first velocity of the detected object at a first and second time instance and assume a second velocity, different than the first velocity, of the detected object at a third and fourth time instance. Thus, given this assumption, the sensing system may be applicable within a predetermined operating range of vehicle speed and/or acceleration/deceleration limits.
Here, $\vec{v}_B$ (Equation (8)) represents the given velocity at point B, $\vec{r}_A$ (Equation (9)) represents the range rate at point A, and $\vec{r}_B$ (Equation (10)) represents the range rate at point B. Thus, it follows that for an object undergoing linear and angular velocity (i.e., linear and angular motion), the velocity at point A may be determined by:
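(A plausible form of Equation (12), the standard rigid-body velocity-transfer relation suggested by the surrounding text, treating $\vec{r}_A$ and $\vec{r}_B$ as the position vectors of points A and B as the relation requires:)

$$\vec v_A = \vec v_B + \vec\omega \times \left(\vec r_A - \vec r_B\right)\tag{12}$$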
Using the velocity at point A represented in Equation (12) and using the notations from Equations (8)-(10), the sensing system may obtain the following:
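(A plausible component-wise form of the resulting relations, assuming planar motion with yaw rate ω about the vertical axis; the equation numbering of the original is not preserved here:)

$$v_{A,x} = v_{B,x} - \omega\,(y_A - y_B),\qquad v_{A,y} = v_{B,y} + \omega\,(x_A - x_B)$$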
Thereafter, the sensing system may select a reference point P (x_ref, y_ref) from the detected object and determine the velocity at the reference point, V_P(x_ref, y_ref), from every radar detection i, represented by:
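(A plausible form of this relation, applying the rigid-body velocity transfer from each detection i at position $(x_i, y_i)$ to the reference point P:)

$$\vec v_P = \vec v_i + \vec\omega \times \left(\vec r_P - \vec r_i\right),\qquad\text{i.e.,}\qquad v_{P,x} = v_{i,x} - \omega\,(y_{ref} - y_i),\quad v_{P,y} = v_{i,y} + \omega\,(x_{ref} - x_i)$$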
When determining the velocity of the detected object, the sensing system may not assume that every point of the detected moving object has the same linear Cartesian velocities as other approaches assume. Instead, the sensing system may use yaw-rate (i.e., angular velocity) together with range-rate information to determine the velocity of detected objects. That is, using the equations above, the radar detection system establishes a relationship between range-rate of detections, Cartesian linear velocities, and angular velocities from the radar detections captured by the sensor. Using these relationships, the radar sensing system may determine or estimate the Cartesian velocity of the detected object for a single value of yaw-rate (i.e., angular velocity) according to:
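(A plausible reconstruction of Equation (16), obtained by substituting the rigid-body relation for the velocity at detection i into the range-rate relation of Equation (3):)

$$\dot r_i^{\,c} = \left[v_{P,x} - \omega\,(y_i - y_{ref})\right]\cos\theta_i + \left[v_{P,y} + \omega\,(x_i - x_{ref})\right]\sin\theta_i\tag{16}$$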
For a single known value of yaw-rate ω (e.g., obtained from the sensor data), the values on the left-hand side of Equation (16) are all known except for r_i^c. As such, Equation (16) can be simplified and represented by:
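(A plausible reconstruction of Equation (17), collecting the known ω-dependent terms into the effective range rate defined below:)

$$(\dot r_i^{\,c})_{eff} = v_{P,x}\cos\theta_i + v_{P,y}\sin\theta_i\tag{17}$$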
In Equation (17), (r_i^c)_eff may be defined according to:
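(A plausible reconstruction of Equation (18), consistent with Equations (16) and (17):)

$$(\dot r_i^{\,c})_{eff} = \dot r_i^{\,c} + \omega\,(y_i - y_{ref})\cos\theta_i - \omega\,(x_i - x_{ref})\sin\theta_i\tag{18}$$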
Finally, the radar detection system may use Equation (17) to determine the Cartesian velocities of the radar detection of the object according to:
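(A plausible reconstruction of Equation (19), stacking Equation (17) over all N detections and solving in the least squares sense for the reference-point Cartesian velocity $(v_{P,x}, v_{P,y})$:)

$$\begin{bmatrix}\cos\theta_1 & \sin\theta_1\\ \vdots & \vdots\\ \cos\theta_N & \sin\theta_N\end{bmatrix}\begin{bmatrix}v_{P,x}\\ v_{P,y}\end{bmatrix}=\begin{bmatrix}(\dot r_1^{\,c})_{eff}\\ \vdots\\ (\dot r_N^{\,c})_{eff}\end{bmatrix}\tag{19}$$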
In some implementations, the radar points include noisy detections due to artifacts including multi-bounce, azimuth ambiguity, etc. Accordingly, as discussed above, the sensing system may apply an outlier rejection (e.g., RANSAC) to reduce the effects of noise on the quality of the Cartesian velocity estimates.
After determining the Cartesian velocity for the single angular velocity, the sensing system may determine multiple Cartesian velocity estimates, each corresponding to a respective angular velocity over a range of angular velocities ω_j ∈ [ω_min, ω_max] for the detected object. Simply put, the sensing system repeats the process of determining the Cartesian velocity for every angular velocity from the range of angular velocities according to:
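(A plausible reconstruction of Equation (20), repeating the least squares solve of Equation (19) for each discretized yaw-rate value:)

$$\left(v_x^{\,j}, v_y^{\,j}\right) = \underset{(v_x,\,v_y)}{\arg\min}\ \sum_{i=1}^{N}\left[(\dot r_i^{\,c})_{eff}(\omega_j) - v_x\cos\theta_i - v_y\sin\theta_i\right]^2,\qquad \omega_j\in[\omega_{min},\omega_{max}]\tag{20}$$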
Using the Cartesian velocities determined from Equation (20), the detection model generates a set of Cartesian velocity estimates corresponding to the Nth cycle operation (i.e., time step) according to:
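(A plausible reconstruction of Equation (21), with each velocity pair corresponding to one hypothesized yaw-rate ω_j:)

$$S_N = \left\{\left(v_x^{\,j},\ v_y^{\,j}\right)\right\}_{j=1}^{J}\tag{21}$$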
In Equation (21), there are J pairs of Cartesian velocity estimates corresponding to the current cycle operation N where each cycle indicates a point in time or time step or time instance. Stated differently, each N indicates a time cycle of sensor data captured by the sensor.
Using the set of Cartesian velocity estimates from two subsequent or adjacent time cycles, for example, a first time step and a second time step, the radar sensing system links or associates each respective Cartesian velocity estimate from the first time step to a corresponding respective Cartesian velocity estimate from the second time step. Here, the first and second time steps may be adjacent time steps (i.e., the second time step immediately follows the first time step). Moreover, the radar sensing system may record relevant information from the Nth and (N−1)th time steps. For example, using a clustering model (i.e., a density-based clustering algorithm), the multiple radar detections obtained from the sensor and the corresponding time step may be recorded. More specifically, the (x, y) position of the cluster from the radar point-cloud of data in the (N−1)th time step and the radar point-cloud of data linked to the detected object under consideration from the (N−1)th time step may be recorded. Thereafter, the cluster centroid from the (N−1)th time step may be propagated to the current time step using a kinematic model. The kinematic model from Equations (5)-(7) may be used; however, any other kinematic model may be used. After the cluster position from the (N−1)th time step is propagated, any data association approach may associate (i.e., pair) the clusters from the two adjacent time steps. One data association approach may include the Hungarian algorithm with a distance-based cost, as sketched below. The data association generates a resulting set of cluster pairs from the adjacent time steps.
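By way of a hedged illustration (the function and variable names below are hypothetical, not from the original disclosure), a distance-based Hungarian association of propagated and measured cluster centroids might look like the following, using scipy's linear_sum_assignment:

```python
# Sketch: pair propagated (N-1)th-step cluster centroids with current-step
# centroids by minimizing total Euclidean distance (Hungarian algorithm).
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate_clusters(propagated_xy, measured_xy):
    # cost[i, j] = distance between propagated cluster i and measured cluster j
    cost = np.linalg.norm(
        propagated_xy[:, None, :] - measured_xy[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))

# Example: two propagated centroids paired with two measured centroids
prev = np.array([[10.0, 2.0], [5.0, -1.0]])
curr = np.array([[5.2, -0.9], [10.4, 2.1]])
print(associate_clusters(prev, curr))  # [(0, 1), (1, 0)]
```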
Finally, after linking the detections from two subsequent time steps, the radar sensing system determines a full velocity estimation of the detected object. In particular, the radar sensing system uses the state space propagation model F (Equation (7)) for every value of ω (e.g., a total of J pairs from Equation (20)) and the state space information X (Equation (6)) from two adjacent time steps (e.g., T_i and T_{i-1}). Stated differently, for every angular velocity ω, the radar sensing system estimates the predicted state space vector at T_i using the state space information from T_{i-1} and the corresponding propagation model, represented by:
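(A plausible reconstruction of this propagation step, applying the coordinated-turn model of Equation (7) for each hypothesized yaw-rate ω_j:)

$$\hat X_{T_i}^{\,j} = F(\omega_j)\,X_{T_{i-1}}^{\,j}\tag{22}$$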
Thereafter, the radar sensing system extracts the (x, y) position values from the state space vectors, stores the position values as X′, and generates a vector to store all J predicted state space subsets as well as the measured cluster position X′_{T_i} from the current time step T_i, represented by:
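(A plausible form of this collection, gathering the J predicted positions alongside the measured cluster position:)

$$\mathcal{X} = \left[\hat X'^{\,1}_{T_i},\ \hat X'^{\,2}_{T_i},\ \ldots,\ \hat X'^{\,J}_{T_i},\ X'_{T_i}\right]$$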
Finally, the radar sensing system determines a yaw-rate value ω (i.e., angular velocity) that minimizes an L2 norm error between the predicted and measured states by finding the value of the index j that minimizes the L2 norm error. Once the yaw-rate that minimizes the L2 norm error is identified, the corresponding velocity estimates complete the velocity estimation process for the current time step operation according to:
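(A plausible form of this selection step, returning the velocity estimates associated with the best-matching yaw-rate hypothesis:)

$$j^{*} = \underset{j\in\{1,\ldots,J\}}{\arg\min}\ \left\lVert \hat X'^{\,j}_{T_i} - X'_{T_i} \right\rVert_2,\qquad \left(\hat v_x,\ \hat v_y,\ \hat\omega\right) = \left(v_x^{\,j^{*}},\ v_y^{\,j^{*}},\ \omega_{j^{*}}\right)$$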
Since the least squares estimation approach is highly sensitive to noise, an outlier rejection may be applied to reduce the effect of noisy/spurious radar detections. As discussed above, the outlier rejection may include a RANSAC-based approach. More specifically, for every kth iteration of V_ref, the RANSAC-based approach selects a random number of detections M to determine the velocity profile equations (instead of using all N detections). For example, the random number of detections M may be selected such that the number of detections N is greater than or equal to the random number of detections M, which is greater than or equal to four (i.e., N ≥ M ≥ 4). The RANSAC-based approach estimates the Cartesian velocity of the object under consideration using these M randomly selected detections by solving the least squares problem. Thereafter, using the estimated Cartesian velocity, the RANSAC-based approach estimates (i.e., back-computes) the reprojected range rate values of the individual detections according to:
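(A plausible reconstruction of Equation (25), projecting the Cartesian velocity estimated from the M sampled detections back onto the line of sight of each detection:)

$$\hat{\dot r}_i = \hat v_x\cos\theta_i + \hat v_y\sin\theta_i\tag{25}$$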
Thereafter, the RANSAC-based approach determines the normalized reprojection error of each detection according to:
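(A plausible reconstruction of Equation (26); the normalization by the reported range rate is an assumption:)

$$error_{norm} = \frac{\left|\dot r_i^{\,c} - \hat{\dot r}_i\right|}{\left|\dot r_i^{\,c}\right|}\tag{26}$$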
In Equation (26), error_norm represents the reprojection error (i.e., the error between the reported range-rate and the reprojected range-rate). Using the individual reprojection errors (from Equation (26)), the RANSAC-based approach classifies the detections as either inliers or outliers based on a predetermined threshold error value. For example, when the reprojection error is less than the predetermined threshold error value, the detection is classified as an inlier, and when the reprojection error is greater than the predetermined threshold error value, the detection is classified as an outlier. The RANSAC-based approach may repeat this process for different values of M, record the number of inliers for each iteration k of V_ref, and select the set of detections with the greatest number of inliers and the corresponding velocity estimates for processing by the detection model. That is, the set of detections with the greatest number of inliers is selected as the suitable candidate detections for each kth iteration, and the corresponding values of the Cartesian velocities are recorded for this value of yaw-rate ω_k.
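As a hedged sketch of the RANSAC-based outlier rejection described above (function names, default parameter values, and the divide-by-zero guard are illustrative assumptions), assuming the velocity-profile model of Equation (1):

```python
# Sketch: RANSAC-style outlier rejection for range-rate-based velocity estimation.
import numpy as np

def ransac_velocity(theta, r_dot, n_iters=50, m=4, err_thresh=0.1, seed=None):
    """Estimate (vx, vy) from detection azimuths and range rates while
    rejecting noisy/spurious detections."""
    rng = np.random.default_rng(seed)
    n = len(theta)
    H = np.column_stack([np.cos(theta), np.sin(theta)])   # velocity-profile matrix
    best_inliers, best_v = np.array([], dtype=int), None
    for _ in range(n_iters):
        subset = rng.choice(n, size=m, replace=False)     # minimal sample, N >= M >= 4
        v, *_ = np.linalg.lstsq(H[subset], r_dot[subset], rcond=None)
        reproj = H @ v                                    # back-computed range rates
        err = np.abs(r_dot - reproj) / np.maximum(np.abs(r_dot), 1e-9)
        inliers = np.flatnonzero(err < err_thresh)        # inlier: error below threshold
        if len(inliers) > len(best_inliers):
            best_inliers, best_v = inliers, v
    if len(best_inliers) >= 2:                            # refit on the winning inlier set
        best_v, *_ = np.linalg.lstsq(H[best_inliers], r_dot[best_inliers], rcond=None)
    return best_v, best_inliers
```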
Accordingly, the vehicular sensing system or vehicular driving assist system may include a sensor, for example, a radar sensor, a lidar sensor, or an ultrasonic sensor that captures sensor data. In some examples, the vehicular sensing system includes a single sensor (e.g., a single radar sensor). In other examples, the vehicular sensing system includes a plurality of sensors (e.g., multiple radar sensors, such as multiple corner radar sensors). Here, the vehicular sensing system may only process sensor data from a single sensor at any given time or may process sensor data from one or more of the plurality of sensors at any given time. The sensor data includes angular velocities from multiple different points of an object detected by the sensor. As such, each of the multiple different points of the detected object may be associated with a different angular velocity.
The vehicular sensing system may detect the presence of an object (such as a pedestrian or another vehicle) exterior of the vehicle responsive to processing sensor data. The sensing system may operate over a predetermined range of yaw-rates (i.e., angular velocities) corresponding to known road curvatures, vehicle maneuvers, etc. In some implementations, the vehicular sensing system processes the sensor data at a first time step to determine a first set of first angular velocities from the multiple different points of the detected object and processes the sensor data at a second time step to determine a second set of second angular velocities from the multiple different points of the detected object. Each angular velocity of the first set of angular velocities may be associated with a respective point on the detected object at the first time step. Thus, each angular velocity of the first set of angular velocities may be determined independent of each other angular velocity of the first set of angular velocities. For example, a front corner of a leading vehicle in front of the equipped vehicle may be traveling at a greater velocity than a rear corner of the leading vehicle when the leading vehicle is turning. Similarly, each angular velocity of the second set of angular velocities may be associated with a respective point on the detected object at the second time step. Thus, each angular velocity of the second set of angular velocities may be determined independent of each other angular velocity of the second set of angular velocities.
The first and second time steps may be adjacent time steps corresponding to two adjacent points in time at which the radar sensor captures radar data. That is, the radar sensor may operate at a predetermined frequency, such as 50 Hertz, whereby at each time step the radar sensor obtains radar data. Thus, each first angular velocity corresponds to a respective point of the detected object and is associated with a second angular velocity corresponding to the same respective point of the detected object. Notably, however, the radar sensing system performs object detection and object tracking using Cartesian velocities of the detected object, not the angular velocities captured by the radar sensor. As such, the radar sensing system converts the first set of first angular velocities into a corresponding first set of first Cartesian velocities and converts the second set of second angular velocities into a corresponding second set of second Cartesian velocities. Simply put, the radar sensing system converts the angular velocities into corresponding Cartesian (i.e., linear) velocities. Notably, the detected object may include different Cartesian velocities at each of the multiple different points of the detected object.
Moreover, the sensing system generates a set of paired Cartesian velocities by pairing each first Cartesian velocity from the first set of Cartesian velocities with a corresponding one of the second Cartesian velocities from the second set of Cartesian velocities. Here, each pair of Cartesian velocities from the set of paired Cartesian velocities includes a respective first Cartesian velocity and a respective second Cartesian velocity both corresponding to a same respective point from the detected object. Each pair of Cartesian velocities may include the same Cartesian velocity or different Cartesian velocities. In some implementations, the sensing system may pair the velocities before converting the angular velocities to Cartesian velocities and then convert the paired angular velocities to Cartesian velocities. Thereafter, the sensing system determines a set of object Cartesian velocities of the detected object based on the set of paired Cartesian velocities. Each Cartesian velocity of the set of object Cartesian velocities may be associated with a respective point on the detected object and may be determined independent of each other object Cartesian velocity of the set of object Cartesian velocities. Thus, each Cartesian velocity of the set of object Cartesian velocities may be different from at least one other Cartesian velocity of the set of object Cartesian velocities.
The set of object Cartesian velocities may represent a current or instantaneous Cartesian velocity of the detected object based on the set of paired Cartesian velocities. That is, while the set of object Cartesian velocities may represent a velocity at each respective point of the object, the sensing system may also determine an overall velocity of the detected object based on the set of object Cartesian velocities. The instantaneous Cartesian velocity may be a semi-instantaneous Cartesian velocity. The vehicular sensing system may provide the set of object Cartesian velocities of the detected object to an advanced driver or driving assistance system of the vehicle that may track a predicted path of the detected object and control operation of the vehicle (e.g., acceleration, deceleration/braking, steering, etc.) based on the predicted path of the detected object. The vehicular sensing system may determine a non-linear predicted path of the detected object based on the set of object Cartesian velocities and provide the non-linear predicted path to the advanced driver assistance system. For example, the non-linear predicted path may include the detected object turning. In some examples, the sensing system controls the equipped vehicle based on the set of paired Cartesian velocities for the detected object. The sensing system may determine the velocity of the detected object relative to the equipped vehicle based on the set of paired Cartesian velocities for the detected object and control the equipped vehicle based at least in part on the determined velocity of the detected object relative to the equipped vehicle.
Controlling the equipped vehicle may include controlling acceleration of the equipped vehicle and/or steering of the equipped vehicle and/or braking of the equipped vehicle. Thus, the system may control the equipped vehicle to avoid colliding with the detected object. In some examples, the detected object may include a leading vehicle in front of the equipped vehicle that is within a traffic lane along which the equipped vehicle is traveling. Here, controlling the equipped vehicle may include controlling the equipped vehicle (e.g., controlling steering of the equipped vehicle and/or controlling speed of the equipped vehicle via accelerating or decelerating the equipped vehicle) to follow the leading vehicle and to maintain a set distance from the leading vehicle, or to decelerate the equipped vehicle (e.g., controlling speed of the equipped vehicle via braking of the equipped vehicle) if it is determined that the leading vehicle is slowing down or stopping.
Advantageously, determining the Cartesian (i.e., linear) velocity of the detected object in this manner allows the sensing system to estimate both linear (in Cartesian coordinates) and angular velocities of the detected object using a single sensor (e.g., a radar or lidar sensor). Thus, the constraints on vehicle maneuvers can be expanded because the sensing system can detect angular movement in addition to linear movement of the objects surrounding the vehicle. Moreover, different points of detected objects are no longer assumed to each have the same linear velocity, enabling a better understanding of the velocity of detected objects.
The system may utilize sensors, such as radar sensors or imaging radar sensors or lidar sensors or the like, to detect presence of and/or range to objects and/or other vehicles and/or pedestrians. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
The radar sensors of the sensing system each comprise a plurality of transmitters that transmit radio signals via a plurality of antennas, and a plurality of receivers that receive radio signals via the plurality of antennas, with the received radio signals being transmitted radio signals that are reflected from an object present in the field of sensing of the respective radar sensor. The system includes an ECU or control that includes a data processor for processing sensor data captured by the radar sensors. The ECU or sensing system may be part of a driving assist system of the vehicle, with the driving assist system controlling at least one function or feature of the vehicle (such as to provide autonomous driving control of the vehicle) responsive to processing of the data captured by the radar sensors.
The radar sensor or sensors may be disposed at the vehicle so as to sense exterior of the vehicle. For example, the radar sensor may comprise a front sensing radar sensor mounted at a grille or front bumper of the vehicle, such as for use with an automatic emergency braking system of the vehicle, an adaptive cruise control system of the vehicle, a collision avoidance system of the vehicle, etc., or the radar sensor may comprise a corner radar sensor disposed at a front corner or rear corner of the vehicle, such as for use with a surround vision system of the vehicle, or the radar sensor may comprise a blind spot monitoring radar sensor disposed at a rear fender of the vehicle for monitoring sideward/rearward of the vehicle for a blind spot monitoring and alert system of the vehicle. The radar sensing system may comprise multiple input multiple output (MIMO) radar sensors having multiple transmitting antennas and multiple receiving antennas.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/579,991, filed Sep. 1, 2023, which is hereby incorporated herein by reference in its entirety.