The present application generally relates to autonomous vehicles and, more particularly, to vehicle center of gravity height detection and vehicle mass detection using light detection and ranging (LIDAR) point cloud data.
Some vehicles are equipped with an advanced driver assistance system (ADAS) or autonomous driving system that is configured to perform one or more assistance or autonomous driving features (e.g., adaptive cruise control, lane centering, collision avoidance, etc.). Two important vehicle body parameters used by some autonomous driving features are the height of the vehicle's center of gravity (CoG) and the mass of the vehicle. Because the configuration of the vehicle (number of passengers, seating arrangement of passengers, cargo load, etc.) varies from trip to trip, these parameters need to be accurately determined for each vehicle trip. Unfortunately, current vehicles have no sensors that are able to directly measure these parameters. Conventional solutions based on vehicle lateral and/or longitudinal dynamics require the vehicle to be experiencing excitation conditions (e.g., acceleration or deceleration) and may also require additional sensors, such as gyros. Accordingly, while conventional vehicle CoG height and vehicle mass detection techniques for autonomous driving features work well for their intended purpose, there exists an opportunity for improvement in the relevant art.
According to one example aspect of the invention, a center of gravity (CoG) height and mass estimation system for a vehicle is presented. In one exemplary implementation, the system comprises a light detection and ranging (LIDAR) sensor configured to emit light pulses and capture reflected light pulses that collectively form LIDAR point cloud data, and a controller configured to estimate the CoG height and the mass of the vehicle during a steady-state operating condition of the vehicle by: processing the LIDAR point cloud data to identify a ground plane, identifying a height difference between (i) a nominal distance from the LIDAR sensor to the ground plane and (ii) an estimated distance from the LIDAR sensor to the ground plane using the processed LIDAR point cloud data, estimating the vehicle CoG height as a difference between (i) a nominal vehicle CoG height and (ii) the height difference, and estimating the vehicle mass based on one of (i) vehicle CoG metrics and (ii) dampening metrics of a suspension of the vehicle.
In some implementations, the vehicle further comprises an autonomous driving system comprising a model configured to utilize the estimated vehicle CoG height and the estimated vehicle mass as part of an autonomous driving feature. In some implementations, the processing of the LIDAR point cloud data to identify the ground plane comprises (i) filtering the LIDAR point cloud data to extract points having z-coordinates in a predetermined range and (ii) implementing a least square algorithm with the extracted points from the LIDAR point cloud data to identify the ground plane. In some implementations, the processing of the LIDAR point cloud data comprises identifying an intersection between the ground plane and a z-axis of the LIDAR point cloud data to estimate the distance from the LIDAR sensor to the ground plane.
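By way of illustration only, the ground-plane identification and sensor-to-ground distance estimation described above could be sketched as follows; the point-cloud array layout, the z-coordinate window, and all function and parameter names are assumptions made for this example rather than part of the disclosure.

```python
import numpy as np

def estimate_ground_distance(points, z_min=-3.0, z_max=-1.0):
    """Fit a ground plane z = a*x + b*y + c to LIDAR returns whose
    z-coordinates fall within a predetermined window, then return the
    distance from the sensor (the point-cloud origin) to that plane
    measured along the sensor's z-axis.

    points: (N, 3) array of (x, y, z) in the LIDAR sensor frame.
    z_min, z_max: illustrative window below the sensor where ground
    returns are expected.
    """
    # (i) Filter the point cloud: keep points whose z-coordinate lies
    # in the predetermined range.
    mask = (points[:, 2] >= z_min) & (points[:, 2] <= z_max)
    ground = points[mask]

    # (ii) Least squares plane fit: solve [x y 1] @ [a b c]^T = z.
    A = np.column_stack((ground[:, 0], ground[:, 1], np.ones(len(ground))))
    (a, b, c), *_ = np.linalg.lstsq(A, ground[:, 2], rcond=None)

    # The intersection of the fitted plane with the z-axis (x = y = 0)
    # is z = c, so the sensor-to-ground distance is |c|.
    return abs(c)
```

In this sketch, the height difference of the preceding paragraphs would be the nominal distance minus this estimated distance, and the estimated CoG height would be the nominal CoG height minus that difference.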
In some implementations, the estimating of the vehicle mass based on vehicle CoG metrics comprises (i) determining a relationship between the vehicle CoG and a CoG of an extra mass in the vehicle and (ii) implementing a least square algorithm to estimate the vehicle mass based on the determined relationship. In some implementations, the estimating of the vehicle mass based on vehicle suspension dampening metrics comprises (i) determining a spring stiffness of the vehicle suspension and (ii) estimating the vehicle mass based on the LIDAR point cloud data and the vehicle suspension spring stiffness.
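As a rough illustration of approach (ii), the sprung vehicle can be pictured as a linear spring-mass system in which added mass compresses the suspension in proportion to an effective spring stiffness; the stiffness parameter and function name below are assumptions for this sketch, not values taken from the disclosure.

```python
G = 9.81  # gravitational acceleration, m/s^2

def mass_from_suspension(delta_h_est, k_effective, m_nominal):
    """Estimate total vehicle mass from the LIDAR-derived ride-height
    drop delta_h_est (m) and an effective suspension spring stiffness
    k_effective (N/m) identified from the vehicle's suspension
    (dampening) specifications.

    Assumes a linear spring-mass model in which the extra weight equals
    the stiffness times the additional compression.
    """
    extra_mass = k_effective * delta_h_est / G
    return m_nominal + extra_mass
```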
In some implementations, the controller is further configured to estimate the vehicle CoG height and vehicle mass during an excitation operating condition of the vehicle based on vehicle lateral and/or longitudinal dynamics and vehicle powertrain and/or suspension characteristics. In some implementations, the controller is configured to, during the excitation operating condition of the vehicle: estimate the vehicle CoG height based on vehicle lateral and longitudinal motion and based on suspension characteristics or additional gyro devices, and estimate the vehicle mass based on vehicle longitudinal dynamics and vehicle powertrain characteristics. In some implementations, the excitation operating condition of the vehicle comprises at least one of (i) acceleration or deceleration of the vehicle above a first threshold and (ii) a steering angle of the vehicle above a second threshold.
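For the excitation-condition path, one common way to recover mass from longitudinal dynamics and powertrain characteristics is Newton's second law applied along the direction of travel; the force terms and names below are illustrative assumptions, since the disclosure does not spell out a particular formulation.

```python
def mass_from_longitudinal_dynamics(f_traction, f_aero, f_rolling, f_grade, a_x):
    """Estimate vehicle mass during a sufficiently strong acceleration
    or deceleration event (excitation condition).

    f_traction: drive force derived from powertrain torque and gearing (N)
    f_aero, f_rolling, f_grade: estimated resistive forces (N)
    a_x: measured longitudinal acceleration (m/s^2), assumed to be well
         above the noise floor so the division is meaningful
    """
    return (f_traction - f_aero - f_rolling - f_grade) / a_x
```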
According to another example aspect of the invention, a CoG height and mass estimation method for a vehicle is presented. In one exemplary implementation, the method comprises receiving, by a controller of a vehicle, LIDAR point cloud data from a LIDAR sensor of the vehicle, the LIDAR sensor being configured to emit light pulses and capture reflected light pulses that collectively form the LIDAR point cloud data, and estimating, by the controller, the CoG height and the mass of the vehicle during a steady-state operating condition of the vehicle by: processing, by the controller, the LIDAR point cloud data to identify a ground plane, identifying, by the controller, a height difference between (i) a nominal distance from the LIDAR sensor to the ground plane and (ii) an estimated distance from the LIDAR sensor to the ground plane using the processed LIDAR point cloud data, estimating, by the controller, the vehicle CoG height as a difference between (i) a nominal vehicle CoG height and (ii) the height difference, and estimating, by the controller, the vehicle mass based on one of (i) vehicle CoG metrics and (ii) dampening metrics of a suspension of the vehicle.
In some implementations, the vehicle further comprises an autonomous driving system comprising a model configured to utilize the estimated vehicle CoG height and the estimated vehicle mass as part of an autonomous driving feature. In some implementations, the processing of the LIDAR point cloud data to identify the ground plane comprises (i) filtering, by the controller, the LIDAR point cloud data to extract points having z-coordinates in a predetermined range and (ii) implementing, by the controller, a least square algorithm with the extracted points from the LIDAR point cloud data to identify the ground plane. In some implementations, the processing of the LIDAR point cloud data comprises identifying, by the controller, an intersection between the ground plane and a z-axis of the LIDAR point cloud data to estimate the distance from the LIDAR sensor to the ground plane.
In some implementations, the estimating of the vehicle mass based on vehicle CoG metrics comprises (i) determining, by the controller, a relationship between the vehicle CoG and a CoG of an extra mass in the vehicle and (ii) implementing, by the controller, a least square algorithm to estimate the vehicle mass based on the determined relationship. In some implementations, the estimating of the vehicle mass based on vehicle suspension dampening metrics comprises (i) determining, by the controller, a spring stiffness of the vehicle suspension and (ii) estimating, by the controller, the vehicle mass based on the LIDAR point cloud data and the vehicle suspension spring stiffness.
In some implementations, the method further comprises estimating, by the controller, the vehicle CoG height and vehicle mass during an excitation operating condition of the vehicle based on vehicle lateral and/or longitudinal dynamics and vehicle powertrain and/or suspension characteristics. In some implementations, the method further comprises during the excitation operating condition of the vehicle: estimating, by the controller, the vehicle CoG height based on vehicle lateral and longitudinal motion and based on suspension characteristics or additional gyro devices, and estimating, by the controller, the vehicle mass based on vehicle longitudinal dynamics and vehicle powertrain characteristics. In some implementations, the excitation operating condition of the vehicle comprises at least one of (i) acceleration or deceleration of the vehicle above a first threshold and (ii) a steering angle of the vehicle above a second threshold.
Further areas of applicability of the teachings of the present disclosure will become apparent from the detailed description, claims and the drawings provided hereinafter, wherein like reference numerals refer to like features throughout the several views of the drawings. It should be understood that the detailed description, including disclosed embodiments and drawings referenced therein, is merely exemplary in nature, intended for purposes of illustration only, and is not intended to limit the scope of the present disclosure, its application or uses. Thus, variations that do not depart from the gist of the present disclosure are intended to be within the scope of the present disclosure.
As discussed above, vehicle center of gravity (CoG) height and vehicle mass are important parameters for some autonomous driving features, and there exists an opportunity for improvement in the art of vehicle CoG height and vehicle mass detection. Accordingly, improved vehicle CoG height and vehicle mass estimation techniques are presented. These techniques are capable of accurately estimating vehicle CoG height and vehicle mass during steady-state operating conditions of the vehicle (e.g., cruise conditions along a road or highway). In other words, excitation conditions (e.g., acceleration or deceleration) of the vehicle are not required for these estimation techniques. During excitation conditions, however, the conventional techniques based on lateral and/or longitudinal dynamics and vehicle powertrain and/or suspension characteristics can be utilized. The steady-state estimation techniques utilize LIDAR point cloud data captured by a LIDAR sensor, which is filtered so that points indicative of a ground plane are extracted. The distance between the LIDAR sensor and the ground plane is then compared to a nominal distance (e.g., with no additional vehicle passengers/mass), and the resulting height difference is used to estimate the vehicle CoG height. One of two techniques is then utilized to estimate the vehicle mass: (1) a vehicle CoG based technique or (2) a vehicle suspension dampening metrics based technique. The estimated vehicle CoG height and vehicle mass are then utilized as part of one or more autonomous driving features of the vehicle.
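Purely for illustration, the steady-state pipeline just summarized could be composed as follows, assuming the sensor-to-ground distance has already been estimated from the point cloud (for example, along the lines of the ground-plane sketch above); all parameter names and the use of the spring-stiffness route for mass are assumptions of this sketch.

```python
def estimate_cog_height_and_mass(d_est, d_nominal, h_cog_nominal,
                                 m_nominal, k_effective, g=9.81):
    """Steady-state estimation: compare the LIDAR-derived sensor-to-ground
    distance d_est against the nominal distance d_nominal, derive the
    ride-height drop, adjust the nominal CoG height, and estimate mass
    via an effective suspension stiffness k_effective (N/m)."""
    delta_h_est = d_nominal - d_est                      # ride-height drop
    h_cog_est = h_cog_nominal - delta_h_est              # estimated CoG height
    m_est = m_nominal + k_effective * delta_h_est / g    # spring-mass route
    return h_cog_est, m_est
```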
Referring now to
A controller 116 controls operation of the powertrain 104 to achieve a desired amount of drive torque, e.g., based on a driver torque request provided via a driver interface 120. The controller 116 also implements one or more autonomous driving or ADAS features (automated braking, collision avoidance, etc.). The autonomous driving system of the present disclosure therefore generally comprises the controller 116, a LIDAR system or sensor 124, and one or more other sensors 128. Non-limiting examples of these other sensors 128 include lateral and longitudinal acceleration sensors (e.g., gyros) and a steering angle sensor. The LIDAR sensor 124 is configured to emit light pulses and capture reflected light pulses that collectively form a LIDAR point cloud. The controller 116 is also configured to perform at least a portion of the vehicle CoG height and vehicle mass estimation techniques of the present disclosure, which will now be described in greater detail with specific reference to
Referring now to
This function activator 308 determines whether vehicle excitation conditions are present based on signals from the other sensors 128 indicative of strong forces acting on the vehicle 100 (steering angle (δ_SA), longitudinal acceleration (α_x), lateral acceleration (α_y), etc.). When vehicle excitation conditions are not present, the function activator 308 generates an activation signal for function 304. The primary input for function 304 is the LIDAR point cloud data captured by the LIDAR sensor 124. Function block 312 performs filtering and extraction on the LIDAR point cloud to identify the ground plane (e.g., see 234 of
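A compact sketch of the activation logic performed by function activator 308 is given below; the threshold values and signal names are illustrative calibrations assumed for this example and are not specified by the disclosure.

```python
def steady_state(delta_sa, a_x, a_y,
                 steer_thresh=0.05,   # rad, illustrative calibration
                 accel_thresh=0.5):   # m/s^2, illustrative calibration
    """Return True when no excitation condition is present, i.e. when
    the steering angle and the longitudinal/lateral accelerations are
    all below their thresholds, in which case the LIDAR-based
    estimation function (304) may be activated."""
    return (abs(delta_sa) < steer_thresh and
            abs(a_x) < accel_thresh and
            abs(a_y) < accel_thresh)
```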
Referring now to
Referring again to
Each of these individual objects has an individual CoG height (h_CoG,nominal, h_est), and the CoG of the two objects together is located at h_CoG,est = h_CoG,nominal − Δh_est. A formula related to the CoG of composite objects can then be used to extract the relation between the CoG heights of the objects and their masses. Finally, using a parameter identification algorithm (e.g., least squares), the estimated vehicle mass m_est is obtained. In Approach 2, on the other hand, the vehicle is modeled as a spring-mass system. Using known vehicle specifications (i.e., dampening metrics), the spring stiffness can be identified. Adding extra mass then compresses the spring-mass system further. Thus, using the LIDAR point cloud data and the vehicle suspension dampening metrics, the estimated vehicle mass m_est is obtained. The two outputs h_CoG,est and m_est are then utilized as part of one or more autonomous driving features (body control, rollover prevention, etc.).
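For Approach 1, the composite-CoG relation mentioned above can be written as h_CoG,est = (m_nominal·h_CoG,nominal + m_extra·h_extra) / (m_nominal + m_extra), which is linear in the unknown extra mass and can therefore be solved by least squares over a batch of LIDAR-derived CoG height samples. The assumed loading height h_extra and the names below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def mass_from_cog_samples(h_est_samples, m_nominal, h_cog_nominal, h_extra=0.6):
    """Estimate total vehicle mass from LIDAR-derived CoG height samples
    using the composite-CoG relation

        h_est = (m_nominal*h_cog_nominal + m_extra*h_extra) / (m_nominal + m_extra)

    rearranged into the linear form

        m_extra * (h_est - h_extra) = m_nominal * (h_cog_nominal - h_est)

    and solved for m_extra by least squares over all samples.

    h_extra: assumed CoG height of the added load (m), illustrative only.
    """
    h = np.asarray(h_est_samples, dtype=float)
    a = (h - h_extra).reshape(-1, 1)        # regressor per sample
    b = m_nominal * (h_cog_nominal - h)     # observation per sample
    m_extra, *_ = np.linalg.lstsq(a, b, rcond=None)
    return m_nominal + float(m_extra[0])
```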
Referring now to
It will be appreciated that the term “controller” as used herein refers to any suitable control device or set of multiple control devices that is/are configured to perform at least a portion of the techniques of the present disclosure. Non-limiting examples include an application-specific integrated circuit (ASIC), one or more processors and a non-transitory memory having instructions stored thereon that, when executed by the one or more processors, cause the controller to perform a set of operations corresponding to at least a portion of the techniques of the present disclosure. The one or more processors could be either a single processor or two or more processors operating in a parallel or distributed architecture.
It should also be understood that the mixing and matching of features, elements, methodologies and/or functions between various examples is expressly contemplated herein, such that one skilled in the art will appreciate from the present teachings that features, elements and/or functions of one example may be incorporated into another example as appropriate, unless described otherwise above.