The present invention generally relates to vehicle collision systems, and more particularly, to a vehicle collision system configured to detect and mitigate collisions.
Traditional vehicle collision systems are used to warn or otherwise alert a driver of a potential collision with an object or another vehicle. However, these warning systems are typically limited to detecting other vehicles or objects in the host vehicle's forward or reverse trajectory. Objects or other vehicles that pose a collision threat to the sides of a vehicle are generally more difficult to detect.
According to an embodiment of the invention, there is provided a method for use with a vehicle collision system. The method includes detecting one or more object(s) in a host vehicle's field-of-view, calculating time-to-pass estimates for each of the detected object(s), wherein the time-to-pass estimates represent an expected time for a reference plane of the host vehicle to pass a reference plane of the detected object(s), and determining a potential collision between the host vehicle and the one or more detected object(s) based on the time-to-pass estimates.
According to another embodiment of the invention, there is provided a method for use with a vehicle collision system that includes detecting at least one object in a host vehicle's field-of-view, calculating an expected host vehicle path relative to the at least one detected object, and determining a potential for a collision between the at least one detected object and a front, rear, left-side, or right-side of the host vehicle, wherein the potential for collision is based on the expected host vehicle path and an intersection of reference planes relating to the host vehicle and the at least one detected object.
According to yet another embodiment of the invention, there is provided a vehicle collision system having a plurality of sensors configured to identify one or more objects in a host vehicle's field-of-view, and a control module configured to calculate time-to-pass estimates for each of the detected object(s), wherein the time-to-pass estimates represent an expected time for a reference plane of the host vehicle to pass a reference plane of the detected object(s), and determine a potential collision between the host vehicle and the one or more detected object(s) based on the time-to-pass estimates.
One or more embodiments of the invention will hereinafter be described in conjunction with the appended drawings, wherein like designations denote like elements, and wherein:
The exemplary vehicle collision system and method described herein may be used to detect and avoid potential or impending collisions with stationary or moving objects, and in particular, side collisions in relatively low-speed and/or parking scenarios. For purposes of the present application, the term “low speed” means vehicle speeds of 30 mph or less. The disclosed vehicle collision system implements a method that detects objects around the periphery of the vehicle and determines whether there is a potential for collision based on the vehicle's trajectory and the position of the detected objects. In one embodiment, determining potential collisions with the detected objects includes determining a type of potential collision and calculating a corresponding time-to-collision. Based on this information, the system then determines which of the detected objects poses the highest threat, which in one embodiment may take into account both the time-to-collision and the type of potential collision. The highest threat object is then compared to a plurality of thresholds to determine the most appropriate remedial action to avoid or mitigate the collision.
With reference to
According to one example, vehicle collision system 10 employs object detection sensors 14, an inertial measurement unit (IMU) 16, and a control module 18, which in one embodiment is an external object calculating module (EOCM). Object detection sensors 14 may be a single sensor or a combination of sensors, and may include, without limitation, a light detection and ranging (LIDAR) device, a radio detection and ranging (RADAR) device, a vision device (e.g., camera, etc.), a laser diode pointer, or a combination thereof. Object detection sensors 14 may be used alone, or in conjunction with other sensors, to generate readings that represent an estimated position, velocity, and/or acceleration of the detected objects with respect to the host vehicle 12. These readings may be absolute in nature (e.g., a velocity or acceleration of the detected object that is relative to ground) or they may be relative in nature (e.g., a relative velocity reading (Δv), which is the difference between the detected object and host vehicle velocities, or a relative acceleration reading (Δa), which is the difference between the detected object and host vehicle accelerations). Collision system 10 is not limited to any particular type of sensor or sensor arrangement, specific technique for gathering or processing sensor readings, or particular method for providing sensor readings, as the embodiments described herein are simply meant to be exemplary.
Any number of different sensors, components, devices, modules, systems, etc. may provide vehicle collision system 10 with information or input that can be used by the present method. It should be appreciated that object detection sensors 14, as well as any other sensor located in and/or used by collision system 10, may be embodied in hardware, software, firmware, or some combination thereof. These sensors may directly sense or measure the conditions for which they are provided, or they may indirectly evaluate such conditions based on information provided by other sensors, components, devices, modules, systems, etc. Furthermore, these sensors may be directly coupled to control module 18, indirectly coupled via other electronic devices, a vehicle communications bus, network, etc., or coupled according to some other arrangement known in the art. These sensors may be integrated within another vehicle component, device, module, system, etc. (e.g., sensors integrated within an engine control module (ECM), traction control system (TCS), electronic stability control (ESC) system, antilock brake system (ABS), etc.), or they may be stand-alone components (as schematically shown in
As shown in
IMU 16 is an electronic device that measures and reports the vehicle's velocity, orientation, and gravitational forces using a combination of accelerometers, gyroscopes, and/or magnetometers. IMU 16 detects the current rate of acceleration using one or more accelerometers, and detects changes in rotational attributes such as pitch, roll, and yaw using one or more gyroscopes. Some IMUs also include a magnetometer, primarily to assist in calibrating against orientation drift. Angular accelerometers measure how the vehicle is rotating in space. Generally, there is at least one sensor for each of the three axes: pitch (nose up and down), yaw (nose left and right), and roll (clockwise or counter-clockwise from the vehicle cockpit). Linear accelerometers measure non-gravitational accelerations of the vehicle. Since the vehicle can move along three axes (up and down, left and right, forward and back), there is a linear accelerometer for each axis. A computer continually calculates the vehicle's current position. First, for each of the six degrees of freedom (x, y, z and θx, θy, θz), it integrates the sensed acceleration over time, together with an estimate of gravity, to calculate the current velocity. Then it integrates the velocity to calculate the current position.
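The dead-reckoning loop described above can be sketched as follows. This is an illustrative sketch only: the function name, the simple Euler integration scheme, and the data layout are assumptions for demonstration, not part of the disclosed system.

```python
def dead_reckon(state, accel_meas, gravity_est, dt):
    """Advance one IMU step for the six degrees of freedom (x, y, z, θx, θy, θz).

    state       -- dict with 'pos' and 'vel' lists of six floats
    accel_meas  -- sensed accelerations for the six axes
    gravity_est -- estimated gravity contribution per axis
    dt          -- time step in seconds
    """
    for i in range(6):
        # Remove the gravity estimate, then integrate acceleration to velocity...
        state['vel'][i] += (accel_meas[i] - gravity_est[i]) * dt
        # ...and integrate velocity to position.
        state['pos'][i] += state['vel'][i] * dt
    return state
```

For example, a vehicle accelerating at 2 m/s² along x for one second (in 0.01 s steps) would accumulate a velocity of about 2 m/s along that axis.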
Control module 18 may include any variety of electronic processing devices, memory devices, input/output (I/O) devices, and/or other known components, and may perform various control and/or communication related functions. Depending on the particular embodiment, control module 18 may be a stand-alone vehicle electronic module (e.g., an object detection controller, a safety controller, etc.), it may be incorporated or included within another vehicle electronic module (e.g., a park assist control module, brake control module, etc.), or it may be part of a larger network or system (e.g., a traction control system (TCS), electronic stability control (ESC) system, antilock brake system (ABS), driver assistance system, adaptive cruise control system, lane departure warning system, etc.), to name a few possibilities. Control module 18 is not limited to any one particular embodiment or arrangement.
For example, in an exemplary embodiment control module 18 is an external object calculating module (EOCM) that includes an electronic memory device that stores various sensor readings (e.g., inputs from object detection sensors 14 and position, velocity, and/or acceleration readings from IMU 16), look up tables or other data structures, algorithms, etc. The memory device may also store pertinent characteristics and background information pertaining to vehicle 12, such as information relating to stopping distances, deceleration limits, temperature limits, moisture or precipitation limits, driving habits or other driver behavioral data, etc. EOCM 18 may also include an electronic processing device (e.g., a microprocessor, a microcontroller, an application specific integrated circuit (ASIC), etc.) that executes instructions for software, firmware, programs, algorithms, scripts, etc. that are stored in the memory device and may govern the processes and methods described herein. EOCM 18 may be electronically connected to other vehicle devices, modules and systems via suitable vehicle communications and can interact with them when required. These are, of course, only some of the possible arrangements, functions and capabilities of EOCM 18, as other embodiments could also be used.
Turning now to
At step 106, the vehicle's expected path is calculated based on data received from various vehicle components, such as, for example, the IMU 16, the throttle pedal sensor, the brake pedal sensor, and the steering wheel angle sensor. At step 108, preliminary assessments are made to determine the potential for collisions with the detected objects using the vehicle's expected path and the sensor data. In one embodiment, the assessments include coarse estimates regarding the vehicle's expected path and the current position of the detected objects. Based on these coarse estimates, the system determines whether a potential for a collision exists between the host vehicle 12 and the detected objects. If there is no potential for a collision with any of the detected objects, the process returns to referencing the sensor data at step 104. If there is a potential for a collision with one or more detected objects, at step 110 the system initiates a preliminary threat assessment, which includes determining a time-to-collision for each potential collision. The system determines which of the detected objects poses the highest threat and calculates a final time-to-collision between the host vehicle 12 and the highest threat object. In one embodiment, the highest threat object is the object having the lowest time-to-collision. In other words, it is the first object likely to collide with the host vehicle 12 based on the relationship between the position, movement, and trajectory of both the vehicle 12 and the detected object. In other embodiments, the highest threat object is determined based on a combination of the time-to-collision and the type of collision.
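The threat-ranking step above can be sketched as a simple selection over per-object time-to-collision (TTC) values. The optional type weighting mentioned for other embodiments is shown here as a hypothetical per-type factor; the function name and data shapes are assumptions for illustration.

```python
def highest_threat(objects, type_weight=None):
    """Select the highest threat object.

    objects     -- list of (object_id, ttc_seconds, collision_type) tuples
    type_weight -- optional dict mapping collision type to a weight; lower
                   weighted score means higher threat (assumed form, not
                   specified in the source)
    """
    def score(entry):
        _, ttc, ctype = entry
        w = type_weight.get(ctype, 1.0) if type_weight else 1.0
        return ttc * w
    # The highest threat object is the one with the lowest (weighted) TTC.
    return min(objects, key=score)
```

With no weighting, this simply picks the object with the lowest time-to-collision, matching the first embodiment described above.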
In one particular embodiment of step 108, the collision assessment includes determining whether there is a potential for collision between the host vehicle 12 and the detected objects, and also, the type of potential collision. An exemplary method for implementing the collision assessment of step 108 is described below with reference to the flow chart illustrated in
An exemplary visual representation of the TTP estimates is shown in
The TTP estimates are calculated for parallel reference planes between the host vehicle 12 and the detected object 34. In other words, TTP estimates are calculated from the front and rear planes 36, 38 of the host vehicle 12 to each of the first-surface and third-surface planes 44, 48 of the detected object 34, and from the left-side and right-side planes 40, 42 of the host vehicle 12 to each of the second-surface and fourth-surface planes 46, 50 of the detected object 34. The TTP estimates for each of these parallel planes are shown in
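One simple way to compute a TTP estimate for a pair of parallel planes is to divide the gap between them by the rate at which it is closing. The source does not give an explicit formula, so the constant-closing-speed assumption and the function below are illustrative only.

```python
import math

def time_to_pass(host_plane_pos, object_plane_pos, closing_speed):
    """Estimate the time (seconds) for a host reference plane to pass a
    parallel plane of a detected object.

    host_plane_pos / object_plane_pos -- positions of the two parallel
        planes measured along their shared normal (meters)
    closing_speed -- relative speed at which the gap shrinks (m/s);
        assumed constant for this sketch

    Returns math.inf when the planes are not closing.
    """
    gap = object_plane_pos - host_plane_pos
    if closing_speed <= 0.0:
        return math.inf  # planes are separating or holding distance
    return gap / closing_speed
```

For example, two parallel planes 10 m apart closing at 5 m/s yield a TTP estimate of 2 seconds.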
Referring again to the flowchart in
At step 108c, the types of potential collisions are determined based on the TTP estimates. By way of example, a collision between the detected object 34 and the front of the host vehicle 12 (i.e., a front collision) is determined likely to occur when the front surface of the host vehicle 12 will cross the nearest parallel surface of the detected object 34 after a side surface of the host vehicle 12 crosses the nearest lateral plane of the detected object 34 (i.e., the nearest plane of the detected object that is parallel to the side surface plane of the host vehicle), but before the far side surface of the host vehicle 12 crosses the furthest lateral plane of the detected object. Stated another way, a front collision occurs when the time expected for the front plane 36 of the host vehicle 12 to cross the plane of the nearest parallel surface of the detected object 34 is greater than the time expected for a side surface plane of the host vehicle 12 to cross the plane of the nearest side surface of the detected object 34, but less than the time expected for the opposite side surface plane of the host vehicle 12 to cross the plane of the furthest side surface of the detected object 34.
Using a specific example and with reference to the scenario shown in
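The front-collision condition described above reduces to a simple interval test on the TTP estimates. The sketch below implements only the front case that the description states explicitly; the rear and side cases, summarized in the table, are not reproduced here. All names are illustrative assumptions.

```python
def is_front_collision(t_front_cross, t_near_side_cross, t_far_side_cross):
    """Front-collision test per the description.

    t_front_cross     -- TTP for the host front plane to cross the nearest
                         parallel plane of the detected object
    t_near_side_cross -- TTP for a host side plane to cross the nearest
                         lateral plane of the detected object
    t_far_side_cross  -- TTP for the opposite host side plane to cross the
                         furthest lateral plane of the detected object

    A front collision is indicated when the front-plane crossing falls
    strictly between the two side-plane crossings.
    """
    return t_near_side_cross < t_front_cross < t_far_side_cross
```

For instance, if the side planes cross at 1 s and 3 s and the front plane crosses at 2 s, a front collision is indicated; a front-plane crossing at 0.5 s (before entering the object's lateral extent) is not.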
The collision assessments relative to the front, rear, and side surfaces of the host vehicle 12, and in particular, the TTP estimate conditions for determining the type of collision with respect to the host vehicle 12, are summarized in the table below.
One of ordinary skill in the art appreciates that the reference frames described above with respect to
Referring back to
If the time-to-collision for the highest threat object is less than or equal to the steering action threshold, at step 118 the system determines a steering maneuver to avoid the collision with the highest threat object. The steering maneuver is determined in part based on the relationship between the position, movement, and trajectory of both the vehicle 12 and the detected object. In one embodiment, step 118 may also include sending a brake pulse command as a haptic indicator to the driver prior to commanding the steering maneuver. Prior to initiating the calculated steering maneuver, at step 120 the system evaluates the vehicle's new trajectory to determine if any objects are in the new path of the vehicle 12. If there are objects in the new path that continue to pose a potential for collision, the process returns to step 114 and initiates an emergency braking feature by sending a command to the electronic brake control module to decelerate and stop the vehicle. If there are no objects in the new path, then at step 122 a steering request command is sent to a power steering module (not shown) to execute the steering maneuver to avoid the collision. Thereafter, the process returns to step 102 to continually check if the remedial action and/or external conditions have changed.
Referring back to step 116, if the time-to-collision for the highest threat object is not less than or equal to the steering action threshold, at step 124 the time-to-collision for the highest threat object is compared to a warning action threshold. If the time-to-collision for the highest threat object is less than or equal to the warning action threshold, at step 126 an alert is sent to the instrument panel cluster (not shown) warning the vehicle occupants of the potential collision. The alert can be, without limitation, a message via the instrument panel cluster, audible alerts, haptic alerts, and/or brake pulses.
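The threshold cascade of steps 116 through 126 can be sketched as a small decision function. The threshold values, function name, and the boolean new-path check are assumptions for illustration; the source describes the ordering of the checks but not concrete values.

```python
def remedial_action(ttc, steering_threshold, warning_threshold,
                    objects_in_new_path):
    """Select a remedial action for the highest threat object.

    ttc -- final time-to-collision for the highest threat object (seconds)
    steering_threshold / warning_threshold -- action thresholds (seconds);
        values are assumed, with the steering threshold the more urgent
    objects_in_new_path -- whether the evaluated avoidance trajectory is
        still blocked (step 120)
    """
    if ttc <= steering_threshold:
        # Steps 118-122: plan a steering maneuver; if the new path is still
        # blocked, fall back to emergency braking (step 114).
        return "emergency_brake" if objects_in_new_path else "steer"
    if ttc <= warning_threshold:
        # Step 126: alert the occupants (cluster message, audible/haptic
        # alerts, and/or brake pulses).
        return "warn"
    return "monitor"
```

The steering check runs first because it represents the most imminent case; a time-to-collision above both thresholds leaves the system monitoring.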
It is to be understood that the foregoing is a description of one or more embodiments of the invention. The invention is not limited to the particular embodiment(s) disclosed herein, but rather is defined solely by the claims below. Furthermore, the statements contained in the foregoing description relate to particular embodiments and are not to be construed as limitations on the scope of the invention or on the definition of terms used in the claims, except where a term or phrase is expressly defined above. Various other embodiments and various changes and modifications to the disclosed embodiment(s) will become apparent to those skilled in the art. All such other embodiments, changes, and modifications are intended to come within the scope of the appended claims.
As used in this specification and claims, the terms “e.g.,” “for example,” “for instance,” “such as,” and “like,” and the verbs “comprising,” “having,” “including,” and their other verb forms, when used in conjunction with a listing of one or more components or other items, are each to be construed as open-ended, meaning that the listing is not to be considered as excluding other, additional components or items. Other terms are to be construed using their broadest reasonable meaning unless they are used in a context that requires a different interpretation.