The present disclosure relates to systems and methods for adjusting driver assist functions based on a distraction level of a driver.
The background description provided here is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Modern vehicle control systems perform driver assist functions such as autonomous emergency braking, forward collision warning, adaptive cruise control, lane departure warning, lane keep assist, lane centering control, lane change warning, lane change assist, front cross traffic alert and/or braking, and highway pilot. Autonomous emergency braking involves applying a brake actuator to decelerate a vehicle to avoid or mitigate an impact with an object ahead of the vehicle. Forward collision warning involves notifying a driver of a potential impact with an object ahead of the vehicle.
Adaptive cruise control involves adjusting the speed of a vehicle to maintain a set speed selected by the driver and a following distance or time gap to another vehicle. Lane departure warning involves notifying a driver when a vehicle is about to depart or has departed from a lane in which the vehicle is travelling without the intention of the driver. Lane keep assist involves adjusting the path of a vehicle to prevent the vehicle from departing from a lane in which the vehicle is travelling without the intention of the driver.
Lane centering control involves adjusting the path of a vehicle to maintain the centerline of the vehicle aligned with the centerline of a lane in which the vehicle is travelling. Lane change warning involves notifying a driver of an object within a future path of a vehicle during a lane change, such as in a blind spot of the vehicle. Lane change assist involves adjusting the speed and path of a vehicle to perform a highway lane change maneuver when instructed to do so by a driver.
Front cross traffic alert involves notifying a driver of a vehicle when another vehicle is crossing a forward path of the vehicle where a potential impact is imminent. Front cross traffic alert with braking involves applying a brake actuator to decelerate a vehicle when another vehicle is crossing a forward path of the vehicle and a potential impact is imminent. Highway pilot involves controlling brake, acceleration, and steering actuators to adjust the speed and path of a vehicle to complete a portion of a route selected by the driver that takes place on a designated highway or other geo-fenced region.
The present disclosure will become more fully understood from the detailed description and the accompanying drawings, wherein:
In the drawings, reference numbers may be reused to identify similar and/or identical elements.
Driver assist functions such as those discussed above serve several purposes. Some driver assist functions perform tasks of a driver to reduce the amount of effort required to drive a vehicle. These tasks are typically level one or two functions under the level definitions provided by the Society of Automotive Engineers (SAE). Some driver assist functions notify a driver of a potential impact with an object in case the driver does not observe the object. These tasks are typically level zero functions under the SAE level definitions. Some driver assist functions control a vehicle actuator to cause a vehicle to take a corrective action that avoids or mitigates an impact. These tasks are typically level zero, one, or two functions under the SAE level definitions.
Driver assist functions that perform the tasks of a driver may be designed with the assumption that the driver is paying attention to the driving tasks while the driver assist functions are performed. Thus, if a driver becomes distracted while these driver assist functions are performed, the number of redundant safety measures may be less than intended. Driver assist functions that notify a driver of a potential impact and/or control a vehicle actuator to avoid or mitigate an impact are most beneficial when the driver is not paying attention to the vehicle surroundings. However, these driver assist functions may become annoying to a driver that is paying attention, prompting the driver to disable these driver assist functions and thereby eliminating the safety benefits of these driver assist functions.
A vehicle control system according to the present disclosure addresses these issues by adjusting driver assist functions based on a distraction level of a driver. In one example, when the driver distraction level is medium or high, the system disables driver assist functions that perform driving tasks. In turn, the driver is forced to either pay attention to the driving tasks being performed by the vehicle control system or perform those driving tasks himself/herself. In another example, when the driver distraction level is low, the system adjusts driver assist functions that notify a driver of a potential impact by delaying or disabling the notifications or reducing the intensity of the notifications. This reduces the likelihood that the driver will become annoyed with the notifications and therefore disable these driver assist functions. In another example, when the driver distraction level is low, the system adjusts driver assist functions that control a vehicle actuator to avoid or mitigate an impact by delaying, disabling, and/or reducing the magnitude of the corrective actions. This reduces the likelihood that the driver will become annoyed with the corrective actions and therefore disable these driver assist functions. In another example, when the driver distraction level is high, the system increases the aggressiveness of one or more driver assist functions that control a vehicle actuator in order to prevent unintended driving maneuvers or impacts such as rear end collisions.
Referring now to
The user interface device 16 generates visible messages (e.g., text, images), audible messages, and/or tactile messages (e.g., haptic feedback) providing information to the driver of the vehicle 10. The user interface device 16 may include an electronic display (e.g., a touchscreen) operable to display a visible message and/or to generate electronic signals in response to a user input (e.g., a user touching the touchscreen). In addition, the user interface device 16 may include a heads-up display operable to project a visible message onto a windshield (not shown) of the vehicle 10. Further, the user interface device 16 may include one or more vibrators mounted to, for example, a steering wheel (not shown) and/or the driver's seat (not shown) to provide haptic feedback to the driver. Moreover, the user interface device 16 may include a speaker operable to generate a sound or audible message within the cabin 14, and/or a microphone operable to receive verbal commands from the driver.
The steering actuator 18 is operable to steer the vehicle 10 by moving linkages attached to front wheels (not shown) of the vehicle 10 and thereby turn the vehicle 10 right or left. The steering actuator 18 may be an electronically controlled steering gear. The acceleration actuator 20 is operable to accelerate the vehicle 10 by increasing the torque output of an engine (not shown) of the vehicle 10 and/or an electric motor (not shown) of the vehicle 10. Additionally or alternatively, the acceleration actuator 20 may accelerate the vehicle 10 by downshifting a transmission (not shown) of the vehicle 10. The acceleration actuator 20 may be or include an electronically controlled throttle, a motor control module, and/or a transmission gear actuator. The brake actuator 22 is operable to decelerate the vehicle 10 by rubbing against a component mounted to the front wheels of the vehicle 10 and/or rear wheels (not shown) of the vehicle 10. The brake actuator 22 may be or include one or more friction brakes (e.g., disc brakes).
The vehicle control module 24 performs several driver assist functions to assist the driver of the vehicle 10 based on inputs from sensors and cameras on the vehicle 10. The driver assist functions performed by the vehicle control module 24 include autonomous emergency braking (AEB), forward collision warning (FCW), adaptive cruise control (ACC), lane departure warning (LDW), lane keep assist (LKA), lane centering control (LCC), lane change warning (LCW), lane change assist (LCA), front cross traffic alert (FCTA), front cross traffic alert with braking (FCTB), rear cross traffic alert (RCTA), rear cross traffic alert with braking (RCTB) and highway pilot (HP). AEB involves applying the brake actuator 22 to decelerate the vehicle 10 to avoid or mitigate an impact with an object ahead of the vehicle 10. FCW involves controlling the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver of a potential impact with an object ahead of the vehicle 10.
ACC involves controlling the acceleration and brake actuators 20 and 22 to adjust the speed of the vehicle 10 to maintain a set speed selected by the driver and a following distance to another vehicle. LDW involves controlling the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver when the vehicle 10 is about to depart or has departed from a lane in which the vehicle 10 is travelling. LKA involves controlling the steering actuator 18 to maintain the vehicle 10 within the lane in which the vehicle 10 is travelling. LCC involves controlling the steering actuator 18 to maintain the centerline of the vehicle 10 aligned with the centerline of the lane. LCW involves controlling the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver of an object in a future path of the vehicle 10 during a lane change, such as in a blind spot of the vehicle 10. LCA involves controlling the steering, acceleration and brake actuators 18, 20, and 22 to adjust the speed and path of the vehicle 10 to perform a lane change maneuver when instructed to do so by the driver.
FCTA involves controlling the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver when another vehicle is crossing a forward path of the vehicle 10 and an impact is imminent. FCTB involves applying the brake actuator 22 to decelerate the vehicle 10 when another vehicle is crossing a forward path of the vehicle 10 and an impact is imminent. HP involves controlling the steering, acceleration, and brake actuators 18, 20, and 22 to adjust the speed and path of the vehicle 10 to complete a portion of a route selected by the driver that takes place in a geo-fenced area or access-controlled highway. RCTA involves controlling the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver when another vehicle is crossing a rearward path of the vehicle 10 and an impact is imminent. RCTB involves applying the brake actuator 22 to decelerate the vehicle 10 when another vehicle is crossing a rearward path of the vehicle 10 and an impact is imminent.
The sensors include a left front short range radar (SRR) sensor 34, a right front SRR sensor 36, a left rear SRR sensor 38, a right rear SRR sensor 40, and a long range radar (LRR) sensor 42. The SRR sensors 34, 36, 38, 40 and the LRR sensor 42 detect objects near the vehicle 10. The left front SRR sensor 34 detects objects that are both within a field of view 35 of the left front SRR sensor 34 and within a certain range or distance of the left front SRR sensor 34. The right front SRR sensor 36 detects objects that are both within a field of view 37 of the right front SRR sensor 36 and within a certain range or distance of the right front SRR sensor 36.
The left rear SRR sensor 38 detects objects that are both within a field of view 39 of the left rear SRR sensor 38 and within a certain range or distance of the left rear SRR sensor 38. The right rear SRR sensor 40 detects objects that are both within a field of view 41 of the right rear SRR sensor 40 and within a certain range or distance of the right rear SRR sensor 40. The LRR sensor 42 detects objects that are both within a field of view 43 of the LRR sensor 42 and within a certain range or distance of the LRR sensor 42. As their names indicate, the distances or ranges within which the SRR sensors 34, 36, 38, 40 are capable of detecting objects may be shorter than the distance or range within which the LRR sensor 42 is capable of detecting objects.
The SRR sensors 34, 36, 38, 40 and the LRR sensor 42 may detect objects near the vehicle 10 by emitting a radar wave and identifying the presence of an object when the radar wave bounces off the object and is received by the sensor that emitted it. The SRR sensors 34, 36, 38, 40 and the LRR sensor 42 may also output the period between the time when the radar wave is transmitted and the time when the radar wave is received by the sensor that emitted it. This period may be used to determine the distance between the vehicle 10 and objects detected by the SRR sensors 34, 36, 38, 40 and the LRR sensor 42. In various implementations, the vehicle 10 may include one or more lidar sensors (not shown) and/or one or more sonar sensors (not shown) in place of or in addition to the SRR sensors 34, 36, 38, 40 and/or the LRR sensor 42. The vehicle control module 24 may use inputs from the lidar and/or sonar sensors in the same way that the vehicle control module 24 uses inputs from the SRR sensors 34, 36, 38, 40 and the LRR sensor 42.
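The round-trip-time-to-distance relationship described above can be sketched as follows. This is an illustrative computation only, not the sensors' actual firmware; the function name and units are assumptions:

```python
# Speed of light in meters per second (radar waves propagate at
# approximately this speed in air).
SPEED_OF_LIGHT_M_S = 299_792_458.0

def radar_range_m(round_trip_time_s: float) -> float:
    """Estimate the distance to a detected object from the period between
    radar wave transmission and receipt. The wave travels to the object
    and back, so the one-way distance is half the total path length."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a round-trip time of two microseconds corresponds to an object roughly 300 meters away, a range plausible for a long range radar sensor but beyond a short range one.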
The cameras include a front camera 44, a rear camera 46, and a cabin camera 48. The front camera 44 captures a digital image of an area in front of the vehicle 10 within a field of view 45 of the front camera 44. The rear camera 46 captures a digital image of an area to the rear of the vehicle 10 within a field of view 47 of the rear camera 46. The cabin camera 48 captures a digital image of an area in the cabin 14 that is within a field of view 49 of the cabin camera 48.
The vehicle control module 24 also determines a distraction level of the driver of the vehicle 10 and adjusts the driver assist functions based on the driver distraction level. The vehicle control module 24 determines the driver distraction level based on the image captured by the cabin camera 48. For example, the vehicle control module 24 may identify features of the driver in the image such as the driver's eyes and hands, and determine the driver distraction level based on whether the driver's features are positioned, oriented, and moving in a manner consistent with attentive driving.
Referring now to
The lane boundary position module 50 may use a predetermined relationship between the number of pixels in the image and distance to determine the lane boundary positions relative to the vehicle 10 based on the locations of the lane boundaries in the image. The predetermined relationship may be obtained by positioning the vehicle 10 at known distances from lane boundaries and observing the number of pixels between the vehicle 10 and the lane boundaries in the image captured by the front camera 44. The predetermined relationship may be a two-dimensional coordinate system that relates the positions of pixels in the image captured by the front camera 44 to fore-aft and side-to-side distances between the vehicle 10 and the object depicted by the pixels.
The object position module 52 determines the positions of objects relative to the vehicle 10. The object position module 52 may determine the positions of objects ahead of the vehicle 10 and/or crossing a forward path of the vehicle 10 based on input(s) from the left front SRR sensor 34, the right front SRR sensor 36, the LRR sensor 42, and/or the front camera 44. The object position module 52 may determine the positions of objects rearward of the vehicle 10 and/or crossing a rearward path of the vehicle 10 based on input(s) from the left rear SRR sensor 38, the right rear SRR sensor 40, and/or the rear camera 46. Additionally or alternatively, the object position module 52 may determine the positions of objects rearward of the vehicle 10 and/or crossing a rearward path of the vehicle 10 based on input(s) from the sonar sensors of the vehicle 10. The object position module 52 outputs the positions of objects relative to the vehicle 10.
The object position module 52 may determine the position of an object detected by one of the SRR sensors 34, 36, 38, 40 or the LRR sensor 42 based on the period between the time when a radar wave is emitted from the sensor that detected the object and the time when the radar wave is received by that sensor. The object position module 52 may determine the position of the object based on this period using a predetermined relationship between (i) the time elapsed between radar wave transmission and receipt and (ii) distance. In addition, the object position module 52 may determine the position of an object detected by one of the SRR sensors 34, 36, 38, 40 or the LRR sensor 42 based on the direction in which the radar wave is emitted from the sensor that detected the object.
The object position module 52 may identify objects in the images captured by the front camera 44 and/or the rear camera 46 using edge detection techniques, and determine the positions of the objects relative to the vehicle 10 based on locations of the objects in the images. For example, the object position module 52 may use a predetermined relationship between the number of pixels in the images and distance to determine the object positions relative to the vehicle 10 based on the locations of the objects in the images. The predetermined relationship may be obtained by positioning the vehicle 10 at known distances from objects and observing the number of pixels between the vehicle 10 and the objects in the images captured by the front camera 44 and/or the rear camera 46. The predetermined relationship may be a two-dimensional coordinate system that relates the positions of pixels in the images captured by the front camera 44 and/or the rear camera 46 to fore-aft and side-to-side distances between the vehicle 10 and the objects depicted by the pixels.
The driver distraction module 54 determines the distraction level of the driver of the vehicle 10 based on the image captured by the cabin camera 48. Examples of how the driver distraction module 54 may determine the driver distraction level based on the image captured by the cabin camera 48 are described below with respect to
The driver assist module 56 performs the driver assist functions that are performed by the vehicle control module 24. Thus, the driver assist module 56 performs the AEB, FCW, ACC, LDW, LKA, LCC, LCW, LCA, FCTA, FCTB, RCTA, RCTB, and HP functions. The driver assist module 56 performs the driver assist functions based on the lane boundary positions from the lane boundary position module 50 and the object position(s) from the object position module 52. The driver assist module 56 also adjusts the driver assist functions based on the driver distraction level from the driver distraction module 54. Examples of how the driver assist module 56 may perform the driver assist functions and adjust the driver assist functions based on the driver distraction level are described below with respect to
The example implementation of the vehicle control module 24 shown in
The steering control module 60 controls the steering actuator 18 to steer the vehicle 10. The steering control module 60 controls the steering actuator 18 by sending an electronic signal to the steering actuator 18 indicating whether the steering actuator 18 is to steer the vehicle 10 left or right, as well as the amount by which the steering actuator 18 is to steer the vehicle 10. In an example of the latter, the signal output by the steering control module 60 may indicate a target torque output of the steering actuator 18.
The brake control module 62 controls the brake actuator 22 to decelerate the vehicle 10. The brake control module 62 may control the brake actuator 22 by sending an electronic signal to the brake actuator 22 instructing the brake actuator 22 to decelerate the vehicle 10. The signal output from the brake control module 62 to the brake actuator 22 may also indicate the rate by which the brake actuator 22 is to decelerate the vehicle 10.
The acceleration control module 64 controls the acceleration actuator 20 to accelerate the vehicle 10. The acceleration control module 64 may control the acceleration actuator 20 by sending an electronic signal to the acceleration actuator 20 instructing the acceleration actuator 20 to accelerate the vehicle 10. The signal output from the acceleration control module 64 to the acceleration actuator 20 may also indicate the rate by which the acceleration actuator 20 is to accelerate the vehicle 10.
The driver assist module 56 performs the driver assist functions by sending instructions to the UID control module 58, the steering control module 60, the brake control module 62, and the acceleration control module 64. For example, the driver assist module 56 performs the driver assist functions that involve notifying a driver of a potential impact by sending an instruction (e.g., a signal) to the UID control module 58 to control the user interface device 16 to generate a visible, audible, and/or tactile message. In another example, the driver assist module 56 performs the driver assist functions that involve controlling a vehicle actuator to avoid or mitigate an impact by sending an instruction to the steering control module 60, the brake control module 62, and/or the acceleration control module 64. In another example, the driver assist module 56 performs the driver assist functions that involve controlling a vehicle actuator to perform driving tasks by sending an instruction to the steering control module 60, the brake control module 62, and/or the acceleration control module 64. In various implementations, the driver assist module 56 may include, or perform the functions of, the lane boundary position module 50, the object position module 52, the UID control module 58, the steering control module 60, the brake control module 62, and the acceleration control module 64.
Referring now to
The driver distraction module 54 may detect edges of the features or items in the image captured by the cabin camera 48, and determine the shapes and sizes of the features or items based on the detected edges. The driver distraction module 54 may compare the determined shapes and sizes to pairs of predetermined shapes and sizes corresponding to features such as eye pupils, eyelids, and hands. If the determined shapes and sizes are within a predetermined range of one of the pairs of predetermined shapes and sizes, the driver distraction module 54 may determine that the feature or item detected in the image is the feature or item corresponding to that pair.
The driver distraction module 54 may determine sizes of features or items in the image captured by the cabin camera 48, and the amount by which the features or items move, using a predetermined relationship between pixels in the image and distance. The predetermined relationship may be obtained by placing an object of a known size in the expected location of the driver's features, determining the number of pixels in the image that correspond to the object, and comparing that number to the size of the object. The driver distraction module 54 may observe the magnitude of movements of the features or items over time in order to determine the speed of the movements.
At 74, the driver distraction module 54 determines whether the driver's eyes are focused on a road on which the vehicle 10 is travelling by, for example, determining whether the pupils of the driver's eyes are oriented toward the road. The driver distraction module 54 may adjust the driver distraction level to a lower level when the driver's eyes are focused on the road and adjust the driver distraction level to a higher level when the driver's eyes are not focused on the road. In various implementations, the driver distraction module 54 may determine whether the driver's eyes are focused on an object that is within a possible future path of the vehicle 10 and adjust the driver distraction level based on that determination. For example, the driver distraction module 54 may adjust the driver distraction level to a lower level when the driver's eyes are focused on the object and adjust the driver distraction level to a higher level when the driver's eyes are not focused on the object.
The driver distraction module 54 may determine whether the pupils of the driver's eyes are pointing toward the road based on the shape and/or orientation of the pupils in the image captured by the cabin camera 48. In various implementations, the vehicle 10 may include multiple ones of the cabin camera 48 to obtain images of the cabin 14 from different perspectives, and the driver distraction module 54 may generate a three-dimensional (3D) image of the cabin 14 based on the images. In these implementations, the driver distraction module 54 may determine whether the pupils of the driver's eyes are pointing toward the road based on the 3D orientation of the pupils.
At 76, the driver distraction module 54 determines whether the driver's eyes are blinking slowly, which may indicate that the driver is drowsy. The driver distraction module 54 may adjust the driver distraction level to a lower level when the driver's eyes are not blinking slowly and adjust the driver distraction level to a higher level when the driver's eyes are blinking slowly. The driver distraction module 54 may determine whether the driver's eyes are blinking slowly by determining the speed at which the driver's eyelids move and comparing that speed to a predetermined speed for normal blinking. If the determined speed is less than the predetermined speed, the driver distraction module 54 determines that the driver's eyes are blinking slowly. Otherwise, the driver distraction module 54 determines that the driver's eyes are not blinking slowly.
At 78, the driver distraction module 54 determines whether the driver's hands are on the steering wheel of the vehicle 10. The driver distraction module 54 may adjust the driver distraction level to a lower level when the driver's hands are on the steering wheel and adjust the driver distraction level to a higher level when one or both of the driver's hands is/are not on the steering wheel. The driver distraction module 54 may identify the steering wheel in the image captured by the cabin camera 48 by comparing sizes, shapes, and locations of features or items in the image to a predetermined shape, size, and location of the steering wheel. The driver distraction module 54 may then determine whether objects in contact with and/or wrapped around the steering wheel correspond to the driver's hands and, if so, determine that the driver's hands are on the steering wheel.
At 80, the driver distraction module 54 determines whether the driver's hands are holding a distracting item such as a cell phone or a cup. The driver distraction module 54 may adjust the driver distraction level to a lower level when the driver's hands are not holding a distracting item and adjust the driver distraction level to a higher level when the driver's hands are holding a distracting item. In various implementations, the driver distraction module 54 may use determinations made at 74 and 80 to determine whether the driver is exhibiting distracted behavior and increase the driver distraction level when that is the case. For example, if the driver's eyes are not focused on the road and the driver is holding a cell phone, the driver distraction module 54 may determine that the driver is using the cell phone and increase the driver distraction level.
At 82, the driver distraction module 54 determines the driver distraction level. The driver distraction module 54 may adjust the driver distraction level at 74, 76, 78, and 80, or determine the driver distraction level after 74, 76, 78, and 80 are completed. The driver distraction module 54 may set the driver distraction level to a qualitative level (e.g., low, medium, high) or a quantitative level (e.g., an integer between 0 and 10) based on the determinations made at 74, 76, 78, and/or 80. In one example, the driver distraction module 54 increases the driver distraction level by 0 or 1 based on the determination made at each of 74, 76, 78, and 80 (0 when the determination indicates attentive behavior and 1 when the determination indicates distracted or drowsy behavior). In this example, the driver distraction level may range from 0 to 4. The method ends at 84. The driver distraction module 54 may repeatedly perform the method of
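The scoring scheme described in the example above can be sketched as follows. The function names and the cutoffs used to map the numeric score to qualitative levels are assumptions for illustration:

```python
def distraction_score(eyes_on_road: bool, blinking_normally: bool,
                      hands_on_wheel: bool, hands_free_of_items: bool) -> int:
    """Add 1 for each check (74, 76, 78, 80) that indicates distracted or
    drowsy behavior. The result ranges from 0 (attentive on every check)
    to 4 (distracted on every check)."""
    checks = (eyes_on_road, blinking_normally, hands_on_wheel, hands_free_of_items)
    return sum(0 if attentive else 1 for attentive in checks)

def qualitative_level(score: int) -> str:
    """Map the 0-4 score to a qualitative level (cutoffs are assumptions)."""
    if score <= 1:
        return "low"
    if score == 2:
        return "medium"
    return "high"
```

A driver whose eyes are off the road and who is holding a cell phone, but whose blinking is normal and who has one hand on the wheel, would score 2 under this sketch and be classified as medium.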
Referring now to
At 94, the driver assist module 56 determines a period until the vehicle 10 is likely to impact the object based on the position of the object relative to the vehicle 10 and the speed of the vehicle 10. For example, the driver assist module 56 may divide the distance between the vehicle 10 and the object by the speed of the vehicle 10 to obtain the period to impact. The driver assist module 56 may execute 94 when performing the AEB or FCW functions and not execute 94 when performing the FCTA, FCTB, RCTA, or RCTB functions.
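The period-to-impact computation at 94 can be sketched as distance divided by closing speed. This is a minimal illustration assuming a constant closing speed; the function name and the handling of a non-closing object are assumptions:

```python
def period_to_impact_s(distance_m: float, closing_speed_m_s: float) -> float:
    """Period until impact, assuming the gap closes at a constant speed.
    Returns infinity when the vehicle is not closing on the object, so
    any finite threshold comparison evaluates to no impact risk."""
    if closing_speed_m_s <= 0.0:
        return float("inf")
    return distance_m / closing_speed_m_s
```

For example, an object 30 meters ahead with a closing speed of 15 meters per second yields a period to impact of 2 seconds.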
At 96, the driver assist module 56 determines whether the period to impact is less than a first threshold period. The driver assist module 56 may make this determination when performing the AEB or FCW functions. When the driver assist module 56 is performing the FCTA or FCTB functions, the driver assist module 56 may simply determine whether an object is crossing the forward path of the vehicle 10 at 96. When the driver assist module 56 is performing the RCTA or RCTB functions, the driver assist module 56 may simply determine whether an object is crossing the rearward path of the vehicle 10 at 96. If the period to impact is less than the first threshold period and/or an object is crossing the forward path of the vehicle 10, the method continues at 98. Otherwise, the method continues at 100.
At 98, the driver assist module 56 controls the user interface device 16 to generate a visible, audible, and/or tactile message notifying the driver of a potential impact with the object. The driver assist module 56 may control the user interface device 16 directly or by sending an instruction to the UID control module 58. At 100, the driver assist module 56 does not control the user interface device 16 to generate such a message.
At 102, the driver assist module 56 determines whether the period to impact is less than a second threshold period. The second threshold period may be less than the first threshold period. If the period to impact is less than the second threshold period, the method continues at 104. Otherwise, the method continues at 106. At 104, the driver assist module 56 applies the brake actuator 22 and/or pressurizes a hydraulic brake system (not shown) of the vehicle 10 (e.g., increases the pressure of hydraulic fluid in the brake system to an operating pressure). The driver assist module 56 may pressurize the brake system to decrease the response time of the brake actuator 22 when the brake actuator 22 is applied. The driver assist module 56 may pressurize the brake system by activating a pump of the brake system for a brief period (e.g., less than one second). The driver assist module 56 may control the brake actuator 22 and the brake system directly or by sending an instruction to the brake control module 62. At 106, the driver assist module 56 does not apply the brake actuator 22 or pressurize the brake system.
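The tiered response at 96 through 106 can be sketched as two threshold comparisons on the same period to impact: the looser first threshold triggers a warning, and the tighter second threshold triggers braking. The function name and return structure are assumptions:

```python
def escalation(period_to_impact_s: float,
               warn_threshold_s: float,
               brake_threshold_s: float) -> dict:
    """Tiered response to a predicted impact. Because brake_threshold_s
    is assumed shorter than warn_threshold_s, the driver is warned first
    and braking is applied only as the predicted impact draws nearer."""
    return {
        "warn": period_to_impact_s < warn_threshold_s,
        "brake": period_to_impact_s < brake_threshold_s,
    }
```

With a warning threshold of 3 seconds and a braking threshold of 1.5 seconds, a period to impact of 2.5 seconds produces a warning only, while 1 second produces both a warning and braking.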
At 108, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level. If the driver distraction level is a quantitative level (e.g., an integer between 0 and 10), the threshold level may be an integer (e.g., 3). If the distraction level is a qualitative level (e.g., low, medium, high), the threshold level may be one of the qualitative levels (e.g., low). For example, if the threshold level is low, the driver distraction level may be greater than the threshold level if the driver distraction level is medium or high. If the driver distraction level is greater than the threshold level, the method continues at 110. Otherwise, the method continues at 112.
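Comparing qualitative levels as described at 108 requires an ordering over the levels. A minimal sketch, with the ordering table and function name as assumptions:

```python
# Ordering over the qualitative distraction levels named in the disclosure.
LEVEL_ORDER = {"low": 0, "medium": 1, "high": 2}

def exceeds_threshold(distraction_level: str, threshold_level: str) -> bool:
    """True when the driver distraction level is strictly greater than the
    threshold level; e.g., with a threshold of low, only medium and high
    exceed the threshold."""
    return LEVEL_ORDER[distraction_level] > LEVEL_ORDER[threshold_level]
```

The same function handles quantitative levels trivially, since integers already carry an ordering.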
At 110, the driver assist module 56 increases the first and second threshold periods. Alternatively, if the driver distraction level was greater than the threshold level during the last iteration of the method of
At 112, the driver assist module 56 decreases the first and second threshold periods. Alternatively, if the driver distraction level was less than or equal to the threshold level during the last iteration of the method of
The method may continue from 112 to 114 before proceeding to 116, or the method may continue to 116 directly from 112. At 114, the driver assist module 56 disables forward impact notifications. In other words, when the driver distraction level is less than or equal to the threshold level, the driver assist module 56 does not execute 98 regardless of whether the period to impact is less than the first threshold period. The method ends at 116.
The method of
Referring now to
At 126, the driver assist module 56 determines whether each of the distances between the vehicle 10 and the lane boundaries is less than a first threshold distance. If one or both of the distances between the vehicle 10 and the lane boundaries is/are less than the first threshold distance, the method continues at 128. Otherwise, the method continues at 130.
At 128, the driver assist module 56 controls the steering actuator 18 to make a steering correction (e.g., to steer the vehicle 10 away from the lane boundary to which the vehicle 10 is closest). The driver assist module 56 may control the steering actuator 18 directly or by sending an instruction to the steering control module 60. At 130, the driver assist module 56 does not control the steering actuator 18 to make a steering correction.
At 132, the driver assist module 56 determines whether each of the distances between the vehicle 10 and the lane boundaries is less than a second threshold distance (e.g., a distance within a range from zero to 1.5 feet). The second threshold distance may be less than the first threshold distance. If one or both of the distances between the vehicle 10 and the lane boundaries is/are less than the second threshold distance, the method continues at 134. Otherwise, the method continues at 136.
In various implementations, at 132, the driver assist module 56 may determine whether the vehicle 10 is likely to depart from the lane based on a trajectory and speed of the vehicle 10 (or a velocity vector of the vehicle 10). The driver assist module 56 may make this determination instead of or in addition to determining whether each of the distances between the vehicle 10 and the lane boundaries is less than the second threshold distance. The driver assist module 56 may determine the trajectory of the vehicle 10 based on the current position of the vehicle 10 relative to the lane boundaries and a steering angle of the vehicle 10.
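The trajectory-based check can be sketched by projecting the vehicle's lateral position forward over a short horizon. Everything here is an assumption for illustration: the parameter names, the 1-second horizon, and the use of lateral offset from the lane centerline (in meters) rather than distances to each boundary.

```python
def likely_to_depart(lateral_offset, lateral_speed, half_lane_width, horizon=1.0):
    """Predict lane departure from the vehicle's lateral velocity.

    Minimal sketch of the trajectory check: project the lateral offset
    from the lane centerline forward over a short horizon (seconds) and
    flag a departure if the projected offset would exceed the half lane
    width. Parameter names and the default horizon are hypothetical.
    """
    predicted_offset = lateral_offset + lateral_speed * horizon
    return abs(predicted_offset) > half_lane_width
```

A vehicle drifting toward a boundary is flagged before it actually crosses, which is the point of using the velocity vector instead of (or in addition to) the instantaneous distances.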
If the vehicle 10 is likely to depart from the lane in which the vehicle 10 is travelling, the method continues at 134. Otherwise, the method continues at 136. At 134, the driver assist module 56 controls the user interface device 16 to generate haptic feedback (e.g., at the steering wheel) to notify the driver that the vehicle 10 is about to depart or has departed from the lane in which the vehicle 10 is travelling. At 136, the driver assist module 56 does not control the user interface device 16 to generate haptic feedback.
At 138, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level such as the example threshold levels discussed above with reference to
At 140, the driver assist module increases the first and second threshold distances. Increasing the threshold distances causes the driver assist module 56 to execute 128 and 134 sooner when the vehicle 10 starts to get closer to one of the lane boundaries. The driver assist module 56 may increase the first and second threshold distances by amounts that are directly proportional to the amount by which the driver distraction level is greater than the threshold level (i.e., in an analog manner). Alternatively, the driver assist module 56 may increase the first and second threshold distances by a fixed amount regardless of the amount by which the driver distraction level is greater than the threshold level (i.e., in a digital manner).
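The analog-versus-digital distinction drawn above can be captured in a few lines. This is an illustrative sketch only; the gain, the fixed step, and the function name are assumptions not taken from the disclosure.

```python
def increase_threshold(base, distraction_level, level_threshold,
                       gain=0.1, step=0.5, analog=True):
    """Increase a threshold distance when the driver is distracted.

    In the 'analog' mode the increase is directly proportional to the
    amount by which the distraction level exceeds the threshold level;
    in the 'digital' mode a fixed step is added regardless of the
    excess. gain and step are illustrative tuning values.
    """
    excess = distraction_level - level_threshold
    if excess <= 0:
        return base  # not distracted: leave the threshold unchanged
    return base + (gain * excess if analog else step)
```

A symmetric decrease (as at 142) would subtract the same quantity in the same mode.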
At 144, the driver assist module 56 increases the intensity of the haptic feedback generated by the user interface device 16. The driver assist module 56 may adjust the haptic feedback intensity by adjusting the instruction it sends to the UID control module 58 or adjusting a control signal it sends to the user interface device 16. At 146, the driver assist module 56 increases the torque output of the steering actuator 18, which increases the amount by which the steering actuator 18 steers the vehicle 10 away from the lane boundary to which the vehicle 10 is closest. The driver assist module 56 may not adjust the threshold distances, the haptic feedback intensity, or the steering actuator torque output at 140, 144, and 146 if the driver distraction level was greater than the threshold level during the last iteration of the method of
At 142, the driver assist module decreases the first and second threshold distances. The driver assist module 56 may decrease the first and second threshold distances at 142 in the same manner that the driver assist module 56 increases the first and second threshold distances at 140. For example, if the driver assist module 56 increases the first and second threshold distances in an analog manner at 140, the driver assist module 56 may decrease the first and second threshold distances in an analog manner at 142.
At 148, the driver assist module 56 decreases the intensity of the haptic feedback generated by the user interface device 16. At 150, the driver assist module 56 decreases the torque output of the steering actuator 18, which decreases the amount by which the steering actuator 18 steers the vehicle 10 away from the lane boundary to which the vehicle 10 is closest. The driver assist module 56 may not adjust the threshold distances, the haptic feedback intensity, or the steering actuator torque output at 142, 148, and 150, respectively, if the driver distraction level was less than or equal to the threshold level during the last iteration of the method of
Referring now to
At 164, the driver assist module 56 determines a distance between the centerline of the vehicle 10 and the centerline of the lane in which the vehicle 10 is travelling. The centerline of the vehicle 10 may be predetermined and stored in the driver assist module 56. The driver assist module 56 may determine the distance between the centerline of the vehicle 10 and the centerline of the lane at one fore-aft location along the vehicle 10, such as along a line that extends through centers of the front wheels of the vehicle 10.
At 166, the driver assist module 56 determines whether the distance between the vehicle centerline and the lane centerline is greater than a threshold distance. If the distance between the centerlines is greater than the threshold distance, the method continues at 168. Otherwise, the method continues at 170.
At 168, the driver assist module 56 adjusts the steering actuator 18 to center the vehicle 10 within the lane in which the vehicle 10 is traveling. In other words, the driver assist module 56 controls the steering actuator 18 to decrease the distance between the vehicle and lane centerlines. At 170, the driver assist module 56 does not adjust the steering actuator 18 to center the vehicle 10 within the lane in which the vehicle 10 is traveling.
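The deadband-plus-correction behavior of 166 through 170 resembles a thresholded proportional controller. The proportional control law and the gain below are assumptions for illustration; the disclosure only states that the module reduces the offset between the centerlines.

```python
def centering_command(offset, threshold, gain=0.5):
    """Steering correction for lane centering control.

    Returns no correction while the offset between the vehicle and lane
    centerlines stays within the threshold distance (170), and a
    correction opposing the offset once the threshold is exceeded (168).
    The proportional law and gain are hypothetical.
    """
    if abs(offset) <= threshold:
        return 0.0          # within tolerance: do not adjust steering
    return -gain * offset   # steer back toward the lane centerline
```

Decreasing the threshold (as at 174) shrinks the deadband, so corrections begin sooner when a distracted driver lets the vehicle wander.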
At 172, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level such as the example threshold levels discussed above with reference to
At 174, the driver assist module 56 decreases the threshold distance. Decreasing the threshold distance causes the driver assist module 56 to execute 168 sooner when the centerline of the vehicle 10 moves away from the centerline of the lane. The driver assist module 56 may decrease the threshold distance by an amount that is directly proportional to the amount by which the driver distraction level is greater than the threshold level (i.e., in an analog manner). Alternatively, the driver assist module 56 may decrease the threshold distance by a fixed amount regardless of the amount by which the driver distraction level is greater than the threshold level (i.e., in a digital manner). The driver assist module 56 may not adjust the threshold distance at 174 if the driver distraction level was greater than the threshold level during the last iteration of the method of
At 176, the driver assist module 56 increases the threshold distance. The driver assist module 56 may increase the threshold distance at 176 in the same manner that the driver assist module 56 decreases the threshold distance at 174. For example, if the driver assist module 56 decreases the threshold distance in an analog manner at 174, the driver assist module 56 may increase the threshold distance in an analog manner at 176. The driver assist module 56 may not adjust the threshold distance at 176 if the driver distraction level was less than or equal to the threshold level during the last iteration of the method of
At 180, the driver assist module 56 determines whether a distraction period (i.e., the period for which the distraction level is greater than the threshold level) is greater than a threshold period. If the distraction period is greater than the threshold period, the method continues at 182. Otherwise, the method continues at 184.
At 182, the driver assist module 56 stops performing the LCC function (or lane centering assist). In other words, the driver assist module 56 stops adjusting the steering actuator 18 to center the vehicle 10 within the lane in which the vehicle 10 is traveling regardless of whether the distance between the vehicle and lane centerlines is greater than the threshold distance. The driver assist module 56 may communicate with the driver via the user interface device 16 to confirm that the driver is ready to take control of steering the vehicle 10 before the driver assist module 56 stops performing the LCC function. At 184, the driver assist module 56 continues to perform the LCC function (or lane centering assist). The method ends at 186. The driver assist module 56 may repeatedly perform the method of
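The timeout behavior of 180 through 184 can be sketched by accumulating how long the distraction level has continuously exceeded its threshold. The sampling period and the streak-based accounting are assumptions; the disclosure does not specify how the distraction period is measured.

```python
def lcc_enabled(distraction_samples, level_threshold, period_threshold, dt=0.1):
    """Decide whether lane centering control stays active.

    Walks a sequence of periodic distraction-level samples (taken every
    dt seconds, an assumed value), tracking how long the level has
    continuously exceeded the threshold level. Once that distraction
    period exceeds the threshold period, LCC is stopped (182); otherwise
    it continues (184).
    """
    streak = 0.0
    for level in distraction_samples:
        streak = streak + dt if level > level_threshold else 0.0
        if streak > period_threshold:
            return False  # 182: stop performing the LCC function
    return True           # 184: continue performing the LCC function
```

Note that the streak resets whenever the driver's attention returns, so only sustained distraction disables the function.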
Referring now to
At 196, the driver assist module 56 controls the acceleration or brake actuators 20 or 22, as well as the steering actuator 18, to adjust the speed and path of the vehicle 10 to complete the lane change that the driver intends to make (or would like to be made). The driver assist module 56 may control the acceleration actuator 20 directly or by sending an instruction to the acceleration control module 64. At 200, the driver assist module 56 determines whether an object is in the future path of the vehicle 10 during the intended or desired lane change. If the object is in the future path of the vehicle 10, the method continues at 202. Otherwise, the method continues at 204.
At 202, the driver assist module 56 controls the user interface device 16 to generate haptic feedback at, for example, the steering wheel of the vehicle 10 to notify the driver that an object is in the future path of the vehicle 10 during the intended or desired lane change. Additionally or alternatively, at 202, the driver assist module 56 may abort (i.e., stop performing) the lane change or, if the driver assist module 56 has not yet started performing the lane change, the driver assist module 56 may not perform the lane change at all. If the driver assist module 56 aborts the lane change, the driver assist module 56 may control the acceleration or brake actuators 20 or 22, as well as the steering actuator 18, to return the vehicle 10 to the lane from which the vehicle 10 was changing. At 204, the driver assist module 56 does not control the user interface device 16 to generate haptic feedback.
At 206, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level such as the example threshold levels discussed above with reference to
At 208, the driver assist module 56 increases the intensity of the haptic feedback generated by the user interface device 16. Alternatively, if the driver distraction level was greater than the threshold level during the last iteration of the method of
At 210, the driver assist module 56 decreases the intensity of the haptic feedback generated by the user interface device 16. Alternatively, if the driver distraction level was less than or equal to the threshold level during the last iteration of the method of
Referring now to
At 224, the driver assist module 56 controls the brake actuator 22 and/or the acceleration actuator 20 to maintain the speed of the vehicle 10 at the cruise control speed. The driver assist module 56 may control the speed of the vehicle 10 in a closed-loop manner using a measured speed of the vehicle 10 as a feedback. At 228, the driver assist module 56 determines whether there is another vehicle ahead of the vehicle 10 based on an input from the object position module 52. If there is another vehicle ahead of the vehicle 10, the method continues at 230. Otherwise, the method continues at 226.
At 230, the object position module 52 determines a following distance or time gap from the vehicle 10 to the other vehicle that is ahead of the vehicle 10. The object position module 52 may determine the following distance based on a period that elapses from a time when one of the SRR sensors 34, 36 or the LRR sensor 42 emits a radar wave to a time when the radar wave is received by the sensor that emitted it. The object position module 52 may determine the following distance using a predetermined relationship between (i) the time elapsed between radar wave transmission and receipt and (ii) distance. The object position module 52 may determine the time gap based on the following distance and the current speed of the vehicle 10.
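The predetermined relationship between elapsed time and distance is time-of-flight: the radar wave travels to the object and back at the speed of light, so the one-way distance is half the round trip. A minimal sketch (function names are illustrative, units are SI):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def following_distance(round_trip_time):
    """Distance to the lead vehicle from radar time of flight.

    The radar wave covers the distance twice (out and back), so the
    one-way distance is the speed of light times half the elapsed time.
    """
    return SPEED_OF_LIGHT * round_trip_time / 2.0

def time_gap(distance, vehicle_speed):
    """Time gap to the lead vehicle (seconds) at the current speed (m/s)."""
    return distance / vehicle_speed
```

For example, a round trip of 200 nanoseconds corresponds to roughly 30 meters, which at 20 m/s is a time gap of about 1.5 seconds.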
At 232, the driver assist module 56 controls the acceleration actuator 20 and/or the brake actuator 22 to maintain the following distance of the vehicle 10 within a predetermined range of a first distance. The first distance may be the distance set by the driver. Alternatively, at 232, the driver assist module 56 controls the acceleration actuator 20 and/or the brake actuator 22 to maintain the actual time gap within a predetermined range of a first period. The first period may be the time gap set by the driver. At 234, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level such as the example threshold levels discussed above with reference to
At 236, the driver assist module 56 increases the first distance or the first period. The driver assist module 56 may increase the first distance or the first period by an amount that is directly proportional to the amount by which the driver distraction level is greater than the threshold level (i.e., in an analog manner). Alternatively, the driver assist module 56 may increase the first distance or the first period by a fixed amount regardless of the amount by which the driver distraction level is greater than the threshold level (i.e., in a digital manner). The driver assist module 56 may not adjust the first distance or the first period at 236 if the driver distraction level was greater than the threshold level during the last iteration of the method of
At 238, the driver assist module 56 decreases the first distance or the first period. The driver assist module 56 may decrease the first distance or the first period in the same manner that the driver assist module 56 increases the first distance or the first period at 236. For example, if the driver assist module 56 increases the first distance or the first period in an analog manner at 236, the driver assist module 56 may decrease the first distance or the first period in an analog manner at 238. The driver assist module 56 may not decrease the first distance or the first period to a value that is less than that set by the driver. The driver assist module 56 may not adjust the first distance or the first period at 238 if the driver distraction level was less than or equal to the threshold level during the last iteration of the method of
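The adjustment at 236 and 238, including the floor at the driver-selected value, can be sketched as follows. The fixed step (a "digital" adjustment) and the function name are assumptions.

```python
def adjusted_time_gap(current_gap, driver_set_gap, distracted, step=0.2):
    """Adjust the ACC time gap (the first period) for driver distraction.

    Increases the gap by a fixed step when the driver is distracted
    (236, sketched here in the digital manner) and decreases it
    otherwise (238), but never below the value set by the driver.
    The step size is an illustrative assumption.
    """
    if distracted:
        return current_gap + step
    return max(driver_set_gap, current_gap - step)
```

The `max` clamp reflects the rule that the module may not decrease the first distance or first period below the driver's setting.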
Referring now to
At 244, the driver assist module 56 determines whether the vehicle 10 is on an access-controlled or designated highway. If the vehicle 10 is on an access-controlled or designated highway, the method continues at 248. Otherwise, the method continues at 250. At 248, the driver assist module 56 controls the user interface device 16 to prompt the driver of the vehicle 10 to enter a desired destination and select a desired route. At 250, the driver assist module 56 controls the user interface device 16 to inform the driver that the HP function (or highway pilot assist) is unavailable.
At 252, the driver assist module 56 performs the HP function (or highway pilot assist) to complete the portion of the route selected by the driver that takes place on the designated highway. Performing the HP function involves executing 254, 256, and 258. At 254, the driver assist module 56 controls the steering actuator 18 to maintain the vehicle 10 within the lane in which the vehicle 10 is travelling. At 256, the driver assist module 56 controls the acceleration actuator 20 and/or the brake actuator 22 to maintain the speed of the vehicle 10 at a target speed (e.g., a speed limit for the designated highway that may be stored in the driver assist module 56). At 258, the driver assist module 56 controls the brake actuator 22 to decelerate the vehicle 10 and thereby avoid an impact between the vehicle 10 and an object ahead of the vehicle 10.
At 260, the driver assist module 56 determines whether the driver distraction level is greater than a threshold level such as the example threshold levels discussed above with reference to
At 262, the driver assist module 56 determines whether the distraction period (i.e., the period for which the distraction level is greater than the threshold level) is greater than the threshold period. If the distraction period is greater than the threshold period, the method continues at 264. Otherwise, the method continues at 266.
At 264, the driver assist module 56 stops performing the HP function (or highway pilot assist). The driver assist module 56 may communicate with the driver via the user interface device 16 to confirm that the driver is ready to take control of the vehicle 10 before the driver assist module 56 stops performing the HP function (or highway pilot assist). At 266, the driver assist module 56 continues to perform the HP function (or highway pilot assist). The method ends at 246. The method of
The foregoing description is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. The broad teachings of the disclosure can be implemented in a variety of forms. Therefore, while this disclosure includes particular examples, the true scope of the disclosure should not be so limited since other modifications will become apparent upon a study of the drawings, the specification, and the following claims. It should be understood that one or more steps within a method may be executed in different order (or concurrently) without altering the principles of the present disclosure. Further, although each of the embodiments is described above as having certain features, any one or more of those features described with respect to any embodiment of the disclosure can be implemented in and/or combined with features of any of the other embodiments, even if that combination is not explicitly described. In other words, the described embodiments are not mutually exclusive, and permutations of one or more embodiments with one another remain within the scope of this disclosure.
Spatial and functional relationships between elements (for example, between modules, circuit elements, semiconductor layers, etc.) are described using various terms, including “connected,” “engaged,” “coupled,” “adjacent,” “next to,” “on top of,” “above,” “below,” and “disposed.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the above disclosure, that relationship can be a direct relationship where no other intervening elements are present between the first and second elements, but can also be an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. As used herein, the phrase at least one of A, B, and C should be construed to mean a logical (A OR B OR C), using a non-exclusive logical OR, and should not be construed to mean “at least one of A, at least one of B, and at least one of C.”
In the figures, the direction of an arrow, as indicated by the arrowhead, generally demonstrates the flow of information (such as data or instructions) that is of interest to the illustration. For example, when element A and element B exchange a variety of information but information transmitted from element A to element B is relevant to the illustration, the arrow may point from element A to element B. This unidirectional arrow does not imply that no other information is transmitted from element B to element A. Further, for information sent from element A to element B, element B may send requests for, or receipt acknowledgements of, the information to element A.
In this application, including the definitions below, the term “module” or the term “controller” may be replaced with the term “circuit.” The term “module” may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
The module may include one or more interface circuits. In some examples, the interface circuits may include wired or wireless interfaces that are connected to a local area network (LAN), the Internet, a wide area network (WAN), or combinations thereof. The functionality of any given module of the present disclosure may be distributed among multiple modules that are connected via interface circuits. For example, multiple modules may allow load balancing. In a further example, a server (also known as remote, or cloud) module may accomplish some functionality on behalf of a client module.
The term code, as used above, may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
The term memory circuit is a subset of the term computer-readable medium. The term computer-readable medium, as used herein, does not encompass transitory electrical or electromagnetic signals propagating through a medium (such as on a carrier wave); the term computer-readable medium may therefore be considered tangible and non-transitory. Non-limiting examples of a non-transitory, tangible computer-readable medium are nonvolatile memory circuits (such as a flash memory circuit, an erasable programmable read-only memory circuit, or a mask read-only memory circuit), volatile memory circuits (such as a static random access memory circuit or a dynamic random access memory circuit), magnetic storage media (such as an analog or digital magnetic tape or a hard disk drive), and optical storage media (such as a CD, a DVD, or a Blu-ray Disc).
The apparatuses and methods described in this application may be partially or fully implemented by a special purpose computer created by configuring a general purpose computer to execute one or more particular functions embodied in computer programs. The functional blocks, flowchart components, and other elements described above serve as software specifications, which can be translated into the computer programs by the routine work of a skilled technician or programmer.
The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may encompass a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services, background applications, etc.
The computer programs may include: (i) descriptive text to be parsed, such as HTML (hypertext markup language), XML (extensible markup language), or JSON (JavaScript Object Notation), (ii) assembly code, (iii) object code generated from source code by a compiler, (iv) source code for execution by an interpreter, (v) source code for compilation and execution by a just-in-time compiler, etc. As examples only, source code may be written using syntax from languages including C, C++, C#, Objective-C, Swift, Haskell, Go, SQL, R, Lisp, Java®, Fortran, Perl, Pascal, Curl, OCaml, Javascript®, HTML5 (Hypertext Markup Language 5th revision), Ada, ASP (Active Server Pages), PHP (PHP: Hypertext Preprocessor), Scala, Eiffel, Smalltalk, Erlang, Ruby, Flash®, Visual Basic®, Lua, MATLAB, SIMULINK, and Python®.
This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 63/071,702, filed on Aug. 28, 2020. The disclosure of this prior application is considered part of the disclosure of this application and is hereby incorporated by reference in its entirety.