The present disclosure generally relates to vehicles, and more particularly relates to methods and systems for monitoring drivers of vehicles.
Many vehicles today include various systems that can improve driving experience and/or safety. Such systems may include, among others, active safety systems, avoidance systems, steering assist systems, automatic steering systems, and semi-automatic steering systems. It may be desired to further customize such systems based on the driver of the vehicle.
Accordingly, it is desirable to provide techniques for monitoring a driver of a vehicle, and for taking actions based on the monitoring of the driver. It is also desirable to provide methods, systems, and vehicles utilizing such techniques. Furthermore, other desirable features and characteristics of the present invention will be apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
In accordance with an exemplary embodiment, a method is provided. The method comprises detecting whether a driver of a vehicle is looking in a direction with respect to the vehicle, and providing an action based at least in part on whether the driver is looking in the direction.
In accordance with another exemplary embodiment, a system is provided. The system comprises a sensing unit and a processor. The sensing unit is configured to at least facilitate detecting whether a driver of a vehicle is looking in a direction with respect to the vehicle. The processor is coupled to the sensing unit, and is configured to at least facilitate providing an action based at least in part on whether the driver is looking or has recently looked in the direction.
In accordance with a further exemplary embodiment, a vehicle is provided. The vehicle comprises a body, a steering system, a sensing unit, and a processor. The steering system is formed with the body. The sensing unit is configured to at least facilitate detecting whether a driver of the vehicle is looking in a direction with respect to the vehicle. The processor is coupled to the sensing unit and the steering system, and is configured to at least facilitate providing a steering action based at least in part on whether the driver is looking or has recently looked in the direction.
The present disclosure will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
As described in greater detail further below, the vehicle 100 includes a control system 102 for monitoring a driver of the vehicle 100, and for taking appropriate actions based on the monitoring. As discussed further below, the control system 102 includes a sensor array 104, a controller 106, and a notification unit 108. In various embodiments, the controller 106 controls the performance of one or more actions for the vehicle 100 based at least in part on the monitoring of the driver of the vehicle 100, in accordance with the steps set forth further below in connection with the processes 300, 400 of
As depicted in
In the exemplary embodiment illustrated in
Still referring to
The steering system 150 is mounted on the chassis 112, and controls steering of the wheels 116. In the depicted embodiment, the steering system 150 includes a steering wheel 151, a steering column 152, and a turn signal 153. In various embodiments, the steering wheel 151 and turn signal 153 receive inputs from a driver of the vehicle 100 when a turn is desired. The steering column 152 produces the desired steering angles for the wheels 116 via the drive shafts 134 based on the inputs from the driver. In certain embodiments, an autonomous vehicle may utilize steering commands that are generated by a computer, with no involvement from the driver.
The braking system 160 is mounted on the chassis 112, and provides braking for the vehicle 100. The braking system 160 receives inputs from the driver via a brake pedal (not depicted), and provides appropriate braking via brake units (also not depicted). The driver also provides inputs via an accelerator pedal (not depicted) as to a desired speed or acceleration of the vehicle, as well as various other inputs for various vehicle devices and/or systems, such as one or more vehicle radios, other entertainment systems, environmental control systems, lighting units, navigation systems, and the like (also not depicted). Similar to the discussion above regarding possible variations for the vehicle 100, in certain embodiments steering, braking, and/or acceleration can be commanded by a computer instead of by a driver.
The control system 102 is mounted on the chassis 112. In one embodiment, the control system 102 provides monitoring of the driver of the vehicle 100, and provides actions (such as executing a turn into a desired lane, providing steering assist, providing a notification, and/or one or more other vehicle actions) based at least in part on the monitoring of the driver. In certain embodiments, the control system 102 may comprise, may be part of, and/or may be coupled to the electronic control system 118, the steering system 150, one or more active safety systems, and/or one or more other systems of the vehicle 100.
As noted above and depicted in
The driver input detection unit 162 detects one or more inputs provided by the driver of the vehicle 100. In certain embodiments, the driver input detection unit 162 comprises one or more sensors configured to detect when a driver has engaged the steering wheel 151 and/or the turn signal 153 of the vehicle 100. Also in certain embodiments, the driver input detection unit 162 further comprises sensors configured to detect when the driver has initiated a starting of an ignition of the vehicle 100 (e.g. by turning a key of the ignition, pressing a start button, and/or engaging a keyfob).
The driver detection unit 164 monitors a driver of the vehicle 100. In one embodiment, the driver detection unit 164 comprises one or more sensors configured to monitor a position and/or movement of a head of the driver. In another embodiment, the driver detection unit 164 comprises one or more sensors configured to monitor a position and/or movement of eyes of the driver. In yet another embodiment, the driver detection unit 164 comprises one or more sensors configured to monitor a position and/or movement of both the head and eyes of the driver.
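The head- and eye-based monitoring described above can be reduced to a simple geometric test. The sketch below is a minimal illustration under stated assumptions, not the disclosed implementation: it assumes the sensors report head and eye yaw angles in degrees relative to straight ahead, and the function name, inputs, and angular tolerance are all hypothetical.

```python
# Hypothetical sketch: decide whether the driver's gaze falls within an
# angular tolerance of a target direction. Angles are yaw values in
# degrees relative to straight ahead; the simple head-plus-eye
# composition and the 15-degree default tolerance are assumptions.
def is_looking_in_direction(head_yaw_deg, eye_yaw_deg, target_yaw_deg,
                            tolerance_deg=15.0):
    # Combine head pose and eye-in-head rotation into a rough gaze angle.
    gaze_yaw = head_yaw_deg + eye_yaw_deg
    # Wrap the angular difference into [-180, 180) before comparing.
    diff = (gaze_yaw - target_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg
```

For example, `is_looking_in_direction(-20.0, -10.0, -30.0)` combines a head yaw of -20 degrees with an eye yaw of -10 degrees and reports that the gaze matches a target at -30 degrees; the modulo wrap keeps the comparison correct across the +/-180 degree boundary.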
With reference to
With reference again to
In various embodiments, the sensor array 104 provides the detected information to the controller for processing. Also in various embodiments, the controller 106 performs these and other functions in accordance with the steps of the processes 300, 400 described further below in connection with
The controller 106 is coupled to the sensor array 104 and to the notification unit 108. The controller 106 utilizes the various measurements and information from the sensor array 104, and controls one or more actions (e.g. steering and/or warnings) based at least in part on a monitoring of the driver of the vehicle 100. In various embodiments, the controller 106, along with the sensor array 104 and the notification unit 108, provide these and other functions in accordance with the steps discussed further below in connection with the schematic drawings of the vehicle 100 in
As depicted in
In the depicted embodiment, the computer system of the controller 106 includes a processor 172, a memory 174, an interface 176, a storage device 178, and a bus 180. The processor 172 performs the computation and control functions of the controller 106, and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit. During operation, the processor 172 executes one or more programs 182 contained within the memory 174 and, as such, controls the general operation of the controller 106 and the computer system of the controller 106, generally in executing the processes described herein, such as the processes 300, 400 described further below in connection with
The memory 174 can be any type of suitable memory. For example, the memory 174 may include various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 174 is located on and/or co-located on the same computer chip as the processor 172. In the depicted embodiment, the memory 174 stores the above-referenced program 182 along with one or more stored values 184.
The bus 180 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 106. The interface 176 allows communication to the computer system of the controller 106, for example from a system driver and/or another computer system, and can be implemented using any suitable method and apparatus. In one embodiment, the interface 176 obtains the various data from the sensors of the sensor array 104. The interface 176 can include one or more network interfaces to communicate with other systems or components. The interface 176 may also include one or more network interfaces to communicate with technicians, and/or one or more storage interfaces to connect to storage apparatuses, such as the storage device 178.
The storage device 178 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives. In one exemplary embodiment, the storage device 178 comprises a program product from which memory 174 can receive a program 182 that executes one or more embodiments of one or more processes of the present disclosure, such as the steps of the processes 300, 400 (and any sub-processes thereof) described further below in connection with
The bus 180 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies. During operation, the program 182 is stored in the memory 174 and executed by the processor 172.
It will be appreciated that while this exemplary embodiment is described in the context of a fully functioning computer system, those skilled in the art will recognize that the mechanisms of the present disclosure are capable of being distributed as a program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 172) to perform and execute the program. Such a program product may take a variety of forms, and the present disclosure applies equally regardless of the particular type of computer-readable signal bearing media used to carry out the distribution. Examples of signal bearing media include: recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will be appreciated that cloud-based storage and/or other techniques may also be utilized in certain embodiments. It will similarly be appreciated that the computer system of the controller 106 may also otherwise differ from the embodiment depicted in
The notification unit 108 is coupled to the controller 106, and provides notifications for the driver of the vehicle 100. In certain embodiments, the notification unit 108 provides audio, visual, haptic, and/or other notifications to the driver based on instructions provided from the controller 106 (e.g. from the processor 172 thereof), for example when an object in proximity to the vehicle 100 may be a threat to the vehicle 100 and/or when a desired turn may not presently be executed (e.g. if the driver is not looking in the direction of the intended turn). Also in various embodiments, the notification unit 108 performs these and other functions in accordance with the steps of the processes 300, 400 described further below in connection with
While the components of the control system 102 (including the sensor array 104, the controller 106, and the notification unit 108) are depicted as being part of the same system, it will be appreciated that in certain embodiments these features may comprise two or more systems. In addition, in various embodiments the control system 102 may comprise all or part of, and/or may be coupled to, various other vehicle devices and systems, such as, among others, the actuator assembly 120, the electronic control system 118, the steering system 150, and/or one or more other systems of the vehicle 100.
As depicted in
Monitoring is performed for the driver (step 304). In various embodiments, a driver is monitored to ascertain whether the driver is looking in a particular direction. In one embodiment, the monitoring includes detection and monitoring of the position and movement of the driver's eyes. In another embodiment, the monitoring includes detection and monitoring of the position and movement of the driver's head. In yet other embodiments, the monitoring includes detection and monitoring of the position and movement of both the driver's eyes and head. In addition, in various embodiments, the monitoring includes detecting whether the driver is looking in the direction of a particular object, threat, and/or lane proximate the vehicle. Also in one embodiment, the monitoring of step 304 is performed via measurements and/or detection provided by one or more sensors of the driver detection unit 164 of
A determination is made as to whether an event condition is satisfied (step 306). In one embodiment, this determination is made by the processor 172 of
If it is determined that the event condition is not satisfied, then the process returns to step 304, as the driver continues to be monitored in a new iteration. Once a determination is made in an iteration of step 306 that the event condition is satisfied, the process proceeds to step 308, described directly below.
During step 308, a determination is made as to whether a driver condition is satisfied. In one embodiment, this determination is made by the processor 172 of
Different actions (or lack of action) are provided based on whether the driver condition of step 308 is satisfied. Specifically, as depicted in one embodiment, a first action is provided in step 310 if the driver condition is satisfied, and a second action is provided in step 312 if the driver condition is not satisfied. Also in various embodiments, the actions are implemented at least in part based on instructions provided by the processor 172 of
In one example in which the event condition is satisfied when a threat is present near the vehicle 100 that may justify a warning, the warning is not provided (or may be delayed) in step 310 if the driver is already looking in the direction of the threat, but the warning is provided in step 312 if the driver is not looking in the direction of the threat. In another example in which the event condition is satisfied when a threat may warrant use of a steering assist feature, the steering assist (e.g. added steering torque) is provided in step 310 if the driver is looking in an appropriate direction (in one example this may be the direction of the threat, and in another example this may be the intended steering direction), and the steering assist is not provided in step 312 if the driver is not looking in the appropriate direction. In another example in which the event condition is satisfied when the driver has indicated a desire to make a turn (e.g. by engaging the steering wheel 151 and/or the turn signal 153 of
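The first of these examples, the threat warning, can be sketched as a small decision function. This is an illustrative sketch only, assuming boolean inputs for the event condition (step 306) and the driver condition (step 308); the function name and the string return values are hypothetical.

```python
# Hypothetical sketch of the step 308/310/312 branch for the warning
# example: a threat warning is suppressed (or deferred) when the driver
# is already looking toward the threat, and issued otherwise.
def select_warning_action(threat_present, driver_looking_at_threat):
    if not threat_present:           # event condition (step 306) not met
        return "no_action"
    if driver_looking_at_threat:     # driver condition (step 308) met
        return "suppress_warning"    # first action (step 310)
    return "issue_warning"           # second action (step 312)
```

The steering-assist and turn examples follow the same shape, with the first and second actions swapped as appropriate (assist provided when the driver is looking, withheld when not).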
As depicted in
A determination is made that the driver has requested a lane change for the vehicle (step 402). In one embodiment, this determination is made by the processor 172 of
A path or road on which the vehicle is travelling is monitored (step 404). In one embodiment, the road on which the vehicle is travelling (including the vehicle's lane and any adjacent lanes, and any lanes that may affect the turn into the desired lane) is monitored using the data from the road detection unit 166 of
A determination is made as to whether there is a sufficient level of confidence that it would be unsafe for the vehicle to turn into the desired lane (step 406). In one embodiment, this determination is made by the processor 172 of
If it is determined in step 406 that there is a sufficient level of confidence that it would be unsafe to change lanes, then the vehicle waits a short time, without changing lanes (step 408) before evaluating the situation again. In one embodiment, the vehicle waits for a fraction of a second (e.g. half of a second in one example, although this may vary in other embodiments). In one embodiment, this is performed for the vehicle 100 via instructions provided by the processor 172 to the steering system 150 of
During step 410, a determination is made as to whether a maximum amount of wait time to make the turn has been reached. In one embodiment, this determination is made by the processor 172 of
If it is determined in step 410 that the maximum wait time has been reached, then the lane change is not executed (step 412). Specifically, in one embodiment, during step 412 a lane change on demand function is exited, and no lane change is executed unless and until a subsequent request is received in a future iteration of step 402. In addition, in one embodiment, a notification is provided to the driver. In one such embodiment, an audio and/or visual notification is provided by the notification unit 108 of
Conversely, if it is determined in step 410 that the maximum wait time has not been reached, then the process returns to step 404 in a new iteration. The process then continues with further monitoring of the road in step 404 and a subsequent determination in step 406 with the new, updated road monitoring data.
With reference back to step 406, if it is determined in step 406 that there is not a sufficient level of confidence that it would be unsafe for the vehicle to turn into the desired lane, then a separate determination is made as to whether there is a sufficient level of confidence that it would be safe for the vehicle to turn into the desired lane (step 414). In one embodiment, this determination is made by the processor 172 of
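One way to picture the two separate determinations of steps 406 and 414 is as a pair of thresholds on an estimated probability that the lane change is unsafe, with an uncertain band between them in which neither determination reaches sufficient confidence. The sketch below is an assumption-laden illustration: the threshold values, names, and the use of a single scalar probability are all hypothetical.

```python
# Hypothetical sketch of steps 406 and 414: compare an estimated
# probability that the target lane is unsafe against two thresholds,
# leaving a middle band where neither confidence test is met.
UNSAFE_CONFIDENCE = 0.8   # step 406: confident the lane change is unsafe
SAFE_CONFIDENCE = 0.2     # step 414: confident the lane change is safe

def classify_lane_risk(p_unsafe):
    if p_unsafe >= UNSAFE_CONFIDENCE:
        return "unsafe"       # wait without changing lanes (step 408)
    if p_unsafe <= SAFE_CONFIDENCE:
        return "safe"         # execute the turn (step 416)
    return "uncertain"        # fall through to driver monitoring (step 418)
```

The uncertain band is what makes driver monitoring decisive in this process: only when the sensors cannot confidently resolve the lane either way does the driver's gaze determine the outcome.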
If it is determined in step 414 that there is a sufficient level of confidence that it would be safe to change lanes, then the requested turn is executed (step 416). In one embodiment, in step 416 the vehicle 100 is turned into the desired lane (per the request in step 402) automatically by the steering system 150 of
Conversely, if it is determined in step 414 that there is not a sufficient level of confidence that it would be safe to change lanes, then driver monitoring is performed (step 418). In various embodiments, a driver is monitored to ascertain whether the driver is looking in the direction of the intended turn. In one embodiment, the monitoring includes detection and monitoring of the position and movement of the driver's eyes. In another embodiment, the monitoring includes detection and monitoring of the position and movement of the driver's head. In yet other embodiments, the monitoring includes detection and monitoring of the position and movement of both the driver's eyes and head. Also in one embodiment, the monitoring of step 418 is performed via measurements and/or detection provided by one or more sensors of the driver detection unit 164 of
During step 420, a determination is made as to whether a driver condition is satisfied with respect to the turn. In one embodiment, this determination is made by the processor 172 of
If it is determined that the driver condition is satisfied, then the process proceeds to the above-described step 416, in which the requested turn is executed. Conversely, if it is determined that the driver condition is not satisfied, then the process proceeds instead to the above-described step 410, in which a determination is made as to whether the maximum wait time has been reached.
Accordingly, in one embodiment of the process 400, the requested turn is automatically executed if there is sufficient confidence that the vehicle 100 can safely make the turn (e.g. if the lane is clear of objects). Conversely, the requested turn is not executed if there is sufficient confidence that the vehicle 100 cannot safely make the turn (e.g. if the lane is full of objects). In cases in which there is not a sufficient level of confidence as to whether the requested turn can safely be executed, the turn is executed if and only if the driver is looking or has recently looked in the appropriate direction for the turn.
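This overall behavior of process 400 can be sketched as a polling loop. The sketch below is illustrative only: it assumes caller-supplied stand-ins for the road assessment and driver-gaze checks, models the wait of step 408 as a counter rather than real elapsed time, and every name and default value is hypothetical.

```python
# Hypothetical end-to-end sketch of process 400: poll road data until
# the lane change is executed, abandoned after a maximum wait, or
# resolved by the driver's gaze in the uncertain band. The assess_risk
# callable returns "safe", "unsafe", or "uncertain"; driver_looking
# returns True when the driver is looking toward the intended turn.
def lane_change_on_demand(assess_risk, driver_looking,
                          wait_step=0.5, max_wait=5.0):
    waited = 0.0
    while True:
        risk = assess_risk()             # steps 404 and 406/414
        if risk == "safe":
            return "turn_executed"       # step 416
        if risk == "uncertain" and driver_looking():
            return "turn_executed"       # steps 418-420, then 416
        if waited >= max_wait:
            return "turn_abandoned"      # steps 410-412, with notification
        waited += wait_step              # step 408: brief wait, then retry
```

For instance, with an assessment that always returns "uncertain", the loop executes the turn as soon as the gaze check passes, and otherwise abandons the request once the maximum wait elapses.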
With reference to
Accordingly, methods, systems, and vehicles are provided for monitoring drivers of vehicles. In various embodiments, one or more vehicle actions (e.g. providing vehicle notifications, initiating steering assist, and/or executing a requested turn) are executed based at least in part on whether the driver of the vehicle is looking in an appropriate direction with respect to the event.
It will be appreciated that the disclosed methods, systems, and vehicles may vary from those depicted in the Figures and described herein. For example, the vehicle 100, the control system 102, and/or various components thereof may vary from that depicted in
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the appended claims and the legal equivalents thereof.