The disclosure relates generally to generating navigation guidance based on driver state monitoring.
Some vehicles may include an advanced driver assistance system (ADAS) that may assist a driver of the vehicle during vehicle operation. For example, the ADAS may perform driver monitoring and take various escalating actions in response to detecting driver drowsiness or distraction. Such actions may include, for example, outputting a driver alert, increasing lane-keeping assistance, slowing the vehicle, and stopping the vehicle.
However, the inventors herein have recognized that some driving situations may result in a higher frequency of driver drowsiness and/or distraction. For example, continuous driving for long durations and driving on long, unchanging roads may increase driver drowsiness. As another example, road-side attractions and scenic views may increase driver distraction. As such, the inventors herein have recognized that data acquired by the ADAS may be advantageously used to generate navigation guidance in order to reduce incidents of driver drowsiness and/or distraction.
In various embodiments, the issues described above may be addressed by methods for generating navigation guidance based on a plurality of driver state events detected via an advanced driver assistance system (ADAS), and outputting the navigation guidance via a navigation system. The methods may detect each driver state event of the plurality of driver state events via the ADAS. The methods may tag each driver state event of the plurality of driver state events with a location of occurrence and/or a time of occurrence. The methods may statistically group the plurality of driver state events with respect to the location of occurrence and/or the time of occurrence, such as via a cluster analysis. The navigation guidance may comprise a map layer and/or a route recommendation that reduces vehicle travel through locations and travel times having high driver state event clustering. In this way, driver drowsiness and/or distraction may be reduced by reducing travel through areas that have a statistically higher incidence of driver state events.
In some embodiments, the issues described above may be addressed by methods for generating a navigation route for a vehicle based on navigation guidance determined from at least one of data collected within the vehicle (e.g., internally provided data) and externally provided data. The navigation guidance may comprise a map layer and/or a route recommendation determined based on driver state events. The externally provided data may comprise data received from a cloud computing system, and the route recommendation may comprise a general vehicle route recommendation. The methods may determine the general vehicle route recommendation via the cloud computing system based on an occurrence of the driver state events from a plurality of different ADAS reported to the cloud computing system for a plurality of drivers. The general vehicle route recommendation may reduce travel through the locations having statistically significant clusters of the driver state events. The internally provided data may comprise the driver state events for a driver of the vehicle. The methods may detect the driver state events for the driver of the vehicle based at least on images of the driver received from an in-vehicle camera. The route recommendation may additionally or alternatively comprise an individualized vehicle route recommendation. The methods may determine the individualized vehicle route recommendation based on a cluster analysis of the driver state events for the driver of the vehicle. The individualized vehicle route recommendation may reduce travel through the locations having the statistically significant clusters of the driver state events for the driver. In this way, the navigation guidance may be generated based on aggregate data from a plurality of vehicles and/or tailored for a specific individual driver.
In some embodiments, the issues described above may be addressed by a vehicle system, which includes an in-vehicle camera housed within a cabin of the vehicle, an ADAS, and a navigation system with a display, having one or more processors and a non-transitory memory including instructions that, when executed, cause the one or more processors to: detect driver state events in a vehicle based on driver images acquired by the in-vehicle camera, report the driver state events to a cloud computing platform in an anonymized manner, and receive navigation guidance from the cloud computing platform, the navigation guidance determined by one of the cloud computing platform based on the driver state events reported by a plurality of vehicles and a driver profile specific to a driver of the vehicle. The non-transitory memory may include further instructions that, when executed, cause the one or more processors to output the navigation guidance to the display of the navigation system within the vehicle, the navigation guidance comprising at least one of a map layer and a route recommendation. The driver state events may comprise occurrences of at least one of a tired, fatigued, and distracted driver state. The map layer may comprise a heat map indicating statistically significant clusters of the driver state events. The route recommendation may comprise a navigation route that reduces travel through the statistically significant clusters of the driver state events. The map layer and/or the route recommendation may be displayed via the navigation system based on user input to the display of the navigation system. In this way, the systems may advantageously utilize the cloud computing platform for data aggregation, storage, and processing.
It should be understood that the summary above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description. It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure.
The disclosure may be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings.
Disclosed herein are systems and methods for implementing navigation guidance based on driver state events detected by an advanced driver assistance system (ADAS) of a vehicle. For example, a plurality of vehicles may communicate data with a cloud computing platform, such as depicted in FIG. 1.
As used herein, the terms “substantially the same as” or “substantially similar to” are construed to mean the same as with a tolerance for variation that a person of ordinary skill in the art would recognize as being reasonable. As used herein, an element or step recited in the singular and preceded with the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is stated. As used herein, terms such as “first,” “second,” “third,” and so on are used merely as labels and are not intended to impose any numerical requirements, any particular positional order, or any sort of implied significance on their objects. As used herein, terminology in which “an embodiment,” “some embodiments,” or “various embodiments” are referenced signify that the associated features, structures, or characteristics being described are included in at least some embodiments, but are not necessarily in all embodiments. Moreover, the various appearances of such terminology do not necessarily all refer to the same embodiments. As used herein, terminology in which elements are presented in a list using “and/or” language means any combination of the listed elements. For example, “A, B, and/or C” may mean any of the following: A alone; B alone; C alone; A and B; A and C; B and C; or A, B, and C.
Cloud 130 may include memory and/or processors that are standalone or integrally constructed as part of various programmable devices, including, for example, computing device(s) 132 (which may be, or may include, servers or server computing devices). Cloud 130 may facilitate data aggregation, storage, and processing.
Cloud 130 may further store data processing algorithm(s) 138 and a database 140, which may be stored on and/or accessed by computing device(s) 132. Data processing algorithm(s) 138 may analyze data received from first vehicle 110 and/or second vehicle 120. Data processing algorithm(s) 138 may output processed data and/or conclusions from the processed data to database 140, first vehicle 110, and/or second vehicle 120. In various embodiments, data processing algorithm(s) 138 may include one or more driver state event analysis algorithms that build road profiles and statistical maps based on driver state events received from first vehicle 110 and/or second vehicle 120, as will be elaborated herein.
Referring now to FIG. 2, an example system 200 for generating navigation guidance based on driver state events is depicted.
Vehicle 202 may include an in-vehicle camera 204 and/or a plurality of sensors that provide information regarding a vehicle environment and a state of a driver operating vehicle 202, collectively referred to as driver and driver environment inputs 212. Driver and driver environment inputs 212 may include driver images 214, a cabin occupancy input 216, and a driving behavior input 218, although other inputs are also possible.
Vehicle 202 may have an ADAS 240. ADAS 240 may include a driver state monitor 242 that receives and analyzes driver and driver environment inputs 212. For example, driver state monitor 242 may include one or more computer vision models and/or image recognition algorithms that analyze facial structures of the driver in driver images 214, such as data points on the eyes and face, to identify a state of the driver, such as whether the driver is awake or asleep, alert or tired (e.g., sleepy, drowsy, or fatigued), and focused or distracted, for example. In some embodiments, driver state monitor 242 may further perform facial recognition to determine an identity of the driver.
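For illustration only, the following minimal Python sketch shows one common way such eye-landmark analysis can flag sustained eye closure, using an eye-aspect-ratio measure. The landmark extraction step (e.g., via a face-mesh model), the threshold values, and all names are assumptions made for this sketch rather than details of driver state monitor 242.

```python
import math

# Illustrative sketch only: score eye openness from facial landmarks.
# Landmark detection is assumed to happen upstream and is not shown.

EAR_THRESHOLD = 0.21      # illustrative value, not from the disclosure
CLOSED_FRAMES_LIMIT = 48  # e.g., ~1.6 s of closure at 30 frames/s

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points around one eye, in the common
    P1..P6 ordering used for the eye-aspect-ratio measure."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = (math.dist(p2, p6) + math.dist(p3, p5)) / 2.0
    horizontal = math.dist(p1, p4)
    return vertical / horizontal

class DrowsinessScorer:
    """Flags a possible 'drowsy' driver state after sustained eye closure."""
    def __init__(self):
        self.closed_frames = 0

    def update(self, left_eye, right_eye):
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.closed_frames = self.closed_frames + 1 if ear < EAR_THRESHOLD else 0
        return self.closed_frames >= CLOSED_FRAMES_LIMIT
```

A production monitor would typically fuse such a per-frame signal with gaze direction, head pose, and the additional context inputs described below.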
In some embodiments, driver state monitor 242 may determine a state of the driver (e.g., a driver state) based on driver images 214 alone and without additional inputs. In other embodiments, driver state monitor 242 may further determine the cabin occupancy from cabin occupancy input 216 and driving behavior from driving behavior input 218, which may provide further context into the driver state. For example, driver state monitor 242 may analyze driving behavior input 218 for swerving, hard braking, aggressive acceleration, or other driving behaviors that may be caused by driver drowsiness and/or distraction. As another example, swerving and hard braking may be associated with driver distraction. Further, having cabin occupants may increase driver distraction in some instances.
ADAS 240 may also include a driver profile 244. In various embodiments, driver state monitor 242 may communicate with driver profile 244. Driver profile 244 may help driver state monitor 242 distinguish nominal driving behavior from drowsy and/or distracted driving behavior. For example, driver profile 244 may generate and store driving preferences, such as typical acceleration and braking rates and typical steering behavior, for the identified driver. Additionally or alternatively, driver profile 244 may generate driver-specific navigation guidance, as will be elaborated below.
Driver state monitor 242 additionally receives route inputs 230, which may include a location input 232 and/or a trajectory input 234. In some embodiments, a global positioning system (GPS) 220 may provide location input 232 while a navigation system 224 provides trajectory input 234, such as shown in FIG. 2.
Driver state monitor 242 outputs a driver state event 246 in response to detecting attention-related driver states that may impair or impede the driver's ability to operate vehicle 202, such as lost concentration, driver distraction, driver sleepiness, and the like. For example, driver state event 246 may be an event (e.g., occurrence or incidence) where the driver is determined to be in a distracted state, a fatigued state, and/or a sleepy state. As such, driver state monitor 242 might not output driver state event 246 in response to the driver being in a focused state or an alert state, for example. Driver state event 246 may be tagged with a location of its occurrence, as determined from route inputs 230, as well as a time of its occurrence (e.g., a timestamp). For example, the timestamp may include a date (e.g., month, day, and year), time, and day of the week. Driver state event 246 may also specify a type of event that occurred (e.g., “distracted state,” “asleep state,” “drowsy state,” “fatigued state”). Further, driver state event 246 may include information from trajectory input 234, such as the trajectory of vehicle 202 for a pre-determined duration of time (e.g., 30 minutes) immediately prior to driver state event 246 occurring.
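As a hedged sketch of what such an event record could look like in code, the following Python dataclass captures the fields named above (event type, location, timestamp, and recent trajectory); the field names and layout are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple

# Hypothetical record layout for driver state event 246. Note the absence
# of any personal or identifying driver information.

@dataclass
class DriverStateEvent:
    event_type: str                      # e.g., "distracted state", "drowsy state"
    latitude: float                      # location of occurrence (from route inputs 230)
    longitude: float
    timestamp: datetime                  # date, time, and day of week are derivable
    trajectory: List[Tuple[float, float]] = field(default_factory=list)
    # vehicle trajectory for a pre-determined duration (e.g., 30 minutes)
    # immediately prior to the event

    @property
    def day_of_week(self) -> str:
        return self.timestamp.strftime("%A")

event = DriverStateEvent("drowsy state", 48.137, 11.575,
                         datetime(2021, 12, 29, 17, 30))
print(event.day_of_week)  # "Wednesday"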
Driver state event 246 might not include any personal or identifying information regarding the driver. However, in embodiments where driver profile 244 is used to build driver-specific navigation guidance, driver state event 246 may be output to driver profile 244 so that the detected driver state event 246 is associated with a specific individual.
Driver state event 246 is also output to cloud 250 in real-time or nearly real-time as the driver state event is detected. As used herein, the term “real-time” denotes processes executed without intentional delay, such as substantially instantaneously. As mentioned above, driver state event 246 does not include personal or identifying information, including demographic information. Thus, vehicle 202 reports driver state event 246 to cloud 250 in an anonymized manner, and cloud 250 receives anonymized reports for a plurality of driver state event occurrences, including the time and location of each occurrence, from a plurality of vehicles in addition to vehicle 202.
Cloud 250 includes driver state event analysis algorithm(s) 252 that process each received driver state event 246. As shown in system 200, driver state event analysis algorithm(s) 252 may include a road profile builder 256 and/or a statistical map builder 254. Driver state event analysis algorithm(s) 252 may be data processing algorithm(s) 138 of FIG. 1, for example.
Statistical map builder 254 may build location maps by statistically grouping driver state event occurrences via cluster analysis algorithms or the like. For example, statistical map builder 254 may perform a point-based or density-based clustering of the received driver state events from a plurality of vehicles to detect areas where driver state events are concentrated and areas where driver state events are sparse or not present. The time of occurrence of the driver state events may be further used in the clustering. Driver state events that are not part of a cluster are not statistically relevant and may be labeled as noise, and thus might not be represented on a map layer 260 output by driver state event analysis algorithm(s) 252, which will be further described below. However, the driver state events labeled as noise may be stored in database 258, as the clustering may change over time as new driver state events are reported.
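For illustration, the following minimal sketch shows one density-based approach consistent with the description above, using scikit-learn's DBSCAN with a haversine distance so that the noise label (-1) marks events outside any cluster; the eps and min_samples values are illustrative assumptions, not parameters from the disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Minimal sketch of density-based grouping of driver state event locations.
# Coordinates are converted to radians so the haversine metric yields
# great-circle distances (eps of ~500 m on an Earth radius of ~6,371 km).

EARTH_RADIUS_M = 6_371_000.0

def cluster_events(latlon_deg: np.ndarray) -> np.ndarray:
    """latlon_deg: (n, 2) array of event locations in degrees.
    Returns a cluster label per event; -1 marks statistical noise, which
    may be retained in the database but omitted from the map layer."""
    coords = np.radians(latlon_deg)
    labels = DBSCAN(eps=500.0 / EARTH_RADIUS_M,
                    min_samples=10,
                    metric="haversine").fit_predict(coords)
    return labels
```

Re-running such a clustering as new events arrive naturally allows points previously labeled as noise to join newly formed clusters.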
Road profile builder 256 may store the clustering information from statistical map builder 254 to build a location-specific profile and a time-specific profile for a given roadway as well as information about the type of road (e.g., urban, rural, mountain) and geometry of the road (e.g., straight, curvy) at the location of the clustering as a road profile for the given roadway. In some embodiments, road profile builder 256 may extrapolate information from a first roadway to a second roadway having similar characteristics in order to anticipate areas that may have high incidences of driver state events using a confidence interval with high (e.g., 90% or greater) probability. Thus, driver state event analysis algorithm(s) 252 might not only receive and analyze data from driver state events that have occurred, but also predict driver state events, at least in some embodiments.
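A hedged sketch of such extrapolation follows: a road with no event history inherits the risk estimate of profiled roads sharing its type and geometry, gated on a confidence threshold. All names, the rate units, and the 0.9 cutoff are illustrative assumptions rather than details of road profile builder 256.

```python
from dataclasses import dataclass

@dataclass
class RoadProfile:
    road_type: str        # e.g., "urban", "rural", "mountain"
    geometry: str         # e.g., "straight", "curvy"
    event_rate: float     # driver state events per 1,000 vehicle passes (assumed unit)
    confidence: float     # probability that the rate generalizes to similar roads

def predict_event_rate(road_type, geometry, profiles, min_confidence=0.9):
    """Extrapolate an event rate to an unprofiled road from similar roads,
    only when the supporting profiles meet a high (e.g., >= 90%) confidence."""
    matches = [p for p in profiles
               if p.road_type == road_type and p.geometry == geometry
               and p.confidence >= min_confidence]
    if not matches:
        return None  # no extrapolation with sufficiently high probability
    return sum(p.event_rate for p in matches) / len(matches)

profiles = [RoadProfile("rural", "straight", 4.2, 0.93),
            RoadProfile("rural", "straight", 3.8, 0.91)]
print(predict_event_rate("rural", "straight", profiles))  # 4.0
```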
Driver state event analysis algorithm(s) 252 may analyze data in substantially real-time and output map layer 260 and/or a route recommendation 262 in substantially real-time. Map layer 260 and/or route recommendation 262 may be received by navigation system 224 of vehicle 202. Map layer 260 may indicate areas having a high concentration (e.g., cluster) of driver state event occurrences, as determined by statistical map builder 254. Map layer 260 may include, for example, a heat map of driver state event occurrences in a location-specific fashion. Map layer 260 may further indicate the driver state event occurrences in a time-specific fashion. For example, map layer 260 may change depending on the time of day to coincide with a time-specific clustering of driver state events, when relevant. As an illustrative example, driving during peak commuting time (e.g., so-called “rush hour”) may increase driver fatigue and/or distractedness (e.g., for drivers whose work-day has recently ended, or for individual drivers at predetermined times), and so map layer 260 may show more or different areas during the peak commuting time on weekdays compared with other times of the day or during that same time on weekends.
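One simple way to make such a layer time-specific, sketched below under assumed names and grid sizes (neither is specified in the disclosure), is to bin events into a coarse location grid per hour of day and render the layer for the current hour.

```python
from collections import Counter

GRID_DEG = 0.01  # roughly 1 km cells at mid latitudes (illustrative)

def heat_map_cells(events, hour_of_day):
    """events: iterable of (lat, lon, timestamp) tuples.
    Returns a Counter mapping (lat_cell, lon_cell) -> event count for the
    requested hour, e.g., to shade a rush-hour version of the map layer."""
    cells = Counter()
    for lat, lon, ts in events:
        if ts.hour == hour_of_day:
            cells[(round(lat / GRID_DEG), round(lon / GRID_DEG))] += 1
    return cells
```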
Route recommendation 262 may include recommended roads of travel to avoid or reduce vehicle travel through areas and times of day having high occurrences of driver state events (e.g., as determined via a first cluster analysis). Further, route recommendation 262 may include location-specific and time-specific information that may be used by navigation system 224 to generate a travel route for vehicle 202, such as when a destination is known. Route recommendation 262 may be a general vehicle route recommendation that is the same for all drivers and all vehicles.
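As a sketch of one way such a recommendation could steer routes away from flagged areas, the following Python inflates the cost of road segments that pass through clustered cells before running a shortest-path search; the graph layout, attribute names, and 5x penalty factor are assumptions for illustration, not details from the disclosure.

```python
import networkx as nx

CLUSTER_PENALTY = 5.0  # illustrative cost multiplier for clustered segments

def recommend_route(graph, origin, destination, clustered_cells):
    """graph: networkx.Graph with 'length' (meters) and 'cell' attributes
    on each edge; clustered_cells: set of cells flagged by the cluster
    analysis for the current travel time."""
    def weight(u, v, data):
        factor = CLUSTER_PENALTY if data["cell"] in clustered_cells else 1.0
        return data["length"] * factor
    return nx.shortest_path(graph, origin, destination, weight=weight)

g = nx.Graph()
g.add_edge("A", "B", length=1000, cell=(1, 1))   # passes through a cluster
g.add_edge("A", "C", length=1500, cell=(2, 2))
g.add_edge("C", "B", length=1500, cell=(2, 3))
print(recommend_route(g, "A", "B", {(1, 1)}))    # ['A', 'C', 'B']
```

Because the penalty is a multiplier rather than a hard exclusion, a slightly longer detour is preferred, but a clustered segment can still be used when no reasonable alternative exists.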
In some embodiments, ADAS 240 may further generate a driver-specific map layer 264 and/or a driver-specific route recommendation 266 based on driver profile 244 (e.g., as determined via a second cluster analysis). For example, driver-specific map layer 264 and/or driver-specific route recommendation 266 may be generated by performing a second cluster analysis on driver state event data for an individual driver. As another example, ADAS 240 may adjust map layer 260 and/or route recommendation 262 such that driver-specific map layer 264 and/or driver-specific route recommendation 266 are generated based on data compiled for a plurality of drivers and tailored for the individual driver.
Thus, navigation guidance may be generated based on data collected within vehicle 202 and/or externally collected data from a plurality of vehicles outside of vehicle 202. Further, the navigation guidance may include generalized guidance (e.g., a generalized map layer, such as map layer 260, and/or a generalized route recommendation, such as route recommendation 262) and/or individualized guidance (e.g., an individualized map layer, such as driver-specific map layer 264, and/or an individualized vehicle route recommendation, such as driver-specific route recommendation 266). In various embodiments, navigation guidance (e.g., a map layer including, for example, a heat map of driver state event occurrences in a location-specific fashion) might be used for additional purposes, such as insurance analysis, infrastructure planning, and so on.
Map layer 260, route recommendation 262, driver-specific map layer 264, and/or driver-specific route recommendation 266 may be displayed to the driver via navigation system 224 in response to user input received from the driver, as will be elaborated herein. In some embodiments, navigation system 224 may output an alert to indicate that map layer 260, route recommendation 262, driver-specific map layer 264, and/or driver-specific route recommendation 266 are available for display. Display of the map layer 260, route recommendation 262, driver-specific map layer 264, and/or driver-specific route recommendation 266 may be chosen via user input to the user interface 418.
System 200 may also provide driver behavior prediction. For example, system 200 may predict that a driver state event is likely to occur on a particular route traveling through locations and times having high occurrences of driver state events. As another example, system 200 may predict that a driver state event is less likely to occur on routes having no or few clusters of driver state events. Thus, an occurrence of driver state events may be reduced by utilizing map layer 260, route recommendation 262, driver-specific map layer 264, and/or driver-specific route recommendation 266 to reduce travel through areas having increased driver state event occurrence probability (e.g., by reducing an estimated or calculated length of travel in distance, and/or by reducing an estimated or calculated length of travel in time). As another example, by alerting the driver to the high occurrence of driver state events in particular areas or at particular times of day, the driver may travel through the associated areas or driving times with increased focus and caution.
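For illustration, a route-level risk estimate of this kind can be sketched by combining per-segment event probabilities, here under a simplifying independence assumption that is itself an assumption of this sketch rather than a detail of system 200.

```python
import math

def route_event_probability(segment_probs):
    """segment_probs: per-segment probabilities of a driver state event,
    e.g., derived from historical cluster densities at the travel time.
    Returns the probability of at least one event on the route, assuming
    segments are independent."""
    p_no_event = math.prod(1.0 - p for p in segment_probs)
    return 1.0 - p_no_event

risky = route_event_probability([0.10, 0.08, 0.12])   # route via cluster areas
calm = route_event_probability([0.01, 0.02, 0.01])    # route avoiding clusters
print(f"{risky:.2f} vs {calm:.2f}")  # ~0.27 vs ~0.04
```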
Referring now to FIG. 3, an example partial view of a cabin 300 of a vehicle 302 is shown. As shown, an instrument panel 306 may include various displays and controls accessible to a human driver (also referred to as the user) of vehicle 302. For example, instrument panel 306 may include a touch screen 308 of an in-vehicle computing system or infotainment system 309, an audio system control panel, and an instrument cluster 310. Touch screen 308 may receive user input to the in-vehicle computing system or infotainment system 309 for controlling audio output, visual display output, navigation system display, user preferences, control parameter selection, and so on.
Cabin 300 may include one or more sensors for monitoring the vehicle, the user, and/or the environment. For example, cabin 300 may include one or more seat-mounted pressure sensors configured to measure the pressure applied to the seat to determine the presence of a user, door sensors configured to monitor door activity, humidity sensors to measure the humidity content of the cabin, microphones to receive user input in the form of voice commands, to enable a user to conduct telephone calls, and/or to measure ambient noise in the cabin 300, and so on. Cabin 300 may also include an in-vehicle camera, such as in-vehicle camera 204 of FIG. 2.
Cabin 300 may also include one or more user objects, such as mobile device 328, that are stored in the vehicle before, during, and/or after travelling. Mobile device 328 may include a smart phone, a tablet, a laptop computer, a portable media player, and/or any suitable mobile computing device. Mobile device 328 may be connected to the in-vehicle computing system via a communication link 330. Communication link 330 may be wired (e.g., via Universal Serial Bus (USB), Mobile High-Definition Link (MHL), High-Definition Multimedia Interface (HDMI), Ethernet, and so on) or wireless (e.g., via Bluetooth®, Wi-Fi®, Wi-Fi Direct®, Near-Field Communication (NFC), cellular connectivity, and so on) and configured to provide two-way communication between the mobile device and the in-vehicle computing system. (Bluetooth® is a registered trademark of Bluetooth SIG, Inc., Kirkland, WA. Wi-Fi® and Wi-Fi Direct® are registered trademarks of Wi-Fi Alliance, Austin, Texas.) Mobile device 328 may include one or more wireless communication interfaces for connecting to one or more communication links (e.g., one or more of the example communication links described above). The wireless communication interface may include one or more physical devices, such as antenna(s) or port(s) coupled to data lines for carrying transmitted or received data, as well as one or more modules/drivers for operating the physical devices in accordance with other devices in the mobile device. For example, communication link 330 may provide sensor and/or control signals from various vehicle systems (such as vehicle audio system, climate control system, and so on) and touch screen 308 to mobile device 328 and may provide control and/or display signals from mobile device 328 to the in-vehicle systems and touch screen 308. Communication link 330 may also provide power to mobile device 328 from an in-vehicle power source in order to charge an internal battery of the mobile device.
In-vehicle computing system or infotainment system 309 may also be communicatively coupled to additional devices operated and/or accessed by the user but located external to vehicle 302, such as one or more external devices 350. In the depicted embodiment, external devices 350 are located outside of vehicle 302 though it will be appreciated that in alternate embodiments, external devices 350 may be located inside cabin 300. The external devices may include a server computing system, a cloud computing system (e.g., cloud 130 of FIG. 1), and so on.
In-vehicle computing system or infotainment system 309 may analyze the input received from external devices 350, mobile device 328, and/or other input sources and select settings for various in-vehicle systems (such as climate control system, audio system, and/or navigation system), provide output via touch screen 308 and/or speakers 312, communicate with mobile device 328 and/or external devices 350, and/or perform other actions based on the assessment. In some embodiments, all or a portion of the assessment may be performed by mobile device 328 and/or external devices 350.
In some embodiments, one or more of the external devices 350 may be communicatively coupled to in-vehicle computing system or infotainment system 309 indirectly, via mobile device 328 and/or another of the external devices 350. For example, communication link 336 may communicatively couple external devices 350 to mobile device 328 such that output from external devices 350 is relayed to mobile device 328. Data received from external devices 350 may then be aggregated at mobile device 328 with data collected by mobile device 328, the aggregated data then transmitted to in-vehicle computing system or infotainment system 309 and touch screen 308 via communication link 330. Similar data aggregation may occur at a server system before being transmitted to in-vehicle computing system or infotainment system 309 and touch screen 308 via communication link 336 and/or communication link 330.
In-vehicle computing system or infotainment system 309 may include one or more processors including an operating system processor 414 and an interface processor 420. Operating system processor 414 may execute an operating system on the in-vehicle computing system and control input/output, display, playback, and other operations of the in-vehicle computing system. Interface processor 420 may interface with a vehicle control system 430 via an inter-vehicle system communication module 422.
Inter-vehicle system communication module 422 may output data to one or more other vehicle systems 431 and/or one or more other vehicle control elements 461 while also receiving data input from other vehicle systems 431 and other vehicle control elements 461, e.g., by way of vehicle control system 430. When outputting data, inter-vehicle system communication module 422 may provide a signal via a bus corresponding to any status of the vehicle, the vehicle surroundings, or the output of any other information source connected to the vehicle. Vehicle data outputs may include, for example, analog signals (such as current velocity), digital signals provided by individual information sources (such as clocks, thermometers, location sensors such as GPS sensors, and so on), digital signals propagated through vehicle data networks (such as an engine controller area network (CAN) bus through which engine related information may be communicated, a climate control CAN bus through which climate control related information may be communicated, and a multimedia data network through which multimedia data is communicated between multimedia components in the vehicle). For example, the in-vehicle computing system or infotainment system 309 may retrieve from the engine CAN bus the current speed of the vehicle estimated by the wheel sensors, a power state of the vehicle via a battery and/or power distribution system of the vehicle, an ignition state of the vehicle, and so on. In addition, other interfacing means such as Ethernet may be used as well without departing from the scope of this disclosure.
A storage device 408 may be included in in-vehicle computing system or infotainment system 309 to store data such as instructions executable by operating system processor 414 and/or interface processor 420 in non-volatile form. The storage device 408 may store application data, including prerecorded sounds, to enable in-vehicle computing system or infotainment system 309 to run an application for connecting to a cloud-based server (e.g., cloud 130 of FIG. 1).
A microphone 402 may be included in the in-vehicle computing system or infotainment system 309 to receive voice commands from a user, to measure ambient noise in the vehicle, to determine whether audio from speakers of the vehicle is tuned in accordance with an acoustic environment of the vehicle, and so on. A speech processing unit 404 may process voice commands, such as the voice commands received from the microphone 402. In some embodiments, in-vehicle computing system or infotainment system 309 may also be able to receive voice commands and sample ambient vehicle noise using a microphone included in an audio system 432 of the vehicle.
One or more additional sensors may be included in a sensor subsystem 410 of the in-vehicle computing system or infotainment system 309. For example, the sensor subsystem 410 may include a camera, such as a rear view camera for assisting a user in parking the vehicle and/or a cabin camera (e.g., in-vehicle camera) for identifying a user (e.g., using facial recognition and/or user gestures). Sensor subsystem 410 of in-vehicle computing system or infotainment system 309 may communicate with and receive inputs from various vehicle sensors and may further receive user inputs. For example, the inputs received by sensor subsystem 410 may include transmission gear position, transmission clutch position, gas pedal input, brake input, transmission selector position, vehicle speed, engine speed, mass airflow through the engine, ambient temperature, intake air temperature, and so on, as well as inputs from climate control system sensors (such as heat transfer fluid temperature, antifreeze temperature, fan speed, passenger compartment temperature, desired passenger compartment temperature, ambient humidity, and so on), an audio sensor detecting voice commands issued by a user, a fob sensor receiving commands from and optionally tracking the geographic location/proximity of a fob of the vehicle, and so forth.
While certain vehicle system sensors may communicate with sensor subsystem 410 alone, other sensors may communicate with both sensor subsystem 410 and vehicle control system 430 or may communicate with sensor subsystem 410 indirectly via vehicle control system 430. A navigation subsystem 411 of in-vehicle computing system or infotainment system 309 may generate and/or receive navigation information such as location information (e.g., via a GPS sensor and/or other sensors from sensor subsystem 410), route guidance (e.g., to avoid locations having high occurrences of driver state events), traffic information, point-of-interest (POI) identification, and/or provide other navigational services for the driver. Navigation subsystem 411 may be, or may be part of, navigation system 224 of FIG. 2.
An external device interface 412 of in-vehicle computing system or infotainment system 309 may be coupleable to and/or communicate with one or more external devices 350 located external to vehicle 302. While the external devices are illustrated as being located external to vehicle 302, it is to be understood that they may be temporarily housed in vehicle 302, such as when the user is operating the external devices while operating vehicle 302. In other words, the external devices 350 are not integral to vehicle 302. The external devices 350 may include a mobile device 328 (e.g., connected via a Bluetooth®, NFC, Wi-Fi Direct®, or other wireless connection) or an alternate Bluetooth®-enabled device 452.
Mobile device 328 may be a mobile phone, a smart phone, a wearable device/sensor that may communicate with the in-vehicle computing system via wired and/or wireless communication, or other portable electronic device(s). Other external devices include one or more external services 446. For example, the external devices may include extra-vehicular devices that are separate from and located externally to the vehicle. Still other external devices include one or more external storage devices 454, such as solid-state drives, pen drives, USB drives, and so on. External devices 350 may communicate with in-vehicle computing system or infotainment system 309 either wirelessly or via connectors without departing from the scope of this disclosure. For example, external devices 350 may communicate with in-vehicle computing system or infotainment system 309 through external device interface 412 over a network 460, a USB connection, a direct wired connection, a direct wireless connection, and/or other communication link.
External device interface 412 may provide a communication interface to enable the in-vehicle computing system to communicate with mobile devices associated with contacts of the driver. For example, external device interface 412 may enable phone calls to be established and/or text messages (e.g., Short Message Service (SMS), Multimedia Message Service (MMS), and so on) to be sent (e.g., via a cellular communications network) to a mobile device associated with a contact of the driver. External device interface 412 may additionally or alternatively provide a wireless communication interface to enable the in-vehicle computing system to synchronize data with one or more devices in the vehicle (e.g., the driver's mobile device) via Wi-Fi Direct®, as described in more detail below.
One or more mobile device applications 444 may be operable on mobile device 328. As an example, mobile device application 444 may be operated to aggregate user data regarding interactions of the user with the mobile device. For example, mobile device application 444 may aggregate data regarding music playlists listened to by the user on the mobile device, telephone call logs (including a frequency and duration of telephone calls accepted by the user), positional information including locations frequented by the user and an amount of time spent at each location, and so on. The collected data may be transferred by mobile device application 444 to external device interface 412 over network 460. In addition, specific user data requests may be received at mobile device 328 from in-vehicle computing system or infotainment system 309 via external device interface 412. The specific data requests may include requests for determining where the user is geographically located, an ambient noise level and/or music genre at the user's location, an ambient weather condition (temperature, humidity, and so on) at the user's location, and so on. Mobile device application 444 may send control instructions to components (e.g., microphone, amplifier, and so on) or other applications (e.g., navigational applications) of mobile device 328 to enable the requested data to be collected on the mobile device or requested adjustment made to the components. Mobile device application 444 may then relay the collected information back to in-vehicle computing system or infotainment system 309. For example, mobile device application 444 may include a navigation application that is used in addition to or as an alternative to navigation subsystem 411.
Likewise, one or more external services applications 448 may be operable on external services 446. As an example, external services applications 448 may be operated to aggregate and/or analyze data from multiple data sources. For example, external services applications 448 may aggregate data from one or more social media accounts of the user, data from the in-vehicle computing system (e.g., sensor data, log files, user input, and so on), data from an internet query (e.g., weather data, POI data), and so on. The collected data may be transmitted to another device and/or analyzed by the application to determine a context of the driver, vehicle, and environment and perform an action based on the context (e.g., requesting/sending data to other devices).
Vehicle control system 430 may include controls for controlling aspects of various vehicle systems 431 involved in different in-vehicle functions. These may include, for example, controlling aspects of vehicle audio system 432 for providing audio entertainment to the vehicle occupants, aspects of a climate control system 434 for meeting the cabin cooling or heating needs of the vehicle occupants, as well as aspects of a telecommunication system 436 for enabling vehicle occupants to establish telecommunication linkage with others.
Audio system 432 may include one or more acoustic reproduction devices including electromagnetic transducers such as one or more speakers 435. Vehicle audio system 432 may be passive or active such as by including a power amplifier. In some embodiments, in-vehicle computing system or infotainment system 309 may be the only audio source for the acoustic reproduction device, or there may be other audio sources that are connected to the audio reproduction system (e.g., external devices such as a mobile phone). The connection of any such external devices to the audio reproduction device may be analog, digital, or any combination of analog and digital technologies.
Climate control system 434 may be configured to provide a comfortable environment within the cabin or passenger compartment of vehicle 302. Climate control system 434 includes components enabling controlled ventilation such as air vents, a heater, an air conditioner, an integrated heater and air-conditioner system, and so on. Other components linked to the heating and air-conditioning setup may include a windshield defrosting and defogging system capable of clearing the windshield and a ventilation-air filter for cleaning outside air that enters the passenger compartment through a fresh-air inlet.
Vehicle control system 430 may also include controls for adjusting the settings of various vehicle control elements 461 (or vehicle controls, or vehicle system control elements) related to the engine and/or auxiliary elements within a cabin of the vehicle, such as one or more steering wheel controls 462 (e.g., steering wheel-mounted audio system controls, cruise controls, windshield wiper controls, headlight controls, turn signal controls, and so on), instrument panel controls, microphone(s), accelerator/brake/clutch pedals, a gear shift, door/window controls positioned in a driver or passenger door, seat controls, cabin light controls, audio system controls, cabin temperature controls, and so on. Vehicle control elements 461 may also include internal engine and vehicle operation controls (e.g., engine controller module, actuators, valves, and so on) that are configured to receive instructions via the CAN bus of the vehicle to change operation of one or more of the engine, exhaust system, transmission, and/or other vehicle system. The control signals may also control audio output at one or more speakers 435 of the vehicle's audio system 432. For example, the control signals may adjust audio output characteristics such as volume, equalization, audio image (e.g., the configuration of the audio signals to produce audio output that appears to a user to originate from one or more defined locations), audio distribution among a plurality of speakers, and so forth. Likewise, the control signals may control vents, air conditioner, and/or heater of climate control system 434. For example, the control signals may increase delivery of cooled air to a specific section of the cabin.
Vehicle control system 430 may further include an ADAS 437. ADAS 437 may be ADAS 240 of FIG. 2, for example.
Control elements positioned on an outside of a vehicle (e.g., controls for a security system) may also be connected to in-vehicle computing system or infotainment system 309, such as via inter-vehicle system communication module 422. The control elements of the vehicle control system may be physically and permanently positioned on and/or in the vehicle for receiving user input. In addition to receiving control instructions from in-vehicle computing system or infotainment system 309, vehicle control system 430 may also receive input from one or more external devices 350 operated by the user, such as from mobile device 328. This allows aspects of vehicle systems 431 and vehicle control elements 461 to be controlled based on user input received from the external devices 350.
In-vehicle computing system or infotainment system 309 may further include one or more antennas 406. The in-vehicle computing system may obtain broadband wireless internet access via antennas 406, and may further receive broadcast signals such as radio, television, weather, traffic, and the like. The in-vehicle computing system or infotainment system 309 may receive positioning signals such as GPS signals via antennas 406. The in-vehicle computing system may also receive wireless commands via radio frequency (RF), such as via antennas 406 or via infrared or other means through appropriate receiving devices. In some embodiments, antennas 406 may be included as part of audio system 432 or telecommunication system 436. Additionally, antennas 406 may provide AM/FM radio signals to external devices 350 (such as to mobile device 328) via external device interface 412.
One or more elements of the in-vehicle computing system or infotainment system 309 may be controlled by a user via user interface 418. User interface 418 may include a graphical user interface presented on a touch screen, such as touch screen 308 and/or display screen 311 of FIG. 3.
Referring now to FIG. 5, a method 500 for detecting and outputting driver state events is described. In acquiring 502, driver and driver environment inputs are acquired. Examples of the driver and driver environment inputs include driver images, a cabin occupancy input, and a driving behavior input, such as described with respect to FIG. 2.
In acquiring 504, route inputs are acquired. The route inputs may include a location input and a trajectory input, such as described above with respect to FIG. 2.
In determining 506, a driver state is determined based on the acquired driver and driver environment inputs. The driver state may be determined via a driver state monitor (e.g., driver state monitor 242 of FIG. 2).
In detecting 510, it is determined if a driver state event is detected. The driver state event may be detected in response to the current state of the driver being one that may impair or impede the driver's ability to operate the vehicle, such as when the driver state is distracted, sleepy, drowsy, and/or asleep. The driver state event might not be detected in response to the current state of the driver not being one that may impair or impede the driver's ability to operate the vehicle, such as when the driver state is determined to be alert and focused.
In response to the driver state event not being detected, in continuing 512, the driver state continues to be monitored without outputting a driver state event indication, and method 500 may proceed to generating and/or receiving 520. In response to the driver state event being detected, method 500 may proceed to outputting 514.
In outputting 514, the driver state event indication is output. Outputting 514 may further include a tagging 516 and a tagging 518. In tagging 516, the driver state event is tagged with a location of its occurrence, as determined from the location input. In tagging 518, the driver state event is tagged with a time of its occurrence (e.g., date, time, day of week). Outputting 514 may include outputting the driver state event indication, including a type of driver state event (e.g., the detected driver state) tagged with the location and the time, to the cloud. In some embodiments, outputting 514 may further include outputting the driver state event indication to a driver profile (e.g., driver profile 244).
In generating and/or receiving 520, navigation guidance is generated and/or received based on a plurality of driver state events, as will be further described with respect to FIG. 6.
In updating 522, a navigation system output is updated based on the navigation guidance. The navigation system output may be a displayed map (e.g., displayed via a display screen) and/or a displayed navigation route. For example, the displayed map may be updated to include a driver state event map layer that indicates areas having high occurrences of driver state events, such as determined via a cluster analysis (e.g., via the method of FIG. 6).
Method 500 may then end. In various embodiments, method 500 may be repeated continuously or at a pre-determined frequency during vehicle operation so that the driver and driver environment inputs and the route inputs are updated over time and driver state events are detected and output accordingly.
Referring now to FIG. 6, a method 600 for generating navigation guidance based on a plurality of driver state events is described. In receiving 602, a driver state event indication is received. As explained above with respect to FIG. 5, the driver state event indication may include a type of the driver state event tagged with a location and a time of occurrence.
In analyzing 604, a plurality of the driver state events is analyzed in a location-based and time-based manner. The plurality of driver state events may undergo a cluster analysis, for example, to identify specific locations that have a statistically higher occurrence or concentration of the driver state events. The time-based analysis may further include identifying specific travel times of day within the specific locations that have a statistically higher occurrence of the driver state events. When the analyzing is performed via the cloud, the plurality of driver state events comprises driver state event indications received from a plurality of different vehicles for a plurality of different drivers. When the analyzing is performed within the ADAS, the plurality of driver state events comprises multiple driver state events for one individual driver. Thus, analyzing 604 may identify statistically significant clusters of driver state events in a location-based and/or time-based manner.
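For illustration, one way to combine location and time of day in a single clustering pass, sketched below, is to encode the hour as a point on a circle so that 23:30 and 00:30 are treated as neighbors; the scaling factors and parameter values are assumptions of this sketch, not details of analyzing 604.

```python
import numpy as np
from sklearn.cluster import DBSCAN

KM_PER_DEG = 111.0   # rough degrees-to-kilometers conversion at mid latitudes
TIME_SCALE_KM = 5.0  # illustrative weight given to time-of-day separation

def cluster_events_space_time(lat, lon, hour):
    """lat, lon: arrays in degrees; hour: array of fractional hours of day.
    Returns DBSCAN labels; -1 marks events outside any cluster."""
    angle = 2.0 * np.pi * np.asarray(hour) / 24.0
    features = np.column_stack([
        np.asarray(lat) * KM_PER_DEG,
        np.asarray(lon) * KM_PER_DEG,
        np.cos(angle) * TIME_SCALE_KM,   # cyclical time-of-day encoding
        np.sin(angle) * TIME_SCALE_KM,
    ])
    return DBSCAN(eps=1.0, min_samples=10).fit_predict(features)
```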
In generating 606, navigation guidance is generated based on the location(s) and time(s) having the high concentration of driver state events. The navigation guidance comprises a driver state event map layer and/or a route recommendation. The driver state event map layer may be configured as a heat map or may use another type of visual representation, such as lines, points, icons, and the like to indicate areas having driver state event clusters. The route recommendation may include a recommended travel route to avoid or reduce traveling through locations having driver state event clusters. The route recommendation may incorporate a driver-specified origin and/or a driver-specified destination. In some embodiments, the route recommendation may recommend roadways based on a current travel trajectory even when the driver-specified destination is not provided.
Generating 606 optionally includes an outputting 608 and/or an outputting 610. In outputting 608, the driver state event map layer is output. In outputting 610, the route recommendation is output. Outputting 608 and outputting 610 may include outputting the corresponding navigation guidance (e.g., the driver state event map layer and/or the route recommendation) to a navigation system of one or more vehicles, for example. The driver state event map layer and/or the route recommendation may be output based on a request received from the vehicle and/or a navigation system within the vehicle (e.g., integrated within the vehicle or communicatively coupled to the vehicle). Thus, the navigation guidance may be generated even when not output or displayed.
Method 600 may then end. In various embodiments, method 600 may be performed continually or repeated at a pre-determined frequency, such as when additional driver state event indications are received, in order to re-analyze the plurality of driver state events and update the navigation guidance accordingly. As such, the navigation guidance may be adjusted continually or at the pre-determined frequency.
In this way, systems and methods are provided for generating navigation guidance by leveraging information received by an ADAS. As a result, map layers and route guidance may be generated in order to decrease driver drowsiness and/or distraction, and higher driver satisfaction may be achieved. Further, occurrences of driver state events may be reduced by avoiding routes having characteristics that induce driver state events and/or by alerting the driver to the propensity of certain routes to induce driver state events.
The disclosure also provides support for a method of operation of a navigation system of a vehicle, comprising: detecting a plurality of driver state events via an advanced driver assistance system (ADAS), generating navigation guidance based on the plurality of driver state events detected via the ADAS, and communicating the navigation guidance to a user of the vehicle via the navigation system, the navigation system comprising a display and the navigation guidance including at least one of a route recommendation and a map layer which are displayed via the display of the navigation system. In a first example of the method, the plurality of driver state events is detected via one of a plurality of different ADAS for a plurality of drivers and an individual driver of the vehicle, wherein the individual driver of the vehicle is the user of the vehicle, wherein the method further comprises: outputting driver state events for the plurality of drivers to a cloud computing system and outputting driver state events for the individual driver to a driver profile. In a second example of the method, optionally including the first example, generating the navigation guidance based on the plurality of driver state events detected via the ADAS comprises: detecting each driver state event of the plurality of driver state events via the ADAS, and tagging each driver state event of the plurality of driver state events with a location of occurrence. In a third example of the method, optionally including one or both of the first and second examples, generating the navigation guidance based on the plurality of driver state events detected via the ADAS further comprises tagging each driver state event of the plurality of driver state events with a time of occurrence. In a fourth example of the method, optionally including one or more or each of the first through third examples, generating the navigation guidance based on the plurality of driver state events detected via the ADAS further comprises statistically grouping the plurality of driver state events with respect to location of occurrence. In a fifth example of the method, optionally including one or more or each of the first through fourth examples, generating the navigation guidance based on the plurality of driver state events detected via the ADAS further comprises statistically grouping the plurality of driver state events with respect to time of occurrence within the location of occurrence. In a sixth example of the method, optionally including one or more or each of the first through fifth examples, statistically grouping the plurality of driver state events comprises performing a cluster analysis. In a seventh example of the method, optionally including one or more or each of the first through sixth examples, detecting each driver state event of the plurality of driver state events via the ADAS comprises: receiving images of the plurality of drivers of a plurality of vehicles at the ADAS, analyzing facial structures in the received images of the plurality of drivers to determine a state of each of the plurality of drivers, and outputting a driver state event indication in response to the state being one or more of asleep, tired, and distracted. In an eighth example of the method, optionally including one or more or each of the first through seventh examples, the map layer comprises a heat map display of driver state event clustering.
In a ninth example of the method, optionally including one or more or each of the first through eighth examples, the route recommendation reduces vehicle travel through locations and travel times having high driver state event clustering.
The disclosure also provides support for a method for navigation, comprising: generating a navigation route for a vehicle based on navigation guidance determined from at least one of internally provided data collected within the vehicle and externally provided data, the navigation guidance comprising at least one of a map layer and a route recommendation determined based on driver state events, and communicating the navigation route to a user of the vehicle via a display of a navigation system housed inside the vehicle. In a first example of the method, the externally provided data comprises data received from a cloud computing system and the internally provided data comprises data of a driver profile received from a plurality of images of a driver of the vehicle obtained via an in-vehicle camera. In a second example of the method, optionally including the first example, the route recommendation generated from the externally provided data comprises a general vehicle route recommendation, wherein the general vehicle route recommendation is determined via the cloud computing system based on location and time of occurrences of the driver state events reported to the cloud computing system for a plurality of drivers. In a third example of the method, optionally including one or both of the first and second examples, the method further comprises: performing a first cluster analysis on the driver state events detected from the plurality of drivers to define locations and times having statistically significant clusters of the driver state events via the cloud computing system. In a fourth example of the method, optionally including one or more or each of the first through third examples, the route recommendation generated from the internally provided data comprises an individualized vehicle route recommendation, and the method further comprises: determining the individualized vehicle route recommendation based on a second cluster analysis of the driver state events for the driver of the vehicle to define locations having statistically significant clusters of driver state events particular to the driver of the vehicle. In a fifth example of the method, optionally including one or more or each of the first through fourth examples, the route recommendation reduces an extent of travel through locations having statistically significant clusters of driver state events based on at least one of internally provided data collected within the vehicle and externally provided data, wherein the route recommendation generated based on the internally provided data is individualized to a driver of the vehicle and the route recommendation generated based on the externally provided data is generalized based on data of a plurality of drivers uploaded to a cloud computing system. In a sixth example of the method, optionally including one or more or each of the first through fifth examples, the driver state events comprise at least one of a distracted state, a fatigued state, and a sleepy state.
The disclosure also provides support for a vehicle system, comprising: one or more processors, an in-vehicle camera housed within a cabin of the vehicle, an advanced driver assistance system (ADAS), a navigation system comprising a display, and a non-transitory memory including instructions that, when executed, cause the one or more processors to: detect driver state events in the vehicle based on driver images acquired by the in-vehicle camera and analysis of the driver images performed by the ADAS, report the driver state events to a cloud computing platform in an anonymized manner and to a driver profile specific to a driver imaged by the in-vehicle camera, and receive navigation guidance from the cloud computing platform, the navigation guidance determined by one of the cloud computing platform based on the driver state events reported by a plurality of vehicles and the driver profile specific to the driver of the vehicle. In a first example of the system, the non-transitory memory includes further instructions that, when executed, cause the one or more processors to: output the navigation guidance to the display of the navigation system within the vehicle, the navigation guidance comprising at least one of a map layer and a route recommendation. In a second example of the system, optionally including the first example, the driver state events comprise occurrences of at least one of a tired, fatigued, and distracted driver, the map layer comprises a heat map indicating statistically significant clusters of the driver state events, and the route recommendation comprises a navigation route that reduces an extent of travel through the statistically significant clusters of the driver state events.
The description of embodiments has been presented for purposes of illustration and description. Suitable modifications and variations to the embodiments may be performed in light of the above description or may be acquired from practicing the methods. For example, unless otherwise noted, one or more of the described methods may be performed by a suitable device and/or combination of devices, such as computing device(s) 132 and in-vehicle computing system or infotainment system 309 described above.
As used in this application, an element or step recited in the singular and preceded with the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is stated. Furthermore, references to “one embodiment” or “one example” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. The terms “first,” “second,” “third,” and so on are used merely as labels, and are not intended to impose numerical requirements or a particular positional order on their objects. The following claims particularly point out subject matter from the above disclosure that is regarded as novel and non-obvious.
The present application claims priority to U.S. Provisional Application No. 63/266,167, entitled “METHODS AND SYSTEMS FOR NAVIGATION GUIDANCE BASED ON DRIVER STATE EVENTS”, and filed on Dec. 29, 2021. The entire contents of the above-listed application are hereby incorporated by reference for all purposes.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/IB2022/062853 | 12/29/2022 | WO |
Number | Date | Country
---|---|---
63266167 | Dec 2021 | US