The subject disclosure relates to the art of automated driving or automated device operation. More particularly, the subject disclosure relates to a system and method for communicating and interacting with a user or driver for the purpose of allocating non-monitoring periods during automated device or vehicle operation.
Vehicles are increasingly equipped with automated driving systems that provide various levels of automation. Vehicles can, under certain conditions, feature full automated control, semi-automated control, and automated control of specific vehicle functions (e.g., braking or steering). Automation in vehicles can be categorized according to automation levels. For example, Level 0 automation refers to full manual operation (no driving automation), and Level 1 automation includes driver assistance. Level 2 automation allows for vehicle control of steering and acceleration, with the driver monitoring and ready to take control at any time. In Level 3 automation (conditional automation), a vehicle can monitor the environment and automatically control the operation. The driver in Level 3 need not monitor the environment, but must be ready to take control with notice.
Level 2 automation systems generally require that a driver is attentive (eyes on the road) and ready to take manual control at any moment when the vehicle is performing automated operations. Commonly, a short period of time (e.g., 3-5 seconds, depending on speed and other factors) is allowed for the driver’s eyes to be off the road. Such a limited time period precludes the driver from being able to perform many non-driving related tasks, and does not make any allowance for driving context. Thus, it would be desirable to have a system that provides flexibility for a user to perform various non-driving related tasks.
In one exemplary embodiment, a system for user interaction with an automated device includes a control system configured to operate the device during an operating mode, the operating mode corresponding to a first state in which the control system automatically controls the device operation, and the operating mode prescribes that a user monitor the device operation during automated control. The control system is configured to allocate a time period for the device to transition to a temporary state in which automated control is maintained and the user is permitted to stop monitoring and perform a task unrelated to the device operation. The system also includes a user interaction system including a visual display configured to present trajectory information, an indication as to whether an area is conducive to putting the device in the temporary state, and time period allocation information to the user, the user interaction system including an interface engageable by the user to manage scheduling of one or more allocated time periods.
In addition to one or more of the features described herein, the control system is configured to allocate the time period in response to a request, the time period including a non-monitoring period having a duration based on an amount of time to complete the task, and put the device into the temporary state at initiation of the allocated time period.
In addition to one or more of the features described herein, the device is a vehicle and the task is a non-driving related task (NDRT).
In addition to one or more of the features described herein, the indication includes a representation of an allowable area intersecting the trajectory, the allowable area being conducive to putting the device into the temporary state.
In addition to one or more of the features described herein, the visual display includes a first indicator configured to inform the user as to whether the device is within the allowable area, and the user interaction system includes an adaptive timer configured to inform the user as to an amount of time remaining for the user to perform the task.
In addition to one or more of the features described herein, the visual display includes a second indicator configured to fade in upon the user’s gaze being directed to the visual display, and fade out upon the user’s gaze being directed away from the visual display.
In addition to one or more of the features described herein, the visual display includes a directional indicator configured to indicate a direction of a detected object or condition in an environment around the device.
In addition to one or more of the features described herein, the user interaction system includes a mobile device application, the mobile device application configured to present a second indicator in coordination with the first indicator.
In addition to one or more of the features described herein, the user interaction system is configured to present an alert to the user when the device is able to enter the temporary state, the alert including at least one of a visual alert, an audible alert and a haptic alert.
In addition to one or more of the features described herein, the user interaction system is configured to prevent allocation of the time period based on detecting an urgent condition that warrants a transition from the first state to a manual state.
In one exemplary embodiment, a method of controlling an autonomous device includes operating the device during an operating mode, the operating mode corresponding to a first state in which the control system automatically controls the device operation, the operating mode prescribing that a user monitor the device operation during automated control. The method also includes receiving, via a user interaction system, a request for the user to temporarily stop monitoring in order to perform a task unrelated to the device operation, and allocating a time period for the device to transition to a temporary state in which automated control is maintained and the user is permitted to stop monitoring and perform a task unrelated to the device operation, and presenting, via a visual display of the user interaction system, trajectory information, an indication as to whether an area is conducive to putting the device in the temporary state, and time period allocation information to the user, the user interaction system including an interface engageable by the user to manage scheduling of one or more allocated time periods.
In addition to one or more of the features described herein, the allocated time period includes a non-monitoring period having a duration based on an amount of time to complete the task, the method including putting the device into the temporary state at initiation of the allocated time period.
In addition to one or more of the features described herein, the indication includes a representation of an allowable area intersecting the trajectory, the allowable area being conducive to putting the device into the temporary state.
In addition to one or more of the features described herein, the visual display includes a first indicator configured to inform the user as to whether the device is within the allowable area, and the user interaction system includes an adaptive timer configured to inform the user as to an amount of time remaining for the user to perform the task.
In addition to one or more of the features described herein, the visual display includes a second indicator configured to fade in upon the user’s gaze being directed to the visual display, and fade out upon the user’s gaze being directed away from the visual display.
In addition to one or more of the features described herein, the visual display includes a directional indicator configured to indicate a direction of a detected object or condition in an environment around the device.
In one exemplary embodiment, a vehicle system includes a memory having computer readable instructions, and a processing device for executing the computer readable instructions, the computer readable instructions controlling the processing device to perform: operating the vehicle during an operating mode, the operating mode corresponding to a first state in which the control system automatically controls the vehicle operation, the operating mode prescribing that a user monitor the vehicle operation during automated control, receiving, via a user interaction system, a request for the user to temporarily stop monitoring in order to perform a task unrelated to the vehicle operation, allocating a time period for the vehicle to transition to a temporary state in which automated control is maintained and the user is permitted to stop monitoring and perform a task unrelated to the vehicle operation, and presenting, via a visual display of the user interaction system, trajectory information, an indication as to whether an area is conducive to putting the vehicle in the temporary state, and time period allocation information to the user, the user interaction system including an interface engageable by the user to manage scheduling of one or more allocated time periods.
In addition to one or more of the features described herein, the indication includes a representation of an allowable area intersecting the trajectory, the allowable area being conducive to putting the vehicle into the temporary state.
In addition to one or more of the features described herein, the visual display includes a first indicator configured to inform the user as to whether the vehicle is within the allowable area, and the user interaction system includes an adaptive timer configured to inform the user as to an amount of time remaining for the user to perform the task.
In addition to one or more of the features described herein, the visual display includes a directional indicator configured to indicate a direction of a detected object or condition in an environment around the vehicle.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with one or more exemplary embodiments, methods and systems are provided for scheduling and allocating non-monitoring periods for automated vehicles, systems or devices. Some vehicles can have autonomous capability (Level 5) and may be able to degrade themselves to lower levels of automation (Level 4, Level 3, Level 2 and/or Level 1) depending on environmental conditions, sensor capabilities, and a driver's condition and intent. In an embodiment, the systems and methods perform scheduling and allocation for a vehicle when the vehicle is at a level of automation that allows automated control of the vehicle while a user or driver is actively monitoring the vehicle. An example of such a level of automation is Level 2 automation as defined by the Society of Automotive Engineers (SAE).
A scheduling and allocation system is configured to schedule and allocate non-monitoring time periods that allow a user to temporarily divert attention from automated device operation, and stop active monitoring in order to perform a task unrelated to the automated operation. The non-monitoring time period may be a pre-selected time period that can be extended if conditions permit. A short monitoring period may be provided between non-monitoring periods (periscoping). The automated device may be a vehicle or any other suitable device or system, such as an aircraft, a power plant supervised by humans, production or manufacturing systems or equipment, equipment used in a medical procedure, and others. In the context of vehicles, unrelated tasks are referred to as non-driving related tasks (NDRTs). In the following, unrelated tasks are described as NDRTs; however, it is to be understood that embodiments described herein are applicable to any type of unrelated task performed during operation of any suitable automated device (e.g., a Level 2 and/or Level 3 vehicle).
The system, in response to a request (e.g., from the user or generated by a vehicle processing unit) for a non-monitoring time period (an “NDRT request”), allocates a time period that includes a non-monitoring time period and may also include allotments (allocations) for transitioning between vehicle states and reacting to environmental conditions or events. The non-monitoring time period is based on an estimated amount of time to perform an NDRT (e.g., reading an e-mail, answering a call, etc.).
The system can allocate time periods under a “fixed” scheme in which a defined amount of time is provided for an NDRT, or under a “rolling” scheme in which an allocated time period can be further extended based on current conditions. An allocated time period may include a relatively short monitoring period between non-monitoring periods or within a given non-monitoring period. Inclusion of such a short monitoring period is referred to as “periscoping.” A “short” monitoring period, in an embodiment, is a duration that is sufficient to allow a user to direct attention to the road and observe objects in the road (e.g., 3 seconds).
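The fixed and rolling schemes with periscoping can be sketched as follows. This is an illustrative sketch only: the class name, the 3-second periscoping constant, the 10-second default eyes-off chunk, and the segmenting logic are assumptions for illustration and are not prescribed by the disclosure.

```python
from dataclasses import dataclass

# Assumed value for the short "periscoping" monitoring period (see text: e.g., 3 seconds).
PERISCOPE_MONITORING_S = 3.0

@dataclass
class AllocatedPeriod:
    scheme: str               # "fixed" or "rolling"
    non_monitoring_s: float   # initially allocated non-monitoring time, in seconds
    periscope: bool = False   # whether to interleave short monitoring periods

    def extend(self, extra_s: float, conditions_ok: bool) -> float:
        """Under the rolling scheme, extend the period if current conditions permit.

        A fixed-scheme period is never extended.
        """
        if self.scheme == "rolling" and conditions_ok:
            self.non_monitoring_s += extra_s
        return self.non_monitoring_s

    def segments(self, max_eyes_off_s: float = 10.0):
        """Split the period into eyes-off segments separated by short
        periscoping monitoring periods, when periscoping is enabled."""
        if not self.periscope:
            return [("non_monitoring", self.non_monitoring_s)]
        segs, remaining = [], self.non_monitoring_s
        while remaining > 0:
            chunk = min(max_eyes_off_s, remaining)
            segs.append(("non_monitoring", chunk))
            remaining -= chunk
            if remaining > 0:
                segs.append(("monitoring", PERISCOPE_MONITORING_S))
        return segs
```

For example, a 25-second periscoped period would be broken into 10-second eyes-off segments with 3-second monitoring periods between them, while a fixed period without periscoping is a single uninterrupted segment.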
In an embodiment, the system is configured to coordinate the scheduling of multiple allocated time periods for a plurality of discrete NDRTs. For example, the system includes a dynamic priority queue or other mechanism to schedule NDRTs based on factors such as urgency, importance and physiological (comfort) needs.
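A dynamic priority queue of the kind described above might be sketched as follows. The weighted combination of urgency, importance and comfort (and the specific weights) is an illustrative assumption; the disclosure only names these as example factors.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class NDRTRequest:
    score: float                      # lower score = scheduled sooner
    name: str = field(compare=False)  # task label, excluded from ordering

class NDRTQueue:
    """Dynamic priority queue ordering NDRTs by urgency, importance and comfort."""

    def __init__(self):
        self._heap = []

    def push(self, name: str, urgency: float, importance: float, comfort: float):
        # Weighted score; the weights below are illustrative assumptions.
        score = -(0.5 * urgency + 0.3 * importance + 0.2 * comfort)
        heapq.heappush(self._heap, NDRTRequest(score, name))

    def pop_next(self) -> str:
        """Return the highest-priority pending NDRT."""
        return heapq.heappop(self._heap).name
```

A min-heap keyed on the negated weighted score yields the most urgent or important task first while remaining cheap to update as new requests arrive.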
In an embodiment, the system determines the amount of time to be allocated based on user readiness and environmental context. An "environmental context" includes any combination of environmental conditions and features that can affect driving or operating behavior. An environmental context may include features of the environment around a vehicle or other automated system, which may include the physical surroundings and features and conditions thereof (e.g., other vehicles, pedestrians, road type, intersections, traffic control devices, road conditions, time of day, weather, etc.), and vehicle dynamics (e.g., stationary, at a given speed, braking, accelerating, turning, etc.). User readiness refers to a condition of the user (e.g., distracted, stressed, eyes away from the road, transitioning to manual control, etc.) indicative of whether a user is ready to perform a task related to controlling the vehicle's automated system.
Allocation of a time period may occur in response to a user request or a pre-selected request scheduled before or during driving or operation. Allocation may also occur automatically in response to selected criteria (e.g., based on a suggestion presented to the user and the user accepting the suggestion). For example, a user can be monitored or tracked to determine the level of readiness to take control of the vehicle or assume monitoring, and/or to identify conditions indicative of a desire to perform an NDRT (e.g., the user appears tired or hungry, or the user looks at a mobile device or messages from a vehicle infotainment system). For example, the allocation process may be activated based on eye gaze tracking by a vehicle's driver monitoring system (DMS) or other suitable tracking system.
In an embodiment, the vehicle and/or scheduling and allocation system includes a user interaction system that supports transitions between vehicle states (e.g., an NDRT state during which an NDRT can be performed, and a monitoring state during which the user is actively monitoring automated operation). The user interaction system also allows the user to manage aspects of scheduling and allocation, such as inputting NDRT requests, selecting NDRT requests (e.g., from a dynamic queue), indicating completion of NDRTs and/or overriding or vetoing NDRT requests.
The user interaction system includes a visual display that displays or presents relevant information to the user, such as indications of an NDRT state, indications of upcoming or current allocated time periods, and/or indications of time periods and areas that are available for allocation (allowable times and/or allowable areas). The visual display may also present environment information (e.g., location of detected objects such as other vehicles and road users) and driving-related information (e.g., notification of upcoming maneuvers). In an embodiment, upcoming time periods are addressed by speech dialogue, and current time periods are visually presented.
The user interaction system can display information to a user in an intuitive and subtle way that avoids overly distracting the user while providing cues to the user regarding upcoming transitions and allowable areas. For example, the user interaction system supports transitions via color coded indicators that indicate vehicle states and inform the user as to whether the user can perform an NDRT. The indicators may gradually appear (fade-in) and disappear (fade-out) based on the direction of a user’s gaze. The user interaction system can provide combinations of modalities (e.g., visual, audible and/or haptic) that notify the user regarding time allocations and vehicle states, and may also notify the user regarding the direction of detected objects.
Although the following description is in the context of a vehicle, embodiments described herein are not so limited and can be realized in conjunction with any of various devices and systems having automated behaviors (automated systems), or any system or process that involves human monitoring (where the human can intervene). Examples of such devices or systems include aircraft, factory or manufacturing machinery, robotics, construction vehicles, smart home devices, internet-of-things devices, and others.
Embodiments described herein present a number of advantages. For example, current automated vehicles feature Level 2 or Level 3 automation, which requires a mechanism for transferring control back to manual operation if a driver is inattentive (stops monitoring) for more than a few seconds. Many automated vehicles are subject to the informal rule-of-thumb duration used by the automated vehicle industry that allows for three seconds of eyes-off-road time when, for example, the driver operates a radio or other infotainment device. However, the amount of eyes-off time that can be allocated without adverse effects may vary based on many factors, such as driver state, vehicle state, the state of the road and other road users, and environmental state. For example, some driving situations necessitate eyes-off periods of no more than one second, or even none at all (such as during a merge or exit maneuver), while other situations (e.g., a straight highway, few other road users, fair weather, etc.) allow automated control capabilities to be extended, thereby extending the eyes-off-road duration to longer than three seconds. Embodiments described herein improve current automated vehicle capabilities by providing for the allocation of time periods that can be tailored to specific users and situations to address a user's non-driving needs while maintaining safety. In addition, the embodiments provide features and capabilities that facilitate allocation and scheduling, for example, by engaging the user and helping to reduce the time that a user takes to transition between NDRT states and monitoring states.
The vehicle also includes a monitoring, detection and automated control system 18, aspects of which may be incorporated in or connected to the vehicle 10. The control system 18 in this embodiment includes one or more optical cameras 20 configured to take images, which may be still images and/or video images. Additional devices or sensors may be included in the control system 18, such as one or more radar assemblies 22 included in the vehicle 10. The control system 18 is not so limited and may include other types of sensors, such as infrared sensors.
The vehicle 10 and the control system 18 include or are connected to an on-board computer system 30 that includes one or more processing devices 32 and a user interface 34. The user interface 34 may include a touchscreen, a speech recognition system and/or various buttons for allowing a user to interact with features of the vehicle. The user interface 34 may be configured to interact with the user via visual communications (e.g., text and/or graphical displays), tactile communications or alerts (e.g., vibration), and/or audible communications. The on-board computer system 30 may also include or communicate with devices for monitoring the user, such as interior cameras and image analysis components. Such devices may be incorporated into a driver monitoring system (DMS).
In addition to the user interface 34, the vehicle 10 may include other types of displays and/or other devices that can interact with and/or impart information to a user. For example, in addition or as an alternative to the user interface 34, the vehicle 10 may include a display screen (e.g., a full display mirror (FDM)) incorporated into a rearview mirror 36 and/or one or more side mirrors 38. In one embodiment, the vehicle 10 includes one or more head-up displays (HUDs). Other devices that may be incorporated include indicator lights, haptic devices, interior lights, auditory communication devices, and others. Haptic devices (tactile interfaces) include, for example, vibrating devices in the vehicle steering wheel and/or seat. The various displays, haptic devices, lights, and auditory devices are configured to be used in various combinations to present explanations to a user (e.g., a driver, operator or passenger).
The vehicle 10, in an embodiment, includes a scheduling and allocation system, which may be incorporated into the on-board computer system 30 or in communication with the computer system 30. In addition, or alternatively, the scheduling and allocation system can be incorporated into a remote processing device such as a server, a personal computer, a mobile device, or any other suitable processor.
Components of the computer system 40 include the processing device 42 (such as one or more processors or processing units), a system memory 44, and a bus 46 that couples various system components including the system memory 44 to the processing device 42. The system memory 44 may include a variety of computer system readable media. Such media can be any available media that is accessible by the processing device 42, and includes both volatile and non-volatile media, and removable and non-removable media.
For example, the system memory 44 includes a non-volatile memory 48 such as a hard drive, and may also include a volatile memory 50, such as random access memory (RAM) and/or cache memory. The computer system 40 can further include other removable/non-removable, volatile/non-volatile computer system storage media.
The system memory 44 can include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out functions of the embodiments described herein. For example, the system memory 44 stores various program modules that generally carry out the functions and/or methodologies of embodiments described herein. A module or modules 52 may be included to perform functions related to determination of user state, vehicle state and environmental conditions. A scheduling and allocation module 54 may be included for receiving data (e.g., state information and NDRT requests). An interface module 56 may be included for interacting with a user to facilitate various methods described herein. The system 40 is not so limited, as other modules may be included. The system memory 44 may also store various data structures, such as data files or other structures that store data related to imaging and image processing. As used herein, the term “module” refers to processing circuitry that may include an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
The processing device 42 can also communicate with one or more external devices 58 such as a keyboard, a pointing device, and/or any devices (e.g., network card, modem, etc.) that enable the processing device 42 to communicate with one or more other computing devices. In addition, the processing device 42 may communicate with one or more devices such as the cameras 20 and the radar assemblies 22. The processing device 42 may communicate with one or more display devices 60 (e.g., an onboard touchscreen, cluster, center stack, HUD, mirror displays (FDM) and others), and vehicle control devices or systems 62 (e.g., for partially automated (e.g., driver assist) and/or fully automated vehicle control). Communication with various devices can occur via Input/Output (I/O) interfaces 64 and 65.
The processing device 42 may also communicate with one or more networks 66 such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via a network adapter 68. It should be understood that although not shown, other hardware and/or software components may be used in conjunction with the computer system 40. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, and data archival storage systems, etc.
The system 80 includes a scheduling and allocation module 82 that receives time estimates (e.g., t_est_nonmonitoring, t_evasive_maneuver, and t_monitoring->driving, as defined below) based on requests for non-monitoring periods (NDRT requests), driver state information, and vehicle and environmental state information, and estimates and allocates time periods that accommodate user NDRTs. The system 80 includes various components or modules that interact with the scheduling and allocation module 82 and/or provide NDRT requests. For example, a driver state estimation module 84 determines a driver state (e.g., attentive, distracted, eyes on road, eyes off road, etc.), and an environment and vehicle state estimation module 86 determines the state of the vehicle (e.g., operating mode, dynamics) and a state or condition of an environment around the vehicle. The driver state may be used to estimate the time required for a driver to transition from monitoring to manual driving (to ensure that the driver has sufficient time to return to monitoring and driving if needed), and the time required for the system to suggest time periods for NDRTs.
The environment and vehicle state estimation module 86 can be used to determine whether environmental and vehicle conditions exist such that a time period for NDRT can be allocated. Examples of environmental conditions include road type (e.g., highway, local), proximity to other vehicles, objects in the environment, whether there is an event that the vehicle is approaching that would preclude or limit availability for NDRT, or any combination of environmental features that would affect the availability of times for a user to be inattentive.
The driver state estimation module 84 may be used to detect whether the driver performs an action that is indicative of a non-driving task or behavior (e.g., looking down at a mobile device, or reading a newspaper on the front passenger seat). In an embodiment, the driver state estimation module 84 determines, based on a user condition (e.g., eyes off road, driver picks up or looks at a mobile device, driver appears agitated or hungry, etc.), whether an NDRT would be appropriate to benefit the driver. An NDRT may be considered a benefit if the NDRT is consistent with user preferences (e.g., from user inputs or inferred from tracking user behavior and condition), or consistent with similar users' preferences, given the time allowed for the NDRT. For example, if a user has been monitoring for a given amount of time, the driver state estimation module 84 can determine that the user would benefit from a change in position, determine transition times (amounts of time to transition between states), and automatically generate a request or provide a suggestion to the user of an NDRT.
The modules 84 and 86 can be used to compute various time periods (e.g., t_est_nonmonitoring, t_evasive_maneuver, and t_monitoring->driving) that are used to calculate individual allocated time periods (t_NDRT_alloc) for performing NDRTs. Such individual time periods take into account the time needed for transitioning between operating states and performing evasive maneuvers. For example, the module 84 and/or 86 can be used to determine the amount of time the driver can be in a non-monitoring capacity (t_est_nonmonitoring), which is determined given the state of the vehicle, the environment and/or the state of the driver. In another example, the module 84 and/or 86 can be used to determine an amount of time for the user and the vehicle to transition from a "monitoring state" in which the driver is attentive to the road and vehicle system state, but is not physically controlling the vehicle, to a "manual state" or "driving state" in which the driver has active manual control. This time period is denoted as t_monitoring->driving. During an allocated time period, the vehicle is in an "NDRT state" in which the driver does not need to monitor the vehicle (i.e., user monitoring is suspended).
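One plausible reading of the budget relationship above is that the allocatable NDRT time is what remains of the estimated non-monitoring time after reserving margins for returning to monitoring and for a possible evasive maneuver. The following sketch assumes that reading; the function name and the specific formula are illustrative assumptions, not stated in the disclosure.

```python
def allocate_ndrt_time(t_est_nonmonitoring: float,
                       t_transition_to_monitoring: float,
                       t_evasive_maneuver: float) -> float:
    """Allocatable NDRT time (t_NDRT_alloc) after reserving margins.

    Subtracts the time to return from the NDRT state to monitoring and the
    time budgeted for a critical evasive maneuver from the estimated
    non-monitoring time; clamps at zero when no time can be allocated.
    This formula is an assumption consistent with, but not prescribed by,
    the surrounding description.
    """
    t_alloc = t_est_nonmonitoring - t_transition_to_monitoring - t_evasive_maneuver
    return max(t_alloc, 0.0)
```

For example, an estimated 30-second non-monitoring window with a 4-second return-to-monitoring transition and a 6-second evasive-maneuver reserve would leave 20 seconds allocatable for the NDRT.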
Furthermore, the module 84 and/or 86 can be used to compute a time period for performing a critical evasive maneuver (t_evasive_maneuver). Examples of evasive maneuvers include evasive actions such as steering, changing lanes, emergency braking and others. The duration t_evasive_maneuver may vary based on environmental and vehicle conditions, as well as driver readiness for the evasive maneuver. For example, this time period is shorter in higher speed regions (e.g., highways) or congested regions, and is longer in lower speed regions or non-congested regions.
The system 80 also includes a human-machine interface (HMI) 90, such as a touchscreen or display that allows a user or other entity to input requests to the allocation module 82. The HMI 90 may also output information to a user (block 97), such as an indication that an allocated time period has started, the allocated duration, route information, etc.
Allocation of a time period for an NDRT is prompted by a request (an NDRT request), which can be generated by, or provided from, various locations and devices or modules. As discussed further below, a request or requests (represented by block 92) may be generated by a user while the vehicle is driving (“during ride request”) and/or prior to driving (“pre-ride request”), such as for an anticipated videoconference or phone call. Such requests may be entered by a user via the HMI 90 (e.g., a touchscreen, mobile device, vehicle infotainment system displays or buttons). Requests may also be generated automatically based on monitoring a user’s condition via, for example, the driver state estimation module 84.
In an embodiment, an NDRT request is generated based on a user’s history (e.g., by machine learning). The system 80 learns a specific pattern of the user (e.g., the user always calls his wife when he is driving back home). An NDRT request may also be generated based on identifying notifications, such as an incoming urgent email, or SMS message.
The time periods t_monitoring->driving and t_evasive_maneuver are calculated (e.g., continuously or on a periodic basis) and input to the allocation and scheduling module 82, along with an estimated amount of time in which the user would not be required to monitor vehicle operation (t_est_nonmonitoring), which may be acquired from various sources such as regulatory sources, analyses of previously collected data and/or simulations, and/or based on conditions such as road conditions, traffic density, speed and user state.
Inputs to the allocation and scheduling module 82 may also include information regarding estimated times for performing NDRTs (t_estimated_NDRT), estimated times for returning from the NDRT state to the monitoring state after a non-monitoring period (t_transition→monitoring), and a minimum time for a given NDRT (t_min_NDRT). The minimum NDRT time can be determined from experimental data or previously collected vehicle data, and is provided to avoid making suggestions or allowing NDRTs when the system 80 cannot allocate sufficient time. If the system 80 cannot accommodate at least the minimum time, a suggestion is not presented (and the allocation and scheduling module 82 can move to another NDRT, e.g., in a list or queue).
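The minimum-time gate described above can be sketched as follows; this is a minimal illustration, and the function and parameter names are assumptions rather than identifiers from the source.

```python
# Minimal sketch (names are assumptions): a suggestion is presented only if
# at least the minimum NDRT time plus the time to transition back to
# monitoring can fit within the estimated non-monitoring time.
def can_suggest_ndrt(t_est_nonmonitoring: float,
                     t_min_ndrt: float,
                     t_transition_to_monitoring: float) -> bool:
    """Return True if sufficient non-monitoring time is available."""
    return t_est_nonmonitoring >= t_min_ndrt + t_transition_to_monitoring
```

If the gate returns False, no suggestion is shown and the module can move on to the next NDRT in its list or queue.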
The scheduling and allocation module 82 includes various modules for initiating NDRT states, returning to monitoring or driving states, and/or coordinating multiple NDRT states. For example, a scheduling module 94 receives inputs from the HMI 90 and/or the modules 84 and 86, and/or from an ML unit that estimates NDRT times based on history and learned behaviors. For a given NDRT request, the scheduling module 94 receives t_estimated_NDRT and t_transition→monitoring (and in some cases, a minimum NDRT time t_min_NDRT). These time periods may be pre-determined periods stored remotely and/or locally at a selected location. For example, a look-up table or other data structure in a database 96 stores time budgets and information such as estimated NDRT times, estimated transition times and/or minimum NDRT times (block 95).
Upon initiation of an allocated time period and initiation of an NDRT state, a transition module 98 transitions the vehicle from the monitoring state to the NDRT state. An NDRT module 100 controls transitions between the monitoring state, the NDRT state and the manual state. The NDRT module 100 may transition back to the monitoring state at the expiration of the allocated time period (t_NDRT_total). The NDRT module 100 can transition sooner, for example, if the user so requests or a condition arises that would necessitate a transition to manual driving (e.g., an accident, pedestrian or other object in the road, etc.). One or more of the above modules may be configured as or include one or more finite state machines (FSMs).
The scheduling module 94 may be configured to perform and/or coordinate the execution of multiple NDRT states associated with multiple NDRTs. For example, the scheduling module 94 can access times for various types of tasks, and use such times to allocate time periods for execution of multiple tasks. In some cases, the module 82 can assign a short monitoring period between NDRTs. As discussed further below, time periods can be allocated based on urgency or importance.
Time periods 114, 118 and 122 are estimated to determine whether there is sufficient time for the vehicle to transition to manual control (driving state) and react to a road event or environmental condition (e.g., a pedestrian in the road, or another vehicle intersecting the vehicle's trajectory). The time periods 114, 118 and 122 represent the ability of a vehicle and user to return to manual operation to react to a road event.
Time period 118 corresponds to the time to transition from the monitoring state to the (manual) driving state (t_monitoring→driving). This time period allows for returning to the driving loop (including stabilization) in an effective manner. Time period 120 (t_driving) represents the vehicle being in the driving state. Time period 122 (t_evasive_maneuver) is provided for the user to perform a successful critical maneuver (e.g., an emergency evasive maneuver).
The method 130 is discussed in conjunction with the vehicle of
At block 131, a vehicle system (e.g., the control system 80) tracks or observes a user during automated operation of a vehicle. The vehicle is in a monitoring state in which the vehicle is controlled automatically and the driver is tracked to ensure that the driver is attentive (“eyes on road”). While in the monitoring state, the vehicle may allow for a default period of time for the user to be inattentive, which may be a short period (e.g., 3 seconds) or longer (e.g., minutes, tens of minutes or greater). This amount of time is a fixed amount that is unrelated to the user condition or desire to perform an NDRT.
At block 132, the system identifies an NDRT request. A request may be identified based on receiving an affirmative input from the user, based on pre-scheduled tasks, or based on the user condition. For example, a user may enter a request via the vehicle’s infotainment system, the HMI 90 or other user interface.
One type of request is generated based on a suggestion presented to the user and accepted by the user, which may be identified by monitoring user behavior, vehicle state and/or environment to identify a potential NDRT. An NDRT suggestion (e.g., change position, make a phone call, etc.) may be generated and displayed to the user, and the user can confirm or reject the suggestion. For example, the user is monitored via a camera and/or other sensors to identify indications that the user would benefit from an NDRT (e.g., indications that the user is uncomfortable or agitated, or that the user has been in the driving or monitoring state for an extended period of time and could use a break).
Furthermore, a request may be generated by identification of external demands and activities (e.g., an incoming important e-mail, SMS message or phone call). For example, the system can access the vehicle’s infotainment system for important incoming e-mails, SMS messages from a favorites list, SMS messages from a work-related list, and others. The system may be configured to provide suggestions for potential NDRTs based on user history and learning.
Requests can also be identified based on previously entered or scheduled NDRTs, which may be entered into the vehicle system or a mobile application (app). For example, a calendar in a mobile device or in the infotainment system can be synchronized with the system prior to vehicle operation, and the information becomes an input to the system.
In an embodiment, a machine learning module, FSM or other processing unit is configured to identify potential NDRTs and make suggestions based on machine learning (ML). Such a module reviews a user’s history of engagement with a vehicle system (e.g., infotainment system) or mobile device, and makes suggestions based on the timing and recipients of previous calls (or other communications, such as texts and emails). For example, the module learns that a user typically makes a call to mom around 8:00 AM on most or all previous days, and thus automatically generates a suggestion to “call mom” at 8:00 AM. As discussed further herein, the module can also dynamically arrange a vehicle route so as to create a better opportunity for a scheduled NDRT. Thus, systems and methods herein can request NDRTs in an ad-hoc or opportunistic manner, and can request and schedule a planned NDRT (based on, e.g., a scheduled event, an ML learned NDRT, user preference, etc.) that can be pre-arranged by selecting the best routes, speeds, lanes, times, etc. to ensure safe and effective (i.e., with little chances for emergency takeovers) allotments or allocations of time for NDRTs.
In an embodiment, the system maintains a queue of tasks that have been pre-scheduled, may potentially be requested by the user during operation and/or tasks that can be suggested based on user condition. A dynamic queue may be maintained that lists various NDRTs (or types of NDRTs) and associates each NDRT with a level of importance or urgency, which can change dynamically. Each NDRT is associated in the queue with time periods related to a minimum time to perform the NDRT and times to transition to monitoring. NDRTs can be added to the queue, either at a scheduled time or based on current conditions.
At block 133, the system determines a level of readiness of the driver, and/or the driver’s intentions and motivations. Readiness categories include physical availability, physiological preparedness, cognitive awareness and skill level (ability to perform various NDRTs or types of tasks). Readiness can be determined, for example, by monitoring a driver’s posture, tracking eye gaze, performing facial analysis to determine emotional states, and/or querying the driver. Readiness categories can be stored and correlated with an amount of time for a driver to transition from the NDRT state to monitoring (t_transition→monitoring). A value of t_transition→monitoring may be attached to every NDRT in the queue, as it can be a function of the user as well as the specific NDRT.
In addition to driver “readiness,” the system may assess the driver’s “intentions and motivation” to assume monitoring and/or take manual control. “Intentions and motivation” can be assessed based on past occurrences in similar situations (e.g., a driver that is reluctant to assume monitoring when he or she is reading the newspaper), on driver’s verbal accounts, and/or on his or her responses to ongoing alerts. Readiness, as well as intentions and motivation, are used to estimate a quality of transfer of control to the user.
The quality of transfer of control, in an embodiment, is estimated as the cartesian product of “readiness” and “intentions and motivation.” For example, if driver readiness is low (e.g., the driver is not in the seat or is in some awkward posture), but his or her motivation is high due to frustration and anxiety (as estimated based on image analysis or other driver tracking means), the combination of the two elements (“readiness” and “intentions and motivation”) yields transition times (t_transition→monitoring, t_monitoring→driving, t_evasive_maneuver). Based on these times, the length of the NDRT that can be provided to the driver can be predicted.
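One way to picture the readiness/motivation combination is as a lookup over discrete levels of each factor; the levels, the times, and all names below are illustrative assumptions, not values from the source.

```python
# Illustrative sketch (all levels, names and values are assumptions): the
# "readiness" x "intentions and motivation" combination is modeled as a
# lookup over discrete levels, yielding estimated transition times in seconds.
TRANSITION_TIMES = {
    # (readiness, motivation): (t_transition_to_monitoring, t_monitoring_to_driving)
    ("high", "high"): (2.0, 1.0),
    ("high", "low"):  (4.0, 2.0),
    ("low", "high"):  (5.0, 3.0),
    ("low", "low"):   (8.0, 4.0),
}

def estimate_transition_times(readiness: str, motivation: str) -> tuple:
    """Look up estimated transition times for a (readiness, motivation) pair."""
    return TRANSITION_TIMES[(readiness, motivation)]
```

The resulting times then bound the length of the NDRT allocation that can be offered to the driver.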
At block 134, the system determines, based on the level of readiness, intentions and motivation, and/or environmental context, whether the NDRT can be performed. If so, the system calculates or acquires a time allocation t_NDRT_alloc for the user to perform the NDRT, taking into account the time for transition to monitoring. The system also determines time periods for transitioning and performing evasive maneuvers as discussed above.
At block 135, the system transitions to an NDRT state, during which the system automatically controls the vehicle and does not require that the user stay attentive.
At block 136, the system transitions back to a monitoring state at the expiration of an allocated time period. Alternatively, the system transitions back to the monitoring state earlier based on user input (e.g., the user cancels the NDRT request or indicates that the NDRT is complete, or in response to a change in the vehicle or the environment). Other events or user actions can be identified as an indication of completion, such as the user sending an email or text, ending a phone call, finishing eating etc.
In an embodiment, the system can actively query the user for an indication that the NDRT is complete, which can be used to determine completion alone or in combination with collected data from cameras and other sensors. For example, at a selected point during t_NDRT_alloc (e.g., some number of seconds before the end), the system begins questioning the driver regarding the completion of the task via audible request or display. Further, completion can be forced by removing the task from the queue.
NDRTs can be scheduled in various ways. For example, an overall time period can be scheduled according to a “fixed” NDRT algorithm, a “rolling” NDRT algorithm or a “rolling bounded” algorithm.
Fixed NDRT time periods are specific to a given task and provide a pre-determined time for a user to perform an NDRT. Rolling NDRT time periods may provide additional time to complete a task if an originally allocated time is insufficient. Rolling bounded NDRT time periods provide additional time but are bounded by, for example, regulatory requirements or by internal computation based on driver state, road condition, and environment state.
In response to a request for an NDRT, the FSM 150 transitions to an NDRT request state 154. In this state, the scheduling and allocation module 82 determines the estimated total amount of non-monitoring time based on the vehicle state (e.g., speed), road and other environmental conditions, traffic conditions and driver state. “Non-monitoring time,” or t_est_nonmonitoring, is the total amount of time available for the driver to be in a non-monitoring state.
t_est_nonmonitoring is the estimated time in which the driver or user will not be required to monitor vehicle operation, and may be based on factors such as the vehicle’s sensing capabilities (e.g., range, coverage, detection abilities, etc.), road condition (e.g., traffic and other road users in the vicinity), road and infrastructure type (e.g., suburban, rural, highway, urban, etc.), environmental factors (e.g., rain, dusk, sleet, etc.), and road geometry (e.g., curvy, straight, etc.). The time for non-monitoring may be subject to a pre-determined maximum (e.g., bounded by some regulatory value). In addition, the time for non-monitoring may be adjusted dynamically as conditions change.
The non-monitoring time is compared to a sum of a minimum NDRT time t_min_NDRT (which may be based on data observed from the driver and/or other drivers) and the time to transition to monitoring (t_transition→monitoring) to ensure that sufficient time can be allocated for the NDRT. This comparison can be represented as:

t_est_nonmonitoring ≥ t_min_NDRT + t_transition→monitoring
The minimum time may be a value assigned to all tasks, or may be specific to one or more tasks.
If the estimated non-monitoring time is greater than or equal to the sum, then the FSM 150 transitions to an NDRT state 156. In the NDRT state 156, the vehicle operates automatically while allowing the user to be inattentive and perform the NDRT.
In this example, the estimated time for performing the NDRT (composing and sending the message) is 10 seconds, t_min_NDRT is 5 seconds and t_transition→monitoring is 3 seconds. The estimated non-monitoring time is 9 seconds, which represents an allowable amount of time based on the vehicle sensor capabilities, the state of the driver and external conditions (e.g., weather, road and traffic).
t_min_NDRT + t_transition→monitoring is 8 seconds, which is less than the non-monitoring time, thus the system can allocate time for this NDRT. The allocated time period 112 (t_NDRT_alloc) in this example is 9 seconds − 3 seconds = 6 seconds.
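The worked example can be reproduced directly in code; the variable names are informal renderings of the quantities above.

```python
# Reproducing the worked example in code (all values in seconds).
t_est_nonmonitoring = 9.0
t_min_ndrt = 5.0
t_transition_to_monitoring = 3.0

# 5 + 3 = 8 <= 9, so the NDRT can be allocated.
can_allocate = t_min_ndrt + t_transition_to_monitoring <= t_est_nonmonitoring

# Allocated time period: 9 - 3 = 6 seconds.
t_ndrt_alloc = t_est_nonmonitoring - t_transition_to_monitoring
```

Note that the allocation subtracts only the transition time, so the 6-second allocation exceeds t_min_NDRT while still leaving the full 3 seconds to return to monitoring.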
In an embodiment, the FSM 160 is configured for executing a “rolling” NDRT as discussed above, and/or is configured for executing a “rolling bounded” NDRT allocation method.
In the rolling embodiment, the FSM 160 stays in the NDRT state 166 as long as t_est_nonmonitoring ≥ t_transition→monitoring (unless conditions or user input necessitate earlier completion). In principle, the system 80 can stay in the NDRT state indefinitely, or at least until conditions occur that necessitate a return to monitoring and/or manual control.
In rolling bounded NDRT embodiments, the total non-monitoring time is bounded to be within a selected or calculated range.
For example, NDRT time is bounded by regulatory requirements, the manufacturer, driver preferences or internal computation. In an embodiment, t_NDRT_alloc is bounded by a maximum time, referred to as “NDRT Total.” Such a bound can be imposed or provided by a manufacturer or per the driver’s preferences. The bound can also be imposed by regulatory requirements.
In the NDRT state 166, the system can allocate time for multiple tasks. If the user indicates that the NDRT is complete, t_est_nonmonitoring < t_transition→monitoring, or the accumulated NDRT time is greater than or equal to a selected threshold, the FSM 160 transitions to the transition to monitoring state 168, and the system 80 returns to the monitoring state and the driver returns to monitoring.
In an embodiment, the system 80 is configured to extend an NDRT allocation t_NDRT_alloc after a short monitoring period. This allotment of a short monitoring period is referred to as “periscoping.”
It is noted that t_est_nonmonitoring may change over time as a function of environment and/or user conditions. When the user is occupied in the NDRT, the system’s ability to assess the user’s readiness to return to monitoring (and driving) is likely to degrade over time (e.g., the user is looking down at his cell phone). Thus, as the user spends more time in the NDRT state (rolling), t_est_nonmonitoring decreases, both because the system 80 may be temporarily unable to estimate the user’s ability to monitor and drive, and because the likelihood of a system malfunction increases over time. As a result, the FSM 170 can transition to short period monitoring, or “periscope.” Once periscoping is complete, t_est_nonmonitoring is likely to increase again.
For example, when the FSM 170 enters the NDRT state 176, a timer is activated. The FSM 170 stays in the NDRT state 176 as long as t_est_nonmonitoring ≥ t_transition→monitoring. If the user indicates that the NDRT is complete, or t_est_nonmonitoring < t_transition→monitoring, the FSM 170 transitions to a transition to monitoring state 178.
If the NDRT is not complete, or another NDRT is to be performed, the FSM 170 transitions to a short period monitoring state 180, in which a short period t_monitoring_min is allotted for the driver to return to monitoring temporarily. When t_monitoring_min is over and t_est_nonmonitoring < t_min_NDRT + t_transition→monitoring, the FSM 170 returns to the monitoring state 172. If t_monitoring_min is over and t_est_nonmonitoring ≥ t_min_NDRT + t_transition→monitoring, the FSM 170 transitions back to the NDRT state 176. This may be repeated as desired or as time and conditions allow.
In an embodiment, rolling and periscoping may be bounded based on, for example, regulatory requirements. As an allocated NDRT period progresses, the accumulated NDRT time t_NDRT_total (starting from the last transition to the NDRT state) is restricted to within a maximum time. For example, the FSM 170 stays in the NDRT state 176 if t_est_nonmonitoring ≥ t_transition→monitoring and t_NDRT_total is less than a threshold (e.g., 30 seconds). Likewise, transition from the NDRT state 176 to the transition to monitoring state 178 occurs if t_est_nonmonitoring < t_transition→monitoring, the user indicates completion, or the threshold is reached (e.g., t_NDRT_total is greater than or equal to 30 seconds).
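The rolling bounded rule can be sketched as a single predicate; the 30-second bound comes from the example above, while the function itself and its names are assumptions.

```python
# Sketch of the rolling bounded rule (the 30-second default mirrors the
# example above; the function and parameter names are assumptions).
def stay_in_ndrt(t_est_nonmonitoring: float,
                 t_transition_to_monitoring: float,
                 t_ndrt_total: float,
                 user_done: bool = False,
                 t_max_ndrt: float = 30.0) -> bool:
    """Remain in the NDRT state only while enough time remains to transition
    back to monitoring and the accumulated NDRT time is under the bound."""
    if user_done:
        return False
    if t_est_nonmonitoring < t_transition_to_monitoring:
        return False
    return t_ndrt_total < t_max_ndrt
```

Evaluating this predicate each control cycle yields exactly the transitions described: any of completion, insufficient remaining time, or an exceeded bound forces the transition to monitoring.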
In an embodiment, the system 80 is configured to schedule and allocate multiple tasks, which may be previously scheduled tasks or tasks requested during driving or operation. Scheduled tasks may be assigned priority or urgency levels so that more important or urgent tasks are assigned quickly.
The FSM 185 includes the monitoring state 172, the NDRT request state 174, and the NDRT state 176. The FSM 185 also includes the transition to monitoring state 178 and the short period monitoring state 180. Transitions between these states are described above with reference to the FSM 170.
The FSM 185 is configured to receive camera data, sensor data and/or other information related to the user condition and behavior. If the user’s behavior indicates that the user is attempting to perform an NDRT (without approval from the system and without an allocated NDRT time period), the FSM 185 transitions to an “unidentified NDRT” state 186. Behaviors that can trigger this transition include changes in posture, directing the user’s gaze away from the road for longer than the default time (e.g., toward the infotainment interface, the passenger seat or rear seating areas, etc.), the user attempting to engage the infotainment system or other vehicle interface, and the user verbalizing an intention to perform an NDRT.
When in the unidentified NDRT state 186, if conditions arise such that the user should be monitoring or driving, the FSM 185 transitions immediately, or within the default time, to the monitoring state, essentially refusing the NDRT. At this point, the system 80 can indicate to the user that the NDRT is not available and/or provide an estimate as to when the system 80 can allocate time for the intended NDRT.
If conditions permit, the FSM 185 transitions to an “identify NDRT” state 188 and the system 80 attempts to identify the intended NDRT. If the NDRT is identified, the system consults a look-up table or other data structure to determine the estimated time to transition to monitoring, t_transition→monitoring. If the intended NDRT is not identified, then pre-configured fixed values for t_transition→monitoring are selected.
If t_est_nonmonitoring is greater than or equal to t_transition→monitoring, the FSM 185 transitions to the NDRT state 176. If t_est_nonmonitoring is less than t_transition→monitoring, the system refuses the identified NDRT, transitions to the monitoring state 172, and indicates the refusal to the user. Alternatively, if conditions permit, the system can attempt to increase t_est_nonmonitoring (e.g., by lowering speed, shifting to the right lane, etc.).
The queue 190 is populated by individual NDRT entries (NDRTi) according to an order of execution, or according to priority. When an NDRT is complete, its record is removed and subsequent NDRT entries are moved up in the queue.
For example, the queue 190 includes a low priority entry for an NDRT “driver position change” 192. Incoming emails prompt corresponding entries to be inserted into the queue 190 according to priority. In this example, an “incoming email - urgent” entry 194 is inserted at the top of the queue with a high priority. An “incoming email - non urgent” entry 196 is inserted with an intermediate priority.
NDRTs may change their priority over time, hence the queue is “dynamic.” The priority of individual records may change, for example, due to changes in vehicle, user or environmental conditions. After a period of time has passed in the example of
An NDRT_x has two defining values: a minimum required time t_min_NDRT(x) and additional time t_transition→monitoring(x) to return to the monitoring state from a specific NDRT state. NDRT entries in the queue are examined from highest priority downwards to determine whether they comply with the following:

t_est_nonmonitoring ≥ t_min_NDRT(x) + t_transition→monitoring(x)
If so, the following time is allocated for the NDRT:

t_NDRT_alloc = t_est_nonmonitoring − t_transition→monitoring(x)
This might be more than required for the specific NDRT. In such a case, the driver will indicate “done” and will return to short period monitoring before being able to switch to another NDRT.
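The queue scan and allocation rule can be sketched as follows; the task names mirror the example queue entries, while the time values and all function names are assumptions.

```python
# Hypothetical sketch of scanning the dynamic queue from highest priority
# downward (task names mirror the example entries; times are assumptions).
def allocate_from_queue(queue, t_est_nonmonitoring):
    """queue: list of (name, t_min_ndrt, t_transition_to_monitoring) tuples,
    ordered from highest to lowest priority. Returns the first feasible
    NDRT and its allocated time, or None if nothing fits."""
    for name, t_min, t_trans in queue:
        if t_est_nonmonitoring >= t_min + t_trans:
            return name, t_est_nonmonitoring - t_trans
    return None

queue = [
    ("incoming email - urgent", 10.0, 3.0),      # high priority
    ("incoming email - non urgent", 8.0, 3.0),   # intermediate priority
    ("driver position change", 4.0, 2.0),        # low priority
]
```

With 9 seconds of estimated non-monitoring time, neither email task fits (13 and 11 seconds required), so the scan falls through to the low-priority position change and allocates 7 seconds for it.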
Referring still to
Once the system receives an explicit indication of completion from the user or an implicit indication, the NDRT is completed (in progress NDRT = False) and the completed NDRT_x is removed from the queue.
If t_est_nonmonitoring < t_transition→monitoring, transition to monitoring in state 178 is performed. If the NDRT_x is not completed, the FSM 200 transitions to the short monitoring period state 180 and initializes a monitoring timer. If the NDRT is currently being performed and the monitoring timer reaches a minimum value t_monitoring_min, the system returns to the NDRT state 176. In this way, a short monitoring period is inserted into the allocated NDRT time for a specific NDRT.
If t_est_nonmonitoring < t_transition→monitoring, the FSM 200 transitions to monitoring in state 178. If the NDRT_x is complete (in progress NDRT = False), the transition is to the monitoring state 172. NDRT_x is then removed from the queue and x is initialized to zero. If additional time for performing NDRTs is available, the FSM 200 transitions back to state 206 and again checks the NDRT according to the queue.
In an embodiment, vehicle operation and/or route may be controlled in order to allow time for, or otherwise facilitate, longer NDRTs. If multiple routes are possible, a given route may be selected based on criteria related to NDRT performance. For example, routes are evaluated by determining the shortest route, subject to sufficient estimated NDRT time throughout the route, to accommodate the current NDRTs in the queue (e.g., those requested by the driver). This is a setting that the driver can manipulate in a preferences page of the system. The setting can be pre-defined and/or changed dynamically during driving. Available routes may be represented as a set S = {s_1, s_2, ..., s_k} of possible routes. Each route s_i has two relevant parameters: a travel time t_i and an available NDRT time NDRT_time_i. The optimization problem for choosing the best possible route is to find the minimum t_i under the constraint that NDRT_time_i is at least some estimated value based on the driver’s requested NDRTs.
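The route-selection rule reduces to a constrained minimum; the sketch below illustrates it with assumed route names and times.

```python
# Sketch of the route-selection rule: among routes offering at least the
# required NDRT time, choose the minimum travel time (values are assumptions).
def choose_route(routes, required_ndrt_time):
    """routes: list of (name, t_i, ndrt_time_i) tuples. Returns the
    fastest feasible route, or None if no route satisfies the constraint."""
    feasible = [r for r in routes if r[2] >= required_ndrt_time]
    if not feasible:
        return None
    return min(feasible, key=lambda r: r[1])

routes = [("s1", 20.0, 5.0), ("s2", 25.0, 12.0), ("s3", 30.0, 15.0)]
```

For 10 seconds of required NDRT time, s1 is infeasible despite being fastest, so s2 is chosen; with no feasible route, the system would fall back to the default shortest route and defer the NDRT.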
The system 80 may also adjust vehicle dynamics in order to accommodate NDRTs. For example, the system 80 identifies a need for an immediate NDRT (e.g., scheduled prior to or during the drive, explicit requests, positive NDRT acts, an urgent email, etc.). The system 80 then determines whether t_est_nonmonitoring is sufficient to allow a user to perform the NDRT. If not, the system 80 can adjust driving style to increase t_est_nonmonitoring. Non-limiting examples of driving adjustments include changing to the left-most lane and avoiding additional changes of lane, reducing speed, adjusting speed to follow another vehicle, and making changes to a route.
The following table illustrates examples of NDRT times. Each NDRT time is associated with a specific task or associated with a type of task. As shown, NDRT times can be categorized as short and long. Short times (type I) are typically under one minute and involve relatively simple tasks. Long times (type II) are typically of longer duration (minutes to hours) and involve high concentration. This table may be configured as a look-up table and stored, for example, in the database 96.
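A look-up table of this kind might be represented as follows; the 10-second text message entry echoes the worked example above, but every other task and duration is an illustrative assumption, not data from the source table.

```python
# Illustrative look-up table (hypothetical): short type I tasks run under a
# minute; type II tasks take minutes to hours and involve high concentration.
NDRT_TIMES = {
    # task: (type, estimated time in seconds)
    "send a text message": ("I", 10),           # from the worked example
    "change driving position": ("I", 15),       # assumed
    "make a short phone call": ("I", 45),       # assumed
    "eat a meal": ("II", 15 * 60),              # assumed
    "join a videoconference": ("II", 30 * 60),  # assumed
}
```

Stored in the database 96, such a structure lets the scheduling module fetch an estimated time by task name when an NDRT request arrives.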
Embodiments of a user interaction system 199 are described with reference to
The user interaction system 199 supports various forms of user input, such as touchscreen inputs (e.g., soft buttons in the display 200 and/or a touchscreen of a mobile device), heads-up display (HUD) inputs, audible or speech inputs via a microphone, and implicit inputs derived from tracking or monitoring user condition and behavior (e.g., eye tracking).
Outputs include affordances (i.e., alerts, indications and/or visualizations prompting the user to act in order to perform an NDRT, or informing the user that the vehicle is conducive to performing an NDRT given all external conditions) and allocations of time periods for NDRTs, display features (e.g., color, representations of vehicle path and environment), displayed timer information (e.g., remaining time of an allocated time period or t_NDRT_alloc), and situational awareness information. The situational awareness information can be provided via a combination of one or more modalities, such as visual representations, lights, sounds (directional or non-directional) and haptics (directional or non-directional). For example, as discussed further herein, outputs include trajectory information and a visual indication (e.g., dashed lines) of an upcoming area or segment that is conducive to allocation of a non-monitoring period, optionally with additional cues (e.g., sound and/or haptics) as the vehicle approaches the area.
As shown in
Referring to
In addition, a steering wheel indicator 212, such as an array of LED lights along a section of the steering wheel 202, may complement the path markers 210 and the NDRT safe segment markers 214. The steering wheel indicator 212 is also color coded to correspond with the color coding of the vehicle path and NDRT safe segment markers (e.g., the color coding of the steering wheel can be green with white dashed lines 201 that match the color and representation format of the NDRT safe segment in the road display 204).
Additional indicators may be included in the display 200 and/or at various locations within the vehicle cockpit. For example, as shown in
In an embodiment, the color of the vehicle path markers 210, border 216 and steering wheel indicator 212 all have the same color, which may be selected to be different than the color of other objects or vehicles represented by the road display 204. For example, the color of the vehicle path markers is green with white dashed lines when in a safe segment, plain green in automated road segments, yellow in response to a non-urgent system request for takeover, and red when an allocation is unavailable (e.g., conditions are not conducive to NDRTs or the user should be available to take control because of an urgent condition). Any of various color schemes may be used. For example, if a vehicle already uses a green-yellow-red color scheme for other purposes, an additional color (e.g., magenta), or texture, can be used to differentiate NDRT allocations.
When the vehicle is in an “NDRT safe area” corresponding to a safe segment, allocation of a time period and entry into the NDRT state is permitted and the NDRT safe segment markers 214 appear. The time during which the vehicle is in an NDRT safe area is referred to as an “NDRT safe period.” During the NDRT safe period, an NDRT or NDRTs can be selected and time allocated for performance thereof.
When the allocation and scheduling system 80 decides that it is safe to execute an NDRT, a single or multi-modal alert can be generated, and features of the infotainment graphic 209 (e.g., “My Tasks”) are enabled and open for interaction. In addition, the interaction system 199 causes the border 216 to appear in the display 200 (e.g., in response to a user’s gaze and/or upon determination that it is safe to execute the NDRT requested previously), and provides an alert in the form of a steady light or a subtle pulsating light, a short pleasant sound, and/or haptics including one or more pulsations of the steering wheel, a vibration, and others. The alert informs the user that an NDRT can be performed or an allocation can be requested. For example, in conjunction with fade-in of the border 216, a “My Tasks” feature can show a list of user requests and/or a list of system generated recommendations for NDRTs (e.g., from a dynamic queue) that the user can select. Current activities are tracked, and uncompleted and scheduled tasks remain in the list (e.g., uncompleted tasks remain on top). An additional cue can be provided to inform the user when at or approaching the end of an NDRT safe segment.
In an embodiment, the interaction system 199 includes an adaptive timer configured to track the amount of time remaining in an allocated time period or time remaining for the user to request an NDRT, as shown in
As discussed further below, one or more of the indicators may be activated and deactivated based on direct user input (e.g., via the user pressing the soft button, or via speech), or implicitly based on where the user is looking as determined by eye gaze detection or other means of user attention tracking. For example, the border 216 (when the vehicle is in a safe area) appears when the user’s gaze is directed to the infotainment graphic 209, indicating that it is safe to interact with the graphic 209.
In an embodiment, the user interaction system 199 accommodates periscoping (i.e., the provision of a short monitoring period within an allocated time period or between NDRTs). For example, during an allocated time period (vehicle is in the NDRT state), if the user diverts his or her gaze from the infotainment interface 209 back to the road, the countdown marks 207 and the border 216 fade away. A new activation of the infotainment area (e.g., by returning gaze to the infotainment interface 209) resets the timer. Upon the completion of an NDRT, the user may notify the system 199 by speech, swiping the task away, pressing the soft button 205 or any other suitable manner.
The indicators may operate in conjunction with the timer to inform the user as to the time remaining in a subtle manner. For example, as the timer counts down, the border 216 can gradually fade until the timer ends, and optionally may pulsate. Haptic indicators may operate in coordination with the border 216, to indicate that the vehicle path and the NDRT safe segment are coming to an end. For example, a pleasant but firm chime can play and the vehicle seat can also vibrate. A haptic device in the steering wheel 202 may pulsate with a frequency that is synchronized with pulsation of the border 216. The indicators may also operate in this manner to indicate the start of an NDRT safe period.
In addition to the above information, the display 200 may be configured to provide situational awareness information regarding detected objects in the environment around the vehicle, such as other vehicles and road users, and/or any other features or objects of the environment. For example, as shown in
The user may request an NDRT at any time during the NDRT safe period. The request may be made by speech, specifying in natural language the required task, its estimated duration, and/or the conditions required for its execution. Alternatively, the user may make a request without providing any further details. Moreover, the user can begin an NDRT informally, without making any request, as detected for example by tracking the user. In this case, if the NDRT can be supported, the system 80 allows it. If the NDRT cannot be supported, the system 80 vetoes it, optionally with an explanation and/or an alert message and markers on the road display 204 indicating if and when the NDRT safe segment will become available.
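The allow/veto decision described above could be sketched as follows, assuming the request carries an estimated duration and the system knows the remaining NDRT-safe window. The function name, the fit-within-window criterion, and the explanation string are assumptions, not the disclosed decision logic:

```python
def handle_ndrt_request(task_duration_s: float,
                        safe_window_s: float,
                        time_to_next_window_s: float = 0.0) -> tuple[str, str]:
    """Illustrative accept/veto decision for an NDRT request.
    Allows a task that fits within the remaining NDRT-safe segment;
    otherwise vetoes it with an explanation of when the next safe
    segment becomes available."""
    if task_duration_s <= safe_window_s:
        return ("allow", "")
    return ("veto",
            f"task needs {task_duration_s:.0f}s but only "
            f"{safe_window_s:.0f}s of NDRT-safe segment remain; "
            f"next safe segment in {time_to_next_window_s:.0f}s")
```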
If the vehicle is not in an NDRT safe area, input features of the display 200 may be disabled. For example, interaction with the My Tasks area of the infotainment graphic 209 is disabled. If the user touches or interacts with the infotainment graphic 209, the display 200 can notify the user in various ways that My Tasks is disabled, why it is disabled, and when it will be made available again. This notification may be visual, for example, by the border 216 turning grey or another color to indicate that the vehicle is outside of an NDRT safe segment. This notification may also be in the form of speech. For example, the system 199 can read out listed tasks following a request, including how long it will take to get to the next NDRT safe zone, and allow the user to place a request that will be handled or addressed by the system 80 at a later time.
As shown in
When the user selects an NDRT, a time period is allocated, and a countdown is shown on the soft button 205. The countdown marks 207 have the same color as the vehicle path markers 210 and the NDRT safe segment marker 210 when a new task begins. If the user wishes to perform an off-line task (i.e., a short task, such as turning to the back seat) that is not listed, the user may activate the button 205 independently of the selected NDRT, which allocates a fixed time (e.g., 10 seconds). Turning one’s gaze may activate peripheral lighting (see
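The time-allocation rule above, including the fixed allowance for an unlisted off-line task, might be sketched as follows; the function name, the use of `None` to represent "no listed task selected", and the 10-second constant (taken from the example in the text) are illustrative assumptions:

```python
from typing import Optional

# Fixed allowance for an unlisted "off-line" task (e.g., turning to the
# back seat); 10 seconds follows the example given in the description.
OFFLINE_TASK_S = 10.0

def allocate_time(selected_task_s: Optional[float]) -> float:
    """Allocate the selected NDRT's estimated duration, or the fixed
    off-line allowance when the soft button 205 is activated with no
    listed task selected (sketch only)."""
    if selected_task_s is None:
        return OFFLINE_TASK_S
    return selected_task_s
```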
As shown in
Aspects of the user interaction system may be incorporated into a mobile device that coordinates with the allocation system and display. For example, as shown in
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.