Embodiments of the invention relate to the field of automotive control systems. In particular, some embodiments of the invention relate to driver assistance systems and driver interface devices.
Driver assistance systems such as, for example, adaptive cruise control and automated lane change systems have been successfully deployed to the market to increase driver comfort and safety. As these driver assistance systems progress in sophistication, less driver interaction may be required. In some cases, the driver assistance systems may be fully automated for portions of a trip. Accordingly, the role of the driver has changed from that of an active driver to that of a passenger, for at least some portion of the trip. Highly automated vehicles allow the driver to hand over control to the automated vehicle and to perform other tasks while driving.
Vehicle systems that employ adaptive cruise control or automated driving features may provide a user interface shown on a display that provides the user/driver with information regarding the operations being performed by the automated system. However, some advanced automated vehicle systems may require little or no interaction from the user/driver during normal operation of the automated vehicle system. Instead, one purpose of providing a graphical user interface with detailed feedback information for the user/driver is to allow the user/driver to feel more comfortable “letting go” of control over the operation of the vehicle and turning operation of the vehicle over to the automated system.
A novice user/driver may require a substantial amount of information about the system state and the operations being performed by the automated vehicle system in order to develop trust in the automated system and to be convinced that the automated system is functioning safely and properly. On the other hand, a more experienced user/driver may have already developed a significant degree of trust and may prefer a simpler user interface with less information.
Some embodiments of this invention provide an adaptive user interface for a vehicle equipped with one or more automated vehicle systems. The adaptive user interface changes the format and level of detail of the information displayed on the user interface based on feedback from the user/driver. In some embodiments, the adaptive system is configured to be manually adjusted by the user/driver. In other embodiments, the adaptive system is configured to monitor one or more sensors and to ascertain a stress level of the driver to evaluate whether the driver is comfortable with the level of information being displayed on the user interface.
In one embodiment, the invention provides an adaptive user interface system for a vehicle with an automatic vehicle system. The adaptive user interface system includes a display and an electronic controller. The controller is configured to generate a graphical user interface indicative of operation of the automatic vehicle system, output the graphical user interface on the display, monitor indicia of a driver's comfort level, and determine, based on the monitored indicia, when the driver is not comfortable with the operation of the automatic vehicle system. In response to determining that the driver is not comfortable with the operation of the automatic vehicle system, the electronic controller modifies the graphical user interface to provide an increased level of detail.
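The control loop described in this embodiment can be sketched as follows. This is a minimal illustration, not the claimed implementation: the class name, the three detail levels, and the 0.5 discomfort threshold are all assumptions made for the example.

```python
# Illustrative sketch of the claimed control loop; all names and
# thresholds are hypothetical, not part of the claimed system.

class AdaptiveUserInterfaceController:
    """Monitors driver-comfort indicia and adjusts UI detail level."""

    def __init__(self, display, detail_levels=("low", "medium", "high")):
        self.display = display
        self.detail_levels = list(detail_levels)
        self.level_index = 0  # start with the simplest view

    def driver_is_uncomfortable(self, indicia):
        # Placeholder decision rule: any monitored indicium above an
        # assumed threshold is treated as discomfort.
        return any(value > 0.5 for value in indicia.values())

    def update(self, indicia):
        # Claimed behavior: on detecting discomfort, increase the
        # level of detail shown on the display.
        if self.driver_is_uncomfortable(indicia):
            if self.level_index < len(self.detail_levels) - 1:
                self.level_index += 1
        self.display.render(self.detail_levels[self.level_index])
```

In a real system the indicia would come from the sensors described below (e.g., physiological or interaction sensors), and the rendering step would redraw the graphical user interface on the display.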
Other aspects of the invention will become apparent by consideration of the detailed description and accompanying drawings.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “mounted,” “connected” and “coupled” are used broadly and encompass both direct and indirect mounting, connecting and coupling. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings, and can include electrical connections or couplings, whether direct or indirect. Also, electronic communications and notifications may be performed using any known means including wired connections, wireless connections, etc.
It should also be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components, may be utilized to implement the invention. In addition, it should be understood that embodiments of the invention may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art, based on a reading of this detailed description, would recognize that, in at least one embodiment, the electronic based aspects of the invention may be implemented in software (e.g., stored on non-transitory computer-readable medium) executable by one or more processors. For example, “control units” and “controllers” described in the specification can include one or more processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, and various connections (e.g., a system bus) connecting the components.
In the example illustrated, the autonomous vehicle control system 10 includes an electronic controller 12, vehicle control systems 14, sensors 16, one or more exterior vehicle cameras 18, a transceiver 20, and a display 22. The components of the autonomous vehicle control system 10, along with other various modules and components are electrically coupled to each other by or through one or more control or data buses, which enable communication therebetween. The use of control and data buses for the interconnection between, and communication among, the various modules and components would be known to a person skilled in the art in view of the invention described herein. In alternative embodiments, some or all of the components of the autonomous vehicle control system 10 may be communicatively coupled using suitable wireless modalities (for example, Bluetooth™ or near field communication). For ease of description, the autonomous vehicle control system 10 illustrated in
The electronic controller 12 includes an electronic processor 24 (e.g., a microprocessor, application specific integrated circuit, etc.), a memory 26, and an input/output interface 28. The memory 26 may be made up of one or more non-transitory computer-readable media, and includes at least a program storage area and a data storage area. The program storage area and the data storage area can include combinations of different types of memory, such as read-only memory (“ROM”), random access memory (“RAM”) (e.g., dynamic RAM (“DRAM”), synchronous DRAM (“SDRAM”), etc.), electrically erasable programmable read-only memory (“EEPROM”), flash memory, a hard disk, an SD card, or other suitable magnetic, optical, physical, or electronic memory devices. The electronic processor 24 is coupled to the memory 26 and the input/output interface 28. The electronic processor 24 sends and receives information (e.g., from the memory 26 and/or the input/output interface 28), and processes the information by executing one or more software instructions or modules, capable of being stored in the memory 26, or another non-transitory computer readable medium. The software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. The electronic processor 24 is configured to retrieve from the memory 26 and execute, among other things, software for autonomous vehicle control, and for performing methods as described herein.
The input/output interface 28 transmits and receives information from devices external to the electronic controller 12 (e.g., over one or more wired and/or wireless connections), such as the vehicle control systems 14, the sensors 16, the exterior cameras 18, the transceiver 20, and the display 22. The input/output interface 28 receives user input, provides system output, or a combination of both. As described herein, user input from a driver or passenger of a vehicle may be provided by one or more human-machine interface components including, for example, a touch-screen display 22, a microphone, or a button/control. The input/output interface 28 may also include other input and output mechanisms, which for brevity are not described herein and which may be implemented in hardware, software, or a combination of both.
It should be understood that although
The electronic processor 24 uses the input/output interface 28 to send and receive information or commands to and from the vehicle control systems 14 (e.g., over a vehicle communication bus, such as a CAN bus). The vehicle control systems 14 include components (e.g., actuators, motors, and controllers) to control a plurality of vehicle systems (e.g., braking, steering, and engine power output). For the sake of brevity, the vehicle control systems 14 will not be described in greater detail. The electronic processor 24 is configured to operate the vehicle control systems 14 to autonomously drive the vehicle. In some embodiments, the vehicle control systems 14 are controlled to automatically drive the vehicle without driver intervention or input for the entirety of a trip. In other embodiments, the vehicle control systems 14 are controlled to drive the vehicle for one or more portions of a trip, and to allow or require a driver to manually operate the vehicle for one or more portions of the trip.
For example, the user interface of
In addition to displaying the vehicle 401 and the driving lane 403, the user interface of
Additionally, the detailed view of the user interface of
The adaptive user interface system is configured to modify the type and amount of information on the display to adapt to the driver's preferences. In some implementations, the controller, via the user interface, asks the driver how much information the driver desires. The user interface presents multiple questions to the driver, with the requested level of information grouped into categories such as low, medium, or high. In some implementations, the controller is configured to ask questions about specific displayed items. The questions may be presented on start-up of the vehicle or based on time/mileage intervals.
However, if the driver indicates that they would like to change the display mode (step 603), the system further queries whether the driver would like for the user interface to provide more detail or less detail (step 613). If the driver responds with “more,” the system modifies the user interface to adapt to the request (step 611). If the driver responds with “less,” the system modifies the user interface to adapt to that request (step 613).
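The driver-feedback steps above can be sketched as a simple transition function. This is an assumption-laden illustration: the function name, the index-based representation of detail levels, and the default of three levels are hypothetical; the step numbers in the comments refer to the flowchart steps cited in the text.

```python
def apply_detail_feedback(level_index, wants_change, direction, num_levels=3):
    """Hypothetical sketch of the feedback flow described above.

    level_index -- current detail level (0 = simplest view)
    wants_change -- driver's answer at step 603
    direction -- "more" or "less", the driver's answer at step 613
    Returns the new detail-level index, clamped to the valid range.
    """
    if not wants_change:
        # Step 603 answered "no": keep the current display mode.
        return level_index
    if direction == "more":
        # Step 611: increase the level of detail shown.
        return min(level_index + 1, num_levels - 1)
    if direction == "less":
        # Decrease the level of detail, clamped at the simplest view.
        return max(level_index - 1, 0)
    return level_index
```

Clamping at both ends reflects the cycling behavior described below: once the most detailed (or simplest) preconfigured view is reached, a further request in the same direction leaves the display unchanged.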
In some implementations, the adaptive user interface system is configured to cycle through a series of preconfigured user interface configurations in response to driver feedback. For example, if the adaptive user interface system is currently showing the user interface of
Over time, as the driver becomes familiar and comfortable with the automated systems, the controller automatically reduces the amount of information presented on the display. The system may also cause the display to indicate to the driver the information that is not currently being displayed, but is available if desired.
In other implementations, instead of or in addition to prompting the driver for conscious/active feedback (as illustrated in
Conversely, if the sensors do not indicate a level of distress (step 703), the system checks to see if a defined period of time has elapsed since the display was last modified (step 709) and, if so, the level of detail in the user interface is automatically decreased (step 711). However, if the user interface was already recently modified (step 709), then the system continues to display that same level of detail in the user interface. In this way, the system not only responds automatically to the driver's comfort level, but ultimately seeks to reduce the level of detail provided in the user interface as the driver becomes more comfortable with the automatic operation of the vehicle system.
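The sensor-driven adjustment described above can be sketched as follows. The function name, the five-minute dwell time, and the (index, timestamp) return shape are assumptions for illustration; the source specifies only the decision structure (distress check at step 703, elapsed-time check at step 709, automatic decrease at step 711).

```python
import time

def adjust_detail(level_index, distress_detected, last_modified,
                  now=None, hold_seconds=300, max_index=2):
    """Hypothetical sketch of the sensor-driven loop described above.

    Returns (new_level_index, timestamp_of_last_modification).
    `hold_seconds` is an assumed dwell time; the source does not give one.
    """
    now = time.time() if now is None else now
    if distress_detected:
        # Sensors indicate distress: raise the level of detail.
        return min(level_index + 1, max_index), now
    if now - last_modified >= hold_seconds:
        # Steps 709/711: no distress and enough time has passed since the
        # last change, so automatically decrease the level of detail.
        return max(level_index - 1, 0), now
    # Display was recently modified: keep the current level of detail.
    return level_index, last_modified
```

Calling this periodically implements the behavior described: the detail level rises whenever distress is sensed and otherwise decays toward the simplest view as the driver grows comfortable with automatic operation.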
It is to be understood that the specific examples described above are illustrative and that other implementations are possible without departing from the scope of the invention. For example, instead of showing the user interface on a screen mounted on the interior dashboard of the vehicle, the user interface may be output through an instrument cluster, a heads-up display, a center console, a mobile device (e.g., a smart phone or tablet), or a wearable device (e.g., a watch). Likewise, the specific user interfaces illustrated in
Thus, the invention provides, among other things, a drive state indicator for an autonomous vehicle. Various features and advantages of the invention are set forth in the following claims.
This application claims the benefit of U.S. Provisional Application No. 62/097,868, filed Dec. 30, 2014, the entire contents of which are incorporated herein by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2015/068012 | 12/30/2015 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2016/109635 | 7/7/2016 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6812942 | Ribak | Nov 2004 | B2 |
8209093 | Hill | Jun 2012 | B2 |
8676431 | Mariet | Mar 2014 | B1 |
8712632 | Tuukkanen | Apr 2014 | B2 |
20060218506 | Srenger et al. | Sep 2006 | A1 |
20090040054 | Wang | Feb 2009 | A1 |
20090262239 | Cho et al. | Oct 2009 | A1 |
20100179932 | Yoon et al. | Jul 2010 | A1 |
20100182140 | Kohno et al. | Jul 2010 | A1 |
20110022393 | Waller et al. | Jan 2011 | A1 |
20110082620 | Small et al. | Apr 2011 | A1 |
20120306637 | McGough et al. | Dec 2012 | A1 |
20130144470 | Ricci | Jun 2013 | A1 |
20140088840 | Baumgarten | Mar 2014 | A1 |
20150175169 | Flehmig | Jun 2015 | A1 |
20170021837 | Ebina | Jan 2017 | A1 |
Number | Date | Country |
---|---|---|
101417655 | Apr 2009 | CN |
101443227 | May 2009 | CN |
102109821 | Aug 2013 | CN |
102011101708 | Nov 2012 | DE |
Entry |
---|
International Search Report and Written Opinion for Application No. PCT/US2015/068012 dated Apr. 14, 2016, (11 pages). |
First Office Action from the National Intellectual Property Office Administration, P.R. China for Application No. 201580077143.5 dated Nov. 27, 2018 (8 pages). |
Number | Date | Country | |
---|---|---|---|
20180001903 A1 | Jan 2018 | US |
Number | Date | Country | |
---|---|---|---|
62097868 | Dec 2014 | US |