The present disclosure relates to an adaptive external road scene transfer to improve driver awareness.
Vehicles, such as cars, typically include displays or indicators to provide information to the vehicle user. Such displays or indicators may, for example, provide information regarding mileage, fuel consumption, and vehicle speed. The vehicle user usually has to shift his or her gaze away from the road scene and onto an in-vehicle display in order to visually process the information presented by these displays or indicators.
One possible aspect of the disclosure provides a method of alerting a user of a vehicle as to a scene external to the vehicle, which includes capturing data pertaining to the scene, external to the vehicle. The captured data, pertaining to the scene external to the vehicle, is transmitted to a processor. A determination is made, in the processor, that a characteristic requiring the user's visual attention is in the scene. Data pertaining to a gaze of the user is captured. The captured data, pertaining to the gaze of the user, is transmitted to the processor. A determination is made, in the processor, that the user is gazing toward a user device. A signal is transmitted to the user device such that the user device alerts the user as to the characteristic in the scene requiring the user's visual attention.
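By way of a non-limiting illustration, the overall flow of such a method may be sketched in Python as follows; the exterior_camera, interior_camera, processor, and user_device objects and all of their methods are hypothetical placeholders for the components described herein, not a disclosed implementation.

```python
# Non-limiting, illustrative sketch of the alerting method; every object
# and method here is a hypothetical placeholder for the cameras,
# processor, and user device described in the disclosure.

def run_scene_awareness_cycle(exterior_camera, interior_camera, processor, user_device):
    # Capture data pertaining to the scene external to the vehicle and
    # transmit it to the processor.
    scene_frame = exterior_camera.capture()

    # Determine, in the processor, whether a characteristic in the scene
    # requires the user's visual attention.
    characteristic = processor.detect_salient_characteristic(scene_frame)
    if characteristic is None:
        return  # nothing in the scene requires attention

    # Capture data pertaining to the gaze of the user and transmit it
    # to the processor.
    gaze_frame = interior_camera.capture()

    # Determine, in the processor, whether the user is gazing toward
    # the user device; if so, signal the device to alert the user.
    if processor.gaze_target(gaze_frame) == "user_device":
        user_device.alert(characteristic, scene_frame)
```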
Another aspect of the disclosure provides a scene awareness system for a vehicle that includes at least one exterior camera, at least one interior camera, and a vehicle controller. The exterior camera is configured to capture data pertaining to a scene external to the vehicle. The interior camera is configured to capture data pertaining to an orientation of a gaze of a user of the vehicle. The vehicle controller is in communication with each of the at least one exterior camera and the at least one interior camera. The vehicle controller is configured to determine whether at least one characteristic in the scene, exterior to the vehicle, requires the user's visual attention. The vehicle controller is also configured to determine whether the user is gazing at a user device. When the controller determines that at least one characteristic in the scene requires the user's attention while simultaneously determining that the user is gazing at the user device, the vehicle controller transmits a signal to the user device such that the user device alerts the user as to the characteristic in the scene requiring the user's visual attention.
In yet another aspect of the disclosure, a vehicle includes a body and a scene awareness system. The scene awareness system includes at least one exterior camera, at least one interior camera, and a vehicle controller. The cameras are operatively attached to the body. The exterior camera is configured to capture data pertaining to a scene external to the vehicle. The interior camera is configured to capture data pertaining to an orientation of a gaze of a user of the vehicle. The vehicle controller is in communication with each of the cameras. The vehicle controller is operable for receiving data pertaining to the scene, external to the vehicle, from the at least one exterior camera. The vehicle controller includes a processor, and a characteristic in the scene requiring the user's visual attention is determined in the processor. Data pertaining to a gaze of the user is received from the interior camera. A determination is made that the user is gazing toward a user device. A signal is transmitted to the user device, from the controller, that alerts the user as to the characteristic in the scene requiring the user's visual attention.
The above features and advantages and other features and advantages of the present teachings are readily apparent from the following detailed description of the best modes for carrying out the present teachings when taken in connection with the accompanying drawings.
Those having ordinary skill in the art will recognize that terms such as “above,” “below,” “upward,” “downward,” “top,” “bottom,” etc., are used descriptively for the figures, and do not represent limitations on the scope of the invention, as defined by the appended claims. Furthermore, the invention may be described herein in terms of functional and/or logical block components and/or various processing steps. It should be realized that such block components may include any number of hardware, software, and/or firmware components configured to perform the specified functions.
Referring now to the drawings, like numerals indicate corresponding parts throughout the several views.
The interior camera(s) 28 is also in operative communication with the processor 22. The interior camera(s) 28 may be used to capture visual information with respect to the user 18 of the vehicle 10. More specifically, the interior camera(s) 28 may capture information about the head of the user 18, including the eyes, in real time. The interior camera(s) 28 may be positioned within the vehicle 10, e.g., attached to a rearview mirror or the like. Alternatively, the interior camera 28 may be resident within the user device 42. In such an example, the user device 42 may be configured with both the interior camera 28 and the exterior camera 26. The captured visual information may be transmitted as data (arrow 40) to the processor 22 within the vehicle controller 20. The interior camera(s) 28 is configured to determine where the eyes of the user 18 are looking and/or to determine an orientation of the head of the user 18, relative to the windshield 14 and/or the forward end 21 of the vehicle 10. The interior camera(s) 28 and/or the user device 42 may be configured to determine whether the user 18 is gazing out the windshield 14, gazing at the user device 42, or gazing in any other direction, using machine vision (MV) for facial detection. The interior camera(s) 28 may be a camera, an infrared (IR) sensor, or the like.
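By way of a non-limiting illustration, one possible gaze-classification routine is sketched below in Python using face detection from the OpenCV library (the opencv-python package); the 0.6 zone threshold, the function name classify_gaze, and the mapping from face position to gaze direction are assumptions for illustration only, whereas a production system might instead use full head-pose estimation and eye tracking.

```python
import cv2

# Hypothetical gaze classifier: detect the face of the user in an
# interior-camera frame and map its position to a coarse gaze zone.
# The 0.6 threshold and the zone mapping are illustrative assumptions.

_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_gaze(frame_bgr):
    """Return 'windshield', 'user_device', or 'unknown' for one frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "unknown"
    x, y, w, h = faces[0]
    # Illustrative assumption: a face centered low in the frame suggests
    # a head pitched downward, toward a device held below the windshield.
    if (y + h / 2.0) > 0.6 * frame_bgr.shape[0]:
        return "user_device"
    return "windshield"
```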
The vehicle controller 20 is also configured to be in operative communication with the user device 42. The user device 42 includes a device controller 44 and a display screen 46. The user device 42 is configured to display information, including text and graphics, on the display screen 46. The user device 42 may be a portable personal device, such as a cell phone, a tablet, a computer, and the like. Alternatively, the user device 42 may be integrated into the vehicle 10, e.g., as an integrated infotainment device.
As described in more detail below, the user device 42 is configured to be in operative communication with the vehicle controller 20. As such, the vehicle controller 20 may selectively transmit a signal (arrow 48) to the device controller 44 to clear the display screen 46 and/or to replace the displayed image of the display screen 46 with the captured scene 32.
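As a non-limiting illustration of the signal (arrow 48), the vehicle controller 20 might encode a small instruction message for the device controller 44 as sketched below; the JSON encoding, the action names, and the helper make_display_signal are assumptions for illustration only, not a disclosed wire format.

```python
import json

# Hypothetical wire format for the signal (arrow 48) from the vehicle
# controller 20 to the device controller 44; the action names and the
# JSON header are illustrative assumptions.

def make_display_signal(action, scene_jpeg=None):
    """Build a signal that clears the screen, shows the captured scene,
    or restores the originally displayed content."""
    if action not in ("CLEAR_SCREEN", "SHOW_SCENE", "RESTORE_NOMINAL"):
        raise ValueError("unknown action: " + action)
    header = {"action": action, "scene_bytes": len(scene_jpeg or b"")}
    # The JSON header is followed by the (optional) scene image bytes.
    return json.dumps(header).encode("utf-8") + b"\n" + (scene_jpeg or b"")
```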
The vehicle controller 20 is programmed with, or has access to, an algorithm or method 100, the execution of which provides a method of improving awareness of the user 18 of the vehicle 10 as to the scene 32, external to the vehicle 10. The algorithm 100 is explained in detail below and shown in the accompanying drawings.
At step 102, the vehicle controller 20 receives data pertaining to the operation of the vehicle 10 by the user 18.
Additionally, determining whether the vehicle 10 is being operated may include receiving geographic data into the vehicle controller 20 regarding the position of the vehicle 10, e.g., via a global positioning system (GPS), and the like. This geographic data may, in turn, be used to determine upcoming traffic conditions in order to suggest a driving route and/or a traffic lane to be used. This geographic data may also be used to present a preview of an upcoming scene, e.g., a hidden intersection, a hidden driveway, a curve, and the like. Next, the method proceeds to step 104.
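By way of a non-limiting illustration, the sketch below shows how upcoming route geometry from a GPS or map source might be scanned for a sharp curve that could warrant a scene preview; the flat-earth heading approximation and the 25-degree threshold are arbitrary assumptions suitable only for short look-ahead distances.

```python
import math

# Illustrative check for upcoming road geometry, e.g., a sharp curve,
# using route points (latitude, longitude) from a GPS or map source.

def heading_deg(p1, p2):
    """Approximate heading, in degrees, from point p1 toward point p2."""
    dlat = p2[0] - p1[0]
    dlon = (p2[1] - p1[1]) * math.cos(math.radians(p1[0]))
    return math.degrees(math.atan2(dlon, dlat))

def upcoming_sharp_curve(route_points, threshold_deg=25.0):
    """Return True if consecutive route segments bend past the threshold."""
    for a, b, c in zip(route_points, route_points[1:], route_points[2:]):
        bend = abs(heading_deg(b, c) - heading_deg(a, b))
        bend = min(bend, 360.0 - bend)  # handle compass wrap-around
        if bend > threshold_deg:
            return True
    return False
```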
At step 104, the vehicle controller 20 determines, based on the data, received by the vehicle controller 20, whether the vehicle 10 is being operated by the user 18. If the vehicle controller 20 determines the vehicle 10 is not being operated, the method returns to step 102. However, if the vehicle controller 20 determines the vehicle 10 is being operated, the method proceeds to step 106.
At step 106, the exterior camera(s) 26 captures data pertaining to the scene 32, external to the vehicle 10. The method then proceeds to step 108.
At step 108, the captured data, pertaining to the scene 32 exterior to the vehicle 10, is transmitted to the processor 22, in real-time. The method then proceeds to step 110.
At step 110, the processor 22 determines whether the scene 32 is visually salient. More specifically, the processor 22 determines whether there are important objects 30 or characteristics within the captured data of the scene 32. If a determination is made that an object 30 considered to be important is detected in the scene 32, the method proceeds to step 112. However, if a determination is made that no visually salient objects 30 or characteristics are detected in the scene 32, the method returns to step 102.
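As a non-limiting illustration of the saliency determination at step 110, one possible approach uses the spectral-residual static saliency detector available in the opencv-contrib-python package, as sketched below; the salient-area fraction threshold is an arbitrary assumption, and a production system might instead use trained object detection.

```python
import cv2
import numpy as np

# Illustrative saliency test for step 110, using the spectral-residual
# static saliency detector from the opencv-contrib-python package.

_saliency = cv2.saliency.StaticSaliencySpectralResidual_create()

def scene_is_salient(frame_bgr, min_salient_fraction=0.02):
    """Return True if a meaningful fraction of the frame is salient."""
    ok, saliency_map = _saliency.computeSaliency(frame_bgr)
    if not ok:
        return False
    # Binarize the [0, 1] saliency map with Otsu's method and measure
    # what fraction of the pixels are flagged as salient.
    gray = (saliency_map * 255).astype(np.uint8)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    return (binary > 0).mean() > min_salient_fraction
```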
At step 112, with the user 18 seated within the interior 16 of the vehicle 10, the interior camera(s) 28 captures, in real time, data pertaining to a gaze 29 of the user 18, including, but not limited to, an orientation of the head and/or eyes of the user 18. The method then proceeds to step 114.
At step 114, the captured data is transmitted to the processor 22 in the vehicle controller 20, in real-time. The method then proceeds to step 116.
At step 116, the processor 22 determines where the eyes of the user 18 are gazing. More specifically, the vehicle controller 20 receives the captured data and processes the captured data using the processor 22. If the processor 22 determines at step 116 that the user 18 is not gazing at the user device 42, the method proceeds to step 118.
At step 118, the vehicle controller 20 may transmit a signal to the user device 42 such that the user device 42 displays content on the display screen 46 not pertaining to the scene 32, i.e., the display screen 46 returns to a nominal condition or to an originally displayed condition. It should be appreciated that, in one embodiment, such a signal may be transmitted to the user device 42 only after certain criteria are satisfied, e.g., the passage of a pre-defined period of time during which the user 18 is not gazing at the user device 42. The method then returns to step 102.
However, if the processor 22 determines at step 116 that the user 18 is gazing at the user device 42, the method proceeds to step 120.
At step 120, the controller may, in turn, transmit a signal to the user device 42. Such a signal may instruct the device controller 44 to clear the display screen 46. Additionally, such a signal may instruct the device controller 44 to display imagery corresponding to the visually salient scene 32. By changing the display screen 46 of the user device 42, the user 18 is alerted to the scene 32, external to the vehicle 10, requiring the user's 18 attention.
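By way of a non-limiting illustration, the device controller 44 might act on such signals as sketched below, matching the illustrative action names used earlier; clear(), display_scene(), and show_nominal_content() are hypothetical stand-ins for the device's actual display API.

```python
# Hypothetical device-side handling of the controller's signal, matching
# the illustrative action names used above.

class DeviceController:
    def __init__(self, display_screen):
        self.display_screen = display_screen

    def handle_signal(self, action, scene_image=None):
        if action == "CLEAR_SCREEN":
            self.display_screen.clear()
        elif action == "SHOW_SCENE":
            # Replace the displayed image with the captured scene,
            # alerting the user to the characteristic requiring attention.
            self.display_screen.display_scene(scene_image)
        elif action == "RESTORE_NOMINAL":
            # Return the screen to its nominal or originally displayed
            # condition, as at step 118.
            self.display_screen.show_nominal_content()
```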
It should be appreciated that the signal transmitted by the vehicle controller 20 is not limited to clearing the display screen 46 and/or displaying the scene 32 external to the vehicle 10. By way of a non-limiting example, the signal may also be transmitted to the user device 42 to instruct the user device 42 to output an audible sound and/or to vibrate. In another non-limiting example, the signal may be transmitted to change the content of other driving displays, e.g., the instrument cluster, driver information center (DIC), navigation screen, heads-up display, inside rearview mirror, outside rearview mirror, and the like, when a determination is made that the user 18 may be gazing at such driving displays instead of at the scene 32 external to the vehicle 10.
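As a further non-limiting illustration, a single dispatch routine could select among these alert modalities; the modality names and the device methods play_alert_tone() and vibrate() are hypothetical.

```python
# Illustrative dispatch across the alternative alert modalities named
# above; all device methods are hypothetical stand-ins.

def alert_user(device, modality="display", scene_image=None):
    if modality == "display":
        device.handle_signal("SHOW_SCENE", scene_image)
    elif modality == "audible":
        device.play_alert_tone()  # hypothetical audio API
    elif modality == "haptic":
        device.vibrate()  # hypothetical vibration API
```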
While the best modes for carrying out the many aspects of the present teachings have been described in detail, those familiar with the art to which these teachings relate will recognize various alternative aspects for practicing the present teachings that are within the scope of the appended claims.
This application is a continuation of International Patent Application No. PCT/US2014/035398, filed on Apr. 25, 2014, which claims the benefit of U.S. Provisional Application No. 61/816,089, filed Apr. 25, 2013, each of which is hereby incorporated by reference in its entirety.