The subject disclosure relates to methods, systems, and computer program products for using driver gaze to control vehicle systems.
The main controls for vehicle entertainment systems are typically located in the center of the vehicle dashboard. In order to manipulate the entertainment system to perform some function, a driver temporarily changes their focus of attention from the road to the entertainment system.
Accordingly, it is desirable to provide systems and methods for manipulating vehicle functions without changing the focus of the driver.
In one exemplary embodiment, a method of controlling a vehicle system of a vehicle is provided. The method includes: receiving image data that is captured from an occupant of the vehicle; determining a focus of the occupant of the vehicle based on the image data; and generating a user command to control the vehicle system based on the focus of the occupant.
In another exemplary embodiment, a control system for a vehicle is provided. The control system includes a first module that determines a focus of an occupant of the vehicle based on image data, where the image data is captured from the occupant using an image processor. A second module generates a user command to control a vehicle system based on the focus of the occupant.
In yet another exemplary embodiment, a vehicle is provided. The vehicle includes an image processor that captures image data of an occupant of the vehicle. A human machine interface module processes the image data and generates a user command based on the image data. A vehicle system receives the user command and controls a function of the vehicle system based on the user command.
The above features and advantages and other features and advantages of the invention are readily apparent from the following detailed description of the invention when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description of embodiments, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In accordance with exemplary embodiments, a gaze based control system, shown generally at 10, is included within a vehicle 12. The gaze based control system 10 processes retinal information of a driver 13 or other occupant of the vehicle 12 and generates control signals for controlling functions of vehicle systems 16a-16n based thereon. Such vehicle systems 16a-16n may include, for example, but are not limited to, an entertainment system, a navigation system, a telematics system, and/or any other system within the vehicle. As can be appreciated, the control signals can be transmitted to control modules 18a-18n associated with the vehicle systems 16a-16n through a vehicle network 20 (as shown generally at 22) and/or can be transmitted directly to the vehicle system 16n′ (as shown generally at 24).
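The two delivery paths above (through the vehicle network 20 to a control module, or directly to a vehicle system) can be sketched as a simple router. This is a minimal illustration, not the patent's implementation; the target names and send callbacks are hypothetical:

```python
def route_signal(signal, target, network_targets, send_on_network, send_direct):
    """Route a control signal to its target vehicle system.

    Targets listed in `network_targets` are reached through the vehicle
    network (path 22 to a control module); all others are sent directly
    to the system (path 24). `send_on_network` and `send_direct` are
    placeholder transport callbacks.
    """
    if target in network_targets:
        send_on_network(target, signal)  # via vehicle network 20
    else:
        send_direct(target, signal)      # direct connection to system
```

In practice the network path would be a bus such as CAN, but the routing decision itself is just this membership test.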
In various embodiments, the gaze based control system 10 includes an image processor 26 that captures image data of the driver 13 and a human machine interface (HMI) module 28 that processes the captured image data. Based on the processed image data, the HMI module 28 generates the control signals to control one or more of the vehicle systems 16a-16n.
In various embodiments, the HMI module 28 communicates with a display system 30. The display system 30 displays a user interface that corresponds to the vehicle system 16a-16n to be controlled. In various embodiments, the user interface is displayed in or near the line of vision of the driver 13. For example, the user interface can be displayed on a windshield (not shown) of the vehicle 12 by a heads up display (HUD) system. The HMI module 28 processes the image data relative to the displayed user interface.
Referring now to the drawings, the HMI module 28 is illustrated in more detail in accordance with exemplary embodiments.
The image processing module 32 receives as input image data 40. The image processing module 32 processes the image data 40 to distinguish a face and eyes of the driver 13, and generates eye data 42 and face data 44 based thereon.
The gaze analysis module 34 receives as input the eye data 42 and the face data 44. The gaze analysis module 34 determines a focus of the driver 13 based on the eye data 42 and the face data 44, and generates focus data 46 based thereon.
In various embodiments, the gaze analysis module 34 can analyze the face data 44 using an analysis method to determine a position of the face. The gaze analysis module 34 can then analyze the eye data 42 using an analysis method and based on the face position to determine a focus of the eye(s). In various embodiments, the focus of the eye(s) can be represented by, for example, a general location, or an x, y coordinate within a two-dimensional plane.
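One simple geometric way to obtain such an x, y coordinate is to project the combined head-pose and eye angles onto the plane of the user interface. The sketch below assumes the gaze direction has already been reduced to yaw and pitch angles relative to a line perpendicular to the display plane; this is an illustrative calculation, not the analysis method the disclosure itself uses:

```python
import math

def gaze_to_plane(yaw_deg, pitch_deg, plane_distance_cm):
    """Project a gaze direction onto a two-dimensional display plane.

    yaw_deg / pitch_deg: combined head-pose + eye angles, in degrees,
    measured from a line perpendicular to the plane.
    plane_distance_cm: distance from the eye to the plane.
    Returns an (x, y) point on the plane, in centimeters, with (0, 0)
    at the perpendicular intersection point.
    """
    x = plane_distance_cm * math.tan(math.radians(yaw_deg))
    y = plane_distance_cm * math.tan(math.radians(pitch_deg))
    return (x, y)
```

For example, a gaze straight ahead maps to the origin, and a 45-degree yaw at a 70 cm plane distance maps roughly 70 cm to the side.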
The display module 36 receives as input system data 48. The system data 48 can be generated by the vehicle systems 16a-16n and can indicate a screen or notification of the user interface to be displayed. Based on the system data 48, the display module 36 generates display data 50 for displaying the user interface.
The display module 36 then makes the display/function data 52 available to the command module 38. The display/function data 52 indicates any functions associated with the particular screen or notification being displayed, and a mapping of those functions to the two-dimensional plane on which the user interface is displayed. For example, suppose functions x, y, and z are provided by the screen being displayed. Function x is mapped to the coordinates ((a, b), (c, d), (e, f), etc.) of the user interface; function y is mapped to the coordinates ((g, h), (i, j), (k, l), etc.) of the user interface; and function z is mapped to the coordinates ((m, n), (o, p), (q, r), etc.) of the user interface.
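The function-to-coordinate mapping described above can be represented as a simple lookup structure. The function names and coordinates below are hypothetical placeholders for the x/y/z and (a, b)-style coordinates of the example:

```python
# Hypothetical display/function data: each selectable function maps to
# the set of user-interface coordinates at which it can be selected.
display_function_data = {
    "volume_up":   {(10, 5), (11, 5), (12, 5)},
    "volume_down": {(10, 9), (11, 9), (12, 9)},
    "next_track":  {(20, 7), (21, 7)},
}

def function_at(point, mapping):
    """Return the function mapped to the given (x, y) point, or None
    if the point does not fall on any selectable function."""
    for name, coords in mapping.items():
        if point in coords:
            return name
    return None
```

A real interface would more likely map each function to a rectangular region than to discrete points, but the lookup is the same idea.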
In various embodiments, the display module 36 further receives as input a selection 54. As will be discussed in more detail with regard to the command module 38, the selection 54 can be generated when the function selected is associated with a new screen or notification of the user interface. Based on the selection 54, the display module 36 can modify the display data 50 (e.g., to display a new screen or notification) and the display/function data 52.
The command module 38 receives as input the display/function data 52 and the focus data 46. Based on the inputs 46, 52, the command module 38 determines the selection 54 of the driver and, in some cases, generates a user command 56 based thereon. For example, the command module 38 maps the coordinates of the focus data 46 to coordinates of display/function data 52 and determines the function associated with the coordinates. The command module 38 then determines the appropriate command for that function and generates a user command 56 based thereon.
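The two steps performed by the command module 38 (map the focus coordinate to a function, then map the function to a command) can be sketched as follows. The region layout, command table, and command fields are all illustrative assumptions, not the disclosure's own data:

```python
# Hypothetical command table: each selectable function maps to the
# user command sent to the target vehicle system.
COMMANDS = {
    "volume_up":   {"system": "entertainment", "action": "volume", "delta": +1},
    "volume_down": {"system": "entertainment", "action": "volume", "delta": -1},
    "next_track":  {"system": "entertainment", "action": "track",  "delta": +1},
}

def command_for_focus(focus_xy, function_regions):
    """Map a focus coordinate to a function region, then to a command.

    function_regions: {name: (x_min, y_min, x_max, y_max)} bounding
    boxes on the user-interface plane (an assumed layout). Returns the
    command for the first region containing the point, or None.
    """
    x, y = focus_xy
    for name, (x0, y0, x1, y1) in function_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return COMMANDS.get(name)
    return None
```

Returning None for a miss lets the caller simply take no action when the driver's focus is not on any function, which is the common case while driving.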
The user command 56 can be received by the vehicle system 16a-16n′ to control a system feature. The user command 56 can also be provided back to the display module 36 to initiate a change in the user interface.
Referring now to the drawings, a method of controlling a vehicle system based on driver gaze, as can be performed by the gaze based control system 10, is illustrated by a flowchart.
In various embodiments, the method can be scheduled to run based on predetermined events, and/or may run continually during operation of the vehicle 12.
In one example, the method may begin at 100. It is determined whether gaze based control is active at 110 (e.g., a switch is selected by the driver to activate the gaze based control system). If the gaze based control is active at 110, the user interface is determined at 130 based on control system data and any selection. The display data is generated based on the system data and any command data to display the user interface, for example, via the HUD at 140. The image data from the image processor is analyzed at 150 to determine the eye data and/or face data. The eye data and/or the face data is analyzed at 160 to determine the focus. The selection is determined based on the focus and the functions associated with the current user interface being displayed (as indicated by the display/function data) at 170. Optionally, the selection can be confirmed via a prompt from the user interface at 180. If the selection is confirmed at 180, the user command is generated based on the selection and the function associated with the selection at 190. Thereafter, the method continues with monitoring whether the gaze based control system is active at 110. If, however, the gaze based control system is not active, or the gaze based control system becomes not active (e.g., is then switched off), the method may end at 120.
While the invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments falling within the scope of the present application.
Number | Date | Country | |
---|---|---|---|
20130024047 A1 | Jan 2013 | US |