The present invention relates generally to an interface and, more particularly, to a vehicle user interface operable in multiple ways.
A conventional vehicle includes various systems that provide the user, i.e., the driver or a passenger, with a means of interfacing with the vehicle, specifically a means of monitoring vehicle conditions and controlling various vehicle functions. Depending upon the complexity of the systems to be monitored and/or controlled, such a user interface may utilize visual, tactile and/or audible feedback. In a typical vehicle, the systems and conditions that may be monitored and/or controlled by such an interface include climate control (e.g., temperature settings, fan settings, defroster operation, etc.); entertainment system control (e.g., audio source, radio station, audio track, tonal balance, volume, etc.); and navigation system control (e.g., map, destination and route, estimated time of arrival (ETA), miles to destination, etc.).
While the cost of the various controls and subsystems that comprise a vehicle's user interface may account for only a small fraction of the total vehicle cost, the user interface, as the primary source of interaction between the user and the vehicle, is critical to the driver's operation and enjoyment of the vehicle. For instance, a poorly designed or poorly positioned headlight or windshield wiper switch may require the driver to divert attention from the road for an unsafe period of time simply to turn on the headlights or wipers. In other instances, an overly complex or poorly designed interface, for example an audio or navigation interface, may quickly lead to user frustration and dissatisfaction, and potentially to lost car sales.
In a conventional vehicle, the user interface actually comprises multiple interfaces, each interface grouping together those controls necessary to monitor and/or operate a specific vehicle subsystem or function. For example, the controls and display for the audio system are typically co-located, as are the controls for the heating, ventilation and air conditioning (HVAC) system. In addition to simplifying subsystem control, co-location of controls allows the manufacturer to utilize a modular approach in which several options for a particular system, e.g., the audio system, may be provided. Not only does this approach simplify upgrades, it also allows the manufacturer to design and build a single subsystem that can be integrated into several different vehicle models.
In the past decade, the advent of dash-mounted monitors has caused a major change in the design of vehicle interfaces. In addition to being used in navigation systems, such monitors allow various information to be communicated to the user and provide a novel technique for controlling system functionality. For example, in addition to its use in the navigation system, the monitor in some vehicles presents a multi-page menu that provides the driver and/or passenger with control over the audio system, the HVAC system, on-board or Bluetooth® enabled/coupled communication devices, etc. In such an application, either a touch-sensitive display may be used, or a non-touch-sensitive monitor may be used with corresponding hard buttons (e.g., mounted around the periphery of the display) or with a mouse-like pointer that allows selection of designated functions.
While conventional vehicles provide a variety of devices and techniques for the driver and/or passenger to control and monitor the vehicle's various subsystems and functions, typically a specific vehicle or specific vehicle model will utilize a single type of interface, thus forcing the user to adapt to that particular interface's mode of interaction. One approach to providing the end user with a more user-friendly interface is to allow the end user to modify or customize the interface to meet their particular needs and usage patterns. Such an approach is provided in co-pending U.S. patent application Ser. No. 12/708,547. While this approach does provide a customizable, and thus improved, user interface, it does require that the user or a representative of the user (e.g., a manufacturer's service center) reconfigure the interface to match that particular user's preferences. In some instances, however, it may be undesirable to implement such a user configurable vehicle interface. Accordingly, what is needed is a single vehicle user interface that automatically matches the usage patterns and preferences of a variety of users. The present invention provides such a user interface.
The present invention provides a method of operating a vehicle user interface. The method includes the steps of providing a touch-screen display in the vehicle; displaying a general vehicle subsystem interface corresponding to a single vehicle subsystem (e.g., climate control or audio subsystem); displaying a plurality of touch-screen function controllers all corresponding to the single vehicle subsystem; accepting a first user touch via a first of the plurality of touch-screen function controllers, where the first touch-screen function controller corresponds to a single vehicle function of the single vehicle subsystem; transforming the general vehicle subsystem interface to a specific vehicle subsystem interface in response to the first user touch, where the specific vehicle subsystem interface corresponds to the single vehicle function; accepting additional user touch input via the specific vehicle subsystem interface, where the additional user touch input only affects the single vehicle function; and transforming the specific vehicle subsystem interface to the general vehicle subsystem interface after a preset period of time (e.g., less than 10 seconds) since acceptance of the last user touch input via the specific vehicle subsystem interface. The method may further comprise the step of enlarging the first touch-screen function controller when the general vehicle subsystem interface is transformed to the specific vehicle subsystem interface. The method may further comprise the step of eliminating a portion of the plurality of touch-screen function controllers when the general vehicle subsystem interface is transformed to the specific vehicle subsystem interface, where the eliminated touch-screen function controllers do not control the single vehicle function. The method may further comprise the steps of determining the control interaction technique utilized in the first user touch and selecting a feature set for the specific vehicle subsystem interface based on the outcome of the control interaction technique determining step. The method may further comprise the steps of determining the control interaction technique utilized in the first user touch and adding a second plurality of touch-screen function controllers to the specific vehicle subsystem interface based on the outcome of the control interaction technique determining step, wherein if the control interaction technique is a tapping input technique, some or all of the second plurality of touch-screen function controllers will utilize a tapping input technique; wherein if the control interaction technique is a touch-and-slide input technique, some or all of the second plurality of touch-screen function controllers will utilize a touch-and-slide input technique; and wherein if the control interaction technique is a touch-and-hold input technique, some or all of the second plurality of touch-screen function controllers will utilize a touch-and-hold input technique.
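By way of illustration and not limitation, the general-to-specific transformation and the timeout-driven return described above can be sketched as a simple state machine. The following TypeScript fragment is one possible implementation under stated assumptions; the class and member names (VehicleSubsystemUI, inactivityTimeoutMs, etc.) are introduced here for the example only and are not part of the disclosure.

```typescript
// Minimal sketch of the general-to-specific interface transition with an
// inactivity timeout. All names here are illustrative assumptions.

type InterfaceMode = "general" | "specific";

class VehicleSubsystemUI {
  private mode: InterfaceMode = "general";
  private activeFunction: string | null = null;
  private timeoutHandle: ReturnType<typeof setTimeout> | null = null;

  // Default of 10_000 ms matches the "less than 10 seconds" example in the text.
  constructor(private readonly inactivityTimeoutMs: number = 10_000) {}

  // Called when the user first touches one of the function controllers
  // (e.g. "driverTemperature", "fanSpeed") on the general subsystem interface.
  onFunctionControllerTouched(functionId: string): void {
    this.mode = "specific";
    this.activeFunction = functionId;
    this.resetInactivityTimer();
  }

  // Called for each additional touch while the specific interface is shown;
  // the input only affects the single selected function.
  onSpecificInterfaceTouch(adjust: (functionId: string) => void): void {
    if (this.mode !== "specific" || this.activeFunction === null) return;
    adjust(this.activeFunction);
    this.resetInactivityTimer();
  }

  private resetInactivityTimer(): void {
    if (this.timeoutHandle !== null) clearTimeout(this.timeoutHandle);
    this.timeoutHandle = setTimeout(() => this.returnToGeneral(), this.inactivityTimeoutMs);
  }

  private returnToGeneral(): void {
    this.mode = "general";
    this.activeFunction = null;
    this.timeoutHandle = null;
  }
}
```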
A method of operating a vehicle user interface is provided that includes the steps of providing a touch-screen display in the vehicle; displaying a vehicle subsystem interface; displaying a plurality of touch-screen function controllers; accepting user input via either a tapping technique or a touch-and-slide technique without altering the interface configuration and where either form of input yields the same response. In at least one embodiment, all of the touch-screen function controllers correspond to a single vehicle subsystem, for example a climate control or audio subsystem. In at least one embodiment, the interface also accepts user input via a touch-and-hold technique without altering the interface configuration, and where the same response is achieved as that utilizing either the tapping or touch-and-slide techniques.
A further understanding of the nature and advantages of the present invention may be realized by reference to the remaining portions of the specification and the drawings.
The present invention utilizes a touch-screen for the user interface. A large format touch-screen as illustrated in
In the exemplary touch-screen system 100 shown in
Preferably zone 101 comprises persistent soft buttons, i.e., soft buttons that persist regardless of how the remaining portion of display 100 is used. Persistent soft buttons 105 may be used, for example, to provide general display control settings or to provide the user with direct access to specific subsystems (e.g., climate control subsystem, audio subsystem, mobile/cell phone interface, navigation subsystem, etc.).
In the exemplary touch-screen shown in
In a typical interface in accordance with the invention, one or more of the interface's functions operate in the same manner as in a conventional touch-screen utilizing a ‘touch-on’ and ‘touch-off’ feature. For example, in exemplary interface 300 seat heater soft buttons 301, defroster soft button 303 and air flow soft buttons 305 all operate conventionally such that each touch of the selected button cycles the corresponding subsystem function between an on state and off state.
Preferably those features of a particular subsystem that offer the user a range of possible selections, rather than a simple two-position switch (e.g., on/off), utilize one or more features of the invention as described in further detail below. In exemplary interface 300, driver temperature selector 307, passenger temperature selector 309, and fan speed selector 311 all fit within this category of control. In an audio subsystem interface, at a minimum the function controllers that typically operate in this manner include the volume control, channel selector, audio input selector, and tone controls. It will be understood that the invention is not limited to climate control and audio subsystem interfaces, nor is it limited to the specific function controllers noted above.
In at least one embodiment of the invention, one or more of the function controllers within a particular subsystem interface allow the user to utilize any of a variety of different control interaction techniques to perform the same function. Thus in contrast to a conventional function controller in which the user is required to conform to the control interaction technique that is pre-configured by the manufacturer or by another user, this aspect of the invention allows the user to interact with the controller in any of a variety of ways, thus making the interaction more natural and intuitive. Furthermore, as different users may prefer different ways of interacting with the interface, the invention allows each user, e.g., each driver or passenger, to utilize their preferred technique without requiring any reconfiguration of the interface. Thus, for example, one driver may increase the driver temperature from 70° to 72° by touching the number “72” and holding their finger on the desired temperature until controller 307 indicates that the new temperature has been input into the system (i.e., using a touch-and-hold control interaction technique). Indication of the selected temperature may be through the use of a different font size as illustrated, location within the controller's display region (e.g., centering the selected temperature as illustrated), or through other means (e.g., different font colors, different background colors, etc.). A different driver/user may prefer a tapping motion, e.g., tapping on the upper portion of the display region of function controller 307 to increase the temperature, or tapping on the lower portion of the display region of function controller 307 to decrease the temperature (i.e., using a tapping control interaction technique). Preferably in this approach the temperature changes by 1 degree per tap. Also preferably in this approach the location of the numbers within region 307 would change with each tap, thus indicating the presently selected temperature. Yet another driver/user may prefer a touch-and-slide control interaction technique, also referred to herein as a sliding or scrolling motion, thus mimicking the interaction with a rotating switch. In this approach the user increases the temperature to the desired temperature by touching interface 300 somewhere within the display region of function controller 307, and sliding their finger downwards on the interface until the controller indicates that the user has selected 72°, for example by highlighting the number “72” and/or centering the number “72” in region 307.
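By way of a non-limiting example, the following TypeScript sketch shows one way a single temperature controller might accept all three control interaction techniques described above. The gesture thresholds, the TouchSample type, and the sign conventions are assumptions made for the example only.

```typescript
// Minimal sketch of a temperature controller that accepts tap, touch-and-hold,
// and touch-and-slide input interchangeably. Thresholds and names are assumptions.

interface TouchSample {
  y: number;           // vertical offset from the controller's midline; positive = upper half
  timestampMs: number;
}

type Gesture = "tap" | "hold" | "slide";

const HOLD_THRESHOLD_MS = 500;  // assumed: a press longer than this is a touch-and-hold
const SLIDE_THRESHOLD_PX = 10;  // assumed: movement beyond this is a touch-and-slide

function classifyGesture(start: TouchSample, end: TouchSample): Gesture {
  if (Math.abs(end.y - start.y) > SLIDE_THRESHOLD_PX) return "slide";
  if (end.timestampMs - start.timestampMs > HOLD_THRESHOLD_MS) return "hold";
  return "tap";
}

class TemperatureController {
  constructor(public setpoint: number, private readonly degreesPerPixel = 0.1) {}

  // touchedValue is the temperature value displayed under the finger, if any.
  handleTouch(start: TouchSample, end: TouchSample, touchedValue?: number): void {
    switch (classifyGesture(start, end)) {
      case "hold":
        // Touch-and-hold: the value under the finger becomes the new setpoint.
        if (touchedValue !== undefined) this.setpoint = touchedValue;
        break;
      case "tap":
        // Tap: upper half increases by one degree, lower half decreases by one degree.
        this.setpoint += end.y > 0 ? 1 : -1;
        break;
      case "slide":
        // Touch-and-slide: the setpoint follows the finger; with this sign convention a
        // downward slide increases the setpoint, as in the 70° to 72° example.
        this.setpoint += (start.y - end.y) * this.degreesPerPixel;
        break;
    }
  }
}
```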
In at least one embodiment of the invention, when the user employs a touch-and-slide movement during interaction with a particular multi-position function controller (e.g., audio volume, fan speed, channel selector, temperature controller, etc.), the faster the sliding motion, the faster the speed with which the controlled function varies. In other words, a rapid sliding motion causes acceleration of the changing function, making the controller more responsive to finger movement during the user's touch-and-slide motion. For example, if a passenger touches the display region of function controller 309 and slowly moves their finger in an upwards motion, the temperature changes slowly. In contrast, if the passenger touches region 309 and then flicks or otherwise moves their finger in a rapid upwards motion, then the temperature varies rapidly, potentially moving to the lowest possible setting. Therefore this aspect of the invention allows the controller, and thus the user, to differentiate between fine function control and coarse function control while using a single function controller. This type of function acceleration is especially useful for interface applications such as the audio subsystem channel selector (e.g., AM, FM, or satellite channel selectors) where there is a wide selection range, but where fine control is required once the user nears the desired frequency or channel.
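By way of illustration, the velocity-dependent behavior described above can be sketched as a gain that grows with the speed of the finger. The constants and exponent in the following TypeScript fragment are assumptions chosen only to show the fine-versus-coarse effect.

```typescript
// Minimal sketch of velocity-dependent ("accelerated") touch-and-slide control.
// The scaling constants and exponent are illustrative assumptions.

function slideDelta(
  distancePx: number,          // finger displacement since the last sample
  elapsedMs: number,           // time since the last sample
  unitsPerPixel = 0.05,        // assumed fine-control gain
  accelerationExponent = 1.5   // assumed: >1 makes fast flicks move disproportionately far
): number {
  const speed = Math.abs(distancePx) / Math.max(elapsedMs, 1); // px per ms
  const gain = unitsPerPixel * Math.pow(1 + speed, accelerationExponent);
  return Math.sign(distancePx) * Math.abs(distancePx) * gain;
}

// Example: a slow 40 px drag over 400 ms changes the value only slightly, while a
// 40 px flick over 40 ms produces a much larger change, letting a single controller
// serve both fine and coarse adjustment.
const fine = slideDelta(40, 400);
const coarse = slideDelta(40, 40);
console.log(fine.toFixed(2), coarse.toFixed(2));
```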
In at least one embodiment of the invention, when a particular interface function controller is engaged, the visual properties of the interface temporarily change. Changing the visual properties of the interface may be done in order to emphasize the function currently under control, for example by highlighting a particular function controller (e.g., temperature controllers 307/309, fan controller 311, etc.) once the user touches that control region of the display. Alternately, both the visual properties and the manner in which the user interacts with the function controller may change upon the user's initial touch/interaction with a particular function controller, thereby simplifying further user interaction with the selected function controller. The length of time that the visual or visual/interactive properties of the interface change may be configured by the manufacturer, a service representative, or by the end user. Preferably once the user interacts with the affected function controller, thereby causing the interface properties to change, they remain changed throughout the time in which the user is interacting with that particular interface function controller. In order to ensure that the altered interface properties persist during this time, the interface is configured to remain in its altered state for a preset period of time after the user's last screen touch (e.g., 5 seconds, 10 seconds, etc.). Typically this period of time is selected to ensure that the interface remains altered while the user's attention is diverted, for example due to the demands of driving, or while the user is still deciding on the final function setting.
While simply highlighting a selected function controller aids user-interface navigation, the inventors have found that simplifying the overall interface and simultaneously expanding the selected controller provides additional benefits. For example, the interface may be configured so that when the user first touches a function controller such as temperature controller 307, the selected controller expands to cover a much larger region of the display screen as illustrated in the exemplary embodiment shown in
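By way of illustration, the following TypeScript sketch shows one way the interface might be simplified when a controller is selected: the chosen controller is enlarged and controllers unrelated to the selected function are hidden. The FunctionController shape and the enlargement factor are assumptions made for the example.

```typescript
// Minimal sketch of simplifying the interface around one selected controller.
// Field names and the enlargement factor are illustrative assumptions.

interface FunctionController {
  id: string;
  controlsFunction: string;   // e.g. "driverTemperature", "fanSpeed"
  scale: number;              // 1 = normal size
  visible: boolean;
}

function focusController(
  controllers: FunctionController[],
  selectedFunction: string,
  enlargedScale = 3           // assumed enlargement factor
): FunctionController[] {
  return controllers.map((c) =>
    c.controlsFunction === selectedFunction
      ? { ...c, scale: enlargedScale, visible: true }   // expand the selected controller
      : { ...c, visible: false }                        // eliminate unrelated controllers
  );
}
```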
In addition to enlarging the selected function controller and eliminating some or all of the other, non-selected, functions from the interface, the present invention may be used to further aid user navigation for the selected controller, for example by adding additional control features to the interface when a particular controller is activated. For example,
As previously disclosed, once a particular function controller is selected by a user, that controller's functionality may be expanded by adding additional function controls that are not included in the initial interface (e.g., additional controls 603-606). In a preferred embodiment, when the user interacts with a particular function controller, a new interface screen with additional controller functionality is displayed that is based on the way in which the user initially interacted with the display. For instance, if the user's initial interaction with controller 307 is by a tapping motion, then interface 300 may morph into a screen that emphasizes control via tapping. For example, in the exemplary screen shown in
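By way of illustration, the morphing behavior described above can be sketched as a lookup from the detected control interaction technique to a set of added controls that emphasize that same technique. The feature-set names and contents in the following TypeScript fragment are assumptions, not taken from the disclosure.

```typescript
// Minimal sketch of choosing the expanded interface's added controls based on the
// technique the user first demonstrated. Names and contents are assumptions.

type InteractionTechnique = "tap" | "slide" | "hold";

interface ExpandedFeatureSet {
  description: string;
  addedControls: string[];
}

// Each added control set emphasizes the technique the user already used.
const FEATURE_SETS: Record<InteractionTechnique, ExpandedFeatureSet> = {
  tap:   { description: "tap-oriented: discrete up/down step buttons",   addedControls: ["stepUp", "stepDown"] },
  slide: { description: "slide-oriented: a long slider/scroll track",    addedControls: ["sliderTrack"] },
  hold:  { description: "hold-oriented: directly selectable value list", addedControls: ["valueList"] },
};

function featureSetFor(technique: InteractionTechnique): ExpandedFeatureSet {
  return FEATURE_SETS[technique];
}
```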
In at least one embodiment of the invention, the appearance of the display remains unchanged during interaction, but the zone of interaction expands. More specifically, after the user first touches a controller, the control region expands so that it is easier for the user to input his or her changes into the user interface. For example, if the user touches the fan controller 311, then for a preset period of time a much larger region of the display (e.g., a large region of the zone or a large region of the display) becomes the active input region even though there is no change in the overall appearance of the interface or the appearance of controller 311 in particular. In an exemplary configuration, after touching fan controller 311, the user may touch anyplace within the upper half of interface 300 to increase fan speed, or touch anyplace within the lower half of interface 300 to decrease fan speed. Alternately, after touching fan controller 311, the user may touch anywhere on interface 300 and slide their finger in a rightward motion to decrease fan speed, or touch anywhere on interface 300 and slide their finger in a leftward motion to increase fan speed. This aspect of the invention only requires that the user accurately position their finger in the desired control region once since after the initial touch the control region is greatly expanded. Typically the preset period of time in which the control region is expanded is relatively short (e.g., 1-10 seconds), thus allowing the user to interact with the desired function controller and then, within a relatively short period, interact with a different function controller.
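By way of illustration, the temporarily expanded input region can be sketched as an "armed" window that begins with the initial controller touch and is refreshed by each subsequent touch. The region names and the timeout value in the following TypeScript fragment are assumptions made for the example.

```typescript
// Minimal sketch of temporarily expanding the active input region after an initial
// touch, without changing the interface's appearance. Names and values are assumptions.

type Region = "upperHalf" | "lowerHalf";

class ExpandedHitRegion {
  private armedUntilMs = 0;

  constructor(
    private readonly onIncrease: () => void,
    private readonly onDecrease: () => void,
    private readonly windowMs = 5_000   // assumed value within the stated 1-10 second range
  ) {}

  // First, the user touches the (small) function controller itself, e.g. fan controller 311.
  armFromControllerTouch(nowMs: number): void {
    this.armedUntilMs = nowMs + this.windowMs;
  }

  // While armed, a touch anywhere in the upper or lower half of the display adjusts the
  // selected function; once the window lapses, touches are no longer routed here.
  handleDisplayTouch(region: Region, nowMs: number): void {
    if (nowMs > this.armedUntilMs) return;
    if (region === "upperHalf") this.onIncrease();
    else this.onDecrease();
    this.armedUntilMs = nowMs + this.windowMs; // keep the expanded region active
  }
}
```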
In a minor variation of the previous embodiment, once the user touches a control region, their interaction with the selected region continues for as long as their finger remains in contact with the screen, regardless of whether or not their finger remains within the selected region. This aspect of the invention allows a user to briefly look at the interface, just long enough to correctly position their finger, and then look away while still controlling the selected function. For example, a user may touch the volume control of an audio interface and then continue to control the audio volume regardless of where their finger moves to on the display screen, as long as their finger remains in contact with the screen. Thus in this example the user may touch the volume control region, then increase the volume by moving their finger up or to the right, or decrease the volume by moving their finger down or to the left.
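By way of illustration, this continued-contact behavior resembles what user-interface toolkits commonly call pointer capture. The following TypeScript sketch shows one possible form; the event shape and gain value are assumptions made for the example.

```typescript
// Minimal sketch of keeping control bound to the initially touched function for as long
// as the finger stays on the screen, regardless of where it moves. Names are assumptions.

interface DragEventLike {
  dx: number;   // horizontal movement since the previous event (positive = rightward)
  dy: number;   // vertical movement since the previous event (positive = downward)
}

class CapturedVolumeControl {
  private capturing = false;

  constructor(public volume = 10, private readonly unitsPerPixel = 0.05) {}

  // Called when the finger first lands inside the volume control's region.
  onTouchStartInsideControl(): void {
    this.capturing = true;
  }

  // Called for every subsequent move event, wherever the finger is on the screen.
  onTouchMove(e: DragEventLike): void {
    if (!this.capturing) return;
    // Up or right increases the volume; down or left decreases it, as in the example.
    this.volume += (e.dx - e.dy) * this.unitsPerPixel;
  }

  onTouchEnd(): void {
    this.capturing = false;   // capture ends only when the finger leaves the screen
  }
}
```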
As will be understood by those familiar with the art, the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosures, descriptions and figures provided herein are intended to be illustrative, but not limiting, of the scope of the invention which is set forth in the following claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 12/708,547, filed Feb. 19, 2010, and claims the benefit of the filing date of U.S. Provisional Patent Application Ser. No. 61/278,337, filed Oct. 5, 2009, the disclosures of which are incorporated herein by reference for any and all purposes.
Number | Date | Country
---|---|---
20110082627 A1 | Apr 2011 | US

Number | Date | Country
---|---|---
61278337 | Oct 2009 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 12708547 | Feb 2010 | US
Child | 12725391 | | US