This disclosure relates to presenting information about various vehicle subsystems, such as a navigational system and audio system, on a display screen.
Vehicles, such as automobiles, typically include an array of on-board subsystems, such as a navigation system, audio system, video system, heating and air conditioning system, rear-view camera system, fuel system, and others. One or more consoles, such as a radio console or a navigation console in a dashboard or other readily-accessible location, are typically included in the vehicle to provide a user with information about and/or control of various subsystems.
In one aspect, the invention features an information display unit for a vehicle (e.g., automobile, aircraft, watercraft, etc.) that has a user interface which presents on a display screen two display elements relating to one or more subsystems of the vehicle (e.g., navigation system, audio system, video system, fuel system, instrumentation system, etc.). The user interface is configured to visually emphasize one display element relative to another display element in response to a predetermined internal stimulus (e.g., an action or event triggered by one of the subsystems) or a predetermined external stimulus (e.g., an action taken by a user).
In another aspect, the invention features an information display unit for an automobile that includes a display screen and a user interface presented on the display screen that simultaneously presents at least two display elements each depicting information relating to a different subsystem of the automobile. The user interface is further configured to visually emphasize at least one display element relative to at least one other display element in response to a predetermined stimulus.
Implementations may include one or more of the following features. The display elements may depict information relating to a navigational system, audio system, heating and air conditioning system, instrumentation system, rear-view camera system, on-board telephone system, or other subsystems of the automobile. The display elements may be an output of a subsystem, such as a navigational map produced by a navigation system, or may summarize operation of a subsystem, such as a window summarizing operation of the audio or video system.
The user interface may visually emphasize a display element in response to an internal stimulus, such as a stimulus created by one of the subsystems. For example, a user interface may emphasize a display element depicting information relating to the navigational subsystem of the automobile in response to the automobile approaching a turning maneuver determined by the navigational subsystem. The user interface may also visually emphasize a display element in response to an external stimulus, such as a user's hand or other object touching or proximate to a control element associated with a display element and the user interface. The information display unit may include one or more proximity sensors for detecting the presence of an object near a control element. A display element may be touch-sensitive and function as a control element.
The user interface of the information display unit may simultaneously present two display elements by overlaying one over another. For example, a first display element depicting a navigational map or an image produced by a rear-view camera may be shown on substantially the entire display screen while a second display element, such as a window summarizing operation of an audio or video system, is overlaid over the first display element.
The user interface may visually emphasize a display element by changing visual characteristics of the emphasized display element, changing visual characteristics of other display elements, or both. The user interface may change any number of visual characteristics of one or more display elements to emphasize a particular display element, such as changing the size, position, color, transparency, or brightness of one or more display elements presented on the display.
In another aspect, the invention features an automobile that includes a plurality of subsystems (e.g., audio system, video system, HVAC system, instrumentation system, fuel system, navigation system, rear-view camera system, etc.) and an information display unit (e.g., mounted in a dashboard). The information display unit includes a display screen and a user interface presented on the display screen that simultaneously presents at least two display elements each depicting information relating to a different subsystem of the automobile. The user interface is configured to visually emphasize at least one display element relative to at least one other display element in response to a predetermined stimulus.
In another aspect, the invention features a method for displaying information about multiple subsystems of a vehicle that includes simultaneously presenting on an electronic display at least two display elements each depicting information relating to one or more subsystems of the vehicle and visually emphasizing at least one display element relative to at least one other display element in response to a predetermined stimulus.
Implementations may include one or more of the following features. The method may also include associating a display element with a control element, and the predetermined stimulus may be a user touching a control element associated with the display element. The method may also include detecting presence of an object near the control element associated with the display element, and the predetermined stimulus may be detection of an object near the control element associated with the display element.
The predetermined stimulus may be an external stimulus, such as a stimulus caused by a user's action, and/or an internal stimulus, such as an action or event in one of the subsystems.
A display element may be emphasized by changing visual characteristics of the emphasized display element and/or by changing visual characteristics of other display elements.
In another aspect, the invention features a software product residing on a medium bearing instructions to cause an instruction processor to simultaneously present on an electronic display at least two display elements each depicting information relating to a different subsystem of an automobile and visually emphasize at least one display element relative to at least one other display element in response to a predetermined stimulus.
Implementations may include one or more of the following features. The software product may have further instructions to cause the instruction processor to associate a display element with a control element, and a predetermined stimulus may be detection of an object touching the control element associated with the display element or detection of an object near the control element associated with the display element. The predetermined stimulus may be caused by a user action or action or event in one of the subsystems.
Other features, as well as advantages and objects, are described in more detail below and in the accompanying figures and claims.
A vehicle, such as an automobile, may be provided with an information display unit that efficiently presents information about one or more of the vehicle's subsystems by visually emphasizing a display element that is likely to be of interest to a vehicle operator or other user. By visually emphasizing display elements that are likely to be of interest to an operator, the operator's attention is quickly directed toward the emphasized display element, potentially lessening the amount of time the operator is distracted from operating the vehicle.
For example, as shown in
The information display unit 12 includes a controller and storage unit 28 and a display console 26. The storage unit stores software that the controller executes to present a graphical user interface on the display console 26.
As shown in
The control knobs 32a-32b are configured to control operation of various vehicle subsystems (e.g., audio, video, rear-camera, on-board telephone, navigation, HVAC, etc.), and may be in an active or inactive state. When a control element is in an active state, it is ready to accept an input from the user. In addition, a display element presented on the display screen may be associated with a control element. For example, a display element may be associated with an active control element such that when a user actuates the control element (e.g., rotates a control knob), the display element changes to inform the user of the corresponding action (e.g., volume of the audio system is increasing).
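The active/inactive control state and the association between a control element and a display element described above can be sketched as follows. This is a minimal illustration only; the class names, the `actuate` method, and the volume example are hypothetical and not taken from the disclosure:

```python
class DisplayElement:
    """A region of the screen depicting one subsystem's information."""

    def __init__(self, name):
        self.name = name
        self.text = ""

    def show(self, message):
        # Update what the element currently depicts (e.g., "volume: 12").
        self.text = message


class ControlElement:
    """A physical control (e.g., a knob) that may be active or inactive."""

    def __init__(self, display_element):
        self.display_element = display_element
        self.active = False  # inactive controls ignore user input
        self.value = 0

    def actuate(self, delta):
        # Only an active control accepts input; the associated display
        # element changes to inform the user of the corresponding action.
        if not self.active:
            return
        self.value += delta
        self.display_element.show(f"volume: {self.value}")
```

In this sketch, actuating an inactive knob has no effect; once the knob is placed in the active state, turning it updates the associated display element.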
As shown in
Control knob 46b is ready to accept user input and is thus in an active state. Control knob 46b includes two concentric dials, an inner dial 47a and an outer dial 47b, that are each associated with the second display element 44. The inner dial 47a is associated with the second display element 44 such that if a user were to turn the inner dial in one direction (e.g., clockwise), the second display element 44 would show the radio seeking to the next receivable AM frequency after AM 530, and if a user were to turn the inner dial 47a in the other direction, the second display element 44 would show the radio seeking to the receivable AM frequency just before AM 530. The second display element 44 is also associated with the outer dial 47b of the control knob 46b such that if a user were to turn the outer dial 47b in one direction (e.g., counter-clockwise), the second display element would show operation of the inner dial 47a switch from seek control (shown in
In contrast to control knob 46b, control knob 46a is in an inactive state in the example shown in
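The concentric-dial behavior described above, where the inner dial seeks among receivable frequencies and the outer dial switches the inner dial's function, can be sketched as follows. This is a hedged illustration under assumed behavior; the class, the 10 kHz tuning step, and the wrap-around seek are hypothetical details, not specified in the disclosure:

```python
def next_receivable(freq, receivable, direction):
    """Return the next receivable frequency after/before `freq`.

    `receivable` is a sorted list of station frequencies (kHz);
    `direction` is +1 (e.g., clockwise) or -1. Wraps around the band,
    mirroring typical radio seek behavior (an assumption).
    """
    if direction > 0:
        higher = [f for f in receivable if f > freq]
        return higher[0] if higher else receivable[0]
    lower = [f for f in receivable if f < freq]
    return lower[-1] if lower else receivable[-1]


class TunerKnob:
    """Concentric dials: the outer dial sets the inner dial's function."""

    def __init__(self, receivable, freq=530):
        self.receivable = sorted(receivable)
        self.freq = freq
        self.mode = "seek"  # outer dial toggles "seek" <-> "tune"

    def turn_outer(self):
        # Switch the inner dial between seek control and tune control.
        self.mode = "tune" if self.mode == "seek" else "seek"

    def turn_inner(self, direction):
        if self.mode == "seek":
            self.freq = next_receivable(self.freq, self.receivable, direction)
        else:
            self.freq += 10 * direction  # manual tune in 10 kHz steps
```

Starting at AM 530 with receivable stations at 530, 710, and 940 kHz, turning the inner dial clockwise seeks to 710; after turning the outer dial, the same inner dial tunes in fixed steps instead.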
The display console also includes a proximity sensor (not shown) that senses when an object such as a user's hand is in close proximity to one of the control knobs located adjacent to the display screen 30. Examples of proximity sensors that may be implemented in a display console are described in detail in U.S. patent application Ser. No. 10/956,836, titled “System and Method for Accepting A User Control Input” to Carl Price, Andrew Olcott, John Coffey, Neil Gelfond, Joe Killough, Peter Santoro, Lee Zamir, and James Hotary, filed Oct. 1, 2004, which is fully incorporated herein by reference. When the proximity sensor detects that an object is close to an active-state control element, the user interface visually emphasizes one display element presented on the display screen relative to other displayed elements. For example, referring to
When the user moves his or her hand away from the control knob, the user interface reverts to the configuration shown in
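The proximity-triggered emphasis and subsequent reversion described above can be sketched as event handlers. A minimal sketch, assuming a simple name-based association between controls and display elements; the class and method names are illustrative, not from the disclosure:

```python
class UserInterface:
    """Emphasizes one display element in response to a proximity stimulus."""

    def __init__(self, association):
        # association: control-element name -> associated display element name
        self.association = association
        self.emphasized = None  # no element emphasized initially

    def on_proximity(self, control_name, control_active):
        # Emphasize the associated display element only when an object
        # approaches a control element that is in an active state.
        if control_active and control_name in self.association:
            self.emphasized = self.association[control_name]

    def on_object_removed(self):
        # Revert to the unemphasized configuration when the hand moves away.
        self.emphasized = None
```

An approach of a hand to an inactive knob leaves the display unchanged; approaching an active knob emphasizes its associated element, and withdrawing the hand reverts the interface.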
As shown in
While the example illustrated in
Visual emphasis of a display element may be changed in response to other internal or external stimuli in addition to the proximity of a user's hand to a control element or actuation of a selection button. For example, a display element may be emphasized when a user touches or actuates a control knob. Moreover, visual emphasis of a display element may be triggered by internal stimuli. For example, a display element of a navigational map (e.g., display element 44 shown in
Visually emphasizing a display element involves changing visual characteristics of one or more display elements to draw a user's attention to the emphasized element. Thus, a display element may be visually emphasized by changing visual characteristics of that display element (e.g., increasing its brightness or size, changing its position on the display screen, causing it to blink, changing color to a bright color, changing the amount of information presented in the display element, making the display element opaque, etc.) and/or by changing visual characteristics of other display elements (e.g., decreasing brightness or size, changing position, changing color to a subdued color or to black and white, changing the amount of information in the display element, making the element transparent, etc.). Various combinations of these techniques can also be used to emphasize one display element relative to another. For example, referring again to
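The combination of techniques described above, changing the visual characteristics of the emphasized element while simultaneously subduing the others, can be sketched as a single function. The specific scale, brightness, and opacity values below are illustrative assumptions, not values given in the disclosure:

```python
def apply_emphasis(element_names, emphasized_name):
    """Return per-element visual characteristics emphasizing one element.

    The emphasized element is enlarged, fully bright, and opaque; all
    other elements are dimmed and made partly transparent, drawing the
    user's attention to the emphasized element.
    """
    styles = {}
    for name in element_names:
        if name == emphasized_name:
            styles[name] = {"scale": 1.25, "brightness": 1.0, "opacity": 1.0}
        else:
            styles[name] = {"scale": 1.0, "brightness": 0.4, "opacity": 0.5}
    return styles
```

Applied to a navigational map and an audio window, emphasizing the audio window both highlights it and de-emphasizes the map, using several of the characteristics (size, brightness, transparency) listed above at once.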
A number of embodiments of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention. For example, the display elements shown in
Number | Name | Date | Kind |
---|---|---|---|
5450075 | Woddington | Sep 1995 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5774828 | Brunts et al. | Jun 1998 | A |
5777603 | Jaeger et al. | Jul 1998 | A |
5844500 | Beuk et al. | Dec 1998 | A |
5847704 | Hartman | Dec 1998 | A |
5923267 | Beuk et al. | Jul 1999 | A |
5941930 | Morimoto et al. | Aug 1999 | A |
5982355 | Jaeger et al. | Nov 1999 | A |
6009355 | Obradovich et al. | Dec 1999 | A |
6031519 | O'Brien | Feb 2000 | A |
6154201 | Levin et al. | Nov 2000 | A |
6182010 | Berstis | Jan 2001 | B1 |
6208932 | Ohmura et al. | Mar 2001 | B1 |
6222525 | Armstrong | Apr 2001 | B1 |
6225980 | Weiss et al. | May 2001 | B1 |
6232961 | Kunimatsu et al. | May 2001 | B1 |
6256558 | Sugiura et al. | Jul 2001 | B1 |
6275231 | Obradovich | Aug 2001 | B1 |
6297810 | Anderson | Oct 2001 | B1 |
6498628 | Iwamura | Dec 2002 | B2 |
6529804 | Draggon et al. | Mar 2003 | B1 |
6539289 | Ogino et al. | Mar 2003 | B2 |
6583801 | Eastty et al. | Jun 2003 | B2 |
6650345 | Saito et al. | Nov 2003 | B1 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6768868 | Schnell | Jul 2004 | B1 |
6803905 | Capps et al. | Oct 2004 | B1 |
6842677 | Pathare | Jan 2005 | B2 |
6904570 | Foote et al. | Jun 2005 | B2 |
6970783 | Knockeart et al. | Nov 2005 | B2 |
6975932 | Obradovich | Dec 2005 | B2 |
6988246 | Kopitzke et al. | Jan 2006 | B2 |
7007417 | Segan et al. | Mar 2006 | B2 |
7043699 | Obradovich | May 2006 | B2 |
7126583 | Breed | Oct 2006 | B1 |
7187368 | Rekimoto | Mar 2007 | B2 |
7218312 | Takaku | May 2007 | B2 |
20020003206 | Culver | Jan 2002 | A1 |
20020054159 | Obradovich | May 2002 | A1 |
20020055811 | Obradovich | May 2002 | A1 |
20020101334 | Ueda | Aug 2002 | A1 |
20020154003 | Ueda | Oct 2002 | A1 |
20030023353 | Badarneh | Jan 2003 | A1 |
20030025676 | Cappendijk | Feb 2003 | A1 |
20030043206 | Duarte | Mar 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030210258 | Williams | Nov 2003 | A1 |
20040007450 | Kojima | Jan 2004 | A1 |
20040036769 | Sadahiro | Feb 2004 | A1 |
20040085447 | Katta et al. | May 2004 | A1 |
20040148093 | Tanaka et al. | Jul 2004 | A1 |
20040203411 | Holz auf der Heide et al. | Oct 2004 | A1 |
20050016824 | Olcott et al. | Jan 2005 | A1 |
20050018172 | Gelfond et al. | Jan 2005 | A1 |
20050030379 | Luskin et al. | Feb 2005 | A1 |
20050080528 | Obradovich | Apr 2005 | A1 |
20050115816 | Gelfond et al. | Jun 2005 | A1 |
20050171690 | Brass et al. | Aug 2005 | A1 |
20060004517 | Hasegawa et al. | Jan 2006 | A1 |
20060074553 | Foo et al. | Apr 2006 | A1 |
20060082545 | Choquet et al. | Apr 2006 | A1 |
20060122742 | Hasegawa et al. | Jun 2006 | A1 |
20060161871 | Hotelling et al. | Jul 2006 | A1 |
20060195232 | Obradovich | Aug 2006 | A1 |
20070016370 | Kuenzner | Jan 2007 | A1 |
Number | Date | Country |
---|---|---|
44 12 859 | Nov 1994 | DE |
19936257 | Feb 2001 | DE |
10107572 | Oct 2002 | DE |
10121685 | Nov 2002 | DE |
0 794 408 | Sep 1997 | EP |
0901229 | Mar 1999 | EP |
1 080 974 | Mar 2001 | EP |
1 080 976 | Mar 2001 | EP |
1168396 | Jan 2002 | EP |
1228917 | Aug 2002 | EP |
1241557 | Sep 2002 | EP |
1293882 | Mar 2003 | EP |
2126388 | Mar 1984 | GB |
2382292 | May 2003 | GB |
09147671 | Jun 1997 | JP |
2003043175 | Jun 2003 | JP |
2003151399 | Sep 2003 | JP |
WO 9743749 | Nov 1997 | WO |
WO 0171397 | Sep 2001 | WO |
WO 02063601 | Aug 2002 | WO |
WO 03023781 | Mar 2003 | WO |
WO 03044646 | May 2003 | WO |
WO 2004025834 | Mar 2004 | WO |
WO2005116801 | Dec 2005 | WO |
Number | Date | Country | |
---|---|---|---|
20060265126 A1 | Nov 2006 | US |