The present invention generally relates to user interfaces, and more particularly relates to gesture recognition in touchscreen user interfaces of the type used in vehicles, aircraft, and the like.
It is desirable in a variety of contexts to replace traditional electro-mechanical controls such as knobs, switches, sliders, buttons, and the like with comparable control systems utilizing computer user interfaces. Touchscreen devices, for example, provide a convenient way to consolidate controls using user interface elements such as buttons, drop-down menus, radio buttons, and other such controls, thereby reducing the “real estate” needed for mechanical actuators and controls. This is particularly desirable in the context of aircraft cockpits and automobile cabins, where space is always at a premium.
The density of controls provided by touchscreen displays comes at a cost, however. Since the user interface elements are typically arranged in a hierarchical menu structure with only a subset of elements per page (to reduce the necessary screen size), a user must typically navigate through multiple menus or pages to reach the desired control function. For example, in the aircraft cockpit context, the pilot often desires to reduce or increase the volume of his or her headset. To perform this task from the primary or default display screen, it is often necessary to navigate through two or more menu screens. The same issues arise with respect to changing display screen contrast and other such “tuning functions.”
Accordingly, it is desirable to provide improved user interface methods that allow simplified access to certain functions. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description of the invention and the appended claims, taken in conjunction with the accompanying drawings and this background of the invention.
In accordance with one embodiment, a user interface method includes: displaying, on a touchscreen display, a plurality of user interface elements; entering a first mode, the first mode including providing a signal responsive to touch events associated with the user interface elements; determining whether a touch event corresponds to a predetermined touchscreen gesture; and switching from the first mode to a second mode when the touch event corresponds to the predetermined touchscreen gesture, the second mode including providing, for the duration of the touch event, a signal indicative of a value of a selected function that is not associated with the displayed plurality of user interface elements.
A touchscreen device in accordance with one embodiment includes: a touchscreen display configured to receive a touch event from a user, and a processor coupled to the touchscreen display. The processor is configured to instruct the touchscreen display to display a plurality of user interface elements; receive a signal associated with the touch event; enter a first mode, the first mode including providing a signal responsive to the touch event when the touch event is associated with one or more of the user interface elements; determine whether the touch event corresponds to a predetermined touchscreen gesture; and switch from the first mode to a second mode when the touch event corresponds to the predetermined touchscreen gesture, the second mode including providing, for the duration of the touch event, a signal indicative of a value of a tuning function.
A cockpit control device in accordance with one embodiment includes a touchscreen device having a plurality of user interface elements displayed thereon, including at least one user interface control configured to react to a touch event occurring within a region occupied by the at least one user interface control. The touchscreen device is configured, upon receipt of a predetermined touchscreen gesture, to temporarily ignore the at least one user interface control and provide immediate access to a tuning function such that a value of the tuning function is modified based on the predetermined touchscreen gesture.
The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background of the invention or the following detailed description of the invention.
In general, the present invention is directed to a touchscreen device configured, in “normal” mode, to display a number of user interface elements that are grouped together in pages in accordance with a conventional hierarchy. However, upon receipt of a predetermined touchscreen gesture (e.g., the circular motion of a finger), the menu hierarchy is bypassed and the user is given immediate control over a selected function, for example, a tuning function such as audio volume, screen contrast, or the like.
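The circular-motion gesture mentioned above can be recognized, in one minimal sketch, by accumulating the signed angle that a touch path sweeps about its own centroid. The function names and the one-revolution threshold below are illustrative assumptions, not details drawn from the disclosure:

```python
import math

def total_rotation(points):
    """Accumulated signed rotation (radians) of a touch path about its centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap jumps across the -pi/pi boundary so rotation accumulates smoothly.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total

def is_circular_gesture(points, revolutions=1.0):
    """Treat the path as the circular gesture once it sweeps a full revolution
    (the threshold is an assumption, not taken from the disclosure)."""
    return len(points) > 2 and abs(total_rotation(points)) >= revolutions * 2 * math.pi
```

A straight swipe sweeps well under a full revolution about its centroid, so it is not mistaken for the circular gesture, while a looping finger trace quickly crosses the threshold.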
As a preliminary matter, it will be appreciated that the user interface and gestural input methods described below may be implemented in a variety of devices, including, for example, cellular phones (or “smartphones”), personal data assistants (PDAs), global positioning system (GPS) devices, navigation systems and displays, e-book readers, tablet computers, netbook computers, point-of-sale devices, gaming devices, pen pads, and any other electronic apparatus that may include a touchscreen device used to traverse a multi-page hierarchy. Since the systems disclosed below are particularly useful in contexts where it is not desirable for the user to be distracted by the display for extended lengths of time—e.g., while driving a vehicle or piloting an aircraft—the illustrated examples may, without loss of generality, be described in the context of aircraft cockpit control systems. However, the invention is not so limited.
Referring now to
Touchscreen display 130 (in conjunction with processor 110) is configured to interact with one or more manipulators (not shown), such as a stylus, one or more user fingers, etc. A manipulator, when in contact with or in close proximity to touchscreen 130, produces a signal that is received and interpreted as a touch event by processor 110, which is configured (through any combination of hardware and software components) to determine the location and any other selected attributes of the touch event. The touch events may be stored within a memory, such as memory 120, and/or communicated to controller 150 for further control actions, as may be appropriate in the particular application.
Display 130 may include a thin, transparent touch sensor component superimposed upon a display (e.g., an LCD display or other type of display, not illustrated) that is viewable by a user. Examples of such displays include capacitive displays, resistive displays, surface acoustic wave (SAW) displays, optical imaging displays, and the like. Display 130 may also provide haptic feedback to the user—e.g., a clicking response or keypress feel in response to a touch event. The present embodiments contemplate any suitable touch-sensitive surface or sensor.
Touchscreen display 130 may have any desired 2D or 3D rectilinear and/or curvilinear shape. Touchscreen display 130 may also be rigid or flexible, as is the case with various organic LED (OLED) displays. The illustrated embodiments, without loss of generality, generally depict rectangular regions oriented in a portrait or landscape orientation (i.e., with respect to a user viewing the device); however, the present invention comprehends any range of shapes, sizes, and orientations.
It will be appreciated that the block diagram of
In general, the user interface elements will include “control” elements that receive input from manipulator 204 (i.e., via a “touch event”) and react accordingly (illustrated as elements 210 and 211 in
In general, touchscreen device 200 operates in at least two modes. The first mode (the “normal” or “default” mode) corresponds to a standard operational mode in which user interface elements 210 and 211 respond in accordance with touch events in the normal fashion, and touchscreen device 200 provides a signal (e.g., through data connection 151 in
In accordance with the present invention, touchscreen device 200 also operates in a second mode, which is entered when touchscreen device 200 determines that a touch event corresponds to a predetermined touchscreen gesture 208. In the second mode, touchscreen device 200 provides, for the duration of the touch event, a signal indicative of a value of a selected function that is not associated with the currently displayed user interface elements 220, 210, and 211. That is, by using a particular gesture or gestures, the user can quickly bypass the standard menu hierarchy and directly perform the selected function.
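The two modes described above amount to a small state machine: touch events are routed to the displayed user interface elements in the first mode, and to the selected function in the second, with the device reverting to the first mode when the touch event ends. The following is a minimal sketch under that reading; the class and callback names are hypothetical, not drawn from the disclosure:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()  # first mode: touch events routed to displayed UI elements
    DIRECT = auto()  # second mode: touch events adjust the selected function

class TouchscreenDevice:
    def __init__(self, gesture_recognizer, selected_function):
        self.mode = Mode.NORMAL
        self.recognizer = gesture_recognizer        # returns True once the predetermined gesture is seen
        self.selected_function = selected_function  # e.g., a headset-volume setter
        self.signals = []                           # signals sent over the data connection

    def on_touch_move(self, point):
        if self.mode is Mode.NORMAL and self.recognizer(point):
            self.mode = Mode.DIRECT                 # bypass the menu hierarchy
        if self.mode is Mode.DIRECT:
            # In the second mode, displayed elements under the touch are ignored
            # and the touch drives the selected function directly.
            self.signals.append(self.selected_function(point))
        else:
            self.signals.append(("ui_element", point))  # normal hit-testing path

    def on_touch_up(self):
        # The second mode lasts only for the duration of the touch event.
        self.mode = Mode.NORMAL
```

Note that the mode switch is keyed entirely to the recognizer and the touch lifetime, so no visible menu navigation occurs.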
As mentioned previously, the entire set of available user interface elements (elements 210, 211, and 220 being just a few) will typically be grouped into a number of pages according to a menu hierarchy. This is illustrated conceptually in
In accordance with one embodiment of the invention, the menu hierarchy (702-714) includes a user interface element corresponding to the selected function, but switching from the first mode to the second mode includes bypassing the menu hierarchy to modify the value of the selected function directly.
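The contrast between traversing the menu hierarchy and bypassing it can be illustrated as follows. The page names, hierarchy depth, and gesture mapping are hypothetical examples, not taken from the disclosure:

```python
# Hypothetical menu hierarchy; each page lists the pages or controls it contains.
MENU = {
    "main": ["radio", "audio", "display"],
    "audio": ["headset_volume", "speaker_volume"],
}

def path_to(target, page="main", trail=None):
    """Depth-first search for the sequence of pages a user must traverse
    to reach a control within the hierarchy."""
    trail = (trail or []) + [page]
    for entry in MENU.get(page, []):
        if entry == target:
            return trail + [target]
        found = path_to(target, entry, trail)
        if found:
            return found
    return None

# The predetermined gesture, by contrast, maps straight to the selected function,
# skipping every intermediate page.
DIRECT_ACCESS = {"circular_gesture": "headset_volume"}
```

Here `path_to("headset_volume")` requires stepping through intermediate pages, whereas the gesture lookup resolves to the function in one step.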
The predetermined touchscreen gesture may comprise any single or multi-touch event or combination of events. With brief reference to
Referring again to
While touchscreen gesture 208 is being performed and touchscreen device 200 is in the second mode, the sweeping or dragging of manipulator 204 over element 210 preferably does not trigger the function usually provided by element 210 during the first mode (e.g., the slider function described in the example above), even though element 210 is still being displayed. That is, the user preferably need not worry about inadvertently activating other user interface elements while performing touchscreen gesture 208; touchscreen device 200 temporarily ignores any contact with user interface element 210 and instead provides immediate access to the selected function.
Touchscreen device 100 may respond to multiple predetermined touchscreen gestures, each corresponding to a particular selected function. That is, one gesture may be used to control headset volume, while another is used to control the contrast and brightness of display 202. The gestures, and the way that they are mapped to selected functions, may be configurable by the user or pre-stored within touchscreen device 100 (e.g., memory 120 in
In accordance with another embodiment, touchscreen device 200 may temporarily display a graphical depiction of the value of the tuning function while the predetermined touchscreen gesture 208 is being performed. That is, referring to
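The temporary read-out of the tuning function's value can be sketched as an overlay whose lifetime is tied to the gesture: shown and updated while the gesture is in progress, removed when the manipulator is lifted. A text bar stands in here for the graphical depiction, and all names are illustrative assumptions:

```python
def volume_bar(value, width=10):
    """Text stand-in for the temporary graphical depiction of a tuning
    function's value (a real device would draw a bar or dial on the display)."""
    filled = round(max(0.0, min(1.0, value)) * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"

class ValueOverlay:
    """Visible only while the predetermined gesture is being performed."""
    def __init__(self):
        self.visible = False
        self.rendered = ""

    def on_gesture_update(self, value):
        self.visible = True
        self.rendered = volume_bar(value)

    def on_gesture_end(self):
        self.visible = False
        self.rendered = ""
```

Tying the overlay to the gesture lifecycle means the normal page contents reappear unobstructed the moment the touch event ends.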
The selected function that is controlled via the predetermined touchscreen gesture may be any function that the user would typically control via touchscreen device 100, or indeed any other function. In one embodiment, the selected function is a “tuning function”—i.e., a function that “tunes” some aspect of the user's interaction with touchscreen device 100 or some other mechanical or electro-mechanical control in the vicinity of the user. As mentioned above, typical tuning functions include, for example, the volume of an audio signal provided to the user (160 in
In general, a computer program product in accordance with one embodiment comprises a computer usable medium (e.g., standard RAM, an optical disc, a USB drive, or the like) having computer-readable program code embodied therein, wherein the computer-readable program code is adapted to be executed by processor 110 (working in connection with an operating system) to implement the methods and systems described above. The program code may be implemented in any desired language, and may be implemented as machine code, assembly code, byte code, interpretable source code or the like (e.g., C, C++, Java, or the like). The combination of code and/or processor hardware may be logically and functionally partitioned into various modules—for example, a touchscreen event interface module, a touchscreen event interpretation module, and a signal interface module.
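The three modules named above might be partitioned as in the following sketch; the module boundaries, signatures, and the placeholder classification rule are illustrative assumptions rather than a prescribed implementation:

```python
def touchscreen_event_interface(raw_sample):
    """Touchscreen event interface module: turn a raw sensor sample
    into a normalized touch event (location plus event type)."""
    return {"x": raw_sample[0], "y": raw_sample[1], "type": "move"}

def touchscreen_event_interpretation(events):
    """Touchscreen event interpretation module: classify a sequence of touch
    events as an ordinary UI-element touch or the predetermined gesture.
    The length-based rule below is a placeholder for a real recognizer."""
    return "gesture" if len(events) >= 8 else "ui_touch"

def signal_interface(classification, value):
    """Signal interface module: emit the signal provided to the controller."""
    if classification == "gesture":
        return ("tuning_function_value", value)
    return ("ui_event", value)
```

Each module maps onto one stage of the flow described above: sensing, interpretation, and the outbound signal to the controller.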
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.