The present invention generally relates to user interfaces of aircraft systems, and more particularly relates to aircraft user interfaces with haptic feedback.
Flight displays of aircraft systems continue to advance in sophistication, achieving increasingly higher levels of information density and, consequently, presenting a greater amount of visual information to be perceived and understood by the operator. It is important that aircraft visual displays and the associated user interfaces provide a proper cognitive mapping between the task the operator intends to perform and the action the system carries out. As a result, designers of such systems continually attempt to improve the instrumentation and controls of the user interfaces that cooperate with the visual displays and the overall aircraft systems.
Touch screen user interfaces have been advantageously used to improve user interaction in many types of systems outside of avionics, including widespread use in cell phones. Some touch screen user interfaces generate a tactile or haptic feedback in response to user inputs. Haptic feedback may provide cues that enhance and simplify the user interaction. Specifically, vibration effects may be useful to alert the user to specific events or to provide realistic feedback in the subject system. However, in certain applications, such as avionics, user interfaces with conventional haptic feedback are unsuitable for a number of reasons, including the size of the displays and the physical environment of flight.
Accordingly, it is desirable to provide improved user interfaces, particularly in an aircraft environment. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description of the invention and the appended claims, taken in conjunction with the accompanying drawings and this background of the invention.
In accordance with an exemplary embodiment, a user interface includes a display element configured to display a visual image; a touch screen panel coupled to the display element and configured to receive a user input; a controller coupled to the touch screen panel and configured to generate a feedback signal based on the user input; a first actuator coupled to the controller and the touch screen panel and configured to operate in a first mode based on the feedback signal; and a second actuator coupled to the controller and the touch screen panel and configured to operate in a second mode based on the feedback signal.
In accordance with another exemplary embodiment, a method is provided for controlling haptic feedback in a user interface having a first actuator and a second actuator coupled to a touch screen display panel. The method includes operating the first actuator in a first mode to generate a first portion of the haptic feedback on the touch screen panel with first nodes; and operating the second actuator in a second mode to generate a second portion of the haptic feedback on the touch screen panel with second nodes, different from the first nodes.
The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements.
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Thus, any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. All of the embodiments described herein are exemplary embodiments provided to enable persons skilled in the art to make or use the invention and not to limit the scope of the invention which is defined by the claims. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary, or the following detailed description.
Broadly, exemplary embodiments discussed herein provide aircraft systems and methods with improved user interfaces. Particularly, the systems and methods include user interfaces with a touch screen panel having first and second actuators that are operated to generate a haptic response. The first and second actuators are driven to respectively generate first and second modes. The modes are selected to have non-overlapping nodes and to generate a desirable haptic response. The actuators may be positioned around the perimeter of the touch screen panel in an arrangement that more readily achieves the selected modes. For example, some arrangements may be asymmetrical. In particular, the first and second actuators operate in different modes to generate a consistent haptic response over the entire touch screen panel.
The processor 110 may be a computer processor such as, for example, a microprocessor, digital signal processor, or any suitable processor capable of at least receiving and/or retrieving aircraft status information, navigation and control information (e.g., from the flight management system (FMS) 120), and user inputs, and generating suitable control signals for the functions described below. The processor 110 may be a dedicated processor, for example, of the user interface 140 and/or the FMS 120.
In general, the FMS 120 is a specialized computer system that assists the pilot in performing a wide variety of in-flight tasks. As one example, the FMS 120 may include a navigation system that generates a flight plan and a guidance system that assists the pilot in flying the flight plan. The FMS 120 may use data from a number of sources, including various sensors and the database 130. The database 130 can be a memory device (e.g., non-volatile memory, disk, drive, tape, optical storage device, mass storage device, etc.) that stores aircraft information used by the processor 110 or FMS 120.
The user interface 140 may include any component that enables the user to communicate and otherwise interact with the system 100. As described in greater detail below, the user interface 140 may include a display element for displaying various types of computer-generated symbols and information representing, for example, avionics information in an integrated, multi-color or monochrome form. As such, many known display monitors are suitable for displaying this type of information, symbols, and data, such as, for example, various CRT and flat-panel display systems (e.g., CRT displays, LCDs, OLED displays, plasma displays, projection displays, head-down displays (HDDs), head-up displays (HUDs), etc.). Such displays may include various graphical elements associated with, for example, the position, flight plan, and/or other indicia of the aircraft's operational state. The user interface 140 may include input elements such as keyboards, pointer devices, microphones, switches, dials, joysticks, mice, trackballs, and the like. In one exemplary embodiment, the user interface 140 may be a primary flight display (PFD) or a multi-function control and display unit (MCDU).
As shown in FIG. 2, the user interface 140 of FIG. 1 may include a touch screen display system 142 having a controller 210, a display element 220, a touch screen panel 230, a first actuator 240, and a second actuator 250.
Generally, the controller 210 may include a processing element and memory for storing instructions that are executed by the processing element. Particularly, as discussed below, the controller 210 may generate signals for displaying an interactive visual display on the display element 220, interpreting a user response on the touch screen panel 230, and generating a haptic feedback for the user on the touch screen panel 230 via the actuators 240 and 250. The controller 210 may be a dedicated controller or integrated as part of another system, such as integrated with the processor 110 of FIG. 1.
In general, the display element 220 may be any type of display apparatus that provides a visual display to a user based on visual display commands from the controller 210. As described above, the display element 220 may provide aircraft and/or flight information. The display element 220 further displays graphical user input elements, such as graphically presented keyboards, buttons, menus, knobs, switches, graphics, sliders, arrows, pull-down menus, graphics with active elements, functional icons, and the like, that enable the user to interact with the system 100 (FIG. 1).
The display element 220 may be part of a number of different display devices that generate visual output, including a CRT display device, a flat panel display device, a plasma display device, an electro-luminescent display device, a Light Emitting Diode (LED) display device, a holographic display device such as a Head Up Display (HUD), a Micro Mirror Device (MMD) display device, or the like. In one embodiment, the display element 220 is a liquid crystal display (LCD) panel.
The touch screen panel 230 generally includes a plate 232 and a sensor array 234 arranged on the plate 232. In one exemplary embodiment, the plate 232 may be transparent, translucent, a color filter, or another light-permeable panel arranged over or in front of the display element 220. For example, the plate 232 may be formed of glass, polycarbonate, or another plastic material. In one exemplary embodiment, the plate 232 may be omitted or incorporated into the display element 220.
The sensor array 234 is coupled to or integrated with the plate 232 and includes a number of sensors operable to detect a physical manipulation of the display system 142. As shown in FIG. 2, the sensor array 234 may detect the nature and position of a user input in the xy-plane of the plate 232.
The touch screen display system 142 may further be responsive to inputs from control devices 260 other than the sensor array 234 of the touch screen panel 230. For example, such control devices 260 may include keyboards, buttons, menus, knobs, switches, and the like that enable the user to interact with the system 100 (FIG. 1).
The actuators 240 and 250 may each be any device that generates haptic effects in response to signals received from the controller 210. Although two actuators 240 and 250 are depicted, other embodiments may include additional actuators.
The first and second actuators 240 and 250 may be, for instance, linear actuators that are arranged to apply a force to the touch screen panel 230 in the z-direction. Although not shown, springs or compliant elements, such as helical springs, leaf springs, flexures, foam, rubber, or the like, may be provided to enable movement of the touch screen panel 230 in the z-direction. The first and second actuators 240 and 250 may be, for example, electromagnetic actuators, Eccentric Rotating Mass ("ERM") actuators in which an eccentric mass is moved by a motor, Linear Resonant Actuators ("LRA") in which a mass attached to a spring is driven back and forth, or actuators based on "smart materials" such as piezoelectric materials, electro-active polymers, or shape memory alloys that move in response to electric signals.
In the depicted embodiment, the actuators 240 and 250 are coupled to the underside of the touch screen panel 230, although in other embodiments, the actuators 240 and 250 may be coupled to the other side of the touch screen panel 230 or along an edge. Other attributes of the actuators 240 and 250, including additional details about the position and arrangement, are discussed below.
The touch screen display system 142 may be operated as described below. The display element 220 generates a visual display for the user, which may include interactive display components, such as menus or keyboards. The touch screen panel 230 overlays the display element 220 to receive the user input, and the sensor array 234 detects the nature and location of the user input on the plate 232 and provides user input signals to the touch screen controller 212. The touch screen controller 212 interprets the user input signals to determine the appropriate action, e.g., adjusting the visual display on the display element 220 and/or performing an action related to the operation of the aircraft system 100 (FIG. 1). Based on the user input, the controller 210 also generates feedback signals for the first and second actuators 240 and 250.
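The flow just described lends itself to a simple illustration. The following Python sketch is purely illustrative: the class and method names (TouchScreenController, Actuator, drive, and so on), the drive frequencies, and the pulse duration are all assumptions, since the description defines no software interface; the sketch merely traces the path from a detected touch to the feedback signals for the two actuators.

    from dataclasses import dataclass

    @dataclass
    class FeedbackSignal:
        # One feedback signal drives both actuators, each at the frequency
        # of its own selected mode (assumed values for illustration).
        first_mode_hz: float
        second_mode_hz: float
        duration_s: float

    class Actuator:
        def drive(self, freq_hz: float, duration_s: float) -> None:
            # Stand-in for actuator hardware; a real driver would output a
            # waveform at freq_hz for duration_s seconds.
            print(f"driving at {freq_hz:.0f} Hz for {duration_s * 1000:.0f} ms")

    class TouchScreenController:
        def __init__(self, first_actuator: Actuator, second_actuator: Actuator) -> None:
            self.first_actuator = first_actuator
            self.second_actuator = second_actuator

        def on_touch(self, x: float, y: float) -> None:
            # Interpreting the input and updating the display are omitted;
            # this sketch only traces the haptic-feedback path.
            signal = FeedbackSignal(first_mode_hz=200.0, second_mode_hz=270.0,
                                    duration_s=0.05)
            self.first_actuator.drive(signal.first_mode_hz, signal.duration_s)
            self.second_actuator.drive(signal.second_mode_hz, signal.duration_s)

    TouchScreenController(Actuator(), Actuator()).on_touch(0.10, 0.05)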
The first and second actuators 240 and 250 receive the feedback signals and generate the haptic response on the touch screen panel 230 to provide a tactile sensation for the user in response to the user input. As described in greater detail below, the first and second actuators 240 and 250 are operated such that the combined portions of the haptic response are generally consistent over the entire touch screen panel 230. Particularly, the first and second actuators 240 and 250 are positioned relative to one another and to the touch screen panel 230 to produce different modes with non-overlapping nodes, which are discussed in greater detail below.
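To make the consistency property concrete, the following sketch samples the combined response of two modes on a grid, assuming idealized sine-product mode shapes on a simply supported rectangular plate; the panel dimensions and the (2, 2)/(3, 3) mode pair are assumptions for illustration (the (2, 2) and (3, 3) labels follow the example given later in the description). Wherever the nodal lines of the two modes do not coincide, at least one mode contributes a nonzero response, although the combined response is weakest near points where nodal lines of the two modes cross.

    import math

    A, B = 0.30, 0.23  # assumed panel dimensions in meters

    def mode_amplitude(m: int, n: int, x: float, y: float) -> float:
        # Idealized (m, n) standing-wave shape; zero along its nodal lines.
        # The simply supported idealization pins the edges, unlike a real
        # panel on compliant mounts, so only interior points are sampled.
        return math.sin(m * math.pi * x / A) * math.sin(n * math.pi * y / B)

    # Weakest combined response over interior grid points when the first
    # actuator drives the (2, 2) mode and the second drives the (3, 3) mode.
    worst = min(
        max(abs(mode_amplitude(2, 2, i * A / 40, j * B / 40)),
            abs(mode_amplitude(3, 3, i * A / 40, j * B / 40)))
        for i in range(1, 40)
        for j in range(1, 40)
    )
    print(f"weakest combined response on the sampled grid: {worst:.3f}")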
In the depicted embodiment, the first actuator 240 and second actuator 250 are mechanically coupled to the touch screen panel 230 along the perimeter, generally on the underside of an edge, although other positions are possible. Typically, the first and second actuators 240 and 250 are positioned to provide the desired haptic feedback, as described below, without obscuring the view of the display element 220 (FIG. 2).
As particularly shown in FIG. 3, the first actuator 240 generates a first portion of the haptic response on the touch screen panel 230. Based on characteristics of the touch screen panel 230, the position of the first actuator 240, and the frequency of the oscillations of the first actuator 240, the first portion of the haptic response may be characterized as a first mode that develops standing waves in the xy-plane with a number of nodes 340.
Similarly, the second actuator 250 generates a second portion of the haptic response on the touch screen panel 230. Based on characteristics of the touch screen panel 230, the position of the second actuator 250, and the frequency of the oscillations of the second actuator 250, the second portion of the haptic response may be characterized as a second mode that develops standing waves in the xy-plane with a number of nodes 350. The patterns of nodes 340 and 350 schematically shown in FIG. 3 are merely exemplary.
However, the first and second actuators 240 and 250 are respectively positioned and operated such that the nodes 340 and 350 do not overlap. Since the nodes 340 and 350 do not overlap, the haptic response is generated over the entire touch screen panel 230, e.g., there are no "dead spots," because the first actuator 240 generates a response at the nodes 350 associated with the second actuator 250, and vice versa. In one exemplary embodiment, the non-overlapping positions of the nodes 340 and 350 are a result of the selection of modes, which in this embodiment also results in an asymmetrical arrangement of the first and second actuators 240 and 250. In this embodiment, the asymmetrical arrangement includes placing the first actuator 240 on the first side 301 and the second actuator 250 at the second corner 312.
The actuators 240 and 250 provide a relatively large vibration sensation as a haptic response, which is especially important in the aircraft system 100 (FIG. 1), in which the vibrations and other distractions of the flight environment may otherwise mask a weaker response.
Additional details about the selection of the modes and the positioning of the actuators 240 and 250 are discussed below with reference to the method 400 of FIG. 4 and the exemplary modes of FIG. 5.
In a first step 410, a first mode associated with the first actuator 240 is selected or identified, and in a second step 420, a second mode associated with the second actuator 250 is selected or identified. As noted above, the term "mode" refers to a standing wave generated by the actuator (e.g., actuator 240 or 250), characterized in the xy-plane as a pattern of peaks corresponding to the maximum vibrations and nodes with minimal vibrations. With respect to the haptic response resulting from a mode, the peaks of the mode correspond to the maximum amount of haptic response, and the nodes correspond to an absence of haptic response. As described below, the nature of the mode is based on characteristics of the touch screen panel 230 and the position and frequency of the actuator (e.g., actuator 240 or 250).
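Although the description does not tie the modes to a particular plate model, a standard idealization may help fix the terms "peak" and "node." For a rectangular panel of dimensions a by b modeled as a simply supported Kirchhoff plate (an assumption for illustration only; a real panel on compliant mounts behaves differently near its edges), the (m, n) mode shape may be written as:

    % Idealized (m, n) mode shape of an a-by-b rectangular panel,
    % assuming simply supported edges (illustrative model only):
    W_{mn}(x, y) = \sin\!\left(\frac{m\pi x}{a}\right)
                   \sin\!\left(\frac{n\pi y}{b}\right),
    \qquad 0 \le x \le a, \quad 0 \le y \le b

Under this idealization, the interior nodes of the (m, n) mode fall on the straight lines x = ka/m (k = 1, ..., m - 1) and y = lb/n (l = 1, ..., n - 1), and the peaks lie midway between adjacent nodal lines, where both sine factors have magnitude one.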
Reference is briefly made to FIG. 5, which schematically depicts examples of modes 501-516 that may be generated on the touch screen panel 230. In FIG. 5, each of the modes 501-516 is represented as a pattern of peaks and nodes in the xy-plane of the touch screen panel 230, with the modes generally arranged in order of increasing frequency from mode 501 to mode 516. With continuing reference to FIG. 5, each of the modes 501-516 is associated with a particular frequency and pattern of nodes that depend on the characteristics of the touch screen panel 230 and the position of the respective actuator.
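Under the same simply supported plate idealization (again an illustrative assumption, with E the Young's modulus, nu the Poisson's ratio, rho the density, h the thickness, and a, b the panel dimensions; none of these values are given in the description), each (m, n) mode has the natural frequency:

    % Natural frequency of the (m, n) mode for the same idealized plate,
    % where D is the flexural rigidity:
    f_{mn} = \frac{\pi}{2}\sqrt{\frac{D}{\rho h}}
             \left[\left(\frac{m}{a}\right)^{2} + \left(\frac{n}{b}\right)^{2}\right],
    \qquad D = \frac{E h^{3}}{12\,(1 - \nu^{2})}

Consistent with the ordering described above, frequency increases with the mode numbers, so mode 501 would have the lowest frequency of the depicted modes and mode 516 the highest.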
Accordingly, in steps 410 and 420, first and second modes are selected. For example, mode 506 (e.g., the (2, 2) mode) may be selected as the first mode, and mode 511 (e.g., the (3, 3) mode) may be selected as the second mode. In one exemplary embodiment, the first mode in step 410 is generally selected as a mode with known acceptable characteristics, and the second mode in step 420 is selected as a different mode that is further evaluated in the subsequent steps discussed below.
In step 430, the modes are evaluated to determine if the nodes overlap. From a visual inspection of FIG. 5, for example, the nodes of mode 506 and mode 511 do not overlap. If the nodes of the first and second modes do overlap, the method 400 proceeds to step 450, in which a new second mode is selected, and the evaluation of step 430 is repeated. If the nodes do not overlap, the method 400 proceeds to step 440.
In step 440, the associated frequency of the second mode is evaluated based on a number of possible factors. For example, one consideration is the ease or difficulty with which the frequency may be implemented in the touch screen panel 230 to produce the haptic response. Relatively low frequencies and relatively high frequencies may be difficult to achieve because of the size or inherent properties of the touch screen panel 230 and the type of actuator 250. As examples, mode 501 may have a frequency too low to readily implement in the touch screen panel 230, and mode 516 may have a frequency too high to readily implement. Additionally, some frequencies may be too high to produce a haptic feedback that may be readily sensed by a user. For example, the nerves of a user's finger may not be sensitive enough to feel the relatively high frequencies of mode 516. If the second mode is determined to be unsuitable for the touch screen panel 230, the method 400 proceeds to step 450, and a new second mode is selected and subsequently evaluated in steps 430 and 440.
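Steps 410 through 450 can be summarized as a small search over candidate modes. The sketch below is a hypothetical rendering of that loop under the idealized sine-product plate model introduced above; the panel dimensions, the frequency band, and the lumped frequency constant are invented for illustration, and the node test checks whether whole nodal lines of the two modes coincide, which is the overlap criterion of step 430.

    import itertools

    A, B = 0.30, 0.23            # assumed panel dimensions (m)
    F_MIN, F_MAX = 150.0, 300.0  # assumed usable haptic frequency band (Hz)
    K = 1.0                      # assumed lumped plate constant (idealized)

    def frequency(m: int, n: int) -> float:
        # Idealized modal frequency; a stand-in for measured panel data.
        return K * ((m / A) ** 2 + (n / B) ** 2)

    def nodal_lines(m: int, n: int):
        # Interior nodal lines of the idealized (m, n) mode.
        xs = {round(k * A / m, 6) for k in range(1, m)}
        ys = {round(l * B / n, 6) for l in range(1, n)}
        return xs, ys

    def nodes_overlap(mode1, mode2) -> bool:
        # Step 430: reject the pair if any nodal lines coincide.
        x1, y1 = nodal_lines(*mode1)
        x2, y2 = nodal_lines(*mode2)
        return bool(x1 & x2) or bool(y1 & y2)

    first_mode = (2, 2)  # step 410: a mode with known acceptable characteristics
    for candidate in itertools.product(range(1, 5), repeat=2):  # steps 420/450
        if candidate == first_mode or nodes_overlap(first_mode, candidate):
            continue  # overlapping nodes: pick a new second mode (step 450)
        if not F_MIN <= frequency(*candidate) <= F_MAX:
            continue  # frequency unsuitable (step 440): pick a new second mode
        print("acceptable second mode:", candidate,
              f"~{frequency(*candidate):.0f} Hz")

Under these assumed numbers, the loop accepts the (3, 3) mode, consistent with the pairing of modes 506 and 511 described above, and also surfaces the (1, 3) mode as an alternative candidate.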
If the frequency of the second mode is acceptable, the method 400 proceeds to step 460. In step 460, the actuators 240 and 250 are mounted in positions on the touch screen panel 230 to generate the modes. In one exemplary embodiment, the actuators 240 and 250 may be mounted in positions that correspond to a peak on the edge of the touch screen panel 230. In general, the modes are more readily achieved with the actuators at the peaks. For example, as shown in FIG. 3, the first actuator 240 is mounted at a peak of the first mode on the first side 301, and the second actuator 250 is mounted at a peak of the second mode at the second corner 312.
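The peak positions along an edge depend on the true boundary conditions, which the description does not specify. As a rough sketch only: if the mode shape along an edge is approximated by a cosine profile with antinodes at the corners (a crude stand-in for a panel with free, compliantly mounted edges, not a validated model), candidate mounting points on an edge of length a for an m-half-wave mode fall at the points computed below.

    def edge_peak_positions(m: int, a: float) -> list[float]:
        # Crude free-edge approximation: |cos(m*pi*x/a)| peaks at x = k*a/m.
        return [k * a / m for k in range(m + 1)]

    # Example: with an assumed 0.30 m edge, an m = 2 profile peaks at the two
    # corners and the edge midpoint, consistent with placing one actuator
    # mid-edge and another at a corner, as in the arrangement described above.
    print(edge_peak_positions(2, 0.30))  # [0.0, 0.15, 0.3]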
One example may be described with the arrangement of FIG. 3, in which the placement of the first actuator 240 on the first side 301 and the second actuator 250 at the second corner 312 enables the first and second modes to be generated with the non-overlapping nodes 340 and 350. Referring briefly again to FIG. 2, in operation, the controller 210 drives the first actuator 240 at the frequency associated with the first mode and the second actuator 250 at the frequency associated with the second mode. In the embodiment of FIG. 3, the resulting combined haptic response is generally consistent over the entire touch screen panel 230 because the nodes 340 and 350 do not overlap.
Accordingly, systems and methods are provided with improved user interfaces for avionics applications. The feedback provided on the touch screen panel improves flight crew awareness by providing an improved "look and feel" for operation, while taking advantage of enhanced touch screen flexibility and capabilities. The user interfaces may include touch screen panels that reduce flight deck panel clutter and costs by replacing separate mechanical or electrical knobs, switches, and other user input devices. The feedback provided by the haptic responses results in operation that is more intuitive and easier to interpret and requires less heads-down time, particularly during flight operations that are already subject to vibrations caused by the aircraft and other distractions. The systems and methods described above thereby provide a haptic response that is generally consistent over the entire touch screen panel.
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.