This invention relates generally to display systems in aircraft, and more particularly, to compensating for aircraft motion, vibration, and turbulence when aircrew interact with such displays.
All aircraft, whether fixed-wing, helicopter, or tilt-rotor, have flight instruments that present information about the aircraft's current state or position to aircrew. The type, arrangement, and number of flight instruments vary depending upon the aircraft type and mission. The flight instruments generally include at least basic pitot-static instruments (e.g., airspeed indicator, altimeter, vertical speed indicator) and gyroscopic instruments (e.g., attitude indicator, heading indicator, turn coordinator). Originally, aircraft flight instruments were mechanical dials and gauges that provided an analog display to aircrew. The analog flight instrument configuration evolved into the modern “glass cockpit,” in which the mechanical flight instruments are replaced with digital flight instrument displays. Typically, liquid-crystal display (LCD) panels are used to show the information presented by traditional mechanical instruments. Some flight instrument configurations include LCD panels that are dedicated to displaying a digital version of a replaced mechanical instrument. In other configurations, the LCD panels may be multi-function displays (MFDs) that can display various flight instruments as selected by aircrew.
In one embodiment, a vehicle display system comprises a touch-sensitive display, a vehicle movement sensor, and a display driver configured to generate one or more objects displayed on the touch-sensitive display, wherein a position of at least one object on the touch-sensitive display is adjusted based upon motion detected by the vehicle movement sensor. The vehicle may be an aircraft, and the at least one object may be a virtual flight instrument. The vehicle movement sensor may be an accelerometer, for example. The vehicle display system may further comprise a display sensor configured to detect a target point for an imminent input on the touch-sensitive display. The display driver may be further configured to generate a zoomed display around the target point. The target point may overlap a targeted object on the touch-sensitive display, and at least a portion of the targeted object may be expanded on the display. The display driver may also be configured to display the target point on the touch-sensitive display.
In another embodiment, a method for interacting with a display comprises displaying one or more objects on a touch-sensitive display in a vehicle, detecting motion of the vehicle, and adjusting a position of at least one object on the touch-sensitive display based upon the vehicle motion. The method may further comprise displaying an object on the touch-sensitive display at an initial position, moving the object to a second position in response to the vehicle motion, and moving the object from the second position back to the initial position after the vehicle motion stops. The method may further comprise detecting when a user touches the touch-sensitive display, and providing tactile feedback to the user via a glove.
In a further embodiment, a method for controlling a display comprises displaying one or more objects on a touch-sensitive display in a vehicle, detecting an imminent input to the touch-sensitive display, determining a target point on the touch-sensitive display for the imminent input, and zooming the display around the target point. The target point may overlap a targeted object on the touch-sensitive display, and a zoomed portion of the display may comprise at least a portion of the targeted object. The vehicle may be an aircraft, and the targeted object may be a virtual flight instrument. The method may further comprise displaying the target point on the touch-sensitive display. The imminent input may correspond to an input device coming within a predefined proximity to the touch-sensitive display. The input device may be, for example, a user's finger, fingertip, or hand, or a stylus.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale.
While the system of the present application is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the system to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application as defined by the appended claims.
Illustrative embodiments of the system of the present application are described below. In the interest of clarity, not all features of an actual implementation are described in this specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developer's specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
In the specification, reference may be made to the spatial relationships between various components and to the spatial orientation of various aspects of components as the devices are depicted in the attached drawings. However, as will be recognized by those skilled in the art after a complete reading of the present application, the devices, members, apparatuses, etc. described herein may be positioned in any desired orientation. Thus, the use of terms such as “above,” “below,” “upper,” “lower,” or other like terms to describe a spatial relationship between various components or to describe the spatial orientation of aspects of such components should be understood to describe a relative relationship between the components or a spatial orientation of aspects of such components, respectively, as the device described herein may be oriented in any desired direction.
In future generation aircraft, the mechanical instruments and MFDs found in current aircraft may be replaced with a wide, single-screen instrument panel that provides a touchscreen interface for aircrew. The single-screen instrument panel will behave like a tablet computer accepting well-known user gestures, such as pinch-zoom and finger-swipe inputs. Aircrew may select what information should be displayed on the single-screen instrument panel, such as flight instruments, engine instruments, navigation displays, communication interfaces, etc. The aircrew may also select the layout of the display, such as by adjusting the size and position of each instrument and interface. This would allow aircrew to select different instrument panel displays for different missions, weather conditions, or flight stages.
Interaction with a touchscreen interface in an aviation environment may be difficult since aircrew are subject to vibration, turbulence, and g-forces during flight. Such effects cause aircrew to move unexpectedly, and these movements are difficult to anticipate. If unexpected environmental forces are imposed on aircrew while they are trying to use a touchscreen instrument panel, the aircrew may miss an intended input or may unintentionally select an undesired command. These problems may be accentuated by the lack of tactile feedback inherent in current touchscreens. For example, an aircrewman wearing gloves and pressing on a flat screen might not have a high confidence level as to what input was selected and/or whether an input was selected at all.
In one embodiment, information displayed on a single-screen instrument panel is coupled to the movement of a crew member's hand. The visual display may be digitally dampened so that it has a slight float on the screen relative to a background image. A glove or finger may trigger a target point on the screen to identify the area being selected. A target area may be enlarged when the crew member's finger is near the touchscreen. The target area may track the projected target point to dampen the effect of any inadvertent or significant hand movement. Tactile feedback may be incorporated within a crew member's glove to confirm a selection or finger press.
In FIG. 1, an instrument panel 101 displays a number of objects, such as virtual flight instruments and a keypad 102. When aircraft motion is detected, some or all of the displayed objects may be moved on instrument panel 101 to compensate for that motion so that the objects remain aligned with the user's hand.
In one embodiment, a background image, such as a picture, pattern, or color, may be displayed on instrument panel 101. The flight instruments and other information are displayed on instrument panel 101 so that they appear to be floating relative to the background and/or so that they appear to be closer to the user relative to the background. The background image does not move when aircraft motion is detected; however, the flight instruments and other displayed objects may move relative to the background image (as well as relative to instrument panel 101) when compensating for aircraft movement.
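By way of illustration only, the following simplified sketch shows one way this floating-layer behavior could be rendered; the canvas object and its draw_image and draw_object methods are assumed for illustration and do not correspond to any particular graphics library:

    # Minimal sketch of the floating-layer rendering described above.
    # The canvas API (draw_image, draw_object) is assumed, not a real library.

    def render_frame(canvas, background, objects, offset_xy):
        """Draw a fixed background, then draw each object displaced by the
        current motion-compensation offset so it appears to float."""
        canvas.draw_image(background, (0, 0))   # background never moves
        dx, dy = offset_xy                      # offset from the motion sensors
        for obj in objects:
            hx, hy = obj.home_pos               # object's nominal layout position
            canvas.draw_object(obj, (hx + dx, hy + dy))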
In one embodiment, the entire display on instrument panel 101 is moved to compensate for aircraft motion. In other embodiments, sensors or detectors (not shown) on or near instrument panel 101 may detect that a particular portion of instrument panel 101 is of interest to the user, such as keypad 102 itself or keypad 102 plus nearby displayed objects, and just that portion of the display may be moved to compensate for aircraft motion.
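A partial-compensation variant might be implemented along the lines of the following sketch, in which the home_pos and draw_pos attributes and the rectangle test are illustrative assumptions:

    # Sketch of region-limited compensation: only objects inside a detected
    # area of interest receive the motion offset (names are hypothetical).

    def apply_offset(objects, offset_xy, region=None):
        """region: optional (x0, y0, x1, y1) area of interest on the panel;
        when None, the entire display is compensated."""
        dx, dy = offset_xy
        for obj in objects:
            hx, hy = obj.home_pos
            if region is None or _contains(region, hx, hy):
                obj.draw_pos = (hx + dx, hy + dy)   # compensated position
            else:
                obj.draw_pos = (hx, hy)             # left where it was

    def _contains(rect, x, y):
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1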
The user may wear a glove as shown in FIG. 1. The glove may incorporate tactile feedback elements so that a selection or finger press on instrument panel 101 can be confirmed to the user through the glove.
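Such a tactile acknowledgement might be wired to touch events as in the fragment below; the glove interface shown (glove.pulse) is purely hypothetical:

    # Hypothetical sketch: pulse a haptic element in the glove fingertip
    # when the touchscreen confirms a press (the glove API is assumed).

    def on_touch_confirmed(glove, button_id):
        """Called by the display driver once a press on button_id registers."""
        glove.pulse(finger="index", duration_ms=40, strength=0.8)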
In FIG. 2, an instrument panel 201 displays a keypad 202 comprising a number of buttons 203. The buttons 203 may be relatively small, and aircraft motion, vibration, and turbulence may make it difficult for a user to accurately press an intended button 203.
These problems may be overcome in one embodiment if instrument panel 201 or a display driver can identify a focus area 205 that the user is beginning to touch. For example, as the user's finger 204 approaches keypad 202, instrument panel 201 may identify area 205 as the focus area of an imminent touch input. Optical sensors (not shown) may detect the approach of finger 204 toward instrument panel 201 before contact occurs. Alternatively, a sensor or detector 206 in the fingertip of the glove may be used to identify the focus area 205. Once focus area 205 has been identified, the display may be zoomed around that area so that at least a portion of keypad 202 is presented as an expanded area 207 having enlarged buttons 208.
A target point 209 may be displayed on instrument panel 201 to indicate the predicted touch point. Target point 209 may be the center of the focus area 205 and/or expanded area 207. If the user moves his finger 204 across instrument panel 201, with or without touching the surface, then target point 209 will move to reflect changes in the predicted touch point, and focus area 205 and/or expanded area 207 may move with the finger 204 as well. For example, if the user makes multiple inputs, such as pressing multiple buttons 203 or 208, target point 209 and expanded area 207 will move between button presses to follow the user's finger 204.
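One plausible way to derive and track target point 209 is sketched below; the proximity threshold, smoothing factor, and zoom radius are assumed values, and the fingertip position is taken to come from the sensors described above:

    # Sketch of target-point prediction and zooming (all values assumed).

    PROXIMITY_MM = 30.0    # distance at which an input is deemed "imminent"
    ZOOM_RADIUS_PX = 120   # half-size of the zoomed area around the target
    SMOOTHING = 0.25       # low-pass factor that dampens hand jitter

    class TargetTracker:
        def __init__(self):
            self.target = None   # smoothed on-screen target point (x, y)

        def update(self, finger_x, finger_y, finger_z_mm):
            """finger_x/y: projected screen position in pixels;
            finger_z_mm: fingertip distance from the screen surface."""
            if finger_z_mm > PROXIMITY_MM:
                self.target = None               # no imminent input
                return None
            if self.target is None:
                self.target = (finger_x, finger_y)
            else:                                # dampen sudden hand movement
                tx, ty = self.target
                self.target = (tx + SMOOTHING * (finger_x - tx),
                               ty + SMOOTHING * (finger_y - ty))
            tx, ty = self.target
            zoom_rect = (tx - ZOOM_RADIUS_PX, ty - ZOOM_RADIUS_PX,
                         tx + ZOOM_RADIUS_PX, ty + ZOOM_RADIUS_PX)
            return (tx, ty), zoom_rect   # marker position and area to enlarge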
In some embodiments, the instrument panel on an aircraft may use both the moving display object feature (FIG. 1) and the zoomed display feature (FIG. 2) at the same time.
Although a keypad 102 or 202 is used in the examples of FIGS. 1 and 2, it will be understood that the moving display and zoomed display features may be applied to any object shown on an instrument panel, such as a virtual flight instrument, a navigation display, or a communication interface.
In FIG. 3, a display driver 302 generates the objects displayed on an instrument panel 301 in an aircraft, such as virtual flight instruments 307 and other aircraft data 308. As discussed above, aircraft motion may make it difficult at times for a crew member 309 to accurately select a desired input on instrument panel 301. It may also be difficult to accurately select a desired input on instrument panel 301 if the crew member is wearing gloves or if the input buttons or objects are relatively small. These problems can be overcome by compensating for aircraft motion and other factors.
Display driver 302 includes or is coupled to a display compensation module 310. Sensors, such as accelerometers 311 and/or 312 or display sensors 314, provide inputs to compensation module 310. Accelerometer 311 is positioned at or near instrument panel 301, and accelerometer 312 is positioned at or near the seat 315 for crew member 309. Accelerometers 311 and 312 measure movement of instrument panel 301 and crew member 309 in one or more dimensions and provide movement data to display compensation module 310. In turn, display compensation module 310 provides inputs to display driver 302 that can be used to modify the virtual pitot-static and gyroscopic flight instruments 307 and other aircraft data 308 displayed on instrument panel 301. For example, if the aircraft suddenly descends due to turbulence, this might cause the crew member's hand 316 to rise relative to the objects shown on instrument panel 301. Accelerometers 311 and/or 312 detect this sudden movement and provide movement data (e.g., direction and magnitude) to display compensation module 310. Based upon the direction and magnitude of the aircraft motion, display compensation module 310 provides a compensation signal to display driver 302, which uses the compensation signal to modify the objects 307 and 308 shown on instrument panel 301. Display driver 302 moves the objects 307 and 308 upward on the display by an amount proportional to the aircraft motion, such as shown in FIG. 1.
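The computation inside display compensation module 310 might resemble the following sketch, which treats the compensation offset as a leaky double integration of the crew member's acceleration relative to the panel; the display scale and decay rate are assumed values:

    # Sketch of a motion-compensation computation (all constants assumed).

    class CompensationModule:
        def __init__(self, px_per_m=3000.0, leak_per_s=3.0):
            self.px_per_m = px_per_m      # display scale, pixels per meter
            self.leak_per_s = leak_per_s  # pulls the offset back toward zero
            self.vel = [0.0, 0.0]         # relative velocity estimate (m/s)
            self.offset = [0.0, 0.0]      # current pixel offset (x, y)

        def update(self, seat_accel, panel_accel, dt):
            """seat_accel, panel_accel: (x, y) accelerations in m/s^2 from
            accelerometers 312 and 311; dt: frame time in seconds."""
            decay = max(0.0, 1.0 - self.leak_per_s * dt)
            for i in (0, 1):
                # Crew member's acceleration relative to the panel:
                rel = seat_accel[i] - panel_accel[i]
                # Integrate to velocity, then to a pixel displacement,
                # leaking both so objects ease home after the motion stops.
                self.vel[i] = (self.vel[i] + rel * dt) * decay
                self.offset[i] = (self.offset[i]
                                  + self.vel[i] * dt * self.px_per_m) * decay
            return tuple(self.offset)

Under this sketch, if the aircraft drops and the crew member's hand rises relative to the panel, the relative acceleration is upward, so the computed offset moves the displayed objects upward in proportion to the motion and then eases them back to their original positions once the disturbance ends.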
Display driver 302 and display compensation module 310 may also receive inputs from display sensors 314, which indicate an area of interest on instrument panel 301, such as an area that the crew member's hand 316 or finger 317 is near. Display sensors 314 may be embedded in instrument panel 301 (not shown) and may interact with or detect the proximity of the crew member's finger 317. For example, a sensor component 318 may be incorporated into the fingertip of the crew member's glove. Alternatively, cameras 319 may be used to detect the crew member's finger 317 and to determine its position relative to objects shown on instrument panel 301 using stereo vision. Display driver 302 modifies the display object 308 that is closest to the crew member's finger 317, such as by enlarging that object or a portion of the object as shown in FIG. 2.
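Selecting and enlarging the object nearest the sensed fingertip could be done as in the following sketch; the center, size, and draw_bounds attributes are illustrative assumptions:

    # Sketch: enlarge the displayed object closest to the sensed fingertip.

    def nearest_object(objects, finger_x, finger_y):
        """objects: iterable of objects exposing a .center (x, y) in pixels."""
        return min(objects, key=lambda o: (o.center[0] - finger_x) ** 2
                                          + (o.center[1] - finger_y) ** 2)

    def enlarge(obj, scale=1.5):
        """Grow the object's drawn bounds about its center."""
        cx, cy = obj.center
        w, h = obj.size
        obj.draw_bounds = (cx - w * scale / 2, cy - h * scale / 2,
                           cx + w * scale / 2, cy + h * scale / 2)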
Some embodiments of systems and methods for modifying displays to compensate for aircraft motion, vibration, and turbulence, as described herein, may be implemented or executed, at least in part, by one or more computer systems. One such computer system is illustrated in FIG. 4.
As illustrated, computer system 400 includes one or more processors 410A-N coupled to a system memory 420 via bus 430. Computer system 400 further includes a network interface 440 coupled to bus 430, and one or more I/O controllers 450, which in turn are coupled to peripheral devices such as display sensors 460, accelerometers 470, instrument panel or display 480, etc. Each of I/O devices 460, 470, 480 may be capable of communicating with I/O controllers 450, for example, via a wired connection (e.g., serial port, Universal Serial Bus port) or wireless connection (e.g., Wi-Fi, Bluetooth, Near Field Communications, etc.). Other devices may include, for example, keyboards, keypads, attitude-heading sensors, air data computer, navigation systems, communication systems, etc.
In various embodiments, computer system 400 may be a single-processor system including one processor 410A, or a multi-processor system including two or more processors 410A-N (e.g., two, four, eight, or another suitable number). Each of processors 410 may be any processor capable of executing program instructions. For example, in various embodiments, processors 410 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC®, ARM®, SPARC®, or MIPS® ISAs, or any other suitable ISA. In multi-processor systems, each of processors 410 may commonly, but not necessarily, implement the same ISA. Also, in some embodiments, at least one processor 410 may be a graphics processing unit (GPU) or another dedicated graphics-rendering device.
System memory 420 may be configured to store program instructions and/or data accessible by processor 410. In various embodiments, system memory 420 may be implemented using any suitable memory technology, such as static random-access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. As illustrated, program instructions and data implementing certain operations and modules such as those described herein may be stored within system memory 420 as program instructions 425 and data storage 435, respectively. In other embodiments, program instructions and/or data may be received, sent, or stored upon different types of computer-accessible media or on similar media separate from system memory 420 or computer system 400.
A computer-accessible medium may include any tangible and/or non-transitory storage media or memory media such as electronic, magnetic, or optical media—e.g., disk or CD/DVD-ROM coupled to computer system 400 via bus 430. The terms “tangible” and “non-transitory,” as used herein, are intended to describe a computer-readable storage medium (or “memory”) excluding propagating electromagnetic signals, but are not intended to otherwise limit the type of physical computer-readable storage device that is encompassed by the phrase computer-readable medium or memory. For instance, the terms “non-transitory computer-readable medium” or “tangible memory” are intended to encompass types of storage devices that do not necessarily store information permanently, including for example, random access memory (RAM). Program instructions and data stored on a tangible computer-accessible storage medium in non-transitory form may further be transmitted by transmission media or signals such as electrical, electromagnetic, or digital signals, which may be conveyed via a communication medium such as a network and/or a wireless link.
In an embodiment, bus 430 may be configured to coordinate I/O traffic between processor 410, system memory 420, and any peripheral devices in the computer system, including network interface 440 or other peripheral interfaces, such as I/O devices 460, 470, 480. In some embodiments, bus 430 may perform any necessary protocol, timing, or other data transformations to convert data signals from one component (e.g., system memory 420) into a format suitable for use by another component (e.g., processor 410). In some embodiments, bus 430 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example. In some embodiments, the function of bus 430 may be split into two or more separate components, such as a northbridge chipset and a southbridge chipset, for example. In addition, in some embodiments some or all of the functionality of bus 430, such as an interface to system memory 420, may be incorporated directly into processor(s) 410A-N.
Network interface 440 may be configured to allow data to be exchanged between computer system 400 and other devices attached to a network, such as other computer systems, or between nodes of computer system 400. In various embodiments, network interface 440 may support communication via wired or wireless general data networks, or via any other suitable type of network and/or protocol.
I/O controllers 450 may, in some embodiments, enable communications with one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, mobile devices, or any other devices suitable for entering or retrieving data by one or more computer systems 400. Multiple I/O controllers 450 may be present in computer system 400 or may be distributed on various nodes of computer system 400. In some embodiments, I/O devices may be separate from computer system 400 and may interact with one or more nodes of computer system 400 through a wired or wireless connection, such as over network interface 440.
As shown in FIG. 4, system memory 420 may include program instructions 425 configured to implement certain embodiments described herein, and data storage 435 comprising various data accessible by program instructions 425. Program instructions 425 may include software elements of the embodiments illustrated in the figures above and may be implemented using any desired programming or scripting language. Data storage 435 may include data that may be used in these embodiments; in other embodiments, other or different software elements and data may be included.
A person of ordinary skill in the art will appreciate that computer system 400 is merely illustrative and is not intended to limit the scope of the disclosure described herein. The computer system and devices may include any combination of hardware or software that can perform the indicated operations. In addition, the operations performed by the illustrated components may, in some embodiments, be performed by fewer components or distributed across additional components. Similarly, in other embodiments, the operations of some of the illustrated components may not be provided and/or other additional operations may be available. Accordingly, systems and methods described herein may be implemented or executed with other computer system configurations including virtual configurations.
It should be understood that the various operations described herein, particularly in connection with the figures above, may be implemented in software executed by processing circuitry, hardware, or a combination thereof. The order in which each operation of a given method is performed may be changed, and various operations may be added, reordered, combined, omitted, or modified. It is intended that the invention(s) described herein embrace all such modifications and changes and, accordingly, the above description should be regarded in an illustrative rather than a restrictive sense.
The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized that such equivalent constructions do not depart from the invention as set forth in the appended claims. The novel features which are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.