Embodiments of the present invention generally relate to aircraft, and more particularly relate to displaying information in a cockpit of an aircraft.
Modern aircraft include arrays of electronic displays, instruments, and sensors designed to provide the pilot with functional information, menus, data, and graphical options intended to enhance pilot performance and overall safety of the aircraft and the passengers. Some displays are programmable and/or customizable and some are also used by the pilot(s) as the primary instrument display for flying the aircraft. These displays are commonly referred to as the Primary Flight Displays (PFD) and are assigned or dedicated to both the pilot and copilot. PFDs display information such as aircraft altitude, attitude, and airspeed. All displays typically include a separate controller, including knobs, radio buttons, and the like, to select different menus and graphical presentations of information on the displays. Additionally, the cockpit instrument panel includes individual controllers for specific aircraft systems, such as the fuel system, the electrical power system, weather detection system, etc.
When an aircraft is in flight, it is imperative that the pilot can view the flight deck displays so that he/she can properly fly the aircraft. Normally this is not an issue. However, when smoke or another visual obscurant enters the cockpit of the aircraft, visibility can be significantly attenuated. Flight crew use oxygen masks to assist with breathing, but the visual impairment can make it difficult, if not impossible, for the pilot and co-pilot to see the primary or secondary flight displays, the flight deck controls, or even the flight path outside the aircraft.
One solution to this problem is the Emergency Vision Assurance System (EVAS). EVAS is a self-contained system that includes a battery-powered blower which draws smoke-laden air through a filter that removes visible particles and delivers the filtered air, via a flexible duct, to an inflatable transparent envelope called an Inflatable Vision Unit (IVU). In essence, an air displacement device draws air through a filter to remove smoke and visible particles, then inflates a large bag with the cleaner air. The inflated bag thereby "displaces" the smoke in the cockpit, providing the crew with a limited view of the flight deck. A drawback of EVAS, however, is that it takes at least one minute to fully inflate before it can be used.
There is a need for alternative technologies that allow pilots and flight crew to view the flight deck instrumentation when obscurants, such as smoke, enter the cockpit. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
A method is provided for communicating a video signal from a source inside an aircraft, and displaying the video signal on a display that is configured to be secured to an oxygen mask of the aircraft.
In one embodiment, a Cockpit Augmented Vision Unit (CAVU) is provided that includes a video signal feed, a housing configured to house or contain a display, and an attachment mechanism coupled to the housing configured to secure the housing and the display to an oxygen mask. The video signal feed can be communicatively coupled to at least one source of a video signal, and the display can be coupled to the video signal feed. The display is configured to display the video signal.
In another embodiment, an aircraft system is provided. The system includes an aircraft having at least one source of a video signal, and an oxygen mask that can be deployed within the aircraft. A Cockpit Augmented Vision Unit (CAVU) is communicatively coupled to the source, and includes a housing configured to house a display; and an attachment mechanism coupled to the housing that is configured to secure the display to the oxygen mask. The display can display the video signal.
Embodiments of the present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements.
As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. All of the embodiments described in this Detailed Description are exemplary embodiments provided to enable persons skilled in the art to make or use the invention and not to limit the scope of the invention which is defined by the claims. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
The display units 270 can be implemented using any man-machine interface, including but not limited to a screen, a display or other user interface (UI). In response to display commands supplied from the input devices 280, the display units 270 can selectively render various textual, graphic, and/or iconic information in a format viewable by a user, and thereby supply visual feedback to the operator. It will be appreciated that the display units 270 can be implemented using any one of numerous known displays suitable for rendering textual, graphic, and/or iconic information in a format viewable by the operator. Non-limiting examples of such displays include various cathode ray tube (CRT) displays, and various flat panel displays such as various types of liquid crystal display (LCD) and thin film transistor (TFT) displays. The display units 270 may additionally be implemented as a panel mounted display, a head-up display (HUD) projection, or any one of numerous technologies used as flight deck displays in aircraft. For example, it may be configured as a multi-function display, a horizontal situation indicator, or a vertical situation indicator. At least one of the display units 270 can be configured as a primary flight display (PFD). Depending on the implementation or mode of operation, the heads up display (HUD) unit 272 can be an actual physical display or implemented using projected images (e.g., images projected on a surface within the aircraft such as the windshield).
The audio elements 260 can include speakers and circuitry for driving the speakers. The input devices 280 can generally include, for example, any switch, selection button, keypad, keyboard, pointing devices (such as a cursor controlled device or mouse) and/or touch-based input devices including touch screen display(s) which include selection buttons that can be selected using a finger, pen, stylus, etc.
The onboard computer 210 includes a data bus 215, a processor 220, system memory 223, a synthetic vision system (SVS) 250, a SVS database 254, flight management systems (FMS) 252, and an enhanced vision system (EVS) 240 that receives information from EVS image sensor(s) 230.
The data bus 215 serves to transmit programs, data, status and other information or signals between the various elements of the onboard computer 210.
The processor 220 performs the computation and control functions of the computer system 210, and may comprise any type of processor or multiple processors. The processor 220 may be implemented or realized with a general purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination designed to perform the functions described herein. A processor device may be realized as a microprocessor, a controller, a microcontroller, or a state machine. Moreover, a processor device may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration.
It should be understood that the system memory 223 may be a single type of memory component, or it may be composed of many different types of memory components. The system memory 223 can include non-volatile memory (such as ROM 224, flash memory, etc.), volatile memory (such as RAM 225), or some combination of the two. The RAM 225 can be any type of suitable random access memory, including the various types of dynamic random access memory (DRAM), such as SDRAM, and the various types of static RAM (SRAM).
In addition, it is noted that in some embodiments, the system memory 223 and the processor 220 may be distributed across several different on-board computers that collectively comprise the on-board computer system 210.
The processor 220 is in communication with the EVS 240, the SVS 250 and flight management system (FMS) 252.
The FMS 252 is a specialized computer system that automates a wide variety of in-flight tasks. For example, the FMS 252 allows for in-flight management of the flight plan, and can provide data such as vehicle positioning, heading, attitude, and a flight plan to the SVS 250. The FMS 252 can use various sensors (such as GPS and INS) to determine the aircraft's position, and guide the aircraft along the flight plan.
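The guidance function described above can be illustrated, in simplified form, by a cross-track error computation for one straight flight-plan leg. The sketch below is illustrative only and uses a flat-earth approximation; the function name and coordinate convention are assumptions, not part of the disclosed FMS 252:

```python
import math

def cross_track_error(leg_start, leg_end, position):
    """Signed lateral deviation of `position` from the straight leg
    running from `leg_start` to `leg_end` (flat-earth approximation;
    positive means right of track). Units match the inputs."""
    (x1, y1), (x2, y2), (px, py) = leg_start, leg_end, position
    dx, dy = x2 - x1, y2 - y1
    # The 2-D cross product of the leg vector with the start-to-position
    # vector, normalized by the leg length, is the perpendicular offset.
    return ((px - x1) * dy - (py - y1) * dx) / math.hypot(dx, dy)

# An aircraft 3 units east of a northbound leg is 3 units right of track:
xte = cross_track_error((0.0, 0.0), (0.0, 10.0), (3.0, 5.0))
```

In a real FMS the same deviation would be computed on great-circle legs from the sensed position; the guidance law then steers to drive this error toward zero.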
The EVS 240 can include a processor (not shown) that generates images for display on the head-up display (HUD) unit 272. The images provide a view, looking forward, outside the aircraft 110. The EVS 240 can receive output of one or more nose-mounted EVS image sensors 230 (e.g., infrared and/or millimeter wave video cameras). In one embodiment, the EVS 240 transmits images to a transparent screen in the pilot's forward field of vision, creating a seamless, uninterrupted flow of information that increases a pilot's situational awareness and response. The EVS 240 can be specifically tuned to pick up runway lights or other heat emitting objects through cloud and other precipitation, and can also show the pilot a horizon and some terrain. The EVS 240 can reveal, for example, taxiways, runway markings, adjacent highways and surrounding terrain, etc. even at night, in light fog or rain, etc. The EVS image sensors 230 can be surrounded by an artificial cooling system that better enables the infrared receivers to detect the slightest variations in infrared light emitted from runway lights, airports and even vehicles on the ground.
The SVS 250 can include a processor (not shown) that communicates with the SVS database 254 and the flight management system (FMS) 252. The SVS database 254 includes data related to, for example, terrain, objects, obstructions, and navigation information for output to one or more of the display units 270. In one embodiment, the SVS 250 is configured to render images based on pre-stored database information. These images can include three-dimensional color maps that provide a geographic display with an accurate terrain representation of surrounding terrain, runways and approaches. In addition, in some embodiments, PFD information such as altitude, attitude, airspeed, turn and bank cues can be superimposed over the geographic display.
During normal flight conditions, the display units 370 provide the pilot with the vast majority of necessary information used in piloting an aircraft. As the primary instruments, the display units 370 display flight data according to various functions and, in a modern aircraft, are typically programmable by the pilot. One of the display units 370 is assigned to a pilot and can function as the PFD that can display attitude, airspeed, heading, etc.
Each standby display/controller 311, 312 includes a display 320 and a companion controller panel 330, and may be associated with a pilot or copilot and one or more of the display units 370. The standby display/controllers 311, 312 may provide control for and display of aircraft systems and control for the display units 370. By functioning as both a configurable controller and as a standby display, the display/controllers 311, 312 may integrate the functions of the traditional configurable controller, standby display, and standby heading display. The standby display/controllers 311, 312 typically control the programmable display units 370 such that the display units 370 may display attitude and airspeed information, as well as navigational or systems information, according to the preferences of a pilot. For example, through the controller panel 330, a pilot may configure the display units 370. In addition to controlling and configuring the display units 370, the controller panel 330 may also be configured to control aircraft systems and display the status of aircraft systems on an associated screen. For example, the controller panel 330 may be configured to control and display status information regarding the fuel system or the auxiliary power unit for the aircraft. As such, through the control of the displays and the aircraft systems, the controller panel 330 plays a significant role in the flight of the aircraft.
The standby display/controllers 311, 312 may be configured to include a controller mode and a standby mode. In the controller mode, each standby display/controller 311, 312 presents aircraft system data and menu options for managing the aircraft systems, and may display data for an automatic flight control system.
In the event of an emergency, or if the display units 370 are lost (e.g., during abnormal conditions such as an electrical failure) and are not available to the pilot and/or the copilot, the display/controllers 311, 312 may be configured to default to the standby mode. In such an emergency situation, the standby display/controllers 311, 312 can provide the pilots with the necessary information in a standardized fashion. In the standby mode, at least one of the standby display/controllers 311, 312 displays required regulatory flight data at all times. Video signals from a source (e.g., a display) inside the cockpit of an aircraft can be provided to a vision unit that is secured to an oxygen mask within the aircraft. Additionally, in some embodiments, actual video images of the cockpit can be acquired via a camera and provided to the vision unit. A user (e.g., a pilot or crew member) can select a particular one of the video signals or the actual video images to be displayed at a display of the vision unit.
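The mode behavior described above amounts to a simple fallback rule: operate as a configurable controller while the primary display units remain available, and default to standby presentation of required flight data when they are lost. A minimal sketch of this rule, with illustrative names that are not part of the disclosed system, might look like:

```python
from enum import Enum

class Mode(Enum):
    CONTROLLER = "controller"  # aircraft system data and menu options
    STANDBY = "standby"        # required regulatory flight data

def select_mode(display_units_available: bool) -> Mode:
    """Default to standby mode whenever the primary display units are
    lost (e.g., an electrical failure); otherwise act as a controller."""
    return Mode.CONTROLLER if display_units_available else Mode.STANDBY

# Normal operation vs. loss of the primary display units:
normal = select_mode(True)    # Mode.CONTROLLER
failure = select_mode(False)  # Mode.STANDBY
```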
The CAVU 400 is mountable on an oxygen mask 405, and includes an input selector 410, a housing 420, a display 430, an attachment mechanism 440, one or more video signal feed(s) 470, user input devices 480, 482, and an optional camera 495. The video signal feed 470 can be communicatively coupled to the various blocks 240, 250, 252, 270, 272, 276 described above.
The display 430 can be housed within the housing 420 such that the display 430 is contained (at least partially) within the housing 420. The attachment mechanism 440 can be attached or coupled to the housing 420. The attachment mechanism 440 is used to secure the CAVU to the oxygen mask 405 when needed. The attachment mechanism 440 allows for the CAVU 400 to be quickly mounted flush with the oxygen mask, and easily removed in situations where the oxygen mask 405 is required but the display 430 is not required (e.g. rapid decompression). The attachment mechanism 440 allows the user (e.g., pilot or crew) to secure the housing 420, and hence the display 430, to an oxygen mask 405 that is deployed within the cockpit under certain circumstances, such as when smoke or other visual obscurants start to enter the cockpit. This allows the user to view information that is presented on the display 430 when the CAVU 400 is attached to and worn over the oxygen mask 405. In one embodiment, the attachment mechanism 440 can be an adjustable, elastic head strap that allows for quick and easy attachment of the CAVU 400 to the oxygen mask 405. In one implementation, the housing 420 can include soft padding or a seal that contacts against the oxygen mask 405 when mounted on the oxygen mask 405.
The video signal feed 470 can be implemented using cables that are compliant with component video, composite video (e.g., NTSC, PAL or SECAM), or s-video standards. The display 430 can be indirectly coupled to the video signal feed 470 via the input selector 410. The user input devices 480, 482 can receive inputs from the user (referred to herein as "user input"), which are provided to the input selector 410 to control which source of video information is displayed on the display 430. In one embodiment, the user input devices can include a switch button 480 that is used to toggle between selection of the video camera 495 and the other video signals, and another switch button 482 that is used to select a particular one of the video signals.
In some embodiments, a video camera 495 can be integrated with and/or mounted on the housing 420. The video camera 495 operates using a portion of the electromagnetic spectrum that provides penetration of obscurants such as smoke. The video camera 495 can be, for example, a shortwave infrared (IR) or near IR camera. The video camera 495 can be augmented by in-band illumination sources (e.g., IR LEDs) inside the flight deck. In one embodiment, to enhance the visibility of flight deck controls to the user, the illumination sources can be located close to primary controls. The video camera 495 provides the user with a view of the flight deck, and allows the flight deck to be viewed by the user through dense smoke or similar obscurants that would normally prevent the user from seeing it. The video camera 495 can be used to acquire video images 497 of the cockpit of the aircraft 110, including actual images of flight deck controls and display units 270 located within the cockpit of the aircraft 110, when normal viewing of the flight deck controls and the display units 270 is visually attenuated, obscured or impaired in some way. In one embodiment, the video camera 495 can be removable, which allows the user to move it to another location in the cockpit (e.g., the windshield). Alternatively, one or more other video cameras (not shown) can be mounted anywhere within the cockpit, and real-time video images acquired by those cameras can be communicated to the video input selector 410 of the CAVU 400 to provide additional sources of video information.
The CAVU 400 includes a port (not illustrated) that receives the video signal feed 470, and couples it to the video input selector 410 of the CAVU 400. The video input selector 410 is coupled to the camera 495, the user input devices 480, 482 and the display 430. The input selector 410 receives the various video signals 500 via the video signal feed 470 and the video images 497 of the cockpit that are acquired by the video camera 495. The video signal feed 470 carries video information from various different sources onboard the aircraft, and provides them to the input selector 410. The video signal feed 470 can carry video signals 500 received from different displays 270-276 within the cockpit, but it should be appreciated that these sources are not limited to these displays 270-276 and can include other sources depending on the implementation.
The user can interact with the input devices 480, 482 to generate user input signals that are used to control which source of video information is displayed on the display 430. The input devices 480, 482 can generally include, for example, any switch, selection button, and/or touch-based input devices which include selection buttons that can be selected using a finger. Each user input device is configured to receive user inputs that are provided to the input selector 410. In response to the user inputs from the user input devices, the input selector 410 can select one of its video inputs (e.g., either the video images 497 from the camera 495 or one of the different/unique video signals 500) that will be output to the display 430.
The user input devices 480, 482 can be implemented using switches, such as rotary switches, or any type of touch sensitive control devices including, for example, switch buttons. In one embodiment, the user input devices can include a switch button 480 that is used to select the video images 497 from the video camera 495 as the output for the display 430, and another switch button 482 that is used to select and switch between the video signals 500 to select a particular one of the video signals 500 as the output for the display 430. When the user selects one particular video signal 500 as the desired output, the CAVU 400 can provide that particular video signal 500 to the display 430 for presentation to the user.
When in operation, a user (e.g., pilot or crew) can use the input devices 480/482 to select from different, unique sources of video information that can be replicated and displayed at the display 430. Stated differently, in response to user input, the video input selector 410 will output either one of the different video signals 500 that drive the display units 270, 272, or the video images 497 of the cockpit, to display the selected video information to the user via the display 430.
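The selection behavior described above can be sketched as a small state machine: one button toggles the camera view, a second cycles through the repeated video signals. The class and method names below are illustrative assumptions, not part of the disclosed CAVU 400:

```python
class VideoInputSelector:
    """Illustrative model of an input selector: one button toggles the
    camera view on and off; a second button cycles through the video
    signal sources shown when the camera is not selected."""

    def __init__(self, video_sources, camera_source):
        self.video_sources = list(video_sources)  # e.g., PFD, SVS, EVS feeds
        self.camera_source = camera_source        # mask-mounted IR camera feed
        self.index = 0                            # currently selected signal
        self.camera_selected = False

    def press_camera_button(self):
        # Models the first switch button: toggle the camera view.
        self.camera_selected = not self.camera_selected

    def press_source_button(self):
        # Models the second switch button: step to the next video signal.
        self.index = (self.index + 1) % len(self.video_sources)

    def output(self):
        """Source currently routed to the display."""
        if self.camera_selected:
            return self.camera_source
        return self.video_sources[self.index]

# Cycle to the second signal, then toggle the camera view on:
sel = VideoInputSelector(["PFD", "SVS", "EVS"], "IR camera")
sel.press_source_button()   # output is now "SVS"
sel.press_camera_button()   # output is now "IR camera"
```

Toggling the camera off again would return the display to the last-selected video signal, which matches the two-button arrangement described for the switch buttons 480 and 482.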
In the embodiment illustrated in
It should be appreciated that the video signals 500 illustrated in
The disclosed embodiments augment natural vision by allowing the flight crew to see all primary flight data, and leverage advanced features of the aircraft, such as Synthetic Vision System (SVS), Enhanced Vision System (EVS), and Head-Up Display (HUD) data, in order to provide a wearable, cost-effective solution for a visually obstructed cockpit environment.
Those of skill in the art would further appreciate that the various illustrative logical blocks/tasks/steps, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Some of the embodiments and implementations are described above in terms of functional and/or logical block components (or modules) and various processing steps. However, it should be appreciated that such block components (or modules) may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. In addition, those skilled in the art will appreciate that embodiments described herein are merely exemplary implementations.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The word “exemplary” is used exclusively herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Numerical ordinals such as “first,” “second,” “third,” etc. simply denote different singles of a plurality and do not imply any order or sequence unless specifically defined by the claim language. The sequence of the text in any of the claims does not imply that process steps must be performed in a temporal or logical order according to such sequence unless it is specifically defined by the language of the claim. The process steps may be interchanged in any order without departing from the scope of the invention as long as such an interchange does not contradict the claim language and is not logically nonsensical.
Furthermore, depending on the context, words such as “connect” or “coupled to” used in describing a relationship between different elements do not imply that a direct physical connection must be made between these elements. For example, two elements may be connected to each other physically, electronically, logically, or in any other manner, through one or more additional elements.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. For example, although embodiments described herein are specific to aircraft, it should be recognized that principles of the inventive subject matter may be applied to other types of vehicles. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the invention as set forth in the appended claims and the legal equivalents thereof.