Aircraft wingtips clipping obstacles is a recurring and expensive safety problem (e.g., the 2011 Paris Air Show, an A380 accident in which a wing hit a building; a 2012 accident at Chicago O'Hare, in which a Boeing 747 cargo aircraft's wing clipped an Embraer 140's rudder; a 2011 incident at Boston Logan International Airport, in which a Boeing 767 struck the horizontal stabilizer of a Bombardier CRJ900; etc.). Some solutions focus on object detection by radar sensors placed at the wingtips, with information about potential obstacles presented to the pilot on a human-machine interface (e.g., a head-up, head-down, or head-mounted display). A significant drawback of this approach is that the sensor signal covers only the area directly forward of the wingtip, leaving the angles to the side of the wingtip uncovered by the radar signal, which can be dangerous, especially in turns. Investigations of wingtip collisions have found that many accidents occur in turns (e.g., 1995 London Heathrow, an A340 struck a B757 tail; 2006 Melbourne, a B747 hit a B767 horizontal stabilizer; 2010 Charlotte Douglas, an A330 hit an A321 rudder; etc.). Current solutions provide only limited benefit in such cases, because the obstacle would appear in the sensor's field of view (FOV) only just before impact, giving the aircrew insufficient time to react appropriately to the situation.
The present invention provides an enhanced system that adaptively steers the radar beam pattern to maintain coverage during aircraft turns. In an exemplary solution, the radar sensor system is installed in an adaptive rack, which mechanically or electrically alters the radar sensor's beam pattern in order to adapt the radar sensor's field of view (FOV) to cover the area of the anticipated aircraft wingtip trajectory. The anticipated trajectory is derived, for example, from the aircraft groundspeed, acceleration, heading, turn rate, tiller position, attitude, taxi clearance, etc. The anticipated trajectory can also be supplied by the operator; i.e., the radar beam can be steered manually by the aircraft operator as well.
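As a non-authoritative sketch of the trajectory derivation described above, the snippet below dead-reckons future wingtip positions from groundspeed, heading, and turn rate. The function name, the fixed look-ahead horizon, and the constant-turn-rate assumption are illustrative choices, not part of the disclosed system.

```python
import math

def predict_wingtip_track(ground_speed_mps, heading_rad, turn_rate_radps,
                          half_span_m, dt_s=0.5, horizon_s=10.0):
    """Dead-reckon future wingtip positions of a turning aircraft.

    Hypothetical sketch: the aircraft reference point starts at the
    origin, the wingtip is offset half a wingspan to the right of the
    heading, and the turn rate is held constant over the horizon.
    """
    x = y = 0.0
    hdg = heading_rad
    track = []
    for _ in range(int(horizon_s / dt_s)):
        # Advance the aircraft reference point along the current heading.
        x += ground_speed_mps * math.cos(hdg) * dt_s
        y += ground_speed_mps * math.sin(hdg) * dt_s
        hdg += turn_rate_radps * dt_s
        # Wingtip offset: 90 degrees to the right of the heading.
        tip_x = x + half_span_m * math.cos(hdg - math.pi / 2)
        tip_y = y + half_span_m * math.sin(hdg - math.pi / 2)
        track.append((tip_x, tip_y))
    return track
```

In a fielded system, inputs such as tiller position and taxi clearance would refine this prediction; the sketch shows only the kinematic core.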
In one aspect of the invention, the radar sensor's beam pattern is steered, based on the trajectory information and/or based on knowledge of position(s) of obstacles.
Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings:
In one embodiment, as shown in
In one embodiment, the UI device 44 includes a processor 50 (optional), a communication device (wired or wireless) 52, and alerting device(s) 54. The UI device 44 provides audio and/or visual cues (e.g., via headphones, PC tablets, etc.) based on sensor-derived and processed information.
Based on information from the sensors 26, the UI device 44 provides some or all of the following functions: detecting and tracking intruders, evaluating and prioritizing threats, steering and controlling the sensors, and declaring and determining actions. Once an alert associated with a detection has been produced, a collision-avoidance action (e.g., stop the aircraft, maneuver around the intruder, etc.) is executed manually by the operator or automatically by an automated system (e.g., autobrakes, auto steering).
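One plausible realization of the "evaluate and prioritize threats" function above is to rank detections by a crude time-to-impact estimate. The function below is a hypothetical sketch, not the disclosed implementation: detections are (range, bearing) pairs, and closing speed is approximated by ownship ground speed rather than fused track velocity.

```python
def prioritize_threats(detections, ground_speed_mps):
    """Rank detected obstacles by a crude time-to-impact estimate.

    Hypothetical sketch: each detection is (range_m, bearing_deg
    relative to the sensor boresight). Closing speed is approximated
    by ownship ground speed; a real system would fuse track velocity.
    """
    ranked = []
    for range_m, bearing_deg in detections:
        closing = max(ground_speed_mps, 0.1)  # avoid divide-by-zero at rest
        tti_s = range_m / closing
        ranked.append((tti_s, range_m, bearing_deg))
    ranked.sort()  # smallest time-to-impact first = highest priority
    return ranked
```

The highest-priority entry would then drive the alerting device(s) 54 and any automated collision-avoidance action.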
In one embodiment, processing of the sensor information is done by the processor 36 at the sensor level and/or the processor 50 at the UI device 44.
In one embodiment, situational awareness is improved by integration with automatic dependent surveillance-broadcast/traffic information service-broadcast (ADS-B/TIS-B), airport/airline information on vehicles/aircraft/obstacles (e.g., through WiMax or other wireless communication means), and with synthetic vision system/enhanced vision system/combined vision system (SVS/EVS/CVS) received by the respective devices using the communication device 38.
In one embodiment, the present invention reduces false alarms by utilizing flight plan and taxi clearance information, and airport building/obstacle databases stored in memory 60 or received from a source via the communication device 52.
The sensors 26 included in the wing and tail navigation light modules provide near-complete sensor coverage of the aircraft 20. Full coverage can be attained by placing sensors in other lights or locations that are strategically located on the aircraft 20.
The pilot is alerted aurally, visually, and/or tactilely. For example, a visual alert presented on a primary flight or navigation display or an electronic flight bag (EFB) display shows aircraft wingtips outlined or a highlight of any obstructions. Aural alerting is through existing installed equipment, such as the interphone or other warning electronics or possibly the enhanced ground proximity warning system (EGPWS) platform.
In one embodiment, the processor 36 or 50 determines the direction in which to steer or sweep the sensor(s) 26, based on information from any of a number of different sources: ADS-B, flight management system (FMS), global positioning system (GPS), inertial navigation system (INS), etc. For example, a radar beam pattern produced by a radar sensor is adaptively steered to provide coverage of additional area during a sensed aircraft turn.
In one embodiment, the radar sensor is installed in an adaptive rack, which mechanically moves and/or adjusts the radar sensor's beam pattern in order to adapt the radar sensor's field of view (FOV) to cover the area into which the aircraft is taxiing (anticipated trajectory), based on the information from the other source(s). The anticipated trajectory is derived, for example, from at least some of the following data: groundspeed, acceleration, heading, turn rate, tiller position, and/or attitude, etc.
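A minimal sketch of the steering command that such an adaptive rack might receive is shown below. The proportional gain, the saturation limit, and the function name are assumptions introduced for illustration only; they are not taken from the disclosure.

```python
def steering_angle_deg(turn_rate_dps, max_steer_deg=45.0, gain_s=3.0):
    """Map a sensed turn rate to a radar beam steering command.

    Hypothetical sketch: the beam is biased into the turn by an angle
    proportional to the turn rate (a look-ahead gain, in seconds),
    saturated at the rack's mechanical/electronic steering limit.
    """
    cmd = turn_rate_dps * gain_s
    # Clamp to the assumed steering limit of the adaptive rack.
    return max(-max_steer_deg, min(max_steer_deg, cmd))
```

At a 5 deg/s taxi turn this sketch biases the beam 15 degrees into the turn, so obstacles on the inside of the turn enter the FOV well before the wingtip reaches them.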
In one embodiment, a user input device (not shown) is included in the UI device 44. The input device allows a user to control the steering of the beam(s).
In one embodiment, the sensor is installed at other fuselage areas, such as above each engine or at the nose of the aircraft, etc. Even though the sensor is not at the wingtip, the scan data is buffered, thus allowing the image 120 to be displayed.
When the beam pattern is turned, as shown in
In this case, the obstacles 200 are detected as the adaptive beam pattern is directed on the basis of the aircraft's anticipated trajectory (turn), as determined by the processor(s) 36, 50, based on trajectory information determined from speed, heading, and position information received from another vehicle/aircraft system, such as a global positioning system (GPS), inertial navigation system (INS), or comparable system, or by operator input. The detected obstacle can therefore be presented to the pilot on a display inside the cockpit, see exemplary display image 120 in
The beam coverage areas 124 will show any obstacles that are sensed when the radar has been steered. In one embodiment, the beam coverage areas 124 are parallel with the ownship icon 126, and an indicator (e.g., text such as "Steering Left", an arrow, etc.) is presented to show the operator the direction in which the sensor is being steered. In another embodiment, the beam coverage areas 124 are curved (not shown) based on the determined trajectory, radius of turn, or location of a tracked target.
The beam coverages shown are given as examples. Actual beam coverages may expand into more of a cone, permitting somewhat wider effective coverage at distance. Adaptive beam steering, coupled with the ability to selectively widen the field of view into one that spreads outward with distance, provides nearly full coverage when properly combined with software masking of the field of view and with the mechanical steering.
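The software masking mentioned above can be sketched as a simple acceptance test on each detection: the accepted bearing window is centered on the steering command and widens with range, approximating the conical spread of the real beam. The specific widths and widening rate below are illustrative assumptions.

```python
def passes_fov_mask(bearing_deg, range_m, steer_deg,
                    base_half_width_deg=10.0, widen_deg_per_100m=2.0):
    """Software mask for a steered, distance-widening field of view.

    Hypothetical sketch: a detection is accepted when its bearing lies
    within a window centered on the steering command; the window's
    half-width grows with range to mimic a conical beam.
    """
    half_width = base_half_width_deg + widen_deg_per_100m * (range_m / 100.0)
    return abs(bearing_deg - steer_deg) <= half_width
```

Detections rejected by the mask would simply be suppressed from the display image 120, so only returns inside the effective steered cone reach the operator.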
As shown in
While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.
This application claims the benefit of U.S. Provisional Application Ser. No. 61/653,297, filed May 30, 2012, the contents of which are hereby incorporated by reference. This application also claims the benefit of U.S. Provisional Application Ser. No. 61/706,632, filed Sep. 27, 2012, the contents of which are hereby incorporated by reference.