There currently exists an expensive safety problem of aircraft wingtips clipping obstacles (e.g., the 2011 Paris Air Show accident in which an A380's wing hit a building, and the 2012 Chicago O'Hare accident in which a Boeing 747 cargo aircraft's wing clipped an Embraer 140's rudder). Some solutions place radar sensors at the wingtips and present information about potential obstacles to the pilot on a human-machine interface (e.g., a head-up, head-down, or head-mounted display). Having such information available improves crewmembers' awareness of obstacles, allowing them to better adjust the aircraft's current speed and direction to the detected obstacles and to evaluate whether a particular obstacle is a threat. However, providing information about only the lateral location of obstacles relative to an aircraft does not explicitly address whether the wing, wingtips, or nacelles will clear the obstacles, given the obstacles' heights.
The present invention provides systems and methods for predicting and displaying targets based on their height relative to the wing or other elements of the aircraft, such as engine nacelles. In addition, systems and methods for depicting target threat based on lateral and/or vertical proximity of targets are included. The locations of ground obstacles are determined from radar returns (from sensors deployed on the ownship); from aircraft surveillance data, such as automatic dependent surveillance-broadcast (ADS-B); and/or from an airport moving map database (e.g., locations of buildings, towers, etc., on the airport surface). The ADS-B data provides aircraft-type information, and an onboard database provides a look-up table of aircraft and/or other-vehicle geometry. In addition, airport ground vehicles equipped with ADS-B are detectable.
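As a rough illustration of the source fusion just described, the following sketch merges the three obstacle sources into one list and resolves ADS-B aircraft types against an onboard geometry table. The disclosure does not specify an implementation; all class, field, and table names here are hypothetical, and the sample geometry values are stand-ins.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Obstacle:
    lat: float                  # degrees
    lon: float                  # degrees
    height_ft: Optional[float]  # vertical extent above ground, when known
    source: str                 # "radar", "adsb", or "airport_db"

# Stand-in for the onboard look-up table of aircraft/vehicle geometry.
GEOMETRY_DB = {
    "B748": {"wingspan_ft": 224.4, "tail_height_ft": 63.5},
    "A388": {"wingspan_ft": 261.8, "tail_height_ft": 79.1},
}

def fuse_obstacles(radar_returns, adsb_targets, airport_db_obstacles):
    """Merge radar, ADS-B, and airport-map obstacles into one list."""
    obstacles = [Obstacle(r["lat"], r["lon"], r.get("height_ft"), "radar")
                 for r in radar_returns]
    for t in adsb_targets:
        # ADS-B supplies the aircraft type; the geometry table supplies size.
        geom = GEOMETRY_DB.get(t["type"], {"tail_height_ft": None})
        obstacles.append(Obstacle(t["lat"], t["lon"],
                                  geom["tail_height_ft"], "adsb"))
    obstacles += [Obstacle(b["lat"], b["lon"], b["height_ft"], "airport_db")
                  for b in airport_db_obstacles]
    return obstacles
```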
Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings:
The UI device 44 includes a processor 50 (optional), a communication device (wired or wireless) 52, and an alerting device(s) 54. The UI device 44 provides audio and/or visual cues (e.g., via headphones, PC tablets, etc.) based on sensor-derived and processed information.
Based on information from the sensors 26, the UI device 44 provides some or all of the following functions: detecting and tracking intruders, evaluating and prioritizing threats, and declaring and determining actions. Once an alert associated with a detection has been produced, a collision-avoidance action (e.g., stopping the aircraft, maneuvering around the intruder, etc.) is executed manually by a pilot or automatically by an automated system (e.g., autobrakes).
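A minimal sketch of that detect-evaluate-declare sequence follows; the 15-second alerting horizon and all interfaces are illustrative assumptions, not values taken from the disclosure.

```python
import math

ALERT_THRESHOLD_S = 15.0  # assumed alerting horizon, not from the disclosure

def time_to_collision(track) -> float:
    """Seconds until the tracked obstacle reaches the aircraft structure,
    assuming the current closure rate holds."""
    if track["closing_speed_fps"] <= 0:
        return math.inf  # diverging or static relative geometry: no conflict
    return track["range_ft"] / track["closing_speed_fps"]

def process_cycle(tracks, ui_device):
    """Evaluate and prioritize tracked intruders, then declare an alert."""
    for track in sorted(tracks, key=time_to_collision):  # soonest conflict first
        if time_to_collision(track) < ALERT_THRESHOLD_S:
            ui_device.alert(track)  # crew stops/maneuvers, or autobrakes act
            return track            # declare only the highest-priority threat
    return None
```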
In one embodiment, processing of the sensor information is done by the processor 36 at the sensor level and/or by the processor 50 at the UI device 44.
In one embodiment, situational awareness is improved by integration with automatic dependent surveillance-broadcast/traffic information service-broadcast (ADS-B/TIS-B); with airport/airline information on vehicles, aircraft, and obstacles (e.g., received through WiMax); and with synthetic vision system/enhanced vision system/combined vision system (SVS/EVS/CVS) data, each received by the respective devices using the communication device 38.
In one embodiment, the present invention reduces false alarms by utilizing flight plan and taxi clearance information, along with airport building/obstacle databases stored in the memory 60 or received from a source via the communication device 52. The stored airport building/obstacle databases include height information for each ground obstacle.
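One plausible reading of that false-alarm reduction is sketched below, under the assumptions that the taxi clearance is expanded into a swept corridor and that the database reports each obstacle's height; the interface names are hypothetical.

```python
def suppress_false_alarms(obstacles, taxi_corridor, wing_lower_ft):
    """Drop detections that cannot conflict with the cleared taxi route.

    taxi_corridor is assumed to expose contains(lat, lon) for the swept
    path of the flight-plan/taxi clearance; wing_lower_ft is the height
    of the wing underside above the ground.
    """
    kept = []
    for obs in obstacles:
        # Detections well away from the cleared route are not alerted.
        if not taxi_corridor.contains(obs.lat, obs.lon):
            continue
        # Charted obstacles too short to reach the wing are not threats.
        if (obs.source == "airport_db" and obs.height_ft is not None
                and obs.height_ft < wing_lower_ft):
            continue
        kept.append(obs)
    return kept
```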
The sensors 26 integrated in the wing and tail modules 30 provide near-complete sensor coverage of the aircraft 20. Full coverage can be attained by placing additional sensors at other locations on the aircraft 20.
The pilot is alerted aurally, visually, and/or tactilely. For example, aural alerting is provided through existing installed equipment, such as the interphone or other warning electronics (e.g., a Crew Alerting System) or possibly the Enhanced Ground Proximity Warning System (EGPWS) platform.
The present invention provides systems and methods for allowing a flight crew to visualize an object's distance from the host aircraft using two scalable range rings, which can be set to either feet or meters.
Gray (or other) colored gridlines 128, 130 are included in the areas 124. The gridlines 128, 130 add perspective and situational awareness at fixed intervals on the image 120. In one embodiment, each vertical and horizontal gridline 128, 130 represents 10 feet when units are set to feet and 5 meters when units are set to metric. Other increments may be used.
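The fixed-interval placement could be computed as in this sketch; the coverage extent is a hypothetical parameter, and the function name is not from the disclosure.

```python
GRID_STEP = {"feet": 10.0, "meters": 5.0}  # intervals named in the text

def gridline_offsets(units: str, coverage_extent: float) -> list:
    """Offsets (in the selected units) at which gridlines 128, 130 are
    drawn, out to the edge of the depicted coverage area 124."""
    step = GRID_STEP[units]
    return [step * i for i in range(1, int(coverage_extent // step) + 1)]
```

For example, a 55-foot coverage extent yields gridlines at 10, 20, 30, 40, and 50 feet.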
In one embodiment, as shown in
In the image 180, a primary strike zone 190 is identified as the zone from the wingtip inward toward the aircraft (i.e., the icon 126). A secondary strike zone 194 extends a predetermined distance outward from the wingtip within the defined coverage area 124. Primary targets are those inside the primary strike zone 190, and secondary targets are those in the secondary strike zone 194. Separation between the zones 190, 194 is shown with a dashed line 196 (e.g., white or other color). In one embodiment, primary targets are shown as a large solid circle 200 (e.g., brown or other color) and secondary targets are shown as a smaller, hollow circle 202 (e.g., brown or other color). Other forms of display differentiation could also be used to identify the primary and secondary targets, such as texture (patterns). In one embodiment, a different symbolic shape is used to distinguish between the primary and secondary targets. The secondary targets represent targets that would not be threats if the intended trajectory of the host aircraft (vehicle) is maintained.
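One way to assign targets to the zones is sketched below, treating the primary zone as everything inboard of the wingtip and the secondary zone as a fixed band outboard of it; the band width is an assumption, since the disclosure says only that it is a predetermined distance.

```python
SECONDARY_BAND_FT = 15.0  # assumed width of the secondary strike zone 194

def classify_target(lateral_offset_ft: float, semispan_ft: float) -> str:
    """lateral_offset_ft: target distance from the aircraft centerline;
    semispan_ft: centerline-to-wingtip distance of the host aircraft."""
    if lateral_offset_ft <= semispan_ft:
        return "primary"    # rendered as the large solid circle 200
    if lateral_offset_ft <= semispan_ft + SECONDARY_BAND_FT:
        return "secondary"  # rendered as the smaller hollow circle 202
    return "outside"        # beyond the depicted coverage area 124
```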
In one embodiment, the dashed line 196 does not appear on the display image 180. Instead, the differentiation between the primary and secondary zones 190, 194 is shown with a thicker white or gray line in the middle of the radar beam area 124 (i.e., visual coding is used to distinguish the zones). Other visual coding techniques may be used.
If information is available on the vertical extent of detected obstacles (targets), the processor 36 or 50 shows (in the image 180) those targets that are in the primary strike zone 190 but either above or below the host vehicle component (e.g., the wing or nacelle area) with secondary target icons 220, as shown in
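Under the assumption that the height of the wing's lower surface is known, this demotion might be screened as in the following sketch; the 3-foot margin is illustrative and is not stated in the disclosure.

```python
VERTICAL_MARGIN_FT = 3.0  # assumed safety buffer, not from the disclosure

def demote_if_vertically_clear(zone: str, obstacle_top_ft,
                               wing_lower_ft: float) -> str:
    """Demote a primary target to secondary when its top, plus a margin,
    is below the wing. The symmetric check against an overhanging
    obstacle (e.g., a jet bridge passing above the wing) would compare
    the obstacle's underside with the wing's upper surface."""
    if zone == "primary" and obstacle_top_ft is not None:
        if obstacle_top_ft + VERTICAL_MARGIN_FT < wing_lower_ft:
            return "secondary"  # displayed with secondary target icon 220
    return zone
```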
If vertical information related to obstacles is available, an airport moving map display or vertical situation display (VSD) (e.g., displayed below the plan view display) incorporates obstacle information from an airport database, based on received target information.
As shown in
In one embodiment, a vertical (y) scale in the VSD image 280 shows obstacle height in either meters or feet (set through either a menu or maintenance options user interface). The lateral distance (feet or meters) to the target is shown along the lateral (x) scale of the VSD image 280. The horizontal scale can be adjusted using a zoom knob on a cursor control device (CCD) or using a range set knob.
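The unit and range handling for the two scales reduces to a conversion like the following sketch (1 m = 3.28084 ft); the function names and the internal use of feet are assumptions.

```python
FT_PER_M = 3.28084

def to_display_units(value_ft: float, units: str) -> float:
    """Convert an internal value in feet to the crew-selected units."""
    return value_ft if units == "feet" else value_ft / FT_PER_M

def vsd_scale_maxima(tallest_obstacle_ft: float, range_setting_ft: float,
                     units: str) -> tuple:
    """Vertical (y) and lateral (x) scale maxima for the VSD image 280;
    range_setting_ft follows the CCD zoom knob or range set knob."""
    return (to_display_units(tallest_obstacle_ft, units),
            to_display_units(range_setting_ft, units))
```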
In one embodiment, a vertical line 292 is drawn from the primary target icon 286 to the lateral scale to show how far the associated object is from the wingtip (or other aircraft structure).
In one embodiment, the flight crew is presented with the option of selecting either a “LEFT WING” or “RIGHT WING” vertical profile depiction. The selection causes the processors 36, 50 to show icons only for obstacles located on the selected side of the host aircraft.
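The side selection amounts to a simple filter, sketched below under the assumption that lateral offsets are signed negative to the left of the aircraft centerline; the attribute name is hypothetical.

```python
def filter_by_selected_side(obstacles, selection: str):
    """Keep only obstacles on the side chosen by the flight crew.

    Assumes each obstacle carries lateral_offset_ft, negative left of
    the aircraft centerline and positive right of it."""
    if selection == "LEFT WING":
        return [o for o in obstacles if o.lateral_offset_ft < 0]
    return [o for o in obstacles if o.lateral_offset_ft >= 0]
```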
If vertical information related to obstacles is available (i.e., received via the communication device 52 or retrieved from the local memory 60 based on identification of the associated target), the distance in feet or meters of objects below (or above) the wing or nacelle is shown adjacent to the respective icon (see the VSD image 340).
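Formatting that per-icon clearance annotation could look like the following sketch; the names and rounding are assumptions.

```python
FT_PER_M = 3.28084

def clearance_label(wing_lower_ft: float, obstacle_top_ft: float,
                    units: str) -> str:
    """Text drawn adjacent to an obstacle icon, e.g. '12 ft below'."""
    gap_ft = wing_lower_ft - obstacle_top_ft
    magnitude = abs(gap_ft) if units == "feet" else abs(gap_ft) / FT_PER_M
    suffix = "ft" if units == "feet" else "m"
    direction = "below" if gap_ft >= 0 else "above"
    return f"{magnitude:.0f} {suffix} {direction}"
```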
While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.
This application is a continuation of U.S. patent application Ser. No. 13/872,889 by Khatwa et al., filed Apr. 29, 2013 and entitled, “SYSTEMS AND METHODS FOR ENHANCED AWARENESS OF OBSTACLE PROXIMITY DURING TAXI OPERATIONS,” which claims the benefit of U.S. Provisional Patent Application Ser. No. 61/706,632, filed Sep. 27, 2012, the entire contents of each of which are hereby incorporated by reference in their entireties. The entire content of U.S. Provisional Application Ser. No. 61/653,297, filed May 30, 2012, is also incorporated by reference herein.
References Cited

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
6118401 | Tognazzini | Sep 2000 | A
7379014 | Woodell | May 2008 | B1
7783427 | Woodell | Aug 2010 | B1
7860641 | Meunier | Dec 2010 | B2
7903023 | Cornie et al. | Mar 2011 | B2
7932838 | Hamza | Apr 2011 | B2
7986249 | Wilson | Jul 2011 | B2
8077081 | Bateman et al. | Dec 2011 | B2
8249762 | Flotte | Aug 2012 | B1
8924139 | Louis et al. | Dec 2014 | B2
8958942 | Kolcarek et al. | Feb 2015 | B2
8970423 | Kabrt et al. | Mar 2015 | B2
9037392 | Kirk et al. | May 2015 | B2
20030179215 | Coldefy | Sep 2003 | A1
20060007021 | Konya | Jan 2006 | A1
20060238376 | Khatwa | Oct 2006 | A1
20060238402 | Khatwa | Oct 2006 | A1
20070240056 | Pepitone | Oct 2007 | A1
20080062011 | Butler | Mar 2008 | A1
20080109160 | Sacle | May 2008 | A1
20080306691 | Louis | Dec 2008 | A1
20090045982 | Caillaud | Feb 2009 | A1
20090174591 | Cornic | Jul 2009 | A1
20090219189 | Bateman | Sep 2009 | A1
20090265088 | Dias | Oct 2009 | A1
20100042312 | Meunier | Feb 2010 | A1
20100123599 | Hamza | May 2010 | A1
20100127895 | Wilson | May 2010 | A1
20100332123 | Filias | Dec 2010 | A1
20110267206 | Reynolds | Nov 2011 | A1
20120130624 | Clark et al. | May 2012 | A1
20130096814 | Louis et al. | Apr 2013 | A1
20130321169 | Bateman et al. | Dec 2013 | A1
20130321176 | Vasek et al. | Dec 2013 | A1
20130321192 | Starr et al. | Dec 2013 | A1
20130321193 | Vasek et al. | Dec 2013 | A1
20130325312 | Khatwa et al. | Dec 2013 | A1
20140062756 | Lamkin et al. | Mar 2014 | A1
20140085124 | Dusik et al. | Mar 2014 | A1
Foreign Patent Documents

Number | Date | Country
---|---|---
2011028579 | Feb 2011 | JP
Other Publications

Entry
---
Response to European Examination Report dated Nov. 19, 2015, from counterpart European Application No. 13796644.6, filed Feb. 24, 2016, 18 pp.
International Preliminary Report on Patentability from counterpart International Patent Application No. PCT/US2013/043287, dated Dec. 11, 2014, 8 pp.
Extended Search Report from counterpart European Application No. 13796644.6-1812, dated Nov. 2, 2015, 8 pp.
Search Report and Written Opinion from counterpart International Application No. PCT/US2013/043287, dated Aug. 27, 2013, 11 pp.
Prosecution History from U.S. Appl. No. 13/872,889, dated Apr. 29, 2013 through Nov. 25, 2015, 59 pp.
Examination Report from counterpart European Application No. 13796644.6, dated Sep. 7, 2018, 4 pp.
Prior Publication Data

Number | Date | Country
---|---|---
20160133139 A1 | May 2016 | US

Provisional Application

Number | Date | Country
---|---|---
61706632 | Sep 2012 | US

Parent Case Data

Relation | Number | Date | Country
---|---|---|---
Parent | 13872889 | Apr 2013 | US
Child | 14961524 | | US