The present invention pertains to formation flight information, and more particularly to the calculation and display of predicted trajectory data, collision avoidance alerts, time on target details, and chalk-specific information on an augmented reality device, using navigational tools to include inertial and global positioning systems.
The following is a tabulation of some prior art that presently appears relevant, each of which is herein incorporated by reference in its entirety:
The Federal Aviation Administration defines formation flight in 14 C.F.R. § 91.111 as operating near other aircraft, along with a series of operational requirements regarding collision hazards and passengers for hire. Generally, formation flight consists of two or more aircraft ordered by serial and chalk, with a clear and well-defined delineation of in-air responsibilities, and a shared objective. Safe formation flight requires a thorough understanding of route structures, aircraft aerodynamics, procedures relating to lost visual contact and lost communication, in-flight link-up procedures, lead change procedures, emergency procedures, and more.
Timely, tight, and precise formation flight is an essential task for several military missions which collectively engage a wide range of rotary-wing and fixed-wing aircraft. The United States Army Aviation enterprise especially prides itself on its Air Assault mission-set, typically involving some combination of UH-60 Black Hawk and CH-47 Chinook helicopters. Air Assault is the strategic movement of Soldiers, typically light infantry, into and across the battlefield to seize and maintain critical territory. Air Assault further encompasses several unique capabilities to include fast-roping, rappelling, and special patrol insertion and extraction. The 101st Airborne Division is the United States Army's light infantry division specializing in Air Assault operations, and Air Assault training is conducted at several specialized Air Assault schools across the United States.
Formation flight, although a force multiplier on the battlefield, is inherently risky. There are a multitude of well-documented case studies detailing fatal formation flight accidents on the civilian and military side alike, the vast majority being mid-air collisions between formation aircraft caused by either failing to properly execute emergency procedures or failing to maintain situational awareness of nearby aircraft.
Augmented reality displays provide situationally relevant information in real-time in both visually degraded environments (such as during nighttime operations) and environmentally hazardous environments (such as during formation flight). Army Aviators already train and maintain proficiency on military heads-up displays during nighttime operations; these systems are designed to integrate with existing night vision devices.
Described is an intelligent augmented reality system and associated methods, featuring networked computing devices and at least one augmented reality display device. These devices provide relevant information, and thereby increased situational awareness, to pilots engaged in formation flight, to ensure enhanced flight operations.
Disclosed herein are various embodiments of systems and associated methods for calculating and displaying formation flight information, to include aircraft spacing, predicted trajectory data, collision avoidance alerts, time on target details, and chalk-specific information, on an augmented reality display designed to interface with aviation helmets. Two or more networked computing devices, each on a separate aircraft, collect some combination of aircraft altitude, location, and inertial data, perform certain calculations, and then develop a virtual overlay according to aircraft relative position and nearby aircraft trajectories. The virtual overlay is further informed by compass and gyroscopic data from an operatively coupled augmented reality display device. The developed virtual overlay is then transmitted to the display device for viewing by the pilot. The display of relevant formation flight information using augmented reality tools may result in improved formation flight spacing, emergency procedure response, and collision avoidance.
These and other embodiments of the invention are described in more detail below.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The concise, intelligent, and organized presentation of aircraft data using heads-up displays and similar augmented reality applications has resulted in improved safety outcomes and enhanced overall performance. The present invention takes advantage of advances in networked applications to provide pilots critical information regarding not only their own aircraft, but aircraft in their immediate vicinity; some embodiments of the present invention may extend this application to not only the discussed dangers of formation flying, but single-ship general aviation procedures as well. Embodiments of the present invention discussed in the detailed description and shown in the drawings are not intended to restrict the scope of aircraft augmented reality applications, but rather, to illuminate the underpinnings and advantages brought about through variations on the proposed invention. Additional embodiments may be possible. The various embodiments may be used individually or together as may be needed or desired.
The figures will now be discussed in depth to describe the present invention.
The terms “chalk,” “serial,” and “lift” as used herein are military aviation terms which describe aircraft in flight formations. They are discussed in Army Training Publication No. 3-04.1 “Aviation Tactical Employment,” dated April 2016 (and, in particular, see Chapter 5, Section V, pp. 5-23), herein incorporated by reference in its entirety. In general, a “chalk” refers to a single aircraft together with the load contained within. A “serial” is a group of chalks. So, in a serial, different aircraft can be designated as chalk 1, chalk 2, chalk 3, and so forth, as an example. A “lift” is comprised of one or more serials, each having a serial designation like serial 1, serial 2, serial 3, and so on, as an example. All chalks should be designated within serials and lifts. Thus, an exemplary chalk designation of 1-2 refers to serial 1, chalk 2.
The user input 110 may be accessed, filled out, and submitted through an application on a mobile device or other platform, and is intended to be completed prior to the formation flight in accordance with the mission or objective. The user input 110 will be examined in additional depth in the discussion of
The operatively coupled device package per aircraft chalk 130 includes, at a minimum, a computing device 131, but may further include a display device 132. In aircraft with a dual-pilot cockpit, more than one computing device 131 and display device 132 may be present, along with a means or mechanism for indicating which pilot is actively on the controls. Computing device 131 is a compact computing device capable of collecting data from a variety of sensors and the user input 110, processing that data in-flight to generate a virtual overlay using virtual overlay program 204, and transmitting the generated overlay to display device 132. Computing device 131 is also capable of drawing and/or receiving data from like and networked computing devices on nearby aircraft to support virtual overlay generation and to enable intelligent features such as collision avoidance detection.
Computing device 131 includes and/or is operatively connected to a pressure sensor 201, GPS receiver 202, processor 203, virtual overlay program 204, and transceiver 205, and may further include accelerometers 206 and gyroscopes 207. The computing device 131 is represented here as the overall system for the aircraft. Some of these elements, such as pressure sensor 201 and GPS receiver 202, may already exist elsewhere on the aircraft and may provide input to the computing device 131. In other words, such elements need only be operatively connected to the computing device 131, although, in other embodiments, such elements may be incorporated into or otherwise form a part of the computing device 131 itself. Pressure sensor 201 may include, for example, one or more, or any combination of, digital pressure, analog barometric pressure, and differential pressure sensors, to determine aircraft altitude. GPS receiver 202 may include one or more global positioning system receivers that provide accurate location data calculated from delays in signal propagation from four or more GPS satellites.
GPS data is one way of determining and tracking the position of each aircraft. As received, GPS data describes a device's location and generally includes longitude, latitude, altitude, and the time that the transmission was received. Alternatively, aircraft altitude and location may be sourced to computing device 131 through any one of several existing internal or external aircraft systems designed to aggregate aircraft information. Known aviation-proximity sensors, such as RADAR, LIDAR, ultrasonic, or otherwise, can also be included in or on the aircraft. Such sensors are common on most aircraft. They can be used to provide distance measurements between aircraft as well as to other objects. These measurements, in turn, can be used to estimate aircraft relative position in a formation. For instance, the system can be operatively configured to interact with the aviation-proximity sensors to measure and track distances between the aircraft in formation.
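For illustration, the separation between two aircraft can be estimated directly from their GPS fixes using the well-known haversine great-circle formula. The Python sketch below is a simplified example (the function name and coordinates are hypothetical; an operational system would use WGS-84 geodesic routines rather than a spherical-Earth approximation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two GPS fixes,
    # assuming a spherical Earth of mean radius 6,371 km.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Two aircraft on the same latitude, 0.02 degrees of longitude apart
separation = haversine_m(36.6672, -87.4755, 36.6672, -87.4555)
```

A distance so computed could then be compared against aviation-proximity sensor measurements as a cross-check on relative position.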
Altitude measurements can be determined using the aircraft's altimeter. Altimeters traditionally use barometric pressure sensors for measuring ambient pressure outside the aircraft. Alternatively, GPS data, which includes altitude information, can be used. Virtual overlay program 204 runs on processor 203 and generates a unique virtual overlay informed by both user input 110 and the multitude of sensors on a network of computing devices 131.
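As a sketch of how a barometric reading from pressure sensor 201 could be converted to altitude, the standard-atmosphere (ISA) pressure-altitude formula may be applied. The function name and sample pressure below are illustrative only:

```python
def pressure_altitude_ft(static_hpa, ref_hpa=1013.25):
    # ISA barometric formula: converts ambient static pressure (hPa)
    # to pressure altitude in feet, referenced to the standard 1013.25 hPa datum.
    return 145366.45 * (1.0 - (static_hpa / ref_hpa) ** 0.190284)

# An ambient reading of 850 hPa corresponds to roughly 4,780 ft
alt = pressure_altitude_ft(850.0)
```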
Transceiver 205 is one or more transceivers capable of receiving and transmitting data with both an operatively coupled display device 132 and like computing devices 131 on network 120. It includes one or more antennas and other requisite hardware and processors for radio communications.
Integrating accelerometers 206 and/or gyroscopes 207 may improve the accuracy of collected location data by employing a hybrid navigation model consisting of both inertial and GPS systems. Hybrid navigation is now a standard implementation in most advanced systems (aircraft, self-driving cars, etc.). A flight plan can be uploaded ahead of the flight to provide the flight path. Such data may include mapping information setting forth the flight path. Precise data, like GPS, can be used for this purpose. The hybrid navigation model tracks the aircraft's position in real time and compares it to the flight path. The system responds to maintain the aircraft on the desired flight path by making corrections to the flight controls. Hybrid navigation models may use neural networks or so-called fuzzy logic correction algorithms for more precise control. One exemplary hybrid navigation system is described in the following paper: A. Hiliuta, R. Landry and F. Gagnon, “Fuzzy corrections in a GPS/INS hybrid navigation system,” IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 2, pp. 591-600, April 2004, herein incorporated by reference in its entirety. This and other hybrid navigation models can be used in one or more embodiments to determine aircraft location and predict aircraft trajectory as a function of both the outputs of the GPS receiver and inertial data. Accelerometer 206 and gyroscope 207 may be packaged separately or on a consolidated inertial measurement unit (IMU) chip. These hardware arrangements are standard to aviation industry practice and will not be further described herein.
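The blending of inertial and GPS data can be sketched, in greatly simplified form, as a complementary filter in which the smooth, high-rate INS estimate dominates while periodic GPS fixes bound its long-term drift. This is a minimal illustration only (the function name, weights, and local-grid coordinates are hypothetical); production systems use Kalman filters or the fuzzy-logic corrections cited above:

```python
def fuse(ins_est, gps_fix, alpha=0.98):
    # Complementary blend: weight alpha favors the high-rate INS solution,
    # while the (1 - alpha) share of the GPS fix corrects accumulated drift.
    return tuple(alpha * i + (1.0 - alpha) * g for i, g in zip(ins_est, gps_fix))

# INS has drifted 40 m east of the GPS fix; repeated fusion pulls it back.
est = (1040.0, 500.0)   # drifted INS position (meters, local grid)
fix = (1000.0, 500.0)   # GPS fix
for _ in range(100):
    est = fuse(est, fix)
```

After repeated updates the fused estimate converges toward the GPS fix while remaining smooth between fixes, which is the essential benefit of the hybrid model.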
Display device 132 includes an augmented reality display 210, gyroscope 211, compass 212, and transceiver 213, and may further include a camera 214. Camera 214 can be any camera or imaging system. Ideally, it is a high-definition video camera for providing adequate views. While conventional video cameras that record in the visible spectrum may be used, other cameras, such as those which record in “night-vision” and/or IR spectrums, might also be provided. These enable night missions. In other embodiments and implementations, various other imaging systems can be used for added image capture capabilities which can provide other mission benefits. For instance, a second camera (preferably configured to provide stereo vision) can be included to greatly enhance visual depth perception. Additional cameras might also be provided to increase the field of view in some instances.
Augmented reality display 210 is a wearable heads-up display through which the pilot is presented critical formation flight information during flight. In embodiments, the display 210 may include a liquid crystal display (LCD), light emitting diode (LED) display, or organic LED (OLED) display, as non-limiting examples, operatively connected to the computing device 131 which generates and provides digital content data (e.g., video, image, sound, etc.). Augmented reality display 210 may be worn alone in the form of a headset, goggles, or glasses, or may be integrated into an existing aviation helmet design, as shown in 1100. In some implementations, a strap can be provided with the display 210 for securely coupling it to the wearer's head; it may be elastic and/or provided with adjustable attachment means, like Velcro® or a buckle.
Gyroscope 211 measures the orientation and angular velocity of the pilot's head, thus providing the pilot's orientation. The compass 212 determines the direction of the pilot's head, thus providing the heading. The orientation data from the gyroscope 211, when paired with heading data from compass 212, may be used to map the pilot's overall operational view.
The mapping of the pilot's view is important to properly correlate the display of virtual view-dependent information with the pilot's view. This mapping allows for the estimation of the approximate location of a second aircraft in the pilot's view. Positional tracking algorithms are common and important for many headset displays, including, for example, the well-known Oculus Rift. A discussion of a positional tracking algorithm for a heads-up display is provided in the following article: Ananth Ranganathan, “The Oculus Insight positional tracking system,” AI Accelerator Institute, 27 Jun. 2022, available at: https://www.aiacceleratorinstitute.com/the-oculus-insight-positional-tracking-system-2/ (and, in particular, see the subsection titled “Building a map and localizing”), herein incorporated by reference in its entirety. Essentially, an IMU helps predict movement between frames, and thus locations on the global map. Use of a second camera (stereo vision) can also greatly enhance global map accuracy. This and other positional tracking algorithms and models can be used in one or more embodiments to map the pilot's overall operational view on the augmented reality display 210. By developing a map that extends 180 degrees or more around the pilot, the approximate location of a second aircraft in the pilot's view can be tracked even when not in the immediate view of the pilot. Assuming the origin is defined when the pilot is facing directly forward, the map can reasonably be expected to extend, at the least, about 100 degrees in yaw in either direction, such that visibility of other aircraft through side windows is included. However, if the aircraft forward of our aircraft is at the 90-degree point from the origin of said map, that would suggest the two aircraft are actually abeam one another (parallel); anything further would be a change in formation configuration (and thus a change in leading/trailing aircraft).
The same technique can be used in the pitch direction. The views can be easily scaled to the display unit. For instance, for aircraft and objects close to the pilot, virtual information can be depicted nearly at the center of the display, whereas information for those farther away can be depicted at the edges or periphery of the display. In that way, the displayed virtual information approximates the viewpoint of the wearer. The estimated positioning of a second aircraft may be used to inform the placement of key information on augmented reality display 210.
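One simple way to place symbology is to project the second aircraft's bearing and pitch, relative to the pilot's head pose (from gyroscope 211 and compass 212), onto normalized display coordinates. The sketch below is illustrative only; the function name and field-of-view values are assumptions, and a fielded system would account for lens distortion and display geometry:

```python
def to_display(rel_bearing_deg, rel_pitch_deg, half_fov_yaw=20.0, half_fov_pitch=15.0):
    # Map a second aircraft's bearing/pitch relative to the pilot's head
    # to normalized display coordinates in [-1, 1] (x right, y up);
    # magnitudes greater than 1 indicate the aircraft is off-screen.
    return (rel_bearing_deg / half_fov_yaw, rel_pitch_deg / half_fov_pitch)

center = to_display(0.0, 0.0)   # aircraft dead ahead -> display center
edge = to_display(20.0, 0.0)    # at the yaw field-of-view limit -> right edge
```

Off-screen results (magnitude above 1) could be clamped to the display edge to cue the pilot toward an aircraft outside the immediate field of view.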
Transceiver 213 transmits and receives data with an operatively coupled computing device 131. It includes one or more antennas and other requisite hardware and processors for radio communications. Camera 214 may be one or more outward facing cameras strategically positioned to capture the perspective of the pilot for the purposes of improved second aircraft detection and tracking. The cameras 214 preferably collect imagery from the vantage point of a pilot in the aircraft. In some embodiments, a forward-facing camera is provided in the aircraft. The camera in trailing aircraft thus can provide a “view” of leading aircraft. Alternatively or additionally, in embodiments, a backward-facing camera is provided in the aircraft. The camera in leading aircraft thus can provide a “view” of trailing aircraft. Taken together, the various views can be input to the system and made available to the pilots, giving them better situational awareness.
The computing device 131 may be configured to take distance estimate values and divide by a user-input spacing variable for a given aircraft to express distance in an operational unit, such as rotor disks.
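This conversion amounts to a single division. As a hedged sketch (the function name is hypothetical, and the rotor diameter shown is an approximate published figure for the UH-60's main rotor):

```python
def spacing_in_disks(distance_m, rotor_diameter_m):
    # Express inter-aircraft separation in rotor disks, the operational
    # unit pilots brief for formation spacing.
    return distance_m / rotor_diameter_m

# UH-60 main rotor diameter is roughly 16.36 m; 49 m of separation is ~3 disks
disks = spacing_in_disks(49.0, 16.36)
```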
As shown in trailing displays 400 and 410, the accurate estimation of the leading aircraft position in the trailing pilot's view 404 allows for the dynamic placement of relevant information, such as distance estimation metric 403 and estimated trajectory 405, around that position, so as not to interfere with the pilot's view of the leading aircraft. Although heads-up displays are characterized by the transparent (see-through) nature of their text and symbology, interfering in any manner with the pilot's view of an aircraft in proximity may prove dangerous. Further, for estimated trajectory 405 to be meaningful and correct, the trajectory must be instantiated at the leading aircraft's current position. Estimated trajectory 405 may be visualized differently depending on the separation distance of the leading aircraft and the presumed accuracy of the trajectory itself; for example, the trajectory path may be thicker if the leading aircraft is closer, and thinner if the leading aircraft is further away.
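For estimated trajectory 405 to originate at the leading aircraft's current position, the trailing aircraft's computing device 131 could extrapolate a short track from the leading aircraft's recent position reports. The following linear extrapolation is a minimal illustration under assumed fix data (the function name and coordinates are hypothetical; an embodiment might instead use the hybrid navigation model's trajectory prediction):

```python
def predict_track(fixes, lookahead_s, steps=5):
    # Linearly extrapolate the leading aircraft's track from its two most
    # recent (time_s, lat, lon) fixes, returning evenly spaced future points
    # at which the estimated-trajectory symbology can be drawn.
    (t0, la0, lo0), (t1, la1, lo1) = fixes[-2:]
    dt = t1 - t0
    vlat, vlon = (la1 - la0) / dt, (lo1 - lo0) / dt
    return [(la1 + vlat * lookahead_s * k / steps,
             lo1 + vlon * lookahead_s * k / steps)
            for k in range(1, steps + 1)]

# Two fixes one second apart; project the track 10 seconds ahead
track = predict_track([(0.0, 36.6600, -87.4800), (1.0, 36.6601, -87.4798)], 10.0)
```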
Additional information of value that may be denoted on the overlay includes serial and chalk position 401, associated position-based responsibilities 402, and time on target details 406. Serial and chalk position 401 may be dynamically updated depending on the relative position of an aircraft in formation as determined by computing devices 131 on network 120. Lead changes are common in formation flight, and aircraft may not maintain their initial positioning throughout the duration of a mission. Certain emergency procedures such as lost communication procedures also involve changes in formation flight positioning. Associated position-based responsibilities 402 are tied to an aircraft's serial and chalk number and must therefore be updated as well. Examples of associated position-based responsibilities 402 include making certain radio calls, opening and closing flight plans, and obtaining situational or weather information. In the figures, “LZ BLACK” refers to the present condition at the target, that is, the landing zone is dark.
Time on target (TOT) is a military aviation term. It refers to the time remaining in flight to a defined geographic coordinate (end point), typically where some mission-driven requirement is to occur, as part of the flight plan. Time on target details 406 may include the name of the upcoming waypoint, as well as the time to and/or distance from the upcoming waypoint and destination. It can be computed by taking the difference between the end position and the current position (distance) and, knowing the current aircraft velocity (rate) and flight path, converting that distance to a time. A simple conversion for an aircraft flying at constant velocity in a straight line to the target is time=distance/rate, although more sophisticated flight duration estimation algorithms exist and could be used. Initial serial and chalk position 401, associated position-based responsibilities 402, and time on target details 406 may all be defined in user input 110 as part of an initial mission briefing. In the figures, TOT is presently indicated at 2 minutes and 39 seconds.
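The constant-velocity, straight-line case can be shown in a few lines. The distance and ground speed below are illustrative values chosen to reproduce a 2-minute-39-second TOT like that indicated in the figures; the function name is hypothetical:

```python
def time_on_target_s(distance_m, groundspeed_mps):
    # time = distance / rate, for straight-line flight at constant ground speed
    return distance_m / groundspeed_mps

# 9,540 m remaining at 60 m/s ground speed -> 159 s, i.e. 2 min 39 s
tot = time_on_target_s(9540.0, 60.0)
minutes, seconds = divmod(int(tot), 60)
```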
The computer-implemented algorithm depicted in
An interesting edge case arises for the formation lead aircraft, which is not coupled to a leading aircraft.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others may, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein may be practiced with modification within the spirit and scope of the appended claims.
This application claims the benefit of U.S. Provisional Patent Application No. 63/301,482 filed Jan. 20, 2022, which is herein incorporated by reference in its entirety for all purposes.
The invention described herein may be manufactured, used, and licensed by or for the United States Government.