The disclosure generally relates to emergency management of (e.g., air) vehicles (e.g., fixed wing and rotary wing air vehicles), and more specifically, to autorotation for rotary wing air vehicles.
During emergency aviation situations, pilots often misinterpret the situation or fail to perform the correct emergency corrective actions (or fail to perform the corrective actions at the proper time). The National Transportation Safety Board (NTSB) and the Federal Aviation Administration (FAA) cite this as a common problem and a top contributor to aviation accidents and fatalities. Previous emergency management solutions for air vehicles largely include visual and aural crew alerting aligned with regulatory certification standards. These warning and caution notifications still mandate a series of highly accurate, memory-recalled, and perishable pilot skill tasks in order to effectively mitigate the emergency.
The disclosed embodiments have other advantages and features which will be more readily apparent from the detailed description, the appended claims, and the accompanying figures (or drawings). A brief introduction of the figures is below.
The Figures (FIGS.) and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.
Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Disclosed is a system (and method and non-transitory computer readable storage medium comprising stored program code) for autorotation management. By way of example, a system is configured to determine an occurrence of an autorotation condition for a rotary wing air vehicle controlled by a user. The system controls the air vehicle to enter into an autorotation in response to a determination of the occurrence of the autorotation condition. The system performs one or more non-user actions during the autorotation to assist the user with the autorotation. Examples of non-user actions may include maintaining a rotations per minute (RPM) of a rotor of the air vehicle or maintaining an airspeed of the air vehicle within a range of nominal values. While performing the one or more non-user actions during the autorotation, the system allows the user to maneuver the air vehicle by interacting with one or more control interfaces of the air vehicle.
Also disclosed is a system (and method and non-transitory computer readable storage medium comprising stored program code) for automated and user assisted air vehicle emergency management. By way of example, the system determines an occurrence of emergency events of a vehicle traversing through a physical environment. The system ranks the emergency events according to an importance level associated with each emergency event. The system selects an emergency event based on the ranking and notifies a user of the vehicle of the selected emergency event. The system identifies corrective actions associated with the selected emergency event. The identified corrective actions include a user action and a non-user action. Examples of non-user actions may include maintaining a rotations per minute (RPM) of a rotor of the air vehicle or maintaining an airspeed of the air vehicle within a range of nominal values. The system performs the non-user action of the identified corrective actions and notifies the user of the vehicle of the user action.
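The ranking-and-selection flow described above can be sketched in a few lines. This is an illustrative example only, not the disclosed implementation; all class and function names (EmergencyEvent, CorrectiveAction, select_most_important, split_actions) are hypothetical.

```python
# Hypothetical sketch: rank emergency events by importance, select the most
# important one, and split its corrective actions into non-user (automated)
# and user (pilot-performed) actions. Names and data are illustrative.
from dataclasses import dataclass, field

@dataclass
class CorrectiveAction:
    description: str
    user_action: bool  # True if the pilot must perform it, False if automated

@dataclass
class EmergencyEvent:
    name: str
    importance: int  # higher value = more critical
    actions: list = field(default_factory=list)

def select_most_important(events):
    """Rank emergency events and select the one with the highest importance."""
    return max(events, key=lambda e: e.importance)

def split_actions(event):
    """Separate corrective actions into non-user and user actions."""
    non_user = [a for a in event.actions if not a.user_action]
    user = [a for a in event.actions if a.user_action]
    return non_user, user

events = [
    EmergencyEvent("low fuel", importance=2,
                   actions=[CorrectiveAction("divert to nearest airport", True)]),
    EmergencyEvent("engine failure", importance=5,
                   actions=[CorrectiveAction("maintain rotor RPM", False),
                            CorrectiveAction("select landing spot", True)]),
]
selected = select_most_important(events)
non_user, user = split_actions(selected)
```

Under this sketch, the system would perform the non-user actions itself (e.g., maintaining rotor RPM) while notifying the pilot of the remaining user actions.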
Figure (FIG.) 1 illustrates one example embodiment of a vehicle control and interface system 100. In the example embodiment shown, vehicle control and interface system 100 includes one or more universal vehicle control interfaces 110, universal vehicle control router 120, one or more vehicle actuators 130, one or more vehicle sensors 140, and one or more data stores 150. In other embodiments, the vehicle control and interface system 100 may include different or additional elements. Furthermore, the functionality may be distributed among the elements in a different manner than described.
The vehicle control and interface system 100 may be integrated with various vehicles having different mechanical, hardware, or software components. For example, the vehicle control and interface system 100 may be integrated with fixed-wing aircraft (e.g., airplanes), rotorcraft (e.g., helicopters), motor vehicles (e.g., automobiles), watercraft (e.g., power boats or submarines), or any other suitable vehicle. As described in greater detail below, the vehicle control and interface system 100 is advantageously configured to receive inputs for requested operation of a particular vehicle via a universal set of interfaces and to convert the inputs to appropriate instructions for mechanical, hardware, or software components of the particular vehicle to achieve the requested operation. In doing so, the vehicle control and interface system 100 enables human operators to operate different vehicles using the same universal set of interfaces or inputs. By way of example, “universal” indicates that a feature of the vehicle control and interface system 100 may operate or be architected in a vehicle-agnostic manner. This allows for vehicle integration without necessarily having to design and configure vehicle-specific customizations or reconfigurations in order to integrate the specific feature. Although universal features of the vehicle control and interface system 100 can function in a vehicle-agnostic manner, the universal features may still be configured for particular contexts. For example, the vehicle control and interface system 100 may receive or process inputs describing three-dimensional movements for vehicles that can move in three dimensions (e.g., aircraft) and conversely may receive or process inputs describing two-dimensional movements for vehicles that can move in two dimensions (e.g., automobiles). One skilled in the art will appreciate that other context-dependent configurations of universal features of the vehicle control and interface system 100 are possible.
The universal vehicle control interfaces 110 are a set of universal interfaces configured to receive a set of universal vehicle control inputs to the vehicle control and interface system 100. The universal vehicle control interfaces 110 may include one or more digital user interfaces presented to an operator of a vehicle via one or more electronic displays. Additionally, or alternatively, the universal vehicle control interfaces 110 may include one or more hardware input devices, e.g., one or more control stick inceptors, such as side sticks, center sticks, throttles, cyclic controllers, or collective controllers. The universal vehicle control interfaces 110 receive universal vehicle control inputs requesting operation of a vehicle. In particular, the inputs received by the universal vehicle control interfaces 110 may describe a requested trajectory of the vehicle, such as to change a velocity of the vehicle in one or more dimensions or to change an orientation of the vehicle. Because the universal vehicle control inputs describe an intended trajectory of a vehicle directly rather than describing vehicle-specific precursor values for achieving the intended trajectory, such as vehicle attitude inputs (e.g., power, lift, pitch, roll, yaw), the universal vehicle control inputs can be used to universally describe a trajectory of any vehicle. This is in contrast to existing systems where control inputs are received as vehicle-specific trajectory precursor values that are specific to the particular vehicle. Advantageously, any individual interface of the set of universal vehicle control interfaces 110 configured to receive universal vehicle control inputs can be used to completely control a trajectory of a vehicle. This is in contrast to conventional systems, where vehicle trajectory must be controlled using two or more interfaces or inceptors that correspond to different axes of movement or vehicle actuators.
For instance, conventional rotorcraft systems include different cyclic (controlling pitch and roll), collective (controlling heave), and pedal (controlling yaw) inceptors. Similarly, conventional fixed-wing aircraft systems include different stick or yoke (controlling pitch and roll), power (controlling forward movement), and pedal (controlling yaw) inceptors. Example configurations of the universal vehicle control interfaces 110 are described in greater detail below.
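The trajectory-versus-precursor distinction drawn above can be illustrated with a minimal data structure. This is a hedged sketch; the dataclass and field names are invented for illustration and do not appear in the disclosure.

```python
# Illustrative sketch: a universal control input describes the intended
# trajectory directly (velocities, rates) rather than vehicle-specific
# precursor values (power, lift, pitch, roll, yaw). Field names are
# hypothetical, not from the disclosure.
from dataclasses import dataclass

@dataclass
class UniversalTrajectoryInput:
    forward_velocity: float   # m/s, requested forward speed
    lateral_velocity: float   # m/s, positive = right
    vertical_rate: float      # m/s, positive = climb
    heading_rate: float       # deg/s, requested rate of orientation change

# The same input type can describe a trajectory for any vehicle; a router
# would later convert it into vehicle-specific actuator commands.
request = UniversalTrajectoryInput(
    forward_velocity=30.0, lateral_velocity=0.0,
    vertical_rate=2.0, heading_rate=0.0)
```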
In various embodiments, inputs received by the universal vehicle control interfaces 110 can include “steady-hold” inputs, which may be configured to hold a parameter value fixed (e.g., remain in a departed position) without a continuous operator input. Such variants can enable hands-free operation, where discontinuous or discrete inputs can result in a fixed or continuous input. In a specific example, a user of the universal vehicle control interfaces 110 can provide an input (e.g., a speed input) and subsequently remove their hands with the input remaining fixed. Alternatively, or additionally, inputs received by the universal vehicle control interfaces 110 can include one or more self-centering or automatic return inputs, which return to a default state without a continuous user input.
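The two input behaviors described above can be contrasted with a short sketch. The class names and interface are hypothetical, intended only to illustrate latching versus spring-back semantics.

```python
# Hypothetical sketch of the two input behaviors: a "steady-hold" input
# latches its last commanded value after the operator releases it, while a
# self-centering input returns to a default state. Names are illustrative.
class SteadyHoldInput:
    def __init__(self, initial=0.0):
        self.value = initial

    def set(self, value):
        self.value = value   # latches: persists after the hand is removed

    def release(self):
        pass                 # no change on release (hands-free operation)

class SelfCenteringInput:
    def __init__(self, default=0.0):
        self.default = default
        self.value = default

    def set(self, value):
        self.value = value

    def release(self):
        self.value = self.default   # springs back to the default state

speed = SteadyHoldInput()
speed.set(120.0)
speed.release()   # speed.value remains 120.0

yaw = SelfCenteringInput()
yaw.set(0.3)
yaw.release()     # yaw.value returns to 0.0
```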
In some embodiments, the universal vehicle control interfaces 110 include interfaces that provide feedback information to an operator of the vehicle. For instance, the universal vehicle control interfaces 110 may provide information describing a state of a vehicle integrated with the universal vehicle control interfaces 110 (e.g., current vehicle speed, direction, orientation, location, etc.). Additionally, or alternatively, the universal vehicle control interfaces 110 may provide information to facilitate navigation or other operations of a vehicle, such as visualizations of maps, terrain, or other environmental features around the vehicle. Embodiments of interfaces providing feedback information to an operator of a vehicle are described in greater detail below with reference to
The universal vehicle control router 120 routes universal vehicle control inputs describing operation of a vehicle to components of the vehicle suitable for executing the operation. In particular, the universal vehicle control router 120 receives universal vehicle control inputs describing the operation of the vehicle, processes the inputs using information describing characteristics of the aircraft, and outputs a corresponding set of commands for actuators of the vehicle (e.g., the vehicle actuators 130) suitable to achieve the operation. The universal vehicle control router 120 may use various information describing characteristics of a vehicle in order to convert universal vehicle control inputs to a suitable set of commands for actuators of the vehicle. Additionally, or alternatively, the universal vehicle control router 120 may convert universal vehicle control inputs to a set of actuator commands using a set of control laws that enforce constraints (e.g., limits) on operations requested by the universal control inputs. For example, the set of control laws may include velocity limits (e.g., to prevent stalling in fixed-wing aircraft), acceleration limits, turning rate limits, engine power limits, rotor revolution per minute (RPM) limits, load power limits, allowable descent altitude limits, etc. After determining a set of actuator commands, the universal vehicle control router 120 may transmit the commands to relevant components of the vehicle for causing corresponding actuators to execute the commands. Embodiments of the universal vehicle control router 120 are described in greater detail below with reference to
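The control-law limit enforcement described above can be sketched as a simple clamping step applied before actuator commands are generated. The limit values and parameter names below are invented for illustration.

```python
# A minimal sketch of control-law limit enforcement: each requested universal
# input parameter is clamped to its configured envelope limits before being
# converted into actuator commands. Limits here are illustrative only.
CONTROL_LAWS = {
    "velocity": (0.0, 150.0),    # e.g., prevent stall or overspeed
    "turn_rate": (-3.0, 3.0),    # deg/s
    "rotor_rpm": (90.0, 110.0),  # percent of nominal
}

def enforce_limits(requested: dict) -> dict:
    """Clamp each requested parameter to its control-law limits."""
    commands = {}
    for name, value in requested.items():
        lo, hi = CONTROL_LAWS[name]
        commands[name] = min(max(value, lo), hi)
    return commands

commands = enforce_limits(
    {"velocity": 180.0, "turn_rate": -1.0, "rotor_rpm": 85.0})
# velocity is limited to 150.0, rotor_rpm is raised to 90.0,
# and turn_rate passes through unchanged
```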
The universal vehicle control router 120 can decouple axes of movement for a vehicle in order to process received universal vehicle control inputs. In particular, the universal vehicle control router 120 can process a received universal vehicle control input for one axis of movement without impacting other axes of movement such that the other axes of movement remain constant. In this way, the universal vehicle control router 120 can facilitate “steady-hold” vehicle control inputs, as described above with reference to the universal vehicle control interfaces 110. This is in contrast to conventional systems, where a vehicle operator must manually coordinate all axes of movement independently for a vehicle in order to produce movement in one axis (e.g., a pure turn, a pure altitude climb, a pure forward acceleration, etc.) without affecting the other axes of movement.
In some embodiments, the universal vehicle control router 120 is configured to use one or more models corresponding to a particular vehicle to convert universal vehicle control inputs to a suitable set of commands for actuators of the vehicle. For example, a model may include a set of parameters (e.g., numerical values) that can be used as input to universal input conversion processes in order to generate actuator commands suitable for a particular vehicle. In this way, the universal vehicle control router 120 can be integrated with vehicles by substituting models used by processes of the universal vehicle control router 120, enabling efficient integration of the vehicle control and interface system 100 with different vehicles. The one or more models may be obtained by the universal vehicle control router 120 from a vehicle model database or other first-party or third-party system, e.g., via a network. In some cases, the one or more models may be static after integration with the vehicle control and interface system 100, such as if a vehicle integrated with the vehicle control and interface system 100 is certified for operation by a certifying authority (e.g., the United States Federal Aviation Administration). In some embodiments, parameters of the one or more models are determined by measuring data during real or simulated operation of a corresponding vehicle and fitting the measured data to the one or more models.
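The model-substitution idea above can be sketched as follows: the conversion code stays fixed while a vehicle-specific parameter set is swapped in. The parameters and the gain arithmetic are invented for illustration and are not from the disclosure.

```python
# Hedged sketch: the same universal-input conversion routine serves any
# vehicle; only the per-vehicle model (a set of numerical parameters) is
# substituted. Parameter names and values are hypothetical.
ROTORCRAFT_MODEL = {"climb_gain": 0.8, "max_climb_rate": 10.0}  # m/s
FIXED_WING_MODEL = {"climb_gain": 0.5, "max_climb_rate": 6.0}   # m/s

def climb_command(universal_climb_input: float, model: dict) -> float:
    """Convert a universal climb input (0..1) to a vehicle-specific command."""
    rate = universal_climb_input * model["max_climb_rate"]
    return rate * model["climb_gain"]

# Identical conversion code; different models yield different commands.
rotor_cmd = climb_command(0.5, ROTORCRAFT_MODEL)  # 0.5 * 10.0 * 0.8 = 4.0
fw_cmd = climb_command(0.5, FIXED_WING_MODEL)     # 0.5 * 6.0 * 0.5 = 1.5
```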
In some embodiments, the universal vehicle control router 120 processes universal vehicle control inputs according to a current phase of operation of the vehicle. For instance, if the vehicle is a rotorcraft, the universal vehicle control router 120 may convert a universal input describing an increase in lateral speed to one or more actuator commands differently if the rotorcraft is in a hover phase or in a forward flight phase. In particular, in processing the lateral speed increase universal input, the universal vehicle control router 120 may generate actuator commands causing the rotorcraft to strafe if the rotorcraft is hovering and causing the rotorcraft to turn if the rotorcraft is in forward flight. As another example, in processing a turn speed increase universal input, the universal vehicle control router 120 may generate actuator commands causing the rotorcraft to perform a pedal turn if the rotorcraft is hovering and ignore the turn speed increase universal input if the rotorcraft is in another phase of operation. As a similar example for a fixed-wing aircraft, in processing a turn speed increase universal input, the universal vehicle control router 120 may generate actuator commands causing the fixed-wing aircraft to perform a tight ground turn if the fixed-wing aircraft is grounded and ignore the turn speed increase universal input if the fixed-wing aircraft is in another phase of operation. One skilled in the art will appreciate that the universal vehicle control router 120 may perform other suitable processing of universal vehicle control inputs to generate actuator commands in consideration of vehicle operation phases for various vehicles.
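The phase-dependent handling of the rotorcraft example above reduces to a dispatch on the current operation phase. This sketch is illustrative; the phase names and returned action labels are hypothetical.

```python
# Illustrative dispatch of a universal input according to operation phase,
# mirroring the rotorcraft example: a lateral-speed increase produces a
# strafe in hover and a turn in forward flight, and is ignored otherwise.
def process_lateral_speed_increase(phase: str) -> str:
    if phase == "hover":
        return "strafe"          # translate sideways without turning
    if phase == "forward_flight":
        return "turn"            # bank into a coordinated turn
    return "ignore"              # input not meaningful in other phases

hover_action = process_lateral_speed_increase("hover")
flight_action = process_lateral_speed_increase("forward_flight")
ground_action = process_lateral_speed_increase("grounded")
```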
The vehicle actuators 130 are one or more actuators configured to control components of a vehicle integrated with the universal vehicle control interfaces 110. For instance, the vehicle actuators may include actuators for controlling a power-plant of the vehicle (e.g., an engine). Furthermore, the vehicle actuators 130 may vary depending on the particular vehicle. For example, if the vehicle is a rotorcraft the vehicle actuators 130 may include actuators for controlling lateral cyclic, longitudinal cyclic, collective, and pedal controllers of the rotorcraft. As another example, if the vehicle is a fixed-wing aircraft the vehicle actuators 130 may include actuators for controlling a rudder, elevator, ailerons, and power-plant of the fixed-wing aircraft.
The vehicle sensors 140 are sensors configured to capture corresponding sensor data. In various embodiments the vehicle sensors 140 may include, for example, one or more global positioning system (GPS) receivers, inertial measurement units (IMUs), accelerometers, gyroscopes, magnetometers, pressure sensors (altimeters, static tubes, pitot tubes, etc.), temperature sensors, vane sensors, range sensors (e.g., laser altimeters, radar altimeters, lidars, radars, ultrasonic range sensors, etc.), terrain elevation data, geographic data, airport or landing zone data, rotor revolutions per minute (RPM) sensors, manifold pressure sensors, or other suitable sensors. In some cases, the vehicle sensors 140 may include, for example, redundant sensor channels for some or all of the vehicle sensors 140. The vehicle control and interface system 100 may use data captured by the vehicle sensors 140 for various processes. By way of example, the universal vehicle control router 120 may use vehicle sensor data captured by the vehicle sensors 140 to determine an estimated state of the vehicle, as described in greater detail below with reference to
The data store 150 is a database storing various data for the vehicle control and interface system 100. For instance, the data store 150 may store sensor data (e.g., captured by the vehicle sensors 140), vehicle models, vehicle metadata, or any other suitable data.
The emergency management module 160 (also “emergency module 160”) performs various functions associated with emergency events. For example, the emergency module 160 can accurately interpret vehicle issues, identify emergency events, take (e.g., immediate) corrective actions, and provide appropriate augmentation in a manner that assists a user (e.g., pilot) to remedy, solve or overcome emergency events. The emergency module 160 may interact with any of the other components of the vehicle control and interface system 100 (e.g., the control router 120). The emergency module 160 is described in more detail with respect to
The vehicle state display 210 is one or more electronic displays (e.g., liquid-crystal displays (LCDs)) configured to display or receive information describing a state of the vehicle including the configuration 200. In particular, the vehicle state display 210 may display various interfaces including feedback information for an operator of the vehicle. In this case, the vehicle state display 210 may provide feedback information to the operator in the form of virtual maps, 3D terrain visualizations (e.g., wireframe, rendering, environment skin, etc.), traffic, weather, engine status, communication data (e.g., air traffic control (ATC) communication), guidance information (e.g., guidance parameters, trajectory), and any other pertinent information. Additionally, or alternatively, the vehicle state display 210 may display various interfaces for configuring or executing automated vehicle control processes, such as automated aircraft landing or takeoff or navigation to a target location. The vehicle state display 210 may receive user inputs via various mechanisms, such as gesture inputs (as described above with reference to the gesture interface 220), audio inputs, or any other suitable input mechanism. Embodiments of the vehicle state display 210 are described in greater detail below with reference to
As depicted in
The multi-function interface 230 is configured to facilitate long-term control of the vehicle including the configuration 200. In particular, the multi-function interface 230 may include information describing a mission for the vehicle (e.g., navigation to a target destination) or information describing the vehicle systems. Information describing the mission may include routing information, mapping information, or other suitable information. Information describing the vehicle systems may include engine health status, engine power utilization, fuel, lights, vehicle environment, or other suitable information. In some embodiments, the multi-function interface 230 or other interfaces enable mission planning for operation of a vehicle. For example, the multi-function interface 230 may enable configuring missions for navigating a vehicle from a start location to a target location. In some cases, the multi-function interface 230 or another interface provides access to a marketplace of applications and services. The multi-function interface 230 may also include a map, a radio tuner, or a variety of other controls and system functions for the vehicle. An example embodiment of the multi-function interface 230 is described in greater detail below with reference to
In some embodiments, the vehicle state display 210 includes information describing a current state of the vehicle relative to one or more control limits of the vehicle (e.g., on the primary vehicle control interface 220 or the multi-function interface 230). For example, the information may describe power limits of the vehicle or include information indicating how much control authority a user has across each axis of movement for the vehicle (e.g., available speed, turning ability, climb or descent ability for an aircraft, etc.). In the same or different example embodiment, the vehicle state display 210 may display different information depending on a level of experience of a human operator of the vehicle. For instance, if the vehicle is an aircraft and the human operator is new to flying, the vehicle state display may include information indicating a difficulty rating for available flight paths (e.g., beginner, intermediate, or expert). The particular experience level determined for an operator may be based upon prior data collected and analyzed about the human operator corresponding to their prior experiences in flying flight paths having similar expected parameters. Additionally, or alternatively, flight path difficulty ratings for available flight paths provided to the human operator may be determined based on various information, for example, expected traffic, terrain fluctuations, the number of airspaces and air traffic controllers along the way, or various other factors or variables that are projected for a particular flight path. Moreover, the data collected from execution of this flight path can be fed back into the database and applied to a machine learning model to generate additional and/or refined ratings data for the operator for subsequent application to other flight paths. Flight paths may further be filtered according to which one is the fastest, the most fuel efficient, the most scenic, etc.
The one or more vehicle state displays 210 may include one or more electronic displays (e.g., liquid-crystal displays (LCDs), organic light emitting diodes (OLED), plasma). For example, the vehicle state display 210 may include a first electronic display for the primary vehicle control interface 220 and a second electronic display for the multi-function interface 230. In cases where the vehicle state display 210 includes multiple electronic displays, the vehicle state display 210 may be configured to adjust interfaces displayed using the multiple electronic displays, e.g., in response to failure of one of the electronic displays. For example, if an electronic display rendering the primary vehicle control interface 220 fails, the vehicle state display 210 may display some or all of the primary vehicle control interface 220 on another electronic display.
The one or more electronic displays of the vehicle state display 210 may be touch-sensitive displays configured to receive touch inputs from an operator of the vehicle including the configuration 200, such as a multi-touch display. For instance, the primary vehicle control interface 220 may be a gesture interface configured to receive universal vehicle control inputs for controlling the vehicle including the configuration 200 via touch gesture inputs. In some cases, the one or more electronic displays may receive inputs via other types of gestures, such as gestures received via an optical mouse, roller wheel, three-dimensional (3D) mouse, motion tracking device (e.g., optical tracking), or any other suitable device for receiving gesture inputs. Embodiments of a gesture interface are described in greater detail below with reference to
Touch gesture inputs received by one or more electronic displays of the vehicle state display 210 may include single finger gestures (e.g., executing a predetermined pattern, swipe, slide, etc.), multi-finger gestures (e.g., 2, 3, 4, 5 fingers, but also palm, multi-hand, including/excluding thumb, etc.; same or different motion as single finger gestures), pattern gestures (e.g., circle, twist, convergence, divergence, multi-finger bifurcating swipe, etc.), or any other suitable gesture inputs. Gesture inputs can be limited to asynchronous inputs (e.g., a single input at a time) or can allow for multiple concurrent or synchronous inputs. In variants, gesture input axes can be fully decoupled or independent. In a specific example, requesting a speed change holds other universal vehicle control input parameters fixed—where vehicle control can be automatically adjusted in order to implement the speed change while holding heading and vertical rate fixed. Alternatively, gesture axes can include one or more mutual dependencies with other control axes. Unlike conventional vehicle control systems, such as aircraft control systems, the gesture input configuration as disclosed provides for more intuitive user experiences with respect to an interface to control vehicle movement.
In some embodiments, the vehicle state display 210 or other interfaces are configured to adjust in response to vehicle operation events, such as emergency conditions. For instance, in response to determining the vehicle is in an emergency condition, the vehicle control and interface system 100 may adjust the vehicle state display 210 to include essential information or remove irrelevant information. As an example, if the vehicle is an aircraft and the vehicle control and interface system 100 detects an engine failure for the aircraft, the vehicle control and interface system 100 may display essential information on the vehicle state display 210 including 1) a direction of the wind, 2) an available glide range for the aircraft (e.g., a distance that the aircraft can glide given current conditions), or 3) available emergency landing spots within the glide range. The vehicle control and interface system 100 may identify emergency landing locations using various processes, such as by accessing a database of landing spots (e.g., included in the data store 150 or a remote database) or ranking landing spots according to their suitability for an emergency landing.
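The landing-spot identification described above can be sketched as a filter-then-rank step: candidates outside the glide range are discarded, and the remainder are ordered by a suitability score. The data, coordinates, and scoring are invented for illustration.

```python
# Hedged sketch of the emergency-landing logic: filter candidate landing
# spots to those within the aircraft's glide range, then rank the remainder
# by a suitability score. Positions, scores, and names are hypothetical.
import math

def within_glide_range(aircraft, spot, glide_range):
    """True if the straight-line distance to the spot is within glide range."""
    dx = spot["x"] - aircraft["x"]
    dy = spot["y"] - aircraft["y"]
    return math.hypot(dx, dy) <= glide_range

def rank_landing_spots(aircraft, spots, glide_range):
    """Return reachable spots sorted from most to least suitable."""
    reachable = [s for s in spots if within_glide_range(aircraft, s, glide_range)]
    return sorted(reachable, key=lambda s: s["suitability"], reverse=True)

aircraft = {"x": 0.0, "y": 0.0}
spots = [
    {"name": "field", "x": 3.0, "y": 4.0, "suitability": 0.6},   # 5 km away
    {"name": "runway", "x": 6.0, "y": 8.0, "suitability": 0.9},  # 10 km away
    {"name": "road", "x": 1.0, "y": 1.0, "suitability": 0.4},
]
ranked = rank_landing_spots(aircraft, spots, glide_range=7.0)
# "runway" is excluded as beyond glide range; "field" ranks above "road"
```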
The side-stick inceptor device 240 may be a side-stick inceptor configured to receive universal vehicle control inputs. In particular, the side-stick inceptor device 240 may be configured to receive the same or similar universal vehicle control inputs as a gesture interface of the vehicle state display 210 is configured to receive. In this case, the gesture interface and the side-stick inceptor device 240 may provide redundant or semi-redundant interfaces to a human operator for providing universal vehicle control inputs. The side-stick inceptor device 240 may be active or passive. Additionally, the side-stick inceptor device 240 may include force feedback mechanisms along any suitable axis. For instance, the side-stick inceptor device 240 may be a 3-axis inceptor, 4-axis inceptor (e.g., with a thumb wheel), or any other suitable inceptor. Processing inputs received via the side-stick inceptor device 240 is described in greater detail below with reference to
The components of the configuration 200 may be integrated with the vehicle including the configuration 200 using various mechanical or electrical components. These components may enable adjustment of one or more interfaces of the configuration 200 for operation by a human operator of the vehicle. For example, these components may enable rotation or translation of the vehicle state display 210 toward or away from a position of the human operator (e.g., a seat where the human operator sits). Such adjustment may be intended, for example, to prevent the interfaces of the configuration 200 from obscuring a line of sight of the human operator to the vehicle operator field of view 250.
The vehicle operator field of view 250 is a first-person field of view of the human operator of the vehicle including the configuration 200. For example, the vehicle operator field of view 250 may be a windshield of the vehicle or other suitable device for enabling a first-person view for a human operator.
The configuration 200 may additionally or alternatively include other auxiliary feedback mechanisms, which can be auditory (e.g., alarms, buzzers, etc.), haptic (e.g., shakers, haptic alert mechanisms, etc.), visual (e.g., lights, display cues, etc.), or any other suitable feedback components. Furthermore, displays of the configuration 200 (e.g., the vehicle state display 210) can simultaneously or asynchronously function as one or more of different types of interfaces, such as an interface for receiving vehicle control inputs, an interface for displaying navigation information, an interface for providing alerts or notifications to an operator of the vehicle, or any other suitable vehicle instrumentation. Furthermore, portions of the information can be shared between multiple displays or configurable between multiple displays.
In the embodiment shown in
Inputs received from the stick inceptor device 315 or the gesture interface 320 are routed directly to the command processing module 365 as universal aircraft control inputs 330. Conversely, inputs received from the automated control interface 325 are routed to an automated aircraft control module 335 of the universal aircraft control router 310. Inputs received by the automated aircraft control module 335 may include information for selecting or configuring automated control processes. The automated control processes may include automated aircraft control macros (e.g., operation routines), such as automatically adjusting the aircraft to a requested aircraft state (e.g., a requested forward velocity, a requested lateral velocity, a requested altitude, a requested heading, a requested landing, a requested takeoff, etc.). Additionally, or alternatively, the automated control processes may include automated mission or navigation control, such as navigating an aircraft from an input starting location to an input target location in the air or on the ground. In these or other cases, the automated aircraft control module 335 generates a set of universal aircraft control inputs suitable for executing the requested automated control processes. The automated aircraft control module 335 may use the estimated aircraft state 340 to generate the set of universal aircraft control inputs, as described below with reference to the aircraft state estimation module 345. Additionally, or alternatively, the automated aircraft control module 335 may generate the set of universal aircraft control inputs over a period of time, for example during execution of a mission to navigate to a target location. The automated aircraft control module 335 further provides generated universal aircraft control inputs for inclusion in the set of universal aircraft control inputs 330.
The aircraft state estimation module 345 determines the estimated aircraft state 340 of the aircraft including the universal aircraft control router 310 using the validated sensor signals 350. The estimated aircraft state 340 may include various information describing a current state of the aircraft, such as an estimated 3D position of the vehicle with respect to the center of the Earth, estimated 3D velocities of the aircraft with respect to the ground or with respect to a moving air mass, an estimated 3D orientation of the aircraft, estimated 3D angular rates of change of the aircraft, an estimated altitude of the aircraft, or any other suitable information describing a current state of the aircraft. The aircraft state estimation module 345 determines the estimated state of the aircraft 340 by combining validated sensor signals 350 captured by different types of sensors of the aircraft, such as the vehicle sensors 140 described above with reference to
In some embodiments, the aircraft state estimation module 345 precisely estimates an altitude of the aircraft above a surface of the Earth (e.g., an “altitude above the ground”) by combining multiple altitude sensor signals included in the validated sensor signals 350. Altitude sensor signals may include GPS signals, pressure sensor signals, range sensor signals, terrain elevation data, or other suitable information. The aircraft state estimation module 345 may estimate an altitude of the aircraft above an ellipsoid representing the Earth using a GPS signal if the GPS signal is available in the validated sensor signals 350. In this case, the aircraft state estimation module 345 may estimate the altitude above the ground by combining the altitude above the ellipsoid with one or more range sensor signals (e.g., as described above with reference to the vehicle sensors 140) or terrain elevation data. Additionally, or alternatively, the aircraft state estimation module 345 may determine an offset between the altitude above the ellipsoid and a barometric altitude determined, e.g., using sensor signals captured by a pressure altimeter. In this case, the aircraft state estimation module 345 may apply the offset to a currently estimated barometric altitude if a GPS signal is unavailable in order to determine a substitute altitude estimate for the altitude above the ellipsoid. In this way, the aircraft state estimation module 345 may still provide precise altitude estimates during GPS signal dropouts.
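The barometric-offset fallback described above can be sketched as follows. This is a minimal illustration; the class name, units, and the simple last-known-offset strategy are assumptions for illustration rather than the disclosed implementation:

```python
class AltitudeEstimator:
    """Fuses GPS and barometric altitude; falls back to barometric
    altitude plus the last known offset during a GPS dropout."""

    def __init__(self):
        self._offset = None  # last known (GPS altitude - barometric altitude)

    def update(self, baro_alt_ft, gps_alt_ft=None):
        """Return an estimate of the altitude above the ellipsoid in feet."""
        if gps_alt_ft is not None:
            # GPS available: use it directly and refresh the baro offset.
            self._offset = gps_alt_ft - baro_alt_ft
            return gps_alt_ft
        if self._offset is not None:
            # GPS dropout: substitute barometric altitude corrected by the offset.
            return baro_alt_ft + self._offset
        # No GPS fix seen yet: raw barometric altitude is the best available.
        return baro_alt_ft
```

In a running estimator, the offset would typically also be filtered over time; the single-sample refresh here is only for brevity.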
Among other advantages, by precisely estimating the altitude above the ground through combining multiple altitude sensor signals, the aircraft state estimation module 345 can provide altitude estimates usable for determining if the aircraft has landed, taken off, or is hovering. Additionally, the aircraft state estimation module 345 can provide altitude estimates indicating precise characteristics of the ground below the aircraft, e.g., if the ground is tilted or level in order to assess if a landing is safe. This is in contrast to conventional systems, which require specialized equipment for determining specific aircraft events requiring precise altitude determinations (e.g., takeoffs or landings) due to imprecise altitude estimates. As an example, the universal aircraft control router 310 can use the precise altitude estimates to perform automatic landing operations at locations that are not equipped with instrument landing systems for poor or zero-visibility conditions (e.g., category II or III instrument landing systems). As another example, the universal aircraft control router 310 can use the precise altitude estimates to automatically maintain a constant altitude above ground for a rotorcraft (e.g., during hover-taxi) despite changing ground elevation below the rotorcraft. As still another example, the universal aircraft control router 310 can use the precise altitude estimates to automatically take evasive action to avoid collisions (e.g., ground collisions).
In some embodiments, the aircraft state estimation module 345 estimates a ground plane below the aircraft. In particular, the aircraft state estimation module 345 may estimate the ground plane by combining validated sensor signals from multiple range sensors. Additionally, or alternatively, the aircraft state estimation module 345 may estimate a wind vector by combining ground velocity, airspeed, and sideslip angle measurements for the aircraft.
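The wind vector estimate can be illustrated as the difference between ground velocity and the air-relative velocity implied by airspeed, heading, and sideslip. The function name, the 2D north/east frame, and the simple addition of sideslip to heading are illustrative assumptions, not the disclosed method:

```python
import math

def estimate_wind(ground_vel_ne, airspeed, heading_rad, sideslip_rad=0.0):
    """Estimate the horizontal wind vector (north, east) as
    ground velocity minus air-relative velocity."""
    # Direction the air-relative velocity points: heading offset by sideslip.
    air_track = heading_rad + sideslip_rad
    air_vel_ne = (airspeed * math.cos(air_track), airspeed * math.sin(air_track))
    return (ground_vel_ne[0] - air_vel_ne[0], ground_vel_ne[1] - air_vel_ne[1])
```

For example, an aircraft heading due north at 50 KTS airspeed with a ground velocity of (50, 10) implies a 10 KTS wind component from the west.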
The sensor validation module 355 validates sensor signals 360 captured by sensors of the aircraft including the universal aircraft control router 310. For example, the sensor signals 360 may be captured by embodiments of the vehicle sensors 140 described above with reference to
In some embodiments, the aircraft sensors include multiple sensors of the same type capturing sensor signals of the same type, referred to herein as redundant sensor channels and redundant sensor signals, respectively. In such cases, the sensor validation module 355 may compare redundant sensor signals in order to determine a cross-channel coordinated sensor value. For instance, the sensor validation module 355 may perform a statistical analysis or voting process on redundant sensor signals (e.g., averaging the redundant sensor signals) to determine the cross-channel coordinated sensor value. The sensor validation module 355 may include cross-channel coordinated sensor values in the validated sensor signals 350.
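One possible voting scheme for redundant channels, sketched under the assumption of a median-based outlier rejection followed by averaging (the tolerance parameter and rejection rule are illustrative, not the disclosed process):

```python
from statistics import median

def cross_channel_value(redundant_signals, tolerance):
    """Vote across redundant sensor channels: take the median, discard
    channels disagreeing with it by more than `tolerance`, and average
    the remaining (agreeing) channels."""
    m = median(redundant_signals)
    agreeing = [s for s in redundant_signals if abs(s - m) <= tolerance]
    return sum(agreeing) / len(agreeing)
```

A faulted channel reading far from its peers is thereby excluded from the coordinated value rather than corrupting the average.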
The command processing module 365 generates the aircraft trajectory values 370 using the universal aircraft control inputs 330. The aircraft trajectory values 370 describe universal rates of change of the aircraft along movement axes of the aircraft in one or more dimensions. For instance, the aircraft trajectory values 370 may include 3D linear velocities for each axis of the aircraft (e.g., x-axis or forward velocity, y-axis or lateral velocity, and z-axis or vertical velocity) and an angular velocity around a pivot axis of the vehicle (e.g., degrees per second), such as a yaw around a yaw axis.
In some embodiments, the command processing module 365 performs one or more smoothing operations to determine a set of smoothed aircraft trajectory values that gradually achieve a requested aircraft trajectory described by the universal aircraft control inputs 330. For instance, the universal aircraft control inputs 330 may include a forward speed input that requests a significant increase in speed from a current speed (e.g., from 10 knots (KTS) to 60 KTS). In this case, the command processing module 365 may perform a smoothing operation to convert the forward speed input to a set of smoothed velocity values corresponding to a gradual increase in forward speed from a current aircraft forward speed to the requested forward speed. The command processing module 365 may include the set of smoothed aircraft trajectory values in the aircraft trajectory values. In some cases, the command processing module 365 may apply different smoothing operations to universal aircraft control inputs originating from different interfaces of the aircraft interfaces 305. For instance, the command processing module 365 may apply more gradual smoothing operations to universal aircraft control inputs received from the gesture interface 320 and less gradual smoothing operations to those received from the stick inceptor device 315. Additionally, or alternatively, the command processing module 365 may apply smoothing operations or other operations to universal aircraft control inputs received from the stick inceptor device 315 in order to generate corresponding aircraft trajectory values that simulate manual operation of the aircraft.
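A smoothing operation of this kind can be sketched as a simple rate limiter that steps from the current speed toward the requested speed; the function name, the fixed acceleration limit, and the step-list output are assumptions for illustration:

```python
def smooth_speed_profile(current_kts, requested_kts, max_accel_kts_per_s, dt_s=1.0):
    """Generate a gradual sequence of speed targets from the current speed
    to the requested speed, changing by at most max_accel_kts_per_s per step."""
    profile = []
    speed = current_kts
    step = max_accel_kts_per_s * dt_s
    while abs(requested_kts - speed) > 1e-9:
        # Move toward the request, but never by more than one step per tick.
        delta = max(-step, min(step, requested_kts - speed))
        speed += delta
        profile.append(speed)
    return profile
```

A more gradual interface (e.g., a gesture input) would simply use a smaller acceleration limit than the stick inceptor.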
In some embodiments, the command processing module 365 processes individual aircraft control inputs in the universal aircraft control inputs 330 according to an authority level of the individual aircraft control inputs. In particular, the authority levels indicate a processing priority of the individual aircraft control inputs. An authority level of an aircraft control input may correspond to an interface of the aircraft interfaces 305 that the aircraft control input originated from, may correspond to a type of operation the aircraft control input describes, or some combination thereof. In one embodiment, aircraft control inputs received from the stick inceptor device 315 have an authority level with first priority, aircraft control inputs received from the gesture interface 320 have an authority level with second priority, aircraft control inputs received from the automated aircraft control module 335 for executing automated aircraft control macros have an authority level with a third priority, and aircraft control inputs received from the automated aircraft control module 335 for executing automated control missions have an authority level with a fourth priority. Other embodiments may have different authority levels for different aircraft control inputs or may include more, fewer, or different authority levels. As an example, an operator of the aircraft may provide an aircraft control input via the stick inceptor device 315 during execution of an automated mission by the automated aircraft control module 335. In this case, the command processing module 365 interrupts processing of aircraft control inputs corresponding to the automated mission in order to process the aircraft control input received from the stick inceptor device 315. In this way, the command processing module 365 may ensure that the operator of the aircraft can take control of the aircraft at any time via a suitable interface.
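The priority ordering in the embodiment above can be sketched as a lookup table plus an arbitration function; the source-name keys and the tuple representation of pending inputs are illustrative assumptions:

```python
# Lower number = higher authority, following the ordering described above.
AUTHORITY = {
    "stick_inceptor": 1,   # first priority
    "gesture": 2,          # second priority
    "macro": 3,            # automated control macros
    "mission": 4,          # automated control missions
}

def select_active_input(pending_inputs):
    """Pick the pending control input with the highest authority.
    `pending_inputs` is a non-empty list of (source, payload) tuples."""
    return min(pending_inputs, key=lambda item: AUTHORITY[item[0]])
```

A stick input arriving mid-mission therefore wins arbitration, matching the interrupt behavior described above.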
The control laws module 375 generates the actuator commands (or signals) 380 using the aircraft trajectory values 370. The control laws module 375 includes an outer processing loop and an inner processing loop. The outer processing loop applies a set of control laws to the received aircraft trajectory values 370 to convert the aircraft trajectory values 370 to corresponding allowable aircraft trajectory values. Conversely, the inner processing loop converts the allowable aircraft trajectory values to the actuator commands 380 configured to operate the aircraft to adjust a current trajectory of the aircraft to an allowable trajectory defined by the allowable aircraft trajectory values. Both the outer processing loop and the inner processing loop are configured to operate independently of the particular aircraft including the universal aircraft control router 310. In order to operate independently in this manner, the inner and outer processing loops may use a model including parameters describing characteristics of the aircraft that can be used as input to processes or steps of the outer and inner processing loops. In some embodiments, the model used by the control laws module 375 is different from the model used by the aircraft state estimation module 345, as described above. For instance, the models used by the control laws module 375 and the aircraft state estimation module 345 may respectively include parameters relevant to determining the actuator commands 380 and relevant to determining the estimated aircraft state 340. The control laws module 375 may use the actuator commands 380 to directly control corresponding actuators, or may provide the actuator commands 380 to one or more other components of the aircraft to be used to operate the corresponding actuators.
The outer processing loop may apply the limit laws in order to impose various protections or limits on operation of the aircraft, such as aircraft envelope protections, movement range limits, structural protections, aerodynamic protections, regulatory restrictions (e.g., noise, restricted airspace, etc.), or other suitable protections or limits. Moreover, the limit laws may be dynamic, such as varying depending on an operational state of the aircraft, or static, such as predetermined for a particular type of aircraft or type of aircraft control input. As an example, if the aircraft is a rotorcraft, the set of control laws applied by the outer processing loop may include maximum and minimum rotor RPMs, engine power limits, aerodynamic limits such as vortex ring state, loss of tail-rotor authority, hover lift forces at altitude, boom strike, maximum bank angle, or side-slip limits. As another example, if the aircraft is a fixed-wing aircraft, the set of control laws applied by the outer processing loop may include stall speed protection, bank angle limits, side-slip limits, g-loads, flaps or landing gear maximum extension speeds, or never-exceed speeds (VNEs). Additionally, or alternatively, the outer processing loop uses the estimated aircraft state 340 to convert the aircraft trajectory values 370 to corresponding allowable aircraft trajectory values. For instance, the outer processing loop may compare a requested aircraft state described by the aircraft trajectory values 370 to the estimated aircraft state 340 in order to determine allowable aircraft trajectory values, e.g., to ensure stabilization of the aircraft. In some embodiments, the inner processing loop converts the allowable aircraft trajectory values in an initial frame of reference to a set of body trajectory values relative to a body frame of reference for the aircraft. In particular, the set of body trajectory values precisely define movement of the aircraft intended by the allowable aircraft trajectory values.
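At its simplest, a static limit law can be sketched as a per-axis clamp of requested trajectory values to allowable ranges; the axis names and the dictionary representation are assumptions for illustration, and real limit laws (e.g., envelope protections) would also depend on the estimated aircraft state:

```python
def apply_limit_laws(trajectory, limits):
    """Clamp requested trajectory values to per-axis allowable ranges.
    `trajectory` maps axis name to requested value; `limits` maps axis
    name to a (minimum, maximum) pair."""
    allowable = {}
    for axis, value in trajectory.items():
        lo, hi = limits[axis]
        allowable[axis] = max(lo, min(hi, value))
    return allowable
```

Dynamic limit laws would compute the (minimum, maximum) pairs from the operational state rather than from a fixed table.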
The initial frame of reference may be various suitable frames of reference, such as an inertial frame of reference, a frame of reference including rotations around one or more axes of the inertial frame, or some combination thereof. For instance, if the allowable aircraft trajectory values include a velocity for an x-axis, y-axis, z-axis and a heading rate change, the initial frame of reference may be an inertial frame with a rotation (e.g., yaw) around the z-axis. The body frame includes eight coordinates collectively representing 3D velocities and yaw, pitch, and roll angles of the aircraft.
In the same or different embodiments, the inner processing loop determines a difference between the estimated aircraft state 340 and an intended aircraft state corresponding to the allowable aircraft trajectory values, the difference referred to herein as a “command delta.” For example, the inner processing loop may determine the intended aircraft state using the body trajectory values of the aircraft, as described above. The inner processing loop uses the command delta to determine actuator commands 380 configured to operate actuators of the aircraft to adjust the state of the aircraft to the intended aircraft state. In some cases, the inner processing loop applies a gain schedule to the command delta to determine the actuator commands 380. For example, the inner processing loop may operate as a linear-quadratic regulator (LQR). Applying the gain schedule may include applying one or more gain functions to the command delta. The control laws module 375 may determine the gain schedule based on various factors, such as a trim airspeed value corresponding to the linearization of nonlinear aircraft dynamics for the aircraft. In the same or different embodiments, the inner processing loop uses a multiple input and multiple output (MIMO) protocol to determine or transmit the actuator commands 380.
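The gain-scheduled treatment of the command delta can be sketched as follows for a single axis. The linear interpolation over a trim-airspeed table and the proportional command are simplifying assumptions; an LQR inner loop would use a full state-feedback gain matrix rather than a scalar gain:

```python
def scheduled_gain(airspeed, table):
    """Linearly interpolate a gain from a schedule given as a list of
    (trim airspeed, gain) pairs sorted by airspeed."""
    if airspeed <= table[0][0]:
        return table[0][1]
    if airspeed >= table[-1][0]:
        return table[-1][1]
    for (a0, g0), (a1, g1) in zip(table, table[1:]):
        if a0 <= airspeed <= a1:
            t = (airspeed - a0) / (a1 - a0)
            return g0 + t * (g1 - g0)

def actuator_command(estimated, intended, airspeed, table):
    """Scale the command delta (intended minus estimated state value)
    by the gain scheduled for the current airspeed."""
    return scheduled_gain(airspeed, table) * (intended - estimated)
```

Scheduling the gain on trim airspeed reflects the fact that each gain set is valid near the linearization point of the nonlinear aircraft dynamics.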
In some embodiments where the aircraft is a rotorcraft, the outer processing loop is configured to facilitate execution of an automatic autorotation process for the rotorcraft. In particular, the automatic autorotation process facilitates autorotation by the rotorcraft during entry, glide, flare, and touch down phases. Additionally, or alternatively, the outer processing loop may be configured to facilitate autorotation by the aircraft in response to one or more emergency conditions (e.g., determined based on the estimated aircraft state 340). Execution of the automatic autorotation process by the outer processing loop offloads autorotation maneuvers from a human operator of the rotorcraft, thus simplifying user operation and improving safety. Furthermore, in embodiments where the aircraft is a fixed-wing aircraft, the outer processing loop may facilitate an automatic landing procedure. In particular, the outer processing loop may facilitate the automatic landing procedure even during emergency conditions, e.g., if an engine of the aircraft has failed. The aircraft state display 385 includes one or more interfaces displaying information describing the estimated aircraft state 340 received from the universal aircraft control router 310. For instance, the aircraft state display may be an embodiment of the aircraft state display 210 described above with reference to
As depicted in
The gesture inputs 410, 420, 430, and 440 further include possible movement regions (indicated by the dashed lines) that indicate a range of possible movements for each of the gesture inputs 410, 420, 430, and 440. For instance, as depicted in
As depicted in
As described above with reference to the universal vehicle control interfaces 110, the mapping 500 may adjust according to a phase of operation of the aircraft. For instance, the rightward deflection 545 and the swipe right with one finger 550 may map to a lateral movement for a rotorcraft (e.g., a strafe) if the rotorcraft is hovering. Similarly, the rightward deflection 545 and the swipe right with one finger 550 may be ignored for a fixed-wing aircraft if the fixed-wing aircraft is grounded.
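Such phase-dependent mapping can be sketched as a small dispatch on aircraft type and phase of operation; the function, string labels, and returned action names are hypothetical, chosen only to mirror the two examples above:

```python
def map_rightward_input(aircraft_type, phase):
    """Resolve a rightward deflection or one-finger swipe right to a
    universal control action; return None when the input is ignored."""
    if aircraft_type == "rotorcraft" and phase == "hover":
        return "lateral_strafe_right"   # hover: map to lateral movement
    if aircraft_type == "fixed_wing" and phase == "grounded":
        return None                     # grounded fixed-wing: input ignored
    return "bank_right"                 # default in-flight interpretation
```
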
In the embodiment shown, the aircraft state interface 600 includes a visualization of a virtual aircraft object 602 representative of a state of a physical aircraft. As depicted in
The aircraft state interface 600 further includes an environment display 604. The environment display 604 represents a physical environment in which the physical aircraft is operating. As depicted in
In some embodiments, the vehicle control and interface system 100 generates the environment display 604 based on a computer vision pose of the physical aircraft (e.g., of the current aircraft conditions, global aircraft position or orientation). The pose can be determined based on GPS, odometry, trilateration from ground fiducials (e.g., wireless fiducials, radar fiducials, etc.), or other signals. The vehicle control and interface system 100 may generate the environment display 604 from a suitable terrain database, map, imaging or other sensor data generated by the physical aircraft, or other suitable data. As an example, the vehicle control and interface system 100 may select a map segment using the aircraft pose, determine an augmented field of view or perspective, determine augmented target placement, determine pertinent information (e.g., glideslope angle), determine a type of virtual environment (e.g., map vs rendering), or any other suitable information based on the pose of the physical aircraft. The environment display 604 can be pre-rendered, rendered in real time (e.g., by z-buffer triangle rasterization), dynamically rendered, not rendered (e.g., 2D projected image, skin, etc.), or otherwise suitably generated relative to the view perspective.
The aircraft state interface 600 further includes a set of interface elements overlaying the environment display 604. The set of interface elements include an active input feedback interface element 606, a forward speed element 608, a vertical speed element 610, a heading element 612, and an aircraft control interface selection element 614.
The active input feedback interface element 606 indicates an aircraft interface that is currently providing aircraft control inputs, such as one of the aircraft interfaces 305. As depicted in
The forward speed element 608, the vertical speed element 610, and the heading element 612 each include information indicating a current aircraft control input value and information indicating a respective value for a current state of the aircraft.
In particular, the forward speed element 608 includes a vertical bar indicating a possible forward speed input value range from 20 knots (KTS) to 105 knots, where the grey bar indicates a current forward speed input value of 60 KTS. The forward speed element 608 also includes a bottom text box including text indicating the current forward speed input value. Further, the forward speed element 608 includes a top text box indicating a current forward speed value for the aircraft of 55 KTS.
Similar to the forward speed element 608, the vertical speed element 610 includes a vertical bar indicating a possible vertical speed input value range from −500 feet per minute (FPM) to 400 FPM, where the grey bar indicates a current vertical speed input value of 320 FPM. The vertical speed element 610 also includes a bottom text box including text indicating the current vertical speed input value. Further, the vertical speed element 610 includes a top text box indicating a current altitude value for the aircraft of 500 feet above mean sea level (MSL).
The heading element 612 includes a virtual compass surrounded by a circular bar indicating a possible heading input value range from −360 degrees (DEG) to +360 DEG, where the grey bar indicates a current heading input value of +5 DEG. The heading element 612 further includes horizontal bars on either side of the circular bar indicating the range of possible heading input values and a grey bar indicating the current heading input value. The virtual compass of the heading element 612 indicates a current heading value for the aircraft of 360 DEG.
The aircraft control interface selection element 614 facilitates selection of an aircraft control interface from a set of four aircraft control interfaces. As depicted in
In some embodiments, the aircraft state interface 600 or another interface may display additional interface elements corresponding to a selected aircraft control interface from the set of aircraft control interfaces. For example, if the gesture interface is selected the aircraft state interface 600 may display an additional interface including illustrations of the gesture touch inputs for providing universal aircraft control inputs, such as illustrations similar to those depicted in
As depicted in
In alternative embodiments to those depicted in
As depicted in
The mission planner element 652 facilitates interaction with navigation information, such as a routing database, inputting an origin or destination location, selecting intermediary waypoints, etc. As depicted in
The communication element 654 includes information describing relevant radio frequencies. For instance, the relevant radio frequencies may be based on a current position of the aircraft, a current mission for the aircraft, or other relevant information. In the same or different embodiments, the communication element 654 may include other communication-related information.
The system status element 656 includes information describing a status of the aircraft determined according to an estimated state of the aircraft (e.g., the estimated aircraft state 340). As depicted in
In some embodiments, some or all of the mission planner element 652, the communication element 654, or the system status element 656 are not persistently included on the aircraft state interface 650. Instead, the aircraft state interface 650 is adjusted (e.g., by the vehicle control and interface system 100) to include some or all of these elements in response to triggers or events. In the same or different embodiments, the mission planner element 652, the communication element 654, or the system status element 656 include pertinent information. Pertinent information represents a limited set of information provided for display to the human operator at a particular time or after a particular event. For example, a human operator can be relied upon to process information or direct attention according to a prioritization of: 1. aviate; 2. navigate; and 3. communicate. As only a subset of information describing a state of the physical aircraft is required for each of these tasks, the human operator can achieve these tasks more efficiently if pertinent information is displayed and irrelevant information, which can be extraneous or distracting for the human operator, is not displayed. Pertinent information can include various apposite parameters, notifications, values, or types of visual augmentation (e.g., two dimensional (2D), two and a half dimensional (2.5D), three dimensional (3D), augmentation mode, or virtual environment).
The map display 658 is a virtual geographical map including an aircraft map position indicator 660 and an aircraft map trajectory indicator 662. The map display 658 includes virtual geographical data for a geographical region. The map display 658 may be generated using map data from various map databases. The aircraft map position indicator 660 provides a visual indication of a geographical location of the aircraft relative to the geographical region displayed by the map display 658. Similarly, the aircraft map trajectory indicator 662 provides a visual indication of a trajectory of the aircraft in the geographical region of the map display 658. For example, the aircraft map trajectory indicator 662 may be a 2D projection of the trajectory forecasts 628 or 636.
The particular interface elements depicted in
As previously described the vehicle control and interface system 100 may include an emergency module 160. The emergency module 160 is designed to reduce, among other things, the number of fatal air vehicle incidents attributed to user (e.g., pilot) error. The emergency module 160 can more accurately interpret air vehicle issues and can take (e.g., immediate) corrective actions (e.g., in a safer, more accurate, and repeatable basis) while still allowing the user to have agency over the air vehicle. The emergency module 160 may also provide notifications to the user that help the user make informed and intelligent decisions without providing excessive information that may slow or overwhelm the user's decision-making process.
At step 710, the emergency module 160 determines one or more emergency events have occurred. As described herein, an emergency event refers to an event that (1) occurred and (2) requires corrective action to prevent or reduce damage to the vehicle, the user (e.g., pilot) of the vehicle, passengers of the vehicle, or some combination thereof. An example emergency event is a low-g event. An emergency event may refer to a critical failure of a component of the vehicle. For example, the loss of tail rotor thrust on a rotorcraft may be referred to as an emergency event. Other examples include loss of engine power and loss of governor control (e.g., the engine is no longer able to provide enough torque to keep the vehicle at the current altitude). The emergency module 160 may determine the occurrence of an emergency event by analyzing data from a vehicle sensor (e.g., 140). For example, an emergency event may be detected using an algorithm in conjunction with sensor data from one or more sensors. In some embodiments, the emergency module 160 is configured to determine emergency events specified in a pilot operating handbook (POH) or other certification manual.
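Detection of an emergency event from sensor data can be sketched as a threshold check with a persistence requirement, using the low-g example; the threshold, sample count, and function name are illustrative assumptions rather than certified detection criteria:

```python
def detect_low_g(vertical_accel_g, threshold_g=0.5, min_persist_samples=3):
    """Flag a low-g emergency event when vertical acceleration stays below
    `threshold_g` for at least `min_persist_samples` consecutive samples.
    The persistence requirement guards against transient sensor noise."""
    run = 0
    for sample in vertical_accel_g:
        run = run + 1 if sample < threshold_g else 0
        if run >= min_persist_samples:
            return True
    return False
```

Requiring persistence (rather than triggering on a single sample) illustrates the kind of rate-of-change and persistence criteria mentioned later in this description.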
At step 715, the emergency module 160 ranks the determined emergency events (assuming multiple events were detected at step 710) according to importance level. For example, the highest ranked emergency events may have the highest importance levels. The importance level of an event may be a function of (1) the level of potential danger to the user or passengers if the corrective action is not performed or (2) the level of urgency to perform the corrective action.
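The ranking step can be sketched as a sort over per-event danger and urgency scores; the dictionary representation and the additive combination of the two factors are assumptions for illustration:

```python
def rank_events(events):
    """Rank emergency events so the highest importance comes first.
    Each event is a dict with numeric `danger` and `urgency` scores;
    importance is taken here as their sum (an illustrative choice)."""
    return sorted(events, key=lambda e: e["danger"] + e["urgency"], reverse=True)
```

Presenting only `rank_events(events)[:n]` for a small `n` then yields the limited notification behavior described at step 720.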
At step 720, the emergency module 160 notifies a user of one or more emergency events based on the ranking from step 715. The user may be notified via a notification on a user interface on a display (e.g., 210) (e.g., the emergency module 160 sends a notification for display on a display). In another example, the user is notified via an aural notification (e.g., the emergency module 160 sends a notification to an aural device (e.g., speaker system)). In some embodiments, the user is notified via a crew alerting system indication on a primary flight display. Specific example notifications include notifying the user that the vehicle just experienced a low-g event, or the vehicle is currently experiencing loss of tail rotor effectiveness.
Performing step 720 may result in a limited number of emergency events being presented to a user. For example, only the emergency event with the highest importance is presented to the user. In other examples, only emergency events with a threshold level of importance are presented to the user or only a threshold number of the emergency events are presented to the user (a threshold number of the highest ranked events). Among other advantages, the limited number of emergency event notifications allows the user (e.g., pilot) to focus on the important events first without being distracted by less important events, thus decreasing the likelihood of the user misinterpreting the situation or performing an error.
At step 725, the emergency module 160 identifies corrective actions associated with an emergency event presented to the user in step 720 (e.g., based on the type of emergency event). One or more of the corrective actions may be specified by a pilot operating handbook (POH) or other certification manuals for the specific emergency event. The corrective actions may be part of an emergency procedure associated with the emergency event. These actions may be divided into two categories: user actions (e.g., pilot actions) and non-user actions (e.g., non-pilot actions). User actions are corrective actions that should be or must be performed by the user. Non-user actions are corrective actions that the emergency module 160 is capable of performing (e.g., without the user's input or guidance). A non-user action may be implemented by the emergency module 160 communicating with the automated aircraft control module 335. For example, the automated aircraft control module 335, responsive to receiving an indication of a non-user action from the emergency module 160, generates control inputs (e.g., 330) suitable for accomplishing the non-user action.
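The split between user and non-user corrective actions can be sketched as a lookup into a procedure table; the table contents here are hypothetical placeholders, as real procedures come from the POH or other certification manuals:

```python
# Illustrative procedure table; entries are placeholders, not POH procedures.
PROCEDURES = {
    "low_g": {
        "non_user": ["apply_pitch_up", "reduce_speed"],
        "user": ["monitor rotor RPM"],
    },
    "engine_failure": {
        "non_user": ["enter_autorotation_glide"],
        "user": ["select landing site", "flare on cue"],
    },
}

def corrective_actions(event_type):
    """Split the corrective actions for an emergency event into actions
    the system performs automatically (non-user) and actions the
    operator must perform (user)."""
    procedure = PROCEDURES[event_type]
    return procedure["non_user"], procedure["user"]
```

The non-user list would be routed to the automated aircraft control module 335 (step 730), while the user list feeds the notifications of step 735.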
At step 730, the emergency module 160 performs one or more of the non-user actions identified in step 725. The non-user actions to be performed, currently being performed, completed by the emergency module 160, or some combination thereof may be presented to the user to keep them informed. In some embodiments, these automated actions are triggered by designated fly-by-wire sensors collectively interpreted and voted on by the emergency module 160, in compliance with the certified air vehicle flight manual emergency procedures. Example non-user actions are described in the context of a low-g event for a rotorcraft. In a low-g event, the rotorcraft experiences “low-g” (a vertical acceleration which makes the user (e.g., pilot) feel light in their seat). This event causes the rotor disk to become unloaded, which can cause the rotor mast to bump and separate the rotor from the airframe. The corrective action in this case is to reload the rotor disk by applying a quick pitch-up maneuver and slowing down. In this case, the emergency module 160 may automatically detect the low-g event when it occurs and (e.g., immediately) apply this corrective action. An additional example non-user action is described in the context of autorotation. When the emergency module 160 detects that the engine has failed, it may automatically enter the rotorcraft into an autorotation glide (e.g., by interacting with the automated aircraft control module 335).
At step 735, the emergency module 160 notifies a user of one or more user actions identified in step 725 (e.g., the emergency module 160 sends a user action for display on a display). The notification may instruct the user how to perform the one or more user actions, thus decreasing the likelihood of the user misinterpreting the situation or performing an error. After the user performs one or more of the user actions, additional user actions may be provided to the user. Additionally, or alternatively, previously completed user actions may be provided to the user (e.g., displayed on a display) to remind the user of actions they already performed. Similar to step 720, the user may be notified via a notification (e.g., an alert) on a user interface or notified via an aural notification. For example, audible or visual cues notify the user when to flare in an autorotation. In some embodiments, the user is notified via a crew alerting system indication on a primary flight display.
Performing step 735 may result in a limited number of user actions being presented to a user. For example, only the next user action or a threshold number of user action notifications are presented to the user. Among other advantages, the limited number of user action notifications allows the user to focus on the next corrective action without being distracted by other (e.g., subsequent) corrective actions, thus decreasing the likelihood of the user performing an error. This also allows the user to effectively stay in control of the vehicle and assess the situation more accurately. For example, a display displays the most important user information to augment and accelerate the user's decision-making process with notifications such as “land immediately,” “land as soon as possible,” or “land as soon as practicable.”
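The thresholded notification flow described above can be sketched as follows. This is a minimal illustration under assumed names; the function, the action strings, and the default limit are all hypothetical and not part of the disclosure:

```python
def next_notifications(corrective_actions, completed, max_shown=1):
    """Return at most `max_shown` pending user actions, in order, so the
    pilot sees only the next corrective step(s) rather than every fault."""
    pending = [a for a in corrective_actions if a not in completed]
    return pending[:max_shown]

# Hypothetical ordered checklist for an autorotation scenario.
actions = ["enter autorotation glide", "maintain airspeed", "flare", "cushion landing"]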
In addition to notifying a user of one or more user actions, step 735 may notify the user of useful information, such as that the vehicle has reached its maximum operating envelope limit. The user may be notified via a user interface or via an aural notification.
However, the emergency module 160 may provide these notifications when useful instead of showing all possible indications at all times. Many conventional cockpits are full of clutter, and most pilots have more information than needed during any given phase of flight. Contrary to this, in some embodiments, the emergency module 160 only provides the most important or necessary notifications (e.g., crew alerts) in the appropriate context (e.g., when the information is useful to the user and when the user may use the information to make decisions). These curated, system-diagnosed, context-specific notifications combine user augmentation with user agency in a manner that allows for more accurate fault interpretation and appropriate procedure execution. The specific thresholds (e.g., for sensor data) that the emergency module 160 uses to classify or identify an emergency event (e.g., rate of signal change or persistence of event), along with associated corrective actions, enable the emergency module 160 to notify the user of errors or corrective actions in the right order. This allows the user to perform the proper actions more quickly rather than be inundated with information. Moving away from a panel/list of warning alerts and toward a series of popups/dialogs is a highlight of the emergency module 160 (e.g., context-specific prompts are advantageous over a list with all faults listed simultaneously).
Depending on the number of detected emergency events and the corrective actions associated with those events, steps 730 and 735 may each be performed multiple times. Additionally, or alternatively, steps 730 and 735 may be performed sequentially, in parallel, alternately, or some combination thereof. Furthermore, steps 720-735 may be repeated until (e.g., all) corrective actions are performed for (e.g., all) the emergency events determined in step 710.
Additionally, the interface 825 includes a left section 826 with a list of steps to perform to respond to one or more alerts in the middle section 827. In the example of
Interface 825 is in sharp contrast to traditional systems, which, in the example of an engine fire, simply include a single “engine fire” light that turns on. In these traditional systems, the user must manually remember the implications of the “engine fire” light, remember the proper steps to perform and then execute the next steps.
Among other advantages, the emergency module 160 works even if one or more subsystems fail or are compromised (e.g., due to an emergency event), in addition to (or alternative to) the user being incapacitated. Said differently, the emergency module 160 is operational even if the vehicle is in a degraded state. Example failures include an engine failure, tail rotor failure, landing gear failure, radar failure, and GPS failure. For example, in some embodiments, to conduct autorotation the emergency module 160 only uses input from a control interface (e.g., control-stick), a set of IMUs, a set of air-data sensors, and a set of rotor RPM sensors (e.g., these are the only items that are absolutely needed for autorotation). In another example, in some embodiments, to conduct an automated landing (e.g., where the user can pick a landing spot), GPS is needed.
In some embodiments, the emergency module 160 may include functionalities for parachutes (for emergency landings). Conventionally, to employ a parachute for an air vehicle, a user must manually release the parachute. However, there are many downsides to this. If the vehicle is moving too fast, the parachute may tear off from the vehicle. If the vehicle is too low to the ground, the parachute may not slow the vehicle down enough before the vehicle makes ground contact. Furthermore, even if a parachute is released at the proper time, the air vehicle may land at a dangerous location.
Among other advantages, the emergency module 160 provides parachute functionalities. In one example embodiment, the parachutes may be applied for fixed wing aircraft. Since the emergency module 160 knows the vehicle state, the emergency module 160 may determine when to release the parachute. In some embodiments, the emergency module 160 determines when the parachute can be deployed based on the height above the ground, the vertical speed of the vehicle, and the forward speed of the vehicle. The emergency module 160 may automatically release the parachute (an example of a non-user corrective action) or inform the user when they should release the parachute. If parameters of the vehicle should be modified before the parachute is released, the emergency module 160 may automatically control the vehicle (or provide instructions to the user) so the parameters are modified. For example, if the vehicle altitude is too low to deploy the parachute (e.g., the parachute won't sufficiently slow the descent rate of the vehicle), the emergency module 160 may trigger thrust from one or more engines (which may be reserve engine thrusters) and/or adjust aircraft wing flaps to direct the vehicle to a higher altitude that is within an envelope enabling safe deployment of the parachute. The trigger may be based upon vehicle condition and operational considerations as well as environmental considerations (e.g., wind speed, atmospheric conditions) to determine the adjustments to make to the electrical and mechanical systems (e.g., engine thrust and/or flap positions).
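The deployment decision described above (height above ground, vertical speed, forward speed) can be sketched as a simple envelope check. The numeric thresholds below are placeholders for illustration only, not values from the disclosure:

```python
def parachute_deployable(height_agl_ft, vertical_speed_fps, forward_speed_kts,
                         min_height_ft=400.0, max_sink_fps=50.0, max_speed_kts=120.0):
    """True if the current vehicle state is inside the (assumed)
    parachute deployment envelope; False otherwise."""
    return (height_agl_ft >= min_height_ft          # high enough for the canopy to arrest descent
            and abs(vertical_speed_fps) <= max_sink_fps  # not sinking too fast
            and forward_speed_kts <= max_speed_kts)      # not fast enough to tear the canopy off
```

When this returns False, the corrective behavior described above applies: the module would first adjust altitude or speed (via thrust and/or flaps) to drive the state back inside the envelope before release.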
In another example, if the vehicle is moving too fast to deploy the parachute, the emergency module 160 may direct the vehicle such that the vehicle speed decreases. For example, if engine thrust (or reserve thrusters) is operational, the engine may be signaled to generate thrust and/or aircraft flaps are deployed such that the rate of descent is slowed to within an acceptable operational envelope for deployment of the parachute.
In another example, if the vehicle will land at an undesirable location (e.g., in the middle of the ocean), the emergency module 160 may direct the vehicle before the parachute is released so that the vehicle will eventually land at a more desirable location (e.g., in the ocean but within swimming distance of the shore). For example, if engine thrust (or reserve thrusters) is operational, the engine may be signaled by a guidance and navigation system to identify a safe landing area. The guidance and navigation system may calculate aircraft parameters such as rate of drop and available thrust energy and/or flap deployment range as well as environmental factors such as wind speed and atmospheric conditions. The calculations are used to control the vehicle (e.g., via 335) such that the aircraft descends to the calculated location after the parachute is deployed. In the above examples, the emergency module 160 ‘directing’ the vehicle refers to both automatically controlling the vehicle (non-user actions) and providing control instructions to the user (examples of notifying a user of user actions e.g., step 735).
The emergency module 160 may make landing determinations based on specifics of the air vehicle. For example, if a vehicle has retractable landing gear, the emergency module 160 may identify this and direct the vehicle to land on a body of water based on this identification. Conversely, if the vehicle does not have retractable landing gear (or the landing gear is not operational (e.g., extended and jammed)), it may be unsafe to land in certain environments such as on water. The emergency module 160 may identify the extended landing gear and direct the vehicle so it does not land on the water or so that the vehicle releases a parachute before landing on the water. In the above examples, the emergency module 160 ‘directing’ the vehicle refers to both automatically controlling the vehicle (non-user actions such as triggering engine thrust and/or maneuvering the flaps) and providing control instructions to the user (examples of notifying a user of user actions e.g., step 735).
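The gear-dependent landing determination above can be illustrated as a filter over candidate landing sites. The site dictionary shape and terrain labels are assumptions made for this sketch:

```python
def filter_landing_sites(sites, gear_retractable, gear_operational):
    """Drop water sites when the gear cannot be retracted (fixed gear, or
    gear that is extended and jammed), per the landing determination above.

    `sites` is a list of dicts with an assumed "terrain" key."""
    water_ok = gear_retractable and gear_operational
    return [s for s in sites if s["terrain"] != "water" or water_ok]

# Hypothetical candidate sites.
candidates = [
    {"name": "lake", "terrain": "water"},
    {"name": "field", "terrain": "grass"},
]
```

With jammed or fixed gear, only non-water sites survive the filter; with operational retractable gear, water remains a valid option.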
Among other features, the emergency module 160 may assist a user with autorotation of a rotorcraft. Among other advantages, the emergency module 160 dramatically reduces the workload of the user during autorotation, thus making an autorotation easier and safer to perform. More specifically, the emergency module 160 automates some steps of an autorotation procedure while still giving the user agency over the rotorcraft. This allows the user to focus on other (e.g., more important) steps of the autorotation procedure, such as maneuvering the rotorcraft. For example, the emergency module 160 controls the main rotor RPM so it is within the ideal RPM range for autorotation while the user maneuvers the rotorcraft. In some embodiments, the emergency module 160 is capable of automating all steps of an autorotation procedure (e.g., if the user becomes incapacitated).
In a first example, the emergency module 160 may assist a user with a safe autorotational descent and a safe autorotational landing. When the emergency module 160 detects an autorotation condition (e.g., an engine failure, loss of tail rotor thrust (e.g., due to a bird strike), power governor failure, user-initiated autorotation, or training autorotation), the emergency module 160 may automatically enter the air vehicle into an autorotation descent profile. As indicated above, the user has the ability to initiate an autorotation at their discretion (e.g., there is an engine fire in flight and the user decides to enter an autorotation). The user may initiate an autorotation using a thumbwheel (an example of a control interface 110 and e.g., as described with respect to
In a second example, the emergency module 160 may assist a user with a safe hovering autorotation descent and a safe landing. In these cases, when the emergency module 160 detects an autorotation condition (e.g., an engine failure or loss of tail rotor thrust (e.g., jammed tail rotor)) while the rotorcraft is in a hover at or below a threshold height (e.g., eight feet above ground level (AGL)), the emergency module 160 may initiate a hovering autorotation that attempts to maintain heading to prevent yaw. If this autorotation condition occurs in lateral flight, the emergency module 160 may attempt to align heading with the aircraft velocity vector to prevent dynamic rollover. The user may then allow the rotorcraft to settle and then increase the vertical thumb lever just before touchdown to cushion the landing.
In a third example, the emergency module 160 allows the user to perform an autorotation training scenario or practice autorotation (e.g., during user training, check rides, and practice maneuvers). In some embodiments, practice autorotations are not executed to the ground and thus include minimum altitude protection (e.g., if the rotorcraft descends below an altitude threshold, the emergency module 160 exits the autorotation).
As previously mentioned,
Steps 905A, 905B, 905C are example conditions that trigger the emergency module 160 to enter into autorotation (step 907). At step 905A, the emergency module 160 detects a failure event (e.g., an engine failure or a loss of tail rotor thrust). At step 905B, the user pulls the fuel cutoff (e.g., due to a fire). At step 905C, the user initiates an autorotation (e.g., to perform a practice autorotation).
At step 907, responsive to any of steps 905A-C occurring (or any other autorotation condition occurring), the emergency module 160 enters into an autorotation. Part of step 907 may include the emergency module 160 putting the rotorcraft into an autorotation glide, thus alleviating the user of performing this potentially difficult stage of autorotation. In a rotorcraft, after a failure (e.g., an engine or transmission failure), autorotation must be entered quickly (e.g., before the rotor system loses momentum below a threshold value) to avoid a catastrophic outcome. The required time to enter autorotation (e.g., less than two seconds) may be less than the typical user reaction time to recognize the failure, mentally process it, and then provide the correct control inputs to enter autorotation. Thus, among other advantages, the emergency module 160 can detect failures that require autorotation and can initiate the proper control inputs to enter the autorotation glide faster than a human user.
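The faster-than-human detection described above can be illustrated with a persistence check on rotor torque, in the spirit of the "rate of signal change or persistence of event" thresholds mentioned earlier. The threshold, sample semantics, and function name are all illustrative assumptions, not the disclosed detection logic:

```python
def detect_power_failure(torque_samples, threshold=0.1, persist_count=3):
    """Flag a power/drive failure once torque (as a fraction of nominal)
    stays below `threshold` for `persist_count` consecutive samples.
    The persistence requirement rejects momentary sensor dropouts."""
    run = 0
    for t in torque_samples:
        run = run + 1 if t < threshold else 0
        if run >= persist_count:
            return True
    return False
```

At a high sensor rate, a check like this can confirm the failure and trigger entry into the autorotation glide well inside the roughly two-second window noted above, before rotor momentum decays below a recoverable value.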
The remaining steps of the method 900 are steps that may occur during autorotation.
At step 909, the emergency module 160 allows the user to use control inputs (e.g., a stick) to maneuver the rotorcraft (e.g., to affect the glide path). In some embodiments, step 909 can be performed autonomously (e.g., if the user is unconscious). Steps 911-917 describe example ways the user can control the rotorcraft during autorotation.
At step 911, the user controls the velocity (e.g., within the envelope, non-persistent). The pitch may be controlled by longitudinal deflection of the side stick controller. For example, pushing forward on the side stick controller pitches the nose down and increases airspeed (the airspeed may be displayed to the user via a display). At step 913, the user controls the turn rate (e.g., within the envelope, non-persistent). The turn rate may be controlled by lateral deflection of the side stick controller. At step 915, the user controls the rotor RPM (e.g., within the envelope, non-persistent). The user may control the RPM using a vertical thumb lever. For example, rolling the vertical thumb lever up decreases the rotor RPM, thus decreasing the descent rate, and rolling the vertical thumb lever down increases the rotor RPM, thus increasing the descent rate. The descent rate may be displayed to the user via a display. At step 917, the user can control the side slip. The user may control the side slip with the side stick controller.
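The inceptor mappings of steps 911-915 can be sketched as a pure scaling function. The normalization convention, axis limits, and RPM span below are assumptions for illustration; only the directions (forward stick pitches nose down, thumb lever up decreases RPM) come from the description above:

```python
def map_controls(long_stick, lat_stick, thumb_lever,
                 nominal_rpm_pct=100.0, rpm_span_pct=10.0,
                 max_pitch_deg=10.0, max_turn_dps=15.0):
    """Map normalized inceptor deflections in [-1, 1] to commanded values.
    Positive long_stick = forward (nose down); positive thumb_lever = up."""
    return {
        "pitch_deg": long_stick * max_pitch_deg,        # forward stick -> nose-down pitch
        "turn_rate_dps": lat_stick * max_turn_dps,       # lateral stick -> turn rate
        "rotor_rpm_pct": nominal_rpm_pct - thumb_lever * rpm_span_pct,  # lever up -> lower RPM
    }
```

The envelope limiting noted in each step ("within the envelope, non-persistent") would clamp these commanded values downstream; that clamping is omitted here for brevity.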
At step 919, the emergency module 160 maintains a nominal 100% rotor RPM (unless overridden by the user). The RPM can be changed from 100% (e.g., commanded to 98% or 101%) to either increase or decrease the rate of descent. Note that in some contexts RPM refers to the actual RPM value and in other contexts to a percent relative to what is deemed a nominal value in powered flight. During descent, the emergency module 160 may also manage speed, sideslip, heading, or some combination thereof (note that any of these may also be modified by the user, if needed).
At step 921, the emergency module 160 presents additional information to the user to help the user perform the autorotation. Steps 923-927 describe example information that may be displayed during autorotation. Limits that exist in powered flight but are no longer applicable during autorotation may be removed. For example, no dynamic limits (e.g., barberpole) are displayed to the user for airspeed or for vertical speed.
At step 923, the emergency module 160 presents the pitch angle (e.g., with a defined pitch envelope) of the rotorcraft. At step 925, the emergency module 160 presents the altitude of the rotorcraft (e.g., the Above Ground Level (AGL), which may be determined using radar). At step 927, the emergency module 160 presents the rotor RPM (e.g., in the envelope between 90% and 110%; in this context, RPM is a percent relative to a nominal value in powered flight).
At step 929, the emergency module 160 detects the rotorcraft is in the flare zone and, responsive to this, notifies the user. For example, the emergency module 160 provides an aural notification, such as a chime or an annunciation that the rotorcraft is in the flare zone. The emergency module 160 may determine the rotorcraft is in the flare zone based on altitude, current speed, and the minimum time it would take to reduce rotorcraft forward velocity to a safe speed to perform a landing.
At step 931, the user performs an operation to initiate a flare command. The flare command triggers the emergency module 160 to enter into a flare state. The operation may include the user pulling back on the stick past a threshold deflection (the “flare threshold”). The user can toggle off the flare state by performing another operation (e.g., pressing a command button in a user interface). During a flare state, rotor RPM is allowed to build so the vehicle is slowed to a safe landing velocity. The flare state is distinct from the glide state, during which rotor RPM is maintained to keep speed (the intent is not to build RPM but keep it within a nominal envelope).
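The glide-to-flare toggle of step 931 can be sketched as a small state holder: aft stick deflection past the flare threshold latches the flare state, and a separate cancel operation releases it. The class name, the normalized-deflection convention, and the threshold value are illustrative assumptions:

```python
class FlareState:
    """Minimal glide/flare state toggle per step 931. Aft stick deflection
    is normalized to [0, 1]; the flare state latches until cancelled."""

    def __init__(self, flare_threshold=0.6):
        self.flare_threshold = flare_threshold
        self.in_flare = False

    def update(self, aft_stick_deflection, cancel_pressed=False):
        if cancel_pressed:
            self.in_flare = False          # explicit toggle-off operation
        elif aft_stick_deflection >= self.flare_threshold:
            self.in_flare = True           # pull past the flare threshold
        return self.in_flare
```

The latching behavior matters: once in the flare state, relaxing the stick does not return to the glide state, since the intent (building RPM for touchdown) has changed.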
At step 933, in the flare state, the emergency module 160 allows the flare pitch up to build up or maintain rotor RPM up to a threshold (e.g., 110%).
At step 935, the pitch angle continues to follow the user's control (e.g., longitudinal side stick controller deflection) (within envelope) while continuing to allow the user to control other maneuvers, such as turn rate, bank rate, and side slip.
At step 937, the user uses a control interface (e.g., rolls up the vertical thumb lever) to engage the built-up rotor RPM energy to cushion the landing at final setdown.
While the steps of method 900 are illustrated as sequentially occurring in order, this is not required. For example, after autorotation begins (step 907), the user may have agency to control the rotorcraft (steps 909-917) while other steps occur. Similarly, information may be presented to the user (steps 921-927) while other steps occur (e.g., information is presented after autorotation begins at step 907).
Additional details of an autorotation are further described below. Some details of the below descriptions may be repetitive in view of the previous descriptions. Any of the descriptions, features, embodiments, and examples described below may be incorporated into any of the descriptions, features, embodiments, and examples previously described.
An autorotation may be initiated due to any number of different emergency events for aircraft with rotors (which may include rotary blades) (e.g., helicopter), such as a failed main rotor and/or failed tail rotor. Generally, an autorotation may be triggered for an emergency event that necessitates landing the air vehicle quickly (e.g., immediately or as quickly as possible). The emergency module 160 may identify such an emergency event by analyzing data from sensors of the air vehicle (e.g., determining (e.g., via detection) an engine failure by identifying decreasing rotor torque or determining an inability to generate torque necessary to drive the rotor). In response, the emergency module 160 may automatically enter into an autorotation, notify the user that the air vehicle is experiencing an engine failure and should enter into an autorotation, or both.
An autorotation may also be initiated by the user by interacting with the OS (e.g., see
To enter into an autorotation, the emergency module 160 may invert the pitch of the main rotor blades, resulting in upward air movement that rotates the rotor blades. Among other advantages, the autorotation of the rotor blades slows down the air vehicle as it descends. Furthermore, inverting the pitch of the rotor blades helps maintain rotor RPM within a predetermined envelope range (e.g., RPM within 80%-100%). If the RPM drops below the envelope range, the air vehicle may stall, possibly resulting in a crash landing. To continue maintaining RPM in the envelope range during autorotation (thus helping prevent a stall), the emergency module 160 may dynamically adjust the pitch angle of the rotor blades. The emergency module 160 may consider other factors as well. For example, there may be a desired nominal airspeed range to help stabilize RPM. Thus, the emergency module 160 may control the vehicle (e.g., dynamically adjust the pitch angle of the rotor blades) so the vehicle's speed stays in the nominal airspeed range. In another example, the emergency module 160 may (e.g., automatically) manage the change in torque introduced into the system when the engine goes idle. The specific pitch angles of the rotor blades may be determined and updated in real time by an algorithm, e.g., a feedback control loop.
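The feedback control loop mentioned above could take many forms; the following is one step of a generic PI loop, shown only to make the idea concrete. The gains, sign convention, and function name are assumptions, not the disclosed algorithm:

```python
def rpm_pitch_controller(rpm_pct, target_pct=100.0, kp=0.05, ki=0.01,
                         integral=0.0, dt=0.02):
    """One PI step: returns (blade_pitch_delta_deg, updated_integral).
    RPM above target -> increase blade pitch (load the rotor, slowing it);
    RPM below target -> decrease pitch (unload the rotor, letting it build)."""
    error = rpm_pct - target_pct
    integral += error * dt
    pitch_delta = kp * error + ki * integral
    return pitch_delta, integral
```

Called at a fixed rate with the measured rotor RPM, a loop of this shape drives RPM back toward the nominal value whenever it drifts within the envelope.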
The first stage of an autorotation is the glide stage, during which the air vehicle glides forward and downward. During the glide stage, the user can control the air vehicle, such as direction, descent rate, forward speed, and RPM. However, the emergency module 160 may prevent the air vehicle from exiting the glide envelope (e.g., allowing the rotor RPM to decrease below the envelope range). The emergency module 160 may adjust the allowable operations available to the user so that the air vehicle stays within the envelope. In some embodiments, the emergency module 160 applies boundaries on maneuver operations the user is allowed to perform. Additionally, or alternatively, if there are excursions beyond the envelope due to environmental factors, the emergency module 160 may drive the vehicle back into the envelope. These actions may be referred to as ‘dynamic envelope protection.’ Among other advantages, this reduces the burden of the user paying attention to the envelope parameters.
Based on altitude determinations, the emergency module 160 may calculate a window when the flare should be performed. The emergency module 160 may inform the user of this window, e.g., by audibly or visually alerting the user when they should perform the flare. Among other advantages, this increases the likelihood that the user will perform the flare maneuver at the right time.
When the flare is performed (or shortly before) (e.g., when the user initiates the flare), the emergency module 160 may automatically rotate the rotor blades to flip the pitch to manage upward thrust for the flare maneuver. For example, the user simply pulls the control stick backwards to perform the flare. The rotor pitch may be changed based on user input or the emergency module 160 managing RPM.
After the flare, the user may have at least some control of the air vehicle to control the landing (e.g., the user may control the RPM decay during landing to control how soft or hard the landing is).
The emergency module 160 may allow the user to select the landing type for an autorotation landing. For example, the user can select between a “run-on landing” and a “vertical landing,” and the emergency module 160 then assists in controlling the air vehicle based on the selection. For a run-on landing, the vehicle continues to move forward after the flare and hits the ground moving forward (like a fixed wing aircraft). The landing type selection option allows the user to assess the landing location and make a selection based on that. For example, an airstrip is a good place for a run-on landing, but untested ground, such as grass, may be better for a vertical landing (or there may be an obstacle, e.g., a tree, that would prevent a run-on landing). In some embodiments, a functioning tail rotor may be required to perform a run-on landing (e.g., to keep the air vehicle aligned with the forward motion). In those embodiments, if the tail rotor is not functioning properly, the user may not have the option to select a run-on landing.
In the above descriptions, the emergency module 160 is configured to generate instructions that help the user operate the air vehicle during stages of an autorotation. In other words, the emergency module 160 makes performing an autorotation easier for the user, but the user is still in control of the air vehicle and is still able to apply decisions to control aspects of the aircraft (e.g., the user may control the vehicle during the glide, flare, and landing stages). In some embodiments, the emergency module 160 can partially or fully automate any stage of an autorotation, as further described below.
For example, in some embodiments, the user simply selects a landing location or a landing type. In these embodiments, the emergency module 160 may determine a set of possible landing locations for an autorotation and present these locations to the user for selection via a UI. These locations may be determined based on a terrain map, the safety envelope of the aircraft, and environmental conditions (e.g., weather conditions). The user may select based on their knowledge of the locations (e.g., airstrip vs empty field vs beach). If the user does not touch an exact location on the user interface, the selection may auto-select the nearest location.
The emergency module 160 may additionally, or alternatively, allow the user to select a landing type. The available landing types may be based on the ground type of the landing location. After those selections, the emergency module 160 may control the air vehicle so that it performs an autorotation (e.g., including controlling the vehicle during the glide, flare, and landing stages) and lands at the selected location and by the selected landing type. In some embodiments, the emergency module 160 automatically selects the landing location or landing type. For example, if no user input is provided for the landing location in a threshold amount of time (e.g., 10 s), the emergency module 160 may automatically select a landing location based on various criteria. Thus, if the user is incapacitated or distracted, the emergency module 160 may still perform the autorotation. Similarly, if no user input is provided in a threshold amount of time for the landing type, the emergency module 160 may automatically select the landing type based on criteria, e.g., the ground type at the landing location. In these examples, if no user input is provided by the threshold time, the emergency module 160 may determine the user is incapacitated and then perform the autorotation without input from the user. However, if the user provides selections during the threshold time, the emergency module 160 may allow the user to control the vehicle during autorotation.
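The timeout fallback described above (use the user's selection if one arrives; otherwise auto-select after the threshold time) can be sketched as follows. The candidate dictionary shape, the ranking criterion (nearest first), and the default timeout are illustrative assumptions:

```python
def resolve_landing_choice(user_choice, elapsed_s, candidates, timeout_s=10.0):
    """Return the user's selection if provided. After `timeout_s` with no
    input (user presumed incapacitated or distracted), auto-select the
    nearest candidate. Before the timeout, return None (keep waiting)."""
    if user_choice is not None:
        return user_choice
    if elapsed_s >= timeout_s and candidates:
        return min(candidates, key=lambda c: c["distance_nm"])
    return None

# Hypothetical candidate landing locations.
sites = [{"name": "airstrip", "distance_nm": 5.0},
         {"name": "field", "distance_nm": 2.0}]
```

The same pattern applies to the landing-type selection, with the ranking criterion replaced by ground-type suitability.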
In some embodiments, the emergency module 160 includes a planning tool that allows a user to indicate that, if an autorotation should be performed during the flight, they want the air vehicle to be able to perform an autorotation and land at locations that meet certain criteria (e.g., if an autorotation is performed, the user wants the air vehicle to only land at “safe” landing locations, such as airports or landing strips). If the user provides this indication, the emergency module 160 may identify emergency landing locations between the departure and arrival locations that meet the criteria and then select a flight plan, velocity, etc. so that, during the flight, if an autorotation should be performed, the vehicle may (e.g., always) be able to land at (at least) one of the identified emergency landing locations.
At step 1010, the emergency module 160 determines occurrence of an autorotation condition for a rotary wing air vehicle controlled by a user. At step 1020, the emergency module 160, responsive to determining the occurrence of the autorotation condition, controls the air vehicle to enter into an autorotation. At step 1030, the emergency module 160 performs one or more non-user actions during the autorotation to assist the user with the autorotation. At step 1040, the emergency module 160, while performing the one or more non-user actions during the autorotation, allows the user to maneuver the air vehicle by the user interacting with one or more control interfaces of the air vehicle (e.g., 110).
Optionally, controlling the air vehicle to enter into the autorotation comprises controlling the air vehicle to enter into an autorotation glide. Optionally, controlling the air vehicle to enter into the autorotation is performed automatically without input from the user. Optionally, entering into the autorotation comprises inverting rotor blades of a rotor of the air vehicle. Optionally, performing the one or more non-user actions includes maintaining an RPM of a rotor of the air vehicle. Optionally, performing the one or more non-user actions includes maintaining an airspeed of the air vehicle to a range of nominal values (e.g., unless indicated otherwise by the user). Optionally, maintaining the RPM of the rotor comprises maintaining the RPM of the rotor between high and low autorotation RPM thresholds. Optionally, maintaining the RPM of the rotor of the air vehicle comprises dynamically adjusting a pitch angle value of a rotor blade of the rotor. Optionally, the pitch angle value of the rotor blade is dynamically adjusted using a feedback control loop. Optionally, the autorotation condition includes at least one of: an engine failure; loss of tail rotor thrust below a threshold; a fire in or on the air vehicle; a power governor failure; or a user-initiated autorotation. Optionally, performing one or more non-user actions during the autorotation to assist the user with the autorotation comprises preventing the user from maneuvering the air vehicle outside of a security envelope. Optionally, allowing the user to maneuver the air vehicle includes allowing the user to control at least one of: a forward speed of the air vehicle; a descent rate of the air vehicle; a turn rate of the air vehicle; a yaw of the air vehicle; a pitch of the air vehicle; a roll of the air vehicle; an RPM of a rotor of the air vehicle; or a side slip of the air vehicle. 
Optionally, the method 1000 further comprises, responsive to determining the air vehicle is below a threshold height from the ground, notifying the user to perform a flare maneuver. Optionally, the method 1000 further comprises, during a flare maneuver, automatically inverting rotor blades of a rotor of the air vehicle. Optionally, the autorotation condition occurs while the air vehicle is at or below a threshold height from the ground. Optionally, the air vehicle is controlled to enter into a hovering autorotation. Optionally, the method 1000 further comprises: responsive to determining the air vehicle is moving laterally, controlling the air vehicle to turn toward a velocity vector to prevent the air vehicle from rolling over. Optionally, the method 1000 further comprises, subsequent to receiving an autorotation landing type indication, performing one or more non-user actions to assist the user in landing the air vehicle according to the landing type indication. Other aspects include components, devices, systems, improvements, methods, processes, applications, computer readable mediums, and other technologies related to any of the above.
The process 1100 includes the aircraft control router, e.g., 310, receiving 1110 aircraft control inputs describing a requested trajectory for an aircraft. For example, a human operator of an aircraft may provide the aircraft control inputs via one of the aircraft interfaces 305. The aircraft control inputs may include one or more of a forward speed control input, a lateral speed control input, a vertical speed control input, or a turn control input, e.g., as described above with reference to
The process 1100 includes the aircraft control router, e.g., 310, generating 1120, using the aircraft control inputs, a plurality of trajectory values for axes of movement of the aircraft, the plurality of trajectory values corresponding to the requested trajectory. For instance, the aircraft control router may convert the aircraft control inputs to corresponding trajectory values for axes of movement of the aircraft. As an example, if the aircraft control inputs include some or all of a forward speed control input, a lateral speed control input, a vertical speed control input, or a turn control input, the aircraft control router may determine one or more of a corresponding aircraft x-axis velocity, aircraft y-axis velocity, aircraft z-axis velocity, or angular velocity about a yaw axis of the vehicle (e.g., a yaw).
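The conversion in step 1120 can be sketched as a scaling from normalized control inputs to body-axis trajectory values. The normalization convention and the axis limits below are assumptions for illustration, not parameters of the aircraft control router 310:

```python
def inputs_to_trajectory(forward_cmd, lateral_cmd, vertical_cmd, turn_cmd,
                         max_vx=40.0, max_vy=10.0, max_vz=5.0, max_yaw_rate=20.0):
    """Scale normalized control inputs in [-1, 1] to trajectory values:
    x/y/z-axis velocities (m/s) and angular velocity about the yaw axis (deg/s)."""
    return {
        "vx_mps": forward_cmd * max_vx,        # forward speed input -> x-axis velocity
        "vy_mps": lateral_cmd * max_vy,        # lateral speed input -> y-axis velocity
        "vz_mps": vertical_cmd * max_vz,       # vertical speed input -> z-axis velocity
        "yaw_rate_dps": turn_cmd * max_yaw_rate,  # turn input -> yaw angular velocity
    }
```

In the full process, these trajectory values would then pass through the control laws of step 1130, which bound them to allowable values before actuator commands are generated.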
The process 1100 includes the aircraft control router generating 1130, using information describing characteristics of the aircraft and the plurality of trajectory values, a plurality of actuator commands to control the plurality of actuators of the aircraft. The aircraft control router may apply a set of control laws to the plurality of trajectory values in order to determine allowable trajectory values for the axes of movement of the aircraft. The information describing characteristics of the aircraft may include various information, such as a model including parameters for the aircraft or an estimated state of the aircraft. Furthermore, the aircraft control router may convert the plurality of trajectory values to the plurality of actuator commands using one or both of an outer processing loop and an inner processing loop, as described above with reference to the universal aircraft control router 310.
The process 1100 includes the aircraft control router transmitting 1140 the plurality of actuator commands to corresponding actuators to adjust a current trajectory of the aircraft to the requested trajectory. Alternatively, or additionally, the aircraft control router may transmit some or all of the actuator commands to other components of the aircraft to be used to control relevant actuators.
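Steps 1130 and 1140 together can be sketched as a small pipeline: control laws limit the requested trajectory values to allowable ones, the allowable values are converted to actuator commands, and the commands are transmitted. The per-axis clamping limits and the linear mixing matrix below are hypothetical placeholders standing in for the disclosed control laws and the outer/inner processing loops.

```python
# Illustrative end-to-end sketch of steps 1130-1140 of process 1100.
# Clamping limits and the mixing matrix are hypothetical placeholders.

def apply_control_laws(trajectory, limits):
    # Determine allowable trajectory values by clamping each requested
    # value to its per-axis allowable range [-limit, +limit].
    return [max(-lim, min(lim, value)) for value, lim in zip(trajectory, limits)]


def trajectory_to_actuator_commands(allowable, mixing_matrix):
    # The outer/inner loop processing is abstracted here as a single
    # linear mix of allowable trajectory values into actuator commands.
    return [sum(w * v for w, v in zip(row, allowable)) for row in mixing_matrix]


def run_process_1100(trajectory, limits, mixing_matrix, transmit):
    allowable = apply_control_laws(trajectory, limits)                    # step 1130
    commands = trajectory_to_actuator_commands(allowable, mixing_matrix)  # step 1130
    transmit(commands)                                                    # step 1140
    return commands


sent = []
commands = run_process_1100(
    trajectory=[60.0, 0.0],                 # requested values, one exceeding its limit
    limits=[50.0, 50.0],
    mixing_matrix=[[1.0, 0.0], [0.0, 1.0]],  # identity mix for illustration
    transmit=sent.append,
)
```

The out-of-range request on the first axis is reduced to its allowable value before being mixed into actuator commands and transmitted.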
The machine may be a computing system capable of executing instructions 1224 (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute instructions 1224 to perform any one or more of the methodologies discussed herein.
The example computer system 1200 includes a processor system 1202 (e.g., including one or more central processing units (CPUs), one or more graphics processing units (GPUs), one or more neural processing units (NPUs), one or more digital signal processors (DSPs), one or more application specific integrated circuits (ASICs), one or more radio-frequency integrated circuits (RFICs), one or more field programmable gate arrays (FPGAs), or a combination thereof), a main memory 1204, and a static memory 1206, which are configured to communicate with each other via a bus 1208. The computer system 1200 may further include a visual display interface 1210. The visual interface may include a software driver that enables (or provides) user interfaces to render on a screen either directly or indirectly. The visual interface 1210 may interface with a touch enabled screen. The computer system 1200 may also include input devices 1212 (e.g., a keyboard or a mouse), a storage unit 1216, a signal generation device 1218 (e.g., a microphone and/or speaker), and a network interface device 1220, which also are configured to communicate via the bus 1208.
The storage unit 1216 includes a machine-readable medium 1222 (e.g., magnetic disk or solid-state memory) on which is stored instructions 1224 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 1224 (e.g., software) may also reside, completely or at least partially, within the main memory 1204 or within the processor system 1202 (e.g., within a processor's cache memory) during execution.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium and processor executable) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module is a tangible component that may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
This application claims a benefit of, and priority to, U.S. Patent Application Ser. No. 63/526,747, filed Jul. 14, 2023, and U.S. Patent Application Ser. No. 63/580,415, filed Sep. 4, 2023, the contents of each being incorporated by reference herein.
Number | Date | Country
---|---|---
63526747 | Jul 2023 | US
63580415 | Sep 2023 | US