This application claims priority to India Provisional Patent Application No. 202311008704, filed Feb. 10, 2023, the entire content of which is incorporated by reference herein.
The present invention generally relates to collision threat awareness for mobile platforms, and more particularly relates to a system that automatically identifies collision threats proximate to a mobile platform and directs an onboard searchlight either toward the collision threats or toward an escape route that avoids the collision threats.
Controlled Flight into Terrain/Obstacle/Water (CFIT) accidents continue to occur during helicopter operations. CFIT is a type of accident in which a functioning aircraft is unintentionally flown into the ground, man-made objects, water, or the like. CFIT is a major contributor to accidents involving helicopter emergency medical services (HEMS) rotorcraft, as pilots may be prone to fly in inclement weather and nighttime conditions and may be relatively inexperienced. In addition, single-engine pilots may be prone to fly under visual flight rules (VFR) at lower altitudes and under clouds and weather, thereby increasing the likelihood of CFIT accidents.
Hence, there is a need for systems and methods that promote collision threat awareness for rotorcraft and thereby reduce the occurrence of CFIT accidents. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
This summary is provided to describe select concepts in a simplified form that are further described in the Detailed Description. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A system is provided for promoting collision threat awareness for a mobile platform. The system comprises a searchlight assembly comprising a searchlight mounted on the mobile platform, the searchlight configured to emit a beam of light and the searchlight assembly configured to controllably articulate the searchlight to modify a direction of the beam of light, a sensor system configured to sense a location, an elevation, and an orientation of the mobile platform and/or locations and elevations of various obstacles including terrain and manmade objects, a source of data including the locations and the elevations of the various obstacles, and a controller operably coupled to the searchlight assembly, the sensor system, and the source of data, the controller configured to, by a processor: identify a first obstacle within a flight path of and/or an area adjacent to the mobile platform that poses a collision threat to the mobile platform based on the location, the elevation, and/or the orientation of the mobile platform as sensed by the sensor system and the location and the elevation of the first obstacle as stored in the data and/or sensed by the sensor system, and operate the searchlight assembly to: automatically direct the beam of light toward the first obstacle, and/or automatically direct the beam of light toward an escape route determined based on the various obstacles in the data, wherein the escape route represents a safe flight path that allows the mobile platform to avoid collision with the first obstacle.
A method is provided for promoting collision threat awareness for a mobile platform. The method comprises receiving, with a processor of a controller of the mobile platform, a location, an elevation, and an orientation of the mobile platform from a sensor system of the mobile platform, receiving, by the processor, data including locations and elevations of various obstacles including terrain and manmade objects from a source of the data, identifying, by the processor, a first obstacle within a flight path of and/or an area adjacent to the mobile platform that poses a collision threat to the mobile platform based on the location, the elevation, and/or the orientation of the mobile platform as sensed by the sensor system and the location and the elevation of the first obstacle as stored in the data and/or sensed by the sensor system, operating a searchlight assembly to emit a beam of light from a searchlight mounted on the mobile platform, and controllably articulating the searchlight, by the processor, to modify a direction of the beam of light to: automatically direct the beam of light toward the first obstacle, and/or automatically direct the beam of light toward an escape route determined based on the various obstacles in the data, wherein the escape route represents a safe flight path that allows the mobile platform to avoid collision with the first obstacle.
Furthermore, other desirable features and characteristics of the system and method will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the preceding background.
The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Thus, any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. All of the embodiments described herein are exemplary embodiments provided to enable persons skilled in the art to make or use the invention and not to limit the scope of the invention which is defined by the claims. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary, or the following detailed description.
For the sake of brevity, conventional techniques related to signal processing, data transmission, signaling, control, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent example functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the present disclosure.
Systems and methods disclosed herein provide for automatically identifying collision threats during operation of a mobile platform and directing an onboard searchlight to illuminate either the collision threats or an escape route that includes a path intended to avoid the collision threats. The mobile platform may be any type of vehicle, such as but not limited to various types of aircraft. It should be noted that the term aircraft, as utilized herein, may include any manned or unmanned object capable of flight. Examples of aircraft may include, but are not limited to, fixed-wing aerial vehicles (e.g., propeller-powered or jet powered), rotary-wing aerial vehicles (e.g., helicopters), manned aircraft, unmanned aircraft (e.g., unmanned aerial vehicles, or UAVs), delivery drones, etc. For convenience, the systems and methods will be described in reference to a manned helicopter; however, as noted the systems and methods are not limited to such application.
Referring now to
Although schematically illustrated in
The term “controller,” as appearing herein, broadly encompasses those components utilized to carry-out or otherwise support the processing functionalities of the system 100. Accordingly, the controller 12 can encompass or may be associated with any number of individual processors, flight control computers, navigational equipment pieces, computer-readable memories (including or in addition to the memory 16), power supplies, storage devices, interface cards, and other standardized components.
In various embodiments, the controller 12 includes at least one processor, a communication bus, and a computer readable storage device or media. The processor performs the computation and control functions of the controller 12. The processor can be any custom made or commercially available processor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processor among several processors associated with the controller 12, a semiconductor-based microprocessor (in the form of a microchip or chip set), any combination thereof, or generally any device for executing instructions. The computer readable storage device or media may include volatile and nonvolatile storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), for example. KAM is a persistent or non-volatile memory that may be used to store various operating variables while the processor is powered down. The computer-readable storage device or media may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (electrically PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions, used by the controller 12. The bus serves to transmit programs, data, status and other information or signals between the various components of the helicopter 10. The bus can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared, and wireless bus technologies.
The instructions may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. The instructions, when executed by the processor, receive and process signals from the sensors 22, perform logic, calculations, methods and/or algorithms, and generate data based on the logic, calculations, methods, and/or algorithms. Although only one controller 12 is shown in
The controller 12 may exchange data with one or more external sources 40 to support operation of the system 100 in various embodiments. In this case, bidirectional wireless data exchange may occur via the communication system 24 over a communications network, such as a public or private network implemented in accordance with Transmission Control Protocol/Internet Protocol architectures or other conventional protocol standards. Encryption and mutual authentication techniques may be applied, as appropriate, to ensure data security.
In various embodiments, the communication system 24 is configured to support instantaneous (i.e., real time or current) communications between on-board systems, the controller 12, and one or more external data source(s) 40. The communication system 24 may incorporate one or more transmitters, receivers, and the supporting communications hardware and software required for components of the system 100 to communicate as described herein. In various embodiments, the communication system 24 may have additional communications not directly relied upon herein, such as bidirectional pilot-to-ATC (air traffic control) communications via a datalink, and any other suitable radio communication system that supports communications between the helicopter 10 and various external source(s).
The memory 16 can encompass any number and type of storage media suitable for storing computer-readable code or instructions, such as the program 36, as well as other data generally supporting the operation of the system 100. As can be appreciated, the memory 16 may be part of the controller 12, separate from the controller 12, or part of the controller 12 and part of a separate system. The memory 16 can be any suitable type of storage apparatus, including various different types of direct access storage and/or other memory devices.
A source of information suitable for determining and identifying collision threats may be part of the system 100. In certain embodiments, the source is one or more databases 28 employed to receive and store map data, which may be updated on a periodic or iterative basis to ensure data timeliness. In various embodiments, the map data may include various terrain and manmade object locations and elevations and may be stored in the memory 16 or in the one or more databases 28, and referenced by the program 36. In various embodiments, these databases 28 may be available online and accessible remotely by a suitable wireless communication system, such as the communication system 24.
The sensor system 22 supplies various types of data and/or measurements to the controller 12. In various embodiments, the sensor system 22 supplies, without limitation, one or more of: inertial reference system measurements providing a location, Flight Path Angle (FPA) measurements, airspeed data, groundspeed data, vertical speed data, vertical acceleration data, altitude data, attitude data including pitch and roll data, yaw data, data related to ownship weight, time/date information, heading information, data related to atmospheric conditions, flight path data, flight track data, radar altitude data, geometric altitude data, and wind speed and direction data. Further, in certain embodiments of the system 100, the controller 12, and the other components of the system 100 may be included within or cooperate with any number and type of systems commonly deployed onboard aircraft including, for example, an FMS, an Attitude Heading Reference System (AHRS), an Instrument Landing System (ILS), and/or an Inertial Reference System (IRS).
With continued reference to
At least one avionic display 34 is generated on display device 32 during operation of the system 100. The term “avionic display” as used herein is synonymous with the terms “aircraft-related display” and “cockpit display” and encompasses displays generated in textual, graphical, cartographical, and other formats. The system 100 can generate various types of lateral and vertical avionic displays 34 on which symbology, text annunciations, and other graphics pertaining to flight planning are presented for a pilot to view. The display device 32 is configured to continuously render at least one avionic display 34 showing a terrain environment at a current location of the helicopter 10. The avionic display 34 generated and controlled by the system 100 can include alphanumerical input displays of the type commonly presented on the screens of multi-function control and display units (MCDUs), as well as Control Display Units (CDUs) generally. Specifically, certain embodiments of the avionic displays 34 include one or more two dimensional (2D) avionic displays, such as a horizontal (i.e., lateral) navigation display or vertical navigation display; and/or on one or more three dimensional (3D) avionic displays, such as a Primary Flight Display (PFD) or an exocentric 3D avionic display.
In various embodiments, a human-machine interface, such as a touch screen display, is implemented as an integration of the user interface 18 and the display device 32. Via various display and graphics systems processes, the controller 12 may command and control the touch screen display generating a variety of graphical user interface (GUI) objects or elements, for example, buttons, sliders, and the like, which are used to prompt a user to interact with the human-machine interface to provide user input, and to activate respective functions and provide user feedback, responsive to received user input at the GUI element.
The searchlight assembly 25 may include various components including a searchlight 27 configured to emit a beam of light 50 that illuminates a light spot 52 on an object on which the beam of light 50 impinges, an actuation system comprising one or more actuators configured to control the position of the searchlight 27 and thereby the direction of the beam of light 50, and a searchlight controller configured to control the actuation system based on preprogrammed instructions and/or pilot input.
With reference to
In various embodiments, the collision threat module 110 receives as input external data 120 received from the external sources 40 via the communication system 24, onboard sensor data 122 generated by the sensors 22, and/or onboard database data 124 retrieved from the database 28. The external data 120 includes various data indicating information relating to locations and elevations of terrain (e.g., trees, mountains, hills, bodies of water, etc.), manmade objects (buildings, bridges, utility lines, etc.), and the like. The onboard sensor data 122 includes various data indicating sensed locations and/or elevations of terrain, manmade objects, and the like, sensed operating parameters of the helicopter 10, and/or environmental conditions (e.g., wind speed and/or direction; e.g., Automatic Terminal Information Service (ATIS) data or Air Data Computer (ADC) data) in a geographic area relevant to the helicopter 10 (e.g., adjacent to, along a flight path thereof, proximate to, etc.). The onboard database data 124 may include various data indicating information relating to locations and elevations of terrain, manmade objects, and the like.
The collision threat module 110 may analyze the external data 120, the sensor data 122, and/or the database data 124 to identify collision threats, that is, terrain, man-made objects, or the like (collectively referred to as obstacles) that are within a geographic area relevant to the helicopter 10 and that pose a threat of collision to the helicopter 10 based on, for example, a sensed position, elevation, orientation, and/or direction of travel of the helicopter 10, a distance between the obstacle and the helicopter 10, etc. For example, the collision threat module 110 may determine that the helicopter 10 is likely to or has a potential to collide with a nearby obstacle (e.g., a building) based on the operating parameters of the helicopter 10 and the position and elevation of the obstacle as stored in the received data (i.e., the external data 120, the sensor data 122, or the database data 124). The determination of whether an obstacle is a collision threat, that is, poses a threat of collision with the helicopter 10, may be based on various preprogrammed criteria, such as a threshold relating to a minimum distance between the helicopter 10 and the obstacle. The collision threat module 110 generates collision threat data 126 that includes various data identifying collision threats.
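As a non-limiting illustration of the minimum-distance criterion described above, the comparison of ownship position and elevation against a stored obstacle may be sketched as follows. This is an illustrative sketch only, not part of the disclosed embodiments; the function names, the dictionary fields, and the threshold value are all chosen for illustration, and the actual preprogrammed criteria are not specified herein.

```python
import math

# Hypothetical minimum-separation threshold (meters); the actual
# preprogrammed criteria are not specified in this disclosure.
MIN_SEPARATION_M = 150.0

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points (haversine)."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def is_collision_threat(ownship, obstacle, threshold_m=MIN_SEPARATION_M):
    """Flag an obstacle whose 3D separation from ownship is below threshold."""
    horiz = horizontal_distance_m(ownship["lat"], ownship["lon"],
                                  obstacle["lat"], obstacle["lon"])
    vert = ownship["elev_m"] - obstacle["elev_m"]
    return math.hypot(horiz, vert) < threshold_m
```

A production implementation would additionally weigh orientation and direction of travel, as the passage above notes; a pure distance test is only the simplest form of the criterion.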
In various embodiments in which the collision threat is intended to be illuminated, the searchlight control module 114 receives as input the collision threat data 126 generated by the collision threat module 110. The searchlight control module 114 may generate searchlight control data 132 configured to cause the searchlight assembly 25 to direct a beam of light 50 toward the collision threat 400 (e.g., a building in
In various embodiments, the escape route generation module 116 receives as input the collision threat data 126 generated by the collision threat module 110. The escape route generation module 116 analyzes the collision threat data 126 to determine an escape route based on the obstacle(s) identified as collision threats. The escape route represents a safe flight path that allows the helicopter 10 to avoid collision with the obstacle(s). The escape route generation module 116 generates escape route data 128 that includes various data indicating the determined escape route.
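One way to picture the escape route determination described above is to select the heading that gives the greatest angular clearance from every identified collision threat. The sketch below is illustrative only and not part of the disclosed embodiments; the candidate-heading search, the field names, and the 10-degree step are assumptions, and the actual escape route logic is not specified herein.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = (math.cos(p1) * math.sin(p2)
         - math.sin(p1) * math.cos(p2) * math.cos(dl))
    return math.degrees(math.atan2(y, x)) % 360.0

def escape_heading(ownship, threats, candidates=range(0, 360, 10)):
    """Pick the candidate heading farthest (in angle) from every threat bearing."""
    threat_bearings = [bearing_deg(ownship["lat"], ownship["lon"],
                                   t["lat"], t["lon"]) for t in threats]
    def min_clearance(heading):
        # Smallest angular separation between this heading and any threat.
        return min(min(abs(heading - b) % 360.0,
                       360.0 - abs(heading - b) % 360.0)
                   for b in threat_bearings)
    return max(candidates, key=min_clearance)
```

A real escape route would also respect terrain elevation, aircraft performance, and the geometry of the full flight path rather than a single heading; this sketch captures only the "path between obstacles" idea.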
In embodiments in which an escape route is to be illuminated, the searchlight control module 114 receives as input the escape route data 128 generated by the escape route generation module 116. The searchlight control module 114 may generate the searchlight control data 132 configured to cause the searchlight assembly 25 to direct the beam of light 50 toward or along the determined escape route (e.g., between obstacles identified as collision threats).
In various embodiments, the display module 112 receives as input the searchlight control data 132 generated by the searchlight control module 114. The display module 112 generates display data 130 that includes various data configured to cause one or more graphic icons to be rendered on the display device 32 that visually indicates a location of the light spot 52, the collision threat 400, and/or the escape route.
In various embodiments in which the collision threat is intended to be illuminated, the display data 130 may include various data configured to cause one or more graphic icons to be rendered on the display device 32 that indicates, identifies, and/or highlights the collision threat 400. For example,
The PFD 300 is a perspective view Synthetic Vision System (SVS) display including graphical renderings of terrain and other geographical features representing the view from the cockpit under ideal visibility conditions (a so-called “glass cockpit” view). The simulated “glass cockpit” view produced on the PFD 300 thus includes an environmental graphic 350, which represents a first-person view of a real terrain environment which the helicopter 10 is presently approaching (typically oriented in or limited to a forward field of view relative to the helicopter 10). Additionally, the PFD 300 includes a dynamic visual element 332 representative of a location of the collision threat 400 that poses a threat to the helicopter 10. The visual element 332, also referred to herein as the collision threat icon 332, may be rendered on the PFD 300 in response to identification of a collision threat 402 (e.g., elevated terrain in
In some embodiments, the collision threat icon 332 may be configured to highlight the light spot 52 of the beam of light 50, to highlight the potential or likely point of impact in the event of a collision between the helicopter 10 and the collision threat 402, or to otherwise provide visual prominence to the collision threat 402.
In various embodiments, the display data 130 may include data configured to cause relevant data to be rendered on the display device 32. For example,
In various embodiments, the collision threat 402 may be located out of view relative to the display 34 and the collision threat icon 332 may be rendered on the display device 32 in a manner that indicates that the collision threat 402 is out of view. For example,
In embodiments in which an escape route is to be illuminated, the display data 130 may include various data configured to cause an escape route icon to be rendered on the display device 32 that indicates, identifies, and/or highlights the escape route. For example, the escape route icon may represent a position of the light spot 52, a direction of the beam of light 50, or a path along the escape route. In
With reference now to
The method 200 may begin at 210. At 212, the method 200 may include receiving environmental data that includes information relating to an environment adjacent to the helicopter 10 and/or along a flight path thereof. For example, the environmental data may include the external data 120, the sensor data 122, and/or the database data 124. At 214, the method 200 may include identifying a collision threat (e.g., collision threat 400 or 402) based on the environmental data. At 216, the method 200 may include determining an escape route that includes a path configured to avoid obstacles in the environmental data. At 218, the method 200 may include directing the searchlight 27 of the helicopter 10 to indicate the escape route or a portion thereof. At 220, the method 200 may include displaying an escape route icon (e.g., escape route icon 336) on the display device 32 of the helicopter 10 representing the escape route or a portion thereof.
Alternatively or in addition to the above, at 222, the method 200 may include directing the searchlight 27 to illuminate the collision threat. At 224, the method 200 may include tracking the collision threat with the searchlight 27, that is, position locking the searchlight 27 such that the beam of light (e.g., beam of light 50) is continuously directed toward the collision threat while the helicopter 10 is moving. At 226, the method 200 may include displaying a collision threat icon (e.g., collision threat icon 332) on the display device 32 representing a light spot (e.g., light spot 52) produced by the searchlight 27.
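The position locking at 224 amounts to recomputing the searchlight's pan and tilt commands as the helicopter moves, so that the beam stays on a fixed geographic point. The following sketch, which is illustrative only and not part of the disclosed embodiments, shows one way this geometry could work in a flat-earth local frame; the field names, the frame convention, and the flat-earth approximation are assumptions.

```python
import math

def searchlight_angles(ownship, target):
    """Pan/tilt commands that keep the beam on a fixed ground target.

    Uses a flat-earth local frame centered on the ownship. `ownship`
    carries position, altitude, and heading (degrees); `target` carries
    position and elevation. All names are illustrative.
    """
    m_per_deg = 111_320.0  # meters per degree of latitude (approx.)
    north = (target["lat"] - ownship["lat"]) * m_per_deg
    east = (target["lon"] - ownship["lon"]) * m_per_deg * math.cos(
        math.radians(ownship["lat"]))
    down = ownship["alt_m"] - target["elev_m"]
    # Pan: bearing to target minus ownship heading, wrapped to [-180, 180).
    pan = (math.degrees(math.atan2(east, north)) - ownship["heading_deg"]
           + 180.0) % 360.0 - 180.0
    # Tilt: depression angle below the horizon toward the target.
    tilt = -math.degrees(math.atan2(down, math.hypot(north, east)))
    return pan, tilt
```

Re-evaluating these two angles on every update of the sensed ownship position and heading is what keeps the beam continuously directed toward the collision threat while the helicopter is moving.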
At 228, the method 200 may include fading the searchlight 27 in response to the helicopter 10 avoiding the collision threat, that is, the obstacle that was previously identified as the collision threat is no longer considered to be a threat to the helicopter 10. The method 200 may end at 230.
In various embodiments, the system 100 may be configured to optionally use the searchlight 27 and the display device 32 to indicate the escape route or the collision threat. In such embodiments, the system 100 may automatically switch between these functions based on preprogrammed criteria, and/or a pilot may be provided with the ability to manually switch between the features. In such embodiments, the system 100 may indicate to the pilot which of the features is currently active. For example, a collision threat awareness icon may be rendered on the display device 32 indicating which of the features is active.
In some embodiments, the appearance of the collision threat icon 332 can be varied to generate visual alerts and convey other information. For example, the collision threat icon 332 may be altered in appearance to generate a visual alert cautioning the pilot that the helicopter 10 may be, for example, within a minimum distance of the collision threat. This and other visual alerts can be implemented by changing the appearance of the collision threat icon 332 in any number of manners, such as increasing a size of one or more aspects of the collision threat icon 332, changing a color of one or more aspects of the collision threat icon 332, or applying any other modification intended to draw the attention of the pilot. In some embodiments, the alerts can increase in urgency depending upon the severity of the alert condition. For example, in the case of a higher level alert, certain aspects can be rendered in a predetermined warning color (e.g., red) or animation (e.g., flashing) can be applied to one or more aspects of the collision threat icon 332.
The systems and methods disclosed herein provide various benefits over certain existing systems and methods. For example, directing the searchlight 27 to illuminate the collision threat or indicate the escape route may assist the pilot during visual flight of the helicopter 10. Further, rendering the collision threat icon 332 and/or the escape route icon on the display device 32 may assist the pilot during instrument flight of the helicopter 10.
Those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Some of the embodiments and implementations are described above in terms of functional and/or logical block components (or modules) and various processing steps. However, it should be appreciated that such block components (or modules) may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system 100. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. In addition, those skilled in the art will appreciate that embodiments described herein are merely exemplary implementations.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Numerical ordinals such as “first,” “second,” “third,” etc. simply denote different singles of a plurality and do not imply any order or sequence unless specifically defined by the claim language. The sequence of the text in any of the claims does not imply that process steps must be performed in a temporal or logical order according to such sequence unless it is specifically defined by the language of the claim. The process steps may be interchanged in any order without departing from the scope of the invention as long as such an interchange does not contradict the claim language and is not logically nonsensical.
Furthermore, depending on the context, words such as “connect” or “coupled to” used in describing a relationship between different elements do not imply that a direct physical connection must be made between these elements. For example, two elements may be connected to each other physically, electronically, logically, or in any other manner, through one or more additional elements.
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.
Number | Date | Country | Kind
---|---|---|---
202311008704 | Feb 2023 | IN | national