Aircraft may encounter a variety of risks during flight, such as collision with other aircraft, equipment, buildings, birds, debris, terrain, and other objects. Self-piloted aircraft may collect and process sensor data to detect objects in the space around the aircraft that pose a collision risk or may otherwise cause damage or injury to an aircraft or its occupants. The detection, recognition, and/or avoidance of sensed objects may, in some instances, include one or more intelligent (e.g., autonomous) components capable of independently adapting to sensed data and determining a suitable path for the aircraft to follow or a suitable action to perform (e.g., climb, descend, turn) in order to avoid colliding with the objects. Such components may not rely on explicitly programmed instructions, instead applying machine learning techniques to progressively generate improved models and algorithms for perception and decision making.
In order for an aircraft to be certified as meeting airworthiness standards, any software and electronic hardware relating to safety-critical operations (such as collision avoidance) must meet certain standards promulgated by certification authorities, the International Organization for Standardization (ISO), and/or other standards-setting organizations. For example, DO-178 and DO-254, among other standards, may apply to regulate safety-critical software and hardware, respectively.
Because software based on machine learning models may not rely on a fixed set of code, several challenges arise with respect to meeting certification standards. Initially, once an aircraft has been certified to meet regulatory standards, the manufacturer of the aircraft may not be able to alter any safety-critical components on which certification was based, including software, without going through a new or supplementary certification process. The process of seeking recertification after each software update, however minor that update may be, may be prohibitively expensive, time-consuming, or otherwise impracticable. Further, the need to include certified safety-critical hardware on the aircraft may limit the hardware choices and configurations available to the aircraft manufacturer.
The disclosure can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the disclosure.
In the figures, the use of the same reference numbers in different figures indicates similar or identical items or features.
The present disclosure generally pertains to computing architectures for aircraft using autonomous machine learning algorithms for sensing and avoiding external objects. In some embodiments, an aircraft includes an aircraft monitoring system having sensors that are used to sense the presence of objects around the aircraft for collision avoidance, navigation, or other purposes. At least one of the sensors may be configured to sense objects within the sensor's field of view and provide sensor data indicative of the sensed objects. The aircraft includes one or more systems directed to collecting and interpreting the sensor data to determine whether an object is a collision threat, providing a recommendation or advisory of an action to be taken by the aircraft to avoid collision with the sensed object, and controlling the aircraft to avoid collision if necessary. The detection, recognition, and/or avoidance of sensed objects may, in some instances, include one or more intelligent (e.g., autonomous) components capable of independently adapting to new data and previously performed computations. Such components may not rely solely on explicitly programmed (e.g., pre-determined) instructions, instead applying machine learning techniques to iteratively train and generate improved models and algorithms for perception and decision making; after each iteration, the resulting models are frozen and deployed on the aircraft until they are replaced by the next update.
A sensing system may take in sensor information and output position and vector and/or classification information regarding a sensed object. A planning and avoidance system may take in the output of the sensing system and may generate an escape path or action that represents a route that the aircraft can follow to safely avoid a collision with the detected object. The escape path or action may, in some embodiments, be passed as an advisory (or guidance) to an aircraft control system that implements the advisory by controlling, as an example, the speed or direction of the aircraft, in order to avoid collision with the sensed object, to navigate the aircraft to a desired location relative to a sensed object, or to control the aircraft for other purposes.
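By way of illustration, the interface between the sensing system and the planning and avoidance system might be represented as in the following minimal sketch. The type names and fields (e.g., `Detection`, `Advisory`, `rate_fpm`) are illustrative assumptions, not structures defined by this disclosure; the example advisory mirrors the "climb at 500 ft/min" escape action discussed later herein.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    """Sensing-system output describing one sensed object."""
    position_m: Tuple[float, float, float]    # position relative to ownship, meters
    velocity_mps: Tuple[float, float, float]  # velocity vector, meters/second
    classification: Optional[str] = None      # e.g., "aircraft", "bird", "drone"

@dataclass
class Advisory:
    """Escape path or action passed to the aircraft control system."""
    action: str                     # e.g., "climb", "descend", "turn"
    rate_fpm: float                 # commanded vertical rate, feet/minute
    hold_until_clear: bool = True   # maintain regime until the alert clears

# Example: an advisory like "climb at 500 ft/min and hold until clear."
advisory = Advisory(action="climb", rate_fpm=500.0)
```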
In one embodiment, the architecture for a detect and avoid system is designed to comprise at least two avoidance algorithms, each using a machine learning solution to generate respective avoidance recommendations, though other embodiments may not necessarily use machine learning solutions. In an exemplary embodiment, a first algorithm (e.g., an Airborne Collision Avoidance System (ACAS) such as ACAS X, or DAIDALUS) may be directed to avoiding encounters with airborne aircraft. A second algorithm, responsible for lower-priority encounters, may be directed to avoiding encounters with other (non-aircraft) airborne obstacles (e.g., drones or birds) and ground obstacles (e.g., terrain, cranes, etc.). Depending on the type and number of objects being sensed, both algorithms may function to generate guidance, but only one guidance will be sent to the flight management system. If the detected objects are not aircraft, the ground and other airborne obstacles avoidance algorithm will generate the guidance for the flight management system. If the detected objects are aircraft, the airborne aircraft avoidance algorithm will generate the guidance for the flight management system. If the detected objects include both aircraft and non-aircraft objects, the ground and other airborne obstacles avoidance algorithm will generate a guidance that will be fed to the airborne aircraft avoidance algorithm instead of the flight management system. This input guidance and the aircraft object detection will be taken into account simultaneously by the airborne aircraft avoidance algorithm to generate a unique blended guidance that is sent to the flight management system. The guidance sent to the flight management system is used to control the aircraft in an appropriate manner. In one embodiment where ground and other airborne objects are not a concern for the environment in which the aircraft is used, only the airborne aircraft avoidance algorithm is used to generate guidance for the flight management system.
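This priority scheme can be sketched as follows. The function and parameter names (`airborne_logic`, `obstacle_logic`, `blend_with`) are hypothetical stand-ins for the two avoidance algorithms described above, not an implementation mandated by this disclosure.

```python
def route_guidance(aircraft_detections, other_detections,
                   airborne_logic, obstacle_logic):
    """Select the single guidance sent to the flight management system."""
    if aircraft_detections and other_detections:
        # Both categories sensed: the obstacle guidance feeds the
        # airborne-aircraft algorithm, which blends it with its own
        # aircraft-avoidance guidance into one output.
        obstacle_guidance = obstacle_logic(other_detections)
        return airborne_logic(aircraft_detections, blend_with=obstacle_guidance)
    if aircraft_detections:
        # Aircraft only: the airborne-aircraft algorithm has priority.
        return airborne_logic(aircraft_detections, blend_with=None)
    if other_detections:
        # Non-aircraft obstacles only: the obstacle algorithm supplies
        # the guidance directly.
        return obstacle_logic(other_detections)
    return None  # nothing sensed; no guidance issued
```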
In an exemplary embodiment, in addition to guidance, the second algorithm generates one or more inhibits or restrictions that are sent, in a feedback loop, as an input to the first (airborne aircraft) algorithm. The inhibits may include position and/or vector information regarding one or more locations or regions at which ground obstacles or non-aircraft airborne obstacles are located, or that should otherwise be avoided, for instance to maintain a certain separation of airspace between the aircraft and the detected objects. The first algorithm may use this inhibit information as a restriction input, so as to factor in the position of non-aircraft objects in its generation of avoidance guidance regarding airborne aircraft.
In conventional solutions using known standards for avoidance (e.g., ACAS X), avoidance guidance is limited to avoidance of airborne aircraft; other non-aircraft objects and ground objects are not factored therein. The systems and methods described herein provide a highly desired improvement to such conventional technology, allowing for the consideration of other sensed obstacles while still giving precedence and priority to aircraft avoidance.
In some embodiments, the detect and avoid system is designed with a sensing system that is certified to a high-level safety standard (as one exemplary embodiment, in accordance with safety classifications used by certification authorities, a Design Assurance Level such as DAL-B, though any standard may be used in other embodiments). The architecture for the sensing system may take in information from two different sensors (e.g., a camera and a radar), each certified to a mid-level safety standard (in the exemplary embodiment, DAL-C). The sensing system may feed the output of one of the sensors (a primary sensor) into two dissimilar machine learning algorithms, each functioning in parallel independently of the other, and each respectively certified to a lower-level safety standard (e.g., in the exemplary embodiment, DAL-D). The two dissimilar machine learning algorithms are independent in software, each being differently trained upon sensor data. Each machine learning algorithm outputs a respective detection based on the sensor data. A comparison or validation module determines whether the two independent and dissimilar machine learning algorithms output the same detection (or, e.g., detections within a certain discrepancy or error bound).
In the exemplary embodiment, if the outputs of the two algorithms are confirmed to overlap, the results of one or both of the machine learning algorithms are used by an avoidance system. If the outputs of the two algorithms are not confirmed to overlap (or exceed a preset error bound), the sensed output of the second of the sensors (a fallback sensor) is used by the avoidance system (in some embodiments, after being processed by a third, non-machine learning, algorithm). In the exemplary embodiment, the confirmed overlap of the two machine learning algorithms, each certified to a lower-level safety standard (together, a dual-algorithm solution), may be certifiable to a mid-level safety standard. Further, the dual-algorithm solution (certified at a mid-level safety standard), taken together with the presence of a fallback sensor certified to a mid-level safety standard, allows the architecture of the sensing system as a whole to be certified to a high-level safety standard.
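A minimal sketch of this selection logic follows, assuming detections are reported as 3-D positions and agreement is measured as Euclidean discrepancy against an assumed tolerance; the function name, the metric, and the tolerance value are illustrative choices, not requirements of this disclosure.

```python
import math

def select_detection(ml_out_a, ml_out_b, fallback_out, tolerance_m=50.0):
    """Return the detection the avoidance system should use.

    Uses the dual machine-learning result only when the two dissimilar
    algorithms agree within `tolerance_m`; otherwise falls back to the
    second (e.g., radar) sensor's output.
    """
    if ml_out_a is None or ml_out_b is None:
        return fallback_out  # one algorithm produced nothing: fall back
    discrepancy = math.dist(ml_out_a, ml_out_b)
    if discrepancy <= tolerance_m:
        return ml_out_a  # overlapping results; either (or a fusion) may be used
    return fallback_out  # outputs diverged beyond the error bound

# Usage: agreement within tolerance keeps the ML result.
assert select_detection((0, 0, 100), (5, 0, 100), (9, 9, 9)) == (0, 0, 100)
assert select_detection((0, 0, 100), (500, 0, 100), (9, 9, 9)) == (9, 9, 9)
```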
In conventional solutions, machine learning algorithms cannot, by their nature, be certified to high safety levels under the current certification standards; therefore, highly certified sensor systems (e.g., radar) and deterministic legacy software systems must be relied upon, whether as primary or backup systems. In some scenarios, radar and/or deterministic legacy software solutions may be less intelligent, accurate, or capable than modern machine learning solutions, and therefore, the performance of the aircraft's avoidance system may be capped, even as available technology for obstacle avoidance improves. Certification of any improvement is a cumbersome, expensive process that may take several months or years per update.
Contrary to conventional techniques, in the systems and methods described herein, the presence of a plurality of independent, dissimilar machine learning solutions, confirmed to produce overlapping, reliable detection guidance, allows for a high level of safety certification. In addition, the machine learning algorithms may provide more consistent performance and improved accuracy as compared to the information generated by a fallback sensor system.
Still further, known sensor technology (e.g., radar) may not be reliable enough to allow for certification of individual sensor hardware at a high-level safety standard. Accordingly, conventional aircraft implementations may use duplicated or redundant sensor technology to reach the required safety levels. The systems and methods described herein minimize redundancy of hardware, allowing for a high level of safety while minimizing the number of physical sensors (e.g., cameras, radar) that must be installed on an aircraft. Accordingly, the SWaP (size, weight, and power) of the hardware on the aircraft can be reduced, improving aircraft cost, complexity, and performance.
Aircraft 10 has one or more sensors 20 of a first type for monitoring space around the aircraft, and one or more sensors 30 of a second type for sensing the same space and/or additional spaces. Any number of sensors, and any number of types of sensors, may comprise the illustrated sensors 20, 30. These sensors may, in various embodiments, be any appropriate optical or non-optical sensor(s) for detecting the presence of objects, such as an electro-optical or infrared (EO/IR) sensor (e.g., a camera), a light detection and ranging (LIDAR) sensor, a radio detection and ranging (radar) sensor, transponders, inertial navigation systems and/or global navigation satellite systems (INS/GNSS), or any other sensor type that may be appropriate. For example, a sensor may be configured to receive a broadcast signal (e.g., through Automatic Dependent Surveillance-Broadcast (ADS-B) technology) from an object indicating the object's flight path.
For ease of illustration,
The object 15 may be any of various types of objects that aircraft 10 may encounter during flight, for example, another aircraft (e.g., an airplane or helicopter), a drone, a bird, debris, or terrain, or any other object that may damage the aircraft 10 or impact its flight if the aircraft 10 and the object 15 were to collide. The object 15 is depicted in
In identifying an escape path 35, the aircraft monitoring system 5 may use information from sensors 20, 30 about the sensed object 15, such as its location, velocity, and/or probable classification (e.g., that the object is a bird, aircraft, debris, building, etc.). Sensors 20, 30 are capable of detecting objects anywhere within their field of view. As mentioned above, the sensors have a full or partial field of view all around the aircraft (not specifically shown) in all directions; the field of view is not limited to the escape envelope 25 illustrated in
The components shown in
A combination of some components from the sensors 20, 30, the sensing system 205, and the planning and avoidance system 220 function together as a “detect and avoid” element 210. The detect and avoid element 210 may perform processing of sensor data (as well as other data, such as flight planning data (e.g., terrain and weather information, among other things) and/or data received from aircraft control system 240 regarding an escape envelope) to generate an avoidance recommendation (or advisory) for an action to be taken by the aircraft controller 245. Data in support of this avoidance recommendation may be sent from the sensing system 205 to an avoidance element 224 (of planning and avoidance system 220), which applies one or more avoidance algorithms thereto to generate an optimized escape path. In some embodiments, the avoidance algorithm may be deterministic or probabilistic in nature. The avoidance element 224 may, in some embodiments, employ a machine learning algorithm to classify and/or detect the location of an object 15 in order to better assess its possible flight performance, such as speed and maneuverability, and threat risk. In this regard, the system 5 may store object data that is indicative of various types of objects, such as birds or other aircraft that might be encountered by the aircraft 10 during flight, and may identify and/or classify sensed objects. It is possible to identify not just categories of objects (e.g., bird, drone, airplane, helicopter, etc.) but also specific object types within a category.
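As an illustrative sketch of such stored object data, the lookup below pairs object categories with assumed flight-performance attributes that an avoidance algorithm might consult; the category names, numeric bounds, and fallback profile are placeholders, not values taken from this disclosure.

```python
# Hypothetical stored object data: per-category performance profiles.
OBJECT_DATA = {
    "bird":       {"max_speed_mps": 25.0,  "maneuverability": "high"},
    "drone":      {"max_speed_mps": 45.0,  "maneuverability": "high"},
    "airplane":   {"max_speed_mps": 250.0, "maneuverability": "low"},
    "helicopter": {"max_speed_mps": 90.0,  "maneuverability": "medium"},
}

def flight_profile(classification: str) -> dict:
    """Look up the expected flight performance of a classified object;
    unknown categories get a conservative (worst-case) profile."""
    return OBJECT_DATA.get(
        classification,
        {"max_speed_mps": 250.0, "maneuverability": "high"},  # assume the worst
    )

# Usage: a sensed object classified as a drone.
profile = flight_profile("drone")
assert profile["maneuverability"] == "high"
```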
The avoidance algorithm(s) may, in some embodiments, also consider information from flight planning system 228. Such information may include, for example, a priori data 222, e.g., terrain information about the placement of buildings or other known static features, information about weather, airspace information, including known flight paths of other aircraft (for example, other aircraft in a fleet), and/or other relevant predetermined (or pre-discoverable) information. Such information may also include remote operation data 226, which may include information received from remote systems (e.g., air traffic control, operator information, etc.).
The planning and avoidance system 220 may provide its generated path information and/or other signals to the mission processing element 242 of aircraft control system 240. As one example of many, the planning and avoidance system may generate an escape action such as “climb at 500 ft/min and maintain regime until an advisory alert is turned off,” though any appropriate type of escape path or action may be used. The escape path or action may, in some embodiments, be passed as an advisory to an aircraft control system that implements the advisory by controlling, as an example, the speed or direction of the aircraft, in order to avoid collision with the sensed object, to navigate the aircraft to a desired location relative to a sensed object, or to control the aircraft for other purposes. In some embodiments, the aircraft controller 245 may perform suitable control operations of the aircraft 10 by providing signals or otherwise controlling a plurality of actuators 246 that may be respectively coupled to one or more flight control surfaces 248, such as rudders, ailerons, elevators, flaps, spoilers, brakes, or other types of aerodynamic devices typically used to control an aircraft. Although a single actuator 246 and a single flight control surface 248 are depicted in
It will be understood that the aircraft controller 245 is a reactive system, taking in the recommendation of detect and avoid system 210 and reacting thereto. In response to receiving the recommendation, the mission processing element 242 may be configured to provide a signal to aircraft controller 245 to take an action in response to the threat, such as providing a warning to a user (e.g., a pilot or passenger) or controlling the aircraft control system 240 (e.g., actuators 246 and the propulsion system 247) to change the velocity (speed and/or direction) of the aircraft 10. As an example, the aircraft controller 245 may control the velocity of the aircraft 10 in an effort to follow an escape path 35, thereby avoiding a sensed object 15. Alternatively, the aircraft controller 245 may navigate to a desired destination or other location based on the position, known or anticipated direction, and/or speed of the sensed object 15.
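A sketch of this reactive behavior might look as follows; the advisory here is a plain dict for self-containment, and the `controller` methods shown are hypothetical stand-ins for the warning and actuator interfaces of aircraft controller 245 rather than a real API.

```python
def react_to_advisory(advisory: dict, controller) -> None:
    """React to a detect-and-avoid recommendation: warn occupants and
    adjust velocity (speed and/or direction) to follow the escape path."""
    if advisory is None:
        return  # no threat advised; continue the current flight plan
    controller.warn_occupants(advisory["action"])  # e.g., alert pilot/passengers
    if advisory["action"] == "climb":
        controller.set_vertical_rate_fpm(advisory["rate_fpm"])
    elif advisory["action"] == "turn":
        controller.set_heading_deg(advisory["heading_deg"])
```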
The various components of the aircraft monitoring system 5 may be implemented in hardware or a combination of hardware and software/firmware. As an example, the aircraft monitoring system 5 may comprise one or more application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and/or microprocessors programmed with software or firmware, or other types of circuits for performing the described functionalities. Systems 210, 220, and 240 may in some embodiments be implemented on discrete computing hardware and/or software, or in alternate embodiments, some components may be implemented with the same computing hardware or may share processors or other resources. Any appropriate configuration may be used, for example based on considerations such as weight and power consumption, communication latency, processing and/or computational limitation, or varying safety requirements for different systems.
The second source of data provided to avoidance system 224 is the sensing system 205, which may take in the input from one or more sensors and output position and vector information regarding one or more objects or obstacles sensed therein. In the illustrated embodiment, sensing system 205 is shown to collect information from an electro-optical (EO) sensor (e.g., a camera) and a radio detection and ranging (radar) sensor; however, in other embodiments, any sensor data may be used, such as data from one or more of an electro-optical or infrared (EO/IR) sensor, a light detection and ranging (LIDAR) sensor, a radar sensor, other sensor types, or any combination thereof. Non-cooperative aircraft (e.g., drones, certain other aircraft, and other airborne objects) do not broadcast their own position information, and accordingly, detect and avoid system 210 uses this sensor system to detect such traffic, as well as other obstacles on the ground or in the air. In some embodiments, sensors such as transponders, inertial navigation systems, and/or global navigation satellite systems (INS/GNSS) may be used to collect information that is variously used by detect and avoid system 210, for example to derive the absolute position of a sensed object with regard to the aircraft 10. As the sensed data is safety critical, the sensors providing such data may be required to meet one or more safety standards. In one embodiment, the safety standards may be based on or derived from classifications used by the certification authorities, e.g., the Design Assurance Levels (DALs) "A" through "E", each successive level being less stringent. Other standards may be used in other embodiments, whether promulgated by certification authorities, the International Organization for Standardization (ISO), and/or other standards-setting organizations.
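The DAL ordering can be illustrated with a small sketch. The enum below encodes only that DAL-A is the most stringent level and DAL-E the least, which matches how the levels are defined; the class and helper are otherwise illustrative.

```python
from enum import IntEnum

class DAL(IntEnum):
    """Design Assurance Levels: A is most stringent, E least."""
    A = 1
    B = 2
    C = 3
    D = 4
    E = 5

def satisfies(component: DAL, required: DAL) -> bool:
    """A component meets a requirement if it is at least as stringent."""
    return component <= required

assert satisfies(DAL.B, DAL.C)        # DAL-B exceeds a DAL-C requirement
assert not satisfies(DAL.D, DAL.B)    # DAL-D alone cannot fill a DAL-B role
```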
The generated outputs of module 314 and system 205 are typically position and/or vector data regarding objects or sensed obstacles in the airspace around aircraft 10. In embodiments, these outputs may also variously include classification data regarding the sensed objects and obstacles, identifying categories of objects (e.g., bird, drone, airplane, helicopter, tree, mountain, crane/equipment, unknown, etc.) and/or specific object types within a category (e.g., aircraft type or class) or other characteristics of the object (e.g., payload, nature of movement (e.g., known or erratic), etc.). The outputs are sent to the avoidance system 224 which may apply one or more algorithms to such data to generate an avoidance recommendation or advisory for how to avoid the sensed object, if necessary. The avoidance system may also rely upon known terrain and/or obstacle data (stored, e.g., in database 340) that may not necessarily be sensed by the system 205. The recommendation or advisory is passed to flight management system 330 (which may, in some embodiments, take in information from the terrain and obstacle data 340), which system functions to control the aircraft to avoid the sensed obstacles (if appropriate). The flight management system 330 may transmit information about the aircraft 10's position, speed and flight plan back to detect and avoid system 210. The flight management system 330 may also generate coordinate guidance data, which data is sent to the module 312 to send to any cooperative (e.g., intelligent) aircraft within the relevant airspace.
Because the results of the two machine learning algorithms validate one another, the validation module 440, taking in the outputs of two independent and dissimilar modules certified to a DAL-D standard (a dual-algorithm output), can together be certified to a DAL-C standard. The validation module 440 and the output of the radar 414 can be considered together by one or more other components of the sensing system 205 or the avoidance system 224, and that aggregated result 465 may be certified to a DAL-B standard. It may be generally understood from the architecture of
It may be generally understood, of course, that any number of sensors and other components and/or any relevant safety standard(s) may be used in other embodiments. In the discussion of
The exemplary machine learning algorithms 420 and 430 are dissimilar to each other. As illustrated, the algorithms function in parallel, taking in the same image data; however, each is independent in software and is trained differently upon the sensor data, using different training datasets. In some embodiments, the algorithms 420 and 430 may additionally be independent in hardware, such that each uses a respective processor (or set of processors). In other embodiments, the algorithms 420, 430 may share hardware but be arranged to be logically independent from each other. In some alternative embodiments, the code of algorithms 420, 430 may include position-independent code, or may be stored in different sections of a memory. Accordingly, in the exemplary embodiment, the respective datasets, neural network architecture, and/or the testing and validation of the results may differ between the algorithms 420 and 430.
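The axes of dissimilarity described above might be captured in configuration form, as in the sketch below; the specific architecture, dataset, and processor names are invented placeholders.

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class DetectorConfig:
    """Configuration axes along which the two algorithms must differ."""
    architecture: str    # dissimilar neural network architectures
    training_set: str    # different (e.g., disjoint) training datasets
    validation_set: str  # separately curated testing/validation data
    processor: str       # optionally, independent hardware

DETECTOR_A = DetectorConfig("convnet_v1", "train_a", "val_a", "cpu_0")
DETECTOR_B = DetectorConfig("transformer_v1", "train_b", "val_b", "cpu_1")

# Dissimilarity check: the two configurations share no axis.
assert all(getattr(DETECTOR_A, f.name) != getattr(DETECTOR_B, f.name)
           for f in fields(DetectorConfig))
```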
In some embodiments, rather than applying both algorithms to exactly the same images, camera 412 may output multiple frames in a short period of time (e.g., two frames per second), and the frames may be alternately processed by algorithms 420 and 430. As a result, although the two algorithms function in parallel, the throughput and real-time performance are not diminished.
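A sketch of that alternating dispatch, with trivial stand-in detectors, might be as follows; the function name and detector callables are illustrative assumptions.

```python
from itertools import cycle

def dispatch_frames(frames, detector_a, detector_b):
    """Alternate incoming camera frames between the two dissimilar
    detectors so that running both does not halve throughput."""
    detectors = cycle([detector_a, detector_b])
    return [next(detectors)(frame) for frame in frames]

# Usage with trivial stand-ins (real detectors would return detections).
outputs = dispatch_frames(range(4), lambda f: ("A", f), lambda f: ("B", f))
assert outputs == [("A", 0), ("B", 1), ("A", 2), ("B", 3)]
```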
In the exemplary embodiment, if the outputs of the two algorithms are confirmed to overlap, the result of one or both of the machine learning algorithms is used by an avoidance system. If the outputs of the two algorithms are not confirmed to overlap, the sensed output of the second sensor(s) (a fallback sensor) is used by the avoidance system 224. In some embodiments, the fallback sensor data may first be processed by one or more (non-machine learning) algorithms.
In the exemplary embodiment, the confirmed overlap of the two machine learning algorithms, each certified to a lower-level safety standard (together, a dual-algorithm solution), increases the assurance of the machine learning result beyond the level to which an individual algorithm could be certified under the existing frameworks, due to the nature of how a machine learning algorithm operates.
In the embodiment of
In an exemplary embodiment, airborne aircraft encounters logic 510 may be any Airborne Collision Avoidance System (ACAS) (e.g., ACAS X), or any other safety-rated algorithm(s) directed to avoiding encounters with airborne aircraft. The airborne aircraft encounters logic 510 is limited to the detection of aircraft, e.g., planes, helicopters, and the like. A detected aircraft may be likely to be carrying passengers, and therefore, the avoidance of collision between aircraft 10 and other detected aircraft is of paramount importance; avoiding collision with aircraft carrying other types of cargo or payload is similarly important. However, many other obstacles may exist in the airspace around aircraft 10, including airborne objects such as birds or drones, and ground obstacles within the aircraft 10's flight plan. These may include, for instance, trees, equipment (e.g., cranes), mountains or other terrain, or other objects, whether at a high altitude or, in circumstances involving takeoff and landing, lower altitudes. Avoidance system 224 takes such risks into consideration through the application of ground obstacles and airborne obstacles encounters logic 520, containing algorithm(s) directed to avoiding encounters with other non-aircraft airborne obstacles and ground obstacles.
In an exemplary embodiment, sensing system 205 and module 314 for detecting cooperative aircraft may transmit position and/or vector information to the system 224 indicating one or more detected objects. In some embodiments, the transmitted data may also include classification information that may be used to categorize the sensed objects as aircraft or non-aircraft, and/or into other more granular categories. In the case that an object is sensed, both algorithms 510 and 520 may function to generate guidance for a flight management system. In an exemplary embodiment, if the fuse guidance module 560 receives an output 530 from airborne aircraft encounters logic 510, fuse guidance module 560 selects that output for transmission to the flight management system 330 and ignores or discards any output 540 from ground obstacles and airborne obstacles encounters logic 520, as output 530 represents guidance regarding a detected aircraft, a higher-priority target. In cases where the ground obstacles and airborne obstacles encounters logic 520 and the airborne aircraft encounters logic 510 are both generating guidance in response to sensed objects of the categories for which they are respectively responsible, the guidance of the ground obstacles and airborne obstacles encounters logic 520 is sent to the airborne aircraft encounters logic 510 (to be factored into a combined or blended guidance output) and is discarded by the fuse guidance module 560, such that only the blended guidance 530 provided by the airborne aircraft encounters logic 510 is transmitted to the flight management system 330. If fuse guidance module 560 does not receive an output 530 from airborne aircraft encounters logic 510, and only receives an output 540 from ground obstacles and airborne obstacles encounters logic 520, then fuse guidance module 560 uses the output 540, which represents a non-aircraft detection, for transmission to the flight management system 330.
Airborne aircraft encounters logic 510 includes, in an exemplary embodiment, four modules 512-518; however, other embodiments may include any number of modules and/or any configuration of functionalities therebetween.
Ground obstacles and airborne obstacles encounters logic 520 includes, in an exemplary embodiment, four modules 522-528; however, other embodiments may include any number of modules and/or any configuration of functionalities therebetween.
In one embodiment, if the object is known (or is determined by algorithms 510 and/or 520) not to be an aircraft, the airborne aircraft encounters logic 510 does not completely process such data and therefore does not generate a unique guidance; instead, the ground obstacles and airborne obstacles encounters logic 520 generates the guidance. Fuse guidance module 560 therefore receives only one input, output 540 from ground obstacles and airborne obstacles encounters logic 520, and uses that output to transmit guidance to the flight management system 330.
Even in a case that an aircraft is detected, the avoidance system architecture allows consideration of ground or other airborne objects through the use of a feedback loop. While the output 540 of the ground obstacles and airborne obstacles encounters logic 520 will be discarded by fuse guidance module 560, module 528 generates guidance in a form readily interpretable by the flight management system as well as in the form of inhibits or restrictions 550 that are sent, in a feedback loop, as an input to the module 512 of airborne aircraft encounters logic 510. The inhibits 550 may set out position and/or vector information (and, in some embodiments, classification information) regarding one or more locations or regions at which ground obstacles or non-aircraft airborne obstacles are located. In some embodiments, the inhibits 550 may cover a space larger or broader than the particular locations of the detected objects, so as to provide a sufficient buffer to ensure safety of the vehicle 10 and/or to control the speed and angle of movement to avoid excessive force or trauma to the passengers inside vehicle 10. Airborne aircraft encounters logic 510 may use this information as a restriction input, so as to factor in the position of non-aircraft objects in its generation of avoidance guidance regarding airborne aircraft. That is, module 518 may, in generating guidance for how to control aircraft 10 to avoid collision with an aircraft, limit the guidance to further avoid the positions or areas of airspace specified by the inhibits 550, as such areas have been determined by logic 520 to likely contain other obstacles.
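As an illustrative sketch, an inhibit might be built from a detected obstacle's position by growing a buffered keep-out region around it, as below; the field names and the buffer value are assumptions chosen for illustration, not parameters specified by this disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Inhibit:
    """A region of airspace the airborne aircraft logic should avoid."""
    center_m: Tuple[float, float, float]  # position of the detected obstacle
    radius_m: float                       # buffered keep-out radius

def make_inhibit(obstacle_position, obstacle_extent_m, buffer_m=150.0):
    """Grow the keep-out region beyond the obstacle itself to preserve
    separation and keep the resulting maneuver gentle for occupants."""
    return Inhibit(center_m=obstacle_position,
                   radius_m=obstacle_extent_m + buffer_m)

# Usage: a crane of ~30 m extent yields a 180 m keep-out radius.
inhibit = make_inhibit((120.0, -40.0, 35.0), obstacle_extent_m=30.0)
assert inhibit.radius_m == 180.0
```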
The feedback data 550 is sent by ground obstacles and airborne obstacles encounters logic 520 each time the logic 520 makes a detection of the proper category, such that, in an exemplary embodiment, module 528 outputs both guidance 540 and inhibits 550 in parallel, regardless of whether guidance 540 will be selected by fuse guidance module 560. In this manner, the detections by the ground obstacles and airborne obstacles encounters logic 520 can be considered by the logic 510, which is, in some embodiments, in line with a proven software standard (e.g., ACAS X).
As shown by
While the term "database" or "repository" is used herein, such structures variously may be, e.g., a cache, database, other data structure, or any suitable type of repository. Memories 622, 632 may be any suitable storage medium, whether volatile or non-volatile (e.g., RAM, ROM, EPROM, EEPROM, SRAM, flash memory, disk or optical storage, magnetic storage, or any other tangible or non-transitory medium), that stores information that is accessible by a processor 624, 634. While
It will be apparent that in the embodiment of
Note that the detect and avoid logic 210 or components thereof, when implemented in software, can be stored and transported on any computer-readable medium for use by or in connection with an instruction execution apparatus that can fetch and execute instructions. In the context of this document, a “computer-readable medium” can be any means that can contain or store code for use by or in connection with the instruction execution apparatus.
The foregoing is merely illustrative of the principles of this disclosure, and various modifications may be made by those skilled in the art without departing from the scope of this disclosure. The above-described embodiments are presented for purposes of illustration and not of limitation. The present disclosure can also take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
As a further example, variations of apparatus or process parameters (e.g., dimensions, configurations, components, process step order, etc.) may be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single embodiment described herein, but rather should be construed in breadth and scope in accordance with the appended claims.