This application claims priority to Australian patent application no. 2021900347 filed on 12 Feb. 2021, the entire contents of which are hereby incorporated by reference herein.
This invention relates to systems and methods for monitoring activities in an aviation environment, including near and at airports.
Airports and aircraft typically employ various systems that help to prevent imminent or hazardous situations that have the potential to develop into incidents, serious incidents or accidents near or at the airport. Aviation safety incidents, serious incidents or accidents are herein referred to as occurrences. These systems (commonly referred to as ‘safety nets’) usually have the capability to detect, identify, and track movements of aircraft, vehicles and personnel within the operating environment near and at the airport and can include both ground-based and airborne safety nets.
Ground-based safety nets are provided as an important component of the Air Traffic Management system to allow air traffic controllers to manage air traffic. Using primarily Air Traffic Services surveillance data, they provide warning times of up to two minutes. Upon receiving an alert, air traffic controllers are expected to promptly assess the situation and take appropriate action.
Advanced Surface Movement Guidance & Control System (A-SMGCS) is a system providing routing, guidance and surveillance for the control of aircraft and vehicles to prevent traffic conflicts near and at the airport, and typically comprises several different systems/safety nets. Its surveillance infrastructure can consist of Non-Cooperative Surveillance (e.g. surface movement radar, microwave sensors, optical sensors, commercial cellular networks) and Cooperative Surveillance (e.g. multilateration systems, Automatic Dependent Surveillance-Broadcast (ADS-B)). The A-SMGCS system focuses on the prevention and mitigation of air traffic conflicts near and at the airport. Specifically, it can include one or more of the following ground-based safety nets:
Short Term Conflict Alert (STCA); this is a ground-based safety net intended to assist the air traffic controller in preventing collision between aircraft by generating, in a timely manner, an alert of a potential or actual infringement of separation minima.
Area Proximity Warning (APW); this is a ground-based safety net which uses surveillance data and flight path prediction to warn the air traffic controller when an aircraft is, or is predicted to be, flying into a volume of notified airspace, such as controlled airspace, danger areas, prohibited areas and restricted areas.
Minimum Safe Altitude Warning (MSAW); this is a ground-based safety net intended to warn the air traffic controller about increased risk of controlled flight into terrain accidents by generating, in a timely manner, an alert of aircraft proximity to terrain or obstacles.
Approach Path Monitor (APM); this is a ground-based safety net intended to warn the controller about increased risk of controlled flight into terrain accidents by generating, in a timely manner, an alert of aircraft proximity to terrain or obstacles during final approach.
Airborne safety nets are fitted on aircraft and provide alerts and resolution advisories directly to the pilots. Warning times are generally shorter, up to 40 seconds. Pilots are expected to immediately take appropriate avoiding action. Specifically, these can include one or more of the following airborne safety nets:
Enhanced/Ground Proximity Warning System (GPWS/EGPWS) reduces the risk of controlled flight into terrain by providing flight crews with timely, accurate information about terrain and obstacles in the area. The system uses various aircraft inputs and an internal database to predict and warn flight crews of potential conflicts with obstacles or terrain.
High Energy Approach Monitoring Systems (HEAMS) warns the pilots if the energy predicted at touch down exceeds a predetermined safe level.
Runway Overrun Protection Systems (ROPS) provides pilots with a real-time constantly updated picture in the navigation display of where the aircraft will stop on the runway in wet or dry conditions.
The current systems comprising the presently known ground and airborne safety nets have a number of disadvantages. First, these systems are confined to detecting and monitoring the five occurrence types described above, i.e., traffic conflicts, airspace infringement, controlled flight into terrain, unsafe approach and runway overrun. Yet there are many other potential operational aviation safety risks, associated with other occurrence types that occur near and at airports, which are not yet well monitored by existing systems and/or procedures performed by human operators: for example, runway incursion, runway undershoot, unstable approach, missed approach/go-around, foreign object damage (e.g. runway debris) and ground strike.
Further, the present ground-based and airborne safety nets also necessitate the use of multiple independent and complex systems, which are expensive and resource-intensive to install, operate and maintain. Specifically, they require a significant number of sensors of multiple types to be fitted to aircraft, and/or ground vehicles, and/or ground locations near and at the airport, and require system integration; this leads to long installation periods and thus interruption to normal airport operation. Further, there is a high implementation and operating cost, including training for airport controllers and airline staff, with sensors required to be fitted to every aircraft, ground vehicle and crew member to provide comprehensive coverage. It is accordingly expensive and difficult to maintain, upgrade, retrofit or develop new capability, and any such maintenance, upgrade or retrofit is also likely to disrupt operation. In particular, software installations or upgrades, in addition to the hardware installations or upgrades mentioned above, are not easy to introduce.
Moreover, the present ground-based and airborne safety nets have limited object detection, classification and tracking/positioning capabilities and therefore limited situation awareness. In particular, their detection and tracking capabilities are limited to point-wise tracking and positioning of individual aircraft and their relative locations to certain reference points/areas, e.g., runway boundaries, entry and exit points and the like. Moreover, object details such as object features (aircraft landing gear, engines), shape, size and class, and object classes other than aircraft, are not well monitored by these safety nets.
The present ground-based and airborne safety nets further have limited safe operation assessment capability, which is constrained by the limited amount of information acquired, a limited capability to understand and assess complex behaviours/activity patterns, and a limited capacity to simultaneously perform multiple safe operation assessments.
Examples of the invention seek to solve or at least ameliorate one or more disadvantages of the existing ground and airborne-based safety nets. In particular examples, the invention may preferably provide one or more of the following:
The above references to and descriptions of prior proposals or products are not intended to be, and are not to be construed as, statements or admissions of common general knowledge in the art. In particular, the above prior art discussion does not relate to what is commonly or well known by the person skilled in the art, but assists in the understanding of the inventive step of the present invention of which the identification of pertinent prior art proposals is but one part.
According to an aspect of the present invention there is provided a system for monitoring activities in an aviation environment, the system including: at least two sensors, wherein each sensor is adapted to obtain sensor information of at least one object, the at least two sensors being located in at least one pre-determined location in the aviation environment, the sensor information obtained from one sensor being different from the other(s); and a processing system configured to receive said information from the sensors and further configured to process said information to monitor said at least one object, wherein the system is further configured to compare the information associated with the at least one object with predetermined safety operation criteria, and to generate an alert signal when the compared information indicates unsafe operation.
The processing system can be configured to combine the different information from the at least two sensors by associating the sensor information with time information. Preferably, the processing system is configured to combine the different information from the at least two sensors by associating the sensor information with spatial information.
Preferably, combining information from the at least two sensors comprises data fusion.
Preferably, data fusion comprises sensor calibration and/or time-syncing.
The at least two sensors preferably comprise two types of sensors.
The processing system can be configured to calculate depth (i.e. range) information by using sensor information from a first sensor of the at least two sensors. Preferably the processing system is configured to determine identity and/or classification information of at least one object by using sensor information from a second sensor of the at least two sensors.
The at least two sensors can include light detection and ranging sensors (LiDAR), or other types of ranging sensors, and camera sensors. Other types of ranging sensors may include radar, sonar or ultrasonic rangefinders. The processing system may be configured to calculate range information from sensor information from at least one LiDAR sensor, or other type(s) of ranging sensor, via analysis of LiDAR sensor or other types of ranging sensor information. The processing system may be configured to calculate identity and/or classification information from at least one camera sensor via the application of a machine-learning and/or deep-learning detection and/or classification process.
The processing system is preferably configured to associate the range/depth information and identity/classification information from the at least two sensors to identify at least one object in the field of view of the at least two types of sensors.
The processing system is configured to associate at least one detected and/or identified object with time information, thereby allowing measurement and/or tracking of at least one physical property of the at least one object over time. Preferably, the processing system is configured to predict the at least one object's at least one physical property from tracked physical property information. Physical properties may include location, travel direction, velocity, acceleration, distance travelled/motion track/travel path, elevation and/or interactions with other objects. They may also include relative properties, such as the relative velocities or relative distances of a group of objects from another object, for example. The comparison of the information associated with the at least one object with predetermined safety operation criteria can include measured physical property information and predicted physical property information from the at least one object.
The processing system may be configured to generate an alert signal when the compared information indicates a risk of a predicted occurrence of unsafe operation. Preferably unsafe operation includes occurrences on or near a runway, occurrences involving ground operation, occurrences involving aircraft control, occurrences involving environment and/or infrastructure. Occurrences can include aviation safety incidents, serious incidents or accidents.
Preferably, the alert signals are represented and/or communicated as a visual and/or audio signal. Preferably, the alert signals enable human operators to make informed decisions and implement actions.
The at least two sensors can be housed in a monitoring unit and the monitoring unit is one of a plurality of spaced-apart monitoring units. One or more of said plurality of said monitoring units can be mounted at one or more locations throughout the aviation environment near and at an airport including runway, taxiway, apron, ramp areas, passenger boarding bridges, ground service vehicles, ground support vehicles, ground crew, airport building structures including gates, and aircraft.
Preferably, the system comprises two (2) or more monitoring units.
Preferably, the system comprises a number of monitoring units sufficient to provide comprehensive volumetric surveillance coverage of the aviation environment.
Preferably, the system comprises a number of monitoring units sufficient to substantially remove, or eliminate, blind spots in the surveillance coverage.
Preferably, the number of monitoring units depends on the layout of the aviation environment (e.g. number of runways, runway length, apron size), activity type (commercial flight, training) and risk profile of a particular airport.
The at least one object can be a moving or stationary object in the at least one location in the aviation environment, including aircraft, ground service vehicles, ground support vehicles, ground crew, runways, taxiways, apron and ramp areas, passenger boarding bridges, airport building structures including gates, and the operating environment near and/or on the runway.
According to another aspect of the present invention there is provided a method for monitoring activities in an aviation environment, the method including the steps of: obtaining sensor information of at least one object from at least two sensors, the at least two sensors being located in at least one pre-determined location in the aviation environment, wherein the sensor information obtained from one sensor is different from the other(s); receiving said information from the sensors at a processing system configured to process said information to monitor said at least one object; and comparing the processed information associated with the at least one object with predetermined safety operation criteria, and generating an alert signal when the compared information indicates unsafe operation.
According to yet another aspect of the present invention there is provided a system for monitoring activities in an aviation environment near and at an airport, the system including: an aviation operating environment near and at the airport with a plurality of aircraft, runways, taxiways, aprons, ramp areas, passenger boarding bridges, ground service vehicles, ground support vehicles, ground crew, and airport building structures including gates, and other objects such as animals and remotely piloted aircraft; a plurality of monitoring units mounted at one or more locations throughout the aviation environment near and at the airport, including one or more of the following locations: a runway, taxiway, apron, ramp areas, passenger boarding bridges, ground service vehicles, ground support vehicles, ground crew, airport building structures including gates, and aircraft, wherein the system is configured to produce continuous and real-time data representing the aviation activities within said aviation environment near and at the airport from said one or more locations, and wherein each monitoring unit comprises at least two types of sensors, each sensor being configured to produce/obtain/transmit sensor information of at least one object from at least one pre-determined location in the aviation environment near and at the airport, the sensor information produced/obtained/transmitted from one type of sensor being different from the other(s); an artificial intelligence-based data processing system comprising artificial intelligence models and other processing algorithms configured to receive and fuse said real-time data representing the aviation activities within said aviation environment near and at the airport from said one or more locations in a secure encrypted form, and further configured to process said information to detect, identify, track and monitor said at least one object in the operational aviation environment near and at the airport from said one or more locations, wherein the system is further configured to compare the information associated with the at least one object in the said aviation environment from said one or more locations with predetermined safety operation criteria, and to generate an alert signal when the compared information indicates unsafe operation, and wherein the system is further configured to produce representations of said aviation activities within said aviation environment near and at the airport from said one or more locations, and to communicate said one or more representations in a secure encrypted form; and devices to receive said communicated one or more representations, located in one or more of, or any combination of, the following: a cockpit of said aircraft, air traffic control towers/centres, ground control locations and/or airport emergency response team locations.
According to still yet another aspect of the present invention there is provided a method for monitoring aviation activities in an aviation environment, the method including the steps of: providing a plurality of monitoring units, each comprising at least two types of sensors, namely at least a camera and at least a LiDAR, the monitoring units being positioned in one or more locations throughout the aviation environment near and at an airport; producing/obtaining and transmitting sensor information of at least one object from the at least two types of sensors from at least one monitoring unit, the at least one monitoring unit being located in at least one pre-determined location in the aviation environment near and at the airport, the sensor information being in a secure encrypted form, wherein the sensor information obtained from one sensor type is different from the other sensor type(s); receiving said information from the sensors by an artificial intelligence-based data processing system comprising artificial intelligence models and other processing algorithms configured to fuse and process said information to detect, identify, track and monitor said at least one object in the aviation environment near and at the airport; comparing the processed information associated with the at least one object with at least one predetermined safety operation criterion; generating an alert signal when the compared information indicates unsafe operation; producing representations of said aviation activities within said aviation environment near and at the airport from said one or more locations; communicating said one or more representations in a secure encrypted form; and providing devices to receive said communicated one or more representations, located in one or more of, or any combination of, the following: a cockpit of said aircraft, air traffic control towers/centres, ground control locations and airport emergency response team locations.
According to a further aspect of the invention, there is provided a system for monitoring activities in an aviation environment, the system including:
The system may be further configured to generate an alert signal when the compared information indicates a risk or likelihood of at least one occurrence type in each of a third occurrence group comprising ground operation occurrence types, and in a fourth occurrence group comprising environment occurrence types. The ground operation occurrence types may comprise one or more of, or any combination of: foreign object damage/debris, jet blast/propeller/rotor wash, or taxiing collision. The environment occurrence types may comprise one or more of, or any combination of: icing, lightning strike, or animal/bird strike.
The system may be further configured to generate an alert signal when the compared information indicates a risk or likelihood of at least one occurrence type in a fifth occurrence group comprising infrastructure occurrences, including runway lighting occurrences or other infrastructure type occurrences.
The range sensor may comprise a LiDAR sensor and the processing system is preferably configured to calculate range information of at least one object of the at least two objects by using sensor information from the LiDAR sensor.
The processing system is preferably configured to determine identity and/or classification information of at least one object of the at least two objects by using sensor information from the camera sensor and processing said sensor information using an artificial intelligence-based processing method. Preferably, the processing system is configured to apply a deep- and/or machine-learning detection process to calculate the identity and/or classification information.
The processing system is preferably configured to associate the range information and the identity and/or classification information from the sensors to identify the at least one object in the field of view of the sensors. The processing system may be configured to associate the at least one identified object with time information, thereby providing measurement and/or tracking of at least one physical property of the at least one identified object over time. The processing system is preferably configured to predict a physical property of the at least one identified object from tracked physical property information. The comparison of the information associated with the at least one identified object with the predetermined safety operation criteria preferably includes comparing or otherwise applying measured physical property information and predicted physical property information from the at least one identified object.
Preferably, the measured and predicted physical property includes the aircraft's position, travel direction, velocity, acceleration, altitude and attitude, and other physical properties of aircraft of interest, and physical properties of other objects of interest including boundaries, markings, a centreline, a runway threshold, ground crew, a passenger, a ground vehicle, infrastructure and/or building structures.
The system is preferably configured to monitor the aircraft ground location and/or measure and/or calculate an estimate or prediction of the aircraft position or motion on the runway, aircraft position deviation from runway centreline, distance between aircraft and runway boundary and/or runway end, and a predicted time and/or position for runway excursion including veer-off.
The system is preferably configured to monitor an aircraft approach flight path and an aircraft landing configuration, to measure and/or calculate an estimate or prediction of acceptable deviation of measured flight path from an authorised or ideal flight path, and a likelihood of achieving safe touch-down or landing.
The system may be configured to monitor and/or track the aircraft location, and/or measure and/or calculate an estimate or prediction of the lift-off position, and a last safe stopping point along a take-off roll.
The system may be configured to receive and process additional information to assist with and/or facilitate calculation of the at least two objects' physical properties, an estimation or prediction of their physical properties and/or the safe operation criteria. The additional information includes one or more of, or any combination of, the following: runway data, including runway length, boundaries, entries and exits, surface characteristics such as material or friction coefficients, and/or surface conditions such as wet or ice/snow; meteorological data such as wind, temperature and the like; and aircraft data, such as aircraft type and capabilities/characteristics, weight, flying phase and/or intended or reference position or motion.
The system may be further configured to compare the measured or predicted physical properties of the aircraft and the runway to the safe operation criteria to determine the potential runway excursion risks. Preferably, the likelihood of runway excursion is predicted by monitoring distance between aircraft landing gears/fuselage/wingtip and runway side boundary for veer off and by monitoring runway distance remaining for runway overrun.
The system may be configured to receive information from one or more existing aviation safety-net systems to facilitate processing of information, and calculation of measured operational physical properties and prediction thereof, to act as a redundancy to the existing systems. The at least two objects may include one or more of, or a combination of the following:
ground vehicles, ground crew, taxiway, apron ramp areas, passenger boarding bridges, airport building structures, infrastructure, and the operating environment near and on the runway.
The plurality of locations in the aviation environment preferably includes at least one location on the aircraft.
The plurality of locations in the aviation environment includes one or more of, or any combination of, the following: on or near a taxiway; on or near an apron, a ramp area and/or a passenger boarding bridge; on or near a ground service vehicle, a ground support vehicle and/or ground crew; and/or on or near an airport building and/or infrastructure.
In accordance with a further aspect of the invention, there is provided a method for monitoring activities in an aviation environment, the method including the steps of:
Preferably, the range sensor is a LiDAR sensor.
In accordance with a further aspect of the invention, there is provided a system for monitoring activities in an aviation environment, the system including:
In accordance with a further aspect of the invention, there is provided a method for monitoring aviation activities in an aviation environment, the method including the steps of:
The features described in relation to one or more aspects of the invention are to be understood as applicable to other aspects of the invention. More generally, combinations of the steps in the method of the invention and/or the features of the system of the invention described elsewhere in this specification, including in the claims, are to be understood as falling within the scope of the disclosure of this specification.
The methods and/or systems of the invention may be applied as new systems or methods. However, the systems and/or methods of the invention are also suited to retrofit, or partly retrofit, existing systems or methods, including in relation to existing aviation safety nets. The invention is conceived to, in some forms, take advantage of such existing systems and methods in order to assist in delivering one or more benefits of the invention.
Other aspects of the invention are also disclosed.
The present invention will now be described, by way of non-limiting example, with reference to the accompanying drawings in which:
Preferred features of the present invention will now be described with particular reference to the accompanying drawings. However, it is to be understood that the features illustrated in and described with reference to the drawings are not to be construed as limiting on the scope of the invention.
Referring now to
The exemplary host service 4 comprises one or more host servers that are connected to a network 6, and therefore communicate over that network 6 by wired or wireless communication in a conventional manner, as will be appreciated by those skilled in the art. The host servers are configured to store a variety of information collected from the users/units 16, 18, 20, 22 and 24.
The host servers are also able to house multiple databases necessary for the operation of the methods and systems of the present invention. The host servers comprise any of a number of servers known to those skilled in the art and are intended to be operably connected to the network so as to operably link to a computer system associated with the users 16, 18, 20 or third parties 22 or 24. The host servers can be operated and supplied by a third-party server provider, or alternatively can be hosted locally by the processing system 4.
The host server 4 typically includes a central processing unit (CPU) and/or at least one graphics processing unit (GPU) 8 or the like, which includes one or more microprocessors, and memory 10 and a storage medium 12 for housing one or more databases, operably connected to the CPU and/or GPU and/or the like. The memory 10 includes any combination of random-access memory (RAM) or read-only memory (ROM), and the storage medium 12 comprises magnetic hard disk drive(s) and the like.
The storage medium 12 is used for long-term storage of program components as well as storage of data relating to the customers and their transactions. The central processing unit and/or graphics processing unit 8, which is associated with the random-access memory 10, is used for containing program instructions and transient data related to the operation of services provided by the host service 4. In particular, the memory 10 contains a body of instructions 14 for implementing at least part of a method for safety operation assessment in an aviation environment. The instructions 14 enable multi-platform deployment of the system 2, including on desktop computers and edge devices such as the NVIDIA DRIVE or Jetson embedded platforms. The instructions 14 also include instructions for providing a web-based user interface which enables users to remotely access the system 2 from any client computer executing conventional web browser software.
Each user 16, 18, 20, 22, 24 is able to receive communication from the host service 4 via the network 6 and is able to communicate with the host service 4 via the network 6. Each user 16, 18, 20, 22, 24 may access the network 6 by way of a smartphone, tablet, laptop or personal computer, or any other electronic device. The host service 4 may be provided with a dedicated software application, stored in the host servers, which is run by the CPU and/or GPU and/or the like. Once installed, the software application of the host service 4 provides an interface that enables the host service 4 to facilitate communication of information and/or alerts, including sensor information, raw or processed, to a predetermined user 16, 18, 20.
In a preferred embodiment, the computing network 6 is the internet, or a dedicated mobile or cellular network in combination with the internet, such as a GSM, CDMA, UMTS, WCDMA or LTE network and the like. Other types of networks such as an intranet, an extranet, a virtual private network (VPN) and non-TCP/IP based networks are also envisaged.
With reference to
Referring particularly to
In a preferred embodiment, one of the at least two sensor types is a Light Detection and Ranging (LiDAR) sensor 26. LiDAR sensors 26 are particularly advantageous in extracting accurate range information of objects in their field of view. In a more preferred embodiment, another of the at least two sensor types is a light detector such as a camera 28, for example a colour or infrared camera or similar, which can provide information about the at least one object of interest and/or its surrounding environment that enables object classification and tracking. Most preferably, each monitoring unit 22 has one each of the LiDAR sensor 26 and a camera-type sensor 28, thereby advantageously providing range information of one or more objects and the surrounding environment by the LiDAR sensor, allowing accurate motion and position measurement, and providing visual information of one or more objects and the surrounding environment by both the LiDAR and camera-type sensors, but primarily by the camera-type sensor, which facilitates accurate, precise and reliable object classification/recognition. Further, the two sensor types 26, 28 work together to provide the information in both normal and challenging light conditions, such as fog, low light, sun glare, smoke-filled conditions and the like, within the sensors' field of view and preferably up to 250 m from the monitoring unit. In particular, the LiDAR sensor may be adapted to work in foggy or rainy conditions by using 1505 nm wavelengths at higher power and/or by using Frequency-Modulated Continuous Wave (FMCW) or full-waveform techniques. Other sensor types 30 may be provided in the monitoring unit 22, and/or information acquired using other sensor types may be provided, for the purposes of enhancing the system 2 or providing redundancies. Such information may include meteorological data, surface movement data (incl. runway, taxiway, apron) and aircraft data. The sensor types may include ADS-B and surface movement radar.
The sensors/monitoring units 26, 28, 30, 22 are also capable of producing and transmitting information from multiple locations to the processing system 4, which is configured to receive and process the information associated with the aviation activities in the operating aviation environment, particularly near and at airports. Preferably, the information is transmitted to the processing system 4 in a secure manner.
The system 2 is configured to combine the information from the at least two types of sensors 26, 28, 30 acquired using at least one monitoring unit by associating the sensor information with time information, preferably by the processing system 4. The system 2 is also configured to combine the information from the at least two sensors 26, 28, 30 by associating the sensor information with spatial or distance or location information, for example GPS coordinates or other positional information, range information and the like. The combination or ‘fusing’ of the sensor information with time information may be obtained by time synchronisation or temporal calibration, while the combination or ‘fusing’ of sensor information with spatial or distance or location information may be obtained by sensor calibration. At least one monitoring unit 22 can be employed to provide sensor information that can be fused into temporal and spatial data associated with objects in at least one predetermined location in the aviation environment, particularly near and at airports.
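By way of non-limiting illustration only, the following Python sketch shows one possible way of associating camera frames with LiDAR sweeps by time information; the function and parameter names (pair_by_timestamp, max_skew_s) are illustrative assumptions rather than part of the invention.

```python
# Hypothetical sketch: pair each camera frame with the nearest-in-time
# LiDAR sweep. Names and the skew tolerance are illustrative only.
from bisect import bisect_left

def pair_by_timestamp(camera_frames, lidar_sweeps, max_skew_s=0.05):
    """Associate each camera frame with the closest LiDAR sweep in time.

    camera_frames, lidar_sweeps: lists of (timestamp_s, data) tuples,
    each sorted by timestamp. Pairs whose clocks differ by more than
    max_skew_s are discarded as unsynchronised.
    """
    lidar_times = [t for t, _ in lidar_sweeps]
    pairs = []
    for t_cam, frame in camera_frames:
        i = bisect_left(lidar_times, t_cam)
        # Candidate sweeps immediately before and after the frame time.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(lidar_times)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(lidar_times[k] - t_cam))
        if abs(lidar_times[j] - t_cam) <= max_skew_s:
            pairs.append((frame, lidar_sweeps[j][1]))
    return pairs
```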
More than one monitoring unit 22 is employed in areas such as runways 40, aprons 44 and ramp areas 46 to monitor the same predetermined location, where the multiple monitoring units 22 are spaced apart, thereby allowing the combination of sensor information from the multiple monitoring units, temporally synchronised and spatially calibrated, as illustrated in
The processing system 4 is an artificial intelligence-based system which is configured to receive and process the sensor information to provide real-time sensing, recognition/classification and tracking of aircraft 16, ground personnel 20, ground vehicles 18 and other objects, recognition of operating environment, e.g. runway 40, taxiway 42, apron 44 and volume above these surfaces, and their features, e.g. runway boundary 49, marking 50, centreline 52, runway end 47, runway threshold 48, aircraft engine 51, aircraft landing gear 53, object motion and position estimation. The sensor information may be fused, i.e. temporally synchronised and/or spatially calibrated once received by the processing system 4 or alternatively it may be fused beforehand.
The terms “artificial intelligence” and “intelligent algorithms” are used herein to refer to and encompass systems of data processing and analysis that are conducted by computers capable of harvesting large amounts of possible input data, including images and other information from monitoring and sensing devices, that may be processed, analysed, and categorized based on a set of rules and then may be communicated so that appropriate action may be taken, whether automatically by a system receiving the processed and analysed data or manually by at least one human operator such as Air Traffic Control officer, pilot and emergency response team.
Referring to
With reference to Table 2 and
In this particular example, see step 104 exemplified by Table 2, the processing system 4 is configured to receive sensor information from the camera 28 and LiDAR 26 and to combine the two types of sensor information by data fusion methods, including by sensor calibration and/or time-syncing.
Preferably, the data fusion, and preparation of the data therefor, includes acquisition of extrinsic, intrinsic and distortion parameters of sensors (i.e. LiDAR and camera), followed by quantification of sensor errors.
Preferably, time synchronisation may be achieved through the use of internal/external timer source(s) that are coupled with the sensors, and the reading and comparison of timestamps that are associated with individual image and point cloud data using the processing system.
Preferably, the LiDAR information, a 3-D point cloud of the objects within the aviation environment, is projected on the camera image or vice versa.
Preferably, the LiDAR information, acquired from multiple LiDAR sensors that are located at various locations, is registered/stitched/fused using algorithms such as Iterative Closest Point (ICP), normal-distributions transform (NDT), phase correlation, coherent point drift (CPD).
Preferably, the image information, acquired from multiple cameras that are located at various locations, is registered/stitched/fused using algorithms such as feature based image registration. The abovementioned operations may be incorporated in alternative examples or embodiments of the present invention.
It will be understood that the person skilled in the art would be able to conduct data fusion (e.g. sensor calibration, time-syncing) by a variety of methods or algorithms.
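As a non-limiting sketch of the projection mentioned above, the following Python function projects LiDAR points into a camera image given the extrinsic and intrinsic calibration parameters acquired as described earlier; lens distortion is omitted for brevity and the argument names are assumptions for illustration.

```python
import numpy as np

def project_lidar_to_image(points_xyz, T_cam_lidar, K):
    """Project 3-D LiDAR points into the camera image plane.

    points_xyz:  (N, 3) array of points in the LiDAR frame.
    T_cam_lidar: (4, 4) extrinsic transform from LiDAR to camera frame.
    K:           (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates and (M,) ranges for points lying
    in front of the camera.
    """
    n = points_xyz.shape[0]
    homo = np.hstack([points_xyz, np.ones((n, 1))])   # homogeneous coords
    cam = (T_cam_lidar @ homo.T).T[:, :3]             # points in camera frame
    in_front = cam[:, 2] > 0.0                        # drop points behind lens
    cam = cam[in_front]
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]                    # perspective divide
    ranges = np.linalg.norm(cam, axis=1)              # metric range per pixel
    return pix, ranges
```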
In the next step, ‘Stage 2’ in the example described in Table 2, the system 2, and more preferably the processing system 4, is configured to process the sensor information to separate the foreground from the background via ground plane segmentation process(es). In this example, the 3-D LiDAR point cloud obtained in the previous steps is used to separate foreground objects, such as aircraft or ground-based support vehicles, from background objects, i.e. the runway. In particular, the processing system 4 can perform the separation or ground plane segmentation by techniques such as ground plane estimation; however, it is expected that other known techniques could be utilised.
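As a non-limiting illustration of one such known technique, the following sketch fits a dominant ground plane with RANSAC using the Open3D library; the distance threshold is an assumed value, not a prescribed one.

```python
# Illustrative ground plane segmentation via RANSAC plane fitting (Open3D).
import numpy as np
import open3d as o3d

def split_ground_and_objects(points_xyz, dist_thresh_m=0.15):
    """Separate the runway/apron-level ground plane from foreground objects."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    # Fit the dominant plane, assumed here to be the ground surface.
    plane_model, inlier_idx = pcd.segment_plane(
        distance_threshold=dist_thresh_m, ransac_n=3, num_iterations=1000)
    ground = pcd.select_by_index(inlier_idx)
    foreground = pcd.select_by_index(inlier_idx, invert=True)
    return np.asarray(ground.points), np.asarray(foreground.points)
```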
Next, the processing system 4 is configured to form at least one object from the 3-D LiDAR point cloud. In this example, the 3-D point cloud object is formed by combining the received outputs produced by the separation of the foreground and background in the previous step (Stage 2 Step A in Table 2) with the object detected and classified from the camera image, which is processed in Stage 3 Step A in Table 2. Preferably, the object formed by combination is formed by a 3-D point grouping or clustering process, thereby forming a 3-D space, although it will be understood that other processes or techniques could be equally employed. Stage 3 Step A, the camera image processing step, is processed independently of the Stage 2 steps and therefore can be performed temporally before, or in parallel with, Stage 2 Step A, such that the results of Stage 3 Step A are available and ready for use before the commencement of Stage 2 Step B. The results of Stage 3 Step A are an input to Stage 2 Step B.
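One possible grouping or clustering process, offered purely as an assumed example, is Euclidean clustering of the foreground points with DBSCAN; the eps and minimum-point values below are illustrative.

```python
# Illustrative sketch: group foreground LiDAR points into per-object clusters.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_objects(foreground_xyz, eps_m=1.5, min_points=10):
    """Return a list of (M_i, 3) point arrays, one cluster per object.

    Points labelled -1 by DBSCAN are noise and are discarded.
    """
    labels = DBSCAN(eps=eps_m, min_samples=min_points).fit_predict(foreground_xyz)
    return [foreground_xyz[labels == k] for k in set(labels) if k != -1]
```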
In the next processing step, Stage 3, as illustrated in Table 2, the processing system 4 is configured to detect and/or identify and classify objects in the aviation environment. In the example ‘Stage 3’ summarised in Table 2, the processing system 4 can first process the camera sensor information received from the camera images/video frames to detect and/or identify the objects. In particular, the artificial intelligence-based data processing system 4 employs machine- or deep-learning-based object detection and/or classification models, which are trained, validated, verified and optimised for detection and classification of objects involved in aviation activities in an aviation environment near and at airport. The object detection and/or classification models that can be utilised include You Only Look Once (YOLO) or Fully Convolutional One-Stage (FCOS) models although it is expected that other artificial intelligence-based models could equally be used instead for similar effect.
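By way of a hedged, non-limiting sketch of this camera-side step, an off-the-shelf YOLO implementation could be applied as follows; the weights file "aviation.pt" is a hypothetical model assumed to have been trained, validated and optimised on aviation object classes as described above.

```python
# Sketch of camera-frame object detection using the ultralytics YOLO package.
from ultralytics import YOLO

model = YOLO("aviation.pt")  # assumed custom-trained aviation weights

def detect_objects(frame):
    """Run detection on one video frame; return (box, class name, confidence)."""
    results = model(frame)[0]
    return [(b.xyxy[0].tolist(), results.names[int(b.cls)], float(b.conf))
            for b in results.boxes]
```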
In data processing Stage 3 Step B, the processing system 4 can also process the LiDAR sensor information, which has been formed into clustered 3-D points in Stage 2 Step B, for object identification and/or recognition. Example techniques for 3-D object recognition in the 3-D space can include the spin image method or the PointSeg network, although other known methods could be utilised.
In the next processing step, Stage 3 Step C, the processing system 4 can then combine the processed camera sensor information and processed LiDAR sensor information, which can result in a detection confidence score being associated with the sensor information. The use of a detection confidence score enhances the detection and classification accuracy by reducing false detections and by increasing the detection rate. As an example of reducing false detections, consider two aircraft that have similar configurations and features but differ in size, i.e. both are configured with a cylindrical fuselage with two jet engines, but one aircraft is 30 metres long whereas the other is 60 metres long. If the larger aircraft is located closer than the smaller aircraft to the camera, information acquired from the camera and subsequently processed by the processing system might not be able to accurately differentiate the size difference between the two aircraft. The information about these two aircraft acquired from the LiDAR, on the other hand, can provide accurate size and location information for these two different types of aircraft regardless of the difference in distance between the aircraft and the LiDAR sensor. As an example of increasing the detection rate, while LiDAR information provides a high positioning accuracy of 0.05 metres, a spatial resolution of 1.5 metres at a distance of 200 metres may be sufficient to detect and identify an aircraft with a length of 30 metres, but may not be able to detect and identify objects with dimensions below 1.5 metres, such as some ground equipment, e.g. a tow bar 45, ground crew, or a cargo/baggage cart. By combining these two types of information acquired with the camera and LiDAR sensors, the detection and classification accuracy may therefore be enhanced by reducing the effects of the lack of range information in camera information and of the lack of visual detail and absence of colour in the 3-D LiDAR point cloud.
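A minimal sketch of one possible combination rule is given below, assumed for illustration only; the weighting scheme and threshold values are not prescribed by this example.

```python
# Illustrative fusion rule (an assumption, not the prescribed method):
# weight the combined confidence toward whichever sensor is more reliable
# for the object's range.
def fuse_confidence(cam_conf, lidar_conf, range_m, near_m=50.0, w_near=0.7):
    """Combine per-sensor detection confidences into one score.

    Within near_m the camera resolves fine visual detail, so it gets
    weight w_near; at longer range the LiDAR's metric accuracy dominates.
    """
    w_cam = w_near if range_m <= near_m else 1.0 - w_near
    return w_cam * cam_conf + (1.0 - w_cam) * lidar_conf
```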
Once the system 2 has detected, identified and/or classified the objects in the aviation environment near and at airports, the system 2 is then configured to associate the motion of at least one object, preferably multiple objects, over time, as exemplified in the example Stage 4 of Table 2. Further, the system 2 is also configured to provide an estimation of the motion of the object(s). For the purposes of object tracking and motion estimation, the system 2 is configured to associate moving objects in one information acquisition and at least one other subsequent information acquisition. One information acquisition refers to one camera/video frame and one LiDAR frame, or its equivalent, which are temporally synchronised and spatially calibrated. The system 2, particularly the processing system 4, is configured to process the sensor information from the 2-D camera/video images and/or the LiDAR point clouds from the 3-D space to associate sensor information from each sensor from one information acquisition (i.e. camera/video frame and/or LiDAR point clouds) with a subsequent or previous information acquisition. Preferably, the processing system 4 is able to process sensor information associated with at least two sequential camera/video frames at a particular moment when information acquisitions are received by the data processing system 4 continuously over time. In a particularly preferred embodiment, the processing system 4 employs the Kalman filter method to process the 2-D camera/video images, and the segment-matching-based method or a joint probabilistic data association (JPDA) tracker to process the 3-D space data (LiDAR point clouds). It would be understood, however, that other models or methods to predict the objects' physical properties could be substituted for the ones named above.
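As a non-limiting sketch of the Kalman filter method mentioned above, a constant-velocity tracker for a single object could be set up as follows using the filterpy library; the time step and noise magnitudes are assumptions for illustration.

```python
# Minimal constant-velocity Kalman tracker for one object in the ground plane.
import numpy as np
from filterpy.kalman import KalmanFilter

def make_cv_tracker(dt=0.1):
    """State [x, y, vx, vy]; measurements are fused (x, y) positions."""
    kf = KalmanFilter(dim_x=4, dim_z=2)
    kf.F = np.array([[1, 0, dt, 0],
                     [0, 1, 0, dt],
                     [0, 0, 1, 0],
                     [0, 0, 0, 1]], dtype=float)  # constant-velocity model
    kf.H = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0]], dtype=float)  # observe position only
    kf.P *= 100.0      # initial state uncertainty (assumed)
    kf.R *= 0.05**2    # measurement noise, ~LiDAR positioning accuracy
    kf.Q *= 0.1        # process noise (assumed)
    return kf

tracker = make_cv_tracker()
for z in [(10.0, 5.0), (10.9, 5.4), (11.8, 5.9)]:  # fused positions per frame
    tracker.predict()
    tracker.update(np.array(z))
# tracker.x now holds the estimated position and velocity of the object.
```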
In a final processing stage (Stage 5 of the example processing method in Table 2), the system 2 is configured to combine the outputs of the processed sensor information from the previous steps/stages to measure, calculate and provide an output of the estimation or prediction of one or more objects' physical properties, such as the position, acceleration, speed and/or travel direction of any object's motion. Furthermore, the system 2 is configured to compare one predicted object's physical properties to another's, for example a distance or predicted distance between the aircraft 16 and another object of interest, e.g. the runway centreline 52, boundary 49, runway threshold 48 or other aircraft 16, and to output information associated with these properties of the compared objects. The system 2 is able to assess the properties of the compared objects against predetermined safe operation criteria and to generate an alert (in step 106, see
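The mapping from a predicted occurrence to an alert can be illustrated, purely as an assumed example consistent with the alert timings described later in this specification, as follows.

```python
# Illustrative tiered alerting; the 15 s / 5 s thresholds mirror the runway
# excursion example given later and are assumptions, not prescribed values.
def alert_level(seconds_to_occurrence, medium_s=15.0, high_s=5.0):
    if seconds_to_occurrence is None:     # no occurrence predicted
        return "safe"
    if seconds_to_occurrence <= high_s:
        return "red"                      # imminent: alert pilots/ATC/ERT
    if seconds_to_occurrence <= medium_s:
        return "yellow"                   # developing: alert controllers
    return "safe"
```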
Although the examples described herein refer to an aviation environment, particularly near and at airports, the system 2 can be utilised in a number of other environments requiring monitoring of multiple moving and static objects, such as maritime operations, road/autonomous driving operations, mining operations, industrial plants, logistics centres, manufacturing factories, aviation operations that are not near and at airports, space operations and the like.
Details of the various predetermined safe operation criteria are provided in the following paragraphs and in particular from Tables 3 and 4.
Table 3 sets out an example of the occurrence types and groups that occur in an aviation environment, particularly near and at an airport (left column), such as the runway (A1 to A7), ground operations (B1 to B14), aircraft control (C1 to C20), environment (D1 to D12) and infrastructure (E1 to E3) occurrence groups. Multiple occurrence types can be monitored within each occurrence group category. In one example, the occurrence type runway excursion A1 is one of the occurrence types classified under the runway occurrence group. These occurrence types are level 3 occurrence types, which are defined and used by the Australian Transport Safety Bureau (ATSB). Advantageously, the system may be configured to monitor up to 59 ATSB level 3 occurrence types, i.e. A1 to E3 as exemplified in Table 3, in comparison with the five occurrence types which are typically monitored using current aviation safety monitoring systems.
The right column of Table 3 shows the multiple-object detection and tracking data processing capabilities and brief safe operation criteria required for each occurrence group, including object types, classes, the different physical (both current and predicted) properties of each monitored object, and the types of risks and accompanying safe operation criteria associated with each occurrence type. Table 4 provides additional detail on the particular safe and unsafe operation criteria (left column) for each of the occurrence types, and in the right-hand column there are provided examples of the assessment criteria/method for each of the safety operation criteria.
For example, in the example of the occurrence type A1 runway excursion illustrated in
Further, in step 204 the system 2 is configured to process the sensor information, including using the fused information, to identify/classify/detect at least one object, such as the aircraft 16 and runway 40. Further, the processing system 4 can calculate the at least one object's physical properties, and predict the at least one object's physical properties. For example, the aircraft's position, travel direction, velocity, acceleration, altitude and attitude are monitored, as well as the distance between the aircraft 16 of interest and the object of interest, e.g. the runway 40, in particular its surface, boundary 49, markings 50, centreline 52 and runway threshold 48, to calculate the runway distance remaining, the distance between the aircraft and the runway boundaries, centreline and the like.
Additional information 54 can be received by the processing system 4 to assist and/or facilitate calculation of the objects' physical properties and estimation/prediction of their physical properties and/or safe operation criteria, including runway data, such as length, boundaries, entries and exits, surface characteristics such as material or friction coefficients, and/or surface conditions such as wet or ice/snow; meteorological data such as wind, temperature and the like; and aircraft data, such as aircraft type and capabilities/characteristics, weight, flying phase and/or intended or reference position or motion.
Further, in a next step 206 of the method 200, the system 2 is configured to measure or calculate an estimate or prediction of the particular physical properties of the aircraft 16 and runway 40 which may relate to a particular predetermined safety criterion, i.e. A1. For aircraft landings, as illustrated in
As illustrated in
In the example shown in
The system 2 is also configured to store the particular safe operation criteria in step 208, such as the calculated safe lift-off position for a particular aircraft type, for example, under specific aircraft loading, runway and meteorological conditions during take-off, and the calculated safe stopping position, i.e. where the aircraft speed becomes low enough to ensure a safe stop before the end of the runway and/or the aircraft can safely exit from the runway, for a particular aircraft type under specific aircraft loading, runway and meteorological conditions during landing. The system 2 can also be configured to calculate the acceptable limits for the lift-off, veer-off, touch-down and safe stopping positions, i.e. acceptable runway distance remaining, and/or to calculate and predict the safe operation criteria as required.
The system 2 is then configured to compare the measured or predicted physical properties of the aircraft 16 and runway 40 to the safe operation criteria to determine the potential runway excursion risks. In particular, the system 2 in step 212 can predict the likelihood of runway excursion by monitoring the distance between the aircraft landing gear/fuselage/wingtip and the runway side boundary for veer-off, and by monitoring the runway distance remaining for runway overrun. If the comparison shows that the measured and predicted physical properties of the aircraft and runway are within safe operating criteria, then the system 2 can determine that the likelihood of risk of runway excursion is low, and an indication/alert may be generated to a user to confirm safe aviation operation.
Alternatively, the system 2 is configured to determine that the comparison shows that the risk of runway excursion is medium or high, i.e. a runway excursion may occur in the next 15 seconds, or in the next 5 seconds, and the system is further configured to transmit at least one alert to at least one user accordingly. The user(s) could include air traffic control (ATC), pilots, emergency response teams, and the like. Finally, if an excursion has occurred, the system 2 is configured to send an alert to at least one user, particularly emergency response teams and relevant authorities.
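As a non-limiting sketch of one way the overrun branch of this comparison could be computed, a constant-deceleration model may be assumed; the braking figure and margin below are illustrative assumptions, not prescribed values.

```python
# Compare the predicted stopping point of the landing roll with the
# runway distance remaining (constant-deceleration assumption).
def overrun_risk(speed_mps, decel_mps2, runway_remaining_m,
                 medium_margin_m=300.0):
    """Return 'low', 'medium' or 'high' runway-overrun risk."""
    stop_dist = speed_mps**2 / (2.0 * decel_mps2)  # v^2 / (2a)
    if stop_dist > runway_remaining_m:
        return "high"                              # predicted overrun
    if stop_dist > runway_remaining_m - medium_margin_m:
        return "medium"                            # little margin remaining
    return "low"

# e.g. 70 m/s ground speed, 2.5 m/s^2 braking, 900 m remaining -> "high"
print(overrun_risk(70.0, 2.5, 900.0))
```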
Lastly, as illustrated in
The system 2 in step 216 is also able to receive information from existing safety nets to facilitate the processing of information and the calculation of measured operational physical properties and prediction thereof, and to act as a redundancy. For example, the system can be configured to receive information from runway overrun protection systems (ROPS).
In a further example of the system 2 and method 300 according to preferred embodiments of the present invention, as illustrated in
For example, in the more specific example of the occurrence type B3 taxiing collision/near collision illustrated in
Further, the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one object's physical properties, and to predict the at least one object's physical properties. For example, the aircraft's position, travel direction, velocity, acceleration, altitude and attitude are monitored, as well as the distance between the aircraft of interest and the object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.
Additional information 54 can be received by the processing system 4 to assist and/or facilitate calculation of the objects' physical properties and estimation/prediction of their physical properties and/or safety operation criteria, including airport data, such as boarding gates and bridges, apron and ramp area boundaries, surface characteristics such as material or friction coefficients, and/or surface conditions such as wet or ice/snow; meteorological data such as wind, temperature and the like; aircraft data, such as aircraft type and capabilities/characteristics, weight, flying phase and/or intended or reference position or motion; and ground crew/vehicle data.
Further in a next step 306 of the method 300, the system 2 is configured to measure or calculate an estimate or prediction of the physical properties of the aircraft(s) 16, airport infrastructure and ground vehicles/crew 18, 20. For an aircraft taxiing to and from the boarding gates and bridges, as illustrated in
As illustrated in
The system 2 in step 308 is also configured to store the particular safe operation criteria, such as the defined and/or calculated safe distances between the objects, i.e. the aircraft 16, ground vehicles/crew 18, 20 and infrastructure. The system 2 can also be configured to calculate the acceptable limits for the same and/or to calculate and predict the safe operation criteria as required.
The system 2, in the next step 312, is then configured to compare the measured or predicted physical properties of the aircraft 16 and other objects to the safe operation criteria to determine the potential collision risks. In particular, the system 2 can predict the likelihood of collisions or near collisions by monitoring the distance between any two or more objects, i.e. the distance between the aircraft(s) 16 and any ground crew/vehicles 18, 20 and airport infrastructure 42, 44. If the comparison shows that the measured and predicted physical properties of the aircraft 16, ground crew/vehicles 18, 20 and airport infrastructure are within safe operating criteria, then the system 2 in step 312 can determine that the likelihood of risk of collision or near collision is low, and an indication/alert may be generated to at least one user to confirm safe aviation operation.
Alternatively, the system 2 is configured to determine that the comparison shows that the risk of collision or near collision is medium or high, i.e. a collision or near collision may occur in the next 120 seconds, or in the next 20 seconds, and the system is further configured to transmit at least one alert to at least one user accordingly, i.e. a yellow alert or a red alert. The user(s) could include air traffic control (ATC), pilots, emergency response teams, and the like. Finally, if a collision has occurred, the system is configured to send an alert to at least one user, particularly emergency response teams and relevant authorities.
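One possible way of predicting such a collision or near collision, given purely as an assumed illustration, is a closest-point-of-approach calculation over the tracked positions and velocities of any two objects.

```python
# Closest-point-of-approach (CPA) between two tracked objects, assuming
# constant velocity over the prediction horizon.
import numpy as np

def time_and_distance_at_cpa(p1, v1, p2, v2):
    """p*, v*: 2-D position (m) and velocity (m/s) of two objects.

    Returns (t_cpa_s, d_cpa_m): time of closest approach (clamped to the
    future) and the separation at that time.
    """
    dp = np.asarray(p2, float) - np.asarray(p1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    denom = float(dv @ dv)
    t = 0.0 if denom < 1e-9 else max(0.0, -float(dp @ dv) / denom)
    return t, float(np.linalg.norm(dp + dv * t))

# Aircraft taxiing at 5 m/s with a tug crossing its path at 3 m/s:
t, d = time_and_distance_at_cpa([0, 0], [5, 0], [60, -30], [0, 3])
# Alert if d falls below the stored safe-separation criterion within 120 s.
```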
Lastly, as illustrated in
The system 2 in step 316 is also able to receive information from existing safety nets, e.g. traffic conflict alerts from STCA, to facilitate the processing of information and the calculation of measured operational physical properties and prediction thereof, and to act as a redundancy.
In yet a further example of the system 2 and method 400 according to preferred embodiments of the present invention,
The system 2 is first configured in step 402 to receive sensor information from the at least two sensors, i.e. the LiDAR and camera sensors 26, 28, in the form of multiple monitoring units 22, in the aviation environment. The system 2, the processing system 4 in particular, is configured to fuse the two sensors' information from each monitoring unit 22 by a time-syncing process and/or sensor calibration process.
Further, the system 2 in step 404 is configured to use the fused information to detect and/or identify at least one object, such as the aircraft 16, to detect and classify at least one object feature, such as the aircraft landing gear status, i.e. the landing gear 53 in an extended or a retracted position, to calculate the at least one object's physical properties, and to predict the at least one object's physical properties. For example, the aircraft's position, travel direction, velocity, acceleration, altitude and attitude are monitored.
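By way of illustration, the monitored kinematic properties could be estimated from successive fused position fixes by finite differences, and predicted forward under an assumed constant-acceleration model. The following sketch, including its numbers, is an illustrative assumption only.

```python
import numpy as np

def estimate_kinematics(positions, timestamps):
    """Estimate velocity and acceleration from the three most recent
    position fixes (x, y, z) using finite differences."""
    p = np.asarray(positions[-3:], float)
    t = np.asarray(timestamps[-3:], float)
    v1 = (p[1] - p[0]) / (t[1] - t[0])
    v2 = (p[2] - p[1]) / (t[2] - t[1])
    a = (v2 - v1) / ((t[2] - t[0]) / 2.0)
    return v2, a

def predict_position(position, velocity, acceleration, dt):
    """Constant-acceleration prediction of position dt seconds ahead."""
    position = np.asarray(position, float)
    return position + velocity * dt + 0.5 * acceleration * dt ** 2

# Example: three fixes one second apart, then a five-second look-ahead
v, a = estimate_kinematics([(0, 0, 100), (40, 0, 98), (82, 0, 96)],
                           [0.0, 1.0, 2.0])
print(predict_position((82, 0, 96), v, a, dt=5.0))
```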
Additional information 54 can be received by the processing system 4 to assist and/or facilitate calculation of the objects' physical properties and estimation/prediction of their physical properties and/or safe operation criteria, including runway data, such as length, boundaries, entries and exits, surface characteristics such as material or friction coefficients, and/or surface conditions such as wet or ice/snow; meteorological data such as wind, temperature and the like; and aircraft data, such as aircraft type and capabilities/characteristics, weight, flight phase and/or intended or reference position or motion.
Further, in a next step 406 of the method 400, the system 2 is configured to measure or calculate an estimate or prediction of the physical properties of the aircraft 16 and the runway 40. In particular, the system 2 is configured to measure and/or calculate an estimate or prediction of the approach flight path, the tracked current aircraft location, and the deviation of path profile parameters such as the lateral and vertical profile, airspeed, bank angle, vertical speed, altitude and attitude.
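One possible sketch of the deviation monitoring is to resolve the tracked aircraft position into lateral and vertical offsets from a straight reference approach path. The east/north/up coordinate convention, the 3-degree glide path and the function name below are assumptions made for illustration.

```python
import math

def approach_deviation(aircraft_xyz, threshold_xyz, approach_heading_deg,
                       glide_path_deg=3.0):
    """Return (lateral_m, vertical_m) deviation of the aircraft from a
    straight reference approach path anchored at the runway threshold.

    aircraft_xyz, threshold_xyz: (east, north, up) positions in metres.
    approach_heading_deg: direction of flight towards the threshold,
    measured clockwise from north.
    """
    dx = aircraft_xyz[0] - threshold_xyz[0]
    dy = aircraft_xyz[1] - threshold_xyz[1]
    dz = aircraft_xyz[2] - threshold_xyz[2]
    heading = math.radians(approach_heading_deg)
    # Along-track distance back along the approach course (positive on final)
    along = -(dx * math.sin(heading) + dy * math.cos(heading))
    # Lateral offset perpendicular to the approach course (positive = right)
    lateral = dx * math.cos(heading) - dy * math.sin(heading)
    # Height the reference glide path prescribes at this along-track distance
    reference_height = along * math.tan(math.radians(glide_path_deg))
    vertical = dz - reference_height
    return lateral, vertical

# Example: aircraft 5 km out on an easterly (090) approach
lat, vert = approach_deviation((-5000.0, 120.0, 250.0), (0.0, 0.0, 0.0), 90.0)
print(f"lateral {lat:.0f} m, vertical {vert:.0f} m from the reference path")
```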
For a wheels-up landing, as illustrated in
The system 2 in step 408 is also configured to store the particular safe operation criteria, such as, for a wheels-up landing, the spatial position along the approach flight path at which the landing gear of the aircraft 16 should be fully extended/deployed while on approach to achieve a safe touchdown/landing, and, for an unstable approach, the acceptable deviation of the measured flight path from the intended/authorised/ideal flight path. The system 2 can also be configured to calculate the acceptable limits thereof and/or to calculate and predict the safe operation criteria as required.
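The wheels-up criterion of step 408 could be sketched as a simple gate check: once the aircraft is within a defined distance of the runway threshold, the landing gear must be fully extended. The gate distance used below (roughly 4 NM) and the returned labels are illustrative assumptions, not values prescribed by the described system.

```python
def wheels_up_risk(gear_extended, distance_to_threshold_m,
                   gear_gate_m=7400.0):
    """Check the stored safe operation criterion that the landing gear must
    be fully extended once the aircraft is inside a gate distance from the
    runway threshold (roughly 4 NM here; the exact gate is an assumption).

    Returns "ok", "watch" (gear not yet required) or "violation".
    """
    if gear_extended:
        return "ok"
    if distance_to_threshold_m > gear_gate_m:
        return "watch"      # still outside the point where gear is required
    return "violation"      # inside the gate with gear retracted: alert

print(wheels_up_risk(gear_extended=False, distance_to_threshold_m=3000.0))
```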
The system 2, in the next step 412, is then configured to compare the measured or predicted physical properties of the aircraft and runway to the safe operation criteria to determine the potential risks. In particular, the system can predict the likelihood of an incorrect aircraft landing configuration by monitoring the landing gear configuration. The system can also predict the likelihood of an unstable approach by monitoring the approach flight path. If the comparison shows that the measured and predicted physical properties of the aircraft and the landing gear configuration comply with the safe operation criteria, then the system can determine that the likelihood of a wheels-up landing and/or unstable approach is low, and an indication/alert may be generated to at least one user to confirm safe aviation operation.
Alternatively, if the comparison shows that the risk of a wheels-up landing and/or unstable approach is medium or high, i.e. a wheels-up landing and/or unstable approach may occur in the next 120 seconds or in the next 20 seconds respectively, the system 2 is further configured to transmit at least one alert to at least one user accordingly, i.e. a yellow alert or a red alert. The user(s) could include air traffic control (ATC), pilots, the emergency response team, and the like. Finally, if a wheels-up landing has occurred, the system is configured to send an alert to at least one user, particularly emergency response teams and relevant authorities.
Lastly, as illustrated in
The system 2 in step 416 is also able to receive information from existing safety nets to facilitate the processing of information and the calculation of measured operational physical properties and predictions thereof, and to act as a redundancy. For example, the system 2 can be configured to receive information from high energy approach monitoring systems, such as a Runway Overrun Prevention System (ROPS).
The system can provide real-time monitoring of aviation activities, detection of unsafe aviation activity and generation of alerts, which can be displayed on at least one standalone screen or can be integrated with existing systems located in at least a cockpit of said aircraft, air traffic control towers/centres, ground control locations and airport emergency response team locations. The display format may include a 3-D map and a panoramic view.
The system and methods described above provide one or more of the following advantages, including improvements in aviation safety, operational efficiency, capacity, operating cost efficiency, environment and security. Specifically, the advantages include the following: enhanced situation awareness of unsafe aviation activities for human operators and operating systems, e.g. Air Traffic Control officers, pilots, aircraft on-board systems that control the aircraft, and emergency response teams; awareness of all objects and activities within the aviation operating environment near and at the airport; prompt detection and awareness (within seconds) of deviation from and/or violation of safe aviation operation criteria; the ability for human operators and/or operating systems to immediately assess the detected and identified unsafe aviation activities and implement appropriate corrective actions; prevention of aviation safety occurrences or reduction of the severity/cost of aviation safety occurrences; increased redundancy to the existing technologies and procedures that detect/identify/prevent/mitigate unsafe aviation activities; a more cost-effective solution/technique/system compared to existing systems/technologies/solutions; reduced reliance on human involvement, e.g. human observation at Air Traffic Control; and minimal changes to current procedures or workload.
It is apparent from the above that the arrangements described are applicable to the aviation industry and related industries, and to the processes, systems and equipment therefor.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
Similarly it should be appreciated that in the above description of example embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word “about” or “approximately,” even if the term does not expressly appear. The phrase “about” or “approximately” may be used when describing magnitude and/or position to indicate that the value and/or position described is within a reasonable expected range of values and/or positions. For example, a numeric value may have a value that is +/−0.1% of the stated value (or range of values), +/−1% of the stated value (or range of values), +/−2% of the stated value (or range of values), +/−5% of the stated value (or range of values), +/−10% of the stated value (or range of values), etc. Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value “10” is disclosed, then “about 10” is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein. It is also understood that when a value that is “less than or equal to” a stated value is disclosed, “greater than or equal to” that value and the possible ranges between values are also disclosed, as appropriately understood by the skilled artisan. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
As used herein, unless otherwise specified the use of the ordinal adjectives “first”, “second”, “third”, etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
The terms in the claims have the broadest scope of meaning they would have been given by a person of ordinary skill in the art as of the relevant date.
The term “associate”, and its derivatives (e.g. “associating”), in relation to the combination of data, includes the correlation, combination or similar linking of data.
The terms “data fusion”, “fusing” and like terms are intended to refer to a multi-level process dealing with the association, correlation and combination of data and information from single and multiple sources to achieve refined position and identity estimates, and complete and timely assessments of situations and risks and their significance.
The terms “a” and “an” mean “one or more”, unless expressly specified otherwise.
Neither the title nor any abstract of the present application should be taken as limiting in any way the scope of the claimed invention.
Where the preamble of a claim recites a purpose, benefit or possible use of the claimed invention, it does not limit the claimed invention to having only that purpose, benefit or possible use.
In the present specification, terms such as “part”, “component”, “means”, “section”, or “segment” may refer to singular or plural items and are terms intended to refer to a set of properties, functions or characteristics performed by one or more items having one or more parts. It is envisaged that where a “part”, “component”, “means”, “section”, “segment”, or similar term is described as consisting of a single item, then a functionally equivalent object consisting of multiple items is considered to fall within the scope of the term; and similarly, where a “part”, “component”, “means”, “section”, “segment”, or similar term is described as consisting of multiple items, a functionally equivalent object consisting of a single item is considered to fall within the scope of the term. The intended interpretation of such terms described in this paragraph should apply unless the contrary is expressly stated or the context requires otherwise.
The term “connected” or a similar term, should not be interpreted as being limitative to direct connections only. Thus, the scope of the expression an item A connected to an item B should not be limited to items or systems wherein an output of item A is directly connected to an input of item B. It means that there exists a path between an output of A and an input of B which may be a path including other items or means. “Connected”, or a similar term, may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other yet still co-operate or interact with each other.
In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word “comprise” or variations such as “comprises” or “comprising” are used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
Any one of the terms “including”, “which includes” or “that includes”, as used herein, is also an open term that means including at least the elements/features that follow the term, but not excluding others. Thus, “including” is synonymous with, and means, “comprising”.
Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention.
Functionality may be added to or deleted from the block diagrams/flow charts, and operations may be interchanged among functional blocks. Steps may be added to or deleted from the methods described, within the scope of the present invention.
Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms.
Number | Date | Country | Kind
---|---|---|---
2021900347 | Feb 2021 | AU | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/AU2022/050099 | 2/14/2022 | WO |