This nonprovisional application claims the benefit of priority of EP23306401.3 filed Aug. 21, 2023, which is hereby incorporated by reference in its entirety.
This disclosure relates to systems and methods for controlling operation of one or more unmanned assets.
At present, unmanned assets, in particular unmanned aerial vehicles (UAVs), are commanded via the input of a specifically designated journey, or journey plan. This requires an unmanned asset commander to calculate a journey plan for each unmanned asset and manually instruct this plan. In the case of UAVs, the journey plan is a flight plan and the calculated flight plan must contain specific instructions, e.g. take off and climb to altitude X, follow heading Y for Z km, then turn onto heading A, etc. This is a very involved task for the commander.
Many modern operations are performed using a combination of manned and unmanned assets, and as such, it is advantageous for the crew of manned assets to be able to allocate tasks to unmanned teammates. Current systems require too much detailed input from commanders to be operated effectively by crew who must also focus on the operation of their own manned asset (e.g. a pilot must focus on flying their own aircraft and does not have the time to plot out and instruct detailed flight plans for each of their unmanned teammates). As such, a more efficient system for controlling unmanned assets, in particular UAVs, is needed.
According to this disclosure, there is provided a system for controlling operation of one or more unmanned assets, the system comprising:
According to this disclosure, there is also provided a method for controlling operation of one or more unmanned assets, the method comprising:
It will be understood that the tasks and/or commands are what the commander has instructed via their graphical inputs (e.g. take off from this location, deliver payload to this location, patrol/scan this area); however, the tasks and/or commands are in the form of an unordered list and may not be allocated to any specific unmanned asset.
It will be understood that the operation instructions are (e.g. high-level) instructions which are specific to an individual unmanned asset (e.g. unmanned asset x is to take off from this location, then patrol this area, then deliver payload at this location etc). In examples, they result from the sequencing and allocation of the tasks and/or commands (e.g. unordered and unallocated list of tasks and/or commands) discussed above to the one or more unmanned assets. It will therefore be understood that, in examples, where the operation of a plurality of assets is being controlled, separate operation instructions will be generated for each of the plurality of assets.
It will further be understood that the journey plan is a more detailed plan (e.g. take off from this location, climb to altitude x, follow heading Y for Z km to reach the area to be patrolled, patrol the area at an altitude X whilst flying in a zig-zag pattern for T minutes, etc.) which is specific to an individual unmanned asset and is generated to be executed by each individually designated unmanned asset. It will therefore be understood that where the operation of a plurality of assets is being controlled, a separate journey plan will be generated for each of the plurality of assets.
As such, the method may comprise (and the operation planning module may be configured to) generating operation instructions for each of the one or more unmanned assets based at least on the tasks and/or commands.
As such, the method may comprise (and the journey planning module may be configured to) generating a journey plan for each of the one or more unmanned assets based at least on the operation instructions. Thus, the method may comprise (and the communications module may be configured to) communicating with the one or more unmanned assets to instruct the unmanned assets to operate according to the respective generated journey plans (i.e. the journey plan for each unmanned asset).
This journey plan may result from the refinement of the sequence of the operation instructions. The generation of the journey plan may also comprise translating the operation instructions into a format which the unmanned assets understand.
In examples, where not all of a plurality of assets are assigned tasks, the one or more assets which have not been assigned tasks may not have a journey plan communicated to them. Alternatively, the assets may have a journey plan communicated to them which instructs the assets to do nothing, or to hold position.
The communication between the communications module and the one or more unmanned assets may occur via a communications module on board each of the one or more unmanned assets.
In examples, the communications module is configured to communicate with the one or more unmanned assets to monitor the execution of the unmanned asset's instructed journey plan.
In some examples, the operation planning module is further configured to obtain data indicative of external and/or environmental constraints, and to generate operation instructions based on the data indicative of external and/or environmental constraints.
In some examples, the data indicative of external and/or environmental constraints comprises data indicative of one or more of: terrain information (e.g. elevation, gradient, surface type), weather, obstacles, air traffic, restricted and hazardous areas, published navigation information (e.g. routes, corridors, aerodromes, etc.).
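The constraint data listed above could be grouped into a single record for the operation planning module to consume. The following is a minimal sketch of such a grouping; the class and field names are illustrative assumptions, not part of this disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical container for the external/environmental constraint data
# described above; field names and units are assumptions for illustration.
@dataclass
class EnvironmentalConstraints:
    terrain_elevation_m: dict            # (lat, lon) -> elevation in metres
    weather: dict                        # e.g. {"wind_speed_kts": 12}
    obstacles: list                      # known obstacle positions
    restricted_areas: list               # areas the journey plan must avoid
    air_traffic: list = field(default_factory=list)

# Example record as the operation planning module might receive it
constraints = EnvironmentalConstraints(
    terrain_elevation_m={(48.85, 2.35): 35.0},
    weather={"wind_speed_kts": 12},
    obstacles=[],
    restricted_areas=[],
)
```

In practice such data would be obtained via a ground network and/or wireless communications, as described below, rather than constructed locally.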
In some examples, the graphical user interface is configured to accept hand-drawn inputs. In some examples, the graphical user interface comprises a touchscreen. In some examples, the graphical user interface may comprise a different device, for example, a projector, a hologram, cameras, a drawing tablet, a gesture recognition device, or a non-touchscreen device requiring input via a conventional mouse.
In some examples, the system and method are for controlling operation of one or more unmanned assets including one or more unmanned aerial vehicles (UAVs). In some examples, the system and method are for controlling operation of one or more UAVs.
In some examples, the system and method are for controlling operation of a plurality of unmanned assets comprising a plurality of different unmanned asset types, e.g. unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), unmanned surface vehicles (USVs) for operation on the surface of water, unmanned underwater vehicles (UUVs).
In some examples, recognising the graphical inputs comprises (and the input interpretation module is configured to (recognize the graphical inputs by)) identifying the shapes of the graphical inputs, and further identifying geographical locations corresponding to the portions of the map on which the graphical inputs were drawn.
In some examples translating the graphical inputs comprises (and the input interpretation module is configured to (translate the graphical inputs by)) comparing the graphical inputs to a library of known input symbols and their corresponding commands.
In some examples, the input interpretation module comprises a (e.g. local) memory, wherein the library of known input symbols and corresponding commands is stored in the (e.g. local) memory.
In some examples, the library of known input symbols and corresponding commands is stored in a remote memory. In some examples, translating the graphical inputs comprises (and the input interpretation module is configured to) obtaining the library of known input symbols and corresponding commands from the (e.g. remote) memory.
In some examples, the method comprises (and the input interpretation module is configured to) producing an error output if the graphical input does not match any of the known input symbols.
In some examples, the method comprises (and the graphical user interface is configured to) providing an indication of the error to the user (e.g. via a message on a screen, or, in examples where the graphical user interface comprises an audio function, via an audible error message sound).
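The library lookup and error output described above can be sketched as follows. This is a minimal illustration under assumed names: the symbol identifiers, the library contents, and the use of an exception as the error output are all assumptions, not the claimed implementation.

```python
# Hypothetical library of known input symbols and their corresponding
# commands; the entries here are illustrative only.
SYMBOL_LIBRARY = {
    "circle_with_cross": "deliver payload to this location",
    "up_triangle": "commander-interruptible operation",
    "down_triangle": "asset-interruptible operation",
}

def translate_input(shape: str, location: tuple) -> dict:
    """Translate a recognised shape and map location into a task/command."""
    command = SYMBOL_LIBRARY.get(shape)
    if command is None:
        # Error output when the graphical input matches no known symbol;
        # the GUI could surface this to the user as a message or sound.
        raise ValueError(f"Unrecognised input symbol: {shape!r}")
    return {"command": command, "location": location}

task = translate_input("circle_with_cross", (48.85, 2.35))
```

A drawn octagon, for example, would reach `translate_input` but raise the error, matching the error-output behaviour described in the detailed description below.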
In some examples, the method comprises (and the operation planning module is configured to) receiving (e.g. via the communication module) position and/or status data relating to the one or more unmanned assets.
In examples, the unmanned asset status data may comprise one or more of availability, health status, charge/fuel level, and/or payload status relating to the one or more unmanned assets.
In some examples, the method comprises (and the operation planning module is configured to) receiving (e.g. via the communication module) position and status data from each unmanned asset (e.g. from a communication module of each unmanned asset).
In some examples, the method comprises (and the operation planning module is configured to) receiving (e.g. via the communication module) position and status data from an unmanned asset tracking database which contains the current position and status of one or more (e.g. all) unmanned assets in a fleet. The information in the unmanned asset tracking database may come from the unmanned assets themselves, or the position of unmanned assets may be determined using a system external from the unmanned assets, e.g. a GPS or RADAR tracking system. In another example, the information in the unmanned asset tracking database may come from logs. For example, it may be determined, from a usage log, that a fleet of unmanned assets are currently available for use and being stored at a base or in a hangar.
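An unmanned asset tracking database of the kind described above could be modelled as follows; the record layout and the asset identifiers are assumptions for illustration only.

```python
# Hypothetical tracking database keyed by asset ID, holding the current
# position and the status fields named in the text (availability,
# fuel level, capabilities). Values are illustrative.
fleet_db = {
    "A": {"position": (48.85, 2.35), "available": True,
          "fuel_pct": 80, "capabilities": {"surveillance"}},
    "B": {"position": (48.85, 2.35), "available": True,
          "fuel_pct": 95, "capabilities": {"payload"}},
}

def available_assets(db: dict) -> list:
    """Return the IDs of assets currently available for tasking."""
    return [aid for aid, rec in db.items() if rec["available"]]
```

Whether the records are populated by the assets themselves, by an external GPS or RADAR tracking system, or from usage logs, the operation planning module would query the database in the same way.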
In some examples, the method comprises (and the operation planning module is configured to) allocating tasks associated with graphical inputs to different ones of a plurality of unmanned assets. In some examples, where the plurality of assets comprises a plurality of different asset types, the method may comprise (and the operation planning module may be configured to) obtaining data indicative of the capabilities of each asset, and allocating tasks based at least partially on the data indicative of the capabilities of each asset.
In some examples, the allocation of tasks is based at least partially on the position and/or status data.
In some examples, a single graphical input may be applicable to a plurality (e.g. the entire fleet or a cluster or group) of unmanned assets. As such, in some examples, the method comprises (and the operation planning module is configured to) allocating a single task or command to a plurality (e.g. the entire fleet or a cluster or group) of unmanned assets.
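Capability-based allocation, as described above, can be sketched as a simple matching step. This is a hedged illustration: the greedy first-match strategy, the task fields, and the asset records are assumptions, not the claimed allocation method.

```python
# Illustrative capability-based task allocation: each task declares the
# capability it needs and is assigned to the first available asset that
# advertises that capability. Names and fields are assumptions.
def allocate(tasks: list, assets: dict) -> dict:
    """Map each task name to an asset ID, or None if no asset qualifies."""
    plan = {}
    for task in tasks:
        plan[task["name"]] = next(
            (aid for aid, rec in assets.items()
             if rec["available"] and task["needs"] in rec["capabilities"]),
            None,
        )
    return plan

assets = {
    "A": {"available": True, "capabilities": {"surveillance"}},
    "B": {"available": True, "capabilities": {"payload"}},
}
tasks = [{"name": "patrol_area", "needs": "surveillance"},
         {"name": "deliver_payload", "needs": "payload"}]
```

In a heterogeneous fleet (UAVs, UGVs, USVs, UUVs), the capability sets would distinguish asset types, so the same matching step covers the multi-type case described above.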
In some examples, the graphical inputs correspond to commands including one or more of: take off from this location, land in this location, ditch in this location in case of emergency, rendezvous at this location at this time, deliver payload to this location, pick up payload from this location, inspect (one time) this portion of highway/border/railway/powerline, patrol (several times) this portion of highway/border/railway/powerline from this location, patrol/scan this area, avoid flying over this area, loiter in this location, monitor this point of interest in orbit, monitor this point of interest in a figure of eight, search this area for targets, search along this border for targets, gather at this location, split from this location.
According to this disclosure, there is also provided a system comprising one or more unmanned assets; and a system for controlling operation of the one or more unmanned assets as described herein.
In some examples, the one or more unmanned assets comprise a communications module configured to communicate with the communications module of the system for controlling operation of the one or more unmanned assets.
In some examples, the one or more unmanned assets include one or more unmanned aerial vehicles (UAVs).
In some examples, the system comprises a plurality of unmanned assets comprising a plurality of different unmanned asset types, e.g. unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), unmanned surface vehicles (USVs) for operation on the surface of water, unmanned underwater vehicles (UUVs).
In some examples, the one or more unmanned assets are configured to send position and/or status data (e.g. via a communication module of each unmanned asset) relating to the one or more unmanned assets to the system for controlling operation of the one or more unmanned assets.
In examples, the unmanned asset status data may comprise one or more of availability, health status, charge/fuel level, and/or payload status.
In some examples, the one or more unmanned assets are configured to send data indicative of their capabilities (e.g. land capabilities, aquatic capabilities, aerial capabilities, payload capabilities, monitoring capabilities, etc.).
One or more non-limiting examples will now be described, by way of example only, and with reference to the accompanying figures in which:
The below described examples will be understood to be exemplary only. It will be understood that where used herein, terms such as up and down refer to directions as viewed in the reference frame of the appended Figures.
In
Operation of the system 1 will now be explained with reference to the flowchart of
At step 23, the graphical user interface 3 accepts one or more graphical inputs from the commander. In the illustrated example, the graphical user interface 3 is a touchscreen and so the commander provides their graphical inputs by drawing on the touchscreen with a finger or a stylus. In some examples, step 23 may also comprise accepting a further input from the commander confirming that they have finished providing graphical inputs, and the system 1 should proceed to analysing the graphical inputs.
At step 25, the input interpretation module 7 analyses the graphical inputs, and translates the graphical inputs into tasks and/or commands. This analysis comprises three steps:
It will be understood that, in examples, the order of steps (b) and (c) may be reversed.
In examples, the input interpretation module 7 may be configured to output an error message, for example to be displayed to a commander (e.g. using the graphical user interface) if at step (a) the shape is not recognized, or at step (c) the recognized shape does not match a known shape.
The shape recognition from step (a) can use any known shape recognition tool, for example, the Adobe® Shaper tool. Example shapes, and their associated tasks and/or commands, are discussed below in relation to
It will be understood that the recognition step (a) comprises recognising the shape which has been drawn, without yet processing its significance, e.g. recognising that the commander has drawn a circle with a cross through it. It is at step (c) that the significance of this shape is determined. There may be scenarios where a shape is recognized at step (a) by the shape recognition tool, but the shape does not correspond to a known shape which has an associated task or command. For example, if the commander drew an octagon, the shape recognition tool may recognize (at step (a)) that an octagon had been drawn, but at step (c) no task and/or command is found which is associated with an octagon, and so an error message may be output at step (c).
The tasks and/or commands are then forwarded by the input interpretation module 7 to the operation planning module 9.
At step 27, external and environmental constraint data is obtained by the operation planning module 9. This data may comprise a plurality of different data types which may be relevant. In some examples, the external and environmental constraint data comprises terrain elevation data, weather data, obstacle data, and air traffic data. This data can be obtained in any suitable and desired way, for example via a ground network and/or via wireless communications. In some examples, the terrain elevation data may be stored in local memory.
At step 29, unmanned asset position and status data is obtained. The communication module 17 of each unmanned asset is configured to send position and status data to the communication module 13 of the control system 1. This data is then forwarded by the system's communication module 13 to the operation planning module 9. The unmanned asset status data may comprise details such as availability, health status, charge/fuel level, payload status, etc.
Obtaining unmanned asset position and status data may be performed by means other than the communication module 17 of each asset. For example, the system may be configured to interface with an unmanned asset tracking database which contains the current position and status of all unmanned assets in a fleet. The information in the unmanned asset tracking database may come from the unmanned assets themselves, or the position of unmanned assets may be determined using a system external from the unmanned assets, e.g. a GPS or RADAR tracking system. In another example, the information in the unmanned asset tracking database may come from logs. For example, it may be determined, from a usage log, that a fleet of unmanned assets are currently available for use and being stored at a base or in a hangar.
At step 31, the operation planning module 9 generates operation instructions based on the tasks and/or commands, the external and environmental constraint data, and the unmanned asset position and status data. The operation instructions may be considered macro-operations which outline a mission strategy, and a sequence of operations to be performed by the one or more unmanned assets 15. The operation planning module is configured to sequence and allocate (e.g. optimally sequence and allocate) the tasks and/or commands to the one or more unmanned assets. As such it may not be necessary for the commander to sketch the mission in the exact order the operations will be executed. The commander may start from the main objective (e.g. “search that area”), and then define where to take-off from, where to land, no-fly zones, emergency landing zones, etc. As such, once step 31 is complete, a set of operation instructions has been generated for each of the unmanned assets which has been allocated any tasks and/or commands.
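Step 31 can be sketched as follows. This is a minimal illustration under assumed names: the unordered task list is allocated by required capability, and each asset's instructions are framed by take-off and landing; a real planner would also weigh constraint data and optimise the sequence.

```python
# Illustrative sketch of step 31: allocate unordered tasks to assets by
# required capability and order each asset's high-level instructions.
# Function, field, and capability names are assumptions.
def generate_operation_instructions(tasks: list, assets: dict) -> dict:
    """Return {asset_id: ordered list of high-level operation instructions}."""
    instructions = {aid: ["take off"] for aid in assets}
    for task in tasks:  # allocate each task to the first capable asset
        for aid, caps in assets.items():
            if task["needs"] in caps:
                instructions[aid].append(task["action"])
                break
    for seq in instructions.values():
        seq.append("land")
    return instructions

assets = {"A": {"surveillance"}, "B": {"payload"}}
tasks = [{"action": "patrol area P", "needs": "surveillance"},
         {"action": "deliver payload at L", "needs": "payload"}]
ops = generate_operation_instructions(tasks, assets)
```

This mirrors the worked example below, where asset A (surveillance) and asset B (payload) each receive their own ordered set of operation instructions.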
At step 33, the operation instructions are passed to the journey planning module 11. The journey planning module 11 uses the macro-operations outlined by the operation instructions to generate specific journey plans for one or more of the individual unmanned assets 15. These journey plans include sufficient detail for each unmanned asset to carry out its assigned mission, e.g. take off location, exact route, altitude and groundspeed instructions, and payload commands.
At step 35, the communication module 13 sends the respective journey plans to the one or more unmanned assets 15 via the communications modules 17 of the unmanned assets 15. The guidance, navigation, control, and payload manager 19 of the or each unmanned asset 15 can then operate the unmanned asset 15 to follow the assigned journey plan.
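The expansion performed by the journey planning module at step 33 can be sketched as a translation from each high-level operation instruction into lower-level commands in a format an asset's guidance system could consume. The command tuples, altitude value, and pattern names here are assumptions for illustration only.

```python
# Hedged sketch of step 33: expand each operation instruction into
# detailed journey-plan commands. All command names are assumptions.
def expand(instruction: str, cruise_alt_m: float = 300.0) -> list:
    """Translate one high-level operation instruction into journey commands."""
    if instruction == "take off":
        return [("takeoff",), ("climb_to", cruise_alt_m)]
    if instruction.startswith("patrol"):
        return [("fly_pattern", "zig-zag"), ("hold_minutes", 10)]
    if instruction == "land":
        return [("descend",), ("land",)]
    return [("execute", instruction)]  # fall-through for other operations

# Flatten one asset's operation instructions into its journey plan
journey_plan = [cmd for op in ["take off", "patrol area P", "land"]
                for cmd in expand(op)]
```

The resulting command list is what the communication module 13 would send to the asset's guidance, navigation, control, and payload manager 19.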
The method 20 will now be explained in relation to an example.
At step 21, a map is displayed on a graphical user interface, and at step 23, the unmanned asset commander draws graphical inputs corresponding to tasks. At step 25, the inputs are analysed, and translated into a list of tasks and commands:
At steps 27 and 29, external and environmental constraint data, and unmanned asset position and status data are obtained.
At step 31, operation instructions are generated based on the list of tasks and commands, the external and environmental constraint data, and the unmanned asset position and status data. The operation instructions are specific to each unmanned asset and result from the sequencing and allocation of the tasks and/or commands. In this example, the unmanned asset position and status data indicates that there are two unmanned assets, A and B available at the instructed take-off location, and that unmanned asset A has surveillance capabilities and unmanned asset B has payload capabilities. As such, the operation instructions are as follows:
Asset A operation instructions:
Asset B operation instructions:
At step 33, a journey plan is generated for unmanned asset A and unmanned asset B, based on their respective operation instructions. The journey plan contains detailed instructions for the journey in a format which the assets understand. As such, the journey plan instructions are as follows:
Asset A journey plan:
Asset B journey plan:
At step 35, assets A and B are sent their journey plans.
Use of the drawings shown in
An operation may be interrupted by an unmanned asset 15 for example in a search and rescue mission where the instruction is to scan a certain area until the missing person is located. Upon locating the person, the unmanned asset 15 would then interrupt its own scanning operation. In the defined vocabulary, commander interruptible operations are denoted by an upwardly pointing triangle whilst unmanned asset 15 interruptible operations are denoted by a downwardly pointing triangle.
The symbol shown in
The symbol shown in
The symbol shown in
It will be understood that for operations to be interrupted by the commander, the system may provide functionality for the commander to provide dynamic instructions such that the calculation unit 5 can make alterations to the journey plan of an unmanned asset 15 even after the asset has been deployed.
It can be seen from
In the example of
The example of
The symbol of
Input 51 corresponds to the symbol of
In the case of the unmanned assets 15 assigned to perform the interruptible operations associated with inputs 52 and 53 they will continue to monitor the points of interest until the commander interrupts them, at which point they will move to the rendezvous point at the location of input 56. Meanwhile, the unmanned asset 15 assigned to deliver the payload at the location of input 55 will drop its payload, and then proceed directly to the rendezvous point. Likewise, the unmanned asset 15 assigned to perform the patrol according to input 54 will do so, and then proceed to the rendezvous point. Depending on the unmanned asset 15 position and status data, the calculation unit may determine that the same unmanned asset 15 should perform the tasks associated with inputs 55 and 54 since, as can be seen from
Whilst the above described examples are primarily concerned with unmanned aerial vehicles (UAVs), the unmanned assets 15 being controlled could comprise any unmanned assets 15, e.g. unmanned ground vehicles (UGVs), unmanned surface vehicles (USVs) for operation on the surface of water, and unmanned underwater vehicles (UUVs). Further, the disclosed systems and methods could be applied to fleets of unmanned assets 15 comprising a plurality of different types of unmanned assets 15.
By way of example, in the example situation presented by
It will be understood that although the above examples are mainly focussed around the case of UAVs, the present disclosure is applicable to the control of many types of unmanned assets.
It will be seen that the system of the present disclosure has the potential to simplify the level of detail which needs to be input by an unmanned asset commander in order to effectively command one or more unmanned assets. As such, unmanned-manned teamwork can be more seamlessly provided, since the pilot or driver of a manned asset can command their unmanned teammates whilst still having the capacity to operate their own vehicle.
| Number | Date | Country | Kind |
|---|---|---|---|
| 23306401.3 | Aug 2023 | EP | regional |