This application claims priority to Chinese Patent Application No. 202210344428.2 with a filing date of Apr. 2, 2022. The content of the aforementioned application, including any intervening amendments thereto, is incorporated herein by reference.
The present disclosure relates to the field of unmanned aerial vehicle (UAV) technologies, and specifically, to a method for a multi-UAV formation to pass through a frame-shaped obstacle, which is applicable to rescue, firefighting, detection, and other scenarios that require UAVs to work collaboratively while passing through obstacles.
In the last decade, UAVs have been widely used in civil, industrial and military applications. Cooperation among multiple UAVs in executing tasks such as obstacle traversal, rescue and resupply has been a research hotspot. Multiple UAVs form a UAV cluster and realize information exchange and sharing through aerial networking. The UAV cluster is capable of executing collaborative tasks through flight control, intelligent learning and other technologies, which leverage the strengths of each UAV to meet complex task requirements. Compared with a single UAV, the multi-UAV cluster has higher robustness, redundancy, scalability and efficiency when executing tasks, and the viability of the UAVs is also enhanced. When a node in the cluster fails or is forced offline by an attack, the task undertaken by this node is taken over by another node, which ensures the completion of tasks.
Regarding cooperation among UAVs, how a multi-UAV formation passes through obstacles is a research emphasis. In order to realize cooperation among UAVs, each UAV needs to obtain the relative positions of neighboring UAVs, which are usually obtained by external positioning systems such as the Global Positioning System (GPS) or the Global Navigation Satellite System (GNSS). When executing the pass-through task, the UAV usually carries a visual odometer to identify the frame-shaped obstacle and calculate its center. However, it is difficult for the visual odometer to function because the UAVs usually work in scenarios where the GPS or GNSS signals are weak, especially with dim light, low visibility, and occlusion.
In the prior art, there are the following solutions for the UAV formation collaboratively passing through the frame-shaped obstacle:
If the GPS is available, positioning modules such as GPS modules are installed on the UAVs and the obstacle to obtain the positions of the UAVs and the obstacle, and the formation passes through the obstacle through position control. Alternatively, if indoor positioning based on Ultra-Wideband (UWB) base stations is available, the positions of the UAVs and the obstacle can also be obtained.
If monocular, binocular or infrared depth (RGB-D) cameras are available, map construction and positioning are performed through the simultaneous localization and mapping (SLAM) algorithm, and the obstacle is recognized based on feature points. The UAV formation may share positions through transmission protocols such as Bluetooth or Wi-Fi. This solution has high hardware requirements and involves more complex computation.
If both the GPS and the cameras are available, positioning modules such as GPS modules are installed on the UAVs to obtain positions, while monocular or binocular cameras are installed on the UAVs to identify the obstacle, which enables the formation to pass through the obstacle.
Without exception, all of the above solutions require at least a GPS receiver or a camera for the formation to pass through the obstacle, and they usually fail in unknown environments where the GPS is unavailable and cameras are hampered by dim light.
The purpose of the present disclosure is to provide a method for a multi-UAV formation to pass through a frame-shaped obstacle, which uses only the on-board sensors of the UAVs and UWB distance sensors. This method has the advantages of strong adaptability and no dependence on light conditions or the GPS.
To achieve the above objective, the following technical solutions are provided:
A method for a multi-UAV formation to pass through a frame-shaped obstacle includes: obtaining in real time a distance between the frame-shaped obstacle and the UAV formation and a displacement of the UAV formation during flight; constructing a relative position model between the frame-shaped obstacle and the UAV formation; constructing a UAV velocity control model; constructing a parameter model based on the distance, the displacement, and the relative position model; constructing a position estimation model of the frame-shaped obstacle and a relative position estimation model among the UAVs based on the parameter model; and designing, based on the position estimation model, the relative position estimation model, and the UAV velocity control model, a controller for the UAV formation to pass through the obstacle, and controlling a flight velocity of the UAV formation through the controller to enable the UAV formation to pass through the frame-shaped obstacle.
Further, the relative position model between the frame-shaped obstacle and the UAV formation is constructed by using the following formula:
where p0(k) and pj(k) respectively represent positions of a primary UAV and a UAV j in a global coordinate system at a moment k, ps represents a position of the frame-shaped obstacle in the global coordinate system, q0(k) represents a relative position between the primary UAV and the frame-shaped obstacle, and qj0(k) represents a relative position between the UAV j and the primary UAV.
Further, the constructing the UAV velocity control model may include:
modeling each of the UAVs as a discrete-time integrator with bounded velocities by using the following formula:
where T is a sampling period.
Further, the constructing the parameter model based on the distance, the displacement, and the relative position model may include:
d0(k)=∥q0(k)∥, dj0(k)=∥qj0(k)∥ (1.3)
φ0(k)=q0(k+1)−q0(k)
φj0(k)=qj0(k+1)−qj0(k)=φj(k)−φ0(k) (1.4)
where p0(0) represents a position of the primary UAV at an initial moment, d0(k) represents a distance between the primary UAV and the frame-shaped obstacle, dj0(k) represents a distance between the primary UAV and the UAV j, φ0(k) represents a displacement of the primary UAV, φj(k) represents a displacement of the UAV j, φj0(k) represents a relative displacement between the primary UAV and the UAV j, òs(k) is a constructed parameter model related to the position ps(k) of the frame-shaped obstacle, òj0(k) is a constructed parameter model related to the relative position qj0(k) between the UAV j and the primary UAV, and the superscript T represents transposition.
Further, the constructing a position estimation model of the frame-shaped obstacle may specifically include:
where I is a unit matrix, R0(k) is a covariance matrix at the moment k, {circumflex over (p)}s(k) is an estimation value of the position ps(k), λ is a forgetting factor, μ is a damping term, and Φ0(k) is a set matrix of φ0(k) and satisfies:
δ1 is a minimum eigenvalue of the set matrix, N represents the dimension of the position estimation (2 in the two-dimensional case and 3 in the three-dimensional case), and M(k) is a forgetting operator matrix of λ and satisfies:
Further, the relative position estimation model among the UAVs is expressed as follows:
where Φj0(k) is a set matrix of φj0(k), Rj0(k) is a covariance matrix, and {circumflex over (q)}j0(k) is an estimated position of the UAV j relative to the primary UAV.
Further, the controller for the UAV formation to pass through the obstacle may be designed as follows:
where φ0(i) represents a displacement of the primary UAV at a moment i, {circumflex over (p)}s(k) is an estimation value of the position ps(k) of the frame-shaped obstacle at the moment k, and v0(k) and vj(k) are velocities of the primary UAV and the UAV j obtained by event-triggering the controller for the formation to pass through the obstacle.
Further, H0(k) and Hj0(k) are specifically expressed as follows:
where k1 is an initial time, k4 is an operation time, k2=sup{k>k1|{circumflex over (q)}0(k)<q0s}, k3=sup{k>k2|{circumflex over (q)}0(k)>q0e}, {circumflex over (q)}0(k) represents an estimation of the relative position between the primary UAV and the frame-shaped obstacle at the moment k, sup represents an upper bound, q0s and q0e are start and end points artificially set for an event trigger respectively, and Rot is a rotation matrix of the frame-shaped obstacle relative to a coordinate system; in a two-dimensional case, the rotation matrix is:
w is an angle of the obstacle relative to the UAV in the coordinate system; α is a pass-through step during passing through the obstacle; qj0init is a formation of the UAVs before and after passing through the obstacle, and qj0pass is a formation of the UAVs during passing through the obstacle.
A computer device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor executes the computer program to perform the steps of the method described above.
A computer-readable storage medium is provided. The computer-readable storage medium stores a computer program, and the steps of the method described above are performed when the computer program is executed by a processor.
Compared with the prior art, technical solutions provided in the present disclosure have the following advantages:
In the prior art, the GPS is needed to obtain global absolute positions of the UAVs and relative positions among the UAVs for the UAV formation to pass through the obstacle. The data volume is very large and hence the processing speed is slowed down. In addition, a good light environment is necessary for passing through the obstacle. In view of this, the present disclosure proposes a method for a multi-UAV formation to pass through a frame-shaped obstacle, in which the GPS is not needed, and the method hence has strong environmental adaptability.
The method for the multi-UAV formation to pass through the frame-shaped obstacle, includes the following steps.
Step 1: Obtain in real time a distance between the frame-shaped obstacle and the UAV formation and a displacement of the UAV formation during flight.
The distance d0 between the frame-shaped obstacle and the UAV formation and the distance dj0 between a UAV j and a primary UAV 0 are obtained in real time by the UWB distance sensors.
In addition, the UAVs obtain velocity information in real time and perform integration processing to obtain the displacements φ0 and φj. φ0 represents a displacement of the primary UAV (leader) in the formation, φj represents a displacement of the UAV j (follower) in the formation, and j represents the serial number of the UAV.
Step 2: Construct a relative position model between the frame-shaped obstacle and the UAV formation by using the following formula:
p0(k) and pj(k) respectively represent positions of the primary UAV and the UAV j in a global coordinate system at a moment k, ps represents a position of the frame-shaped obstacle in the global coordinate system, q0(k) represents a relative position between the primary UAV and the frame-shaped obstacle, and qj0(k) represents a relative position between the UAV j and the primary UAV.
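The formula referenced here (presumably formula (1.1)) is not reproduced in this text. A reconstruction that is consistent with the symbol definitions above and with formulas (1.3) and (1.4) below is offered as a sketch only:

```latex
q_0(k) = p_0(k) - p_s, \qquad q_{j0}(k) = p_j(k) - p_0(k) \tag{1.1}
```

Under this convention, the displacement relations in (1.4) follow directly, since q_0(k+1)−q_0(k)=p_0(k+1)−p_0(k)=φ_0(k) and q_{j0}(k+1)−q_{j0}(k)=φ_j(k)−φ_0(k).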
Step 3: Construct a UAV velocity control model, including:
modeling each of the UAVs as a discrete-time integrator with bounded velocities by using the following formula:
where T is a sampling period.
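The explicit form of the velocity control model (formula (1.2)) is likewise not reproduced here. A plausible reconstruction of a discrete-time integrator with bounded velocities, in which the bound v_max is introduced only for illustration, is:

```latex
p_0(k+1) = p_0(k) + T\,v_0(k), \qquad p_j(k+1) = p_j(k) + T\,v_j(k), \qquad
\|v_0(k)\| \le v_{\max},\ \ \|v_j(k)\| \le v_{\max} \tag{1.2}
```

With this model, the displacements of the step 1 satisfy φ_0(k)=T v_0(k) and φ_j(k)=T v_j(k), which is how integrating the on-board velocity measurements yields the displacements.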
Step 4: Construct a parameter model based on the distance, the displacement obtained in the step 1, and the relative position model obtained in the step 2. Details are as follows.
The distance and the displacement obtained in real time are expressed by relative positions among the UAVs as follows:
d0(k)=∥q0(k)∥, dj0(k)=∥qj0(k)∥ (1.3)
φ0(k)=q0(k+1)−q0(k)
φj0(k)=qj0(k+1)−qj0(k)=φj(k)−φ0(k) (1.4)
Then, the parameter model is constructed based on the formulas (1.3) and (1.4):
In the formula (1.5), p0(0) represents a position of the primary UAV at an initial moment, òs(k) is a constructed parameter model related to the position ps(k) of the frame-shaped obstacle, òj0(k) is a constructed parameter model related to a relative position qj0(k) between the UAV j and the primary UAV, and the superscript T represents transposition.
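Formula (1.5) itself is not reproduced in this text. One construction that is consistent with the definitions above, offered here only as an illustrative reconstruction (the symbol ε corresponds to the document's ò), squares the range measurements in (1.3) and uses (1.4) to eliminate the quadratic terms, so that the unknown obstacle position appears linearly in known quantities:

```latex
\epsilon_s(k) \;=\; \tfrac{1}{2}\bigl(d_0(k)^2 - d_0(k+1)^2 + \|\varphi_0(k)\|^2\bigr)
\;+\; \varphi_0(k)^{\mathsf T}\Bigl(p_0(0) + \sum_{i=0}^{k-1}\varphi_0(i)\Bigr)
\;=\; \varphi_0(k)^{\mathsf T}\,p_s
```

An analogous scalar regression in the relative position, built from d_{j0}(k) and φ_{j0}(k), can be formed for the UAV j with respect to the primary UAV.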
Step 5: Construct a position estimation model of the frame-shaped obstacle and a relative position estimation model among the UAVs based on the parameter model in the step 4, including the following derivations:
Step 5.1: Construct the position estimation model of the frame-shaped obstacle.
According to formula (1.5), a forgetting factor λ is added in this scheme to prevent new data from being overwhelmed by old data. In addition, a damping term μ is designed in this scheme to increase the stability of the algorithm and to prevent unexpected cases during the estimation of {circumflex over (p)}s(k).
I is a unit matrix, R0(k) is a covariance matrix at the moment k, and {circumflex over (p)}s(k) is an estimation value of the position ps(k) of the frame-shaped obstacle at the moment k. In this scheme, a parameter marked with a circumflex (^) represents an estimation of that parameter, similarly hereinafter. Φ0(k) is a set matrix of φ0(k) and satisfies:
δ1 is a minimum eigenvalue of the set matrix, and N represents the dimension of the estimation. N is 2 if the position is estimated in the two-dimensional case, and N is 3 if the position is estimated in the three-dimensional case. M(k) is a forgetting operator matrix of λ, and satisfies:
Therefore, when 0<λ≤1 and μ^2≤δ1^2·λ^(2N×3), the primary UAV can accurately estimate the position of the frame-shaped obstacle by using the formula (1.6).
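The explicit recursions (1.6) and (1.7) are not reproduced in this text. Purely as an illustration of the kind of estimator described, a damped, forgetting-factor recursive least-squares update driven by the regressions of the step 4, the following Python sketch may help; the class name, the information-matrix form of the recursion, and the way the damping term μ enters are all assumptions, not the patented formulas.

```python
import numpy as np

class ForgettingRLS:
    """Damped recursive least-squares estimator with a forgetting factor (sketch).

    Estimates a constant parameter vector theta (e.g. the obstacle position p_s,
    or a relative position among UAVs) from scalar regressions eps(k) = phi(k)^T theta,
    as produced by the parameter model of the step 4. This is NOT formula (1.6)/(1.7)
    of the disclosure, only a generic illustration with an assumed damping scheme.
    """

    def __init__(self, dim, lam=0.98, mu=1e-3):
        self.lam = lam                      # forgetting factor, 0 < lam <= 1
        self.mu = mu                        # damping term for numerical stability
        self.R = mu * np.eye(dim)           # information matrix (inverse-covariance-like)
        self.theta = np.zeros(dim)          # current estimate

    def update(self, phi, eps):
        """phi: regressor (e.g. displacement phi_0(k)); eps: scalar measurement."""
        phi = np.asarray(phi, dtype=float)
        # discount old information, add the new regressor and the damping term
        self.R = self.lam * self.R + np.outer(phi, phi) + self.mu * np.eye(len(phi))
        # correct the estimate with the prediction error
        innovation = eps - phi @ self.theta
        self.theta = self.theta + np.linalg.solve(self.R, phi * innovation)
        return self.theta
```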
Step 5.2: Construct the relative position estimation model among the UAVs, including the following derivations:
The following relative position estimator is designed for the UAV j:
Φj0(k) is a set matrix of φj0(k), Rj0(k) is a covariance matrix, and {circumflex over (q)}j0(k) is an estimated position of the UAV j relative to the primary UAV. When 0<λ≤1, the relative position between the UAV j and the primary UAV can be accurately estimated by using the formula (1.7).
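Continuing the estimator sketch given after the step 5.1, both regressions can be run with the same update rule; the variable names and numbers below are illustrative only and do not come from the disclosure:

```python
import numpy as np

follower_ids = [1, 2]
est_obstacle = ForgettingRLS(dim=2)                        # estimates p_s, cf. (1.6)
est_rel = {j: ForgettingRLS(dim=2) for j in follower_ids}  # relative positions, cf. (1.7)

# one illustrative step; in practice phi and eps come from the UWB ranges
# and the integrated velocities via the parameter model of the step 4
phi0_k, eps_s_k = np.array([0.10, 0.02]), 0.37
p_s_hat = est_obstacle.update(phi0_k, eps_s_k)

phi_10_k, eps_10_k = np.array([-0.05, 0.01]), 0.12
q_10_hat = est_rel[1].update(phi_10_k, eps_10_k)
```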
Step 6: Design, based on the position estimation model (1.6) and the relative position estimation model (1.7) constructed in the step 5, and the UAV velocity control model in the step 3, a controller for the UAV formation to pass through the obstacle, and control a flight velocity of the UAV formation through the controller to enable the UAV formation to pass through the frame-shaped obstacle.
The controller for the UAV formation to pass through the obstacle may be specifically designed as follows:
An event is designed as follows based on the formulas (1.6) and (1.7) and the UAV velocity control model (1.2) to trigger the controller for the formation to pass through.
{circumflex over (p)}s(k) is an estimation of the position ps(k) of the frame-shaped obstacle at the moment k, v0(k) and vj(k) are velocities of the primary UAV and the UAV j obtained by event-triggering the controller for the formation to pass through the obstacle, and H0(k) and Hj0(k) are each a switching function, a piecewise constant function that represents a switching signal between maintaining the formation and passing through the obstacle. The details are as follows:
k1 is an initial time, k4 is an operation time, k2=sup{k>k1|{circumflex over (q)}0(k)<q0s}, and k3=sup{k>k2|{circumflex over (q)}0(k)>q0e}. {circumflex over (q)}0(k) represents an estimation of the relative position between the primary UAV and the frame-shaped obstacle at the moment k, and sup represents an upper bound. q0s and q0e are start and end points artificially set for the event trigger, expressing that the formation pass-through starts when the relative position between the primary UAV and the frame-shaped obstacle reaches q0s, and ends when the relative position reaches q0e. Rot is a rotation matrix of the frame-shaped obstacle relative to the coordinate system. In a two-dimensional case, the rotation matrix is:
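The matrix itself is not shown in this text; for a rotation by the angle w defined below, the standard two-dimensional rotation matrix is presumably intended:

```latex
\mathrm{Rot} \;=\; \begin{bmatrix} \cos w & -\sin w \\ \sin w & \cos w \end{bmatrix}
```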
w is an angle of the obstacle relative to the UAV in the coordinate system, and the angle may be artificially specified, or be measured by the odometer on the UWB sensor. α is a pass-through step during passing through the obstacle. qj0init is a formation of the UAVs before and after passing through the obstacle, and qj0pass is a formation of the UAVs during passing through the obstacle.
It should be noted that the formation pass-through error is exponentially convergent when the controller (1.8) and (1.9) for the UAV formation to pass through the obstacle satisfies the following conditions:
T represents a sampling time.
When the method of the present disclosure is applied in practice, firstly, the distances d0 and dj0 and the displacements φ0 and φj that are acquired in the step 1 are substituted into the parameter model in the step 4 for solving, to obtain òs(k) and òj0(k), which are then substituted into the position estimation model of the obstacle and the relative position estimation model among the UAVs in the step 5, to obtain {circumflex over (p)}s(k) and {circumflex over (q)}j0(k). Further, {circumflex over (p)}s(k) and {circumflex over (q)}j0(k) are substituted into the controller for the UAV formation to pass through the obstacle in the step 6, and the UAV formation is controlled to fly according to the formation preset by the formula (1.9). In addition, the vector velocity of the UAV formation at each moment is calculated by using the formula (1.8), and the UAV formation passes through the frame-shaped obstacle through velocity control. This can solve the formation problem in a GPS-limited environment and the pass-through problem in a light-limited environment.
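To make the flow of the above paragraph concrete, the following Python sketch chains the estimation and control stages for one control period. It is not the controller (1.8)-(1.9) of the disclosure, which is not reproduced in this text; the gains, the distance-based trigger, and all variable names are assumptions used only to illustrate the leader-follower structure, the formation switching, and the bounded-velocity constraint.

```python
import numpy as np

def passthrough_velocities(q0_hat, qj0_hat, params):
    """One illustrative control period of a leader-follower pass-through scheme.

    q0_hat  : estimated relative position of the primary UAV w.r.t. the obstacle
    qj0_hat : dict {j: estimated relative position of UAV j w.r.t. the primary UAV}
    params  : dict with sampling period 'T', pass-through step 'alpha', obstacle
              angle 'w', trigger distances 'd_start' > 'd_end', velocity bound
              'v_max', and per-UAV formation offsets 'q_init' and 'q_pass'.
    Returns the primary-UAV velocity v0 and a dict of follower velocities.
    """
    T, alpha, v_max = params['T'], params['alpha'], params['v_max']
    dist = max(np.linalg.norm(q0_hat), 1e-6)

    # event trigger (assumed distance-based): use the pass-through formation
    # while the leader is between the start and end trigger distances
    passing = params['d_end'] < dist < params['d_start']

    w = params['w']
    rot = np.array([[np.cos(w), -np.sin(w)],
                    [np.sin(w),  np.cos(w)]])           # 2-D rotation by the angle w

    def clip(v):                                        # enforce bounded velocities
        n = np.linalg.norm(v)
        return v if n <= v_max else v * (v_max / n)

    # leader: advance toward (and through) the obstacle by roughly alpha per period
    v0 = clip(-alpha * np.asarray(q0_hat, dtype=float) / (dist * T))

    vj = {}
    for j, q in qj0_hat.items():
        q_des = rot @ params['q_pass'][j] if passing else params['q_init'][j]
        # follower: track the leader while correcting the formation-offset error
        vj[j] = clip(v0 + (q_des - np.asarray(q, dtype=float)) / T)
    return v0, vj
```

In such a loop, the estimates {circumflex over (p)}s(k) and {circumflex over (q)}j0(k) produced in the step 5 would be fed in at every period, and the returned velocities would be applied for one sampling period T before the next update.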
Simulation Experiments
The simulation effect of the method according to the present disclosure in MATLAB is shown in the accompanying drawings.
Further, the method of the present disclosure is applied in the GAZEBO software for verification, which relies on the high-fidelity UAV dynamics model of GAZEBO. This aims to ensure that the mathematical model designed in the algorithm can be applied to a real UAV model. The simulation effects in GAZEBO are shown in the accompanying drawings.
It can be learned that the multiple UAVs, as envisioned, maintain a formation before passing through, then change the formation to meet the requirements for passing through the frame-shaped obstacle, and resume the original formation afterwards. This verifies the reasonableness of this method in the real UAV model.
It should be noted that the above embodiments are merely intended to explain the technical solutions of the present application, rather than to limit the present application. Although the present application is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the above embodiments or make equivalent substitutions on some technical features therein without departing from the spirit of the technical solutions. However, these modifications or substitutions should fall within the protection scope of the present application.
Number | Date | Country | Kind
---|---|---|---
202210344428.2 | Apr. 2, 2022 | CN | national