This application claims priority of Chinese Patent Application No. 202310106488.5, filed on Feb. 13, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to the technical field of aircraft navigation and positioning, in particular to a nighttime cooperative positioning method based on an unmanned aerial vehicle (UAV) group.
With the development of science and technology, UAV clusters have broad application prospects in military and civil fields, especially for vicinagearth security under the future-oriented near-ground security system. UAV clusters have many advantages, such as strong combat capability, a high system survival rate and a low attack cost, and are of great significance for industrial production, the social economy, scientific research and education, as well as for national defense security, social stability and economic development. Obtaining a high-precision, high-reliability relative space-time relationship among the UAVs in a cluster is critical to the flight safety and mission execution of the UAV cluster. Therefore, there is an increasing demand and necessity for fast, economical and high-quality cooperative positioning technology for unmanned clusters.
At present, domestic and foreign scholars have achieved rich results in the autonomous relative positioning of UAV clusters and have put forward a series of methods, such as laser pulse ranging and positioning, UWB ranging and positioning, vision ranging and positioning, ultrasonic ranging and positioning, and radio ranging and positioning, which are widely used in various fields. Laser pulse ranging and positioning has an extremely high cost; UWB ranging and positioning has poor stability and may also interfere with other wireless communications; ultrasonic ranging and positioning has a slow acquisition speed and a small application scope; and radio ranging and positioning is easily interfered with and has poor reliability. Compared with these methods, vision positioning has the advantages of low cost, passive sensing and low detectability, and is one of the important future research directions. However, existing vision ranging and positioning mainly uses a binocular camera, which imposes a heavy computation load and cannot meet the requirements of nighttime use.
Meanwhile, the ultimate goal of UAV cluster application is to adapt to all-weather and all-scene requirements, so the main challenge comes from the complex environment. At present, there have been many related research results for complex geographical and meteorological environments. However, there is little research on cooperative positioning perception relying on the UAV itself at night, even though night is one of the important application scenarios of the UAV cluster, especially in the military field. Therefore, there is a need for a nighttime internal visual cooperative positioning method for a UAV group to ensure the normal operation of the UAV cluster in the nighttime environment.
In view of the above-mentioned deficiencies of the prior art, an object of the present disclosure is to provide a nighttime cooperative positioning method based on a UAV group.
In order to achieve the above object, the technical solutions adopted by the present disclosure are as follows.
A nighttime cooperative positioning method based on a UAV group, the UAV group including five UAVs, each of the UAVs including a two-dimensional turntable camera and an LED light, includes the following steps:
Further, the positioning benchmark construction specifically includes the following steps:
Further, the time benchmark construction includes: the synchronization of communication clocks among UAVs.
Further, in the flight process, a position change of any one of the UAVs 1-4 leads to a deviation between the real-time monitored pixel coordinate value of the LED light of each UAV and the cooperative positioning benchmark, and the attitude closed-loop controller drives this deviation toward zero to maintain the formation of the UAV group.
Further, the directed communication topology in step 4.1.6 is specifically as follows: the UAV 1 has a bidirectional communication relationship with the UAVs 2 and 4; the UAV 3 has a bidirectional communication relationship with the UAVs 2 and 4; the UAV 1 has no communication relationship with the UAV 3, and the UAV 2 has no communication relationship with the UAV 4; and the UAV 0 has a unidirectional communication relationship with the UAVs 1 and 2, the UAV 0 being the sender of information.
Preferably, the camera is a monocular camera.
Further preferably, the monocular camera has a viewing angle of 90°.
Preferably, the light colors of the LED lights may be set by driver software.
The safe radius described in the present application is twice the radius of a circumscribed circle of a maximum contour of the UAV body.
Compared with the prior art, the present disclosure has the following beneficial effects.
According to the present disclosure, the nighttime cooperative visual positioning of the UAV group is realized by means of the LED lights of the UAVs and the two-dimensional turntable cameras, without adding additional equipment, without relying on GPS, laser radar or ultrasonic radar, and without relying on an external signal source, thereby avoiding external interference. Compared with a positioning method in a conventional manner, in the present disclosure, the system is effectively simplified, and the cooperative positioning within a UAV cluster can be realized relatively simply and at a low cost to maintain the formation of the UAV group.
Other features, objects and advantages of the present disclosure will become more apparent by reading the detailed description of non-limiting examples with reference to the following drawings.
Hereinafter, the present disclosure will be further explained in detail with specific examples. The following examples will aid those skilled in the art in further understanding of the present disclosure, but do not limit the present disclosure in any way. It is to be pointed out that, for those of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present disclosure, which are all within the scope of protection of the present disclosure.
In the following, further details are given by means of specific implementations.
In an example, a nighttime cooperative positioning method based on a UAV group is proposed. As shown in
The specific implementation flow chart is shown in
At step 1: an unmanned cluster formation is arranged before take-off.
Each of UAV groups is arranged at a take-off site according to a rectangular geometry formation shown in
At step 2: the unmanned cluster formation is powered on.
At step 3: the corresponding light colors of UAVs are set according to the requirements shown in
Two UAVs on opposite corners of the rectangle are arranged with LED lights of color I, UAVs on the other opposite corners of the rectangle are arranged with LED lights of color II, and the UAV 0 is arranged with an LED light of color III, color I, color II and color III being different colors. In the example, the UAVs 1 and 4 located at the opposite corners of the rectangle are arranged with yellow LED lights, the UAVs 2 and 3 located at the other opposite corners of the rectangle are arranged with green LED lights, and the UAV 0 is arranged with a red LED light.
Description: the color setting of each of UAVs in the example is not limited to the colors described in
At step 4: the automatic benchmark construction is performed before the UAVs take off.
At step 4.1: the positioning benchmark construction is performed.
At step 4.1.1: an included angle α0 between an axis of the two-dimensional turntable camera of UAV 0 and a heading of UAV 0 is set as zero.
At step 4.1.2: the red LED light of the UAV 0 is searched for as each of the UAVs 1-4 automatically rotates its two-dimensional turntable camera clockwise, so that the red LED light is located at the horizontal center of the camera imaging plane.
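The clockwise search in step 4.1.2 can be sketched as follows. This is a minimal illustration assuming an RGB image array and a crude fixed red threshold; the threshold values, function names, image width and step size are assumptions for illustration, not taken from the source:

```python
import numpy as np

def red_spot_x(img):
    """Return the horizontal pixel coordinate of the red LED spot centroid,
    or None if no red pixels are found. `img` is an H x W x 3 RGB array."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    mask = (r > 200) & (g < 80) & (b < 80)   # crude red threshold (assumption)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean())

def center_red_spot(img, yaw_deg, step_deg=1.0, tol_px=2.0, width=640):
    """One iteration of the clockwise search: rotate the turntable by
    `step_deg` while the spot is absent or off the horizontal center;
    hold the angle once the spot is centered."""
    x = red_spot_x(img)
    if x is None or abs(x - width / 2) > tol_px:
        return yaw_deg + step_deg        # keep rotating clockwise
    return yaw_deg                       # spot centered: hold angle
```

In practice the loop would run per frame until the hold condition is met, after which the angle is recorded as in step 4.1.3.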
At step 4.1.3: included angle values α0-α4 between the axes of the two-dimensional turntable cameras of the UAVs 0-4 and the headings of the UAVs 0-4 at this moment are recorded and stored, where αi (i = 0, 1, 2, 3, 4) is the included angle between the axis of the two-dimensional turntable camera of the UAV i and the heading of the UAV i.
At step 4.1.4: a closed-loop maintenance control program of the included angle of the two-dimensional turntable camera is started to cause the included angle values of α0-α4 to be consistent with the recorded and stored values before take-off in the subsequent whole flight process.
At step 4.1.5: pixel coordinate values (x10, y10)-(x40, y40) of the red LED light spot of the UAV 0 in the camera imaging planes of the UAVs 1-4 at this moment are recorded and stored, where (x10, y10), (x20, y20), (x30, y30) and (x40, y40) are the pixel coordinate values of the red LED light spot of the UAV 0 in the camera imaging planes of the UAVs 1, 2, 3 and 4, respectively. At the same time, pixel coordinates (x12, y12) and (x21, y21) of the LED lights of the UAVs 1 and 2 in the camera imaging planes of each other at this moment, and pixel coordinates (x34, y34) and (x43, y43) of the LED lights of the UAVs 3 and 4 in the camera imaging planes of each other, are recorded and stored. It is to be noted that, due to light shielding, the pixel coordinates (x14, y14) and (x41, y41) of the LED lights of the UAVs 1 and 4 in the camera imaging planes of each other coincide with (x10, y10) and (x40, y40), and the pixel coordinates (x23, y23) and (x32, y32) of the LED lights of the UAVs 2 and 3 in the camera imaging planes of each other coincide with (x20, y20) and (x30, y30).
At step 4.1.6: pixel coordinates (x21, y21) and (x41, y41) of the UAV 1 in the camera imaging planes of the UAVs 2 and 4 are acquired by the UAV 1 to acquire cooperative positioning benchmark information of the UAV 1 in the cluster as {(x10, y10), (x21, y21), (x41, y41)} by means of a directed communication topology shown in
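The benchmark acquisition in step 4.1.6 relies on the directed communication topology described above. As an illustrative sketch (the edge-set encoding and function name are not from the source), the topology can be written as a set of directed edges, where an edge (a, b) means UAV a may send information to UAV b:

```python
# Directed communication topology of the UAV group, per step 4.1.6:
# UAV 1 <-> UAVs 2 and 4; UAV 3 <-> UAVs 2 and 4;
# no link between UAVs 1 and 3, nor between UAVs 2 and 4;
# UAV 0 -> UAVs 1 and 2 only (UAV 0 is a sender of information).
EDGES = {
    (1, 2), (2, 1),   # UAV 1 <-> UAV 2
    (1, 4), (4, 1),   # UAV 1 <-> UAV 4
    (3, 2), (2, 3),   # UAV 3 <-> UAV 2
    (3, 4), (4, 3),   # UAV 3 <-> UAV 4
    (0, 1), (0, 2),   # UAV 0 -> UAVs 1 and 2 (unidirectional)
}

def can_send(sender, receiver):
    """Return True if `sender` may transmit to `receiver` in the topology."""
    return (sender, receiver) in EDGES
```

Under this encoding, the UAV 1 can collect (x21, y21) from the UAV 2 and (x41, y41) from the UAV 4, yielding its benchmark set {(x10, y10), (x21, y21), (x41, y41)}.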
At step 4.2: the time benchmark construction includes the synchronization of communication clocks among UAVs, ensuring the consistency of cooperative positioning of UAVs in the cluster.
At step 5: the vertical take-off of the UAV 0 is controlled by a control instruction of an external system. At the same time, when the LED light of the UAV 1 or 2 enters the camera viewing angle range of the UAV 0 in the flight process, that is, when an LED light pixel point of a yellow or green color appears in the camera imaging plane of the UAV 0, an anti-collision warning instruction is sent by the UAV 0 to the UAV 1 or 2 through the communication topology, thereby avoiding a collision risk between the UAVs.
At step 6: the real-time pixel coordinate values of the red LED light of the UAV 0 in the camera imaging planes of the UAVs 1-4 change with the vertical take-off action of the UAV 0. The deviations between the real-time pixel coordinate values and the pixel coordinate values (x10, y10)-(x40, y40) stored in the ground records of the UAVs 1-4 are calculated by the attitude controllers of the UAVs 1-4, so that the UAVs 1-4 finally follow the take-off action of the UAV 0 by closed-loop control.
Supplementary note: in the flight process, a position change of any one of the UAVs 1-4 leads to a deviation between the real-time monitored pixel coordinate value of the LED light of each UAV and the cooperative positioning benchmark; the attitude closed-loop controller drives this deviation to zero, or controls it within a certain precision range, to maintain the formation of the UAV group.
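The deviation-nulling behavior above can be sketched as a simple proportional correction on the pixel deviation. The gain, tolerance and pure-proportional control law are illustrative assumptions; the source only requires that the closed loop drive the deviation to zero or hold it within a precision range:

```python
def pixel_deviation(current, benchmark):
    """Deviation between the real-time monitored pixel coordinate of an LED
    light spot and its stored benchmark coordinate."""
    return (current[0] - benchmark[0], current[1] - benchmark[1])

def attitude_command(dev, kp=0.01, tol_px=1.0):
    """Proportional correction opposing the pixel deviation.
    Returns a zero command once the deviation is within the precision range."""
    dx, dy = dev
    if abs(dx) <= tol_px and abs(dy) <= tol_px:
        return (0.0, 0.0)              # within precision range: hold position
    return (-kp * dx, -kp * dy)        # command opposing the deviation
```

Each of the UAVs 1-4 would evaluate this correction every frame against its stored benchmark, e.g. (x10, y10) for the UAV 1, so the formation tracks the motion of the UAV 0.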
It is to be noted that, herein, relational terms such as “first” and “second” are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply that there is any such an actual relationship or order between these entities or operations. Moreover, the terms “including”, “containing” or any other variations are intended to cover non-exclusive inclusion, so that a process, method, article or equipment including a series of elements includes not only those elements, but also other elements not explicitly listed or elements inherent to such a process, method, article or equipment.
It is to be noted that similar numerals and letters indicate similar items in the following drawings, so once an item is defined in one drawing, it does not need to be further defined and explained in subsequent drawings.
What has been described above is only an example of the present disclosure, and the common knowledge of specific structures and characteristics known in the solution is not described here in detail. Those of ordinary skill in the field know all the general technical knowledge of the technical field to which the present disclosure belongs before the application date or the priority date, can know all the existing technologies in the field, and have the ability to apply conventional experimental means before that date. Under the inspiration given by the present application, those of ordinary skill in the field can improve and implement the solution in combination with their own abilities. Some typical well-known structures or well-known methods are not to be an obstacle for those of ordinary skill in the field to implement the present application. It is to be pointed out that, for those skilled in the art, several variations and improvements can be made without departing from the structure of the present disclosure, which are also to be regarded as within the scope of protection of the present disclosure, and these will not affect the implementation effect of the present disclosure or the practicability of the patent. The scope of protection claimed by the present application is to be subject to the contents of the claims, and the detailed descriptions in the specification can be used to interpret the contents of the claims.
Number | Date | Country | Kind |
---|---|---|---|
202310106488.5 | Feb 2023 | CN | national |
Number | Name | Date | Kind |
---|---|---|---|
11238281 | Cui | Feb 2022 | B1 |
11861896 | Wang | Jan 2024 | B1 |
20140236388 | Wong | Aug 2014 | A1 |
20140374535 | Wong | Dec 2014 | A1 |
20170138732 | Pettersson | May 2017 | A1 |
20170193781 | Bryson | Jul 2017 | A1 |
20170210486 | O'Brien | Jul 2017 | A1 |
20170372625 | Horinouchi | Dec 2017 | A1 |
20180067502 | Chi-Hsueh | Mar 2018 | A1 |
20180074520 | Liu | Mar 2018 | A1 |
20180164820 | Aboutalib | Jun 2018 | A1 |
20180357909 | Eyhorn | Dec 2018 | A1 |
20190114925 | Schulman | Apr 2019 | A1 |
20190146501 | Schick | May 2019 | A1 |
20190176987 | Beecham | Jun 2019 | A1 |
20190246626 | Baughman | Aug 2019 | A1 |
20190291893 | Hörtner | Sep 2019 | A1 |
20190373173 | Wang | Dec 2019 | A1 |
20200108923 | Smith | Apr 2020 | A1 |
20200108926 | Smith | Apr 2020 | A1 |
20200404163 | Hörtner | Dec 2020 | A1 |
20210129989 | Schuett | May 2021 | A1 |
20210255645 | Wang | Aug 2021 | A1 |
20210300555 | Ali | Sep 2021 | A1 |
20210403159 | Dey | Dec 2021 | A1 |
20220285836 | Badichi | Sep 2022 | A1 |
20230058405 | Chen | Feb 2023 | A1 |
20230109390 | Wang | Apr 2023 | A1 |
20230410662 | Sha | Dec 2023 | A1 |
Number | Date | Country |
---|---|---|
107831783 | Mar 2018 | CN |
108052110 | May 2018 | CN |
110119158 | Aug 2019 | CN |
112631329 | Apr 2021 | CN |
113821052 | Dec 2021 | CN |
115097846 | Sep 2022 | CN |
115651204 | Jan 2023 | CN |
2022247597 | Dec 2022 | WO |
Number | Date | Country
---|---|---
20240272650 A1 | Aug 2024 | US