The subject matter herein generally relates to an unmanned aerial vehicle control method and an unmanned aerial vehicle.
Unmanned aerial vehicles (UAVs) have become more widely used, for example, for performing surveillance, reconnaissance, and exploration tasks in military and civilian applications. Generally, a UAV is controlled to land at a target surface by a remote controller. Sometimes, the target surface is not a desired surface suitable for landing, for example, a bumpy and pitted road. Sometimes, the UAV may crash into an obstruction during a landing process due to unskilled operation. Therefore, there is a need for a UAV control method capable of providing a relatively smooth landing regardless of where the UAV lands and who operates the UAV.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant features being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale, and the proportions of certain parts may be exaggerated to better illustrate details and features of the present disclosure.
A definition that applies throughout this disclosure will now be presented.
The term “comprising,” when utilized, means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series and the like.
The depth-sensing camera 11 can be arranged at the bottom of the UAV 1 and configured to take images below the UAV 1, as illustrated in the accompanying figures.
The gyroscope 12 can be configured to detect attitude data of the UAV, including pitch angles, angular velocity and orientation. In at least one embodiment, the gyroscope 12 can be substituted by a currently available inertial measurement unit (IMU).
The at least one drive unit 13 can be configured to drive the at least one rotor 14 to rotate to move the UAV 1. In the exemplary embodiment, the drive unit 13 can be a motor.
The storage device 15 can be an internal storage unit of the UAV 1, for example, a hard disk or memory, or a pluggable memory, for example, a Smart Media Card, a Secure Digital Card, or a Flash Card. In at least one embodiment, the storage device 15 can include two or more storage devices such that one storage device is an internal storage unit and the other storage device is a pluggable memory. The processor 16 can be a central processing unit (CPU), a microprocessor, or other data processor chip that performs functions of the UAV 1.
A UAV control system 10 can include computerized instructions in the form of one or more programs that can be stored in the storage device 15 and executed by the processor 16. In the exemplary embodiment, the UAV control system 10 can be integrated in the processor 16. In at least one embodiment, the UAV control system 10 can be independent from the processor 16. Referring to the accompanying figures, the UAV control system 10 can include a detecting module 101, a photographing module 102, a determining module 103, a calculating module 104, and a controlling module 105.
The detecting module 101 can be configured to detect current attitude data of the UAV 1 via the gyroscope 12. In the exemplary embodiment, the attitude data can include angular velocity, orientation, and pitch angles.
The photographing module 102 can be configured to control the depth-sensing camera 11 to take images below the UAV 1.
The determining module 103 can be configured to determine whether a current altitude of the UAV 1 is less than a predefined value, for example, 10 m, 15 m, 20 m, or any other desired value. The determining module 103 can further be configured to determine whether a surface directly below the UAV 1 is a desirable surface for landing based on depth information of the images taken by the depth-sensing camera 11. The determining module 103 can further be configured to determine whether a surface adjacent to the surface directly below the UAV 1 is a desirable surface for landing, based on the depth information of the images taken by the depth-sensing camera 11, when the surface directly below the UAV 1 is not suitable for landing.
The calculating module 104 can be configured to calculate drive data based on the attitude data.
The controlling module 105 can be configured to control the drive unit 13 to drive the at least one rotor 14 to rotate, based on the drive data, so as to move the UAV 1 to a desired position in a desired manner. For example, if the surface directly below the UAV 1 is suitable for landing, the drive data can cause the drive unit 13 to drive the rotors 14 to rotate so that the UAV 1 lands slowly on the surface directly below the UAV 1. If the surface directly below the UAV 1 is not suitable for landing and there is a nearby surface suitable for landing, the drive data can cause the drive unit 13 to drive the rotors 14 to rotate so that the UAV 1 moves to the adjacent suitable surface, as illustrated in the accompanying figures.
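By way of non-limiting illustration only, the following Python sketch shows one way the controlling module 105 might select among the three outcomes described above (land in place, move to an adjacent suitable surface, or hover). The function name, argument shapes, and return values are assumptions introduced for illustration and are not part of the disclosed embodiments.

```python
# Illustrative sketch only: high-level selection among the three outcomes the
# controlling module 105 is described as producing. Names and data shapes are
# assumptions, not part of the disclosed embodiments.

def choose_landing_action(below_is_suitable, adjacent_target):
    """Return a high-level action: land on the surface directly below, move
    toward an adjacent suitable surface, or hover when none is available."""
    if below_is_suitable:
        return ("land", None)             # descend slowly onto the surface below
    if adjacent_target is not None:
        return ("move", adjacent_target)  # translate toward the adjacent surface
    return ("hover", None)                # hold position above the current surface

# Hypothetical usage: the surface below is bumpy, but a surface 1.5 m east is suitable.
print(choose_landing_action(False, (1.5, 0.0)))  # -> ('move', (1.5, 0.0))
```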
Referring to the accompanying flowchart, an exemplary method for controlling the UAV 1 to land is presented and includes the following blocks.
At block 202, the UAV controls the depth-sensing camera to take images of the surface below the UAV and controls the gyroscope to detect a current pitch angle of the UAV. The images taken by the depth-sensing camera can include depth information.
At block 204, the UAV determines a current altitude based on the depth information of the images. In the exemplary embodiment, the UAV can determine whether the surface is bumpy based on the depth information. For example, the image taken by the depth-sensing camera can be divided into a plurality of blocks, each block corresponding to a depth value. If a difference value between the depth values of two adjacent blocks exceeds a predefined range, the surface is determined to be bumpy. On the other hand, if the difference value between the depth values of two adjacent blocks falls within the predefined range, the surface is determined to be even. If the surface is determined to be even, the current altitude can be determined from the depth values, as illustrated in the accompanying figures.
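As a non-limiting illustration of the block-wise comparison described at block 204, the following Python sketch divides a depth image into blocks, tests adjacent blocks against a flatness threshold, and estimates an altitude from the depth values. The block size, the threshold value, and the use of the mean block depth as the altitude are assumptions for illustration only.

```python
# Illustrative sketch of the block-wise depth comparison at block 204.
# Thresholds, block size, and the altitude estimate are assumptions.

def split_into_blocks(depth_image, block_size):
    """Divide a 2-D depth image (list of lists of depths in meters) into
    blocks and return the mean depth of each block as a 2-D grid."""
    rows, cols = len(depth_image), len(depth_image[0])
    grid = []
    for r in range(0, rows, block_size):
        row_means = []
        for c in range(0, cols, block_size):
            vals = [depth_image[i][j]
                    for i in range(r, min(r + block_size, rows))
                    for j in range(c, min(c + block_size, cols))]
            row_means.append(sum(vals) / len(vals))
        grid.append(row_means)
    return grid

def surface_is_even(block_depths, max_step=0.10):
    """Return True if every pair of horizontally or vertically adjacent
    blocks differs by no more than max_step meters."""
    for r, row in enumerate(block_depths):
        for c, d in enumerate(row):
            if c + 1 < len(row) and abs(d - row[c + 1]) > max_step:
                return False
            if r + 1 < len(block_depths) and abs(d - block_depths[r + 1][c]) > max_step:
                return False
    return True

def estimate_altitude(block_depths):
    """Assumption: when the surface is even, use the mean block depth as the altitude."""
    flat = [d for row in block_depths for d in row]
    return sum(flat) / len(flat)
```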
At block 206, the UAV determines whether the current altitude is less than a predetermined value, for example, 10 m, 15 m, 20 m or other suitable values.
At block 208, the UAV controls the drive unit to drive the at least one rotor to rotate so as to cause the UAV to descend in a balanced and slow manner. In detail, the UAV calculates drive data based on the pitch angle and current velocity, and then controls the drive unit to drive the rotor to rotate based on the drive data. The balanced manner indicates that the unmanned aerial vehicle is kept substantially horizontal, where the pitch angle of the unmanned aerial vehicle is substantially equal to zero.
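The disclosure does not specify how the drive data are computed from the attitude data; the following Python sketch shows one hypothetical proportional scheme that keeps the pitch angle near zero while commanding a slow descent. The gains, the front/rear two-rotor-group thrust model, and all numeric values are assumptions for illustration only.

```python
# Hypothetical proportional controller for a balanced, slow descent (block 208).
# Not the patent's method: gains, thrust model, and values are assumptions.

def compute_drive_data(pitch_deg, descent_rate_mps,
                       target_descent_rate_mps=0.5,
                       k_pitch=0.02, k_descent=0.1,
                       hover_throttle=0.5, descent_bias=0.05):
    """Return throttle commands (0..1) for a front and a rear rotor group."""
    # Collective throttle sits slightly below hover so the UAV descends, with a
    # proportional correction toward the target descent rate (positive = downward).
    collective = (hover_throttle - descent_bias
                  - k_descent * (target_descent_rate_mps - descent_rate_mps))
    # Differential throttle levels the airframe: a nose-up pitch shifts thrust
    # from the front rotors to the rear rotors, producing a nose-down moment.
    differential = k_pitch * pitch_deg
    front = min(max(collective - differential, 0.0), 1.0)
    rear = min(max(collective + differential, 0.0), 1.0)
    return front, rear

# Hypothetical reading: nose pitched up 3 degrees, descending at 0.4 m/s.
print(compute_drive_data(pitch_deg=3.0, descent_rate_mps=0.4))
```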
At block 210, the UAV controls the depth-sensing camera to take images of the surface under the UAV.
At block 212, the UAV determines whether the surface is suitable for landing based on the images. In the exemplary embodiment, the UAV determines whether the surface directly below the UAV 1 is suitable for landing. The surface directly below the UAV 1 can include a rotor range R1 and an undercarriage range R2, as illustrated in the accompanying figures.
The UAV can make this determination based on the depth information of the images. As described above, the images can be divided into a plurality of blocks, each block corresponding to a depth value. If the difference values between the depth values of adjacent blocks are within a predefined range, the surface is determined to be even and suitable for landing. Otherwise, if the difference values between the depth values of adjacent blocks exceed the predefined range, the surface is determined to be bumpy and not suitable for landing. If the surface is suitable for landing, the process goes to block 214; otherwise, the process goes to block 216.
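As a non-limiting illustration of the block-212 determination, the following Python sketch applies the adjacent-block flatness test separately to an undercarriage range R2 and a larger rotor range R1 centered directly below the UAV. Modeling R1 and R2 as concentric rectangles of blocks, as well as the region sizes and threshold, are assumptions for illustration only.

```python
# Illustrative sketch of checking the undercarriage range R2 and rotor range R1
# directly below the UAV (block 212). Region shapes and sizes are assumptions.

def region_is_even(block_depths, r0, r1, c0, c1, max_step=0.10):
    """True if adjacent blocks inside rows r0..r1 and cols c0..c1 (inclusive)
    differ by at most max_step meters."""
    for r in range(r0, r1 + 1):
        for c in range(c0, c1 + 1):
            if c < c1 and abs(block_depths[r][c] - block_depths[r][c + 1]) > max_step:
                return False
            if r < r1 and abs(block_depths[r][c] - block_depths[r + 1][c]) > max_step:
                return False
    return True

def surface_below_is_suitable(block_depths, undercarriage_halfwidth=1, rotor_halfwidth=3):
    """Check the undercarriage range R2 and the larger rotor range R1,
    both centered on the block directly below the UAV."""
    center_r = len(block_depths) // 2
    center_c = len(block_depths[0]) // 2

    def bounds(halfwidth):
        return (max(center_r - halfwidth, 0),
                min(center_r + halfwidth, len(block_depths) - 1),
                max(center_c - halfwidth, 0),
                min(center_c + halfwidth, len(block_depths[0]) - 1))

    return (region_is_even(block_depths, *bounds(undercarriage_halfwidth)) and
            region_is_even(block_depths, *bounds(rotor_halfwidth)))
```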
At block 214, the UAV controls the drive unit to drive the rotors to rotate so as to slowly land the UAV at the surface.
At block 216, the UAV determines whether an adjacent suitable surface is available. The UAV obtains depth information of the images of surfaces adjacent to the surface directly below the UAV 1, and then determines whether the adjacent surfaces are suitable for landing. If there is an adjacent surface suitable for landing, the process goes to block 218; otherwise, the process goes to block 220.
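As a non-limiting illustration of the block-216 search, the following Python sketch slides a fixed-size window over the block-wise depth grid and returns the first window whose adjacent blocks satisfy the flatness test; for simplicity it scans the entire grid rather than only the region adjacent to the surface directly below the UAV. The window size and threshold are assumptions for illustration only.

```python
# Illustrative sketch of searching for an adjacent suitable surface (block 216).
# Window size, step, and threshold are assumptions.

def window_is_even(block_depths, r0, c0, size, max_step=0.10):
    """True if adjacent blocks within the size x size window at (r0, c0)
    differ by at most max_step meters."""
    for r in range(r0, r0 + size):
        for c in range(c0, c0 + size):
            if c + 1 < c0 + size and abs(block_depths[r][c] - block_depths[r][c + 1]) > max_step:
                return False
            if r + 1 < r0 + size and abs(block_depths[r][c] - block_depths[r + 1][c]) > max_step:
                return False
    return True

def find_adjacent_landing_window(block_depths, size=3):
    """Return the top-left block index of the first even size x size window,
    or None if no suitable surface is found."""
    rows, cols = len(block_depths), len(block_depths[0])
    for r0 in range(0, rows - size + 1):
        for c0 in range(0, cols - size + 1):
            if window_is_even(block_depths, r0, c0, size):
                return (r0, c0)
    return None
```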
At block 218, the UAV 1 controls the drive unit to rotate the rotor to land the UAV at the adjacent suitable surface.
Examples of the UAV 1 moving to and landing at an adjacent suitable surface are illustrated in the accompanying figures.
At block 220, the UAV 1 controls the drive unit to drive the rotors to rotate so as to hover above the surface, as illustrated in the accompanying figures.
The embodiments shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the detail, including in matters of shape, size and arrangement of the parts within the principles of the present disclosure up to, and including, the full extent established by the broad general meaning of the terms used in the claims.