The subject matter herein generally relates to an unmanned aerial vehicle control method and an unmanned aerial vehicle.
Current protection devices are centered around passive monitoring of individuals. For example, cameras are installed at fixed locations. The cameras have a given field of view based on the camera and installation configuration.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant feature being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features of the present disclosure.
The term “comprising,” when utilized, means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series and the like.
The disclosure will now be described in relation to an unmanned aerial vehicle (UAV) control method and a UAV applying the method.
The UAV system 10 includes a direction control module 16, a structuring module 17, an image detecting module 18, and a controlling module 19. In one embodiment, the UAV system 10 may include computerized instructions in the form of one or more programs that are stored in the storage device 13 and executed by the processor 15.
The UAV 1 can fly in the space above the user. The camera device 11 includes four depth-sensing cameras 111, and four rotors 112 drive the UAV 1 to fly, as illustrated in
The direction control module 16 acquires the position of the user through the GPS 12 and detects the flight direction of the UAV 1 from the electronic compass built into the UAV 1, to compute a relative position between the user and the UAV 1; the direction control module 16 can then adjust the flight direction of the UAV 1 according to the relative position. The structuring module 17 creates a sample feature database (such as clothes, figure, sex, and hair style) for a user, and stores the sample features of the sample feature database in the storage device 13.
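As a minimal sketch of how the direction control module 16 might compute such a relative position, the following Python example derives the bearing from the UAV 1 to the user from two GPS fixes and compares it with the compass heading; the function names and the simple turn-angle computation are illustrative assumptions and are not part of the disclosure.

```python
import math

def bearing_to_user(uav_lat, uav_lon, user_lat, user_lon):
    """Return the bearing (degrees from north) from the UAV to the user."""
    d_lon = math.radians(user_lon - uav_lon)
    lat1, lat2 = math.radians(uav_lat), math.radians(user_lat)
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def heading_correction(compass_heading, target_bearing):
    """Smallest signed angle (degrees) the UAV must turn to face the user."""
    return (target_bearing - compass_heading + 180.0) % 360.0 - 180.0

# Example: UAV heading 90 degrees, user at bearing 45 degrees -> turn -45 degrees (to the left).
turn = heading_correction(90.0, 45.0)
```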
The timing module 191 is configured to compute whether a storage time of the features of an object stored in the storage device 13 exceeds an alarm time, such as 30 minutes. If the storage time of the features of the object exceeds the alarm time, the computing module 192 computes whether a size change of the features of the object in the scene image exceeds a preset percent value, such as 20%. If the size change of the features of the object does not exceed the preset percent value, the object is confirmed to be a tagger. The sending module 193 then sends an alarm signal to the handheld device 2 through the web module 14.
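A compact sketch of the check performed by the timing module 191 and the computing module 192 could look as follows, assuming each tracked object is stored as a record holding its first-seen time and an apparent size (for example, a bounding-box area in pixels); the record layout, constant names, and helper function are assumptions for illustration only.

```python
import time

ALARM_TIME = 30 * 60        # seconds an object may stay tracked before it becomes suspicious
SIZE_CHANGE_LIMIT = 0.20    # the 20% preset percent value from the example above

def is_tagger(record, now=None):
    """record is assumed to hold 'first_seen' (epoch seconds), 'initial_size',
    and 'current_size' (apparent sizes of the object in the scene image)."""
    now = time.time() if now is None else now
    stored_long_enough = (now - record["first_seen"]) > ALARM_TIME
    size_change = abs(record["current_size"] - record["initial_size"]) / record["initial_size"]
    # A person who stays in view past the alarm time at a roughly constant
    # apparent size (and therefore distance) is treated as following the user.
    return stored_long_enough and size_change <= SIZE_CHANGE_LIMIT
```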
Referring to
At block 801, the structuring module 17 creates a sample feature database (such as clothes, figure, sex, and hair style) for a user, and stores the sample features of the sample feature database in the storage device 13 with numbers.
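One possible shape for the sample feature records created at block 801 is sketched below; the field names, the dictionary keyed by number, and the choice of number 0 for the user are illustrative assumptions rather than a format required by the disclosure.

```python
import time

def make_feature_record(number, clothes, figure, sex, hair_style):
    """Sample feature record for one person; the fields mirror the examples at block 801."""
    return {"number": number, "clothes": clothes, "figure": figure,
            "sex": sex, "hair_style": hair_style, "first_seen": time.time()}

# Sample feature database keyed by number; in this sketch the user is number 0.
sample_features = {0: make_feature_record(0, "red jacket", "tall", "female", "short hair")}
```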
At block 802, the direction control module 16 acquires the position of the user through the GPS 12, to determine a relative position between the UAV 1 and the user. The direction control module 16 controls the direction and position of the UAV 1 according to the relative position, to make the UAV 1 follow the user continually.
At block 803, the camera device 11 continually captures scene images around the user within a certain scope in every direction. Referring to
At block 804, features of the human image, such as clothes, figure, sex, and hair style, are analyzed. In at least one embodiment, the features of the human image can be obtained through a human detection technology.
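The disclosure does not name a particular human detection technology; as one hedged example, a conventional pedestrian detector such as OpenCV's HOG people detector could locate the human image, after which the listed attributes would be produced by separate classifiers, represented here only by placeholders.

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(scene_image):
    """Return bounding boxes (x, y, w, h) of people found in the scene image."""
    boxes, _weights = hog.detectMultiScale(scene_image, winStride=(8, 8))
    return boxes

def extract_features(scene_image, box):
    """Placeholder: real attribute classifiers (clothes, figure, sex, hair style)
    would run on the cropped person here; the keys mirror the examples at block 801."""
    x, y, w, h = box
    crop = scene_image[y:y + h, x:x + w]
    return {"clothes": None, "figure": None, "sex": None,
            "hair_style": None, "size": w * h, "crop": crop}
```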
At block 805, the comparing module 182 of the image detecting module 18 compares the features of the user stored in the storage device 13 with the features of the human image of the object. In the embodiment, if the features of the human image of the object are the same as the features of the user stored in the storage device 13, the UAV 1 can determine that the object owns the same number as the user. If the features of the human image of the object are different from the features of the user stored in the storage device 13, the UAV 1 can determine that the currently acquired object has not appeared in the scene images before, and then the editing module 183 can give the human image a new number. The features of the human image can be stored in the storage device 13.
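A simplified sketch of the comparison and numbering performed at block 805 follows; exact equality of the attribute values stands in for whatever similarity measure the comparing module 182 actually applies, and the dictionary-based records keyed by number follow the sketch after block 801.

```python
import time

next_number = 1  # number 0 is assumed to belong to the user in this sketch

def match_or_register(features, database):
    """Compare extracted attributes against stored records; return the matching
    number, or assign a new number and store the features if nothing matches."""
    global next_number
    for number, record in database.items():
        if all(features.get(k) == record.get(k)
               for k in ("clothes", "figure", "sex", "hair_style")):
            return number                          # matches an already-numbered person
    number, next_number = next_number, next_number + 1
    database[number] = dict(features, first_seen=time.time())  # new object gets a new number
    return number
```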
At block 806, the monitoring module 190 determines whether the features of the object have appeared again within a preset period of time, such as five (5) minutes. If the features of the object have not appeared again within the preset period of time, the process goes to block 807; otherwise, the process goes to block 808.
At block 807, the monitoring module 190 eliminates the features and the number of the object from the storage device 13, to release storage space and resources of the storage device 13.
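Blocks 806 and 807 together amount to purging records of objects that have not reappeared within the preset period; a short sketch is given below, in which the preset-period constant, the 'last_seen' field, and the helper name are assumptions added for illustration.

```python
import time

PRESET_PERIOD = 5 * 60  # seconds an object may go unseen before its record is dropped

def purge_stale_objects(database, now=None):
    """Remove records (except the user's) whose features have not reappeared
    within the preset period, releasing storage space as at block 807."""
    now = time.time() if now is None else now
    stale = [number for number, record in database.items()
             if number != 0
             and now - record.get("last_seen", record["first_seen"]) > PRESET_PERIOD]
    for number in stale:
        del database[number]
    return stale
```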
At block 808, the timing module 191 determines a storage time of the features of the object stored in the storage device 13. If the storage time of the features of the object exceeds an alarm time, such as 30 minutes, the process goes to block 809; otherwise, the process goes to block 804.
At block 809, the computing module 192 computes a size change of the features of the object. Referring to
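How the size change of the features is measured at block 809 is not fixed by the disclosure; the sketch below takes the apparent size to be the area of the object's bounding box and applies the 20% preset percent value from the earlier example.

```python
PRESET_PERCENT = 0.20  # 20% size-change threshold from the example above

def size_change_exceeds_threshold(initial_box, current_box, threshold=PRESET_PERCENT):
    """Boxes are (x, y, w, h); the apparent size is taken as the box area.
    Returns True when the relative change in area exceeds the threshold."""
    initial_area = initial_box[2] * initial_box[3]
    current_area = current_box[2] * current_box[3]
    return abs(current_area - initial_area) / initial_area > threshold

# Example: a box growing from 100x200 to 105x205 changes by about 7.6%, below the
# 20% threshold, so the object would be confirmed as a tagger at block 809.
```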
At block 810, the UAV 1 sends an alarm signal to the handheld device 2 carried by the user, and the alarm signal is configured to prompt the user that there is danger.
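The transport used by the web module 14 to reach the handheld device 2 is not specified; purely as a hypothetical illustration, the alarm signal could be delivered as an HTTP request, where the endpoint address and payload format are assumptions.

```python
import requests

ALARM_ENDPOINT = "http://handheld.example/alarm"  # hypothetical address of the handheld device 2

def send_alarm(object_number):
    """Push an alarm signal prompting the user that the numbered object may be a tagger."""
    payload = {"type": "tagger_alert", "object_number": object_number}
    # A real web module 14 might use any transport; an HTTP POST is only an example.
    response = requests.post(ALARM_ENDPOINT, json=payload, timeout=5)
    return response.status_code == 200
```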
While the disclosure has been described by way of example and in terms of the embodiment, it is to be understood that the disclosure is not limited thereto. On the contrary, it is intended to cover various modifications and similar arrangements as would be apparent to those skilled in the art. Therefore, the range of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.