The present invention relates to a drone. More specifically, the present invention relates to an amphibious VTOL super drone equipped with self-powered solar cells and a wind turbine, with field view mapping and an advanced collision system.
Conventional drones are adapted for flying and capturing the environment with simple 2D pictures and do not communicate with other unmanned vehicles, whereby collisions occur. Conventional drones do not have self-powered solar cells and a wind turbine that can also be used as a horizontal flight propeller to attain super speed. Conventional drones also do not have folding functions to act as mobile phone cases. The present invention overcomes such problems.
An object of the present invention is to provide an amphibious VTOL super unmanned aerial vehicle with field view mapping and an advanced collision system. The invented amphibious VTOL super drones have self-powered solar cells and a wind turbine that is also used as a horizontal flight propeller to attain super speed. These drones further have folding functions to act as mobile phone cases for taking selfies and selfie videos.
Another object of the present invention is to provide an unmanned aerial vehicle with field view mapping and an advanced collision system which can perform area mapping.
Yet another object of the present invention is to provide an unmanned aerial vehicle with field view mapping and an advanced collision system which can communicate with other unmanned vehicles.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
According to the present invention, a VTOL unmanned aerial vehicle with field view mapping and an advanced collision system is provided. The VTOL unmanned aerial vehicle comprises a plurality of cameras, a plurality of rotors, a power supplying unit, a landing gear, a control device and a communication system. The plurality of cameras is adapted for providing real-time first-person video, a real-time first-person view, normal footage video recording and 360-degree panoramic video recording used for virtual reality views and interactive videos.
The plurality of rotors is configured laterally on a periphery of the unmanned aerial vehicle and adapted for creating a thrusting force, thereby moving the unmanned aerial vehicle in the direction of the thrusting force. The power supplying unit supplies power to the plurality of rotors for moving the unmanned aerial vehicle. The landing gear is adapted for safe landing of the unmanned aerial vehicle. The control device is adapted to set a flight path and an area to be mapped by the unmanned aerial vehicle. The communication system is adapted for sharing the flight path and position of the unmanned aerial vehicle, thereby controlling the unmanned aerial vehicle in a predetermined path.
All illustrations of the drawings are for the purpose of describing selected versions of the present invention and are not intended to limit the scope of the present invention.
Referring now to
Referring now to
The plurality of cameras 110 is arranged on a camera stabilization system 112 arranged on a surface of the unmanned aerial vehicle 100. The plurality of cameras 110 is configured to adjust one or more of the following parameters: zoom, shutter speed, aperture, ISO, focal length, depth of field, exposure compensation, white balance, video or photo frame size and orientation, camera resolution and frame rates; switch cameras used for live streaming; digitally stabilize video; capture panoramic photos; capture thermal measurements; edit colour correction; produce night vision images and video; and produce flash.
The plurality of cameras 110 captures images in a panoramic view and captures a 360-degree view of the environment. The plurality of cameras 110 is adapted to capture video in different resolutions, including 4K resolution, and to capture 3D models of the area covered by the unmanned aerial vehicle 100. The plurality of cameras 110 comprises zooming lenses adapted for capturing distant objects, wherein the zooming lenses are telescopic lenses with autofocus. The zooming lenses 120 are adapted for capturing images and videos of objects at a distance of more than 10 miles without errors or blur.
Further, the plurality of cameras 110 has at least one lens filter. The plurality of cameras 110 is also adapted for mapping the aerial view captured by the unmanned aerial vehicle 100. Specifically, the plurality of cameras 110 includes a depth camera. The depth camera is adapted for finding the distance between a captured object and the unmanned aerial vehicle 100. Further, the camera stabilization system 112 includes a gimbal system adapted for capturing images and video without disturbances, and the camera stabilization system 112 is adapted for controlling the focal point and focus of the plurality of cameras 110.
In an embodiment, the plurality of cameras 110 is adapted for mapping the area of selected 2D maps. The plurality of cameras 110 maps the selected area according to the pre-selected map and sends the mapped images to a mobile device 200 using cellular data, Wi-Fi, Bluetooth or other wireless communication.
In an embodiment, the unmanned aerial vehicle 100 includes a site scan mapping platform (not shown in the figure). The site scan mapping platform is a fully automated and intelligent platform that makes mapping by the unmanned aerial vehicle 100 easy, fast and accurate. The aerial data acquisition enables informed and targeted action. Site scan mapping provides a level of insight that is invaluable to industries like agriculture, construction, mining, and land and resource management, or for gathering data for any area.
The site scan mapping involves the steps of plan, fly and process. A user selects the area to map using the application on the mobile device 200, and the unmanned aerial vehicle 100 computes a flight path that will cover it. While in flight, on-board software automatically captures all the required photos and geo-tags them. In an embodiment, the plurality of cameras 110 is adapted for depth analysis: calculating depths in water bodies, calculating depths in valleys and mountain areas, and calculating the distance between objects at depth.
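The plan-fly-process workflow above can be sketched as a simple boustrophedon (lawnmower) survey path over the selected rectangular area. This is an illustrative sketch only; the function name, rectangular bounds and pass spacing are assumptions for clarity, not part of the claimed platform.

```python
def survey_path(min_x, min_y, max_x, max_y, spacing):
    """Compute a lawnmower (boustrophedon) flight path of waypoints
    covering a rectangular map area with parallel passes spaced
    `spacing` units apart. Photo capture and geo-tagging would be
    triggered along each pass."""
    waypoints = []
    x = min_x
    heading_up = True
    while x <= max_x:
        if heading_up:
            waypoints.append((x, min_y))
            waypoints.append((x, max_y))
        else:
            waypoints.append((x, max_y))
            waypoints.append((x, min_y))
        x += spacing
        heading_up = not heading_up  # alternate pass direction
    return waypoints
```

In practice the spacing would be derived from the camera footprint and the desired image overlap, so that adjacent passes produce overlapping photos suitable for stitching into a map.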
In the present embodiment, the communication system 160 comprises a traffic control system and a collision avoidance system. The traffic control system is used to control the air traffic between unmanned aerial vehicles. The collision avoidance system is adapted to communicate between unmanned aerial vehicles using a cellular network, Wi-Fi or Bluetooth. When the collision avoidance system detects an obstacle, the unmanned aerial vehicle 100 immediately halts forward motion, allowing the pilot to redirect the unmanned aerial vehicle to avoid a crash. This works when the unmanned aerial vehicle 100 is flying forward, backward, sideways or in any direction obvious to a person skilled in the art.
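The halt-on-detection behavior described above can be sketched as follows; the function, the clearance threshold and the velocity representation are illustrative assumptions, not the claimed implementation.

```python
def collision_check(obstacle_distance_m, velocity, min_clearance_m=5.0):
    """If an obstacle is detected inside the clearance radius, halt
    motion by zeroing the commanded velocity vector and raise an
    alert so the pilot can redirect the vehicle."""
    if obstacle_distance_m < min_clearance_m:
        return (0.0, 0.0, 0.0), True   # hold position, alert pilot
    return velocity, False             # continue on current velocity
```

Because the check operates on the full velocity vector rather than a single axis, the same logic applies whether the vehicle is moving forward, backward or sideways.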
In the present embodiment, the collision avoidance system is a low altitude tracking and avoidance system. Further, the collision avoidance system is connected to a remote device for controlling the unmanned aerial vehicle 100. The low altitude tracking and avoidance system platform connects leading airspace management technologies, such as sense and avoid, geofencing and aircraft tracking, into a service package for commercial and recreational drone operators as well as regulators and air traffic controllers.
Further referring to figure XX, the plurality of rotors 120 comprises tiltable rotors which tilt from 0 to 90 degrees to change the direction of thrust, thereby creating movement of the vehicle 100 in all directions. The plurality of rotors 120 has a plurality of blades, wherein the blades are airfoils adapted for creating forward thrust and reverse thrust. Further, the plurality of rotors 120 is connected to at least one motor arranged on the unmanned aerial vehicle 100. The plurality of rotors 120 is adapted for creating vertical lifting and landing.
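The effect of the 0-to-90-degree tilt on thrust direction can be expressed as a simple decomposition: at 0 degrees all thrust is vertical lift, and at 90 degrees all thrust is horizontal. The function below is a geometric sketch under that assumption and is not a flight-control implementation.

```python
import math

def thrust_components(total_thrust_n, tilt_deg):
    """Resolve rotor thrust into vertical (lift) and horizontal
    (forward) components for a tilt angle between 0 and 90 degrees,
    where 0 deg gives pure lift and 90 deg gives pure forward thrust."""
    tilt = math.radians(tilt_deg)
    lift = total_thrust_n * math.cos(tilt)
    forward = total_thrust_n * math.sin(tilt)
    return lift, forward
```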
Further, the power supplying unit 130 is a solar panel for supplying power to batteries and an APU, thereby providing power to rotate the plurality of rotors 120. Specifically, the solar panel is retractable. The solar panels convert solar energy into electrical energy that is stored in the batteries, which can be used as a backup and as a power bank for several electronic devices and to supply electricity to the various components of the unmanned aerial vehicle 100. The power supplying unit 130 comprises a plurality of sensors controlled by the control device 150 to detect the battery levels and power consumption of the unmanned aerial vehicle 100.
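The battery-level and power-consumption sensing described above can be sketched as an endurance estimate; the function, the 20% reserve threshold and the capacity figures are illustrative assumptions rather than part of the power supplying unit 130 as claimed.

```python
def power_status(battery_pct, consumption_w, capacity_wh):
    """Estimate remaining flight time from the sensed battery level
    and present power draw, and flag when the reserve threshold is
    reached so the control device can act (e.g. return to home)."""
    remaining_wh = capacity_wh * battery_pct / 100.0
    if consumption_w <= 0:
        return float("inf"), battery_pct < 20.0
    minutes_left = remaining_wh / consumption_w * 60.0
    return minutes_left, battery_pct < 20.0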
The plurality of sensors includes at least one GPS sensor and at least one acoustic sensor. The at least one GPS sensor is adapted to guide the unmanned aerial vehicle 100 to a desired location. The control device 150 is adapted to send navigation and position data to the unmanned aerial vehicle 100 through the GPS sensor. The at least one acoustic sensor is adapted for finding minerals and ores in water and on land.
The landing gear 140 is adapted for landing the unmanned aerial vehicle 100 safely on a dock. The landing gear 140 is also adapted for horizontal stabilization. The landing gear 140 comprises a plurality of tilting cameras, wherein the tilting cameras are adapted for capturing a 360-degree view of the area.
The control device 150 is a remote control device adapted for giving commands to and communicating with the unmanned aerial vehicle 100. The control device 150 is a mobile phone, a tablet or another communication device. The control device 150 includes a tap-fly function that allows the user to tap a point on a map displayed on the control device 150 to choose a flight path automatically, thereby avoiding obstacles along the way of flight.
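Automatically choosing an obstacle-avoiding path to a tapped point can be sketched as a shortest-path search over an occupancy grid. This is one common technique offered as an illustration, with the grid representation and function assumed for clarity; the patent does not specify the path-planning algorithm.

```python
from collections import deque

def tap_fly_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free cell,
    1 = obstacle). Returns a list of cells from start to goal that
    avoids obstacles, or None if no route exists."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

Breadth-first search returns a shortest path on a uniform grid; a fielded system would more likely use A* with a distance heuristic and a dynamically updated obstacle map.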
The unmanned aerial vehicle 100 further comprises a plurality of location sensors adapted to guide the unmanned aerial vehicle 100 to the desired location; the control device 150 is adapted for sending navigation and position data to the unmanned aerial vehicle 100 through the plurality of location sensors. The plurality of location sensors includes at least one acoustic sensor adapted for finding minerals and ores in water and on land. The unmanned aerial vehicle 100 is adapted for underwater, surface and aerial surveillance, for capturing videos, for first-person view and for recording in 4K resolution. The unmanned aerial vehicle 100 is adapted for aerial delivery, surface delivery and underwater delivery; the sensors of the unmanned aerial vehicle 100 receive the delivery address from the control device 150.
In another aspect a method 300 of controlling an unmanned aerial vehicle 100 in accordance with the present invention is illustrated. Referring now to figure XX, a flow chart of the method 300 in accordance with the present invention is provided. For the sake of brevity, the method 300 is explained in conjunction with the unmanned aerial vehicle 100 explained above.
The method 300 starts at step 310.
At step 320, a plurality of cameras 110 captures a real-time first-person video and a real-time first-person view and normal footage video recording and 360-degree panoramic video recording used for virtual reality views and interactive video.
At step 330, a control device 150 communicates with an unmanned aerial vehicle 100.
At step 340, the unmanned aerial vehicle 100 is safely landed using a landing gear 140 provided on the unmanned aerial vehicle 100.
Therefore, an advantage of the present invention is to provide an unmanned aerial vehicle with field view mapping and an advanced collision system. The present invention can perform area mapping. The present invention can also communicate with other unmanned vehicles.
The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the present invention and its practical application, and to thereby enable others skilled in the art to best utilize the present invention and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the spirit or scope of the claims of the present invention.
This application is a continuation-in-part of U.S. application Ser. No. 29/572,722, entitled “Amphibious vtol, hover, backward, leftward, rightward, turbojet, turbofan, rocket engine, ramjet, pulse jet, afterburner, and scramjet single/dual all in one jet engine (fuel/electricity) with onboard self computer based autonomous module gimbaled swivel propulsion (GSP) system device, same as ducted fan (fuel/electricity)”, filed Jul. 29, 2016. This application is a continuation-in-part of U.S. application Ser. No. 29/567,712, entitled “Amphibious vtol, hover, backward, leftward, rightward, turbojet, turbofan, rocket engine, ramjet, pulse jet, afterburner, and scramjet all in one jet engine (fuel/electricity) with onboard self computer based autonomous gimbaled swivel propulsion system device”, filed Jun. 10, 2016. This application is a continuation-in-part of U.S. application Ser. No. 14/940,379, entitled “AMPHIBIOUS VERTICAL TAKEOFF AND LANDING UNMANNED SYSTEM AND FLYING CAR WITH MULTIPLE AERIAL AND AQUATIC FLIGHT MODES FOR CAPTURING PANORAMIC VIRTUAL REALITY VIEWS, INTERACTIVE VIDEO AND TRANSPORTATION WITH MOBILE AND WEARABLE APPLICATION”, filed Nov. 13, 2015. This application is a continuation-in-part of U.S. application Ser. No. 14/957,644 (publication no. 2016/0086,161), entitled “SYSTEMS AND METHODS FOR MOBILE APPLICATION, WEARABLE APPLICATION, TRANSACTIONAL MESSAGING, CALLING, DIGITAL MULTIMEDIA CAPTURE AND PAYMENT TRANSACTIONS”, filed Dec. 3, 2015; which is a continuation-in-part of U.S. patent application Ser. No. 14/815,988 (publication no. 2015/0371,215), entitled “SYSTEMS AND METHODS FOR MOBILE APPLICATION, WEARABLE APPLICATION, TRANSACTIONAL MESSAGING, CALLING, DIGITAL MULTIMEDIA CAPTURE AND PAYMENT TRANSACTIONS”, filed Aug. 1, 2015; which is a continuation-in-part of Ser. No. 13/760,214, filed Feb. 6, 2013, which in turn is a continuation-in-part of Ser. No. 10/677,098, which claims priority to Provisional Application Ser. No. 60/415,546, filed on Oct. 1, 2002, the content of which is incorporated herein by reference in its entirety.
| Relationship | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 29572722 | Jul 2016 | US |
| Child | 15345308 | | US |
| Parent | 29567712 | Jun 2016 | US |
| Child | 29572722 | | US |
| Parent | 14940379 | Nov 2015 | US |
| Child | 29567712 | | US |
| Parent | 14957644 | Dec 2015 | US |
| Child | 14940379 | | US |
| Parent | 14815988 | Aug 2015 | US |
| Child | 14957644 | | US |
| Parent | 13760214 | Feb 2013 | US |
| Child | 14815988 | | US |