Various embodiments of the disclosure relate to unmanned aerial vehicles. More specifically, various embodiments of the disclosure relate to control of an unmanned aerial vehicle (UAV) swarm.
With advancements in the fields of electronics, propulsion systems, and information technology, unmanned aerial vehicles (UAVs) have become more capable and less expensive. The growth in capability and the reduction in cost of UAVs have led to the use and adoption of UAV-based solutions in various industries and application areas, such as, but not limited to, surveillance, defense, the motion picture industry, mining, seaports, oil & gas, warehouses, and other industrial facilities. In certain UAV-based solutions, multiple UAVs may be used together as a group or swarm of UAVs to capture photos and/or videos of a target from multiple locations and/or angles. Conventional methods for control of the UAVs in the swarm of UAVs may be based on techniques that involve pre-defined missions or path planning for the UAVs. However, dynamic control of the UAVs may be a challenge due to the unpredictability associated with the movement of the target and also due to a requirement to maintain a certain formation within the swarm of UAVs.
Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.
A system and a method for unmanned aerial vehicle (UAV) swarm control is provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.
The following described implementations may be found in the disclosed system and method for unmanned aerial vehicle (UAV) swarm control. The system may include a plurality of Unmanned Aerial Vehicles (UAVs) that may be configured to form a group (or a swarm arrangement). The plurality of UAVs may include a leader UAV and a plurality of follower UAVs. The plurality of follower UAVs may be communicably coupled with the leader UAV. The system may further include a Ground Control Station (GCS) that may include circuitry configured to determine a geo-location (i.e. current location) of the leader UAV. The circuitry of the GCS may be further configured to determine formation information based on the determined geo-location of the leader UAV and/or based on a formation request (such as, but not limited to, a user request) that may be received at the GCS. The formation information determined by the circuitry of the GCS may indicate at least a relative position for each of the plurality of follower UAVs with respect to the leader UAV. The GCS may be further configured to transmit the determined formation information directly to each of the leader UAV and the plurality of follower UAVs. Each of the plurality of follower UAVs may be further configured to receive the transmitted formation information from the GCS, and adjust a position of the corresponding follower UAV based on the received formation information. In an embodiment, the GCS may be integrated with the leader UAV.
In another embodiment, the following described implementations may be found in an electronic device for unmanned aerial vehicle (UAV) swarm control. The electronic device may act as a leader device (for example, but not limited to, a controller, a computing device, a ground station controller, or a wearable device) with one or more functionalities of a leader UAV or a GCS. The electronic device may be configured to determine target information that may indicate a location of a target to be captured by the plurality of follower UAVs. The target may be an animate object or an inanimate object which is to be captured by an image capturing device (for example, a camera) integrated with one or more of the plurality of follower UAVs. The location of the target may be at an offset distance from a location of the electronic device. The electronic device may be further configured to determine the formation information for the plurality of follower UAVs. The determined formation information may indicate at least a relative position for each of the plurality of follower UAVs with respect to the electronic device or with respect to the target to be captured. The electronic device may be further configured to transmit the determined formation information and the target information directly to each follower UAV of the plurality of follower UAVs. Each follower UAV of the plurality of follower UAVs may receive the transmitted formation information and the target information and may further adjust a position of the corresponding follower UAV based on the received formation information and the target information, to maintain a relative distance between the follower UAV and the electronic device or the target, irrespective of any change in the movement of the electronic device or the target, or any change in the distance, angle, or orientation of the electronic device (or the target) with respect to each follower UAV.
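By way of a non-limiting illustration only, the formation information and the target information described above may be represented as simple structured records, as in the following Python sketch. The field names (for example, relative north/east/up offsets in meters) are assumptions made for this example and do not limit the disclosure.

from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class TargetInfo:
    # Geo-location of the target to be captured (decimal degrees, meters).
    latitude: float
    longitude: float
    altitude_m: float
    # Offset distance between the target and the electronic device, in meters.
    offset_from_device_m: float

@dataclass
class FormationInfo:
    # Relative position of each follower UAV with respect to the leader device
    # or the target, expressed as (north_m, east_m, up_m) offsets in meters.
    relative_positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)

# Example: three follower UAVs arranged around the target.
formation = FormationInfo(relative_positions={
    "follower_1": (10.0, 0.0, 5.0),
    "follower_2": (-10.0, 0.0, 5.0),
    "follower_3": (0.0, 10.0, 8.0),
})
target = TargetInfo(latitude=37.7749, longitude=-122.4194, altitude_m=0.0,
                    offset_from_device_m=20.0)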
The disclosed GCS and/or the disclosed electronic device may directly control positions of the plurality of follower UAVs and/or the leader UAV based on direct communication of the determined formation information with the plurality of follower UAVs and/or the leader UAV. The formation information may include, but is not limited to, instructions to adjust positions of each of the plurality of follower UAVs and/or the leader UAV. In addition, the electronic device may also transmit the target information to the plurality of follower UAVs. The target information may indicate a location of the target to be captured, while the electronic device itself may be at the offset distance from the target. The movements (or angles or orientations) of the plurality of follower UAVs may be automatically controlled, based on the movement of either the leader UAV and/or the target, by use of the formation information and/or the target information, while maintaining a set formation of the plurality of UAVs (i.e. without a need for complex computer vision techniques). The automatic control of the movements of the plurality of follower UAVs may eliminate a need for manual control of each of the plurality of follower UAVs, individually. This may further reduce the cost and manual effort associated with the control of the plurality of UAVs. The automatic control may also ensure that the plurality of follower UAVs may remain in a desired formation with respect to the leader UAV and/or accurately capture the target, which may be in motion. The automatic control may also improve the quality of images/videos captured by each follower UAV due to accurate and dynamic positioning of the plurality of follower UAVs with respect to one another, the leader UAV, and the target.
The N number of the plurality of follower UAVs 106 shown in
The GCS 102 may include suitable logic, circuitry, interfaces, and/or code that may be configured to communicate with the leader UAV 104 and each of the plurality of follower UAVs 106. The GCS 102 may be in direct communication with the leader UAV 104 and each of the plurality of follower UAVs 106. The GCS 102 may receive a user request for control of a plurality of UAVs (including the leader UAV 104 and the plurality of follower UAVs 106). The GCS 102 may determine a geo-location of the leader UAV 104 and determine formation information based on the determined geo-location and/or the user request (i.e. including user inputs). The formation information may indicate at least a relative position for each of the plurality of follower UAVs 106 with respect to the leader UAV 104. The GCS 102 may transmit the determined formation information to the leader UAV 104 and the plurality of follower UAVs 106 to control a formation of the plurality of UAVs (including the leader UAV 104 and the plurality of follower UAVs 106). In an embodiment, the GCS 102 may include a formation control software (FCS) or an application to determine the formation information. Examples of the GCS 102 may include, but are not limited to, a station communication system, a communication device, a UAV controller, a portable computing device, a controller system, a smartphone, a cellular phone, a mobile phone, a gaming device, a mainframe machine, a distributed computing system (such as, an edge computing system), a computer workstation, and/or a consumer electronic (CE) device.
In an embodiment, the GCS 102 may include a server, which may be configured to determine the formation information for the plurality of UAVs, which includes the leader UAV 104 and the plurality of follower UAVs 106, based on the user input. The server of the GCS 102 may be configured to transmit the determined formation information to the leader UAV 104 and each of the plurality of follower UAVs 106. The server of the GCS 102 may be implemented as a cloud server and may execute operations through web applications, cloud applications, HTTP requests, repository operations, file transfer, and the like. Other example implementations of the server may include, but are not limited to, a database server, a file server, a web server, a media server, an application server, a mainframe server, or a cloud computing server. In another embodiment, the GCS 102 may be implemented as a plurality of distributed cloud-based resources by use of several technologies that are well known to those ordinarily skilled in the art.
The plurality of UAVs of the system 100 may include the leader UAV 104 and the plurality of follower UAVs 106. The plurality of UAVs may include suitable logic, circuitry, and/or interfaces that may correspond to unmanned aerial vehicles or systems controlled by a remote pilot, through a remote system (such as, the GCS 102), or capable of autonomous flights. Typically, the plurality of UAVs may be a component of an unmanned aircraft system (UAS), which may additionally include a ground-based controller and a system of communications with the UAV. Essentially, a UAV may be defined as a flying robot that may be remotely controlled or fly autonomously through software-controlled flight plans in its embedded systems, in conjunction with onboard sensors and GPS (not shown), and/or complex dynamic automation systems. UAVs may typically be meant to carry out tasks that range from the mundane to the ultra-dangerous. In an embodiment, the robotic UAVs may operate without a pilot on-board and with different levels of autonomy based on the requirements. Each of the plurality of UAVs may include one or more on-board image capturing devices (e.g., an image capturing device 308 shown in
The communication network 108 may include a communication medium through which the GCS 102 and the plurality of UAVs may communicate with one another. The communication network 108 may be one of a wired connection or a wireless connection or a combination thereof. Examples of the communication network 108 may include, but are not limited to, the Internet, a cloud network, a cellular or wireless mobile network (such as Long-Term Evolution and 5G New Radio), a Wireless Fidelity (Wi-Fi) network, a Personal Area Network (PAN), a Local Area Network (LAN), or a Metropolitan Area Network (MAN). Various devices in the network environment including the system 100 may be configured to connect to the communication network 108 in accordance with various wired and wireless communication protocols or a combination of protocols including both wired protocols and wireless protocols. Examples of such wired and wireless communication protocols may include, but are not limited to, at least one of a Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, IEEE 802.11, light fidelity (Li-Fi), 802.16, IEEE 802.11s, IEEE 802.11g, multi-hop communication, wireless access point (AP), device-to-device communication, cellular communication protocols, and Bluetooth (BT) communication protocols.
In operation with respect to
The GCS 102 may be further configured to transmit the determined formation information directly to each of the leader UAV 104 and the plurality of follower UAVs 106, in the group or swarm of UAVs. Each of the plurality of follower UAVs 106 of the system 100 may be further configured to directly receive the transmitted formation information from the GCS 102 (i.e. not via the leader UAV 104). Each follower UAV may be further configured to adjust a position (or angle or orientation) of the corresponding UAV with respect to the leader UAV 104, based on the received formation information from the GCS 102.
In an embodiment, the GCS 102 may be integrated with the leader UAV 104. In such a case, the leader UAV 104 may include the formation control software (FCS) or an application to determine the formation information. In another embodiment, the leader UAV (e.g., the leader UAV 104) may be an electronic device, such as, but not limited to, an automobile (as described, for example, in
The disclosed GCS 102 and/or the disclosed electronic device may control positions of the plurality of follower UAVs 106 and/or the leader UAV 104 based on direct communication of the determined formation information with the plurality of follower UAVs 106 and/or the leader UAV 104. The formation information may include instructions to adjust positions (or an orientation or a field of view of integrated image capturing devices) of each of the plurality of follower UAVs 106 and/or the leader UAV 104. In addition, the electronic device may also transmit the target information to the plurality of follower UAVs 106. The target information may indicate a location of the target to be captured, while the electronic device itself may be at an offset distance from the target (i.e., with the target in proximity to the electronic device). The movements of the plurality of follower UAVs 106 may be automatically controlled, based on the movement of the leader UAV 104 and/or the target, by use of the formation information and/or the target information. The dynamic computation of the formation information and/or the target information based on the recent location or movement of the leader UAV 104 and/or the target may provide dynamic control of the plurality of follower UAVs 106 to form (or modify to) a desired swarm arrangement. The automatic and dynamic control of the movements of the plurality of follower UAVs 106 may eliminate a need for manual control of each of the plurality of follower UAVs 106, individually. This may further reduce the cost and effort associated with the control of the plurality of UAVs. The automatic control may also ensure that the plurality of follower UAVs may remain in a desired formation with respect to the leader UAV 104 and capture the target, which may be in motion. The automatic control may also improve the quality of images/videos captured by each follower UAV due to accurate and dynamic positioning of the plurality of follower UAVs 106 with respect to one another, the leader UAV 104, and the target.
The circuitry 204 may include suitable logic, circuitry, and/or interfaces that may be configured to execute program instructions associated with different operations to be executed by the GCS 202. For example, one or more of such operations may be to dynamically determine a leader UAV from a group of UAVs and assign other UAVs in the group of UAVs as follower UAVs. The determination of the leader UAV (e.g., the leader UAV 104) and the plurality of follower UAVs (e.g., the plurality of follower UAVs 106) may be based on the input received from a user (not shown). The one or more operations may further include the determination of the geo-location of the leader UAV 104, the determination of the formation information based on the user input and/or the geo-location of the leader UAV 104, and the transmission of the formation information to the leader UAV 104 and the plurality of follower UAVs 106. The circuitry 204 may include one or more specialized processing units, which may be implemented as a separate processor. In an embodiment, the one or more specialized processing units may be implemented as an integrated processor or a cluster of processors that perform the functions of the one or more specialized processing units, collectively. The circuitry 204 may be implemented based on a number of processor technologies known in the art. Examples of implementations of the circuitry 204 may include an X86-based processor, a Graphics Processing Unit (GPU), a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, a microcontroller, a central processing unit (CPU), other control circuits and/or a combination thereof.
The memory 206 may include suitable logic, circuitry, and/or interfaces that may be configured to store the program instructions executable by the circuitry 204. The memory 206 may be further configured to store information such as, but not limited to, the geo-location of the leader UAV 104, the determined formation information, a flight path of the plurality of UAVs, and/or the target information. Example implementations of the memory 206 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card. Other forms of memory devices known in the art and not listed herein may also be covered within the scope of the embodiments of the present disclosure.
The I/O device 208 may include suitable logic, circuitry, and interfaces that may be configured to receive an input and provide an output based on the received input. The I/O device 208 may include various input and output devices, which may be configured to communicate with the circuitry 204. In an example, the I/O device 208 may receive, from the user, a user input associated with the plurality of UAVs of the system 100. For example, the user input may be to create a particular formation with the plurality of follower UAVs 106, or the user input may indicate a particular flight path to be followed by each of the plurality of UAVs. In another example, the I/O device 208 may render an output associated with a set of images captured by the plurality of UAVs of the system 100. Examples of the I/O device 208 may include, but are not limited to, a touch screen, a keyboard, a mouse, a joystick, a microphone, a display device, and a speaker.
The network interface 210 may include suitable logic, circuitry, interfaces, and/or code that may be configured to facilitate communication between the GCS 102 and each UAV in the plurality of UAVs, such as, the leader UAV 104 and the plurality of follower UAVs 106, via the communication network 108. The network interface 210 may be configured to implement known technologies to support wired or wireless communication. The network interface 210 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer.
The network interface 210 may be configured to communicate via wireless communication with networks, such as the Internet, an Intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (WLAN), personal area network, and/or a metropolitan area network (MAN). The wireless communication may use any of a plurality of communication standards, protocols and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), LTE, time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or any other IEEE 802.11 protocol), voice over Internet Protocol (VoIP), Wi-MAX, Internet-of-Things (IoT) technology, Machine-Type-Communication (MTC) technology, a protocol for email, instant messaging, and/or Short Message Service (SMS).
The functions or operations executed by the GCS 102, as described in
The circuitry 304 may include suitable logic, circuitry, and interfaces that may be configured to execute program instructions associated with different operations to be executed by the UAV 302. For example, one or more of such operations may be to receive formation information from the GCS 202 (shown in
The memory 306 may include suitable logic, circuitry, and/or interfaces that may be configured to store the program instructions executable by the circuitry 304. In an embodiment, the memory 306 may be configured to store the received formation information and/or the target information. Example implementations of the memory 306 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card.
The image capturing device 308 may include suitable logic, circuitry, and interfaces that may be configured to capture an image or a plurality of images or a video stream of a target object (not shown in
The network interface 310 may include suitable logic, circuitry, interfaces, and/or code that may be configured to facilitate communication between the UAV 302, the GCS 202, and/or other UAVs in the plurality of UAVs. The network interface 310 may be configured to implement known technologies to support wired or wireless communication. The network interface 310 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer. The functions of the network interface 310 may be similar to the functions of the network interface 210 described, for example, in
The power supply unit 312 may include suitable logic, circuitry, and interfaces that may be configured to supply electrical power for the different operations to be executed by the UAV 302. The power supply unit 312 may use a combustible energy source or a solar panel attached to the UAV 302 to provide power to the UAV 302. In an embodiment, the power supply unit 312 may use fuel cells that may use hydrogen to generate an electric current which can be used to power a motor associated with the propulsion system 314. Combustible energy sources and solar power sources are well known to a person of ordinary skill in the art and are therefore omitted from the discussion of the embodiments of the present disclosure. The power supply unit 312 may power the entire UAV 302 and enable operation of various components of the UAV 302. In an embodiment, a combination of different energy sources may be used to power the UAV 302. The selection of the power supply sources may depend on the type of the UAV 302 (for example, based on a weight, payload capacity, dimensions, and a wing-type of the UAV 302). In an embodiment, the power supply unit 312 may include a battery to store energy generated by the energy source. The battery may be a source of electric power for one or more electric circuits of the UAV 302. For example, the battery may be a source of electrical power to the circuitry 304, the memory 306, the image capturing device 308, the network interface 310, the propulsion system 314, the location sensor 316, and the IMU 318. The battery may be a rechargeable battery. The battery may be the source of electrical power to start or control the movement of the UAV 302. In some embodiments, the battery may correspond to a battery pack, which may have a plurality of clusters of batteries. Examples of the battery may include, but are not limited to, a lead acid battery, a nickel cadmium battery, a nickel-metal hydride battery, a lithium-ion battery, and other rechargeable batteries.
The propulsion system 314 may include a set of mechanical and electrical components that generate thrust to push the UAV 302 upward/downward and/or forward/backward during the flight. The propulsion system 314 may control the movement of the UAV 302 based on one or more control instructions received from the circuitry 304 or the GCS 202. The propulsion system 314 may further include, but is not limited to, motors, propellers, and an electronic speed controller (ESC).
The motors may be brushless direct current (BLDC) motors in which coils are fixed either to an outer casing or an inner casing of the motors, and the magnets are configured to rotate. The BLDC motors may be one of an in-runner motor, an out-runner motor, or a hybrid-runner motor, based on a rotation speed for a given voltage. The propellers may include rotor blades with a pre-specified diameter that rotate at a pre-configured speed to produce a minimum thrust for the UAV 302. In addition to the pre-specified diameter, the propellers may be further associated with a shape, an angle of attack, a pitch, and a surface area of the rotor blades. The propellers may be manufactured using different materials, such as injection-molded plastic, fiber-reinforced polymer, or natural materials (such as wood). The ESC may comprise suitable logic, circuitry, interfaces, and/or code that may be configured to control the speed and direction of the motors and accordingly control the speed and direction of movement of the UAV 302. The ESC may be configured to receive the one or more control instructions from the GCS 202 or the circuitry 304 to control the speed and the direction of the UAV 302.
The location sensor 316 may include suitable logic, circuitry, and/or interfaces that may be configured to determine a current geo-location of the UAV 302. The location sensor 316 may be configured to communicate the current geo-location to the circuitry 304 of the UAV 302 and the GCS 202. Examples of the location sensor 316 may include, but are not limited to, a Global Navigation Satellite System (GNSS)-based sensor. Examples of the GNSS-based sensor may include, but are not limited to, a global positioning system (GPS) sensor, a Global Navigation Satellite System (GLONASS) sensor, or other regional navigation systems or sensors. In another embodiment, the location sensor 316 may provide information about the geo-location based on real-time kinematics (RTK) positioning.
The IMU 318 may include suitable logic, circuitry, and/or interfaces that may be configured to detect a current orientation of the UAV 302 and provide the detected current orientation, as IMU data, to the circuitry 304 or the GCS 202. Based on the IMU data, the GCS 202 may determine the formation information associated with the current orientation of the UAV 302 and transmit the determined formation information to the circuitry 304 or to the other UAVs. The circuitry 304 may further control the orientation of the UAV 302 based on the received formation information determined by the GCS 202 based on the current orientation of the UAV 302. Examples of the IMU 318 may include, but are not limited to, a motion sensor, a tilt sensor, an accelerometer, or a gyro sensor.
It should be noted that the UAV 302 in
At 402, a geo-location of a leader UAV may be determined. In an embodiment, the circuitry 204 of the GCS 202 may be configured to determine the geo-location of the leader UAV (e.g., the leader UAV 104) from a plurality of UAVs (such as, the leader UAV 104 and the plurality of follower UAVs 106). Prior to the determination of the geo-location, the circuitry 204 may receive a user request from a user or a human controller associated with the GCS 202. The user request may include a user input indicative of an assignment of roles (e.g., a leader UAV role or a follower UAV role) to UAVs in the plurality of UAVs. The user input may be further indicative of a desired alignment or positions of the follower UAVs in a certain formation with respect to the leader UAV of the plurality of UAVs. The positions may correspond to different coordinates in an XYZ coordinate system, and the alignment may correspond to different angles or orientations of the UAVs with respect to the leader UAV or the target to be captured. In an embodiment, the circuitry 204 may assign the role of a leader UAV to a certain UAV (e.g., the leader UAV 104) and assign the role of a follower UAV to the remaining UAVs (e.g., the plurality of follower UAVs 106) of the plurality of UAVs, based on the user inputs in the user request. In an alternate embodiment, the assignment of the roles of the leader UAV and follower UAVs to the various UAVs in the plurality of UAVs may be pre-defined. In an embodiment, the circuitry 204 may transmit, to the plurality of UAVs, information associated with the role assigned to the corresponding UAV. Based on the information associated with the role received by the UAV, the corresponding UAV may assume the assigned role in the plurality of UAVs. For example, based on the information associated with the role, a UAV (such as, the leader UAV 104) may assume the role of a leader in the plurality of UAVs. Similarly, a UAV (such as, a UAV of the plurality of follower UAVs 106) may assume the role of a follower in the plurality of UAVs. In an embodiment, the user inputs may further include information about, but not limited to, a geographical starting point, a geographical ending point, a particular flight path to be taken by the plurality of UAVs between the geographical starting point and the geographical ending point, one or more altitude positions to be taken by the plurality of UAVs along the flight path, or a series of intermediate geographical positions or waypoints in the flight path.
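As a non-limiting illustration of the user request and the role assignment described above, the following Python sketch uses a hypothetical dictionary schema; the field names ("roles", "formation", "flight_path") are assumptions made only for this example.

# A hypothetical representation of the user request described above; the field
# names are assumptions for illustration and do not limit the disclosure.
user_request = {
    "roles": {
        "uav_1": "leader",
        "uav_2": "follower",
        "uav_3": "follower",
    },
    # Desired relative positions (x, y, z in meters) and orientations (yaw in
    # degrees) of the follower UAVs with respect to the leader UAV or the target.
    "formation": {
        "uav_2": {"position": (-15.0, 0.0, 5.0), "yaw_deg": 45.0},
        "uav_3": {"position": (15.0, 0.0, 5.0), "yaw_deg": -45.0},
    },
    # Optional flight-path details for the plurality of UAVs.
    "flight_path": {
        "start": (37.7749, -122.4194),
        "end": (37.7790, -122.4100),
        "waypoints": [(37.7760, -122.4170), (37.7775, -122.4140)],
        "altitudes_m": [30.0, 40.0, 40.0, 30.0],
    },
}

def assign_roles(request):
    # Split the UAV identifiers in the request into one leader and its followers.
    leader = [uav for uav, role in request["roles"].items() if role == "leader"][0]
    followers = [uav for uav, role in request["roles"].items() if role == "follower"]
    return leader, followers

leader_id, follower_ids = assign_roles(user_request)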
In an embodiment, the circuitry 204 of the GCS 202 may transmit a geo-location request to a UAV assigned with the role of a leader UAV (e.g., the leader UAV 104) in the plurality of UAVs. Based on the receipt of the geo-location request, the leader UAV (e.g., the leader UAV 104) may determine its geo-location and transmit the determined geo-location to the circuitry 204 of the GCS 202. For example, the leader UAV 104 may use an on-board location sensor (e.g., the location sensor 316) to determine the geo-location of the leader UAV 104 and transmit information associated with the determined geo-location to the circuitry 204 of the GCS 202. Based on the receipt of the information associated with the geo-location of the leader UAV 104, the circuitry 204 may determine the geo-location of the leader UAV 104. In an embodiment, the circuitry 204 of the GCS 202 may periodically receive the geo-location of the leader UAV 104 such that the GCS 202 may be aware of the current location of the leader UAV 104 on a real-time basis.
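A minimal sketch of the geo-location request/response exchange described above is shown below in Python. The transport object and message fields are hypothetical placeholders; any request/response mechanism supported by the communication network 108 may be used instead.

def request_leader_geo_location(leader_link):
    # Ask the leader UAV for its current geo-location over an abstract link.
    # `leader_link` is a hypothetical transport object with send()/receive()
    # methods; any request/response mechanism could be used in practice.
    leader_link.send({"type": "geo_location_request"})
    reply = leader_link.receive()  # e.g., {"lat": 37.7749, "lon": -122.4194, "alt_m": 50.0}
    # The GCS may repeat this request periodically to remain aware of the
    # leader UAV's current location on a real-time basis.
    return reply["lat"], reply["lon"], reply["alt_m"]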
At 404, IMU data associated with the leader UAV (e.g., the leader UAV 104) may be determined. In an embodiment, the circuitry 204 may determine the IMU data associated with the leader UAV 104. The circuitry 204 may transmit an IMU data request to the leader UAV 104. Based on the receipt of the IMU data request, the leader UAV 104 may use an on-board orientation sensor (such as, the IMU 318) to determine the IMU data. The leader UAV 104 may further transmit the determined IMU data to the circuitry 204 of the GCS 202. In an embodiment, the leader UAV 104 may be configured to smoothen sensor readings included in the IMU data, prior to the transmission of the IMU data to the GCS 202. For example, a Kalman filter may be used to smoothen the sensor readings. The smoothened sensor readings in the IMU data may improve prediction of a change in direction of the leader UAV 104 based on the IMU data. The circuitry 204 of the GCS 202 may be configured to receive the determined IMU data from the leader UAV 104. In an embodiment, the IMU data may include information, such as, but not limited to, motion information, tilt information, yaw rotation information, pitch rotation information, roll rotation information, speed information, acceleration information, or gyroscope measurements associated with the leader UAV 104. In addition to the transmission of the geo-location, the IMU data related to the leader UAV 104 may provide accurate details related to the exact position and/or orientation of the leader UAV 104 to the GCS 202. In an embodiment, the circuitry 204 of the GCS 202 may periodically receive the IMU data of the leader UAV 104 such that the GCS 202 may be aware of the current orientation (or acceleration or change in direction) of the leader UAV 104 on a real-time basis. In some embodiments, the IMU 318 may be integrated in an electronic device and/or in the target (i.e. described in
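For example, the smoothing of the IMU sensor readings mentioned above may be performed with a simple one-dimensional Kalman filter applied per channel, as in the Python sketch below. The noise variances are assumed values, and any suitable filter may be substituted.

class ScalarKalmanFilter:
    # Minimal one-dimensional Kalman filter for smoothing a single IMU channel
    # (e.g., a yaw-rate reading); the noise variances are illustrative values.

    def __init__(self, process_var=1e-3, measurement_var=1e-1):
        self.x = 0.0              # filtered estimate
        self.p = 1.0              # estimate variance
        self.q = process_var      # process noise variance
        self.r = measurement_var  # measurement noise variance

    def update(self, z):
        # Predict step (constant-value model), then correct with reading z.
        self.p += self.q
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Smooth a noisy sequence of yaw-rate readings before transmitting the IMU data.
yaw_filter = ScalarKalmanFilter()
smoothed = [yaw_filter.update(z) for z in (0.10, 0.14, 0.09, 0.31, 0.12)]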
At 406, target information may be determined. In an embodiment, the circuitry 204 of the GCS 202 may be configured to determine the target information. The target information may be indicative of at least one of, but not limited to, an identification of the target or a location of the target to be captured. In an embodiment, the location of the target may be at an offset distance from the GCS 202 (for example, an electronic device as described, for example, in
At 408, formation information may be determined. In an embodiment, the circuitry 204 of the GCS 202 may be configured to determine the formation information for the plurality of UAVs. In an embodiment, the formation information may be determined based on the determined geo-location of the leader UAV 104. In an embodiment, the determination of the formation information may be further based on the received user request (including user inputs about the formation of the plurality of UAVs). In other words, the formation information may be determined based on the receipt of the user inputs provided by the user or the human controller of the GCS 202. The formation information may include details of a position of the leader UAV 104 and a relative position of each of the follower UAVs 106A-106N with respect to the leader UAV 104 or the target (shown in
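One non-limiting way to realize the relative positions indicated by the formation information is to convert each follower UAV's desired north/east/up offset from the leader UAV into an absolute geo-location, as in the Python sketch below. The sketch assumes a local flat-earth approximation, and the function names and offsets are illustrative only.

import math

EARTH_RADIUS_M = 6371000.0

def offset_to_geo(leader_lat, leader_lon, leader_alt_m, north_m, east_m, up_m):
    # Convert a north/east/up offset (meters) relative to the leader UAV into
    # an absolute geo-location, using a local flat-earth approximation.
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(leader_lat))))
    return leader_lat + dlat, leader_lon + dlon, leader_alt_m + up_m

def build_formation_info(leader_geo, relative_positions):
    # relative_positions maps a follower UAV identifier to (north_m, east_m, up_m).
    lat, lon, alt = leader_geo
    return {
        uav_id: offset_to_geo(lat, lon, alt, n, e, u)
        for uav_id, (n, e, u) in relative_positions.items()
    }

formation_info = build_formation_info(
    leader_geo=(37.7749, -122.4194, 50.0),
    relative_positions={"follower_1": (10.0, 0.0, 0.0), "follower_2": (0.0, -10.0, 5.0)},
)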
At 410, the determined formation information may be transmitted. In an embodiment, the circuitry 204 of the GCS 202 may be configured to transmit the determined formation information directly to each of the leader UAV 104 and the plurality of follower UAVs 106. The transmitted formation information may include relative positions, for example, a distance between each of the plurality of follower UAVs 106 and the leader UAV 104 (or the target), an altitude of each of the plurality of follower UAVs 106 with respect to the leader UAV 104 (or the target), and an orientation of each of the plurality of follower UAVs 106 with respect to the leader UAV 104 (or the target). Based on the formation information received by each UAV in the plurality of UAVs, a required formation (or swarm arrangement) of the plurality of UAVs may be created around the target for the capture of images/videos of the target, as required by the user or for different purposes (for example, but not limited to, entertainment-related, surveillance-related, sports-related, education-related, or health-related purposes). Each of the plurality of UAVs (i.e. using an in-built image capturing device) may be configured to capture images or video streams of the predefined target and transmit the captured images or video streams to the GCS 102.
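The direct transmission of the determined formation information to each UAV may, for example, be realized as a simple datagram broadcast. The Python sketch below assumes a hypothetical JSON-over-UDP encoding and placeholder addresses; the disclosure does not prescribe any particular transport or message format.

import json
import socket

def transmit_formation_info(formation_info, uav_addresses):
    # Send the formation information directly to every UAV. `uav_addresses`
    # maps a UAV identifier to a hypothetical (ip, port) pair; JSON over UDP
    # is used purely for illustration.
    payload = json.dumps(formation_info).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for uav_id, (ip, port) in uav_addresses.items():
            sock.sendto(payload, (ip, port))
    finally:
        sock.close()

# Example usage with placeholder addresses and a single assigned geo-location.
transmit_formation_info(
    {"follower_1": [37.77499, -122.41940, 50.0]},
    {"leader": ("192.168.1.10", 14550), "follower_1": ("192.168.1.11", 14550)},
)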
In an embodiment, each of the plurality of follower UAVs 106 may be configured to directly receive the transmitted formation information from the GCS 202 (i.e. not via the leader UAV 104). In certain conventional solutions, the follower UAV may receive formation information from a Ground Control Station (GCS), via a leader UAV or related device, which may lead to a time lag in the transmission of the formation information or other instructions between the GCS and follower UAVs. In contrast, the disclosed GCS 202 may directly transmit the determined formation information to each of the plurality of follower UAVs 106, via the communication network 108 (i.e. shown in
In an embodiment, based on the receipt of the formation information, each of the plurality of follower UAVs 106 may further adjust a position of the corresponding follower UAV of the plurality of follower UAVs 106 based on the received formation information. The adjustment of the positions (i.e. XYZ positions) of the plurality of follower UAVs 106 may be performed to dynamically form or adjust the swarm arrangement in light of any change in position (or geo-location) or angle/orientation of the leader UAV 104. The adjustment of the position (or angle or orientation) of a UAV (e.g., a follower UAV) based on the formation information is described further, for example, in
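As a non-limiting illustration, a follower UAV may translate the position assigned to it in the received formation information into a velocity command with a simple proportional law, as in the Python sketch below. Positions are expressed in a local north/east/up frame in meters; the gain and speed limit are assumed values, and any other guidance law may be substituted.

def position_adjustment(current_pos, assigned_pos, gain=0.5, max_speed_mps=5.0):
    # Compute a velocity command (north, east, up in m/s) that moves a follower
    # UAV from its current position toward the position assigned to it in the
    # received formation information; a proportional law is used for illustration.
    error = [a - c for a, c in zip(assigned_pos, current_pos)]
    command = [gain * e for e in error]
    # Clamp the command so the follower does not exceed a nominal speed limit.
    speed = sum(c * c for c in command) ** 0.5
    if speed > max_speed_mps:
        command = [c * max_speed_mps / speed for c in command]
    return tuple(command)

# Follower currently 20 m east of and 5 m below its assigned slot.
velocity_command = position_adjustment(current_pos=(0.0, 20.0, 45.0),
                                       assigned_pos=(0.0, 0.0, 50.0))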
With reference to
With reference to
With reference to
In an embodiment, the circuitry 204 of the GCS 202 may be configured to dynamically determine the formation information for each of the plurality of follower UAVs (for example, the follower UAVs 504A-504D, or the follower UAVs 508A-508E, or the follower UAVs 512A-512E) based on the formation request (i.e. user inputs) and/or the determined geo-location or IMU data of the leader UAV (e.g., the leader UAV 502, or the leader UAV 506, or the leader UAV 510, respectively). The circuitry 204 may be further configured to directly transmit the determined formation information to the leader UAV (e.g., the leader UAV 502, or the leader UAV 506, or the leader UAV 510) and each of the plurality of follower UAVs (for example, the follower UAVs 504A-504D, or the follower UAVs 508A-508E, or the follower UAVs 512A-512D, respectively).
In an embodiment, the plurality of UAVs may need to travel a predefined path while maintaining a particular formation which may be either predefined or formed based on the user inputs. In such cases, the formation information may indicate at least one of a geographical starting point for each of the plurality of UAVs, a geographical ending point for each of the plurality of UAVs, one or more altitude positions for each of the plurality of UAVs along a flight path, a series of intermediate geographical positions or waypoints in the flight path, a separation distance between adjacent UAVs of the plurality of UAVs, or information about one or more speeds or velocities of the plurality of UAVs at corresponding waypoints. For example, the geographical starting point of a UAV may indicate a position or geo-location (such as, GPS co-ordinates) from which the UAV may start a flight or take off, while the geographical ending point may indicate a position or geo-location at which the UAVs may be required to end the flight (or land) or to be finally positioned to capture the target for a particular duration. The one or more altitude positions for the UAV may indicate a set of heights from a ground-level at which the UAV may be required to fly along a certain flight path (or a flight trajectory). The series of intermediate geographical positions or waypoints for the UAV may indicate a set of geo-locations (such as, GPS co-ordinates) that the UAV may be required to reach in the flight path of the UAV. The separation distance between adjacent UAVs may be a minimum distance that each UAV may be required to maintain from other UAVs (e.g., nearby UAVs) in the flight path. This may be required to avoid collisions among the UAVs during the flight. The information about one or more speeds or velocities of the plurality of UAVs at corresponding waypoints may indicate a range of required speeds and velocity vectors to be maintained by each UAV at different geo-locations (or waypoints) in the flight path. Different information about the starting/ending positions, altitudes, intermediate geographical positions, and speed/velocity in the formation information may ensure that a consistent formation is maintained by the plurality of UAVs during the flight (i.e. without a need for complex computer vision techniques). In an embodiment, the circuitry 204 of the GCS 202 may be further configured to directly transmit the formation information (i.e. determined for each UAV) to each of the plurality of follower UAVs (i.e., the follower UAVs 504A-504D, or the follower UAVs 508A-508E, or the follower UAVs 512A-512D) and the leader UAV (i.e., the leader UAV 502, or the leader UAV 506, or the leader UAV 510, respectively) at a predetermined frequency. For example, the period corresponding to the predetermined frequency may be, but is not limited to, of the order of milliseconds, seconds, or minutes. Based on the predetermined frequency, the circuitry 204 of the GCS 202 may determine the change in position or IMU data of the leader UAV 104 (or the target) and accordingly determine or update the formation information for the plurality of follower UAVs 106. In an embodiment, the predetermined frequency may be, but is not limited to, 10 Hz or 20 Hz.
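A minimal sketch of the periodic re-computation and direct transmission of the formation information at the predetermined frequency (for example, 10 Hz) is shown below in Python; the three callables are hypothetical placeholders for the operations described above.

import time

def formation_update_loop(get_leader_state, compute_formation, send_to_uavs,
                          frequency_hz=10.0):
    # Recompute and re-transmit the formation information at a predetermined
    # frequency (e.g., 10 Hz) so that any change in the leader UAV's
    # geo-location or IMU data is reflected in the followers' assigned positions.
    # The three callables are placeholders for the operations described above.
    period = 1.0 / frequency_hz
    while True:
        start = time.monotonic()
        leader_geo, leader_imu = get_leader_state()
        formation_info = compute_formation(leader_geo, leader_imu)
        send_to_uavs(formation_info)
        # Sleep only for the remainder of the period to keep the rate steady.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, period - elapsed))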
In an embodiment, the GCS 202 may be configured to receive target information from the leader UAV (for example, the leader UAV 502). The target information may indicate a location or position of the target to be captured by the plurality of UAVs. In an example, the leader UAV may determine the target information based on a user input or by an application of one or more machine learning and image processing techniques to a set of images of the target captured by the leader UAV. In such a case, the target may be an object (for example, but not limited to, a particular person, an animal, an event, a vehicle, a building, etc.) which may be recognized by the leader UAV using the application of the one or more machine learning and image processing techniques to the set of images of the target. The leader UAV 104 may further determine the current location or position of the recognized target in a three-dimensional real space, include the determined location or position of the target in the target information, and further transmit the target information to the GCS 202. In some embodiments, the GCS 202 may be configured to directly recognize a particular target, determine the related position of the recognized target, and generate the target information. The GCS 202 may be further configured to determine the formation information for the plurality of UAVs based on the received or determined target information. The formation information determined based on the target information may indicate the XYZ positions, angle, or orientation of each of the plurality of UAVs based on the current position (or posture) of the target to be captured. The formation information for each of the plurality of UAVs may be determined such that appropriate and high-quality 2D or 3D images of the target may be captured by the formation of the plurality of UAVs. The GCS 202 may be further configured to transmit the determined formation information to each of the plurality of follower UAVs (for example, the follower UAVs 504A-504D). Each of the plurality of follower UAVs may be configured to control an in-built image capturing device (e.g., the image capturing device 308) based on the target information in the received formation information. For example, a follower UAV of the plurality of UAVs may adjust one or more camera parameters (such as, but not limited to, a focal length, an aperture, a zoom, a tilt, or a field-of-view) associated with the image capturing device, to further capture the images or videos of the target. Such one or more camera parameters may be included in the formation information determined for each of the plurality of follower UAVs 106 and/or the leader UAV 104. Such camera parameters may be determined based on the position of the target and current positions or IMU data related to the leader UAV and the plurality of follower UAVs.
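By way of a non-limiting example, one of the camera parameters mentioned above may be derived from the target information as follows: given the distance to the target and an approximate target width, a follower UAV may compute the field-of-view (or the equivalent focal length) needed to frame the target. The Python sketch below assumes a hypothetical sensor width and framing ratio.

import math

def required_fov_deg(distance_m, target_width_m, frame_fill=0.6):
    # Horizontal field-of-view needed so a target of the given width occupies
    # roughly `frame_fill` of the image at the given distance.
    visible_width = target_width_m / frame_fill
    return math.degrees(2.0 * math.atan(visible_width / (2.0 * distance_m)))

def required_focal_length_mm(distance_m, target_width_m, sensor_width_mm=6.4,
                             frame_fill=0.6):
    # Equivalent focal length for the same framing, for an assumed sensor width.
    fov_rad = math.radians(required_fov_deg(distance_m, target_width_m, frame_fill))
    return sensor_width_mm / (2.0 * math.tan(fov_rad / 2.0))

# A follower UAV 40 m away from a target that is roughly 2 m wide.
fov_deg = required_fov_deg(distance_m=40.0, target_width_m=2.0)
focal_mm = required_focal_length_mm(distance_m=40.0, target_width_m=2.0)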
In an embodiment, the leader UAV (for example, the leader UAV 502 shown in
It should be noted that the first scenario 500A, the second scenario 500B, and the third scenario 500C of
With reference to
In an embodiment, the electronic device 608 may be communicably coupled to each of the plurality of follower UAVs 606A-606N. The electronic device 608 may also operate as a GCS (for example, the GCS 102) and/or the leader UAV (such as the leader UAV 104). In such a case, the electronic device 608 may include one or more processing functionalities of the GCS 202 and/or the leader UAV 104. Examples of the electronic device 608 may include, but are not limited to, a computing device, a mobile phone, an onboard processing integrated circuit (IC), a computer workstation, a controller system, a personal digital assistant (PDA), a smartphone, a cellular phone, a camera device, a gaming device, a server, a distributed computing system, or any electrical/electronic device with imaging, computation, and communication capabilities. In an embodiment, the electronic device 608 may be a wearable device as described, for example, in
The electronic device 608 may include circuitry (not shown) that may be configured to determine target information which may indicate the location of a target (e.g., the target object 602) to be captured. The location of the target object 602 may be at an offset distance from the location of the electronic device 608. In an embodiment, the target information may be predefined. For example, the target object 602 may be at a predefined distance (i.e. in certain feet, meters, or yards) from the electronic device 608 or from the automobile 604 on which the electronic device 608 may be located. In another embodiment, the circuitry of the electronic device 608 may periodically capture one or more images of the target object 602 to determine the position (or distance from the electronic device 608) of the target object 602 based on different image processing techniques to further determine the target information on a real-time basis. Thus, any real-time change in the location of the target object 602 (or distance between the target object 602 and the electronic device 608) may be updated in the target information. In some embodiments, the electronic device 608 may receive the captured images of the target object 602 from a different imaging device (not shown) and the target object 602 may be included in a field-of-view (FOV) of the imaging device.
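As a non-limiting illustration, the electronic device 608 may estimate and refresh the location of the target object 602 from its own geo-location, the measured offset distance, and the bearing toward the target, as in the Python sketch below (a flat-earth approximation with illustrative inputs).

import math

EARTH_RADIUS_M = 6371000.0

def target_geo_from_offset(device_lat, device_lon, offset_m, bearing_deg):
    # Estimate the target geo-location from the electronic device's own
    # geo-location, the measured offset distance (meters), and the bearing
    # toward the target (degrees from north), using a flat-earth approximation.
    north_m = offset_m * math.cos(math.radians(bearing_deg))
    east_m = offset_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(device_lat))))
    return device_lat + dlat, device_lon + dlon

# Target estimated 25 m due east of the electronic device.
target_lat, target_lon = target_geo_from_offset(37.7749, -122.4194, 25.0, 90.0)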
The circuitry of the electronic device 608 may be further configured to determine formation information for the plurality of follower UAVs 606A-606N. The formation information may indicate at least a relative position for each of the plurality of follower UAVs 606A-606N with respect to the location of the electronic device 608 or with respect to the location of the target object 602. The circuitry in the electronic device 608 may be further configured to transmit the determined formation information and the target information directly to each of the plurality of follower UAVs 606A-606N. Each of the plurality of follower UAVs 606A-606N may be further configured to receive the transmitted formation information and the target information. Based on the received formation information and the target information, each of the plurality of follower UAVs 606A-606N may be configured to adjust a position (or angle or orientation or imaging parameters of an inbuilt imaging device) of the corresponding UAV with respect to the electronic device 608 or the target object 602. The formation information transmitted to each of the plurality of follower UAVs 606A-606N may include, but is not limited to, a relative XYZ position of each UAV with respect to the target object 602 (or the electronic device 608), information about an angle or orientation for each UAV with respect to the target object 602 (or the electronic device 608), imaging parameters, information about a flight path, starting/ending geo-coordinates as described, for example, in
In an embodiment, each of the plurality of follower UAVs 606A-606N may include an image capturing device (such as, image capturing device 308). A field-of-view (FOV) of the image capturing device may be controlled based on the received target information. For example, the target information may include information associated with the location of the target object 602 and/or the offset distance between the target object 602 and the electronic device 608. Based on the information associated with the location and the offset distance, each follower UAV from the plurality of follower UAVs 606A-606N may adjust the FOV of the image capturing device (e.g., the image capturing device 308) associated with the corresponding follower UAV to capture an image/video of the target object 602. Information about the FOV may be included in the formation information or in the target information as one of the imaging parameters related to each follower UAV. In an embodiment, the electronic device (e.g., the electronic device 608) may be associated with an automobile (for example, as shown in
With reference to
With reference to
It may be noted that the first scenario 600A, the second scenario 600B, and the third scenario 600C of
At 704, the user request may be received at the GCS 202. In an embodiment, the circuitry 204 may be configured to receive the user request at the GCS 202. The user request may be for a plurality of UAVs including the leader UAV 104 and the plurality of follower UAVs 106. The user request may include, but is not limited to, one or more instructions for the plurality of UAVs to form a particular formation, one or more instructions for the plurality of UAVs to follow a certain target in a particular formation and capture images/videos of the target, or defined roles for the plurality of UAVs. Different exemplary formations, for which input may be provided in the form of the user request or which may be formed automatically, are described, for example, in
At 706, the geo-location of the leader UAV 104 may be determined. In an embodiment, the circuitry 204 may be configured to determine the geo-location of the leader UAV 104. The geo-location may refer to the geographical (e.g., latitudinal, longitudinal, and/or altitudinal) location of the leader UAV 104. The determination of the geo-location of the leader UAV 104 is described further, for example, in
At 708, the formation information may be determined based on the determined geo-location of the leader UAV 104, the received user request, or the geo-location of the target object. In an embodiment, the circuitry 204 may be configured to determine the formation information based on the geo-location of the leader UAV 104, the received user request, or the target object. The formation information may indicate at least a relative position for each of the plurality of follower UAVs 106 with respect to the leader UAV 104 in the plurality of UAVs. The formation information may be dynamically determined or updated for each of the plurality of follower UAVs 106 based on the formation request (i.e. user request) provided by the user, the determined geo-location of the leader UAV 104 (or the target to be captured) or based on the change in the IMU data with respect to the leader UAV 104 and the target object. The determination of the formation information is described, for example, in
At 710, the determined formation information may be transmitted. In an embodiment, the circuitry 204 may be configured to directly transmit the determined formation information to each of the plurality of follower UAVs 106 and the leader UAV 104 in the plurality of UAVs. The formation information may be transmitted to initiate the creation of a required formation of the plurality of UAVs, for example, around the target object. In an embodiment, each of the plurality of follower UAVs 106 may be configured to receive the formation information directly from the GCS 202 and adjust a position of the corresponding follower UAV with respect to the leader UAV 104 based on the received formation information. Thus, based on the formation information, each of the plurality of UAVs may automatically organize itself into a formation (as discussed, for example, in
Although the flowchart 700 is illustrated as discrete operations, such as, 704, 706, 708, and 710, the disclosure may not be so limited. Accordingly, in certain embodiments, such discrete operations may be further divided into additional operations, combined into fewer operations, or eliminated, depending on the particular implementation without detracting from the essence of the disclosed embodiments.
Various embodiments of the disclosure may provide a non-transitory computer-readable medium and/or storage medium having stored thereon instructions executable by a machine and/or a computer (for example, the Ground Control Station 202). The instructions may cause the machine and/or computer to perform operations that may include reception of a user request for a plurality of Unmanned Aerial Vehicles (UAVs) including a leader UAV (such as the leader UAV 104) and a plurality of follower UAVs (such as the plurality of follower UAVs 106). The operations may further include determination of a geo-location of the leader UAV. The operations may further include determination of formation information based on the determined geo-location of the leader UAV and the received user request. The formation information may indicate at least a relative position for each of the plurality of follower UAVs with respect to the leader UAV. The operations may further include transmission of the determined formation information directly to each of the plurality of follower UAVs. Each of the plurality of follower UAVs may further receive the transmitted formation information and adjust a position based on the received formation information.
Various embodiments of the disclosure may provide an exemplary system (e.g., the system 100) for UAV swarm control. The system 100 may include a plurality of UAVs including a leader UAV (e.g., the leader UAV 104) and a plurality of follower UAVs (e.g., the plurality of follower UAVs 106) communicably coupled with the leader UAV 104. The system 100 may further include a GCS (e.g., the GCS 202) that may include circuitry (e.g., the circuitry 204). The circuitry 204 may be configured to determine a geo-location of the leader UAV 104. The circuitry 204 may be further configured to determine formation information based on the determined geo-location of the leader UAV 104. Herein, the formation information may indicate at least a relative position for each of the plurality of follower UAVs 106 with respect to the leader UAV 104 in the plurality of UAVs. The circuitry 204 may be further configured to transmit the determined formation information directly to each of the leader UAV 104 and the plurality of follower UAVs 106. Each of the plurality of follower UAVs 106 may be configured to receive the transmitted formation information, and adjust a position based on the received formation information.
In an embodiment, the circuitry 204 may be further configured to dynamically determine the formation information for each of the plurality of follower UAVs 106 based on a formation request (i.e. user request) and the determined geo-location of the leader UAV 104. The circuitry 204 may be further configured to transmit the determined formation information to the leader UAV 104 and each of the plurality of follower UAVs 106. In an embodiment, the formation information may further indicate at least one of, but not limited to, a geographical starting point, a geographical ending point, one or more altitude positions for the plurality of UAVs along a flight path, a series of intermediate geographical positions or waypoints in the flight path, a separation distance between adjacent UAVs of the plurality of UAVs, or information about one or more speeds or velocities of the plurality of UAVs at corresponding waypoints.
In an embodiment, the leader UAV 104 of the plurality of UAVs may include an automobile. In an embodiment, the GCS 102 or the circuitry 204 may be further configured to transmit the formation information to each of the plurality of follower UAVs 106 and the leader UAV 104 at a predetermined frequency.
In an embodiment, the GCS 102 and/or circuitry 204 may be further configured to receive inertial measurement unit (IMU) data from the leader UAV 104. The GCS 102 and/or circuitry 204 may be further configured to determine the formation information based on the received IMU data. The determined formation information may further indicate changes in a direction for the leader UAV 104 and each of the plurality of follower UAVs 106.
In an embodiment, the leader UAV 104 and each of the plurality of follower UAVs 106 may include an image capturing device (for example, the image capturing device 308 of
In an embodiment, the GCS 102 and/or the circuitry 204 may be further configured to receive target information from the leader UAV 104, wherein the target information may indicate a location of a target to be captured. Examples of the target are illustrated in
Various embodiments of the disclosure may provide an electronic device (e.g., the electronic device 608) that may include circuitry configured to determine target information, which may indicate a location of a target to be captured. The location of the target may be at an offset distance from a location of the electronic device 608. In an embodiment, the circuitry of the electronic device 608 may be configured to determine formation information for a plurality of follower Unmanned Aerial Vehicles (UAVs). The formation information may indicate at least a relative position for each of the plurality of follower UAVs with respect to the electronic device 608. In a further embodiment, the electronic device 608 may be further configured to transmit the determined formation information and the target information directly to each of the plurality of follower UAVs. Each of the plurality of follower UAVs may be further configured to receive the transmitted formation information and the target information, and may further adjust a position based on the received formation information and the target information.
In an embodiment, each of the plurality of follower UAVs may include an image capturing device. A field-of-view (FOV) of the image capturing device may be controlled based on the received target information. In an embodiment, the electronic device may be a wearable device. In an embodiment, the electronic device may be integrated in an automobile.
The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted to carry out the methods described herein may be suited. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that includes a portion of an integrated circuit that also performs other functions.
The present disclosure may also be embedded in a computer program product, which includes all the features that enable the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods. Computer program, in the present context, means any expression, in any language, code or notation, of a set of instructions intended to cause a system with an information processing capability to perform a particular function either directly, or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
While the present disclosure has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made, and equivalents may be substituted without deviation from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without deviation from its scope. Therefore, it is intended that the present disclosure is not limited to the particular embodiment disclosed, but that the present disclosure will include all embodiments falling within the scope of the appended claims.