The term Unmanned Aerial Vehicle (UAV) refers to an aircraft with no pilot on board. The use of UAVs is growing at an unprecedented rate, and it is envisioned that UAVs will become commonly used for package delivery and passenger air taxis. However, as UAVs become more prevalent in the airspace, there is a need to regulate air traffic and ensure the safe navigation of UAVs.
The Unmanned Aircraft System Traffic Management (UTM) is an initiative sponsored by the Federal Aviation Administration (FAA) to enable multiple beyond visual line-of-sight drone operations at low altitudes (under 400 feet above ground level (AGL)) in airspace where FAA air traffic services are not provided. However, a framework that extends beyond the 400 feet AGL limit is needed. For example, unmanned aircraft that would be used by package delivery services and air taxis may need to travel at altitudes above 400 feet. Such a framework requires technology that will allow the FAA to safely regulate unmanned aircraft.
Methods, apparatuses, systems, devices, and computer program products for updating airspace awareness for unmanned aerial vehicles are disclosed. In a particular embodiment, the classification of a detected object is identified based on sensor data collected by an in-flight unmanned aerial vehicle (UAV). The location of the object is determined based on the sensor data. A controller generates, in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object.
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular descriptions of exemplary embodiments of the invention as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts of exemplary embodiments of the invention.
Particular aspects of the present disclosure are described below with reference to the drawings. In the description, common features are designated by common reference numbers throughout the drawings. As used herein, various terminology is used for the purpose of describing particular implementations only and is not intended to be limiting. For example, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It may be further understood that the terms “comprise,” “comprises,” and “comprising” may be used interchangeably with “include,” “includes,” or “including.” Additionally, it will be understood that the term “wherein” may be used interchangeably with “where.” As used herein, “exemplary” may indicate an example, an implementation, and/or an aspect, and should not be construed as limiting or as indicating a preference or a preferred implementation. As used herein, an ordinal term (e.g., “first,” “second,” “third,” etc.) used to modify an element, such as a structure, a component, an operation, etc., does not by itself indicate any priority or order of the element with respect to another element, but rather merely distinguishes the element from another element having a same name (but for use of the ordinal term). As used herein, the term “set” refers to a grouping of one or more elements, and the term “plurality” refers to multiple elements.
In the present disclosure, terms such as “determining,” “calculating,” “estimating,” “shifting,” “adjusting,” etc. may be used to describe how one or more operations are performed. It should be noted that such terms are not to be construed as limiting and other techniques may be utilized to perform similar operations. Additionally, as referred to herein, “generating,” “calculating,” “estimating,” “using,” “selecting,” “accessing,” and “determining” may be used interchangeably. For example, “generating,” “calculating,” “estimating,” or “determining” a parameter (or a signal) may refer to actively generating, estimating, calculating, or determining the parameter (or the signal) or may refer to using, selecting, or accessing the parameter (or signal) that is already generated, such as by another component or device.
As used herein, “coupled” may include “communicatively coupled,” “electrically coupled,” or “physically coupled,” and may also (or alternatively) include any combinations thereof. Two devices (or components) may be coupled (e.g., communicatively coupled, electrically coupled, or physically coupled) directly or indirectly via one or more other devices, components, wires, buses, networks (e.g., a wired network, a wireless network, or a combination thereof), etc. Two devices (or components) that are electrically coupled may be included in the same device or in different devices and may be connected via electronics, one or more connectors, or inductive coupling, as illustrative, non-limiting examples. In some implementations, two devices (or components) that are communicatively coupled, such as in electrical communication, may send and receive electrical signals (digital signals or analog signals) directly or indirectly, such as via one or more wires, buses, networks, etc. As used herein, “directly coupled” may include two devices that are coupled (e.g., communicatively coupled, electrically coupled, or physically coupled) without intervening components.
Exemplary methods, apparatuses, and computer program products for updating airspace awareness for unmanned aerial vehicles in accordance with the present invention are described with reference to the accompanying drawings, beginning with
A UAV, commonly known as a drone, is a type of powered aerial vehicle that does not carry a human operator and uses aerodynamic forces to provide vehicle lift. UAVs are a component of an unmanned aircraft system (UAS), which typically includes at least a UAV, a control device, and a system of communications between the two. The flight of a UAV may operate with various levels of autonomy, including under remote control by a human operator or autonomously by onboard or ground computers. Although a UAV may not include a human operator pilot, some UAVs, such as passenger drones (also referred to as drone taxis, flying taxis, or pilotless helicopters), carry human passengers.
For ease of illustration, the UAV 102 is illustrated as one type of drone. However, any type of UAV may be used in accordance with embodiments of the present disclosure and unless otherwise noted, any reference to a UAV in this application is meant to encompass all types of UAVs. Readers of skill in the art will realize that the type of drone that is selected for a particular mission or excursion may depend on many factors, including but not limited to the type of payload that the UAV is required to carry, the distance that the UAV must travel to complete its assignment, and the types of terrain and obstacles that are anticipated during the assignment.
In
The camera 112 is configured to capture image(s), video, or both, and can be used as part of a computer vision system. For example, the camera 112 may capture images or video and provide the video or images to a pilot of the UAV 102 to aid with navigation. Additionally, or alternatively, the camera 112 may be configured to capture images or video to be used by the processor 104 during performance of one or more operations, such as a landing operation, a takeoff operation, or object/collision avoidance, as non-limiting examples. Although a single camera 112 is shown in
The positioning circuitry 114 is configured to determine a position of the UAV 102 before, during, and/or after flight. For example, the positioning circuitry 114 may include a global positioning system (GPS) interface or sensor that determines GPS coordinates of the UAV 102. The positioning circuitry 114 may also include gyroscope(s), accelerometer(s), pressure sensor(s), other sensors, or a combination thereof, that may be used to determine the position of the UAV 102.
The processor 104 is configured to execute instructions stored in and retrieved from the memory 106 to perform various operations. For example, the instructions include operation instructions 108 that include instructions or code that cause the UAV 102 to perform flight control operations. The flight control operations may include any operations associated with causing the UAV to fly from an origin to a destination. For example, the flight control operations may include operations to cause the UAV to fly along a designated route (e.g., based on route information 110, as further described herein), to perform operations based on control data received from one or more control devices, to take off, land, hover, change altitude, change pitch/yaw/roll angles, or any other flight-related operations. The UAV 102 may include one or more actuators, such as one or more flight control actuators, one or more thrust actuators, etc., and execution of the operation instructions 108 may cause the processor 104 to control the one or more actuators to perform the flight control operations. The one or more actuators may include one or more electrical actuators, one or more magnetic actuators, one or more hydraulic actuators, one or more pneumatic actuators, one or more other actuators, or a combination thereof.
The route information 110 may indicate a flight path for the UAV 102 to follow. For example, the route information 110 may specify a starting point (e.g., an origin) and an ending point (e.g., a destination) for the UAV 102. Additionally, the route information may indicate a plurality of waypoints, zones, areas, or regions between the starting point and the ending point.
The route information 110 may also indicate a corresponding set of control devices for various points, zones, regions, areas of the flight path. The indicated sets of control devices may be associated with a pilot (and optionally one or more backup pilots) assigned to have control over the UAV 102 while the UAV 102 is in each zone. The route information 110 may also indicate time periods during which the UAV is scheduled to be in each of the zones (and thus time periods assigned to each pilot or set of pilots).
In the example of
In the example of
In some examples, the computer executable instructions that implement the airspace awareness controller 113 cause the processor 104 to carry out the operations of: determining whether an airspace awareness map includes the object and generating the airspace awareness update in response to determining that the airspace awareness map omits the object. In some examples, the computer executable instructions that implement the airspace awareness controller 113 cause the processor 104 to carry out the operations of: identifying the classification of the detected object in dependence upon the sensor data and one or more object models. In some examples, the computer executable instructions that implement the airspace awareness controller 113 cause the processor 104 to carry out at least one of the operations of: populating the object on an airspace awareness map; providing the airspace awareness update to a central repository; and providing the airspace awareness update to one or more UAVs. In some examples, the computer executable instructions that implement the airspace awareness controller 113 cause the processor 104 to carry out the operations of: providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object. In some examples, the computer executable instructions that implement the airspace awareness controller 113 cause the processor 104 to carry out the operations of: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
The control device 120 includes a processor 122 coupled to a memory 124, a display device 132, and communication circuitry 134. The display device 132 may be a liquid crystal display (LCD) screen, a touch screen, another type of display device, or a combination thereof. The communication circuitry 134 includes a transmitter and a receiver or a combination thereof (e.g., a transceiver). In a particular implementation, the communication circuitry 134 (or the processor 122) is configured to encrypt outgoing message(s) using a private key associated with the control device 120 and to decrypt incoming message(s) using a public key of a device (e.g., the UAV 102 or the server 140) that sent the incoming message(s). Thus, in this implementation, communication between the UAV 102, the control device 120, and the server 140 is secure and trustworthy (e.g., authenticated).
The processor 122 is configured to execute instructions from the memory 124 to perform various operations. The instructions also include control instructions 130 that include instructions or code that cause the control device 120 to generate control data to transmit to the UAV 102 to enable the control device 120 to control one or more operations of the UAV 102 during a particular time period, as further described herein.
In the example of
In the example of
In some examples, the computer executable instructions that implement the airspace awareness controller 135 cause the processor 122 to carry out the operations of: determining whether an airspace awareness map includes the object and generating the airspace awareness update in response to determining that the airspace awareness map omits the object. In some examples, the computer executable instructions that implement the airspace awareness controller 135 cause the processor 122 to carry out the operations of: identifying the classification of the detected object in dependence upon the sensor data and one or more object models. In some examples, the computer executable instructions that implement the airspace awareness controller 135 cause the processor 122 to carry out at least one of the operations of: populating the object on an airspace awareness map; providing the airspace awareness update to a central repository; and providing the airspace awareness update to one or more UAVs. In some examples, the computer executable instructions that implement the airspace awareness controller 135 cause the processor 122 to carry out the operations of: providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object. In some examples, the computer executable instructions that implement the airspace awareness controller 135 cause the processor 122 to carry out the operations of: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
The server 140 includes a processor 142 coupled to a memory 146 and communication circuitry 144. The communication circuitry 144 includes a transmitter and a receiver or a combination thereof (e.g., a transceiver). In a particular implementation, the communication circuitry 144 (or the processor 142) is configured to encrypt outgoing message(s) using a private key associated with the server 140 and to decrypt incoming message(s) using a public key of a device (e.g., the UAV 102 or the control device 120) that sent the incoming message(s). As will be explained further below, the outgoing and incoming messages may be transaction messages that include information associated with the UAV. Thus, in this implementation, communication between the UAV 102, the control device 120, and the server 140 is secure and trustworthy (e.g., authenticated).
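By way of a non-limiting illustration, the private-key/public-key exchange described above corresponds to a digital-signature pattern. The following sketch assumes the Python `cryptography` package, RSA keys, and an illustrative message format, none of which are mandated by the present disclosure:

```python
# Non-limiting sketch: authenticating a transaction message exchanged among the
# UAV 102, the control device 120, and the server 140. The key size, message
# format, and use of the Python `cryptography` package are illustrative
# assumptions and not part of the disclosed system.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The sender holds a private key; the corresponding public key is distributed to peers.
sender_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
sender_public_key = sender_private_key.public_key()

PSS_PADDING = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

def sign_message(private_key, message: bytes) -> bytes:
    """Sign an outgoing transaction message so receivers can verify its origin."""
    return private_key.sign(message, PSS_PADDING, hashes.SHA256())

def verify_message(public_key, message: bytes, signature: bytes) -> bool:
    """Return True if the message is authentic and untampered."""
    try:
        public_key.verify(signature, message, PSS_PADDING, hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# Example: the server signs an outgoing transaction message; the receiving device verifies it.
outgoing = b'{"uav_id": "UAV-102", "telemetry": "example payload"}'
signature = sign_message(sender_private_key, outgoing)
assert verify_message(sender_public_key, outgoing, signature)
```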
The processor 142 is configured to execute instructions from the memory 146 to perform various operations. The instructions include route instructions 148 comprising computer program instructions for aggregating data from disparate data servers, virtualizing the data in a map, generating a cost model for paths traversed in the map, and autonomously selecting the optimal route for the UAV based on the cost model. For example, the route instructions 148 are configured to partition a map of a region into geographic cells, calculate a cost for each geographic cell, wherein the cost is a sum of a plurality of weighted factors, determine a plurality of flight paths for the UAV from a first location on the map to a second location on the map, wherein each flight path traverses a set of geographic cells, determine a cost for each flight path based on the total cost of the set of geographic cells traversed, and select, in dependence upon the total cost of each flight path, an optimal flight path from the plurality of flight paths. The route instructions 148 are further configured to obtain data from one or more data servers regarding one or more geographic cells, calculate, in dependence upon the received data, an updated cost for each geographic cell traversed by a current flight path, calculate a cost for each geographic cell traversed by at least one alternative flight path from the first location to the second location, determine that at least one alternative flight path has a total cost that is less than the total cost of the current flight path, and select a new optimal flight path from the at least one alternative flight path. The route instructions 148 may also include instructions for storing the parameters of the selected optimal flight path as route information 110. For example, the route information may include waypoints marked by GPS coordinates, arrival times for waypoints, and pilot assignments. The route instructions 148 may also include instructions for receiving, by a server in a UAV transportation ecosystem, disinfection area data; accessing, by the server, UAV parameters for a type of UAV; determining, by the server in dependence upon the disinfection area data and the UAV parameters, a number of UAVs needed to complete a coordinated aerial disinfection of a disinfection area within a time limit; and partitioning, by the server, the disinfection area into a plurality of partitions, wherein the number of partitions is equal to the number of UAVs. The server 140 may be configured to transmit the route information 110, including disinfection route information, to the UAV 102.
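By way of a non-limiting illustration, the cost-model route selection performed by the route instructions 148 may be sketched as follows. The factor names, weights, and grid representation below are assumptions chosen for illustration only:

```python
# Minimal sketch of the cost model described for the route instructions 148.
# The factor names, weights, and grid representation are illustrative assumptions.
from typing import Dict, List, Tuple

Cell = Tuple[int, int]  # (row, col) index of a geographic cell in the partitioned map

# Per-cell factors (e.g., air traffic density, weather risk, regulatory risk),
# each normalized to 0..1 by whatever aggregation the server performs.
WEIGHTS = {"air_traffic": 3.0, "weather": 2.0, "regulatory": 5.0, "terrain": 1.0}

def cell_cost(factors: Dict[str, float]) -> float:
    """Cost of one geographic cell: a weighted sum of its factors."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

def path_cost(path: List[Cell], cell_factors: Dict[Cell, Dict[str, float]]) -> float:
    """Cost of a flight path: the total cost of the geographic cells it traverses."""
    return sum(cell_cost(cell_factors[cell]) for cell in path)

def select_optimal_path(candidate_paths: List[List[Cell]],
                        cell_factors: Dict[Cell, Dict[str, float]]) -> List[Cell]:
    """Pick the candidate flight path with the lowest total cost."""
    return min(candidate_paths, key=lambda p: path_cost(p, cell_factors))

# When new data arrives from a data server, the affected cells' factors are
# recomputed and the comparison is repeated; if an alternative path now costs
# less than the current one, it becomes the new optimal flight path.
```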
The instructions may also include control instructions 150 that include instructions or code that cause the server 140 to generate control data to transmit to the UAV 102 to enable the server 140 to control one or more operations of the UAV 102 during a particular time period, as further described herein.
In the example of
In some examples, the computer executable instructions that implement the airspace awareness controller 145 cause the processor 142 to carry out the operations of: determining whether an airspace awareness map includes the object and generating the airspace awareness update in response to determining that the airspace awareness map omits the object. In some examples, the computer executable instructions that implement the airspace awareness controller 145 cause the processor 142 to carry out the operations of: identifying the classification of the detected object in dependence upon the sensor data and one or more object models. In some examples, the computer executable instructions that implement the airspace awareness controller 145 cause the processor 142 to carry out at least one of the operations of: populating the object on an airspace awareness map; providing the airspace awareness update to a central repository; and providing the airspace awareness update to one or more UAVs. In some examples, the computer executable instructions that implement the airspace awareness controller 145 cause the processor 142 to carry out the operations of: providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object. In some examples, the computer executable instructions that implement the airspace awareness controller 145 cause the processor 142 to carry out the operations of: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
In the example of
The distributed computing network 151 of
The processor 152 is configured to execute instructions from the memory 154 to perform various operations. The memory 154 includes a blockchain manager 155 that includes computer program instructions for operating a UAV. Specifically, the blockchain manager 155 includes computer program instructions that when executed by the processor 152 cause the processor 152 to receive a transaction message associated with a UAV. For example, the blockchain manager may receive transaction messages from the UAV 102, the control device 120, or the server 140. The blockchain manager 155 also includes computer program instructions that when executed by the processor 152 cause the processor 152 to use the information within the transaction message to create a block of data and store the created block of data in a blockchain data structure 156 associated with the UAV.
The blockchain manager may also include instructions for accessing information regarding an unmanned aerial vehicle (UAV). For example, the blockchain manager 155 also includes computer program instructions that when executed by the processor 152 cause the processor to receive, from a device, a request for information regarding the UAV; in response to receiving the request, retrieve, from a blockchain data structure associated with the UAV, data associated with the information requested; and based on the retrieved data, respond to the device.
The UAV 102, the control device 120, and server 140 are communicatively coupled via a network 118. For example, the network 118 may include a satellite network or another type of network that enables wireless communication between the UAV 102, the control device 120, the server 140, and the distributed computing network 151. In an alternative implementation, the control device 120 and the server 140 communicate with the UAV 102 via separate networks (e.g., separate short range networks).
In some situations, minimal (or no) manual control of the UAV 102 may be performed, and the UAV 102 may travel from the origin to the destination without incident. However, in some situations, one or more pilots may control the UAV 102 during a time period, such as to perform object avoidance or to compensate for an improper UAV operation. In some situations, the UAV 102 may be temporarily stopped, such as during an emergency condition, for recharging, for refueling, to avoid adverse weather conditions, responsive to one or more status indicators from the UAV 102, etc. In some implementations, due to the unscheduled stop, the route information 110 may be updated (e.g., via a subsequent blockchain entry, as further described herein) by route instructions 148 executing on the UAV 102, the control device 120, or the server 140. The updated route information may include updated waypoints, updated time periods, and updated pilot assignments.
In a particular implementation, the route information is exchanged using a blockchain data structure. The blockchain data structure may be shared in a distributed manner across a plurality of devices of the system 100, such as the UAV 102, the control device 120, the server 140, and any other control devices or UAVs in the system 100. In a particular implementation, each of the devices of the system 100 stores an instance of the blockchain data structure in a local memory of the respective device. In other implementations, each of the devices of the system 100 stores a portion of the shared blockchain data structure and each portion is replicated across multiple of the devices of the system 100 in a manner that maintains security of the shared blockchain data structure as a public (i.e., available to other devices) and incorruptible (or tamper evident) ledger. Alternatively, as in
The blockchain data structure 156 may include, among other things, route information associated with the UAV 102, the telemetry data 107, the control instructions 130, and the route instructions 148. For example, the route information 110 may be used to generate blocks of the blockchain data structure 156. A sample blockchain data structure 300 is illustrated in
The block data of each block includes information that identifies the block (e.g., a block ID) and enables the devices of the system 100 to confirm the integrity of the blockchain data structure 300. For example, the block data also includes a timestamp and a previous block hash. The timestamp indicates a time that the block was created. The block ID may include or correspond to a result of a hash function (e.g., a SHA256 hash function, a RIPEMD hash function, etc.) based on the other information (e.g., the availability data or the route data) in the block and the previous block hash (e.g., the block ID of the previous block). For example, in
In addition to the block data, each block of the blockchain data structure 300 includes some information associated with a UAV (e.g., availability data, route information, telemetry data, incident reports, updated route information, maintenance records, etc.). For example, the block Bk_1 304 includes availability data that includes a user ID (e.g., an identifier of the mobile device, or the pilot, that generated the availability data), a zone (e.g., a zone at which the pilot will be available), and an availability time (e.g., a time period the pilot is available at the zone to pilot a UAV). As another example, the block Bk_2 306 includes route information that includes a UAV ID, a start point, an end point, waypoints, GPS coordinates, zone markings, time periods, primary pilot assignments, and backup pilot assignments for each zone associated with the route.
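By way of a non-limiting illustration, the chaining of block IDs described above may be sketched as follows. The field names and the use of SHA-256 over a JSON serialization are assumptions for illustration; the disclosure does not mandate a particular hash function or serialization:

```python
# Illustrative sketch of how a block of the blockchain data structure 300 could be
# formed: the block ID is a SHA-256 hash over the block's payload, its timestamp,
# and the previous block's hash, which is what makes tampering evident.
# Field names are assumptions for illustration.
import hashlib
import json
import time
from typing import Any, Dict, List

def create_block(payload: Dict[str, Any], previous_block_hash: str) -> Dict[str, Any]:
    """Create a block whose ID chains it to the previous block."""
    block = {
        "timestamp": time.time(),
        "previous_block_hash": previous_block_hash,
        "payload": payload,  # e.g., availability data or route information
    }
    serialized = json.dumps(block, sort_keys=True).encode("utf-8")
    block["block_id"] = hashlib.sha256(serialized).hexdigest()
    return block

def verify_chain(chain: List[Dict[str, Any]]) -> bool:
    """Confirm integrity: every block must reference the prior block's ID."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["previous_block_hash"] != prev["block_id"]:
            return False
    return True

# Example: chaining availability data and route information, analogous to blocks Bk_1 and Bk_2.
genesis = create_block({"type": "genesis"}, previous_block_hash="0" * 64)
bk_1 = create_block({"type": "availability", "user_id": "pilot-7", "zone": "Z3"}, genesis["block_id"])
bk_2 = create_block({"type": "route", "uav_id": "UAV-102", "waypoints": ["WP1", "WP2"]}, bk_1["block_id"])
assert verify_chain([genesis, bk_1, bk_2])
```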
In the example of
Referring back to
In a particular embodiment, the route instructions 148 cause the server 140 to plan a flight path, generate route information, and dynamically reroute the flight path and update the route information based on data aggregated from a plurality of data servers. For example, the server 140 may receive air traffic data 167 over the network 119 from the air traffic data server 160, weather data 177 from the weather data server 170, regulatory data 187 from the regulatory data server 180, and topographical data 197 from the topographic data server 190. It will be recognized by those of skill in the art that other data servers useful in flight path planning of a UAV may also provide data to the server 140 over the network 119 or through direct communication with the server 140.
The air traffic data server 160 may include a processor 162, memory 164, and communication circuitry 168. The memory 164 of the air traffic data server 160 may include operating instructions 166 that when executed by the processor 162 cause the processor to provide the air traffic data 167 about the flight paths of other aircraft in a region, including those of other UAVs. The air traffic data may also include real-time radar data indicating the positions of other aircraft, including other UAVs, in the immediate vicinity or in the flight path of a particular UAV. Air traffic data servers may be, for example, radar stations, airport air traffic control systems, the FAA, UAV control systems, and so on.
The weather data server 170 may include a processor 172, memory 174, and communication circuitry 178. The memory 174 of the weather data server 170 may include operating instructions 176 that when executed by the processor 172 cause the processor to provide the weather data 177 that indicates information about atmospheric conditions along the UAV's flight path, such as temperature, wind, precipitation, lightning, humidity, atmospheric pressure, and so on. Weather data servers may be, for example, the National Weather Service (NWS), the National Oceanic and Atmospheric Administration (NOAA), local meteorologists, radar stations, other aircraft, and so on.
The regulatory data server 180 may include a processor 182, memory 184, and communication circuitry 188. The memory 184 of the regulatory data server 180 may include operating instructions 186 that when executed by the processor 182 cause the processor to provide the regulatory data 187 that indicates information about laws and regulations governing a particular region of airspace, such as airspace restrictions, municipal and state laws and regulations, permanent and temporary no-fly zones, and so on. Regulatory data servers may include, for example, the FAA, state and local governments, the Department of Defense, and so on.
The topographical data server 190 may include a processor 192, memory 194, and communication circuitry 198. The memory 194 of the topographical data server 190 may include operating instructions 196 that when executed by the processor 192 cause the processor to provide the topographical data that indicates information about terrain, places, structures, transportation, boundaries, hydrography, orthoimagery, land cover, elevation, and so on. Topographic data may be embodied in, for example, digital elevation model data, digital line graphs, and digital raster graphics. Topographic data servers may include, for example, the United States Geological Survey or other geographic information systems (GISs).
In some embodiments, the server 140 may aggregate data from the data servers 160, 170, 180, 190 using application program interfaces (APIs), syndicated feeds and eXtensible Markup Language (XML), natural language processing, JavaScript Object Notation (JSON) servers, or combinations thereof. Updated data may be pushed to the server 140 or may be pulled on-demand by the server 140. Notably, the FAA may be an important data server both for airspace data concerning flight paths and congestion and for regulatory data such as permanent and temporary airspace restrictions. For example, the FAA provides the Aeronautical Data Delivery Service (ADDS), the Aeronautical Product Release API (APRA), System Wide Information Management (SWIM), Special Use Airspace information, and Temporary Flight Restrictions (TFR) information, among other data. The National Weather Service (NWS) API allows access to forecasts, alerts, and observations, along with other weather data. The USGS Seamless Server provides geospatial data layers regarding places, structures, transportation, boundaries, hydrography, orthoimagery, land cover, and elevation. Readers of skill in the art will appreciate that various governmental and non-governmental entities may act as data servers and provide access to that data using APIs, JSON, XML, and other data formats.
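By way of a non-limiting illustration, the following sketch shows pull-based aggregation of weather data from the public NWS API. The endpoint layout and JSON field names reflect the publicly documented api.weather.gov service as generally understood and are assumptions for illustration, not part of the present disclosure:

```python
# Illustrative sketch of on-demand (pull) aggregation of weather data using the
# public National Weather Service API. The endpoint layout and JSON field names
# are assumptions based on the public api.weather.gov documentation.
import requests

def fetch_nws_forecast(latitude: float, longitude: float) -> dict:
    """Resolve a point to its NWS gridpoint, then pull the forecast for that gridpoint."""
    point_url = f"https://api.weather.gov/points/{latitude},{longitude}"
    point = requests.get(point_url, headers={"User-Agent": "uav-route-planner"}, timeout=10)
    point.raise_for_status()
    forecast_url = point.json()["properties"]["forecast"]
    forecast = requests.get(forecast_url, headers={"User-Agent": "uav-route-planner"}, timeout=10)
    forecast.raise_for_status()
    return forecast.json()

# The returned forecast periods (wind speed, precipitation chance, etc.) would feed
# the weather factor of the geographic cells along the planned flight path.
weather = fetch_nws_forecast(39.7456, -97.0892)
for period in weather["properties"]["periods"][:2]:
    print(period["name"], period["windSpeed"], period["shortForecast"])
```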
Readers of skill in the art will realize that the server 140 can communicate with a UAV 102 using a variety of methods. For example, the UAV 102 may transmit and receive data using Cellular, 5G, Sub1 GHz, SigFox, WiFi networks, or any other communication means that would occur to one of skill in the art.
The network 119 may comprise one or more Local Area Networks (LANs), Wide Area Networks (WANs), cellular networks, satellite networks, internets, intranets, or other networks and combinations thereof. The network 119 may comprise one or more wired connections, wireless connections, or combinations thereof.
The arrangement of servers and other devices making up the exemplary system illustrated in
For further explanation,
In some examples, the computer executable instructions that implement the airspace awareness controller 480 cause the processor 442 to carry out the operations of: determining whether an airspace awareness map includes the object and generating the airspace awareness update in response to determining that the airspace awareness map omits the object. In some examples, the computer executable instructions that implement the airspace awareness controller 480 cause the processor 442 to carry out the operations of: identifying the classification of the detected object in dependence upon the sensor data and one or more object models. In some examples, the computer executable instructions that implement the airspace awareness controller 480 cause the processor 442 to carry out at least one of the operations of: populating the object on an airspace awareness map; providing the airspace awareness update to a central repository; and providing the airspace awareness update to one or more UAVs. In some examples, the computer executable instructions that implement the airspace awareness controller 480 cause the processor 442 to carry out the operations of: providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object. In some examples, the computer executable instructions that implement the airspace awareness controller 480 cause the processor 442 to carry out the operations of: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
The map server 440 maintains an airspace awareness map database 490. In some examples, the airspace awareness map database 490 includes indications of particular locations that should be avoided by a UAV because they are locations where UAV flight would, for example, pose a risk to the UAV, pose a public safety risk, or violate some law, regulation, or geofence. While in some examples the map database 490 identifies the location with a tag indicating that the location should be avoided in a UAV flight path, in other examples the map database 490 may also include a tag of an object at the location that is to be avoided. For example, the tag may include the type of object or other details about the object. The type of object may be, for example, person, animal, vehicle, structure, liquid, vegetation, smoke, fire, and so on. In some examples, the airspace awareness map database 490 also includes locations of interest. In such examples, the airspace awareness map database 490 may include a tag of an object of interest at a particular location.
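By way of a non-limiting illustration, one possible structure for tagged entries in the airspace awareness map database 490 is sketched below. The schema and helper names are assumptions for illustration only:

```python
# Minimal sketch of one way the airspace awareness map database 490 could tag
# locations to avoid and locations of interest. The schema and helper names are
# illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MapEntry:
    latitude: float
    longitude: float
    tag: str                            # e.g., "avoid" or "interest"
    object_type: Optional[str] = None   # e.g., "person", "vehicle", "structure", "fire"
    details: dict = field(default_factory=dict)

class AirspaceAwarenessMap:
    def __init__(self) -> None:
        self.entries: List[MapEntry] = []

    def add_entry(self, entry: MapEntry) -> None:
        self.entries.append(entry)

    def locations_to_avoid(self) -> List[MapEntry]:
        """Locations a planned flight path should route around."""
        return [e for e in self.entries if e.tag == "avoid"]

    def locations_of_interest(self, object_type: str) -> List[MapEntry]:
        """Locations relevant to a mission, filtered by the tagged object type."""
        return [e for e in self.entries if e.tag == "interest" and e.object_type == object_type]

# Example: a structure to avoid and a mission-relevant object of interest.
airspace_map = AirspaceAwarenessMap()
airspace_map.add_entry(MapEntry(35.08, -106.65, "avoid", "structure", {"kind": "cell tower", "height_ft": 180}))
airspace_map.add_entry(MapEntry(35.10, -106.60, "interest", "animal", {"species": "cow"}))
```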
In some implementations, the map server 440 acts as a central repository for the airspace awareness map database 490 and modifications to it. In these implementations, the server 440 provides airspace awareness map data 449 to the UAVs 402, 403, 405 and the control device 420 for route planning, navigation, and UAV missions. Accordingly, the memory of the UAVs 402, 403, 405 or the memory of the control device 420 may include a local copy of an airspace awareness map generated from airspace awareness map data 449. The UAVs 402, 403, 405 or the control device 420 may load an airspace awareness map relevant to the intended flight path of the UAV from the map server 440 prior to initiating a mission. The UAVs 402, 403, 405 or the control device 420 may also load an airspace awareness map relevant to the current flight path of the UAV from the map server 440 on-demand while the UAV is in flight. In addition to route planning and navigation, the UAVs 402, 403, 405 and the control device 420 may load an airspace awareness map from the map server 440 that includes tags and locations of objects that are relevant to the UAV's mission. The UAVs 402, 403, 405 or the control device 420 may also generate updates to the airspace awareness map database 490 that are provided to the map server 440 based on in-flight observations, and the server 440 may propagate updates received from one UAV to other UAVs.
In a particular embodiment, the UAVs 402, 403, 405, the map server 440, and the control device 420 are coupled for communication to a network 418. The network 418 may include a cellular network, a satellite network, or another type of network that enables wireless communication between the UAVs 402, 403, 405, the server 440, and the control device 420. In an alternative implementation, the UAVs 402, 403, 405, the server 440, and the control device 420 communicate with each other via separate networks (e.g., separate short range networks). While only one control device 420 is illustrated, it will be appreciated that each UAV 402, 403, 405 may be operated by a distinct control device or the same control device.
For further explanation,
The sensor data 509 includes data collected from one or more sensors of a UAV (e.g., the UAV 102 of
In some embodiments, identifying 502, based on sensor data 509 collected by an in-flight UAV, a classification 531 of a detected object 533 is carried out through pattern recognition techniques using machine vision. In various examples, machine vision may include visual sensors such as monocular cameras and stereo cameras, thermal imaging sensors, LIDAR sensors, SONAR sensors, and other imaging sensors that may be useful in object detection, recognition, and classification. In some examples, pattern recognition techniques are applied to a still image obtained from a camera of the UAV (e.g., the camera 112 of
Feature detection and extraction techniques may be applied to the image to obtain a set of features useful in classifying or identifying the object 533. In some examples, convolutional neural networks, support vector machines, and/or deep learning methods are used to extract features of the object and/or classify the object. For example, object recognition techniques such as region-based convolutional neural networks (R-CNN) or You Only Look Once (YOLO) may be useful in identifying 502, based on sensor data 509 collected by an in-flight UAV, a classification 531 of a detected object 533. In some examples, template-based image matching may be used in which a set of sample points of the extracted features are compared to image templates for object classification or identification. Other object recognition and machine vision techniques, such as optical character recognition (OCR) and shape recognition technology (SRT), may be useful in object recognition, classification, and identification. Readers will appreciate that an object may be part of a scene of objects, such that the scene provides context for object identification. A variety of other machine vision and object recognition, classification, and identification techniques, as will occur to those of skill in the art, may be utilized to identify an object type of a detected object.
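By way of a non-limiting illustration, the following sketch applies an off-the-shelf pretrained detector (a Faster R-CNN from the torchvision library) to a camera frame to obtain object classifications. The choice of model, score threshold, and label handling are assumptions for illustration; the disclosure does not require any particular detector:

```python
# Hedged sketch of one off-the-shelf approach to classifying a detected object in
# a camera frame: a pretrained Faster R-CNN detector from torchvision. The model
# choice, score threshold, and label handling are illustrative assumptions.
import torch
import torchvision
from torchvision.models.detection import FasterRCNN_ResNet50_FPN_Weights

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=weights)
model.eval()
categories = weights.meta["categories"]  # COCO class names, e.g., "person", "truck"
preprocess = weights.transforms()

def classify_objects(image: torch.Tensor, score_threshold: float = 0.8):
    """Return (classification, confidence) pairs for objects detected in one frame.

    `image` is an RGB tensor of shape (3, H, W) with values in [0, 1], e.g., a
    frame captured by the camera 112 and converted to a tensor.
    """
    with torch.no_grad():
        prediction = model([preprocess(image)])[0]
    results = []
    for label, score in zip(prediction["labels"], prediction["scores"]):
        if score >= score_threshold:
            results.append((categories[int(label)], float(score)))
    return results
```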
In some examples, identifying 502, based on sensor data 509 collected by an in-flight UAV, a classification 531 of a detected object 533 includes identifying object types that are particularly relevant to UAV operation and UAV missions. For example, exterior artifacts such as structures and vehicles are more likely to be relevant to UAV operation and UAV missions than interior artifacts such as furniture or appliances. As such, the airspace awareness controller 501 may employ a particular set of object classifications for object or object type identification. For example, object classifications may include person, animal, vehicle, structure, liquid, vegetation, smoke, fire, and so on, that may be encountered during UAV flight. In some examples, subtypes or particular instances of an object classification, including particular characteristics of the object, may be identified. For example, the object could be a particular person or a particular vehicle. In such instances, the object may be identified using techniques such as facial recognition or other identification techniques. In other examples, the object to be detected can be a set of persons, such as persons having a particular characteristic. In still other examples of subtypes of object classifications, a body of liquid may be further differentiated as a lake, a river, etc.; a structure may be differentiated as a building, a communications tower, etc.; and an animal may be differentiated by species.
In some examples, the object classification is identified based on an association with another object. For example, the controller 501 may recognize a tall structure and identify the structure as a high-tension power line structure based on identified power lines attached to it. In another example, a characteristic may include patterns for recognition such as a bar code or quick response (QR) code, an object temperature, a movement characteristic such as smooth or intermittent, a gait style, object emissions, sound patterns, or other characteristics. Identifying the object type of a particular object may rely upon a plurality of sensors. For example, the sensor data may include information from a camera for a visual identification, a microphone for audio detection, a GPS system for identifying location, and/or a thermal sensor for identifying a temperature.
The example method of
To determine the location of the object 533, a number of techniques may be employed to determine the distance between the UAV and the object 533 based on the sensor data 509. In one example, stereo cameras are used to capture two images of the object from different viewpoints. In this example, an image processing algorithm can identify the same point in both images and calculate the distance by triangulation. In another example, high frequency SONAR pulses are transmitted toward the object and the time it takes for the signal to reflect off the object 533 and return to the UAV is used to determine the distance to the object 533. In yet another example, a time-of-flight camera that includes an integrated light source and a camera is used to measure distance information for every pixel in the image by emitting a light pulse flash and calculating the time needed for the light to reach the object 533 and reflect back to the camera. In yet another example, LIDAR is used to determine how long it takes for a laser pulse to travel from the sensor to the object 533 and back and to calculate the distance from the speed of light. In still another example, image processing algorithms are used to match sequential images taken by the same camera to determine distance to objects in the image.
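By way of a non-limiting illustration, two of the ranging techniques above, together with the step of converting a measured range and bearing into an approximate object position, may be sketched as follows. The constants and the flat-earth offset approximation are simplifying assumptions for illustration:

```python
# Hedged sketch of stereo triangulation and echo-timing ranging, plus converting a
# range and bearing into an approximate object position. The constants and the
# flat-earth offset approximation are simplifying assumptions.
import math

SPEED_OF_SOUND_M_S = 343.0          # in air at roughly 20 C (for SONAR)
SPEED_OF_LIGHT_M_S = 299_792_458.0  # for LIDAR / time-of-flight cameras

def stereo_distance(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Stereo triangulation: depth Z = f * B / d for a point matched in both images."""
    return focal_length_px * baseline_m / disparity_px

def echo_distance(round_trip_time_s: float, propagation_speed_m_s: float) -> float:
    """SONAR / LIDAR / time-of-flight: distance = speed * time / 2 (signal goes out and back)."""
    return propagation_speed_m_s * round_trip_time_s / 2.0

def object_position(uav_lat: float, uav_lon: float, bearing_deg: float, distance_m: float):
    """Approximate the object's latitude/longitude from the UAV's GPS fix, the sensor
    bearing, and the measured distance (small-offset flat-earth approximation)."""
    earth_radius_m = 6_371_000.0
    d_north = distance_m * math.cos(math.radians(bearing_deg))
    d_east = distance_m * math.sin(math.radians(bearing_deg))
    lat = uav_lat + math.degrees(d_north / earth_radius_m)
    lon = uav_lon + math.degrees(d_east / (earth_radius_m * math.cos(math.radians(uav_lat))))
    return lat, lon

# Examples: a LIDAR round trip of 1.2 microseconds is roughly 180 m;
# a disparity of 12 px with an 800 px focal length and 0.3 m baseline is 20 m.
print(echo_distance(1.2e-6, SPEED_OF_LIGHT_M_S))
print(stereo_distance(focal_length_px=800, baseline_m=0.3, disparity_px=12))
```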
The example method of
In some examples, the airspace awareness update 537 includes the location 535 of the object 533 and a tag associated with the location 535. For example, the tag may designate that the location 535 is an area to avoid based on the classification 531 of an object 533. In such an example, based on the detection of numerous objects classified as people, it may be determined that a location is densely populated and thus a UAV flight path should avoid passing over the location. In another example, the tag may designate that the location 535 is an area of interest based on the classification 531 of an object 533. In such an example, based on the detection of numerous objects classified as cows, it may be determined that the area is of particular interest to a UAV tasked with the mission of counting cattle on a ranch.
In some examples, the tag associated with the location 535 in the airspace awareness update 537 includes the object classification 531 of the detected object 533. For example, the tag may indicate that the location is the site of an object classified as a ‘construction crane,’ a ‘cell tower,’ a ‘fire’, a ‘vehicle’, a ‘person’, and so on. In further examples, the tag associated with the location 535 in the airspace awareness update 537 includes the object classification 531 of the detected object 533 and additional characteristics of the object 533. For example, the tag may indicate ‘vehicle’ and include the make and model identified as a subtype of the classification 531 or the tag may indicate ‘cell tower’ and include the height of the cell tower or the tag may indicate ‘building’ and include dimensions of the building. In still further examples, the tag associated with the location 535 in the airspace awareness update 537 includes the object classification 531 of the detected object 533 and additional flight parameters or limitations. For example, the tag may indicate ‘public space’ and indicate a cargo restriction or the tag may indicate ‘structure’ and include a minimum flight altitude to safely navigate over the structure.
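By way of a non-limiting illustration, forming an airspace awareness update from a classification and a location, including deriving the tag, may be sketched as follows. The tag rules and the mission-interest set are assumptions for illustration only:

```python
# Hedged sketch of forming an airspace awareness update 537 from a classification
# and a location. The tag rules and mission-interest set are illustrative assumptions.
AVOID_CLASSIFICATIONS = {"person", "fire", "structure", "cell tower", "construction crane"}

def generate_airspace_awareness_update(classification: str, latitude: float, longitude: float,
                                       mission_interest: frozenset = frozenset()) -> dict:
    """Build an update carrying the object's location and a tag derived from its classification."""
    if classification in mission_interest:
        tag = "interest"   # e.g., "cow" on a cattle-counting mission
    elif classification in AVOID_CLASSIFICATIONS:
        tag = "avoid"      # e.g., a densely populated area or a tall structure
    else:
        tag = "observed"
    return {
        "latitude": latitude,
        "longitude": longitude,
        "tag": tag,
        "classification": classification,
        # Optional extras described above: characteristics and flight limitations.
        "characteristics": {},
        "flight_limitations": {},
    }

# Example: a detected cell tower becomes an "avoid" entry with a minimum-altitude limitation.
update = generate_airspace_awareness_update("cell tower", 35.08, -106.65)
update["characteristics"]["height_ft"] = 180
update["flight_limitations"]["min_altitude_ft"] = 450
```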
In some embodiments, generating 506, by a controller and in dependence upon the classification 531 of the object and the location 535 of the object 533, an airspace awareness update 537 concerning the object 533 includes transmitting the airspace awareness update 537 to one or more UAV system components such as a UAV, a UAV control device, a distributed computing network, a server, or a user device coupled to the UAV system (e.g., the UAV system 100 of
In some implementations, the airspace awareness update modifies an airspace awareness map. In some examples, one or more components of a UAV system (e.g., the system 100 of
For further explanation,
The example method of
In the example method of
For further explanation,
In the example method of
To reduce the amount of computation required to compare the detected object to object models 703, the entire set of object models may be filtered to produce the set of candidate object models. Filtering may be applied based on characteristics of the detected object or scene, conditions present in the UAV, one or more UAV missions, or combinations thereof. As one simplified example, based on the altitude of the UAV and the camera angle, it may be easily determined that the scene of the image that includes the detected object is a skyscape. This precludes object models that are ground-based, such as people, animals, and vehicles. Based on a mission of collision avoidance, the set of candidate models may be narrowed based on the altitude of the UAV or the detected object, which may preclude object models for houses, retail stores, and small office buildings. Based on the location of the UAV and the pastoral nature of the captured scene (e.g., a rural location, sparsely detected structures, observable greenery), the set of candidate object models may be filtered to exclude an office building, apartment building, or a construction crane. Ultimately, the set of candidate models may be, for example: aircraft, cell tower, or radio tower. If the detected object is actually a radio tower, the comparison of the extracted features of the detected object to the radio tower object model will score higher than the comparisons based on the aircraft object model and the cell tower object model.
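By way of a non-limiting illustration, the candidate-model filtering described above may be sketched as follows. The model metadata, context fields, and scoring stub are assumptions for illustration only:

```python
# Hedged sketch of candidate-model filtering: context such as scene type and altitude
# prunes the object models before any feature comparison. The model metadata and
# context fields are illustrative assumptions.
from typing import Callable, Dict, List

ObjectModel = Dict[str, object]  # e.g., {"name": "radio tower", "ground_based": True, ...}

ALL_OBJECT_MODELS: List[ObjectModel] = [
    {"name": "person", "ground_based": True, "max_height_m": 2},
    {"name": "house", "ground_based": True, "max_height_m": 12},
    {"name": "radio tower", "ground_based": True, "max_height_m": 600},
    {"name": "aircraft", "ground_based": False, "max_height_m": None},
]

def filter_candidate_models(models: List[ObjectModel], scene_is_skyscape: bool,
                            object_altitude_m: float) -> List[ObjectModel]:
    """Drop models that cannot match the detected object given the scene and altitude."""
    candidates = []
    for model in models:
        if scene_is_skyscape and model["ground_based"] and (model["max_height_m"] or 0) < object_altitude_m:
            continue  # a ground-based object this short cannot appear at this altitude
        candidates.append(model)
    return candidates

def best_match(candidates: List[ObjectModel], score: Callable[[ObjectModel], float]) -> ObjectModel:
    """Compare extracted features against each remaining model and keep the highest score."""
    return max(candidates, key=score)

# Example: for an object observed at 150 m in a skyscape, people and houses are
# pruned before any feature comparison takes place.
remaining = filter_candidate_models(ALL_OBJECT_MODELS, scene_is_skyscape=True, object_altitude_m=150)
print([m["name"] for m in remaining])  # ['radio tower', 'aircraft']
```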
In some examples, the object models 703 loaded by the airspace awareness controller 501 may be specific to the UAV mission. For example, when the UAV mission is to find people, object models for people are loaded by the airspace awareness controller. When the UAV mission is to find cows, cow object models are loaded by the airspace awareness controller. In this way, the number of candidate object models may be further filtered and thus the number of comparisons may be reduced, thereby conserving computation resources and expediting a match result.
In some examples where the airspace awareness controller 501 is implemented in the UAV (i.e., the airspace awareness controller 113 of the UAV 102 in
In some examples where the airspace awareness controller 501 is implemented in a control device (e.g., the airspace awareness controller 135 of the control device 120 of
In some examples where the airspace awareness controller 501 is implemented in a server (e.g., the airspace awareness controller 145 of the server 140 of
For further explanation,
In the example method of
In some examples where the airspace awareness controller 501 is implemented in the UAV (i.e., the airspace awareness controller 113 of the UAV 102 in
In some examples where the airspace awareness controller 501 is implemented in a control device (e.g., the airspace awareness controller 135 of the control device 120 of
In some examples where the airspace awareness controller 501 is implemented in a server (e.g., the airspace awareness controller 145 of the server 140 of
For further explanation,
In the example method of
For further explanation,
In the example method of
In some examples where the airspace awareness controller 501 is implemented in a UAV (e.g., the UAV 402 of
For further explanation,
In the example method of
For further explanation,
The example method of
The example method of
The example method of
In a particular implementation, airspace awareness updates are exchanged using a blockchain data structure as described above. The blockchain data structure may be shared in a distributed manner across a plurality of devices in a UAV network, such as the UAV 102, the control device 120, and the server 140 in
Readers will appreciate that, although a single airspace awareness controller is depicted in
Exemplary embodiments of the present invention are described largely in the context of a fully functional computer system for updating airspace awareness for unmanned aerial vehicles. Readers of skill in the art will recognize, however, that the present invention also may be embodied in a computer program product disposed upon computer readable storage media for use with any suitable data processing system. Such computer readable storage media may be any storage medium for machine-readable information, including magnetic media, optical media, or other suitable media. Examples of such media include magnetic disks in hard drives or diskettes, compact disks for optical drives, magnetic tape, and others as will occur to those of skill in the art. Persons skilled in the art will immediately recognize that any computer system having suitable programming means will be capable of executing the steps of the method of the invention as embodied in a computer program product. Persons skilled in the art will recognize also that, although some of the exemplary embodiments described in this specification are oriented to software installed and executing on computer hardware, nevertheless, alternative embodiments implemented as firmware or as hardware are well within the scope of the present invention.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Hardware logic, including programmable logic for use with a programmable logic device (PLD) implementing all or part of the functionality previously described herein, may be designed using traditional manual methods or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD) programs, a hardware description language (e.g., VHDL or Verilog), or a PLD programming language. Hardware logic may also be generated by a non-transitory computer readable medium storing instructions that, when executed by a processor, manage parameters of a semiconductor component, a cell, a library of components, or a library of cells in electronic design automation (EDA) software to generate a manufacturable design for an integrated circuit. In implementation, the various components described herein might be implemented as discrete components or the functions and features described can be shared in part or in total among one or more components. Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Advantages and features of the present disclosure can be further described by the following statements (an illustrative, non-limiting sketch of the recited operations is provided after the statements):
1. A method of updating airspace awareness for unmanned aerial vehicles, the method comprising: identifying, based on sensor data collected by an in-flight unmanned aerial vehicle (UAV), a classification of a detected object; determining, based on the sensor data, a location of the object; and generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object.
2. The method of statement 1 wherein the airspace awareness update modifies an airspace awareness map.
3. The method of any of statements 1-2 further comprising: determining whether an airspace awareness map includes the object; and wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes generating the airspace awareness update in response to determining that the airspace awareness map omits the object.
4. The method of any of statements 1-3 wherein identifying, based on sensor data collected by an in-flight unmanned aerial vehicle (UAV), a classification of a detected object includes identifying the classification of the detected object in dependence upon the sensor data and one or more object models.
5. The method of any of statements 1-4 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes populating the object on an airspace awareness map.
6. The method of any of statements 1-5 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes providing the airspace awareness update to a central repository.
7. The method of any of statements 1-6 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes providing the airspace awareness update to one or more UAVs.
8. The method of any of statements 1-7, wherein providing the airspace awareness update to one or more UAVs includes providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object.
9. The method of any of statements 1-8 further comprising: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
10. An apparatus for updating airspace awareness for unmanned aerial vehicles, the apparatus comprising: a processor; and a memory storing instructions that when executed by the processor cause the apparatus to carry out operations of: identifying, based on sensor data collected by an in-flight unmanned aerial vehicle (UAV), a classification of a detected object; determining, based on the sensor data, a location of the object; and generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object.
11. The apparatus of statement 10 wherein the airspace awareness update modifies an airspace awareness map.
12. The apparatus of any of statements 10-11 further comprising instructions that when executed by the processor cause the apparatus to carry out operations of: determining whether an airspace awareness map includes the object; and wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes generating the airspace awareness update in response to determining that the airspace awareness map omits the object.
13. The apparatus of any of statements 10-12 wherein identifying, based on sensor data collected by an in-flight unmanned aerial vehicle (UAV), a classification of a detected object includes identifying the classification of the detected object in dependence upon the sensor data and one or more object models.
14. The apparatus of any of statements 10-13 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes populating the object on an airspace awareness map.
15. The apparatus of any of statements 10-14 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes providing the airspace awareness update to a central repository.
16. The apparatus of any of statements 10-15 wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes providing the airspace awareness update to one or more UAVs.
17. The apparatus of any of statements 10-16, wherein providing the airspace awareness update to one or more UAVs includes providing the airspace awareness update to one or more UAVs in dependence upon the classification of the object.
18. The apparatus of any of statements 10-17 further comprising instructions that when executed by the processor cause the apparatus to carry out operations of: identifying, from an airspace awareness map, an expected location of an object; determining, based on sensor data collected by a UAV while in flight, that the object is missing; and generating, in response to determining that the object is missing, an airspace awareness update.
19. A computer program product for updating airspace awareness for unmanned aerial vehicles, the computer program product disposed upon a non-transitory computer readable medium, the computer program product comprising computer program instructions that, when executed, cause a computer to carry out the operations of: identifying, based on sensor data collected by an in-flight unmanned aerial vehicle (UAV), a classification of a detected object; determining, based on the sensor data, a location of the object; and generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object.
20. The computer program product of statement 19, wherein generating, by a controller and in dependence upon the classification of the object and the location of the object, an airspace awareness update concerning the object includes populating the object on an airspace awareness map.
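By way of illustration only, the following sketch shows one way a controller might arrange, in software, the operations recited in statements 1 and 3 through 8 above. Every identifier in the sketch (SensorReading, AwarenessUpdate, AirspaceAwarenessMap, Controller, and so on) is hypothetical and does not appear in the disclosure; the perception, geolocation, and communication steps are reduced to stubs so that the flow from sensor data to a distributed airspace awareness update can be read end to end. The sketch is not a definitive implementation of the claimed subject matter.

```python
# Hypothetical, non-limiting sketch; all names are invented for illustration.
import math
from dataclasses import dataclass


@dataclass
class SensorReading:
    """Stand-in for sensor data collected by an in-flight UAV."""
    tags: tuple          # feature tags extracted by onboard perception (stubbed)
    range_m: float       # measured distance from the UAV to the detected object
    bearing_deg: float   # bearing to the object, degrees clockwise from north
    uav_lat: float       # UAV latitude when the reading was taken
    uav_lon: float       # UAV longitude when the reading was taken


@dataclass
class AwarenessUpdate:
    classification: str
    lat: float
    lon: float
    action: str          # "add" for a newly observed object, "remove" for a missing one


class AirspaceAwarenessMap:
    """Toy airspace awareness map keyed by coarsely rounded coordinates."""

    def __init__(self):
        self._objects = {}

    @staticmethod
    def _key(lat, lon):
        return (round(lat, 4), round(lon, 4))

    def includes(self, lat, lon):
        return self._key(lat, lon) in self._objects

    def apply(self, update):
        key = self._key(update.lat, update.lon)
        if update.action == "add":
            self._objects[key] = update.classification  # populate the object on the map
        else:
            self._objects.pop(key, None)                # drop an object found to be missing


class Controller:
    """Generates awareness updates from sensor data and shares them."""

    def __init__(self, awareness_map, object_models, repository, subscribers):
        self.map = awareness_map
        self.object_models = object_models  # e.g., {"crane": {...}, "antenna": {...}}
        self.repository = repository        # central-repository stub: anything with append()
        self.subscribers = subscribers      # classification -> UAVs interested in that class

    def classify(self, reading):
        # Statement 4: identify the classification from the sensor data and object models.
        return next((label for label in self.object_models if label in reading.tags), "unknown")

    def locate(self, reading):
        # Statement 1: determine the object's location from the sensor data
        # (crude flat-earth offset from the UAV position, for illustration only).
        d_lat = reading.range_m * math.cos(math.radians(reading.bearing_deg)) / 111_320.0
        d_lon = reading.range_m * math.sin(math.radians(reading.bearing_deg)) / 111_320.0
        return reading.uav_lat + d_lat, reading.uav_lon + d_lon

    def process(self, reading):
        classification = self.classify(reading)
        lat, lon = self.locate(reading)
        if self.map.includes(lat, lon):
            return None                                 # statement 3: map already has the object
        update = AwarenessUpdate(classification, lat, lon, action="add")
        self.map.apply(update)                          # statement 5: populate the map
        self.repository.append(update)                  # statement 6: central repository
        recipients = self.subscribers.get(classification, [])
        return update, recipients                       # statements 7-8: UAVs chosen by class


# Usage: a reading whose feature tags match the hypothetical "crane" object model.
controller = Controller(AirspaceAwarenessMap(), {"crane": {}}, [], {"crane": ["uav-2", "uav-7"]})
reading = SensorReading(("crane",), 250.0, 45.0, 40.7128, -74.0060)
print(controller.process(reading))
```

The missing-object case of statement 9 would follow the same pattern: when an object expected at a map location is not observed in the sensor data, the controller would emit an update with the "remove" action instead of "add".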
It will be understood from the foregoing description that modifications and changes may be made in various embodiments of the present invention without departing from its true spirit. The descriptions in this specification are for purposes of illustration only and are not to be construed in a limiting sense. The scope of the present invention is limited only by the language of the following claims.
This application is a non-provisional application for patent entitled to a filing date and claiming the benefit of earlier-filed U.S. Provisional Patent Application Ser. No. 63/180,518, filed Apr. 27, 2021, the contents of which are herein incorporated by reference in their entirety.
Number | Date | Country
---|---|---
63/180,518 | Apr. 27, 2021 | US