The subject matter described herein relates generally to route planning for surveillance vehicles, and more particularly, embodiments of the subject matter relate to methods for generating a flight plan for an unmanned aerial vehicle based upon desired surveillance targets.
Unmanned aerial vehicles are currently used in a number of military and civilian applications. One common application involves using the unmanned aerial vehicle for video and/or photographic surveillance of a particular object or area of interest. In general, these vehicles may either be operated manually (e.g., via a remote control) or autonomously based upon a predetermined flight plan.
Most current flight planning tools for unmanned aerial vehicles require an operator to manually define a series of waypoints, that is, a series of points in three-dimensional space that define the desired flight path for the vehicle. However, some operators may not be familiar with the particular nuances of specifying waypoints or with how the series of waypoints translates to the actual flight path during operation. For example, physical limitations of the vehicle may affect the vehicle's ability to precisely traverse each waypoint of the flight plan. Additionally, the goal of the flight plan is often to gather intelligence about a particular object or region rather than simply fly the vehicle through a series of waypoints. However, current flight planning tools do not provide any means for determining the predicted camera path based on the waypoints in the flight plan.
A method is provided for generating a flight plan for an aerial vehicle having a surveillance module using a control unit having a display device. The method comprises graphically identifying, on a map displayed on the display device, a desired target for the surveillance module, and generating the flight plan such that a predicted camera path for the surveillance module overlaps the desired target.
In another embodiment, another method is provided for creating a flight plan for an aerial vehicle having a camera. The method comprises identifying a plurality of surveillance targets for the camera on a display device associated with the aerial vehicle, and generating a plurality of waypoints for use as the flight plan based on the plurality of surveillance targets.
Embodiments of the subject matter will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description is merely exemplary in nature and is not intended to limit the subject matter of the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
Techniques and technologies may be described herein in terms of functional and/or logical block components, and with reference to symbolic representations of operations, processing tasks, and functions that may be performed by various computing components or devices. It should be appreciated that the various block components shown in the figures may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
The following description refers to elements or nodes or features being “coupled” together. As used herein, unless expressly stated otherwise, “coupled” means that one element/node/feature is directly or indirectly joined to (or directly or indirectly communicates with) another element/node/feature, and not necessarily mechanically. Thus, although the drawings may depict one exemplary arrangement of elements, additional intervening elements, devices, features, or components may be present in an embodiment of the depicted subject matter. In addition, certain terminology may also be used in the following description for the purpose of reference only, and thus is not intended to be limiting. For example, terms such as “first”, “second” and other such numerical terms referring to structures do not imply a sequence or order unless clearly indicated by the context.
For the sake of brevity, conventional techniques related to graphics and image processing, navigation, flight planning, unmanned vehicle controls, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the subject matter.
Technologies and concepts discussed herein relate generally to route planning or travel planning for autonomous operation of surveillance vehicles. Although the subject matter may be described herein in the context of an unmanned aerial vehicle, various aspects of the subject matter may be implemented in other unmanned vehicles, for example, unmanned ground vehicles or unmanned underwater vehicles, or any other surveillance vehicle (manned or unmanned) that is capable of autonomous operation (e.g., equipped with autopilot or a comparable feature), and the subject matter is not intended to be limited to use with any particular vehicle. As described below, in an exemplary embodiment, a ground control station is configured to display a map of an area proximate the unmanned aerial vehicle and allow a user to identify points on the map as desired surveillance targets. Based upon the desired surveillance targets, the ground control station generates a flight plan for the unmanned aerial vehicle such that the predicted path for a camera onboard the unmanned aerial vehicle covers and/or overlaps the desired surveillance targets. The generated flight plan may then be uploaded and/or transferred to the unmanned aerial vehicle for subsequent autonomous operation.
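As a concrete illustration of this workflow, the following is a minimal sketch in Python that turns operator-marked surveillance targets into a simple ordered waypoint list ready for upload. The types, the fixed surveillance altitude, and the "one waypoint above each target" rule are illustrative assumptions and do not represent the generation algorithm described in the embodiments below.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Target:
    """A point the operator has marked on the map as a desired surveillance target."""
    lat: float
    lon: float

@dataclass
class Waypoint:
    """A point in three-dimensional space along the desired flight path."""
    lat: float
    lon: float
    alt_m: float

def generate_flight_plan(targets: List[Target], altitude_m: float = 120.0) -> List[Waypoint]:
    # Toy rule: place one waypoint directly above each marked target,
    # in the order the targets were identified.
    return [Waypoint(t.lat, t.lon, altitude_m) for t in targets]

if __name__ == "__main__":
    plan = generate_flight_plan([Target(44.98, -93.26), Target(45.00, -93.20)])
    print(plan)   # the ordered waypoint list that would be uploaded to the vehicle
```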
In an exemplary embodiment, the vehicle control system 102 is coupled to the navigation system 104, the surveillance module 106, and the communication module 108. The vehicle control system 102 generally represents the hardware, software, firmware, processing logic, and/or other components of the unmanned aerial vehicle 100 that enable the unmanned aerial vehicle 100 to achieve unmanned operation and/or flight based upon a predetermined flight plan in order to achieve video and/or other surveillance of a desired surveillance target, as will be appreciated in the art and described in greater detail below. In this regard, the vehicle control system 102 and the communication module 108 are cooperatively configured to allow the transferring and/or downloading of a flight plan from an associated ground control station to the vehicle control system 102 along with the transferring and/or uploading of surveillance data (e.g., video data or photographic data) from the surveillance module 106 to the ground control station, as will be appreciated in the art.
In an exemplary embodiment, the unmanned aerial vehicle 100 operates in conjunction with an associated ground control station or control unit, as described in greater detail below. In this regard, the unmanned aerial vehicle 100 and the associated ground control station are preferably configured to support bi-directional peer-to-peer communication. The communication module 108 generally represents the hardware, software, firmware, processing logic, and/or other components that enable bi-directional communication between the unmanned aerial vehicle 100 and the associated ground control station or control unit, as will be appreciated in the art. In this regard, the communication module 108 may support one or more wireless data communication protocols. Any number of suitable wireless data communication protocols, techniques, or methodologies may be supported by the communication module 108, as will be appreciated in the art. In addition, the communication module 108 may include a physical interface to enable a direct physical communication medium between the unmanned aerial vehicle 100 and the associated ground control station.
In an exemplary embodiment, the navigation system 104 is suitably configured to support unmanned flight and/or operation of the unmanned aerial vehicle. In this regard, the navigation system 104 may be realized as a global positioning system (GPS), inertial reference system (IRS), or a radio-based navigation system (e.g., VHF omni-directional radio range (VOR) or long range aid to navigation (LORAN)), and may include one or more sensors suitably configured to support operation of the navigation system 104, as will be appreciated in the art. In an exemplary embodiment, the navigation system 104 is capable of obtaining and/or determining the current location (e.g., the latitude and longitude), altitude, and heading of the unmanned aerial vehicle 100 and providing these navigational parameters to the vehicle control system 102 to support unmanned flight and/or unmanned operation of unmanned aerial vehicle 100.
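The navigational parameters mentioned above (current latitude and longitude, altitude, and heading) might be grouped and provided to the vehicle control system as in the following minimal sketch; the field names and the heading-error helper are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class NavState:
    """Navigational parameters reported by the navigation system (illustrative grouping)."""
    latitude_deg: float
    longitude_deg: float
    altitude_m: float
    heading_deg: float        # 0 degrees = north, increasing clockwise

def heading_error_deg(nav: NavState, desired_heading_deg: float) -> float:
    """Signed difference between desired and current heading, wrapped to [-180, 180)."""
    return (desired_heading_deg - nav.heading_deg + 180.0) % 360.0 - 180.0

print(heading_error_deg(NavState(44.98, -93.26, 120.0, 350.0), 10.0))   # 20.0
```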
In an exemplary embodiment, the surveillance module 106 is realized as at least one camera adapted to capture surveillance data (e.g., images and/or video) for a viewing region proximate the unmanned aerial vehicle 100 during operation. In this regard, the camera may be realized as a video camera, an infrared camera, a radar-based imaging device, a multi-spectral imaging device, or another suitable imaging camera or device. For example, in accordance with one embodiment, the surveillance module 106 comprises a first video camera that is positioned and/or angled downward (e.g., the camera lens is directed beneath the unmanned aerial vehicle) and a second video camera positioned and/or angled such that the lens points outward from the unmanned aerial vehicle 100 aligned with the horizontal line of travel (e.g., the camera lens is directed straight out or forward). In an exemplary embodiment, the vehicle control system 102 and the communication module 108 are cooperatively configured to allow the transferring and/or uploading of surveillance data (e.g., video data or photographic data) from the surveillance module 106 to a control unit or ground control station, as will be appreciated in the art.
It should be understood that
In an exemplary embodiment, the display device 202 is coupled to the processor 206, which in turn is coupled to the user interface device 204. In an exemplary embodiment, the display device 202, user interface device 204, and processor 206 are cooperatively configured to allow a user to define a flight plan for the unmanned aerial vehicle 100 by graphically identifying or designating desired surveillance targets or desired camera targets, and possibly other spatial constraints on the display device 202, as described below. The processor 206 is coupled to the database 210, and the processor 206 is configured to display, render, or otherwise convey one or more graphical representations or images of the terrain and/or objects proximate the unmanned aerial vehicle 100 on the display device 202, as described in greater detail below. In an exemplary embodiment, the processor 206 is coupled to a communication module 208 and cooperatively configured to communicate and/or upload a flight plan to the unmanned aerial vehicle 100.
In an exemplary embodiment, the display device 202 is realized as an electronic display configured to display a map of the real-world terrain and/or objects proximate the associated unmanned aerial vehicle 100, along with flight planning information and/or other data associated with operation of the unmanned aerial vehicle 100 under control of the processor 206. Depending on the embodiment, the display device 202 may be realized as a visual display device such as a monitor, display screen, flat panel display, or another suitable electronic display device. In various embodiments, the user interface device 204 may be realized as a keypad, touchpad, keyboard, mouse, touchscreen, stylus, joystick, or another suitable device adapted to receive input from a user. In an exemplary embodiment, the user interface device 204 is adapted to allow a user to graphically identify or designate desired camera targets and other spatial constraints on the map rendered on the display device 202, as described below. It should also be appreciated that although
The processor 206 may be implemented or realized with a general purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof, designed to perform the functions described herein. In this regard, a processor may be realized as a microprocessor, a controller, a microcontroller, a state machine, or the like. A processor may also be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration. In practice, processor 206 includes processing logic that may be configured to carry out the functions, techniques, and processing tasks associated with the operation of the control unit 200, as described in greater detail below. Furthermore, the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in firmware, in a software module executed by processor 206, or in any practical combination thereof.
In an exemplary embodiment, the processor 206 accesses or includes one or more databases 210 configured to support rendering a map on the display device 202, as described below. In this regard, the database 210 may be realized in memory, such as, for example, RAM memory, flash memory, registers, a hard disk, a removable disk, or any other form of storage medium known in the art. In this regard, the database 210 is coupled to the processor 206 such that the processor 206 can read information from the database 210. In some embodiments, the database 210 may be integral to the processor 206.
Referring now to
Referring now to
Referring again to
For example, referring now to
In an exemplary embodiment, the flight plan generation process 400 continues by identifying any timing constraints for the flight plan (task 404). For example, the flight plan generation process 400 may be configured to allow a user to identify one or more timing constraints for each identified surveillance target. For instance, the user may designate that a first surveillance target (e.g., object 304) should be observed and/or viewed at a specified time or within a specified time period (e.g., “before 10:00 AM” or “between 10:00 AM and 10:05 AM”). In accordance with one embodiment, the flight plan generation process 400 is also configured to allow a user to input or otherwise designate a desired departure or starting time for the flight plan.
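One possible representation of such per-target timing constraints is sketched below in Python; the class name, the use of simple clock-time bounds, and the satisfied_by check are illustrative assumptions rather than a prescribed format.

```python
from dataclasses import dataclass
from datetime import datetime, time
from typing import Optional

@dataclass
class TimingConstraint:
    """An optional observation window attached to one surveillance target."""
    not_before: Optional[time] = None   # None means no lower bound
    not_after: Optional[time] = None    # None means no upper bound

    def satisfied_by(self, eta: datetime) -> bool:
        # True if the predicted time over the target falls inside the window.
        t = eta.time()
        if self.not_before is not None and t < self.not_before:
            return False
        if self.not_after is not None and t > self.not_after:
            return False
        return True

# "between 10:00 AM and 10:05 AM"
window = TimingConstraint(not_before=time(10, 0), not_after=time(10, 5))
print(window.satisfied_by(datetime(2024, 6, 1, 10, 2)))   # True
```

A constraint such as “before 10:00 AM” would simply leave not_before unset, and a desired departure time could be handled as the starting point from which each target's estimated time of arrival is computed.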
In an exemplary embodiment, the flight plan generation process 400 continues by generating a flight plan that satisfies the identified spatial constraints, viewing constraints, and timing constraints and determining a predicted camera path or predicted viewing path for the camera and/or surveillance module 106 onboard the unmanned aerial vehicle based on the flight plan (tasks 406, 408). As used herein, a predicted camera path or predicted viewing path should be understood as referring to the predicted path or region that the viewing region of the camera and/or surveillance module 106 will theoretically observe if the unmanned aerial vehicle operates in accordance with the generated flight plan. In an exemplary embodiment, the flight plan generation process 400 is configured to generate the flight plan by generating a plurality of waypoints such that at least a portion of the predicted camera path overlaps the identified surveillance targets. In an exemplary embodiment, the flight plan generation process 400 is configured to take into account the physical limitations of the unmanned aerial vehicle when generating the waypoints for use as the flight plan. For example, the unmanned aerial vehicle may be limited in its ability to maneuver and/or turn, or there may otherwise be some lag in keeping the camera and/or surveillance module 106 focused in a particular direction relative to the unmanned aerial vehicle 100, as will be appreciated in the art. In this regard, the flight plan generation process 400 may generate a predicted flight path for the unmanned aerial vehicle based on the generated flight plan, and determine the predicted camera path based on the predicted flight path. In other words, the tasks of generating the flight plan and determining the predicted camera path may be performed contemporaneously and/or iteratively.
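In highly simplified form, such a contemporaneous and iterative generation might look like the following Python sketch: waypoints are proposed, a predicted camera path is derived from them using a crude lag model, and the plan is revised until the predicted path overlaps every target. The lag model, coverage radius, and revision strategy are all assumptions made for illustration; they merely stand in for the vehicle's physical limitations rather than model them.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]          # (x, y) in metres, local map frame

def camera_footprint_center(wp: Point, lag_m: float, heading: Point) -> Point:
    # Very rough camera-path model: the viewed point trails the vehicle by
    # lag_m along its direction of travel (stand-in for gimbal/maneuver lag).
    return (wp[0] - lag_m * heading[0], wp[1] - lag_m * heading[1])

def predict_camera_path(waypoints: List[Point], lag_m: float = 15.0) -> List[Point]:
    path = []
    for prev, cur in zip(waypoints, waypoints[1:]):
        dx, dy = cur[0] - prev[0], cur[1] - prev[1]
        d = math.hypot(dx, dy) or 1.0
        path.append(camera_footprint_center(cur, lag_m, (dx / d, dy / d)))
    return path

def covers(path: List[Point], target: Point, radius_m: float = 50.0) -> bool:
    return any(math.hypot(px - target[0], py - target[1]) <= radius_m for px, py in path)

def generate_plan(targets: List[Point], max_iters: int = 10) -> List[Point]:
    """Iteratively revise waypoints until the predicted camera path overlaps every target."""
    waypoints = [(0.0, 0.0)] + list(targets)          # naive initial plan
    for _ in range(max_iters):
        path = predict_camera_path(waypoints)
        misses = [t for t in targets if not covers(path, t)]
        if not misses:
            break
        # extend the plan with an extra pass over each missed target
        waypoints += misses
    return waypoints
```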
In an exemplary embodiment, the plurality of waypoints for use as the flight plan are generated such that the predicted flight path of the unmanned aerial vehicle does not overlap and/or travel through any areas identified as no-fly regions. If the flight plan generation process 400 is unable to generate a flight plan that satisfies the identified constraints or the flight plan is otherwise infeasible (e.g., based on fuel requirements or physical limitations of the unmanned aerial vehicle), depending on the embodiment, the flight plan generation process 400 may be configured to provide a notification to the user, reinitialize (e.g., repeat tasks 402 and 404), or terminate (or exit) the flight plan generation process 400. Ideally, the predicted camera path based on the generated flight plan will overlap the identified surveillance targets in their entirety; however, in practice, physical limitations of the unmanned aerial vehicle or other constraints may be such that the predicted camera path overlaps only a portion of one or more desired surveillance targets.
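A feasibility screen of the kind described here could, for example, test each straight leg of the candidate plan against the no-fly regions and against a range budget, as in the sketch below. Circular no-fly regions and a simple total-distance fuel proxy are assumptions chosen to keep the example short.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]   # (x, y) in metres, local map frame

def leg_enters_circle(a: Point, b: Point, center: Point, radius_m: float) -> bool:
    # True if the straight leg a->b passes within radius_m of center
    # (a circular stand-in for an arbitrary no-fly region).
    ax, ay = a[0] - center[0], a[1] - center[1]
    bx, by = b[0] - a[0], b[1] - a[1]
    seg_len2 = bx * bx + by * by
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, -(ax * bx + ay * by) / seg_len2))
    cx, cy = ax + t * bx, ay + t * by
    return math.hypot(cx, cy) <= radius_m

def plan_is_feasible(waypoints: List[Point],
                     no_fly: List[Tuple[Point, float]],
                     max_range_m: float) -> bool:
    """Reject plans that cross a no-fly circle or exceed a crude range budget."""
    legs = list(zip(waypoints, waypoints[1:]))
    if any(leg_enters_circle(a, b, c, r) for a, b in legs for c, r in no_fly):
        return False
    total = sum(math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in legs)
    return total <= max_range_m
```

If this check fails, the process could notify the user, reinitialize, or terminate, consistent with the alternatives described above.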
For example, referring again to
Referring again to
Referring to
Referring again to
In response to receiving a user input indicating that the flight plan is accepted, in an exemplary embodiment, the flight plan generation process 400 continues by uploading or otherwise transferring the flight plan (e.g., the order or sequence of waypoints along with any timing information) to the unmanned aerial vehicle (task 420). In this regard, the vehicle control system 102 may be configured to receive the flight plan from the control unit 200 (e.g., via communication modules 108, 208) in a conventional manner. In an exemplary embodiment, the vehicle control system 102 and navigation system 104 are cooperatively configured to fly, operate, or otherwise direct the unmanned aerial vehicle 100 through the waypoints of the flight plan during operation of the unmanned aerial vehicle 100, as will be appreciated in the art. In this manner, the generated flight plan controls autonomous operation (e.g., unmanned flight) of the unmanned aerial vehicle.
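No particular uplink format is specified above, so the following sketch simply serializes the ordered waypoints and any timing information as JSON before handing the bytes to the communication link; the message layout and field names are illustrative assumptions.

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Optional

@dataclass
class PlanWaypoint:
    lat: float
    lon: float
    alt_m: float
    eta_utc: Optional[str] = None     # optional timing information for this waypoint

def serialize_flight_plan(waypoints: List[PlanWaypoint]) -> bytes:
    # Encode the ordered waypoint list as JSON for transfer over the comm link
    # (illustrative format; the actual uplink protocol is not specified here).
    payload = {"version": 1, "waypoints": [asdict(w) for w in waypoints]}
    return json.dumps(payload).encode("utf-8")

plan = [PlanWaypoint(44.98, -93.26, 120.0, "2024-06-01T10:00:00Z"),
        PlanWaypoint(45.00, -93.20, 120.0)]
uplink_message = serialize_flight_plan(plan)
```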
To briefly summarize, the methods and systems described above allow a user to generate a flight plan based upon desired surveillance targets. The user can quickly ascertain the predicted camera path and make fine-tuned adjustments to the flight plan without the complexity of manually determining what the camera onboard the unmanned aerial vehicle may or may not be able to observe. As a result, an unskilled or untrained user can quickly and reliably create a flight plan that accomplishes the desired surveillance objectives.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the subject matter, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the subject matter as set forth in the appended claims.