The subject matter described herein relates generally to video surveillance applications, and more particularly, embodiments of the subject matter relate to methods for associating a surveillance video data stream with a flight plan for an unmanned aerial vehicle.
Unmanned vehicles, such as unmanned aerial vehicles (UAVs), are currently used in a number of military and civilian applications. One common application involves using the unmanned aerial vehicle for video and/or photographic surveillance of a particular object or area of interest. In general, these vehicles may either be operated manually (e.g., via a remote control) or autonomously based upon a predetermined flight plan. Generally, the flight plan comprises a predefined series of waypoints, that is, a series of points in three-dimensional space that define the desired flight path for the vehicle. In most applications, the goal of the flight plan is to garner intelligence about a particular object or region rather than simply fly the vehicle through a series of waypoints.
Generally, an operator reviews streaming data (e.g., video) captured by the unmanned aerial vehicle remotely from a ground control station. The operator attempts to glean useful intelligence information by analyzing and interpreting the streaming video. Often, the operator manipulates the streaming video in order to thoroughly analyze the captured video, for example, by zooming in on a particular region or slowing down, pausing, or rewinding the video stream. As a result, the operator is often reviewing buffered or past content (rather than real-time streaming video) and manually analyzing and/or characterizing the content. Thus, if the operator is reviewing the buffered video, the operator may be unaware of real-time events or the real-time status of the unmanned aerial vehicle. For example, the operator may be unable to determine the current status of the unmanned aerial vehicle within the flight plan or quickly ascertain the relationship between the flight plan and the video segment currently being reviewed.
In some prior art surveillance applications, the operator utilizes a separate display that shows the flight plan and/or status of the unmanned aerial vehicle within the flight plan and attempts to manually correlate the video segment with the flight plan. In addition to increasing the burden on the operator, the result of the manual correlation is inexact, if not inaccurate, and thereby degrades the overall quality of the intelligence information.
In accordance with one embodiment, a method is provided for displaying a video data stream captured by a surveillance module associated with an aerial vehicle during execution of a flight plan. The method comprises displaying a timeline corresponding to the video data stream on a display device associated with the aerial vehicle, and displaying a first indicator on the timeline. The first indicator corresponds to a first waypoint of the flight plan, and the first indicator is positioned on the timeline such that the first indicator corresponds to a first segment of the video data stream at a first time. The first time is based at least in part on a position of the aerial vehicle.
In another embodiment, another method is provided for displaying video information obtained from a surveillance module. The method comprises displaying a progress bar on a display device associated with the surveillance module. The progress bar is associated with a video data stream captured by the surveillance module. The method further comprises identifying a marking event associated with a first time, and in response to identifying the marking event, displaying a first marker on the progress bar. The first marker is displayed on the progress bar at a position corresponding to a segment of the video data stream captured at the first time.
Embodiments of the subject matter will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description is merely exemplary in nature and is not intended to limit the subject matter of the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
Techniques and technologies may be described herein in terms of functional and/or logical block components, and with reference to symbolic representations of operations, processing tasks, and functions that may be performed by various computing components or devices. It should be appreciated that the various block components shown in the figures may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
The following description refers to elements or nodes or features being “coupled” together. As used herein, unless expressly stated otherwise, “coupled” means that one element/node/feature is directly or indirectly joined to (or directly or indirectly communicates with) another element/node/feature, and not necessarily mechanically. Thus, although the drawings may depict one exemplary arrangement of elements, additional intervening elements, devices, features, or components may be present in an embodiment of the depicted subject matter. In addition, certain terminology may also be used in the following description for the purpose of reference only, and thus is not intended to be limiting. For example, terms such as “first”, “second” and other such numerical terms referring to structures do not imply a sequence or order unless clearly indicated by the context.
For the sake of brevity, conventional techniques related to graphics and image processing, video processing, video data streaming and/or data transfer, video surveillance systems, navigation, flight planning, unmanned vehicle controls, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the subject matter.
Technologies and concepts discussed herein relate generally to unmanned vehicle-based video surveillance applications. Although the subject matter may be described herein in the context of an unmanned aerial vehicle, various aspects of the subject matter may be implemented in other surveillance applications (e.g., non-vehicle-based applications) or with other unmanned vehicles, for example, unmanned ground vehicles or unmanned underwater vehicles, or any other surveillance vehicle (manned or unmanned) that is capable of autonomous operation (e.g., equipped with autopilot or a comparable feature), and the subject matter is not intended to be limited to use with any particular vehicle. As described below, in an exemplary embodiment, graphical indicators that correspond to various spatial criteria (such as waypoints in a flight plan) are displayed overlying a video timeline. The graphical indicators are positioned along the video timeline in a manner that corresponds to the unmanned vehicle reaching the particular spatial criterion (e.g., reaching a particular waypoint). The user may then quickly ascertain the spatial and temporal relationship between a segment of video currently being reviewed and the flight plan. As a result, the user may review and analyze a surveillance video data stream with improved spatial and temporal awareness and/or situational awareness, thereby improving the accuracy and/or effectiveness of the intelligence information being gathered.
In an exemplary embodiment, the vehicle control system 102 is coupled to the navigation system 104, the surveillance module 106, the sensor system 108, and the communication module 110. The vehicle control system 102 generally represents the hardware, software, firmware, processing logic, and/or other components of the UAV 100 that enable the UAV 100 to achieve unmanned operation and/or flight based upon a predetermined flight plan in order to acquire video and/or other surveillance data for a desired surveillance target and/or region. In this regard, the vehicle control system 102 and the communication module 110 are cooperatively configured to allow the transferring and/or downloading of a flight plan from an associated ground control station to the vehicle control system 102 along with the transferring and/or uploading of surveillance data (e.g., video data or photographic data) from the surveillance module 106 to the ground control station.
In an exemplary embodiment, the UAV 100 operates in conjunction with an associated ground control station or control unit, as described in greater detail below. In this regard, the UAV 100 and the associated ground control station are preferably configured to support bi-directional peer-to-peer communication. The communication module 110 generally represents the hardware, software, firmware, processing logic, and/or other components that enable bi-directional communication between the UAV 100 and the associated ground control station or control unit, as will be appreciated in the art. In this regard, the communication module 110 may support one or more wireless data communication protocols. Any number of suitable wireless data communication protocols, techniques, or methodologies may be supported by the communication module 110, as will be appreciated in the art. In addition, the communication module 110 may include a physical interface to enable a direct physical communication medium between the UAV 100 and the associated ground control station.
In an exemplary embodiment, the navigation system 104 is suitably configured to support unmanned flight and/or operation of the unmanned aerial vehicle. In this regard, the navigation system 104 may be realized as a global positioning system (GPS), inertial reference system (IRS), or a radio-based navigation system (e.g., VHF omni-directional radio range (VOR) or long range aid to navigation (LORAN)), and may include one or more sensors suitably configured to support operation of the navigation system 104, as will be appreciated in the art. In an exemplary embodiment, the navigation system 104 is capable of obtaining and/or determining the current geographic position and heading of the UAV 100 and providing these navigational parameters to the vehicle control system 102 to support unmanned flight and/or unmanned operation of UAV 100. In this regard, the current geographic position should be understood as comprising the three-dimensional position of the UAV 100, that is, the current geographic position includes the geographic coordinates or real-world location (e.g., the latitude and longitude) of the UAV 100 along with the altitude or above ground level of the UAV 100.
In an exemplary embodiment, the surveillance module 106 is realized as at least one camera adapted to capture surveillance data (e.g., images and/or video) for a viewing region proximate the UAV 100 during operation. In this regard, the camera may be realized as a video camera, an infrared camera, a radar-based imaging device, a multi-spectral imaging device, or another suitable imaging camera or device. For example, in accordance with one embodiment, the surveillance module 106 comprises a first video camera that is positioned and/or angled downward (e.g., the camera lens is directed beneath the unmanned aerial vehicle) and a second video camera positioned and/or angled such that the lens points outward from the UAV 100 aligned with the horizontal line of travel (e.g., the camera lens is directed straight out or forward). In an exemplary embodiment, the vehicle control system 102 and the communication module 110 are cooperatively configured to allow the transferring and/or uploading of surveillance data (e.g., video data or photographic data) from the surveillance module 106 to a control unit or ground control station, as will be appreciated in the art.
In an exemplary embodiment, a sensor system 108 is configured to sense or otherwise obtain information pertaining to the operating environment proximate the UAV 100 during operation of the UAV 100. It will be appreciated that although
It should be understood that
In an exemplary embodiment, the display device 202 is coupled to the processor 206, which in turn is coupled to the user interface device 204. In an exemplary embodiment, the display device 202, user interface device 204, and processor 206 are cooperatively configured to allow a user to define a flight plan for the UAV 100. For example, a user may create the flight plan by manually entering or defining a series of waypoints that delineate a desired flight path for the UAV 100. As used herein, a waypoint should be understood as defining a geographic position in three-dimensional space, for example, the waypoint comprises latitude and longitude coordinates in conjunction with an above ground level or altitude. It should be noted that a waypoint may also be associated with a waypoint type (e.g., fly over, fly by, etc.) that defines a particular action to be undertaken by the UAV 100 in association with the waypoint, as will be appreciated in the art. The processor 206 is coupled to the database 210, and the processor 206 is configured to display, render, or otherwise convey one or more graphical representations or images of the terrain and/or objects proximate the UAV 100 on the display device 202, as described in greater detail below. In an exemplary embodiment, the processor 206 is coupled to the communication module 208 and cooperatively configured to communicate and/or upload a flight plan to the UAV 100.
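By way of non-limiting illustration, a waypoint of the kind described above may be represented in software as a simple data record comprising the latitude, longitude, altitude, and waypoint type, with a flight plan being an ordered sequence of such records. The following Python sketch is merely hypothetical; the field names, units, and waypoint type values shown are illustrative assumptions and do not form part of any particular embodiment.

    from dataclasses import dataclass
    from enum import Enum

    class WaypointType(Enum):
        # Illustrative waypoint types; an actual flight plan may define others.
        FLY_OVER = "fly over"   # pass directly over the waypoint
        FLY_BY = "fly by"       # turn ahead of the waypoint

    @dataclass(frozen=True)
    class Waypoint:
        latitude_deg: float     # geographic latitude, in degrees
        longitude_deg: float    # geographic longitude, in degrees
        altitude_ft: float      # altitude or above ground level, in feet
        kind: WaypointType = WaypointType.FLY_OVER

    # A flight plan is then simply an ordered sequence of waypoints that
    # delineates the proposed flight path for the vehicle.
    flight_plan = [
        Waypoint(33.42, -111.93, 400.0),
        Waypoint(33.44, -111.90, 450.0, WaypointType.FLY_BY),
    ]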
In an exemplary embodiment, the display device 202 is realized as an electronic display configured to display a surveillance video data stream obtained from the UAV 100 under control of the processor 206. In some embodiments, the display device 202 may also display a map of the real-world terrain and/or objects proximate the associated unmanned aerial vehicle 100, along with flight planning information and/or other data associated with operation of the UAV 100. Depending on the embodiment, the display device 202 may be realized as a visual display device such as a monitor, display screen, flat panel display, or another suitable electronic display device. In various embodiments, the user interface device 204 may be realized as a keypad, touchpad, keyboard, mouse, touchscreen, stylus, joystick, or another suitable device adapted to receive input from a user. In an exemplary embodiment, the user interface device 204 is adapted to allow a user to graphically identify or otherwise define the flight plan for the UAV 100 on the map rendered on the display device 202, as described below. It should also be appreciated that although
The processor 206 may be implemented or realized with a general purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof, designed to perform the functions described herein. In this regard, the processor 206 may be realized as a microprocessor, a controller, a microcontroller, a state machine, or the like. The processor 206 may also be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration. In practice, the processor 206 includes processing logic that may be configured to carry out the functions, techniques, and processing tasks associated with the operation of the control unit 200, as described in greater detail below. Furthermore, the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in firmware, in a software module executed by processor 206, or in any practical combination thereof. In this regard, the processor 206 may access or include a suitable amount of memory configured to support streaming video data on the display device 202, as described below. In this regard, the memory may be realized as RAM memory, flash memory, registers, a hard disk, a removable disk, or any other form of storage medium known in the art.
In some alternative embodiments, although not separately depicted in
Referring now to
Depending on the embodiment, the map 300 may be based on one or more sectional charts, topographic maps, digital maps, or any other suitable commercial or military database or map, as will be appreciated in the art. The processor 206 may also be configured to display a graphical representation of the unmanned aerial vehicle 302 at a location on the map 300 that corresponds to the current (i.e., real-time) geographic position of the UAV 100. Although
Referring now to
Referring to
In an exemplary embodiment, the video streaming process 400 may initialize by obtaining a flight plan (or travel plan) for an unmanned vehicle (task 402). As used herein, a flight plan or travel plan should be understood as referring to a sequence of real-world locations or waypoints that delineate or otherwise define a proposed travel path for a vehicle, and may include other spatial parameters. In this regard, a flight plan for the UAV 100 comprises a plurality of waypoints, where each waypoint defines a particular location or position in three-dimensional space. In this regard,
In an exemplary embodiment, during execution of the flight plan, the UAV 100 captures a video data stream and the control unit 200 receives and buffers the video data stream. As used herein, buffering the video data stream should be understood as referring to the process of temporarily storing data as it is received from another device, and may be implemented in either hardware or software, as will be appreciated in the art. In this regard, the processor 206 may buffer a real-time surveillance video data stream that is captured by the surveillance module 106 and downloaded or otherwise received from the UAV 100 via communication module 208 to obtain a buffered video data stream. In this manner, the buffered video data stream may be utilized to hold or maintain the video data stream for display and/or rendering on the display device 202 at a time subsequent to when the video data stream is received by the control unit 200.
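For purposes of illustration only, the buffering described above may be sketched as a simple first-in, first-out structure that temporarily stores each received frame together with its capture time so that the frames may be rendered at a later time. The class and method names below are hypothetical and merely illustrative of one possible software realization.

    from collections import deque

    class VideoBuffer:
        """Temporarily stores received video frames for later display."""

        def __init__(self):
            self._frames = deque()  # FIFO of (capture_time_s, frame_bytes)

        def push(self, capture_time_s, frame_bytes):
            # Called as each frame of the real-time stream is received
            # from the unmanned vehicle via the communication module.
            self._frames.append((capture_time_s, frame_bytes))

        def pop(self):
            # Returns the oldest buffered frame, or None if the display
            # has caught up with the received stream.
            return self._frames.popleft() if self._frames else None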
In an exemplary embodiment, the video streaming process 400 continues by displaying a first segment or portion of the buffered video data stream on a display device (task 404). For example, referring to
In an exemplary embodiment, the video streaming process 400 continues by displaying and/or rendering a video timeline (or alternatively, a progress bar) corresponding to the video data stream captured by the surveillance module on the display device (task 406). In this regard, each point or location on the video timeline corresponds to a particular segment of the video data stream that has been captured by the surveillance module at a particular instant in time. In an exemplary embodiment, the width or length of the video timeline is based at least in part on a characteristic of the video data stream. In an exemplary embodiment, the width or length of the video timeline corresponds to the expected duration of the video data stream, that is, the estimated flight time for the UAV based on the flight plan. In this regard, the video timeline 508 may have a fixed width within the viewing area 502, wherein the time scale (e.g., the amount of time corresponding to an incremental increase in width of the progress segment 512) for the video timeline 508 is scaled based on the expected duration of the video data stream. In other words, the width (or length or duration) of the video timeline is based on the flight plan and is scaled so that the fixed width of the video timeline 508 reflects the expected mission duration (e.g., the estimated flight time for the UAV). In an exemplary embodiment, the video timeline also includes a graphical feature that is used to indicate the relationship between the duration of the video data stream that has already been captured and the expected duration of the video data stream.
For example, as shown in
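The scaling described above may be illustrated by the following non-limiting sketch. Assuming, purely for purposes of example, a timeline of fixed pixel width and an expected mission duration derived from the flight plan, a horizontal position on the timeline (for the leading edge of the progress segment or for a marker) can be obtained by linearly mapping elapsed mission time onto that fixed width. The function name and parameters are hypothetical.

    def timeline_position(elapsed_s, expected_duration_s, timeline_width_px):
        """Map an elapsed mission time to a horizontal pixel offset on a
        fixed-width video timeline scaled to the expected mission duration."""
        if expected_duration_s <= 0:
            return 0
        fraction = min(elapsed_s / expected_duration_s, 1.0)  # clamp at full width
        return int(round(fraction * timeline_width_px))

    # Example: 900 s elapsed out of an expected 3600 s mission on an
    # 800-pixel timeline places the leading edge of the progress segment
    # one quarter of the way across (200 px).
    assert timeline_position(900, 3600, 800) == 200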
Referring again to
In an exemplary embodiment, the video streaming process 400 continues by determining or otherwise identifying whether a marking event has occurred, and in response to identifying or determining that a marking event has occurred, displaying and/or rendering a graphical indicator or marker on the video timeline that corresponds to the marking event (tasks 410, 412). In this regard, the graphical indicator or marker is displayed and/or rendered on the video timeline at a position that corresponds to the segment of the video data stream captured by the surveillance module at the time associated with the marking event, that is, the time at which the marking event occurred. Depending on the embodiment, a marking event may correspond to the UAV 100 satisfying a spatial criterion or the UAV 100 detecting a trigger event. As used herein, a trigger event should be understood as referring to a real-time event or occurrence in the environment proximate the UAV 100 that has been previously deemed of interest or satisfies some predetermined threshold criteria. In this regard, the sensor system 108 may be configured to detect or otherwise identify a trigger event. For example, depending on the embodiment, a trigger event may correspond to detecting and/or determining motion of an object that occurs within the viewing region of the camera and/or surveillance module 106, an auditory or acoustic event proximate the UAV 100, a presence of light, or an obstacle in the path of the UAV 100. It will be appreciated that there are numerous possible trigger events, and the subject matter described herein is not limited to any particular trigger event.
In an exemplary embodiment, in response to identifying or determining that a marking event has occurred, the video streaming process 400 records or stores the time associated with the marking event (i.e., the real-time or elapsed mission time at the time of the marking event) and displays and/or renders a graphical indicator or marker that is positioned on the video timeline in a manner corresponding to the time associated with the marking event. For example, in accordance with one embodiment, the spatial criteria correspond to the waypoints of the flight plan, such that a marking event corresponds to the UAV 100 reaching a waypoint of the flight plan. The control unit 200 may obtain the current (i.e., real-time) geographic position of the UAV 100 (e.g., from the navigation system 104 via communication modules 110, 208) and compare the current geographic position of the UAV 100 to a waypoint of the flight plan, for example, the next (or upcoming) waypoint based on the sequence of waypoints defined by the flight plan. In response to determining that the current geographic position (e.g., the latitude, longitude and altitude) of the UAV 100 is within a threshold distance of the waypoint, the control unit 200 may record or store the current time (e.g., the elapsed mission time or real-time) and establish an association between the current time and the marking event. In this regard, the threshold distance is a radial distance (i.e., in any direction) from the waypoint that defines an imaginary sphere or zone centered about the waypoint. The threshold distance is preferably chosen to be small enough such that when the distance between the UAV 100 and the waypoint is less than the threshold distance (e.g., the UAV 100 is within the imaginary sphere about the waypoint), the geographic position of the UAV 100 is substantially equal to the waypoint (e.g., within practical and/or realistic operating tolerances). For example, the threshold distance may range from about zero to fifty feet; however, it will be appreciated that in practice, the threshold distance may vary depending upon UAV operating characteristics (e.g., navigation and/or positioning precision, range of the UAV onboard sensors) as well as the objectives of the flight plan and/or operation. In response to the UAV 100 reaching the waypoint (e.g., coming within the threshold distance of the waypoint), a graphical indicator or marker corresponding to the waypoint is then displayed and/or rendered on the video timeline at a position that corresponds to the time the UAV 100 reached the waypoint.
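The waypoint-proximity test described above may be sketched as follows, solely by way of example. The sketch assumes that the current geographic position and the waypoint have already been converted to a common local Cartesian frame (e.g., east/north/up offsets in feet); that conversion, the fifty-foot threshold, and the function names are illustrative assumptions only and are not intended to be limiting.

    import math

    THRESHOLD_FT = 50.0  # illustrative radial threshold about the waypoint

    def reached_waypoint(position_ft, waypoint_ft, threshold_ft=THRESHOLD_FT):
        """Return True when the vehicle lies within the imaginary sphere of
        radius threshold_ft centered on the waypoint.

        position_ft, waypoint_ft: (x, y, z) offsets in a common local frame, in feet.
        """
        dx, dy, dz = (p - w for p, w in zip(position_ft, waypoint_ft))
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= threshold_ft

    def check_marking_event(position_ft, waypoint_ft, elapsed_mission_time_s, markers):
        # When the waypoint is reached, record the time of the marking event so
        # that a marker can be rendered at the corresponding position on the
        # video timeline.
        if reached_waypoint(position_ft, waypoint_ft):
            markers.append(elapsed_mission_time_s)
            return True
        return False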
For example, referring now to
Referring again to
In an exemplary embodiment, the video streaming process 400 displays and/or renders graphical indicia (e.g., graphical indicators or markers) on the video timeline that indicate the estimated arrival times at which the UAV 100 will satisfy the remaining spatial criteria (task 416). In this regard, the indicia are positioned on the video timeline in a manner that corresponds to the respective estimated arrival time for each spatial criterion, such that the indicia reflect expected or anticipated marking events that may occur at some point in the future. For example, referring now to
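One non-limiting way to estimate the arrival times reflected by these anticipatory indicia is to accumulate, waypoint by waypoint, the remaining distance along the planned path divided by an assumed ground speed. The sketch below assumes straight-line legs between waypoints in a local Cartesian frame and a constant speed; those assumptions, and the names used, are purely illustrative.

    import math

    def estimated_arrival_times(position_ft, remaining_waypoints_ft,
                                elapsed_s, ground_speed_ft_per_s):
        """Return an estimated arrival time (in elapsed mission seconds) for each
        remaining waypoint, assuming straight legs and a constant ground speed."""
        etas = []
        t = elapsed_s
        prev = position_ft
        for wp in remaining_waypoints_ft:
            leg_ft = math.dist(prev, wp)          # straight-line leg length
            t += leg_ft / ground_speed_ft_per_s   # time to traverse the leg
            etas.append(t)
            prev = wp
        return etas

    # Each estimated arrival time may then be converted to a marker position on
    # the video timeline using the timeline_position() mapping sketched above.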
In an exemplary embodiment, the loop defined by tasks 410, 412, 414, and 416 repeats throughout execution of the flight plan by the UAV. In this manner, the indicia 524, 526, 528, 530 may be dynamically updated and adjusted to reflect the current operating status of the UAV 100. In some embodiments, the indicia 520, 522 for the marking events that have already occurred may be displayed and/or rendered using a first visually distinguishable characteristic and the indicia 524, 526, 528, 530 for the anticipated marking events may be displayed and/or rendered using a second visually distinguishable characteristic. In this regard, the first and second visually distinguishable characteristics may be chosen and utilized to enable a user to more readily identify the spatial criteria that have or have not been satisfied. Depending on the embodiment, a visually distinguishable characteristic may be realized by using one or more of the following: shape, color, hue, tint, brightness, graphically depicted texture or pattern, contrast, transparency, opacity, animation (e.g., strobing, flickering or flashing), and/or other graphical effects.
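For instance, one simple scheme for selecting the visually distinguishable characteristic keys the marker style to whether the associated marking event has already occurred. The style values in the following sketch are arbitrary illustrations of such first and second characteristics, not a required rendering.

    def marker_style(event_time_s, elapsed_s):
        """Choose an illustrative rendering style for a timeline marker: opaque
        for marking events that have occurred, translucent for anticipated
        (estimated) marking events."""
        if event_time_s <= elapsed_s:
            return {"shape": "triangle", "color": "green", "opacity": 1.0}
        return {"shape": "triangle", "color": "gray", "opacity": 0.4}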
To briefly summarize, the methods and systems described above allow a user to quickly ascertain the spatial relationship between a segment of a surveillance video data stream, captured by a surveillance module onboard a UAV, that is currently being reviewed and the flight plan that the UAV is currently executing. By positioning graphical indicators that correspond to various spatial criteria (such as waypoints in a flight plan) along a video timeline in a manner that reflects the current status of the UAV, the user may review and analyze the surveillance video data stream with improved spatial awareness and/or situational awareness and without the complexity of manually correlating the surveillance video with the UAV position. As a result, the effectiveness of the intelligence information being gathered by the UAV is improved while at the same time improving the efficiency and accuracy of such information gathering.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the subject matter. It should be understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the subject matter as set forth in the appended claims.