Camera control methods such as pan, tilt and/or zoom control are known in the art. These camera control methods may be used to train the field of view of a camera onto a target, for example, to place the target under surveillance. Cameras are commonly delivered to a target location, for example, in order to monitor a mission in progress. For example, while combatting a forest fire it may be important to monitor the progress of firefighters in areas where other forms of direct communication may be difficult. These cameras may be delivered to a target location, for example, by way of an unmanned aerial vehicle (UAV). UAVs are typically piloted to the target area by line-of-sight or by first person view. However, using a UAV to direct a camera to a target requires a pilot trained in flying UAVs. Additionally, while preprogramming a desired target location into a UAV is known, in a disaster situation such GPS coordinates may not be readily available. This presents a problem in monitoring the ongoing response to a natural disaster or other situation: a trained UAV pilot is required to fly without pre-known GPS coordinates.
Another concern in the use of UAVs to deliver cameras to a targeted area for surveillance is that the targeted area may change. For example, in a scenario where UAVs are dispatched with cameras to monitor a forest fire, by the time a UAV reaches the target area the fire may be quelled in that area, or a hot spot may have broken out elsewhere, requiring the pilot of the UAV, or a programmer of the UAV's path, to change the path to reflect the change in conditions.
The discussion above is merely to provide for general background information, and is not intended to be used as an aid in determining the scope of the claimed subject matter.
An unmanned vehicle control system is provided. In one embodiment, the control system comprises an image acquisition device configured to capture an image. A vehicle is configured to receive and execute a vehicle control command. A control device is configured to generate the vehicle control command. The control device comprises a display component, an input component and a processor. The display component is configured to present the image obtained from the image acquisition device. The input component is configured to receive an input, wherein the input at least references the obtained image. The processor is configured to obtain the image from the image acquisition device, analyze the received input, and generate the vehicle control command. A communication component is configured to facilitate transmission of the vehicle control command to the vehicle.
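For illustration only, the control flow described above might be sketched as follows; the class, method and parameter names are assumptions introduced here, not part of the claimed system.

```python
from dataclasses import dataclass

@dataclass
class VehicleControlCommand:
    latitude: float    # target latitude, in degrees (illustrative fields)
    longitude: float   # target longitude, in degrees
    altitude_m: float  # target altitude, in meters

class ControlDevice:
    """Ties together the display, input, processor and communication roles."""

    def __init__(self, capture_image, read_input, analyze, transmit):
        self.capture_image = capture_image  # image acquisition device
        self.read_input = read_input        # input component (display presents
                                            # the image; input references it)
        self.analyze = analyze              # processor: input -> command
        self.transmit = transmit            # communication component

    def run_once(self):
        image = self.capture_image()            # obtain the image
        selection = self.read_input(image)      # input referencing the image
        command = self.analyze(image, selection)  # generate the control command
        self.transmit(command)                  # transmit it to the vehicle
        return command
```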
These and various other features and advantages that characterize the claimed embodiments will become apparent upon reading the following detailed description and upon reviewing the associated drawings.
In accordance with various embodiments described herein, a UAV may be directed to a target location by the use of waypoints, or a series of preprogrammed positions along a path taken by the UAV from a starting point to the target location. A waypoint, in one embodiment, may be a set of preprogrammed, known GPS coordinates corresponding to a location. However, a waypoint may also be, in one embodiment, determined by a received input from an operator of the UAV, for example through the use of touch screen technology with a current picture including the target as shown, for example, in
The waypoints may be generated, for example, on the fly by a controller of the UAV by selection of a position within an image of a target zone. The image may, in one embodiment, be generated by a live video or camera image taken from a camera currently viewing the target area. For example, in one embodiment, the image may be generated by a cue camera, such as, but not limited to, any one of various types of wide area view (WAV) cameras. These WAV cameras may be camera arrays, scanning and rotating cameras, or other cameras with wide angles or anamorphic lens systems. In one embodiment, an operator selection of a position on the live video or image feed from the cue camera is received and translated into a set of coordinates that are provided to the UAV, for example as described herein.
In one embodiment, the operation center 102 may be a manned or unmanned base of operations that generates the directions for the UAV 106. The operation center 102 may comprise a computing device with a display configured to allow an operator to communicate with and control the UAV 106. In one embodiment, the operator may control the UAV 106 using the slew-to-cue control methods described herein. The operator may communicate with the UAV 106 directly. However, in another embodiment, for example where the UAV 106 is too far from the operation center 102 for direct communication over the network, the operation center 102 may communicate with the UAV 106 through the use of a ground control station 104 or another suitable communication intermediary.
In one embodiment, a slew-to-cue control method comprises sending a movement command to a UAV 106 based on a selected position, or cue, in an image received from a cue camera. The cue camera may be a wide area view (WAV) camera, in one embodiment, located remote from the UAV 106. For example, the cue camera may be located on a ground control station 104. In another example, the cue camera may be located at the operation center 102. In another embodiment, the cue image is provided from a camera associated with the UAV 106. The image, in one embodiment, may comprise a video stream provided substantially in real-time. In another embodiment, the image is a most-recently captured image from the UAV 106, provided substantially in real-time.
The cue, on which the movement command may be based, in one embodiment, is provided by an operator selecting a pixel, representing a new desired location for the UAV 106, in a received image from the cue camera. The operator may select the cue position, in one embodiment, using a joystick control mechanism. In another embodiment, the operator may select the cue position using a computer mouse or other external selection mechanism. In another embodiment, the operator may select the cue position by contacting a touch screen interface presenting the image.
In one embodiment, the ground control station 104 may comprise one or more features allowing it to be, alternatively, a launching station or a waypoint for the UAV 106. The ground control station 104 includes at least an internal computing device 110 with communications capability to relay commands from the operation center 102 to the UAV 106. In one embodiment, the ground control station 104 may also comprise a charging station 112 or a fueling station 116. These may be important features to allow the UAV 106 to travel from the operation center 102 to a target zone. These features may also be helpful in ensuring that one or more cameras 126 associated with the UAV 106 are sufficiently charged to image a target zone for the length of a mission. In one embodiment, the ground control station 104 may include a storage component 114 such that it can receive images and/or video feed from the UAV 106 and store them separately, for example as a backup, from an onboard storage component within the UAV 106. Alternatively, the storage component 114 may be used as a relay, passing live camera feed from the UAV 106 to the operation center 102 such that the operation center 102 receives the camera feed substantially instantaneously. In another embodiment, the camera feed is received by the operation center 102 after a delay. The ground control station 104 may also have one or more cameras 118 allowing the ground control station to take live camera or video feed that captures a view comprising both the target area and the UAV 106. The ground control station 104 may also include a GPS unit 120. Alternatively, the location of the ground control station 104 may be otherwise known.
The ground control station 104 may, in one embodiment, be shore or battery powered, with a generator, solar panels or another appropriate backup power mechanism. Additionally, the ground control station 104 may comprise a regenerative power source such as, for example, portable solar panels. The ground control station 104 may, in one embodiment, support, without attendance by an operator: battery recharging, offload of high definition video, charging of a sensor package, and/or rest stops during extended missions.
The UAV 106 may comprise at least an internal computing device 122 that comprises at least a processor. The UAV may also comprise a GPS unit 124. The UAV 106 may also comprise one or more cameras 126 configured to take live video feed and/or still camera images. The UAV 106 may also comprise a communications interface 128 capable of communicating with the ground control station 104 and/or the operations center 102. In one embodiment, communication between the UAV 106 and the operations center 102, either directly or through the ground control station 104, comprises transmitting live video feed received from the one or more cameras 126 on the UAV 106. In another embodiment, communication comprises sending images from the UAV 106, received from the one or more cameras 126.
Hostile Factors Facing UAV Operation
UAVs are often required to fly in imperfect conditions comprising one or more hostile conditions, for example: extreme distance, inaccessibility, inclement weather, extended missions, and visually impaired conditions. These, as well as other exemplary hostile factors, make the use of a UAV preferable to the use of manned vehicles. However, the use of a UAV, such as UAV 106, creates additional challenges for an operator/pilot using, for example, first person view or line of sight to fly the UAV as, often, the hostile conditions make flight using these conventional methods challenging.
One known hostile factor is a long distance between the operation center 102 and the ground control station 104, requiring the UAV 106 to travel across miles of terrain not necessarily visible to the operator. Further, the UAV 106 may then need to travel further from the ground control station 104 to a target area, which presents a similar problem, especially as the target area may not correspond to known GPS coordinates. Integrating a camera on the UAV 106, for example camera unit 126, and implementing the automatic slew-to-cue method of generating directions described herein may allow for directing transportation of the UAV 106 from the operation center 102, or other exemplary starting point, to a target position at a remote location, without the need for specific knowledge of the target's GPS coordinate location. In one embodiment, using the automatic slew-to-cue method implemented on a cue camera system 118 associated with a ground control station 104, an operator can more reliably direct the UAV 106 across a potentially unknown distance, even if the distance would otherwise induce latency into the system. Additionally, the methods described herein allow for more reliable operation of a UAV 106 in a situation where there is a possibility of signal loss between the UAV 106 and the ground control station 104.
Another known hostile factor is inaccessibility of the target area. For example, there may be unexploded ordnance on the ground near the target area, live munitions striking targets in the area, or another cause of inaccessibility, for example, smoke related to a forest fire. This may prevent access to the area by ground troops, and otherwise prevent acquisition of the accurate GPS coordinates necessary for conventional flight of the UAV 106. However, in one instance, inspection of the target area must occur in near real time for mission success. Additionally, the operation center 102 may be miles from the target. In one embodiment, a ground control station 104 that is able to automatically launch and retrieve the UAV 106 can be installed within the border of a closed area of operation, but far enough away from the target area as to avoid damage by munitions. In one embodiment, the ground control station 104 may be, for example, within one mile of the target area. However, in another embodiment, the ground control station 104 may be further than one mile from the target area. The operator may be able to, using the systems and methods described herein, direct the UAV 106 to travel the distance from the ground control station 104 to the target area and back, as needed, between operations in the target area, from the safety of the operation center 102.
Another difficulty facing UAV management is the length of time of an extended mission. For example, in one embodiment, a single exercise may require hours on target, or may require visiting and imaging multiple target locations in a row, taxing the battery life of an onboard camera 126 and the fuel reservoirs of the UAV 106. In one embodiment, the UAV 106 interacts with the remote ground control station 104 with an unattended, automatic battery charging module 112, fueling station 116 and/or video offload module 114. In one embodiment, while one UAV 106 is interacting with the modules on ground control station 104, a second UAV 106 is launched to allow for continuous surveillance of a target area.
A better method for directing the UAV 106 to a target is desired. For example, in many cases a target location changes throughout a mission as conditions shift. For example, in a forest fire scenario, hot zones of the fire may shift and change over time as weather conditions change and firefighting is conducted. Therefore, even if a GPS location of a target was known at the outset of a mission, the exact GPS coordinates of where the UAV 106 is needed at a later time may be unknown and/or changing as the UAV 106 approaches the target area. Additionally, because of potential visibility challenges, for example due to smoke or other inclement weather, it may be difficult for a pilot of a UAV 106 to fly using the video feed, or other control mechanism, from the UAV 106 itself. However, by implementing a slew-to-cue method as described herein, the UAV 106 may be directed to the target with decreased complexity. In one embodiment, the UAV 106 is a video-enabled, commercial grade, heavy-lift multi-rotor equipped with an inspection camera, for example camera 126, and a communications unit, for example communications unit 128. This embodiment enables an operator of the UAV 106 to realize a significant reduction in cost over the typical expense of high-end, military grade systems without compromising the information obtained or the quality of the mission. It may also reduce the training requirements for an operator of the UAV 106.
In another embodiment, the UAV 106 is enabled for automatic launch and retrieval from the operation center 102. For example, the UAV 106 may be launched by a command sent from an operator within the operation center 102 to the control software, such that the UAV 106 can execute an assigned mission profile and return to base. In an embodiment where the operation center is a significant distance from the target, or where the mission is exceptionally long or its parameters change during the mission, it may be necessary to launch the UAV 106 from the ground control station 104 with automated support.
Control System for Remote Control of a Vehicle
In one embodiment, the UAV 106 is programmed with control software such that it can receive commands directly from the operation center 102 or commands relayed through the ground control station 104. The control software may be implemented on an exemplary computing device 140 as shown in
In the embodiment where the exemplary computing device 140 is implemented on the ground control station 104 and/or the UAV 106, the exemplary computing device 140 may also comprise access to a camera unit 160, such that it can command an external camera to take images and/or video feed, and store the images and/or video feed or otherwise relay them back to the ground control station 104 and/or the operation center 102.
Software System for UAV Control
In one embodiment, the unmanned aerial vehicle 106 is programmed with control software such that it can receive and comply with commands sent from the operation center 102 either directly to the UAV 106 or through communications relay with the ground control station 104. In one embodiment, the control software is also viewable by an operator within the operation center 102, for example on the interactive user interface 200, shown in
Additionally, the operator may need to transition the UAV 106 from an autonomous mode to first person or line-of-sight control, based on a change in conditions in the field. In one embodiment, the UAV 106 is programmed with control software that allows it to transition into and out of the slew-to-cue control mode and other control modes to fit the different conditions the operator encounters during an individual mission with the UAV 106. In one embodiment, however, the slew-to-cue control mode allows the UAV 106 to fly in an autonomous manner without a dedicated pilot. The slew-to-cue method may allow an operator without UAV-specific pilot training, for example, to direct the UAV 106 through received images from the UAV 106 and/or the ground control station 104, using an intuitive and user-friendly control interface. The ability to utilize personnel without specialized training on an automated, user-friendly interface may cut down on both pilot-related error and fatigue-related error.
Some examples of the benefits of using a slew-to-cue method over conventional UAV flight techniques are the cost-effective and precise control of a video-enabled inspection UAV 106, an increased ability to handle hostile factors in an area of operations with minimal manual intervention by an operator, and the ability to utilize operators without specific training in piloting a UAV. The control scheme of the slew-to-cue method allows operators to control the UAV with minimal training as, in one embodiment, the UAV controls are intuitive.
In one embodiment, the initial set up screen 210 is configured to receive additional waypoints throughout a mission. For example, in the forest fire scenario, when a change in wind is detected, an operator may add an additional waypoint, either through the slew-to-cue method described below with respect to
A user may be able to, using the mission profile view 220, create a new mission profile 224 or scroll through a series of saved programs 226. The saved programs may involve waypoints previously known or previous targets whose GPS coordinates have since been determined, or whose relative locations are known with respect to other known waypoints. For example, during a mission, a user may bookmark 228 a current location of the UAV 106 to use as a waypoint in the future. The bookmarked location may have known GPS coordinates, calculated GPS coordinates relayed from the UAV 106, or may only be known through its relative location to the previous waypoint. Additionally, the GPS coordinate of the bookmarked location may be determinable using coordinate geometry as described below with respect to
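As a rough illustration of resolving a bookmark whose location is known only relative to the previous waypoint, a flat-earth conversion such as the following might be used; the function name and the meters-per-degree constant are assumptions introduced here, and the underlying geometry is detailed further below.

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate meters per degree of latitude (assumption)

def resolve_bookmark(prev_lat, prev_lon, north_m, east_m):
    """Convert a bookmark stored as a (north, east) offset in meters from
    the previous waypoint into approximate absolute GPS coordinates."""
    lat = prev_lat + north_m / M_PER_DEG_LAT
    lon = prev_lon + east_m / (M_PER_DEG_LAT * math.cos(math.radians(prev_lat)))
    return lat, lon
```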
In one embodiment, the automatic mode 244 may show a current mission path 270 of the UAV 106 with the one or more currently programmed waypoints 272 listed. The waypoints 272 may be indicated as achieved 274, meaning that they have already been passed through, and potentially imaged, by the UAV 106. Additionally, as shown in
Additionally, through navigation view 240, a user of the graphical user interface 200 is presented with additional options to switch to other preprogrammed automatic navigation paths, for example a holding pattern or another preprogrammed pattern, such as a circle pattern around a target area, where the circle pattern may have a set radius, for example 500 feet around the target area at a preset altitude, for example 200 feet. Alternatively, the parameters of the preprogrammed pattern may be entered by an operator when the automatic navigation path is selected.
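A preprogrammed circle pattern of the kind described above could be generated, as one hedged sketch, by sampling waypoints around the target; the function name and waypoint count are illustrative assumptions, while the example radius and altitude follow the figures just given.

```python
import math

def circle_pattern(center_north_m, center_east_m, radius_m, altitude_m, count=12):
    """Generate waypoints on a circle around a target area, in local
    north/east meters, e.g. a 500 ft (~152 m) radius at 200 ft (~61 m)."""
    waypoints = []
    for k in range(count):
        theta = 2.0 * math.pi * k / count
        waypoints.append((
            center_north_m + radius_m * math.cos(theta),  # north offset
            center_east_m + radius_m * math.sin(theta),   # east offset
            altitude_m,                                   # preset altitude
        ))
    return waypoints
```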
The graphical user interface 200 may also include an inspection camera view 250, configured to show one or more camera feeds as received in substantially real time from the UAV 106 and/or the cue system on the ground control station 104. The inspection camera view 250 may populate from a cue camera associated with the ground control station 104 or may populate directly from a camera unit 126 associated with the UAV 106. Additionally, for example as shown in
In one embodiment, the ground control station view 252 comprises at least a view of a target 258 and the UAV 256. The ground control station view 252 may also comprise a cursor 257 that an operator may manipulate across the ground control station view 252, for example by an external mouse or through a touch screen associated with a display 154. A user of the user interface 200 may manipulate the cursor 257 to indicate a new target for the UAV 106. For example, the target area 258 may now be an outdated target, and the operator may want to direct the UAV 106 to move instead to the new target area indicated by cursor 257. Using the slew-to-cue method, the operator may select a new target using the cursor 257, and by selecting the new target on the ground control station view 252, send a command to the UAV 106 to proceed to the selected new target area.
The raster view 254 as shown in
Ground control station view 252 may comprise, as indicated in
In one embodiment, the cursor 257 allows a user of the user interface 200 to select a new waypoint, for example by actuating the cursor 257. In another embodiment, entry of a new waypoint also requires entry of an authorization code by the operator, such that an accidental change of course is less likely to occur. However, in another embodiment, actuation of the cursor 257 only switches between views, for example allowing a user to zoom in on the live image or video feed generated by a cue camera system 118 associated with the ground control station 104 or a camera 126 associated with the UAV 106. In one embodiment, the cursor 257 offers both functionalities, changing a view presented on the inspection camera view 250 as well as selecting a new waypoint, through different actuation mechanisms, for example. Actuation of the cursor 257 may comprise, for example, touching the screen and dragging and dropping the cursor 257, in an embodiment where the screen is a capacitive touchscreen. Actuating the cursor 257 may comprise, in an alternative embodiment, dragging and dropping the cursor 257 through the use of an external device, for example a computer mouse or joystick. Additionally, actuation may comprise another actuation mechanism of the cursor 257 such that a control scheme is activated, offering camera control and/or UAV control options for the operator to choose from. In one embodiment, the control scheme may comprise a wizard that retrieves the necessary indications from an operator prior to calculating and programming new directions for the UAV 106.
In one embodiment, for example as shown in
One or more of the image feeds may be a thermal or other image, which may be either a recently taken camera image or a live video feed. In one embodiment, an operator of the user interface 200 can use the slew-to-cue method on the graphical user interface 200 of the
Different cameras or different camera modes may be available for different situations. In one embodiment, the camera used in the slew-to-cue method takes images in the visible light spectrum. In another embodiment, the camera takes images or video in the 3.5-9 micrometer mid-wave thermal range for general inspection. In another embodiment, the camera takes images or video in the 3.8-4.05 micrometer mid-wave thermal range in order to see through flames. In another embodiment, the camera takes images or video in the 3.2-3.4 micrometer mid-wave thermal range tuned to image volatile organic compounds. In another embodiment, the camera takes images or video in the 8-14 micrometer long-wave thermal range tuned to image oil slicks. In another embodiment, the camera takes images or video in the 8-8.6 micrometer long-wave thermal range tuned to image refrigerant compounds. In another embodiment, the camera takes images or video in the 10.3-10.7 micrometer long-wave thermal range tuned to image anhydrous ammonia.
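For reference, the spectral bands enumerated above can be collected into a single lookup structure; the mode names below are illustrative assumptions, while the wavelength ranges come directly from the embodiments just described.

```python
# Wavelength ranges in micrometers; None denotes the visible light spectrum.
CAMERA_MODES = {
    "visible":            None,          # visible light spectrum
    "general_inspection": (3.5, 9.0),    # mid-wave thermal
    "see_through_flames": (3.8, 4.05),   # mid-wave thermal
    "volatile_organics":  (3.2, 3.4),    # mid-wave thermal
    "oil_slicks":         (8.0, 14.0),   # long-wave thermal
    "refrigerants":       (8.0, 8.6),    # long-wave thermal
    "anhydrous_ammonia":  (10.3, 10.7),  # long-wave thermal
}
```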
The camera used in the slew-to-cue method may be mounted, in one embodiment, on the UAV 106 or, in another embodiment, on ground control station 104. Additionally, there may be multiple inspection cameras on a single ground control station 104 or UAV 106. This may allow an operator a greater visual range in order to select the appropriate next waypoint. For example, as shown in
Methods of Directing a UAV to a Target Location
In one embodiment, in block 304, an image is received from the ground control station 104 that at least includes the UAV within the image. In the embodiment where the operation center 102 communicates directly with the UAV 106, block 304 may instead designate receipt of an image of the UAV 106 from a cue camera associated with the operation center 102. The image may also comprise the indicated new target area, or may only include an indication of a current location of the UAV 106.
In block 306, in one embodiment, the UAV's current position is determined from the camera image received in block 304. The current position of the UAV 106 may be a relative position, for example relative to the ground control station 104. Alternatively, the UAV's current position may be transmitted from the UAV 106 directly, and may be an absolute location derived from an internal GPS module within the UAV 106. In the embodiment where the image received in block 302 is transmitted from the UAV 106 itself, the method may optionally progress from block 302 to block 306 directly, as indicated in
In one embodiment, after a current position of the UAV 106 is determined, either relative or absolute, a relative or absolute position of the new target is determined based on the image received in block 302. This may be accomplished, for example, using the known location of the UAV 106 to determine a distance between the UAV 106 and the selected new target. Based on, for example, the calculation method shown in
In block 310, directions are generated and sent to the UAV 106, such that the UAV can travel from its current location to the newly selected target location. These directions may be automatically sent to the UAV 106, or they may be presented to an operator on the user interface 200 such that the operator must confirm the directions before altering the UAV's current path. The additional step requiring operator confirmation may reduce the chance of accidentally changing a current path of the UAV to an undesired location. In one embodiment, the entirety of method 300 is accomplished automatically upon actuation of a cursor 257, for example. In an embodiment where method 300 is accomplished automatically, a confirmation option may appear to an operator prior to the transmission of new waypoint instructions to the UAV 106. This may allow, for example, any operator in the operations center 102 to select a new waypoint, and direct the UAV 106 to proceed to the new waypoint, using the graphical user interface 200 without any specialized pilot training.
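The sequence of blocks just described might be sketched as follows; the injected callables are assumptions standing in for the control software's actual components, not a definitive implementation.

```python
def slew_to_cue_update(receive_cue_image, locate_uav, locate_target,
                       make_directions, operator_confirms, send_to_uav):
    """A sketch of method 300: image receipt, position determination,
    target determination, and direction generation with confirmation."""
    image = receive_cue_image()                 # block 304: image including UAV
    uav_pos = locate_uav(image)                 # block 306: relative or absolute
    target_pos = locate_target(image, uav_pos)  # target position from the image
    directions = make_directions(uav_pos, target_pos)  # block 310
    if operator_confirms(directions):           # optional confirmation step
        send_to_uav(directions)
    return directions
```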
In one embodiment, each frame of a cue camera video feed, for example, generated from a cue camera system 118 associated with ground control station 104, comprises an image that can be mapped to a Cartesian system of coordinates with an x and y axis, as the UAV travels close enough to the surface of the earth that its curvature can be disregarded. Once an exemplary position is selected within a single frame of the live video feed or a most recently captured or acquired camera image, in one exemplary method, the x-y coordinates of the selected position within the image are automatically translated into a latitude and longitude address for the selected new target area, as explained in the calculation illustrated in
In block 402, a selection of a new target location is received by the control system. The selection may come from an operator viewing a feed from the inspection camera, for example on the inspection camera view 250 of the graphical user interface 200. The new location received in block 402 may be indicated, for example, by actuation of the cursor 257 at a position on a video feed from the ASTC camera.
In block 404, the control software determines the UAV's current position relative to the ASTC. In another embodiment, the UAV's absolute position is determined and expressed in GPS coordinates. The GPS coordinates representing the UAV's current position may be presented to an operator of the control software, for example through the user interface 200.
In block 406, the relative position of the selected new target location is determined, for example using one of the methods for determining a coordinate address described below, for example with regard to
In block 408, directions are provided to the UAV 106 based on the relative position of UAV 106 to the selected location, or the absolute position of the selected location. In one embodiment, directions comprise only sending coordinates of the selected location to the UAV 106, which automatically calculates directions. The directions may also comprise, in another embodiment, a calculated heading, distance, and altitude.
In one embodiment, in block 410, the control system is configured to monitor the UAV 106 as it proceeds to the selected location. The method 400 may be repeated periodically throughout a mission profile as new target areas are selected, requiring the control system to repeat the steps listed in blocks 402, 404, 406, 408, and 410 as new waypoints are selected during the mission. However, in an instance where the process is repeated, the current relative position of the UAV 106 may be taken as the relative position of the previously selected location, such that block 404 may be omitted from the method 400 after a first iteration of the method 400 is completed.
In block 504, the relative current position of the UAV 106 is determined. This may be accomplished using any of the geometric methods described below, for example with regard to
In block 506, a relative location of a selected new waypoint is determined, for example using any of the geometric methods described below, for example with regard to
In block 508, directions are generated and provided to the UAV 106 based on the detected current relative position of the UAV 106 and a location of the new waypoint. In one embodiment, directions comprise only sending coordinates of the selected location to the UAV 106, which automatically calculates directions. The directions may also comprise, in another embodiment, a calculated heading, distance, and altitude provided to the UAV 106.
In block 510, the control system may monitor the UAV 106 from its current position to the newly programmed waypoint and, for example, provide an indication to an operator that the new waypoint has been achieved. Additionally, in one embodiment, the control system may also provide an indication of time remaining until the next waypoint is achieved. The method 500 may then be repeated, for example as shown through recycle arrow 512 as new waypoints are added to a mission profile.
In block 604, in one embodiment, once the image has been received, the image may be translated to a system of Cartesian coordinates, presented to an operator of the UAV on the graphical user interface 200 and calculated, for example, as shown in
In block 606, an indicated new target area is received, where the target is indicated on the received image feed. In one embodiment, a coordinate system is not applied to every image received, but only after a selected new target is indicated.
In block 608, the indicated new target area is translated into a corresponding latitude and longitude address. The latitude and longitude address may be absolute, or it may be relative to a current position of the UAV 106 in one embodiment, for example as expressed in compass-based directions.
In block 610, the new target address is provided to the UAV 106. In one embodiment, method 600 is implemented on a computing device at the operation center 102. However, in another embodiment, the method 600 is implemented on a remote ground control station, for example ground control station 104. Method 600 may also comprise providing an indication that the UAV 106 is still on target to achieve the indicated new target area and, in one embodiment, an indication when the new target area is achieved.
Method for Determining a Coordinate Address
The position of the camera 702 may be, in one embodiment, proximate to the ground control station 104. The directions to the selected target 706 may, then, be expressed as directions from the ground control station 104. The known position of the UAV 106 and the ground control station 104 may also be used, in one embodiment, to provide directions from the UAV 106 to the selected target 706.
The latitude and longitude of the selected target 706 may be, in one embodiment, derived from the live image, for example an image substantially similar to diagram 750, taken by a geo-referenced cue camera, for example camera 702, associated with a ground control station 104. The geo-referenced cue camera 702, in one embodiment, provides the cue for the target location of the UAV 720. In one embodiment, once the location of the selected target 706 is determined, the control software on the UAV 720 then executes the prescribed surveillance profile, for example entered by an operator of the UAV 720. In another embodiment, the control software may be located at the ground control station 104 and may remotely communicate with the UAV 720 to transmit instructions for proceeding to the selected target 706.
In one embodiment, for example the exemplary physical depiction of
In one embodiment, an operator of a UAV 720 is presented with an image sent from a camera, for example camera 702, on the user interface 200. The image may correspond to the diagram 750. The operator can select a position, for example using cursor 257, in the displayed image and a coordinate point will be returned, in one embodiment. In the diagram of
In one embodiment, the vertical camera field of view angle 710, the horizontal camera field of view angle 712, the vertical distance from level 754, and the horizontal distance from center 760 can be used to calculate an angular position 784 of selected position 706 in relation to camera 702. A camera has a known field of view angle and a lens resolution (in both horizontal and vertical dimensions), measured in pixels. The vertical distance from level 754 and horizontal distance from center 760, in one embodiment, may be measurable in pixels. In one embodiment, Equations 1 and 2 may be used to calculate the angular position 784 between the camera 702 and the selected target 706.
In Equation 2, the vertical field of view angle 710, is divided by the vertical resolution 762 to obtain a vertical angle to pixel ratio. Multiplying the vertical angle to pixel ratio by the vertical distance from level 754 will calculate the vertical component 786 of angular position 784, in one embodiment. In Equation 1, the horizontal field of view angle 712, is divided by the horizontal resolution 764 to obtain a horizontal angle to pixel ratio. Multiplying the horizontal angle to pixel ratio by the horizontal distance from center 760 will calculate the horizontal component 788 of angular position 784, in one embodiment. In one embodiment the camera 702 may be pointed at such an angle that horizontal level line 756 may not be in the camera field of view 704. In such a scenario, a calculated theoretical distance from horizontal level line 756 would be used. In another embodiment, the angular position 784 of the select point 706 could also be determined with the use of accelerometers and compasses within the camera 702.
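Equations 1 and 2 themselves are not reproduced here; reconstructed from the description above, they plausibly take the following form, where FOV denotes the field of view angles 712 and 710, R the lens resolutions 764 and 762 in pixels, and d the pixel distances from center 760 and from level 754.

```latex
\text{Equation 1:}\qquad
\theta_{788} \;=\; \frac{\mathrm{FOV}_{h}}{R_{h}} \times d_{h}
\qquad\text{(horizontal component 788 of angular position 784)}

\text{Equation 2:}\qquad
\theta_{786} \;=\; \frac{\mathrm{FOV}_{v}}{R_{v}} \times d_{v}
\qquad\text{(vertical component 786 of angular position 784)}
```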
Using trigonometry, the calculated angular position 784 can be combined with the height 708, bearing direction 792 and the GPS location of the camera 702 to calculate the GPS coordinates corresponding to point 706, for example using Equations 3 and 4 as well. For more accuracy, the curvature of the Earth and terrain (from WGS84 for example) can also be included in the calculation. Within short distances, however, the assumption that the Earth has a flat surface may be sufficiently accurate, allowing for the curvature of the Earth to be ignored. In one embodiment, where distances are short enough that a planar accuracy is sufficient, the following equations may be used to determine the coordinate address for selected point 706.
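Equations 3 and 4 are likewise not reproduced in the text; under the flat-earth assumption just stated, one plausible reconstruction computes the ground range from the camera height 708 and the depression angle, then offsets the camera's coordinates along the bearing (the constant 111,320 m per degree of latitude is an approximation introduced here, not taken from the source).

```latex
d \;=\; \frac{h_{708}}{\tan\theta_{786}},
\qquad
\beta \;=\; \beta_{792} + \theta_{788}

\text{Equation 3 (assumed form):}\qquad
\mathrm{lat}_{706} \;=\; \mathrm{lat}_{702} + \frac{d\cos\beta}{111{,}320\ \mathrm{m}/{}^{\circ}}

\text{Equation 4 (assumed form):}\qquad
\mathrm{lon}_{706} \;=\; \mathrm{lon}_{702} + \frac{d\sin\beta}{111{,}320\ \mathrm{m}/{}^{\circ}\,\cos(\mathrm{lat}_{702})}
```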
The calculation diagram of
Once the coordinate address for selected point 706 is calculated, directions can be transmitted to the UAV 720, in one embodiment. Directions to selected point 706 may comprise, in one embodiment, relative locations from the UAV 720, or camera 702, to the selected point 706. In another embodiment, the directions may be given in terms of absolute locations. In another embodiment, only the bearing 788 and distance between the camera 702 or UAV 720 and the selected location 706 are transmitted.
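Putting the pieces together, a minimal sketch of the full pixel-to-coordinate chain, under the same flat-earth assumption and with illustrative parameter names, might read as follows.

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate meters per degree of latitude

def pixel_to_gps(cam_lat, cam_lon, cam_height_m, cam_bearing_deg,
                 fov_h_deg, fov_v_deg, res_h_px, res_v_px,
                 px_from_center, px_below_level):
    """Translate a selected pixel into approximate GPS coordinates,
    following Equations 1-4 as reconstructed above. The depression
    angle must be positive (the selected point lies below the horizon)."""
    # Equations 1 and 2: angle-per-pixel ratios times pixel offsets.
    azimuth_offset_deg = (fov_h_deg / res_h_px) * px_from_center
    depression_deg = (fov_v_deg / res_v_px) * px_below_level

    # Ground range from the camera height and depression angle.
    ground_range_m = cam_height_m / math.tan(math.radians(depression_deg))

    # Flat-earth conversion of range and bearing to latitude/longitude.
    bearing_rad = math.radians(cam_bearing_deg + azimuth_offset_deg)
    lat = cam_lat + (ground_range_m * math.cos(bearing_rad)) / M_PER_DEG_LAT
    lon = cam_lon + (ground_range_m * math.sin(bearing_rad)) / (
        M_PER_DEG_LAT * math.cos(math.radians(cam_lat)))
    return lat, lon
```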
While the present invention has been described with respect to the control of a UAV, workers skilled in the art will recognize that the systems and methods could be used to control any remote-controlled vehicle with an associated camera feed. For example, the slew-to-cue control method could also be used to move a ground control station 104 into an initial position using a WAV camera feed associated with the ground control station 104.
Additionally, the slew-to-cue control methods and interfaces could be used, for example, to control movement of a remotely-controlled land vehicle facing hostile conditions, for example remotely maneuvering a vehicle in desert or other remote locations. The slew-to-cue control methods and interface could also be used, for example, to maneuver an aquatic vehicle over water, for example a boat or other water-based vehicle. Additionally, the slew-to-cue control method could be implemented, in one embodiment, as an application allowing a person to control a remote-controlled vehicle from a portable computing device, for example a phone, tablet or other appropriate device.
Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.
The present application is based on and claims the benefit of U.S. Provisional Patent Application Ser. No. 62/040,736, filed Aug. 22, 2014, the content of which application is hereby incorporated by reference in its entirety.