This patent claims priority to European Application Number EP15382382, filed on Jul. 23, 2015, which is incorporated herein by reference in its entirety.
This patent relates generally to positioning aircraft and, more particularly, to methods and apparatus for positioning aircraft based on images of mobile targets.
Mobile objects (e.g., people, land vehicles, water vehicles, etc.) are often tracked by aircraft. Recently, unmanned aircraft have been utilized to track the mobile objects. These unmanned aircraft include cameras that enable the mobile objects to be identified and/or tracked. In some examples, the unmanned aircraft are planes that circle the tracked mobile objects and/or perform other maneuvers to track the mobile objects.
In one example, a method includes identifying, by executing first instructions via a processor, a mobile target in an image obtained by a camera mounted on a first aircraft and obtaining, by executing second instructions via the processor, current coordinates of the first aircraft. The method includes determining, by executing third instructions via the processor, coordinates of the mobile target based on the coordinates of the first aircraft and the image. The coordinates of the mobile target are within an area of uncertainty. The method includes determining, by executing fourth instructions via the processor, a first position for the first aircraft that reduces the area of uncertainty of the coordinates of the mobile target.
In another example, an apparatus includes a camera mounted to an aircraft to obtain an image. The apparatus includes a processor of the aircraft to identify a mobile target in the image obtained by the camera, obtain current coordinates of the aircraft, and determine coordinates of the mobile target based on the coordinates of the aircraft and the image. The coordinates of the mobile target are within an area of uncertainty. The processor of the aircraft is to determine a position for the aircraft that reduces the area of uncertainty of the coordinates of the mobile target and instruct the aircraft to move to the position to reduce the area of uncertainty.
In another example, an apparatus includes means for obtaining an image mounted to an aircraft. The apparatus includes means for determining a position of the aircraft to identify a mobile target in an image obtained by the means for obtaining an image, obtain current coordinates of the aircraft, determine coordinates of the mobile target and an area of uncertainty based on the coordinates of the aircraft and the image, and determine a position for the aircraft that reduces the area of uncertainty. The apparatus includes means for moving the aircraft to the position to reduce the area of uncertainty.
The figures are not to scale. Instead, to clarify multiple layers and regions, the thicknesses of the layers may be enlarged in the drawings. Wherever possible, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts.
Mobile objects are often tracked by aircraft. For example, mobile objects that travel along land (e.g., people, animals, land vehicles, etc.) and/or water (e.g., people, animals, water vehicles, etc.) may be tracked by aircraft. Unmanned aircraft are now being utilized to track mobile objects. In some examples, the unmanned aircraft are piloted and/or controlled by a person in a remote location (e.g., via radio control and/or a human-machine interface). Other example unmanned aircraft automatically track and/or follow mobile objects without human intervention (e.g., without a pilot within and/or remote from the aircraft). To enable such unmanned aircraft to identify and/or track the mobile objects, the unmanned aircraft include cameras that obtain images of the mobile objects. Some known unmanned aircraft that track mobile objects without human intervention include planes with cameras that obtain images of the mobile objects. In some instances, the unmanned planes may perform complicated maneuvers to continue to track the mobile objects. For example, because planes must move continuously, the unmanned planes may attempt to circle around (e.g., in an elliptical trajectory) a mobile object that becomes stationary and/or moves at a slow rate (e.g., slower than the unmanned planes are able to travel). In some instances, the maneuvers performed by the unmanned plane cause the unmanned plane to lose track of and/or determine inaccurate coordinates of the mobile object.
The example apparatus and methods disclosed herein determine a position (e.g., a stationary position) and/or orientation at which an unmanned aircraft is able to accurately determine and/or track coordinates of a mobile target. For example, the apparatus and methods disclosed herein determine a position and an orientation at which an unmanned helicopter and/or other unmanned rotorcraft may be positioned (e.g., by hovering) to accurately track a mobile target without having to perform difficult flight maneuvers. Further, by monitoring the mobile target from the stationary position, the example methods and apparatus disclosed herein reduce an amount of processing performed (e.g., by a processor such as a microcontroller or microcontroller unit) to determine coordinates of the mobile target by collecting data (e.g., from images obtained from a camera of the aircraft) from a stationary (e.g., not moving) reference point.
The examples disclosed herein include identifying a mobile target in an image obtained by a camera (e.g., a first camera) mounted on an aircraft (e.g., a first aircraft) and obtaining current coordinates of the aircraft. For example, the current coordinates are obtained from data received via a global positioning sensor in communication with the processor of the aircraft. Further, in the examples disclosed herein, coordinates are determined for the mobile target based on the coordinates of the aircraft and the image obtained from the camera. The coordinates of the mobile target are within an area of uncertainty (e.g., resulting from margins of error of the obtained data). Further, a first position and/or first orientation for the aircraft are determined that reduce (e.g., minimize) the area of uncertainty of the coordinates of the mobile target when the aircraft is tracking the mobile target. In some examples, the aircraft is instructed to move to the first position and/or first orientation (e.g., via an autopilot system of the aircraft). For example, the aircraft is moved to the first position to enable the camera of the aircraft to obtain a second image in which the mobile target is centered (e.g., is in a principal position) to further reduce (e.g., minimize) the area of uncertainty in tracking the mobile target.
In some examples, the coordinates of the mobile target are 3-dimensional coordinates that are determined based on 3-dimensional reference coordinates of the aircraft and the 2-dimensional image obtained by the camera of the aircraft. Additionally or alternatively, a distance between the aircraft (and/or the camera of the aircraft) and the mobile target is determined and utilized to determine the coordinates of the mobile target. Further, a size, a color and/or a velocity of the mobile target may be determined based on the image obtained by the camera to determine a classification of the mobile target to further reduce the area of uncertainty and/or enable the aircraft to track the mobile target. In such examples, a vector is formed that includes information related to the mobile target such as size, color, velocity, weight, class, position, etc.
The first position of the aircraft may be based on the coordinates of the aircraft, the coordinates of the mobile target, the image obtained by the camera, a focal length of the camera, an angle of the camera relative to a fuselage of the aircraft and/or a yaw of the aircraft. Further, the area of uncertainty of the coordinates of the mobile target may have an elliptical shape that includes a major axis and a minor axis. For example, the first position of the aircraft that reduces the area of uncertainty is orthogonal to the major axis and/or along the minor axis. In some examples, two positions are identified for the aircraft that would minimize the area of uncertainty and are orthogonal to the major axis and/or along the minor axis. In such examples, the aircraft is instructed to move to the position that is closest to the current coordinates of the aircraft.
In some examples, the mobile target is identified and/or tracked based on a second image obtained by another camera (e.g., a second camera) mounted on another aircraft (e.g., a second aircraft). In such examples, current coordinates of the other aircraft are obtained and a second position for the other aircraft is determined that would further reduce the area of uncertainty of the coordinates of the mobile target. For example, the second position for the other aircraft is orthogonal to the minor axis and/or along the major axis of the area of uncertainty when the first position of the initial aircraft is orthogonal to the major axis and/or along the minor axis.
Further, in some examples, images of a mobile target are analyzed to position aircraft by identifying a mobile target in an image obtained via a camera mounted on an aircraft, determining (e.g., via internal sensors in communication with the processor of the aircraft) coordinates of the aircraft with respect to a Global Reference System, determining coordinates of the mobile target with respect to the Global Reference System, determining an area of uncertainty associated with the coordinates of the mobile target, and determining a first position of the aircraft from which to monitor the mobile target that minimizes the area of uncertainty associated with the coordinates of the mobile target. For example, to determine the coordinates of the mobile target with respect to the Global Reference System, a distance between the aircraft and the mobile target is determined (e.g., via a sensor such as a telemeter and/or a laser telemeter that is mounted on the aircraft and in communication with the processor), coordinates of the mobile target are determined with respect to axes of the image of the camera, an angle of the camera relative to a fuselage of the aircraft (e.g., a fixed angle, an adjustable angle) is determined, an orientation of the aircraft (e.g., a yaw angle) relative to the Global Reference System is determined (e.g., via a sensor, such as a gyroscope, of the aircraft that is in communication with the processor of the aircraft), and the coordinates of the mobile target in the image (e.g., 2-dimensional coordinates) are transformed into coordinates of the mobile target relative to the Global Reference System (e.g., 3-dimensional coordinates).
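For illustration, the following is a minimal sketch (in Python, using NumPy) of the pixel-to-global transformation chain described above. It assumes a pinhole camera model, a yaw-only aircraft attitude and a camera tilted by a fixed angle relative to the fuselage; the helper names and frame conventions are illustrative assumptions rather than the exact formulation of this disclosure.

```python
import numpy as np

def rot_z(a):
    # Rotation about the z-axis (used here for the aircraft yaw).
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    # Rotation about the y-axis (used here for the camera tilt angle).
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def target_global_coords(pixel_uv, C, dist, aircraft_pos, yaw, cam_tilt):
    """Estimate global target coordinates from a single image detection.

    pixel_uv     : (u_t, v_t) pixel coordinates of the detected target
    C            : 3x3 matrix of camera intrinsic parameters
    dist         : measured camera-to-target distance (e.g., from a telemeter)
    aircraft_pos : aircraft coordinates in the Global Reference System
    yaw          : aircraft yaw angle in radians (roll and pitch assumed zero)
    cam_tilt     : camera tilt angle relative to the fuselage in radians
    """
    m_tilde = np.array([pixel_uv[0], pixel_uv[1], 1.0])  # homogeneous pixel coords
    ray_cam = np.linalg.inv(C) @ m_tilde                 # pointing ray, camera frame
    ray_cam /= np.linalg.norm(ray_cam)                   # unit pointing vector
    ray_global = rot_z(yaw) @ rot_y(cam_tilt) @ ray_cam  # camera -> aircraft -> global
    return np.asarray(aircraft_pos, dtype=float) + dist * ray_global
```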
Further, in some examples, a position of an aircraft of a fleet of aircraft may be known, in real-time, to the other aircraft of the fleet to enable the aircraft to be positioned relative to each other in an efficient manner for tracking and/or monitoring the mobile target. For example, the aircraft of the fleet that is closer and/or is able to more easily move to the first position to monitor the mobile target moves to the first position. Further, the fleet of aircraft may take into account characteristics of the aircraft (e.g., maximum velocity, fuel level, etc.) when determining which aircraft is to move to the first position. If it is determined that an aircraft is not in a condition to track a mobile target, the aircraft may be instructed to return to a base to address and/or resolve the condition of the aircraft.
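A toy selection rule for which aircraft of the fleet moves to the first position might look as follows; the field names and the fuel threshold are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def pick_tracker(fleet, first_position, min_fuel=0.2):
    """fleet: list of dicts with 'id', 'position' (3-vector) and 'fuel' (0..1)."""
    able = [a for a in fleet if a["fuel"] >= min_fuel]
    if not able:
        return None  # no aircraft fit to track; e.g., instruct a return to base
    # Choose the aircraft whose current position is closest to the first position.
    return min(able, key=lambda a: np.linalg.norm(
        np.asarray(a["position"], dtype=float) - np.asarray(first_position, dtype=float)))
```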
Thus, the examples disclosed herein enable aerial tracking of a mobile target by an unmanned aircraft that remains suspended at a fixed position without operator intervention by determining a position at which the aircraft is able to monitor the mobile target with the least amount of uncertainty with respect to the determined coordinates of the mobile target.
For example, to reduce the area of uncertainty 5 associated with the coordinates of the mobile target 2, the mobile target 2 is identified in an image obtained by the camera 3 and current coordinates of the aircraft 1 are obtained from a sensor (e.g., a global positioning sensor) of the aircraft 1. Coordinates of the mobile target 2 are calculated and/or determined based on the current coordinates of the aircraft 1 and the image obtained by the camera 3. Further, the area of uncertainty 5 associated with the determined coordinates of the mobile target 2 and a target position (e.g., a first position) and/or orientation (e.g., a first orientation) that reduces the area of uncertainty 5 are determined to enable the mobile target 2 to be tracked. In some examples, the aircraft 1 may be instructed to move to the first position and/or the first orientation to reduce the area of uncertainty 5. Additionally or alternatively, the camera 3 and sensor(s) of the aircraft 1 are in communication with a processor (e.g., a microcontroller unit) of the aircraft to enable the processor to determine the first position and/or the first orientation that reduces the area of uncertainty 5 and/or to instruct the aircraft 1 to move to the first position (e.g., via autopilot, without human intervention, etc.). For example, when the aircraft 1 is at the first position in the first orientation, the aircraft 1 is able to track the mobile target 2 such that the mobile target 2 remains in a principal position (e.g., the center of the image) of the field of vision of the camera 3. In some examples, the camera 3 is fixed relative to the fuselage 4 of the aircraft 1 without independent control of the camera 3. In other examples, the camera 3 is movable and/or controllable such that an orientation of the camera 3 relative to the fuselage 4 is adjustable.
In some examples, there may be more than one mobile target (e.g., the mobile target 2 and another mobile target) identified and/or tracked via the images obtained by the camera 3 of the aircraft 1. In such examples, each of the mobile targets is identified by its respective characteristics (e.g., shape, color, velocity, etc.). For example, the characteristics of the mobile targets are measured by conventional perception systems of the camera 3 and/or the aircraft 1. In some examples, a state vector, $\vec{b}$, is determined and/or defined for each of the mobile targets and includes the identified characteristics of the respective mobile target. The state vectors of the respective mobile targets enable the aircraft 1 to distinguish the mobile targets from each other.
For example, a state vector, $\vec{b}_k(t)$, is determined for a mobile target, k (e.g., the mobile target 2). The state vector includes dynamic characteristic(s) (e.g., position, velocity, etc.) of the mobile target, k, and static characteristics (e.g., color, size, etc.) of the mobile target, k, that enable the mobile target, k, to be classified into a predetermined class. For example, the state vector, $\vec{b}_k(t)$, is expressed as provided below in Equation 1:

$\vec{b}_k(t) = [p_k(t)\;\;\dot{p}_k(t)\;\;\theta_k]^T$ Equation 1
In Equation 1 provided above, $p_k$ represents a position of the mobile target, $\dot{p}_k$ represents a velocity of the mobile target, and $\theta_k$ represents static characteristics of the mobile target.
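One way to assemble such a state vector is sketched below; the concrete layout (position, then velocity, then static descriptors) and the example values are illustrative assumptions.

```python
import numpy as np

def state_vector(position, velocity, static_chars):
    # b_k(t) = [p_k, dp_k/dt, theta_k]; static_chars may encode size, color, class.
    return np.concatenate([np.asarray(position, dtype=float),
                           np.asarray(velocity, dtype=float),
                           np.asarray(static_chars, dtype=float)])

# Example: a target at (10, 5, 0) m moving east at 1.2 m/s with a 2.1 m size descriptor.
b_k = state_vector([10.0, 5.0, 0.0], [1.2, 0.0, 0.0], [2.1])
```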
For a tracking mission in which there is a plurality of mobile targets, there is one combined state vector, $B(t) = [\vec{b}_1(t) \ldots \vec{b}_k(t) \ldots \vec{b}_N(t)]$, which includes the respective states of each of the mobile targets 1 through N.
The characteristics and/or information regarding the states of the respective mobile targets are determined based on measurements taken and/or data collected by a perception system of the aircraft 1 (e.g., the camera 3 and/or other sensors) and/or the perception systems of other aircraft (e.g., an aircraft 8).
In some examples, margins of error in measurements (e.g., based on images obtained by the camera 3) result in an uncertainty in the estimation of the state of the mobile target 2. In the illustrated example, the uncertainty of a location of the mobile target 2 is represented by the area of uncertainty 5. The area of uncertainty 5 of the mobile target 2 has an associated covariance matrix, Σ. A covariance matrix is a conventional term utilized when instruments are subjected to errors and/or tolerances in measurements and/or estimations. In the illustrated example, the covariance matrix, Σ, defines an ellipsoid. Axes of the ellipsoid correspond to three standard deviations (3σ) with respect to the estimated position of the mobile target 2. Further, the covariance matrix Σ is determined based on noise characteristics of the sensors of the aircraft 1 and/or methods utilized to estimate the first position of the aircraft 1 (e.g., noise characteristics from a position of the camera 3 with respect to the aircraft 1, a position of the aircraft 1, an orientation of the aircraft 1, uncertainty of detection of the mobile target 2 in image(s), etc.).
In the illustrated example, the coordinates of the first position of the aircraft 1 that minimizes the area of uncertainty 5 are represented by a vector, $\vec{M}_w$, with components xw, yw and zw. The orientation of the aircraft 1 that reduces the area of uncertainty 5 at the first position is represented by a roll angle, γw, a pitch angle, βw, and a yaw angle, αw. Further, the current or initial coordinates of the aircraft 1 are represented by a separate position vector.

$\vec{M}_w$, the roll angle, γw, the pitch angle, βw, and the yaw angle, αw, are calculated based on estimated coordinates, $\vec{M}_t$, of the mobile target 2 and the corresponding area of uncertainty 5. The estimated coordinates, $\vec{M}_t$, of the mobile target 2 are expressed in a Global Reference System (e.g., established via coordinates of the Universal Transverse Mercator (UTM)) and are determined based on coordinates ut and vt associated with pixels of the image obtained from the camera 3 of the aircraft 1. For example, the coordinates ut and vt are expressed as a function of the axes $\hat{u}$ and $\hat{v}$ of the image of the camera 3. Provided below in Equation 2 is a bidimensional vector containing the coordinates of the image:

$\vec{m}_t = [u_t\;\;v_t]^T$ Equation 2
Further, $\vec{m}_t$ and $\vec{M}_t$ are transformed to homogeneous coordinates $\tilde{m}$ and $\tilde{M}$, respectively, as provided below in Equation 3 and Equation 4:

$\tilde{m} = [u_t\;\;v_t\;\;1]^T$ Equation 3

$\tilde{M} = [x_t\;\;y_t\;\;z_t\;\;1]^T$ Equation 4
The vectors $\tilde{m}$ and $\tilde{M}$ of Equations 3 and 4, respectively, are expressed in different reference systems and may be expressed in relation to each other based on rotational and transformation matrices.
For example, two reference systems {A} and {B} have a same origin of coordinates and standard bases $\{\hat{i},\hat{j},\hat{k}\}$ and $\{\hat{i}',\hat{j}',\hat{k}'\}$, respectively. A vector expressed in the reference system {A} is represented as $\vec{v}_A$, and the same vector expressed in the system {B} is $\vec{v}_B = R_{BA}\,\vec{v}_A$, in which $R_{BA}$ is the rotation matrix from the reference system {A} to the reference system {B}. $R_{BA}$ is represented by Equation 5 provided below:

$R_{BA} = \begin{bmatrix} \hat{i}'\cdot\hat{i} & \hat{i}'\cdot\hat{j} & \hat{i}'\cdot\hat{k} \\ \hat{j}'\cdot\hat{i} & \hat{j}'\cdot\hat{j} & \hat{j}'\cdot\hat{k} \\ \hat{k}'\cdot\hat{i} & \hat{k}'\cdot\hat{j} & \hat{k}'\cdot\hat{k} \end{bmatrix}$ Equation 5
In examples in which free vectors are utilized, a vector may be transformed from one reference system to another reference system via the rotational matrix. In examples in which position vectors are utilized, the vector may be transformed via the rotational matrix and a transformation matrix. For example, if the reference systems {A} and {B} have different origin coordinates, a translation vector, $\vec{t}_A$, translates an origin of {A} to an origin of {B} expressed in the system {A}, and a transformation matrix, $T_{AB}$, transforms a vector expressed in the reference system {B} to a vector expressed in the reference system {A} as illustrated in Equation 6 provided below:

$T_{AB} = \begin{bmatrix} R_{AB} & \vec{t}_A \\ 0 & 1 \end{bmatrix}$ Equation 6
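A short sketch of such a homogeneous transform, in the spirit of Equation 6, is provided below; the 4x4 block layout is the standard convention and is assumed here.

```python
import numpy as np

def transform_matrix(R_ab, t_a):
    # Compose T_AB from a 3x3 rotation R_AB and a translation vector t_A.
    T = np.eye(4)
    T[:3, :3] = np.asarray(R_ab, dtype=float)
    T[:3, 3] = np.asarray(t_a, dtype=float)
    return T

def apply_transform(T_ab, v_b):
    # Map a position vector expressed in {B} into {A} using T_AB.
    v_h = np.append(np.asarray(v_b, dtype=float), 1.0)  # homogeneous form
    return (T_ab @ v_h)[:3]
```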
Returning to the vectors $\tilde{m}$ and $\tilde{M}$ expressed in Equations 3 and 4, the relation between one point of Cartesian 3D space at which the mobile target 2 is located and its projection onto the image obtained by the camera 3 is expressed in terms of the transformed vectors, $\tilde{m}$ and $\tilde{M}$, via Equation 7 provided below:
$s\,\tilde{m} = C\,T_{CG}\,\tilde{M} = C\,[R_{CG}\;\;\hat{t}_C]\,\tilde{M}$ Equation 7
In Equation 7, s represents a scale factor that is a function of the relative positions of the camera 3 and the mobile target 2 and of characteristics and/or parameters of the camera 3. Further, s is defined by Equation 8 provided below:
$s = (D_t - f)/f$ Equation 8
In Equation 8, f is a focal length of the camera 3, and Dt is a distance between the camera 3 and the mobile target 2 that can be measured via, for example, a telemeter on-board the aircraft 1 and directed toward the mobile target 2.
$T_{CG}$ of Equation 7 is a transformation matrix (e.g., a rotation and translation matrix) and $\hat{t}_C$ is a translation vector for transitioning between a reference system {G} of global coordinates of a Global Reference System and a local reference system {C} of the camera 3. Further, C of Equation 7 represents a matrix of intrinsic parameters of the camera 3, as provided below in Equation 9:

$C = \begin{bmatrix} \alpha_u & \gamma & u_0 \\ 0 & \alpha_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}$ Equation 9
In Equation 9 provided above, u0 and v0 represent a central point of the image obtained by the camera 3, αu and αv represent scaling factors of the axes $\hat{u}$ and $\hat{v}$ of the image obtained by the camera 3, and γ represents a parameter that defines an orthogonality and/or skewness between the axes $\hat{u}$ and $\hat{v}$ of the plane of the image obtained by the camera 3.
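The intrinsic matrix and scale factor may be assembled as sketched below; the matrix layout follows the common pinhole-camera convention described for Equation 9, and the function names are illustrative.

```python
import numpy as np

def intrinsic_matrix(alpha_u, alpha_v, u0, v0, gamma=0.0):
    # alpha_u/alpha_v: axis scale factors; (u0, v0): image center; gamma: skew.
    return np.array([[alpha_u, gamma,   u0],
                     [0.0,     alpha_v, v0],
                     [0.0,     0.0,     1.0]])

def scale_factor(D_t, f):
    # Equation 8: s = (D_t - f) / f, with D_t the camera-to-target distance.
    return (D_t - f) / f
```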
Based on Equations 3-4 and 7-9 provided above, a vector, $\vec{\Psi}_c$, defines a direction in which the aircraft 1 is pointing (e.g., toward the mobile target 2) from its current location and/or coordinates. The vector, $\vec{\Psi}_c$, is expressed in the local reference system of the camera 3 and is provided below in Equation 10:
Equation 10 may be rewritten as provided below in Equation 11:
Further, the coordinates of the vector, $\vec{\Psi}_c$, that are based in the local reference system {C} of the camera 3 can be translated into coordinates of a vector, $\vec{\Psi}_G$, that are based in the global coordinates {G} of the Global Reference System. The vector, $\vec{\Psi}_G$, is provided below in Equation 12:

$\vec{\Psi}_G = R_{GU}\,R_{UC}\,\vec{\Psi}_c$ Equation 12
In Equation 12, RUC is a rotation matrix between a reference system of the camera 3 and a reference system of the aircraft 1, and RGU is a rotation matrix between the reference system of the aircraft 1 and the Global Reference System.
In combination with Equations 11-12 provided above, a parametric equation provided below in Equation 13 is utilized to determine the target position (e.g., the first position) of the aircraft 1 in the global coordinates {G} of the Global Reference System at which the camera 3 is pointed toward the mobile target 2:
Thus, Equation 13 provided above enables the first position of the aircraft 1 that minimizes the area of uncertainty 5 to be determined when the estimated coordinates, $\vec{M}_t$, of the mobile target 2, the vector $\vec{\Psi}_G$, and λ are known and/or determined.
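A hedged reading of this relation is sketched below: candidate aircraft positions lie on the line through the estimated target coordinates along the pointing direction, parameterized by λ. Because Equation 13 itself is not reproduced in this text, the exact form used here is an assumption.

```python
import numpy as np

def candidate_position(M_t, psi_g, lam):
    # Position from which a camera looking along psi_g sees the target at M_t.
    return np.asarray(M_t, dtype=float) - lam * np.asarray(psi_g, dtype=float)
```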
To determine λ, a quadric or ruled space is defined to express the area of uncertainty 5 associated with the location of the mobile target 2 based on the state vector of Equation 1. The quadric or ruled space is provided below in Equation 14:
$x^T Q\,x + p^T x + r = 0$ Equation 14

where Q is a symmetric matrix, p is a vector, and r is a constant.
In Equation 14 provided above, the terms are inherent to an equation that defines a quadric. For example, if the quadric is an ellipsoid, the condition det(Q)>0 is set on the determinant of the matrix Q, which is an invertible matrix.
To determine a direction and/or magnitude of a major axis 6 and/or minor axis 7 of the ellipsoid of the area of uncertainty 5 corresponding to the mobile target 2, Equation 14 may be rewritten with reference to a center of the ellipsoid as provided below in Equation 15:
$(x-k)^T R D R^T (x-k) = 1$ Equation 15

In Equation 15 provided above, k represents the center of the ellipsoid (e.g., the coordinates of the mobile target 2), R represents a rotation matrix, and D is a diagonal matrix. Based on Equation 15 provided above, Equation 14 can be rewritten as provided below in Equation 16:

$(x-k)^T Q (x-k) = x^T Q x - 2 k^T Q x + k^T Q k = (x^T Q x + p^T x + r) - (2 Q k + p)^T x + (k^T Q k - r) = -(2 Q k + p)^T x + (k^T Q k - r)$ Equation 16
Further, the center of the ellipsoid, k, can be determined as provided below in Equation 17:
$k = -Q^{-1} p / 2$ Equation 17
Equation 17 can be rewritten as provided below in Equation 18:
$k^T Q\,k = p^T Q^{-1} p / 4$ Equation 18
Further, Equation 18 can be rewritten as provided below in Equation 19:
$(x-k)^T Q (x-k) = p^T Q^{-1} p / 4 - r$ Equation 19
By dividing Equation 19 by the scalar, r, Equation 19 can be rewritten as provided below in Equation 20:
In Equation 20, Φ represents a quadric. Further, Equation 20 can be rewritten as provided below in Equation 21:

$(x-k)^T \Phi (x-k) = 1$ Equation 21

In Equation 21 provided above, a symmetric matrix of the quadric, Φ, is factored utilizing an eigenvalue decomposition such that $\Phi = R D R^T$, in which R is the rotation matrix and D is the diagonal matrix. For example, a main diagonal of the diagonal matrix, D, is composed of positive factors.

In some examples, terrain on which the mobile target 2 is positioned and/or located is known via a Geographic Information System (GIS) database. In such examples, there is substantially no uncertainty of the position of the mobile target in a z-axis. Further, in such examples, uncertainty in an x-y plane (e.g., a horizontal plane) is determined to identify the area of uncertainty 5 of the mobile target 2. In examples in which the uncertainty is an ellipsoid extending in the x-y plane, an eigenvalue decomposition is determined, calculated and/or otherwise analyzed to determine the major axis 6 and the minor axis 7 of the ellipsoid of the area of uncertainty 5. For example, the quadric, Φ, a corresponding eigenvector, ω, and a corresponding eigenvalue, λ, can be written as provided below in Equation 22 when the eigenvector, ω, is a non-null (e.g., non-zero) vector:

$\Phi\,\omega = \lambda\,\omega$ Equation 22
In Equation 22 provided above, the eigenvalues are determined based on the quadratic equation provided below in Equation 23:
$\det(\Phi - \lambda I) = 0$, where $\Phi = \begin{pmatrix} \varphi_{11} & \varphi_{12} \\ \varphi_{21} & \varphi_{22} \end{pmatrix}$ and I is an identity matrix. Equation 23
Based on Equation 23, the eigenvalues λ1 and λ2 are determined utilizing Equation 24 provided below:
For example, the eigenvalues λ1 and λ2 that are determined utilizing Equation 24 are utilized in Equation 13 to determine the first position that minimizes the area of uncertainty 5.
Further, the eigenvalues λ1 and λ2 are utilized to determine the area of uncertainty 5 associated with the mobile target 2. The eigenvalues λ1 and λ2 can be expressed in Equation 25 and Equation 26 as provided below:
In Equation 25, ω1 represents the major axis 6 of the ellipsoid of the area of uncertainty.
In Equation 26, ω2 represents the minor axis 7 of the ellipsoid of the area of uncertainty.
Further, Equations 25 and 26 are rewritten as provided below in Equation 27:
$\Phi\,R = R\,D$ Equation 27

In Equation 27, R=[ω1 ω2], in which the columns are eigenvectors of unitary length, and D=diag(λ1, λ2). By multiplying both sides of Equation 27 on the right by the matrix $R^T$, Equation 28 as provided below is formed:

$\Phi = R\,D\,R^T$ Equation 28
Thus, the equation that defines the ellipsoid of the area of uncertainty 5 is rewritten in Equation 29 as provided below:
$(x-k)^T \Phi (x-k) = 1$ Equation 29
In Equation 29, the quadric, Φ, corresponds to the covariance matrix Σ as provided below in Equation 30:
$\Phi = \tfrac{1}{4}\,\Sigma^{-1}$ Equation 30
The covariance matrix Σ of Equation 30 is periodically calculated and/or determined, for example, by a processor of the aircraft 1.
Based on the covariance matrix Σ, Equation 23, and Equation 30, if φ11≥φ22, the major axis, ω1, (the major axis 6) is determined utilizing Equation 31 provided below and the minor axis, ω2, (the minor axis 7) is determined utilizing Equation 32 provided below:

Otherwise, if φ11<φ22, the major axis, ω1, is determined utilizing Equation 33 provided below and the minor axis, ω2, is determined utilizing Equation 34 provided below:
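For the planar (x-y) case described above, the axis determination can be sketched as follows, using the relation of Equation 30 and a standard symmetric eigendecomposition in place of Equations 23-24 and 31-34; the function name is illustrative.

```python
import numpy as np

def uncertainty_axes(sigma_xy):
    """sigma_xy: 2x2 covariance of the target position in the x-y plane.

    Returns the major and minor axis directions (omega_1, omega_2) and the
    corresponding eigenvalues (lambda_1, lambda_2) of the quadric Phi.
    """
    phi = 0.25 * np.linalg.inv(np.asarray(sigma_xy, dtype=float))  # Equation 30
    eigvals, eigvecs = np.linalg.eigh(phi)                         # Phi = R D R^T
    # For (x-k)^T Phi (x-k) = 1, the smaller eigenvalue of Phi corresponds to the
    # longer ellipse axis, so the major axis is the eigenvector with the smaller one.
    w1, w2 = eigvecs[:, 0], eigvecs[:, 1]   # eigh returns eigenvalues in ascending order
    return w1, w2, eigvals[0], eigvals[1]
```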
The axes of the area of uncertainty 5 can change dynamically as the determined location of the mobile target 2 changes based on new and/or additional data and/or information obtained regarding the mobile target 2 and/or the aircraft 1.
Returning to determining the first position of the aircraft 1 that minimizes the area of uncertainty 5, the first position can be determined utilizing Equations 11-13 and 24 given some constraints and/or assumptions. For example, a constraint that the mobile target 2 is to be centered in the image obtained by the camera 3 of the aircraft 1 is represented by Equation 35 provided below:

$u_t = w/2, \quad v_t = h/2$ Equation 35
In Equation 35, w represents a number of pixels of a width of the image obtained by the camera 3, and h represents a number of pixels of a height of the image obtained by the camera 3. The values of Equation 35 are utilized to determine the values of Ψcx and Ψcy in Equation 11, which are utilized to determine xw and yw of the location that minimizes the area of uncertainty 5.
Additionally or alternatively, another constraint requires that zw of the location that minimizes the area of uncertainty 5 is assumed to equal zπ. zπ represents an altitude of the aircraft 1 that is determined based on, for example, an instantaneous field of view (IFOV) of the camera 3 and a size of the mobile target 2. In some examples, zπ equals the size of the mobile target 2 divided by the IFOV of the camera 3. For example, based on the constraint regarding the altitude of the aircraft 1, Equation 13, which represents the first position of the aircraft 1, can be rewritten as provided below in Equation 36:
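As a small numeric sketch of this altitude constraint (the example values are illustrative, not from this disclosure):

```python
def altitude_for_target(target_size_m, ifov_rad):
    # z_pi: the size of the mobile target divided by the camera's instantaneous
    # field of view (IFOV), as described above.
    return target_size_m / ifov_rad

# Example: a 2 m target viewed with a 10 mrad IFOV suggests an altitude of 200 m.
z_pi = altitude_for_target(2.0, 0.010)
```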
Further, another constraint may require the camera 3 to be aligned with a plane of symmetry of the aircraft 1 and inclined at the angle θ relative to the fuselage 4, such that the rotation matrix, RUC, between the reference system of the camera 3 and the reference system of the aircraft 1 is represented as provided below in Equation 37:
To determine the rotational matrix, RGU, other constraints and/or assumptions are made. For example, the roll angle, γw, and the pitch angle, βw, are assumed to equal 0. Further, the yaw angle, αw, is selected such that the camera 3 is focused on the mobile target 2 in a direction that is perpendicular to the major axis, ω1, of the area of uncertainty 5. The yaw angle, αw, is determined based on Equation 38 provided below:
For example, in Equation 38, an angle of 0 degrees is directed to the north and a positive angle is clockwise from the north.
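A sketch of that yaw computation is provided below, with 0 rad pointing north and positive angles measured clockwise; the use of arctan2 on the major-axis components is an assumption, since Equation 38 itself is not reproduced in this text.

```python
import numpy as np

def yaw_perpendicular_to(major_axis_xy):
    # Heading (compass convention) perpendicular to the major axis omega_1,
    # assuming planar coordinates ordered as (x, y) = (east, north).
    perp = np.array([-major_axis_xy[1], major_axis_xy[0]])  # rotate omega_1 by 90 deg
    return np.arctan2(perp[0], perp[1])  # atan2(east, north): 0 at north, clockwise positive
```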
Based on the constraints regarding the roll angle, γw, the pitch angle, βw, and the yaw angle, αw, the rotational matrix, RGU, which is utilized to determine the first location of the aircraft 1, is represented by the matrix provided below in Equation 39:
For example, the yaw angle, αw, of Equation 39 is calculated based on Equation 38.
Based on the foregoing constraints, two positions that minimize the area of uncertainty 5 are determined because Equation 24, which provides the eigenvalues, λ, is a quadratic equation. The aircraft 1 is to move to the one of the two positions that is closest to the current coordinates of the aircraft 1. The closer of the two positions is determined based on Equation 40 provided below:
Thus, if the first equation of Equation 40 is true, the aircraft 1 is to move to the first of the two positions to reduce the area of uncertainty 5. Otherwise, the aircraft 1 is to move to the second of the two positions to reduce the area of uncertainty 5.
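The choice expressed by Equation 40 can be sketched as a simple distance comparison between the aircraft's current coordinates and the two candidate positions; the function name is illustrative.

```python
import numpy as np

def closer_position(current_coords, candidate_a, candidate_b):
    # Return whichever candidate position is nearer to the current coordinates.
    da = np.linalg.norm(np.asarray(candidate_a, dtype=float) - np.asarray(current_coords, dtype=float))
    db = np.linalg.norm(np.asarray(candidate_b, dtype=float) - np.asarray(current_coords, dtype=float))
    return candidate_a if da <= db else candidate_b
```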
In the illustrated example, the camera 3 of the aircraft 1 is focused on the mobile target 2 in a direction that is perpendicular to the major axis 6 (the major axis, ω1, in the above equations) of the area of uncertainty 5, and the camera 9 of the aircraft 8 is focused on the mobile target 2 in a direction that is perpendicular to the minor axis 7 (the minor axis, ω2, in the above equations) of the area of uncertainty 5. For example, the aircraft 1 is in the first position that is orthogonal to the major axis 6 along the minor axis 7 and the aircraft 8 is in a second position (e.g., a second principal position) along the major axis 6 orthogonal to the minor axis 7. The aircraft 1, 8 track the mobile target 2 from substantially perpendicular positions and/or directions to further reduce the area of uncertainty 5 related to the determined position of the mobile target 2.
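The two-aircraft geometry described above can be sketched as follows, with one viewpoint displaced from the target along the minor axis and the other along the major axis; the standoff distance is an illustrative parameter, not a value from this disclosure.

```python
import numpy as np

def perpendicular_viewpoints(target_xy, major_axis_xy, minor_axis_xy, standoff):
    # First aircraft: along the minor axis (looking perpendicular to the major axis).
    first = np.asarray(target_xy, dtype=float) + standoff * np.asarray(minor_axis_xy, dtype=float)
    # Second aircraft: along the major axis (looking perpendicular to the minor axis).
    second = np.asarray(target_xy, dtype=float) + standoff * np.asarray(major_axis_xy, dtype=float)
    return first, second
```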
Further, the blocks of the example method 60 may be implemented by executing corresponding instructions (e.g., first instructions, second instructions, third instructions, etc.) via a processor. For example, the processor is hardware of the example aircraft 1 and/or the example aircraft 8. The processor can be implemented by one or more integrated circuits, logic circuits, microprocessors or controllers from any desired family or manufacturer. In some examples, the processor includes a local memory (e.g., a cache). In some examples, the memory includes volatile memory and/or non-volatile memory. Example volatile memory may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. Example non-volatile memory may be implemented by flash memory and/or any other desired type of memory device. Access to the memory is controlled, for example, by a memory controller. In some examples, the memory is a tangible computer readable storage medium such as a flash memory, a read-only memory (ROM), a cache, a random-access memory (RAM) and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media. As used herein, “tangible computer readable storage medium” and “tangible machine readable storage medium” are used interchangeably. In some examples, the memory is a non-transitory computer and/or machine readable medium such as a flash memory, a read-only memory, a cache, a random-access memory and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media.
The method 60 for positioning aircraft(s) to reduce an area of uncertainty of coordinates of a mobile target is discussed in connection with the example aircraft 1 and/or the example aircraft 8 described above.
The example method 60 disclosed herein starts at block 61 by a camera (e.g., the camera 3) mounted on a first aircraft (e.g., the aircraft 1) obtaining an image in which a mobile target (e.g., the mobile target 2) is identified.
At block 63, current coordinates of a first aircraft (e.g., the aircraft 1) are obtained. Further, it is determined whether a second aircraft (e.g., the aircraft 8) is also monitoring the mobile target and, if so, current coordinates of the second aircraft and characteristics of the corresponding camera (e.g., the camera 9) are obtained.
Upon determining that there is not another aircraft or upon obtaining the coordinates of the second aircraft and the characteristics of the corresponding camera, coordinates of the mobile target are determined based on the image(s) obtained from the camera, the coordinates of the aircraft(s), and the characteristics of the camera(s) (block 66). Further, an area of uncertainty (e.g., the area of uncertainty 5) associated with the coordinates of the mobile target is determined.
At block 68, the first aircraft is positioned to be orthogonal to a major axis (e.g., the major axis 6) of the area of uncertainty to reduce (e.g., minimize) the area of uncertainty of the coordinates of the mobile target.
Although certain example apparatus and methods have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent either literally or under the doctrine of equivalents.