The present application relates to aircraft-mounted cameras, and is particularly directed to an apparatus and method for compensating for relative motion of at least two aircraft-mounted cameras.
An aircraft may include two cameras that are used as part of an object detection and collision avoidance system, for example. In this example application, one camera can be mounted on a portion of one aircraft wing, and the other camera can be mounted on a portion of the other aircraft wing. Since the aircraft wings flex and the cameras are relatively far apart from each other, the distance and orientation between the cameras can vary greatly during flight due to wing vibrations, for example. As a result of these variations in distance and orientation between the cameras, the system is unable to accurately determine, by stereoscopic processing, the position of an object, such as a bird, approaching the aircraft, and therefore cannot reliably avoid a collision with the object. It would be desirable to provide an apparatus and method in which the varying distances and orientations between the two aircraft-mounted cameras are compensated for so that the system is able to accurately determine the position of an object approaching the aircraft.
In one aspect, a method is provided of compensating for variations in distance and orientation between first and second wing-mounted cameras of an aircraft due to flexing of at least one aircraft wing. The method comprises determining a first distance and orientation between the first wing-mounted camera and the second wing-mounted camera during a neutral wing condition of the aircraft, determining a second distance and orientation between the first wing-mounted camera and the second wing-mounted camera during a flexed wing condition of the aircraft, and processing the difference between the first and second distances and orientations to provide a real-time varying distance and orientation for use in providing a compensated distance between the first and second wing-mounted cameras.
In another aspect, a method is provided of processing image data captured by a left wing-mounted camera of an aircraft and a right wing-mounted camera of the aircraft to compensate for variations in distance and orientation between the cameras due to flexing of left and right aircraft wings. The method comprises correlating captured images from the left wing-mounted camera against a left nose template associated with a left aircraft wing, transforming image data from at least one image frame captured by the left wing-mounted camera to eliminate relative motion associated with motion of the left aircraft wing, correlating captured images from the right wing-mounted camera against a right nose template associated with a right aircraft wing, and transforming image data from at least one image frame captured by the right wing-mounted camera to eliminate relative motion associated with motion of the right aircraft wing.
In yet another aspect, an apparatus is provided for an aircraft-mounted object detection and collision avoidance system. The apparatus comprises a first camera attached to one portion of the aircraft and a second camera attached to another portion of the aircraft. The first and second cameras cooperate to capture images of an object in a flight path. The apparatus further comprises a motion compensation module configured to calculate a real-time distance and orientation between the first camera and the second camera. The apparatus also comprises a detection module configured to calculate a distance between the aircraft and the object based upon the calculated real-time distance between the first camera and the second camera.
Other aspects will become apparent from the following detailed description, the accompanying drawings and the appended claims.
The present application is directed to an apparatus and method of compensating for relative motion of at least two aircraft-mounted cameras. The specific apparatus, motion compensation methods, and the industry in which the apparatus and motion compensation methods are implemented may vary. It is to be understood that the disclosure below provides a number of embodiments or examples for implementing different features of various embodiments. Specific examples of components and arrangements are described to simplify the present disclosure. These are merely examples and are not intended to be limiting.
By way of example, the disclosure below describes an apparatus and motion compensation methods for aircraft in compliance with Federal Aviation Administration (FAA) regulations. Specifications of FAA regulations are known and, therefore, will not be described.
Referring to
In the example implementation illustrated in
The object 16 may be any object that may potentially strike the vehicle 12. As an example, the object 16 may be any moving airborne object moving along the path 18 that may intersect the path 14 of the vehicle 12. For example, as illustrated in
Throughout the present disclosure, the terms “strike”, “struck”, “collision”, “collide” and any similar or related terms may refer to the impact of the vehicle 12 and the object 16. For example, the phrase “an object striking or potentially striking a vehicle” may refer to a moving vehicle 12 impacting with a moving object 16 (e.g., an airborne object).
Referring to
The two cameras 21, 22 may operate over any range or ranges of wavelengths and/or frequencies to obtain images 24 (e.g., video images 26). For example and without limitation, the two cameras 21, 22 may be configured to obtain images 24 at infrared, near infrared, visible, ultraviolet, other wavelengths, or combinations of wavelengths. The two cameras 21, 22 may be configured to obtain images 24 from light that is polarized.
For example, the two cameras 21, 22 may include one or more long-wavelength infrared (“LWIR”) cameras. As another example, the two cameras 21, 22 may include one or more mid-wavelength infrared (“MWIR”) cameras. As another example, the two cameras 21, 22 may include one or more short-wavelength infrared (“SWIR”) cameras. As still another example, the two cameras 21, 22 may include a combination of one or more long-wavelength infrared cameras, mid-wavelength infrared cameras, and short-wavelength infrared cameras.
In an example implementation, the images 24 may be video images 26. The video images 26 may include a sequential series of digital video image frames taken in rapid succession (e.g., at a rate of 30 Hz). The images 24 provided by the two cameras 21, 22 may be used to detect the presence of one or more objects 16 and to identify one or more characteristics of the object 16.
Referring back to
In an example implementation, the two cameras 21, 22 may include a combined field of view. In another example implementation, the two cameras 21, 22 may include an overlapping field of view 27. For example, the two cameras 21, 22 may be arranged with an overlapping field of view 27 so that the system 10 can determine the distance of the object 16 relative to the vehicle 12 using a stereo solution (e.g., stereo vision).
The two cameras 21, 22 may be mounted to the vehicle 12 at any suitable or appropriate location. For simplicity and purposes of description herein, one camera 21 is mounted to the end of one wing 31 of the aircraft 30 and the other camera 22 is mounted to the end of the other wing 32 of the aircraft 30, as schematically shown in
The two cameras 21, 22 of the image capture module 20 may be connected to the vehicle 12 at various positions and orientations. The two cameras 21, 22 may face in any appropriate direction. For example, the two cameras 21, 22 may generally face forward on the vehicle 12 (e.g., in the direction of movement 14) in order to view the object 16 in the path of the vehicle 12 or crossing the path of the vehicle 12 (e.g., within the field of view 40).
Referring again to
However, before the detection module 50 processes the images 24, the images 24 are processed by an apparatus including a motion compensation module 100 constructed in accordance with an embodiment. The motion compensation module 100 includes a processing unit 102 that executes instructions stored in an internal data storage unit 104, an external data storage unit (not shown), or a combination thereof. The processing unit 102 may comprise any type of technology. For example, the processing unit 102 may comprise a dedicated-purpose electronic processor. Other types of processors and processing unit technologies are possible. The internal data storage unit 104 may comprise any type of technology. For example, the internal data storage unit 104 may comprise random access memory (RAM), read only memory (ROM), solid state memory, or any combination thereof. Other types of memories and data storage unit technologies are possible.
The motion compensation module 100 further includes a number of input/output (I/O) devices 106 that may comprise any type of technology. For example, the I/O devices 106 may comprise a keypad, a keyboard, a touch-sensitive display screen, a liquid crystal display (LCD) screen, a microphone, a speaker, or any combination thereof. Other types of I/O devices and technologies are possible.
The motion compensation module 100 processes the images 24 to compensate for variations in distance and orientation (e.g., rotation) between the two cameras 21, 22 mounted on the ends of the wings 31, 32 of the aircraft 30 due to flexing motion of at least one of the wings 31, 32. More specifically, the processing unit 102 executes instructions of a motion compensation program 105 stored in the data storage unit 104 to compensate for the variations in the distance and orientation between the two cameras 21, 22 due to the flexing motion of one or both of the wings 31, 32. Operation of the motion compensation module 100 is described hereinbelow.
Referring to
Referring to
It should be apparent that the image 300 from the camera 21 on the left aircraft wing 31 and the image 400 from the camera 22 on the right aircraft wing 32 are similar. The two images 300, 400 are processed by the motion compensation module 100 in the same way. For simplicity, image processing of the image 400 from the camera 22 on the right aircraft wing 32 will be described in detail. It is understood that the same image processing details apply to the camera 21 on the left aircraft wing 31.
Referring to
Referring to
Referring to
It should be apparent that
Referring to
Referring to
Based upon the associated bird objects from block 750 and the stereoscopic disparities of the bird objects as measured in block 760, bird ranges and bird range rates are computed as shown in block 770. The process then proceeds to block 780 in which bird collision metrics are computed. If a potential bird collision is determined based upon the bird collision metrics computed in block 780, then an alarm is provided to an operator as shown in block 790.
The following additional description and explanations are provided with reference to the flow diagram 700 of
As an example calculation for the above formula, C(T) can be calculated every video frame from the two cameras 21, 22 with T set to 10 seconds. The resulting integer C(10) could be used to drive an alarm that goes off when it increases from 0 to any non-zero value, and that increases in urgency as the number rises. Thus, C(T) captures the fact that a large number of imminent bird collisions is much more likely to lead to engine failure or damage than a single colliding bird.
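By way of illustration only, the following Python sketch shows one possible way to compute a collision-count metric of the general form C(T). Because the exact formula for C(T) is not reproduced above, the definition used here (the number of tracked birds whose predicted time to collision falls within the next T seconds), together with the function names and thresholds, is an assumption for illustration and not the disclosed implementation.

# Hypothetical sketch of a C(T) collision-count metric: the number of tracked
# birds predicted to close to within strike range in the next T seconds.
def collision_count(bird_ranges, bird_range_rates, horizon_s=10.0, strike_range=5.0):
    """Count birds predicted to come within strike_range meters within horizon_s seconds.

    bird_ranges      -- current range to each tracked bird (meters)
    bird_range_rates -- range rate for each bird (meters/second, negative = closing)
    """
    count = 0
    for rng, rate in zip(bird_ranges, bird_range_rates):
        if rate < 0.0:                                   # bird is closing on the aircraft
            time_to_strike = (rng - strike_range) / -rate
            if 0.0 <= time_to_strike <= horizon_s:
                count += 1
    return count

# Example: C(10) increasing from 0 to a non-zero value could drive the alarm.
c_of_10 = collision_count([300.0, 1200.0], [-80.0, -20.0], horizon_s=10.0)
print(c_of_10)   # -> 1 (only the first bird is predicted to strike within 10 s)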
Referring to
The following additional description and explanations are provided with reference to the flow diagram 800 of
The front part of the aircraft 30 is visible from each of the at least two cameras 21, 22 (each camera sees one side of the nose). When the aircraft wings 31, 32 flex, the apparent location of the nose of the aircraft 30 changes. The change in the apparent location of the nose of the aircraft 30 as seen from the camera 22 can be tracked easily by constructing the right nose template and correlating captured images 24 from the camera 22 against the right nose template. The right nose template may comprise the captured image 510 shown in
When captured nose images 24 from the camera 22 are correlated, features of the aircraft 30, such as the features 43, 44, 45, 46, 47 shown in
The movement of the correlation peak determines the movement (displacement) in two-dimensional pixel space. By adjusting the bird positions in pixels with the reverse of this displacement, their positions in pixel space in the camera 22 have been adjusted for the relative motion of the camera 22 due to the flexing movement of the right aircraft wing 32. This adjustment of the bird positions in pixel space is shown as the compensated image 540 in
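By way of illustration only, the following Python sketch shows one possible way to implement the nose-template correlation and displacement compensation described above, assuming the OpenCV library for the correlation step. The function names and the choice of normalized cross-correlation are illustrative assumptions, not the disclosed implementation.

import cv2

# Illustrative sketch: track the aircraft nose in each frame by template
# correlation, then undo the apparent camera motion by shifting detected bird
# positions with the reverse of the measured displacement.
def nose_displacement(frame, nose_template, neutral_peak_xy):
    """Return the (dx, dy) pixel displacement of the nose relative to the neutral wing condition."""
    response = cv2.matchTemplate(frame, nose_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, peak_xy = cv2.minMaxLoc(response)           # location of the correlation peak
    return (peak_xy[0] - neutral_peak_xy[0],
            peak_xy[1] - neutral_peak_xy[1])

def compensate_bird_positions(bird_pixels, displacement):
    """Shift bird pixel positions by the reverse of the measured nose displacement."""
    dx, dy = displacement
    return [(x - dx, y - dy) for (x, y) in bird_pixels]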
The above-described correlation assumes that lens distortion of the camera 22 has been compensated for during a pre-calibration step. This calibration step allows the creation of a fixed function p( ) that maps pixel locations (x, y) to solid angle vectors (θ, φ), where θ is the angle in x-y space (the reference ground plane of the airplane) and φ is the elevation angle off of the reference ground plane of the airplane. This is denoted by the following function:
(θ,φ)=p(x,y)
The above function is defined during final installation of the object detection and collision avoidance system 10 and updated at periodic calibration intervals.
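By way of illustration only, the following Python sketch shows one possible form of the function p( ), assuming a distortion-free, equiangular camera model with a 1001×1001 pixel sensor and a 120-degree field of view (the values used in the example calculation below). An actual p( ) would be built from the lens-calibration data gathered at installation and at the periodic calibration intervals.

import math

# Minimal sketch of a possible p(x, y) mapping under an assumed equiangular,
# distortion-free camera model. A real p() would be tabulated or fitted from
# calibration data.
WIDTH = HEIGHT = 1001          # pixels
FOV_DEG = 120.0                # assumed horizontal and vertical field of view

def p(x, y):
    """Map a pixel location (x, y) to angles (theta, phi) in degrees.

    theta -- azimuth in the aircraft reference ground plane (x-y space)
    phi   -- elevation angle off of the reference ground plane
    """
    cx, cy = (WIDTH - 1) / 2.0, (HEIGHT - 1) / 2.0       # image center (pixel 500, 500)
    deg_per_pixel = FOV_DEG / (WIDTH - 1)
    theta = (x - cx) * deg_per_pixel
    phi = (cy - y) * deg_per_pixel                       # image rows increase downward
    return theta, phi

print(p(500, 500))   # -> (0.0, 0.0): the optical axis
print(p(1000, 0))    # -> (60.0, 60.0): upper-right corner of the field of view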
Referring to
As an example calculation of the above function (θ,φ)=p(x,y), let l=(lx, ly, 0) be the left camera 21 location (z is assumed to be zero) and r=(rx, ry, 0) be the right camera 22 location on the tips of the wings 31, 32. As shown in
l=(lx,ly,0)=(−40,120,0)
r=(rx,ry,0)=(−40,−120,0)
Then, given a bird location in pixel space in each camera, (xl, yl) and (xr, yr), the bird's location in physical space lies along the lines formed by the angles (θl, φl)=p(xl, yl) and (θr, φr)=p(xr, yr) and passing through the respective camera locations. These two lines are then defined by the following one-dimensional parametric forms:
(lx,ly,0)+(ax,ay,az)*s
(rx,ry,0)+(bx,by,bz)*t
The point of nearest intersection c can be calculated as follows:
Let m2=(b×a)·(b×a)
R=(r−l)×((b×a)/m2)
The point of nearest intersection c is equal to the midpoint of the closest points q1 and q2 on the two lines:
q1=l+a*(R·b)
q2=r+b*(R·a)
c=(q1+q2)/2
An example scenario showing example calculations of the above-identified equations is described hereinbelow with reference to coordinates shown in
First, it is assumed that the aircraft 30 is centered at location (0, 0, 0), the bird 34 is at location (800, −200, 100), the camera 21 on the left aircraft wing 31 is at location (−40, 120, 0), and the camera 22 on the right aircraft wing 32 is at location (−40, −120, 0). For example, with 1001×1001 pixel cameras having no lens distortion, the following angles for the left camera 21 and the right camera 22 can be calculated as follows:
Second, it is assumed that each of the cameras 21, 22 has a 120 degrees field of view (FOV) in both horizontal and vertical directions. The pixel locations for the bird 34 in the left and right cameras 21, 22 can be expressed as follows:
Based upon the coordinates of the bird 34 and the cameras 21, 22 shown in
[θl,θr,φl,φr]=[−20.8545,−5.4403,6.3480,6.7587] (in degrees)
The normalized direction vectors a and b for the lines from the cameras 21, 22 to the bird 34 would be as follows:
Also, the calculations that compute the nearest point c between the two lines between the cameras 21, 22 and the bird 34 would be as follows:
Accordingly, the final resulting point (in this case the actual bird location) would be the midpoint c calculated as follows:
It should be noted that q1 and q2 are both the same and equal to the correct answer because no motion error (due to flexing of the aircraft wings 31, 32) was introduced into the calculation, as it would be in a real system.
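By way of illustration only, the following Python sketch reproduces the example scenario above using NumPy. The closest-point computation follows the m2 and R definitions given earlier, with q1 and q2 the closest points on the two camera rays and c their midpoint; because the two rays intersect at the bird in this error-free scenario, q1, q2, and c all equal the true bird location.

import numpy as np

# Example scenario: left camera at (-40, 120, 0), right camera at (-40, -120, 0),
# bird at (800, -200, 100).
l = np.array([-40.0, 120.0, 0.0])      # left camera 21 location
r = np.array([-40.0, -120.0, 0.0])     # right camera 22 location
bird = np.array([800.0, -200.0, 100.0])

# Normalized direction vectors a and b from each camera toward the bird
a = (bird - l) / np.linalg.norm(bird - l)
b = (bird - r) / np.linalg.norm(bird - r)

# Azimuth/elevation angles seen by each camera (degrees); these match the
# values [-20.8545, -5.4403, 6.3480, 6.7587] quoted above.
theta_l = np.degrees(np.arctan2(a[1], a[0]))
theta_r = np.degrees(np.arctan2(b[1], b[0]))
phi_l = np.degrees(np.arcsin(a[2]))
phi_r = np.degrees(np.arcsin(b[2]))
print(theta_l, theta_r, phi_l, phi_r)

# Nearest point between the two rays l + a*s and r + b*t
cross = np.cross(b, a)
m2 = np.dot(cross, cross)              # m2 = (b x a) . (b x a)
R = np.cross(r - l, cross / m2)        # R  = (r - l) x ((b x a) / m2)
q1 = l + a * np.dot(R, b)              # closest point on the left-camera ray
q2 = r + b * np.dot(R, a)              # closest point on the right-camera ray
c = (q1 + q2) / 2.0                    # midpoint: estimated bird location
print(q1, q2, c)                       # all equal (800, -200, 100) here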
After the above-described wing motion error compensation is performed based upon blocks 810 and 820 for the left wing camera 21 and blocks 812 and 814 for the right wing camera 22, the process of
More specifically, the bird range (i.e., BR) from the aircraft 30 can be calculated as the norm of c, or |c|, which gives the range of the bird to the center point (i.e., (0, 0, 0)) between the wings 31, 32 of the aircraft 30. This can be calculated for each of the synchronized video frames of the cameras 21, 22. Thus, for example, a 30 Hz frame rate (i.e., FR) means a new range for each identified bird every 33.3 ms. In general, the sequence of ranges {|c1|, |c2|, . . . } allows the range rate (i.e., BRR) at every frame to be calculated using the following equation:
BRRj=(|cj|−|cj−1|)*FR
Predicted bird strike events at any future time can be calculated using the above equation for BRRj.
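By way of illustration only, the following Python sketch shows the per-frame range and range-rate bookkeeping described above, assuming a 30 Hz frame rate. The predicted strike time is obtained by extrapolating the current range to zero at the current range rate; the function names and the example range values are illustrative assumptions.

# Minimal sketch: ranges are |c| for each synchronized frame, the range rate
# converts the per-frame change in range into meters per second, and a
# predicted strike time follows from extrapolating the range to zero.
FR = 30.0                         # frame rate in Hz (one range every 1/FR = 33.3 ms)

def range_rate(ranges_m):
    """Return per-frame range rates BRR_j = (BR_j - BR_{j-1}) * FR in m/s."""
    return [(ranges_m[j] - ranges_m[j - 1]) * FR for j in range(1, len(ranges_m))]

def predicted_strike_time_s(current_range_m, current_rate_mps):
    """Seconds until the range is predicted to reach zero (None if the bird is not closing)."""
    if current_rate_mps >= 0.0:
        return None
    return current_range_m / -current_rate_mps

ranges = [904.4, 901.7, 899.0]            # |c| over three consecutive frames (meters)
rates = range_rate(ranges)                # -> [-81.0, -81.0] m/s (closing)
print(predicted_strike_time_s(ranges[-1], rates[-1]))   # ~11.1 s to predicted strike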
The process of
Coded instructions to implement the motion compensation method may be stored in a mass storage device, in a volatile memory, in a non-volatile memory, and/or on a removable tangible computer readable storage medium such as a CD or DVD.
The motion compensation method may be implemented using machine readable instructions that comprise a program for execution by a processor such as the processing unit 102 shown in the example motion compensation module 100 discussed above in connection with
As mentioned above, the example motion compensation method of
Additionally or alternatively, the example motion compensation method of
While an example manner of implementing the example aircraft-mounted object detection and collision avoidance system 10 is illustrated in
When reading any of the apparatus or system claims of this patent to cover a purely software and/or firmware implementation, at least one of the example motion compensation module 100 and/or, more generally, the example aircraft-mounted object detection and collision avoidance system 10 is/are hereby expressly defined to include a tangible computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. storing the software and/or firmware.
The mounting of the two cameras 21, 22 at the ends of the wings 31, 32 provides an unobstructed view and a long baseline (i.e., the distance between the two cameras 21, 22) for accurate distance measurement. However, as the wings 31, 32 flex, the cameras 21, 22 move, and the accurate baseline needed for distance measurement is lost. By providing the motion compensation module 100, the relative motion of the cameras 21, 22 is accounted for so that the baseline can be maintained during flight including takeoff, turning, and landing.
Also, the mounting of the two cameras 21, 22 at the ends of the wings 31, 32 allows stereo measurements in real time. These real-time stereo measurements allow the two cameras 21, 22 to focus on an object, obtain a three-dimensional view, and obtain accurate measurements of the object. The motion compensation module 100 provides a real-time way to calculate the distance between the two cameras 21, 22, whose separation is changing due to wing vibration, for example. The calculated distance between the two cameras 21, 22 is then used to calculate the distance between the aircraft 30 and an approaching object to be avoided.
Although the above description describes the object to be avoided by the aircraft 30 as being in the air, it is conceivable that the object to be avoided by the aircraft be an object that is not in the air, such as an object on a runway, for example.
Also, although the above description describes the image capture module 20 as having only two cameras, it is conceivable that more than two cameras be used. However, the use of more than two cameras would provide shorter baselines that lead to less accurate distance measurements. For example, a third camera (not shown) can be mounted on a portion of the aircraft 30. The processing unit 102 (
Further, although the above description describes an example apparatus and an example motion compensation method for aircraft in the aviation industry in accordance with FAA regulations, it is contemplated that apparatus and motion compensation methods may be implemented for any industry in accordance with the applicable industry standards.
Although various embodiments of the disclosed apparatus and motion compensation methods have been shown and described, modifications may occur to those skilled in the art upon reading the specification. The present application includes such modifications and is limited only by the scope of the claims.