Information
- Patent Grant
- Patent Number: 6,192,322
- Date Filed: Friday, April 19, 1996
- Date Issued: Tuesday, February 20, 2001
- Agents:
- Raufer; Colin M.
- Alkov; Leonard A.
- Lenzen, Jr.; Glenn H.
Abstract
A spinning strip aperture imaging radiometer sensor system and data processing methods for detecting moving objects derived from a plurality of image frames acquired by a strip aperture imaging sensor. A moving object in any individual image frame results in a motion smear signature in the total synthesized image. The motion smear signature is processed to detect the moving objects. One embodiment of the system comprises a rotating strip aperture telescope, a two dimensional detector array that detects images in the telescope's focal plane, a rotation compensation device that prevents rotational smear during the integration time of the detectors of the array, and a signal processor that records a plurality of image frames of a scene imaged by the telescope as it rotates around its optical axis, and that implements methods for detecting the moving objects present in the recorded images. A hierarchy of moving object detection processors and methods is disclosed that includes spatial, temporal, spatial frequency, and temporal frequency domain detection processors, and is compatible with multi-spectral background rejection techniques. Selection of the appropriate processing procedure and method depends upon the scenario and the effective signal to noise ratio characteristics of the moving object.
Description
BACKGROUND
The present invention relates generally to spinning aperture radiometers and methods, and more particularly to spinning strip (partial) aperture imaging radiometers that synthesize equivalent full circular aperture images and detect moving objects from a plurality of rotating strip aperture image measurements, or synthesized image measurements.
To provide high resolution images from space-based platforms, for example, conventional sensor architectures use active control of large, heavy, deployable optical systems. Depending upon the mission requirements and size of the primary mirror, the active control can range from periodic piston and tilt control of primary mirror segments to piston, tilt, figure, and alignment control of all optical elements of the sensor. Full aperture systems with the same resolution as the present invention have a great deal of light gathering capability because of the relatively large aperture areas. However, to place multi-meter diameter apertures into orbit, full aperture systems competing with the present invention require: segmented optical surfaces and folded support structures, if the optical system diameters are larger than the launch vehicle's fairing; complex and potentially high bandwidth adaptive optical techniques, if thin deformable mirrors are used to save weight; and complex piston and pupil matching control, if implemented as a phased array. Therefore, full aperture systems are relatively heavy and have high technical risk when compared to the present invention.
The processors and methods described herein relate to and provide for an improvement to the system capabilities of the closest prior art, referred to herein as SpinAp. The SpinAp system and processing method is described in U.S. Pat. No. 5,243,351, entitled “Full Aperture Image Synthesis Using Rotating Strip Aperture Image Measurements”, which is assigned to the assignee of the present invention.
SpinAp is a sensor system and data processing method that is capable of synthesizing images having resolution equivalent to a full circular aperture. The equivalent SpinAp synthesized full circular aperture has a diameter equal to the largest correlation length associated with the strip aperture's geometry. To accurately synthesize an equivalent full aperture image, SpinAp acquires imaging data using the rotating sensor's focal plane detectors at time intervals appropriate for completely measuring the optical system passband of the equivalent full circular aperture. The images are processed by methods described in U.S. Pat. No. 5,243,351.
The commonality of the approaches between the SpinAp system and the system of the present invention (referred to as SpinAp moving object detection) arises from the use of temporally registered strip aperture measurements to synthesize images having resolution equivalent to a full circular aperture. The equivalent SpinAp synthesized full circular aperture has a diameter equal to the largest correlation length associated with the strip aperture's geometry.
To accurately synthesize an equivalent full aperture image, SpinAp acquires imaging data using the rotating sensor's focal plane detectors at time intervals appropriate for completely measuring the optical system passband of the equivalent full circular aperture. The images are processed by methods described in U.S. Pat. No. 5,243,351.
However, a moving object in the instantaneous field of view of any of the individual frames will result in a motion smear signature in the total synthesized image. Detection of the moving object is required to establish object phenomenology, such as, position, size, trajectory, velocity, acceleration, and point of origin. In addition, the detection process is a precursor to applying processors and methods to the acquired data that permit stationary image synthesis of the moving objects. Therefore, a hierarchy of moving object detection processors and methods is described herein. The hierarchy includes spatial, temporal, spatial frequency, and temporal frequency domain detection processors, which may also incorporate multi-spectral background rejection techniques. The present system makes use of the available measurements to detect moving objects in the field of view of the synthesized images, in the field of view of any pair of SpinAp individual frames, in any one of the SpinAp individual frames, or any combination. The same methods may also be applied to transient event detection.
Therefore, it would be an advantage to have modified SpinAp image processing methods that would result in lower payload weight for a given effective synthesized aperture size, while providing the capability to detect moving objects in the sensor's field of view. Furthermore, it would also be advantageous to have a system with image processing and moving object detection methods that provides lower risk, lower cost, lighter weight, and simpler fabrication and deployment alternatives to deploying complex, large full circular apertures (or phased array telescopes) requiring complex adaptive optical systems for space based imaging applications.
SUMMARY OF THE INVENTION
To meet the above and other objectives, one embodiment of the present invention provides for a spinning strip radiometer imaging and detection system that is capable of detecting moving objects. The system includes a rotating strip aperture telescope that produces temporally contiguous or sequential images. The rotating strip aperture telescope typically comprises a rotating strip aperture primary reflector and a secondary reflector. A two dimensional detector array is provided to detect images located in the focal plane of the telescope. A rotation compensation device is employed to prevent rotational smear during the integration time of the detectors of the array. A signal processor is provided for recording a plurality of image frames of a target scene imaged by the telescope as the strip aperture rotates around the optical axis of the telescope, for synthesizing the equivalent full circular aperture image, and for detecting moving objects in the individual strip aperture frames, any pair of frames, the synthesized image, or any combination of individual frames and synthesized images.
The present invention thus provides for a spinning strip (partial) aperture imaging system that synthesizes equivalent full circular aperture images, and detects moving objects in the sensor's field of view from a plurality of rotating strip aperture image measurements, while compensating for random and/or systematic line of sight errors between individual strip aperture images. The present invention thus provides improved total system performance when compared to the SpinAp system of U.S. Pat. No. 5,243,351 by providing means to synthesize an equivalent full circular image, and simultaneously providing the capability to detect moving objects.
Example embodiments of SpinAp moving object detection methods are described below. Since each embodiment of the radiometer system, image synthesis, and moving object detection processing method depends upon specific mission requirements and engineering tradeoffs, the radiometer system, image synthesis, and moving object detection method incorporates means to compensate for random, and/or systematic line of sight drift between frames, and a priori and a posteriori known error sources, such as, non-isoplanatic optical system point spread functions, field point independent image smear due to image motion and finite electronic bandwidth of the detectors, and field point dependent image smear caused by uncompensated rotational motion of the image.
Embodiments of the image synthesis and detection process performed by the present invention summarized above are as follows. As the spinning strip aperture rotates around the telescope's optical axis the following occurs. The rotation compensation device counter-rotates during the detector integration time, thereby providing a temporally stationary image. An image frame is recorded and saved. If a rotating (relative to the scene) detector array architecture has been selected, the acquired frame is coordinate transformed and interpolated to a reference grid of the synthesized image. The data is Fourier transformed and stored. A new frame is recorded and saved.
An estimate of the frame-to-frame misregistration of the recorded data due to random line of sight errors is obtained. The strip aperture images, or the Fourier transforms, are corrected for their line of sight errors and are stored. The preceding steps are sequentially repeated for each strip aperture image frame, or the frames are sequentially acquired and stored, and then global estimates of the line of sight errors are determined and used to register the frames. Once any pair, or combination of pairs, of the individual strip aperture images have been corrected for the line of sight errors, the SpinAp moving object detection process may begin. In addition, once a sufficient number (as described in U.S. Pat. No. 5,243,351) of strip aperture image measurements have been acquired the image synthesis process may begin. The SpinAp moving object detection method may be performed on two, or more, synthesized images.
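The acquisition and registration loop just described can be pictured with a short sketch. The following Python/numpy fragment is purely illustrative and is not the patent's processing chain: the function name acquire_and_register, the use of phase correlation to estimate the line of sight misregistration, and the whole-pixel shift model are assumptions introduced here for illustration only.

import numpy as np

def acquire_and_register(frames):
    """Fourier transform each strip aperture frame and remove its estimated
    line of sight misregistration relative to the first frame (integer-pixel model)."""
    registered = []
    reference = None
    for frame in frames:
        spectrum = np.fft.fft2(frame)              # spatial spectrum of this frame
        if reference is None:
            reference = spectrum                   # first frame defines the reference grid
            registered.append(spectrum)
            continue
        # Estimate the whole-pixel translational misregistration by phase correlation.
        cross_power = np.conj(reference) * spectrum
        cross_power /= np.abs(cross_power) + 1e-12
        peak = np.unravel_index(np.argmax(np.abs(np.fft.ifft2(cross_power))), frame.shape)
        shift = [p if p <= n // 2 else p - n for p, n in zip(peak, frame.shape)]
        # Remove the shift with a linear phase ramp (Fourier shift theorem).
        fy = np.fft.fftfreq(frame.shape[0])[:, None]
        fx = np.fft.fftfreq(frame.shape[1])[None, :]
        ramp = np.exp(2j * np.pi * (fy * shift[0] + fx * shift[1]))
        registered.append(spectrum * ramp)
    return registered                              # registered frame spectra, ready for synthesis or detection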
An embodiment of the SpinAp moving object detection method using two or more synthesized images acquired by observing the same scene spatially registers the synthesized images (if required due to line of sight errors between measurement sets), differences all possible distinct pairs of synthesized images, sums the differenced images, and passes the resulting differenced and summed data through a matched filter array and then a comparator. A moving object detection is declared when any of the outputs from the matched filter array exceed the predetermined thresholds of the comparator.
Another embodiment of the SpinAp moving object detection method using two or more individual line of sight corrected frames resulting from the observation of the same scene applies a system transfer function (STF) equalization filter to the individual frames to be differenced, differences any or all distinct pairs of equalized frames, and applies a non-common passband rejection filter to the resulting differenced data. The resulting filtered and differenced data are combined by addition to form the total summed spectrum of filtered differences. The summed spectra are inverse transformed, thereby providing a moving object signature in the spatial domain, which is transferred to a noise optimal matched filter array and a comparator. A moving object detection is declared when any of the outputs from the matched filter array exceed the predetermined thresholds of the comparator.
The noise optimal matched filter detection processor incorporates properties corresponding to the physical phenomenology of potential objects, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, and the like. The matched filter detection processor also incorporates a priori knowledge of the measurement noise statistics propagated through the equalization, differencing, and filtering process, as well as the unique signal measurement characteristics of the SpinAp sensor system.
The equalization, differencing, and filtering operations associated with the present invention eliminate all temporally stationary components in a common overlap passband of the individual frame images, as well as the synthesized images, leaving only a noisy signature of smeared moving objects. The performance of the moving object detection techniques is scenario dependent and is degraded by individual frame measurement noise propagated through the equalization, differencing, and filtering operations, and background leakage due to residual frame to frame, and/or synthesized image to synthesized image registration error.
Alternative embodiments of spatial (and spatial frequency) methods that are described herein correspond to applying optimal matched filters to any of the differenced frames, and combining the outputs for all possible frame differences.
While the previous embodiments are typical of SpinAp temporal domain detection methods, temporal frequency domain processes may also be employed. An embodiment of a temporal frequency domain processor utilizes a temporal sequence of the spatially registered individual SpinAp strip aperture frames obtained by observations of the same scene. The spatially registered frames are spatially Fourier transformed to a common spatial frequency grid. The common spatial frequency grid is associated with the equivalent full circular aperture SpinAp synthesized spatial frequency passband. Any or all possible spatially registered and Fourier transformed individual frame images are temporally Fourier transformed to produce a data set associating all temporal frequencies with each spatial frequency. To obtain the best possible detection performance, the temporal and spatial frequency transformed data set is transferred to a noise optimal matched filter detection processor. The matched filter detection processor incorporates a priori knowledge of the object phenomenology, signal and noise measurement statistics, as well as the unique measurement characteristics of the SpinAp sensor system. A moving object detection is declared when any of the outputs from the matched filter array exceed predetermined thresholds.
By eliminating the zero temporal frequency, the matched filter implicitly eliminates all temporally stationary components in the transformed time history of the individual frames, thereby leaving only the temporal frequency and spatial frequency signature of the moving objects. The performance of the moving object detection technique is scenario dependent and is degraded by individual frame measurement noise propagation, and background leakage due to residual frame to frame registration error.
If observational parameters permit, acceptable performance may be achieved for less than optimal approaches, whereby the optimal matched filter array and comparator can be replaced by template matched filters and a comparator, streak detectors and a comparator, or a simple single pixel or multiple pixel threshold exceedence detection processor. In this context the word pixel refers to the spatial, or the spatial frequency element of the two dimensional discrete representation of the image, or its transform.
A hierarchy of moving object detection processors and methods is provided by the present invention. The hierarchy includes spatial, temporal, spatial frequency, and temporal frequency domain detection processors, as well as, multi-spectral background rejection techniques. Selection of the appropriate processing procedure and method depends upon the scenario, and the effective signal to noise ratio characteristics of the moving object. The same methods may be applied to transient event detection.
Knowledge of the strip aperture's spatial response function, the spatial response function of the detectors of the array, the noise statistics, and the temporal registration of each of the recorded strip aperture images permits moving object detection by the sensor system and processing methods.
BRIEF DESCRIPTION OF THE DRAWINGS
The various features and advantages of the present invention may be more readily understood with reference to the following detailed description taken in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
FIG. 1 illustrates a spinning aperture imaging radiometer system employing improved data processing methods in accordance with the principles of the present invention;
FIG. 2 illustrates a top level embodiment of a SpinAp moving object detection processing method in accordance with the principles of the present invention utilizing a plurality of spatially registered SpinAp synthesized images;
FIG. 3 illustrates a top level embodiment of a SpinAp moving object detection processing method in accordance with the principles of the present invention utilizing a plurality of spatially registered, system transfer function equalized individual SpinAp transformed frames;
FIG. 4 illustrates a top level embodiment of a SpinAp moving object detection processing method in accordance with the principles of the present invention utilizing a plurality of spatially registered, temporal and spatial frequency transforms of SpinAp individual frame images; and
FIG. 5 illustrates a top level embodiment of a SpinAp moving object detection processing method in accordance with the principles of the present invention utilizing a plurality of spatially registered, system transfer function equalized, individual SpinAp transformed frames, and sequential difference frame matched filtering.
DETAILED DESCRIPTION
Referring to the drawing figures, FIG. 1 illustrates a spinning aperture imaging radiometer system 10 and data processing methods 20 (corresponding to the system described in U.S. Pat. No. 5,243,351) that are appropriately modified using the principles of the present invention. The spinning aperture imaging radiometer system 10 and data processing method 20 provide moving object detection information, and provide the capability to generate equivalent full circular imagery, while removing line of sight jitter and frame registration errors. Therefore, the system 10 and data processing methods 20 described herein provide enhancements to the SpinAp system of U.S. Pat. No. 5,243,351. The contents of U.S. Pat. No. 5,243,351 are incorporated herein by reference in their entirety.
The spinning aperture imaging radiometer system 10 comprises a rotating strip aperture telescope 11 that comprises a primary 12a and a secondary reflector 12b. A tertiary reflector (not shown) may be employed in the telescope 11 under certain circumstances. The system 10 is shown in the form of a satellite comprising a stationary section 13 having earth pointing antenna 13a. The telescope 11 is disposed on a platform 14, to which the stationary section 13 is also coupled. The spinning aperture imaging radiometer system 10 is adapted to record a number of image frames of a target scene imaged by the telescope 11 as the primary mirror 12a (comprising a strip aperture 12) rotates around the telescope's optical axis.
A line of sight stabilization mirror 15 and an image derotation device 16 are disposed along the optical path of the telescope 11 and are adapted to stabilize and derotate the image prior to its sensing by a detector array 17. The derotation device 16 counter rotates the image during the integration time of detectors comprising the detector array 17, under control of a rotation compensation controller 19, thereby providing a stationary image. The line of sight stabilization mirror 15 is used by a line of sight control system (such as may be provided by a signal processor 18 or other dedicated control system) to remove high bandwidth line of sight errors, as well as line of sight errors due to orbital dynamics of the system 10.
The target scene is imaged onto the detector array 17 located at the focal plane of the telescope 11 that is coupled to the signal processor 18 that is adapted to process the image frames. Individual image frames are processed and combined in the signal processor 18 to synthesize equivalent full circular aperture images. In the present enhanced moving object detection SpinAp system 10, the signal processor 18 (or an independent processor) is modified according to the methods of the present invention to provide for moving object detection. Detailed descriptions of the modified processing methods are illustrated with reference to FIGS. 2-5.
FIGS. 2-5 illustrate a variety of different embodiments of the sensor system 10 and processing methods 20, 20-2, 20-3, 20-4 in accordance with the principles of the present invention employed by the spinning aperture radiometer system. A particular embodiment is selected based upon the observational scenario and timelines, the amount of available a priori knowledge, and the available computational throughput of the processing chain.
FIG. 2 illustrates a top level diagram of a simple SpinAp moving object detection processing method 20 in accordance with the principles of the present invention employed using an appropriately modified spinning aperture radiometer system and processing methods described in U.S. Pat. No. 5,243,351. The detection method 20 shown in FIG. 2 employs two or more SpinAp strip aperture image sets that observe the same scene to generate equivalent synthesized full circular aperture images 30 of the scene in accordance with U.S. Pat. No. 5,243,351. In the presence of line of sight registration errors, each of the synthesized images 30 is optimally registered spatially by methods described in U.S. Pat. No. 5,243,351.
Any, or all, possible distinct pairs of spatially registered synthesized images 30 are differenced 40 and subsequently added 35 and stored, thereby generating a total sum 45 of differenced synthesized images. If the number of distinct synthesized images 30 is N, then the total number of possible distinct differenced synthesized image pairs is given by N(N−1)/2.
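As a concrete illustration of the differencing and summation step, the following sketch assumes the registered synthesized images are supplied as equal size 2-D arrays; it is an illustration, not the patented implementation.

from itertools import combinations
import numpy as np

def sum_of_pairwise_differences(synthesized_images):
    """Difference every distinct pair of registered synthesized images and
    accumulate the differences; with N images there are N*(N-1)/2 pairs."""
    images = [np.asarray(img, dtype=float) for img in synthesized_images]
    total = np.zeros_like(images[0])
    for a, b in combinations(range(len(images)), 2):
        total += images[a] - images[b]             # the stationary background cancels in each pair
    return total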
To obtain the best possible detection performance, the resulting sum 45 of differenced synthesized images is transferred to a noise optimal matched filter detection processor 50. The matched filter detection processor 50 incorporates an array of matched filters 55 having properties corresponding to the physical phenomenology of potential objects, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, and the like. The matched filter detection processor 50 also incorporates a priori knowledge of measurement noise statistics propagated through the synthesis and differencing process, as well as, knowledge concerning the unique signal measurement characteristics of the SpinAp sensor system 10. Each of the matched filters 55 of the detection processor 50 is applied to every pixel location of the total sum 45 of differenced synthesized images. The output of each matched filter 55 is transferred to a comparator 60 with thresholds set to a predetermined level for each matched filter 55. A moving object detection is declared when any of the outputs from the matched filters 55 exceed the predetermined thresholds of the comparator 60. The threshold may be predetermined by a priori assumptions concerning the signal and noise statistics, or by a posteriori assumptions concerning the signal and noise statistics based upon the individual frame or set of SpinAp frames.
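A minimal sketch of the matched filter array and comparator logic follows. The FFT-based correlation, the template list, and the per-filter thresholds are illustrative assumptions; a noise optimal implementation would derive its filters from the propagated noise statistics discussed above.

import numpy as np

def matched_filter_detect(diff_sum, templates, thresholds):
    """Correlate each template with the summed difference image at every pixel
    (FFT-based correlation) and report pixels exceeding that filter's threshold."""
    rows, cols = diff_sum.shape
    data_spec = np.fft.fft2(diff_sum)
    detections = []
    for template, threshold in zip(templates, thresholds):
        # Zero-pad the template to the image size and correlate with the data.
        response = np.real(np.fft.ifft2(data_spec * np.conj(np.fft.fft2(template, s=(rows, cols)))))
        hits = np.argwhere(response > threshold)   # comparator: per-filter threshold test
        detections.extend((int(r), int(c)) for r, c in hits)
    return detections                              # a non-empty list declares a moving object detection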
The differencing operation eliminates all temporally stationary components of the synthesized image, leaving only the signature of smeared synthesized moving objects. The performance of the moving object detection processing method 20 is scenario dependent and is degraded by individual frame measurement noise propagated through the synthesis and differencing operations, and background leakage due to residual frame to frame registration error. If observation parameters permit, acceptable performance may be achieved for less than optimal approaches, whereby the optimal matched filters 55 and the comparator 60 may be replaced by template matched filters and the comparator 60, streak detectors and the comparator 60, or a simple single pixel or multiple pixel threshold exceedence detection processor.
FIG. 3 illustrates a top level embodiment of a SpinAp moving object detection processing method 20-2 in accordance with the principles of the present invention employed using an appropriately modified spinning aperture radiometer system and processing methods described in U.S. Pat. No. 5,243,351. The detection method 20-2 shown in FIG. 3 employs two or more SpinAp strip aperture images 30-2 obtained by observing the same scene, and generated using methods described in U.S. Pat. No. 5,243,351. In the presence of line of sight registration errors, each of the individual SpinAp images 30-2 is optimally registered spatially as described in U.S. Pat. No. 5,243,351, and Fourier transformed to a common spatial frequency coordinate system and grid. Typically, the common spatial frequency grid is associated with an equivalent full circular aperture SpinAp synthesized spatial frequency passband.
Any or all possible distinct pairs of spatially registered and Fourier transformed individual frame images 30-2 are multiplied by spatial frequency filters 31-2 designed to compensate for the temporally varying system transfer functions associated with each individual frame and resulting from the rotation of the SpinAp sensor system 10. The system transfer function compensation may be implemented by one of several methods, including: equalizing each frame to a target full circular aperture or other reference system transfer function, equalizing one frame to the system transfer function of the frame to be differenced, or deconvolving the system transfer function from both images to be differenced. For ideal system transfer functions, and if and only if only frames acquired π radians apart are to be differenced, no system transfer function equalization is required.
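One of the equalization options listed above, equalizing one frame to the system transfer function of the frame it will be differenced with, can be sketched as follows. The array names, the regularization constant eps, and the returned passband mask are assumptions made here for illustration only.

import numpy as np

def equalize_to_reference_stf(frame_spectrum, stf_frame, stf_reference, eps=1e-6):
    """Multiply a frame's spatial frequency spectrum by STF_reference/STF_frame so
    that, inside the common passband, it shares the reference frame's system
    transfer function; frequencies outside the common passband are zeroed."""
    common = (np.abs(stf_frame) > eps) & (np.abs(stf_reference) > eps)
    equalized = np.zeros_like(frame_spectrum)
    equalized[common] = frame_spectrum[common] * (stf_reference[common] / stf_frame[common])
    return equalized, common                       # 'common' is the overlap passband mask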
For a template signal detection approach, SpinAp moving object detection methods 20-2 based upon any one of the many possible equalization procedures have unique noise performance characterized by the particular method's receiver operating characteristics (ROC). The differences in ROC for signal template approaches are dominated by different and uncompensated levels of noise “boosting” or “de-boosting” generated by the equalization process. However, an appropriately designed noise optimal matched filter approach compensates for the noise characteristics in the equalization process.
The equalized individual frames are differenced 40-2, and spatial frequency filtered 41-2 to eliminate non-common information outside an overlap passband associated with the two frames. The spatial frequency filtered frame difference is added 35-2 to previous filtered frame differences and stored, thereby generating a total sum 35-2 of differenced spatial frequency filtered Fourier transformed images. If the number of distinct individual frame images is N, then the total number of possible distinct differenced individual frame image pairs contributing to the sum may be as large as N(N−1)/2.
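A hedged sketch of the equalization, differencing, non-common passband rejection, and accumulation steps follows; it assumes the frame spectra and their system transfer functions are available as arrays on the common grid, and the names used are hypothetical.

from itertools import combinations
import numpy as np

def summed_filtered_differences(frame_spectra, stfs, eps=1e-6):
    """Equalize frame j to frame i, difference, reject the non-common passband,
    and accumulate over all N(N-1)/2 distinct pairs of frame spectra."""
    total = np.zeros_like(frame_spectra[0])
    for i, j in combinations(range(len(frame_spectra)), 2):
        overlap = (np.abs(stfs[i]) > eps) & (np.abs(stfs[j]) > eps)    # common passband of the pair
        diff = np.zeros_like(frame_spectra[0])
        diff[overlap] = frame_spectra[i][overlap] - \
            frame_spectra[j][overlap] * (stfs[i][overlap] / stfs[j][overlap])
        total += diff                              # non-common frequencies remain zero
    return total                                   # inverse transforming 'total' yields the spatial signature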
To obtain the best possible detection performance, the resulting sum 35-2 of differenced spatial frequency filtered transformed images is inverse Fourier transformed 42-2 to obtain a spatial domain image 45-2, which is transferred to a noise optimal matched filter detection processor 50-2. The matched filter detection processor 50-2 incorporates an array of matched filters 55-2 having properties corresponding to the physical phenomenology of potential objects, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, and the like. The matched filter detection processor 50-2 also incorporates a priori knowledge of the measurement noise statistics propagated through the equalization, differencing, and filtering process, as well as the unique signal measurement characteristics of the SpinAp sensor system 10. Each of the matched filters 55-2 of the detection processor 50-2 is applied to every pixel location of the total sum 35-2 of differenced filtered images. The output of each matched filter 55-2 is transferred to a comparator 60-2 with thresholds set to a predetermined level for each matched filter 55-2. A moving object detection is declared when any of the outputs from the array of matched filters 55-2 exceed the predetermined thresholds of the comparator 60-2. The threshold may be predetermined by a priori assumptions concerning the signal and noise statistics, or by a posteriori assumptions concerning the signal and noise statistics based upon the individual frame or set of SpinAp frames.
The equalization, differencing, and filtering operations eliminate all temporally stationary components in the common overlap passband of the individual frame images, leaving only the signature of smeared moving objects. The performance of the moving object detection processing method 20-2 is scenario dependent and is degraded by individual frame measurement noise propagated through the equalization, differencing, and filtering operations, and background leakage due to residual frame to frame registration error. If observational parameters permit, acceptable performance may be achieved for less than optimal approaches, whereby the optimal array of matched filters 55-2 and comparator 60-2 may be replaced by template matched filters and the comparator 60-2, streak detectors and the comparator 60-2, or a simple single pixel or multiple pixel threshold exceedence detection processor.
FIG. 4 illustrates a top level embodiment of a SpinAp moving object detection processing method 20-3 in accordance with the principles of the present invention employing an appropriately modified spinning aperture radiometer system and processing methods described in U.S. Pat. No. 5,243,351. The detection method 20-3 shown in FIG. 4 employs a temporal sequence of individual SpinAp strip aperture images 50-3 or sub-images obtained by observing the same scene, generated using methods described in U.S. Pat. No. 5,243,351. In the presence of line of sight registration errors, each of the individual SpinAp images 50-3 is optimally registered spatially by techniques described in U.S. Pat. No. 5,243,351, and Fourier transformed 51-3 to a common spatial frequency coordinate system and grid, thereby producing Fourier transformed images 52-3. Typically, the common spatial frequency grid is associated with an equivalent full circular aperture SpinAp synthesized spatial frequency passband. Any or all possible spatially registered and Fourier transformed individual frame images 52-3 are temporally Fourier transformed 52-3 to produce a data set 54-3 associating all temporal frequencies with each spatial frequency.
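The construction of the temporal and spatial frequency data set can be sketched as follows; this is an illustration only, assuming uniformly spaced, spatially registered frames supplied as equal size arrays in time order.

import numpy as np

def temporal_spatial_spectrum(registered_frames):
    """registered_frames: spatially registered frames in time order, equal sizes.
    Returns an array of shape (temporal_freqs, rows, cols): every spatial
    frequency bin carries a full temporal frequency spectrum."""
    spatial_spectra = np.stack([np.fft.fft2(frame) for frame in registered_frames], axis=0)
    return np.fft.fft(spatial_spectra, axis=0)     # temporal FFT at each spatial frequency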
To obtain the best possible detection performance, the temporal frequency and spatial frequency transformed data set is transferred to a noise optimal matched filter detection processor 55-3. The matched filter detection processor 55-3 incorporates an array of matched filters 56-3 having properties corresponding to the physical phenomenology of potential objects, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, and the like. The matched filter detection processor 55-3 also incorporates a priori knowledge of the variation of the system transfer function and other unique measurement characteristics associated with the SpinAp sensor system 10, and the measurement signal and noise statistics propagated through the measurement and transforming processes. Each matched filter 56-3 of the detection processor 55-3 is applied to the set 54-3 of temporally and spatially transformed data. The output of each matched filter 56-3 is transferred to a comparator 60-3 with thresholds set to a predetermined level for each matched filter 56-3. A moving object detection is declared when any of the outputs from the array of matched filters 56-3 exceed the predetermined thresholds of the comparator 60-3. The threshold may be predetermined by a priori assumptions concerning the signal and noise statistics, and/or by a posteriori assumptions concerning the signal and noise statistics based upon the individual frame or set of SpinAp frames.
The temporal frequency matched filter 56-3 implicitly eliminates all temporally stationary components in the individual frame images, leaving only the temporal and spatial frequency signature of smeared moving objects. The performance of the moving object detection processing method 20-3 is scenario dependent and is degraded by individual frame measurement noise propagation, and background leakage due to residual frame to frame registration error. If observational parameters permit, acceptable performance may be achieved for less than optimal approaches, whereby the optimal array of matched filters 56-3 and comparator 60-3 may be replaced by template matched filters and the comparator 60-3, or a simple single frequency “pixel” or multiple frequency “pixel” threshold exceedence detection processor.
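A minimal sketch of the simplified, less than optimal detector mentioned above (single or multiple frequency "pixel" threshold exceedence) is given below; the threshold value and array layout are assumptions for illustration.

import numpy as np

def simple_temporal_frequency_detector(tf_cube, threshold):
    """tf_cube: temporal-and-spatial frequency data set of shape (temporal_freqs, rows, cols)."""
    moving_part = tf_cube.copy()
    moving_part[0] = 0.0                           # discard the zero temporal frequency (stationary scene)
    return bool(np.any(np.abs(moving_part) > threshold))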
FIG. 5 illustrates a top level embodiment of a SpinAp moving object detection processing method 20-4 in accordance with the principles of the present invention employing an appropriately modified spinning aperture radiometer system and processing methods described in U.S. Pat. No. 5,243,351. The detection method 20-4 of FIG. 5 employs two or more SpinAp strip aperture images 30-4 obtained by observing the same scene, generated in accordance with methods described in U.S. Pat. No. 5,243,351. In the presence of line of sight registration errors, each of the individual SpinAp images is optimally registered spatially by methods described in U.S. Pat. No. 5,243,351, and Fourier transformed to a common spatial frequency coordinate system and grid. Typically, the common spatial frequency grid is associated with the equivalent full circular aperture SpinAp synthesized spatial frequency passband.
Any or all possible distinct pairs of spatially registered and Fourier transformed individual frame images 30-4 are multiplied by spatial frequency filters 31-4 designed to compensate for the temporally varying system transfer functions associated with each individual frame and resulting from the rotation of the SpinAp sensor system 10. The system transfer function compensation may be implemented by one of several methods, a few examples of which include: equalizing each frame to a target full circular aperture or other reference system transfer function, equalizing one frame to the system transfer function of the frame to be differenced, or deconvolving the system transfer function from both images 30-4 to be differenced. For ideal system transfer functions, and if and only if only frames acquired π radians apart are to be differenced, no system transfer function equalization is required.
The equalized individual frames are differenced 40-4, and spatial frequency filtered 41-4 to eliminate non-common information outside the overlap passband associated with the two frames. The spatial frequency filtered frame differences are inverse Fourier transformed 42-4, producing individual pairwise differenced images 45-4. If the number of distinct individual frame images is N, then the total number of possible distinct differenced individual frame image pairs can be as large as N(N−1)/2.
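The per-pair handling that distinguishes this embodiment from that of FIG. 3 can be sketched as follows: each pairwise differenced image is matched filtered with its own pair-specific template, and the filter outputs, rather than the images, are summed before thresholding. The names pairwise_diffs and pair_templates are hypothetical placeholders.

import numpy as np

def total_matched_filter_output(pairwise_diffs, pair_templates):
    """Matched filter each pairwise differenced image with its own pair-specific
    template and sum the filter outputs (not the images) over all pairs."""
    total = np.zeros(pairwise_diffs[0].shape, dtype=float)
    for diff, template in zip(pairwise_diffs, pair_templates):
        spectrum = np.fft.fft2(diff) * np.conj(np.fft.fft2(template, s=diff.shape))
        total += np.real(np.fft.ifft2(spectrum))   # accumulate the per-pair correlation surfaces
    return total                                   # compare against a preset threshold to declare detections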
For a template signal detection approach, SpinAp moving object detection methods 20-4 based upon any one of the many possible equalization procedures have unique noise performance characterized by the particular method's receiver operating characteristics (ROC). The differences in ROC for signal template approaches are dominated by the different and uncompensated levels of noise “boosting” or “de-boosting” generated by the equalization process. However, an appropriately designed noise optimal matched filter approach compensates for the noise characteristics in the equalization process.
To obtain the best possible detection performance, the set of differenced spatial frequency filtered transformed images 45-4 is transferred to a noise optimal matched filter detection processor 50-4. The matched filter detection processor 50-4 incorporates an array of individual pairwise differenced frame matched filters 55-4 having properties corresponding to the physical phenomenology of potential objects, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, and the like. The matched filter detection processor 50-4 also incorporates a priori knowledge of the measurement noise statistics propagated through the equalization, differencing, and filtering process, as well as unique signal measurement characteristics of the SpinAp sensor system 10. Each of the individual pairwise differenced frame matched filters 55-4 of the detection processor 50-4 is applied to every pixel location for the appropriate frame difference and subsequently summed. The output of each object phenomenology total matched filter 56-4 is transferred to a comparator 60-4 with thresholds set to a predetermined level for each of the total matched filters 56-4. A moving object detection is declared when any of the outputs from the total matched filter array 55-4 exceed the predetermined thresholds of the comparator 60-4. The threshold may be predetermined by a priori assumptions concerning the signal and noise statistics, or by a posteriori assumptions concerning the signal and noise statistics based upon the individual frame or set of SpinAp frames.
The equalization, differencing, and filtering operations eliminate all temporally stationary components in the common overlap passband of the individual frame images, leaving only the signature of smeared moving objects. The performance of the moving object detection method 20-4 is scenario dependent and is degraded by individual frame measurement noise propagated through the equalization, differencing, and filtering operations, and background leakage due to residual frame to frame registration error. If observational parameters permit, acceptable performance may be achieved for less than optimal approaches, whereby the optimal array of matched filters 56-4 and comparator 60-4 may be replaced by template matched filters and the comparator 60-4, streak detectors and the comparator 60-4, or a simple single pixel or multiple pixel threshold exceedence detection processor.
For the purposes of completeness, the formulation of the SpinAp moving object detection processing method 20 is as follows. Using the definitions and nomenclature of U.S. Pat. No. 5,243,351, a description of the mathematical foundations of the enhanced moving object detection SpinAp sensor 10 and data processing methods 20, 20-2, 20-3, 20-4 embodied in the present invention is presented below.
Individual SpinAp noise free measurements will first be discussed. In the absence of noise, the mean number of detection electrons produced during the integration time of a detector located at position $\bar{R}$ and acquired during frame time $t_j$ may be expressed as

$$O_S(\bar{R},t_j)=\int d^2F\,\hat{O}_S(\bar{F},t_j)\exp\{-2\pi i\,\bar{F}\cdot\bar{R}\},$$

where $\hat{O}_S(\bar{F},t_j)$ is the spatial Fourier transform of the detector output at spatial frequency $\bar{F}$ and frame time $t_j$. Assuming the effective sensor system transfer function is temporally stationary during the detector's integration time $\Delta T$, the Fourier transform of the individual frame's detected intensity may be expressed as

$$\hat{O}_S(\bar{F},t_j)=\bar{I}_{AS}\,\hat{W}_0\,A_{fpaS}\,STF(\bar{F},t_j)\,\hat{S}_n(M\bar{F},t_j),$$

where $\bar{I}_{AS}$ is the intensity spatially averaged over the effective focal plane array area, $\hat{W}_0$ is the zero spatial frequency component of the detector transfer function, $A_{fpaS}$ is the effective focal plane array area, $STF(\bar{F},t_j)$ is the total sensor system transfer function corresponding to frame time $t_j$, and $\hat{S}_n(M\bar{F},t)$ is the normalized spatial Fourier transform of the geometrical projection of the scene at time $t$.
A mathematical representation of a scene and the scene's geometrical projection to the imaging plane, including background, a single moving object, and a scattering component, will now be discussed. The intensity emitted from a scene containing a single object moving in front (relative to the SpinAp sensor) of a background, and behind transmissive scattering layers, may be expressed as

$$S(\bar{r},t)=T(\bar{r},t)+B(\bar{r},t)-B(\bar{r},t)W_T(\bar{r},t)+H(\bar{r},t),$$

where $T(\bar{r},t)$ is the intensity distribution of the moving object at scene position $\bar{r}$ and time $t$, $B(\bar{r},t)$ is the intensity distribution of the background at scene position $\bar{r}$ and time $t$, $H(\bar{r},t)$ is the intensity distribution of the transmissive scattering layers corresponding to scene position $\bar{r}$ and time $t$, and $W_T(\bar{r},t)$ is the object's inverse silhouette at scene position $\bar{r}$ and time $t$. The object's inverse silhouette is given by

$$W_T(\bar{r},t)=\begin{cases}1, & T(\bar{r},t)>0\\ 0, & \text{otherwise.}\end{cases}$$

Subtracting $B(\bar{r},t)W_T(\bar{r},t)$ in the equation for $S(\bar{r},t)$ corresponds to removing the background photons obscured by the moving object at scene position $\bar{r}$ and time $t$.
Therefore, the geometrical projection of the scene's intensity to the focal plane of the SpinAp optical system is represented by the scene intensity evaluated at the scaled coordinate $\bar{R}/M$, where $M$ is the magnification of the optical system. Fourier transforming the geometrical projection and applying the Fourier transform similarity theorem yields the projected scene spectrum, where $FT$ and the circumflex ($\hat{\ }$) refer to the Fourier transform operation. Application of the Fourier transform convolution theorem, and transforming variables, permits the evaluation of $FT\{B(\bar{R}/M,t)W_T(\bar{R}/M,t)\}/M^2$. The corresponding geometrical projection of the scene spectrum and its zero spatial frequency component follow directly from these relations.
The instantaneous normalized geometrically projected scene spectrum is obtained by dividing the projected scene spectrum by its zero spatial frequency component. The normalized scene spectrum can be decomposed into two components, one dominated by the background and foreground spectrum and the other by the moving object's contribution to the spectrum; such a representation is

$$\hat{S}_n(M\bar{F},t)\equiv\hat{B}_n(M\bar{F},t)+\Delta\hat{T}_n(M\bar{F},t),$$

where $\hat{B}_n(M\bar{F},t)$ is the normalized background and foreground component for spatial frequency $M\bar{F}$ at time $t$, and $\Delta\hat{T}_n(M\bar{F},t)$ is the normalized object's variation from the background and foreground for spatial frequency $M\bar{F}$ at time $t$.
Normalized background and foreground spectra for temporally stationary scenes will now be discussed. For observational scenarios having temporally stationary total background components, the normalized total background spectrum and the moving object variation take correspondingly time independent forms. Moving object and background decoupling will now be discussed. If the number of background photons obscured by the moving object is negligible when compared to the total number of background photons, then the total background and moving object variation normalized spectra can be further simplified, decoupling the normalized background term $\hat{B}_n$ from the object variation term $\Delta\hat{T}_n$.
Temporal variation of the object and the object's inverse silhouette will now be discussed. The time dependent motion of the object may be described by applying a time dependent spatial shift to the argument of the object's spatial intensity distribution. Defining the starting time of the imaging process to be $t_0$, the initial object and the initial object inverse silhouette are given by

$$T_0(\bar{r})=T(\bar{r},t_0),\qquad W_{T0}(\bar{r})=W_T(\bar{r},t_0).$$

The motion dependent temporal variations of the object and the object silhouette are given by

$$T(\bar{r},t)=T_0(\bar{r}-\bar{r}_T(t)),\qquad W_T(\bar{r},t)=W_{T0}(\bar{r}-\bar{r}_T(t)),$$

where $\bar{r}_T(t)$ is the vectorial position of the object in scene coordinates as a function of time, referred to as the object track.
Moving object and inverse silhouette spectra will now be discussed. Applying the Fourier transform shift theorem to the moving object and the moving object inverse silhouette produces the corresponding spectra, given by

$$\hat{T}(M\bar{F},t)=\hat{T}(M\bar{F},t_0)\exp(-2\pi i\,M\bar{F}\cdot\bar{r}_T(t))\equiv\hat{T}_0(M\bar{F})\exp(-2\pi i\,\bar{F}\cdot\bar{R}_T(t))$$

and

$$\hat{W}_T(M\bar{F},t)=\hat{W}_T(M\bar{F},t_0)\exp(-2\pi i\,M\bar{F}\cdot\bar{r}_T(t))\equiv\hat{W}_{T0}(M\bar{F})\exp(-2\pi i\,\bar{F}\cdot\bar{R}_T(t)),$$

respectively, where $\bar{R}_T(t)=M\bar{r}_T(t)$ is the object track referred to focal plane coordinates.
Moving object variation from the background with explicit dependence on the object track will now be discussed. Substituting the above equations for $\hat{T}(M\bar{F},t)$ and $\hat{W}_T(M\bar{F},t)$ into the expression for the object spectrum variation $\Delta\hat{T}_N$ yields an explicit dependence on the object track $\bar{R}_T(t)$. For constant velocity motion during the detector integration time, the object's vectorial position can be represented as $\bar{R}_T(t)=\bar{V}t$, where $\bar{V}$ is the velocity component of the moving object. Therefore, the normalized object spectral variation, $\Delta\hat{T}_n$, acquires its time dependence through phase factors of the form $\exp(-2\pi i\,\bar{F}\cdot\bar{V}t)$. Time averaged individual SpinAp frame spectra will now be discussed. Since the mean detector measurement results from a temporal average of the imaged moving object and total background, the time averaged Fourier spectrum of the measurement, assuming the system transfer function is constant during the detector integration time, is obtained by averaging the instantaneous frame spectrum over the integration time $\Delta T$, where the time average of the constant velocity motion requires integration of the ‘track’ integral.
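Under the constant velocity assumption the track integral can be evaluated in closed form. The following sketch of that evaluation is provided for illustration; the normalization by $\Delta T$ and the integration interval $[t_j,\,t_j+\Delta T]$ are assumptions introduced here, not quoted from the patent:

$$\frac{1}{\Delta T}\int_{t_j}^{t_j+\Delta T}\exp\!\left(-2\pi i\,\bar{F}\cdot\bar{V}\,t\right)dt=\exp\!\left(-2\pi i\,\bar{F}\cdot\bar{V}\left(t_j+\tfrac{\Delta T}{2}\right)\right)\mathrm{sinc}\!\left(\bar{F}\cdot\bar{V}\,\Delta T\right),$$

where $\mathrm{sinc}(x)=\sin(\pi x)/(\pi x)$. The $\mathrm{sinc}$ factor is the spatial frequency domain expression of the motion smear accumulated during the integration time.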
The SpinAp synthesizer and synthesizing process will now be discussed. The simplest version of the SpinAp synthesizing process employs a synthesizer such that the spatial frequency redundancy average of the noise free measurements yields the equivalent full circular aperture image, or

$$\hat{O}_{F\,syn}(\bar{F})=\hat{\Lambda}(\bar{F})\,\hat{O}_{S\,ave}(\bar{F}),$$

where $\hat{O}_{F\,syn}(\bar{F})$ is the full circular aperture image to be synthesized, $\hat{\Lambda}(\bar{F})$ is the synthesizer, and $\hat{O}_{S\,ave}(\bar{F})$ is the spatial frequency redundancy average of the (noise free) individual frame measurement spectra. The synthesizer is defined for $\hat{W}_{NS}\,OTF_S(\bar{F})=STF_S(\bar{F})\neq 0$, where $STF$ corresponds to the system transfer function, which is the product of the optical system transfer function, $OTF$, and the normalized detector transfer function, $\hat{W}_N$; the subscripts $S$ and $F$ designate parameters associated with the SpinAp strip aperture sensor and the target full circular aperture system, respectively. The spatial frequency redundancy averaged strip aperture transfer function appears in the denominator of the synthesizer. Therefore, the synthesized image spectrum resulting from the observation of a scene consisting of a temporally stationary total background and foreground, and an object moving in the sensor's field of view, consists of a temporally stationary term plus moving object terms. The first term in the synthesized image spectrum corresponds to the equivalent full circular aperture image of the temporally stationary background. The next two terms correspond to the moving object's signature in the synthesized image.
Differenced synthesized images and moving object signatures will now be discussed. A moving object spatial frequency domain signature may be obtained by differencing two or more synthesized image spectra. Differencing two completely independent, full aperture synthesized images (or image spectra), while observing the same scene, requires a minimum of two 180 degree rotations. Alternatively, to decrease the time between frame differences, synthesized images (or spectra) may be differenced with as few as one non-common frame, or the synthesis operation may be performed with fewer than the required number of frames to generate the full circular passband image spectra.
Since the temporally stationary parts of the synthesized images are identical for any pair of synthesized images observing the same scene, the difference of synthesized images will contain only moving object information; the noise free spatial frequency domain moving object signature of the differenced synthesized images is denoted $\Delta\hat{O}_{BA}(\bar{F})$. The spatial signature of the moving object resulting from the difference of synthesized images is

$$\Delta\hat{O}_{BA}(\bar{R},\delta T)=\int d^2F\,\Delta\hat{O}_{BA}(\bar{F})\exp(2\pi i\,\bar{F}\cdot\bar{R}),$$

where $\delta T$ represents the time delay associated with the acquisition of the different sets of acquired strip aperture frames.
Differencing any pair of appropriately modified SpinAp frames for background removal and moving object detection will now be discussed. For $2N_f$ individual SpinAp strip aperture frames, there are $2N_f(2N_f-1)/2$ possible distinct pairwise frame comparisons, or differences. An optimal detection technique would take advantage of the background removal when differencing any possible pair, as well as, the potentially large number of moving object pairwise signatures. However, unlike differencing frames for the purposes of background removal and moving object detection for temporally stationary aperture systems, specific techniques and methods must be applied to the SpinAp data sets to compensate for the rotating nature of the system transfer function. The SpinAp data sets are modified to distinguish between temporal variations in the scene and temporal variations originating from the moving object.
SpinAp individual noisy frame measurements and frame to frame system transfer function equalization will now be discussed. The individual strip aperture noisy measurement spectra can be represented as the noise free strip aperture detected image spectra plus the strip aperture additive noise spectra,

$$\hat{O}_{Sm}(\bar{F},t_j)=\hat{O}_S(\bar{F},t_j)+\hat{n}_s(\bar{F},t_j),$$

where $\hat{O}_S$ and $\hat{n}_s$ are the noise free signal and noise spectra, respectively.
Individual frame system transfer function passband filter will now be discussed. In general, the individual frame measurements provide information related to different portions of the equivalent full circular aperture optical system passband. The detection of temporally nonstationary phenomenology, such as, moving objects or transient events requires the identification and utilization of common information between frames, as well as, accommodating for the differences in image contrast due to the time varying nature of the SpinAp sensor's transfer function. The frame to frame system transfer function compensation can be implemented by several methods, a few examples of which are equalizing each frame to a target full circular aperture or other reference system transfer function; equalizing one frame to the system transfer function of the frame to be differenced; deconvolving the system transfer function from both images to be differenced; or for ideal system transfer functions, and if only frames acquired π radians apart are to be differenced, no system transfer function equalization is required.
For a template signal detection approach the SpinAp moving object detection methods based upon any one of the many possible equalization procedures would have unique noise performance characterized by the particular method's receiver operating characteristics (ROC). The differences in ROC for signal template approaches are dominated by the different and uncompensated levels of noise “boosting” or “de-boosting” generated by the equalization process. However, an appropriately designed noise optimal matched filter approach would compensate for the noise characteristics in the equalization process.
While any one of many methods can be used to implement the system transfer function equalization, the formulation described in the following paragraphs is typical of the mathematical details associated with each procedure. For brevity, only the procedure whereby one frame is equalized to another is described in detail.
Selecting one frame as the current reference frame and any different second frame to be used in the comparison, the spectra of the frames are spatial frequency filtered to provide spatial frequency information in the common overlap region of the passbands of both frames. The second frame is also modified to compensate for the differences in system transfer function, i.e.,

$$\hat{m}_R(\bar{F},t_i)=\hat{\chi}(\bar{F},t_j)\hat{\chi}(\bar{F},t_i)\,\hat{O}_{Sm}(\bar{F},t_i)=\hat{\chi}(\bar{F},t_i)\hat{\chi}(\bar{F},t_j)\left[\hat{O}_S(\bar{F},t_i)+\hat{n}_s(\bar{F},t_i)\right]$$

and

$$\hat{m}_E(\bar{F},t_j)=\hat{\chi}(\bar{F},t_j)\hat{\chi}(\bar{F},t_i)\,\hat{O}_{Sm}(\bar{F},t_j)\,\hat{H}(\bar{F},t_j,t_i)=\hat{\chi}(\bar{F},t_i)\hat{\chi}(\bar{F},t_j)\left[\hat{O}_S(\bar{F},t_j)+\hat{n}_s(\bar{F},t_j)\right]\hat{H}(\bar{F},t_j,t_i),$$

where $\hat{m}_R(\bar{F},t_i)$ denotes the spatial frequency filtered output of the reference frame, $\hat{m}_E(\bar{F},t_j)$ denotes the spatial frequency filtered and transfer function equalized output of the frame to be compared, $\hat{\chi}$ is the optical system passband filter,

$$\hat{\chi}(\bar{F},t_j)=\begin{cases}1, & STF_S(\bar{F},t_j)>0\\ 0, & \text{otherwise,}\end{cases}$$

and $\hat{H}$ is the noise free passband equalization filter.
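The explicit form of the equalization filter is not reproduced above. A form consistent with the equalized signal spectrum given below (stated here as an inference for illustration, not as the patent's definition) is

$$\hat{H}(\bar{F},t_j,t_i)=\frac{STF_S(\bar{F},t_i)}{STF_S(\bar{F},t_j)}\quad\text{for}\quad\hat{\chi}(\bar{F},t_i)\,\hat{\chi}(\bar{F},t_j)\neq 0,$$

so that, inside the common passband, the frame acquired at $t_j$ is brought to the system transfer function of the reference frame acquired at $t_i$.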
The filtered and equalized difference between any two frames can be expressed as the sum of differenced signal and noise spectra, i.e.,

$$\Delta\hat{m}_{ij}(\bar{F})=\hat{m}_R(\bar{F},t_i)-\hat{m}_E(\bar{F},t_j)=\hat{S}_{ij}(\bar{F})+\hat{N}_{ij}(\bar{F}),$$

where the differenced, filtered, and equalized noise spectrum, $\hat{N}_{ij}$, is

$$\hat{N}_{ij}(\bar{F})=\hat{\chi}(\bar{F},t_j)\hat{\chi}(\bar{F},t_i)\,\hat{n}_S(\bar{F},t_i)-\hat{\chi}(\bar{F},t_i)\hat{\chi}(\bar{F},t_j)\,\hat{n}_S(\bar{F},t_j)\,\hat{H}_S(\bar{F},t_j,t_i),$$

and the differenced, filtered, and equalized signal spectrum, $\hat{S}_{ij}(\bar{F})$, is

$$\hat{S}_{ij}(\bar{F})=\hat{\chi}(\bar{F},t_j)\hat{\chi}(\bar{F},t_i)\,\hat{O}_S(\bar{F},t_i)-\hat{\chi}(\bar{F},t_i)\hat{\chi}(\bar{F},t_j)\,\hat{O}_S(\bar{F},t_j)\,\hat{H}_S(\bar{F},t_j,t_i).$$

For temporally stationary backgrounds and foregrounds the resultant signal spectrum is related to the difference between the normalized moving object variations from the background, $\Delta\hat{T}_N$, by

$$\hat{S}_{ij}(\bar{F})\equiv\bar{I}_{AS}\,\hat{W}_{0S}\,A_{fpaS}\,\hat{\chi}(\bar{F},t_i)\hat{\chi}(\bar{F},t_j)\,STF_S(\bar{F},t_i)\left[\Delta\hat{T}_N(M\bar{F},t_i)-\Delta\hat{T}_N(M\bar{F},t_j)\right].$$

If the object is moving with constant velocity during the integration time of the detectors, then the filtered differenced signal corresponds to this expression with $\Delta\hat{T}_N$ evaluated along the constant velocity object track.
SpinAp spatial frequency and temporal frequency domain moving object detection will now be discussed. A simple mathematical representation of the spatial frequency and temporal frequency domain SpinAp moving object detection procedure is obtained by approximating the discrete temporal sampling as a continuum and taking the continuous Fourier transform of the spatial frequency transform for each of the SpinAp individual frames. The Fourier spectrum of the noiseless individual frame measurements is given by
The temporal Fourier transform of the spatial frequency spectrum of the individual frame is obtained by evaluating the integration, and applying the convolution theorem yielding
$$\hat{O}_S(\bar{F},f) \equiv \int dt\,\exp(-2\pi i f t)\,\hat{O}_S(\bar{F},t) = \bar{I}_{AS}\,\hat{W}_0\,A_{fpaS}\int dt\,\exp(-2\pi i f t)\,STF_S(\bar{F},t)\,\hat{S}_N(M\bar{F},t) = \bar{I}_{AS}\,\hat{W}_0\,A_{fpaS}\int df'\,\widetilde{STF}_S(\bar{F},f')\,\hat{S}_N(M\bar{F},f-f'),$$
where $\widetilde{STF}_S$ is the temporal Fourier transform of the SpinAp system transfer function, and $\hat{S}_N(M\bar{F},f)$ is the temporal and spatial frequency Fourier transform of the normalized geometrically projected scene spectrum.
Evaluating the temporal Fourier transform of the normalized geometrically projected scene spectrum, the noiseless temporal frequency transform of the strip aperture measurements becomes
The first equation above defining $\hat{O}_S(\bar{F},t_j)$ is used to determine the signal component of the temporal and spatial frequency matched filters. The matched filter essentially eliminates the temporally stationary components, since the first term of the equation for $\hat{O}_S(\bar{F},f)$ contains all temporally stationary effects and is zero for all nonzero spatial frequencies.
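The temporal frequency formulation can be sketched in Python/NumPy as follows: the registered frame spectra are stacked, a temporal FFT is taken along the frame axis, and the temporally stationary (zero temporal frequency) bin is suppressed before thresholding. This is a simplified sketch that assumes uniformly spaced frame times and ignores the per-frame system transfer function weighting; array names and the energy threshold are illustrative.

```python
import numpy as np

def temporal_frequency_detection(frames, threshold):
    """Suppress temporally stationary content in a registered frame stack and threshold the rest.

    frames    : array of shape (n_t, ny, nx) of line of sight corrected image frames.
    threshold : detection threshold applied to the stationary-suppressed energy map.
    Returns a boolean detection map in the spatial domain.
    """
    n_t = frames.shape[0]
    spatial_spectra = np.fft.fft2(frames, axes=(-2, -1))         # spatial FFT of each frame
    spatio_temporal = np.fft.fft(spatial_spectra, axis=0) / n_t  # temporal FFT across frames

    # The zero temporal frequency bin carries all temporally stationary effects; remove it.
    spatio_temporal[0] = 0.0

    # Collapse the remaining temporal frequency energy back to an image plane map.
    residual = np.fft.ifft2(spatio_temporal, axes=(-2, -1))
    energy = np.sum(np.abs(residual) ** 2, axis=0)
    return energy > threshold
```

In the noise optimal processor described above, the simple energy threshold would be replaced by the matched filter array and comparator.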
Simplified signatures for uniform background, foreground, and moving object will now be discussed. Prediction of receiver operating characteristics for SpinAp moving object detection procedures can be simplified for the special case of observational scenarios associated with uniform intensity backgrounds, foregrounds, and moving objects. Indeed, the design of coarse moving object detection approaches may utilize such simplifying assumptions. The temporally stationary and spatially uniform components of the background and foreground can be expressed as
$$B(\bar{r},t) \equiv B_0 \equiv B_{SCN} \quad\text{and}\quad H(\bar{r},t) \equiv H_0 \equiv H_{SCN},$$
where $B_{SCN}$ is the average background intensity in scene coordinates, and $H_{SCN}$ is the average foreground intensity in scene coordinates.
Focal plane average values and Fourier transforms will now be discussed. The geometrical projections of the average values of the scene's background and foreground intensities are
and
and the corresponding Fourier transforms are
where use has been made of the Fourier transforms of the backgrounds in the scene coordinate system
$$\hat{B}(\bar{F}) \equiv \bar{B}_0\,\delta(\bar{F}) = \bar{B}_{scn}\,\delta(\bar{F}), \quad\text{and}\quad \hat{H}(\bar{F},t) \equiv \bar{H}_0\,\delta(\bar{F}) = \bar{H}_{scn}\,\delta(\bar{F}),$$
and the Fourier transform similarity theorem.
Zero spatial frequency relationships to mean values will now be discussed. The normalized geometrical projection of the scene spectrum is generated by dividing the Fourier transform of the geometrical projection of the scene by the zero spatial frequency component of the geometrically projected scene spectrum. The geometrically projected zero spatial frequency components of the background's, foreground's, and moving object's intensities can be expressed in terms of the spatially averaged mean values and the geometrically projected areas of the backgrounds and object as
$$\hat{T}(M\bar{F}=0,t) = \bar{T}_{fpa}\,A_{trg\,fpa}, \quad \hat{B}(M\bar{F}=0,t) = \bar{B}_{fpa}\,A_{fpa}, \quad \hat{H}(M\bar{F}=0,t) = \bar{H}_{fpa}\,A_{fpa},$$
where $A_{trg\,fpa}$ is the approximate instantaneous geometrically projected area of the moving object, $A_{fpa}$ is the approximate detected area of the background in the focal plane of the SpinAp sensor, and the overline symbol refers to a spatial average in the focal plane array.
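The zero spatial frequency relationships can be verified numerically: the DC bin of the discrete Fourier transform of a uniform silhouetted object equals its intensity level times its projected area in pixel units. The grid size and rectangular silhouette below are arbitrary assumptions used only for illustration.

```python
import numpy as np

ny, nx = 128, 128
t_level = 3.0                          # uniform object intensity level
obj = np.zeros((ny, nx))
obj[40:60, 50:80] = t_level            # rectangular object silhouette on the focal plane grid

dc_bin = np.fft.fft2(obj)[0, 0].real   # zero spatial frequency component, T_hat(F = 0)
area = np.count_nonzero(obj)           # geometrically projected object area (pixels)

# T_hat(F = 0) = (mean object intensity) x (projected object area)
assert np.isclose(dc_bin, t_level * area)
```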
Zero spatial frequency relationships for the object inverse silhouette will now be discussed. Applying the Fourier transform definition to the object inverse silhouette in the scene coordinates, and transforming to the image plane coordinates by a change of variables given by $M\bar{r}_{scene} = \bar{R}_{image}$, yields
Therefore, the geometrical projection of the moving object's zero spatial frequency component can also be related to the object's effective area in the focal plane by
Normalized spectra for constant foreground and background will now be discussed. Applying the preceding relationships for the uniform background components of the scene enables determination of mathematical expressions for the normalized background and normalized object. Recognizing the ratio of the sum of background and foreground projected scene spectra to their zero spatial frequency component to be
yields the total normalized scene background given by
In like manner, the normalized object difference from the background, $\hat{T}_N$, for temporally stationary and spatially uniform backgrounds can be expressed as
Uniform object approximation will now be discussed. In scene coordinates, a uniform object can be represented as the product of the object's inverse silhouette, $W_T$, with the object's constant intensity level, $T_0$, i.e.
$$T(\bar{r},t) = T_0\,W_T(\bar{r},t).$$
Therefore, the Fourier transform of a uniform object is
$$\hat{T}(\bar{F},t) = T_{scn}\,\hat{W}_T(\bar{F},t),$$
and the corresponding geometrically projected transform of the uniform object is
$$\hat{T}(M\bar{F},t) = T_{scn}\,\hat{W}_T(M\bar{F},t).$$
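Before the shift theorem is applied in the next paragraphs, the constant velocity behavior can be checked numerically: translating the object's silhouette multiplies its discrete Fourier transform by a linear phase factor, the $\exp(-2\pi i\,\bar{F}\cdot\bar{V}t)$ term used below. A minimal sketch with an arbitrary integer pixel displacement and grid size:

```python
import numpy as np

ny, nx = 64, 64
w0 = np.zeros((ny, nx))
w0[20:30, 10:22] = 1.0                               # object inverse silhouette at t0 = 0

vy, vx, t = 3, 5, 1                                  # integer pixel displacement V * t
wt = np.roll(np.roll(w0, vy * t, axis=0), vx * t, axis=1)  # silhouette translated to time t

W0 = np.fft.fft2(w0)
Wt = np.fft.fft2(wt)

# Shift theorem: W_T(F, t) = W_T(F, t0) * exp(-2*pi*i * (fy*vy + fx*vx) * t)
fy = np.fft.fftfreq(ny)[:, None]                     # spatial frequencies, cycles per pixel
fx = np.fft.fftfreq(nx)[None, :]
phase = np.exp(-2j * np.pi * (fy * vy + fx * vx) * t)
assert np.allclose(Wt, W0 * phase)
```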
Thus, for uniform objects and backgrounds, the normalized object variation from the background, $\Delta\hat{T}_N$, for temporally stationary scenes is given by
where the normalized inverse silhouette is defined to be
and the zero spatial frequency component of the normalized inverse silhouette is
Applying the Fourier transform shift theorem to describe an object moving at constant velocity, the object inverse silhouette is
$$\hat{W}_{TN}(M\bar{F},t) = \hat{W}_{TN}(M\bar{F},t_0)\,\exp(-2\pi i\,\bar{F}\cdot\bar{V}t) \equiv \hat{W}_{TNo}(M\bar{F})\,\exp(-2\pi i\,\bar{F}\cdot\bar{V}t),$$
where $\hat{W}_{TNo}(M\bar{F}) = \hat{W}_{TN}(M\bar{F},t_0)$ is the Fourier transform of the normalized object's inverse silhouette at time $t_0 = 0$. The resulting expression for the object variation from the backgrounds is
The differenced and filtered moving object spatial frequency signature for uniform objects and backgrounds, $\Delta\hat{S}_{ij}$, is therefore
The spatial domain differenced signal for uniform object, background, and foreground will now be discussed. Inverse Fourier transforming the moving object's differenced and filtered spatial frequency spectrum produces the spatial domain moving object signature, which is
The differenced detector mean signal to RMS noise ratio will now be discussed. The mean differenced signal to RMS differenced noise ratio at location {overscore (R)} for uniform objects and backgrounds can be expressed as
where $\langle\cdots\rangle$ denotes noise ensemble averaging, and the differenced noise variance $\sigma^2_{\Delta N_{ij}}(\bar{R}_m)$ may be obtained by the analytic procedures described with reference to spatially white noise.
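When an analytic expression for the differenced noise variance is not convenient, the mean differenced signal to RMS differenced noise ratio at a pixel can be estimated by Monte Carlo. The sketch below assumes zero mean, spatially white, frame-to-frame independent noise; the function and variable names are illustrative.

```python
import numpy as np

def differenced_pixel_snr(signal_diff, sigma_frame, n_trials=2000, seed=0):
    """Estimate mean/RMS of the differenced measurement at the peak signature pixel.

    signal_diff : noiseless differenced moving object signature (2-D array).
    sigma_frame : per pixel RMS noise of a single frame (white noise assumption).
    """
    rng = np.random.default_rng(seed)
    peak = np.unravel_index(np.argmax(np.abs(signal_diff)), signal_diff.shape)
    samples = np.empty(n_trials)
    for k in range(n_trials):
        noise_i = rng.normal(0.0, sigma_frame, signal_diff.shape)
        noise_j = rng.normal(0.0, sigma_frame, signal_diff.shape)
        samples[k] = (signal_diff + noise_i - noise_j)[peak]
    return samples.mean() / samples.std()

sig = np.zeros((32, 32))
sig[16, 10:20] = 4.0                                  # toy differenced signature
print(differenced_pixel_snr(sig, sigma_frame=1.0))    # ~ 4 / sqrt(2) for white noise
```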
The differenced synthesized pair signal spectrum for uniform objects and backgrounds will now be presented. The noise free spatial frequency domain signature for any pair of differenced synthesized images associated with uniform objects and backgrounds is given by
and the corresponding spatial domain moving object signature for any two differenced synthesized images is
Matched filter nomenclature will now be discussed. The best possible performing detection processor, for wide classes of performance criteria, has been established to be an array of generalized matched filters with a comparison device, which will be referred to as a comparator. The design of the optimum matched filter arrays for SpinAp moving object detection processors requires incorporating knowledge of the unique signal measurement characteristics of the SpinAp sensor, the desired data processing before input into the matched filter array, and the anticipated object phenomenology. The a priori object knowledge can be either statistical or deterministic in nature, including physical parameters, such as, shape, size, velocity, acceleration, background to object contrast ratio, signal to noise ratio, number of objects, number of track crossings, noise cross-covariance, etc. The matched filters of the array for a particular signature can be expressed in matrix notation as,
$$F = S^t\,C_{NN}^{-1},$$
where $S$ is the a priori noiseless signal vector, and $C_{NN}^{-1}$ is the inverse of the noise cross-covariance matrix. Both the signal and noise cross-covariance matrices are appropriately defined for the desired moving object detection data processing method.
The signal matrix resulting from any of the combinations of differenced frames, differenced synthesized images, temporal frequency variations or other signatures as determined to be appropriate for SpinAp moving object detection may be represented as
where ξ refers to any or all combinations of independent spatial, spatial frequency, temporal, and temporal frequency signal coordinates associated with the a priori statistical or deterministic SpinAp moving object signature.
Likewise, the noise matrix resulting from any of the combinations of differenced frames, differenced synthesized images, temporal frequency variations, or other signatures as determined to be appropriate for SpinAp moving object detection can be represented as
The set of noisy measurements can therefore be represented in matrix notation as the sum of signal and noise components, yielding
Complex-valued matched filters can be constructed by recognizing that the signal and noise matrix elements are the real and imaginary parts (or amplitude and phase parts) of the complex signal and noise, respectively.
For example, the signal and noise matrices for any number of differenced frames, or differenced synthesized images within the spatially shifting matched filter acquisition “window” would be represented by
and
respectively, where $n$ now refers to the total number of elements in the spatial matched filter acquisition window.
The matched filter output will now be discussed. The matched filter output corresponds to the noise free matched filter signal plus additive "whitened" noise, or
$$M_{out} = F\,M = S^t\,C_{NN}^{-1}(S + N).$$
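A compact sketch of the generalized matched filter and comparator described above, assuming the a priori signal vector S, the noise cross-covariance matrix C_NN, and the measurement vector M have already been assembled for the chosen signature; the white noise example values and the threshold are illustrative, and constants relating the raw output to the likelihood ratio threshold ln K are omitted.

```python
import numpy as np

def matched_filter_output(S, C_NN, M):
    """Generalized matched filter output: M_out = S^t C_NN^-1 M."""
    # Solve C_NN x = M rather than forming the explicit inverse (numerically safer).
    return float(S @ np.linalg.solve(C_NN, M))

def comparator(S, C_NN, M, ln_K):
    """Declare a detection when the matched filter output exceeds the threshold."""
    return matched_filter_output(S, C_NN, M) > ln_K

# Example: white differenced noise, C_NN = 2 sigma^2 I (the pi-wise differencing case).
rng = np.random.default_rng(0)
sigma, n = 1.0, 256
S = np.zeros(n)
S[100:110] = 5.0                                     # a priori noiseless signal template
C_NN = 2.0 * sigma**2 * np.eye(n)
M = S + rng.normal(0.0, np.sqrt(2.0) * sigma, n)     # noisy measurement M = S + N
print(comparator(S, C_NN, M, ln_K=10.0))
```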
Matched filter receiver operating characteristics for Gaussian noise will now be discussed. The receiver operating characteristics (ROC) for the SpinAp matched filter moving object detection processing methods utilizing a comparator and a threshold exceedence decision approach can be readily determined for approximately Gaussian individual frame measurement noise. The probability of detection and probability of false alarms for the SpinAp matched filter output for a given threshold can be determined from
where ln K is the threshold parameter of the comparator, and
respectively.
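For approximately Gaussian noise, the matched filter output is itself Gaussian, so detection and false alarm probabilities follow from the Gaussian tail function evaluated at the threshold. The expressions below are the standard Gaussian detection theory results written in terms of the deflection $d^2 = S^t C_{NN}^{-1} S$; they are a hedged reconstruction and not necessarily the exact form of the omitted equations.

```python
import math

def gaussian_roc(d2, threshold):
    """Standard Gaussian matched filter ROC.

    d2        : deflection S^t C_NN^-1 S (squared output signal to noise ratio).
    threshold : decision threshold applied to the matched filter output.
    Returns (probability_of_detection, probability_of_false_alarm).
    """
    d = math.sqrt(d2)
    q = lambda x: 0.5 * math.erfc(x / math.sqrt(2.0))   # Gaussian tail probability
    p_fa = q(threshold / d)            # noise only: output ~ N(0, d^2)
    p_d = q(threshold / d - d)         # signal plus noise: output ~ N(d^2, d^2)
    return p_d, p_fa

print(gaussian_roc(d2=16.0, threshold=8.0))
```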
SpinAp differenced noise cross-covariance matrix elements will now be discussed. Determination of the noise optimal SpinAp moving object detection matched filters depends upon a priori knowledge or assumptions concerning the noise processes and noise propagation from the measurement steps to the matched filter. The classes of SpinAp moving object detection procedures described elsewhere in this disclosure utilize individual weighted frame differences, synthesized frame differences, or temporal and spatial frequency matched filters. The determination of the appropriate propagated noise cross-covariance matrix elements is required to establish the noise optimal matched filters, or the performance of less than noise optimal techniques. The following sections provide example techniques for determining the appropriate SpinAp noise covariance matrix elements. Covariance matrix elements associated with processing steps not explicitly described can follow similar methods and procedures of determination.
The noise spectrum for any pairwise differenced frames will now be discussed. One embodiment of the SpinAp moving object detection processing method differences any distinct pair of system transfer function equalized, and overlap passband filtered individual frames. The corresponding differenced, equalized, and filtered noise spectrum, $\Delta\hat{N}_{ij}$, is
$$\Delta\hat{N}_{ij}(\bar{F}) \equiv \hat{\chi}(\bar{F},t_j)\,\hat{\chi}(\bar{F},t_i)\,\hat{n}_S(\bar{F},t_i) - \hat{\chi}(\bar{F},t_i)\,\hat{\chi}(\bar{F},t_j)\,\hat{n}_S(\bar{F},t_j)\,\hat{H}(\bar{F},t_j,t_i).$$
The individual pixel noise associated with position {overscore (R)} is obtained by performing an inverse Fourier transform of the processed noise spectrum, and is
For zero mean individual frame noise, $\langle\hat{n}_S(\bar{R},t_i)\rangle = 0$, $\langle\hat{n}_S(\bar{R},t_j)\rangle = 0$, $\langle\hat{n}_S(\bar{F},t_i)\rangle = 0$, $\langle\hat{n}_S(\bar{F},t_j)\rangle = 0$, and $\langle\Delta N_{ij}(\bar{R})\rangle = 0 = \langle\Delta\hat{N}_{ij}(\bar{F})\rangle$.
The spatial noise cross-covariance will now be discussed. The spatial noise cross-covariance between any two pairs of differenced, equalized, and filtered frames at pixel locations $\bar{R}_m$ and $\bar{R}_n$ is
where the superscript * refers to complex conjugation.
The effects of zero mean, temporally stationary and uncorrelated noise will now be discussed. For zero mean, temporally stationary, and temporally uncorrelated noise, the individual frame noise spectrum cross covariance is related to the noise power spectral density $\hat{C}_{nn}$ by
where $\delta_{i,j}$ and $\delta(\bar{F}-\bar{F}')$ are the Kronecker and Dirac delta functions, respectively.
Substituting individual frame noise spectrum cross covariance into the expression for the spatial differenced and filtered noise covariance produces
A special case corresponding to the pixel to pixel cross-covariance for a single filtered and equalized frame difference is obtained by letting the frame indices take the values i=k, j=l, and i≠j. The corresponding effective noise spatial cross covariance is:
Spatially white noise and Nyquist sampling will now be discussed. For spatially white noise and Nyquist sampling of the optics passband, the individual SpinAp frame noise power spectral density can be approximated as
where λ is the wavelength, Z is the effective optical system focal length, $D_x$ and $D_y$ are the aperture sizes in the x and y directions, respectively, $\Delta x_S$ and $\Delta y_S$ are the spatial sampling increments, which for 100% fill factor focal plane detector arrays correspond to the detector spacing, and $\bar{\sigma}_{dS}^2$ is the mean noise variance on an individual detector. Substituting the individual white noise power spectral density cross covariance into the expression for the spatial differenced and filtered noise covariance produces
The special case corresponding to the pixel to pixel cross-covariance for a single filtered and equalized frame difference is
π-wise differences, covariance matrix elements, and white noise will now be discussed. One further simplifying example, which for ideal system transfer functions has the processing advantage of not requiring system transfer function equalization, corresponds to differencing frames that are π radians apart. For an ideal rectangular strip aperture, the system transfer functions of two frames π radians apart provide identical contrast performance, and therefore the equalization filter is exactly unity, or $\hat{H}(\bar{F},t_j,t_i) = 1$, and the overlap passbands are identical. Therefore, for spatially white and temporally uncorrelated noise, the cross covariance noise matrix elements are
$$\langle\Delta N_{ij}(\bar{R}_m)\,\Delta N_{kl}^*(\bar{R}_n)\rangle \equiv C_{NN}\big|_{ij} = 2\bar{\sigma}_{dS}^2\,\delta_{i,j},$$
or the covariance matrix can be represented by
$$C_{NN} = 2\bar{\sigma}_{dS}^2\,I,$$
where $I$ is the identity matrix. The inverse of the cross covariance matrix is therefore
$$C_{NN}^{-1} = \frac{1}{2\bar{\sigma}_{dS}^2}\,I.$$
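The π-wise white noise result $C_{NN} = 2\bar{\sigma}_{dS}^2 I$ can be checked by simulation: differencing two independent white noise frames doubles the per pixel variance and leaves distinct pixels uncorrelated. The sample sizes below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_d, n_pix, n_trials = 1.5, 4, 200_000

# Two statistically independent white noise frames pi radians apart (H = 1, identical passbands).
n_i = rng.normal(0.0, sigma_d, (n_trials, n_pix))
n_j = rng.normal(0.0, sigma_d, (n_trials, n_pix))
delta_N = n_i - n_j                                   # differenced frame noise samples

# Empirical covariance approaches 2 * sigma_d^2 * I.
C_emp = (delta_N.T @ delta_N) / n_trials
print(np.round(C_emp, 2))                             # ~ 4.5 on the diagonal, ~ 0 off diagonal
```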
The synthesized noise spectrum will now be discussed. Another embodiment of the SpinAp moving object detection processing method differences any distinct pair of SpinAp synthesized images. As described in U.S. Pat. No. 5,243,351, the synthesized noise spectrum for the simplest form of synthesis process is
$$\hat{n}_{syn}(\bar{F}) = \hat{\Lambda}(\bar{F})\,\hat{n}_{S\,ave}(\bar{F}),$$
where the spatial frequency redundancy averaged noise, $\hat{n}_{S\,ave}(\bar{F})$, is given by
The differenced synthesized noise spectrum will now be discussed. Given the synthesized noise spectrum, the noise spectrum, $\Delta\hat{n}_{syn}(\bar{F})$, for any pair (not necessarily derived from completely distinct data sets) of synthesized images is
$$\Delta\hat{n}_{syn}(\bar{F}) = \hat{\Lambda}\,\Delta\hat{n}_{S\,ave}(\bar{F}),$$
where $\Delta\hat{n}_{S\,ave}(\bar{F})$ is
where the subscripts A and B refer to the SpinAp individual frame sets that were used to generate the synthesized images.
Spatial domain differenced synthesized noise will now be discussed. Performing the inverse Fourier transform of the differenced synthesized noise spectrum generates the spatial domain synthesized noise at pixel location {overscore (R)},
Spatial domain differenced synthesized noise covariance will now be discussed. The pixel to pixel cross-covariance for the synthesized noise difference can be expressed in terms of the SpinAp synthesizer and the cross covariance of the redundancy averaged differenced noise spectrum, or
where the redundancy averaged differenced noise spectrum is
Zero mean, temporally stationary and uncorrelated noise will now be discussed. For zero mean, temporally stationary, and temporally uncorrelated noise, the individual frame noise spectrum cross covariance is related to the noise power spectral density $\hat{C}_{NN}$ by
where the individual frame times are displayed explicitly and accommodate the potential of synthesizing images based on sets of SpinAp individual frames that may possess common data.
The differenced redundancy averaged noise spectrum correlation will now be discussed. For synthesized images generated from distinct sets of SpinAp individual frames, the noise spectrum cross-covariance for differenced synthesized images is
Therefore, the pixel to pixel differenced synthesized noise for temporally uncorrelated and stationary noise is
and, for temporally uncorrelated, temporally stationary, and spatially white noise, the spatial differenced synthesized noise cross-covariance matrix is
Embodiments of the present SpinAp moving object detection system consistent with a realization of the method illustrated in FIGS. 2 and 3 have been reduced to practice and tested. Image data was generated using the embodiment of FIG. 2 to produce a noiseless differenced synthesized moving object signature and a corresponding template matched filter output.
The image data demonstrated a low signal to noise ratio differenced synthesized image containing a faint smear of a moving object; the corresponding template matched filter output exhibited an excellent output signal to noise ratio on the order of 16 to 1, demonstrating the enhanced receiver operating characteristics produced by the present invention for this particular class of moving objects. In addition, experimental data has been obtained and used to demonstrate the π-wise moving object detection methods described herein.
Thus, improved spinning strip (partial) aperture imaging radiometers that synthesize equivalent full circular aperture images and detect moving objects from a plurality of rotating strip aperture image measurements, or synthesized image measurements have been disclosed. It is to be understood that the described embodiments are merely illustrative of some of the many specific embodiments which represent applications of the principles of the present invention. Clearly, numerous and varied other arrangements may be readily devised by those skilled in the art without departing from the scope of the invention.
Claims
- 1. A spinning strip aperture radiometer sensor system comprising:a telescope having a focal plane and comprised of a rotating strip aperture that rotates around an optical axis and that produces temporally sequential images of a scene; a two dimensional detector array for detecting images formed at the focal plane of the telescope; a signal processor coupled to the two-dimensional detector array for recording a plurality of the temporally sequential images of the scene as the strip aperture rotates around the optical axis of the telescope, for synthesizing images from the plurality of temporally sequential images of the scene, and for simultaneously detecting moving objects found in the temporally sequential images of the scene; wherein the signal processor additionally comprises: means for correcting line of sight errors in at least two temporally sequential images of the scene to form spatially registered image frames; means for spatially Fourier transforming the spatially registered image frames to a common spatial frequency to form spatially transformed frames; means for filtering individual spatially transformed frames using a system transfer function equalization filter to form equalized frames; means for differencing distinct pairs of equalized frames to form difference frames; means for filtering the difference frames using a non-common passband rejection filter; means for summing the filtered difference frames to form a total summed spectrum of filtered differences; means for inverse transforming the total summed spectrum of filtered differences to provide a moving object signature in the spatial domain; means for processing the moving object signature in the spatial domain through a matched filter array having a plurality of outputs; and means for comparing each of the plurality of outputs of the matched filter array to a corresponding predetermined threshold and declaring a moving object detection when at least one of the outputs of the matched filter array exceeds the corresponding predetermined threshold.
- 2. A spinning strip aperture radiometer sensor system comprising:a telescope having a focal plane and comprised of a rotating strip aperture that rotates around an optical axis and that produces temporally sequential images of a scene; a two dimensional detector array for detecting images formed at the focal plane of the telescope; a signal processor coupled to the two-dimensional detector array for recording a plurality of the temporally sequential images of the scene as the strip aperture rotates around the optical axis of the telescope, for synthesizing images from the plurality of temporally sequential images of the scene, and for simultaneously detecting moving objects found in the temporally sequential images of the scene; wherein the signal processor additionally comprises: means for correcting line of sight errors in the plurality of temporally sequential images of the scene to form a plurality of spatially registered image frames; means for spatially Fourier transforming the plurality of spatially registered image frames to a common spatial frequency grid to form a plurality of spatially transformed frames; means for temporally Fourier transforming the plurality of spatially transformed frames to produce a temporal and spatial frequency transformed data set; and means for processing the temporal and spatial frequency transformed data set using a noise optimal matched filter detection processor.
- 3. The system of claim 2 wherein the noise optimal matched filter detection processor uses a priori knowledge of object phenomenology, signal and noise measurement statistics, and unique measurement characteristics of the spinning strip aperture radiometer sensor system.
- 4. The system of claim 2 wherein the noise optimal matched filter detection processor comprises template matched filters.
- 5. The system of claim 2 wherein the noise optimal matched filter detection processor comprises streak detectors.
- 6. The system of claim 2 wherein the noise optimal matched filter detection processor comprises a single pixel threshold exceedence detection processor.
- 7. The system of claim 2 wherein the noise optimal matched filter detection processor comprises a multiple pixel threshold exceedence detection processor.
- 8. A spinning strip aperture radiometer sensor system comprising:a telescope having a focal plane and comprised of a rotating strip aperture that rotates around an optical axis and that produces temporally sequential images of a scene; a two dimensional detector array for detecting images formed at the focal plane of the telescope; a signal processor coupled to the two-dimensional detector array for recording a plurality of the temporally sequential images of the scene as the strip aperture rotates around the optical axis of the telescope, for synthesizing images from the plurality of temporally sequential images of the scene, and for simultaneously detecting moving objects found in the temporally sequential images of the scene; wherein the signal processor additionally comprises: means for correcting line of sight errors in the plurality of temporally sequential images of the scene to form a plurality of spatially registered image frames; means for spatially Fourier transforming the plurality of spatially registered image frames to a common spatial frequency grid to form spatially transformed frames; means for filtering individual spatially transformed frames using a system transfer function equalization filter to form equalized frames; means for differencing distinct pairs of equalized frames to form difference frames; means for filtering the difference frames using a non-common passband rejection filter; means for inverse transforming the filtered difference frames to provide a moving object signature in the spatial domain; means for processing the moving object signature in the spatial domain through a matched filter array having a plurality of outputs; and means for comparing each of the plurality of outputs of the matched filter array to a corresponding predetermined threshold and declaring a moving object detection when at least one of the outputs of the matched filter array exceeds the corresponding predetermined threshold.
- 9. The system of claim 8 wherein the matched filter array uses a priori knowledge of object phenomenology, signal and noise measurement statistics, and unique measurement characteristics of the spinning strip aperture radiometer sensor system.
- 10. The system of claim 8 wherein the matched filter array comprises template matched filters.
- 11. The system of claim 8 wherein the matched filter array comprises streak detectors.
- 12. The system of claim 8 wherein the matched filter array comprises a single pixel threshold exceedence detection processor.
- 13. The system of claim 8 wherein the matched filter array comprises a multiple pixel threshold exceedence detection processor.
- 14. An object detection method for use in a spinning strip aperture radiometer sensor system comprising:rotating a strip aperture about an optical axis of an imaging telescope having a focal plane; detecting and recording a plurality of the temporally sequential two-dimensional image frames formed at the focal plane as the strip aperture rotates around the optical axis of the telescope; synthesizing images from the plurality of recorded temporally sequential two-dimensional image frames; correcting line of sight errors in at least two of the recorded temporally sequential two-dimensional image frames to form line of sight corrected frames; spatially Fourier transforming the line of sight corrected frames to form spatially transformed frames; filtering individual spatially transformed frames using a system transfer function equalization filter to form equalized frames; differencing distinct pairs of equalized frames to form difference frames; filtering the difference frames using a non-common passband rejection filter; summing the filtered difference frames to form a total summed spectrum of filtered differences; inverse transforming the total summed spectrum of filtered differences to provide a moving object signature in the spatial domain; and processing the moving object signature through a matched filter array having a plurality of outputs; and declaring a moving object detection when at least one of the plurality of outputs of the matched filter array exceeds a predetermined threshold.
- 15. An object detection method for use in a spinning strip aperture radiometer sensor system comprising:rotating a strip aperture about an optical axis of an imaging telescope having a focal plane; detecting and recording a plurality of temporally sequential two-dimensional image frames formed at the focal plane as the strip aperture rotates around the optical axis of the telescope; synthesizing images from the plurality of recorded temporally sequential two-dimensional image frames; correcting line of sight errors in at least two of the recorded temporally sequential two-dimensional frames to form line of sight corrected frames; spatially Fourier transforming the line of sight corrected frames to form spatially transformed frames; filtering individual spatially transformed frames using a system transfer function equalization filter to form equalized frames; differencing distinct pairs of equalized frames to form difference frames; filtering the difference frames using a non-common passband rejection filter; inverse transforming the filtered difference frames to provide a moving object signature in the spatial domain; and processing the moving object signature in the spatial domain through a matched filter array having a plurality of outputs; and declaring a moving object detection when at least one of the plurality of outputs of the matched filter array exceeds a predetermined threshold.
- 16. An object detection method for use in a spinning strip aperture radiometer sensor system comprising:rotating a strip aperture about an optical axis of an imaging telescope having a focal plane; detecting and recording a plurality of temporally sequential two-dimensional image frames formed at the focal plane as the strip aperture rotates around the optical axis of the telescope; synthesizing images from the plurality of recorded temporally sequential two-dimensional image frames; correcting line of sight errors in a temporal sequence of the recorded temporally sequential two-dimensional image frames to form a plurality of spatially registered frames; spatially Fourier transforming the plurality of spatially registered frames to a common spatial frequency grid to form a plurality of spatially transformed frames; temporally Fourier transforming the plurality of spatially transformed frames to produce a temporal and spatial frequency transformed data set; and processing the temporal and spatial frequency transformed data set using a noise optimal matched filter detection processor to detect moving objects.
- 17. The object detection method of claim 16 wherein the noise optimal matched filter detection processor uses a priori knowledge of object phenomenology, signal and noise measurement statistics, and unique measurement characteristics of the spinning strip aperture radiometer sensor system.
- 18. The object detection method of claim 16 wherein the step of processing the temporal and spatial frequency transformed data set comprises the step of processing the temporal and spatial frequency transformed data set using template matched filters.
- 19. The object detection method of claim 16 wherein the step of processing the temporal and spatial frequency transformed data set comprises the step of processing the temporal and spatial frequency transformed data set using streak detectors.
- 20. The object detection method of claim 16 wherein the step of processing the temporal and spatial frequency transformed data set comprises the step of processing the temporal and spatial frequency transformed data set using a single pixel threshold exceedence detection processor.
- 21. The object detection method of claim 16 wherein the step of processing the temporal and spatial frequency transformed data set comprises the step of processing the temporal and spatial frequency transformed data set using a multiple pixel threshold exceedence detection processor.