Event multiplexer for managing the capture of images

Information

  • Patent Grant
  • Patent Number
    8,520,079
  • Date Filed
    Thursday, February 14, 2008
  • Date Issued
    Tuesday, August 27, 2013
Abstract
An image capture system for capturing images of an object, such as the Earth. The image capture system includes a moving platform, at least two image capture devices, a position system, an event multiplexer and a computer system. The image capture devices are mounted to the moving platform. Each image capture device has a sensor for capturing an image and an event channel providing an event signal indicating the capturing of an image by the sensor. The position system records data indicative of a position as a function of time related to the moving platform. The event multiplexer has at least two image capture inputs and at least one output port. Each image capture input receives event signals from the event channel of one of the image capture devices. The event multiplexer outputs information indicative of an order of events indicated by the event signals, and identification of image capture devices providing the event signals. The computer system receives and stores the information indicative of the order of events indicated by the event signals, and identification of image capture devices providing the event signals.
Description
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH AND DEVELOPMENT

Not Applicable.


BACKGROUND OF THE INVENTION

As background, in the remote sensing/aerial imaging industry, imagery is used to capture views of a geographic area, to measure objects and structures within the images, and to determine geographic locations of points within the image. Such images are generally referred to as “geo-referenced images” and come in two basic categories:


Captured Imagery—these images appear just as they were captured by the camera or sensor employed.


Projected Imagery—these images have been processed and converted such that they conform to a mathematical projection.


All imagery starts as captured imagery, but as most software cannot geo-reference captured imagery, that imagery is then reprocessed to create the projected imagery. The most common form of projected imagery is the ortho-rectified image. This process aligns the image to an orthogonal or rectilinear grid (composed of rectangles). The input image used to create an ortho-rectified image is a nadir image—that is, an image captured with the camera pointing straight down. It is often quite desirable to combine multiple images into a larger composite image such that the image covers a larger geographic area on the ground. The most common form of this composite image is the “ortho-mosaic image” which is an image created from a series of overlapping or adjacent nadir images that are mathematically combined into a single ortho-rectified image.


When creating an ortho-mosaic, this same ortho-rectification process is used, however, instead of using only a single input nadir image, a collection of overlapping or adjacent nadir images are used and they are combined to form a single composite ortho-rectified image known as an ortho-mosaic. In general, the ortho-mosaic process entails the following steps:


A rectilinear grid is created, which results in an ortho-mosaic image where every grid pixel covers the same amount of area on the ground.


The location of each grid pixel is determined from the mathematical definition of the grid. Generally, this means the grid is given an X and Y starting or origin location and an X and Y size for the grid pixels. Thus, the location of any pixel is simply the origin location plus the number of pixels times the size of each pixel. In mathematical terms: X_pixel = X_origin + (X_size × Column_pixel) and Y_pixel = Y_origin + (Y_size × Row_pixel).
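
As a minimal sketch of that arithmetic (the origin, pixel size, and coordinate values below are hypothetical examples, not taken from this patent):

```python
# Minimal sketch of the grid-pixel arithmetic described above; all numeric
# values are hypothetical examples.

def grid_pixel_location(x_origin, y_origin, x_size, y_size, column, row):
    """Ground (X, Y) location of a grid pixel:
    X_pixel = X_origin + X_size * Column_pixel
    Y_pixel = Y_origin + Y_size * Row_pixel
    """
    return x_origin + x_size * column, y_origin + y_size * row

# Example: a grid whose origin is at (450000.0, 4510000.0) with 0.5-unit pixels.
print(grid_pixel_location(450000.0, 4510000.0, 0.5, 0.5, column=1200, row=800))
# -> (450600.0, 4510400.0)
```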


The available nadir images are checked to see if they cover the same point on the ground as the grid pixel being filled. If so, a mathematical formula is used to determine where that point on the ground projects up onto the camera's pixel image map and that resulting pixel value is then transferred to the grid pixel.
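
That fill step can be pictured roughly as the loop below. This is only a control-flow sketch: footprint_contains, project_to_image and the other names are hypothetical stand-ins for the footprint test and the camera projection math, which depend on each image's orientation parameters and are not specified here.

```python
# Rough control-flow sketch of filling one ortho-mosaic grid pixel from the
# available nadir images. All method names are hypothetical stand-ins.

def fill_grid_pixel(ground_x, ground_y, nadir_images):
    for image in nadir_images:
        # Skip images whose ground footprint does not cover this grid pixel.
        if not image.footprint_contains(ground_x, ground_y):
            continue
        # Project the ground point up onto the camera's pixel image map.
        col, row = image.project_to_image(ground_x, ground_y)
        if image.contains_pixel(col, row):
            # Transfer the source pixel value to the grid pixel.
            return image.pixel_value(col, row)
    return None  # no available image covers this ground point
```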


Because the rectilinear grids used for the ortho-mosaic are generally the same grids used for creating maps, the ortho-mosaic images bear a striking similarity to maps and as such, are generally very easy to use from a direction and orientation standpoint.


In producing the geo-referenced aerial images, hardware and software systems designed for georeferencing airborne sensor data exist and are identified herein as a “POS”, i.e., a position and orientation system. For example, a system produced by Applanix Corporation of Richmond Hill, Ontario, Canada and sold under the trademark “POS AV” provides a hardware and software system for directly georeferencing sensor data. Direct Georeferencing is the direct measurement of sensor position and orientation (also known as the exterior orientation parameters), without the need for additional ground information over the project area. These parameters allow data from the airborne sensor to be georeferenced to the Earth or local mapping frame. Examples of airborne sensors include: aerial cameras (digital or film-based), multi-spectral or hyper-spectral scanners, SAR, or LIDAR.


The POS system, such as the POS AV system, was mounted on a moving platform, such as an airplane, such that the airborne sensor was pointed toward the Earth. The positioning system received position signals from a satellite constellation and also received time signals from an accurate clock. The sensor was controlled by a computer running flight management software to take images. Signals indicative of the taking of an image were sent from the sensor to the positioning system to record the time and position where the image was taken.


However, the prior POS systems included only one or two ports for recording the time of sensor capture, and could therefore record the time and position of only one or two independent sensors. The industry standard method for solving this problem is to slave one or more cameras to another camera and then to actuate all of the cameras simultaneously. However, this does not permit independent actuation of each of the cameras.


If more than two independent sensors are to be used with the prior positioning system, then a separate system for recording the time and position of sensor capture must be developed. It is to such a system for recording the time of sensor capture that the present invention is directed.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING


FIG. 1 is a perspective view of an exemplary image capture system constructed in accordance with the present invention.



FIG. 2 is a perspective view of another example of an image capture system constructed in accordance with the present invention.



FIG. 3 is a perspective view of yet another example of an image capture system constructed in accordance with the present invention.



FIG. 4 is a block diagram of the image capture system depicted in FIG. 1.



FIG. 5 is a block diagram of one version of an event multiplexer system constructed in accordance with the present invention.



FIG. 6 is a diagrammatic view of a timing/logic flow of an event multiplexer constructed in accordance with the present invention.



FIG. 7 is a block diagram of another version of an event multiplexer system constructed in accordance with the present invention.



FIG. 8 is a block diagram of yet another version of an event multiplexer system constructed in accordance with the present invention.



FIG. 9 is a block diagram of another version of an image capture system constructed in accordance with the present invention.





DETAILED DESCRIPTION OF THE INVENTION

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction, experiments, exemplary data, and/or the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for purposes of description and should not be regarded as limiting.


Referring to the drawings, and in particular to FIGS. 1, 2 and 3, shown therein and designated by a reference numeral 10 is an image capture system constructed in accordance with the present invention. The image capture system 10 is typically used for capturing aerial images as shown in FIGS. 1 and 2. However, while the image capture system 10 is extremely useful for aerial imaging, it has numerous other applications—such as when a system has more external triggers than inputs on a device that must react to the external triggers. For instance, as shown in FIG. 3, a municipality might have an intersection with a high occurrence of speeding. In this case, the municipality might wish to install a speed monitoring device, such as a radar gun, combined with multiple independently controlled image capture devices 14 to precisely link the time of image capture to the time of radar reading.


The images can be oblique images, orthogonal images, or nadir images, or combinations thereof.


As shown in FIG. 4, the image capture system 10 is provided with one or more image capture devices 14, one or more monitoring systems 16, one or more event multiplexer systems 18, and one or more data storage units or computer systems 20. In the examples depicted in FIGS. 1-3, the image capture system 10 is provided with four image capture devices 14 mounted in a sweep pattern (FIG. 1); five image capture devices 14 mounted in a 360° pattern having image capture devices 14 pointing fore, aft, port, starboard and straight down (FIG. 2); and four image capture devices 14 mounted in separate directions generally aligned with respective parts of streets (FIG. 3).


In certain embodiments depicted in FIGS. 1 and 2, the image capture devices 14, the one or more monitoring systems 16, the one or more event multiplexer systems 18 and the computer system 20 are mounted to a moving platform 21. The moving platform 21 can be any type of device or system that can move through space in a predetermined or random manner. Typically, the moving platform 21 is a manned airplane, but it should be understood that the moving platform 21 can be implemented in other manners. For example, the moving platform 21 can be implemented as an unmanned airplane, a train, an automobile such as a van, a boat, a four-wheeler, a motorcycle, a tractor, a robotic device or the like.


The image capture devices 14 are mounted to the moving platform 21, and once mounted are typically calibrated so that the exact position and orientation of the image capture devices 14 are known with respect to at least a portion of the moving platform 21. For example, as shown in FIGS. 1 and 2, the image capture devices 14 can be mounted onto a common substrate 22 and calibrated with respect to the substrate 22. It should be noted that the cables, wires or other signal paths connecting the image capture devices 14, monitoring system 16, event multiplexer 18 and computer system 20 are not shown in FIGS. 1-3 for purposes of clarity. The substrate 22 having the image capture devices 14 mounted thereto is then mounted to the moving platform 21. In the embodiment depicted in FIG. 1, the image capture devices 14 are mounted internally to the moving platform 21 and the moving platform 21 has one or more openings 23 for the image capture devices 14 to sense data through. In other embodiments, one or more of the image capture devices 14 can be mounted externally to the moving platform 21. For example, in FIG. 2 the image capture devices 14 are mounted to an under-wing pod. Alternatively, the image capture devices 14 can be mounted to a roof rack of a car.


Each of the image capture devices 14 has a sensor (not shown) for capturing sensor data, such as an image. Each of the image capture devices 14 is also provided with an event channel 26 providing an event signal indicating the capturing of an image by the sensor. The event channel 26 can be any device that provides a signal coincident with the capturing of the image, such as a flash output. The sensor can capture the image in an analog manner, digital manner, or on film. Further, it should be understood that the image can be stored electronically, optically, or provided on a film-based medium.


The event multiplexer system 18 has at least one image capture input 28 and at least one output port 30. In a preferred embodiment the event multiplexer system 18 has at least two image capture inputs 28. Each image capture input 28 receives signals from the event channel 26 of one of the image capture devices 14. The event multiplexer system 18 outputs event signals indicative of an order of events indicated by the signals provided by the image capture devices 14, and an identification (CID) of image capture devices 14 providing the event signals.


The monitoring system 16 records data indicative of the capturing of the images. For example, the monitoring system 16 can record position data as a function of time, time data and/or orientation data. In the embodiments depicted in FIGS. 1 and 2, the monitoring system 16 records position data as a function of time, as well as time data and/or orientation data related to the moving platform 21. In the embodiment depicted in FIG. 3, the monitoring system 16 records time data. Preferably, the monitoring system 16 automatically and continuously reads and/or records the data. However, it should be understood that the monitoring system 16 can read and/or record the data in other manners, such as on a periodic basis, or upon receipt of a signal to actuate the monitoring system 16 to obtain and record the data. For example, the event signals produced by the event multiplexer system 18 can be provided to the monitoring system 16 to cause the monitoring system 16 to read and/or record the data indicative of position as a function of time related to the moving platform 21.


In the embodiments depicted in FIGS. 1 and 2, the monitoring system 16 also includes a satellite receiver 34 typically receiving position and timing signals from a satellite constellation 36, using any appropriate protocol, such as GPS or LORAN, although other types of position determining systems can be used, such as cell phone triangulation, e.g., using the Wireless Application Protocol (WAP).


The computer system 20 receives and stores (preferably in the database 38) the information indicative of the order of events indicated by the event signals, and identification of image capture devices 14 providing the event signals. The computer system 20 optionally also receives and stores the images (preferably in the database 38) generated by the image capture devices 14. The monitoring system 16 records the data indicative of the capturing of images by storing it internally, outputting it to the computer system 20, or outputting such data in any other suitable manner, such as storing such data on an external magnetic or optical storage system. The position related to the moving platform 21 can be provided in any suitable coordinate system, such as an X, Y, Z coordinate system, or a WGS1984 latitude/longitude coordinate system.


Further, the image capture system 10 can be provided with an orientation system, such as an inertial measurement unit 40 for capturing other types of information with respect to the moving platform 21, such as the orientation of the moving platform 21. The inertial measurement unit 40 can be provided with a variety of sensors, such as accelerometers (not shown) for determining the roll, pitch and yaw related to the moving platform 21. Further, it should be understood that the position and/or orientation information does not necessarily have to be a position and/or orientation of the moving platform 21. The position and orientation information is simply related to the moving platform 21, i.e. the position and/or orientation of the moving platform 21 should be able to be determined by the information recorded by the monitoring system 16. For example, the position and orientation information can be provided for a device connected to the moving platform 21. Then, the position and orientation for each image capture device can be determined based upon their known locations relative to the moving platform 21.
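
For illustration, if the monitoring system 16 records the position and orientation of a reference point on the moving platform 21, the position of each image capture device 14 can be recovered afterward from its calibrated mounting offset. The sketch below assumes a simple rotation-plus-lever-arm model with hypothetical calibration values; it illustrates the idea and is not a method taken from the patent.

```python
import numpy as np

# Sketch: recover a camera's position from the recorded platform position and
# orientation plus a calibrated lever arm. All values are hypothetical.

def rotation_matrix(roll, pitch, yaw):
    """Body-to-mapping-frame rotation built from roll, pitch and yaw (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    r_roll = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    r_pitch = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    r_yaw = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return r_yaw @ r_pitch @ r_roll

def camera_position(platform_position, roll, pitch, yaw, lever_arm):
    """Platform position plus the lever arm rotated into the mapping frame."""
    return platform_position + rotation_matrix(roll, pitch, yaw) @ lever_arm

# Example: a camera mounted 1.2 units forward and 0.4 units below the reference.
print(camera_position(np.array([450000.0, 4510000.0, 1500.0]),
                      roll=0.01, pitch=-0.02, yaw=1.57,
                      lever_arm=np.array([1.2, 0.0, -0.4])))
```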


Shown in FIG. 5 is a block diagram of an exemplary event multiplexer system 18 constructed in accordance with the present invention. The event multiplexer system 18 includes an event multiplexer 46 and a computer interface 48. The event multiplexer 46 preferably includes one or more quasi Xor gates 50 and a memory device 54. The quasi Xor gate 50 has a plurality of inputs 56 forming the image capture inputs, and an output 58 forming the at least one output port 30. The memory device 54 receives data from the image capture inputs 28 and also receives an event signal from the quasi Xor gate 50 to record the state of the image capture inputs 28 when the quasi Xor gate 50 determines an event has taken or is taking place. The memory device 54 can optionally also record the identity of the image capture devices 14 providing the event signals. The quasi Xor gate 50 functions as an exclusive-or gate with the exception that the quasi Xor gate 50 produces an event signal at its output when two or more signals occur simultaneously on the image capture inputs 28.
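
A small software model can make the interplay between the quasi Xor gate 50 and the memory device 54 concrete. The sketch below is only an illustration (the actual parts are hardware, and the record format is hypothetical): whenever one or more image capture inputs are asserted, an event is signalled and the state of the inputs, which identifies the firing device or devices, is pushed into a FIFO. Edge detection and the deferral behavior of FIG. 6 are ignored here; the timing behavior is sketched separately below.

```python
from collections import deque

# Illustrative software model of the event multiplexer 46: a quasi-XOR style
# event detector feeding a FIFO memory. The record format is hypothetical.

class EventMultiplexerModel:
    def __init__(self, num_inputs):
        self.num_inputs = num_inputs
        self.fifo = deque()   # memory device: event records in sequential order
        self.order = 0

    def sample(self, input_states):
        """Process one sample of the image capture inputs.

        `input_states` holds one boolean per image capture input, True meaning
        that device's event channel is asserted. Unlike a true exclusive-or,
        two or more simultaneous assertions still produce a single event.
        """
        if any(input_states):
            self.order += 1
            devices = [i + 1 for i, asserted in enumerate(input_states) if asserted]
            # Record the order of the event and which devices produced it (CID).
            self.fifo.append({"order": self.order, "devices": devices})
            return True    # event signal appears on the output port
        return False

mux = EventMultiplexerModel(num_inputs=4)
mux.sample([True, False, False, True])    # devices 1 and 4 fire together
mux.sample([False, True, False, False])   # device 2 fires next
print(list(mux.fifo))
```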


The memory device 54 can be formed by a first-in-first-out register or any other type of device capable of optionally recording the identity of the image capture device 14 providing the signal, and a sequential order of events based upon the order in which the signals are received.


The computer interface 48 is connected to the memory device 54 to permit the computer system 20 to read the data in the memory device 54. The computer interface 48 can be a device capable of reading the information from the memory device 54 and providing same to the computer system 20. For example, the computer interface 48 can be a serial port, parallel port or a USB port.


For example, FIG. 6 illustrates a diagrammatic view of a timing/logic flow of the event multiplexer 46. In this example, four image capture devices 14 (indicated as C1, C2, C3 and C4 for purposes of clarity) are connected to the quasi Xor gate 50. The image capture devices C1 and C4 provide a signal simultaneously at time T0; the image capture device C2 provides a signal at T1; and the image capture device C3 provides a signal at T2. The quasi Xor gate 50 outputs an event signal at time T0 on the port 30 because the signals provided by the image capture devices C1 and C4 have a simultaneous leading edge. The quasi Xor gate 50 also outputs an event signal at time T1 based on the signal received from the image capture device C2. Although the quasi Xor gate 50 received the signal from the image capture device C3 at time T2, the quasi Xor gate 50 delays providing an event signal until the lagging edge of the signal provided by the image capture device C2. The quasi Xor gate 50 can be constructed using any suitable logic device(s) capable of providing the functions described above. For example, the quasi Xor gate 50 can be constructed using a microprocessor, digital signal processor, microcontroller, field programmable gate array or a combination of logic gates.
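
The timing behavior of this example can be modeled in a few lines. In the sketch below, each channel is represented simply by the set of sample times at which it is high (a hypothetical representation): simultaneous leading edges share one event, and a leading edge that arrives while another channel is still high is deferred until no channel remains high, approximating the lagging-edge behavior described for the image capture device C3.

```python
# Illustrative model of the FIG. 6 timing: events are produced on leading
# edges; a leading edge that arrives while another channel is still high is
# deferred until the blocking signal ends. The signal encoding is hypothetical.

def simulate_quasi_xor(signals, times):
    """`signals` maps a channel name to the set of sample times it is high."""
    previous = {name: False for name in signals}
    pending = []   # deferred channels awaiting the lagging edge
    events = []    # (time, channels) in the order events are signalled
    for t in times:
        current = {name: (t in highs) for name, highs in signals.items()}
        rising = [n for n in signals if current[n] and not previous[n]]
        blocked = [n for n in signals if previous[n] and current[n] and n not in rising]
        if rising:
            if blocked:
                pending.extend(rising)       # another channel is still high
            else:
                events.append((t, rising))   # simultaneous edges share one event
        if pending and not any(current.values()):
            events.append((t, pending[:]))   # lagging edge reached: emit now
            pending.clear()
        previous = current
    return events

# C1 and C4 fire at T0, C2 at T1 (held through T3), C3 at T2 while C2 is high.
signals = {"C1": {0}, "C2": {1, 2, 3}, "C3": {2}, "C4": {0}}
print(simulate_quasi_xor(signals, times=range(5)))
# -> [(0, ['C1', 'C4']), (1, ['C2']), (4, ['C3'])]
```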


Shown in FIG. 7 is a block diagram of another version of an event multiplexer system 18a constructed in accordance with the present invention. The event multiplexer system 18a is similar in construction and function to the event multiplexer system depicted in FIG. 5 and described above, with the exception that the event multiplexer system 18a includes at least two event multiplexers 46a and 46b (designated by reference numerals X and Y) and a computer interface 48a. The computer interface 48a and the event multiplexers 46a and 46b can be constructed in a similar manner as the computer interface 48 and event multiplexer 46 described above. The event multiplexer 46a receives signals from the event channels of a subset of the image capture devices (in this case image capture devices designated by C1 and C2), and the event multiplexer 46b receives signals from the event channels of a different subset of the image capture devices (in this case image capture devices designated by C3, C4 and C5). Thus, the image capture devices C1-C5, for example, can be divided into a variety of different subsets having different numbers of image capture devices C1-C5. For example, image capture devices C1-C3 can be connected to the event multiplexer 46a and the image capture devices C4 and C5 can be connected to the event multiplexer 46b.


Each of the event multiplexers 46a and 46b provide event signals indicative of an order of events based upon the respective signals received from the image capture devices C1-C5. The event signals provided by the event multiplexers 46a and 46b can be provided to different ports of the monitoring system 16, or to different monitoring systems for causing the monitoring system(s) 16 to read and/or record the data indicative of the position as a function of time related to the moving platform 21.


The computer interface 48a communicates with the event multiplexers 46a and 46b to permit the computer system 20 to read the data from the memory devices and store it on a computer readable medium accessible by the computer system 20. The computer interface 48a can be a device capable of reading the information from the memory device(s) and providing same to the computer system 20. For example, the computer interface 48a can be a serial port, parallel port or a USB port.


Shown in FIG. 8 is a block diagram of yet another version of an event multiplexer system 18b constructed in accordance with the present invention. The event multiplexer system 18b is similar in construction and function to the event multiplexer system depicted in FIG. 7 and described above, with the exception that the event multiplexer system 18b includes a multiplexer director 70 receiving signals from the event channels of all or a subset of the image capture devices C1-Cn and directing each signal to one or more of at least two event multiplexers (designated by reference numerals 46a and 46b). The multiplexer director 70 provides an image capture device identification signal CID to the computer interface 48a for passing on to the computer system 20. The event multiplexers 46a and 46b receive the signals from the multiplexer director 70 and, in response thereto, provide event signals to the monitoring system 16.


The computer interface 48a and the event multiplexers 46a and 46b can be constructed in a similar manner as the computer interface 48 and event multiplexer 46 described above. Each of the event multiplexers 46a and 46b provide event signals indicative of an order of events based upon the respective signals received from the image capture devices C1-C5 via the multiplexer director 70. The event signals provided by the event multiplexers 46a and 46b can be provided to different ports of the monitoring system 16, or to different monitoring systems 16 for causing the monitoring system(s) 16 to read and/or record the data indicative of the position as a function of time related to the moving platform 21.


The computer interface 48a communicates with the event multiplexers 46a and 46b to permit the computer system 20 to read the data indicative of the order of events E(t) from the respective memory devices and store such data E(t) on a computer readable medium accessible by the computer system 20.


Shown in FIG. 9 and designated by a reference numeral 10a is a block diagram of another image capture system constructed in accordance with the present invention. The image capture system 10a is similar to the image capture system 10 except as described below. In the image capture system 10 (FIG. 4), the event multiplexer system 18, 18a or 18b records the order of the signals from the image capture devices 14 (relative timing) correlated with an identification of the image capture device 14 producing each signal. The monitoring system 16 records an absolute timing of events that is then correlated to the relative timing. In the image capture system 10a, an event multiplexer system 18c receives a time sync signal T from an external device such as the monitoring system 16a and records the signals from the image capture devices 14 as a function of time, and typically an identification of the image capture device 14 producing each signal. The computer system 20a receives the signals T, E(t) and CID and correlates them with the position as a function of time information P(t) recorded by the monitoring system 16a.
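
Conceptually, that correlation amounts to looking up, or interpolating, the recorded position track P(t) at each event's time stamp and attaching the corresponding device identification CID. A minimal sketch, with hypothetical record formats and simple linear interpolation between position fixes:

```python
from bisect import bisect_left

# Sketch of correlating event records (time, CID) with the recorded position
# track P(t). Record formats are hypothetical; positions are linearly
# interpolated between the two nearest fixes.

def interpolate_position(track, t):
    """`track` is a time-sorted list of (time, (x, y, z)) position fixes."""
    times = [fix_time for fix_time, _ in track]
    i = bisect_left(times, t)
    if i == 0:
        return track[0][1]
    if i == len(track):
        return track[-1][1]
    (t0, p0), (t1, p1) = track[i - 1], track[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

def correlate_events(events, track):
    """`events` is a list of (time, camera_id) records from the multiplexer."""
    return [(cid, t, interpolate_position(track, t)) for t, cid in events]

track = [(100.0, (450000.0, 4510000.0, 1500.0)),
         (101.0, (450060.0, 4510005.0, 1501.0))]
events = [(100.25, "C1"), (100.25, "C4"), (100.70, "C2")]
print(correlate_events(events, track))
```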


Thus, the event multiplexer systems 18, 18a and 18b record the order of events and optionally record camera identification information. The event multiplexer systems 18, 18a and 18b utilize an external timing device (i.e., the monitoring system 16 or the computer system 20) to correlate each event with time and thus location. The event multiplexer system 18c receives and records absolute time in addition to the order of events and the identification of the image capture devices 14. Thus, “order of events” could include relative or absolute time.


The image capture system 10a is provided with one or more image capture devices 14a, one or more monitoring systems 16a, one or more event multiplexer systems 18c, one or more computer systems 20a, and one or more inertial measurement units 40a.


The image capture devices 14a, the monitoring system 16a and the inertial measurement unit 40a are similar in construction and function to the image capture devices 14, monitoring system 16 and inertial measurement unit 40 described above.


The event multiplexer system 18c is similar in construction and function to the event multiplexer systems 18, 18a and 18b described above, with the exception that the event multiplexer system 18c receives absolute time signals, such as a pulse per second (PPS) signal and a time sync signal, from the monitoring system 16a (or a separate time receiver) and records the absolute time of each instance in which a signal is received from one of the image capture devices 14a. The time signal is preferably an accurate signal received from a satellite constellation or an atomic clock. In addition, the absolute time of each event (T, E(t)), and an identification of the image capture device 14a for each event (CID), is passed to the computer system 20a.


The monitoring system 16a passes signals to the computer system 20a indicative of the position of the moving platform 21 as a function of time, and the image capture devices 14a pass the captured images to the computer system 20a. Thus, the computer system 20a is provided with and stores the information for correlating the data indicative of position as a function of time with particular captured images.


The event multiplexer system 18c can also optionally provide event signals e(t) to the monitoring system 16a in a similar manner as the event multiplexer systems 18, 18a and 18b described above, for purposes of redundancy, and/or verification of the data. The event signals e(t) can also optionally be provided to the computer system 20a from the monitoring system 16a and in this instance the computer system 20a optionally compares e(t) with E(t) for calibration, diagnostic or feedback purposes to make sure that the time sync signal is correct.


In using the systems depicted in FIGS. 1 and 2, the image capture devices 14 or 14a are mounted on the moving platform 21, such as an airplane, such that the image capture devices 14 or 14a are pointed toward an object, such as the Earth. The moving platform 21 is then actuated to move, and the image capture devices 14 or 14a capture images at pre-determined or random times or positions. Typically, the image capture devices 14 or 14a will be independently controlled by flight management software running on the computer system 20 or 20a, and the taking of the images will be pre-determined. In any event, as the image capture devices 14 or 14a capture the images, signals are passed to the event multiplexer systems 18, 18a, 18b or 18c, and the order of events (relative or absolute), the image capture device identification and the position as a function of time data are logged and stored by the cooperation of the event multiplexer systems 18, 18a, 18b or 18c, monitoring systems 16 or 16a and computer systems 20 or 20a as described above. Then, the images are geo-referenced as described in the Background of the Invention section above, utilizing the recorded data regarding the order of events (relative or absolute), the image capture device identification and the position as a function of time data.


In using the system depicted in FIG. 3, the image capture devices 14 or 14a are mounted adjacent to the intersection. For example, the image capture devices 14 or 14a can be mounted to separate traffic light poles such that the image capture devices 14 or 14a are pointed at the streets entering or leaving the intersection. The system depicted in FIG. 3 also includes a radar gun pointing at the intersection to sense the speed of cars moving through the intersection. When a car speeds through the intersection, one or more of the image capture devices 14 or 14a can be actuated (preferably by a computer controlled management system) to preferably take a picture of the driver and tag of the car, while the event multiplexer systems 18, 18a, 18b or 18c capture data such as time data correlated with the data produced by the radar gun. This precisely links the time of image capture (or other sensor data) to the time of radar reading to provide evidence of the identity of the speeding car and driver in this example.


It will be understood from the foregoing description that various modifications and changes may be made in the preferred and alternative embodiments of the present invention without departing from its true spirit.


This description is intended for purposes of illustration only and should not be construed in a limiting sense. The scope of this invention should be determined only by the language of the claims that follow. The term “comprising” within the claims is intended to mean “including at least” such that the recited listing of elements in a claim are an open group. “A,” “an” and other singular terms are intended to include the plural forms thereof unless specifically excluded.

Claims
  • 1. An image capture system for capturing images of an object, the image capture system comprising: a moving platform; at least three image capture devices mounted to the moving platform, each image capture device having a sensor for capturing an image and an event channel providing an event signal indicating the capturing of an image by the sensor; a monitoring system recording data indicative of a position and orientation as a function of time related to the moving platform, the monitoring system having fewer input ports for initiating the recording of the data indicative of the position and orientation as a function of time than the at least three image capture devices; an event multiplexer multiplexing at least three image capture inputs to at least one output port, the event multiplexer having more image capture inputs than the input ports of the monitoring system, wherein the event channels of the at least three image capture devices are connected to the at least three image capture inputs, the event multiplexer outputting information from the image capture devices through the at least one output port to the monitoring system for correlating the data indicative of position and orientation as a function of time with particular captured images; and a computer system independently controlling the at least three image capture devices and receiving and storing the information.
  • 2. The image capture system of claim 1, wherein the information stored by the computer system includes a sequential order of events based upon the order in which the event signals are received by the event multiplexer.
  • 3. The image capture system of claim 1, wherein the event multiplexer comprises a first-in-first-out register storing the event signals.
  • 4. The image capture system of claim 1, wherein the event multiplexer comprises a quasi Xor gate having inputs forming the image capture inputs, and an output forming one of the at least one output port.
  • 5. The image capture system of claim 1, wherein the monitoring system includes an input port receiving event signals from the output port of the event multiplexer, and wherein the monitoring system records data indicative of the time at which the event signal was received.
  • 6. The image capture system of claim 5, wherein the computer system stores the data indicative of the time at which the event signal was received by the monitoring system and a camera ID identifying the image capture device from which the event signal was received.
  • 7. The image capture system of claim 1, wherein the information output by the event multiplexer includes absolute time data.
  • 8. The image capture system of claim 1, wherein the information output by the event multiplexer includes relative time data.
  • 9. The image capture system of claim 1, wherein the computer system correlates the information indicative of the order of events indicated by the event signals, and identification of image capture devices providing the event signals to identify particular captured images.
  • 10. The image capture system of claim 1, wherein the at least three image capture devices are mounted to a common substrate.
  • 11. The image capture system of claim 1, wherein the at least three image capture devices include at least five image capture devices mounted to point fore, aft, port, starboard and straight down.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present patent application claims priority to the provisional patent application identified by U.S. Ser. No. 60/901,444, which was filed on Feb. 15, 2007, the entire content of which is hereby incorporated herein by reference.

US Referenced Citations (145)
Number Name Date Kind
2273876 Lutz et al. Feb 1942 A
3153784 Petrides et al. Oct 1964 A
3594556 Edwards Jul 1971 A
3614410 Bailey Oct 1971 A
3621326 Hobrough Nov 1971 A
3661061 Tokarz May 1972 A
3716669 Watanabe et al. Feb 1973 A
3725563 Woycechowsky Apr 1973 A
3864513 Halajian et al. Feb 1975 A
3866602 Furihata Feb 1975 A
3877799 O'Donnell Apr 1975 A
4015080 Moore-Searson Mar 1977 A
4044879 Stahl Aug 1977 A
4184711 Wakimoto Jan 1980 A
4240108 Levy Dec 1980 A
4281354 Conte Jul 1981 A
4344683 Stemme Aug 1982 A
4360876 Girault et al. Nov 1982 A
4382678 Thompson et al. May 1983 A
4387056 Stowe Jun 1983 A
4396942 Gates Aug 1983 A
4463380 Hooks Jul 1984 A
4489322 Zulch et al. Dec 1984 A
4490742 Wurtzinger Dec 1984 A
4491399 Bell Jan 1985 A
4495500 Vickers Jan 1985 A
4527055 Harkless et al. Jul 1985 A
4543603 Laures Sep 1985 A
4586138 Mullenhoff et al. Apr 1986 A
4635136 Ciampa et al. Jan 1987 A
4653136 Denison Mar 1987 A
4653316 Fukuhara Mar 1987 A
4673988 Jansson et al. Jun 1987 A
4686474 Olsen et al. Aug 1987 A
4688092 Kamel et al. Aug 1987 A
4689748 Hofmann Aug 1987 A
4707698 Constant et al. Nov 1987 A
4758850 Archdale et al. Jul 1988 A
4805033 Nishikawa Feb 1989 A
4807024 Mclaurin et al. Feb 1989 A
4814711 Olsen et al. Mar 1989 A
4814896 Heitzman et al. Mar 1989 A
4843463 Michetti Jun 1989 A
4899296 Khattak Feb 1990 A
4906198 Cosimano et al. Mar 1990 A
4953227 Katsuma et al. Aug 1990 A
4956872 Kimura Sep 1990 A
5034812 Rawlings Jul 1991 A
5086314 Aoki et al. Feb 1992 A
5121222 Endoh et al. Jun 1992 A
5138444 Hiramatsu Aug 1992 A
5155597 Lareau et al. Oct 1992 A
5164825 Kobayashi et al. Nov 1992 A
5166789 Myrick Nov 1992 A
5191174 Chang et al. Mar 1993 A
5200793 Ulich et al. Apr 1993 A
5210586 Grage et al. May 1993 A
5231435 Blakely Jul 1993 A
5247356 Ciampa Sep 1993 A
5251037 Busenberg Oct 1993 A
5265173 Griffin et al. Nov 1993 A
5267042 Tsuchiya et al. Nov 1993 A
5270756 Busenberg Dec 1993 A
5296884 Honda et al. Mar 1994 A
5335072 Tanaka et al. Aug 1994 A
5342999 Frei et al. Aug 1994 A
5345086 Bertram Sep 1994 A
5353055 Hiramatsu Oct 1994 A
5369443 Woodham Nov 1994 A
5402170 Parulski et al. Mar 1995 A
5414462 Veatch May 1995 A
5467271 Abel et al. Nov 1995 A
5481479 Wight et al. Jan 1996 A
5486948 Imai et al. Jan 1996 A
5506644 Suzuki et al. Apr 1996 A
5508736 Cooper Apr 1996 A
5555018 von Braun Sep 1996 A
5604534 Hedges et al. Feb 1997 A
5617224 Ichikawa et al. Apr 1997 A
5633946 Lachinski et al. May 1997 A
5668593 Lareau et al. Sep 1997 A
5677515 Selk et al. Oct 1997 A
5798786 Lareau et al. Aug 1998 A
5835133 Moreton et al. Nov 1998 A
5841574 Willey Nov 1998 A
5844602 Lareau et al. Dec 1998 A
5852753 Lo et al. Dec 1998 A
5894323 Kain et al. Apr 1999 A
5899945 Baylocq et al. May 1999 A
5963664 Kumar et al. Oct 1999 A
6088055 Lareau et al. Jul 2000 A
6094215 Sundahl et al. Jul 2000 A
6097854 Szeliski et al. Aug 2000 A
6108032 Hoagland Aug 2000 A
6130705 Lareau et al. Oct 2000 A
6157747 Szeliski et al. Dec 2000 A
6167300 Cherepenin et al. Dec 2000 A
6222583 Matsumura et al. Apr 2001 B1
6236886 Cherepenin et al. May 2001 B1
6256057 Mathews et al. Jul 2001 B1
6373522 Mathews et al. Apr 2002 B2
6421610 Carroll et al. Jul 2002 B1
6434280 Peleg et al. Aug 2002 B1
6597818 Kumar et al. Jul 2003 B2
6639596 Shum et al. Oct 2003 B1
6711475 Murphy Mar 2004 B2
6731329 Feist et al. May 2004 B1
6747686 Bennett Jun 2004 B1
6834128 Altunbasak et al. Dec 2004 B1
6876763 Sorek et al. Apr 2005 B2
7009638 Gruber et al. Mar 2006 B2
7018050 Ulichney et al. Mar 2006 B2
7046401 Dufaux et al. May 2006 B2
7061650 Walmsley et al. Jun 2006 B2
7065260 Zhang et al. Jun 2006 B2
7123382 Walmsley et al. Oct 2006 B2
7127348 Smitherman et al. Oct 2006 B2
7142984 Rahmes et al. Nov 2006 B2
7233691 Setterholm Jun 2007 B2
7262790 Bakewell Aug 2007 B2
7348895 Lagassey Mar 2008 B2
8078396 Meadow et al. Dec 2011 B2
20020041328 LeCompte et al. Apr 2002 A1
20020041717 Murata et al. Apr 2002 A1
20020114536 Xiong et al. Aug 2002 A1
20030014224 Guo et al. Jan 2003 A1
20030043824 Remboski et al. Mar 2003 A1
20030088362 Melero et al. May 2003 A1
20030214585 Bakewell Nov 2003 A1
20040105090 Schultz et al. Jun 2004 A1
20040167709 Smitherman et al. Aug 2004 A1
20050073241 Yamauchi et al. Apr 2005 A1
20050088251 Matsumoto Apr 2005 A1
20050169521 Hel-Or Aug 2005 A1
20060028550 Palmer et al. Feb 2006 A1
20060092043 Lagassey May 2006 A1
20060195858 Takahashi et al. Aug 2006 A1
20060238383 Kimchi et al. Oct 2006 A1
20060250515 Koseki et al. Nov 2006 A1
20070024612 Balfour Feb 2007 A1
20070046448 Smitherman Mar 2007 A1
20070237420 Steedly et al. Oct 2007 A1
20080120031 Rosenfeld et al. May 2008 A1
20080123994 Schultz et al. May 2008 A1
20080273090 Niimura Nov 2008 A1
Foreign Referenced Citations (27)
Number Date Country
331204 Jul 2006 AT
9783798 Apr 1999 AU
3874400 Sep 2000 AU
03291364 Jun 2004 AU
0316110 Sep 2005 BR
2402234 Sep 2000 CA
2505566 May 2004 CA
1735897 Feb 2006 CN
60017384 Mar 2006 DE
60306301 Nov 2006 DE
1418402 Oct 2006 DK
0424875 Oct 1990 EP
1180967 Feb 2002 EP
1418402 May 2004 EP
1696204 Aug 2006 EP
2266704 Mar 2007 ES
1088421 Nov 2006 HK
2003317089 Nov 2003 JP
2006505794 Feb 2006 JP
PA05004987 Feb 2006 MX
200503341 May 2007 SG
WO9918732 Apr 1999 WO
WO0053090 Sep 2000 WO
WO2004044692 May 2004 WO
WO2005088251 Sep 2005 WO
WO2008028040 Mar 2008 WO
WO 2008081993 Jul 2008 WO
Non-Patent Literature Citations (90)
Entry
Ackermann, Prospects of Kinematic GPS Aerial Triangulation, ITC Journal, 1992.
Ciampa, John A., “Pictometry Digital Video Mapping”, SPIE, vol. 2598, pp. 140-148, 1995.
Ciampa, J. A., Oversee, Presented at Reconstruction After Urban earthquakes, Buffalo, NY, 1989.
Dunford et al., Remote Sensing for Rural Development Planning in Africa, The Journal for the International Institute for Aerial Survey and Earth Sciences, 2:99-108, 1983.
Gagnon, P.A., Agnard, J. P., Nolette, C., & Boulianne, M., “A Micro-Computer based General Photogrammetric System”, Photogrammetric Engineering and Remote Sensing, vol. 56, No. 5., pp. 623-625, 1990.
Konecny, G., “Issues of Digital Mapping”, Leibniz University Hannover, Germany, GIS Ostrava 2008, Ostrava, Jan. 27-30, 2008, pp. 1-8.
Konecny, G., “Analytical Aerial Triangulation with Convergent Photography”, Department of Surveying Engineering, University of New Brunswick, pp. 37-57, 1966.
Konecny, G., “Interior Orientation and Convergent Photography”, Photogrammetric Engineering, pp. 625-634, 1965.
Graham, Lee A., “Airborne Video for Near-Real-Time Vegetation Mapping”, Journal of Forestry, 8:28-32, 1993.
Graham, Horita TRG-50 SMPTE Time-Code Reader, Generator, Window Inserter, 1990.
Hess, L.L, et al., “Geocoded Digital Videography for Validation of Land Cover Mapping in the Amazon Basin”, International Journal of Remote Sensing, vol. 23, No. 7, pp. 1527-1555, 2002.
Hinthorne, J., et al., “Image Processing in The Grass GIS”, Geoscience and Remote Sensing Symposium, 4:2227-2229, 1991.
Imhof, Ralph K., “Mapping from Oblique Photographs”, Manual of Photogrammetry, Chapter 18.
Jensen, John R., Introductory Digital Image Processing: A Remote Sensing Perspective, Prentice-Hall, 1986; 399 pages.
Lapine, Lewis A., “Practical Photogrammetric Control by Kinematic GPS”, GPS World, 1(3):44-49, 1990.
Lapine, Lewis A., Airborne Kinematic GPS Positioning for Photogrammetry—The Determination of the Camera Exposure Station, Silver Spring, MD, 11 pages, at least as early as 2000.
Linden et al., Airborne Video Automated Processing, US Forest Service Internal report, Fort Collins, CO, 1993.
Myhre, Dick, “Airborne Video System Users Guide”, USDA Forest Service, Forest Pest Management Applications Group, published by Management Assistance Corporation of America, 6 pages, 1992.
Myhre et al., “An Airborne Video System Developed Within Forest Pest Management—Status and Activities”, 10 pages, 1992.
Myhre et al., “Airborne Videography—A Potential Tool for Resource Managers”—Proceedings: Resource Technology 90, 2nd International Symposium on Advanced Technology in Natural Resource Management, 5 pages, 1990.
Myhre et al., Aerial Photography for Forest Pest Management, Proceedings of Second Forest Service Remote Sensing Applications Conference, Slidell, Louisiana, 153-162, 1988.
Myhre et al., “Airborne Video Technology”, Forest Pest Management/Methods Application Group, Fort Collins, CO, pp. 1-6, at least as early as Jul. 30, 2006.
Norton-Griffiths et al., 1982. “Sample surveys from light aircraft combining visual observations and very large scale color photography”. University of Arizona Remote Sensing Newsletter 82-2:1-4.
Norton-Griffiths et al., “Aerial Point Sampling for Land Use Surveys”, Journal of Biogeography, 15:149-156, 1988.
Novak, Rectification of Digital Imagery, Photogrammetric Engineering and Remote Sensing, 339-344, 1992.
Slaymaker, Dana M., “Point Sampling Surveys with GPS-logged Aerial Videography”, Gap Bulletin No. 5, University of Idaho, http://www.gap.uidaho.edu/Bulletins/5/PSSwGPS.html, 1996.
Slaymaker, et al., “Madagascar Protected Areas Mapped with GPS-logged Aerial Video and 35mm Air Photos”, Earth Observation magazine, vol. 9, No. 1, http://www.eomonline.com/Common/Archives/2000jan/00jan—tableofcontents.html, pp. 1-4, 2000.
Slaymaker, et al., “Cost-effective Determination of Biomass from Aerial Images”, Lecture Notes in Computer Science, 1737:67-76, http://portal.acm.org/citation.cfm?id=648004.743267&coll=GUIDE&dl=, 1999.
Slaymaker, et al., “A System for Real-time Generation of Geo-referenced Terrain Models”, 4232A-08, SPIE Enabling Technologies for Law Enforcement Boston, MA, ftp://vis-ftp.cs.umass.edu/Papers/schultz/spie2000.pdf, 2000.
Slaymaker, et al.,“Integrating Small Format Aerial Photography, Videography, and a Laser Profiler for Environmental Monitoring”, In ISPRS WG III/1 Workshop on Integrated Sensor Calibration and Orientation, Portland, Maine, 1999.
Slaymaker, et al., “Calculating Forest Biomass With Small Format Aerial Photography, Videography and a Profiling Laser”, In Proceedings of the 17th Biennial Workshop on Color Photography and Videography in Resource Assessment, Reno, NV, 1999.
Slaymaker et al., Mapping Deciduous Forests in Southern New England using Aerial Videography and Hyperclustered Multi-Temporal Landsat TM Imagery, Department of Forestry and Wildlife Management, University of Massachusetts.
Star et al., “Geographic Information Systems an Introduction”, Prentice-Hall, 1990.
Tomasi et al., “Shape and Motion from Image Streams: a Factorization Method”—Full Report on the Orthographic Case, pp. 9795-9802, 1992.
Warren, Fire Mapping with the Fire Mousetrap, Aviation and Fire Management, Advanced Electronics System Development Group, USDA Forest Service, 1986.
Welch, R., “Desktop Mapping with Personal Computers”, Photogrammetric Engineering and Remote Sensing, 1651-1662, 1989.
Westervelt, James, “Introduction to GRASS 4”, pp. 1-25, 1991.
“RGB Spectrum Videographics Report, vol. 4, No. 1, McDonnell Douglas Integrates RGB Spectrum Systems in Helicopter Simulators”, pp. 1-6, 1995.
RGB “Computer Wall”, RGB Spectrum, 4 pages, 1995.
“The First Scan Converter with Digital Video Output”, Introducing . . . The RGB/Videolink 1700D-1, RGB Spectrum, 2 pages, 1995.
ERDAS Field Guide, Version 7.4, A Manual for a commercial image processing system, 1990.
“Image Measurement and Aerial Photography”, Magazine for all branches of Photogrammetry and its fringe areas, Organ of the German Photogrammetry Association, Berlin-Wilmersdorf, No. 1, 1958.
“Airvideo Analysis”, MicroImages, Inc., Lincoln, NE, 1 page, Dec. 1992.
Zhu, Zhigang, Hanson, Allen R., “Mosaic-Based 3D Scene Representation and Rendering”, Image Processing, 2005, ICIP 2005, IEEE International Conference on 1(2005).
Mostafa, et al., “Direct Positioning and Orientation Systems How do they Work? What is the Attainable Accuracy?”, Proceeding, American Society of Photogrammetry and Remote Sensing Annual Meeting, St. Louis, MO, Apr. 24-27, 2001.
“POS AV” georeferenced by APPLANIX aided inertial technology, http://www.applanix.com/products/posav—index.php.
Mostafa, et al., “Ground Accuracy from Directly Georeferenced Imagery”, Published in GIM International, Vol. 14, No. 12, Dec. 2000.
Mostafa, et al., “Airborne Direct Georeferencing of Frame Imagery: An Error Budget”, The 3rd International Symposium on Mobile Mapping Technology, Cairo, Egypt, Jan. 3-5, 2001.
Mostafa, M.R. and Hutton, J., “Airborne Kinematic Positioning and Attitude Determination Without Base Stations”, Proceedings, International Symposium on Kinematic Systems in Geodesy, Geomatics, and Navigation (KIS 2001) Banff, Alberta, Canada, Jun. 4-8, 2001.
Mostafa, et al., “Airborne DGPS Without Dedicated Base Stations for Mapping Applications”, Proceedings of ION-GPS 2001, Salt Lake City, Utah, USA, Sep. 11-14.
Mostafa, “ISAT Direct Exterior Orientation QA/QC Strategy Using POS Data”, Proceedings of OEEPE Workshop: Integrated Sensor Orientation, Hanover, Germany, Sep. 17-18, 2001.
Mostafa, “Camera/IMU Boresight Calibration: New Advances and Performance Analysis”, Proceedings of the ASPRS Annual Meeting, Washington, D.C., Apr. 21-26, 2002.
Hiatt, “Sensor Integration Aids Mapping at Ground Zero”, Photogrammetric Engineering and Remote Sensing, Sep. 2002, p. 877-878.
Mostafa, “Precision Aircraft GPS Positioning Using CORS”, Photogrammetric Engineering and Remote Sensing, Nov. 2002, p. 1125-1126.
Mostafa, et al., System Performance Analysis of INS/DGPS Integrated System for Mobile Mapping System (MMS), Department of Geomatics Engineering, University of Calgary, Commission VI, WG VI/4.
Artes F., & Hutton, J., “GPS and Inertial Navigation Delivering”, Sep. 2005, GEOconnexion International Magazine, p. 52-53.
“POS AV” APPLANIX, Product Outline, airborne@applanix.com, 3 pages.
POSTrack, “Factsheet”, APPLANIX, Ontario, Canada, www.applanix.com.
POS AV “Digital Frame Camera Applications”, 3001 Inc., Brochure, 2007.
POS AV “Digital Scanner Applications”, Earthdata Brochure.
POS AV “Film Camera Applications” AeroMap Brochure.
POS AV “LIDAR Applications” MD Atlantic Brochure.
POS AV “OEM System Specifications”, 2005.
POS AV “Synthetic Aperture Radar Applications”, Overview, Orbisat Brochure.
“POSTrack V5 Specifications” 2005.
“Remote Sensing for Resource Inventory Planning and Monitoring”, Proceeding of the Second Forest Service Remote Sensing Applications Conference—Slidell, Louisiana and NSTL, Mississippi, Apr. 11-15, 1988.
“Protecting Natural Resources with Remote Sensing”, Proceeding of the Third Forest Service Remote Sensing Applications Conference—Apr. 9-13, 1990.
Heipke, et al, “Test Goals and Test Set Up for the OEEPE Test—Integrated Sensor Orientation”, 1999.
Kumar, et al., “Registration of Video to Georeferenced Imagery”, Sarnoff Corporation, CN5300, Princeton, NJ, 1998.
McConnel, Proceedings Aerial Pest Detection and Monitoring Workshop—1994.pdf, USDA Forest Service Forest Pest Management, Northern Region, Intermountain Region, Forest Insects and Diseases, Pacific Northwest Region.
“Standards for Digital Orthophotos”, National Mapping Program Technical Instructions, US Department of the Interior, Dec. 1996.
Tao, “Mobile Mapping Technology for Road Network Data Acquisition”, Journal of Geospatial Engineering, vol. 2, No. 2, pp. 1-13, 2000.
“Mobile Mapping Systems Lesson 4”, Lesson 4 SURE 382 Geographic Information Systems II, pp. 1-29, Jul. 2, 2006.
Konecny, G., “Mechanische Radialtriangulation mit Konvergentaufnahmen”, Bildmessung und Luftbildwesen, 1958, Nr. 1.
Myhre, “ASPRS/ACSM/RT 92” Technical papers, Washington, D.C., vol. 5 Resource Technology 92, Aug. 3-8, 1992.
Rattigan, “Towns get new view from above,” The Boston Globe, Sep. 5, 2002.
Mostafa, et al., “Digital image georeferencing from a multiple camera system by GPS/INS,” ISP RS Journal of Photogrammetry & Remote Sensing, 56(1): I-12, Jun. 2001.
Dillow, “Grin, or bare it, for aerial shot,” Orange County Register (California), Feb. 25, 2001.
Anonymous, “Live automatic coordinates for aerial images,” Advanced Imaging, 12(6):51, Jun. 1997.
Anonymous, “Pictometry and US Geological Survey announce—Cooperative Research and Development Agreement,” Press Release published Oct. 20, 1999.
Miller, “Digital software gives small Arlington the Big Picture,” Government Computer News State & Local, 7(12), Dec. 2001.
Garrett, “Pictometry: Aerial photography on steroids,” Law Enforcement Technology 29(7):114-116, Jul. 2002.
Weaver, “County gets an eyeful,” The Post-Standard (Syracuse, NY), May 18, 2002.
Reed, “Firm gets latitude to map O.C. in 3D,” Orange County Register (California), Sep. 27, 2000.
Reyes, “Orange County freezes ambitious aerial photography project,” Los Angeles Times, Oct. 16, 2000.
International Preliminary Report on Patentability.
Examination Report, Nov. 17, 2010.
English Translation of EP 0 424 875 A2.
International Search Report and Written Opinion, May 23, 2008.
Response to Examination Report dated May 17, 2011.
Related Publications (1)
Number Date Country
20080204570 A1 Aug 2008 US
Provisional Applications (1)
Number Date Country
60901444 Feb 2007 US