Claims
- 1. A computerized method for determining a weather condition using visible imagery comprising the steps:
receiving a plurality of sequential visible images each depicting substantially the same field of view; determining, based on the received plurality of sequential visible images, a composite visible image depicting the field of view; generating first and second edge-detected images based on one of the plurality of sequential visible images and the composite visible image, respectively; comparing the first and second edge-detected images; and determining a weather condition based on the comparison of the edge-detected images.
- 2. The method of claim 1, wherein the receiving step comprises receiving the plurality of sequential visible images from an image detection device.
- 3. The method of claim 2, wherein the image detection device comprises a digital camera.
- 4. The method of claim 1, wherein the receiving step comprises receiving the plurality of sequential visible images via a network interface.
- 5. The method of claim 1, wherein each of the plurality of sequential visible images comprises a graphical image selected from the group including: JPEG, JPEG2000, TIFF, bitmap, and GIF.
- 6. The method of claim 1, further comprising the step of converting each of the received plurality of sequential visible images.
- 7. The method of claim 1, further comprising the step of discarding each of the received plurality of sequential visible images depicting the field of view during nighttime hours.
- 8. The method of claim 1, wherein the step of determining the composite visible image comprises averaging the received plurality of sequential visible images.
- 9. The method of claim 8, wherein the averaging step comprises weighted averaging the composite visible image with a currently-received visible image of the plurality of sequential visible images.
- 10. The method of claim 1, further comprising the step of calibrating each of the received plurality of sequential visible images to correct for daily variations in lighting.
- 11. The method of claim 10, wherein the calibrating step comprises the steps:
determining a mean clear-day brightness for each of the plurality of sequential visible images received on a clear day; storing the determined mean clear-day brightness; determining a mean brightness for each of the plurality of sequential visible images; and adjusting, for each of the received plurality of sequential visible images, image intensity based on the stored mean clear-day brightness.
- 12. The method of claim 1, wherein the generating step comprises edge-detection filtering each of the currently-received visible image of the plurality of sequential visible images and the composite visible image.
- 13. The method of claim 1, further comprising the step of registering each of the received plurality of sequential visible images.
- 14. The method of claim 13, wherein the registering step comprises the steps:
comparing the first and second edge-detected images; determining that the first edge-detected image depicts substantially the same field of view as the second edge-detected image; and registering the first edge-detected image responsive to a determination that the first edge-detected image depicts substantially the same field of view as the second edge-detected image.
- 15. The method of claim 14, wherein the determining step comprises correlating the first and second edge-detected images.
- 16. The method of claim 1, wherein the comparing step comprises extracting expected edges and normalizing the extracted edges.
- 17. The method of claim 16, wherein the extracting step comprises the steps:
comparing, for each pixel of the first edge-detected image, a first pixel value to a predetermined threshold value; comparing, for each pixel of the second edge-detected image corresponding to the pixel of the first edge-detected image, a second pixel value to the predetermined threshold value; and keeping the pixel value of the first edge-detected image responsive to the first pixel value and the second pixel value being greater than the predetermined threshold value.
- 18. The method of claim 17, further comprising the step of writing the pixel value of the first edge-detected image to an extraneous edge image responsive to the first pixel value being greater than the predetermined threshold and the second pixel value being not greater than the predetermined threshold.
- 19. The method of claim 16, wherein the step of normalizing comprises dividing the first edge-detected image by the second edge-detected image.
- 20. The method of claim 1, wherein the step of determining a weather condition is based on the comparison of the first and second edge-detected images.
- 21. The method of claim 1, wherein the step of determining a weather condition is based on a predetermined scoring function.
- 22. The method of claim 1, wherein the weather condition comprises visibility.
- 23. The method of claim 1, further comprising the step of determining status of an image detection device based on the first and second edge-detected images.
- 24. The method of claim 1, further comprising the step of sending the determined weather condition to a user interface.
- 25. An apparatus for automatically determining a weather condition using visible imagery comprising:
an image input receiving a plurality of sequential visible images each depicting substantially the same field of view; an image processor in communication with the image input determining a composite visible image depicting the field of view based on the received plurality of sequential visible images; a filter in communication with the image input and the image processor generating first and second edge-detected images based on one of the plurality of sequential visible images and the composite visible image, respectively; a comparator in communication with the filter comparing the first and second edge-detected images; and a weather processor in communication with the comparator determining a weather condition based on the comparison of the edge-detected images.
- 26. The apparatus of claim 25, wherein the image input receives the plurality of sequential visible images from an image-detection device.
- 27. The apparatus of claim 26, wherein the image-detection device comprises a digital camera.
- 28. The apparatus of claim 25, wherein the image input comprises a network interface.
- 29. The apparatus of claim 25, wherein the image processor comprises a memory storing the composite visible image; and
a weighted-averaging filter updating a weighted average of the composite visible image and a currently-received one of the plurality of sequential visible images.
- 30. The apparatus of claim 25, wherein the image processor comprises a brightness calibration filter adjusting a mean image intensity of the currently-received image according to a predetermined relationship.
- 31. The apparatus of claim 25, wherein the filter comprises an edge-detection filter filtering each of a currently-received one of the plurality of sequential visible images and the composite visible image.
- 32. The apparatus of claim 25, wherein the comparator comprises:
a correlative filter correlating the detected edges of a currently-received one of the plurality of sequential visible images with the detected edges of the composite visible image; and a registration processor registering the detected edges according to the correlation.
- 33. The apparatus of claim 25, wherein the weather processor comprises an image processor determining the weather condition based on the comparison of the first and second edge-detected images.
- 34. The apparatus of claim 33, further comprising a predetermined scoring function based on a relationship between detected edges and the weather condition.
- 35. The apparatus of claim 33, further comprising a user interface.
- 36. A computerized apparatus for determining a weather condition using visible imagery comprising:
means for receiving a plurality of sequential visible images each depicting substantially the same field of view; means in communication with the receiving means for determining, based on the received plurality of sequential visible images, a composite visible image depicting the field of view; means in communication with the receiving means and the determining means for generating first and second edge-detected images based on one of the plurality of sequential visible images and the composite visible image, respectively; means in communication with the receiving means and the generating means for comparing the first and second edge-detected images; and means in communication with the comparing means for determining a weather condition based on the comparison of the edge-detected images.
- 37. The apparatus of claim 36, wherein the generating means comprises an edge-detection filter.
- 38. The apparatus of claim 36, wherein the receiving means receives the plurality of sequential visible images from an image-detection device.
- 39. The apparatus of claim 38, wherein the image-detection device comprises a digital camera.
- 40. The apparatus of claim 36, wherein the weather condition comprises visibility.
- 41. The apparatus of claim 36, further comprising a means for interfacing with a user.
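
The claims above recite an image-processing pipeline; the sketches that follow illustrate, in Python with NumPy, one plausible reading of the major recited steps. All function names, weights, and thresholds are illustrative assumptions and are not taken from the patent. Claims 8-9 and 29 recite forming the composite visible image by weighted averaging of the composite with the currently-received image; a minimal sketch, assuming grayscale frames as 2-D arrays and an arbitrary blending weight:

```python
import numpy as np

def update_composite(composite, current, alpha=0.05):
    """Blend the currently-received frame into a running composite image.

    Weighted averaging per claims 8-9: the prior composite keeps most of
    its value and the current frame contributes a small fraction. The
    weight `alpha` is an illustrative assumption, not a value from the patent.
    """
    current = current.astype(np.float64)
    if composite is None:  # the first received frame initializes the composite
        return current
    return (1.0 - alpha) * composite + alpha * current
```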
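
Claims 10-11 and 30 recite calibrating each image against a stored clear-day brightness to correct for daily lighting variation. A sketch of one plausible ratio-based adjustment, assuming the mean clear-day brightness has already been determined and stored:

```python
import numpy as np

def calibrate_brightness(image, clear_day_mean):
    """Scale image intensity so its mean approaches the stored clear-day mean.

    Implements the adjustment step of claim 11 as a simple intensity
    rescaling; the ratio form is an assumption about how 'adjusting ...
    based on the stored clear-day brightness' might be realized.
    """
    image = image.astype(np.float64)
    current_mean = image.mean()
    if current_mean == 0.0:  # avoid dividing by zero on an all-dark frame
        return image
    return image * (clear_day_mean / current_mean)
```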
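
Claims 12-15 and 31-32 recite edge-detection filtering of both the current image and the composite image, followed by registration, with claim 15 specifying correlation of the two edge-detected images. A sketch using a Sobel gradient magnitude and a zero-lag normalized correlation check; the correlation threshold is an assumed value:

```python
import numpy as np

def sobel_edges(image):
    """Edge-detection filtering (claim 12): Sobel gradient magnitude."""
    img = image.astype(np.float64)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    edges = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            window = img[i - 1:i + 2, j - 1:j + 2]
            edges[i, j] = np.hypot(np.sum(window * kx), np.sum(window * ky))
    return edges

def is_registered(current_edges, composite_edges, min_correlation=0.8):
    """Decide whether the two edge images depict substantially the same
    field of view (claims 14-15) via normalized correlation at zero lag."""
    a = current_edges - current_edges.mean()
    b = composite_edges - composite_edges.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return False
    return float((a * b).sum() / denom) >= min_correlation
```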
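
Claims 16-18 recite extracting expected edges by comparing corresponding pixels of the two edge-detected images against a predetermined threshold, and writing edges present only in the current image to an extraneous edge image. A direct sketch of that pixel test:

```python
import numpy as np

def split_edges(current_edges, composite_edges, threshold):
    """Separate expected edges from extraneous edges (claims 17-18).

    A pixel of the current edge image is kept as an expected edge when
    both it and the corresponding composite pixel exceed the threshold;
    it goes to the extraneous-edge image when only the current pixel
    exceeds the threshold.
    """
    expected = np.where(
        (current_edges > threshold) & (composite_edges > threshold),
        current_edges, 0.0)
    extraneous = np.where(
        (current_edges > threshold) & (composite_edges <= threshold),
        current_edges, 0.0)
    return expected, extraneous
```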
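
Claims 19-22 and 33-34 recite normalizing the extracted edges by dividing the first edge-detected image by the second, then mapping the result to a weather condition such as visibility through a predetermined scoring function. A sketch that summarizes the per-pixel ratio as a mean edge-survival score and looks it up in a hypothetical scoring table (the table values are purely illustrative):

```python
import numpy as np

def edge_survival_score(expected_edges, composite_edges, eps=1e-6):
    """Normalize current edges by composite edges (claim 19) and reduce the
    per-pixel ratios to a single mean score over the composite's edge pixels."""
    mask = composite_edges > eps
    if not mask.any():
        return 0.0
    return float((expected_edges[mask] / composite_edges[mask]).mean())

# A hypothetical predetermined scoring function (claim 21): ordered
# (minimum score, visibility label) pairs from clearest to most obscured.
SCORING_TABLE = [(0.8, "clear"), (0.4, "haze"), (0.1, "fog"), (0.0, "dense fog")]

def visibility_from_score(score, scoring_table=SCORING_TABLE):
    """Map an edge-survival score to a visibility category (claim 22)."""
    for min_score, label in scoring_table:
        if score >= min_score:
            return label
    return scoring_table[-1][1]
```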
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of provisional patent application serial No. 60/295,688, filed Jun. 4, 2001, the entirety of which is incorporated herein by reference.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
[0002] The subject matter described herein was supported in part under Contract Number F19628-00-C-0002 awarded by the U.S. Department of the Air Force.
Provisional Applications (1)
| Number | Date | Country |
| --- | --- | --- |
| 60295688 | Jun 2001 | US |