Multi-spectral and hyperspectral cameras are used in the field to measure the chemical composition of crops and minerals. Two-dimensional sensor arrays can be used to collect images of large areas of interest for later analysis. In some implementations, a small number of sensors is used with a large number of band pass filters, with a mechanism to replace the filters for successive image acquisitions. In others, an array of cameras, each with a separate band pass filter, is used so that the images may be acquired simultaneously.
In order to obtain calibrated results with camera arrays, ground targets of known reflectance are typically used to provide a reference reflected value for the images. Examples of ground targets include painted wooden panels, or vehicles (typically white), whose spectral characteristics have been measured so they can be used as references. In some cases two identical camera arrays have been used, one camera array looking up to measure the incident light, and another camera array flown at an altitude above the earth to collect images of the area of interest on earth. The use of ground targets of known reflectance or additional camera arrays to obtain calibrated results can add cost, time, and/or inconvenience. Accordingly, there is a need in the art for a method and apparatus to obtain multi-spectral and/or hyperspectral images without the need for reference targets within the area of interest, or duplicate cameras looking up to measure the incident light.
Embodiments of the present invention relate to a method and apparatus for imaging one or more discrete bands of the spectrum of a target and calculating the true absorption/reflectance of the target with reference to a static ambient light sensor for each of the bands of the spectrum implemented in the device. In specific embodiments, an array of cameras, each with a separate band pass filter, is used to acquire images simultaneously.
Embodiments of the invention relate to imaging one or more discrete bands of the spectrum of a target and calculating the true absorption/reflectance of the target with reference to a static ambient light sensor for each of the bands of the spectrum implemented in the device. Specific embodiments image one or more discrete bands in the visible light region. Specific embodiments image one or more discrete bands in the NIR and/or IR light region.
Image data can be collected by an array of cameras with matched sensors, where each camera has a narrow band pass filter installed to limit its input to the corresponding band of light. An additional matched sensor and camera can then measure ambient light through one or more optical fibers. In an embodiment, a set of optical fibers, having one fiber for each of the supported bands, can be used. Each fiber can have a narrow band pass filter that corresponds to one of the narrow band pass filters in the camera array.
The optical fibers can bring the corresponding ambient light to discrete locations of the sensor, for example, one location for each fiber. The light incident at each discrete location on the sensor can be detected. In an embodiment, the light incident at each location can be digitized and saved at the same instant as the camera array captures images of a selected target. The optical fibers can be as long as necessary to allow the ambient light collection to be done away from any interfering structures.
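As a concrete illustration of this step, the sketch below averages the pixels of each fiber's patch on the ambient sensor into one ambient value per band. This is not taken from the specification; the patch coordinates, band names, and NumPy representation are all assumptions made only for illustration.

```python
import numpy as np

# Hypothetical mapping of each fiber's discrete landing spot on the
# ambient sensor: band name -> (row slice, column slice). In a real
# instrument these coordinates would come from factory calibration.
FIBER_PATCHES = {
    "420nm": (slice(0, 8), slice(0, 8)),
    "540nm": (slice(0, 8), slice(16, 24)),
    "720nm": (slice(0, 8), slice(32, 40)),
}

def ambient_values(frame: np.ndarray) -> dict:
    """Average the pixels in each fiber's patch to one ambient value per band."""
    return {band: float(frame[rows, cols].mean())
            for band, (rows, cols) in FIBER_PATCHES.items()}
```

Because each fiber illuminates a fixed, well-separated region of the sensor, a simple mean over that region suffices; more elaborate schemes (trimmed means, dark-frame subtraction) could be substituted without changing the interface.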
Using the target images and collected optical fiber data, the data for each band of the target is transformed to a radiometric reflectance value, using calibration constants determined at the time the array of cameras is configured and tested.
The results can be saved as a multi-plane radiometric image of the target. Such multi-plane radiometric image of the target can be used to determine, for example, the molecular composition and surface condition of the target.
Using the collected optical fiber data and the calibration constants determined at the time the array of cameras is configured and tested, this final conversion of the per-band target data into radiometric reflectance values can be performed on a computer that has extracted the raw band samples from each of the imaging cameras and the ambient light camera in the array.
Alternatively, this final conversion into radiometric reflectance values can be performed by one of the cameras in the array using an inter-camera communications technique. In a specific embodiment, to determine a radiometric reflectance value, the signal from the area sensor collecting images is first taken for an object of known reflectance, say 50%, at a known exposure value. As the incident light value is known to be twice the area sensor signal, the signal from each fiber is assigned a scaling constant that raises its calculated signal to twice the value of the corresponding image sensor. The scaling constants are then preserved in memory for future image captures. When a picture is taken of an arbitrary scene, each pixel is converted to reflectance by first scaling the corresponding fiber measurement by the saved constant, then scaling for the difference in exposure time versus the calibration sequence. The pixel value is divided by the result, which produces a number in the range of 0 to 1.0: the radiometric reflectance value. The value can be saved as a binary fixed point number such that 0.5 is expressed as 10000000 for an 8-bit pixel, the most significant bit being the largest binary fraction bit (½).
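The calibration and per-pixel conversion described above can be sketched as follows. This is a minimal illustration under the stated assumptions (a 50% panel, linear sensor response), not the patented implementation; the function names and numeric types are choices made for the example.

```python
def fiber_scale_constant(panel_pixel_mean: float, fiber_signal: float,
                         panel_reflectance: float = 0.5) -> float:
    """Calibration step: with a panel of known reflectance in view, find the
    constant mapping the fiber signal to the incident-light level. For a 50%
    panel the incident light is twice the imaged signal."""
    incident = panel_pixel_mean / panel_reflectance
    return incident / fiber_signal

def pixel_to_reflectance(pixel: float, fiber_signal: float, scale: float,
                         exposure: float, cal_exposure: float) -> float:
    """Convert one pixel to radiometric reflectance in the range 0.0 to 1.0:
    estimate incident light from the scaled fiber measurement, correct for
    the exposure difference versus calibration, then divide."""
    incident = fiber_signal * scale * (exposure / cal_exposure)
    return min(pixel / incident, 1.0)

def to_fixed_point(reflectance: float, bits: int = 8) -> int:
    """Binary fixed point with the MSB as the 1/2 bit, so for 8 bits
    0.5 is stored as 0b10000000 (128)."""
    return min(int(round(reflectance * (1 << bits))), (1 << bits) - 1)
```

For example, if a 50% panel images at a mean pixel value of 100 while the fiber reads 80, the scaling constant is 2.5; a later pixel of 100 with the same fiber reading and exposure then converts to a reflectance of 0.5, stored as 0b10000000.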
The filters used for the fibers and imaging cameras can be easily replaceable and/or interchangeable, allowing reconfiguration of the set-up for different bands in the field. In a specific embodiment, a large filter goes over the area sensor and a matching smaller filter over the corresponding fiber. The bands are selected according to the spectral characteristics of the subject, and, in a specific embodiment, each filter is a 2 nm to 40 nm wide segment of the visible and NIR spectrum, which spans 400 nm to 1000 nm. An example set of filters is as follows:
1. 10 nm filter centered at 420 nm
2. 20 nm filter centered at 540 nm
3. 20 nm filter centered at 720 nm
4. 10 nm filter centered at 750 nm
5. 40 nm filter centered at 880 nm
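The example filter set above can be written as a simple configuration; the dict form and helper below are an illustration only, as the specification does not prescribe a data representation.

```python
# Example filter set from the text: filter number -> (center_nm, width_nm).
FILTERS = {
    1: (420, 10),
    2: (540, 20),
    3: (720, 20),
    4: (750, 10),
    5: (880, 40),
}

def pass_band(center_nm: int, width_nm: int) -> tuple:
    """Lower and upper cut-off wavelengths of a band pass filter."""
    return (center_nm - width_nm / 2, center_nm + width_nm / 2)
```

Representing each band by its center and width makes it easy to check, for instance, that all pass bands fall within the 400 nm to 1000 nm span the instrument supports.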
Any band pass filter can be created in the range of the spectrum supported by the instrument. Specific embodiments can utilize filters having a width in the range of 2 nm to 10 nm, 10 nm to 20 nm, 20 nm to 30 nm, 30 nm to 40 nm, 5 nm to 35 nm, 18 nm to 22 nm, 10 nm to 30 nm, and/or 15 nm to 25 nm. Embodiments can use 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, or more bands. The bands can have the same widths or different widths. The bands can overlap or not overlap.
There can be more optical fibers and ambient light collection filters than imaging camera bands, allowing better characterization of the ambient light spectrum, or fewer optical fibers and ambient light collection filters than imaging camera bands if desired.
The optical fibers can be fitted with different apertures, or have different efficiencies, to equalize light measurements through filters of different bandwidths.
There can be several sensors used to measure ambient light.
The sensors used to measure ambient light can be different in kind from the image sensors in the camera array, or can be the same.
A single sensor camera with a color filter array (CFA) can have a single optical fiber that produces an ambient light measurement patch in the corner of the image.
Specific embodiments can involve measuring the chemical composition of crops and minerals, or other characteristic(s), of a target on the ground based on images captured by cameras located between 200 m and 1000 m, 100 m and 200 m, 200 m and 300 m, 300 m and 400 m, 400 m and 500 m, 500 m and 600 m, 600 m and 700 m, 700 m and 800 m, 800 m and 900 m, and/or 900 m and 1000 m above ground level (AGL). Of course other altitudes can also be implemented. Preferably, the images are such that each pixel represents less than 20 cm×20 cm, less than 15 cm×15 cm, less than 10 cm×10 cm, less than 5 cm×5 cm, and/or between 12 cm×12 cm and 8 cm×8 cm of the target.
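The pixel footprint at these altitudes follows from the pinhole-camera relation, ground sample distance = altitude × pixel pitch / focal length. The sketch below uses hypothetical sensor parameters (3.75 µm pitch, 25 mm lens), chosen only to show that such a camera flown at 300 m AGL yields a 4.5 cm footprint, within the preferred ranges above.

```python
def ground_sample_distance_cm(altitude_m: float, pixel_pitch_um: float,
                              focal_length_mm: float) -> float:
    """Ground footprint of one pixel, in cm: altitude * pitch / focal length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3) * 100.0

# Hypothetical sensor flown at 300 m AGL:
# 300 * 3.75e-6 / 0.025 = 0.045 m, i.e. a 4.5 cm x 4.5 cm pixel footprint.
```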
Aspects of the invention, such as calculating absorption/reflectance of a target, calibration constants, radiometric images, multi-plane radiometric images, molecular composition, surface conditions, scaling constants, and/or chemical composition, may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with a variety of computer-system configurations, including multiprocessor systems, microprocessor-based or programmable-consumer electronics, minicomputers, mainframe computers, and the like. Any number of computer-systems and computer networks are acceptable for use with the present invention.
Specific hardware devices, programming languages, components, processes, protocols, and numerous details, including operating environments and the like, are set forth to provide a thorough understanding of the present invention. In other instances, structures, devices, and processes are shown in block-diagram form, rather than in detail, to avoid obscuring the present invention. However, a person of ordinary skill in the art would understand that the present invention may be practiced without these specific details. Computer systems, servers, work stations, and other machines may be connected to one another across a communication medium including, for example, a network or networks.
As one skilled in the art will appreciate, embodiments of the present invention may be embodied as, among other things: a method, system, or computer-program product. Accordingly, the embodiments may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware. In an embodiment, the present invention takes the form of a computer-program product that includes computer-useable instructions embodied on one or more computer-readable media.
Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database, a switch, and various other network devices. By way of example, and not limitation, computer-readable media comprise media implemented in any method or technology for storing information. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations. Media examples include, but are not limited to, information-delivery media, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data momentarily, temporarily, or permanently.
The invention may be practiced in distributed-computing environments where tasks are performed by remote-processing devices that are linked through a communications network. In a distributed-computing environment, program modules may be located in both local and remote computer-storage media including memory storage devices. The computer-useable instructions form an interface to allow a computer to react according to a source of input. The instructions cooperate with other code segments to initiate a variety of tasks in response to data received in conjunction with the source of the received data.
The present invention may be practiced in a network environment such as a communications network. Such networks are widely used to connect various types of network elements, such as routers, servers, gateways, and so forth. Further, the invention may be practiced in a multi-network environment having various, connected public and/or private networks.
Communication between network elements may be wireless or wireline (wired). As will be appreciated by those skilled in the art, communication networks may take several different forms and may use several different communication protocols. And the present invention is not limited by the forms and communication protocols described herein.
All patents, patent applications, provisional applications, and publications referred to or cited herein are incorporated by reference in their entirety, including all figures and tables, to the extent they are not inconsistent with the explicit teachings of this specification.
It should be understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application.
The present application claims the benefit of U.S. Provisional Application Ser. No. 61/672,598, filed Jul. 17, 2012, which is hereby incorporated by reference herein in its entirety, including any figures, tables, or drawings.