Technical Field
The present disclosure is directed to thermal calibration of an infrared image sensor using a remote, non-contact temperature sensing device positioned on a same substrate as and adjacent to the image sensor.
Description of the Related Art
Various products help owners and operators better utilize their land and equipment by capturing information about the thermal conditions of an area. For example, this information can provide building heat analysis, solar panel heat measurements, or heat analysis of land and items on the land. Thermal imaging devices, such as an infrared sensor array, capture images of the area. These thermal imaging devices output temperature measurements, but these measurements are not necessarily calibrated to an absolute temperature scale.
The present disclosure is directed to a method and system of calibrating images from thermal imaging devices to provide accurate and calibrated temperature information of an imaged area. Such calibration may be referred to as an absolute calibration, such that data output to a user reflects actual temperatures of the imaged area in units of degrees Celsius, kelvins, or an equivalent absolute scale. This is achieved by incorporating an imaging device next to a non-contact temperature sensing device on a same substrate. The non-contact temperature sensing device determines the temperature of a subset of the area imaged by the imaging device. The output of the imaging device is then modified or calibrated with information from the temperature sensing device such that the user can view an image or data that reflects absolute temperature information of the imaged area.
In the drawings, identical reference numbers identify similar elements. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale.
The present disclosure is directed to a system 100 that images a physical area or scene and outputs calibrated thermal images with accurate (absolute) temperature information about the imaged area. These thermal images can be used in a variety of contexts where accurate temperature is helpful. For example, the health of crops, including water loss from the leaves, can be observed with thermal imaging. In particular, evapotranspiration of a plant can be determined by evaluating air temperature and thermal information of the leaves captured with an infrared sensor. Evapotranspiration represents the amount of water lost from the plant through evaporation and transpiration. Accordingly, a grower can evaluate the health and water needs of his/her plants by taking photographs, such as from an aerial position. The grower can then determine which of the plants to water. This avoids excess watering and allows for precise individual plant care.
The system 100 captures images from the ground or aerially from a manned or unmanned aerial vehicle 102, such as an airplane or a drone as illustrated in
Both the image sensor and the temperature sensor are positioned to image the same area. They are positioned adjacent to each other on the support substrate to ensure the images and data they capture overlap. Each of the image sensor and the temperature sensor may be packaged in a separate package, where each package may include a substrate. The substrate 104 is a support, which may be active or passive, that holds both the image sensor and the temperature sensor in a way that allows the field of view of each sensor to be directed to a same area. The substrate 104 may be any number of shapes and sizes. For example, the support substrate 104 may be an integral piece of the housing 101 or may simply be the housing. In one embodiment, the temperature sensor may be attached to a housing or package of the image sensor, such that the support substrate that supports the image sensor supports the temperature sensor by way of the image sensor.
The processor may be a chip included on the support 104 that is coupled to both the image sensor and the temperature sensor. In other embodiments, the processor may be part of a post-processing computer to which the housing 101 that includes the image sensor and the temperature sensor is coupled after an image-taking session. The thermal image sensor and the temperature sensor may capture data on their own, with no on-board computer applying calibrations. The calibrations and processing may also be performed in the cloud, i.e., a remote computing network that receives the collected data wirelessly from the image and temperature sensors or receives the data through a wired network once the housing 101 holding the image and temperature sensors is coupled to a computer after the image-capturing session. In another embodiment, the temperature sensor may be coupled to the image sensor such that only the image sensor sends data to the processor; the image sensor would send both the temperature sensor data and the image sensor data. In another embodiment, the image sensor may include the processor such that the temperature sensor data is received and processed by the image sensor processor.
Each image sensor 106 includes a housing 112 or some external packaging that houses an application-specific integrated circuit 114, an image sensor die 116, and a lens 118. As will be appreciated by one of skill in the art, the image sensor 106 can include various die and packaging arrangements as specified by the particular implementation of the image sensor. The image sensor die 116 is an infrared image sensor die that includes an array of sensors (or pixels) that respond to long-wave infrared wavelengths and whose response reflects the temperature of the imaged area. Data output by the image sensor 106 may include non-calibrated information about the temperature of the imaged area, i.e., the image sensor measures a relative difference of the temperatures in the imaged area and may provide a non-calibrated temperature output. This data primarily provides information about whether a specific item in the imaged area is hotter or colder than surrounding items.
The image sensor die 116 in the image sensor 106 may include a plurality of microbolometers, which are devices that measure the power of incident electromagnetic radiation through the heating of a material that has a temperature-dependent electrical resistance. These microbolometers can be implemented as focal plane array sensors, which respond to these longer wavelengths. In particular, a microbolometric sensor detects infrared radiation at wavelengths between 7.5 and 14 microns as it interacts with the material. As the infrared radiation hits the material, it heats the material, which changes the material's electrical resistance. The change in resistance is measured and processed to create an image that represents thermal information about the area imaged. Such microbolometers may be uncooled thermal sensors.
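For illustration only, the relationship between absorbed radiation and resistance can be approximated to first order by a temperature coefficient of resistance (TCR). The sketch below uses hypothetical values that are not taken from the disclosure:

```python
# Simplified first-order model of a microbolometer pixel response.
# The baseline resistance and TCR are hypothetical illustration values
# (the TCR magnitude is on the order of values reported for vanadium oxide films).

R0 = 100e3    # baseline pixel resistance in ohms at the reference temperature
TCR = -0.02   # fractional change in resistance per kelvin of pixel heating

def pixel_resistance(delta_t_kelvin: float) -> float:
    """Resistance after absorbed infrared radiation heats the pixel by delta_t_kelvin."""
    return R0 * (1.0 + TCR * delta_t_kelvin)

# A 0.1 K rise in pixel temperature changes resistance by roughly 0.2%,
# which the readout circuit converts into a digital number for that pixel.
print(pixel_resistance(0.1))
```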
Uncooled thermal image sensors operate at ambient temperatures, but generate their own heat during operation. The measured change in resistance that corresponds to the received infrared radiation is a function of the temperature of the focal plane array. The output data, the thermal image, must be adjusted accordingly if the operating temperature of the sensor is impacting the collected data.
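A minimal sketch of such an adjustment, assuming a simple linear drift model with hypothetical coefficients (the disclosure does not specify the form of the correction):

```python
# Compensate raw sensor output for the focal plane array's own operating temperature.
# The linear drift model and its coefficients are hypothetical; real sensors
# characterize this relationship per device.

import numpy as np

DRIFT_DN_PER_KELVIN = 12.0   # hypothetical change in digital numbers per kelvin of FPA temperature
FPA_REFERENCE_K = 298.15     # hypothetical FPA temperature at which the sensor was characterized

def compensate_for_fpa_temperature(raw_image: np.ndarray, fpa_temp_k: float) -> np.ndarray:
    """Remove the offset introduced by the sensor's own heating."""
    offset = DRIFT_DN_PER_KELVIN * (fpa_temp_k - FPA_REFERENCE_K)
    return raw_image - offset
```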
Some prior solutions incorporate complex shutter mechanisms of known temperature within the image sensor. These shutter mechanisms are expensive, power-hungry, and heavy. The shutter mechanisms are not easily incorporated into light-weight portable systems, such as the system 100, which can be easily carried and utilized by an unmanned aerial vehicle. The image sensor of the present disclosure, in conjunction with the temperature sensor, provides accurate temperature information regarding the imaged area without the complex and heavy shutter mechanism.
The temperature sensor 108 outputs data about actual temperature of the area imaged by the image sensor. This system is configured to process the data from the temperature sensor 108 and adjust the data output from the image sensor 106 to provide a user with data that represents accurate temperature of features in the imaged area. The system is light-weight and cost effective for mobile applications.
The system can be configured to be positioned away from the area to be imaged and carried by hand or by a moving plane or other terrestrial or aerial vehicle. The temperature sensor includes a thermopile, a pyrometer, an infrared (long-wave) photodiode, or any suitable non-contact remote temperature sensing device. The temperature sensor may be a single-pixel thermopile. A thermopile is an electronic device that converts thermal energy into an electrical output. The temperature sensor will output a single value that represents an average of the actual temperature across the field of view of the temperature sensor. This value is in absolute units, such as degrees Celsius. In some embodiments, the thermopile may have a lens that provides a more uniform field of view. In addition, each of the image sensors may include a manually or automatically adjustable lens.
The field of view 122 is shifted to the right from a center of the field of view of the imaging sensor as the temperature sensor 108 is positioned to the right of the image sensor on the substrate 104. Said differently, a center point 124 (on a center line 127) of the field of view 122 of the temperature sensor 108 is shifted from a center line 126 of the field of view of the image sensor.
In this example, the single temperature sensor 108 is used to calibrate the entire thermal image 121. Each pixel of the thermal image 121 has a digital number that, in conjunction with the other digital numbers, indicates how hot or cold that pixel is in comparison to the other pixels. The system, either during operation or before shipment to a user, determines the pixels in each image that correspond to the field of view 122 of the temperature sensor. This includes identifying the pixels that correspond with a boundary of the field of view 122 and a pixel or group of pixels of the thermal image 121 that corresponds to the center point 124 of the temperature sensor.
The temperature sensor 108 outputs an actual temperature that represents an average temperature of the items in the thermal image that are within the temperature sensor's field of view 122. Each of the pixels of the thermal image that are within the field of view 122 is analyzed with respect to the actual temperature to obtain a weighted average. For example, the center pixel, which has an x and y value and corresponds to the center point 124 of the field of view 122, is given a weight of 1. The pixels at the boundary of the field of view 122 are given a weight of 0. Moving radially from the center pixel to the boundary pixels, each pixel is given a weight between 1 and 0, decreasing from 1 to 0 as the pixels are farther from the center pixel. For example, see
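A minimal sketch of this weighting, assuming a circular field of view and a linear falloff from the center weight of 1 to the boundary weight of 0 (function and variable names are illustrative, not part of the disclosure):

```python
# Weighted average of the digital numbers inside the temperature sensor's field of view.
# Pixels at the center get weight 1, pixels at the boundary get weight 0, and the
# weight decreases with radial distance (a linear falloff is assumed here).

import numpy as np

def weighted_mean_digital_number(thermal_image: np.ndarray,
                                 center_xy: tuple[int, int],
                                 radius_px: float) -> float:
    """Weighted average of digital numbers within the circular field of view."""
    h, w = thermal_image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = center_xy
    dist = np.hypot(xs - cx, ys - cy)
    weights = np.clip(1.0 - dist / radius_px, 0.0, 1.0)  # 1 at center, 0 at boundary and beyond
    return float(np.sum(weights * thermal_image) / np.sum(weights))
```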
The field of view of the temperature sensor is illustrated as circular; however, other shapes of the field of view, which represent the temperature information gathered, are envisioned. In addition, the field of view does not have rigid boundaries. An outer-most edge may be defined by the system to utilize the most accurate data returned by the temperature sensor module. As in
Without inclusion of complicated shutter mechanisms and other components that are heavy and expensive, the thermal image sensor alone does not provide temperature data in degrees, which is very useful information for a user analyzing the data. Knowing how hot or how cold an item of the imaged area actually is can provide the user with actionable data. For example, if the imaged area is an orchard the user can determine specific watering patterns to match the needs of individual or groups of plants. Orchards are often organized in rows and columns of plants and trees such that from an aerial perspective, the plants are easily imaged and analyzed. For example, see
As mentioned above, temperature is indicative of stress of each plant or tree. It is envisioned that the system will capture a plurality of sequential images of the area, such as the orchard. For example, a drone or other aerial vehicle 504 carrying the system can fly a route 506 over the orchard. The system 100 will capture thermal images 508 either at a selected rate or frequency or simply sequentially throughout the flight. The adjacent images 508 will include overlapping subject matter regarding the scene. The system will simultaneously be gathering thermal information about the actual temperature of the orchard associated with each image. The pixels of the thermal image are evaluated to determine actual temperature information for each pixel based on the temperature information. A calibration value is determined based on the subset of the pixels that correspond to the temperature sensor and then the calibration value is used to calibrate the whole thermal image. A calibrated thermal image is created.
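A minimal sketch of deriving and applying a per-image calibration value, assuming a pure offset correction and a hypothetical constant relating digital numbers to degrees (the disclosure does not fix the form of the calibration):

```python
# Map the weighted mean digital number inside the thermopile's field of view to the
# thermopile's absolute reading, then apply the resulting offset to every pixel.
# DN_PER_DEGREE_C is a hypothetical sensor responsivity, not a disclosed value.

import numpy as np

DN_PER_DEGREE_C = 40.0  # hypothetical digital numbers per degree Celsius

def calibrate_image(thermal_image: np.ndarray,
                    weighted_mean_dn: float,
                    thermopile_temp_c: float) -> np.ndarray:
    """Return an image whose pixel values are in degrees Celsius."""
    relative_temp_c = thermal_image / DN_PER_DEGREE_C
    offset_c = thermopile_temp_c - weighted_mean_dn / DN_PER_DEGREE_C
    return relative_temp_c + offset_c
```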
The calibrated thermal image may be directly output to the user. Alternatively, each of the thermal images 508 is thermally calibrated with the temperature information to generate a plurality of calibrated thermal images. Once each image is thermally calibrated, the plurality of images may be stitched together to create an image that represents the actual temperature information about an area, such as the whole orchard or a specific tree in the orchard.
The system is configured to determine a calibration value, such as a weighted average for each of the fields of view 212a-212d. The calibration value may then be applied to a quadrant of the pixels of the image from the image sensor within which the field of view is positioned. Alternatively, a final calibration value may be determined from the four calibration values such that the thermal image from the image sensor is calibrated with the final calibration value. For example, the four calibration values that correspond to the weighted average generated for the four fields of view of the four temperature sensors may be averaged together or may be used to generate a regression-based calibration curve. This collective or final calibration value is then applied to every pixel of the thermal image to generate the calibrated thermal image.
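A minimal sketch of combining the four calibration values into a regression-based calibration curve, fitting temperature as a linear function of digital number; the numeric values below are illustrative only:

```python
# Each temperature sensor contributes one (weighted mean digital number, measured
# temperature) pair; a straight line fit through the pairs gives a gain and offset
# that can be applied to every pixel of the thermal image.

import numpy as np

def fit_calibration_curve(mean_dns, measured_temps_c):
    """Least-squares fit of temperature (deg C) as a linear function of digital number."""
    gain, offset = np.polyfit(mean_dns, measured_temps_c, deg=1)
    return gain, offset

gain, offset = fit_calibration_curve([812.0, 845.0, 798.0, 830.0],
                                     [21.3, 22.1, 20.9, 21.8])
# Applying temperature = gain * digital_number + offset to every pixel yields
# the calibrated thermal image.
```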
In some embodiments, the fields of view of the temperature sensors may overlap. The weighted averages from each field of view may be compared with each other to confirm the actual temperature of the scene. For example, the system could determine the temperature associated with the portion of the field of view that overlaps with an adjacent field of view. The values that represent each of the overlapping sections can be compared to determine whether they are reading a similar temperature. If they are slightly off, the system generates an offset that adjusts the calibration value before calibrating the whole thermal image. This gives the system more precise temperature information, which results in a more precise calibrated thermal image to provide to the user or to further process.
Each image sensor may include a different lens and a different field of view. This provides a user with flexibility regarding an amount of zoom available from the system. For example, if the user is operating and reviewing data in real time, the user may identify a portion of the area for which they want more information regarding temperature. The system can then zoom in on the portion using a different one of the image sensors that has a different field of view.
In addition, the image sensors may be operated sequentially or in a pattern such that during an imaging phase a plurality of images having different fields of view are gathered and later processed. The temperature sensor gathers temperature information during the whole imaging phase and based on time and duration information, the images from the different image sensors can be matched and calibrated from the temperature sensor data.
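A minimal sketch of matching an image to the temperature sample recorded closest in time, assuming timestamped readings stored in sorted order (the data structures and names are illustrative assumptions):

```python
# Match each captured image to the nearest-in-time temperature sample so that
# images from different image sensors can be calibrated against the continuously
# gathered temperature data.

from bisect import bisect_left

def nearest_temperature(image_time_s: float,
                        temp_times_s: list[float],
                        temps_c: list[float]) -> float:
    """Temperature sample recorded closest in time to the image capture."""
    i = bisect_left(temp_times_s, image_time_s)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(temp_times_s)]
    best = min(candidates, key=lambda j: abs(temp_times_s[j] - image_time_s))
    return temps_c[best]
```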
Each of the thermal images gathered from each of the image sensors can be calibrated with the temperature information gathered by the temperature sensor. In processing, the calibration value determined from the temperature information and the pixels of the thermal image that correspond to the field of view of the temperature sensor can be compared to make the temperature information output with the calibrated thermal image more accurate.
For each of the examples described in this disclosure, the image sensor or modules and the temperature sensor or modules are aligned on the support substrate to ensure overlap of the fields of view of the modules. For reference, the image sensor 106 and temperature sensor 108 of
In some implementations, the temperature sensor may be slightly angled off a first surface 128 to adjust a position of the field of view 122 with respect to the field of view 120 of the image sensor. For example, as noted above, as a result of being physically next to the image sensor, the field of view of the temperature sensor is shifted from the center line 126 of the field of view of the image sensor. To better align the center point 124 with the center line 126, one or both of the modules may be angled with respect to the first surface of the substrate, i.e., a bottom surface of the module may have a first portion closer to the substrate than a second portion. Ideally, neither the temperature sensor nor the image sensor is angled with respect to the first surface of the substrate, as the modules are physically very close, resulting in overlap of their fields of view. In addition, the scene or area being imaged is typically at a large distance from the system as compared to the distance between the image sensor and the temperature sensor.
Other than providing the user with accurate temperature information regarding the area or scene imaged, incorporating the temperature sensor within this system can provide information about the vignette effect of the image sensor.
Step 408 includes applying the calibration value to the thermal image to change the digital number of each pixel to represent a temperature value, such as in degrees Celsius. The method then includes outputting a calibrated thermal image representative of actual temperature of the scene in step 410. Each pixel will have a temperature value so that when viewed as a whole the various temperatures across the imaged area will be visible. This process can be performed on a subset of the thermal images such that a calibration value may be applied to a sequence of thermal images or the process can be applied to every single thermal image gathered by the system.
In the description, certain specific details are set forth in order to provide a thorough understanding of various embodiments of the disclosure. However, one skilled in the art will understand that the disclosure may be practiced without these specific details. In other instances, well-known structures have not been described in detail to avoid unnecessarily obscuring the descriptions of the embodiments of the present disclosure.
Unless the context requires otherwise, throughout the specification and claims that follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.”
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
As used in the specification and appended claims, the use of “correspond,” “corresponds,” and “corresponding” is intended to describe a ratio of or a similarity between referenced objects. The use of “correspond” or one of its forms should not be construed to mean the exact shape or size.
In the drawings, identical reference numbers identify similar elements or acts. The size and relative positions of elements in the drawings are not necessarily drawn to scale.
The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.