The present disclosure relates to active vision systems and, more particularly, to a method for determining optimum spectral bands for active vision systems and to an active vision system that operates in one of these optimum spectral bands.
This section provides background information related to the present disclosure which is not necessarily prior art. This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
Typically, vision systems monitor a scene using electromagnetic radiation from a specific spectral band, such as ultraviolet (UV), visible (VIS), near infrared (NIR), short wave infrared (SWIR), and long wave infrared (LWIR) radiation. Together, the NIR and SWIR spectral bands are referred to as the reflected infrared band, while the LWIR band, which carries emitted rather than reflected radiation, is referred to as the thermal infrared band.
Infrared (IR) vision systems are used in numerous civilian and military applications. Some vision systems are designed to observe scenes under extreme low illumination by using light amplification or light intensification technologies, such as the night vision device described in U.S. Pat. No. 4,463,252. Different information about a scene can be conveyed through incoming electromagnetic radiation from the various portions of the spectrum. Indeed, many techniques have been developed by combining multi-spectral images of the scenes of interest. For example, U.S. Pat. No. 5,035,472 describes a device that transmits the signal of an image along two separate paths, one directing the signal towards an IR detector and the other directing the signal towards an image intensifier. Then, the IR and intensified images are combined for displaying the information to the user.
The number of autonomous vehicles in use on public roads and in civilian airspace has been increasing steadily, exposing them to hazardous environmental conditions such as slippery roads and aircraft icing conditions. Therefore, autonomous vehicles will likely be required to have control systems configured to receive information regarding not only the surrounding terrain and the obstacles in their path, but also the conditions of the road and the airspace ahead. In addition, autonomous vehicles will likely need to respond to this information automatically by commanding maneuvers to negotiate terrain, avoid obstacles, and track a particular path in order to avoid potentially hazardous conditions.
Recent accidents involving automobiles employing advanced automation systems have been discussed extensively in the engineering communities and in the media. Accidents frequently occur when vision systems fail due to exposure to intense sunlight (e.g. sun glare). This is a common problem for vision systems that rely on cameras, lidars, or other devices operating in portions of the spectrum strongly affected by sunlight, such as in the visible and near infrared spectral bands.
The principles of the present teachings provide a method for avoiding problems caused by sunlight in devices used for avoiding obstacles, navigating, detecting road conditions (i.e. distinguishing dry roads from wet roads and icy roads, estimating the thickness of water layers), and sensing the atmospheric conditions in the airspace around an aircraft or an autonomous air vehicle (i.e. detecting potentially hazardous icing conditions or volcanic ash ahead). In addition, the present teachings can be used in systems for detecting the concentration of gases leaking from industrial systems or natural atmospheric constituents.
The present teachings provide a method for determining optimum spectral bands for active vision systems used outdoors and a device that employs this method. The method and the device can be used to provide warnings to drivers and to provide information for aircraft, automobiles, and autonomous air, ground, or sea vehicles, among other applications.
In some embodiments of the present invention, detectors, detector arrays, or multi-spectral cameras can be used to make the required measurements. A similar system can be used for detecting ice or water unambiguously on aircraft surfaces, manufacturing systems, or any other object of interest. In some embodiments, a system using measurements in a single optimum spectral band, such as a lidar, can be used for obstacle avoidance or navigation.
In some embodiments of the present invention, a road condition monitoring system is provided that is configured to detect water, snow, frost, clear ice, and other types of ice on roads and any other surface of interest. The system is configured to distinguish dry surfaces from those covered by water, snow, frost, and various types of ice even when these substances cover only a fraction of the field of view of the road condition monitoring system.
Water and ice can often be difficult to detect by drivers or current synthetic vision systems, and clear ice is especially difficult to detect. Aircraft, cars, trucks, buses, motorcycles, and other vehicles would benefit from systems capable of detecting the presence of ice or water on surfaces such as roadways, bridges, sidewalks, or even runways and taxiways (i.e., in connection with ground operations of aircraft or support personnel and vehicles). Accidents frequently occur when drivers, operators, and synthetic vision systems fail to detect deteriorating road conditions ahead of a vehicle because of sun glare.
Some of the prior art approaches for vision systems designed for navigation and for the detection of ice and water on roads, aircraft surfaces, and in the airspace around them are based on near infrared radiance measurements. However, these prior art techniques are subject to the negative effects of sun glare because they are based on measurements in portions of the spectrum strongly affected by sunlight.
U.S. Patent App. Pub. No. 2008/0129541A1 refers to a slippery ice warning system capable of monitoring the road ahead of a vehicle. One or two visible cameras are used to image the same scene at two orthogonal polarizations. When a single camera is used, a polarization beam splitter separates the reflected light into two orthogonal polarizations. The possible, but ambiguous, existence of slippery ice ahead of the vehicle is detected by measuring the polarization of the reflected light. Moreover, since this system operates in the visible portion of the spectrum, it is subject to the negative effects of sun glare.
U.S. Patent App. Pub. No. 2005/0167593A1 refers to a method that uses shifts in the wavelength of the reflectance near 1.4 μm to distinguish water from ice. In this method, liquid water and ice are discriminated from each other by analyzing shifts in the short wavelength edge of the 1.4 μm band reflectance. Detection decisions are based on shifts in wavelengths in a portion of the spectrum strongly affected by sunlight. Unfortunately, systems based on this method are subject to problems caused by sun glare.
A more recent invention described in U.S. Patent App. Pub. No. 2012/0193477A1 uses effective reflectance, defined as the reflectance of a given material at a given wavelength divided by the reflectance of the same material at a wavelength of 1.1 μm, to determine the measurement bands for distinguishing wet material from material with ice on its surface. In this technique, the detection signal is the contrast C between the measurements in the first and second bands, defined as the difference between the intensities of radiation in the second and first bands divided by the sum of the intensities in the two bands. Detection decisions are thus based on a contrast signal in spectral bands strongly affected by sunlight, so systems based on this method are also subject to problems caused by sun glare.
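The prior-art contrast signal described above can be sketched as follows. This function is merely an illustration of the stated formula, not an implementation taken from the cited publication:

```python
def contrast_signal(intensity_band1: float, intensity_band2: float) -> float:
    """Contrast C between two band measurements: the difference between the
    intensities in the second and first bands divided by their sum."""
    return (intensity_band2 - intensity_band1) / (intensity_band2 + intensity_band1)
```

Because C is a normalized difference, it is bounded between -1 and 1 and vanishes when the two band intensities are equal.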
U.S. Pat. No. 9,304,081, which is incorporated herein by reference, describes a technique that uses the radiance ratio around a crossover point (γ=Rλ1/Rλ2) to monitor the condition of the road or the airspace ahead of a land or air vehicle. The key feature of this technique is that the detection signal is robust because it depends simply on the ratio of the measurements in two nearby narrow spectral bands. However, minimization of the effects of sun glare was not a concern when the technique was developed.
According to the principles of the present teachings, optimum spectral bands are implemented in a system for monitoring potentially hazardous conditions ahead, such as water, snow, and ice on roads or runways. The system disclosed herein overcomes the disadvantages of the prior art because it is immune to the negative effects of sun glare. Active vision systems can be designed for navigation and for the detection of potentially hazardous conditions in the airspace or on the surface ahead of a vehicle. Industrial systems for outdoor use can be configured to map the system's surroundings or to distinguish clean surfaces from those covered by ice, snow, oil, water, or other particular substances of interest.
In some embodiments of the present teachings, a light source (e.g., a pulsed laser), multispectral detectors and/or a multispectral camera, a data processor unit, and interfaces with displays, safety systems, and/or autonomous systems are employed to provide an indication of the conditions ahead and respond to it.
In some embodiments, the road condition monitoring system of the present teachings contains only a detector pair with filters, a data processor unit, and interfaces to displays and control systems. In some embodiments, a lidar using a laser with wavelengths falling within one of the optimum spectral bands described in this document can be used.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples provided in this summary are for illustration only; they are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
Example embodiments will now be described more fully with reference to the accompanying drawings.
Example embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms “comprises,” “comprising,” “including,” and “having,” are inclusive and therefore specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
When an element or layer is referred to as being “on,” “engaged to,” “connected to,” or “coupled to” another element or layer, it may be directly on, engaged, connected or coupled to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly engaged to,” “directly connected to,” or “directly coupled to” another element or layer, there may be no intervening elements or layers present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.). As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer, or section from another region, layer or section. Terms such as “first,” “second,” and other numerical terms when used herein do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the example embodiments.
Spatially relative terms, such as “inner,” “outer,” “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. Spatially relative terms may be intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
According to the principles of the present teachings, a vision system 10, such as a road condition monitoring system, is provided for detecting road conditions and/or for monitoring conditions of interest, such as, but not limited to, ice on aircraft, manufacturing systems, or other objects of interest. The vision system 10 is configured and operable to make measurements in spectral bands in which solar radiation is strongly absorbed by atmospheric constituents (
In some embodiments, vision system 10 uses measurements of radiance in at least two narrow spectral bands indicated in
The vision system 10 illustrated in
With particular reference to
The vision system 10 can further comprise a processing unit 20 configured to receive the optics signal from the optics system and calculate the ratio of the radiance in the two bands in which solar radiation is reduced substantially due to absorption by atmospheric constituents, in order to mitigate the negative effects of solar radiation. The processing unit 20 is configured to output a resultant signal, based on the measurements in the spectral bands and in response to light source 24, to a display or data interface system 22. In some embodiments, light source 24 comprises halogen lights, incandescent lights, pulsed infrared lasers, or LEDs used to illuminate the area of interest 100. In some embodiments, the area of interest 100 ahead of the vehicle can be illuminated with intense laser beams containing the desired spectrum.
In some embodiments, vision system 10 can comprise an automation system 26 operably coupled to a system of the vehicle for automatically controlling the system of the vehicle in response to an output signal from the data processing unit 20 or data interface 22.
Radiance measurements (instead of reflectance measurements) are sufficient for most practical applications, because the targets can be illuminated with light sources containing relatively small power variations between the spectral bands of interest (e.g., surfaces illuminated by direct or indirect sunlight, or illuminated by a known light source).
As illustrated in
In some embodiments, the present teachings provide a system that alerts the driver or provides feedback to the vehicle's automation systems. The present system is capable of quantifying the hazard level using road condition assessment matrices such as that illustrated in
In some embodiments, the data processing unit or system 20 is configured to implement an algorithm to detect the presence of water, snow, frost, ice, and water/ice mixtures using non-transitory software and look-up tables to estimate the surface condition and output a warning or alert when a predetermined condition is detected based on the radiance measurements in the first band and the second band. By way of non-limiting example, the algorithm for assessing road condition ahead of a vehicle can comprise the following steps:
1. Measurements with optical system 11 containing a pair of detectors 18 or a pixel array with spectral filters 16 are used to measure the radiance of the area of interest 100 (e.g. 100 m ahead of the vehicle).
2. Measurements with a spectral filter that allows radiance in a first band between about 1.908 and 1.968 μm to pass are used to determine the radiance at one side of the crossover point (R1.938 μm);
3. Measurements with a spectral filter that allows radiance in a second band between about 2.000 and 2.060 μm to pass are used to determine the radiance at the other side of the crossover point (R2.030 μm);
4. The measurements in the first and second band are then used to determine the radiance ratio γ=R1.938 μm/R2.030 μm of the area of interest 100;
5. The value of the radiance ratio γ is used to determine the surface or road condition using a road condition assessment matrix defined below;
6. The surface temperature (T) is estimated using measurements by a thermocouple or any other suitable method;
7. The road surface temperature is then used to refine the surface or road condition using a road condition assessment matrix relating the detection signal and the road surface temperature with the coefficient of friction between the vehicle's wheels and the road; and
8. Feedback is provided to the operator or vehicle automation system. In some embodiments, a WARNING is produced when hazardous conditions are detected. The hazard condition can be quantified by a numerical code.
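The steps above can be sketched in code as follows. The critical ratio and temperature thresholds below are purely hypothetical placeholders, not the critical ratios of the road condition assessment matrix defined by the present disclosure:

```python
def radiance_ratio(r_1938_um: float, r_2030_um: float) -> float:
    """Step 4: ratio of the radiances measured on each side of the
    crossover point (first band ~1.938 um, second band ~2.030 um)."""
    return r_1938_um / r_2030_um

def assess_road_condition(gamma: float, surface_temp_c: float) -> str:
    """Steps 5-7: map the radiance ratio and the surface temperature to a
    surface condition code. All threshold values here are illustrative."""
    if gamma > 1.2 and surface_temp_c <= 0.0:   # hypothetical ice threshold
        return "WARNING: ICE"
    if gamma > 1.2:                             # above freezing: liquid film
        return "WET"
    if gamma < 0.8:                             # hypothetical water threshold
        return "WET"
    return "DRY"
```

A step-8 feedback stage would then forward the returned condition code, or a numerical equivalent, to the operator display or the vehicle automation system.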
In summary, a road condition monitoring or vision system immune to the negative effects of sun glare is provided. The vision system is capable of measuring the radiance reflected by an area of interest in a wavelength range containing a crossover point between the curves representing the absorption of electromagnetic radiation by ice and water. The detection system measures the radiance in a first band having wavelengths in a spectral band on a first side of the crossover point and outputs a first band signal, and further measures the radiance in a second band having wavelengths in a spectral band on a second, opposing side of the same crossover point and outputs a second band signal. The crossover point and the measurement bands on each side of the crossover point are selected carefully to avoid the negative effects of solar radiation and to provide unambiguous detection of water, snow, and various types of ice even when these substances cover a fraction of the field of view of the road condition monitoring system. A processing unit determines the ratio of the first band signal to the second band signal, and compares the ratio to predetermined critical ratios to output the determination signal indicating the presence of water or various types of ice.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.
Number | Name | Date | Kind |
---|---|---|---|
3767915 | Battist | Oct 1973 | A |
3970428 | Barringer | Jul 1976 | A |
4054255 | Magenheim | Oct 1977 | A |
4221482 | Macourt | Sep 1980 | A |
4274091 | Decker | Jun 1981 | A |
4441363 | Hill et al. | Apr 1984 | A |
4463252 | Brennan et al. | Jul 1984 | A |
4804849 | Booth et al. | Feb 1989 | A |
4809197 | Tashiro et al. | Feb 1989 | A |
4819480 | Sabin | Apr 1989 | A |
4920263 | Fimian et al. | Apr 1990 | A |
4965573 | Gallagher et al. | Oct 1990 | A |
4984163 | Kuwana et al. | Jan 1991 | A |
5005015 | Dehn et al. | Apr 1991 | A |
5028929 | Sand et al. | Jul 1991 | A |
5035472 | Hansen | Jul 1991 | A |
5124914 | Grangeat | Jun 1992 | A |
5218206 | Schmitt et al. | Jun 1993 | A |
5301905 | Blaha | Apr 1994 | A |
5313202 | Hansman, Jr. et al. | May 1994 | A |
5497100 | Reiser et al. | Mar 1996 | A |
5521594 | Fukushima | May 1996 | A |
5596320 | Barnes | Jan 1997 | A |
5695155 | Macdonald et al. | Dec 1997 | A |
5796344 | Mann et al. | Aug 1998 | A |
5818339 | Giles et al. | Oct 1998 | A |
5905570 | White et al. | May 1999 | A |
6040916 | Griesinger | Mar 2000 | A |
6091335 | Breda et al. | Jul 2000 | A |
6161075 | Cohen | Dec 2000 | A |
6166645 | Blaney | Dec 2000 | A |
6166657 | Mann | Dec 2000 | A |
6269320 | Otto | Jul 2001 | B1 |
6384611 | Wallace et al. | May 2002 | B1 |
6430996 | Anderson et al. | Aug 2002 | B1 |
6459083 | Finkele et al. | Oct 2002 | B1 |
6819265 | Jamieson et al. | Nov 2004 | B2 |
6921898 | Chen | Jul 2005 | B1 |
6977597 | Doherty | Dec 2005 | B2 |
7100427 | Kahn et al. | Sep 2006 | B2 |
7104502 | Otto et al. | Sep 2006 | B2 |
7119891 | White et al. | Oct 2006 | B2 |
7224453 | Elman | May 2007 | B2 |
7265846 | Forsyth | Sep 2007 | B2 |
7301478 | Chinn et al. | Nov 2007 | B1 |
7370525 | Zhao et al. | May 2008 | B1 |
7424399 | Kahn et al. | Sep 2008 | B2 |
7796833 | Polonskiy et al. | Sep 2010 | B2 |
7839301 | Doherty et al. | Nov 2010 | B2 |
7986408 | Ray et al. | Jul 2011 | B2 |
8000847 | Shue | Aug 2011 | B2 |
8044823 | Doherty et al. | Oct 2011 | B2 |
8325338 | Pope et al. | Dec 2012 | B1 |
8350910 | Capello et al. | Jan 2013 | B2 |
8666570 | Tillotson | Mar 2014 | B1 |
8711008 | Cook et al. | Apr 2014 | B2 |
8796627 | Rockwell et al. | Aug 2014 | B2 |
8854464 | Ishi et al. | Oct 2014 | B2 |
9013332 | Meis | Apr 2015 | B2 |
9041926 | Ray et al. | May 2015 | B2 |
9297755 | Renno | Mar 2016 | B2 |
9302777 | Renno | Apr 2016 | B2 |
9304081 | Renno | Apr 2016 | B2 |
9305220 | Funayama et al. | Apr 2016 | B2 |
20020162962 | Rudolph | Nov 2002 | A1 |
20030150992 | Chavez et al. | Aug 2003 | A1 |
20030169186 | Vopat | Sep 2003 | A1 |
20040036630 | Jamieson et al. | Feb 2004 | A1 |
20040206854 | Shah et al. | Oct 2004 | A1 |
20040231410 | Bernard et al. | Nov 2004 | A1 |
20050002435 | Hashimoto et al. | Jan 2005 | A1 |
20050100336 | Mendenhall et al. | May 2005 | A1 |
20050105103 | Schietinger et al. | May 2005 | A1 |
20050151965 | Bissett et al. | Jul 2005 | A1 |
20050167593 | Forsyth | Aug 2005 | A1 |
20050218268 | Otto et al. | Oct 2005 | A1 |
20050230553 | Otto et al. | Oct 2005 | A1 |
20060050270 | Elman | Mar 2006 | A1 |
20060261975 | Fridthjof | Nov 2006 | A1 |
20070074415 | Gagnon | Apr 2007 | A1 |
20080110254 | Zhao et al. | May 2008 | A1 |
20080129541 | Lu et al. | Jun 2008 | A1 |
20080161878 | Tehrani et al. | Jul 2008 | A1 |
20080218385 | Cook et al. | Sep 2008 | A1 |
20090222238 | Gagnon | Sep 2009 | A1 |
20090261811 | Gordon | Oct 2009 | A1 |
20100072367 | Meurer | Mar 2010 | A1 |
20100085175 | Fridthjof | Apr 2010 | A1 |
20100110431 | Ray et al. | May 2010 | A1 |
20100131203 | Lilie et al. | May 2010 | A1 |
20110019188 | Ray et al. | Jan 2011 | A1 |
20110135197 | Paris et al. | Jun 2011 | A1 |
20110213554 | Archibald et al. | Sep 2011 | A1 |
20120085868 | Barnes | Apr 2012 | A1 |
20120123637 | Funayama et al. | May 2012 | A1 |
20120140233 | Rockwell et al. | Jun 2012 | A1 |
20120182544 | Asahara et al. | Jul 2012 | A1 |
20120187301 | Markson | Jul 2012 | A1 |
20120191350 | Prata et al. | Jul 2012 | A1 |
20120193477 | Thorez et al. | Aug 2012 | A1 |
20120266669 | Sage | Oct 2012 | A1 |
20120274938 | Ray | Nov 2012 | A1 |
20120327410 | Maston | Dec 2012 | A1 |
20130008174 | Gould et al. | Jan 2013 | A1 |
20130234884 | Bunch et al. | Sep 2013 | A1 |
20130249701 | Zhang | Sep 2013 | A1 |
20140112537 | Frank et al. | Apr 2014 | A1 |
20140347189 | Weksler et al. | Nov 2014 | A1 |
20140372069 | Nuzzio | Dec 2014 | A1 |
20150019185 | Cunningham et al. | Jan 2015 | A1 |
20150120092 | Renno | Apr 2015 | A1 |
20150120093 | Renno | Apr 2015 | A1 |
20150170485 | DeCusatis et al. | Jun 2015 | A1 |
20170197728 | Renno | Jul 2017 | A1 |
20180056854 | Kunii | Mar 2018 | A1 |
Number | Date | Country |
---|---|---|
0537206 | Jan 1996 | EP |
1401707 | Mar 2004 | EP |
2637045 | Sep 2013 | EP |
2001-099710 | Apr 2001 | JP |
2006046936 | Feb 2006 | JP |
2009115498 | May 2009 | JP |
4492883 | Jun 2010 | JP |
2014132073 | Sep 2014 | WO |
2015116873 | Aug 2015 | WO |
2016000666 | Jan 2016 | WO |
Entry |
---|
Nakauchi, S., K Nishino, T. Yamashita, “Selection of Optimal Combinations of Band-Pass Filters for Ice Detection by Hyperspectral Imaging,” Opt. Express 20, 986-1000 (2012). |
Gregoris, D., S. Yu, and F. Teti, “Multispectral Imaging of Ice,” Paper Presented at the Canadian Conference on Electrical and Computer Engineering, 4, 2051-2056. |
Jonsson, P., Remote Sensor for Winter Road Surface Status Detection. In: Proceedings of IEEE Sensors (2011), pp. 1285-1288. |
Kou, L., Labrie, D., and Chylek, P., “Refractive indices of water and ice in the 0.65- to 2.5 μm spectral range”, (Jul. 1993), Appl. Opt., 32(19), 3531-3540. |
Vanderlei Martins, J., et al., “Remote sensing the vertical profile of cloud droplet effective radius, thermodynamic phase, and temperature”, (Mar. 2007), Atmos. Chem. Phys. Discuss., 7, 4481-4519. |
Rennó, N. O., et al., “CHASER: An Innovative Satellite Mission Concept to Measure the Effects of Aerosols on Clouds and Climate”, (May 2013), Bull. Amer. Meteor. Soc., 94, 685-694. |
International Search Report and Written Opinion, International Application No. PCT/US2014/038003, dated Aug. 14, 2014, 13 pages. |
International Search Report and Written Opinion, International Application No. PCT/US2014/047415, dated Nov. 20, 2014, 12 pages. |
Zahorowski et al., Vertical Radon-222 Profiles in the Atmospheric Boundary Layer, CAWCR 5th Annual Workshop, Atmospheric Composition Observations and Modelling and the Cape Grim Annual Science Meeting, Nov. 15-17, 2011, Bureau of Meteorology, Melbourne, Australia. |
Turekian et al., Geochemistry of Atmospheric Radon and Radon Products, Annual Review of Earth and Planetary Sciences, 1977, vol. 5, pp. 227-255. |
Jacob and Prather, Radon-222 as a Test of Convective Transport in a General Circulation Model, Tellus (1990), 42B, pp. 118-134. |
Mason, Engine Power Loss in Ice Crystal Conditions, Aero Quarterly, 2007. |
Guffanti et al., Encounters of Aircraft with Volcanic Ash Clouds: a Compilation of Known Incidents, 1953-2009, U.S. Department of the Interior, U.S. Geological Survey, Data Series 545, Version 1.0, 2010, Reston, Virginia, U.S.A. |
Li et al., A Three-Dimensional Global Episodic Tracer Transport Model: 1. Evaluation of its Transport Processes by Radon 222 Simulations, Journal of Geophysical Research, vol. 101, No. D20, pp. 25,931-25,947, Nov. 20, 1996. |
Kritz et al., Radon Measurements in the Lower Tropical Stratosphere: Evidence for Rapid Vertical Transport and Dehydration of Tropospheric Air, Journal of Geophysical Research, vol. 98, No. D5, pp. 8725-8736, May 20, 1993. |
Lambert et al., Volcanic Output of Long-Lived Radon Daughters, Journal of Geophysical Research, vol. 87, No. C13, pp. 11,103-11,108, Dec. 20, 1982. |
Schery, An Estimate of the Global Distribution of Radon Emissions from the Ocean, Geophysical Research Letters, vol. 31, Oct. 7, 2004. |
Williams et al., The Vertical Distribution of Radon in Clear and Cloudy Daytime Terrestrial Boundary Layers, Journal of the Atmospheric Sciences, vol. 68, pp. 155-174, Jan. 2011. |
Moore et al., 222Rn, 210 Pb, 210Bi, and 210Po Profiles and Aerosol Residence Times Versus Altitude, Journal of Geophysical Research, vol. 78, No. 30, Oct. 20, 1973. |
Martins et al., Remote Sensing the Vertical Profile of Cloud Droplet Effective Radius, Thermodynamic Phase, and Temperature, Atmospheric Chemistry and Physics Discussions, 7, 4481-4519, 2007. |
Kendra, J. R., Ulaby, F. T., & Sarabandi, K. (1994). Snow probe for in situ determination of wetness and density. Geoscience and Remote Sensing, IEEE Transactions on Geoscience and Remote Sensing, 32 (6), 1152-1159. |
Sarabandi, K., & Li, E. S. (1997). Microstrip ring resonator for soil moisture measurements. Geoscience and Remote Sensing, IEEE Transactions on Geoscience and Remote Sensing, 35 (5), 1223-1231. |
Dionigi, M., Ocera, A., Fratticcioli, E., and Sorrentino, R. (2004), A new resonant probe for Dielectric permittivity measurement, European Micro. Conf. Dig., Amsterdam, 673-676. |
Fratticcioli, E., M. Dionigi, and R. Sorrentino (2004), “A simple and low-cost measurement system for the complex permittivity characterization of materials,” IEEE Trans. on Instrumentation and Measurement, 53(4), 1071-1077. |
Sagnard, F. and Y.-L. Beck (2009), “Experimental study of the influence of moisture and dry density on a silt soil using a monopole probe,” Micro and Optical Tech. Lett., 51(3), 820-826. |
Chang, K., & Hsieh, L. H. (2004), Microwave Ring Circuits and Related Structures, Second Edition, John Wiley & Sons, Inc. |
International Search Report and Written Opinion, International Application No. PCT/US2014/061949, dated Feb. 9, 2015, 9 pages. |
International Search Report and Written Opinion of the International Searching Authority dated Apr. 21, 2017 regarding PCT/US2017/012410. |