1. Field of the Invention
The present invention relates to a device and method to detect a distribution of vegetations.
2. Description of Related Art
Patent Citation 1 (Japanese Patent Application Laid-Open No. 2002-117402) discloses a method of discriminating between vegetations and other subjects by distinguishing the detailed colors of green sections using visible light.
Patent Citation 2 (Japanese Patent Application Laid-Open No. 2007-018387) discloses a method of discriminating between vegetations and other subjects by means of the NDVI (Normalized Difference Vegetation Index) commonly used in satellite remote sensing, which is based on the reflectances of two lights: near infrared (NIR) light (wavelength band: 800 to 1200 nm) and visible red (VISR) light (wavelength band: 600 to 700 nm).
With regard to an image in which bright spots and dark (shadow) spots are present, such as an image of sunlight filtering through trees, there is little gradation difference between pixels with low luminance values, such as the pixels capturing the shadow parts; in those pixels the output values from a camera (values in RGB space) become similar. Meanwhile, with regard to the luminance (reflectance) of visible light, there is only a small difference among the reflectances (or color values) of red, blue and green light. For low shrubs, for instance, the reflectance of red light is 18% and the reflectance of green light is approximately 12%. Thus, with the method of Patent Citation 1, discrimination errors between vegetations and other subjects are easily caused in an image of sunlight filtering through trees, since the pixels of the shadow parts in the image are affected by noise.
Also, large field constructions are made from typical materials such as sand, soil, rock, cement, asphalt and plastic. Table 1 shows the reflectance of near infrared light, the reflectance of visible red light and the ratio of these reflectances under daylight. As shown in the table, the reflectance ratio of light from blue-colored plastics in particular is close to that from grass (100 mm height). This is not a problem in air photographs or satellite photographs, since it is not necessary there to distinguish vegetations from such materials. However, constructions made from blue-colored plastics are erroneously detected as vegetations by the method described in Patent Citation 2.
In addition, when a vegetation detecting operation is performed at night by the methods of Patent Citations 1 and 2, a floodlight using a strong visible light source, such as a searchlight, is needed. This dazzles the operators, or makes the location of security guards on duty obvious and thereby lets suspicious people know where the guards are.
In order to solve the above-mentioned issues, the present invention has the purpose of providing a vegetation detector and a related method capable of (1) preventing discrimination errors between vegetations and other subjects even in a situation where bright spots and dark spots are present, such as in a section with sunlight filtering through trees, (2) preventing constructions made of blue-colored plastics from being erroneously detected as vegetations, and (3) detecting vegetations at night without a floodlight using a strong visible light source such as a searchlight.
According to a first aspect of the present invention, there is provided a vegetation detector including: (1) a first imaging section that includes a first optical filter selectively transmitting light with a near infrared wavelength band; (2) a second imaging section that includes a second optical filter selectively transmitting light with a short wave infrared wavelength band; (3) a reflectance ratio calculating section that calculates, as a reflectance ratio, the ratio of a reflectance calculated based on observation data of light from subjects obtained by the first imaging section to a reflectance calculated based on observation data of light from the subjects obtained by the second imaging section; and (4) a determining section that determines whether the subjects are vegetations or not by comparing the reflectance ratio with a predetermined threshold value.
According to a second aspect of the present invention, there is provided a method of detecting vegetation including: (1) calculating a ratio of a reflectance calculated based on observation data of light from subjects obtained by a first imaging section including a first optical filter selectively transmitting light with a near infrared wavelength band to a reflectance calculated based on observation data of light from the subjects obtained by a second imaging section including a second optical filter selectively transmitting light with a short wave infrared wavelength band as a reflectance ratio; and (2) determining whether the subjects are vegetations or not by comparing the reflectance ratio with a predetermined threshold value.
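As a rough illustration of this aspect (not the claimed implementation itself), the ratio-and-threshold decision could be sketched as follows. The function name, the array inputs and the default threshold of 2.75 are assumptions for illustration, and co-registered reflectance images are taken as already available.

```python
import numpy as np

def detect_vegetation(nir_reflectance: np.ndarray,
                      swir_reflectance: np.ndarray,
                      threshold: float = 2.75) -> np.ndarray:
    """Return a boolean mask that is True where a pixel is judged to show vegetation.

    nir_reflectance  -- per-pixel reflectance in the near infrared band (roughly 800-1300 nm)
    swir_reflectance -- per-pixel reflectance in the water absorption band (e.g. around 1450 nm)
    threshold        -- predetermined threshold on the NIR/SWIR reflectance ratio
    """
    eps = 1e-6                                  # guard against division by zero in dark pixels
    ratio = nir_reflectance / (swir_reflectance + eps)
    return ratio > threshold                    # vegetation reflects NIR strongly and absorbs SWIR
```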
There will be described below a preferred embodiment of the present invention with reference to the drawings. Note that in the figures common elements are indicated with the same reference numerals and the repetitive explanations are omitted.
The first imaging section 11 images light with a predetermined wavelength band within the wavelength band (800 nm to 1300 nm) of near infrared (NIR) light. The second imaging section 12 images light with a water absorption wavelength band (1450 nm±50 nm, or 1940 nm±100 nm) within the wavelength band of short wave infrared (SWIR) light. The first imaging section 11 and the second imaging section 12 can simultaneously obtain the exposure times and gains used when taking images of a subject. The reflectance ratio calculating section 13 calculates the ratio of the reflectance of light with the near infrared wavelength band to the reflectance of light with the water absorption wavelength band by means of the method described later. The determining section 14 determines whether the subject is a vegetation or not by comparing the reflectance ratio calculated by the reflectance ratio calculating section 13 with a threshold value recorded in the recording section 15.
As shown in
As shown in
The near infrared light transmission filters 116, 131 and 151 only (selectively) transmit light with a predetermined wavelength band in the near infrared light band. The water absorption wavelength band light transmission filters 113 and 141 only (selectively) transmit light with a wavelength band that water absorbs.
As the near infrared light transmission filters 116, 131 and 151, a filter such as a long pass filter that only transmits light with a wavelength of 800 nm or more, or a band pass filter that transmits light with a wavelength of 800 nm or more and has a transmission wavelength width of 100 nm or more, is preferably used. The former is used when the filter only needs to restrict the lower limit of the transmission range, since the upper limit of the light receiving wavelength of the CCD camera is approximately 900 nm to 1100 nm. The latter is used in order to transmit light with a wavelength of 800 nm or more while making the transmission wavelength width as wide as possible, since the received light energy becomes small and the signal-to-noise ratio becomes worse if the transmission wavelength width is narrow.
As the water absorption wavelength band light transmission filters 113 and 141, a filter such as a band pass filter having a transmission central wavelength of 1450 nm and a transmission wavelength width of 80 nm is used. Vegetation observation results have clarified that the central wavelength of light that water absorbs is 1450 nm, that the absorption rate of light with a wavelength of 1400 nm to 1500 nm is high, and that the absorption rate of light with a wavelength of 1350 nm or less or 1550 nm or more is low. Therefore, a band pass filter fulfilling the following conditions is preferably used as the water absorption wavelength band light transmission filter: the lower limit of the transmission wavelength is more than 1350 nm; the upper limit of the transmission wavelength is less than 1550 nm; and the transmission wavelength width is 50 nm or more.
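The band conditions above can be summarized as a simple check. The numeric limits come directly from this paragraph; the function name and the usage line are merely illustrative.

```python
def is_valid_water_absorption_filter(lower_nm: float, upper_nm: float) -> bool:
    """Check a candidate band pass filter against the conditions stated above:
    lower limit above 1350 nm, upper limit below 1550 nm, width of 50 nm or more."""
    return lower_nm > 1350.0 and upper_nm < 1550.0 and (upper_nm - lower_nm) >= 50.0

# The filter actually mentioned (central wavelength 1450 nm, width 80 nm) satisfies the conditions.
assert is_valid_water_absorption_filter(1410.0, 1490.0)
```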
By selecting two wavelengths so that the reflectance ratio becomes large, it is possible to detect vegetations even at observation points with high luminance contrast. Therefore, the two different wavelengths are preferably selected so as to make the reflectance ratio larger. However, it is not sufficient to image only light at the wavelengths of the high-reflectance peak and the low-reflectance peak. When the wavelength width of the light to be imaged is narrow, that is, when only the peaks are taken, the received light energy becomes small and the signal-to-noise ratio becomes worse. Thus it is necessary to make the wavelength width of the received light wide and to select wavelengths at which the reflectance ratio of the light to be imaged is sufficiently large (or sufficiently small). The selected regions in the figure satisfy these conditions, and therefore the wavelength bands of the received light are preferably determined within those regions in view of camera characteristics and filter characteristics. Note that the wavelengths at both ends of each selected region are determined according to the following criteria: (1) both ends are placed at boundaries where the reflectance values change; and (2) sampling points that lie within the average of the high peak and the low peak before and after the change (in the right and left direction in the figure) are the limit values of both ends.
The gains are controlled so that each camera can pick up images with appropriate luminance at imaging. Therefore, techniques such as the auto gain control usually equipped in cameras, or control of the exposure time so that almost the entire subject (except small shining parts such as a spotlight-like illuminant) falls within the imaging range, may be used as appropriate.
Since each lens has a different field of view, the same observation point of a subject is indicated by different coordinate values (pixel coordinate values) in the coordinate systems set for the respective lenses. Therefore it is necessary to perform a coordinate transformation between the coordinate systems that relates the coordinate value representing an observation point in one coordinate system to the coordinate value representing the same observation point in the other coordinate system.
When the plural plate camera 110 shown in
Therefore the coordinate value of each pixel in the coordinate system of the CCD light receiving element 118 is transformed into the corresponding coordinate value in the coordinate system of the InSb semiconductor light receiving element 115 by using
Here (xNIR, yNIR) is the coordinate value of each pixel on the light receiving surface of the CCD light receiving element 118; (xSWIR, ySWIR) is the coordinate value of each pixel on the light receiving surface of the InSb semiconductor light receiving element 115; fxNIR and fyNIR are the focal distances of the lens 117 of the CCD light receiving element 118 in the x-axis and y-axis directions; fxSWIR and fySWIR are the focal distances of the lens 114 of the InSb semiconductor light receiving element 115 in the x-axis and y-axis directions; (cxNIR, cyNIR) is the coordinate value of the center of the light axis on the CCD light receiving element 118; (cxSWIR, cySWIR) is the coordinate value of the center of the light axis on the InSb semiconductor light receiving element 115; and R is a 3×3 rotation matrix from the coordinate system of the CCD light receiving element 118 to the coordinate system of the InSb semiconductor light receiving element 115. Note that R is the unit matrix I when each light receiving surface is accurately placed perpendicular to the light axis.
Since this coordinate transformation makes it possible to match the coordinate value of the pixel representing each observation point of the subject in the CCD light receiving element 118 with the coordinate value of the pixel representing the same observation point in the InSb semiconductor light receiving element 115, observation data can be compared between corresponding pixels in the two coordinate systems.
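Formula (1) itself is not reproduced above, so the following is only a sketch of this kind of pixel-to-pixel mapping, assuming the usual pinhole form in which intrinsic matrices built from the listed focal distances and optical-axis centers are combined with the rotation matrix R; all function names are illustrative.

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """3x3 intrinsic matrix built from focal distances (fx, fy) and optical-axis center (cx, cy)."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def ccd_to_insb_pixel(x_nir, y_nir, A_nir, A_swir, R=np.eye(3)):
    """Map a pixel on the CCD (NIR) light receiving surface to the corresponding pixel on the
    InSb (SWIR) light receiving surface.  R defaults to the unit matrix, as in the case where
    both light receiving surfaces are accurately placed perpendicular to the light axis."""
    p_nir = np.array([x_nir, y_nir, 1.0])                 # homogeneous pixel coordinate
    p_swir = A_swir @ R @ np.linalg.inv(A_nir) @ p_nir    # assumed homography-style mapping
    return p_swir[0] / p_swir[2], p_swir[1] / p_swir[2]
```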
When a plurality of cameras are used as shown in
When the coordinate (XNIR1, YNIR1) of an observation point of a subject on the light receiving surface of the first near infrared light camera 130 and the corresponding coordinate (XNIR2, YNIR2) of the observation point on the light receiving surface of the second near infrared light camera 150 are found, it is possible to obtain θ1, θ2 and |{right arrow over (T)}NIR| by accurately measuring RNIR and {right arrow over (T)}NIR in advance. Moreover, it is possible to obtain the distance d between each pixel of the first near infrared light camera 130 and the observation point of the subject by triangulation using these measured values.
θ1, θ2 and |{right arrow over (T)}NIR| can be obtained by steps (S21-1) to (S21-3) described below.
(S21-1) A position vector {right arrow over (P)}2-2 representing an observation point of the subject in the coordinate system of the second near infrared light camera 150 can be described as
by use of the coordinate (xNIR2, yNIR2) of the observation point on the light receiving surface of the second near infrared light camera 150. In this case, θ2 is calculated as
(S21-2) A position vector {right arrow over (P)}1-2 representing the observation point in the coordinate system of the second near infrared light camera 150 observed from the first near infrared light camera 130 can be obtained, by use of
{right arrow over (P)}1-2=A2RNIRA1−1{right arrow over (P)}1-1,  (4)
by transforming a position vector
representing the observation point in the coordinate system of the first near infrared light camera 130, into the coordinate system of the second near infrared light camera 150. Here A1 and A2 are 3×3 matrices described as
similar to formula (1), respectively, where fx1 and fy1 are the focal distances of the lens 132 of the first near infrared light camera 130 in the x-axis and y-axis directions; fx2 and fy2 are the focal distances of the lens 152 of the second near infrared light camera 150 in the x-axis and y-axis directions; (cx1, cy1) is the coordinate value of the center of the light axis on the first near infrared light camera 130; (cx2, cy2) is the coordinate value of the center of the light axis on the second near infrared light camera 150. Here θ1 is calculated as
by use of {right arrow over (P)}1-2.
(S21-3) The distance |{right arrow over (T)}NIR| between the viewpoints of the second near infrared light camera 150 and the first near infrared light camera 130 is calculated as the absolute value of {right arrow over (T)}NIR.
The above-mentioned corresponding points (XNIR1, YNIR1) and (XNIR2, YNIR2) can be obtained by block matching between two images or by optical flow. Alternatively, it is possible to use a method of obtaining a matrix
F=TxRNIR (8)
from the rotation matrix RNIR and a matrix
rearranging the elements t1, t2 and t3 of the translation vector {right arrow over (T)}NIR, so as to detect common pixels by searching on a straight line (epipolar line) to fulfill constraint conditions:
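The matrix (9) and constraint (10) are not reproduced above. Assuming the matrix rearranging t1, t2 and t3 is the usual skew-symmetric (cross-product) matrix and the constraint is the standard epipolar relation in normalized camera coordinates, a sketch looks like the following; the direction conventions of RNIR and {right arrow over (T)}NIR are likewise assumptions.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix built from t = (t1, t2, t3), so that skew(t) @ v == np.cross(t, v)."""
    t1, t2, t3 = t
    return np.array([[0.0, -t3,  t2],
                     [ t3, 0.0, -t1],
                     [-t2,  t1, 0.0]])

def epipolar_residual(x1, x2, R_nir, T_nir):
    """Residual of the assumed constraint x2^T F x1 = 0 with F = skew(T_NIR) @ R_NIR (formula (8)).
    x1 and x2 are normalized homogeneous coordinates of a candidate pixel pair; truly corresponding
    pixels make this value (nearly) zero, so the search in the second image can be restricted to the
    straight line (epipolar line) on which the residual vanishes."""
    F = skew(T_nir) @ R_nir
    return float(np.asarray(x2) @ F @ np.asarray(x1))
```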
The distance d is calculated as
by means of the sine theorem. Thus, the three-dimensional coordinate value of each pixel in the coordinate system of the first near infrared light camera 130 can be calculated as
{right arrow over (P)}NIR={right arrow over (P)}NIR1·d. (12)
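Formula (11) is not reproduced above; assuming it is the standard law-of-sines triangulation, with θ1 and θ2 measured between the baseline {right arrow over (T)}NIR and each camera's viewing ray, a sketch of this step (including formula (12)) is:

```python
import numpy as np

def triangulate_distance(theta1: float, theta2: float, baseline_length: float) -> float:
    """Distance d from the first camera to the observation point by the sine theorem,
    assuming theta1 and theta2 are the angles between the baseline and the two viewing rays."""
    return baseline_length * np.sin(theta2) / np.sin(theta1 + theta2)

def point_in_camera1(p_nir1_direction: np.ndarray, d: float) -> np.ndarray:
    """Formula (12): scale the viewing direction of the pixel (assumed here to be a unit vector)
    by the distance d to obtain the 3-D coordinate in the first camera's frame."""
    return p_nir1_direction * d
```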
The distance has been measured by a stereo camera in this case. However, it is also possible to measure the distance by use of a three-dimensional laser range finder whose positional relationship to the cameras is known, or by other alternative methods. In addition, when the three-dimensional geometry of a subject is largely determined by maps or rules (e.g. almost no undulation), the three-dimensional location can be determined based on those maps or rules in order to transform between the corresponding points.
The coordinate transformation from the coordinate system of the first near infrared light camera 130 into the coordinate system of the water absorption wavelength band light camera 140 can be calculated as
{right arrow over (P)}SWIR=RSWIR{right arrow over (P)}NIR+{right arrow over (T)}SWIR (13)
where {right arrow over (T)}SWIR is a translation vector representing a translation from the coordinate center of the first near infrared light camera 130 to the coordinate center of the water absorption wavelength band light camera 140; and RSWIR is a rotation matrix representing a rotation from the coordinate system of the first near infrared light camera 130 to the coordinate system of the water absorption wavelength band light camera 140.
From the three-dimensional coordinate values in the coordinate system of the first near infrared light camera 130, the three-dimensional location {right arrow over (P)}SWIR of each corresponding pixel in the coordinate system of the water absorption wavelength band light camera 140 can be calculated as
where ASWIR is a 3×3 matrix represented by
similar to formula (1), where fxSWIR and fySWIR are the focal distances of the lens 142 of the water absorption wavelength band light camera 140 in the x-axis and y-axis directions; (cxSWIR, cySWIR) is the coordinate value of the center of the light axis on the water absorption wavelength band light camera 140. Also, {right arrow over (n)} is a normal vector.
The coordinate values obtained by the above transformation are not integral numbers in general. Therefore, the pixel value at such a coordinate is estimated from the values of the peripheral pixels by means of the bilinear interpolation method. Thus, since the coordinate value of each observation point of the subject in the first near infrared light camera 130 can be matched with the coordinate value of the same observation point in the water absorption wavelength band light camera 140 after the coordinate transformation, observation data can be compared between corresponding pixels in the two coordinate systems.
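A sketch of this reprojection step, under the assumption that the projection onto the water absorption wavelength band light camera takes the usual pinhole form with ASWIR (the exact formulas (14) and (15) are not reproduced above); boundary handling and the role of the normal vector {right arrow over (n)} are omitted.

```python
import numpy as np

def nir_point_to_swir_pixel(P_nir, R_swir, T_swir, A_swir):
    """Transform a 3-D point from the first near infrared camera frame into the water
    absorption wavelength band camera frame (formula (13)) and project it to a pixel."""
    P_swir = R_swir @ P_nir + T_swir            # formula (13)
    p = A_swir @ P_swir                         # assumed pinhole projection with the SWIR intrinsics
    return p[0] / p[2], p[1] / p[2]             # generally a non-integer pixel coordinate

def bilinear_sample(image, x, y):
    """Estimate the value at a non-integer pixel coordinate from the four surrounding pixels."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * image[y0, x0] +
            dx * (1 - dy) * image[y0, x0 + 1] +
            (1 - dx) * dy * image[y0 + 1, x0] +
            dx * dy * image[y0 + 1, x0 + 1])
```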
With regard to the use of the plural plate camera shown in
of the reflectance RNIR(x, y, z:t) of light with the near infrared wavelength band to the reflectance RSWIR(x, y, z:t) of light with the water absorption wavelength band in each observation point of the subject (three-dimensional coordinate (x, y, z)) at a time t is obtained by the following method after each pixel corresponding to each observation point of the subject is detected in the near infrared light camera and the water absorption wavelength band light camera, respectively. This reflectance ratio is proportional to the ratio of the received light amount
per hour at the time t in each pixel of the near infrared light camera (coordinate (X, Y) on the light receiving surface) to the received light amount
per hour at the time t in each pixel of the water absorption wavelength band light camera (coordinate (X, Y) on the light receiving surface). Here VNIR(X, Y:t) and VSWIR(X, Y:t) are the pixel values at coordinates (X, Y) on the light receiving surfaces of the near infrared light camera and the water absorption wavelength band light camera at the time t; Eλ(t) is the exposure time at the time t; and Kλ(t) is the imaging gain at the time t.
In other words, the ratio of the reflectance RNIR(x, y, z:t) of the near infrared light to the reflectance RSWIR(x, y, z:t) of the water absorption wavelength band light in an observation point (x, y, z) of the subject at the time t can be represented as
in the corresponding pixels of the near infrared light camera and the water absorption wavelength band light camera (both coordinates on the light receiving surfaces are (X, Y)). Here CRef is a reflectance ratio at calibration (t=t0) and is calculated as
CRef includes factors such as the light receiving sensitivity QNIR of the near infrared light camera at the time t and the light receiving sensitivity QSWIR of the water absorption wavelength band light camera at the time t. In this case, VNIR(X, Y:t0) and VSWIR(X, Y:t0) represent the pixel values of a known object (a subject for calibration, such as a reference plate) having known reflectances for the near infrared light and the water absorption wavelength band light. RRefNIR(x, y, z:t0) and RRefSWIR(x, y, z:t0) represent the known reflectances of the subject for calibration for the near infrared light and the water absorption wavelength band light, respectively.
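Since formulas (19) and (20) are not reproduced above, the following is only one plausible reading of this calibration scheme: normalize each pixel value by its exposure time and gain, determine CRef from the reference plate at t = t0, and scale the measured ratio by CRef afterwards. All function names are illustrative.

```python
def exposure_gain_normalized(V, E, K):
    """Pixel value normalized by exposure time E and imaging gain K (proportional to received light)."""
    return V / (E * K)

def calibration_constant(V_nir0, E_nir0, K_nir0, V_swir0, E_swir0, K_swir0,
                         R_ref_nir, R_ref_swir):
    """C_Ref determined from the reference plate at t = t0: the constant that makes the
    measured, normalized pixel-value ratio equal to the plate's known reflectance ratio."""
    measured = exposure_gain_normalized(V_nir0, E_nir0, K_nir0) / \
               exposure_gain_normalized(V_swir0, E_swir0, K_swir0)
    return (R_ref_nir / R_ref_swir) / measured

def reflectance_ratio(V_nir, E_nir, K_nir, V_swir, E_swir, K_swir, C_ref):
    """Reflectance ratio R_NIR / R_SWIR at an observation point: the normalized pixel-value
    ratio at time t scaled by the calibration constant C_Ref."""
    measured = exposure_gain_normalized(V_nir, E_nir, K_nir) / \
               exposure_gain_normalized(V_swir, E_swir, K_swir)
    return C_ref * measured
```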
An accurate reflectance ratio can still be calculated when the light source is the same, even if the imaging conditions of the near infrared light camera and the water absorption wavelength band light camera are different. When the ratio of the spectral amounts of light from the light source changes, calibration may be performed constantly by keeping the subject for calibration within the edges of the image at all times. As the subject for calibration, a standard reflector whose reflectance varies by no more than 5% across every kind of light from visible light to short wavelength infrared light is employed.
By comparing the reflectance ratio calculated for each pixel as described above with a predetermined threshold value, vegetation detection is executed that determines subjects whose pixels have a reflectance ratio larger than the predetermined threshold value to be vegetation leaves. Preferably, the threshold value of the reflectance ratio is given according to weather conditions. For instance, a threshold value between 2.0 and 3.0 is used in fine weather and a threshold value between 2.5 and 3.0 is used in rainy weather. Alternatively, the threshold value of the reflectance ratio may be set at a common value between 2.5 and 3.0 for both fine and rainy weather.
The basis for these threshold values is shown in Tables 2 and 3, which show the observation results for the reflectances of the near infrared light and the water absorption wavelength band light, and their reflectance ratios, in fine weather (dried condition) and in rainy weather (wet condition), respectively. From the reflectance ratios obtained in the tests, it is apparent that the reflectance ratios of vegetation leaves are 3.0 or more and those of the other subjects are 2.0 or less in fine weather (Table 2), and that the reflectance ratios of vegetation leaves are 4.0 or more and those of the other subjects are 2.5 or less in rainy weather (Table 3).
When the same threshold value is used in both fine and rainy weather, a value between 2.5 and 3.0, for instance 2.75, is set as the threshold value so as to divide the reflectance ratios into a region of 2.75 or more and a region of less than 2.75. Thus it is possible to easily distinguish the vegetations, which fall in the former region, from the other subjects, which fall in the latter region, except for noise. Note that the allowable range of error by noise is 0.25/2.75 (approximately 9.1%).
In addition, when the weather can be determined (raining or not) by a rainfall sensor or the like, the threshold value is set at a value between 2.0 and 3.0, for instance 2.5, in the no-rain condition so as to divide the reflectance ratios into a region of 2.5 or more and a region of less than 2.5. Thus it is possible to easily distinguish the vegetations, which fall in the former region, from the other subjects, which fall in the latter region, except for noise. The allowable range of error by noise is 0.5/2.5 (approximately 20%). Moreover, the threshold value is set at a value between 2.5 and 4.0, for instance 3.25, in the rainy condition so as to divide the reflectance ratios into a region of 3.25 or more and a region of less than 3.25. Thus it is possible to easily distinguish the vegetations, which fall in the former region, from the other subjects, which fall in the latter region, except for noise. The allowable range of error by noise is 0.75/3.25 (approximately 23%). Therefore, a more stable detector can be obtained.
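A small sketch of this weather-dependent threshold selection, using the representative values quoted above (2.75 when the weather is unknown, 2.5 in the no-rain condition, 3.25 in the rainy condition); the function is illustrative only.

```python
def select_threshold(raining=None):
    """Pick the reflectance ratio threshold following the ranges given above: 2.75 when the
    weather cannot be determined, 2.5 when a rainfall sensor reports no rain, and 3.25 when
    it reports rain.  The corresponding allowable noise margins quoted in the text are about
    9.1 %, 20 % and 23 % of the threshold, respectively."""
    if raining is None:
        return 2.75          # common value chosen between 2.5 and 3.0
    return 3.25 if raining else 2.5
```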
There will be described below in detail a method of calculating the ratio (reflectance ratio) of the reflectance of the near infrared light to the reflectance of the water absorption wavelength band light at observation points of subjects, with reference to
The amount of received light per hour at the time t in each pixel (coordinate (X, Y) on the light receiving surface) of the near infrared light camera 160 can be calculated by
based on formula (17). Similarly, the amount of received light per hour at the time t in each pixel (coordinate (X, Y) on the light receiving surface) of the water absorption wavelength band light camera 170 can be calculated by
based on formula (18). Here INIR(XNIR, YNIR:t) and ISWIR(XSWIR, YSWIR:t) represent the amount of received light per hour at a time t at coordinate (X, Y) on the light receiving surfaces of the near infrared light camera and the water absorption wavelength band light camera, respectively; VNIR(XNIR, YNIR:t) and VSWIR(XSWIR, YSWIR:t) represent the pixel value at a time t at coordinate (X, Y) on the light receiving surfaces of the near infrared light camera and the water absorption wavelength band light camera, respectively; ENIR(t) and ESWIR(t) represent the exposure times at a time t, respectively; KNIR(t) and KSWIR(t) represent the imaging gains at a time t, respectively; LNIR(t) and LSWIR(t) represent the light amounts of the near infrared light and the water absorption wavelength band light from a light source at a time t, respectively; RNIR(x, y, z:t) and RSWIR(x, y, z:t) represent the reflectances of the near infrared light and the water absorption wavelength band light reflected from a subject at a time t, respectively; D(ψ,φ:t) represents the reflectance distribution of the reflected light (the near infrared light and the water absorption wavelength band light) from the subject, parameterized by the incident angle ψ from the light source to each observation point of the subject and the reflection angle φ from each observation point to the light receiving surfaces; PNIR(x, y, z:t) and PSWIR(x, y, z:t) represent the transmissions of the near infrared light and the water absorption wavelength band light on the light paths from the light source to the subject at a time t, respectively; WNIR(x, y, z:t) and WSWIR(x, y, z:t) represent the transmissions of the near infrared light and the water absorption wavelength band light on the light paths from the subject to the respective light receiving surfaces at a time t (this is essentially the F value of the lens because the range through the atmosphere is short), respectively; and QNIR(t) and QSWIR(t) represent the transfer efficiencies (the product of the light receiving area and the transfer efficiency) of the light receiving surfaces for the near infrared light and the water absorption wavelength band light, respectively. The above-mentioned quantities can be obtained from the pixel values observed by each camera in real time.
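Formulas (21) and (22) themselves are not reproduced above; one plausible form consistent with the symbol definitions just given is a simple multiplicative chain, sketched below as an assumption rather than as the document's formula.

```python
def received_light_per_hour(L, R, D, P, W, Q):
    """Assumed multiplicative model of the received light amount: light source amount L,
    subject reflectance R, reflectance distribution D(psi, phi), transmission P on the
    source-to-subject path, transmission W on the subject-to-camera path, and transfer
    efficiency Q of the light receiving surface."""
    return L * R * D * P * W * Q
```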
For instance, suppose that the reflectances RRefNIR(x, y, z:t0) and RRefSWIR(x, y, z:t0) of the near infrared light and the water absorption wavelength band light for the subject for calibration are observed at calibration (t=t0). In this case, formula (21) leads to
and also formula (22) leads to
If the transfer efficiencies QNIR(t) and QSWIR(t) can be approximated by fixed values (i.e. QNIR(t)=QNIR(t0)=QNIR and QSWIR(t)=QSWIR(t0)=QSWIR), respectively, since the efficiencies do not vary with time, if the transmissions of the near infrared light and the water absorption wavelength band light WNIR(x, y, z:t) and WSWIR(x, y, z:t) can be approximated by fixed values (i.e. WNIR(x, y, z:t)=WNIR(x, y, z:t0)=WNIR and WSWIR(x, y, z:t)=WSWIR(x, y, z:t0)=WSWIR), respectively, and if the fluctuation of the light amounts LNIR(t) and LSWIR(t) is small (i.e. LNIR(t)=LNIR(t0) and LSWIR(t)=LSWIR(t0)), then formulas (23) and (24) are modified as
respectively.
Then, dividing formula (25) by formula (26) gives
When a light source such as a searchlight is used, the reflectance of light from the light source at a subject depends on the incident angle from the light source to the subject, while the light amount loss on the light path of the reflected light can be ignored. Therefore, the change of the light amount on the light path from the light source to the light receiving surfaces of the cameras via reflection at the subject depends only on the incident angle from the light source to the subject, not on the light wavelength. This means that
PNIR(x, y, z:t)=PSWIR(x, y, z:t), PNIR(x, y, z:t0)=PSWIR(x, y, z:t0).  (28)
Substituting condition (28) into formula (27) gives
Meanwhile, when a light source such as the sun is used, condition (28) cannot be used in general, since light from the light source is affected by atmospheric influences such as scattering. In that case, the following condition is assumed:
PNIR(x, y, z:t)=PNIR(x, y, z:t0), PSWIR(x, y, z:t)=PSWIR(x, y, z:t0).  (30)
This condition holds as long as there is no large shift of the sun's inclination, which is a major factor in the scattering of sunlight. Substituting condition (30) into formula (27) likewise gives formula (29).
Thus, formula (29) is obtained equally whether the light source is a searchlight or the sun, provided the sun's position has changed little since calibration.
Moreover, formula (29) can be modified as
by use of formula (20).
Consequently, the ratio (reflectance ratio) RateRef(x, y, z:t) of the reflectance RNIR(x, y, z:t) of the near infrared light to the reflectance RSWIR(x, y, z:t) of the water absorption wavelength band light in an observation point of the subject (x, y, z) at a time t can be calculated as
by use of formulas (21) and (22) similar to formula (19).
The vegetation detection may be performed by use of a function of the reflectance ratio RateRef(x, y, z:t) such as
with simpler values (in this case, between 0 and 1) in place of the reflectance ratio RateRef(x, y, z:t) itself.
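The specific function is not reproduced above; as one illustration of a mapping of the reflectance ratio to simpler values between 0 and 1, a clipped linear normalization could look like the following, with the band edges 2.5 and 4.0 chosen here only as examples.

```python
import numpy as np

def normalized_score(rate, low=2.5, high=4.0):
    """Map the reflectance ratio RateRef to a value between 0 and 1 (illustrative choice,
    not the document's formula): 0 at or below `low`, 1 at or above `high`, linear between."""
    return float(np.clip((rate - low) / (high - low), 0.0, 1.0))
```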
As shown in
With regard to the above-mentioned vegetation detection using the reflectance ratio of the near infrared light to the water absorption wavelength band light, it should be emphasized that this is a new combination for vegetation detection, arrived at as a result of testing a variety of combinations. As for other possibilities, at least the following options may be considered, but they have disadvantages.
(A) It is known that the reflectance ratio of infrared light to ultraviolet light alters according to the compositional roughness of a substance. Therefore, since the reflectance of vegetations differs from the reflectance of other substances whose density and roughness differ from those of plant cells, a vegetation detecting method using this reflectance ratio may be possible.
Although a difference in the reflectance ratio is recognized for sand and soil, plastics cannot be distinguished by this reflectance ratio regardless of color. A possible reason is that an ultraviolet absorbent for preventing deterioration of the plastics is included therein.
(B) As another proposal, it may also be possible to detect vegetations by regarding the reflectance (or absorbance) of blue light (400 nm to 500 nm) and that of red light as the reflectance (or absorbance) of chlorophyll and carotene, respectively.
Although the colors that cause erroneous detection differ, this method erroneously detects plastics colored with dyes, just like the method described in Patent Citation 2.
On the other hand, the present invention adopts a method of detecting vegetations by using the small reflectance of the water absorption wavelength band light within the short wave infrared light together with the large reflectance of the near infrared light.
At the beginning, it was considered that vegetations might not be distinguishable from wet earth and sand by this method. However, tests have shown that there is a great difference between the absorbance of vegetations and that of wet earth and sand. A possible reason is that wet earth and sand absorb only the water on their surfaces, since light is mainly reflected at those surfaces, whereas vegetations greatly absorb the light because it is repeatedly and diffusely reflected within their cells. Moreover, attenuation of light similar to the case where light passes through several millimeters of water has been observed in the tests, which forms a basis for this method.
The water absorption wavelength includes a wavelength band centered on 1940 nm, which is effective to use. From the data of the broadleaf tree shown in
As described above, according to the vegetation detector of the present invention, it is possible to (1) prevent discrimination errors between vegetations and other subjects even in a situation where bright spots and dark spots are present, such as in a section with sunlight filtering through trees, (2) prevent constructions made of blue-colored plastics from being erroneously detected as vegetations, and (3) detect vegetations at night without a floodlight using a strong visible light source such as a searchlight.
The invention is not limited to the embodiment described above, and modifications may become apparent to those skilled in the art in light of the teachings herein.
This application is based upon the Japanese Patent Application No. 2008-135713, filed on May 23, 2008, the entire content of which is incorporated by reference herein.