The present invention relates to an unmanned aerial vehicle (a drone) suitable for taking pictures of field crops to evaluate their growth status with image analysis.
A method of taking pictures of field crops from the sky with a drone (also known as an unmanned aerial vehicle or multicopter) or the like, in order to evaluate the growth status of the field crops with image analysis, is well known (for example, Patent Document 1). By using a drone flying at a low altitude above the field, more accurate information can be acquired than from satellite imagery. However, there is still a problem that the information thus acquired is not precise enough for detailed analysis.
For example, crop pests such as planthoppers often occur on the stems near the root, but it was difficult to properly take pictures of that part from the sky. Similarly, it was difficult to take pictures of lesions occurring on the stems near the root and of weeds growing on the water surface. Moreover, in the case of rice, if an image of the shape of the leaves bent by the wind could be obtained, the amount of accumulated silicon could be evaluated, the growth of the rice could be estimated based on that information, and the fertilizer plan could be optimized. However, it was difficult to properly obtain such an image from the sky.
An object of the present invention is to provide a drone (unmanned aerial vehicle) that can properly photograph the base (near-to-the-root) part of the crop stems and the sides of the leaves.
In order to solve the above problem, the present invention provides an unmanned aerial vehicle comprising: a camera; and rotors, wherein the camera is provided at a position facing substantially rearward with respect to the traveling direction of the unmanned aerial vehicle, with a depression angle of about 60 degrees with respect to the horizontal, and the camera is configured to capture an image of a base part of a stem or a side of a leaf of a field crop exposed by the wind created by the rotors.
In order to solve the above problem, the present invention further provides the unmanned aerial vehicle according to Paragraph 0006, further comprising means for adjusting the depression angle of the camera depending on the flying speed, or on the wind force or direction.
In order to solve the above problem, the present invention further provides the unmanned aerial vehicle according to Paragraph 0006 or 0007, further comprising control means for performing posture control so that the camera is always directed backward with respect to the traveling direction of the unmanned aerial vehicle when the flying direction is changed.
Further, in order to solve the above problem, the present invention provides a method for evaluating the growth status, pests, or weeds of a field crop, comprising inputting images of the base part of the stem or the side of the leaf of the field crop, taken by the camera of the unmanned aerial vehicle according to Paragraph 0006, 0007, or 0008, to a neural network.
A drone (unmanned aerial vehicle) that can properly capture images of the base part of the stems and the sides of the leaves of field crops is provided.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. All drawings/figures are exemplary.
As shown in
The inventors' experiments have shown that when the drone is moving at a typical speed (about 5 meters per second), the crops most affected by the airflow created by the rotor blades are located behind the drone with respect to its direction of travel, at a depression angle of about 60 degrees. The camera (103) may preferably be pointed in this direction. Alternatively, a wide-angle camera (103) may be pointed straight down from the drone body, and the images of the base part of the crop and the sides of the leaves may be extracted later.
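As an illustrative sketch (not part of the claimed invention), the geometry above can be made concrete: given a flight altitude, the depression angle determines how far behind the drone the camera's optical axis meets the ground. The 2 m altitude used below is a hypothetical value for illustration; the text only states the angle is about 60 degrees.

```python
import math

def camera_ground_offset(altitude_m: float, depression_deg: float) -> float:
    """Horizontal distance behind the drone at which the camera's
    optical axis intersects the ground, for a given depression angle
    measured from the horizontal."""
    return altitude_m / math.tan(math.radians(depression_deg))

# At a hypothetical 2 m altitude, a 60-degree depression angle aims the
# camera at a point roughly 1.15 m behind the drone.
offset = camera_ground_offset(2.0, 60.0)
print(round(offset, 2))  # → 1.15
```

A shallower angle moves the aim point further back, which is why the optimal angle shifts with flying speed, as discussed below in the text.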
Since the optimal direction of the camera can vary depending on the flying speed of the drone, a speed sensor may be installed in the drone (100), and the direction of the camera (103) may be adjusted depending on the flying speed using a stepping motor or the like. In addition, since the relative position of the area where the crops are temporarily knocked down may be affected by the wind force and direction, the drone (100) may be provided with a wind sensor, and the direction of the camera (103) may be adjusted depending on the wind force and/or direction. The images captured by the camera (103) may be displayed on the remote control unit of the drone (100), and the position of the camera (103) may be fine-tuned manually by the operator of the remote control unit. The camera (103) may be controlled so that it does not take pictures when the drone (100) is hovering or flying at a low speed (e.g., about 3 meters per second or less).
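A minimal control-law sketch of the adjustment described above follows. The linear correction around the 60-degree baseline at 5 m/s, the 3 m/s capture cutoff, and the clamping limits are hypothetical choices for illustration; the text only states that the angle may be adjusted with speed and wind and that capture may be suppressed at low speed.

```python
from typing import Optional

def camera_depression_deg(ground_speed_mps: float,
                          headwind_mps: float = 0.0) -> Optional[float]:
    """Return a target depression angle for the camera, or None when
    the drone is hovering or too slow (<= 3 m/s) and no pictures
    should be taken."""
    if ground_speed_mps <= 3.0:
        return None  # hovering / low speed: suppress capture
    # Airflow over the crops scales with speed through the air.
    effective_speed = ground_speed_mps + headwind_mps
    baseline_speed, baseline_angle = 5.0, 60.0
    # Faster airflow pushes the affected area further back, so the
    # depression angle is lowered (made shallower) per extra m/s.
    angle = baseline_angle - 3.0 * (effective_speed - baseline_speed)
    return max(30.0, min(80.0, angle))

print(camera_depression_deg(5.0))  # → 60.0
print(camera_depression_deg(2.0))  # → None
```

In practice the output of such a function would drive the stepping motor mentioned in the text, with the operator's manual fine-tuning applied as an offset on top.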
As an alternative method, as shown in
Analysis of the images taken by the camera (103) provides a variety of information that could not be obtained previously. For example, the presence of chlorophyll can be detected by analyzing a near-infrared image (e.g., near the 780 nm wavelength), which allows only the crop parts to be extracted from the images. Edge detection can be applied to the extracted image of the crop parts to extract the contour lines of the leaves and determine how much the leaves bend when exposed to the wind. This allows the leaf thickness to be estimated and, as a result, the growth condition of the crop to be estimated. Particularly when the crop is rice, it is also possible to determine the amount of silicon accumulation (because silicon increases the hardness of the leaves). In addition, within the water areas detected by near-infrared image analysis, areas with dense straight lines (detected by edge detection) can be presumed to be the base (near-to-the-root) parts of the crop. When near-infrared edge detection is applied to the base parts and spotted areas are detected, the plant is suspected to be infested with planthoppers. If strongly red areas are seen at the base of the plant, sheath blight disease is suspected. In addition, since plants are usually planted 20 to 30 centimeters apart from each other, if the water surface areas do not appear evenly spaced in the image, weeds are presumed to be present. In addition to these image analyses, as the inventors' experiments have shown, it is possible to perform efficient and accurate analysis with machine learning, using a large number of image data samples as input to a neural network (preferably a deep neural network).
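The even-spacing heuristic for weed detection described above can be sketched as follows. The plant positions are assumed to come from an upstream image-analysis step (hypothetical here); the 20-30 cm planting interval is taken directly from the text.

```python
def suspect_weeds(plant_positions_cm: list[float],
                  min_gap: float = 20.0,
                  max_gap: float = 30.0) -> bool:
    """Flag a crop row for weeds when any gap between consecutive
    detected plants falls outside the normal 20-30 cm planting
    interval, i.e. the water surface does not appear evenly spaced."""
    gaps = [b - a for a, b in zip(plant_positions_cm,
                                  plant_positions_cm[1:])]
    return any(g < min_gap or g > max_gap for g in gaps)

print(suspect_weeds([0, 25, 50, 75]))      # → False (even spacing)
print(suspect_weeds([0, 25, 37, 50, 75]))  # → True  (extra object at 37 cm)
```

A real pipeline would derive the positions from the near-infrared segmentation and edge detection described above; this sketch only captures the final spacing check.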
With the drone according to the present application, it is possible to efficiently acquire images of the base part of the stems and the sides of the leaves of the crops across the entire field. The images thus obtained can be analyzed to develop effective and efficient pest control and fertilization plans. In addition, in the case of rice, the shape of the leaves as they are exposed to the wind can be analyzed to evaluate the amount of silicon accumulation, which can be used to estimate the level of growth of the rice plants and to optimize a fertilizer plan.
Number | Date | Country | Kind |
---|---|---|---|
JP2017-046844 | Mar 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/008497 | 3/6/2018 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2018/168565 | 9/20/2018 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
10175362 | Redden | Jan 2019 | B2 |
10845301 | Ni | Nov 2020 | B2 |
11050979 | Gornik | Jun 2021 | B2 |
11074447 | Fox | Jul 2021 | B1 |
11100641 | Malahe | Aug 2021 | B1 |
11188752 | Papanikolopoulos | Nov 2021 | B2 |
20140146303 | Mitchell et al. | May 2014 | A1 |
20160216245 | Sutton | Jul 2016 | A1 |
20160217562 | Ulman | Jul 2016 | A1 |
20160280397 | Christ | Sep 2016 | A1 |
20180068164 | Cantrell | Mar 2018 | A1 |
20180267008 | Sutton | Sep 2018 | A1 |
20180335372 | Oral | Nov 2018 | A1 |
20180364157 | Ghiraldi | Dec 2018 | A1 |
20190220964 | Mello | Jul 2019 | A1 |
20200077601 | McCall | Mar 2020 | A1 |
20200253127 | McCall | Aug 2020 | A1 |
20200273172 | Weldemariam | Aug 2020 | A1 |
20210073692 | Saha | Mar 2021 | A1 |
20210287001 | Meltzer | Sep 2021 | A1 |
Number | Date | Country |
---|---|---|
108001687 | May 2018 | CN |
2006264573 | Oct 2006 | JP |
WO 2016065071 | Apr 2016 | WO |
WO 2017077543 | May 2017 | WO |
Entry |
---|
Chen et al., "The Drones are Coming: Unmanned Aerial Vehicles for Agriculture", 2015, Citrograph, vol. 6, No. 4, pp. 18-22 (Year: 2015). |
Yu et al., "Crop Row Segmentation and Detection in Paddy Fields Based on Treble-Classification Otsu and Double-Dimensional Clustering Method", 2021, Remote Sensing, whole document (Year: 2021). |
Lass et al., "A review of remote sensing of invasive weeds and example of the early detection of spotted knapweed (Centaurea maculosa) and babysbreath (Gypsophila paniculata) with a hyperspectral sensor", 2005, Weed Science, whole document (Year: 2005). |
Khan et al., "Weed Detection in Crops Using Computer Vision", 2016, University of Central Punjab, whole document (Year: 2016). |
Barrero et al., "Weed Detection in Rice Fields Using Aerial Images and Neural Networks", 2016, IEEE, whole document (Year: 2016). |
Calado et al., "Weed emergence as influenced by soil moisture and air temperature", 2009, Springer, whole document (Year: 2009). |
Number | Date | Country | |
---|---|---|---|
20210316857 A1 | Oct 2021 | US |