The disclosure relates to identification and quantification of plants and various characteristics thereof.
Various known sensors may be present on the planter to monitor what is dropped (seeds) into the rows being planted. These known sensors do not provide data on plant emergence or active growth during the growing season.
To determine plant emergence and/or active growth currently, a grower can go out, measure a fixed length of a row, and manually count the number of emerged plants, which is very time consuming and prone to human error. An additional challenge is that this counting generally happens on a small scale compared to the rest of the field. Further, there is no clear or reliable way of accurately knowing which row of the planter the count came from.
One issue with manually counting a small area is that generally a grower will count 3-4 rows in a location representing 1/1,000 of an acre (generally about 17.5 ft of a 30″ spaced row). When counting in this manner, it may not be observed, nor be obvious, which rows of the planter planted the counted 3-4 rows. Further, with manual counting it is difficult to count the plants in the same location a week or two later to see if additional stand emerged. Further, if the grower uses variable rate seeding, where parts of the field get a different seeding rate than other parts, it may not be known what seeding rate was applied at the counted location.
Additionally, there are certain known drone systems that can provide some rudimentary knowledge regarding plant population/stand. These currently known systems have a number of drawbacks, including the limited utility and usefulness of the information they can provide.
The most common current practice for obtaining stand information prior to harvest is manual stand counting, as discussed above. This requires substantial labor: walking into the field, marking off a distance, and doing the counts one row at a time. This is very time consuming and often limits how many counts are done in a field. This process also makes repeatability of counts difficult to manage. In addition, data collected for the manual counts is often handwritten or manually entered into a spreadsheet and does not contain any spatial locations of the plants. Additionally, the current manual process does not allow for an easy or accurate way to tie the stand counts to other data, such as yield data, the planter row that placed the seed, or multi-spectral imagery, which limits the usefulness of the data being manually collected.
Being able to count plants in a very quick and accurate manner over a large area of a production/research/test field provides helpful information for a grower to know whether the stand that they planted has emerged evenly and at the expected population. Additionally, having spatial locations and other data layered with the stand and emergence data provides helpful information for future agronomic decisions.
Using a georeferenced image allows a grower to capture a much larger area of the field. Consequently, a grower can gain a better understanding of the true emerged population of a crop. Additional advantages of using the systems and methods described herein include capturing multiple planter passes to identify trends of each row, as well as plants that are not part of the intended crop. Further, a grower or the system can combine the imagery as discussed herein with recorded planting GIS data to index the plant counts to the planter pass. Still further, if the grower uses variable rate seeding, the grower can map the various seeding rates on the imagery to know what was intended versus what the emerged plant stand actually is. Still further, the disclosed systems and methods provide the opportunity to identify trends amongst the rows of the planter, such as to see if one row had fewer plants emerge than others. Identifying the size of plants at an early stage can also be advantageous to gauge the emergence from each row. The various data gathered and analyzed by the system can lead to identifying wear parts that need to be replaced, or settings on the planter row that were not adjusted properly. Other metrics like leaf area, canopy closure, and plant stand are significant improvements over what has been done previously.
One example includes a method of counting plants in a field. This example of counting plants also includes acquiring images of crops, mapping (georeferencing) and processing the images, optionally stitching images of planted crops together, and estimating one or more crop characteristics such as plant number, location, size, emergence, growth stage, and the like from the processed images. In various implementations, these crop characteristics can be used in conjunction with other data gathered, such as data gathered during planting, stored by the user, and/or from other sources. Other implementations of this example include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
In Example 1, a method for evaluating post-emergence crop stands, comprising: collecting images of an area of interest; geo-referencing the collected images; processing the collected images to isolate individual plants within the collected images; and determining at least one plant characteristic for each individual plant within the collected images.
Example 2 relates to the method of Example 1, wherein the at least one plant characteristic is selected from leaf length, orientation, surface area, and color.
Example 3 relates to the method of any of Examples 1-2, wherein processing the collected images comprises adjusting color levels within the collected images to isolate a plant color of interest.
Example 4 relates to the method of any of Examples 1-3, further comprising performing a triangulation routine.
Example 5 relates to the method of any of Examples 1-4, further comprising defining a pixel threshold wherein a triangle with a side with a pixel range below the pixel threshold is considered part of a plant and a triangle with a side with a pixel range above the pixel threshold is considered background.
Example 6 relates to the method of any of Examples 1-5, further comprising defining a threshold distance wherein a triangle with a side with a distance below the threshold distance is considered part of a plant and a triangle with a side with a distance above the threshold distance is considered background.
Example 7 relates to the method of any of Examples 1-6, further comprising removing background from the collected images.
Example 8 relates to the method of any of Examples 1-7, further comprising identifying a central point of each triangle comprised of plant pixels; connecting each of the central points; and identifying a junction of two or more central points to locate a stem of the plant.
Example 9 relates to the method of any of Examples 1-8, further comprising counting each plant within the collected images to determine a stand count.
Example 10 relates to the method of any of Examples 1-9, further comprising comparing leaf length and surface area.
Example 11 relates to the method of any of Examples 1-10, further comprising merging collected images with geo-referenced as-planted data.
Example 12 relates to the method of any of Examples 1-11, further comprising merging collected images with geo-referenced harvest data.
Example 13 relates to the method of any of Examples 1-12, further comprising identifying volunteer plants and weeds.
Example 14 relates to the method of any of Examples 1-13, further comprising identifying late emerged plants.
Example 15 relates to the method of any of Examples 1-14, further comprising: collecting imagery of the area of interest at a second point in time; and overlaying the collected imagery from the second point in time with the collected imagery.
In Example 16, a method for evaluating post-emergence crops, comprising: capturing images of an area of interest at a first point in time; geo-referencing the images with as-planted data; processing the images to isolate individual plants within the images; modeling the surface of the individual plants; generating at least one crop characteristic for each individual plant; identifying a location for each individual plant; and comparing the locations with as-planted data of expected plant locations.
Example 17 relates to the method of Example 16, further comprising generating a point map of each plant location.
Example 18 relates to the method of any of Examples 16-17, further comprising categorizing a plant growth stage for each individual plant and mapping the plant growth stage on the point map.
Example 19 relates to the method of any of Examples 16-18, further comprising identifying plants more than a threshold distance off a centerline for each crop row.
In Example 20, a method for comparing as-planted data to post-emergence data, comprising: acquiring as-planted data from a field computing device; identifying planter rows for each implement pass; creating a point map of individual plant locations from collected images; assigning plants to a planter row and implement pass; determining any plants that are more than a threshold distance from a centerline of a planter row; and generating a distance between adjacent plants.
While multiple embodiments are disclosed, still other embodiments of the disclosure will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. As will be realized, the disclosure is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the disclosure. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
The disclosed systems, apparatus, and methods relate to the evaluation of crop emergence and other characteristics via the processing of imagery, optionally high-resolution imagery. With the use of imagery, the disclosed system, apparatus, and methods are able to identify individual plants in a field, as well as various characteristics of those plants. The various characteristics of the plants can include leaf area, leaf length, and/or the number and placement of plants within a research plot, subsection of a field, or a whole field. This plant characteristic data can be merged with other agricultural map layers such as planting, derived analysis/machine learning data/layers, harvest stand counts, and other related data as would be understood. By combining various data points and imagery, the system, apparatus, and methods disclosed herein can quantify various agricultural analyses such as to better understand the relationship of the planted seeds, to the emerged plants, to the harvested plants. In various implementations, the imagery and/or other data is gathered in the early season, for example just after plant emergence.
The combined data and information from the system, apparatus, and methods described herein can be used to improve planting practices, improve seed genetics, and identify yield potential during the growing season based on measurements of a single or multiple geo-referenced image(s).
In conjunction with the described implementations, it is understood that a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
Certain of the disclosed implementations can be used in conjunction with any of the devices, systems or methods taught or otherwise disclosed in U.S. Pat. No. 10,684,305 issued Jun. 16, 2020, entitled “Apparatus, Systems and Methods for Cross Track Error Calculation From Active Sensors,” U.S. patent application Ser. No. 16/121,065, filed Sep. 4, 2018, entitled “Planter Down Pressure and Uplift Devices, Systems, and Associated Methods,” U.S. Pat. No. 10,743,460, issued Aug. 18, 2020, entitled “Controlled Air Pulse Metering apparatus for an Agricultural Planter and Related Systems and Methods,” U.S. Pat. No. 11,277,961, issued Mar. 22, 2022, entitled “Seed Spacing Device for an Agricultural Planter and Related Systems and Methods,” U.S. patent application Ser. No. 16/142,522, filed Sep. 26, 2018, entitled “Planter Downforce and Uplift Monitoring and Control Feedback Devices, Systems and Associated Methods,” U.S. Pat. No. 11,064,653, issued Jul. 20, 2021, entitled “Agricultural Systems Having Stalk Sensors and/or Data Visualization Systems and Related Devices and Methods,” U.S. Pat. No. 11,297,768, issued Apr. 12, 2022, entitled “Vision Based Stalk Sensors and Associated Systems and Methods,” U.S. patent application Ser. No. 17/013,037, filed Sep. 4, 2020, entitled “Apparatus, Systems and Methods for Stalk Sensing,” U.S. patent application Ser. No. 17/226,002 filed Apr. 8, 2021, and entitled “Apparatus, Systems and Methods for Stalk Sensing,” U.S. Pat. No. 10,813,281, issued Oct. 27, 2020, entitled “Apparatus, Systems, and Methods for Applying Fluid,” U.S. patent application Ser. No. 16/371,815, filed Apr. 1, 2019, entitled “Devices, Systems, and Methods for Seed Trench Protection,” U.S. patent application Ser. No. 16/523,343, filed Jul. 26, 2019, entitled “Closing Wheel Downforce Adjustment Devices, Systems, and Methods,” U.S. patent application Ser. No. 16/670,692, filed Oct. 31, 2019, entitled “Soil Sensing Control Devices, Systems, and Associated Methods,” U.S. patent application Ser. No. 16/684,877, filed Nov. 15, 2019, entitled “On-The-Go Organic Matter Sensor and Associated Systems and Methods,” U.S. Pat. No. 11,523,554, issued Dec. 13, 2022, entitled “Dual Seed Meter and Related Systems and Methods,” U.S. patent application Ser. No. 16/891,812, filed Jun. 3, 2020, entitled “Apparatus, Systems and Methods for Row Cleaner Depth Adjustment On-The-Go,” U.S. patent application Ser. No. 16/918,300, filed Jul. 1, 2020, entitled “Apparatus, Systems, and Methods for Eliminating Cross-Track Error,” U.S. patent application Ser. No. 16/921,828, filed Jul. 6, 2020, entitled “Apparatus, Systems and Methods for Automatic Steering Guidance and Visualization of Guidance Paths,” U.S. patent application Ser. No. 16/939,785, filed Jul. 27, 2020, entitled “Apparatus, Systems and Methods for Automated Navigation of Agricultural Equipment,” U.S. patent application Ser. No. 16/997,361, filed Aug. 19, 2020, entitled “Apparatus, Systems and Methods for Steerable Toolbars,” U.S. patent application Ser. No. 16/997,040, filed Aug. 19, 2020, entitled “Adjustable Seed Meter and Related Systems and Methods,” U.S. patent application Ser. No. 17/011,737, filed Sep. 3, 2020, entitled “Planter Row Unit and Associated Systems and Methods,” U.S. patent application Ser. No. 17/060,844, filed Oct. 1, 2020, entitled “Agricultural Vacuum and Electrical Generator Devices, Systems, and Methods,” U.S. patent application Ser. No. 17/105,437, filed Nov. 
25, 2020, entitled “Devices, Systems and Methods For Seed Trench Monitoring and Closing,” U.S. patent application Ser. No. 17/127,812, filed Dec. 18, 2020, entitled “Seed Meter Controller and Associated Devices, Systems and Methods,” U.S. patent application Ser. No. 17/132,152, filed Dec. 23, 2020, entitled “Use of Aerial Imagery For Vehicle Path Guidance and Associated Devices, Systems, and Methods,” U.S. patent application Ser. No. 17/164,213, filed Feb. 1, 2021, entitled “Row Unit Arm Sensor and Associated Systems and Methods,” U.S. patent application Ser. No. 17/170,752, filed Feb. 8, 2021, entitled “Planter Obstruction Monitoring and Associated Devices and Methods,” U.S. patent application Ser. No. 17/225,586, filed Apr. 8, 2021, entitled “Devices, Systems, and Methods for Corn Headers,” U.S. patent application Ser. No. 17/225,740, filed Apr. 8, 2021, entitled “Devices, Systems, and Methods for Sensing the Cross Sectional Area of Stalks,” U.S. patent application Ser. No. 17/323,649, filed May 18, 2021, entitled “Assisted Steering Apparatus and Associated Systems and Methods,” U.S. patent application Ser. No. 17/369,876, filed Jul. 7, 2021, entitled “Apparatus, Systems, and Methods for Grain Cart-Grain Truck Alignment and Control Using GNSS and/or Distance Sensors,” U.S. patent application Ser. No. 17/381,900, filed Jul. 21, 2021, entitled “Visual Boundary Segmentations and Obstacle Mapping for Agricultural Vehicles,” U.S. patent application Ser. No. 17/461,839, filed Aug. 30, 2021, entitled “Automated Agricultural Implement Orientation Adjustment System and Related Devices and Methods,” U.S. patent application Ser. No. 17/468,535, filed Sep. 7, 2021, entitled “Apparatus, Systems, and Methods for Row-by-Row Control of a Harvester,” U.S. patent application Ser. No. 17/526,947, filed Nov. 15, 2021, entitled “Agricultural High Speed Row Unit,” U.S. patent application Ser. No. 17/566,678, filed Dec. 20, 2021, entitled “Devices, Systems, and Method For Seed Delivery Control,” U.S. patent application Ser. No. 17/576,463, filed Jan. 14, 2022, entitled “Apparatus, Systems, and Methods for Row Crop Headers,” U.S. patent application Ser. No. 17/724,120, filed Apr. 19, 2022, entitled “Automatic Steering Systems and Methods,” U.S. patent application Ser. No. 17/742,373, filed May 11, 2022, entitled “Calibration Adjustment for Automatic Steering Systems,” U.S. patent application Ser. No. 17/902,366, filed Sep. 2, 2022, entitled “Tile Installation System with Force Sensor and Related Devices and Methods,” U.S. patent application Ser. No. 17/939,779, filed Sep. 7, 2022, entitled “Row-by-Row Estimation System and Related Devices and Methods,” U.S. patent application Ser. No. 18/081,432, filed Dec. 14, 2022, entitled “Seed Tube Guard and Associated Systems and Methods of Use,” U.S. patent application Ser. No. 18/087,413, filed Dec. 22, 2022, entitled “Data Visualization and Analysis for Harvest Stand Counter and Related Systems and Methods,” U.S. patent application Ser. No. 18/097,801, filed Jan. 17, 2023, entitled “Agricultural Mapping and Related Systems and Methods,” U.S. patent application Ser. No. 18/101,394, filed Jan. 25, 2023, entitled “Seed Meter with Integral Mounting Method for Row Crop Planter and Associated Systems and Methods,” U.S. patent application Ser. No. 18/102,022, filed Jan. 26, 2023, entitled “Load Cell Backing Plate and Associated Devices, Systems, and Methods,” U.S. patent application Ser. No. 18/116,714, filed Mar. 
2, 2023, entitled “Cross Track Error Sensor and Related Devices, Systems, and Methods,” U.S. patent application Ser. No. 18/203,206, filed May 27, 2022, entitled “Seed Tube Camera and Related Devices, Systems and Methods,” U.S. Patent Application 63/357,082, filed Jun. 30, 2022, entitled “Seed Tube Guard,” U.S. Patent Application 63/357,284, filed Jun. 30, 2022, entitled “Grain Cart Bin Level Sharing,” U.S. Patent Application 63/394,843, filed Aug. 3, 2022, entitled “Hydraulic Cylinder Position Control for Lifting and Lowering Towed Implements,” U.S. Patent Application 63/395,061, filed Aug. 4, 2022, entitled “Seed Placement in Furrow,” U.S. Patent Application 63/400,943, filed August 2022, entitled “Combine Yield Monitor,” U.S. Patent Application 63/406,151, filed Sep. 13, 2022, entitled “Hopper Lid with Magnet Retention and Related Systems and Methods,” U.S. Patent Application 63/427,028, filed Nov. 21, 2022, entitled “Stalk Sensors and Associated Devices, Systems and Methods,” U.S. Patent Application 63/445,960, filed Feb. 15, 2023, entitled “Ear Shelling Detection and Related Devices, Systems, and Methods,” U.S. Patent Application 63/445,550, filed Feb. 14, 2023, entitled “Liquid Flow Meter and Flow Balancer,” U.S. Patent Application 63/466,144, filed May 12, 2023, entitled “Devices, Systems, and Methods for Providing Yield Maps,” and U.S. Patent Application 63/466,560, filed May 15, 2023, entitled “Devices, Systems, and Methods for Agricultural Guidance and Navigation,” each of which is incorporated by reference herein.
Turning to the figures in more detail, in various implementations a single image may show around 2-4 acres, but as will be described further herein, multiple images can optionally be put together (stitched) to form a larger map/image. The area depicted in an image may vary depending on the type of platform from which the image is acquired. For example, aerial platforms are likely to capture a larger area in a single image when compared to a ground-based platform, which will likely image a smaller area from a closer distance. For images captured by a ground-based platform—handheld or vehicle/implement mounted—the captured area per image can be a linear row or rows, or a small area of interest.
After images are captured, they may optionally be stitched together to form a single scene or mosaic of multiple combined images. One exemplary stitched image 10 is shown in the figures.
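A minimal, non-limiting sketch of such stitching, assuming the open-source OpenCV library's high-level Stitcher in scan mode (the library choice and file names are illustrative assumptions, not requirements of the disclosure), might be:

```python
# Sketch: stitching overlapping field images into one mosaic using
# OpenCV's Stitcher in SCANS mode (suited to downward-looking imagery).
import cv2

# Hypothetical file names for overlapping images from sequential passes.
images = [cv2.imread(p) for p in ["pass_01.jpg", "pass_02.jpg", "pass_03.jpg"]]

stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)  # planar-scene assumption
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("stitched_field.jpg", mosaic)
else:
    print(f"Stitching failed with status {status}")
```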
Turning back to the figures, in various further implementations the imagery can optionally be manipulated/modified in various ways to improve/enhance processing (box 106). In one exemplary implementation, the imagery is modified to improve the contrast between the growing plants and the remainder of the field (bare soil, previous crop residue, tire tracks, etc.), shown for example in the figures.
In another example, the imagery can be modified to improve contrast via certain image processing techniques, such as via the use of the normalized difference vegetation index (NDVI) with an NIR-equipped camera or similar. In these implementations, using the formula NDVI = (NIR − R)/(NIR + R) will isolate the green plant material for use as described herein. Various alternative methods and processes for enhancing contrast are possible and would be recognized by those of skill in the art.
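A minimal, non-limiting sketch of this NDVI computation, assuming a Python environment with numpy and co-registered NIR and red bands (the 0.3 vegetation threshold is an illustrative assumption), might be:

```python
# Sketch: per-pixel NDVI = (NIR - R) / (NIR + R), thresholded to isolate
# green plant material. Bands are assumed co-registered and same shape.
import numpy as np

def ndvi_mask(nir: np.ndarray, red: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    denom[denom == 0] = 1e-9        # avoid division by zero on dark pixels
    ndvi = (nir - red) / denom      # NDVI values fall in [-1, 1]
    return ndvi > threshold         # True where living vegetation is likely
```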
In certain alternate implementations, the target plant/fruit color may not be green, and as such different processing may be needed to isolate the color of the target crop for enhancing contrast. If the object that the user is trying to isolate is a color other than green (e.g., orange pumpkins in a pumpkin patch), the system 100 or user is able to define and use a different formula to allow the desired color to be isolated from the non-desired or non-target color(s). These changes to the formulae would be recognized by those of skill in the art.
Other image improvements for isolating the plant in the image from the rest of the non-plant items can optionally include adjusting the hue, saturation, and/or luminance to create a greater difference between the plant 1 and non-plant 2 areas. In several of these approaches, the brightness and/or scale of the image can be adjusted for better results, making target plants 1 more pronounced and the edges of the plant 1 and/or plant leaves easier to identify.
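A minimal, non-limiting sketch of such hue/saturation-based isolation, assuming the OpenCV library and illustrative green bounds that would be tuned per camera, crop, and lighting, might be:

```python
# Sketch: isolate green plant pixels by thresholding in HSV color space.
import cv2
import numpy as np

image = cv2.imread("field_image.jpg")            # hypothetical file name
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

lower_green = np.array([35, 40, 40])             # assumed lower HSV bound
upper_green = np.array([85, 255, 255])           # assumed upper HSV bound
plant_mask = cv2.inRange(hsv, lower_green, upper_green)

# Keep only plant pixels to heighten plant/background contrast.
contrast_image = cv2.bitwise_and(image, image, mask=plant_mask)
```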
Other methods to identify the plant include machine learning, where the shape and color of the plant are identified by comparison to sample images of the same type of plant. For example, a machine learning algorithm can match the shape and orientation of the leaves to another image that was used as a sample in training the program. Various alternative methods and processes for application of machine learning are possible and would be recognized by those of skill in the art.
In various implementations, when the individual plants 1 are identifiable by the system 100, with or without modification/processing (box 106), the system 100 can analyze individual plants 1 such as by modeling the surface (box 108). In one optional step, the entire image 10 may be covered with a triangulation routine such as a Delaunay triangulation. Triangulation is performed between the pixels identified as plants—those defined as a target color (green), which after processing may be white or another contrasting color. Lines connecting leaf edges on the same plant 1 will be shorter than lines connecting different plants 1, which encompass the background 2, as can be seen in the figures.
Utilizing the pixels of the image 10, triangles with more than a threshold number of pixels between corners are removed. For example, if there are more than a threshold number of pixels (e.g., >6) between two corners of a triangle, the triangle will be removed from the image because it is identified as being a non-plant/background 2 area. The shorter lines that have fewer than the threshold number of pixels (e.g., 6 pixels) and remain in the image are identified as those that cover the plant 1 leaf, and these are the areas of the plant 1 visible in the picture/image 10.
In various alternative implementations, the system 100 or user identifies a measured distance between the corners of the triangle, and that distance is used instead of, or in addition to, the number of pixels between corners of the triangles to remove the space between plants or between leaves of the same plant.
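A minimal, non-limiting sketch of such a triangulation filter, assuming scipy's Delaunay implementation and the 6-pixel threshold from the example above, might be:

```python
# Sketch: Delaunay triangulation over plant-classified pixel coordinates,
# discarding triangles whose longest side exceeds a pixel threshold.
# Surviving triangles are treated as plant 1 surface; removed ones as
# background 2 spanning the gaps between plants.
import numpy as np
from scipy.spatial import Delaunay

def plant_triangles(plant_pixels: np.ndarray, max_side_px: float = 6.0):
    """plant_pixels: (N, 2) array of (x, y) coordinates of plant pixels."""
    tri = Delaunay(plant_pixels)
    keep = []
    for simplex in tri.simplices:
        pts = plant_pixels[simplex]
        # Lengths of the triangle's three sides.
        sides = [np.linalg.norm(pts[i] - pts[(i + 1) % 3]) for i in range(3)]
        if max(sides) <= max_side_px:
            keep.append(simplex)    # short sides: same leaf/plant
    return np.array(keep)           # long-sided triangles spanned background
```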
Turning back to the figures, in various implementations the system 100 can estimate the location of the plant stem 3 as the location where several of the connecting center lines come together and form a “Y”, or junction (box 112), as seen in the figures.
As would be understood, the spacing between plants, and where the leaves of neighboring plants touch each other, can impact the ability to identify an individual plant. With the imperfect spacing typical of most planters, the spacing can be variable, where 2 plants may be 3″ apart and the next 2 plants might be 8″ apart. By using a minimum distance filter between estimated stem 3 location points, the system 100 can account for some of this variability. As an example, using a 3″ plant spacing filter, if a junction 3 is more than 3 inches away from the previous plant's 1 junction 3, the system 100 identifies two different plants 1. If the junctions 3 are closer than 3″, the system 100 may assume there is only one plant 1; for example, the orientation of the plant 1 leaves may have made it appear that there were two plants 1 on top of each other when in fact there was only one plant 1.
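A minimal, non-limiting sketch of this minimum distance filter, assuming stem candidate coordinates in inches produced by an upstream junction-finding step, might be:

```python
# Sketch: merge candidate stem locations that fall within the minimum
# plant-spacing filter (3" here, per the example above), so one plant
# whose leaves produced two nearby junctions is not counted twice.
import numpy as np

def merge_stem_candidates(candidates: np.ndarray, min_spacing_in: float = 3.0):
    """candidates: (N, 2) array of estimated stem (x, y) positions, inches."""
    stems: list[np.ndarray] = []
    for point in candidates:
        if all(np.linalg.norm(point - s) >= min_spacing_in for s in stems):
            stems.append(point)     # far enough away: a distinct plant 1
        # else: within the filter distance, assumed to be the same plant 1
    return np.array(stems)
```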
By repeating some of these processes through the image(s) 10, the system 100 can classify each plant 1 and determine the average leaf length, for example by taking an average of the lengths of the longest two branches off the junction 3, mentioned above. In various implementations, the system 100 can classify the size of the plant 1 and tag each plant 1 location with the leaf length for use by the system 100.
In certain implementations, the system 100 can classify the length of a leaf into a crop growth stage and/or into a user- or system 100-defined range. For example, a leaf length from 1.3″ to 2″ is a ‘V1’ stage and from 2″ to 3″ is a ‘V2’ crop stage, and so on, as would be understood.
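A minimal, non-limiting sketch of such a classification, using the illustrative ranges above (the stages outside those ranges are assumptions added for completeness), might be:

```python
# Sketch: bin a measured leaf length (inches) into a crop growth stage
# using the illustrative ranges from the text.
def growth_stage(leaf_length_in: float) -> str:
    if 1.3 <= leaf_length_in < 2.0:
        return "V1"                 # 1.3" to 2" per the example above
    if 2.0 <= leaf_length_in < 3.0:
        return "V2"                 # 2" to 3" per the example above
    if leaf_length_in >= 3.0:
        return "V3+"                # assumed catch-all beyond the text
    return "VE"                     # assumed: shorter than the V1 range
```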
In some crops, such as corn, the yield is impacted by how evenly the plants emerge. At an early stage of emergence, the difference in plant size is apparent enough to distinguish a late-emerged plant from a plant that came up with the majority of the rest of the field.
In some implementations, the system 100 may gather image data over several days, or more than once separated by about 36-72 hours or more, as would be understood. The time between collections of imagery can vary depending on crop type, weather, and other conditions as would be understood. The imagery from the first collection can be compared with the imagery from a second or subsequent collection, including the triangulated leaf area from each collection, to compare the amount of plant growth between the collection times. In various implementations, the imagery and data therefrom can be visible as multiple layers in a single map, as would be understood. This type of analysis can be helpful in measuring the efficiency of plant growth during a known weather environment, which can be helpful in ranking plant genetics that will grow better under adverse conditions. This is in addition to the ability to compare the normal plant stand and confirm that the plant stand found in a first collection closely matches the plant stand in a second collection.
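A minimal, non-limiting sketch of comparing triangulated leaf area between two collections, assuming hypothetical per-plant area dictionaries produced by the triangulation step, might be:

```python
# Sketch: per-plant leaf area gained between two collection passes.
def growth_between_passes(area_pass1: dict, area_pass2: dict) -> dict:
    """Map plant id -> leaf area gained (square inches) between passes."""
    return {
        plant_id: area_pass2[plant_id] - area_pass1[plant_id]
        for plant_id in area_pass1
        if plant_id in area_pass2   # only plants matched in both passes
    }
```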
Turning back to the figures, in a further example, by comparing to the as-planted data, conclusions about the quality of seed metering (seed spacing) can be drawn. Erratic spacing can equally damage yield potential even if the plants emerge at the same time: the closer plants are to each other, the more their root systems will need to compete for water and nutrients in the growing season, as can be seen in the figures.
As noted above, during the process of identifying the plants, a measurement of the leaves can help determine size compared to the neighboring plants. This is important for seeing the number of late-emerged plants, as they are more likely to not produce a harvestable ear. By measuring all the plants and then assigning the length of each leaf to each plant, the system 100 can create a map layer that shows the size of each plant, and can then categorize plants by grouping them into growth stages to present to a producer. This allows the producer to see at a glance how many plants are behind a growth stage and subject to a risk of not producing a harvestable ear. This would likely trigger the grower to inspect planting equipment for maintenance, upgrade parts, and/or change planting practices based on the weather/soil conditions that led to uneven emergence, to improve the evenness of emergence in future years. In some implementations, a producer may also be able to see how even the plants are after improvements by comparing imagery year over year or between passes where changes are made. Other possibilities include that some plant/seed varieties have different genetics for emergence compared to others, such that the system 100 may allow for comparison or evaluation of genetics/plant characteristics between seed varieties. That is, if no mechanical adjustments were made to the planter but the plant/seed variety changed, a difference in emergence or other characteristics may be attributed to plant genetics. This data may allow a grower or the system 100 to evaluate and trigger different selections of seed varieties in future years to achieve more even emergence and higher overall yield.
Furthermore, some captured images will show a plant between the rows of a planted crop, such as a volunteer crop plant or weed, shown in the figures.
As would be understood, geo-referencing images from a drone is not always spatially accurate. Generally, the images are accurate to within a few feet or less of the true position of the objects they capture. With reference information from planting, in-season applications, and/or harvest data that was collected with an accurate GPS receiver, such as sub-inch RTK receivers, the imagery can be adjusted in any direction to the spatial positions/passes where data was collected (box 104). This allows the option to improve the alignment of the imagery to where the data was collected spatially. Conversely, if the accuracy of the planting or other data is poorer than the image, that data could be adjusted to match the image as well (box 104). Outside factors that create some of this error can include slope of the field (a flat image rendered from a curved surface), or the angle of a downward-looking camera that was not completely vertical due to wind or other forces that are understood to affect camera orientation.
In some cases, a missing plant or several missing plants in close proximity to each other may imply that field traffic caused some of the stand loss, as seen in the figures.
With the imagery, and the ability to document the same plant a few days apart, it is also possible to see changes in the color of the leaves. During the early growth period of a plant, it is generally expected that the leaves become larger and convert more sunlight into energy to continue to grow. If a first collection pass shows, and a second collection pass confirms, that there is an area within the image that is not as healthy as in the first image (due to pale green color, variable coloring of leaves, spots, lesions, holes, or plants that have died), a grower could assess the percentage of change and take action, if able, to reverse or limit the damage. These actions could include fertilizer applications or pesticide applications to solve an issue that is damaging the crop and improve profitability.
By looking at the amount of area covered by a plant leaf, and how much soil area remains visible to the sky, the system 100 and/or a user can assess the plant canopy in the field with a high degree of accuracy. The importance of plant canopy in agriculture is to reduce the opportunity for weeds to compete with the growing crop. One of the ways this is done is by the desired crop shading the soil to prevent weeds from being able to capture sunlight and compete with the crop. By observing an area multiple times and comparing the growth of individual plants, the system 100 could estimate the time remaining until the crop achieves a certain level or percentage of canopy. By contrast, if the crop never reaches canopy, the system 100 can estimate how much open area still exists.
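A minimal, non-limiting sketch of such a canopy estimate, assuming a boolean plant mask such as the one produced by the color-isolation step above, might be:

```python
# Sketch: percent canopy closure as the share of pixels classified as
# plant within the imaged area of interest.
import numpy as np

def canopy_percent(plant_mask: np.ndarray) -> float:
    """plant_mask: boolean array, True where a pixel is plant canopy."""
    return 100.0 * np.count_nonzero(plant_mask) / plant_mask.size
```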
The system 100 and methods described herein are able to count plants in a quick and accurate manner over a large area of a production/research/test field to provide information to a grower, as discussed herein. Images and map layers collected early in the season can be compared to other counting methods as well. The emerged stand early in the season is a good indicator, but the final stand is generally considered what will make a good or poor crop. The emergence data from early in the season can also be compared with harvest sensors. This would provide further evidence of what was counted early in the growing season and a comparison of what was harvested at the end of the growing season. This comparison can assist in determining at what point in the season an issue was encountered affecting yield.
As noted above, known solutions for evaluating plant emergence and early plant characteristics are time consuming and prone to human error. Using the imagery-based approach described herein allows the grower to quickly look at multiple rows, and multiple segments of the same row, to identify trends and areas of concern, so the grower can go back to the field to ground truth them or inspect the planter for issues that would contribute to the problems found.
Another advantage of using such an image-based approach is that it provides a repeatable process for counting the same area of the field multiple times if needed, or looking back over time to see if the emergence has changed over that period. This is especially true in the early season when plants are emerging at different times and the stand changes, so what was counted as 30,000 plants/acre in week one may turn into 33,000 plants/acre in week two. Having an image of the area of interest can validate that there were 3,000 more plants in the same location in the field than the week before. This can also help in times of a dispute between a grower and a provider of the seed/fertilizer/crop treatment when there is a reduced count in a portion of the field compared to a second area of the field without the same treatment.
As noted above, there are a number of known drone systems that will provide an estimate of plant stand for a whole field in coarse resolution. In some cases, these known drone systems will fly an 80-acre field, and the drone will stop over 80 spots in the field, capture a single or limited number of images, and then move to the next spot. With these prior systems there is a count of the emerged stand at that time, but there is no reference to which rows were captured, what size of plants were present, or whether there were volunteer corn plants or weeds in that same scene. As such, the disclosed method provides more ground truth as to what is present in the field for a harvestable crop. Other methods will show a similar point map, but have no intelligence as to which row each plant came from, nor do they show the leaf surface area, growth stage, etc.
The disclosed system, apparatus, and methods can give a grower geo-located proof of the stand and surrounding metrics that are not possible by counting plants manually. Should there be a question at the end of the growing season of how plants emerged, this could serve as dated, photographic, and spatially located evidence of what was present in a specific spot.
In a further optional step, individual row-level lines from the full implement-width passes are identified (box 152). In these and other implementations, the system 100 identifies the individual rows for each planter pass, optionally matching rows planted by a particular row unit between passes.
In another optional step, the system 100 creates a point map of detected plants (box 166), discussed in further detail above. The system 100 can then assign each detected plant to the closest row (box 154). The system 100 may also optionally identify an offset distance from the center of the row for each detected plant (box 156). In some implementations, plants outside a threshold distance from the row may not be assigned a row and may instead be labeled as a volunteer plant or weed, as appropriate (box 164).
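A minimal, non-limiting sketch of this row assignment and volunteer/weed flagging, with rows simplified to straight centerlines at constant x positions and an assumed 8″ offset threshold (a production system would use geo-referenced row lines from as-planted data), might be:

```python
# Sketch: assign each detected plant to the nearest row centerline and
# flag plants beyond the offset threshold as volunteers/weeds.
import numpy as np

def assign_rows(plants: np.ndarray, row_x: np.ndarray, max_offset_in: float = 8.0):
    """plants: (N, 2) (x, y) positions, inches; row_x: x of each centerline."""
    assignments = []
    for x, y in plants:
        offsets = np.abs(row_x - x)
        row = int(np.argmin(offsets))       # nearest row centerline
        if offsets[row] <= max_offset_in:
            assignments.append((x, y, row, offsets[row]))
        else:
            assignments.append((x, y, None, offsets[row]))  # volunteer/weed
    return assignments
```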
In another optional step, the system 100 can identify the row for each plant (the row unit on the planter that planted the seed) (box 158). Additionally, the system 100 can identify the planter pass for each plant and row. In various implementations, this row and pass data can be used to determine the distance between plants.
In a further optional step, the system 100 can use row-to-row measurements to determine plant-to-plant spacing for plants within the same row (box 160). Additionally or alternatively, the system 100 can determine plant population. Further, optionally, the system 100 can compare different passes and overall performance from the data (including but not limited to planter pass, row spacing, plant spacing, and population) for optional future improvement and/or remediation steps (box 160).
Additionally, the system 100 may automatically or semi-automatically perform various statistical analyses for individual plants, rows, planter passes, test plots, the whole field, and the like, as would be understood (box 162).
In some implementations, the collected imagery data is compared to as-planted information.
Plants 1 that are more than a threshold distance from the row 4 can be shown in a contrasting color or with another symbol to be easily identified visually by a user. The system 100 may also create a data point or log for each plant 1 not assigned to a row 4. These volunteer plants 1A and weeds are important values as discussed herein. Knowing the distance between rows 4 of sensed plants 1, that distance can then be applied to the plants 1 within the row 4 to determine plant 1 spacing and plant 1 populations. If the system 100 knows the rows 4 are 30″ apart, the system 100 can determine a scale for the distance of plants 1 within the same row 4 and find spacing and population, as would be understood.
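A minimal, non-limiting sketch of deriving population from row spacing and average in-row plant spacing (43,560 square feet per acre) might be:

```python
# Sketch: plants per acre from row spacing and average plant spacing.
def population_per_acre(row_spacing_in: float, plant_spacing_in: float) -> float:
    # Ground area occupied by one plant, in square feet.
    sq_ft_per_plant = (row_spacing_in / 12.0) * (plant_spacing_in / 12.0)
    return 43_560.0 / sq_ft_per_plant

# Example: 30" rows with 6" average spacing -> 43,560 / 1.25 = 34,848.
print(population_per_acre(30.0, 6.0))
```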
Ranges can be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, a further aspect includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms a further aspect. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint and independently of the other endpoint. It is also understood that there are a number of values disclosed herein, and that each value is also herein disclosed as “about” that particular value in addition to the value itself. For example, if the value “10” is disclosed, then “about 10” is also disclosed. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
Although the disclosure has been described with reference to various embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of this disclosure.
This application claims the benefit under 35 U.S.C. § 119(e) to U.S. Provisional Application 63/351,602, filed Jun. 13, 2022, and entitled APPARATUS, SYSTEMS AND METHODS FOR IMAGE PLANT COUNTING, which is hereby incorporated herein by reference in its entirety for all purposes.