APPARATUS, SYSTEMS AND METHODS FOR IMAGE PLANT COUNTING

Information

  • Patent Application
  • Publication Number
    20230401703
  • Date Filed
    June 13, 2023
  • Date Published
    December 14, 2023
Abstract
A method for evaluating post emergence crops, comprising: capturing images of an area of interest at a first point in time; geo-referencing the images with as-planted data; processing the images to isolate individual plants within the images; modeling the surface of the individual plants; generating at least one crop characteristic for each individual plant; identifying a location for each individual plant; and comparing the locations with as-planted data of expected plant locations.
Description
TECHNICAL FIELD

The disclosure relates to identification and quantification of plants and various characteristics thereof.


BACKGROUND

Various known sensors may be present on the planter to monitor what is dropped (seeds) into the rows being planted. These known sensors do not provide data on plant emergence or active growth during the growing season.


To determine plant emergence and/or active growth, a grower currently must go out, measure a fixed length of a row, and manually count the number of emerged plants, which is very time consuming and prone to human error. An additional challenge is that such counts generally happen on a small scale compared to the rest of the field. Further, there is no clear or reliable way of accurately knowing the row of the planter from which the count came.


One issue with manually counting a small area is that generally a grower will count 3-4 rows in a location that represents 1/1,000 of an acre (generally about 17.5 ft of a 30″ spaced row). By manually counting in this manner, it may not be observed, nor be obvious, which rows of the planter planted the counted 3-4 rows. Further, with manual counting it is difficult to count the plants in the same location a week or two later to see if additional stand emerged. Further, if the grower uses variable rate seeding, where parts of the field get a different seeding rate than other parts, it may not be known what seeding rate was applied at the counted location.


Additionally, there are certain known drone systems that can provide some rudimentary knowledge regarding plant population/stand. These known systems have a number of drawbacks, including the limited utility and usefulness of the information they can provide.


BRIEF SUMMARY

The most common current practice for obtaining stand information prior to harvest is manual stand counting, as discussed above. This requires substantial labor: walking into the field, marking off a distance, and doing the counts one row at a time. This is very time consuming and often limits how many counts are done in a field. This process also makes repeatability of counts difficult to manage. In addition, data collected for the manual counts is often handwritten or manually entered into a spreadsheet and does not contain any spatial locations of the plants. Additionally, the current manual process does not allow for an easy or accurate way to tie the stand counts to other data, such as yield data, the planter row that placed the seed, or multi-spectral imagery, which limits the usefulness of the data being manually collected.


Being able to count plants in a very quick and accurate manner over a large area of a production/research/test field provides helpful information for a grower to know if the stand that they planted has emerged evenly and at the expected population. Additionally, having spatial locations and other data layered with the stand and emergence data provides helpful information for future agronomic decisions.


Using a georeferenced image allows a grower to capture a much larger area of the field. Subsequently, this means that a grower can gain a better understanding of the true emerged population of a crop. Additional advantages of using the systems and methods described herein include capturing multiple planter passes to identify trends of each row, as well as identifying plants that are not part of the intended crop. Further, a grower or the system can combine the imagery as discussed herein with recorded planting GIS data to index the plant counts to the planter pass. Still further, if the grower uses variable rate seeding, the grower can map the various seeding rates on the imagery to know what was intended and what the emerged plant stand actually is. Still further, the disclosed systems and methods provide the opportunity to identify trends amongst the rows of the planter, such as seeing if one row had fewer plants emerge than others. Identifying the size of plants at an early stage can also be advantageous to gauge the emergence from each row. The various data gathered and analyzed by the system can lead to identifying wear parts that need to be replaced, or settings on the planter row that were not adjusted properly. Other metrics like leaf area, canopy closure, and plant stand are significant improvements over what has been done previously.


One example includes a method of counting plants in a field. This example of counting plants also includes acquiring images of crops, mapping (georeferencing) and processing the images, optionally stitching images of planted crops together, and estimating one or more crop characteristics such as plant number, location, size, emergence, growth stage, and the like from the processed images. In various implementations, these crop characteristics can be used in conjunction with other data gathered, such as data gathered during planting, stored by the user, and/or from other sources. Other implementations of this example include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


In Example 1, a method for evaluating post-emergence crop stands, comprising: collecting images of an area of interest; geo-referencing the collected images; processing the collected images to isolate individual plants within the collected images; and determining at least one plant characteristic for each individual plant within the collected images.


Example 2 relates to the method of Example 1, wherein the at least one plant characteristic is selected from leaf length, orientation, surface area, and color.


Example 3 relates to the method of any of Examples 1-2, wherein processing the collected images comprises adjusting color levels within the collected images to isolate a plant color of interest.


Example 4 relates to the method of any of Examples 1-3, further comprising performing a triangulation routine.


Example 5 relates to the method of any of Examples 1-4, further comprising defining a pixel threshold wherein a triangle with a side with a pixel range below the pixel threshold is considered part of a plant and a triangle with a side with a pixel range above the pixel threshold is considered background.


Example 6 relates to the method of any of Examples 1-5, further comprising defining a threshold distance wherein a triangle with a side with a distance below the threshold distance is considered part of a plant and a triangle with a side with a distance above the threshold distance is considered background.


Example 7 relates to the method of any of Examples 1-6, further comprising removing background from the collected images.


Example 8 relates to the method of any of Examples 1-7, further comprising identifying a central point of each triangle comprised of plant pixels; connecting each of the central points; and identifying a junction of two or more central points to locate a stem of the plant.


Example 9 relates to the method of any of Examples 1-8, further comprising counting each plant within the collected images to determine a stand count.


Example 10 relates to the method of any of Examples 1-9, further comprising comparing leaf length and surface area.


Example 11 relates to the method of any of Examples 1-10, further comprising merging collected images with geo-referenced as-planted data.


Example 12 relates to the method of any of Examples 1-11, further comprising merging collected images with geo-referenced harvest data.


Example 13 relates to the method of any of Examples 1-12, further comprising identifying volunteer plants and weeds.


Example 14 relates to the method of any of Examples 1-13, further comprising identifying late emerged plants.


Example 15 relates to the method of any of Examples 1-14, further comprising: collecting imagery of the area of interest at a second point in time; and overlaying the collected imagery from the second point in time with the collected imagery.


In Example 16, a method for evaluating post emergence crops, comprising: capturing images of an area of interest at a first point in time; geo-referencing the images with as-planted data; processing the images to isolate individual plants within the images; modeling the surface of the individual plants; generating at least one crop characteristic for each individual plant; identifying a location for each individual plant; and comparing the locations with as-planted data of expected plant locations.


Example 17 relates to the method of Example 16, further comprising generating a point map of each plant location.


Example 18 relates to the method of any of Examples 16-17, further comprising categorizing a plant growth stage for each individual plant and mapping the plant growth stage on the point map.


Example 19 relates to the method of any of Examples 16-18, further comprising identifying plants more than a threshold distance off a centerline for each crop row.


In Example 20, a method for comparing as-planted data to post-emergence data, comprising: acquiring as-planted data from a field computing device; identifying planter rows for each implement pass; creating a point map of individual plant locations from collected images; assigning plants to a planter row and implement pass; determining any plants that are more than a threshold distance from a centerline of a planter row; and generating a distance between adjacent plants.


While multiple embodiments are disclosed, still other embodiments of the disclosure will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. As will be realized, the disclosure is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the disclosure. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an exemplary flow chart showing a system process, according to one implementation.



FIG. 2 is an exemplary stitched or single overhead image, according to one implementation.



FIG. 3 is the image from FIG. 2 adjusted to increase contrast between plant pixels and background pixels, according to one implementation.



FIG. 4 is the image from FIGS. 2-3 overlayed with triangulation, according to one implementation.



FIG. 5 is the image from FIGS. 2-4 with the background pixels removed, according to one implementation.



FIG. 6 is the image from FIGS. 2-5 with lines connecting the central point of each of the triangles of plant pixels, according to one implementation.



FIG. 7 is the image from FIGS. 2-6 with the central portion or stem of each plant identified, according to one implementation.



FIG. 8 is the image from FIG. 2 overlayed with the data from FIG. 7 showing the central portion or stem of each plant on the unprocessed/stitched imagery, according to one implementation.



FIG. 9 is an exemplary dashboard showing categorization of measured plants, according to one implementation.



FIG. 10 is an exemplary image of a field showing plant locations, according to one implementation.



FIG. 11 is a processed overhead image showing a volunteer plant between the planted rows, according to one implementation.



FIG. 12 is a processed image showing plant locations and areas with a thin stand due to implement/vehicle traffic, according to one implementation.



FIG. 13 is an exemplary dashboard showing plant data, according to one implementation.



FIG. 14 is a flow chart showing a system process, according to one implementation.



FIG. 15 is a schematic depiction of as-planted data, according to one implementation.



FIG. 16 is a depiction of plant locations with weeds/volunteer crops, according to one implementation.



FIG. 17 shows the data from FIG. 16 overlayed with the as-planted data of FIG. 15, according to one implementation.





DETAILED DESCRIPTION

The disclosed systems, apparatus, and methods relate to the evaluation of crop emergence and other characteristics via the processing of imagery, optionally high-resolution imagery. With the use of imagery, the disclosed systems, apparatus, and methods are able to identify individual plants in a field, as well as various characteristics of those plants. The various characteristics of the plants can include leaf area, leaf length, and/or the number and placement of plants within a research plot, subsection of a field, or a whole field. This plant characteristic data can be merged with other agricultural map layers such as planting data, derived analysis/machine learning data/layers, harvest stand counts, and other related data as would be understood. By combining various data points and imagery, the systems, apparatus, and methods disclosed herein can support various agricultural analyses, such as better understanding the relationship of the planted seeds to the emerged plants to the harvested plants. In various implementations, the imagery and/or other data is gathered in the early season, for example just after plant emergence.


The combined data and information from the system, apparatus, and methods described herein can be used to improve planting practices, improve seed genetics, and identify yield potential during the growing season based on measurements of a single or multiple geo-referenced image(s).


In conjunction with the described implementations, it is understood that a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.


Certain of the disclosed implementations can be used in conjunction with any of the devices, systems or methods taught or otherwise disclosed in U.S. Pat. No. 10,684,305 issued Jun. 16, 2020, entitled “Apparatus, Systems and Methods for Cross Track Error Calculation From Active Sensors,” U.S. patent application Ser. No. 16/121,065, filed Sep. 4, 2018, entitled “Planter Down Pressure and Uplift Devices, Systems, and Associated Methods,” U.S. Pat. No. 10,743,460, issued Aug. 18, 2020, entitled “Controlled Air Pulse Metering apparatus for an Agricultural Planter and Related Systems and Methods,” U.S. Pat. No. 11,277,961, issued Mar. 22, 2022, entitled “Seed Spacing Device for an Agricultural Planter and Related Systems and Methods,” U.S. patent application Ser. No. 16/142,522, filed Sep. 26, 2018, entitled “Planter Downforce and Uplift Monitoring and Control Feedback Devices, Systems and Associated Methods,” U.S. Pat. No. 11,064,653, issued Jul. 20, 2021, entitled “Agricultural Systems Having Stalk Sensors and/or Data Visualization Systems and Related Devices and Methods,” U.S. Pat. No. 11,297,768, issued Apr. 12, 2022, entitled “Vision Based Stalk Sensors and Associated Systems and Methods,” U.S. patent application Ser. No. 17/013,037, filed Sep. 4, 2020, entitled “Apparatus, Systems and Methods for Stalk Sensing,” U.S. patent application Ser. No. 17/226,002 filed Apr. 8, 2021, and entitled “Apparatus, Systems and Methods for Stalk Sensing,” U.S. Pat. No. 10,813,281, issued Oct. 27, 2020, entitled “Apparatus, Systems, and Methods for Applying Fluid,” U.S. patent application Ser. No. 16/371,815, filed Apr. 1, 2019, entitled “Devices, Systems, and Methods for Seed Trench Protection,” U.S. patent application Ser. No. 16/523,343, filed Jul. 26, 2019, entitled “Closing Wheel Downforce Adjustment Devices, Systems, and Methods,” U.S. patent application Ser. No. 16/670,692, filed Oct. 31, 2019, entitled “Soil Sensing Control Devices, Systems, and Associated Methods,” U.S. patent application Ser. No. 16/684,877, filed Nov. 15, 2019, entitled “On-The-Go Organic Matter Sensor and Associated Systems and Methods,” U.S. Pat. No. 11,523,554, issued Dec. 13, 2022, entitled “Dual Seed Meter and Related Systems and Methods,” U.S. patent application Ser. No. 16/891,812, filed Jun. 3, 2020, entitled “Apparatus, Systems and Methods for Row Cleaner Depth Adjustment On-The-Go,” U.S. patent application Ser. No. 16/918,300, filed Jul. 1, 2020, entitled “Apparatus, Systems, and Methods for Eliminating Cross-Track Error,” U.S. patent application Ser. No. 16/921,828, filed Jul. 6, 2020, entitled “Apparatus, Systems and Methods for Automatic Steering Guidance and Visualization of Guidance Paths,” U.S. patent application Ser. No. 16/939,785, filed Jul. 27, 2020, entitled “Apparatus, Systems and Methods for Automated Navigation of Agricultural Equipment,” U.S. patent application Ser. No. 16/997,361, filed Aug. 19, 2020, entitled “Apparatus, Systems and Methods for Steerable Toolbars,” U.S. patent application Ser. No. 16/997,040, filed Aug. 19, 2020, entitled “Adjustable Seed Meter and Related Systems and Methods,” U.S. patent application Ser. No. 17/011,737, filed Sep. 3, 2020, entitled “Planter Row Unit and Associated Systems and Methods,” U.S. patent application Ser. No. 17/060,844, filed Oct. 1, 2020, entitled “Agricultural Vacuum and Electrical Generator Devices, Systems, and Methods,” U.S. patent application Ser. No. 17/105,437, filed Nov. 
25, 2020, entitled “Devices, Systems and Methods For Seed Trench Monitoring and Closing,” U.S. patent application Ser. No. 17/127,812, filed Dec. 18, 2020, entitled “Seed Meter Controller and Associated Devices, Systems and Methods,” U.S. patent application Ser. No. 17/132,152, filed Dec. 23, 2020, entitled “Use of Aerial Imagery For Vehicle Path Guidance and Associated Devices, Systems, and Methods,” U.S. patent application Ser. No. 17/164,213, filed Feb. 1, 2021, entitled “Row Unit Arm Sensor and Associated Systems and Methods,” U.S. patent application Ser. No. 17/170,752, filed Feb. 8, 2021, entitled “Planter Obstruction Monitoring and Associated Devices and Methods,” U.S. patent application Ser. No. 17/225,586, filed Apr. 8, 2021, entitled “Devices, Systems, and Methods for Corn Headers,” U.S. patent application Ser. No. 17/225,740, filed Apr. 8, 2021, entitled “Devices, Systems, and Methods for Sensing the Cross Sectional Area of Stalks,” U.S. patent application Ser. No. 17/323,649, filed May 18, 2021, entitled “Assisted Steering Apparatus and Associated Systems and Methods,” U.S. patent application Ser. No. 17/369,876, filed Jul. 7, 2021, entitled “Apparatus, Systems, and Methods for Grain Cart-Grain Truck Alignment and Control Using GNSS and/or Distance Sensors,” U.S. patent application Ser. No. 17/381,900, filed Jul. 21, 2021, entitled “Visual Boundary Segmentations and Obstacle Mapping for Agricultural Vehicles,” U.S. patent application Ser. No. 17/461,839, filed Aug. 30, 2021, entitled “Automated Agricultural Implement Orientation Adjustment System and Related Devices and Methods,” U.S. patent application Ser. No. 17/468,535, filed Sep. 7, 2021, entitled “Apparatus, Systems, and Methods for Row-by-Row Control of a Harvester,” U.S. patent application Ser. No. 17/526,947, filed Nov. 15, 2021, entitled “Agricultural High Speed Row Unit,” U.S. patent application Ser. No. 17/566,678, filed Dec. 20, 2021, entitled “Devices, Systems, and Method For Seed Delivery Control,” U.S. patent application Ser. No. 17/576,463, filed Jan. 14, 2022, entitled “Apparatus, Systems, and Methods for Row Crop Headers,” U.S. patent application Ser. No. 17/724,120, filed Apr. 19, 2022, entitled “Automatic Steering Systems and Methods,” U.S. patent application Ser. No. 17/742,373, filed May 11, 2022, entitled “Calibration Adjustment for Automatic Steering Systems,” U.S. patent application Ser. No. 17/902,366, filed Sep. 2, 2022, entitled “Tile Installation System with Force Sensor and Related Devices and Methods,” U.S. patent application Ser. No. 17/939,779, filed Sep. 7, 2022, entitled “Row-by-Row Estimation System and Related Devices and Methods,” U.S. patent application Ser. No. 18/081,432, filed Dec. 14, 2022, entitled “Seed Tube Guard and Associated Systems and Methods of Use,” U.S. patent application Ser. No. 18/087,413, filed Dec. 22, 2022, entitled “Data Visualization and Analysis for Harvest Stand Counter and Related Systems and Methods,” U.S. patent application Ser. No. 18/097,801, filed Jan. 17, 2023, entitled “Agricultural Mapping and Related Systems and Methods,” U.S. patent application Ser. No. 18/101,394, filed Jan. 25, 2023, entitled “Seed Meter with Integral Mounting Method for Row Crop Planter and Associated Systems and Methods,” U.S. patent application Ser. No. 18/102,022, filed Jan. 26, 2023, entitled “Load Cell Backing Plate and Associated Devices, Systems, and Methods,” U.S. patent application Ser. No. 18/116,714, filed Mar. 
2, 2023, entitled “Cross Track Error Sensor and Related Devices, Systems, and Methods,” U.S. patent application Ser. No. 18/203,206, filed May 27, 2022, entitled “Seed Tube Camera and Related Devices, Systems and Methods,” U.S. Patent Application 63/357,082, filed Jun. 30, 2022, entitled “Seed Tube Guard,” U.S. Patent Application 63/357,284, filed Jun. 30, 2022, entitled “Grain Cart Bin Level Sharing,” U.S. Patent Application 63/394,843, filed Aug. 3, 2022, entitled “Hydraulic Cylinder Position Control for Lifting and Lowering Towed Implements,” U.S. Patent Application 63/395,061, filed Aug. 4, 2022, entitled “Seed Placement in Furrow,” U.S. Patent Application 63/400,943, filed August 2022, entitled “Combine Yield Monitor,” U.S. Patent Application 63/406,151, filed Sep. 13, 2022, entitled “Hopper Lid with Magnet Retention and Related Systems and Methods,” U.S. Patent Application 63/427,028, filed Nov. 21, 2022, entitled “Stalk Sensors and Associated Devices, Systems and Methods,” U.S. Patent Application 63/445,960, filed Feb. 15, 2023, entitled “Ear Shelling Detection and Related Devices, Systems, and Methods,” U.S. Patent Application 63/445,550, filed Feb. 14, 2023, entitled “Liquid Flow Meter and Flow Balancer,” U.S. Patent Application 63/466,144, filed May 12, 2023, entitled “Devices, Systems, and Methods for Providing Yield Maps,” and U.S. Patent Application 63/466,560, filed May 15, 2023, entitled “Devices, Systems, and Methods for Agricultural Guidance and Navigation,” each of which is incorporated by reference herein.


Turning to the figures in more detail, FIG. 1 shows an exemplary workflow/steps for crop identification. In a first optional step, the system 100 captures images of growing crops (box 102). As would be appreciated, when crops are small (such as <10″ tall) there is enough leaf material above ground that an image taken above the ground, such as by drone, aircraft, handheld camera, autonomous vehicle, vehicle mounted platform, or the like, can be processed to identify individual plants within the image. Once plants are larger, it may be more difficult to identify individual plants within an image because the leaves/plant bodies are large enough to begin to overlap.


In various implementations, a single image may show around 2-4 acres, but as will be described further herein, multiple images can optionally be put together (stitched) to form a larger map/image. The area depicted in an image may vary depending on the type of platform from which the image is acquired. For example, aerial platforms are likely to capture a larger area in a single image when compared to a ground-based platform, which will likely image a smaller area from a closer distance. For images captured from a ground-based platform (handheld or vehicle/implement mounted), the captured area per image can be a linear row or rows, or a small area of interest.


After images are captured, they may optionally be stitched together to form a single scene or mosaic of multiple combined images. One exemplary stitched image 10 is shown in FIG. 2. Optionally, these images are high-resolution images, 0.1 to 2 cm/pixel, and optionally about 0.5 cm/pixel. Such high-resolution imagery may be needed to have sufficient clarity to conduct the analyses described herein.


Turning back to FIG. 1, in a further optional step, the system 100 can then import the imagery into a mapping program (such as Agfiniti® or SMS Advanced® from Ag Leader®) to do additional diagnostics of the imagery. For example, the imagery may be geo-referenced (box 104) or otherwise integrated with other data from prior field activities (box 104). In certain implementations, the imagery is imported or otherwise processed to align the imagery with a location in the field where the planter or other machinery has operated.


In various further implementations, the imagery can optionally be manipulated/modified in various ways to improve/enhance processing (box 106). In one exemplary implementation, the imagery is modified to improve the contrast between the remainder of the field (bare soil, previous crop residue, tire tracks, etc.) and the growing plant, shown for example in FIG. 3, where the black and white image highlights contrast between the soil/debris 2 and the plant of interest 1. In this example, the imagery was modified by converting the image (such as the image 10 shown in FIG. 2) by multiplying a scale factor by the result of subtracting the red band of the image from the green band. This resulted in the rest of the color of the image being removed, and the white silhouette of the plant 1 remaining.
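
As one illustration of this band arithmetic, the following is a minimal sketch, assuming an RGB image held as a NumPy array; the scale factor of 3 is a hypothetical starting value, not one specified by this disclosure.

```python
import numpy as np

def green_red_contrast(rgb, scale=3.0):
    """Scale the green-minus-red band difference so plant pixels stand
    out as bright (near-white) silhouettes against the soil.

    rgb: HxWx3 uint8 array in (R, G, B) band order; scale is illustrative.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    # Plants are strongly green; soil/residue have G roughly equal to R,
    # so the difference is near zero for background and large for plants.
    diff = scale * (g - r)
    return np.clip(diff, 0, 255).astype(np.uint8)
```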


In another example, the imagery can be modified to improve contrast via certain image processing techniques, such as via the use of the normalized difference vegetation index (NDVI) with an NIR-equipped camera or similar. In these implementations, using the formula (NIR − R)/(NIR + R) will isolate the green plant material for use as described herein. Various alternative methods and processes for enhancing contrast are possible and would be recognized by those of skill in the art.
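
A corresponding sketch for the NDVI formula above, assuming the NIR and red bands are available as separate arrays; the 0.3 threshold in the usage comment is a hypothetical value for illustration only.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)

# Usage sketch: pixels above a chosen threshold are treated as plant material.
# plant_mask = ndvi(nir_band, red_band) > 0.3
```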


In certain alternate implementations, the target plant/fruit color may not be green, and as such, different processing to isolate the target crop's color for enhancing contrast may be needed. If the object that the user is trying to isolate is a different color than green (e.g., orange pumpkins in a pumpkin patch), the system 100 or user is able to define and use a different formula to allow the desired color to be isolated from the non-desired or non-target color(s). These changes to the formulae would be recognized by those of skill in the art.


Other image improvements for isolating the plant in the image from the rest of the non-plant items can optionally include adjusting the hue, saturation, and/or luminance to create a greater difference between the plant 1 and non-plant 2 areas. In several of these approaches, the brightness and/or scale of the image can be adjusted for better results, making target plants 1 more pronounced and the edges of the plant 1 and/or plant leaves easier to identify.


Other methods to identify the plant include machine learning, where the shape and color of the plant are identified by comparison to sample images of the same type of plant. For example, a machine learning algorithm can match the shape and orientation of the leaves to another image that was used as a sample in training the program. Various alternative methods and processes for the application of machine learning are possible and would be recognized by those of skill in the art.


In various implementations, when the individual plants 1 are identifiable by the system 100, with or without modification/processing (box 106), the system 100 can analyze individual plants 1, such as by modeling the surface (box 108). In one optional step, the entire image 10 may be covered with a triangulation routine such as a Delaunay triangulation. Triangulation is performed between the pixels identified as plants, that is, pixels of the target color (green) that after processing may be white or another contrasting color. Lines connecting leaf edges on the same plant 1 will be shorter than lines connecting different plants 1 (and encompassing the background 2), as can be seen in FIG. 4.


Utilizing the pixels of the image 10, triangles with more than a threshold number of pixels between corners are removed. For example, if there are more than a threshold number of pixels (e.g., >6) between two corners of a triangle, the triangle will be removed from the image because it is identified as a non-plant/background 2 area. The shorter lines that have fewer than the threshold number of pixels (e.g., 6 pixels) remain in the image and are identified as those that cover the plant 1 leaf; these are the areas of the plant 1 visible in the picture/image 10.
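
A minimal sketch of this triangulate-then-filter step, using SciPy's Delaunay routine over plant-pixel coordinates; the 6-pixel threshold is the example value from the text, and extracting coordinates straight from the mask (in practice likely subsampled for speed) is an assumption.

```python
import numpy as np
from scipy.spatial import Delaunay

def plant_triangulation(plant_mask, max_edge_px=6.0):
    """Triangulate plant-pixel coordinates and keep only triangles whose
    longest side is within the pixel threshold; triangles with longer
    sides span background between plants/leaves and are discarded."""
    # (row, col) coordinates of pixels flagged as plant material
    points = np.column_stack(np.nonzero(plant_mask)).astype(np.float64)
    tri = Delaunay(points)
    kept = []
    for simplex in tri.simplices:
        p = points[simplex]
        edges = [np.linalg.norm(p[i] - p[(i + 1) % 3]) for i in range(3)]
        if max(edges) <= max_edge_px:
            kept.append(simplex)
    return points, np.array(kept)
```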



FIG. 5 shows an exemplary image 10 where the non-plant/background 2 pixels (those triangles with a side greater than the threshold size) are removed, leaving only the plant 1 pixels (those triangles with sides smaller than the threshold size). Using this approach, the edge of a plant 1 is digitized, and the system 100 can identify the difference between soil and residue (background 2) and living plants 1.


In various alternative implementations, the system 100 or user identifies a measured distance between the corners of the triangle, and that distance is used instead of or in addition to the number of pixels between corners of the triangles to remove the space between plants or between leaves of the same plant.


Turning back to FIG. 1, in a further optional step, the system 100 can quantify various characteristics of the target crops (box 110). For example, to estimate the length of a leaf or several leaves, the center of each of the generated triangles is identified and a line is connected through all the centers of the triangles, shown for example in FIG. 6. By connecting the center points of each triangle, the system 100 can estimate the length of an individual leaf or of several leaves of a plant 1 that may go in different directions from the stem 3. This allows the system 100 to use the estimated length of the leaves as a filter for other processes, as will be described further herein and as would be appreciated.
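
One way to sketch this center-connecting step: take the centroid of each retained triangle and sum the distances along a chain of centroids. How the centroids are ordered into a chain (e.g., by walking edge-adjacent triangles along a leaf) is an assumption here, not a detail given by the text.

```python
import numpy as np

def triangle_centroids(points, simplices):
    """Centroid of each retained plant triangle (mean of its 3 corners)."""
    return points[simplices].mean(axis=1)

def chain_length(centroids, order):
    """Estimated leaf length: summed distance along centroids taken in
    the given order (indices tracing one leaf)."""
    pts = centroids[np.asarray(order)]
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
```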


In various implementations, the system 100 can estimate the location of the plant stem 3 as the location where several of the connecting center lines come together and form a “Y”, or junction (box 112), as seen in FIG. 7. In various implementations, the system 100 uses the stem 3 locations to identify individual plants 1 and create a spatial location and count for each plant 1. At this location, the system 100 can optionally place a spatial dot 3 at the junction to define where the plant 1 is relative to other plants 1.
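
A sketch of locating the “Y” junctions: connect the centroids of triangles that share an edge to form a skeleton graph, then flag nodes where three or more branches meet. The adjacency construction is an assumption consistent with the description above, not the disclosure's stated algorithm.

```python
from collections import defaultdict

def centroid_adjacency(simplices):
    """Adjacency between triangles that share an edge, i.e. between
    neighboring centroids on the leaf skeleton."""
    edge_owners = defaultdict(list)
    for t, simplex in enumerate(simplices):
        for i in range(3):
            edge = tuple(sorted((simplex[i], simplex[(i + 1) % 3])))
            edge_owners[edge].append(t)
    adj = defaultdict(set)
    for owners in edge_owners.values():
        if len(owners) == 2:  # two triangles share this edge
            a, b = owners
            adj[a].add(b)
            adj[b].add(a)
    return adj

def junction_triangles(adj):
    """Stem candidates: skeleton nodes where 3+ branches meet (a 'Y')."""
    return [t for t, nbrs in adj.items() if len(nbrs) >= 3]
```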


As would be understood, the spacing between plants, and where the leaves of neighboring plants touch each other, can impact the ability to identify an individual plant. With imperfect spacing of plants, which is typical of most planters, the spacing can be variable, where 2 plants may be 3″ apart and the next 2 plants might be 8″ apart. By using a minimum distance filter between estimated stem 3 location points, the system 100 can account for some of this variability. As an example, using a 3″ plant spacing filter, if a junction 3 is more than 3 inches away from the previous plant's 1 junction 3, the system 100 identifies two different plants 1. If the junctions 3 are closer than 3″, the system 100 may assume there is only one plant 1, and that, for example, the orientation of the plant 1 leaves made it appear that there were two plants 1 on top of each other when in fact there was only one plant 1.
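
The 3″ minimum distance filter might look like the following greedy sketch, assuming stem points have already been converted to field units (inches):

```python
import numpy as np

def merge_close_stems(stem_points, min_spacing_in=3.0):
    """Treat stem candidates closer than the spacing filter as one plant:
    keep a point only if it is at least min_spacing_in from every
    previously kept point."""
    kept = []
    for p in np.asarray(stem_points, dtype=float):
        if all(np.linalg.norm(p - q) >= min_spacing_in for q in kept):
            kept.append(p)
    return np.array(kept)
```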


By repeating some of these processes through the image(s) 10, the system 100 can classify each plant 1 and determine the average leaf length, taking an average of the length of the longest two branches off the junction 3 mentioned above. In various implementations, the system 100 can classify the size of the plant 1 and tag each plant 1 location with the leaf length for use by the system 100.


In certain implementations, the system 100 can classify the length of a leaf into a crop growth stage, and/or into a user or system 100 defined range. For example, a leaf length from 1.3″ to 2″ is a ‘V1’ stage and from 2″ to 3″ is a ‘V2’ crop stage, and so on, as would be understood.
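
Using the example ranges just given, a minimal sketch of the stage lookup; the ranges and stage names are only the illustrative values from the text, not calibrated agronomic thresholds.

```python
def growth_stage(avg_leaf_length_in):
    """Map average leaf length (inches) to an example growth stage,
    per the illustrative ranges above."""
    if 1.3 <= avg_leaf_length_in < 2.0:
        return "V1"
    if 2.0 <= avg_leaf_length_in < 3.0:
        return "V2"
    return "other"  # lengths outside the example ranges
```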


In some crops, such as corn, the yield is impacted by how evenly the plants emerge. At an early stage of emergence, the difference in plant size is apparent enough to make a determination of a late emerged plant versus a plant that came up with the majority of the rest of the field. FIG. 8 shows an image with plant junctions 3 layered over the complete imagery 10; in this figure the difference in size between a late emerged plant 1A and an on-time plant 1 is evident. As would be understood, a plant 1A that came up or emerged about 36 hours after the adjacent plant 1 (a late emerged plant 1A) will often not produce a harvestable ear, as it will always be shadowed by the first plant 1 to emerge.



FIG. 9 shows an exemplary dashboard/display 20 image of crop/plant 1 locations coded by their estimated growth stage. The point map 30 displayed on the display 20 of FIG. 9 maps the junctions 3 of the plants 1 to geolocations. In various implementations, the system 100 can review the image selection in one scan looking for larger plants (normal growth stage plants 1), and then in another scan look for smaller plants (late emerged plants 1A). In various implementations, a user can see a map of the large plants 1, a map of the large plants 1 and small plants 1A together, and/or a map of solely small/late emerged plants 1A. In various implementations, the system 100 can then estimate how many plants in the sample are late emerged plants 1A. By knowing the number of late emerged plants 1A, a grower can know the number of plants 1A that will likely not produce a harvestable ear and thus be less profitable compared to all plants 1 emerging on time/in the same defined period, such as about a 24-36 hour period.


In some implementations, the system 100 may gather image data over several days, or more than once separated by about 36-72 hours or more, as would be understood. The time between collections of imagery can vary depending on crop type, weather, and other conditions, as would be understood. The imagery from the first collection and the imagery from a second or subsequent collection can be compared, including the triangulated leaf area from the two collections of imagery, to compare the amount of plant growth between the collection times. In various implementations, the imagery and data therefrom can be visible as multiple layers in a single map, as would be understood. This type of analysis can be helpful in assessing the efficiency of plant growth during a known weather environment, which can be helpful in ranking plant genetics that will grow better under adverse conditions. This is in addition to the ability to compare the normal plant stand and confirm that the plant stand found in a first collection is close to or matches the plant stand in a second collection.
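
The triangulated leaf area compared here can be computed directly from the retained triangles; a sketch using the shoelace formula per triangle, with the growth comparison expressed as a simple relative change:

```python
import numpy as np

def triangulated_leaf_area(points, simplices):
    """Total leaf area as the sum of the retained triangles' areas
    (shoelace/cross-product formula per triangle)."""
    p = points[simplices]               # shape (n_triangles, 3, 2)
    a, b, c = p[:, 0], p[:, 1], p[:, 2]
    cross = ((b[:, 0] - a[:, 0]) * (c[:, 1] - a[:, 1])
             - (b[:, 1] - a[:, 1]) * (c[:, 0] - a[:, 0]))
    return float(np.abs(cross).sum() / 2.0)

def relative_growth(area_first, area_second):
    """Fractional growth in leaf area between two collection times."""
    return (area_second - area_first) / area_first
```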


Turning back to FIG. 1, in another optional step after generating the spatial map of all the points in the viewable area from the imagery, the system 100 can then compare/analyze the imagery based on as-planted information (box 114), such as data that was recorded with a GPS, field computer, and/or other planting system, as would be understood. The alignment of the points from the imagery representing emerged plants to the as-planted data recorded by a field computing device can then quantify the number of plants that came from each row and pass, as well as populations, plant spacing, and other data, as would be understood (box 116). By quantifying this data, a grower can better understand and quantify the quality of emergence by pass or row of the planter. This helps alert a grower if there is a repeatable situation where they need to make repairs or adjustments to the planter or other equipment to do a better job and improve yield in the future (box 118).


In a further example, by comparing to the planted data, conclusions about the quality of seed metering (seed spacing) can be drawn. Erratic spacing can equally damage yield potential even if the plants emerge at the same time: the closer plants are to each other, the more their root systems must compete for water and nutrients during the growing season. As can be seen in FIG. 10, overlapping data layers can show a comparison of rows 4 to each other as well as rows 4 on different passes 5 (each section represents a different pass 5). Further, the system 100 can highlight or otherwise flag plants 1A that are more than a threshold distance outside a row 4, such as 6″ or more. The system 100 can then identify these plants 1A as possible volunteer crops or weeds. FIG. 11 shows imagery 10, with adjusted contrast (such as that described above in relation to FIG. 3), showing a volunteer plant 1A between two planted rows 4.


As noted above, during the process of identifying the plants, a measurement of the leaves can help determine size compared to the neighboring plant. This is important for seeing the number of late emerged plants, as they are more likely to not produce a harvestable ear. By measuring all the plants, and then assigning the length of each leaf to each plant, the system 100 can create a map layer that shows the size of each plant, and can then categorize plants by grouping them into a growth stage to present to a producer. This allows the producer to see at a glance how many plants are behind a growth stage and subject to a risk of not producing a harvestable ear. This would likely trigger the grower to inspect planting equipment for maintenance, upgrade parts, and/or change planting practices based on weather/soil conditions that lead to uneven emergence, to improve the evenness of emergence in future years. In some implementations, a producer may also be able to see how even the plants are after improvements by comparing imagery year over year or between passes where changes were made. Other possibilities include that some plant/seed varieties have different genetics for emergence compared to others, such that the system 100 may allow for comparison or evaluation of genetics/plant characteristics between seed varieties. That is, if no mechanical adjustments were made to the planter, but the plant/seed variety changed, a difference in emergence or other characteristics may be attributed to plant genetics. This data may allow a grower or the system 100 to evaluate and trigger different selections of seed varieties in future years to achieve more even emergence/higher overall yield.


Furthermore, some captured images will show that a plant is between the rows of a planted crop, such as a volunteer crop or weeds, shown in FIG. 11. For example, as would be understood, in fields that are planted to corn after the previous season was also corn, there is a chance that some of the previous crop will land on the ground during harvest and emerge along with the new crop the following growing season. These are often called volunteer plants, which were not intended to be a part of the new crop. If there is enough volunteer crop, it can affect the overall yield of the intended crop and/or attract unwanted pests. By comparing where these plants are, the system 100 can estimate how many additional plants are in the field, how they may impact the season, and optionally whether remediation steps are warranted/necessary. In one example, if too many volunteer plants are in the field, the grower may elect to apply more fertilizer to help ensure that all plants have enough nutrients to generate profit/improve yield. In some implementations, the system 100 can identify the color of the plant to determine if the plant off the planted line is a weed and not a volunteer crop. If excessive weeds are present, a grower may elect to apply additional herbicide. As would be understood, a weed is a competing plant that may impact yield.


As would be understood, geo-referencing images from a drone is not always spatially accurate. Generally, the images are accurate to within a few feet or less of the true position of the objects they capture. With reference information from planting, in-season applications, and/or harvest data that was collected with an accurate GPS receiver, such as a sub-inch RTK receiver, the imagery can be adjusted in any direction to the spatial positions/passes where data was collected (box 104). This allows the option to improve the alignment of the imagery to where the data was collected spatially. Conversely, if the accuracy of the planting or other data is poorer than the image, that data could be adjusted to match the image as well (box 104). Some outside factors that can create some of this error include the slope of the field (a flat image rendered from a curved surface), the angle of a downward looking camera that was not completely vertical due to wind, or other forces that are understood to affect camera orientation.


In some cases, a missing plant or several missing plants in close proximity to each other may imply that field traffic caused some of the stand loss, as seen in FIG. 12. Here, the section 6 of a row 4 has a number of missing plants 1. This may lead to needing to generate an application plan to apply a chemical to these areas to prevent weeds from growing and thus affecting the surrounding crop. In many cases the imagery that is collected will have a resolution of less than 1 cm/pixel. With this accuracy it is often possible to see evidence of a tire track from a machine driving in the field. Generally, the grower will try to drive between crop rows to avoid damage, but it is not guaranteed that they will accomplish this. As such, there is a strong possibility that a vehicle may at some point accidentally drive over the crop that is intended to be harvested. The imagery can be used to see that there is a reduction in crop, as noted by the absence of plants, and to note that where a tire pattern drives into that same area in the image, the cause of those missing plants is field traffic.


With the imagery, and the ability to document the same plant a few days apart, it is also possible to see changes in the color of the leaves. During the early growth period of a plant, it is generally expected that the leaves will become larger and capture more sunlight as the plant continues to grow. If during a first collection pass, and then as confirmed in a second collection pass, there is an area within the image that is not as healthy as in the first image (due to a pale green color, variable coloring of the leaves, spots, lesions, holes, or plants that have died), a grower could assess the percentage of change and take action, if able, to reverse or limit the damage. These actions could include fertilizer applications or pesticide applications to solve an issue that is damaging their crop and improve profitability.


By looking at the amount of area covered by plant leaves, and how much soil area remains visible to the sky, the system 100 and/or a user can assess the plant canopy in the field with a high degree of accuracy. The importance of plant canopy in agriculture is that it reduces the opportunity for weeds to compete with the growing crop. One of the ways this is done is by the desired crop shading the soil, preventing weeds from being able to capture sunlight and compete with the crop. By observing an area multiple times and comparing the growth of individual plants, the system 100 could estimate the time remaining until the crop achieves a certain level or percentage of canopy. Or, by contrast, if the crop never reaches canopy, the system 100 can estimate how much open area still exists.
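
A minimal sketch of the canopy assessment, assuming a boolean plant-pixel mask such as one produced by the contrast steps above:

```python
import numpy as np

def canopy_fraction(plant_mask):
    """Fraction of the imaged area shaded by crop (plant pixels over all
    pixels); the complement is the open soil still visible to the sky."""
    plant_mask = np.asarray(plant_mask, dtype=bool)
    return float(plant_mask.mean())

# Usage sketch:
# open_fraction = 1.0 - canopy_fraction(mask)
```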


The system 100 and methods described herein are able to count plants in a quick and accurate manner over a large area of a production/research/test field to provide information to a grower, as discussed herein. Images and map layers collected early in the season can be compared to other counting methods as well. The emerged stand early in the season is a good indicator, but the final stand is generally considered what will make a good or poor crop. The emergence data from early in the season can also be compared with harvest sensors. This provides further evidence of what was counted early in the growing season and a comparison to what was harvested at the end of the growing season. This comparison can assist in determining at what point in the season an issue was encountered affecting yield.


As noted above, known solutions for evaluating plant emergence and early plant characteristics are time consuming and prone to human error. Using the imagery-based approach described herein allows the grower to quickly look at multiple rows, and multiple segments of the same row, to identify trends and areas of concern, so that the grower can go back to the field to ground truth them or inspect the planter for issues that would contribute to the problems found. FIG. 10 and FIG. 13 show exemplary dashboards 20 displaying data from image collection and processing, including the number of plants per row, population, seed spacing, volunteer plants (shown at the −1 row), and other data. Often, without the disclosed system 100, those mini-trends would go undocumented and unresolved, potentially costing the grower revenue due to yield reduction in future years as well as the current year.


Another advantage of using such an image-based approach is that it provides a repeatable process for counting the same area of the field multiple times if needed, or looking back over time to see if the emergence has changed over that period of time. This is especially true in the early season when plants are emerging at different times and the stand changes, so what was counted as 30,000 seeds/acre in week one may turn into 33,000 seeds/acre in week two. Having an image of the area of interest can validate that there were 3,000 more plants in the same location in the field than the week before. This can also help in times of a dispute between a grower and a provider of the seed/fertilizer/crop treatment when there is a reduced count in a portion of the field compared to a second area of the field without the same treatment.


As noted above, there are a number of known drone systems that will provide an estimate of plant stand for a whole field at coarse resolution. In some cases, these known drone systems will fly an 80-acre field, stopping over 80 spots in the field to capture a single or limited number of images before moving to the next spot. With these prior systems there is a count of the emerged stand at that time, but there is no reference to which rows were captured, what size of plants were present, or whether there were volunteer corn plants or weeds in that same scene. As such, the disclosed method provides more ground truth as to what is present in the field for a harvestable crop. Other methods will show a similar point map, but will have no intelligence as to which row each plant came from, nor will they show the leaf surface area, growth stage, etc.


The disclosed system, apparatus, and methods can give a grower geo-located proof of the stand and surrounding metrics that are not possible to obtain by counting plants manually. Should there be a question at the end of the growing season of how plants emerged, this could serve as dated, photographic, and spatially located evidence of what was in a specific spot.



FIG. 14 shows a further alternative system 100 process. In one optional step, planting data is recorded from a field computing device (box 150). That is, planting data recorded by the planter and/or on-board systems (including seed metering sensors and GPS receivers) is obtained by the system 100, such as by downloading from a cloud server. Alternatively, the system 100 may operate on the same display/computer used to record the planting data.


In a further optional step, individual row level lines from the full implement width passes are identified (box 152). In these and other implementations, the system 100 identifies the individual rows for each planter pass, optionally matching rows planted by a particular row unit between passes.


In another optional step, the system 100 creates a point map of detected plants (box 166), discussed in further detail above. The system 100 can then assign each detected plant to the closest row (box 154). The system 100 may also optionally identify an offset distance from the center of the row for each detected plant (box 156). In some implementations, plants outside a threshold distance from the row may not be assigned a row and instead be labeled as a volunteer plant or weed, as appropriate (box 164).
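
A sketch of this nearest-row assignment with a volunteer/weed threshold; idealizing the row centerlines as straight lines at known cross-row positions is an assumption for illustration (in practice, row lines come from the as-planted data).

```python
import numpy as np

def assign_plants_to_rows(plant_xy, row_y, max_offset_in=6.0):
    """Assign each plant to the nearest row centerline; plants farther
    than the offset threshold stay unassigned (volunteer/weed candidates).

    plant_xy: (n, 2) plant positions in inches; row_y: cross-row
    positions of the idealized centerlines, in the same units.
    """
    row_y = np.asarray(row_y, dtype=float)
    assignments = []
    for x, y in np.asarray(plant_xy, dtype=float):
        offsets = np.abs(row_y - y)
        r = int(np.argmin(offsets))
        # (row index or None, offset from that row's centerline)
        assignments.append((r, offsets[r]) if offsets[r] <= max_offset_in
                           else (None, offsets[r]))
    return assignments
```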


In another optional step, the system 100 can identify the row for each plant (the row unit on the planter that planted the seed) (box 158). Additionally, the system 100 can identify the planter pass for each plant and row. In various implementations, this row and pass data can be used to determine the distance between plants.


In a further optional step, the system 100 can use row-to-row measurements to determine plant-to-plant spacing for plants within the same row (box 160). Additionally or alternatively, the system 100 can determine plant population. Further, optionally, the system 100 can compare different passes and overall performance from the data (including but not limited to planter pass, row spacing, plant spacing, and population) for optional future improvement and/or remediation steps (box 160).


Additionally, the system 100 may automatically or semi-automatically perform various statistical analyses for individual plants, rows, planter passes, test plots, the whole field, and the like, as would be understood (box 162).


In some implementations, the collected imagery data is compared to as-planted information. FIG. 15 shows as-planted passes 5 for a 6-row planter. The arrows (A and B) indicate the direction of travel as the planter 7 traverses the field. As the planter 7 transitions between passes 5, the planter 7 makes an effective u-turn, the result being that row 1 of a first pass is adjacent to row 1 of the second pass. As would be appreciated, this is the pattern during standard operation of a planter 7, but deviations can exist due to field shape and obstacles, creating a random pattern. An as-planted map can detect and record the planting pattern for use in future map layers and analysis.



FIG. 16 shows a point map 30 of plant 1 locations from an image. The point map 30 may be generated as discussed above. As can be seen, plant 1 locations are generally within a row 4, with some deviation from center due to how the plant 1 pushed through the soil surface during emergence and slight variation caused by planting. In the image of FIG. 16 there are ten (10) locations where a plant 1A is detected more than a threshold distance from the center of the row 4 (not aligned with the row 4) and may therefore be a weed or volunteer plant 1A.



FIG. 17 shows the point map 30/plant stand (such as from FIG. 16) overlayed with as-planted data (such as from FIG. 15). In this example image, plants 1 within ±3″ (or another threshold distance) are considered part of the row 4. The plants 1 within the row 4 can be assigned a pass 5 number and a row 4 number, allowing a user to know which plants 1 emerged from which row 4 and which row unit planted each plant. The user and/or system 100 can then compare and analyze the data to determine if a particular pass 5 or row 4 experienced an issue during planting that is affecting overall yield, as would be appreciated.


Plants 1 that are more than a threshold distance from the row 4 can be shown in a contrasting color or with another symbol so they are easily identified visually by a user. The system 100 may also create a data point or log for each plant 1 not assigned to a row 4. These volunteer plants 1A and weeds are important values, as discussed herein. Knowing the distance between rows 4 of sensed plants 1, that distance can then be used to scale the positions of the plants 1 within the row 4 to determine plant 1 spacing and plant 1 populations. If the system 100 knows the rows 4 are 30″ apart, the system 100 can determine a scale for the distance between plants 1 within the same row 4 and find spacing and population, as would be understood.
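
A sketch of that scale-and-population arithmetic, assuming plant positions along a single row have already been projected to inches using the known 30″ row spacing as the scale reference:

```python
import numpy as np

def spacing_and_population(xs_along_row_in, row_spacing_in=30.0):
    """Average plant-to-plant spacing in a row and the implied
    population, using the acre area expressed in square inches."""
    xs = np.sort(np.asarray(xs_along_row_in, dtype=float))
    avg_spacing_in = float(np.diff(xs).mean())
    # One plant occupies roughly (row spacing x plant spacing) of ground;
    # an acre is 43,560 ft^2 = 43,560 * 144 in^2.
    plants_per_acre = 43560.0 * 144.0 / (row_spacing_in * avg_spacing_in)
    return avg_spacing_in, plants_per_acre
```

For example, 30″ rows with an average 6″ in-row spacing work out to about 34,848 plants per acre under this arithmetic.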


Ranges can be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, a further aspect includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms a further aspect. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint. It is also understood that there are a number of values disclosed herein, and that each value is also herein disclosed as “about” that particular value in addition to the value itself. For example, if the value “10” is disclosed, then “about 10” is also disclosed. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.


Although the disclosure has been described with references to various embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of this disclosure.

Claims
  • 1. A method for evaluating post-emergence crop stands, comprising: collecting images of an area of interest; geo-referencing the collected images; processing the collected images to isolate individual plants within the collected images; and determining at least one plant characteristic for each individual plant within the collected images.
  • 2. The method of claim 1, wherein the at least one plant characteristic is selected from leaf length, orientation, surface area, and color.
  • 3. The method of claim 1, wherein processing the collected images comprises adjusting color levels within the collected images to isolate a plant color of interest.
  • 4. The method of claim 1, further comprising performing a triangulation routine.
  • 5. The method of claim 4, further comprising defining a pixel threshold wherein a triangle with a side with a pixel range below the pixel threshold is considered part of a plant and a triangle with a side with a pixel range above the pixel threshold is considered background.
  • 6. The method of claim 4, further comprising defining a threshold distance wherein a triangle with a side with a distance below the threshold distance is considered part of a plant and a triangle with a side with a distance above the threshold distance is considered background.
  • 7. The method of claim 5, further comprising removing background from the collected images.
  • 8. The method of claim 4, further comprising: identifying a central point of each triangle comprised of plant pixels; connecting each of the central points; and identifying a junction of two or more central points to locate a stem of the plant.
  • 9. The method of claim 1, further comprising counting each plant within the collected images to determine a stand count.
  • 10. The method of claim 2, further comprising comparing leaf length and surface area.
  • 11. The method of claim 1, further comprising merging collected images with geo-referenced as-planted data.
  • 12. The method of claim 1, further comprising merging collected images with geo-referenced harvest data.
  • 13. The method of claim 1, further comprising identifying volunteer plants and weeds.
  • 14. The method of claim 1, further comprising identifying late emerged plants.
  • 15. The method of claim 1, further comprising: collecting imagery of the area of interest at a second point in time; and overlaying the collected imagery from the second point in time with the collected imagery.
  • 16. A method for evaluating post emergence crops, comprising: capturing images of an area of interest at a first point in time; geo-referencing the images with as-planted data; processing the images to isolate individual plants within the images; modeling the surface of the individual plants; generating at least one crop characteristic for each individual plant; identifying a location for each individual plant; and comparing the locations with as-planted data of expected plant locations.
  • 17. The method of claim 16, further comprising generating a point map of each plant location.
  • 18. The method of claim 17, further comprising categorizing a plant growth stage for each individual plant and mapping the plant growth stage on the point map.
  • 19. The method of claim 18, further comprising identifying plants more than a threshold distance off a centerline for each crop row.
  • 20. A method for comparing as-planted data to post-emergence data, comprising: acquiring as-planted data from a field computing device; identifying planter rows for each implement pass; creating a point map of individual plant locations from collected images; assigning plants to a planter row and implement pass; determining any plants that are more than a threshold distance from a centerline of a planter row; and generating a distance between adjacent plants.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit under 35 U.S.C. § 119(e) to U.S. Provisional Application 63/351,602, filed Jun. 13, 2022, and entitled APPARATUS, SYSTEMS AND METHODS FOR IMAGE PLANT COUNTING, which is hereby incorporated herein by reference in its entirety for all purposes.

Provisional Applications (1)
Number Date Country
63351602 Jun 2022 US