The present invention relates to systems and methods for operating machines for planting seeds, for example, a crop row planter configured to plant seeds in a row along a field surface. More specifically, the present invention relates to systems and methods for monitoring and evaluating the performance of machines that plant seeds.
In one embodiment, the invention provides a system for automatically capturing visual data of a seed placed by a seed planting machine (e.g., a crop row planter). An electronic controller is configured to receive a signal indicative of a seed being dispensed by the seed planting machine and to trigger a camera to capture an image of the dispensed seed in response to a determination, based on the signal, that the seed has been dispensed by the seed planting machine. In some implementations, the system includes a seed sensor configured to detect a seed moving through a seed tube that dispenses seeds from the seed planting machine. In other implementations, the system is configured to detect a new seed being dispensed by the seed planting machine by analyzing image data captured by a camera.
In some implementations where the system is configured to detect a new seed by analyzing captured image data, the system is configured to capture a sequence of images and to analyze each image to determine whether a new seed is present in the image. When the controller determines that a new seed is present in an image of the sequence of images, it triggers the camera to capture an image of the dispensed seed. In some such implementations, the image captured by the camera in response to the trigger is of a higher resolution than the images of the sequence of images. Similarly, in some implementations, a flash light source is configured to illuminate the field of view of the camera in response to detecting an image in the sequence of images that includes a new seed. Accordingly, the field of view is illuminated by the flash light source when a camera image is captured in response to the trigger, but the field of view is not illuminated by the flash light source while the camera captures the other images of the sequence.
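The two-stage capture described above can be sketched as follows. This is a minimal, illustrative sketch only: the `Camera` and `Flash` interfaces, the frame-differencing heuristic, and the threshold values are assumptions, not an actual device API or the claimed detection algorithm.

```python
# Sketch of low-resolution scanning followed by a flash-illuminated
# high-resolution capture when a new seed is detected in a frame.

def frame_contains_new_seed(prev_frame, frame, threshold=30):
    """Flag a new seed when enough pixels change between consecutive frames.

    Frames are flat lists of pixel intensities; a frame with more than 1% of
    its pixels changed relative to the previous frame is treated as containing
    a newly arrived seed. Both heuristics are illustrative placeholders.
    """
    if prev_frame is None:
        return False
    changed = sum(abs(a - b) > threshold for a, b in zip(prev_frame, frame))
    return changed > len(frame) // 100

def capture_loop(camera, flash, num_frames):
    """Scan low-res frames; on detection, illuminate and grab a high-res image."""
    captures = []
    prev = None
    for _ in range(num_frames):
        frame = camera.grab_low_res()   # flash stays off for these frames
        if frame_contains_new_seed(prev, frame):
            flash.on()                  # illuminate only for the triggered capture
            captures.append(camera.grab_high_res())
            flash.off()
        prev = frame
    return captures
```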
Other aspects of the invention will become apparent by consideration of the detailed description and accompanying drawings.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
For example, the seed sensor may include a light beam emitter and a light sensor positioned facing one another on opposite sides of the seed tube 205 so that, when no seed is present in the seed tube, a light beam emitted by the light beam emitter is received and detected by the light sensor. When a seed passes through the seed tube, the light beam is obstructed and, in response to the temporary absence of the light beam at the light sensor, the seed sensor generates a signal indicating that a seed has passed through the seed tube 205. Although the example of
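The beam-break logic described above amounts to counting falling edges in the light sensor's output. A minimal sketch, assuming a boolean sample stream from the sensor (the sampling interface is hypothetical, not a real sensor API):

```python
def count_seed_events(beam_samples):
    """Count falling edges (beam detected -> beam blocked) in a sample stream.

    beam_samples: iterable of booleans, True when the light sensor detects the
    emitter's beam. Each transition from detected to blocked is treated as one
    seed passing through the seed tube.
    """
    events = 0
    beam_was_present = True  # assume an unobstructed tube at start
    for beam_present in beam_samples:
        if beam_was_present and not beam_present:
            events += 1  # beam just became obstructed: a seed is passing
        beam_was_present = beam_present
    return events
```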
In the example of
In some implementations, the controller 401 is configured to cause the camera 409 to capture one or more images of seeds deposited in the trench by a row seeder. Images captured by the camera 409 in this way may be stored to memory as evidence confirming that seeds have been deposited in the trench. The stored image data can then be analyzed later in order to evaluate the manner and quality of seeding provided by the row crop planter 100. This information can, in turn, be used to evaluate and adjust settings of the row crop planter 100 and to evaluate the performance of the operator of the row crop planter 100. Additionally, in some implementations, image data captured by the camera 409 can be processed and/or displayed to the operator in real-time (or near-real-time, or “on-demand”) to monitor and evaluate the planting process while planting is underway. This information might be used by the operator, for example, to make adjustments to improve the planting operation or to detect system failures before completing the planting process. Furthermore, in some implementations, the controller 401 is configured to automatically adjust one or more operating settings of the row crop planter (e.g., a cutting depth of the “opening disc,” a speed at which seeds are ejected through the seed tube, or the speed of the tractor pulling the row crop planter 100) in response to an analysis of the images captured by the camera 409.
In some implementations, the camera 409 is provided and operated as a “still” image camera configured to capture individual still images. In other implementations, the camera 409 may be provided and/or operated as a video camera and configured to capture a sequence of image frames in response to the trigger from the controller 401. Furthermore, although the example of
Finally, although activation of the flash and the capture of the image(s) is triggered by the output of a seed sensor in the example of
In the example of
In some implementations, the controller 401 and the camera 409 are configured to capture images of the trench at a relatively low resolution until a seed is detected and the flash is activated, and to then capture images at a higher resolution while the trench is illuminated by the flash. Also, although the system of
Furthermore, in the example of
In the example of
In these and other implementations, the electronic controller may be configured to determine when a seed will be dispensed into the trench based on the actuation signals and/or the timing schedule for the controllable seed dispensing actuator. Accordingly, instead of receiving a signal from an external system indicative of a detected presence of the seed (e.g., an output from a seed sensor or an image of the trench captured by the camera), the system may be configured to trigger the camera to capture an image based on the actuation signal and/or timing schedule for the controllable seed dispensing actuator. For example, the electronic controller may be configured to generate an actuation signal instructing the controllable seed dispensing actuator to eject a seed, to then wait for a defined delay period to allow enough time for the ejected seed to reach the trench, and to then trigger the camera to capture an image after expiration of the delay period.
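The actuation-plus-delay trigger described above can be sketched in a few lines. The actuator and camera objects and the delay value are assumptions for illustration; an actual implementation would derive the delay from the seed's fall time through the seed tube.

```python
import time

def dispense_and_capture(actuator, camera, fall_delay_s=0.05):
    """Eject a seed, wait for it to reach the trench, then photograph it."""
    actuator.eject_seed()      # actuation signal to the seed dispensing actuator
    time.sleep(fall_delay_s)   # allow the ejected seed time to reach the trench
    return camera.capture()    # trigger the capture after the delay expires
```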
As mentioned above, once images of the seeds are captured, they can be stored to the memory 405 for later review/analysis or to establish a record of evidence of the amount and locations of seeds planted in a particular field. However, in some implementations, the system may be configured to display image data to an operator of the system during the planting process.
In addition to or instead of displaying image data with superimposed stationary “scales,” in some implementations, the system may be configured to analyze captured image data to make a more specific determination of particular planting variables. For example, as illustrated in the example of
As discussed above, captured image data can be analyzed to determine a final position and/or variation of seed placement in the trench. However, the captured image data might also be used to determine other characteristics of seed behavior to better understand the cause of variations in seed placement. For example, the captured image data may be analyzed by the controller to determine whether the seed is impacting a sidewall of the trench as it is dispensed instead of directly impacting the bottom of the trench. Based on this analysis, the system determines whether an adjustment to the seed dispensing mechanism (e.g., the position of the seed tube) may be necessary to ensure that seeds are dispensed directly to the bottom of the trench. In some implementations, the captured image data is also analyzed to determine whether dispensed seeds move (e.g., “tumble”) along the trench after they are dispensed instead of coming to rest at a location of initial impact in the trench. By detecting “tumbling” seeds, the system may be configured to determine whether an adjustment to the speed of the planter (e.g., the speed at which the tractor pulling the planter is moving) and/or the speed at which seeds are ejected from the planter are necessary to ensure appropriate and consistent seed placement. In some implementations, the system may be configured to capture a series of images each time an individual seed is dispensed so that movement of the dispensed seed can be monitored and analyzed by the system. Also, in some implementations, the system may be configured to automatically make adjustments to the operation, configuration, or position of the planter based on the analysis of the captured image data.
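The “tumbling” check described above can be sketched as a comparison of the seed's position across the burst of captured frames. Extracting the seed's position from each image is assumed to happen elsewhere; the tolerance value is illustrative, not a specified parameter.

```python
def seed_tumbled(positions_cm, tolerance_cm=0.5):
    """Return True if the seed's rest point drifted from its impact point.

    positions_cm: per-frame positions of the seed along the trench, starting
    at the frame of initial impact. A final position that differs from the
    initial impact position by more than the tolerance indicates the seed
    moved ("tumbled") along the trench instead of coming to rest on impact.
    """
    if len(positions_cm) < 2:
        return False
    drift = abs(positions_cm[-1] - positions_cm[0])
    return drift > tolerance_cm
```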
The examples described above in reference to
Furthermore, because some implementations are configured to capture images of individual seeds deposited in each of a plurality of different trenches, the system may be configured with various mechanisms for displaying the captured seed image data to an operator of the system. For example, in some implementations, the system is configured to display images of seeds deposited by each row seeder unit in sequence as a “flip-book” or an “animation” to show variations in seeding. For example, a system that includes 20 row seeder units may be configured to display images in order from the first row seeder unit to the last and to then repeat the display process.
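The “flip-book” ordering described above reduces to cycling through the row seeder units first-to-last and repeating. A trivial sketch (the generator interface is an illustrative choice, not part of the described system):

```python
def flipbook_order(num_units, cycles=2):
    """Yield row seeder unit numbers first-to-last, repeating per cycle."""
    for _ in range(cycles):
        for unit in range(1, num_units + 1):
            yield unit  # display the latest seed image for this unit
```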
Alternatively or additionally, in some implementations, the system may be configured to detect when a seed placement (e.g., the trench depth, seed spacing, etc.) for a particular trench/row seeder unit does not meet certain prescribed criteria or exceeds a variation threshold as compared to seeds in other trenches. In such implementations, the system might be configured to automatically display images of seeds corresponding to that identified trench/row seeder unit that does not meet the prescribed criteria and may require adjustment.
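The variation-threshold check described above can be sketched as comparing each row seeder unit's measurement against the fleet-wide median. The use of seed depth as the measured variable, the median as the reference, and the threshold value are all illustrative assumptions.

```python
from statistics import median

def flag_outlier_units(depths_by_unit, variation_threshold_cm=1.0):
    """Return ids of units whose seed depth deviates too far from the median.

    depths_by_unit: mapping of row seeder unit id -> measured seed depth (cm).
    """
    typical = median(depths_by_unit.values())
    return sorted(
        unit for unit, depth in depths_by_unit.items()
        if abs(depth - typical) > variation_threshold_cm
    )
```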
As yet another display feature in addition to or instead of the display mechanisms described above, the system may be configured to provide a user interface in which the operator can select one or more specific row seeder units to monitor on the display. This may include, for example, displaying all of the camera images at the same time (e.g., in a grid layout), receiving a selection from the operator (e.g., via a touchscreen interface) of one or more particular images, and subsequently displaying camera images corresponding to the images that were selected by the operator.
Finally, as discussed above, in some implementations, the system may be configured to adjust or regulate the operation of the row crop planter based on captured image data. For example, a row seeder unit, in some implementations, may be equipped with an actuator designed to controllably raise and lower the opening disc and, thereby, control the depth of the trench. The controller may be configured to determine an average seed depth in a particular trench based on the captured image data and, in response, operate the actuator to achieve/approach a target seed depth. In various implementations, the controller may be configured to adjust other actuators in addition to or instead of an opening disc height actuator in response to an analysis of the captured image data of deposited seeds. Such actuators may include, but are not limited to, opening disc angle actuators configured to adjust a width of a trench and a seed dispensing actuator configured to control the speed and/or frequency at which seeds are ejected through the seed tube.
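The depth-regulation behavior described above can be sketched as a simple proportional adjustment, under assumed actuator semantics: the controller compares the average measured seed depth against a target and moves the opening-disc actuator accordingly. The actuator interface and the gain are illustrative, not a specified control law.

```python
def adjust_opening_disc(measured_depths_cm, target_depth_cm, actuator, gain=0.5):
    """Move the opening-disc actuator proportionally to the depth error."""
    average = sum(measured_depths_cm) / len(measured_depths_cm)
    error = target_depth_cm - average   # positive: trench too shallow
    actuator.move_by(gain * error)      # lower the disc to deepen the trench
    return average
```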
Thus, the invention provides, among other things, systems and methods for automatically capturing visual data indicative of seeds deposited by a planting system and for providing information regarding planting quality based on the captured image data. Various features and advantages of the invention are set forth in the following claims.
This application is a continuation of U.S. patent application Ser. No. 16/280,814, filed Feb. 20, 2019, entitled “SYSTEM AND METHOD FOR VISUAL CONFIRMATION OF PLANTER PERFORMANCE,” the entire contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5956255 | Flamme | Sep 1999 | A |
9030549 | Redden | May 2015 | B2 |
9226442 | Grimm et al. | Jan 2016 | B2 |
9779330 | Wellington et al. | Oct 2017 | B2 |
10172285 | Sierra et al. | Jan 2019 | B2 |
10561059 | Levy et al. | Feb 2020 | B2 |
10681861 | Morgan et al. | Jun 2020 | B2 |
10681862 | Stoller et al. | Jun 2020 | B2 |
11381785 | Mentzer | Jul 2022 | B2 |
20040231575 | Wilkerson et al. | Nov 2004 | A1 |
20070266917 | Riewerts et al. | Nov 2007 | A1 |
20110098851 | Glendenning et al. | Apr 2011 | A1 |
20130128257 | Stettner et al. | May 2013 | A1 |
20160029547 | Casper et al. | Feb 2016 | A1 |
20160057923 | Sauder et al. | Mar 2016 | A1 |
20160345847 | Gu et al. | Dec 2016 | A1 |
20160361949 | Cavender-Bares et al. | Dec 2016 | A1 |
20170049044 | Stoller et al. | Feb 2017 | A1 |
20180114305 | Strnad et al. | Apr 2018 | A1 |
20180125002 | Stoller et al. | May 2018 | A1 |
20190232313 | Grimm et al. | Aug 2019 | A1 |
20190313575 | Stoller et al. | Oct 2019 | A1 |
20190320579 | Stoller et al. | Oct 2019 | A1 |
20200296885 | Stoller et al. | Sep 2020 | A1 |
20200352086 | Stoller et al. | Nov 2020 | A1 |
20200375090 | Morgan et al. | Dec 2020 | A1 |
20200390026 | Walter et al. | Dec 2020 | A1 |
Number | Date | Country |
---|---|---|
107278434 | Oct 2017 | CN |
107750550 | Mar 2018 | CN |
102015103379 | Sep 2016 | DE |
2080430 | Jul 2009 | EP |
2949194 | Dec 2015 | EP |
20160205422 | Dec 2016 | WO |
2017030903 | Feb 2017 | WO |
2017097292 | Jun 2017 | WO |
2017197274 | Nov 2017 | WO |
2018013858 | Jan 2018 | WO |
2018013860 | Jan 2018 | WO |
Entry |
---|
European Patent Office Action Extended European Search Report for Application No. 20157662.6 dated Jul. 7, 2020 (8 pages). |
Number | Date | Country |
---|---|---|
20220030201 A1 | Jan 2022 | US |
| Number | Date | Country |
---|---|---|---|
Parent | 16280814 | Feb 2019 | US |
Child | 17494266 | US |