The present disclosure relates to a visualization system for a work machine and, in particular, to adjustment of a mechanical delay offset factor for improved planter section control using image processing by the visualization system for a planter row unit. The present disclosure also relates to adjustment of one or more systems on the work machine to control application of a fertilizer or herbicide product to a field based on a characterization of the fertilizer or herbicide product relative to a commodity or a weed in the field, using image processing by the visualization system for the planter row unit.
There is a wide variety of different types of agricultural machines that apply material to an agricultural field. Some such agricultural machines include sprayers, tillage machines with side dressing bars, air seeders, and planters that have row units.
As one example, a row unit is often mounted to a planter with a plurality of other row units. The planter is often towed by a tractor over soil where seed is planted in the soil, using the row units. The row units on the planter follow the ground profile by using a combination of a down force assembly, which imparts a down force to the row unit to push disk openers into the ground, and gauge wheels, which set the depth of penetration of the disk openers.
Row units can also be used to apply material to the field (e.g., fertilizer to the soil, to a seed, etc.) over which they are traveling. In some scenarios, each row unit has a valve that is coupled between a source of material to be applied, and an application assembly. As the valve is actuated, the material passes through the valve, from the source to the application assembly, and is applied to the field.
Many current systems apply the material in a substantially continuous way. For instance, where the application machine is applying a liquid fertilizer, it actuates the valve to apply a substantially continuous strip of the liquid fertilizer. The same is true of other liquid or granular materials, as examples.
A headland or turnrow is the area at each end of a planted field and is one type of a boundary. Planters often create rows in the headland area wherein these rows run perpendicular to the lay of the field. Other types of boundaries include waterways and previously planted areas of a field.
Planter section control turns implement sections on and off, wherein the implement sections are assembled with a work machine such as a planter. By reducing product application overlap, section control decreases the total amount of product used in the field, which can lead to lower costs. Additionally, an increase in yields can be seen because there is less competition among plants, which may suffer in overlapped areas of the field where the seed population is too high. Planter section control can be difficult to adjust in order to align seed, nutrient, fertilizer, or any commodity that is applied to the ground surface with any boundaries without overlap or underlap. Overlap occurs when the seed or commodity is positioned in a location that is past its intended boundary or target location in the direction of travel of the planter. Underlap occurs when the seed or commodity is positioned in a location that is short of its intended boundary or target location in the direction of travel of the planter. Planter section control is difficult because latencies and tolerance stack-ups in time occur when a large system such as a planter crosses a boundary or target location at a fairly high rate of speed. Latencies occur because the GPS system is mounted on the tractor to measure the tractor position, whereas the row units that dispense the commodity or seed are positioned rearwardly of the tractor. Tolerance stack-ups are the combination of various part dimension tolerances of the planter.
Another reason planter section control is difficult is that the time and global position, or actual location, at which the seed or commodity is deposited onto the ground is not known with certainty and accuracy. It is known when the seed or commodity system is engaged to begin the distribution process. However, it can be difficult to determine with adequate certainty the precise location or time offset from when the commodity delivery system is engaged to when and where the seed or commodity comes to its final resting place on the ground, and thus the actual location of each commodity or seed on the ground. Therefore, it can be hard to predict exactly where the commodity or seeds are located with respect to any boundary.
One technique to verify and adjust the location of the dispensed seeds is to manually dig in the ground for the seeds and check the location of the seeds relative to the boundary, such as with a tape measure or other device. This measurement typically requires the operator to stop and exit the planter, then check the location of the seeds or commodity. Based on the speed of the vehicle, the distance or spacing between the deposited seeds or commodity, and the distance of the seeds or commodity to the boundary, the operator will adjust a time or distance offset to obtain the desired placement with respect to the boundary. This is time-consuming and prone to error.
Mechanical delay is an error inherent to the system. For example, the motors for the implement sections cannot ramp up instantly to full capacity, the seed or commodity meter and the belt have to overcome inertia and spin up, and the seed needs time to fall to the ground, all of which cause some offset from the ideal placement due to the mechanical delay. Another type of mechanical delay relates to placement of the fertilizer relative to the seed or commodity placement in the furrow, or vice versa. Fertilizer can be applied prior to seed placement or after seed placement; however, the intended application location of the fertilizer is on top of or below the seed. Due to the mechanical delay of a fertilizer valve, the fertilizer may not be applied directly under or over the seed. The mechanical delay is always present to some degree, and operators manually account for it. However, operators cannot always account for the mechanical delay accurately, so there is always some error.
Thus, there is a need for improvement in planter section control and a need for improvement in accurately accounting for the mechanical delay of the fertilizer system or other systems associated with planter section control. The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
According to one embodiment of the present disclosure, a method comprising: emitting a patterned light from a visualization system mounted on a planter row unit onto a trench in a ground surface, wherein the visualization system includes a structured light unit mounted on the planter row unit; depositing a commodity in the trench by the planter row unit; capturing a two-dimensional image of the projected patterned light on the trench with the visualization system that includes a camera mounted on the planter row unit, wherein the two-dimensional image includes the commodity; and determining, with a controller operably connected to the camera, an error placement of the commodity in the two-dimensional image relative to a boundary.
In one example, further comprising: determining, with the controller, a location of the commodity in the trench in the two-dimensional image.
In one example, further comprising: determining, with the controller, whether the commodity captured in the two-dimensional image is a boundary commodity.
In one example, further comprising: determining, with the controller, whether the error placement is acceptable or unacceptable; in response to the error placement being unacceptable, adjusting a mechanical delay offset factor of a planter section control coupled with the planter row unit.
In one example, wherein the planter row unit is coupled to an agricultural work machine, the planter section control includes a commodity delivery system coupled to the controller and the agricultural work machine.
In one example, wherein the adjusting the mechanical delay offset factor of the planter section control is performed automatically by the controller.
In one example, wherein the adjusting the mechanical delay offset factor of the planter section control is performed by an operator.
In one example, wherein the error placement is an underlap condition of the commodity relative to the boundary.
In one example, wherein the error placement is an overlap condition of the commodity relative to the boundary.
According to one embodiment of the present disclosure, a method comprising: emitting a patterned light from a visualization system mounted on a planter row unit onto a ground surface, wherein the visualization system includes a structured light unit mounted on the planter row unit; depositing a commodity in the ground surface by the planter row unit; capturing a two-dimensional image of the projected patterned light on the ground surface with the visualization system that includes an imaging unit mounted on the planter row unit, wherein the two-dimensional image includes the commodity; and determining, with a controller operably connected to the imaging unit, whether the commodity captured in the two-dimensional image is a boundary commodity relative to a boundary.
In one example, further comprising: in response to the commodity being the boundary commodity, determining with the controller, an error placement of the boundary commodity in the two-dimensional image relative to the boundary.
In one example, further comprising: determining, with the controller, whether the error placement is acceptable or unacceptable; in response to the error placement being unacceptable, adjusting a mechanical delay offset factor of a planter section control coupled with the planter row unit.
In one example, wherein the planter row unit is coupled to an agricultural work machine, the planter section control includes a commodity delivery system coupled to the controller and the agricultural work machine.
In one example, further comprising: determining a desired error of placement that includes a desired distance for depositing the boundary commodity relative to the boundary; determining whether the error placement of the boundary commodity in the two-dimensional image relative to the boundary is greater than the desired error of placement; in response to the error placement being greater than the desired error of placement, adjusting a mechanical delay offset factor of a planter section control coupled with the planter row unit.
In one example, wherein in response to the error placement being less than the desired error of placement, alerting an operator of this condition.
In one example, wherein the error placement is an underlap condition of the boundary commodity relative to the boundary.
In one example, wherein the error placement is an overlap condition of the boundary commodity relative to the boundary.
In one example, further comprising: in response to the commodity being the boundary commodity, determining with the controller, geographical identification metadata of each of the boundary commodity in the trench in the two-dimensional image and the boundary.
In one example, wherein the planter section control is operably coupled to one or more of a hopper, a seed meter, and a seed delivery system, or other systems coupled with the planter row unit that are included in the mechanical delay offset factor.
In one example, wherein the adjusting the mechanical delay offset factor of the planter section control is performed automatically by the controller.
In one example, wherein the adjusting the mechanical delay offset factor of the planter section control is performed by an operator.
According to one embodiment of the present disclosure, a method comprising: emitting a patterned light from a visualization system mounted on a planter row unit onto a trench in a ground surface, wherein the visualization system includes a structured light unit mounted on the planter row unit; depositing a commodity in the trench by the planter row unit; capturing a two-dimensional image of the projected patterned light on the trench with the visualization system that includes a camera mounted on the planter row unit, wherein the two-dimensional image includes the commodity; depositing a product in the trench by the planter row unit; capturing a two-dimensional image of the projected patterned light on the trench with the visualization system, wherein the two-dimensional image includes the product; determining, with a controller operably connected to the camera, a location of the commodity in the two-dimensional image; and determining, with the controller, a product characterization of the product in the two-dimensional image.
In one example, wherein the commodity and the product are captured in the same two-dimensional image.
In one example, wherein the commodity is captured in a first two-dimensional image and the product is captured in a second two-dimensional image.
In one example, wherein the product is any of a fertilizer, a liquid material, a granular material, or a herbicide material.
In one example, further comprising: determining, with the controller, an error characterization of the product characterization relative to the location of the commodity in the two-dimensional image; and determining, with the controller, whether the error characterization is within an acceptable range or an unacceptable range.
In one example, wherein the product characterization includes any of a length, a width, an area, or a depth of the product as determined from the two-dimensional image.
In one example, further comprising: in response to the error characterization being in the unacceptable range, adjusting a mechanical delay offset factor of any of a planter section control, a commodity delivery system, or a product delivery system coupled with the planter row unit and the controller.
In one example, wherein the adjusting the mechanical delay offset factor is performed automatically by the controller.
In one example, wherein the unacceptable range of the error characterization is greater than 5 inches.
In one example, wherein the error characterization is in the acceptable range that is between 0 and 5 inches.
In one example, wherein the product characterization is a band of fertilizer having a length, wherein the commodity is a seed, wherein the error characterization indicates a placement of the band of product in the two-dimensional image relative to a location of the commodity in the two-dimensional image.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Corresponding reference numerals are used to indicate corresponding parts throughout the several views.
Some of the benefits of the present disclosure include using an optical sensing device such as a visualization system, in combination with a precise global timestamp or GPS geospatial tag, to determine a precise place or location and/or time at which commodities are placed with respect to a boundary, and thereby to adjust planter section control by an operator or automatically. The planter section control includes a mechanical shut-off device coupled to a controller and a GPS receiver device on the agricultural work machine. The shut-off device can be a single row clutch mounted on every row unit, an electronic shut-off device that controls a section or multiple rows of the planter row units, or a turn-on device to start the commodity flow or placement. In one embodiment, where the shut-off device is the single row clutch, the clutch is disengaged to start the flow of the commodity.
The visualization system can sense the boundary, or the boundary may be determined by an operator or automatically. The image processing performed by the visualization system determines precisely where the first or last seed or commodity was planted in relationship to the boundary. The timing of subsequent starting and stopping of the section control, or mechanical shut-off and turn-on, is adjusted based on an error placement between an actual commodity relationship to the boundary and a desired commodity relationship to the boundary. In some embodiments, the actual commodity relationship is the same as the desired commodity relationship. The boundary can be set by the operator or determined by the visualization system. The boundary can include waterways, previously planted areas of a field, headland areas, or other areas. The boundary forms a geofence and is known by the global timestamp or GPS geospatial tag.
The visualization system, and in some embodiments a controller coupled to the visualization system, determines an error placement based on a boundary commodity that is closest to or intended to be deposited on the boundary, wherein the boundary is the desired location of the boundary commodity. The visualization system determines, from images that include the boundary commodity and from the GPS location of the boundary relative to the boundary commodity, a time and/or place or location at which the boundary commodity should be deposited relative to the boundary. The visualization system, and in some embodiments the controller, determines any adjustments needed for the planter section control to optimize placement or deposition of future or additional boundary commodities relative to the boundary, to position the boundary commodities close to the boundary or desired location. The visualization system, and in some embodiments the controller, optimizes the error placement to minimize the distance between the boundary commodity and the boundary.
The present disclosure adjusts the planter section control, either automatically or by an operator, to turn one or more implement sections on and off to position the boundary commodity placement close to the boundary. The adjustment or mechanical delay offset factor time to turn the implement sections on or off can be in small increments such as seconds, milliseconds, or any increment that is deemed effective to optimize the boundary commodity placement relative to the boundary. As the planter or agricultural work machine travels over a field, the locations of the boundary and the boundary commodity can change, so the adjustment or mechanical system delay time of the one or more implement sections can change. In some embodiments, the velocity of the planter or agricultural work machine does not change. Adjustment of the mechanical system delay for the implement sections to shut off or turn on is based on comparing the placement of the boundary commodity in the image to the known position or location of the boundary. For example, if the present disclosure determines that the mechanical system delay time is starting one second too late, then the controller or operator can adjust the mechanical system delay time to start one second earlier to compensate.
Mechanical delay is an error inherent to the system. For example, the motors for the implement sections cannot ramp up instantly to full capacity, the seed or commodity meter and the belt have to overcome inertia and spin up, and the seed needs time to fall to the ground, all of which cause some offset from the ideal placement due to the mechanical delay. The mechanical delay is measured, and the present application compensates for it via the mechanical delay offset factor that adjusts the timing to compensate for the mechanical delay. The mechanical delay is always present to some degree; however, the present application accounts for the mechanical delay via the visualization system and the adjustment of the mechanical delay offset. For example, the visualization system determines that a seed placement is off by a certain distance, such as 3″, due to the mechanical delay, so the system is turned on 0.5 seconds earlier to account for that known delay.
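By way of a non-limiting illustration, the conversion from a measured placement error to a timing correction can be sketched as follows in Python (the function and variable names are illustrative assumptions and not part of the disclosure; the actual correction applied to the mechanical delay offset factor depends on the ground speed and the measured delay):

    def delay_correction_seconds(placement_error_in, ground_speed_mph):
        """Convert a measured placement error (inches) into a timing correction (seconds).

        A positive error means the commodity landed past its target in the direction
        of travel (overlap), so the section would be actuated earlier by the returned
        amount; a negative error (underlap) would delay actuation instead.
        """
        inches_per_second = ground_speed_mph * 5280.0 * 12.0 / 3600.0
        return placement_error_in / inches_per_second

    # Illustrative values only: a 3 inch overshoot at 5 mph maps to roughly a
    # 0.034 second earlier actuation under this simplified model.
    print(f"advance actuation by {delay_correction_seconds(3.0, 5.0):.3f} s")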
In one exemplary embodiment, a planter row unit coupled to the planter or agricultural work machine travels over a field and deposits a boundary commodity near the boundary. The visualization system takes an image of the boundary commodity and determines an error of placement of the boundary commodity relative to the GPS location of the boundary. The visualization system and/or a controller can also determine a GPS location of the boundary commodity for the error of placement. Alternatively, the visualization system and/or a controller can determine a location of the boundary in the image that includes the boundary commodity. Based on the error of placement, an operator or the controller will make an adjustment to the mechanical system delay time. The planter row unit will then place a new boundary commodity, and a placement or location of the new boundary commodity in an image relative to the boundary is checked to determine a subsequent error placement. If the subsequent error placement is not acceptable, then the controller or operator will make another adjustment to the mechanical system delay time until the measurement/placement of the new boundary commodity in the image relative to the boundary is within a “tolerance”. The tolerance can be set by the operator or automatically.
In another exemplary embodiment, an operator or the controller determines one, more, or all of the geolocations or geospatial tags of desired commodity placements prior to actual placement of the commodity. Next, the operator or controller determines a geolocation or geospatial tag of the boundary. As the planter row unit deposits the commodity in the field, the visualization system georeferences the actual commodity in the images captured by the visualization system. The visualization system and/or controller compares the georeferenced location of the actual commodity in the image with the prescribed geolocation of commodity placement. In particular, the visualization system and/or controller determines an error of placement for the boundary commodity relative to the boundary. Based on the error of placement, the planter section control is adjusted to turn implement sections on and off.
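By way of a non-limiting illustration, the comparison between the georeferenced location of the actual commodity and the prescribed geolocation can be sketched as follows (the local projection, function names, and example coordinates are illustrative assumptions); a signed along-track distance of this form distinguishes overlap from underlap relative to the boundary:

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
        """Project a lat/lon pair onto a local tangent plane (metres), adequate
        over the short distances that separate a deposited commodity from a boundary."""
        dlat = math.radians(lat_deg - ref_lat_deg)
        dlon = math.radians(lon_deg - ref_lon_deg)
        x = dlon * math.cos(math.radians(ref_lat_deg)) * EARTH_RADIUS_M  # east
        y = dlat * EARTH_RADIUS_M                                        # north
        return x, y

    def along_track_error_m(actual_latlon, prescribed_latlon, heading_deg):
        """Signed error along the direction of travel: positive = overlap
        (past the prescribed location), negative = underlap (short of it)."""
        ax, ay = to_local_xy(*actual_latlon, *prescribed_latlon)
        hx, hy = math.sin(math.radians(heading_deg)), math.cos(math.radians(heading_deg))
        return ax * hx + ay * hy

    # Hypothetical example: placement roughly 0.08 m past the prescribed point, heading north.
    err = along_track_error_m((40.0000007, -90.0), (40.0, -90.0), heading_deg=0.0)
    print(f"along-track error: {err:+.2f} m")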
Referring now to
Each planter row unit 14 may include an auxiliary or secondary hopper 18 for holding product such as fertilizer, seed, chemical, or any other known product or commodity. In this embodiment, the secondary hopper 18 may hold seed. As such, a seed meter 20 is shown for metering seed received from the secondary seed hopper 18. A furrow opener or furrow opening disk 22 may be provided on the planter row unit 14 for forming a furrow or trench in a field for receiving metered seed (or other product) from the seed meter 20. The seed or other product may be transferred to the trench from the seed meter 20 by a seed delivery system 24. In one embodiment, a closing system or closing wheel 26 may be coupled to each planter row unit 14 and is used to close the furrow or trench with the seed or other product contained therein. The closing system includes a closing wheel but in other embodiments the closing system can include closing disks, closing tires, and/or drag chains to name a few examples.
In one embodiment, the seed meter 20 is a vacuum seed meter, although in alternative embodiments other types of seed meters using mechanical assemblies or positive air pressure may also be used for metering seed or other product. As described above, the present disclosure is not solely limited to dispensing seed. Rather, the principles and teachings of the present disclosure may also be used to apply non-seed products to the field. For seed and non-seed products, the planter row unit 14 may be considered an application unit with a secondary hopper 18 for holding product, a product meter for metering product received from the secondary hopper 18 and an applicator for applying the metered product to a field. For example, a dry chemical fertilizer or pesticide may be directed to the secondary hopper 18 and metered by the product meter 20 and applied to the field by the applicator.
The planter row unit 14 includes a shank 40. The shank 40 is coupled to a closing wheel frame 52. The closing wheel frame 52 has a pivot end 54 that is pivotably connected to a pivot 49 and an opposite end 56, with a body portion 58 that spans between the pivot end 54 and the opposite end 56. The planter row unit 14 includes a pair of furrow opening disks 22 rotatably mounted on the shank 40 and a pair of closing wheels 26 rotatably mounted on the closing wheel frame 52. The planter row unit 14 can also include a pair of gauge wheels, but those are not illustrated. The pair of furrow opening disks 22 form a trench or furrow 192 in the field or in a ground surface G during operation of the planter row unit 14. Alternatively, other opening devices can be used in place of the pair of furrow opening disks 22. The trench 192 has a V-shaped cross-section as illustrated in
An exemplary configuration of a visualization system 60 that is operably connected and mounted to the planter row unit 14 is illustrated in
Although one camera or imaging unit 62 is illustrated, additional cameras 62 can be used with the structured light unit 64. The camera or imaging unit 62 is mounted between the pair of closing wheels 26 and the pair of furrow opening disks 22 or alternatively the camera or imaging unit 62 is mounted between the pair of closing wheels 26 and the seed delivery system 24. The structured light unit 64 is also mounted between the pair of closing wheels 26 and the pair of furrow opening disks 22 or alternatively the structured light unit 64 is mounted between the pair of closing wheels 26 and the seed delivery system 24. In the illustrated embodiment, the camera or imaging unit 62 is positioned close to the pair of closing wheels 26 and the structured light unit 64 is positioned close to the seed delivery system 24 and/or the pair of furrow opening disks 22. In other embodiments, the structured light unit 64 is positioned close to the pair of closing wheels 26 and the camera or imaging unit 62 is positioned close to the seed delivery system 24 and the pair of furrow opening disks 22.
In some embodiments, the visualization system 60 includes a general illumination light 68 mounted to the planter row unit 14. The general illumination light 68 can include one or more light emitting diodes (LEDs) or a broad-beamed, high-intensity artificial light. The general illumination light 68 can illuminate the trench 192 to help the camera or imaging unit 62 capture the visual context of the trench 192. The general illumination light 68 can be used with the structured light unit 64. Imaging by the camera or imaging unit 62 can be performed with alternating light sources such that the structured light unit 64 is operable while the general illumination light 68 is non-operable, and vice versa, wherein the structured light unit 64 is non-operable while the general illumination light 68 is operable. Non-operation of the general illumination light 68 during operation of the structured light unit 64 enables the camera or imaging unit 62 to capture a 2D image where the pattern created by the structured light unit 64 stands out significantly from the rest of the background. Non-operation of the structured light unit 64 during operation of the general illumination light 68 enables the camera or imaging unit 62 to capture a better image of the visual context of the trench 192. Alternatively, the general illumination light 68 and the structured light unit 64 can be operational together. For example, the structured light unit 64 is activated while the camera or imaging unit 62 captures images; however, the general illumination light 68 is not operational for every image that is captured by the camera or imaging unit 62. As a further example, the general illumination light 68 can be operational for some of the images that are captured and non-operational for others of the images that are captured by the camera or imaging unit 62. The general illumination light 68 is placed between the pair of closing wheels 26 and the pair of furrow opening disks 22. The general illumination light 68 can alternatively be mounted on or combined with the camera or imaging unit 62. The general illumination light 68 can be placed under the shank 40 or under the closing wheel frame 52. The general illumination light 68 can be placed anywhere on the planter row unit 14 to illuminate a field of view of the camera or imaging unit 62.
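As a non-limiting sketch of the alternating-light-source imaging described above (the data structure and scheduling function are illustrative assumptions, not part of the disclosure), a capture schedule can interleave structured-light frames with general-illumination frames:

    from dataclasses import dataclass
    from itertools import cycle

    @dataclass
    class Frame:
        index: int
        structured_light_on: bool
        general_illumination_on: bool

    def capture_schedule(n_frames, pattern=("structured", "illumination")):
        """Alternate frames: structured light on with general illumination off,
        then general illumination on with structured light off, and so on."""
        modes = cycle(pattern)
        for i in range(n_frames):
            mode = next(modes)
            yield Frame(i, structured_light_on=(mode == "structured"),
                        general_illumination_on=(mode == "illumination"))

    for frame in capture_schedule(4):
        print(frame)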
In any embodiment, the camera or imaging unit 62 is oriented to point down towards the ground surface G at the trench 192 that is formed by the pair of furrow opening disks 22. The camera or imaging unit 62 also points down toward the projected light from the structured light unit 64 at the trench 192 in the ground surface G. The structured light unit 64 projects a narrow band of light across the trench 192 to produce a line of illumination or patterned light and can be used for location of a seed or commodity 102 therein and location of a boundary 200. The structured light unit 64 points towards the ground surface G and the trench 192 formed therein. In any embodiment, the structured light unit 64 and the camera or imaging unit 62 are accurately calibrated relative to each other so that 3D locations of the commodity 102, the trench 192, and the boundary 200 can be recovered by triangulation or other techniques.
The structured light unit 64 includes a single laser or single light source that projects a single line, multiple lines, grids, stripes, one or more dots or point projections, a cross, a triangle, or another known pattern of light, collectively “patterned light,” on the trench in the ground surface G. Alternatively, the structured light unit 64 can include multiple lasers or light sources. For example, the structured light unit 64 can emit a single point projection to a trench bottom for determining a trench depth or commodity location. As another example, the structured light unit 64 can emit a single line projection for measuring a cross-section of the trench as well as the trench depth or the commodity location. As yet another example, the structured light unit 64 can emit an area projection such as multiple lines, grids, or stripes for measuring a location of the commodity 102, the trench 192, and the boundary 200 at various points within the measured section. In one embodiment, a slit in a light cover can be positioned in front of the structured light unit 64 to thereby project multiple lines on the trench 192 to provide additional points, a mesh, or an area of 3D points to perform a multiple cross-sectional measurement. Multiple lines may be beneficial in a dusty environment to increase the potential to obtain a good measurement. Light from the structured light unit 64 can also pass through a digital spatial light modulator to form a pattern with regular and equidistant stripes of light on the trench 192. In one embodiment, the structured light unit 64 projects a single line as the planter row unit 14 moves in the direction of laser scanning T to provide additional scanning of cross-sectional measurements and measurement of the commodity 102.
In one embodiment, the structured light unit 64 is a green light, but in other embodiments the structured light unit 64 can be another colored light such as blue, or a white light. If the structured light unit 64 is configured as a colored light, then the camera or imaging unit 62 is a color or monochrome camera. Alternatively, the structured light unit 64 can emit in a near-infrared (NIR), infrared (IR), or other non-visible range for better visibility in challenging or obstructive environmental conditions such as dust, fog, or haze, wherein the NIR or IR light is used with the camera or imaging unit 62 being infrared or near-infrared. As such, the camera or imaging unit 62 and the structured light unit 64 can be operated in the visible spectrum range, or outside of the visible spectrum range, such as the infrared range, in order to have better air obscurant penetration such as dust penetration. While the trench 192 is formed by the furrow opening disks 22, soil and dust can fill or permeate the air, so it is difficult for the operator or a conventional color camera to capture the trench 192 cross-sectional shape. A near-infrared camera or imaging unit 62 can be used in dusty or visibly challenging environments to improve the visualization of the 2D plane that is projected by the structured light unit 64.
In certain embodiments, the visualization system 60 includes or is operatively connected to a controller 80 structured to perform certain operations to control the camera or imaging unit 62, the structured light unit 64, and the general illumination light 68. The controller 80 can be placed anywhere on the planter row unit 14, the planter, the agricultural work machine or tractor 140, or any work machine that may be connected to or capable of performing one or more planting operations. In certain embodiments, the camera or imaging unit 62 includes the controller 80. In certain embodiments, the controller 80 forms a portion of a processing subsystem including one or more computing devices having memory, processing, and communication hardware. The controller 80 may be a single device or a distributed device, and the functions of the controller 80 may be performed by hardware or by instructions encoded on computer readable medium. The controller 80 may be included within, partially included within, or completely separated from other controllers (not shown) associated with the work machine and/or the visualization system 60. The controller 80 is in communication with any sensor or other apparatus throughout the visualization system 60, including through direct communication, communication over a datalink, and/or through communication with other controllers or portions of the processing subsystem that provide sensor and/or other information to the controller 80.
The vehicle controller 80 can include a GPS device or be operably coupled with a GPS device 180 (
The location-based field registration or map 300 can also account for the velocity of the planting vehicle or the agricultural work machine 140 at the time any images were captured. The map 300 is illustrative of one exemplary map however it is contemplated that other types of maps can be configured as desired, and based on the individual field, commodity application, and GPS coordinates, to name a few characteristics of the map 300.
In some embodiments, images and determined metrics may be tagged or geotagged during registration with GPS information to produce the field map 300. For example, the images with the commodity 102 and the trench 192 may be geotagged to add geographical identification metadata, a form of geospatial metadata, to various media such as a geotagged photograph or video, websites, SMS messages, QR Codes, or RSS feeds. This data usually consists of latitude and longitude coordinates, though it can also include altitude, bearing, distance, accuracy data, place names, and perhaps a time stamp. Geotagging can help users find a wide variety of location-specific information from a device. The geographical location data used in geotagging can be derived from the GPS device 180. In many embodiments, a user experience interface, such as in the agricultural work machine 140, may comprise a processor-enabled display system wherein the processor executes computer readable instructions to produce visualized output illustrative of a plurality of captured data including the map 300, the image 100, and the boundary 200.
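By way of a non-limiting illustration, geotagging of a captured image and its derived metrics into the field map 300 can be sketched as follows (the record layout, function names, and example values are illustrative assumptions):

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class GeoTag:
        latitude: float
        longitude: float
        altitude_m: Optional[float] = None
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class MapRecord:
        image_id: str
        tag: GeoTag
        metrics: dict  # e.g. {"error_placement_in": 0.4}

    field_map = []  # stand-in for the location-based field registration or map 300

    def register_image(image_id, lat, lon, metrics, altitude_m=None):
        """Attach geographical identification metadata to a captured image record."""
        field_map.append(MapRecord(image_id, GeoTag(lat, lon, altitude_m), metrics))

    register_image("img_000123", 40.123456, -90.654321, {"error_placement_in": 0.4})
    print(field_map[0])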
Another aspect of the present application provides systems and methods for automatically adjusting planter components or a mechanical delay offset factor of the planter section control in response to determined seeding or commodity measurements and/or GPS locations of the seed or commodity 102 relative to the boundary 200 based on images captured by the visualization system 60. These systems and methods determine an error placement between a boundary commodity relative to the boundary 200 and a desired boundary commodity relative to the boundary 200. These systems and methods can then adjust the mechanical delay offset factor of the mechanical systems of the planter row unit 14 accordingly until the error placement is acceptable. The adjustments to the mechanical delay offset factor of the mechanical systems of the planter row unit 14 can be made by an operator or automatically, such as by the controller 80.
In certain embodiments, the controller 80 is described as functionally executing certain operations. The descriptions herein, including the controller operations, emphasize the structural independence of the controller and illustrate one grouping of operations and responsibilities of the controller. Other groupings that execute similar overall operations are understood within the scope of the present application. Aspects of the controller may be implemented in hardware and/or by a computer executing instructions stored in non-transient memory on one or more computer readable media, and the controller may be distributed across various hardware or computer based components.
Example and non-limiting controller implementation elements include sensors providing any value determined herein, sensors providing any value that is a precursor to a value determined herein, datalink and/or network hardware including communication chips, oscillating crystals, communication links, cables, twisted pair wiring, coaxial wiring, shielded wiring, transmitters, receivers, and/or transceivers, logic circuits, hard-wired logic circuits, reconfigurable logic circuits in a particular non-transient state configured according to the module specification, any actuator including at least an electrical, hydraulic, or pneumatic actuator, a solenoid, an op-amp, analog control elements (springs, filters, integrators, adders, dividers, gain elements), and/or digital control elements.
The listing herein of specific implementation elements is not limiting, and any implementation element for any controller described herein that would be understood by one of skill in the art is contemplated herein. The controllers herein, once the operations are described, are capable of numerous hardware and/or computer based implementations, many of the specific implementations of which involve mechanical steps for one of skill in the art having the benefit of the disclosures herein and the understanding of the operations of the controllers provided by the present disclosure.
One of skill in the art, having the benefit of the disclosures herein, will recognize that the controllers, control systems and control methods disclosed herein are structured to perform operations that improve various technologies and provide improvements in various technological fields. Certain operations described herein include operations to interpret one or more parameters. Interpreting, as utilized herein, includes receiving values by any method known in the art, including at least receiving values from a datalink or network communication, receiving an electronic signal (e.g. a voltage, frequency, current, or PWM signal) indicative of the value, receiving a software parameter indicative of the value, reading the value from a memory location on a non-transient computer readable storage medium, receiving the value as a run-time parameter by any means known in the art, and/or by receiving a value by which the interpreted parameter can be calculated, and/or by referencing a default value that is interpreted to be the parameter value.
In some embodiments, the visualization system 60 is operably connected to a mobile device (not illustrated) such as a mobile phone, computer, laptop, or electronic tablet that includes a user interface for operably engaging with the visualization system 60 and the operator; however, in other embodiments the visualization system 60 is not connected to the mobile device. The user interface of the mobile device can display the same display as a user interface in the agricultural work machine 140 or a different display.
In some embodiments, a desired error of placement 220 and 222 is determined as illustrated in
Measurement of the location of the commodity 102 will now be described by measuring the three-dimensional (3D) location of the laser points or patterned light of an image 100 that is projected by the structured light unit 64 as illustrated in
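By way of a non-limiting illustration, with the camera or imaging unit 62 and the structured light unit 64 calibrated relative to each other, the 3D location of a laser point imaged in the 2D image 100 can be recovered by intersecting the back-projected pixel ray with the known laser plane (a simplified pinhole-camera sketch; the calibration values, coordinates, and function names are illustrative assumptions):

    import numpy as np

    def pixel_to_ray(u, v, fx, fy, cx, cy):
        """Back-project a pixel into a unit ray in the camera frame (pinhole model)."""
        d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        return d / np.linalg.norm(d)

    def intersect_laser_plane(ray, plane_normal, plane_point):
        """Intersect a camera ray (from the origin) with the calibrated laser plane,
        returning the 3D point on the trench surface in the camera frame."""
        n = np.asarray(plane_normal, dtype=float)
        p0 = np.asarray(plane_point, dtype=float)
        t = np.dot(n, p0) / np.dot(n, ray)
        return t * ray

    # Hypothetical calibration: laser plane 0.5 m ahead of the camera along +Z,
    # with its normal mostly aligned to the optical axis.
    point = intersect_laser_plane(
        pixel_to_ray(640, 512, fx=900.0, fy=900.0, cx=640.0, cy=480.0),
        plane_normal=(0.0, 0.3, 1.0),
        plane_point=(0.0, 0.0, 0.5),
    )
    print("3D point on trench (m):", point.round(3))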
Certain systems are described and include examples of controller operations in various contexts of the present disclosure. In certain embodiments, such as procedure 350 shown in
In step 360, the controller 80 determines whether the error placement 210 or 212 is acceptable. If the error placement 210 or 212 is not acceptable, then at step 362 an adjustment of a mechanical delay offset factor of the planter row unit 14 is determined. In step 362, the adjustment of the mechanical delay offset factor of the planter components or of the planter section control is made in response to the determined error placement 210 or 212 being unacceptable. The adjustments to the mechanical delay offset factor of the mechanical systems, the planter section control, and/or commodity delivery systems of the planter row unit 14 can be made by an operator or automatically, such as by the controller 80.
Step 362 then continues to step 352 to repeat the procedure 350 and continue checking for the boundary commodity 102 in relation to the boundary 200 and for an acceptable error placement 210 or 212. The procedure 350 is applicable to a commodity that can include seed, sprays, residue, fertilizer, growing plants, and/or emergence detection of a plant.
In step 360, if the error placement 210 or 212 is acceptable, then at step 364 the procedure 350 ends.
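By way of a non-limiting illustration, the overall flow of procedure 350 can be sketched as a capture-measure-adjust loop (the callable names, tolerance, and iteration limit are illustrative assumptions; the actual steps are those of the procedure described above):

    def procedure_350(capture_image, find_boundary_commodity, measure_error,
                      adjust_offset, tolerance_in=1.0, max_iterations=10):
        """Sketch of the adjust-and-recheck loop: capture an image, locate the
        boundary commodity, measure its error placement against the boundary,
        and adjust the mechanical delay offset factor until the error is acceptable."""
        for _ in range(max_iterations):
            image = capture_image()
            commodity = find_boundary_commodity(image)
            if commodity is None:
                continue  # no boundary commodity in this frame; keep scanning
            error_in = measure_error(commodity)
            if abs(error_in) <= tolerance_in:
                return error_in  # acceptable; procedure ends
            adjust_offset(error_in)  # operator- or controller-driven correction
        return None  # tolerance not reached within the allotted passes

    # Hypothetical usage with stand-in callables:
    errors = iter([4.0, 2.5, 0.8])
    result = procedure_350(
        capture_image=lambda: "frame",
        find_boundary_commodity=lambda img: "seed",
        measure_error=lambda c: next(errors),
        adjust_offset=lambda e: print(f"adjusting offset for {e:+.1f} in error"),
    )
    print("final error (in):", result)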
Turning now to the embodiment illustrated in
The controller 80 determines, based on the captured images of the product 304, a product characterization that includes details of the product 304 that was placed in the ground G. Some details of the product characterization of the product 304 include a length, a width, an area, and a depth of the product in the ground G, but other details could be determined. The controller 80 determines an error characterization 306 of the product characterization relative to the placement of the commodity 102 in the captured image. The controller 80 determines the error characterization 306 or error placement between the target commodity 102 and the product characterization of the product 304 applied near, on, or under the target commodity 102 in the captured images. For example, the target commodity 102 can include a seed, weed plant, or a fertilizer. The product 304 can include anything that is not the target commodity 102. For example, if the target commodity 102 is a seed, then the product 304 can be a fertilizer that is applied near, on, or under the target commodity 102 that is the seed. The target commodity 102 can include a weed and the product 304 can be a herbicide.
The controller 80 determines whether the error characterization 306 is within an acceptable range. If the error characterization 306 is not within the acceptable range, then adjustments to the mechanical delay offset factor of the mechanical systems, the planter section control, seed delivery system, seed metering system, commodity delivery system, and/or product delivery system of the planter row unit 14 can be made by an operator or automatically, such as by the controller 80. The systems and methods adjust the mechanical delay offset factor of the mechanical systems of the planter row unit 106 accordingly until the error characterization 306 is within an acceptable range or value. The adjustments to the mechanical delay offset factor of the mechanical systems of the planter row unit 106 can be made by an operator or automatically, such as by the controller 80 or the material application control system 113. The present description proceeds with respect to the examples being deployed on a row unit of a planter. They could just as easily be deployed on a sprayer, an air seeder, a tillage machine with a side-dress bar, or another piece of agricultural equipment that is used to apply a commodity or product.
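By way of a non-limiting illustration, and using the example acceptance range of 0 to 5 inches given in the summary, the acceptance check on the error characterization 306 can be sketched as follows (the data structure, threshold handling, and example values are illustrative assumptions):

    from dataclasses import dataclass

    @dataclass
    class ProductCharacterization:
        start_in: float   # along-furrow start of the product band, relative to the seed
        length_in: float  # measured band length from the captured image

    def error_characterization_in(product, seed_position_in=0.0):
        """Distance between the seed and the centre of the product band (inches)."""
        band_center = product.start_in + product.length_in / 2.0
        return abs(band_center - seed_position_in)

    def needs_adjustment(error_in, acceptable_max_in=5.0):
        """Apply the example acceptance range from the summary (0 to 5 inches)."""
        return error_in > acceptable_max_in

    band = ProductCharacterization(start_in=3.0, length_in=8.0)
    err = error_characterization_in(band)
    print(err, "in ->", "adjust delay offset" if needs_adjustment(err) else "acceptable")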
Turning now to
Machine 100 is a row crop planting machine that illustratively includes a toolbar 102 that is part of a frame 104.
In the example shown in
As liquid passes through actuator 109, it travels through an application assembly 117 from a proximal end (which is attached to an outlet end of actuator 109) to a distal tip (or application tip) 119, where the liquid is discharged into a trench, or proximate a trench or furrow 162, opened by disc opener 114 (as is described in more detail below).
Some parts of row unit 106 will now be discussed in more detail. First, it will be noted that there are different types of seed meters 124, and the one shown is for the sake of example only and is described in greater detail below. However, in one example, each row unit 106 need not have its own seed meter. Instead, metering or other singulation or seed dividing techniques can be performed at a central location, for groups of row units 106. The metering systems can include finger pick-up discs and/or vacuum meters (e.g., having rotatable discs, rotatable concave or bowl-shaped devices), among others. The seed delivery system can be a gravity drop system (such as seed tube 120 shown in
A downforce actuator 126 is mounted on a coupling assembly 128 that couples row unit 106 to toolbar 102. Actuator 126 can be a hydraulic actuator, a pneumatic actuator, a spring-based mechanical actuator or a wide variety of other actuators. In the example shown in
Arms (or gauge wheel arms) 148 illustratively abut against a mechanical stop (or arm contact member-or wedge) 150. The position of mechanical stop 150 relative to shank 152 can be set by a planting depth actuator assembly 154. Control arms 148 illustratively pivot around pivot point 156 so that, as planting depth actuator assembly 154 actuates to change the position of mechanical stop 150, the relative position of gauge wheels 116, relative to the double disc opener 114, changes, to change the depth at which seeds are planted.
In operation, row unit 106 travels generally in the direction indicated by arrow 160. The double disc opener 114 opens a furrow 162 in the soil 138, and the depth of the furrow 162 is set by planting depth actuator assembly 154, which, itself, controls the offset between the lowest parts of gauge wheels 116 and disc opener 114. Seeds are dropped through seed tube 120, into the furrow 162 and closing wheels 118 close the furrow 162, e.g., push soil back into the furrow 162.
As the seeds are dropped through seed tube 120, they can be sensed by seed sensor 122. Some examples of seed sensor 122 are described in greater detail below. Some examples of seed sensor 122 may include an optical or reflective sensor, which includes a radiation transmitter component and a receiver component. The transmitter component emits electromagnetic radiation and the receiver component then detects the radiation and generates a signal indicative of the presence or absence of a seed adjacent the sensors. In another example, row unit 106 may be provided with a seed firmer that is positioned to travel through the furrow 162, after seeds are placed in furrow 162, to firm the seeds in place. A seed sensor can be placed on the seed firmer and generate a sensor signal indicative of a seed. Again, some examples of seed sensors are described in greater detail below.
The present description proceeds with respect to the seed sensor being located to sense a seed passing it in seed tube 120, but this is for the sake of example only. Material application control system 113 illustratively receives a signal from seed sensor 122, indicating that a seed is passing sensor 122 in seed tube 120. It then determines when to actuate actuator 109 so that material or product being applied through application assembly 117 (and out distal tip 119 of application assembly 117) will be applied at a desired location relative to the seed in trench or furrow 162.
Material application control system 113 illustratively is programmed with, or detects, a distance, e.g., a longitudinal distance, that the distal tip 119 is from the exit end 121 of seed tube 120. It also illustratively senses, or is provided by another component (such as the GPS device 180 or the agricultural work machine 140, such as a tractor), the ground speed of row unit 106. Once system 113 receives a seed sensor signal indicating that a seed is passing sensor 122 in seed tube 120, system 113 determines the amount of time it will take for the seed to drop through the outlet end 121 of seed tube 120 and into furrow 162 to reside at its final seed location and position in furrow 162. It then determines when tip 119 will be in a desired location relative to that final seed location and it actuates actuator 109 to apply the material or product at the desired location. By way of example, it may be that some material or product is to be applied directly on the seed. In that case, system 113 times the actuation of actuator 109 so that the applied material or product will be applied at the seed location. In another example, it may be desirable to apply some material or product at the seed location and also a predetermined distance on either side of the seed location. In that case, system 113 controls the signal used to control actuator 109 so that the material or product is applied in the desired fashion. In other examples, it may be that the material or product is to be applied at a location between seeds in furrow 162. By way of example, relatively high nitrogen fertilizer may be most desirably applied between seeds, instead of directly on the seed. In that case, system 113 has illustratively been programmed with the desired location of the applied material, relative to seed location, so that it can determine when to actuate actuator 109 in order to apply the material between seeds. Further, as discussed above, actuator 109 can be actuated to dispense material or product at a varying rate. It can dispense more material on the seed location and less at locations spaced from the seed location, or vice versa, or according to other patterns.
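By way of a non-limiting illustration, the timing computation performed by material application control system 113 can be sketched under a simplified model (the function name, arguments, and example values are illustrative assumptions and ignore effects such as seed roll or belt transport speed):

    def actuation_delay_s(drop_time_s, tip_offset_in, ground_speed_in_s,
                          target_offset_in=0.0, valve_latency_s=0.0):
        """Time to wait after the seed sensor pulse before opening the valve.

        drop_time_s        time for the seed to fall from the sensor to its final
                           resting place in the furrow
        tip_offset_in      longitudinal distance from the seed's landing point to the
                           application tip (positive if the tip trails the landing point)
        target_offset_in   desired placement of the material relative to the seed
                           (0 = directly on the seed, positive = behind it)
        valve_latency_s    mechanical delay of the valve or actuator itself
        """
        travel_time_s = (tip_offset_in + target_offset_in) / ground_speed_in_s
        return max(0.0, drop_time_s + travel_time_s - valve_latency_s)

    # Hypothetical numbers: 0.12 s drop, tip 6 in behind the landing point, 88 in/s (5 mph).
    print(f"open valve after {actuation_delay_s(0.12, 6.0, 88.0, valve_latency_s=0.03):.3f} s")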
It will be noted that a wide variety of different configurations are contemplated herein. For instance, in one example,
As unit 105 moves, material application control system 113 controls actuator 109 to dispense material or product 304. This can be done relative to seed or plant locations of the seeds or target commodity 102, if they are sensed or are already known or have been estimated. It can also be done before the seed or plant locations are known. In this latter scenario, the locations where the material or product 304 is applied can be stored so that seeds or target commodity 102 can be planted later, relative to the locations of the material or product 304 that has been already dispensed.
The visualization system 60 including the camera or imaging unit 62, the structured light unit 64, and in some embodiments the general illumination light 68 is mounted to the applicator unit 105. The visualization system 60 is configured to capture one or more images of the commodity 102 and the product 304 when these are deposited on the ground 138. In other embodiments, the visualization system 60 is mounted to the towing vehicle 94.
In such a system, material application control system 113 considers the speed at which delivery system 166 moves the seed from seed sensor 122 to the exit end 170. It also illustratively considers the speed at which the seed moves from the exit end 170 into furrow 162. For instance, in one example the seed simply drops from exit end 170 into furrow 162 under the force of gravity. In another example, however, the seed can be ejected from delivery system 166 at a greater or lesser speed than that which would be reached under the force of gravity. Similarly, it may be that the seed drops straight downward into furrow 162 from the outlet end 170. In another example, however, it may be that the seed is propelled slightly rearwardly from the outlet end 170, to accommodate for the forward motion of the row unit 106, so that the travel path of the seed is more vertical and so the seed rolls less once it reaches the furrow. Further, the seed can be ejected rearwardly and trapped against the ground by a trailing member (such as a pinch wheel) which functions to stop any rearward movement of the seed, after ejection, and to force the seed into firm engagement with the ground.
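By way of a non-limiting illustration, the effect of the seed leaving the delivery system under gravity versus being ejected downward at a greater speed can be sketched with a simplified free-fall model (the drop height and ejection speed are illustrative assumptions); the resulting time would feed a timing computation such as the one sketched above:

    from math import sqrt

    def drop_time_s(drop_height_m, ejection_speed_m_s=0.0, g=9.81):
        """Time for a seed to reach the furrow from the delivery system outlet.

        ejection_speed_m_s = 0 corresponds to a pure gravity drop; a positive value
        models a delivery system that ejects the seed downward faster than gravity."""
        # Solve 0.5*g*t^2 + v0*t - h = 0 for t >= 0
        v0 = ejection_speed_m_s
        return (-v0 + sqrt(v0 * v0 + 2.0 * g * drop_height_m)) / g

    print(f"gravity drop from 0.30 m: {drop_time_s(0.30):.3f} s")
    print(f"ejected at 2 m/s downward: {drop_time_s(0.30, 2.0):.3f} s")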
The visualization system 60 including the camera or imaging unit 62, the structured light unit 64, and in some embodiments the general illumination light 68 is mounted to the row unit 106. The visualization system 60 is configured to capture one or more images of the commodity 102 and the product 304 when these are deposited on the ground 138.
In another example, member 172 can be positioned so that it moves through the furrow after the seed is placed in the furrow. In such an example, member 172 may act as a seed firmer, which firms the seed into its final seed location.
In either case, member 172 can include a seed sensor 176, which senses the presence of the seed. It may be an optical sensor, which optically senses the seed presence as member 172 moves adjacent to, ahead of, or over the seed. It may be a mechanical sensor that senses the seed presence, or it may be another type of sensor that senses the presence of the seed in the furrow. Sensor 176 illustratively provides a signal to material application control system 113 indicating the presence of the sensed seed.
In such an example, it may be that actuator 109 is placed at the location of actuator 109E, shown in
The visualization system 60 including the camera or imaging unit 62, the structured light unit 64, and in some embodiments the general illumination light 68 is mounted to the row unit 106. The visualization system 60 is configured to capture one or more images of the commodity 102 and the product 304 when these are deposited on the ground 138.
Also, in the example shown in
Once a seed comes to rest in (or proximate) an aperture 184, the vacuum or positive pressure differential acts to hold the seed within the aperture 184 such that the seed is carried upwardly generally in the direction indicated by arrow 188, from seed pool 186, to a seed discharge area 190. It may happen that multiple seeds are residing in an individual seed cell. In that case, a set of brushes or other members 194 that are located closely adjacent the rotating seed cells tend to remove the multiple seeds so that only a single seed is carried by each individual cell. Additionally, a seed sensor 193 can also illustratively be mounted adjacent to rotating element 180. It generates a signal indicative of seed presence and this may be used by system 113, as will be discussed in greater detail below.
Once the seeds reach the seed discharge area 190, the vacuum or other pressure differential is illustratively removed, and a positive seed removal wheel or knock-out wheel 191, can act to remove the seed from the seed cell. Wheel 191 illustratively has a set of projections 195 that protrude at least partially into apertures 184 to actively dislodge the seed from those apertures. When the seed is dislodged (such as seed 171), it is illustratively moved by the seed tube 120, seed delivery system 166 (some examples of which are shown above in
After the seed is moved to the furrow 162 in the ground, the visualization system 60 captures one or more images of the seed or commodity 102 in the ground 138.
Therefore, when seeds are moved by rotating element 180 to the seed discharge area 190, where they are discharged from the seed cells in rotating element 180, they are illustratively positioned within the bristles 202 by the projections 182 that push the seed into the bristles 202. Seed delivery system 166 illustratively includes walls that form an enclosure around the bristles 202, so that, as the bristles 202 move in the direction indicated by arrow 208, the seeds are carried along with them from the seed discharge area 190 of the metering mechanism, to a discharge area 210 either at ground level, or below ground level within a trench or furrow 162 that is generated by the furrow opener 114 on the row unit 106.
Additionally, a seed sensor 203 is also illustratively coupled to seed delivery system 166. As the seeds are moved within bristles 202, sensor 203 can detect the presence or absence of a seed. It should also be noted that, while the present description will proceed as having sensors 122, 193 and/or 203, it is expressly contemplated that, in another example, only one sensor is used, or that additional sensors are used. Similarly, the seed sensor 203 shown in
In some embodiments, the seed sensor may signal the visualization system 60 to capture an image of the seed or target commodity 102 in the furrow 162 once the seed has been moved to the furrow 162 in the ground 138.
There are a wide variety of other types of delivery systems as well that include a transport mechanism and a receiver that receives a seed. For instance, they include dual belt delivery systems in which opposing belts receive, hold, and move seeds to the furrow; a rotatable wheel with sprockets that catch seeds from the metering system and move them to the furrow; multiple transport wheels that operate to transport the seed to the furrow; and an auger, among others. The present description will proceed with respect to an endless member (such as a brush belt or a flighted belt) and/or a seed tube, but many other delivery systems are contemplated herein as well.
Before continuing with the description of applying material or product 304 relative to the location of the seed or target commodity 102, a brief description of some examples of seed sensors 122, 193, and 203 will first be provided. Sensors 122, 193, and 203 are illustratively coupled to seed metering system 124 and seed delivery systems 120, 166, and sense an operating characteristic of seed metering system 124 and seed delivery systems 120, 166. In one example, sensors 122, 193, and 203 are seed sensors that are each mounted at a sensor location to sense a seed within seed tube 120, seed metering system 124, and delivery system 166, respectively, as the seed passes the respective sensor location. In one example, sensors 122, 193, and 203 are optical or reflective sensors and thus include a transmitter component and a receiver component. The transmitter component emits electromagnetic radiation into seed tube 120, seed metering system 124, and/or delivery system 166. In the case of a reflective sensor, the receiver component detects the reflected radiation and generates a signal indicative of the presence or absence of a seed adjacent to sensor 122, 193, or 203 based on the reflected radiation. With other sensors, radiation, such as light, is transmitted through the seed tube 120, seed metering system 124, or the delivery system 166, and when the light beam is interrupted by a seed, the sensor signal varies to indicate a seed. Thus, each sensor 122, 193, and 203 generates a seed sensor signal that pulses or otherwise varies, and the pulses or variations are indicative of a seed passing the sensor location proximate the sensor.
For example, with regard to sensor 203, bristles 202 pass sensor 203 and are colored to absorb a majority of the radiation emitted from the transmitter. As a result, absent a seed, the reflected radiation received by the receiver is relatively low. When a seed passes the sensor location where sensor 203 is mounted, however, more of the emitted light is reflected off the seed and back to the receiver, indicating the presence of a seed. The differences in the reflected radiation allow a determination to be made as to whether a seed is, in fact, present. Additionally, in other examples, sensors 122, 193, and 203 can include a camera and image processing logic, such as the visualization system 60, that allow visual detection as to whether a seed is present within seed metering system 124, seed tube 120, and/or seed delivery system 166 at the sensor location proximate the sensor. They can include a wide variety of other sensors (such as RADAR or LIDAR sensors) as well.
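As a rough illustration of how a pulsed seed sensor signal of this kind might be turned into discrete seed detections, the following minimal sketch thresholds a sampled reflectance (or beam-interruption) signal and reports one seed per rising edge. The sample rate, threshold, and example signal values are hypothetical and are not taken from the present disclosure.

# Minimal sketch: convert a sampled seed sensor signal into discrete seed
# detections by finding rising edges above a threshold. All numbers here
# (sample rate, threshold, example samples) are hypothetical.

def detect_seed_pulses(samples, threshold, sample_rate_hz):
    """Return the times (seconds) at which a seed pulse begins."""
    detections = []
    above = False
    for i, value in enumerate(samples):
        if value >= threshold and not above:
            # Rising edge: the reflected (or blocked) radiation crossed the
            # threshold, which is interpreted here as a seed passing the sensor.
            detections.append(i / sample_rate_hz)
            above = True
        elif value < threshold:
            above = False
    return detections

# Example usage with a made-up signal sampled at 1 kHz.
signal = [0.1, 0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.8, 0.8, 0.1]
print(detect_seed_pulses(signal, threshold=0.5, sample_rate_hz=1000.0))
# -> [0.002, 0.006]  (two seed pulses)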
For instance, where a seed sensor is placed on a seed firmer, it may be a mechanical or other type of sensor that senses contact with the seed as the sensor passes over the seed. Also, while the speed of the seed in the delivery system (or as it is ejected) can be identified by using a sensor that detects the speed of the delivery system, in some examples the speed and/or other characteristics of movement of the seed can be identified using seed sensors. For instance, one or more seed sensors can be located to sense the speed of movement of the seed, its trajectory or path, its instantaneous acceleration, its presence, etc. This can be helpful in scenarios in which the seed delivery system changes speed.
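One simple way to estimate seed speed from seed sensors, consistent with the idea above, is to time-stamp the same seed at two sensor locations a known distance apart along the delivery path. The sensor spacing and timestamps in the sketch below are hypothetical values chosen only for illustration.

# Minimal sketch: estimate seed speed from the time difference between two
# seed sensors mounted a known distance apart along the delivery path.
# The spacing and timestamps are hypothetical.

def seed_speed_m_per_s(t_upstream_s, t_downstream_s, sensor_spacing_m):
    dt = t_downstream_s - t_upstream_s
    if dt <= 0:
        raise ValueError("downstream detection must occur after upstream detection")
    return sensor_spacing_m / dt

# Example: sensors 0.15 m apart, seed detected 50 ms apart.
print(seed_speed_m_per_s(0.000, 0.050, 0.15))  # -> 3.0 m/s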
The visualization system 60 previously described is operably connected and mounted to the row unit 106 as illustrated in
Imaging by the camera or imaging unit 62 and operation of the structured light unit 64 and the general illumination light 68 are described previously. The general illumination light 68 can be placed anywhere on the row unit 106 to illuminate a field of view of the camera or imaging unit 62.
In certain embodiments, the visualization system 60 includes or is operatively connected to the controller 80 or a material application control system 113 structured to perform certain operations to control the camera or imaging unit 62, the structured light unit 64, and the general illumination light 68. The material application control system 113 can be placed anywhere on the row unit 106, the planter, the towing vehicle 94, or any work machine that may be connected to or capable of performing one or more planting operations. In certain embodiments, the camera or imaging unit 62 includes the controller 80 or the material application control system 113. The controller 80 is in communication with any sensor or other apparatus throughout the visualization system 60 and the row unit 106, including through direct communication, communication over a datalink, and/or through communication with other controllers or portions of the processing subsystem that provide sensor and/or other information to the controller 80.
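The wiring between the seed sensors, the controller 80 (or material application control system 113), and the visualization system 60 could be organized along the lines of the following sketch, in which a seed detection event triggers the structured light unit and an image capture. The class and method names, and the stub devices, are hypothetical placeholders rather than the interfaces of the actual system.

# Minimal sketch of controller orchestration: on a seed detection event, the
# controller turns on the structured light (and, optionally, the general
# illumination light) and requests an image capture. Names are hypothetical.

class _StubLight:
    """Stand-in for a light unit; a real implementation would drive hardware."""
    def turn_on(self): pass
    def turn_off(self): pass

class _StubCamera:
    """Stand-in for the camera or imaging unit."""
    def capture(self):
        return b"image-bytes"

class VisualizationController:
    """Hypothetical glue logic: a seed detection triggers lighting and capture."""
    def __init__(self, imaging_unit, structured_light, illumination=None):
        self.imaging_unit = imaging_unit
        self.structured_light = structured_light
        self.illumination = illumination

    def on_seed_detected(self, sensor_id, timestamp_s):
        # Turn on the lights, grab one frame of the furrow, then turn them off.
        if self.illumination is not None:
            self.illumination.turn_on()
        self.structured_light.turn_on()
        image = self.imaging_unit.capture()
        self.structured_light.turn_off()
        if self.illumination is not None:
            self.illumination.turn_off()
        return {"sensor": sensor_id, "time_s": timestamp_s, "image": image}

controller = VisualizationController(_StubCamera(), _StubLight(), _StubLight())
print(controller.on_seed_detected(sensor_id=203, timestamp_s=12.5))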
Measurement of the location of the commodity 102 in an image 400 is illustrated in
In one form, the locations of the commodity 102 and the product 304 are determined using structured-light based sensing. In the embodiment wherein the structured light unit 64 projects a single line laser, the structured light unit 64 emits patterned light (see
Additionally, the camera or imaging unit 62 captures an image of the trench 192 with the product 304 therein with the projected patterned light. In some embodiments, the image 400 includes the commodity 102 and the product 304. In other embodiments, a first image includes the commodity 102 and a second image includes the product 304, such that the commodity 102 and the product 304 are not shown in a single image. The geometric relationship between the laser or light plane LP projected by the structured light unit 64 and the principal optical axis CV of the camera or imaging unit 62 is determined by the location and orientation of the camera or imaging unit 62 and the structured light unit 64 relative to each other; therefore, the 3D locations of the laser line pixels in the image 400 can be determined. Once the 3D locations of the laser line pixels are determined, the 3D locations of the commodity 102 and the product 304 are computed based on the 3D measurements in the image 400.
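Because the pose of the laser plane LP relative to the camera is known from calibration, each laser-line pixel can be back-projected to a camera ray and intersected with that plane to recover a 3D point. The following sketch assumes an ideal pinhole camera model; the intrinsic parameters and plane coefficients shown are hypothetical calibration values, not values from this disclosure.

# Minimal sketch of structured-light triangulation: back-project a laser-line
# pixel through a pinhole camera model and intersect the resulting ray with
# the known laser plane to recover a 3D point in camera coordinates.
# The intrinsics (fx, fy, cx, cy) and the plane are hypothetical values.

def pixel_to_3d(u, v, fx, fy, cx, cy, plane_normal, plane_d):
    """Intersect the camera ray through pixel (u, v) with the laser plane.

    The plane is defined by n . X + d = 0 in camera coordinates.
    Returns the 3D point (X, Y, Z) in camera coordinates.
    """
    # Ray direction through the pixel for a pinhole camera (origin at camera center).
    ray = ((u - cx) / fx, (v - cy) / fy, 1.0)
    nx, ny, nz = plane_normal
    denom = nx * ray[0] + ny * ray[1] + nz * ray[2]
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the laser plane")
    t = -plane_d / denom
    return tuple(t * c for c in ray)

# Example with made-up calibration: a laser plane tilted toward the camera.
point = pixel_to_3d(u=640, v=400, fx=900.0, fy=900.0, cx=640.0, cy=360.0,
                    plane_normal=(0.0, -0.5, 1.0), plane_d=-0.30)
print(point)  # 3D location of one laser-line pixel, in meters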
Certain systems are described and include examples of controller operations in various contexts of the present disclosure. In certain embodiments, such as procedure 460 shown in
As the commodity 102 is placed in the furrow 162 in the ground 138, the commodity 102 is captured in images by the visualization system 60 assembled with the row unit 106. As the product 304 is discharged from the distal tip 119 into the furrow 162 in the ground 138, the product 304 is captured in images by the visualization system 60 assembled with the row unit 106. In some embodiments, the commodity 102 is placed in the furrow 162 before the product 304 is placed in the furrow 162. In other embodiments, the product 304 is placed in the furrow 162 before the commodity 102 is placed. The commodity 102 can include a seed, a weed plant, or a fertilizer. The product 304 can include anything that is not the commodity 102. For example, if the commodity 102 is a seed, then the product 304 can be a fertilizer that is applied near the seed. As another example, the commodity 102 can include a weed and the product 304 can be a herbicide.
In some embodiments, the commodity 102 and the product 304 are captured in the same image. In other embodiments, the commodity 102 is captured in a first image and the product 304 is captured in a second image, or vice versa.
At operation 456, the controller 80 is operable to determine whether the image 400 includes the commodity 102. If the image 400 does not include the commodity 102, then the camera or imaging unit 62 continues to capture the image 400 of the trench with the projected patterned light at the operation 454 until the captured image includes the commodity 102.
At step 456, if the image 400 includes the commodity 102, then the procedure continues to step 458 wherein the controller 80 is operable to determine whether the image 400 includes the product 304. If the image 400 does not include the product 304, then the camera or imaging unit 62 continues to capture the image 400 of the trench with the projected patterned light at operation 454 until the captured image includes the product 304.
At step 458, if the image 400 includes the product 304, then the procedure continues to step 460 wherein the controller 80 is operable to determine, based on the captured images 400 of the product 304, a product characterization that includes details of the product 304, such as whether it is a fertilizer, a liquid, a granular material, a herbicide, a herbicide product, or another product placed in the ground G, as determined from the captured image 400. Details of the product characterization include the type of product, such as fertilizer or herbicide, a length, a width, an area, a depth of the product 304 in the ground G, a 3D location of the product 304, and other details about the product 304. If the product 304 is a liquid material applied in a band, the product characterization may also indicate the length of each application band applied on the ground G. Similarly, the application rate may vary within an application band. For instance, the product 304 may be applied more heavily near the center of the band than at either end of the band, or vice versa.
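As one way to picture the product characterization step, the sketch below reduces a set of 3D points attributed to an applied band of product to a few summary measurements (length, width, center, mean depth). The data structure, field names, and sample points are hypothetical; a real implementation would derive the points from the images 400 as described above.

# Minimal sketch of a product characterization: summarize the 3D points
# attributed to an applied band of product (e.g., liquid fertilizer) as a
# length, width, center, and mean depth. The sample points are hypothetical
# stand-ins for points recovered from the images 400.

from dataclasses import dataclass

@dataclass
class ProductCharacterization:
    product_type: str       # e.g. "fertilizer" or "herbicide"
    length_m: float         # extent along the direction of travel
    width_m: float          # extent across the furrow
    center_m: tuple         # (x, y) center of the band along/across the row
    mean_depth_m: float     # mean depth below the ground surface

def characterize_band(points_xyz, product_type):
    xs = [p[0] for p in points_xyz]   # along the direction of travel
    ys = [p[1] for p in points_xyz]   # across the furrow
    zs = [p[2] for p in points_xyz]   # depth below ground (positive down)
    return ProductCharacterization(
        product_type=product_type,
        length_m=max(xs) - min(xs),
        width_m=max(ys) - min(ys),
        center_m=((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0),
        mean_depth_m=sum(zs) / len(zs),
    )

# Hypothetical band roughly 0.10 m long, 0.02 m wide, 0.03 m deep.
band = characterize_band(
    [(0.00, 0.000, 0.030), (0.05, 0.010, 0.031), (0.10, 0.020, 0.029)],
    product_type="fertilizer",
)
print(band)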
The procedure then continues to step 462 wherein the controller 80 determines an error characterization 306 of the product characterization determined in step 460 relative to the placement and location of the commodity 102 in the captured image from step 456. The controller 80 determines the error characterization 306, or error placement, between the commodity 102 and the product characterization of the product 304 applied near the commodity 102 in the images 400. The error characterization 306 may indicate a placement of a band of product 304 relative to the seed location of the commodity 102. For instance, where the band of product 304 is four inches long, the product characterization determined in step 460 may indicate a placement of the center of the band (along its longitudinal length), and the error characterization 306 may indicate the relative placement of that center to the commodity location of the commodity 102. In this way, where the product 304 is to be applied at the commodity location of the commodity 102, the center of the band will illustratively correspond to the commodity location. However, where the product 304 is to be applied at a location other than the commodity location of the commodity 102, the center of the band will illustratively be offset from the commodity location by a desired amount.
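One possible reading of the error characterization 306 is an offset between where the band was actually placed relative to the seed and where it was intended to be placed. The sketch below computes that offset along the direction of travel, allowing for a non-zero desired offset; the function name, sign convention, and numbers are hypothetical illustrations.

# Minimal sketch of an error characterization 306: the difference between the
# actual placement of the band center relative to the seed and the desired
# placement. Positive values indicate the band landed past the desired
# location in the direction of travel. All positions are hypothetical.

def error_characterization_m(band_center_x_m, seed_x_m, desired_offset_m=0.0):
    actual_offset = band_center_x_m - seed_x_m
    return actual_offset - desired_offset_m

# Example: band center 0.05 m past the seed, but it was meant to be centered
# on the seed, so the placement error is +0.05 m (about 2 inches).
print(error_characterization_m(band_center_x_m=0.05, seed_x_m=0.0))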
The procedure then continues to step 464 wherein the controller 80 determines whether the error characterization 306 or error placement is within an acceptable range. If the error characterization 306 or error placement is not within the acceptable range, then the procedure continues to step 466 wherein adjustments to the mechanical delay offset factor of any of the mechanical systems, the planter section control, commodity delivery system 166, seed metering system 124, actuator 109, application assembly 117, applicator unit 105, and/or product delivery systems of the planter row unit 14 or 106 can be made by an operator or automatically, such as by the controller 80. These systems and methods can then adjust the mechanical delay offset factor of the mechanical systems of the planter row unit 106 accordingly until the error characterization 306 or error placement is within the acceptable range. In one example, the error characterization is acceptable when it is between 0 and 5 inches and unacceptable when it is greater than 5 inches. In other embodiments, the unacceptable range could begin at 2, 3, or 4 inches, or the acceptable range could extend from 0 to 8 inches. In other embodiments, the operator can designate the acceptable and unacceptable ranges of the error characterization.
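One way the adjustment at step 466 could be closed into a loop is sketched below: the measured placement error is converted into a correction of the mechanical delay offset factor using the ground speed, and the loop repeats until the error falls inside the acceptable range. The gain, ground speed, sign convention, and example error values are hypothetical, and the acceptance test uses the 0 to 5 inch example range mentioned above.

# Minimal sketch of iteratively adjusting a mechanical delay offset factor
# until the placement error is within an acceptable range. At ground speed v,
# a timing error dt produces a placement error of roughly v * dt, so the
# correction divides the measured error by the speed. Gain, speed, and error
# values are hypothetical, and the sign convention is illustrative only.

INCH_TO_M = 0.0254

def adjust_delay_offset(delay_offset_s, error_m, ground_speed_m_s, gain=1.0):
    """Return a corrected delay offset intended to reduce the placement error."""
    # Illustrative convention: a positive error (product placed past its target
    # in the direction of travel) means the actual delay exceeded the
    # compensated delay, so the offset is increased to command the dispenser
    # earlier, and vice versa.
    return delay_offset_s + gain * (error_m / ground_speed_m_s)

def within_acceptable_range(error_m, max_error_in=5.0):
    return abs(error_m) <= max_error_in * INCH_TO_M

# Hypothetical closed loop: each pass, the corrected delay removes most of the error.
delay_s, error_m, speed = 0.000, 0.178, 2.5   # ~7 inch initial error at 2.5 m/s
while not within_acceptable_range(error_m):
    delay_s = adjust_delay_offset(delay_s, error_m, speed, gain=1.0)
    error_m *= 0.3   # stand-in for re-measuring the error on the next pass
print(round(delay_s, 4), round(error_m, 4))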
The adjustments to the mechanical delay offset factor of the mechanical systems of the planter row unit 106 can be made by an operator or automatically, such as by the controller 80 or the material application control system 113. The present description proceeds with respect to the visualization system 60 deployed on a row unit of a planter; however, the visualization system 60 could be deployed on a sprayer, an air seeder, a tillage machine with a side-dress bar, or other piece of agricultural equipment that is used to apply the commodity 102 or the product 304. The visualization system 60 could also be deployed with autonomy cameras on tractors and sprayers to define stop/start times or section control. For example, the sprayer camera can see and identify grass in a waterway in the captured image and thus shut the section off. Alternatively, the cameras can be used to determine on/off system delays by observing the spray in visible or thermal views. This concept could also be utilized to see fertilizer spread from an air boom or dry box and provide the appropriate system on/off time settings.
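The on/off delay determination mentioned above could, for example, be estimated by comparing the time a section command is issued to the first camera frame (visible or thermal) in which spray or spread product is actually detected. The frame timestamps and detection flags in the sketch below are hypothetical.

# Minimal sketch: estimate a section on-delay as the time between the command
# to turn a section on and the first camera frame in which spray is detected.
# Frame timestamps and detection flags are hypothetical.

def estimate_on_delay_s(command_time_s, frames):
    """frames: list of (timestamp_s, spray_detected) pairs in time order."""
    for t, detected in frames:
        if t >= command_time_s and detected:
            return t - command_time_s
    return None   # spray never observed after the command

frames = [(10.00, False), (10.05, False), (10.10, False), (10.15, True), (10.20, True)]
print(estimate_on_delay_s(command_time_s=10.00, frames=frames))  # -> ~0.15 s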
If the error characterization 306 or error placement is within the acceptable range, then the procedure continues to step 468 and no adjustments to the mechanical delay offset factor of any of the mechanical systems, the planter section control, commodity delivery system 166, seed metering system 124, actuator 109, application assembly 117, applicator unit 105, and/or product delivery systems of the planter row unit 14 or 106 are made and the procedure ends.
The present discussion has mentioned processors and servers. In one example, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong and are activated by, and facilitate the functionality of, the other components or items in those systems.
Also, a number of user interface displays have been discussed. They can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. They can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). They can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. They can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which they are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.
A number of data stores have also been discussed. It will be noted they can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.
Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.
The present application is a continuation-in-part of U.S. Nonprovisional application Ser. No. 18/535,673, filed on Dec. 11, 2023, which claims the benefit of U.S. Provisional Patent Application No. 63/476,298, filed on Dec. 20, 2022. The present application also claims the benefit of U.S. Provisional Patent Application No. 63/529,853, filed on Jul. 31, 2023. Each of the foregoing applications is incorporated herein by reference in its entirety.
Provisional Applications (US): 63/529,853, filed Jul. 2023; 63/476,298, filed Dec. 2022.
Continuity Data (US): Parent Application Ser. No. 18/535,673, filed Dec. 2023; Child Application Ser. No. 18/416,551.