The present disclosure relates generally to systems and methods for monitoring field conditions and, more particularly, to systems for monitoring field conditions as an agricultural implement moves across a field.
It is well known that, to attain the best agricultural performance from a field, a farmer must cultivate the soil, typically through a tillage operation. Tillage implements typically include one or more ground engaging tools configured to engage the soil as the implement is moved across the field. Such ground engaging tool(s) loosen and/or otherwise agitate the soil to prepare the field for subsequent agricultural operations, such as planting operations. The field conditions after a tillage operation, such as surface roughness and residue coverage, impact subsequent farming operations within the field. In this regard, sensor systems have been developed that allow field conditions to be detected along a portion of the field behind the tillage implement during the tillage operation.
However, conventional sensor systems typically include a fixed sensor having a limited field of view. As such, field conditions may only be captured for a small portion of the field behind the implement. Such an issue can potentially be addressed through the use of multiple fixed sensors. However, such multi-sensor arrangements are often prohibitively expensive.
Accordingly, improved systems and methods for monitoring field conditions as an agricultural implement is moved across a field would be welcomed in the technology.
Aspects and advantages of the invention will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the invention.
In one aspect, the present subject matter is directed to a system for monitoring field conditions of a field. The system includes a sensor, an actuator, and a controller. The sensor is supported on an agricultural implement such that the sensor has a field of view directed towards an aft portion of the field disposed rearward of the agricultural implement relative to a direction of travel of the agricultural implement. The sensor is configured to generate data indicative of a field condition associated with the aft portion of the field. The actuator is configured to actuate the sensor back and forth relative to an adjacent portion of the agricultural implement along a sensor movement path. The controller is configured to receive data from the sensor indicative of the field condition as the actuator actuates the sensor back and forth along the sensor movement path such that the field of view of the sensor is oscillated across the aft portion of the field while the agricultural implement is being moved across the field. The controller is further configured to monitor the field condition based at least in part on the data received from the sensor.
In a further aspect, the present subject matter is directed to another system for monitoring field conditions of a field. The system includes a sensor supported on an agricultural implement such that the sensor has a field of view directed towards the field, where the sensor is configured to generate data indicative of a field condition associated with the field. The system further includes an actuator configured to actuate the sensor back and forth relative to an adjacent portion of the agricultural implement along a sensor movement path. The system additionally includes a controller configured to determine an area-of-interest within the field. The controller is further configured to control an operation of the actuator to actuate the sensor along the sensor movement path such that the field of view is directed towards the area-of-interest within the field. The controller is additionally configured to monitor the field condition associated with the area-of-interest based at least in part on the data received from the sensor.
In another aspect, the present subject matter is directed to yet another system for monitoring field conditions of a field. The system includes a sensor supported on an agricultural implement, where the sensor has a field of view directed towards a portion of the field. The sensor is configured to generate data indicative of a field condition associated with the portion of the field. The system further includes an actuator configured to linearly actuate the sensor back and forth relative to an adjacent portion of the agricultural implement along a linear movement path. The system additionally includes a controller configured to receive data from the sensor indicative of the field condition as the actuator linearly actuates the sensor back and forth along the linear movement path such that the field of view of the sensor is oscillated across the portion of the field while the agricultural implement is being moved across the field. The controller is further configured to monitor the field condition based at least in part on the data received from the sensor.
In a further aspect, the present subject matter is directed to a method for monitoring field conditions of a field. The method includes receiving, with a computing device, data from a sensor indicative of a field condition as an actuator actuates the sensor back and forth along a sensor movement path such that a field of view of the sensor is oscillated across a portion of the field disposed relative to an agricultural implement while the agricultural implement is being moved across the field. The method further includes monitoring, with the computing device, the field condition based at least in part on the data received from the sensor. The method additionally includes performing, with the computing device, a control action based on the monitored field condition.
In an additional aspect, the present subject matter is directed to another method for monitoring field conditions of a field. The method includes receiving, with a computing device, an input associated with determining an area-of-interest within a field while an agricultural implement is being moved across the field. The method further includes controlling, with the computing device, an operation of an actuator to actuate a sensor along a sensor movement path such that a field of view of the sensor is directed towards the area-of-interest, the sensor being configured to generate data indicative of a field condition within the area-of-interest. Additionally, the method includes monitoring, with the computing device, a field condition associated with the area-of-interest based at least in part on data received from the sensor.
These and other features, aspects and advantages of the present invention will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
Repeat use of reference characters in the present specification and drawings is intended to represent the same or analogous features or elements of the present technology.
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
In general, the present subject matter is directed to systems and methods for monitoring field conditions of a field as an agricultural implement moves across the field. Specifically, in several embodiments, a computing device or controller of the disclosed system may be configured to monitor one or more field conditions based on data received from a sensor provided in operative association with an agricultural implement performing an operation within the field. The sensor may have a field of view directed towards a portion of the field such that the sensor generates data indicative of the monitored field condition(s) associated with such portion of the field. Additionally, in accordance with aspects of the present subject matter, the sensor may be configured to be moved or actuated back and forth along a sensor movement path such that the field of view of the sensor is oscillated across an adjacent portion of the field while the agricultural implement is being used to perform an operation within the field. As such, the sensor may capture data associated with the monitored field condition(s) across a larger area of the field than if the sensor were fixed in position. In some embodiments, the sensor movement path may be linear, such that the sensor is linearly oscillated back and forth along the linear movement path. Additionally or alternatively, in some embodiments, the sensor movement path may be arced or curved such that the sensor is pivotably oscillated back and forth along the arced movement path.
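For illustration only, the following is a minimal sketch, in Python, of one way such a back-and-forth (oscillating) position command could be generated for a sensor moved along a linear movement path. The function and parameter names (e.g., sweep_position, stroke_m, period_s) are hypothetical and are not taken from the present disclosure.

```python
def sweep_position(t: float, stroke_m: float = 1.0, period_s: float = 4.0) -> float:
    """Return a commanded sensor offset (meters from the path midpoint) at time t,
    oscillating linearly between -stroke_m/2 and +stroke_m/2 (triangle wave)."""
    phase = (t % period_s) / period_s            # 0.0 .. 1.0 over one cycle
    tri = 2.0 * abs(2.0 * phase - 1.0) - 1.0     # triangle wave in [-1, 1]
    return 0.5 * stroke_m * tri

# Example: sample the commanded offset over one 4-second cycle.
for t in (0.0, 1.0, 2.0, 3.0, 4.0):
    print(f"t={t:.1f}s -> offset={sweep_position(t):+.2f} m")
```

An arced movement path could be treated in the same manner by commanding a pivot angle rather than a linear offset.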
Moreover, in accordance with aspects of the present subject matter, the system controller may be configured to determine an area-of-interest within the field. For instance, in one embodiment, the controller may monitor the field condition data received from the sensor to determine an area-of-interest within the field. In other embodiments, the controller may receive an indication of a desired area-of-interest within the field from an operator. In further embodiments, the controller may monitor additional or supplemental data from one or more secondary sensors configured to detect one or more operating parameters of the implement, such as vibrations, levelness, etc., and/or other field conditions, such as moisture content, etc. Upon the determination of an area-of-interest within the field, the sensor may be moved along its associated sensor movement path such that the field of view of the sensor is directed towards the area-of-interest, thereby allowing the controller to specifically monitor the field condition(s) within the area-of-interest. In one embodiment, the controller may be configured to adjust the operation of the implement based on the determined condition(s) within the area-of-interest.
Additionally, in accordance with aspects of the present subject matter, the controller may also be configured to generate a field condition map for the field based at least in part on the data received from the sensor. More particularly, the data received from the sensor may be geo-referenced such that an estimated field condition(s) may be determined at each location within the field. However, in certain instances, the data received from the sensor will only correspond to a portion of the field as the sensor is being oscillated back and forth along its associated sensor movement path. Thus, in such instances, the controller may be configured to estimate the associated field condition(s) of one or more portions of the field outside of the field of view of the sensor based on the data received from the sensor to “fill-out” the field condition map. The field condition map may then be used, for example, to control the operation of the implement performing the current field operation or an implement performing a subsequent field operation.
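As a purely illustrative sketch of the "fill-out" step described above, the Python snippet below estimates unmeasured map cells from the nearest measured cell. The grid layout and the nearest-neighbor rule are assumptions chosen for illustration and do not represent the specific estimation technique used by the controller.

```python
from typing import Dict, Tuple

Cell = Tuple[int, int]  # (row, column) index of a grid cell in the field map

def fill_out_map(measured: Dict[Cell, float], rows: int, cols: int) -> Dict[Cell, float]:
    """Return a complete map: measured cells keep their values, while unmeasured
    cells take the value of the closest measured cell (distance on grid indices)."""
    filled: Dict[Cell, float] = {}
    for r in range(rows):
        for c in range(cols):
            if (r, c) in measured:
                filled[(r, c)] = measured[(r, c)]
            else:
                nearest = min(measured, key=lambda m: (m[0] - r) ** 2 + (m[1] - c) ** 2)
                filled[(r, c)] = measured[nearest]
    return filled

# Example: surface roughness sampled in a few swept sub-portions of a 3 x 4 grid.
samples = {(0, 0): 2.1, (0, 3): 3.4, (2, 1): 2.8}
print(fill_out_map(samples, rows=3, cols=4))
```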
Referring now to the drawings,
In general, the implement 10 may be configured to be towed across a field in a direction of travel (e.g., as indicated by arrow 14 in
As shown in
As shown in
In several embodiments, the frame 28 may be configured to support one or more gangs or sets 44 of disc blades 46. Each disc blade 46 may, in turn, be configured to penetrate into or otherwise engage the soil as the implement 10 is being pulled through the field. In this regard, the various disc gangs 44 may be oriented at an angle relative to the direction of travel 14 to promote more effective tilling of the soil. In the embodiment shown in
Moreover, in several embodiments, the implement 10 may include a plurality of disc gang actuators 104 (
Additionally, as shown, in one embodiment, the implement frame 28 may be configured to support other ground engaging tools. For instance, in the illustrated embodiment, the frame 28 is configured to support a plurality of shanks 50 or tines (not shown) configured to rip or otherwise till the soil as the implement 10 is towed across the field. Furthermore, in the illustrated embodiment, the frame 28 is also configured to support a plurality of leveling blades 52 and rolling (or crumbler) basket assemblies 54. The implement 10 may further include shank frame actuator(s) 50A and/or basket assembly actuator(s) 54A configured to move or otherwise adjust the orientation or position of the shanks 50 and the basket assemblies 54, respectively, relative to the implement frame 28. It should be appreciated that, in other embodiments, any other suitable ground-engaging tools may be coupled to and supported by the implement frame 28, such as a plurality of closing discs.
It should be appreciated that the configurations of the implement 10 and the work vehicle 12 described above are provided only to place the present subject matter in an exemplary field of use. Thus, it should be appreciated that the present subject matter may be readily adaptable to any manner of implement or work vehicle configuration.
Referring now to
In one embodiment, the rearward sensor 152 may be supported relative to the implement 10 such that the field of view 152A of the rearward sensor 152 is directed towards an aft portion of the field disposed rearward of the implement 10 relative to the direction of travel 14. For example, in the embodiment shown, the support arm 156 is positioned at or adjacent to the aft end 32 of the implement 10. As such, the rearward sensor 152 may be configured to generate data indicative of one or more field conditions associated with the aft portion of the field located behind or aft of the implement 10. For instance, the rearward sensor 152 may be configured to generate data indicative of at least one of a surface roughness, clod size, residue coverage, soil compaction, and/or the like of the aft portion of the field. The rearward sensor 152 may be configured as any suitable device, such as a camera(s) (including stereo camera(s), and/or the like), radar sensor(s), ultrasonic sensor(s), LIDAR device(s), infrared sensor(s), and/or the like such that the rearward sensor 152 generates image data, radar data, point-cloud data, infrared data, ultrasound data, and/or the like indicative of one or more monitored field conditions. For instance, the rearward sensor 152 may be configured as a radar sensor(s), an ultrasonic sensor(s), a LIDAR device(s), and/or a camera(s) to generate data indicative of soil roughness. Similarly, the rearward sensor 152 may be configured as a LIDAR device(s) and/or a camera(s) to generate data indicative of clod size and/or residue coverage. Further, the rearward sensor 152 may be configured as a radar sensor(s), specifically as ground-penetrating radar sensor(s), to generate data indicative of soil compaction.
In one embodiment, the field of view 152A of the rearward sensor 152 may be narrower than the implement 10 such that the rearward sensor 152 is only configured to capture data associated with a sub-section of the portion of the field located aft or behind the implement 10. More particularly, as shown in
Accordingly, as will be described in greater detail below, the disclosed sensing assembly 150 may also include an actuator 154 provided in operative association with the rearward sensor 152 that is configured to actuate the rearward sensor 152 back and forth relative to the implement 10 along a given sensor movement path such that the field of view 152A of the rearward sensor 152 can be oscillated across all or a given portion of the width W1 of the implement/swath, thereby allowing data to be captured along different sub-sections of the field swath being worked.
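As a rough, illustrative geometric sketch (not part of the disclosure), the fraction of the swath width W1 covered in one full sweep can be approximated by adding the sensor footprint width W2 to the sweep stroke, assuming the footprint slides contiguously across the stroke.

```python
def swath_coverage_fraction(w1_m: float, w2_m: float, stroke_m: float) -> float:
    """Fraction of the swath width W1 covered by sweeping a sensor footprint of
    width W2 over a linear stroke (capped at 1.0 for full coverage)."""
    covered = min(w1_m, w2_m + stroke_m)
    return covered / w1_m

# Example: a 1.5 m footprint swept over a 4.5 m stroke fully covers a 6 m swath.
print(swath_coverage_fraction(6.0, 1.5, 4.5))  # 1.0
```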
It should be appreciated that, while the sensing assembly 150 is shown as having only one rearward sensor 152, the sensing assembly 150 may have any other suitable number of rearward sensors 152, such as two or more rearward sensors 152. Further, while only one sensing assembly 150 is shown, the system 148 may have any other suitable number of sensing assemblies 150. Furthermore, in alternative embodiments, the sensing assembly 150 may be supported at any other suitable location on the implement 10 and/or the towing vehicle 12 such that the field of view 152A of the rearward sensor 152 is directed towards any other suitable portion of the field. For instance, in one embodiment, the sensing assembly 150 may be supported adjacent the forward end of the implement 10 or the aft end of the vehicle 12 such that the field of view 152A of the rearward sensor 152 is directed towards a portion of the field positioned immediately forward of the implement 10 (or immediately behind the vehicle 12) relative to the direction of travel 14. In another embodiment, the sensing assembly 150 may be supported adjacent the forward end of the vehicle 12 such that the field of view 152A of the rearward sensor 152 is directed towards a portion of the field positioned immediately forward of the vehicle 12 relative to the direction of travel 14.
Additionally, in some embodiments, the system 148 may include one or more forward sensors 160 configured to generate data indicative of one or more field conditions associated with a portion of the field prior to such field portions being worked by the implement 10. For instance, the forward sensor(s) 160 may be positioned at any suitable location relative to the implement 10 and/or work vehicle 12 such that a field of view 160A of each forward sensor 160 is directed towards a portion of the field disposed in front of the implement 10 and/or work vehicle 12 relative to the direction of travel 14. For example, the forward sensor(s) 160 may be positioned at a forward end 30 of the implement 10, at a rear end 15 of the work vehicle 12, or at a front end 13 of the work vehicle 12 as shown in
In one embodiment, the forward sensor(s) may have a fixed field of view 160A relative to the portion of the associated implement 10 or work vehicle 12. However, in other embodiments, the forward sensor(s) 160 may be configured to be a part of a sensing assembly, similar to the rearward sensor 152 of the sensing assembly 150 described above, such that the forward sensor(s) 160 may be configured to be actuated back and forth along a sensor movement path relative to the portion of the associated implement 10 or work vehicle 12 by an actuator 162 (
Referring now to
As shown in
The actuator 154 may correspond to any suitable actuation device that is configured to drive the rearward sensor 152 along the linear movement path 164. For instance, in a particular embodiment, the rearward sensor 152 is coupled to the support arm 156 by a rail system 162. One or more of the rails of the rail system 162 may be configured as a fixed rack configured to engage a corresponding pinion gear coupled to the actuator 154. In such an embodiment, the actuator 154 may correspond to a rotary actuator (e.g., an electric motor) configured to rotationally drive the pinion gear to linearly actuate the rearward sensor 152 along the linear movement path 164.
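For a rack-and-pinion drive of the kind described above, the sensor's linear travel is simply the pinion pitch radius multiplied by the shaft rotation. The short sketch below illustrates that relationship; the function name and the 20 mm pitch radius are hypothetical values chosen for illustration.

```python
import math

def rack_travel_m(shaft_revolutions: float, pinion_pitch_radius_m: float = 0.02) -> float:
    """Linear displacement of the sensor along the rail for a given number of
    pinion revolutions (arc length = radius * angle)."""
    return pinion_pitch_radius_m * (2.0 * math.pi * shaft_revolutions)

# Example: five revolutions of a 20 mm pitch-radius pinion moves the sensor ~0.63 m.
print(f"{rack_travel_m(5.0):.2f} m")
```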
It should be appreciated that, in alternative embodiments, the rearward sensor 152 may be coupled to the support arm 156 by any other suitable means that allows the rearward sensor 152 to be actuated along the linear movement path 164. For instance, the rearward sensor 152 may be coupled to the support arm 156 by a track, a parallel linkage assembly, a pivoting arm, and/or the like. Furthermore, it should be appreciated that the actuator 154 may correspond to any suitable actuator that is configured to actuate the rearward sensor 152 along an associated linear movement path 164. For instance, the actuator 154 may be configured as a hydraulic cylinder, a pneumatic cylinder, a belt drive, a screw drive, and/or the like.
As shown in
Referring now to
The data generated by the rearward sensor 152 as the implement 10 is moved across the field may be used to generate a field condition map. As indicated above, in certain embodiments, the rearward sensor 152 generates data indicative of a field condition(s) for only a portion of the field due to its oscillating field of view as the sensor 152 is actuated back and forth along its sensor movement path, such as the first sub-portion(s) P1 of the field shown in
Referring now to
In several embodiments, the system 200 may include a controller 202 and various other components configured to be communicatively coupled to and/or controlled by the controller 202, such as a sensing assembly (e.g., sensing assembly 150) having one or more sensors configured to capture field conditions of a field (e.g., sensor(s) 152,160) and one or more actuators (e.g., actuator(s) 154, 162), a user interface (e.g., user interface 60), various components of the implement 10 and/or the work vehicle 12 (e.g., implement actuator(s) 50A, 54A, 104), and/or various other components of the sensing assembly 150 (e.g., actuator(s) 154, 162). The user interface 60 described herein may include, without limitation, any combination of input and/or output devices that allow an operator to provide operator inputs to the controller 202 and/or that allow the controller 202 to provide feedback to the operator, such as a keyboard, keypad, pointing device, buttons, knobs, touch sensitive screen, mobile device, audio input device, audio output device, and/or the like.
In general, the controller 202 may correspond to any suitable processor-based device(s), such as a computing device or any combination of computing devices. Thus, as shown in
It should be appreciated that the controller 202 may correspond to an existing controller for the implement 10 or the vehicle 12 or may correspond to a separate processing device. For instance, in one embodiment, the controller 202 may form all or part of a separate plug-in module that may be installed in operative association with the implement 10 or the vehicle 12 to allow for the disclosed system and method to be implemented without requiring additional software to be uploaded onto existing control devices of the implement 10 or the vehicle 12.
In several embodiments, the data 208 may be stored in one or more databases. For example, the memory 206 may include a field condition database 212 for storing field condition data received from the sensor(s) 152, 160. For instance, the sensor(s) 152, 160 may be configured to continuously or periodically capture data associated with a portion of the field, such as immediately before and/or after the performance of an agricultural operation within such portion of the field. In such an embodiment, the data transmitted to the controller 202 from the sensor(s) 152, 160 may be stored within the field condition database 212 for subsequent processing and/or analysis. It should be appreciated that, as used herein, the term field condition data 212 may include any suitable type of data received from the sensor(s) 152, 160 that allows for the field conditions of a field to be analyzed, including photographs or other images, RADAR data, LIDAR data, and/or other image-related data (e.g., scan data and/or the like).
It should be appreciated that, in several embodiments, the field condition data 212 may be geo-referenced or may otherwise be stored with corresponding location data associated with the specific location at which such data was collected within the field. In one embodiment, the field condition data 212 may be correlated to a corresponding position within the field based on location data received from one or more positioning devices. For instance, the controller 202 may be communicatively coupled to a positioning device(s) 214, such as a Global Positioning System (GPS) or another similar positioning device, configured to transmit a location corresponding to a position of the sensor(s) 152, 160 within the field when field condition data 212 is collected by the sensor(s) 152, 160.
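As an illustrative assumption of how a geo-referenced sample might be structured within the field condition database 212, the sketch below pairs each measured value with the position reported by the positioning device(s) 214. The field names and layout are hypothetical and do not represent the disclosed schema.

```python
from dataclasses import dataclass

@dataclass
class FieldConditionSample:
    latitude: float           # position reported by the positioning device(s) 214
    longitude: float
    timestamp_s: float        # acquisition time, seconds since epoch
    surface_roughness: float  # example monitored condition value
    residue_coverage: float   # fraction of surface covered by residue (0..1)

# Example: one sample captured as the sensor sweeps the aft portion of the field.
sample = FieldConditionSample(42.001, -91.650, 1.7e9, surface_roughness=2.3, residue_coverage=0.35)
print(sample)
```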
Referring still to
Further, in some embodiments, the instructions 210 stored within the memory 206 of the controller 202 may be executed by the processor(s) 204 to implement an area-of-interest (AOI) module 218. In one embodiment, the AOI module 218 may be configured to automatically analyze the field condition data 212 derived from the sensor(s) 152, 160 to determine an area-of-interest. For instance, the AOI module 218 may compare the data from the sensor(s) 152, 160 to one or more associated thresholds and determine an area-of-interest within the field when the data crosses such threshold(s). For example, the AOI module 218 may monitor the surface roughness, clod size, residue coverage, and/or soil compaction of the field from data received from the sensor(s) 152, 160 and determine an area-of-interest when the surface roughness, clod size, residue coverage, and/or soil compaction exceeds and/or drops below an associated threshold. In other embodiments, the AOI module 218 may similarly monitor the data from the forward sensor(s) 160 to determine an area-of-interest when the data crosses such threshold(s). In further embodiments, the AOI module 218 may monitor data from one or more auxiliary sensors (not shown) indicative of the vibrations or levelness of the implement 10 and/or the moisture content of the field and determine an area-of-interest when the vibrations, levelness, or moisture content exceeds and/or drops below an associated threshold. In additional embodiments, the controller 202 may receive an indication of such area-of-interest from an operator, e.g., via the user interface 60.
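The threshold comparison attributed to the AOI module 218 can be illustrated with the following minimal sketch; the condition names and limit values are hypothetical placeholders, not disclosed parameters.

```python
def is_area_of_interest(conditions: dict, thresholds: dict) -> bool:
    """Return True if any monitored condition exceeds its upper limit or falls
    below its lower limit; each threshold is a (lower, upper) pair."""
    for name, value in conditions.items():
        lower, upper = thresholds.get(name, (float("-inf"), float("inf")))
        if value < lower or value > upper:
            return True
    return False

# Example: excessive clod size flags this location as an area-of-interest.
limits = {"surface_roughness": (0.0, 3.0), "clod_size_mm": (0.0, 50.0)}
print(is_area_of_interest({"surface_roughness": 2.1, "clod_size_mm": 72.0}, limits))  # True
```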
Referring briefly to
Referring back to
Additionally, in some embodiments, the instructions 210 stored within the memory 206 of the controller 202 may be executed by the processor(s) 204 to implement a control module 222. In some embodiments, the control module 222 may be configured to adjust a position of one or more components of the implement 10, the sensing assembly 150, and/or the user interface 60 based on the monitored field conditions. For instance, in some embodiments, the control module 222 may be configured to adjust the downforce acting on components of the implement 10 by one or more of the actuators 50A, 54A, 104 to improve the field surface conditions based on the monitored field conditions and/or performance of the implement 10. In some embodiments, the control module 222 may control the actuation of the actuator 154 to move the sensor 152 such that the field of view 152A of the sensor 152 is directed towards the area-of-interest determined by the AOI module 218 for monitoring the field condition(s) of the area-of-interest. In some embodiments, the control module 222 may be configured to adjust the operation of the implement 10 based on an input from the operator, e.g., via the user interface 60. Additionally or alternatively, in some embodiments, the controller 202 may further be configured to control the operation of the user interface 60 to notify an operator of the field conditions, performance efficiency of the implement 10, and/or the like.
Moreover, as shown in
Referring now to
As shown in
Further, at (404), the method 400 may include monitoring the field condition based at least in part on the data received from the sensor. For example, as described above, the controller 202 may monitor one or more field conditions associated with the portions of the field captured within the field of view of the sensor based on an assessment or analysis of the data received from the sensor 152. For instance, based on the type of sensor being used and/or the type of data being collected, the controller 202 may be configured to monitor the soil roughness within the field, clod sizes, crop residue coverage, soil compaction, and/or the like.
Additionally, at (406), the method 400 may include performing a control action based on the monitored field condition. For instance, as described above, the control action may include automatically controlling one or more components of the implement 10 (e.g., by controlling one or more of the actuators 50A, 54A, 104) to adjust the operation of the implement 10 in a manner that varies the monitored field condition, controlling the operation of the sensor actuator 154 to move the sensor 152 to adjust the field of view 152A of the sensor 152 (e.g., direct the field of view 152A towards an area-of-interest), and/or notifying an operator of the present field conditions.
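Purely for illustration, the receive/monitor/act cycle of method 400 can be sketched as below. The helper callables (read_sensor, estimate_conditions, adjust_implement, notify_operator) and the roughness limit are hypothetical stand-ins for the controller's internal routines and hardware interfaces.

```python
def run_monitoring_cycle(read_sensor, estimate_conditions, adjust_implement,
                         notify_operator, roughness_limit: float = 3.0) -> None:
    data = read_sensor()                        # receive data from the swept sensor
    conditions = estimate_conditions(data)      # (404) monitor the field condition(s)
    if conditions.get("surface_roughness", 0.0) > roughness_limit:
        adjust_implement("increase_downforce")  # (406) perform a control action
        notify_operator("Surface roughness above limit; downforce increased.")

# Example with stub callables standing in for real sensor and actuator interfaces.
run_monitoring_cycle(lambda: b"raw", lambda d: {"surface_roughness": 3.6},
                     lambda cmd: print("actuator:", cmd), print)
```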
Referring now to
As shown in
Further, at (504), the method 500 may include controlling an operation of an actuator to actuate a sensor along a sensor movement path such that a field of view of the sensor is directed towards the area-of-interest. As indicated above, the controller 202 may be configured to control the operation of the actuator 154 to actuate the rearward sensor 152 such that the field of view 152A of the rearward sensor 152 is directed towards the area-of-interest 306, where the rearward sensor 152 generates data indicative of the field conditions within the area-of-interest 306 while the implement 10 continues to move across the field.
Additionally, at (506), the method 500 may include monitoring a field condition associated with the area-of-interest based at least in part on data received from the sensor. As described above, the controller 202 may be configured to monitor the data received from the rearward sensor 152 to determine the field condition(s) associated with the area-of-interest 306.
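One simple, illustrative way to "aim" the swept sensor at a determined area-of-interest is to command a position along the movement path equal to the lateral offset of the area-of-interest, clamped to the actuator's stroke limits. The sketch below makes the simplifying assumption that the field of view tracks directly with the sensor's offset; the names and stroke value are hypothetical.

```python
def aim_at_aoi(aoi_lateral_offset_m: float, stroke_m: float = 1.0) -> float:
    """Return the sensor position command (meters from the path midpoint) that
    centers the field of view over the area-of-interest, limited to the stroke."""
    half = 0.5 * stroke_m
    return max(-half, min(half, aoi_lateral_offset_m))

# Example: an area-of-interest located 0.4 m left of the implement centerline.
print(aim_at_aoi(-0.4))  # -0.4
```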
It is to be understood that, in several embodiments, the steps of the methods 400, 500 are performed by the controller 202 upon loading and executing software code or instructions which are tangibly stored on a tangible computer readable medium, such as on a magnetic medium, e.g., a computer hard drive, an optical medium, e.g., an optical disc, solid-state memory, e.g., flash memory, or other storage media known in the art. Thus, in several embodiments, any of the functionality performed by the controller 202 described herein, such as the methods 400, 500, are implemented in software code or instructions which are tangibly stored on a tangible computer readable medium. The controller 202 loads the software code or instructions via a direct interface with the computer readable medium or via a wired and/or wireless network. Upon loading and executing such software code or instructions by the controller 202, the controller 202 may perform any of the functionality of the controller 202 described herein, including any steps of the methods 400, 500 described herein.
The term “software code” or “code” used herein refers to any instructions or set of instructions that influence the operation of a computer or controller. They may exist in a computer-executable form, such as machine code, which is the set of instructions and data directly executed by a computer's central processing unit or by a controller, a human-understandable form, such as source code, which may be compiled in order to be executed by a computer's central processing unit or by a controller, or an intermediate form, such as object code, which is produced by a compiler. As used herein, the term “software code” or “code” also includes any human-understandable computer instructions or set of instructions, e.g., a script, that may be executed on the fly with the aid of an interpreter executed by a computer's central processing unit or by a controller.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.