The present disclosure generally relates to systems and methods for controlling a position of at least a portion of a header of a harvester and, more specifically, proactive variable control of the position of a reel of a harvester.
Agricultural harvesters, such as, for example, combine harvesters, can include different portions or sections for cutting and processing crops. For example, certain types of combine harvesters include a cutting platform such as an auger platform, draper header, corn header, or row header, among others, that is arranged to be moved in at least a forward direction over a field. A draper header can include a reel that transports crop material cut by a cutting bar of the header to a plurality of draper belts of the draper header. The draper belts can then transport the collected crop material laterally inwardly and rearwardly to a feederhouse of the harvester for processing by the harvester. A corn header can include several crop dividers, each crop divider defining a channel to direct stalks of a crop material, namely corn, to a row unit. The row unit can include a gathering chain, deck plates, and stalk rolls configured to separate corn ears from the stalk. Further, the separated corn ears are moved to a floor of the corn header, where an auger located above the floor can move the corn ears to a location for collection. Additionally, a crop gathering device, such as a reel, can be positioned above the row units to direct lodged crop to the auger bed area for conveyance to the machine.
The present disclosure may comprise one or more of the following features and combinations thereof.
In one embodiment of the present disclosure, a system is provided for adjusting a position of at least a portion of a header of an agricultural vehicle. The system can include at least one optical sensor positioned to obtain captured information of the header, at least one processor, and a memory device coupled to the at least one processor. The memory device can include instructions that when executed by the at least one processor cause the at least one processor to classify one or more features detected in the captured information without use of fiducial markers to identify one or more objects represented in the captured information, the one or more objects comprising at least a portion of the header. The memory device can further include instructions that when executed by the at least one processor cause the at least one processor to determine, using at least the captured information, a position of the portion of the header relative to at least one attribute of an upstream crop material or a ground surface of a field upon which the agricultural vehicle performs an agricultural operation. Additionally, the memory device can further include instructions that when executed by the at least one processor cause the at least one processor to determine, for a location upstream of the agricultural vehicle, a variance in a position of at least the portion of the header relative to a position of the at least one attribute of the upstream crop material or the ground surface.
Additionally, the memory device can further include instructions that when executed by the at least one processor cause the at least one processor to determine, based on the variance, one or more adjusted control settings to adjust the position of the portion of the header relative to the position of the at least one attribute of the upstream crop material or the ground surface, and adjust, based on the one or more adjusted control settings, the position of the portion of the header before, or upon, the header being displaced to a location at which the header encounters the variance in the at least one attribute of the upstream crop material or the ground surface.
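By way of a non-limiting illustration, the variance-based adjustment described above can be sketched as a simple proportional correction subject to actuator travel limits. This is a minimal sketch; the function name, gain, target clearance, and travel limits below are hypothetical assumptions for illustration and are not taken from the disclosure.

```python
def adjusted_reel_height(current_height_m: float,
                         upstream_crop_height_m: float,
                         target_clearance_m: float = 0.15,
                         gain: float = 0.8,
                         min_height_m: float = 0.2,
                         max_height_m: float = 2.5) -> float:
    """Return a new reel-height setting that reduces the variance between
    the reel's current height and the height desired for the upstream
    crop, applying a proportional gain and actuator travel limits.
    All numeric defaults are illustrative placeholders."""
    desired = upstream_crop_height_m + target_clearance_m
    variance = desired - current_height_m            # signed height error
    corrected = current_height_m + gain * variance   # proportional correction
    return max(min_height_m, min(max_height_m, corrected))
```

Such a correction could be computed for a location upstream of the vehicle and applied before, or as, the header reaches that location.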
In another embodiment, a method is provided for adjusting a position of at least a portion of a header of an agricultural vehicle. The method can include capturing, by at least one optical sensor, captured information of the header and a crop material positioned upstream of the agricultural vehicle and classifying one or more features identified in the captured information without use of a fiducial marker to identify one or more objects represented in the captured information, the one or more objects comprising at least a portion of the header. Additionally, using at least the captured information, a position can be determined for the one or more objects relative to a first coordinate system, and the position of at least the portion of the header can be determined relative to a position of at least one attribute of an upstream crop material or a ground surface of a field upon which the agricultural vehicle performs an agricultural operation. A variance in the position of at least the portion of the header relative to the position of the at least one attribute of the upstream crop material or the ground surface can be determined for a location upstream of the agricultural vehicle. Based on the variance, one or more adjusted control settings can be determined to adjust the position of the portion of the header relative to the position of the at least one attribute of the upstream crop material or the ground surface. Additionally, based on the one or more adjusted control settings, the position of the portion of the header can be adjusted before, or upon, the header being displaced to a location at which the header encounters the variance in the at least one attribute of the upstream crop material or the ground surface.
These and other features of the present disclosure will become more apparent from the following description of the illustrative embodiments.
The disclosure contained herein is illustrated by way of example and not by way of limitation in the accompanying figures. For simplicity and clarity of illustration, elements illustrated in the figures are not necessarily drawn to scale. For example, the dimensions of some elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference labels have been repeated among the figures to indicate corresponding or analogous elements.
Corresponding reference numerals are used to indicate corresponding parts throughout the several views.
While the concepts of the present disclosure are susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described herein in detail. It should be understood, however, that there is no intent to limit the concepts of the present disclosure to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives consistent with the present disclosure and the appended claims.
References in the specification to “one embodiment,” “an embodiment,” “an illustrative embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may or may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. Additionally, it should be appreciated that items included in a list in the form of “at least one A, B, and C” can mean (A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C). Similarly, items listed in the form of “at least one of A, B, or C” can mean (A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).
In the drawings, some structural or method features may be shown in specific arrangements and/or orderings. However, it should be appreciated that such specific arrangements and/or orderings may not be required. Rather, in some embodiments, such features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of a structural or method feature in a particular figure is not meant to imply that such feature is required in all embodiments and, in some embodiments, may not be included or may be combined with other features.
A number of features described below may be illustrated in the drawings in phantom. Depiction of certain features in phantom is intended to convey that those features may be hidden or present in one or more embodiments, while not necessarily present in other embodiments. Additionally, in the one or more embodiments in which those features may be present, illustration of the features in phantom is intended to convey that the features may have location(s) and/or position(s) different from the location(s) and/or position(s) shown.
The embodiments of the present disclosure described below are not intended to be exhaustive or to limit the disclosure to the precise forms in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may appreciate and understand the principles and practices of the present disclosure.
Although not shown in
The agricultural vehicle 100 depicted in
The header 104 can also be moveable relative to the ground surface to adjust a vertical height of the header, also referred to as header height. For example, according to certain embodiments, the header 104 is pivotable by an actuator 118 relative to the chassis 102 about an axis which extends horizontally and transversely to the forward direction indicated by arrow 105. In some embodiments, the axis may coincide with a rotational axis of an upper guide roller of the central belt conveyor 116 of the header 104 to be able to modify the height of the header 104 above the ground.
The header 104 is configured to direct cut crop material to a feederhouse 114. Moreover, the header 104 can include one or more conveyors or augers, as well as a combination thereof, to convey crop material, and, more particularly cut crop material, along the header 104 and to the feederhouse 114. For example, the exemplary header 104 shown in at least
In at least certain circumstances, the position of the header 104 or the reel 106 can be adjusted to accommodate variances in either, or both, the elevation of the associated terrain or attributes, including characteristics or properties, of the crop material, such as, for example, the height of the crop or crop posture. Moreover, regardless of the type of header 104 utilized, in at least an effort to improve the efficiency of the operation of the header 104, and the corresponding crop yield, the header 104, or associated components thereof, is/are often positioned at a select position(s) relative to either, or both, the adjacent ground surface or an attribute of the crop material, including, for example, crop height. However, the elevation of the ground surface, or associated crop attribute, as well as soil types, moisture levels, and nutrients, among other characteristics, can vary across a field, which can, if not properly addressed, adversely impact the crop yield gathered via operation of the header. Therefore, referencing
In the illustrated embodiments, one or more of the actuators 130, 132 can be selectively actuated to be extendable and retractable in a manner that can cooperatively vertically displace (as generally indicated in at least
The agricultural vehicle 100 and the secondary device 192 can also each include a communication unit 158, 206 that can accommodate the communication of information, including, for example, a terrain map having ground terrain and coordinate information stored in a database 204 of the secondary device 192, between the agricultural vehicle 100 and the secondary device 192, among other systems, components, devices, equipment, or machinery. The communication units 158, 206 can be configured for either, or both, wired or wireless communications, including, for example, via proprietary and non-proprietary wireless communication protocols. For example, the communication units 158, 206 can be configured to accommodate Wi-Fi, ZigBee, Bluetooth, radio, cellular, or near-field communications, among other communications that use other communication protocols, including, but not limited to, communications over a wireless network 160, such as, for example, internet, cellular, and/or Wi-Fi networks, as well as combinations thereof. According to certain embodiments, the communication units 158, 206 can each comprise a transceiver.
The agricultural vehicle 100 can also include a location system 162, such as, for example, a global navigation satellite system, including, but not limited to, a global positioning system (GPS). The location system 162 can be operated to provide a detailed indication of the location of the agricultural vehicle 100, particularly as the agricultural vehicle 100 traverses across a field. According to certain embodiments, the location system 162 can include a receiver that can receive information from an external source that can indicate the location of the agricultural vehicle 100, including, for example, via location coordinates.
The agricultural vehicle 100 can further include one or more input devices 164, such as, for example, one or more keyboards, keypads, touch screens, mouse, buttons, joysticks, switches, or mobile personal computing devices, such as, for example, mobile phones, smart phones, or tablets, as well as combinations thereof, among other types of input devices 164. Such input devices 164 can be configured to receive information, commands, or instructions inputted by an operator of the agricultural vehicle 100. Similarly, the agricultural vehicle 100 can include one or more output devices 166, including, for example, one or more displays or touchscreens, among other types of output devices. Further, the input device 164 and the output device 166 can either, or both, be local to, and remote from, the agricultural vehicle 100.
The agricultural vehicle 100 or header control system 150 can also include an onboard sensor system 168 that can include one or more optical sensors 170. A variety of different types of optical detection devices can be utilized for the optical sensor 170, including, but not limited to, stereo depth cameras, stereo sensors, RGBD (red, green, blue, depth) cameras, three-dimensional sensors, LIDAR, radar, and three-dimensional cameras, as well as combinations thereof, among other types of sensors. The optical sensor 170 can capture information in one or more images or videos of at least a portion of the header 104, including, for example, the reel 106, among other portions of the header 104, and at least a portion of crop material in front of the header 104. As discussed below, according to such an embodiment, the controller 152 of the agricultural vehicle 100, or other controller associated with the sensor system 168 or optical sensor 170, can be configured to at least assist in the evaluation of such captured information from the optical sensor 170, such as, for example, on a pixel level, or based on a collection or area(s) of pixels, among other bases for evaluation. Such an evaluation can be based, for example, at least in part, on either or both a color or level of light present or not present in an area(s) or pixels in the captured information, as well as associated depth information. As discussed below, such evaluation can involve object classification, which can, for example, be performed on a pixel level. Moreover, according to certain embodiments, such evaluation of the captured information can include associating particular pixels from the image(s) or video(s) with a particular object that is anticipated to be present in the captured information, such as, for example, a central tube 138 of the reel 106, at least some of the reel fingers 140 that extend from the central tube 138, as shown in
The sensor system 168 can further include one or more reel speed sensors 172 that can be associated with the reel 106 or a motor utilized to provide power for rotational displacement of the reel 106, among other associated rotating elements of the reel 106. For example, according to certain embodiments, the reel speed sensor 172 can provide an indication of a rotational speed of the tube 138 of the reel 106 to which fingers 140 of the reel 106 are coupled. The reel speed sensor 172 can be a known type of sensor that senses speed by magnetic or optical targets that rotate with the reel 106, motor, or other rotating element of the reel 106 that are sensed by a stationary sensor. The controller 152 can generate one or more signals to control the operation of an associated actuator, such as, for example, a speed and direction of rotation of a motor of the reel 106, or operation of an associated pump and control valve, to control the speed and direction of rotation of the reel 106. Further, either or both the controller 152 or an operator of the agricultural vehicle 100 can adjust the speed of the reel 106 based on a variety of criteria or control settings, including, for example, based on one or more attributes, including characteristics or properties, of the crop material, including, for example, crop type, crop height, or crop posture, among other attributes, as well as combinations thereof. Moreover, the speed at which the reel is rotating, as indicated by the reel speed sensor 172, can be indicated to the operator via the output device 166, and selectively adjusted by the operator via one or more commands inputted by the operator via the input device 164.
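The conversion from sensed pulses to reel speed described above can be illustrated with a brief sketch. This assumes, hypothetically, a sensor that counts magnetic or optical targets passing a stationary pickup; the function name and parameters are illustrative, not taken from the disclosure.

```python
def reel_rpm(pulse_count: int, targets_per_rev: int, interval_s: float) -> float:
    """Estimate reel speed in revolutions per minute from the number of
    target pulses counted over a sampling interval. Assumes a fixed
    number of evenly spaced targets rotating with the reel."""
    revolutions = pulse_count / targets_per_rev   # revolutions during the interval
    return revolutions / interval_s * 60.0        # convert rev/s to rev/min
```

For example, 40 pulses from 8 targets over a 10-second interval would correspond to 30 RPM, a value that could then be reported via the output device 166 or compared against a commanded reel speed.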
The illustrated sensor system 168 can further include one or more position sensors 174 that are configured to sense the position of the header 104, reel 106, cutter bar 112, or other front-end equipment relative to the frame 110 of the agricultural vehicle 100. For instance, one or more sensors 174 may sense the height of the header 104 above the ground. Further, one or more of the position sensors 174 can provide information indicating a vertical position of the reel 106, or other components of the header 104, relative to the ground or other portions of the header 104. Additionally, or alternatively, one or more position sensors 174 can be utilized to provide information regarding the fore/aft position of the reel 106, including with respect to the header 104 or other portion of the agricultural vehicle 100. Further, according to certain embodiments, a position sensor 174 can provide information indicative of an elevation of the reel 106 with respect to the at least one support element of the header 104, which can assist in determining the position of the reel 106, central tube 138 of the reel 106, or fingers 140 of the reel 106. Moreover, knowledge relating to the position of the central tube 138 and dimensions or geometric configuration of the fingers 140 can be utilized to determine a location of the fingers 140. Such information, as well as information relating to the crop attributes, including crop height, or the relative location of the ground, can be used to determine the extent the fingers 140 extend into the crop material or the proximity of the fingers 140 to the ground surface.
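The geometric determination described above, deriving finger-tip location from the central tube position and finger geometry, and then the extent of crop penetration, can be sketched as follows. This is a simplified planar sketch under hypothetical assumptions (a straight finger at a known angle below horizontal); the names and geometry are illustrative only.

```python
import math

def finger_tip_height(tube_height_m: float,
                      finger_length_m: float,
                      finger_angle_deg: float) -> float:
    """Height of a reel finger tip above the ground, given the central
    tube height and the finger's angle below horizontal. Assumes a
    straight, rigid finger for illustration."""
    drop = finger_length_m * math.sin(math.radians(finger_angle_deg))
    return tube_height_m - drop

def crop_penetration(tip_height_m: float, crop_height_m: float) -> float:
    """Depth to which the finger tip extends into the crop canopy;
    zero when the tip is above the canopy."""
    return max(0.0, crop_height_m - tip_height_m)
```

With a tube 1.5 m above the ground and a 0.5 m finger pointing straight down, the tip sits 1.0 m above the ground; against a 1.2 m canopy, the tip would penetrate roughly 0.2 m into the crop.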
The sensor system 168 can also include one or more orientation sensors 176. According to certain embodiments, at least one orientation sensor 176 can provide an indication of an orientation, including, for example, a tilt angle, of one or more portions of the header 104 relative to other portions of the header 104, agricultural vehicle 100, crop material, or ground. Thus, such information can be used in connection with determining relative positions of the header 104 or components thereof, including the reel 106, as the slope of the terrain changes across a width or length of the header 104, including along a direction that is generally perpendicular to a direction of forward travel of the agricultural vehicle 100.
According to certain embodiments, the sensor system 168 can further include one or more ground speed sensors 178 that can sense a travel speed of the agricultural vehicle 100 over the ground. A variety of different types of sensors can be utilized as the ground speed sensor(s) 178, including, for example, a sensor(s) that senses a rotational speed of ground engagement bodies of the agricultural vehicle 100, such as, for example, one or more wheels, or an associated drive shaft or axle, among other components associated with driving movement of the agricultural vehicle 100 along the field. Additionally, or alternatively, the travel speed of the agricultural vehicle 100 can be based on changes in positional information for the agricultural vehicle 100, or portions thereof, as determined using information provided by the location system 162, including changes in GPS coordinates over an identified time.
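The alternative just mentioned, deriving travel speed from changes in GPS coordinates over a known time, can be sketched with a standard great-circle (haversine) distance between two fixes. The function below is an illustrative sketch, not the disclosed implementation; it ignores altitude change and GPS noise filtering that a fielded system would need.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def ground_speed_mps(lat1: float, lon1: float,
                     lat2: float, lon2: float,
                     dt_s: float) -> float:
    """Estimate ground speed (m/s) from two GPS fixes taken dt_s seconds
    apart, using the haversine great-circle distance."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    distance_m = 2.0 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return distance_m / dt_s
```

A shift of 0.0001 degrees of latitude in one second, for instance, corresponds to roughly 11 m/s, which could be cross-checked against the wheel-speed-based sensor reading.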
One or both of the controllers 152, 194 of the agricultural vehicle 100 or the secondary device 192 can also include a machine learning optimization module 184 that can adjust one or more control settings for the header 104, or components of the header 104, such as, for example, the reel 106. For example, as discussed below, the optimization module 184 can generate control settings that can be used to generate signals to adjust one or more of a height of the header 104, a height of the reel 106, or a fore/aft position of the reel 106, among other header 104 components. Such control settings can be derived, and updated, via use of at least the optimization module 184 based on one or more models, including algorithms, and input information that can include information provided by, or derived from, the sensor system 168, a terrain map, and a feedback module 186, 202, among other input information. According to certain embodiments, such machine learning for either or both the development or refinement of the model(s), including algorithms, used by the optimization module 184 can be performed online at the agricultural vehicle 100, including via training of a neural network 190 of an artificial intelligence (AI) engine 188. Additionally, or alternatively, the optimization module 200 can be located at the secondary device 192, and the secondary device 192 can therefore include the neural network 190 and AI engine 188. Further, the feedback module 186, 202, which can be located at either or both the agricultural vehicle 100 or the secondary device 192, can include a recording of adjustments made by an operator to the control settings communicated by, or from, the optimization module 184, 200, as further discussed below.
The feedback module(s) 186, 202 can include, among other information, information indicating the extent of the adjustment(s), or the resulting setting(s) from such adjustment(s), that an operator manually inputted in connection with the implementation of one or more of the control settings.
At block 602, one or more of the optical sensors 170 can be operated to capture information in the form of one or more at least partially forward-directed images, including, for example, one or more photographs or videos, as well as combinations thereof. The captured information can include at least a portion of the header 104, including, for example, the reel 106, as well as crop material that is positioned ahead of the header 104, which can collectively be referred to herein as objects. Moreover, the crop material captured in the captured information can include not only crop material that is generally directly adjacent to the header 104, but also crop material that is located upstream of the direction of travel of the header 104. Further, according to embodiments in which the optical sensor 170 is one or more stereo cameras, including red, green, and blue (RGB) stereo cameras, the captured information can include colors, shading, and depth or distance information relating to the objects captured in the captured information obtained by the optical sensor(s) 170.
At block 604, the captured information can be analyzed by the controller 152, including, for example, by a machine learning classification algorithm, also referred to herein as a classifier, to associate information from the captured information with one or more particular objects. According to certain embodiments, such object classification can occur at a pixel level. Such classification of objects in the captured information can involve, at least in part, an analysis of the color, brightness/darkness, shading, texture, patterns, depth information, and location of individual pixels or groups of pixels in the captured information. Further, such classification can be aided via use of known relative locations, coordinates, and geometric sizes and shapes of objects that are anticipated to be represented or otherwise captured in the captured information. Additionally, information from the sensor system 168, including information provided by one or more of the reel speed sensors 172, position sensor 174, and orientation sensor 176, as well as lookup table information stored by the memory device 156, can provide actual position/speed feedback information that can be utilized to further assist with classification of objects from the captured information.
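The pixel-level analysis of color, brightness, and depth described at block 604 can be illustrated, in highly simplified form, by a rule-based per-pixel labeler. The class names and thresholds below are hypothetical placeholders standing in for a trained machine learning classifier, which would learn such decision boundaries rather than hard-code them.

```python
def classify_pixel(r: int, g: int, b: int, depth_m: float) -> str:
    """Toy per-pixel classifier using color and depth heuristics.
    All thresholds are illustrative assumptions, not learned values;
    a real classifier would be trained on labeled captured information."""
    # Bright, near-field pixels are assumed to be header/reel structure.
    if depth_m < 3.0 and r > 150 and g > 150 and b > 150:
        return "reel"
    # Green-dominant pixels are assumed to be standing crop material.
    if g > r and g > b:
        return "crop"
    # Everything else defaults to ground/background.
    return "ground"
```

In practice, such per-pixel labels would be aggregated over regions and fused with the position, orientation, and speed feedback described above before any object is declared identified.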
Such classification of the captured information can at least assist in identifying the position and location of the header 104, or certain components of the header 104, including, for example, the central tube 138 of the reel 106. Additionally, captured information obtained by the optical sensor 170 at different times, or by other optical sensors 170, can be compared in connection with identifying a particular object, as well as a position of that object, in the captured information. Such object classification by the classifier can also be used to derive information relating to particular attributes, including characteristics or properties, of the crop material that the header 104 is, or will be, encountering.
The classifier, and, more specifically, the machine learning classification algorithm, can be configured to detect objects within the captured information in the absence of such objects having fiducial markers, or related props, including optical markers, that may otherwise traditionally be utilized or added to items for the purpose of assisting in visual detection or recognition of an object. Such object detection without fiducial markers can also involve determining a position of the detected object within a first coordinate system, such as, for example, a vehicle coordinate system, as discussed below. For example, according to certain embodiments, the classifier can identify the central tube 138 of the reel 106, among other portions of the reel 106 and header 104, without reliance on the presence, or detection, of fiducial markers on the central tube 138 or other portions of the reel 106 or header 104. As discussed below, such markerless detection of at least portions of the reel 106 or header 104, or both, as well as an associated determination of the location of those detected objects, can be utilized in subsequent determinations of the position, including, for example, vertical height, of the reel 106 and the header 104 relative to either or both the upstream crop material (i.e., crop canopies) or the ground, as may be identified by at least a terrain map or sensors, among other sources of information.
For example, such object classification can, as discussed below, be utilized to determine a crop height, crop color, texture, or crop posture, as well as combinations thereof, among other crop material information. Such information, or certain aspects of such information, can also, or alternatively, be provided from other sources. For example, additionally, or alternatively, crop canopy information, including crop height, can be provided by, or derived from, a three-dimensional (3D) point cloud and scrolling map generation, and, moreover, be provided in the form of a 3D point cloud from stereo cameras, among other optical sensors 170, and terrain profiles from a GPS or terrain map. Further, crop posture information can provide an indication of an orientation of the crop relative to the ground, including, for example, whether the crop is standing, or lodged, among other orientations. As discussed below, such crop posture and height information can be utilized, along with other information, by the header control system 150 to determine a position or orientation of the header 104 or one or more components of the header 104, including, for example, the reel 106 or cutting bar 112, among other components, as well as operational settings for such components, including a speed at which the reel 106 is to rotate.
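Deriving a crop height from a 3D point cloud, as mentioned above, can be sketched as taking a high percentile of point elevations above the ground plane, which makes the estimate robust to a few stray returns from insects, dust, or sensor noise. This sketch assumes a pre-established ground elevation; the names and the percentile choice are illustrative assumptions.

```python
def crop_canopy_height(points_z, ground_z: float = 0.0,
                       percentile: float = 0.9) -> float:
    """Estimate canopy height as a high percentile of point elevations
    above the ground plane, so that isolated outlier returns above the
    canopy do not inflate the estimate."""
    heights = sorted(max(0.0, z - ground_z) for z in points_z)
    # Index of the chosen percentile in the sorted elevations.
    idx = min(len(heights) - 1, int(percentile * len(heights)))
    return heights[idx]
```

For instance, nine returns at 0.5 m and one spurious return at 2.0 m yield a 0.5 m canopy estimate at the 80th percentile, whereas a simple maximum would report 2.0 m.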
According to certain embodiments, a version, such as, for example, a software version, of the classifier can be maintained at the agricultural vehicle 100, including, for example, by or at the controller 152. However, the classifier can be machine learnable such that one or more models, including algorithms, of the classifier can be updated and refined offline, such as, for example, at the secondary device 192, using at least data generated during operation of the header 104 or while the agricultural vehicle 100 is performing a harvesting operation, as well as based on operator feedback. Thus, for example, information, including data, collected or derived via the operation of the agricultural vehicle 100, including, for example, by either or both one or more sensors of the sensor system 168 or the classifier at the agricultural vehicle 100, among other information, can be communicated from the agricultural vehicle 100 to the secondary device 192 as feedback information. Such feedback information provided to the classifier at the secondary device 192 can be utilized by the secondary device 192 to evaluate the accuracy of the information being determined by the classifier at least at the agricultural vehicle 100. Moreover, discrepancies between object classifications generated by the classifier at the agricultural vehicle 100 and the feedback information communicated to the secondary device 192 can be used to identify, or further refine, patterns used in connection with the machine learning of the classifier. Further, updates or refinements in the classifier that are developed via machine learning at the secondary device 192 can at least periodically be communicated from the secondary device 192 to the agricultural vehicle 100 via the associated communication units 158, 206 and network 160 in the form of a software update for the classifier at the agricultural vehicle 100.
Using the information provided by the one or more optical sensors 170 at block 602, and the object classifications, including classifications derived via the markerless detection of components of the reel 106 or header 104, from block 604, at block 606 the controller 152 can determine position information for the header 104, or components thereof, relative to a first coordinate system. For example, according to certain embodiments, at block 606, the controller 152, including, via the processor 154 executing one or more instructions contained on the memory device 156, can identify at least one of a position, location, and height of the central tube 138 of the reel 106 relative to a three-dimensional first coordinate system, including, for example, a camera or sensor coordinate system. Further, such positional information can assist in determining the position of the header 104, or components thereof, relative to at least the crop material. Thus, such a markerless approach can accommodate at least detection of features or characteristics of, or relating to, the header 104, including, for example, positional information of at least certain components of the header 104, without reliance on or use of features utilized with traditional marker approaches. Moreover, the markerless approach disclosed herein can identify, or be used to determine, such features or characteristics relating to at least the header 104 without reliance on traditional markers, including, for example, without reliance on detection of characteristics of one or more markers, including the color of such traditional marker(s).
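Determining a position in a camera (first) coordinate system from classified pixels and their associated depth, as described at block 606, is commonly done by back-projecting through a pinhole camera model. The sketch below assumes known camera intrinsics (focal lengths and principal point); the parameter names are conventional computer-vision notation, not terms from the disclosure.

```python
def pixel_to_camera_xyz(u: float, v: float, depth_m: float,
                        fx: float, fy: float,
                        cx: float, cy: float):
    """Back-project an image pixel (u, v) with a measured depth into the
    camera (first) coordinate system using a pinhole model.
    fx, fy: focal lengths in pixels; cx, cy: principal point."""
    x = (u - cx) * depth_m / fx   # lateral offset from the optical axis
    y = (v - cy) * depth_m / fy   # vertical offset from the optical axis
    return (x, y, depth_m)        # z is the measured depth
```

A pixel classified as part of the central tube 138 at the image center with a 2 m depth reading would map to a point 2 m straight ahead of the camera, from which a height relative to the crop or ground can then be derived.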
Additionally, the object classifications from block 604, among other information provided by the captured information from block 602, including colors, brightness, and depth information, can be utilized at block 608 to determine attributes, including characteristics or properties, of the crop material represented in the captured information. As previously discussed, such determined attributes of the crop material can include crop height, crop color, texture, orientation, or crop posture, as well as combinations thereof, among other crop material information that can impact the control settings of the header 104, including, for example, the height of the header 104 or reel 106, the fore/aft position of the reel 106, the speed of the reel 106, or the cutting bar 112 position, among other control settings.
At block 610, a terrain map of at least the field, or portion of the field, upon which the agricultural vehicle 100 is, or will be, performing an agricultural operation can be retrieved or received. For example, in at least certain instances, the terrain map can be stored at the agricultural vehicle 100, including, for example, at the memory device 156. Additionally, or alternatively, the terrain map, or an updated version of the terrain map, can be stored at the secondary device 192, including, for example, at the database 204 of the secondary device 192. Thus, to the extent not available at the agricultural vehicle 100, in certain instances, at block 610, the terrain map can be communicated via the communication units 158, 206 and the associated network 160 from the secondary device 192 to the agricultural vehicle 100.
According to certain embodiments, the terrain map can include a variety of information regarding the characteristics of the field, including topography related information. Thus, for example, the terrain map can include information regarding changes in the elevation of the field, including, for example, locations at which the ground surface may be upwardly or downwardly sloped, as well as locations of depressions and protrusions within the field. Additionally, the terrain map can also provide information regarding internal boundaries and potential obstacles located in the field, including, for example, the location of rocks and fences, among other structures and obstacles. Additionally, the terrain map can provide information regarding the particular locations of the various features within, or of, the field. Such location information from the terrain map can be provided in terms of a second three-dimensional coordinate system that is different than the previously mentioned first coordinate system. For example, while the first coordinate system can be based, for example, on a camera or sensor coordinate system, the second coordinate system used with information provided in the terrain map can utilize a location system coordinate system including, for example, a GPS coordinate system. Thus, at block 612, the controller 152 of the agricultural vehicle 100 can convert location information relating to the captured information from the first coordinate system to the second coordinate system. According to certain embodiments, such a conversion can be aided by use of the location system 162 of the agricultural vehicle 100, as well as one or more reference locations or points that may be reflected in the captured information and that have known, or identifiable, locations along both the first and second coordinate systems.
Thus, conversion of such known reference locations from the first coordinate system to the second coordinate system, together with the relative positions of other objects in the captured information to such reference locations, can assist in converting the locations of those other objects from the first coordinate system to the second coordinate system.
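The reference-point conversion described above can be illustrated with a simplified planar sketch: an offset measured in the sensor frame is rotated by the vehicle heading and added to the known world location of a shared reference point. In practice a full three-dimensional calibration, and typically multiple reference points, would be used; the function below and all of its names and parameters are hypothetical.

```python
import math

def make_transform(ref_cam, ref_world, heading_rad):
    """Build a planar first->second coordinate transform from a single shared
    reference point and a known heading: offsets measured in the sensor frame
    are rotated into the world frame and added to the reference location."""
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)

    def transform(p_cam):
        dx = p_cam[0] - ref_cam[0]
        dy = p_cam[1] - ref_cam[1]
        return (ref_world[0] + dx * cos_h - dy * sin_h,
                ref_world[1] + dx * sin_h + dy * cos_h)

    return transform

# With a heading of zero, the transform reduces to a pure translation:
to_world = make_transform((0.0, 0.0), (100.0, 200.0), 0.0)
print(to_world((3.0, 4.0)))  # (103.0, 204.0)
```

With more than one reference point, the rotation could instead be estimated from the point pairs themselves rather than supplied as a heading input.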
The relative transformation from the first coordinate system to the second coordinate system can be performed, including via operation of the controller 152, in a variety of different manners. For example, according to certain embodiments, such a transformation can be performed via use of machine-type lookup tables, manual calibrations, or via automated calibration that may, or may not, involve receipt of operator inputs. Alternatively, according to other embodiments, such a transformation from the first coordinate system to the second coordinate system can involve performance, by the controller 152, of an active calibration. Such active calibration can include, for example, providing captured information from the optical sensor 170, or an associated recognition system, to the controller 152, which the controller 152 can use to establish thresholds based on the highest and lowest setpoints of the recognition system.
The conversion of information provided by, or derived from, the captured information from at least block 602 to the second coordinate system at block 612 can assist in at least determining current positions of the header 104, or associated components, and crop material relative to at least the adjacent ground. Such information can also be used to determine positions of the header 104, or components thereof, relative to crop material. Additionally, such relative positional information may pertain to current locations or positions of the header 104, or associated components, and either or both adjacent crop material and the ground.
The conversion of captured information from the first coordinate system to the second coordinate system can also be used to derive information regarding the position of the header 104, or components thereof, relative to ground terrain or crop material that is at locations upstream of, or generally remote from, the present location of the header 104 or agricultural vehicle 100. Such a forward-looking approach can be utilized to proactively determine adjustments, if any, that are to be made to the position or operation of the header 104 or associated components in view of identified upstream variances or changes in crop material attributes or ground terrain. Further, such a proactive approach to adjustments of the operation or position of the header 104, or associated components, can be based, at least in part, on efforts to compensate for at least certain latencies of the agricultural vehicle 100. For example, such a proactive approach as to identification of upcoming or upstream variances in ground terrain or crop material attributes, and a determination by the optimization module 184 of the associated adjustments that are to be made to one or more control settings of the header 104, can be made in a manner that allows time not only to identify the upcoming variance and the control setting adjustment(s), but also to allow those adjustments to be implemented or executed. For example, such a proactive approach can be configured such that as the header 104 approaches, or has reached, the identified variances in terrain or crop material attributes, the changes in the control setting(s) have been implemented for the header 104.
For example, such a proactive approach can be implemented at block 618 such that changes relating to the height of the header 104 or reel 106, among other positional adjustments, due to identified variances in the elevation of the ground or the height of crop material can be implemented before, or just as, the header 104 reaches those changes in elevation or crop material height.
Further, such adjustments in the control settings can be based on an outcome from the optimization module 184, and can relate to operational adjustments in addition to the positional adjustments made at block 618. For example, based on one or more outcomes from the optimization module 184, the controller 152 can issue one or more signals to change a location of the cutting bar 112, the speed at which the reel 106 rotates, or the speed at which the conveyors 116, 124 are being rotated by the corresponding driver 182. Additionally, or alternatively, such changes in one or more operational settings can be based on an identification of a predetermined operational setting when a certain condition is determined or detected, including, for example, a height at which the header 104 or reel 106 is to be positioned, or with respect to an identified crop material attribute, including crop material height, orientation, or posture, among other attributes. By being able to proactively make such changes in the settings of the header 104, or components thereof, throughout the duration of the performance of an agricultural operation about a field, potential loss in crop yield that could otherwise be associated with variations in terrain elevation or crop material height may be minimized, if not eliminated.
The distance in front of the header 104 or agricultural vehicle 100 at which crop material or terrain is to be examined, and the size of the area of crop material or terrain that is to be examined, in connection with proactive determinations of whether to adjust one or more control settings associated with the header 104, can be based on a variety of different criteria. For example, in certain instances, the distance in front of the header 104 or agricultural vehicle 100 that is to be examined for variances in terrain or crop material, or both, can be at least partially a function of the speed of travel of the agricultural vehicle 100, which can, for example, be determined via use of the speed sensor 178. Thus, for example, when the agricultural vehicle 100 is traveling at a first speed, the crop material or ground terrain being evaluated for the variances that could necessitate a change in the control settings can be at a first distance away from the header 104 or agricultural vehicle 100. However, when the agricultural vehicle 100 is traveling at a second speed that is faster than the first speed, the crop material or ground terrain being evaluated for the variances that could necessitate a change in the control settings can be at a second distance away from the header 104 or agricultural vehicle 100, the second distance being larger than the first distance. The differences in the first and second distances can be configured to provide sufficient time for identification of the variance(s) in the terrain or crop material attributes, determination of the appropriate adjustment(s) (if any) in the control setting(s), and implementation of such adjustment(s), such as, for example, via operation of one or more of the actuators 180. Accordingly, such an approach can seek to have the position or operation of the header 104, or associated components, adjusted at least by the time the header 104 reaches the changing terrain or crop material attributes, as indicated by block 618.
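The speed-dependent evaluation distance described above can be sketched as the product of ground speed and a total latency budget covering detection, decision, and actuation, scaled by a safety margin, so a faster vehicle looks farther ahead. All timing values below are hypothetical placeholders, not values from the disclosure.

```python
def lookahead_distance(ground_speed_mps, detect_s=0.25, decide_s=0.25,
                       actuate_s=0.5, margin=1.25):
    """Distance ahead of the header to evaluate so that variance detection,
    adjustment determination, and actuator movement all finish in time."""
    return ground_speed_mps * (detect_s + decide_s + actuate_s) * margin

print(lookahead_distance(2.0))  # 2.5 -> first distance at a slower first speed
print(lookahead_distance(4.0))  # 5.0 -> larger second distance at a faster second speed
```

The `actuate_s` term reflects the time the actuators 180 need to physically move the header or reel; in a real system it would likely dominate the budget and be measured per machine rather than assumed.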
Accordingly, using at least relative position information regarding the header 104, components of the header 104, the ground terrain, or the crop material, as well as combinations thereof, together with information regarding the attributes of the crop material, the controller 152, including, for example, the optimization module 184 of the agricultural vehicle 100, can determine at block 614 whether one or more control settings for the header 104, or components thereof, are to be changed. For example, as previously mentioned, terrain information and one or more attributes of the crop material, including, for example, crop height, posture, or orientation, or any combination thereof, as well as the relative position of the header 104, or a component of the header 104, can be used by the controller 152, including, for example, the optimization module 184, to determine if the height of the header 104, or of an associated component, is to be changed. Thus, for example, an identification of an upcoming change in the height of the crop material, a change in the posture of the crop material, or a change in the elevation of the ground, as well as combinations thereof, among other identified approaching variances, can result in the optimization module 184 being used at block 616 to determine one or more adjustments that can adjust, at block 618, the height of the header 104, the height of the cutting bar 112, or the height, fore/aft position, or speed of the reel 106, among other adjustments.
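The block 614 determination can be illustrated with a minimal deadband rule: when the upcoming terrain elevation deviates from the current elevation by more than a tolerance, a new header height is proposed; otherwise no change is needed. The deadband value and function name are hypothetical; the disclosure does not specify the decision criterion.

```python
def plan_height_adjustment(current_header_height, upcoming_ground_elev,
                           current_ground_elev, deadband=0.05):
    """Decision sketch for block 614: if the upcoming ground elevation deviates
    from the current elevation by more than a deadband, return an adjusted
    header height (blocks 616/618); otherwise return None (no change needed)."""
    variance = upcoming_ground_elev - current_ground_elev
    if abs(variance) <= deadband:
        return None
    return current_header_height + variance

print(plan_height_adjustment(0.5, 0.25, 0.0))     # 0.75 -> raise the header
print(plan_height_adjustment(0.5, 0.03125, 0.0))  # None -> within the deadband
```

A fuller implementation would weigh multiple variances (crop height, posture, elevation) jointly, as the optimization module is described as doing, rather than a single elevation term.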
Therefore, if a determination is made by the controller 152 at block 614 that one or more control settings are to be adjusted, such adjustments to the header 104, or associated components, as determined by the optimization module 184, can be automatically implemented at blocks 618 and 620 by one or more signals generated by the controller 152. However, according to certain embodiments, the operator can adjust, or override, such changes before, during, or after the controller 152 has communicated one or more signals corresponding to such adjustment(s) to one or more of the actuators 180. For example, as indicated in
To the extent a determination is made at block 614 that no adjustments are to be made, or that such adjustments are made at blocks 618, 620, or 624, the method 600 can proceed to block 626, where the controller 152 can determine whether the agricultural operation being performed using the agricultural vehicle 100 or header 104 has concluded. If the agricultural operation is determined at block 626 to have concluded, then at block 628 the method 600 can end. Conversely, if the agricultural operation has not yet concluded, the method 600 can return to block 602, wherein captured information of crop material that is upstream of the agricultural vehicle 100 or header 104 can again be captured. The method 600 can continue to be repeated throughout the course of the agricultural operation such that the header 104, or associated components, can continuously be proactively adjusted with respect to at least one of position or operation of the header 104, or associated components, including, for example, the reel 106, to accommodate for variations in the terrain of the field or the attributes of the crop material located therein.
In the exemplary embodiment of the method 700 depicted in
Additionally, similar to at least block 624 of
At block 710, the operator can use the input device 164 to adjust a sensitivity of the header control system 150. For example, as previously discussed, the distance ahead of the header 104 or agricultural vehicle 100 at which the header control system 150 is evaluating the terrain or crop material attributes for changes or variations that may result in a change in position or operation of the header 104, or of an associated component of the header 104, can be a function of the speed at which the agricultural vehicle 100 is traveling. According to certain embodiments, different ranges of speed can be associated with different zones or areas of crop material or terrain upstream of the direction of travel of the agricultural vehicle 100. With such an embodiment, each zone or area of crop material or terrain upstream of the agricultural vehicle 100 can correspond to a different distance away from the agricultural vehicle 100, as well as a different speed or range of speeds of travel of the agricultural vehicle 100. Thus, for example, a first zone that is positioned generally adjacent to the header 104 or agricultural vehicle 100 can correspond to a first range of speeds, while a second zone that is positioned further away from the header 104 or agricultural vehicle 100 can correspond to a second, higher, range of speeds. Thus, when the agricultural vehicle 100 is traveling at a speed that is within the second range of speeds, the controller 152 will examine information regarding the terrain and crop material attributes of the second zone for variances that may necessitate a change in the position or operation of the header 104 or associated components of the header 104.
According to such an embodiment, at block 710, an operator can adjust the sensitivity of the header control system 150, at least in part, by adjusting one or more of: the range of speeds associated with one or more of the zones; the distance one or more zones are away from the agricultural vehicle 100 or header 104; or a size, such as, for example, a length, of one or more of the zone(s) in a direction that is generally parallel to the direction of travel of the agricultural vehicle 100. Such adjustments can impact the time the header control system 150 has to identify one or more variances in either or both the terrain of the ground or crop material attributes, determine an appropriate adjustment to a control setting(s), and implement that adjusted setting before the agricultural vehicle 100 or header 104 reaches ground or crop material having the identified variance. Moreover, such adjustments to the sensitivity of the header control system 150 can adjust the time allotted for delays in at least the identification and implementation of adjustments that can be attributed to inherent system latencies of the header control system 150.
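The zone-based sensitivity scheme above can be sketched as a lookup from the measured travel speed to an upstream evaluation window. The speed ranges, zone distances, and zone lengths below are illustrative placeholders that an operator-adjustable system would expose as the sensitivity settings described at block 710.

```python
ZONES = [
    # (min_speed_mps, max_speed_mps, zone_start_m, zone_length_m) -- illustrative values
    (0.0, 2.0, 2.0, 3.0),    # first zone, generally adjacent to the header
    (2.0, 4.0, 5.0, 5.0),    # second zone, further upstream, for higher speeds
    (4.0, 8.0, 10.0, 8.0),   # third zone, farthest upstream
]

def zone_for_speed(speed_mps, zones=ZONES):
    """Return the (start, end) distances of the upstream zone to evaluate,
    given the current travel speed, or None if the speed is out of range."""
    for lo, hi, start, length in zones:
        if lo <= speed_mps < hi:
            return (start, start + length)
    return None

print(zone_for_speed(3.0))  # (5.0, 10.0) -> evaluate the second zone
```

Adjusting the sensitivity at block 710 would then amount to editing the tuples in this table: widening a speed range, pushing a zone farther out, or lengthening it along the direction of travel.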
The changes identified at block 708 and block 710 can be recorded, such as, for example, via the feedback modules 186, 202, and, to the extent not yet communicated, communicated to the secondary device 192. Further, while blocks 702, 704, 706, 708, and 710 have been discussed with respect to operation of a particular, local agricultural vehicle 100, similar information can also be recorded for a plurality of other agricultural vehicles 100 that may be located at other locations, including agricultural vehicles 100 that can be owned or operated by different operators. For example, information similar to that discussed above with respect to blocks 702, 704, 706, 708, and 710 can be obtained at block 714 for a plurality of other agricultural vehicles 100 using similar header control systems 150 or optimization modules 184, and communicated to the secondary device 192, such as, for example, via use of the network 160. Such acquiring of data by the secondary device 192 from a plurality of agricultural vehicles 100 can be utilized to analyze the accuracy of the optimization modules 184, 200, and moreover, utilized in connection with the machine learning of the optimization module 200 at the secondary device 192. Thus, for example, variations between the control settings generated by the optimization modules 184 of the associated agricultural vehicles 100 and the corresponding adjustments made, if any, by operators of those agricultural vehicles 100 to those control settings, as well as other information that may be provided by the associated sensor systems 168, can be utilized by a neural network 190 of an AI engine 188 at the secondary device 192 to identify patterns or discrepancies in the outcomes provided by the model(s) or algorithm(s) of the optimization modules 184 of those agricultural vehicles 100. With such an offline embodiment, as depicted in
Additionally, the offline machine learning for the optimization module 200 that occurs at block 718 can also utilize a variety of additional information in addition to, or in lieu of, the information illustrated by blocks 712 and 714. For example, according to the illustrated embodiment, in addition to the information provided at blocks 712 and 714, other header setting data 716 can also be provided for the machine learning that can occur at block 718. For example, such other settings data related to the header 104 can relate to preselected operator settings that may be set based on user preferences. Such preset settings can correspond to one or more crop material cut heights preselected by an operator, each cut height corresponding to a presence or identification by the controller 152 of a different, or particular, crop material attribute.
Accordingly, the offline machine learning performed at block 718 for the model(s), including algorithm(s), of the optimization module 200 using at least the inputted information from one or more of blocks 712, 714, and 716, can result in the generation of updated control settings for the optimization module 200, as indicated at block 720. The updated control settings can be recorded by the secondary device 192, such as, for example, by either or both the optimization module 200 and the database 204. Additionally, at block 722, the updated control settings, and, moreover, the updated model(s), including algorithm(s), of the optimization module 200, can be communicated via the communication units 158, 206 and network 160 to the controller 152 of the agricultural vehicle 100. For example, the updated control settings can be provided as a software update for the optimization module 184 of the controller 152 of the agricultural vehicle 100.
The method 700 depicted in
In view of the foregoing, blocks 802 through 822 shown in
More specifically, the controller 152 can determine at block 822 whether a signal, such as, for example, an override signal or other manual adjustment, has been received from the operator via operation of an input device 164 that changes or adjusts a control setting(s) that was/were generated by the optimization module 184 of the agricultural vehicle 100 at block 816, or an operational setting that was provided at block 820, such as, for example, a speed of rotation for the reel 106. If the controller 152 determines at block 822 that such a signal(s) has been received from the input device 164, then at block 824 the controller 152 can record such adjustment(s), including, for example, via the feedback module 186. Such recorded information can include an identification of not only the adjustment(s) being made by the operator, but also the extent of the adjustment(s) or the setting(s) resulting from the adjustment(s), among other information. Further, such adjustments by the operator can also include adjustments with respect to the sensitivity of the header control system 150. For example, the feedback module 186 can record operator adjustments that may impact the amount of time allotted to account for inherent system latencies of the header control system 150 in making adjustments in response to determined variations in the terrain or crop material attributes, as previously discussed.
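The override recording at block 824 can be illustrated with a minimal log that keeps the suggested setting, the operator's value, and a derived per-setting bias that machine learning could later consume. The class and method names are hypothetical stand-ins for the feedback module's behavior, not an API from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class OverrideRecord:
    setting: str          # which control setting was overridden
    suggested: float      # value generated by the optimization module
    operator_value: float # value the operator actually applied

class FeedbackLog:
    """Illustrative stand-in for the feedback recording described at block 824."""

    def __init__(self):
        self.records = []

    def record(self, setting, suggested, operator_value):
        self.records.append(OverrideRecord(setting, suggested, operator_value))

    def mean_bias(self, setting):
        """Average amount by which the operator deviated from suggestions."""
        deltas = [r.operator_value - r.suggested
                  for r in self.records if r.setting == setting]
        return sum(deltas) / len(deltas) if deltas else 0.0

log = FeedbackLog()
log.record("reel_speed", 40.0, 45.0)
log.record("reel_speed", 40.0, 43.0)
print(log.mean_bias("reel_speed"))  # 4.0 -> operator consistently runs the reel faster
```

A consistent nonzero bias of this kind is exactly the sort of discrepancy the machine learning at block 826 could use to refine the optimization module's outputs.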
At block 826, the recorded adjustments or changes in the sensitivity of the header control system 150 can be utilized in connection with the machine learning for the optimization module 184 at the agricultural vehicle 100. Thus, similar to the blocks 718 and 720 of the method 700 depicted in
While the disclosure has been illustrated and described in detail in the foregoing drawings and description, the same is to be considered as exemplary and not restrictive in character, it being understood that only illustrative embodiments thereof have been shown and described and that all changes and modifications that come within the spirit of the disclosure are desired to be protected.