This invention relates generally to agricultural vehicles. More particularly it relates to a vehicle with an implement for agricultural purposes and a method for controlling the implement during work on a field.
A number of agricultural operations require that an implement is guided along a nominal path, so that the operation it performs takes place at the desired location. Such a location can be a place where a seed is to be sown (with a seeding or drill machine), where a plant growing in the field is to be fertilized (with a spreader or sprayer), or where weeds are to be hoed (with a cultivator) or sprayed (with a sprayer). Normally, such implements are moved over the field by a vehicle. The vehicle and/or the implement can be provided with a location determining apparatus, such as a global navigation satellite system (GNSS) receiver, or a camera with an image processing system recognizing features in the field, in particular plant rows. The implement can be supported on the vehicle or towed behind it.
In a number of prior art implementations, only the vehicle is actively steered, with the implement towed behind the vehicle by a tongue. Since the vehicle and the implement move on different respective paths, at least when deviating from straight travel, a certain error in the implement path results. It has been proposed to reduce this error by steering the vehicle in a manner such that the implement follows a desired path (WO 2008/005195 A2). It was also proposed to steer the vehicle on a desired path (based on a camera or GNSS receiver) and to control an actuator adapted for a lateral adjustment of the implement with respect to the vehicle (also based on a camera or GNSS receiver) to keep the implement on the desired path (EP 2283719 A2, US 2013/0110358 A1), thus compensating for possible steering errors of the vehicle.
One disadvantage of prior art systems that control an actuator influencing the lateral position (or yaw angle) of the implement with respect to the vehicle in order to keep the implement on a desired path is that any automatic or manual steering action of the vehicle influences the lateral position of the implement. Once the vehicle performs a steering action, the implement also moves laterally, and the control system of the implement will hence, with a certain time delay, sense an error between the actual lateral position of the implement and the nominal path and react with a control signal to the actuator, which moves the implement back to the desired path. This control reaction takes time, and it is possible that the implement deviates far enough from the intended path that it cannot perform the desired agricultural operation over a significant length and/or that plants are damaged or destroyed by the implement. The present disclosure attempts to mitigate these disadvantages.
Various aspects of examples of the present disclosure are set out in the claims.
According to a first aspect of the present disclosure, a system for controlling an implement connected to a vehicle can include steerable ground engaging means, an actuator, and an implement control unit. The implement can be adapted to perform an agricultural operation on a field. The vehicle can include steerable ground engaging means for propelling the vehicle over the field. The actuator can be arranged to control at least one of a yaw angle and a lateral position of the implement with respect to the vehicle. The implement control unit can be programmed to control the actuator based upon a first signal regarding the difference between a sensed lateral position of the implement and a nominal lateral position of the implement, and a second signal regarding a steering movement of the vehicle. The implement control unit can be programmed to control the actuator to minimize the difference between the sensed lateral position and the nominal lateral position and thereby to predictively consider an expected lateral movement of the implement due to a change of the trajectory of the vehicle caused by the steering movement of the vehicle.
The second signal can comprise data regarding at least one of a steering angle of the vehicle, a steering angle change rate of the vehicle, a propelling speed of the vehicle, a geometric parameter of the vehicle, a propelling direction of the vehicle, and a radius of a curve driven by the vehicle.
The implement control unit can be adapted to learn a relation between the second signal and a control signal to the actuator from the first signal and the second signal.
The first signal can be provided by at least one of a positioning system and a camera with an image processing system.
The second signal can be provided by a sensor interacting with a manually actuated steering system of the vehicle or by at least one of an automatic steering system of the vehicle, a positioning system associated with the vehicle and an inertial sensor associated with the vehicle.
According to a second aspect of the present disclosure, a method of controlling an implement connected to a vehicle is provided, wherein the implement performs an agricultural operation on a field, the vehicle has steerable ground engaging means propelling the vehicle over the field, an actuator controls at least one of a yaw angle and a lateral position of the implement with respect to the vehicle, and an implement control unit controls the actuator based upon a first signal regarding the difference between a sensed lateral position of the implement and a nominal lateral position of the implement, and a second signal regarding a steering movement of the vehicle.
The above and other features will become apparent from the following description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures in which:
Like reference numerals are used to indicate like elements throughout the several figures.
At least one example embodiment of the subject matter of this disclosure is understood by referring to
The vehicle 12 is a tractor 18 with a chassis 20 or frame supported on ground engaging means in the form of steerable front wheels 26 and driven rear wheels 28. The vehicle 12 also comprises an operator cab 24 and an engine 61 for driving the rear wheels 28 and optionally the front wheels 26 and a PTO (not shown).
Implement 14 comprises a cross beam 36 supporting a number of row units 22 distributed side by side along the length of the cross beam 36. The row units 22 perform an agricultural operation on the field 10. In the embodiment shown, the row units 22 can be hoes for weeding or spraying devices for fertilizing the plants 16. In the embodiment shown, between each row of plants 16, seen in the forward direction V of the vehicle 12 (which extends in
On the rear of the chassis 20, a three-point hitch 46 with lower links 32 and an upper link 30 is mounted. The links 30, 32 are connected at their rear ends to a transverse support bar 36, which at its ends is connected by longitudinal bars 34 to the cross beam 36 of the implement 14. The links 30 and 32 are pivotally mounted around vertical axes to the chassis 20 and to the transverse support bar 36. An actuator 38 in the form of a hydraulic cylinder is connected with its first end to the chassis 20 and with its second end to one of the lower links 32, and can thus move the transverse support bar 36 and hence the entire implement 14 in a parallelogram-wise manner in a lateral direction (extending horizontally and transversely to the forward direction V). The actuator 38 is controlled by a valve block 50 which is connected to an electronic vehicle control unit 52. The electronic vehicle control unit 52 is adapted to receive a control signal via a bus system 56 (preferably operating according to standard ISO 11783) which transmits control commands from an electronic implement control unit 54 to the vehicle control unit 52. The implement control unit 54 can thus control the lateral position of the implement 14. This is described in more detail in DE 102016212201 A1, the contents of which are incorporated herein by reference. A sensor 86 detects the angle of one of the lower links 32 with respect to the chassis 20 around the vertical axis and thus provides a signal regarding the lateral position of the implement 14 with respect to the chassis 20. It should be mentioned that the sensor 86 could be integrated into the housing of the actuator 38 (cf. EP 1210854 A1). In another embodiment, actuators 38 could be used between the chassis 20 and each lower link 32, with integrated or separate sensors 86, wherein the actuators are double or single acting.
In another possible embodiment, the implement 14 can be connected to the vehicle 12 by a so-called side shift frame, using an actuator for lateral position control of the implement 14, as described for example in EP 2283719 A2 and US 2013/0110358 A1, the contents of which are incorporated herein by reference. It would also be possible to support the implement 14 on wheels and connect it to a hitch of the vehicle 12 by a tongue and to have at least one actuator 38 actively control the angle of the tongue and/or to control the steering angle of the wheels of the implement by the actuator (cf. US 2013/0186657 A1, the contents of which are incorporated herein by reference).
Thus, the lateral position of the implement 14 is controlled by the implement control unit 54 using the actuator 38. The implement 14 is intended to move over the field 10 at a lateral position where the row units 22 are located at their appropriate positions between the rows of plants 16, in order to provide the desired agricultural operation and avoid damage to the plants (or in any nominal position useful to perform an agricultural operation, like seeding, planting, nursing or harvesting the plants). The implement 14 is therefore automatically guided along the rows of plants 16 by the implement control unit 54, based on signals of a first camera 60 with an image processing system 62, an optional second camera 60′ with an image processing system 62′, and an optional receiver 58 for receiving signals of a satellite-based positioning system, like GPS, Glonass, or Galileo. The receiver 58 is mounted on the cross beam 36 of the implement 14. The image processing systems 62 could also be integrated into the implement control unit 54. In another embodiment, the implement control unit 54 could also be incorporated in the vehicle control unit 52.
The cameras 60, 60′ are mounted on the cross beam 36 of the implement 14 and look onto the field 10 in front of the implement. The image processing systems 62, 62′ extract from the images the relative position of the rows of plants 16 with respect to the camera 60, 60′ and compare this position with a pre-stored or programmed nominal (desired) position of the plants. Thus, a first signal is provided to the implement control unit 54 indicating a possible deviation between the actual lateral position and the nominal lateral position of the implement 14. The signals from the image processing systems 62, 62′ can be augmented or replaced by signals from the receiver 58, using a pre-stored map with the location of the plants 16 as reference. Fusing the signals from image processing systems 62, 62′ and receiver 58 can be based on the relative quality of the signals.
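A minimal sketch of the quality-based fusion mentioned above could take the form of inverse-variance weighting of the two lateral-offset estimates, so that the higher-quality signal dominates. The function name, the variance figures, and the weighting scheme itself are illustrative assumptions, not taken from this disclosure.

```python
def fuse_lateral_offsets(camera_offset_m: float, camera_var: float,
                         gnss_offset_m: float, gnss_var: float) -> float:
    """Inverse-variance weighted fusion of two lateral-offset estimates (m).

    Each estimate is weighted by the reciprocal of its error variance, a
    common choice when combining sensors of differing quality.
    """
    w_cam = 1.0 / camera_var
    w_gnss = 1.0 / gnss_var
    return (w_cam * camera_offset_m + w_gnss * gnss_offset_m) / (w_cam + w_gnss)

# A confident camera reading (1 cm std. dev.) outweighs a noisier GNSS
# fix (5 cm std. dev.), so the fused value stays close to the camera's.
fused = fuse_lateral_offsets(0.04, 0.01**2, 0.10, 0.05**2)
```

When one signal degrades, for example the camera in dense weed cover, its variance can be raised so the fusion leans on the other source, which matches the augment-or-replace behavior described above.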
Thus, as mentioned, the implement control unit 54 controls the actuator 38, based on the first signal, such that the implement 14 and its row units 22 move along a nominal path. In the embodiment shown, this nominal path is defined by the position of the plants 16 on the field, and the actuator 38 is controlled by the implement control unit 54 (using appropriate software) based on the signals from cameras 60 and optionally 60′ such that the row units 22 move between the plants 16 (according to the position of the plants as detected by the cameras 60, 60′). The nominal path can alternatively or additionally be pre-stored in a memory of the implement control unit 54 and the actuator 38 controlled based on the stored nominal path. Both options and their combination are generally described in US 2002/0193928 A1, the contents of which are incorporated by reference herein. An embodiment of lateral guidance of the implement 14 based on cameras 60, 60′ and receiver 58 is also described in more detail in EP 2910098 A1, the contents of which are incorporated herein by reference.
The front wheels 26 of the vehicle 12 can be steered manually by an operator of vehicle 12 in a conventional manner, or the vehicle control unit 52 controls a steering actuator 64 influencing the steering angle of the front wheels 26 based upon signals from a receiver 48 for receiving signals of a satellite-based positioning system, like GPS, Glonass, or Galileo, using a pre-stored map with the location of the plants 16 or a path (tramline) to drive over the field 10 as reference. The receiver 48, optionally incorporating an inertial measuring unit (IMU), as described in EP 1475609 A2, is mounted on the roof of cab 24. Alternatively or additionally, the vehicle 12 can be steered based upon a camera (not shown) mounted on the vehicle 12 with an image processing system detecting the rows of plants 16 in front of the vehicle. It should also be mentioned that in case of a track-based vehicle 12, the steering angle thereof could be influenced by speed differences of the tracks on both sides of the vehicle 12, and in case of articulated steering, an actuator would control the steering angle of vehicle 12 by rotating the front and rear parts of the vehicle 12 around a joint.
There may be cases in which the vehicle 12 is not steered on a straight path as shown in
In particular, the central angle 72 in
This lateral error 68 is evaluated by the vehicle control unit 52 and sent via bus 56 to the implement control unit 54, or evaluated by the implement control unit 54 itself, which considers it as a second signal, in addition to the first signal mentioned above, when calculating the control signal to the actuator 38. In this way, the lateral error 68 influences the deviation of the implement 14 from its desired position at least to a lesser extent than without considering the second signal.
In order to take into account the lateral error 68 resulting from the changes in the expected trajectory (path) of the vehicle, a second adding input of the adding/subtraction unit 84 is connected to a first electronic calculation unit 90 representing a self-learning model, which receives input on an estimated steering movement from a second electronic calculation unit 92, also representing a model.
The second electronic calculation 92 unit receives, via bus 56, data regarding geometric and drive parameters of the vehicle 12 from the vehicle control unit 52. These data comprise the distance between the front wheels 26 and the rear wheels 28 of vehicle 12 (wheel base), propelling speed of vehicle 12 (sensed for example on the rear wheels 28 or by a radar sensor interacting with the ground) and the distance between the length of the part of the chassis 20 between the center of the rear axle and the three-point hitch 46, and if available, the length of the three-point hitch 46 in the forward direction V. Further on, the second electronic calculation unit 92 receives data regarding the steering movement of the vehicle 12, for example from receiver 48 including its IMU, if provided (via the vehicle control unit 52 and bus 56), or from a separate steering angle sensor 94 interacting with the front wheels 26 in particular if the vehicle 12 is manually steered and has no receiver 48. The second electronic calculation unit 92 can thus receive data regarding at least one of the steering angle of the vehicle 12, a steering angle change rate of the vehicle 12, a propelling direction V of the vehicle 12, and a radius r of a curve driven by the vehicle 12, as indicated in
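As a rough illustration of how a calculation unit could turn such parameters into a predicted disturbance, the following sketch applies a kinematic single-track ("bicycle") model to estimate the lateral shift of the hitch point from steering angle, speed, wheelbase and the rear-axle-to-hitch distance. The function, its parameter names, the prediction horizon and the sign conventions are assumptions for illustration only, not the disclosure's implementation.

```python
import math

def predict_hitch_lateral_shift(steer_angle_rad: float,
                                speed_mps: float,
                                wheelbase_m: float,
                                rear_axle_to_hitch_m: float,
                                horizon_s: float) -> float:
    """Predicted lateral displacement (m) of the hitch point after horizon_s.

    Single-track model: the rear axle follows a circular arc of radius
    wheelbase / tan(steering angle); the hitch, trailing the rear axle,
    initially swings to the side opposite the turn.
    """
    if abs(steer_angle_rad) < 1e-9:
        return 0.0  # straight travel: no steering-induced shift
    yaw_rate = speed_mps * math.tan(steer_angle_rad) / wheelbase_m   # rad/s
    heading_change = yaw_rate * horizon_s                            # rad
    turn_radius = wheelbase_m / math.tan(steer_angle_rad)            # m
    # Lateral offset of the rear axle after traversing the arc.
    axle_shift = turn_radius * (1.0 - math.cos(heading_change))
    # The hitch point trails the rear axle and swings opposite to the yaw.
    hitch_swing = -rear_axle_to_hitch_m * math.sin(heading_change)
    return axle_shift + hitch_swing
```

Over a short horizon the hitch swing dominates the axle offset, so a left steering input first pushes a rear-mounted implement to the right; this is exactly the transient that the second signal is intended to pre-empt.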
The second electronic calculation unit 92 works as described above regarding
These calculated data are sent to the (optional) first electronic calculation unit 90, which uses a self-learning model to finally provide the second signal to the second adding input of the adding/subtraction unit 84. The self-learning model of the first electronic calculation unit 90 receives the first signal and the signal from the second electronic calculation unit 92 and thus can learn how the system comprising vehicle 12 and implement 14 reacts to certain control signals. It can thus learn during the field operation how the vehicle 12 reacts to steering control signals (this reaction can for example depend on ground conditions like ground moisture and slope), and it can learn geometric parameters (like the length of the three-point hitch 46) which are not necessarily known to the implement control unit 54. To learn the mentioned parameters, it would also be possible to have a first learning step after the implement 14 is mounted on the vehicle 12: before field operation, when the vehicle 12 moves on a suitable test terrain, the first electronic calculation unit 90 can provide certain test signals to the adding/subtraction unit 84 and receive the reaction of the system of vehicle 12 and implement 14 to the test signals (by means of the first signal and/or the second signal) to derive kinematical and/or geometrical parameters of the system for later control on the field 10. In this manner, influences that can hardly be sensed, or only with great effort, like ground (traction) properties, topography, ballasting of the vehicle 12, and the type, size, air pressure, and state of the tires of wheels 26, 28, can be considered for controlling the implement 14 and also optionally for controlling the steering of the vehicle 12. The implement control unit 54 thus sends control signals to the actuator 38 which neither under- nor overshoot.
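One way to picture the self-learning step, without claiming it is the disclosure's algorithm, is a scalar gain between the model-predicted shift and the observed implement deviation, adapted online with a least-mean-squares update; unmodelled effects (ground conditions, tire state, hitch length) are then gradually absorbed into that gain. The class, its names and the learning rate are illustrative assumptions.

```python
class FeedforwardGainLearner:
    """Toy online learner: one gain relating predicted to observed shift."""

    def __init__(self, initial_gain: float = 1.0, learn_rate: float = 0.1):
        self.gain = initial_gain
        self.learn_rate = learn_rate

    def predict(self, model_shift_m: float) -> float:
        """Second-signal candidate: expected implement shift for a model shift."""
        return self.gain * model_shift_m

    def update(self, model_shift_m: float, observed_shift_m: float) -> None:
        """LMS step: nudge the gain toward explaining the observed shift."""
        error = observed_shift_m - self.gain * model_shift_m
        self.gain += self.learn_rate * error * model_shift_m

learner = FeedforwardGainLearner()
# Repeated steering excursions where the true system gain is 1.5:
for _ in range(200):
    learner.update(0.5, 0.75)
```

The same structure supports a dedicated calibration pass on test terrain: known test signals are applied, the responses recorded, and the gain (or a richer parameter vector) fitted before field work begins.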
Based on the mentioned data, the first calculation unit 90 can calculate the expected trajectory of the vehicle 12 and/or a resulting path or position of the implement 14, and based thereon provide the second signal to the adding/subtraction unit 84, so that the implement 14 remains as closely as possible on its desired path over the field 10 despite any steering movement of the vehicle 12, which represents a disturbance to the closed loop control comprising the implement control unit 54, the actuator 38, and the first signal. This disturbance is incorporated in the second signal, which is used to compensate for it.
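The combination of the two signals at the adding/subtraction unit 84 can be sketched as a feedback term on the first signal plus a feedforward term counteracting the predicted disturbance. The function, the proportional feedback law, the gain value and the sign convention are assumptions for illustration; the disclosure does not prescribe a particular control law.

```python
def actuator_command(sensed_lateral_m: float,
                     nominal_lateral_m: float,
                     predicted_shift_m: float,
                     feedback_gain: float = 2.0) -> float:
    """Lateral actuator command (positive = shift implement left).

    Feedback drives the sensed position toward the nominal one; the
    feedforward term pre-empts the predicted steering-induced shift
    instead of waiting for it to appear as an error.
    """
    feedback = feedback_gain * (nominal_lateral_m - sensed_lateral_m)
    feedforward = -predicted_shift_m  # counteract the expected disturbance
    return feedback + feedforward
```

With feedforward, the actuator reacts at the moment of the steering input rather than after the delay described in the background section, which is the stated advantage over pure feedback control.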
In the embodiment of
While the present disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are not restrictive in character, it being understood that illustrative embodiment(s) have been shown and described and that all changes and modifications that come within the spirit of the present disclosure are desired to be protected. Alternative embodiments of the present disclosure may not include all of the features described yet still benefit from at least some of the advantages of such features. Those of ordinary skill in the art may devise their own implementations that incorporate one or more of the features of the present disclosure and fall within the spirit and scope of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
5857539 | Diekhans et al. | Jan 1999 | A |
8577558 | Mitchell | Nov 2013 | B2 |
20020193928 | Beck | Dec 2002 | A1 |
20060282205 | Lange | Dec 2006 | A1 |
20090032273 | Hahn | Feb 2009 | A1 |
20090272551 | Thompson | Nov 2009 | A1 |
20100017075 | Beaujot | Jan 2010 | A1 |
20130110358 | Merx et al. | May 2013 | A1 |
20130186657 | Kormann et al. | Jul 2013 | A1 |
20140324291 | Jones | Oct 2014 | A1 |
20160057921 | Pickett | Mar 2016 | A1 |
20170311534 | Rusciolelli | Nov 2017 | A1 |
20170316692 | Rusciolelli | Nov 2017 | A1 |
20170325443 | Crinklaw | Nov 2017 | A1 |
20180035606 | Burdoucci | Feb 2018 | A1 |
20180092295 | Sugumaran | Apr 2018 | A1 |
20180168094 | Koch | Jun 2018 | A1 |
20180215393 | Miyakubo | Aug 2018 | A1 |
20180242517 | Davis | Aug 2018 | A1 |
20180243771 | Davis | Aug 2018 | A1 |
20180243772 | Davis | Aug 2018 | A1 |
20180243773 | Davis | Aug 2018 | A1 |
20180243774 | Davis | Aug 2018 | A1 |
20180317388 | Gresch | Nov 2018 | A1 |
20180319396 | Foster | Nov 2018 | A1 |
20180321682 | Matsumoto | Nov 2018 | A1 |
20180373259 | Aberle | Dec 2018 | A1 |
20190307070 | Dima | Oct 2019 | A1 |
20190343032 | Stanhope | Nov 2019 | A1 |
20200093053 | Ehlert | Mar 2020 | A1 |
20200236833 | Kremmer | Jul 2020 | A1 |
20200288621 | Kremmer | Sep 2020 | A1 |
20210000006 | Ellaboudy | Jan 2021 | A1 |
20210084820 | Vandike | Mar 2021 | A1 |
20210105995 | Palomares | Apr 2021 | A1 |
20210176916 | Sidon | Jun 2021 | A1 |
20210192754 | Sibley | Jun 2021 | A1 |
20210212249 | Disberger | Jul 2021 | A1 |
Number | Date | Country |
---|---|---|
3010410 | Feb 2019 | CA |
3048851 | Feb 2020 | CA |
102015009889 | Feb 2017 | DE |
102016212201 | Jan 2018 | DE |
102017113726 | Dec 2018 | DE |
1210854 | Jun 2002 | EP |
1475609 | Nov 2004 | EP |
2283719 | Feb 2011 | EP |
2910098 | Aug 2015 | EP |
3170380 | May 2017 | EP |
3364265 | Aug 2018 | EP |
3366130 | Aug 2018 | EP |
3366131 | Aug 2018 | EP |
3366132 | Aug 2018 | EP |
3366133 | Aug 2018 | EP |
3366134 | Aug 2018 | EP |
3685649 | Jul 2020 | EP |
2008005195 | Jan 2008 | WO |
WO2015065174 | May 2015 | WO |
WO-2017201466 | Nov 2017 | WO |
Entry |
---|
European Search Report issued in counterpart application No. 19212738.9 dated Jun. 5, 2020 (10 pages). |
Number | Date | Country
---|---|---
20200236837 A1 | Jul 2020 | US |
Number | Date | Country
---|---|---
62796700 | Jan 2019 | US |