This application is the US National Stage for International Application No. PCT/EP2011/052471, filed Feb. 18, 2011, which itself is related to and claims the benefit of Belgian Application No. BE 2010/0103, filed Feb. 21, 2010.
The present invention is related to agricultural harvesting machines, such as combine or forage harvesters, equipped with an unloading apparatus (such as a spout) for filling harvested and processed crop materials into a container travelling alongside the harvester. The invention is related to methods for controlling the filling of such a container on the basis of image data of the container.
In forage harvesters of the above type, control of the spout position and of the position of a pivotable end portion (flap) of the spout on the basis of camera images is known in the art. EP2020174 describes a harvester equipped with an electro-optical device configured to detect characteristic parameters of the spout and/or of the container and the harvester. The electro-optical device may be a camera system capable of obtaining three-dimensional image data of the container filling area, from which such characteristic parameters are derived. Among such parameters are the spatial position and height of the side walls of the container, as well as the filling height of crop already deposited in the container. The methods for processing the image data involve the recognition of patterns and characteristic lines in the image. Such methods, however, require rather complex image recognition algorithms and may lack robustness, accuracy and speed.
It is an object of the invention to provide a method for processing image data of the filling area which allows a more efficient and simplified control of the filling process.
The present invention pertains to a method and harvesting machine as disclosed in the appended claims. The method of the invention provides the advantage that the analysis of the image is based on a limited portion of the image, in particular on a number of strips selected in the image. This allows a fast and efficient analysis.
According to one aspect of the invention there is provided a method for directing a movable unloading apparatus of an agricultural harvesting machine to a container driven adjacent the harvesting machine, the container comprising, as seen from the harvesting machine, near and remote upper borders, the method comprising the steps of:
using a camera on the harvesting machine for capturing images of at least a portion of the container, the camera generating image data containing data on the distance between the camera and the portion of the container;
processing the image data for deriving therefrom data on the relative position between the portion of the container and the unloading apparatus; and
using the relative position data for automatically moving the unloading apparatus relative to the harvesting machine towards a predetermined position relative to the container,
characterised in that
the step of processing the image data comprises selecting one or more strips in an image of said portion of the container and analysing the sequence of distance values along each strip, to derive therefrom the position of the near upper border and/or the remote upper border,
wherein the predetermined relative position is a position wherein the unloading apparatus directs the harvested crop to a position above said near upper border and/or below said remote upper border, respectively.
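The claimed steps form a closed loop: capture image data, derive the relative position, actuate the unloading apparatus. The following minimal sketch illustrates that loop; the spout model, the detected container position and the proportional gain are all illustrative assumptions, not taken from the claims:

```python
class Spout:
    """Illustrative spout model: a single lateral position coordinate."""
    def __init__(self, position=0.0):
        self.position = position

    def move_by(self, delta):
        self.position += delta


def control_step(spout, container_position, target_offset=0.0):
    """One capture -> process -> actuate iteration: move the spout toward
    the detected container position plus a predetermined offset."""
    error = (container_position + target_offset) - spout.position
    spout.move_by(0.5 * error)  # proportional step; the gain 0.5 is an assumption
    return error


# Example: the container is detected 2.0 units to the side of the spout.
spout = Spout()
for _ in range(10):
    control_step(spout, container_position=2.0)
print(round(spout.position, 3))  # → 1.998 (converging on the container)
```

In practice the detected container position would be re-derived from a fresh image on every iteration, so the loop tracks a container that drifts relative to the harvester.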
The analysis of the distance values in each strip may provide data on the level of crop in the container and the available space in the container for further filling without piling up the crop above the borders. The location of borders, levels and/or edges may be derived from the distance data, e.g., from a first or second derivative of the distance sequence.
Advantageously, the image processing may also comprise the analysis of a horizontal strip in the image to derive therefrom the presence of a left or right corner of the container or the presence of a left or right upper border. These data can be used to limit or revert the movement of the unloading apparatus relative to the container.
According to a further aspect of the present invention, there is provided a harvesting machine equipped with an unloading apparatus for unloading crop materials into a container moving alongside the harvesting machine, a 3D camera being mounted in connection to the harvesting machine, said harvesting machine further comprising control means configured to direct the movable unloading apparatus according to the method of the first aspect of the invention. The harvesting machine may comprise a forage harvester provided with a spout or a combine harvester provided with an unloading tube for the grain tank.
Preferred embodiments will now be described with reference to the drawings. The detailed description does not limit the scope of the invention, which is defined only by the appended claims.
The position of the camera with respect to the container 4 is such that in the horizontal direction, the camera is placed to one side of the container (i.e. the horizontal position A of the camera is not located between the transverse horizontal locations B and C of the side walls 5 and 7 of the container). In the vertical direction, the camera preferably is placed higher than the upper borders 6 and 8 of the side walls 5 and 7 of the container. As a consequence, the camera takes images in which the front surface (i.e. the surface facing the harvesting machine) of at least the near side wall 5 and possibly also the remote side wall 7 is visible, as well as the near upper border 6 and possibly the remote upper border 8 of these side walls, as shown in the example in the drawings.
According to the method of the invention, the image 19 is analysed on the basis of a limited number of strips selected in the image.
In such an image, three vertical strips are selected: strips 21 and 22 to the left and right of the spout, and a central strip 23 at the location of the crop stream. Along each strip, the sequence of distance values is extracted and preferably filtered to suppress noise.
As a result, three distance curves are obtained, each representing the measured distance as a function of the vertical position in the image.
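Building such distance curves from a 3D camera image can be sketched as follows; the image dimensions, strip columns and container geometry are synthetic assumptions:

```python
import numpy as np

def strip_distance_curve(depth_image, centre_col, width=5):
    """Reduce a vertical strip of the depth image to one distance per row,
    using the median across the strip width to suppress noise."""
    half = width // 2
    strip = depth_image[:, centre_col - half:centre_col + half + 1]
    return np.median(strip, axis=1)

# Synthetic 100x200 depth image: far background (12 m) with a nearer
# container wall (5 m) occupying rows 40-99.
depth = np.full((100, 200), 12.0)
depth[40:, :] = 5.0

# Three strips: left and right of the (assumed) spout column, plus a central one.
curves = [strip_distance_curve(depth, c) for c in (50, 100, 150)]
print(curves[0][0], curves[0][50])  # → 12.0 5.0 (background above, wall below)
```

Reducing each strip to a one-dimensional curve is what keeps the subsequent analysis fast: all further steps operate on three short sequences instead of the full image.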
The distance values of the central strip 23 and in particular the distance value at the level of the container borders can be used for monitoring the position of the crop stream and adjusting the same with respect to the detected border or borders.
According to the preferred embodiment, the first and/or second derivative of the distance curves is calculated, and the positions of the near upper border 6 and the remote upper border 8 are determined from the extreme values of said derivative, which correspond to abrupt changes in the measured distance.
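Locating the borders from the first derivative can be sketched as follows, on a synthetic curve (all distances and row positions are illustrative):

```python
import numpy as np

def find_border_rows(distance_curve):
    """Return the two rows with the strongest distance jumps (first-derivative
    extrema), interpreted as the remote and near upper borders."""
    d1 = np.diff(distance_curve)
    jumps = np.argsort(np.abs(d1))[-2:]  # indices of the two largest jumps
    return sorted(int(r) for r in jumps)

# Synthetic curve: background 12 m down to row 29, remote wall 8 m down to
# row 69, near wall 5 m below that (the near wall occludes everything lower).
curve = np.concatenate([np.full(30, 12.0), np.full(40, 8.0), np.full(30, 5.0)])
remote_row, near_row = find_border_rows(curve)
print(remote_row, near_row)  # → 29 69 (the rows just before each jump)
```

A second-derivative variant would instead look for extrema of `np.diff(distance_curve, n=2)`, which is more sensitive to the onset of a jump but also to noise, hence the filtering mentioned above.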
According to a further embodiment, in addition to determining the position of one or both upper borders 6 and 8, the method comprises steps to determine the level of crop already deposited in the container. As the crop level rises, the image will contain, in between the near and remote upper borders 6 and 8, a region of crop material having an intersection line with the remote side wall 7. In each strip 21 and 22, the point of intersection between the crop and the remote side wall defines the crop level in said strip. Thus, from the same strips 21 and 22 as described above, to the left and right of the spout, the filtered distance values are analysed, and the vertical position corresponding to the highest filtered distance value in the region between the near and remote upper borders 6 and 8 is determined. This vertical position is then the position corresponding to the crop level. Alternatively, the position between the near and remote upper borders having the maximum value of the first and/or second derivative is selected as the position corresponding to the crop level. The method then further comprises the step of comparing the established crop level on both sides of the spout with the vertical position of the near side wall 5. When one of the crop levels at the left- or right-hand side of the spout reaches a predetermined level relative to the near upper border 6, the spout 1 is moved to a region of the container 4 that is not yet filled or, when the predetermined level is reached on both sides of the spout, the filling is stopped.
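The level detection and the resulting fill decision can be sketched as follows; the curve shape, row geometry and the "full" threshold are illustrative assumptions:

```python
import numpy as np

def crop_level_row(curve, remote_row, near_row):
    """Row with the highest filtered distance value between the two upper
    borders, taken as the crop/remote-wall intersection (the crop level)."""
    segment = curve[remote_row + 1:near_row]
    return remote_row + 1 + int(np.argmax(segment))

def fill_decision(level_left, level_right, full_row):
    """'stop' if the crop level on both sides of the spout has reached the
    predetermined level, 'move' if only one side has, 'fill' otherwise.
    Image rows count downward, so a smaller row index means a higher level."""
    full = [lvl <= full_row for lvl in (level_left, level_right)]
    if all(full):
        return "stop"
    if any(full):
        return "move"
    return "fill"

# Synthetic curve: background to row 29, remote wall receding from 8 m to
# 9 m down to row 49, crop surface rising toward the camera below that,
# near wall (5 m) from row 70.
curve = np.concatenate([np.full(30, 12.0), np.linspace(8.0, 9.0, 20),
                        np.linspace(9.0, 6.0, 20), np.full(30, 5.0)])
level = crop_level_row(curve, remote_row=29, near_row=69)
print(level, fill_decision(level, 60, full_row=35))  # → 49 fill
```

Running `fill_decision` with one side above the threshold returns `'move'` (shift the spout along the container), and with both sides above it returns `'stop'`, matching the two outcomes described above.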
According to a further embodiment, a horizontal strip (30 or 31) is selected in the image, and the distance values along this strip are analysed to detect the presence of a left or right corner of the container, or of a left or right upper border. When such a corner or border is detected, the movement of the unloading apparatus relative to the container can be limited or reverted, so that crop is not directed beyond the ends of the container.
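Scanning such a horizontal strip can be sketched in the same way as the vertical strips: the container ends at the column where near-wall distances give way to the background. The distance threshold and geometry below are illustrative assumptions:

```python
import numpy as np

def container_end_col(row_distances, wall_max=10.0):
    """Return the first column (scanning left to right) where the near-wall
    distance gives way to background, or None if the wall spans the strip."""
    on_wall = row_distances < wall_max
    for col in range(1, len(on_wall)):
        if on_wall[col - 1] and not on_wall[col]:
            return col
    return None

# Synthetic horizontal strip: container wall (5 m) up to column 120,
# open background (15 m) beyond its right corner.
strip = np.concatenate([np.full(120, 5.0), np.full(80, 15.0)])
print(container_end_col(strip))  # → 120 (column of the detected corner)
```

A `None` result means no corner is visible in the strip, i.e. the spout can still be moved further in that direction.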
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2010/0103 | Feb. 21, 2010 | BE | national

PCT Information

Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/EP2011/052471 | Feb. 18, 2011 | WO | 00 | Aug. 15, 2012

PCT Publication

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO 2011/101458 | Aug. 25, 2011 | WO | A
Related U.S. Publication

Number | Date | Country
---|---|---
US 2012/0316737 A1 | Dec. 2012 | US