This application claims priority under 35 U.S.C. § 119 to German Patent Application No. DE 102021101219.8 filed Jan. 21, 2021, the entire disclosure of which is hereby incorporated by reference herein.
The invention relates to a system and method for determining a broken grain fraction of a quantity of grains.
This section is intended to introduce various aspects of the art, which may be associated with exemplary embodiments of the present disclosure. This discussion is believed to assist in providing a framework to facilitate a better understanding of particular aspects of the present disclosure. Accordingly, it should be understood that this section should be read in this light, and not necessarily as admissions of prior art.
Combine harvesters (also termed combines) are designed to harvest a variety of grain crops, and can perform reaping, threshing, gathering, and winnowing. Combines, such as the one described in EP2742791B1, may include a camera and grain loss sensors.
The present application is further described in the detailed description which follows, in reference to the noted drawings by way of non-limiting examples of exemplary implementation, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
In one or some embodiments, a system and method are disclosed to determine the broken grain fraction of a quantity of grains.
This may be achieved by a system for determining a broken grain fraction of a quantity of grains comprising at least one camera that is configured to create an image of the quantity of grains, and a computing unit that is configured to evaluate the image. The computing unit may be configured to use artificial intelligence to evaluate the image, such as determining broken grains in the image, and configured to determine, based on the determined broken grains in the image, the broken grain fraction of the quantity of grains in the image.
In one or some embodiments, the camera comprises any optical sensor that emits at least two-dimensional sensor data. In one or some embodiments, the corresponding sensor data are termed an image. A camera may therefore be, for example, a classic camera, a lidar sensor, or both.
In one or some embodiments, artificial intelligence comprises a model for recognizing objects in images. In one or some embodiments, the artificial intelligence may be trained with a plurality of images (e.g., supervised learning using tagged images). In so doing, the broken grains may be manually identified in the plurality of images (e.g., tagged in the plurality of images), and parameters of the model may be determined in training by a mathematical method so that the artificial intelligence can recognize broken grains in images. Various mathematical methods are contemplated. Alternatively, or in addition, whole grains may be recognized by artificial intelligence or by classic image processing, for example by watershed transformation. The broken grain fraction may then be determined in the images from the recognized grains and broken grains. Moreover, it is contemplated to use artificial intelligence for recognizing non-grain objects such as straw. The recognition of non-grain objects may prevent these objects from being recognized as grains in classic image processing.
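As an illustration of the classic image-processing route mentioned above, the following sketch counts grain candidates in a binary image using connected-component labeling. This is a simpler stand-in for the watershed transformation named in the text (a watershed step would additionally split touching grains); the synthetic image and all names are illustrative only.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image: two separated "grains" (illustrative only).
img = np.zeros((12, 12), dtype=bool)
img[1:4, 1:4] = True
img[7:10, 7:10] = True

# Connected-component labeling assigns one integer label per blob,
# so n_objects is the number of separate grain candidates.
labels, n_objects = ndimage.label(img)
print(n_objects)  # two separate blobs -> 2
```

In a real pipeline, the per-label pixel counts from such a labeling could also supply the grain surfaces used for the area-based fraction described below.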
In one or some embodiments, the artificial intelligence comprises a deep neural network which is also termed deep learning. Other architectures for creating artificial intelligence are contemplated.
In one or some embodiments, the broken grain fraction may be indicated as the fraction of broken grains of all recognized whole grains and broken grains in the images. To accomplish this, the whole grains and the broken grains may be counted. The broken grain fraction results from the number of broken grains divided by the sum of whole grains and broken grains. The advantage of this broken grain fraction is that it may be easier to determine.
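The count-based computation described above can be sketched as follows; the function name is illustrative, not taken from the text.

```python
def broken_grain_fraction_count(n_broken: int, n_whole: int) -> float:
    """Count-based broken grain fraction: broken / (broken + whole)."""
    total = n_broken + n_whole
    if total == 0:
        raise ValueError("no grains recognized in the image")
    return n_broken / total

# e.g. 5 broken grains among 95 whole grains -> fraction of 0.05
```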
Since the broken grains are generally smaller than the whole grains, the broken grain fraction may be indicated as a surface fraction. To this end, the surfaces of the whole grains and the broken grains in the images are determined, for example, as the number of pixels in digital images. The broken grain fraction then results from the sum of the surfaces of the broken grains divided by the sum of the surfaces of the whole grains and the broken grains.
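The surface-based variant can be sketched in the same way, with pixel counts standing in for grain surfaces; names and values are illustrative.

```python
def broken_grain_fraction_area(broken_areas, whole_areas):
    """Area-based fraction: total broken-grain pixel area divided by
    the total pixel area of all recognized grains."""
    broken_px = sum(broken_areas)
    total_px = broken_px + sum(whole_areas)
    return broken_px / total_px

# e.g. broken grains of 20 and 30 px among whole grains of 100 and 150 px
# -> 50 / 300
```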
Since the grains are three-dimensional, it may be desirable to output the broken grain fraction as, for example, a volume fraction. When using a camera that generates three-dimensional sensor data as an image, for example a stereo camera or a lidar sensor, these data may be used to determine the volumes of the grains and broken grains. Alternatively, when two-dimensional data are used, the volumes of the broken grains and the whole grains may be approximated from their surfaces. Regardless, the output of the broken grain fraction may comprise any one, any combination, or all of an area fraction, a volume fraction, or a weight fraction. The approximation may depend on the type of grains. The type of grains may be manually specified or determined automatically from the images. Alternatively, the type of grains may be obtained from a farm management system, wherein the type of cultivated plants is saved in the farm management system for the site of use of the system. The broken grain fraction then results as the sum of the volumes of the broken grains divided by the sum of the volumes of the whole grains and the broken grains.
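One possible approximation of grain volume from two-dimensional areas is sketched below. The exponent of 3/2 is an assumption for roughly isometric grains, not prescribed by the text; as noted above, a grain-type-specific relation could be substituted.

```python
def approx_volume(area_px: float, exponent: float = 1.5) -> float:
    # Assumption: for roughly isometric grains, volume scales with
    # area^(3/2); the exponent could be tuned per grain type.
    return area_px ** exponent

def broken_grain_fraction_volume(broken_areas, whole_areas):
    """Volume-based fraction approximated from 2-D pixel areas."""
    v_broken = sum(approx_volume(a) for a in broken_areas)
    v_total = v_broken + sum(approx_volume(a) for a in whole_areas)
    return v_broken / v_total
```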
Assuming a constant density, the broken grain fraction as a volume fraction is identical to the broken grain fraction as a mass fraction; the mass fraction may therefore be output using the same method.
In one or some embodiments, the camera is part of a mobile device, such as a smart phone or a combine. The use of a camera that is part of a mobile device enables flexible image capture. With a smart phone, the user may capture images of whole grains and broken grains at every location at which she/he is located, for example grain samples from a combine. A camera in a combine may be installed in the combine so that images of the harvested grains are automatically captured. For example, grains conveyed by a grain elevator into the grain tank may be automatically photographed.
In one or some embodiments, the computing unit is part of the mobile device (e.g., a smartphone with camera functionality). When the camera and computing unit are part of the same mobile device, the images may be locally evaluated.
In one or some embodiments, the computing unit is at a distance or separated from the mobile device. Beyond mobile devices such as smart phones or combines, frequently greater computing capacities may be made available more conveniently. The image captured by the camera in the mobile device may be, for example, transmitted wirelessly to the computing unit and evaluated there by the computing unit. The computing unit may, for example, be located in a computing center of a service provider. After analysis (e.g., determination of the broken grain fraction), the broken grain fraction and if applicable other evaluation results may then be returned to the mobile device.
In one or some embodiments, the system comprises a learning unit, wherein the learning unit is provided and configured to improve the artificial intelligence with the images. On the one hand, the images are evaluated by the artificial intelligence; on the other hand, the images are used to improve the artificial intelligence. For improvement, the images may be manually annotated (e.g., the broken grains in the image are identified or tagged, and the artificial intelligence may be trained therewith using the identified/tagged images).
In one or some embodiments, the learning unit is part of a computing unit remote from the mobile device. Since training artificial intelligence frequently may require considerable computing capacity, the learning unit, in one or some embodiments, may be part of a computer remote from the mobile device.
In one or some embodiments, the system is configured to output an indication of the broken grain fraction via an output interface, such as a display device. The broken grain fraction may be brought to the awareness of the user (such as the operator of the combine) and/or may be transmitted to other systems for further processing. For example, the broken grain fraction may be transmitted by a smart phone to a combine.
In one or some embodiments, the system includes a combine with at least one work assembly, such as a threshing system. The system may be configured to at least control or regulate one or more aspects (e.g., one or more control aspects) of the work assembly, such as one or more settings of the work assembly, based on the determined broken grain fraction. In one embodiment, the work assembly may be regulated directly by the computing unit. Alternatively, the computing unit may forward the broken grain fraction, or a value derived therefrom, to a regulation unit, which may, in turn, control or regulate the setting of the work assembly.
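A hypothetical regulation rule of the kind described above might look as follows. All thresholds, step sizes, and names are illustrative assumptions; the text does not prescribe a specific control law.

```python
def adjust_drum_speed(current_rpm: float, broken_fraction: float,
                      target: float = 0.02, step_rpm: float = 25.0,
                      min_rpm: float = 400.0) -> float:
    """Hypothetical rule: lower the threshing drum speed stepwise while
    the determined broken grain fraction exceeds a target value."""
    if broken_fraction > target:
        return max(min_rpm, current_rpm - step_rpm)
    return current_rpm
```

Such a rule could run directly in the computing unit or in a separate regulation unit, as described above.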
In one or some embodiments, the system comprises a base, wherein the base is configured to receive the grains, such as in the same orientation. Further, the camera may be positioned and thereby configured to photograph the grains on the base. In one or some embodiments, the base offers a defined background. In this way, the images obtained of the grains on the base make it possible to better recognize the grains and broken grains and therefore to better determine the broken grain fraction. In one or some embodiments, an equal orientation means that the longest axes of the grains are oriented in parallel. To this end, the base may have elevations on which the grains may be oriented.
In one or some embodiments, the system is configured to reduce or to exclude accumulations of grains when evaluating the images. When grains accumulate, some grains may partially cover other grains, which may make it difficult to recognize or identify broken grains. By excluding identified accumulations, only individual layers of grains and broken grains may be evaluated, thereby improving the determination of the broken grain fraction.
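One simple way to exclude accumulations, sketched below, is to discard detected objects whose pixel area exceeds a plausible single-grain size; the threshold is an illustrative assumption and would be calibrated per grain type.

```python
def exclude_accumulations(object_areas_px, max_single_grain_px: int):
    """Keep only detected objects whose pixel area is plausible for a
    single grain; larger objects are treated as accumulations of
    overlapping grains and excluded from the evaluation."""
    return [a for a in object_areas_px if a <= max_single_grain_px]

# e.g. with a 100 px threshold, a 300 px object is excluded as an accumulation
```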
Moreover, the invention relates to a method for determining a broken grain fraction of a quantity of grains, wherein the method is performed using a camera and a computing unit, wherein the camera creates an image of the grains and transmits the image to the computing unit. In turn, the computing unit evaluates the image with artificial intelligence, which determines broken grains in the image. In turn, the computing unit determines, based on the determined broken grains in the image, the broken grain fraction of the quantity of grains in the image.
In one or some embodiments, the image is transmitted by the camera to a learning unit, wherein the learning unit improves the artificial intelligence with the image. The image may be annotated or tagged, and the artificial intelligence is trained by the learning unit with the annotated or tagged image.
In one or some embodiments, based on the broken grain fraction, a work assembly, such as a combine, may be controlled or regulated. Control or regulation may be performed in one of several ways. In one way, control or regulation may be performed entirely automatically without operator intervention. For example, based on the determined broken grain fraction, the operation of the work assembly may be automatically modified (e.g., in order to reduce the broken grain fraction). Alternatively, the determined broken grain fraction and/or the recommended control or regulation (determined by the computing unit) may be output to an operator for the operator to confirm prior to modifying operation of the work assembly.
Referring to the figures, computing unit 17 may comprise any type of computing functionality, such as at least one processor 22 (which may comprise a microprocessor, controller, PLA, or the like) and at least one memory 23. The memory 23 may comprise any type of storage device (e.g., any type of memory). Though the computing unit 17 is depicted with a single processor 22 and a single memory 23 as separate elements, they may be part of a single machine, which includes a microprocessor (or other type of controller) and a memory. Further, the computing unit 17 may include more than one processor 22 and/or more than one memory 23.
The computing unit 17 is merely one example of a computational configuration. Other types of computational configurations are contemplated. For example, all or parts of the implementations may be circuitry that includes a type of controller, including an instruction processor, such as a Central Processing Unit (CPU), microcontroller, or a microprocessor; or as an Application Specific Integrated Circuit (ASIC), Programmable Logic Device (PLD), or Field Programmable Gate Array (FPGA); or as circuitry that includes discrete logic or other circuit components, including analog circuit components, digital circuit components or both; or any combination thereof. The circuitry may include discrete interconnected hardware components or may be combined on a single integrated circuit die, distributed among multiple integrated circuit dies, or implemented in a Multiple Chip Module (MCM) of multiple integrated circuit dies in a common package, as examples.
The computing unit 17, using software and/or hardware, is configured to manifest artificial intelligence KI. For example, the artificial intelligence KI may be stored in memory 23 and loaded into processor 22 and may comprise an application comprising object recognition (e.g., recognizing broken grains). Specifically, in the computing unit 17, the images or series of images may be analyzed by an artificial intelligence KI to recognize the broken grains. Based on the analytical results, the setting parameters for the work assembly(ies) of the combine 1 may be automatically or manually modified or changed by the computing unit 17. In one or some embodiments, the modification or change is configured to modify operation of the work assembly(ies) in order to obtain very uniform quality of the harvested material G in the grain tank 14 (e.g., to obtain quality of the harvested material G in the grain tank 14 within a predetermined deviation).
Harvested material M is picked up or collected with the known means of a cutting unit 2 and an inclined conveyor 3 by the combine 1 and processed with the known work assemblies such as a threshing unit 4 consisting of a pre-accelerator drum 5, a threshing drum 6, a turning drum 7, a threshing concave 8, and a separating device consisting of a shaker 9, a returning area 10 and a cleaning device 11 with a blower 12 in order to obtain the harvested material G. Along the harvested material transport path W, the flow of harvested material S is fed via a grain elevator 13 to the grain tank 14.
In one or some embodiments, the system 15 for determining a broken grain fraction comprises a camera 16 and a computing unit 17 that are connected (e.g., wired and/or wirelessly) to each other by a data line D. In the example of the combine 1, the camera 16 is arranged or positioned in the area of the elevator head of the grain elevator 13. The computing unit 17 is installed in the combine 1. It is also contemplated to use an external computing unit and configure the data line D as a radio path (e.g., a long distance wireless communication path).
The images, or series of images, or the analytical results fed to the computing unit 17 may in turn be forwarded or transmitted by the computing unit 17 to a user interface comprising (or consisting of) a display 18 and an operating device 19 in the driver's cab 20 of the combine 1. There, the images or series of images may, for example, be displayed to a user F of the self-propelling agricultural harvesting machine (e.g., combine 1) so that she/he may execute, for example, a manual input in order to change or optimize the setting parameters of any one, any combination, or all of the work assemblies 4, 5, 6, 7, 8, 9, 10, 11, 12. A change to the setting parameters of any one, any combination, or all of the work assemblies 4, 5, 6, 7, 8, 9, 10, 11, 12 may also be performed automatically by the computing unit 17 (such as by control system 24 discussed below) depending on the default setting in the operating device 19 (e.g., one or more rules may be stored in computing unit 17 in order to determine the change to the setting parameters based on the determined broken grain fraction).
Further, it is intended that the foregoing detailed description be understood as an illustration of selected forms that the invention can take and not as a definition of the invention. It is only the following claims, including all equivalents, that are intended to define the scope of the claimed invention. Further, it should be noted that any aspect of any of the preferred embodiments described herein may be used alone or in combination with one another. Finally, persons skilled in the art will readily recognize that, in a preferred implementation, some or all of the steps in the disclosed method are performed using a computer so that the methodology is computer implemented. In such cases, the resulting model may be downloaded or saved to computer storage.
| Number | Date | Country | Kind |
|---|---|---|---|
| 102021101219.8 | Jan 2021 | DE | national |