1. Field of the Present Invention
The present invention relates generally to a system and method for controlling mechanized irrigation machines and, more particularly, to a system and method for steering, adjusting and directing an irrigation machine using a 3D scanner.
2. Background of the Invention
Prior art irrigation machines have used buried wire and RF antennas, furrows, above-ground wires or cables, and/or GPS positioning to define the proper steering paths. The buried wire method is costly to install and is subject to damage by lightning, rodents, and digging equipment; path changes are also difficult to make with buried wire. GPS guidance requires costly equipment, depends on satellite availability and reference locations, and can require continual software updates. In addition to each of these drawbacks, none of the prior art systems can dynamically react to obstacles such as trees, equipment, or personnel.
The purpose of a 3D scanner is usually to create a point cloud of geometric samples on the surface of the subject. These points can then be used to extrapolate the shape of the subject (a process called reconstruction). If color information is collected at each point, then the colors on the surface of the subject can also be determined.
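By way of illustration only, the following minimal Python sketch shows one way such a point cloud of geometric samples might be represented, with an optional color collected at each point; the names ScanPoint, PointCloud, and height_extent are hypothetical and are not part of any particular scanner's interface.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ScanPoint:
    """One geometric sample on the surface of the subject."""
    x: float
    y: float
    z: float
    color: Optional[Tuple[int, int, int]] = None  # per-point RGB, if collected

@dataclass
class PointCloud:
    """Hypothetical container for the samples produced by one scan."""
    points: List[ScanPoint] = field(default_factory=list)

    def height_extent(self) -> Tuple[float, float]:
        """Crude shape information extrapolated from the samples:
        the minimum and maximum height among all points."""
        zs = [p.z for p in self.points]
        return (min(zs), max(zs)) if zs else (0.0, 0.0)
```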
3D scanners share several traits with cameras. Like most cameras, they have a cone-like field of view, and they can only collect information about surfaces that are not obscured. While a camera collects color information about surfaces within its field of view, a 3D scanner collects distance information about surfaces within its field of view. The “picture” produced by a 3D scanner describes the distance to a surface at each point in the picture, which allows the three-dimensional position of each point in the picture to be identified. These devices are used extensively by the entertainment industry in the production of movies and video games. Other common applications of this technology include industrial design, orthotics and prosthetics, reverse engineering and prototyping, quality control/inspection, and documentation of cultural artifacts.
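By way of illustration, the sketch below converts such a distance “picture” into three-dimensional point positions. It assumes a hypothetical scanner that samples its cone-like field of view on a regular angular grid; the function name backproject, the field-of-view defaults, and the use of None for obscured points are illustrative assumptions.

```python
import math

def backproject(depth, fov_h_deg=60.0, fov_v_deg=45.0):
    """Turn a 2D grid of distance readings (meters) into (x, y, z) points.

    Assumes at least a 2x2 grid sampled uniformly in angle across the
    stated horizontal/vertical fields of view. A reading of None marks
    an obscured surface for which no distance was collected.
    """
    rows, cols = len(depth), len(depth[0])
    points = []
    for r in range(rows):
        for c in range(cols):
            d = depth[r][c]
            if d is None:
                continue  # no information behind occlusions
            # Ray direction, as azimuth/elevation from the optical axis.
            az = math.radians((c / (cols - 1) - 0.5) * fov_h_deg)
            el = math.radians((0.5 - r / (rows - 1)) * fov_v_deg)
            points.append((d * math.cos(el) * math.sin(az),   # x: right
                           d * math.sin(el),                  # y: up
                           d * math.cos(el) * math.cos(az)))  # z: forward
    return points

# Example: a 2x2 "picture" with one obscured point.
print(backproject([[5.0, 5.0], [5.0, None]]))
```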
To address the shortcomings presented in the prior art, the present invention provides a system and method for using a 3D scanner to define a path for an irrigation machine to follow using terrain, markings, or other identifiers (natural or manmade). According to a preferred embodiment, identifiers can be programmed into the control computer so that the system can recognize its location and define steering inputs. The 3D sensors can also identify whether a foreign object is present in the path, create notifications, and change machine operating parameters, thus improving the ability of the machine to avoid collisions. Accordingly, the system of the present invention uses 3D sensor input to modify a predefined path and/or other system parameters in response to detected image data.
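A minimal sketch of one such control cycle follows, assuming hypothetical detection records of the form {"kind": ..., "range_m": ...}; the object kinds, distance thresholds, and avoidance offset are placeholders rather than values taken from this specification.

```python
def control_step(planned_heading_deg, detections, base_speed_mps):
    """One control cycle: begin with the predefined path, then let 3D
    detections create notifications and change operating parameters."""
    heading = planned_heading_deg
    speed = base_speed_mps
    notifications = []
    for obj in detections:
        if obj["kind"] in ("person", "equipment") and obj["range_m"] < 5.0:
            speed = 0.0  # halt the machine for personnel or equipment
            notifications.append(f"STOP: {obj['kind']} at {obj['range_m']:.1f} m")
        elif obj["kind"] == "tree" and obj["range_m"] < 10.0:
            heading += 15.0  # simple fixed steering offset to avoid collision
            notifications.append(f"AVOID: tree at {obj['range_m']:.1f} m")
    return heading, speed, notifications
```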
According to a further aspect of the present invention, the system of the present invention may preferably use 3D image input data to determine the alignment of the irrigation machine, stop locations of various drive towers (e.g., bender or end-of-field towers), and/or automated connections (drop spans).
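For example, alignment might be checked as sketched below, assuming tower ground-plane positions have already been recovered from the 3D image data; the helper name span_misalignment_deg and the use of per-span bearing angles are illustrative assumptions only.

```python
import math

def span_misalignment_deg(tower_xy):
    """Given ground-plane tower positions recovered from 3D image data,
    return the largest deviation (degrees) of any span's bearing from
    the first span's bearing; zero means the machine is aligned."""
    bearings = [math.degrees(math.atan2(y1 - y0, x1 - x0))
                for (x0, y0), (x1, y1) in zip(tower_xy, tower_xy[1:])]
    if len(bearings) < 2:
        return 0.0
    return max(abs(b - bearings[0]) for b in bearings[1:])
```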
The accompanying drawings, which are incorporated in and constitute part of the specification, illustrate various embodiments of the present invention and together with the description, serve to explain the principles of the present invention.
For the purposes of promoting an understanding of the principles of the present invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the present invention is hereby intended and such alterations and further modifications in the illustrated devices are contemplated as would normally occur to one skilled in the art.
The terms “program,” “computer program,” “software application,” “module” and the like, as used herein, are defined as a sequence of instructions designed for execution on a computer system. A program, computer program, module or software application may include a subroutine, a function, a procedure, an object implementation, an executable application, an applet, a servlet, source code, object code, a shared library, a dynamic load library and/or other sequence of instructions designed for execution on a computer system. A data storage means, as defined herein, includes many different types of computer readable media that allow a computer to read data therefrom and that maintain the data stored so that the computer is able to read the data again. Such data storage means can include, for example, non-volatile memory such as ROM, Flash memory, battery-backed-up RAM, disk drive memory, CD-ROM, DVD, and other permanent storage media. However, even volatile storage such as RAM, buffers, cache memory, and network circuits is contemplated to serve as such data storage means according to different embodiments of the present invention.
With reference now to
As further shown in
The input data preferably further includes remote data inputs 22, which preferably include data such as internet data and remote input/output data. Such data may preferably include image detection data, machine learning algorithms or the like for identification of acquired images. Such image detection data may preferably include 3-dimensional image data representing objects detected by other 3D image scanners employed by other irrigation systems, or which have been previously detected and stored by the 3D image scanner 12 or any other system. Preferably, the image detection data may be selected based on the physical distance between the irrigation system and other irrigation systems. In this way, locally occurring image data may preferably be fed to the drive system CPU 16 to aid in the detection of common objects detected within the local region in which the 3D image scanner 12 is used. Further, the 3D image data may preferably be selected and updated based on the time of year, common soil types, common crops and/or common irrigation systems being used. The drive system CPU 16 preferably analyzes each piece of image data, identifies obstacles and environmental factors, and directs the vehicle steering and drive systems 24 accordingly, as discussed further below.
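A minimal sketch of such selection logic follows; the record layout (a position, a set of relevant months, and an object signature) and the function name select_detection_data are hypothetical stand-ins for whatever format the remote data inputs 22 actually use.

```python
from math import hypot

def select_detection_data(records, own_xy, max_distance_m, month):
    """Keep only remotely shared image-detection records that are
    physically near this irrigation system and relevant to the
    current time of year."""
    selected = []
    for rec in records:  # e.g. {"xy": (x, y), "months": {5, 6, 7}, "signature": ...}
        dx = rec["xy"][0] - own_xy[0]
        dy = rec["xy"][1] - own_xy[1]
        if hypot(dx, dy) <= max_distance_m and month in rec["months"]:
            selected.append(rec["signature"])
    return selected
```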
With reference now to
As further shown, auxiliary sensor input is received 30 along with remote system input 32 and provided to the drive system data input 28. With the collected data, the drive system CPU preferably then creates a drive system instruction set 34 which is transmitted to the drive system for execution 36.
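A minimal sketch of assembling such an instruction set follows; the DriveInstruction fields, the interlock on the auxiliary sensors, and the remote speed limit are assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class DriveInstruction:
    """One hypothetical entry in the drive system instruction set."""
    steering_deg: float   # commanded steering angle
    speed_percent: float  # drive motor duty cycle
    duration_s: float     # how long to hold this command

def build_instruction_set(scanner_offset_deg, aux_sensors_ok, remote_speed_limit_pct):
    """Combine the three data inputs (scanner, auxiliary sensors, remote
    system) into an instruction set for transmission to the drive system."""
    speed = min(100.0, remote_speed_limit_pct) if aux_sensors_ok else 0.0
    return [DriveInstruction(steering_deg=scanner_offset_deg,
                             speed_percent=speed,
                             duration_s=1.0)]
```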
According to further aspects of the present invention, the data from the 3D image scanner 12 is preferably further used by the system CPU 16, in conjunction with auxiliary sensor input 20 and stored mapping data, to further provide command and control instructions for the irrigation drive systems 24. For example, the system CPU 16 may in a first instance use GPS data and stored mapping data to provide a first irrigation path for the drive system 24 to follow. Thereafter, the CPU 16 may further use input from the 3D image scanner 12 to create adjustments to the GPS-determined irrigation path during irrigation. In this way, data from the 3D image scanner could be used for avoidance of obstacles which are encountered during the traversal of an irrigation path navigated by the system using GPS and stored map data. Further, data from the 3D image scanner 12 could be used by the CPU 16 to correct the path of the machine during subsequent passes more accurately than other methods (e.g., GPS, RTK). Still further, the data from the scanner may preferably be used by the CPU to fully guide the machine along the irrigation path without GPS or other external guidance signals.
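By way of illustration, obstacle avoidance along a GPS-determined path might look like the sketch below, which works in local path coordinates (x along the path, y across it); the corridor width, look-ahead distance, and sidestep amount are illustrative assumptions.

```python
def adjust_waypoint(gps_waypoint, obstacles, corridor_m=2.0,
                    lookahead_m=20.0, sidestep_m=3.0):
    """Shift a GPS-derived waypoint laterally when the 3D scanner reports
    an obstacle inside the machine's travel corridor just ahead."""
    x, y = gps_waypoint
    for ox, oy in obstacles:  # obstacle positions in the same local frame
        if abs(oy - y) < corridor_m and 0.0 < ox - x < lookahead_m:
            # Move to the side of the obstacle away from the current track.
            y = oy - sidestep_m if oy > y else oy + sidestep_m
    return (x, y)
```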
With reference now to
According to a further aspect of the present invention, the disclosed 3D scanner/irrigation system can be used to create a path for the irrigation system to follow. For example, the path may preferably be defined with terrain, markings, or other identifiers (natural or manmade) such as reflectors spaced uniformly along the path, tire tracks, the edge of a neighboring water supply canal, a berm or furrow of soil, or the like. According to a further aspect of the present invention, identifiers can be programmed into the control computer to recognize location and define steering inputs. Further, the 3D sensors may preferably be used to identify whether a foreign object is present in the path, to create notifications, and to change machine operating parameters accordingly.
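One illustrative way to derive steering inputs from such identifiers is sketched below: a least-squares line is fitted through detected marker positions, and the machine steers in proportion to its cross-track distance from that line. The gain and the assumption that the path runs roughly along the x axis are hypothetical.

```python
def steering_from_markers(marker_xy, machine_xy, gain_deg_per_m=8.0):
    """Fit a line y = my + slope * (x - mx) through detected path markers
    (reflectors, tire tracks, a furrow edge) and return a steering angle
    proportional to the machine's signed cross-track error.

    Assumes the markers are spread out along the x axis (sxx > 0)."""
    n = len(marker_xy)
    mx = sum(x for x, _ in marker_xy) / n
    my = sum(y for _, y in marker_xy) / n
    sxx = sum((x - mx) ** 2 for x, _ in marker_xy)
    sxy = sum((x - mx) * (y - my) for x, y in marker_xy)
    slope = sxy / sxx
    x0, y0 = machine_xy
    # Signed perpendicular distance from the machine to the fitted line.
    cross_track = (y0 - my - slope * (x0 - mx)) / (1.0 + slope ** 2) ** 0.5
    return -gain_deg_per_m * cross_track  # steer back toward the line
```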
According to a further preferred embodiment, the present invention preferably uses 3D sensor input to define and trigger dynamic reactions of both the drive system and the irrigation systems of an irrigation machine in response to detected image characteristics. In this way, the path and direction of the system can be modified to avoid obstacles, or the irrigation and/or drive programs can be changed accordingly. Further, the 3D image inputs may also be used to identify crop characteristics such as growth stage. With this crop data, the system may use the growth stage and other crop characteristics to determine adjustments to the operating parameters of the irrigation system, such as changing speeds, adjusting sprinkler height, or modifying irrigation applications. Further, the data input from the 3D scanners may also be used to determine the alignment of the machine, stop locations of drive towers (e.g., bender or end-of-field towers), automated connections (drop spans) and the like.
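For example, crop characteristics derived from the scans might drive operating parameters as in the sketch below; the stage boundaries and the speed, sprinkler-height, and application values are illustrative placeholders, not agronomic recommendations.

```python
def params_from_canopy(mean_canopy_height_m):
    """Map a canopy height estimated from 3D image data to a growth
    stage and corresponding machine operating parameters."""
    if mean_canopy_height_m < 0.15:
        return {"stage": "emergence", "speed_pct": 100,
                "sprinkler_height_m": 1.0, "application_mm": 6}
    if mean_canopy_height_m < 0.60:
        return {"stage": "vegetative", "speed_pct": 80,
                "sprinkler_height_m": 1.5, "application_mm": 10}
    return {"stage": "reproductive", "speed_pct": 60,
            "sprinkler_height_m": 2.2, "application_mm": 15}
```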
While the above descriptions regarding the present invention contain much specificity, these should not be construed as limitations on the scope, but rather as examples. Many other variations are possible. Accordingly, the scope should be determined not by the embodiments illustrated, but by the appended claims and their legal equivalents.
The present application claims priority to U.S. Provisional Application No. 62/458,379 filed Feb. 13, 2017.
Related U.S. Application Data:

Number | Date | Country
---|---|---
62/458,379 | Feb. 13, 2017 | US