The present disclosure relates generally to ground engaging tool contact detection systems, and more particularly to a ground engaging tool contact detection system and method for a crawler.
Work vehicles, such as a crawler or motor grader, can be used in construction and maintenance for grading terrain to a flat surface at various angles, slopes, and elevations. When paving a road, for instance, a motor grader can be used to prepare a base foundation to create a wide flat surface to support a layer of asphalt. When automatically controlling a ground engaging tool, it is valuable to know when the tool is in contact with a surface. As such, there is a need in the art for an improved system and method that identifies when the ground engaging tool is in contact with the surface.
According to one embodiment of the present disclosure, a control system for a work vehicle that operates on a surface is disclosed. The control system comprises an optical sensor that is coupled to the work vehicle. The optical sensor is configured to capture image data that includes an implement. A non-transitory computer-readable memory stores operation information. An electronic processor is configured to perform an operation by controllably adjusting a position of the implement relative to the work vehicle. The electronic processor receives image data captured by the optical sensor and applies an artificial neural network to identify whether the implement is in contact with the surface based on the image data from the optical sensor, wherein the artificial neural network is trained to receive the image data as an input and to produce as an output an indication of whether the implement is in contact with the surface. The electronic processor accesses, from the non-transitory computer-readable memory, the operation information corresponding to whether the implement is in contact with the surface, and automatically adjusts an operation of the work vehicle based on the accessed operation information corresponding to whether the implement is in contact with the surface.
According to another embodiment of the present disclosure, a work vehicle that operates on a surface is disclosed. The work vehicle comprises an implement and an optical sensor. The optical sensor is coupled to the work vehicle. The optical sensor is configured to capture image data that includes the implement. A non-transitory computer-readable memory is provided for storing operation information. An electronic processor is provided and is configured to perform an operation by controllably adjusting a position of the implement relative to the work vehicle, receive image data captured by the optical sensor, apply an artificial neural network to identify whether the implement is in contact with the surface based on the image data from the optical sensor, wherein the artificial neural network is trained to receive the image data as input and to produce as the output an indication of whether the implement is in contact with the surface, access, from the non-transitory computer-readable memory, the operation information corresponding to whether the implement is in contact with the surface, and automatically adjust an operation of the work vehicle based on the accessed operation information corresponding to whether the implement is in contact with the surface.
According to another embodiment of the present disclosure, a method is disclosed. The method includes capturing image data with an optical sensor coupled to a work vehicle, wherein the image data includes an implement. The method further includes identifying whether the implement is in contact with a surface by processing the image data with an electronic processor. The method includes accessing, from a non-transitory computer-readable memory, operation information corresponding to whether the implement is in contact with the surface and automatically adjusting an operation of the work vehicle based on the accessed operation information corresponding to whether the implement is in contact with the surface.
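The method steps above (capture, identify, access, adjust) can be sketched as a minimal control pass. This is an illustrative outline only, not the disclosed implementation; the function names, the placeholder classifier, and the operation table are hypothetical.

```python
# Illustrative sketch of the disclosed method; all names and the
# operation table below are hypothetical placeholders.

def identify_contact(image_data):
    """Stand-in for the trained network's contact decision.

    A real embodiment would run a trained artificial neural network
    on the image; this placeholder simply thresholds a mean pixel
    value so the control flow can be demonstrated.
    """
    return sum(image_data) / len(image_data) > 0.5

# Operation information keyed by contact state, standing in for the
# operation information stored in the non-transitory memory.
OPERATION_INFO = {
    True:  {"feedback_gain": 0.4, "mode": "automatic"},
    False: {"feedback_gain": 1.0, "mode": "manual"},
}

def control_step(image_data):
    """One pass: capture -> identify contact -> access info -> adjust."""
    in_contact = identify_contact(image_data)
    # The returned record drives the automatic adjustment of the
    # work vehicle's operation.
    return OPERATION_INFO[in_contact]
```

In practice the classifier, the contents of the operation table, and the adjustment policy would all be specific to the embodiment.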
Other features and aspects will become apparent by consideration of the detailed description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures in which:
Before any embodiments are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Further embodiments of the invention may include any combination of features from one or more dependent claims, and such features may be incorporated, collectively or separately, into any independent claim.
The implement 15 may be positioned at a front of the work vehicle 10 and may be attached to the work vehicle 10 in a number of different manners. In this embodiment, the implement 15 is attached to the work vehicle 10 through a linkage which includes a series of pinned joints, structural members, and hydraulic cylinders. This configuration allows the implement 15 to be moved up 55 and down 60 relative to a surface 65 or ground, rotate around a vertical axis 70 (i.e., an axis normal to the ground), rotate around a longitudinal axis 75 (e.g., a fore-aft axis of the work vehicle 10), and rotate around a lateral axis 80 of the work vehicle 10 (i.e., a left-right axis of the work vehicle 10). These degrees of freedom permit the implement 15 to engage the ground at multiple depths and cutting angles. Alternative embodiments may involve implements 15 with greater degrees of freedom, such as those found on some motor graders 40, and those with fewer degrees of freedom, such as “pushbeam” style blades found on some crawlers 35 and implements 15 which may only be raised, lowered, and rotated around a vertical axis as found on some excavators and skidders.
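The degrees of freedom described above (raise/lower, and rotation about the vertical, longitudinal, and lateral axes) can be represented as a simple pose record. The field names below are illustrative assumptions, not terminology from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ImplementPose:
    """Hypothetical pose record mirroring the implement motions
    described above: lift (up/down relative to the surface), yaw
    (rotation about the vertical axis), roll (rotation about the
    fore-aft axis), and pitch (rotation about the left-right axis).
    """
    lift_m: float = 0.0     # height relative to neutral, meters
    yaw_deg: float = 0.0    # rotation about the vertical axis
    roll_deg: float = 0.0   # rotation about the longitudinal axis
    pitch_deg: float = 0.0  # rotation about the lateral axis
```

An excavator- or skidder-style implement with fewer degrees of freedom would simply leave the unused fields at their defaults.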
The operator may command movement of the implement 15 from the operator station 20, which may be coupled to the machine or located remotely. In the case of the work vehicle 10, those commands may be sent mechanically, hydraulically, and/or electrically to a hydraulic control valve. The hydraulic control valve receives pressurized hydraulic fluid from a hydraulic pump, and selectively sends such pressurized hydraulic fluid to a system of hydraulic cylinders based on the operator's commands. The hydraulic cylinders in the system, which in this case are double-acting, are extended or retracted by the pressurized fluid and thereby actuate the implement 15. Alternatively, electronic actuators may be used.
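The valve's routing of operator commands to a double-acting cylinder can be sketched as a simple mapping. The deadband value and the action names below are illustrative assumptions, not values from the disclosure.

```python
def valve_command(lever: float) -> str:
    """Map an operator lever position in [-1, 1] to a double-acting
    cylinder action.

    Hypothetical sketch: a small deadband around neutral holds the
    cylinder in place; positive commands port fluid to extend, and
    negative commands port fluid to retract.
    """
    DEADBAND = 0.05  # assumed neutral band, illustrative only
    if lever > DEADBAND:
        return "extend"
    if lever < -DEADBAND:
        return "retract"
    return "hold"
```

A proportional valve would additionally scale flow with lever displacement; this sketch shows only the directional routing.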
The control system 90 also has a non-transitory computer-readable memory 115 that stores operation information 120. The non-transitory computer-readable memory 115 may comprise electronic memory, nonvolatile random-access memory, an optical storage device, a magnetic storage device, or another device for storing and accessing electronic data on any recordable, rewritable, or readable electronic, optical, or magnetic storage medium.
An electronic processor 125 is provided and configured to perform an operation by controllably adjusting a position of the implement 15 relative to the work vehicle 10. The electronic processor 125 may be arranged locally as part of the work vehicle 10 or remotely at a remote processing center (not shown). In various embodiments, the electronic processor 125 may comprise a microprocessor, a microcontroller, a central processing unit, a programmable logic array, a programmable logic controller, or other suitable programmable circuitry that is adapted to perform data processing and/or system control operations.
The electronic processor 125 is configured to receive image data 100 captured by the optical sensor 95 and apply an algorithm of an artificial neural network 130 to identify whether the implement 15 is in contact with the surface 65, and/or how far from the surface, based on the image data 100 from the optical sensor 95. The artificial neural network 130 is trained to receive the image data 100 as input and to produce as the output an indication of whether the implement 15 is in contact with the surface 65 and/or how far from the surface. The electronic processor 125 accesses the operation information 120 corresponding to whether the implement 15 is in contact with the surface 65 from the non-transitory computer-readable memory 115 and automatically adjusts an operation of the work vehicle 10 based on the accessed operation information 120. The adjustment may include adjusting a position of the implement 15 relative to the work vehicle 10. The adjustment may include changing a feedback gain 135. The adjustment may include transitioning the control of the work vehicle 10 between a manual control 140 and an automatic control 145. During a snow plowing operation, the adjustment may include turning off a pressure control or adjusting pressure when the implement 15 is above or on the surface 65.
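The inference-and-adjust flow above can be sketched with a toy single-neuron classifier standing in for the trained artificial neural network 130. The weights, threshold, and the specific adjustment values below are illustrative assumptions; a real embodiment would run a full network over the image data 100.

```python
import math

def contact_probability(features, weights, bias):
    """Toy forward pass of a single-neuron (logistic) classifier
    standing in for the trained network; it maps extracted image
    features to a probability that the implement contacts the surface."""
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def adjust_operation(prob, threshold=0.5):
    """Select an illustrative adjustment from the contact state."""
    if prob >= threshold:
        # In contact: e.g. reduce the feedback gain and enable
        # pressure control (values are hypothetical).
        return {"feedback_gain": 0.4, "pressure_control": True}
    # Above the surface: restore gain, disable pressure control.
    return {"feedback_gain": 1.0, "pressure_control": False}
```

The same structure extends to the other adjustments described above, such as transitioning between manual control 140 and automatic control 145.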
Additionally, the electronic processor 125 may predict when the implement 15 may be at or near the surface 65 and preemptively increase the speed of the engine 30. Alternatively, when the implement 15 is above the surface 65, the electronic processor 125 may decrease the speed of the engine 30.
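The preemptive engine-speed behavior described above can be sketched as a simple policy. The RPM values and the proximity threshold are illustrative assumptions, not values from the disclosure.

```python
def engine_speed_rpm(distance_to_surface_m,
                     base_rpm=1500, boost_rpm=2100,
                     near_threshold_m=0.05):
    """Hypothetical policy: preemptively boost engine speed when the
    implement is predicted to be at or near the surface, and settle
    back to a base speed when it is clearly above the surface."""
    if distance_to_surface_m <= near_threshold_m:
        return boost_rpm
    return base_rpm
```

In practice the predicted distance would come from the network's output and the speeds from the stored operation information 120.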
Publication Number: US 2023/0030029 A1, published Feb. 2023 (US).