N/A
The present disclosure relates to a control system for a work machine having an attachment, wherein the attachment is movably coupled to the work machine.
The present disclosure relates to a control system and method for facilitating the efficient operation of a work machine during loading operations. Loading operations generally include loading, carrying, and unloading a pile. A pile may include material such as dirt, sand, quarry rock, or prefabricated man-made materials. Optimizing operation of the subsystems of a work machine is contingent upon the operator's effectiveness and experience in engaging a pile. For example, if the work machine is moving in a fuel economy mode and suddenly engages with a pile, the machine may stall because the engine and transmission may not react quickly enough to overcome the sudden increase in load. Conversely, if the operator overcompensates for an anticipated load through manual input, this can lead to excessive fuel consumption and increased tire wear.
Accordingly, the present disclosure includes a system for optimizing the loading parameters of a work machine with a sensor-augmented guidance system to address inefficiencies of the machine when engaging with a pile. The work machine, extending in a fore-aft direction, has a frame configured to support an engine, a transmission, a hydraulic cylinder, an engine speed sensor, and an attachment movably coupled to the work machine to engage a pile.
According to an aspect of the present disclosure, the sensor-augmented guidance system for optimizing the loading parameters comprises a sensor coupled with the work machine, a sensor processing unit, and a vehicle control unit. The sensor may be facing in a forward direction. The sensor may be configured to collect image data of the pile in a field of view of the sensor.
A sensor processing unit may be communicatively coupled with the sensor. The sensor processing unit may be configured to receive the image data from the sensor, wherein the sensor processing unit is configured to calculate a volume estimation of the pile based on the image data.
A vehicle control unit may be communicatively coupled with the sensor processing unit. The vehicle control unit can be configured to modify a loading parameter of the work machine in response to a predictive load of the pile.
The vehicle control unit may have a memory unit and a data processing unit.
The memory unit can retrieve a material property from a stored database based on either the image data or an operator input.
The data processing unit, which may be in communication with the memory unit, is configured to calculate the predictive load of the pile based on the volume estimation and the material property.
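By way of non-limiting illustration, the calculation performed by the data processing unit may be sketched as follows; the function names, material names, and density values below are assumptions chosen for illustration only and form no part of the disclosure.

```python
# Illustrative sketch: predictive load = volume estimation x material property.
# Here the material property is assumed to be a bulk density; the values are
# placeholders, not data from the disclosure.

MATERIAL_DENSITY_KG_PER_M3 = {
    "sand": 1600.0,
    "dirt": 1300.0,
    "gravel": 1800.0,
    "quarry_rock": 2500.0,
}

def predictive_load_kg(volume_estimation_m3: float, material: str) -> float:
    """Combine the volume estimation with a stored material property
    (assumed here to be bulk density) to anticipate the attachment load."""
    density = MATERIAL_DENSITY_KG_PER_M3[material]
    return volume_estimation_m3 * density
```

For example, a 2 m³ volume estimation of sand would yield an anticipated load of 3,200 kg under these assumed densities.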
The sensor may be either a stereoscopic vision device or a laser distance device.
The sensor processing unit may comprise a distance-calculating unit and an image processing unit. The distance-calculating unit may calculate the spatial offset of the pile from the sensor. The image processing unit may be in communication with the sensor and the distance-calculating unit. The image processing unit may calculate the volume estimation of the pile based on the image data and the spatial offset.
A loading parameter can be an engine speed, a transmission ratio, a hydraulic flow rate, a hydraulic pressure, a rimpull ratio, or a valve position.
In one instance, the vehicle control unit may generate an engine speed signal to the engine controller in response to the predictive load of the pile to temporarily increase the engine speed at least prior to or at the instant an attachment engages a pile.
In another instance, the vehicle control unit generates a transmission control signal to the transmission controller in response to the predictive load of the pile to temporarily increase the transmission ratio at least prior to or at the instant the attachment engages the pile.
In another instance, the vehicle control unit generates a hydraulic force signal to the hydraulic cylinder in response to the predictive load of the pile to modify the hydraulic flow rate, the hydraulic pressure, or a valve position.
Furthermore, the engine speed sensor may generate a subsequent engine speed signal after the attachment engages the pile. The vehicle control unit may compare the subsequent engine speed signal to the engine speed signal. The engine control unit may then adjust future engine speed signals based on a moving average for use the next time the attachment engages the pile.
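The moving-average adjustment described above may be sketched, for illustration only, as follows; the class name, window size, and blending weights are assumptions and not part of the disclosure.

```python
from collections import deque

class EngineSpeedAdapter:
    """Illustrative sketch: blend the commanded engine speed with measured
    engine speeds from previous pile engagements via a moving average.
    The window size and 50/50 blend are placeholder assumptions."""

    def __init__(self, window: int = 5):
        # Most recent measured (subsequent) engine speeds, oldest dropped first.
        self.history = deque(maxlen=window)

    def record_engagement(self, measured_rpm: float) -> None:
        """Store the subsequent engine speed observed after engaging the pile."""
        self.history.append(measured_rpm)

    def next_command(self, base_command_rpm: float) -> float:
        """Adjust the next commanded speed toward the observed moving average."""
        if not self.history:
            return base_command_rpm
        moving_avg = sum(self.history) / len(self.history)
        return 0.5 * base_command_rpm + 0.5 * moving_avg
```

With no recorded engagements the base command passes through unchanged; after one engagement measured at 2,000 rpm, a 1,800 rpm command would be adjusted to 1,900 rpm under this assumed blend.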
The sensor processing unit may further comprise an edge detection unit. The edge detection unit can identify discontinuities in either color or pixel intensity of the image data to identify edges, whereupon the sensor processing unit calculates the volume estimation based on the discontinuities.
The system may further comprise a ground sensor. The ground sensor faces towards the ground to collect image data of a ground surface to determine a material property of the ground surface. The vehicle control unit may modify a loading parameter based on a material property of the ground surface.
These and other features will become apparent from the following detailed description and accompanying drawings, wherein various features are shown and described by way of illustration. The present disclosure is capable of other and different configurations and its several details are capable of modification in various other respects, all without departing from the scope of the present disclosure. Accordingly, the detailed description and accompanying drawings are to be regarded as illustrative in nature and not as restrictive or limiting.
The detailed description of the drawings refers to the accompanying figures in which:
Like reference numerals are used to indicate like elements throughout the several figures.
The embodiments disclosed in the above drawings and the following detailed description are not intended to be exhaustive or to limit the disclosure to these embodiments. Rather, there are several variations and modifications which may be made without departing from the scope of the present disclosure.
In accordance with one embodiment,
The work machine 100 may include ground engaging supports 160, such as wheels or a track system (not shown) that support the work machine 100. The engine 130 is configured to drive the transmission 135 that powers the ground engaging supports 160 and the hydraulic cylinders 140 to move the attachment 155.
The pile 115 of material may be any variety of materials that are to be loaded into the attachment 155 and dumped at another location. For example, the pile may include sand, dirt, gravel, quarry rock, and pre-fabricated man-made materials. Alternatively, the pile 115 may be an embankment or hill formed of a tough material, such as clay, embedded rocks, or other tough material. The work machine 100 may encounter any number of variations of material types in a pile 115 to be loaded during its course of operation. It is understood that the reference to a pile 115 encompasses any material to be loaded which may be more than a mere heap of things lying one on top of another.
The work machine 100 comprises a sensor 170 facing in a generally forward direction.
The forward direction may be either parallel to the fore-aft direction of the work machine 100, or in a generally forward direction wherein the sensor may move and face in a direction anywhere in an area forward of the work machine 100. The sensor 170 is configured to collect image data (shown in
As shown in
The sensor processing unit 195 is communicatively coupled to the sensor 170. The sensor processing unit 195 is configured to receive the image data 175 from the sensor 170, and calculate a volume estimation 310 of the pile 115 based on the image data 175. As shown in
It should be appreciated that the sensor processing unit 195 may correspond to an existing controller of the work machine or may correspond to a separate processing device. For instance, in one embodiment, the machine control module may form all or part of a separate plug-in module that may be installed within the work machine to allow for the disclosed system and method to be implemented without requiring additional software to be uploaded onto existing control devices of the work machine.
Returning to
The image processing unit 300 is in communication with the sensor 170 and the distance-calculating unit 295. The image processing unit 300 calculates the volume estimation 310 of the pile 115 based on the image data 175 and the spatial offset 303. In one example, the image processing unit 300 can identify a set of two-dimensional or three-dimensional points (e.g. Cartesian coordinates or polar coordinates) in the collected image data 175 that define the pile position, an aggregate 122 of the pile, or both. The set of two-dimensional or three-dimensional points can correspond to pixel positions in images collected by the stereoscopic vision device 230. The image processing unit 300 may rectify the image data 175 to optimize analysis. The image processing unit 300 may use color discrimination, intensity discrimination, or texture discrimination to identify one or more pile aggregate pixels in the image data 175 and associate them with pixel patterns, pixel attributes (e.g. color or color patterns such as Red Green Blue (RGB) pixel values), pixel intensity patterns, texture patterns, luminosity, brightness, hue, or reflectivity. From these, the image processing unit 300 calculates the area of the pile 115 or the surface of the pile 118, and the corresponding volume estimation 310, using the calculated or measured spatial offset 303 of the pile 115 or surface of the pile 118 from the sensor 170.
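A greatly simplified, non-limiting sketch of such a volume estimation follows. It assumes an intensity-threshold classification of pile pixels, a pinhole-camera scale factor, and a supplied mean pile height; all parameter names and the model itself are illustrative assumptions, not the disclosed method.

```python
def estimate_pile_volume(image, pile_threshold, spatial_offset_m,
                         focal_length_px, mean_height_m):
    """Illustrative sketch: count pixels classified as pile material by
    intensity, convert the pixel count to real-world area using a
    pinhole-camera scale at the measured spatial offset, then multiply
    by an assumed mean pile height. `image` is a list of rows of
    intensity values."""
    pile_pixels = sum(1 for row in image for px in row if px >= pile_threshold)
    # Ground sampling distance: metres spanned by one pixel at the pile.
    gsd = spatial_offset_m / focal_length_px
    area_m2 = pile_pixels * gsd * gsd
    return area_m2 * mean_height_m
```

For instance, two pile pixels seen at a 10 m offset with an assumed 100-pixel focal length give a 0.1 m ground sampling distance, 0.02 m² of area, and 0.02 m³ at a 1 m mean height.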
The sensor processing unit 195 may further comprise an edge detection unit 315 communicatively coupled to the sensor 170 and/or the image processing unit 300. The edge detection unit 315 identifies discontinuities in either pixel color or pixel intensity of the image data 175 to identify edges. The sensor processing unit 195 calculates the volume estimation 310 based on the discontinuities. The edge detection unit 315 may apply an edge detection algorithm to the image data. Any number of suitable edge detection algorithms can be used by the edge detection unit 315. Edge detection refers to the process of identifying and locating discontinuities between pixels in the image data 175. For example, the discontinuities may represent material changes in pixel intensity or pixel color which define the boundaries of objects in an image. A gradient technique of edge detection may be implemented by filtering the image data to return different pixel values in first regions of greater discontinuities or gradients than in second regions with lesser discontinuities or gradients. For example, the gradient technique detects the edges of an object by estimating the maximum and the minimum of the first derivative of the pixel intensity of the image data. The Laplacian technique detects the edges of an object in an image by searching for zero crossings in the second derivative of the pixel intensity image. Further examples of suitable edge detection algorithms include, but are not limited to, Roberts, Sobel, and Canny, as are known to those of ordinary skill in the art. The edge detection unit 315 may provide a numerical output, signal output, or symbol indicative of the strength or reliability of the edges in the field of view. For example, the edge detection unit 315 may provide a numerical value or edge strength indicator within a range or scale of relative strength or reliability to the linear Hough transformer.
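The gradient technique mentioned above can be sketched with the well-known Sobel operator; the following pure-Python example is illustrative only and is not the disclosed implementation.

```python
def sobel_edge_strength(image):
    """Illustrative gradient-technique sketch: convolve 3x3 Sobel kernels
    over an intensity image (list of lists) and return per-pixel gradient
    magnitude as an edge-strength map. Border pixels are left at zero."""
    h, w = len(image), len(image[0])
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

A sharp vertical intensity step produces a large magnitude at the step and near-zero elsewhere, which is the discontinuity an edge detection unit would report.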
The linear Hough transformer receives edge data (e.g. an edge strength indicator) related to the pile 115 and its aggregate material, and identifies the estimated angle and offset of the strong line segments, curved segments, or generally linear edges of the pile 115 in the image data 175. The linear Hough transformer comprises a feature extractor for identifying line segments of objects with certain shapes from the image data 175. For example, the linear Hough transformer identifies the line equation parameters or ellipse equation parameters of objects in the image data from the edge data 320 output by the edge detection unit 315, or classifies the edge data 320 as a line segment, an ellipse, or a circle. It is thus possible to detect the sub-components of an aggregate pile of stones, sand, dirt, rocks, or man-made materials such as pipes, each of which may have generally linear, rectangular, elliptical, or circular features. Alternatively, the edge detection unit 315 may simply identify an estimated outline of the pile 115, thereby calculating its area.
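The line-parameter identification described above can be sketched with a minimal linear Hough transform; the accumulator resolution and return convention below are illustrative assumptions.

```python
import math

def hough_strongest_line(edge_points, img_size, n_theta=180):
    """Illustrative linear Hough transform: each edge point (x, y) votes
    for every (theta, rho) pair satisfying rho = x*cos(theta) + y*sin(theta);
    the most-voted bin gives the parameters of the strongest line."""
    w, h = img_size
    acc = {}
    for x, y in edge_points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    (t_best, rho_best), _votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_best
```

Edge points lying on the vertical line x = 3, for example, yield a strongest bin near theta = 0 with rho = 3, i.e. the line equation parameters of that edge.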
In one embodiment, the sensor processing unit 195 may be coupled, directly or indirectly, to optional lights 330 (shown in
With continued reference to
The memory unit 350 associates a material property of the pile from a stored database 270 having material property reference data 370 based on either the image data 175, operator input signal 200 from the operator input device 157, or both. The stored database 270 may comprise an electronic memory, a magnetic disc drive, an optical disc drive or a magnetic storage device or an optical storage device, either on the work machine 100 or another location (e.g. data cloud 290 or a mobile device 280 shown in
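The lookup behaviour of the memory unit 350 can be illustrated, in non-limiting fashion, as a keyed retrieval in which an operator input overrides the image-based classification; the function and argument names are assumptions.

```python
def lookup_material_property(database, classified_material, operator_input=None):
    """Illustrative sketch of the memory unit's association step: an
    operator input, when present, overrides the material classification
    derived from image data; the stored database maps material names to
    material property reference data."""
    key = operator_input if operator_input is not None else classified_material
    return database[key]
```

For example, with a stored database mapping "sand" and "clay" to reference densities, an operator input of "clay" takes precedence over an image-based classification of "sand".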
The data processing unit 360 is communicatively coupled with the memory unit 350. The data processing unit 360 is configured to calculate the predictive load 340 of the pile based on the volume estimation 310 and the material property 370. Predictive load 340 is the anticipated load to be placed on any one or more of the loading parameters 120.
In another embodiment, the system 110 may further comprise a ground sensor 380 (shown in
The loading parameters 120 comprise an engine speed 410, a transmission ratio 420, a hydraulic flow rate 430, a hydraulic pressure 440, a rimpull ratio 400, and a valve position 460.
An engine speed sensor 470 may be disposed in the control system 110 for detecting an engine speed 410 of the engine 130. Moreover, a transmission input speed sensor 480 may detect an input speed of the transmission 135, and a transmission output speed sensor 490 may detect an output speed of the transmission 135. The engine speed sensor 470, transmission input speed sensor 480, and the transmission output speed sensor 490 can be communicatively coupled to the vehicle control unit 190.
The vehicle control unit 190 can be communicatively coupled with the engine controller 500. The vehicle control unit 190 may generate an engine speed signal 510 in response to the predictive load 340 of the pile to temporarily increase the engine speed either prior to or at the instant of the attachment 155 engaging a pile 115. This advantageously provides sufficient force for the work machine 100 when engaging the pile 115 to prevent the engine from stalling if overloaded. At the same time, it minimizes wasted fuel consumption, tire wear, and variation in operator efficiency. With continued reference to
The vehicle control unit 190 can be further communicatively coupled with the transmission controller 540. The vehicle control unit 190 may generate a transmission control signal 550 in response to the predictive load 340 of the pile to lower the transmission ratio 420 either prior to or at the instant the attachment engages a pile 115. Similar to the subsequent engine speed signal 520, the vehicle control unit 190 may adjust the transmission control signal 550 after engaging the pile 115 for use a next time the attachment engages the pile.
The vehicle control unit 190 may be further communicatively coupled with the implement controller 450, which controls one or more hydraulic cylinders 140. The vehicle control unit may generate a hydraulic force signal 560 in response to the predictive load 340 of the pile to modify one or more of a hydraulic flow rate 430, a hydraulic pressure 440, and a position of a control valve 460. The hydraulic force signal 560 augments the operator's input command signal 200 in response to the predictive load 340 to move the attachment 155. The hydraulic force signal 560 may be communicated mechanically, hydraulically, and/or electrically to the hydraulic control valve 460. The hydraulic control valve 460 receives pressurized hydraulic fluid 590 from a hydraulic pump 600, and selectively sends such pressurized hydraulic fluid 590 to one or more of the hydraulic cylinders 140 based on the augmented hydraulic force signal 560. The hydraulic cylinders 140 are extended or retracted by the pressurized fluid and thereby actuate the attachment 155.
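The augmentation of the operator's command by the predictive load may be sketched, for illustration only, as a bounded proportional boost; the normalization, load cap, and boost factor below are placeholder assumptions, not disclosed values.

```python
def augmented_hydraulic_command(operator_command, predictive_load_kg,
                                max_load_kg=10_000.0, max_boost=0.25):
    """Illustrative sketch: scale the operator's normalized hydraulic
    command (0..1) upward in proportion to the predictive load, capped
    at an assumed maximum boost, and clamp the result to full command."""
    boost = max_boost * min(predictive_load_kg / max_load_kg, 1.0)
    return min(operator_command * (1.0 + boost), 1.0)
```

With no predicted load the operator's command passes through unchanged; at the assumed load cap, a half-throttle command of 0.5 would be boosted to 0.625.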
Referring to
The terminology used herein is for the purpose of describing particular embodiments or implementations and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that any use of the terms “has,” “have,” “having,” “include,” “includes,” “including,” “comprise,” “comprises,” “comprising,” or the like, in this specification, identifies the presence of stated features, integers, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The references “A” and “B” used with reference numerals herein are merely for clarification when describing multiple implementations of an apparatus.
One or more of the steps or operations in any of the methods, processes, or systems discussed herein may be omitted, repeated, or re-ordered and are within the scope of the present disclosure.
While the above describes example embodiments of the present disclosure, these descriptions should not be viewed in a restrictive or limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the appended claims.