The present disclosure generally relates to a utility vehicle. An embodiment of the present disclosure relates to a work tool camera system for utility vehicles.
Utility vehicles, such as motor graders, bulldozers, crawlers, feller bunchers, scrapers, excavators, and skid and track loaders, often use a blade to move material. While operating the utility vehicle, the position of the operator with respect to the work tool can make it difficult to see the amount of material being moved by the work tool at any given time. This is especially true during grading operations or other precision maneuvers.
Various aspects of examples of the present disclosure are set out in the claims.
According to a first aspect of the present disclosure, a utility vehicle comprises a main frame, a work tool configured to move relative to the main frame to move a material, a first imaging apparatus aimed at a first portion of the work tool to capture a first image, a second imaging apparatus aimed at a second portion of the work tool to capture a second image, an electronic processor in communication with the first imaging apparatus and the second imaging apparatus, wherein the electronic processor is configured to generate a modified image, and a display for displaying the modified image of the material being moved by the work tool, wherein the modified image is based on a combination of the first image and the second image, and the modified image omits the work tool.
According to a second aspect of the present disclosure, a display system for a utility vehicle comprising a first imaging apparatus for capturing a first image of a first portion of a work tool of the utility vehicle, a second imaging apparatus for capturing a second image of a second portion of the work tool, an image generating unit in communication with the first imaging apparatus and the second imaging apparatus, wherein the image generating unit is configured to generate a modified image, where the modified image is based on a combination of the first image and the second image without showing the work tool, and a display unit for displaying the modified image.
According to a third aspect of the present disclosure, a method of operating a utility vehicle, the method comprising capturing a first image with a first imaging apparatus aimed at a first portion of a work tool, capturing a second image with a second imaging apparatus aimed at a second portion of the work tool, combining the first image with the second image to generate a third image, where the third image omits the work tool and includes an outline of the work tool, and displaying, on a display, the third image.
The above and other features will become apparent from the following description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures in which:
Like reference numerals are used to indicate like elements throughout the several figures.
At least one example embodiment of the subject matter of this disclosure is understood by referring to
While the present disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description is not restrictive in character, it being understood that illustrative embodiment(s) have been shown and described and that all changes and modifications that come within the spirit of the present disclosure are desired to be protected. Alternative embodiments of the present disclosure may not include all of the features described yet still benefit from at least some of the advantages of such features. Those of ordinary skill in the art may devise their own implementations that incorporate one or more of the features of the present disclosure and fall within the spirit and scope of the appended claims.
While operating a utility vehicle, the position of the operator with respect to the work tool can make it difficult to see the amount of material being moved by the work tool at any given time. This can be challenging for a number of reasons. For example, when the operator wants to adjust for changes in terrain or for other reasons, being able to see how much or how little material is accumulating in or on the work tool can be beneficial. With a limited view, the operator is slower to react to changes in conditions. Having an improved view of the material being moved by the blade could improve the performance and accuracy of maneuvers performed by the utility vehicle. In embodiments described herein, using technology to allow the operator to “see through” or effectively eliminate a work tool provides better information about processes being performed by the utility vehicle.
Motor grader 10 includes a main frame 12 and an articulated frame 14 which is pivotable with respect to main frame 12. Operator cab 13 is mounted atop articulated frame 14. Operator cab 13 includes operator controls, such as display unit 70 shown in
The articulated frame 14 includes a moldboard 26 (e.g., a blade) mounted thereto. The blade 26 is configured for spreading, leveling, or otherwise moving earthen or other material. In order to facilitate such operations, blade 26 is mounted to frame 14 such that blade 26 is selectively moveable in a number of directions. A draft frame 22 is coupled to articulated frame 14 toward the front via a ball-and-socket joint. A circle frame 28 is coupled to the draft frame 22 to rotate relative thereto by use of a circle drive 38 mounted to the draft frame 22. A tilt frame 40 holds the blade 26 and is coupled pivotally to the circle frame 28 for pivotal movement of the tilt frame 40 and the blade 26 held thereby relative to the circle frame 28 about a tilt axis by use of a tilt cylinder (not shown in
The tilt cylinder is connected to circle frame 28 and tilt frame 40, such that actuation of tilt cylinder 30 changes the pitch of tilt frame 40 (and thus the moldboard 26) relative to circle frame 28. Left and right blade-lift cylinders 34 (i.e., hydraulic lift cylinders) are connected to saddle 36 (which in turn is fixed to articulated frame 14) and to draft frame 22 such that actuation of the left and right blade-lift cylinders 34 raises and lowers the sides of draft frame 22, and thus the moldboard 26, relative to articulated frame 14.
The first imaging apparatus 50 can capture a first image 58 and the second imaging apparatus 52 can capture a second image 60. For example, the first image 58 can show the back side (e.g., a first portion) of the blade, including any material coming off the left end of the blade 26 or the right end of the blade 26. The second image 60 can show the front of the blade 26 (e.g., a second portion), including the position and amount of material built up along the blade 26 and the upcoming grading surface (e.g., the surface of the terrain ahead of the blade that will come in contact with the blade soon). The upcoming terrain can include high points (e.g., mounds, piles, rises, etc.) and low points (e.g., holes, depressions, etc.).
The first image 58 and the second image 60 can be used to generate a third image 62. The third image 62 can be shown from the perspective of the first imaging apparatus (e.g., looking forward) or, alternatively, from the perspective of the second imaging apparatus (e.g., looking rearward). In either case, the third image 62 eliminates the blade 26 and shows the material being pushed by the blade 26 (e.g., as if the blade were invisible or “see through”; as if the blade were not there) and/or the upcoming terrain. See
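One possible way to form such a “see-through” third image is to re-project the view from one imaging apparatus into the image plane of the other and substitute it for the pixels occluded by the blade. The following is a minimal sketch only, assuming OpenCV-style image arrays, a precomputed blade occlusion mask, and a homography obtained from camera calibration; none of these specific functions or parameters are prescribed by the disclosure.

```python
import numpy as np
import cv2  # assumed available for warping and contour drawing

def see_through_blade(first_image, second_image, blade_mask, homography):
    """Composite a 'see-through blade' view (a sketch of one possible approach).

    first_image: frame from the first imaging apparatus (view occluded by the blade).
    second_image: frame from the second imaging apparatus (view of the far side).
    blade_mask: uint8 mask, 255 where the blade occludes first_image (assumed
        precomputed from known blade geometry and camera calibration).
    homography: 3x3 matrix mapping second_image pixels into the first image plane
        (assumed obtained from extrinsic calibration of the two cameras).
    """
    h, w = first_image.shape[:2]

    # Re-project the second camera's view into the first camera's image plane.
    warped = cv2.warpPerspective(second_image, homography, (w, h))

    # Replace blade pixels with the warped view so the blade appears transparent.
    mask3 = cv2.merge([blade_mask] * 3) > 0
    third_image = np.where(mask3, warped, first_image)

    # Keep an outline of the blade so the operator retains a spatial reference,
    # consistent with the third image omitting the blade but including its outline.
    contours, _ = cv2.findContours(blade_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cv2.drawContours(third_image, contours, -1, (255, 255, 255), 2)
    return third_image
```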
In some embodiments, a third imaging apparatus can be used to show the upcoming terrain. The third imaging apparatus could be located, for example, on the main frame 12 (or other location proximate the front area of the utility vehicle) and pointed forward towards the upcoming terrain when the utility vehicle is in forward motion.
Because of the position of the operator cab 13 with respect to the blade 26, the blade 26 itself, along with intervening portions of the utility vehicle 10 (e.g., the circle frame 28), can make it difficult or impossible to see the amount of material being moved by the blade 26 at any given time. Having, for example, the first imaging apparatus 50 and the second imaging apparatus 52, along with the third image 62, can help the operator understand the amount of material 66 being moved by the blade 26 at any given time.
The material being moved by the blade 26 (e.g., dirt, rock, snow, etc.) can accumulate in varying amounts at different locations along the blade 26. In this example, material 66 being moved by the blade 26 varies along the length of the blade 26. The amount of material 66 can vary and change over time as the utility vehicle 10 moves. As the amount of material 66 proximate the blade changes, the corresponding indication on the third image 62 can also change, which is viewable on the display 70 by an operator in the operator cab or remotely on a different monitor.
Also shown in
Identification of the upcoming terrain can be determined by, for example, imaging apparatuses including stereo cameras, lidar, radar, or other similar devices. Identification of upcoming terrain can assist with identification of areas that require additional material (e.g., holes, low spots) which can be useful information for a grading operation.
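As a rough illustration of how such terrain identification might be performed, the sketch below labels low spots and high spots by comparing a gridded heightmap against a target grade. It assumes the sensor data (stereo camera, lidar, etc.) has already been converted to a heightmap of elevations; the heightmap format, the target-grade input, and the tolerance value are illustrative assumptions, not details from the disclosure.

```python
import numpy as np

def classify_upcoming_terrain(heightmap, target_grade, tolerance=0.02):
    """Label upcoming terrain cells as low spots, high spots, or on grade.

    heightmap: 2D array of measured surface elevations (meters) ahead of the blade,
        assumed already gridded from stereo-camera or lidar data.
    target_grade: 2D array (or scalar) of desired elevations for the same cells.
    tolerance: dead band (meters) around grade; the value here is illustrative only.

    Returns an integer label map: -1 = low spot (needs additional material),
    0 = on grade, +1 = high spot (material to cut).
    """
    deviation = heightmap - target_grade
    labels = np.zeros_like(deviation, dtype=np.int8)
    labels[deviation < -tolerance] = -1   # holes / depressions to fill
    labels[deviation > tolerance] = 1     # mounds / rises to cut
    return labels
```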
The display 70 can show the material 66 being moved by the blade 26 and where the material 66 being moved is contacting the blade. The display 70 can also show the amount of the material 66 coming off an end 26A or 26B of the blade 26. This can be helpful for an operator as the terrain changes and the amount of material 66 being moved by the blade 26 changes.
For example, as the terrain drops away (e.g., dips down, drops, a hole, the distance between the blade and the surface increases, etc.), the amount of material 66 moved by the blade 26 can decrease, resulting in less (or no) material 66 coming off an end of the blade 26.
Alternatively, as the terrain rises up (e.g., bumps up, a hump, a bump, a mound, a pile, the distance between the blade and the surface decreases, etc.), the amount of material 66 moved by the blade can increase, resulting in more material coming off an end 26A or 26B (or both ends) of the blade 26.
The display 70 can display information about the grading process. For example, blue shading (e.g., a first cross-hatching pattern for low points 72) can indicate upcoming holes or depressions, while red shading (e.g., a second cross-hatching pattern for high points 74) can indicate the amount of material currently on the blade.
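One way such shading could be rendered is by alpha-blending colored overlays onto the third image wherever the low-point and high-point masks apply. This is a sketch under stated assumptions: the masks are assumed to come from upstream terrain and blade-load analysis, the BGR color convention follows OpenCV-style arrays, and the blend weight is a placeholder.

```python
import numpy as np

def shade_display(third_image, low_mask, high_mask, alpha=0.4):
    """Overlay blue shading on low points and red shading on high points / blade load.

    third_image: the combined "see-through" image, as an HxWx3 BGR uint8 array.
    low_mask / high_mask: boolean masks aligned with third_image marking upcoming
        depressions and accumulated or high material (assumed produced upstream).
    alpha: blend weight for the shading; the value is illustrative.
    """
    shaded = third_image.astype(np.float32)
    blue = np.array([255, 0, 0], dtype=np.float32)   # BGR blue for low points 72
    red = np.array([0, 0, 255], dtype=np.float32)    # BGR red for high points 74
    shaded[low_mask] = (1 - alpha) * shaded[low_mask] + alpha * blue
    shaded[high_mask] = (1 - alpha) * shaded[high_mask] + alpha * red
    return shaded.astype(np.uint8)
```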
Additional color indicators 76 can be positioned on the display to represent the level of load currently being experienced by the work machine. For example,
The display 70 can show the estimated volume 78 of material 66 that is being graded. See additional information and description below.
The information displayed on the monitor gives the operator additional insight into the amount of material on the blade. The operator can see the amount of load on the blade and how the load is dispersed between the toe, heel, or middle of the blade.
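A simple way to present that load information is to map estimated fill fractions at the toe, middle, and heel of the blade to indicator colors. The sketch below is illustrative only: the fill fractions are assumed to come from image analysis of the blade face, and the thresholds and color names are placeholders rather than values from the disclosure.

```python
def load_indicators(toe_fill, middle_fill, heel_fill, warn=0.6, high=0.85):
    """Map estimated fill fractions (0.0-1.0) along the blade to indicator colors.

    toe_fill / middle_fill / heel_fill: assumed outputs of upstream image analysis
        describing how full each portion of the blade is.
    warn / high: illustrative thresholds for moderate and heavy load.
    """
    def color(fill):
        if fill >= high:
            return "red"      # heavy load on this portion of the blade
        if fill >= warn:
            return "yellow"   # moderate load
        return "green"        # light load

    return {"toe": color(toe_fill),
            "middle": color(middle_fill),
            "heel": color(heel_fill)}
```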
The blade control system 80 also has a non-transitory computer-readable memory 82 that stores image data 84. The non-transitory computer-readable memory 82 may comprise electronic memory, nonvolatile random-access memory, an optical storage device, a magnetic storage device, or another device for storing and accessing electronic data on any recordable, rewritable, or readable electronic, optical, or magnetic storage medium.
An electronic processor 86 (i.e., an image generating unit) is provided and configured to perform an operation by controllably adjusting a position of the work tool 26 relative to the utility vehicle 10 and capturing and processing images from the first imaging apparatus 50 and the second imaging apparatus 52 (and any additional imaging apparatuses). The electronic processor 86 may be arranged locally as part of the utility vehicle 10 or remotely at a remote processing center (not shown). In various embodiments, the electronic processor 86 may comprise a microprocessor, a microcontroller, a central processing unit, a programmable logic array, a programmable logic controller, or other suitable programmable circuitry that is adapted to perform data processing and/or system control operations. The electronic processor 86 executes or otherwise relies upon computer software applications, components, programs, objects, modules, or data structures, etc. Software routines resident in the included memory of the electronic processor 86 or other memory are executed in response to signals received.
The computer software applications, in other embodiments, may be located in the cloud. The executed software includes one or more specific applications, components, programs, objects, modules, or sequences of instructions typically referred to as “program code”. The program code includes one or more instructions located in memory and other storage devices. These instructions, which are resident in memory, are executed in response to other instructions generated by the system or in response to inputs provided by an operator interface 88 operated by the user (e.g., located in the operator cab or at a remote location). The electronic processor 86 is configured to execute the stored program instructions.
The electronic processor 86, along with the stored program instructions, can be used to estimate an amount of material being moved by the work tool. Based on the amount of area of an image of the blade that shows material 66, the processor 86 can determine an estimated volume of material 66 being moved by the work tool.
The method 100 can further comprise the step 110 of estimating the volume of material being moved by the work tool. Based on analysis of the third image, an estimate for the volume of material being moved by the work tool can be determined. This estimate is based on a comparison of the amount of the image that shows material being moved compared to the area of the work tool. The estimated amount of material can be based on, for example, the amount of material on (i.e., against, touching, being moved by) the blade (i.e., work tool) and the size of the blade (i.e., work tool). By having the size and area of the blade determined, a volumetric area of interest can be set. When an imaging device (e.g., a camera) can see and determine how much of the volumetric area is covered by material, an estimate of the amount of material being moved can be made.
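The estimation step might be sketched as follows: compute the fraction of blade-face pixels covered by material and scale it by a volumetric area of interest derived from the blade dimensions. This is a minimal sketch, assuming segmentation masks are available from the third image and assuming a placeholder average pile depth; the disclosure does not specify these inputs.

```python
def estimate_material_volume(material_mask, blade_mask,
                             blade_width_m, blade_height_m, pile_depth_m):
    """Rough volume estimate from how much of the blade face is covered by material.

    material_mask / blade_mask: boolean masks from the third image marking pixels
        showing material against the blade and pixels belonging to the blade face.
    blade_width_m, blade_height_m: known blade dimensions used to size the
        volumetric area of interest.
    pile_depth_m: assumed average depth of the material pile in front of the blade;
        a placeholder parameter, not a value given in the disclosure.
    """
    blade_pixels = blade_mask.sum()
    if blade_pixels == 0:
        return 0.0

    # Fraction of the blade face that the image analysis shows as covered by material.
    covered_fraction = (material_mask & blade_mask).sum() / blade_pixels

    # Scale the covered fraction by the volumetric area of interest set by blade size.
    volume_of_interest = blade_width_m * blade_height_m * pile_depth_m
    return covered_fraction * volume_of_interest
```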
The method 100 can further comprise the step 112 of displaying, on a display, the amount of material being moved by the work tool.